Sample records for physical-space statistical analysis

  1. The use and misuse of statistical analyses in geophysics and space physics

    NASA Technical Reports Server (NTRS)

    Reiff, P. H.

    1983-01-01

    The statistical techniques most often used in space physics include Fourier analysis, linear correlation, auto- and cross-correlation, power spectral density, and superposed epoch analysis. Tests are presented which can evaluate the significance of the results obtained through each of these. Data presented without some form of error analysis are frequently useless, since they offer no way of assessing whether a bump on a spectrum or on a superposed epoch analysis is real or merely a statistical fluctuation. Among many of the published linear correlations, for instance, the uncertainty in the intercept and slope is not given, so that the significance of the fitted parameters cannot be assessed.
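    Reiff's point about unreported slope and intercept uncertainties can be illustrated with a short least-squares sketch; the synthetic data and parameter values below are assumptions for illustration, not from the paper:

```python
import numpy as np

# Synthetic data: a hypothetical linear relation y = 2x + 1 with unit noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 1.0, x.size)

# Ordinary least-squares fit plus the standard errors the abstract asks for.
n = x.size
slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)
s2 = resid @ resid / (n - 2)                          # residual variance
sxx = np.sum((x - x.mean()) ** 2)
se_slope = np.sqrt(s2 / sxx)
se_intercept = np.sqrt(s2 * (1.0 / n + x.mean() ** 2 / sxx))

# A t-statistic well above ~2 indicates a significant fitted slope.
t_slope = slope / se_slope
```

    Reporting `slope ± se_slope` and `intercept ± se_intercept` alongside the fit is exactly the kind of error analysis the abstract argues is frequently missing.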

  2. Statistical physics of the symmetric group.

    PubMed

    Williams, Mobolaji

    2017-04-01

    Ordered chains (such as chains of amino acids) are ubiquitous in biological cells, and these chains perform specific functions contingent on the sequence of their components. Using the existence and general properties of such sequences as a theoretical motivation, we study the statistical physics of systems whose state space is defined by the possible permutations of an ordered list, i.e., the symmetric group, and whose energy is a function of how certain permutations deviate from some chosen correct ordering. Such a nonfactorizable state space is quite different from the state spaces typically considered in statistical physics systems and consequently has novel behavior in systems with interacting and even noninteracting Hamiltonians. Various parameter choices of a mean-field model reveal the system to contain five different physical regimes defined by two transition temperatures, a triple point, and a quadruple point. Finally, we conclude by discussing how the general analysis can be extended to state spaces with more complex combinatorial properties and to other standard questions of statistical mechanics models.

  3. Statistical physics of the symmetric group

    NASA Astrophysics Data System (ADS)

    Williams, Mobolaji

    2017-04-01

    Ordered chains (such as chains of amino acids) are ubiquitous in biological cells, and these chains perform specific functions contingent on the sequence of their components. Using the existence and general properties of such sequences as a theoretical motivation, we study the statistical physics of systems whose state space is defined by the possible permutations of an ordered list, i.e., the symmetric group, and whose energy is a function of how certain permutations deviate from some chosen correct ordering. Such a nonfactorizable state space is quite different from the state spaces typically considered in statistical physics systems and consequently has novel behavior in systems with interacting and even noninteracting Hamiltonians. Various parameter choices of a mean-field model reveal the system to contain five different physical regimes defined by two transition temperatures, a triple point, and a quadruple point. Finally, we conclude by discussing how the general analysis can be extended to state spaces with more complex combinatorial properties and to other standard questions of statistical mechanics models.

  4. An Analysis Methodology for the Gamma-ray Large Area Space Telescope

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.; Cohen-Tanugi, Johann

    2004-01-01

    The Large Area Telescope (LAT) instrument on the Gamma-ray Large Area Space Telescope (GLAST) has been designed to detect high-energy gamma rays and determine their direction of incidence and energy. We propose a reconstruction algorithm based on recent advances in statistical methodology. This method, an alternative to the standard event analysis inherited from high-energy collider physics experiments, incorporates the physical processes occurring in the detector more accurately and makes full use of the available statistical information. It could thus provide a better estimate of the direction and energy of the primary photon.

  5. A Finite-Volume "Shaving" Method for Interfacing NASA/DAO's Physical Space Statistical Analysis System to the Finite-Volume GCM with a Lagrangian Control-Volume Vertical Coordinate

    NASA Technical Reports Server (NTRS)

    Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)

    2001-01-01

    Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.

  6. Statistical analysis and interpolation of compositional data in materials science.

    PubMed

    Pesenson, Misha Z; Suram, Santosh K; Gregoire, John M

    2015-02-09

    Compositional data are ubiquitous in chemistry and materials science: analysis of elements in multicomponent systems, combinatorial problems, etc., lead to data that are non-negative and sum to a constant (for example, atomic concentrations). The constant sum constraint restricts the sampling space to a simplex instead of the usual Euclidean space. Since statistical measures such as mean and standard deviation are defined for the Euclidean space, traditional correlation studies, multivariate analysis, and hypothesis testing may lead to erroneous dependencies and incorrect inferences when applied to compositional data. Furthermore, composition measurements that are used for data analytics may not include all of the elements contained in the material; that is, the measurements may be subcompositions of a higher-dimensional parent composition. Physically meaningful statistical analysis must yield results that are invariant under the number of composition elements, requiring the application of specialized statistical tools. We present specifics and subtleties of compositional data processing through discussion of illustrative examples. We introduce basic concepts, terminology, and methods required for the analysis of compositional data and utilize them for the spatial interpolation of composition in a sputtered thin film. The results demonstrate the importance of this mathematical framework for compositional data analysis (CDA) in the fields of materials science and chemistry.
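    The simplex geometry described here is usually handled with log-ratio transforms. A minimal sketch of the centered log-ratio (clr), which maps closed compositions into ordinary Euclidean space where means and variances behave as expected; the function name and example data are illustrative, not from the paper:

```python
import numpy as np

def clr(comp):
    """Centered log-ratio transform of strictly positive compositions."""
    comp = np.asarray(comp, dtype=float)
    comp = comp / comp.sum(axis=-1, keepdims=True)      # closure to constant sum
    gmean = np.exp(np.log(comp).mean(axis=-1, keepdims=True))
    return np.log(comp / gmean)

# clr coordinates sum to zero, and rescaling a composition changes nothing,
# which is the invariance property the abstract calls for.
z = clr([[1.0, 2.0, 3.0], [40.0, 40.0, 20.0]])
```

    Because clr coordinates live in a flat Euclidean subspace, standard correlation and interpolation machinery can be applied to them and the results mapped back to the simplex.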

  7. Precision Cosmology: The First Half Million Years

    NASA Astrophysics Data System (ADS)

    Jones, Bernard J. T.

    2017-06-01

    Cosmology seeks to characterise our Universe in terms of models based on well-understood and tested physics. Today we know our Universe with a precision that once would have been unthinkable. This book develops the entire mathematical, physical and statistical framework within which this has been achieved. It tells the story of how we arrive at our profound conclusions, starting from the early twentieth century and following developments up to the latest data analysis of big astronomical datasets. It provides an enlightening description of the mathematical, physical and statistical basis for understanding and interpreting the results of key space- and ground-based data. Subjects covered include general relativity, cosmological models, the inhomogeneous Universe, physics of the cosmic background radiation, and methods and results of data analysis. Extensive online supplementary notes, teaching materials, and exercises in Python make this the perfect companion for researchers, teachers and students in physics, mathematics, and astrophysics.

  8. Statistical crystallography of surface micelle spacing

    NASA Technical Reports Server (NTRS)

    Noever, David A.

    1992-01-01

    The aggregation of the recently reported surface micelles of block polyelectrolytes is analyzed using techniques of statistical crystallography. A polygonal lattice (Voronoi mosaic) connects center-to-center points, yielding statistical agreement with crystallographic predictions; Aboav-Weaire's law and Lewis's law are verified. This protocol supplements the standard analysis of surface micelles leading to aggregation number determination and, when compared to numerical simulations, allows further insight into the random partitioning of surface films. In particular, agreement with Lewis's law has been linked to the geometric packing requirements of filling two-dimensional space which compete with (or balance) physical forces such as interfacial tension, electrostatic repulsion, and van der Waals attraction.
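    Lewis's law (mean cell area growing linearly with side number) can be checked on a random point pattern with a Voronoi construction. A sketch assuming SciPy is available; the point pattern is synthetic, not micelle data:

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

rng = np.random.default_rng(1)
pts = rng.random((400, 2))                  # synthetic "micelle centers"
vor = Voronoi(pts)

sides, areas = [], []
for region_idx in vor.point_region:
    region = vor.regions[region_idx]
    if len(region) == 0 or -1 in region:
        continue                            # skip unbounded boundary cells
    poly = vor.vertices[region]
    if (poly < 0.0).any() or (poly > 1.0).any():
        continue                            # skip cells distorted by the edge
    sides.append(len(region))
    areas.append(ConvexHull(poly).volume)   # in 2-D, .volume is the area

# Lewis's law predicts a positive linear trend of cell area vs. side count.
lewis_slope, _ = np.polyfit(sides, areas, 1)
```

    The same side-count statistics feed the Aboav-Weaire check, which relates a cell's side number to the mean side number of its neighbours.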

  9. Generic results of the space physics community survey

    NASA Technical Reports Server (NTRS)

    Sharma, Rikhi R.; Cohen, Nathaniel B.

    1993-01-01

    This report summarizes the results of a survey of the members of the space physics research community conducted in 1990-1991 to ascertain demographic information on the respondents and information on their views on a number of facets of their space physics research. The survey was conducted by questionnaire and the information received was compiled in a database and analyzed statistically. The statistical results are presented for the respondent population as a whole and by four different respondent cross sections: individual disciplines of space physics, type of employers, age groups, and research techniques employed. Data from a brief corresponding survey of the graduate students of respondents are also included.

  10. Autonomous perception and decision making in cyber-physical systems

    NASA Astrophysics Data System (ADS)

    Sarkar, Soumik

    2011-07-01

    The cyber-physical system (CPS) is a relatively new interdisciplinary technology area that includes the general class of embedded and hybrid systems. CPSs require integration of computation and physical processes that involves aspects of physical quantities such as time, energy, and space during information processing and control. The physical space is the source of information and the cyber space makes use of the generated information to make decisions. This dissertation proposes an overall architecture of autonomous perception-based decision and control of complex cyber-physical systems. Perception involves the recently developed framework of Symbolic Dynamic Filtering for abstraction of the physical world in the cyber space. For example, under this framework, sensor observations from a physical entity are discretized temporally and spatially to generate blocks of symbols, also called words, that form a language. A grammar of a language is the set of rules that determine the relationships among words to build sentences. Subsequently, a physical system is conjectured to be a linguistic source that is capable of generating a specific language. The proposed technology is validated on various (experimental and simulated) case studies that include health monitoring of aircraft gas turbine engines, detection and estimation of fatigue damage in polycrystalline alloys, and parameter identification. Control of complex cyber-physical systems involves distributed sensing, computation, and control, as well as complexity analysis. A novel statistical mechanics-inspired complexity analysis approach is proposed in this dissertation. In such a scenario of networked physical systems, the distribution of physical entities determines the underlying network topology and the interaction among the entities forms the abstract cyber space. It is envisioned that the general contributions made in this dissertation will be useful for potential application areas such as smart power grids and buildings, distributed energy systems, advanced health care procedures, and future ground and air transportation systems.

  11. The influence of neighbourhood green space on children's physical activity and screen time: findings from the longitudinal study of Australian children.

    PubMed

    Sanders, Taren; Feng, Xiaoqi; Fahey, Paul P; Lonsdale, Chris; Astell-Burt, Thomas

    2015-09-30

    It is often hypothesised that neighbourhood green space may help prevent well-known declines in physical activity and increases in sedentary behaviour that occur across childhood. As most studies in this regard are cross-sectional, the purpose of our study was to use longitudinal data to examine whether green space promotes active lifestyles as children grow older. Data came from participants (n = 4983; age = 4-5) of the Longitudinal Study of Australian Children, a nationally representative study on health and child development. Physical activity and screen time were measured biennially (2004-2012) using questionnaires and time use diaries. Quantity of neighbourhood green space was objectively measured using Australian Bureau of Statistics mesh block data for each participant's statistical area level 2. Multilevel regression was used to test for associations between physical activity and screen time with green space quantity, adjusting for socio-economic confounders. Boys living in areas with 10% more neighbourhood green space had a: 7% (95% CI = 1.02, 1.13) greater odds of choosing physically active pastimes; 8% (95% CI = 0.85, 1.00) lower odds of not enjoying physical activity; 2.3 min reduction in weekend television viewing (95% CI = -4.00, -0.69); and 7% (95% CI = 1.02; 1.12) and 9% (95% CI = 1.03; 1.15) greater odds of meeting physical activity guidelines on weekdays and weekends, respectively. No statistically (or practically) significant results were observed for girls. Current provisions of neighbourhood green space may be more amenable to promoting active lifestyles among boys than girls. Research is needed to explore what types of green space promote active lifestyles in all children.

  12. Programme of Indian Centre for Space Physics using Very Low Frequency Radio Waves

    NASA Astrophysics Data System (ADS)

    Chakrabarti, Sandip Kumar; Sasmal, Sudipta; Pal, Sujay; Kanta Maji, Surya; Ray, Suman

    Indian Centre for Space Physics (ICSP) conducted two major VLF campaigns across the Indian subcontinent to study the propagation effects of VLF radio waves, and made multi-receiver observations during a solar eclipse. ICSP not only recorded multitudes of solar flares, it also reproduced VLF observations from ab initio calculations. ICSP extended its study to the field of earthquake prediction using signal anomalies and, through case-by-case studies as well as statistical analysis, showed that the anomalies are real and that more studies are required to understand them. Using the Earth as a gigantic detector, it detected ionospheric perturbations due to soft gamma-ray repeaters and gamma-ray bursts.

  13. [Cluster analysis applicability to fitness evaluation of cosmonauts on long-term missions of the International space station].

    PubMed

    Egorov, A D; Stepantsov, V I; Nosovskiĭ, A M; Shipov, A A

    2009-01-01

    Cluster analysis was applied to evaluate locomotion training (running and running intermingled with walking) of 13 cosmonauts on long-term ISS missions by the parameters of duration (min), distance (m) and intensity (km/h). Based on the results of analyses, the cosmonauts were distributed into three steady groups of 2, 5 and 6 persons. Distance and speed showed a statistical rise (p < 0.03) from group 1 to group 3. Duration of physical locomotion training was not statistically different in the groups (p = 0.125). Therefore, cluster analysis is an adequate method of evaluating fitness of cosmonauts on long-term missions.
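    A sketch of the kind of clustering the study used, run on made-up session features (duration, distance, speed); the data, group centers, and the deterministic one-seed-per-block initialization are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical training sessions: duration (min), distance (km), speed (km/h).
g1 = rng.normal([30.0, 3.0, 6.0], [2.0, 0.3, 0.3], (20, 3))
g2 = rng.normal([45.0, 6.0, 8.0], [2.0, 0.3, 0.3], (20, 3))
g3 = rng.normal([60.0, 10.0, 10.0], [2.0, 0.3, 0.3], (20, 3))
X = np.vstack([g1, g2, g3])
Xz = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize the three features

# Plain k-means, seeded with one point from each block (informed init).
k = 3
centers = Xz[[0, 20, 40]].copy()
for _ in range(100):
    labels = np.argmin(((Xz[:, None, :] - centers) ** 2).sum(axis=-1), axis=1)
    new = np.array([Xz[labels == j].mean(axis=0) if np.any(labels == j)
                    else centers[j] for j in range(k)])
    if np.allclose(new, centers):           # converged
        break
    centers = new
```

    Standardizing first matters here: duration, distance, and speed have very different scales, and unstandardized Euclidean distances would let one feature dominate the grouping.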

  14. A Modeling Framework for Optimal Computational Resource Allocation Estimation: Considering the Trade-offs between Physical Resolutions, Uncertainty and Computational Costs

    NASA Astrophysics Data System (ADS)

    Moslehi, M.; de Barros, F.; Rajagopal, R.

    2014-12-01

    Hydrogeological models that represent flow and transport in subsurface domains are usually large-scale, with excessive computational complexity and uncertain characteristics. Uncertainty quantification for predicting flow and transport in heterogeneous formations often entails a numerical Monte Carlo framework, which repeatedly simulates the model according to a random field representing the hydrogeological characteristics of the site. The physical resolution (e.g. the grid resolution associated with the physical space) for the simulation is customarily chosen based on recommendations in the literature, independent of the number of Monte Carlo realizations. This practice may lead to either excessive computational burden or inaccurate solutions. We propose an optimization-based methodology that considers the trade-off between the following conflicting objectives: time associated with computational costs, statistical convergence of the model predictions, and physical errors corresponding to numerical grid resolution. In this research, we optimally allocate computational resources by developing a modeling framework for the overall error based on a joint statistical and numerical analysis, and by optimizing the error model subject to a given computational constraint. The derived expression for the overall error explicitly takes into account the joint dependence between the discretization error of the physical space and the statistical error associated with Monte Carlo realizations. The accuracy of the proposed framework is verified by applying it to several computationally intensive examples. This framework helps hydrogeologists choose the optimal physical and statistical resolutions that minimize the error for a given computational budget. Moreover, the influence of the available computational resources and of the geometric properties of the contaminant source zone on the optimal resolutions is investigated. We conclude that the computational cost associated with optimal allocation can be substantially reduced compared with prevalent recommendations in the literature.
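    The trade-off can be sketched with an assumed overall-error model err(h, N) = a·h^p + b/√N and a per-realization cost growing like c/h^d; all constants, exponents, and the budget below are illustrative placeholders, not values from the study:

```python
import numpy as np

# Assumed model (illustrative): discretization error a*h^p, Monte Carlo
# error b/sqrt(N), per-realization cost c/h^d, fixed total budget.
a, p = 1.0, 2.0
b = 1.0
c, d = 1e-6, 3.0
budget = 10.0

hs = np.logspace(-3.0, -0.5, 200)                     # candidate grid spacings
N = np.maximum(1, np.floor(budget * hs**d / c)).astype(int)
total_err = a * hs**p + b / np.sqrt(N)

i = int(np.argmin(total_err))
h_opt, N_opt = hs[i], N[i]   # jointly optimal spacing / realization count
```

    The interior minimum shows the point of the paper: refining the grid (small h) starves the Monte Carlo sample, while many coarse realizations leave large discretization error; the optimum balances the two under the budget.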

  15. Face-space architectures: evidence for the use of independent color-based features.

    PubMed

    Nestor, Adrian; Plaut, David C; Behrmann, Marlene

    2013-07-01

    The concept of psychological face space lies at the core of many theories of face recognition and representation. To date, much of the understanding of face space has been based on principal component analysis (PCA); the structure of the psychological space is thought to reflect some important aspects of a physical face space characterized by PCA applications to face images. In the present experiments, we investigated alternative accounts of face space and found that independent component analysis provided the best fit to human judgments of face similarity and identification. Thus, our results challenge an influential approach to the study of human face space and provide evidence for the role of statistically independent features in face encoding. In addition, our findings support the use of color information in the representation of facial identity, and we thus argue for the inclusion of such information in theoretical and computational constructs of face space.

  16. Space-time models based on random fields with local interactions

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios T.; Tsantili, Ivi C.

    2016-08-01

    The analysis of space-time data from complex, real-life phenomena requires the use of flexible and physically motivated covariance functions. In most cases, it is not possible to explicitly solve the equations of motion for the fields or the respective covariance functions. In the statistical literature, covariance functions are often based on mathematical constructions. In this paper, we propose deriving space-time covariance functions by solving “effective equations of motion”, which can be used as statistical representations of systems with diffusive behavior. In particular, we propose to formulate space-time covariance functions based on an equilibrium effective Hamiltonian using the linear response theory. The effective space-time dynamics is then generated by a stochastic perturbation around the equilibrium point of the classical field Hamiltonian leading to an associated Langevin equation. We employ a Hamiltonian which extends the classical Gaussian field theory by including a curvature term and leads to a diffusive Langevin equation. Finally, we derive new forms of space-time covariance functions.

  17. Space-time-modulated stochastic processes

    NASA Astrophysics Data System (ADS)

    Giona, Massimiliano

    2017-10-01

    Starting from the physical problem associated with the Lorentzian transformation of a Poisson-Kac process in inertial frames, the concept of space-time-modulated stochastic processes is introduced for processes possessing finite propagation velocity. This class of stochastic processes provides a two-way coupling between the stochastic perturbation acting on a physical observable and the evolution of the physical observable itself, which in turn influences the statistical properties of the stochastic perturbation during its evolution. The definition of space-time-modulated processes requires the introduction of two functions: a nonlinear amplitude modulation, controlling the intensity of the stochastic perturbation, and a time-horizon function, which modulates its statistical properties, providing irreducible feedback between the stochastic perturbation and the physical observable influenced by it. The latter property is the peculiar fingerprint of this class of models that makes them suitable for extension to generic curved-space times. Considering Poisson-Kac processes as prototypical examples of stochastic processes possessing finite propagation velocity, the balance equations for the probability density functions associated with their space-time modulations are derived. Several examples highlighting the peculiarities of space-time-modulated processes are thoroughly analyzed.
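    A one-dimensional Poisson-Kac (telegraph) process, the prototypical finite-propagation-velocity process named in the abstract, can be simulated in a few lines; the parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
v, lam = 1.0, 0.5          # propagation speed and Poisson switching rate
dt, steps = 0.01, 5000

x, s = 0.0, 1              # position and current velocity sign
for _ in range(steps):
    if rng.random() < lam * dt:
        s = -s             # velocity reversal at Poisson-distributed times
    x += s * v * dt

T = steps * dt             # finite propagation velocity: |x(T)| <= v*T always
```

    A space-time modulation in the sense of the paper would make the intensity and switching statistics of this perturbation depend on the physical observable it drives, closing the two-way coupling the abstract describes.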

  18. Orbit-Attitude Changes of Objects in Near Earth Space Induced by Natural Charging

    DTIC Science & Technology

    2017-05-02

    depends upon Earth’s magnetosphere. Typically, magnetosphere models can be grouped under two classes: statistical and physics-based. The physics … models were primarily physics-based due to unavailability of sufficient space data, but over the last three decades, with the availability of huge … “Attitude Determination and Control,” Astrophysics and Space Science Library, Vol. 73, D. Reidel Publishing Company, London, 1978 [17] Fairfield …

  19. Parallel Climate Data Assimilation PSAS Package Achieves 18 GFLOPs on 512-Node Intel Paragon

    NASA Technical Reports Server (NTRS)

    Ding, H. Q.; Chan, C.; Gennery, D. B.; Ferraro, R. D.

    1995-01-01

    Several algorithms were added to the Physical-space Statistical Analysis System (PSAS) from Goddard, which assimilates observational weather data by correcting for different levels of uncertainty about the data and different locations for mobile observation platforms. The new algorithms and use of the 512-node Intel Paragon allowed a hundred-fold decrease in processing time.

  20. Strange Quark Magnetic Moment of the Nucleon at the Physical Point.

    PubMed

    Sufian, Raza Sabbir; Yang, Yi-Bo; Alexandru, Andrei; Draper, Terrence; Liang, Jian; Liu, Keh-Fei

    2017-01-27

    We report a lattice QCD calculation of the strange quark contribution to the nucleon's magnetic moment and charge radius. This analysis presents the first direct determination of strange electromagnetic form factors including at the physical pion mass. We perform a model-independent extraction of the strange magnetic moment and the strange charge radius from the electromagnetic form factors in the momentum transfer range of 0.051 GeV^{2}≲Q^{2}≲1.31 GeV^{2}. The finite lattice spacing and finite volume corrections are included in a global fit with 24 valence quark masses on four lattices with different lattice spacings, different volumes, and four sea quark masses including one at the physical pion mass. We obtain the strange magnetic moment G_{M}^{s}(0)=-0.064(14)(09)μ_{N}. The four-sigma precision in statistics is achieved partly due to low-mode averaging of the quark loop and low-mode substitution to improve the statistics of the nucleon propagator. We also obtain the strange charge radius ⟨r_{s}^{2}⟩_{E}=-0.0043(16)(14) fm^{2}.

  1. Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khrennikov, Andrei

    2010-08-15

    One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward real physical theory.

  2. Persistence of discrimination: Revisiting Axtell, Epstein and Young

    NASA Astrophysics Data System (ADS)

    Weisbuch, Gérard

    2018-02-01

    We reformulate an earlier model of the "Emergence of classes..." proposed by Axtell et al. (2001) using more elaborate cognitive processes allowing a statistical physics approach. The thorough analysis of the phase space and of the basins of attraction leads to a reconsideration of the previous social interpretations: our model predicts the reinforcement of discrimination biases and their long term stability rather than the emergence of classes.

  3. Methods for improved forewarning of critical events across multiple data channels

    DOEpatents

    Hively, Lee M [Philadelphia, TN]

    2007-04-24

    This disclosed invention concerns improvements in forewarning of critical events via phase-space dissimilarity analysis of data from mechanical devices, electrical devices, biomedical data, and other physical processes. First, a single channel of process-indicative data is selected that can be used in place of multiple data channels without sacrificing consistent forewarning of critical events. Second, the method discards data of inadequate quality via statistical analysis of the raw data, because the analysis of poor-quality data always yields inferior results. Third, two separate filtering operations are used in sequence to remove both high-frequency and low-frequency artifacts using a zero-phase quadratic filter. Fourth, the method constructs phase-space dissimilarity measures (PSDM) by combining multi-channel time-serial data into a multi-channel time-delay phase-space reconstruction. Fifth, the method uses a composite measure of dissimilarity (C.sub.i) to provide a forewarning of failure and an indicator of failure onset.
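    The reconstruction in the fourth step rests on the basic time-delay embedding operation; a minimal single-channel sketch (the function name is illustrative, not from the patent):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Map a scalar time series into a dim-dimensional delay phase space."""
    x = np.asarray(x)
    n = len(x) - (dim - 1) * tau             # number of complete delay vectors
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Embed an integer ramp so the index structure of the rows is visible.
E = delay_embed(np.arange(100), dim=3, tau=5)
```

    Each row of `E` is one point of the reconstructed trajectory; dissimilarity measures then compare the distributions of such points between a baseline window and a test window.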

  4. Attempting to physically explain space-time correlation of extremes

    NASA Astrophysics Data System (ADS)

    Bernardara, Pietro; Gailhard, Joel

    2010-05-01

    Spatial and temporal clustering of hydro-meteorological extreme events is well documented, and the statistical parameters characterizing their local frequencies of occurrence show clear spatial patterns. Thus, in order to robustly assess hydro-meteorological hazard, statistical models need to account for spatial and temporal dependencies. Statistical models that quantify and qualify long-term temporal and spatial dependencies are available, such as the multifractal approach. Furthermore, the development of regional frequency analysis techniques allows the frequency of occurrence of extreme events to be estimated while taking into account spatial patterns in the behaviour of extreme quantiles. However, in order to understand the origin of this spatio-temporal clustering, a physical explanation should be sought. Here, statistical evidence of spatio-temporal correlation and of spatial patterns in extreme behaviour is presented for a large database of more than 400 rainfall and discharge series in France. In particular, the spatial distributions of multifractal and Generalized Pareto distribution parameters show clear correlation patterns in the frequency of occurrence of extremes. It is then shown that the identification of atmospheric circulation patterns (weather types) can physically explain the temporal clustering of extreme rainfall events (seasonality) and the spatial pattern of their frequency of occurrence. Moreover, coupling this information with hydrological modeling of a watershed (as in the Schadex approach), an explanation of the spatio-temporal distribution of extreme discharges can also be provided. We finally show that a hydro-meteorological approach (such as the Schadex approach) can explain and take into account the space and time dependencies of hydro-meteorological extreme events.

  5. ``Models'' CAVEAT EMPTOR!!!: ``Toy Models Too-Often Yield Toy-Results''!!!: Statistics, Polls, Politics, Economics, Elections!!!: GRAPH/Network-Physics: ``Equal-Distribution for All'' TRUMP-ED BEC ``Winner-Take-All'' ``Doctor Livingston I Presume?''

    NASA Astrophysics Data System (ADS)

    Preibus-Norquist, R. N. C.-Grover; Bush-Romney, G. W.-Willard-Mitt; Dimon, J. P.; Adelson-Koch, Sheldon-Charles-David-Sheldon; Krugman-Axelrod, Paul-David; Siegel, Edward Carl-Ludwig; D. N. C./O. F. P./''47''%/50% Collaboration; R. N. C./G. O. P./''53''%/49% Collaboration; Nyt/Wp/Cnn/Msnbc/Pbs/Npr/Ft Collaboration; Ftn/Fnc/Fox/Wsj/Fbn Collaboration; Lb/Jpmc/Bs/Boa/Ml/Wamu/S&P/Fitch/Moodys/Nmis Collaboration

    2013-03-01

    ``Models''? CAVEAT EMPTOR!!!: ``Toy Models Too-Often Yield Toy-Results''!!!: Goldenfeld[``The Role of Models in Physics'', in Lects.on Phase-Transitions & R.-G.(92)-p.32-33!!!]: statistics(Silver{[NYTimes; Bensinger, ``Math-Geerks Clearly-Defeated Pundits'', LATimes, (11/9/12)])}, polls, politics, economics, elections!!!: GRAPH/network/net/...-PHYSICS Barabasi-Albert[RMP (02)] (r,t)-space VERSUS(???) [Where's the Inverse/ Dual/Integral-Transform???] (Benjamin)Franklin(1795)-Fourier(1795; 1897;1822)-Laplace(1850)-Mellin (1902) Brillouin(1922)-...(k,)-space, {Hubbard [The World According to Wavelets,Peters (96)-p.14!!!/p.246: refs.-F2!!!]},and then (2) Albert-Barabasi[]Bose-Einstein quantum-statistics(BEQS) Bose-Einstein CONDENSATION (BEC) versus Bianconi[pvt.-comm.; arXiv:cond-mat/0204506; ...] -Barabasi [???] Fermi-Dirac

  6. Advanced Methodologies for NASA Science Missions

    NASA Astrophysics Data System (ADS)

    Hurlburt, N. E.; Feigelson, E.; Mentzel, C.

    2017-12-01

    Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites, and calculations with advanced physical models based on these datasets. But considerable thought is also needed about which computations are required. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms take full cognizance that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after they are telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers, and science analysis performed by hundreds of space scientists dispersed through NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state at NASA and present example applications using modern methodologies.

  7. The GEOS Ozone Data Assimilation System: Specification of Error Statistics

    NASA Technical Reports Server (NTRS)

    Stajner, Ivanka; Riishojgaard, Lars Peter; Rood, Richard B.

    2000-01-01

    A global three-dimensional ozone data assimilation system has been developed at the Data Assimilation Office of the NASA/Goddard Space Flight Center. The Total Ozone Mapping Spectrometer (TOMS) total ozone and the Solar Backscatter Ultraviolet (SBUV or SBUV/2) partial ozone profile observations are assimilated. The assimilation, into an off-line ozone transport model, is done using the global Physical-space Statistical Analysis Scheme (PSAS). This system became operational in December 1999. A detailed description of the statistical analysis scheme is given, in particular of the forecast and observation error covariance models. A new global anisotropic horizontal forecast error correlation model accounts for a varying distribution of observations with latitude. Correlations are largest in the zonal direction in the tropics, where data are sparse. The forecast error variance model is proportional to the ozone field. The forecast error covariance parameters were determined by maximum likelihood estimation. The error covariance models are validated using chi-squared statistics. The analyzed ozone fields in winter 1992 are validated against independent observations from ozonesondes and the Halogen Occultation Experiment (HALOE). There is better than 10% agreement between mean HALOE and analysis fields between 70 and 0.2 hPa. The global root-mean-square (RMS) difference between TOMS observed and forecast values is less than 4%. The global RMS difference between SBUV observed and analyzed ozone between 50 and 3 hPa is less than 15%.
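
    The anisotropic correlation model can be illustrated with a toy functional form. The stretched zonal length scale toward the equator below is an assumption for illustration, not the PSAS formulation:

```python
import numpy as np

def aniso_corr(dlon_km, dlat_km, lat_deg, L_lat=400.0, L_eq=800.0):
    """Hypothetical anisotropic forecast-error correlation: the zonal
    length scale stretches toward the equator; the meridional one is fixed.
    The Gaussian shape and the length scales are illustrative assumptions."""
    L_lon = L_lat + (L_eq - L_lat) * np.cos(np.radians(lat_deg)) ** 2
    return np.exp(-0.5 * ((dlon_km / L_lon) ** 2 + (dlat_km / L_lat) ** 2))

# Correlations fall off more slowly zonally at the equator than at 60 N
c_eq = aniso_corr(500.0, 0.0, 0.0)
c_60 = aniso_corr(500.0, 0.0, 60.0)
```

    Such a correlation function reproduces the qualitative behavior described in the abstract: at a fixed zonal separation, tropical correlations remain higher than mid-latitude ones.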

  8. Origins and properties of kappa distributions in space plasmas

    NASA Astrophysics Data System (ADS)

    Livadiotis, George

    2016-07-01

    Classical particle systems reside at thermal equilibrium with their velocity distribution function stabilized into a Maxwell distribution. In contrast, collisionless and correlated particle systems, such as space and astrophysical plasmas, are characterized by non-Maxwellian behavior, typically described by the so-called kappa distributions. Empirical kappa distributions have become increasingly widespread across space and plasma physics. However, a breakthrough in the field came with the connection of kappa distributions to the solid statistical framework of Tsallis non-extensive statistical mechanics. Understanding the statistical origin of kappa distributions was the cornerstone of further theoretical developments and applications, some of which will be presented in this talk: (i) the physical meaning of thermal parameters, e.g., temperature and kappa index; (ii) the multi-particle description of kappa distributions; (iii) the phase-space kappa distribution of a Hamiltonian with non-zero potential; (iv) the Sackur-Tetrode entropy for kappa distributions; and (v) the new quantization constant, h* ≈ 10^-22 J s.
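
    A minimal numerical sketch of a one-dimensional kappa distribution, and its heavier-than-Maxwellian tail, is given below; the normalization follows one common convention, since conventions vary across the literature:

```python
import numpy as np
from math import gamma, sqrt, pi

def kappa_pdf(v, theta=1.0, kappa=4.0):
    """1-D kappa velocity distribution (one common convention):
    f(v) ~ (1 + v^2/(kappa*theta^2))^(-kappa); recovers a Maxwellian
    exp(-v^2/theta^2) in the limit kappa -> infinity."""
    norm = gamma(kappa) / (gamma(kappa - 0.5) * sqrt(pi * kappa) * theta)
    return norm * (1.0 + v ** 2 / (kappa * theta ** 2)) ** (-kappa)

def maxwell_pdf(v, theta=1.0):
    return np.exp(-(v ** 2) / theta ** 2) / (sqrt(pi) * theta)

# Check the normalization numerically (trapezoid rule on a wide grid)
v = np.linspace(-50.0, 50.0, 200_001)
y = kappa_pdf(v)
area = float(np.sum((y[:-1] + y[1:]) * np.diff(v)) / 2.0)
```

    At a few thermal speeds the kappa distribution exceeds the Maxwellian by orders of magnitude, which is the suprathermal-tail behavior observed in space plasmas.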

  9. Identification and characterization of earthquake clusters: a comparative analysis for selected sequences in Italy

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Gentili, Stefania

    2017-04-01

    Identification and statistical characterization of seismic clusters may provide useful insights about the features of seismic energy release and their relation to the physical properties of the crust within a given region. Moreover, a number of studies based on spatio-temporal analysis of main-shock occurrence require preliminary declustering of the earthquake catalogs. Since various methods, relying on different physical/statistical assumptions, may lead to diverse classifications of earthquakes into main events and related events, we aim to investigate the classification differences among declustering techniques. Accordingly, a formal selection and comparative analysis of earthquake clusters is carried out for the most relevant earthquakes in North-Eastern Italy, as reported in the local OGS-CRS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics since 1977. The comparison is then extended to selected earthquake sequences associated with a different seismotectonic setting, namely to events that occurred in the region struck by the recent destructive Central Italy earthquakes, making use of INGV data. Various techniques, ranging from classical space-time window methods to ad hoc manual identification of aftershocks, are applied for the detection of earthquake clusters. In particular, a statistical method based on nearest-neighbor distances of events in the space-time-energy domain is considered. Results from cluster identification by the nearest-neighbor method turn out to be quite robust with respect to the time span of the input catalogue, as well as to the minimum magnitude cutoff. The identified clusters for the largest events reported in North-Eastern Italy since 1977 are well consistent with those reported in earlier studies, which were aimed at detailed manual aftershock identification.
The study shows that the data-driven approach, based on nearest-neighbor distances, can be satisfactorily applied to decompose the seismic catalog into background seismicity and individual sequences of earthquake clusters, also in areas characterized by moderate seismic activity, where the standard declustering techniques may turn out to be rather gross approximations. With these results acquired, the main statistical features of seismic clusters are explored, including the complex interdependence of related events, with the aim of characterizing the space-time patterns of earthquake occurrence in North-Eastern Italy and capturing their basic differences from the Central Italy sequences.
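
    The nearest-neighbor space-time-energy distance used for cluster identification can be sketched as follows (a simplified Zaliapin-and-Ben-Zion-style construction on a toy catalog; the b-value, fractal dimension and catalog are illustrative, not those of the study):

```python
import numpy as np

def nn_distances(t, x, y, mag, b=1.0, df=1.6):
    """Nearest-neighbor space-time-magnitude distance for each event j:
    eta_ij = dt * dr**df * 10**(-b * m_i) over all earlier events i,
    keeping the minimizing parent. Simplified illustrative sketch."""
    n = len(t)
    eta = np.full(n, np.inf)
    parent = np.full(n, -1, dtype=int)
    for j in range(1, n):
        dt = t[j] - t[:j]                              # inter-event times
        dr = np.hypot(x[j] - x[:j], y[j] - y[:j])      # epicentral distances
        d = dt * np.maximum(dr, 1e-3) ** df * 10.0 ** (-b * mag[:j])
        i = int(np.argmin(d))
        eta[j], parent[j] = d[i], i
    return eta, parent

# Tiny synthetic catalog: a mainshock, two nearby aftershocks, and a
# distant, much later background event
t = np.array([0.0, 1.0, 2.0, 100.0])
x = np.array([0.0, 0.1, 0.2, 50.0])
y = np.array([0.0, 0.0, 0.1, 50.0])
m = np.array([5.0, 3.0, 3.0, 4.0])
eta, parent = nn_distances(t, x, y, m)
```

    The two aftershocks link back to the mainshock with small eta, while the distant event receives a much larger distance; thresholding eta is what separates clustered from background seismicity.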

  10. The evaluation of physical exam findings in patients assessed for suspected burn inhalation injury.

    PubMed

    Ching, Jessica A; Shah, Jehan L; Doran, Cody J; Chen, Henian; Payne, Wyatt G; Smith, David J

    2015-01-01

    The purpose of this investigation was to evaluate the utility of singed nasal hair (SN), carbonaceous sputum (CS), and facial burns (FB) as indicators of burn inhalation injury, compared to the accepted standard of bronchoscopic diagnosis of inhalation injury. An institutional review board-approved retrospective review was conducted. All patients were suspected to have burn inhalation injury and subsequently underwent bronchoscopic evaluation. Data collected included percent burn TBSA, burn injury mechanism, admission physical exam findings (SN, CS, FB), and bronchoscopy findings. Thirty-five males and twelve females met the inclusion criteria (n = 47). Bronchoscopy was normal in 31 patients (66%). Data were analyzed for all patients and in subgroups according to burn TBSA and an enclosed-space mechanism of injury. Physical exam findings (SN, CS, FB) were evaluated individually and in combination. Overall, the calculated sensitivities, specificities, positive predictive values, and negative predictive values were poor and inconsistent, and they did not improve within subgroup analysis or when physical findings were combined. Further statistical analysis suggested that the physical findings, whether in isolation or in combination, discriminate poorly between patients who do and do not have inhalation injury (AUC < 0.7, P > .05) and agree poorly with the diagnosis made by bronchoscopy (κ < 0.4, P > .05). This remained true in the subgroup analysis as well. Our data demonstrated that the findings of SN, CS, and FB are unreliable evidence of inhalation injury, even in the context of an enclosed-space mechanism of injury. Thus, these physical findings are not absolute indicators for intubation and should be interpreted as one component of the history and physical.
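
    The diagnostic metrics reported above can be computed from a standard 2x2 table; the sketch below uses invented counts, not the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-test metrics against a gold standard
    (here, bronchoscopy). The counts passed in are illustrative."""
    sens = tp / (tp + fn)   # sensitivity: positives correctly flagged
    spec = tn / (tn + fp)   # specificity: negatives correctly cleared
    ppv = tp / (tp + fp)    # positive predictive value
    npv = tn / (tn + fn)    # negative predictive value
    return sens, spec, ppv, npv

# e.g. 12 true positives, 20 false positives, 4 false negatives, 11 true negatives
sens, spec, ppv, npv = diagnostic_metrics(12, 20, 4, 11)
```

    With counts like these, a finding can have reasonable sensitivity yet a poor positive predictive value, which is the pattern of unreliability the abstract describes.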

  11. Tree-space statistics and approximations for large-scale analysis of anatomical trees.

    PubMed

    Feragen, Aasa; Owen, Megan; Petersen, Jens; Wille, Mathilde M W; Thomsen, Laura H; Dirksen, Asger; de Bruijne, Marleen

    2013-01-01

    Statistical analysis of anatomical trees is hard to perform due to differences in the topological structure of the trees. In this paper we define statistical properties of leaf-labeled anatomical trees with geometric edge attributes by considering the anatomical trees as points in the geometric space of leaf-labeled trees. This tree-space is a geodesic metric space where any two trees are connected by a unique shortest path, which corresponds to a tree deformation. However, tree-space is not a manifold, and the usual strategy of performing statistical analysis in a tangent space and projecting onto tree-space is not available. Using tree-space and its shortest paths, a variety of statistical properties, such as mean, principal component, hypothesis testing and linear discriminant analysis can be defined. For some of these properties it is still an open problem how to compute them; others (like the mean) can be computed, but efficient alternatives are helpful in speeding up algorithms that use means iteratively, like hypothesis testing. In this paper, we take advantage of a very large dataset (N = 8016) to obtain computable approximations, under the assumption that the data trees parametrize the relevant parts of tree-space well. Using the developed approximate statistics, we illustrate how the structure and geometry of airway trees vary across a population and show that airway trees with Chronic Obstructive Pulmonary Disease come from a different distribution in tree-space than healthy ones. Software is available from http://image.diku.dk/aasa/software.php.

  12. Empirical analysis of storm-time energetic electron enhancements

    NASA Astrophysics Data System (ADS)

    O'Brien, Thomas Paul, III

    This Ph.D. thesis documents a program for studying the appearance of energetic electrons in the Earth's outer radiation belts that is associated with many geomagnetic storms. The dynamic evolution of the electron radiation belts is an outstanding empirical problem in both theoretical space physics and its applied sibling, space weather. The project emphasizes the development of empirical tools and their use in testing several theoretical models of the energization of the electron belts. First, I develop the Statistical Asynchronous Regression technique to provide proxy electron fluxes throughout the parts of the radiation belts explored by geosynchronous and GPS spacecraft. Next, I show that a theoretical adiabatic model can relate the local time asymmetry of the proxy geosynchronous fluxes to the asymmetry of the geomagnetic field. Then, I perform a superposed epoch analysis on the proxy fluxes at local noon to identify magnetospheric and interplanetary precursors of relativistic electron enhancements. Finally, I use statistical and neural network phase space analyses to determine the hourly evolution of flux at a virtual stationary monitor. The dynamic equation quantitatively identifies the importance of different drivers of the electron belts. This project provides empirical constraints on theoretical models of electron acceleration.
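
    A superposed epoch analysis of the kind used in this work can be sketched in a few lines: segments of a series are extracted around key times and averaged, so that a repeatable response survives while uncorrelated fluctuations average out. The signal and event times below are synthetic:

```python
import numpy as np

def superposed_epoch(series, key_times, window):
    """Average a time series over windows centered on key times (epochs)."""
    segs = [series[k - window : k + window + 1]
            for k in key_times
            if k - window >= 0 and k + window + 1 <= len(series)]
    return np.mean(segs, axis=0)

rng = np.random.default_rng(0)
n = 2000
sig = rng.normal(size=n)                  # background noise
events = np.arange(100, n - 100, 200)     # synthetic "storm" onset times
for k in events:
    sig[k : k + 5] += 3.0                 # a repeatable response after each event
avg = superposed_epoch(sig, events, window=20)
```

    The averaged epoch shows the response clearly at and after the key time, while the pre-event portion stays near zero; significance of such a bump should still be tested, as the record in entry 1 above emphasizes.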

  13. Reentry survivability modeling

    NASA Astrophysics Data System (ADS)

    Fudge, Michael L.; Maher, Robert L.

    1997-10-01

    Statistical methods for expressing the impact risk posed to space systems in general [and the International Space Station (ISS) in particular] by other resident space objects have been examined. One of the findings of this investigation is that there are legitimate physical modeling reasons for the common statistical expression of the collision risk. A combination of statistical methods and physical modeling is also used to express the impact risk posed by re-entering space systems to objects of interest (e.g., people and property) on Earth. One of the largest uncertainties in expressing this risk is the estimation of the material that survives reentry to impact Earth's surface. This point was recently demonstrated in dramatic fashion by the impact of an intact expendable launch vehicle (ELV) upper stage near a private residence in the continental United States. Since approximately half of the missions supporting ISS will utilize ELVs, it is appropriate to examine the methods used to estimate the amount and physical characteristics of ELV debris surviving reentry to impact Earth's surface. This paper examines reentry survivability estimation methodology, including the specific methodology used by Caiman Sciences' 'Survive' model. Comparisons between empirical results (observations of objects recovered on Earth after surviving reentry) and Survive estimates are presented for selected upper stage and spacecraft components and a Delta launch vehicle second stage.

  14. Using Space Syntax to Assess Safety in Public Areas - Case Study of Tarbiat Pedestrian Area, Tabriz-Iran

    NASA Astrophysics Data System (ADS)

    Cihangir Çamur, Kübra; Roshani, Mehdi; Pirouzi, Sania

    2017-10-01

    In studying complex urban issues, simulation and modelling of public space use considerably help in determining and measuring factors such as urban safety. In this study, Depthmap software was used to determine the parameters of the spatial layout technique, and Statistical Package for the Social Sciences (SPSS) software was used to analyse and evaluate the views of pedestrians on public safety. Connectivity, integration, and depth of the area in the Tarbiat city blocks were measured using the Space Syntax Method, and these parameters are presented as graphical and mathematical data. The combination of the results obtained from the questionnaire and statistical analysis with the results of the spatial arrangement technique identifies the spaces that are appropriate and inappropriate for pedestrians. This method provides a useful and effective instrument for decision makers, planners, urban designers and programmers in order to evaluate public spaces in the city. Prior to physical modification of urban public spaces, space syntax simulates pedestrian safety to be used as an analytical tool by the city management. Finally, regarding the modelled parameters and identification of different characteristics of the case, this study presents strategies and policies to increase the safety of pedestrians in Tarbiat, Tabriz.
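
    The depth measure used in space syntax can be sketched on a toy axial graph: the topological depth of each space is its shortest-path step count from a root, and integration is conventionally derived from mean depth by normalization. This is an illustrative calculation, not Depthmap's implementation:

```python
from collections import deque

def mean_depth(adj, start):
    """Mean topological depth of all other spaces from `start`,
    via breadth-first search on an adjacency map."""
    depth = {start: 0}
    q = deque([start])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in depth:
                depth[v] = depth[u] + 1
                q.append(v)
    others = [d for node, d in depth.items() if node != start]
    return sum(others) / len(others)

# A small street network: node 0 is a well-connected square, node 4 a cul-de-sac
adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1], 3: [0, 4], 4: [3]}
shallow = mean_depth(adj, 0)   # low mean depth -> well integrated
deep = mean_depth(adj, 4)      # high mean depth -> segregated
```

    Spaces with low mean depth are the highly integrated ones, which space syntax studies typically associate with more co-presence and, in studies like this one, with perceived safety.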

  15. Solar-Heliospheric-Interstellar Cosmic Ray Tour with the NASA Virtual Energetic Particle Observatory and the Space Physics Data Facility

    NASA Astrophysics Data System (ADS)

    Cooper, John F.; Papitashvili, Natalia E.; Johnson, Rita C.; Lal, Nand; McGuire, Robert E.

    2015-04-01

    NASA now has a large collection of solar, heliospheric, and local interstellar (Voyager 1) cosmic ray particle data sets that can be accessed through the data system services of the NASA Virtual Energetic Particle Observatory (VEPO) in collaboration with the NASA Space Physics Data Facility (SPDF), respectively led by the first and last authors. The VEPO services were developed to enhance the long-existing OMNIWeb solar wind and energetic particle services of SPDF for on-line browse, correlative, and statistical analysis of NASA and ESA mission fields, plasma, and energetic particle data. In this presentation we take a tour through VEPO and SPDF of SEP reservoir events, the outer heliosphere earlier surveyed by the Pioneer, Voyager, and Ulysses spacecraft and now being probed by New Horizons, and the heliosheath-heliopause-interstellar regions now being explored by the Voyagers and IBEX. Implications of the latter measurements are also considered for the flux spectra of low- to high-energy cosmic rays in interstellar space.

  16. A spatially informative optic flow model of bee colony with saccadic flight strategy for global optimization.

    PubMed

    Das, Swagatam; Biswas, Subhodip; Panigrahi, Bijaya K; Kundu, Souvik; Basu, Debabrota

    2014-10-01

    This paper presents a novel search metaheuristic inspired by the physical interpretation of the optic flow of information in honeybees about their spatial surroundings, which helps them orient themselves and navigate through the search space while foraging. The interpreted behavior, combined with minimal foraging, is simulated by the artificial bee colony algorithm to develop a robust search technique that exhibits elevated performance in a multidimensional objective space. Through a detailed experimental study and rigorous analysis, we highlight the statistical superiority enjoyed by our algorithm over a wide variety of functions as compared to some highly competitive state-of-the-art methods.

  17. Disability in physical education textbooks: an analysis of image content.

    PubMed

    Táboas-Pais, María Inés; Rey-Cao, Ana

    2012-10-01

    The aim of this paper is to show how images of disability are portrayed in physical education textbooks for secondary schools in Spain. The sample was composed of 3,316 images published in 36 textbooks by 10 publishing houses. A content analysis was carried out using a coding scheme based on categories employed in other similar studies and adapted to the requirements of this study with additional categories. The variables were camera angle, gender, type of physical activity, field of practice, space, and level. Univariate and bivariate descriptive analyses were also carried out. The Pearson chi-square statistic was used to identify associations between the variables. Results showed a noticeable imbalance between people with disabilities and people without disabilities, and women with disabilities were less frequently represented than men with disabilities. People with disabilities were depicted as participating in a very limited variety of segregated, competitive, and elite sports activities.
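
    The Pearson chi-square test of association mentioned above can be sketched on a small contingency table; the counts are invented, not the study's data:

```python
import numpy as np

def chi2_stat(table):
    """Pearson chi-square statistic for a contingency table:
    sum over cells of (observed - expected)^2 / expected, where
    expected counts come from the row and column margins."""
    table = np.asarray(table, dtype=float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row * col / table.sum()
    return float(((table - expected) ** 2 / expected).sum())

# Hypothetical 2x2 table: gender (rows) x depicted-with-disability (cols)
obs = [[30, 90], [10, 70]]
stat = chi2_stat(obs)
```

    The statistic is then compared against a chi-squared distribution with (rows-1)(cols-1) degrees of freedom (1 here) to decide whether the variables are associated.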

  18. Accelerated testing of space batteries

    NASA Technical Reports Server (NTRS)

    Mccallum, J.; Thomas, R. E.; Waite, J. H.

    1973-01-01

    An accelerated life test program for space batteries is presented that fully satisfies empirical, statistical, and physical criteria for validity. The program includes thermal and other nonmechanical stress analyses as well as mechanical stress, strain, and rate of strain measurements.

  19. Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach.

    PubMed

    Chertkov, Michael; Chernyak, Vladimir

    2017-08-17

    Thermostatically controlled loads, e.g., air conditioners and heaters, are by far the most widespread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control - changing from on to off, and vice versa, depending on temperature. We considered aggregation of a large group of similar devices into a statistical ensemble, where the devices operate following the same dynamics, subject to stochastic perturbations and randomized, Poisson on/off switching policy. Using theoretical and computational tools of statistical physics, we analyzed how the ensemble relaxes to a stationary distribution and established a relationship between the relaxation and the statistics of the probability flux associated with devices' cycling in the mixed (discrete, switch on/off, and continuous temperature) phase space. This allowed us to derive the spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how switching policy affects oscillatory trends and the speed of the relaxation. Relaxation of the ensemble is of practical interest because it describes how the ensemble recovers from significant perturbations, e.g., forced temporary switching off aimed at utilizing the flexibility of the ensemble to provide "demand response" services to change consumption temporarily to balance a larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.
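
    A toy simulation of such an ensemble is sketched below. For simplicity it uses deterministic bang-bang switching at the edges of a dead band rather than the randomized Poisson policy analyzed in the paper, and all parameter values are invented; the ensemble nevertheless relaxes to a stationary spread of temperatures across the dead band:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_tcls(n=5000, steps=2000, dt=0.01):
    """Toy TCL ensemble: each device's temperature relaxes linearly toward
    18 (compressor on) or 22 (off) plus small noise, and switches state at
    the dead-band edges 19 and 21. Illustrative parameters only."""
    T = rng.uniform(19.0, 21.0, n)       # initial temperatures
    on = rng.random(n) < 0.5             # initial compressor states
    for _ in range(steps):
        drift = np.where(on, -(T - 18.0), -(T - 22.0))
        T = T + dt * drift + np.sqrt(dt) * 0.1 * rng.normal(size=n)
        on = np.where(T > 21.0, True, np.where(T < 19.0, False, on))
    return T, on

T, on = simulate_tcls()
```

    Starting all devices in the same state instead (a "forced switching" perturbation) and watching the temperature histogram decay back to this stationary spread is exactly the relaxation the abstract studies.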

  20. Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach

    DOE PAGES

    Chertkov, Michael; Chernyak, Vladimir

    2017-01-17

    Thermostatically Controlled Loads (TCL), e.g. air-conditioners and heaters, are by far the most widespread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control of temperature - changing from on to off, and vice versa, depending on temperature. Aggregation of a large group of similar devices into a statistical ensemble is considered, where the devices operate following the same dynamics subject to stochastic perturbations and a randomized, Poisson on/off switching policy. We analyze, using theoretical and computational tools of statistical physics, how the ensemble relaxes to a stationary distribution and establish a relation between the relaxation and the statistics of the probability flux associated with devices' cycling in the mixed (discrete, switch on/off, and continuous, temperature) phase space. This allowed us to derive and analyze the spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how the switching policy affects oscillatory trends and the speed of the relaxation. Relaxation of the ensemble is of practical interest because it describes how the ensemble recovers from significant perturbations, e.g. forced temporary switching off aimed at utilizing the flexibility of the ensemble in providing "demand response" services, reducing consumption temporarily to balance a larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.

  1. Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chertkov, Michael; Chernyak, Vladimir

    Thermostatically Controlled Loads (TCL), e.g. air-conditioners and heaters, are by far the most widespread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control of temperature - changing from on to off, and vice versa, depending on temperature. Aggregation of a large group of similar devices into a statistical ensemble is considered, where the devices operate following the same dynamics subject to stochastic perturbations and a randomized, Poisson on/off switching policy. We analyze, using theoretical and computational tools of statistical physics, how the ensemble relaxes to a stationary distribution and establish a relation between the relaxation and the statistics of the probability flux associated with devices' cycling in the mixed (discrete, switch on/off, and continuous, temperature) phase space. This allowed us to derive and analyze the spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how the switching policy affects oscillatory trends and the speed of the relaxation. Relaxation of the ensemble is of practical interest because it describes how the ensemble recovers from significant perturbations, e.g. forced temporary switching off aimed at utilizing the flexibility of the ensemble in providing "demand response" services, reducing consumption temporarily to balance a larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.

  2. Multivariate space - time analysis of PRE-STORM precipitation

    NASA Technical Reports Server (NTRS)

    Polyak, Ilya; North, Gerald R.; Valdes, Juan B.

    1994-01-01

    This paper presents the methodologies and results of the multivariate modeling and two-dimensional spectral and correlation analysis of PRE-STORM rainfall gauge data. Estimated parameters of the models for the specific spatial averages clearly indicate the eastward and southeastward wave propagation of rainfall fluctuations. A relationship between the coefficients of the diffusion equation and the parameters of the stochastic model of rainfall fluctuations is derived that leads directly to the exclusive use of rainfall data to estimate advection speed (about 12 m/s) as well as other coefficients of the diffusion equation of the corresponding fields. The statistical methodology developed here can be used for confirmation of physical models by comparison of the corresponding second-moment statistics of the observed and simulated data, for generating multiple samples of any size, for solving the inverse problem of the hydrodynamic equations, and for application in some other areas of meteorological and climatological data analysis and modeling.
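
    The idea of estimating advection speed directly from rainfall records can be sketched with a lagged cross-correlation between two gauges; the gauge spacing, delay, and signal below are synthetic, chosen to reproduce the roughly 12 m/s figure quoted above (this is a simpler surrogate for the paper's multivariate space-time modeling):

```python
import numpy as np

rng = np.random.default_rng(3)

# Two rain gauges 7.2 km apart along the storm track; the downstream gauge
# sees the same (synthetic) rainfall signal delayed by 600 s
dt = 60.0                        # sampling interval [s]
n = 4096
west = rng.normal(size=n)        # synthetic rainfall-fluctuation series
lag_true = 10                    # delay in samples (10 * 60 s = 600 s)
east = np.roll(west, lag_true)

# Lag of the cross-correlation maximum gives the travel time
xc = np.correlate(east, west, mode="full")
lag = int(np.argmax(xc)) - (n - 1)
speed = 7200.0 / (lag * dt)      # advection speed [m/s]
```

    With real, noisy gauge data the correlation peak is broader and the lag is estimated jointly over many station pairs, which is essentially what the fitted space-time model parameters encode.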

  3. Statistical Methods in Physical Oceanography: Proceedings of ’Aha Huliko’a Hawaiian Winter Workshop Held in Manoa, Hawaii on January 12-15, 1993

    DTIC Science & Technology

    1993-11-01

    field X(t) at time t_i. T_ij is the set of all times when both p_i and p_j have been observed, and n_ij is the number of elements in T_ij. Definition Eq. (22) is...termed contour analysis, for melding of oceanic data and for space-time interpolation of gappy frontal data sets. The key elements of contour analysis...plane, and let Ω(Γ) be the set of all straight lines intersecting Γ. Directly measuring the number of intersections between a random element ω ∈ Ω(Γ) and

  4. The Statistical Mechanics of Ideal MHD Turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    2003-01-01

    Turbulence is a universal, nonlinear phenomenon found in all energetic fluid and plasma motion. In particular, understanding magnetohydrodynamic (MHD) turbulence and incorporating its effects in the computation and prediction of the flow of ionized gases in space, for example, are great challenges that must be met if such computations and predictions are to be meaningful. Although a general solution to the "problem of turbulence" does not exist in closed form, numerical integrations allow us to explore the phase space of solutions for both ideal and dissipative flows. For homogeneous, incompressible turbulence, Fourier methods are appropriate, and phase space is defined by the Fourier coefficients of the physical fields. In the case of ideal MHD flows, a fairly robust statistical mechanics has been developed, in which the symmetry and ergodic properties of phase space are understood. A discussion of these properties will illuminate our principal discovery: coherent structure and randomness co-exist in ideal MHD turbulence. For dissipative flows, as opposed to ideal flows, progress beyond the dimensional analysis of Kolmogorov has been difficult. Here, some possible future directions that draw on the ideal results will also be discussed. Our conclusion will be that while ideal turbulence is now well understood, real turbulence still presents great challenges.

  5. Analysis of thrips distribution: application of spatial statistics and Kriging

    Treesearch

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
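
    A minimal ordinary-kriging predictor of the kind described can be sketched as follows; the spherical variogram and its parameters are illustrative, whereas a real application would fit them to the thrips counts:

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, sill=1.0, vrange=5.0, nugget=0.0):
    """Ordinary kriging with a spherical variogram: solve the system
    [Gamma 1; 1' 0][w; mu] = [gamma0; 1] and return w . z.
    Variogram model and parameters are illustrative assumptions."""
    def gamma(h):
        h = np.minimum(h / vrange, 1.0)
        return nugget + (sill - nugget) * (1.5 * h - 0.5 * h ** 3)
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)          # variogram between observation pairs
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy - xy0, axis=-1))  # target vs. observations
    w = np.linalg.solve(A, b)[:n]                     # kriging weights (sum to 1)
    return float(w @ z)

# Three observed locations and values; prediction at an observed point is exact
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
z = np.array([1.0, 2.0, 3.0])
pred_at_obs = ordinary_kriging(xy, z, np.array([0.0, 0.0]))
pred_new = ordinary_kriging(xy, z, np.array([0.5, 0.0]))
```

    Exact interpolation at data points (with zero nugget) and weights constrained to sum to one are the properties that distinguish kriging from ordinary regression smoothing of spatially correlated counts.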

  6. A Handbook of Sound and Vibration Parameters

    DTIC Science & Technology

    1978-09-18

    fixed in space (Reference 1): no motion at a node. Static Divergence: (See Divergence.) Statistical Energy Analysis (SEA): Statistical energy analysis is...parameters of the circuits come from statistics of the vibrational characteristics of the structure. Statistical energy analysis is uniquely successful

  7. Is math anxiety in the secondary classroom limiting physics mastery? A study of math anxiety and physics performance

    NASA Astrophysics Data System (ADS)

    Mercer, Gary J.

    This quantitative study examined the relationship between secondary students with math anxiety and physics performance in an inquiry-based constructivist classroom. The Revised Math Anxiety Rating Scale was used to evaluate math anxiety levels. The results were then compared to the performance on a physics standardized final examination. A simple correlation was performed, followed by a multivariate regression analysis to examine effects based on gender and prior math background. The correlation showed statistical significance between math anxiety and physics performance. The regression analysis showed statistical significance for math anxiety, physics performance, and prior math background, but did not show statistical significance for math anxiety, physics performance, and gender.
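
    The correlation-then-regression analysis described above can be sketched on synthetic scores; the coefficients and noise level are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200

# Synthetic data: higher math anxiety lowers the physics exam score,
# prior math background raises it (all effect sizes invented)
anxiety = rng.uniform(0, 100, n)        # e.g. RMARS-like score
background = rng.uniform(0, 4, n)       # e.g. years of prior math
score = 80 - 0.3 * anxiety + 4.0 * background + rng.normal(0, 5, n)

# Step 1: simple correlation between anxiety and performance
r = np.corrcoef(anxiety, score)[0, 1]

# Step 2: regression with both predictors (ordinary least squares)
X = np.column_stack([np.ones(n), anxiety, background])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
```

    The simple correlation captures the raw anxiety-performance relationship, while the multiple regression separates the anxiety effect from the prior-background effect, mirroring the study's two-step design (gender would enter as a third, coded predictor).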

  8. SEPEM: A tool for statistical modeling the solar energetic particle environment

    NASA Astrophysics Data System (ADS)

    Crosby, Norma; Heynderickx, Daniel; Jiggens, Piers; Aran, Angels; Sanahuja, Blai; Truscott, Pete; Lei, Fan; Jacobs, Carla; Poedts, Stefaan; Gabriel, Stephen; Sandberg, Ingmar; Glover, Alexi; Hilgers, Alain

    2015-07-01

    Solar energetic particle (SEP) events are a serious radiation hazard for spacecraft as well as a severe health risk to humans traveling in space. Indeed, accurate modeling of the SEP environment constitutes a priority requirement for astrophysics and solar system missions and for human exploration in space. The European Space Agency's Solar Energetic Particle Environment Modelling (SEPEM) application server is a World Wide Web interface to a complete set of cross-calibrated data ranging from 1973 to 2013 as well as new SEP engineering models and tools. Both statistical and physical modeling techniques have been included, in order to cover the environment not only at 1 AU but also in the inner heliosphere ranging from 0.2 AU to 1.6 AU using a newly developed physics-based shock-and-particle model to simulate particle flux profiles of gradual SEP events. With SEPEM, SEP peak flux and integrated fluence statistics can be studied, as well as durations of high SEP flux periods. Furthermore, effects tools are also included to allow calculation of single event upset rate and radiation doses for a variety of engineering scenarios.

  9. The effects of context on multidimensional spatial cognitive models. Ph.D. Thesis - Arizona Univ.

    NASA Technical Reports Server (NTRS)

    Dupnick, E. G.

    1979-01-01

    Spatial cognitive models obtained by multidimensional scaling represent cognitive structure by defining alternatives as points in a coordinate space based on relevant dimensions such that interstimulus dissimilarities perceived by the individual correspond to distances between the respective alternatives. The dependence of spatial models on the context of the judgments required of the individual was investigated. Context, which is defined as a perceptual interpretation and cognitive understanding of a judgment situation, was analyzed and classified with respect to five characteristics: physical environment, social environment, task definition, individual perspective, and temporal setting. Four experiments designed to produce changes in the characteristics of context and to test the effects of these changes upon individual cognitive spaces are described with focus on experiment design, objectives, statistical analysis, results, and conclusions. The hypothesis is advanced that an individual can be characterized as having a master cognitive space for a set of alternatives. When the context changes, the individual appears to change the dimension weights to give a new spatial configuration. Factor analysis was used in the interpretation and labeling of cognitive space dimensions.
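
    Classical (Torgerson) multidimensional scaling, one standard way of obtaining such spatial cognitive models from dissimilarity judgments, can be sketched as follows; the four-point configuration is a toy example, not data from the experiments:

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: double-center the squared dissimilarity
    matrix to get a Gram matrix, then embed via its top-k eigenvectors."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J               # Gram matrix of centered coordinates
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]             # top-k eigenvalues
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Dissimilarities between four "stimuli" at the corners of a unit square
pts = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
X = classical_mds(D)
D_rec = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
```

    For exact Euclidean dissimilarities the embedding reproduces all interstimulus distances (up to rotation and reflection); reweighting the recovered dimensions, as in the master-cognitive-space hypothesis above, then yields a context-specific configuration.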

  10. The Practicality of Statistical Physics Handout Based on KKNI and the Constructivist Approach

    NASA Astrophysics Data System (ADS)

    Sari, S. Y.; Afrizon, R.

    2018-04-01

    Statistical physics lectures show that: 1) the performance of lecturers, the social climate, and the students' competence and soft skills needed at work are in the "adequate" category; 2) students have difficulty following statistical physics lectures because the material is abstract; 3) 40.72% of students need reinforcement in the form of repetition, practice questions, and structured tasks; and 4) the depth of the statistical physics material needs to be improved gradually and in a structured way. This indicates that learning materials aligned with the Indonesian National Qualification Framework, or Kerangka Kualifikasi Nasional Indonesia (KKNI), together with an appropriate learning approach, are needed to help lecturers and students in lectures. The authors have designed statistical physics handouts that meet the "very valid" criterion (90.89%) according to expert judgment. In addition, the practicality of the handouts must also be considered, so that they are easy to use, interesting, and efficient in lectures. The purpose of this research is to determine the practicality of a statistical physics handout based on KKNI and a constructivist approach. This research is part of a research-and-development effort following the 4-D model developed by Thiagarajan, and has reached the development-testing portion of the Develop stage. Data were collected with a questionnaire distributed to lecturers and students and analyzed using descriptive techniques in the form of percentages. The analysis of the questionnaire shows that the statistical physics handout meets the "very practical" criterion. The conclusion of this study is that statistical physics handouts based on KKNI and a constructivist approach are practical for use in lectures.

  11. Integrating Condensed Matter Physics into a Liberal Arts Physics Curriculum

    NASA Astrophysics Data System (ADS)

    Collett, Jeffrey

    2008-03-01

    The emergence of nanoscale science into the popular consciousness presents an opportunity to attract and retain future condensed matter scientists. We inject nanoscale physics into recruiting activities and into the introductory and core portions of the curriculum. Laboratory involvement and research opportunities play important roles in maintaining student engagement. We use inexpensive scanning tunneling (STM) and atomic force (AFM) microscopes to introduce students to nanoscale structure early in their college careers. Although the physics of tip-surface interactions is sophisticated, the resulting images can be interpreted intuitively. We use the STM in introductory modern physics to explore quantum tunneling and the properties of electrons at surfaces. An interdisciplinary nanoscience and nanotechnology course, team-taught with chemists, looks at nanoscale phenomena in physics, chemistry, and biology. Core quantum and statistical physics courses look at effects of quantum mechanics and quantum statistics in degenerate systems. An upper-level solid-state physics course takes up traditional condensed matter topics from a structural perspective, beginning with a study of both elastic and inelastic scattering of x-rays from crystalline solids and liquid crystals. Students encounter reciprocal-space concepts through the analysis of laboratory scattering data and through the development of the scattering theory. The course then examines the importance of scattering processes in band structure and in electrical and thermal conduction. A segment of the course is devoted to surface physics and nanostructures, where we explore the effects of restricting particles to two-dimensional surfaces, one-dimensional wires, and zero-dimensional quantum dots.

  12. Current algebra, statistical mechanics and quantum models

    NASA Astrophysics Data System (ADS)

    Vilela Mendes, R.

    2017-11-01

    Results obtained in the past for free boson systems at zero and nonzero temperatures are revisited to clarify the physical meaning of current algebra reducible functionals, which are associated with systems with density fluctuations, leading to observable effects on phase transitions. Using current algebra as a tool for the formulation of quantum statistical mechanics amounts to constructing unitary representations of diffeomorphism groups. Two mathematically equivalent procedures exist for this purpose: one searches for quasi-invariant measures on configuration spaces, the other for a cyclic vector in Hilbert space. Here, it is argued that the second approach is closer to physical intuition when modelling complex systems. An example of the application of the current algebra methodology to the pairing phenomenon in two-dimensional fermion systems is discussed.

  13. The space physics analysis network

    NASA Astrophysics Data System (ADS)

    Green, James L.

    1988-04-01

    The Space Physics Analysis Network (SPAN) has been operational for nearly seven years and is emerging as a viable solution to an immediate communication problem for space and Earth scientists. SPAN, together with its extension into Europe, uses computer-to-computer communications to provide mail, binary and text file transfer, and remote logon capability to over 1000 space science computer systems. The network has been used successfully to transfer real-time data to remote researchers for rapid data analysis, but its primary function is non-real-time applications. One of the major advantages of SPAN is its independence from any particular spacecraft mission. Space science researchers using SPAN are located in universities, industry and government institutions across the United States and Europe, in fields such as magnetospheric physics, astrophysics, ionospheric physics, atmospheric physics, climatology, meteorology, oceanography, planetary physics and solar physics. SPAN users have access to space and Earth science databases, mission planning and information systems, and computational facilities that support correlative space data exchange, data analysis and space research. For example, the National Space Science Data Center (NSSDC), which manages the network, provides facilities on SPAN such as the Network Information Center (SPAN NIC). SPAN interconnects with several national and international networks, such as HEPNET and TEXNET, forming a transparent DECnet network; the combined total number of computers reachable over these networks is about 2000. In addition, SPAN supports full-function capabilities over the international public packet-switched networks (e.g. TELENET) and has mail gateways to ARPANET, BITNET and JANET.

  14. Statistical representation of a spray as a point process

    NASA Astrophysics Data System (ADS)

    Subramaniam, S.

    2000-10-01

    The statistical representation of a spray as a finite point process is investigated. One objective is to develop a better understanding of how single-point statistical information contained in descriptions such as the droplet distribution function (ddf) relates to the probability density functions (pdfs) associated with the droplets themselves. The single-point statistical information contained in the ddf is shown to be related to a sequence of single surrogate-droplet pdfs, which are in general different from the physical single-droplet pdfs. It is shown that the ddf contains less information than the fundamental single-point statistical representation of the spray, which is also described. The analysis shows which events associated with the ensemble of spray droplets can be characterized by the ddf, and which cannot. The implications of these findings for the ddf approach to spray modeling are discussed. The results of this study also have important consequences for the initialization and evolution of direct numerical simulations (DNS) of multiphase flows, which are usually initialized on the basis of single-point statistics such as the droplet number density in physical space. If multiphase DNS are initialized in this way, even the initial representation contains implicit assumptions concerning the complete ensemble of realizations, and these assumptions are invalid for general multiphase flows. Moreover, the evolution of a DNS initialized in this manner is shown to be valid only if an as-yet-unproven commutation hypothesis holds true. It is therefore questionable to what extent DNS initialized in this manner constitute a direct simulation of the physical droplets. Implications of these findings for large eddy simulations of multiphase flows are also discussed.

  15. Non-Extensive Statistical Analysis of Magnetic Field and SEPs during the March 2012 ICME event, using a multi-spacecraft approach

    NASA Astrophysics Data System (ADS)

    Pavlos, George; Malandraki, Olga; Pavlos, Evgenios; Iliopoulos, Aggelos; Karakatsanis, Leonidas

    2017-04-01

    Since solar plasma exists far from equilibrium, it is an excellent laboratory for testing non-equilibrium statistical mechanics. In this study, we present the highlights of Tsallis non-extensive statistical mechanics concerning its applications to solar plasma dynamics, especially solar wind phenomena and the magnetosphere. We present new and significant results concerning the dynamics of interplanetary coronal mass ejections (ICMEs) observed in the near-Earth solar wind environment at L1, as well as their effect on Earth's magnetosphere. The results concern Tsallis non-extensive statistics and in particular the estimation of the Tsallis q-triplet (qstat, qsen, qrel) of SEP time series observed in interplanetary space and of magnetic field time series of the ICME observed at Earth, resulting from the solar eruptive activity of March 7, 2012 at the Sun. For the magnetic field, we used a multi-spacecraft approach based on data from the ACE, CLUSTER 4, THEMIS-E and THEMIS-C spacecraft. For the data analysis, different time periods were considered, sorted as "quiet", "shock" and "aftershock", and different space domains were also taken into account, such as interplanetary space (near Earth at L1 and upstream of the Earth's bow shock), the Earth's magnetosheath and the magnetotail. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the SEP profile in time and of the magnetic field dynamics in both the time and space domains during the shock event, in terms of the rate of entropy production, relaxation dynamics and non-equilibrium metastable stationary states. So far, Tsallis non-extensive statistical theory and the Tsallis extension of the Boltzmann-Gibbs entropy principle to the q-entropy principle (Tsallis, 1988, 2009) reveal a strongly universal character concerning non-equilibrium dynamics (Pavlos et al. 2012a,b, 2014, 2015, 2016; Karakatsanis et al. 2013).
The Tsallis q-entropy principle can explain the emergence of a series of new and significant physical characteristics in distributed systems as well as in space plasmas. Such characteristics are: non-Gaussian statistics and anomalous diffusion processes, strange and fractional dynamics, multifractal, percolating and intermittent turbulence structures, multiscale and long-range spatio-temporal correlations, fractional acceleration and non-equilibrium stationary states (NESS) or non-equilibrium self-organization processes, as well as non-equilibrium and topological phase transition processes according to Zelenyi and Milovanov (2004). In this direction, our results clearly reveal strong self-organization and the development of macroscopic ordering of the plasma system, related to a strengthening of non-extensivity, multifractality and intermittency everywhere in the space plasma region during the CME event. Acknowledgements: This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 637324.
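
    As a minimal numerical illustration of the q-entropy mentioned above (not the authors' analysis pipeline), the Tsallis entropy S_q = (1 - Σ p_i^q)/(q - 1) reduces to the Boltzmann-Gibbs entropy in the limit q → 1:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis q-entropy S_q = (1 - sum_i p_i^q) / (q - 1), with k_B = 1.
    The Boltzmann-Gibbs/Shannon entropy is recovered as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                            # 0 log 0 = 0 convention
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.full(8, 1.0 / 8.0)                   # uniform distribution over 8 states
print(tsallis_entropy(p, 1.0))              # ln 8 ≈ 2.0794 (extensive BG limit)
print(tsallis_entropy(p, 2.0))              # (1 - 8/64) / 1 = 0.875
```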

  16. Data mining and statistical inference in selective laser melting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamath, Chandrika

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
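
    One ingredient described above, a data-driven surrogate model standing in for expensive simulations or experiments, can be sketched as follows. The melt-depth function, parameter ranges, and sample sizes here are invented for illustration and are not the paper's models or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "experiment": melt-pool depth as a toy function of laser
# power P (W) and scan speed v (m/s). This is NOT a real SLM model.
def melt_depth(P, v):
    return 0.5 * P / np.sqrt(v) + rng.normal(0.0, 0.01, np.shape(P))

# Random sampling of the (P, v) design space stands in for a proper
# experimental design such as Latin hypercube sampling.
P = rng.uniform(100.0, 400.0, 50)
v = rng.uniform(0.2, 2.0, 50)
y = melt_depth(P, v)

# Quadratic polynomial surrogate fitted by least squares.
X = np.column_stack([np.ones_like(P), P, v, P * v, P ** 2, v ** 2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def surrogate(Pq, vq):
    """Cheap prediction that replaces an expensive simulation run."""
    xq = np.array([1.0, Pq, vq, Pq * vq, Pq ** 2, vq ** 2])
    return float(xq @ coef)

print(surrogate(250.0, 1.0))   # compare with the noise-free value 0.5*250/1 = 125
```

Once fitted, the surrogate can be evaluated thousands of times for sampling-based uncertainty analysis at negligible cost.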

  18. Non-extensive statistical analysis of magnetic field during the March 2012 ICME event using a multi-spacecraft approach

    NASA Astrophysics Data System (ADS)

    Pavlos, G. P.; Malandraki, O. E.; Pavlos, E. G.; Iliopoulos, A. C.; Karakatsanis, L. P.

    2016-12-01

    In this study we present new and significant results concerning the dynamics of interplanetary coronal mass ejections (ICMEs) observed in the near-Earth solar wind environment at L1, as well as their effect on Earth's magnetosphere. The results concern Tsallis non-extensive statistics and in particular the estimation of the Tsallis q-triplet (qstat, qsen, qrel) of magnetic field time series of the ICME observed at Earth, resulting from the solar eruptive activity of March 7, 2012 at the Sun. For this, we used a multi-spacecraft approach based on data from the ACE, CLUSTER 4, THEMIS-E and THEMIS-C spacecraft. For the data analysis, different time periods were considered, sorted as "quiet", "shock" and "aftershock", and different space domains were also taken into account, such as interplanetary space (near Earth at L1 and upstream of the Earth's bow shock), the Earth's magnetosheath and the magnetotail. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the magnetic field dynamics in both the time and space domains during the shock event, in terms of the rate of entropy production, relaxation dynamics and non-equilibrium metastable stationary states. So far, Tsallis non-extensive statistical theory and the Tsallis extension of the Boltzmann-Gibbs entropy principle to the q-entropy principle (Tsallis, 1988, 2009) reveal a strongly universal character concerning non-equilibrium dynamics (Pavlos et al. 2012a,b, 2014a,b; Karakatsanis et al. 2013). The Tsallis q-entropy principle can explain the emergence of a series of new and significant physical characteristics in distributed systems as well as in space plasmas.
Such characteristics are: non-Gaussian statistics and anomalous diffusion processes, strange and fractional dynamics, multifractal, percolating and intermittent turbulence structures, multiscale and long-range spatio-temporal correlations, fractional acceleration and non-equilibrium stationary states (NESS) or non-equilibrium self-organization processes, as well as non-equilibrium and topological phase transition processes according to Zelenyi and Milovanov (2004). In this direction, our results clearly reveal strong self-organization and the development of macroscopic ordering of the plasma system, related to a strengthening of non-extensivity, multifractality and intermittency everywhere in the space plasma region during the CME event.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoilova, N. I.

    Generalized quantum statistics, such as paraboson and parafermion statistics, are characterized by triple relations related to Lie (super)algebras of type B. The correspondence of the Fock spaces of parabosons and parafermions, as well as the Fock space of a combined system of parafermions and parabosons, to irreducible representations of (super)algebras of type B will be pointed out. An example of generalized quantum statistics connected to the basic classical Lie superalgebra B(1|1) ≡ osp(3|2), with interesting physical properties such as noncommutative coordinates, will be given. The article therefore focuses on the question, already addressed by Wigner in 1950: do the equations of motion determine the quantum mechanical commutation relations?

  20. Explaining Gibbsean phase space to second year students

    NASA Astrophysics Data System (ADS)

    Vesely, Franz J.

    2005-03-01

    A new approach to teaching introductory statistical physics is presented. We recommend making extensive use of the fact that even systems with very few degrees of freedom may display chaotic behaviour. This permits a didactic 'bottom-up' approach, starting with toy systems whose phase space may be depicted on a screen or blackboard, then proceeding to ever higher dimensions in Gibbsean phase space.
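
    A low-dimensional chaotic toy system of the kind advocated above is, for example, the Chirikov standard map, whose two-dimensional phase space fits on a blackboard. The kick strength and initial conditions below are arbitrary illustrative choices:

```python
import numpy as np

def standard_map(theta, p, K=5.0, steps=200):
    """Iterate the Chirikov standard map, an area-preserving toy system
    with a two-dimensional phase space; K = 5 is well into the chaotic regime."""
    traj = []
    for _ in range(steps):
        p = (p + K * np.sin(theta)) % (2 * np.pi)
        theta = (theta + p) % (2 * np.pi)
        traj.append((theta, p))
    return np.array(traj)

# Two almost identical initial conditions in the chaotic sea:
a = standard_map(3.0, 1.5)
b = standard_map(3.0 + 1e-8, 1.5)
sep = np.abs(a[:, 0] - b[:, 0])
print(sep[0], sep.max())   # tiny at first, then macroscopic: sensitive dependence
```

Plotting many such trajectories as (theta, p) scatter points reproduces the familiar mixed phase-space portraits that motivate the 'bottom-up' approach.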

  1. Wood Products Analysis

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Structural Reliability Consultants' computer program creates graphic plots showing the statistical parameters of glue laminated timbers, or 'glulam.' The company president, Dr. Joseph Murphy, read in NASA Tech Briefs about work related to analysis of Space Shuttle surface tile strength performed for Johnson Space Center by Rockwell International Corporation. Analysis led to a theory of 'consistent tolerance bounds' for statistical distributions, applicable in industrial testing where statistical analysis can influence product development and use. Dr. Murphy then obtained the Tech Support Package that covers the subject in greater detail. The TSP became the basis for Dr. Murphy's computer program PC-DATA, which he is marketing commercially.

  2. Statistical modeling of optical attenuation measurements in continental fog conditions

    NASA Astrophysics Data System (ADS)

    Khan, Muhammad Saeed; Amin, Muhammad; Awan, Muhammad Saleem; Minhas, Abid Ali; Saleem, Jawad; Khan, Rahimdad

    2017-03-01

    Free-space optics is an innovative technology that uses the atmosphere as a propagation medium to provide higher data rates. These links are heavily affected by the atmospheric channel, mainly by fog and clouds that scatter and even block the modulated beam of light from reaching the receiver end, imposing severe attenuation. A comprehensive statistical study of fog effects and a deep physical understanding of the fog phenomena are very important for suggesting improvements in the reliability and efficiency of such communication systems. In this regard, six months of real-time measured fog attenuation data are considered and statistically investigated. A detailed statistical analysis of each fog event in that period is presented; the best probability density functions are selected on the basis of the Akaike information criterion, while the estimates of unknown parameters are computed by the maximum likelihood estimation technique. The results show that most fog attenuation events follow a normal mixture distribution, while some follow the Weibull distribution.
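
    The distribution-selection step described above can be sketched with SciPy: fit each candidate family by maximum likelihood and rank by AIC = 2k - 2 ln L. The data here are synthetic, not the measured attenuation set:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic "fog attenuation" sample; a stand-in for real measured data.
data = stats.weibull_min.rvs(c=2.0, scale=50.0, size=500, random_state=rng)

# Candidate families; location is pinned at 0 for the positive-support
# ones to keep the maximum likelihood fits well behaved.
candidates = {
    "weibull": (stats.weibull_min, {"floc": 0.0}),
    "normal": (stats.norm, {}),
    "lognormal": (stats.lognorm, {"floc": 0.0}),
}

def aic(dist, data, fixed):
    params = dist.fit(data, **fixed)            # maximum likelihood estimates
    loglik = np.sum(dist.logpdf(data, *params))
    k = len(params) - len(fixed)                # number of free parameters
    return 2 * k - 2 * loglik                   # AIC = 2k - 2 ln L

scores = {name: aic(d, data, fx) for name, (d, fx) in candidates.items()}
print(min(scores, key=scores.get))              # lowest AIC = preferred model
```

With Weibull-generated data, the Weibull family should attain the lowest AIC; on real attenuation data the ranking would vary event by event, as the abstract reports.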

  3. Healing X-ray scattering images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Jiliang; Lhermitte, Julien; Tian, Ye

    X-ray scattering images contain numerous gaps and defects arising from detector limitations and experimental configuration. Here, we present a method to heal X-ray scattering images, filling gaps in the data and removing defects in a physically meaningful manner. Unlike generic inpainting methods, this method is closely tuned to the expected structure of reciprocal-space data. In particular, we exploit statistical tests and symmetry analysis to identify the structure of an image; we then copy, average and interpolate measured data into gaps in a way that respects the identified structure and symmetry. Importantly, the underlying analysis methods provide useful characterization of structures present in the image, including the identification of diffuse versus sharp features, anisotropy and symmetry. The presented method leverages known characteristics of reciprocal space, enabling physically reasonable reconstruction even with large image gaps. The method will correspondingly fail for images that violate these underlying assumptions. The method assumes point symmetry and is thus applicable to small-angle X-ray scattering (SAXS) data, but only to a subset of wide-angle data. Our method succeeds in filling gaps and healing defects in experimental images, including extending data beyond the original detector borders.
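
    A minimal sketch of the point-symmetry filling step (the assumed mechanics, not the authors' code): fill each masked pixel with the value at its point-reflected partner through the image center, when that partner was measured:

```python
import numpy as np

def heal_point_symmetry(img, mask):
    """Fill masked (True) pixels using the point-symmetric partner through
    the image center, where that partner is measured. A toy version of
    symmetry-based gap filling, not the published implementation."""
    healed = img.copy()
    flipped = img[::-1, ::-1]          # 180-degree rotation about the center
    fmask = mask[::-1, ::-1]
    fill = mask & ~fmask               # gap here, partner pixel measured
    healed[fill] = flipped[fill]
    return healed

# Centro-symmetric toy "scattering pattern" with a rectangular detector gap.
y, x = np.mgrid[-32:32, -32:32] + 0.5  # half-pixel offset keeps symmetry exact
img = np.exp(-(x**2 + y**2) / 200.0)   # isotropic, hence point-symmetric
gap = np.zeros_like(img, dtype=bool)
gap[10:20, :] = True                   # horizontal detector-module gap

healed = heal_point_symmetry(np.where(gap, 0.0, img), gap)
print(np.allclose(healed, img))        # True: gap restored exactly by symmetry
```

Real data would additionally need the averaging and interpolation steps the abstract describes for gaps whose symmetric partners are also missing.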

  5. Research in Theoretical High Energy Nuclear Physics at the University of Arizona

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rafelski, Johann

    In the past decade (2004-2015) we addressed the quest to understand how quark confinement works, how it can be dissolved in a limited space-time domain, and what this means i) for the present-day paradigm of the laws of physics and ii) for our understanding of cosmology. Our work on matter formation in the laboratory has centered on the understanding of the less frequently produced hadronic particles (e.g. strange antibaryons, charmed and beauty hadrons, massive resonances, charmonium, Bc). We have developed a public analysis tool, SHARE (Statistical HAdronization with REsonances), which allows a precise model description of experimental particle yield and fluctuation data. We have developed a charm recombination model to allow for an off-equilibrium rate of charmonium production. We have developed methods and techniques that allowed us to study the evolution of hadron resonance yields by kinetic theory. We explored entropy, strangeness and charm as signatures of QGP, addressing the wide reaction-energy range spanned by AGS, SPS, RHIC and LHC. In the analysis of experimental data, we obtained both statistical parameters and physical properties of the hadron source. The following pages present listings of our primary writing on these questions. The abstracts are included in lieu of a more detailed discussion of our research accomplishments in each of the publications.

  6. Study of multiple unfolding trajectories and unfolded states of the protein GB1 under the physical property space.

    PubMed

    Wang, Jihua; Zhao, Liling; Dou, Xianghua; Zhang, Zhiyong

    2008-06-01

    Forty-nine molecular dynamics simulations of unfolding trajectories of the segment B1 of streptococcal protein G (GB1) provide a direct demonstration of the diversity of unfolding pathways and yield a statistically preferred unfolding pathway in a physical property space. Twelve physical properties of the protein were chosen to construct a 12-dimensional property space, which was then reduced to a 3-dimensional principal-component property space. In this property space, the multiple unfolding trajectories look like "trees" with some common characteristics: the "root of the tree" corresponds to the native state, the "bole" to the partially unfolded conformations, and the "crown" to the unfolded state. The unfolding trajectories can be divided into three types. The first, with a straight "bole" and "crown", corresponds to a fast two-state unfolding pathway of GB1. The second, characterized by "a standstill in the middle of the bole", may correspond to a three-state unfolding pathway. The third, with a "circuitous bole", corresponds to a slow two-state unfolding pathway. The fast two-state pathway is the statistically preferred unfolding pathway of GB1, accounting for 53% of the 49 unfolding trajectories. In the property space, all the unfolding trajectories together constitute a thermal unfolding pathway ensemble of GB1, which resembles a funnel that gradually spreads from the native state ensemble to the unfolded state ensemble. In the property space, the thermal unfolded-state distribution resembles an electron cloud in quantum mechanics.
The unfolded states of the independent unfolding simulation trajectories overlap substantially, indicating that the thermal unfolded states are confined by the physical property values and that the number of protein unfolded states is much smaller than previously believed.

  7. A New Approach to Monte Carlo Simulations in Statistical Physics

    NASA Astrophysics Data System (ADS)

    Landau, David P.

    2002-08-01

    Monte Carlo simulations [1] have become a powerful tool for the study of diverse problems in statistical/condensed matter physics. Standard methods sample the probability distribution for the states of the system, most often in the canonical ensemble, and over the past several decades enormous improvements have been made in performance. Nonetheless, difficulties arise near phase transitions: critical slowing down near 2nd-order transitions and metastability near 1st-order transitions limit the applicability of the method. We shall describe a new Monte Carlo approach [2] that uses a random walk in energy space to determine the density of states directly. Once the density of states is known, all thermodynamic properties can be calculated. This approach can be extended to multi-dimensional parameter spaces and should be effective for systems with complex energy landscapes, e.g., spin glasses, protein folding models, etc. Generalizations should produce a broadly applicable optimization tool. 1. A Guide to Monte Carlo Simulations in Statistical Physics, D. P. Landau and K. Binder (Cambridge U. Press, Cambridge, 2000). 2. Fugao Wang and D. P. Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E 64, 056101 (2001).
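
    The random walk in energy space of Ref. 2 (the Wang-Landau method) can be sketched for a tiny 2D Ising lattice. The sweep counts, flatness criterion, and stopping factor below are simplified choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
L = 4                                    # tiny periodic Ising lattice
N = L * L

# Attainable energies of the periodic 2D Ising model: -2N..2N in steps
# of 4, except that the levels -2N+4 and 2N-4 do not occur.
levels = [e for e in range(-2 * N, 2 * N + 1, 4) if abs(e) != 2 * N - 4]
index = {e: i for i, e in enumerate(levels)}

def delta_E(s, i, j):
    """Energy change for flipping spin (i, j) with periodic boundaries."""
    nb = s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]
    return 2 * s[i, j] * nb

s = rng.choice([-1, 1], size=(L, L))
E = -np.sum(s * (np.roll(s, 1, 0) + np.roll(s, 1, 1)))

lng = np.zeros(len(levels))              # running estimate of ln g(E)
hist = np.zeros(len(levels))
f = 1.0                                  # ln of the modification factor

while f > 1e-3:                          # loose stopping criterion
    for _ in range(5000):
        i, j = rng.integers(L, size=2)
        dE = delta_E(s, i, j)
        # Wang-Landau acceptance: favor energies with low estimated g(E).
        if rng.random() < np.exp(lng[index[E]] - lng[index[E + dE]]):
            s[i, j] *= -1
            E += dE
        lng[index[E]] += f               # update ln g and the visit histogram
        hist[index[E]] += 1
    visited = hist[hist > 0]
    if visited.min() > 0.8 * visited.mean():
        f /= 2                           # histogram flat enough: refine
        hist[:] = 0

# Normalize using the known total number of states, 2^N.
lng += N * np.log(2.0) - np.logaddexp.reduce(lng)
print(np.exp(lng[index[-2 * N]]))        # estimate of the ground-state degeneracy (exact: 2)
```

Once ln g(E) is known, canonical averages at any temperature follow by reweighting with exp(ln g(E) - E/T), which is the "all thermodynamic properties" step the abstract refers to.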

  8. Understanding older adults' usage of community green spaces in Taipei, Taiwan.

    PubMed

    Pleson, Eryn; Nieuwendyk, Laura M; Lee, Karen K; Chaddah, Anuradha; Nykiforuk, Candace I J; Schopflocher, Donald

    2014-01-27

    As the world's population ages, there is an increasing need for community environments to support physical activity and social connections for older adults. This exploratory study sought to better understand older adults' usage and perceptions of community green spaces in Taipei, Taiwan, through direct observations of seven green spaces and nineteen structured interviews. Descriptive statistics from observations using the System for Observing Play and Recreation in Communities (SOPARC) confirm that older adults use Taipei's parks extensively. Our analyses of interviews support the following recommendations for age-friendly active living initiatives for older adults: make green spaces accessible to older adults; organize a variety of structured activities that appeal to older adults particularly in the morning; equip green spaces for age-appropriate physical activity; and, promote the health advantages of green spaces to older adults.

  10. Fast emulation of track reconstruction in the CMS simulation

    NASA Astrophysics Data System (ADS)

    Komm, Matthias; CMS Collaboration

    2017-10-01

    Simulated samples of various physics processes are a key ingredient within analyses to unlock the physics behind LHC collision data. Samples with more and more statistics are required to keep up with the increasing amounts of recorded data. During sample generation, significant computing time is spent on the reconstruction of charged particle tracks from energy deposits which additionally scales with the pileup conditions. In CMS, the FastSimulation package is developed for providing a fast alternative to the standard simulation and reconstruction workflow. It employs various techniques to emulate track reconstruction effects in particle collision events. Several analysis groups in CMS are utilizing the package, in particular those requiring many samples to scan the parameter space of physics models (e.g. SUSY) or for the purpose of estimating systematic uncertainties. The strategies for and recent developments in this emulation are presented, including a novel, flexible implementation of tracking emulation while retaining a sufficient, tuneable accuracy.

  11. Highlights from the First Ever Demographic Study of Solar Physics, Space Physics, and Upper Atmospheric Physics

    NASA Astrophysics Data System (ADS)

    Moldwin, M.; Morrow, C. A.; White, S. C.; Ivie, R.

    2014-12-01

    Members of the Education & Workforce Working Group and the American Institute of Physics (AIP) conducted the first ever National Demographic Survey of working professionals for the 2012 National Academy of Sciences Solar and Space Physics Decadal Survey to learn about the demographics of this sub-field of space science. The instrument asked participants about: the type of workplace; basic demographic information regarding gender and minority status; educational pathways (discipline of undergraduate degree, field of PhD); how their undergraduate and graduate student researchers are funded; participation in NSF- and NASA-funded spaceflight missions and suborbital programs; and barriers to career advancement. Using contact databases from AGU, the American Astronomical Society's Solar Physics Division (AAS-SPD), attendees of NOAA's Space Weather Week, and proposal submissions to NSF's Atmospheric, Geospace Science Division, the AIP's Statistical Research Center cross-correlated and culled these databases, resulting in 2776 unique email addresses of US-based working professionals. The survey received 1305 responses (51%) and generated 125 pages of single-spaced answers to a number of open-ended questions. This talk will summarize the highlights of this first-ever demographic survey, including findings extracted from the open-ended responses regarding barriers to career advancement, which showed significant gender differences.

  12. Statistical analysis of flight times for space shuttle ferry flights

    NASA Technical Reports Server (NTRS)

    Graves, M. E.; Perlmutter, M.

    1974-01-01

    Markov chain and Monte Carlo analysis techniques are applied to the simulated Space Shuttle Orbiter Ferry flights to obtain statistical distributions of flight time duration between Edwards Air Force Base and Kennedy Space Center. The two methods are compared, and are found to be in excellent agreement. The flights are subjected to certain operational and meteorological requirements, or constraints, which cause eastbound and westbound trips to yield different results. Persistence of events theory is applied to the occurrence of inclement conditions to find their effect upon the statistical flight time distribution. In a sensitivity test, some of the constraints are varied to observe the corresponding changes in the results.
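
    The Markov chain/Monte Carlo approach described above can be sketched with a toy model: weather at each stopover is a two-state Markov chain (good/bad) with persistence, and a ferry leg is flown only on a good-weather day. All probability values and the number of legs below are illustrative assumptions, not figures from the Graves & Perlmutter study.

```python
import random

def simulate_ferry(p_stay_bad=0.6, p_go_bad=0.2, legs=4, n_trials=10000, seed=1):
    """Mean ferry duration (days) by Monte Carlo.

    Weather is a two-state Markov chain (good/bad) evaluated each
    morning; a leg is flown only on a good-weather day, so persistent
    bad weather stretches the trip.  Parameters are illustrative.
    """
    rng = random.Random(seed)
    total_days = 0
    for _ in range(n_trials):
        bad, flown = False, 0
        while flown < legs:
            total_days += 1
            # persistence of events: bad weather is likelier after a bad day
            bad = rng.random() < (p_stay_bad if bad else p_go_bad)
            if not bad:
                flown += 1
    return total_days / n_trials

mean_days = simulate_ferry()   # roughly legs / P(good day)
```

    Varying `p_stay_bad` mimics the paper's sensitivity test: stronger weather persistence fattens the right tail of the flight-time distribution.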

  13. Maximum entropy models of ecosystem functioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertram, Jason, E-mail: jason.bertram@anu.edu.au

    2014-12-05

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense using a savanna plant ecology model as an example.
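
    The MaxEnt algorithm central to the discussion above can be illustrated in miniature: maximising entropy over a discrete state space subject to a mean constraint yields a Boltzmann-form distribution p_i ∝ exp(-β v_i), with the Lagrange multiplier β found numerically. This is a generic sketch of Jaynes' recipe, not the savanna model from the paper.

```python
import math

def maxent_dist(values, target_mean, tol=1e-10):
    """MaxEnt distribution over discrete 'values' with a fixed mean.

    The entropy maximiser under a mean constraint is p_i ∝ exp(-beta*v_i);
    since the constrained mean is monotone decreasing in beta, we can
    find beta by bisection.
    """
    def mean_at(beta):
        w = [math.exp(-beta * v) for v in values]
        z = sum(w)
        return sum(v * wi for v, wi in zip(values, w)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > target_mean:   # mean too high -> raise beta
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = [math.exp(-beta * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]

# a symmetric target over symmetric values recovers the uniform distribution
p = maxent_dist([0, 1, 2, 3], target_mean=1.5)
```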

  14. Computing physical properties with quantum Monte Carlo methods with statistical fluctuations independent of system size.

    PubMed

    Assaraf, Roland

    2014-12-01

    We show that the recently proposed correlated sampling without reweighting procedure extends the locality (asymptotic independence of the system size) of a physical property to the statistical fluctuations of its estimator. This makes the approach potentially vastly more efficient for computing space-localized properties in large systems compared with standard correlated methods. A proof is given for a large collection of noninteracting fragments. Calculations on hydrogen chains suggest that this behavior holds not only for systems displaying short-range correlations, but also for systems with long-range correlations.
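
    The variance reduction behind correlated sampling can be demonstrated with a toy estimator: the difference of two nearby averages fluctuates far less when both are evaluated on the same random sample, because the fluctuations cancel term by term. The integrands below are simple stand-ins for two nearby physical systems, not the actual QMC estimator from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def diff_estimator_variances(n=4000, reps=500):
    """Variance of a difference estimate with independent vs correlated
    (shared-sample) draws.  f and g mimic two slightly different systems."""
    f = lambda x: np.exp(-x ** 2)
    g = lambda x: np.exp(-1.02 * x ** 2)   # small perturbation of f
    indep, corr = [], []
    for _ in range(reps):
        x1, x2, x = rng.normal(size=(3, n))
        indep.append(f(x1).mean() - g(x2).mean())   # independent samples
        corr.append((f(x) - g(x)).mean())           # shared sample: noise cancels
    return np.var(indep), np.var(corr)

v_indep, v_corr = diff_estimator_variances()
# v_corr is orders of magnitude smaller than v_indep
```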

  15. 2+1 flavor lattice QCD toward the physical point

    NASA Astrophysics Data System (ADS)

    Aoki, S.; Ishikawa, K.-I.; Ishizuka, N.; Izubuchi, T.; Kadoh, D.; Kanaya, K.; Kuramashi, Y.; Namekawa, Y.; Okawa, M.; Taniguchi, Y.; Ukawa, A.; Ukita, N.; Yoshié, T.

    2009-02-01

    We present the first results of the PACS-CS project, which aims to simulate 2+1 flavor lattice QCD at the physical point with the nonperturbatively O(a)-improved Wilson quark action and the Iwasaki gauge action. Numerical simulations are carried out at β = 1.9, corresponding to a lattice spacing of a = 0.0907(13) fm, on a 32³×64 lattice with the use of the domain-decomposed HMC algorithm to reduce the up-down quark mass. Further algorithmic improvements make possible a simulation whose up-down quark mass is as light as the physical value. The resulting pseudoscalar meson masses range from 702 MeV down to 156 MeV, which clearly exhibit the presence of chiral logarithms. An analysis of the pseudoscalar meson sector with SU(3) chiral perturbation theory reveals that the next-to-leading order corrections are large at the physical strange quark mass. In order to estimate the physical up-down quark mass, we employ the SU(2) chiral analysis, expanding the strange quark contributions analytically around the physical strange quark mass. The SU(2) low energy constants l̄₃ and l̄₄ are comparable with recent estimates by other lattice QCD calculations. We determine the physical point together with the lattice spacing, employing m_π, m_K and m_Ω as input. The hadron spectrum extrapolated to the physical point agrees with the experimental values at the few-percent level of statistical errors, although possible cutoff effects remain. We also find that our results for f_π, f_K and their ratio, where renormalization is carried out perturbatively at one loop, are compatible with the experimental values. For the physical quark masses we obtain m_ud and m_s in the MS-bar scheme, extracted from the axial-vector Ward-Takahashi identity with the perturbative renormalization factors. We also briefly discuss the results for the static quark potential.

  16. Spatiotemporal Analysis of the Ebola Hemorrhagic Fever in West Africa in 2014

    NASA Astrophysics Data System (ADS)

    Xu, M.; Cao, C. X.; Guo, H. F.

    2017-09-01

    Ebola hemorrhagic fever (EHF) is an acute, highly contagious hemorrhagic disease caused by the Ebola virus. This paper aimed to explore the possible gathering areas of EHF cases in West Africa in 2014, and to identify endemic areas and their trends by means of time-space analysis. We mapped the distribution of EHF incidences and explored statistically significant space, time and space-time disease clusters. We utilized hotspot analysis to find spatial clustering patterns on the basis of the actual outbreak cases, and spatial-temporal cluster analysis to examine whether the spatial and temporal distribution of the disease is statistically significant. Local clusters were investigated using Kulldorff's scan statistic approach. The results reveal that the epidemic mainly gathered in the western part of Africa near the North Atlantic coast, with an obvious regional distribution. For the current epidemic, we found areas of high EVD incidence by means of spatial cluster analysis.
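
    Kulldorff's scan statistic, used above for local cluster detection, can be sketched as follows: circular windows are scanned over the study area and each is scored by a Poisson log-likelihood ratio comparing the case rate inside versus outside. The district coordinates and counts below are invented toy data, and a real application would add Monte Carlo permutation testing to assess the significance of the best window.

```python
import math

def poisson_llr(c, n, C, N):
    """Kulldorff-style log-likelihood ratio for a window holding
    c of the C total cases and n of the N total population."""
    if c / n <= (C - c) / (N - n):        # score only elevated-risk windows
        return 0.0
    return (c * math.log(c / n)
            + (C - c) * math.log((C - c) / (N - n))
            - C * math.log(C / N))

# toy data: (x, y, cases, population) per district -- illustrative only
districts = [(0, 0, 30, 1000), (1, 0, 25, 1000),
             (5, 5, 5, 1000), (6, 5, 4, 1000)]
C = sum(d[2] for d in districts)
N = sum(d[3] for d in districts)

best = None
for cx, cy, _, _ in districts:            # circles centred on each district
    for r in (0.5, 1.5, 3.0):
        inside = [d for d in districts
                  if (d[0] - cx) ** 2 + (d[1] - cy) ** 2 <= r * r]
        c = sum(d[2] for d in inside)
        n = sum(d[3] for d in inside)
        if 0 < n < N and 0 < c < C:
            llr = poisson_llr(c, n, C, N)
            if best is None or llr > best[0]:
                best = (llr, (cx, cy), r)
# best now holds the most likely cluster (here, the two high-rate districts)
```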

  17. KIC 9533489: a genuine γ Doradus - δ Scuti Kepler hybrid pulsator with transit events

    NASA Astrophysics Data System (ADS)

    Bognár, Zs.; Lampens, P.; Frémat, Y.; Southworth, J.; Sódor, Á.; De Cat, P.; Isaacson, H. T.; Marcy, G. W.; Ciardi, D. R.; Gilliland, R. L.; Martín-Fernández, P.

    2015-09-01

    Context. Several hundred candidate hybrid pulsators of type A-F have been identified from space-based observations. Their large number allows both statistical analyses and detailed investigations of individual stars. This offers the opportunity to study the full interior of the genuine hybrids, in which both low radial order p- and high-order g-modes are self-excited at the same time. However, a few other physical processes can also be responsible for the observed hybrid nature, related to binarity or to surface inhomogeneities. The finding that most δ Scuti stars also show long-period light variations represents a real challenge for theory. Aims: We aim at determining the pulsation frequencies of KIC 9533489, to search for regular patterns and spacings among them, and to investigate the stability of the frequencies and the amplitudes. An additional goal is to study the serendipitously detected transit events: is KIC 9533489 the host star? What are the limitations on the physical parameters of the involved bodies? Methods: We performed a Fourier analysis of all the available Kepler light curves. We investigated the frequency and period spacings and determined the stellar physical parameters from spectroscopic observations. We also modelled the transit events. Results: The Fourier analysis of the Kepler light curves revealed 55 significant frequencies clustered into two groups, which are separated by a gap between 15 and 27 d⁻¹. The light variations are dominated by the beating of two dominant frequencies located at around 4 d⁻¹. The amplitudes of these two frequencies show a monotonic long-term trend. The frequency spacing analysis revealed two possibilities: the pulsator is either a highly inclined moderate rotator (v ≈ 70 km s⁻¹, i > 70°) or a fast rotator (v ≈ 200 km s⁻¹) with i ≈ 20°. 
The transit analysis disclosed that the transit events that occur with a ≈197 d period may be caused by a 1.6 R_Jup body orbiting a fainter star, which would be spatially coincident with KIC 9533489.

  18. A bibliometric analysis of statistical terms used in American Physical Therapy Association journals (2011-2012): evidence for educating physical therapists.

    PubMed

    Tilson, Julie K; Marshall, Katie; Tam, Jodi J; Fetters, Linda

    2016-04-22

    A primary barrier to the implementation of evidence based practice (EBP) in physical therapy is therapists' limited ability to understand and interpret statistics. Physical therapists demonstrate limited skills and report low self-efficacy for interpreting results of statistical procedures. While standards for physical therapist education include statistics, little empirical evidence is available to inform what should constitute such curricula. The purpose of this study was to conduct a census of the statistical terms and study designs used in physical therapy literature and to use the results to make recommendations for curricular development in physical therapist education. We conducted a bibliometric analysis of 14 peer-reviewed journals associated with the American Physical Therapy Association over 12 months (Oct 2011-Sept 2012). Trained raters recorded every statistical term appearing in identified systematic reviews, primary research reports, and case series and case reports. Investigator-reported study design was also recorded. Terms representing the same statistical test or concept were combined into a single, representative term. Cumulative percentage was used to identify the most common representative statistical terms. Common representative terms were organized into eight categories to inform curricular design. Of 485 articles reviewed, 391 met the inclusion criteria. These 391 articles used 532 different terms which were combined into 321 representative terms; 13.1 (sd = 8.0) terms per article. Eighty-one representative terms constituted 90% of all representative term occurrences. Of the remaining 240 representative terms, 105 (44%) were used in only one article. The most common study design was prospective cohort (32.5%). Physical therapy literature contains a large number of statistical terms and concepts for readers to navigate. However, in the year sampled, 81 representative terms accounted for 90% of all occurrences. 
These "common representative terms" can be used to inform curricula to promote physical therapists' skills, competency, and confidence in interpreting statistics in their professional literature. We make specific recommendations for curriculum development informed by our findings.
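
    The census procedure described above (record every statistical term, merge synonyms into representative terms, then keep the smallest set covering 90% of all occurrences) can be sketched as follows. The term list is toy data, not the study's corpus.

```python
from collections import Counter

def common_terms(term_occurrences, coverage=0.90):
    """Smallest set of representative terms, taken in descending
    frequency order, covering `coverage` of all occurrences
    (the paper's 90% cumulative-percentage criterion)."""
    counts = Counter(term_occurrences)
    total = sum(counts.values())
    picked, running = [], 0
    for term, c in counts.most_common():
        picked.append(term)
        running += c
        if running / total >= coverage:
            break
    return picked

# toy corpus: 100 term occurrences across hypothetical articles
terms = ["mean"] * 50 + ["sd"] * 30 + ["p-value"] * 15 + ["ANOVA"] * 4 + ["kappa"]
print(common_terms(terms))   # → ['mean', 'sd', 'p-value']
```

    In the paper's data the same cutoff selected 81 of 321 representative terms, the long tail being terms used only once or twice.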

  19. Detecting Selection on Protein Stability through Statistical Mechanical Models of Folding and Evolution

    PubMed Central

    Bastolla, Ugo

    2014-01-01

    The properties of biomolecules depend both on physics and on the evolutionary process that formed them. These two points of view produce a powerful synergism. Physics sets the stage and the constraints that molecular evolution has to obey, and evolutionary theory helps in rationalizing the physical properties of biomolecules, including protein folding thermodynamics. To complete the parallelism, protein thermodynamics is founded on the statistical mechanics in the space of protein structures, and molecular evolution can be viewed as statistical mechanics in the space of protein sequences. In this review, we will integrate both points of view, applying them to detecting selection on the stability of the folded state of proteins. We will start discussing positive design, which strengthens the stability of the folded against the unfolded state of proteins. Positive design justifies why statistical potentials for protein folding can be obtained from the frequencies of structural motifs. Stability against unfolding is easier to achieve for longer proteins. On the contrary, negative design, which consists in destabilizing frequently formed misfolded conformations, is more difficult to achieve for longer proteins. The folding rate can be enhanced by strengthening short-range native interactions, but this requirement contrasts with negative design, and evolution has to trade-off between them. Finally, selection can accelerate functional movements by favoring low frequency normal modes of the dynamics of the native state that strongly correlate with the functional conformation change. PMID:24970217

  20. Space physics education via examples in the undergraduate physics curriculum

    NASA Astrophysics Data System (ADS)

    Martin, R.; Holland, D. L.

    2011-12-01

    The field of space physics is rich with examples of basic physics and analysis techniques, yet it is rarely seen in physics courses or textbooks. As space physicists in an undergraduate physics department we like to use research to inform teaching, and we find that students respond well to examples from magnetospheric science. While we integrate examples into general education courses as well, this talk will focus on physics major courses. Space physics examples are typically selected to illustrate a particular concept or method taught in the course. Four examples will be discussed, from an introductory electricity and magnetism course, a mechanics/nonlinear dynamics course, a computational physics course, and a plasma physics course. Space physics provides examples of many concepts from introductory E&M, including the application of Faraday's law to terrestrial magnetic storm effects and the use of the basic motion of charged particles as a springboard to discussion of the inner magnetosphere and the aurora. In the mechanics and nonlinear dynamics courses, the motion of charged particles in a magnetotail current sheet magnetic field is treated as a Newtonian dynamical system, illustrating the Poincaré surface-of-section technique, the partitioning of phase space, and the KAM theorem. Neural network time series analysis of AE data is used as an example in the computational physics course. Finally, among several examples, current sheet particle dynamics is utilized in the plasma physics course to illustrate the notion of adiabatic/guiding center motion and the breakdown of the adiabatic approximation. We will present short descriptions of our pedagogy and student assignments in this "backdoor" method of space physics education.
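
    The charged-particle examples mentioned for the E&M and plasma courses start from basic gyromotion, which can be demonstrated numerically with a Boris integrator. This is a standard textbook scheme chosen here for illustration; the source does not say which integrator those courses use.

```python
import numpy as np

def boris_push(x, v, B, qm, dt, steps):
    """Boris integrator for dv/dt = (q/m) v x B (no electric field).

    The velocity is rotated about B each half step, which conserves
    kinetic energy exactly -- the reason this scheme is standard for
    charged-particle motion in magnetospheric problems.
    """
    traj = [x.copy()]
    t = qm * B * dt / 2
    s = 2 * t / (1 + t @ t)
    for _ in range(steps):
        v_prime = v + np.cross(v, t)
        v = v + np.cross(v_prime, s)
        x = x + v * dt
        traj.append(x.copy())
    return np.array(traj), v

# uniform field along z: the particle gyrates on a circle of radius v/(qm*B)
B = np.array([0.0, 0.0, 1.0])
x0 = np.array([1.0, 0.0, 0.0])
v0 = np.array([0.0, 1.0, 0.0])
traj, v_end = boris_push(x0, v0, B, qm=1.0, dt=0.01, steps=2000)
# guiding centre sits at (2, 0); gyroradius is 1 in these units
```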

  1. Clustered Stomates in "Begonia": An Exercise in Data Collection & Statistical Analysis of Biological Space

    ERIC Educational Resources Information Center

    Lau, Joann M.; Korn, Robert W.

    2007-01-01

    In this article, the authors present a laboratory exercise in data collection and statistical analysis in biological space using clustered stomates on leaves of "Begonia" plants. The exercise can be done in middle school classes by students making their own slides and seeing imprints of cells, or at the high school level through collecting data of…

  2. Characteristics of level-spacing statistics in chaotic graphene billiards.

    PubMed

    Huang, Liang; Lai, Ying-Cheng; Grebogi, Celso

    2011-03-01

    A fundamental result in nonrelativistic quantum nonlinear dynamics is that the spectral statistics of quantum systems that possess no geometric symmetry, but whose classical dynamics are chaotic, are described by those of the Gaussian orthogonal ensemble (GOE) or the Gaussian unitary ensemble (GUE), in the presence or absence of time-reversal symmetry, respectively. For massless spin-half particles such as neutrinos in relativistic quantum mechanics in a chaotic billiard, the seminal work of Berry and Mondragon established the GUE nature of the level-spacing statistics, due to the combination of the chirality of Dirac particles and the confinement, which breaks the time-reversal symmetry. A question is whether the GOE or the GUE statistics can be observed in experimentally accessible, relativistic quantum systems. We demonstrate, using graphene confinements in which the quasiparticle motions are governed by the Dirac equation in the low-energy regime, that the level-spacing statistics are persistently those of GOE random matrices. We present extensive numerical evidence obtained from the tight-binding approach and a physical explanation for the GOE statistics. We also find that the presence of a weak magnetic field switches the statistics to those of GUE. For a strong magnetic field, Landau levels become influential, causing the level-spacing distribution to deviate markedly from the random-matrix predictions. Issues addressed also include the effects of a number of realistic factors on level-spacing statistics such as next nearest-neighbor interactions, different lattice orientations, enhanced hopping energy for atoms on the boundary, and staggered potential due to graphene-substrate interactions.
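
    The GOE level-spacing statistics discussed above can be reproduced in a few lines: for 2×2 GOE matrices the nearest-neighbour spacing follows the Wigner surmise P(s) = (πs/2) exp(-πs²/4) exactly, exhibiting the characteristic level repulsion (P → 0 as s → 0). This is a generic random-matrix sketch, not the graphene tight-binding computation of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def goe_spacings(n_samples=20000):
    """Nearest-neighbour spacings of 2x2 GOE matrices [[a, c], [c, b]],
    normalised to unit mean.  GOE convention: diagonal entries have
    variance 1, off-diagonal variance 1/2."""
    a = rng.normal(size=n_samples)
    b = rng.normal(size=n_samples)
    c = rng.normal(size=n_samples) / np.sqrt(2)
    s = np.sqrt((a - b) ** 2 + 4 * c ** 2)   # eigenvalue gap of [[a,c],[c,b]]
    return s / s.mean()

s = goe_spacings()
# level repulsion: spacings near zero are strongly suppressed
frac_small = (s < 0.1).mean()
```

    For GUE one would use complex off-diagonal entries, giving the stronger quadratic repulsion P(s) ∝ s² at small s.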

  3. The NWRA Classification Infrastructure: description and extension to the Discriminant Analysis Flare Forecasting System (DAFFS)

    NASA Astrophysics Data System (ADS)

    Leka, K. D.; Barnes, Graham; Wagner, Eric

    2018-04-01

    A classification infrastructure built upon Discriminant Analysis (DA) has been developed at NorthWest Research Associates for examining the statistical differences between samples of two known populations. Originating to examine the physical differences between flare-quiet and flare-imminent solar active regions, we describe herein some details of the infrastructure including: parametrization of large datasets, schemes for handling "null" and "bad" data in multi-parameter analysis, application of non-parametric multi-dimensional DA, an extension through Bayes' theorem to probabilistic classification, and methods invoked for evaluating classifier success. The classifier infrastructure is applicable to a wide range of scientific questions in solar physics. We demonstrate its application to the question of distinguishing flare-imminent from flare-quiet solar active regions, updating results from the original publications that were based on different data and much smaller sample sizes. Finally, as a demonstration of "Research to Operations" efforts in the space-weather forecasting context, we present the Discriminant Analysis Flare Forecasting System (DAFFS), a near-real-time operationally-running solar flare forecasting tool that was developed from the research-directed infrastructure.
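
    A minimal sketch of two-population discriminant analysis in the spirit described above (the NWRA infrastructure uses non-parametric, multi-dimensional DA; the classical equal-covariance Fisher discriminant below is a simplification): two synthetic samples stand in for "flare-quiet" and "flare-imminent" active-region parametrizations.

```python
import numpy as np

def fisher_lda(x0, x1):
    """Two-class linear discriminant for the equal-covariance case:
    w = Sigma_pooled^-1 (mu1 - mu0), with the midpoint as threshold."""
    mu0, mu1 = x0.mean(axis=0), x1.mean(axis=0)
    n0, n1 = len(x0), len(x1)
    pooled = (np.cov(x0.T) * (n0 - 1) + np.cov(x1.T) * (n1 - 1)) / (n0 + n1 - 2)
    w = np.linalg.solve(pooled, mu1 - mu0)
    thresh = w @ (mu0 + mu1) / 2
    return w, thresh

rng = np.random.default_rng(2)
quiet = rng.normal([0, 0], 1.0, size=(500, 2))      # "flare-quiet" sample
imminent = rng.normal([2, 2], 1.0, size=(500, 2))   # "flare-imminent" sample
w, t = fisher_lda(quiet, imminent)

# classification success on the training samples
labels = np.r_[np.zeros(500), np.ones(500)]
acc = ((np.vstack([quiet, imminent]) @ w > t) == labels).mean()
```

    Passing the discriminant value through Bayes' theorem, as the paper describes, turns the hard threshold into a flaring probability.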

  4. Developing a complex independent component analysis technique to extract non-stationary patterns from geophysical time-series

    NASA Astrophysics Data System (ADS)

    Forootan, Ehsan; Kusche, Jürgen

    2016-04-01

    Geodetic/geophysical observations, such as the time series of global terrestrial water storage change or sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time-series approaches. In the last decades, decomposition techniques have found increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and more recently independent component analysis (ICA) are common techniques to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques since they are based on decomposing the auto-covariance matrix or diagonalizing higher (than two)-order statistical tensors from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables, even after removing cyclic components, e.g., the seasonal cycles. In this paper, we present a new decomposition method, the complex independent component analysis (CICA, Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation. The complex time series contain the observed values in their real part, and the temporal rate of variability in their imaginary part. (ii) An ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex data set in (i). 
(iii) Dominant non-stationary patterns are recognized as independent complex patterns that can be used to represent the space and time amplitude and phase propagations. We present the results of CICA on simulated and real cases e.g., for quantifying the impact of large-scale ocean-atmosphere interaction on global mass changes. Forootan (PhD-2014) Statistical signal decomposition techniques for analyzing time-variable satellite gravimetry data, PhD Thesis, University of Bonn, http://hss.ulb.uni-bonn.de/2014/3766/3766.htm Forootan and Kusche (JoG-2012) Separation of global time-variable gravity signals into maximally independent components, Journal of Geodesy 86 (7), 477-497, doi: 10.1007/s00190-011-0532-5
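
    Step (i) of the recipe, forming a complex series whose real part is the data and whose imaginary part carries the rate of variability, is the analytic signal obtained from a Hilbert transform. It can be sketched with the standard FFT construction (the cumulant-based ICA of step (ii) is omitted here):

```python
import numpy as np

def analytic_signal(x):
    """Analytic (complex) signal via FFT: zero out negative frequencies,
    double positive ones.  Real part = data, imaginary part = its
    Hilbert transform."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    h[1:(n + 1) // 2] = 2
    if n % 2 == 0:
        h[n // 2] = 1
    return np.fft.ifft(X * h)

# for a pure cosine the analytic signal is exp(i*2*pi*f*t): unit envelope,
# and the imaginary part is the quadrature (sine) component
t = np.linspace(0, 1, 400, endpoint=False)
z = analytic_signal(np.cos(2 * np.pi * 5 * t))
envelope = np.abs(z)
```

    The instantaneous amplitude and phase of `z` are what CICA then decomposes to track amplitude and phase propagation in space and time.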

  5. Equilibrium statistical-thermal models in high-energy physics

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel Nasser

    2014-05-01

    We review some recent highlights from the applications of statistical-thermal models to different experimental measurements and lattice QCD thermodynamics that have been made during the last decade. We start with a short review of the historical milestones on the path of constructing statistical-thermal models for heavy-ion physics. We discovered that Heinz Koppe formulated, in 1948, an almost complete recipe for the statistical-thermal models. In 1950, Enrico Fermi generalized this statistical approach, in which he started with a general cross-section formula and inserted into it the simplifying assumptions about the matrix element of the interaction process that likely reflect many features of the high-energy reactions dominated by the density in the phase space of final states. In 1964, Hagedorn systematically analyzed the high-energy phenomena using all tools of statistical physics and introduced the concept of limiting temperature based on the statistical bootstrap model. It turns out that many-particle systems can quite often be studied with the help of statistical-thermal methods. The analysis of yield multiplicities in high-energy collisions gives overwhelming evidence for chemical equilibrium in the final state. Strange particles might be an exception, as they are suppressed at lower beam energies; however, their relative yields fulfill statistical equilibrium as well. We review the equilibrium statistical-thermal models for particle production, fluctuations and collective flow in heavy-ion experiments. We also review their reproduction of the lattice QCD thermodynamics at vanishing and finite chemical potential. During the last decade, five conditions have been suggested to describe the universal behavior of the chemical freeze-out parameters. The higher order moments of multiplicity have been discussed. They offer deep insights into particle production and critical fluctuations. 
Therefore, we use them to describe the freeze-out parameters and suggest the location of the QCD critical endpoint. Various extensions have been proposed in order to take into consideration possible deviations from the ideal hadron gas. We highlight various types of interactions, dissipative properties and location-dependences (spatial rapidity). Furthermore, we review three models combining hadronic with partonic phases: the quasi-particle model, the linear sigma model with Polyakov potentials and the compressible bag model.

  6. Neighbourhood green space, physical function and participation in physical activities among elderly men: the Caerphilly Prospective study

    PubMed Central

    2014-01-01

    Background The built environment in which older people live plays an important role in promoting or inhibiting physical activity. Most work on this complex relationship between physical activity and the environment has excluded people with reduced physical function or ignored the difference between groups with different levels of physical function. This study aims to explore the role of neighbourhood green space in determining levels of participation in physical activity among elderly men with different levels of lower extremity physical function. Method Using data collected from the Caerphilly Prospective Study (CaPS) and green space data collected from high resolution Landmap true colour aerial photography, we first investigated the effect of the quantity of neighbourhood green space and the variation in neighbourhood vegetation on participation in physical activity for 1,010 men aged 66 and over in Caerphilly county borough, Wales, UK. Second, we explored whether neighbourhood green space affects groups with different levels of lower extremity physical function in different ways. Results Increasing percentage of green space within a 400 meters radius buffer around the home was significantly associated with more participation in physical activity after adjusting for lower extremity physical function, psychological distress, general health, car ownership, age group, marital status, social class, education level and other environmental factors (OR = 1.21, 95% CI 1.05, 1.41). A statistically significant interaction between the variation in neighbourhood vegetation and lower extremity physical function was observed (OR = 1.92, 95% CI 1.12, 3.28). Conclusion Elderly men living in neighbourhoods with more green space have higher levels of participation in regular physical activity. The association between variation in neighbourhood vegetation and regular physical activity varied according to lower extremity physical function. 
Subjects reporting poor lower extremity physical function living in neighbourhoods with more homogeneous vegetation (i.e. low variation) were more likely to participate in regular physical activity than those living in neighbourhoods with less homogeneous vegetation (i.e. high variation). Good lower extremity physical function reduced the adverse effect of high variation vegetation on participation in regular physical activity. This provides a basis for the future development of novel interventions that aim to increase levels of physical activity in later life, and has implications for planning policy to design, preserve, facilitate and encourage the use of green space near home. PMID:24646136
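
    The reported odds ratios (e.g. OR = 1.21 per unit of green space) come from logistic regression; how such an estimate is obtained can be sketched with a Newton/IRLS fit on synthetic data. The covariate, effect size and sample size below are assumptions for illustration, not the CaPS data.

```python
import numpy as np

def logistic_fit(X, y, iters=50):
    """Logistic regression via Newton/IRLS; exp(beta_j) is the odds
    ratio per unit increase of covariate j."""
    X = np.column_stack([np.ones(len(X)), X])   # add intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        W = p * (1 - p)                          # IRLS weights
        H = X.T @ (X * W[:, None])               # Fisher information
        beta += np.linalg.solve(H, X.T @ (y - p))
    return beta

rng = np.random.default_rng(3)
green = rng.uniform(0, 10, 2000)                 # hypothetical green-space score
true_or = 1.2                                    # assumed effect per unit
p = 1 / (1 + np.exp(-(-2 + np.log(true_or) * green)))
active = rng.binomial(1, p)                      # 1 = regular physical activity

beta = logistic_fit(green, active)
odds_ratio = np.exp(beta[1])                     # recovers roughly 1.2
```

    A study like the one above would add the adjustment covariates (physical function, age group, social class, etc.) as extra columns of X, and interaction terms as products of columns.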

  7. Neighbourhood green space, physical function and participation in physical activities among elderly men: the Caerphilly Prospective study.

    PubMed

    Gong, Yi; Gallacher, John; Palmer, Stephen; Fone, David

    2014-03-19

    The built environment in which older people live plays an important role in promoting or inhibiting physical activity. Most work on this complex relationship between physical activity and the environment has excluded people with reduced physical function or ignored the difference between groups with different levels of physical function. This study aims to explore the role of neighbourhood green space in determining levels of participation in physical activity among elderly men with different levels of lower extremity physical function. Using data collected from the Caerphilly Prospective Study (CaPS) and green space data collected from high resolution Landmap true colour aerial photography, we first investigated the effect of the quantity of neighbourhood green space and the variation in neighbourhood vegetation on participation in physical activity for 1,010 men aged 66 and over in Caerphilly county borough, Wales, UK. Second, we explored whether neighbourhood green space affects groups with different levels of lower extremity physical function in different ways. Increasing percentage of green space within a 400 meters radius buffer around the home was significantly associated with more participation in physical activity after adjusting for lower extremity physical function, psychological distress, general health, car ownership, age group, marital status, social class, education level and other environmental factors (OR = 1.21, 95% CI 1.05, 1.41). A statistically significant interaction between the variation in neighbourhood vegetation and lower extremity physical function was observed (OR = 1.92, 95% CI 1.12, 3.28). Elderly men living in neighbourhoods with more green space have higher levels of participation in regular physical activity. The association between variation in neighbourhood vegetation and regular physical activity varied according to lower extremity physical function. 
Subjects reporting poor lower extremity physical function living in neighbourhoods with more homogeneous vegetation (i.e. low variation) were more likely to participate in regular physical activity than those living in neighbourhoods with less homogeneous vegetation (i.e. high variation). Good lower extremity physical function reduced the adverse effect of high variation vegetation on participation in regular physical activity. This provides a basis for the future development of novel interventions that aim to increase levels of physical activity in later life, and has implications for planning policy to design, preserve, facilitate and encourage the use of green space near home.
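The odds ratios reported in this record follow directly from a fitted logistic-regression coefficient and its standard error. A minimal sketch, using illustrative values (not the study's actual fitted coefficient) chosen to roughly reproduce "OR = 1.21, 95% CI 1.05, 1.41":

```python
import math

# Hypothetical logistic-regression coefficient for green-space percentage
# (per unit increase) and its standard error -- illustrative values only.
beta, se = 0.19, 0.075

# Odds ratio and Wald 95% confidence interval.
odds_ratio = math.exp(beta)
ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)

print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f}, {ci_high:.2f}")
```

The exponential maps the additive log-odds scale, on which the model is linear, back to the multiplicative odds scale on which results are reported.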

  8. The validity of multiphase DNS initialized on the basis of single-point statistics

    NASA Astrophysics Data System (ADS)

    Subramaniam, Shankar

    1999-11-01

    A study of the point-process statistical representation of a spray reveals that single-point statistical information contained in the droplet distribution function (ddf) is related to a sequence of single surrogate-droplet pdf's, which are in general different from the physical single-droplet pdf's. The results of this study have important consequences for the initialization and evolution of direct numerical simulations (DNS) of multiphase flows, which are usually initialized on the basis of single-point statistics such as the average number density in physical space. If multiphase DNS are initialized in this way, this implies that even the initial representation contains certain implicit assumptions concerning the complete ensemble of realizations, which are invalid for general multiphase flows. Also, the evolution of a DNS initialized in this manner is shown to be valid only if an as yet unproven commutation hypothesis holds true. Therefore, it is questionable to what extent DNS that are initialized in this manner constitute a direct simulation of the physical droplets.

  9. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    ERIC Educational Resources Information Center

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared by analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared.

  10. Laboratory space physics: Investigating the physics of space plasmas in the laboratory

    NASA Astrophysics Data System (ADS)

    Howes, Gregory G.

    2018-05-01

    Laboratory experiments provide a valuable complement to explore the fundamental physics of space plasmas without the limitations inherent to spacecraft measurements. Specifically, experiments overcome the restriction that spacecraft measurements are made at only one (or a few) points in space, enable greater control of the plasma conditions and applied perturbations, can be reproducible, and are orders of magnitude less expensive than launching spacecraft. Here, I highlight key open questions about the physics of space plasmas and identify the aspects of these problems that can potentially be tackled in laboratory experiments. Several past successes in laboratory space physics provide concrete examples of how complementary experiments can contribute to our understanding of physical processes at play in the solar corona, solar wind, planetary magnetospheres, and the outer boundary of the heliosphere. I present developments on the horizon of laboratory space physics, identifying velocity space as a key new frontier, highlighting new and enhanced experimental facilities, and showcasing anticipated developments to produce improved diagnostics and innovative analysis methods. A strategy for future laboratory space physics investigations will be outlined, with explicit connections to specific fundamental plasma phenomena of interest.

  11. SOME CHARACTERISTICS OF THE ORAL CAVITY AND TEETH OF COSMONAUTS ON MISSIONS TO THE INTERNATIONAL SPACE STATION.

    PubMed

    Ilyin, V K; Shumilina, G A; Solovieva, Z O; Nosovsky, A M; Kaminskaya, E V

    Earlier studies were furthered by examination of parodentium anaerobic microbiota and investigation of gingival liquid immunological factors in space flight. Immunoglobulins were measured using the enzyme immunoassay (EIA). The qualitative content of key parodentium pathogens was determined with state-of-the-art molecular biology technologies such as the polymerase chain reaction. Statistical data processing was performed using principal component analysis and ensuing standard statistical analysis. Thereupon, recommendations on cosmonauts' oral and dental hygiene during space missions were developed.

  12. Report of the theory panel. [space physics

    NASA Technical Reports Server (NTRS)

    Ashourabdalla, Maha; Rosner, Robert; Antiochos, Spiro; Curtis, Steven; Fejer, B.; Goertz, Christoph K.; Goldstein, Melvyn L.; Holzer, Thomas E.; Jokipii, J. R.; Lee, Lou-Chuang

    1991-01-01

    The ultimate goal of this research is to develop an understanding which is sufficiently comprehensive to allow realistic predictions of the behavior of the physical systems. Theory has a central role to play in the quest for this understanding. The level of theoretical description is dependent on three constraints: (1) the fact that the available computer hardware may limit both the number and the size of physical processes the model system can describe; (2) the fact that some natural systems may only be described in a statistical manner; and (3) the fact that some natural systems may be observable only through remote sensing, which is intrinsically limited by spatial resolution and line-of-sight integration. From this basis, the report discusses present accomplishments and future goals of theoretical space physics. Finally, the development and use of new supercomputers is examined.

  13. Stochastic analysis of surface roughness models in quantum wires

    NASA Astrophysics Data System (ADS)

    Nedjalkov, Mihail; Ellinghaus, Paul; Weinbub, Josef; Sadi, Toufik; Asenov, Asen; Dimov, Ivan; Selberherr, Siegfried

    2018-07-01

    We present a signed particle computational approach for the Wigner transport model and use it to analyze the electron state dynamics in quantum wires, focusing on the effect of surface roughness. Usually, surface roughness is treated as a scattering model, accounted for by the Fermi Golden Rule, which relies on approximations like statistical averaging and, in the case of quantum wires, incorporates quantum corrections based on the mode space approach. We provide a novel computational approach that enables physical analysis of these assumptions in terms of phase space and particles. We utilize the signed-particle model of Wigner evolution, which, besides providing a full quantum description of the electron dynamics, enables intuitive insights into the processes of tunneling that govern the physical evolution. It is shown that the basic assumptions of the quantum-corrected scattering model correspond to the quantum behavior of the electron system. Of particular importance is the distribution of the density: due to the quantum confinement, electrons are kept away from the walls, in contrast to the classical scattering model. Further quantum effects are retardation of the electron dynamics and quantum reflection. Far from equilibrium, the assumption of homogeneous conditions along the wire breaks down even in the case of ideal wire walls.

  14. Nonequilibrium statistical mechanics Brussels-Austin style

    NASA Astrophysics Data System (ADS)

    Bishop, Robert C.

    The fundamental problem on which Ilya Prigogine and the Brussels-Austin Group have focused can be stated briefly as follows. Our observations indicate that there is an arrow of time in our experience of the world (e.g., decay of unstable radioactive atoms like uranium, or the mixing of cream in coffee). Most of the fundamental equations of physics are time reversible, however, presenting an apparent conflict between our theoretical descriptions and experimental observations. Many have thought that the observed arrow of time was either an artifact of our observations or due to very special initial conditions. An alternative approach, followed by the Brussels-Austin Group, is to consider the observed direction of time to be a basic physical phenomenon due to the dynamics of physical systems. This essay focuses mainly on recent developments in the Brussels-Austin Group after the mid-1980s. The fundamental concerns are the same as in their earlier approaches (subdynamics, similarity transformations), but the contemporary approach utilizes rigged Hilbert space (whereas the older approaches used Hilbert space). While the emphasis on nonequilibrium statistical mechanics remains the same, their more recent approach addresses the physical features of large Poincaré systems, nonlinear dynamics and the mathematical tools necessary to analyze them.

  15. A new test statistic for climate models that includes field and spatial dependencies using Gaussian Markov random fields

    DOE PAGES

    Nosedal-Sanchez, Alvaro; Jackson, Charles S.; Huerta, Gabriel

    2016-07-20

    A new test statistic for climate model evaluation has been developed that potentially mitigates some of the limitations that exist for observing and representing field and space dependencies of climate phenomena. Traditionally such dependencies have been ignored when climate models have been evaluated against observational data, which makes it difficult to assess whether any given model is simulating observed climate for the right reasons. The new statistic uses Gaussian Markov random fields for estimating field and space dependencies within a first-order grid point neighborhood structure. We illustrate the ability of Gaussian Markov random fields to represent empirical estimates of field and space covariances using "witch hat" graphs. We further use the new statistic to evaluate the tropical response of a climate model (CAM3.1) to changes in two parameters important to its representation of cloud and precipitation physics. Overall, the inclusion of dependency information did not alter significantly the recognition of those regions of parameter space that best approximated observations. However, there were some qualitative differences in the shape of the response surface that suggest how such a measure could affect estimates of model uncertainty.
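A first-order GMRF on a regular grid can be sketched as follows. The precision matrix Q = tau(D - rho*W) over rook neighbours, and the quadratic-form statistic on a model-minus-observation field, are a simplified stand-in for the paper's actual test statistic; tau and rho are illustrative, assumed parameters:

```python
import numpy as np

def grid_precision(nrows, ncols, tau=1.0, rho=0.2):
    """Precision matrix of a first-order (rook-neighbour) GMRF on a grid.

    Q = tau * (D - rho * W), where W is the grid adjacency matrix and D
    holds neighbour counts.  With rho < 1, Q is strictly diagonally
    dominant and hence positive definite.
    """
    n = nrows * ncols
    W = np.zeros((n, n))
    for r in range(nrows):
        for c in range(ncols):
            i = r * ncols + c
            if c + 1 < ncols:                 # east neighbour
                W[i, i + 1] = W[i + 1, i] = 1.0
            if r + 1 < nrows:                 # south neighbour
                W[i, i + ncols] = W[i + ncols, i] = 1.0
    D = np.diag(W.sum(axis=1))
    return tau * (D - rho * W)

# Mahalanobis-style statistic T = d' Q d for a model-data mismatch field d:
# discrepancies are penalised while accounting for assumed spatial dependence.
Q = grid_precision(4, 5)
rng = np.random.default_rng(0)
d = rng.standard_normal(20)                   # hypothetical mismatch field
T = float(d @ Q @ d)
```

Because Q is sparse and banded for first-order neighbourhoods, this construction scales to realistic climate grids far better than a dense empirical covariance would.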

  16. A new test statistic for climate models that includes field and spatial dependencies using Gaussian Markov random fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nosedal-Sanchez, Alvaro; Jackson, Charles S.; Huerta, Gabriel

    A new test statistic for climate model evaluation has been developed that potentially mitigates some of the limitations that exist for observing and representing field and space dependencies of climate phenomena. Traditionally such dependencies have been ignored when climate models have been evaluated against observational data, which makes it difficult to assess whether any given model is simulating observed climate for the right reasons. The new statistic uses Gaussian Markov random fields for estimating field and space dependencies within a first-order grid point neighborhood structure. We illustrate the ability of Gaussian Markov random fields to represent empirical estimates of field and space covariances using "witch hat" graphs. We further use the new statistic to evaluate the tropical response of a climate model (CAM3.1) to changes in two parameters important to its representation of cloud and precipitation physics. Overall, the inclusion of dependency information did not alter significantly the recognition of those regions of parameter space that best approximated observations. However, there were some qualitative differences in the shape of the response surface that suggest how such a measure could affect estimates of model uncertainty.

  17. The Supernovae Analysis Application (SNAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bayless, Amanda J.; Fryer, Christopher Lee; Wollaeger, Ryan Thomas

    The SuperNovae Analysis aPplication (SNAP) is a new tool for the analysis of SN observations and validation of SN models. SNAP consists of a publicly available relational database with observational light curve, theoretical light curve, and correlation table sets with statistical comparison software, and a web interface available to the community. The theoretical models are intended to span a gridded range of parameter space. The goal is to have users upload new SN models or new SN observations and run the comparison software to determine correlations via the website. There are problems looming on the horizon that SNAP is beginning to solve. For example, large surveys will discover thousands of SNe annually. Frequently, the parameter space of a new SN event is unbounded. SNAP will be a resource to constrain parameters and determine if an event needs follow-up without spending resources to create new light curve models from scratch. Second, there is no rapidly available, systematic way to determine degeneracies between parameters, or even what physics is needed to model a realistic SN. The correlations made within the SNAP system are beginning to solve these problems.

  18. The Supernovae Analysis Application (SNAP)

    DOE PAGES

    Bayless, Amanda J.; Fryer, Christopher Lee; Wollaeger, Ryan Thomas; ...

    2017-09-06

    The SuperNovae Analysis aPplication (SNAP) is a new tool for the analysis of SN observations and validation of SN models. SNAP consists of a publicly available relational database with observational light curve, theoretical light curve, and correlation table sets with statistical comparison software, and a web interface available to the community. The theoretical models are intended to span a gridded range of parameter space. The goal is to have users upload new SN models or new SN observations and run the comparison software to determine correlations via the website. There are problems looming on the horizon that SNAP is beginning to solve. For example, large surveys will discover thousands of SNe annually. Frequently, the parameter space of a new SN event is unbounded. SNAP will be a resource to constrain parameters and determine if an event needs follow-up without spending resources to create new light curve models from scratch. Second, there is no rapidly available, systematic way to determine degeneracies between parameters, or even what physics is needed to model a realistic SN. The correlations made within the SNAP system are beginning to solve these problems.

  19. The Supernovae Analysis Application (SNAP)

    NASA Astrophysics Data System (ADS)

    Bayless, Amanda J.; Fryer, Chris L.; Wollaeger, Ryan; Wiggins, Brandon; Even, Wesley; de la Rosa, Janie; Roming, Peter W. A.; Frey, Lucy; Young, Patrick A.; Thorpe, Rob; Powell, Luke; Landers, Rachel; Persson, Heather D.; Hay, Rebecca

    2017-09-01

    The SuperNovae Analysis aPplication (SNAP) is a new tool for the analysis of SN observations and validation of SN models. SNAP consists of a publicly available relational database with observational light curve, theoretical light curve, and correlation table sets with statistical comparison software, and a web interface available to the community. The theoretical models are intended to span a gridded range of parameter space. The goal is to have users upload new SN models or new SN observations and run the comparison software to determine correlations via the website. There are problems looming on the horizon that SNAP is beginning to solve. For example, large surveys will discover thousands of SNe annually. Frequently, the parameter space of a new SN event is unbounded. SNAP will be a resource to constrain parameters and determine if an event needs follow-up without spending resources to create new light curve models from scratch. Second, there is no rapidly available, systematic way to determine degeneracies between parameters, or even what physics is needed to model a realistic SN. The correlations made within the SNAP system are beginning to solve these problems.
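The abstract does not specify SNAP's comparison algorithm; as an illustration only, the following toy sketch (synthetic light curves and a hypothetical two-parameter model grid, not SNAP's actual software or models) shows the kind of reduced chi-square ranking such a correlation step could perform against a gridded model set:

```python
import numpy as np

# Synthetic observation epochs in days since peak.
days = np.linspace(0.0, 100.0, 50)

def model_lc(peak_mag, decline_rate):
    """Toy light curve: linear decline in magnitudes from a peak value."""
    return peak_mag + decline_rate * days

# Synthetic "observed" light curve: one grid model plus measurement noise.
observed = model_lc(-19.0, 0.02) + np.random.default_rng(1).normal(0, 0.05, days.size)
sigma = np.full(days.size, 0.05)              # assumed photometric errors

# Hypothetical (peak magnitude, decline rate) model grid.
grid = [(-19.5, 0.01), (-19.0, 0.02), (-18.5, 0.03)]

# Reduced chi-square of the observation against each gridded model.
chi2 = [float(np.sum(((observed - model_lc(p, d)) / sigma) ** 2)) / days.size
        for p, d in grid]
best = grid[int(np.argmin(chi2))]             # best-matching model
```

Ranking models this way constrains parameters without generating new light curve models from scratch, which is the use case the record describes.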

  20. Wiring Damage Analyses for STS OV-103

    NASA Technical Reports Server (NTRS)

    Thomas, Walter, III

    2006-01-01

    This study investigated the Shuttle Program's belief that Space Transportation System (STS) wiring damage occurrences are random, that is, occur at a constant rate. Using Problem Reporting and Corrective Action (PRACA)-derived data for Space Shuttle OV-103, wiring damage was observed to increase over the vehicle's life. Causal factors could include physical deterioration of the wiring, maintenance- and inspection-induced damage, and inspection process changes resulting in more damage events being reported. Induced damage effects cannot be resolved with the existent data. Growth analysis (using Crow-AMSAA, or CA) resolved maintenance/inspection effects (e.g., heightened awareness) on all wire damages and indicated an overall increase since the Challenger Return-to-Flight (RTF). An increasing failure or occurrence rate per flight cycle was seen for each wire damage mode; these (individual) rates were not affected by inspection process effects, within statistical error.
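The Crow-AMSAA analysis mentioned here fits a power-law NHPP whose shape parameter beta distinguishes a constant occurrence rate (beta = 1) from an increasing one (beta > 1). A minimal sketch of the standard time-terminated maximum-likelihood estimates, with synthetic, illustrative damage times (not the PRACA data):

```python
import math

def crow_amsaa(times, T):
    """MLE of Crow-AMSAA (power-law NHPP) parameters for a test observed
    over [0, T] with failures at cumulative times `times`.

    beta > 1 indicates an increasing occurrence rate (deterioration),
    beta < 1 reliability growth, beta = 1 a constant (random) rate.
    """
    n = len(times)
    beta = n / sum(math.log(T / t) for t in times)
    lam = n / T ** beta                       # scale parameter
    return beta, lam

# Synthetic damage times (e.g. flight cycles) clustered late in the
# observation period, so the fitted beta should exceed 1.
times = [40, 55, 70, 78, 85, 90, 94, 97, 99]
beta, lam = crow_amsaa(times, 100.0)
```

Comparing the fitted beta against 1 (with its confidence bound) is exactly the kind of test that distinguishes the Program's constant-rate belief from the increasing trend the study reports.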

  1. Reflections on Gibbs: From Statistical Physics to the Amistad V3.0

    NASA Astrophysics Data System (ADS)

    Kadanoff, Leo P.

    2014-07-01

    This note is based upon a talk given at an APS meeting in celebration of the achievements of J. Willard Gibbs. J. Willard Gibbs, the younger, was the first American physical sciences theorist. He was one of the inventors of statistical physics. He introduced and developed the concepts of phase space, phase transitions, and thermodynamic surfaces in a remarkably correct and elegant manner. These three concepts form the basis of different areas of physics. The connection among these areas has been a subject of deep reflection from Gibbs' time to our own. This talk therefore celebrated Gibbs by describing modern ideas about how different parts of physics fit together. I finished with a more personal note. Our own J. Willard Gibbs had all his many achievements concentrated in science. His father, also J. Willard Gibbs, also a Professor at Yale, had one great non-academic achievement that remains unmatched in our day. I describe it.

  2. Statistical analysis of Geopotential Height (GH) timeseries based on Tsallis non-extensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Karakatsanis, L. P.; Iliopoulos, A. C.; Pavlos, E. G.; Pavlos, G. P.

    2018-02-01

    In this paper, we perform statistical analysis of time series deriving from Earth's climate. The time series concern Geopotential Height (GH) and correspond to temporal and spatial components of the global distribution of monthly average values during the period 1948-2012. The analysis is based on Tsallis non-extensive statistical mechanics, and in particular on the estimation of the Tsallis q-triplet, namely {qstat, qsens, qrel}, the reconstructed phase space, the estimation of the correlation dimension, and the Hurst exponent from rescaled range analysis (R/S). The deviation of the Tsallis q-triplet from unity indicates a non-Gaussian (Tsallis q-Gaussian) non-extensive character with heavy-tailed probability density functions (PDFs), multifractal behavior and long-range dependence for all timeseries considered. Noticeable differences in the q-triplet estimates were also found between timeseries at distinct spatial or temporal regions. Moreover, the reconstructed phase space revealed a lower-dimensional fractal set in the GH dynamical phase space (strong self-organization), and the estimation of the Hurst exponent indicated multifractality, non-Gaussianity and persistence. The analysis provides significant information for identifying and characterizing the dynamical characteristics of Earth's climate.
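The rescaled range (R/S) estimate of the Hurst exponent used in this record can be sketched as follows. The white-noise input is a synthetic sanity check (H should be near 0.5), not the GH data; persistent series of the kind the paper reports would give H > 0.5:

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Hurst exponent via rescaled-range (R/S) analysis.

    For each window size n, the series is split into blocks; in each
    block the range of the cumulative mean-adjusted sum is divided by
    the block's standard deviation.  H is the slope of log(R/S) versus
    log(n).
    """
    x = np.asarray(x, dtype=float)
    rs = []
    for n in window_sizes:
        vals = []
        for start in range(0, len(x) - n + 1, n):
            block = x[start:start + n]
            z = np.cumsum(block - block.mean())   # mean-adjusted profile
            s = block.std()
            if s > 0:
                vals.append((z.max() - z.min()) / s)
        rs.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(42)
H = hurst_rs(rng.standard_normal(4096), [16, 32, 64, 128, 256])
```

Known small-sample bias pushes R/S estimates for white noise slightly above 0.5, which is why published analyses typically pair R/S with other estimators before claiming persistence.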

  3. Physics Teachers and Students: A Statistical and Historical Analysis of Women

    NASA Astrophysics Data System (ADS)

    Gregory, Amanda

    2009-10-01

    Historically, women have been denied an education comparable to that available to men. Since women have been allowed into institutions of higher learning, they have been studying and earning physics degrees. The aim of this poster is to discuss the statistical relationship between the number of women enrolled in university physics programs and the number of female physics faculty members. Special care has been given to examining the statistical data in the context of the social climate at the time that these women were teaching or pursuing their education.

  4. Superposed epoch analysis of physiological fluctuations: possible space weather connections

    NASA Astrophysics Data System (ADS)

    Wanliss, James; Cornélissen, Germaine; Halberg, Franz; Brown, Denzel; Washington, Brien

    2018-03-01

    There is a strong connection between space weather and fluctuations in technological systems. Some studies also suggest a statistical connection between space weather and subsequent fluctuations in the physiology of living creatures. This connection, however, has remained controversial and difficult to demonstrate. Here we present support for a response of human physiology to forcing from the explosive onset of the largest of space weather events—space storms. We consider a case study with over 16 years of high temporal resolution measurements of human blood pressure (systolic, diastolic) and heart rate variability to search for associations with space weather. We find no statistically significant change in human blood pressure but a statistically significant drop in heart rate during the main phase of space storms. Our empirical findings shed light on how human physiology may respond to exogenous space weather forcing.
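Superposed epoch analysis, the method used in this study, can be sketched with synthetic data: windows of the measured series are stacked around each event onset and averaged, so a common response (like the reported heart-rate drop) stands out while unrelated fluctuations average away. Values here are illustrative, not the study's data:

```python
import numpy as np

def superposed_epoch(series, epoch_indices, before, after):
    """Stack windows of `series` around each epoch index and average
    across events; only epochs with a full window are used."""
    windows = [series[i - before:i + after]
               for i in epoch_indices
               if i - before >= 0 and i + after <= len(series)]
    return np.mean(windows, axis=0)

# Synthetic example: a small dip buried in noise, recurring after each
# of 30 epochs (e.g. storm main-phase onsets).
rng = np.random.default_rng(7)
series = rng.normal(0.0, 1.0, 10000)
epochs = rng.integers(100, 9900, 30)
for e in epochs:
    series[e:e + 20] -= 1.0                   # the superposed signal

avg = superposed_epoch(series, epochs, before=50, after=50)
```

With N events, incoherent noise shrinks by roughly sqrt(N) in the average, which is what makes a sub-noise physiological response detectable over 16 years of storms.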

  5. Superposed epoch analysis of physiological fluctuations: possible space weather connections.

    PubMed

    Wanliss, James; Cornélissen, Germaine; Halberg, Franz; Brown, Denzel; Washington, Brien

    2018-03-01

    There is a strong connection between space weather and fluctuations in technological systems. Some studies also suggest a statistical connection between space weather and subsequent fluctuations in the physiology of living creatures. This connection, however, has remained controversial and difficult to demonstrate. Here we present support for a response of human physiology to forcing from the explosive onset of the largest of space weather events-space storms. We consider a case study with over 16 years of high temporal resolution measurements of human blood pressure (systolic, diastolic) and heart rate variability to search for associations with space weather. We find no statistically significant change in human blood pressure but a statistically significant drop in heart rate during the main phase of space storms. Our empirical findings shed light on how human physiology may respond to exogenous space weather forcing.

  6. SpacePy - a Python-based library of tools for the space sciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morley, Steven K; Welling, Daniel T; Koller, Josef

    Space science deals with the bodies within the solar system and the interplanetary medium; the primary focus is on atmospheres and above - at Earth the short timescale variation in the geomagnetic field, the Van Allen radiation belts and the deposition of energy into the upper atmosphere are key areas of investigation. SpacePy is a package for Python, targeted at the space sciences, that aims to make basic data analysis, modeling and visualization easier. It builds on the capabilities of the well-known NumPy and MatPlotLib packages. Publication quality output direct from analyses is emphasized. The SpacePy project seeks to promote accurate and open research standards by providing an open environment for code development. In the space physics community there has long been a significant reliance on proprietary languages that restrict free transfer of data and reproducibility of results. By providing a comprehensive, open-source library of widely used analysis and visualization tools in a free, modern and intuitive language, we hope that this reliance will be diminished. SpacePy includes implementations of widely used empirical models, statistical techniques used frequently in space science (e.g. superposed epoch analysis), and interfaces to advanced tools such as electron drift shell calculations for radiation belt studies. SpacePy also provides analysis and visualization tools for components of the Space Weather Modeling Framework - currently this only includes the BATS-R-US 3-D magnetohydrodynamic model and the RAM ring current model - including streamline tracing in vector fields. Further development is currently underway. External libraries, which include well-known magnetic field models, high-precision time conversions and coordinate transformations, are wrapped for access from Python using SWIG and f2py. The rest of the tools have been implemented directly in Python. 
The provision of open-source tools to perform common tasks will provide openness in the analysis methods employed in scientific studies and will give access to advanced tools to all space scientists regardless of affiliation or circumstance.

  7. The 1984 NASA/ASEE summer faculty fellowship program

    NASA Technical Reports Server (NTRS)

    Mcinnis, B. C.; Duke, M. B.; Crow, B.

    1984-01-01

    An overview is given of the program management and activities. Participants and research advisors are listed. Abstracts describe and present the results of research assignments performed by 31 fellows either at the Johnson Space Center, at the White Sands Test Facility, or at the California Space Institute in La Jolla. Disciplines studied include engineering; biology/life sciences; Earth sciences; chemistry; mathematics/statistics/computer sciences; and physics/astronomy.

  8. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    NASA Technical Reports Server (NTRS)

    Tamayo, Tak Chai

    1987-01-01

    Quality of software is vital not only to the successful operation of the space station; it is also an important factor in establishing testing requirements, the time needed for software verification and integration, as well as launching schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancies, making traditional statistical analysis unsuitable for evaluating the reliability of software. A statistical model was developed to provide a representation of the number as well as the types of failures occurring during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
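The record does not specify the statistical model used; as one standard choice for this class of problem, a Goel-Okumoto NHPP sketch (with synthetic failure counts, not the multisystem integration facility data) shows how cumulative test failures can be fitted to estimate the expected number of fixes still required, the quantity named in the abstract:

```python
import numpy as np
from scipy.optimize import curve_fit

def mean_failures(t, a, b):
    """Goel-Okumoto NHPP mean value function: expected cumulative
    failures by test time t, where a is the total expected number of
    failures and b the per-fault detection rate."""
    return a * (1.0 - np.exp(-b * t))

# Synthetic cumulative failure counts over twelve weeks of testing.
weeks = np.arange(1, 13, dtype=float)
observed = np.array([12, 21, 29, 35, 40, 44, 47, 49, 51, 52, 53, 54], float)

(a_hat, b_hat), _ = curve_fit(mean_failures, weeks, observed, p0=(60.0, 0.1))
remaining = a_hat - observed[-1]      # expected failures left to find
```

A termination criterion of the kind the abstract mentions can then be phrased as "stop testing when `remaining` falls below a reliability objective."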

  9. Spinorial Regge trajectories and Hagedorn-like temperatures. Spinorial space-time and preons as an alternative to strings

    NASA Astrophysics Data System (ADS)

    Gonzalez-Mestres, Luis

    2016-11-01

    The development of the statistical bootstrap model for hadrons, quarks and nuclear matter occurred during the 1960s and the 1970s in a period of exceptional theoretical creativity. Just as the transition from hadrons to quarks and gluons as fundamental particles was then effected, a transition from standard particles to preons and from the standard space-time to a spinorial one may now be necessary, including related pre-Big Bang scenarios. We present here a brief historical analysis of the scientific problematic of the 1960s in Particle Physics and of its evolution until the end of the 1970s, including cosmological issues. Particular attention is devoted to the exceptional role of Rolf Hagedorn and to the progress of the statistical bootstrap model until the experimental search for the quark-gluon plasma started being considered. In parallel, we present recent results and ideas concerning Particle Physics and Cosmology, and discuss current open questions. Assuming preons to be constituents of the physical vacuum and the standard particles excitations of this vacuum (the superbradyon hypothesis we introduced in 1995), together with a spinorial space-time (SST), a new kind of Regge trajectories is expected to arise where the angular momentum spacing will be 1/2 instead of 1. Standard particles can lie on such Regge trajectories inside associated internal symmetry multiplets, and the preonic vacuum structure can generate a new approach to Quantum Field Theory. As superbradyons are superluminal preons, some of the vacuum excitations can have critical speeds larger than the speed of light c, but the cosmological evolution selects by itself the particles with the smallest critical speed (the speed of light). In the new Particle Physics and Cosmology emerging from the pattern thus developed, Hagedorn-like temperatures will naturally be present. 
As new space, time, momentum and energy scales are expected to be generated by the preonic vacuum dynamics, the Planck scale does not necessarily make sense in the new scenario. It also turns out that two potential pieces of evidence for a superbradyonic vacuum with an SST geometry already exist: i) the recent results on quantum entanglement at large distances favoring superluminal propagation of signals and correlations; ii) the anisotropy of the cosmic microwave background radiation between two hemispheres observed by the Planck Collaboration, in agreement with the predictions of the cosmic SST, which automatically generates a privileged space direction for each comoving observer. Alongside the discussion of the large number of open questions, we comment on the required experimental and observational programs. This paper is dedicated to the memory of Rolf Hagedorn.

  10. REU program in Solar Physics at Montana State University

    NASA Astrophysics Data System (ADS)

    Martens, P. C.; Canfield, R. C.; McKenzie, D. M.

    2005-12-01

    I will present an overview of the REU program in Solar Physics and Space Weather that has existed at Montana State University since 1999, with NSF support since 2003. I will briefly describe the goals, organization, scientific contents and results, and present statistics on applications, participants, gender balance, and diversity. This will be concluded by an overview of our plans for the future.

  11. The applications of Complexity Theory and Tsallis Non-extensive Statistics at Solar Plasma Dynamics

    NASA Astrophysics Data System (ADS)

    Pavlos, George

    2015-04-01

    As the solar plasma lives far from equilibrium, it is an excellent laboratory for testing complexity theory and non-equilibrium statistical mechanics. In this study, we present the highlights of complexity theory and Tsallis non-extensive statistical mechanics as concerns their applications to solar plasma dynamics, especially sunspot, solar flare and solar wind phenomena. Generally, when a physical system is driven far from equilibrium states, some novel characteristics can be observed related to the nonlinear character of the dynamics. In particular, the nonlinearity in space plasma dynamics can generate intermittent turbulence with the typical characteristics of the anomalous diffusion process and strange topologies of stochastic space plasma fields (velocity and magnetic fields) caused by the strange dynamics and strange kinetics (Zaslavsky, 2002). In addition, according to Zelenyi and Milovanov (2004), the complex character of the space plasma system includes the existence of non-equilibrium (quasi-)stationary states (NESS) having the topology of a percolating fractal set. The stabilization of a system near the NESS is perceived as a transition into a turbulent state determined by self-organization processes. The long-range correlation effects manifest themselves as a strange non-Gaussian behavior of kinetic processes near the NESS plasma state. The complex character of space plasma can also be described by the non-extensive statistical thermodynamics pioneered by Tsallis, which offers a consistent and effective theoretical framework, based on a generalization of Boltzmann-Gibbs (BG) entropy, to describe far from equilibrium nonlinear complex dynamics (Tsallis, 2009). In a series of recent papers, the hypothesis of Tsallis non-extensive statistics in the magnetosphere, sunspot dynamics, solar flares, solar wind and space plasma in general was tested and verified (Karakatsanis et al., 2013; Pavlos et al., 2014; 2015). 
    Our study includes the analysis of solar plasma time series in three cases: sunspot index, solar flare and solar wind data. The nonlinear analysis of the sunspot index is embedded in the non-extensive statistical theory of Tsallis (1988; 2004; 2009). The q-triplet of Tsallis, as well as the correlation dimension and the Lyapunov exponent spectrum, were estimated for the SVD components of the sunspot index time series. Also, the multifractal scaling exponent spectrum f(a), the generalized Renyi dimension spectrum D(q) and the spectrum J(p) of the structure function exponents were estimated experimentally and theoretically by using the q-entropy principle included in Tsallis non-extensive statistical theory, following Arimitsu and Arimitsu (2000, 2001). Our analysis showed clearly the following: (a) a phase transition process in the solar dynamics from a high dimensional non-Gaussian SOC state to a low dimensional non-Gaussian chaotic state, (b) strong intermittent solar turbulence and an anomalous (multifractal) diffusion solar process, which is strengthened as the solar dynamics makes a phase transition to low dimensional chaos, in accordance with the studies of Ruzmaikin, Zelenyi and Milovanov (Zelenyi and Milovanov, 1991; Milovanov and Zelenyi, 1993; Ruzmaikin et al., 1996), (c) faithful agreement of Tsallis non-equilibrium statistical theory with the experimental estimations of: (i) the non-Gaussian probability distribution function P(x), (ii) the multifractal scaling exponent spectrum f(a) and the generalized Renyi dimension spectrum D(q), and (iii) the exponent spectrum J(p) of the structure functions estimated for the sunspot index and its underlying non-equilibrium solar dynamics. Also, the q-triplet of Tsallis, as well as the correlation dimension and the Lyapunov exponent spectrum, were estimated for the singular value decomposition (SVD) components of the solar flare time series. 
    Also, the multifractal scaling exponent spectrum f(a), the generalized Renyi dimension spectrum D(q) and the spectrum J(p) of the structure function exponents were estimated experimentally and theoretically by using the q-entropy principle included in Tsallis non-extensive statistical theory, following Arimitsu and Arimitsu (2000). Our analysis showed clearly the following: (a) a phase transition process in the solar flare dynamics from a high dimensional non-Gaussian self-organized critical (SOC) state to a low dimensional, also non-Gaussian, chaotic state, (b) strong intermittent solar corona turbulence and an anomalous (multifractal) diffusion solar corona process, which is strengthened as the solar corona dynamics makes a phase transition to low dimensional chaos, (c) faithful agreement of Tsallis non-equilibrium statistical theory with the experimental estimations of: (i) the non-Gaussian probability distribution function P(x), (ii) f(a) and D(q), and (iii) J(p) for the solar flare time series and its underlying non-equilibrium solar dynamics, and (d) a solar flare dynamical profile similar to that of the solar corona zone as far as the phase transition process from the self-organized criticality (SOC) state to the chaos state is concerned. However, the solar low corona (solar flare) dynamical characteristics can be clearly discriminated from those of the solar convection zone. Finally, we present novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event. The solar wind plasma, as well as the entire solar plasma system, is a typical case of stochastic spatiotemporal distribution of physical state variables such as force fields and matter fields (particle and current densities or bulk plasma distributions). 
    This study shows clearly the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. It also underlines the inefficiency of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of the solar wind dynamics, since these theories include smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian non-extensive statistics with heavy-tailed probability distribution functions, which are related to the q-extension of the CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992). 
    References
    1. T. Arimitsu, N. Arimitsu, Tsallis statistics and fully developed turbulence, J. Phys. A: Math. Gen. 33 (2000) L235.
    2. T. Arimitsu, N. Arimitsu, Analysis of turbulence by statistics based on generalized entropies, Physica A 295 (2001) 177-194.
    3. T. Chang, Low-dimensional behavior and symmetry breaking of stochastic systems near criticality: can these effects be observed in space and in the laboratory?, IEEE Trans. Plasma Sci. 20 (6) (1992) 691-694.
    4. U. Frisch, Turbulence, Cambridge University Press, Cambridge, UK, 1996, p. 310.
    5. L.P. Karakatsanis, G.P. Pavlos, M.N. Xenakis, Tsallis non-extensive statistics, intermittent turbulence, SOC and chaos in the solar plasma. Part two: Solar flares dynamics, Physica A 392 (2013) 3920-3944.
    6. A.V. Milovanov, Topological proof for the Alexander-Orbach conjecture, Phys. Rev. E 56 (3) (1997) 2437-2446.
    7. A.V. Milovanov, L.M. Zelenyi, Fracton excitations as a driving mechanism for the self-organized dynamical structuring in the solar wind, Astrophys. Space Sci. 264 (1-4) (1999) 317-345.
    8. A.V. Milovanov, Stochastic dynamics from the fractional Fokker-Planck-Kolmogorov equation: large-scale behavior of the turbulent transport coefficient, Phys. Rev. E 63 (2001) 047301.
    9. G.P. Pavlos, et al., Universality of non-extensive Tsallis statistics and time series analysis: Theory and applications, Physica A 395 (2014) 58-95.
    10. G.P. Pavlos, et al., Tsallis non-extensive statistics and solar wind plasma complexity, Physica A 422 (2015) 113-135.
    11. A.A. Ruzmaikin, et al., Spectral properties of solar convection and diffusion, ApJ 471 (1996) 1022.
    12. V.E. Tarasov, Review of some promising fractional physical models, Internat. J. Modern Phys. B 27 (9) (2013) 1330005.
    13. C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1-2) (1988) 479-487.
    14. C. Tsallis, Nonextensive statistical mechanics: construction and physical interpretation, in: M. Gell-Mann, C. Tsallis (Eds.), Nonextensive Entropy: Interdisciplinary Applications, Oxford Univ. Press, 2004, pp. 1-53.
    15. C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, 2009.
    16. G.M. Zaslavsky, Chaos, fractional kinetics, and anomalous transport, Physics Reports 371 (2002) 461-580.
    17. L.M. Zelenyi, A.V. Milovanov, Fractal properties of sunspots, Sov. Astron. Lett. 17 (6) (1991) 425.
    18. L.M. Zelenyi, A.V. Milovanov, Fractal topology and strange kinetics: from percolation theory to problems in cosmic electrodynamics, Phys.-Usp. 47 (8) (2004) 749-788.
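The q-entropy underlying the Tsallis framework invoked throughout this record, S_q = (1 - Σ_i p_i^q)/(q - 1), recovers the Boltzmann-Gibbs entropy as q → 1 and is non-additive for q ≠ 1. A minimal numerical sketch (an illustration, not the authors' analysis code):

```python
import math

def tsallis_entropy(p, q):
    """Tsallis q-entropy S_q = (1 - sum_i p_i**q) / (q - 1), with k_B = 1.
    In the limit q -> 1 it recovers the Boltzmann-Gibbs/Shannon entropy."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# Uniform distribution over W = 4 states.
p4 = [0.25] * 4
print(tsallis_entropy(p4, 1.0))  # Boltzmann-Gibbs limit: ln 4 ~ 1.3863
print(tsallis_entropy(p4, 2.0))  # 1 - sum p^2 = 0.75

# Non-additivity: for independent subsystems A and B,
# S_q(A+B) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B).
sA = tsallis_entropy([0.5, 0.5], 2.0)
print(2 * sA + (1 - 2) * sA * sA)  # equals S_2 of the joint 4-state system: 0.75
```

For q ≠ 1 the entropy of independent subsystems is not the sum of the parts, which is the precise sense in which the statistics above are "non-extensive".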

  12. Meta-analysis inside and outside particle physics: two traditions that should converge?

    PubMed

    Baker, Rose D; Jackson, Dan

    2013-06-01

    The use of meta-analysis in medicine and epidemiology really took off in the 1970s. However, in high-energy physics, the Particle Data Group has been carrying out meta-analyses of measurements of particle masses and other properties since 1957. Curiously, there has been virtually no interaction between those working inside and outside particle physics. In this paper, we use statistical models to study two major differences in practice. The first is the usefulness of systematic errors, which physicists are now beginning to quote in addition to statistical errors. The second is whether it is better to treat heterogeneity by scaling up errors as do the Particle Data Group or by adding a random effect as does the rest of the community. Besides fitting models, we derive and use an exact test of the error-scaling hypothesis. We also discuss the other methodological differences between the two streams of meta-analysis. Our conclusion is that systematic errors are not currently very useful and that the conventional random effects model, as routinely used in meta-analysis, has a useful role to play in particle physics. The moral we draw for statisticians is that we should be more willing to explore 'grassroots' areas of statistical application, so that good statistical practice can flow both from and back to the statistical mainstream. Copyright © 2012 John Wiley & Sons, Ltd.
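The two heterogeneity treatments compared in this record can be sketched side by side (a minimal illustration with made-up measurements, not the authors' models): the PDG approach inflates the weighted-mean error by the scale factor S = sqrt(chi²/(N-1)), while the random-effects approach adds a between-study variance tau² (here via the DerSimonian-Laird moment estimator) to each measurement's variance.

```python
import math

def pdg_average(x, s):
    """PDG-style average: error-weighted mean whose uncertainty is inflated by
    the scale factor S = sqrt(chi2 / (N - 1)) whenever S > 1."""
    w = [1.0 / si ** 2 for si in s]
    mean = sum(wi * xi for wi, xi in zip(w, x)) / sum(w)
    err = math.sqrt(1.0 / sum(w))
    chi2 = sum(wi * (xi - mean) ** 2 for wi, xi in zip(w, x))
    scale = math.sqrt(chi2 / (len(x) - 1))
    return mean, err * max(scale, 1.0), scale

def dersimonian_laird(x, s):
    """Random-effects average: the DerSimonian-Laird moment estimate of the
    between-study variance tau^2 is added to each measurement's variance."""
    w = [1.0 / si ** 2 for si in s]
    mean_fe = sum(wi * xi for wi, xi in zip(w, x)) / sum(w)
    Q = sum(wi * (xi - mean_fe) ** 2 for wi, xi in zip(w, x))
    tau2 = max(0.0, (Q - (len(x) - 1)) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))
    wstar = [1.0 / (si ** 2 + tau2) for si in s]
    mean_re = sum(wi * xi for wi, xi in zip(wstar, x)) / sum(wstar)
    return mean_re, math.sqrt(1.0 / sum(wstar)), tau2

# Made-up heterogeneous "measurements" of the same quantity.
x = [10.2, 9.8, 11.0, 10.5]
s = [0.3, 0.3, 0.4, 0.5]
print(pdg_average(x, s))        # (mean, inflated error, scale factor)
print(dersimonian_laird(x, s))  # (mean, error, tau^2)
```

Both recipes widen the quoted uncertainty when the measurements scatter more than their stated errors allow, but they weight discrepant measurements differently, which is exactly the practical difference the paper examines.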

  13. Algorithms for Spectral Decomposition with Applications to Optical Plume Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.; Matthews, Bryan; Das, Santanu

    2008-01-01

    The analysis of spectral signals for features that represent physical phenomena is ubiquitous in the science and engineering communities. There are two main approaches to extracting relevant features from these high-dimensional data streams. The first set of approaches relies on extracting features using a physics-based paradigm, where the underlying physical mechanism that generates the spectra is used to infer the most important features in the data stream. We focus on a complementary methodology: a data-driven technique that is informed by the underlying physics but also has the ability to adapt to unmodeled system attributes and dynamics. We discuss four algorithms: the Spectral Decomposition Algorithm (SDA), Non-Negative Matrix Factorization (NMF), Independent Component Analysis (ICA) and Principal Components Analysis (PCA), and compare their performance on a spectral emulator which we use to generate artificial data with known statistical properties. This spectral emulator mimics the real-world phenomena arising from the plume of the space shuttle main engine; it can be used to validate the results of various spectral decomposition algorithms and is very useful for situations where real-world systems have very low probabilities of fault or failure. Our results indicate that methods like SDA and NMF provide a straightforward way of incorporating prior physical knowledge, while NMF with a tuning mechanism can give superior performance on some tests. We demonstrate how these algorithms detect potential system-health issues using data from a spectral emulator with tunable health parameters.
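Of the four algorithms, NMF illustrates most directly how a constraint encodes prior physical knowledge: emission spectra and their mixing weights cannot be negative. A minimal sketch using the standard Lee-Seung multiplicative updates on synthetic spectra (generic NMF, not the authors' tuned variant):

```python
import random

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def nmf(V, rank, iters=2000, seed=0, eps=1e-9):
    """Lee-Seung multiplicative updates: factor nonnegative V (m x n) ~ W H,
    with W (m x rank) and H (rank x n) kept elementwise nonnegative."""
    rng = random.Random(seed)
    m, n = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(rank)] for _ in range(m)]
    H = [[rng.random() + 0.1 for _ in range(n)] for _ in range(rank)]
    for _ in range(iters):
        Wt = transpose(W)
        num, den = matmul(Wt, V), matmul(Wt, matmul(W, H))
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(n)] for i in range(rank)]
        Ht = transpose(H)
        num, den = matmul(V, Ht), matmul(matmul(W, H), Ht)
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(rank)] for i in range(m)]
    return W, H

# Synthetic "spectra": two nonnegative basis lines mixed with nonnegative weights.
basis = [[1, 0, 2, 0, 1], [0, 3, 0, 1, 0]]
mix = [[1, 0], [0, 1], [2, 1], [1, 2]]
V = matmul(mix, basis)
W, H = nmf(V, 2)
WH = matmul(W, H)
resid = sum((V[i][j] - WH[i][j]) ** 2 for i in range(len(V)) for j in range(len(V[0])))
print(resid)  # squared reconstruction error; small for this exact rank-2 example
```

Because the updates are multiplicative, W and H stay nonnegative throughout, so the recovered rows of H can be read directly as candidate emission-line profiles.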

  14. Statistical analysis of 59 inspected SSME HPFTP turbine blades (uncracked and cracked)

    NASA Technical Reports Server (NTRS)

    Wheeler, John T.

    1987-01-01

    The numerical results of a statistical analysis of test data from Space Shuttle Main Engine high-pressure fuel turbopump second-stage turbine blades, including some with cracks, are presented. Several statistical methods are applied to the test data to assess differences in frequency variations between the uncracked and cracked blades.

  15. The prior statistics of object colors.

    PubMed

    Koenderink, Jan J

    2010-02-01

    The prior statistics of object colors is of much interest because extensive statistical investigations of reflectance spectra reveal highly non-uniform structure in color space common to several very different databases. This common structure is due to the visual system rather than to the statistics of environmental structure. Analysis involves an investigation of the proper sample space of spectral reflectance factors and of the statistical consequences of the projection of spectral reflectances on the color solid. Even in the case of reflectance statistics that are translationally invariant with respect to the wavelength dimension, the statistics of object colors is highly non-uniform. The qualitative nature of this non-uniformity is due to trichromacy.

  16. Research in space science and technology. [including X-ray astronomy and interplanetary plasma physics

    NASA Technical Reports Server (NTRS)

    Beckley, L. E.

    1977-01-01

    Progress in various space flight research programs is reported. Emphasis is placed on X-ray astronomy and interplanetary plasma physics. Topics covered include: infrared astronomy, long base line interferometry, geological spectroscopy, space life science experiments, atmospheric physics, and space based materials and structures research. Analysis of galactic and extra-galactic X-ray data from the Small Astronomy Satellite (SAS-3) and HEAO-A and interplanetary plasma data for Mariner 10, Explorers 47 and 50, and Solrad is discussed.

  17. A Monte Carlo Analysis of the Thrust Imbalance for the Space Launch System Booster During Both the Ignition Transient and Steady State Operation

    NASA Technical Reports Server (NTRS)

    Foster, Winfred A., Jr.; Crowder, Winston; Steadman, Todd E.

    2014-01-01

    This paper presents the results of statistical analyses performed to predict the thrust imbalance between the two solid rocket motor boosters to be used on the Space Launch System (SLS) vehicle. Two legacy internal ballistics codes developed for the Space Shuttle program were coupled with a Monte Carlo analysis code to determine a thrust imbalance envelope for the SLS vehicle based on the performance of 1000 motor pairs. Thirty-three variables that could impact the performance of the motors during the ignition transient and thirty-eight variables that could impact their performance during steady-state operation were identified and treated as statistical variables for the analyses. The effects of motor-to-motor variation as well as variations between the motors of a single pair were included in the analyses. The statistical variations of the variables were defined based on data provided by NASA's Marshall Space Flight Center for the upgraded five-segment booster and from the Space Shuttle booster when appropriate. The results obtained for the statistical envelope are compared with the design-specification thrust imbalance limits for the SLS launch vehicle.
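The paired-motor sampling idea can be sketched in miniature. The nominal thrust and variation magnitudes below are illustrative placeholders, not SLS values, and the internal ballistics codes are replaced by a trivial thrust model; the point is only the structure of shared versus individual variation within a pair:

```python
import random
import statistics

def simulate_pairs(n_pairs, seed=1):
    """Toy Monte Carlo for booster thrust imbalance: each motor's thrust is a
    nominal value perturbed by a lot effect shared within the pair plus an
    individual motor effect; the imbalance is the pair's thrust difference."""
    rng = random.Random(seed)
    nominal = 16.0e6  # N -- an illustrative placeholder, not an SLS figure
    imbalances = []
    for _ in range(n_pairs):
        lot = rng.gauss(0.0, 0.005)  # variation shared by both motors of a pair
        t_a = nominal * (1.0 + lot + rng.gauss(0.0, 0.003))
        t_b = nominal * (1.0 + lot + rng.gauss(0.0, 0.003))
        imbalances.append(t_a - t_b)
    return imbalances

imb = simulate_pairs(1000)
q = statistics.quantiles([abs(v) for v in imb], n=100)
print(f"99th-percentile |imbalance| over 1000 pairs: {q[98]:,.0f} N")
```

Note that the shared lot effect cancels in the difference, so only the motor-to-motor terms drive the imbalance envelope; this is why distinguishing pair-level from motor-level variation matters in such analyses.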

  18. Statistics of link blockage due to cloud cover for free-space optical communications using NCDC surface weather observation data

    NASA Technical Reports Server (NTRS)

    Slobin, S. D.; Piazzolla, S.

    2002-01-01

    Cloud opacity is one of the main atmospheric physical phenomena that can jeopardize the successful completion of an optical link between a spacecraft and a ground station. Hence, the site location chosen for a telescope used for optical communications must rely on knowledge of weather and cloud cover statistics for the geographical area where the telescope itself is located.

  19. Dynamic State Estimation of Terrestrial and Solar Plasmas

    NASA Astrophysics Data System (ADS)

    Kamalabadi, Farzad

    A pervasive problem in virtually all branches of space science is the estimation of multi-dimensional state parameters of a dynamical system from a collection of indirect, often incomplete, and imprecise measurements. Subsequent scientific inference is predicated on rigorous analysis, interpretation, and understanding of physical observations and on the reliability of the associated quantitative statistical bounds and performance characteristics of the algorithms used. In this work, we focus on these dynamic state estimation problems and illustrate their importance in the context of two timely activities in space remote sensing. First, we discuss the estimation of multi-dimensional ionospheric state parameters from UV spectral imaging measurements anticipated to be acquired by the recently selected NASA Heliophysics mission, Ionospheric Connection Explorer (ICON). Next, we illustrate that similar state-space formulations provide the means for the estimation of 3D, time-dependent densities and temperatures in the solar corona from a series of white-light and EUV measurements. We demonstrate that, while a general framework for the stochastic formulation of the state estimation problem is suited for systematic inference of the parameters of a hidden Markov process, several challenges must be addressed in the assimilation of an increasing volume and diversity of space observations. These challenges are: (1) the computational tractability when faced with voluminous and multimodal data, (2) the inherent limitations of the underlying models, which assume, often incorrectly, linear dynamics and Gaussian noise, and (3) the unavailability or inaccuracy of transition probabilities and noise statistics. We argue that pursuing answers to these questions necessitates cross-disciplinary research that enables progress toward systematically reconciling observational and theoretical understanding of the space environment.

  20. The physics of flocking: Correlation as a compass from experiments to theory

    NASA Astrophysics Data System (ADS)

    Cavagna, Andrea; Giardina, Irene; Grigera, Tomás S.

    2018-01-01

    Collective behavior in biological systems is a complex topic, to say the least. It runs wildly across scales in both space and time, involving taxonomically vastly different organisms, from bacteria and cell clusters, to insect swarms and up to vertebrate groups. It entails concepts as diverse as coordination, emergence, interaction, information, cooperation, decision-making, and synchronization. Amid this jumble, however, we cannot help noting many similarities between collective behavior in biological systems and collective behavior in statistical physics, even though none of these organisms remotely looks like an Ising spin. Such similarities, though somewhat qualitative, are startling, and concern mostly the emergence of global dynamical patterns qualitatively different from individual behavior, and the development of system-level order from local interactions. It is therefore tempting to describe collective behavior in biology within the conceptual framework of statistical physics, in the hope of extending to this new fascinating field at least part of the great predictive power of theoretical physics. In this review we propose that the conceptual cornerstone of this ambitious program be that of correlation. To illustrate this idea we address the case of collective behavior in bird flocks. Two key threads emerge, as two sides of one single story: the presence of scale-free correlations and the dynamical mechanism of information transfer. We discuss first static correlations in starling flocks, in particular the experimental finding of their scale-free nature, the formulation of models that account for this fact using maximum entropy, and the relation of scale-free correlations to information transfer. 
This is followed by a dynamic treatment of information propagation (propagation of turns across a flock), starting with a discussion of experimental results and following with possible theoretical explanations of those, which require the addition of behavioral inertia to existing theories of flocking. We finish with the definition and analysis of space-time correlations and their relevance to the detection of inertial behavior in the absence of external perturbations.
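The connected correlation function at the center of this program can be sketched numerically. The toy "flock" below, with headings that drift smoothly across space, is an assumption for illustration, not starling data:

```python
import math
import random

def connected_correlation(pos, vel, nbins=10):
    """Connected correlation of 2D velocity fluctuations, as used for flocks:
    dv_i = v_i - <v>;  C(r) = < dv_i . dv_j > over pairs at distance ~ r."""
    n = len(vel)
    mean = [sum(v[k] for v in vel) / n for k in range(2)]
    dv = [(v[0] - mean[0], v[1] - mean[1]) for v in vel]
    rmax = max(math.dist(pos[i], pos[j]) for i in range(n) for j in range(i + 1, n))
    sums, counts = [0.0] * nbins, [0] * nbins
    for i in range(n):
        for j in range(i + 1, n):
            b = min(int(math.dist(pos[i], pos[j]) / rmax * nbins), nbins - 1)
            sums[b] += dv[i][0] * dv[j][0] + dv[i][1] * dv[j][1]
            counts[b] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# Toy "flock": headings drift smoothly with x, so nearby birds fluctuate together.
rng = random.Random(0)
pos = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(200)]
vel = [(math.cos(0.3 * x), math.sin(0.3 * x)) for x, y in pos]
C = connected_correlation(pos, vel)
print([round(c, 3) for c in C])
```

Because the fluctuations dv sum to zero by construction, C(r) must change sign; the zero crossing provides the correlation length whose scaling with flock size is the "scale-free" signature discussed in the review.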

  1. What Happens in the Arcade Shouldn't Stay in the Arcade: Lessons for Classroom Design

    ERIC Educational Resources Information Center

    Whitmore, Kathryn F.; Laurich, Lindsay

    2010-01-01

    What features of the physical environment in video game arcades lead kids to be so engaged? How can analysis of arcade space inform language arts teachers' decisions about designing classroom environments? This article presents an analysis of physical space in video game arcades and participants' positions therein to suggest how language arts…

  2. Photoresist and stochastic modeling

    NASA Astrophysics Data System (ADS)

    Hansen, Steven G.

    2018-01-01

    Analysis of physical modeling results can provide unique insights into extreme ultraviolet stochastic variation, which augment, and sometimes refute, conclusions based on physical intuition and even wafer experiments. Simulations verify the primacy of "imaging critical" counting statistics (photons, electrons, and net acids) and the image/blur-dependent dose sensitivity in describing the local edge or critical dimension variation. But the failure of simple counting when resist thickness is varied highlights a limitation of this exact analytical approach, so a calibratable empirical model offers useful simplicity and convenience. Results presented here show that a wide range of physical simulation results can be well matched by an empirical two-parameter model based on blurred image log-slope (ILS) for lines/spaces and normalized ILS for holes. These results are largely consistent with a wide range of published experimental results; however, there is some disagreement with the recently published dataset of De Bisschop. The present analysis suggests that the origin of this model failure is an unexpected blurred ILS:dose-sensitivity relationship failure in that resist process. It is shown that a photoresist mechanism based on high photodecomposable quencher loading and high quencher diffusivity can give rise to pitch-dependent blur, which may explain the discrepancy.

  3. Statistical mechanics and thermodynamic limit of self-gravitating fermions in D dimensions.

    PubMed

    Chavanis, Pierre-Henri

    2004-06-01

    We discuss the statistical mechanics of a system of self-gravitating fermions in a space of dimension D. We plot the caloric curves of the self-gravitating Fermi gas, giving the temperature as a function of energy, and investigate the nature of phase transitions as a function of the dimension of space. We consider stable states (global entropy maxima) as well as metastable states (local entropy maxima). We show that for D ≥ 4 there exists a critical temperature (for sufficiently large systems) and a critical energy below which the system cannot be found in statistical equilibrium. Therefore, for D ≥ 4, quantum mechanics cannot stabilize matter against gravitational collapse. This is similar to a result found by Ehrenfest (1917) at the atomic level for Coulomb forces. This makes the dimension D = 3 of our Universe very particular, with possible implications regarding the anthropic principle. Our study joins a long tradition of scientific and philosophical papers that examined how the dimension of space affects the laws of physics.

  4. Recent enhancements of the PMCC infrasound signal detector

    NASA Astrophysics Data System (ADS)

    Brachet, N.; Mialle, P.; Matoza, R. S.; Le Pichon, A.; Cansi, Y.; Ceranna, L.

    2010-12-01

    The Progressive Multi-Channel Correlation (PMCC) method is an array processing technique commonly used by the scientific community for detecting coherent signals recorded on infrasound arrays. The PMCC detector, originally developed by CEA/DASE (Cansi, 1995), was installed in 2004 in the operational environment of the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in Vienna. During the last 5 years, several changes have been made by the IDC to enhance the PMCC source code and parameter configuration, and the detector has exhibited good performance in terms of detection sensitivity and robustness. Recent studies performed at the CEA/DASE have shown that the IDC version (DFX/Geotool-PMCC) and the DASE version (WinPMCC) of the PMCC software benefit from the implementation of an adaptive processing-window duration and log-spaced frequency bands. This tested configuration enables better detection and characterization of all received signals in their wave-front parameter space (e.g., frequency-azimuth space, frequency-trace-velocity space). A new release of the WinPMCC software, running under Windows or Linux operating systems and including fully configurable filtering and detection parameters, is now available upon request. We present the results of a statistical analysis of 10 years of infrasound data recorded at the IMS stations IS26, Germany, and IS22, New Caledonia. A comparison is made between the automatic detections produced by the IDC and the reprocessed detections using the optimized filtering and detection configuration parameters. Work is also underway at the CEA/DASE to determine more rigorously the azimuth and speed uncertainties. The current algorithm estimates the uncertainties based on statistical analysis of the distribution of PMCC detection pixels in the azimuth-speed space. The new code under consideration computes infrasound measurement errors as a function of physical parameters, i.e., dependent on the array geometry and the wave properties.
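The core PMCC idea, estimating pairwise time delays by cross-correlation and checking their closure consistency across sensor triplets, can be sketched on synthetic data (a toy illustration, not the CEA/DASE implementation):

```python
import math

def xcorr_delay(a, b, max_lag):
    """Integer lag of b relative to a that maximizes the cross-correlation."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        val = sum(a[i] * b[i + lag] for i in range(len(a)) if 0 <= i + lag < len(b))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

# A coherent Gaussian pulse arriving at three sensors with known delays.
pulse = [math.exp(-((t - 10) / 3.0) ** 2) for t in range(100)]

def delayed(d):
    return [pulse[t - d] if 0 <= t - d < len(pulse) else 0.0 for t in range(120)]

s1, s2, s3 = delayed(0), delayed(5), delayed(12)
r12 = xcorr_delay(s1, s2, 20)  # expect +5
r23 = xcorr_delay(s2, s3, 20)  # expect +7
r13 = xcorr_delay(s1, s3, 20)  # expect +12
print(r12, r23, r13, "closure residual:", r12 + r23 - r13)
```

For a coherent wavefront the closure residual r12 + r23 - r13 is near zero; incoherent noise fails this consistency test, which is what lets the detector accept or reject candidate detections channel by channel.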

  5. Monte Carlo modeling of spatial coherence: free-space diffraction

    PubMed Central

    Fischer, David G.; Prahl, Scott A.; Duncan, Donald D.

    2008-01-01

    We present a Monte Carlo method for propagating partially coherent fields through complex deterministic optical systems. A Gaussian copula is used to synthesize a random source with an arbitrary spatial coherence function. Physical optics and Monte Carlo predictions of the first- and second-order statistics of the field are shown for coherent and partially coherent sources for free-space propagation, imaging using a binary Fresnel zone plate, and propagation through a limiting aperture. Excellent agreement between the physical optics and Monte Carlo predictions is demonstrated in all cases. Convergence criteria are presented for judging the quality of the Monte Carlo predictions. PMID:18830335

  6. Computing Interactions Of Free-Space Radiation With Matter

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Cucinotta, F. A.; Shinn, J. L.; Townsend, L. W.; Badavi, F. F.; Tripathi, R. K.; Silberberg, R.; Tsao, C. H.; Badwar, G. D.

    1995-01-01

    The High Charge and Energy Transport (HZETRN) computer program is a computationally efficient, user-friendly software package addressing the problem of transport of, and shielding against, radiation in free space. It is designed as a "black box" for design engineers who are not concerned with the physics of the underlying atomic and nuclear radiation processes in the free-space environment, but are primarily interested in obtaining fast and accurate dosimetric information for the design and construction of modules and devices for use in free space. Computational efficiency is achieved by a unique algorithm based on a deterministic approach to the solution of the Boltzmann equation, rather than the computationally intensive statistical Monte Carlo method. Written in FORTRAN.

  7. Geometry of Theory Space and RG Flows

    NASA Astrophysics Data System (ADS)

    Kar, Sayan

    The space of couplings of a given theory is the arena of interest in this article. Equipped with a metric ansatz akin to the Fisher information matrix in the space of parameters in statistics (similar metrics in physics are the Zamolodchikov metric and the O'Connor-Stephens metric), we investigate the geometry of theory space through a study of specific examples. We then look into renormalisation group flows in theory space and attempt to characterise such flows via their isotropic expansion, rotation and shear. Consequences arising from the evolution equation for the isotropic expansion are discussed. We conclude by pointing out generalisations and posing some open questions.

  8. Statistical Physics on the Eve of the 21st Century: in Honour of J B McGuire on the Occasion of His 65th Birthday

    NASA Astrophysics Data System (ADS)

    Batchelor, Murray T.; Wille, Luc T.

    The Table of Contents for the book is as follows: * Preface * Modelling the Immune System - An Example of the Simulation of Complex Biological Systems * Brief Overview of Quantum Computation * Quantal Information in Statistical Physics * Modeling Economic Randomness: Statistical Mechanics of Market Phenomena * Essentially Singular Solutions of Feigenbaum-Type Functional Equations * Spatiotemporal Chaotic Dynamics in Coupled Map Lattices * Approach to Equilibrium of Chaotic Systems * From Level to Level in Brain and Behavior * Linear and Entropic Transformations of the Hydrophobic Free Energy Sequence Help Characterize a Novel Brain Polyprotein: CART's Protein * Dynamical Systems Response to Pulsed High-Frequency Fields * Bose-Einstein Condensates in the Light of Nonlinear Physics * Markov Superposition Expansion for the Entropy and Correlation Functions in Two and Three Dimensions * Calculation of Wave Center Deflection and Multifractal Analysis of Directed Waves Through the Study of su(1,1) Ferromagnets * Spectral Properties and Phases in Hierarchical Master Equations * Universality of the Distribution Functions of Random Matrix Theory * The Universal Chiral Partition Function for Exclusion Statistics * Continuous Space-Time Symmetries in a Lattice Field Theory * Quelques Cas Limites du Problème à N Corps Unidimensionnel * Integrable Models of Correlated Electrons * On the Riemann Surface of the Three-State Chiral Potts Model * Two Exactly Soluble Lattice Models in Three Dimensions * Competition of Ferromagnetic and Antiferromagnetic Order in the Spin-1/2 XXZ Chain at Finite Temperature * Extended Vertex Operator Algebras and Monomial Bases * Parity and Charge Conjugation Symmetries and S Matrix of the XXZ Chain * An Exactly Solvable Constrained XXZ Chain * Integrable Mixed Vertex Models From the Braid-Monoid Algebra * From Yang-Baxter Equations to Dynamical Zeta Functions for Birational Transformations * Hexagonal Lattice Directed Site Animals * Direction in the Star-Triangle Relations * A Self-Avoiding Walk Through Exactly Solved Lattice Models in Statistical Mechanics

  9. Challenges in Teaching Space Physics to Different Target Groups From Space Weather Forecasters to Heavy-weight Theorists

    NASA Astrophysics Data System (ADS)

    Koskinen, H. E.

    2008-12-01

    Plasma physics, as the backbone of space physics, is difficult, and space physics students therefore need strong foundations in general physics, in particular in classical electrodynamics and thermodynamics, and must master the basic mathematical tools for physicists. In many universities the number of students specializing in space physics at the Master's and Doctoral levels is rather small, and the students may have quite different preferences, ranging from an experimental approach to hard-core space plasma theory. This poses challenges in building up a study program that has both the variety and the depth needed to motivate the best students to choose this field. At the University of Helsinki we require all beginning space physics students, regardless of whether they enter the field as Master's or Doctoral degree students, to take a one-semester package consisting of plasma physics and its space applications. However, some compromises are necessary. For example, it is not at all clear how thoroughly Landau damping should be taught on the first pass, or how deeply the intricacies of collisionless reconnection should be discussed. In both cases we have left the details to an optional course in advanced space physics, even at the risk that the student's appreciation of, e.g., reconnection may remain at the level of a magic wand. For learning experimental work, data analysis or computer simulations, we have actively pursued arrangements for Master's degree students to obtain summer employment in active research groups, which usually leads to a Master's thesis. All doctoral students are members of research groups and participate in experimental work, data analysis, simulation studies or theory development, or any combination of these. We emphasize strongly "learning by doing" all the way from the weekly home exercises during the lecture courses to the PhD theses, which in Finland typically consist of 4-6 peer-reviewed articles with a comprehensive introductory part.

  10. Examining the Role of Environment in a Comprehensive Sample of Compact Groups

    NASA Astrophysics Data System (ADS)

    Walker, Lisa May; Johnson, Kelsey E.; Gallagher, Sarah C.; Charlton, Jane C.; Hornschemeier, Ann E.; Hibbard, John E.

    2012-03-01

    Compact groups, with their high number densities, small velocity dispersions, and an interstellar medium that has not been fully processed, provide a local analog to conditions of galaxy interactions in the earlier universe. The frequent and prolonged gravitational encounters that occur in compact groups affect the evolution of the constituent galaxies in a myriad of ways, for example, gas processing and star formation. Recently, a statistically significant "gap" has been discovered in the mid-infrared (MIR: 3.6-8 μm) IRAC color space of compact group galaxies. This gap is not seen in field samples and is a new example of how the compact group environment may affect the evolution of member galaxies. In order to investigate the origin and nature of this gap, we have compiled a larger sample of 37 compact groups in addition to the original 12 groups studied by Johnson et al. (yielding 174 individual galaxies with reliable MIR photometry). We find that a statistically significant deficit of galaxies in this gap region of IRAC color space persists in the full sample, lending support to the hypothesis that the compact group environment inhibits moderate specific star formation rates. Using this expanded sample, we have more fully characterized the distribution of galaxies in this color space and quantified the low-density region with respect to MIR-bluer and MIR-redder colors. We note a curvature in the color-space distribution, which is fully consistent with increasing dust temperature as the activity in a galaxy increases. This full sample of 49 compact groups allows us to subdivide the data according to physical properties of the groups. An analysis of these subsamples indicates that neither projected physical diameter nor density shows a trend in color space within the values represented by this sample. We hypothesize that the apparent lack of a trend is due to the relatively small range of properties in this sample, whose groups have already been pre-selected to be compact and dense. Thus, the relative influence of stochastic effects (such as the particular distribution and amount of star formation in individual galaxies) becomes dominant. We analyze spectral energy distributions of member galaxies as a function of their location in color space and find that galaxies in different regions of MIR color space contain dust with varying temperatures and/or polycyclic aromatic hydrocarbon emission.

  11. Examining the Role of Environment in a Comprehensive Sample of Compact Groups

    NASA Technical Reports Server (NTRS)

    Walker, Lisa May; Johnson, Kelsey E.; Gallagher, Sarah C.; Charlton, Jane C.; Hornschemeier, Ann E.; Hibbard, John E.

    2012-01-01

    Compact groups, with their high number densities, small velocity dispersions, and an interstellar medium that has not been fully processed, provide a local analog to conditions of galaxy interactions in the earlier universe. The frequent and prolonged gravitational encounters that occur in compact groups affect the evolution of the constituent galaxies in a myriad of ways, for example, gas processing and star formation. Recently, a statistically significant "gap" has been discovered in the mid-infrared (MIR: 3.6-8 µm) IRAC color space of compact group galaxies. This gap is not seen in field samples and is a new example of how the compact group environment may affect the evolution of member galaxies. In order to investigate the origin and nature of this gap, we have compiled a larger sample of 37 compact groups in addition to the original 12 groups studied by Johnson et al. (yielding 174 individual galaxies with reliable MIR photometry). We find that a statistically significant deficit of galaxies in this gap region of IRAC color space persists in the full sample, lending support to the hypothesis that the compact group environment inhibits moderate specific star formation rates. Using this expanded sample, we have more fully characterized the distribution of galaxies in this color space and quantified the low-density region with respect to MIR-bluer and MIR-redder colors. We note a curvature in the color-space distribution, which is fully consistent with increasing dust temperature as the activity in a galaxy increases. This full sample of 49 compact groups allows us to subdivide the data according to physical properties of the groups. An analysis of these subsamples indicates that neither projected physical diameter nor density shows a trend in color space within the values represented by this sample. We hypothesize that the apparent lack of a trend is due to the relatively small range of properties in this sample, whose groups have already been pre-selected to be compact and dense. Thus, the relative influence of stochastic effects (such as the particular distribution and amount of star formation in individual galaxies) becomes dominant. We analyze spectral energy distributions of member galaxies as a function of their location in color space and find that galaxies in different regions of MIR color space contain dust with varying temperatures and/or polycyclic aromatic hydrocarbon emission.

  12. Stochastic simulation of predictive space–time scenarios of wind speed using observations and physical model outputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bessac, Julie; Constantinescu, Emil; Anitescu, Mihai

    We propose a statistical space-time model for predicting atmospheric wind speed based on deterministic numerical weather predictions and historical measurements. We consider a Gaussian multivariate space-time framework that combines multiple sources of past physical model outputs and measurements in order to produce a probabilistic wind speed forecast within the prediction window. We illustrate this strategy on wind speed forecasts during several months in 2012 for a region near the Great Lakes in the United States. The results show that the prediction is improved in the mean-squared sense relative to the numerical forecasts as well as in probabilistic scores. Moreover, the samples are shown to produce realistic wind scenarios based on sample spectra and space-time correlation structure.

  13. Stochastic simulation of predictive space–time scenarios of wind speed using observations and physical model outputs

    DOE PAGES

    Bessac, Julie; Constantinescu, Emil; Anitescu, Mihai

    2018-03-01

    We propose a statistical space-time model for predicting atmospheric wind speed based on deterministic numerical weather predictions and historical measurements. We consider a Gaussian multivariate space-time framework that combines multiple sources of past physical model outputs and measurements in order to produce a probabilistic wind speed forecast within the prediction window. We illustrate this strategy on wind speed forecasts during several months in 2012 for a region near the Great Lakes in the United States. The results show that the prediction is improved in the mean-squared sense relative to the numerical forecasts as well as in probabilistic scores. Moreover, the samples are shown to produce realistic wind scenarios based on sample spectra and space-time correlation structure.
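
The core idea, combining a deterministic forecast with historical measurement statistics into a probabilistic prediction, can be sketched as a precision-weighted Gaussian combination. All numbers and variable names below are illustrative assumptions, not the authors' fitted multivariate model:

```python
import numpy as np

# Minimal sketch (assumed values, not the paper's model): blend a
# deterministic NWP wind forecast with climatological statistics via
# inverse-variance weighting, then draw stochastic wind-speed scenarios.

rng = np.random.default_rng(0)

nwp_forecast = 8.0      # deterministic model output (m/s), illustrative
nwp_error_var = 4.0     # assumed forecast-error variance
clim_mean = 6.5         # climatological mean from past measurements
clim_var = 9.0          # climatological variance

# precision-weighted (inverse-variance) combination of the two sources
post_var = 1.0 / (1.0 / nwp_error_var + 1.0 / clim_var)
post_mean = post_var * (nwp_forecast / nwp_error_var + clim_mean / clim_var)

# stochastic scenarios sampled from the predictive distribution
scenarios = rng.normal(post_mean, np.sqrt(post_var), size=1000)
print(round(post_mean, 2), round(post_var, 2))  # -> 7.54 2.77
```

The paper's actual framework is multivariate in space and time; this scalar version only illustrates the weighting principle that pulls the forecast toward whichever source is more certain.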

  14. Joining the yellow hub: Uses of the Simple Application Messaging Protocol in Space Physics analysis tools

    NASA Astrophysics Data System (ADS)

    Génot, V.; André, N.; Cecconi, B.; Bouchemit, M.; Budnik, E.; Bourrel, N.; Gangloff, M.; Dufourg, N.; Hess, S.; Modolo, R.; Renard, B.; Lormant, N.; Beigbeder, L.; Popescu, D.; Toniutti, J.-P.

    2014-11-01

    The interest in data communication between analysis tools in planetary sciences and space physics is illustrated in this paper via several examples of the use of SAMP. The Simple Application Messaging Protocol was developed within the IVOA framework from an earlier protocol called PLASTIC. SAMP enables easy communication and interoperability between astronomy software, both stand-alone and web-based; it is now increasingly adopted by the planetary sciences and space physics community. Its attractiveness rests, on the one hand, on the use of common file formats for exchange and, on the other hand, on established messaging models. Examples of uses at the CDPP and elsewhere are presented. The CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.eu/), the French data center for plasma physics, has been engaged for more than a decade in the archiving and dissemination of data products from space missions and ground observatories. Besides these activities, the CDPP has developed services like AMDA (Automated Multi Dataset Analysis, http://amda.cdpp.eu/), which enables in-depth analysis of large amounts of data through dedicated functionalities such as visualization, conditional search, and cataloging. Besides AMDA, the 3DView (http://3dview.cdpp.eu/) tool provides immersive visualizations and is being further developed to include simulation and observational data. These tools and their interactions with each other, notably via SAMP, are presented via science cases of interest to the planetary sciences and space physics communities.

  15. Technology, Data Bases and System Analysis for Space-to-Ground Optical Communications

    NASA Technical Reports Server (NTRS)

    Lesh, James

    1995-01-01

    Optical communications is becoming an increasingly important option for designers of space-to-ground communications links, whether for government or commercial applications. In this paper the technology being developed by NASA for use in space-to-ground optical communications is presented. Next, a program that is collecting a long-term database of atmospheric visibility statistics for optical propagation through the atmosphere is described. Finally, a methodology for utilizing the statistics of the atmospheric database in the analysis of space-to-ground links is presented. This methodology takes into account the effects of station availability, is useful when comparing optical communications with microwave systems, and provides a rationale for establishing the recommended link margin.

  16. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1981-01-01

    Spacecraft accelerations resulting from firings of vernier control-system thrusters are an important consideration in the design, planning, execution and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists involved in the development of experiments to be performed in space in many instances require statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so it is useful to characterize them in statistical terms. Statistics of spacecraft accelerations are summarized.
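
The kind of exceedance statistics described can be illustrated on a synthetic acceleration record; the noise level and threshold below are assumptions for illustration, not Apollo-Soyuz values:

```python
import numpy as np

# Illustrative sketch: exceedance statistics of a stochastic acceleration
# signal. The record is synthetic Gaussian noise, not flight data.

rng = np.random.default_rng(3)
accel = rng.normal(0.0, 1e-3, 10_000)   # synthetic acceleration samples (g)

threshold = 2e-3                         # assumed 2-sigma exceedance level
exceed = np.abs(accel) > threshold
rate = exceed.mean()                     # fraction of samples above the level
# upward crossings: below the level on one sample, above it on the next
crossings = np.count_nonzero(~exceed[:-1] & exceed[1:])
print(round(float(rate), 3), crossings)
```

For Gaussian noise the exceedance fraction at 2 sigma is about 4.6%; counting level crossings rather than samples gives the rate of occurrence of discrete exceedance events.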

  17. Computational Analysis for Rocket-Based Combined-Cycle Systems During Rocket-Only Operation

    NASA Technical Reports Server (NTRS)

    Steffen, C. J., Jr.; Smith, T. D.; Yungster, S.; Keller, D. J.

    2000-01-01

    A series of Reynolds-averaged Navier-Stokes calculations was employed to study the performance of rocket-based combined-cycle systems operating in an all-rocket mode. This parametric series of calculations was executed within a statistical framework commonly known as design of experiments. The parametric design space included four geometric and two flowfield variables set at three levels each, for a total of 729 possible combinations. A D-optimal design strategy was selected; it required only 36 separate computational fluid dynamics (CFD) solutions to develop a full response-surface model, which quantified the linear, bilinear, and curvilinear effects of the six experimental variables. The axisymmetric, Reynolds-averaged Navier-Stokes simulations were executed with the NPARC v3.0 code. The response used in the statistical analysis was created from Isp efficiency data integrated from the 36 CFD simulations. The influence of turbulence modeling was analyzed by using both one- and two-equation models. Careful attention was also given to quantifying the influence of mesh dependence, iterative convergence, and artificial viscosity upon the resulting statistical model. Thirteen statistically significant effects were observed to have an influence on rocket-based combined-cycle nozzle performance. It was apparent that the free-expansion process directly downstream of the rocket nozzle can influence the Isp efficiency. Numerical schlieren images and particle traces have been used to further understand the physical phenomena behind several of the statistically significant results.
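
The response-surface idea, fitting linear, bilinear, and curvilinear terms over a three-level factorial design, can be sketched with synthetic data; the two factors, the response, and all coefficients below are invented for illustration, not taken from the CFD study:

```python
import numpy as np

# Hypothetical response-surface fit over a full 3^2 design (two coded
# factors at levels -1, 0, +1). The "Isp efficiency" response is synthetic
# with known coefficients plus small noise, not the paper's data.

rng = np.random.default_rng(2)
levels = np.array([-1.0, 0.0, 1.0])
x1, x2 = np.meshgrid(levels, levels)
x1, x2 = x1.ravel(), x2.ravel()

# synthetic response with known linear, bilinear, and curvilinear effects
y = (0.95 + 0.02 * x1 - 0.01 * x2 + 0.005 * x1 * x2
     - 0.004 * x1**2 + rng.normal(0.0, 1e-4, x1.size))

# design matrix: intercept, linear, bilinear, and quadratic terms
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 3))   # recovered effect estimates
```

A D-optimal subset of a larger design works the same way: the least-squares fit of this polynomial model is what "quantifies the linear, bilinear, and curvilinear effects".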

  18. Reflections on Gibbs: From Critical Phenomena to the Amistad

    NASA Astrophysics Data System (ADS)

    Kadanoff, Leo P.

    2003-03-01

    J. Willard Gibbs, the younger, was the first American theorist. He was one of the inventors of statistical physics. His introduction and development of the concepts of phase space, phase transitions, and thermodynamic surfaces was remarkably correct and elegant. These three concepts form the basis of different but related areas of physics. The connection among these areas has been a subject of deep reflection from Gibbs' time to our own. I shall talk about these connections by using concepts suggested by the work of Michael Berry and explicitly put forward by the philosopher Robert Batterman. This viewpoint relates theory-connection to the applied-mathematics concepts of asymptotic analysis and singular perturbations. J. Willard Gibbs, the younger, had all his achievements concentrated in science. His father, also named J. Willard Gibbs and also a Professor at Yale, had one great achievement that remains unmatched in our day. I shall describe it.

  19. Socioeconomic Inequalities in Green Space Quality and Accessibility-Evidence from a Southern European City.

    PubMed

    Hoffimann, Elaine; Barros, Henrique; Ribeiro, Ana Isabel

    2017-08-15

    Background: The provision of green spaces is an important health promotion strategy to encourage physical activity and to improve population health. Green space provision has to be based on the principle of equity. This study investigated the presence of socioeconomic inequalities in the geographic accessibility and quality of green spaces across Porto neighbourhoods (Portugal). Methods: Accessibility was evaluated using a Geographic Information System, and all the green spaces were audited using the Public Open Space Tool. Kendall's tau-b correlation coefficients and ordinal regression were used to test whether socioeconomic differences in green space quality and accessibility were statistically significant. Results: Although the majority of the neighbourhoods had an accessible green space, mean distance to green space increased with neighbourhood deprivation. Additionally, green spaces in the more deprived neighbourhoods presented significantly more safety concerns, signs of damage, and lack of equipment for active leisure activities, and had significantly fewer amenities such as seating, toilets, and cafés. Conclusions: Residents of low socioeconomic position seem to suffer from a double jeopardy; they lack both individual and community resources. Our results have important planning implications and might contribute to understanding why deprived communities have lower physical activity levels and poorer health.

  20. Measurement-induced-nonlocality for Dirac particles in Garfinkle-Horowitz-Strominger dilation space-time

    NASA Astrophysics Data System (ADS)

    He, Juan; Xu, Shuai; Ye, Liu

    2016-05-01

    We investigate the quantum correlation via measurement-induced-nonlocality (MIN) for Dirac particles in Garfinkle-Horowitz-Strominger (GHS) dilation space-time. It is shown that the physically accessible quantum correlation decreases monotonically as the dilation parameter increases. Unlike the case of scalar fields, the physically accessible correlation is not zero when the Hawking temperature is infinite, owing to the Pauli exclusion principle and the differences between Fermi-Dirac and Bose-Einstein statistics. Meanwhile, the boundary of MIN related to Bell violation is derived, which indicates that MIN is more general than the quantum nonlocality captured by the violation of a Bell inequality. As a by-product, a tenable quantitative relation for MIN redistribution is obtained regardless of the value of the dilation parameter. In addition, it is worth emphasizing that the underlying reason why the physically accessible correlation and mutual information decrease is that they are redistributed to the physically inaccessible regions.

  1. Artificial Intelligence in planetary spectroscopy

    NASA Astrophysics Data System (ADS)

    Waldmann, Ingo

    2017-10-01

    The field of exoplanetary spectroscopy is as fast moving as it is new. Analysing currently available observations of exoplanetary atmospheres often invokes large and correlated parameter spaces that can be difficult to map or constrain. This is true both for the data analysis of observations and for the theoretical modelling of their atmospheres. Issues of low signal-to-noise data and large, non-linear parameter spaces are nothing new and are commonly found in many fields of engineering and the physical sciences. Recent years have seen vast improvements in statistical data analysis and machine learning that have revolutionised fields as diverse as telecommunication, pattern recognition, medical physics and cosmology. In many aspects, the data mining and non-linearity challenges encountered in other data-intensive fields are directly transferable to the field of extrasolar planets. In this conference, I will discuss how deep neural networks can be designed to facilitate solving these issues both in exoplanet atmospheres and for atmospheres in our own solar system. I will present a deep belief network, RobERt (Robotic Exoplanet Recognition), able to learn to recognise exoplanetary spectra and to provide artificial intelligence to state-of-the-art atmospheric retrieval algorithms. Furthermore, I will present a new deep convolutional network that is able to map planetary surface compositions using hyper-spectral imaging, and demonstrate its uses on Cassini-VIMS data of Saturn.

  2. Electromagnetic sinc Schell-model beams and their statistical properties.

    PubMed

    Mei, Zhangrong; Mao, Yonghua

    2014-09-22

    A class of electromagnetic sources with sinc Schell-model correlations is introduced. The conditions on source parameters guaranteeing that the source generates a physical beam are derived. The evolution of the statistical properties of the electromagnetic stochastic beams generated by this new source on propagation in free space and in atmospheric turbulence is investigated with the help of the weighted superposition method and by numerical simulations. It is demonstrated that the intensity distributions of such beams exhibit unique features on propagation in free space, producing a double-layer flat-top profile that is shape-invariant in the far field. This feature makes the new beam particularly suitable for some special laser processing applications. The influence of atmospheric turbulence with a non-Kolmogorov power spectrum on the statistical properties of the new beams is analyzed in detail.

  3. High-Citation Papers in Space Physics: Examination of Gender, Country, and Paper Characteristics

    NASA Astrophysics Data System (ADS)

    Moldwin, Mark B.; Liemohn, Michael W.

    2018-04-01

    The number of citations to a refereed journal article from other refereed journal articles is a measure of its impact. Papers, individuals, journals, departments, and institutions are increasingly judged by the impact they have in their disciplines, and citation counts are now a relatively easy (though not necessarily accurate or straightforward) way of attempting to quantify impact. This study examines papers published in the Journal of Geophysical Research: Space Physics in the year 2012 (n = 705) and analyzes the characteristics of high-citation papers compared to low-citation papers. We find that high-citation papers generally have a large number of authors (>5) and cite significantly more articles in the reference section than low-citation papers. We also examined the gender and country of institution of the first author and found that there is not a statistically significant gender bias, but there are some significant differences in citation statistics between articles based on the country of first-author institution.
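
A minimal sketch of the sort of two-group significance comparison used in such studies; the citation counts below are synthetic Poisson draws and the group split is invented, and a Welch t statistic stands in for whichever test the authors actually applied:

```python
import numpy as np

# Illustrative two-group comparison of citation counts (synthetic data).
# Welch's t statistic is one simple way to compare group means when the
# groups have unequal sizes and variances.

rng = np.random.default_rng(6)
group_a = rng.poisson(12, size=80).astype(float)  # e.g. one author group
group_b = rng.poisson(25, size=60).astype(float)  # e.g. another group

def welch_t(x, y):
    """Welch's t statistic for unequal-variance samples."""
    vx, vy = x.var(ddof=1) / x.size, y.var(ddof=1) / y.size
    return (x.mean() - y.mean()) / np.sqrt(vx + vy)

t = welch_t(group_a, group_b)
print(round(float(t), 1))  # large |t| indicates a significant mean difference
```

With clearly separated synthetic means the statistic is large; for equal underlying means it would typically fall within roughly plus or minus two.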

  4. Natural world physical, brain operational, and mind phenomenal space-time

    NASA Astrophysics Data System (ADS)

    Fingelkurts, Andrew A.; Fingelkurts, Alexander A.; Neves, Carlos F. H.

    2010-06-01

    Concepts of space and time are widely developed in physics. However, there is a considerable lack of biologically plausible theoretical frameworks that can demonstrate how space and time dimensions are implemented in the activity of the most complex life-system: the brain with a mind. Brain activity is organized both temporally and spatially, thus representing space-time in the brain. Critical analysis of recent research on the space-time organization of the brain's activity points to the existence of so-called operational space-time in the brain. This space-time is limited to the execution of brain operations of differing complexity. During each such brain operation a particular short-term spatio-temporal pattern of integrated activity of different brain areas emerges within related operational space-time. At the same time, to have a fully functional human brain one needs to have a subjective mental experience. Current research on subjective mental experience offers detailed analysis of the space-time organization of the mind. According to this research, subjective mental experience (the subjective virtual world) has definitive spatial and temporal properties similar to many physical phenomena. Based on a systematic review of the propositions and tenets of brain and mind space-time descriptions, our aim in this review essay is to explore the relations between the two. Specifically, we discuss the hypothesis that the subjective space-time of the mind is connected to the otherwise distant physical space-time reality via the brain's operational space-time.

  5. Statistical Analysis of Bus Networks in India

    PubMed Central

    2016-01-01

    In this paper, we model the bus networks of six major Indian cities as graphs in L-space, and evaluate their various statistical properties. While airline and railway networks have been extensively studied, a comprehensive study on the structure and growth of bus networks is lacking. In India, where bus transport plays an important role in day-to-day commutation, it is of significant interest to analyze its topological structure and answer basic questions on its evolution, growth, robustness and resiliency. Although the common feature of the small-world property is observed, our analysis reveals a wide spectrum of network topologies arising due to significant variation in the degree-distribution patterns in the networks. We also observe that these networks, although robust and resilient to random attacks, are particularly degree-sensitive. Unlike networks such as the Internet, WWW, and airline networks, which are virtual, bus networks are physically constrained. Our findings therefore throw light on the evolution of such geographically constrained networks, which will help us design more efficient bus networks in the future. PMID:27992590
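
The degree-sensitivity observation can be illustrated on a toy L-space graph, where nodes are stops and edges link consecutive stops on a route; the network below is invented for illustration, not one of the six Indian city networks:

```python
from collections import deque

# Toy L-space bus graph (invented): compare removing the highest-degree
# hub against removing a peripheral stop.

edges = [("A", "B"), ("B", "C"), ("C", "D"),
         ("B", "E"), ("E", "F"), ("B", "G"), ("G", "H")]

def build(edges, removed=frozenset()):
    """Adjacency map, skipping edges touching removed nodes."""
    adj = {}
    for u, v in edges:
        if u in removed or v in removed:
            continue
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return adj

def largest_component(adj):
    """Size of the largest connected component (BFS)."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        size, queue = 0, deque([start])
        seen.add(start)
        while queue:
            node = queue.popleft()
            size += 1
            for nb in adj[node]:
                if nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
        best = max(best, size)
    return best

full = build(edges)
hub = max(full, key=lambda n: len(full[n]))           # highest-degree node
print(largest_component(build(edges, {hub})))          # hub removal fragments the graph
print(largest_component(build(edges, {"D"})))          # leaf removal barely matters
```

Removing the hub shatters the toy network into small fragments, while removing a peripheral stop leaves it almost intact, which is the degree-sensitivity the abstract describes.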

  6. Introduction to the Space Physics Analysis Network (SPAN)

    NASA Technical Reports Server (NTRS)

    Green, J. L. (Editor); Peters, D. J. (Editor)

    1985-01-01

    The Space Physics Analysis Network, or SPAN, is emerging as a viable method for solving an immediate communication problem for the space scientist. SPAN provides low-rate communication capability with co-investigators and colleagues, and access to space science data bases and computational facilities. SPAN utilizes up-to-date hardware and software for computer-to-computer communications, allowing binary file transfer and remote log-on capability to over 25 nationwide space science computer systems. SPAN is not discipline or mission dependent, with participation from scientists in such fields as magnetospheric, ionospheric, planetary, and solar physics. Basic information on the network and its use is provided. It is anticipated that SPAN will grow rapidly over the next few years, not only in the number of network nodes; as scientists become more proficient in the use of telescience, more capability will also be needed to satisfy their demands.

  7. Non-extensivity and complexity in the earthquake activity at the West Corinth rift (Greece)

    NASA Astrophysics Data System (ADS)

    Michas, Georgios; Vallianatos, Filippos; Sammonds, Peter

    2013-04-01

    Earthquakes exhibit complex phenomenology that is revealed in the fractal structure of their occurrence in space, time and magnitude. For that reason, tools other than simple Poissonian statistics seem more appropriate for describing the statistical properties of the phenomenon. Here we use Non-Extensive Statistical Physics [NESP] to investigate the inter-event time distribution of the earthquake activity at the west Corinth rift (central Greece). This area is one of the most seismotectonically active areas in Europe, with important continental N-S extension and high seismicity rates. The NESP concept is based on the non-additive Tsallis entropy Sq, which includes the Boltzmann-Gibbs entropy as a particular case. This concept has been successfully used for the analysis of a variety of complex dynamic systems, including earthquakes, where fractality and long-range interactions are important. The analysis indicates that the cumulative inter-event time distribution can be successfully described with NESP, implying the complexity that characterizes the temporal occurrence of earthquakes. Further, we use the Tsallis entropy (Sq) and the Fisher Information Measure (FIM) to investigate the complexity of the inter-event time distribution through different time windows along the evolution of the seismic activity at the west Corinth rift. The results of this analysis reveal different levels of organization and clustering of the seismic activity in time. Acknowledgments: GM wishes to acknowledge the partial support of the Greek State Scholarships Foundation (IKY).
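
NESP descriptions of inter-event times typically use the Tsallis q-exponential for the cumulative (survival) distribution; a minimal sketch follows, with q and the characteristic time chosen for illustration rather than fitted to the Corinth data:

```python
import numpy as np

# Sketch of the Tsallis q-exponential survival function often used in
# NESP for inter-event times: P(>t) = [1 - (1-q) t/tau]^(1/(1-q)).
# The parameter values are illustrative assumptions.

def q_exp(x, q):
    """Tsallis q-exponential; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

tau, q = 10.0, 1.3            # characteristic time and entropic index (assumed)
t = np.array([0.0, 5.0, 20.0, 100.0])

survival = q_exp(-t / tau, q)  # cumulative probability P(>t)
print(np.round(survival, 4))   # -> [1.  0.6276  0.2087  0.0098]
```

For q > 1 the tail decays as a power law rather than exponentially, which is what distinguishes the NESP fit from simple Poissonian (exponential) inter-event statistics.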

  8. Investigation of Pre-Earthquake Ionospheric Disturbances by 3D Tomographic Analysis

    NASA Astrophysics Data System (ADS)

    Yagmur, M.

    2016-12-01

    Ionospheric variations before earthquakes are widely discussed phenomena in ionospheric studies. Clarifying the source and mechanism of these phenomena is highly important for earthquake forecasting. A good understanding of the mechanical and physical processes behind pre-seismic ionospheric anomalies, which might even be related to Lithosphere-Atmosphere-Ionosphere-Magnetosphere coupling, requires both statistical and 3D modeling analyses. For this purpose, we first investigated the relation between ionospheric TEC anomalies and potential source mechanisms such as space weather activity and lithospheric phenomena like positive surface electric charges. To distinguish their effects on ionospheric TEC, we focused on pre-seismically active days. We then analyzed the statistical data of 54 earthquakes with M ≥ 6 between 2000 and 2013, as well as the 2011 Tohoku and the 2016 Kumamoto earthquakes in Japan. By comparing TEC anomalies with solar activity via the Dst index, we found 28 events that might be related to earthquake activity. Following the statistical analysis, we also investigated the lithospheric effect on TEC change on selected days. Among those days, we chose the 2011 Tohoku and the 2016 Kumamoto earthquakes as two case studies for 3D reconstructed images, utilizing a 3D tomography technique with neural networks. The results will be presented in our presentation. Keywords: earthquake, 3D ionospheric tomography, positive and negative anomaly, geomagnetic storm, lithosphere
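
A common TEC-anomaly criterion in such studies flags days where TEC leaves a running mean plus-or-minus k-sigma band computed over a trailing window; the sketch below uses that generic criterion with an assumed window, threshold, and synthetic series, not necessarily the authors' exact method:

```python
import numpy as np

# Generic TEC anomaly detector (assumed method, synthetic data): flag any
# day whose TEC falls outside mean +/- k*sigma of the previous `window` days.

rng = np.random.default_rng(5)
tec = 20.0 + 2.0 * rng.normal(size=60)   # synthetic daily TEC (TECU)
tec[45] += 12.0                          # inject one synthetic positive anomaly

window, k = 15, 2.0
anomalous = []
for day in range(window, len(tec)):
    past = tec[day - window:day]
    upper = past.mean() + k * past.std(ddof=1)
    lower = past.mean() - k * past.std(ddof=1)
    if tec[day] > upper or tec[day] < lower:
        anomalous.append(day)
print(anomalous)   # the injected spike at day 45 is flagged
```

Cross-checking flagged days against the Dst index, as the abstract describes, is then what separates candidate seismic precursors from geomagnetic-storm effects.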

  9. Statistical analysis of excitation energies in actinide and rare-earth nuclei

    NASA Astrophysics Data System (ADS)

    Levon, A. I.; Magner, A. G.; Radionov, S. V.

    2018-04-01

    Statistical analysis of the distributions of collective states in actinide and rare-earth nuclei is performed in terms of the nearest-neighbor spacing distribution (NNSD). Several approximations, such as the linear approach to the level-repulsion density and the one suggested by Brody, were applied to the NNSDs for the analysis. We found an intermediate character of the experimental spectra, between order and chaos, for a number of rare-earth and actinide nuclei. The spectra are closer to the Wigner distribution for energies below 3 MeV, and to the Poisson distribution for data including higher excitation energies and higher spins. The latter result is in agreement with the theoretical calculations. These features are confirmed by the cumulative distributions, where the Wigner contribution dominates at smaller spacings while the Poisson one is more important at larger spacings, and our linear approach improves the comparison with experimental data at all spacings.
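
The distributions compared in this kind of analysis have standard closed forms; the sketch below evaluates the Poisson, Wigner, and interpolating Brody NNSDs (the spacing grid is illustrative):

```python
import numpy as np
from math import gamma

# Standard NNSD forms: Poisson (regular spectra), Wigner (chaotic
# spectra), and the Brody distribution interpolating between them.

def poisson(s):
    return np.exp(-s)

def wigner(s):
    return (np.pi / 2.0) * s * np.exp(-np.pi * s**2 / 4.0)

def brody(s, w):
    """Brody NNSD: w=0 reduces to Poisson, w=1 to Wigner."""
    a = gamma((w + 2.0) / (w + 1.0)) ** (w + 1.0)
    return (w + 1.0) * a * s**w * np.exp(-a * s**(w + 1.0))

def integrate(y, x):
    """Trapezoidal rule."""
    return float(np.sum((y[:-1] + y[1:]) * np.diff(x)) / 2.0)

s = np.linspace(0.0, 3.0, 301)
# both limiting distributions integrate to ~1 over [0, inf); check on [0, 3]
print(round(integrate(poisson(s), s), 2))  # -> 0.95
print(round(integrate(wigner(s), s), 2))   # -> 1.0
```

A fitted Brody parameter w between 0 and 1 is one conventional way to quantify the "intermediate character between order and chaos" that the abstract reports.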

  10. Multifractal analysis of geophysical time series in the urban lake of Créteil (France).

    NASA Astrophysics Data System (ADS)

    Mezemate, Yacine; Tchiguirinskaia, Ioulia; Bonhomme, Celine; Schertzer, Daniel; Lemaire, Bruno Jacques; Vinçon leite, Brigitte; Lovejoy, Shaun

    2013-04-01

    Urban water bodies take part in the environmental quality of the cities. They regulate heat, contribute to the beauty of landscape and give some space for leisure activities (aquatic sports, swimming). As they are often artificial they are only a few meters deep. It confers them some specific properties. Indeed, they are particularly sensitive to global environmental changes, including climate change, eutrophication and contamination by micro-pollutants due to the urbanization of the watershed. Monitoring their quality has become a major challenge for urban areas. The need for a tool for predicting short-term proliferation of potentially toxic phytoplankton therefore arises. In lakes, the behavior of biological and physical (temperature) fields is mainly driven by the turbulence regime in the water. Turbulence is highly non linear, nonstationary and intermittent. This is why statistical tools are needed to characterize the evolution of the fields. The knowledge of the probability distribution of all the statistical moments of a given field is necessary to fully characterize it. This possibility is offered by the multifractal analysis based on the assumption of scale invariance. To investigate the effect of space-time variability of temperature, chlorophyll and dissolved oxygen on the cyanobacteria proliferation in the urban lake of Creteil (France), a spectral analysis is first performed on each time series (or on subsamples) to have an overall estimate of their scaling behaviors. Then a multifractal analysis (Trace Moment, Double Trace Moment) estimates the statistical moments of different orders. This analysis is adapted to the specific properties of the studied time series, i. e. the presence of large scale gradients. 
The nonlinear behavior of the scaling functions K(q) confirms that the investigated aquatic time series are indeed multifractal and highly intermittent. Knowledge of the universal multifractal parameters is the key to calculating the different statistical moments and thus making predictions about the fields. In conclusion, the relationships between the fields are highlighted, with a discussion of the cross-predictability of the different fields. This draws a prospective for the use of this kind of time series analysis in the field of limnology. The authors acknowledge the financial support of the R2DS-PLUMMME and Climate-KIC BlueGreenDream projects.
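The trace-moment idea described above can be sketched numerically. The snippet below is a minimal illustration, not the authors' code: a synthetic multiplicative cascade stands in for a measured lake time series, and K(q) is estimated as the slope of the log of the q-th moment of the coarse-grained field against the log of the scale ratio.

```python
import numpy as np

rng = np.random.default_rng(0)

def cascade(levels):
    """Build a discrete lognormal multiplicative cascade of length 2**levels."""
    field = np.ones(1)
    for _ in range(levels):
        w = rng.lognormal(mean=0.0, sigma=0.3, size=2 * field.size)
        field = np.repeat(field, 2) * w
    return field / field.mean()  # normalize so the global mean is 1

def trace_moments(field, qs):
    """Estimate K(q) as the slope of log <eps_lambda^q> versus log(lambda)."""
    n = field.size
    ks = []
    for q in qs:
        lams, moments = [], []
        scale = 1
        while scale <= n // 4:
            # Average the field over boxes of the current size.
            coarse = field.reshape(-1, scale).mean(axis=1)
            lams.append(n / scale)          # scale ratio lambda
            moments.append(np.mean(coarse ** q))
            scale *= 2
        ks.append(np.polyfit(np.log(lams), np.log(moments), 1)[0])
    return np.array(ks)

eps = cascade(12)
K = trace_moments(eps, qs=[0.5, 1.0, 1.5, 2.0, 2.5])
# For a conserved multifractal field, K(1) = 0 and K(q) is convex.
```

A nonlinear (convex) empirical K(q), as found for the Créteil series, is the signature of multifractality; a monofractal field would give a linear K(q).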

  11. Sensitivity Analysis of Expected Wind Extremes over the Northwestern Sahara and High Atlas Region.

    NASA Astrophysics Data System (ADS)

    Garcia-Bustamante, E.; González-Rouco, F. J.; Navarro, J.

    2017-12-01

A robust statistical framework in the scientific literature allows for the estimation of probabilities of occurrence of severe wind speeds and wind gusts, but it does not preclude large uncertainties in the particular numerical estimates. An analysis of such uncertainties is thus required. A large portion of this uncertainty arises from the fact that historical observations are inherently shorter than the timescales of interest for the analysis of return periods. Additional uncertainties stem from the different choices of probability distributions and from other methodological issues or physical processes involved. The present study focuses on historical observations over the Ouarzazate Valley (Morocco) and on a high-resolution regional simulation of the wind in the area of interest. The aim is to provide extreme wind speed and wind gust return values and confidence ranges based on a systematic sampling of the uncertainty space, for return periods up to 120 years.
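A return-level calculation of the kind surveyed here can be sketched as follows. This is a hedged illustration, not the study's actual method: synthetic annual wind maxima, a simple method-of-moments Gumbel fit, and a bootstrap to show how a short record inflates the confidence range.

```python
import numpy as np

rng = np.random.default_rng(1)

def gumbel_fit(maxima):
    """Method-of-moments Gumbel fit: scale from the std, location from the mean."""
    beta = np.std(maxima) * np.sqrt(6) / np.pi
    mu = np.mean(maxima) - 0.5772 * beta  # Euler-Mascheroni correction
    return mu, beta

def return_level(mu, beta, T):
    """Wind speed exceeded on average once every T years."""
    return mu - beta * np.log(-np.log(1.0 - 1.0 / T))

# 40 synthetic years of annual-maximum wind speed (m/s) -- a record much
# shorter than the 120-year return period of interest.
maxima = 20.0 + 4.0 * rng.gumbel(size=40)

mu, beta = gumbel_fit(maxima)
rl120 = return_level(mu, beta, 120)

# Bootstrap the short record to quantify the sampling uncertainty.
boot = [return_level(*gumbel_fit(rng.choice(maxima, size=maxima.size)), 120)
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
```

The width of `(lo, hi)` makes the paper's point concrete: with only a few decades of observations, the 120-year return value carries a broad confidence range.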

  12. Experimental Investigation on Thermal Physical Properties of an Advanced Polyester Material

    NASA Astrophysics Data System (ADS)

    Guangfa, Gao; Shujie, Yuan; Ruiyuan, Huang; Yongchi, Li

Polyester materials are widely applied in aircraft and space vehicle engineering. Focusing on an advanced polyester material, a series of experiments on its thermal physical properties was conducted, and the corresponding performance curves were obtained through statistical analysis. The experimental results showed good consistency. The thermal physical parameters, such as the thermal expansion coefficient, engineering specific heat and sublimation heat, were then derived. This investigation provides an important foundation for further research on the heat resistance and thermodynamic performance of this material.

  13. A Monte Carlo Analysis of the Thrust Imbalance for the RSRMV Booster During Both the Ignition Transient and Steady State Operation

    NASA Technical Reports Server (NTRS)

    Foster, Winfred A., Jr.; Crowder, Winston; Steadman, Todd E.

    2014-01-01

This paper presents the results of statistical analyses performed to predict the thrust imbalance between the two solid rocket motor boosters to be used on the Space Launch System (SLS) vehicle. Two legacy internal ballistics codes developed for the Space Shuttle program were coupled with a Monte Carlo analysis code to determine a thrust imbalance envelope for the SLS vehicle based on the performance of 1000 motor pairs. Thirty-three variables that could affect motor performance during the ignition transient and thirty-eight variables that could affect performance during steady-state operation were identified and treated as statistical variables for the analyses. The effects of motor-to-motor variation as well as variations between the motors of a single pair were included in the analyses. The statistical variations of the variables were defined based on data provided by NASA's Marshall Space Flight Center for the upgraded five-segment booster, and on Space Shuttle booster data when appropriate. The results obtained for the statistical envelope are compared with the design-specification thrust imbalance limits for the SLS launch vehicle.

  14. Development of NASA's Accident Precursor Analysis Process Through Application on the Space Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Maggio, Gaspare; Groen, Frank; Hamlin, Teri; Youngblood, Robert

    2010-01-01

Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system. APA does more than simply track experience: it systematically evaluates experience, looking for under-appreciated risks that may warrant changes to design or operational practice. This paper presents the pilot application of the NASA APA process to Space Shuttle Orbiter systems. In this effort, the working sessions conducted at Johnson Space Center (JSC) piloted the APA process developed by Information Systems Laboratories (ISL) over the last two years under the auspices of NASA's Office of Safety & Mission Assurance, with the assistance of the Safety & Mission Assurance (S&MA) Shuttle & Exploration Analysis Branch. This process is built around facilitated working sessions involving diverse system experts. One important aspect of this particular APA process is its focus on understanding the physical mechanism responsible for an operational anomaly, followed by evaluation of the risk significance of the observed anomaly as well as consideration of generalizations of the underlying mechanism to other contexts. Model completeness will probably always be an issue, but this process tries to leverage operating experience to the extent possible in order to address completeness issues before a catastrophe occurs.

  15. Granger causality--statistical analysis under a configural perspective.

    PubMed

    von Eye, Alexander; Wiedermann, Wolfgang; Mun, Eun-Young

    2014-03-01

The concept of Granger causality can be used to examine putative causal relations between two series of scores. Based on regression models, one asks whether one series can be considered the cause of the other. In this article, we propose extending the pool of methods available for testing hypotheses that are compatible with Granger causation by adopting a configural perspective. This perspective allows researchers to assume that effects exist for specific categories only, or for specific sectors of the data space, but not for other categories or sectors. Configural Frequency Analysis (CFA) is proposed as the method of analysis from a configural perspective. CFA base models are derived for the exploratory analysis of Granger causation. These models are specified so that they parallel the regression models used for variable-oriented analysis of hypotheses of Granger causation. An example from the development of aggression in adolescence is used. The example shows that only one pattern of change in aggressive impulses over time Granger-causes change in physical aggression against peers.
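The variable-oriented regression test that the configural approach parallels can be sketched directly: does adding lagged X improve the prediction of Y beyond Y's own lags? The series below are synthetic; a CFA would instead examine specific cells of the cross-classification rather than this global F test.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic series in which x genuinely Granger-causes y.
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal(scale=0.5)

def rss(design, target):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(design, target, rcond=None)
    resid = target - design @ beta
    return float(resid @ resid)

ones = np.ones(n - 1)
restricted = np.column_stack([ones, y[:-1]])    # lagged Y only
full = np.column_stack([ones, y[:-1], x[:-1]])  # lagged Y plus lagged X
target = y[1:]

rss_r, rss_f = rss(restricted, target), rss(full, target)
# F statistic for the single added regressor; a large value indicates that
# lagged X carries predictive information about Y, i.e. Granger causation.
F = (rss_r - rss_f) / (rss_f / (target.size - full.shape[1]))
```

The configural extension would ask the analogous question per configuration (e.g. only for high-aggression categories) rather than averaging the effect over the whole data space.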

  16. Improving Metallic Thermal Protection System Hypervelocity Impact Resistance Through Design of Experiments Approach

    NASA Technical Reports Server (NTRS)

    Poteet, Carl C.; Blosser, Max L.

    2001-01-01

A design of experiments approach has been implemented using computational hypervelocity impact simulations to determine the most effective place to add mass to an existing metallic Thermal Protection System (TPS) to improve hypervelocity impact protection. Simulations were performed using axisymmetric models in CTH, a shock-physics code developed by Sandia National Laboratories, and validated by comparison with existing test data. The axisymmetric models were then used in a statistical sensitivity analysis to determine the influence of five design parameters on the degree of hypervelocity particle dispersion. Several damage metrics were identified and evaluated. Damage metrics related to the extent of substructure damage were seen to produce misleading results; however, damage metrics related to the degree of dispersion of the hypervelocity particle produced results that corresponded to physical intuition. Based on analysis-of-variance results, it was concluded that the most effective way to increase hypervelocity impact resistance is to increase the thickness of the outer foil layer. Increasing the spacing between the outer surface and the substructure is also very effective at increasing dispersion.

  17. NASA payload data book: Payload analysis for space shuttle applications, volume 2

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Data describing the individual NASA payloads for the space shuttle are presented. The document represents a complete issue of the original payload data book. The subjects discussed are: (1) astronomy, (2) space physics, (3) planetary exploration, (4) earth observations (earth and ocean physics), (5) communications and navigation, (6) life sciences, (7) international rendezvous and docking, and (8) lunar exploration.

  18. Acoustic Emission Analysis Applet (AEAA) Software

    NASA Technical Reports Server (NTRS)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

NASA Glenn Research Center and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure (see figure). AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will otherwise have little impact on missions. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  19. #AltPlanets: Exploring the Exoplanet Catalogue with Neural Networks

    NASA Astrophysics Data System (ADS)

    Laneuville, M.; Tasker, E. J.; Guttenberg, N.

    2017-12-01

The launch of Kepler in 2009 brought the number of known exoplanets into the thousands, in a growth explosion that shows no sign of abating. While the data available for individual planets are presently typically restricted to orbital and bulk properties, the sheer number of data points opens the potential for meaningful statistical analysis. It is not clear how planet mass, radius, orbital path, stellar properties and neighbouring planets influence one another, so it seems inevitable that patterns will be missed simply due to the difficulty of including so many dimensions. Even simple trends may be overlooked if they fall outside our expectation of planet formation; a strong risk in a field where new discoveries have overturned theories ever since the first observations of hot Jupiters. A possible way forward is to take advantage of the capabilities of neural network autoencoders. The idea of such algorithms is to learn a representation (encoding) of the data in a lower-dimensional space, without a priori knowledge about links between the elements. This encoding space can then be used to discover the strongest correlations in the original dataset. The key point is that trends identified by a neural network are independent of any previous analysis and pre-conceived ideas about physical processes. Results can reveal new relationships between planet properties and verify existing trends. We applied this concept to study data from the NASA Exoplanet Archive, and while we have begun to explore the potential use of neural networks for exoplanet data, there are many possible extensions. For example, the network can produce a large number of 'alternative planets' whose statistics should match the current distribution. This larger dataset could highlight gaps in the parameter space or indicate observations are missing particular regimes. This could guide instrument proposals towards objects liable to yield the most information.
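The autoencoder idea can be illustrated with a deliberately tiny example: a linear autoencoder that compresses four correlated "planet" features into a two-dimensional encoding. The synthetic catalogue, architecture and training loop are all invented for illustration and are far simpler than the network the authors describe.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic catalogue: radius correlates with mass, period with stellar mass.
n = 1000
mass = rng.lognormal(0, 1, n)
radius = mass ** 0.5 * rng.lognormal(0, 0.1, n)
star = rng.lognormal(0, 0.2, n)
period = star * rng.lognormal(1, 0.3, n)
X = np.log(np.column_stack([mass, radius, star, period]))
X = (X - X.mean(0)) / X.std(0)  # standardise each feature

W_enc = rng.normal(scale=0.1, size=(4, 2))  # encoder weights (4D -> 2D)
W_dec = rng.normal(scale=0.1, size=(2, 4))  # decoder weights (2D -> 4D)
lr = 0.01
losses = []
for _ in range(500):
    Z = X @ W_enc                     # encode into the 2-D latent space
    Xhat = Z @ W_dec                  # decode back to feature space
    err = Xhat - X
    losses.append(float(np.mean(err ** 2)))
    # Gradient descent on the mean squared reconstruction error.
    g_dec = Z.T @ err * (2 / n)
    g_enc = X.T @ (err @ W_dec.T) * (2 / n)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc
```

After training, a low reconstruction error means the latent coordinates `Z` capture the dominant correlations in the catalogue without any prior physical model; nonlinear networks extend the same idea to trends PCA cannot express.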

  20. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1983-01-01

Spacecraft acceleration resulting from firings of vernier control system thrusters is an important consideration in the design, planning, execution and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists developing experiments to be performed in space often require statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so it is useful to characterize them in statistical terms. Statistics of spacecraft accelerations are summarized. Previously announced in STAR as N82-12127.
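An exceedance statistic of the kind summarized here can be sketched as counting how often an acceleration record crosses a set of thresholds. The impulsive synthetic signal below is an assumption, not Apollo-Soyuz data.

```python
import numpy as np

rng = np.random.default_rng(6)

# One hour of synthetic acceleration at 10 Hz: quiescent microgravity noise
# plus sparse thruster-firing spikes (units of g; values are invented).
dt = 0.1
n = 36_000
background = rng.normal(0, 1e-5, n)
firings = rng.random(n) < 0.001
accel = background + firings * rng.uniform(1e-3, 5e-3, n)

# For each threshold, count exceedances and convert to a rate per second.
thresholds = np.array([1e-5, 1e-4, 1e-3])
counts = np.array([(np.abs(accel) > th).sum() for th in thresholds])
rates = counts / (n * dt)
```

The resulting threshold-versus-rate curve is exactly the kind of summary an experimenter needs: how often, per unit time, the acceleration environment exceeds the level their experiment can tolerate.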

  1. Meteorological regimes for the classification of aerospace air quality predictions for NASA-Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Stephens, J. B.; Sloan, J. C.

    1976-01-01

    A method is described for developing a statistical air quality assessment for the launch of an aerospace vehicle from the Kennedy Space Center in terms of existing climatological data sets. The procedure can be refined as developing meteorological conditions are identified for use with the NASA-Marshall Space Flight Center Rocket Exhaust Effluent Diffusion (REED) description. Classical climatological regimes for the long range analysis can be narrowed as the synoptic and mesoscale structure is identified. Only broad synoptic regimes are identified at this stage of analysis. As the statistical data matrix is developed, synoptic regimes will be refined in terms of the resulting eigenvectors as applicable to aerospace air quality predictions.

  2. The space of ultrametric phylogenetic trees.

    PubMed

    Gavryushkin, Alex; Drummond, Alexei J

    2016-08-21

The reliability of a phylogenetic inference method from genomic sequence data is ensured by its statistical consistency. Bayesian inference methods produce a sample of phylogenetic trees from the posterior distribution given sequence data. Hence the question of statistical consistency of such methods is equivalent to the consistency of the summary of the sample. More generally, statistical consistency is ensured by the tree space used to analyse the sample. In this paper, we consider two standard parameterisations of phylogenetic time-trees used in evolutionary models: inter-coalescent interval lengths and absolute times of divergence events. For each of these parameterisations we introduce a natural metric space on ultrametric phylogenetic trees. We compare the introduced spaces with existing models of tree space, formulate several formal requirements that a metric space on phylogenetic trees must possess in order to be a satisfactory space for statistical analysis, and justify them. We show that only a few known constructions of the space of phylogenetic trees satisfy these requirements. However, our results suggest that these basic requirements are not enough to distinguish between the two metric spaces we introduce, and that the choice between metric spaces requires additional properties to be considered. In particular, the summary tree minimising the squared distance to the trees in the sample might differ between parameterisations. This suggests that further fundamental insight is needed into the problem of statistical consistency of phylogenetic inference methods. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Implementation of the Boston University Space Physics Acquisition Center

    NASA Technical Reports Server (NTRS)

    Spence, Harlan E.

    1998-01-01

The tasks carried out during this grant achieved the goals as set forth in the initial proposal. The Boston University Space Physics Acquisition Center (BUSPACE) now provides World Wide Web access to data from a large suite of both space-based and ground-based instruments, archived from different missions, experiments, or campaigns in which researchers associated with the Center for Space Physics (CSP) at Boston University have been involved. These archival data sets are in digital form and are valuable for retrospective data analysis studies of magnetospheric as well as ionospheric, thermospheric, and mesospheric physics. We have leveraged our grass-roots effort with the NASA seed money to establish dedicated hardware (computer and hard disk augmentation) and student support to grow and maintain the system. This leveraging of effort now permits easy access by the space physics community to many underutilized, yet important data sets, one example being that of the SCATHA satellite.

  4. Influence of Computational Drop Representation in LES of a Droplet-Laden Mixing Layer

    NASA Technical Reports Server (NTRS)

    Bellan, Josette; Radhakrishnan, Senthilkumaran

    2013-01-01

Multiphase turbulent flows are encountered in many practical applications, including turbine engines and natural phenomena involving particle dispersion. Numerical computations of multiphase turbulent flows are important because they provide a cheaper alternative to performing experiments during an engine design process, and because they can provide predictions of pollutant dispersion, etc. Two-phase flows contain millions and sometimes billions of particles. For flows with volumetrically dilute particle loading, the most accurate method of numerically simulating the flow is direct numerical simulation (DNS) of the governing equations, in which all scales of the flow, including the small scales responsible for the overwhelming amount of dissipation, are resolved. DNS, however, has a high computational cost and cannot be used in engineering design applications where iterations among several design conditions are necessary. For the same reason, numerical simulations of such flows cannot track every drop. The objective of this work is to quantify the influence of the number of computational drops and grid spacing on the accuracy of predicted flow statistics, and to possibly identify the minimum number, or, if not possible, the optimal number of computational drops that provides minimal error in flow prediction. For this purpose, several Large Eddy Simulations (LES) of a mixing layer with evaporating drops have been performed using coarse, medium, and fine grid spacings and computational drops, rather than physical drops. To define computational drops, an integer NR is introduced that represents the ratio of the number of existing physical drops to the desired number of computational drops; for example, if NR=8, a computational drop represents 8 physical drops in the flow field.
The desired number of computational drops is determined by the available computational resources; the larger NR is, the less computationally intensive is the simulation. A set of first order and second order flow statistics, and of drop statistics are extracted from LES predictions and are compared to results obtained by filtering a DNS database. First order statistics such as Favre averaged stream-wise velocity, Favre averaged vapor mass fraction, and the drop stream-wise velocity, are predicted accurately independent of the number of computational drops and grid spacing. Second order flow statistics depend both on the number of computational drops and on grid spacing. The scalar variance and turbulent vapor flux are predicted accurately by the fine mesh LES only when NR is less than 32, and by the coarse mesh LES reasonably accurately for all NR values. This is attributed to the fact that when the grid spacing is coarsened, the number of drops in a computational cell must not be significantly lower than that in the DNS.
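The computational-drop idea can be sketched in a few lines: one tracked parcel stands for NR physical drops, so drop statistics are computed with weight NR per parcel. The synthetic drop population below is an assumption; it simply illustrates why first-order statistics survive large NR while second-order statistics degrade as fewer samples represent the same population.

```python
import numpy as np

rng = np.random.default_rng(5)

# A "physical" population of drop velocities (arbitrary values).
physical_velocities = rng.normal(10.0, 2.0, size=80_000)

def computational_sample(v, NR):
    """Keep every NR-th drop; each kept drop then carries statistical weight NR."""
    sample = v[::NR]
    weights = np.full(sample.size, NR)
    return sample, weights

results = {}
for NR in (1, 8, 32):
    sample, w = computational_sample(physical_velocities, NR)
    mean = np.average(sample, weights=w)
    var = np.average((sample - mean) ** 2, weights=w)
    results[NR] = (mean, var)
# First-order statistics (the mean) remain accurate for all NR; the sampling
# error of second-order statistics grows as NR reduces the sample size.
```

This mirrors the paper's finding: Favre-averaged (first-order) quantities are insensitive to NR, while variances and turbulent fluxes require enough computational drops per grid cell.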

  5. WFIRST Microlensing Exoplanet Characterization with HST Follow up

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Aparna; David Bennett, Jay Anderson, J.P. Beaulieu.

    2018-01-01

More than 50 planets have been discovered with the various ground-based telescopes available for microlensing, but the analysis of ground-based data alone often fails to provide a complete solution. To fill that gap, space-based telescopes such as the Hubble Space Telescope and Spitzer are used. My research focuses on extracting the planet mass, host star mass, their separation, and their distance in physical units from HST follow-up observations. I will present the challenges faced in developing this method. This is the primary method to be used for NASA's top-priority project (according to the 2010 decadal survey), the Wide Field InfraRed Survey Telescope (WFIRST) exoplanet microlensing space observatory, to be launched in 2025. The unique ability of microlensing is that with WFIRST it can detect sub-Earth-mass planets beyond the reach of Kepler, at separations from 1 AU to infinity. This will provide the statistics necessary to study the formation and evolution of planetary systems, as well as the initial conditions needed to model the formation of planets and the habitable zones around M dwarf stars.

  6. Koszul information geometry and Souriau Lie group thermodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barbaresco, Frédéric, E-mail: frederic.barbaresco@thalesgroup.com

The François Massieu 1869 idea to derive some mechanical and thermal properties of physical systems from 'Characteristic Functions' was developed by Gibbs and Duhem in thermodynamics with the concept of potentials, and introduced by Poincaré in probability. This paper deals with the generalization of this Characteristic Function concept by Jean-Louis Koszul in mathematics and by Jean-Marie Souriau in statistical physics. The Koszul-Vinberg Characteristic Function (KVCF) on convex cones will be presented as the cornerstone of 'Information Geometry' theory, defining Koszul Entropy as the Legendre transform of minus the logarithm of the KVCF, and Fisher Information Metrics as the Hessian of these dual functions, invariant by their automorphisms. In parallel, Souriau extended the Characteristic Function in statistical physics, looking for other kinds of invariances through the co-adjoint action of a group on its momentum space, defining physical observables like energy, heat and momentum as pure geometrical objects. In the covariant Souriau model, Gibbs equilibrium states are indexed by a geometric parameter, the Geometric (Planck) Temperature, with values in the Lie algebra of the dynamical Galileo/Poincaré groups, interpreted as a space-time vector, giving the metric tensor a null Lie derivative. The Fisher Information metric appears as the opposite of the derivative of the mean 'moment map' by geometric temperature, equivalent to a Geometric Capacity or Specific Heat. These elements have been developed by the author in [10][11].

  7. [Factors associated with physical activity among Chinese immigrant women].

    PubMed

    Cho, Sung-Hye; Lee, Hyeonkyeong

    2013-12-01

    This study was done to assess the level of physical activity among Chinese immigrant women and to determine the relationships of physical activity with individual characteristics and behavior-specific cognition. A cross-sectional descriptive study was conducted with 161 Chinese immigrant women living in Busan. A health promotion model of physical activity adapted from Pender's Health Promotion Model was used. Self-administered questionnaires were used to collect data during the period from September 25 to November 20, 2012. Using SPSS 18.0 program, descriptive statistics, t-test, analysis of variance, correlation analysis, and multiple regression analysis were done. The average level of physical activity of the Chinese immigrant women was 1,050.06 ± 686.47 MET-min/week and the minimum activity among types of physical activity was most dominant (59.6%). As a result of multiple regression analysis, it was confirmed that self-efficacy and acculturation were statistically significant variables in the model (p<.001), with an explanatory power of 23.7%. The results indicate that the development and application of intervention strategies to increase acculturation and self-efficacy for immigrant women will aid in increasing the physical activity in Chinese immigrant women.

  8. A perceptual space of local image statistics.

    PubMed

    Victor, Jonathan D; Thengone, Daniel J; Rizvi, Syed M; Conte, Mary M

    2015-12-01

Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice - a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space, and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. Copyright © 2015 Elsevier Ltd. All rights reserved.
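The quadratic interaction rule implied by elliptical isodiscrimination contours can be made concrete. In the sketch below, the sensitivities and the off-diagonal interaction term are invented numbers, not the measured values: a quadratic form Q built from axis sensitivities and pairwise fits predicts the threshold along any mixed direction as 1/sqrt(u'Qu).

```python
import numpy as np

# Illustrative sensitivities (1/threshold) for two kinds of local statistics,
# and an assumed interaction term; all values are hypothetical.
s1, s2 = 4.0, 2.0
rho = -0.3
Q = np.array([[s1 ** 2, rho * s1 * s2],
              [rho * s1 * s2, s2 ** 2]])

def threshold(u, Q):
    """Predicted threshold along direction u from the quadratic form Q."""
    u = np.asarray(u, float)
    u = u / np.linalg.norm(u)
    return 1.0 / np.sqrt(u @ Q @ u)

t_axis1 = threshold([1, 0], Q)  # recovers the single-statistic threshold 1/s1
t_axis2 = threshold([0, 1], Q)  # recovers 1/s2
t_mixed = threshold([1, 1], Q)  # prediction for an equal mixture of the two
```

This is how pairwise measurements determine the full ellipsoid: once the diagonal and off-diagonal entries of Q are fit, thresholds for arbitrary combinations of statistics follow without new free parameters.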

  9. A perceptual space of local image statistics

    PubMed Central

    Victor, Jonathan D.; Thengone, Daniel J.; Rizvi, Syed M.; Conte, Mary M.

    2015-01-01

Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice – a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space, and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. PMID:26130606

  10. Data management, archiving, visualization and analysis of space physics data

    NASA Technical Reports Server (NTRS)

    Russell, C. T.

    1995-01-01

    A series of programs for the visualization and analysis of space physics data has been developed at UCLA. In the course of those developments, a number of lessons have been learned regarding data management and data archiving, as well as data analysis. The issues now facing those wishing to develop such software, as well as the lessons learned, are reviewed. Modern media have eased many of the earlier problems of the physical volume required to store data, the speed of access, and the permanence of the records. However, the ultimate longevity of these media is still a question of debate. Finally, while software development has become easier, cost is still a limiting factor in developing visualization and analysis software.

  11. Statistical Mechanics of Combinatorial Auctions

    NASA Astrophysics Data System (ADS)

    Galla, Tobias; Leone, Michele; Marsili, Matteo; Sellitto, Mauro; Weigt, Martin; Zecchina, Riccardo

    2006-09-01

    Combinatorial auctions are formulated as frustrated lattice gases on sparse random graphs, allowing the determination of the optimal revenue by methods of statistical physics. Transitions between computationally easy and hard regimes are found and interpreted in terms of the geometric structure of the space of solutions. We introduce an iterative algorithm to solve intermediate and large instances, and discuss competing states of optimal revenue and maximal number of satisfied bidders. The algorithm can be generalized to the hard phase and to more sophisticated auction protocols.

  12. Socioeconomic Inequalities in Green Space Quality and Accessibility—Evidence from a Southern European City

    PubMed Central

    Hoffimann, Elaine; Barros, Henrique; Ribeiro, Ana Isabel

    2017-01-01

    Background: The provision of green spaces is an important health promotion strategy to encourage physical activity and to improve population health. Green space provision has to be based on the principle of equity. This study investigated the presence of socioeconomic inequalities in geographic accessibility and quality of green spaces across Porto neighbourhoods (Portugal). Methods: Accessibility was evaluated using a Geographic Information System and all the green spaces were audited using the Public Open Space Tool. Kendall’s tau-b correlation coefficients and ordinal regression were used to test whether socioeconomic differences in green space quality and accessibility were statistically significant. Results: Although the majority of the neighbourhoods had an accessible green space, mean distance to green space increased with neighbourhood deprivation. Additionally, green spaces in the more deprived neighbourhoods presented significantly more safety concerns, signs of damage, lack of equipment to engage in active leisure activities, and had significantly less amenities such as seating, toilets, cafés, etc. Conclusions: Residents from low socioeconomic positions seem to suffer from a double jeopardy; they lack both individual and community resources. Our results have important planning implications and might contribute to understanding why deprived communities have lower physical activity levels and poorer health. PMID:28809798

  13. The Importance of Physical Fitness versus Physical Activity for Coronary Artery Disease Risk Factors: A Cross-Sectional Analysis.

    ERIC Educational Resources Information Center

    Young, Deborah Rohm; Steinhardt, Mary A.

    1993-01-01

    This cross-sectional study examined relationships among physical fitness, physical activity, and risk factors for coronary artery disease (CAD) in male police officers. Data from screenings and physical fitness assessments indicated physical activity must be sufficient to influence fitness before obtaining statistically significant risk-reducing…

  14. Planetary atmospheric physics and solar physics research

    NASA Technical Reports Server (NTRS)

    1973-01-01

    An overview is presented on current and planned research activities in the major areas of solar physics, planetary atmospheres, and space astronomy. The approach to these unsolved problems involves experimental techniques, theoretical analysis, and the use of computers to analyze the data from space experiments. The point is made that the research program is characterized by each activity interacting with the other activities in the laboratory.

  15. Statistical Analysis Techniques for Small Sample Sizes

    NASA Technical Reports Server (NTRS)

    Navard, S. E.

    1984-01-01

    The small-sample-size problem encountered in the analysis of space-flight data is examined. Because so little data is available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on the considerations needed to choose the most appropriate test for a given type of analysis.
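    As a concrete illustration of the kind of small-sample testing surveyed here, a minimal Welch's t computation using only the standard library; the data are invented and the choice of Welch's test is ours, not the report's:

```python
# Welch's t statistic for two small, possibly unequal-variance samples.
# Data are made up for illustration.
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic and approximate degrees of freedom."""
    se2_a = statistics.variance(a) / len(a)
    se2_b = statistics.variance(b) / len(b)
    t = (statistics.mean(a) - statistics.mean(b)) / math.sqrt(se2_a + se2_b)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (se2_a + se2_b) ** 2 / (
        se2_a**2 / (len(a) - 1) + se2_b**2 / (len(b) - 1)
    )
    return t, df

t_stat, df = welch_t([5.1, 4.9, 5.3, 5.0], [4.2, 4.4, 4.1])
```

    With samples this small, the underlying normality assumption matters, which is exactly the kind of consideration the report emphasizes when choosing a test.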

  16. Physical soil quality indicators for monitoring British soils

    NASA Astrophysics Data System (ADS)

    Corstanje, Ron; Mercer, Theresa G.; Rickson, Jane R.; Deeks, Lynda K.; Newell-Price, Paul; Holman, Ian; Kechavarsi, Cedric; Waine, Toby W.

    2017-09-01

    Soil condition or quality determines its ability to deliver a range of functions that support ecosystem services, human health and wellbeing. The increasing policy imperative to implement successful soil monitoring programmes has resulted in the demand for reliable soil quality indicators (SQIs) for physical, biological and chemical soil properties. The selection of these indicators needs to ensure that they are sensitive and responsive to pressure and change, e.g. they change across space and time in relation to natural perturbations and land management practices. Using a logical sieve approach based on key policy-related soil functions, this research assessed whether physical soil properties can be used to indicate the quality of British soils in terms of their capacity to deliver ecosystem goods and services. The resultant prioritised list of physical SQIs was tested for robustness, spatial and temporal variability, and expected rate of change using statistical analysis and modelling. Seven SQIs were prioritised: soil packing density, soil water retention characteristics, aggregate stability, rate of soil erosion, depth of soil, soil structure (assessed by visual soil evaluation) and soil sealing. These all have direct relevance to current and likely future soil and environmental policy and are appropriate for implementation in soil monitoring programmes.

  17. Block observations of neighbourhood physical disorder are associated with neighbourhood crime, firearm injuries and deaths, and teen births.

    PubMed

    Wei, Evelyn; Hipwell, Alison; Pardini, Dustin; Beyers, Jennifer M; Loeber, Rolf

    2005-10-01

    To provide reliability information for a brief observational measure of physical disorder and determine its relation with neighbourhood level crime and health variables after controlling for census based measures of concentrated poverty and minority concentration. Psychometric analysis of block observation data comprising a brief measure of neighbourhood physical disorder, and cross sectional analysis of neighbourhood physical disorder, neighbourhood crime and birth statistics, and neighbourhood level poverty and minority concentration. Pittsburgh, Pennsylvania, US (2000 population=334 563). Pittsburgh neighbourhoods (n=82) and their residents (as reflected in neighbourhood level statistics). The physical disorder index showed adequate reliability and validity and was associated significantly with rates of crime, firearm injuries and homicides, and teen births, while controlling for concentrated poverty and minority population. This brief measure of neighbourhood physical disorder may help increase our understanding of how community level factors reflect health and crime outcomes.

  18. An overview of the mathematical and statistical analysis component of RICIS

    NASA Technical Reports Server (NTRS)

    Hallum, Cecil R.

    1987-01-01

    Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.

  19. Statistical approaches to account for missing values in accelerometer data: Applications to modeling physical activity.

    PubMed

    Yue Xu, Selene; Nelson, Sandahl; Kerr, Jacqueline; Godbole, Suneeta; Patterson, Ruth; Merchant, Gina; Abramson, Ian; Staudenmayer, John; Natarajan, Loki

    2018-04-01

    Physical inactivity is a recognized risk factor for many chronic diseases. Accelerometers are increasingly used as an objective means to measure daily physical activity. One challenge in using these devices is missing data due to device nonwear. We used a well-characterized cohort of 333 overweight postmenopausal breast cancer survivors to examine missing data patterns of accelerometer outputs over the day. Based on these observed missingness patterns, we created pseudo-simulated datasets with realistic missing data patterns. We developed statistical methods to design imputation and variance weighting algorithms to account for missing data effects when fitting regression models. Bias and precision of each method were evaluated and compared. Our results indicated that not accounting for missing data in the analysis yielded unstable estimates in the regression analysis. Incorporating variance weights and/or subject-level imputation improved precision by >50%, compared to ignoring missing data. We recommend that these simple, easy-to-implement statistical tools be used to improve analysis of accelerometer data.
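    As a toy illustration of the imputation idea (not the authors' actual algorithm), subject-level mean imputation over non-wear gaps might look like:

```python
# Hedged sketch: fill device non-wear gaps (None) with the subject's own mean
# over worn epochs. Hypothetical minute-level counts, not study data.

def impute_subject_mean(counts):
    """Replace missing epochs (None = non-wear) with the subject's worn-time mean."""
    worn = [c for c in counts if c is not None]
    mean = sum(worn) / len(worn)
    return [mean if c is None else c for c in counts]

day = [120, None, 95, 110, None, 130]
imputed = impute_subject_mean(day)
```

    Real accelerometer pipelines condition the imputation on time of day and subject covariates; this sketch only shows the simplest subject-level variant.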

  20. System Analysis for the Huntsville Operation Support Center, Distributed Computer System

    NASA Technical Reports Server (NTRS)

    Ingels, F. M.; Massey, D.

    1985-01-01

    HOSC, as a distributed computing system, is responsible for data acquisition and analysis during Space Shuttle operations. HOSC also provides computing services for Marshall Space Flight Center's nonmission activities. As mission and nonmission activities change, so do the support functions of HOSC, demonstrating the need for some method of simulating activity at HOSC in various configurations. The simulation developed in this work primarily models the HYPERchannel network. The model simulates the activity of a steady-state network, reporting statistics such as transmitted bits, collision statistics, frame sequences transmitted, and average message delay. These statistics are used to evaluate such performance indicators as throughput, utilization, and delay. Thus the overall performance of the network is evaluated, and possible overload conditions can be predicted.
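    The performance indicators named above reduce to simple ratios of simulated counters; a hedged sketch with invented numbers, not HOSC measurements:

```python
# Illustrative computation of throughput, utilization, and average delay
# from steady-state simulation counters. All numbers are invented.

def link_indicators(bits_transmitted, busy_time_s, total_time_s, total_delay_s, frames):
    """Throughput (bit/s), utilization, and average message delay (s)."""
    return (
        bits_transmitted / total_time_s,
        busy_time_s / total_time_s,
        total_delay_s / frames,
    )

# Invented counters for a 100 s simulated interval.
throughput, utilization, avg_delay = link_indicators(5_000_000, 40.0, 100.0, 2.5, 1000)
```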

  1. Markov-switching multifractal models as another class of random-energy-like models in one-dimensional space

    NASA Astrophysics Data System (ADS)

    Saakian, David B.

    2012-03-01

    We map the Markov-switching multifractal model (MSM) onto the random energy model (REM). The MSM is, like the REM, an exactly solvable model in one-dimensional space with nontrivial correlation functions. According to our results, four different statistical physics phases are possible in random walks with multifractal behavior. We also introduce the continuous branching version of the model, calculate the moments, and prove multiscaling behavior. Different phases have different multiscaling properties.

  2. The statistical kinematical theory of X-ray diffraction as applied to reciprocal-space mapping

    PubMed

    Nesterets; Punegov

    2000-11-01

    The statistical kinematical X-ray diffraction theory is developed to describe reciprocal-space maps (RSMs) from deformed crystals containing structural defects. The general solutions for the coherent and diffuse components of the scattered intensity in reciprocal space are derived. As an example, explicit expressions for the intensity distributions are obtained for the cases of spherical defects and of a mosaic crystal. The theory takes into account the instrumental function of the triple-crystal diffractometer and can therefore be used for experimental data analysis.

  3. Experimental Investigation on Thermal Physical Properties of an Advanced Glass Fiber Composite Material

    NASA Astrophysics Data System (ADS)

    Guangfa, Gao; Yongchi, Li; Zheng, Jing; Shujie, Yuan

    Fiber-reinforced composite materials are widely used in aircraft and space vehicle engineering. For an advanced glass fiber reinforced composite material, a series of experiments measuring its thermal physical properties was conducted, and the corresponding performance curves were obtained through statistical analysis. The experimental results showed good consistency. The thermal physical parameters, such as the thermal expansion coefficient, engineering specific heat, and sublimation heat, were then calculated. This investigation provides an important foundation for further research on the heat resistance and thermodynamic performance of this material.

  4. A Methodology to Separate and Analyze a Seismic Wide Angle Profile

    NASA Astrophysics Data System (ADS)

    Weinzierl, Wolfgang; Kopp, Heidrun

    2010-05-01

    General solutions of inverse problems can often be obtained through the introduction of probability distributions to sample the model space. We present a simple approach of defining an a priori space in a tomographic study and retrieve the velocity-depth posterior distribution by a Monte Carlo method. Utilizing a fitting routine designed for very low statistics to set up and analyze the obtained tomography results, it is possible to statistically separate the velocity-depth model space derived from the inversion of seismic refraction data. An example of a profile acquired in the Lesser Antilles subduction zone reveals the effectiveness of this approach. The resolution analysis of the structural heterogeneity includes a divergence analysis which proves to be capable of dissecting long wide-angle profiles for deep crust and upper mantle studies. The complete information of any parameterised physical system is contained in the a posteriori distribution. Methods for analyzing and displaying key properties of the a posteriori distributions of highly nonlinear inverse problems are therefore essential in the scope of any interpretation. From this study we infer several conclusions concerning the interpretation of the tomographic approach. By calculating global as well as singular misfits of velocities we are able to map different geological units along a profile. Comparing velocity distributions with the result of a tomographic inversion along the profile, we can mimic the subsurface structures in their extent and composition. The possibility of gaining a priori information for seismic refraction analysis by a simple solution to an inverse problem, and the subsequent resolution of structural heterogeneities through a divergence analysis, is a new and simple way of defining the a priori space and estimating the a posteriori mean and covariance in singular and general form. The major advantage of a Monte Carlo based approach in our case study is the obtained knowledge of velocity-depth distributions. Certainly the decision of where to extract velocity information on the profile for setting up a Monte Carlo ensemble limits the a priori space. However, the general conclusion of analyzing the velocity field according to distinct reference distributions allows us to define the covariance according to any geological unit if we have a priori information on the velocity-depth distributions. Using the wide-angle data recorded across the Lesser Antilles arc, we are able to resolve a shallow feature like the backstop by a robust and simple divergence analysis. We demonstrate the effectiveness of the new methodology to extract key features and properties from the inversion results by including information concerning the confidence level of the results.
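    A minimal sketch of the Monte Carlo idea, assuming a single velocity parameter, a Gaussian misfit centred at 6.0 km/s, and uniform a priori bounds of [4, 8] km/s (all illustrative choices, not the paper's setup):

```python
# Toy random-walk Metropolis sampler over a 1-D velocity parameter; the
# log-posterior below is an assumed stand-in for a real tomographic misfit.
import math
import random

def metropolis(log_post, x0, step, n, seed=0):
    """Random-walk Metropolis sampling of a 1-D posterior."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n):
        cand = x + rng.uniform(-step, step)
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

def log_post(v):
    # Gaussian misfit around 6.0 km/s inside uniform a priori bounds [4, 8] km/s
    return -0.5 * ((v - 6.0) / 0.2) ** 2 if 4.0 <= v <= 8.0 else -math.inf

chain = metropolis(log_post, 5.0, 0.3, 20000)
posterior_mean = sum(chain[5000:]) / len(chain[5000:])  # discard burn-in
```

    From a chain like this, the a posteriori mean and covariance per geological unit follow directly from sample moments.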

  5. A Study of Particle Beam Spin Dynamics for High Precision Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiedler, Andrew J.

    In the search for physics beyond the Standard Model, high precision experiments to measure fundamental properties of particles are an important frontier. One group of such measurements involves magnetic dipole moment (MDM) values as well as searching for an electric dipole moment (EDM), both of which could provide insights about how particles interact with their environment at the quantum level and if there are undiscovered new particles. For these types of high precision experiments, minimizing statistical uncertainties in the measurements plays a critical role. This work leverages computer simulations to quantify the effects of statistical uncertainty for experiments investigating spin dynamics. In it, analysis of beam properties and lattice design effects on the polarization of the beam is performed. As a case study, the beam lines that will provide polarized muon beams to the Fermilab Muon g-2 experiment are analyzed to determine the effects of correlations between the phase space variables and the overall polarization of the muon beam.

  6. Innovative techniques to analyze time series of geomagnetic activity indices

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Papadimitriou, Constantinos; Daglis, Ioannis A.; Potirakis, Stelios M.; Eftaxias, Konstantinos

    2016-04-01

    Magnetic storms are undoubtedly among the most important phenomena in space physics and also a central subject of space weather. The non-extensive Tsallis entropy has recently been introduced as an effective complexity measure for the analysis of the geomagnetic activity Dst index. The Tsallis entropy sensitively shows the complexity dissimilarity among different "physiological" (normal) and "pathological" states (intense magnetic storms). More precisely, the Tsallis entropy implies the emergence of two distinct patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by a higher degree of organization, and (ii) a pattern associated with normal periods, which is characterized by a lower degree of organization. Other entropy measures such as Block Entropy, T-Complexity, Approximate Entropy, Sample Entropy and Fuzzy Entropy verify the above mentioned result. Importantly, the wavelet spectral analysis in terms of the Hurst exponent, H, also shows the existence of two different patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by a fractional Brownian persistent behavior, and (ii) a pattern associated with normal periods, which is characterized by a fractional Brownian anti-persistent behavior. Finally, we observe universality in the magnetic storm and earthquake dynamics, on the basis of a modified form of the Gutenberg-Richter law for the Tsallis statistics. This finding suggests a common approach to the interpretation of both phenomena in terms of the same driving physical mechanism. Signatures of discrete scale invariance in Dst time series further support the aforementioned proposal.
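    A minimal sketch of a histogram-based Tsallis entropy estimate; the entropic index q, the bin count, and the test signals are illustrative choices, not those of the study:

```python
# Hedged sketch: non-extensive Tsallis entropy S_q = (1 - sum p_i^q)/(q - 1),
# estimated from a histogram of signal values. q and bins are assumptions.
import math

def tsallis_entropy(values, q=1.8, bins=10):
    """Histogram-based estimate of the Tsallis entropy of a signal."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0   # guard against a constant signal
    counts = [0] * bins
    for v in values:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    probs = [c / len(values) for c in counts if c]
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

# An evenly spread signal (high disorder) versus a constant one (full order).
flat = tsallis_entropy([float(i) for i in range(100)])
peaked = tsallis_entropy([100.0] * 100)
```

    In the storm-detection setting described above, windows with lower S_q flag a higher degree of organization of the Dst signal.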

  7. Mean-field approximation for spacing distribution functions in classical systems.

    PubMed

    González, Diego Luis; Pimpinelli, Alberto; Einstein, T L

    2012-01-01

    We propose a mean-field method to calculate approximately the spacing distribution functions p^(n)(s) in one-dimensional classical many-particle systems. We compare our method with two other commonly used methods, the independent interval approximation and the extended Wigner surmise. In our mean-field approach, p^(n)(s) is calculated from a set of Langevin equations, which are decoupled by using a mean-field approximation. We find that in spite of its simplicity, the mean-field approximation provides good results in several systems. We offer many examples illustrating that the three previously mentioned methods give a reasonable description of the statistical behavior of the system. The physical interpretation of each method is also discussed. © 2012 American Physical Society
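    For reference, the Wigner surmise underlying the extended-surmise method is p(s) = (πs/2) exp(−πs²/4) for nearest-neighbour spacings; the sketch below numerically checks that it is normalized with unit mean spacing:

```python
# Sanity-check sketch of the GOE Wigner surmise p(s) = (pi*s/2)*exp(-pi*s^2/4):
# both its normalization and its mean spacing should come out to 1.
import math

def wigner(s):
    """GOE Wigner surmise for nearest-neighbour spacings."""
    return (math.pi * s / 2.0) * math.exp(-math.pi * s * s / 4.0)

def trapezoid(f, a, b, n=20000):
    """Simple composite trapezoid rule on [a, b]."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

norm = trapezoid(wigner, 0.0, 10.0)                            # ~1 (normalized)
mean_spacing = trapezoid(lambda s: s * wigner(s), 0.0, 10.0)   # ~1 (unit mean)
```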

  8. Deep Strong Coupling Regime of the Jaynes-Cummings Model

    NASA Astrophysics Data System (ADS)

    Casanova, J.; Romero, G.; Lizuain, I.; García-Ripoll, J. J.; Solano, E.

    2010-12-01

    We study the quantum dynamics of a two-level system interacting with a quantized harmonic oscillator in the deep strong coupling (DSC) regime of the Jaynes-Cummings model, that is, when the coupling strength g is comparable to or larger than the oscillator frequency ω (g/ω≳1). In this case, the rotating-wave approximation cannot be applied or treated perturbatively in general. We propose an intuitive and predictive physical frame to describe the DSC regime where photon number wave packets bounce back and forth along parity chains of the Hilbert space, while producing collapse and revivals of the initial population. We exemplify our physical frame with numerical and analytical considerations in the qubit population, photon statistics, and Wigner phase space.

  9. Re-entry survivability and risk

    NASA Astrophysics Data System (ADS)

    Fudge, Michael L.

    1998-11-01

    This paper is the culmination of the research effort which was reported on last year while still in progress. As previously reported, statistical methods for expressing the impact risk posed to space systems in general [and the International Space Station (ISS) in particular] by other resident space objects have been examined. One of the findings of this investigation is that there are legitimate physical modeling reasons for the common statistical expression of the collision risk. A combination of statistical methods and physical modeling is also used to express the impact risk posed by reentering space systems to objects of interest (e.g., people and property) on Earth. One of the largest uncertainties in expressing this risk is the estimation of the material which survives reentry to impact Earth's surface. This point was demonstrated in dramatic fashion in January 1997 by the impact of an intact expendable launch vehicle (ELV) upper stage near a private residence in the continental United States. Since approximately half of the missions supporting ISS will utilize ELVs, it is appropriate to examine the methods used to estimate the amount and physical characteristics of ELV debris surviving reentry to impact Earth's surface. This report details reentry survivability estimation methodology, including the specific methodology used by ITT Systems' (formerly Kaman Sciences) 'SURVIVE' model. The major change to the model in the last twelve months has been the increase in the fidelity with which upper-atmospheric aerodynamics has been modeled. This has resulted in an adjustment in the factor relating the amount of kinetic energy loss to the amount of heating entering a reentering body, and has also validated and removed the necessity for certain empirically based adjustments made to the theoretical heating expressions.
Comparisons between empirical results (observations of objects which have been recovered on Earth after surviving reentry) and SURVIVE estimates are presented for selected generic upper stage or spacecraft components, a Soyuz launch vehicle second stage, and for a Delta II launch vehicle second stage and its significant components. Significant similarity is demonstrated between the type and dispersion pattern of the recovered debris from the January 1997 Delta II 2nd stage event and the simulation of that reentry and breakup.

  10. Precision Cosmology

    NASA Astrophysics Data System (ADS)

    Jones, Bernard J. T.

    2017-04-01

    Preface; Notation and conventions; Part I. 100 Years of Cosmology: 1. Emerging cosmology; 2. The cosmic expansion; 3. The cosmic microwave background; 4. Recent cosmology; Part II. Newtonian Cosmology: 5. Newtonian cosmology; 6. Dark energy cosmological models; 7. The early universe; 8. The inhomogeneous universe; 9. The inflationary universe; Part III. Relativistic Cosmology: 10. Minkowski space; 11. The energy momentum tensor; 12. General relativity; 13. Space-time geometry and calculus; 14. The Einstein field equations; 15. Solutions of the Einstein equations; 16. The Robertson-Walker solution; 17. Congruences, curvature and Raychaudhuri; 18. Observing and measuring the universe; Part IV. The Physics of Matter and Radiation: 19. Physics of the CMB radiation; 20. Recombination of the primeval plasma; 21. CMB polarisation; 22. CMB anisotropy; Part V. Precision Tools for Precision Cosmology: 23. Likelihood; 24. Frequentist hypothesis testing; 25. Statistical inference: Bayesian; 26. CMB data processing; 27. Parametrising the universe; 28. Precision cosmology; 29. Epilogue; Appendix A. SI, CGS and Planck units; Appendix B. Magnitudes and distances; Appendix C. Representing vectors and tensors; Appendix D. The electromagnetic field; Appendix E. Statistical distributions; Appendix F. Functions on a sphere; Appendix G. Acknowledgements; References; Index.

  11. Space Science at Los Alamos National Laboratory

    NASA Astrophysics Data System (ADS)

    Smith, Karl

    2017-09-01

    The Space Science and Applications group (ISR-1) in the Intelligence and Space Research (ISR) division at the Los Alamos National Laboratory leads a number of space science missions for civilian and defense-related programs. In support of these missions the group develops sensors capable of detecting nuclear emissions and measuring radiation in space, including γ-ray, X-ray, charged-particle, and neutron detection. The group is involved in many stages of the lifetime of these sensors, including mission concept and design, simulation and modeling, calibration, and data analysis. These missions support monitoring of the atmosphere and near-Earth space environment for nuclear detonations as well as monitoring of the local space environment, including space-weather events. Expertise in this area has been established over a long history of involvement with cutting-edge projects stretching back to the first space-based monitoring mission, Project Vela. The group's interests cut across a large range of topics including non-proliferation, space situational awareness, nuclear physics, material science, space physics, astrophysics, and planetary physics.

  12. Vibroacoustic optimization using a statistical energy analysis model

    NASA Astrophysics Data System (ADS)

    Culla, Antonio; D'Ambrogio, Walter; Fregolent, Annalisa; Milana, Silvia

    2016-08-01

    In this paper, an optimization technique for medium-high frequency dynamic problems based on the Statistical Energy Analysis (SEA) method is presented. Using a SEA model, the subsystem energies are controlled by internal loss factors (ILFs) and coupling loss factors (CLFs), which in turn depend on the physical parameters of the subsystems. A preliminary sensitivity analysis of subsystem energy to the CLFs is performed to select the CLFs that are most effective on subsystem energies. Since the injected power depends not only on the external loads but on the physical parameters of the subsystems as well, it must be taken into account under certain conditions. This is accomplished in the optimization procedure, where approximate relationships between the CLFs, injected power and physical parameters are derived. The approach is applied to a typical aeronautical structure: the cabin of a helicopter.

  13. Dimensional Analysis in Physics and the Buckingham Theorem

    ERIC Educational Resources Information Center

    Misic, Tatjana; Najdanovic-Lukic, Marina; Nesic, Ljubisa

    2010-01-01

    Dimensional analysis is a simple, clear and intuitive method for determining the functional dependence of physical quantities that are of importance to a certain process. However, in physics textbooks, very little space is usually given to this approach and it is often presented only as a diagnostic tool used to determine the validity of…

  14. The joint space-time statistics of macroweather precipitation, space-time statistical factorization and macroweather models.

    PubMed

    Lovejoy, S; de Lima, M I P

    2015-07-01

    Over the range of time scales from about 10 days to 30-100 years, in addition to the familiar weather and climate regimes, there is an intermediate "macroweather" regime characterized by negative temporal fluctuation exponents: implying that fluctuations tend to cancel each other out so that averages tend to converge. We show theoretically and numerically that macroweather precipitation can be modeled by a stochastic weather-climate model (the Climate Extended Fractionally Integrated Flux model, CEFIF) first proposed for macroweather temperatures, and we show numerically that a four-parameter space-time CEFIF model can approximately reproduce eight or so empirical space-time exponents. In spite of this success, CEFIF is theoretically and numerically difficult to manage. We therefore propose a simplified stochastic model in which the temporal behaviour is modeled as a fractional Gaussian noise but the spatial behaviour as a multifractal (climate) cascade: a spatial extension of the recently introduced ScaLIng Macroweather Model, SLIMM. Both the CEFIF and this spatial SLIMM model have a property often implicitly assumed by climatologists, that climate statistics can be "homogenized" by normalizing them with the standard deviation of the anomalies. Physically, it means that the spatial macroweather variability corresponds to different climate zones that multiplicatively modulate the local, temporal statistics. This simplified macroweather model provides a framework for macroweather forecasting that exploits the system's long range memory and spatial correlations; for it, the forecasting problem has been solved. We test this factorization property and the model with the help of three centennial, global-scale precipitation products that we analyze jointly in space and in time.

  15. Statistics-related and reliability-physics-related failure processes in electronics devices and products

    NASA Astrophysics Data System (ADS)

    Suhir, E.

    2014-05-01

    The well known and widely used experimental reliability "passport" of a mass-manufactured electronic or photonic product — the bathtub curve — reflects the combined contribution of the statistics-related and reliability-physics (physics-of-failure)-related processes. As time progresses, the first process results in a decreasing failure rate, while the second process, associated with material aging and degradation, leads to an increased failure rate. An attempt has been made in this analysis to assess the level of the reliability-physics-related aging process from the available bathtub curve (diagram). It is assumed that the products of interest underwent burn-in testing and therefore the obtained bathtub curve does not contain the infant mortality portion. It has also been assumed that the two random processes in question are statistically independent, and that the failure rate of the physical process can be obtained by deducting the theoretically assessed statistical failure rate from the bathtub curve ordinates. In the numerical example carried out, the Rayleigh distribution for the statistical failure rate was used, for the sake of a relatively simple illustration. The developed methodology can be used in reliability physics evaluations, when there is a need to better understand the roles of the statistics-related and reliability-physics-related irreversible random processes in reliability evaluations. The future work should include investigations on how powerful and flexible methods and approaches of statistical mechanics can be effectively employed, in addition to reliability physics techniques, to model the operational reliability of electronic and photonic products.
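    The deduction step described above is arithmetically simple; a sketch with an assumed Rayleigh statistical failure rate h(t) = t/σ² and invented bathtub-curve ordinates:

```python
# Illustrative decomposition: subtract an assumed Rayleigh statistical failure
# rate from bathtub-curve ordinates to estimate the physics-of-failure part.
# All numbers are invented, not from the paper.

def rayleigh_hazard(t, sigma):
    """Failure (hazard) rate of a Rayleigh distribution: h(t) = t / sigma**2."""
    return t / sigma ** 2

times = [1.0, 2.0, 3.0, 4.0]          # operation time (arbitrary units)
bathtub = [0.9, 1.1, 1.5, 2.1]        # observed total failure rate at those times
sigma = 2.0                           # assumed Rayleigh parameter
physical = [b - rayleigh_hazard(t, sigma) for t, b in zip(times, bathtub)]
```

    The subtraction relies on the independence assumption stated in the abstract: total hazard is the sum of the statistical and physical hazards.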

  16. Documentation and Validation of the Goddard Earth Observing System (GEOS) Data Assimilation System, Version 4

    NASA Technical Reports Server (NTRS)

    Suarez, Max J. (Editor); daSilva, Arlindo; Dee, Dick; Bloom, Stephen; Bosilovich, Michael; Pawson, Steven; Schubert, Siegfried; Wu, Man-Li; Sienkiewicz, Meta; Stajner, Ivanka

    2005-01-01

    This document describes the structure and validation of a frozen version of the Goddard Earth Observing System Data Assimilation System (GEOS DAS): GEOS-4.0.3. Significant features of GEOS-4 include: version 3 of the Community Climate Model (CCM3) with the addition of a finite volume dynamical core; version two of the Community Land Model (CLM2); the Physical-space Statistical Analysis System (PSAS); and an interactive retrieval system (iRET) for assimilating TOVS radiance data. Upon completion of the GEOS-4 validation in December 2003, GEOS-4 became operational on 15 January 2004. Products from GEOS-4 have been used in supporting field campaigns and for reprocessing several years of data for CERES.

  17. Compositional Solution Space Quantification for Probabilistic Software Analysis

    NASA Technical Reports Server (NTRS)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
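    The underlying statistical estimate can be sketched in a few lines: plain Monte Carlo estimation of the satisfying fraction of a bounded domain for a toy constraint, with the interval-propagation stratification that focuses the sampling omitted:

```python
# Hedged sketch: Monte Carlo estimate of how much of a bounded floating-point
# domain satisfies a path condition. The constraint and bounds are toy
# stand-ins, not from the paper's benchmarks.
import random

def estimate_fraction(constraint, lo, hi, n=100_000, seed=1):
    """Monte Carlo estimate of the fraction of [lo, hi] satisfying a constraint."""
    rng = random.Random(seed)
    return sum(constraint(rng.uniform(lo, hi)) for _ in range(n)) / n

# Toy path condition x*x < 4 on the domain [-10, 10]; the exact fraction is 0.2.
frac = estimate_fraction(lambda x: x * x < 4.0, -10.0, 10.0)
```

    The compositional approach in the abstract improves on this baseline by first narrowing the sampled boxes with interval constraint propagation, which shrinks the estimator's variance for rare solution regions.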

  18. Beyond the plane-parallel approximation for redshift surveys

    NASA Astrophysics Data System (ADS)

    Castorina, Emanuele; White, Martin

    2018-06-01

    Redshift-space distortions privilege the location of the observer in cosmological redshift surveys, breaking the translational symmetry of the underlying theory. This violation of statistical homogeneity has consequences for the modelling of clustering observables, leading to what are frequently called 'wide-angle effects'. We study these effects analytically, computing their signature in the clustering multipoles in configuration and Fourier space. We take into account both physical wide-angle contributions and the terms generated by the galaxy selection function. Similar considerations also affect the way power spectrum estimators are constructed. We quantify analytically the biases that enter and clarify the relation between what we measure and the underlying theoretical modelling. The presence of an angular window function is also discussed. Motivated by this analysis, we present new estimators for the three-dimensional Cartesian power spectrum and bispectrum multipoles written in terms of spherical Fourier-Bessel coefficients. We show that the latter have several interesting properties, allowing in particular a clear separation between angular and radial modes.

  19. Statistics at the Chinese Universities.

    DTIC Science & Technology

    1981-09-01

    education in China in the postwar years is provided to give some perspective. My observations on statistics at the Chinese universities are necessarily...has been accepted as a member society of ISI. 3. Education in China Understanding of statistics in universities in China will be enhanced through some...programming), Statistical Mathematics (inference, data analysis, industrial statistics, information theory), Mathematical Physics (differential

  20. SAVS: A Space and Atmospheric Visualization Science system

    NASA Technical Reports Server (NTRS)

    Szuszczewicz, E. P.; Mankofsky, A.; Blanchard, P.; Goodrich, C.; McNabb, D.; Kamins, D.

    1995-01-01

    The research environment faced by space and atmospheric scientists in the 1990s is characterized by unprecedented volumes of new data, by ever-increasing repositories of unexploited mission files, and by the widespread use of empirical and large-scale computational models needed for the synthesis of understanding across data sets and discipline boundaries. The effective analysis and interpretation of such massive amounts of information have become the subjects of legitimate concern. With SAVS (a Space and Atmospheric Visualization Science System), we address these issues by creating a 'push-button' software environment that mimics the logical scientific processes in data acquisition, reduction, and analysis without requiring a detailed understanding of the methods, networks, and modules that link the tools and effectively execute the functions. SAVS provides (1) a customizable framework for accessing a powerful set of visualization tools based on the popular AVS visualization software with hooks to PV-Wave and access to Khoros modules, (2) a set of mathematical and statistical tools, (3) an extensible library of discipline-specific functions and models (e.g., MSIS, IRI, Feldstein Oval, IGRF, satellite tracking with CADRE-3, etc.), and (4) capabilities for local and remote data base access. The system treats scalar, vector, and image data, and runs on most common Unix workstations. We present a description of SAVS and its components, followed by several applications based on generic research interests in interplanetary and magnetospheric physics (IMP/ISTP), active experiments in space (CRRES), and mission planning focused on the Earth's thermospheric, ionospheric, and mesospheric domains (TIMED).

  1. Spectral analysis of groove spacing on Ganymede

    NASA Technical Reports Server (NTRS)

    Grimm, R. E.

    1984-01-01

    The technique used to analyze groove spacing on Ganymede is presented. Data from Voyager images are used to determine the surface topography and positions of the grooves. Power spectral estimates are statistically analyzed, and sample data are included.

  2. Monte Carlo investigation of transient acoustic fields in partially or completely bounded medium. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Thanedar, B. D.

    1972-01-01

    A simple repetitive calculation was used to investigate what happens to the field in terms of the signal paths of disturbances originating from the energy source. The computation allowed the field to be reconstructed as a function of space and time on a statistical basis. The suggested Monte Carlo method responds to the need for a numerical method to supplement analytical methods of solution, which are valid only when the boundaries of the medium have simple shapes. For the analysis, a suitable model was created, from which an algorithm was developed for estimating acoustic pressure variations in the region under investigation. The validity of the technique was demonstrated by analysis of simple physical models with the aid of a digital computer. The Monte Carlo method is applicable to a homogeneous medium enclosed by either rectangular or curved boundaries.
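
    A minimal sketch of this kind of statistical field reconstruction, assuming a rigid 2-D rectangular room, a specular reflection rule, and an invented reflection coefficient (none of which are taken from the thesis): rays representing signal paths are launched in random directions, followed through wall reflections, and the energy passing near a receiver is histogrammed by arrival time.

```python
import math
import random

def echogram(n_rays=20000, t_max=0.05, n_bins=50, seed=1):
    """Monte Carlo ray estimate of the acoustic energy arriving at a
    receiver in a rigid 2-D rectangular room, binned by arrival time."""
    rng = random.Random(seed)
    room_x, room_y = 4.0, 3.0      # room dimensions (m), illustrative
    c, refl = 343.0, 0.8           # sound speed (m/s), wall reflection coeff.
    sx, sy = 1.0, 1.0              # source position
    rx, ry, rr = 3.0, 2.0, 0.2     # receiver centre and capture radius
    dt = t_max / n_bins
    hist = [0.0] * n_bins
    for _ in range(n_rays):
        ang = rng.uniform(0.0, 2.0 * math.pi)
        x, y = sx, sy
        dx, dy = math.cos(ang), math.sin(ang)
        t, amp = 0.0, 1.0 / n_rays
        while t < t_max:
            # distance to the next wall along the ray
            wx = (room_x - x) / dx if dx > 0 else x / -dx if dx < 0 else math.inf
            wy = (room_y - y) / dy if dy > 0 else y / -dy if dy < 0 else math.inf
            step = min(wx, wy)
            # closest approach to the receiver on this straight segment
            s = max(0.0, min(step, (rx - x) * dx + (ry - y) * dy))
            px, py = x + s * dx, y + s * dy
            if (px - rx) ** 2 + (py - ry) ** 2 < rr ** 2:
                k = int((t + s / c) / dt)
                if k < n_bins:
                    hist[k] += amp
            # advance to the wall, reflect specularly, lose some energy
            x, y = x + step * dx, y + step * dy
            if step == wx:
                dx = -dx
            else:
                dy = -dy
            t += step / c
            amp *= refl
    return hist
```

    The histogram's first non-empty bin corresponds to the direct source-receiver path; later bins accumulate the reflected paths, reconstructing the transient field statistically.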

  3. Interplay of weak interactions in the atom-by-atom condensation of xenon within quantum boxes

    PubMed Central

    Nowakowska, Sylwia; Wäckerlin, Aneliia; Kawai, Shigeki; Ivas, Toni; Nowakowski, Jan; Fatayer, Shadi; Wäckerlin, Christian; Nijs, Thomas; Meyer, Ernst; Björk, Jonas; Stöhr, Meike; Gade, Lutz H.; Jung, Thomas A.

    2015-01-01

    Condensation processes are of key importance in nature and play a fundamental role in chemistry and physics. Owing to size effects at the nanoscale, it is conceptually desired to experimentally probe the dependence of condensate structure on the number of constituents one by one. Here we present an approach to study a condensation process atom-by-atom with the scanning tunnelling microscope, which provides a direct real-space access with atomic precision to the aggregates formed in atomically defined ‘quantum boxes’. Our analysis reveals the subtle interplay of competing directional and nondirectional interactions in the emergence of structure and provides unprecedented input for the structural comparison with quantum mechanical models. This approach focuses on—but is not limited to—the model case of xenon condensation and goes significantly beyond the well-established statistical size analysis of clusters in atomic or molecular beams by mass spectrometry. PMID:25608225

  4. Johnson Space Center's Risk and Reliability Analysis Group 2008 Annual Report

    NASA Technical Reports Server (NTRS)

    Valentine, Mark; Boyer, Roger; Cross, Bob; Hamlin, Teri; Roelant, Henk; Stewart, Mike; Bigler, Mark; Winter, Scott; Reistle, Bruce; Heydorn, Dick

    2009-01-01

    The Johnson Space Center (JSC) Safety & Mission Assurance (S&MA) Directorate's Risk and Reliability Analysis Group provides both mathematical and engineering analysis expertise in the areas of Probabilistic Risk Assessment (PRA), Reliability and Maintainability (R&M) analysis, and data collection and analysis. The fundamental goal of this group is to provide National Aeronautics and Space Administration (NASA) decision makers with the necessary information to make informed decisions when evaluating personnel, flight hardware, and public safety concerns associated with current operating systems as well as with any future systems. The Analysis Group includes a staff of statistical and reliability experts with valuable backgrounds in the statistical, reliability, and engineering fields. This group includes JSC S&MA Analysis Branch personnel as well as S&MA support services contractors, such as Science Applications International Corporation (SAIC) and SoHaR. The Analysis Group's experience base includes nuclear power (both commercial and navy), manufacturing, Department of Defense, chemical, and shipping industries, as well as significant aerospace experience, specifically in the Shuttle, International Space Station (ISS), and Constellation Programs. The Analysis Group partners with project and program offices, other NASA centers, NASA contractors, and universities to provide additional resources or information when performing various analysis tasks. The JSC S&MA Analysis Group is recognized as a leader in risk and reliability analysis within the NASA community. The Analysis Group is therefore in high demand to help the Space Shuttle Program (SSP) continue to fly safely, to assist in designing the next-generation spacecraft for the Constellation Program (CxP), and to promote advanced analytical techniques.
The Analysis Section's tasks include teaching classes and instituting personnel qualification processes to enhance the professional abilities of our analysts, as well as performing major probabilistic assessments used to support flight rationale and help establish program requirements. During 2008, the Analysis Group performed more than 70 assessments. Although all these assessments were important, some were instrumental in the decision-making processes for the Shuttle and Constellation Programs. Two of the more significant tasks were the Space Transportation System (STS)-122 Low Level Cutoff PRA for the SSP and the Orion Pad Abort One (PA-1) PRA for the CxP. These two activities, along with the numerous other tasks the Analysis Group performed in 2008, are summarized in this report. This report also highlights several ongoing and upcoming efforts to provide crucial statistical and probabilistic assessments, such as the Extravehicular Activity (EVA) PRA for the Hubble Space Telescope service mission and the first fully integrated PRAs for the CxP's Lunar Sortie and ISS missions.

  5. Solar and Space Physics PhD Production and Job Availability: Implications for the Future of the Space Weather Research Workforce

    NASA Astrophysics Data System (ADS)

    Moldwin, M.; Morrow, C. A.; Moldwin, L. A.; Torrence, J.

    2012-12-01

    To assess the state of health of the field of Solar and Space Physics, an analysis of the number of Ph.D.s produced and the number of job postings each year was performed for the decade 2001-2010. To determine the number of Ph.D.s produced in the field, the University of Michigan Ph.D. Dissertation Archive (ProQuest) was queried for Solar and Space Physics dissertations produced in North America. The field generated about 30 Ph.D.s per year from 2001 to 2006, but then saw the number increase to 50 to 70 per year for the rest of the decade. Only 14 institutions account for the majority of Solar and Space Physics Ph.D.s. To estimate the number of jobs available each year in the field, a compilation of the job advertisements listed in the American Astronomical Society's Solar Physics Division (SPD) and the American Geophysical Union's Space Physics and Aeronomy (SPA) electronic newsletters was made. The positions were sorted into four types (Faculty, Post-doctoral Researcher, Scientist/Researcher, or Staff), by institution type (academic, government lab, or industry), and by whether the position was located inside or outside the United States. Worldwide, 943 Solar and Space Physics positions were advertised over the decade. Of this total, 52% were for positions outside the US. Within Solar Physics, 44% of the positions were in the US, while in Space Physics 57% of the positions were at US institutions. The annual average number of US positions was 26.9 for Solar Physics and 31.5 for Space Physics, though there is much year-to-year variability, particularly in Solar Physics positions outside the US. A disconcerting trend is a decline in job advertisements in the last two years for Solar Physics positions and between 2009 and 2010 for Space Physics positions. For both communities within the US in 2010, the total number of job ads reached its lowest level of the decade (14), approximately half the decadal average.

  6. The Auroral Planetary Imaging and Spectroscopy (APIS) service

    NASA Astrophysics Data System (ADS)

    Lamy, L.; Prangé, R.; Henry, F.; Le Sidaner, P.

    2015-06-01

    The Auroral Planetary Imaging and Spectroscopy (APIS) service, accessible online, provides open and interactive access to processed auroral observations of the outer planets and their satellites. Such observations are of interest for a wide community at the interface between planetology, magnetospheric and heliospheric physics. APIS consists of (i) a high-level database, built from planetary auroral observations acquired by the Hubble Space Telescope (HST) since 1997 with its most-used far-ultraviolet spectro-imagers, (ii) a dedicated search interface for browsing this database efficiently through relevant conditional search criteria and (iii) the ability to work interactively with the data online through plotting tools developed by the Virtual Observatory (VO) community, such as Aladin and Specview. This service is VO compliant and can therefore also be queried by external search tools of the VO community. The diversity of available data and the capability to sort them by relevant physical criteria should in particular facilitate statistical studies, on long-term scales and/or in multi-instrumental, multi-spectral combined analyses.

  7. Space Radiation Effects on Human Cells: Modeling DNA Breakage, DNA Damage Foci Distribution, Chromosomal Aberrations and Tissue Effects

    NASA Technical Reports Server (NTRS)

    Ponomarev, A. L.; Huff, J. L.; Cucinotta, F. A.

    2011-01-01

    Future long-term space travel will face challenges from radiation, as the space environment poses health risks to humans from radiation with high biological effectiveness and adverse long-term post-flight effects. Solar particle events may dramatically affect crew performance, while Galactic Cosmic Rays will induce chronic exposure to high-linear-energy-transfer (LET) particles. These types of radiation, not present at ground level, can increase the probability of a fatal cancer later in an astronaut's life. No feasible shielding from radiation in space is possible, especially for the heavy-ion component, as the suggested solutions would require a dramatic increase in the mass of the mission. Our research group focuses on fundamental research and strategic analysis leading to better shielding design and to a better understanding of the biological mechanisms of radiation damage. We present our recent effort to model DNA damage and tissue damage using computational models based on the physics of heavy-ion radiation, DNA structure, and DNA damage and repair in human cells. Our particular areas of expertise include clustered DNA damage from high-LET radiation, the visualization of DSBs (DNA double strand breaks) via DNA damage foci, image analysis and the statistics of the foci for different experimental situations, chromosomal aberration formation through DSB misrepair, the kinetics of DSB repair leading to a model-derived spectrum of chromosomal aberrations, and, finally, the simulation of human tissue and the pattern of apoptotic cell damage. This compendium of theoretical and experimental data sheds light on the complex nature of radiation interacting with human DNA, cells, and tissues, which can lead to mutagenesis and carcinogenesis later in life after the space mission.

  8. National Space Science Data Center and World Data Center A for Rockets and Satellites - Ionospheric data holdings and services

    NASA Technical Reports Server (NTRS)

    Bilitza, D.; King, J. H.

    1988-01-01

    The activities and services of the National Space Science Data Center (NSSDC) and the World Data Center A for Rockets and Satellites (WDC-A-R and S) are described, with special emphasis on ionospheric physics. The present catalog/archive system is explained and future developments are indicated. In addition to the basic data acquisition, archiving, and dissemination functions, ongoing activities include the Central Online Data Directory (CODD), the Coordinated Data Analysis Workshops (CDAW), the Space Physics Analysis Network (SPAN), advanced data management systems (CD/DIS, NCDS, PLDS), and publication of the NSSDC News, the SPACEWARN Bulletin, and several NSSDC reports.

  9. Design and Development of the Observation and Analysis of Smectic Islands in Space Experiment

    NASA Technical Reports Server (NTRS)

    Hall, Nancy Rabel; Tin, Padetha; Sheehan, C. C.; Stannarius, R.; Trittel, T.; Clark, N.; Maclennan, J.; Glaser, M.; Park, C.

    2012-01-01

    The primary objective of the Observation and Analysis of Smectic Islands in Space (OASIS) experiment is to exploit the unique characteristics of freely suspended liquid crystals in a microgravity environment to advance the understanding of fluid-state physics.

  10. Networks In Real Space: Characteristics and Analysis for Biology and Mechanics

    NASA Astrophysics Data System (ADS)

    Modes, Carl; Magnasco, Marcelo; Katifori, Eleni

    Functional networks embedded in physical space play a crucial role in countless biological and physical systems, from the efficient dissemination of oxygen, blood sugars, and hormonal signals in vascular systems to the complex relaying of informational signals in the brain to the distribution of stress and strain in architecture or static sand piles. Unlike their more-studied abstract cousins, such as the hyperlinked internet, social networks, or economic and financial connections, these networks are both constrained by and intimately connected to the physicality of their real, embedding space. We report on the results of new computational and analytic approaches tailored to these physical networks with particular implications and insights for mammalian organ vasculature.

  11. Research in space physics at the University of Iowa. [astronomical observatories, spaceborne astronomy, satellite observation

    NASA Technical Reports Server (NTRS)

    Vanallen, J. A.

    1974-01-01

    Various research projects in space physics are summarized. Emphasis is placed on: (1) the study of energetic particles in outer space and their relationships to electric, magnetic, and electromagnetic fields associated with the earth, the sun, the moon, the planets, and interplanetary medium; (2) observational work on satellites of the earth and the moon, and planetary and interplanetary spacecraft; (3) phenomenological analysis and interpretation; (4) observational work by ground based radio-astronomical and optical techniques; and (5) theoretical problems in plasma physics. Specific fields of current investigations are summarized.

  12. Space weather: Why are magnetospheric physicists interested in solar explosive phenomena

    NASA Astrophysics Data System (ADS)

    Koskinen, H. E. J.; Pulkkinen, T. I.

    That solar activity drives magnetospheric dynamics has long been the basis of solar-terrestrial physics. Numerous statistical studies correlating sunspots, 10.7 cm radiation, solar flares, etc., with various magnetospheric and geomagnetic parameters have been performed. However, in studies of magnetospheric dynamics the role of the Sun has often remained in the background, and only the actual solar wind impinging on the magnetosphere has received most of the attention. During the last few years a new applied field of solar-terrestrial physics, space weather, has emerged. The term refers to variable particle and field conditions in our space environment, which may be hazardous to space-borne or ground-based technological systems and can endanger human life and health. As modern society becomes increasingly dependent on space technology, the need for better modelling, and also forecasting, of space weather becomes urgent. While for post-analysis of magnetospheric phenomena it is quite sufficient to include observations from the magnetospheric boundaries out to L1, where SOHO is located, these observations do not provide enough lead time to run space weather forecasting models and to distribute the forecasts to potential customers. For such purposes we need improved physical understanding and models to predict which active processes on the Sun will impact the magnetosphere and what their expected consequences are. An important change of view on the role of the Sun as the origin of magnetospheric disturbances has taken place during the last 10-20 years. For a long time, solar flares were thought to be the most geoeffective solar phenomena. Now the attention has shifted much more towards coronal mass ejections, and the SOHO coronal observations seem to have made this change of epoch irreversible. However, we are not yet ready to make reliable predictions of the terrestrial environment based on CME observations.
From the space weather viewpoint, the key questions are when a CME will be ejected, whether it will hit the Earth, what its density and speed will be, and how the magnetic field will be wrapped around the plasma cloud. This is clearly an enormous modelling task, but one well worthwhile to carry further. Forecasting of solar energetic particle events would also be very useful, as these form the most hazardous single effect on spaceflight, be it on the Space Station, on the Moon, or even further out. We illustrate the chain of effects from the solar atmosphere to near-Earth space using some of the CME-associated magnetic storm events from the SOHO era.

  13. Retrospective space-time cluster analysis of whooping cough, re-emergence in Barcelona, Spain, 2000-2011.

    PubMed

    Solano, Rubén; Gómez-Barroso, Diana; Simón, Fernando; Lafuente, Sarah; Simón, Pere; Rius, Cristina; Gorrindo, Pilar; Toledo, Diana; Caylà, Joan A

    2014-05-01

    A retrospective space-time study of whooping cough cases reported to the Public Health Agency of Barcelona, Spain between the years 2000 and 2011 is presented. It is based on 633 individual whooping cough cases and the 2006 population census from the Spanish National Statistics Institute, stratified by age and sex at the census tract level. Cluster identification was attempted using a space-time scan statistic assuming a Poisson distribution and restricting the temporal extent to 7 days and the spatial distance to 500 m. Statistical calculations were performed with Stata 11 and SaTScan, and mapping was performed with ArcGIS 10.0. Only clusters showing statistical significance (P <0.05) were mapped. The most likely cluster identified included five census tracts located in three neighbourhoods of central Barcelona during the week from 17 to 23 August 2011. This cluster included five cases compared with an expected level of 0.0021 (relative risk = 2436, P <0.001). In addition, 11 secondary significant space-time clusters were detected, occurring at different times and localizations. Spatial statistics are thus useful for complementing epidemiological surveillance systems: by visualizing excesses in the number of cases in space and time, they increase the possibility of identifying outbreaks not reported by the surveillance system.
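
    The scan statistic used above can be sketched in a few lines. The version below is a simplified Kulldorff-style Poisson scan with a fixed 500 m radius and windows of up to 7 days; it assumes a uniform population at risk instead of census-tract denominators, and all coordinates in it are invented for illustration.

```python
import math

def poisson_llr(c, expected, total):
    """Kulldorff's log-likelihood ratio for a Poisson scan window with c
    observed cases, 'expected' cases under the null, and 'total' cases."""
    if c <= expected:
        return 0.0
    llr = c * math.log(c / expected)
    if c < total:
        llr += (total - c) * math.log((total - c) / (total - expected))
    return llr

def scan(cases, study_area, study_days, radius=500.0, max_days=7):
    """Exhaustive space-time scan: every (centre, day-window) cylinder of
    fixed radius is scored, and the highest-LLR cylinder is reported as
    the most likely cluster. cases is a list of (x, y, day) tuples; risk
    is assumed uniform over the study region (a real analysis would use
    census-tract populations)."""
    total = len(cases)
    frac_space = math.pi * radius**2 / study_area
    best_llr, best_info = 0.0, None
    days = sorted({d for _, _, d in cases})
    for cx, cy, _ in cases:                      # candidate cylinder centres
        near = [(x, y, d) for x, y, d in cases
                if math.hypot(x - cx, y - cy) <= radius]
        for start in days:                       # candidate time windows
            for length in range(1, max_days + 1):
                c = sum(1 for _, _, d in near if start <= d < start + length)
                expected = total * frac_space * length / study_days
                llr = poisson_llr(c, expected, total)
                if llr > best_llr:
                    best_llr = llr
                    best_info = (cx, cy, start, length, c)
    return best_llr, best_info
```

    A real analysis would also assess the significance of the best cluster by Monte Carlo replication under the null hypothesis, as SaTScan does.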

  14. Evidence of nonextensive statistical physics behavior in the watershed distribution in active tectonic areas: examples from Greece

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos; Kouli, Maria

    2013-08-01

    The Digital Elevation Model (DEM) of the island of Crete, with a resolution of approximately 20 meters, was used to delineate watersheds by computing the flow direction and using it in the Watershed function, which uses a raster of flow direction to determine contributing area. The routine Geographic Information Systems procedure was applied, and the watersheds as well as the stream network (using a threshold of 2000 cells, i.e. the minimum number of cells that constitute a stream) were extracted from the hydrologically corrected (free of sinks) DEM. A few thousand watersheds were delineated and their areal extent calculated. Of these, 300 watersheds were finally selected for further analysis; watersheds of extremely small area were excluded to avoid possible artifacts. Our analysis approach is based on the basic principles of complexity theory and the Tsallis entropy introduced in the framework of non-extensive statistical physics. This concept has been used successfully for the analysis of a variety of complex dynamic systems, including natural hazards, where fractality and long-range interactions are important. The analysis indicates that the statistical distribution of watersheds can be successfully described by the theoretical estimates of non-extensive statistical physics, implying the complexity that characterizes their occurrence.
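
    The abstract does not state the fitted functional form, so the following is only a common non-extensive ansatz: the normalised cumulative distribution of watershed areas is modelled with a Tsallis q-exponential, which reduces to the ordinary Boltzmann-Gibbs exponential as q -> 1. The parameters q and a0 are illustrative, not fitted values from the paper.

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential; recovers the ordinary exponential as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def watershed_cdf_model(area, q, a0):
    """Non-extensive ansatz for the normalised cumulative distribution of
    watershed areas: P(A > area) = exp_q(-area / a0)."""
    return q_exp(-area / a0, q)
```

    For q > 1 the tail of exp_q decays as a power law rather than exponentially, which is the signature usually taken as evidence of long-range interactions in such analyses.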

  15. Quantum work in the Bohmian framework

    NASA Astrophysics Data System (ADS)

    Sampaio, R.; Suomela, S.; Ala-Nissila, T.; Anders, J.; Philbin, T. G.

    2018-01-01

    At nonzero temperature classical systems exhibit statistical fluctuations of thermodynamic quantities arising from the variation of the system's initial conditions and its interaction with the environment. The fluctuating work, for example, is characterized by the ensemble of system trajectories in phase space and, by including the probabilities for various trajectories to occur, a work distribution can be constructed. However, without phase-space trajectories, the task of constructing a work probability distribution in the quantum regime has proven elusive. Here we use quantum trajectories in phase space and define fluctuating work as power integrated along the trajectories, in complete analogy to classical statistical physics. The resulting work probability distribution is valid for any quantum evolution, including cases with coherences in the energy basis. We demonstrate the quantum work probability distribution and its properties with an exactly solvable example of a driven quantum harmonic oscillator. An important feature of the work distribution is its dependence on the initial statistical mixture of pure states, which is reflected in higher moments of the work. The proposed approach introduces a fundamentally different perspective on quantum thermodynamics, allowing full thermodynamic characterization of the dynamics of quantum systems, including the measurement process.
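
    For comparison only, the same recipe (work defined as power integrated along individual trajectories) can be written down for a classical ensemble; the sketch below drags a harmonic trap at constant speed with m = k = beta = 1, all illustrative choices, and uses classical rather than Bohmian trajectories. For this protocol the free-energy change vanishes, so the Jarzynski average of exp(-W) over trajectories should come out near 1.

```python
import math
import random

def work_samples(n_traj=2000, t_final=5.0, dt=0.01, v_pull=0.5, seed=2):
    """Fluctuating work as power integrated along phase-space trajectories,
    for a classical harmonic trap (m = k = 1, beta = 1) dragged at constant
    speed v_pull. Along each trajectory dW = (dH/dt) dt = -v_pull*(x - lam) dt."""
    rng = random.Random(seed)
    n_steps = int(round(t_final / dt))
    works = []
    for _ in range(n_traj):
        x = rng.gauss(0.0, 1.0)      # thermal position, std 1/sqrt(beta*k)
        p = rng.gauss(0.0, 1.0)      # thermal momentum, std sqrt(m/beta)
        w = 0.0
        for step in range(n_steps):
            lam = v_pull * step * dt            # current trap centre
            w += -v_pull * (x - lam) * dt       # power integrated along path
            p_half = p + 0.5 * dt * -(x - lam)  # velocity-Verlet update
            x += dt * p_half
            p = p_half + 0.5 * dt * -(x - v_pull * (step + 1) * dt)
        works.append(w)
    return works

works = work_samples()
jarzynski = sum(math.exp(-w) for w in works) / len(works)  # should be near 1
```

    The histogram of `works` is the classical counterpart of the paper's work probability distribution; the quantum construction replaces these phase-space trajectories with Bohmian ones.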

  16. A statistical physics viewpoint on the dynamics of the bouncing ball

    NASA Astrophysics Data System (ADS)

    Chastaing, Jean-Yonnel; Géminard, Jean-Christophe; Bertin, Eric

    2016-06-01

    We compute, from a statistical physics perspective, the dynamics of a bouncing ball maintained in a chaotic regime by collisions with a plate experiencing an aperiodic vibration. We analyze in detail the energy exchanges between the bead and the vibrating plate, and show that the coupling between the bead and the plate can be modeled in terms of both a dissipative process and an injection mechanism by an energy reservoir. An analysis of the injection statistics in terms of a fluctuation relation is also provided.
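
    A crude sketch of the setup, with an assumption the paper does not make: instead of resolving the plate's aperiodic motion, the plate velocity at each impact is drawn as an independent Gaussian (a standard simplification). The per-collision kinetic-energy changes then play the role of the energy exchanges analyzed in the paper: positive values are injections from the reservoir, negative values are dissipations, and in the stationary regime they balance on average.

```python
import random

def energy_exchanges(n_coll=100000, r=0.8, u_rms=1.0, seed=3):
    """Crude model of a ball bouncing on an aperiodically vibrating plate:
    at each impact the plate velocity is an independent Gaussian draw and
    the rebound speed follows v' = r*v + (1 + r)*u (restitution r < 1).
    Free flight under gravity returns the ball at the same speed, so the
    collisions are the only energy exchanges; returns the per-collision
    kinetic-energy changes dE = (v'**2 - v**2) / 2."""
    rng = random.Random(seed)
    v = 1.0                          # upward speed after the current collision
    changes = []
    for _ in range(n_coll):
        v_new = r * v + (1.0 + r) * rng.gauss(0.0, u_rms)
        while v_new <= 0.0:          # crude fix: a downward rebound recollides
            v_new = r * -v_new + (1.0 + r) * rng.gauss(0.0, u_rms)
        changes.append(0.5 * (v_new**2 - v**2))
        v = v_new
    return changes
```

    With r < 1 the map contracts, so a statistically stationary state exists in which the distribution of injected energies can be studied, which is the kind of injection statistics the paper examines with a real aperiodic drive.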

  17. Space biology initiative program definition review. Trade study 3: Hardware miniaturization versus cost

    NASA Technical Reports Server (NTRS)

    Jackson, L. Neal; Crenshaw, John, Sr.; Davidson, William L.; Herbert, Frank J.; Bilodeau, James W.; Stoval, J. Michael; Sutton, Terry

    1989-01-01

    The optimum hardware miniaturization level with the lowest cost impact for space biology hardware was determined. Space biology hardware and/or components/subassemblies/assemblies that are the most likely candidates for miniaturization are to be defined, and the relative cost impacts of such miniaturization are to be analyzed. A mathematical or statistical analysis method with the capability to support the development of parametric cost impact analyses for levels of production design miniaturization is provided.

  18. Fundamental physical theories: Mathematical structures grounded on a primitive ontology

    NASA Astrophysics Data System (ADS)

    Allori, Valia

    In my dissertation I analyze the structure of fundamental physical theories. I start with an analysis of what an adequate primitive ontology is, discussing the measurement problem in quantum mechanics and its solutions. It is commonly said that these theories have little in common. I argue instead that the moral of the measurement problem is that the wave function cannot represent physical objects, and that a common structure between these solutions can be recognized: each of them is about a clear three-dimensional primitive ontology that evolves according to a law determined by the wave function. The primitive ontology is what matter is made of, while the wave function tells the matter how to move. One might think that what is important in the notion of primitive ontology is its three-dimensionality. If so, in a theory like classical electrodynamics the electromagnetic fields would be part of the primitive ontology. I argue that, reflecting on what the purpose of a fundamental physical theory is, namely to explain the behavior of objects in three-dimensional space, one can recognize that a fundamental physical theory has a particular architecture. If so, the electromagnetic fields play a different role in the theory than the particles and should therefore be considered, like the wave function, as part of the law. We can thus characterize the general structure of a fundamental physical theory as a mathematical structure grounded on a primitive ontology. I explore this idea to better understand theories like classical mechanics and relativity, emphasizing that the primitive ontology is crucial in the process of building new theories, being fundamental in identifying the symmetries. Finally, I analyze what it means to explain the world around us in terms of the notion of primitive ontology in the case of regularities of statistical character.
Here is where the notion of typicality comes into play: we have explained a phenomenon if the typical histories of the primitive ontology give rise to the statistical regularities we observe.

  19. Numerical solutions of ideal quantum gas dynamical flows governed by semiclassical ellipsoidal-statistical distribution.

    PubMed

    Yang, Jaw-Yen; Yan, Chih-Yuan; Diaz, Manuel; Huang, Juan-Chen; Li, Zhihui; Zhang, Hanxin

    2014-01-08

    The ideal quantum gas dynamics as manifested by the semiclassical ellipsoidal-statistical (ES) equilibrium distribution derived in Wu et al. (2012 Proc. R. Soc. A 468, 1799-1823; doi:10.1098/rspa.2011.0673) is numerically studied for particles of three statistics. This anisotropic ES equilibrium distribution was derived using the maximum entropy principle and conserves mass, momentum and energy, but differs from the standard Fermi-Dirac or Bose-Einstein distribution. The present numerical method combines the discrete velocity (or momentum) ordinate method in momentum space with the high-resolution shock-capturing method in physical space. A decoding procedure to obtain the necessary parameters for determining the ES distribution is also devised. Computations of two-dimensional Riemann problems are presented, and various contours of the quantities unique to this ES model are illustrated. The main flow features, such as shock waves, expansion waves and slip lines and their complex nonlinear interactions, are depicted and found to be consistent with existing calculations for a classical gas.

  20. RooStatsCms: A tool for analysis modelling, combination and statistical studies

    NASA Astrophysics Data System (ADS)

    Piparo, D.; Schott, G.; Quast, G.

    2010-04-01

    RooStatsCms is an object-oriented statistical framework based on the RooFit technology. Its scope is to allow the modelling, statistical analysis and combination of multiple search channels for new phenomena in High Energy Physics. It provides a variety of methods described in the literature, implemented as classes whose design is oriented to the execution of multiple CPU-intensive jobs on batch systems or on the Grid.

  1. Parallelization of the Physical-Space Statistical Analysis System (PSAS)

    NASA Technical Reports Server (NTRS)

    Larson, J. W.; Guo, J.; Lyster, P. M.

    1999-01-01

    Atmospheric data assimilation is a method of combining observations with model forecasts to produce a more accurate description of the atmosphere than the observations or forecast alone can provide. Data assimilation plays an increasingly important role in the study of climate and atmospheric chemistry. The NASA Data Assimilation Office (DAO) has developed the Goddard Earth Observing System Data Assimilation System (GEOS DAS) to create assimilated datasets. The core computational components of the GEOS DAS include the GEOS General Circulation Model (GCM) and the Physical-space Statistical Analysis System (PSAS). The need for timely validation of scientific enhancements to the data assimilation system poses computational demands that are best met by distributed parallel software. PSAS is implemented in Fortran 90 using object-based design principles. The analysis portions of the code solve two equations. The first of these is the "innovation" equation, which is solved on the unstructured observation grid using a preconditioned conjugate gradient (CG) method. The "analysis" equation is a transformation from the observation grid back to a structured grid, and is solved by a direct matrix-vector multiplication. Use of a factored-operator formulation reduces the computational complexity of both the CG solver and the matrix-vector multiplication, rendering the matrix-vector multiplications as a successive product of operators on a vector. Sparsity is introduced to these operators by partitioning the observations using an icosahedral decomposition scheme. PSAS builds a large (approx. 128MB) run-time database of parameters used in the calculation of these operators. Implementing a message passing parallel computing paradigm into an existing yet developing computational system as complex as PSAS is nontrivial. One of the technical challenges is balancing the requirements for computational reproducibility with the need for high performance. 
The problem of computational reproducibility is well known in the parallel computing community: the parallel code must perform calculations in a fashion that yields identical results on different configurations of processing elements on the same platform. In some cases this problem can be solved by sacrificing performance; meeting the requirement while still achieving high performance is very difficult. Topics to be discussed include: the current PSAS design and parallelization strategy; reproducibility issues; load balance vs. database memory demands; and possible solutions to these problems.
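As a rough illustration of the iterative building block named above (not the DAO's Fortran 90 implementation), a preconditioned conjugate gradient solver for a symmetric positive-definite system can be sketched as follows, with a diagonal (Jacobi) preconditioner assumed purely for illustration:

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradient for a symmetric positive-definite A.

    M_inv is the (approximate) inverse of the preconditioner matrix.
    """
    x = np.zeros_like(b)
    r = b - A @ x               # initial residual
    z = M_inv @ r               # preconditioned residual
    p = z.copy()                # initial search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)   # optimal step length along p
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv @ r
        rz_new = r @ z
        p = z + (rz_new / rz) * p   # conjugate direction update
        rz = rz_new
    return x
```

In PSAS the operator application `A @ p` would itself be a successive product of factored sparse operators rather than an explicit matrix, which is where the icosahedral partitioning of observations enters.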

  2. Probing the space of toric quiver theories

    NASA Astrophysics Data System (ADS)

    Hewlett, Joseph; He, Yang-Hui

    2010-03-01

We demonstrate a practical and efficient method for generating toric Calabi-Yau quiver theories, applicable to both D3 and M2 brane world-volume physics. A new analytic method is presented for low numbers of parameters, and an algorithm for the general case is developed that has polynomial complexity in the number of edges in the quiver. Using this algorithm, carefully implemented, we classify the quiver diagrams and assign possible superpotentials for various small values of the numbers of edges and nodes. We examine some preliminary statistics on this space of toric quiver theories.

  3. Statistical analysis of electroconvection near an ion-selective membrane in the highly chaotic regime

    NASA Astrophysics Data System (ADS)

    Druzgalski, Clara; Mani, Ali

    2016-11-01

    We investigate electroconvection and its impact on ion transport in a model system comprised of an ion-selective membrane, an aqueous electrolyte, and an external electric field applied normal to the membrane. We develop a direct numerical simulation code to solve the governing Poisson-Nernst-Planck and Navier-Stokes equations in three dimensions using a specialized parallel numerical algorithm and sufficient resolution to capture the high frequency and high wavenumber physics. We show a comprehensive statistical analysis of the transport phenomena in the highly chaotic regime. Qualitative and quantitative comparisons of two-dimensional (2D) and 3D simulations include prediction of the mean concentration fields as well as the spectra of concentration, charge density, and velocity signals. Our analyses reveal a significant quantitative difference between 2D and 3D electroconvection. Furthermore, we show that high-intensity yet short-lived current density hot spots appear randomly on the membrane surface, contributing significantly to the mean current density. By examining cross correlations between current density on the membrane and other field quantities we explore the physical mechanisms leading to current hot spots. We also present analysis of transport fluxes in the context of ensemble-averaged equations. Our analysis reveals that in the highly chaotic regime the mixing layer (ML), which spans the majority of the domain extent, is governed by advective fluctuations. Furthermore, we show that in the ML the mean electromigration fluxes cancel out for positive and negative ions, indicating that the mean transport of total salt content within the ML can be represented via the electroneutral approximation. Finally, we present an assessment of the importance of different length scales in enhancing transport by computing the cross covariance of concentration and velocity fluctuations in the wavenumber space. 
Our analysis indicates that in the majority of the domain the large scales contribute most significantly to transport, while the effects of small scales become more appreciable in regions very near the membrane.

  4. The joint space-time statistics of macroweather precipitation, space-time statistical factorization and macroweather models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Lovejoy, S. (lovejoy@physics.mcgill.ca); de Lima, M. I. P. (Department of Civil Engineering, University of Coimbra, 3030-788 Coimbra)

    2015-07-15

Over the range of time scales from about 10 days to 30–100 years, in addition to the familiar weather and climate regimes, there is an intermediate “macroweather” regime characterized by negative temporal fluctuation exponents, implying that fluctuations tend to cancel each other out so that averages tend to converge. We show theoretically and numerically that macroweather precipitation can be modeled by a stochastic weather-climate model (the Climate Extended Fractionally Integrated Flux model, CEFIF) first proposed for macroweather temperatures, and we show numerically that a four-parameter space-time CEFIF model can approximately reproduce eight or so empirical space-time exponents. In spite of this success, CEFIF is theoretically and numerically difficult to manage. We therefore propose a simplified stochastic model in which the temporal behavior is modeled as a fractional Gaussian noise but the spatial behavior as a multifractal (climate) cascade: a spatial extension of the recently introduced ScaLIng Macroweather Model, SLIMM. Both the CEFIF and this spatial SLIMM model have a property, often implicitly assumed by climatologists, that climate statistics can be “homogenized” by normalizing them with the standard deviation of the anomalies. Physically, it means that the spatial macroweather variability corresponds to different climate zones that multiplicatively modulate the local, temporal statistics. This simplified macroweather model provides a framework for macroweather forecasting that exploits the system's long-range memory and spatial correlations; for it, the forecasting problem has been solved. We test this factorization property and the model with the help of three centennial, global-scale precipitation products that we analyze jointly in space and in time.

  5. Dependency graph for code analysis on emerging architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shashkov, Mikhail Jurievich; Lipnikov, Konstantin

A directed acyclic graph (DAG) is becoming the standard representation for modern multi-physics codes. The ideal DAG is the true block scheme of a multi-physics code, and it is therefore a convenient object for in situ analysis of the cost of computations and of algorithmic bottlenecks related to statistically frequent data motion and dynamical machine state.

  6. Statistical analysis of experimental data for mathematical modeling of physical processes in the atmosphere

    NASA Astrophysics Data System (ADS)

    Karpushin, P. A.; Popov, Yu B.; Popova, A. I.; Popova, K. Yu; Krasnenko, N. P.; Lavrinenko, A. V.

    2017-11-01

    In this paper, the probabilities of faultless operation of aerologic stations are analyzed, the hypothesis of normality of the empirical data required for using the Kalman filter algorithms is tested, and the spatial correlation functions of distributions of meteorological parameters are determined. The results of a statistical analysis of two-term (0, 12 GMT) radiosonde observations of the temperature and wind velocity components at some preset altitude ranges in the troposphere in 2001-2016 are presented. These data can be used in mathematical modeling of physical processes in the atmosphere.

  7. Exploring the Structure of Library and Information Science Web Space Based on Multivariate Analysis of Social Tags

    ERIC Educational Resources Information Center

    Joo, Soohyung; Kipp, Margaret E. I.

    2015-01-01

    Introduction: This study examines the structure of Web space in the field of library and information science using multivariate analysis of social tags from the Website, Delicious.com. A few studies have examined mathematical modelling of tags, mainly examining tagging in terms of tripartite graphs, pattern tracing and descriptive statistics. This…

  8. Book Review:

    NASA Astrophysics Data System (ADS)

    Vespignani, A.

    2004-09-01

    Networks have been recently recognized as playing a central role in understanding a wide range of systems spanning diverse scientific domains such as physics and biology, economics, computer science and information technology. Specific examples run from the structure of the Internet and the World Wide Web to the interconnections of finance agents and ecological food webs. These networked systems are generally made by many components whose microscopic interactions give rise to global structures characterized by emergent collective behaviour and complex topological properties. In this context the statistical physics approach finds a natural application since it attempts to explain the various large-scale statistical properties of networks in terms of local interactions governing the dynamical evolution of the constituent elements of the system. It is not by chance then that many of the seminal papers in the field have been published in the physics literature, and have nevertheless made a considerable impact on other disciplines. Indeed, a truly interdisciplinary approach is required in order to understand each specific system of interest, leading to a very interesting cross-fertilization between different scientific areas defining the emergence of a new research field sometimes called network science. The book of Dorogovtsev and Mendes is the first comprehensive monograph on this new scientific field. It provides a thorough presentation of the forefront research activities in the area of complex networks, with an extensive sampling of the disciplines involved and the kinds of problems that form the subject of inquiry. The book starts with a short introduction to graphs and network theory that introduces the tools and mathematical background needed for the rest of the book. The following part is devoted to an extensive presentation of the empirical analysis of real-world networks. 
While for obvious reasons of space the authors cannot analyse every example in full detail, they provide the reader with a general vista that makes clear the relevance of network science to a wide range of natural and man-made systems. Two chapters are then committed to the detailed exposition of the statistical physics approach to equilibrium and non-equilibrium networks. The authors are two leading players in the area of network theory and offer a very careful and complete presentation of the statistical physics theory of evolving networks. Finally, in the last two chapters, the authors focus on various consequences of network topology for dynamical and physical phenomena occurring in these kinds of structures. The book is completed by a very extensive bibliography and some useful appendices containing technical points arising in the mathematical discussion and data analysis. The book's mathematical level is fairly advanced and allows a coherent and unified framework for the study of networked structures. The book is targeted at mathematicians, physicists and social scientists alike. It will be appreciated by everybody working in the network area, and especially by any researcher or student entering the field who would like a reference text on the latest developments in network science.

  9. Intermittency Statistics in the Expanding Solar Wind

    NASA Astrophysics Data System (ADS)

    Cuesta, M. E.; Parashar, T. N.; Matthaeus, W. H.

    2017-12-01

The solar wind is observed to be turbulent. One of the open questions in solar wind research is how the turbulence evolves as the solar wind expands to great distances. Some studies have focused on the evolution of the outer scale, but not much has been done to understand how intermittency evolves in the expanding wind beyond 1 AU (see [1,2]). We use magnetic field data from the Voyager 1 spacecraft from 1 to 10 AU to study the evolution of the statistics of magnetic discontinuities. We perform various statistical tests on these discontinuities and make connections to the physical processes occurring in the expanding wind. [1] Tsurutani, Bruce T., and Edward J. Smith. "Interplanetary discontinuities: Temporal variations and the radial gradient from 1 to 8.5 AU." Journal of Geophysical Research: Space Physics 84.A6 (1979): 2773-2787. [2] Greco, A., et al. "Evidence for nonlinear development of magnetohydrodynamic scale intermittency in the inner heliosphere." The Astrophysical Journal 749.2 (2012): 105.

  10. Scientific, statistical, practical, and regulatory considerations in design space development.

    PubMed

    Debevec, Veronika; Srčič, Stanko; Horvat, Matej

    2018-03-01

    The quality by design (QbD) paradigm guides the pharmaceutical industry towards improved understanding of products and processes, and at the same time facilitates a high degree of manufacturing and regulatory flexibility throughout the establishment of the design space. This review article presents scientific, statistical and regulatory considerations in design space development. All key development milestones, starting with planning, selection of factors, experimental execution, data analysis, model development and assessment, verification, and validation, and ending with design space submission, are presented and discussed. The focus is especially on frequently ignored topics, like management of factors and CQAs that will not be included in experimental design, evaluation of risk of failure on design space edges, or modeling scale-up strategy. Moreover, development of a design space that is independent of manufacturing scale is proposed as the preferred approach.

  11. Emulating Simulations of Cosmic Dawn for 21 cm Power Spectrum Constraints on Cosmology, Reionization, and X-Ray Heating

    NASA Astrophysics Data System (ADS)

    Kern, Nicholas S.; Liu, Adrian; Parsons, Aaron R.; Mesinger, Andrei; Greig, Bradley

    2017-10-01

Current and upcoming radio interferometric experiments are aiming to make a statistical characterization of the high-redshift 21 cm fluctuation signal spanning the hydrogen reionization and X-ray heating epochs of the universe. However, connecting 21 cm statistics to the underlying physical parameters is complicated by the theoretical challenge of modeling the relevant physics at computational speeds quick enough to enable exploration of the high-dimensional and weakly constrained parameter space. In this work, we use machine learning algorithms to build a fast emulator that can accurately mimic an expensive simulation of the 21 cm signal across a wide parameter space. We embed our emulator within a Markov Chain Monte Carlo framework in order to perform Bayesian parameter constraints over a large number of model parameters, including those that govern the Epoch of Reionization, the Epoch of X-ray Heating, and cosmology. As a worked example, we use our emulator to present an updated parameter constraint forecast for the Hydrogen Epoch of Reionization Array experiment, showing that its characterization of a fiducial 21 cm power spectrum will considerably narrow the allowed parameter space of reionization and heating parameters, and could help strengthen Planck's constraints on σ8. We provide both our generalized emulator code and its implementation specifically for 21 cm parameter constraints as publicly available software.
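The emulator-inside-MCMC idea can be illustrated schematically. The toy below (not the authors' 21 cm emulator; all functions and parameter values are invented for illustration) fits a cheap polynomial surrogate to a stand-in "expensive" model on a sparse training design, then samples the surrogate's likelihood with a random-walk Metropolis sampler:

```python
import numpy as np

def expensive_model(theta):
    """Stand-in for a costly simulation (hypothetical toy)."""
    return np.sin(3 * theta) + 0.5 * theta ** 2

# Train a cheap surrogate on a sparse design of "simulation" runs
train_x = np.linspace(-1, 1, 30)
coeffs = np.polyfit(train_x, expensive_model(train_x), deg=9)
emulate = lambda theta: np.polyval(coeffs, theta)

def metropolis(loglike, theta0, n_steps=5000, step=0.3, seed=1):
    """Random-walk Metropolis sampler over a 1D parameter."""
    rng = np.random.default_rng(seed)
    chain = np.empty(n_steps)
    theta, lp = theta0, loglike(theta0)
    for i in range(n_steps):
        prop = theta + step * rng.standard_normal()
        lp_prop = loglike(prop)
        if np.log(rng.random()) < lp_prop - lp:  # accept/reject
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

# Gaussian likelihood around a mock "observation", evaluated via the emulator
obs, sigma = expensive_model(0.7), 0.05
chain = metropolis(lambda t: -0.5 * ((emulate(t) - obs) / sigma) ** 2,
                   theta0=0.0)
```

Every likelihood evaluation inside the sampler calls only the cheap surrogate; the expensive model is run just for the training set, which is the point of the technique.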

  12. Interference in the classical probabilistic model and its representation in complex Hilbert space

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.

    2005-10-01

The notion of a context (complex of physical conditions, that is to say, specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are already present in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: “first context and only then probability”. We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as Hilbert space projections of a realistic dynamics in a “prespace”. The basic condition for representing the prespace dynamics is the law of statistical conservation of energy (conservation of probabilities). In general the Hilbert space projection of the “prespace” dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.

  13. Inference for the physical sciences

    PubMed Central

    Jones, Nick S.; Maccarone, Thomas J.

    2013-01-01

    There is a disconnect between developments in modern data analysis and some parts of the physical sciences in which they could find ready use. This introduction, and this issue, provides resources to help experimental researchers access modern data analysis tools and exposure for analysts to extant challenges in physical science. We include a table of resources connecting statistical and physical disciplines and point to appropriate books, journals, videos and articles. We conclude by highlighting the relevance of each of the articles in the associated issue. PMID:23277613

  14. Measurement and Simulation of the Variation in Proton-Induced Energy Deposition in Large Silicon Diode Arrays

    NASA Technical Reports Server (NTRS)

    Howe, Christina L.; Weller, Robert A.; Reed, Robert A.; Sierawski, Brian D.; Marshall, Paul W.; Marshall, Cheryl J.; Mendenhall, Marcus H.; Schrimpf, Ronald D.

    2007-01-01

The proton-induced charge deposition in a well characterized silicon P-i-N focal plane array is analyzed with Monte Carlo based simulations. These simulations include all physical processes, together with pile-up, to accurately describe the experimental data. Simulation results reveal important high-energy events not easily detected through experiment due to low statistics. The effects of each physical mechanism on the device response are shown for a single proton energy as well as a full proton space flux.

  15. Skylab experiments. Volume 5: Astronomy and space physics. [Skylab observations of galactic radiation, solar energy, and interplanetary composition for high school level education

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The astronomy and space physics investigations conducted in the Skylab program include over 20 experiments in four categories to explore space phenomena that cannot be observed from earth. The categories of space research are as follows: (1) phenomena within the solar system, such as the effect of solar energy on Earth's atmosphere, the composition of interplanetary space, the possibility of an inner planet, and the X-ray radiation from Jupiter, (2) analysis of energetic particles such as cosmic rays and neutrons in the near-earth space, (3) stellar and galactic astronomy, and (4) self-induced environment surrounding the Skylab spacecraft.

  16. Modeling subjective evaluation of soundscape quality in urban open spaces: An artificial neural network approach.

    PubMed

    Yu, Lei; Kang, Jian

    2009-09-01

    This research aims to explore the feasibility of using computer-based models to predict the soundscape quality evaluation of potential users in urban open spaces at the design stage. With the data from large scale field surveys in 19 urban open spaces across Europe and China, the importance of various physical, behavioral, social, demographical, and psychological factors for the soundscape evaluation has been statistically analyzed. Artificial neural network (ANN) models have then been explored at three levels. It has been shown that for both subjective sound level and acoustic comfort evaluation, a general model for all the case study sites is less feasible due to the complex physical and social environments in urban open spaces; models based on individual case study sites perform well but the application range is limited; and specific models for certain types of location/function would be reliable and practical. The performance of acoustic comfort models is considerably better than that of sound level models. Based on the ANN models, soundscape quality maps can be produced and this has been demonstrated with an example.

  17. Phenomenology of small violations of Fermi and Bose statistics

    NASA Astrophysics Data System (ADS)

    Greenberg, O. W.; Mohapatra, Rabindra N.

    1989-04-01

    In a recent paper, we proposed a ``paronic'' field-theory framework for possible small deviations from the Pauli exclusion principle. This theory cannot be represented in a positive-metric (Hilbert) space. Nonetheless, the issue of possible small violations of the exclusion principle can be addressed in the framework of quantum mechanics, without being connected with a local quantum field theory. In this paper, we discuss the phenomenology of small violations of both Fermi and Bose statistics. We consider the implications of such violations in atomic, nuclear, particle, and condensed-matter physics and in astrophysics and cosmology. We also discuss experiments that can detect small violations of Fermi and Bose statistics or place stringent bounds on their validity.

  18. Green space definition affects associations of green space with overweight and physical activity.

    PubMed

    Klompmaker, Jochem O; Hoek, Gerard; Bloemsma, Lizan D; Gehring, Ulrike; Strak, Maciej; Wijga, Alet H; van den Brink, Carolien; Brunekreef, Bert; Lebret, Erik; Janssen, Nicole A H

    2018-01-01

In epidemiological studies, exposure to green space is inconsistently associated with being overweight and physical activity, possibly because studies differ widely in their definition of green space exposure, inclusion of important confounders, study population and data analysis. We evaluated whether the association of green space with being overweight and physical activity depended upon the definition of green space. We conducted a cross-sectional study using data from a Dutch national health survey of 387,195 adults. Distance to the nearest park entrance and surrounding green space, based on the Normalized Difference Vegetation Index (NDVI) or a detailed Dutch land-use database (TOP10NL), was calculated for each residential address. We used logistic regression analyses to study the association of green space exposure with being overweight and being moderately or vigorously physically active outdoors at least 150 min/week (self-reported). To study the shape of the association, we specified natural splines and quintiles. The distance to the nearest park entrance was not associated with being overweight or outdoor physical activity. Associations of surrounding green space with being overweight or outdoor physical activity were highly non-linear. For NDVI surrounding greenness, we observed significantly decreased odds of being overweight [300 m buffer, odds ratio (OR) = 0.88; 95% CI: 0.86, 0.91] and increased odds of outdoor physical activity [300 m buffer, OR = 1.14; 95% CI: 1.10, 1.17] in the highest quintile compared to the lowest quintile. For TOP10NL surrounding green space, associations were mostly non-significant. Associations were generally stronger for subjects living in less urban areas and for the smaller buffers. Associations of green space with being overweight and outdoor physical activity differed considerably between different green space definitions. Associations were strongest for NDVI surrounding greenness. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
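The quintile comparison reported in odds-ratio form above can be made concrete: given a 2×2 table of overweight/non-overweight counts in the top and bottom exposure quintiles, the odds ratio and a Wald 95% confidence interval follow directly. A minimal sketch (the counts in the comment are invented for illustration, not the study's data):

```python
import math

def quintile_odds_ratio(exposed_cases, exposed_noncases,
                        ref_cases, ref_noncases):
    """Odds ratio (e.g. highest vs. lowest quintile) with a Wald 95% CI."""
    or_ = (exposed_cases / exposed_noncases) / (ref_cases / ref_noncases)
    # Standard error of log(OR) from the four cell counts
    se = math.sqrt(1 / exposed_cases + 1 / exposed_noncases
                   + 1 / ref_cases + 1 / ref_noncases)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: 100/900 overweight/not in the top greenness quintile,
# 125/875 in the bottom quintile:
# quintile_odds_ratio(100, 900, 125, 875) -> OR ~ 0.78, 95% CI ~ (0.59, 1.03)
```

In the study itself the ORs come from logistic regression with confounder adjustment, so this crude 2×2 calculation is only the unadjusted analogue.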

  19. Observational limitations of Bose-Einstein photon statistics and radiation noise in thermal emission

    NASA Astrophysics Data System (ADS)

    Lee, Y.-J.; Talghader, J. J.

    2018-01-01

For many decades, theory has predicted that Bose-Einstein statistics are a fundamental feature of thermal emission into one or a few optical modes; however, the resulting Bose-Einstein-like photon noise has never been experimentally observed. There are at least two reasons for this: (1) relationships to describe the thermal radiation noise for an arbitrary mode structure have yet to be set forth, and (2) the mode and detector constraints necessary for the detection of such light are extremely hard to fulfill. Herein, photon statistics and radiation noise relationships are developed for systems with any number of modes and couplings to an observing space. The results are shown to reproduce existing special cases of thermal emission and are then applied to resonator systems to discuss physically realizable conditions under which Bose-Einstein-like thermal statistics might be observed. Examples include a single isolated cavity and an emitter cavity coupled to a small detector space. Low-mode-number noise theory shows major deviations from solely Bose-Einstein or Poisson treatments and has particular significance because of recent advances in perfect absorption and subwavelength structures in both the long-wave infrared and terahertz regimes. These microresonator devices tend to utilize a small volume with few modes, a regime where the current theory of thermal emission fluctuations and background noise, which was developed decades ago for free-space or single-mode cavities, has no derived solutions.
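The Bose-Einstein versus Poisson distinction at issue is easy to state numerically: for a single thermal mode with mean occupation n̄, the photon-number distribution P(n) = n̄ⁿ/(1+n̄)ⁿ⁺¹ has variance n̄(1+n̄), exceeding the Poisson value n̄. A quick numerical check (illustrative values only):

```python
import numpy as np

def bose_einstein_pmf(nbar, nmax):
    """P(n) = nbar^n / (1+nbar)^(n+1) for a single thermal mode, truncated at nmax."""
    n = np.arange(nmax + 1)
    return nbar ** n / (1.0 + nbar) ** (n + 1), n

nbar = 2.0
p, n = bose_einstein_pmf(nbar, 400)
mean = np.sum(n * p)                 # converges to nbar = 2
var = np.sum((n - mean) ** 2 * p)    # converges to nbar*(1+nbar) = 6, vs. 2 for Poisson
```

The excess variance n̄² is the "photon bunching" term that the detection constraints discussed above make so difficult to observe directly.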

  20. The NASA Physics of the Cosmos Program

    NASA Astrophysics Data System (ADS)

    Bock, Jamie

    2015-04-01

    The NASA Physics of the Cosmos program is a portfolio of space-based investigations for studying fundamental processes in the universe. Areas of focus include: probing the physical process of inflation associated with the birth of the universe, studying the nature of the dark energy that dominates the mass-energy of the modern universe, advancing new ways to observe the universe through gravitational-wave astronomy, studying the universe in X-rays and gamma rays to probe energetic astrophysical processes and to study the formation and behavior of black holes in strong gravity, and determining the energetic origins and history of cosmic rays. The program is supported by an analysis group called the PhysPAG that serves as a forum for community input and analysis. Space offers unique advantages for these exciting investigations, and the program seeks to guide the development of future space missions through observations from current facilities, and by formulating new technologies and capabilities.

  1. Statistical physics approach to earthquake occurrence and forecasting

    NASA Astrophysics Data System (ADS)

    de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio

    2016-04-01

There is striking evidence that the dynamics of the Earth's crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth's crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in the size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence, which represent the frame of reference for a variety of statistical mechanical models ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description of the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and to discriminate among different physical mechanisms responsible for earthquake triggering.
In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.
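One widely used branching process of the kind described is the temporal ETAS (epidemic-type aftershock sequence) model, in which the event rate is a background term plus Omori-law contributions from past shocks, weighted exponentially by magnitude. A minimal sketch of its conditional intensity, with illustrative (not fitted) parameter values:

```python
import numpy as np

def etas_intensity(t, event_times, event_mags,
                   mu=0.1, K=0.02, alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """Conditional intensity lambda(t) of a temporal ETAS branching model.

    mu: background rate; K, alpha: productivity and magnitude weighting;
    c, p: Omori-Utsu decay parameters; m0: magnitude cutoff.
    All parameter values here are illustrative defaults, not fitted.
    """
    past = event_times < t
    contrib = (K * np.exp(alpha * (event_mags[past] - m0))
               / (t - event_times[past] + c) ** p)
    return mu + contrib.sum()
```

Forecasting with such a model amounts to integrating this intensity forward in time, with each past event spawning its own decaying cluster of expected aftershocks.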

  2. Using real-time ultrasound imaging as adjunct teaching tools to enhance physical therapist students' ability and confidence to perform traction of the knee joint.

    PubMed

    Markowski, Alycia; Watkins, Maureen K; Burnett, Todd; Ho, Melissa; Ling, Michael

    2018-04-01

Often, physical therapy students struggle with the skill and the confidence to perform manual techniques for musculoskeletal examination. Current teaching methods lack concurrent objective feedback. Real-time ultrasound imaging (RTUI) has the advantage of generating visualization of anatomical structures in real time in an efficient and safe manner. We hypothesized that the use of RTUI to augment teaching with concurrent objective visual feedback would result in students' improved ability to create a change in joint space when performing a manual knee traction, and in higher confidence scores. Eighty-six students were randomly allocated to a control or an experimental group. All participants received baseline instructions on how to perform knee traction. The control group received standardized lab instruction (visual, video, and instructor/partner feedback). The experimental group received standardized lab instruction augmented with RTUI feedback. Pre- and post-intervention data collection consisted of measuring participants' ability to create changes in joint space when performing knee traction, a confidence survey evaluating perceived ability, and a reflection paper. Joint space changes between groups were compared using a paired t-test. Surveys were analyzed with descriptive statistics and compared using the Wilcoxon rank-sum test; for the reflection papers, themes were identified and descriptive statistics reported. Although there were no statistically significant differences between the control and the experimental group, overall scores improved. Qualitative data suggest students found the use of ultrasound imaging beneficial and would like more exposure. This novel approach to teaching knee traction with RTUI has potential and may be a basis for further studies. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Space station needs, attributes and architectural options. Volume 1, attachment 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1983-01-01

    User alignment plan, physical and life sciences and applications, commercial requirements, national security, space operations, user needs, foreign contacts, mission scenario analysis and architectural concepts, alternative systems concepts, mission operations architectural development, architectural analysis trades, evolution, configuration, and technology development are discussed.

  4. NASA Lighting Research, Test, & Analysis

    NASA Technical Reports Server (NTRS)

    Clark, Toni

    2015-01-01

    The Habitability and Human Factors Branch, at Johnson Space Center, in Houston, TX, provides technical guidance for the development of spaceflight lighting requirements, verification of light system performance, analysis of integrated environmental lighting systems, and research of lighting-related human performance issues. The Habitability & Human Factors Lighting Team maintains two physical facilities that are integrated to provide support. The Lighting Environment Test Facility (LETF) provides a controlled darkroom environment for physical verification of lighting systems with photometric and spectrographic measurement systems. The Graphics Research & Analysis Facility (GRAF) maintains the capability for computer-based analysis of operational lighting environments. The combined capabilities of the Lighting Team at Johnson Space Center have been used for a wide range of lighting-related issues.

  5. Study of binary asteroids with three space missions

    NASA Astrophysics Data System (ADS)

    Kovalenko, Irina; Doressoundiram, Alain; Hestroffer, Daniel

    Binary and multiple asteroids are common in the Solar System and are found in various places, from the near-Earth region to the main belt, the Trojans and Centaurs, and beyond Neptune. Their study can provide insight into the Solar System's formation and its subsequent dynamical evolution. Binaries are also objects of high interest because they provide fundamental physical parameters such as mass and density, and hence clues on the early Solar System and on other processes that affect asteroids over time. We will present our current project on the analysis of such systems based on three space missions. The first is the Herschel space observatory (ESA), the largest infrared telescope ever launched. Thirty Centaur and trans-Neptunian binaries were observed by Herschel, and the measurements allowed their size, albedo, and thermal properties to be determined [1]. The second is the satellite Gaia (ESA). This mission is designed to chart a three-dimensional map of the Galaxy. Gaia will provide positional measurements of Solar System objects - including asteroid binaries - with unprecedented accuracy [2]. The third is the proposed mission AIDA, which would study the effects of crashing a spacecraft into an asteroid [3]. Its objectives are to demonstrate the ability to modify the trajectory of an asteroid, to precisely measure the trajectory change, and to characterize the asteroid's physical properties; the target of this mission is the binary system (65803) Didymos. Our project encompasses orbital characterisation of both astrometric and resolved binaries, derivation of densities, and a general statistical analysis of the physical and orbital properties of trans-Neptunian and other asteroid binaries. Acknowledgements: work supported by Labex ESEP (ANR N° 2011-LABX-030). [1] Müller T., Lellouch E., Stansberry J. et al. 2009. TNOs are Cool: A Survey of the Transneptunian Region. EM&P 105, 209-219. [2] Mignard F., Cellino A., Muinonen K. et al. 2007. The Gaia Mission: Expected Applications to Asteroid Science. EM&P 101, 97-125. [3] Galvez A., Carnelli I. et al. 2013. AIDA: The Asteroid Impact & Deflection Assessment Mission. EPSC 2013-1043.

  6. Statistical wind analysis for near-space applications

    NASA Astrophysics Data System (ADS)

    Roney, Jason A.

    2007-09-01

    Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60 000 and 100 000 ft (18-30 km) above ground level (AGL) at two locations, Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not been previously applied to this region of the stratosphere for such an application. Standard statistics were compiled for these data, such as mean, median, maximum wind speed, and standard deviation, and the data were modeled with Weibull distributions. These statistics indicated that, on a yearly average, there is a lull or a “knee” in the wind between 65 000 and 72 000 ft AGL (20-22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were done by using a Weibull model based on 2004 data and comparing the model with the 2003 and 2005 data. The 2004 distribution was also a reasonable model for these years. Lastly, the Weibull distribution and cumulative function were used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
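    The percentile winds quoted at the end of the abstract follow directly from the Weibull fit: with cumulative distribution F(v) = 1 - exp(-(v/λ)^k), the p-th percentile wind is λ(-ln(1-p))^(1/k). A minimal sketch of that calculation; the shape and mean wind below are illustrative assumptions, not the paper's fitted values:

```python
import math

def weibull_ppf(p, shape, scale):
    """Inverse CDF of the Weibull distribution: the wind speed not
    exceeded with probability p, given shape k and scale lambda."""
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

def scale_from_mean(mean, shape):
    """Method-of-moments scale estimate: mean = scale * Gamma(1 + 1/k)."""
    return mean / math.gamma(1.0 + 1.0 / shape)

# Illustrative values only: shape k = 2 (Rayleigh-like), mean wind 15 m/s.
k = 2.0
lam = scale_from_mean(15.0, k)
for p in (0.50, 0.95, 0.99):
    print(f"{p:.0%} wind: {weibull_ppf(p, k, lam):.1f} m/s")
```

    The gap between the 50% and 99% values is exactly the point the authors make: sizing an airship's power system from the mean wind alone would understate the operational requirement.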

  7. Jet Noise Physics and Modeling Using First-principles Simulations

    NASA Technical Reports Server (NTRS)

    Freund, Jonathan B.

    2003-01-01

    An extensive analysis of our jet DNS database has provided for the first time the complex correlations that are at the core of many statistical jet noise models, including MGBK. We have also for the first time explicitly computed the noise from different components of a commonly used noise source as proposed in many modeling approaches. Key findings are: (1) While two-point (space and time) velocity statistics are well fitted by decaying exponentials, even for our low-Reynolds-number jet, spatially integrated fourth-order space/retarded-time correlations, which constitute the noise "source" in MGBK, are instead well fitted by Gaussians. The width of these Gaussians depends (by a factor of 2) on which components are considered, counter to current modeling practice; (2) a standard decomposition of the Lighthill source is shown by direct evaluation to be somewhat artificial, since the noise from these nominally separate components is in fact highly correlated, and we anticipate that the same will be the case for the Lilley source; and (3) the far-field sound is computed in a way that explicitly includes all quadrupole cancellations, yet evaluating the Lighthill integral for only a small part of the jet yields far-field noise far louder than that from the whole jet, due to missing nonquadrupole cancellations. Details of this study are discussed in a draft of a paper included as Appendix A.

  8. Exploiting Molecular Weight Distribution Shape to Tune Domain Spacing in Block Copolymer Thin Films.

    PubMed

    Gentekos, Dillon T; Jia, Junteng; Tirado, Erika S; Barteau, Katherine P; Smilgies, Detlef-M; DiStasio, Robert A; Fors, Brett P

    2018-04-04

    We report a method for tuning the domain spacing (Dsp) of self-assembled block copolymer thin films of poly(styrene-block-methyl methacrylate) (PS-b-PMMA) over a large range of lamellar periods. By modifying the molecular weight distribution (MWD) shape (including both the breadth and skew) of the PS block via temporal control of polymer chain initiation in anionic polymerization, we observe increases of up to 41% in Dsp for polymers with the same overall molecular weight (Mn ≈ 125 kg mol⁻¹) without significantly changing the overall morphology or chemical composition of the final material. In conjunction with our experimental efforts, we have utilized concepts from population statistics and least-squares analysis to develop a model for predicting Dsp based on the first three moments of the MWDs. This statistical model reproduces experimental Dsp values with high fidelity (with mean absolute errors of 1.2 nm or 1.8%) and provides novel physical insight into the individual and collective roles played by the MWD moments in determining this property of interest. This work demonstrates that both MWD breadth and skew have a profound influence over Dsp, thereby providing an experimental and conceptual platform for exploiting MWD shape as a simple and modular handle for fine-tuning Dsp in block copolymer thin films.
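    The model's ingredients, the first three moments of the MWD plus a least-squares fit, can be sketched as follows. The moment definitions are standard; every number in the toy fit is made up for illustration and is not from the paper:

```python
import numpy as np

def mwd_moments(M, w):
    """First three moments of a molecular weight distribution:
    mean, variance (breadth), and skewness (asymmetry)."""
    M = np.asarray(M, float)
    w = np.asarray(w, float)
    w = w / w.sum()                     # normalize the weight fractions
    m1 = np.sum(w * M)                  # mean molecular weight
    var = np.sum(w * (M - m1) ** 2)     # breadth
    skew = np.sum(w * (M - m1) ** 3) / var ** 1.5  # skew
    return m1, var, skew

# Toy regression in the spirit of the abstract: D_sp modeled as a linear
# function of the three moments (columns: intercept, mean, var, skew).
X = np.array([[1.0, 100.0, 10.0, 0.0],
              [1.0, 110.0, 25.0, 0.5],
              [1.0, 120.0, 30.0, 1.0],
              [1.0, 130.0, 12.0, 0.2]])
y = np.array([40.0, 45.0, 50.0, 52.0])  # hypothetical domain spacings (nm)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(beta, 3))
```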

  9. Validating Future Force Performance Measures (Army Class): Concluding Analyses

    DTIC Science & Technology

    2016-06-01

    Table 3.10. Descriptive Statistics and Intercorrelations for LV Final Predictor Factor Scores ... Table 4.7. Descriptive Statistics for Analysis Criteria ... Soldier attrition and performance: Dependability (Non-Delinquency), Adjustment, Physical Conditioning, Leadership, Work Orientation, and Agreeableness

  10. Assessing the Associations Between Types of Green Space, Physical Activity, and Health Indicators Using GIS and Participatory Survey

    NASA Astrophysics Data System (ADS)

    Akpinar, A.

    2017-11-01

    This study explores whether specific types of green spaces (i.e. urban green spaces, forests, agricultural lands, rangelands, and wetlands) are associated with physical activity, quality of life, and cardiovascular disease prevalence. A sample of 8,976 respondents from the Behavioral Risk Factor Surveillance System, conducted in 2006 in Washington State across 291 zip-codes, was analyzed. Measures included physical activity status, quality of life, and cardiovascular disease prevalence (i.e. heart attack, angina, and stroke). Percentage of green spaces was derived from the National Land Cover Dataset and measured with Geographical Information System. Multilevel regression analyses were conducted to analyze the data while controlling for age, sex, race, weight, marital status, occupation, income, education level, and zip-code population and socio-economic situation. Regression results reveal that no green space types were associated with physical activity, quality of life, and cardiovascular disease prevalence. On the other hand, the analysis shows that physical activity was associated with general health, quality of life, and cardiovascular disease prevalence. The findings suggest that other factors such as size, structure and distribution (sprawled or concentrated, large or small), quality, and characteristics of green space might be important in general health, quality of life, and cardiovascular disease prevalence rather than green space types. Therefore, further investigations are needed.

  11. mvMapper: statistical and geographical data exploration and visualization of multivariate analysis of population structure

    USDA-ARS?s Scientific Manuscript database

    Characterizing population genetic structure across geographic space is a fundamental challenge in population genetics. Multivariate statistical analyses are powerful tools for summarizing genetic variability, but geographic information and accompanying metadata is not always easily integrated into t...

  12. UNCERTAINTY ON RADIATION DOSES ESTIMATED BY BIOLOGICAL AND RETROSPECTIVE PHYSICAL METHODS.

    PubMed

    Ainsbury, Elizabeth A; Samaga, Daniel; Della Monaca, Sara; Marrale, Maurizio; Bassinet, Celine; Burbidge, Christopher I; Correcher, Virgilio; Discher, Michael; Eakins, Jon; Fattibene, Paola; Güçlü, Inci; Higueras, Manuel; Lund, Eva; Maltar-Strmecki, Nadica; McKeever, Stephen; Rääf, Christopher L; Sholom, Sergey; Veronese, Ivan; Wieser, Albrecht; Woda, Clemens; Trompier, Francois

    2018-03-01

    Biological and physical retrospective dosimetry are recognised as key techniques to provide individual estimates of dose following unplanned exposures to ionising radiation. Whilst there has been a relatively large amount of recent development in the biological and physical procedures, development of statistical analysis techniques has failed to keep pace. The aim of this paper is to review the current state of the art in uncertainty analysis techniques across the 'EURADOS Working Group 10-Retrospective dosimetry' members, to give concrete examples of implementation of the techniques recommended in the international standards, and to further promote the use of Monte Carlo techniques to support characterisation of uncertainties. It is concluded that sufficient techniques are available and in use by most laboratories for acute, whole body exposures to highly penetrating radiation, but further work will be required to ensure that statistical analysis is always wholly sufficient for the more complex exposure scenarios.
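    The Monte Carlo approach the review promotes can be sketched minimally: sample every input of a dose-estimation formula from its uncertainty distribution and read the dose uncertainty off the resulting distribution. The formula and all numbers below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical calibration-curve dosimetry: dose = (signal - background) / k.
# Each input carries uncertainty; sample all three and propagate jointly.
signal = rng.normal(10.0, 0.5, N)       # measured yield (arbitrary units)
background = rng.normal(1.0, 0.2, N)    # spontaneous background
calib = rng.normal(3.0, 0.15, N)        # calibration coefficient (per Gy)

dose = (signal - background) / calib
lo, med, hi = np.percentile(dose, [2.5, 50, 97.5])
print(f"dose = {med:.2f} Gy (95% interval {lo:.2f}-{hi:.2f} Gy)")
```

    Because the samples propagate through the full nonlinear formula, the resulting interval needs no linearization assumption, which is the main appeal of Monte Carlo over simple error-propagation formulas for complex exposure scenarios.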

  13. PCA as a practical indicator of OPLS-DA model reliability.

    PubMed

    Worley, Bradley; Powers, Robert

    Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (from the noise-added data) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
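    The Monte Carlo design is straightforward to sketch: build two separated synthetic groups, add increasing Gaussian noise, and track a scores-space separation statistic. The code below is a hedged illustration on made-up data, using plain PCA via SVD rather than the authors' NMR datasets or an OPLS-DA implementation:

```python
import numpy as np

def pca_scores(X, n_pc=2):
    """Scores on the first n_pc principal components (PCA via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_pc].T

rng = np.random.default_rng(0)
n, p, sep = 20, 10, 4.0
A = rng.normal(0.0, 1.0, (n, p)); A[:, 0] += sep   # group 1, shifted
B = rng.normal(0.0, 1.0, (n, p))                   # group 2
X = np.vstack([A, B])

ratios = []
for sigma in (0.0, 2.0, 8.0):
    S = pca_scores(X + rng.normal(0.0, sigma, X.shape))
    # Between-group centroid distance, standardized by within-group spread.
    d = np.linalg.norm(S[:n].mean(axis=0) - S[n:].mean(axis=0))
    within = np.mean([np.linalg.norm(g - g.mean(axis=0), axis=1).mean()
                      for g in (S[:n], S[n:])])
    ratios.append(d / within)
    print(f"sigma={sigma}: standardized scores-space separation = {d/within:.2f}")
```

    The standardized separation collapses as noise grows, which mirrors the paper's observation for PCA; reproducing the OPLS-DA half of the comparison would require a cross-validated (O)PLS implementation.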

  14. Fast inner-volume imaging of the lumbar spine with a spatially focused excitation using a 3D-TSE sequence.

    PubMed

    Riffel, Philipp; Michaely, Henrik J; Morelli, John N; Paul, Dominik; Kannengiesser, Stephan; Schoenberg, Stefan O; Haneder, Stefan

    2015-04-01

    The purpose of this study was to evaluate the feasibility and technical quality of a zoomed three-dimensional (3D) turbo spin-echo (TSE) sampling perfection with application optimized contrasts using different flip-angle evolutions (SPACE) sequence of the lumbar spine. In this prospective feasibility study, nine volunteers underwent a 3-T magnetic resonance examination of the lumbar spine including 1) a conventional 3D T2-weighted (T2w) SPACE sequence with generalized autocalibrating partially parallel acquisition technique acceleration factor 2 and 2) a zoomed 3D T2w SPACE sequence with a reduced field of view (reduction factor 2). Images were evaluated with regard to image sharpness, signal homogeneity, and the presence of artifacts by two experienced radiologists. For quantitative analysis, signal-to-noise ratio (SNR) values were calculated. Image sharpness of anatomic structures was statistically significantly greater with zoomed SPACE (P < .0001), whereas the signal homogeneity was statistically significantly greater with conventional SPACE (cSPACE; P = .0003). There were no statistically significant differences in extent of artifacts. Acquisition times were 8:20 minutes for cSPACE and 6:30 minutes for zoomed SPACE. Readers 1 and 2 selected zoomed SPACE as the preferred sequence in five of nine cases. In two of nine cases, both sequences were rated as equally preferred by both readers. SNR values were statistically significantly greater with cSPACE. In comparison to the cSPACE sequence, zoomed SPACE imaging of the lumbar spine provides sharper images in conjunction with a 25% reduction in acquisition time. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  15. How the built environment affects change in older people's physical activity: A mixed- methods approach using longitudinal health survey data in urban China.

    PubMed

    Zhou, Peiling; Grady, Sue C; Chen, Guo

    2017-11-01

    Although the general population in China is physically active, only 45% of older adults meet the World Health Organization's recommendation for weekly moderate-to-vigorous exercise to achieve health benefits. This percentage is even lower (9.8%) in urban China. It is, therefore, important to understand the pathways by which physical activity behaviors are impacted by the built environment. This study utilized a mixed-methods approach: interviews (n = 42) and longitudinal (2010-2015) health survey data (n = 3094) for older people residing in three neighborhoods in Huainan, a mid-sized city in Anhui Province, central eastern China. First, a content analysis of interview data was used to identify individual and built environment factors (motivators and barriers) that impacted physical activity within older people's activity spaces. Second, a multilevel path analysis was conducted using the health survey data to demonstrate the pathways by which these motivators and barriers contributed to the initiation, regulation, and maintenance of physical activity. This study found (a) that the liveliness of an apartment building and its proximity to functional spaces (fast-food stores, farmer's markets, supermarkets, pharmacies, schools, hospitals, PA facilities and natural and man-made water bodies) were important factors in attracting sedentary older people to initiate physical activity; (b) that the social networks of apartment neighbors helped to initiate, regulate, and maintain physical activity; and (c) that housing proximity to functional spaces was important in maintaining physical activity, particularly for those older people with chronic diseases. To increase older people's overall physical activity, future interventions should focus on residential form and access to functional spaces, prior to investing in large-scale urban design interventions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Space, body, time and relationship experiences of recess physical activity: a qualitative case study among the least physical active schoolchildren.

    PubMed

    Pawlowski, Charlotte Skau; Andersen, Henriette Bondo; Tjørnhøj-Thomsen, Tine; Troelsen, Jens; Schipperijn, Jasper

    2016-01-06

    Increasing recess physical activity has been the aim of several interventions, as this setting can provide numerous physical activity opportunities. However, it is unclear if these interventions are equally effective for all children, or if they only appeal to children who are already physically active. This study was conducted to explore the least physically active children's "lived experiences" within four existential lifeworlds linked to physical activity during recess: space, body, time, and relations. The study builds on ethnographic fieldwork in a public school in Denmark using a combination of participatory photo interviews and participant observation. Thirty-seven grade five children (11-12 years old) were grouped in quartiles based on their objectively measured daily physical activity levels. Eight children in the lowest activity quartile (six girls) were selected to participate in the study. To avoid stigmatising and to make generalisations more reliable, we further recruited eight children from the two highest activity quartiles (four girls) to participate. An analysis of the least physically active children's "lived experiences" of space, body, time and relations revealed several key factors influencing their recess physical activity: perceived classroom safety, indoor cosiness, lack of attractive outdoor facilities, bodily dissatisfaction, bodily complaints, tiredness, feeling bored, and peer influence. We found that the four existential lifeworlds provided an in-depth understanding of the least physically active children's "lived experiences" of recess physical activity. Our findings imply that specific intervention strategies might be needed to increase the least physically active children's physical activity level.
For example, rethinking the classroom as a space for physical activity, designing schoolyards with smaller secluded spaces and varied facilities, improving children's self-esteem and body image, e.g., during physical education, and creating teacher organised play activities during recess.

  17. Development of a funding, cost, and spending model for satellite projects

    NASA Technical Reports Server (NTRS)

    Johnson, Jesse P.

    1989-01-01

    The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) predict the total costs of satellite projects. An effort was conducted to extend the modeling capability from total-budget analysis to analysis of the total budget together with budget outlays over time. A statistically based, data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model describing the historical spending patterns. The raw data consisted of dollars spent in each specific year and their 1989-dollar equivalents. These data were converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large-scale projects over extended periods of time is described by Life Cycle Cost Models (LCCM). The data were analyzed to find a model in the generic form of an LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. In order to use this model it is necessary to transform the problem from a dollar/time space to a percentage-of-total-budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule which relates the amount of money spent to the percentage of project completion.
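    The transformation described here, from dollars/time to percentage-of-budget/time, amounts to treating cumulative spending as a normalized Weibull CDF, so that per-year outlays fall out as differences of the cumulative curve. A minimal sketch under that assumption; the shape and scale values are illustrative, not the RAO model's fitted parameters:

```python
import math

def fraction_spent(t, shape, scale):
    """Weibull CDF: cumulative fraction of the budget spent by
    normalized project time t (t = 0 at start, t = 1 at completion)."""
    return 1.0 - math.exp(-((t / scale) ** shape))

def yearly_outlays(total_budget, years, shape, scale):
    """Per-year outlays from differences of the cumulative curve,
    renormalized so the project spends exactly total_budget."""
    grid = [fraction_spent(y / years, shape, scale) for y in range(years + 1)]
    end = grid[-1]
    return [total_budget * (b - a) / end for a, b in zip(grid, grid[1:])]

# Illustrative parameters only: a 5-year, $100M project.
outlays = yearly_outlays(100.0, years=5, shape=2.0, scale=0.5)
print([round(o, 1) for o in outlays])
```

    With these parameters the outlays ramp up, peak mid-project, and tail off, which is the qualitative spending pattern a Weibull-based LCCM is meant to capture.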

  18. Finding Bounded Rational Equilibria. Part 2; Alternative Lagrangians and Uncountable Move Spaces

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2004-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality characterizing all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. It has recently been shown that the same information theoretic mathematical structure, known as Probability Collectives (PC), underlies both issues. This relationship between statistical physics and game theory allows techniques and insights from one field to be applied to the other. In particular, PC provides a formal model-independent definition of the degree of rationality of a player and of bounded rationality equilibria. This pair of papers extends previous work on PC by introducing new computational approaches to effectively find bounded rationality equilibria of common-interest (team) games.

  19. Representation of the contextual statistical model by hyperbolic amplitudes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khrennikov, Andrei

    We continue the development of a so-called contextual statistical model (here context has the meaning of a complex of physical conditions). It is shown that, besides contexts producing the conventional trigonometric cos-interference, there exist contexts producing hyperbolic cosh-interference. Starting with the corresponding interference formula of total probability, we represent such contexts by hyperbolic probabilistic amplitudes or, in the abstract formalism, by normalized vectors of a hyperbolic analogue of Hilbert space. A hyperbolic Born's rule is obtained. Incompatible observables are represented by noncommutative operators. This paper can be considered a first step towards hyperbolic quantum probability. We also discuss possibilities of experimental verification of hyperbolic quantum mechanics: in the physics of elementary particles, in string theory, as well as in experiments with nonphysical systems, e.g., in psychology, cognitive sciences, and economics.

  20. Representation of the contextual statistical model by hyperbolic amplitudes

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2005-06-01

    We continue the development of a so-called contextual statistical model (here context has the meaning of a complex of physical conditions). It is shown that, besides contexts producing the conventional trigonometric cos-interference, there exist contexts producing hyperbolic cosh-interference. Starting with the corresponding interference formula of total probability, we represent such contexts by hyperbolic probabilistic amplitudes or, in the abstract formalism, by normalized vectors of a hyperbolic analogue of Hilbert space. A hyperbolic Born's rule is obtained. Incompatible observables are represented by noncommutative operators. This paper can be considered a first step towards hyperbolic quantum probability. We also discuss possibilities of experimental verification of hyperbolic quantum mechanics: in the physics of elementary particles, in string theory, as well as in experiments with nonphysical systems, e.g., in psychology, cognitive sciences, and economics.
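    The interference formula of total probability referred to in this record can be sketched as follows; the notation is ours, summarizing the abstract rather than quoting the paper:

```latex
% Formula of total probability with an interference term; the
% normalized interference coefficient \lambda determines the context type.
P(b) = P_1 + P_2 + 2\sqrt{P_1 P_2}\,\lambda,
\qquad
\lambda = \frac{P(b) - P_1 - P_2}{2\sqrt{P_1 P_2}} .

% Trigonometric contexts: |\lambda| \le 1, so \lambda = \cos\theta.
% Hyperbolic contexts:    |\lambda| >  1, so \lambda = \pm\cosh\theta.
```

    Contexts with |λ| ≤ 1 admit the usual complex-Hilbert-space representation, while |λ| > 1 forces the hyperbolic amplitudes described in the abstract.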

  1. Clinical Validation of the "Sedentary Lifestyle" Nursing Diagnosis in Secondary School Students.

    PubMed

    de Oliveira, Marcos Renato; da Silva, Viviane Martins; Guedes, Nirla Gomes; de Oliveira Lopes, Marcos Venícios

    2016-06-01

    This study clinically validated the nursing diagnosis of "sedentary lifestyle" (SL) among 564 Brazilian adolescents. Measures of diagnostic accuracy were calculated for defining characteristics, and Mantel-Haenszel analysis was used to identify related factors. The measures of diagnostic accuracy showed that the following defining characteristics were statistically significant: "average daily physical activity less than recommended for gender and age," "preference for activity low in physical activity," "nonengagement in leisure time physical activities," and "diminished respiratory capacity." An SL showed statistically significant associations with the following related factors: insufficient motivation for physical activity; insufficient interest in physical activity; insufficient resources for physical activity; insufficient social support for physical activity; attitudes, beliefs, and health habits that hinder physical activity; and insufficient confidence for practicing physical exercises. The study highlighted the four defining characteristics and six related factors for making decisions related to SL among adolescents. © The Author(s) 2015.

  2. Statistical prediction of space motion sickness

    NASA Technical Reports Server (NTRS)

    Reschke, Millard F.

    1990-01-01

    Studies designed to empirically examine the etiology of motion sickness and to develop a foundation for enhancing its prediction are discussed. Topics addressed include: early attempts to predict space motion sickness; a multiple-test database using provocative and vestibular function tests, and the database subjects; the reliability of provocative tests of motion sickness susceptibility; prediction of space motion sickness using linear discriminant analysis; and prediction of space motion sickness susceptibility using the logistic model.

  3. Space-filling designs for computer experiments: A review

    DOE PAGES

    Joseph, V. Roshan

    2016-01-29

    Improving the quality of a product/process using a computer simulator is a much less expensive option than real physical testing. However, simulation using computationally intensive computer models can be time-consuming, and therefore directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used to overcome this problem. This article reviews the experimental designs known as space-filling designs that are suitable for computer simulations. In the review, special emphasis is given to a recently developed space-filling design called the maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.

  4. Space-filling designs for computer experiments: A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, V. Roshan

    Improving the quality of a product/process using a computer simulator is a much less expensive option than real physical testing. However, simulation using computationally intensive computer models can be time-consuming, and therefore directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used to overcome this problem. This article reviews the experimental designs known as space-filling designs that are suitable for computer simulations. In the review, special emphasis is given to a recently developed space-filling design called the maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.

  5. Using Perturbed Physics Ensembles and Machine Learning to Select Parameters for Reducing Regional Biases in a Global Climate Model

    NASA Astrophysics Data System (ADS)

    Li, S.; Rupp, D. E.; Hawkins, L.; Mote, P.; McNeall, D. J.; Sarah, S.; Wallom, D.; Betts, R. A.

    2017-12-01

    This study investigates the potential to reduce known summer hot/dry biases over the Pacific Northwest in the UK Met Office's atmospheric model (HadAM3P) by simultaneously varying multiple model parameters. The bias-reduction process proceeds through a series of steps: 1) generation of a perturbed physics ensemble (PPE) through the volunteer computing network weather@home; 2) using machine learning to train "cheap" and fast statistical emulators of the climate model, to rule out regions of parameter space that lead to model variants that do not satisfy observational constraints, where the observational constraints (e.g., top-of-atmosphere energy flux, magnitude of the annual temperature cycle, summer/winter temperature and precipitation) are introduced sequentially; 3) designing a new PPE by "pre-filtering" using the emulator results. Steps 1) through 3) are repeated until results are considered satisfactory (3 times in our case). The process includes a sensitivity analysis to find dominant parameters for various model output metrics, which reduces the number of parameters to be perturbed with each new PPE. Relative to observational uncertainty, we achieve regional improvements without introducing large biases in other parts of the globe. Our results illustrate the potential of using machine learning to train cheap and fast statistical emulators of a climate model, in combination with PPEs, for systematic model improvement.
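    The emulator-based pre-filtering in steps 2) and 3) can be sketched with a toy linear emulator standing in for the study's machine-learning model; the parameter count, the synthetic "model output," and the constraint threshold below are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical PPE: 200 members, 3 perturbed parameters in [0, 1], and one
# output metric (e.g., a regional temperature bias) from a toy "model".
theta = rng.uniform(0.0, 1.0, (200, 3))
metric = 2.0 * theta[:, 0] - 1.5 * theta[:, 1] + 0.3 + rng.normal(0, 0.05, 200)

# Cheap emulator: linear least-squares fit of the metric vs. parameters.
X = np.column_stack([np.ones(len(theta)), theta])
coef, *_ = np.linalg.lstsq(X, metric, rcond=None)

# Pre-filter a large candidate set: keep only parameter vectors whose
# emulated metric satisfies the observational constraint |metric| < 0.2.
candidates = rng.uniform(0.0, 1.0, (10_000, 3))
pred = np.column_stack([np.ones(len(candidates)), candidates]) @ coef
keep = candidates[np.abs(pred) < 0.2]
print(f"{len(keep)} of {len(candidates)} candidates pass the constraint")
```

    Only the surviving candidates would be run through the expensive climate model in the next PPE, which is the point of the emulator: it is cheap enough to screen the whole parameter space.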

  6. Quantum Entanglement in Random Physical States

    NASA Astrophysics Data System (ADS)

    Hamma, Alioscia; Santra, Siddhartha; Zanardi, Paolo

    2012-07-01

    Most states in the Hilbert space are maximally entangled. This fact has proven useful to investigate—among other things—the foundations of statistical mechanics. Unfortunately, most states in the Hilbert space of a quantum many-body system are not physically accessible. We define physical ensembles of states acting on random factorized states by a circuit of length k of random and independent unitaries with local support. We study the typicality of entanglement by means of the purity of the reduced state. We find that for a time k=O(1), the typical purity obeys the area law. Thus, the upper bounds for area law are actually saturated, on average, with a variance that goes to zero for large systems. Similarly, we prove that by means of local evolution a subsystem of linear dimensions L is typically entangled with a volume law when the time scales with the size of the subsystem. Moreover, we show that for large values of k the reduced state becomes very close to the completely mixed state.
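The purity diagnostic used above, Tr(ρ_A²) of the reduced state, is easy to compute numerically for small systems. The sketch below (a numerical illustration, not the paper's random-circuit construction) contrasts a factorized state, whose reduced purity is exactly 1, with a Haar-random global state, whose typical reduced purity is near (d_A + d_B)/(d_A d_B + 1):

```python
import numpy as np

rng = np.random.default_rng(1)

def reduced_purity(psi, dA, dB):
    """Purity Tr(rho_A^2) of subsystem A for a pure state psi on H_A (x) H_B."""
    M = psi.reshape(dA, dB)            # coefficient matrix of the bipartition
    rho_A = M @ M.conj().T             # partial trace over B
    return float(np.real(np.trace(rho_A @ rho_A)))

dA = dB = 8                            # two 3-qubit blocks

# Factorized (unentangled) state: reduced purity is exactly 1.
a = rng.normal(size=dA) + 1j * rng.normal(size=dA); a /= np.linalg.norm(a)
b = rng.normal(size=dB) + 1j * rng.normal(size=dB); b /= np.linalg.norm(b)
p_prod = reduced_purity(np.kron(a, b), dA, dB)

# Haar-random global state: reduced purity far below 1 (near-maximal entanglement).
psi = rng.normal(size=dA * dB) + 1j * rng.normal(size=dA * dB)
psi /= np.linalg.norm(psi)
p_rand = reduced_purity(psi, dA, dB)

print(p_prod, p_rand)
```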

  7. Exploring parameter space effects on structure-property relationships of surfactants at liquid-liquid interfaces.

    PubMed

    Emborsky, Christopher P; Cox, Kenneth R; Chapman, Walter G

    2011-08-28

    The ubiquitous use of surfactants in commercial and industrial applications has led to many experimental, theoretical, and simulation-based studies. These efforts seek to provide a molecular-level understanding of the effects on structuring behavior and the corresponding impacts on observable properties (e.g., interfacial tension). With such physical detail, targeted system design can improve on the typical techniques of observational trends and phenomenological correlations by taking advantage of predicted system response. This research provides a systematic study of part of the broad parameter space's effects on the equilibrium microstructure and interfacial properties of amphiphiles at a liquid-liquid interface, using the interfacial statistical associating fluid theory density functional theory as a molecular model for the system from the bulk to the interface. Insights into the molecular-level physics and thermodynamics governing the system behavior are discussed as they relate both to predictions qualitatively consistent with experimental observations and to extensions beyond currently available studies. © 2011 American Institute of Physics.

  8. Preface: Solar energetic particles, solar modulation and space radiation: New opportunities in the AMS-02 Era

    NASA Astrophysics Data System (ADS)

    Bindi, Veronica

    2017-08-01

    Solar Energetic Particle (SEP) acceleration at high energies and their propagation through the heliosphere and into the magnetosphere are not well understood and are still a matter of debate. Our understanding of solar modulation and of the transport of different species of galactic cosmic rays (GCR) inside the heliosphere has improved significantly; however, a lot of work still needs to be done. GCR and SEPs pose a significant radiation risk for people and technology in space, and thus it is becoming increasingly important to understand the space radiation environment. AMS-02 will provide brand new information with unprecedented statistics about GCR and SEPs. Both GCR and heliophysics experiments will contribute to an increased understanding of acceleration physics and of the transport of particles in space with improved models. This will inevitably lead to better predictions of space weather and safer operations in space.

  9. Near Surface Investigation of Agricultural Soils using a Multi-Frequency Electromagnetic Sensor

    NASA Astrophysics Data System (ADS)

    Sadatcharam, K.; Unc, A.; Krishnapillai, M.; Cheema, M.; Galagedara, L.

    2017-12-01

    Electromagnetic induction (EMI) sensors have been used as precision agriculture tools for decades. They measure the spatiotemporal variability of soil properties and soil stratification in terms of apparent electrical conductivity (ECa). We mapped ECa variability with the horizontal coplanar (HCP) and vertical coplanar (VCP) orientations of a multi-frequency EMI sensor and examined its relationship with soil physical properties. A broadband, multi-frequency handheld EMI sensor (GEM-2) was used on a loamy sand soil cultivated with silage corn in western Newfoundland, Canada. Three frequency ranges (weak, low, and high), with log- and line-spaced frequencies based on the factory calibration, were tested using the HCP and VCP orientations to produce spatiotemporal ECa data. In parallel, we acquired data on soil moisture content, texture, and bulk density, and then assessed the statistical significance of the relationship between ECa and soil physical properties. The test site had three areas of distinct soil properties corresponding, in particular, to elevation. The same spatial variability was also identified by ECa mapping at different frequencies and with the two coil orientations. Data analysis suggested that the high-range frequencies (38 kHz, log-spaced, and 49 kHz, line-spaced) for both HCP and VCP orientations produced more accurate ECa maps than the weak- and low-range frequencies tested. Furthermore, results revealed that the combined effects of soil texture, moisture content, and bulk density affect ECa measurements obtained at both frequencies and both coil orientations. Keywords: Apparent electrical conductivity, Electromagnetic induction, Horizontal coplanar, Soil properties, Vertical coplanar
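The significance assessment mentioned above amounts to correlating ECa against each soil property. A minimal sketch with synthetic data (the field values and the linear moisture-ECa link here are invented for illustration, not the study's measurements):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical survey: ECa readings and volumetric moisture content at 60 points.
moisture = rng.uniform(0.10, 0.35, 60)               # m^3/m^3
eca = 40.0 * moisture + rng.normal(0.0, 1.0, 60)     # mS/m, synthetic linear link

# Pearson correlation and its two-sided significance.
r, p = stats.pearsonr(eca, moisture)
print(f"r = {r:.2f}, p = {p:.3g}")
```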

  10. Spatial Analysis in Determining Physical Factors of Pedestrian Space Livability, Case Study: Pedestrian Space on Jalan Kemasan, Yogyakarta

    NASA Astrophysics Data System (ADS)

    Fauzi, A. F.; Aditianata, A.

    2018-02-01

    The existence of the street as a place for various human activities has become an important issue. In recent decades, cars and motorcycles have come to dominate streets in cities around the world. On the other hand, human activity on the street is a determinant of city livability. Previous research has pointed out that if there is plenty of human activity in the street, the city will be interesting; if the street has no activity, the city will be boring. Learning from that observation, cities around the world are now developing the concept of livable streets. Livable streets are characterized by the diversity of human activities conducted in the street's pedestrian space. In Yogyakarta, one street that shows such a diversity of human activities is Jalan Kemasan. This study attempts to determine the physical factors of pedestrian space affecting livability in Jalan Kemasan, Yogyakarta, through spatial analysis. The spatial analysis was performed with an overlay technique between the livable point (activity diversity) distribution map and the variable distribution maps. The physical pedestrian space variables included shading elements, street vendors, building setback, seat location, the divider between street and pedestrian way, and mixed-use building function. The more diverse the activity associated with a variable, the more strongly that variable is taken to affect livability relative to the others. The overlay result was then strengthened by field observation to qualitatively confirm the deduction. In the end, this research provides valuable input for planning streets and pedestrian spaces that are comfortable for human activities.

  11. New advanced tools for combined ULF wave analysis of multipoint space-borne and ground observations: application to single event and statistical studies

    NASA Astrophysics Data System (ADS)

    Balasis, G.; Papadimitriou, C.; Daglis, I. A.; Georgiou, M.; Giamini, S. A.

    2013-12-01

    In the past decade, a critical mass of high-quality scientific data on the electric and magnetic fields in the Earth's magnetosphere and topside ionosphere has been progressively collected. This data pool will be further enriched by the measurements of the upcoming ESA/Swarm mission, a constellation of three satellites in three different polar orbits between 400 and 550 km altitude, which is expected to be launched in November 2013. New analysis tools that can cope with measurements of various spacecraft at various regions of the magnetosphere and in the topside ionosphere as well as ground stations will effectively enhance the scientific exploitation of the accumulated data. Here, we report on a new suite of algorithms based on a combination of wavelet spectral methods and artificial neural network techniques and demonstrate the applicability of our recently developed analysis tools both for individual case studies and statistical studies of ultra-low frequency (ULF) waves. First, we provide evidence for a rare simultaneous observation of a ULF wave event in the Earth's magnetosphere, topside ionosphere and surface: we have found a specific time interval during the Halloween 2003 magnetic storm, when the Cluster and CHAMP spacecraft were in good local time (LT) conjunction, and have examined the ULF wave activity in the Pc3 (22-100 mHz) and Pc4-5 (1-22 mHz) bands using data from the Geotail, Cluster and CHAMP missions, as well as the CARISMA and GIMA magnetometer networks. 
Then, we perform a statistical study of Pc3 wave events observed by CHAMP over the full decade (2001-2010) of satellite vector magnetic data: the creation of a database of such events enabled us to derive valuable statistics for many important physical properties relating to the spatio-temporal location of these waves, the wave power and frequency, as well as other parameters and their correlation with solar wind conditions, magnetospheric indices, electron density data, ring current decay and radiation belt enhancements. The work leading to this paper has received funding from the European Union's Seventh Framework Programme (FP7-SPACE-2011-1) under grant agreement no. 284520 for the MAARBLE (Monitoring, Analyzing and Assessing Radiation Belt Energization and Loss) collaborative research project.
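As a small illustration of the band definitions used above, the sketch below generates a synthetic magnetometer trace containing a 40 mHz Pc3 pulsation and locates its spectral peak inside the 22-100 mHz band. The signal is invented, and the wavelet and neural-network machinery of the actual tool suite is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 1.0                                   # 1 Hz sampling, typical of ground magnetometers
t = np.arange(4096) / fs

# Synthetic trace: a 40 mHz Pc3 pulsation buried in white noise.
b = 2.0 * np.sin(2 * np.pi * 0.040 * t) + rng.normal(0.0, 1.0, t.size)

# One-sided power spectrum via FFT.
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
power = np.abs(np.fft.rfft(b))**2

# Locate the dominant frequency inside the Pc3 band (22-100 mHz).
pc3 = (freqs >= 0.022) & (freqs <= 0.100)
peak_freq = freqs[pc3][np.argmax(power[pc3])]
print(peak_freq)                           # close to 0.040 Hz
```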

  12. The extent of visual space inferred from perspective angles

    PubMed Central

    Erkelens, Casper J.

    2015-01-01

    Retinal images are perspective projections of the visual environment. Perspective projections do not explain why we perceive perspective in 3-D space. Analysis of underlying spatial transformations shows that visual space is a perspective transformation of physical space if parallel lines in physical space vanish at finite distance in visual space. Perspective angles, i.e., the angle perceived between parallel lines in physical space, were estimated for rails of a straight railway track. Perspective angles were also estimated from pictures taken from the same point of view. Perspective angles between rails ranged from 27% to 83% of their angular size in the retinal image. Perspective angles prescribe the distance of vanishing points of visual space. All computed distances were shorter than 6 m. The shallow depth of a hypothetical space inferred from perspective angles does not match the depth of visual space, as it is perceived. Incongruity between the perceived shape of a railway line on the one hand and the experienced ratio between width and length of the line on the other hand is huge, but apparently so unobtrusive that it has remained unnoticed. The incompatibility between perspective angles and perceived distances casts doubt on evidence for a curved visual space that has been presented in the literature and was obtained from combining judgments of distances and angles with physical positions. PMID:26034567

  13. Cyber threat impact assessment and analysis for space vehicle architectures

    NASA Astrophysics Data System (ADS)

    McGraw, Robert M.; Fowler, Mark J.; Umphress, David; MacDonald, Richard A.

    2014-06-01

    This paper covers research into an assessment of potential impacts and of techniques to detect and mitigate cyber attacks that affect the networks and control systems of space vehicles. Such systems, if subverted by malicious insiders, external hackers, and/or supply chain threats, can be controlled in a manner that causes physical damage to the space platforms. Comparable attacks on Earth-based cyber-physical systems include the Shamoon, Duqu, Flame, and Stuxnet exploits, which have been used to bring down foreign power generation and refining systems. This paper discusses the potential impacts of similar cyber attacks on space-based platforms through the use of simulation models, including custom models developed in Python using SimPy and commercial SATCOM analysis tools such as STK/SOLIS. The paper discusses the architecture and fidelity of the simulation model developed for performing the impact assessment, and walks through the application of an attack vector at the subsystem level and how it affects the control and orientation of the space vehicle. SimPy is used to model and extract raw impact data at the bus level, while STK/SOLIS is used to extract raw impact data at the subsystem level and to visually display the effect on the physical plant of the space vehicle.

  14. Comment on “Two statistics for evaluating parameter identifiability and error reduction” by John Doherty and Randall J. Hunt

    USGS Publications Warehouse

    Hill, Mary C.

    2010-01-01

    Doherty and Hunt (2009) present important ideas for first-order second-moment sensitivity analysis, but five issues are discussed in this comment. First, considering the composite scaled sensitivity (CSS) jointly with parameter correlation coefficients (PCC) in a CSS/PCC analysis addresses the difficulties with CSS mentioned in the introduction. Second, their new parameter identifiability statistic is actually likely to do a poor job of evaluating parameter identifiability in common situations. The statistic instead performs the very useful role of showing how model parameters are included in the estimated singular value decomposition (SVD) parameters; its close relation to CSS is shown. Third, the idea from p. 125 that a suitable truncation point for SVD parameters can be identified using the prediction variance is challenged using results from Moore and Doherty (2005). Fourth, the relative error reduction statistic of Doherty and Hunt is shown to belong to an emerging set of statistics here named perturbed calculated variance statistics. Finally, the perturbed calculated variance statistics OPR and PPR mentioned on p. 121 are shown to explicitly include the parameter null-space component of uncertainty. Indeed, OPR and PPR results that account for null-space uncertainty have appeared in the literature since 2000.
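For readers unfamiliar with the statistics being debated, a minimal sketch of CSS and PCC for a hypothetical two-parameter model follows. The model, observation times, and values are invented; the definitions used (scaled sensitivities averaged over observations, and correlations from the inverse of the weighted normal matrix) follow the usual formulations rather than any code from the comment itself:

```python
import numpy as np

# Hypothetical two-parameter model y = b0 * exp(-b1 * x), observed at five times.
x = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
b = np.array([10.0, 0.3])
w = np.ones_like(x)                        # observation weights (1 / sigma_i^2)

# Analytic Jacobian dy/db.
J = np.column_stack([np.exp(-b[1] * x), -b[0] * x * np.exp(-b[1] * x)])

# Composite scaled sensitivity: RMS over observations of the scaled sensitivities
# (each column scaled by |b_j| and sqrt(w_i)).
dss = J * np.abs(b)[None, :] * np.sqrt(w)[:, None]
css = np.sqrt(np.mean(dss**2, axis=0))
print(css)

# Parameter correlation coefficient from the inverse of the weighted normal matrix.
cov = np.linalg.inv(J.T @ (w[:, None] * J))
pcc = cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])
print(pcc)
```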

  15. Novel Image Encryption Scheme Based on Chebyshev Polynomial and Duffing Map

    PubMed Central

    2014-01-01

    We present a novel image encryption algorithm using a Chebyshev polynomial based on permutation and substitution and a Duffing map based on substitution. Comprehensive security analysis has been performed on the designed scheme using key space analysis, visual testing, histogram analysis, information entropy calculation, correlation coefficient analysis, differential analysis, key sensitivity test, and speed test. The study demonstrates that the proposed image encryption algorithm shows the advantages of a key space of more than 10^113 and a desirable level of security based on the good statistical results and theoretical arguments. PMID:25143970
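As a hedged illustration of the permutation stage of such chaos-based ciphers (not the paper's full Chebyshev/Duffing scheme, which also includes substitution stages), one can drive a pixel permutation with an orbit of the Chebyshev map T_k(x) = cos(k arccos x); the key values below are arbitrary examples:

```python
import numpy as np

def chebyshev_sequence(x0, k, n):
    """Chaotic orbit of the Chebyshev map T_k(x) = cos(k * arccos(x)) on [-1, 1]."""
    seq = np.empty(n)
    x = x0
    for i in range(n):
        x = np.cos(k * np.arccos(x))
        seq[i] = x
    return seq

def permute(data, key_x0=0.3, key_k=4.0):
    """Permutation stage only: scramble pixel order with a key-driven chaotic sequence."""
    order = np.argsort(chebyshev_sequence(key_x0, key_k, data.size))
    return data[order], order

def unpermute(scrambled, order):
    out = np.empty_like(scrambled)
    out[order] = scrambled
    return out

image = np.arange(16, dtype=np.uint8)          # stand-in for flattened pixels
scrambled, order = permute(image)
restored = unpermute(scrambled, order)
print(np.array_equal(restored, image))         # True
```

Sensitivity to the key (x0, k) comes from the chaotic dependence of the orbit, and hence of the permutation, on the initial condition.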

  16. Spectral enstrophy budget in a shear-less flow with turbulent/non-turbulent interface

    NASA Astrophysics Data System (ADS)

    Cimarelli, Andrea; Cocconi, Giacomo; Frohnapfel, Bettina; De Angelis, Elisabetta

    2015-12-01

    A numerical analysis of the interaction between decaying shear-free turbulence and quiescent fluid is performed by means of global statistical budgets of enstrophy, both at the single-point and two-point levels. The single-point enstrophy budget allows us to recognize three physically relevant layers: a bulk turbulent region, an inhomogeneous turbulent layer, and an interfacial layer. Within these layers, enstrophy is produced, transferred, and finally destroyed while leading to a propagation of the turbulent front. These processes depend not only on the position in the flow field but are also strongly scale dependent. In order to tackle this multi-dimensional behaviour of enstrophy in the space of scales and in physical space, we analyse the spectral enstrophy budget equation. The picture consists of an inviscid spatial cascade of enstrophy from large to small scales parallel to the interface, moving towards the interface. At the interface, this phenomenon breaks down, leaving place to an anisotropic cascade in which large-scale structures exhibit a cascade process only normal to the interface, thus reducing their thickness while retaining their lengths parallel to the interface. The observed behaviour could be relevant for both the theoretical and the modelling approaches to flows with interacting turbulent/non-turbulent regions. The scale properties of the turbulent propagation mechanisms highlight that the inviscid turbulent transport is a large-scale phenomenon. On the contrary, the viscous diffusion, commonly associated with small-scale mechanisms, involves a much richer physics: small lengths normal to the interface but, at the same time, large scales parallel to the interface.
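Enstrophy itself is straightforward to compute from a velocity field with spectral derivatives, the natural tool behind the spectral budgets discussed above. A minimal sketch on a Taylor-Green-like field (chosen here because its vorticity, 2 sin x sin y, and mean-square vorticity, 1, are known analytically), not the DNS of the study:

```python
import numpy as np

# Periodic 2-D velocity field (Taylor-Green-like) on [0, 2*pi)^2.
n = 64
x = np.arange(n) * 2 * np.pi / n
X, Y = np.meshgrid(x, x, indexing="ij")
u = np.sin(X) * np.cos(Y)
v = -np.cos(X) * np.sin(Y)

# Spectral derivatives: d/dx in Fourier space is multiplication by i*k.
k = np.fft.fftfreq(n, d=2 * np.pi / n) * 2 * np.pi
KX, KY = np.meshgrid(k, k, indexing="ij")
dvdx = np.real(np.fft.ifft2(1j * KX * np.fft.fft2(v)))
dudy = np.real(np.fft.ifft2(1j * KY * np.fft.fft2(u)))

omega = dvdx - dudy                    # vorticity; analytically 2*sin(x)*sin(y)
enstrophy = np.mean(omega**2)          # analytically <omega^2> = 1
print(enstrophy)
```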

  17. High Energy Astrophysics and Cosmology from Space: NASA's Physics of the Cosmos Program

    NASA Astrophysics Data System (ADS)

    Bautz, Marshall

    2017-01-01

    We summarize currently funded NASA activities in high energy astrophysics and cosmology embodied in the NASA Physics of the Cosmos program, including updates on technology development and mission studies. The portfolio includes participation in a space mission to measure gravitational waves from a variety of astrophysical sources, including binary black holes, throughout most of cosmic history, and in another to map the evolution of black hole accretion by means of the accompanying X-ray emission. These missions are envisioned as collaborations with the European Space Agency's Large 3 (L3) and Athena programs, respectively. The portfolio also features definition of a large, NASA-led X-ray observatory capable of tracing the surprisingly rapid growth of supermassive black holes during the first billion years of cosmic history. The program also includes the study of cosmic rays and high-energy gamma-ray photons resulting from a range of physical processes, and efforts to characterize both the physics of inflation associated with the birth of the universe and the nature of the dark energy that dominates its mass-energy content today. Finally, we describe the activities of the Physics of the Cosmos Program Analysis Group, which serves as a forum for community analysis and input to NASA.

  18. G14A-06: Analysis of the DORIS, GNSS, SLR, VLBI and Gravimetric Time Series at the GGOS Core Sites

    NASA Technical Reports Server (NTRS)

    Moreaux, G.; Lemoine, F.; Luceri, V.; Pavlis, E.; MacMillan, D.; Bonvalot, S.; Saunier, J.

    2017-01-01

    We analyze the time series at the 3-4 multi-technique GGOS core sites to compare the spectral content of the space geodetic and gravity time series, and we evaluate the level of agreement between the space geodesy measurements and the physical tie vectors.

  19. Modernizing Earth and Space Science Modeling Workflows in the Big Data Era

    NASA Astrophysics Data System (ADS)

    Kinter, J. L.; Feigelson, E.; Walker, R. J.; Tino, C.

    2017-12-01

    Modeling is a major aspect of Earth and space science research. The development of numerical models of the Earth system, planetary systems, or astrophysical systems is essential to linking theory with observations. Optimal use of observations that are quite expensive to obtain and maintain typically requires data assimilation involving numerical models. In the Earth sciences, models of the physical climate system are typically used for data assimilation, climate projection, and interdisciplinary research, spanning applications from the analysis of multi-sensor data sets to decision-making in climate-sensitive sectors, with applications to ecosystems, hazards, and various biogeochemical processes. In space physics, most models are built from first principles, require considerable expertise to run, and are frequently modified significantly for each case study. The volume and variety of model output data from modeling Earth and space systems are rapidly increasing and have reached a scale where human interaction with the data is prohibitively inefficient. A major barrier to progress is that modeling workflows are not treated by practitioners as a design problem. Existing workflows have been created by a slow accretion of software, typically based on undocumented, inflexible scripts haphazardly modified by a succession of scientists and students not trained in modern software engineering methods. As a result, existing modeling workflows suffer from an inability to onboard new datasets into models, an inability to keep pace with accelerating data production rates, and irreproducibility, among other problems. These factors are creating an untenable situation for those conducting and supporting Earth system and space science. Improving modeling workflows requires investments in hardware, software, and human resources. 
This paper describes the critical path issues that must be targeted to accelerate modeling workflows, including script modularization, parallelization, and automation in the near term, and longer term investments in virtualized environments for improved scalability, tolerance for lossy data compression, novel data-centric memory and storage technologies, and tools for peer reviewing, preserving and sharing workflows, as well as fundamental statistical and machine learning algorithms.

  20. Physical Attacks: An Analysis of Teacher Characteristics Using the Schools and Staffing Survey

    ERIC Educational Resources Information Center

    Williams, Thomas O., Jr.; Ernst, Jeremy V.

    2016-01-01

    This study investigated physical attacks as reported by public school teachers on the most recent Schools and Staffing Survey (SASS) from the National Center for Education Statistics administered by the Institute of Educational Sciences. For this study, characteristics of teachers who responded affirmatively to having been physically attacked in…

  1. Status Quo and Outlook of the Studies of Entrepreneurship Education in China: Statistics and Analysis Based on Papers Indexed in CSSCI (2004-2013)

    ERIC Educational Resources Information Center

    Xia, Tian; Shumin, Zhang; Yifeng, Wu

    2016-01-01

    We utilized cross tabulation statistics, word frequency counts, and content analysis of research output to conduct a bibliometric study, and used CiteSpace software to depict a knowledge map for research on entrepreneurship education in China from 2004 to 2013. The study shows that, in this duration, the study of Chinese entrepreneurship education…

  2. Weighting Statistical Inputs for Data Used to Support Effective Decision Making During Severe Emergency Weather and Environmental Events

    NASA Technical Reports Server (NTRS)

    Gardner, Adrian

    2010-01-01

    National Aeronautics and Space Administration (NASA) weather and atmospheric environmental organizations are insatiable consumers of geophysical, hydrometeorological, and solar weather statistics. The expanding array of networked sensors producing targeted physical measurements has generated an almost factorial explosion of near-real-time inputs to topical statistical datasets. Normalizing and value-based parsing of such statistical datasets in support of time-constrained weather and environmental alerts and warnings is essential, even with dedicated high-performance computational capabilities. What are the optimal indicators for advanced decision making? How do we recognize the line between sufficient statistical sampling and excessive, mission-destructive sampling? How do we ensure that the normalization and parsing process, when interpolated through numerical models, yields accurate and actionable alerts and warnings? This presentation will address the integrated means and methods to achieve the desired outputs for NASA and consumers of its data.

  3. Forecasts of non-Gaussian parameter spaces using Box-Cox transformations

    NASA Astrophysics Data System (ADS)

    Joachimi, B.; Taylor, A. N.

    2011-09-01

    Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, and the Fisher matrix calculation is performed on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, at marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic nonlinear degeneracy between the matter density parameter and the normalization of matter density fluctuations is reproduced for several cases, and the capability of weak-lensing three-point statistics to break this degeneracy is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
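The core Gaussianization step is available off the shelf. A minimal sketch, using a lognormal stand-in for a skewed posterior rather than the paper's lensing likelihood: the maximum-likelihood Box-Cox transform drives the sample skewness toward zero, after which a Gaussian (Fisher-matrix) description becomes appropriate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# A strongly skewed "posterior" sample, e.g. lognormal in some parameter.
samples = rng.lognormal(mean=0.0, sigma=0.8, size=5000)

# Box-Cox transform y = (x^lambda - 1) / lambda, with lambda fit by maximum likelihood.
transformed, lam = stats.boxcox(samples)

skew_before, skew_after = stats.skew(samples), stats.skew(transformed)
print(skew_before, skew_after)         # skewness collapses toward zero
```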

  4. Arrows as anchors: An analysis of the material features of electric field vector arrows

    NASA Astrophysics Data System (ADS)

    Gire, Elizabeth; Price, Edward

    2014-12-01

    Representations in physics possess both physical and conceptual aspects that are fundamentally intertwined and can interact to support or hinder sense making and computation. We use distributed cognition and the theory of conceptual blending with material anchors to interpret the roles of conceptual and material features of representations in students' use of representations for computation. We focus on the vector-arrows representation of electric fields and describe this representation as a conceptual blend of electric field concepts, physical space, and the material features of the representation (i.e., the physical writing and the surface upon which it is drawn). In this representation, spatial extent (e.g., distance on paper) is used to represent both distances in coordinate space and magnitudes of electric field vectors. In conceptual blending theory, this conflation is described as a clash between the input spaces in the blend. We explore the benefits and drawbacks of this clash, as well as other features of this representation. This analysis is illustrated with examples from clinical problem-solving interviews with upper-division physics majors. We see that while these intermediate physics students make a variety of errors using this representation, they also use the geometric features of the representation to add electric field contributions and to organize the problem situation productively.
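The computation that the arrow representation externalizes is Coulomb superposition, E = Σ k q_i r̂_i / r_i². A minimal sketch for a hypothetical dipole (charge values and positions invented for illustration); the resulting vectors are exactly what the arrows in the representation encode in spatial extent and direction:

```python
import numpy as np

K = 8.99e9                             # Coulomb constant, N m^2 / C^2

def e_field(charges, positions, point):
    """Net electric field at `point` from point charges, by superposition."""
    E = np.zeros(2)
    for q, pos in zip(charges, positions):
        r = point - pos
        E += K * q * r / np.linalg.norm(r)**3   # k q r_hat / |r|^2
    return E

# Equal and opposite charges (a dipole) straddling the origin.
charges = [1e-9, -1e-9]                              # coulombs
positions = [np.array([-0.1, 0.0]), np.array([0.1, 0.0])]

E = e_field(charges, positions, np.array([0.0, 0.0]))
print(E)    # points from the + charge toward the - charge, along +x
```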

  5. Estimating the size of the solution space of metabolic networks

    PubMed Central

    Braunstein, Alfredo; Mulet, Roberto; Pagnani, Andrea

    2008-01-01

    Background: Cellular metabolism is one of the most investigated systems of biological interactions. While the topological nature of individual reactions and pathways in the network is quite well understood, there is still a lack of comprehension regarding the global functional behavior of the system. In the last few years flux-balance analysis (FBA) has been the most successful and widely used technique for studying metabolism at the system level. This method strongly relies on the hypothesis that the organism maximizes an objective function. However, only under very specific biological conditions (e.g., maximization of biomass for E. coli in rich nutrient medium) does the cell seem to obey such an optimization law. A more refined analysis not assuming extremization remains an elusive task for large metabolic systems due to algorithmic limitations. Results: In this work we propose a novel algorithmic strategy that provides an efficient characterization of the whole set of stable fluxes compatible with the metabolic constraints. Using a technique derived from the fields of statistical physics and information theory, we designed a message-passing algorithm to estimate the size of the affine space containing all possible steady-state flux distributions of metabolic networks. The algorithm, based on the well-known Bethe approximation, can be used to approximately compute the volume of a non-full-dimensional convex polytope in high dimensions. We first compare the accuracy of the predictions with an exact algorithm on small random metabolic networks. We also verify that the predictions of the algorithm match closely those of Monte Carlo based methods in the case of the Red Blood Cell metabolic network. Then we test the effect of gene knock-outs on the size of the solution space in the case of E. coli central metabolism. Finally, we analyze the statistical properties of the average fluxes of the reactions in the E. coli metabolic network. 
Conclusion: We propose a novel, efficient, distributed algorithmic strategy to estimate the size and shape of the affine space of a non-full-dimensional convex polytope in high dimensions. The method is shown to obtain results quantitatively and qualitatively compatible with those of standard algorithms (where this comparison is possible) while remaining efficient on the analysis of large biological systems, where exact deterministic methods experience an explosion in algorithmic time. The algorithm we propose can be considered as an alternative to Monte Carlo sampling methods. PMID:18489757
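The object whose volume the algorithm estimates lives in the null space of the stoichiometric matrix: steady state requires S v = 0 plus flux bounds. A minimal sketch of that underlying linear algebra for a toy two-metabolite, four-reaction network (the message-passing volume estimation itself is not reproduced here):

```python
import numpy as np

# Toy stoichiometric matrix S (metabolites x reactions): steady state requires S v = 0.
S = np.array([
    [1, -1,  0,  0],    # metabolite A: produced by r1, consumed by r2
    [0,  1, -1, -1],    # metabolite B: produced by r2, consumed by r3 and r4
], dtype=float)

# The affine solution space of S v = 0 is spanned by the null space of S.
_, sing, Vt = np.linalg.svd(S)
null_dim = S.shape[1] - int(np.sum(sing > 1e-10))
basis = Vt[-null_dim:].T               # columns span all steady-state flux modes

print(null_dim)                        # number of free flux directions
print(np.allclose(S @ basis, 0.0))     # True: every basis column is a steady state
```

Intersecting this null space with the box of flux bounds yields the convex polytope whose size the message-passing algorithm approximates.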

  6. Criteria for Public Open Space Enhancement to Achieve Social Interaction: a Review Paper

    NASA Astrophysics Data System (ADS)

    Salih, S. A.; Ismail, S.

    2017-12-01

    This paper reviews various literature, studies, transcripts, and papers to provide an overview of theories and existing research on the significance of natural environments and green open spaces for social interaction and outdoor recreation. The main objective is to identify the factors affecting social interaction in green open spaces, by showing that appropriate open spaces are important for enhancing social interaction and community. This study employs a (qualitative) summarizing content analysis method, focused on collecting and summarizing documentation such as transcripts, articles, papers, and books from more than 25 sources regarding the importance of public open spaces for the community. The summarizing content analysis is the foundation for a qualitatively oriented procedure of text interpretation used to analyse the information gathered. The results confirm that sound social interaction needs an appropriate physical space, with criteria including design, activities, access and linkage, administration and maintenance, place attachment, and users' characteristics; previous studies in this area have also taken a health perspective, with measures of the physical activity afforded by open spaces in general.

  7. System Learning via Exploratory Data Analysis: Seeing Both the Forest and the Trees

    NASA Astrophysics Data System (ADS)

    Habash Krause, L.

    2014-12-01

As the amount of observational Earth and Space Science data grows, so does the need for learning and employing data analysis techniques that can extract meaningful information from those data. Space-based and ground-based data sources from all over the world are used to inform Earth and Space environment models. However, with such a large amount of data comes a need to organize those data so that trends within them are easily discernible. This can be tricky due to interactions between physical processes that lead to partial correlation of variables or multiple interacting sources of causality. With the suite of Exploratory Data Analysis (EDA) data mining codes available at MSFC, we have the capability to analyze large, complex data sets and quantitatively distinguish fundamentally independent effects from consequential or derived effects. We have used these techniques to examine the accuracy of ionospheric climate models with respect to trends in ionospheric parameters and space weather effects. In particular, these codes have been used to 1) provide summary "at-a-glance" surveys of large data sets through categorization and/or evolution over time to identify trends, distribution shapes, and outliers; 2) discern the underlying "latent" variables which share common sources of causality; and 3) establish a new set of basis vectors by computing Empirical Orthogonal Functions (EOFs), which represent the maximum amount of variance for each principal component. Some of these techniques are easily implemented in the classroom using standard MATLAB functions; some of the more advanced applications require the Statistics Toolbox, and applications to unique situations require more sophisticated levels of programming. This paper will present an overview of the range of tools available and how they might be used for a variety of time series Earth and Space Science data sets. Examples of feature recognition from both 1D and 2D (e.g. imagery) time series data sets will be presented.
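The EOF computation mentioned in item 3 can be sketched in a few lines. The example below is a generic, stdlib-only illustration (the MSFC codes and MATLAB toolbox functions are not reproduced here), using power iteration on the covariance matrix to extract the leading EOF of a toy data matrix:

```python
# Sketch: leading Empirical Orthogonal Function (EOF) of a small
# time-by-space data matrix via power iteration, stdlib-only.
import math

def leading_eof(data):
    """data: list of time samples, each a list of spatial values."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    anom = [[row[j] - means[j] for j in range(p)] for row in data]
    # Covariance matrix C = X^T X / (n - 1) of the anomalies
    cov = [[sum(anom[t][i] * anom[t][j] for t in range(n)) / (n - 1)
            for j in range(p)] for i in range(p)]
    # Power iteration for the dominant eigenvector (the leading EOF)
    v = [1.0] * p
    for _ in range(200):
        w = [sum(cov[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    eigval = sum(v[i] * sum(cov[i][j] * v[j] for j in range(p)) for i in range(p))
    explained = eigval / sum(cov[i][i] for i in range(p))  # variance fraction
    return v, explained

# Two spatial points varying almost in phase: one EOF should dominate.
series = [[1.0, 2.1], [2.0, 4.0], [3.0, 6.2], [4.0, 7.9], [5.0, 10.1]]
eof1, frac = leading_eof(series)
print(frac)  # close to 1.0 for this nearly rank-one data
```

A real analysis would compute all EOFs at once via an SVD of the anomaly matrix; power iteration is used here only to keep the sketch dependency-free.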

  8. The Relationship Between Sediment Properties and Sedimentation Patterns on a Macrotidal Gravel Beach over a Semi-lunar Tidal Cycle.

    NASA Astrophysics Data System (ADS)

    Buscombe, D.; Masselink, G.

    2007-12-01

Detailed measurements of profile and sediment dynamics have been obtained from a macrotidal gravel barrier beach in southern England. Surface and sub-surface sediment samples, beach profiles, and disturbance depths were taken from the intertidal zone on consecutive low tides over semi-lunar tidal cycles, along with continuous wave and tide measurements. Results from two separate field surveys are presented, representing 26 and 24 consecutive low tides, respectively. A combination of Canonical Correlation Analysis (CCA) and Empirical Orthogonal Function (EOF) analysis was used to identify a number of consistent relationships in morphological and sedimentological variables not readily apparent using ordinary correlations. The disadvantage of such statistical models is that the relationships obtained cannot be expressed in physically meaningful units, which limits their utility in physical-numerical modelling. However, the results reveal some interesting relationships between gravel beachface sedimentology and morphological change. For example, beachface morphology and sedimentology are more similar at a given spatial location over time than over space (cross-shore) at any individual time. Subsurface sedimentology over the depth of disturbance indicates that the beach step can be traced through the sediment characteristics. Indeed, the study suggests that gravel beachface sedimentology is 'slaved' to morphological change rather than vice versa, and that the relationship becomes more evident as secondary morphological features develop on the beachface. The results imply that median sediment size and geometric sorting are suitable parameters for detecting such relationships. Strong hysteresis over space was present in the EOF modes associated with the most variance in the data sets, for both sediment size and sorting. 
Statistically significant relationships were found between the temporal modes of (absolute) size/sorting and net sedimentation associated with the largest variance in the non-decomposed respective data sets. Finally, significant relationships were found between a suite of measured hydrodynamic time-series and pairs of significantly correlated morpho-sedimentary eigenmodes. The techniques used were thus able to objectively demonstrate linear association between morphological and sedimentological change on a gravel beachface over a semi-lunar tidal cycle; and also that simultaneous changes in each could be linearly correlated to hydrodynamic forcing.

  9. Spatio-temporal analysis of aftershock sequences in terms of Non Extensive Statistical Physics.

    NASA Astrophysics Data System (ADS)

    Chochlaki, Kalliopi; Vallianatos, Filippos

    2017-04-01

Earth's seismicity is considered an extremely complicated process in which long-range interactions and fracturing exist (Vallianatos et al., 2016). For this reason, we analyze it using an innovative methodological approach introduced by Tsallis (Tsallis, 1988; 2009), named Non Extensive Statistical Physics. This approach generalizes Boltzmann-Gibbs statistical mechanics and is based on the definition of the Tsallis entropy Sq, whose maximization leads to the so-called q-exponential function, the probability distribution function that maximizes Sq. In the present work, we utilize the concepts of Non Extensive Statistical Physics to analyze the spatiotemporal properties of several aftershock series. Marekova (Marekova, 2014) suggested that the probability densities of the inter-event distances between successive aftershocks follow a beta distribution. Using the same data set, we analyze the inter-event distance distribution of several aftershock sequences in different geographic regions by calculating the non-extensive parameters that determine the behavior of the system and by fitting the q-exponential function, which expresses the degree of non-extensivity of the investigated system. Furthermore, the inter-event time distribution of the aftershocks as well as the frequency-magnitude distribution have been analyzed. The results support the applicability of Non Extensive Statistical Physics to aftershock sequences, where strong correlations exist along with memory effects. References: C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1988) 479-487, doi:10.1007/BF01016429. C. Tsallis, Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World, 2009, doi:10.1007/978-0-387-85359-8. E. Marekova, Analysis of the spatial distribution between successive earthquakes in aftershocks series, Annals of Geophysics, 57, 5, doi:10.4401/ag-6556, 2014. F. Vallianatos, G. Papadakis, G. Michas, Generalized statistical mechanics approaches to earthquakes and tectonics, Proc. R. Soc. A, 472, 20160497, 2016.
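For reference, the q-exponential named above has a simple closed form. The sketch below uses illustrative parameters (not fitted values from any of the cited studies) to show how its tail compares with the ordinary exponential recovered at q = 1:

```python
import math

def exp_q(x, q):
    """Tsallis q-exponential: reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def survival(x, q, x0):
    """Model survival function P(X > x) = exp_q(-x / x0)."""
    return exp_q(-x / x0, q)

# For q > 1 the q-exponential decays as a power law ~ x**(1/(1-q)),
# much more slowly than the exponential (q = 1) tail at the same scale x0.
heavy = survival(10.0, 1.3, 1.0)   # q-exponential tail
light = survival(10.0, 1.0, 1.0)   # ordinary exponential tail
print(heavy > light)  # True: the q > 1 tail is heavier
```

Fitting q and x0 to empirical inter-event distributions, as in the abstract, then quantifies the degree of non-extensivity.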

  10. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

    NASA Technical Reports Server (NTRS)

    Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

    1999-01-01

This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project experimented with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.

  11. How does mental-physical multimorbidity express itself in lived time and space? A phenomenological analysis of encounters with depression and chronic physical illness.

    PubMed

    Coventry, Peter A; Dickens, Chris; Todd, Chris

    2014-10-01

Mental-physical multimorbidity (the co-existence of mental and physical ill health) is highly prevalent and associated with significant impairments and high healthcare costs. While the sociology of chronic illness has developed a mature discourse on coping with long-term physical illness, the impacts of mental and physical ill health have remained analytically separated, highlighting the need for a better understanding of the day-to-day complexities encountered by people living with mental-physical multimorbidity. We used the phenomenological paradigm of the lived body to elucidate how the experience of mental-physical multimorbidity shapes people's lifeworlds. Nineteen people with chronic obstructive pulmonary disease (COPD) and depression (defined as a score ≥8 on the depression scale of the Hospital Anxiety and Depression Scale) were recruited from secondary NHS care and interviewed at their homes. Data were analysed phenomenologically using van Manen's lifeworld existential framework of the lived body, lived time, lived space, and lived relations. Additionally, we re-analysed data (using the same framework) collected from 13 people recruited from secondary NHS care with depression and either COPD, rheumatoid arthritis, heart disease, or type 1 or type 2 diabetes. The phenomenology of mental-physical multimorbidity was articulated through embodied and emotional encounters with day-to-day life in four ways: [a] participants' perception of lived time and lived space contracted; [b] time and [c] space were experienced as liminal categories, enforcing negative mood and temporal and spatial contraction; and [d] time and space could also be customised to reinstate agency and self-determination. Mental-physical multimorbidity negatively impacts individuals' perceptions of lived time and lived space, leading to a loss of agency, heightened uncertainty, and poor well-being. 
Harnessing people's capacity to modify their experience of time and space may be a novel way to support people with mental-physical multimorbidity to live well with illness. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. A wavelet-based statistical analysis of FMRI data: I. motivation and data distribution modeling.

    PubMed

    Dinov, Ivo D; Boscardin, John W; Mega, Michael S; Sowell, Elizabeth L; Toga, Arthur W

    2005-01-01

    We propose a new method for statistical analysis of functional magnetic resonance imaging (fMRI) data. The discrete wavelet transformation is employed as a tool for efficient and robust signal representation. We use structural magnetic resonance imaging (MRI) and fMRI to empirically estimate the distribution of the wavelet coefficients of the data both across individuals and spatial locations. An anatomical subvolume probabilistic atlas is used to tessellate the structural and functional signals into smaller regions each of which is processed separately. A frequency-adaptive wavelet shrinkage scheme is employed to obtain essentially optimal estimations of the signals in the wavelet space. The empirical distributions of the signals on all the regions are computed in a compressed wavelet space. These are modeled by heavy-tail distributions because their histograms exhibit slower tail decay than the Gaussian. We discovered that the Cauchy, Bessel K Forms, and Pareto distributions provide the most accurate asymptotic models for the distribution of the wavelet coefficients of the data. Finally, we propose a new model for statistical analysis of functional MRI data using this atlas-based wavelet space representation. In the second part of our investigation, we will apply this technique to analyze a large fMRI dataset involving repeated presentation of sensory-motor response stimuli in young, elderly, and demented subjects.
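The shrinkage idea can be illustrated with a minimal sketch. A one-level Haar transform with soft thresholding is used below purely for illustration; the paper's frequency-adaptive scheme, heavy-tail coefficient modeling, and atlas-based tessellation are not reproduced:

```python
# Sketch: generic wavelet shrinkage (denoising) on a 1D signal,
# using a one-level Haar transform and soft thresholding, stdlib-only.
def haar_forward(x):
    """One-level Haar transform of an even-length signal."""
    s = 2 ** -0.5
    approx = [s * (x[i] + x[i + 1]) for i in range(0, len(x), 2)]
    detail = [s * (x[i] - x[i + 1]) for i in range(0, len(x), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    s = 2 ** -0.5
    out = []
    for a, d in zip(approx, detail):
        out += [s * (a + d), s * (a - d)]
    return out

def soft_threshold(coeffs, thr):
    """Shrink detail coefficients toward zero (the denoising step)."""
    return [max(abs(c) - thr, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

signal = [4.0, 4.1, 4.0, 3.9, 8.0, 8.1, 8.0, 7.9]  # two plateaus + small noise
a, d = haar_forward(signal)
denoised = haar_inverse(a, soft_threshold(d, 0.2))
# Small details are zeroed, so each noisy pair collapses to its mean:
print([round(v, 2) for v in denoised])  # [4.05, 4.05, 3.95, 3.95, 8.05, 8.05, 7.95, 7.95]
```

Multi-level transforms with level-dependent (frequency-adaptive) thresholds, as in the paper, apply the same shrinkage step recursively to the approximation coefficients.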

  13. Extracting Hydrologic Understanding from the Unique Space-time Sampling of the Surface Water and Ocean Topography (SWOT) Mission

    NASA Astrophysics Data System (ADS)

    Nickles, C.; Zhao, Y.; Beighley, E.; Durand, M. T.; David, C. H.; Lee, H.

    2017-12-01

The Surface Water and Ocean Topography (SWOT) satellite mission is jointly developed by NASA and the French space agency (CNES), with participation from the Canadian and UK space agencies, to serve both the hydrology and oceanography communities. The SWOT mission will sample global surface water extents and elevations (lakes/reservoirs, rivers, estuaries, oceans, sea and land ice) at a finer spatial resolution than is currently possible, enabling hydrologic discovery, model advancement, and new applications that are not currently possible or even conceivable. Although the mission will provide global coverage, analysis and interpolation of the data generated from the irregular space/time sampling represent a significant challenge. In this study, we explore the applicability of the unique space/time sampling for understanding river discharge dynamics throughout the Ohio River Basin. River network topology, SWOT sampling (i.e., orbit and identified SWOT river reaches), and spatial interpolation concepts are used to quantify the fraction of river reaches effectively sampled on each day of the three-year mission. Streamflow statistics for SWOT-generated river discharge time series are compared to continuous daily river discharge series. Relationships are presented to transform SWOT-generated streamflow statistics into equivalent continuous daily discharge time series statistics, intended to support hydrologic applications using low-flow and annual flow duration statistics.
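The sampling challenge can be made concrete with a toy experiment. The revisit interval and synthetic hydrograph below are purely hypothetical (the real SWOT orbit sub-cycles and reach definitions are not modeled); the point is only to compare flow-duration percentiles from a subsampled series against the continuous daily record:

```python
import math

def percentile(sorted_vals, p):
    """Linear-interpolation percentile of an already-sorted list, p in [0, 1]."""
    idx = (len(sorted_vals) - 1) * p
    lo, hi = int(math.floor(idx)), int(math.ceil(idx))
    frac = idx - lo
    return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

# Synthetic daily discharge: seasonal cycle plus a short flood pulse every 60 days.
daily = [50 + 40 * math.sin(2 * math.pi * t / 365) + (300 if t % 60 == 0 else 0)
         for t in range(3 * 365)]
# Hypothetical revisit: the reach is observed only every 11 days.
sampled = daily[::11]

for p in (0.1, 0.5, 0.9):
    d = percentile(sorted(daily), p)
    s = percentile(sorted(sampled), p)
    print(p, round(d, 1), round(s, 1))
```

Central percentiles of the subsampled series track the daily record reasonably well, while short-lived flood pulses are under-represented, which is exactly why transformation relationships between sampled and continuous statistics are needed.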

  14. A Demonstration of the Analysis of Variance Using Physical Movement and Space

    ERIC Educational Resources Information Center

    Owen, William J.; Siakaluk, Paul D.

    2011-01-01

    Classroom demonstrations help students better understand challenging concepts. This article introduces an activity that demonstrates the basic concepts involved in analysis of variance (ANOVA). Students who physically participated in the activity had a better understanding of ANOVA concepts (i.e., higher scores on an exam question answered 2…

  15. Universal self-similarity of propagating populations

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-07-01

    This paper explores the universal self-similarity of propagating populations. The following general propagation model is considered: particles are randomly emitted from the origin of a d -dimensional Euclidean space and propagate randomly and independently of each other in space; all particles share a statistically common—yet arbitrary—motion pattern; each particle has its own random propagation parameters—emission epoch, motion frequency, and motion amplitude. The universally self-similar statistics of the particles’ displacements and first passage times (FPTs) are analyzed: statistics which are invariant with respect to the details of the displacement and FPT measurements and with respect to the particles’ underlying motion pattern. Analysis concludes that the universally self-similar statistics are governed by Poisson processes with power-law intensities and by the Fréchet and Weibull extreme-value laws.

  16. Universal self-similarity of propagating populations.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2010-07-01

    This paper explores the universal self-similarity of propagating populations. The following general propagation model is considered: particles are randomly emitted from the origin of a d-dimensional Euclidean space and propagate randomly and independently of each other in space; all particles share a statistically common--yet arbitrary--motion pattern; each particle has its own random propagation parameters--emission epoch, motion frequency, and motion amplitude. The universally self-similar statistics of the particles' displacements and first passage times (FPTs) are analyzed: statistics which are invariant with respect to the details of the displacement and FPT measurements and with respect to the particles' underlying motion pattern. Analysis concludes that the universally self-similar statistics are governed by Poisson processes with power-law intensities and by the Fréchet and Weibull extreme-value laws.
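The two extreme-value laws named in the conclusion are easy to sample by inverse transform. Below is a stdlib-only sketch with arbitrary shape parameters, unrelated to the paper's propagation model:

```python
import math
import random

def frechet_sample(alpha, rng):
    """Frechet law F(x) = exp(-x**-alpha), x > 0: heavy (power-law) upper tail."""
    u = rng.random()
    return (-math.log(u)) ** (-1.0 / alpha)

def weibull_sample(k, rng):
    """Weibull law F(x) = 1 - exp(-x**k), x > 0: stretched-exponential tail."""
    u = rng.random()
    return (-math.log(1.0 - u)) ** (1.0 / k)

rng = random.Random(42)
fre = sorted(frechet_sample(1.5, rng) for _ in range(10000))
wei = sorted(weibull_sample(1.5, rng) for _ in range(10000))
# Medians: Frechet (ln 2)**(-1/alpha), Weibull (ln 2)**(1/k);
# the Frechet maximum is far larger because of its power-law tail.
print(round(fre[5000], 2), round(wei[5000], 2))
```

Inverting the CDF in this way makes the qualitative difference between the two laws (power-law versus stretched-exponential upper tail) easy to see empirically.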

  17. Surveying Turkish high school and university students' attitudes and approaches to physics problem solving

    NASA Astrophysics Data System (ADS)

    Balta, Nuri; Mason, Andrew J.; Singh, Chandralekha

    2016-06-01

Students' attitudes and approaches to physics problem solving can impact how well they learn physics and how successful they are in solving physics problems. Prior research in the U.S. using a validated Attitudes and Approaches to Problem Solving (AAPS) survey suggests that there are major differences between students in introductory physics and astronomy courses and physics experts in terms of their attitudes and approaches to physics problem solving. Here we discuss the validation, administration, and analysis of data for the Turkish version of the AAPS survey for high school and university students in Turkey. After the validation and administration of the Turkish version of the survey, the data were analyzed by grouping them by grade level, school type, and gender. While there are no statistically significant differences between the averages of the various groups on the survey overall, the university students in Turkey were more expertlike than vocational high school students. On an item-by-item basis, there are statistically significant differences between the averages of the groups on many items. For example, on average, the university students demonstrated less expertlike attitudes about the role of equations and formulas in problem solving, in solving difficult problems, and in knowing when the solution is not correct, whereas they displayed more expertlike attitudes and approaches on items related to metacognition in physics problem solving. A principal component analysis of the data yields item clusters into which the student responses on various survey items can be grouped. A comparison of the responses of the Turkish and American university students enrolled in algebra-based introductory physics courses shows that on more than half of the items, the responses of these two groups were statistically significantly different, with the U.S. students on average responding to the items in a more expertlike manner.

  18. Study of solid rocket motor for space shuttle booster, volume 2, book 3, appendix A

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A systems requirements analysis for the solid propellant rocket engine to be used with the space shuttle was conducted. The systems analysis was developed to define the physical and functional requirements for the systems and subsystems. The operations analysis was performed to identify the requirements of the various launch operations, mission operations, ground operations, and logistic and flight support concepts.

  19. Physical employment standards for U.K. fire and rescue service personnel.

    PubMed

    Blacker, S D; Rayson, M P; Wilkinson, D M; Carter, J M; Nevill, A M; Richmond, V L

    2016-01-01

Evidence-based physical employment standards are vital for recruiting, training, and maintaining the operational effectiveness of personnel in physically demanding occupations. The aims were to: (i) develop criterion tests for in-service physical assessment which simulate the role-related physical demands of UK fire and rescue service (UK FRS) personnel; (ii) develop practical physical selection tests for FRS applicants; and (iii) evaluate the validity of the selection tests for predicting criterion test performance. Stage 1: we conducted a physical demands analysis involving seven workshops and an expert panel to document the key physical tasks required of UK FRS personnel and to develop 'criterion' and 'selection' tests. Stage 2: we measured the performance of 137 trainee and 50 trained UK FRS personnel on selection, criterion, and 'field' measures of aerobic power, strength, and body size. Statistical models were developed to predict criterion test performance. Stage 3: subject matter experts derived minimum performance standards. We developed single-person simulations of the key physical tasks required of UK FRS personnel as criterion and selection tests (rural fire, domestic fire, ladder lift, ladder extension, ladder climb, pump assembly, enclosed space search). Selection tests were marginally stronger predictors of criterion test performance (r = 0.88-0.94, 95% Limits of Agreement [LoA] 7.6-14.0%) than field test scores (r = 0.84-0.94, 95% LoA 8.0-19.8%) and offered greater face and content validity and more practical implementation. This study outlines the development of role-related, gender-free physical employment tests for the UK FRS which conform to equal opportunities law. © The Author 2015. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. Statistical analysis and modeling of intermittent transport events in the tokamak scrape-off layer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Johan, E-mail: anderson.johan@gmail.com; Halpern, Federico D.; Ricci, Paolo

The turbulence observed in the scrape-off layer of a tokamak is often characterized by intermittent events of bursty nature, a feature which raises concerns about the prediction of heat loads on the physical boundaries of the device. It appears thus necessary to delve into the statistical properties of turbulent physical fields such as density, electrostatic potential, and temperature, focusing on the mathematical expression of the tails of the probability distribution functions. The method followed here is to generate statistical information from time traces of the plasma density stemming from Braginskii-type fluid simulations and check this against a first-principles theoretical model. The analysis of the numerical simulations indicates that the probability distribution function of the intermittent process contains strong exponential tails, as predicted by the analytical theory.
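A simple way to probe such tails, sketched below on synthetic data (illustrative only; no connection to the simulation outputs), is a log-linear fit to the empirical survival function, which is a straight line for an exponential tail:

```python
import math
import random

def tail_slope(samples, lo=0.5, hi=0.99):
    """Slope of log P(X > x) versus x over the [lo, hi] quantile range.
    For an exponential tail P(X > x) ~ exp(-lam * x) this approximates -lam."""
    xs = sorted(samples)
    n = len(xs)
    pts = [(xs[i], math.log((n - i) / n)) for i in range(int(n * lo), int(n * hi))]
    mx = sum(x for x, _ in pts) / len(pts)
    my = sum(y for _, y in pts) / len(pts)
    num = sum((x - mx) * (y - my) for x, y in pts)
    den = sum((x - mx) ** 2 for x, _ in pts)
    return num / den

rng = random.Random(7)
expo = [rng.expovariate(2.0) for _ in range(20000)]
print(round(tail_slope(expo), 1))  # approximately -2.0 for an Exp(2) sample
```

Curvature in the same log-linear plot would instead signal heavier-than-exponential (e.g. power-law) tails.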

  1. Transactions of the Army Conference on Applied Mathematics and Computing (1st) Held at Washington, DC on 9-11 May 1983

    DTIC Science & Technology

    1984-02-01

An Introduction to Geometric Programming, Patrick D. Allen and David W. Baker . . . Space and Time . . . Zarwyn, US Army Electronics R&D Command . . . GEOMETRIC PROGRAMMING; SPACE AND TIME ANALYSIS IN DYNAMIC PROGRAMMING ALGORITHMS, Renne..tf Stizti, AkeanXa . . . physical and parameter space can be connected by asymptotic matching. The purpose of the asymptotic analysis is to define the simplest problems

  2. Discontinuous precipitation in a Cd-6 at.% Ag alloy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manna, I.; Bala, P.K.; Pabi, S.K.

    1996-11-01

Discontinuous precipitation (DP) in a Cd-6 at.% Ag alloy has been investigated for the first time. The precipitate phase maintains a lamellar morphology and statistically constant interlamellar spacing under a given isothermal condition in the temperature range studied (333-523 K). The interlamellar spacing increases with an increase in isothermal temperature. The reaction front velocity registers a typical C-curve variation with the inverse of temperature. The reaction rate is maximum at 470 K. The predicted upper limit of DP occurrence in this alloy is 23 K lower than the concerned equilibrium solvus temperature. Continuous precipitation accompanies DP at all temperatures, especially beyond a certain time, and adversely affects the growth kinetics of DP colonies by reducing the local chemical driving force and/or posing physical hindrance to the reaction front migration. An extensive kinetic analysis of DP using the models by Turnbull, Aaronson and Liu, and Petermann and Hornbogen has yielded the grain boundary chemical diffusivity data in Cd-6 at.% Ag for the first time, the activation energy of which lies in the range 55-77 kJ/mol.

  3. The statistical analysis of received time-series signals from the laser illumination of remote objects through turbulence

    NASA Astrophysics Data System (ADS)

    Chandler, Susan; Lukesh, Gordon

    2006-11-01

    Ground-to-space illumination experiments, such as the Floodbeam I (FBE I, 1993), Floodbeam II (FBE II, 1996) and Active Imaging Testbed (AIT, 1999), fielded by the Imaging Branch of the United States Air Force Research Laboratory at Starfire Optical Range (SOR) on Kirtland AFB, NM, obtained considerable information from these highly successful experiments. While the experiments were primarily aimed at collecting focal/pupil plane data, the authors recognized during data reduction that the received time-series signals from the integrated full receiver focal plane data contains considerable hitherto unexploited information. For more than 10 years the authors have investigated the exploitation of data contained within the time-series signal from ground-to-space experiments. Results have been presented at numerous SPIE and EOS Remote Sensing Meetings. In July 2005, the authors were honored as invited speakers at the XIIth Symposium "Atmosphere and Ocean Optics; Atmospheric Physics" Tomsk, Russia. The authors were invited to return to Tomsk in 2006 however a serious automobile accident precluded attendance. This paper, requested for publication, provides an important summary of recent results.

  4. On the problem of boundaries and scaling for urban street networks

    PubMed Central

    Masucci, A. Paolo; Arcaute, Elsa; Hatna, Erez; Stanilov, Kiril; Batty, Michael

    2015-01-01

    Urban morphology has presented significant intellectual challenges to mathematicians and physicists ever since the eighteenth century, when Euler first explored the famous Königsberg bridges problem. Many important regularities and scaling laws have been observed in urban studies, including Zipf's law and Gibrat's law, rendering cities attractive systems for analysis within statistical physics. Nevertheless, a broad consensus on how cities and their boundaries are defined is still lacking. Applying an elementary clustering technique to the street intersection space, we show that growth curves for the maximum cluster size of the largest cities in the UK and in California collapse to a single curve, namely the logistic. Subsequently, by introducing the concept of the condensation threshold, we show that natural boundaries of cities can be well defined in a universal way. This allows us to study and discuss systematically some of the regularities that are present in cities. We show that some scaling laws present consistent behaviour in space and time, thus suggesting the presence of common principles at the basis of the evolution of urban systems. PMID:26468071

  5. On the problem of boundaries and scaling for urban street networks.

    PubMed

    Masucci, A Paolo; Arcaute, Elsa; Hatna, Erez; Stanilov, Kiril; Batty, Michael

    2015-10-06

    Urban morphology has presented significant intellectual challenges to mathematicians and physicists ever since the eighteenth century, when Euler first explored the famous Königsberg bridges problem. Many important regularities and scaling laws have been observed in urban studies, including Zipf's law and Gibrat's law, rendering cities attractive systems for analysis within statistical physics. Nevertheless, a broad consensus on how cities and their boundaries are defined is still lacking. Applying an elementary clustering technique to the street intersection space, we show that growth curves for the maximum cluster size of the largest cities in the UK and in California collapse to a single curve, namely the logistic. Subsequently, by introducing the concept of the condensation threshold, we show that natural boundaries of cities can be well defined in a universal way. This allows us to study and discuss systematically some of the regularities that are present in cities. We show that some scaling laws present consistent behaviour in space and time, thus suggesting the presence of common principles at the basis of the evolution of urban systems. © 2015 The Authors.

  6. Student Solution Manual for Essential Mathematical Methods for the Physical Sciences

    NASA Astrophysics Data System (ADS)

    Riley, K. F.; Hobson, M. P.

    2011-02-01

    1. Matrices and vector spaces; 2. Vector calculus; 3. Line, surface and volume integrals; 4. Fourier series; 5. Integral transforms; 6. Higher-order ODEs; 7. Series solutions of ODEs; 8. Eigenfunction methods; 9. Special functions; 10. Partial differential equations; 11. Solution methods for PDEs; 12. Calculus of variations; 13. Integral equations; 14. Complex variables; 15. Applications of complex variables; 16. Probability; 17. Statistics.

  7. Essential Mathematical Methods for the Physical Sciences

    NASA Astrophysics Data System (ADS)

    Riley, K. F.; Hobson, M. P.

    2011-02-01

    1. Matrices and vector spaces; 2. Vector calculus; 3. Line, surface and volume integrals; 4. Fourier series; 5. Integral transforms; 6. Higher-order ODEs; 7. Series solutions of ODEs; 8. Eigenfunction methods; 9. Special functions; 10. Partial differential equations; 11. Solution methods for PDEs; 12. Calculus of variations; 13. Integral equations; 14. Complex variables; 15. Applications of complex variables; 16. Probability; 17. Statistics; Appendices; Index.

  8. Multidisciplinary research in space sciences and engineering with emphasis on theoretical chemistry

    NASA Technical Reports Server (NTRS)

    Hirschfelder, J. O.; Curtiss, C. F.

    1974-01-01

A broad program of research in theoretical chemistry is reported, particularly in molecular quantum and statistical mechanics, directed toward determining the physical and chemical properties of materials, relating these macroscopic properties to the properties of individual molecules, and determining the structure and properties of those molecules. Abstracts are presented for each research project conducted during the course of the program.

  9. Management of the Space Physics Analysis Network (SPAN)

    NASA Technical Reports Server (NTRS)

    Green, James L.; Thomas, Valerie L.; Butler, Todd F.; Peters, David J.; Sisson, Patricia L.

    1990-01-01

    Here, the purpose is to define the operational management structure and to delineate the responsibilities of key Space Physics Analysis Network (SPAN) individuals. The management structure must take into account the large NASA and ESA science research community by giving them a major voice in the operation of the system. Appropriate NASA and ESA interfaces must be provided so that there will be adequate communications facilities available when needed. Responsibilities are delineated for the Advisory Committee, the Steering Committee, the Project Scientist, the Project Manager, the SPAN Security Manager, the Internetwork Manager, the Network Operations Manager, the Remote Site Manager, and others.

  10. Non-gaussianity versus nonlinearity of cosmological perturbations.

    PubMed

    Verde, L

    2001-06-01

Following the discovery of the cosmic microwave background, the hot big-bang model has become the standard cosmological model. In this theory, small primordial fluctuations are subsequently amplified by gravity to form the large-scale structure seen today. Different theories for unified models of particle physics lead to different predictions for the statistical properties of the primordial fluctuations, which can be divided into two classes: gaussian and non-gaussian. Convincing evidence for or against gaussian initial conditions would rule out many scenarios and point us toward a physical theory for the origin of structures. The statistical distribution of cosmological perturbations, as we observe them, can deviate from the gaussian distribution in several different ways. Even if perturbations start off gaussian, nonlinear gravitational evolution can introduce non-gaussian features. Additionally, our knowledge of the Universe comes principally from the study of luminous material such as galaxies, but galaxies might not be faithful tracers of the underlying mass distribution. The relationship between fluctuations in the mass and in the galaxy distribution (bias) is often assumed to be local, but could well be nonlinear. Moreover, galaxy catalogues use the redshift as the third spatial coordinate: the resulting redshift-space map of the galaxy distribution is nonlinearly distorted by peculiar velocities. Nonlinear gravitational evolution, biasing, and redshift-space distortion introduce non-gaussianity, even in an initially gaussian fluctuation field. I investigate the statistical tools that allow us, in principle, to disentangle the above different effects, and the observational datasets we require to do so in practice.

  11. Developments in Space Research in Nigeria

    NASA Astrophysics Data System (ADS)

    Oke, O.

    2006-08-01

    Nigeria's desire to venture into space technology was first made known to ECA/OAU member countries at an inter-governmental meeting in Addis Ababa in 1976. Nigerian space research is highly rated in Africa in terms of reputation and scientific results. The National Space Research and Development Agency (NASRDA), Nigeria's space research coordinating body, has taken a more active role in helping Nigeria's space research community succeed internationally. The paper presents recent examples of Nigeria's successes in space and their applications in areas such as remote sensing, meteorology, communication and information technology, among others. It gives an analysis of statistics on Nigerian-born space scientists working in other space-faring nations; this analysis has been used to develop a model for increasing Nigerian scientists' involvement in the development of space research in Nigeria. It concludes with some thoughts on the current state and future of Nigeria's space-borne scientific experiments, policies and programs.

  12. Lagrangian particle statistics of numerically simulated shear waves

    NASA Astrophysics Data System (ADS)

    Kirby, J.; Briganti, R.; Brocchini, M.; Chen, Q. J.

    2006-12-01

    The properties of numerical solutions of various circulation models (Boussinesq-type and wave-averaged NLSWE) have been investigated on the basis of the induced horizontal flow mixing, for the case of shear waves. The mixing properties of the flow have been investigated using particle statistics, following the approach of LaCasce (2001) and Piattella et al. (2006). Both an idealized barred beach bathymetry and a test case taken from SANDYDUCK '97 have been considered. Random seeding patterns of passive tracer particles are used. The flow exhibits features similar to those discussed in the literature. Differences are also evident, due both to the physics (intense longshore shear shoreward of the bar) and to the procedure used to obtain the statistics (lateral conditions limit the time/space window for the longshore flow). Within the Boussinesq framework, different formulations of Boussinesq-type equations have been used and the results compared (Wei et al. 1995; Chen et al. 2003; Chen et al. 2006). Analysis based on the Eulerian velocity fields suggests a close similarity between Wei et al. (1995) and Chen et al. (2006), while examination of particle displacements and implied mixing suggests closer behaviour between Chen et al. (2003) and Chen et al. (2006). Two distinct stages of mixing are evident in all simulations: i) the first stage ends at t
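    The particle-statistics approach referenced here typically tracks the mean square separation of tracer pairs over time. A minimal sketch, assuming trajectories are stored as a NumPy array of shape (particles, steps, 2); the synthetic random-walk data and all names are illustrative, not from the cited models:

```python
import numpy as np

def relative_dispersion(traj):
    """Mean square separation of all tracer pairs versus time.

    traj: array of shape (n_particles, n_steps, 2) holding x, y
    positions of passive tracers. Returns an array of length n_steps.
    """
    n = traj.shape[0]
    # squared separation of every distinct pair at every time step
    seps = [np.sum((traj[i] - traj[j]) ** 2, axis=-1)
            for i in range(n) for j in range(i + 1, n)]
    return np.mean(seps, axis=0)

# synthetic example: tracers undergoing independent random walks
rng = np.random.default_rng(0)
traj = np.cumsum(rng.normal(size=(10, 100, 2)), axis=1)
d2 = relative_dispersion(traj)
```

    A two-stage mixing regime, as reported in the abstract, would appear as a change of slope in d2 plotted against time.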

  13. The REU Program in Solar Physics at Montana State University

    NASA Astrophysics Data System (ADS)

    Martens, Petrus C.; Canfield, R. C.; McKenzie, D. M.

    2007-05-01

    The Solar Physics group at Montana State University has organized an annual summer REU program in Solar Physics, Astronomy, and Space Physics since 1999, with NSF funding since 2003. The number of students applying and being admitted to the program has increased every year, and we have been very successful in attracting female participants. A great majority of our REU alumni have chosen career paths in the sciences, and, according to their testimonies, our REU program has played a significant role in their decisions. From the start our REU program has had an important international component through a close collaboration with the University of St. Andrews in Scotland. In our poster we will describe the goals, organization, scientific contents, international aspects, and results, and present statistics on applications, participants, gender balance, and diversity.

  14. Scaling in biomechanical experimentation: a finite similitude approach.

    PubMed

    Ochoa-Cabrero, Raul; Alonso-Rasgado, Teresa; Davey, Keith

    2018-06-01

    Biological experimentation faces many obstacles: resource limitations, unavailability of materials, manufacturing complexities and ethical compliance issues; any approach that resolves all or some of these is of interest. The aim of this study is to apply the recently introduced concept of finite similitude as a novel approach to the design of scaled biomechanical experiments, supported by analysis using a commercial finite-element package and validated by means of image correlation software. The study of isotropic scaling of synthetic bones leads to the selection of three-dimensional (3D) printed materials as the trial-space materials. These materials, conforming to the theory, are analysed in finite-element models of cylinder and femur geometries undergoing compression, tension, torsion and bending tests, with reverse scaling used to assess the efficacy of the approach. The finite-element results show similar surface strain patterns, with a maximum difference of less than 10% for the cylinder and less than 4% for the femur across all tests. Finally, trial-space physical experimentation using 3D printed materials for compression and bending testing shows good agreement in a Bland-Altman statistical analysis, providing good supporting evidence for the practicality of the approach. © 2018 The Author(s).

  15. PWC-ICA: A Method for Stationary Ordered Blind Source Separation with Application to EEG.

    PubMed

    Ball, Kenneth; Bigdely-Shamlo, Nima; Mullen, Tim; Robbins, Kay

    2016-01-01

    Independent component analysis (ICA) is a class of algorithms widely applied to separate sources in EEG data. Most ICA approaches use optimization criteria derived from temporal statistical independence and are invariant with respect to the actual ordering of individual observations. We propose a method of mapping real signals into a complex vector space that takes into account the temporal order of signals and enforces certain mixing stationarity constraints. The resulting procedure, which we call Pairwise Complex Independent Component Analysis (PWC-ICA), performs the ICA in a complex setting and then reinterprets the results in the original observation space. We examine the performance of our candidate approach relative to several existing ICA algorithms for the blind source separation (BSS) problem on both real and simulated EEG data. On simulated data, PWC-ICA is often capable of achieving a better solution to the BSS problem than AMICA, Extended Infomax, or FastICA. On real data, the dipole interpretations of the BSS solutions discovered by PWC-ICA are physically plausible, are competitive with existing ICA approaches, and may represent sources undiscovered by other ICA methods. In conjunction with this paper, the authors have released a MATLAB toolbox that performs PWC-ICA on real, vector-valued signals.
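    The core idea, mapping ordered real signals into a complex space before running ICA so that temporal order matters, can be sketched as below. This is one plausible reading of the pairing step, not necessarily the authors' exact construction; their released MATLAB toolbox is the authoritative reference. The function and signal names are illustrative:

```python
import numpy as np

def pairwise_complex(x, step=1):
    """Map a real multichannel signal of shape (channels, samples) to a
    complex one by pairing each sample with its successor:
    z[t] = x[t] + i * x[t + step].
    Unlike ordinary ICA preprocessing, this representation changes if the
    samples are reordered, so temporal structure is retained.
    """
    return x[:, :-step] + 1j * x[:, step:]

# two-channel toy signal
t = np.linspace(0.0, 1.0, 500)
x = np.vstack([np.sin(2 * np.pi * 5 * t), np.cos(2 * np.pi * 3 * t)])
z = pairwise_complex(x)
```

    A complex-valued ICA algorithm would then be run on z, and the resulting unmixing matrix reinterpreted in the original real observation space.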

  16. PWC-ICA: A Method for Stationary Ordered Blind Source Separation with Application to EEG

    PubMed Central

    Bigdely-Shamlo, Nima; Mullen, Tim; Robbins, Kay

    2016-01-01

    Independent component analysis (ICA) is a class of algorithms widely applied to separate sources in EEG data. Most ICA approaches use optimization criteria derived from temporal statistical independence and are invariant with respect to the actual ordering of individual observations. We propose a method of mapping real signals into a complex vector space that takes into account the temporal order of signals and enforces certain mixing stationarity constraints. The resulting procedure, which we call Pairwise Complex Independent Component Analysis (PWC-ICA), performs the ICA in a complex setting and then reinterprets the results in the original observation space. We examine the performance of our candidate approach relative to several existing ICA algorithms for the blind source separation (BSS) problem on both real and simulated EEG data. On simulated data, PWC-ICA is often capable of achieving a better solution to the BSS problem than AMICA, Extended Infomax, or FastICA. On real data, the dipole interpretations of the BSS solutions discovered by PWC-ICA are physically plausible, are competitive with existing ICA approaches, and may represent sources undiscovered by other ICA methods. In conjunction with this paper, the authors have released a MATLAB toolbox that performs PWC-ICA on real, vector-valued signals. PMID:27340397

  17. Refining Martian Ages and Understanding Geological Processes From Cratering Statistics

    NASA Technical Reports Server (NTRS)

    Hartmann, William K.

    2005-01-01

    Senior Scientist William K. Hartmann presents his final report on Mars Data Analysis Program grant number NAG5-12217. The third year of the three-year program was recently completed in mid-2005. The program has been extremely productive in research and data analysis regarding Mars, especially using Mars Global Surveyor and Mars Odyssey imagery. In 2005 alone, three papers to which this work contributed have already been published. 1) Hartmann, W. K. 2005. Martian cratering 8: Isochron refinement and the history of Martian geologic activity. Icarus 174, 294-320. This paper is a summary of my entire program of establishing Martian chronology through counts of Martian impact craters. 2) Arfstrom, John, and W. K. Hartmann 2005. Martian flow features, moraine-like ridges, and gullies: Terrestrial analogs and interrelationships. Icarus 174, 321-335. This paper makes pioneering connections between Martian glacier-like features and terrestrial glacial features. 3) Hartmann, W. K., D. Winterhalter, and J. Geiss. 2005. Chronology and Physical Evolution of Planet Mars. In The Solar System and Beyond: Ten Years of ISSI (Bern: International Space Science Institute). This is a summary of work conducted at the International Space Science Institute with an international team, emphasizing our publication of a conference volume about Mars, edited by Hartmann and published in 2001.

  18. Relationship between environmental management with quality of kampong space room (Case study: RW 3 of Sukun Sub District, Malang City)

    NASA Astrophysics Data System (ADS)

    Wardhani, D. K.; Azmi, D. S.; Purnamasari, W. D.

    2017-06-01

    RW 3 Sukun, Malang, was a kampong that won a kampong environment competition and had managed to maintain the preservation of the kampong. The community of RW 3 Sukun undertakes various activities to manage the environment by optimizing the use of kampong space. Although RW 3 Sukun has conducted environmental management activities, several locations in the kampong space are less well maintained. The purpose of this research was to determine the relation of environmental management to the quality of kampong space in RW 3 Sukun. This research used both qualitative and quantitative approaches. The quantitative research assessed the quality of kampong space using descriptive statistical analysis with weighting, scoring, and map overlays, and analysed the relation of environmental management to the quality of kampong space using typology analysis and Pearson correlation analysis. The qualitative research addressed the analysis of environmental management and its relation to the quality of kampong space. The results of this research indicate that environmental management in RW 3 Sukun is related to the quality of kampong space.
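    The Pearson correlation step used in the quantitative analysis can be sketched as follows; the per-location scores below are hypothetical stand-ins for the management and space-quality indices, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical per-location scores: management intensity vs. space quality
management = [3, 5, 2, 4, 5, 1]
quality = [2, 5, 2, 3, 4, 1]
r = pearson_r(management, quality)
```

    A value of r near +1 would indicate that locations with more intensive environmental management also score higher on space quality, which is the kind of relation the study reports.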

  19. Application of econometric and ecology analysis methods in physics software

    NASA Astrophysics Data System (ADS)

    Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo

    2017-10-01

    Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper gives a brief overview of some of these methods.

  20. Permutation glass.

    PubMed

    Williams, Mobolaji

    2018-01-01

    The field of disordered systems in statistical physics provides many simple models in which the competing influences of thermal and nonthermal disorder lead to new phases and nontrivial thermal behavior of order parameters. In this paper, we add a model to the subject by considering a disordered system where the state space consists of various orderings of a list. As in spin glasses, the disorder of such "permutation glasses" arises from a parameter in the Hamiltonian being drawn from a distribution of possible values, thus allowing nominally "incorrect orderings" to have lower energies than "correct orderings" in the space of permutations. We analyze Gaussian, uniform, and symmetric Bernoulli distributions of energy costs, and, by employing Jensen's inequality, derive a simple condition requiring the permutation glass to always transition to the correctly ordered state at a temperature lower than that of the nondisordered system, provided that this correctly ordered state is accessible. We in turn find that in order for the correctly ordered state to be accessible, the probability that an incorrectly ordered component is energetically favored must be less than the inverse of the number of components in the system. We show that all of these results are consistent with a replica symmetric ansatz of the system. We conclude by arguing that there is no distinct permutation glass phase for the simplest model considered here and by discussing how to extend the analysis to more complex Hamiltonians capable of novel phase behavior and replica symmetry breaking. Finally, we outline an apparent correspondence between the presented system and a discrete-energy-level fermion gas. In all, the investigation introduces a class of exactly soluble models into statistical mechanics and provides a fertile ground to investigate statistical models of disorder.
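    The flavor of such a model can be illustrated by exact enumeration over a small symmetric group. The Hamiltonian below, which charges a fixed per-site cost for each misplaced component (with one negative cost standing in for quenched disorder), is a toy stand-in for the paper's mean-field model; all specifics are assumptions for illustration:

```python
import itertools
import math

def p_correct(costs, beta):
    """Boltzmann probability of the identity (correctly ordered)
    permutation for a toy Hamiltonian
    E(sigma) = sum of costs[i] over sites i with sigma(i) != i.
    costs: per-site energy costs (disorder may make entries negative).
    beta: inverse temperature.
    """
    n = len(costs)
    weights = []
    for perm in itertools.permutations(range(n)):
        energy = sum(costs[i] for i in range(n) if perm[i] != i)
        weights.append(math.exp(-beta * energy))
    # itertools.permutations yields the identity ordering first
    return weights[0] / sum(weights)

costs = [1.2, 0.8, 1.5, -0.2, 1.0]  # one negative entry mimics disorder
low_t = p_correct(costs, beta=5.0)   # cold: ordering strongly favored
high_t = p_correct(costs, beta=0.1)  # hot: nearly uniform over 5! states
```

    Cooling the system concentrates probability on the correct ordering, consistent with the transition behavior discussed in the abstract.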

  1. Internet dynamics

    NASA Astrophysics Data System (ADS)

    Lukose, Rajan Mathew

    The World Wide Web and the Internet are rapidly expanding spaces, of great economic and social significance, which offer an opportunity to study many phenomena, often previously inaccessible, on an unprecedented scale and resolution with relative ease. These phenomena are measurable on the scale of tens of millions of users and hundreds of millions of pages. By virtue of nearly complete electronic mediation, it is possible in principle to observe the time and ``spatial'' evolution of nearly all choices and interactions. This cyber-space therefore provides a view into a number of traditional research questions (from many academic disciplines) and creates its own new phenomena accessible for study. Despite its largely self-organized and dynamic nature, a number of robust quantitative regularities are found in the aggregate statistics of interesting and useful quantities. These regularities can be understood with the help of models that draw on ideas from statistical physics as well as other fields such as economics, psychology and decision theory. This thesis develops models that can account for regularities found in the statistics of Internet congestion and user surfing patterns and discusses some practical consequences.

  2. Numerical solutions of ideal quantum gas dynamical flows governed by semiclassical ellipsoidal-statistical distribution

    PubMed Central

    Yang, Jaw-Yen; Yan, Chih-Yuan; Diaz, Manuel; Huang, Juan-Chen; Li, Zhihui; Zhang, Hanxin

    2014-01-01

    The ideal quantum gas dynamics as manifested by the semiclassical ellipsoidal-statistical (ES) equilibrium distribution derived in Wu et al. (Wu et al. 2012 Proc. R. Soc. A 468, 1799–1823 (doi:10.1098/rspa.2011.0673)) is numerically studied for particles of three statistics. This anisotropic ES equilibrium distribution was derived using the maximum entropy principle and conserves the mass, momentum and energy, but differs from the standard Fermi–Dirac or Bose–Einstein distribution. The present numerical method combines the discrete velocity (or momentum) ordinate method in momentum space and the high-resolution shock-capturing method in physical space. A decoding procedure to obtain the necessary parameters for determining the ES distribution is also devised. Computations of two-dimensional Riemann problems are presented, and various contours of the quantities unique to this ES model are illustrated. The main flow features, such as shock waves, expansion waves and slip lines and their complex nonlinear interactions, are depicted and found to be consistent with existing calculations for a classical gas. PMID:24399919

  3. Data-driven Applications for the Sun-Earth System

    NASA Astrophysics Data System (ADS)

    Kondrashov, D. A.

    2016-12-01

    Advances in observational and data mining techniques allow extracting information from the large volume of Sun-Earth observational data that can be assimilated into first principles physical models. However, equations governing Sun-Earth phenomena are typically nonlinear, complex, and high-dimensional. The high computational demand of solving the full governing equations over a large range of scales precludes the use of a variety of useful assimilative tools that rely on applied mathematical and statistical techniques for quantifying uncertainty and predictability. Effective use of such tools requires the development of computationally efficient methods to facilitate fusion of data with models. This presentation will provide an overview of various existing as well as newly developed data-driven techniques adopted from atmospheric and oceanic sciences that proved to be useful for space physics applications, such as computationally efficient implementation of Kalman Filter in radiation belts modeling, solar wind gap-filling by Singular Spectrum Analysis, and low-rank procedure for assimilation of low-altitude ionospheric magnetic perturbations into the Lyon-Fedder-Mobarry (LFM) global magnetospheric model. Reduced-order non-Markovian inverse modeling and novel data-adaptive decompositions of Sun-Earth datasets will be also demonstrated.

  4. The Gaussian CLs method for searches of new physics

    DOE PAGES

    Qian, X.; Tan, A.; Ling, J. J.; ...

    2016-04-23

    Here we describe a method based on the CLs approach to present results in searches of new physics, under the condition that the relevant parameter space is continuous. Our method relies on a class of test statistics developed for non-nested hypothesis testing problems, denoted by ΔT, which has a Gaussian approximation to its parent distribution when the sample size is large. This leads to a simple procedure of forming exclusion sets for the parameters of interest, which we call the Gaussian CLs method. Our work provides a self-contained mathematical proof for the Gaussian CLs method that explicitly outlines the required conditions. These conditions are milder than those required by Wilks' theorem to set confidence intervals (CIs). We illustrate the Gaussian CLs method in an example of searching for a sterile neutrino, where the CLs approach was rarely used before. We also compare data analysis results produced by the Gaussian CLs method and various CI methods to showcase their differences.
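    Schematically, once the test statistic ΔT is approximated as Gaussian under both hypotheses, the CLs value reduces to a ratio of two Gaussian tail probabilities. The sketch below illustrates that idea only; the means, width, and tail convention are hypothetical placeholders, not the paper's construction:

```python
from math import erf, sqrt

def gaussian_tail(x, mu, sigma):
    """P(T >= x) for T ~ Normal(mu, sigma)."""
    return 0.5 * (1.0 - erf((x - mu) / (sigma * sqrt(2.0))))

def gaussian_cls(dt_obs, mu_alt, mu_null, sigma):
    """Schematic CLs: the ratio of the upper-tail probabilities of the
    observed statistic under the alternative and the null hypotheses.
    In the usual convention the alternative is excluded at 95%
    confidence when CLs < 0.05.
    """
    return (gaussian_tail(dt_obs, mu_alt, sigma)
            / gaussian_tail(dt_obs, mu_null, sigma))

# hypothetical numbers: observed statistic 2 sigma above the alternative's mean
cls = gaussian_cls(dt_obs=4.0, mu_alt=0.0, mu_null=3.0, sigma=2.0)
```

    Because the numerator shrinks much faster than the denominator as the observed statistic moves away from the alternative hypothesis, CLs exclusion is deliberately conservative relative to a plain p-value cut.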

  5. Constraining the physical state by symmetries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fatibene, L., E-mail: lorenzo.fatibene@unito.it; INFN - Sezione Torino - IS QGSKY; Ferraris, M.

    After reviewing the hole argument and its relation to the initial value problem and general covariance, we discuss how much freedom one has to define the physical state in a generally covariant field theory (with or without internal gauge symmetries). Our analysis relies on Cauchy problems and is thus restricted to globally hyperbolic spacetimes. We show that in generally covariant theories on a compact space (as well as for internal gauge symmetries on any spacetime) one has no freedom: one is forced to declare as physically equivalent two configurations which differ by a global spacetime diffeomorphism (or by an internal gauge transformation), as is usually prescribed. On the contrary, when space is not compact, the result does not hold, and one may have different options to define physically equivalent configurations while still preserving determinism. - Highlights: • Investigates the relation between the hole argument, covariance, determinism and physical state. • Shows that if space is compact then any diffeomorphism is a gauge symmetry. • Shows that if space is not compact then there may be more freedom in choosing the gauge group.

  6. Statistical analysis of modeling error in structural dynamic systems

    NASA Technical Reports Server (NTRS)

    Hasselman, T. K.; Chrostowski, J. D.

    1990-01-01

    The paper presents a generic statistical model of the (total) modeling error for conventional space structures in their launch configuration. Modeling error is defined as the difference between analytical prediction and experimental measurement. It is represented by the differences between predicted and measured real eigenvalues and eigenvectors. Comparisons are made between pre-test and post-test models. Total modeling error is then subdivided into measurement error, experimental error and 'pure' modeling error, and comparisons made between measurement error and total modeling error. The generic statistical model presented in this paper is based on the first four global (primary structure) modes of four different structures belonging to the generic category of Conventional Space Structures (specifically excluding large truss-type space structures). As such, it may be used to evaluate the uncertainty of predicted mode shapes and frequencies, sinusoidal response, or the transient response of other structures belonging to the same generic category.

  7. Stochastic Calculus and Differential Equations for Physics and Finance

    NASA Astrophysics Data System (ADS)

    McCauley, Joseph L.

    2013-02-01

    1. Random variables and probability distributions; 2. Martingales, Markov, and nonstationarity; 3. Stochastic calculus; 4. Ito processes and Fokker-Planck equations; 5. Selfsimilar Ito processes; 6. Fractional Brownian motion; 7. Kolmogorov's PDEs and Chapman-Kolmogorov; 8. Non-Markov Ito processes; 9. Black-Scholes, martingales, and Feynman-Kac; 10. Stochastic calculus with martingales; 11. Statistical physics and finance, a brief history of both; 12. Introduction to new financial economics; 13. Statistical ensembles and time series analysis; 14. Econometrics; 15. Semimartingales; References; Index.

  8. Space biology initiative program definition review. Trade study 4: Design modularity and commonality

    NASA Technical Reports Server (NTRS)

    Jackson, L. Neal; Crenshaw, John, Sr.; Davidson, William L.; Herbert, Frank J.; Bilodeau, James W.; Stoval, J. Michael; Sutton, Terry

    1989-01-01

    The relative cost impacts (up or down) of developing Space Biology hardware using design modularity and commonality is studied. Recommendations for how the hardware development should be accomplished to meet optimum design modularity requirements for Life Science investigation hardware will be provided. In addition, the relative cost impacts of implementing commonality of hardware for all Space Biology hardware are defined. Cost analysis and supporting recommendations for levels of modularity and commonality are presented. A mathematical or statistical cost analysis method with the capability to support development of production design modularity and commonality impacts to parametric cost analysis is provided.

  9. Physical activity and risk of pancreatic cancer: a systematic review and meta-analysis.

    PubMed

    Behrens, Gundula; Jochem, Carmen; Schmid, Daniela; Keimling, Marlen; Ricci, Cristian; Leitzmann, Michael F

    2015-04-01

    Physical activity may prevent pancreatic cancer by regulating body weight and decreasing insulin resistance, DNA damage, and chronic inflammation. Previous meta-analyses found inconsistent evidence for a protective effect of physical activity on pancreatic cancer but those studies did not investigate whether the association between physical activity and pancreatic cancer varies by smoking status, body mass index (BMI), or level of consistency of physical activity over time. To address these issues, we conducted an updated meta-analysis following the PRISMA guidelines among 30 distinct studies with a total of 10,501 pancreatic cancer cases. Random effects meta-analysis of cohort studies revealed a weak, statistically significant reduction in pancreatic cancer risk for high versus low levels of physical activity (relative risk (RR) 0.93, 95 % confidence interval (CI) 0.88-0.98). By comparison, case-control studies yielded a stronger, statistically significant risk reduction (RR 0.78, 95 % CI 0.66-0.94; p-difference by study design = 0.07). When focusing on cohort studies, physical activity summary risk estimates appeared to be more pronounced for consistent physical activity over time (RR 0.86, 95 % CI 0.76-0.97) than for recent past physical activity (RR 0.95, 95 % CI 0.90-1.01) or distant past physical activity (RR 0.95, 95 % CI 0.79-1.15, p-difference by timing in life of physical activity = 0.36). Physical activity summary risk estimates did not differ by smoking status or BMI. In conclusion, physical activity is not strongly associated with pancreatic cancer risk, and the relation is not modified by smoking status or BMI level. While overall findings were weak, we did find some suggestion of potential pancreatic cancer risk reduction with consistent physical activity over time.
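    Random-effects pooling of relative risks, as used for the summary estimates above, is conventionally done with the DerSimonian-Laird estimator on the log scale. A minimal sketch; the per-study values below are made up for illustration, not taken from the meta-analysis:

```python
import math

def dersimonian_laird(log_rrs, variances):
    """Pool per-study log relative risks under a random-effects model.

    Returns the pooled relative risk and its 95% confidence interval.
    """
    k = len(log_rrs)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                     # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    rr = math.exp(pooled)
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return rr, ci

# hypothetical per-study relative risks with their log-scale variances
log_rrs = [math.log(x) for x in (0.85, 0.90, 0.95, 1.00, 0.88)]
rr, ci = dersimonian_laird(log_rrs, [0.004, 0.006, 0.003, 0.005, 0.008])
```

    When the between-study heterogeneity tau2 is zero, the estimate collapses to the inverse-variance fixed-effect result; otherwise the interval widens to reflect variation across studies.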

  10. Space station interior noise analysis program

    NASA Technical Reports Server (NTRS)

    Stusnick, E.; Burn, M.

    1987-01-01

    Documentation is provided for a microcomputer program which was developed to evaluate the effect of the vibroacoustic environment on speech communication inside a space station. The program, entitled Space Station Interior Noise Analysis Program (SSINAP), combines a Statistical Energy Analysis (SEA) prediction of sound and vibration levels within the space station with a speech intelligibility model based on the Modulation Transfer Function and the Speech Transmission Index (MTF/STI). The SEA model provides an effective analysis tool for predicting the acoustic environment based on proposed space station design. The MTF/STI model provides a method for evaluating speech communication in the relatively reverberant and potentially noisy environments that are likely to occur in space stations. The combination of these two models provides a powerful analysis tool for optimizing the acoustic design of space stations from the point of view of speech communications. The mathematical algorithms used in SSINAP to implement the SEA and MTF/STI models are presented. An appendix provides an explanation of the operation of the program along with details of the program structure and code.

  11. Correlated randomness: Some examples of exotic statistical physics

    NASA Astrophysics Data System (ADS)

    Stanley, H. Eugene

    2005-05-01

    One challenge of biology, medicine, and economics is that the systems treated by these sciences have no perfect metronome in time and no perfect spatial architecture -- crystalline or otherwise. Nonetheless, as if by magic, out of nothing but randomness one finds remarkably fine-tuned processes in time and remarkably fine-tuned structures in space. To understand this `miracle', one might consider placing aside the human tendency to see the universe as a machine. Instead, one might address the challenge of uncovering how, through randomness (albeit, as we shall see, strongly correlated randomness), one can arrive at many spatial and temporal patterns in biology, medicine, and economics. Inspired by principles developed by statistical physics over the past 50 years -- scale invariance and universality -- we review some recent applications of correlated randomness to fields that might startle Boltzmann if he were alive today.

  12. Magnetofermionic condensate in two dimensions

    PubMed Central

    Kulik, L. V.; Zhuravlev, A. S.; Dickmann, S.; Gorbunov, A. V.; Timofeev, V. B.; Kukushkin, I. V.; Schmult, S.

    2016-01-01

    Coherent condensate states of particles obeying either Bose or Fermi statistics are in the focus of interest in modern physics. Here we report on condensation of collective excitations with Bose statistics, cyclotron magnetoexcitons, in a high-mobility two-dimensional electron system in a magnetic field. At low temperatures, the dense non-equilibrium ensemble of long-lived triplet magnetoexcitons exhibits both a drastic reduction in the viscosity and a steep enhancement in the response to the external electromagnetic field. The observed effects are related to formation of a super-absorbing state interacting coherently with the electromagnetic field. Simultaneously, the electrons below the Fermi level form a super-emitting state. The effects are explicable from the viewpoint of a coherent condensate phase in a non-equilibrium system of two-dimensional fermions with a fully quantized energy spectrum. The condensation occurs in the space of vectors of magnetic translations, a property providing a completely new landscape for future physical investigations. PMID:27848969

  13. Methods to Measure Physical Activity Behaviors in Health Education Research

    ERIC Educational Resources Information Center

    Fitzhugh, Eugene C.

    2015-01-01

    Regular physical activity (PA) is an important concept to measure in health education research. The health education researcher might need to measure physical activity because it is the primary measure of interest, or PA might be a confounding measure that needs to be controlled for in statistical analysis. The purpose of this commentary is to…

  14. Generating a Multiphase Equation of State with Swarm Intelligence

    NASA Astrophysics Data System (ADS)

    Cox, Geoffrey

    2017-06-01

    Hydrocode calculations require knowledge of the variation of pressure of a material with density and temperature, which is given by the equation of state. An accurate model needs to account for discontinuities in energy, density and properties of a material across a phase boundary. When generating a multiphase equation of state the modeller attempts to balance the agreement between the available data for compression, expansion and phase boundary location. However, this can prove difficult because minor adjustments in the equation of state for a single phase can have a large impact on the overall phase diagram. Recently, Cox and Christie described a method for combining statistical-mechanics-based condensed matter physics models with a stochastic analysis technique called particle swarm optimisation. The models produced show good agreement with experiment over a wide range of pressure-temperature space. This talk details the general implementation of this technique, shows example results, and describes the types of analysis that can be performed with this method.

  15. Analysis on H Spectral Shape During the Early 2012 SEPs with the PAMELA Experiment

    NASA Technical Reports Server (NTRS)

    Martucci, Matteo; Boezio, M.; Bravar, U.; Carbone, R.; Christian, E. R.; De Nolfo, G. A.; Merge, M.; Mocchiutti, E.; Munini, R.; Ricci, M.; hide

    2013-01-01

    The satellite-borne PAMELA experiment has been continuously collecting data since 2006. This apparatus is designed to study charged particles in the cosmic radiation. The combination of a permanent magnet, a silicon strip tracker and a silicon-tungsten imaging calorimeter, and the redundancy of instrumentation allow very precise studies of the physics of cosmic rays in a wide energy range and with high statistics. This makes PAMELA a very suitable instrument for Solar Energetic Particle (SEP) observations. Not only does it span the energy range between the ground-based neutron monitor data and the observations of SEPs from space, but PAMELA also carries out the first direct measurements of the composition of the highest-energy SEP events, including those causing Ground Level Enhancements (GLEs). In particular, PAMELA has registered many SEP events during solar cycle 24, offering unique opportunities to address the question of high-energy SEP origin. A preliminary analysis of proton spectral behaviour during this event is presented in this work.

  16. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    NASA Technical Reports Server (NTRS)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical test results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of the individual programs leading to a statistical correlation report, and a set of examples, including complete listings of programs and input and output data.
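A common statistical correlation measure between test and analytical mode shapes is the modal assurance criterion (MAC). The record does not specify the exact formulation used, so the numpy sketch below is a generic illustration, with synthetic mode shapes standing in for the NASTRAN and modal-survey data:

```python
import numpy as np

def mac(phi_test, phi_fem):
    """Modal assurance criterion matrix between two mode-shape sets.

    Columns are mode-shape vectors. MAC[i, j] is close to 1 when test
    mode i and analytical mode j describe the same shape (up to scale),
    and close to 0 when they are independent.
    """
    num = np.abs(phi_test.T @ phi_fem) ** 2
    den = np.outer(np.sum(phi_test ** 2, axis=0),
                   np.sum(phi_fem ** 2, axis=0))
    return num / den

# Two analytical modes, and "measured" versions with scaling and noise.
rng = np.random.default_rng(1)
phi_a = np.linalg.qr(rng.standard_normal((10, 2)))[0]  # orthonormal columns
phi_t = 3.0 * phi_a + 0.01 * rng.standard_normal((10, 2))
m = mac(phi_t, phi_a)
```

The MAC is deliberately scale-invariant, which is why the factor of 3 applied to the "measured" shapes does not degrade the diagonal values.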

  17. Space, time, and the third dimension (model error)

    USGS Publications Warehouse

    Moss, Marshall E.

    1979-01-01

    The space-time tradeoff of hydrologic data collection (the ability to substitute spatial coverage for temporal extension of records, or vice versa) is controlled jointly by the statistical properties of the phenomena being measured and by the model that is used to meld the information sources. The control exerted on the space-time tradeoff by the model and its accompanying errors has seldom been studied explicitly. The technique known as Network Analyses for Regional Information (NARI) permits such a study of the regional regression model that is used to relate streamflow parameters to the physical and climatic characteristics of the drainage basin. The NARI technique shows that model improvement is a viable and sometimes necessary means of improving regional data collection systems. Model improvement provides an immediate increase in the accuracy of regional parameter estimation and also increases the information potential of future data collection. Model improvement, which can only be measured in a statistical sense, cannot be quantitatively estimated prior to its achievement; thus an attempt to upgrade a particular model entails a certain degree of risk on the part of the hydrologist.

  18. Numerical solutions of the semiclassical Boltzmann ellipsoidal-statistical kinetic model equation

    PubMed Central

    Yang, Jaw-Yen; Yan, Chin-Yuan; Huang, Juan-Chen; Li, Zhihui

    2014-01-01

    Computations of rarefied gas dynamical flows governed by the semiclassical Boltzmann ellipsoidal-statistical (ES) kinetic model equation using an accurate numerical method are presented. The semiclassical ES model was derived through the maximum entropy principle and conserves not only the mass, momentum, and energy, but also contains additional higher-order moments that differ from the standard quantum distributions. A different decoding procedure to obtain the necessary parameters for determining the ES distribution is also devised. The numerical method in phase space combines the discrete-ordinate method in momentum space with the high-resolution shock-capturing method in physical space. Numerical solutions of two-dimensional Riemann problems for two configurations covering various degrees of rarefaction are presented, and various contours of the quantities unique to this new model are illustrated. When the relaxation time becomes very small, the main flow features display behavior similar to that of ideal quantum gas dynamics, and the present solutions are found to be consistent with existing calculations for a classical gas. The effect of a parameter that permits an adjustable Prandtl number in the flow is also studied. PMID:25104904

  19. An elevated level of physical activity is associated with normal lipoprotein(a) levels in individuals from Maracaibo, Venezuela.

    PubMed

    Bermúdez, Valmore; Aparicio, Daniel; Rojas, Edward; Peñaranda, Lianny; Finol, Freddy; Acosta, Luis; Mengual, Edgardo; Rojas, Joselyn; Arráiz, Nailet; Toledo, Alexandra; Colmenares, Carlos; Urribarí, Jesica; Sanchez, Wireynis; Pineda, Carlos; Rodriguez, Dalia; Faria, Judith; Añez, Roberto; Cano, Raquel; Cano, Clímaco; Sorell, Luis; Velasco, Manuel

    2010-01-01

    Coronary artery disease is the main cause of death worldwide. Lipoprotein(a) [Lp(a)] is an independent risk factor for coronary artery disease whose concentrations are genetically regulated. Contradictory results have been published about the influence of physical activity on Lp(a) concentration. This research aimed to determine associations between different physical activity levels and Lp(a) concentration. A descriptive, cross-sectional study was made in 1340 randomly selected subjects (males = 598; females = 712), for whom a complete clinical history, the International Physical Activity Questionnaire, and an Lp(a) level determination were obtained. Statistical analysis was carried out to assess relationships between qualitative variables by chi-squared tests and differences between means by one-way analysis of variance, considering a P value <0.05 as statistically significant. Results are shown as absolute frequencies, percentages, and mean +/- standard deviation as appropriate. Physical activity levels were ordinally classified as follows: low activity, 24.3% (n = 318); moderate activity, 35.0% (n = 458); and high physical activity, 40.8% (n = 534). The Lp(a) concentration in the studied sample was 26.28 +/- 12.64 (CI: 25.59-26.96) mg/dL. Lp(a) concentrations for the low, moderate, and high physical activity levels were 29.22 +/- 13.74, 26.27 +/- 12.91, and 24.53 +/- 11.35 mg/dL, respectively, with statistically significant differences between the low and moderate levels (P = 0.004) and between the low and high levels (P < 0.001). A strong association (chi-squared = 9.771; P = 0.002) was observed between a high physical activity level and a normal concentration of Lp(a) (less than 30 mg/dL). A lifestyle characterized by high physical activity is associated with normal Lp(a) levels.

  20. Interior properties of the inner Saturnian moons from space astrometry data

    NASA Astrophysics Data System (ADS)

    Lainey, Valery; Noyelles, Benoît; Cooper, Nick; Murray, Carl; Park, Ryan; Rambaux, Nicolas

    2018-04-01

    During thirteen years in orbit around Saturn before its final plunge, the Cassini spacecraft provided more than ten thousand astrometric measurements. Such a large amount of accurate data enables the search for extremely faint signals in the orbital motion of the moons. Among those, the detection of the dynamical feedback of the rotation of the inner moons of Saturn on their respective orbits becomes possible. Using all the currently available astrometric data associated with Atlas, Prometheus, Pandora, Janus and Epimetheus, we provide a detailed analysis of the ISS data, with special emphasis on their statistical behavior and sources of bias. We then attempt to quantify the physical librations of Prometheus, Pandora, Epimetheus and Janus from the monitoring of their orbits. Lastly, we show how introducing measurements directly derived from imaging can provide tighter constraints on these quantities.

  1. Seismic sample areas defined from incomplete catalogues: an application to the Italian territory

    NASA Astrophysics Data System (ADS)

    Mulargia, F.; Tinti, S.

    1985-11-01

    The comprehensive understanding of earthquake source physics under real conditions requires the study not of single faults as separate entities, but rather of a seismically active region as a whole, accounting for the interaction among different structures. We define a "seismic sample area" as the most convenient region to be used as a natural laboratory for the study of seismic source physics; this coincides with the region where the average large-magnitude seismicity is the highest. To this end, the future time and space distributions of large earthquakes must be estimated. When catalog seismicity is used as an input, the rate of occurrence is not constant but appears generally biased by incompleteness in some parts of the catalog and by possible nonstationarities in seismic activity. We present a statistical procedure which is capable, under a few mild assumptions, of both detecting nonstationarities in seismicity and finding the incomplete parts of a seismic catalog. The procedure is based on Kolmogorov-Smirnov nonparametric statistics and can be applied without assuming a priori the parent distribution of the events. The efficiency of this procedure allows the analysis of small data sets. An application to the Italian territory is presented, using the most recent version of the ENEL seismic catalog. Seismic activity takes place in six well-defined areas, but only five of them have a number of events sufficient for analysis. Barring a few exceptions, seismicity is found to be stationary throughout the whole catalog span (1000-1980). The eastern Alps region stands out as the best "sample area", with the highest average probability of event occurrence per unit time and area. The final objective of this characterization is to stimulate a program of intensified research.
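The Kolmogorov-Smirnov idea behind such a procedure can be illustrated by testing whether event times are compatible with a constant rate of occurrence (a uniform distribution over the catalog span). The Python sketch below is a generic illustration, not the authors' procedure; the synthetic "catalogs" and the power-law incompleteness bias are invented for the example:

```python
import numpy as np

def ks_uniform(times, t0, t1):
    """One-sample Kolmogorov-Smirnov statistic of event times against a
    uniform distribution on [t0, t1], i.e. a constant rate of occurrence."""
    t = np.sort(np.asarray(times, float))
    n = t.size
    cdf = (t - t0) / (t1 - t0)            # uniform CDF at each event time
    ecdf_hi = np.arange(1, n + 1) / n     # empirical CDF just after each event
    ecdf_lo = np.arange(0, n) / n         # ... and just before
    return max(np.max(ecdf_hi - cdf), np.max(cdf - ecdf_lo))

rng = np.random.default_rng(2)
# Stationary catalog: uniform event times over a 980-year span.
d_ok = ks_uniform(rng.uniform(0, 980, 300), 0, 980)
# Incomplete catalog: early events under-reported (times piled up late).
d_bad = ks_uniform(980 * rng.random(300) ** 0.3, 0, 980)
crit = 1.36 / np.sqrt(300)                # approximate 5% critical value
```

A statistic well above the critical value flags incompleteness or nonstationarity; because the KS test is nonparametric, no parent distribution of event sizes needs to be assumed.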

  2. Application of Multi-Hypothesis Sequential Monte Carlo for Breakup Analysis

    NASA Astrophysics Data System (ADS)

    Faber, W. R.; Zaidi, W.; Hussein, I. I.; Roscoe, C. W. T.; Wilkins, M. P.; Schumacher, P. W., Jr.

    As more objects are launched into space, the potential for breakup events and space object collisions is ever increasing. These events create large clouds of debris that are extremely hazardous to space operations. Providing timely, accurate, and statistically meaningful Space Situational Awareness (SSA) data is crucial in order to protect assets and operations in space. The space object tracking problem, in general, is nonlinear in both state dynamics and observations, making it ill-suited to linear filtering techniques such as the Kalman filter. Additionally, given the multi-object, multi-scenario nature of the problem, space situational awareness requires multi-hypothesis tracking and management that is combinatorially challenging in nature. In practice, it is often seen that assumptions of underlying linearity and/or Gaussianity are used to provide tractable solutions to the multiple space object tracking problem. However, these assumptions are, at times, detrimental to tracking data and provide statistically inconsistent solutions. This paper details a tractable solution to the multiple space object tracking problem applicable to space object breakup events. Within this solution, simplifying assumptions of the underlying probability density function are relaxed and heuristic methods for hypothesis management are avoided. This is done by implementing Sequential Monte Carlo (SMC) methods for both nonlinear filtering and hypothesis management. The goal of this paper is to detail the solution and use it as a platform to discuss computational limitations that hinder proper analysis of large breakup events.
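The simplest member of the Sequential Monte Carlo family the abstract refers to is the bootstrap particle filter. The sketch below tracks a one-dimensional random-walk state observed in noise, a toy stand-in for orbital dynamics and not the paper's multi-hypothesis solution; all noise levels and particle counts are arbitrary:

```python
import numpy as np

def bootstrap_particle_filter(obs, n_particles=2000, q=0.1, r=0.5, seed=3):
    """Bootstrap SMC filter for a 1-D random-walk state observed in noise:
        x_k = x_{k-1} + N(0, q^2),   y_k = x_k + N(0, r^2).
    Returns the posterior-mean state estimate at each step."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)        # initial particle cloud
    means = []
    for y in obs:
        x = x + q * rng.standard_normal(n_particles)    # propagate
        logw = -0.5 * ((y - x) / r) ** 2                # likelihood weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))
        idx = rng.choice(n_particles, n_particles, p=w) # resample
        x = x[idx]
    return np.array(means)

# Synthetic truth: a slow random walk; noisy observations of it.
rng = np.random.default_rng(4)
truth = np.cumsum(0.1 * rng.standard_normal(50))
ys = truth + 0.5 * rng.standard_normal(50)
est = bootstrap_particle_filter(ys)
```

Because no linearity or Gaussianity is imposed on the posterior, the same propagate-weight-resample loop carries over to nonlinear dynamics and observation models.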

  3. Modification of Poisson Distribution in Radioactive Particle Counting.

    ERIC Educational Resources Information Center

    Drotter, Michael T.

    This paper focuses on radioactive particle counting statistics in laboratory and field applications, intended to aid the Health Physics technician's understanding of the effect of indeterminate errors on radioactive particle counting. It indicates that although the statistical analysis of radioactive disintegration is best described by a Poisson…
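The Poisson character of radioactive counting means the variance of repeated counts equals their mean, so a single measurement of N counts carries an inherent uncertainty of about sqrt(N). A quick numerical check of this (a generic illustration, not taken from the paper):

```python
import numpy as np

# Simulate many repeated counting intervals with a true rate of 100
# counts per interval; for a Poisson process, variance equals mean.
rng = np.random.default_rng(5)
counts = rng.poisson(lam=100.0, size=100_000)
mean, var = counts.mean(), counts.var()
rel_err = 1.0 / np.sqrt(mean)   # fractional uncertainty of one count ~ 1/sqrt(N)
```

This is why longer counts (larger N) reduce the fractional statistical error even when the indeterminate error per count cannot be eliminated.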

  4. Improved analyses using function datasets and statistical modeling

    Treesearch

    John S. Hogland; Nathaniel M. Anderson

    2014-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...

  5. Statistical description of tectonic motions

    NASA Technical Reports Server (NTRS)

    Agnew, Duncan Carr

    1993-01-01

    This report summarizes investigations regarding tectonic motions. The topics discussed include statistics of crustal deformation, Earth rotation studies, using multitaper spectrum analysis techniques applied to both space-geodetic data and conventional astrometric estimates of the Earth's polar motion, and the development, design, and installation of high-stability geodetic monuments for use with the global positioning system.

  6. Physical Space and the Resource-Based View of the College

    ERIC Educational Resources Information Center

    Fugazzotto, Sam J.

    2010-01-01

    Space serves as a key resource for colleges and universities, and institutions exchange information about it with each other and with prospective students. Using content analysis to examine several widely circulated publications, this study looked for differences in the value attributed to space when institutional leaders present it to students…

  7. Research in space physics at the University of Iowa

    NASA Technical Reports Server (NTRS)

    Van Allen, J. A.

    1976-01-01

    Energetic particles in outer space and their relationship to electric, magnetic, and electromagnetic fields associated with the earth, sun, moon, and planets, and the interplanetary medium are investigated. Special attention was given to observations from earth and moon satellites and interplanetary spacecraft; phenomenological analysis and interpretation were emphasized. The work also covers ground-based radio astronomical and optical techniques and theoretical problems in plasma physics relevant to solar, planetary, and interplanetary phenomena.

  8. Towards Solving the Mixing Problem in the Decomposition of Geophysical Time Series by Independent Component Analysis

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)

    2000-01-01

    The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. Furthermore, ICA does not require additional a priori information such as the localization constraint used in Rotational Techniques.
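The distinction between second-order decorrelation and higher-order independence can be shown numerically: an orthogonal mixture of independent non-Gaussian sources is already perfectly decorrelated, so PCA cannot detect the mixing at all, while the excess kurtosis that ICA-style contrasts exploit can. A generic numpy sketch (not the authors' simulation experiment):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200_000
# Two independent, unit-variance, non-Gaussian sources (uniform).
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, n))
theta = 0.6
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
x = R @ s                      # an orthogonal mixing of the sources

# Second-order statistics cannot see the mixing: the mixed data is
# already decorrelated (covariance ~ identity), so PCA would return
# components that still mix the two "physical" sources.
white = np.allclose(np.cov(x), np.eye(2), atol=0.02)

# Higher-order statistics can: a mixture of independent non-Gaussian
# signals is closer to Gaussian (excess kurtosis nearer zero) than the
# sources themselves, which is the contrast ICA methods maximize.
kurt = lambda v: np.mean(v ** 4) / np.mean(v ** 2) ** 2 - 3.0
```

For a uniform source the excess kurtosis is -1.2; the mixed channels have a smaller magnitude, and an ICA algorithm searches for the rotation that restores the extremal values.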

  9. A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components

    NASA Technical Reports Server (NTRS)

    Abernethy, K.

    1986-01-01

    The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo simulation study to evaluate the usefulness of the Weibull methods, using samples with a very small number of failures and extensive censoring, are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs that are used in the SSME Weibull analysis are described. The documented methods were supplemented by adding computer calculations of approximate confidence intervals (using iterative methods) for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, along with the simulation program and the techniques used in it. Simulation results are tabulated for various combinations of Weibull shape parameters and numbers of failures in the samples.
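The random-censoring setup described here, with censoring times drawn from a uniform distribution, is easy to reproduce. The sketch below generates right-censored Weibull lifetimes and recovers the parameters by a deliberately crude grid search over the censored log-likelihood; it is a generic illustration, not the SSME program, and the sample size, true parameters, and grids are arbitrary:

```python
import numpy as np

def censored_weibull_loglik(shape, scale, t, failed):
    """Log-likelihood of Weibull(shape, scale) data with right censoring:
    failures contribute the density, censored points the survival function."""
    z = t / scale
    logpdf = np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape
    logsurv = -(z ** shape)
    return np.sum(np.where(failed, logpdf, logsurv))

rng = np.random.default_rng(7)
n, true_shape, true_scale = 400, 2.0, 10.0
life = true_scale * rng.weibull(true_shape, n)
cens = rng.uniform(0, 25, n)            # random (uniform) censoring times
t = np.minimum(life, cens)              # observed time: failure or censoring
failed = life <= cens                   # True where a failure was observed

# Crude maximum-likelihood fit by grid search (illustration only; a real
# analysis would use an iterative optimiser).
shapes = np.linspace(0.5, 4.0, 71)
scales = np.linspace(5.0, 20.0, 76)
ll = np.array([[censored_weibull_loglik(b, s, t, failed) for s in scales]
               for b in shapes])
i, j = np.unravel_index(ll.argmax(), ll.shape)
shape_hat, scale_hat = shapes[i], scales[j]
```

The same log-likelihood is the ingredient of the likelihood-ratio confidence intervals mentioned in the abstract: one profiles it over a parameter and finds where it drops by half a chi-squared quantile.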

  10. BRST technique for the cosmological density matrix

    NASA Astrophysics Data System (ADS)

    Barvinsky, A. O.

    2013-10-01

    The microcanonical density matrix in closed cosmology has a natural definition as a projector on the space of solutions of Wheeler-DeWitt equations, which is motivated by the absence of global non-vanishing charges and energy in spatially closed gravitational systems. Using the BRST/BFV formalism in relativistic phase space of gauge and ghost variables we derive the path integral representation for this projector and the relevant statistical sum. This derivation circumvents the difficulties associated with the open algebra of noncommutative quantum Dirac constraints and the construction/regularization of the physical inner product in the subspace of BRS singlets. This inner product is achieved via the Batalin-Marnelius gauge fixing in the space of BRS-invariant states, which in its turn is shown to be a result of truncation of the BRST/BFV formalism to the "matter" sector of relativistic phase space.

  11. Point pattern analysis of FIA data

    Treesearch

    Chris Woodall

    2002-01-01

    Point pattern analysis is a branch of spatial statistics that quantifies the spatial distribution of points in two-dimensional space. Point pattern analysis was conducted on stand stem-maps from FIA fixed-radius plots to explore point pattern analysis techniques and to determine the ability of pattern descriptions to describe stand attributes. Results indicate that the...
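A standard statistic in point pattern analysis is Ripley's K function, which for complete spatial randomness equals pi*r^2. The numpy sketch below is a generic illustration without edge correction (not the FIA analysis itself); it contrasts a random pattern with a clustered one of the kind stem-maps can exhibit:

```python
import numpy as np

def ripley_k(pts, r):
    """Naive Ripley's K for points in the unit square (no edge correction):
    K(r) = |A| / (n (n-1)) * number of ordered pairs closer than r."""
    n = pts.shape[0]
    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    close = (d <= r).sum() - n          # drop the i == j self-pairs
    return close / (n * (n - 1))        # |A| = 1 for the unit square

rng = np.random.default_rng(8)
# Complete spatial randomness: K(r) ~ pi r^2.
csr = rng.random((1000, 2))
# Clustered pattern: offspring piled around a few parent locations.
parents = rng.random((20, 2))
clustered = np.clip(parents[rng.integers(0, 20, 1000)]
                    + 0.01 * rng.standard_normal((1000, 2)), 0, 1)
r = 0.05
k_csr, k_clust = ripley_k(csr, r), ripley_k(clustered, r)
```

Values of K(r) well above pi*r^2 indicate clustering at scale r, values below it indicate regularity; production analyses add an edge correction, which this sketch omits for brevity.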

  12. A pedagogical derivation of the matrix element method in particle physics data analysis

    NASA Astrophysics Data System (ADS)

    Sumowidagdo, Suharyo

    2018-03-01

    The matrix element method provides a direct connection between the underlying theory of particle physics processes and detector-level physical observables. I am presenting a pedagogically-oriented derivation of the matrix element method, drawing from elementary concepts in probability theory, statistics, and the process of experimental measurements. The level of treatment should be suitable for beginning research student in phenomenology and experimental high energy physics.

  13. Optimal linear and nonlinear feature extraction based on the minimization of the increased risk of misclassification. [Bayes theorem - statistical analysis/data processing

    NASA Technical Reports Server (NTRS)

    Defigueiredo, R. J. P.

    1974-01-01

    General classes of nonlinear and linear transformations were investigated for the reduction of the dimensionality of the classification (feature) space so that, for a prescribed dimension m of this space, the increase of the misclassification risk is minimized.

  14. Preliminary Multi-Variable Parametric Cost Model for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Hendrichs, Todd

    2010-01-01

    This slide presentation reviews the creation of a preliminary multi-variable cost model for the contract costs of making a space telescope. It discusses the methodology for collecting the data, the definition of the statistical analysis methodology, single-variable model results, the testing of historical models, and an introduction to the multi-variable models.

  15. Unified Approach to Modeling and Simulation of Space Communication Networks and Systems

    NASA Technical Reports Server (NTRS)

    Barritt, Brian; Bhasin, Kul; Eddy, Wesley; Matthews, Seth

    2010-01-01

    Network simulator software tools are often used to model the behaviors and interactions of applications, protocols, packets, and data links in terrestrial communication networks. Other software tools that model the physics, orbital dynamics, and RF characteristics of space systems have matured to allow for rapid, detailed analysis of space communication links. However, the absence of a unified toolset that integrates the two modeling approaches has encumbered the systems engineers tasked with the design, architecture, and analysis of complex space communication networks and systems. This paper presents the unified approach and describes the motivation, challenges, and our solution - the customization of the network simulator to integrate with astronautical analysis software tools for high-fidelity end-to-end simulation. Keywords: space; communication; systems; networking; simulation; modeling; QualNet; STK; integration; space networks

  16. Testing New Physics with the Cosmic Microwave Background

    NASA Astrophysics Data System (ADS)

    Gluscevic, Vera

    2013-01-01

    In my thesis work, I have developed and applied tests of new fundamental physics that utilize high-precision CMB polarization measurements. I especially focused on a wide class of dark energy models that propose existence of new scalar fields to explain accelerated expansion of the Universe. Such fields naturally exhibit a weak interaction with photons, giving rise to "cosmic birefringence"---a rotation of the polarization plane of light traveling cosmological distances, which alters the statistics of the CMB fluctuations in the sky by inducing a characteristic B-mode polarization. A birefringent rotation of the CMB would be smoking-gun evidence that dark energy is a dynamical component rather than a cosmological constant, while its absence gives clues about the allowed regions of the parameter space for new models. I developed a full-sky formalism to search for cosmic birefringence by cross-correlating CMB temperature and polarization maps, after allowing for the rotation angle to vary across the sky. With my collaborators, I also proposed a cross-correlation of the rotation-angle estimator with the CMB temperature as a novel statistical probe which can boost signal-to-noise in the case of marginal detection and help disentangle the underlying physical models. I then investigated the degeneracy between the rotation signal and the signals from other exotic scenarios that induce a similar B-mode polarization signature, such as chiral primordial gravitational waves, and demonstrated that these effects are completely separable. Finally, I applied this formalism to WMAP-7 data and derived the first CMB constraint on the power spectrum of the birefringent-rotation angle and presented forecasts for future experiments. To demonstrate the value of this analysis method beyond the search for direction-dependent cosmic birefringence, I have also used it to probe patchy screening from the epoch of cosmic reionization with WMAP-7 data.

  17. Impact of baryonic physics on intrinsic alignments

    DOE PAGES

    Tenneti, Ananth; Gnedin, Nickolay Y.; Feng, Yu

    2017-01-11

    We explore the effects of specific assumptions in the subgrid models of star formation and stellar and AGN feedback on intrinsic alignments of galaxies in cosmological simulations of the "MassiveBlack-II" family. Using smaller-volume simulations, we explored the parameter space of the subgrid star formation and feedback model and found remarkable robustness of the observable statistical measures to the details of subgrid physics. The one observational probe most sensitive to modeling details is the distribution of misalignment angles. We hypothesize that the amount of angular momentum carried away by the galactic wind is the primary physical quantity that controls the orientation of the stellar distribution. Finally, our results are also consistent with a similar study by the EAGLE simulation team.

  18. The Wang-Landau Sampling Algorithm

    NASA Astrophysics Data System (ADS)

    Landau, David P.

    2003-03-01

    Over the past several decades Monte Carlo simulations[1] have evolved into a powerful tool for the study of wide-ranging problems in statistical/condensed matter physics. Standard methods sample the probability distribution for the states of the system, usually in the canonical ensemble, and enormous improvements have been made in performance through the implementation of novel algorithms. Nonetheless, difficulties arise near phase transitions, either due to critical slowing down near 2nd order transitions or to metastability near 1st order transitions, thus limiting the applicability of the method. We shall describe a new and different Monte Carlo approach [2] that uses a random walk in energy space to determine the density of states directly. Once the density of states is estimated, all thermodynamic properties can be calculated at all temperatures. This approach can be extended to multi-dimensional parameter spaces and has already found use in classical models of interacting particles including systems with complex energy landscapes, e.g., spin glasses, protein folding models, etc., as well as for quantum models. 1. A Guide to Monte Carlo Simulations in Statistical Physics, D. P. Landau and K. Binder (Cambridge U. Press, Cambridge, 2000). 2. Fugao Wang and D. P. Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E64, 056101-1 (2001).
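The random walk in energy space described above can be made concrete for a small system whose density of states is known exactly. The following implementation of the Wang-Landau algorithm for a periodic 1-D Ising chain is a minimal illustration; the chain length, flatness criterion, and final modification factor are arbitrary choices for the sketch:

```python
import numpy as np

def wang_landau(L=12, flat=0.8, lnf_final=1e-4, seed=9):
    """Wang-Landau estimate of the density of states g(E) for a periodic
    1-D Ising chain of L spins, with E = -sum_i s_i s_{i+1}."""
    rng = np.random.default_rng(seed)
    levels = np.arange(-L, L + 1, 4)          # allowed energy levels
    index = {E: i for i, E in enumerate(levels)}
    ln_g = np.zeros(levels.size)              # running estimate of ln g(E)
    s = np.ones(L, dtype=int)
    E = -L
    lnf = 1.0                                 # modification factor, ln f
    while lnf > lnf_final:
        hist = np.zeros(levels.size)          # fresh histogram per stage
        while True:
            for _ in range(1000):
                i = rng.integers(L)
                dE = 2 * s[i] * (s[i - 1] + s[(i + 1) % L])
                j, k = index[E], index[E + dE]
                # Accept the flip with probability min(1, g(E)/g(E')).
                if ln_g[j] - ln_g[k] > np.log(rng.random()):
                    s[i] = -s[i]
                    E += dE
                    j = k
                ln_g[j] += lnf                # update density of states
                hist[j] += 1                  # ... and the visit histogram
            if hist.min() > flat * hist.mean():
                break                         # histogram "flat": refine f
        lnf /= 2.0
    # Normalise so the total number of states is 2^L.
    g = np.exp(ln_g - ln_g.max())
    return levels, g * (2.0 ** L) / g.sum()

levels, g = wang_landau()
```

For this chain the exact density of states is combinatorial (g(E) = 2*C(L, w) configurations with w domain walls, E = -L + 2w), so the estimate can be checked directly; once g(E) is in hand, all thermodynamic averages follow at any temperature, which is the point of the method.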

  19. No Space for Girliness in Physics: Understanding and Overcoming the Masculinity of Physics

    ERIC Educational Resources Information Center

    Götschel, Helene

    2014-01-01

    Allison Gonsalves' article on "women doctoral students' positioning around discourses of gender and competence in physics" explores narratives of Canadian women physicists concerning their strategies to gain recognition as physicists. In my response to her rewarding and inspiring analysis I will reflect on her findings and arguments and…

  20. NASA Langley's Approach to the Sandia's Structural Dynamics Challenge Problem

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Kenny, Sean P.; Crespo, Luis G.; Elliott, Kenny B.

    2007-01-01

    The objective of this challenge is to develop a data-based probabilistic model of uncertainty to predict the behavior of subsystems (payloads) by themselves and while coupled to a primary (target) system. Although this type of analysis is routinely performed and representative of issues faced in real-world system design and integration, there are still several key technical challenges that must be addressed when analyzing uncertain interconnected systems. For example, one key technical challenge is related to the fact that there is limited data on target configurations. Moreover, it is typical to have multiple data sets from experiments conducted at the subsystem level, but often sample sizes are not sufficient to compute high-confidence statistics. In this challenge problem additional constraints are placed as ground rules for the participants. One such rule is that mathematical models of the subsystem are limited to linear approximations of the nonlinear physics of the problem at hand. Also, participants are constrained to use these models and the multiple data sets to make predictions about the target system response under completely different input conditions. Our approach initially involved the screening of several different methods. Three of the ones considered are presented herein. The first is based on the transformation of the modal data to an orthogonal space where the mean and covariance of the data are matched by the model. The other two approaches develop solutions in physical space, where the uncertain parameter set consists of masses, stiffnesses, and damping coefficients; one matches confidence intervals of low-order moments of the statistics via optimization, while the second uses a kernel density estimation approach. The paper will touch on all the approaches, lessons learned, validation metrics and their comparison, data quantity restrictions, and the assumptions/limitations of each approach.
Keywords: Probabilistic modeling, model validation, uncertainty quantification, kernel density

  1. An MCMC determination of the primordial helium abundance

    NASA Astrophysics Data System (ADS)

    Aver, Erik; Olive, Keith A.; Skillman, Evan D.

    2012-04-01

    Spectroscopic observations of the chemical abundances in metal-poor H II regions provide an independent method for estimating the primordial helium abundance. H II regions are described by several physical parameters such as electron density, electron temperature, and reddening, in addition to y, the ratio of helium to hydrogen. It had been customary to estimate or determine self-consistently these parameters to calculate y. Frequentist analyses of the parameter space have been shown to be successful in these parameter determinations, and Markov Chain Monte Carlo (MCMC) techniques have proven to be very efficient in sampling this parameter space. Nevertheless, accurate determination of the primordial helium abundance from observations of H II regions is constrained by both systematic and statistical uncertainties. In an attempt to better reduce the latter, and continue to better characterize the former, we apply MCMC methods to the large dataset recently compiled by Izotov, Thuan, & Stasińska (2007). To improve the reliability of the determination, a high quality dataset is needed. In pursuit of this, a variety of cuts are explored. The efficacy of the He I λ4026 emission line as a constraint on the solutions is first examined, revealing the introduction of systematic bias through its absence. As a clear measure of the quality of the physical solution, a χ2 analysis proves instrumental in the selection of data compatible with the theoretical model. Nearly two-thirds of the observations fall outside a standard 95% confidence level cut, which highlights the care necessary in selecting systems and warrants further investigation into potential deficiencies of the model or data. In addition, the method also allows us to exclude systems for which parameter estimations are statistical outliers. As a result, the final selected dataset gains in reliability and exhibits improved consistency. 
Regression to zero metallicity yields Yp = 0.2534 ± 0.0083, in broad agreement with the WMAP result. The inclusion of more observations shows promise for further reducing the uncertainty, but more high quality spectra are required.

  2. Entropy Production in Collisionless Systems. II. Arbitrary Phase-space Occupation Numbers

    NASA Astrophysics Data System (ADS)

    Barnes, Eric I.; Williams, Liliya L. R.

    2012-04-01

    We present an analysis of two thermodynamic techniques for determining equilibria of self-gravitating systems. One is the Lynden-Bell (LB) entropy maximization analysis that introduced violent relaxation. Since we do not use the Stirling approximation, which is invalid at small occupation numbers, our systems have finite mass, unlike LB's isothermal spheres. (Instead of Stirling, we utilize a very accurate smooth approximation for ln x!.) The second analysis extends entropy production extremization to self-gravitating systems, also without the use of the Stirling approximation. In addition to the LB statistical family characterized by the exclusion principle in phase space, and designed to treat collisionless systems, we also apply the two approaches to the Maxwell-Boltzmann (MB) families, which have no exclusion principle and hence represent collisional systems. We implicitly assume that all of the phase space is equally accessible. We derive entropy production expressions for both families and give the extremum conditions for entropy production. Surprisingly, our analysis indicates that extremizing entropy production rate results in systems that have maximum entropy, in both LB and MB statistics. In other words, both thermodynamic approaches lead to the same equilibrium structures.
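The point that the Stirling approximation is invalid at small occupation numbers, which motivates the approach above, is easy to verify numerically using math.lgamma for the exact ln x! (a generic check, not the authors' smooth approximation):

```python
import math

def ln_factorial_exact(x):
    """Exact ln(x!) via the log-gamma function."""
    return math.lgamma(x + 1)

def ln_factorial_stirling(x):
    """Leading-order Stirling approximation: ln(x!) ~ x ln x - x."""
    return x * math.log(x) - x

# Relative error of Stirling's approximation: excellent at large
# occupation numbers, badly wrong at the small ones that matter for
# sparsely occupied phase-space cells.
err = {n: abs(ln_factorial_stirling(n) - ln_factorial_exact(n))
            / ln_factorial_exact(n)
       for n in (2, 10, 1000)}
```

At n = 2 the leading-order formula is not even the right sign, while at n = 1000 the relative error is below 0.1%, which is why entropy derivations that rely on Stirling silently assume large occupation numbers.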

  3. An Integrated Analysis of the Physiological Effects of Space Flight: Executive Summary

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1985-01-01

    A large array of models was applied in a unified manner to solve problems in space flight physiology. Mathematical simulation was used as an alternative way of looking at physiological systems and maximizing the yield from previous space flight experiments. A medical data analysis system was created which consists of an automated data base, a computerized biostatistical and data analysis system, and a set of simulation models of physiological systems. Five basic models were employed: (1) a pulsatile cardiovascular model; (2) a respiratory model; (3) a thermoregulatory model; (4) a circulatory, fluid, and electrolyte balance model; and (5) an erythropoiesis regulatory model. Algorithms were provided to perform routine statistical tests, multivariate analysis, nonlinear regression analysis, and autocorrelation analysis. Special-purpose programs were prepared for rank correlation, factor analysis, and the integration of the metabolic balance data.

  4. From classical to quantum mechanics: ``How to translate physical ideas into mathematical language''

    NASA Astrophysics Data System (ADS)

    Bergeron, H.

    2001-09-01

    Following previous works by E. Prugovečki [Physica A 91A, 202 (1978) and Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)] on common features of classical and quantum mechanics, we develop a unified mathematical framework for classical and quantum mechanics (based on L2-spaces over classical phase space), in order to investigate to what extent quantum mechanics can be obtained as a simple modification of classical mechanics (on both logical and analytical levels). To obtain this unified framework, we split quantum theory in two parts: (i) general quantum axiomatics (a system is described by a state in a Hilbert space, observables are self-adjoint operators, and so on) and (ii) quantum mechanics proper, which specifies the Hilbert space as L2(Rn), the Heisenberg rule [pi,qj]=-iℏδij with p=-iℏ∇, the free Hamiltonian H=-ℏ2Δ/2m, and so on. We show that general quantum axiomatics (up to a supplementary "axiom of classicity") can be used as a nonstandard mathematical ground to formulate the physical ideas and equations of ordinary classical statistical mechanics. Thus, the question of a "true quantization" with "ℏ" must be seen as an independent physical problem, not directly related to the quantum formalism. At this stage, we show that this nonstandard formulation of classical mechanics exhibits a new kind of operation that has no classical counterpart: this operation is related to the "quantization process," and we show why quantization physically depends on group theory (the Galilei group). This analytical procedure of quantization replaces the "correspondence principle" (or canonical quantization) and allows us to map classical mechanics into quantum mechanics, giving all operators of quantum dynamics and the Schrödinger equation. The great advantage of this point of view is that quantization is based on concrete physical arguments and not derived from some "pure algebraic rule" (we also exhibit some limits of the correspondence principle). Moreover, spins for particles are naturally generated, including an approximation of their interaction with magnetic fields. We also recover by this approach the semi-classical formalism developed by E. Prugovečki [Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)].

  5. Agile Combat Support Doctrine and Logistics Officer Training: Do We Need an Integrated Logistics School for the Expeditionary Air and Space Force?

    DTIC Science & Technology

    2003-02-01

    Rank-Order Correlation Coefficients statistical analysis via SPSS 8.0. Interview informants' perceptions and perspectives are combined with...logistics training in facilitating the employment of doctrinal tenets in a deployed environment. Statistical Correlations: Confirmed Relationships...integration of technology and cross-functional training for the tactical practitioners. Statistical Correlations: Confirmed Relationships on the Need

  6. Generalized statistical mechanics approaches to earthquakes and tectonics.

    PubMed

    Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios

    2016-12-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract wide scientific interest and are incorporated in the probabilistic forecasting of seismicity at local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale for the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.
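    The q-exponential function that underlies NESM descriptions of seismicity can be sketched in a few lines; this is the standard Tsallis definition, shown here only as an illustration of how a power-law tail emerges (not code from the review):

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: exp_q(x) = [1 + (1-q)x]_+^(1/(1-q)).
    Recovers the ordinary exponential in the limit q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

# For q > 1 a survival function of the form q_exp(-x/x0, q) decays as a power
# law, which is how NESM accommodates scale-free earthquake statistics.
ordinary = q_exp(-1.0, 1.0)      # equals e^-1
heavy_tail = q_exp(-10.0, 1.5)   # far larger than e^-10: a fat tail
```

    Fitting q to observed magnitude or inter-event-time distributions is then an ordinary parameter-estimation problem, with q > 1 quantifying the departure from Boltzmann-Gibbs statistics.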

  7. Generalized statistical mechanics approaches to earthquakes and tectonics

    PubMed Central

    Papadakis, Giorgos; Michas, Georgios

    2016-01-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract wide scientific interest and are incorporated in the probabilistic forecasting of seismicity at local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale for the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548

  8. On-Line Analysis of Physiologic and Neurobehavioral Variables During Long-Duration Space Missions

    NASA Technical Reports Server (NTRS)

    Brown, Emery N.

    1999-01-01

    The goal of this project is to develop reliable statistical algorithms for on-line analysis of physiologic and neurobehavioral variables monitored during long-duration space missions. Maintenance of physiologic and neurobehavioral homeostasis during long-duration space missions is crucial for ensuring optimal crew performance. If countermeasures are not applied, alterations in homeostasis will occur in nearly all physiologic systems. During such missions, data from most of these systems will be continually and/or continuously monitored. Therefore, if these data can be analyzed as they are acquired and the status of these systems can be continually assessed, then once alterations are detected, appropriate countermeasures can be applied to correct them. One of the most important physiologic systems in which to maintain homeostasis during long-duration missions is the circadian system. To detect and treat alterations in circadian physiology during long-duration space missions requires development of: 1) a ground-based protocol to assess the status of the circadian system under the light-dark environment in which crews in space will typically work; and 2) appropriate statistical methods to make this assessment. The protocol in Project 1, Circadian Entrainment, Sleep-Wake Regulation and Neurobehavioral Performance, will study human volunteers under the simulated light-dark environment of long-duration space missions. Therefore, we propose to develop statistical models to characterize in near real time circadian and neurobehavioral physiology under these conditions. The specific aims of this project are to test the hypotheses that: 1) Dynamic statistical methods based on the Kronauer model of the human circadian system can be developed to estimate circadian phase, period, and amplitude from core-temperature data collected under the simulated light-dark conditions of long-duration space missions. 2) Analytic formulae and numerical algorithms can be developed to compute the error in the estimates of circadian phase, period and amplitude determined from the data in Specific Aim 1. 3) Statistical models can reliably detect in near real time (daily) significant alterations in the circadian physiology of individual subjects by analyzing the circadian and neurobehavioral data collected in Project 1. 4) Criteria can be developed using the Kronauer model and the recently developed Jewett model of cognitive performance and subjective alertness to define altered circadian and neurobehavioral physiology and to set conditions for immediate administration of countermeasures.
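    The core estimation task in Specific Aim 1 (recovering circadian phase and amplitude from core-temperature data) can be sketched with plain harmonic regression. This is a toy stand-in for the Kronauer-model-based methods the project proposes, and it assumes evenly spaced samples over an integer number of periods:

```python
import math

def fit_circadian(times_h, temps, period_h=24.0):
    """Harmonic regression T(t) = mesor + a cos(wt) + b sin(wt); assumes the
    samples are evenly spaced over an integer number of periods."""
    w = 2.0 * math.pi / period_h
    n = len(temps)
    mesor = sum(temps) / n
    a = 2.0 / n * sum(T * math.cos(w * t) for t, T in zip(times_h, temps))
    b = 2.0 / n * sum(T * math.sin(w * t) for t, T in zip(times_h, temps))
    amplitude = math.hypot(a, b)
    acrophase_h = (math.atan2(b, a) / w) % period_h  # clock time of the peak
    return mesor, amplitude, acrophase_h

# Synthetic 48 h core-temperature record: mesor 37.0 C, amplitude 0.4 C, peak at 17 h
times = [0.5 * i for i in range(96)]
temps = [37.0 + 0.4 * math.cos(2.0 * math.pi / 24.0 * (t - 17.0)) for t in times]
mesor, amplitude, peak_h = fit_circadian(times, temps)
```

    A real on-line implementation would update these estimates recursively as samples arrive and propagate their uncertainty, which is what Specific Aim 2's error formulae address.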

  9. Canonical Statistical Model for Maximum Expected Immission of Wire Conductor in an Aperture Enclosure

    NASA Technical Reports Server (NTRS)

    Bremner, Paul G.; Vazquez, Gabriel; Christiano, Daniel J.; Trout, Dawn H.

    2016-01-01

    Prediction of the maximum expected electromagnetic pick-up of conductors inside a realistic shielding enclosure is an important canonical problem for system-level EMC design of spacecraft, launch vehicles, aircraft and automobiles. This paper introduces a simple statistical power balance model for prediction of the maximum expected current in a wire conductor inside an aperture enclosure. It calculates both the statistical mean and variance of the immission from the physical design parameters of the problem. Familiar probability density functions can then be used to predict the maximum expected immission for design purposes. The statistical power balance model requires minimal EMC design information and solves orders of magnitude faster than existing numerical models, making it ultimately viable for scaled-up, full system-level modeling. Both experimental test results and full-wave simulation results are used to validate the foundational model.
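    The "familiar probability density function" step can be illustrated with a Rayleigh envelope, a common assumption for diffuse reverberant fields. Everything below is a hedged sketch under that assumption; the paper's actual distribution choice and units are not reproduced here:

```python
import math

def max_expected_envelope(mean_square_current, n_samples):
    """Toy maximum-expected-immission estimate: treat the current envelope as
    Rayleigh with E[I^2] = 2*sigma^2 (diffuse-field assumption) and use the
    leading-order expected maximum of n independent samples."""
    sigma = math.sqrt(mean_square_current / 2.0)
    return sigma * math.sqrt(2.0 * math.log(n_samples))

# Example: mean-square current of 2.0 (arbitrary units) over 1000 frequency samples
peak = max_expected_envelope(2.0, 1000)
```

    The design value grows only logarithmically with the number of independent samples, which is why a mean-and-variance power balance model is enough to bound the worst case.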

  10. The Economic Impact of Space Weather: Where Do We Stand?

    PubMed

    Eastwood, J P; Biffis, E; Hapgood, M A; Green, L; Bisi, M M; Bentley, R D; Wicks, R; McKinnell, L-A; Gibbs, M; Burnett, C

    2017-02-01

    Space weather describes the way in which the Sun, and conditions in space more generally, impact human activity and technology both in space and on the ground. It is now well understood that space weather represents a significant threat to infrastructure resilience, and is a source of risk that is wide-ranging in its impact and the pathways by which this impact may occur. Although space weather is growing rapidly as a field, work rigorously assessing the overall economic cost of space weather appears to be in its infancy. Here, we provide an initial literature review to gather and assess the quality of any published assessments of space weather impacts and socioeconomic studies. Generally speaking, there is a good volume of scientific peer-reviewed literature detailing the likelihood and statistics of different types of space weather phenomena. These phenomena all typically exhibit "power-law" behavior in their severity. The literature on documented impacts is not as extensive, with many case studies, but few statistical studies. The literature on the economic impacts of space weather is rather sparse and not as well developed when compared to the other sections, most probably due to the somewhat limited data that are available from end-users. The major risk is attached to power distribution systems and there is disagreement as to the severity of the technological footprint. This strongly controls the economic impact. Consequently, urgent work is required to better quantify the risk of future space weather events. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
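    The "power-law" severity behavior noted above is usually quantified with the continuous maximum-likelihood exponent estimator. A brief sketch on synthetic severities (the estimator form is standard; the data below are synthetic, not space weather observations):

```python
import math, random

def powerlaw_alpha_mle(samples, x_min):
    """Continuous power-law exponent MLE: alpha = 1 + n / sum(ln(x_i / x_min))."""
    tail = [x for x in samples if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

random.seed(0)
alpha_true, x_min = 2.5, 1.0
# Inverse-transform sampling from the tail P(X > x) = (x / x_min)^(1 - alpha)
severities = [x_min * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
              for _ in range(20000)]
alpha_hat = powerlaw_alpha_mle(severities, x_min)
```

    For economic impact work, the fitted exponent directly controls how much of the expected loss is concentrated in rare extreme events, which is why the exponent (and its uncertainty) matters more than the mean event size.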

  11. Evolutionary computing for the design search and optimization of space vehicle power subsystems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Klimeck, Gerhard; Hanks, David; Hua, Hook

    2004-01-01

    Evolutionary computing has proven to be a straightforward and robust approach for optimizing a wide range of difficult analysis and design problems. This paper discusses the application of these techniques to an existing space vehicle power subsystem resource and performance analysis simulation in a parallel processing environment. Our preliminary results demonstrate that this approach has the potential to improve the space system trade study process by allowing engineers to statistically weight subsystem goals of mass, cost and performance, and then automatically size power elements based on anticipated performance of the subsystem rather than on worst-case estimates.
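    The idea of weighting mass, cost and performance goals and letting an evolutionary loop size the power elements can be sketched as below. All model coefficients, weights, and the design parameterization are invented for illustration; none come from the paper:

```python
import random

def fitness(design, w_mass=0.5, w_cost=0.3, w_perf=0.2):
    """Scalarized objective for a hypothetical power subsystem:
    design = (solar_array_m2, battery_kwh). All coefficients are toy values."""
    array, batt = design
    mass = 10.0 * array + 12.0 * batt      # kg
    cost = 50.0 * array + 30.0 * batt      # k$
    power = 200.0 * array + 40.0 * batt    # delivered watts
    shortfall = max(0.0, 1000.0 - power)   # require at least 1 kW
    return w_mass * mass + w_cost * cost + w_perf * 10.0 * shortfall

def evolve(pop_size=40, gens=200, seed=1):
    """Truncation selection + blend crossover + Gaussian mutation."""
    rng = random.Random(seed)
    pop = [(rng.uniform(0.0, 10.0), rng.uniform(0.0, 20.0)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            children.append(tuple(
                max(0.0, 0.5 * (x + y) + rng.gauss(0.0, 0.2)) for x, y in zip(a, b)
            ))
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
```

    Changing the weights re-shapes the trade space without re-deriving anything, which is the appeal for trade studies; in practice the fitness call would invoke the subsystem performance simulation rather than a closed-form toy.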

  12. [An expert system for controlling the physical training program of crews on long-term space missions].

    PubMed

    Son'kin, V D; Egorov, A D; Zaĭtseva, V V; Son'kin, V V; Stepantsov, V I

    2003-01-01

    The concept of an in-flight expert system for controlling (ESC) the physical training program during extended (including Martian) space missions has been developed based on the literature dedicated to microgravity countermeasures and a retrospective analysis of the effectiveness of the known ESC methods. This concept and the principle of crew autonomy were used as prime assumptions for defining the structure of ESC-based training in long-duration and planetary missions.

  13. Student Trajectories in Physics: The Need for Analysis through a Socio-Cultural Lens

    ERIC Educational Resources Information Center

    Zapata, Mara

    2010-01-01

    An analysis of student connections through time and space relative to the core discipline of physics is attempted, as viewed through the lens of actor-network-theory, by Antonia Candela. Using lenses of cultural realities, networks, and perceived power in the discourse of one specific university in the capital city of Mexico and one undergraduate…

  14. Masses and decay constants of the Ds0 *(2317 ) and Ds 1(2460 ) from Nf=2 lattice QCD close to the physical point

    NASA Astrophysics Data System (ADS)

    Bali, Gunnar S.; Collins, Sara; Cox, Antonio; Schäfer, Andreas; RQCD Collaboration

    2017-10-01

    We perform a high-statistics study of the JP = 0+ and 1+ charmed-strange mesons, Ds0*(2317) and Ds1(2460), respectively. The effects of the nearby DK and D*K thresholds are taken into account by employing the corresponding four-quark operators. Six ensembles with Nf = 2 nonperturbatively O(a)-improved clover Wilson sea quarks at a = 0.07 fm are employed, covering different spatial volumes and pion masses: linear lattice extents L/a = 24, 32, 40, 64, equivalent to 1.7 fm to 4.5 fm, are realized for mπ = 290 MeV, and L/a = 48, 64, or 3.4 fm and 4.5 fm, for an almost physical pion mass of 150 MeV. Through a phase shift analysis and the effective range approximation we determine the scattering lengths, couplings to the thresholds and the infinite-volume masses. Differences relative to the experimental values are observed for these masses; however, this is likely to be due to discretization effects, as spin-averaged quantities and splittings are reasonably compatible with experiment. We also compute the weak decay constants of the scalar and axial-vector mesons and find fV0+ = 114(2)(0)(+5)(10) MeV and fA1+ = 194(3)(4)(+5)(10) MeV, where the errors are due to statistics, renormalization, finite-volume and lattice spacing effects.

  15. Perspective: Sloppiness and emergent theories in physics, biology, and beyond.

    PubMed

    Transtrum, Mark K; Machta, Benjamin B; Brown, Kevin S; Daniels, Bryan C; Myers, Christopher R; Sethna, James P

    2015-07-07

    Large scale models of physical phenomena demand the development of new statistical and computational tools in order to be effective. Many such models are "sloppy," i.e., exhibit behavior controlled by a relatively small number of parameter combinations. We review an information theoretic framework for analyzing sloppy models. This formalism is based on the Fisher information matrix, which is interpreted as a Riemannian metric on a parameterized space of models. Distance in this space is a measure of how distinguishable two models are based on their predictions. Sloppy model manifolds are bounded with a hierarchy of widths and extrinsic curvatures. The manifold boundary approximation can extract the simple, hidden theory from complicated sloppy models. We argue that the success of simple effective models in physics likewise emerges from complicated processes exhibiting a low effective dimensionality. We discuss the ramifications and consequences of sloppy models for biochemistry and science more generally. We suggest that the reason our complex world is understandable is due to the same fundamental reason: simple theories of macroscopic behavior are hidden inside complicated microscopic processes.
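    Sloppiness is easy to see in a tiny numerical example (a standard toy model in this literature, not taken from the paper): for a sum of two nearly degenerate exponentials, the Fisher information matrix already has eigenvalues spanning orders of magnitude.

```python
import math

def fim_two_exponentials(th1, th2, times):
    """Fisher information for y(t) = exp(-th1*t) + exp(-th2*t) under unit-noise
    least squares: J_ij = sum_t (dy/dth_i)(dy/dth_j). Returns its eigenvalues."""
    grads = [(-t * math.exp(-th1 * t), -t * math.exp(-th2 * t)) for t in times]
    j11 = sum(g1 * g1 for g1, _ in grads)
    j12 = sum(g1 * g2 for g1, g2 in grads)
    j22 = sum(g2 * g2 for _, g2 in grads)
    # Closed-form eigenvalues of a symmetric 2x2 matrix
    tr, det = j11 + j22, j11 * j22 - j12 * j12
    disc = math.sqrt(max(0.0, tr * tr / 4.0 - det))
    return tr / 2.0 + disc, tr / 2.0 - disc

sample_times = [0.5 * k for k in range(1, 21)]
lam_max, lam_min = fim_two_exponentials(1.0, 1.1, sample_times)
sloppiness_ratio = lam_max / lam_min
```

    The stiff direction (roughly the sum of the two rates) is well constrained by data, while the sloppy direction (their difference) is nearly invisible; the eigenvalue ratio is the quantitative signature.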

  16. "Space and Consequences": The Influence of the Roundtable Classroom Design on Student Dialogue

    ERIC Educational Resources Information Center

    Parsons, Caroline S.

    2016-01-01

    This study sought to explore how the design of both physical and virtual learning spaces influence student dialogue in a modern university. Qualitative analysis of the learning spaces in an undergraduate liberal arts program was conducted. Interview and focus group data from students and faculty, in addition to classroom observations, resulted in…

  17. Curvature and temperature of complex networks.

    PubMed

    Krioukov, Dmitri; Papadopoulos, Fragkiskos; Vahdat, Amin; Boguñá, Marián

    2009-09-01

    We show that heterogeneous degree distributions in observed scale-free topologies of complex networks can emerge as a consequence of the exponential expansion of hidden hyperbolic space. Fermi-Dirac statistics provides a physical interpretation of hyperbolic distances as energies of links. The hidden space curvature affects the heterogeneity of the degree distribution, while clustering is a function of temperature. We embed the internet into the hyperbolic plane and find a remarkable congruency between the embedding and our hyperbolic model. Besides proving our model realistic, this embedding may be used for routing with only local information, which holds significant promise for improving the performance of internet routing.
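    The Fermi-Dirac interpretation amounts to a connection probability of the familiar occupation-number form, with hyperbolic distance playing the role of link energy. A one-function sketch (the 2T scaling in the denominator follows the usual convention in the hyperbolic network literature; treat the exact prefactor as an assumption):

```python
import math

def connection_probability(distance, r_cutoff, temperature):
    """Fermi-Dirac connection probability: hyperbolic distance acts as the
    link energy and r_cutoff as the chemical potential."""
    return 1.0 / (1.0 + math.exp((distance - r_cutoff) / (2.0 * temperature)))
```

    At the cutoff radius the connection probability is exactly 1/2, and lowering the temperature sharpens the step, which is how temperature controls clustering in the model.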

  18. Statistical Symbolic Execution with Informed Sampling

    NASA Technical Reports Server (NTRS)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose informed sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis, and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
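    The Bayesian-estimation step can be sketched independently of symbolic execution itself: each sampled path is a Bernoulli trial for "reaches the target event," and a Beta prior is updated with the hit count. The toy branching program below stands in for a sampled symbolic path; it is not Symbolic PathFinder code:

```python
import random

def bayesian_hit_probability(sample_path, n_runs, prior_a=1.0, prior_b=1.0, seed=0):
    """Monte Carlo path sampling with a Beta(1,1) prior: after h hits in n runs
    the posterior over the reaching probability is Beta(a+h, b+n-h)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_runs) if sample_path(rng))
    a, b = prior_a + hits, prior_b + n_runs - hits
    return a / (a + b)  # posterior mean

# Toy stand-in for a sampled symbolic path: the target event fires only when
# two random branches both go right (true probability 0.25).
estimate = bayesian_hit_probability(
    lambda rng: rng.random() < 0.5 and rng.random() < 0.5, 10000)
```

    Informed sampling then improves on this by exactly accounting for already-explored high-probability paths and only sampling the remainder, which shrinks the posterior variance for the same budget.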

  19. Unit of Analysis: Impact of Silverman and Solmon's Article on Field-Based Intervention Research in Physical Education in the U.S.A.

    ERIC Educational Resources Information Center

    Li, Weidong; Chen, Yung-Ju; Xiang, Ping; Xie, Xiuge; Li, Yilin

    2017-01-01

    Purpose: The purposes of this study were to: (a) examine the impact of the Silverman and Solmon article (1998) on how researchers handle the unit of analysis issue in their field-based intervention research in physical education in the United States and summarize statistical approaches that have been used to analyze the data, and (b) provide…

  20. Combination of statistical and physically based methods to assess shallow slide susceptibility at the basin scale

    NASA Astrophysics Data System (ADS)

    Oliveira, Sérgio C.; Zêzere, José L.; Lajas, Sara; Melo, Raquel

    2017-07-01

    Approaches used to assess shallow slide susceptibility at the basin scale are conceptually different depending on the use of statistical or physically based methods. The former are based on the assumption that the same causes are more likely to produce the same effects, whereas the latter are based on the comparison between forces which tend to promote movement along the slope and the counteracting forces that are resistant to motion. Within this general framework, this work tests two hypotheses: (i) although conceptually and methodologically distinct, the statistical and deterministic methods generate similar shallow slide susceptibility results regarding the model's predictive capacity and spatial agreement; and (ii) the combination of shallow slide susceptibility maps obtained with statistical and physically based methods, for the same study area, generate a more reliable susceptibility model for shallow slide occurrence. These hypotheses were tested at a small test site (13.9 km²) located north of Lisbon (Portugal), using a statistical method (the information value method, IV) and a physically based method (the infinite slope method, IS). The landslide susceptibility maps produced with the statistical and deterministic methods were combined into a new landslide susceptibility map. The latter was based on a set of integration rules defined by the cross tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency tables. The results demonstrate a higher predictive capacity of the new shallow slide susceptibility map, which combines the independent results obtained with statistical and physically based models. Moreover, the combination of the two models allowed the identification of areas where the results of the information value and the infinite slope methods are contradictory. Thus, these areas were classified as uncertain and deserve additional investigation at a more detailed scale.
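    The combination step (cross-tabulating the information value and infinite slope class maps and flagging contradictions as uncertain) can be sketched as follows. The integration rule below is illustrative only; the paper derives its rules from analysis of the actual contingency tables:

```python
from collections import Counter

def cross_tabulate(map_a, map_b):
    """Contingency table of two classified maps given as per-unit class labels."""
    return Counter(zip(map_a, map_b))

def combine(a, b, order=("low", "moderate", "high")):
    """Illustrative integration rule: keep the higher class when the two maps
    agree or differ by one step; flag larger contradictions as 'uncertain'."""
    ia, ib = order.index(a), order.index(b)
    return order[max(ia, ib)] if abs(ia - ib) <= 1 else "uncertain"

iv_map = ["low", "high", "moderate", "low"]   # information value classes per unit
is_map = ["low", "low", "high", "moderate"]   # infinite slope classes per unit
combined = [combine(a, b) for a, b in zip(iv_map, is_map)]
```

    The contingency table tells you how often each pair of classes co-occurs, which is what justifies (or rules out) a given integration rule before it is applied map-wide.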

  1. Life Cycle Analysis of a SpaceCube Printed Circuit Board Assembly Using Physics of Failure Methodologies

    NASA Technical Reports Server (NTRS)

    Sood, Bhanu; Evans, John; Daniluk, Kelly; Sturgis, Jason; Davis, Milton; Petrick, David

    2017-01-01

    In this reliability life cycle evaluation of the SpaceCube 2.0 processor card, a partially populated version of the card is being evaluated to determine its durability with respect to typical GSFC mission loads.

  2. The 1980 Goddard Space Flight Center Battery Workshop

    NASA Technical Reports Server (NTRS)

    Halpert, G.

    1981-01-01

    Several aspects of lithium primary cell technology are discussed with respect to aerospace application. Particular attention is given to the statistical analysis of battery data and accelerated testing.

  3. Wildland Arson as Clandestine Resource Management: A Space-Time Permutation Analysis and Classification of Informal Fire Management Regimes in Georgia, USA

    NASA Astrophysics Data System (ADS)

    Coughlan, Michael R.

    2016-05-01

    Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.
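    A Monte Carlo sketch of the space-time permutation idea: count event pairs that are close in both space and time, then shuffle the timestamps over locations to see how often chance does as well. This is a Knox-style toy in one spatial dimension, not the scan-statistic software the study itself would use:

```python
import random

def spacetime_permutation_pvalue(events, window, n_perm=999, seed=0):
    """Permutation test for space-time clustering: events are (location, time)
    pairs; window = (max_space_gap, max_time_gap) defines a 'close' pair."""
    rng = random.Random(seed)

    def close_pairs(evts):
        ds, dt = window
        return sum(
            1
            for i in range(len(evts))
            for j in range(i + 1, len(evts))
            if abs(evts[i][0] - evts[j][0]) <= ds and abs(evts[i][1] - evts[j][1]) <= dt
        )

    observed = close_pairs(events)
    times = [t for _, t in events]
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(times)  # break any space-time link, keep both marginals
        if close_pairs([(loc, t) for (loc, _), t in zip(events, times)]) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)

# A tight space-time cluster of three fires plus scattered background events
events = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 9.0), (9.0, 2.0), (7.0, 6.0)]
p_value = spacetime_permutation_pvalue(events, window=(0.5, 0.5))
```

    Because only the times are permuted, purely spatial clustering (e.g., fires concentrated where people live) does not inflate the statistic; only joint space-time clustering, the signature of a recurring informal fire regime, does.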

  4. Wildland Arson as Clandestine Resource Management: A Space-Time Permutation Analysis and Classification of Informal Fire Management Regimes in Georgia, USA.

    PubMed

    Coughlan, Michael R

    2016-05-01

    Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.

  5. The CMS Data Analysis School Experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Filippis, N.; Bauerdick, L.; Chen, J.

    The CMS Data Analysis School is an official event organized by the CMS Collaboration to teach students and post-docs how to perform a physics analysis. The school is coordinated by the CMS schools committee and was first implemented at the LHC Physics Center at Fermilab in 2010. As part of the training, there are a number of “short” exercises on physics object reconstruction and identification, Monte Carlo simulation, and statistical analysis, which are followed by “long” exercises based on physics analyses. Some of the long exercises go beyond the current state of the art of the corresponding CMS analyses. This paper describes the goals of the school, the preparations for a school, the structure of the training, and student satisfaction with the experience as measured by surveys.

  6. The CMS data analysis school experience

    NASA Astrophysics Data System (ADS)

    De Filippis, N.; Bauerdick, L.; Chen, J.; Gallo, E.; Klima, B.; Malik, S.; Mulders, M.; Palla, F.; Rolandi, G.

    2017-10-01

    The CMS Data Analysis School is an official event organized by the CMS Collaboration to teach students and post-docs how to perform a physics analysis. The school is coordinated by the CMS schools committee and was first implemented at the LHC Physics Center at Fermilab in 2010. As part of the training, there are a number of “short” exercises on physics object reconstruction and identification, Monte Carlo simulation, and statistical analysis, which are followed by “long” exercises based on physics analyses. Some of the long exercises go beyond the current state of the art of the corresponding CMS analyses. This paper describes the goals of the school, the preparations for a school, the structure of the training, and student satisfaction with the experience as measured by surveys.

  7. Application of statistical distribution theory to launch-on-time for space construction logistic support

    NASA Technical Reports Server (NTRS)

    Morgenthaler, George W.

    1989-01-01

    The ability to launch-on-time and to send payloads into space has progressed dramatically since the days of the earliest missile and space programs. Causes for delay during launch, i.e., unplanned 'holds', are attributable to several sources: weather, range activities, vehicle conditions, human performance, etc. Recent developments in space programs, particularly the need for highly reliable logistic support of space construction, the subsequent planned operation of space stations, large unmanned space structures, and lunar and Mars bases, and the necessity of providing 'guaranteed' commercial launches, have placed increased emphasis on understanding and mastering every aspect of launch vehicle operations. The Center for Space Construction has acquired historical launch vehicle data and is applying these data to the analysis of space launch vehicle logistic support of space construction. This analysis will include development of a better understanding of launch-on-time capability and simulation of the required support systems for vehicle assembly and launch which are necessary to support national space program construction schedules. In this paper, the author presents actual launch data on unscheduled 'hold' distributions of various launch vehicles. The data have been supplied by industrial associate companies of the Center for Space Construction. The paper seeks to determine suitable probability models which describe these historical data and which can be used for several purposes, such as: providing inputs to broader simulations of launch vehicle logistic support of space construction, and determining which launch operations sources cause the majority of the unscheduled 'holds', and hence suggesting changes which might improve launch-on-time performance. In particular, the paper investigates the ability of a compound distribution probability model to fit the actual data, versus alternative models, and recommends the most productive avenues for future statistical work.
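    The compound-distribution idea can be sketched with the simplest case: a two-component exponential mixture fitted by EM to synthetic 'hold' durations, compared against a single exponential by log-likelihood. The rates, weights and sample size below are invented; the paper's candidate models and data are not reproduced here:

```python
import math, random

def exp_loglik(data, rate):
    """Log-likelihood of i.i.d. exponential data."""
    return sum(math.log(rate) - rate * x for x in data)

def fit_exp_mixture(data, r1=0.5, r2=2.0, w=0.5, steps=100):
    """EM for a two-component exponential mixture: a minimal compound model."""
    for _ in range(steps):
        resp = []  # posterior responsibility of component 1 for each point
        for x in data:
            p1 = w * r1 * math.exp(-r1 * x)
            p2 = (1.0 - w) * r2 * math.exp(-r2 * x)
            resp.append(p1 / (p1 + p2))
        total = sum(resp)
        w = total / len(data)
        r1 = total / sum(r * x for r, x in zip(resp, data))
        r2 = (len(data) - total) / sum((1.0 - r) * x for r, x in zip(resp, data))
    ll = sum(math.log(w * r1 * math.exp(-r1 * x) + (1.0 - w) * r2 * math.exp(-r2 * x))
             for x in data)
    return (r1, r2, w), ll

random.seed(2)
# Synthetic hold durations: many short holds plus a heavy tail of long ones
holds = [random.expovariate(5.0) if random.random() < 0.5 else random.expovariate(0.2)
         for _ in range(4000)]
single_rate = len(holds) / sum(holds)
ll_single = exp_loglik(holds, single_rate)
(rate_a, rate_b, weight), ll_mix = fit_exp_mixture(holds)
```

    Comparing the two log-likelihoods (ideally with a penalty for the extra parameters) is the basic machinery for deciding whether a compound model genuinely fits the hold data better than a simpler alternative.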

  8. Visualization tool for three-dimensional plasma velocity distributions (ISEE_3D) as a plug-in for SPEDAS

    NASA Astrophysics Data System (ADS)

    Keika, Kunihiro; Miyoshi, Yoshizumi; Machida, Shinobu; Ieda, Akimasa; Seki, Kanako; Hori, Tomoaki; Miyashita, Yukinaga; Shoji, Masafumi; Shinohara, Iku; Angelopoulos, Vassilis; Lewis, Jim W.; Flores, Aaron

    2017-12-01

    This paper introduces ISEE_3D, an interactive visualization tool for three-dimensional plasma velocity distribution functions, developed by the Institute for Space-Earth Environmental Research, Nagoya University, Japan. The tool provides a variety of methods to visualize the distribution function of space plasma: scatter, volume, and isosurface modes. The tool also has a wide range of functions, such as displaying magnetic field vectors and two-dimensional slices of distributions, to facilitate extensive analysis. Transformation into magnetic field coordinates is also implemented. The tool is written in Interactive Data Language (IDL), a data analysis language widely used in space physics and solar physics. The current version of the tool can be used with data files of the plasma distribution function from the Geotail satellite mission, which are publicly accessible through the Data Archives and Transmission System of the Institute of Space and Astronautical Science (ISAS)/Japan Aerospace Exploration Agency (JAXA). The tool is also available in the Space Physics Environment Data Analysis Software (SPEDAS) to visualize plasma data from the Magnetospheric Multiscale and the Time History of Events and Macroscale Interactions during Substorms missions. The tool is planned to be applied to data from other missions, such as Arase (ERG) and the Van Allen Probes, after replacing or adding data-loading plug-ins. This visualization tool helps scientists better understand the dynamics of space plasma, particularly in regions where the magnetohydrodynamic approximation is not valid, for example the Earth's inner magnetosphere, magnetopause, bow shock, and plasma sheet.

  9. Quantum signature of chaos and thermalization in the kicked Dicke model

    NASA Astrophysics Data System (ADS)

    Ray, S.; Ghosh, A.; Sinha, S.

    2016-09-01

    We study the quantum dynamics of the kicked Dicke model (KDM) in terms of the Floquet operator, and we analyze the connection between chaos and thermalization in this context. The Hamiltonian map is constructed by suitably taking the classical limit of the Heisenberg equation of motion to study the corresponding phase-space dynamics, which shows a crossover from regular to chaotic motion by tuning the kicking strength. The fixed-point analysis and calculation of the Lyapunov exponent (LE) provide us with a complete picture of the onset of chaos in phase-space dynamics. We carry out a spectral analysis of the Floquet operator, which includes a calculation of the quasienergy spacing distribution and structural entropy to show the correspondence to the random matrix theory in the chaotic regime. Finally, we analyze the thermodynamics and statistical properties of the bosonic sector as well as the spin sector, and we discuss how such a periodically kicked system relaxes to a thermalized state in accordance with the laws of statistical mechanics.
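
    The correspondence with random matrix theory invoked above can be sketched generically. Assuming only that the chaotic regime shows Gaussian-orthogonal-ensemble (GOE) level repulsion, the consecutive-spacing-ratio statistic separates chaotic from regular (Poissonian) spectra without any unfolding; this is an illustrative stand-in, not the authors' Floquet computation for the kicked Dicke model.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000
# A GOE matrix stands in for a quantum-chaotic spectrum's symmetry class
M = rng.normal(size=(N, N))
H = (M + M.T) / 2

ev = np.sort(np.linalg.eigvalsh(H))
s = np.diff(ev)                       # consecutive level spacings
# Ratio statistic: needs no unfolding of the spectral density
r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
r_mean = r.mean()   # GOE prediction ~0.536; Poisson (regular) ~0.386
```

    For a Floquet operator one would use quasienergy (eigenphase) spacings instead of eigenvalue spacings, but the ratio diagnostic is identical.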

  10. Time and space: undergraduate Mexican physics in motion

    NASA Astrophysics Data System (ADS)

    Candela, Antonia

    2010-09-01

    This is an ethnographic study of the trajectories and itineraries of undergraduate physics students at a Mexican university. In this work, learning is understood as being able to move oneself, and other things (cultural tools), through the space-time networks of a discipline (Nespor in Knowledge in motion: space, time and curriculum in undergraduate physics and management. Routledge Farmer, London, 1994). The potential of this socio-cultural perspective allows an analysis of how students are connected through extended spaces and times with an international core discipline, as well as with cultural features related to local networks of power and construction. Through an example, I show, from an actor-network-theory perspective (Latour in Science in action. Harvard University Press, Cambridge, 1987), that in order to understand the complexities of undergraduate physics learning processes you have to break through classroom walls and take into account students' movements through the complex spatial and temporal traces of the discipline of physics. Mexican professors do not give classes following a single textbook but teach with a moment-to-moment open dynamism, tending to include undergraduate students as actors in classroom events and extending the teaching space-time of the classroom to the disciplinary research work of physics. I also find that Mexican undergraduate students show initiative and display some autonomy and power in the construction of their itineraries, as they are encouraged to examine a variety of sources, including contemporary research articles and unsolved physics problems, and even to participate in physicists' spaces, for example as speakers at the national congresses of physics. Their itineraries also open up new spaces of cultural and social practices, creating more extensive networks beyond those associated with a discipline. Some economic, historical, and cultural contextual features of this school of sciences are analyzed to help explain the particular way in which students are encouraged to develop their autonomy.

  11. Quantum signature of chaos and thermalization in the kicked Dicke model.

    PubMed

    Ray, S; Ghosh, A; Sinha, S

    2016-09-01

    We study the quantum dynamics of the kicked Dicke model (KDM) in terms of the Floquet operator, and we analyze the connection between chaos and thermalization in this context. The Hamiltonian map is constructed by suitably taking the classical limit of the Heisenberg equation of motion to study the corresponding phase-space dynamics, which shows a crossover from regular to chaotic motion by tuning the kicking strength. The fixed-point analysis and calculation of the Lyapunov exponent (LE) provide us with a complete picture of the onset of chaos in phase-space dynamics. We carry out a spectral analysis of the Floquet operator, which includes a calculation of the quasienergy spacing distribution and structural entropy to show the correspondence to the random matrix theory in the chaotic regime. Finally, we analyze the thermodynamics and statistical properties of the bosonic sector as well as the spin sector, and we discuss how such a periodically kicked system relaxes to a thermalized state in accordance with the laws of statistical mechanics.

  12. Understanding amyloid aggregation by statistical analysis of atomic force microscopy images

    NASA Astrophysics Data System (ADS)

    Adamcik, Jozef; Jung, Jin-Mi; Flakowski, Jérôme; de Los Rios, Paolo; Dietler, Giovanni; Mezzenga, Raffaele

    2010-06-01

    The aggregation of proteins is central to many aspects of daily life, including food processing, blood coagulation, cataract formation in the eye, and prion-related neurodegenerative infections. However, the physical mechanisms responsible for amyloidosis, the irreversible fibril formation of various proteins that is linked to disorders such as Alzheimer's, Creutzfeldt-Jakob and Huntington's diseases, have not yet been fully elucidated. Here, we show that different stages of amyloid aggregation can be examined by performing a statistical polymer physics analysis of single-molecule atomic force microscopy images of heat-denatured β-lactoglobulin fibrils. The atomic force microscopy analysis, supported by theoretical arguments, reveals that the fibrils have a multistranded helical shape with twisted ribbon-like structures. Our results also indicate a possible general model for amyloid fibril assembly and illustrate the potential of this approach for investigating fibrillar systems.
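
    A "statistical polymer physics analysis" of fibril images typically amounts to measuring how tangent directions decorrelate along the traced contour. A minimal sketch with a simulated 2D worm-like chain (all parameters hypothetical, standing in for digitized AFM contours) recovers the tangent-correlation decay length, from which a persistence length is conventionally derived:

```python
import numpy as np

rng = np.random.default_rng(2)
nb, sig2 = 20000, 0.1        # number of bonds; per-bond angular variance (hypothetical)
dtheta = rng.normal(scale=np.sqrt(sig2), size=nb)
theta = np.cumsum(dtheta)    # tangent angle along the contour

# For a 2D worm-like chain, <cos(theta(i+k) - theta(i))> decays as exp(-k*sig2/2)
def tangent_corr(k):
    return np.cos(theta[k:] - theta[:-k]).mean()

lags = np.arange(1, 11)
corr = np.array([tangent_corr(k) for k in lags])
# Invert the exponential decay at each lag and average; expected near 2/sig2 = 20 bonds
decay_len = (-lags / np.log(corr)).mean()
```

    On real images the same estimator is applied to tangents extracted from the traced fibril backbone, with the bond length set by the pixel sampling.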

  13. Developments in Space Research in Nigeria

    NASA Astrophysics Data System (ADS)

    Oke, O.

    Nigeria's desire to venture into space technology was first made known to ECA/OAU member countries at an inter-governmental meeting in Addis Ababa in 1976. Nigerian space research is highly rated in Africa in terms of reputation and scientific results. The National Space Research and Development Agency (NASRDA), Nigeria's space research coordinating body, has taken a more active role in helping Nigeria's space research community to succeed internationally. The paper presents recent examples of Nigeria's successes in space and their detailed applications in areas such as remote sensing, meteorology, communication, and information technology, among many others. It gives an analysis of the statistics of Nigerian-born space scientists working in other space-faring nations. The analysis has been used to develop a model for increasing Nigerian scientists' involvement in the development of space research in Nigeria. It concludes with some thoughts on the current and future state of Nigeria's space-borne scientific experiments, policies, and programs.

  14. Advanced microwave soil moisture studies. [Big Sioux River Basin, Iowa

    NASA Technical Reports Server (NTRS)

    Dalsted, K. J.; Harlan, J. C.

    1983-01-01

    Comparisons of low-level L-band brightness temperature (TB) and thermal infrared (TIR) data, together with the following data sets: soil map and land cover data, direct soil moisture measurements, and a computer-generated contour map, were statistically evaluated using regression analysis and linear discriminant analysis. Regression analysis of the footprint data shows that statistical groupings of ground variables (soil features and land cover) hold promise for qualitative assessment of soil moisture and for reducing variance within the sampling space. Dry conditions appear to be more conducive to producing meaningful statistics than wet conditions. Regression analysis using field-averaged TB and TIR data did not approach the higher R-squared values obtained using within-field variations. The linear discriminant analysis indicates some capacity to distinguish categories, with the results being somewhat better on a field basis than on a footprint basis.
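
    A two-class linear discriminant of the kind applied to these footprint data can be sketched as a Fisher discriminant on a (TB, TIR) feature pair. The class means and scatter below are invented for illustration; they are not values from the Big Sioux data set.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical footprint features [L-band TB (K), thermal IR (K)] for two classes
wet = rng.normal([240.0, 295.0], 5.0, size=(200, 2))
dry = rng.normal([270.0, 310.0], 5.0, size=(200, 2))
X = np.vstack([wet, dry])
y = np.r_[np.zeros(200), np.ones(200)]

m0, m1 = wet.mean(axis=0), dry.mean(axis=0)
Sw = np.cov(wet.T) + np.cov(dry.T)      # pooled within-class scatter
w = np.linalg.solve(Sw, m1 - m0)        # Fisher discriminant direction

proj = X @ w
t0 = 0.5 * (m0 @ w + m1 @ w)            # threshold midway between projected means
pred = (proj > t0).astype(float)
accuracy = (pred == y).mean()
```

    With more than two moisture categories, the same projection generalizes to multiclass discriminant analysis on the between- and within-class scatter matrices.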

  15. Physical restraints in an Italian psychiatric ward: clinical reasons and staff organization problems.

    PubMed

    Di Lorenzo, Rosaria; Baraldi, Sara; Ferrara, Maria; Mimmi, Stefano; Rigatelli, Marco

    2012-04-01

    The aim was to analyze physical restraint use in an Italian acute psychiatric ward, where mechanical restraint by belt is highly discouraged but allowed. Data were retrospectively collected from medical and nursing charts from January 1, 2005, to December 31, 2008. The physical restraint rate and the relationships between restraints and selected variables were statistically analyzed. Restraints were statistically significantly more frequent in compulsory or voluntary admissions of patients with an altered state of consciousness, at night, to control aggressive behavior, and in patients with "Schizophrenia and other Psychotic Disorders" during the first 72 hr of hospitalization. Analysis of the clinical and organizational factors conditioning restraint may help limit its use. © 2011 Wiley Periodicals, Inc.

  16. Urban environmental health applications of remote sensing, summary report

    NASA Technical Reports Server (NTRS)

    Rush, M.; Goldstein, J.; Hsi, B. P.; Olsen, C. B.

    1975-01-01

    Health and its association with the physical environment were studied based on the hypothesis that there is a relationship between the man-made physical environment and the health status of a population. The statistical technique of regression analysis was employed to show the degree of association and the aspects of the physical environment that accounted for the greatest variation in health status. Mortality, venereal disease, tuberculosis, hepatitis, meningitis, shigella/salmonella, hypertension, and cardiac arrest/myocardial infarction were examined. The statistical techniques were used to measure association and variation, not necessarily cause and effect. The conclusions drawn show that the association still existed in the decade of the 1970s and that it can be successfully monitored with the methodology of remote sensing.

  17. Physical and Hydrological Meaning of the Spectral Information from Hydrodynamic Signals at Karst Springs

    NASA Astrophysics Data System (ADS)

    Dufoyer, A.; Lecoq, N.; Massei, N.; Marechal, J. C.

    2017-12-01

    Physics-based modeling of karst systems remains almost impossible without sufficiently accurate information about their inner physical characteristics. Usually, the only available hydrodynamic information is the flow rate at the karst outlet. Numerous works in the past decades have used, and proven the usefulness of, time-series analysis and spectral techniques applied to spring flow, precipitation, or physico-chemical parameters for interpreting karst hydrological functioning. However, identifying or interpreting the physical features of karst systems that control the statistical or spectral characteristics of spring flow variations is still challenging, not to say sometimes controversial. The main objective of this work is to determine how the statistical and spectral characteristics of the hydrodynamic signal at karst springs can be related to inner physical and hydraulic properties. To address this issue, we undertake an empirical approach based on the use of both distributed and physics-based models and on the responses of synthetic systems. The first step of the research is to conduct a sensitivity analysis of time-series/spectral methods to karst hydraulic and physical properties. For this purpose, forward modeling of flow through several simple, constrained, synthetic cases in response to precipitation is undertaken. It allows us to quantify how the statistical and spectral characteristics of flow at the outlet are sensitive to changes (i) in conduit geometries and (ii) in hydraulic parameters of the system (matrix/conduit exchange rate, matrix hydraulic conductivity, and storativity). The flow differential equations are solved by MARTHE, a computer code developed by the BRGM that allows modeling of karst conduits. From signal processing of the simulated spring responses, using Fourier series and multi-resolution analysis, we hope to determine whether specific frequencies are consistently modified. We also hope to quantify, with auto-correlation analysis, which parameters matter most: first results seem to show larger variations due to conduit conductivity than due to the matrix/conduit exchange rate. Future steps will use another computer code, based on a double-continuum approach that allows turbulent conduit flow, and will model a natural system.
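
    The link between a reservoir property and the correlation/spectral signature of the outflow can be illustrated with a single linear reservoir, the simplest lumped karst model. The recession coefficient below is hypothetical; the point is that it is recoverable from the lag-1 autocorrelation of the simulated discharge, while the outflow spectrum is reddened (low frequencies amplified) relative to the rainfall input.

```python
import numpy as np

rng = np.random.default_rng(4)
n, a = 5000, 0.95                     # record length; recession coefficient (hypothetical)
# Intermittent recharge: rain on ~30% of days, exponential depths
rain = rng.exponential(1.0, n) * (rng.random(n) < 0.3)

q = np.zeros(n)                       # linear-reservoir discharge: q[t] = a*q[t-1] + rain[t]
for t in range(1, n):
    q[t] = a * q[t - 1] + rain[t]

x = q - q.mean()
acf1 = (x[:-1] * x[1:]).sum() / (x * x).sum()   # lag-1 autocorrelation, should be near a

# Power spectrum: the reservoir acts as a low-pass filter, so the spectrum is red
p = np.abs(np.fft.rfft(x)) ** 2
low, high = p[1:20].mean(), p[-200:].mean()
```

    In the paper's setting, conduit geometry and matrix/conduit exchange play the role that the single coefficient `a` plays here, which is why their signatures must be disentangled by a systematic sensitivity analysis.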

  18. Theory and analysis of statistical discriminant techniques as applied to remote sensing data

    NASA Technical Reports Server (NTRS)

    Odell, P. L.

    1973-01-01

    Classification of remote earth resources sensing data according to normed exponential density statistics is reported. The use of density models appropriate for several physical situations provides an exact solution for the probabilities of classifications associated with the Bayes discriminant procedure even when the covariance matrices are unequal.
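
    The Bayes discriminant with unequal covariance matrices reduces to comparing class-conditional Gaussian log-densities plus log-priors (quadratic discriminant analysis). A minimal sketch, with two hypothetical classes whose means and covariances are invented for illustration:

```python
import numpy as np

# Two hypothetical spectral classes with unequal covariance matrices
mean0, cov0 = np.array([0.0, 0.0]), np.array([[1.0, 0.0], [0.0, 1.0]])
mean1, cov1 = np.array([3.0, 0.0]), np.array([[4.0, 1.0], [1.0, 2.0]])

def log_gauss(x, m, c):
    """Multivariate normal log-density."""
    d = x - m
    _, logdet = np.linalg.slogdet(c)
    return -0.5 * (logdet + d @ np.linalg.solve(c, d) + len(m) * np.log(2 * np.pi))

def posterior(x, priors=(0.5, 0.5)):
    """Posterior class probabilities under the Bayes (quadratic) discriminant."""
    l0 = np.log(priors[0]) + log_gauss(x, mean0, cov0)
    l1 = np.log(priors[1]) + log_gauss(x, mean1, cov1)
    m = max(l0, l1)                      # subtract max for numerical stability
    p0, p1 = np.exp(l0 - m), np.exp(l1 - m)
    return p0 / (p0 + p1), p1 / (p0 + p1)

p0, p1 = posterior(np.array([0.2, 0.1]))  # a pixel near the class-0 mean
```

    Because the covariances differ, the decision boundary is quadratic rather than linear, which is exactly the situation the report addresses when computing exact classification probabilities.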

  19. The Effect of Satellite Observing System Changes on MERRA Water and Energy Fluxes

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Bosilovich, M. G.; Chen, J.; Miller, T. L.

    2011-01-01

    Because reanalysis data sets offer state variables and fluxes at regular space/time intervals, atmospheric reanalyses have become a mainstay of the climate community for diagnostic purposes and for driving offline ocean and land models. Although one weakness of these data sets is the susceptibility of the flux products to uncertainties arising from shortcomings in parameterized model physics, another issue, perhaps less appreciated, is that continual but discrete changes in the evolving observational system, particularly from satellite sensors, may also introduce artifacts into the time series. In this paper we examine the ability of the NASA MERRA (Modern Era Retrospective Analysis for Research and Applications) and other recent reanalyses to determine variability in the climate system over the satellite record (approximately the last 30 years). In particular we highlight the effect on the reanalysis of discontinuities at the onset of passive microwave imaging (Special Sensor Microwave Imager) in late 1987 and, more prominently, of improved sounding and imaging with the Advanced Microwave Sounding Unit, AMSU-A, in 1998. We first examine MERRA fluxes from the perspective of how physical modes of variability (e.g., ENSO events, Pacific Decadal Variability) are contaminated by artificial step-like trends induced by the onset of new moisture data from these two satellite observing systems. Second, we show how Redundancy Analysis, a statistical regression methodology, is effective in relating these artifact signals in the moisture and temperature analysis increments to their presence in the physical flux terms (e.g., precipitation, radiation). This procedure is shown to be effective in greatly reducing the artificial trends in the flux quantities.
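
    The artifact-removal idea can be caricatured with a univariate regression on a step regressor: a synthetic flux series with a known jump (a stand-in for a sensor transition) is cleaned by regressing out the step. This is a deliberate simplification of Redundancy Analysis, which uses the multivariate analysis increments as predictors, and a real application must take care not to remove genuine low-frequency signal along with the artifact.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(360)                        # months in a ~30-year record
enso = 0.5 * np.sin(2 * np.pi * t / 45)   # a physical low-frequency mode (illustrative)
step = np.where(t >= 240, 0.8, 0.0)       # artificial jump at a hypothetical sensor change
flux = enso + step + rng.normal(0, 0.2, t.size)

# Regress the flux on an intercept and the step regressor, then remove the fitted step
X = np.column_stack([np.ones(t.size), step])
beta, *_ = np.linalg.lstsq(X, flux, rcond=None)
cleaned = flux - beta[1] * step

jump_before = flux[t >= 240].mean() - flux[t < 240].mean()
jump_after = cleaned[t >= 240].mean() - cleaned[t < 240].mean()
```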

  20. The Effect of Satellite Observing System Changes on MERRA Water and Energy Fluxes

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Bosilovich, M. G.; Chen, J.; Miller, t. L.

    2010-01-01

    Because reanalysis data sets offer state variables and fluxes at regular space/time intervals, atmospheric reanalyses have become a mainstay of the climate community for diagnostic purposes and for driving offline ocean and land models. Although one weakness of these data sets is the susceptibility of the flux products to uncertainties arising from shortcomings in parameterized model physics, another issue, perhaps less appreciated, is that continual but discrete changes in the evolving observational system, particularly from satellite sensors, may also introduce artifacts into the time series. In this paper we examine the ability of the NASA MERRA (Modern Era Retrospective Analysis for Research and Applications) and other recent reanalyses to determine variability in the climate system over the satellite record (approximately the last 30 years). In particular we highlight the effect on the reanalysis of discontinuities at the onset of passive microwave imaging (Special Sensor Microwave Imager) in late 1987 as well as of improved sounding and imaging with the Advanced Microwave Sounding Unit, AMSU-A, in 1998. We first examine MERRA fluxes from the perspective of how physical modes of variability (e.g., ENSO events, Pacific Decadal Variability) are contaminated by artificial step-like trends induced by the onset of new moisture data from these two satellite observing systems. Second, we show how Redundancy Analysis, a statistical regression methodology, is effective in relating these artifact signals in the moisture and temperature analysis increments to their presence in the physical flux terms (e.g., precipitation, radiation). This procedure is shown to be effective in greatly reducing the artificial trends in the flux quantities.

  1. Using Geographic Information Science to Explore Associations between Air Pollution, Environmental Amenities, and Preterm Births

    PubMed Central

    Ogneva-Himmelberger, Yelena; Dahlberg, Tyler; Kelly, Kristen; Simas, Tiffany A. Moore

    2015-01-01

    The study uses geographic information science (GIS) and statistics to find out if there are statistical differences between full term and preterm births to non-Hispanic white, non-Hispanic Black, and Hispanic mothers in their exposure to air pollution and access to environmental amenities (green space and vendors of healthy food) in the second largest city in New England, Worcester, Massachusetts. Proximity to a Toxic Release Inventory site has a statistically significant effect on preterm birth regardless of race. The air-pollution hazard score from the Risk Screening Environmental Indicators Model is also a statistically significant factor when preterm births are categorized into three groups based on the degree of prematurity. Proximity to green space and to a healthy food vendor did not have an effect on preterm births. The study also used cluster analysis and found statistically significant spatial clusters of high preterm birth volume for non-Hispanic white, non-Hispanic Black, and Hispanic mothers. PMID:29546120

  2. Using Geographic Information Science to Explore Associations between Air Pollution, Environmental Amenities, and Preterm Births.

    PubMed

    Ogneva-Himmelberger, Yelena; Dahlberg, Tyler; Kelly, Kristen; Simas, Tiffany A Moore

    2015-01-01

    The study uses geographic information science (GIS) and statistics to find out if there are statistical differences between full term and preterm births to non-Hispanic white, non-Hispanic Black, and Hispanic mothers in their exposure to air pollution and access to environmental amenities (green space and vendors of healthy food) in the second largest city in New England, Worcester, Massachusetts. Proximity to a Toxic Release Inventory site has a statistically significant effect on preterm birth regardless of race. The air-pollution hazard score from the Risk Screening Environmental Indicators Model is also a statistically significant factor when preterm births are categorized into three groups based on the degree of prematurity. Proximity to green space and to a healthy food vendor did not have an effect on preterm births. The study also used cluster analysis and found statistically significant spatial clusters of high preterm birth volume for non-Hispanic white, non-Hispanic Black, and Hispanic mothers.

  3. Analysis of uncertainties and convergence of the statistical quantities in turbulent wall-bounded flows by means of a physically based criterion

    NASA Astrophysics Data System (ADS)

    Andrade, João Rodrigo; Martins, Ramon Silva; Thompson, Roney Leon; Mompean, Gilmar; da Silveira Neto, Aristeu

    2018-04-01

    The present paper provides an analysis of the statistical uncertainties associated with direct numerical simulation (DNS) results and experimental data for turbulent channel and pipe flows, showing a new physically based quantification of these errors, to improve the determination of the statistical deviations between DNSs and experiments. The analysis is carried out using a recently proposed criterion by Thompson et al. ["A methodology to evaluate statistical errors in DNS data of plane channel flows," Comput. Fluids 130, 1-7 (2016)] for fully turbulent plane channel flows, where the mean velocity error is estimated by considering the Reynolds stress tensor, and using the balance of the mean force equation. It also presents how the residual error evolves in time for a DNS of a plane channel flow, and the influence of the Reynolds number on its convergence rate. The root mean square of the residual error is shown in order to capture a single quantitative value of the error associated with the dimensionless averaging time. The evolution in time of the error norm is compared with the final error provided by DNS data of similar Reynolds numbers available in the literature. A direct consequence of this approach is that it was possible to compare different numerical results and experimental data, providing an improved understanding of the convergence of the statistical quantities in turbulent wall-bounded flows.

  4. Relation between the electromagnetic processes in the near-Earth space and dynamics of the biological resources in Russian Arctic

    NASA Astrophysics Data System (ADS)

    Makarova, L. N.; Shirochkov, A. V.; Tumanov, I. L.

    The start of the satellite era in space exploration led to new and more profound knowledge of solar physics and the sources of solar activity. In this light, it is worthwhile to re-examine the relations between biological processes and solar activity. We explore the relation between the dynamics of solar activity (including the solar wind) and changes in the populations of some species of Arctic fauna (lemmings, polar foxes, caribou, wolves, elk, etc.). The data include statistical series of various lengths (30-80 years). The best correlation between the two data sets is found when the solar wind dynamic pressure, as well as variations of the total solar irradiance (i.e., the level of solar UV radiation), are taken as the space parameters. Probably the electromagnetic fields of space origin are an important factor determining the population dynamics of Arctic fauna species.

  5. New Approaches to Quantifying Transport Model Error in Atmospheric CO2 Simulations

    NASA Technical Reports Server (NTRS)

    Ott, L.; Pawson, S.; Zhu, Z.; Nielsen, J. E.; Collatz, G. J.; Gregg, W. W.

    2012-01-01

    In recent years, much progress has been made in observing CO2 distributions from space. However, the use of these observations to infer source/sink distributions in inversion studies continues to be complicated by difficulty in quantifying atmospheric transport model errors. We will present results from several different experiments designed to quantify different aspects of transport error using the Goddard Earth Observing System, Version 5 (GEOS-5) Atmospheric General Circulation Model (AGCM). In the first set of experiments, an ensemble of simulations is constructed using perturbations to parameters in the model's moist physics and turbulence parameterizations that control sub-grid scale transport of trace gases. Analysis of the ensemble spread and scales of temporal and spatial variability among the simulations allows insight into how parameterized, small-scale transport processes influence simulated CO2 distributions. In the second set of experiments, atmospheric tracers representing model error are constructed using observation minus analysis statistics from NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA). The goal of these simulations is to understand how errors in large-scale dynamics are distributed, and how they propagate in space and time, affecting trace gas distributions. These simulations will also be compared to results from NASA's Carbon Monitoring System Flux Pilot Project that quantified the impact of uncertainty in satellite-constrained CO2 flux estimates on atmospheric mixing ratios to assess the major factors governing uncertainty in global and regional trace gas distributions.

  6. Translations on USSR Science and Technology, Physical Sciences and Technology, Number 49.

    DTIC Science & Technology

    1978-09-20


  7. A Guerilla Guide to Common Problems in ‘Neurostatistics’: Essential Statistical Topics in Neuroscience

    PubMed Central

    Smith, Paul F.

    2017-01-01

    Effective inferential statistical analysis is essential for high quality studies in neuroscience. However, recently, neuroscience has been criticised for the poor use of experimental design and statistical analysis. Many of the statistical issues confronting neuroscience are similar to other areas of biology; however, there are some that occur more regularly in neuroscience studies. This review attempts to provide a succinct overview of some of the major issues that arise commonly in the analyses of neuroscience data. These include: the non-normal distribution of the data; inequality of variance between groups; extensive correlation in data for repeated measurements across time or space; excessive multiple testing; inadequate statistical power due to small sample sizes; pseudo-replication; and an over-emphasis on binary conclusions about statistical significance as opposed to effect sizes. Statistical analysis should be viewed as just another neuroscience tool, which is critical to the final outcome of the study. Therefore, it needs to be done well and it is a good idea to be proactive and seek help early, preferably before the study even begins. PMID:29371855

  8. A Guerilla Guide to Common Problems in 'Neurostatistics': Essential Statistical Topics in Neuroscience.

    PubMed

    Smith, Paul F

    2017-01-01

    Effective inferential statistical analysis is essential for high quality studies in neuroscience. However, recently, neuroscience has been criticised for the poor use of experimental design and statistical analysis. Many of the statistical issues confronting neuroscience are similar to other areas of biology; however, there are some that occur more regularly in neuroscience studies. This review attempts to provide a succinct overview of some of the major issues that arise commonly in the analyses of neuroscience data. These include: the non-normal distribution of the data; inequality of variance between groups; extensive correlation in data for repeated measurements across time or space; excessive multiple testing; inadequate statistical power due to small sample sizes; pseudo-replication; and an over-emphasis on binary conclusions about statistical significance as opposed to effect sizes. Statistical analysis should be viewed as just another neuroscience tool, which is critical to the final outcome of the study. Therefore, it needs to be done well and it is a good idea to be proactive and seek help early, preferably before the study even begins.
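
    Several of the issues listed, notably excessive multiple testing, have standard remedies. A self-contained sketch of the Benjamini-Hochberg false-discovery-rate procedure follows; the p-values are illustrative, not drawn from any study discussed here.

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean reject mask controlling the false discovery rate at level q."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    # Compare the i-th smallest p-value to q*i/m and reject up to the largest passing rank
    thresh = q * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.where(below)[0])
        reject[order[: k + 1]] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
rej = benjamini_hochberg(pvals, q=0.05)
```

    Unlike a Bonferroni correction, which controls the family-wise error rate, this procedure controls the expected fraction of false discoveries, which is often the more useful guarantee in exploratory neuroscience analyses.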

  9. Suited crewmember productivity.

    PubMed

    Barer, A S; Filipenkov, S N

    1994-01-01

    Analysis of the extravehicular activity (EVA) sortie experience gained in the former Soviet Union, together with the physiologic and hygienic aspects of space suit design and development, shows that crewmember productivity is related to the following main factors: the space suit microclimate (gas composition, pressure and temperature); the limitations of motion and perception imposed by the space suit; the quality of crewmember preparation in the ground training program; the crewmember's level of general physical performance capability in connection with mission duration and intervals between sorties; individual EVA experience, with whose accumulation workmanship improves while metabolic, physical and emotional stress decreases; the duration and work rate of the specific EVA; and EVA bioengineering, including the selection of tools, work stations, EVA technology and mechanization.

  10. Active Missions and the VxOs with THEMIS as an Example

    NASA Technical Reports Server (NTRS)

    Sibeck, D. G.; Merka, J.

    2009-01-01

    The Virtual Observatories (VxOs) provide a host of services to data producers and researchers. They help data producers to describe their data in standard Space Physics Archive Search and Extract (SPASE) terms that enable scientists to understand data products from a wide range of missions. They offer search interfaces based on specified criteria that help researchers discover conjunctions, prominent events, and intervals of interest. In this talk, we show how VMO services can be used with Time History of Events and Macroscale Interactions during Substorms (THEMIS) observations to identify magnetotail intervals marked by high speed flows, enhanced densities, or high temperatures. We present statistical surveys of when and where these phenomena occur. We then show how the VMO services can be used to identify events in which two or more THEMIS spacecraft observe specified features for more detailed analysis. We conclude by discussing the current limitations of VMO tools and outline plans for the future.

  11. A Bayesian sequential processor approach to spectroscopic portal system decisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sale, K; Candy, J; Breitfeller, E

    The development of faster, more reliable techniques to detect radioactive contraband in a portal-type scenario is an extremely important problem, especially in this era of constant terrorist threats. Towards this goal, the development of a model-based Bayesian sequential data processor for the detection problem is discussed. In the sequential processor, each datum (detector energy deposit and pulse arrival time) is used to update the posterior probability distribution over the space of model parameters. The nature of the sequential approach is that a detection is declared as soon as it is statistically justified by the data, rather than waiting for a fixed counting interval before any analysis is performed. In this paper, the Bayesian model-based approach, the physics and signal processing models, and the decision functions are discussed, along with the first results of our research.
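The sequential decision logic described here can be illustrated with a toy model. The sketch below is not the authors' processor: it assumes hypothetical Poisson count data with known background and source rates (`b`, `s` are invented values), updates the posterior odds after each bin, and declares a detection as soon as the posterior crosses a threshold rather than after a fixed counting interval.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Poisson rates (counts per time bin): background vs added source
b, s = 5.0, 3.0

def sequential_detect(counts, prior=0.5, threshold=0.99):
    """Update P(source | data) after each count and stop as soon as the
    posterior crosses the threshold (no fixed counting interval)."""
    log_odds = np.log(prior / (1.0 - prior))
    for t, k in enumerate(counts, start=1):
        # log Poisson likelihood ratio for one bin: k*ln((b+s)/b) - s
        log_odds += k * np.log((b + s) / b) - s
        if 1.0 / (1.0 + np.exp(-log_odds)) > threshold:
            return t                    # detection declared at bin t
    return None                         # never statistically justified

counts = rng.poisson(b + s, size=100)   # simulated data with a source present
print(sequential_detect(counts))
```

With a source present, the posterior odds drift upward on average, so the stopping time is typically reached after only a handful of bins.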

  12. Probing CP violation in h → γγ with converted photons

    DOE PAGES

    Bishara, Fady; Grossman, Yuval; Harnik, Roni; ...

    2014-04-11

    We study Higgs diphoton decays in which both photons undergo nuclear conversion to electron-positron pairs. The kinematic distribution of the two electron-positron pairs may be used to probe the CP-violating (CPV) coupling of the Higgs to photons that may be produced by new physics. Detecting CPV in this manner requires interference between the spin-polarized helicity amplitudes for both conversions. We derive leading-order, analytic forms for these amplitudes. In turn, we obtain compact, leading-order expressions for the full process rate. While performing experiments involving photon conversions may be challenging, we use the results of our analysis to construct experimental cuts on certain observables that may enhance sensitivity to CPV. We show that there exist regions of phase space on which sensitivity to CPV is of order unity. The statistical sensitivity of these cuts is verified numerically using dedicated Monte Carlo simulations.

  13. High Energy Astrophysics and Cosmology from Space: NASA's Physics of the Cosmos Program

    NASA Astrophysics Data System (ADS)

    Hornschemeier, Ann

    2016-03-01

    We summarize currently funded NASA activities in high energy astrophysics and cosmology, embodied in the NASA Physics of the Cosmos program, including updates on technology development and mission studies. The portfolio includes development of a space mission for measuring gravitational waves from merging supermassive black holes, currently envisioned as a collaboration with the European Space Agency (ESA) on its L3 mission, and development of an X-ray observatory that will measure X-ray emission from the final stages of accretion onto black holes, currently envisioned as a NASA collaboration on ESA's Athena observatory. The portfolio also includes the study of cosmic rays and gamma-ray photons resulting from a range of processes, of the physical process of inflation associated with the birth of the universe, and of the nature of the dark energy that dominates the mass-energy of the modern universe. The program is supported by an analysis group, the PhysPAG, which serves as a forum for community input and analysis; the talk will include a description of this group's activities.

  14. Additive hazards regression and partial likelihood estimation for ecological monitoring data across space.

    PubMed

    Lin, Feng-Chang; Zhu, Jun

    2012-01-01

    We develop continuous-time models for the analysis of environmental or ecological monitoring data in which subjects are observed at multiple monitoring time points across space. Of particular interest are additive hazards regression models where the baseline hazard function can take on flexible forms. We consider time-varying covariates and take into account spatial dependence via autoregression in space and time. We develop statistical inference for the regression coefficients via partial likelihood. Asymptotic properties, including consistency and asymptotic normality, are established for parameter estimates under suitable regularity conditions. Feasible algorithms utilizing existing statistical software packages are developed for computation. We also consider a simpler additive hazards model with a homogeneous baseline hazard and develop hypothesis testing for homogeneity. A simulation study demonstrates that statistical inference using partial likelihood has sound finite-sample properties and offers a viable alternative to maximum likelihood estimation. For illustration, we analyze data from an ecological study that monitors bark beetle colonization of red pines in a Wisconsin plantation.

  15. Comment on ‘Are physicists afraid of mathematics?’

    NASA Astrophysics Data System (ADS)

    Higginson, Andrew D.; Fawcett, Tim W.

    2016-11-01

    In 2012, we showed that the citation count for articles in ecology and evolutionary biology declines with increasing density of equations. Kollmer et al (2015 New J. Phys. 17 013036) claim this effect is an artefact of the manner in which we plotted the data. They also present citation data from Physical Review Letters and argue, based on graphs, that citation counts are unrelated to equation density. Here we show that both claims are misguided. We identified the effects in biology not by visual means, but using the most appropriate statistical analysis. Since Kollmer et al did not carry out any statistical analysis, they cannot draw reliable inferences about the citation patterns in physics. We show that when statistically analysed their data actually do provide evidence that in physics, as in biology, citation counts are lower for articles with a high density of equations. This indicates that a negative relationship between equation density and citations may extend across the breadth of the sciences, even those in which researchers are well accustomed to mathematical descriptions of natural phenomena. We restate our assessment that this is a genuine problem and discuss what we think should be done about it.

  16. Physics Education: A Significant Backbone of Sustainable Development in Developing Countries

    NASA Astrophysics Data System (ADS)

    Akintola, R. A.

    2006-08-01

    In the quest for technological self-reliance, many policies, programs, and projects have been proposed and implemented to solve the problems of technological inadequacy in developing countries. It has been observed that these efforts have largely failed. This research identifies the problems, proposes lasting solutions to revitalize physics education in developing nations, and highlights possible future gains. The analysis was based on questionnaires, interviews, and statistical treatment of the resulting data.

  17. Use of discrete chromatic space to tune the image tone in a color image mosaic

    NASA Astrophysics Data System (ADS)

    Zhang, Zuxun; Li, Zhijiang; Zhang, Jianqing; Zheng, Li

    2003-09-01

    Color image processing is an important problem. The prevailing approach is to transform the RGB colour space into another colour space, such as HSI (hue, saturation, intensity), YIQ, or LUV. In fact, processing a colour airborne image in a single colour space may not be valid: the electromagnetic signal is physically altered in every wave band, while the colour image is perceived through the psychology of vision. It is therefore necessary to propose an approach that accords with both the physical transformation and psychological perception. An analysis of how to use the relevant colour spaces to process colour airborne photographs is presented, along with an application to tuning the image tone in a colour airborne image mosaic. As a practical matter, a complete approach to performing the mosaic on colour airborne images, taking full advantage of the relevant colour spaces, is discussed.
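Since the record turns on transforming RGB into a perceptual colour space, a minimal sketch of the standard geometric RGB-to-HSI conversion may help; this is the textbook transform, not the authors' specific tone-tuning method.

```python
import numpy as np

def rgb_to_hsi(r, g, b):
    """Convert one RGB pixel (components in [0, 1]) to the HSI colour space,
    which separates perceived hue and saturation from intensity."""
    i = (r + g + b) / 3.0                       # intensity: mean of channels
    mn = min(r, g, b)
    s = 0.0 if i == 0 else 1.0 - mn / i         # saturation: distance from gray
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b))
    theta = np.degrees(np.arccos(np.clip(num / (den + 1e-12), -1.0, 1.0)))
    h = theta if b <= g else 360.0 - theta      # hue angle in degrees
    return h, s, i

print(rgb_to_hsi(1.0, 0.0, 0.0))  # pure red: hue ~0 deg, full saturation
```

Tone adjustments such as histogram matching are then applied to the intensity channel alone, leaving hue and saturation untouched.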

  18. Development of students' conceptual thinking by means of video analysis and interactive simulations at technical universities

    NASA Astrophysics Data System (ADS)

    Hockicko, Peter; Krišťák, Ľuboš; Němec, Miroslav

    2015-03-01

    Video analysis, using the program Tracker (Open Source Physics), in the educational process introduces a new creative method of teaching physics and makes natural sciences more interesting for students. This way of exploring the laws of nature can amaze students because this illustrative and interactive educational software inspires them to think creatively, improves their performance and helps them in studying physics. This paper deals with increasing the key competencies in engineering by analysing real-life situation videos - physical problems - by means of video analysis and the modelling tools using the program Tracker and simulations of physical phenomena from The Physics Education Technology (PhET™) Project (VAS method of problem tasks). The statistical testing using the t-test confirmed the significance of the differences in the knowledge of the experimental and control groups, which were the result of interactive method application.
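The significance testing mentioned above can be reproduced in outline. A hedged sketch with invented score data (the real study used its own pre/post-test instruments and sample sizes):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical post-test scores (out of 100): the group taught with video
# analysis and simulations vs a conventionally taught control group
experimental = rng.normal(72.0, 10.0, size=40)
control = rng.normal(65.0, 10.0, size=40)

t_stat, p_value = stats.ttest_ind(experimental, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# p below the chosen alpha (e.g. 0.05) -> a statistically significant difference
```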

  19. Unravelling the geometry of data matrices: effects of water stress regimes on winemaking.

    PubMed

    Fushing, Hsieh; Hsueh, Chih-Hsin; Heitkamp, Constantin; Matthews, Mark A; Koehl, Patrice

    2015-10-06

    A new method is proposed for unravelling the patterns between a set of experiments and the features that characterize those experiments. The aims are to extract these patterns in the form of a coupling between the rows and columns of the corresponding data matrix and to use this geometry as a support for model testing. These aims are reached through two key steps, namely application of an iterative geometric approach to couple the metric spaces associated with the rows and columns, and use of statistical physics to generate matrices that mimic the original data while maintaining their inherent structure, thereby providing the basis for hypothesis testing and statistical inference. The power of this new method is illustrated on the study of the impact of water stress conditions on the attributes of 'Cabernet Sauvignon' Grapes, Juice, Wine and Bottled Wine from two vintages. The first step, named data mechanics, de-convolutes the intrinsic effects of grape berries and wine attributes due to the experimental irrigation conditions from the extrinsic effects of the environment. The second step provides an analysis of the associations of some attributes of the bottled wine with characteristics of either the matured grape berries or the resulting juice, thereby identifying statistically significant associations between the juice pH, yeast assimilable nitrogen, and sugar content and the bottled wine alcohol level. © 2015 The Author(s).

  1. Identification and diagnosis of spatiotemporal hydrometeorological structure of heavy precipitation induced floods in Southeast Asia

    NASA Astrophysics Data System (ADS)

    Lu, M.; Hao, X.; Devineni, N.

    2017-12-01

    Extreme floods have a long history of being an important cause of death and destruction worldwide. It is estimated by Munich RE and Swiss RE that floods and severe storms dominate all other natural hazards globally in terms of average annual property loss and human fatalities. The top 5 most disastrous floods in the period from 1900 to 2015, ranked by economic damage, all occurred in the Asian monsoon region. This study presents an interdisciplinary approach integrating hydrometeorology, atmospheric science, and state-of-the-art space-time statistics and modeling to investigate the association between the space-time characteristics of floods, precipitation, and atmospheric moisture transport in a statistical and physical framework. A tropical moisture export dataset and a curve clustering algorithm are used to study source-to-destination features, to explore teleconnected climate regulation of the moisture formation process at different timescales (PDO, ENSO, and MJO), and to study the role of synoptic-to-large-scale atmospheric steering in moisture transport and convergence.

  2. Analysis of differences in exercise recognition by constraints on physical activity of hospitalized cancer patients based on their medical history.

    PubMed

    Choi, Mi-Ri; Jeon, Sang-Wan; Yi, Eun-Surk

    2018-04-01

    The purpose of this study is to analyze the differences among hospitalized cancer patients in their perception of exercise and in constraints on physical activity, based on their medical history. The study used a questionnaire survey as the measurement tool for 194 cancer patients (male or female, aged 20 or older) living in the Seoul metropolitan area (Seoul, Gyeonggi, Incheon). The collected data were analyzed using frequency analysis, exploratory factor analysis, reliability analysis, the t-test, and one-way ANOVA in the statistical program SPSS 18.0. The following results were obtained. First, there was no statistically significant difference between cancer stage and exercise recognition/physical activity constraint. Second, there was a significant difference between cancer stage and sociocultural constraint/facility constraint/program constraint. Third, there was a significant difference between cancer operation history and physical/sociocultural/facility/program constraint. Fourth, there was a significant difference between cancer operation history and negative perception/facility/program constraint. Fifth, there was a significant difference between ancillary cancer treatment method and negative perception/facility/program constraint. Sixth, there was a significant difference between hospitalization period and positive perception/negative perception/physical constraint/cognitive constraint. In conclusion, this study will provide information necessary to create a patient-centered healthcare service system by analyzing the exercise recognition of hospitalized cancer patients based on their medical history and by investigating the constraint factors that prevent patients from actually making efforts to exercise.

  3. Wave chaos in a randomly inhomogeneous waveguide: spectral analysis of the finite-range evolution operator.

    PubMed

    Makarov, D V; Kon'kov, L E; Uleysky, M Yu; Petrov, P S

    2013-01-01

    The problem of sound propagation in a randomly inhomogeneous oceanic waveguide is considered. An underwater sound channel in the Sea of Japan is taken as an example. Our attention is concentrated on the domains of finite-range ray stability in phase space and their influence on wave dynamics. These domains can be found by means of the one-step Poincaré map. To study manifestations of finite-range ray stability, we introduce the finite-range evolution operator (FREO) describing transformation of a wave field in the course of propagation along a finite segment of a waveguide. Carrying out statistical analysis of the FREO spectrum, we estimate the contribution of regular domains and explore their evanescence with increasing length of the segment. We utilize several methods of spectral analysis: analysis of eigenfunctions by expanding them over modes of the unperturbed waveguide, approximation of level-spacing statistics by means of the Berry-Robnik distribution, and the procedure used by A. Relano and coworkers [Relano et al., Phys. Rev. Lett. 89, 244102 (2002); Relano, Phys. Rev. Lett. 100, 224101 (2008)]. Comparing the results obtained with different methods, we find that the method based on the statistical analysis of FREO eigenfunctions is the most favorable for estimating the contribution of regular domains. It allows one to find directly the waveguide modes whose refraction is regular despite the random inhomogeneity. For example, it is found that near-axial sound propagation in the Sea of Japan preserves stability even over distances of hundreds of kilometers due to the presence of a shearless torus in the classical phase space. Increasing the acoustic wavelength weakens scattering, resulting in recovery of eigenfunction localization near periodic orbits of the one-step Poincaré map.
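The level-spacing analysis invoked above rests on standard spectral statistics. As a hedged illustration (not the paper's FREO computation), the unfolding-free spacing-ratio statistic below distinguishes a "chaotic" spectrum, modeled by a Gaussian Orthogonal Ensemble (GOE) random matrix, from a "regular" (Poissonian) sequence of levels:

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_spacing_ratio(levels):
    """Mean ratio r = min(s_n, s_{n+1}) / max(s_n, s_{n+1}) of consecutive
    level spacings, a spectral statistic that needs no unfolding."""
    s = np.diff(np.sort(levels))
    r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
    return r.mean()

n = 2000
m = rng.normal(size=(n, n))
goe = (m + m.T) / np.sqrt(2 * n)               # GOE matrix: 'chaotic' spectrum
r_goe = mean_spacing_ratio(np.linalg.eigvalsh(goe))

r_poisson = mean_spacing_ratio(rng.uniform(0.0, 1.0, size=n))  # 'regular' levels

print(r_goe, r_poisson)  # roughly 0.536 (GOE) vs 0.386 (Poisson)
```

A mixed regular/chaotic system interpolates between these two limits, which is what Berry-Robnik-style fits quantify.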

  4. Monitoring the metering performance of an electronic voltage transformer on-line based on cyber-physics correlation analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Zhu; Li, Hongbin; Tang, Dengping; Hu, Chen; Jiao, Yang

    2017-10-01

    Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physics correlation analysis. Exploiting the electrical and physical properties of a substation operating in three-phase symmetry, principal component analysis is used to separate the metering deviation caused by primary-side fluctuations from that caused by an EVT anomaly. Characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing changes in these statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method thus demonstrates accurate on-line monitoring of EVT metering performance without a standard voltage transformer.
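The separation of common primary-side fluctuations from a single-channel anomaly, which the article attributes to three-phase symmetry, can be sketched with ordinary PCA. Everything below is simulated; the drift magnitude and noise levels are invented for illustration and this is not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated per-interval voltage readings (per unit) from the three phase EVTs.
# A primary-side fluctuation is common to all phases; a slow metering drift in
# phase A's EVT (the hypothetical anomaly) appears in that channel only.
n = 500
common = 1.0 + 0.01 * rng.normal(size=n)       # shared primary fluctuation
X = common[:, None] + 0.001 * rng.normal(size=(n, 3))
X[:, 0] += np.linspace(0.0, 0.01, n)           # drift of phase A's EVT

Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)   # PCA via SVD
pc1 = Vt[0]                                    # direction of common fluctuation
residual = Xc - np.outer(Xc @ pc1, pc1)        # remove the primary-side component

# The phase whose EVT drifts has the largest residual variability
print(residual.std(axis=0))
```

Because the first principal component absorbs what all three phases do together, a deviation confined to one channel survives in the residual and can be tracked statistically over time.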

  5. Space Radiation Induced Cytogenetic Damage in the Blood Lymphocytes of Astronauts: Persistence of Damage After Flight and the Effects of Repeat Long Duration Missions

    NASA Technical Reports Server (NTRS)

    George, Kerry; Rhone, Jordan; Chappell, L. J.; Cucinotta, F. A.

    2010-01-01

    Cytogenetic damage was assessed in blood lymphocytes from astronauts before and after they participated in long-duration space missions of three months or more. The frequency of chromosome damage was measured by fluorescence in situ hybridization (FISH) chromosome painting before flight and at various intervals from a few days to many months after return from the mission. For all individuals, the frequency of chromosome exchanges measured within a month of return from space was higher than their preflight yield. However, some individuals showed a temporal decline in chromosome damage with time after flight. Statistical analysis using combined data for all astronauts indicated a significant overall decreasing trend in total chromosome exchanges with time after flight, although this trend was not seen for all astronauts, and the yield of chromosome damage in some individuals actually increased with time after flight. The decreasing trend in total exchanges was slightly more significant when the statistical analysis was restricted to data collected more than 220 days after return from flight. In addition, limited data on multiple flights show a lack of correlation between time in space and translocation yields. Data from three crewmembers who had participated in two separate long-duration space missions provide limited information on the effect of repeat flights and show a possible adaptive response to space radiation exposure.

  6. Teaching Perspectives of Chinese Teachers: Compatibility with the Goals of the Physical Education Curriculum

    ERIC Educational Resources Information Center

    Wang, Lijuan; Ha, Amy Sau-ching; Wen, Xu

    2014-01-01

    This research primarily aimed to examine the compatibility of teaching perspectives of teachers with the Physical Education (PE) curriculum in China. The Teaching Perspective Inventory (Pratt, 1998) was used to collect data from 272 PE teachers. Descriptive statistics, MANOVAs, and correlational procedures were used for quantitative data analysis.…

  7. Granger causality for state-space models

    NASA Astrophysics Data System (ADS)

    Barnett, Lionel; Seth, Anil K.

    2015-04-01

    Granger causality has long been a prominent method for inferring causal interactions between stochastic variables for a broad range of complex physical systems. However, it has been recognized that a moving average (MA) component in the data presents a serious confound to Granger causal analysis, as routinely performed via autoregressive (AR) modeling. We solve this problem by demonstrating that Granger causality may be calculated simply and efficiently from the parameters of a state-space (SS) model. Since SS models are equivalent to autoregressive moving average models, Granger causality estimated in this fashion is not degraded by the presence of a MA component. This is of particular significance when the data has been filtered, downsampled, observed with noise, or is a subprocess of a higher dimensional process, since all of these operations—commonplace in application domains as diverse as climate science, econometrics, and the neurosciences—induce a MA component. We show how Granger causality, conditional and unconditional, in both time and frequency domains, may be calculated directly from SS model parameters via solution of a discrete algebraic Riccati equation. Numerical simulations demonstrate that Granger causality estimators thus derived have greater statistical power and smaller bias than AR estimators. We also discuss how the SS approach facilitates relaxation of the assumptions of linearity, stationarity, and homoscedasticity underlying current AR methods, thus opening up potentially significant new areas of research in Granger causal analysis.
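The core computation described above, obtaining Granger causality from state-space parameters via a discrete algebraic Riccati equation (DARE), can be sketched directly. This is a minimal illustration on a hypothetical two-variable system (the matrices `A`, `C`, `Q`, `R` are invented for the example), not the authors' full conditional/spectral treatment:

```python
import numpy as np
from scipy.linalg import solve_discrete_are

def innovations_cov(A, C, Q, R):
    """Solve the filtering DARE for the steady-state predicted state
    covariance P, then return the innovations covariance V = C P C' + R."""
    P = solve_discrete_are(A.T, C.T, Q, R)
    return C @ P @ C.T + R

# Hypothetical two-variable system: x1 drives x2 (A[1, 0] = 0.5),
# observed with additive noise (y = x + v)
A = np.array([[0.9, 0.0],
              [0.5, 0.8]])
C = np.eye(2)
Q = np.eye(2)
R = 0.1 * np.eye(2)

V = innovations_cov(A, C, Q, R)            # full-model innovations covariance

# Reduced model: drop y1 from the observation equation
C2 = C[1:, :]
R2 = R[1:, 1:]
V_R = innovations_cov(A, C2, Q, R2)

# Granger causality y1 -> y2: log ratio of reduced to full innovation variance
F = np.log(V_R[0, 0] / V[1, 1])
print(F)  # positive: y1 helps predict y2
```

The reduced model, which may no longer predict `y2` as well, needs its own DARE solve; the log-ratio of the two innovation variances is the (unconditional, time-domain) Granger causality.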

  8. Space processing applications payload equipment study. Volume 2A: Experiment requirements

    NASA Technical Reports Server (NTRS)

    Smith, A. G.; Anderson, W. T., Jr.

    1974-01-01

    An analysis of the space processing applications payload equipment was conducted. The primary objective was to perform a review and an update of the space processing activity research equipment requirements and specifications that were derived in the first study. The analysis is based on the six major experimental classes of: (1) biological applications, (2) chemical processes in fluids, (3) crystal growth, (4) glass technology, (5) metallurgical processes, and (6) physical processes in fluids. Tables of data are prepared to show the functional requirements for the areas of investigation.

  9. [The ways in which variations in space and atmospheric factors act upon the biosphere and humans].

    PubMed

    Chernogor, L F

    2010-01-01

    System analysis is shown to be an efficient means for studying the channels through which variations in space and tropospheric weather affect the biosphere (humans). The basics of the system analysis paradigm are presented. The causes of variations in space and tropospheric weather are determined, and the interrelations between them are demonstrated. The ways in which these variations affect the biosphere (humans) are discussed. Aperiodic and quasi-periodic disturbances in the physical fields that influence the biosphere (humans) are compared.

  10. The link between perceived characteristics of neighbourhood green spaces and adults' physical activity in UK cities: analysis of the EURO-URHIS 2 Study.

    PubMed

    Ali, Omer; Di Nardo, Francesco; Harrison, Annie; Verma, Arpana

    2017-08-01

    Urban dwellers represent half the world's population and are increasing worldwide. Their health and behaviours are affected by the built environment, and green areas may play a major role in promoting physical activity, thus decreasing the burden of chronic diseases, overweight and inactivity. However, the availability of green areas may not guarantee healthy levels of physical activity among urban dwellers. It is therefore necessary to study how the perceived characteristics of green areas affect physical activity. Data from the EURO-URHIS 2 survey of residents of 13 cities across the UK were analyzed, and a multivariable model was created in order to assess the association between their perceptions of the green areas in their neighbourhood and their engagement in physical activity. Results were adjusted for age, gender and other potential confounders. Those who felt unable to engage in active recreational activities in their local green spaces were significantly less likely to carry out moderate physical exercise for at least 60 min per week (adjusted OR: 0.50; 95% CI: 0.37-0.68). Availability of green areas within walking distance did not affect engagement in physical activity. Other characteristics such as accessibility and safety may play an important role. This study showed that the presence of green space may not itself encourage the necessary preventative health behaviours to tackle physical inactivity in urban populations. Development of more appropriate green spaces may be required. Further research is needed to shed light on the types of green spaces that are most effective. © The Author 2017. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  11. NASA Applications and Lessons Learned in Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision-making processes. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses applications in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of these case studies include reliability-based life limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction in the SSME alternate turbopump development, the impact of External Tank foam reliability on Space Shuttle system risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.

  12. "I didn't think I could get out of the fucking park." Gay men's retrospective accounts of neighborhood space, emerging sexuality and migrations.

    PubMed

    Frye, Victoria; Egan, James E; Van Tieu, Hong; Cerdá, Magdalena; Ompad, Danielle; Koblin, Beryl A

    2014-03-01

    Young, African American and Latino gay, bisexual and other men who have sex with men (MSM) are disproportionately represented among new HIV cases according to the most recent national surveillance statistics. Analysts have noted that these racial/ethnic disparities in HIV among MSM exist within the wider context of sexual, mental and physical health disparities between MSM and heterosexuals. The intercorrelation of these adverse health outcomes among MSM, termed syndemics, has been theorized to be socially produced by a heterosexist social system that marginalizes lesbian, gay, bisexual, MSM and other sexual minorities. African American and Latino MSM experience overlapping systems of oppression that may increase their risk of experiencing syndemic health outcomes. In this paper, using data from twenty in-depth qualitative interviews with MSM living in four New York City (NYC) neighborhoods, we present accounts of neighborhood space, examining how space can both physically constitute and reinforce social systems of stratification and oppression, which in turn produce social disparities in sexual health outcomes. By analyzing accounts of emerging sexuality in neighborhood space, i.e. across time and space, we identify pathways to risk and contribute to our understanding of how neighborhood space is experienced by gay men, adding to our ability to support young men as they emerge in place and to shape the social topography of urban areas. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    PubMed

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole-genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high-resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied to single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data.
This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
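
    The two information-theoretic quantities the abstract names, Shannon entropy and the Jensen-Shannon distance, can be sketched directly. The toy methylation-level distributions below are illustrative stand-ins, not data from the paper:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete probability distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0 * log 0 is taken as 0
    return float(-np.sum(p * np.log2(p)))

def jensen_shannon_distance(p, q):
    """Jensen-Shannon distance: square root of the JS divergence (base 2),
    bounded in [0, 1]."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    jsd = shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))
    return float(np.sqrt(max(jsd, 0.0)))

# Hypothetical methylation-level distributions over 5 discrete levels (0%..100%)
normal = np.array([0.70, 0.15, 0.05, 0.05, 0.05])   # mostly unmethylated, low entropy
tumor  = np.array([0.20, 0.20, 0.20, 0.20, 0.20])   # highly stochastic

print(shannon_entropy(tumor))                       # maximal entropy, log2(5)
print(jensen_shannon_distance(normal, tumor))       # differential methylation score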

  14. Economic fluctuations and statistical physics: Quantifying extremely rare and less rare events in finance

    NASA Astrophysics Data System (ADS)

    Stanley, H. E.; Gabaix, Xavier; Gopikrishnan, Parameswaran; Plerou, Vasiliki

    2007-08-01

    One challenge of economics is that the systems it treats have no perfect metronome in time and no perfect spatial architecture, crystalline or otherwise. Nonetheless, as if by magic, out of nothing but randomness one finds remarkably fine-tuned processes in time. We present an overview of recent research joining practitioners of economic theory and statistical physics to try to better understand puzzles regarding economic fluctuations. One of these puzzles is how to describe outliers, phenomena that lie outside of patterns of statistical regularity. We review evidence consistent with the possibility that such outliers may not exist. This possibility is supported by recent analysis of databases containing information about each trade of every stock.

  15. Multiscale solvers and systematic upscaling in computational physics

    NASA Astrophysics Data System (ADS)

    Brandt, A.

    2005-07-01

    Multiscale algorithms can overcome the scale-born bottlenecks that plague most computations in physics. These algorithms employ separate processing at each scale of the physical space, combined with interscale iterative interactions, in ways which use finer scales very sparingly. Having been developed first and well known as multigrid solvers for partial differential equations, highly efficient multiscale techniques have more recently been developed for many other types of computational tasks, including: inverse PDE problems; highly indefinite (e.g., standing wave) equations; Dirac equations in disordered gauge fields; fast computation and updating of large determinants (as needed in QCD); fast integral transforms; integral equations; astrophysics; molecular dynamics of macromolecules and fluids; many-atom electronic structures; global and discrete-state optimization; practical graph problems; image segmentation and recognition; tomography (medical imaging); fast Monte-Carlo sampling in statistical physics; and general, systematic methods of upscaling (accurate numerical derivation of large-scale equations from microscopic laws).

  16. Mutual information, neural networks and the renormalization group

    NASA Astrophysics Data System (ADS)

    Koch-Janusz, Maciej; Ringel, Zohar

    2018-06-01

    Physical systems differing in their microscopic details often display strikingly similar behaviour when probed at macroscopic scales. Those universal properties, largely determining their physical characteristics, are revealed by the powerful renormalization group (RG) procedure, which systematically retains `slow' degrees of freedom and integrates out the rest. However, the important degrees of freedom may be difficult to identify. Here we demonstrate a machine-learning algorithm capable of identifying the relevant degrees of freedom and executing RG steps iteratively without any prior knowledge about the system. We introduce an artificial neural network based on a model-independent, information-theoretic characterization of a real-space RG procedure, which performs this task. We apply the algorithm to classical statistical physics problems in one and two dimensions. We demonstrate RG flow and extract the Ising critical exponent. Our results demonstrate that machine-learning techniques can extract abstract physical concepts and consequently become an integral part of theory- and model-building.
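
    The abstract's neural-network RG is specific to the paper, but the classical real-space RG step it automates can be illustrated with a plain majority-rule block-spin transformation on a 2D Ising configuration (a standard textbook decimation, not the authors' algorithm):

```python
import numpy as np

def majority_rule_step(spins, b=2):
    """One real-space RG step on a 2D Ising configuration of +/-1 spins:
    partition the lattice into b x b blocks and replace each block by the
    sign of its summed spin (ties broken toward +1)."""
    L = spins.shape[0]
    assert L % b == 0, "lattice size must be divisible by the block size"
    blocks = spins.reshape(L // b, b, L // b, b).sum(axis=(1, 3))
    return np.where(blocks >= 0, 1, -1)

rng = np.random.default_rng(0)
config = rng.choice([-1, 1], size=(8, 8))   # a random high-temperature configuration
coarse = majority_rule_step(config)
print(coarse.shape)                          # (4, 4)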

  17. Honors

    NASA Astrophysics Data System (ADS)

    Anonymous

    2012-05-01

    A number of AGU members were honored during the European Geosciences Union's (EGU) General Assembly, held on 22-27 April in Vienna. EGU Union awards were presented to the following people: Vincent Courtillot, University of Paris Diderot, France, received the 2012 Arthur Holmes Medal and EGU honorary membership for seminal contributions to geomagnetism and the geodynamics of mantle hot spots. Michael Ghil, University of California, Los Angeles, and École Normale Supérieure, France, received the 2012 Alfred Wegener Medal and EGU honorary membership for his leading contributions to theoretical climate dynamics; his innovative observational studies involving model assimilation of satellite data in meteorology, oceanography, and space physics; the breadth of his interdisciplinary studies, including macroeconomics; and his extensive supervision and mentoring of scores of graduate and postdoctoral students. Robin Clarke, Universidade Federal do Rio Grande do Sul, Brazil, received the 2012 Alexander von Humboldt Medal for fundamental contributions in statistical analysis and modeling of hydrological processes. Angioletta Coradini, Istituto Nazionale di Astrofisica, Italy, received the 2012 Jean Dominique Cassini Medal and EGU honorary membership in recognition of her important and wide-ranging work in planetary sciences and solar system formation and for her leading role in the development of space infrared instrumentation for planetary exploration.

  19. On-Orbit System Identification

    NASA Technical Reports Server (NTRS)

    Mettler, E.; Milman, M. H.; Bayard, D.; Eldred, D. B.

    1987-01-01

    Information derived from accelerometer readings benefits important engineering and control functions. Report discusses methodology for detection, identification, and analysis of motions within space station. Techniques of vibration and rotation analyses, control theory, statistics, filter theory, and transform methods integrated to form system for generating models and model parameters that characterize total motion of complicated space station, with respect to both control-induced and random mechanical disturbances.

  20. Spectral analysis of groove spacing on Ganymede

    NASA Technical Reports Server (NTRS)

    Grimm, R. E.; Squyres, S. W.

    1985-01-01

    A quantitative analysis of groove spacing on Ganymede is described. Fourier transforms of a large number of photometric profiles across groove sets are calculated and the resulting power spectra are examined for the position and strength of peaks representing topographic periodicities. The geographic and global statistical distributions of groove wavelengths are examined, and these data are related to models of groove tectonism. It is found that groove spacing on Ganymede shows an approximately log-normal distribution with a minimum of about 3.5 km, a maximum of about 17 km, and a mean of 8.4 km. Groove spacing tends to be quite regular within a single groove set but can vary substantially from one groove set to another within a single geographic region.
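
    The core computation described, taking Fourier transforms of topographic profiles and reading the dominant groove spacing off the power spectrum, can be sketched on a synthetic profile. The 8 km spacing and sampling parameters below are illustrative, not Voyager data:

```python
import numpy as np

# Synthetic photometric profile: grooves with an 8 km spacing sampled
# every 0.5 km along a 128 km transect, plus a little noise.
dx = 0.5                                   # km per sample
n = 256
x = np.arange(n) * dx
profile = np.sin(2 * np.pi * x / 8.0) + 0.1 * np.random.default_rng(1).normal(size=n)

# Power spectrum of the mean-removed profile; the strongest peak gives
# the dominant topographic periodicity.
power = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
freqs = np.fft.rfftfreq(n, d=dx)           # spatial frequency, cycles per km
peak = np.argmax(power[1:]) + 1            # skip the zero-frequency bin
dominant_wavelength = 1.0 / freqs[peak]
print(dominant_wavelength)                 # 8.0 (km)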

  1. Preliminary results of the comparative study between EO-1/Hyperion and ALOS/PALSAR

    NASA Astrophysics Data System (ADS)

    Koizumi, E.; Furuta, R.; Yamamoto, A.

    2011-12-01

    [Introduction] Hyper-spectral remote sensing images have been used for land-cover classification because of their high spectral resolution. Synthetic Aperture Radar (SAR) remote sensing data are also useful for probing surface conditions because radar images reflect surface geometry, although there are few reports on land-cover detection using hyper-spectral and SAR data in combination. Among SAR sensors, L-band SAR is thought to be a useful tool for finding physical properties because its comparatively long wavelength can penetrate small objects on the surface. We are comparing land-cover classifications and/or physical values derived from hyper-spectral and L-band SAR data to find the relationship between these two quite different sensors and to confirm the possibility of combined analysis of hyper-spectral and L-band SAR data; in this presentation we report the preliminary results of this study. At present there are only a few sources of both hyper-spectral and L-band SAR data from space; however, several space organizations plan to launch new satellites carrying hyper-spectral or L-band SAR equipment within the next few years, so the importance of combined analysis will increase more than ever. [Target Area] We are performing and planning analyses of the following areas in this study: (a) south of Cairo, Nile river area, Egypt, for sand, sandstone, limestone, river, and crops; (b) Mount Sakurajima, Japan, for igneous rock and other related geological properties. [Methods and Results] EO-1 Hyperion data are analyzed as the hyper-spectral data in this study. The Hyperion instrument has 242 channels, but some of them are entirely noise or contain no data. We selected channels for analysis by checking each channel, retaining about 150 channels (depending on the area). Before analysis, the ATCOR-3 atmospheric correction was applied to the selected channels.
    The corrected data were analyzed by unsupervised classification and principal component analysis (PCA). We also performed unsupervised classification on several components from the PCA. According to the analysis results, several classes can be extracted for each category (vegetation, sand and rocks, and water). One of the interesting results is that sand splits into a few classes, as do the other categories, and these classes seem to reflect artificial and natural surface changes resulting from excavation or scraping. ALOS PALSAR data are analyzed as the L-band SAR data. We selected dual-polarization data for each target area, converted them to backscatter images, and then calculated image statistics. Topographic information is also derived with the SAR interferometry technique as a reference. Comparing the Hyperion classification results with the statistics calculated from PALSAR, there are some areas where correlations seem to be confirmed. Further studies are still required to confirm that the combined analysis of hyper-spectral and L-band SAR data can detect and classify surface materials. We will continue to investigate more efficient analytic methods and to examine other factors, such as the channels adopted, the number of classes in the classification, and the kind of statistical information used, to refine the method.
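
    The processing chain the abstract sketches, PCA followed by unsupervised classification, can be illustrated on a toy scene. Everything below (band count, spectra, the two surface classes, the simple 2-means clusterer) is a stand-in, not Hyperion data or the authors' pipeline:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "hyperspectral" scene: 200 pixels x 150 bands, two surface classes
# with distinct spectra plus per-band noise.
spec_a = np.linspace(0.2, 0.8, 150)          # e.g. a vegetation-like ramp
spec_b = np.full(150, 0.5)                   # e.g. a flat sand-like spectrum
pixels = np.vstack([spec_a + 0.02 * rng.normal(size=(100, 150)),
                    spec_b + 0.02 * rng.normal(size=(100, 150))])

# PCA via SVD of the mean-centered data; keep the leading components.
centered = pixels - pixels.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:3].T                 # first 3 principal components

# Minimal 2-means clustering on the PCA scores (unsupervised classification).
centers = scores[[0, -1]].copy()             # seed from two known-different pixels
for _ in range(20):
    d = np.linalg.norm(scores[:, None, :] - centers[None], axis=2)
    labels = d.argmin(axis=1)
    centers = np.array([scores[labels == k].mean(axis=0) for k in range(2)])

print(np.bincount(labels))                   # two clusters recovered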

  2. Orbit Determination (OD) Error Analysis Results for the Triana Sun-Earth L1 Libration Point Mission and for the Fourier Kelvin Stellar Interferometer (FKSI) Sun-Earth L2 Libration Point Mission Concept

    NASA Technical Reports Server (NTRS)

    Marr, Greg C.

    2003-01-01

    The Triana spacecraft was designed to be launched by the Space Shuttle. The nominal Triana mission orbit will be a Sun-Earth L1 libration point orbit. Using the NASA Goddard Space Flight Center's Orbit Determination Error Analysis System (ODEAS), orbit determination (OD) error analysis results are presented for all phases of the Triana mission from the first correction maneuver through approximately launch plus 6 months. Results are also presented for the science data collection phase of the Fourier Kelvin Stellar Interferometer Sun-Earth L2 libration point mission concept with momentum unloading thrust perturbations during the tracking arc. The Triana analysis includes extensive analysis of an initial short arc orbit determination solution and results using both Deep Space Network (DSN) and commercial Universal Space Network (USN) statistics. These results could be utilized in support of future Sun-Earth libration point missions.

  3. Topics in quantum chaos

    NASA Astrophysics Data System (ADS)

    Jordan, Andrew Noble

    2002-09-01

    In this dissertation, we study the quantum mechanics of classically chaotic dynamical systems. We begin by considering the decoherence effects a quantum chaotic system has on a simple quantum few state system. Typical time evolution of a quantum system whose classical limit is chaotic generates structures in phase space whose size is much smaller than Planck's constant. A naive application of Heisenberg's uncertainty principle indicates that these structures are not physically relevant. However, if we take the quantum chaotic system in question to be an environment which interacts with a simple two state quantum system (qubit), we show that these small phase-space structures cause the qubit to generically lose quantum coherence if and only if the environment has many degrees of freedom, such as a dilute gas. This implies that many-body environments may be crucial for the phenomenon of quantum decoherence. Next, we turn to an analysis of statistical properties of time correlation functions and matrix elements of quantum chaotic systems. A semiclassical evaluation of matrix elements of an operator indicates that the dominant contribution will be related to a classical time correlation function over the energy surface. For a highly chaotic class of dynamics, these correlation functions may be decomposed into sums of Ruelle resonances, which control exponential decay to the ergodic distribution. The theory is illustrated both numerically and theoretically on the Baker map. For this system, we are able to isolate individual Ruelle modes. We further consider dynamical systems whose approach to ergodicity is given by a power law rather than an exponential in time. We propose a billiard with diffusive boundary conditions, whose classical solution may be calculated analytically. We go on to compare the exact solution with an approximation scheme, as well as calculate asymptotic corrections.
    Quantum spectral statistics are calculated assuming the validity of the Agam, Altshuler, and Andreev ansatz. We find singular behavior of the two point spectral correlator in the limit of small spacing. Finally, we analyse the effect that slow decay to ergodicity has on the structure of the quantum propagator, as well as wavefunction localization. We introduce a statistical quantum description of systems that are composed of both an orderly region and a random region. By averaging over the random region only, we find that measures of localization in momentum space semiclassically diverge with the dimension of the Hilbert space. We illustrate this numerically with quantum maps and suggest various other systems where this behavior should be important.

  4. Decrease of Fisher information and the information geometry of evolution equations for quantum mechanical probability amplitudes.

    PubMed

    Cafaro, Carlo; Alsing, Paul M

    2018-04-01

    The relevance of the concept of Fisher information is increasing in both statistical physics and quantum computing. From a statistical mechanical standpoint, the application of Fisher information in the kinetic theory of gases is characterized by its decrease along the solutions of the Boltzmann equation for Maxwellian molecules in the two-dimensional case. From a quantum mechanical standpoint, the output state in Grover's quantum search algorithm follows a geodesic path obtained from the Fubini-Study metric on the manifold of Hilbert-space rays. Additionally, Grover's algorithm is specified by constant Fisher information. In this paper, we present an information geometric characterization of the oscillatory or monotonic behavior of statistically parametrized squared probability amplitudes originating from special functional forms of the Fisher information function: constant, exponential decay, and power-law decay. Furthermore, for each case, we compute both the computational speed and the availability loss of the corresponding physical processes by exploiting a convenient Riemannian geometrization of useful thermodynamical concepts. Finally, we briefly comment on the possibility of using the proposed methods of information geometry to help identify a suitable trade-off between speed and thermodynamic efficiency in quantum search algorithms.

  5. Decrease of Fisher information and the information geometry of evolution equations for quantum mechanical probability amplitudes

    NASA Astrophysics Data System (ADS)

    Cafaro, Carlo; Alsing, Paul M.

    2018-04-01

    The relevance of the concept of Fisher information is increasing in both statistical physics and quantum computing. From a statistical mechanical standpoint, the application of Fisher information in the kinetic theory of gases is characterized by its decrease along the solutions of the Boltzmann equation for Maxwellian molecules in the two-dimensional case. From a quantum mechanical standpoint, the output state in Grover's quantum search algorithm follows a geodesic path obtained from the Fubini-Study metric on the manifold of Hilbert-space rays. Additionally, Grover's algorithm is specified by constant Fisher information. In this paper, we present an information geometric characterization of the oscillatory or monotonic behavior of statistically parametrized squared probability amplitudes originating from special functional forms of the Fisher information function: constant, exponential decay, and power-law decay. Furthermore, for each case, we compute both the computational speed and the availability loss of the corresponding physical processes by exploiting a convenient Riemannian geometrization of useful thermodynamical concepts. Finally, we briefly comment on the possibility of using the proposed methods of information geometry to help identify a suitable trade-off between speed and thermodynamic efficiency in quantum search algorithms.

  6. Physical interactions of charged particles for radiotherapy and space applications.

    PubMed

    Zeitlin, Cary

    2012-11-01

    In this paper, the basic physics by which energetic charged particles deposit energy in matter is reviewed. Energetic charged particles are used for radiotherapy and are encountered in spaceflight, where they pose a health risk to astronauts. They interact with matter through nuclear and electromagnetic forces. Deposition of energy occurs mostly along the trajectory of the incoming particle, but depending on the type of incident particle and its energy, there is some nonzero probability for energy deposition relatively far from the nominal trajectory, either due to long-ranged knock-on electrons (sometimes called delta rays) or from the products of nuclear fragmentation, including neutrons. In the therapy setting, dose localization is of paramount importance, and the deposition of energy outside nominal treatment volumes complicates planning and increases the risk of secondary cancers as well as noncancer effects in normal tissue. Statistical effects are also important and will be discussed. In contrast to radiation therapy patients, astronauts in space receive comparatively small whole-body radiation doses from energetic charged particles and associated secondary radiation. A unique aspect of space radiation exposures is the high-energy heavy-ion component of the dose. This is not present in terrestrial exposures except in carbon-ion radiotherapy. Designers of space missions must limit exposures to keep risk within acceptable limits. These limits are, at present, defined for low-Earth orbit, but not for deep-space missions outside the geomagnetosphere. Most of the uncertainty in risk assessment for such missions comes from the lack of understanding of the biological effectiveness of the heavy-ion component, with a smaller component due to uncertainties in transport physics and dosimetry. These same uncertainties are also critical in the therapy setting.

  7. Memory matters: influence from a cognitive map on animal space use.

    PubMed

    Gautestad, Arild O

    2011-10-21

    A vertebrate individual's cognitive map provides a capacity for site fidelity and long-distance returns to favorable patches. Fractal-geometrical analysis of individual space use based on collection of telemetry fixes makes it possible to verify the influence of a cognitive map on the spatial scatter of habitat use and also to what extent space use has been of a scale-specific versus a scale-free kind. This approach rests on a statistical mechanical level of system abstraction, where micro-scale details of behavioral interactions are coarse-grained to macro-scale observables like the fractal dimension of space use. In this manner, the magnitude of the fractal dimension becomes a proxy variable for distinguishing between main classes of habitat exploration and site fidelity, like memory-less (Markovian) Brownian motion and Levy walk and memory-enhanced space use like Multi-scaled Random Walk (MRW). In this paper previous analyses are extended by exploring MRW simulations under three scenarios: (1) central place foraging, (2) behavioral adaptation to resource depletion (avoidance of latest visited locations) and (3) transition from MRW towards Levy walk by narrowing memory capacity to a trailing time window. A generalized statistical-mechanical theory with the power to model cognitive map influence on individual space use will be important for statistical analyses of animal habitat preferences and the mechanics behind site fidelity and home ranges. Copyright © 2011 Elsevier Ltd. All rights reserved.
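
    The fractal dimension used here as a proxy for space-use class is typically estimated by box counting over the collection of telemetry fixes. A minimal sketch, assuming the fixes have been rescaled to the unit square (the scales and point sets below are illustrative, not the paper's simulations):

```python
import numpy as np

def box_counting_dimension(points, scales=(1/4, 1/8, 1/16, 1/32)):
    """Estimate the fractal dimension of a 2D point set in the unit square
    by box counting: the slope of log N(s) versus log(1/s), where N(s) is
    the number of occupied boxes of side s."""
    pts = np.clip(points, 0.0, 1.0 - 1e-12)   # keep boundary points in-grid
    counts = [len(np.unique(np.floor(pts / s), axis=0)) for s in scales]
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(scales)), np.log(counts), 1)
    return slope

# A dense track along a line should give D close to 1; scatter that fills
# the plane (e.g. memory-less Brownian-like use) tends toward D = 2.
t = np.linspace(0.0, 1.0, 5000)
line_track = np.column_stack([t, t])
print(round(box_counting_dimension(line_track), 2))   # 1.0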

  8. NASA Enterprise Visual Analysis

    NASA Technical Reports Server (NTRS)

    Lopez-Tellado, Maria; DiSanto, Brenda; Humeniuk, Robert; Bard, Richard, Jr.; Little, Mia; Edwards, Robert; Ma, Tien-Chi; Hollifield, Kenneith; White, Chuck

    2007-01-01

    NASA Enterprise Visual Analysis (NEVA) is a computer program undergoing development as a successor to Launch Services Analysis Tool (LSAT), formerly known as Payload Carrier Analysis Tool (PCAT). NEVA facilitates analyses of proposed configurations of payloads and packing fixtures (e.g. pallets) in a space shuttle payload bay for transport to the International Space Station. NEVA reduces the need to use physical models, mockups, and full-scale ground support equipment in performing such analyses. Using NEVA, one can take account of such diverse considerations as those of weight distribution, geometry, collision avoidance, power requirements, thermal loads, and mechanical loads.

  9. Integrating Statistical Mechanics with Experimental Data from the Rotational-Vibrational Spectrum of HCl into the Physical Chemistry Laboratory

    ERIC Educational Resources Information Center

    Findley, Bret R.; Mylon, Steven E.

    2008-01-01

    We introduce a computer exercise that bridges spectroscopy and thermodynamics using statistical mechanics and the experimental data taken from the commonly used laboratory exercise involving the rotational-vibrational spectrum of HCl. Based on the results from the analysis of their HCl spectrum, students calculate bulk thermodynamic properties…

  10. Toward a New Conceptual Framework for Teaching about Flood Risk in Introductory Geoscience Courses

    ERIC Educational Resources Information Center

    Lutz, Tim

    2011-01-01

    An analysis of physical geology textbooks used in introductory courses shows that there is a systematic lack of clarity regarding flood risk. Some problems originate from confusion relating to statistical terms such as "100-year flood" and "100-year floodplain." However, the main problem is conceptual: statistics such as return…

  11. Exploration of the Maximum Entropy/Optimal Projection Approach to Control Design Synthesis for Large Space Structures.

    DTIC Science & Technology

    1985-02-01

    Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems, its present stage of development embodies a... Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical

  12. Evaluation of a dimension-reduction-based statistical technique for Temperature, Water Vapour and Ozone retrievals from IASI radiances

    NASA Astrophysics Data System (ADS)

    Amato, Umberto; Antoniadis, Anestis; De Feis, Italia; Masiello, Guido; Matricardi, Marco; Serio, Carmine

    2009-03-01

    Remote sensing of the atmosphere is changing rapidly thanks to the development of high spectral resolution infrared space-borne sensors. The aim is to provide more and more accurate information on the lower atmosphere, as requested by the World Meteorological Organization (WMO), to improve the reliability and time span of weather forecasts and Earth monitoring. In this paper we show the results we have obtained on a set of Infrared Atmospheric Sounding Interferometer (IASI) observations using a new statistical strategy based on dimension reduction. Retrievals have been compared to time-space colocated ECMWF analyses for temperature, water vapor and ozone.
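
    A minimal sketch of the dimension-reduction idea, projecting radiances onto leading principal components and regressing atmospheric profiles in the reduced space (i.e. principal component regression). The forward model, channel count, and profile size below are stand-ins, not the IASI/ECMWF setup:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy forward problem: 300 training "radiance spectra" (120 channels)
# generated from 10-element "temperature profiles" through a random
# linear forward operator plus noise.
forward = rng.normal(size=(120, 10))
profiles = rng.normal(size=(300, 10))
radiances = profiles @ forward.T + 0.01 * rng.normal(size=(300, 120))

# Dimension reduction: project the radiances onto their leading principal
# components, then fit a linear regression in the reduced space.
rmean, pmean = radiances.mean(axis=0), profiles.mean(axis=0)
_, _, vt = np.linalg.svd(radiances - rmean, full_matrices=False)
pcs = vt[:10]                                     # leading components
z = (radiances - rmean) @ pcs.T
coef, *_ = np.linalg.lstsq(z, profiles - pmean, rcond=None)

# Retrieval for a new observation.
true_profile = rng.normal(size=10)
obs = true_profile @ forward.T + 0.01 * rng.normal(size=120)
retrieved = ((obs - rmean) @ pcs.T) @ coef + pmean
print(np.abs(retrieved - true_profile).max())     # small retrieval error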

  13. Analysis of Particulate and Fiber Debris Samples Returned from the International Space Station

    NASA Technical Reports Server (NTRS)

    Perry, Jay L.; Coston, James E.

    2014-01-01

    During the period of International Space Station (ISS) Increments 30 and 31, crewmember reports cited differences in the cabin environment relating to particulate matter and fiber debris compared to earlier experience as well as allergic responses to the cabin environment. It was hypothesized that a change in the cabin atmosphere's suspended particulate matter load may be responsible for the reported situation. Samples were collected and returned to ground-based laboratories for assessment. Assessments included physical classification, optical microscopy and photographic analysis, and scanning electron microscopy (SEM) evaluation using energy dispersive X-ray spectrometry (EDS) methods. Particular points of interest for assessing the samples were for the presence of allergens, carbon dioxide removal assembly (CDRA) zeolite dust, and FGB panel fibers. The results from the physical classification, optical microscopy and photographic analysis, and SEM EDS analysis are presented and discussed.

  14. Rayleigh scattering in an emitter-nanofiber-coupling system

    NASA Astrophysics Data System (ADS)

    Tang, Shui-Jing; Gao, Fei; Xu, Da; Li, Yan; Gong, Qihuang; Xiao, Yun-Feng

    2017-04-01

    Scattering is a general process in both fundamental and applied physics. In this paper, we investigate Rayleigh scattering of a solid-state emitter coupled to a nanofiber, using an S-matrix-like theory in a k-space description. Under this model, both Rayleigh scattering and the dipole interaction are studied between a two-level artificial atom embedded in a nanocrystal and the fiber modes (guided and radiation modes). It is found that Rayleigh scattering plays a critical role in the transport properties and quantum statistics of photons. On the one hand, Rayleigh scattering produces transparency in the optical transmitted field of the nanofiber, accompanied by changes in the atomic phase, population, and frequency shift. On the other hand, the interference between the two kinds of scattering fields, from Rayleigh scattering and from the dipole transition, modifies the photon statistics (second-order autocorrelation function) of the output fields, showing a strong wavelength dependence. This study provides guidance for the solid-state emitter acting as a single-photon source and can be extended to explore scattering effects in many-body physics.

  15. A Case for More Multiple Scattering Lidar from Space: Analysis of Four LITE Pulses Returned from a Marine Stratocumulus Deck

    NASA Technical Reports Server (NTRS)

    Davis, Anthony B.; Winker, David M.

    2011-01-01

    Outline: (1) Signal Physics for Multiple-Scattering Cloud Lidar, (2) SNR Estimation (3) Cloud Property Retrievals (3a) several techniques (3b) application to Lidar-In-space Technology Experiment (LITE) data (3c) relation to O2 A-band

  16. Career Aspirations and Career Outcomes for Solar and Space Physics Ph.D

    NASA Astrophysics Data System (ADS)

    Moldwin, M.; Morrow, C. A.

    2013-12-01

    Results from a recent graduate student survey found, unsurprisingly, that Solar and Space Physics (S&SP) Ph.D. graduate students almost all aspire to research careers in Solar and Space Physics. This study reports on the research career outcomes over the last decade for S&SP Ph.D.s. We used publication of peer-reviewed articles as the indicator for persistence in a research career. We found that nearly two-thirds (64%) of Ph.D.s who graduated between 2001 and 2009 published refereed papers four or more years after their Ph.D., while 17% of Ph.D.s never published another paper beyond the year they received their Ph.D. The remaining 19% of Ph.D.s stopped publishing within three years of receiving their Ph.D. We found that though there is statistically no difference in the persistence of publishing research between graduates of the largest programs and all other programs, there are significant differences between programs. We also found that there were no gender differences in any of the persistence data (i.e., men and women stop or continue publishing at the same rates). Graduate programs, faculty advisors, and potential graduate students can use these data for career planning. This study suggests that a significant majority of S&SP Ph.D.s (77%) find postdoctoral research positions and a majority (56%) find research careers beyond their postdoc.

  17. Research Career Persistence for Solar and Space Physics PhD

    NASA Astrophysics Data System (ADS)

    Moldwin, Mark B.; Morrow, Cherilynn

    2016-06-01

    Results from a recent graduate student survey found, unsurprisingly, that Solar and Space Physics (S&SP) PhD graduate students almost all aspire to research careers in Solar and Space Physics. This study reports on research career persistence over the first decade of the new millennium for S&SP PhDs. We used publication of science citation indexed articles as the indicator for persistence in a research career. We found that nearly two thirds (64%) of PhDs who graduated between 2001 and 2009 published refereed papers in 2012 or 2013, while 17% of PhDs never published another paper beyond the year they received their PhD. The remaining 19% of PhDs stopped publishing within three years of receiving their PhD. We found no gender difference in research persistence. We also found that though there is statistically no difference in the persistence of publishing research between graduates of the largest programs and all other programs, there are significant differences between individual programs. This study indicates that a majority of S&SP PhDs find research careers but that a significant fraction pursue careers where publishing in science citation indexed journals is not required. Graduate programs, advisors, and potential graduate students can use these data for career planning and for developing mentoring programs that meet the career outcomes of all of their graduates.

  18. AGR-1 Thermocouple Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Einerson

    2012-05-01

    This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of the physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson, 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.

  19. Lattice QCD Thermodynamics and RHIC-BES Particle Production within Generic Nonextensive Statistics

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel Nasser

    2018-05-01

    The current status of implementing Tsallis (nonextensive) statistics in high-energy physics is briefly reviewed. The remarkably low freezeout temperature, which apparently fails to reproduce the first-principle lattice QCD thermodynamics and the measured particle ratios, etc., is discussed. The present work suggests a novel interpretation for the so-called "Tsallis temperature". It is proposed that the low Tsallis temperature is due to incomplete implementation of Tsallis algebra, through exponential and logarithmic functions, in high-energy particle production. Substituting Tsallis algebra into the grand-canonical partition function of the hadron resonance gas model does not seem to ensure full incorporation of nonextensivity or correlations in that model. The statistics describing the phase-space volume, the number of states, and the possible changes in the elementary cells should rather be modified due to the interacting, correlated subsystems of which the phase space consists. Alternatively, two asymptotic properties, each associated with a scaling function, are utilized to classify a generalized entropy for such a system with a large ensemble (produced particles) and strong correlations. Both scaling exponents define equivalence classes for all interacting and noninteracting systems and unambiguously characterize any statistical system in its thermodynamic limit. We conclude that the nature of lattice QCD simulations is apparently extensive and that accordingly Boltzmann-Gibbs statistics is fully fulfilled. Furthermore, we found that the ratios of various particle yields at extremely high and extremely low RHIC-BES energies are likely nonextensive but not necessarily of Tsallis type.
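    The Tsallis algebra referred to above replaces the ordinary exponential by the q-exponential. As a hedged numeric sketch (not taken from the paper; the function names are illustrative), the q-deformed Boltzmann factor and its q → 1 Boltzmann-Gibbs limit can be written as:

```python
import numpy as np

def exp_q(x, q):
    """Tsallis q-exponential: [1 + (1-q)x]_+^(1/(1-q)); reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

def boltzmann_factor_q(E, T, q):
    """Nonextensive weight for a level of energy E at temperature T (k_B = 1)."""
    return exp_q(-E / T, q)

E = np.linspace(0.0, 5.0, 6)
print(boltzmann_factor_q(E, T=1.0, q=1.1))  # heavier tail than exp(-E)
print(np.allclose(exp_q(-E, 1.0 + 1e-9), np.exp(-E), atol=1e-6))
```

    For q > 1 the factor decays as a power law rather than exponentially, which is the feature usually invoked when fitting high-energy transverse-momentum spectra.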

  20. A Mokken scale analysis of the peer physical examination questionnaire.

    PubMed

    Vaughan, Brett; Grace, Sandra

    2018-01-01

    Peer physical examination (PPE) is a teaching and learning strategy utilised in most health profession education programs. Perceptions of participating in PPE have been described in the literature, focusing on areas of the body students are willing, or unwilling, to examine. A small number of questionnaires exist to evaluate these perceptions; however, none has described the measurement properties that would allow them to be used longitudinally. The present study undertook a Mokken scale analysis of the Peer Physical Examination Questionnaire (PPEQ) to evaluate its dimensionality and structure when used with Australian osteopathy students. Students enrolled in Year 1 of the osteopathy programs at Victoria University (Melbourne, Australia) and Southern Cross University (Lismore, Australia) were invited to complete the PPEQ prior to their first practical skills examination class. R, an open-source statistics program, was used to generate the descriptive statistics and perform a Mokken scale analysis. Mokken scale analysis is a non-parametric item response theory approach that is used to cluster items measuring a latent construct. Initial analysis suggested the PPEQ did not form a single scale. Further analysis identified three subscales: 'comfort', 'concern', and 'professionalism and education'. The properties of each subscale suggested they were unidimensional with variable internal structures. The 'comfort' subscale was the strongest of the three. All subscales demonstrated acceptable reliability estimation statistics (McDonald's omega > 0.75), supporting the calculation of a sum score for each subscale. The subscales identified are consistent with the literature. The 'comfort' subscale may be useful for longitudinally evaluating student perceptions of PPE. Further research is required to evaluate changes in perceptions of PPE and the utility of the questionnaire in other health profession education programs.
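    The study above used R (the `mokken` package is the usual tool). Purely as an illustrative sketch of the core scalability idea, not the study's code, Loevinger's H for dichotomous items can be computed as the ratio of summed inter-item covariances to their maximum attainable values given the item marginals:

```python
import numpy as np

def loevinger_H(X):
    """Loevinger's scale coefficient H for a 0/1 item matrix (rows = respondents).

    H = sum_ij cov(X_i, X_j) / sum_ij covmax(X_i, X_j), where for dichotomous
    items covmax = min(p_i, p_j) - p_i * p_j, with p the item means.
    H = 1 for a perfect Guttman scale; Mokken practice requires H >= 0.3.
    """
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    p = X.mean(axis=0)
    cov_sum, covmax_sum = 0.0, 0.0
    for i in range(k):
        for j in range(i + 1, k):
            cov = (X[:, i] * X[:, j]).mean() - p[i] * p[j]
            covmax = min(p[i], p[j]) - p[i] * p[j]
            cov_sum += cov
            covmax_sum += covmax
    return cov_sum / covmax_sum

# A perfect Guttman response pattern scales maximally
guttman = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(loevinger_H(guttman))  # 1.0
```

    Full Mokken scale analysis adds automated item selection and monotonicity checks on top of this coefficient.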

  1. Space and Place and the "American" Legacy: Female Protagonists and the Discovery of Self in Two Novels for Young Adults

    ERIC Educational Resources Information Center

    Glenn, Wendy J.

    2017-01-01

    This qualitative literary analysis explores the intersection of place, space, and identity in two novels for young adults, examining how the provision of a new physical place provides space for the development of independence among female teen protagonists, and the implications of this development given the authors' identities as non-US authors writing…

  2. Streetscape greenery and health: stress, social cohesion and physical activity as mediators.

    PubMed

    de Vries, Sjerp; van Dillen, Sonja M E; Groenewegen, Peter P; Spreeuwenberg, Peter

    2013-10-01

    Several studies have shown a positive relationship between local greenspace availability and residents' health, which may offer opportunities for health improvement. This study focuses on three mechanisms through which greenery might exert its positive effect on health: stress reduction, stimulating physical activity and facilitating social cohesion. Knowledge on mechanisms helps to identify which type of greenspace is most effective in generating health benefits. In eighty neighbourhoods in four Dutch cities data on quantity and quality of streetscape greenery were collected by observations. Data on self-reported health and proposed mediators were obtained for adults by mail questionnaires (N = 1641). Multilevel regression analyses, controlling for socio-demographic characteristics, revealed that both quantity and quality of streetscape greenery were related to perceived general health, acute health-related complaints, and mental health. Relationships were generally stronger for quality than for quantity. Stress and social cohesion were the strongest mediators. Total physical activity was not a mediator. Physical activity that could be undertaken in the public space (green activity) was, but less so than stress and social cohesion. With all three mediators included in the analysis, complete mediation could statistically be proven in five out of six cases. In these analyses the contribution of green activity was often not significant. The possibility that the effect of green activity is mediated by stress and social cohesion, rather than that it has a direct health effect, is discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Symplectic approach to calculation of magnetic field line trajectories in physical space with realistic magnetic geometry in divertor tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Punjabi, Alkesh; Ali, Halima

    A new approach to integration of magnetic field lines in divertor tokamaks is proposed. In this approach, an analytic equilibrium generating function (EGF) is constructed in natural canonical coordinates ({psi},{theta}) from experimental data from a Grad-Shafranov equilibrium solver for a tokamak. {psi} is the toroidal magnetic flux and {theta} is the poloidal angle. Natural canonical coordinates ({psi},{theta},{phi}) can be transformed to physical position (R,Z,{phi}) using a canonical transformation. (R,Z,{phi}) are cylindrical coordinates. Another canonical transformation is used to construct a symplectic map for integration of magnetic field lines. Trajectories of field lines calculated from this symplectic map in natural canonical coordinates can be transformed to trajectories in real physical space. Unlike in magnetic coordinates [O. Kerwin, A. Punjabi, and H. Ali, Phys. Plasmas 15, 072504 (2008)], the symplectic map in natural canonical coordinates can integrate trajectories across the separatrix surface, and at the same time, give trajectories in physical space. Unlike symplectic maps in physical coordinates (x,y) or (R,Z), the continuous analog of a symplectic map in natural canonical coordinates does not distort trajectories in toroidal planes intervening the discrete map. This approach is applied to the DIII-D tokamak [J. L. Luxon and L. E. Davis, Fusion Technol. 8, 441 (1985)]. The EGF for the DIII-D gives quite an accurate representation of equilibrium magnetic surfaces close to the separatrix surface. This new approach is applied to demonstrate the sensitivity of stochastic broadening using a set of perturbations that generically approximate the size of the field errors and statistical topological noise expected in a poloidally diverted tokamak. Plans for future application of this approach are discussed.

  4. Symplectic approach to calculation of magnetic field line trajectories in physical space with realistic magnetic geometry in divertor tokamaks

    NASA Astrophysics Data System (ADS)

    Punjabi, Alkesh; Ali, Halima

    2008-12-01

    A new approach to integration of magnetic field lines in divertor tokamaks is proposed. In this approach, an analytic equilibrium generating function (EGF) is constructed in natural canonical coordinates (ψ,θ) from experimental data from a Grad-Shafranov equilibrium solver for a tokamak. ψ is the toroidal magnetic flux and θ is the poloidal angle. Natural canonical coordinates (ψ,θ,φ) can be transformed to physical position (R,Z,φ) using a canonical transformation. (R,Z,φ) are cylindrical coordinates. Another canonical transformation is used to construct a symplectic map for integration of magnetic field lines. Trajectories of field lines calculated from this symplectic map in natural canonical coordinates can be transformed to trajectories in real physical space. Unlike in magnetic coordinates [O. Kerwin, A. Punjabi, and H. Ali, Phys. Plasmas 15, 072504 (2008)], the symplectic map in natural canonical coordinates can integrate trajectories across the separatrix surface, and at the same time, give trajectories in physical space. Unlike symplectic maps in physical coordinates (x,y) or (R,Z), the continuous analog of a symplectic map in natural canonical coordinates does not distort trajectories in toroidal planes intervening the discrete map. This approach is applied to the DIII-D tokamak [J. L. Luxon and L. E. Davis, Fusion Technol. 8, 441 (1985)]. The EGF for the DIII-D gives quite an accurate representation of equilibrium magnetic surfaces close to the separatrix surface. This new approach is applied to demonstrate the sensitivity of stochastic broadening using a set of perturbations that generically approximate the size of the field errors and statistical topological noise expected in a poloidally diverted tokamak. Plans for future application of this approach are discussed.

  5. Correspondence between nonrelativistic anti-de Sitter space and conformal field theory, and aging-gravity duality.

    PubMed

    Minic, Djordje; Pleimling, Michel

    2008-12-01

    We point out that the recent discussion of nonrelativistic anti-de Sitter space and conformal field theory correspondence has a direct application in nonequilibrium statistical physics, a fact which has not been emphasized in the recent literature on the subject. In particular, we propose a duality between aging in systems far from equilibrium characterized by the dynamical exponent z=2 and gravity in a specific background. The key ingredient in our proposal is the recent geometric realization of the Schrödinger group. We also discuss the relevance of the proposed correspondence for the more general aging phenomena in systems where the value of the dynamical exponent is different from 2.

  6. Mean-field approximation for spacing distribution functions in classical systems

    NASA Astrophysics Data System (ADS)

    González, Diego Luis; Pimpinelli, Alberto; Einstein, T. L.

    2012-01-01

    We propose a mean-field method to calculate approximately the spacing distribution functions p(n)(s) in one-dimensional classical many-particle systems. We compare our method with two other commonly used methods, the independent interval approximation and the extended Wigner surmise. In our mean-field approach, p(n)(s) is calculated from a set of Langevin equations, which are decoupled by using a mean-field approximation. We find that in spite of its simplicity, the mean-field approximation provides good results in several systems. We offer many examples illustrating that the three previously mentioned methods give a reasonable description of the statistical behavior of the system. The physical interpretation of each method is also discussed.
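    As context for the spacing distributions p(n)(s) discussed above, the noninteracting benchmark is easy to reproduce numerically. The following sketch (illustrative, not the paper's mean-field calculation) estimates the nearest-neighbor spacing distribution of a 1D Poisson "gas" and compares it with the exact result p(s) = e^{-s}:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1D gas of non-interacting (Poisson) particles on a line
points = np.sort(rng.uniform(0.0, 1.0, 200_000))
s = np.diff(points)
s /= s.mean()               # normalize to unit mean spacing

# Empirical nearest-neighbor spacing distribution p(s)
hist, edges = np.histogram(s, bins=50, range=(0, 5), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# For uncorrelated particles the exact result is p(s) = exp(-s);
# an interacting (repulsive) system would instead approach a Wigner-like form.
max_dev = np.abs(hist - np.exp(-centers)).max()
print(max_dev)  # small
```

    Replacing the Poisson points by, e.g., eigenvalues of random matrices or positions from a Langevin simulation is how one would probe the interacting cases the paper treats.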

  7. Gain degradation and amplitude scintillation due to tropospheric turbulence

    NASA Technical Reports Server (NTRS)

    Theobold, D. M.; Hodge, D. B.

    1978-01-01

    It is shown that a simple physical model is adequate for the prediction of the long term statistics of both the reduced signal levels and increased peak-to-peak fluctuations. The model is based on conventional atmospheric turbulence theory and incorporates both amplitude and angle of arrival fluctuations. This model predicts the average variance of signals observed under clear air conditions at low elevation angles on earth-space paths at 2, 7.3, 20 and 30 GHz. Design curves based on this model for gain degradation, realizable gain, amplitude fluctuation as a function of antenna aperture size, frequency, and either terrestrial path length or earth-space path elevation angle are presented.

  8. Correction of defective pixels for medical and space imagers based on Ising Theory

    NASA Astrophysics Data System (ADS)

    Cohen, Eliahu; Shnitser, Moriel; Avraham, Tsvika; Hadar, Ofer

    2014-09-01

    We propose novel models for image restoration based on statistical physics. We investigate the affinity between these fields and describe a framework from which interesting denoising algorithms can be derived: Ising-like models and simulated annealing techniques. When combined with known predictors such as Median and LOCO-I, these models become even more effective. In order to further examine the proposed models we apply them to two important problems: (i) digital cameras in space damaged by cosmic radiation; (ii) ultrasonic medical devices degraded by speckle noise. The results, as well as benchmarks and comparisons, suggest in most cases a significant gain in PSNR and SSIM in comparison to other filters.
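    A minimal sketch of the Ising-prior idea follows. It uses iterated conditional modes (ICM), a greedy stand-in for the simulated annealing the abstract mentions; the image, noise level, and coupling parameters are all illustrative, not the authors':

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground truth: binary (+1/-1) image with a square "object"
truth = -np.ones((32, 32), dtype=int)
truth[8:24, 8:24] = 1

# Observation: flip 10% of pixels (salt-and-pepper style corruption)
noisy = truth * np.where(rng.random(truth.shape) < 0.10, -1, 1)

def icm_denoise(y, beta=1.5, h=1.0, sweeps=5):
    """Iterated conditional modes under an Ising prior.

    Energy: E(x) = -beta * sum_<ij> x_i x_j - h * sum_i x_i y_i.
    Each pixel is set to the sign that locally minimizes E (a greedy
    version of the annealing approach described in the abstract).
    """
    x = y.copy()
    for _ in range(sweeps):
        for i in range(x.shape[0]):
            for j in range(x.shape[1]):
                nb = 0
                if i > 0: nb += x[i - 1, j]
                if i < x.shape[0] - 1: nb += x[i + 1, j]
                if j > 0: nb += x[i, j - 1]
                if j < x.shape[1] - 1: nb += x[i, j + 1]
                # local field: smoothness prior (neighbors) + data term
                x[i, j] = 1 if beta * nb + h * y[i, j] > 0 else -1
    return x

restored = icm_denoise(noisy)
print((noisy != truth).mean(), (restored != truth).mean())  # error rate drops
```

    Replacing the greedy update by Metropolis acceptance with a decreasing temperature turns this into the simulated-annealing variant.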

  9. Linear canonical transformations of coherent and squeezed states in the Wigner phase space. III - Two-mode states

    NASA Technical Reports Server (NTRS)

    Han, D.; Kim, Y. S.; Noz, Marilyn E.

    1990-01-01

    It is shown that the basic symmetry of two-mode squeezed states is governed by the group SP(4) in the Wigner phase space which is locally isomorphic to the (3 + 2)-dimensional Lorentz group. This symmetry, in the Schroedinger picture, appears as Dirac's two-oscillator representation of O(3,2). It is shown that the SU(2) and SU(1,1) interferometers exhibit the symmetry of this higher-dimensional Lorentz group. The mathematics of two-mode squeezed states is shown to be applicable to other branches of physics including thermally excited states in statistical mechanics and relativistic extended hadrons in the quark model.

  10. Rotation of EOFs by the Independent Component Analysis: Towards A Solution of the Mixing Problem in the Decomposition of Geophysical Time Series

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)

    2001-01-01

    The Independent Component Analysis is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has been used recently for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e., an exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. This rotation uses no localization criterion like other rotation techniques (RT); only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution appears able to overcome the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
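    The "ICA = rotation of the PCA solution" point can be demonstrated in a few lines. The sketch below (illustrative numpy, not the authors' implementation) whitens a two-source mixture with PCA, then brute-force searches for the residual rotation that maximizes non-Gaussianity (total |excess kurtosis|), a simplified stand-in for FastICA:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20_000

# Two independent, non-Gaussian (uniform) sources, linearly mixed
S = rng.uniform(-1, 1, (2, n))
A = np.array([[1.0, 0.6], [0.4, 1.0]])          # hypothetical mixing matrix
X = A @ S

# Step 1: PCA whitening (second-order statistics only -> decorrelated,
# but the sources remain mixed by an unknown rotation)
X = X - X.mean(axis=1, keepdims=True)
U, sv, _ = np.linalg.svd(X @ X.T / n)
Z = np.diag(1.0 / np.sqrt(sv)) @ U.T @ X

# Step 2: the "ICA rotation" -- pick the angle maximizing non-Gaussianity
def kurt(v):
    return np.mean(v ** 4) / np.mean(v ** 2) ** 2 - 3.0

best = max(np.linspace(0, np.pi / 2, 360),
           key=lambda t: sum(abs(kurt(r)) for r in
                             (np.array([[np.cos(t), -np.sin(t)],
                                        [np.sin(t), np.cos(t)]]) @ Z)))
R = np.array([[np.cos(best), -np.sin(best)], [np.sin(best), np.cos(best)]])
S_hat = R @ Z

# Each recovered component should match one true source (up to sign/scale)
corr = np.abs(np.corrcoef(np.vstack([S, S_hat]))[:2, 2:])
print(corr.max(axis=1))  # both near 1
```

    PCA alone (Z) leaves the sources mixed even though they are perfectly decorrelated; only the independence criterion fixes the rotation, which is the mixing problem the paper addresses.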

  11. Millimeter wave attenuation prediction using a piecewise uniform rain rate model

    NASA Technical Reports Server (NTRS)

    Persinger, R. R.; Stutzman, W. L.; Bostian, C. W.; Castle, R. E., Jr.

    1980-01-01

    A piecewise uniform rain rate distribution model is introduced as a quasi-physical model of real rain along earth-space millimeter wave propagation paths. It permits calculation of the total attenuation from the specific attenuation in a simple fashion. The model predictions are verified by comparison with direct attenuation measurements for several frequencies, elevation angles, and locations. Also, coupled with the Rice-Holmberg rain rate model, attenuation statistics are predicted from rainfall accumulation data.

  12. Strategy for design NIR calibration sets based on process spectrum and model space: An innovative approach for process analytical technology.

    PubMed

    Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M

    2015-10-10

    The pharmaceutical industry is under stringent regulations on quality control of its products because quality is critical for both the production process and consumer safety. Within the framework of "process analytical technology" (PAT), a complete understanding of the process and stepwise monitoring of manufacturing are required. Near infrared spectroscopy (NIRS) combined with chemometrics has lately proved efficient, useful, and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set to construct models affording accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction, and coating. A novel methodology, the "process spectrum", is proposed for selecting the calibration set, into which physical changes in the samples at each stage are algebraically incorporated. We also established a "model space" defined by Hotelling's T(2) and Q-residual statistics for outlier identification (inside/outside the defined space) in order to select objectively the factors to be used in calibration set construction. The results obtained confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for the implementation of this easy and fast methodology in the pharma industry. Copyright © 2015 Elsevier B.V. All rights reserved.
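    The Hotelling's T(2) / Q-residual "model space" idea is standard chemometrics and can be sketched with plain numpy. Everything below is synthetic and illustrative (dimensions, noise levels, and the injected outlier are assumptions, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Calibration "spectra": 60 samples, 50 wavelengths, 2 latent factors
n, m, k = 60, 50, 2
scores_true = rng.normal(size=(n, k))
loadings_true = rng.normal(size=(k, m))
X = scores_true @ loadings_true + 0.05 * rng.normal(size=(n, m))
X[0] += 1.5 * rng.normal(size=m)           # inject one off-model sample

# PCA model with k components
Xc = X - X.mean(axis=0)
U, sv, Vt = np.linalg.svd(Xc, full_matrices=False)
T = U[:, :k] * sv[:k]                      # scores
P = Vt[:k]                                 # loadings

# Hotelling's T^2: distance *inside* the model space
lam = (sv[:k] ** 2) / (n - 1)              # score variances
T2 = np.sum(T ** 2 / lam, axis=1)

# Q residuals: distance *from* the model space
E = Xc - T @ P
Q = np.sum(E ** 2, axis=1)

print(int(np.argmax(Q)))  # the injected sample has the largest Q
```

    Samples with large T2 are extreme but still explained by the model; samples with large Q lie outside the model space, which is the inside/outside criterion used for the calibration-set selection above.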

  13. The 2006 Cape Canaveral Air Force Station Range Reference Atmosphere Model Validation Study and Sensitivity Analysis to the National Aeronautics and Space Administration's Space Shuttle

    NASA Technical Reports Server (NTRS)

    Burns, Lee; Merry, Carl; Decker, Ryan; Harrington, Brian

    2008-01-01

    The 2006 Cape Canaveral Air Force Station (CCAFS) Range Reference Atmosphere (RRA) is a statistical model summarizing wind and thermodynamic atmospheric variability from the surface to 70 km. Launches of the National Aeronautics and Space Administration's (NASA) Space Shuttle from Kennedy Space Center utilize CCAFS RRA data to evaluate environmental constraints on various aspects of the vehicle during ascent. An update to the CCAFS RRA was recently completed. As part of the update, a validation study of the 2006 version was conducted, as well as a comparison of the 2006 version to the existing 1983 CCAFS RRA database. Assessments of the Space Shuttle vehicle ascent profile characteristics were performed to determine impacts of the updated model on vehicle performance. Details on the model updates and the vehicle sensitivity analyses with the updated model are presented.

  14. Gender differences in the effects of urban neighborhood on depressive symptoms in Jamaica.

    PubMed

    Mullings, Jasneth Asher; McCaw-Binns, Affette Michelle; Archer, Carol; Wilks, Rainford

    2013-12-01

    To explore the mental health effects of the urban neighborhood on men and women in Jamaica and the implications for urban planning and social development. A cross-sectional household sample of 2,848 individuals 15-74 years of age obtained from the Jamaica Health and Lifestyle Survey 2007-2008 was analyzed. Secondary analysis was undertaken by developing composite scores to describe observer-recorded neighborhood features, including infrastructure, amenities/services, physical conditions, community socioeconomic status, and green spaces around the home. Depressive symptoms were assessed using the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV). Bivariate and multivariate methods were used to explore the associations among gender, neighborhood factors, and risk of depressive symptoms. While no associations were found among rural residents, urban neighborhoods were associated with increased risk of depressive symptoms. Among males, residing in a neighborhood with poor infrastructure increased risk; among females, residing in an informal community/unplanned neighborhood increased risk. The urban neighborhood contributes to the risk of depression symptomatology in Jamaica, with different environmental stressors affecting men and women. Urban and social planners need to consider the physical environment when developing health interventions in urban settings, particularly in marginalized communities.

  15. The APIS service : a tool for accessing value-added HST planetary auroral observations over 1997-2015

    NASA Astrophysics Data System (ADS)

    Lamy, L.; Henry, F.; Prangé, R.; Le Sidaner, P.

    2015-10-01

    The Auroral Planetary Imaging and Spectroscopy (APIS) service http://obspm.fr/apis/ provides open and interactive access to processed auroral observations of the outer planets and their satellites. Such observations are of interest for a wide community at the interface between planetology, magnetospheric, and heliospheric physics. APIS consists of (i) a high-level database, built from planetary auroral observations acquired by the Hubble Space Telescope (HST) since 1997 with its most used far-ultraviolet spectro-imagers, (ii) a dedicated search interface aimed at browsing this database efficiently through relevant conditional search criteria (Figure 1), and (iii) the ability to work interactively with the data online through plotting tools developed by the Virtual Observatory (VO) community, such as Aladin and Specview. This service is VO compliant and can therefore also be queried by external search tools of the VO community. The diversity of available data and the capability to sort them by relevant physical criteria should in particular facilitate statistical studies, on long-term scales and/or with multi-instrumental, multispectral combined analysis [1,2]. We will present the updated capabilities of APIS with several examples. Several tutorials are available online.

  16. B*Bπ coupling using relativistic heavy quarks

    DOE PAGES

    Flynn, J. M.; Fritzsch, P.; Kawanai, T.; ...

    2016-01-27

    We report on a calculation of the B*Bπ coupling in lattice QCD. The strong matrix element ⟨Bπ|B*⟩ is directly related to the leading-order low-energy constant in heavy meson chiral perturbation theory (HMχPT) for B mesons. We carry out our calculation directly at the b-quark mass using a non-perturbatively tuned clover action that controls discretization effects of order |p⃗a| and (ma)^n for all n. Our analysis is performed on RBC/UKQCD gauge configurations using domain-wall fermions and the Iwasaki gauge action at two lattice spacings of a^-1 = 1.729(25) GeV and a^-1 = 2.281(28) GeV, and unitary pion masses down to 290 MeV. We achieve good statistical precision and control all systematic uncertainties, giving a final result for the HMχPT coupling g_b = 0.56(3)_stat(7)_sys in the continuum and at the physical light-quark masses. Furthermore, this is the first calculation performed directly at the physical b-quark mass, and it lies in the region one would expect from an interpolation between previous results at the charm mass and at the static point.

  17. Use of Statistical Estimators as Virtual Observatory Search Parameters: Enabling Access to Solar and Planetary Resources through the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Merka, J.; Dolan, C. F.

    2015-12-01

    Finding and retrieving space physics data is often a complicated task even for publicly available data sets: thousands of relatively small and many large data sets are stored in various formats and, in the better case, accompanied by at least some documentation. The Virtual Heliospheric and Magnetospheric Observatories (VHO and VMO) help researchers by creating a single point of uniform discovery, access, and use of heliospheric (VHO) and magnetospheric (VMO) data. The VMO and VHO functionality relies on metadata expressed using the SPASE data model. This data model is developed by the SPASE Working Group, which is currently the only international group supporting global data management for Solar and Space Physics. The two Virtual Observatories (VxOs) have initiated and led the development of a SPASE-related standard named SPASE Query Language (SPASEQL) to provide a standard way of submitting queries and receiving results. The VMO and VHO use SPASE and SPASEQL for searches based on various criteria such as, for example, spatial location, time of observation, measurement type, and parameter values. The parameter values are represented by their statistical estimators calculated typically over 10-minute intervals: mean, median, standard deviation, minimum, and maximum. The use of statistical estimators enables science-driven data queries that simplify and shorten the effort to find where and/or how often a sought phenomenon is observed, as we will present.
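    The 10-minute estimator reduction described above is straightforward to illustrate. This numpy sketch (synthetic data; not the VMO/VHO pipeline) reduces a 1-minute time series to the five named estimators and runs a value-based query against them:

```python
import numpy as np

rng = np.random.default_rng(4)

# One day of 1-minute measurements (e.g., a magnetic field component)
x = 5.0 + np.sin(np.linspace(0, 8 * np.pi, 1440)) + 0.1 * rng.normal(size=1440)

# Reduce to 10-minute windows: 144 intervals x 10 samples each
w = x.reshape(-1, 10)
estimators = {
    "mean":   w.mean(axis=1),
    "median": np.median(w, axis=1),
    "std":    w.std(axis=1),
    "min":    w.min(axis=1),
    "max":    w.max(axis=1),
}

# A value-based query, e.g. "intervals where the maximum exceeded 6"
hits = np.flatnonzero(estimators["max"] > 6.0)
print(len(estimators["mean"]), len(hits))
```

    Searching the small estimator table instead of the raw series is what makes "where and how often does the phenomenon occur" queries cheap.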

  18. Percolation Analysis as a Tool to Describe the Topology of the Large Scale Structure of the Universe

    NASA Astrophysics Data System (ADS)

    Yess, Capp D.

    1997-09-01

    Percolation analysis is the study of the properties of clusters. In cosmology, it is the statistics of the size and number of clusters. This thesis presents a refinement of percolation analysis and its application to astronomical data. An overview of the standard model of the universe and the development of large scale structure is presented in order to place the study in historical and scientific context. Then, using percolation statistics, we for the first time demonstrate the universal character of a network pattern in the real-space mass distributions resulting from nonlinear gravitational instability of initial Gaussian fluctuations. We also find that the maximum of the number-of-clusters statistic in the evolved, nonlinear distributions is determined by the effective slope of the power spectrum. Next, we present percolation analyses of Wiener reconstructions of the IRAS 1.2 Jy Redshift Survey. There are ten reconstructions of galaxy density fields in real space spanning the range β = 0.1 to 1.0, where β = Ω^0.6/b, Ω is the present dimensionless density, and b is the linear bias factor. Our method uses the growth of the largest cluster statistic to characterize the topology of a density field, where Gaussian randomized versions of the reconstructions are used as standards for analysis. For the reconstruction volume of radius R ≈ 100 h^-1 Mpc, percolation analysis reveals a slight 'meatball' topology for the real-space galaxy distribution of the IRAS survey. Finally, we employ a percolation technique developed for pointwise distributions to analyze two-dimensional projections of the three northern and three southern slices in the Las Campanas Redshift Survey, and then give consideration to further study of the methodology, errors, and application of percolation. We track the growth of the largest cluster as a topological indicator to a depth of 400 h^-1 Mpc, and report an unambiguous signal, with high signal-to-noise ratio, indicating a network topology which in two dimensions is indicative of a filamentary distribution. It is hoped that one day percolation analysis can characterize the structure of the universe to a degree that will aid theorists in confidently describing the nature of our world.
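    The "growth of the largest cluster" statistic can be sketched on a 2D field with a BFS labeling (illustrative; not the thesis code, and a random Gaussian field stands in for the survey data):

```python
import numpy as np
from collections import deque

def largest_cluster_fraction(field, threshold):
    """Fraction of over-threshold cells that belong to the largest
    connected (4-neighbor) cluster of cells with field > threshold."""
    mask = field > threshold
    if not mask.any():
        return 0.0
    seen = np.zeros_like(mask)
    best = 0
    for si, sj in zip(*np.nonzero(mask)):
        if seen[si, sj]:
            continue
        size, queue = 0, deque([(si, sj)])
        seen[si, sj] = True
        while queue:
            i, j = queue.popleft()
            size += 1
            for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                if (0 <= ni < mask.shape[0] and 0 <= nj < mask.shape[1]
                        and mask[ni, nj] and not seen[ni, nj]):
                    seen[ni, nj] = True
                    queue.append((ni, nj))
        best = max(best, size)
    return best / mask.sum()

rng = np.random.default_rng(5)
field = rng.normal(size=(64, 64))

# Lowering the density threshold grows and merges clusters; the shape of
# this growth curve is the topology diagnostic used in the abstract.
for t in (1.0, 0.0, -1.0):
    print(t, round(largest_cluster_fraction(field, t), 3))
```

    For a Gaussian field the growth curve has a characteristic shape; 'meatball' or filamentary topologies percolate earlier or later than their Gaussian-randomized counterparts, which is how the comparison above is made.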

  19. A new method for detecting small and dim targets in starry background

    NASA Astrophysics Data System (ADS)

    Yao, Rui; Zhang, Yanning; Jiang, Lei

    2011-08-01

    Detection of small visible optical space targets is one of the key issues in research on long-range early warning and space debris surveillance. The SNR (signal-to-noise ratio) of the target is very low because of the influence of the imaging device itself. Random noise and background movement also increase the difficulty of target detection. In order to detect small visible optical space targets effectively and rapidly, we propose a novel detection method based on statistical theory. Firstly, we obtain a reasonable statistical model of the visible optical space image. Secondly, we extract SIFT (scale-invariant feature transform) features of the image frames and calculate the transform relationship, then use this relationship to compensate for the movement of the whole visual field. Thirdly, the influence of stars is removed using an interframe difference method, and a segmentation threshold separating candidate targets from noise is found using the OTSU method. Finally, we calculate a statistical quantity to judge whether the target is present at every pixel position in the image. Theoretical analysis shows the relationship between false-alarm probability and detection probability at different SNRs. The experimental results show that this method detects targets efficiently, even targets passing through stars.
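    The OTSU thresholding step named above has a compact closed form: pick the cut that maximizes between-class variance. A hedged numpy sketch (synthetic frame-difference residual; not the authors' pipeline):

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Otsu's method: the threshold maximizing between-class variance."""
    hist, edges = np.histogram(img, bins=nbins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                      # class-0 weight for each cut
    m0 = np.cumsum(p * centers)            # unnormalized class-0 mean
    mt = m0[-1]                            # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        # sigma_b^2 = (mt*w0 - m0)^2 / (w0*(1-w0)), the between-class variance
        sigma_b = (mt * w0 - m0) ** 2 / (w0 * (1.0 - w0))
    sigma_b[~np.isfinite(sigma_b)] = 0.0
    return centers[np.argmax(sigma_b)]

rng = np.random.default_rng(6)
# Frame-difference residual: dim noise background plus a bright 8x8 target
frame = rng.normal(10, 2, size=(64, 64))
frame[28:36, 20:28] = rng.normal(40, 2, size=(8, 8))

t = otsu_threshold(frame)
candidates = frame > t
print(round(float(t), 1), int(candidates.sum()))
```

    The surviving candidate pixels would then be passed to the per-pixel statistical decision test described in the abstract.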

  20. Research on natural lighting in reading spaces of university libraries in Jinan under the perspective of energy-efficiency

    NASA Astrophysics Data System (ADS)

    Yang, Zengzhang

    2017-11-01

    The natural lighting design of reading spaces in university libraries not only influences the physical and mental health of readers but also affects the energy consumption of the libraries. Scientific and rational natural lighting design is the key to energy-saving design of the physical environment of the reading space. Based on the characteristics of the light climate of the Jinan region and the actual utilization of reading spaces in university libraries, and combining field measurements, surveys, and data analysis of the reading spaces in Shandong Women's University's library, the paper describes the present situation and existing problems of natural lighting in reading spaces of university libraries across the Jinan region. From an energy-efficiency perspective, the paper puts forward proposals to improve natural lighting in these reading spaces in five areas: adjusting the interior layout, optimizing the design of outer windows, employing reflector panels, adding lighting windows on inner walls, and using adjustable sun-shading facilities.

  1. A Matlab user interface for the statistically assisted fluid registration algorithm and tensor-based morphometry

    NASA Astrophysics Data System (ADS)

    Yepes-Calderon, Fernando; Brun, Caroline; Sant, Nishita; Thompson, Paul; Lepore, Natasha

    2015-01-01

    Tensor-Based Morphometry (TBM) is an increasingly popular method for group analysis of brain MRI data. The main steps in the analysis consist of a nonlinear registration to align each individual scan to a common space, and a subsequent statistical analysis to determine morphometric differences, or differences in fiber structure, between groups. Recently, we implemented the Statistically-Assisted Fluid Registration Algorithm, or SAFIRA [1], which is designed for tracking morphometric differences among populations. To this end, SAFIRA allows the inclusion of statistical priors extracted from the populations being studied as regularizers in the registration. This flexibility and degree of sophistication limit the tool to expert use, even more so considering that SAFIRA was initially implemented in command-line mode. Here, we introduce a new, intuitive, easy-to-use, Matlab-based graphical user interface for SAFIRA's multivariate TBM. The interface also offers different choices for the TBM statistics, including both the traditional univariate statistics on the Jacobian matrix and comparison of the full deformation tensors [2]. This software will be freely disseminated to the neuroimaging research community.
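The univariate TBM statistics mentioned above operate on the Jacobian of the deformation field. A minimal sketch (illustrative, not SAFIRA's implementation): estimate the Jacobian determinant of a 2D deformation by central differences, where det > 1 flags local expansion and det < 1 local contraction.

```python
def jacobian_dets(defx, defy, h=1.0):
    """Jacobian determinant of a 2D deformation on a regular grid.
    defx[r][c], defy[r][c] give the deformed coordinates of grid point
    (r, c); derivatives are taken by central differences with spacing h.
    Boundary entries are left at the identity value 1.0."""
    rows, cols = len(defx), len(defx[0])
    out = [[1.0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            dxx = (defx[r][c + 1] - defx[r][c - 1]) / (2 * h)  # dX/dx
            dxy = (defx[r + 1][c] - defx[r - 1][c]) / (2 * h)  # dX/dy
            dyx = (defy[r][c + 1] - defy[r][c - 1]) / (2 * h)  # dY/dx
            dyy = (defy[r + 1][c] - defy[r - 1][c]) / (2 * h)  # dY/dy
            out[r][c] = dxx * dyy - dxy * dyx
    return out
```

For the identity deformation the interior determinants are exactly 1; a uniform 2x magnification gives 4, i.e., a fourfold local area expansion.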

  2. Transition probabilities for non self-adjoint Hamiltonians in infinite dimensional Hilbert spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bagarello, F., E-mail: fabio.bagarello@unipa.it

    In a recent paper we introduced several possible inequivalent descriptions of the dynamics and of the transition probabilities of a quantum system when its Hamiltonian is not self-adjoint. Our analysis was carried out in finite dimensional Hilbert spaces. This is useful, but quite restrictive, since many physically relevant quantum systems live in infinite dimensional Hilbert spaces. In this paper we consider this situation, and we discuss some applications to well known models introduced in the literature in recent years: the extended harmonic oscillator, the Swanson model, and a generalized version of the Landau levels Hamiltonian. Not surprisingly, we find new interesting features not present in finite dimensional Hilbert spaces, useful for a deeper comprehension of these kinds of physical systems.

  3. Statistical modeling of space shuttle environmental data

    NASA Technical Reports Server (NTRS)

    Tubbs, J. D.; Brewer, D. W.

    1983-01-01

    Statistical models which use a class of bivariate gamma distributions are examined. Topics discussed include: (1) the ratio of positively correlated gamma variates; (2) a method to determine whether unequal shape parameters are necessary in a bivariate gamma distribution; (3) differential equations for the modal location of a family of bivariate gamma distributions; and (4) analysis of some wind gust data using the analytical results developed for modeling applications.

  4. [The application of the prospective space-time statistic in early warning of infectious disease].

    PubMed

    Yin, Fei; Li, Xiao-Song; Feng, Zi-Jian; Ma, Jia-Qi

    2007-06-01

    To investigate the application of the prospective space-time scan statistic for early detection of infectious disease outbreaks. The prospective space-time scan statistic was tested by mimicking daily prospective analyses of bacillary dysentery data from Chengdu city in 2005 (3212 cases in 102 towns and villages), and the results were compared with those of the purely temporal scan statistic. The prospective space-time scan statistic gave specific signals in both space and time. The results for June indicated that it could detect in a timely manner outbreaks that started from a local site, and the early-warning signal was strong (P = 0.007), whereas the purely temporal scan statistic signaled the outbreak two days later and less strongly (P = 0.039). The prospective space-time scan statistic makes full use of the spatial and temporal information in infectious disease data and can detect outbreaks that start from local sites in a timely and effective way; it could be an important tool for local and national CDCs in setting up early-detection surveillance systems.
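For intuition about the scan-statistic machinery, here is an illustrative sketch of the purely temporal variant only (not the software used in the study, and without the space dimension or Monte Carlo p-values): a Kulldorff-style Poisson likelihood-ratio scan over trailing windows, assuming a uniform baseline. The counts below are invented.

```python
import math

def temporal_scan(daily_counts, max_window=7):
    """Return (best log-likelihood ratio, window length) over trailing
    windows ending on the most recent day, under a uniform Poisson
    baseline. LLR = 0 means no window has an excess of cases."""
    total_days = len(daily_counts)
    C = sum(daily_counts)
    best_llr, best_len = 0.0, 0
    for w in range(1, min(max_window, total_days) + 1):
        c = sum(daily_counts[-w:])          # observed cases in window
        e = C * w / total_days              # expected under uniformity
        if c <= e or c == 0:
            continue                        # only excesses are signals
        llr = c * math.log(c / e)
        if C - c > 0:
            llr += (C - c) * math.log((C - c) / (C - e))
        if llr > best_llr:
            best_llr, best_len = llr, w
    return best_llr, best_len
```

With a steady background of 2 cases/day and a 3-day spike of 10 cases/day at the end, the scan picks out exactly the 3-day window; a flat series yields no signal.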

  5. Transport in the Subtropical Lowermost Stratosphere during CRYSTAL-FACE

    NASA Technical Reports Server (NTRS)

    Pittman, Jasna V.; Weinstock, Elliot M.; Oglesby, Robert J.; Sayres, David S.; Smith, Jessica B.; Anderson, James G.; Cooper, Owen R.; Wofsy, Steven C.; Xueref, Irene; Gerbig, Christoph

    2007-01-01

    We use in situ measurements of water vapor (H2O), ozone (O3), carbon dioxide (CO2), carbon monoxide (CO), nitric oxide (NO), and total reactive nitrogen (NO(y)) obtained during the CRYSTAL-FACE campaign in July 2002 to study summertime transport in the subtropical lowermost stratosphere. We use an objective methodology to distinguish the latitudinal origin of the sampled air masses despite the influence of convection, and we calculate backward trajectories to elucidate their recent geographical history. The methodology consists of exploring the statistical behavior of the data by performing multivariate clustering and agglomerative hierarchical clustering calculations, and projecting cluster groups onto principal component space to identify air masses of like composition and hence presumed origin. The statistically derived cluster groups are then examined in physical space using tracer-tracer correlation plots. Interpretation of the principal component analysis suggests that the variability in the data is accounted for primarily by the mean age of air in the stratosphere, followed by the age of the convective influence, and lastly by the extent of convective influence, potentially related to the latitude of convective injection [Dessler and Sherwood, 2004]. We find that high-latitude stratospheric air is the dominant source region during the beginning of the campaign while tropical air is the dominant source region during the rest of the campaign. Influence of convection from both local and non-local events is frequently observed. The identification of air mass origin is confirmed with backward trajectories, and the behavior of the trajectories is associated with the North American monsoon circulation.

  6. Transport in the Subtropical Lowermost Stratosphere during the Cirrus Regional Study of Tropical Anvils and Cirrus Layers-Florida Area Cirrus Experiment

    NASA Technical Reports Server (NTRS)

    Pittman, Jasna V.; Weinstock, Elliot M.; Oglesby, Robert J.; Sayres, David S.; Smith, Jessica B.; Anderson, James G.; Cooper, Owen R.; Wofsy, Steven C.; Xueref, Irene; Gerbig, Christoph

    2007-01-01

    We use in situ measurements of water vapor (H2O), ozone (O3), carbon dioxide (CO2), carbon monoxide (CO), nitric oxide (NO), and total reactive nitrogen (NOy) obtained during the CRYSTAL-FACE campaign in July 2002 to study summertime transport in the subtropical lowermost stratosphere. We use an objective methodology to distinguish the latitudinal origin of the sampled air masses despite the influence of convection, and we calculate backward trajectories to elucidate their recent geographical history. The methodology consists of exploring the statistical behavior of the data by performing multivariate clustering and agglomerative hierarchical clustering calculations and projecting cluster groups onto principal component space to identify air masses of like composition and hence presumed origin. The statistically derived cluster groups are then examined in physical space using tracer-tracer correlation plots. Interpretation of the principal component analysis suggests that the variability in the data is accounted for primarily by the mean age of air in the stratosphere, followed by the age of the convective influence, and last by the extent of convective influence, potentially related to the latitude of convective injection (Dessler and Sherwood, 2004). We find that high-latitude stratospheric air is the dominant source region during the beginning of the campaign while tropical air is the dominant source region during the rest of the campaign. Influence of convection from both local and nonlocal events is frequently observed. The identification of air mass origin is confirmed with backward trajectories, and the behavior of the trajectories is associated with the North American monsoon circulation.
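The projection onto principal component space used in the two records above can be sketched minimally for the two-variable case (an illustration, not the authors' analysis): for a 2x2 sample covariance matrix the leading eigenvector, i.e., the direction of maximum variance, has a closed form, and projecting each observation onto it gives its first principal-component score.

```python
import math

def principal_axis(points):
    """Unit vector along the leading eigenvector of the 2x2 sample
    covariance matrix of (x, y) points: the 1st principal component."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # closed-form eigenvector angle for the symmetric 2x2 matrix
    # [[sxx, sxy], [sxy, syy]]
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return math.cos(theta), math.sin(theta)

def project(points, axis):
    """PC scores: dot product of each point with the unit axis."""
    return [p[0] * axis[0] + p[1] * axis[1] for p in points]
```

For perfectly correlated tracers the axis lies along the diagonal, and the scores order the observations along that shared direction of variability.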

  7. The 2006 Kennedy Space Center Range Reference Atmosphere Model Validation Study and Sensitivity Analysis to the Performance of the National Aeronautics and Space Administration's Space Shuttle Vehicle

    NASA Technical Reports Server (NTRS)

    Burns, Lee; Decker, Ryan; Harrington, Brian; Merry, Carl

    2008-01-01

    The Kennedy Space Center (KSC) Range Reference Atmosphere (RRA) is a statistical model that summarizes wind and thermodynamic atmospheric variability from the surface to 70 km. The National Aeronautics and Space Administration's (NASA) Space Shuttle program, which launches from KSC, utilizes the KSC RRA data to evaluate environmental constraints on various aspects of the vehicle during ascent. An update to the KSC RRA was recently completed. As part of the update, the Natural Environments Branch at NASA's Marshall Space Flight Center (MSFC) conducted a validation study and a comparison analysis against the existing 1983 version of the KSC RRA database. Assessments of the Space Shuttle vehicle's ascent profile characteristics were performed by the JSC Ascent Flight Design Division to determine the impacts of the updated model on vehicle performance. Details on the model updates and the vehicle sensitivity analyses with the updated model are presented.

  8. Space flight risk data collection and analysis project: Risk and reliability database

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The focus of the NASA 'Space Flight Risk Data Collection and Analysis' project was to acquire and evaluate space flight data with the express purpose of establishing a database containing measurements of specific risk assessment, reliability, availability, maintainability, and supportability (RRAMS) parameters. The developed comprehensive RRAMS database will support the performance of future NASA and aerospace industry risk and reliability studies. One of the primary goals has been to acquire unprocessed information relating to the reliability and availability of launch vehicles and their subsystems and components from the 45th Space Wing (formerly Eastern Space and Missile Command, ESMC) at Patrick Air Force Base. After being evaluated and analyzed, this information was encoded in terms of parameters pertinent to ascertaining reliability and availability statistics, and then assembled into an appropriate database structure.

  9. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, cluster evaluation in finance, and, more recently, in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database based on their content; the problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis provides an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks such as DDoS attacks and network scanning.
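The correlation-coefficient-matrix approach mentioned for network anomaly detection reduces, at its core, to computing a Pearson correlation matrix over traffic features and watching for deviations from a baseline matrix. A minimal sketch (the feature series are invented):

```python
import math

def correlation_matrix(series):
    """Pearson correlation coefficient matrix for a list of equal-length
    numeric series (rows = traffic features, columns = time samples)."""
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = math.sqrt(sum((x - ma) ** 2 for x in a))
        vb = math.sqrt(sum((y - mb) ** 2 for y in b))
        return cov / (va * vb)
    return [[corr(a, b) for b in series] for a in series]

def max_deviation(m_now, m_baseline):
    """Largest element-wise change between two correlation matrices;
    a large value flags a shift in inter-feature structure (anomaly)."""
    return max(abs(x - y)
               for row_x, row_y in zip(m_now, m_baseline)
               for x, y in zip(row_x, row_y))
```

Perfectly co-varying features score +1, anti-varying features -1; an attack that breaks the usual coupling between, say, packet counts and byte counts shows up as a large `max_deviation` against the baseline.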

  10. Analysis models for the estimation of oceanic fields

    NASA Technical Reports Server (NTRS)

    Carter, E. F.; Robinson, A. R.

    1987-01-01

    A general model for statistically optimal estimates is presented for dealing with scalar, vector, and multivariate datasets. The method deals with anisotropic fields and treats space and time dependence equivalently. Problems addressed include analysis, i.e., the production of synoptic time series of regularly gridded fields from irregular and gappy datasets, and the estimation of fields by compositing observations from several different instruments and sampling schemes. Technical issues are discussed, including the convergence of statistical estimates, the choice of representation of the correlations, the influential domain of an observation, and the efficiency of numerical computations.
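Statistically optimal estimation of gridded fields from irregular observations is classically done by Gauss-Markov optimal interpolation: the estimate at a grid point is a covariance-weighted combination of the observations, x̂(g) = C_go (C_oo + R)^-1 y. A minimal 1D sketch under an assumed Gaussian correlation model (the correlation function, length scale, and noise level here are illustrative choices, not the paper's):

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        piv = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[piv] = M[piv], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def oi_estimate(grid_pts, obs_pts, obs_vals, length=1.0, noise=0.1):
    """Gauss-Markov (optimal interpolation) estimate on grid points from
    irregular 1D observations, with a Gaussian correlation model and
    uncorrelated observation noise on the diagonal."""
    cov = lambda a, b: math.exp(-((a - b) / length) ** 2)
    n = len(obs_pts)
    Coo = [[cov(obs_pts[i], obs_pts[j]) + (noise if i == j else 0.0)
            for j in range(n)] for i in range(n)]
    w = solve(Coo, obs_vals)   # weights: (Coo + R)^-1 y
    return [sum(cov(g, obs_pts[j]) * w[j] for j in range(n)) for g in grid_pts]
```

Two antisymmetric observations produce an antisymmetric analysis, slightly shrunk toward zero at the observation points by the noise term.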

  11. Quality over Quantity: Contribution of Urban Green Space to Neighborhood Satisfaction

    PubMed Central

    Zhang, Yang; Van den Berg, Agnes E.; Van Dijk, Terry; Weitkamp, Gerd

    2017-01-01

    There is increasing evidence that the quality of green space significantly contributes to neighborhood satisfaction and well-being, independent of the mere amount of green space. In this paper, we examined residents’ perceptions of the quality and beneficial affordances of green space in relation to objectively assessed accessibility and usability. We used data from a survey in two neighborhoods (N = 223) of a medium-sized city in the Netherlands, which were similar in the amount of green space and other physical and socio-demographic characteristics, but differed in the availability of accessible and usable green spaces. Results show that residents of the neighborhood with a higher availability of accessible and usable green spaces were more satisfied with their neighborhood. This difference was statistically mediated by the higher level of perceived green space quality. Neighborhood satisfaction was significantly positively related to well-being. However, residents of the two neighborhoods did not differ in self-reported well-being and beneficial affordances of green space. These analyses contribute to a further understanding of how the accessibility and usability of green spaces may increase people’s neighborhood satisfaction. It highlights the importance of perceived quality in addition to the amount of green space when examining the beneficial effects of green space. PMID:28509879

  12. Quality over Quantity: Contribution of Urban Green Space to Neighborhood Satisfaction.

    PubMed

    Zhang, Yang; Van den Berg, Agnes E; Van Dijk, Terry; Weitkamp, Gerd

    2017-05-16

    There is increasing evidence that the quality of green space significantly contributes to neighborhood satisfaction and well-being, independent of the mere amount of green space. In this paper, we examined residents' perceptions of the quality and beneficial affordances of green space in relation to objectively assessed accessibility and usability. We used data from a survey in two neighborhoods ( N = 223) of a medium-sized city in the Netherlands, which were similar in the amount of green space and other physical and socio-demographic characteristics, but differed in the availability of accessible and usable green spaces. Results show that residents of the neighborhood with a higher availability of accessible and usable green spaces were more satisfied with their neighborhood. This difference was statistically mediated by the higher level of perceived green space quality. Neighborhood satisfaction was significantly positively related to well-being. However, residents of the two neighborhoods did not differ in self-reported well-being and beneficial affordances of green space. These analyses contribute to a further understanding of how the accessibility and usability of green spaces may increase people's neighborhood satisfaction. It highlights the importance of perceived quality in addition to the amount of green space when examining the beneficial effects of green space.

  13. Feynman-Kac equations for reaction and diffusion processes

    NASA Astrophysics Data System (ADS)

    Hou, Ru; Deng, Weihua

    2018-04-01

    This paper provides a theoretical framework for deriving the forward and backward Feynman-Kac equations for the distribution of functionals of the path of a particle undergoing both diffusion and reaction processes. Once given the diffusion type and reaction rate, a specific forward or backward Feynman-Kac equation can be obtained. The results in this paper include those for normal/anomalous diffusions and reactions with linear/nonlinear rates. Using the derived equations, we apply our findings to compute some physical (experimentally measurable) statistics, including the occupation time in half-space, the first passage time, and the occupation time in half-interval with an absorbing or reflecting boundary, for the physical system with anomalous diffusion and spontaneous evanescence.
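The occupation time in half-space mentioned above can be probed numerically for ordinary diffusion (the anomalous-diffusion and reaction cases treated in the paper are beyond this sketch): a seeded Monte Carlo over simple random walks recovers the U-shaped, arcsine-law distribution of the occupation fraction, whose density piles up near 0 and 1 rather than near 1/2.

```python
import random

def occupation_fraction(steps, rng):
    """Fraction of time a simple symmetric random walk started at the
    origin spends in the half-space x > 0."""
    x, positive = 0, 0
    for _ in range(steps):
        x += rng.choice((-1, 1))
        if x > 0:
            positive += 1
    return positive / steps

# Seeded ensemble: the histogram of these fractions is U-shaped
# (arcsine law), even though the mean is close to 1/2.
rng = random.Random(42)
fractions = [occupation_fraction(1000, rng) for _ in range(2000)]
```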

  14. Retention of retrospective print journals in the digital age: trends and analysis

    PubMed Central

    Kaplan, Richard; Steinberg, Marilyn; Doucette, Joanne

    2006-01-01

    Purpose: The issue of retaining retrospective print journals is examined in light of the shift to electronic titles, the reallocation of library budgets from print to electronic, and the changing research practices of today's library users. This article also examines the evolving role of the physical library and its impact on space allocation. Methods: To determine current practice and opinion, a survey of health sciences librarians and academic librarians was conducted. To demonstrate the use patterns of older journal issues, citation analyses and interlibrary loan statistics were examined. Results: All methods indicate that recent material is accessed more frequently than older material, with a significant drop in use of materials greater than 15 years old. Materials greater than 20 years old constituted less than 5% of interlibrary loans and less than 9% of articles noted in the citation analysis. Conclusions: It is possible to eliminate older years of a print journal collection without a large impact on the needs of researchers. Librarians' preference to maintain full runs of journal titles may be motivated by reasons other than actual usage or patrons' needs. PMID:17082829

  15. Analyzing linear spatial features in ecology.

    PubMed

    Buettel, Jessie C; Cole, Andrew; Dickey, John M; Brook, Barry W

    2018-06-01

    The spatial analysis of dimensionless points (e.g., tree locations on a plot map) is common in ecology, for instance using point-process statistics to detect and compare patterns. However, the treatment of one-dimensional linear features (fiber processes) is rarely attempted. Here we appropriate the methods of vector sums and dot products, used regularly in fields like astrophysics, to analyze a data set of mapped linear features (logs) measured in 12 × 1-ha forest plots. For this demonstrative case study, we ask two deceptively simple questions: do trees tend to fall downhill, and if so, does slope gradient matter? Despite noisy data and many potential confounders, we show clearly that topography (slope direction and steepness) of forest plots does matter to treefall. More generally, these results underscore the value of mathematical methods of physics to problems in the spatial analysis of linear features, and the opportunities that interdisciplinary collaboration provides. This work provides scope for a variety of future ecological analyses of fiber processes in space. © 2018 by the Ecological Society of America.
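The vector-sum and dot-product machinery the authors borrow can be sketched briefly (illustrative only; angles in radians): the mean resultant length R of the fall directions measures how aligned they are, and a dot product of the unit fall vector with the unit downslope vector tests the downhill tendency of each log.

```python
import math

def mean_resultant(angles):
    """Vector-sum mean direction and resultant length R in [0, 1].
    R near 1 = strongly aligned fall directions; R near 0 = isotropic."""
    sx = sum(math.cos(a) for a in angles)
    sy = sum(math.sin(a) for a in angles)
    n = len(angles)
    return math.atan2(sy, sx), math.hypot(sx, sy) / n

def downhill_alignment(fall_angle, downslope_angle):
    """Dot product of the two unit vectors:
    +1 = fell straight downhill, -1 = fell straight uphill."""
    return math.cos(fall_angle - downslope_angle)
```

Four logs all falling within ~6 degrees of each other give R close to 1, while four logs at right angles give R close to 0.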

  16. Analysis of Chromosomal Aberrations in the Blood Lymphocytes of Astronauts after Space Flight

    NASA Technical Reports Server (NTRS)

    George, K.; Kim, M. Y.; Elliott, T.; Cucinotta, F. A.

    2007-01-01

    It is a NASA requirement that biodosimetry analysis be performed on all US astronauts who participate in long duration missions of 3 months or more onboard the International Space Station. Cytogenetic analysis of blood lymphocytes is the most sensitive and reliable biodosimetry method available at present, especially if chromosome damage is assessed before as well as after space flight. Results provide a direct measurement of space radiation damage in vivo that takes into account individual radiosensitivity and considers the influence of microgravity and other stress conditions. We present data obtained from all twenty-five of the crewmembers who have participated in the biodosimetry program so far. The yield of chromosome exchanges, measured using fluorescence in situ hybridization (FISH) technique with chromosome painting probes, increased after space flight for all these individuals. In vivo dose was derived from frequencies of chromosome exchanges using preflight calibration curves of in vitro exposed cells from the same individual, and RBE was compared with individually measured physically absorbed dose and projected organ dose equivalents. Biodosimetry estimates using samples collected within a few weeks of return from space lie within the range expected from physical dosimetry. For some of these individuals chromosome aberrations were assessed again several months after their respective missions and a temporal decline in stable exchanges was observed in some cases, suggesting that translocations are unstable with time after whole body exposure to space radiation. This may indicate complications with the use of translocations for retrospective dose reconstruction. Data from one crewmember who has participated in two separate long duration space missions and has been followed up for over 10 years provides limited data on the effect of repeat flights and shows a possible adaptive response to space radiation exposure.

  17. Diagnosing alternative conceptions of Fermi energy among undergraduate students

    NASA Astrophysics Data System (ADS)

    Sharma, Sapna; Ahluwalia, Pardeep Kumar

    2012-07-01

    Physics education researchers have scientifically established the fact that the understanding of new concepts and the interpretation of incoming information are strongly influenced by the preexisting knowledge and beliefs of students, called epistemological beliefs. This can lead to a gap between what students actually learn and what the teacher expects them to learn. In a classroom, as a teacher, it is desirable that one tries to bridge this gap at least on the key concepts of the field being taught. One such key concept, which crops up in statistical physics/solid-state physics courses and around which the behaviour of materials is described, is Fermi energy (εF). In this paper, we present the results about misconceptions of Fermi energy that emerged in the process of administering a diagnostic tool called the Statistical Physics Concept Survey, developed by the authors. It deals with eight themes of basic importance in learning undergraduate solid-state physics and statistical physics. The question items of the tool were put through well-established sequential processes: definition of themes, a Delphi study, interviews with students, drafting of questions, administration, and validity and reliability checks of the tool. The tool was administered to a group of undergraduate and postgraduate students in a pre-test and post-test design. In this paper, we take one of the themes of the diagnostic tool, Fermi energy, for our analysis and discussion. Students' responses and the reasoning they gave during interviews were analysed. This analysis helped us to identify prevailing misconceptions/learning gaps among students on this topic. The paper also presents how spreadsheets can be used effectively to remove the identified misconceptions and to help students appreciate the finer nuances in visualizing the behaviour of the system around the Fermi energy, which is normally sidestepped by both teachers and learners.
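The behaviour around the Fermi energy that the authors visualize with spreadsheets comes from the Fermi-Dirac occupation function (a minimal sketch; units are arbitrary): f(E) = 1/(exp((E − μ)/kT) + 1) equals 1/2 exactly at E = μ at any temperature, is symmetric about that point, and sharpens into a step as kT → 0.

```python
import math

def fermi_dirac(E, mu, kT):
    """Mean occupation of a single-particle state at energy E for
    chemical potential mu (the Fermi energy at T = 0) and thermal
    energy kT, in the same (arbitrary) energy units."""
    return 1.0 / (math.exp((E - mu) / kT) + 1.0)
```

Tabulating `fermi_dirac` over a grid of E for several kT values reproduces the spreadsheet picture: states well below μ are fully occupied, states well above are empty, and only a window of width ~kT around μ is partially occupied.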

  18. Suited crewmember productivity

    NASA Astrophysics Data System (ADS)

    Barer, A. S.; Filipenkov, S. N.

    Analysis of the extravehicular activity (EVA) sortie experience gained in the former Soviet Union, together with physiological and hygienic aspects of space suit design and development, shows that crewmember productivity is related to the following main factors: space suit microclimate (gas composition, pressure and temperature); limitation of motion activity and perception imposed by the space suit; quality of crewmember training in the ground training program; level of the crewmember's general physical performance capabilities in connection with mission duration and intervals between sorties; individual EVA experience, with whose accumulation workmanship improves while metabolism and physical and emotional stress decrease; duration and work rate of the concrete EVA; and EVA bioengineering, including selection of tools, work station, EVA technology and mechanization.

  19. Identifying phase-space boundaries with Voronoi tessellations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Debnath, Dipsikha; Gainer, James S.; Kilic, Can

    Determining the masses of new physics particles appearing in decay chains is an important and longstanding problem in high energy phenomenology. Recently it has been shown that these mass measurements can be improved by utilizing the boundary of the allowed region in the fully differential phase space in its full dimensionality. In this paper we show that the practical challenge of identifying this boundary can be solved using techniques based on the geometric properties of the cells resulting from Voronoi tessellations of the relevant data. The robust detection of such phase-space boundaries in the data could also be used to corroborate a new physics discovery based on a cut-and-count analysis.

  20. Identifying phase-space boundaries with Voronoi tessellations

    DOE PAGES

    Debnath, Dipsikha; Gainer, James S.; Kilic, Can; ...

    2016-11-24

    Determining the masses of new physics particles appearing in decay chains is an important and longstanding problem in high energy phenomenology. Recently it has been shown that these mass measurements can be improved by utilizing the boundary of the allowed region in the fully differential phase space in its full dimensionality. In this paper we show that the practical challenge of identifying this boundary can be solved using techniques based on the geometric properties of the cells resulting from Voronoi tessellations of the relevant data. The robust detection of such phase-space boundaries in the data could also be used to corroborate a new physics discovery based on a cut-and-count analysis.
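The geometric signature exploited in the two records above (Voronoi cells abutting a boundary differ systematically from interior cells) can be illustrated without a full Voronoi construction: estimate each seed's cell volume by Monte Carlo nearest-neighbor assignment, and note that a seed sitting next to an unpopulated region acquires a conspicuously large cell. The seeds and sample counts below are invented for illustration, not taken from the papers.

```python
import random

def cell_volumes(seeds, n_samples, rng):
    """Monte Carlo estimate of the Voronoi cell volume of each seed
    inside the unit square: draw random points and credit each to its
    nearest seed. Returns the fraction of the square owned by each cell."""
    counts = [0] * len(seeds)
    for _ in range(n_samples):
        p = (rng.random(), rng.random())
        nearest = min(range(len(seeds)),
                      key=lambda i: (p[0] - seeds[i][0]) ** 2 +
                                    (p[1] - seeds[i][1]) ** 2)
        counts[nearest] += 1
    return [c / n_samples for c in counts]

# All seeds sit in the left part of the square; the rightmost seed
# borders the empty region, so its cell is conspicuously large --
# the signature of a cell abutting a (here, artificial) boundary.
seeds = [(0.1, 0.5), (0.25, 0.5), (0.4, 0.5)]
vols = cell_volumes(seeds, 20000, random.Random(0))
```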
