Science.gov

Sample records for 50-minute period statistical

  1. Providing Students with Foundational Field Instruction within a 50 Minute Class Period: A Practical Example

    NASA Astrophysics Data System (ADS)

    Percy, M.

    2014-12-01

There is a growing recognition among secondary educators and administrators that students need a science education that draws connections between familiar classes like biology, chemistry, and physics. Because of this waxing interest in an integrative approach to the sciences, there is a broader push for school districts to offer classes geared towards the earth sciences, a field that incorporates knowledge and skills gleaned from the three core science subjects. Within the context of a regular secondary school day on a traditional schedule (45- to 50-minute classes), it is challenging to engage students in rigorous field-based learning, critical for developing a deeper understanding of geosciences content, without requiring extra time outside of the regular schedule. We suggest instruction using common, man-made features like drainage retention ponds to model good field practices and to give students the opportunity to calculate basic hydrologic budgets, take pH readings, and, in areas with seasonal rainfall, make observations of soils by way of trenching and of near-surface processes, including mass wasting and the effects of vegetation on geomorphology. Gains in student understanding are assessed by comparing test scores on exams given after in-class instruction alone with scores on exams given after field instruction in addition to the in-class lectures. In an advanced setting, students made measurements of ion contents and pollution that allowed the classes to practice lab skills while developing a data set that was analyzed after the field work was completed. It is posited that similar fieldwork could be an effective approach at the introductory level in post-secondary institutions.

  2. Sampling period, statistical complexity, and chaotic attractors

    NASA Astrophysics Data System (ADS)

    De Micco, Luciana; Fernández, Juana Graciela; Larrondo, Hilda A.; Plastino, Angelo; Rosso, Osvaldo A.

    2012-04-01

We analyze the statistical complexity measure vs. entropy plane representation of sampled chaotic attractors as a function of the sampling period τ and show that, if the Bandt and Pompe procedure is used to assign a probability distribution function (PDF) to the pertinent time series, the statistical complexity measure (SCM) attains a definite maximum for a specific sampling period t_M. On the contrary, the usual histogram approach for assigning PDFs to a time series leads to essentially constant SCM values for any sampling period τ. The significance of t_M is further investigated by comparing it with typical times found in the literature for the two main reconstruction processes: Takens' delay-time embedding on one hand, and the exact Nyquist-Shannon reconstruction on the other. It is shown that t_M is compatible with the times recommended as adequate delays in Takens' reconstruction. The reported results correspond to three representative chaotic systems having correlation dimension 2
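A minimal sketch of the Bandt-Pompe ordinal-pattern PDF and a Jensen-Shannon statistical complexity of the kind discussed above (function names and normalization choices are illustrative, not the authors' code):

```python
import itertools
import math

def bandt_pompe_pdf(series, d=3):
    """Probability of each ordinal pattern of length d (Bandt-Pompe PDF)."""
    counts = {p: 0 for p in itertools.permutations(range(d))}
    for i in range(len(series) - d + 1):
        window = series[i:i + d]
        pattern = tuple(sorted(range(d), key=lambda k: window[k]))
        counts[pattern] += 1
    total = sum(counts.values())
    return [c / total for c in counts.values()]

def shannon(p):
    return -sum(x * math.log(x) for x in p if x > 0)

def statistical_complexity(p):
    """Jensen-Shannon statistical complexity C = Q_J[p, uniform] * H_norm[p]."""
    n = len(p)
    h_norm = shannon(p) / math.log(n)
    uniform = [1.0 / n] * n
    mid = [(a + b) / 2 for a, b in zip(p, uniform)]
    # normalized Jensen-Shannon divergence between p and the uniform PDF
    q0 = -2.0 / (((n + 1) / n) * math.log(n + 1) - 2 * math.log(2 * n) + math.log(n))
    qj = q0 * (shannon(mid) - shannon(p) / 2 - shannon(uniform) / 2)
    return qj * h_norm
```

A monotone series concentrates all probability on one pattern, so its entropy and complexity vanish, while a chaotic series such as the logistic map yields a nonzero complexity.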

  3. Statistical Inference of the Be Star Periodicity

    NASA Astrophysics Data System (ADS)

    Hubert, A. M.

    2007-03-01

A review of periodicities with different timescales found in photometric and line profile variability of Be stars from visual, UV, and X-ray observations is presented. A distinction is made between what are called "stable" periods, "transient" periods and cyclic variations. Firstly, the report focuses on the intrinsic variability of Be stars. I attempt to distinguish between variations due to non-radial pulsations and those due to corotating magnetic structures at or near the stellar surface. As the rotational period is a critical value for selection of processes giving rise to short-term periodicities, estimates provided by rotational modulation attributes are compared with those derived from fundamental parameters. Then, I give an overview of new space-based instrumentation, which will improve our understanding of the origin and nature of Be stars through analyses of their light curves. Secondly, periodicities with longer time scales are considered. These periods are associated with binary systems. Though binarity is irrelevant to the Be phenomenon in many cases, it cannot be excluded that a significant number of Be stars are formed through binary processes. Thus, periodicities linked to mass transfer or orbital effects are analyzed in an evolutionary scheme. Thirdly, long-term cyclic variations that reflect the dynamical state of Be star disks are briefly reviewed.

  4. Gene Identification Algorithms Using Exploratory Statistical Analysis of Periodicity

    NASA Astrophysics Data System (ADS)

    Mukherjee, Shashi Bajaj; Sen, Pradip Kumar

    2010-10-01

Studying periodic patterns is a standard line of attack for recognizing DNA sequences in gene identification and similar problems, yet surprisingly little significant work has been done in this direction. This paper studies statistical properties of complete-genome DNA sequences using a new technique. A DNA sequence is converted to a numeric sequence using various types of mappings, and the standard Fourier technique is applied to study the periodicity. Distinct statistical behaviour of the periodicity parameters is found in coding and non-coding sequences, which can be used to distinguish between these parts. Here, DNA sequences of Drosophila melanogaster were analyzed with significant accuracy.
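As an illustration of the mapping-plus-Fourier pipeline described above (the specific 0/1 indicator mapping and the toy sequence are assumptions, not the authors' method), the well-known period-3 signal of coding DNA appears as a sharp peak in the power spectrum:

```python
import cmath

def indicator_power_spectrum(seq, base):
    """Map a DNA string to a 0/1 indicator sequence for one base and
    return the Fourier power |X_k|^2 at frequencies k/N, k = 0..N//2."""
    x = [1.0 if b == base else 0.0 for b in seq]
    n = len(x)
    return [abs(sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                    for j in range(n))) ** 2
            for k in range(n // 2 + 1)]

# toy "coding-like" sequence with exact codon (period-3) structure
seq = "ATG" * 60                     # hypothetical, length N = 180
spec = indicator_power_spectrum(seq, "A")
# the period-3 component shows up as a peak at k = N/3 = 60
```

Real genomic sequences are noisy, so in practice the peak at N/3 is compared statistically against the background spectrum rather than read off directly.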

  5. Beyond the 50-minute hour: increasing control, choice, and connections in the lives of low-income women.

    PubMed

    Goodman, Lisa A; Smyth, Katya Fels; Banyard, Victoria

    2010-01-01

    Although poverty is associated with a range of mental health difficulties among women in this country, mainstream mental health interventions are not sufficient to meet the complex needs of poor women. This article argues that stress, powerlessness, and social isolation should become primary targets of our interventions, as they are key mediators of the relationship between poverty and emotional distress, particularly for women. Indeed, if ways are not found to address these conditions directly, by increasing women's control, choice, and connections, the capacity to improve the emotional well-being of impoverished women will remain limited at best. This is the first of 5 articles that comprise a special section of the American Journal of Orthopsychiatry, called "Beyond the 50-Minute Hour: Increasing Control, Choice, and Connections in the Lives of Low-Income Women." Together, these articles explore the nature and impact of a range of innovative mental health interventions that are grounded in a deep understanding of the experience of poverty. This introduction: (a) describes briefly how mainstream approaches fail to address the poverty-related mental health needs of low-income women; (b) illuminates the role of stress, powerlessness, and social isolation in women's lives; (c) highlights the ways in which the articles included in this special section address each of these by either adapting traditional mental health practices to attend to poverty's role in participants' lives or adapting community-based, social-justice-oriented interventions to attend to participants' mental health; and (d) discusses the research and evaluation implications of expanding mental health practices to meet the needs of low-income communities.

  6. Asymptotic work statistics of periodically driven Ising chains

    NASA Astrophysics Data System (ADS)

    Russomanno, Angelo; Sharma, Shraddha; Dutta, Amit; Santoro, Giuseppe E.

    2015-08-01

We study the work statistics of a periodically-driven integrable closed quantum system, addressing in particular the role played by the presence of a quantum critical point. Taking the example of a one-dimensional transverse Ising model in the presence of a spatially homogeneous but periodically time-varying transverse field of frequency ω0, we arrive at the characteristic cumulant generating function G(u), which is then used to calculate the work distribution function P(W). By applying Floquet theory we show that, in the infinite-time limit, P(W) converges, starting from the initial ground state, towards an asymptotic steady-state value whose small-W behaviour depends only on the properties of the small-wave-vector modes and on a few important ingredients: the time-averaged value of the transverse field, h0, the initial transverse field, h_i, and the equilibrium quantum critical point h_c, which we find to generate a sequence of non-equilibrium critical points h*_l = h_c + l ω0/2, with l integer. When h_i

  7. Statistical analysis of the ambiguities in the asteroid period determinations

    NASA Astrophysics Data System (ADS)

    Butkiewicz, M.; Kwiatkowski, T.; Bartczak, P.; Dudziński, G.

    2014-07-01

A synodic period of an asteroid can be derived from its lightcurve by standard methods like Fourier-series fitting. A problem appears when results of observations are based on less than a full coverage of a lightcurve and/or a high level of noise. Also, long gaps between individual lightcurves create an ambiguity in the cycle count which leads to aliases. Excluding binary systems and objects with non-principal-axis rotation, the rotation period is usually identical to the period of the second Fourier harmonic of the lightcurve. There are cases, however, where it may be connected with the 1st, 3rd, or 4th harmonic, and it is difficult to choose among them when searching for the period. To help remove such uncertainties, we analysed asteroid lightcurves for a range of shapes and observing/illuminating geometries. We simulated them using a modified internal code from the ISAM service (Marciniak et al. 2012, A&A 545, A131). In our computations, shapes of asteroids were modeled as Gaussian random spheres (Muinonen 1998, A&A, 332, 1087). A combination of Lommel-Seeliger and Lambert scattering laws was assumed. For each of the 100 shapes, we randomly selected 1000 positions of the spin axis, systematically changing the solar phase angle with a step of 5°. For each lightcurve, we determined its peak-to-peak amplitude, fitted a 6th-order Fourier series and derived the amplitudes of its harmonics. Instead of the number of the lightcurve extrema, which in many cases is subjective, we characterized each lightcurve by the order of the highest-amplitude Fourier harmonic. The goal of our simulations was to derive statistically significant conclusions (based on the underlying assumptions) about the dominance of different harmonics in the lightcurves of the specified amplitude and phase angle. The results, presented in the Figure, can be used in individual cases to estimate the probability that the obtained lightcurve is dominated by a specified Fourier harmonic. Some of the
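The harmonic-amplitude characterization above can be sketched as a least-squares Fourier fit (a plain NumPy sketch; the function name and the synthetic double-peaked lightcurve are illustrative, not the ISAM code):

```python
import numpy as np

def harmonic_amplitudes(phase, mag, order=6):
    """Least-squares fit of a Fourier series of the given order to a
    lightcurve sampled at rotational phases in [0, 1); returns the
    amplitude of each harmonic m = 1..order."""
    phase = np.asarray(phase)
    cols = [np.ones_like(phase)]
    for m in range(1, order + 1):
        cols.append(np.cos(2 * np.pi * m * phase))
        cols.append(np.sin(2 * np.pi * m * phase))
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, np.asarray(mag), rcond=None)
    return [float(np.hypot(coef[2 * m - 1], coef[2 * m]))
            for m in range(1, order + 1)]

# synthetic double-peaked lightcurve: the 2nd harmonic dominates
rng = np.random.default_rng(0)
phase = rng.uniform(0, 1, 200)
mag = (0.3 * np.cos(4 * np.pi * phase)
       + 0.05 * np.cos(2 * np.pi * phase)
       + rng.normal(0, 0.01, 200))
amps = harmonic_amplitudes(phase, mag)
dominant = 1 + int(np.argmax(amps))
```

Characterizing the lightcurve by `dominant`, the order of the highest-amplitude harmonic, avoids the subjective counting of extrema mentioned in the abstract.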

  8. Statistical tests of a periodicity hypothesis for crater formation rate - II

    NASA Astrophysics Data System (ADS)

    Yabushita, S.

    1996-04-01

A statistical test is made of the periodicity hypothesis for the crater formation rate, using a new data set compiled by Grieve. The criterion adopted is that of Broadbent, modified so as to take into account the loss of craters with time. Small craters (diameters ≤2 km) are highly concentrated near the recent epoch and are not adequate as a data set for testing. Various subsets of the original data are subjected to the test and a period close to 30 Myr is detected. On the assumption of a random distribution of crater ages, the probability of detecting such a period is calculated at 50, 73 and 64 per cent, respectively, for craters with D ≥ 2 km (N = 49), for those with 2 km ≤ D ≤ 10 km (N = 31) and for large craters (D ≥ 10 km, N = 18), where N is the number of craters. It is thus difficult to regard the detected period as significant on statistical grounds alone. It is pointed out that a similar period is associated with geomagnetic reversals and with the climatic variation revealed by the deep-ocean δ18O spectrum.

  9. Marginal increment analysis: a new statistical approach of testing for temporal periodicity in fish age verification.

    PubMed

    Okamura, H; Punt, A E; Semba, Y; Ichinokawa, M

    2013-04-01

    This paper proposes a new and flexible statistical method for marginal increment analysis that directly accounts for periodicity in circular data using a circular-linear regression model with random effects. The method is applied to vertebral marginal increment data for Alaska skate Bathyraja parmifera. The best fit model selected using the AIC indicates that growth bands are formed annually. Simulation, where the underlying characteristics of the data are known, shows that the method performs satisfactorily when uncertainty is not extremely high.
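The circular-linear idea can be sketched as a sinusoidal regression of increment on month-of-year (the paper's actual model adds random effects and AIC-based model selection; this fixed-effects version with illustrative names and synthetic data is only a sketch):

```python
import numpy as np

def fit_annual_cycle(month, increment):
    """Fixed-effects sketch of circular-linear regression: marginal
    increment modelled as a sinusoid of month-of-year (period 12)."""
    ang = 2 * np.pi * np.asarray(month) / 12.0
    A = np.column_stack([np.ones_like(ang), np.cos(ang), np.sin(ang)])
    coef, *_ = np.linalg.lstsq(A, np.asarray(increment), rcond=None)
    amplitude = float(np.hypot(coef[1], coef[2]))
    peak_month = float((np.arctan2(coef[2], coef[1]) * 12 / (2 * np.pi)) % 12)
    return amplitude, peak_month

# synthetic annual cycle peaking in month 3 with amplitude 0.4
rng = np.random.default_rng(1)
month = np.tile(np.arange(12.0), 10)
inc = 0.5 + 0.4 * np.cos(2 * np.pi * (month - 3) / 12) + rng.normal(0, 0.02, 120)
amp, peak = fit_annual_cycle(month, inc)
```

An amplitude significantly greater than zero, with the peak month stable across years, is the kind of evidence that bands are deposited annually.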

  10. Periodic orbit theory and the statistical analysis of scaling quantum graph spectra.

    PubMed

    Dabaghian, Yu

    2007-05-01

The explicit solution to the spectral problem of quantum graphs found recently by Dabaghian and Blümel [Phys. Rev. E 68, 055201(R) (2003); 70, 046206 (2004); JETP Lett. 77, 530 (2003)] is used to produce an exact periodic orbit theory description for the probability distributions of spectral statistics, including the distribution of the nearest-neighbor separations s_n = k_n − k_{n−1} and the distribution of the spectral oscillations around the average, δk_n = k_n − k̄_n.
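Both statistics are computed directly from an ordered spectrum; a small sketch (illustrative names, with the smooth average level crudely approximated by a linear fit rather than the exact periodic-orbit expression):

```python
import numpy as np

def spectral_statistics(levels):
    """Nearest-neighbour separations s_n = k_n - k_{n-1} (rescaled to unit
    mean) and oscillations delta_n = k_n - kbar_n around a smooth average,
    here approximated by a linear fit of k_n against the index n."""
    k = np.sort(np.asarray(levels, dtype=float))
    s = np.diff(k)
    s = s / s.mean()
    n = np.arange(len(k))
    smooth = np.polyval(np.polyfit(n, k, 1), n)   # crude mean staircase
    delta = k - smooth
    return s, delta

# synthetic spectrum with unit mean spacing
rng = np.random.default_rng(2)
k = np.cumsum(rng.uniform(0.5, 1.5, 200))
s, delta = spectral_statistics(k)
```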

  11. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, Max

    1993-01-01

This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.

  12. On the Helicity in 3D-Periodic Navier-Stokes Equations II: The Statistical Case

    NASA Astrophysics Data System (ADS)

    Foias, Ciprian; Hoang, Luan; Nicolaenko, Basil

    2009-09-01

We study the asymptotic behavior of the statistical solutions to the Navier-Stokes equations using the normalization map [9]. This is then applied to the study of the mean energy, mean dissipation rate of energy, and mean helicity of spatially periodic flows driven by potential body forces. The statistical distribution of the asymptotic Beltrami flows is also investigated. We connect our mathematical analysis with the empirical theory of decaying turbulence. With appropriate mathematically defined ensemble averages, the Kolmogorov universal features are shown to be transient in time. We provide an estimate for the time interval in which those features may still be present. Our collaborator and friend Basil Nicolaenko passed away in September of 2007, after this work was completed. Honoring his contribution and friendship, we dedicate this article to him.

  13. Mathematics and Statistics Research Department progress report, period ending June 30, 1982

    SciTech Connect

    Denson, M.V.; Funderlic, R.E.; Gosslee, D.G.; Lever, W.E.

    1982-08-01

    This report is the twenty-fifth in the series of progress reports of the Mathematics and Statistics Research Department of the Computer Sciences Division, Union Carbide Corporation Nuclear Division (UCC-ND). Part A records research progress in analysis of large data sets, biometrics research, computational statistics, materials science applications, moving boundary problems, numerical linear algebra, and risk analysis. Collaboration and consulting with others throughout the UCC-ND complex are recorded in Part B. Included are sections on biology, chemistry, energy, engineering, environmental sciences, health and safety, materials science, safeguards, surveys, and the waste storage program. Part C summarizes the various educational activities in which the staff was engaged. Part D lists the presentations of research results, and Part E records the staff's other professional activities during the report period.

  14. Mathematics and statistics research progress report, period ending June 30, 1983

    SciTech Connect

    Beauchamp, J. J.; Denson, M. V.; Heath, M. T.; Lever, W. E.; Wilson, D. G.

    1983-08-01

    This report is the twenty-sixth in the series of progress reports of Mathematics and Statistics Research of the Computer Sciences organization, Union Carbide Corporation Nuclear Division. Part A records research progress in analysis of large data sets, applied analysis, biometrics research, computational statistics, materials science applications, numerical linear algebra, and risk analysis. Collaboration and consulting with others throughout the Oak Ridge Department of Energy complex are recorded in Part B. Included are sections on biological sciences, energy, engineering, environmental sciences, health and safety, and safeguards. Part C summarizes the various educational activities in which the staff was engaged. Part D lists the presentations of research results, and Part E records the staff's other professional activities during the report period.

  15. The critical period hypothesis in second language acquisition: a statistical critique and a reanalysis.

    PubMed

    Vanhove, Jan

    2013-01-01

    In second language acquisition research, the critical period hypothesis (cph) holds that the function between learners' age and their susceptibility to second language input is non-linear. This paper revisits the indistinctness found in the literature with regard to this hypothesis's scope and predictions. Even when its scope is clearly delineated and its predictions are spelt out, however, empirical studies-with few exceptions-use analytical (statistical) tools that are irrelevant with respect to the predictions made. This paper discusses statistical fallacies common in cph research and illustrates an alternative analytical method (piecewise regression) by means of a reanalysis of two datasets from a 2010 paper purporting to have found cross-linguistic evidence in favour of the cph. This reanalysis reveals that the specific age patterns predicted by the cph are not cross-linguistically robust. Applying the principle of parsimony, it is concluded that age patterns in second language acquisition are not governed by a critical period. To conclude, this paper highlights the role of confirmation bias in the scientific enterprise and appeals to second language acquisition researchers to reanalyse their old datasets using the methods discussed in this paper. The data and R commands that were used for the reanalysis are provided as supplementary materials.
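Piecewise (segmented) regression of the kind advocated here can be sketched with a broken-stick design matrix and a grid search over the breakpoint (a NumPy sketch; variable names and the synthetic data are illustrative, not the paper's reanalysis):

```python
import numpy as np

def fit_breakpoint(age, score, candidates):
    """Segmented ('broken-stick') regression: the slope of score on age is
    allowed to change at a breakpoint, chosen by grid search on the SSE."""
    best = None
    for bp in candidates:
        A = np.column_stack([np.ones_like(age), age, np.maximum(age - bp, 0)])
        coef, *_ = np.linalg.lstsq(A, score, rcond=None)
        sse = float(np.sum((A @ coef - score) ** 2))
        if best is None or sse < best[0]:
            best = (sse, bp, coef)
    return best[1], best[2]

# synthetic data: steep age effect up to age 15, much flatter afterwards
age = np.arange(1.0, 51.0)
score = 10 - 0.5 * age + 0.45 * np.maximum(age - 15, 0)
bp, coef = fit_breakpoint(age, score, range(5, 41))
```

Under a critical-period account the slope before the breakpoint should differ sharply from the slope after it; a roughly linear decline with no defensible breakpoint is evidence against the cph.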

  16. The Critical Period Hypothesis in Second Language Acquisition: A Statistical Critique and a Reanalysis

    PubMed Central

    Vanhove, Jan

    2013-01-01

    In second language acquisition research, the critical period hypothesis (cph) holds that the function between learners' age and their susceptibility to second language input is non-linear. This paper revisits the indistinctness found in the literature with regard to this hypothesis's scope and predictions. Even when its scope is clearly delineated and its predictions are spelt out, however, empirical studies–with few exceptions–use analytical (statistical) tools that are irrelevant with respect to the predictions made. This paper discusses statistical fallacies common in cph research and illustrates an alternative analytical method (piecewise regression) by means of a reanalysis of two datasets from a 2010 paper purporting to have found cross-linguistic evidence in favour of the cph. This reanalysis reveals that the specific age patterns predicted by the cph are not cross-linguistically robust. Applying the principle of parsimony, it is concluded that age patterns in second language acquisition are not governed by a critical period. To conclude, this paper highlights the role of confirmation bias in the scientific enterprise and appeals to second language acquisition researchers to reanalyse their old datasets using the methods discussed in this paper. The data and R commands that were used for the reanalysis are provided as supplementary materials. PMID:23935947

  17. Influence of harvest method and period on olive oil composition: an NMR and statistical study.

    PubMed

    D'Imperio, Marco; Gobbino, Marco; Picanza, Antonio; Costanzo, Simona; Della Corte, Anna; Mannina, Luisa

    2010-10-27

    The influence of harvest period and harvest method on olive oil composition was investigated by nuclear magnetic resonance (NMR) spectroscopy and by some quality parameters such as free acidity, peroxide value, and UV spectrophotometric indices. This work focuses on two secondary factors (harvest period and harvest method) and investigated their interactions with primary (genetic and pedoclimatic) and secondary (agronomic practices and technological procedures) factors. To avoid misinterpretation, the general linear model analysis (GLM) was used to adjust the result obtained from the analysis of variance (ANOVA). In this way, the effect of the factor of interest was corrected for the effects of the other factors that might influence the variable under investigation. The weight of each factor was evaluated by the variance component analysis (VCA). Finally, multivariate statistical analyses, namely, principal component analysis (PCA) and linear discriminant analysis (LDA), were applied. Samples were grouped according to the harvest period and harvest method. Volatile compounds, that is, hexanal and trans-2-hexenal, as well as the sn-1,3-diglycerides and squalene, significantly decreased during the ripening. The relative value of the ΔK parameter and the hexanal amount were higher in the olive oils obtained from olives harvested by one type of hand-held machine (shaker), whereas the unsaturated fatty chains in the olive oils were higher when another type (comb) was used.

  18. Mathematics and statistics research department. Progress report, period ending June 30, 1981

    SciTech Connect

    Lever, W.E.; Kane, V.E.; Scott, D.S.; Shepherd, D.E.

    1981-09-01

    This report is the twenty-fourth in the series of progress reports of the Mathematics and Statistics Research Department of the Computer Sciences Division, Union Carbide Corporation - Nuclear Division (UCC-ND). Part A records research progress in biometrics research, materials science applications, model evaluation, moving boundary problems, multivariate analysis, numerical linear algebra, risk analysis, and complementary areas. Collaboration and consulting with others throughout the UCC-ND complex are recorded in Part B. Included are sections on biology and health sciences, chemistry, energy, engineering, environmental sciences, health and safety research, materials sciences, safeguards, surveys, and uranium resource evaluation. Part C summarizes the various educational activities in which the staff was engaged. Part D lists the presentations of research results, and Part E records the staff's other professional activities during the report period.

  19. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, M.

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the authors' techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The authors' method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.

  20. Clusters in the distribution of pulsars in period, pulse-width, and age. [statistical analysis/statistical distributions

    NASA Technical Reports Server (NTRS)

    Baker, K. B.; Sturrock, P. A.

    1975-01-01

    The question of whether pulsars form a single group or whether pulsars come in two or more different groups is discussed. It is proposed that such groups might be related to several factors such as the initial creation of the neutron star, or the orientation of the magnetic field axis with the spin axis. Various statistical models are examined.

  1. An atomic force microscopy statistical analysis of laser-induced azo-polyimide periodic tridimensional nanogrooves.

    PubMed

    Stoica, Iuliana; Epure, Luiza; Sava, Ion; Damian, Victor; Hurduc, Nicolae

    2013-09-01

The surface morphology of azo-polyimide films was investigated after 355 nm Nd:YAG laser irradiation at two different incident fluences. Atomic force microscopy (AFM) was employed to correlate the laser-induced tridimensional nanogrooved surface relief with the incident fluence and the number of irradiation pulses. The height images revealed that the groove depth increased by as much as tens of times with increasing incident fluence at the same number of irradiation pulses. For low incident fluence, the films were uniformly patterned up to 100 pulses of irradiation. In contrast, at higher fluence the accuracy of the surface relief definition was reduced after 15 pulses. This behavior could be explained by means of two different mechanisms: one involving photo-fluidization of the film due to cis-trans isomerization of the azo-groups, and a second responsible for the directional mass displacement. The dominant surface direction and parameters like isotropy, periodicity, and period were evaluated from the polar representation for texture analysis, revealing the appearance of ordered and directional nanostructures under most of the experimental conditions. Graphical studies of the functional volume parameters also evidenced the improvement of the relief structuring during surface nanostructuration. The correlation of these statistical texture parameters with the irradiation characteristics is important in controlling the alignment of either liquid crystals or cells/tissues on patterned azo-polyimide surfaces, for optoelectronic devices and implantable biomaterials, respectively.

  2. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  3. A statistical test for periodicity hypothesis in the crater formation rate

    NASA Astrophysics Data System (ADS)

    Yabushita, S.

    1991-06-01

    The hypothesis that the crater formation rate exhibits periodicity is examined by adopting a criterion proposed by Broadbent, which is more stringent than those adopted previously. Data sets of Alvarez and Muller, Rampino and Stothers and of Grieve are tested. The data set of Rampino and Stothers is found to satisfy the adopted criterion for periodicity with period P = 30 Myr. Again, small craters (D less than 10 km) in the data set of Grieve satisfy the criterion even better with P = 30 Myr and 50 Myr, but large craters do not satisfy the criterion. Removal of some of the very young craters (ages less than 8 Myr) yields three significant periods, 16.5, 30, and 50 Myr. Taken at face value, the result would indicate that small impactors hit the earth at intervals of 16.5 Myr and that this period is modulated by the galactic tide.
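One common formulation of Broadbent's criterion measures how tightly event times cluster around a periodic lattice; a pure-Python sketch (illustrative only; Yabushita's modification for crater loss with time is not reproduced here):

```python
import math
import random

def broadbent_s(ages, period, n_phase=200):
    """Normalized RMS deviation of event times from the nearest multiple of
    `period`, minimized over the lattice phase.  Uniformly random ages give
    S near 1/sqrt(12) ~ 0.289; values well below that suggest periodicity."""
    best = float("inf")
    for i in range(n_phase):
        phase = period * i / n_phase
        sq = 0.0
        for t in ages:
            r = (t - phase) % period
            d = min(r, period - r)        # distance to nearest lattice point
            sq += d * d
        best = min(best, math.sqrt(sq / len(ages)) / period)
    return best

random.seed(1)
s_periodic = broadbent_s([30 * i + 0.5 for i in range(10)], 30.0)
s_random = broadbent_s([random.uniform(0, 300) for _ in range(500)], 30.0)
```

The significance question in the abstract is exactly this comparison: how often ages drawn at random produce an S as small as the one observed.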

  4. Periodization

    PubMed Central

    Lorenz, Daniel S.; Reiman, Michael P.; Walker, John C.

    2010-01-01

    Background: Clinicians are constantly faced with the challenge of designing training programs for injured and noninjured athletes that maximize healing and optimize performance. Periodization is a concept of systematic progression—that is, resistance training programs that follow predictable patterns of change in training variables. The strength training literature is abundant with studies comparing periodization schemes on uninjured, trained, and untrained athletes. The rehabilitation literature, however, is scarce with information about how to optimally design resistance training programs based on periodization principles for injured athletes. The purpose of this review is to discuss relevant training variables and methods of periodization, as well as periodization program outcomes. A secondary purpose is to provide an anecdotal framework regarding implementation of periodization principles into rehabilitation programs. Evidence Acquisition: A Medline search from 1979 to 2009 was implemented with the keywords periodization, strength training, rehabilitation, endurance, power, hypertrophy, and resistance training with the Boolean term AND in all possible combinations in the English language. Each author also undertook independent hand searching of article references used in this review. Results: Based on the studies researched, periodized strength training regimens demonstrate improved outcomes as compared to nonperiodized programs. Conclusions: Despite the evidence in the strength training literature supporting periodization programs, there is a considerable lack of data in the rehabilitation literature about program design and successful implementation of periodization into rehabilitation programs. PMID:23015982

  5. Statistical Analysis of Periodic Oscillations in LASCO Coronal Mass Ejection Speeds

    NASA Astrophysics Data System (ADS)

    Michalek, G.; Shanmugaraju, A.; Gopalswamy, N.; Yashiro, S.; Akiyama, S.

    2016-12-01

A large set of coronal mass ejections (CMEs, 3463) has been selected to study their periodic oscillations in speed in the Solar and Heliospheric Observatory (SOHO) mission's Large Angle and Spectrometric Coronagraph (LASCO) field of view. These events, reported in the SOHO/LASCO catalog in the period 1996 - 2004, were selected based on having at least 11 height-time measurements. This selection criterion allows us to construct at least ten-point speed-distance profiles and evaluate kinematic properties of CMEs with reasonable accuracy. To identify quasi-periodic oscillations in the speed of the CMEs, a sinusoidal function was fitted to the speed-distance and speed-time profiles. Of the considered events, 22% revealed periodic velocity fluctuations. These speed oscillations have an average amplitude of 87 km s^{-1} and an average period of 7.8 R_⊙ in distance (241 min in time). The study shows that speed oscillations are a common phenomenon associated with CME propagation, implying that all the CMEs have a similar magnetic flux-rope structure. The nature of the oscillations can be explained in terms of magnetohydrodynamic (MHD) waves excited during the eruption process. More accurate detection of these modes could, in the future, enable us to characterize magnetic structures in space (space seismology).
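The sinusoid-fitting step can be sketched as a linear least-squares fit over a grid of trial wavelengths (NumPy; the synthetic profile below, with amplitude 87 km/s and wavelength 7.8 solar radii, merely mimics the quoted averages and is not SOHO/LASCO data):

```python
import numpy as np

def fit_speed_oscillation(dist, speed, wavelengths):
    """Fit speed(d) = v0 + a*d + A*sin(2*pi*d/lam + phi) by grid-searching
    the wavelength lam; the rest of the model is linear in its parameters."""
    best = None
    for lam in wavelengths:
        w = 2 * np.pi / lam
        A = np.column_stack([np.ones_like(dist), dist,
                             np.sin(w * dist), np.cos(w * dist)])
        coef, *_ = np.linalg.lstsq(A, speed, rcond=None)
        sse = float(np.sum((A @ coef - speed) ** 2))
        if best is None or sse < best[0]:
            best = (sse, lam, float(np.hypot(coef[2], coef[3])))
    return best[1], best[2]                    # wavelength, amplitude

# synthetic 15-point speed-distance profile (>= 11 points, as in the study)
dist = np.linspace(2, 30, 15)
speed = 400 + 5 * dist + 87 * np.sin(2 * np.pi * dist / 7.8)
lam, amp = fit_speed_oscillation(dist, speed, np.arange(4.0, 12.01, 0.2))
```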

  6. Statistical properties of quasi-periodic pulsations in white-light flares observed with Kepler

    NASA Astrophysics Data System (ADS)

    Pugh, C. E.; Armstrong, D. J.; Nakariakov, V. M.; Broomhall, A.-M.

    2016-07-01

    We embark on a study of quasi-periodic pulsations (QPPs) in the decay phase of white-light stellar flares observed by Kepler. Out of the 1439 flares on 216 different stars detected in the short-cadence data using an automated search, 56 flares are found to have pronounced QPP-like signatures in the light curve, of which 11 have stable decaying oscillations. No correlation is found between the QPP period and the stellar temperature, radius, rotation period, or surface gravity, suggesting that the QPPs are independent of global stellar parameters. Hence they are likely to be the result of processes occurring in the local environment. There is also no significant correlation between the QPP period and flare energy; however, there is evidence that the period scales with the QPP decay time for the Gaussian damping scenario, but not to a significant degree for the exponentially damped case. This same scaling has been observed for MHD oscillations on the Sun, suggesting that they could be the cause of the QPPs in these flares. Scaling laws of the flare energy are also investigated, supporting previous reports of a strong correlation between the flare energy and stellar temperature/radius. A negative correlation between the flare energy and stellar surface gravity is also found.

  7. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    PubMed

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence, and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance of, and current shortage of, multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia, a normal and recurring climatic phenomenon, were used as a case study to test this new methodology. Based on the 79 investigated events and a bivariate definition of dust storms, the joint probability distribution of severe dust storms was established using observed data on maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis.
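    A joint "AND" return period of the kind described (both wind speed and duration exceed their marginal quantiles) can be sketched from a copula. The Frank copula, the dependence parameter theta = 5, and the mean interarrival time are illustrative assumptions here, not values fitted to the study's data:

```python
import math

def frank_copula(u, v, theta):
    # Frank copula C(u, v); theta != 0 controls dependence strength.
    num = (math.exp(-theta * u) - 1) * (math.exp(-theta * v) - 1)
    return -1.0 / theta * math.log(1 + num / (math.exp(-theta) - 1))

def joint_return_period_and(u, v, theta, mu=1.0):
    # "AND" return period: both variables exceed their marginal quantiles
    # u, v; mu = mean interarrival time between events (years/event).
    p_exceed = 1 - u - v + frank_copula(u, v, theta)
    return mu / p_exceed

# Example: 90th-percentile wind speed AND 90th-percentile duration;
# mu assumed as ~79 events in 19 years, matching the study period length.
T = joint_return_period_and(0.9, 0.9, theta=5, mu=19 / 79)
print(f"joint AND return period ~ {T:.1f} years")  # ~7.1 years
```

    Note that the joint "AND" return period is always longer than either univariate return period, which is why univariate analysis overstates the frequency of compound extremes.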

  8. An integration of statistic method to track droughts periods induced by global change.

    NASA Astrophysics Data System (ADS)

    Djamel, Mimoun; Didier, Graillot

    2013-04-01

    During the last decades, droughts have occurred frequently in France, most notably in 1976, 1988, 1997 and 2003. This culminated in the severe drought of 2003, which affected mainly the south-east of Europe. Global climate models predict a prominent change in rainfall, with wetter winters and drier summers over the medium latitudes of the Northern Hemisphere. In France, the regional climate model ARPEGE shows increasing seasonal climatic variability with (a) hotter, drier summers and (b) an increase in the duration and severity of low-flow periods. The paper focuses on the temperate zone of the south-east of France, on the catchment of the Ain river, where water resources, consisting mainly of karstic and alluvial groundwater, are already a major concern today. This contribution seeks to identify whether any trend in the annual and monthly rainfall series already appears at the scale of this region and to obtain realistic projections 60 years ahead. Two data sources have been used: (a) spatially interpolated historical data for the period 1970-2006 from the French weather service model SAFRAN (NCEP re-analysis for the MSLP field and the Meteo-France SAFRAN mesoscale analysis for the precipitation observations); and (b) the four SRES scenario runs Arpege_2, Arpege_1, Arpege_A2 and Arpege_B1, which have been widely adopted as standard scenarios for use in climate change impact studies. Scenario runs were taken over two time periods: (a) 2010-2040 and (b) 2041-2070. Drought characteristics over the study area were revealed by employing the Standardized Precipitation Index (SPI) at different time scales. Negative trends of the SPI drought index were recognized by using the Mann-Kendall non-parametric test, which suggested that drought conditions have intensified through time. The trends observed in the 13 sub-catchments of interest are consistent with those observed at a larger scale. The results indicate that drought severity and duration will increase in the future.
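    The trend-detection step can be sketched with a minimal Mann-Kendall test (no-ties variance formula) applied to an SPI series. The series below is synthetic, drifting toward drier conditions purely for illustration:

```python
import numpy as np

def mann_kendall(x):
    """Mann-Kendall trend test: returns the S statistic and normal z-score.
    Negative z indicates a downward trend (drying, for an SPI series)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S = sum of signs over all pairs (i < j).
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance assuming no ties
    if s > 0:
        z = (s - 1) / var_s ** 0.5
    elif s < 0:
        z = (s + 1) / var_s ** 0.5
    else:
        z = 0.0
    return s, z

# Illustrative 37-year SPI series drifting toward drier conditions.
rng = np.random.default_rng(1)
spi = np.linspace(0.5, -1.0, 37) + rng.normal(0, 0.3, 37)
s, z = mann_kendall(spi)
print(f"S = {s}, z = {z:.2f}")  # z < -1.96 => significant drying at the 5% level
```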

  9. Infodemiological data concerning silicosis in the USA in the period 2004-2010 correlating with real-world statistical data.

    PubMed

    Bragazzi, Nicola Luigi; Dini, Guglielmo; Toletone, Alessandra; Brigo, Francesco; Durando, Paolo

    2017-02-01

    This article reports data concerning silicosis-related web activities captured with Google Trends (GT), reflecting Internet behavior in the USA for the period 2004-2010. GT-generated data were then compared with the most recent available epidemiological data on silicosis mortality obtained from the Centers for Disease Control and Prevention for the same study period. Statistically significant correlations with epidemiological data on silicosis (r=0.805, p-value <0.05) and other related web searches were found. The temporal trend correlated well with the epidemiological data, as did the geospatial distribution of the web activities with the geographic epidemiology of silicosis.
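    The correlation analysis amounts to a Pearson test between two yearly series. The numbers below are hypothetical placeholders standing in for search volume and mortality counts, not the study's data:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical yearly values for 2004-2010 (illustrative only):
# normalized Google Trends search volume and silicosis deaths.
gt_volume = np.array([100, 92, 85, 80, 74, 70, 66])
deaths    = np.array([166, 159, 151, 132, 123, 112, 101])

r, p = pearsonr(gt_volume, deaths)
print(f"r = {r:.3f}, p = {p:.4f}")
```

    With only seven yearly points, a high r is needed for significance; infodemiological studies of this kind are therefore sensitive to the length of the overlap period between web and surveillance data.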

  10. Statistical analysis of annual total ozone extremes for the period 1964-1988

    NASA Technical Reports Server (NTRS)

    Krzyscin, Janusz W.

    1994-01-01

    Annual extremes of the total column amount of ozone (in the period 1964-1988) from a network of 29 Dobson stations have been examined using extreme value analysis. The extremes have been calculated as the highest deviation of daily mean total ozone from its long-term monthly mean, normalized by the monthly standard deviations. The extremes have been selected from the direct-Sun total ozone observations only. Extremes resulting from abrupt changes in ozone (day-to-day changes greater than 20 percent) have not been considered. The ordered extremes (maxima in ascending order, minima in descending order) have been fitted to one of the three forms of the Fisher-Tippett extreme value distribution by the nonlinear least-squares method (Levenberg-Marquardt method). We have found that the ordered extremes from a majority of Dobson stations lie close to Fisher-Tippett type III. The extreme value analysis of the composite annual extremes (combined from averages of the annual extremes selected at individual stations) has shown that the composite maxima are fitted by the Fisher-Tippett type III and the composite minima by the Fisher-Tippett type I. The difference between the Fisher-Tippett types of the composite extremes seems to be related to the downward ozone trend. Extreme value prognoses for the period 1964-2014 (derived from the data taken at all analyzed stations, the North American stations, and the European stations) have revealed that the prognostic extremes are close to the largest annual extremes in the period 1964-1988 and that there are only small regional differences in the prognoses.
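    Fitting a Fisher-Tippett (generalized extreme value) distribution and reading off the type from the shape parameter can be sketched with scipy. The sample is simulated, and the mapping from scipy's shape convention to the Fisher-Tippett types is noted in the comments:

```python
import numpy as np
from scipy.stats import genextreme

# Simulated normalized annual extremes (illustrative, not ozone data).
rng = np.random.default_rng(2)
sample = genextreme.rvs(c=0.3, loc=2.0, scale=0.5, size=500, random_state=rng)

# Fit the generalized extreme value (Fisher-Tippett) distribution.
# In scipy's convention: shape c > 0 <=> type III (Weibull, bounded tail),
# c = 0 <=> type I (Gumbel), c < 0 <=> type II (Frechet).
c, loc, scale = genextreme.fit(sample)
print(f"shape c = {c:.2f} (type III if clearly positive)")

# 50-year return level: the quantile exceeded once in 50 years on average.
level_50yr = genextreme.ppf(1 - 1 / 50, c, loc, scale)
print(f"50-year return level ~ {level_50yr:.2f}")
```

    The abstract's nonlinear least-squares fit to the ordered extremes is an older alternative to the maximum-likelihood fit used here; both recover the same three-type family.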

  11. Statistics of Long-Period Gas Giant Planets in Known Planetary Systems

    NASA Astrophysics Data System (ADS)

    Levesque Bryan, Marta; Knutson, Heather; Howard, Andrew; Ngo, Henry; Batygin, Konstantin; Crepp, Justin; Fulton, Benjamin; Hinkley, Sasha; Isaacson, Howard T.; Johnson, John Asher; Marcy, Geoffrey W.; Wright, Jason

    2015-12-01

    We conducted a Doppler survey at Keck, combined with NIRC2 K-band AO imaging, to search for massive, long-period companions to 123 known exoplanet systems with one or two planets detected using the radial velocity (RV) method. Our survey is sensitive to Jupiter-mass planets out to 20 AU for a majority of the stars in our sample, and we report the discovery of eight new long-period planets in addition to 20 RV trends at 3 sigma significance indicating the presence of an outer companion beyond 5 AU. We combined our RV observations with AO imaging to determine the range of allowed masses and orbital separations for these companions and fit this population with a power law in mass and semi-major axis. We estimate the total occurrence rate of companions in our sample and find that hot and warm gas giants inside 1 AU are more likely to have an outer companion than cold gas giants. We also find that planets with an outer companion have higher average eccentricities than their single counterparts, suggesting that dynamical interactions between planets may play an important role in these systems.

  12. Detecting multiple periodicities in observational data with the multifrequency periodogram - I. Analytic assessment of the statistical significance

    NASA Astrophysics Data System (ADS)

    Baluev, Roman V.

    2013-11-01

    We consider the `multifrequency' periodogram, in which the putative signal is modelled as a sum of two or more sinusoidal harmonics with independent frequencies. It is useful in cases when the data may contain several periodic components, especially when their interaction with each other and with the data sampling patterns might produce misleading results. Although the multifrequency statistic itself was constructed earlier, for example by G. Foster in his CLEANest algorithm, its probabilistic properties (the detection significance levels) are still poorly known and much of what is deemed known is not rigorous. These detection levels are nonetheless important for data analysis. We argue that to prove the simultaneous existence of all n components revealed in a multiperiodic variation, it is mandatory to apply at least 2n - 1 significance tests, among which most involve various multifrequency statistics, and only n tests are single-frequency ones. The main result of this paper is an analytic estimation of the statistical significance of the frequency tuples that the multifrequency periodogram can reveal. Using the theory of extreme values of random fields (the generalized Rice method), we find a useful approximation to the relevant false alarm probability. For the double-frequency periodogram, this approximation is given by the elementary formula (π/16)W^{2}z^{2}e^{-z}, where W denotes the normalized width of the settled frequency range, and z is the observed periodogram maximum. We carried out intensive Monte Carlo simulations to show that the practical quality of this approximation is satisfactory. A similar analytic expression for the general multifrequency periodogram is also given, although with less numerical verification.
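    The quoted approximation is directly computable. A minimal sketch, with illustrative values assumed for the periodogram maximum z and normalized frequency width W:

```python
import math

def double_freq_fap(z, W):
    """Approximate false alarm probability for the double-frequency
    periodogram: FAP ~ (pi/16) * W**2 * z**2 * exp(-z), where z is the
    observed periodogram maximum and W the normalized width of the
    settled frequency range."""
    return (math.pi / 16) * W ** 2 * z ** 2 * math.exp(-z)

# Example: maximum z = 30 found over a normalized width W = 100.
fap = double_freq_fap(30, 100)
print(f"FAP ~ {fap:.2e}")  # ~1.7e-07
```

    The W² prefactor is the multiple-testing penalty: widening the searched frequency range quadratically inflates the false alarm probability for a fixed peak height z.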

  13. Statistically-Averaged Rate Equations Obtained in Kinetic Description of Intense Nonneutral Beam Propagation Through a Periodic Solenoidal Focusing Field

    NASA Astrophysics Data System (ADS)

    Lee, W. Wei-Li; Davidson, Ronald C.; Stoltz, Peter

    1997-11-01

    This paper presents a detailed formulation and analysis of the rate equations for statistically-averaged quantities for an intense nonneutral beam propagating through a periodic solenoidal focusing field B^{sol}(x) = B_z(z)ê_z - (1/2)B'_z(z)(x ê_x + y ê_y), where B_z(z+S) = B_z(z), and S = const. is the axial periodicity length. The analysis assumes a thin beam with characteristic beam radius r_b << S, and is based on the nonlinear Vlasov-Maxwell equations. Particularly important in experimental applications and in numerical simulation schemes, such as the nonlinear δf scheme (Q. Qian, W. Lee, and R. Davidson, Phys. Plasmas 4, 1915 (1997)), is an understanding of the self-consistent nonlinear evolution of various quantities averaged over the distribution of beam particles f_b(x,p,t). Self-consistent rate equations are derived for the nonlinear evolution of the mean-square beam radius ⟨r^{2}⟩, the mean kinetic energy (1/2)⟨p^{2}⟩, the field energy ɛ_F(z), the unnormalized beam emittance ɛ(z), the center-of-mass motion, etc., and the nonlinear beam dynamics is analysed over a wide range of system parameters.

  14. Planetary populations in the mass-period diagram: A statistical treatment of exoplanet formation and the role of planet traps

    SciTech Connect

    Hasegawa, Yasuhiro; Pudritz, Ralph E. E-mail: pudritz@physics.mcmaster.ca

    2013-11-20

    The rapidly growing number of observed exoplanets has revealed the existence of several distinct planetary populations in the mass-period diagram. Two of the most surprising are (1) the concentration of gas giants around 1 AU and (2) the accumulation of a large number of low-mass planets with tight orbits, also known as super-Earths and hot Neptunes. We have recently shown that protoplanetary disks have multiple planet traps that are characterized by orbital radii in the disks and halt rapid type I planetary migration. By coupling planet traps with the standard core accretion scenario, we showed that one can account for the positions of planets in the mass-period diagram. In this paper, we demonstrate quantitatively that most gas giants formed at planet traps tend to end up around 1 AU, with most of these being contributed by dead zones and ice lines. We also show that a large fraction of super-Earths and hot Neptunes are formed as 'failed' cores of gas giants, this population being constituted by comparable contributions from dead zone and heat transition traps. Our results are based on the evolution of forming planets in an ensemble of disks where we vary only the lifetimes of disks and their mass accretion rates onto the host star. We show that a statistical treatment of the evolution of a large population of planetary cores caught in planet traps accounts for the existence of three distinct exoplanetary populations: the hot Jupiters, the more massive planets around r = 1 AU, and the short-period super-Earths and hot Neptunes. There are very few populations that feed into the large orbital radii characteristic of imaged Jovian planets, which agrees with recent surveys. Finally, we find that low-mass planets in tight orbits become the dominant planetary population for low-mass stars (M_* ≤ 0.7 M_☉).

  15. Statistical analysis and multi-instrument overview of the quasi-periodic 1-hour pulsations in Saturn's outer magnetosphere

    NASA Astrophysics Data System (ADS)

    Palmaerts, B.; Roussos, E.; Krupp, N.; Kurth, W. S.; Mitchell, D. G.; Yates, J. N.

    2016-06-01

    The in-situ exploration of the magnetospheres of Jupiter and Saturn has revealed various periodic processes. In particular, in the Saturnian magnetosphere, several studies have reported pulsations in the outer magnetosphere with a periodicity of about 1 h in measurements of charged particle fluxes, plasma waves, magnetic field strength, and auroral emission brightness. The Low-Energy Magnetospheric Measurement System detector of the Magnetospheric Imaging Instrument (MIMI/LEMMS) on board Cassini regularly detects 1-hour quasi-periodic enhancements in the intensities of electrons with energies from a hundred keV to several MeV. We extend an earlier survey of these relativistic electron injections using 10 years of LEMMS observations in addition to context measurements by several other Cassini magnetospheric experiments. The one-year extension of the data and a different method of detecting the injections do not lead to a discrepancy with the results of the previous survey, indicating an absence of long-term temporal evolution of this phenomenon. We identified 720 pulsed events in the outer magnetosphere over a wide range of latitudes and local times, revealing that this phenomenon is common and frequent in Saturn's magnetosphere. However, the distribution of the injection events presents a strong local time asymmetry, with ten times more events on the dusk side than on the dawn side. In addition to the study of their topology, we present a first statistical analysis of the properties of the pulsed events. The morphology of the pulsations shows a weak local time dependence, which could imply a high-latitude acceleration source. We provide some clues that the electron population associated with this pulsed phenomenon is distinct from the field-aligned electron beams previously observed in Saturn's magnetosphere, but both populations can be mixed. 
We have also investigated the signatures of each electron injection event in the observations acquired by the Radio

  16. Cape Kennedy Wind Component Statistics Monthly and Annual Reference Periods for All Flight Azimuths from 0 to 70 KM Altitude

    NASA Technical Reports Server (NTRS)

    Brown, S. C.

    1969-01-01

    Head-, tail-, and cross-wind component speeds for Cape Kennedy are tabulated for all flight azimuths for altitudes from 0 to 70 kilometers by monthly and annual reference periods. Wind speeds are given for 11 selected percentiles ranging from 0.135 percent to 99.865 percent for each reference period.
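    Tabulating wind-component percentiles of this kind is a short numpy exercise. The sample below is synthetic, and only seven of the catalog's 11 selected percentiles are shown; the chosen levels correspond to the ±1σ, ±2σ, and ±3σ points of a normal distribution plus the median:

```python
import numpy as np

# Illustrative monthly sample of head-wind component speeds [m/s] at one
# altitude (synthetic; the catalog tabulates measured winds 0-70 km).
rng = np.random.default_rng(3)
headwind = rng.normal(20, 8, 5000)

percentiles = [0.135, 2.28, 15.9, 50.0, 84.1, 97.72, 99.865]
table = np.percentile(headwind, percentiles)
for p, v in zip(percentiles, table):
    print(f"{p:7.3f}% : {v:6.1f} m/s")
```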

  17. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors.

    PubMed

    Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin

    2015-12-15

    As random events, natural disasters have complex occurrence mechanisms. The comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. In view of the importance of, and deficiencies in, multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors and that the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails of the hazard factors. However, for dust storm disasters with short return periods, three-dimensional joint return period simulation shows no obvious advantage. If the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest the multivariate analysis method may be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters.

  18. 50-Minute Experiment: Soil Analysis for High School Chemistry Students.

    ERIC Educational Resources Information Center

    Baruch, Gerard, Ed.; And Others

    1980-01-01

    Lists equipment and materials needed and procedures for analyzing soil, in which secondary school students experience practical applications to acid-base reactions, pH, oxidation-reduction, precipitation and solubility. (CS)

  19. Dynamical and statistical phenomena of circulation and heat transfer in periodically forced rotating turbulent Rayleigh-Bénard convection

    NASA Astrophysics Data System (ADS)

    Sterl, Sebastian; Li, Hui-Min; Zhong, Jin-Qiang

    2016-12-01

    In this paper, we present results from an experimental study into turbulent Rayleigh-Bénard convection forced externally by periodically modulated unidirectional rotation rates. We find that the azimuthal rotation velocity θ̇(t) and thermal amplitude δ(t) of the large-scale circulation (LSC) are modulated by the forcing, exhibiting a variety of dynamics including increasing phase delays and a resonant peak in the amplitude of θ̇(t). We also focus on the influence of modulated rotation rates on the frequency of occurrence η of stochastic cessation or reorientation events, and on the interplay between such events and the periodically modulated response of θ̇(t). Here we identify a mechanism by which η can be amplified by the modulated response, and these normally stochastic events can occur with high regularity. We provide a modeling framework that explains the observed amplitude and phase responses, and we extend this approach to make predictions for the occurrence of cessation events and the probability distributions of θ̇(t) and δ(t) during different phases of a modulation cycle, based on an adiabatic approach that treats each phase separately. Last, we show that such periodic forcing has consequences beyond influencing LSC dynamics, by investigating how it can modify the heat transport even under conditions where the Ekman pumping effect is predominant and strong enhancement of heat transport occurs. We identify phase and amplitude responses of the heat transport, and we show how increased modulations influence the average Nusselt number.

  20. Statistical distribution and size effect of residual strength of quasibrittle materials after a period of constant load

    NASA Astrophysics Data System (ADS)

    Salviato, Marco; Kirane, Kedar; Bažant, Zdeněk P.

    2014-03-01

    In preceding studies, the type of cumulative probability distribution functions (cdf) of strength and of static lifetime of quasibrittle structures, including their tails, was mathematically derived from atomistic scale arguments based on nano-scale cracks propagating by many small, activation energy-controlled, random breaks of atomic bonds in the nanostructure. It was shown that a quasibrittle structure (of positive geometry) must be modeled by a finite (rather than infinite) weakest-link model, and that the cdf of structural strength as well as lifetime varies from nearly Gaussian to Weibullian as a function of structure size and shape. Excellent agreement with the observed distributions of structural strength and static lifetime was demonstrated. Based on the same theoretical framework, the present paper formulates the statistics of the residual structural strength, which is the strength after the structure has been subjected to sustained loading. A strength degradation equation is derived based on Evans' law for static crack growth during sustained loading. It is shown that the rate of strength degradation is not constant but continuously increasing. The cdf of residual strength of one RVE is shown to be closely approximated by a graft of Weibull and Gaussian (normal) distributions. In the left tail, the cdf is a three-parameter Weibull distribution consisting of the (n+1)th power of the residual strength, where n is the exponent of the Evans law and the threshold is a function of the applied load and load duration. The finiteness of the threshold, which is typically very small, is a new feature of quasibrittle residual strength statistics, contrasting with the previously established absence of a threshold for strength and lifetime. Its cause is that there is a non-zero probability that some specimens fail during the static preloading, and thus are excluded from the statistics of the overload. 
The predictions of the theory are validated by available test data.

  1. Statistical methods for a three-period crossover design in which high dose cannot be used first.

    PubMed

    Peace, K E; Koch, G G

    1993-03-01

    Design and analysis methods for the three-period crossover trial defined by the sequences: (D0, D1, D2), (D1, D0, D2), and (D1, D2, D0), where D0 is a placebo, and D1 and D2 are a low dose and a high dose of a drug, respectively, are developed. This design may be used when investigators are unwilling to administer a higher dose of a new drug to a patient before administering a lower dose. In using this design, patients should be randomized to sequences in blocks that are integer multiples of 3. Both parametric and non-parametric analysis methods are based on contrasts that capture intrapatient variability only and provide unbiased estimates and hypothesis tests of pairwise differences between carryover, direct dose, and period effects. The design and methods are illustrated with data reflecting the cognitive component of the Alzheimer's disease assessment scale collected in a large clinical trial of Tacrine at doses of 0, 40, and 80 mg/day.

  2. Monitoring Radiofrequency Ablation Using Ultrasound Envelope Statistics and Shear Wave Elastography in the Periablation Period: An In Vitro Feasibility Study

    PubMed Central

    Tsui, Po-Hsiang; Wang, Chiao-Yin; Zhou, Zhuhuang; Wan, Yung-Liang

    2016-01-01

    Radiofrequency ablation (RFA) is a minimally invasive method for treating tumors. Shear wave elastography (SWE) has been widely applied in evaluating tissue stiffness and final ablation size after RFA. However, the usefulness of periablation SWE imaging in assessing RFA remains unclear. Therefore, this study investigated the correlation between periablation SWE imaging and final ablation size. An in vitro porcine liver model was used for experimental validation (n = 36). During RFA with a power of 50 W, SWE images were collected using a clinical ultrasound system. To evaluate the effects of tissue temperature and gas bubbles during RFA, changes in the ablation temperature were recorded, and image echo patterns were measured using B-mode and ultrasound statistical parametric images. After RFA, the gross pathology of each tissue sample was compared with the region of change in the corresponding periablation SWE image. The experimental results showed that the tissue temperature at the ablation site varied between 70°C and 100°C. Hyperechoic regions and changes were observed in the echo amplitude distribution induced by gas bubbles. Under this condition, the confounding effects (including the temperature increase, tissue stiffness increase, and presence of gas bubbles) resulted in artifacts in the periablation SWE images, and the corresponding region correlated with the estimated final ablation size obtained from the gross pathology (r = 0.8). The findings confirm the feasibility of using periablation SWE imaging in assessing RFA. PMID:27603012

  3. Climate change scenarios of temperature and precipitation over five Italian regions for the period 2021-2050 obtained by statistical downscaling models

    NASA Astrophysics Data System (ADS)

    Tomozeiu, R.; Tomei, F.; Villani, G.; Pasqui, M.

    2010-09-01

    Climate change scenarios of seasonal maximum and minimum temperature and precipitation in five Italian regions are assessed for the period 2021-2050 against 1961-1990. The regions, selected by the AGROSCENARI project for their importance to local agricultural practice, are situated as follows: in northern Italy, the Po valley and the hilly area of Faenza; in central Italy, Marche, Beneventano and Destra Sele; and on the island of Sardinia, Oristano. A statistical downscaling technique applied to the ENSEMBLES global climate simulations, A1B scenario, is used to reach this objective. The method consists of a multivariate regression, based on Canonical Correlation Analysis, using as possible predictors mean sea level pressure, geopotential height at 500 hPa and temperature at 850 hPa. The observational data set (predictands) for the selected regions is composed of a reconstruction of daily minimum and maximum temperature and precipitation data on a regular grid with a spatial resolution of 35 km, for the period 1951-2009 (managed by the Meteorological and Climatological research unit for agriculture of the Agricultural Research Council, CRA - CMA). First, the statistical model was set up using predictors from the ERA40 reanalysis and the seasonal indices of temperature and precipitation at local scale, for the period 1958-2002. Then, the statistical downscaling model was applied to the predictors derived from the ENSEMBLES global climate models, A1B scenario, in order to obtain climate change scenarios of temperature and precipitation at local scale for the period 2021-2050. The projections show that increases could be expected to occur under scenario conditions in all seasons, in both minimum and maximum temperature. The magnitude of the changes is more intense during summer, when the changes could reach values around 2°C for minimum and maximum temperature. In the case of precipitation, the pattern of changes is more complex, differing from season to season and over the regions, a

  4. Sensing propagation events and fade statistics at C-band for two over-water, line-of-sight propagation paths over a one year period

    NASA Technical Reports Server (NTRS)

    Goldhirsh, Julius; Dockery, G. Daniel; Musiani, Bert H.

    1992-01-01

    We examine signal fading statistics over a one-year period for two over-water, line-of-sight propagation links on the mid-Atlantic coast of the US. The links consist of a transmitter on a tower at Parramore Island, VA, operating at 4.7 GHz, sending simultaneous cw signals to two receiver systems located on a lighthouse and a lookout tower on Assateague Beach, VA, at distances of 44 and 39 km, respectively. The receiving sites are separated by approximately 5 km. Cumulative fade distributions corresponding to yearly, monthly, and diurnal time scales were derived. Fade duration statistics corresponding to sustained attenuation events were also derived. These events, which were arbitrarily defined as having fades relative to free-space power in excess of 20 dB for durations of two hours or more, are believed to be generally due to subrefraction. Analysis of synoptic weather conditions and nearby rawinsonde data during two sustained deep fading periods showed atmospheric conditions consistent with extreme subrefraction, where the refractivity-height profile had a positive lapse rate. The efficacy of employing the links as indicators of real-time atmospheric propagation conditions was also demonstrated by a telephone call-up procedure which enabled displays of time series of the fading to be generated at remote locations.

  5. Statistical modelling of the main features of the Artemisia pollen season in Wrocław, Poland, during the 2002-2011 time period

    NASA Astrophysics Data System (ADS)

    Drzeniecka-Osiadacz, Anetta; Krynicka, Justyna; Malkiewicz, Małgorzata; Klaczak, Kamilla; Migała, Krzysztof

    2015-02-01

    The aim of this article is to present statistical forecasting models concerning the dynamics of Artemisia pollen seasons in Wrocław, including the start and end, the date of maximum pollen concentration and the seasonal pollen index (SPI). For statistical evaluation, use was made of aerobiological and meteorological data from the last 10 years (2002-2011). Based on these data, agroclimatic indicators, i.e. crop heat units (CHUs), were determined for various averaging periods. The beginning of the Artemisia pollen season in the studied time period, on average, took place on 23 June. Its length usually varied between 26 and 45 days, and maximum daily concentrations occurred between 31 July and 18 August. It was found that the beginning of the pollen season depends, above all, on the values of CHUs and the photothermal unit (PTU) (p < 0.05) in the period from March to June, for various thermal thresholds. The date of maximum daily concentration correlates with sunshine duration, PTU and air temperature for June and July (p < 0.05). On the other hand, SPI is connected with thermal variables, i.e. average, maximum and minimum air temperatures, CHUs and the heliothermal unit (HTU) for July (p < 0.05) and the beginning of spring. Based on the correlation analysis and the chosen variables, regression models for the beginning date of the Artemisia pollen season and the SPI were prepared, which were then verified by using leave-one-out cross-validation. A better fit between modelled and actual values was found for the analysis concerning the season start date than for the SPI.
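    Leave-one-out cross-validation of such a regression can be sketched as follows. The predictor (a CHU-like thermal index), the linear model form, and the ten synthetic seasons are assumptions for illustration, not the study's data or model:

```python
import numpy as np

def loo_cv_rmse(X, y):
    """Leave-one-out cross-validation RMSE for ordinary least squares:
    refit with each observation held out, predict it, pool the errors."""
    n = len(y)
    errors = []
    for i in range(n):
        mask = np.arange(n) != i
        beta, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        errors.append(y[i] - X[i] @ beta)
    return float(np.sqrt(np.mean(np.square(errors))))

# Illustrative: predict pollen season start (day of year) from an
# accumulated thermal index over ten seasons.
rng = np.random.default_rng(4)
chu = rng.normal(1500, 120, 10)
start_day = 174 - 0.02 * (chu - 1500) + rng.normal(0, 2, 10)

X = np.column_stack([np.ones(10), chu])
rmse = loo_cv_rmse(X, start_day)
print(f"LOO-CV RMSE ~ {rmse:.1f} days")
```

    With only ten seasons, leave-one-out is the natural resampling choice: each fold still uses nine of the ten years for fitting.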

  6. A statistical analysis of energy and power demand for the tractive purposes of an electric vehicle in urban traffic - an analysis of a short and long observation period

    NASA Astrophysics Data System (ADS)

    Slaski, G.; Ohde, B.

    2016-09-01

    The article presents the results of a statistical dispersion analysis of the energy and power demand for tractive purposes of a battery electric vehicle. The authors compare the data distributions for different values of average speed in two approaches: a short and a long period of observation. The short period of observation (generally around several hundred meters) follows from a previously proposed macroscopic energy consumption model based on average speed per road section. This approach yielded high values of the standard deviation and of the coefficient of variation (the ratio between standard deviation and mean), around 0.7-1.2. The long period of observation (several kilometers) is similar in length to the standardized speed cycles used in testing vehicle energy consumption and available range. The data were analysed to determine the impact of observation length on the variation of energy and power demand. The analysis was based on a simulation of electric power and energy consumption performed with speed profile data recorded in the Poznan agglomeration.
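
    The coefficient of variation defined above (standard deviation over mean), and its decrease as observation windows lengthen, can be illustrated with synthetic per-section energy demands. The gamma-distributed data and the 20-section aggregation are assumptions chosen for illustration, not the study's data:

```python
import numpy as np

def coefficient_of_variation(samples):
    """Ratio of standard deviation to mean, as used in the dispersion analysis."""
    samples = np.asarray(samples, dtype=float)
    return float(np.std(samples) / np.mean(samples))

rng = np.random.default_rng(0)
# Hypothetical per-section energy demand (Wh) over many short road sections
short_sections = rng.gamma(shape=1.2, scale=50.0, size=1000)
# Summing 20 consecutive sections approximates a long observation window
long_sections = short_sections.reshape(50, 20).sum(axis=1)

cv_short = coefficient_of_variation(short_sections)   # roughly 0.7-1.2
cv_long = coefficient_of_variation(long_sections)     # much smaller
```

    Aggregation shrinks the coefficient of variation roughly as 1/sqrt(m) for m independent sections, which is one way to see why longer observation periods show less dispersion.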

  7. Statistical analysis of test-day milk yields using random regression models for the comparison of feeding groups during the lactation period.

    PubMed

    Mielenz, Norbert; Spilke, Joachim; Krejcova, Hana; Schüler, Lutz

    2006-10-01

    Random regression models are widely used in the field of animal breeding for the genetic evaluation of daily milk yields from different test days. These models can handle different environmental effects on the respective test day, and they describe the course of the lactation period using suitable covariates with fixed and random regression coefficients. As the numerically expensive estimation of parameters is already part of advanced computer software, modifications of random regression models will grow considerably in importance for statistical evaluations of nutrition and behaviour experiments with animals. Random regression models belong to the large class of linear mixed models. Thus, when choosing a model, or more precisely, when selecting a suitable covariance structure for the random effects, the information criteria of Akaike and Schwarz can be used. In this study, the fitting of random regression models for the statistical analysis of a feeding experiment with dairy cows is illustrated using the program package SAS. For each of the feeding groups, lactation curves modelled by covariates with fixed regression coefficients are estimated simultaneously. With the help of the fixed regression coefficients, differences between the groups are estimated and then tested for significance. The covariance structure of the random, subject-specific effects and the serial correlation matrix are selected using information criteria and estimated correlations between repeated measurements. For verification of the selected model against the alternative models, means and standard deviations estimated from ordinary least squares residuals are used.
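
    The Akaike and Schwarz (BIC) criteria used for model selection can be sketched outside SAS as well. The simulated test-day yields and the `aic_bic` helper below are illustrative assumptions, comparing only fixed-effects lactation curves rather than full random regression models:

```python
import numpy as np

def aic_bic(y, y_hat, k):
    """Gaussian AIC and BIC from the residuals of a fitted model with k
    mean parameters (constants common to all candidate models dropped)."""
    n = len(y)
    rss = float(np.sum((np.asarray(y) - np.asarray(y_hat)) ** 2))
    aic = n * np.log(rss / n) + 2 * k
    bic = n * np.log(rss / n) + k * np.log(n)
    return aic, bic

# Hypothetical test-day milk yields (kg) over a lactation (days in milk)
dim = np.arange(5, 305, 10, dtype=float)
rng = np.random.default_rng(1)
yield_kg = 20 + 0.1 * dim - 0.0004 * dim**2 + rng.normal(0, 0.8, dim.size)

# Compare a linear and a quadratic lactation-curve model
lin = np.polyval(np.polyfit(dim, yield_kg, 1), dim)
quad = np.polyval(np.polyfit(dim, yield_kg, 2), dim)
aic_lin, bic_lin = aic_bic(yield_kg, lin, k=2)
aic_quad, bic_quad = aic_bic(yield_kg, quad, k=3)
```

    Lower values of either criterion favour a model; BIC penalizes extra parameters more heavily, which matters when candidate covariance structures differ greatly in size.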

  8. Autopsy statistics on the relative frequency of acute myocardial infarction in the Japanese mental workers and the unemployed during the two oil-crises periods.

    PubMed

    Chang, N C; Kawai, S; Okada, R

    1989-03-20

    In order to investigate whether job-induced emotional stress arising from socioenvironmental disasters could act as a trigger for the onset of acute myocardial infarction (AMI), the author reviewed all pathological autopsies throughout Japan of subjects aged 15 years and over. Data were obtained from the "Annual of the Pathological Autopsy Cases in Japan" for the years 1966-1968 (a period of high economic growth), 1973-1975 (1974 being the year of the first oil crisis), and 1978-1980 (1979 being the year of the second oil crisis). Relative frequencies of AMI were significantly higher during the years of both oil crises than in both the preceding and following years (2.6% in 1973, 3.7% in 1974, 3.0% in 1975; 2.8% in 1978, 3.2% in 1979, and 2.0% in 1980), and than in each of the three years of the high economic growth period (1.9-2.2% in 1966-1968). The proportions of managers and officials among AMI victims were significantly higher in the years of both oil crises than in both the preceding and following years (13.4% in 1973, 17.5% in 1974, 12.5% in 1975; 11.6% in 1978, 15.8% in 1979, and 11.1% in 1980). Moreover, the proportion was significantly higher in the year of the first oil crisis than in each of the three years of the high economic growth period (11.7-13.1% in 1966-1968). The proportions of "out of job" persons were also significantly higher in the years of both oil crises than in the preceding years (24.0% in 1973, 29.3% in 1974, 27.8% in 1975; 24.1% in 1978, 29.1% in 1979, and 27.4% in 1980). For 11,199 randomly selected autopsies, the proportions of AMI in the above two occupational groups were significantly higher in the years of both oil crises than in the preceding years. Moreover, the proportion of "out of job" persons was significantly higher in the year of the first oil crisis than in each of the three years of the high economic growth period.
A similar trend was noted among professional and technical workers, with more AMI occurring in this group during the years of both oil crises than in both the preceding

  9. Causes of death among detainees: a statistical study on the casework of the Forensic Medicine Institute in Cluj-Napoca during the period 2000–2014

    PubMed Central

    GHERMAN, CRISTIAN; CHIROBAN, OVIDIU

    2015-01-01

    Background and aims. The detainees' right to healthcare is granted by law, in accordance with EU directives and recommendations to which our country has consented. The prison population is a particularly vulnerable and marginalized group, characterized by mortality rates that differ from those of the general population. This study aims at providing a picture of the causes of death, the quality of healthcare, and the measures needed to reduce the number of in-prison deaths, including legal medicine expertise in view of sentence postponement/interruption. Methods. The present paper is based on a statistical analysis of the in-prison deaths casework recorded at the Forensic Medicine Institute of Cluj-Napoca and provided by the territorially subordinated county forensic services. The data collected cover 15 years (2000–2014), a period long enough for significant retrospective statistical analysis. Results. The total number of deaths among the inmates was 113, the majority male (110). Distribution by age group shows the greatest incidence among inmates aged 50 to 59 years (32 cases, 28.31%), followed by those in their 40s (30 cases, 26.54%) and 30s (25 cases, 22.12%). The most frequent pathological causes of death were cardiovascular (53 cases), followed by tumors (26 cases) and infectious diseases. A significant number of deaths were due to violent causes (14 cases, 12.38%). Conclusions. Special problems are raised by the high number of deaths among prisoners, especially at a young age, while the high frequency of violent deaths from self- or non-self-inflicted traumatic causes requires supervision, monitoring and continuous analysis. Despite recent improvements, healthcare in prisons still poses problems, mainly regarding the diagnosis and treatment of heart disease, neurosurgery and cancer. PMID:26609263

  10. Medicare program; statistical standards for evaluating intermediary performance during fiscal year 1982--Health Care Financing Administration. General notice with comment period.

    PubMed

    1982-10-05

    This is HCFA's annual notice containing the statistical standards to be used in evaluating the performance of fiscal intermediaries in the administration of the Medicare program for fiscal year 1982. The standards were developed from available statistical data contained in routine intermediary reports and consist of measures of the timeliness and cost of an intermediary's Medicare operations. The results of the evaluations are considered whenever we make, renew, or terminate an intermediary agreement; assign or reassign providers of services to an intermediary; or designate regional or national intermediaries.

  11. An analysis of the AVE-SESAME I period using statistical structure and correlation functions. [Atmospheric Variability Experiment-Severe Environmental Storm and Mesoscale Experiment

    NASA Technical Reports Server (NTRS)

    Fuelberg, H. E.; Meyer, P. J.

    1984-01-01

    Structure and correlation functions are used to describe atmospheric variability during the 10-11 April portion of AVE-SESAME 1979 that coincided with the Red River Valley tornado outbreak. The special mesoscale rawinsonde data are employed in calculations involving temperature, geopotential height, horizontal wind speed and mixing ratio. Functional analyses are performed in both the lower and upper troposphere for the composite 24 h experiment period and at the individual 3 h observation times. Results show that mesoscale features are prominent during the composite period. Fields of mixing ratio and horizontal wind speed exhibit the greatest amounts of small-scale variance, whereas temperature and geopotential height contain the least. Results for the nine individual times show that small-scale variance is greatest during the convective outbreak. The functions are also used to estimate random errors in the rawinsonde data. Finally, sensitivity analyses are presented to quantify confidence limits of the structure functions.
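
    A second-order structure function of the kind used in such analyses can be sketched as follows. The station positions and temperature field are hypothetical one-dimensional stand-ins for the mesoscale rawinsonde network:

```python
import numpy as np

def structure_function(values, positions, lag, tol):
    """Second-order structure function D(lag): the mean squared difference of a
    field between station pairs separated by approximately `lag` (within `tol`)."""
    diffs = []
    for i in range(len(values)):
        for j in range(i + 1, len(values)):
            d = abs(positions[j] - positions[i])
            if abs(d - lag) <= tol:
                diffs.append((values[j] - values[i]) ** 2)
    return float(np.mean(diffs))

# Hypothetical 1-D line of sites (km) with a smooth gradient plus small-scale noise
positions = np.arange(0.0, 500.0, 25.0)
rng = np.random.default_rng(2)
temperature = 15 + 0.01 * positions + rng.normal(0, 0.5, positions.size)

d_short = structure_function(temperature, positions, lag=25.0, tol=1.0)
d_long = structure_function(temperature, positions, lag=250.0, tol=1.0)
```

    The growth of D(lag) with separation distance carries the scale information: the small-lag intercept reflects observational (random) error, while large-lag values reflect real atmospheric variance, which is how structure functions are used to estimate random errors in rawinsonde data.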

  12. The impact of Mount Etna sulfur emissions on the atmospheric composition and aerosol properties in the central Mediterranean: A statistical analysis over the period 2000-2013 based on observations and Lagrangian modelling

    NASA Astrophysics Data System (ADS)

    Sellitto, Pasquale; Zanetel, Claudia; di Sarra, Alcide; Salerno, Giuseppe; Tapparo, Andrea; Meloni, Daniela; Pace, Giandomenico; Caltabiano, Tommaso; Briole, Pierre; Legras, Bernard

    2017-01-01

    The emission of gases and aerosols by volcanic activity may significantly impact atmospheric composition, cloud occurrence and properties, and the regional and global climate. While the effects of strong explosive (stratospheric) eruptions are relatively well known, limited information is available on the impacts of small to moderate volcanic activity, including passive degassing. In this paper, the downwind impact of Mount Etna's sulfur emissions on the central Mediterranean is investigated on a statistical basis over the period 2000-2013 using: (a) daily sulfur dioxide emission rates measured near the crater at Mount Etna with ground-based ultraviolet spectrophotometers, (b) Lagrangian trajectories and simulated plume dispersion obtained with the FLEXPART (FLEXible PARTicle dispersion) model, and (c) long-term observations of column SO2 concentration and the aerosol Ångström exponent α at Lampedusa (35.5° N, 12.6° E). This statistical analysis has allowed, for the first time, the characterization of the decadal impact of Mount Etna's sulfur emissions on sulfur dioxide and on aerosol microphysical/optical properties in the central Mediterranean. On average, statistically significantly higher SO2 concentrations and smaller aerosol sizes are present when air masses from Mount Etna overpass Lampedusa. Despite being upwind of Lampedusa only 5% of the time, Mount Etna is potentially responsible for up to 40% and 20% of the SO2 and α extreme values (exceedances of a fixed threshold), respectively, at this location. The most important factor determining this perturbation is the prevailing dynamics, while the magnitude of the SO2 emission rates from Mount Etna appears to be important only for relatively strong emissions. The observed perturbations to the aerosol size distribution are expected to produce a direct regional radiative effect in this area.

  13. Life outside the 50-Minute Hour: The Personal Lives of Counsellors

    ERIC Educational Resources Information Center

    Kennedy, Barbara Sampaio Alhanati; Black, Timothy G.

    2010-01-01

    This study investigates the effects that becoming and being a professional counsellor, including training and professional practice, can have on one's personal life. Semi-structured interviews were conducted with six professional counsellors, asking how their training and professional practice has affected their personal lives. Qualitative…

  14. Poverty and mental health practice: within and beyond the 50-minute hour.

    PubMed

    Goodman, Lisa A; Pugach, Meghan; Skolnik, Avy; Smith, Laura

    2013-02-01

    Despite the high and increasing prevalence of poverty in the United States, psychologists and allied professionals have done little to develop mental health interventions that are tailored to the specific sociocultural experiences of low-income families. In this article, we describe the sociocultural stressors that accompany the material deprivations of poverty, and the mental health difficulties to which they often give rise. Next, we outline the psychosocial and class-related issues surrounding low-income adults' access to and use of mental health services and suggest a conceptual framework to guide the modification of mental health practice to better accommodate poor peoples' complex needs. This framework describes opportunities for practice modification at three levels of intervention, beginning at the individual level of traditional individual psychotherapy and subsequently targeting increasingly broad contextual elements of poverty.

  15. Descriptive statistics.

    PubMed

    Shi, Runhua; McLarty, Jerry W

    2009-10-01

    In this article, we introduced basic concepts of statistics, type of distributions, and descriptive statistics. A few examples were also provided. The basic concepts presented herein are only a fraction of the concepts related to descriptive statistics. Also, there are many commonly used distributions not presented herein, such as Poisson distributions for rare events and exponential distributions, F distributions, and logistic distributions. More information can be found in many statistics books and publications.
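
    The basic descriptive statistics introduced in the article can be computed directly with the Python standard library; the sample values below are arbitrary:

```python
import statistics

# A small sample of measurements (arbitrary units)
data = [4.2, 5.1, 4.8, 6.0, 5.5, 4.9, 5.2, 7.3]

mean = statistics.mean(data)      # arithmetic average
median = statistics.median(data)  # middle value, robust to the outlier 7.3
stdev = statistics.stdev(data)    # sample standard deviation (n - 1 divisor)
```

    Note how the outlier 7.3 pulls the mean above the median; reporting both is a simple way to flag skewed distributions.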

  16. Statistical Software.

    ERIC Educational Resources Information Center

    Callamaras, Peter

    1983-01-01

    This buyer's guide to seven major types of statistics software packages for microcomputers reviews Edu-Ware Statistics 3.0; Financial Planning; Speed Stat; Statistics with DAISY; Human Systems Dynamics package of Stats Plus, ANOVA II, and REGRESS II; Maxistat; and Moore-Barnes' MBC Test Construction and MBC Correlation. (MBR)

  17. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    As a branch of knowledge, Statistics is ubiquitous and its applications can be found in (almost) every field of human endeavour. In this article, the authors track down the possible source of the link between the "Siren song" and applications of Statistics. Answers to their previous five questions and five new questions on Statistics are presented.

  18. Neural, pituitary, and mammary tumors in Sprague-Dawley rats treated with X irradiation to the head and N-ethyl-N-nitrosourea (ENU) during the early postnatal period: a statistical study of tumor incidence and survival

    SciTech Connect

    Mandybur, T.I.; Ormsby, I.; Samuels, S.; Mancardi, G.L.

    1985-03-01

    To study the late effects of early postnatal treatment with N-ethyl-N-nitrosourea (ENU) preceded by X irradiation to the head, 226 neonatal CD rats were divided into six groups which received the following treatment: (1) 500-rad X irradiation to the head on the third postnatal day (pnd); (2) injection ip with 30 mg/kg ENU on the fourth pnd; (3) injection ip with 30 mg/kg ENU on the seventh pnd; (4) a combination of 500-rad X irradiation to the head on the third pnd, followed by ip 30 mg/kg ENU on the fourth pnd; (5) a combination of 500-rad X irradiation to the head on the third pnd, followed by ip 30 mg/kg ENU on the seventh pnd; and (6) untreated controls. The results indicate that (1) X irradiation to the head alone significantly extended the life span of females compared to that of control females, and did not affect survival of males; (2) X irradiation did not influence the latent period of mortality from neurogenic tumors when ENU was given 1 or 3 days afterward; (3) ENU itself was a factor in shortening latent period for mammary tumors; (4) X irradiation alone did not increase the incidence of mammary tumors, and revealed no protective effect on the ENU-induced mammary carcinogenesis; (5) X irradiation increased the prevalence of pituitary tumors in the females; (6) no enhancement of pituitary tumors by ENU was observed; and (7) there was a statistically significant association of pituitary and mammary tumors in females.

  19. High-Resolution Dynamical Downscaling of ERA-Interim Using the WRF Regional Climate Model for the Area of Poland. Part 1: Model Configuration and Statistical Evaluation for the 1981-2010 Period

    NASA Astrophysics Data System (ADS)

    Kryza, Maciej; Wałaszek, Kinga; Ojrzyńska, Hanna; Szymanowski, Mariusz; Werner, Małgorzata; Dore, Anthony J.

    2017-02-01

    In this work, we present the results of high-resolution dynamical downscaling of air temperature, relative humidity, wind speed and direction, for the area of Poland, with the Weather Research and Forecasting (WRF) model. The model is configured using three nested domains, with spatial resolution of 45 km × 45 km, 15 km × 15 km and 5 km × 5 km. The ERA-Interim database is used for boundary conditions. The results are evaluated by comparison with station measurements for the period 1981-2010. The model is capable of reproducing the main climatological features of the study area. The results are in very close agreement with the measurements, especially for the air temperature. For all four meteorological variables, the model performance captures seasonal and daily cycles. For the air temperature and winter season, the model underestimates the measurements. For summer, the model shows higher values, compared with the measurements. The opposite is the case for relative humidity. There is a strong diurnal pattern in mean error, which changes seasonally. The agreement with the measurements is worse for the seashore and mountain areas, which suggests that the 5 km × 5 km grid might still have an insufficient spatial resolution. There is no statistically significant temporal trend in the model performance. The larger year-to-year changes in the model performance, e.g. for the years 1982 and 2010 for the air temperature should therefore be linked with the natural variability of meteorological conditions.
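
    The mean error (bias) and related evaluation statistics used in such comparisons can be sketched as follows. The paired model/station temperatures are hypothetical, with a winter cold bias built in to mirror the behaviour described:

```python
import numpy as np

def mean_error(model, obs):
    """Mean error (bias): negative values mean the model underestimates."""
    return float(np.mean(np.asarray(model) - np.asarray(obs)))

def rmse(model, obs):
    """Root-mean-square error between model output and station measurements."""
    return float(np.sqrt(np.mean((np.asarray(model) - np.asarray(obs)) ** 2)))

# Hypothetical daily winter air temperatures (deg C): station vs. model
obs = np.array([-3.0, -1.5, 0.2, -4.1, -2.2, 1.0, -0.5])
model = np.array([-3.8, -2.0, -0.3, -4.6, -3.0, 0.4, -1.1])

bias = mean_error(model, obs)  # negative: a winter cold bias
err = rmse(model, obs)
```

    Since RMSE is never smaller than the absolute bias, comparing the two separates the systematic error from the scatter around it.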

  20. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  1. Statistics Clinic

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  2. Quick Statistics

    MedlinePlus

    ... population, or about 25 million Americans, has experienced tinnitus lasting at least five minutes in the past ... by NIDCD Epidemiology and Statistics Program staff: (1) tinnitus prevalence was obtained from the 2008 National Health ...

  3. Learning Statistical Concepts

    ERIC Educational Resources Information Center

    Akram, Muhammad; Siddiqui, Asim Jamal; Yasmeen, Farah

    2004-01-01

    In order to learn the concept of statistical techniques one needs to run real experiments that generate reliable data. In practice, the data from some well-defined process or system is very costly and time consuming. It is difficult to run real experiments during the teaching period in the university. To overcome these difficulties, statisticians…

  4. Education Statistics Quarterly, 2003.

    ERIC Educational Resources Information Center

    Marenus, Barbara; Burns, Shelley; Fowler, William; Greene, Wilma; Knepper, Paula; Kolstad, Andrew; McMillen Seastrom, Marilyn; Scott, Leslie

    2003-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…

  5. Irregular Periods

    MedlinePlus

    ... number of days after the last one. The Menstrual Cycle Most girls get their first period between the ... to skip periods or to have an irregular menstrual cycle. Illness, rapid weight change, or stress can also ...

  6. Statistics Revelations

    ERIC Educational Resources Information Center

    Chicot, Katie; Holmes, Hilary

    2012-01-01

    The use, and misuse, of statistics is commonplace, yet in the printed format data representations can be either over simplified, supposedly for impact, or so complex as to lead to boredom, supposedly for completeness and accuracy. In this article the link to the video clip shows how dynamic visual representations can enliven and enhance the…

  7. Statistical Inference

    NASA Astrophysics Data System (ADS)

    Khan, Shahjahan

    Often scientific information on various data generating processes is presented in the form of numerical and categorical data. Except for some very rare occasions, such data generally represent a small part of the population, or selected outcomes of the data generating process. Although valuable and useful information is lurking in the array of scientific data, it is generally unavailable to the users. Appropriate statistical methods are essential to reveal the hidden "jewels" in the mass of raw data. Exploratory data analysis methods are used to uncover such valuable characteristics of the observed data. Statistical inference provides techniques to make valid conclusions about the unknown characteristics or parameters of the population from which scientifically drawn sample data are selected. Usually, statistical inference includes estimation of population parameters as well as tests of hypotheses on the parameters. However, prediction of future responses and determination of the prediction distributions are also part of statistical inference. Both Classical (or Frequentist) and Bayesian approaches are used in statistical inference. The commonly used Classical approach is based on the sample data alone. In contrast, the increasingly popular Bayesian approach uses a prior distribution on the parameters along with the sample data to make inferences. Non-parametric and robust methods are also used in situations where commonly used model assumptions are unsupported. In this chapter, we cover the philosophical and methodological aspects of both the Classical and Bayesian approaches. Moreover, some aspects of predictive inference are also included. In the absence of any evidence to support assumptions regarding the distribution of the underlying population, or if the variable is measured only on an ordinal scale, non-parametric methods are used. Robust methods are employed to avoid any significant changes in the results due to deviations from the model

  9. [Descriptive statistics].

    PubMed

    Rendón-Macías, Mario Enrique; Villasís-Keever, Miguel Ángel; Miranda-Novales, María Guadalupe

    2016-01-01

    Descriptive statistics is the branch of statistics that gives recommendations on how to summarize research data clearly and simply in tables, figures, charts, or graphs. Before performing a descriptive analysis it is paramount to define its goal or goals, and to identify the measurement scales of the different variables recorded in the study. Tables and charts aim to provide timely information on the results of an investigation. Graphs show trends and can be histograms, pie charts, "box and whiskers" plots, line graphs, or scatter plots. Images serve as examples to reinforce concepts or facts. The choice of a table, graph, or image must be based on the study objectives. Usually no more than seven such elements are recommended in an article, depending also on its length.

  10. Order Statistics and Nonparametric Statistics.

    DTIC Science & Technology

    2014-09-26

    Topics investigated include the following: the probability that a fuze will fire; moving order statistics; distribution theory and properties of the...problem posed by an Army scientist: a fuze will fire when at least n-1 (or n-2) of n detonators function within time span t. What is the probability of
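
    The fuze question reduces to a binomial tail probability. A minimal sketch, assuming independent detonators with a common success probability p (the values n = 4 and p = 0.9 are illustrative, not from the report):

```python
from math import comb

def p_at_least_k(n, k, p):
    """Probability that at least k of n independent detonators function,
    each with success probability p (binomial upper tail)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Fuze fires if at least n-1 of n detonators function within the time span
p_fire = p_at_least_k(n=4, k=3, p=0.9)  # = 4*(0.9**3)*0.1 + 0.9**4 = 0.9477
```

    Relaxing the requirement to n-2 of n raises reliability further, which is the design trade-off implicit in the Army scientist's question.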

  11. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I RIchard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  12. Education Statistics Quarterly, Fall 2000.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2000-01-01

    The "Education Statistics Quarterly" gives a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released during a 3-month period. Each message also contains a…

  13. Statistical Neurodynamics.

    NASA Astrophysics Data System (ADS)

    Paine, Gregory Harold

    1982-03-01

    The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better

  14. Period Pain

    MedlinePlus

    ... You may also have other symptoms, such as lower back pain, nausea, diarrhea, and headaches. Period pain is not ... Taking a hot bath Doing relaxation techniques, including yoga and meditation You might also try taking over- ...

  15. Periodized wavelets

    SciTech Connect

    Schlossnagle, G.; Restrepo, J.M.; Leaf, G.K.

    1993-12-01

    The properties of periodized Daubechies wavelets on [0,1] are detailed and contrasted against their counterparts which form a basis for L{sup 2}(R). Numerical examples illustrate the analytical estimates for convergence and demonstrate by comparison with Fourier spectral methods the superiority of wavelet projection methods for approximations. The analytical solution to inner products of periodized wavelets and their derivatives, which are known as connection coefficients, is presented, and several tabulated values are included.

  16. Stupid statistics!

    PubMed

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares is presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., the Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
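    As a minimal sketch of the matrix formulation of linear least squares described above (the function name and data are invented for illustration, not taken from the paper), the parameter estimates and their variance-covariance matrix can be computed as:

```python
import numpy as np

def linear_least_squares(X, y, sigma=None):
    """Linear least squares in matrix notation.

    Returns the parameter estimates b and their variance-covariance
    matrix V = (X^T W X)^{-1}, with W = diag(1/sigma^2) (W = I when
    no uncertainties are supplied).
    """
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    w = np.ones_like(y) if sigma is None else 1.0 / np.asarray(sigma, float) ** 2
    V = np.linalg.inv(X.T @ (w[:, None] * X))
    b = V @ (X.T @ (w * y))
    return b, V

# Straight-line calibration y = a + b*x: noise-free data recover the line.
x = np.array([0.0, 1.0, 2.0, 3.0])
X = np.column_stack([np.ones_like(x), x])
b, V = linear_least_squares(X, 1.5 + 2.0 * x)
print(b)
```

    The diagonal of V gives the parameter variances used in propagation of error.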

  17. Plastic Surgery Statistics

    MedlinePlus

    ... Plastic Surgery Statistics Plastic surgery procedural statistics from the ... Plastic Surgery Statistics 2005 Plastic Surgery Statistics 2016 Plastic Surgery Statistics Stats Report 2016 National Clearinghouse of ...

  18. Statistical evaluation of forecasts.

    PubMed

    Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J; Timmer, Jens; Schelter, Björn

    2014-08-01

    Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.
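    The validation idea above can be illustrated with a toy calculation (this is a simplified sketch, not the authors' analytic framework; the function name, the discretized alarm/event representation, and the horizon convention are assumptions for illustration):

```python
import numpy as np

def forecast_performance(alarms, events, horizon):
    """Hit rate and false-alarm rate of a binary alarm sequence.

    alarms, events: boolean arrays on a common time grid. An event counts
    as predicted if an alarm was raised within `horizon` steps before it;
    an alarm is false if no event follows within `horizon` steps.
    """
    alarms = np.asarray(alarms, bool)
    events = np.asarray(events, bool)
    hits = sum(alarms[max(0, t - horizon):t].any()
               for t in np.flatnonzero(events))
    sensitivity = hits / events.sum() if events.any() else 0.0
    false = sum(not events[t + 1:t + 1 + horizon].any()
                for t in np.flatnonzero(alarms))
    false_alarm_rate = false / alarms.sum() if alarms.any() else 0.0
    return sensitivity, false_alarm_rate

# Two events, each preceded by an alarm within two steps: perfect recall.
alarms = np.array([0, 1, 0, 0, 1, 0, 0, 0], bool)
events = np.array([0, 0, 1, 0, 0, 1, 0, 0], bool)
print(forecast_performance(alarms, events, horizon=2))
```

    A random predictor's expected sensitivity at the same alarm rate provides the baseline against which these numbers must be compared.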

  19. Statistical evaluation of forecasts

    NASA Astrophysics Data System (ADS)

    Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn

    2014-08-01

    Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.

  20. Time domain period determination techniques

    NASA Technical Reports Server (NTRS)

    Stellingwerf, R. F.

    1980-01-01

    Two simple period determination schemes are discussed. They are well suited to problems involving non-sinusoidal periodic phenomena sampled at a few irregularly spaced points. Statistical properties are discussed. The techniques are applied to the double mode Cepheids BK Cen and TU Cas as test cases.
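    A minimal sketch of one such scheme, in the phase-dispersion-minimization spirit of Stellingwerf's work (the function, bin count, and test signal here are illustrative assumptions, not the paper's implementation): fold the data at a trial period and measure how much scatter remains within phase bins.

```python
import numpy as np

def phase_dispersion(t, y, period, n_bins=10):
    """Normalized within-bin variance of y folded at `period`.

    Small values mean the folded curve is coherent, so scanning trial
    periods and minimizing this statistic locates the true period, even
    for non-sinusoidal signals sampled at irregular times.
    """
    phase = (np.asarray(t, float) / period) % 1.0
    y = np.asarray(y, float)
    bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    s2, dof = 0.0, 0
    for b in range(n_bins):
        yb = y[bins == b]
        if len(yb) > 1:
            s2 += ((yb - yb.mean()) ** 2).sum()
            dof += len(yb) - 1
    return (s2 / dof) / y.var() if dof else 1.0

# Irregularly sampled sinusoid, true period 2.5: the statistic is far
# smaller at the true period than at a wrong trial period.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 50, 200))
y = np.sin(2 * np.pi * t / 2.5)
print(phase_dispersion(t, y, 2.5), phase_dispersion(t, y, 3.1))
```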

  1. Time domain period determination techniques

    NASA Astrophysics Data System (ADS)

    Stellingwerf, R. F.

    1980-05-01

    Two simple period determination schemes are discussed. They are well suited to problems involving non-sinusoidal periodic phenomena sampled at a few irregularly spaced points. Statistical properties are discussed. The techniques are applied to the double mode Cepheids BK Cen and TU Cas as test cases.

  2. Periodic Polymers

    NASA Astrophysics Data System (ADS)

    Thomas, Edwin

    2013-03-01

    Periodic polymers can be made by self-assembly, directed self-assembly, and photolithography. Such materials provide a versatile platform for 1, 2, and 3D periodic nano- to micro-scale composites with dielectric contrast, impedance contrast, or both, and these can serve, for example, as photonic and/or phononic crystals for electromagnetic and elastic waves, as well as mechanical frames/trusses. Compared to electromagnetic waves, elastic waves are both less complex (longitudinal modes in fluids) and more complex (longitudinal, transverse in-plane, and transverse out-of-plane modes in solids). Engineering the dispersion relation between wave frequency ω and wave vector k enables the opening of band gaps in the density of modes and detailed shaping of ω(k). Band gaps can be opened by Bragg scattering, anti-crossing of bands, and discrete shape resonances. Current interest in our group focuses on the design and modeling, fabrication, and measurement of polymer-based periodic materials for applications in tunable optics and control of phonon flow. Several examples will be described, including the design of structures with multispectral band gaps for elastic waves to alter the phonon density of states, the creation of block-polymer and bicontinuous metal-carbon nanoframes for structures that are robust against ballistic projectiles, and quasi-crystalline solid/fluid structures that can steer shock waves.

  3. MQSA National Statistics

    MedlinePlus

    ... Standards Act and Program MQSA Insights MQSA National Statistics ... but should level off with time. Archived Scorecard Statistics 2017 Scorecard Statistics 2016 Scorecard Statistics (Archived) 2015 ...

  4. Education Statistics Quarterly, Fall 2001.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2001-01-01

    The publication gives a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message from…

  5. Education Statistics Quarterly, Fall 2002.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2003-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…

  6. Education Statistics Quarterly, Spring 2002.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2002-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message…

  7. Education Statistics Quarterly, Summer 2002.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2002-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message…

  8. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  9. Education Statistics Quarterly, Winter 2001.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2002-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…

  10. Education Statistics Quarterly, Summer 2001.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2001-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released during a 3-month period. Each issue also contains a message from the NCES on a…

  11. Statistics Poker: Reinforcing Basic Statistical Concepts

    ERIC Educational Resources Information Center

    Leech, Nancy L.

    2008-01-01

    Learning basic statistical concepts does not need to be tedious or dry; it can be fun and interesting through cooperative learning in the small-group activity of Statistics Poker. This article describes a teaching approach for reinforcing basic statistical concepts that can help students who have high anxiety and makes learning and reinforcing…

  12. Predict! Teaching Statistics Using Informational Statistical Inference

    ERIC Educational Resources Information Center

    Makar, Katie

    2013-01-01

    Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…

  13. Middle atmosphere general circulation statistics

    NASA Technical Reports Server (NTRS)

    Geller, M. A.

    1985-01-01

    With the increased availability of remote sensing data for the middle atmosphere from satellites, more analyses of the middle atmosphere circulation are being published. Some of these are process studies for limited periods, and some are statistical analyses of middle atmosphere general circulation statistics. Results from the latter class of studies will be reviewed. These include analysis of the zonally averaged middle atmosphere structure, temperature, and zonal winds; analysis of planetary wave structures, analysis of heat and momentum fluxes; and analysis of Eliassen-and-Palm flux vectors and flux divergences. Emphasis is on the annual march of these quantities; Northern and Southern Hemisphere asymmetries; and interannual variability in these statistics. Statistics involving the global ozone distribution and transports of ozone are also discussed.

  14. Periodicity In The Intervals Between Primes

    DTIC Science & Technology

    2015-07-02

    The distribution of positive intervals among the first n <= 10^6 prime numbers is used as a probe of the global nature of the sequence of primes. A statistically strong periodicity is identified in the counting function giving the total number of intervals of a certain size. The nature of the periodic ... Let x = x1, x2, ... be an increasing sequence of real numbers which may be either finite or infinitely long.
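    A simple probe in this direction (an illustrative computation, not the report's method) is to count interval sizes between consecutive primes: gaps that are multiples of 6 occur noticeably more often than their neighbours, a footprint of the periodic structure in the counting function.

```python
from collections import Counter

def primes_below(n):
    """Sieve of Eratosthenes: all primes less than n."""
    sieve = bytearray([1]) * n
    sieve[:2] = b"\x00\x00"
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(sieve[p * p::p]))
    return [i for i in range(n) if sieve[i]]

def gap_counts(limit):
    """Count occurrences of each interval size between consecutive primes."""
    ps = primes_below(limit)
    return Counter(b - a for a, b in zip(ps, ps[1:]))

c = gap_counts(100_000)
print(c[4], c[6], c[8])  # the gap-6 count exceeds both neighbours
```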

  15. Neuroendocrine Tumor: Statistics

    MedlinePlus

    ... Neuroendocrine Tumor: Statistics Approved by the Cancer.Net Editorial Board , 11/ ... the body. It is important to remember that statistics on how many people survive this type of ...

  16. Adrenal Gland Tumors: Statistics

    MedlinePlus

    ... Adrenal Gland Tumor: Statistics Approved by the Cancer.Net Editorial Board , 03/ ... primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...

  17. PROBABILITY AND STATISTICS.

    DTIC Science & Technology

    STATISTICAL ANALYSIS, REPORTS), (*PROBABILITY, REPORTS), INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION

  18. About the Solar Activity Rotation Periods

    NASA Astrophysics Data System (ADS)

    Mouradian, Zadig

    2007-03-01

    The purpose of this paper is to present statistical evidence for the different periods of solar activity. The well-known period is that of 150-160 days, but many others have been detected between 9 and 4750 days (the length of the solar cycle). We tabulated 49 articles revealing 231 periods. Different hypotheses have been suggested to explain them.

  19. Statistical Reference Datasets

    National Institute of Standards and Technology Data Gateway

    Statistical Reference Datasets (Web, free access)   The Statistical Reference Datasets project is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.

  20. Explorations in statistics: statistical facets of reproducibility.

    PubMed

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.
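    The connection between hypothesis testing and reproducibility can be explored with a short Monte Carlo (an illustrative sketch in the spirit of the installment, not code from it; the sample size, effect size, and z-test setup are invented): with a modest true effect, many "significant" experiments fail to replicate even under an identical protocol.

```python
import numpy as np

rng = np.random.default_rng(1)

def significant(sample, alpha=0.05):
    """Two-sided z-test of mean 0 for a sample with known sd = 1."""
    z = sample.mean() * np.sqrt(len(sample))
    return abs(z) > 1.96

n, effect, trials = 25, 0.3, 2000
first = rng.normal(effect, 1, (trials, n))    # original experiments
second = rng.normal(effect, 1, (trials, n))   # exact replications
sig1 = np.array([significant(s) for s in first])
sig2 = np.array([significant(s) for s in second])
replication_rate = sig2[sig1].mean()          # ~ the test's power, well below 1
print(round(replication_rate, 2))
```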

  1. Australia 31-GHz brightness temperature exceedance statistics

    NASA Technical Reports Server (NTRS)

    Gary, B. L.

    1988-01-01

    Water vapor radiometer measurements were made at DSS 43 during an 18-month period. Brightness temperatures at 31 GHz were subjected to a statistical analysis which included correction for the effects of occasional water on the radiometer radome. An exceedance plot was constructed; the 1 percent exceedance statistic occurs at 120 K, and the 5 percent exceedance statistic occurs at 70 K, compared with 75 K in Spain. These values are valid for all of the three-month groupings that were studied.

  2. Idaho Public Library Statistics, FY 1995.

    ERIC Educational Resources Information Center

    Bolles, Charles

    This document is a compilation of input and output measures and other statistics in reference to Idaho's public libraries, covering the period October 1, 1994 to September 30, 1995. This report includes data gathered by the State Library with the "Public and District Library Annual Statistical Report Form" which reflects all monies…

  3. On More Sensitive Periodogram Statistics

    NASA Astrophysics Data System (ADS)

    Bélanger, G.

    2016-05-01

    Period searches in event data have traditionally used the Rayleigh statistic, R^2. For X-ray pulsars, the standard has been the Z^2 statistic, which sums over more than one harmonic. For γ-rays, the H-test, which optimizes the number of harmonics to sum, is often used. These periodograms all suffer from the same problem, namely artifacts caused by correlations in the Fourier components that arise from testing frequencies with a non-integer number of cycles. This article addresses this problem. The modified Rayleigh statistic is discussed, its generalization to any harmonic, R_k^2, is formulated, and from the latter the modified Z^2 statistic is constructed. Versions of these statistics for binned data and point measurements are derived, and it is shown that the variance in the uncertainties can have an important influence on the periodogram. It is shown how to combine the information about the signal frequency from the different harmonics to estimate its value with maximum accuracy. The methods are applied to an XMM-Newton observation of the Crab pulsar, for which a decomposition of the pulse profile is presented, showing that most of the power is in the second, third, and fifth harmonics. The statistical detection power of the R_k^2 statistic is superior to that of the FFT and equivalent to that of the Lomb-Scargle (LS) periodogram. The response to gaps in the data is assessed, and it is shown that the LS does not protect against the distortions they cause. The main conclusion of this work is that the classical R^2 and Z^2 should be replaced by R_k^2 and the modified Z^2 in all applications with event data, and the LS should be replaced by R_k^2 when the uncertainty varies from one point measurement to another.
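    For reference, the classical Z^2_m statistic that the article improves upon can be computed directly from event times (this sketch implements the standard statistic, not the modified versions proposed in the article; the simulated event times are invented):

```python
import numpy as np

def z2_statistic(times, freq, m=2):
    """Classical Z^2_m periodogram statistic for event-arrival data.

    Z^2_m = (2/N) * sum_{k=1..m} [ (sum_i cos(2*pi*k*f*t_i))^2
                                  + (sum_i sin(2*pi*k*f*t_i))^2 ];
    m = 1 gives twice the Rayleigh power. Under the null hypothesis of
    no periodicity, Z^2_m follows a chi-square distribution with 2m dof.
    """
    t = np.asarray(times, float)
    z = 0.0
    for k in range(1, m + 1):
        phase = 2.0 * np.pi * k * freq * t
        z += np.cos(phase).sum() ** 2 + np.sin(phase).sum() ** 2
    return 2.0 * z / len(t)

# Events clustered at a fixed phase of a 1 Hz signal: the statistic is
# enormous at the true frequency and of order chi^2_4 elsewhere.
rng = np.random.default_rng(2)
t = np.arange(500) + 0.05 * rng.standard_normal(500)
print(z2_statistic(t, 1.0), z2_statistic(t, 0.37))
```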

  4. Mathematical and statistical analysis

    NASA Technical Reports Server (NTRS)

    Houston, A. Glen

    1988-01-01

    The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.

  5. Uterine Cancer Statistics

    MedlinePlus

    ... Statistics for Other Kinds of Cancer Breast Cervical Colorectal ( ... Skin Vaginal and Vulvar Cancer Home Uterine Cancer Statistics Language: English Español (Spanish) ...

  6. Experiment in Elementary Statistics

    ERIC Educational Resources Information Center

    Fernando, P. C. B.

    1976-01-01

    Presents an undergraduate laboratory exercise in elementary statistics in which students verify empirically the various aspects of the Gaussian distribution. Sampling techniques and other commonly used statistical procedures are introduced. (CP)
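    An exercise of this kind translates directly into a few lines of code (a hypothetical sketch with arbitrary parameters, not the original lab): draw a large Gaussian sample and verify its textbook properties empirically, such as the 68% and 95% coverage rules.

```python
import numpy as np

rng = np.random.default_rng(3)
sample = rng.normal(loc=10.0, scale=2.0, size=100_000)

mean = sample.mean()
sd = sample.std(ddof=1)                                 # sample standard deviation
within_1sd = np.mean(np.abs(sample - mean) < sd)        # expect ~0.683
within_2sd = np.mean(np.abs(sample - mean) < 2 * sd)    # expect ~0.954
print(mean, sd, within_1sd, within_2sd)
```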

  7. Ethics in Statistics

    ERIC Educational Resources Information Center

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  8. Teaching Statistics Using SAS.

    ERIC Educational Resources Information Center

    Mandeville, Garrett K.

    The Statistical Analysis System (SAS) is presented as the single most appropriate statistical package to use as an aid in teaching statistics. A brief review of literature in which SAS is compared to SPSS, BMDP, and other packages is followed by six examples which demonstrate features unique to SAS which have pedagogical utility. Of particular…

  9. Minnesota Health Statistics 1988.

    ERIC Educational Resources Information Center

    Minnesota State Dept. of Health, St. Paul.

    This document comprises the 1988 annual statistical report of the Minnesota Center for Health Statistics. After introductory technical notes on changes in format, sources of data, and geographic allocation of vital events, an overview is provided of vital health statistics in all areas. Thereafter, separate sections of the report provide tables…

  10. Periodicity in marine extinction events

    NASA Technical Reports Server (NTRS)

    Sepkoski, J. John, Jr.; Raup, David M.

    1986-01-01

    The periodicity of extinction events is examined in detail. In particular, the temporal distribution of specific, identifiable extinction events is analyzed. The nature and limitations of the data base on the global fossil record is discussed in order to establish limits of resolution in statistical analyses. Peaks in extinction intensity which appear to differ significantly from background levels are considered, and new analyses of the temporal distribution of these peaks are presented. Finally, some possible causes of periodicity and of interdependence among extinction events over the last quarter billion years of earth history are examined.

  11. Statistical Methods for Astronomy

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.; Babu, G. Jogesh

    Statistical methodology, with deep roots in probability theory, provides quantitative procedures for extracting scientific knowledge from astronomical data and for testing astrophysical theory. In recent decades, statistics has enormously increased in scope and sophistication. After a historical perspective, this review outlines concepts of mathematical statistics, elements of probability theory, hypothesis tests, and point estimation. Least squares, maximum likelihood, and Bayesian approaches to statistical inference are outlined. Resampling methods, particularly the bootstrap, provide valuable procedures when distribution functions of statistics are not known. Several approaches to model selection and goodness of fit are considered.
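    The bootstrap mentioned above is easy to sketch (an illustrative implementation with invented data, not code from the review): resample the data with replacement and read a confidence interval off the percentiles of the resampled statistic.

```python
import numpy as np

def bootstrap_ci(data, stat, n_boot=5000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for an arbitrary statistic."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data)
    reps = np.array([stat(rng.choice(data, size=len(data), replace=True))
                     for _ in range(n_boot)])
    return np.quantile(reps, [alpha / 2, 1 - alpha / 2])

# Skewed data, where the sampling distribution of the median has no
# convenient closed form: the bootstrap supplies the interval anyway.
rng = np.random.default_rng(4)
x = rng.exponential(scale=3.0, size=200)
lo, hi = bootstrap_ci(x, np.median)
print(lo, hi)
```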

  12. Evolution of periodicity in periodical cicadas.

    PubMed

    Ito, Hiromu; Kakishima, Satoshi; Uehara, Takashi; Morita, Satoru; Koyama, Takuya; Sota, Teiji; Cooley, John R; Yoshimura, Jin

    2015-09-14

    Periodical cicadas (Magicicada spp.) in the USA are famous for their unique prime-numbered life cycles of 13 and 17 years and their nearly perfectly synchronized mass emergences. Because almost all known species of cicada are non-periodical, periodicity is assumed to be a derived state. A leading hypothesis for the evolution of periodicity in Magicicada implicates the decline in average temperature during glacial periods. During the evolution of periodicity, the determinant of maturation in ancestral cicadas is hypothesized to have switched from size dependence to time (period) dependence. The selection for the prime-numbered cycles should have taken place only after the fixation of periodicity. Here, we build an individual-based model of cicadas under conditions of climatic cooling to explore the fixation of periodicity. In our model, under cold environments, extremely long juvenile stages lead to extremely low adult densities, limiting mating opportunities and favouring the evolution of synchronized emergence. Our results indicate that these changes, which were triggered by glacial cooling, could have led to the fixation of periodicity in the non-periodical ancestors.

  13. Evolution of periodicity in periodical cicadas

    PubMed Central

    Ito, Hiromu; Kakishima, Satoshi; Uehara, Takashi; Morita, Satoru; Koyama, Takuya; Sota, Teiji; Cooley, John R.; Yoshimura, Jin

    2015-01-01

    Periodical cicadas (Magicicada spp.) in the USA are famous for their unique prime-numbered life cycles of 13 and 17 years and their nearly perfectly synchronized mass emergences. Because almost all known species of cicada are non-periodical, periodicity is assumed to be a derived state. A leading hypothesis for the evolution of periodicity in Magicicada implicates the decline in average temperature during glacial periods. During the evolution of periodicity, the determinant of maturation in ancestral cicadas is hypothesized to have switched from size dependence to time (period) dependence. The selection for the prime-numbered cycles should have taken place only after the fixation of periodicity. Here, we build an individual-based model of cicadas under conditions of climatic cooling to explore the fixation of periodicity. In our model, under cold environments, extremely long juvenile stages lead to extremely low adult densities, limiting mating opportunities and favouring the evolution of synchronized emergence. Our results indicate that these changes, which were triggered by glacial cooling, could have led to the fixation of periodicity in the non-periodical ancestors. PMID:26365061

  14. Model Valid Prediction Period

    NASA Astrophysics Data System (ADS)

    Chu, P. C.

    2002-12-01

    A new concept, the valid prediction period (VPP), is presented here to evaluate model predictability. VPP is defined as the time period when the prediction error first exceeds a pre-determined criterion (i.e., the tolerance level). It depends not only on the instantaneous error growth, but also on the noise level, the initial error, and the tolerance level. The model predictability skill is then represented by a single scalar, VPP: the longer the VPP, the higher the model predictability skill. A theoretical framework on the basis of the backward Fokker-Planck equation is developed to determine the probability density function (pdf) of VPP. Verification of a Gulf of Mexico nowcast/forecast model is used as an example to demonstrate the usefulness of VPP. Power-law scaling is found in the mean square error of displacement between drifting-buoy and model trajectories (both at 50 m depth). The pdf of VPP is asymmetric with a long, broad tail on the higher-value side, which suggests long-term predictability. The calculations demonstrate that long-term (extremely long, such as 50-60 day) predictability is not an "outlier" and shares the same statistical properties as the short-term predictions. References: Chu, P. C., L. M. Ivanov, and C. W. Fan, Backward Fokker-Planck equation for determining model predictability with unknown initial error distribution. J. Geophys. Res., in press, 2002. Chu, P. C., L. M. Ivanov, T. M. Margolina, and O. V. Melnichenko, 2002b: On probabilistic stability of an atmospheric model to various amplitude perturbations. J. Atmos. Sci., in press. Chu, P. C., L. M. Ivanov, L. Kantha, O. V. Melnichenko, and Y. A. Poberezhny, 2002c: The long-term correlations and power decay law in model prediction skill. Geophys. Res. Lett., in press.
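    The definition of VPP is operationally simple, as this sketch shows (the power-law error curve and tolerance values are invented for illustration; the abstract's models are not reproduced here):

```python
import numpy as np

def valid_prediction_period(error, tolerance):
    """First time index at which the prediction error exceeds the tolerance.

    Returns len(error) if the error never exceeds the tolerance, i.e. the
    forecast stays valid over the whole verification window.
    """
    error = np.asarray(error, float)
    exceed = np.flatnonzero(error > tolerance)
    return int(exceed[0]) if exceed.size else len(error)

# An error that grows like a power law stays valid longer under a
# looser tolerance level.
t = np.arange(1, 61)
error = 0.5 * t ** 0.7
print(valid_prediction_period(error, 5.0), valid_prediction_period(error, 10.0))
```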

  15. Heroin: Statistics and Trends

    MedlinePlus

  16. Statistical distribution sampling

    NASA Technical Reports Server (NTRS)

    Johnson, E. S.

    1975-01-01

    Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.

  17. Quantifying periodicity in omics data

    PubMed Central

    Amariei, Cornelia; Tomita, Masaru; Murray, Douglas B.

    2014-01-01

    Oscillations play a significant role in biological systems, with many examples in the fast, ultradian, circadian, circalunar, and yearly time domains. However, determining periodicity in such data can be problematic. There are a number of computational methods to identify the periodic components in large datasets, such as signal-to-noise based Fourier decomposition, Fisher's g-test and autocorrelation. However, the available methods assume a sinusoidal model and do not attempt to quantify the waveform shape and the presence of multiple periodicities, which provide vital clues in determining the underlying dynamics. Here, we developed a Fourier based measure that generates a de-noised waveform from multiple significant frequencies. This waveform is then correlated with the raw data from the respiratory oscillation found in yeast, to provide oscillation statistics including waveform metrics and multi-periods. The method is compared and contrasted to commonly used statistics. Moreover, we show the utility of the program in the analysis of noisy datasets and other high-throughput analyses, such as metabolomics and flow cytometry, respectively. PMID:25364747
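    The core idea of a Fourier-based de-noised waveform can be sketched in a few lines (an illustrative simplification, not the published program: the component-selection rule and test signal here are assumptions): keep only the dominant frequencies, reconstruct, and correlate with the raw series.

```python
import numpy as np

def denoised_waveform(y, n_keep):
    """Rebuild a series from its n_keep largest-amplitude Fourier components.

    The mean (DC term) is always retained; all other components are zeroed,
    giving a de-noised waveform that can be correlated with the raw data.
    """
    f = np.fft.rfft(y)
    keep = np.argsort(np.abs(f[1:]))[-n_keep:] + 1  # skip the DC bin
    filtered = np.zeros_like(f)
    filtered[0] = f[0]
    filtered[keep] = f[keep]
    return np.fft.irfft(filtered, n=len(y))

# Two sinusoids buried in noise: the reconstruction correlates strongly
# with the raw series, quantifying its periodic content.
rng = np.random.default_rng(5)
t = np.arange(512)
y = (np.sin(2 * np.pi * t / 64) + 0.5 * np.sin(2 * np.pi * t / 32)
     + 0.8 * rng.standard_normal(512))
wave = denoised_waveform(y, n_keep=2)
r = np.corrcoef(wave, y)[0, 1]
print(r)
```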

  18. Searching for periodicity in weighted time point series.

    NASA Astrophysics Data System (ADS)

    Jetsu, L.; Pelt, J.

    1996-09-01

    Consistent statistics for two methods of searching for periodicity in a series of weighted time points are formulated. An approach based on the bootstrap method to estimate the accuracy of detected periodicity is presented.

  19. Statistical Fault Detection & Diagnosis Expert System

    SciTech Connect

    Wegerich, Stephan

    1996-12-18

    STATMON is an expert system that performs real-time fault detection and diagnosis of redundant sensors in any industrial process requiring high reliability. After a training period performed during normal operation, the expert system monitors the statistical properties of the incoming signals using a pattern recognition test. If the test determines that statistical properties of the signals have changed, the expert system performs a sequence of logical steps to determine which sensor or machine component has degraded.
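    The train-then-monitor pattern described above can be caricatured in a few lines (this is an illustrative sketch, not STATMON's actual pattern recognition test; the class, threshold, and sensor values are invented):

```python
import numpy as np

class DriftMonitor:
    """Toy sensor monitor: learn a signal's statistics during a training
    period, then flag windows whose mean drifts beyond a multiple of its
    standard error under the trained model."""

    def __init__(self, threshold=4.0):
        self.threshold = threshold

    def train(self, signal):
        self.mean = float(np.mean(signal))
        self.sd = float(np.std(signal))

    def check(self, window):
        stderr = self.sd / np.sqrt(len(window))
        return abs(np.mean(window) - self.mean) > self.threshold * stderr

rng = np.random.default_rng(6)
monitor = DriftMonitor()
monitor.train(rng.normal(100.0, 2.0, 5000))       # normal operation
ok = monitor.check(rng.normal(100.0, 2.0, 50))    # healthy sensor window
bad = monitor.check(rng.normal(103.0, 2.0, 50))   # degraded sensor window
print(ok, bad)
```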

  20. Model for neural signaling leap statistics

    NASA Astrophysics Data System (ADS)

    Chevrollier, Martine; Oriá, Marcos

    2011-03-01

    We present a simple model for neural signaling leaps in the brain considering only the thermodynamic (Nernst) potential in neuron cells and brain temperature. We numerically simulated connections between arbitrarily localized neurons and analyzed the frequency distribution of the distances reached. We observed qualitative change between Normal statistics (with T = 37.5°C, awaken regime) and Lévy statistics (T = 35.5°C, sleeping period), characterized by rare events of long range connections.

  1. Explorations in Statistics: Power

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2010-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four…

  2. Teaching Statistics without Sadistics.

    ERIC Educational Resources Information Center

    Forte, James A.

    1995-01-01

    Five steps designed to take anxiety out of statistics for social work students are outlined. First, statistics anxiety is identified as an educational problem. Second, instructional objectives and procedures to achieve them are presented and methods and tools for evaluating the course are explored. Strategies for, and obstacles to, making…

  3. STATSIM: Exercises in Statistics.

    ERIC Educational Resources Information Center

    Thomas, David B.; And Others

    A computer-based learning simulation was developed at Florida State University which allows for high interactive responding via a time-sharing terminal for the purpose of demonstrating descriptive and inferential statistics. The statistical simulation (STATSIM) is comprised of four modules--chi square, t, z, and F distribution--and elucidates the…

  4. Understanding Undergraduate Statistical Anxiety

    ERIC Educational Resources Information Center

    McKim, Courtney

    2014-01-01

    The purpose of this study was to understand undergraduate students' views of statistics. Results reveal that students with less anxiety have a higher interest in statistics and also believe in their ability to perform well in the course. Also students who have a more positive attitude about the class tend to have a higher belief in their…

  5. Water Quality Statistics

    ERIC Educational Resources Information Center

    Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain

    2004-01-01

    Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…

  6. Towards Statistically Undetectable Steganography

    DTIC Science & Technology

    2011-06-30

    Towards Statistically Undetectable Steganography. Contract number: FA9550-08-1-0084. Author: Prof. Jessica... Approved for public release; distribution is unlimited. Abstract: Fundamental asymptotic laws for imperfect steganography ... formats. Subject terms: steganography, covert communication, statistical detectability, asymptotic performance, secure payload, minimum

  7. Explorations in Statistics: Regression

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2011-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive…

  8. Option Y, Statistics.

    ERIC Educational Resources Information Center

    Singer, Arlene

    This guide outlines a one semester Option Y course, which has seven learner objectives. The course is designed to provide students with an introduction to the concerns and methods of statistics, and to equip them to deal with the many statistical matters of importance to society. Topics covered include graphs and charts, collection and…

  9. On Statistical Testing.

    ERIC Educational Resources Information Center

    Huberty, Carl J.

    An approach to statistical testing, which combines Neyman-Pearson hypothesis testing and Fisher significance testing, is recommended. The use of P-values in this approach is discussed in some detail. The author also discusses some problems which are often found in introductory statistics textbooks. The problems involve the definitions of…

  10. Statistics and Measurements

    PubMed Central

    Croarkin, M. Carroll

    2001-01-01

    For more than 50 years, the Statistical Engineering Division (SED) has been instrumental in the success of a broad spectrum of metrology projects at NBS/NIST. This paper highlights fundamental contributions of NBS/NIST statisticians to statistics and to measurement science and technology. Published methods developed by SED staff, especially during the early years, endure as cornerstones of statistics not only in metrology and standards applications but also as data-analytic resources used across all disciplines. The history of statistics at NBS/NIST began with the formation of what is now the SED. Examples from the first five decades of the SED illustrate the critical role of the division in the successful resolution of a few of the highly visible, and sometimes controversial, statistical studies of national importance. A review of the history of major early publications of the division on statistical methods, design of experiments, and error analysis and uncertainty is followed by a survey of several thematic areas. The accompanying examples illustrate the importance of SED in the history of statistics, measurements and standards: calibration and measurement assurance, interlaboratory tests, development of measurement methods, Standard Reference Materials, statistical computing, and dissemination of measurement technology. A brief look forward sketches the expanding opportunity and demand for SED statisticians created by current trends in research and development at NIST. PMID:27500023

  11. [Statistics quantum satis].

    PubMed

    Pestana, Dinis

    2013-01-01

    Statistics is a privileged tool in building knowledge from information, since its purpose is to extract, from the limited information in a sample, conclusions about the whole population. The pervasive use of statistical software (which always provides an answer, whether the question is adequate or not), and the absence of statistics to confer a scientific flavour on so much bad science, have had a pernicious effect, fostering some disbelief in statistical research. Were Lord Rutherford alive today, it is almost certain that he would not condemn the use of statistics in research, as he did at the dawn of the 20th century. But he would indeed urge everyone to use statistics quantum satis, since to use bad data, too many data, and statistics to enquire into irrelevant questions is a source of bad science, namely because with too many data we can establish the statistical significance of irrelevant results. This is an important point that addicts of evidence-based medicine should be aware of, since the meta-analysis of too many data will inevitably establish senseless results.

  12. Reform in Statistical Education

    ERIC Educational Resources Information Center

    Huck, Schuyler W.

    2007-01-01

    Two questions are considered in this article: (a) What should professionals in school psychology do in an effort to stay current with developments in applied statistics? (b) What should they do with their existing knowledge to move from surface understanding of statistics to deep understanding? Written for school psychologists who have completed…

  13. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or of making a new product or service appear legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  14. Statistics 101 for Radiologists.

    PubMed

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced.
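
    The diagnostic-test measures reviewed above all reduce to counts in a 2x2 table. A minimal sketch of the arithmetic, with made-up counts chosen purely for illustration:

```python
tp, fn = 90, 10    # diseased patients: test positive / test negative
fp, tn = 30, 170   # healthy patients: test positive / test negative

sensitivity = tp / (tp + fn)                   # P(test+ | disease)
specificity = tn / (tn + fp)                   # P(test- | no disease)
accuracy = (tp + tn) / (tp + fn + fp + tn)
positive_lr = sensitivity / (1 - specificity)  # positive likelihood ratio

print(sensitivity, specificity, round(accuracy, 3), round(positive_lr, 1))
```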

  15. Applied Statistics with SPSS

    ERIC Educational Resources Information Center

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  16. Overhead Image Statistics

    SciTech Connect

    Vijayaraj, Veeraraghavan; Cheriyadat, Anil M; Bhaduri, Budhendra L; Vatsavai, Raju; Bright, Eddie A

    2008-01-01

    Statistical properties of high-resolution overhead images representing different land use categories are analyzed using various local and global statistical image properties based on the shape of the power spectrum, image gradient distributions, edge co-occurrence, and inter-scale wavelet coefficient distributions. The analysis was performed on a database of high-resolution (1 meter) overhead images representing a multitude of different downtown, suburban, commercial, agricultural and wooded exemplars. Various statistical properties relating to these image categories and their relationship are discussed. The categorical variations in power spectrum contour shapes, the unique gradient distribution characteristics of wooded categories, the similarity in edge co-occurrence statistics for overhead and natural images, and the unique edge co-occurrence statistics of downtown categories are presented in this work. Though previous work on natural image statistics has shown some of the unique characteristics for different categories, the relationships for overhead images are not well understood. The statistical properties of natural images were used in previous studies to develop prior image models, to predict and index objects in a scene and to improve computer vision models. The results from our research findings can be used to augment and adapt computer vision algorithms that rely on prior image statistics to process overhead images, calibrate the performance of overhead image analysis algorithms, and derive features for better discrimination of overhead image categories.
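
    The power-spectrum shape statistic discussed above can be sketched on a synthetic image: spatially correlated, natural-image-like fields show power falling off steeply with spatial frequency. The field construction below is an illustrative assumption, not the paper's database or method.

```python
import numpy as np

rng = np.random.default_rng(3)
base = rng.normal(size=(64, 64))
# double cumulative sum gives a strongly correlated, natural-image-like field
img = np.cumsum(np.cumsum(base, axis=0), axis=1)

spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
cy, cx = spec.shape[0] // 2, spec.shape[1] // 2
y, x = np.indices(spec.shape)
r = np.hypot(y - cy, x - cx).astype(int)

# radially averaged power spectrum: steep fall-off with spatial frequency
radial = np.bincount(r.ravel(), spec.ravel()) / np.bincount(r.ravel())
print(radial[1] > radial[8] > radial[30])
```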

  17. Hypokalemic periodic paralysis

    MedlinePlus

    Periodic paralysis - hypokalemic; Familial hypokalemic periodic paralysis; HOKPP; HypoKPP; HypoPP ... is not inherited. Unlike other forms of periodic paralysis, people with hypoPP have normal thyroid function. But ...

  18. Hyperkalemic periodic paralysis

    MedlinePlus

    Periodic paralysis - hyperkalemic; Familial hyperkalemic periodic paralysis; HyperKPP; HyperPP; Gamstorp disease ... factors include having other family members with periodic paralysis. It affects men more often than women.

  19. Vaginal bleeding between periods

    MedlinePlus

    ... periods; Intermenstrual bleeding; Spotting; Metrorrhagia Images Female reproductive anatomy Bleeding between periods Uterus References Bulun SE. The physiology and pathology of the female reproductive axis. In: ...

  20. Dynamic Statistics of Crayfish Caudal Photoreceptors

    PubMed Central

    Hermann, Howard T.; Olsen, Richard E.

    1967-01-01

    Crayfish caudal photoreceptor units were monitored during transient and steady-state responses to light stimuli (step on, step off). A statistical analysis of interpulse interval distributions during quasi-stationary time periods was carried out. Firing statistics during transient conditions were superposable with statistics under whatever steady stimulation produced the same firing rate, indicating that mean firing rate is a sufficient statistic. Distributions encountered formed a continuum of possible shapes. Considerable variation in shape was found with temperature and also among species, with Orconectes clarkii tending to fire more regularly than Orconectes virilis. Some properties of O. virilis statistics are described, including a linear relation between mean and standard deviation, and a tendency for intervals to be nonindependent. The data are considered as constraints on closed form models of the photoreceptor nerve pulse generator. PMID:6035125

  1. Statistics at a glance.

    PubMed

    Ector, Hugo

    2010-12-01

    I still remember my first book on statistics: "Elementary statistics with applications in medicine and the biological sciences" by Frederick E. Croxton. For me, it was the start of pursuing an understanding of statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to "P < 0.05 = ok". I do not blame my colleagues who omit the paragraph on statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in well understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar. It is a summary and a transcription of the best pages I have detected.

  2. Redshift data and statistical inference

    NASA Technical Reports Server (NTRS)

    Newman, William I.; Haynes, Martha P.; Terzian, Yervant

    1994-01-01

    Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.
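
    The over-binning pitfall noted above is easy to reproduce: the same structureless sample looks smooth with a modest number of bins and "patterned" with too many, because the relative fluctuation of bin counts grows as the bins shrink. Sample size and bin counts below are chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
sample = rng.uniform(0, 1, 300)  # structureless data: no real pattern to find

few = np.histogram(sample, bins=10)[0]
many = np.histogram(sample, bins=100)[0]

# relative spread of counts around their expectation grows as bins shrink,
# which is what makes over-binned histograms look spuriously "patterned"
print(few.std() / few.mean(), many.std() / many.mean())
```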

  3. Informal Statistics Help Desk

    NASA Technical Reports Server (NTRS)

    Young, M.; Koslovsky, M.; Schaefer, Caroline M.; Feiveson, A. H.

    2017-01-01

    Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.

  4. Commentary: statistics for biomarkers.

    PubMed

    Lovell, David P

    2012-05-01

    This short commentary discusses Biomarkers' requirements for the reporting of statistical analyses in submitted papers. It is expected that submitters will follow the general instructions of the journal, the more detailed guidance given by the International Committee of Medical Journal Editors, the specific guidelines developed by the EQUATOR network, and those of various specialist groups. Biomarkers expects that the study design and subsequent statistical analyses are clearly reported and that the data reported can be made available for independent assessment. The journal recognizes that there is continuing debate about different approaches to statistical science. Biomarkers appreciates that the field continues to develop rapidly and encourages the use of new methodologies.

  5. LED champing: statistically blessed?

    PubMed

    Wang, Zhuo

    2015-06-10

    LED champing (smart mixing of individual LEDs to match the desired color and lumens) and color mixing strategies have been widely used to maintain the color consistency of light engines. Light engines with champed LEDs can easily achieve color consistency within a couple of MacAdam steps, even with widely distributed LEDs to begin with. From a statistical point of view, the distributions of the color coordinates and the flux after champing are studied. The related statistical parameters are derived, which facilitate process improvements such as Six Sigma and are instrumental to statistical quality control for mass production.

  6. Playing at Statistical Mechanics

    ERIC Educational Resources Information Center

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)

  7. Hemophilia Data and Statistics

    MedlinePlus

    ... Hemophilia Women Healthcare Providers Partners Media Policy Makers Data & Statistics Language: English Español (Spanish) Recommend on Facebook ... at a very young age. Based on CDC data, the median age at diagnosis is 36 months ...

  8. Cooperative Learning in Statistics.

    ERIC Educational Resources Information Center

    Keeler, Carolyn M.; And Others

    1994-01-01

    Formal use of cooperative learning techniques proved effective in improving student performance and retention in a freshman level statistics course. Lectures interspersed with group activities proved effective in increasing conceptual understanding and overall class performance. (11 references) (Author)

  9. Statistics of the sagas

    NASA Astrophysics Data System (ADS)

    Richfield, Jon; bookfeller

    2016-07-01

    In reply to Ralph Kenna and Pádraig Mac Carron's feature article “Maths meets myths” in which they describe how they are using techniques from statistical physics to characterize the societies depicted in ancient Icelandic sagas.

  10. Elements of Statistics

    NASA Astrophysics Data System (ADS)

    Grégoire, G.

    2016-05-01

    This chapter is devoted to two objectives. The first one is to answer the request expressed by attendees of the first Astrostatistics School (Annecy, October 2013) to be provided with an elementary vademecum of statistics that would facilitate understanding of the given courses. In this spirit we recall very basic notions, that is, definitions and properties that we think sufficient to benefit from the courses given in the Astrostatistical School. Thus we briefly give definitions and elementary properties of random variables and vectors, distributions, estimation and tests, and maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation and, due to the space devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: a basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors, and Gaussian mixture models.
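
    As a tiny worked example of the maximum likelihood methodology mentioned above: for i.i.d. Gaussian data with known variance, minimizing the negative log-likelihood over a grid of candidate means recovers the sample mean. All values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
data = rng.normal(loc=2.0, scale=1.0, size=1000)

def neg_log_likelihood(mu):
    # Gaussian with sigma fixed at 1, dropping constants
    return 0.5 * np.sum((data - mu) ** 2)

grid = np.linspace(0, 4, 401)  # candidate means, step 0.01
mle = grid[np.argmin([neg_log_likelihood(m) for m in grid])]
print(mle, data.mean())
```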

  11. Plague Maps and Statistics

    MedlinePlus

    ... and Statistics Recommend on Facebook Tweet Share Compartir Plague in the United States Plague was first introduced ... per year in the United States: 1900-2012. Plague Worldwide Plague epidemics have occurred in Africa, Asia, ...

  12. Understanding Solar Flare Statistics

    NASA Astrophysics Data System (ADS)

    Wheatland, M. S.

    2005-12-01

    A review is presented of work aimed at understanding solar flare statistics, with emphasis on the well known flare power-law size distribution. Although avalanche models are perhaps the favoured model to describe flare statistics, their physical basis is unclear, and they are divorced from developing ideas in large-scale reconnection theory. An alternative model, aimed at reconciling large-scale reconnection models with solar flare statistics, is revisited. The solar flare waiting-time distribution has also attracted recent attention. Observed waiting-time distributions are described, together with what they might tell us about the flare phenomenon. Finally, a practical application of flare statistics to flare prediction is described in detail, including the results of a year of automated (web-based) predictions from the method.
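
    The power-law size distribution at the centre of the discussion above is usually summarized by its index, and the standard maximum-likelihood estimator for a pure power law above a threshold is a one-liner. A sketch on simulated "flare sizes"; the parameters are illustrative assumptions, not solar data.

```python
import numpy as np

rng = np.random.default_rng(6)
alpha_true, xmin = 2.0, 1.0
# inverse-transform sampling of p(x) ~ x^-alpha for x >= xmin
u = rng.uniform(size=20000)
sizes = xmin * (1 - u) ** (-1 / (alpha_true - 1))

# maximum-likelihood (Hill-type) estimate of the power-law index
alpha_hat = 1 + sizes.size / np.sum(np.log(sizes / xmin))
print(alpha_hat)
```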

  13. Titanic: A Statistical Exploration.

    ERIC Educational Resources Information Center

    Takis, Sandra L.

    1999-01-01

    Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)

  14. Purposeful Statistical Investigations

    ERIC Educational Resources Information Center

    Day, Lorraine

    2014-01-01

    Lorraine Day provides us with a great range of statistical investigations using various resources such as maths300 and TinkerPlots. Each of the investigations link mathematics to students' lives and provide engaging and meaningful contexts for mathematical inquiry.

  15. Boosted Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Testa, Massimo

    2015-08-01

    Starting with the basic principles of Relativistic Quantum Mechanics, we give a rigorous, but completely elementary proof of the relation between fundamental observables of a statistical system, when measured within two inertial reference frames, related by a Lorentz transformation.

  16. How Statistics "Excel" Online.

    ERIC Educational Resources Information Center

    Chao, Faith; Davis, James

    2000-01-01

    Discusses the use of Microsoft Excel software and provides examples of its use in an online statistics course at Golden Gate University in the areas of randomness and probability, sampling distributions, confidence intervals, and regression analysis. (LRW)

  17. Statistical Perspectives on Stratospheric Transport

    NASA Technical Reports Server (NTRS)

    Sparling, L. C.

    1999-01-01

    Long-lived tropospheric source gases, such as nitrous oxide, enter the stratosphere through the tropical tropopause, are transported throughout the stratosphere by the Brewer-Dobson circulation, and are photochemically destroyed in the upper stratosphere. These chemical constituents, or "tracers" can be used to track mixing and transport by the stratospheric winds. Much of our understanding about the stratospheric circulation is based on large scale gradients and other spatial features in tracer fields constructed from satellite measurements. The point of view presented in this paper is different, but complementary, in that transport is described in terms of tracer probability distribution functions (PDFs). The PDF is computed from the measurements, and is proportional to the area occupied by tracer values in a given range. The flavor of this paper is tutorial, and the ideas are illustrated with several examples of transport-related phenomena, annotated with remarks that summarize the main point or suggest new directions. One example shows how the multimodal shape of the PDF gives information about the different branches of the circulation. Another example shows how the statistics of fluctuations from the most probable tracer value give insight into mixing between different regions of the atmosphere. Also included is an analysis of the time-dependence of the PDF during the onset and decline of the winter circulation, and a study of how "bursts" in the circulation are reflected in transient periods of rapid evolution of the PDF. The dependence of the statistics on location and time are also shown to be important for practical problems related to statistical robustness and satellite sampling. The examples illustrate how physically-based statistical analysis can shed some light on aspects of stratospheric transport that may not be obvious or quantifiable with other types of analyses. 
An important motivation for the work presented here is the need for synthesis of the
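
    A tracer PDF in the sense used above is just a normalized histogram of tracer values, and a bimodal mixture shows how two branches of the circulation appear as two modes. The mixture parameters below are illustrative assumptions, not measured tracer values.

```python
import numpy as np

rng = np.random.default_rng(7)
# two-branch toy tracer field: each branch contributes a distinct mode
tracer = np.concatenate([rng.normal(100, 5, 5000),
                         rng.normal(140, 5, 5000)])

counts, edges = np.histogram(tracer, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# density=True normalizes so the histogram integrates to one
area = (counts * np.diff(edges)).sum()
print(area)
```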

  18. Education Statistics Quarterly. Volume 5, Issue 1.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2003-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data product, and funding opportunities developed over a 3-month period. Each issue also contains a message…

  19. Rate statistics for radio noise from lightning

    NASA Technical Reports Server (NTRS)

    Levine, D. M.; Meneghini, R.; Tretter, S. A.

    1980-01-01

    Radio frequency noise from lightning was measured at several frequencies in the HF - VHF range at the Kennedy Space Center, Florida. The data were examined to determine flashing rate statistics during periods of strong activity from nearby storms. It was found that the time between flashes is modeled reasonably well by a random variable with a lognormal distribution.
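
    The lognormal model reported above is straightforward to fit on simulated inter-flash times: take logs and estimate their mean and standard deviation. All numbers below are illustrative, not the Kennedy Space Center data.

```python
import numpy as np

rng = np.random.default_rng(8)
# simulated times between flashes (seconds), lognormal with median ~2 s
gaps = rng.lognormal(mean=np.log(2.0), sigma=0.8, size=5000)

log_gaps = np.log(gaps)
mu_hat, sigma_hat = log_gaps.mean(), log_gaps.std()
print(np.exp(mu_hat), sigma_hat)  # estimated median and shape parameter
```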

  20. Statistical analysis of extreme river flows

    NASA Astrophysics Data System (ADS)

    Mateus, Ayana; Caeiro, Frederico; Gomes, Dora Prata; Sequeira, Inês J.

    2016-12-01

    Floods are recurrent events that can have a catastrophic impact. In this work we are interested in the analysis of a data set of gauged daily flows from the Whiteadder Water river, Scotland. Using statistical techniques based on extreme value theory, we estimate several extreme value parameters, including extreme quantiles and return periods of high levels.
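
    A return period in the sense above is the average interval between exceedances of a level, i.e. the (1 - 1/T) quantile of the per-period maxima. A minimal empirical sketch with simulated annual maxima; the Gumbel parameters are illustrative assumptions, not the Whiteadder data.

```python
import numpy as np

rng = np.random.default_rng(9)
annual_max = rng.gumbel(loc=100.0, scale=20.0, size=10000)  # simulated flows

def return_level(data, T):
    """Empirical level exceeded with probability 1/T per period."""
    return np.quantile(data, 1 - 1 / T)

level_100 = return_level(annual_max, 100)  # ~100-year return level
print(level_100)
```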

  1. Statistical Physics of Fracture

    SciTech Connect

    Alava, Mikko; Nukala, Phani K; Zapperi, Stefano

    2006-05-01

    Disorder and long-range interactions are two of the key components that make material failure an interesting playfield for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture, in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds is subject to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among these partly still controversial issues are the scaling and size effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.

  2. Suite versus composite statistics

    USGS Publications Warehouse

    Balsillie, J.H.; Tanner, W.F.

    1999-01-01

    Suite and composite methodologies, two statistically valid approaches for producing statistical descriptive measures, are investigated for sample groups representing a probability distribution where, in addition, each sample is itself a probability distribution. Suite and composite means (first moment measures) are always equivalent. Composite standard deviations (second moment measures) are always larger than suite standard deviations. Suite and composite values for higher moment measures have more complex relationships. Very seldom, however, are they equivalent, and they normally yield statistically significant but different results. Multiple samples are preferable to single samples (including composites) because they permit the investigator to examine sample-to-sample variability. These and other relationships for suite and composite probability distribution analyses are investigated and reported using granulometric data.
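
    The suite/composite relations above (equal means, larger composite standard deviation) are easy to verify on toy samples. Here the suite statistic is taken as the average of the per-sample values, which is one common reading of the suite approach; the numbers are purely illustrative.

```python
import numpy as np

samples = [np.array([1.0, 2.0, 3.0]),
           np.array([10.0, 11.0, 12.0]),
           np.array([5.0, 6.0, 7.0])]

# suite statistics: compute the measure per sample, then average across the suite
suite_mean = np.mean([s.mean() for s in samples])
suite_std = np.mean([s.std() for s in samples])

# composite statistics: pool all observations first
composite = np.concatenate(samples)
composite_mean = composite.mean()
composite_std = composite.std()

# with equal-sized samples the means agree; the composite std also carries
# the between-sample spread, so it is larger than the suite std
print(suite_mean, composite_mean, suite_std, composite_std)
```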

  3. Period meter for reactors

    DOEpatents

    Rusch, Gordon K.

    1976-01-06

    An improved log N amplifier type nuclear reactor period meter with reduced probability for noise-induced scrams is provided. With the reactor at low power levels a sampling circuit is provided to determine the reactor period by measuring the finite change in the amplitude of the log N amplifier output signal for a predetermined time period, while at high power levels, differentiation of the log N amplifier output signal provides an additional measure of the reactor period.
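
    The low-power sampling scheme described above reduces to a small calculation: with N(t) = N0*exp(t/T), the log N output is linear in time, so its finite change over a sampling interval gives 1/T directly. The numbers below are illustrative, not from the patent.

```python
import math

T_true = 30.0  # reactor period, seconds
dt = 2.0       # sampling interval, seconds
n0 = 1.0e6
n1 = n0 * math.exp(dt / T_true)  # flux after one sampling interval

# finite change of the log N amplifier output over dt estimates 1/T
log_change = math.log(n1) - math.log(n0)
T_est = dt / log_change
print(T_est)
```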

  4. Perception in statistical graphics

    NASA Astrophysics Data System (ADS)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  5. The Periodic Pyramid

    ERIC Educational Resources Information Center

    Hennigan, Jennifer N.; Grubbs, W. Tandy

    2013-01-01

    The chemical elements present in the modern periodic table are arranged in terms of atomic numbers and chemical periodicity. Periodicity arises from quantum mechanical limitations on how many electrons can occupy various shells and subshells of an atom. The shell model of the atom predicts that a maximum of 2, 8, 18, and 32 electrons can occupy…

  6. Community College Periodicals.

    ERIC Educational Resources Information Center

    Pederson, Eldor O.

    Drawing from an examination of community college periodicals, their availability and characteristics, the academic affiliations of contributing authors, and the topics of their articles, this paper discusses the minor role which community college periodicals appear to play. A list of 35 periodicals dealing primarily with community college education…

  7. Flipping the statistics classroom in nursing education.

    PubMed

    Schwartz, Todd A

    2014-04-01

    Flipped classrooms are so named because they substitute the traditional lecture that commonly encompasses the entire class period with active learning techniques, such as small-group work. The lectures are delivered instead by using an alternative mode--video recordings--that are made available for viewing online outside the class period. Due to this inverted approach, students are engaged with the course material during the class period, rather than participating only passively. This flipped approach is gaining popularity in many areas of education due to its enhancement of student learning and represents an opportunity for utilization by instructors of statistics courses in nursing education. This article presents the author's recent experiences with flipping a statistics course for nursing students in a PhD program, including practical considerations and student outcomes and reaction. This transformative experience deepened the level of student learning in a way that may not have occurred using a traditional format.

  8. Statistical origin of gravity

    SciTech Connect

    Banerjee, Rabin; Majhi, Bibhas Ranjan

    2010-06-15

    Starting from the definition of entropy used in statistical mechanics we show that it is proportional to the gravity action. For a stationary black hole this entropy is expressed as S=E/2T, where T is the Hawking temperature and E is shown to be the Komar energy. This relation is also compatible with the generalized Smarr formula for mass.
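
    The relation S = E/2T can be checked for a Schwarzschild black hole in geometrized units (G = c = ħ = k_B = 1), where the Komar energy is E = M and the Hawking temperature is T = 1/(8πM); the result reproduces the Bekenstein-Hawking entropy A/4. This is a sanity check of the stated relation, not part of the paper's derivation:

```python
import math

M = 2.5  # black hole mass in geometrized units; arbitrary illustrative choice

E = M                        # Komar energy of a Schwarzschild black hole
T = 1.0 / (8 * math.pi * M)  # Hawking temperature
S = E / (2 * T)              # entropy from the relation S = E / 2T

A = 16 * math.pi * M**2      # horizon area of the Schwarzschild black hole
print(S, A / 4)              # both equal 4*pi*M^2: the Bekenstein-Hawking entropy
```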

  9. Statistical Reasoning over Lunch

    ERIC Educational Resources Information Center

    Selmer, Sarah J.; Bolyard, Johnna J.; Rye, James A.

    2011-01-01

    Students in the 21st century are exposed daily to a staggering amount of numerically infused media. In this era of abundant numeric data, students must be able to engage in sound statistical reasoning when making life decisions after exposure to varied information. The context of nutrition can be used to engage upper elementary and middle school…

  10. Analogies for Understanding Statistics

    ERIC Educational Resources Information Center

    Hocquette, Jean-Francois

    2004-01-01

    This article describes a simple way to explain the limitations of statistics to scientists and students to avoid the publication of misleading conclusions. Biologists examine their results extremely critically and carefully choose the appropriate analytic methods depending on their scientific objectives. However, no such close attention is usually…

  11. Statistical Significance Testing.

    ERIC Educational Resources Information Center

    McLean, James E., Ed.; Kaufman, Alan S., Ed.

    1998-01-01

    The controversy about the use or misuse of statistical significance testing has become the major methodological issue in educational research. This special issue contains three articles that explore the controversy, three commentaries on these articles, an overall response, and three rejoinders by the first three authors. They are: (1)…

  12. Structurally Sound Statistics Instruction

    ERIC Educational Resources Information Center

    Casey, Stephanie A.; Bostic, Jonathan D.

    2016-01-01

    The Common Core's Standards for Mathematical Practice (SMP) call for all K-grade 12 students to develop expertise in the processes and proficiencies of doing mathematics. However, the Common Core State Standards for Mathematics (CCSSM) (CCSSI 2010) as a whole addresses students' learning of not only mathematics but also statistics. This situation…

  13. General Aviation Avionics Statistics.

    DTIC Science & Technology

    1980-12-01

    [Report documentation fragment: FAA-MS-80-7, December 1980, "General Aviation Avionics Statistics." The recoverable equipment list includes: altimeter, compass, tachometer, oil temperature, emergency locator, fuel gage, landing gear, belts, and special equipment for over-water operation.]

  14. NACME Statistical Report 1986.

    ERIC Educational Resources Information Center

    Miranda, Luis A.; Ruiz, Esther

    This statistical report summarizes data on enrollment and graduation of minority students in engineering degree programs from 1974 to 1985. First, an introduction identifies major trends and briefly describes the Incentive Grants Program (IGP), the nation's largest privately supported source of scholarship funds available to minority engineering…

  15. Probability and Statistics.

    ERIC Educational Resources Information Center

    Barnes, Bernis, Ed.; And Others

    This teacher's guide to probability and statistics contains three major sections. The first section on elementary combinatorial principles includes activities, student problems, and suggested teaching procedures for the multiplication principle, permutations, and combinations. Section two develops an intuitive approach to probability through…

  16. Selected Manpower Statistics.

    ERIC Educational Resources Information Center

    Office of the Assistant Secretary of Defense -- Comptroller (DOD), Washington, DC.

    This document contains summaries of basic manpower statistical data for the Department of Defense, with the Army, Navy, Marine Corps, and Air Force totals shown separately and collectively. Included are figures for active duty military personnel, civilian personnel, reserve components, and retired military personnel. Some of the data show…

  17. Statistics of mass production

    NASA Astrophysics Data System (ADS)

    Williams, R. L.; Gateley, Wilson Y.

    1993-05-01

    This paper summarizes the statistical quality control methods and procedures that can be employed in mass producing electronic parts (integrated circuits, buffers, capacitors, connectors) to reduce variability and ensure performance to specified radiation, current, voltage, temperature, shock, and vibration levels. Producing such quality parts reduces uncertainties in performance and will aid materially in validating the survivability of components, subsystems, and systems to specified threats.

  18. Statistics for Learning Genetics

    ERIC Educational Resources Information Center

    Charles, Abigail Sheena

    2012-01-01

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing…

  19. Whither Statistics Education Research?

    ERIC Educational Resources Information Center

    Watson, Jane

    2016-01-01

    This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement this country had including "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…

  20. Quartiles in Elementary Statistics

    ERIC Educational Resources Information Center

    Langford, Eric

    2006-01-01

    The calculation of the upper and lower quartile values of a data set in an elementary statistics course is done in at least a dozen different ways, depending on the text or computer/calculator package being used (such as SAS, JMP, MINITAB, "Excel," and the TI-83 Plus). In this paper, we examine the various methods and offer a suggestion for a new…
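
    The multiplicity of conventions is easy to demonstrate with NumPy, whose percentile function exposes several of them through its method parameter (called interpolation in NumPy releases before 1.22). The data set here is arbitrary:

```python
import numpy as np

data = [1, 3, 5, 7, 9, 11, 13, 15]

# A few of the many conventions for the first quartile.
for method in ["linear", "lower", "higher", "midpoint", "nearest"]:
    q1 = np.percentile(data, 25, method=method)
    print(f"{method:>8}: Q1 = {q1}")

# The textbook "median of the lower half" convention gives yet another answer:
lower_half = data[: len(data) // 2]
print("  halves: Q1 =", np.median(lower_half))
```

Running this on the eight values above yields five different answers for the first quartile (4.5, 3, 5, 4, 5) plus 4 from the median-of-halves rule, which is exactly the inconsistency the article examines.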

  1. Mental Illness Statistics

    MedlinePlus

    The National Institute of Mental Health (NIMH) is part of the National Institutes of ...

  2. Statistical Energy Analysis Program

    NASA Technical Reports Server (NTRS)

    Ferebee, R. C.; Trudell, R. W.; Yano, L. I.; Nygaard, S. I.

    1985-01-01

    Statistical Energy Analysis (SEA) is a powerful tool for estimating the high-frequency vibration spectra of complex structural systems, and it has been incorporated into a computer program. The basic SEA analysis procedure is divided into three steps: idealization, parameter generation, and problem solution. The SEA computer program is written in FORTRAN V for batch execution.
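
    A minimal sketch of the "problem solution" step for a two-subsystem steady-state SEA model, assuming hypothetical loss factors, modal densities, and input power. The power-balance form and the reciprocity relation η12·n1 = η21·n2 are standard SEA, not taken from the program itself:

```python
import numpy as np

# Hypothetical parameters for a two-subsystem steady-state SEA model.
omega = 2 * np.pi * 1000.0   # analysis band center frequency, rad/s
eta1, eta2 = 0.01, 0.02      # damping loss factors
eta12 = 0.005                # coupling loss factor, subsystem 1 -> 2
n1, n2 = 10.0, 5.0           # modal densities
eta21 = eta12 * n1 / n2      # reciprocity: eta12 * n1 = eta21 * n2
P = np.array([1.0, 0.0])     # external input power: 1 W into subsystem 1

# Power balance: P_i = omega * (eta_i*E_i + eta_ij*E_i - eta_ji*E_j)
A = omega * np.array([
    [eta1 + eta12, -eta21],
    [-eta12, eta2 + eta21],
])
E = np.linalg.solve(A, P)    # steady-state subsystem vibrational energies
print(E)
```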

  3. Library Research and Statistics.

    ERIC Educational Resources Information Center

    Lynch, Mary Jo; St. Lifer, Evan; Halstead, Kent; Fox, Bette-Lee; Miller, Marilyn L.; Shontz, Marilyn L.

    2001-01-01

    These nine articles discuss research and statistics on libraries and librarianship, including libraries in the United States, Canada, and Mexico; acquisition expenditures in public, academic, special, and government libraries; price indexes; state rankings of public library data; library buildings; expenditures in school library media centers; and…

  4. Statistics for Learning Genetics

    NASA Astrophysics Data System (ADS)

    Charles, Abigail Sheena

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To fulfill this, two college-level classes at two universities were surveyed, one located in the northeastern US and the other in the West Indies. There was a sample size of 42 students, and a supplementary interview was administered to a select 9 students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that students had very little to no background in statistics (55%). Although students did perform well on exams, with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used textbooks in genetics teaching, as well as in the genetics syllabi used by instructors, do not help the issue. It was found that the textbooks oftentimes either did not give effective explanations for students or completely left out certain topics. The omission of certain statistical/mathematical oriented topics was seen to be also true with the genetics syllabi reviewed for this study. Nonetheless

  5. Programmable Periodicity of Quantum Dot Arrays with DNA Origami Nanotubes

    PubMed Central

    2010-01-01

    To fabricate quantum dot arrays with programmable periodicity, functionalized DNA origami nanotubes were developed. Selected DNA staple strands were biotin-labeled to form periodic binding sites for streptavidin-conjugated quantum dots. Successful formation of arrays with periods of 43 and 71 nm demonstrates precise, programmable, large-scale nanoparticle patterning; however, limitations in array periodicity were also observed. Statistical analysis of AFM images revealed evidence for steric hindrance or site bridging that limited the minimum array periodicity. PMID:20681601

  6. Cancer statistics for African Americans.

    PubMed

    Ghafoor, Asma; Jemal, Ahmedin; Cokkinides, Vilma; Cardinez, Cheryll; Murray, Taylor; Samuels, Alicia; Thun, Michael J

    2002-01-01

    The American Cancer Society provides estimates on the number of new cancer cases and deaths, and compiles health statistics on African Americans in a biennial publication, Cancer Facts and Figures for African Americans. The compiled statistics include cancer incidence, mortality, survival, and lifestyle behaviors using the most recent data on incidence and survival from the National Cancer Institute's (NCI) Surveillance, Epidemiology, and End Results (SEER) program, mortality data from the National Center for Health Statistics (NCHS), and behavioral information from the Behavior Risk Factor Surveillance System (BRFSS), Youth Risk Behavior Surveillance System (YRBSS), and National Health Interview Survey (NHIS). It is estimated that 132,700 new cases of cancer and 63,100 deaths will occur among African Americans in the year 2003. Although African Americans have experienced higher incidence and mortality rates of cancer than whites for many years, incidence rates have declined by 2.7 percent per year in African-American males since 1992, while stabilizing in African-American females. During the same period, death rates declined by 2.1 percent and 0.4 percent per year among African-American males and females, respectively. The decrease in both incidence and death rates from cancer among African-American males was the largest of any racial or ethnic group. Nonetheless, African Americans still carry the highest cancer burden among US racial and ethnic groups. Most cancers detectable by screening are diagnosed at a later stage and survival rates are lower within each stage of disease in African Americans than in whites. The extent to which these disparities reflect unequal access to health care versus other factors is an active area of research.

  7. The Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2010-12-01

    We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density
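
    The central claim, that a product of arbitrarily distributed positive factors tends toward a lognormal, can be checked by simulation: the log of the product is a sum of independent logs, so the CLT makes it approximately Gaussian. The seven distributions below are arbitrary stand-ins for the Drake factors, not values proposed by the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
trials = 100_000

# Seven positive factors, each with a different (arbitrary) distribution,
# standing in for the seven Drake-equation terms.
factors = [
    rng.uniform(1.0, 10.0, trials),
    rng.lognormal(0.0, 0.5, trials),
    rng.gamma(2.0, 1.0, trials),
    rng.uniform(0.1, 1.0, trials),
    rng.exponential(1.0, trials) + 1.0,
    rng.uniform(0.5, 2.0, trials),
    rng.lognormal(1.0, 0.3, trials),
]
N = np.prod(factors, axis=0)

# log N is a sum of 7 independent logs, so by the CLT it should be close
# to Gaussian (small skewness), i.e. N should be close to lognormal.
logN = np.log(N)
skew_logN = np.mean((logN - logN.mean()) ** 3) / logN.std() ** 3
skew_N = np.mean((N - N.mean()) ** 3) / N.std() ** 3
print(f"skewness of log N: {skew_logN:.3f}  (near 0 for a Gaussian)")
print(f"skewness of N:     {skew_N:.3f}  (large: heavy lognormal-like tail)")
```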

  8. The Statistical Fermi Paradox

    NASA Astrophysics Data System (ADS)

    Maccone, C.

    In this paper the statistical generalization of the Fermi paradox is provided. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of Statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy is inversely proportional to the cubic root of NHab. This distance is denoted by the new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in

  9. Statistical Inference: The Big Picture.

    PubMed

    Kass, Robert E

    2011-02-01

    Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mis-characterize the process of statistical inference and I propose an alternative "big picture" depiction.

  10. [Human development and log-periodic law].

    PubMed

    Cash, Roland; Chaline, Jean; Nottale, Laurent; Grou, Pierre

    2002-05-01

    We suggest applying the log-periodic law formerly used to describe various crisis phenomena, in biology (evolutionary leaps), inorganic systems (earthquakes), societies and economy (economic crisis, market crashes) to the various steps of human ontogeny. We find a statistically significant agreement between this model and the data.
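
    A toy log-periodic sequence makes the structure concrete: if the intervals remaining before a critical time T_c shrink geometrically, successive gaps between events have a constant ratio, so the events accelerate toward T_c. All numbers here are illustrative, not fitted to ontogenetic data:

```python
# Toy log-periodic event sequence: the distance to a critical time T_c
# shrinks geometrically, T_c - t_n = (T_c - t_0) / g**n.
T_c = 100.0   # critical time (illustrative)
g = 2.0       # scaling ratio (illustrative)
t0 = 0.0      # first event

events = [T_c - (T_c - t0) / g**n for n in range(8)]
gaps = [b - a for a, b in zip(events, events[1:])]
ratios = [a / b for a, b in zip(gaps, gaps[1:])]
print(events)
print(ratios)  # every ratio equals g: the signature of log-periodicity
```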

  11. Pain: A Statistical Account

    PubMed Central

    Thacker, Michael A.; Moseley, G. Lorimer

    2017-01-01

    Perception is seen as a process that utilises partial and noisy information to construct a coherent understanding of the world. Here we argue that the experience of pain is no different; it is based on incomplete, multimodal information, which is used to estimate potential bodily threat. We outline a Bayesian inference model, incorporating the key components of cue combination, causal inference, and temporal integration, which highlights the statistical problems in everyday perception. It is from this platform that we are able to review the pain literature, providing evidence from experimental, acute, and persistent phenomena to demonstrate the advantages of adopting a statistical account in pain. Our probabilistic conceptualisation suggests a principles-based view of pain, explaining a broad range of experimental and clinical findings and making testable predictions. PMID:28081134
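
    The cue-combination component of such a model reduces, in the Gaussian case, to standard precision-weighted averaging. A minimal sketch with hypothetical "nociceptive" and "visual" cues; the numbers are illustrative, not from the paper:

```python
# Precision-weighted combination of two Gaussian cues (a standard Bayesian
# result; all values below are illustrative).
def combine(mu1, var1, mu2, var2):
    """Fuse two noisy Gaussian cues into a posterior mean and variance."""
    w1 = (1 / var1) / (1 / var1 + 1 / var2)   # weight = relative precision
    mu = w1 * mu1 + (1 - w1) * mu2
    var = 1 / (1 / var1 + 1 / var2)           # combined estimate is less noisy
    return mu, var

# A precise nociceptive cue and a vague visual cue about bodily threat:
mu, var = combine(mu1=8.0, var1=1.0, mu2=2.0, var2=9.0)
print(mu, var)  # posterior pulled strongly toward the more precise cue
```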

  12. Periodic chiral structures

    NASA Technical Reports Server (NTRS)

    Jaggard, Dwight L.; Engheta, Nader; Pelet, Philippe; Liu, John C.; Kowarz, Marek W.; Kim, Yunjin

    1989-01-01

    The electromagnetic properties of a structure that is both chiral and periodic are investigated using coupled-mode equations. The periodicity is described by a sinusoidal perturbation of the permittivity, permeability, and chiral admittance. The coupled-mode equations are derived from physical considerations and used to examine bandgap structure and reflected and transmitted fields. Chirality is observed predominantly in transmission, whereas periodicity is present in both reflection and transmission.

  13. 1979 DOE statistical symposium

    SciTech Connect

    Gardiner, D.A.; Truett T.

    1980-09-01

    The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation.

  14. Relativistic statistical arbitrage

    NASA Astrophysics Data System (ADS)

    Wissner-Gross, A. D.; Freer, C. E.

    2010-11-01

    Recent advances in high-frequency financial trading have made light propagation delays between geographically separated exchanges relevant. Here we show that there exist optimal locations from which to coordinate the statistical arbitrage of pairs of spacelike separated securities, and calculate a representative map of such locations on Earth. Furthermore, trading local securities along chains of such intermediate locations results in a novel econophysical effect, in which the relativistic propagation of tradable information is effectively slowed or stopped by arbitrage.
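
    The basic geometry can be sketched for a single pair of exchanges: on a sphere, the great-circle midpoint equalizes the one-way light delays to the two endpoints, which is the idealized coordination point for that pair. This is a toy calculation assuming vacuum light speed and a spherical Earth; the coordinates (roughly New York and London) are illustrative:

```python
import math

def to_xyz(lat, lon):
    """Unit-sphere Cartesian coordinates from latitude/longitude in degrees."""
    la, lo = math.radians(lat), math.radians(lon)
    return (math.cos(la) * math.cos(lo), math.cos(la) * math.sin(lo), math.sin(la))

def angle(p, q):
    """Central angle between two unit vectors."""
    return math.acos(max(-1.0, min(1.0, sum(a * b for a, b in zip(p, q)))))

R = 6371e3         # mean Earth radius, m
C = 299_792_458.0  # speed of light, m/s

# Two exchanges (illustrative coordinates: New York and London).
ny, ldn = to_xyz(40.7, -74.0), to_xyz(51.5, -0.1)

# Great-circle midpoint: normalized vector sum of the endpoints.
mid = tuple(a + b for a, b in zip(ny, ldn))
norm = math.sqrt(sum(c * c for c in mid))
mid = tuple(c / norm for c in mid)

# One-way light delays from the midpoint to each exchange are equal,
# making it the natural coordination point for this pair.
d1 = R * angle(mid, ny) / C
d2 = R * angle(mid, ldn) / C
print(f"{d1 * 1e3:.3f} ms  {d2 * 1e3:.3f} ms")
```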

  15. Statistical Methods in Cosmology

    NASA Astrophysics Data System (ADS)

    Verde, L.

    2010-03-01

    The advent of large data sets in cosmology has meant that in the past 10 or 20 years our knowledge and understanding of the Universe has changed not only quantitatively but also, and most importantly, qualitatively. Cosmologists rely on data in which a host of useful information is enclosed, but encoded in a non-trivial way. The challenges in extracting this information must be overcome to make the most of a large experimental effort. Even after having converged to a standard cosmological model (the LCDM model) we should keep in mind that this model is described by 10 or more physical parameters, and if we want to study deviations from it, the number of parameters is even larger. Dealing with such a high-dimensional parameter space and finding parameter constraints is a challenge in itself. Cosmologists want to be able to compare and combine different data sets, both for testing for possible disagreements (which could indicate new physics) and for improving parameter determinations. Finally, cosmologists in many cases want to find out, before actually doing the experiment, how much one would be able to learn from it. For all these reasons, sophisticated statistical techniques are being employed in cosmology, and it has become crucial to know some statistical background to understand recent literature in the field. I will introduce some statistical tools that any cosmologist should know about in order to be able to understand recently published results from the analysis of cosmological data sets. I will not present a complete and rigorous introduction to statistics, as there are several good books which are reported in the references. The reader should refer to those.

  16. Statistical Challenges of Astronomy

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.; Babu, G. Jogesh

    Digital sky surveys, data from orbiting telescopes, and advances in computation have increased the quantity and quality of astronomical data by several orders of magnitude in recent years. Making sense of this wealth of data requires sophisticated statistical and data analytic techniques. Fortunately, statistical methodologies have similarly made great strides in recent years. Powerful synergies thus emerge when astronomers and statisticians join in examining astrostatistical problems and approaches. The volume focuses on several themes: · The increasing power of Bayesian approaches to modeling astronomical data · The growth of enormous databases, leading an emerging federated Virtual Observatory, and their impact on modern astronomical research · Statistical modeling of critical datasets, such as galaxy clustering and fluctuations in the microwave background radiation, leading to a new era of precision cosmology · Methodologies for uncovering clusters and patterns in multivariate data · The characterization of multiscale patterns in imaging and time series data As in earlier volumes in this series, research contributions discussing topics in one field are joined with commentary from scholars in the other. Short contributed papers covering dozens of astrostatistical topics are also included.

  17. Statistics in fusion experiments

    NASA Astrophysics Data System (ADS)

    McNeill, D. H.

    1997-11-01

    Since the reasons for the variability in data from plasma experiments are often unknown or uncontrollable, statistical methods must be applied. Reliable interpretation and public accountability require full data sets. Two examples of data misrepresentation at PPPL are analyzed: (1) Te > 100 eV on the S-1 spheromak.(M. Yamada, Nucl. Fusion 25, 1327 (1985); reports to DoE; etc.) The reported high values (statistical artifacts of Thomson scattering measurements) were selected from a mass of data with an average of 40 eV or less. "Correlated" spectroscopic data were meaningless. (2) Extrapolation to Q >= 0.5 for DT in TFTR.(D. Meade et al., IAEA Baltimore (1990), V. 1, p. 9; H. P. Furth, Statements to U. S. Congress (1989).) The DD yield used there was the highest through 1990 (>= 50% above average) and the DT to DD power ratio used was about twice any published value. Average DD yields and published yield ratios scale to Q < 0.15 for DT, in accord with the observed performance over the last 3 1/2 years. Press reports of outlier data from TFTR have obscured the fact that the DT behavior follows from trivial scaling of the DD data. Good practice in future fusion research would have confidence intervals and other descriptive statistics accompanying reported numerical values (cf. JAMA).

  18. Statistical Analysis of Refractivity in UAE

    NASA Astrophysics Data System (ADS)

    Al-Ansari, Kifah; Al-Mal, Abdulhadi Abu; Kamel, Rami

    2007-07-01

    This paper presents the results of the refractivity statistics in the UAE (United Arab Emirates) for a period of 14 years (1990-2003). Six sites have been considered using meteorological surface data (Abu Dhabi, Dubai, Sharjah, Al-Ain, Ras Al-Kaimah, and Al-Fujairah). Upper air (radiosonde) data were available at one site only, Abu Dhabi airport, which has been considered for the refractivity gradient statistics. Monthly and yearly averages are obtained for the two parameters, refractivity and refractivity gradient. Cumulative distributions are also provided.
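
    Surface refractivity of the kind averaged in this study is conventionally computed from pressure, temperature, and water-vapour pressure via the standard ITU-R P.453 expression; the sample inputs below are illustrative hot, humid coastal conditions, not taken from the UAE data:

```python
def refractivity(p_hpa, e_hpa, t_kelvin):
    """Radio refractivity N (in N-units) from surface meteorological data,
    per the standard ITU-R P.453 expression N = 77.6/T * (P + 4810*e/T),
    with total pressure P and water-vapour pressure e in hPa, T in kelvin."""
    return 77.6 / t_kelvin * (p_hpa + 4810.0 * e_hpa / t_kelvin)

# Illustrative hot, humid surface conditions:
n_units = refractivity(p_hpa=1005.0, e_hpa=30.0, t_kelvin=308.0)
print(f"N = {n_units:.1f} N-units")
```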

  19. Statistical Inference at Work: Statistical Process Control as an Example

    ERIC Educational Resources Information Center

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…
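
    The prototypical SPC inference can be sketched in a few lines: estimate a center line and 3-sigma Shewhart limits from in-control baseline data, then flag new observations that breach them. All data here are hypothetical:

```python
import statistics

# In-control baseline measurements from a hypothetical process:
baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 9.7, 10.3, 10.0, 9.9]
center = statistics.mean(baseline)
sigma = statistics.pstdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # Shewhart 3-sigma limits

# New observations; SPC infers "out of control" when a point breaches a limit.
new_points = [10.0, 10.2, 9.8, 11.4]
flags = [not (lcl <= x <= ucl) for x in new_points]
print(f"limits: [{lcl:.2f}, {ucl:.2f}]  flags: {flags}")
```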

  20. Periodic Solutions of Spatially Periodic Hamiltonian Systems

    DTIC Science & Technology

    1989-07-10

    Theorem 0.2 generalizes Theorem 1.5 of Rabinowitz in [20]. Equation (0.1), under spatially periodic assumptions, has been studied by several au... n × n symmetric matrix, H satisfying (H0), (H1) and (H2), and f = (0, fq) satisfying (f0), (f1) and (f2), Rabinowitz in [20] showed the existence of... Rabinowitz [17]. We consider a functional I : E × M → R of class C¹, where E is a Hilbert space and M is a compact manifold. Assuming that I satisfies a

  1. The Living Periodic Table

    ERIC Educational Resources Information Center

    Nahlik, Mary Schrodt

    2005-01-01

    To help make the abstract world of chemistry more concrete for eighth-grade students, the author has them create a living periodic table that can be displayed in the classroom or hallway. This display includes information about the elements arranged in the traditional periodic table format, but also includes visual real-world representations of the…

  2. Multidimensional period doubling structures.

    PubMed

    Lee, Jeong Yup; Flom, Dvir; Ben-Abraham, Shelomo I

    2016-05-01

    This paper develops the formalism necessary to generalize the period doubling sequence to arbitrary dimension by straightforward extension of the substitution and recursion rules. It is shown that the period doubling structures of arbitrary dimension are pure point diffractive. The symmetries of the structures are pointed out.
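
    The one-dimensional period doubling sequence that the paper generalizes can be generated by a substitution rule. The sketch below uses the common convention a → ab, b → aa; the paper's symbol convention may differ.

```python
# The 1D period-doubling substitution: each iteration rewrites every
# symbol, doubling the word length (hence "period doubling").
RULE = {"a": "ab", "b": "aa"}

def iterate(word, n):
    """Apply the substitution rule n times."""
    for _ in range(n):
        word = "".join(RULE[c] for c in word)
    return word

print(iterate("a", 3))  # abaaabab
```

    Because a → ab starts with a, each iterate is a prefix of the next, so the rule has a well-defined infinite fixed point; the higher-dimensional structures in the paper extend exactly this recursion.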

  3. Latent Period of Relaxation.

    PubMed

    Kobayashi, M; Irisawa, H

    1961-10-27

    The latent period of relaxation of molluscan myocardium due to anodal current is much longer than that of contraction. Although the rate and the grade of relaxation are intimately related to both the stimulus condition and the muscle tension, the latent period of relaxation remains constant, except when the temperature of the bathing fluid is changed.

  4. Truth, Damn Truth, and Statistics

    ERIC Educational Resources Information Center

    Velleman, Paul F.

    2008-01-01

    Statisticians and Statistics teachers often have to push back against the popular impression that Statistics teaches how to lie with data. Those who believe incorrectly that Statistics is solely a branch of Mathematics (and thus algorithmic), often see the use of judgment in Statistics as evidence that we do indeed manipulate our results. In the…

  5. Experimental Mathematics and Computational Statistics

    SciTech Connect

    Bailey, David H.; Borwein, Jonathan M.

    2009-04-30

    The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.

  6. Who Needs Statistics? | Poster

    Cancer.gov

    You may know the feeling. You have collected a lot of new data on an important experiment. Now you are faced with multiple groups of data, a sea of numbers, and a deadline for submitting your paper to a peer-reviewed journal. And you are not sure which data are relevant, or even the best way to present them. The statisticians at Data Management Services (DMS) know how to help. This small group of experts provides a wide array of statistical and mathematical consulting services to the scientific community at NCI at Frederick and NCI-Bethesda.

  7. International petroleum statistics report

    SciTech Connect

    1995-10-01

    The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report presents data on international oil production, demand, imports, exports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.

  8. NASA Pocket Statistics

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Pocket Statistics is published for the use of NASA managers and their staff. Included herein is Administrative and Organizational information, summaries of Space Flight Activity including the NASA Major Launch Record, and NASA Procurement, Financial, and Manpower data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  9. NASA Pocket Statistics

    NASA Technical Reports Server (NTRS)

    1995-01-01

    NASA Pocket Statistics is published for the use of NASA managers and their staff. Included herein is Administrative and Organizational information, summaries of Space Flight Activity including the NASA Major Launch Record, and NASA Procurement, Financial, and Manpower data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  10. NASA Pocket Statistics

    NASA Technical Reports Server (NTRS)

    1996-01-01

    This booklet of pocket statistics includes the 1996 NASA Major Launch Record, NASA Procurement, Financial, and Workforce data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  11. Genealogy of periodic trajectories

    SciTech Connect

    de Aguiar, M.A.M.; Malta, C.P.; de Passos, E.J.V.

    1986-05-20

    The periodic solutions of non-integrable classical Hamiltonian systems with two degrees of freedom are numerically investigated. Curves of periodic families are given in plots of energy vs. period. Results are presented for the Hamiltonian H = (1/2)(p_x^2 + p_y^2) + (1/2)x^2 + (3/2)y^2 - x^2 y + (1/12)x^4. Properties of the families of curves are pointed out. (LEW)

  12. Periodically poled silicon

    NASA Astrophysics Data System (ADS)

    Hon, Nick K.; Tsia, Kevin K.; Solli, Daniel R.; Jalali, Bahram

    2009-03-01

    We propose a new class of photonic devices based on periodic stress fields in silicon that enable second-order nonlinearity as well as quasi-phase matching. Periodically poled silicon (PePSi) adds the periodic poling capability to silicon photonics and allows the excellent crystal quality and advanced manufacturing capabilities of silicon to be harnessed for devices based on second-order nonlinear effects. As an example of the utility of the PePSi technology, we present simulations showing that midwave infrared radiation can be efficiently generated through difference frequency generation from near-infrared with a conversion efficiency of 50%.
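
    The mid-IR generation mentioned follows from photon-energy conservation in difference frequency generation: 1/λ_idler = 1/λ_pump - 1/λ_signal. The wavelengths in this sketch are illustrative, not taken from the paper.

```python
# Photon-energy conservation for difference-frequency generation (DFG);
# the 1.064 um / 1.55 um inputs are illustrative near-IR choices.
def dfg_idler_um(pump_um, signal_um):
    """Idler wavelength (um) generated by DFG of two shorter wavelengths."""
    return 1.0 / (1.0 / pump_um - 1.0 / signal_um)

print(round(dfg_idler_um(1.064, 1.55), 2))  # 3.39 (midwave infrared)
```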

  13. Statistics of superior records

    NASA Astrophysics Data System (ADS)

    Ben-Naim, E.; Krapivsky, P. L.

    2013-08-01

    We study statistics of records in a sequence of random variables. These independent and identically distributed variables are drawn from the parent distribution ρ. The running record equals the maximum of all elements in the sequence up to a given point. We define a superior sequence as one where all running records are above the average record expected for the parent distribution ρ. We find that the fraction of superior sequences S_N decays algebraically with sequence length N, S_N ~ N^(-β) in the limit N→∞. Interestingly, the decay exponent β is nontrivial, being the root of an integral equation. For example, when ρ is a uniform distribution with compact support, we find β=0.450265. In general, the tail of the parent distribution governs the exponent β. We also consider the dual problem of inferior sequences, where all records are below average, and find that the fraction of inferior sequences I_N decays algebraically, albeit with a different decay exponent, I_N ~ N^(-α). We use the above statistical measures to analyze earthquake data.
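
    The decay of the superior-sequence fraction can be checked with a small Monte Carlo sketch. One assumption, a plausible reading of the abstract rather than the paper's exact definition: the "average record" after n draws of uniform(0,1) is the expected maximum, n/(n+1).

```python
# Monte Carlo sketch of the fraction S_N of "superior sequences" for a
# uniform parent distribution. Assumption: the expected record after n
# uniform(0,1) draws is E[max] = n/(n+1).
import random

def is_superior(n, rng):
    """True if every running record exceeds the expected record i/(i+1)."""
    record = 0.0
    for i in range(1, n + 1):
        record = max(record, rng.random())
        if record <= i / (i + 1):
            return False
    return True

def fraction_superior(n, trials, seed=0):
    rng = random.Random(seed)
    return sum(is_superior(n, rng) for _ in range(trials)) / trials

s10, s100 = fraction_superior(10, 20000), fraction_superior(100, 20000)
print(s10, s100)  # the fraction shrinks as N grows, consistent with S_N ~ N^-beta
```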

  14. Taking a statistical approach

    SciTech Connect

    Wild, M.; Rouhani, S.

    1995-02-01

    A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. The use of geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and determine complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs, long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
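
    The observation that nearby samples tend to be more similar than distant ones is exactly what an empirical semivariogram quantifies. Below is a minimal sketch on synthetic 1-D data (not real site data): γ(h) should be small at short lags and grow with distance for a spatially correlated field.

```python
# Empirical semivariogram sketch: gamma(h) = mean of 0.5*(z_i - z_j)^2
# over sample pairs separated by roughly lag h. Data are synthetic.
import math, random

rng = random.Random(1)
xs = [float(i) for i in range(60)]
# smooth synthetic "contamination" field plus a little measurement noise
zs = [math.sin(x / 8.0) + 0.1 * rng.gauss(0, 1) for x in xs]

def semivariance(lag, tol=0.5):
    """Average half squared difference over pairs at distance ~lag."""
    pairs = [0.5 * (zi - zj) ** 2
             for i, (xi, zi) in enumerate(zip(xs, zs))
             for xj, zj in zip(xs[i + 1:], zs[i + 1:])
             if abs(abs(xi - xj) - lag) <= tol]
    return sum(pairs) / len(pairs)

print(semivariance(1.0), semivariance(20.0))  # short lag << long lag
```

    Kriging then inverts this fitted spatial structure to estimate values at unsampled locations, which is the mechanism behind the sampling-reduction claim above.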

  15. Fragile entanglement statistics

    NASA Astrophysics Data System (ADS)

    Brody, Dorje C.; Hughston, Lane P.; Meier, David M.

    2015-10-01

    If X and Y are independent, Y and Z are independent, and so are X and Z, one might be tempted to conclude that X, Y, and Z are independent. But it has long been known in classical probability theory that, intuitive as it may seem, this is not true in general. In quantum mechanics one can ask whether analogous statistics can emerge for configurations of particles in certain types of entangled states. The explicit construction of such states, along with the specification of suitable sets of observables that have the purported statistical properties, is not entirely straightforward. We show that an example of such a configuration arises in the case of an N-particle GHZ state, and we are able to identify a family of observables with the property that the associated measurement outcomes are independent for any choice of 2, 3, …, N-1 of the particles, even though the measurement outcomes for all N particles are not independent. Although such states are highly entangled, the entanglement turns out to be ‘fragile’, i.e. the associated density matrix has the property that if one traces out the freedom associated with even a single particle, the resulting reduced density matrix is separable.

  16. International petroleum statistics report

    SciTech Connect

    1997-05-01

    The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.

  17. Elements of Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Sachs, Ivo; Sen, Siddhartha; Sexton, James

    2006-05-01

    This textbook provides a concise introduction to the key concepts and tools of modern statistical mechanics. It also covers advanced topics such as non-relativistic quantum field theory and numerical methods. After introducing classical analytical techniques, such as cluster expansion and Landau theory, the authors present important numerical methods with applications to magnetic systems, Lennard-Jones fluids and biophysics. Quantum statistical mechanics is discussed in detail and applied to Bose-Einstein condensation and topics in astrophysics and cosmology. In order to describe emergent phenomena in interacting quantum systems, canonical non-relativistic quantum field theory is introduced and then reformulated in terms of Feynman integrals. Combining the authors' many years' experience of teaching courses in this area, this textbook is ideal for advanced undergraduate and graduate students in physics, chemistry and mathematics. Analytical and numerical techniques appear in one text, including sample codes and solved problems on the web at www.cambridge.org/0521841984. The book covers a wide range of applications, including magnetic systems, turbulence, astrophysics, and biology, and contains a concise introduction to Markov processes and molecular dynamics.

  18. Statistical clumped isotope signatures

    PubMed Central

    Röckmann, T.; Popa, M. E.; Krol, M. C.; Hofmann, M. E. G.

    2016-01-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168
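
    The sign of the statistical anomaly follows from the AM-GM inequality: a·b ≤ ((a+b)/2)² with equality only when a = b, so mixing two isotopically distinct pools always yields apparent anti-clumping against the stochastic reference. A numeric sketch with illustrative heavy-isotope fractions (not values from the paper):

```python
# Statistical (anti-)clumping sketch: two indistinguishable atoms per
# molecule, one drawn from each of two pools with different heavy-isotope
# fractions a and b (values illustrative).
a, b = 0.010, 0.002          # heavy-isotope fraction of pool A and pool B
actual = a * b               # doubly substituted fraction, one atom per pool
bulk = (a + b) / 2           # bulk composition of the combined atoms
stochastic = bulk ** 2       # conventional stochastic reference
anomaly = actual / stochastic - 1
print(anomaly < 0)  # True: a*b <= ((a+b)/2)**2, equality only when a == b
```

    The anomaly's magnitude grows with the difference between the pool compositions, which is why it can serve as a probe of substrate heterogeneity.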

  19. [Comment on] Statistical discrimination

    NASA Astrophysics Data System (ADS)

    Chinn, Douglas

    In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.

  20. Update on Saturn's energetic electron periodicities

    NASA Astrophysics Data System (ADS)

    Carbary, James F.

    2017-01-01

    The periodicities in fluxes of energetic electrons (110-365 keV) in Saturn's magnetosphere were determined from late 2004 to mid-2016. The electron periods were calculated using Lomb periodogram analyses within windows of 200 days at sliding intervals of 10 days, which tracked changes in the periodicity. Sometimes the periodicity showed a clear duality, as in 2007-2008, while at other times the two periods came together so closely as to be indistinguishable, as after equinox in 2010 and in 2015. At still other times, the periodicity apparently vanished altogether, as in 2014. These periodicities generally agreed with those of other phenomena such as the magnetic field and radio emissions. Whether dual or mono, the periods generally remained between 10.58 h and 10.84 h, with two statistical peaks at 10.68 h and 10.81 h. This observation suggests that magnetospheric periodicities at Saturn lie within a limited range of values, which places constraints on the generative mechanism for the phenomena.
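
    The Lomb periodogram analysis used here can be sketched in pure Python. The series below is synthetic and unevenly sampled, with an injected 10.7 h period; the data and scan choices are illustrative, not Cassini measurements.

```python
# Pure-Python Lomb-Scargle sketch: scan trial periods of an unevenly
# sampled series and locate the spectral peak (synthetic data).
import math, random

def lomb_power(t, y, period):
    """Classic Lomb normalized power at one trial period."""
    n = len(y)
    ybar = sum(y) / n
    var = sum((v - ybar) ** 2 for v in y) / (n - 1)
    w = 2 * math.pi / period
    tau = math.atan2(sum(math.sin(2 * w * ti) for ti in t),
                     sum(math.cos(2 * w * ti) for ti in t)) / (2 * w)
    c = sum((v - ybar) * math.cos(w * (ti - tau)) for ti, v in zip(t, y))
    s = sum((v - ybar) * math.sin(w * (ti - tau)) for ti, v in zip(t, y))
    cc = sum(math.cos(w * (ti - tau)) ** 2 for ti in t)
    ss = sum(math.sin(w * (ti - tau)) ** 2 for ti in t)
    return (c * c / cc + s * s / ss) / (2 * var)

rng = random.Random(42)
t = sorted(rng.uniform(0, 500) for _ in range(300))   # uneven sampling
y = [math.sin(2 * math.pi * ti / 10.7) + 0.3 * rng.gauss(0, 1) for ti in t]

periods = [10 + 0.01 * k for k in range(101)]         # scan 10.0-11.0 h
best = max(periods, key=lambda p: lomb_power(t, y, p))
print(round(best, 2))  # peak lands near the injected 10.7 h period
```

    The sliding-window tracking described above amounts to repeating this scan over successive 200-day windows of data.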

  1. Relationship between Graduate Students' Statistics Self-Efficacy, Statistics Anxiety, Attitude toward Statistics, and Social Support

    ERIC Educational Resources Information Center

    Perepiczka, Michelle; Chandler, Nichelle; Becerra, Michael

    2011-01-01

    Statistics plays an integral role in graduate programs. However, numerous intra- and interpersonal factors may lead to successful completion of needed coursework in this area. The authors examined the extent of the relationship between self-efficacy to learn statistics and statistics anxiety, attitude towards statistics, and social support of 166…

  2. Setting the Periodic Table.

    ERIC Educational Resources Information Center

    Saturnelli, Annette

    1985-01-01

    Examines problems resulting from different forms of the periodic table, indicating that New York State schools use a form reflecting the International Union of Pure and Applied Chemistry's 1984 recommendations. Other formats used and reasons for standardization are discussed. (DH)

  3. The Periodic Table CD.

    ERIC Educational Resources Information Center

    Banks, Alton J.; Holmes, Jon L.

    1995-01-01

    Describes the characteristics of the digitized version of The Periodic Table Videodisc. Provides details about the organization of information and access to the data via Macintosh and Windows computers. (DDR)

  4. BIG DATA AND STATISTICS

    PubMed Central

    Rossell, David

    2016-01-01

    Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies. PMID:27722040

  5. Statistical crack mechanics

    SciTech Connect

    Dienes, J.K.

    1983-01-01

    An alternative to the use of plasticity theory to characterize the inelastic behavior of solids is to represent the flaws by statistical methods. We have taken such an approach to study fragmentation because it offers a number of advantages. Foremost among these is that, by considering the effects of flaws, it becomes possible to address the underlying physics directly. For example, we have been able to explain why rocks exhibit large strain-rate effects (a consequence of the finite growth rate of cracks), why a spherical explosive imbedded in oil shale produces a cavity with a nearly square section (opening of bedding cracks) and why propellants may detonate following low-speed impact (a consequence of frictional hot spots).

  6. Statistics of lattice animals

    NASA Astrophysics Data System (ADS)

    Hsu, Hsiao-Ping; Nadler, Walter; Grassberger, Peter

    2005-07-01

    The scaling behavior of randomly branched polymers in a good solvent is studied in two to nine dimensions, modeled by lattice animals on simple hypercubic lattices. For the simulations, we use a biased sequential sampling algorithm with re-sampling, similar to the pruned-enriched Rosenbluth method (PERM) used extensively for linear polymers. We obtain high statistics of animals with up to several thousand sites in all dimensions 2⩽d⩽9. The partition sum (number of different animals) and gyration radii are estimated. In all dimensions we verify the Parisi-Sourlas prediction, and we verify all exactly known critical exponents in dimensions 2, 3, 4, and ⩾8. In addition, we present the hitherto most precise estimates for growth constants in d⩾3. For clusters with one site attached to an attractive surface, we verify the superuniversality of the cross-over exponent at the adsorption transition predicted by Janssen and Lyssy.

  7. Conditional statistical model building

    NASA Astrophysics Data System (ADS)

    Hansen, Mads Fogtmann; Hansen, Michael Sass; Larsen, Rasmus

    2008-03-01

    We present a new statistical deformation model suited for parameterized grids with different resolutions. Our method models the covariances between multiple grid levels explicitly, and allows for very efficient fitting of the model to data on multiple scales. The model is validated on a data set consisting of 62 annotated MR images of the corpus callosum. One fifth of the data set was used as a training set, which was non-rigidly registered to each other without a shape prior. From the non-rigidly registered training set a shape prior was constructed by performing principal component analysis on each grid level and using the results to construct a conditional shape model, conditioning the finer parameters with the coarser grid levels. The remaining shapes were registered with the constructed shape prior. The Dice measures for the registration without a prior and the registration with a prior were 0.875 +/- 0.042 and 0.8615 +/- 0.051, respectively.

  8. Statistical physics "Beyond equilibrium"

    SciTech Connect

    Ecke, Robert E

    2009-01-01

    The scientific challenges of the 21st century will increasingly involve competing interactions, geometric frustration, spatial and temporal intrinsic inhomogeneity, nanoscale structures, and interactions spanning many scales. We will focus on a broad class of emerging problems that will require new tools in non-equilibrium statistical physics and that will find application in new material functionality, in predicting complex spatial dynamics, and in understanding novel states of matter. Our work will encompass materials under extreme conditions involving elastic/plastic deformation, competing interactions, intrinsic inhomogeneity, frustration in condensed matter systems, scaling phenomena in disordered materials from glasses to granular matter, quantum chemistry applied to nano-scale materials, soft-matter materials, and spatio-temporal properties of both ordinary and complex fluids.

  9. Statistical Thermodynamics of Biomembranes

    PubMed Central

    Devireddy, Ram V.

    2010-01-01

    An overview of the major issues involved in the statistical thermodynamic treatment of phospholipid membranes at the atomistic level is summarized: thermodynamic ensembles, initial configuration (or the physical system being modeled), force field representation as well as the representation of long-range interactions. This is followed by a description of the various ways that the simulated ensembles can be analyzed: area of the lipid, mass density profiles, radial distribution functions (RDFs), water orientation profile, deuterium order parameter, free energy profiles and void (pore) formation; with particular focus on the results obtained from our recent molecular dynamics (MD) simulations of phospholipids interacting with dimethylsulfoxide (Me2SO), a commonly used cryoprotective agent (CPA). PMID:19460363

  10. Statistical physics of vaccination

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei

    2016-12-01

    Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination-one of the most important preventive measures of modern times-is of great interest both theoretically and empirically. And in contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.

  11. Newspapers and Crime: What Happens During Strike Periods

    ERIC Educational Resources Information Center

    Payne, David E.

    1974-01-01

    Refutes the thesis that relates various types of mass media, especially the newspaper, to rates of aggression, violence, and deviance by comparing crime statistics during strike and nonstrike periods.

  12. Statistical Literacy: Developing a Youth and Adult Education Statistical Project

    ERIC Educational Resources Information Center

    Conti, Keli Cristina; Lucchesi de Carvalho, Dione

    2014-01-01

    This article focuses on the notion of literacy--general and statistical--in the analysis of data from a fieldwork research project carried out as part of a master's degree that investigated the teaching and learning of statistics in adult education mathematics classes. We describe the statistical context of the project that involved the…

  13. Dissociable behavioural outcomes of visual statistical learning.

    PubMed

    Bays, Brett C; Turk-Browne, Nicholas B; Seitz, Aaron R

    Statistical learning refers to the extraction of probabilistic relationships between stimuli and is increasingly used as a method to understand learning processes. However, numerous cognitive processes are sensitive to the statistical relationships between stimuli and any one measure of learning may conflate these processes; to date little research has focused on differentiating these processes. To understand how multiple processes underlie statistical learning, here we compared, within the same study, operational measures of learning from different tasks that may be differentially sensitive to these processes. In Experiment 1, participants were visually exposed to temporal regularities embedded in a stream of shapes. Their task was to periodically detect whether a shape, whose contrast was staircased to a threshold level, was present or absent. Afterwards, they completed a search task, where statistically predictable shapes were found more quickly. We used the search task to label shape pairs as "learned" or "non-learned", and then used these labels to analyse the detection task. We found a dissociation between learning on the search task and the detection task where only non-learned pairs showed learning effects in the detection task. This finding was replicated in further experiments with recognition memory (Experiment 2) and associative learning tasks (Experiment 3). Taken together, these findings are consistent with the view that statistical learning may comprise a family of processes that can produce dissociable effects on different aspects of behaviour.

  14. Dissociable behavioural outcomes of visual statistical learning

    PubMed Central

    Turk-Browne, Nicholas B.; Seitz, Aaron R.

    2016-01-01

    Statistical learning refers to the extraction of probabilistic relationships between stimuli and is increasingly used as a method to understand learning processes. However, numerous cognitive processes are sensitive to the statistical relationships between stimuli and any one measure of learning may conflate these processes; to date little research has focused on differentiating these processes. To understand how multiple processes underlie statistical learning, here we compared, within the same study, operational measures of learning from different tasks that may be differentially sensitive to these processes. In Experiment 1, participants were visually exposed to temporal regularities embedded in a stream of shapes. Their task was to periodically detect whether a shape, whose contrast was staircased to a threshold level, was present or absent. Afterwards, they completed a search task, where statistically predictable shapes were found more quickly. We used the search task to label shape pairs as “learned” or “non-learned”, and then used these labels to analyse the detection task. We found a dissociation between learning on the search task and the detection task where only non-learned pairs showed learning effects in the detection task. This finding was replicated in further experiments with recognition memory (Experiment 2) and associative learning tasks (Experiment 3). Taken together, these findings are consistent with the view that statistical learning may comprise a family of processes that can produce dissociable effects on different aspects of behaviour. PMID:27478399

  15. On the Statistics of Macrospicules

    NASA Astrophysics Data System (ADS)

    Bennett, S. M.; Erdélyi, R.

    2015-08-01

A new generation of solar telescopes has led to an increase in the spatial, temporal, and spectral resolution of localized features seen on the Sun, enabling a detailed study of macrospicules. Macrospicules are members of a wide variety of solar ejecta and ascertaining where they belong in this family is vitally important, particularly given that they are chromospheric events which penetrate the transition region and lower corona. We examine the overall properties of macrospicules, both temporal and spatial. We also investigate possible relationships between the macrospicule properties and the sample time period itself, which is selected as a proxy for the ramp from solar minimum to solar maximum. Measurements are taken using the Solar Dynamics Observatory to provide the necessary temporal resolution and coverage. At each point in time, the length of the macrospicule is measured from base to tip and the width is recorded at half the length at each step. The measurements were then applied to determine the statistical properties and relationships between them. It is evident that the properties of maximum velocity, maximum length, and lifetime are all related in specific, established terms. We provide appropriate scaling in terms of the physical properties, which would be a useful test bed for modeling. Also, we note that the maximum lengths and lifetimes of the features show some correlation with the sample epoch and, therefore, by proxy the solar minimum to maximum ramp.

  16. Periodic power spectrum with applications in detection of latent periodicities in DNA sequences.

    PubMed

    Yin, Changchuan; Wang, Jiasong

    2016-11-01

Periodic elements play important roles in genomic structures and functions, yet some complex periodic elements in genomes are difficult to detect by conventional methods such as digital signal processing and statistical analysis. We propose a periodic power spectrum (PPS) method for analyzing periodicities of DNA sequences. The PPS method employs periodic nucleotide distributions of DNA sequences and directly calculates power spectra at specific periodicities. The magnitude of a PPS reflects the strength of a signal at periodic positions. In comparison with the Fourier transform, the PPS method avoids spectral leakage and reduces the background noise that appears high in the Fourier power spectrum. Thus, the PPS method can effectively capture hidden periodicities in DNA sequences. Using a sliding window approach, the PPS method can precisely locate periodic regions in DNA sequences. We apply the PPS method for detection of hidden periodicities in different genome elements, including exons, microsatellite DNA sequences, and whole genomes. The results show that the PPS method can minimize the impact of spectral leakage and thus capture true hidden periodicities in genomes. In addition, performance tests indicate that the PPS method is more effective and efficient than a fast Fourier transform. The computational complexity of the PPS algorithm is [Formula: see text]. Therefore, the PPS method may have a broad range of applications in genomic analysis. The MATLAB programs for implementing the PPS method are available from MATLAB Central ( http://www.mathworks.com/matlabcentral/fileexchange/55298 ).
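The core idea described above is to score how unevenly each nucleotide is distributed across the phase positions of a candidate period. A minimal illustrative sketch of that idea (not the authors' MATLAB code; the paper's exact scoring and normalization differ, so treat the formula here as an assumption):

```python
from collections import Counter

def pps(seq, period):
    """Toy periodic power spectrum at one candidate period: count each
    nucleotide's occurrences at every phase position (index mod period)
    and sum the squared deviations from a uniform spread over phases.
    Larger values mean a stronger periodic bias at that period."""
    n = len(seq)
    counts = {b: Counter() for b in "ACGT"}
    for i, b in enumerate(seq):
        if b in counts:
            counts[b][i % period] += 1
    power = 0.0
    for b in "ACGT":
        total = sum(counts[b].values())
        expected = total / period  # uniform expectation per phase position
        power += sum((counts[b][j] - expected) ** 2 for j in range(period))
    return power / n if n else 0.0

signal = "ACG" * 60  # a strict period-3 sequence
print(pps(signal, 3) > pps(signal, 5))  # prints True: strong power at 3, none at 5
```

Because the score is computed directly at one chosen period rather than over a full spectrum, there is no leakage from neighboring frequencies, which is the advantage the abstract claims over the Fourier approach.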

  17. Key Statistics for Thyroid Cancer

    MedlinePlus

How common is thyroid cancer? ... remains very low compared with most other cancers. Statistics on survival rates for thyroid cancer are discussed ...

  18. HPV-Associated Cancers Statistics

    MedlinePlus

  19. Muscular Dystrophy: Data and Statistics

    MedlinePlus

MD STAR net Data and Statistics: The following data and ... For more information on MD STAR net see Research and Tracking ...

  20. Hyperthyroid hypokalemic periodic paralysis

    PubMed Central

    Neki, N.S.

    2016-01-01

Hyperthyroid periodic paralysis (HPP) is a rare, life-threatening complication of hyperthyroidism, commonly occurring in young Asian males but sporadically found in other races. It is characterised by hypokalemia and acute-onset paraparesis, with a prevalence of 1 in 100,000. The symptoms resolve promptly with potassium supplementation. Nonselective beta blockers like propranolol can also be used to ameliorate and prevent subsequent paralytic attacks. We report a case of a 22-year-old male presenting with HPP and a very low serum potassium level. PMID:27648066

  1. International petroleum statistics report

    SciTech Connect

    1996-05-01

The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1984 through 1994.

  2. Topics in statistical mechanics

    SciTech Connect

    Elser, V.

    1984-05-01

    This thesis deals with four independent topics in statistical mechanics: (1) the dimer problem is solved exactly for a hexagonal lattice with general boundary using a known generating function from the theory of partitions. It is shown that the leading term in the entropy depends on the shape of the boundary; (2) continuum models of percolation and self-avoiding walks are introduced with the property that their series expansions are sums over linear graphs with intrinsic combinatorial weights and explicit dimension dependence; (3) a constrained SOS model is used to describe the edge of a simple cubic crystal. Low and high temperature results are derived as well as the detailed behavior near the crystal facet; (4) the microscopic model of the lambda-transition involving atomic permutation cycles is reexamined. In particular, a new derivation of the two-component field theory model of the critical behavior is presented. Results for a lattice model originally proposed by Kikuchi are extended with a high temperature series expansion and Monte Carlo simulation. 30 references.

  3. Statistics of indistinguishable particles.

    PubMed

    Wittig, Curt

    2009-07-02

The wave function of a system containing identical particles takes into account the relationship between a particle's intrinsic spin and its statistical property. Specifically, the exchange of two identical particles having odd-half-integer spin results in the wave function changing sign, whereas the exchange of two identical particles having integer spin is accompanied by no such sign change. This is embodied in a term (-1)^(2s), which has the value +1 for integer s (bosons), and -1 for odd-half-integer s (fermions), where s is the particle spin. All of this is well-known. In the nonrelativistic limit, a detailed consideration of the exchange of two identical particles shows that exchange is accompanied by a 2π reorientation that yields the (-1)^(2s) term. The same bookkeeping is applicable to the relativistic case described by the proper orthochronous Lorentz group, because any proper orthochronous Lorentz transformation can be expressed as the product of spatial rotations and a boost along the direction of motion.
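The (-1)^(2s) bookkeeping is easy to check numerically. A tiny sketch (the function name and the Fraction-based handling of half-integer spins are illustrative choices, not from the paper):

```python
from fractions import Fraction

def exchange_phase(s):
    """Sign picked up by the wave function when two identical spin-s
    particles are exchanged: (-1)**(2s). Integer spin (bosons) gives +1,
    odd-half-integer spin (fermions) gives -1."""
    two_s = 2 * Fraction(s)
    if two_s.denominator != 1:
        raise ValueError("spin must be an integer or half-integer")
    return (-1) ** int(two_s)

print(exchange_phase(0), exchange_phase(Fraction(1, 2)),
      exchange_phase(1), exchange_phase(Fraction(3, 2)))  # prints: 1 -1 1 -1
```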

  4. International petroleum statistics report

    SciTech Connect

    1996-10-01

The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.

  5. International petroleum statistics report

    SciTech Connect

    1995-11-01

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.

  6. International petroleum statistics report

    SciTech Connect

    1995-07-27

The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.

  7. Statistical Mechanics of Zooplankton.

    PubMed

    Hinow, Peter; Nihongi, Ai; Strickler, J Rudi

    2015-01-01

    Statistical mechanics provides the link between microscopic properties of many-particle systems and macroscopic properties such as pressure and temperature. Observations of similar "microscopic" quantities exist for the motion of zooplankton, as well as many species of other social animals. Herein, we propose to take average squared velocities as the definition of the "ecological temperature" of a population under different conditions on nutrients, light, oxygen and others. We test the usefulness of this definition on observations of the crustacean zooplankton Daphnia pulicaria. In one set of experiments, D. pulicaria is infested with the pathogen Vibrio cholerae, the causative agent of cholera. We find that infested D. pulicaria under light exposure have a significantly greater ecological temperature, which puts them at a greater risk of detection by visual predators. In a second set of experiments, we observe D. pulicaria in cold and warm water, and in darkness and under light exposure. Overall, our ecological temperature is a good discriminator of the crustacean's swimming behavior.

  8. Statistical Mechanics of Zooplankton

    PubMed Central

    Hinow, Peter; Nihongi, Ai; Strickler, J. Rudi

    2015-01-01

    Statistical mechanics provides the link between microscopic properties of many-particle systems and macroscopic properties such as pressure and temperature. Observations of similar “microscopic” quantities exist for the motion of zooplankton, as well as many species of other social animals. Herein, we propose to take average squared velocities as the definition of the “ecological temperature” of a population under different conditions on nutrients, light, oxygen and others. We test the usefulness of this definition on observations of the crustacean zooplankton Daphnia pulicaria. In one set of experiments, D. pulicaria is infested with the pathogen Vibrio cholerae, the causative agent of cholera. We find that infested D. pulicaria under light exposure have a significantly greater ecological temperature, which puts them at a greater risk of detection by visual predators. In a second set of experiments, we observe D. pulicaria in cold and warm water, and in darkness and under light exposure. Overall, our ecological temperature is a good discriminator of the crustacean’s swimming behavior. PMID:26270537
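The proposed "ecological temperature" is the average squared velocity of the tracked animals. A hedged sketch of how one might compute it from position tracks (finite-difference velocity estimation is an assumption of this sketch; the study's actual processing of the Daphnia recordings may differ):

```python
def ecological_temperature(track, dt=1.0):
    """Average squared speed of a 2-D trajectory, used here as the
    'ecological temperature' defined in the abstract. `track` is a list
    of (x, y) positions sampled every `dt` seconds; velocities are
    estimated by finite differences (an assumption of this sketch)."""
    speeds_sq = []
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
        speeds_sq.append(vx * vx + vy * vy)
    return sum(speeds_sq) / len(speeds_sq)

# A faster, more erratic track has a higher "temperature" than a slow, straight one.
slow = [(0.1 * i, 0.0) for i in range(50)]
fast = [(0.5 * i, 0.3 * (-1) ** i) for i in range(50)]
print(ecological_temperature(fast) > ecological_temperature(slow))  # prints True
```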

  9. International petroleum statistics report

    SciTech Connect

    1997-07-01

    The International Petroleum Statistics Report is a monthly publication that provides current international data. The report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent 12 months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.

  10. Information in statistical physics

    NASA Astrophysics Data System (ADS)

    Balian, Roger

We review with a tutorial scope the information theory foundations of quantum statistical physics. Only a small proportion of the variables that characterize a system at the microscopic scale can be controlled, for both practical and theoretical reasons, and a probabilistic description involving the observers is required. The criterion of maximum von Neumann entropy is then used for making reasonable inferences. It means that no spurious information is introduced besides the known data. Its outcomes can be given a direct justification based on the principle of indifference of Laplace. We introduce the concept of relevant entropy associated with some set of relevant variables; it characterizes the information that is missing at the microscopic level when only these variables are known. For equilibrium problems, the relevant variables are the conserved ones, and the Second Law is recovered as a second step of the inference process. For non-equilibrium problems, the increase of the relevant entropy expresses an irretrievable loss of information from the relevant variables towards the irrelevant ones. Two examples illustrate the flexibility of the choice of relevant variables and the multiplicity of the associated entropies: the thermodynamic entropy (satisfying the Clausius-Duhem inequality) and the Boltzmann entropy (satisfying the H-theorem). The identification of entropy with missing information is also supported by the paradox of Maxwell's demon. Spin-echo experiments show that irreversibility itself is not an absolute concept: use of hidden information may overcome the arrow of time.

  11. Statistical Mechanics of Money

    NASA Astrophysics Data System (ADS)

    Dragulescu, Adrian; Yakovenko, Victor

    2000-03-01

We study a network of agents exchanging money between themselves. We find that the stationary probability distribution of money M is the Gibbs distribution exp(-M/T), where T is an effective "temperature" equal to the average amount of money per agent. This is in agreement with the general laws of statistical mechanics, because money is conserved during each transaction and the number of agents is held constant. We have verified the emergence of the Gibbs distribution in computer simulations of various trading rules and models. When the time-reversal symmetry of the trading rules is explicitly broken, deviations from the Gibbs distribution may occur, as follows from the Boltzmann-equation approach to the problem. The money distribution characterizes the purchasing power of a system. A seller would maximize his/her income by setting the price of a product equal to the temperature T of the system. Buying products from a system of temperature T_1 and selling them to a system of temperature T_2 would generate a profit T_2 - T_1 > 0, as in a thermal machine.
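The conserved-money mechanism can be reproduced with a few lines of simulation. A sketch assuming one common trading rule (a random pair pools its money and splits it at a random point; the abstract mentions several rules, so this is an illustrative choice, not necessarily the authors' exact one):

```python
import random

def simulate_money(agents=500, steps=200_000, m0=10.0, seed=1):
    """Toy random-exchange economy: each step, a random pair of agents
    pools its money and splits the pool at a uniformly random point.
    Total money is conserved and the number of agents is fixed."""
    random.seed(seed)
    money = [m0] * agents
    for _ in range(steps):
        i, j = random.randrange(agents), random.randrange(agents)
        if i == j:
            continue
        pool = money[i] + money[j]
        eps = random.random()
        money[i], money[j] = eps * pool, (1 - eps) * pool
    return money

balances = simulate_money()
T = sum(balances) / len(balances)  # effective "temperature": average money per agent
# For the stationary Gibbs distribution exp(-M/T), P(M < T) = 1 - 1/e (about 0.63)
frac_below = sum(m < T for m in balances) / len(balances)
print(f"T = {T:.2f}, fraction of agents below T = {frac_below:.2f}")
```

With the symmetric rule above, the fraction of agents holding less than the average converges toward 1 - 1/e, the value implied by an exponential (Gibbs) distribution.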

  12. Statistical mechanics of nucleosomes

    NASA Astrophysics Data System (ADS)

    Chereji, Razvan V.

Eukaryotic cells contain long DNA molecules (about two meters for a human cell) which are tightly packed inside the micrometric nuclei. Nucleosomes are the basic packaging unit of the DNA which allows this millionfold compactification. A longstanding puzzle is to understand the principles which allow cells to both organize their genomes into chromatin fibers in the crowded space of their nuclei, and also to keep the DNA accessible to many factors and enzymes. With the nucleosomes covering about three quarters of the DNA, their positions are essential because these influence which genes can be regulated by the transcription factors and which cannot. We study physical models which predict the genome-wide organization of the nucleosomes and also the relevant energies which dictate this organization. In the last five years, the study of chromatin has seen many important advances. In particular, in the field of nucleosome positioning, new techniques for identifying nucleosomes and the competing DNA-binding factors have appeared, such as chemical mapping with hydroxyl radicals and ChIP-exo, among others; the resolution of nucleosome maps has increased through the use of paired-end sequencing; and the price of sequencing an entire genome has decreased. We present a rigorous statistical mechanics model which is able to explain the recent experimental results by taking into account nucleosome unwrapping, competition between different DNA-binding proteins, and both the interaction between histones and DNA, and between neighboring histones. We show a series of predictions of our new model, all in agreement with the experimental observations.

  13. Statistics: It's in the Numbers!

    ERIC Educational Resources Information Center

    Deal, Mary M.; Deal, Walter F., III

    2007-01-01

Mathematics and statistics play important roles in people's lives today. A day hardly passes that they are not bombarded with many different kinds of statistics. As consumers they see statistical information as they surf the web, watch television, listen to their satellite radios, or even read the nutrition facts panel on a cereal box in the…

  14. Digest of Education Statistics, 1980.

    ERIC Educational Resources Information Center

    Grant, W. Vance; Eiden, Leo J.

    The primary purpose of this publication is to provide an abstract of statistical information covering the broad field of American education from prekindergarten through graduate school. Statistical information is presented in 14 figures and 200 tables with brief trend analyses. In addition to updating many of the statistics that have appeared in…

  15. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J. )

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.

  16. Invention Activities Support Statistical Reasoning

    ERIC Educational Resources Information Center

    Smith, Carmen Petrick; Kenlan, Kris

    2016-01-01

    Students' experiences with statistics and data analysis in middle school are often limited to little more than making and interpreting graphs. Although students may develop fluency in statistical procedures and vocabulary, they frequently lack the skills necessary to apply statistical reasoning in situations other than clear-cut textbook examples.…

  17. Teaching Statistics Online Using "Excel"

    ERIC Educational Resources Information Center

    Jerome, Lawrence

    2011-01-01

    As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…

  18. Periodic sequence patterns in human exons

    SciTech Connect

    Baldi, P.; Brunak, S.; Engelbrecht, J.; Chauvin, Y.; Krogh, A.

    1995-12-31

    We analyze the sequential structure of human exons and their flanking introns by hidden Markov models. Together, models of donor site regions, acceptor site regions and flanked internal exons, show that exons -- besides the reading frame -- hold a specific periodic pattern. The pattern, which has the consensus: non-T(A/T)G and a minimal periodicity of roughly 10 nucleotides, is not a consequence of the nucleotide statistics in the three codon positions, nor of the well known nucleosome positioning signal. We discuss the relation between the pattern and other known sequence elements responsible for the intrinsic bending or curvature of DNA.

  19. A Modern Periodic Table.

    ERIC Educational Resources Information Center

    Herrenden-Harker, B. D.

    1997-01-01

    Presents a modern Periodic Table based on the electron distribution in the outermost shell and the order of filling of the sublevels within the shells. Enables a student to read off directly the electronic configuration of the element and the order in which filling occurs. (JRH)

  20. Periodic Table of Students.

    ERIC Educational Resources Information Center

    Johnson, Mike

    1998-01-01

    Presents an exercise in which an eighth-grade science teacher decorated the classroom with a periodic table of students. Student photographs were arranged according to similarities into vertical columns. Students were each assigned an atomic number according to their placement in the table. The table is then used to teach students about…

  1. Periodically poled silicon

    NASA Astrophysics Data System (ADS)

    Hon, Nick K.; Tsia, Kevin K.; Solli, Daniel R.; Khurgin, Jacob B.; Jalali, Bahram

    2010-02-01

Bulk centrosymmetric silicon lacks second-order optical nonlinearity χ(2) - a foundational component of nonlinear optics. Here, we propose a new class of photonic device which enables χ(2) as well as quasi-phase matching based on periodic stress fields in silicon - periodically-poled silicon (PePSi). This concept adds the periodic poling capability to silicon photonics, and allows the excellent crystal quality and advanced manufacturing capabilities of silicon to be harnessed for devices based on χ(2) effects. The concept can also be simply achieved by a periodic arrangement of stressed thin films along a silicon waveguide. As an example of the utility, we present simulations showing that mid-wave infrared radiation can be efficiently generated through difference frequency generation from near-infrared with a conversion efficiency of 50%, based on measurements of χ(2) values for strained silicon reported in the literature [Jacobson et al. Nature 441, 199 (2006)]. The use of PePSi for frequency conversion can also be extended to terahertz generation. With integrated piezoelectric material, dynamic control of the χ(2) nonlinearity in PePSi waveguides may also be achieved. The successful realization of PePSi-based devices depends on the strength of the stress-induced χ(2) in silicon. Presently, there exists a significant discrepancy in the literature between the theoretical and experimentally measured values. We present a simple theoretical model that produces results consistent with prior theoretical works and use this model to identify possible reasons for this discrepancy.

  2. Oscillations following periodic reinforcement.

    PubMed

    Monteiro, Tiago; Machado, Armando

    2009-06-01

Three experiments examined behavior in extinction following periodic reinforcement. During the first phase of Experiment 1, four groups of pigeons were exposed to fixed interval (FI 16s or FI 48s) or variable interval (VI 16s or VI 48s) reinforcement schedules. Next, during the second phase, each session started with reinforcement trials and ended with an extinction segment. Experiment 2 was similar except that the extinction segment was considerably longer. Experiment 3 replaced the FI schedules with a peak procedure, with FI trials interspersed with non-food peak interval (PI) trials that were four times longer. One group of pigeons was exposed to FI 20s PI 80s trials, and another to FI 40s PI 160s trials. Results showed that, during the extinction segment, most pigeons trained with FI schedules, but not with VI schedules, displayed pause-peck oscillations with a period close to, but slightly greater than, the FI parameter. These oscillations did not start immediately after the onset of extinction. Comparing the oscillations from Experiments 1 and 2 suggested that the alternation of reconditioning and re-extinction increases the reliability and earlier onset of the oscillations. In Experiment 3 the pigeons exhibited well-defined pause-peck cycles from the onset of extinction. These cycles had periods close to twice the value of the FI and lasted for long intervals of time. We discuss some hypotheses concerning the processes underlying behavioral oscillations following periodic reinforcement.

  3. Astrophysical implications of periodicity

    NASA Technical Reports Server (NTRS)

    Muller, Richard A.

    1988-01-01

Two remarkable discoveries of the last decade have profound implications for astrophysics and for geophysics. These are the discovery by Alvarez et al., that certain mass extinctions are caused by the impact on the earth of a large asteroid or comet, and the discovery by Raup and Sepkoski that such extinctions are periodic, with a cycle time of 26 to 30 million years. The validity of both of these discoveries is assumed and the implications are examined. Most of the phenomena described depend not on periodicity, but just on the weaker assumption that the impacts on the earth take place primarily in showers. Proposed explanations for the periodicity include galactic oscillations, the Planet X model, and the possibility of Nemesis, a solar companion star. These hypotheses are critically examined. Results of the search for the solar companion are reported. The Deccan flood basalts of India have been proposed as the impact site for the Cretaceous impact, but this hypothesis is in contradiction with the conclusion of Courtillot et al., that the magma flow began during a period of normal magnetic field. A possible resolution of this contradiction is proposed.

  4. Periodically structured plasmonic waveguides

    NASA Astrophysics Data System (ADS)

    Saj, W. M.; Foteinopoulou, S.; Kafesaki, M.; Soukoulis, C. M.; Economou, E. N.

    2008-04-01

We study surface plasmon polariton (SPP) guiding structures, which are a modification of the Metal-Insulator-Metal (MIM) waveguide. The designs are constructed by introducing a periodic modulation in a MIM waveguide, with a glass core and silver claddings. This periodic modulation is created either by causing periodic indentations in the silver slabs encompassing the glass core, or by increasing the glass spacer material in certain periodic locations. Our objective is to achieve long-range sub-wavelength waveguiding with vast dispersion engineering capabilities. We employ the Finite Difference Time Domain method (FDTD) with the Auxiliary Differential Equation method (ADE) for the calculation of the dispersion relation of the guided modes, as well as for real-time propagation studies. These suggest that the guiding mechanism in the examined structures is based on the electromagnetic (EM) coupling between the slit plasmon modes. These - depending on the design - exist in the grooves between the silver plates or in the larger areas of the glass core spacer. Put differently, the guiding mechanism in the examined SPP waveguide designs is analogous to the EM energy transfer along metallic nanoparticle chains.

  5. Getting Your Period

    MedlinePlus

... for a woman to have a baby. During sexual intercourse, the egg can get fertilized by a male’s sperm and then attach to the lining of the uterus ( endometrium ) and grow into a baby. ( Read more about reproduction. ) Does your period come each month? ... Menstrual ...

  6. Breast cancer statistics, 2011.

    PubMed

    DeSantis, Carol; Siegel, Rebecca; Bandi, Priti; Jemal, Ahmedin

    2011-01-01

    In this article, the American Cancer Society provides an overview of female breast cancer statistics in the United States, including trends in incidence, mortality, survival, and screening. Approximately 230,480 new cases of invasive breast cancer and 39,520 breast cancer deaths are expected to occur among US women in 2011. Breast cancer incidence rates were stable among all racial/ethnic groups from 2004 to 2008. Breast cancer death rates have been declining since the early 1990s for all women except American Indians/Alaska Natives, among whom rates have remained stable. Disparities in breast cancer death rates are evident by state, socioeconomic status, and race/ethnicity. While significant declines in mortality rates were observed for 36 states and the District of Columbia over the past 10 years, rates for 14 states remained level. Analyses by county-level poverty rates showed that the decrease in mortality rates began later and was slower among women residing in poor areas. As a result, the highest breast cancer death rates shifted from the affluent areas to the poor areas in the early 1990s. Screening rates continue to be lower in poor women compared with non-poor women, despite much progress in increasing mammography utilization. In 2008, 51.4% of poor women had undergone a screening mammogram in the past 2 years compared with 72.8% of non-poor women. Encouraging patients aged 40 years and older to have annual mammography and a clinical breast examination is the single most important step that clinicians can take to reduce suffering and death from breast cancer. Clinicians should also ensure that patients at high risk of breast cancer are identified and offered appropriate screening and follow-up. Continued progress in the control of breast cancer will require sustained and increased efforts to provide high-quality screening, diagnosis, and treatment to all segments of the population.

  7. Periods of High Intensity Solar Proton Flux

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.; Stauffer, Craig A.; Jordan, Thomas M.; Adams, James H.; Dietrich, William F.

    2012-01-01

    Analysis is presented of the times during a space mission when specified solar proton flux levels are exceeded. This includes both the total time and the durations of continuous periods during missions. Results for the solar maximum and solar minimum phases of the solar cycle are presented and compared for a broad range of proton energies and shielding levels. This type of approach is more amenable to reliability analysis for spacecraft systems and instrumentation than standard statistical models.

  8. Rotorcraft Smoothing Via Linear Time Periodic Methods

    DTIC Science & Technology

    2007-07-01

    Fragmentary excerpts from the report's table of contents and body: Optimal Control Methodology for Rotor Vibration Smoothing; Mathematical Foundations of Linear Time Periodic Systems; The Maximum Likelihood Estimator; The Cramer-Rao Inequality; Statistical ... adjustments for vibration reduction. 1980s to late 1990s: rotor vibration reduction methods during the 1980s began to adopt a mathematical

  9. Ideal statistically quasi Cauchy sequences

    NASA Astrophysics Data System (ADS)

    Savas, Ekrem; Cakalli, Huseyin

    2016-08-01

    An ideal I is a family of subsets of N (the set of positive integers) that is closed under taking finite unions and under taking subsets of its elements. A sequence (xk) of real numbers is said to be S(I)-statistically convergent to a real number L if, for each ɛ > 0 and for each δ > 0, the set { n ∈ N : 1/n | { k ≤ n : | xk - L | ≥ ɛ } | ≥ δ } belongs to I. We introduce S(I)-statistical ward compactness of a subset of R, the set of real numbers, and S(I)-statistical ward continuity of a real function, in the following senses: a subset E of R is S(I)-statistically ward compact if any sequence of points in E has an S(I)-statistically quasi-Cauchy subsequence, and a real function is S(I)-statistically ward continuous if it preserves S(I)-statistically quasi-Cauchy sequences, where a sequence (xk) is said to be S(I)-statistically quasi-Cauchy when (Δxk) is S(I)-statistically convergent to 0. We obtain results related to S(I)-statistical ward continuity, S(I)-statistical ward compactness, Nθ-ward continuity, and slowly oscillating continuity.
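As an illustrative aside (code and names are mine, not the authors'): when I is the ideal of finite sets, S(I)-statistical convergence reduces to ordinary statistical convergence, whose density condition can be checked numerically:

```python
import math

def bad_index_density(x, limit, eps, n):
    """Fraction of indices k <= n with |x_k - limit| >= eps."""
    bad = sum(1 for k in range(n) if abs(x[k] - limit) >= eps)
    return bad / n

# Example: x_k = 1 when k+1 is a perfect square, else 0.  The squares have
# natural density 0, so (x_k) is statistically convergent to 0 even though
# it does not converge in the ordinary sense.
n = 100_000
x = [1.0 if math.isqrt(k + 1) ** 2 == k + 1 else 0.0 for k in range(n)]
density = bad_index_density(x, limit=0.0, eps=0.5, n=n)  # floor(sqrt(n))/n
```

For statistical convergence this fraction must tend to 0 as n grows; here it is 316/100,000.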

  10. Basic statistics in cell biology.

    PubMed

    Vaux, David L

    2014-01-01

    The physicist Ernest Rutherford said, "If your experiment needs statistics, you ought to have done a better experiment." Although this aphorism remains true for much of today's research in cell biology, a basic understanding of statistics can be useful to cell biologists to help in monitoring the conduct of their experiments, in interpreting the results, in presenting them in publications, and when critically evaluating research by others. However, training in statistics is often focused on the sophisticated needs of clinical researchers, psychologists, and epidemiologists, whose conclusions depend wholly on statistics, rather than the practical needs of cell biologists, whose experiments often provide evidence that is not statistical in nature. This review describes some of the basic statistical principles that may be of use to experimental biologists, but it does not cover the sophisticated statistics needed for papers that contain evidence of no other kind.

  11. Cancer Statistics, 2017.

    PubMed

    Siegel, Rebecca L; Miller, Kimberly D; Jemal, Ahmedin

    2017-01-01

    Each year, the American Cancer Society estimates the numbers of new cancer cases and deaths that will occur in the United States in the current year and compiles the most recent data on cancer incidence, mortality, and survival. Incidence data were collected by the Surveillance, Epidemiology, and End Results Program; the National Program of Cancer Registries; and the North American Association of Central Cancer Registries. Mortality data were collected by the National Center for Health Statistics. In 2017, 1,688,780 new cancer cases and 600,920 cancer deaths are projected to occur in the United States. For all sites combined, the cancer incidence rate is 20% higher in men than in women, while the cancer death rate is 40% higher. However, sex disparities vary by cancer type. For example, thyroid cancer incidence rates are 3-fold higher in women than in men (21 vs 7 per 100,000 population), despite equivalent death rates (0.5 per 100,000 population), largely reflecting sex differences in the "epidemic of diagnosis." Over the past decade of available data, the overall cancer incidence rate (2004-2013) was stable in women and declined by approximately 2% annually in men, while the cancer death rate (2005-2014) declined by about 1.5% annually in both men and women. From 1991 to 2014, the overall cancer death rate dropped 25%, translating to approximately 2,143,200 fewer cancer deaths than would have been expected if death rates had remained at their peak. Although the cancer death rate was 15% higher in blacks than in whites in 2014, increasing access to care as a result of the Patient Protection and Affordable Care Act may expedite the narrowing racial gap; from 2010 to 2015, the proportion of blacks who were uninsured halved, from 21% to 11%, as it did for Hispanics (31% to 16%). Gains in coverage for traditionally underserved Americans will facilitate the broader application of existing cancer control knowledge across every segment of the population. 
CA Cancer J Clin

  12. Cells anticipate periodic events

    NASA Astrophysics Data System (ADS)

    Nakagaki, Toshiyuki

    2009-03-01

    We show that an amoeboid organism can anticipate the timing of periodic events. The plasmodium of the true slime mold Physarum polycephalum moves rapidly under favourable conditions, but stops moving when transferred to less-favourable conditions. Plasmodia exposed to unfavourable conditions, presented in three consecutive pulses at constant intervals, reduced their locomotive speed in response to each episode. When subsequently subjected to favourable conditions, the plasmodia spontaneously reduced their locomotive speed at the time point when the next unfavourable episode would have occurred. This implied anticipation of impending environmental change. After this behaviour had been evoked several times, the locomotion of the plasmodia returned to normal; however, the anticipatory response could subsequently be induced by a single unfavourable pulse, implying recall of the memorized periodicity. We explored the mechanisms underlying these behaviours from a dynamical systems perspective. Our results hint at the cellular origins of primitive intelligence and imply that simple dynamics might be sufficient to explain its emergence.

  13. Periodic minimal surfaces

    NASA Astrophysics Data System (ADS)

    Mackay, Alan L.

    1985-04-01

    A minimal surface is one for which, like a soap film with the same pressure on each side, the mean curvature is zero and, thus, is one where the two principal curvatures are equal and opposite at every point. For every closed circuit in the surface, the area is a minimum. Schwarz [1] and Neovius [2] showed that elements of such surfaces could be put together to give surfaces periodic in three dimensions. These periodic minimal surfaces are geometrical invariants, as are the regular polyhedra, but the former are curved. Minimal surfaces are appropriate for the description of various structures where internal surfaces are prominent and seek to adopt a minimum area or a zero mean curvature subject to their topology; thus they merit more complete numerical characterization. There seem to be at least 18 such surfaces [3], with various symmetries and topologies, related to the crystallographic space groups. Recently, glyceryl mono-oleate (GMO) was shown by Longley and McIntosh [4] to take the shape of the F-surface. The structure postulated is shown here to be in good agreement with an analysis of the fundamental geometry of periodic minimal surfaces.
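In standard notation (my addition, consistent with the description above): a minimal surface is one whose mean curvature H vanishes at every point, equivalently one whose principal curvatures are equal and opposite:

```latex
H \;=\; \tfrac{1}{2}\left(\kappa_1 + \kappa_2\right) \;=\; 0
\qquad\Longleftrightarrow\qquad
\kappa_1 \;=\; -\kappa_2 .
```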

  14. Chemists, Access, Statistics

    NASA Astrophysics Data System (ADS)

    Holmes, Jon L.

    2000-06-01

    IP-number access. Current subscriptions can be upgraded to IP-number access at little additional cost. We are pleased to be able to offer to institutions and libraries this convenient mode of access to subscriber only resources at JCE Online. JCE Online Usage Statistics We are continually amazed by the activity at JCE Online. So far, the year 2000 has shown a marked increase. Given the phenomenal overall growth of the Internet, perhaps our surprise is not warranted. However, during the months of January and February 2000, over 38,000 visitors requested over 275,000 pages. This is a monthly increase of over 33% from the October-December 1999 levels. It is good to know that people are visiting, but we would very much like to know what you would most like to see at JCE Online. Please send your suggestions to JCEOnline@chem.wisc.edu. For those who are interested, JCE Online year-to-date statistics are available. Biographical Snapshots of Famous Chemists: Mission Statement Feature Editor: Barbara Burke Chemistry Department, California State Polytechnic University-Pomona, Pomona, CA 91768 phone: 909/869-3664 fax: 909/869-4616 email: baburke@csupomona.edu The primary goal of this JCE Internet column is to provide information about chemists who have made important contributions to chemistry. For each chemist, there is a short biographical "snapshot" that provides basic information about the person's chemical work, gender, ethnicity, and cultural background. Each snapshot includes links to related websites and to a biobibliographic database. The database provides references for the individual and can be searched through key words listed at the end of each snapshot. All students, not just science majors, need to understand science as it really is: an exciting, challenging, human, and creative way of learning about our natural world. Investigating the life experiences of chemists can provide a means for students to gain a more realistic view of chemistry. 
In addition students

  15. New statistical downscaling for Canada

    NASA Astrophysics Data System (ADS)

    Murdock, T. Q.; Cannon, A. J.; Sobie, S.

    2013-12-01

    This poster will document the production of a set of statistically downscaled future climate projections for Canada based on the latest available RCM and GCM simulations - the North American Regional Climate Change Assessment Program (NARCCAP; Mearns et al. 2007) and the Coupled Model Intercomparison Project Phase 5 (CMIP5). The main stages of the project included (1) downscaling method evaluation, (2) scenarios selection, (3) production of statistically downscaled results, and (4) applications of results. We build upon a previous downscaling evaluation project (Bürger et al. 2012, Bürger et al. 2013) in which a quantile-based method (Bias-Correction/Spatial Disaggregation, BCSD; Werner 2011) provided high skill compared with four other methods representing the majority of types of downscaling used in Canada. Additional quantile-based methods (Bias-Correction/Constructed Analogues; Maurer et al. 2010; and Bias-Correction/Climate Imprint; Hunter and Meentemeyer 2005) were evaluated. A subset of 12 CMIP5 simulations was chosen based on an objective set of selection criteria. This included hemispheric skill assessment based on the CLIMDEX indices (Sillmann et al. 2013), historical criteria used previously at the Pacific Climate Impacts Consortium (Werner 2011), and refinement based on a modified clustering algorithm (Houle et al. 2012; Katsavounidis et al. 1994). Statistical downscaling was carried out on the NARCCAP ensemble and a subset of the CMIP5 ensemble. We produced downscaled scenarios over Canada at a daily time resolution and 300 arc second (~10 km) spatial resolution from historical runs for 1951-2005 and from RCP 2.6, 4.5, and 8.5 projections for 2006-2100. The ANUSPLIN gridded daily dataset (McKenney et al. 2011) was used as a target. It has national coverage, spans the historical period of interest 1951-2005, and has daily time resolution. It uses interpolation of station data based on thin-plate splines. This type of method has been shown to have
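A minimal sketch of the quantile-mapping idea shared by the BCSD-family methods named above (illustrative only; function and variable names are my assumptions, and real BCSD additionally involves spatial disaggregation):

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_value):
    """Bias-correct model_value by empirical quantile matching:
    find its quantile in the model's historical distribution, then
    return the observed value at that same quantile."""
    q = float((np.sort(model_hist) < model_value).mean())
    return float(np.quantile(obs_hist, q))

# Synthetic example: the model runs 5 units too high everywhere, so
# quantile mapping should remove essentially all of that bias.
obs = np.arange(100.0)        # observed historical values
model = obs + 5.0             # uniformly biased model historical values
corrected = quantile_map(model, obs, model_value=55.0)  # near 50, not 55
```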

  16. "Just Another Statistic"

    PubMed

    Machtay; Glatstein

    1998-01-01

    On returning from a medical meeting, we learned that sadly a patient, "Mr. B.," had passed away. His death was a completely unexpected surprise. He had been doing well nine months after a course of intensive radiotherapy for a locally advanced head and neck cancer; in his most recent follow-up notes, he was described as a "complete remission." Nonetheless, he apparently died peacefully in his sleep from a cardiac arrest one night and was found the next day by a concerned neighbor. In our absence, after Mr. B. expired, his death certificate was filled out by a physician who didn't know him in detail, but did know why he recently was treated in our department. The cause of death was listed as head and neck cancer. It wasn't long after his death before we began to receive those notorious "requests for additional information," letters from the statistical office of a well-known cooperative group. Mr. B., as it turns out, was on a clinical trial, and it was "vital" to know further details of the circumstances of his passing. Perhaps this very large cancer had been controlled and Mr. B. succumbed to old age (helped along by the tobacco industry). On the other hand, maybe the residual "fibrosis" in his neck was actually packed with active tumor and his left carotid artery was finally 100% pinched off, or maybe he suffered a massive pulmonary embolism from cancer-related hypercoagulability. The forms and requests were completed with a succinct "cause of death uncertain," adding, "please have the Study Chairs call to discuss this difficult case." Often clinical reports of outcomes utilize and emphasize the endpoint "disease specific survival" (DSS). Like overall survival (OS), the DSS can be calculated by actuarial methods, with patients who have incomplete follow-up "censored" at the time of last follow-up pending further information. In the DSS, however, deaths unrelated to the index cancer of interest are censored at the time of death; thus, a death from intercurrent
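The endpoint distinction described above — censoring unrelated deaths for DSS versus counting every death for OS — can be sketched with a minimal Kaplan-Meier estimator (my illustration, assuming distinct event times):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimates after each follow-up time.
    times: sorted follow-up times; events: 1 = death counted, 0 = censored."""
    s, at_risk, surv = 1.0, len(times), []
    for _t, e in zip(times, events):
        if e:
            s *= 1.0 - 1.0 / at_risk  # one death among those still at risk
        at_risk -= 1
        surv.append(s)
    return surv

# One cohort, two endpoints: the patient followed 8 months died of an
# unrelated cause, so that death is an event for OS but censored for DSS.
times = [2, 5, 8, 11]
os_curve = kaplan_meier(times, events=[1, 1, 1, 1])   # ~[0.75, 0.50, 0.25, 0.00]
dss_curve = kaplan_meier(times, events=[1, 1, 0, 1])  # ~[0.75, 0.50, 0.50, 0.00]
```

At 8 months the DSS estimate stays at 0.50 while the OS estimate drops to 0.25, which is exactly why the cause-of-death attribution the essay describes matters.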

  17. Statistical Analysis of Iberian Peninsula Megaliths Orientations

    NASA Astrophysics Data System (ADS)

    González-García, A. C.

    2009-08-01

    Megalithic monuments have been intensively surveyed and studied from the archaeoastronomical point of view in the past decades. We have orientation measurements for over one thousand megalithic burial monuments in the Iberian Peninsula, from several different periods. A sound understanding of these data, however, is still lacking. A way to classify and begin to understand such orientations is by means of statistical analysis of the data. A first attempt is made with simple statistical variables and a direct comparison between the different areas. To minimise subjectivity in the process, a further, more sophisticated analysis is performed. Some interesting results linking orientation and geographical location will be presented. Finally, I will present some models comparing the orientation of the megaliths in the Iberian Peninsula with the rising of the sun and the moon at several times of the year.
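One hedged example of the kind of simple statistical variables suited to orientation data (my illustration, not the author's code): azimuths are circular quantities, so the basic summaries are the mean direction and the mean resultant length R rather than the arithmetic mean:

```python
import math

def circular_summary(azimuths_deg):
    """Mean direction (degrees) and mean resultant length R.
    R near 1 means tightly clustered orientations; near 0, dispersed."""
    c = sum(math.cos(math.radians(a)) for a in azimuths_deg)
    s = sum(math.sin(math.radians(a)) for a in azimuths_deg)
    n = len(azimuths_deg)
    mean_dir = math.degrees(math.atan2(s, c)) % 360.0
    r_bar = math.hypot(c, s) / n
    return mean_dir, r_bar

# Hypothetical monuments facing roughly east (azimuth 90 degrees):
mean_dir, r_bar = circular_summary([85, 88, 90, 92, 95])
```

The same summaries computed per region would support the area-by-area comparison the abstract describes.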

  18. Methods in probability and statistical inference. Final report, June 15, 1975-June 30, 1979. [Dept. of Statistics, Univ. of Chicago

    SciTech Connect

    Wallace, D L; Perlman, M D

    1980-06-01

    This report describes the research activities of the Department of Statistics, University of Chicago, during the period June 15, 1975 to June 30, 1979. Nine research projects are briefly described on the following subjects: statistical computing and approximation techniques in statistics; numerical computation of first passage distributions; probabilities of large deviations; combining independent tests of significance; small-sample efficiencies of tests and estimates; improved procedures for simultaneous estimation and testing of many correlations; statistical computing and improved regression methods; comparison of several populations; and unbiasedness in multivariate statistics. A description is given of the statistical consultation activities of the Department that are of interest to DOE, in particular the scientific interactions between the Department and the scientists at Argonne National Laboratory. A list of publications issued during the term of the contract is included.

  19. Statistics without Tears: Complex Statistics with Simple Arithmetic

    ERIC Educational Resources Information Center

    Smith, Brian

    2011-01-01

    One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…

  20. The postanesthetic period. Complications.

    PubMed

    Malamed, S F

    1987-01-01

    Postanesthetic complications can occur even in the best of circumstances. Proper preparation of the staff, aggressive monitoring of the recovering patient, and early recognition and management of the complications are essential if the outcome is to be successful. In reviewing postanesthetic complications, two factors are present in the overwhelming majority of situations--hypoxia and hypercarbia--often the direct result of inadequate monitoring during the postanesthetic period. The anesthetic procedure is not over once the anesthetic agents are discontinued. The skillful anesthetist is aware of the possibilities of postoperative complications and prevents problems by employing enhanced monitoring techniques during the recovery phase.

  1. Controls on geyser periodicity.

    PubMed

    Ingebritsen, S E; Rojstaczer, S A

    1993-11-05

    Geyser eruption frequency is not constant over time and has been shown to vary with small (≤10^-6) strains induced by seismic events, atmospheric loading, and Earth tides. The geyser system is approximated as a permeable conduit of intensely fractured rock surrounded by a less permeable rock matrix. Numerical simulation of this conceptual model yields a set of parameters that controls geyser existence and periodicity. Much of the responsiveness to remote seismicity and other small strains in the Earth can be explained in terms of variations in permeability and lateral recharge rates.

  2. Controls on geyser periodicity

    USGS Publications Warehouse

    Ingebritsen, S.E.; Rojstaczer, S.A.

    1993-01-01

    Geyser eruption frequency is not constant over time and has been shown to vary with small (≤10^-6) strains induced by seismic events, atmospheric loading, and Earth tides. The geyser system is approximated as a permeable conduit of intensely fractured rock surrounded by a less permeable rock matrix. Numerical simulation of this conceptual model yields a set of parameters that controls geyser existence and periodicity. Much of the responsiveness to remote seismicity and other small strains in the Earth can be explained in terms of variations in permeability and lateral recharge rates.

  3. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741

  4. Statistical summaries of selected Iowa streamflow data through September 2013

    USGS Publications Warehouse

    Eash, David A.; O'Shea, Padraic S.; Weber, Jared R.; Nguyen, Kevin T.; Montgomery, Nicholas L.; Simonson, Adrian J.

    2016-01-04

    Statistical summaries of streamflow data collected at 184 streamgages in Iowa are presented in this report. All streamgages included for analysis have at least 10 years of continuous record collected before or through September 2013. This report is an update to two previously published reports that presented statistical summaries of selected Iowa streamflow data through September 1988 and September 1996. The statistical summaries include (1) monthly and annual flow durations, (2) annual exceedance probabilities of instantaneous peak discharges (flood frequencies), (3) annual exceedance probabilities of high discharges, and (4) annual nonexceedance probabilities of low discharges and seasonal low discharges. Also presented for each streamgage are graphs of the annual mean discharges, mean annual mean discharges, 50-percent annual flow-duration discharges (median flows), harmonic mean flows, mean daily mean discharges, and flow-duration curves. Two sets of statistical summaries are presented for each streamgage: (1) long-term statistics for the entire period of streamflow record and (2) recent-term statistics for the 30-year period of record from 1984 to 2013. The recent-term statistics are calculated only for streamgages with streamflow records pre-dating the 1984 water year and with at least 10 years of record during 1984–2013. The streamflow statistics in this report are not adjusted for the effects of water use; although some of this water is used consumptively, most of it is returned to the streams.
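The flow-duration statistics listed above can be illustrated with a minimal sketch (synthetic data and names are mine; the report's actual computations follow USGS procedures):

```python
import numpy as np

def flow_duration(daily_q, percents):
    """Discharge equaled or exceeded p percent of the time, for each p.
    The p-percent duration flow is the (100 - p)th percentile of daily flows."""
    return {p: float(np.percentile(daily_q, 100 - p)) for p in percents}

rng = np.random.default_rng(seed=1)
daily_q = rng.lognormal(mean=3.0, sigma=1.0, size=365 * 30)  # synthetic 30-year record
fdc = flow_duration(daily_q, percents=[10, 50, 90])
# fdc[50] is the median daily discharge; fdc[90] is a low-flow statistic,
# exceeded 90 percent of the time.
```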

  5. Global stroke statistics.

    PubMed

    Thrift, Amanda G; Thayabaranathan, Tharshanah; Howard, George; Howard, Virginia J; Rothwell, Peter M; Feigin, Valery L; Norrving, Bo; Donnan, Geoffrey A; Cadilhac, Dominique A

    2017-01-01

    Background: Up-to-date data on incidence, mortality, and case-fatality for stroke are important for setting the agenda for prevention and healthcare. Aims and/or hypothesis: We aim to update the most current incidence and mortality data on stroke available by country, to expand the scope to case-fatality, and to explore how registry data might be complementary. Methods: Data were compiled using two approaches: (1) an updated literature review building from our previous review and (2) direct acquisition and analysis of stroke events in the World Health Organization (WHO) mortality database for each country providing these data. To assess new and/or updated data on incidence, we searched multiple databases to identify new original papers and review articles that met ideal criteria for stroke incidence studies and were published between 15 May 2013 and 31 May 2016. For data on case-fatality, we searched between 1980 and 31 May 2016. We further screened the reference lists and citation histories of papers to identify other studies not obtained from these sources. Mortality codes for ICD-8, ICD-9, and ICD-10 were extracted. Using the population denominators provided for each country, we calculated both the crude mortality from stroke and mortality adjusted to the WHO world population. We used only the most recent year reported to the WHO for which both population and mortality data were available. Results: Fifty-one countries had data on stroke incidence, some with data over many time periods, and some with data in more than one region. Since our last review, there were new incidence studies from 12 countries, with four meeting pre-determined quality criteria. In these four studies, the incidence of stroke, adjusted to the WHO world standard population, ranged from 76 per 100,000 population per year in Australia (2009-10) up to 119 per 100,000 population per year in New Zealand (2011-12), with the latter being in those aged at least 15 years. Only in Martinique (2011-12) was the
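The WHO-adjusted mortality computation described above is direct age standardization; here is a hedged sketch with hypothetical numbers (the real WHO World Standard Population uses many more age bands than the three shown):

```python
def age_standardized_rate(deaths, population, std_weights):
    """Directly age-standardized rate per 100,000.
    deaths, population: counts per age band; std_weights: standard-population
    shares for the same bands, summing to 1."""
    assert abs(sum(std_weights) - 1.0) < 1e-9
    rates = [d / p for d, p in zip(deaths, population)]  # age-specific rates
    return 100_000 * sum(r * w for r, w in zip(rates, std_weights))

# Hypothetical three-band example:
deaths = [5, 40, 300]
population = [500_000, 400_000, 100_000]
std_weights = [0.40, 0.45, 0.15]  # illustrative standard-population shares
asr = age_standardized_rate(deaths, population, std_weights)  # ~49.9 per 100,000
```

Weighting by a common standard population is what makes rates comparable across countries with different age structures.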

  6. Dichlorphenamide: A Review in Primary Periodic Paralyses.

    PubMed

    Greig, Sarah L

    2016-03-01

    Oral dichlorphenamide (Keveyis™) is a carbonic anhydrase inhibitor that is approved in the USA for the treatment of primary hyperkalaemic and hypokalaemic periodic paralyses and related variants. The efficacy and safety of dichlorphenamide in patients with primary periodic paralyses have been evaluated in four 9-week, randomized, double-blind, placebo-controlled, phase III trials [two parallel-group trials (HOP and HYP) and two crossover trials]. In two trials in patients with hypokalaemic periodic paralysis, dichlorphenamide was associated with a significantly (eightfold) lower paralytic attack rate and fewer patients with acute intolerable worsening compared with placebo. In two trials in patients with hyperkalaemic periodic paralysis, the attack rate was lower with dichlorphenamide than placebo, with this comparison reaching statistical significance in one trial (crossover) but not the other (HYP), although the attack rate was approximately fivefold lower with dichlorphenamide than placebo in the HYP trial. In 52-week, open-label extensions of the HOP and HYP trials, dichlorphenamide provided sustained efficacy in patients with hypokalaemic or hyperkalaemic periodic paralysis. Dichlorphenamide was generally well tolerated in all four phase III trials and during the extension trials; the most common adverse events were paraesthesia, cognitive disorders and dysgeusia. As the first agent to be approved in the USA for this indication, dichlorphenamide is a valuable treatment option for patients with primary hyperkalaemic or hypokalaemic periodic paralysis.

  7. Multifunctional periodic cellular metals.

    PubMed

    Wadley, Haydn N G

    2006-01-15

    Periodic cellular metals with honeycomb and corrugated topologies are widely used for the cores of lightweight sandwich panel structures. Honeycombs have closed cell pores and are well suited for thermal protection while also providing efficient load support. Corrugated core structures provide less efficient and highly anisotropic load support, but enable cross flow heat exchange opportunities because their pores are continuous in one direction. Recent advances in topology design and fabrication have led to the emergence of lattice truss structures with open cell structures. These three classes of periodic cellular metals can now be fabricated from a wide variety of structural alloys. Many topologies are found to provide adequate stiffness and strength for structural load support when configured as the cores of sandwich panels. Sandwich panels with core relative densities of 2-10% and cell sizes in the millimetre range are being assessed for use as multifunctional structures. The open, three-dimensional interconnected pore networks of lattice truss topologies provide opportunities for simultaneously supporting high stresses while also enabling cross flow heat exchange. These highly compressible structures also provide opportunities for the mitigation of high intensity dynamic loads created by impacts and shock waves in air or water. By filling the voids with polymers and hard ceramics, these structures have also been found to offer significant resistance to penetration by projectiles.

  8. Periodic truss structures

    NASA Astrophysics Data System (ADS)

    Zok, Frank W.; Latture, Ryan M.; Begley, Matthew R.

    2016-11-01

    Despite the recognition of the enormous potential of periodic trusses for use in a broad range of technologies, there are no widely-accepted descriptors of their structure. The terminology has been based loosely either on geometry of polyhedra or of point lattices: neither of which, on its own, has an appropriate structure to fully define periodic trusses. The present article lays out a system for classification of truss structure types. The system employs concepts from crystallography and geometry to describe nodal locations and connectivity of struts. Through a series of illustrative examples of progressively increasing complexity, a rational taxonomy of truss structure is developed. Its conceptual evolution begins with elementary cubic trusses, increasing in complexity with non-cubic and compound trusses as well as supertrusses, and, finally, with complex trusses. The conventions and terminology adopted to define truss structure yield concise yet unambiguous descriptions of structure types and of specific (finite) trusses. The utility of the taxonomy is demonstrated by bringing into alignment a disparate set of ad hoc and incomplete truss designations previously employed in a broad range of science and engineering fields. Additionally, a particular compound truss (comprising two interpenetrating elementary trusses) is shown to be superior to the octet truss for applications requiring high stiffness and elastic isotropy. By systematically stepping through and analyzing the finite number of structure types identified through the present classification system, optimal structures for prescribed mechanical and functional requirements are expected to be ascertained in an expeditious manner.

  9. Zemstvo Statistics on Public Education.

    ERIC Educational Resources Information Center

    Abramov, V. F.

    1997-01-01

    Surveys the general organizational principles and forms of keeping the zemstvo (regional) statistics on Russian public education. Conveys that they were subdivided into three types: (1) the current statistics that continuously monitored schools; (2) basic surveys that provided a comprehensive characterization of a given territory's public…

  10. Representational Versatility in Learning Statistics

    ERIC Educational Resources Information Center

    Graham, Alan T.; Thomas, Michael O. J.

    2005-01-01

    Statistical data can be represented in a number of qualitatively different ways, the choice depending on the following three conditions: the concepts to be investigated; the nature of the data; and the purpose for which they were collected. This paper begins by setting out frameworks that describe the nature of statistical thinking in schools, and…

  11. Modern Statistical Methods for Astronomy

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.; Babu, G. Jogesh

    2012-07-01

    1. Introduction; 2. Probability; 3. Statistical inference; 4. Probability distribution functions; 5. Nonparametric statistics; 6. Density estimation or data smoothing; 7. Regression; 8. Multivariate analysis; 9. Clustering, classification and data mining; 10. Nondetections: censored and truncated data; 11. Time series analysis; 12. Spatial point processes; Appendices; Index.

  12. Digest of Education Statistics, 1998.

    ERIC Educational Resources Information Center

    Snyder, Thomas D.; Hoffman, Charlene M.; Geddes, Claire M.

    This 1998 edition of the "Digest of Education Statistics" is the 34th in a series of publications initiated in 1962. Its primary purpose is to provide a compilation of statistical information covering the broad field of American education from kindergarten through graduate school. The digest includes data from many government and private…

  13. Statistics Anxiety among Postgraduate Students

    ERIC Educational Resources Information Center

    Koh, Denise; Zawi, Mohd Khairi

    2014-01-01

Most postgraduate programmes that have research components require students to take at least one course in research statistics. Not all postgraduate programmes are science based; there are a significant number of postgraduate students from the social sciences who will be taking statistics courses as they try to complete their…

  14. Explorations in Statistics: Confidence Intervals

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This third installment of "Explorations in Statistics" investigates confidence intervals. A confidence interval is a range that we expect, with some level of confidence, to include the true value of a population parameter…
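The interval described in record 14 can be sketched in a few lines. This is a minimal illustration using the normal approximation (z ≈ 1.96 for 95% confidence) and a hypothetical helper name, not code from the article:

```python
import math

def mean_confidence_interval(sample, z=1.96):
    """Normal-approximation confidence interval for a population mean
    (illustrative helper; z = 1.96 gives roughly 95% confidence)."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                               # standard error of the mean
    return mean - z * se, mean + z * se

lo, hi = mean_confidence_interval([4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 5.0])
# a range that, with ~95% confidence, should include the true mean
```

For small samples a t-multiplier would be more appropriate than the fixed z shown here.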

  15. Book Trade Research and Statistics.

    ERIC Educational Resources Information Center

    Bosch, Stephen; Ink, Gary; Lofquist, William S.

    1998-01-01

    Provides data on prices of U.S. and foreign materials; book title output and average prices, 1996 final and 1997 preliminary figures; book sales statistics, 1997--AAP preliminary estimates; U.S. trade in books, 1997; international book title output, 1990-95; book review media statistics; and number of book outlets in the U.S. and Canada. (PEN)

  16. Book Trade Research and Statistics.

    ERIC Educational Resources Information Center

    Sullivan, Sharon G.; Ink, Gary; Grabois, Andrew; Barr, Catherine

    2001-01-01

    Includes six articles that discuss research and statistics relating to the book trade. Topics include prices of U.S. and foreign materials; book title output and average prices; book sales statistics; book exports and imports; book outlets in the U.S. and Canada; and books and other media reviewed. (LRW)

  17. Canadian Statistics in the Classroom.

    ERIC Educational Resources Information Center

    School Libraries in Canada, 2002

    2002-01-01

    Includes 22 articles that address the use of Canadian statistics in the classroom. Highlights include the Statistics Canada Web site; other Web resources; original sources; critical thinking; debating with talented and gifted students; teaching marketing; environmental resources; data management; social issues and values; math instruction; reading…

  18. Statistical Factors in Complexation Reactions.

    ERIC Educational Resources Information Center

    Chung, Chung-Sun

    1985-01-01

    Four cases which illustrate statistical factors in complexation reactions (where two of the reactants are monodentate ligands) are presented. Included are tables showing statistical factors for the reactions of: (1) square-planar complexes; (2) tetrahedral complexes; and (3) octahedral complexes. (JN)

  19. SOCR: Statistics Online Computational Resource

    ERIC Educational Resources Information Center

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…

  20. Statistical Methods in Psychology Journals.

    ERIC Educational Resources Information Center

Wilkinson, Leland

    1999-01-01

    Proposes guidelines for revising the American Psychological Association (APA) publication manual or other APA materials to clarify the application of statistics in research reports. The guidelines are intended to induce authors and editors to recognize the thoughtless application of statistical methods. Contains 54 references. (SLD)

  1. Book Trade Research and Statistics.

    ERIC Educational Resources Information Center

    Alexander, Adrian W.; And Others

    1994-01-01

    The six articles in this section examine prices of U.S. and foreign materials; book title output and average prices; book sales statistics; U.S. book exports and imports; number of book outlets in the United States and Canada; and book review media statistics. (LRW)

  2. Nursing student attitudes toward statistics.

    PubMed

    Mathew, Lizy; Aktan, Nadine M

    2014-04-01

    Nursing is guided by evidence-based practice. To understand and apply research to practice, nurses must be knowledgeable in statistics; therefore, it is crucial to promote a positive attitude toward statistics among nursing students. The purpose of this quantitative cross-sectional study was to assess differences in attitudes toward statistics among undergraduate nursing, graduate nursing, and undergraduate non-nursing students. The Survey of Attitudes Toward Statistics Scale-36 (SATS-36) was used to measure student attitudes, with higher scores denoting more positive attitudes. The convenience sample was composed of 175 students from a public university in the northeastern United States. Statistically significant relationships were found among some of the key demographic variables. Graduate nursing students had a significantly lower score on the SATS-36, compared with baccalaureate nursing and non-nursing students. Therefore, an innovative nursing curriculum that incorporates knowledge of student attitudes and key demographic variables may result in favorable outcomes.

  3. Measuring Student Learning in Social Statistics: A Pretest-Posttest Study of Knowledge Gain

    ERIC Educational Resources Information Center

    Delucchi, Michael

    2014-01-01

    This study used a pretest-posttest design to measure student learning in undergraduate statistics. Data were derived from 185 students enrolled in six different sections of a social statistics course taught over a seven-year period by the same sociology instructor. The pretest-posttest instrument reveals statistically significant gains in…

  4. Cross-correlation search for periodic gravitational waves

    SciTech Connect

    Dhurandhar, Sanjeev; Mukhopadhyay, Himan; Krishnan, Badri; Whelan, John T.

    2008-04-15

    In this paper we study the use of cross correlations between multiple gravitational wave (GW) data streams for detecting long-lived periodic signals. Cross-correlation searches between data from multiple detectors have traditionally been used to search for stochastic GW signals, but recently they have also been used in directed searches for periodic GWs. Here we further adapt the cross-correlation statistic for periodic GW searches by taking into account both the nonstationarity and the long-term-phase coherence of the signal. We study the statistical properties and sensitivity of this search and its relation to existing periodic wave searches, and describe the precise way in which the cross-correlation statistic interpolates between semicoherent and fully coherent methods. Depending on the maximum duration over which we wish to preserve phase coherence, the cross-correlation statistic can be tuned to go from a standard cross-correlation statistic using data from distinct detectors, to the semicoherent time-frequency methods with increasing coherent time baselines, and all the way to a full coherent search. This leads to a unified framework for studying periodic wave searches and can be used to make informed trade-offs between computational cost, sensitivity, and robustness against signal uncertainties.

  5. Statistical anisotropies in gravitational waves in solid inflation

    SciTech Connect

Akhshik, Mohammad; Emami, Razieh; Firouzjahi, Hassan; Wang, Yi

    2014-09-01

Solid inflation can support a long period of anisotropic inflation. We calculate the statistical anisotropies in the scalar and tensor power spectra and their cross-correlation in anisotropic solid inflation. The tensor-scalar cross-correlation can be either positive or negative, which impacts the statistical anisotropies of the TT and TB spectra in the CMB map more significantly than the tensor self-correlation does. The tensor power spectrum contains potentially comparable contributions from quadrupole and octopole angular patterns, which differs from the power spectra of the scalar, the cross-correlation, and the scalar bispectrum, where the quadrupole-type statistical anisotropy dominates over the octopole.

  6. [Childhood periodic syndromes].

    PubMed

    Cuvellier, J-C; Lépine, A

    2010-01-01

    This review focuses on the so-called "periodic syndromes of childhood that are precursors to migraine", as included in the Second Edition of the International Classification of Headache Disorders. Three periodic syndromes of childhood are included in the Second Edition of the International Classification of Headache Disorders: abdominal migraine, cyclic vomiting syndrome and benign paroxysmal vertigo, and a fourth, benign paroxysmal torticollis is presented in the Appendix. The key clinical features of this group of disorders are the episodic pattern and intervals of complete health. Episodes of benign paroxysmal torticollis begin between 2 and 8 months of age. Attacks are characterized by an abnormal inclination and/or rotation of the head to one side, due to cervical dystonia. They usually resolve by 5 years. Benign paroxysmal vertigo presents as sudden attacks of vertigo, accompanied by inability to stand without support, and lasting seconds to minutes. Age at onset is between 2 and 4 years, and the symptoms disappear by the age of 5. Cyclic vomiting syndrome is characterized in young infants and children by repeated stereotyped episodes of pernicious vomiting, at times to the point of dehydration, and impacting quality of life. Mean age of onset is 5 years. Abdominal migraine remains a controversial issue and presents in childhood with repeated stereotyped episodes of unexplained abdominal pain, nausea and vomiting occurring in the absence of headache. Mean age of onset is 7 years. Both cyclic vomiting syndrome and abdominal migraine are noted for the absence of pathognomonic clinical features but also for the large number of other conditions to be considered in their differential diagnoses. Diagnostic criteria, such as those of the Second Edition of the International Classification of Headache Disorders and the North American Society for Pediatric Gastroenterology, Hepatology and Nutrition, have made diagnostic approach and management easier. 
Their diagnosis…

  7. [Big data in official statistics].

    PubMed

    Zwick, Markus

    2015-08-01

The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Before big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have agreed on a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  8. On the Sensitivity of Period Searches

    NASA Astrophysics Data System (ADS)

    Schwarzenberg-Czerny, A.

    2012-04-01

Astronomical time series are special in that their time sampling is uneven, yet often with periodic gaps due to daytime, the moon, and the seasons. There is therefore a need for special-purpose time-series analysis (TSA) methods. The emergence of massive CCD photometric surveys from the ground and space raises the question of an automatic period search in >> 10^5 light curves. We caution that already at the planning stage it is important to account for the effects of time sampling and analysis methods on the sensitivity of detections. We present a transparent scheme for the classification of period-search methods. We employ tools for evaluating the performance of those methods according to the type of light curves investigated. In particular we consider sinusoidal and non-sinusoidal oscillations as well as eclipse or transit light curves. From these considerations we draw recommendations for the optimum analysis of astronomical time series. We present briefly the capability of the automatic period-search package Tatry. Finally we discuss the role of Monte Carlo simulations in the analysis of detection sensitivity. As an example, we demonstrate a practical method to account for the bandwidth (multi-trial) penalty in the statistical evaluation of detected periods.
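The bandwidth (multi-trial) penalty mentioned in record 8 has a simple closed form under the common simplification that the trial frequencies are independent; the function name below is illustrative, not from the paper:

```python
def multitrial_false_alarm(p_single, n_trials):
    """Probability that at least one of n_trials independent frequencies
    exceeds the single-trial detection threshold (assumes independent
    trials, a common simplification of the bandwidth penalty)."""
    return 1.0 - (1.0 - p_single) ** n_trials
```

In practice the effective number of independent trials must itself be estimated, e.g. via the Monte Carlo simulations the abstract discusses.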

  9. Statistical Downscaling for the Northern Great Plains

    NASA Astrophysics Data System (ADS)

    Coburn, J.

    2014-12-01

The need for detailed, local-scale information about the warming climate has led to the use of ever more complex and geographically realistic computer models, as well as regional models capable of capturing much finer details. Another class of methods for ascertaining localized data is known as statistical downscaling, which offers some advantages over regional models, especially in the realm of computational efficiency. Statistical downscaling can be described as the process of linking coarse-resolution climate model output to fine-resolution or even station-level data via statistical relationships, with the purpose of correcting model biases at the local scale. The development and application of downscaling has given rise to a plethora of techniques, which have been applied at many spatial scales and to multiple climate variables. In this study two downscaling processes, bias-corrected statistical downscaling (BCSD) and canonical correlation analysis (CCA), are applied to minimum and maximum temperatures and precipitation for the Northern Great Plains (NGP, 40 - 53°N and 95 - 120°W) region at both daily and monthly time steps. The methods were tested by assessing their ability to recreate local variations in a set of both spatial and temporal climate metrics obtained through the analysis of 1/16-degree station data for the period 1950 to 2000. Model data for temperature, precipitation and a set of predictor variables were obtained from CMIP5 for 15 models. BCSD was applied using direct comparison and correction of the variable distributions via quantile mapping. CCA was calibrated on the data for the period 1950 to 1980 using a series of model-based predictor variables screened for increasing skill, with the derived model being applied to the period 1980 to 2000 so as to verify that it could recreate the overall climate patterns and trends. As in previous studies done on other regions, it was found that the CCA method recreated…
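The distribution-correction step at the heart of BCSD is empirical quantile mapping: find a model value's quantile in the model distribution, then read that same quantile off the observed distribution. A minimal sketch (the function name and toy samples are illustrative, not the study's code):

```python
import bisect

def quantile_map(value, model_sample, obs_sample):
    """Empirical quantile mapping: correct a model value by mapping its
    quantile in the model distribution onto the observed distribution."""
    m = sorted(model_sample)
    o = sorted(obs_sample)
    # empirical CDF position of `value` within the model distribution
    q = bisect.bisect_left(m, value) / len(m)
    # read the same quantile off the observed distribution
    idx = min(int(q * len(o)), len(o) - 1)
    return o[idx]
```

Production implementations interpolate between quantiles and handle values outside the calibration range; this sketch only shows the core mapping.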

  10. Statistical properties of ionospheric stimulated electromagnetic emissions

    NASA Astrophysics Data System (ADS)

    Karlsson, R. L.; Carozzi, T. D.; Norin, L.; Bergman, J. E. S.; Thidé, B.

    2006-08-01

We have analysed the statistical properties of the stimulated electromagnetic emission (SEE) spectral features in the steady state, reached after a long period of continuous HF pumping of the ionosphere in experiments performed at the Sura ionospheric radio research facility in Russia. Using a digital filter bank method, we have been able to analyse complex-valued signals within narrow frequency bands. Each of the SEE spectral features is thereby separated into a number of narrow spectral components. Statistical tests were performed for all these spectral components, and the distributions of the spectral amplitudes and phases were evaluated. A test for sinusoidal components was also performed. These tests showed that all observed SEE features were indistinguishable from coloured Gaussian noise. The test results exclude the possibility that the SEE features are the result of a single isolated coherent process, but do not rule out that there could be many statistically independent parametric wave-wave processes taking place simultaneously at various parts of the HF-pumped ionosphere, as long as the superposition of all these is incoherent. Furthermore, from the test results, we cannot exclude the possibility that the waveforms of some, or all, of the SEE features may be chaotic.

  11. (Errors in statistical tests)3.

    PubMed

    Phillips, Carl V; MacLehose, Richard F; Kaufman, Jay S

    2008-07-14

In 2004, Garcia-Berthou and Alcaraz published "Incongruence between test statistics and P values in medical papers," a critique of statistical errors that received a tremendous amount of attention. One of their observations was that the final reported digit of p-values in articles published in the journal Nature departed substantially from the uniform distribution that they suggested should be expected. In 2006, Jeng critiqued that critique, observing that the statistical analysis of those terminal digits had been based on comparing the actual distribution to a uniform continuous distribution, when digits obviously are discretely distributed. Jeng corrected the calculation and reported statistics that did not so clearly support the claim of a digit preference. However delightful it may be to read a critique of statistical errors in a critique of statistical errors, we nevertheless found several aspects of the whole exchange to be quite troubling, prompting our own meta-critique of the analysis. The previous discussion emphasized statistical significance testing. But there are various reasons to expect departure from the uniform distribution in terminal digits of p-values, so that simply rejecting the null hypothesis is not terribly informative. Much more importantly, Jeng found that the original p-value of 0.043 should have been 0.086, and suggested this represented an important difference because it was on the other side of 0.05. Among the most widely reiterated (though often ignored) tenets of modern quantitative research methods is that we should not treat statistical significance as a bright line test of whether we have observed a phenomenon. 
Moreover, it sends the wrong message about the role of statistics to suggest that a result should be dismissed because of limited statistical precision when it is so easy to gather more data. In response to these limitations, we gathered more data to improve the statistical precision, and analyzed the actual pattern of the…
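The digit-preference test at issue in record 11 can be reproduced with a chi-square statistic over the ten terminal digits, compared against a uniform *discrete* distribution (in the spirit of Jeng's correction). This is an illustrative sketch, not the authors' code:

```python
from collections import Counter

def terminal_digit_chi2(p_strings):
    """Chi-square statistic comparing the last reported digits of p-values
    (given as strings) against a uniform discrete distribution over 0-9."""
    counts = Counter(s[-1] for s in p_strings)
    expected = len(p_strings) / 10.0
    return sum((counts.get(str(d), 0) - expected) ** 2 / expected
               for d in range(10))

# Perfectly uniform terminal digits give a statistic of 0; larger values
# would be compared against the chi-square critical value for df = 9.
stat = terminal_digit_chi2([f"0.0{d}" for d in range(10)] * 5)
```

As the abstract stresses, rejecting uniformity here should not be read as a bright-line verdict; the statistic only quantifies departure from the discrete null.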

  12. Characterizations of linear sufficient statistics

    NASA Technical Reports Server (NTRS)

    Peters, B. C., Jr.; Reoner, R.; Decell, H. P., Jr.

    1977-01-01

A surjective bounded linear operator T from a Banach space X to a Banach space Y must be a sufficient statistic for a dominated family of probability measures defined on the Borel sets of X. These results were applied to characterize linear sufficient statistics for families of the exponential type, including as special cases the Wishart and multivariate normal distributions. The latter result was used to establish precisely which procedures for sampling from a normal population have the property that the sample mean is a sufficient statistic.

  13. The Statistical Basis of Chemical Equilibria.

    ERIC Educational Resources Information Center

    Hauptmann, Siegfried; Menger, Eva

    1978-01-01

    Describes a machine which demonstrates the statistical bases of chemical equilibrium, and in doing so conveys insight into the connections among statistical mechanics, quantum mechanics, Maxwell Boltzmann statistics, statistical thermodynamics, and transition state theory. (GA)

  14. Spina Bifida Data and Statistics

    MedlinePlus

Data and Statistics … non-Hispanic white and non-Hispanic black women. Data from 12 state-based birth defects tracking programs …

  15. Statistical ecology comes of age.

    PubMed

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.

  16. Summary statistics in auditory perception.

    PubMed

    McDermott, Josh H; Schemitsch, Michael; Simoncelli, Eero P

    2013-04-01

    Sensory signals are transduced at high resolution, but their structure must be stored in a more compact format. Here we provide evidence that the auditory system summarizes the temporal details of sounds using time-averaged statistics. We measured discrimination of 'sound textures' that were characterized by particular statistical properties, as normally result from the superposition of many acoustic features in auditory scenes. When listeners discriminated examples of different textures, performance improved with excerpt duration. In contrast, when listeners discriminated different examples of the same texture, performance declined with duration, a paradoxical result given that the information available for discrimination grows with duration. These results indicate that once these sounds are of moderate length, the brain's representation is limited to time-averaged statistics, which, for different examples of the same texture, converge to the same values with increasing duration. Such statistical representations produce good categorical discrimination, but limit the ability to discern temporal detail.

  17. National Center for Health Statistics

    MedlinePlus

Population Surveys: National Health and Nutrition Examination Survey; National Health Interview Survey; National Survey of Family Growth. Vital Records: National Vital Statistics System; National Death …

  18. FUNSTAT and statistical image representations

    NASA Technical Reports Server (NTRS)

    Parzen, E.

    1983-01-01

General ideas of functional statistical inference analysis for one-sample and two-sample, univariate and bivariate, settings are outlined. The ONESAM program is applied to analyze the univariate probability distributions of multi-spectral image data.

  19. Heart Disease and Stroke Statistics

    MedlinePlus

AHA News: Heart failure on the rise; cardiovascular diseases remain leading killer. AHA News: Heart failure projected to increase dramatically … 2017 Statistics At-a-Glance: Heart Disease and …

  20. Statistical Theory of Breakup Reactions

    NASA Astrophysics Data System (ADS)

    Bertulani, Carlos A.; Descouvemont, Pierre; Hussein, Mahir S.

    2014-04-01

We propose an alternative to coupled-channels calculations with loosely bound exotic nuclei (CDCC), based on the Random Matrix Model of the statistical theory of nuclear reactions. The coupled-channels equations are divided into two sets: the first set is described by the CDCC, and the other set is treated with RMT. The resulting theory is a statistical CDCC (CDCCs), able in principle to take into account many pseudo-channels.

  1. Hidden Statistics of Schroedinger Equation

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2011-01-01

Work was carried out in determining the mathematical origin of randomness in quantum mechanics and creating a hidden statistics of the Schrödinger equation; i.e., to expose the transitional stochastic process as a "bridge" to the quantum world. The governing equations of hidden statistics would preserve such properties of quantum physics as superposition, entanglement, and direct-product decomposability while allowing one to measure its state variables using classical methods.

  2. Statistics

    NASA Astrophysics Data System (ADS)

    Gorzkowski, Waldemar

    The following sections are included: * NATIONAL PHYSICS OLYMPIADS * DISTRIBUTION OF PRIZES IN TWENTY INTERNATIONAL PHYSICS OLYMPIADS * NUMBERS OF PRIZES IN SUBSEQUENT INTERNATIONAL PHYSICS OLYMPIADS * PROBLEMS AND THEIR MARKING

  3. Statistical inference and Aristotle's Rhetoric.

    PubMed

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.

  4. Fractional statistical potential in graphene

    NASA Astrophysics Data System (ADS)

    Ardenghi, J. S.

    2017-03-01

In this work fractional statistics is applied to an anyon gas in graphene to obtain the special features that the arbitrary phase interchange of the particle coordinates introduces in the thermodynamic properties. The electron gas is constituted by N anyons in the long-wavelength approximation obeying fractional exclusion statistics, and the partition function is analyzed in terms of a perturbation expansion up to first order in the dimensionless constant λ / L, L being the length of the graphene sheet and λ = βℏvF the thermal wavelength. By considering the correct permutation expansion of the many-anyon wavefunction, taking into account that the phase changes with the number of inversions in each permutation, the statistical fermionic/bosonic potential is obtained and the intermediate statistical behavior is found. It is shown that "extra" fermionic and bosonic particle states appear, and this "statistical particle" distribution depends on N. Entropy and specific heat are obtained up to first order in λ / L, showing that the results differ from those obtained in other approximations to fractional exclusion statistics.

  5. Calculating statistical distributions from operator relations: The statistical distributions of various intermediate statistics

    SciTech Connect

Dai, Wu-Sheng; Xie, Mi

    2013-05-15

In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of the quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose–Einstein and Fermi–Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers, determined by the deformation parameter q. This result shows that the results given in much of the literature on the q-deformation distribution are inaccurate or incomplete. -- Highlights: ► A general discussion on calculating the statistical distribution from relations of creation, annihilation, and number operators. ► A systematic study of the statistical distributions corresponding to various q-deformation schemes. ► An argument that many results on q-deformation distributions in the literature are inaccurate or incomplete.
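For reference, the Gentile distribution the abstract refers to has the standard mean-occupation form for a maximum occupation number n (recovering Fermi–Dirac at n = 1 and Bose–Einstein as n → ∞):

```latex
\langle N_s \rangle
  = \frac{1}{e^{\beta(\varepsilon_s - \mu)} - 1}
  - \frac{n + 1}{e^{(n+1)\beta(\varepsilon_s - \mu)} - 1}
```

Setting n = 1 reduces the right-hand side to 1/(e^{β(ε_s−μ)} + 1), the Fermi–Dirac distribution, while the second term vanishes as n → ∞, leaving the Bose–Einstein form.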

  6. Localizing periodicity in near-field images

    NASA Astrophysics Data System (ADS)

    Fraundorf, P.

    1990-02-01

We show that Bayesian physical inference, like that used in statistical mechanics, can guide the systematic construction of Fourier dark-field methods for localizing periodicity in near-field (e.g., scanning tunneling and electron phase contrast) images. For crystals in an aperiodic field, the Fourier coefficient Ze^{iφ} combines with a prior estimate for the background amplitude B to predict background phase (β) values distributed with a probability p(β − φ | Z, φ, B) inversely proportional to the amplitude P of the signal of interest, when the latter is treated as an unknown translation scaled to B.

  7. Lagrangian statistics of light particles in turbulence

    NASA Astrophysics Data System (ADS)

Mercado, Julián Martínez; Prakash, Vivek N.; Tagawa, Yoshiyuki; Sun, Chao; Lohse, Detlef (International Collaboration for Turbulence Research)

    2012-05-01

We study the Lagrangian velocity and acceleration statistics of light particles (micro-bubbles in water) in homogeneous isotropic turbulence. Micro-bubbles with a diameter db = 340 μm and Stokes number from 0.02 to 0.09 are dispersed in a turbulent water tunnel operated at Taylor-Reynolds numbers (Reλ) ranging from 160 to 265. We reconstruct the bubble trajectories by employing three-dimensional particle tracking velocimetry. It is found that the probability density functions (PDFs) of the micro-bubble acceleration show a highly non-Gaussian behavior with flatness values in the range 23 to 30. The acceleration flatness values show an increasing trend with Reλ, consistent with previous experiments [G. Voth, A. La Porta, A. M. Crawford, J. Alexander, and E. Bodenschatz, "Measurement of particle accelerations in fully developed turbulence," J. Fluid Mech. 469, 121 (2002)], 10.1017/S0022112002001842 and numerics [T. Ishihara, Y. Kaneda, M. Yokokawa, K. Itakura, and A. Uno, "Small-scale statistics in high-resolution direct numerical simulation of turbulence: Reynolds number dependence of one-point velocity gradient statistics," J. Fluid Mech. 592, 335 (2007)], 10.1017/S0022112007008531. These acceleration PDFs show a higher intermittency compared to tracers [S. Ayyalasomayajula, Z. Warhaft, and L. R. Collins, "Modeling inertial particle acceleration statistics in isotropic turbulence," Phys. Fluids. 20, 095104 (2008)], 10.1063/1.2976174 and heavy particles [S. Ayyalasomayajula, A. Gylfason, L. R. Collins, E. Bodenschatz, and Z. Warhaft, "Lagrangian measurements of inertial particle accelerations in grid generated wind tunnel turbulence," Phys. Rev. Lett. 97, 144507 (2006)], 10.1103/PhysRevLett.97.144507 in wind tunnel experiments. In addition, the micro-bubble acceleration autocorrelation function decorrelates slower with increasing Reλ. We also compare our results with experiments in von Kármán flows and point-particle direct numerical simulations with periodic…

  8. 49 CFR Appendix A to Part 379 - Schedule of Records and Periods of Retention

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Note A. K. Supporting Data for Reports and Statistics 1. Supporting data for reports filed with the... Transportation's Bureau of Transportation Statistics and regulatory bodies: (a) Supporting data for annual financial, operating and statistical reports 3 years. (b) Supporting data for periodical reports...

  9. Periodically distributed objects with quasicrystalline diffraction pattern

    SciTech Connect

    Wolny, Janusz Strzalka, Radoslaw; Kuczera, Pawel

    2015-03-30

    It is possible to construct fully periodically distributed objects with a diffraction pattern identical to the one obtained for quasicrystals. These objects are probability distributions of distances obtained in the statistical approach to aperiodic structures distributed periodically. The diffraction patterns have been derived by using a two-mode Fourier transform—a very powerful method not used in classical crystallography. It is shown that if scaling is present in the structure, this two-mode Fourier transform can be reduced to a regular Fourier transform with appropriately rescaled scattering vectors and added phases. Detailed case studies for the model sets of the 1D Fibonacci chain and the 2D Penrose tiling are discussed. Finally, it is shown that crystalline, quasicrystalline, and approximant structures can be treated in the same way.
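For readers unfamiliar with the 1D model set mentioned, the Fibonacci chain can be generated by the substitution rule L → LS, S → L; this small sketch shows only the chain construction, not the paper's two-mode Fourier analysis:

```python
# Fibonacci chain via the substitution L -> LS, S -> L. The word length
# and the counts of L and S follow consecutive Fibonacci numbers, so the
# L/S ratio approaches the golden mean that governs quasicrystal scaling.
def fibonacci_chain(iterations: int) -> str:
    word = "L"
    for _ in range(iterations):
        # The lowercase placeholder applies both rules simultaneously.
        word = word.replace("L", "l").replace("S", "L").replace("l", "LS")
    return word

chain = fibonacci_chain(8)  # 55 letters: 34 L's and 21 S's
```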

  10. MOV reliability evaluation and periodic verification scheduling

    SciTech Connect

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOVs design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety related MOVs.
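The comparison of the nominal margin to its uncertainty can be sketched as a normal-distribution exceedance probability; the numbers below are hypothetical stand-ins, not values from the paper, and the paper's actual margin model depends on valve-specific design data:

```python
import math

nominal_margin = 15.0  # percent: hypothetical best-estimate design margin
sigma_margin = 8.0     # percent: hypothetical 1-sigma margin uncertainty

# Reliability estimated as the probability that the true margin is
# positive, assuming the margin uncertainty is normally distributed.
z = nominal_margin / sigma_margin
reliability = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

A larger ratio of nominal margin to uncertainty directly raises the estimated reliability, which is the logic behind trending MOV performance after periodic tests.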

  11. Solar wind correlations: Statistical and case studies

    NASA Astrophysics Data System (ADS)

    Paularena, K. I.; Richardson, J. D.; Zastenker, G. N.; Dalin, P. A.

    1999-06-01

    Recent work on solar wind plasma correlations using data from several widely-separated spacecraft (IMP 8, INTERBALL-1, WIND, and ISEE-3) has shown that, for 6-hour periods, the average plasma correlation is ~0.7. The focus of these studies has been directed toward a statistical understanding of gross solar wind correlation behavior. In all correlations examined, lower average correlations are caused by the presence of many points from the low correlation subpopulation; nevertheless, data points from the high correlation population are still present. No single organizational factor has yet been found which adequately separates low-correlation periods from high-correlation periods. Some of the spread in correlations is due to the spatial orientations and dimensions of solar wind structures, and thus to the locational alignments of the spacecraft being correlated, but this does not adequately explain all the good or poor correlations since sometimes three nearby spacecraft show poor correlations, while sometimes three widely-separated spacecraft show good correlations. Thus, in order to understand the underlying physics, detailed investigation of individual cases has been undertaken. These results will be important in assigning quality measures to space weather predictions using satellite measurements taken at L1, for example.
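The quoted value (~0.7 over 6-hour periods) is an ordinary Pearson correlation between simultaneous time series from two spacecraft. A sketch on synthetic data; the shared-signal-plus-noise split below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 360  # e.g. 1-minute samples over a 6-hour period

# Two spacecraft see a shared solar wind structure plus independent noise.
shared = np.cumsum(rng.standard_normal(n))   # slowly wandering signal
s1 = shared + 0.5 * rng.standard_normal(n)
s2 = shared + 0.5 * rng.standard_normal(n)

r = np.corrcoef(s1, s2)[0, 1]                # Pearson correlation
```

When the independent-noise amplitude grows relative to the shared structure, r drops, which is one way spacecraft separation can push a period into the low-correlation subpopulation.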

  12. A BAYESIAN STATISTICAL APPROACH FOR THE EVALUATION OF CMAQ

    EPA Science Inventory

    Bayesian statistical methods are used to evaluate Community Multiscale Air Quality (CMAQ) model simulations of sulfate aerosol over a section of the eastern US for 4-week periods in summer and winter 2001. The observed data come from two U.S. Environmental Protection Agency data ...

  13. Education Statistics Quarterly. Volume 4 Issue 4, 2002.

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2002

    2002-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…

  14. Progress of Education in the Asian Region: A Statistical Review.

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Bangkok (Thailand).

    Statistical data for the period 1950-1965, with some data up to 1967, are examined for implications for educational development of the rapidly expanding school-age population in Asia. The quantitative aspects of educational progress comprise the bulk of the review, with qualitative factors discussed in context with future planning. Numerous tables…

  15. The redoubtable ecological periodic table

    EPA Science Inventory

    Ecological periodic tables are repositories of reliable information on quantitative, predictably recurring (periodic) habitat–community patterns and their uncertainty, scaling and transferability. Their reliability derives from their grounding in sound ecological principle...

  16. Doubly Resonant Optical Periodic Structure

    PubMed Central

    Alagappan, G.; Png, C. E.

    2016-01-01

    Periodic structures are well known in various branches of physics for their ability to provide a stopband. In this article, using optical periodic structures we show that, when a second periodicity very close to the original periodicity is introduced, a large number of states appears in the stopband corresponding to the first periodicity. In the limit where the two periods match, we have a continuum of states, and the original stopband completely disappears. This intriguing phenomenon is uncovered by noticing that, regardless of the proximity of the two periodicities, there is an array of spatial points where the dielectric functions corresponding to the two periodicities interfere destructively. These spatial points mimic photonic atoms by satisfying the standard equations of quantum harmonic oscillators, and exhibit lossless, atom-like dispersions. PMID:26853945
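The array of destructive-interference points can be understood from the beat identity for two close periods a and b: cos(2πx/a) + cos(2πx/b) carries an envelope cos(πx(1/a − 1/b)) whose zeros are spaced by ab/(b − a). A sketch with invented numbers, not parameters from the article:

```python
import numpy as np

a, b = 1.0, 1.05  # two close periods (arbitrary units, illustrative)
x = np.linspace(0.0, 60.0, 60001)

# Superposition of the two periodic dielectric modulations.
eps = np.cos(2 * np.pi * x / a) + np.cos(2 * np.pi * x / b)

# Beat envelope: it vanishes on a coarse lattice of points, the
# candidate "photonic atom" sites described in the abstract.
envelope = np.cos(np.pi * x * (1 / a - 1 / b))
site_spacing = a * b / (b - a)  # spacing diverges as the periods merge
```

As b → a the site spacing ab/(b − a) diverges, consistent with the abstract's limit in which the stopband disappears into a continuum of states.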

  17. Doubly Resonant Optical Periodic Structure.

    PubMed

    Alagappan, G; Png, C E

    2016-02-08

    Periodic structures are well known in various branches of physics for their ability to provide a stopband. In this article, using optical periodic structures we show that, when a second periodicity very close to the original periodicity is introduced, a large number of states appears in the stopband corresponding to the first periodicity. In the limit where the two periods match, we have a continuum of states, and the original stopband completely disappears. This intriguing phenomenon is uncovered by noticing that, regardless of the proximity of the two periodicities, there is an array of spatial points where the dielectric functions corresponding to the two periodicities interfere destructively. These spatial points mimic photonic atoms by satisfying the standard equations of quantum harmonic oscillators, and exhibit lossless, atom-like dispersions.

  18. Statistical methods in translational medicine.

    PubMed

    Chow, Shein-Chung; Tse, Siu-Keung; Lin, Min

    2008-12-01

    This study focuses on strategies and statistical considerations for assessment of translation in language (e.g. translation of case report forms in multinational clinical trials), information (e.g. translation of basic discoveries to the clinic) and technology (e.g. translation of Chinese diagnostic techniques to well-established clinical study endpoints) in pharmaceutical/clinical research and development. However, most of our efforts will be directed to statistical considerations for translation in information. Translational medicine has been defined as bench-to-bedside research, where a basic laboratory discovery becomes applicable to the diagnosis, treatment or prevention of a specific disease, and is brought forth by either a physician-scientist who works at the interface between the research laboratory and patient care, or by a team of basic and clinical science investigators. Statistics plays an important role in translational medicine to ensure that the translational process is accurate and reliable with certain statistical assurance. Statistical inference for the applicability of an animal model to a human model is also discussed. Strategies for selection of clinical study endpoints (e.g. absolute changes, relative changes, or responder-defined, based on either absolute or relative change) are reviewed.

  19. Integrable matrix theory: Level statistics.

    PubMed

    Scaramazza, Jasen A; Shastry, B Sriram; Yuzbashyan, Emil A

    2016-09-01

    We study level statistics in ensembles of integrable N×N matrices linear in a real parameter x. The matrix H(x) is considered integrable if it has a prescribed number n>1 of linearly independent commuting partners H^{i}(x) (integrals of motion) [H(x),H^{i}(x)]=0, [H^{i}(x),H^{j}(x)]=0, for all x. In a recent work [Phys. Rev. E 93, 052114 (2016)], 10.1103/PhysRevE.93.052114, we developed a basis-independent construction of H(x) for any n from which we derived the probability density function, thereby determining how to choose a typical integrable matrix from the ensemble. Here, we find that typical integrable matrices have Poisson statistics in the N→∞ limit provided n scales at least as log N; otherwise, they exhibit level repulsion. Exceptions to the Poisson case occur at isolated coupling values x=x_{0} or when correlations are introduced between typically independent matrix parameters. However, level statistics cross over to Poisson at O(N^{-0.5}) deviations from these exceptions, indicating that non-Poissonian statistics characterize only subsets of measure zero in the parameter space. Furthermore, we present strong numerical evidence that ensembles of integrable matrices are stationary and ergodic with respect to nearest-neighbor level statistics.
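The Poisson-versus-repulsion distinction can be illustrated by comparing nearest-neighbor spacings of independent levels with those of a random real-symmetric (GOE) matrix. This is a generic textbook comparison, not the paper's integrable-matrix construction:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 400

# Poisson case: independent uniform levels, as for integrable-like spectra.
poisson_levels = np.sort(rng.uniform(0.0, N, size=N))
s_poisson = np.diff(poisson_levels)
s_poisson = s_poisson / s_poisson.mean()      # normalize to unit mean spacing

# Level-repulsion case: eigenvalues of a random real symmetric (GOE) matrix.
A = rng.standard_normal((N, N))
H = (A + A.T) / 2.0
levels = np.linalg.eigvalsh(H)                # ascending eigenvalues
s_goe = np.diff(levels[N // 4 : 3 * N // 4])  # central, roughly flat density
s_goe = s_goe / s_goe.mean()

# Small spacings are common for Poisson but suppressed by GOE repulsion.
frac_small_poisson = float(np.mean(s_poisson < 0.1))
frac_small_goe = float(np.mean(s_goe < 0.1))
```

The fraction of near-degenerate spacings is the quickest diagnostic: it stays near 1 − e^(−0.1) ≈ 0.095 for Poisson spectra but is strongly suppressed when levels repel.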

  20. Equivalent statistics and data interpretation.

    PubMed

    Francis, Gregory

    2016-10-14

    Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
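The equivalence claim for the two-sample t test can be checked directly: with nx and ny fixed, Cohen's d and the t statistic carry the same information, related by d = t·sqrt(1/nx + 1/ny). A sketch with invented sample values:

```python
import math

# Two hypothetical samples (illustrative values only).
x = [5.1, 4.9, 5.6, 5.8, 5.3, 5.0, 5.4, 5.2]
y = [4.6, 4.8, 4.4, 5.0, 4.7, 4.5, 4.9, 4.3]

nx, ny = len(x), len(y)
mx, my = sum(x) / nx, sum(y) / ny
vx = sum((v - mx) ** 2 for v in x) / (nx - 1)  # sample variances
vy = sum((v - my) ** 2 for v in y) / (ny - 1)

# Pooled standard deviation, Cohen's d, and the two-sample t statistic.
sp = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
d = (mx - my) / sp
t = (mx - my) / (sp * math.sqrt(1 / nx + 1 / ny))
# With nx, ny known, d and t are interchangeable: d = t * sqrt(1/nx + 1/ny),
# so the p value, the CI of d, and a Bayes factor built from t all rest on
# the same information in the data.
```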

  1. Thermodynamic Limit in Statistical Physics

    NASA Astrophysics Data System (ADS)

    Kuzemsky, A. L.

    2014-03-01

    The thermodynamic limit in statistical thermodynamics of many-particle systems is an important but often overlooked issue in the various applied studies of condensed matter physics. To settle this issue, we review tersely the past and present disposition of thermodynamic limiting procedure in the structure of the contemporary statistical mechanics and our current understanding of this problem. We pick out the ingenious approach by Bogoliubov, who developed a general formalism for establishing the limiting distribution functions in the form of formal series in powers of the density. In that study, he outlined the method of justification of the thermodynamic limit when he derived the generalized Boltzmann equations. To enrich and to weave our discussion, we take this opportunity to give a brief survey of the closely related problems, such as the equipartition of energy and the equivalence and nonequivalence of statistical ensembles. The validity of the equipartition of energy permits one to decide what are the boundaries of applicability of statistical mechanics. The major aim of this work is to provide a better qualitative understanding of the physical significance of the thermodynamic limit in modern statistical physics of the infinite and "small" many-particle systems.

  2. 76 FR 297 - Periodic Reporting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-04

    ... 39 CFR Part 3050 Periodic Reporting AGENCY: Postal Regulatory Commission. ACTION: Notice of proposed... a proposed change in certain analytical methods used in periodic reporting. This action responds to... proceeding to consider changes in the analytical methods approved for use in periodic reporting.\\1\\...

  3. 75 FR 1301 - Periodic Reporting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-11

    ... 39 CFR Part 3050 Periodic Reporting AGENCY: Postal Regulatory Commission. ACTION: Proposed rule... rulemaking proceeding to consider changes in the analytical methods approved for use in periodic reporting.\\1... Docket No. RM2009-10, Order on Analytical Principles Used in Periodic Reporting (Proposals Three...

  4. Ethical Statistics and Statistical Ethics: Making an Interdisciplinary Module

    ERIC Educational Resources Information Center

    Lesser, Lawrence M.; Nordenhaug, Erik

    2004-01-01

    This article describes an innovative curriculum module the first author created on the two-way exchange between statistics and applied ethics. The module, having no particular mathematical prerequisites beyond high school algebra, is part of an undergraduate interdisciplinary ethics course which begins with a 3-week introduction to basic applied…

  5. Statistics for People Who (Think They) Hate Statistics. Third Edition

    ERIC Educational Resources Information Center

    Salkind, Neil J.

    2007-01-01

    This text teaches an often intimidating and difficult subject in a way that is informative, personable, and clear. The author takes students through various statistical procedures, beginning with correlation and graphical representation of data and ending with inferential techniques and analysis of variance. In addition, the text covers SPSS, and…

  6. Writing to Learn Statistics in an Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Northrup, Christian Glenn

    2012-01-01

    This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…

  7. Statistical validation of stochastic models

    SciTech Connect

    Hunter, N.F.; Barney, P.; Paez, T.L.; Ferregut, C.; Perez, L.

    1996-12-31

    It is common practice in structural dynamics to develop mathematical models for system behavior, and the authors are now capable of developing stochastic models, i.e., models whose parameters are random variables. Such models have random characteristics that are meant to simulate the randomness in characteristics of experimentally observed systems. This paper suggests a formal statistical procedure for the validation of mathematical models of stochastic systems when data taken during operation of the stochastic system are available. The statistical characteristics of the experimental system are obtained using the bootstrap, a technique for the statistical analysis of non-Gaussian data. The authors propose a procedure to determine whether or not a mathematical model is an acceptable model of a stochastic system with regard to user-specified measures of system behavior. A numerical example is presented to demonstrate the application of the technique.
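The bootstrap mentioned above resamples the observed data with replacement to approximate the sampling distribution of a statistic without Gaussian assumptions. A minimal sketch on synthetic skewed data (not the paper's structural-dynamics measurements):

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-in for experimentally observed, non-Gaussian response data.
data = rng.lognormal(mean=0.0, sigma=0.5, size=200)

# Bootstrap: resample with replacement many times and recompute the
# statistic of interest (here, the mean) on each resample.
boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean()
    for _ in range(2000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])  # percentile interval
```

Comparing a model's predicted behavior against such bootstrap intervals is one concrete way to operationalize the validation procedure the authors describe.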

  8. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  9. The Relationship between Statistics Self-Efficacy, Statistics Anxiety, and Performance in an Introductory Graduate Statistics Course

    ERIC Educational Resources Information Center

    Schneider, William R.

    2011-01-01

    The purpose of this study was to determine the relationship between statistics self-efficacy, statistics anxiety, and performance in introductory graduate statistics courses. The study design compared two statistics self-efficacy measures developed by Finney and Schraw (2003), a statistics anxiety measure developed by Cruise and Wilkins (1980),…

  10. Illustrating the practice of statistics

    SciTech Connect

    Hamada, Christina A; Hamada, Michael S

    2009-01-01

    The practice of statistics involves analyzing data and planning data collection schemes to answer scientific questions. Issues often arise with the data that must be dealt with and can lead to new procedures. In analyzing data, these issues can sometimes be addressed through the statistical models that are developed. Simulation can also be helpful in evaluating a new procedure. Moreover, simulation coupled with optimization can be used to plan a data collection scheme. The practice of statistics as just described is much more than just using a statistical package. In analyzing the data, it involves understanding the scientific problem and incorporating the scientist's knowledge. In modeling the data, it involves understanding how the data were collected and accounting for limitations of the data where possible. Moreover, the modeling is likely to be iterative by considering a series of models and evaluating the fit of these models. Designing a data collection scheme involves understanding the scientist's goal and staying within his/her budget in terms of time and the available resources. Consequently, a practicing statistician is faced with such tasks and requires skills and tools to do them quickly. We have written this article for students to provide a glimpse of the practice of statistics. To illustrate the practice of statistics, we consider a problem motivated by some precipitation data that our relative, Masaru Hamada, collected some years ago. We describe his rain gauge observational study in Section 2. We describe modeling and an initial analysis of the precipitation data in Section 3. In Section 4, we consider alternative analyses that address potential issues with the precipitation data. In Section 5, we consider the impact of incorporating additional information. We design a data collection scheme to illustrate the use of simulation and optimization in Section 6. We conclude this article in Section 7 with a discussion.

  11. A Test Model for Fluctuation-Dissipation Theorems with Time Periodic Statistics (PREPRINT)

    DTIC Science & Technology

    2010-03-09

    ∂Cov(u_2, u_2*)/∂γ_2 = −∫_{−∞}^{t} ∫_{−∞}^{t} (2t − s − r) e^{−γ_2(2t−s−r)} (⟨ψ(s, t)ψ(r, t)⟩ − ⟨ψ(s, t)⟩⟨ψ(r, t)⟩) f_2(s) f_2(r) ds dr, (88) References: [1] R. Abramov. Short-time linear response with reduced-rank tangent map. Chinese Annals of Mathematics, Series B, 30:447–462, 2009. [2] R. Abramov and A. J. Majda. New approximations and tests of linear fluctuation-response for chaotic nonlinear forced-dissipative dynamical systems. J ...

  12. The Genesis Mission Solar Wind Collection: Solar-Wind Statistics over the Period of Collection

    NASA Technical Reports Server (NTRS)

    Barraclough, B. L.; Wiens, R. C.; Steinberg, J. E.; Reisenfeld, D. B.; Neugebauer, M.; Burnett, D. S.; Gosling, J.; Bremmer, R. R.

    2004-01-01

    The NASA Genesis spacecraft was launched August 8, 2001 on a mission to collect samples of solar wind for 2 years and return them to earth September 8, 2004. Detailed analyses of the solar wind ions implanted into high-purity collection substrates will be carried out using various mass spectrometry techniques. These analyses are expected to determine key isotopic ratios and elemental abundances in the solar wind, and by extension, in the solar photosphere. Further, the photospheric composition is thought to be representative of the solar nebula with a few exceptions, so that the Genesis mission will provide a baseline for the average solar nebula composition with which to compare present-day compositions of planets, meteorites, and asteroids. The collection of solar wind samples is almost complete. Collection began for most substrates in early December, 2001, and is scheduled to be complete on April 2 of this year. It is critical to understand the solar-wind conditions during the collection phase of the mission. For this reason, plasma ion and electron spectrometers are continuously monitoring the solar wind proton density, velocity, temperature, the alpha/proton ratio, and angular distribution of suprathermal electrons. Here we report on the solar-wind conditions as observed by these in-situ instruments during the first half of the collection phase of the mission, from December, 2001 to present.

  13. Key China Energy Statistics 2012

    SciTech Connect

    Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia

    2012-05-01

    The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). The Group has published seven editions to date of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.

  14. Statistical parameters for gloss evaluation

    SciTech Connect

    Peiponen, Kai-Erik; Juuti, Mikko

    2006-02-13

    The measurement of minute changes in local gloss has not been presented in international standards due to a lack of suitable glossmeters. The development of a diffractive-element-based glossmeter (DOG) made it possible to detect local variation of gloss from planar and complex-shaped surfaces. Hence, a demand has arisen for proper statistical gloss parameters for classifying surface quality by gloss, similar to the standardized surface roughness classification. In this letter, we define statistical gloss parameters and utilize them as an example in the characterization of gloss from metal surface roughness standards by the DOG.

  15. Statistical inference for inverse problems

    NASA Astrophysics Data System (ADS)

    Bissantz, Nicolai; Holzmann, Hajo

    2008-06-01

    In this paper we study statistical inference for certain inverse problems. We go beyond mere estimation purposes and review and develop the construction of confidence intervals and confidence bands in some inverse problems, including deconvolution and the backward heat equation. Further, we discuss the construction of certain hypothesis tests, in particular concerning the number of local maxima of the unknown function. The methods are illustrated in a case study, where we analyze the distribution of heliocentric escape velocities of galaxies in the Centaurus galaxy cluster, and provide statistical evidence for its bimodality.

  16. Ranald Macdonald and statistical inference.

    PubMed

    Smith, Philip T

    2009-05-01

    Ranald Roderick Macdonald (1945-2007) was an important contributor to mathematical psychology in the UK, as a referee and action editor for British Journal of Mathematical and Statistical Psychology and as a participant and organizer at the British Psychological Society's Mathematics, statistics and computing section meetings. This appreciation argues that his most important contribution was to the foundations of significance testing, where his concern about what information was relevant in interpreting the results of significance tests led him to be a persuasive advocate for the 'Weak Fisherian' form of hypothesis testing.

  17. Statistical Mechanics of Prion Diseases

    SciTech Connect

    Slepoy, A.; Singh, R. R. P.; Pazmandi, F.; Kulkarni, R. V.; Cox, D. L.

    2001-07-30

    We present a two-dimensional, lattice based, protein-level statistical mechanical model for prion diseases (e.g., mad cow disease) with concomitant prion protein misfolding and aggregation. Our studies lead us to the hypothesis that the observed broad incubation time distribution in epidemiological data reflect fluctuation dominated growth seeded by a few nanometer scale aggregates, while much narrower incubation time distributions for innoculated lab animals arise from statistical self-averaging. We model ''species barriers'' to prion infection and assess a related treatment protocol.

  18. The Statistics of Visual Representation

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J.; Rahman, Zia-Ur; Woodell, Glenn A.

    2002-01-01

    The experience of retinex image processing has prompted us to reconsider fundamental aspects of imaging and image processing. Foremost is the idea that a good visual representation requires a non-linear transformation of the recorded (approximately linear) image data. Further, this transformation appears to converge on a specific distribution. Here we investigate the connection between numerical and visual phenomena. Specifically the questions explored are: (1) Is there a well-defined consistent statistical character associated with good visual representations? (2) Does there exist an ideal visual image? And (3) what are its statistical properties?

  19. Key China Energy Statistics 2011

    SciTech Connect

    Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia

    2012-01-15

    The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). In 2008 the Group published the Seventh Edition of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.

  20. Vector statistics of LANDSAT imagery

    NASA Technical Reports Server (NTRS)

    Jayroe, R. R., Jr.; Underwood, D.

    1977-01-01

    A digitized multispectral image, such as LANDSAT data, is composed of numerous four-dimensional vectors, which quantitatively describe the ground scene from which the data are acquired. The statistics of unique vectors that occur in LANDSAT imagery are studied to determine if that information can provide some guidance on reducing image processing costs. A second purpose of this report is to investigate how the vector statistics are changed by various types of image processing techniques and determine if that information can be useful in choosing one processing approach over another.
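Counting the distinct four-dimensional vectors in an image is a one-line operation in modern array libraries. The sketch below uses a small synthetic "scene" with few possible band values; real LANDSAT frames are far larger and spatially correlated:

```python
import numpy as np

rng = np.random.default_rng(4)
# Stand-in for a multispectral scene: each pixel is a 4-band vector of
# small radiance counts (a crude, uncorrelated substitute for real data).
pixels = rng.integers(0, 8, size=(10_000, 4))

unique_vectors, counts = np.unique(pixels, axis=0, return_counts=True)
# When the number of distinct vectors is much smaller than the number of
# pixels, per-unique-vector (table lookup) processing can cut costs.
```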

  1. Fluid turbulence - Deterministic or statistical

    NASA Astrophysics Data System (ADS)

    Cheng, Sin-I.

    The deterministic view of turbulence suggests that the classical theory of fluid turbulence may be treating the wrong entity. The paper explores the physical implications of such an abstract mathematical result, and provides a constructive computational demonstration of the deterministic and the wave nature of fluid turbulence. The associated pressure disturbance for restoring solenoidal velocity is the primary agent, and its reflection from solid surface(s) is the dominant mechanism of turbulence production. Statistical properties and their modeling must address the statistics of the uncertainties of the initial boundary data of the ensemble.

  2. Statistical summaries of New Jersey streamflow records

    USGS Publications Warehouse

    Laskowski, Stanley L.

    1970-01-01

    In 1961 the U.S. Geological Survey prepared a report which was published by the State of New Jersey as Water Resources Circular 6, "New Jersey Streamflow Records analyzed with Electronic Computer" by Miller and McCall. Basic discharge data for periods of record through 1958 were analyzed for 59 stream-gaging stations in New Jersey and flow-duration, low-flow, and high-flow tables were presented. The purpose of the current report is to update and expand Circular 6 by presenting, with a few meaningful statistics and tables, the bulk of the information that may be obtained from the mass of streamflow records available. The records for 79 of approximately 110 stream-gaging stations presently or previously operated in New Jersey, plus records for three stations in Pennsylvania, and one in New York are presented in summarized form. In addition to including a great number of stations in this report, more years of record and more tables are listed for each station. A description of the station, three arrangements of data summarizing the daily flow records and one table listing statistics of the monthly mean flows are provided. No data representing instantaneous extreme flows are given. Plotting positions for the three types of curves describing the characteristics of daily discharge are listed for each station. Statistical parameters are also presented so that alternate curves may be drawn. All stations included in this report have 5 or more years of record. The data presented herein are based on observed flow past the gaging station. For any station where the observed flow is affected by regulation or diversion, a "Remarks" paragraph, explaining the possible effect on the data, is included in the station description. Since any streamflow record is a sample in time, the data derived from these records can provide only a guide to expected future flows. For this reason the flow records are analyzed by statistical techniques, and the magnitude of sampling errors should be ...
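Plotting positions for a flow-duration curve are commonly computed with the Weibull formula m/(n+1), where m is the rank of a discharge sorted from largest to smallest. The flows below are invented for illustration, and the report's own formula choice is not stated in this summary:

```python
# Flow-duration curve sketch: exceedance probability for each observed
# daily discharge (hypothetical values, in arbitrary units).
flows = [120.0, 85.0, 430.0, 60.0, 210.0, 95.0, 150.0, 310.0, 75.0, 180.0]

ranked = sorted(flows, reverse=True)                 # largest flow gets rank 1
n = len(ranked)
exceedance = [(m + 1) / (n + 1) for m in range(n)]   # Weibull m/(n+1)

curve = list(zip(ranked, exceedance))                # (discharge, P exceeded)
```

Plotting discharge against exceedance probability on probability paper is the classic presentation; alternate curves can then be drawn from the listed statistical parameters, as the report describes.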

  3. 20 CFR 634.4 - Statistical standards.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false Statistical standards. 634.4 Section 634.4... System § 634.4 Statistical standards. Recipients shall agree to provide required data following the statistical standards prescribed by the Bureau of Labor Statistics for cooperative statistical programs....

  4. 20 CFR 634.4 - Statistical standards.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Statistical standards. 634.4 Section 634.4... System § 634.4 Statistical standards. Recipients shall agree to provide required data following the statistical standards prescribed by the Bureau of Labor Statistics for cooperative statistical programs....

  5. Teaching Statistics in Integration with Psychology

    ERIC Educational Resources Information Center

    Wiberg, Marie

    2009-01-01

    The aim was to revise a statistics course in order to motivate students to learn statistics and to integrate statistics more throughout a psychology course. Further, we wished to make students more interested in statistics and to help them see the importance of using statistics in psychology research. To achieve this goal, several…

  6. Understanding Statistics Using Computer Demonstrations

    ERIC Educational Resources Information Center

    Dunn, Peter K.

    2004-01-01

    This paper discusses programs that clarify some statistical ideas often discussed yet poorly understood by students. The programs adopt the approach of demonstrating what is happening, rather than using the computer to do the work for the students (and hide the understanding). The programs demonstrate normal probability plots, overfitting of…

  7. Introductory Statistics and Fish Management.

    ERIC Educational Resources Information Center

    Jardine, Dick

    2002-01-01

    Describes how fisheries research and management data (available on a website) have been incorporated into an Introductory Statistics course. In addition to the motivation gained from seeing the practical relevance of the course, some students have participated in the data collection and analysis for the New Hampshire Fish and Game Department. (MM)

  8. China's Statistical System and Resources

    ERIC Educational Resources Information Center

    Xue, Susan

    2004-01-01

    As the People's Republic of China plays an increasingly important role in international politics and trade, countries with economic interests there find they need to know more about this nation. Access to primary information sources, including official statistics from China, however, is very limited, as little exploration has been done into this…

  9. Digest of Education Statistics, 1990.

    ERIC Educational Resources Information Center

    Snyder, Thomas D.; Hoffman, Charlene M.

    This document, consisting of 7 chapters, 35 figures, and 380 tables, provides statistical data on most aspects of United States education, both public and private, from kindergarten through graduate school. The chapters cover the following topics: (1) all levels of education; (2) elementary and secondary education; (3) postsecondary, college,…

  10. Concept Maps in Introductory Statistics

    ERIC Educational Resources Information Center

    Witmer, Jeffrey A.

    2016-01-01

    Concept maps are tools for organizing thoughts on the main ideas in a course. I present an example of a concept map that was created through the work of students in an introductory class and discuss major topics in statistics and relationships among them.

  11. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curve-fit data to the linear equation y = f(x) and will do an ANOVA to check its significance.
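
The first two calculations described above are straightforward to reproduce outside Excel. A minimal Python sketch (the function names and the bisection-based quantile routine are illustrative, not the toolset's actual code):

```python
import math
import statistics

def describe(xs):
    """Descriptive statistics for a user-entered data set x(i)."""
    return {
        "n": len(xs),
        "mean": statistics.mean(xs),
        "stdev": statistics.stdev(xs),  # sample standard deviation
        "min": min(xs),
        "max": max(xs),
    }

def normal_quantile(p, mu, sigma):
    """Value x with P(X <= x) = p for X ~ Normal(mu, sigma),
    analogous to the 'Normal Distribution Estimates' calculation."""
    # invert the erf-based normal CDF by bisection
    cdf = lambda x: 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2))))
    lo, hi = mu - 10 * sigma, mu + 10 * sigma
    for _ in range(200):
        mid = (lo + hi) / 2
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

data = [4.1, 5.0, 5.2, 6.3, 4.8]
print(round(describe(data)["mean"], 2))  # 5.08
print(round(normal_quantile(0.975, 0.0, 1.0), 3))  # 1.96
```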

  12. QUANTUM MECHANICS WITHOUT STATISTICAL POSTULATES

    SciTech Connect

    G. GEIGER; ET AL

    2000-11-01

    The Bohmian formulation of quantum mechanics describes the measurement process in an intuitive way without a reduction postulate. Due to the chaotic motion of the hidden classical particle all statistical features of quantum mechanics during a sequence of repeated measurements can be derived in the framework of a deterministic single system theory.

  13. Undergraduate experiments on statistical optics

    NASA Astrophysics Data System (ADS)

    Scholz, Ruediger; Friege, Gunnar; Weber, Kim-Alessandro

    2016-09-01

    Since the pioneering experiments of Forrester et al (1955 Phys. Rev. 99 1691) and Hanbury Brown and Twiss (1956 Nature 177 27; Nature 178 1046), along with the introduction of the laser in the 1960s, the systematic analysis of random fluctuations of optical fields has developed to become an indispensable part of physical optics for gaining insight into features of the fields. In 1985 Joseph W Goodman prefaced his textbook on statistical optics with a strong commitment to the ‘tools of probability and statistics’ (Goodman 2000 Statistical Optics (New York: John Wiley & Sons Inc.)) in the education of advanced optics. Since then a wide range of novel undergraduate optical counting experiments and corresponding pedagogical approaches have been introduced to underpin the rapid growth of interest in coherence and photon statistics. We propose low cost experimental steps that are a fair way off ‘real’ quantum optics, but that give deep insight into random optical fluctuation phenomena: (1) the introduction of statistical methods into undergraduate university optical lab work, and (2) the connection between the photoelectrical signal and the characteristics of the light source. We describe three experiments and theoretical approaches which may be used to pave the way for a well balanced growth of knowledge, providing students with an opportunity to enhance their abilities to adapt the ‘tools of probability and statistics’.

  14. Statistical methods for evolutionary trees.

    PubMed

    Edwards, A W F

    2009-09-01

    In 1963 and 1964, L. L. Cavalli-Sforza and A. W. F. Edwards introduced novel methods for computing evolutionary trees from genetical data, initially for human populations from blood-group gene frequencies. The most important development was their introduction of statistical methods of estimation applied to stochastic models of evolution.

  15. What Price Statistical Tables Now?

    ERIC Educational Resources Information Center

    Hunt, Neville

    1997-01-01

    Describes the generation of all the tables required for school-level study of statistics using Microsoft's Excel spreadsheet package. Highlights cumulative binomial probabilities, cumulative Poisson probabilities, normal distribution, t-distribution, chi-squared distribution, F-distribution, random numbers, and accuracy. (JRH)
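
The same school-level tables can be generated in a few lines of any language with a factorial and an error function. A hedged sketch (these are standard textbook formulas, not the article's Excel worksheets):

```python
import math

def binom_cdf(k, n, p):
    """Cumulative binomial probability P(X <= k) for X ~ Bin(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))

def poisson_cdf(k, lam):
    """Cumulative Poisson probability P(X <= k) for X ~ Poisson(lam)."""
    return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k + 1))

def normal_cdf(z):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

print(round(binom_cdf(2, 10, 0.5), 6))  # 0.054688
print(round(poisson_cdf(3, 2.0), 6))    # 0.857123
print(round(normal_cdf(1.96), 4))       # 0.975
```

Tabulating these functions over a grid of arguments reproduces the cumulative binomial, cumulative Poisson, and normal tables the article generates in Excel.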

  16. Book Trade Research and Statistics.

    ERIC Educational Resources Information Center

    Bosch, Stephen; Ink, Gary; Greco, Albert N.

    1999-01-01

    Presents: "Prices of United States and Foreign Published Materials"; "Book Title Output and Average Prices"; "Book Sales Statistics, 1998"; "United States Book Exports and Imports: 1998"; "International Book Title Output: 1990-96"; "Number of Book Outlets in the United States and Canada";…

  17. American Youth: A Statistical Snapshot.

    ERIC Educational Resources Information Center

    Wetzel, James R.

    This document presents a statistical snapshot of young people, aged 15 to 24 years. It provides a broad overview of trends documenting the direction of changes in social behavior and economic circumstances. The projected decline in the total number of youth from 43 million in 1980 to 35 million in 1995 will affect marriage and childbearing…

  18. Gender Issues in Labour Statistics.

    ERIC Educational Resources Information Center

    Greenwood, Adriana Mata

    1999-01-01

    Presents the main features needed for labor statistics to reflect the respective situations for women and men in the labor market. Identifies topics to be covered and detail needed for significant distinctions to emerge. Explains how the choice of measurement method and data presentation can influence the final result. (Author/JOW)

  19. The Statistical Handbook on Technology.

    ERIC Educational Resources Information Center

    Berinstein, Paula

    This volume tells stories about the tools we use, but these narratives are told in numbers rather than in words. Organized by various aspects of society, each chapter uses tables and statistics to examine everything from budgets, costs, sales, trade, employment, and patents to prices, usage, access, and consumption. In each chapter, each major topic is…

  20. Discussion on Statistics Teaching Management

    ERIC Educational Resources Information Center

    Wu, Qingjun

    2008-01-01

    Teaching management requires the reasonable deployment of all kinds of essential teaching factors in the teaching process to promote students' comprehensive and harmonious development. Having analyzed problems that appear frequently in the statistics teaching management of colleges, the article identifies their causes, according to which this article…

  1. Central Statistical Libraries in Europe.

    ERIC Educational Resources Information Center

    Kaiser, Lisa

    The paper tries to clarify the position special governmental libraries hold in the system of libraries of today by investigating only one specific type of library mainly from a formal and historical point of view. Central statistical libraries in Europe were first regarded as administrative and archival libraries. Their early holdings of foreign…

  2. Instructional Theory for Teaching Statistics.

    ERIC Educational Resources Information Center

    Atwood, Jan R.; Dinham, Sarah M.

    Metatheoretical analysis of Ausubel's Theory of Meaningful Verbal Learning and Gagne's Theory of Instruction using the Dickoff and James paradigm produced two instructional systems for basic statistics. The systems were tested with a pretest-posttest control group design utilizing students enrolled in an introductory-level graduate statistics…

  3. Statistics of premixed flame cells

    NASA Technical Reports Server (NTRS)

    Noever, David A.

    1991-01-01

    The statistics of random cellular patterns in premixed flames are analyzed. Agreement is found with a variety of topological relations previously found for other networks, namely, Lewis's law and Aboav's law. Despite the diverse underlying physics, flame cells are shown to share a broad class of geometric properties with other random networks: metal grains, soap foams, bioconvection, and Langmuir monolayers.

  4. Statistics of premixed flame cells

    SciTech Connect

    Noever, D.A. )

    1991-07-15

    The statistics of random cellular patterns in premixed flames are analyzed. Agreement is found with a variety of topological relations previously found for other networks, namely, Lewis's law and Aboav's law. Despite the diverse underlying physics, flame cells are shown to share a broad class of geometric properties with other random networks: metal grains, soap foams, bioconvection, and Langmuir monolayers.

  5. A Simple Statistical Thermodynamics Experiment

    ERIC Educational Resources Information Center

    LoPresto, Michael C.

    2010-01-01

    Comparing the predicted and actual rolls of combinations of both two and three dice can help to introduce many of the basic concepts of statistical thermodynamics, including multiplicity, probability, microstates, and macrostates, and demonstrate that entropy is indeed a measure of randomness, that disordered states (those of higher entropy) are…
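
The dice experiment is easy to reproduce by brute-force enumeration of microstates (ordered rolls) per macrostate (total shown). A small sketch (the function names are mine, not the article's):

```python
from itertools import product
from collections import Counter

def macrostate_table(num_dice):
    """Map each macrostate (total) to its multiplicity (number of
    microstates, i.e. ordered rolls) and its probability."""
    totals = Counter(sum(roll) for roll in product(range(1, 7), repeat=num_dice))
    microstates = 6 ** num_dice  # total number of equally likely microstates
    return {t: (m, m / microstates) for t, m in sorted(totals.items())}

two = macrostate_table(2)
print(two[7][0])   # 6: the total 7 has the most microstates of any two-dice sum
three = macrostate_table(3)
print(max(three, key=lambda t: three[t][0]))  # 10 (11 ties, also 27 microstates)
```

The entropy connection is that high-multiplicity macrostates (a total of 7 with two dice) are the "disordered" ones: many microstates realize them, so they occur most often.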

  6. Statistical description of tectonic motions

    NASA Technical Reports Server (NTRS)

    Agnew, Duncan Carr

    1993-01-01

    This report summarizes investigations regarding tectonic motions. The topics discussed include statistics of crustal deformation, Earth rotation studies, using multitaper spectrum analysis techniques applied to both space-geodetic data and conventional astrometric estimates of the Earth's polar motion, and the development, design, and installation of high-stability geodetic monuments for use with the global positioning system.

  7. Tsallis statistics and neurodegenerative disorders

    NASA Astrophysics Data System (ADS)

    Iliopoulos, Aggelos C.; Tsolaki, Magdalini; Aifantis, Elias C.

    2016-08-01

    In this paper, we perform statistical analysis of time series deriving from four neurodegenerative disorders, namely epilepsy, amyotrophic lateral sclerosis (ALS), Parkinson's disease (PD), and Huntington's disease (HD). The time series comprise electroencephalograms (EEGs) of healthy and epileptic states, as well as gait dynamics (in particular stride intervals) of ALS, PD and HD patients. We study data concerning one subject for each neurodegenerative disorder and one healthy control. The analysis is based on Tsallis non-extensive statistical mechanics and in particular on the estimation of the Tsallis q-triplet, namely {qstat, qsen, qrel}. The deviation of the Tsallis q-triplet from unity indicates non-Gaussian statistics and long-range dependencies for all time series considered. In addition, the results reveal the efficiency of Tsallis statistics in capturing differences in brain dynamics between healthy and epileptic states, as well as differences between ALS, PD, and HD patients and healthy control subjects. The results indicate that estimates of the Tsallis q-indices could be used as possible biomarkers, along with others, for improving classification and prediction of epileptic seizures, as well as for studying the complex gait dynamics of various diseases, providing new insights into severity, medications and fall risk, and improving therapeutic interventions.

  8. GPS: Geometry, Probability, and Statistics

    ERIC Educational Resources Information Center

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was, say, fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  9. Measuring Skewness: A Forgotten Statistic?

    ERIC Educational Resources Information Center

    Doane, David P.; Seward, Lori E.

    2011-01-01

    This paper discusses common approaches to presenting the topic of skewness in the classroom, and explains why students need to know how to measure it. Two skewness statistics are examined: the Fisher-Pearson standardized third moment coefficient, and the Pearson 2 coefficient that compares the mean and median. The former is reported in statistical…
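
The two coefficients the paper compares can be computed directly. A minimal sketch (the sample data are invented for illustration):

```python
import statistics

def fisher_pearson_g1(xs):
    """Fisher-Pearson standardized third moment coefficient g1 = m3 / m2**1.5."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n  # second central moment
    m3 = sum((x - mean) ** 3 for x in xs) / n  # third central moment
    return m3 / m2 ** 1.5

def pearson_2(xs):
    """Pearson 2 skewness coefficient: 3 * (mean - median) / sample stdev."""
    mean = sum(xs) / len(xs)
    return 3 * (mean - statistics.median(xs)) / statistics.stdev(xs)

right_skewed = [1, 1, 2, 2, 3, 3, 4, 9]  # a long right tail
print(fisher_pearson_g1(right_skewed) > 0)  # True
print(pearson_2(right_skewed) > 0)          # True: mean pulled above the median
```

Both coefficients agree in sign here; they can disagree on odd distributions, which is part of why the paper argues students should understand what each one measures.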

  10. Teaching Statistics through Learning Projects

    ERIC Educational Resources Information Center

    Moreira da Silva, Mauren Porciúncula; Pinto, Suzi Samá

    2014-01-01

    This paper aims to reflect on the teaching of statistics through student research, in the form of projects carried out by students on self-selected topics. The paper reports on a study carried out with two undergraduate classes using a methodology of teaching that we call "learning projects." Monitoring the development of the various…

  11. Statistical Prediction in Proprietary Rehabilitation.

    ERIC Educational Resources Information Center

    Johnson, Kurt L.; And Others

    1987-01-01

    Applied statistical methods to predict case expenditures for low back pain rehabilitation cases in proprietary rehabilitation. Extracted predictor variables from case records of 175 workers compensation claimants with some degree of permanent disability due to back injury. Performed several multiple regression analyses resulting in a formula that…

  12. Statistics by Example, Detecting Patterns.

    ERIC Educational Resources Information Center

    Mosteller, Frederick; And Others

    This booklet is part of a series of four pamphlets, each intended to stand alone, which provide problems in probability and statistics at the secondary school level. Twelve different real-life examples (written by professional statisticians and experienced teachers) have been collected in this booklet to illustrate the ideas of mean, variation,…

  13. Statistical properties of record-breaking temperatures.

    PubMed

    Newman, William I; Malamud, Bruce D; Turcotte, Donald L

    2010-12-01

    A record-breaking temperature is the highest or lowest temperature at a station since the period of time considered began. The temperatures at a station constitute a time series. After the removal of daily and annual periodicities, the primary considerations are trends (i.e., global warming) and long-range correlations. We first carry out Monte Carlo simulations to determine the influence of trends and long-range correlations on record-breaking statistics. We take a time series that is a Gaussian white noise and give the classic record-breaking theory results for an independent and identically distributed process. We then carry out simulations to determine the influence of long-range correlations and linear temperature trends. For the range of fractional Gaussian noises that are observed to be applicable to temperature time series, the influence on the record-breaking statistics is less than 10%. We next superimpose a linear trend on a Gaussian white noise and extend the theory to include the effect of an additive trend. We determine the ratios of the number of maximum to the number of minimum record-breaking temperatures. We find the single governing parameter to be the ratio of the temperature change per year to the standard deviation of the underlying white noise. To test our approach, we consider a 30 yr record of temperatures at the Mauna Loa Observatory for 1977-2006. We determine the temperature trends by direct measurements and use our simulations to infer trends from the number of record-breaking temperatures. The two approaches give values that are in good agreement. We find that the warming trend is primarily due to an increase in the (overnight) minimum temperatures, while the maximum (daytime) temperatures are approximately constant.
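
The i.i.d. baseline used here, an expected H_n = 1 + 1/2 + ... + 1/n record maxima in n observations, is easy to verify by simulation, as is the effect of an additive trend. A hedged sketch (the trend size, trial counts, and seed are illustrative, not the paper's values):

```python
import random

def count_records(series):
    """Count record-breaking maxima in a time series."""
    records, best = 0, float("-inf")
    for x in series:
        if x > best:
            records, best = records + 1, x
    return records

def mean_records(n, trials, trend=0.0):
    """Average number of record maxima over Monte Carlo trials of Gaussian
    white noise plus an optional linear trend per time step."""
    rng = random.Random(42)  # fixed seed for reproducibility
    total = 0
    for _ in range(trials):
        total += count_records([rng.gauss(0.0, 1.0) + trend * t for t in range(n)])
    return total / trials

n = 100
harmonic = sum(1.0 / k for k in range(1, n + 1))  # i.i.d. theory: E[records] = H_n
print(round(harmonic, 2))               # 5.19
print(round(mean_records(n, 2000), 2))  # close to H_n when there is no trend
print(mean_records(n, 2000, trend=0.05) > harmonic)  # a warming trend adds records
```

The governing parameter the paper identifies, temperature change per year over the noise standard deviation, is exactly the `trend` argument here (since the noise has unit standard deviation).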

  14. Credibility of statistical downscaling under nonstationary climate

    NASA Astrophysics Data System (ADS)

    Salvi, Kaustubh; Ghosh, Subimal; Ganguly, Auroop R.

    2016-03-01

    Statistical downscaling (SD) establishes empirical relationships between coarse-resolution climate model simulations and the higher-resolution climate variables of interest to stakeholders. These statistical relations are estimated from historical observations at the finer resolutions and used for future projections. The implicit assumption is that the SD relations extracted from data are stationary, i.e., remain unaltered despite nonstationary change in climate. The validity of this assumption relates directly to the credibility of SD. Falsifiability of climate projections is a challenging proposition. Calibration and verification, while necessary for SD, are unlikely to reproduce the full range of behavior that could manifest at decadal to century-scale lead times. We propose a design-of-experiments (DOE) strategy to assess SD performance under nonstationary climate and evaluate the strategy via a transfer-function based SD approach. The strategy relies on selecting calibration and validation periods such that they represent contrasting climatic conditions, like hot-versus-cold and ENSO-versus-non-ENSO years. The underlying assumption is that conditions such as warming or predominance of El Niño may be more prevalent under climate change. In addition, two different historical time periods are identified, which resemble pre-industrial conditions and the most severe future emissions scenarios. The ability of the empirical relations to generalize under these proxy conditions is considered an indicator of their performance under future nonstationarity. Case studies over two climatologically disjoint study regions, specifically India and the Northeast United States, reveal the robustness of DOE in identifying the locations where nonstationarity prevails as well as the role of effective predictor selection under nonstationarity.

  15. Photon Counts Statistics in Leukocyte Cell Dynamics

    NASA Astrophysics Data System (ADS)

    van Wijk, Eduard; van der Greef, Jan; van Wijk, Roeland

    2011-12-01

    In the present experiment, ultra-weak photon emission/chemiluminescence from isolated neutrophils was recorded. It is associated with the production of reactive oxygen species (ROS) in the "respiratory burst" process, which can be activated by PMA (phorbol 12-myristate 13-acetate). Commonly, the reaction is demonstrated utilizing the enhancer luminol. However, with the use of highly sensitive photomultiplier equipment it can also be recorded without enhancer. In that case, it can be hypothesized that photon count statistics may assist in understanding the underlying metabolic activity and cooperation of these cells. To study this hypothesis, leukocytes were stimulated with PMA and the increased photon signals were recorded in the quasi-stable period utilizing Fano factor analysis at different window sizes. The Fano factor is defined as the variance over the mean of the number of photons within the observation time. The analysis demonstrated that the Fano factor of the true signal, but not of surrogate signals obtained by random shuffling, increases as the window size increases. It is concluded that photon count statistics, in particular Fano factor analysis, provide information regarding leukocyte interactions. This opens the perspective of utilizing this analytical procedure in (in vivo) inflammation research; however, this needs further validation.
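
The windowed Fano factor analysis can be sketched on a synthetic photon stream. In this illustration a memoryless Poisson process stands in for the photomultiplier signal, so the Fano factor should stay near 1 at every window size; correlated emission of the kind the experiment probes would drift away from 1 as windows grow (the rates, duration, and seed are illustrative):

```python
import random

def fano_factor(counts):
    """Fano factor: variance over mean of the number of events per window."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return var / mean

def windowed_counts(events, window, t_max):
    """Tally event times into consecutive windows of the given size."""
    counts = [0] * int(t_max / window)
    for t in events:
        idx = int(t / window)
        if idx < len(counts):
            counts[idx] += 1
    return counts

# Simulate a memoryless photon stream via exponential inter-arrival times.
rng = random.Random(7)
t, events = 0.0, []
while t < 10_000.0:
    t += rng.expovariate(2.0)  # mean rate: 2 events per unit time
    events.append(t)

for window in (1.0, 10.0, 100.0):
    print(round(fano_factor(windowed_counts(events, window, 10_000.0)), 2))
```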

  16. Statistical tests for prediction of lignite quality

    SciTech Connect

    C.J. Kolovos

    2007-06-15

    Domestic lignite from large open pit mines based on bucket wheel excavators is the main fuel for electricity generation in Greece. Lignite from one or more mines may arrive at any power plant stockyard. The mixture obtained constitutes the lignite fuel fed to the power plant. The fuel is sampled at regular time intervals, and these samples are treated as observations of spatial random variables. The aim was to form and statistically test many small sample populations. Statistical tests on the values of the humidity content, the ash-water-free content, and the lower heating value of the lignite fuel indicated that the sample values form a normal population. The Kolmogorov-Smirnov test was applied to test the goodness-of-fit of the sample distributions over a three-year period and different power plants of the Kozani-Ptolemais area, western Macedonia, Greece. The normal distribution hypothesis can be widely accepted for forecasting the distribution of values of the basic quality characteristics, even for a small number of samples.
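
A one-sample Kolmogorov-Smirnov check of the kind applied to the fuel samples can be sketched as follows. The data here are simulated stand-ins (the real quality measurements are not reproduced in the abstract), and the 1.36/sqrt(n) threshold is the standard large-sample 5% critical value, which is conservative when mu and sigma are fitted from the same sample:

```python
import math
import random

def normal_cdf(x, mu, sigma):
    """CDF of Normal(mu, sigma), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def ks_statistic(sample, mu, sigma):
    """One-sample Kolmogorov-Smirnov statistic D_n against Normal(mu, sigma):
    the largest gap between the empirical and hypothesized CDFs."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = normal_cdf(x, mu, sigma)
        d = max(d, f - i / n, (i + 1) / n - f)
    return d

# Hypothetical lower-heating-value sample; fit mu and sigma from the data.
rng = random.Random(1)
sample = [rng.gauss(5.5, 0.4) for _ in range(40)]
mu = sum(sample) / len(sample)
sigma = (sum((x - mu) ** 2 for x in sample) / (len(sample) - 1)) ** 0.5

d = ks_statistic(sample, mu, sigma)
critical = 1.36 / math.sqrt(len(sample))  # approximate 5% critical value
print(round(d, 3), round(critical, 3), d < critical)
```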

  17. Statistical pattern recognition for rock joint images

    NASA Astrophysics Data System (ADS)

    Wang, Weixing; Bin, Cui

    2005-10-01

    As a cooperation project between Sweden and China, we sampled a number of rock specimens to analyze rock fracture networks by optical imaging techniques. The samples were resin-injected so that open fractures can be seen clearly under UV (ultraviolet) illumination. Recognition of rock fractures is crucial in many rock engineering applications. To successfully apply automatic image processing to automatic (or semi-automatic) rock fracture detection and description, the key (and hardest) task is to detect fractures robustly in images. When statistical pattern recognition is used to segment a rock joint color image, features of different samples are learned first; then each pixel of the image is classified using these features. Test results show that an attribute rock fracture image is segmented satisfactorily in this way, and the method can be widely applied to other complicated images. In this paper, Kernel Fisher discrimination (KFD) is employed to construct a statistical pattern recognition classifier. KFD transforms nonlinear discrimination in a high-dimensional attribute space into linear discrimination in a low-dimensional feature space, and one need not know the detailed mapping from attribute space to feature space in the transformation. Segmentation of complicated rock joint color images shows that this method performs well.

  18. Supratransmission in a disordered nonlinear periodic structure

    NASA Astrophysics Data System (ADS)

    Yousefzadeh, B.; Phani, A. Srikantha

    2016-10-01

    We study the interaction among dispersion, nonlinearity, and disorder effects in the context of wave transmission through a discrete periodic structure, subjected to continuous harmonic excitation in its stop band. We consider a damped nonlinear periodic structure of finite length with disorder. Disorder is introduced throughout the structure by small changes in the stiffness parameters drawn from a uniform statistical distribution. Dispersion effects forbid wave transmission within the stop band of the linear periodic structure. However, nonlinearity leads to supratransmission phenomenon, by which enhanced wave transmission occurs within the stop band of the periodic structure when forced at an amplitude exceeding a certain threshold. The frequency components of the transmitted waves lie within the pass band of the linear structure, where disorder is known to cause Anderson localization. There is therefore a competition between dispersion, nonlinearity, and disorder in the context of supratransmission. We show that supratransmission persists in the presence of disorder. The influence of disorder decreases in general as the forcing frequency moves away from the pass band edge, reminiscent of dispersion effects subsuming disorder effects in linear periodic structures. We compute the dependence of the supratransmission force threshold on nonlinearity and strength of coupling between units. We observe that nonlinear forces are confined to the driven unit for weakly coupled systems. This observation, together with the truncation of higher-order nonlinear terms, permits us to develop closed-form expressions for the supratransmission force threshold. In sum, in the frequency range studied here, disorder does not influence the supratransmission force threshold in the ensemble-average sense, but it does reduce the average transmitted wave energy.

  19. Ideas for Effective Communication of Statistical Results

    DOE PAGES

    Anderson-Cook, Christine M.

    2015-03-01

    Effective presentation of statistical results to those with less statistical training, including managers and decision-makers, requires planning, anticipation and thoughtful delivery. Here are several recommendations for presenting statistical results effectively.

  20. A Perspective on Teaching Elementary Statistics.

    ERIC Educational Resources Information Center

    Wainwright, Barbara A.; Austin, Homer W.

    1997-01-01

    Shares the perspectives of two instructors of elementary statistics at the college level. Describes a course developed to increase statistics learning and student motivation to learn statistics by introducing writing into the course content. (DDR)

  1. Experiencing the Research Process in a Single Class Period

    ERIC Educational Resources Information Center

    Cook, Kathleen E.

    2008-01-01

    Books and courses on research methods, statistics, or both, often necessarily focus on one topic at a time. This compartmentalized approach prevents students from seeing the big picture. To address this shortcoming, I developed an exercise through which students experience the whole research process in a single class period. From posing a…

  2. An Analysis of P-3 Aircraft Service Period Adjustment Criteria

    DTIC Science & Technology

    1986-12-01

    A. ASPA (Aircraft Service Period Adjustment) ... B. A Review of Analysis Methods ... 1. Delphi ... uncertainty. A brief review of their procedures, advantages, and disadvantages is helpful to justify selecting the most appropriate method. 1. Delphi Technique: The Delphi Technique is a method of statistically refining the opinions of a group of experts or especially knowledgeable personnel. The

  3. From Periodic Properties to a Periodic Table Arrangement

    ERIC Educational Resources Information Center

    Besalú, Emili

    2013-01-01

    A periodic table is constructed from the consideration of periodic properties and the application of the principal components analysis technique. This procedure is useful for object classification and data reduction and has been used in the field of chemistry for many applications, such as lanthanide, molecule, or conformer classification.…

  4. 24 CFR 203.266 - Period covered by periodic MIP.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... HOUSING AND URBAN DEVELOPMENT MORTGAGE AND LOAN INSURANCE PROGRAMS UNDER NATIONAL HOUSING ACT AND OTHER... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Period covered by periodic MIP. 203.266 Section 203.266 Housing and Urban Development Regulations Relating to Housing and...

  9. Societal Statistics by virtue of the Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2012-09-01

    The Drake equation, first proposed by Frank D. Drake in 1961, is the foundational equation of SETI. It yields an estimate of the number N of extraterrestrial communicating civilizations in the Galaxy given by the product N=Ns×fp×ne×fl×fi×fc×fL, where: Ns is the number of stars in the Milky Way Galaxy; fp is the fraction of stars that have planetary systems; ne is the number of planets in a given system that are ecologically suitable for life; fl is the fraction of otherwise suitable planets on which life actually arises; fi is the fraction of inhabited planets on which an intelligent form of life evolves; fc is the fraction of planets inhabited by intelligent beings on which a communicative technical civilization develops; and fL is the fraction of planetary lifetime graced by a technical civilization. The first three terms may be called "the astrophysical terms" in the Drake equation since their numerical value is provided by astrophysical considerations. The fourth term, fl, may be called "the origin-of-life term" and entails biology. The last three terms may be called "the societal terms" inasmuch as their respective numerical values are provided by anthropology, telecommunication science and "futuristic science", respectively. In this paper, we seek to provide a statistical estimate of the three societal terms in the Drake equation basing our calculations on the Statistical Drake Equation first proposed by this author at the 2008 IAC. In that paper the author extended the simple 7-factor product so as to embody Statistics. He proved that, no matter which probability distribution may be assigned to each factor, if the number of factors tends to infinity, then the random variable N follows the lognormal distribution (central limit theorem of Statistics). This author also proved at the 2009 IAC that the Dole (1964) [7] equation, yielding the number of Habitable Planets for Man in the Galaxy, has the same mathematical structure as the Drake equation. So the
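    The lognormal limit invoked above is just the central limit theorem applied to log N. A minimal sketch (the per-factor distributions are arbitrary uniform placeholders, not Maccone's actual inputs): the skewness of log N shrinks as the number of factors grows, so N approaches a lognormal distribution.

```python
import math
import random

random.seed(7)

def log_drake_sample(n_factors):
    # log N is the sum of the logs of independent positive factors.
    # Uniform(0.1, 10) is an arbitrary placeholder distribution for each
    # factor, not Maccone's actual astrophysical/societal inputs.
    return sum(math.log(random.uniform(0.1, 10.0)) for _ in range(n_factors))

def skewness(xs):
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / n
    return sum((x - m) ** 3 for x in xs) / (n * var ** 1.5)

# As the number of factors grows, the skewness of log N shrinks toward
# zero (central limit theorem), i.e. N approaches a lognormal distribution.
skews = [abs(skewness([log_drake_sample(k) for _ in range(20000)]))
         for k in (1, 7, 49)]
print([round(s, 2) for s in skews])
```

    The monotone decrease in |skewness| is the point: the result does not depend on which distribution is assigned to each factor, only on the factors being independent and positive.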

  10. Periodic cometary showers: Real or imaginary?

    NASA Technical Reports Server (NTRS)

    Grieve, R. A. F.; Sharpton, V. L.; Goodacre, A. K.; Garvin, J. B.

    1985-01-01

    Since the initial reports in 1980, a considerable body of chemical and physical evidence has been accumulated to indicate that a major impact event occurred on earth 65 million years ago. The effects of this event were global in extent and have been suggested as the cause of the sudden demise or mass extinction of a large percentage of life, including the dinosaurs, at the end of the geologic time period known as the Cretaceous. Recent statistical analyses of extinctions in the marine faunal record for the last 250 million years have suggested that mass extinctions may occur with a periodicity of 26 to 30 million years. Following these results, other workers have attempted to demonstrate that these extinction events, like that at the end of the Cretaceous, are temporally correlated with large impact events. A recent scenario suggests that they are the result of periodic showers of comets produced either by the passage of the solar system through the galactic plane or by perturbations of the cometary cloud in the outer solar system by an as-yet-unseen solar companion. This hypothesized solar companion has been given the name Nemesis.

  11. Statistics of particle time-temperature histories

    SciTech Connect

    Hewson, John C.; Gin, Craig; Lignell, David O.; Sun, Guangyuan

    2013-10-01

    Progress toward predictions of the statistics of particle time-temperature histories is presented. These predictions are to be made using Lagrangian particle models within the one-dimensional turbulence (ODT) model. In the present reporting period we have further characterized the performance, behavior and capabilities of the particle dispersion models that were added to the ODT model in the first period. We have also extended the capabilities in two ways. First, we provide alternate implementations of the particle transport process within ODT; in this context the original implementation is referred to as the type-I interaction and the new implementations as the type-C and type-IC interactions. Second, we have developed and implemented models for two-way coupling between the particle and fluid phases. This allows us to predict the reduced rate of turbulent mixing associated with particle dissipation of energy and similar phenomena. Work in characterizing these capabilities has taken place in homogeneous decaying turbulence, in free shear layers, in jets, and in channel flow with walls, and selected results are presented.

  12. Role of CMEs in IMF winding statistics

    NASA Technical Reports Server (NTRS)

    Smith, Charles W.; Phillips, John L.

    1995-01-01

    Past studies of the spiral winding of the IMF have revealed an overwinding relative to the Parker prediction that is evident in both the omnitape and Pioneer-Venus Orbiter data sets. An asymmetry between the winding of the northern and southern hemispheres is also observed. Both results are seen to be persistent over many years and statistically significant, and both have implications for cosmic ray propagation in the heliosphere. There has been a suggestion in past analyses that the digression from the Parker prediction is greatest during times of heightened CME activity. We examine the possible role of CMEs in these past analyses by extracting CME observations from the ISEE-3 dataset and analyzing CME and undisturbed periods separately. We use the full ISEE-3 dataset representing the entire L1 mission (1978-1982). This coincides with a period in the solar cycle when CME activity was heightened. Preliminary results suggest that CMEs may be responsible for a significant portion of both the spiral angle overwinding and the asymmetry. Possible implications for the high latitude fields and cosmic ray propagation will be reviewed in light of this analysis.

  13. Statistical equilibrium of bubble oscillations in dilute bubbly flows

    PubMed Central

    Colonius, Tim; Hagmeijer, Rob; Ando, Keita; Brennen, Christopher E.

    2008-01-01

    The problem of predicting the moments of the distribution of bubble radius in bubbly flows is considered. The particular case where bubble oscillations occur due to a rapid (impulsive or step change) change in pressure is analyzed, and it is mathematically shown that in this case, inviscid bubble oscillations reach a stationary statistical equilibrium, whereby phase cancellations among bubbles with different sizes lead to time-invariant values of the statistics. It is also shown that at statistical equilibrium, moments of the bubble radius may be computed using the period-averaged bubble radius in place of the instantaneous one. For sufficiently broad distributions of bubble equilibrium (or initial) radius, it is demonstrated that bubble statistics reach equilibrium on a time scale that is fast compared to physical damping of bubble oscillations due to viscosity, heat transfer, and liquid compressibility. The period-averaged bubble radius may then be used to predict the slow changes in the moments caused by the damping. A benefit is that period averaging gives a much smoother integrand, and accurate statistics can be obtained by tracking as few as five bubbles from the broad distribution. The period-averaged formula may therefore prove useful in reducing computational effort in models of dilute bubbly flow wherein bubbles are forced by shock waves or other rapid pressure changes, for which, at present, the strong effects caused by a distribution in bubble size can only be accurately predicted by tracking thousands of bubbles. Some challenges associated with extending the results to more general (nonimpulsive) forcing and strong two-way coupled bubbly flows are briefly discussed. PMID:19547725
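    The phase-cancellation mechanism can be illustrated with a deliberately simplified linear model (not the paper's nonlinear bubble dynamics): an ensemble of undamped oscillators starts in phase after an impulsive forcing, but a broad spread of natural frequencies drives the ensemble mean to a stationary value with no physical damping at all.

```python
import math
import random

random.seed(0)

# Illustrative linear model (not the paper's bubble dynamics): 2000
# undamped oscillators start in phase after an impulsive forcing, but a
# broad spread of equilibrium sizes gives a broad spread of frequencies.
n = 2000
freqs = [random.uniform(1.0, 3.0) for _ in range(n)]
amp = 0.2

def ensemble_mean_radius(t):
    return sum(1.0 + amp * math.cos(w * t) for w in freqs) / n

early = ensemble_mean_radius(0.0)
late = [ensemble_mean_radius(t) for t in (50.0, 60.0, 70.0)]
# Phase cancellation: the coherent initial value decays to a stationary
# ensemble mean even though no viscous or thermal damping is present.
print(round(early, 2), [round(x, 3) for x in late])
```

    The ensemble mean starts at the coherent value 1.2 and settles near 1.0, mimicking how the moments of bubble radius become time-invariant at statistical equilibrium.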

  14. Statistical learning and selective inference

    PubMed Central

    Taylor, Jonathan; Tibshirani, Robert J.

    2015-01-01

    We describe the problem of “selective inference.” This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have “cherry-picked”—searched for the strongest associations—means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis. PMID:26100887
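    The cherry-picking problem can be made concrete with a small simulation: when fifty truly null associations are screened and only the strongest is tested at the usual 5% level, a false discovery is nearly certain. A minimal sketch (all numbers illustrative):

```python
import random

random.seed(1)

# 50 candidate "associations", all truly null: each yields a z-statistic of
# pure noise. Testing only the strongest at the usual two-sided 5% level
# (|z| > 1.96) produces false discoveries almost every time.
trials, m = 2000, 50
naive_hits = 0
for _ in range(trials):
    zs = [random.gauss(0, 1) for _ in range(m)]
    if max(abs(z) for z in zs) > 1.96:
        naive_hits += 1

false_positive_rate = naive_hits / trials   # theory: 1 - 0.95**50, about 0.92
print(round(false_positive_rate, 2))
```

    This is why a higher bar must be set after selection: the naive per-test threshold controls error only when the hypothesis was chosen before looking at the data.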

  15. Statistical learning and selective inference.

    PubMed

    Taylor, Jonathan; Tibshirani, Robert J

    2015-06-23

    We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.

  16. Statistical phenomena in particle beams

    SciTech Connect

    Bisognano, J.J.

    1984-09-01

    Particle beams are subject to a variety of apparently distinct statistical phenomena such as intrabeam scattering, stochastic cooling, electron cooling, coherent instabilities, and radiofrequency noise diffusion. In fact, both the physics and mathematical description of these mechanisms are quite similar, with the notion of correlation as a powerful unifying principle. In this presentation we will attempt to provide both a physical and a mathematical basis for understanding the wide range of statistical phenomena that have been discussed. In the course of this study the tools of the trade will be introduced, e.g., the Vlasov and Fokker-Planck equations, noise theory, correlation functions, and beam transfer functions. Although a major concern will be to provide equations for analyzing machine design, the primary goal is to introduce a basic set of physical concepts having a very broad range of applicability.

  17. Parallel contingency statistics with Titan.

    SciTech Connect

    Thompson, David C.; Pebay, Philippe Pierre

    2009-09-01

    This report summarizes existing statistical engines in VTK/Titan and presents the recently parallelized contingency statistics engine. It is a sequel to [PT08] and [BPRT09], which studied the parallel descriptive, correlative, multi-correlative, and principal component analysis engines. The ease of use of this new parallel engine is illustrated by means of C++ code snippets. Furthermore, this report justifies the design of these engines with parallel scalability in mind; however, the very nature of contingency tables prevents this new engine from exhibiting the optimal parallel speed-up that the aforementioned engines do. This report therefore discusses the design trade-offs we made and studies performance with up to 200 processors.

  18. Performance Measures For Statistical Segmentation

    NASA Astrophysics Data System (ADS)

    Shazeer, Dov J.

    1983-03-01

    Performance measures for statistical segmentation have been developed for a space-and-time critical Bayesian statistical tracker. They are intended to become an integral part of a knowledge-based tracking algorithm, which has been developed by RCA. The performance measures are serving to quantify the usefulness of the processed input, to assist in the identification of each tracking state and give its reliability, and to predict impending changes of state. They have been tested using stochastically generated target-background frames. Performance measure results have correlated well with the parameters which characterize the difference in the target and background distributions. A host of possible performance measures are discussed in relation to their strengths and weaknesses. Experimental results for the measures currently being employed by RCA are given, and areas for future research are indicated.

  19. Statistical mechanics and Lorentz violation

    NASA Astrophysics Data System (ADS)

    Colladay, Don; McDonald, Patrick

    2004-12-01

    The theory of statistical mechanics is studied in the presence of Lorentz-violating background fields. The analysis is performed using the Standard-Model Extension (SME) together with a Jaynesian formulation of statistical inference. Conventional laws of thermodynamics are obtained in the presence of a perturbed hamiltonian that contains the Lorentz-violating terms. As an example, properties of the nonrelativistic ideal gas are calculated in detail. To lowest order in Lorentz violation, the scalar thermodynamic variables are only corrected by a rotationally invariant combination of parameters that mimics a (frame dependent) effective mass. Spin-couplings can induce a temperature-independent polarization in the classical gas that is not present in the conventional case. Precision measurements in the residual expectation values of the magnetic moment of Fermi gases in the limit of high temperature may provide interesting limits on these parameters.

  20. Introduction to Statistically Designed Experiments

    SciTech Connect

    Heaney, Mike

    2016-09-13

    Statistically designed experiments can save researchers time and money by reducing the number of necessary experimental trials while yielding more conclusive experimental results. Surprisingly, many researchers are still not aware of this efficient and effective experimental methodology. As reported in a 2013 article from Chemical & Engineering News, there has been a resurgence of this methodology in recent years (http://cen.acs.org/articles/91/i13/Design-Experiments-Makes-Comeback.html?h=2027056365). This presentation will provide a brief introduction to statistically designed experiments. The main advantages will be reviewed along with some basic concepts such as factorial and fractional factorial designs. The recommended sequential approach to experiments will be introduced, and finally a case study will be presented to demonstrate this methodology.
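    The factorial and fractional factorial designs mentioned above are easy to enumerate. A minimal sketch generating a full 2^3 design and the half fraction defined by the relation I = ABC (a standard textbook construction, not specific to this presentation):

```python
import itertools

# Full 2^3 factorial: every combination of three two-level factors,
# coded -1 (low) and +1 (high).
full = list(itertools.product((-1, 1), repeat=3))

# Half fraction 2^(3-1) with defining relation I = ABC: keep the runs
# whose levels multiply to +1, halving the number of required trials.
half = [run for run in full if run[0] * run[1] * run[2] == 1]

print(len(full), len(half))
for run in half:
    print(run)
```

    The fraction trades resolution for economy: with four runs instead of eight, main effects become aliased with two-factor interactions, which is exactly the kind of trade-off the sequential approach manages.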

  1. Statistical description for survival data

    PubMed Central

    2016-01-01

    Statistical description is always the first step in data analysis. It gives the investigator a general impression of the data at hand. Traditionally, data are described by central tendency and deviation. However, this framework does not fit survival data (also termed time-to-event data). This data type contains two components: one is the survival time and the other is the status. Researchers are usually interested in the probability of the event at a given survival time point. The hazard function, cumulative hazard function and survival function are commonly used to describe survival data. The survival function can be estimated using the Kaplan-Meier estimator, which is also the default method in most statistical packages. Alternatively, the Nelson-Aalen estimator is available to estimate the survival function. Survival functions of subgroups can be compared using the log-rank test. Furthermore, the article also introduces how to describe time-to-event data with parametric modeling. PMID:27867953
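    The Kaplan-Meier (product-limit) estimator can be written directly from its definition. A minimal sketch on invented toy data, where each observation is a (time, event) pair and event = 0 marks censoring:

```python
# Kaplan-Meier (product-limit) estimate of the survival function from
# (time, event) pairs; event = 1 is an observed event, 0 is censoring.
# The data are invented toy values for illustration.
data = [(2, 1), (3, 1), (3, 0), (5, 1), (7, 0), (8, 1)]

def kaplan_meier(data):
    event_times = sorted({t for t, e in data if e == 1})
    estimate, s = [], 1.0
    for t in event_times:
        at_risk = sum(1 for ti, _ in data if ti >= t)
        deaths = sum(1 for ti, e in data if ti == t and e == 1)
        s *= 1.0 - deaths / at_risk   # product-limit step
        estimate.append((t, s))
    return estimate

for t, s in kaplan_meier(data):
    print(t, round(s, 3))
```

    Censored observations contribute to the risk set until their censoring time but never trigger a downward step, which is how the estimator uses the status component of the data.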

  2. Statistical Methods for Cardiovascular Researchers

    PubMed Central

    Moyé, Lem

    2016-01-01

    Rationale Biostatistics continues to play an essential role in contemporary cardiovascular investigations, but successful implementation of biostatistical methods can be complex. Objective To present the rationale behind statistical applications and to review useful tools for cardiology research. Methods and Results Prospective declaration of the research question, clear methodology, and study execution that adheres to the protocol together serve as the critical foundation of a research endeavor. Both parametric and distribution-free measures of central tendency and dispersion are presented. T-testing, analysis of variance, and regression analyses are reviewed. Survival analysis, logistic regression, and interim monitoring are also discussed. Finally, common weaknesses in statistical analyses are considered. Conclusion Biostatistics can be productively applied to cardiovascular research if investigators 1) develop and rely on a well-written protocol and analysis plan, 2) consult with a biostatistician when necessary, and 3) write results clearly, differentiating confirmatory from exploratory findings. PMID:26846639

  3. Inverse statistics and information content

    NASA Astrophysics Data System (ADS)

    Ebadi, H.; Bolgorian, Meysam; Jafari, G. R.

    2010-12-01

    Inverse statistics analysis studies the distribution of investment horizons needed to achieve a predefined level of return. The maximum of this distribution determines the most likely horizon for gaining a specific return. There exists a significant difference between the inverse statistics of financial market data and those of a fractional Brownian motion (fBm) as an uncorrelated time-series, which provides a suitable criterion for measuring the information content in financial data. In this paper we perform this analysis for the DJIA and S&P500 as two developed markets and the Tehran price index (TEPIX) as an emerging market. We also compare these probability distributions with the fBm probability in order to detect when the behavior of the stocks is the same as that of fBm.
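    The inverse-statistics construction can be sketched with a symmetric random walk standing in for the log-price (an illustrative toy, not the DJIA/S&P500/TEPIX analysis): for each trial, record the first-passage time to a fixed gain, then inspect the resulting horizon distribution.

```python
import random
from collections import Counter

random.seed(3)

# Toy inverse statistics: a symmetric random walk stands in for the
# log-price, and each trial records the first-passage time ("investment
# horizon") to a fixed gain of +5 unit steps, capped at 500 steps.
target, cap = 5, 500
horizons = []
for _ in range(5000):
    gain, t = 0, 0
    while gain < target and t < cap:
        gain += random.choice((-1, 1))
        t += 1
    if gain >= target:
        horizons.append(t)

# The horizon distribution is strongly right-skewed; its mode (the most
# likely horizon) sits close to, but above, the minimum possible value.
most_likely = Counter(horizons).most_common(1)[0][0]
print(most_likely, len(horizons))
```

    The heavy right tail of the horizon distribution, already visible in this uncorrelated toy, is what makes deviations from fBm-like behavior in real market data informative.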

  4. SHARE: Statistical hadronization with resonances

    NASA Astrophysics Data System (ADS)

    Torrieri, G.; Steinke, S.; Broniowski, W.; Florkowski, W.; Letessier, J.; Rafelski, J.

    2005-05-01

    SHARE is a collection of programs designed for the statistical analysis of particle production in relativistic heavy-ion collisions. With the physical input of intensive statistical parameters, it generates the ratios of particle abundances. The program includes cascade decays of all confirmed resonances from the Particle Data Tables. The complete treatment of these resonances has been known to be a crucial factor behind the success of the statistical approach. An optional feature implemented is the Breit-Wigner distribution for strong resonances. An interface for fitting the parameters of the model to the experimental data is provided.

    Program summary
    Title of the program: SHARE, October 2004, version 1.2
    Catalogue identifier: ADVD
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVD
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computer: PC, Pentium III, 512 MB RAM (not hardware dependent)
    Operating system: Linux: RedHat 6.1, 7.2, FEDORA, etc. (not system dependent)
    Programming language: FORTRAN77 (g77, f77), as well as Mathematica, ver. 4 or 5, for the case of full chemical equilibrium and particle widths set to zero
    Size of the package: 645 KB directory including example programs (87 KB compressed distribution archive)
    External routines: KERNLIB, MATHLIB and PACKLIB from the CERN Program Library (see http://cernlib.web.cern.ch for download and installation instructions)
    Distribution format: tar.gz
    Number of lines in distributed program, including test data, etc.: 15 277
    Number of bytes in distributed program, including test data, etc.: 88 522
    Computer: Any computer with an f77 compiler
    Nature of the physical problem: Statistical analysis of particle production in relativistic heavy-ion collisions involves the formation and the subsequent decays of a large number of resonances. With the physical input of thermal parameters, such as the temperature and fugacities, and considering cascading decays, along with weak

  5. Statistical Mechanics of Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Mori, H.; Hata, H.; Horita, T.; Kobayashi, T.

    A statistical-mechanical formalism of chaos based on the geometry of invariant sets in phase space is discussed to show that chaotic dynamical systems can be treated by a formalism analogous to that of thermodynamic systems if one takes a relevant coarse-grained quantity, but their statistical laws are quite different from those of thermodynamic systems. This is a generalization of statistical mechanics for dealing with dissipative and Hamiltonian (i.e., conservative) dynamical systems of a few degrees of freedom. Thus the sum of the local expansion rates of nearby orbits along a relevant orbit over a long but finite time has been introduced in order to describe and characterize (1) a drastic change of the structure of a chaotic attractor at a bifurcation and the associated anomalous phenomena, (2) a critical scaling of chaos in the neighborhood of a critical point for the bifurcation to a nonexotic state, and a self-similar temporal structure of a critical orbit on the critical 2^∞ attractor and the critical golden tori without mixing, (3) the critical KAM torus, diffusion and repeated sticking of a chaotic orbit to a critical torus in Hamiltonian systems. Here a q-phase transition, analogous to the ferromagnetic phase transition, plays an important role. These phenomena are illustrated numerically and theoretically by treating the driven damped pendulum, the driven Duffing equation, the Hénon map, and the dissipative and conservative standard maps. This description of chaos breaks the time-reversal symmetry of Hamiltonian dynamical laws analogously to statistical mechanics of irreversible processes. The broken time-reversal symmetry is brought about by the orbital instability of chaos.

  6. Introduction to Modern Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Chandler, David

    1987-09-01

    Leading physical chemist David Chandler takes a new approach to statistical mechanics to provide the only introductory-level work on the modern topics of renormalization group theory, Monte Carlo simulations, time correlation functions, and liquid structure. The author provides compact summaries of the fundamentals of this branch of physics and discussions of many of its traditional elementary applications, interspersed with over 150 exercises and microcomputer programs.

  7. [Pro Familia statistics for 1974].

    PubMed

    1975-09-01

    Statistics for 1974 for the West German family planning organization Pro Familia are reported. 56 offices are now operating, and 23,726 clients were seen. Men were seen more frequently than previously. 10,000 telephone calls were also handled. 16-25 year olds were increasingly represented in the clientele, as were unmarried persons of all ages. 1,242 patients were referred to physicians or clinics for clinical diagnosis.

  8. Measuring Fractional Statistics with Fabry-Perot Quantum Hall Interferometers

    NASA Astrophysics Data System (ADS)

    Goldman, Vladimir J.

    2008-03-01

    Laughlin quasiparticles are the elementary excitations of a highly-correlated fractional quantum Hall electron fluid. They have fractional charge and obey fractional statistics. The quasiparticles can propagate quantum-coherently in chiral edge channels, and constructively or destructively interfere. Unlike electrons, the interference condition for Laughlin quasiparticles has a non-vanishing statistical contribution that can be observed experimentally. Two kinds of interferometer devices have been realized. In the primary-filling interferometer, the entire device has filling 1/3, and the e/3 edge channel quasiparticles encircle identical e/3 island quasiparticles. Here the flux period is h/e, same as for electrons, but the back-gate charge period is e/3. In the second kind of interferometer, a lower density edge channel at filling 1/3 forms around a higher density island at filling 2/5, so that e/3 edge quasiparticles encircle e/5 island quasiparticles. Here we observe superperiodic oscillations with 5h/e flux and 2e charge periods, both corresponding to excitation of ten island quasiparticles. These periods can be understood as imposed by the anyonic braiding statistics of Laughlin quasiparticles. This work was done in collaboration with Fernando E. Camino, Ping Lin and Wei Zhou.

  9. The natural statistics of blur

    PubMed Central

    Sprague, William W.; Cooper, Emily A.; Reissier, Sylvain; Yellapragada, Baladitya; Banks, Martin S.

    2016-01-01

    Blur from defocus can be both useful and detrimental for visual perception: It can be useful as a source of depth information and detrimental because it degrades image quality. We examined these aspects of blur by measuring the natural statistics of defocus blur across the visual field. Participants wore an eye-and-scene tracker that measured gaze direction, pupil diameter, and scene distances as they performed everyday tasks. We found that blur magnitude increases with increasing eccentricity. There is a vertical gradient in the distances that generate defocus blur: Blur below the fovea is generally due to scene points nearer than fixation; blur above the fovea is mostly due to points farther than fixation. There is no systematic horizontal gradient. Large blurs are generally caused by points farther rather than nearer than fixation. Consistent with the statistics, participants in a perceptual experiment perceived vertical blur gradients as slanted top-back whereas horizontal gradients were perceived equally as left-back and right-back. The tendency for people to see sharp as near and blurred as far is also consistent with the observed statistics. We calculated how many observations will be perceived as unsharp and found that perceptible blur is rare. Finally, we found that eye shape in ground-dwelling animals conforms to that required to put likely distances in best focus. PMID:27580043

  10. Statistical inference and string theory

    NASA Astrophysics Data System (ADS)

    Heckman, Jonathan J.

    2015-09-01

    In this paper, we expose some surprising connections between string theory and statistical inference. We consider a large collective of agents sweeping out a family of nearby statistical models for an M-dimensional manifold of statistical fitting parameters. When the agents making nearby inferences align along a d-dimensional grid, we find that the pooled probability that the collective reaches a correct inference is the partition function of a nonlinear sigma model in d dimensions. Stability under perturbations to the original inference scheme requires the agents of the collective to distribute along two dimensions. Conformal invariance of the sigma model corresponds to the condition of a stable inference scheme, directly leading to the Einstein field equations for classical gravity. By summing over all possible arrangements of the agents in the collective, we reach a string theory. We also use this perspective to quantify how much an observer can hope to learn about the internal geometry of a superstring compactification. Finally, we present some brief speculative remarks on applications to the AdS/CFT correspondence and Lorentzian signature space-times.

  11. Statistical mechanics of economics I

    NASA Astrophysics Data System (ADS)

    Kusmartsev, F. V.

    2011-02-01

    We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical interference. This results in a probability description of the market and gives capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied the relations found to a study of the evolution of the US economy between 1996 and 2008 and observe that over that time the income of the majority of the population is well described by the Bose-Einstein distribution, whose parameters differ for each year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates the past crises and predicts the future one.

  12. The Reverse Statistical Disclosure Attack

    NASA Astrophysics Data System (ADS)

    Mallesh, Nayantara; Wright, Matthew

    Statistical disclosure is a well-studied technique that an attacker can use to uncover relations between users in mix-based anonymity systems. Prior work has focused on finding the receivers to whom a given targeted user sends. In this paper, we investigate the effectiveness of statistical disclosure in finding all of a user's contacts, including those from whom she receives messages. To this end, we propose a new attack called the Reverse Statistical Disclosure Attack (RSDA). RSDA uses observations of all users' sending patterns to estimate both the targeted user's sending pattern and her receiving pattern. The estimated patterns are combined to find a set of the targeted user's most likely contacts. We study the performance of RSDA in simulation using different mix network configurations and also study the effectiveness of cover traffic as a countermeasure. Our results show that RSDA outperforms the traditional SDA in finding the user's contacts, particularly as the amounts of user traffic and cover traffic rise.
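    The flavor of these attacks can be conveyed by a toy version of the classic statistical disclosure attack that RSDA extends; the receiver population, round counts, and contact set below are all invented for illustration.

```python
import random
from collections import Counter

random.seed(2)

# Toy statistical disclosure attack: in each mix round, 9 background
# senders pick receivers uniformly; in "target" rounds the targeted user
# additionally sends one message to one of her (unknown) contacts.
receivers = list(range(20))
target_contacts = {3, 7}     # ground truth the attacker tries to recover
n_background = 9             # background senders per round

def one_round(with_target):
    batch = Counter(random.choice(receivers) for _ in range(n_background))
    if with_target:
        batch[random.choice(sorted(target_contacts))] += 1
    return batch

def mean_profile(rounds):
    total = Counter()
    for batch in rounds:
        total.update(batch)
    return {r: total[r] / len(rounds) for r in receivers}

target_rounds = [one_round(True) for _ in range(3000)]
background_rounds = [one_round(False) for _ in range(3000)]

# Subtracting the background profile from the target-round profile leaves
# the largest residuals at the target's likely contacts.
bt, bg = mean_profile(target_rounds), mean_profile(background_rounds)
top2 = set(sorted(receivers, key=lambda r: bt[r] - bg[r], reverse=True)[:2])
print(sorted(top2))
```

    With enough observed rounds the residual signal dominates the sampling noise, which is why cover traffic (inflating the background) is the natural countermeasure studied in the paper.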

  13. Statistically significant relational data mining

    SciTech Connect

    Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.

    2014-02-01

    This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model; statisticians favor such models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.

  14. Statistical properties of DNA sequences

    NASA Technical Reports Server (NTRS)

    Peng, C. K.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Mantegna, R. N.; Simons, M.; Stanley, H. E.

    1995-01-01

    We review evidence supporting the idea that the DNA sequence in genes containing non-coding regions is correlated, and that the correlation is remarkably long range--indeed, nucleotides thousands of base pairs distant are correlated. We do not find such a long-range correlation in the coding regions of the gene. We resolve the problem of the "non-stationarity" feature of the sequence of base pairs by applying a new algorithm called detrended fluctuation analysis (DFA). We address the claim of Voss that there is no difference in the statistical properties of coding and non-coding regions of DNA by systematically applying the DFA algorithm, as well as standard FFT analysis, to every DNA sequence (33301 coding and 29453 non-coding) in the entire GenBank database. Finally, we describe briefly some recent work showing that the non-coding sequences have certain statistical features in common with natural and artificial languages. Specifically, we adapt to DNA the Zipf approach to analyzing linguistic texts. These statistical properties of non-coding sequences support the possibility that non-coding regions of DNA may carry biological information.
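A minimal sketch of detrended fluctuation analysis, the algorithm named in the abstract, applied to an uncorrelated binary surrogate sequence (the sequence, window sizes, and length are illustrative assumptions):

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: integrate the series, then for
    each window size measure RMS deviation from a local linear trend."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for s in scales:
        n_win = len(y) // s
        sq_dev = []
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            sq_dev.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(sq_dev)))
    return np.array(F)

# the slope of log F(s) vs log s estimates the scaling exponent alpha:
# alpha ~ 0.5 for uncorrelated sequences, > 0.5 for long-range correlation
rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=4096)          # uncorrelated surrogate sequence
scales = [8, 16, 32, 64, 128]
alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]
```

For an uncorrelated sequence the estimated exponent is close to 0.5; the long-range correlations reported for non-coding DNA correspond to exponents measurably above that.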

  15. Using scientifically and statistically sufficient statistics in comparing image segmentations.

    PubMed

    Chi, Yueh-Yun; Muller, Keith E

    2010-01-01

    Automatic computer segmentation in three dimensions creates opportunity to reduce the cost of three-dimensional treatment planning of radiotherapy for cancer treatment. Comparisons between human and computer accuracy in segmenting kidneys in CT scans generate distance values far larger in number than the number of CT scans. Such high dimension, low sample size (HDLSS) data present a grand challenge to statisticians: how do we find good estimates and make credible inference? We recommend discovering and using scientifically and statistically sufficient statistics as an additional strategy for overcoming the curse of dimensionality. First, we reduced the three-dimensional array of distances for each image comparison to a histogram to be modeled individually. Second, we used non-parametric kernel density estimation to explore distributional patterns and assess multi-modality. Third, a systematic exploratory search for parametric distributions and truncated variations led to choosing a Gaussian form as approximating the distribution of a cube root transformation of distance. Fourth, representing each histogram by an individually estimated distribution eliminated the HDLSS problem by reducing on average 26,000 distances per histogram to just 2 parameter estimates. In the fifth and final step we used classical statistical methods to demonstrate that the two human observers disagreed significantly less with each other than with the computer segmentation. Nevertheless, the size of all disagreements was clinically unimportant relative to the size of a kidney. The hierarchal modeling approach to object-oriented data created response variables deemed sufficient by both the scientists and statisticians. We believe the same strategy provides a useful addition to the imaging toolkit and will succeed with many other high throughput technologies in genetics, metabolomics and chemical analysis.
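The dimension-reduction step (cube-root transform, then a two-parameter Gaussian summary per histogram) can be sketched as follows; the gamma-distributed surrogate distances are an illustrative assumption standing in for real segmentation distances:

```python
import numpy as np

rng = np.random.default_rng(2)
# surrogate surface distances (mm) for one human-vs-computer comparison;
# a real comparison yields on average ~26,000 such distances per histogram
distances = rng.gamma(shape=2.0, scale=1.5, size=26000)

# cube-root transform, then summarize the entire histogram by the two
# parameters of an approximating Gaussian
z = np.cbrt(distances)
mu, sigma = float(z.mean()), float(z.std(ddof=1))
summary = (mu, sigma)             # 26,000 distances reduced to 2 estimates
```

Representing each comparison by `(mu, sigma)` is what eliminates the high dimension, low sample size problem: classical inference then operates on a small number of parameter estimates per image pair.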

  16. 75 FR 7426 - Periodic Reporting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-19

    ... 39 CFR Part 3050 Periodic Reporting AGENCY: Postal Regulatory Commission. ACTION: Advance notice of...-789-6820 or stephen.sharfman@prc.gov . SUPPLEMENTARY INFORMATION: Table of Contents I. Background II... approved for use in periodic reporting.\\1\\ The Postal Service labels its proposal ``Proposal One''...

  17. 76 FR 296 - Periodic Reporting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-04

    ... consider a proposed change in certain analytical methods used in periodic reporting. The proposed change... rulemaking proceeding to consider changes in the analytical methods approved for use in periodic reporting.\\1... Requesting Initiation of a Proceeding to Consider Proposed Changes in Analytic Principles (Proposals...

  18. Quantum Estimation, meet Computational Statistics; Computational Statistics, meet Quantum Estimation

    NASA Astrophysics Data System (ADS)

    Ferrie, Chris; Granade, Chris; Combes, Joshua

    2013-03-01

    Quantum estimation, that is, post processing data to obtain classical descriptions of quantum states and processes, is an intractable problem--scaling exponentially with the number of interacting systems. Thankfully there is an entire field, Computational Statistics, devoted to designing algorithms to estimate probabilities for seemingly intractable problems. So, why not look to the most advanced machine learning algorithms for quantum estimation tasks? We did. I'll describe how we adapted and combined machine learning methodologies to obtain an online learning algorithm designed to estimate quantum states and processes.

  19. Forward Period Analysis Method of the Periodic Hamiltonian System

    PubMed Central

    Wang, Pengfei

    2016-01-01

    Using the forward period analysis (FPA), we obtain the period of a Morse oscillator and mathematical pendulum system, with the accuracy of 100 significant digits. From these results, the long-term [0, 10^60] (time unit) solutions, ranging from the Planck time to the age of the universe, are computed reliably and quickly with a parallel multiple-precision Taylor series (PMT) scheme. The application of FPA to periodic systems can greatly reduce the computation time of long-term reliable simulations. This scheme provides an efficient way to generate reference solutions, against which long-term simulations using other schemes can be tested. PMID:27727295

  20. Streamflow statistics for selected streams in North Dakota, Minnesota, Manitoba, and Saskatchewan

    USGS Publications Warehouse

    Williams-Sether, Tara

    2012-01-01

    Statistical summaries of streamflow data for the periods of record through water year 2009 for selected active and discontinued U.S. Geological Survey streamflow-gaging stations in North Dakota, Minnesota, Manitoba, and Saskatchewan were compiled. The summaries for each streamflow-gaging station include a brief station description, a graph of the annual peak and annual mean discharge for the period of record, statistics of monthly and annual mean discharges, monthly and annual flow durations, probability of occurrence of annual high discharges, annual peak discharge and corresponding gage height for the period of record, and monthly and annual mean discharges for the period of record.

  1. Spectral sum rules and search for periodicities in DNA sequences

    NASA Astrophysics Data System (ADS)

    Chechetkin, V. R.

    2011-04-01

    Periodic patterns play important regulatory and structural roles in genomic DNA sequences. Commonly, the underlying periodicities should be understood in a broad statistical sense, since the corresponding periodic patterns have been strongly distorted by random point mutations and insertions/deletions during molecular evolution. The latent periodicities in DNA sequences can be efficiently displayed by Fourier transform. The criteria of significance for observed periodicities are obtained by comparison with the counterpart characteristics of reference random sequences. We show that the restrictions imposed on the significance criteria by the rigorous spectral sum rules can be rationally described with the De Finetti distribution. This distribution provides a convenient intermediate asymptotic form between the Rayleigh distribution and exact combinatoric theory.
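Fourier-based detection of a latent periodicity, with significance judged against shuffled reference sequences, can be sketched as below. The planted period-3 pattern, sequence length, and number of reference shuffles are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def power_spectrum(indicator):
    # mean-removed indicator sequence -> Fourier power per frequency bin;
    # bin k corresponds to period n/k in the sequence
    x = indicator - indicator.mean()
    return np.abs(np.fft.rfft(x)) ** 2 / len(x)

n = 900
bases = rng.choice(list("ACGT"), size=n)
bases[::3] = "A"                           # plant a latent period-3 pattern
ind = (bases == "A").astype(float)
p = power_spectrum(ind)

# significance: compare the observed peak against the peak heights of
# shuffled reference sequences, which destroy any periodic order
null_peaks = [power_spectrum(rng.permutation(ind))[1:].max() for _ in range(50)]
peak_freq = int(np.argmax(p[1:])) + 1      # strongest non-DC frequency bin
```

The planted pattern produces a spectral peak at bin n/3 (period 3) that stands far above anything the shuffled references produce, which is the essence of the significance criterion described above.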

  2. Periodic and Quasi-Periodic Orbits for the Standard Map

    NASA Astrophysics Data System (ADS)

    Berretti, Alberto; Gentile, Guido

    We consider both periodic and quasi-periodic solutions for the standard map, and we study the corresponding conjugating functions, i.e. the functions conjugating the motions to trivial rotations. We compare the invariant curves with rotation numbers ω satisfying the Bryuno condition and the sequences of periodic orbits with rotation numbers given by their convergents ω_N = p_N/q_N. We prove the following results for N → ∞: (1) for rotation numbers ω_N we study the radius of convergence of the conjugating functions and find lower bounds on them, which tend to a limit that is a lower bound on the corresponding quantity for ω; (2) the periodic orbits consist of points which come closer and closer to the invariant curve with rotation number ω; (3) such orbits lie on analytical curves which tend uniformly to the invariant curve.

  3. Statistical modelling of citation exchange between statistics journals.

    PubMed

    Varin, Cristiano; Cattelan, Manuela; Firth, David

    2016-01-01

    Rankings of scholarly journals based on citation data are often met with scepticism by the scientific community. Part of the scepticism is due to disparity between the common perception of journals' prestige and their ranking based on citation counts. A more serious concern is the inappropriate use of journal rankings to evaluate the scientific influence of researchers. The paper focuses on analysis of the table of cross-citations among a selection of statistics journals. Data are collected from the Web of Science database published by Thomson Reuters. Our results suggest that modelling the exchange of citations between journals is useful to highlight the most prestigious journals, but also that journal citation data are characterized by considerable heterogeneity, which needs to be properly summarized. Inferential conclusions require care to avoid potential overinterpretation of insignificant differences between journal ratings. Comparison with published ratings of institutions from the UK's research assessment exercise shows strong correlation at aggregate level between assessed research quality and journal citation 'export scores' within the discipline of statistics.

  4. Selected Streamflow Statistics for Streamgaging Stations in Delaware, 2003

    USGS Publications Warehouse

    Ries, Kernell G.

    2004-01-01

    Flow-duration and low-flow frequency statistics were calculated for 15 streamgaging stations in Delaware, in cooperation with the Delaware Geological Survey. The flow-duration statistics include the 1-, 2-, 5-, 10-, 20-, 30-, 40-, 50-, 60-, 70-, 80-, 90-, 95-, 98-, and 99-percent duration discharges. The low-flow frequency statistics include the average discharges for 1, 7, 14, 30, 60, 90, and 120 days that recur, on average, once in 1.01, 2, 5, 10, 20, 50, and 100 years. The statistics were computed using U.S. Geological Survey computer programs that can be downloaded from the World Wide Web at no cost. The computer programs automate standard U.S. Geological Survey methods for computing the statistics. Documentation is provided at the Web sites for the individual programs. The computed statistics are presented in tabular format on a separate page for each station, along with the station name, station number, the location, the period of record, and remarks.
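The percentile computation behind flow-duration statistics takes only a few lines. The lognormal surrogate record below is an illustrative assumption, not USGS data:

```python
import numpy as np

rng = np.random.default_rng(4)
# surrogate daily mean discharges (ft^3/s) for one station, ~10 years
q = rng.lognormal(mean=3.0, sigma=1.0, size=3650)

# the P-percent duration discharge is the flow equaled or exceeded
# P percent of the time, i.e. the (100 - P)th percentile of daily flows
durations = [1, 2, 5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 98, 99]
flow_duration = {p: float(np.percentile(q, 100 - p)) for p in durations}

# a low-flow statistic: the minimum 7-day moving-average discharge
q7_min = float(np.convolve(q, np.ones(7) / 7, mode="valid").min())
```

By construction the duration curve is non-increasing in P: the 1-percent duration discharge is a rarely exceeded high flow, the 99-percent value a nearly always exceeded low flow.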

  5. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of...

  6. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of...

  7. 42 CFR 402.109 - Statistical sampling.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 2 2011-10-01 2011-10-01 false Statistical sampling. 402.109 Section 402.109... Statistical sampling. (a) Purpose. CMS or OIG may introduce the results of a statistical sampling study to... or caused to be presented. (b) Prima facie evidence. The results of the statistical sampling...

  8. 42 CFR 402.109 - Statistical sampling.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 2 2013-10-01 2013-10-01 false Statistical sampling. 402.109 Section 402.109... Statistical sampling. (a) Purpose. CMS or OIG may introduce the results of a statistical sampling study to... or caused to be presented. (b) Prima facie evidence. The results of the statistical sampling...

  9. 42 CFR 402.109 - Statistical sampling.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 2 2014-10-01 2014-10-01 false Statistical sampling. 402.109 Section 402.109... Statistical sampling. (a) Purpose. CMS or OIG may introduce the results of a statistical sampling study to... or caused to be presented. (b) Prima facie evidence. The results of the statistical sampling...

  10. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of...

  11. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of...

  12. Worry, Intolerance of Uncertainty, and Statistics Anxiety

    ERIC Educational Resources Information Center

    Williams, Amanda S.

    2013-01-01

    Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…

  13. 42 CFR 402.109 - Statistical sampling.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 2 2012-10-01 2012-10-01 false Statistical sampling. 402.109 Section 402.109... Statistical sampling. (a) Purpose. CMS or OIG may introduce the results of a statistical sampling study to... or caused to be presented. (b) Prima facie evidence. The results of the statistical sampling...

  14. Engaging with the Art & Science of Statistics

    ERIC Educational Resources Information Center

    Peters, Susan A.

    2010-01-01

    How can statistics clearly be mathematical and yet distinct from mathematics? The answer lies in the reality that statistics is both an art and a science, and both aspects are important for teaching and learning statistics. Statistics is a mathematical science in that it applies mathematical theories and techniques. Mathematics provides the…

  15. Transforming Elementary Statistics To Enhance Student Learning.

    ERIC Educational Resources Information Center

    Lane, Jill L.; Aleksic, Maja

    Undergraduate students often leave statistics courses not fully understanding how to apply statistical concepts (M. Bonsangue, 1994). In order to enhance student learning and improve the understanding and application of statistical concepts, an elementary statistics course was transformed from a lecture-based course into one that integrates…

  16. Statistical Sources for Health Science Librarians.

    ERIC Educational Resources Information Center

    Weise, Frieda

    This continuing education course syllabus presents information on the collection of vital and health statistics, lists of agencies or organizations involved in statistical collection and/or dissemination, annotated bibliographies of statistical sources, and guidelines for accessing statistical information. Topics covered include: (1) the reporting…

  17. Nonlinear Statistical Modeling of Speech

    NASA Astrophysics Data System (ADS)

    Srinivasan, S.; Ma, T.; May, D.; Lazarou, G.; Picone, J.

    2009-12-01

    Contemporary approaches to speech and speaker recognition decompose the problem into four components: feature extraction, acoustic modeling, language modeling and search. Statistical signal processing is an integral part of each of these components, and Bayes Rule is used to merge these components into a single optimal choice. Acoustic models typically use hidden Markov models based on Gaussian mixture models for state output probabilities. This popular approach suffers from an inherent assumption of linearity in speech signal dynamics. Language models often employ a variety of maximum entropy techniques, but can employ many of the same statistical techniques used for acoustic models. In this paper, we focus on introducing nonlinear statistical models to the feature extraction and acoustic modeling problems as a first step towards speech and speaker recognition systems based on notions of chaos and strange attractors. Our goal in this work is to improve the generalization and robustness properties of a speech recognition system. Three nonlinear invariants are proposed for feature extraction: Lyapunov exponents, correlation fractal dimension, and correlation entropy. We demonstrate an 11% relative improvement on speech recorded under noise-free conditions, but show a comparable degradation occurs for mismatched training conditions on noisy speech. We conjecture that the degradation is due to difficulties in estimating invariants reliably from noisy data. To circumvent these problems, we introduce two dynamic models to the acoustic modeling problem: (1) a linear dynamic model (LDM) that uses a state space-like formulation to explicitly model the evolution of hidden states using an autoregressive process, and (2) a data-dependent mixture of autoregressive (MixAR) models. Results show that LDM and MixAR models can achieve comparable performance with HMM systems while using significantly fewer parameters. Currently we are developing Bayesian parameter estimation and

  18. Statistical dynamo theory: Mode excitation.

    PubMed

    Hoyng, P

    2009-04-01

    We compute statistical properties of the lowest-order multipole coefficients of the magnetic field generated by a dynamo of arbitrary shape. To this end we expand the field in a complete biorthogonal set of base functions, viz. B(r, t) = Σ_k a_k(t) b_k(r). The properties of these biorthogonal function sets are treated in detail. We consider a linear problem and the statistical properties of the fluid flow are supposed to be given. The turbulent convection may have an arbitrary distribution of spatial scales. The time evolution of the expansion coefficients a_k is governed by a stochastic differential equation from which we infer their averages ⟨a_k⟩, autocorrelation functions ⟨a_k(t) a_k*(t+τ)⟩, and an equation for the cross correlations ⟨a_k a_l*⟩. The eigenfunctions of the dynamo equation (with eigenvalues λ_k) turn out to be a preferred set in terms of which our results assume their simplest form. The magnetic field of the dynamo is shown to consist of transiently excited eigenmodes whose frequency and coherence time are given by Im λ_k and -1/Re λ_k, respectively. The relative rms excitation level of the eigenmodes, and hence the distribution of magnetic energy over spatial scales, is determined by linear theory. An expression is derived for ⟨|a_k|²⟩/⟨|a_0|²⟩ in case the fundamental mode b_0 has a dominant amplitude, and we outline how this expression may be evaluated. It is estimated that ⟨|a_k|²⟩/⟨|a_0|²⟩ ≈ 1/N, where N is the number of convective cells in the dynamo. We show that the old problem of a short correlation time (or first-order smoothing approximation) has been partially eliminated. Finally we prove that for a simple statistically steady dynamo with finite resistivity all eigenvalues obey Re λ_k < 0.

  19. Statistical Mechanics of Jammed Matter

    NASA Astrophysics Data System (ADS)

    Behringer, Bob

    2009-03-01

    Jammed systems consist of large numbers of macroscopic particles. As such, they are inherently statistical in nature. However, in general, key assumptions of ordinary statistical mechanics need not apply. For instance, energy does not flow in a meaningful way from a thermal bath to such systems. And energy need not be conserved. However, experiments and simulations have shown that there are well defined distributions for such important properties as forces, contact numbers, etc. And new theoretical constructions have been proposed, starting with Edwards et al. The present symposium highlights recent developments for the statistics of jammed matter. This talk reviews the overall field, and highlights recent work in granular systems [1]. Brian Tighe [2] will describe new results from a force ensemble approach proposed recently by Snoeijer et al. Silke Henkes will describe a different force-based ensemble approach that yields a generalized partition function [3]. Eric Corwin will describe state-of-the-art experiments on dense emulsions [4]. And Matthias Schröter will present novel experiments on fluidized suspensions that address the issue of jamming and glassy behavior [5]. So, do we have a complete description of jammed matter? Not yet, but these talks, as well as other exciting developments in the field, show that there has been enormous progress towards that end. [1] T. S. Majmudar et al., Nature 435, 1079 (2005); Phys. Rev. Lett. 98, 058001 (2007). [2] B. P. Tighe, A. R. T. van Eerd, and T. J. H. Vlugt, Phys. Rev. Lett. 100, 238001 (2008). [3] S. Henkes, C. O'Hern and B. Chakraborty, Phys. Rev. Lett. 99, 038002 (2007). [4] J. Brujić et al., Phys. Rev. Lett. 98, 248001 (2007). [5] M. Schröter, D. I. Goldman, and H. L. Swinney, Phys. Rev. E 71, 030301(R) (2005).

  20. The Stability of Periodic Orbits.

    DTIC Science & Technology

    1981-01-21

    The Stability of Periodic Orbits. Leigh Sneddon, Joseph Henry Laboratories of Physics, Princeton University, Princeton, New Jersey 08544. January 1981; contract N00014-77-C-0711; unclassified. Abstract (fragment): ...an eigenvalue of the Poincaré map passes out through the unit circle at -1 (see Appendix 1) [9,10] are observed and are referred to as subharmonic or period

  1. GEM: Statistical weather forecasting procedure

    NASA Technical Reports Server (NTRS)

    Miller, R. G.

    1983-01-01

    The objective of the Generalized Exponential Markov (GEM) Program was to develop a weather forecast guidance system that would: predict between 0 to 6 hours all elements in the airways observations; respond instantly to the latest observed conditions of the surface weather; process these observations at local sites on minicomputing equipment; exceed the accuracy of current persistence predictions at the shortest prediction of one hour and beyond; exceed the accuracy of current forecast model output statistics inside eight hours; and be capable of making predictions at one location for all locations where weather information is available.

  2. Statistical considerations for preclinical studies.

    PubMed

    Aban, Inmaculada B; George, Brandon

    2015-08-01

    Research studies must always have proper planning, conduct, analysis and reporting in order to preserve scientific integrity. Preclinical studies, the first stage of the drug development process, are no exception to this rule. The decision to advance to clinical trials in humans relies on the results of these studies. Recent observations show that a significant number of preclinical studies lack rigor in their conduct and reporting. This paper discusses statistical aspects, such as design, sample size determination, and methods of analyses, that will help add rigor and improve the quality of preclinical studies.
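As an illustration of the sample-size determination step, here is a textbook normal-approximation formula for a two-group comparison; this is a generic sketch, not the specific method of the paper:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate subjects per group needed to detect a standardized
    effect size d with a two-sided, two-sample test, using the normal
    approximation: n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2."""
    z = NormalDist().inv_cdf
    z_alpha, z_beta = z(1 - alpha / 2), z(power)
    return ceil(2 * ((z_alpha + z_beta) / d) ** 2)

# a medium effect (d = 0.5) at 80% power and alpha = 0.05
n = n_per_group(0.5)   # -> 63 per group
```

Planning such calculations before data collection is exactly the kind of statistical rigor the paper argues is often missing from preclinical work.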

  3. NASA Pocket Statistics: 1997 Edition

    NASA Technical Reports Server (NTRS)

    1997-01-01

    POCKET STATISTICS is published by the NATIONAL AERONAUTICS AND SPACE ADMINISTRATION (NASA). Included in each edition is Administrative and Organizational information, summaries of Space Flight Activity including the NASA Major Launch Record, Aeronautics and Space Transportation and NASA Procurement, Financial and Workforce data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. All Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  4. Statistical Properties of Online Auctions

    NASA Astrophysics Data System (ADS)

    Namazi, Alireza; Schadschneider, Andreas

    We characterize the statistical properties of a large number of online auctions run on eBay. Both stationary and dynamic properties, like distributions of prices, number of bids etc., as well as relations between these quantities are studied. The analysis of the data reveals surprisingly simple distributions and relations, typically of power-law form. Based on these findings we introduce a simple method to identify suspicious auctions that could be influenced by a form of fraud known as shill bidding. Furthermore the influence of bidding strategies is discussed. The results indicate that the observed behavior is related to a mixture of agents using a variety of strategies.

  5. Statistics, Uncertainty, and Transmitted Variation

    SciTech Connect

    Wendelberger, Joanne Roth

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.

  6. Statistical Mechanics of Combinatorial Auctions

    NASA Astrophysics Data System (ADS)

    Galla, Tobias; Leone, Michele; Marsili, Matteo; Sellitto, Mauro; Weigt, Martin; Zecchina, Riccardo

    2006-09-01

    Combinatorial auctions are formulated as frustrated lattice gases on sparse random graphs, allowing the determination of the optimal revenue by methods of statistical physics. Transitions between computationally easy and hard regimes are found and interpreted in terms of the geometric structure of the space of solutions. We introduce an iterative algorithm to solve intermediate and large instances, and discuss competing states of optimal revenue and maximal number of satisfied bidders. The algorithm can be generalized to the hard phase and to more sophisticated auction protocols.

  7. Improved model for statistical alignment

    SciTech Connect

    Miklos, I.; Toroczkai, Z.

    2001-01-01

    The statistical approach to molecular sequence evolution involves the stochastic modeling of the substitution, insertion and deletion processes. Substitution has been modeled in a reliable way for more than three decades by using finite Markov processes. Insertion and deletion, however, seem to be more difficult to model, and the recent approaches cannot acceptably deal with multiple insertions and deletions. A new method based on a generating function approach is introduced to describe the multiple insertion process. The presented algorithm computes the approximate joint probability of two sequences in O(l^3) running time, where l is the geometric mean of the sequence lengths.

  8. Statistical physics of polymer gels

    NASA Astrophysics Data System (ADS)

    Panyukov, Sergei; Rabin, Yitzhak

    1996-05-01

    This work presents a comprehensive analysis of the statistical mechanics of randomly cross-linked polymer gels, starting from a microscopic model of a network made of instantaneously cross-linked Gaussian chains with excluded volume, and ending with the derivation of explicit expressions for the thermodynamic functions and for the density correlation functions which can be tested by experiments. Using replica field theory we calculate the mean field density in replica space and show that this solution contains statistical information about the behavior of individual chains in the network. The average monomer positions change affinely with macroscopic deformation and fluctuations about these positions are limited to length scales of the order of the mesh size. We prove that a given gel has a unique state of microscopic equilibrium which depends on the temperature, the solvent, the average monomer density and the imposed deformation. This state is characterized by the set of the average positions of all the monomers or, equivalently, by a unique inhomogeneous monomer density profile. Gels are thus the only known example of equilibrium solids with no long-range order. We calculate the RPA density correlation functions that describe the statistical properties of small deviations from the average density, due to both static spatial heterogeneities (which characterize the inhomogeneous equilibrium state) and thermal fluctuations (about this equilibrium). We explain how the deformation-induced anisotropy of the inhomogeneous equilibrium density profile is revealed by small angle neutron scattering and light scattering experiments, through the observation of the butterfly effect. We show that all the statistical information about the structure of polymer networks is contained in two parameters whose values are determined by the conditions of synthesis: the density of cross-links and the heterogeneity parameter. We find that the structure of instantaneously cross

  9. Transportation Statistics Annual Report 1997

    SciTech Connect

    Fenn, M.

    1997-01-01

    This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system—its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environment impacts. Part I also explores the state of transportation statistics, and new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is “Mobility and Access,” which complements past TSAR theme sections on “The Economic Performance of Transportation” (1995) and “Transportation and the Environment” (1996). Mobility and access are at the heart of the transportation system’s performance from the user’s perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation’s residents and contribute to economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people’s access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these

  10. Spatial Statistical Data Fusion (SSDF)

    NASA Technical Reports Server (NTRS)

    Braverman, Amy J.; Nguyen, Hai M.; Cressie, Noel

    2013-01-01

As remote sensing for scientific purposes has transitioned from an experimental technology to an operational one, the selection of instruments has become more coordinated, so that the scientific community can exploit complementary measurements. However, technological and scientific heterogeneity across devices means that the statistical characteristics of the data they collect are different. The challenge addressed here is how to combine heterogeneous remote sensing data sets in a way that yields optimal statistical estimates of the underlying geophysical field, and provides rigorous uncertainty measures for those estimates. Different remote sensing data sets may have different spatial resolutions, different measurement error biases and variances, and other disparate characteristics. A state-of-the-art spatial statistical model was used to relate the true, but not directly observed, geophysical field to noisy, spatial aggregates observed by remote sensing instruments. The spatial covariances of the true field and the covariances of the true field with the observations were modeled. The observations are spatial averages of the true field values, over pixels, with different measurement noise superimposed. A kriging framework is used to infer optimal (minimum mean squared error and unbiased) estimates of the true field at point locations from pixel-level, noisy observations. A key feature of the spatial statistical model is the spatial mixed effects model that underlies it. The approach models the spatial covariance function of the underlying field using linear combinations of basis functions of fixed size. Approaches based on kriging require the inversion of very large spatial covariance matrices, and this is usually done by making simplifying assumptions about spatial covariance structure that simply do not hold for geophysical variables. In contrast, this method does not require these assumptions, and is also computationally much faster. This method is
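In the special, zero-dimensional case of co-located, independent, unbiased measurements with known error variances (dropping the spatial covariance modeling and change of support that the full method handles), the minimum-MSE unbiased combination reduces to inverse-variance weighting. The sketch below illustrates that limiting case only; the function name is illustrative and not from the paper:

```python
def fuse(measurements, variances):
    """Minimum-MSE unbiased combination of independent, unbiased measurements
    of the same quantity: inverse-variance weighting. A zero-dimensional
    analogue of the kriging-based fusion described above (no spatial
    covariance, no change of support)."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    estimate = sum(w * m for w, m in zip(weights, measurements)) / total
    variance = 1.0 / total  # never larger than the smallest input variance
    return estimate, variance

# Two instruments observe the same field value with different noise levels;
# the noisier instrument gets a proportionally smaller weight:
print(fuse([10.0, 14.0], [1.0, 3.0]))  # -> (11.0, 0.75)
```

Note how the fused variance (0.75) is below the better instrument's variance (1.0): combining heterogeneous measurements tightens the uncertainty, which is the statistical motivation for fusion.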

  11. Nonstationary statistical theory for multipactor

    SciTech Connect

    Anza, S.; Vicente, C.; Gil, J.

    2010-06-15

    This work presents a new and general approach to the real dynamics of the multipactor process: the nonstationary statistical multipactor theory. The nonstationary theory removes the stationarity assumption of the classical theory and, as a consequence, it is able to adequately model electron exponential growth as well as absorption processes, above and below the multipactor breakdown level. In addition, it considers both double-surface and single-surface interactions constituting a full framework for nonresonant polyphase multipactor analysis. This work formulates the new theory and validates it with numerical and experimental results with excellent agreement.

  12. Entropy in statistical energy analysis.

    PubMed

    Le Bot, Alain

    2009-03-01

In this paper, the second principle of thermodynamics is discussed in the framework of statistical energy analysis (SEA). It is shown that the "vibrational entropy" and the "vibrational temperature" of sub-systems depend only on the vibrational energy and the number of resonant modes. A SEA system can be described as a thermodynamic system slightly out of equilibrium. In steady-state conditions, the entropy exchanged with the exterior by sources and dissipation exactly balances the production of entropy by irreversible processes at the interfaces between SEA sub-systems.

  13. Non-gaussian statistics of pencil beam surveys

    NASA Technical Reports Server (NTRS)

    Amendola, Luca

    1994-01-01

We study the effect of the non-Gaussian clustering of galaxies on the statistics of pencil beam surveys. We derive the probability distribution of the power spectrum peaks by means of an Edgeworth expansion and find that the higher order moments of the galaxy distribution play a dominant role. The probability of obtaining the 128 Mpc/h periodicity found in pencil beam surveys is raised by more than one order of magnitude, up to 1%. Further data are needed to decide if a non-Gaussian distribution alone is sufficient to explain the 128 Mpc/h periodicity, or if extra large-scale power is necessary.

  14. Statistical and Scientometric Analysis of International Research in Geographical and Environmental Education

    ERIC Educational Resources Information Center

    Papadimitriou, Fivos; Kidman, Gillian

    2012-01-01

Certain statistical and scientometric features of articles published in the journal "International Research in Geographical and Environmental Education" (IRGEE) are examined in this paper for the period 1992-2009 by applying nonparametric statistics and Shannon's entropy (diversity) formula. The main findings of this analysis are: (a) after 2004,…

  15. Monitoring Statistics Which Have Increased Power over a Reduced Time Range.

    ERIC Educational Resources Information Center

    Tang, S. M.; MacNeill, I. B.

    1992-01-01

    The problem of monitoring trends for changes at unknown times is considered. Statistics that permit one to focus high power on a segment of the monitored period are studied. Numerical procedures are developed to compute the null distribution of these statistics. (Author)

  16. Fertilizer use and price statistics, 1960-1991. Statistical bulletin

    SciTech Connect

    Vroomen, H.; Taylor, H.

    1992-11-01

    Fertilizer consumption grew rapidly throughout the 1960's and 1970's and peaked at 23.7 million nutrient tons in 1981. After falling to 18.1 million tons in 1983, use has remained relatively stable, ranging from 19.1 million to 21.8 million tons in 1984-91. Use declined from its peak level because of fewer planted acres and stabilizing rates of application. Retail fertilizer prices, while stable or declining during the 1960's, have varied widely since 1973. The bulletin includes quarterly or semiannual time series for retail fertilizer prices, annual retail and wholesale fertilizer price indexes, fertilizer consumption by plant nutrient and major selected products, consumption of mixed fertilizers and secondary and micronutrients, and statistics on fertilizer use per acre by nutrient in the major producing States for corn, cotton, soybeans, and wheat.

  17. Chaos in Periodic Discrete Systems

    NASA Astrophysics Data System (ADS)

    Shi, Yuming; Zhang, Lijuan; Yu, Panpan; Huang, Qiuling

    This paper focuses on chaos in periodic discrete systems, whose state space may vary with time. Some close relationships between some chaotic dynamical behaviors of a periodic discrete system and its autonomous induced system are given. Based on these relationships, several criteria of chaos are established and some sufficient conditions for no chaos are given for periodic discrete systems. Further, it is shown that a finite-dimensional linear periodic discrete system is not chaotic in the sense of Li-Yorke or Wiggins. In particular, an interesting problem of whether nonchaotic rules may generate a chaotic system is studied, with some examples provided, one of which surprisingly shows that a composition of globally asymptotically stable maps can be chaotic. In addition, some properties of sign pattern matrices of non-negative square matrices are given for convenience of the study.

  18. Autism: a "critical period" disorder?

    PubMed

    LeBlanc, Jocelyn J; Fagiolini, Michela

    2011-01-01

    Cortical circuits in the brain are refined by experience during critical periods early in postnatal life. Critical periods are regulated by the balance of excitatory and inhibitory (E/I) neurotransmission in the brain during development. There is now increasing evidence of E/I imbalance in autism, a complex genetic neurodevelopmental disorder diagnosed by abnormal socialization, impaired communication, and repetitive behaviors or restricted interests. The underlying cause is still largely unknown and there is no fully effective treatment or cure. We propose that alteration of the expression and/or timing of critical period circuit refinement in primary sensory brain areas may significantly contribute to autistic phenotypes, including cognitive and behavioral impairments. Dissection of the cellular and molecular mechanisms governing well-established critical periods represents a powerful tool to identify new potential therapeutic targets to restore normal plasticity and function in affected neuronal circuits.

  19. Statistical interpretation of traveltime fluctuations

    NASA Astrophysics Data System (ADS)

    Roth, Michael

    1997-02-01

A ray-theoretical relation between the autocorrelation functions of traveltime and slowness fluctuations is established for recording profiles with arbitrary angles to the propagation direction of a plane wave. From this relation it follows that the variance of traveltime fluctuations is independent of the profile orientation and proportional to the variance, ɛ², of slowness fluctuations, to the correlation distance, a, and to the propagation distance, L. The halfwidth of the autocorrelation function of traveltime fluctuations is proportional to a and decreases with increasing profile angle. This relationship allows us to estimate the statistical parameters ɛ and a from observed traveltime fluctuations. Numerical experiments for spatially isotropic random media characterized by a Gaussian autocorrelation function show that the statistical parameters can be reproduced successfully if L/a ≤ 10. For larger L/a the correlation distance is overestimated and the standard deviation is underestimated. However, the results of the numerical experiments provide empirical factors to correct for these effects. The theory is applied to observed traveltime fluctuations of the Pg phase on a profile of the BABEL project. For the upper crust east of Øland (Sweden), slowness fluctuations with standard deviation ɛ = 2.2-5% and correlation distance a = 330-600 m are found.
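For concreteness, the proportionality stated in the abstract corresponds, for a Gaussian autocorrelation of the relative slowness fluctuations, to the classical ray-theoretical expression below. The √π coefficient is quoted from the standard random-media literature, not from this abstract, so treat it as a recollection rather than the paper's exact formula:

```latex
% variance of traveltime fluctuations along a ray of length L;
% \varepsilon = relative std. dev. of slowness fluctuations,
% a = correlation distance, v_0 = mean velocity
\sigma_T^2 = \sqrt{\pi}\,\varepsilon^2\, a\, L / v_0^2
```

The linear growth with L and a is what makes ɛ and a recoverable from observed traveltime scatter, as the abstract describes.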

  20. Lectures on probability and statistics

    SciTech Connect

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
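The forward (probability) and inverse (statistics) problems described in these lecture notes can be made concrete with a small sketch. The dice setup follows the abstract; the function names are illustrative:

```python
from itertools import product
from fractions import Fraction

def prob_sum(total, n_dice=2, sides=6):
    """Forward problem: a priori probability that n fair dice sum to `total`,
    computed by enumerating the equally likely outcomes."""
    outcomes = list(product(range(1, sides + 1), repeat=n_dice))
    hits = sum(1 for o in outcomes if sum(o) == total)
    return Fraction(hits, len(outcomes))

def mle_side_probs(rolls, sides=6):
    """Inverse problem: infer each side's probability from observed rolls by
    maximum likelihood, i.e. the relative frequency of each side."""
    return {s: rolls.count(s) / len(rolls) for s in range(1, sides + 1)}

print(prob_sum(7))  # -> 1/6: six of the 36 outcomes of two dice sum to 7
```

As the abstract notes, the inverse direction is the harder one: with only a few rolls, the relative-frequency estimate is noisy and may even assign probability zero to a side that simply has not appeared yet.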

  1. Statistical properties of cosmological billiards

    NASA Astrophysics Data System (ADS)

    Damour, Thibault; Lecian, Orchidea Maria

    2011-02-01

    Belinski, Khalatnikov, and Lifshitz pioneered the study of the statistical properties of the never-ending oscillatory behavior (among successive Kasner epochs) of the geometry near a spacelike singularity. We show how the use of a “cosmological billiard” description allows one to refine and deepen the understanding of these statistical properties. Contrary to previous treatments, we do not quotient the dynamics by its discrete symmetry group (of order 6), thereby uncovering new phenomena, such as correlations between the successive billiard corners in which the oscillations take place. Starting from the general integral invariants of Hamiltonian systems, we show how to construct invariant measures for various projections of the cosmological-billiard dynamics. In particular, we exhibit, for the first time, a (non-normalizable) invariant measure on the “Kasner circle” which parametrizes the exponents of successive Kasner epochs. Finally, we discuss the relation between: (i) the unquotiented dynamics of the Bianchi-IX (a, b, c or mixmaster) model; (ii) its quotienting by the group of permutations of (a, b, c); and (iii) the billiard dynamics that arose in recent studies suggesting the hidden presence of Kac-Moody symmetries in cosmological billiards.

  2. Simulating Metabolism with Statistical Thermodynamics

    PubMed Central

    Cannon, William R.

    2014-01-01

    New methods are needed for large scale modeling of metabolism that predict metabolite levels and characterize the thermodynamics of individual reactions and pathways. Current approaches use either kinetic simulations, which are difficult to extend to large networks of reactions because of the need for rate constants, or flux-based methods, which have a large number of feasible solutions because they are unconstrained by the law of mass action. This report presents an alternative modeling approach based on statistical thermodynamics. The principles of this approach are demonstrated using a simple set of coupled reactions, and then the system is characterized with respect to the changes in energy, entropy, free energy, and entropy production. Finally, the physical and biochemical insights that this approach can provide for metabolism are demonstrated by application to the tricarboxylic acid (TCA) cycle of Escherichia coli. The reaction and pathway thermodynamics are evaluated and predictions are made regarding changes in concentration of TCA cycle intermediates due to 10- and 100-fold changes in the ratio of NAD+:NADH concentrations. Finally, the assumptions and caveats regarding the use of statistical thermodynamics to model non-equilibrium reactions are discussed. PMID:25089525

  3. Demystifying EQA statistics and reports

    PubMed Central

    Coucke, Wim; Soumali, Mohamed Rida

    2017-01-01

    Reports act as an important feedback tool in External Quality Assessment (EQA). Their main role is to score laboratories for their performance in an EQA round. The most common scores that apply to quantitative data are Q- and Z-scores. To calculate these scores, EQA providers need to have an assigned value and standard deviation for the sample. Both assigned values and standard deviations can be derived chemically or statistically. When derived statistically, different anomalies against the normal distribution of the data have to be handled. Various procedures for evaluating laboratories are able to handle these anomalies. Formal tests and graphical representation techniques are discussed and suggestions are given to help choosing between the different evaluations techniques. In order to obtain reliable estimates for calculating performance scores, a satisfactory number of data is needed. There is no general agreement about the minimal number that is needed. A solution for very small numbers is proposed by changing the limits of evaluation.
Apart from analyte- and sample-specific laboratory evaluation, supplementary information can be obtained by combining results for different analytes and samples. Various techniques are overviewed. It is shown that combining results leads to supplementary information, not only for quantitative, but also for qualitative and semi-quantitative analytes. PMID:28392725
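The Z- and Q-scores mentioned above can be sketched as follows. Exact formulas and acceptance limits vary between EQA providers; the interpretation thresholds below are a common ISO 13528-style convention, not a universal rule, and the function names are illustrative:

```python
def z_score(result, assigned, sd):
    """Z-score: a laboratory's deviation from the assigned value,
    in units of the standard deviation for the EQA round."""
    return (result - assigned) / sd

def q_score(result, assigned):
    """Q-score: relative (percentage) deviation from the assigned value."""
    return 100.0 * (result - assigned) / assigned

def interpret(z):
    """Common ISO 13528-style reading of |z| (a convention, not EQA-universal):
    |z| <= 2 satisfactory, 2 < |z| <= 3 questionable, |z| > 3 unsatisfactory."""
    az = abs(z)
    if az <= 2:
        return "satisfactory"
    return "questionable" if az <= 3 else "unsatisfactory"

print(z_score(110.0, assigned=100.0, sd=5.0))  # -> 2.0
```

Both scores presuppose a reliable assigned value and SD, which is exactly why the statistical handling of outliers and small group sizes discussed in the abstract matters.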

  4. Simulating metabolism with statistical thermodynamics.

    PubMed

    Cannon, William R

    2014-01-01

    New methods are needed for large scale modeling of metabolism that predict metabolite levels and characterize the thermodynamics of individual reactions and pathways. Current approaches use either kinetic simulations, which are difficult to extend to large networks of reactions because of the need for rate constants, or flux-based methods, which have a large number of feasible solutions because they are unconstrained by the law of mass action. This report presents an alternative modeling approach based on statistical thermodynamics. The principles of this approach are demonstrated using a simple set of coupled reactions, and then the system is characterized with respect to the changes in energy, entropy, free energy, and entropy production. Finally, the physical and biochemical insights that this approach can provide for metabolism are demonstrated by application to the tricarboxylic acid (TCA) cycle of Escherichia coli. The reaction and pathway thermodynamics are evaluated and predictions are made regarding changes in concentration of TCA cycle intermediates due to 10- and 100-fold changes in the ratio of NAD+:NADH concentrations. Finally, the assumptions and caveats regarding the use of statistical thermodynamics to model non-equilibrium reactions are discussed.

  5. Statistical variation in progressive scrambling

    NASA Astrophysics Data System (ADS)

    Clark, Robert D.; Fox, Peter C.

    2004-07-01

The two methods most often used to evaluate the robustness and predictivity of partial least squares (PLS) models are cross-validation and response randomization. Both methods may be overly optimistic for data sets that contain redundant observations, however. The kinds of perturbation analysis widely used for evaluating model stability in the context of ordinary least squares regression are only applicable when the descriptors are independent of each other and errors are independent and normally distributed; neither assumption holds for QSAR in general and for PLS in particular. Progressive scrambling is a novel, non-parametric approach to perturbing models in the response space in a way that does not disturb the underlying covariance structure of the data. Here, we introduce adjustments for two of the characteristic values produced by a progressive scrambling analysis - the deprecated predictivity (Q*_s²) and standard error of prediction (SDEP*_s) - that correct for the effect of introduced perturbation. We also explore the statistical behavior of the adjusted values (Q*_0² and SDEP*_0) and the sensitivity to perturbation (dq²/dr²_yy'). It is shown that the three statistics are all robust for stable PLS models, in terms of the stochastic component of their determination and of their variation due to sampling effects involved in training set selection.

  6. Two dimensional unstable scar statistics.

    SciTech Connect

    Warne, Larry Kevin; Jorgenson, Roy Eberhardt; Kotulski, Joseph Daniel; Lee, Kelvin S. H. (ITT Industries/AES Los Angeles, CA)

    2006-12-01

    This report examines the localization of time harmonic high frequency modal fields in two dimensional cavities along periodic paths between opposing sides of the cavity. The cases where these orbits lead to unstable localized modes are known as scars. This paper examines the enhancements for these unstable orbits when the opposing mirrors are both convex and concave. In the latter case the construction includes the treatment of interior foci.

  7. Statistical Features of Complex Systems ---Toward Establishing Sociological Physics---

    NASA Astrophysics Data System (ADS)

    Kobayashi, Naoki; Kuninaka, Hiroto; Wakita, Jun-ichi; Matsushita, Mitsugu

    2011-07-01

Complex systems have recently attracted much attention, both in natural sciences and in sociological sciences. Members constituting a complex system evolve through nonlinear interactions among each other. This means that in a complex system the multiplicative experience or, so to speak, the history of each member produces its present characteristics. If attention is paid to any statistical property in any complex system, the lognormal distribution is the most natural and appropriate among the standard or "normal" statistics to overview the whole system. In fact, the lognormality emerges rather conspicuously when we examine, as familiar and typical examples of statistical aspects in complex systems, the nursing-care period for the aged, populations of prefectures and municipalities, and our body height and weight. Many other examples are found in nature and society. On the basis of these observations, we discuss the possibility of sociological physics.

  8. Statistical Mechanics of Turbulent Dynamos

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    2014-01-01

    Incompressible magnetohydrodynamic (MHD) turbulence and magnetic dynamos, which occur in magnetofluids with large fluid and magnetic Reynolds numbers, will be discussed. When Reynolds numbers are large and energy decays slowly, the distribution of energy with respect to length scale becomes quasi-stationary and MHD turbulence can be described statistically. In the limit of infinite Reynolds numbers, viscosity and resistivity become zero and if these values are used in the MHD equations ab initio, a model system called ideal MHD turbulence results. This model system is typically confined in simple geometries with some form of homogeneous boundary conditions, allowing for velocity and magnetic field to be represented by orthogonal function expansions. One advantage to this is that the coefficients of the expansions form a set of nonlinearly interacting variables whose behavior can be described by equilibrium statistical mechanics, i.e., by a canonical ensemble theory based on the global invariants (energy, cross helicity and magnetic helicity) of ideal MHD turbulence. Another advantage is that truncated expansions provide a finite dynamical system whose time evolution can be numerically simulated to test the predictions of the associated statistical mechanics. If ensemble predictions are the same as time averages, then the system is said to be ergodic; if not, the system is nonergodic. Although it had been implicitly assumed in the early days of ideal MHD statistical theory development that these finite dynamical systems were ergodic, numerical simulations provided sufficient evidence that they were, in fact, nonergodic. Specifically, while canonical ensemble theory predicted that expansion coefficients would be (i) zero-mean random variables with (ii) energy that decreased with length scale, it was found that although (ii) was correct, (i) was not and the expected ergodicity was broken. The exact cause of this broken ergodicity was explained, after much

  9. Superluminal motion statistics and cosmology

    NASA Astrophysics Data System (ADS)

    Vermeulen, R. C.; Cohen, M. H.

    1994-08-01

    This paper has three parts. First, we give an up-to-date overview of the available apparent velocity (Betaapp) data; second, we present some statistical predictions from simple relativistic beaming models; third, we discuss the inferences which a comparison of data and models allows for both relativistic jets and cosmology. We demonstrate that, in objects selected by Doppler-boosted flux density, likely Lorentz factors (gamma) can be estimated from the first-ranked (Betaapp) in samples as small as 5. Using 25 core-selected quasars, we find that the dependence of gamma on redshift differs depending on the value of qzero: gamma is close to constant over z if qzero = 0.5, but increases with z if qzero = 0.05. Conversely, this result could be used to constrain qzero, using either theoretical limits on gamma or observational constraints on the full distribution of gamma in each of several redshift bins, as could be derived from the (Betaapp) statistics in larger samples. We investigate several modifications to the simple relativistic beam concept, and their effects on the (Betaapp) statistics. There is likely to be a spread of gamma over the sample, with relative width W. There could also be a separate pattern and bulk gamma, which we model with a factor r identically equal to gammap/gammab. The values of W and r are coupled, and a swath in the (W,r)-plane is allowed by the (Betaapp) data in core-selected quasars. Interestingly, gammap could be both smaller and larger than gammab, or they could be equal, if W is large, but the most naive model (0,1) -- the same Lorentz factor in all sources and no separate pattern motions -- is excluded. A possible cutoff in quasar jet orientations, as in some unification models, causes a sharp shift toward higher (Betaapp) in randomly oriented samples but does not strongly affect the statistics of core-selected samples. If there is moderate bending of the jets on parsec scales, on the other hand, this has no significant impact on

  10. Statistical aspects of solar flares

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.

    1987-01-01

A survey of the statistical properties of 850 H alpha solar flares during 1975 is presented. Comparison of the results found here with those reported elsewhere for different epochs is accomplished. Distributions of rise time, decay time, and duration are given, as are the mean, mode, median, and 90th percentile values. Proportions by selected groupings are also determined. For flares in general, mean values for rise time and duration are 5.2 + or - 0.4 min and 18.1 + or - 1.1 min, respectively. Subflares, accounting for nearly 90 percent of the flares, had mean values lower than those found for flares of H alpha importance greater than 1, and the differences are statistically significant. Likewise, flares of bright and normal relative brightness have mean values of decay time and duration that are significantly longer than those computed for faint flares, and mass-motion related flares are significantly longer than non-mass-motion related flares. Seventy-three percent of the mass-motion related flares are categorized as being two-ribbon flares and/or being accompanied by a high-speed dark filament. Slow rise time flares (rise time greater than 5 min) have a mean value for duration that is significantly longer than that computed for fast rise time flares, and long-lived duration flares (duration greater than 18 min) have a mean value for rise time that is significantly longer than that computed for short-lived duration flares, suggesting a positive linear relationship between rise time and duration for flares. Monthly occurrence rates for flares in general and by group are found to be linearly related in a positive sense to monthly sunspot number. Statistical testing reveals the association between sunspot number and numbers of flares to be significant at the 95 percent level of confidence, and the t statistic for slope is significant at greater than the 99 percent level of confidence. Dependent upon the specific fit, between 58 percent and 94 percent of

  11. Using scan statistics for congenital anomalies surveillance: the EUROCAT methodology.

    PubMed

    Teljeur, Conor; Kelly, Alan; Loane, Maria; Densem, James; Dolk, Helen

    2015-11-01

Scan statistics have been used extensively to identify temporal clusters of health events. We describe the temporal cluster detection methodology adopted by the EUROCAT (European Surveillance of Congenital Anomalies) monitoring system. Since 2001, EUROCAT has implemented a variable window width scan statistic for detecting unusual temporal aggregations of congenital anomaly cases. The scan windows are based on numbers of cases rather than being defined by time. The methodology is embedded in the EUROCAT Central Database for annual application to centrally held registry data. The methodology was incrementally adapted to improve its utility and to address statistical issues. Simulation exercises were used to determine the power of the methodology to identify periods of raised risk (of 1-18 months). In order to operationalize the scan methodology, a number of adaptations were needed, including: estimating date of conception as the unit of time; deciding the maximum length (in time) and recency of clusters of interest; reporting of multiple and overlapping significant clusters; replacing the Monte Carlo simulation with a lookup table to reduce computation time; and placing a threshold on underlying population change and estimating the false positive rate by simulation. Exploration of power found that raised-risk periods lasting 1 month are unlikely to be detected except when the relative risk and case counts are high. The variable window width scan statistic is a useful tool for the surveillance of congenital anomalies. Numerous adaptations have improved the utility of the original methodology in the context of temporal cluster detection in congenital anomalies.
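The core idea of a case-count-based variable window scan can be sketched generically. This is not the EUROCAT implementation (which uses estimated conception dates, a lookup table in place of simulation, and population-change thresholds); it is a minimal Monte Carlo illustration with hypothetical function names:

```python
import random

def min_span(dates, k):
    """Shortest interval (in days) containing k consecutive cases; the scan
    window is defined by case count, not by a fixed length of time."""
    dates = sorted(dates)
    return min(dates[i + k - 1] - dates[i] for i in range(len(dates) - k + 1))

def scan_p_value(dates, k, period_days, n_sim=1000, seed=0):
    """Monte Carlo p-value: how often does a uniform (no-cluster) process with
    the same case count produce an equally tight cluster of k cases?"""
    rng = random.Random(seed)
    observed = min_span(dates, k)
    n = len(dates)
    hits = 0
    for _ in range(n_sim):
        sim = [rng.uniform(0, period_days) for _ in range(n)]
        if min_span(sim, k) <= observed:
            hits += 1
    return hits / n_sim
```

A tight group of k cases yields a small observed span and hence a small p-value; scanning over several values of k is what makes the window width variable.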

  12. Optimized Flood Forecasts Using a Statistical Ensemble

    NASA Astrophysics Data System (ADS)

    Silver, Micha; Fredj, Erick

    2016-04-01

    The method presented here assembles an optimized flood forecast from a set of consecutive WRF-Hydro simulations by applying coefficients which we derive from straightforward statistical procedures. Several government and research institutions that produce climate data offer ensemble forecasts, which merge predictions from different models to gain a more accurate fit to observed data. Existing ensemble forecasts present climate and weather predictions only. In this research we propose a novel approach to constructing hydrological ensembles for flood forecasting. The ensemble flood forecast is created by combining predictions from the same model, but initiated at different times. An operative flood forecasting system, run by the Israeli Hydrological Service, produces flood forecasts twice daily with a 72 hour forecast period. By collating the output from consecutive simulation runs we have access to multiple overlapping forecasts. We then apply two statistical procedures to blend these consecutive forecasts, resulting in a very close fit to observed flood runoff. We first employ cross-correlation with a time lag to determine a time shift for each of the original, consecutive forecasts. This shift corrects for two possible sources of error: slow or fast moving weather fronts in the base climate data; and mis-calibrations of the WRF-Hydro model in determining the rate of flow of surface runoff and in channels. We apply this time shift to all consecutive forecasts, then run a linear regression with the observed runoff data as the dependent variable and all shifted forecasts as the predictor variables. The solution to the linear regression equation is a set of coefficients that corrects the amplitude errors in the forecasts. These resulting regression coefficients are then applied to the consecutive forecasts producing a statistical ensemble which, by design, closely matches the observed runoff. 
After performing this procedure over many storm events in the Negev region
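The two-step correction described above (a cross-correlation time shift for each consecutive forecast, then a linear regression for amplitude) can be sketched as below. This is an illustrative reconstruction, not the operational code: np.roll wraps around rather than truncating at the series ends, the regression has no intercept, and the function names are not from the paper:

```python
import numpy as np

def best_lag(forecast, observed, max_lag=12):
    """Time shift (in steps) that maximizes the cross-correlation between a
    forecast series and the observed runoff series."""
    def corr(lag):
        return np.corrcoef(np.roll(forecast, lag), observed)[0, 1]
    return max(range(-max_lag, max_lag + 1), key=corr)

def blend_forecasts(forecasts, observed, max_lag=12):
    """Shift each consecutive forecast to its best lag, then regress observed
    runoff on the shifted forecasts; the least-squares coefficients correct
    amplitude errors and define the statistical ensemble."""
    shifted = np.column_stack(
        [np.roll(f, best_lag(f, observed, max_lag)) for f in forecasts])
    coef, *_ = np.linalg.lstsq(shifted, observed, rcond=None)
    return coef, shifted @ coef
```

By construction the blended series fits the observed runoff on the calibration storms; the coefficients are then carried forward to blend future overlapping forecasts.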

  13. Assessing Statistical Model Assumptions under Climate Change

    NASA Astrophysics Data System (ADS)

    Varotsos, Konstantinos V.; Giannakopoulos, Christos; Tombrou, Maria

    2016-04-01

The majority of studies assess climate change impacts on air quality using chemical transport models coupled to climate models in an off-line mode, for various horizontal resolutions and different present and future time slices. A complementary approach is based on present-day empirical relations between air pollutants and various meteorological variables, which are then extrapolated to the future. However, the extrapolation relies on various assumptions, such as that these relationships will retain their main characteristics in the future. In this study we focus on the ozone-temperature relationship. It is well known that, among a number of meteorological variables, temperature is found to exhibit the highest correlation with ozone concentrations. This has led, in past years, to the development and application of statistical models with which the potential impact of increasing future temperatures on various ozone statistical targets was examined. To examine whether the ozone-temperature relationship retains its main characteristics under warmer temperatures, we analyze the relationship during the heatwave events of 2003 and 2006 in Europe. More specifically, we use available gridded daily maximum temperatures (E-OBS) and hourly ozone observations from different non-urban stations (EMEP) within the areas that were impacted by the two heatwave events. In addition, we compare the temperature distributions of the two events with temperatures from two different future time periods, 2021-2050 and 2071-2100, from a number of regional climate models developed under the framework of the Cordex initiative (http://www.cordex.org) with a horizontal resolution of 12 x 12 km, based on different IPCC RCPs emissions scenarios. A statistical analysis is performed on the ozone-temperature relationship for each station and for the two aforementioned years, which are then compared against the ozone-temperature relationships obtained from the rest of the available data series.
The
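The per-station statistical analysis described above can be illustrated with a minimal sketch: an ordinary least-squares fit of daily maximum ozone against daily maximum temperature. The data here are synthetic stand-ins for the E-OBS/EMEP series, and the coefficients are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for one station's data: daily maximum temperature
# (E-OBS-like, degC) and daily maximum ozone (EMEP-like, ppb).
t_max = rng.uniform(10.0, 35.0, 200)
ozone = 20.0 + 2.2 * t_max + rng.normal(0.0, 8.0, 200)

# Ordinary least-squares fit of the ozone-temperature relationship.
slope, intercept = np.polyfit(t_max, ozone, 1)
r = np.corrcoef(t_max, ozone)[0, 1]

print(f"slope = {slope:.2f} ppb/degC, r = {r:.2f}")
```

Comparing the fitted slope for the heatwave years against the slope for the rest of the record is one way to check whether the relationship holds under warmer temperatures.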

  14. BETTER STATISTICS FOR BETTER DECISIONS: REJECTING NULL HYPOTHESES STATISTICAL TESTS IN FAVOR OF REPLICATION STATISTICS

    PubMed Central

    SANABRIA, FEDERICO; KILLEEN, PETER R.

    2008-01-01

    Despite being under challenge for the past 50 years, null hypothesis significance testing (NHST) remains dominant in the scientific field for want of viable alternatives. NHST, along with its significance level p, is inadequate for most of the uses to which it is put, a flaw that is of particular interest to educational practitioners who too often must use it to sanctify their research. In this article, we review the failure of NHST and propose prep, the probability of replicating an effect, as a more useful statistic for evaluating research and aiding practical decision making. PMID:19122766
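Killeen's prep statistic can be computed from a conventional p value under a normal approximation; a minimal sketch (assuming a one-tailed p, per Killeen, 2005):

```python
from math import sqrt
from statistics import NormalDist

def p_rep(p: float) -> float:
    """Killeen's (2005) estimate of the probability of replicating an
    effect's direction, from a one-tailed p value (normal approximation)."""
    nd = NormalDist()
    z = nd.inv_cdf(1.0 - p)       # observed effect in z units
    return nd.cdf(z / sqrt(2.0))  # probability a replication has the same sign

print(round(p_rep(0.025), 3))
```

For example, a one-tailed p of .025 corresponds to a replication probability of roughly .92, which is the kind of directly interpretable quantity the authors argue for.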

  15. On statistical relationship of solar, geomagnetic and human activities.

    PubMed

    Alania, M V; Gil, A; Modzelewska, R

    2004-01-01

    Data on galactic cosmic rays, solar and geomagnetic activity, and solar wind parameters on the one hand, and car accident events (CAE) in Poland on the other, have been analyzed in order to reveal the statistical relationships among them for the period 1990-2001. Cross-correlation and cross-spectrum analyses of the galactic cosmic ray intensity, the solar wind (SW) velocity, the Kp index of geomagnetic activity, and CAE in Poland have been carried out. It is shown that in some epochs of the above-mentioned period a reliable relationship is found between CAE and solar and geomagnetic activity parameters across a range of periodicities, especially 7 days. The 7-day periodicity revealed in the CAE data peaks on Friday without exception, for both the minimum and maximum epochs of solar activity. The 7-day periodicity is also reliably revealed in the other parameters characterizing galactic cosmic rays, the SW, and solar and geomagnetic activity, especially for the minimum epoch of solar activity. The 3.5-day periodicity found in the CAE series can largely be ascribed to social effects, while the 7-day periodicity can be ascribed to social effects or to processes on the Sun, in interplanetary space, and in the Earth's magnetosphere and atmosphere.

  16. Gaussian statistics for palaeomagnetic vectors

    USGS Publications Warehouse

    Love, J.J.; Constable, C.G.

    2003-01-01

    With the aim of treating the statistics of palaeomagnetic directions and intensities jointly and consistently, we represent the mean and the variance of palaeomagnetic vectors, at a particular site and of a particular polarity, by a probability density function in a Cartesian three-space of orthogonal magnetic-field components consisting of a single (unimodal) non-zero mean, spherically-symmetrical (isotropic) Gaussian function. For palaeomagnetic data of mixed polarities, we consider a bimodal distribution consisting of a pair of such symmetrical Gaussian functions, with equal, but opposite, means and equal variances. For both the Gaussian and bi-Gaussian distributions, and in the spherical three-space of intensity, inclination, and declination, we obtain analytical expressions for the marginal density functions, the cumulative distributions, and the expected values and variances for each spherical coordinate (including the angle with respect to the axis of symmetry of the distributions). The mathematical expressions for the intensity and off-axis angle are closed-form and especially manageable, with the intensity distribution being Rayleigh-Rician. In the limit of small relative vectorial dispersion, the Gaussian (bi-Gaussian) directional distribution approaches a Fisher (Bingham) distribution and the intensity distribution approaches a normal distribution. In the opposite limit of large relative vectorial dispersion, the directional distributions approach a spherically-uniform distribution and the intensity distribution approaches a Maxwell distribution. We quantify biases in estimating the properties of the vector field resulting from the use of simple arithmetic averages, such as estimates of the intensity or the inclination of the mean vector, or the variances of these quantities. With the statistical framework developed here and using the maximum-likelihood method, which gives unbiased estimates in the limit of large numbers of data, we demonstrate how to
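The two limiting behaviours of the intensity distribution can be checked numerically with a quick Monte Carlo sketch (the mean and dispersion values here are arbitrary, chosen only to put the simulation well inside each limit):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

def intensities(mu: float, sigma: float) -> np.ndarray:
    """Vector intensities for an isotropic Gaussian with mean (0, 0, mu)."""
    v = rng.normal(0.0, sigma, (n, 3))
    v[:, 2] += mu
    return np.linalg.norm(v, axis=1)

# Small relative dispersion: intensity is approximately normal, centred
# near the length of the mean vector.
small = intensities(mu=50.0, sigma=2.0)

# Large relative dispersion (zero mean): intensity is Maxwell-distributed,
# with expected value 2 * sigma * sqrt(2 / pi).
large = intensities(mu=0.0, sigma=2.0)
ratio = large.mean() / (2 * 2.0 * np.sqrt(2 / np.pi))

print(round(small.mean(), 1), round(ratio, 2))
```

The small-dispersion mean sits near 50 (the mean vector length) while the zero-mean case matches the Maxwell expectation, consistent with the limits stated in the abstract.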

  17. Statistical Seismology and Induced Seismicity

    NASA Astrophysics Data System (ADS)

    Tiampo, K. F.; González, P. J.; Kazemian, J.

    2014-12-01

    While seismicity triggered or induced by natural resource production such as mining or water impoundment in large dams has long been recognized, the recent increase in the unconventional production of oil and gas has been linked to a rapid rise in seismicity in many places, including central North America (Ellsworth et al., 2012; Ellsworth, 2013). Worldwide, induced events of M~5 have occurred and, although rare, have resulted in both damage and public concern (Horton, 2012; Keranen et al., 2013). In addition, over the past twenty years, the increase in both the number and coverage of seismic stations has resulted in an unprecedented ability to precisely record the magnitude and location of large numbers of small-magnitude events. The increase in the number and type of seismic sequences available for detailed study has revealed differences in their statistics that were previously difficult to quantify. For example, seismic swarms that produce significant numbers of foreshocks as well as aftershocks have been observed in different tectonic settings, including California, Iceland, and the East Pacific Rise (McGuire et al., 2005; Shearer, 2012; Kazemian et al., 2014). Similarly, smaller events have been observed prior to larger induced events in several cases associated with energy production. The field of statistical seismology has long focused on the question of triggering and the mechanisms responsible (Stein et al., 1992; Hill et al., 1993; Steacy et al., 2005; Parsons, 2005; Main et al., 2006). For example, in most cases the associated stress perturbations are much smaller than the earthquake stress drop, suggesting an inherent sensitivity to relatively small stress changes (Nalbant et al., 2005). Induced seismicity provides the opportunity to investigate triggering and, in particular, the differences between long- and short-range triggering. 
Here we investigate the statistics of induced seismicity sequences from around the world, including central North America and Spain, and
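A workhorse statistic in this kind of sequence comparison is the Gutenberg-Richter b-value; a minimal sketch of the standard Aki (1965) maximum-likelihood estimator, applied to a synthetic catalogue (the magnitudes below are simulated, not data from the study):

```python
import numpy as np

def b_value_mle(mags: np.ndarray, m_c: float, dm: float = 0.1) -> float:
    """Aki (1965) maximum-likelihood b-value above completeness m_c,
    with Utsu's correction for magnitudes binned at width dm."""
    m = mags[mags >= m_c]
    mean_excess = m.mean() - (m_c - dm / 2.0)
    return 1.0 / (np.log(10.0) * mean_excess)

# Synthetic Gutenberg-Richter catalogue with b = 1: magnitudes above the
# completeness level are exponentially distributed with mean 1 / ln(10).
rng = np.random.default_rng(2)
mags = 2.0 + rng.exponential(1.0 / np.log(10.0), 50_000)

print(round(b_value_mle(mags, m_c=2.0, dm=0.0), 2))
```

Differences in b-value between tectonic and induced sequences are one of the statistical signatures such studies examine.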

  18. Statistical learning across development: flexible yet constrained.

    PubMed

    Krogh, Lauren; Vlach, Haley A; Johnson, Scott P

    2012-01-01

    Much research in the past two decades has documented infants' and adults' ability to extract statistical regularities from auditory input. Importantly, recent research has extended these findings to the visual domain, demonstrating learners' sensitivity to statistical patterns within visual arrays and sequences of shapes. In this review we discuss both auditory and visual statistical learning to elucidate both the generality of and constraints on statistical learning. The review first outlines the major findings of the statistical learning literature with infants, followed by discussion of statistical learning across domains, modalities, and development. The second part of this review considers constraints on statistical learning. The discussion focuses on two categories of constraint: constraints on the types of input over which statistical learning operates and constraints based on the state of the learner. The review concludes with a discussion of possible mechanisms underlying statistical learning.
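The core computation in classic statistical-learning studies is the transitional probability between adjacent elements; a minimal sketch on a toy syllable stream (the syllables and stream are invented for illustration):

```python
from collections import Counter

def transitional_probabilities(stream: list[str]) -> dict[tuple[str, str], float]:
    """TP(y | x) = count(x followed by y) / count(x), over adjacent pairs."""
    pair_counts = Counter(zip(stream, stream[1:]))
    first_counts = Counter(stream[:-1])
    return {(x, y): c / first_counts[x] for (x, y), c in pair_counts.items()}

# Toy stream: "bi" is always followed by "da" (a within-"word" transition),
# while "da" is followed by varying syllables (a "word" boundary).
stream = ["bi", "da", "ku", "bi", "da", "go", "bi", "da", "ku"]
tp = transitional_probabilities(stream)

print(tp[("bi", "da")], round(tp[("da", "ku")], 2))
```

Learners are thought to exploit exactly this contrast: high within-unit transitional probabilities versus lower ones at boundaries, in both auditory and visual sequences.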

  19. NLO error propagation exercise: statistical results

    SciTech Connect

    Pack, D.J.; Downing, D.J.

    1985-09-01

    Error propagation is the extrapolation and cumulation of uncertainty (variance) over total amounts of special nuclear material, for example, uranium or ²³⁵U, that are present in a defined location at a given time. The uncertainty results from the inevitable inexactness of individual measurements of weight, uranium concentration, ²³⁵U enrichment, etc. The extrapolated and cumulated uncertainty leads directly to quantified limits of error on inventory differences (LEIDs) for such material. The NLO error propagation exercise was planned as a field demonstration of the utilization of statistical error propagation methodology at the Feed Materials Production Center in Fernald, Ohio from April 1 to July 1, 1983 in a single material balance area formed specially for the exercise. Major elements of the error propagation methodology were: variance approximation by Taylor series expansion; variance cumulation over uncorrelated primary error sources as suggested by Jaech; random-effects ANOVA model estimation of variance effects (systematic error); provision for inclusion of process variance in addition to measurement variance; and exclusion of static material. The methodology was applied to material balance area transactions from the indicated time period through a FORTRAN computer code developed specifically for this purpose on the NLO HP-3000 computer. This paper contains a complete description of the error propagation methodology and a full summary of the numerical results of applying the methodology in the field demonstration. The error propagation LEIDs did encompass the actual uranium and ²³⁵U inventory differences. Further, one can see that error propagation actually provides guidance for reducing inventory differences and LEIDs in future time periods.
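The first element listed above, variance approximation by Taylor series expansion, can be sketched for a single hypothetical item whose ²³⁵U mass is the product of weight, uranium concentration, and enrichment (the numbers below are invented for illustration, not values from the exercise):

```python
import math

# Hypothetical item: 235U mass M = W * C * E
# (gross weight * uranium concentration * 235U enrichment).
W, C, E = 100.0, 0.85, 0.03                    # kg, fraction U, fraction 235U
rel_sd = {"W": 0.001, "C": 0.005, "E": 0.002}  # relative standard deviations

M = W * C * E

# First-order Taylor expansion of a product: the relative variances of
# uncorrelated factors add.
rel_var_M = sum(r * r for r in rel_sd.values())
sd_M = M * math.sqrt(rel_var_M)

print(round(M, 3), "kg +/-", round(sd_M * 1000, 3), "g")
```

Cumulating such per-item variances over all transactions in the material balance area, as Jaech suggests for uncorrelated primary error sources, yields the variance behind the LEID.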

  20. Statistical Analysis of Tsunami Variability

    NASA Astrophysics Data System (ADS)

    Zolezzi, Francesca; Del Giudice, Tania; Traverso, Chiara; Valfrè, Giulio; Poggi, Pamela; Parker, Eric J.

    2010-05-01

    The purpose of this paper was to investigate the statistical variability of seismically generated tsunami impact. The specific goal of the work was to evaluate the variability in tsunami wave run-up due to uncertainty in fault rupture parameters (source effects) and to the effects of local bathymetry at an individual location (site effects). This knowledge is critical to the development of methodologies for probabilistic tsunami hazard assessment. Two types of variability were considered: inter-event and intra-event. Generally, inter-event variability refers to the differences in tsunami run-up at a given location for a number of different earthquake events. The focus of the current study was to evaluate the variability of tsunami run-up at a given point for a given magnitude earthquake. In this case, the variability is expected to arise from lack of knowledge regarding the specific details of the fault rupture "source" parameters. As sufficient field observations are not available to resolve this question, numerical modelling was used to generate run-up data. A scenario magnitude 8 earthquake in the Hellenic Arc was modelled. This is similar to the event thought to have caused the infamous 1303 tsunami. The tsunami wave run-up was computed at 4020 locations along the Egyptian coast between longitudes 28.7° E and 33.8° E. Specific source parameters (e.g. fault rupture length and displacement) were varied, and the effects on wave height were determined. A Monte Carlo approach considering the statistical distribution of the underlying parameters was used to evaluate the variability in wave height at locations along the coast. The results were evaluated in terms of the coefficient of variation of the simulated wave run-up (standard deviation divided by mean value) for each location. The coefficient of variation along the coast was between 0.14 and 3.11, with an average value of 0.67. The variation was higher in areas of irregular coast. This level of variability is
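The Monte Carlo procedure described above can be sketched for a single site; the run-up formula and parameter distributions below are placeholders standing in for the authors' tsunami model, used only to show how the coefficient of variation is obtained:

```python
import numpy as np

rng = np.random.default_rng(3)
n_sims = 5_000

# Randomly sampled source parameters (illustrative distributions).
length = rng.lognormal(np.log(200.0), 0.2, n_sims)  # fault rupture length, km
slip = rng.lognormal(np.log(8.0), 0.3, n_sims)      # fault displacement, m

# Placeholder run-up model at one coastal site (not the paper's model).
runup = 0.05 * slip * np.sqrt(length / 100.0)       # m

# Coefficient of variation: standard deviation divided by mean.
cov = runup.std() / runup.mean()
print(round(cov, 2))
```

Repeating this at each of the coastal locations and mapping the resulting coefficients of variation is what allows the spatial pattern, such as higher variability along irregular coast, to be identified.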