42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 3 2012-10-01 2012-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...
42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 3 2013-10-01 2013-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...
42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 3 2014-10-01 2014-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...
42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 3 2011-10-01 2011-10-01 false Adequate financial records, statistical data, and... financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination of costs payable by...
42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 3 2010-10-01 2010-10-01 false Adequate financial records, statistical data, and... financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination of costs payable by...
Explorations in Statistics: Power
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2010-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four things affect…
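Power lends itself to exactly the kind of active exploration the article advocates: it can be estimated by simulation as the fraction of repeated experiments in which a true effect is detected. A minimal sketch, assuming an illustrative two-sample z-test with known variance (all parameter values are arbitrary):

```python
import math
import random
from statistics import NormalDist

def simulated_power(delta, sigma=1.0, n=30, alpha=0.05, n_sims=4000, seed=1):
    """Estimate power by simulation: the fraction of repeated experiments
    in which a two-sided two-sample z-test detects a true mean difference
    of `delta` between groups of size n."""
    rng = random.Random(seed)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    se = sigma * math.sqrt(2.0 / n)   # std. error of the difference in means
    rejections = 0
    for _ in range(n_sims):
        a = [rng.gauss(0.0, sigma) for _ in range(n)]
        b = [rng.gauss(delta, sigma) for _ in range(n)]
        z = (sum(b) / n - sum(a) / n) / se
        if abs(z) > z_crit:
            rejections += 1
    return rejections / n_sims

# With no true effect, the rejection rate is just the false-positive
# rate (about alpha); it climbs toward 1 as the true effect grows.
print(simulated_power(0.0))
print(simulated_power(0.8))
```

This makes the definition tangible: power is literally the long-run fraction of studies that reject a false null.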
INCREASING SCIENTIFIC POWER WITH STATISTICAL POWER
A brief survey of basic ideas in statistical power analysis demonstrates the advantages and ease of using power analysis throughout the design, analysis, and interpretation of research. The power of a statistical test is the probability of rejecting the null hypothesis of the test...
ERIC Educational Resources Information Center
Warkentien, Siri; Grady, Sarah
2009-01-01
This Statistics in Brief contributes to current research by investigating the use of tutoring services among a nationally representative group of public school students enrolled in grades K-12. The report compares students in schools that have not made Adequate Yearly Progress (AYP) for 3 or more years, and were thereby enrolled in schools that…
Calculating statistical power in Mendelian randomization studies.
Brion, Marie-Jo A; Shakhbazov, Konstantin; Visscher, Peter M
2013-10-01
In Mendelian randomization (MR) studies, where genetic variants are used as proxy measures for an exposure trait of interest, obtaining adequate statistical power is frequently a concern due to the small amount of variation in a phenotypic trait that is typically explained by genetic variants. A range of power estimates based on simulations and specific parameters for two-stage least squares (2SLS) MR analyses based on continuous variables has previously been published. However, there are presently no specific equations or software tools one can implement for calculating the power of a given MR study. Using asymptotic theory, we show that in the case of continuous variables and a single instrument, for example a single-nucleotide polymorphism (SNP) or a multiple-SNP predictor, statistical power for a fixed sample size is a function of two parameters: the proportion of variation in the exposure variable explained by the genetic predictor and the true causal association between the exposure and outcome variables. We demonstrate that power for 2SLS MR can be derived using the non-centrality parameter (NCP) of the statistical test that is employed to test whether the 2SLS regression coefficient is zero. We show that the previously published power estimates from simulations can be represented theoretically using this NCP-based approach, with similar estimates observed when the simulation-based estimates are compared with our NCP-based approach. General equations for calculating statistical power for 2SLS MR using the NCP are provided in this note, and we implement the calculations in a web-based application. PMID:24159078
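An NCP-based calculation of this kind can be sketched as follows. This is a simplified illustration, not the authors' published equations: it assumes standardized variables and takes the noncentrality of the 1-df test as λ ≈ N · ρ² · β², where ρ² is the variance in the exposure explained by the genetic instrument and β is the true causal effect.

```python
from statistics import NormalDist

def ncp_power(n, r2_gx, beta, alpha=0.05):
    """Approximate power of a 1-df test with noncentrality parameter
    lambda = n * r2_gx * beta**2 (standardized variables assumed).
    r2_gx: variance in the exposure explained by the genetic instrument;
    beta:  true causal effect of the exposure on the outcome."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / 2)
    lam = n * r2_gx * beta ** 2       # noncentrality parameter (NCP)
    s = lam ** 0.5
    # Two-sided normal-approximation power for a shifted test statistic:
    return (1 - nd.cdf(z_crit - s)) + nd.cdf(-z_crit - s)

# Weak instruments (small r2_gx) demand very large samples:
print(round(ncp_power(n=10_000, r2_gx=0.02, beta=0.2), 3))
print(round(ncp_power(n=100_000, r2_gx=0.02, beta=0.2), 3))
```

The sketch reproduces the qualitative message of the abstract: power depends only on the sample size, the instrument strength, and the true causal effect.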
Statistical Power in Meta-Analysis
ERIC Educational Resources Information Center
Liu, Jin
2015-01-01
Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation on two sample mean difference test under different situations: (1) the discrepancy between the analytical power and…
Can loss of balance from mesoscale eddies adequately power deep ocean mixing?
NASA Astrophysics Data System (ADS)
Williams, P. D.; Haine, T. W.; Read, P. L.
2009-12-01
The global ocean thermohaline circulation is partly composed of the sinking of dense surface waters at high latitudes. But in order to close the circulation and maintain the abyssal stratification, the dense waters must rise up again through vertical mixing. This process requires a source of energy roughly estimated to be 2 TW. Previous work has concluded that tides and winds may adequately supply the required power, but the conceivable role of loss of balance from mesoscale eddies, resulting in the generation of internal inertia-gravity waves and associated vertical mixing, has hitherto been considered to be 'of unknown importance' (Wunsch and Ferrari, 2004). We investigate the potential role of loss of balance, by studying the generation of internal inertia-gravity waves by balanced flow in a rotating two-layer annulus laboratory experiment (Williams et al., 2008). A photograph from the experiment is shown in the figure. As the Rossby number of the balanced flow decreases, the amplitude of the emitted inertia-gravity waves also decreases, but much less rapidly than is predicted by several dynamical theories. This finding suggests that inertia-gravity waves might be far more energised than previously thought. The balanced flow leaks roughly one per cent of its energy each rotation period into internal inertia-gravity waves at the peak of their generation. Crude extrapolation of this result to the global ocean suggests that the flux of energy from mesoscale eddies into internal waves may be as large as 1.5 TW. We claim no accuracy for this figure which is only indicative. Nevertheless, we are persuaded that generation of inertia-gravity waves from the balanced mesoscale flow may be an important source of energy for deep interior mixing, and deserves further study. Reference Williams, PD, Haine, TWN and Read, PL (2008) Inertia-Gravity Waves Emitted from Balanced Flow: Observations, Properties, and Consequences. Journal of the Atmospheric Sciences, 65(11), pp 3543
Statistical Performances of Resistive Active Power Splitter
NASA Astrophysics Data System (ADS)
Lalléchère, Sébastien; Ravelo, Blaise; Thakur, Atul
2016-03-01
In this paper, the synthesis and sensitivity analysis of an active power splitter (PWS) is proposed. It is based on an active cell composed of a Field Effect Transistor cascaded with shunt resistors at the input and the output (resistive amplifier topology). The sensitivity of the PWS to resistance tolerances is assessed using a stochastic method. Furthermore, the proposed topology allows the device gain to be controlled easily by varying a resistance. This provides a useful tool for analysing the statistical sensitivity of the system in an uncertain environment.
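A stochastic tolerance analysis of this kind can be illustrated on a much simpler circuit. The sketch below uses a plain resistive divider as a hypothetical stand-in for one splitter branch (the FET-based active cell is not modeled); the nominal values and the ±5% tolerance are arbitrary.

```python
import random
import statistics

def divider_gain(r1, r2):
    """Voltage gain of a simple resistive divider (a stand-in for one
    branch of the splitter; not the paper's FET-based cell)."""
    return r2 / (r1 + r2)

def tolerance_mc(r1_nom=50.0, r2_nom=50.0, tol=0.05, n_draws=20_000, seed=7):
    """Monte Carlo sensitivity of the gain to resistor tolerances:
    each resistor is drawn uniformly within +/- tol of its nominal value."""
    rng = random.Random(seed)
    gains = []
    for _ in range(n_draws):
        r1 = r1_nom * (1 + rng.uniform(-tol, tol))
        r2 = r2_nom * (1 + rng.uniform(-tol, tol))
        gains.append(divider_gain(r1, r2))
    return statistics.mean(gains), statistics.stdev(gains)

mean_g, std_g = tolerance_mc()
print(f"gain = {mean_g:.4f} +/- {std_g:.4f}")
```

The standard deviation of the sampled gains is the statistical sensitivity figure that such an analysis reports.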
Evaluating and Reporting Statistical Power in Counseling Research
ERIC Educational Resources Information Center
Balkin, Richard S.; Sheperis, Carl J.
2011-01-01
Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…
Practical Uses of Statistical Power in Business Research Studies.
ERIC Educational Resources Information Center
Markowski, Edward P.; Markowski, Carol A.
1999-01-01
Proposes the use of statistical power subsequent to the results of hypothesis testing in business research. Describes how posttest use of power might be integrated into business statistics courses. (SK)
Toward "Constructing" the Concept of Statistical Power: An Optical Analogy.
ERIC Educational Resources Information Center
Rogers, Bruce G.
This paper presents a visual analogy that may be used by instructors to teach the concept of statistical power in statistical courses. Statistical power is mathematically defined as the probability of rejecting a null hypothesis when that null is false, or, equivalently, the probability of detecting a relationship when it exists. The analogy…
The Importance of Teaching Power in Statistical Hypothesis Testing
ERIC Educational Resources Information Center
Olinsky, Alan; Schumacher, Phyllis; Quinn, John
2012-01-01
In this paper, we discuss the importance of teaching power considerations in statistical hypothesis testing. Statistical power analysis determines the ability of a study to detect a meaningful effect size, where the effect size is the difference between the hypothesized value of the population parameter under the null hypothesis and the true value…
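The planning question at the heart of such teaching, namely how large a sample is needed to detect a given effect size, can be answered directly for a simple case. A sketch for a two-sided one-sample z-test, using a normal approximation rather than a general-purpose power calculator:

```python
from statistics import NormalDist

def power_one_sample(d, n, alpha=0.05):
    """Approximate power of a two-sided one-sample z-test for a
    standardized effect size d (difference in SD units), sample size n."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / 2)
    shift = d * n ** 0.5
    return (1 - nd.cdf(z_crit - shift)) + nd.cdf(-z_crit - shift)

def required_n(d, target=0.80, alpha=0.05):
    """Smallest n reaching the target power -- the planning question
    students should be able to answer before collecting data."""
    n = 2
    while power_one_sample(d, n, alpha) < target:
        n += 1
    return n

print(required_n(0.5))  # medium effect
print(required_n(0.2))  # a small effect needs far more subjects
</```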
Accuracy of Estimates and Statistical Power for Testing Mediation in Latent Growth Curve Modeling
Cheong, JeeWon
2016-01-01
The latent growth curve modeling (LGCM) approach has been increasingly utilized to investigate longitudinal mediation. However, little is known about the accuracy of the estimates and statistical power when mediation is evaluated in the LGCM framework. A simulation study was conducted to address these issues under various conditions, including sample size, effect size of the mediated effect, number of measurement occasions, and R² of the measured variables. In general, the results showed that relatively large samples were needed to accurately estimate the mediated effects and to have adequate statistical power when testing mediation in the LGCM framework. Guidelines for designing studies to examine longitudinal mediation and ways to improve the accuracy of the estimates and statistical power are discussed.
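The simulation approach used in studies like this can be sketched in miniature. The example below estimates, by Monte Carlo, the power of a Sobel test for a simple three-variable mediation chain (not an LGCM); all coefficients, sample sizes, and error distributions are illustrative, and the chain is assumed to have no direct X-to-Y effect so that simple regressions suffice.

```python
import math
import random

def ols(x, y):
    """Simple-regression slope and its standard error."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    resid_var = sum((yi - my - slope * (xi - mx)) ** 2
                    for xi, yi in zip(x, y)) / (n - 2)
    return slope, math.sqrt(resid_var / sxx)

def mediation_power(a, b, n=100, n_sims=1000, seed=3):
    """Monte Carlo power of the Sobel test for the mediated effect a*b
    in the chain X -> M -> Y (no direct effect, all errors N(0, 1))."""
    rng = random.Random(seed)
    z_crit = 1.96
    hits = 0
    for _ in range(n_sims):
        x = [rng.gauss(0, 1) for _ in range(n)]
        m = [a * xi + rng.gauss(0, 1) for xi in x]
        y = [b * mi + rng.gauss(0, 1) for mi in m]
        ah, se_a = ols(x, m)
        bh, se_b = ols(m, y)
        z = (ah * bh) / math.sqrt(ah**2 * se_b**2 + bh**2 * se_a**2)
        if abs(z) > z_crit:
            hits += 1
    return hits / n_sims

print(mediation_power(a=0.0, b=0.0))   # near (or below) the nominal alpha
print(mediation_power(a=0.4, b=0.4))   # power for a moderate mediated effect
```

Even this toy version shows the paper's headline finding in action: detecting a product of two moderate paths takes a substantial sample.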
Statistical modelling of mitochondrial power supply.
James, A T; Wiskich, J T; Conyers, R A
1989-01-01
By experiment and theory, formulae are derived to calculate the response of mitochondrial power supply, in flux and potential, to an ATP consuming enzyme load, incorporating effects of varying amounts of (i) enzyme, (ii) total circulating adenylate, and (iii) inhibition of the ATP/ADP translocase. The formulae, which apply between about 20% and 80% of maximum respiration, are the same as for the current and voltage of an electrical circuit in which a battery with potential, linear in the logarithm of the total adenylate, charges another battery whose opposing potential is also linear in the same logarithm, through three resistances. These resistances produce loss of potential due to dis-equilibrium of (i) intramitochondrial oxidative phosphorylation, (ii) the ATP/ADP translocase, and (iii) the ATP-consuming enzyme load. The model is represented geometrically by the following configuration: when potential is plotted against flux, the points lie on two pencils of lines each concurrent at zero respiration, the two pencils describing the respective characteristics of the mitochondrion and enzyme. Control coefficients and elasticities are calculated from the formulae. PMID:2708917
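The battery-and-resistance analogy can be made concrete with a toy calculation: a supply "battery" whose potential is linear in the logarithm of the adenylate pool charges an opposing load "battery" through three series resistances. All parameter values below are illustrative placeholders, not fitted values from the paper.

```python
import math

# Illustrative parameters (placeholders, not values from the paper):
V0_SUPPLY, K_SUPPLY = 10.0, 1.5   # mitochondrial "battery": V = V0 + k * ln(A)
V0_LOAD,   K_LOAD   = 2.0,  0.8   # opposing "battery" of the ATP-consuming load
R_OXPHOS, R_TRANSLOCASE, R_LOAD = 1.0, 0.5, 2.0   # the three loss terms

def steady_state(adenylate):
    """Flux and delivered potential for a total adenylate pool A,
    using the two-battery / three-resistance circuit analogy."""
    v_supply = V0_SUPPLY + K_SUPPLY * math.log(adenylate)
    v_load = V0_LOAD + K_LOAD * math.log(adenylate)
    r_total = R_OXPHOS + R_TRANSLOCASE + R_LOAD
    flux = (v_supply - v_load) / r_total           # Ohm's-law analogue
    potential = v_supply - flux * (R_OXPHOS + R_TRANSLOCASE)
    return flux, potential

for a in (1.0, 2.0, 4.0):
    flux, pot = steady_state(a)
    print(f"A={a:.0f}: flux={flux:.2f}, potential={pot:.2f}")
```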
New Dynamical-Statistical Techniques for Wind Power Prediction
NASA Astrophysics Data System (ADS)
Stathopoulos, C.; Kaperoni, A.; Galanis, G.; Kallos, G.
2012-04-01
The increased use of renewable energy sources, and especially of wind power, has revealed the significance of accurate environmental and wind power predictions over wind farms, which critically affect the integration of the produced power into the general grid. This issue is studied in the present paper by means of high-resolution physical and statistical models. Two numerical weather prediction (NWP) systems, namely SKIRON and RAMS, are used to simulate the flow characteristics in selected wind farms in Greece. The NWP model output is post-processed by utilizing Kalman and Kolmogorov statistics in order to remove systematic errors. Modeled wind predictions in combination with available on-site observations are used to estimate the wind power potential by utilizing a variety of statistical power prediction models based on non-linear and hyperbolic functions. The obtained results reveal the strong dependence of the forecast uncertainty on wind variation, the limited influence of previously recorded power values, and the advantages that non-linear, non-polynomial functions can have in the successful control of power curve characteristics. This methodology was developed in the framework of the FP7 projects WAUDIT and MARINA PLATFORM.
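Kalman post-processing of this kind can be sketched as a scalar filter that tracks the slowly varying systematic error of the NWP output. The noise variances and the synthetic data below are illustrative choices, not values from the study.

```python
import math
import random

def kalman_bias(forecasts, observations, q=0.01, r=1.0):
    """Scalar Kalman filter tracking the systematic forecast error (bias).
    q and r are process / measurement noise variances -- tuning choices
    here, not values from the paper."""
    x, p = 0.0, 1.0                  # bias estimate and its variance
    corrected = []
    for f, o in zip(forecasts, observations):
        p += q                       # predict: the bias drifts slowly
        k = p / (p + r)              # Kalman gain
        x += k * ((f - o) - x)       # update with the latest forecast error
        p *= 1 - k
        corrected.append(f - x)      # debiased forecast
    return corrected, x

rng = random.Random(0)
truth = [8 + 2 * math.sin(i / 10) for i in range(200)]     # "observed" wind
raw = [t + 1.5 + rng.gauss(0, 0.5) for t in truth]         # biased NWP output
corrected, bias_hat = kalman_bias(raw, truth)
print(f"estimated systematic bias: {bias_hat:.2f}")
```

After a short spin-up the filter converges on the systematic error, and the corrected series tracks the observations much more closely than the raw output.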
Statistical Power Analysis in Education Research. NCSER 2010-3006
ERIC Educational Resources Information Center
Hedges, Larry V.; Rhoads, Christopher
2010-01-01
This paper provides a guide to calculating statistical power for the complex multilevel designs that are used in most field studies in education research. For multilevel evaluation studies in the field of education, it is important to account for the impact of clustering on the standard errors of estimates of treatment effects. Using ideas from…
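A central idea in such multilevel power calculations is that clustering inflates the variance of the treatment-effect estimate by the design effect, DEFF = 1 + (m − 1)ρ, shrinking the effective sample size. A simplified sketch, using a normal approximation with equal cluster sizes rather than the paper's full framework:

```python
from statistics import NormalDist

def design_effect(m, icc):
    """Kish design effect: variance inflation from clusters of size m
    with intraclass correlation icc."""
    return 1 + (m - 1) * icc

def clustered_power(d, n_total, m, icc, alpha=0.05):
    """Two-sided z-test power for standardized effect d when the
    effective sample size is n_total / DEFF (a common approximation;
    two equal arms are assumed)."""
    nd = NormalDist()
    n_eff = n_total / design_effect(m, icc)
    shift = d * (n_eff / 4) ** 0.5     # se of the mean difference = 2/sqrt(n)
    z_crit = nd.inv_cdf(1 - alpha / 2)
    return (1 - nd.cdf(z_crit - shift)) + nd.cdf(-z_crit - shift)

# The same 800 students give much less power when clustering is honored:
print(round(clustered_power(0.25, 800, m=1, icc=0.0), 3))    # simple random sample
print(round(clustered_power(0.25, 800, m=25, icc=0.2), 3))   # 32 classrooms
```

This is exactly why accounting for clustering in the standard errors matters: ignoring it dramatically overstates a study's power.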
Power Curve Modeling in Complex Terrain Using Statistical Models
NASA Astrophysics Data System (ADS)
Bulaevskaya, V.; Wharton, S.; Clifton, A.; Qualley, G.; Miller, W.
2014-12-01
Traditional power output curves typically model power only as a function of the wind speed at the turbine hub height. While the latter is an essential predictor of power output, wind speed information in other parts of the vertical profile, as well as additional atmospheric variables, are also important determinants of power. The goal of this work was to determine the gain in predictive ability afforded by adding wind speed information at other heights, as well as other atmospheric variables, to the power prediction model. Using data from a wind farm with moderately complex terrain in the Altamont Pass region in California, we trained three statistical models (a neural network, a random forest, and a Gaussian process model) to predict power output from various sets of the aforementioned predictors. The comparison of these predictions to the observed power data revealed that considerable improvements in prediction accuracy can be achieved both through the addition of predictors other than the hub-height wind speed and through the use of statistical models. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344 and was funded by the Wind Uncertainty Quantification Laboratory Directed Research and Development Project at LLNL under project tracking code 12-ERD-069.
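The traditional hub-height-only baseline that such statistical models are compared against can be written down directly. The sketch below is an idealized cubic power curve; the cut-in, rated, and cut-out speeds and the rated power are illustrative placeholders, not parameters of the Altamont Pass turbines.

```python
def power_curve(v, rated_kw=1500.0, cut_in=3.0, rated_v=12.0, cut_out=25.0):
    """Idealized power curve driven by hub-height wind speed alone:
    zero below cut-in and beyond cut-out, a cubic ramp up to rated
    speed, and constant rated power in between."""
    if v < cut_in or v >= cut_out:
        return 0.0
    if v >= rated_v:
        return rated_kw
    # Cubic ramp reflecting the v**3 dependence of kinetic energy flux:
    frac = (v**3 - cut_in**3) / (rated_v**3 - cut_in**3)
    return rated_kw * frac

for v in (2, 6, 12, 20, 26):
    print(v, round(power_curve(v), 1))
```

Everything a one-dimensional curve like this cannot capture (shear, veer, stability, turbulence) is what the statistical models in the study add.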
Robust Statistical Detection of Power-Law Cross-Correlation
NASA Astrophysics Data System (ADS)
Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert
2016-06-01
We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.
"Using Power Tables to Compute Statistical Power in Multilevel Experimental Designs"
ERIC Educational Resources Information Center
Konstantopoulos, Spyros
2009-01-01
Power computations for one-level experimental designs that assume simple random samples are greatly facilitated by power tables such as those presented in Cohen's book about statistical power analysis. However, in education and the social sciences experimental designs have naturally nested structures and multilevel models are needed to compute the…
Statistical analysis of cascading failures in power grids
Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin
2010-12-01
We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for the automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load-shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators, and lines. Our model is quasi-static in the causal, discrete-time, and sequential resolution of individual failures. The model, in its simplest realization based on the direct current (DC) description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39, and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between the average numbers of removed loads, generators, and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.
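The DC power-flow computation at the core of such models reduces, for a small network, to a single linear solve for the bus voltage angles, from which line flows follow. A minimal sketch for a hypothetical 3-bus system (the per-unit susceptances and injections are arbitrary, and no cascade logic is modeled):

```python
def dc_power_flow(b, injections):
    """DC power flow for a 3-bus network with bus 0 as slack (theta = 0).
    b: dict of line susceptances {(i, j): b_ij};
    injections: net power at buses 1 and 2 (the slack absorbs the balance)."""
    b01, b02, b12 = b[(0, 1)], b[(0, 2)], b[(1, 2)]
    # Reduced susceptance matrix for buses 1 and 2:
    a11, a12 = b01 + b12, -b12
    a21, a22 = -b12, b02 + b12
    p1, p2 = injections
    det = a11 * a22 - a12 * a21
    th1 = (p1 * a22 - p2 * a12) / det       # solve the 2x2 system B * theta = P
    th2 = (a11 * p2 - a21 * p1) / det
    # Flow on line (i, j) is b_ij * (theta_i - theta_j):
    return {(0, 1): b01 * (0 - th1),
            (0, 2): b02 * (0 - th2),
            (1, 2): b12 * (th1 - th2)}

flows = dc_power_flow({(0, 1): 10.0, (0, 2): 10.0, (1, 2): 5.0},
                      injections=(-1.0, -0.5))   # two loads; slack generates
for line, f in sorted(flows.items()):
    print(line, round(f, 3))
```

A cascade model of the kind the paper describes repeats this solve after each line removal, tripping any line whose flow exceeds its capacity.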
The Role of Atmospheric Measurements in Wind Power Statistical Models
NASA Astrophysics Data System (ADS)
Wharton, S.; Bulaevskaya, V.; Irons, Z.; Newman, J. F.; Clifton, A.
2015-12-01
The simplest wind power generation curves model power only as a function of the wind speed at turbine hub-height. While the latter is an essential predictor of power output, it is widely accepted that wind speed information in other parts of the vertical profile, as well as additional atmospheric variables including atmospheric stability, wind veer, and hub-height turbulence are also important factors. The goal of this work is to determine the gain in predictive ability afforded by adding additional atmospheric measurements to the power prediction model. In particular, we are interested in quantifying any gain in predictive ability afforded by measurements taken from a laser detection and ranging (lidar) instrument, as lidar provides high spatial and temporal resolution measurements of wind speed and direction at 10 or more levels throughout the rotor-disk and at heights well above. Co-located lidar and meteorological tower data as well as SCADA power data from a wind farm in Northern Oklahoma will be used to train a set of statistical models. In practice, most wind farms continue to rely on atmospheric measurements taken from less expensive, in situ instruments mounted on meteorological towers to assess turbine power response to a changing atmospheric environment. Here, we compare a large suite of atmospheric variables derived from tower measurements to those taken from lidar to determine if remote sensing devices add any competitive advantage over tower measurements alone to predict turbine power response.
Development and testing of improved statistical wind power forecasting methods.
Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J.
2011-12-06
Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios
Statistical Models of Power-law Distributions in Homogeneous Plasmas
Roth, Ilan
2011-01-04
A variety of in-situ measurements in space plasmas point to the intermittent formation of distribution functions with elongated tails and power laws at high energies. Power laws form a ubiquitous signature of many complex systems, plasma being a good example of non-Boltzmann behavior in the distribution functions of energetic particles. Particles, which either undergo mutual collisions or are scattered in phase space by electromagnetic fluctuations, exhibit statistical properties that are determined by the transition probability density function of a single interaction, while their non-asymptotic evolution may determine the observed high-energy populations. It is shown that relaxation of the Brownian motion assumptions leads to non-analytical characteristic functions and to a generalization of the Fokker-Planck equation with fractional derivatives that results in power-law solutions parameterized by the probability density function.
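The heavy, polynomially decaying tails that distinguish such distributions from Boltzmann (exponential) behavior are easy to generate and check numerically. A sketch using inverse-CDF sampling of a Pareto tail (the exponent and thresholds are arbitrary illustrations, unrelated to any plasma data):

```python
import random

def pareto_sample(alpha, x_min=1.0, n=50_000, seed=5):
    """Inverse-CDF sampling of a power-law (Pareto) tail:
    P(X > x) = (x / x_min) ** -alpha for x >= x_min."""
    rng = random.Random(seed)
    return [x_min * (1 - rng.random()) ** (-1.0 / alpha) for _ in range(n)]

def tail_fraction(xs, threshold):
    """Empirical survival function P(X > threshold)."""
    return sum(x > threshold for x in xs) / len(xs)

xs = pareto_sample(alpha=2.0)
# Power-law signature: the tail thins polynomially, not exponentially.
print(tail_fraction(xs, 10.0))    # theory: 10**-2
print(tail_fraction(xs, 100.0))   # theory: 100**-2
```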
Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques
NASA Technical Reports Server (NTRS)
Kuan, Gary M
2008-01-01
The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss, with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry-based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be budgeted. A method of statistical optical power analysis and budgeting, based on a technique developed for deep-space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
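A statistical budgeting scheme of this kind can be sketched as a Monte Carlo tally: draw each loss mechanism from its assumed range, sum the losses, and count how often the delivered power still meets the requirement. The loss ranges and power levels below are illustrative placeholders, not SIM's actual budget entries.

```python
import random

# Hypothetical loss mechanisms (dB) as (min, max) uniform ranges -- these
# ranges are illustrative stand-ins, not SIM's actual budget entries.
LOSS_RANGES_DB = {
    "design efficiency":    (1.0, 1.5),
    "material attenuation": (0.2, 0.8),
    "misalignment":         (0.1, 1.0),
    "diffraction":          (0.3, 0.6),
    "coupling efficiency":  (0.5, 1.5),
}

def power_margin_confidence(source_dbm=0.0, required_dbm=-5.0,
                            n_draws=20_000, seed=11):
    """Monte Carlo budget: the fraction of draws in which the delivered
    power still meets the requirement is the numerical confidence level."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n_draws):
        total_loss = sum(rng.uniform(lo, hi)
                         for lo, hi in LOSS_RANGES_DB.values())
        if source_dbm - total_loss >= required_dbm:
            ok += 1
    return ok / n_draws

print(f"confidence of meeting the requirement: {power_margin_confidence():.3f}")
```

Compared with worst-case stacking of every loss, this yields a probabilistic margin, which is exactly the advantage the abstract claims for the approach.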
Spatial factors affecting statistical power in testing marine fauna displacement.
Pérez Lapeña, B; Wijnberg, K M; Stein, A; Hulscher, S J M H
2011-10-01
Impacts of offshore wind farms on marine fauna are largely unknown. Therefore, one commonly adheres to the precautionary principle, which states that one shall take action to avoid potentially damaging impacts on marine ecosystems, even when full scientific certainty is lacking. We implement this principle by means of a statistical power analysis that includes spatial factors. Implementation is based on geostatistical simulations, accommodating zero-inflation in species data. We investigate scenarios in which an impact assessment still has to be carried out. Our results show that the environmental conditions at the time of the survey are the most influential factor on power, followed by survey effort and species abundance in the reference situation. Spatial dependence in species numbers at local scales affects power, but its effect is smaller for the scenarios investigated. Our findings can be used to improve the cost-effectiveness of monitoring surveys. In addition, unnecessary extra survey effort, and the related costs, can be avoided when spatial dependence in species abundance is present and no improvement in power is achieved. PMID:22073657
Statistical tests for power-law cross-correlated processes
NASA Astrophysics Data System (ADS)
Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H. Eugene
2011-12-01
For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlation analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρDCCA(T,n), where T is the total length of the time series and n the window size. For ρDCCA(T,n), we numerically verified the Cauchy inequality -1 ≤ ρDCCA(T,n) ≤ 1. Here we derive -1 ≤ ρDCCA(T,n) ≤ 1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρDCCA within which the cross-correlations become statistically significant. For overlapping windows we numerically determine, and for nonoverlapping windows we derive, that the standard deviation of ρDCCA(T,n) tends with increasing T to 1/T. Using ρDCCA(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series.
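The lagged cross-correlation that opens the abstract can be computed directly for stationary series. A minimal sketch in pure Python; note it performs no detrending, so it is the stationary-series measure, not DCCA itself:

```python
import math

def cross_correlation(x, y, lag):
    """Sample cross-correlation of two equal-length series at time lag n:
    corr(x[t], y[t + lag]) over the overlapping range."""
    if lag < 0:
        return cross_correlation(y, x, -lag)
    xs, ys = x[:len(x) - lag], y[lag:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs) / n)
    sy = math.sqrt(sum((b - my) ** 2 for b in ys) / n)
    return cov / (sx * sy)

# y is x delayed by 3 steps, so the cross-correlation peaks at lag 3:
x = [math.sin(i / 5) for i in range(300)]
y = [0.0, 0.0, 0.0] + x[:-3]
print(round(cross_correlation(x, y, 0), 2))
print(round(cross_correlation(x, y, 3), 2))
```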
Error, Power, and Blind Sentinels: The Statistics of Seagrass Monitoring
Schultz, Stewart T.; Kruschel, Claudia; Bakran-Petricioli, Tatjana; Petricioli, Donat
2015-01-01
We derive statistical properties of standard methods for monitoring of habitat cover worldwide, and criticize them in the context of mandated seagrass monitoring programs, as exemplified by Posidonia oceanica in the Mediterranean Sea. We report the novel result that cartographic methods with non-trivial classification errors are generally incapable of reliably detecting habitat cover losses less than about 30 to 50%, and the field labor required to increase their precision can be orders of magnitude higher than that required to estimate habitat loss directly in a field campaign. We derive a universal utility threshold of classification error in habitat maps that represents the minimum habitat map accuracy above which direct methods are superior. Widespread government reliance on blind-sentinel methods for monitoring seafloor can obscure the gradual and currently ongoing losses of benthic resources until the time has long passed for meaningful management intervention. We find two classes of methods with very high statistical power for detecting small habitat cover losses: 1) fixed-plot direct methods, which are over 100 times as efficient as direct random-plot methods in a variable habitat mosaic; and 2) remote methods with very low classification error such as geospatial underwater videography, which is an emerging, low-cost, non-destructive method for documenting small changes at millimeter visual resolution. General adoption of these methods and their further development will require a fundamental cultural change in conservation and management bodies towards the recognition and promotion of requirements of minimal statistical power and precision in the development of international goals for monitoring these valuable resources and the ecological services they provide. PMID:26367863
Toward improved statistical treatments of wind power forecast errors
NASA Astrophysics Data System (ADS)
Hart, E.; Jacobson, M. Z.
2011-12-01
The ability of renewable resources to reliably supply electric power demand is of considerable interest in the context of growing renewable portfolio standards and the potential for future carbon markets. Toward this end, a number of probabilistic models have been applied to the problem of grid integration of intermittent renewables, such as wind power. Most of these models rely on simple Markov or autoregressive models of wind forecast errors. While these models generally capture the bulk statistics of wind forecast errors, they often fail to reproduce accurate ramp rate distributions and do not accurately describe extreme forecast error events, both of which are of considerable interest to those seeking to comment on system reliability. The problem often lies in characterizing and reproducing not only the magnitude of wind forecast errors, but also the timing or phase errors (i.e., when a front passes over a wind farm). Here we compare time series wind power data produced using different forecast error models to determine the best approach for capturing errors in both magnitude and phase. Additionally, new metrics are presented to characterize forecast quality with respect to both considerations.
Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events
NASA Astrophysics Data System (ADS)
DeChant, C. M.; Moradkhani, H.
2014-12-01
Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
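The exact Poisson-Binomial reference distribution invoked above can be computed with a short dynamic-programming convolution over the per-forecast event probabilities; under a reliable forecast, the observed event count should be plausible under this distribution. This is a sketch of that computation only (the function name is an assumption), not the study's full verification framework.

```python
def poisson_binomial_pmf(probs):
    """Exact PMF of the number of events among independent forecasts,
    where forecast i assigns probability probs[i] to its event.

    dp convolution: pmf[k] = P(exactly k events occur).
    """
    pmf = [1.0]
    for p in probs:
        new = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            new[k] += mass * (1 - p)      # event i does not occur
            new[k + 1] += mass * p        # event i occurs
        pmf = new
    return pmf
```

A hypothesis test of forecast reliability can then compare the observed event count against the tails of this PMF, rather than the binned approximations behind the Brier score or reliability diagram.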
Statistical Properties of Maximum Likelihood Estimators of Power Law Spectra Information
NASA Technical Reports Server (NTRS)
Howell, L. W., Jr.
2003-01-01
A simple power law model consisting of a single spectral index, sigma(sub 1), is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10(exp 13) eV, with a transition at the knee energy, E(sub k), to a steeper spectral index sigma(sub 2) greater than sigma(sub 1) above E(sub k). The maximum likelihood (ML) procedure was developed for estimating the single parameter sigma(sub 1) of a simple power law energy spectrum and generalized to estimate the three spectral parameters of the broken power law energy spectrum from simulated detector responses and real cosmic-ray data. The statistical properties of the ML estimator were investigated and shown to have the three desirable properties: (P1) consistency (asymptotically unbiased), (P2) efficiency (asymptotically attains the Cramer-Rao minimum variance bound), and (P3) asymptotically normally distributed, under a wide range of potential detector response functions. Attainment of these properties necessarily implies that the ML estimation procedure provides the best unbiased estimator possible. While simulation studies can easily determine if a given estimation procedure provides an unbiased estimate of the spectra information, and whether or not the estimator is approximately normally distributed, attainment of the Cramer-Rao bound (CRB) can only be ascertained by calculating the CRB for an assumed energy spectrum-detector response function combination, which can be quite formidable in practice. However, the effort in calculating the CRB is very worthwhile because it provides the necessary means to compare the efficiency of competing estimation techniques and, furthermore, provides a stopping rule in the search for the best unbiased estimator. Consequently, the CRB for both the simple and broken power law energy spectra are derived herein and the conditions under which they are attained in practice are investigated.
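For the simple (unbroken) power law with a single index, the ML estimator has a well-known closed form. The sketch below assumes a perfect detector response, omitting the detector-response convolution that is the paper's main complication; the function name is illustrative.

```python
import math
import random

def ml_spectral_index(energies, e_min):
    """Closed-form ML estimate of sigma for a simple power law
    p(E) proportional to E^(-sigma) on E >= e_min, sigma > 1.

    The log-likelihood is maximized at
    sigma_hat = 1 + N / sum(log(E_i / e_min)).
    """
    logs = [math.log(e / e_min) for e in energies if e >= e_min]
    return 1.0 + len(logs) / sum(logs)
```

This estimator is consistent and asymptotically efficient in the idealized case, which is the baseline against which the detector-convolved ML estimates in the paper are compared.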
An Examination of Statistical Power in Multigroup Dynamic Structural Equation Models
ERIC Educational Resources Information Center
Prindle, John J.; McArdle, John J.
2012-01-01
This study used statistical simulation to calculate differential statistical power in dynamic structural equation models with groups (as in McArdle & Prindle, 2008). Patterns of between-group differences were simulated to provide insight into how model parameters influence power approximations. Chi-square and root mean square error of…
Decadal power in land air temperatures: Is it statistically significant?
NASA Astrophysics Data System (ADS)
Thejll, Peter A.
2001-12-01
The geographical distribution and properties of the well-known 10-11 year signal in terrestrial temperature records is investigated. By analyzing the Global Historical Climate Network data for surface air temperatures we verify that the signal is strongest in North America and is similar in nature to that reported earlier by R. G. Currie. The decadal signal is statistically significant for individual stations, but it is not possible to show that the signal is statistically significant globally, using strict tests. In North America, during the twentieth century, the decadal variability in the solar activity cycle is associated with the decadal part of the North Atlantic Oscillation index series in such a way that both of these signals correspond to the same spatial pattern of cooling and warming. A method for testing statistical results with Monte Carlo trials on data fields with specified temporal structure and specific spatial correlation retained is presented.
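Monte Carlo significance testing of a spectral peak, in the spirit described above, can be sketched as follows. This toy version tests against a white-noise null with matched variance only; the study's nulls retain specified temporal structure and spatial correlation, which this sketch does not reproduce, and the function name is an assumption.

```python
import numpy as np

def peak_pvalue(series, n_surrogates=500, seed=0):
    """Monte Carlo p-value for the largest periodogram peak of a
    series, against a white-noise null of the same variance."""
    rng = np.random.default_rng(seed)
    x = series - np.mean(series)
    peak = np.max(np.abs(np.fft.rfft(x))[1:])  # skip the DC bin
    count = 0
    for _ in range(n_surrogates):
        s = rng.normal(0.0, np.std(x), len(x))
        if np.max(np.abs(np.fft.rfft(s))[1:]) >= peak:
            count += 1
    # add-one correction keeps the p-value strictly positive
    return (count + 1) / (n_surrogates + 1)
```

A decadal peak that is significant station by station can still fail such a test globally once the multiplicity of stations and the spatial correlation between them are accounted for, which is the distinction the abstract draws.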
Heidel, R Eric
2016-01-01
Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power. PMID:27073717
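The five components listed above feed directly into standard closed-form calculations. As one concrete example (a sketch, not taken from the article), a normal-approximation sample size for comparing two group means with a standardized effect size d:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for a two-sample
    comparison of means, standardized effect size d (Cohen's d):
    n = 2 * (z_{1-alpha/2} + z_{power})^2 / d^2, rounded up."""
    z = NormalDist().inv_cdf
    return ceil(2 * (z(1 - alpha / 2) + z(power)) ** 2 / d ** 2)
```

The isomorphism the author describes is visible in the formula: changing any one component (scale of measurement, design, effect size, variance, or error rates) propagates through the same expression and changes the required n.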
Automated FMV image quality assessment based on power spectrum statistics
NASA Astrophysics Data System (ADS)
Kalukin, Andrew
2015-05-01
Factors that degrade image quality in video and other sensor collections, such as noise, blurring, and poor resolution, also affect the spatial power spectrum of imagery. Prior research in human vision and image science from the last few decades has shown that the image power spectrum can be useful for assessing the quality of static images. The research in this article explores the possibility of using the image power spectrum to automatically evaluate full-motion video (FMV) imagery frame by frame. This procedure makes it possible to identify anomalous images and scene changes, and to keep track of gradual changes in quality as collection progresses. This article will describe a method to apply power spectral image quality metrics for images subjected to simulated blurring, blocking, and noise. As a preliminary test on videos from multiple sources, image quality measurements for image frames from 185 videos are compared to analyst ratings based on ground sampling distance. The goal of the research is to develop an automated system for tracking image quality during real-time collection, and to assign ratings to video clips for long-term storage, calibrated to standards such as the National Imagery Interpretability Rating System (NIIRS).
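A minimal radially averaged power-spectrum feature of the kind such quality metrics build on is sketched below. This is a simplified illustration: the article's actual metric, its blocking/noise models, and the NIIRS calibration are not reproduced, and the function name and bin count are assumptions.

```python
import numpy as np

def radial_power_spectrum(img, nbins=16):
    """Radially averaged spatial power spectrum of a 2D image.

    Blurring suppresses the high-frequency (large-radius) bins,
    which is the signature a spectrum-based quality metric keys on.
    """
    f = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(f) ** 2
    h, w = img.shape
    yy, xx = np.indices((h, w))
    r = np.hypot(yy - h / 2, xx - w / 2)
    bins = np.linspace(0, r.max() + 1e-9, nbins + 1)
    idx = np.digitize(r.ravel(), bins) - 1
    counts = np.bincount(idx, minlength=nbins)[:nbins]
    totals = np.bincount(idx, weights=power.ravel(), minlength=nbins)[:nbins]
    return totals / np.maximum(counts, 1)
```

Applied frame by frame, a drop in the high-frequency bins flags blurring or defocus, while a jump flags added noise, without any reference image.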
Statistical Power of Psychological Research: What Have We Gained in 20 Years?
ERIC Educational Resources Information Center
Rossi, Joseph S.
1990-01-01
Calculated power for 6,155 statistical tests in 221 journal articles published in 1982 volumes of "Journal of Abnormal Psychology,""Journal of Consulting and Clinical Psychology," and "Journal of Personality and Social Psychology." Power to detect small, medium, and large effects was .17, .57, and .83, respectively. Concluded that power of…
How Many Studies Do You Need? A Primer on Statistical Power for Meta-Analysis
ERIC Educational Resources Information Center
Valentine, Jeffrey C.; Pigott, Therese D.; Rothstein, Hannah R.
2010-01-01
In this article, the authors outline methods for using fixed and random effects power analysis in the context of meta-analysis. Like statistical power analysis for primary studies, power analysis for meta-analysis can be done either prospectively or retrospectively and requires assumptions about parameters that are unknown. The authors provide…
Statistical Properties of Maximum Likelihood Estimators of Power Law Spectra Information
NASA Technical Reports Server (NTRS)
Howell, L. W.
2002-01-01
A simple power law model consisting of a single spectral index, a is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10(exp 13) eV, with a transition at the knee energy, E(sub k), to a steeper spectral index alpha(sub 2) greater than alpha(sub 1) above E(sub k). The Maximum likelihood (ML) procedure was developed for estimating the single parameter alpha(sub 1) of a simple power law energy spectrum and generalized to estimate the three spectral parameters of the broken power law energy spectrum from simulated detector responses and real cosmic-ray data. The statistical properties of the ML estimator were investigated and shown to have the three desirable properties: (P1) consistency (asymptotically unbiased). (P2) efficiency asymptotically attains the Cramer-Rao minimum variance bound), and (P3) asymptotically normally distributed, under a wide range of potential detector response functions. Attainment of these properties necessarily implies that the ML estimation procedure provides the best unbiased estimator possible. While simulation studies can easily determine if a given estimation procedure provides an unbiased estimate of the spectra information, and whether or not the estimator is approximately normally distributed, attainment of the Cramer-Rao bound (CRB) can only he ascertained by calculating the CRB for an assumed energy spectrum-detector response function combination, which can be quite formidable in practice. However. the effort in calculating the CRB is very worthwhile because it provides the necessary means to compare the efficiency of competing estimation techniques and, furthermore, provides a stopping rule in the search for the best unbiased estimator. Consequently, the CRB for both the simple and broken power law energy spectra are derived herein and the conditions under which they are attained in practice are investigated. The ML technique is then extended to estimate spectra information from
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Estimating statistical power for open-enrollment group treatment trials.
Morgan-Lopez, Antonio A; Saavedra, Lissette M; Hien, Denise A; Fals-Stewart, William
2011-01-01
Modeling turnover in group membership has been identified as a key barrier contributing to a disconnect between the manner in which behavioral treatment is conducted (open-enrollment groups) and the designs of substance abuse treatment trials (closed-enrollment groups, individual therapy). Latent class pattern mixture models (LCPMMs) are emerging tools for modeling data from open-enrollment groups with membership turnover in recently proposed treatment trials. The current article illustrates an approach to conducting power analyses for open-enrollment designs based on the Monte Carlo simulation of LCPMM models using parameters derived from published data from a randomized controlled trial comparing Seeking Safety to a Community Care condition for women presenting with comorbid posttraumatic stress disorder and substance use disorders. The example addresses discrepancies between the analysis framework assumed in power analyses of many recently proposed open-enrollment trials and the proposed use of LCPMM for data analysis. PMID:20832971
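The Monte Carlo logic described above generalizes beyond LCPMMs: simulate the design, analyze each simulated data set, and count rejections. Stripped to its core with a plain two-group mean comparison standing in for the latent class model fit (a toy sketch; names and the known-variance z-test are assumptions):

```python
import random
from statistics import NormalDist, mean

def mc_power(d, n, reps=2000, alpha=0.05, seed=7):
    """Monte Carlo power estimate for detecting a mean difference d
    between two groups of size n (unit variance, two-sided z-test).

    The analysis step here is deliberately simple; in an
    open-enrollment trial it would be replaced by the LCPMM fit.
    """
    random.seed(seed)
    crit = NormalDist().inv_cdf(1 - alpha / 2)
    hits = 0
    for _ in range(reps):
        g0 = [random.gauss(0.0, 1.0) for _ in range(n)]
        g1 = [random.gauss(d, 1.0) for _ in range(n)]
        z = (mean(g1) - mean(g0)) / (2.0 / n) ** 0.5
        if abs(z) > crit:
            hits += 1
    return hits / reps
```

The discrepancy the article targets is exactly here: if the simulation step generates closed-group data but the trial will be analyzed as an open-enrollment LCPMM, the resulting power estimate answers the wrong question.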
Efficiency statistics at all times: Carnot limit at finite power.
Polettini, M; Verley, G; Esposito, M
2015-02-01
We derive the statistics of the efficiency under the assumption that thermodynamic fluxes fluctuate with normal law, parametrizing it in terms of time, macroscopic efficiency, and a coupling parameter ζ. It has a peculiar behavior: no moments, one sub-, and one super-Carnot maxima corresponding to reverse operating regimes (engine or pump), the most probable efficiency decreasing in time. The limit ζ→0 where the Carnot bound can be saturated gives rise to two extreme situations, one where the machine works at its macroscopic efficiency, with Carnot limit corresponding to no entropy production, and one where for a transient time scaling like 1/ζ microscopic fluctuations are enhanced in such a way that the most probable efficiency approaches the Carnot limit at finite entropy production. PMID:25699428
Monitoring Statistics Which Have Increased Power over a Reduced Time Range.
ERIC Educational Resources Information Center
Tang, S. M.; MacNeill, I. B.
1992-01-01
The problem of monitoring trends for changes at unknown times is considered. Statistics that permit one to focus high power on a segment of the monitored period are studied. Numerical procedures are developed to compute the null distribution of these statistics. (Author)
The Statistical Power of the Cluster Randomized Block Design with Matched Pairs--A Simulation Study
ERIC Educational Resources Information Center
Dong, Nianbo; Lipsey, Mark
2010-01-01
This study uses simulation techniques to examine the statistical power of the group-randomized design and the matched-pair (MP) randomized block design under various parameter combinations. Both nearest neighbor matching and random matching are used for the MP design. The power of each design for any parameter combination was calculated from…
Asking Sensitive Questions: A Statistical Power Analysis of Randomized Response Models
ERIC Educational Resources Information Center
Ulrich, Rolf; Schroter, Hannes; Striegel, Heiko; Simon, Perikles
2012-01-01
This article derives the power curves for a Wald test that can be applied to randomized response models when small prevalence rates must be assessed (e.g., detecting doping behavior among elite athletes). These curves enable the assessment of the statistical power that is associated with each model (e.g., Warner's model, crosswise model, unrelated…
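In Warner's model underlying such power curves, each respondent answers the sensitive question with probability p and its complement otherwise, so no individual answer is incriminating. The Wald test is built on the estimator and variance sketched below (notation and function name are assumptions, not the article's):

```python
def warner_estimate(yes, n, p):
    """Warner randomized-response estimator of prevalence pi.

    lam is the observed 'yes' proportion; since
    lam = pi*p + (1-pi)*(1-p), inverting gives pi_hat, with the
    usual binomial variance inflated by the (2p-1)^-2 privacy cost.
    """
    lam = yes / n
    pi_hat = (lam - (1 - p)) / (2 * p - 1)
    var = lam * (1 - lam) / (n * (2 * p - 1) ** 2)
    return pi_hat, var
```

The variance inflation as p approaches 1/2 is why power analysis matters here: detecting a small prevalence (e.g., doping) can require far larger samples than a direct question would.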
Iordache, Eugen; Dierker, Lisa; Fifield, Judith; Schensul, Jean J.; Suggs, Suzanne; Barbour, Russell
2015-01-01
The advantages of modeling the unreliability of outcomes when evaluating the comparative effectiveness of health interventions are illustrated. Adding an action-research intervention component to a regular summer job program for youth was expected to help in preventing risk behaviors. A series of simple two-group alternative structural equation models are compared to test the effect of the intervention on one key attitudinal outcome in terms of model fit and statistical power with Monte Carlo simulations. Some models presuming parameters equal across the intervention and comparison groups were underpowered to detect the intervention effect, yet modeling the unreliability of the outcome measure increased their statistical power and helped in the detection of the hypothesized effect. Comparative Effectiveness Research (CER) could benefit from flexible multi-group alternative structural models organized in decision trees, and modeling unreliability of measures can be of tremendous help for both the fit of statistical models to the data and their statistical power. PMID:26640421
On the power for linkage detection using a test based on scan statistics.
Hernández, Sonia; Siegmund, David O; de Gunst, Mathisca
2005-04-01
We analyze some aspects of scan statistics, which have been proposed to help for the detection of weak signals in genetic linkage analysis. We derive approximate expressions for the power of a test based on moving averages of the identity by descent allele sharing proportions for pairs of relatives at several contiguous markers. We confirm these approximate formulae by simulation. The results show that when there is a single trait-locus on a chromosome, the test based on the scan statistic is slightly less powerful than that based on the customary allele sharing statistic. On the other hand, if two genes having a moderate effect on a trait lie close to each other on the same chromosome, scan statistics improve power to detect linkage. PMID:15772104
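The moving-average statistic at the heart of the scan test is simple to state: the maximum over positions of the mean allele-sharing score across a window of contiguous markers. A sketch (the window size w is the tuning choice whose power the paper analyzes; names are illustrative):

```python
import numpy as np

def scan_statistic(z, w):
    """Maximum moving average over all windows of w contiguous
    per-marker scores, computed via a cumulative sum."""
    csum = np.concatenate(([0.0], np.cumsum(z)))
    return np.max((csum[w:] - csum[:-w]) / w)
```

With w = 1 this reduces to the customary single-marker allele-sharing statistic; widening the window trades a little power at a single trait locus for better power when two linked loci each contribute a moderate signal, as the abstract reports.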
Replication Unreliability in Psychology: Elusive Phenomena or “Elusive” Statistical Power?
Tressoldi, Patrizio E.
2012-01-01
The focus of this paper is to analyze whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Applying the Null Hypothesis Statistical Testing (NHST), still the widest used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as example the results of meta-analyses related to four different controversial phenomena, subliminal semantic priming, incubation effect for problem solving, unconscious thought theory, and non-local perception, it was found that, except for semantic priming on categorization, the statistical power to detect the expected effect size (ES) of the typical study, is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low ESs. We conclude by providing some suggestions on how to increase the statistical power or use different statistical approaches to help discriminate whether the results obtained may or may not be used to support or to refute the reality of a phenomenon with small ES. PMID:22783215
Statistical tests, P values, confidence intervals, and power: a guide to misinterpretations.
Greenland, Sander; Senn, Stephen J; Rothman, Kenneth J; Carlin, John B; Poole, Charles; Goodman, Steven N; Altman, Douglas G
2016-04-01
Misinterpretation and abuse of statistical tests, confidence intervals, and statistical power have been decried for decades, yet remain rampant. A key problem is that there are no interpretations of these concepts that are at once simple, intuitive, correct, and foolproof. Instead, correct use and interpretation of these statistics requires an attention to detail which seems to tax the patience of working scientists. This high cognitive demand has led to an epidemic of shortcut definitions and interpretations that are simply wrong, sometimes disastrously so-and yet these misinterpretations dominate much of the scientific literature. In light of this problem, we provide definitions and a discussion of basic statistics that are more general and critical than typically found in traditional introductory expositions. Our goal is to provide a resource for instructors, researchers, and consumers of statistics whose knowledge of statistical theory and technique may be limited but who wish to avoid and spot misinterpretations. We emphasize how violation of often unstated analysis protocols (such as selecting analyses for presentation based on the P values they produce) can lead to small P values even if the declared test hypothesis is correct, and can lead to large P values even if that hypothesis is incorrect. We then provide an explanatory list of 25 misinterpretations of P values, confidence intervals, and power. We conclude with guidelines for improving statistical interpretation and reporting. PMID:27209009
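One violation flagged above, selecting analyses for presentation based on the P values they produce, is easy to demonstrate by simulation. The sketch below (illustrative only, not from the article) runs many experiments in which the null is true, performs several independent tests per experiment, and reports only the smallest P value:

```python
import random
from statistics import NormalDist

def min_p_rejection_rate(n_analyses=20, reps=500, alpha=0.05, seed=3):
    """Under a true null, each test alone has error rate alpha, but
    reporting only the minimum of n_analyses P values rejects at
    roughly 1 - (1 - alpha)^n_analyses."""
    random.seed(seed)
    phi = NormalDist().cdf
    hits = 0
    for _ in range(reps):
        pmin = min(2 * (1 - phi(abs(random.gauss(0, 1))))
                   for _ in range(n_analyses))
        if pmin < alpha:
            hits += 1
    return hits / reps
```

With 20 candidate analyses the reported-minimum error rate is around 64%, not 5%, which is precisely how small P values arise "even if the declared test hypothesis is correct."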
Narayan, Manjari; Allen, Genevera I
2016-01-01
Many complex brain disorders, such as autism spectrum disorders, exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches: R2, based on resampling and random effects test statistics, and R3, which additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R2 and R3 have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in autism spectrum disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices. PMID:27147940
NASA Astrophysics Data System (ADS)
Ma, W. T.; Sandri, G. vH.; Sarkar, S.
1991-05-01
We use the convolution power of infinite sequences to obtain a novel representation of exponential functions of power series which often arise in statistical mechanics. We thus obtain new formulas for the configuration and cluster integrals of pairwise interacting systems of molecules in an imperfect gas. We prove that the asymptotic behaviour of the Luria-Delbrück distribution is p_n ~ c n^(-2). We derive a new, simple and computationally efficient recursion relation for p_n.
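The p_n in question is the Luria-Delbrück mutant-number distribution, and the efficient recursion referred to is of the Lea-Coulson form shown below (a sketch consistent with that form; variable names and the truncation nmax are assumptions):

```python
import math

def luria_delbruck_pmf(m, nmax):
    """Luria-Delbruck PMF via the Lea-Coulson-type recursion:
    p_0 = exp(-m),  p_n = (m/n) * sum_{k=0}^{n-1} p_k / (n - k + 1),
    where m is the expected number of mutation events."""
    p = [math.exp(-m)]
    for n in range(1, nmax + 1):
        p.append((m / n) * sum(p[k] / (n - k + 1) for k in range(n)))
    return p
```

The recursion costs O(nmax^2) overall, and the heavy p_n ~ c n^(-2) tail proved in the paper is visible numerically: n^2 * p_n levels off to a constant, so the distribution has no finite mean.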
Prospective active marker motion correction improves statistical power in BOLD fMRI
Ooi, Melvyn B.; Goldman, Robin I.; Krueger, Sascha; Thomas, William J.; Sajda, Paul; Brown, Truman R.
2013-01-01
Group level statistical maps of blood oxygenation level dependent (BOLD) signals acquired using functional magnetic resonance imaging (fMRI) have become a basic measurement for much of systems, cognitive and social neuroscience. A challenge in making inferences from these statistical maps is the noise and potential confounds that arise from the head motion that occurs within and between acquisition volumes. This motion results in the scan plane being misaligned during acquisition, ultimately leading to reduced statistical power when maps are constructed at the group level. In most cases, an attempt is made to correct for this motion through the use of retrospective analysis methods. In this paper, we use a prospective active marker motion correction (PRAMMO) system that uses radio frequency markers for real-time tracking of motion, enabling on-line slice plane correction. We show that the statistical power of the activation maps is substantially increased using PRAMMO compared to conventional retrospective correction. Analysis of our results indicates that the PRAMMO acquisition reduces the variance without decreasing the signal component of the BOLD (beta). Using PRAMMO could thus improve the overall statistical power of fMRI based BOLD measurements, leading to stronger inferences of the nature of processing in the human brain. PMID:23220430
Accuracy of Estimates and Statistical Power for Testing Mediation in Latent Growth Curve Modeling
ERIC Educational Resources Information Center
Cheong, JeeWon
2011-01-01
The latent growth curve modeling (LGCM) approach has been increasingly utilized to investigate longitudinal mediation. However, little is known about the accuracy of the estimates and statistical power when mediation is evaluated in the LGCM framework. A simulation study was conducted to address these issues under various conditions including…
Violation of statistical isotropy and homogeneity in the 21-cm power spectrum
NASA Astrophysics Data System (ADS)
Shiraishi, Maresuke; Muñoz, Julian B.; Kamionkowski, Marc; Raccanelli, Alvise
2016-05-01
Most inflationary models predict primordial perturbations to be statistically isotropic and homogeneous. Cosmic microwave background (CMB) observations, however, indicate a possible departure from statistical isotropy in the form of a dipolar power modulation at large angular scales. Alternative models of inflation, beyond the simplest single-field slow-roll models, can generate a small power asymmetry, consistent with these observations. Observations of clustering of quasars show, however, agreement with statistical isotropy at much smaller angular scales. Here, we propose to use off-diagonal components of the angular power spectrum of the 21-cm fluctuations during the dark ages to test this power asymmetry. We forecast results for the planned SKA radio array, a future radio array, and the cosmic-variance-limited case as a theoretical proof of principle. Our results show that the 21-cm line power spectrum will enable access to information at very small scales and at different redshift slices, thus improving upon the current CMB constraints by ~2 orders of magnitude for a dipolar asymmetry and by ~1-3 orders of magnitude for a quadrupolar asymmetry case.
Statistics of injected power on a bouncing ball subjected to a randomly vibrating piston.
García-Cid, Alfredo; Gutiérrez, Pablo; Falcón, Claudio; Aumaître, Sébastien; Falcon, Eric
2015-09-01
We present an experimental study on the statistical properties of the injected power needed to maintain an inelastic ball bouncing constantly on a randomly accelerating piston in the presence of gravity. We compute the injected power at each collision of the ball with the moving piston by measuring the velocity of the piston and the force exerted on the piston by the ball. The probability density function of the injected power has its most probable value close to zero and displays two asymmetric exponential tails, depending on the restitution coefficient, the piston acceleration, and its frequency content. This distribution can be deduced from a simple model assuming quasi-Gaussian statistics for the force and velocity of the piston. PMID:26465548
NASA Astrophysics Data System (ADS)
Asal, F. F.
2012-07-01
Digital elevation data obtained from different engineering surveying techniques are utilized in generating Digital Elevation Models (DEMs), which are employed in many engineering and environmental applications. These data usually come in discrete point format, making it necessary to utilize an interpolation approach for the creation of a DEM. Quality assessment of the DEM is a vital issue controlling its use in different applications; however, this assessment relies heavily on statistical methods while neglecting visual methods. This research applies visual analysis to DEMs generated using the IDW interpolator with varying powers in order to examine its potential for assessing the effects of the variation of the IDW power on the quality of the DEMs. Real elevation data were collected in the field with a total station instrument in corrugated terrain. DEMs were generated from the data at a unified cell size using the IDW interpolator with power values ranging from one to ten. Visual analysis was undertaken using 2D and 3D views of the DEM; in addition, statistical analysis was performed to assess the validity of the visual techniques for such analysis. Visual analysis showed that smoothing of the DEM decreases as the power value increases up to a power of four; however, increasing the power beyond four produces no noticeable changes in the 2D and 3D views of the DEM. The statistical analysis supported these results, with the standard deviation (SD) of the DEM increasing with the power. More specifically, changing the power from one to two produced 36% of the total increase in SD (the increase due to changing the power from one to ten), and changing to powers of three and four gave 60% and 75%, respectively. This reflects a decrease in DEM smoothing as the IDW power increases. The study also has shown that applying visual methods supported by statistical
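A minimal sketch of the IDW interpolator whose power is varied in this study (illustrative, not the authors' implementation): the estimate at a query point is a weighted mean of the sample elevations with weights 1/d^p, so a larger power p concentrates influence on the nearest samples and reduces smoothing, consistent with the findings above.

```python
def idw(sample_points, query, power=2.0, eps=1e-12):
    """Inverse Distance Weighting: estimate z at `query` from (x, y, z) samples.

    Weights are 1/d**power; a larger power concentrates influence on the
    nearest samples, producing a less smoothed surface.
    """
    num = den = 0.0
    for x, y, z in sample_points:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 < eps:          # query coincides with a sample: return it exactly
            return z
        w = d2 ** (-power / 2.0)   # equals 1 / d**power
        num += w * z
        den += w
    return num / den

pts = [(0, 0, 10.0), (1, 0, 20.0), (0, 1, 30.0)]
z_mid = idw(pts, (0.5, 0.5), power=2.0)  # equidistant samples -> plain mean
```

Because the weights are always positive, IDW estimates are bounded by the minimum and maximum sample values, which is one reason the surface flattens (smooths) at low powers.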
The Effect of Cluster Size Variability on Statistical Power in Cluster-Randomized Trials
Lauer, Stephen A.; Kleinman, Ken P.; Reich, Nicholas G.
2015-01-01
The frequency of cluster-randomized trials (CRTs) in peer-reviewed literature has increased exponentially over the past two decades. CRTs are a valuable tool for studying interventions that cannot be effectively implemented or randomized at the individual level. However, some aspects of the design and analysis of data from CRTs are more complex than those for individually randomized controlled trials. One of the key components to designing a successful CRT is calculating the proper sample size (i.e. number of clusters) needed to attain an acceptable level of statistical power. In order to do this, a researcher must make assumptions about the value of several variables, including a fixed mean cluster size. In practice, cluster size can often vary dramatically. Few studies account for the effect of cluster size variation when assessing the statistical power for a given trial. We conducted a simulation study to investigate how the statistical power of CRTs changes with variable cluster sizes. In general, we observed that increases in cluster size variability lead to a decrease in power. PMID:25830416
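The power loss described above can be approximated without simulation using a commonly cited design-effect formula for unequal cluster sizes, DE ≈ 1 + ((cv² + 1)·m̄ − 1)·ρ, where cv is the coefficient of variation of cluster size, m̄ the mean cluster size, and ρ the intracluster correlation. The sketch below (illustrative parameters, not the authors' simulation) pairs it with a two-sided z-test approximation:

```python
from math import erf, sqrt

def normal_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def crt_power(effect_sd, n_clusters, mean_size, cv_size, icc, z_crit=1.96):
    """Approximate power of a two-arm CRT for a standardized effect size.

    Applies the design effect for variable cluster sizes,
    DE = 1 + ((cv**2 + 1) * m_bar - 1) * icc,
    then evaluates a two-sided z-test on the effective sample size.
    """
    de = 1.0 + ((cv_size ** 2 + 1.0) * mean_size - 1.0) * icc
    n_eff_per_arm = (n_clusters * mean_size / 2.0) / de  # subjects per arm / DE
    se = sqrt(2.0 / n_eff_per_arm)   # SE of the difference in means (unit sd)
    return normal_cdf(effect_sd / se - z_crit)

p_equal  = crt_power(0.3, n_clusters=30, mean_size=20, cv_size=0.0, icc=0.05)
p_varied = crt_power(0.3, n_clusters=30, mean_size=20, cv_size=0.8, icc=0.05)
```

With these illustrative numbers, letting cluster sizes vary (cv = 0.8) inflates the design effect and visibly lowers power relative to equal-sized clusters, matching the abstract's conclusion.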
A statistical spatial power spectrum of the Earth's lithospheric magnetic field
NASA Astrophysics Data System (ADS)
Thébault, E.; Vervelidou, F.
2015-05-01
The magnetic field of the Earth's lithosphere arises from rock magnetization contrasts that were shaped over geological times. The field can be described mathematically in spherical harmonics or with distributions of magnetization. We exploit this dual representation and assume that the lithospheric field is induced by spatially varying susceptibility values within a shell of constant thickness. By introducing a statistical assumption about the power spectrum of the susceptibility, we then derive a statistical expression for the spatial power spectrum of the crustal magnetic field for spatial scales ranging from 60 to 2500 km. This expression depends on the mean induced magnetization, the thickness of the shell, and a power-law exponent for the power spectrum of the susceptibility. We test the relevance of this form with a misfit analysis against the observational NGDC-720 lithospheric magnetic field model power spectrum. This allows us to estimate, at the 95 per cent confidence level, a mean global apparent induced magnetization value between 0.3 and 0.6 A m^-1, a mean magnetic crustal thickness value between 23 and 30 km, and a root mean square field value between 190 and 205 nT. These estimates are in good agreement with independent models of the crustal magnetization and of the seismic crustal thickness. We carry out the same analysis in the continental and oceanic domains separately. We complement the misfit analyses with a Kolmogorov-Smirnov goodness-of-fit test and conclude that the observed power spectrum is, in each case, consistent with being a sample of the statistical one.
Tests of Mediation: Paradoxical Decline in Statistical Power as a Function of Mediator Collinearity.
Beasley, T Mark
2014-01-01
Increasing the correlation between the independent variable and the mediator (the a coefficient) increases the effect size (ab) for mediation analysis; however, increasing a by definition increases collinearity in mediation models. As a result, the standard errors of product tests increase. The variance inflation due to increases in a at some point outweighs the increase in the effect size (ab) and results in a loss of statistical power. This phenomenon also occurs with nonparametric bootstrapping approaches because the variance of the bootstrap distribution of ab approximates the variance expected from normal theory. Both variances increase dramatically when a exceeds the b coefficient, thus explaining the power decline with increases in a. Implications for statistical analysis and applied researchers are discussed. PMID:24954952
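The paradox described in the abstract above can be illustrated with textbook OLS standard errors for standardized variables under a simple mediation model (a sketch with illustrative n and coefficients, not the authors' simulation). With M = aX + e1 and Y = bM + e2 (direct effect set to zero), cor(X, M) = a, so the 1/(1 − a²) variance-inflation factor enters SE(b), and the Sobel standard error is SE(ab) = sqrt(a²·SE(b)² + b²·SE(a)²).

```python
from math import sqrt

def sobel_z(a, b, n):
    """Sobel z for the mediated effect a*b with standardized X, M, Y.

    Assumes the simple mediation model M = a*X + e1, Y = b*M + e2 (no direct
    effect), so cor(X, M) = a and the model R^2 for Y is b^2. The
    1/(1 - a**2) factor in se_b is the variance inflation caused by the
    X-M collinearity.
    """
    se_a = sqrt((1 - a ** 2) / (n - 2))
    se_b = sqrt((1 - b ** 2) / ((n - 3) * (1 - a ** 2)))
    se_ab = sqrt(a ** 2 * se_b ** 2 + b ** 2 * se_a ** 2)
    return (a * b) / se_ab

# Power first rises with `a` (larger product a*b), then falls once variance
# inflation dominates -- the paradoxical decline the abstract describes.
zs = {a: sobel_z(a, 0.1, 100) for a in (0.1, 0.5, 0.9)}
```

Note that the decline sets in once a exceeds b (here b = 0.1), in line with the abstract's characterization.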
A Powerful Statistical Approach for Large-Scale Differential Transcription Analysis
Tan, Yuan-De; Chandler, Anita M.; Chaudhury, Arindam; Neilson, Joel R.
2015-01-01
Next generation sequencing (NGS) is increasingly being used for transcriptome-wide analysis of differential gene expression. The NGS data are multidimensional count data. Therefore, most of the statistical methods developed well for microarray data analysis are not applicable to transcriptomic data. For this reason, a variety of new statistical methods based on count data of transcript reads have been correspondingly proposed. But due to high cost and limitation of biological resources, current NGS data are still generated from a few replicate libraries. Some of these existing methods do not always have desirable performances on count data. We here developed a very powerful and robust statistical method based on beta and binomial distributions. Our method (mBeta t-test) is specifically applicable to sequence count data from small samples. Both simulated and real transcriptomic data showed mBeta t-test significantly outperformed the existing top statistical methods chosen in all 12 given scenarios and performed with high efficiency and high stability. The differentially expressed genes found by our method from real transcriptomic data were validated by qPCR experiments. Our method shows high power in finding truly differential expression, conservatively estimating FDR and high stability in RNA sequence count data derived from small samples. Our method can also be extended to genome-wide detection of differential splicing events. PMID:25894390
Statistical Design Model (SDM) of power supply and communication subsystem's Satellite
NASA Astrophysics Data System (ADS)
Mirshams, Mehran; Zabihian, Ehsan; Zabihian, Ahmadreza
In designing the power supply and communication subsystems for satellites, most approaches and relations are empirical and statistical; moreover, because aerospace science is a young field relative to other engineering disciplines such as electrical engineering, there are no analytic or fully proven empirical relations in many areas. We therefore consider the statistical design of these subsystems. The approach presented in this paper is entirely innovative and specifies all parts of the power supply and communication subsystems of the satellite. In codifying this approach, data from 602 satellites and software such as SPSS have been used. After proposing the design procedure, the approach determines the total power needed by the satellite, the mass of the power supply and communication subsystems, the power needed by the communication subsystem, the working band, the type of antenna, the number of transponders, the material of the solar arrays, and finally the placement of these arrays on the satellite. All these parts are designed based on the mission of the satellite and its weight class. This procedure increases the performance rate, avoids wasting energy, and reduces costs. Keywords: database, statistical model, design procedure, power supply subsystem, communication subsystem
Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays
NASA Astrophysics Data System (ADS)
Sibatov, R. T.
2011-08-01
A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account the Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of Ohm's law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account the Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
NASA Astrophysics Data System (ADS)
Najac, Julien
2014-05-01
For many applications in the energy sector, it is crucial to have downscaling methods that preserve space-time dependences, at very fine spatial and temporal scales, between the variables affecting electricity production and consumption. For climate change impact studies this is an extremely difficult task, particularly as reliable climate information is usually available at regional and monthly scales at best, whereas many industry-oriented applications need more refined information (hydropower production models, wind energy production models, power demand models, power balance models…). Here we thus propose to investigate how to predict and quantify the influence of climate change on climate-related energies and the energy demand. To do so, statistical downscaling methods originally developed for studying climate change impacts on hydrological cycles in France (and which have been used to compute hydropower production in France) have been applied to predicting wind power generation in France and an air temperature indicator commonly used for predicting power demand in France. We show that those methods provide satisfactory results over the recent past and apply this methodology to several climate model runs from the ENSEMBLES project.
NASA Astrophysics Data System (ADS)
Osato, Ken; Shirasaki, Masato; Yoshida, Naoki
2015-06-01
We study the impact of baryonic physics on cosmological parameter estimation with weak-lensing surveys. We run a set of cosmological hydrodynamics simulations with different galaxy formation models. We then perform ray-tracing simulations through the total matter density field to generate 100 independent convergence maps with a field of view of 25 deg^2, and we use them to examine the ability of the following three lensing statistics as cosmological probes: power spectrum (PS), peak counts, and Minkowski functionals (MFs). For the upcoming wide-field observations, such as the Subaru Hyper Suprime-Cam (HSC) survey with a sky coverage of 1400 deg^2, these three statistics provide tight constraints on the matter density, density fluctuation amplitude, and dark energy equation of state, but parameter bias is induced by baryonic processes such as gas cooling and stellar feedback. When we use PS, peak counts, and MFs, the magnitude of relative bias in the dark energy equation of state parameter w is at a level of, respectively, δw ~ 0.017, 0.061, and 0.0011. For the HSC survey, these values are smaller than the statistical errors estimated from Fisher analysis. The bias could be significant when the statistical errors become small in future observations with a much larger survey area. We find that the bias is induced in different directions in the parameter space depending on the statistics employed. While the two-point statistic, i.e., PS, yields robust results against baryonic effects, the overall constraining power is weak compared with peak counts and MFs. On the other hand, using one of peak counts or MFs, or combined analysis with multiple statistics, results in a biased parameter estimate. The bias can be as large as 1σ for the HSC survey and will be more significant for upcoming wider-area surveys. We suggest to use an optimized combination so that the baryonic effects on parameter estimation are mitigated. Such a “calibrated” combination can
Power flow as a complement to statistical energy analysis and finite element analysis
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1987-01-01
Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range a relatively large number of resonances exist, which makes the finite element method too costly, while SEA methods can only predict average levels. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. This power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies to form a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.
Assessing statistical power of SNPs for population structure and conservation studies.
Morin, Phillip A; Martien, Karen K; Taylor, Barbara L
2009-01-01
Single nucleotide polymorphisms (SNPs) have been proposed by some as the new frontier for population studies, and several papers have presented theoretical and empirical evidence reporting the advantages and limitations of SNPs. As a practical matter, however, it remains unclear how many SNP markers will be required or what the optimal characteristics of those markers should be in order to obtain sufficient statistical power to detect different levels of population differentiation. We use a hypothetical case to illustrate the process of designing a population genetics project, and present results from simulations that address several issues for maximizing statistical power to detect differentiation while minimizing the amount of effort in developing SNPs. Results indicate that (i) while ~30 SNPs should be sufficient to detect moderate (F(ST) = 0.01) levels of differentiation, studies aimed at detecting demographic independence (e.g. F(ST) < 0.005) may require 80 or more SNPs and large sample sizes; (ii) different SNP allele frequencies have little effect on power, and thus, selection of SNPs can be relatively unbiased; (iii) increasing the sample size has a strong effect on power, so that the number of loci can be minimized when sample number is known, and increasing sample size is almost always beneficial; and (iv) power is increased by including multiple SNPs within loci and inferring haplotypes, rather than trying to use only unlinked SNPs. This also has the practical benefit of reducing the SNP ascertainment effort, and may influence the decision of whether to seek SNPs in coding or noncoding regions. PMID:21564568
Statistical analysis of the cosmic microwave background: Power spectra and foregrounds
NASA Astrophysics Data System (ADS)
O'Dwyer, Ian J.
2005-11-01
In this thesis I examine some of the challenges associated with analyzing Cosmic Microwave Background (CMB) data and present a novel approach to solving the problem of power spectrum estimation, which is called MAGIC (MAGIC Allows Global Inference of Covariance). In light of the computational difficulty of a brute force approach to power spectrum estimation, I review several approaches which have been applied to the problem and show an example application of such an approximate method to experimental CMB data from the Background Emission Anisotropy Scanning Telescope (BEAST). I then introduce MAGIC, a new approach to power spectrum estimation based on a Bayesian statistical analysis of the data utilizing Gibbs sampling. I demonstrate application of this method to the all-sky Wilkinson Microwave Anisotropy Probe (WMAP) data. The results are in broad agreement with those obtained originally by the WMAP team. Since MAGIC generates a full description of each C_ℓ, it is possible to examine several issues raised by the best-fit WMAP power spectrum, for example the perceived lack of power at low ℓ. It is found that the distribution of C_ℓ's at low ℓ is significantly non-Gaussian and, based on the exact analysis presented here, the "low quadrupole issue" can be attributed to a statistical fluctuation. Finally, I examine the effect of Galactic foreground contamination on CMB experiments and describe the principal foregrounds. I show that it is possible to include the foreground components in a self-consistent fashion within the statistical framework of MAGIC and give explicit examples of how this might be achieved. Foreground contamination will become an increasingly important issue in CMB data analysis, and the ability of this new algorithm to produce an exact power spectrum in a computationally feasible time, coupled with the foreground component separation and removal, is an exciting development in CMB data analysis. When considered with current algorithmic developments
21 CFR 1404.900 - Adequate evidence.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Adequate evidence. 1404.900 Section 1404.900 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL POLICY GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 1404.900 Adequate evidence. Adequate evidence means information sufficient to support the reasonable belief that a particular...
29 CFR 98.900 - Adequate evidence.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 1 2010-07-01 2010-07-01 true Adequate evidence. 98.900 Section 98.900 Labor Office of the Secretary of Labor GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 98.900 Adequate evidence. Adequate evidence means information sufficient to support the reasonable belief that a...
Statistics of the radiated field of a space-to-earth microwave power transfer system
NASA Technical Reports Server (NTRS)
Stevens, G. H.; Leininger, G.
1976-01-01
Statistics such as the average power density pattern, the variance of the power density pattern, and the variance of the beam pointing error are related to hardware parameters such as transmitter rms phase error and rms amplitude error. A limitation on the spectral width of the phase reference for phase control was also established. A 1 km diameter transmitter appears feasible provided the total rms insertion phase errors of the phase control modules do not exceed 10 deg, amplitude errors do not exceed 10% rms, and the phase reference spectral width does not exceed approximately 3 kHz. With these conditions the expected radiation pattern is virtually the same as the error-free pattern, and the rms beam pointing error would be insignificant (approximately 10 meters).
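The tolerances quoted are consistent with the classical Ruze-type relation for random phase errors, G/G₀ ≈ exp(−σ_φ²); the small-error 1/(1 + σ_A²) amplitude factor below is an additional modeling assumption, so this is a rough sketch rather than the authors' analysis.

```python
from math import exp, radians

def array_gain_ratio(rms_phase_deg, rms_amp_frac):
    """Expected gain of a phased array relative to the error-free array.

    Phase errors follow the Ruze relation exp(-sigma_phi**2); independent
    amplitude errors contribute a 1/(1 + sigma_A**2) factor (small-error
    model, an assumption of this sketch).
    """
    sigma_phi = radians(rms_phase_deg)
    return exp(-sigma_phi ** 2) / (1.0 + rms_amp_frac ** 2)

ratio = array_gain_ratio(10.0, 0.10)  # the tolerances quoted in the abstract
```

With 10 deg rms phase and 10% rms amplitude errors the expected gain loss comes out at only a few percent, which matches the abstract's statement that the pattern is "virtually the same as the error-free pattern".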
Statistical distribution of pioneer vegetation: the role of local stream power
NASA Astrophysics Data System (ADS)
Crouzy, B.; Edmaier, K.; Pasquale, N.; Perona, P.
2012-12-01
We discuss results of a flume experiment on the colonization of river bars by pioneer vegetation and focus on the role of a non-constant local stream power in determining the statistics of riverbed and uprooted biomass characteristics (root length, number of roots and stem height). We verify the conjecture that the statistical distribution of riverbed vegetation subject to the action of flood disturbances can be obtained from the distribution before the flooding events combined to the relative resilience to floods of plants with given traits. By using fast growing vegetation (Avena sativa) we can access the competition between growth-associated root stabilization and uprooting by floods. We fix the hydrological timescale (in our experiment the arrival time between periodic flooding events) to be comparable with the biological timescales (plant germination and development rates). The sequence of flooding events is repeated until the surviving riverbed vegetation has grown out of scale with the uprooting capacity of the flood and the competition has stopped. We present and compare laboratory results obtained using converging and parallel channel walls to highlight the role of the local stream power in the process. The convergent geometry can be seen as the laboratory analog of different field conditions. At the scale of the bar it represents regions with flow concentration while at a larger scale it is an analog for a river with convergent banks, for an example see the work on the Tagliamento River by Gurnell and Petts (2006). As expected, we observe that for the convergent geometry the variability in the local stream power results in a longer tail of the distribution of root length for uprooted material compared to parallel geometries with an equal flow rate. More surprisingly, the presence of regions with increased stream power in the convergent experiments allows us to access two fundamentally different regimes. We observe that depending on the development stage
Detecting trends in raptor counts: power and type I error rates of various statistical tests
Hatfield, J.S.; Gould, W.R., IV; Hoover, B.A.; Fuller, M.R.; Lindquist, E.L.
1996-01-01
We conducted simulations that estimated power and type I error rates of statistical tests for detecting trends in raptor population count data collected from a single monitoring site. Results of the simulations were used to help analyze count data of bald eagles (Haliaeetus leucocephalus) from 7 national forests in Michigan, Minnesota, and Wisconsin during 1980-1989. Seven statistical tests were evaluated, including simple linear regression on the log scale and linear regression with a permutation test. Using 1,000 replications each, we simulated n = 10 and n = 50 years of count data and trends ranging from -5 to 5% change/year. We evaluated the tests at 3 critical levels (alpha = 0.01, 0.05, and 0.10) for both upper- and lower-tailed tests. Exponential count data were simulated by adding sampling error with a coefficient of variation of 40% from either a log-normal or autocorrelated log-normal distribution. Not surprisingly, tests performed with 50 years of data were much more powerful than tests with 10 years of data. Positive autocorrelation inflated alpha-levels upward from their nominal levels, making the tests less conservative and more likely to reject the null hypothesis of no trend. Of the tests studied, Cox and Stuart's test and Pollard's test clearly had lower power than the others. Surprisingly, the linear regression t-test, Collins' linear regression permutation test, and the nonparametric Lehmann's and Mann's tests all had similar power in our simulations. Analyses of the count data suggested that bald eagles had increasing trends on at least 2 of the 7 national forests during 1980-1989.
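A minimal Monte Carlo in the spirit of the simulations described (illustrative parameters, not the authors' exact design): exponential counts with multiplicative log-normal error at 40% CV, a log-linear regression slope test with fixed two-sided critical t-values, and a comparison of 10 versus 50 years of data.

```python
import math
import random

def trend_power(n_years, trend, cv, t_crit, reps=400, seed=1):
    """Fraction of simulated series whose log-linear slope is significant.

    Counts ~ exp(trend * t) with log-normal error whose coefficient of
    variation is `cv`; the slope is tested with OLS on log counts against
    the supplied two-sided critical t-value.
    """
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1 + cv ** 2))   # log-scale sd for the given CV
    t = list(range(n_years))
    tbar = sum(t) / n_years
    sxx = sum((x - tbar) ** 2 for x in t)
    hits = 0
    for _ in range(reps):
        y = [trend * x + rng.gauss(0, sigma) for x in t]   # log counts
        ybar = sum(y) / n_years
        slope = sum((x - tbar) * (yi - ybar) for x, yi in zip(t, y)) / sxx
        resid = sum((yi - ybar - slope * (x - tbar)) ** 2 for x, yi in zip(t, y))
        se = math.sqrt(resid / (n_years - 2) / sxx)
        hits += abs(slope / se) > t_crit
    return hits / reps

# 5%/yr trend, 40% CV; critical t for alpha = 0.05 at df = 8 and df = 48
p10 = trend_power(10, math.log(1.05), 0.40, t_crit=2.306)
p50 = trend_power(50, math.log(1.05), 0.40, t_crit=2.011)
```

Even this stripped-down version (no autocorrelation) reproduces the headline result: power with 50 years of counts dwarfs power with 10 years.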
Notes on the Statistical Power of the Binary State Speciation and Extinction (BiSSE) Model
Gamisch, Alexander
2016-01-01
The Binary State Speciation and Extinction (BiSSE) method is one of the most popular tools for investigating the rates of diversification and character evolution. Yet, based on previous simulation studies, it is commonly held that the BiSSE method requires phylogenetic trees of fairly large sample sizes (>300 taxa) in order to distinguish between the different models of speciation, extinction, or transition rate asymmetry. Here, the power of the BiSSE method is reevaluated by simulating trees of both small and large sample sizes (30, 60, 90, and 300 taxa) under various asymmetry models and root state assumptions. Results show that the power of the BiSSE method can be much higher, also in trees of small sample size, for detecting differences in speciation rate asymmetry than anticipated earlier. This, however, is not a consequence of any conceptual or mathematical flaw in the method per se but rather of assumptions about the character state at the root of the simulated trees and thus the underlying macroevolutionary model, which led to biased results and conclusions in earlier power assessments. As such, these earlier simulation studies used to determine the power of BiSSE were not incorrect but biased, leading to an overestimation of type-II statistical error for detecting differences in speciation rate but not for extinction and transition rates. PMID:27486297
Integrating statistical genetic and geospatial methods brings new power to phylogeography.
Chan, Lauren M; Brown, Jason L; Yoder, Anne D
2011-05-01
The field of phylogeography continues to grow in terms of power and accessibility. Initially uniting population genetics and phylogenetics, it now spans disciplines as diverse as geology, statistics, climatology, ecology, physiology, and bioinformatics to name a few. One major and recent integration driving the field forward is between "statistical phylogeography" and Geographic Information Systems (GIS) (Knowles, 2009). Merging genetic and geospatial data, and their associated methodological toolkits, is helping to bring explicit hypothesis testing to the field of phylogeography. Hypotheses derived from one approach can be reciprocally tested with data derived from the other field and the synthesis of these data can help place demographic events in an historical and spatial context, guide genetic sampling, and point to areas for further investigation. Here, we present three practical examples of empirical analysis that integrate statistical genetic and GIS tools to construct and test phylogeographic hypotheses. Insights into the evolutionary mechanisms underlying recent divergences can benefit from simultaneously considering diverse types of information to iteratively test and reformulate hypotheses. Our goal is to provide the reader with an introduction to the variety of available tools and their potential application to typical questions in phylogeography with the hope that integrative methods will be more broadly and commonly applied to other biological systems and data sets. PMID:21352934
Kinetic power of quasars and statistical excess of MOJAVE superluminal motions
NASA Astrophysics Data System (ADS)
López-Corredoira, M.; Perucho, M.
2012-08-01
Aims: The MOJAVE (MOnitoring of Jets in AGN with VLBA Experiments) survey contains 101 quasars with a total of 354 observed radio components that are different from the radio cores, among which 95% move with apparent projected superluminal velocities with respect to the core, and 45% have projected velocities larger than 10c (with a maximum velocity 60c). We try to determine whether this distribution is statistically probable, and we make an independent measure of the kinetic power required in the quasars to produce such powerful ejections. Methods: Doppler boosting effects are analyzed to determine the statistics of the superluminal motions. We integrate over all possible values of the Lorentz factor, the values of the kinetic energy corresponding to each component. The calculation of the mass in the ejection is carried out by assuming the minimum energy state, i.e., that the magnetic field and particle energy distributions are arranged in the most efficient way to produce the observed synchrotron emission. This kinetic energy is multiplied by the frequency at which the portions of the jet fluid identified as "blobs" are produced. Hence, we estimate the average total power released by the quasars in the form of kinetic energy in the long term on pc-scales. Results: A selection effect in which both the core and the blobs of the quasar are affected by huge Doppler-boosting enhancement increases the probability of finding a jet ejected within 10 degrees of the line of sight ≳ 40 times above what one would expect for a random distribution of ejection, which explains the ratios of the very high projected velocities given above. The average total kinetic power of each MOJAVE quasar should be very high to obtain this distribution: ~ 7 × 1047 erg/s. This amount is much higher than previous estimates of kinetic power on kpc-scales based on the analysis of cavities in X-ray gas or radio lobes in samples of objects of much lower radio luminosity but similar black hole
Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants
NASA Astrophysics Data System (ADS)
Rajasekar, Vidyashree
This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and the components/chemicals used in artificial soil formulation are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance, and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si due to its smaller cells were eliminated on the mono-Si coupons with large cells, yielding highly repeatable measurements. This study indicates that reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and a statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years under the cold-dry climate of New York were evaluated. A defect chart, degradation rates (at both string and module levels), and a safety map were generated using the field-measured data. A statistical reliability tool, FMECA, which uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly
Case Studies for the Statistical Design of Experiments Applied to Powered Rotor Wind Tunnel Tests
NASA Technical Reports Server (NTRS)
Overmeyer, Austin D.; Tanner, Philip E.; Martin, Preston B.; Commo, Sean A.
2015-01-01
The application of statistical Design of Experiments (DOE) to helicopter wind tunnel testing was explored during two powered rotor wind tunnel entries during the summers of 2012 and 2013. These tests were performed jointly by the U.S. Army Aviation Development Directorate Joint Research Program Office and NASA Rotary Wing Project Office, currently the Revolutionary Vertical Lift Project, at NASA Langley Research Center located in Hampton, Virginia. Both entries were conducted in the 14- by 22-Foot Subsonic Tunnel with a small portion of the overall tests devoted to developing case studies of the DOE approach as it applies to powered rotor testing. A 16-47 times reduction in the number of data points required was estimated by comparing the DOE approach to conventional testing methods. The average error for the DOE surface response model for the OH-58F test was 0.95 percent and 4.06 percent for drag and download, respectively. The DOE surface response model of the Active Flow Control test captured the drag within 4.1 percent of measured data. The operational differences between the two testing approaches are identified, but did not prevent the safe operation of the powered rotor model throughout the DOE test matrices.
Conductance statistics for the power-law banded random matrix model
Martinez-Mendoza, A. J.; Mendez-Bermudez, J. A.; Varga, Imre
2010-12-21
We study numerically the conductance statistics of the one-dimensional (1D) Anderson model with random long-range hoppings described by the Power-law Banded Random Matrix (PBRM) model. Within a scattering approach to electronic transport, we consider two scattering setups in the absence and presence of direct processes: 2M single-mode leads attached to one side and to opposite sides of 1D circular samples. For both setups we show that (i) the probability distribution of the logarithm of the conductance T behaves as w(ln T) ∝ T^(M²/2), for T <<
NASA Astrophysics Data System (ADS)
Bianucci, M.
2016-01-01
This letter has two main goals. The first one is to give a physically reasonable explanation for the use of stochastic models for mimicking the apparent random features of the El Niño-Southern Oscillation (ENSO) phenomenon. The second one is to obtain, from the theory, an analytical expression for the equilibrium density function of the anomaly sea surface temperature, an expression that fits the data from observations well, reproducing the asymmetry and the power-law tail of the histograms of the NIÑO3 index. We succeed in these tasks by exploiting some recent theoretical results of the author in the field of the dynamical origin of stochastic processes. More precisely, we apply this approach to the celebrated recharge oscillator model (ROM), weakly interacting via a multiplicative term with a general deterministic complex forcing (Madden-Julian Oscillations, westerly wind bursts, etc.), and we obtain a Fokker-Planck equation that describes the statistical behavior of the ROM.
Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van't
2012-03-15
Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
Wagner, Tyler; Irwin, Brian J.; James R. Bence; Daniel B. Hayes
2016-01-01
Monitoring to detect temporal trends in biological and habitat indices is a critical component of fisheries management. Thus, it is important that management objectives are linked to monitoring objectives. This linkage requires a definition of what constitutes a management-relevant “temporal trend.” It is also important to develop expectations for the amount of time required to detect a trend (i.e., statistical power) and for choosing an appropriate statistical model for analysis. We provide an overview of temporal trends commonly encountered in fisheries management, review published studies that evaluated statistical power of long-term trend detection, and illustrate dynamic linear models in a Bayesian context, as an additional analytical approach focused on shorter term change. We show that monitoring programs generally have low statistical power for detecting linear temporal trends and argue that often management should be focused on different definitions of trends, some of which can be better addressed by alternative analytical approaches.
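The low statistical power of long-term monitoring for linear trend detection can be illustrated with a small simulation. In the sketch below, the trend magnitudes, noise level, and the use of a normal-approximation cutoff are hypothetical choices for illustration, not values taken from the review.

```python
import math
import random

def trend_power(years, slope, sigma, reps=1000, seed=7):
    """Monte-Carlo power of an ordinary least-squares test for a linear
    temporal trend in an annually sampled index with Gaussian noise.
    All magnitudes here are hypothetical, chosen only for illustration."""
    rng = random.Random(seed)
    xs = list(range(years))
    xbar = sum(xs) / years
    sxx = sum((x - xbar) ** 2 for x in xs)
    detections = 0
    for _ in range(reps):
        ys = [slope * x + rng.gauss(0.0, sigma) for x in xs]
        ybar = sum(ys) / years
        b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
        resid = [y - ybar - b * (x - xbar) for x, y in zip(xs, ys)]
        s2 = sum(r * r for r in resid) / (years - 2)  # residual variance
        if abs(b) / math.sqrt(s2 / sxx) > 1.96:  # normal approx. to the t cutoff
            detections += 1
    return detections / reps
```

A weak trend over a short series is almost never detected, while a stronger trend over 30 years is detected almost always, which is the pattern behind the review's warning about low power for trend detection.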
Power-law distributions in economics: a nonextensive statistical approach (Invited Paper)
NASA Astrophysics Data System (ADS)
Duarte Queiros, Silvio M.; Anteneodo, Celia; Tsallis, Constantino
2005-05-01
The cornerstone of Boltzmann-Gibbs (BG) statistical mechanics is the Boltzmann-Gibbs-Jaynes-Shannon entropy S_BG ≡ −k ∫ dx f(x) ln f(x), where k is a positive constant and f(x) a probability density function. This theory has exhibited, over more than one century, great success in the treatment of systems where short spatio-temporal correlations dominate. There are, however, anomalous natural and artificial systems that violate the basic requirements for its applicability. Different physical entropies, other than the standard one, appear to be necessary in order to satisfactorily deal with such anomalies. One such entropy is S_q ≡ k (1 − ∫ dx [f(x)]^q)/(q − 1) (with S_1 = S_BG), where the entropic index q is a real parameter. It has been proposed as the basis for a generalization, referred to as nonextensive statistical mechanics, of the BG theory. S_q shares with S_BG four remarkable properties, namely concavity (∀q > 0), Lesche-stability (∀q > 0), finiteness of the entropy production per unit time (q < 2), and additivity (for at least a compact support of q including q = 1). The simultaneous validity of these properties suggests that S_q is appropriate for bridging, at a macroscopic level, with classical thermodynamics itself. In the same natural way that exponential probability functions arise in the standard context, power-law tailed distributions, even with exponents outside the Lévy range, arise in the nonextensive framework. In this review, we intend to show that many processes of interest in economics, for which fat-tailed probability functions are empirically observed, can be described in terms of the statistical mechanics that underlies the nonextensive theory.
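For a discrete distribution, the two entropies compared in the abstract can be written down in a few lines. This is a discrete analogue of the integral forms (with k = 1 by default), useful for checking the q → 1 limit numerically:

```python
import math

def tsallis_entropy(p, q, k=1.0):
    """Discrete analogue of the entropies compared in the abstract:
    S_q = k * (1 - sum_i p_i**q) / (q - 1), which reduces to the
    Boltzmann-Gibbs-Shannon entropy -k * sum_i p_i * ln(p_i) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -k * sum(pi * math.log(pi) for pi in p if pi > 0)
    return k * (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)
```

For the uniform distribution over four states, S_1 = ln 4 ≈ 1.386 while S_2 = 3/4, and S_q approaches the BG-Shannon value continuously as q → 1.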
Eisenberg, Dan T.A.; Kuzawa, Christopher W.; Hayes, M. Geoffrey
2015-01-01
Objectives Telomere length (TL) is commonly measured using quantitative PCR (qPCR). Although easier than the Southern blot of terminal restriction fragments (TRF) TL measurement method, one drawback of qPCR is that it introduces greater measurement error and thus reduces the statistical power of analyses. To address a potential source of measurement error, we consider the effect of well position on qPCR TL measurements. Methods qPCR TL data from 3,638 people run on a Bio-Rad iCycler iQ are reanalyzed here. To evaluate measurement validity, correspondence with TRF, with age, and between mother and offspring is examined. Results First, we present evidence for systematic variation in qPCR TL measurements in relation to thermocycler well position. Controlling for these well-position effects consistently improves measurement validity and yields estimated improvements in statistical power equivalent to increasing sample sizes by 16%. We additionally evaluated the linearity of the relationships between telomere and single-copy gene control amplicons and between qPCR and TRF measures. We find that, unlike some previous reports, our data exhibit linear relationships. We introduce the standard error in percent, a superior method for quantifying measurement error compared to the commonly used coefficient of variation. Using this measure, we find that excluding samples with high measurement error does not improve measurement validity. Conclusions Future studies using block-based thermocyclers should consider well-position effects. Since additional information can be gleaned from well-position corrections, re-running analyses of previous results with well-position correction could serve as an independent test of the validity of these results. PMID:25757675
Using genomic annotations increases statistical power to detect eGenes
Duong, Dat; Zou, Jennifer; Hormozdiari, Farhad; Sul, Jae Hoon; Ernst, Jason; Han, Buhm; Eskin, Eleazar
2016-01-01
Motivation: Expression quantitative trait loci (eQTLs) are genetic variants that affect gene expression. In eQTL studies, one important task is to find eGenes, or genes whose expression is associated with at least one eQTL. The standard statistical method to determine whether a gene is an eGene requires association testing at all nearby variants and a permutation test to correct for multiple testing. The standard method, however, does not consider genomic annotation of the variants. In practice, variants near gene transcription start sites (TSSs) or certain histone modifications are likely to regulate gene expression. In this article, we introduce a novel eGene detection method that considers this empirical evidence and thereby increases the statistical power. Results: We applied our method to the liver Genotype-Tissue Expression (GTEx) data using distance from TSSs, DNase hypersensitivity sites, and six histone modifications as the genomic annotations for the variants. Each of these annotations helped us detect more candidate eGenes. Distance from TSS appears to be the most important annotation; specifically, using this annotation, our method discovered 50% more candidate eGenes than the standard permutation method. Contact: buhm.han@amc.seoul.kr or eeskin@cs.ucla.edu PMID:27307612
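The standard permutation test that the abstract uses as its baseline (without the annotation weighting the authors add) can be sketched as follows. The |Pearson correlation| statistic and the flat genotype/expression layout are simplified placeholders, not the GTEx pipeline:

```python
import random

def egene_pvalue(expr, genotypes, n_perm=500, seed=3):
    """Standard permutation test for an eGene: compare the best observed
    per-variant association against its null distribution under random
    shuffles of the expression vector. |Pearson correlation| stands in
    for the association statistic; the data layout is a simplification."""
    rng = random.Random(seed)

    def corr(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sxx = sum((x - mx) ** 2 for x in xs)
        syy = sum((y - my) ** 2 for y in ys)
        return sxy / (sxx * syy) ** 0.5 if sxx * syy > 0 else 0.0

    def best_stat(e):
        # strongest association across all nearby variants
        return max(abs(corr(g, e)) for g in genotypes)

    observed = best_stat(expr)
    exceed = sum(1 for _ in range(n_perm)
                 if best_stat(rng.sample(expr, len(expr))) >= observed)
    return (exceed + 1) / (n_perm + 1)  # add-one permutation p-value
```

Shuffling the expression vector preserves the correlation structure among the variants while breaking any genotype-expression association, which is what makes the minimum-p null distribution valid under multiple testing.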
Statistics of 150-km echoes over Jicamarca based on low-power VHF observations
NASA Astrophysics Data System (ADS)
Chau, J. L.; Kudeki, E.
2006-07-01
In this work we summarize the statistics of the so-called 150-km echoes obtained with a low-power VHF radar operating at the Jicamarca Radio Observatory (11.97° S, 76.87° W, and 1.3° dip angle at 150-km altitude) in Peru. Our results are based on almost four years of observations between August 2001 and July 2005 (approximately 150 days per year). The majority of the observations were conducted between 08:00 and 17:00 LT. We present the statistics of occurrence of the echoes for each of the four seasons as a function of time of day and altitude. The occurrence frequency of the echoes is ~75% around noon; it starts decreasing after 15:00 LT, and the echoes disappear after 17:00 LT in all seasons. As shown in previous campaign observations, the 150-km echoes appear at a higher altitude (>150 km) in narrow layers in the morning, reach lower altitudes (~135 km) around noon, and disappear at higher altitudes (>150 km) after 17:00 LT. We show that although 150-km echoes are observed all year long, they exhibit a clear seasonal variability in altitudinal coverage and in the percentage of occurrence around noon and early in the morning. We also show that there is a strong day-to-day variability and no correlation with magnetic activity. Although our results do not solve the 150-km riddle, they should be taken into account when a reasonable theory is proposed.
NASA Astrophysics Data System (ADS)
Ladoni, Moslem; Kravchenko, Sasha
2014-05-01
Conservation agricultural management practices have the potential to increase soil organic carbon sequestration. However, due to the typically slow response of soil organic C to management and due to its large spatial variability, many researchers fail to detect statistically significant management effects on soil organic carbon in their studies. One solution that has been commonly applied is to use active fractions of soil organic C for treatment comparisons. Active pools of soil organic C have been shown to respond to management changes faster than total C; however, it is possible that the larger variability associated with these pools makes their use for treatment comparisons more difficult. The objectives of this study are to assess the variability of total C and active C pools and then to use power analysis to investigate the probability of detecting significant differences among the treatments for total C and for different active pools of C. We also explored the benefit of applying additional soil and landscape data as covariates to explain some of the variability and to enhance the statistical power for different pools of C. We collected 66 soil cores from 10 agricultural fields under three different management treatments, namely corn-soybean-wheat rotation systems with 1) conventional chemical inputs, 2) low chemical inputs with cover crops, and 3) organic management with cover crops. The cores were analyzed for total organic carbon (TOC) and for two active C pool characteristics, particulate organic carbon (POC) and short-term mineralizable carbon (SMC). In addition, for each core we determined the values of potential covariates, including soil particle size distribution, bulk density, and topographical terrain attributes. Power analysis was conducted using the estimates of variances from the obtained data and a series of hypothesized management effects. The range of considered hypothesized effects consisted of 10-100% increases under low-input, 10
34 CFR 85.900 - Adequate evidence.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Definitions § 85.900 Adequate evidence. Adequate evidence means information sufficient to support the reasonable belief that a particular act or omission has occurred. Authority: E.O. 12549 (3 CFR, 1986 Comp., p. 189); E.O 12689 (3 CFR, 1989 Comp., p. 235); 20 U.S.C. 1082, 1094, 1221e-3 and 3474; and Sec....
29 CFR 452.110 - Adequate safeguards.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 2 2010-07-01 2010-07-01 false Adequate safeguards. 452.110 Section 452.110 Labor... DISCLOSURE ACT OF 1959 Election Procedures; Rights of Members § 452.110 Adequate safeguards. (a) In addition to the election safeguards discussed in this part, the Act contains a general mandate in section...
29 CFR 452.110 - Adequate safeguards.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 29 Labor 2 2011-07-01 2011-07-01 false Adequate safeguards. 452.110 Section 452.110 Labor... DISCLOSURE ACT OF 1959 Election Procedures; Rights of Members § 452.110 Adequate safeguards. (a) In addition to the election safeguards discussed in this part, the Act contains a general mandate in section...
Hacke, P.; Spataru, S.
2014-08-01
We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. Activation energy for the mean time to failure (1% relative) of 0.85 eV was determined and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
The profound impact of negative power law noise on statistical estimation.
Reinhardt, Victor S
2010-01-01
This paper investigates the profound impact of negative power law (neg-p) noise - that is, noise with a power spectral density L_p(f) ∝ |f|^p for p < 0 - on the ability of practical implementations of statistical estimation or fitting techniques, such as a least squares fit (LSQF) or a Kalman filter, to generate valid results. It demonstrates that such neg-p noise behaves more like systematic error than conventional noise, because neg-p noise is highly correlated, non-stationary, non-mean-ergodic, and has an infinite correlation time τ_c. It is further demonstrated that stationary but correlated noise will also cause invalid estimation behavior when the condition T > τ_c is not met, where T is the data collection interval for estimation. Thus, it is shown that neg-p noise, with its infinite τ_c, can generate anomalous estimation results for all values of T, except in certain circumstances. A covariant theory is developed explaining much of this anomalous estimation behavior. However, simulations of the estimation behavior of neg-p noise demonstrate that the subject cannot be fully understood in terms of covariant theory or mean ergodicity. It is finally conjectured that one must investigate the variance ergodicity properties of neg-p noise through the use of 4th-order correlation theory to fully explain such simulated behavior. PMID:20040429
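Neg-p noise of the kind analyzed here can be synthesized by shaping white noise in the frequency domain. The sketch below uses a naive O(n²) inverse DFT purely for illustration (real code would use an FFT), and makes the strong correlation of p < 0 noise easy to verify empirically:

```python
import cmath
import math
import random

def powerlaw_noise(n, p, rng):
    """Synthesize n samples whose power spectral density scales as |f|**p
    by shaping white Gaussian noise in the frequency domain. A naive
    O(n^2) inverse DFT is used purely for illustration. For p < 0
    ("neg-p" noise) low frequencies dominate, producing the long
    correlation times discussed in the paper."""
    spec = [0j] * n
    for k in range(1, n // 2):             # skip the DC and Nyquist bins
        amp = (k / n) ** (p / 2.0)         # amplitude ~ |f|**(p/2)
        spec[k] = amp * complex(rng.gauss(0, 1), rng.gauss(0, 1))
        spec[n - k] = spec[k].conjugate()  # Hermitian symmetry -> real series
    return [sum(spec[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n
            for t in range(n)]
```

The lag-1 autocorrelation of such a series sits near 1 for p = -2 but near 0 for p = 0 (white noise), a simple view of the long correlation time that undermines a naive least-squares fit.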
Perles, Stephanie J.; Wagner, Tyler; Irwin, Brian J.; Manning, Douglas R.; Callahan, Kristina K.; Marshall, Matthew R.
2014-01-01
Forests are socioeconomically and ecologically important ecosystems that are exposed to a variety of natural and anthropogenic stressors. As such, monitoring forest condition and detecting temporal changes therein remain critical to sound public and private forestland management. The National Park Service’s Vital Signs monitoring program collects information on many forest health indicators, including species richness, cover by exotics, browse pressure, and forest regeneration. We applied a mixed-model approach to partition variability in data for 30 forest health indicators collected from several national parks in the eastern United States. We then used the estimated variance components in a simulation model to evaluate trend detection capabilities for each indicator. We investigated the extent to which the following factors affected ability to detect trends: (a) sample design: using simple panel versus connected panel design, (b) effect size: increasing trend magnitude, (c) sample size: varying the number of plots sampled each year, and (d) stratified sampling: post-stratifying plots into vegetation domains. Statistical power varied among indicators; however, indicators that measured the proportion of a total yielded higher power when compared to indicators that measured absolute or average values. In addition, the total variability for an indicator appeared to influence power to detect temporal trends more than how total variance was partitioned among spatial and temporal sources. Based on these analyses and the monitoring objectives of the Vital Signs program, the current sampling design is likely overly intensive for detecting a 5 % trend·year−1 for all indicators and is appropriate for detecting a 1 % trend·year−1 in most indicators.
NASA Astrophysics Data System (ADS)
Perles, Stephanie J.; Wagner, Tyler; Irwin, Brian J.; Manning, Douglas R.; Callahan, Kristina K.; Marshall, Matthew R.
2014-09-01
Forests are socioeconomically and ecologically important ecosystems that are exposed to a variety of natural and anthropogenic stressors. As such, monitoring forest condition and detecting temporal changes therein remain critical to sound public and private forestland management. The National Park Service's Vital Signs monitoring program collects information on many forest health indicators, including species richness, cover by exotics, browse pressure, and forest regeneration. We applied a mixed-model approach to partition variability in data for 30 forest health indicators collected from several national parks in the eastern United States. We then used the estimated variance components in a simulation model to evaluate trend detection capabilities for each indicator. We investigated the extent to which the following factors affected ability to detect trends: (a) sample design: using simple panel versus connected panel design, (b) effect size: increasing trend magnitude, (c) sample size: varying the number of plots sampled each year, and (d) stratified sampling: post-stratifying plots into vegetation domains. Statistical power varied among indicators; however, indicators that measured the proportion of a total yielded higher power when compared to indicators that measured absolute or average values. In addition, the total variability for an indicator appeared to influence power to detect temporal trends more than how total variance was partitioned among spatial and temporal sources. Based on these analyses and the monitoring objectives of the Vital Signs program, the current sampling design is likely overly intensive for detecting a 5 % trend·year-1 for all indicators and is appropriate for detecting a 1 % trend·year-1 in most indicators.
Americans Getting Adequate Water Daily, CDC Finds
... medlineplus/news/fullstory_158510.html Americans Getting Adequate Water Daily, CDC Finds Men take in an average ... new government report finds most are getting enough water each day. The data, from the U.S. National ...
GeneMarker® Genotyping Software: Tools to Increase the Statistical Power of DNA Fragment Analysis
Hulce, D.; Li, X.; Snyder-Leiby, T.; Jonathan Liu, C.S.
2011-01-01
The discriminatory power of post-genotyping analyses, such as kinship or clustering analysis, is dependent on the amount of genetic information obtained from the DNA fragment/genotyping analysis. The number of microsatellite loci amplified in one multiplex is limited by the number of dyes and overlapping loci boundaries, requiring researchers to amplify replicate samples with 2 or more multiplexes in order to obtain a genotype for 12-15 loci. AFLP is another method that is limited by the number of dyes, often requiring multiple amplifications of replicate samples to obtain more complete results. Traditionally, researchers export the genotyping results into a spreadsheet, manually combine the results for each individual, and then import them into a third software package for post-genotyping analysis. GeneMarker is highly accurate, user-friendly genotyping software that allows all of these steps to be done in one software package, avoiding potential errors from data transfer to different programs and decreasing the amount of time needed to process the results. The Merge Project tool automatically combines the results from replicate samples processed with different primer sets. Replicate animal (diploid) DNA samples were amplified with three different multiplexes; each multiplex provided information on 4-6 loci. The kinship analysis using the merged results provided a 10^17 increase in statistical power, ranging from 10^8 when 5 loci were used to 10^25 when 15 loci were used to determine potential relationship levels with identity-by-descent calculations. These same sample sets were used in clustering analysis to generate dendrograms. The dendrogram based on a single multiplex resulted in three branches at a given Euclidean distance. In comparison, the dendrogram constructed using the merged results had eight branches at the same Euclidean distance.
Socol, Yehoshua; Dobrzyński, Ludwik
2015-01-01
The atomic bomb survivors life-span study (LSS) is often claimed to support the linear no-threshold hypothesis (LNTH) of radiation carcinogenesis. This paper shows that this claim is baseless. The LSS data are equally or better described by an s-shaped dependence on radiation exposure with a threshold of about 0.3 Sievert (Sv) and saturation level at about 1.5 Sv. A Monte-Carlo simulation of possible LSS outcomes demonstrates that, given the weak statistical power, LSS cannot provide support for LNTH. Even if the LNTH is used at low dose and dose rates, its estimation of excess cancer mortality should be communicated as 2.5% per Sv, i.e., an increase of cancer mortality from about 20% spontaneous mortality to about 22.5% per Sv, which is about half of the usually cited value. The impact of the "neutron discrepancy problem" - the apparent difference between the calculated and measured values of neutron flux in Hiroshima - was studied and found to be marginal. Major revision of the radiation risk assessment paradigm is required. PMID:26673526
Nakamura, Kunio; Guizard, Nicolas; Fonov, Vladimir S.; Narayanan, Sridar; Collins, D. Louis; Arnold, Douglas L.
2013-01-01
Gray matter atrophy provides important insights into neurodegeneration in multiple sclerosis (MS) and can be used as a marker of neuroprotection in clinical trials. Jacobian integration is a method for measuring volume change that uses integration of the local Jacobian determinants of the nonlinear deformation field registering two images, and is a promising tool for measuring gray matter atrophy. Our main objective was to compare the statistical power of the Jacobian integration method to commonly used methods in terms of the sample size required to detect a treatment effect on gray matter atrophy. We used multi-center longitudinal data from relapsing–remitting MS patients and evaluated combinations of cross-sectional and longitudinal pre-processing with SIENAX/FSL, SPM, and FreeSurfer, as well as the Jacobian integration method. The Jacobian integration method outperformed these other commonly used methods, reducing the required sample size by a factor of 4–5. The results demonstrate the advantage of using the Jacobian integration method to assess neuroprotection in MS clinical trials. PMID:24266007
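The core operation of Jacobian integration - integrating the local Jacobian determinant of a deformation field over a region to obtain its volume change - can be demonstrated on an analytic 2-D deformation. The grid resolution and the test maps below are arbitrary illustrative choices; a real pipeline obtains the deformation from nonlinear registration of two scans.

```python
def volume_change(phi, n=50):
    """Core of the Jacobian-integration measure in miniature: integrate
    the local Jacobian determinant of a 2-D deformation `phi` over the
    unit square using central finite differences on an n x n grid.
    (A real pipeline derives `phi` from nonlinear image registration;
    here `phi` is any analytic map and n is an arbitrary resolution.)"""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            x, y = (i + 0.5) * h, (j + 0.5) * h
            # central differences for the 2x2 Jacobian of phi = (u, v)
            uxp, vxp = phi(x + h, y)
            uxm, vxm = phi(x - h, y)
            uyp, vyp = phi(x, y + h)
            uym, vym = phi(x, y - h)
            det = ((uxp - uxm) * (vyp - vym)
                   - (uyp - uym) * (vxp - vxm)) / (4 * h * h)
            total += det * h * h  # accumulate det(J) over the cell
    return total
```

A uniform 10% linear shrink (x, y) → (0.9x, 0.9y) returns 0.81, the exact area ratio; ratios below 1 over a gray matter region are the atrophy signal the abstract describes.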
Kogalovskii, M.R.
1995-03-01
This paper presents a review of problems related to statistical database systems, which are widespread in various fields of activity. Statistical databases (SDBs) are databases whose data are used for statistical analysis. Topics under consideration are: SDB peculiarities, properties of data models adequate for SDB requirements, metadata functions, null-value problems, SDB compromise protection problems, stored-data compression techniques, and statistical data representation means. Also examined is whether present Database Management Systems (DBMSs) satisfy SDB requirements. Some current research directions in SDB systems are considered.
ERIC Educational Resources Information Center
Daniel, Thomas Dyson
Statistical power in music education was examined by taking an in-depth look at quantitative articles published in the "Journal of Research in Music Education" between 1987 and 1991, inclusive. Of the 109 articles of the period, 78 were quantitative, with both parametric and nonparametric procedures considered. Sample sizes were those reported by…
Asbestos/NESHAP adequately wet guidance
Shafer, R.; Throwe, S.; Salgado, O.; Garlow, C.; Hoerath, E.
1990-12-01
The Asbestos NESHAP requires facility owners and/or operators involved in demolition and renovation activities to control emissions of particulate asbestos to the outside air, because no safe concentration of airborne asbestos has ever been established. The primary method used to control asbestos emissions is to adequately wet the asbestos-containing material (ACM) with a wetting agent prior to, during, and after demolition/renovation activities. The purpose of this document is to provide guidance to asbestos inspectors and the regulated community on how to determine if friable ACM is adequately wet as required by the Asbestos NESHAP.
Statistics of the epoch of reionization 21-cm signal - I. Power spectrum error-covariance
NASA Astrophysics Data System (ADS)
Mondal, Rajesh; Bharadwaj, Somnath; Majumdar, Suman
2016-02-01
The non-Gaussian nature of the epoch of reionization (EoR) 21-cm signal has a significant impact on the error variance of its power spectrum P(k). We have used a large ensemble of seminumerical simulations and an analytical model to estimate the effect of this non-Gaussianity on the entire error-covariance matrix C_ij. Our analytical model shows that C_ij has contributions from two sources. One is the usual variance for a Gaussian random field, which scales inversely with the number of modes that go into the estimation of P(k). The other is the trispectrum of the signal. Using the simulated 21-cm Signal Ensemble, an ensemble of the Randomized Signal, and Ensembles of Gaussian Random Ensembles, we have quantified the effect of the trispectrum on the error variance C_ii. We find that its relative contribution is comparable to or larger than that of the Gaussian term for the k range 0.3 ≤ k ≤ 1.0 Mpc⁻¹, and can even be ~200 times larger at k ~ 5 Mpc⁻¹. We also establish that the off-diagonal terms of C_ij have statistically significant non-zero values which arise purely from the trispectrum. This further signifies that the errors in different k modes are not independent. We find a strong correlation between the errors at large k values (≥0.5 Mpc⁻¹), and a weak correlation between the smallest and largest k values. There is also a small anticorrelation between the errors in the smallest and intermediate k values. These results are relevant for the k range that will be probed by the current and upcoming EoR 21-cm experiments.
Schroeder, Carl B.; Fawley, William M.; Esarey, Eric
2002-09-24
We investigate the statistical properties (e.g., shot-to-shot power fluctuations) of the radiation from a high-gain free-electron laser (FEL) operating in the nonlinear regime. We consider the case of an FEL amplifier reaching saturation whose shot-to-shot fluctuations in input radiation power follow a gamma distribution. We analyze the corresponding output power fluctuations at and beyond first saturation, including beam energy spread effects, and find that there are well-characterized values of undulator length for which the fluctuation level reaches a minimum.
ERIC Educational Resources Information Center
Texeira, Antonio; Rosa, Alvaro; Calapez, Teresa
2009-01-01
This article presents statistical power analysis (SPA) based on the normal distribution using Excel, adopting textbook and SPA approaches. The objective is to present the latter in a comparative way within a framework that is familiar to textbook level readers, as a first step to understand SPA with other distributions. The analysis focuses on the…
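As a rough Python analogue of the normal-distribution statistical power analysis the article carries out in Excel (an assumption about its content, since the abstract is truncated here): the power of a one-sided z-test follows directly from the standard normal CDF, which the stdlib provides via the error function.

```python
from math import erf, sqrt

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def z_test_power(effect, sigma, n, z_alpha=1.645):
    """Power of a one-sided one-sample z-test: probability of detecting a
    true mean shift of `effect`, with known s.d. `sigma`, sample size n,
    and critical value z_alpha (1.645 corresponds to the 0.05 level)."""
    return phi(effect * sqrt(n) / sigma - z_alpha)
```

For example, a half-standard-deviation effect with n = 30 gives power of roughly 0.86, and power approaches 1 as n grows, which is the textbook behavior the article's comparative framework is built around.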
Adequate supervision for children and adolescents.
Anderst, James; Moffatt, Mary
2014-11-01
Primary care providers (PCPs) have the opportunity to improve child health and well-being by addressing supervision issues before and/or after an injury or exposure has occurred. Appropriate anticipatory guidance on supervision at well-child visits can improve supervision of children, and may prevent future harm. Adequate supervision varies based on the child's development and maturity, and the risks in the child's environment. Consideration should be given to issues as wide ranging as swimming pools, falls, dating violence, and social media. By considering the likelihood of harm and the severity of the potential harm, caregivers may provide adequate supervision by minimizing risks to the child while still allowing the child to take "small" risks as needed for healthy development. Caregivers should initially focus on direct (visual, auditory, and proximity) supervision of the young child. Gradually, supervision needs to be adjusted as the child develops, emphasizing a safe environment and safe social interactions, with graduated independence. PCPs may foster adequate supervision by providing concrete guidance to caregivers. In addition to preventing injury, supervision includes fostering a safe, stable, and nurturing relationship with every child. PCPs should be familiar with age/developmentally based supervision risks, adequate supervision based on those risks, characteristics of neglectful supervision based on age/development, and ways to encourage appropriate supervision throughout childhood. PMID:25369578
Small Rural Schools CAN Have Adequate Curriculums.
ERIC Educational Resources Information Center
Loustaunau, Martha
The small rural school's foremost and largest problem is providing an adequate curriculum for students in a changing world. Often the small district cannot or is not willing to pay the per-pupil cost of curriculum specialists, specialized courses using expensive equipment no more than one period a day, and remodeled rooms to accommodate new…
Funding the Formula Adequately in Oklahoma
ERIC Educational Resources Information Center
Hancock, Kenneth
2015-01-01
This report is a longitudinal simulation study that looks at how the ratio of state support to local support affects the number of school districts that break the common school funding formula, which in turn affects the equity of distribution to the common schools. After nearly two decades of adequately supporting the funding formula, Oklahoma…
The Relation of Power of Statistical Tests to Range of Talent: A Correction and Amplification.
ERIC Educational Resources Information Center
Humphreys, Lloyd G.
1991-01-01
The difference in effect on power of restriction of range between treatments under experimental control and categories formed from examinee (individual differences) variables is discussed. Conditions under which there is loss or increase of power are explained. (SLD)
Inference of Statistical Patterns in Complex Geosystems: Fitting Power-law Distributions.
NASA Astrophysics Data System (ADS)
Deluca, Anna; Corral, Alvaro
2014-05-01
Power-law distributions contain precious information about a large variety of physical processes. Although there are sound theoretical grounds for these distributions, the empirical evidence giving support to power laws has been traditionally weak. Recently, Clauset et al. have proposed a systematic method to find over which range (if any) a certain distribution behaves as a power law. However, their method fails to recognize true (simulated) power-law tails in some instances, rejecting the power-law hypothesis. Moreover, the method does not perform well when it is extended to power-law distributions with an upper truncation. We present an alternative procedure, valid for truncated as well as for non-truncated power-law distributions, based on maximum likelihood estimation, the Kolmogorov-Smirnov goodness-of-fit test, and Monte Carlo simulations. We will test the performance of our method on several empirical datasets that were previously analyzed with less systematic approaches.
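The ingredients named above (maximum likelihood estimate, Kolmogorov-Smirnov distance, Monte Carlo goodness-of-fit) can be sketched for the non-truncated continuous case; this is an illustrative reconstruction, not the authors' implementation, and the truncated-law extension is omitted.

```python
import math
import random

def fit_alpha(xs, xmin):
    """Continuous power-law MLE above xmin: alpha = 1 + n / sum ln(x/xmin).
    Returns (alpha_hat, tail data)."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail), tail

def ks_stat(tail, xmin, alpha):
    """Kolmogorov-Smirnov distance between the tail data and fitted CDF."""
    tail = sorted(tail)
    n = len(tail)
    d = 0.0
    for i, x in enumerate(tail):
        cdf = 1.0 - (x / xmin) ** (1.0 - alpha)
        d = max(d, abs(cdf - i / n), abs(cdf - (i + 1) / n))
    return d

def pl_sample(n, xmin, alpha, rng):
    """Inverse-transform draws from a pure power law above xmin."""
    return [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
            for _ in range(n)]

def gof_pvalue(xs, xmin, trials=200, seed=1):
    """Monte Carlo goodness-of-fit p-value: the fraction of synthetic
    samples (drawn from the fitted law, then refitted) whose KS distance
    exceeds the observed one. Returns (alpha_hat, p_value)."""
    rng = random.Random(seed)
    alpha, tail = fit_alpha(xs, xmin)
    d_obs = ks_stat(tail, xmin, alpha)
    worse = 0
    for _ in range(trials):
        synth = pl_sample(len(tail), xmin, alpha, rng)
        a_s, t_s = fit_alpha(synth, xmin)
        if ks_stat(t_s, xmin, a_s) >= d_obs:
            worse += 1
    return alpha, worse / trials

# Demonstration on data drawn from a known power law (alpha = 2.5).
rng = random.Random(0)
data = pl_sample(2000, 1.0, 2.5, rng)
alpha_hat, p_val = gof_pvalue(data, 1.0)
```

A small p-value rejects the power-law hypothesis; refitting inside each Monte Carlo trial is what keeps the test honest about estimation uncertainty.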
Sex differences in discriminative power of volleyball game-related statistics.
João, Paulo Vicente; Leite, Nuno; Mesquita, Isabel; Sampaio, Jaime
2010-12-01
To identify sex differences in volleyball game-related statistics, the game-related statistics of several World Championships in 2007 (N=132) were analyzed using the software VIS from the International Volleyball Federation. Discriminant analysis was used to identify the game-related statistics which better discriminated performances by sex. Analysis yielded an emphasis on fault serves (SC = -.40), shot spikes (SC = .40), and reception digs (SC = .31). Considerable variability was evident in the game-related statistics profiles: men's volleyball games were better associated with terminal actions (errors of service), whereas women's volleyball games were characterized by continuous actions (in defense and attack). These differences may be related to the anthropometric and physiological differences between women and men and their influence on performance profiles. PMID:21319626
NASA Astrophysics Data System (ADS)
Tsallis, C.; Cirto, L. J. L.
2014-10-01
We briefly review the connection between statistical mechanics and thermodynamics. We show that, in order to satisfy thermodynamics and its Legendre-transformation mathematical frame, the celebrated Boltzmann-Gibbs (BG) statistical mechanics is sufficient but not necessary. Indeed, the N → ∞ limit of statistical mechanics is expected to be consistent with thermodynamics. For systems whose elements are generically independent or quasi-independent in the sense of the theory of probabilities, it is well known that the BG theory (based on the additive BG entropy) does satisfy this expectation. However, in complete analogy, other thermostatistical theories (e.g., q-statistics), based on nonadditive entropic functionals, also satisfy the very same expectation. We illustrate this standpoint with systems whose elements are strongly correlated in a specific manner, such that they escape the BG realm.
Power law statistics of force and acoustic emission from a slowly penetrated granular bed
NASA Astrophysics Data System (ADS)
Matsuyama, K.; Katsuragi, H.
2014-01-01
Penetration-resistant force and acoustic emission (AE) from a plunged granular bed are experimentally investigated through their power-law distribution forms. An AE sensor is buried in a glass-bead bed. Then, the bed is slowly penetrated by a solid sphere. During the penetration, the resistant force exerted on the sphere and the AE signal are measured. The resistant force shows a power-law relation to the penetration depth. The power-law exponent is independent of the penetration speed, while it seems to depend on the container's size. For the AE signal, we find that the size distribution of AE events obeys power laws. The power-law exponent depends on grain size. Using the energy scaling, the experimentally observed power-law exponents are discussed and compared to the Gutenberg-Richter (GR) law.
Chung, Moo K; Kim, Seung-Goo; Schaefer, Stacey M; van Reekum, Carien M; Peschke-Schmitz, Lara; Sutterer, Matthew J; Davidson, Richard J
2014-03-21
The sparse regression framework has been widely used in medical image processing and analysis. However, it has been rarely used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high frequency noise, only the first few terms are used in the expansion and higher frequency terms are simply thrown away. However, some lower frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we present a LB-based method to filter out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which reduces the rate of false negatives and hence improves statistical power. The sparse shape model is then applied in investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power. PMID:25302007
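Because the LB-eigenfunctions form an orthonormal basis, an L1 (lasso-style) penalty on the expansion coefficients reduces to coefficient-wise soft thresholding. A minimal sketch of that selection step with invented coefficients (not the authors' pipeline, which estimates the penalty level and works with full deformation fields):

```python
def soft_threshold(coeffs, lam):
    """L1-penalised least squares in an orthonormal basis: the minimiser of
    ||c - b||^2 + 2*lam*|b|_1 over b is obtained by shrinking each observed
    coefficient c toward zero by lam, zeroing those with |c| <= lam."""
    out = []
    for c in coeffs:
        if c > lam:
            out.append(c - lam)
        elif c < -lam:
            out.append(c + lam)
        else:
            out.append(0.0)  # insignificant eigenfunction dropped entirely
    return out

# Large coefficients survive (shrunk); small ones are removed regardless
# of whether they sit at low or high frequency.
selected = soft_threshold([3.0, -0.2, 0.5, -2.5], 1.0)
```

This is how a sparse penalty can keep a significant high-frequency term while discarding an insignificant low-frequency one, unlike plain truncation of the expansion.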
The Power of Student's t and Wilcoxon W Statistics: A Comparison.
ERIC Educational Resources Information Center
Rasmussen, Jeffrey Lee
1985-01-01
A recent study (Blair and Higgins, 1980) indicated a power advantage for the Wilcoxon W test over Student's t-test when calculated from a common mixed-normal sample. Results of the present study indicate that the t-test corrected for outliers shows a superior power curve to the Wilcoxon W.
ERIC Educational Resources Information Center
Jiang, Depeng; Pepler, Debra; Yao, Hongxing
2010-01-01
Do interventions work and for whom? For this article, we examined the influence of population heterogeneity on power in designing and evaluating interventions. On the basis of Monte Carlo simulations in Study 1, we demonstrated that the power to detect the overall intervention effect is lower for a mixture of two subpopulations than for a…
ERIC Educational Resources Information Center
Spybrook, Jessaca; Hedges, Larry; Borenstein, Michael
2014-01-01
Research designs in which clusters are the unit of randomization are quite common in the social sciences. Given the multilevel nature of these studies, the power analyses for these studies are more complex than in a simple individually randomized trial. Tools are now available to help researchers conduct power analyses for cluster randomized…
Bellan, Steven E.; Pulliam, Juliet R. C.; Pearson, Carl A. B.; Champredon, David; Fox, Spencer J.; Skrip, Laura; Galvani, Alison P.; Gambhir, Manoj; Lopman, Ben A.; Porco, Travis C.; Meyers, Lauren Ancel; Dushoff, Jonathan
2016-01-01
Background: Safe and effective vaccines may help end the ongoing Ebola virus disease (EVD) epidemic in West Africa, and mitigate future outbreaks. We evaluate the statistical validity and power of randomized controlled (RCT) and stepped-wedge cluster trial (SWCT) designs in Sierra Leone, where EVD incidence is spatiotemporally heterogeneous, and rapidly declining. Methods: We forecasted district-level EVD incidence over the next six months using a stochastic model fit to data from Sierra Leone. We then simulated RCT and SWCT designs in trial populations comprising geographically distinct clusters of high risk, taking into account realistic logistical constraints, as well as both individual-level and cluster-level variation in risk. We assessed false positive rates and power for parametric and nonparametric analyses of simulated trial data, across a range of vaccine efficacies and trial start dates. Findings: For an SWCT, regional variation in EVD incidence trends produced inflated false positive rates (up to 0.11 at α=0.05) under standard statistical models, but not when analyzed by a permutation test, whereas all analyses of RCTs remained valid. Assuming a six-month trial starting February 18, 2015, we estimate the power to detect a 90% efficacious vaccine to be between 48% and 89% for an RCT, and between 6.4% and 26% for an SWCT, depending on incidence within the trial population. We estimate that a one-month delay in implementation will reduce the power of the RCT and SWCT by 20% and 49%, respectively. Interpretation: Spatiotemporal variation in infection risk undermines the SWCT's statistical power. This variation also undercuts the SWCT's expected ethical advantages over the RCT, because the latter but not the former can prioritize high-risk clusters. Funding: US National Institutes of Health, US National Science Foundation, Canadian Institutes of Health Research. PMID:25886798
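Simulation-based power estimation of the kind this study performs can be illustrated with a much-simplified sketch: an individually randomized two-arm trial with homogeneous risk, analyzed by a one-sided two-proportion z-test. The incidence forecasting, cluster structure, and permutation analyses of the actual study are not reproduced, and all parameter values below are invented.

```python
import math
import random

def two_prop_z(x1, n1, x2, n2):
    """Two-proportion z statistic (arm 1 rate minus arm 2 rate)."""
    p = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return 0.0 if se == 0 else (x1 / n1 - x2 / n2) / se

def rct_power(n_per_arm, attack_rate, efficacy, sims=2000,
              z_crit=1.96, seed=7):
    """Estimate power as the fraction of simulated trials in which the
    control-vs-vaccine z statistic exceeds z_crit. Infections are drawn
    as independent Bernoulli events per participant."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        control = sum(rng.random() < attack_rate
                      for _ in range(n_per_arm))
        vaccine = sum(rng.random() < attack_rate * (1 - efficacy)
                      for _ in range(n_per_arm))
        if two_prop_z(control, n_per_arm, vaccine, n_per_arm) > z_crit:
            hits += 1
    return hits / sims
```

Lowering the attack rate in this sketch (analogous to a declining epidemic or a delayed trial start) visibly erodes the estimated power, which is the qualitative effect the study quantifies.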
ERIC Educational Resources Information Center
Groth, Randall E.
2013-01-01
A hypothetical framework to characterize statistical knowledge for teaching (SKT) is described. Empirical grounding for the framework is provided by artifacts from an undergraduate course for prospective teachers that concentrated on the development of SKT. The theoretical notion of "key developmental understanding" (KDU) is used to identify…
ERIC Educational Resources Information Center
Endress, Ansgar D.; Mehler, Jacques
2009-01-01
Word-segmentation, that is, the extraction of words from fluent speech, is one of the first problems language learners have to master. It is generally believed that statistical processes, in particular those tracking "transitional probabilities" (TPs), are important to word-segmentation. However, there is evidence that word forms are stored in…
Two universal physical principles shape the power-law statistics of real-world networks
Lorimer, Tom; Gomez, Florian; Stoop, Ruedi
2015-01-01
The study of complex networks has pursued an understanding of macroscopic behaviour by focusing on power-laws in microscopic observables. Here, we uncover two universal fundamental physical principles that are at the basis of complex network generation. These principles together predict the generic emergence of deviations from ideal power laws, which were previously discussed away by reference to the thermodynamic limit. Our approach proposes a paradigm shift in the physics of complex networks, toward the use of power-law deviations to infer meso-scale structure from macroscopic observations. PMID:26202858
Two universal physical principles shape the power-law statistics of real-world networks
NASA Astrophysics Data System (ADS)
Lorimer, Tom; Gomez, Florian; Stoop, Ruedi
2015-07-01
The study of complex networks has pursued an understanding of macroscopic behaviour by focusing on power-laws in microscopic observables. Here, we uncover two universal fundamental physical principles that are at the basis of complex network generation. These principles together predict the generic emergence of deviations from ideal power laws, which were previously discussed away by reference to the thermodynamic limit. Our approach proposes a paradigm shift in the physics of complex networks, toward the use of power-law deviations to infer meso-scale structure from macroscopic observations.
Statistical analysis of power-size-redshift distributions of extragalactic jets
NASA Technical Reports Server (NTRS)
Rosen, Alexander; Wiita, Paul J.
1991-01-01
This paper investigates whether a hot, sparse, yet cosmologically significant intergalactic medium is consistent with data collected from extragalactic radio sources. This is done by use of Monte Carlo simulations which employ previously run pseudohydrodynamical simulations to cover an observational parameter space. These observational parameters include the scale height, central density, and temperature of an (isothermal) galactic halo, and the power of the central engine which drives the jet. The Monte Carlo simulations generate distributions of sizes in bins of (received) power and redshift, which have been compared with observational data using Kolmogorov-Smirnov tests. Results of this analysis are consistent with the existence of an IGM with the temperature and density mentioned above. In addition, this analysis suggests that the active lifetime of powerful extragalactic radio sources decreases with increasing power.
Statistical evidence for power law temporal correlations in exploratory behaviour of rats.
Yadav, Chetan K; Verma, Mahendra K; Ghosh, Subhendu
2010-01-01
Dynamics of exploratory behaviour of rats and home base establishment is investigated. Time series of instantaneous speed of rats was computed from their position during exploration. The probability distribution function (PDF) of the speed obeys a power law distribution with exponents ranging from 2.1 to 2.32. The PDF of the recurrence time of large speed also exhibits a power law, P(τ) ~ τ^(-β), with β from 1.56 to 2.30. The power spectrum of the speed is in general agreement with the 1/f spectrum reported earlier. These observations indicate that the acquisition of spatial information during exploration is self-organized with power law temporal correlations. This provides a possible explanation for the home base behaviour of rats during exploration. The exploratory behaviour of rats resembles other systems exhibiting self-organized criticality, e.g., earthquakes, solar flares etc. PMID:20688133
NASA Technical Reports Server (NTRS)
Smith, Wayne Farrior
1973-01-01
The effect of finite source size on the power statistics in a reverberant room for pure tone excitation was investigated. Theoretical results indicate that the standard deviation of low frequency, pure tone finite sources is always less than that predicted by point source theory and considerably less when the source dimension approaches one-half an acoustic wavelength or greater. A supporting experimental study was conducted utilizing an eight inch loudspeaker and a 30 inch loudspeaker at eleven source positions. The resulting standard deviation of sound power output of the smaller speaker is in excellent agreement with both the derived finite source theory and existing point source theory, if the theoretical data is adjusted to account for experimental incomplete spatial averaging. However, the standard deviation of sound power output of the larger speaker is measurably lower than point source theory indicates, but is in good agreement with the finite source theory.
NASA Astrophysics Data System (ADS)
Woolley, Thomas W.; Dawson, George O.
It has been two decades since the first power analysis of a psychological journal and 10 years since the Journal of Research in Science Teaching made its contribution to this debate. One purpose of this article is to investigate what power-related changes, if any, have occurred in science education research over the past decade as a result of the earlier survey. In addition, previous recommendations are expanded and expounded upon within the context of more recent work in this area. The absence of any consistent mode of presenting statistical results, as well as little change with regard to power-related issues are reported. Guidelines for reporting the minimal amount of information demanded for clear and independent evaluation of research results by readers are also proposed.
Sensitivity of neutrinos to the supernova turbulence power spectrum: Point source statistics
NASA Astrophysics Data System (ADS)
Kneller, James P.; Kabadi, Neel V.
2015-07-01
The neutrinos emitted from the proto-neutron star created in a core-collapse supernova must run through a significant amount of turbulence before exiting the star. Turbulence can modify the flavor evolution of the neutrinos imprinting itself upon the signal detected here at Earth. The turbulence effect upon individual neutrinos, and the correlation between pairs of neutrinos, might exhibit sensitivity to the power spectrum of the turbulence, and recent analysis of the turbulence in a two-dimensional hydrodynamical simulation of a core-collapse supernova indicates the power spectrum may not be the Kolmogorov 5 /3 inverse power law as has been previously assumed. In this paper we study the effect of non-Kolmogorov turbulence power spectra upon neutrinos from a point source as a function of neutrino energy and turbulence amplitude at a fixed postbounce epoch. We find the two effects of turbulence upon the neutrinos—the distorted phase effect and the stimulated transitions—both possess strong and weak limits in which dependence upon the power spectrum is absent or evident, respectively. Since neutrinos of a given energy will exhibit these two effects at different epochs of the supernova each with evolving strength, we find there is sensitivity to the power spectrum present in the neutrino burst signal from a Galactic supernova.
A statistical framework for genetic association studies of power curves in bird flight
Lin, Min; Zhao, Wei
2006-01-01
How the power required for bird flight varies as a function of forward speed can be used to predict the flight style and behavioral strategy of a bird for feeding and migration. A U-shaped curve was observed between power and flight velocity in many birds, which is consistent with the theoretical prediction by aerodynamic models. In this article, we present a general genetic model for fine mapping of quantitative trait loci (QTL) responsible for power curves in a sample of birds drawn from a natural population. This model is developed within the maximum likelihood context, implemented with the EM algorithm for estimating the population genetic parameters of QTL and the simplex algorithm for estimating the QTL genotype-specific parameters of power curves. Using Monte Carlo simulation derived from empirical observations of power curves in the European starling (Sturnus vulgaris), we demonstrate how the underlying QTL for power curves can be detected from molecular markers and how the QTL detected affect the most appropriate flight speeds used to design an optimal migration strategy. The results from our model can be directly integrated into a conceptual framework for understanding flight origin and evolution. PMID:17066123
Dolan, T E; Lynch, P D; Karazsia, J L; Serafy, J E
2016-03-01
An expansion is underway of a nuclear power plant on the shoreline of Biscayne Bay, Florida, USA. While the precise effects of its construction and operation are unknown, impacts on surrounding marine habitats and biota are considered by experts to be likely. The objective of the present study was to determine the adequacy of an ongoing monitoring survey of fish communities associated with mangrove habitats directly adjacent to the power plant to detect fish community changes, should they occur, at three spatial scales. Using seasonally resolved data recorded during 532 fish surveys over an 8-year period, power analyses were performed for four mangrove fish metrics (fish diversity, fish density, and the occurrence of two ecologically important fish species: gray snapper (Lutjanus griseus) and goldspotted killifish (Floridichthys carpio)). Results indicated that the monitoring program at current sampling intensity allows for detection of <33% changes in fish density and diversity metrics in both the wet and the dry season in the two larger study areas. Sampling effort was found to be insufficient in either season to detect changes at this level (<33%) in species-specific occurrence metrics for the two fish species examined. The option of supplementing ongoing, biological monitoring programs for improved, focused change detection deserves consideration from both ecological and cost-benefit perspectives. PMID:26903208
NASA Astrophysics Data System (ADS)
Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J. J.; Bonaldi, A.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R. C.; Cardoso, J.-F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Chiang, L.-Y.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Comis, B.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Da Silva, A.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dolag, K.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Flores-Cacho, I.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Ganga, K.; Génova-Santos, R. T.; Giard, M.; Giardino, G.; Giraud-Héraud, Y.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Hansen, F. K.; Hanson, D.; Harrison, D.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lacasa, F.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Laureijs, R. J.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marcos-Caballero, A.; Maris, M.; Marshall, D. J.; Martin, P. 
G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Melchiorri, A.; Melin, J.-B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Osborne, S.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Starck, J.-L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L. A.; Wandelt, B. D.; White, S. D. M.; Yvon, D.; Zacchei, A.; Zonca, A.
2014-11-01
We have constructed the first all-sky map of the thermal Sunyaev-Zeldovich (tSZ) effect by applying specifically tailored component separation algorithms to the 100 to 857 GHz frequency channel maps from the Planck survey. This map shows an obvious galaxy cluster tSZ signal that is well matched with blindly detected clusters in the Planck SZ catalogue. To characterize the signal in the tSZ map we have computed its angular power spectrum. At large angular scales (ℓ < 60), the major foreground contaminant is the diffuse thermal dust emission. At small angular scales (ℓ > 500) the clustered cosmic infrared background and residual point sources are the major contaminants. These foregrounds are carefully modelled and subtracted. We thus measure the tSZ power spectrum over angular scales 0.17° ≲ θ ≲ 3.0° that were previously unexplored. The measured tSZ power spectrum is consistent with that expected from the Planck catalogue of SZ sources, with clear evidence of additional signal from unresolved clusters and, potentially, diffuse warm baryons. Marginalized band-powers of the Planck tSZ power spectrum and the best-fit model are given. The non-Gaussianity of the Compton parameter map is further characterized by computing its 1D probability distribution function and its bispectrum. The measured tSZ power spectrum and high order statistics are used to place constraints on σ8.
The power of 41%: A glimpse into the life of a statistic.
Tanis, Justin
2016-01-01
"Forty-one percent?" the man said with anguish on his face as he addressed the author, clutching my handout. "We're talking about my granddaughter here." He was referring to the finding from the National Transgender Discrimination Survey (NTDS) that 41% of 6,450 respondents said they had attempted suicide at some point in their lives. The author had passed out the executive summary of the survey's findings during a panel discussion at a family conference to illustrate the critical importance of acceptance of transgender people. During the question and answer period, this gentleman rose to talk about his beloved 8-year-old granddaughter who was in the process of transitioning socially from male to female in her elementary school. The statistics that the author was citing were not just numbers to him, and he wanted strategies-effective ones-to keep his granddaughter alive and thriving. The author has observed that the statistic about suicide attempts has, in essence, developed a life of its own. It has had several key audiences-academics and researchers, public policymakers, and members of the community, particularly transgender people and our families. This article explores some of the key takeaways from the survey and the ways in which the 41% statistic has affected conversations about the injustices transgender people face and the importance of family and societal acceptance. (PsycINFO Database Record) PMID:27380151
Real-time determination of total radiated power by bolometric cameras with statistical methods
Maraschek, M.; Fuchs, J.C.; Mast, K.F.; Mertens, V.; Zohm, H.
1998-01-01
A simpler and faster method for determining the total radiated power emitted from a tokamak plasma in real time has been developed. This quantity is normally calculated after the discharge by a deconvolution of line integrals from a bolometer camera. This time-consuming algorithm assumes constant emissivity on closed flux surfaces and therefore needs the exact magnetic equilibrium information. Thus, it is highly desirable to have a different, simpler way to determine the total radiated power in real time without additional magnetic equilibrium information. The real-time calculation of the total radiated power is done by a summation over 10 or 18 lines of sight selected out of a bolometer camera with 40 channels. The number of channels is restricted by the summation hardware. A new selection scheme, which uses a singular value decomposition, has been developed to select the required subset of line integrals from the camera. With this subset, a linear regression analysis was done against the radiated power calculated by the conventional algorithm. The selected channels are finally used with the regression coefficients as weighting factors to determine an estimate of the radiated power for subsequent discharges. This selection and the corresponding weighting factors can only be applied to discharges with a similar plasma shape, e.g., in our case the typical ASDEX Upgrade elliptical divertor plasma. © 1998 American Institute of Physics.
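The SVD-based channel selection plus regression weighting described in this abstract can be sketched with synthetic data. The paper's exact selection criterion is not given here, so this illustration uses leverage scores on the top right singular vectors (a standard subset-selection heuristic) followed by a least-squares fit of the weighting factors; all dimensions and noise levels are illustrative, not the experiment's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 200 time samples from a 40-channel bolometer camera:
# a few underlying emission patterns mixed into all channels, plus noise.
n_samples, n_channels, k = 200, 40, 10
latent = rng.normal(size=(n_samples, 5))
mixing = rng.normal(size=(5, n_channels))
A = latent @ mixing + 0.01 * rng.normal(size=(n_samples, n_channels))
# Reference total radiated power, as the slow full deconvolution would give it.
y = A.sum(axis=1) + 0.01 * rng.normal(size=n_samples)

# Rank channels by leverage score on the top-k right singular vectors,
# i.e. by how much each channel spans the dominant signal subspace.
_, _, Vt = np.linalg.svd(A, full_matrices=False)
leverage = (Vt[:k] ** 2).sum(axis=0)
selected = np.sort(np.argsort(leverage)[-k:])

# Linear regression against the reference power yields the per-channel
# weighting factors used for the fast real-time estimate.
X = A[:, selected]
weights, *_ = np.linalg.lstsq(X, y, rcond=None)
rel_err = np.linalg.norm(X @ weights - y) / np.linalg.norm(y)
```

In operation, only the 10 selected channels and their fixed weights would be summed in hardware for subsequent, similarly shaped discharges.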
Statistical Characterization of Solar Photovoltaic Power Variability at Small Timescales: Preprint
Shedd, S.; Hodge, B.-M.; Florita, A.; Orwig, K.
2012-08-01
Integrating large amounts of variable and uncertain solar photovoltaic power into the electricity grid is a growing concern for power system operators in a number of different regions. Power system operators typically accommodate variability, whether from load, wind, or solar, by carrying reserves that can quickly change their output to match the changes in the solar resource. At timescales in the seconds-to-minutes range, this is known as regulation reserve. Previous studies have shown that increasing the geographic diversity of solar resources can reduce the short-term variability of the power output. As the price of solar has decreased, the emergence of very large PV plants (greater than 10 MW) has become more common. These plants present an interesting case because they are large enough to exhibit some spatial smoothing by themselves. This work examines the variability of solar PV output among different arrays in a large (approximately 50 MW) PV plant in the western United States, including the correlation in power output changes between different arrays, as well as the aggregated plant output, at timescales ranging from one second to five minutes.
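The spatial-smoothing effect the abstract refers to can be illustrated with a toy simulation: if the second-to-second ramps of N arrays were statistically independent, the per-unit variability of the aggregate would shrink by about 1/√N. The array count, duration, and ramp distribution below are illustrative assumptions, not the studied plant's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# One hour of 1-second power ramps (per-unit) for 25 arrays whose short-term
# fluctuations are assumed independent with identical variance.
n_seconds, n_arrays = 3600, 25
ramps = rng.normal(0.0, 1.0, size=(n_seconds, n_arrays))

single_sd = ramps[:, 0].std()                   # ramp variability of one array
plant_sd = ramps.sum(axis=1).std() / n_arrays   # per-unit ramp variability of the plant
smoothing_factor = single_sd / plant_sd         # expect about sqrt(25) = 5
```

Real arrays within one plant are partially correlated, so the observed smoothing factor falls below this independent-array bound; measuring that correlation is exactly what the study set out to do.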
Adequation of mini satellites to oceanic altimetry missions
NASA Astrophysics Data System (ADS)
Bellaieche, G.; Aguttes, J. P.
1993-01-01
The suitability of the mini satellite concept for oceanic altimetry missions is discussed. Mission definition and the most constraining requirements (mesoscale observation, for example) show mini satellites to be quite adequate for such missions. Progress in altimeter characteristics, orbit determination, and position reporting allows consideration of oceanic altimetry missions using low Earth orbit satellites. Satellite constellation, ground-track keeping and orbital period, and required payload characteristics are presented. The mission requirements covering Sun-synchronous orbit, service area, ground system, and launcher characteristics, as well as the constellation maintenance strategy, are specified. Two options for the satellite, orbital mechanics, propulsion, onboard power and stabilization subsystems, onboard management, satellite-to-ground links, mechanical and thermal subsystems, budgets, and planning are discussed.
ERIC Educational Resources Information Center
Konstantopoulos, Spyros
2012-01-01
Field experiments with nested structures are becoming increasingly common, especially designs that assign randomly entire clusters such as schools to a treatment and a control group. In such large-scale cluster randomized studies the challenge is to obtain sufficient power of the test of the treatment effect. The objective is to maximize power…
Tests of Independence in Contingency Tables with Small Samples: A Comparison of Statistical Power.
ERIC Educational Resources Information Center
Parshall, Cynthia G.; Kromrey, Jeffrey D.
1996-01-01
Power and Type I error rates were estimated for contingency tables with small sample sizes for the following four types of tests: (1) Pearson's chi-square; (2) chi-square with Yates's continuity correction; (3) the likelihood ratio test; and (4) Fisher's Exact Test. Various marginal distributions, sample sizes, and effect sizes were examined. (SLD)
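The kind of Monte Carlo estimate of Type I error reported in this study can be sketched for the first of the four tests, Pearson's chi-square, implemented directly from its definition. The sample size, margins, and replication count below are illustrative choices, not the study's design.

```python
import numpy as np

rng = np.random.default_rng(2)

def pearson_chi2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row * col / table.sum()
    return ((table - expected) ** 2 / expected).sum()

# Estimate the Type I error rate under independence with a small sample
# (n = 20), rejecting when the statistic exceeds the 5% critical value of
# the chi-square distribution with 1 degree of freedom.
crit, n, reps = 3.841, 20, 4000
rejections = trials = 0
for _ in range(reps):
    a = rng.random(n) < 0.5
    b = rng.random(n) < 0.5
    table = np.array([[np.sum(a & b), np.sum(a & ~b)],
                      [np.sum(~a & b), np.sum(~a & ~b)]], dtype=float)
    if table.sum(axis=0).min() == 0 or table.sum(axis=1).min() == 0:
        continue  # statistic undefined when a margin is empty
    trials += 1
    rejections += pearson_chi2(table) > crit
type1_rate = rejections / trials
```

Repeating the loop with a dependence between `a` and `b` would give the power estimates; swapping in the Yates-corrected, likelihood-ratio, or exact statistic reproduces the full comparison in outline.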
NASA Astrophysics Data System (ADS)
Barkana, Rennan; Loeb, Abraham
2008-03-01
A new generation of radio telescopes is currently being built with the goal of tracing the cosmic distribution of atomic hydrogen at redshifts 6-15 through its 21-cm line. The observations will probe the large-scale brightness fluctuations sourced by ionization fluctuations during cosmic reionization. Since detailed maps will be difficult to extract due to noise and foreground emission, efforts have focused on a statistical detection of the 21-cm fluctuations. During cosmic reionization, these fluctuations are highly non-Gaussian and thus more information can be extracted than just the one-dimensional function that is usually considered, i.e. the correlation function. We calculate a two-dimensional function that if measured observationally would allow a more thorough investigation of the properties of the underlying ionizing sources. This function is the probability distribution function (PDF) of the difference in the 21-cm brightness temperature between two points, as a function of the separation between the points. While the standard correlation function is determined by a complicated mixture of contributions from density and ionization fluctuations, we show that the difference PDF holds the key to separately measuring the statistical properties of the ionized regions.
NASA Technical Reports Server (NTRS)
Perry, Boyd, III; Pototzky, Anthony S.; Woods, Jessica A.
1989-01-01
The results of a NASA investigation of a claimed 'Overlap' between two gust response analysis methods, the Statistical Discrete Gust (SDG) Method and the Power Spectral Density (PSD) Method, are presented. The claim is that the ratio of an SDG response to the corresponding PSD response is 10.4. Analytical results presented for several different airplanes at several different flight conditions indicate that such an 'Overlap' does appear to exist. However, the claim was not met precisely: a scatter of up to about 10 percent about the 10.4 factor can be expected.
Is a vegetarian diet adequate for children?
Hackett, A; Nathan, I; Burgess, L
1998-01-01
The number of people who avoid eating meat is growing, especially among young people. Benefits to health from a vegetarian diet have been reported in adults, but it is not clear to what extent these benefits are due to diet or to other aspects of lifestyle. In children, concern has been expressed about the adequacy of vegetarian diets, especially with regard to growth. The risks/benefits seem to be related to the degree of restriction of the diet; anaemia is probably both the main and the most serious risk, but this also applies to omnivores. Vegan diets are more likely to be associated with malnutrition, especially if the diets are the result of authoritarian dogma. Overall, lacto-ovo-vegetarian children consume diets closer to recommendations than omnivores, and their pre-pubertal growth is at least as good. The simplest strategy when becoming vegetarian may involve reliance on vegetarian convenience foods, which are not necessarily superior in nutritional composition. The vegetarian sector of the food industry could do more to produce foods closer to recommendations. Vegetarian diets can be, but are not necessarily, adequate for children, provided vigilance is maintained, particularly to ensure variety. Identical comments apply to omnivorous diets. Three threats to the diet of children are too much reliance on convenience foods, lack of variety and lack of exercise. PMID: 9670174
Statistical modelling and power analysis for detecting trends in total suspended sediment loads
NASA Astrophysics Data System (ADS)
Wang, You-Gan; Wang, Shen S. J.; Dunlop, Jason
2015-01-01
The export of sediments from coastal catchments can have detrimental impacts on estuaries and nearshore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task due to the complex behaviour of constituents in natural streams, the variability of water flows and often a limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising other time-specific covariates such as flow. This study investigates whether load estimates and resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised, sampling strategy, which adds uniformity to the event sampling strategy, the predictive power was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature.
The power of the optimal asymptotic tests of composite statistical hypotheses.
Singh, A C; Zhurbenko, I G
1975-02-01
The easily computable asymptotic power of the locally asymptotically optimal test of a composite hypothesis, known as the optimal C(α) test, is obtained through a "double" passage to the limit: the number n of observations is indefinitely increased while the conventional measure ξ of the error in the hypothesis tested tends to zero so that ξ(n)·n^(1/2) → τ ≠ 0. Contrary to this, practical problems require information on power, say β(ξ, n), for a fixed ξ and for a fixed n. The present paper gives the upper and the lower bounds for β(ξ, n). These bounds can be used to estimate the rate of convergence of β(ξ, n) to unity as n → ∞. The results obtained can be extended to test criteria other than those labeled C(α). The study revealed a difference between situations in which the C(α) test criterion is used to test a simple or a composite hypothesis. This difference affects the rate of convergence of the actual probability of Type I error to the preassigned level α. In the case of a simple hypothesis, the rate is of the order of n^(-1/2). In the case of a composite hypothesis, the best that it was possible to show is that the rate of convergence cannot be slower than that of the order of n^(-1/2) ln n. PMID: 16592222
Fast fMRI provides high statistical power in the analysis of epileptic networks.
Jacobs, Julia; Stich, Julia; Zahneisen, Benjamin; Assländer, Jakob; Ramantani, Georgia; Schulze-Bonhage, Andreas; Korinthenberg, Rudolph; Hennig, Jürgen; LeVan, Pierre
2014-03-01
EEG-fMRI is a unique method to combine the high temporal resolution of EEG with the high spatial resolution of MRI to study generators of intrinsic brain signals such as sleep grapho-elements or epileptic spikes. While the standard EPI sequence in fMRI experiments has a temporal resolution of around 2.5-3 s, a newly established fast fMRI sequence called MREG (Magnetic-Resonance-Encephalography) provides a temporal resolution of around 100 ms. This technical novelty promises to improve statistics, facilitate correction of physiological artifacts and improve the understanding of epileptic networks in fMRI. The present study compares simultaneous EEG-EPI and EEG-MREG analyzing epileptic spikes to determine the yield of fast MRI in the analysis of intrinsic brain signals. Patients with frequent interictal spikes (>3/20 min) underwent EEG-MREG and EEG-EPI (3 T, 20 min each, voxel size 3×3×3 mm, EPI TR = 2.61 s, MREG TR = 0.1 s). Timings of the spikes were used in an event-related analysis to generate activation maps of t-statistics (FMRISTAT, |t| > 3.5, cluster size: 7 voxels, p < 0.05 corrected). For both sequences, the amplitude and location of significant BOLD activations were compared with the spike topography. Thirteen patients were recorded and 33 different spike types could be analyzed. Peak t-values were significantly higher in MREG than in EPI (p < 0.0001). Positive BOLD effects correlating with the spike topography were found in 8/29 spike types using the EPI and in 22/33 spike types using the MREG sequence. Negative BOLD responses in the default mode network could be observed in 3/29 spike types with the EPI and in 19/33 with the MREG sequence. With the latter method, BOLD changes were observed even when few spikes occurred during the investigation. Simultaneous EEG-MREG thus is possible with good EEG quality and shows higher sensitivity in regard to the localization of spike-related BOLD responses than EEG-EPI. The development of new methods of analysis for this sequence such as
Statistical power of detecting trends in total suspended sediment loads to the Great Barrier Reef.
Darnell, Ross; Henderson, Brent; Kroon, Frederieke J; Kuhnert, Petra
2012-01-01
The export of pollutant loads from coastal catchments is of primary interest to natural resource management. For example, Reef Plan, a joint initiative by the Australian Government and the Queensland Government, has indicated that a 20% reduction in sediment is required by 2020. There is an obvious need to consider our ability to detect any trend if we are to set realistic targets or to reliably identify changes to catchment loads. We investigate the number of years of monitoring aquatic pollutant loads necessary to detect trends. Instead of modelling the trend in the annual loads directly, given their strong relationship to flow, we consider trends through the reduction in concentration for a given flow. Our simulations show very low power (<40%) of detecting changes of 20% over time periods of several decades, indicating that the chances of detecting trends of reasonable magnitudes over these time frames are very small. PMID:22551850
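The low power reported here can be reproduced in outline with a simple simulation: spread a 20% total reduction in flow-adjusted (log) concentration over 20 annual values, add realistic scatter, and test the OLS slope. The scatter level (sigma = 0.5 on the log scale) and the t critical value are assumptions for illustration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# 20 annual flow-adjusted log-concentration values with a linear trend
# amounting to a 20% total reduction, plus assumed year-to-year scatter.
years = np.arange(20.0)
true_slope = np.log(0.8) / 19.0
sigma = 0.5
crit = 2.10  # approximate two-sided 5% critical value of t with 18 df

def slope_is_significant(y, x, crit=crit):
    """OLS slope t-test against zero."""
    xc = x - x.mean()
    b = (xc * y).sum() / (xc ** 2).sum()
    resid = y - y.mean() - b * xc
    se = np.sqrt((resid ** 2).sum() / (len(x) - 2) / (xc ** 2).sum())
    return abs(b / se) > crit

reps = 2000
power = np.mean([
    slope_is_significant(true_slope * years + rng.normal(0.0, sigma, 20), years)
    for _ in range(reps)
])
```

With this scatter the simulated power comes out far below 40%, matching the qualitative conclusion that trends of reasonable magnitude are very hard to detect over these time frames.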
Weak lensing statistics as a probe of {OMEGA} and power spectrum.
NASA Astrophysics Data System (ADS)
Bernardeau, F.; van Waerbeke, L.; Mellier, Y.
1997-06-01
The possibility of detecting weak lensing effects from deep wide field imaging surveys has opened new means of probing the large-scale structure of the Universe and measuring cosmological parameters. In this paper we present a systematic study of the expected dependence of the low order moments of the filtered gravitational local convergence on the power spectrum of the density fluctuations and on the cosmological parameters Ω_0 and Λ. The results show a significant dependence on all these parameters. Though we note that this degeneracy could be partially raised by considering two populations of sources, at different redshifts, computing the third moment is more promising since it is expected, in the quasi-linear regime and for Gaussian initial conditions, to be only Ω_0 dependent (with a slight degeneracy with Λ) when it is correctly expressed in terms of the second moment. More precisely we show that the variance of the convergence varies approximately as P(k) Ω_0^1.5 z_s^1.5, whereas the skewness varies as Ω_0^-0.8 z_s^-1.35, where P(k) is the projected power spectrum and z_s the redshift of the sources. Thus, used jointly they can provide both P(k) and Ω_0. However, the dependence on the redshift of the sources is large and could be a major concern for a practical implementation. We have estimated the errors expected for these parameters in a realistic scenario and sketched what would be the observational requirements for doing such measurements. A more detailed study of an observational strategy is left for a second paper.
Statistical power of multilevel modelling in dental caries clinical trials: a simulation study.
Burnside, G; Pine, C M; Williamson, P R
2014-01-01
Outcome data from dental caries clinical trials have a naturally hierarchical structure, with surfaces clustered within teeth, clustered within individuals. Data are often aggregated into the DMF index for each individual, losing tooth- and surface-specific information. If these data are to be analysed by tooth or surface, allowing exploration of effects of interventions on different teeth and surfaces, appropriate methods must be used to adjust for the clustered nature of the data. Multilevel modelling allows analysis of clustered data using individual observations without aggregating data, and has been little used in the field of dental caries. A simulation study was conducted to investigate the performance of multilevel modelling methods and standard caries increment analysis. Data sets were simulated from a three-level binomial distribution based on analysis of a caries clinical trial in Scottish adolescents, with varying sample sizes, treatment effects and random tooth level effects based on trials reported in Cochrane reviews of topical fluoride, and analysed to compare the power of multilevel models and traditional analysis. 40,500 data sets were simulated. Analysis showed that estimated power for the traditional caries increment method was similar to that for multilevel modelling, with more variation in smaller data sets. Multilevel modelling may not allow significant reductions in the number of participants required in a caries clinical trial, compared to the use of traditional analyses, but investigators interested in exploring the effect of their intervention in more detail may wish to consider the application of multilevel modelling to their clinical trial data. PMID:24216573
NASA Astrophysics Data System (ADS)
Dralle, D.; Karst, N.; Thompson, S. E.
2015-12-01
Multiple competing theories suggest that power law behavior governs the observed first-order dynamics of streamflow recessions - the important process by which catchments dry out via the stream network, altering the availability of surface water resources and in-stream habitat. Frequently modeled as dq/dt = -aq^b, recessions typically exhibit a high degree of variability, even within a single catchment, as revealed by significant shifts in the values of "a" and "b" across recession events. One potential source of this variability lies in underlying, hard-to-observe fluctuations in how catchment water storage is partitioned amongst distinct storage elements, each having different discharge behaviors. Testing this and competing hypotheses with widely available streamflow timeseries, however, has been hindered by a power law scaling artifact that obscures meaningful covariation between the recession parameters, "a" and "b". Here we briefly outline a technique that removes this artifact, revealing intriguing new patterns in the joint distribution of recession parameters. Using long-term flow data from catchments in Northern California, we explore temporal variations, and find that the "a" parameter varies strongly with catchment wetness. Then we explore how the "b" parameter changes with "a", and find that measures of its variation are maximized at intermediate "a" values. We propose an interpretation of this pattern based on statistical mechanics, meaning "b" can be viewed as an indicator of the catchment "microstate" - i.e. the partitioning of storage - and "a" as a measure of the catchment macrostate (i.e. the total storage). In statistical mechanics, entropy (i.e. microstate variance, that is the variance of "b") is maximized for intermediate values of extensive variables (i.e. wetness, "a"), as observed in the recession data. This interpretation of "a" and "b" was supported by model runs using a multiple-reservoir catchment toy model, and lends support to the
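The recession model dq/dt = -aq^b is conventionally fitted per event by log-log regression of -dq/dt against q. The sketch below generates a synthetic recession with known parameters (arbitrary values) and recovers them; it illustrates the generic event-scale fit, not the authors' artifact-removal technique.

```python
import numpy as np

# Synthetic recession: forward-Euler integration of dq/dt = -a q^b,
# with illustrative parameter values and daily time steps.
a_true, b_true, dt = 0.05, 1.5, 1.0
q = [10.0]
for _ in range(60):
    q.append(q[-1] - a_true * q[-1] ** b_true * dt)
q = np.array(q)

# Recover (a, b): regress log(-dq/dt) on log(q), using midpoint discharge
# for each interval to reduce discretization bias.
dqdt = np.diff(q) / dt
qmid = 0.5 * (q[:-1] + q[1:])
b_fit, loga_fit = np.polyfit(np.log(qmid), np.log(-dqdt), 1)
a_fit = np.exp(loga_fit)
```

Applying this fit event by event across a long streamflow record yields the cloud of (a, b) pairs whose joint distribution the paper analyzes.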
Kinetic and Statistical Analysis of Primary Circuit Water Chemistry Data in a VVER Power Plant
Nagy, Gabor; Tilky, Peter; Horvath, Akos; Pinter, Tamas; Schiller, Robert
2001-12-15
The results of chemical and radiochemical analyses of the primary circuit coolant liquid, obtained between 1995 and 1999 at the four VVER-type blocks of the Paks (Hungary) nuclear power station, are assessed. A model has been developed regarding the pressure vessel with its auxiliary parts plus the fuel elements as the zone, with the six steam generators as one single unit. The stream from the steam generator is split, with its larger part returning to the zone through the main circulating pump and the smaller one passing through the purifier column. Based on this flowchart, the formation kinetics of corrosion products and of radioactive substances are evaluated. Correlation analysis is applied to reveal any eventual interdependence of the processes, whereas the range-per-scatter (R/S) method is used to characterize the random or deterministic nature of a process. The evaluation of the t → ∞ limits of the kinetic equations enables one to conclude that (a) the total amount of corrosion products per element during one cycle is almost always <15 kg and (b) the zone acts as a highly efficient filter with an efficiency of ≈1. The R/S results show that the fluctuations in the concentrations of the corrosion products are persistent; this finding indicates that random effects play here little if any role and that the processes in the coolant are under control. Correlation analyses show that the variations of the concentrations are practically uncorrelated and that the processes are independent of each other.
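The R/S calculation used here to classify persistence (commonly known as rescaled-range analysis) can be sketched in a textbook form: compute the range of cumulative deviations divided by the standard deviation over windows of increasing size, and read the Hurst-type exponent off the log-log slope. This is a generic implementation under assumed window sizes, not the plant's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(4)

def rs_statistic(x):
    """Rescaled range R/S of one window: range of cumulative deviations
    from the mean, divided by the sample standard deviation."""
    z = np.cumsum(x - x.mean())
    return (z.max() - z.min()) / x.std(ddof=1)

def hurst_rs(series, sizes=(16, 32, 64, 128, 256)):
    """Slope of log(mean R/S) versus log(window size)."""
    logn, logrs = [], []
    for n in sizes:
        segments = series[: len(series) // n * n].reshape(-1, n)
        logn.append(np.log(n))
        logrs.append(np.log(np.mean([rs_statistic(s) for s in segments])))
    return np.polyfit(logn, logrs, 1)[0]

white = rng.normal(size=4096)          # uncorrelated noise: exponent near 0.5
persistent = np.cumsum(white)          # strongly persistent series: higher exponent
h_white, h_persistent = hurst_rs(white), hurst_rs(persistent)
```

An exponent well above the white-noise value is the signature of persistence that the authors report for the corrosion-product concentrations.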
ERIC Educational Resources Information Center
Blair, R. Clifford; Higgins, James J.
1980-01-01
Monte Carlo techniques were used to compare the power of Wilcoxon's rank-sum test to the power of the two independent means t test for situations in which samples were drawn from (1) uniform, (2) Laplace, (3) half-normal, (4) exponential, (5) mixed-normal, and (6) mixed-uniform distributions. (Author/JKS)
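A comparison of this kind can be sketched for one of the six distributions, the Laplace, using large-sample normal approximations for both tests (the study's exact critical values and sample sizes are not given in the abstract; everything below is an illustrative assumption).

```python
import numpy as np

rng = np.random.default_rng(5)

n, shift, reps, crit = 20, 0.8, 2000, 1.96  # normal-approx two-sided 5% test

def t_reject(x, y):
    """Two-sample pooled t statistic, equal group sizes."""
    se = np.sqrt(((x.var(ddof=1) + y.var(ddof=1)) / 2) * (2 / n))
    return abs((x.mean() - y.mean()) / se) > crit

def wilcoxon_reject(x, y):
    """Wilcoxon rank-sum via its normal approximation."""
    ranks = np.argsort(np.argsort(np.concatenate([x, y]))) + 1
    w = ranks[:n].sum()
    mu = n * (2 * n + 1) / 2
    sd = np.sqrt(n * n * (2 * n + 1) / 12)
    return abs((w - mu) / sd) > crit

t_power = wilc_power = 0
for _ in range(reps):
    x = rng.laplace(0.0, 1.0, n)
    y = rng.laplace(shift, 1.0, n)
    t_power += t_reject(x, y)
    wilc_power += wilcoxon_reject(x, y)
t_power /= reps
wilc_power /= reps
```

Under heavy-tailed distributions such as the Laplace, the rank-sum test is expected to come out ahead, which is the kind of contrast the Monte Carlo study quantified across all six parent distributions.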
NASA Astrophysics Data System (ADS)
Mittendorfer, J.; Zwanziger, P.
2000-03-01
High-power bipolar semiconductor devices (thyristors and diodes) in a disc-type shape are key components (semiconductor switches) for high-power electronic systems. These systems are important for the economic design of energy transmission systems, i.e. high-power drive systems, static compensation and high-voltage DC transmission lines. At its factory in Pretzfeld, Germany, the company eupec GmbH+Co.KG (eupec) produces disc-type devices with ceramic encapsulation in the high-end range for the world market. These elements have to fulfil special customer requirements and therefore deliver tailor-made trade-offs between their on-state voltage and dynamic switching behaviour. This task can be achieved by applying a dedicated electron irradiation to the semiconductor pellets, which tunes this trade-off. In this paper, the requirements placed on the irradiation company Mediscan GmbH, from the point of view of the semiconductor manufacturer, are described. The actual strategy for controlling the irradiation results to fulfil these requirements is presented, together with the choice of relevant parameters from the viewpoint of the irradiation company. The set of process parameters monitored, using statistical process control (SPC) techniques, includes beam current and energy, conveyor speed and irradiation geometry. The results are highlighted and show the successful co-operation in this business. Viewing the process from the opposite direction, an idea is presented and discussed for developing a highly sensitive dose detection device based on modified diodes, which could function as accurate yet cheap and easy-to-use routine dosimeters for irradiation institutes.
21 CFR 201.5 - Drugs; adequate directions for use.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Drugs; adequate directions for use. 201.5 Section...) DRUGS: GENERAL LABELING General Labeling Provisions § 201.5 Drugs; adequate directions for use. Adequate directions for use means directions under which the layman can use a drug safely and for the purposes...
21 CFR 201.5 - Drugs; adequate directions for use.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Drugs; adequate directions for use. 201.5 Section...) DRUGS: GENERAL LABELING General Labeling Provisions § 201.5 Drugs; adequate directions for use. Adequate directions for use means directions under which the layman can use a drug safely and for the purposes...
4 CFR 200.14 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 4 Accounts 1 2010-01-01 2010-01-01 false Responsibility for maintaining adequate safeguards. 200.14 Section 200.14 Accounts RECOVERY ACCOUNTABILITY AND TRANSPARENCY BOARD PRIVACY ACT OF 1974 § 200.14 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate technical, physical, and...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate technical, physical, and security...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 4 2012-01-01 2012-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate technical, physical, and security...
4 CFR 200.14 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 4 Accounts 1 2011-01-01 2011-01-01 false Responsibility for maintaining adequate safeguards. 200....14 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate technical, physical, and security safeguards to prevent unauthorized disclosure...
Comnes, G.A.; Belden, T.N.; Kahn, E.P.
1995-02-01
The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been or are being developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.
NASA Astrophysics Data System (ADS)
Kolchev, K. K.; Mezin, S. V.
2015-07-01
A technique for constructing mathematical models simulating the technological processes in thermal power equipment developed on the basis of the statistical approximation method is described. The considered method was used in the developed software module (plug-in) intended for calculating nonlinear mathematical models of gas turbine units and for diagnosing them. The mathematical models constructed using this module are used for describing the current state of a system. Deviations of the system's actual state from the estimate obtained using the mathematical model point to malfunctions in operation of this system. The multidimensional interpolation and approximation method and the theory of random functions serve as a theoretical basis of the developed technique. By using the developed technique it is possible to construct complex static models of plants that are subject to control and diagnostics. The module developed using the proposed technique makes it possible to carry out periodic diagnostics of the operating equipment for revealing deviations from the normal mode of its operation. The specific features relating to construction of mathematical models are considered, and examples of applying them with the use of observations obtained on the equipment of gas turbine units are given.
ERIC Educational Resources Information Center
Tabor, Josh
2010-01-01
On the 2009 AP Statistics Exam, students were asked to create a statistic to measure skewness in a distribution. This paper explores several of the most popular student responses and evaluates which statistic performs best when sampling from various skewed populations. (Contains 8 figures, 3 tables, and 4 footnotes.)
NASA Technical Reports Server (NTRS)
Zimmerman, G. A.; Olsen, E. T.
1992-01-01
Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
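The threshold-and-count idea can be illustrated with synthetic noise power samples: each parallel "device" simply counts exceedances of a fixed threshold in a single pass, and the threshold whose exceedance fraction best matches the target false-alarm rate approximates the order-statistic (quantile) estimate. The exponential noise model, threshold grid, and false-alarm rate below are illustrative assumptions, not HRMS system parameters.

```python
import numpy as np

rng = np.random.default_rng(6)

# Noise power samples, modeled here as exponential (single spectrometer bin).
samples = rng.exponential(scale=2.0, size=100_000)

# A bank of trial thresholds spanning a limited dynamic range; each device
# just counts how many samples exceed its threshold (single pass).
thresholds = np.linspace(0.5, 20.0, 64)
counts = (samples[:, None] > thresholds[None, :]).sum(axis=0)

# Pick the threshold whose exceedance fraction is closest to the desired
# false-alarm probability; this approximates the 1 - pfa quantile.
pfa = 0.01
best = thresholds[np.argmin(np.abs(counts / samples.size - pfa))]
exact = np.quantile(samples, 1 - pfa)  # full order-statistic reference
```

Restricting the threshold grid to a limited dynamic range is what makes the hardware cheap: accuracy inside that range matches the order-statistic estimator to within the grid spacing.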
NASA Astrophysics Data System (ADS)
Hoover, F. A.; Bowling, L. C.; Prokopy, L. S.
2015-12-01
Urban stormwater is an ongoing management concern in municipalities of all sizes. In both combined and separate sewer systems, pollutants from stormwater runoff enter the natural waterway system during heavy rain events, and urban flooding during more frequent and intense storms is a growing concern. Stormwater best management practices (BMPs) are therefore being implemented to reduce and manage stormwater pollution and overflow. Most BMP water quality studies focus on the small-scale, individual effects of a BMP and on the change in water quality directly in the runoff from these structures. At the watershed scale, it is difficult to establish statistically whether these BMPs make a difference in water quality, because watershed-scale monitoring is costly and time consuming, relying on funding a city may not have. There is therefore a need to quantify the level of sampling needed to detect the water quality impact of BMPs at the watershed scale. In this study, a power analysis was performed on data from an urban watershed in Lafayette, Indiana, to determine the sampling frequency required to detect a significant change in water quality measurements. Using the R platform, results indicate that detecting a significant change in watershed-level water quality would require hundreds of weekly measurements, even when improvement is present. The second part of this study investigates whether the difficulty in demonstrating water quality change represents a barrier to adoption of stormwater BMPs. Semi-structured interviews of community residents and organizations in Chicago, IL, are being used to investigate residents' understanding of water quality and best management practices and to identify their attitudes and perceptions toward stormwater BMPs. Second-round interviews will examine how information on uncertainty in water quality improvements influences those attitudes and perceptions.
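The power-analysis logic can be sketched by simulation (the study used R; this stdlib-Python sketch uses an invented effect size, noise level, and Gaussian model, not the Lafayette data):

```python
# Simulation-based power: what fraction of simulated before/after
# comparisons detect a modest water-quality improvement? Effect, noise,
# and the Gaussian model are illustrative assumptions.
import math
import random
import statistics

def detection_power(n_weeks, effect=0.1, sd=0.5, reps=1000, crit=1.96):
    """Fraction of simulated monitoring campaigns that detect the change."""
    random.seed(1)
    hits = 0
    for _ in range(reps):
        before = [random.gauss(1.0, sd) for _ in range(n_weeks)]
        after = [random.gauss(1.0 - effect, sd) for _ in range(n_weeks)]
        se = math.sqrt(statistics.variance(before) / n_weeks
                       + statistics.variance(after) / n_weeks)
        hits += (statistics.mean(before) - statistics.mean(after)) / se > crit
    return hits / reps

# A small improvement against high week-to-week variability:
print(detection_power(26))    # half a year of weekly samples: underpowered
print(detection_power(520))   # hundreds of weekly samples: adequate power
```

With these assumed values, 26 weekly samples per period detect the change only about 10% of the time, while roughly ten years of weekly data reach conventional 80% power, echoing the "hundreds of weekly measurements" finding.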
NASA Astrophysics Data System (ADS)
Ruecker, Gernot; Leimbach, David; Guenther, Felix; Barradas, Carol; Hoffmann, Anja
2016-04-01
Fire Radiative Power (FRP) retrieved by infrared sensors, such as those flown on several polar-orbiting and geostationary satellites, has been shown to be proportional to fuel consumption rates in vegetation fires; hence the total radiative energy released by a fire (Fire Radiative Energy, FRE) is proportional to the total amount of biomass burned. However, because of the sparse temporal coverage of polar-orbiting sensors and the coarse spatial resolution of geostationary ones, it is difficult to estimate fuel consumption for single fire events. Here we explore an approach for estimating FRE through temporal integration of MODIS FRP retrievals over MODIS-derived burned areas. Temporal integration is aided by statistical modelling to estimate missing observations, using a generalized additive model (GAM) that takes advantage of additional information such as land cover, a global dataset of the Canadian Fire Weather Index (FWI), and diurnal and annual FRP fluctuation patterns. Based on results from study areas in savannah regions of Southern and Eastern Africa and Brazil, we compare this method to estimates based on simple temporal integration of FRP retrievals over the fire lifetime, and we estimate the potential variability of FRP integration results across a range of fire sizes. We compare FRE-based fuel consumption against a database of field experiments in similar landscapes. Results show that for larger fires this method yields realistic estimates and is more robust than simple temporal integration when only a small number of observations is available. Finally, we offer an outlook on the integration of data from other satellites, specifically FireBird, S-NPP VIIRS, and Sentinel-3, as well as on using higher-resolution burned area datasets derived from Landsat and similar sensors.
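Simple temporal integration, the baseline the statistical model is compared against, amounts to integrating FRP over the fire's lifetime. A minimal sketch with invented overpass values; the ~0.368 kg/MJ biomass combustion factor is a commonly cited figure and is assumed here, not taken from this study:

```python
# Trapezoidal integration of sparse FRP observations (MW) over time (s)
# gives FRE (MJ); fuel consumed then follows from an assumed combustion
# factor (~0.368 kg dry biomass per MJ). Overpass values are invented.

def fre_mj(times_s, frp_mw):
    """Trapezoidal integral of FRP over time (MW * s = MJ)."""
    total = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        total += 0.5 * (frp_mw[i] + frp_mw[i - 1]) * dt
    return total

# Four satellite overpasses of a fire, 6 h apart (illustrative values).
times = [0, 21600, 43200, 64800]       # seconds
frp   = [50.0, 120.0, 80.0, 10.0]      # MW
fre = fre_mj(times, frp)
fuel_kg = 0.368 * fre                  # assumed combustion factor

print(round(fre))   # MJ released over the observed fire lifetime
```

With only a handful of overpasses, this simple integral is sensitive to missed peaks, which is why the GAM-based gap filling above helps for sparsely observed fires.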
7 CFR 4290.200 - Adequate capital for RBICs.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 15 2011-01-01 2011-01-01 false Adequate capital for RBICs. 4290.200 Section 4290.200 Agriculture Regulations of the Department of Agriculture (Continued) RURAL BUSINESS-COOPERATIVE SERVICE AND... Qualifications for the RBIC Program Capitalizing A Rbic § 4290.200 Adequate capital for RBICs. You must meet...
13 CFR 107.200 - Adequate capital for Licensees.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Adequate capital for Licensees... INVESTMENT COMPANIES Qualifying for an SBIC License Capitalizing An Sbic § 107.200 Adequate capital for... Licensee, and to receive Leverage. (a) You must have enough Regulatory Capital to provide...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 4 2011-01-01 2011-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Adequate file search. 716.25 Section 716.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 2 2011-07-01 2011-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
10 CFR 503.35 - Inability to obtain adequate capital.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Inability to obtain adequate capital. 503.35 Section 503.35 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS NEW FACILITIES Permanent Exemptions for New Facilities § 503.35 Inability to obtain adequate capital. (a) Eligibility. Section 212(a)(1)(D)...
15 CFR 970.404 - Adequate exploration plan.
Code of Federal Regulations, 2011 CFR
2011-01-01
... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must find... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Adequate exploration plan....
"Something Adequate"? In Memoriam Seamus Heaney, Sister Quinlan, Nirbhaya
ERIC Educational Resources Information Center
Parker, Jan
2014-01-01
Seamus Heaney talked of poetry's responsibility to represent the "bloody miracle", the "terrible beauty" of atrocity; to create "something adequate". This article asks, what is adequate to the burning and eating of a nun and the murderous gang rape and evisceration of a medical student? It considers Njabulo…
Predict! Teaching Statistics Using Informal Statistical Inference
ERIC Educational Resources Information Center
Makar, Katie
2013-01-01
Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…
NASA Astrophysics Data System (ADS)
Olson, Kyle David
A model is presented and confirmed experimentally that explains the anomalous behavior observed in the continuous wave (CW) excitation of thermally isolated optics. Very low absorption, highly reflective optical thin-film coatings of HfO2 and SiO2 were prepared. When illuminated with a laser for 30 s, the coatings survived peak irradiances of 13 MW/cm². The temperature profile of the optical surfaces was measured using a calibrated thermal imaging camera; about the same peak temperatures were recorded regardless of spot size, which ranged between 500 µm and 5 mm. This phenomenon is explained by solving the heat diffusion equation for an optic of finite dimensions, including the non-idealities of the measurement. An analytical result is also derived showing the transition from millisecond pulses to CW, where the heating is proportional to the laser irradiance (W/m²) for millisecond pulses and proportional to the beam radius (W/m) for CW. Contamination-induced laser breakdown is often viewed as random, and simple physical models are difficult to apply. Under continuous wave illumination conditions, failure appears to be induced by a runaway free-carrier absorption process. High-power laser illumination is absorbed by the contaminant particles or regions, which heat rapidly. Some of this heat transfers to the substrate, raising its temperature towards that of the vaporizing particle. This generates free carriers, causing more absorption and more heating. If a certain threshold concentration is created, the process becomes unstable, thermally heating the material to catastrophic breakdown. Contamination-induced breakdown is exponentially bandgap dependent, and this prediction is borne out in experimental data from TiO2, Ta2O5, HfO2, Al2O3, and SiO2. The spectral dependence of blackbody radiation and thermal photon noise is derived analytically for the first time as a function of spectra and mode density. An algorithm by which the analytical expression for the variance can
ERIC Educational Resources Information Center
Safarkhani, Maryam; Moerbeek, Mirjam
2013-01-01
In a randomized controlled trial, a decision needs to be made about the total number of subjects for adequate statistical power. One way to increase the power of a trial is by including a predictive covariate in the model. In this article, the effects of various covariate adjustment strategies on increasing the power is studied for discrete-time…
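The variance-reduction argument behind covariate adjustment can be sketched numerically: adjusting for a covariate whose correlation with the outcome is rho scales the residual variance, and hence the required sample size, by roughly (1 − rho²). The normal-approximation formula and all numbers below are illustrative, not taken from the article:

```python
# Two-arm sample-size sketch under a normal approximation: n per arm for
# a standardized effect d at two-sided alpha = 0.05 (z = 1.96) and 80%
# power (z = 0.84). d and rho are invented for illustration.
import math

def n_per_arm(effect_sd, crit_z=1.96, power_z=0.84):
    """n per arm needed to detect a standardized effect of size effect_sd."""
    return 2 * ((crit_z + power_z) / effect_sd) ** 2

d = 0.3                        # standardized treatment effect (assumed)
rho = 0.6                      # covariate-outcome correlation (assumed)
n_raw = n_per_arm(d)
n_adj = n_raw * (1 - rho ** 2)   # residual variance shrinks by 1 - rho^2

print(math.ceil(n_raw))   # 175 per arm unadjusted
print(math.ceil(n_adj))   # 112 per arm after covariate adjustment
```

The stronger the covariate (larger rho), the larger the saving, which is the effect the article studies for discrete-time models.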
NASA Astrophysics Data System (ADS)
Shergill, H.; Robinson, T. R.; Dhillon, R. S.; Lester, M.; Milan, S. E.; Yeoman, T. K.
2010-05-01
High-power electromagnetic waves can excite a variety of plasma instabilities in Earth's ionosphere. These lead to the growth of plasma waves and plasma density irregularities within the heated volume, including patches of small-scale field-aligned electron density irregularities. This paper reports a statistical study of intensity distributions in patches of these irregularities excited by the European Incoherent Scatter (EISCAT) heater during beam-sweeping experiments. The irregularities were detected by the Co-operative UK Twin Located Auroral Sounding System (CUTLASS) coherent scatter radar located in Finland. During these experiments the heater beam direction is steadily changed from northward to southward pointing. Comparisons are made between statistical parameters of CUTLASS backscatter power distributions and modeled heater beam power distributions provided by the EZNEC version 4 software. In general, good agreement between the statistical parameters and the modeled beam is observed, clearly indicating the direct causal connection between the heater beam and the irregularities, despite the sometimes seemingly unpredictable nature of unaveraged results. The results also give compelling evidence in support of the upper hybrid theory of irregularity excitation.
Dziak, John J.; Lanza, Stephanie T.; Tan, Xianming
2014-01-01
Selecting the number of different classes which will be assumed to exist in the population is an important step in latent class analysis (LCA). The bootstrap likelihood ratio test (BLRT) provides a data-driven way to evaluate the relative adequacy of a (K −1)-class model compared to a K-class model. However, very little is known about how to predict the power or the required sample size for the BLRT in LCA. Based on extensive Monte Carlo simulations, we provide practical effect size measures and power curves which can be used to predict power for the BLRT in LCA given a proposed sample size and a set of hypothesized population parameters. Estimated power curves and tables provide guidance for researchers wishing to size a study to have sufficient power to detect hypothesized underlying latent classes. PMID:25328371
Arabidopsis: An Adequate Model for Dicot Root Systems?
Zobel, Richard W
2016-01-01
The Arabidopsis root system is frequently considered to have only three classes of root: primary, lateral, and adventitious. Research with other plant species has suggested up to eight different developmental/functional classes of root for a given plant root system. If Arabidopsis has only three classes of root, it may not be an adequate model for eudicot plant root systems. Recent research, however, can be interpreted to suggest that pre-flowering Arabidopsis has at least five of these classes of root. This suggests that Arabidopsis root research can be considered an adequate model for dicot plant root systems. PMID:26904040
Lotterhos, Katie E; Whitlock, Michael C
2015-03-01
Although genome scans have become a popular approach towards understanding the genetic basis of local adaptation, the field still does not have a firm grasp on how sampling design and demographic history affect the performance of genome scans on complex landscapes. To explore these issues, we compared 20 different sampling designs in equilibrium (i.e. island model and isolation by distance) and nonequilibrium (i.e. range expansion from one or two refugia) demographic histories in spatially heterogeneous environments. We simulated spatially complex landscapes, which allowed us to exploit local maxima and minima in the environment in 'pair' and 'transect' sampling strategies. We compared F(ST) outlier and genetic-environment association (GEA) methods for each of two approaches that control for population structure: with a covariance matrix or with latent factors. We show that while the relative power of two methods in the same category (F(ST) or GEA) depended largely on the number of individuals sampled, overall GEA tests had higher power in the island model and F(ST) had higher power under isolation by distance. In the refugia models, however, these methods varied in their power to detect local adaptation at weakly selected loci. At weakly selected loci, paired sampling designs had equal or higher power than transect or random designs to detect local adaptation. Our results can inform sampling designs for studies of local adaptation and have important implications for the interpretation of genome scans based on landscape data. PMID:25648189
Is the Marketing Concept Adequate for Continuing Education?
ERIC Educational Resources Information Center
Rittenburg, Terri L.
1984-01-01
Because educators have a social responsibility to those they teach, the marketing concept may not be adequate as a philosophy for continuing education. In attempting to broaden the audience for continuing education, educators should consider a societal marketing concept to meet the needs of the educationally disadvantaged. (SK)
Comparability and Reliability Considerations of Adequate Yearly Progress
ERIC Educational Resources Information Center
Maier, Kimberly S.; Maiti, Tapabrata; Dass, Sarat C.; Lim, Chae Young
2012-01-01
The purpose of this study is to develop an estimate of Adequate Yearly Progress (AYP) that will allow for reliable and valid comparisons among student subgroups, schools, and districts. A shrinkage-type estimator of AYP using the Bayesian framework is described. Using simulated data, the performance of the Bayes estimator will be compared to…
9 CFR 305.3 - Sanitation and adequate facilities.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY ORGANIZATION AND TERMINOLOGY; MANDATORY MEAT AND POULTRY PRODUCTS INSPECTION AND VOLUNTARY INSPECTION AND CERTIFICATION...
Understanding Your Adequate Yearly Progress (AYP), 2011-2012
ERIC Educational Resources Information Center
Missouri Department of Elementary and Secondary Education, 2011
2011-01-01
The "No Child Left Behind Act (NCLB) of 2001" requires all schools, districts/local education agencies (LEAs) and states to show that students are making Adequate Yearly Progress (AYP). NCLB requires states to establish targets in the following ways: (1) Annual Proficiency Target; (2) Attendance/Graduation Rates; and (3) Participation Rates.…
Assessing Juvenile Sex Offenders to Determine Adequate Levels of Supervision.
ERIC Educational Resources Information Center
Gerdes, Karen E.; And Others
1995-01-01
This study analyzed the internal consistency of four inventories used by Utah probation officers to determine adequate and efficacious supervision levels and placement for juvenile sex offenders. Three factors accounted for 41.2 percent of variance (custodian's and juvenile's attitude toward intervention, offense characteristics, and historical…
34 CFR 200.13 - Adequate yearly progress in general.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 34 Education 1 2011-07-01 2011-07-01 false Adequate yearly progress in general. 200.13 Section 200.13 Education Regulations of the Offices of the Department of Education OFFICE OF ELEMENTARY AND SECONDARY EDUCATION, DEPARTMENT OF EDUCATION TITLE I-IMPROVING THE ACADEMIC ACHIEVEMENT OF THE...
34 CFR 200.20 - Making adequate yearly progress.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 34 Education 1 2011-07-01 2011-07-01 false Making adequate yearly progress. 200.20 Section 200.20 Education Regulations of the Offices of the Department of Education OFFICE OF ELEMENTARY AND SECONDARY EDUCATION, DEPARTMENT OF EDUCATION TITLE I-IMPROVING THE ACADEMIC ACHIEVEMENT OF THE DISADVANTAGED...
Do Beginning Teachers Receive Adequate Support from Their Headteachers?
ERIC Educational Resources Information Center
Menon, Maria Eliophotou
2012-01-01
The article examines the problems faced by beginning teachers in Cyprus and the extent to which headteachers are considered to provide adequate guidance and support to them. Data were collected through interviews with 25 school teachers in Cyprus, who had recently entered teaching (within 1-5 years) in public primary schools. According to the…
This study assessed the pollutant emission offset potential of distributed grid-connected photovoltaic (PV) power systems. Computer-simulated performance results were utilized for 211 PV systems located across the U.S. The PV systems' monthly electrical energy outputs were based ...
34 CFR 200.20 - Making adequate yearly progress.
Code of Federal Regulations, 2010 CFR
2010-07-01
... number of students in the group must be sufficient to yield statistically reliable information under... the reading/language arts and mathematics assessments in the three grade spans required under § 200.5... disabilities, respectively, is sufficient to yield statistically reliable information under § 200.7(a); or...
Maintaining adequate hydration and nutrition in adult enteral tube feeding.
Dunn, Sasha
2015-01-01
Predicting the nutritional and fluid requirements of enterally-fed patients can be challenging and the practicalities of ensuring adequate delivery must be taken into consideration. Patients who are enterally fed can be more reliant on clinicians, family members and carers to meet their nutrition and hydration needs and identify any deficiencies, excesses or problems with delivery. Estimating a patient's requirements can be challenging due to the limitations of using predictive equations in the clinical setting. Close monitoring by all those involved in the patient's care, as well as regular review by a dietitian, is therefore required to balance the delivery of adequate feed and fluids to meet each patient's individual needs and prevent the complications of malnutrition and dehydration. Increasing the awareness of the signs of malnutrition and dehydration in patients receiving enteral tube feeding among those involved in a patient's care will help any deficiencies to be detected early on and rectified before complications occur. PMID:26087203
Assessing juvenile sex offenders to determine adequate levels of supervision.
Gerdes, K E; Gourley, M M; Cash, M C
1995-08-01
The present study analyzed the internal consistency of four inventories currently used by probation officers in the state of Utah to determine adequate and efficacious supervision levels and placement for juvenile sex offenders. The internal consistency (reliability) of the inventories ranged from moderate to good. Factor analysis was used to significantly increase the reliability of the four inventories by collapsing them into three factors: (a) Custodian's and Juvenile's Attitude Toward Intervention; (b) Offense Characteristics; and (c) Historical Risk Factors. These three factors explained 41.2% of the variance in the combined inventories' scores. Suggestions are made regarding the creation of an additional inventory, "Characteristics of the Victim," to account for more of the variance. In addition, suggestions are offered as to how probation officers can use these inventories to make objective and consistent decisions about adequate supervision levels and placement for juvenile sex offenders. PMID:7583754
Wharton, S.; Bulaevskaya, V.; Irons, Z.; Qualley, G.; Newman, J. F.; Miller, W. O.
2015-09-28
The goal of our FY15 project was to explore the use of statistical models and high-resolution atmospheric input data to develop more accurate prediction models for turbine power generation. We modeled power for two operational wind farms in two regions of the country. The first site is a 235 MW wind farm in Northern Oklahoma with 140 GE 1.68 turbines. Our second site is a 38 MW wind farm in the Altamont Pass Region of Northern California with 38 Mitsubishi 1 MW turbines. The farms are very different in topography, climatology, and turbine technology; however, both occupy high wind resource areas in the U.S. and are representative of typical wind farms found in their respective areas.
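A minimal example of the kind of statistical power model involved: below rated wind speed, turbine output grows roughly with the cube of wind speed, so a one-coefficient least-squares fit plus rating and cut-in/cut-out clamps gives a crude power curve. The turbine parameters and sample data here are invented, not values for the GE or Mitsubishi machines above:

```python
# Fit P = c * v**3 by least squares through the origin on below-rated
# data, then predict with cut-in, cut-out, and rated-power clamps.
# All ratings and observations are illustrative assumptions.

def fit_cubic_coeff(speeds, powers):
    """Least squares through the origin for P = c * v**3."""
    num = sum(p * v ** 3 for v, p in zip(speeds, powers))
    den = sum(v ** 6 for v in speeds)
    return num / den

def predict(c, v, rated=1680.0, cut_in=3.0, cut_out=25.0):
    """Power (kW) at wind speed v (m/s) with simple operating clamps."""
    if v < cut_in or v > cut_out:
        return 0.0
    return min(c * v ** 3, rated)

speeds = [4, 5, 6, 7, 8]                 # m/s, below rated
powers = [96, 187, 324, 514, 768]        # kW (illustrative observations)
c = fit_cubic_coeff(speeds, powers)

print(predict(c, 12.0) == 1680.0)   # True: capped at rated power
print(predict(c, 2.0) == 0.0)       # True: below cut-in
```

Real site models add atmospheric predictors (shear, stability, turbulence) on top of such a baseline, which is where high-resolution atmospheric input data enter.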
Influence of the fluid density on the statistics of power fluctuations in von Kármán swirling flows
NASA Astrophysics Data System (ADS)
Opazo, A.; Sáez, A.; Bustamante, G.; Labbé, R.
2016-02-01
Here, we report experimental results on the fluctuations of injected power in confined turbulence. Specifically, we have studied a von Kármán swirling flow with constant external torque applied to the stirrers. Two experiments were performed at nearly equal Reynolds numbers in geometrically similar experimental setups, one using air and the other water. With air, the probability density function (PDF) of power fluctuations is strongly asymmetric, while with water it is nearly Gaussian. This suggests that the outcome of a large change of fluid density in the flow-stirrer interaction is not simply a change in the amplitude of the stirrers' response. In the case of water, with a density roughly 830 times that of air, the coupling between the flow and the stirrers is stronger, so the stirrers follow more closely the fluctuations of the average rotation of the nearby flow. When the fluid is air, the coupling is much weaker. The result is not just a smaller response of the stirrers to the torque exerted by the flow; the PDF of the injected power becomes strongly asymmetric and its spectrum acquires a broad region that scales as f^-2. Thus, the asymmetry of the PDFs of torque or angular speed could be related to the inability of the stirrers to respond to flow stresses. This happens, for instance, when the torque exerted by the flow is weak owing to small fluid density, or when the stirrers' moment of inertia is large. Moreover, a correlation analysis reveals that the features of the energy-transfer dynamics with water are qualitatively and quantitatively different from those observed with air as the working fluid.
Zhang, Han; Wheeler, William; Hyland, Paula L.; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai
2016-01-01
Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in neighborhoods of 170 GWAS-established T2D loci were excluded, we detected 43 globally significant T2D pathways (with Bonferroni-corrected p-values < 0.05), which included the insulin signaling pathway and the T2D pathway defined by KEGG, as well as pathways defined according to specific gene expression patterns in pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed that 7 of the 43 pathways identified in European populations remained significant in eastern Asians at a false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs. PMID:27362418
Nagelkerke, Leopold A J; van Densen, Wim L T
2007-02-01
We studied the effects of inter-annual variability and serial correlation on the statistical power of monitoring schemes to detect trends in biomass of bream (Abramis brama) in Lake Veluwemeer (The Netherlands). In order to distinguish between 'true' system variability and sampling variability, we simulated the development of the bream population, using estimates for population structure and growth, and compared the resulting inter-annual variabilities and serial correlations with those from field data. In all cases the inter-annual variability in the field data was larger than in simulated data (e.g. for total biomass of all assessed bream sigma = 0.45 in field data, and sigma = 0.03-0.14 in simulated data), indicating that sampling variability decreased the statistical power for detecting trends. Moreover, sampling variability obscured the inter-annual dependency (and thus the serial correlation) of biomass, which was expected because in this long-lived population biomass changes are buffered by the many year classes present. We did find the expected serial correlation in our simulation results and concluded that good survey data of long-lived fish populations should show low sampling variability and considerable inter-annual serial correlation. Since serial correlation decreases the power for detecting trends, this means that even if sampling variability were greatly reduced, the number of sampling years needed to detect a change of 15% year(-1) in bream populations (corresponding to a halving or doubling over a six-year period) would in most cases be more than six. This would imply that the six-year reporting periods that are required by the Water Framework Directive of the European Union are too short for the existing fish monitoring schemes. PMID:17219244
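The interplay between inter-annual variability, serial correlation, and the number of survey years can be illustrated with a small Monte Carlo sketch. All parameter choices and the simple AR(1)-plus-OLS model here are illustrative assumptions, not the paper's population-structured simulation:

```python
import numpy as np

def trend_power(n_years, trend=0.15, sigma=0.45, rho=0.0,
                n_sim=2000, seed=1):
    """Monte Carlo power of an OLS slope test for a log-linear trend in
    an annual biomass index with lognormal noise (sd sigma on the log
    scale) and AR(1) serial correlation rho. A z-test with a 1.96
    cutoff is used as a rough approximation; parameter values are
    illustrative only."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_years, dtype=float)
    tc = t - t.mean()
    sxx = (tc ** 2).sum()
    hits = 0
    for _ in range(n_sim):
        e = rng.normal(size=n_years)
        for i in range(1, n_years):                    # impose AR(1)
            e[i] = rho * e[i - 1] + np.sqrt(1.0 - rho ** 2) * e[i]
        y = trend * t + sigma * e                      # log-scale index
        b = (tc * y).sum() / sxx                       # OLS slope
        resid = y - y.mean() - b * tc
        se = np.sqrt((resid ** 2).sum() / (n_years - 2) / sxx)
        hits += abs(b / se) > 1.96
    return hits / n_sim

# Six survey years give poor power at sigma = 0.45; twelve do far better
print(trend_power(6), trend_power(12))
```

Even this toy version reproduces the abstract's qualitative point: with field-level variability of sigma = 0.45, a six-year window is usually too short to detect a 15% per year change.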
Higgins, P; Murray, M L; Williams, E M
1994-03-01
This descriptive, retrospective study examined levels of self-esteem, social support, and satisfaction with prenatal care in 193 low-risk postpartal women who obtained adequate and inadequate care. The participants were drawn from a regional medical center and university teaching hospital in New Mexico. A demographic questionnaire, the Coopersmith self-esteem inventory, the personal resource questionnaire part 2, and the prenatal care satisfaction inventory were used for data collection. Significant differences were found in the level of education, income, insurance, and ethnicity between women who received adequate prenatal care and those who received inadequate care. Women in both the adequate- and inadequate-care groups were typically high school graduates with total family incomes of $10,000 to $19,999 per year. Statistically significant differences were found in self-esteem, social support, and satisfaction between the two groups of women. Strategies to enhance self-esteem and social support have to be developed to reach women at risk for receiving inadequate prenatal care. PMID:8155221
Quantifying dose to the reconstructed breast: Can we adequately treat?
Chung, Eugene; Marsh, Robin B.; Griffith, Kent A.; Moran, Jean M.; Pierce, Lori J.
2013-04-01
To evaluate how immediate reconstruction (IR) impacts postmastectomy radiotherapy (PMRT) dose distributions to the reconstructed breast (RB), internal mammary nodes (IMN), heart, and lungs using quantifiable dosimetric end points. 3D conformal plans were developed for 20 IR patients: 10 with autologous reconstruction (AR) and 10 with expander-implant (EI) reconstruction. For each reconstruction type, 5 right- and 5 left-sided reconstructions were selected. Two plans were created for each patient, 1 with RB coverage alone and 1 with RB + IMN coverage. Left-sided EI plans without IMN coverage had a higher heart Dmean than left-sided AR plans (2.97 and 0.84 Gy, p = 0.03). Otherwise, results did not vary by reconstruction type, and all remaining metrics were evaluated using a combined AR and EI dataset. RB coverage was adequate regardless of laterality or IMN coverage (Dmean 50.61 Gy, D95 45.76 Gy). When included, IMN Dmean and D95 were 49.57 and 40.96 Gy, respectively. Mean heart doses increased with left-sided treatment plans and IMN inclusion. Right-sided treatment plans and IMN inclusion increased mean lung V20. Using standard field arrangements and 3D planning, we observed excellent coverage of the RB and IMN, regardless of laterality or reconstruction type. Our results demonstrate that adequate doses can be delivered to the RB with or without IMN coverage.
Purchasing a cycle helmet: are retailers providing adequate advice?
Plumridge, E.; McCool, J.; Chetwynd, J.; Langley, J. D.
1996-01-01
OBJECTIVES: The aim of this study was to examine the selling of cycle helmets in retail stores with particular reference to the adequacy of advice offered about the fit and securing of helmets. METHODS: All 55 retail outlets selling cycle helmets in Christchurch, New Zealand were studied by participant observation. A researcher entered each store as a prospective customer and requested assistance to purchase a helmet. She took detailed field notes of the ensuing encounter and these were subsequently transcribed, coded, and analysed. RESULTS: Adequate advice for helmet purchase was given in less than half of the stores. In general the sales assistants in specialist cycle shops were better informed and gave more adequate advice than those in department stores. Those who gave good advice also tended to be more active in helping with fitting the helmet. Knowledge about safety standards was apparent in one third of sales assistants. Few stores displayed information for customers about the correct fit of cycle helmets. CONCLUSIONS: These findings suggest that the advice and assistance being given to ensure that cycle helmets fit properly is often inadequate and thus the helmets may fail to fulfil their purpose in preventing injury. Consultation between retailers and policy makers is a necessary first step to improving this situation. PMID:9346053
Adequate drainage system design for heap leaching structures.
Majdi, Abbas; Amini, Mehdi; Nasab, Saeed Karimi
2007-08-17
The paper describes an optimum design of a drainage system for a heap leaching structure, which has positive impacts on both the mine environment and mine economics. In order to properly design a drainage system, the causes of an increase in the acid level of the heap, which in turn produces severe problems in the hydrometallurgy processes, must be evaluated. One of the most significant negative impacts induced by an increase in the acid level within a heap structure is the increase of pore acid pressure, which in turn increases the potential of a heap-slide that may endanger the mine environment. In this paper, initially the thickness of the gravelly drainage layer is determined via existing empirical equations. Then, by assuming that the calculated thickness is constant throughout the heap structure, an approach has been proposed to calculate the required internal diameter of the slotted polyethylene pipes which are used for auxiliary drainage purposes. In order to adequately design this diameter, the pipe's cross-sectional deformation due to the overburden pressure of the stepped heap structure is taken into account. Finally, a design of an adequate drainage system for heap structure 2 at the Sarcheshmeh copper mine is presented and the results are compared with those calculated by existing equations. PMID:17321044
Charro, Elena; Pardo, Rafael; Peña, Víctor
2013-10-01
Coal-fired power-plants (CFPP) can be a source of contamination because the coal contains trace amounts of natural radionuclides, such as (40)K, (238)U, (232)Th, and their decay products. These radionuclides can be released as fly ash from the CFPP and deposited from the atmosphere on the nearby top soils, thereby modifying the natural radioactivity background levels and subsequently increasing the total radioactive dose received by the nearby population. In this paper, an area of 64 km(2) around the CFPP of Velilla del Río Carrión (Spain) has been studied by collecting 67 surface soil samples and measuring the activities of one artificial and six natural radionuclides by gamma spectrometry. The results are similar to natural background levels and ranged from 0 to 209 for (137)Cs, 11 to 50 for (238)U, 14 to 67 for (226)Ra, 29 to 380 for (210)Pb, 15 to 68 for (232)Th, 17 to 78 for (224)Ra, and 97 to 790 for (40)K (all values in Bq kg(-1)). Besides the classical radiochemical tools, Analysis of Variance (ANOVA), Principal Component Analysis (PCA), Hierarchical Clustering Analysis (HCA), and kriging mapping have been applied to the experimental dataset, allowing us to identify two different models of spatial distribution around the CFPP. The first, followed by (238)U, (226)Ra, (232)Th, (224)Ra and (40)K, can be assigned to 'natural background radioactivity', whereas the second model, followed by (210)Pb and (137)Cs, is based on 'atmospheric fallout radioactivity'. The main conclusion of this work is that the CFPP has no influence on the radioactivity levels measured in the studied area, which has a mean annual outdoor effective dose E = 71 ± 22 μSv, very close to the average UNSCEAR value of 70 μSv, thus confirming the almost non-existent radioactive risk posed by the presence of the CFPP. PMID:23680923
Cosmic statistics of statistics
NASA Astrophysics Data System (ADS)
Szapudi, István; Colombi, Stéphane; Bernardeau, Francis
1999-12-01
The errors on statistics measured in finite galaxy catalogues are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly non-linear to weakly non-linear scales. For non-linear functions of unbiased estimators, such as the cumulants, the phenomenon of cosmic bias is identified and computed. Since it is subdued by the cosmic errors in the range of applicability of the theory, correction for it is inconsequential. In addition, the method of Colombi, Szapudi & Szalay concerning sampling effects is generalized, adapting the theory for inhomogeneous galaxy catalogues. While previous work focused on the variance only, the present article calculates the cross-correlations between moments and connected moments as well for a statistically complete description. The final analytic formulae representing the full theory are explicit but somewhat complicated. Therefore we have made available a fortran program capable of calculating the described quantities numerically (for further details e-mail SC at colombi@iap.fr). An important special case is the evaluation of the errors on the two-point correlation function, for which this should be more accurate than any method put forward previously. This tool will be immensely useful in the future for assessing the precision of measurements from existing catalogues, as well as aiding the design of new galaxy surveys. To illustrate the applicability of the results and to explore the numerical aspects of the theory qualitatively and quantitatively, the errors and cross-correlations are predicted under a wide range of assumptions for the future Sloan Digital Sky Survey. The principal results concerning the cumulants ξ, Q3 and Q4 is that
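The factorial moments on which this error theory is built are straightforward to estimate from counts in cells. The following is a bare illustrative estimator of that quantity, not the Szapudi & Colombi error machinery itself:

```python
import numpy as np

def factorial_moments(counts, kmax=4):
    """Estimate factorial moments F_k = <N(N-1)...(N-k+1)> from
    counts-in-cells. For a pure Poisson sample F_k = nbar**k, so
    deviations from that trace the correlation hierarchy."""
    counts = np.asarray(counts, dtype=float)
    moments = []
    for k in range(1, kmax + 1):
        falling = np.ones_like(counts)
        for j in range(k):
            falling *= counts - j          # N(N-1)...(N-k+1)
        moments.append(float(falling.mean()))
    return moments
```

For cells that each contain exactly 3 galaxies this returns [3.0, 6.0, 6.0, 0.0], the falling factorials of 3; the discreteness (shot noise) is removed by construction, which is why the error analysis is formulated in terms of these moments.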
Are Academic Programs Adequate for the Software Profession?
ERIC Educational Resources Information Center
Koster, Alexis
2010-01-01
According to the Bureau of Labor Statistics, close to 1.8 million people, or 77% of all computer professionals, were working in the design, development, deployment, maintenance, and management of software in 2006. The ACM [Association for Computing Machinery] model curriculum for the BS in computer science proposes that about 42% of the core body…
Are PPS payments adequate? Issues for updating and assessing rates
Sheingold, Steven H.; Richter, Elizabeth
1992-01-01
Declining operating margins under Medicare's prospective payment system (PPS) have focused attention on the adequacy of payment rates. The question of whether annual updates to the rates have been too low or cost increases too high has become important. In this article we discuss issues relevant to updating PPS rates and judging their adequacy. We describe a modification to the current framework for recommending annual update factors. This framework is then used to retrospectively assess PPS payment and cost growth since 1985. The preliminary results suggest that current rates are more than adequate to support the cost of efficient care. Also discussed are why using financial margins to evaluate rates is problematic and alternative methods that might be employed. PMID:10127450
Dose Limits for Man do not Adequately Protect the Ecosystem
Higley, Kathryn A.; Alexakhin, Rudolf M.; McDonald, Joseph C.
2004-08-01
It has been known for quite some time that different organisms display differing degrees of sensitivity to the effects of ionizing radiations. Some microorganisms such as the bacterium Micrococcus radiodurans, along with many species of invertebrates, are extremely radio-resistant. Humans might be categorized as being relatively sensitive to radiation, and are a bit more resistant than some pine trees. Therefore, it could be argued that maintaining the dose limits necessary to protect humans will also result in the protection of most other species of flora and fauna. This concept is usually referred to as the anthropocentric approach: in other words, if man is protected then the environment is also adequately protected. The ecocentric approach might be stated as: the health of humans is effectively protected only when the environment is not unduly exposed to radiation. The ICRP is working on new recommendations dealing with the protection of the environment, and this debate should help to highlight a number of relevant issues concerning that topic.
ENSURING ADEQUATE SAFETY WHEN USING HYDROGEN AS A FUEL
Coutts, D
2007-01-22
Demonstration projects using hydrogen as a fuel are becoming very common. Often these projects rely on project-specific risk evaluations to support project safety decisions. This is necessary because regulations, codes, and standards (hereafter referred to as standards) are just being developed. This paper will review some of the approaches being used in these evolving standards, and techniques which demonstration projects can implement to bridge the gap between current requirements and stakeholder desires. Many of the evolving hydrogen-fuel standards use performance-based language, which establishes minimum performance and safety objectives, as compared with prescriptive-based language that prescribes specific design solutions. This is being done for several reasons, including: (1) concern that establishing specific design solutions too early will stifle invention, (2) the sparse performance data available to support selection of design approaches, and (3) a risk-averse public which is unwilling to accept losses that were incurred in developing previous prescriptive design standards. The evolving standards often contain words such as: ''The manufacturer shall implement the measures and provide the information necessary to minimize the risk of endangering a person's safety or health''. This typically implies that the manufacturer or project manager must produce and document an acceptable level of risk. If accomplished using a comprehensive and systematic process, the demonstration project risk assessment can ease the transition to widespread commercialization. An approach to adequately evaluate and document the safety risk will be presented.
Adequate peritoneal dialysis: theoretical model and patient treatment.
Tast, C
1998-01-01
The objective of this study was to evaluate the relationship between adequate PD, defined by a sufficient weekly Kt/V (2.0) and creatinine clearance (CCR) (60 L), and the necessary daily dialysate volume. These recommended parameters were the result of a recent multi-centre study (CANUSA). To this end, 40 patients in our hospital who had carried out PD for at least 8 weeks and up to 6 years were examined and compared in 1996. These goals (CANUSA) are easily attainable in the early treatment of many individuals with a low body surface area (BSA). With a higher BSA or missing RRF (Residual Renal Function), the daily dose of dialysis must be adjusted. We found it difficult to obtain the recommended parameters and tried to find a solution to this problem. The simplest method is to increase the volume or exchange rate. The most expensive method is to change from CAPD to APD with the possibility of higher volume or exchange rates. Selection of therapy must take into consideration: 1. patient preference, 2. body mass, 3. peritoneal transport rates, 4. ability to perform therapy, 5. cost of therapy and 6. risk of peritonitis. With this information in mind, an individual prescription can be formulated and matched to the appropriate modality of PD. PMID:10392062
DARHT - an 'adequate' EIS: A NEPA case study
Webb, M.D.
1997-08-01
The Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility Environmental Impact Statement (EIS) provides a case study that is interesting for many reasons. The EIS was prepared quickly, in the face of a lawsuit, for a project with unforeseen environmental impacts, for a facility that was deemed urgently essential to national security. Following judicial review the EIS was deemed to be 'adequate.' DARHT is a facility now being built at Los Alamos National Laboratory (LANL) as part of the Department of Energy (DOE) nuclear weapons stockpile stewardship program. DARHT will be used to evaluate the safety and reliability of nuclear weapons, evaluate conventional munitions and study high-velocity impact phenomena. DARHT will be equipped with two accelerator-driven, high-intensity X-ray machines to record images of materials driven by high explosives. DARHT will be used for a variety of hydrodynamic tests, and DOE plans to conduct some dynamic experiments using plutonium at DARHT as well.
Adequate Yearly Progress (AYP) at Your Library Media Center
ERIC Educational Resources Information Center
Anderson, Cynthia
2007-01-01
Together administration and the library media center form a team that can make a difference in student learning and, in turn, in student achievement. The library media center can contribute to improve student learning, and there is an amazingly small cost that administration must pay for this powerful support. This article addresses…
On Adequate Comparisons of Antenna Phase Center Variations
NASA Astrophysics Data System (ADS)
Schoen, S.; Kersten, T.
2013-12-01
One important part of ensuring the high quality of the International GNSS Service's (IGS) products is the collection and publication of receiver- and satellite-antenna phase center variations (PCV). The PCV are crucial for global and regional networks, since they introduce a global scale factor of up to 16 ppb or changes in the height component of up to 10 cm, respectively. Furthermore, antenna phase center variations are also important for precise orbit determination, navigation and positioning of mobile platforms, such as the GOCE and GRACE gravity missions, or for accurate Precise Point Positioning (PPP) processing. Using the EUREF Permanent Network (EPN), Baire et al. (2012) showed that individual PCV values have a significant impact on geodetic positioning. These statements are further supported by studies of Steigenberger et al. (2013), where the impact of PCV on local ties is analysed. Currently, there are five calibration institutions, including the Institut für Erdmessung (IfE), contributing to the IGS PCV file. Different approaches, such as field calibrations and anechoic chamber measurements, are in use. Additionally, the computation and parameterization of the PCV differ completely between the methods. Therefore, every new approach has to pass a benchmark test in order to ensure that variations of PCV values of an identical antenna obtained from different methods are as consistent as possible. Since the number of approaches to obtain these PCV values rises with the number of calibration institutions, there is the necessity for an adequate comparison concept, taking into account not only the numerical values but also stochastic information and computational issues of the determined PCVs. This is of special importance, since the majority of calibrated receiver antennas published by the IGS originate from absolute field calibrations based on the Hannover Concept, Wübbena et al. (2000). In this contribution, a concept for the adequate
Improving access to adequate pain management in Taiwan.
Scholten, Willem
2015-06-01
There is a global crisis in access to pain management in the world. WHO estimates that 4.65 billion people live in countries where medical opioid consumption is near to zero. For 2010, WHO considered a per capita consumption of 216.7 mg morphine equivalents adequate, while Taiwan had a per capita consumption of 0.05 mg morphine equivalents in 2007. In Asia, the use of opioids is sensitive because of the Opium Wars in the 19th century, and for this reason the focus of controlled substances policies has been on the prevention of diversion and dependence. However, an optimal public health outcome requires that the beneficial aspects of these substances also be acknowledged. Therefore, WHO recommends a policy based on the Principle of Balance: ensuring access for medical and scientific purposes while preventing diversion, harmful use and dependence. Furthermore, international law requires that countries ensure access to opioid analgesics for medical and scientific purposes. There is evidence that opioid analgesics for chronic pain are not associated with a major risk of developing dependence. Barriers to access can be classified in the categories of overly restrictive laws and regulations; insufficient medical training on pain management and problems related to assessment of medical needs; attitudes such as an excessive fear of dependence or diversion; and economic and logistical problems. The GOPI project found many examples of such barriers in Asia. Access to opioid medicines in Taiwan can be improved by analysing the national situation and drafting a plan. The WHO policy guidelines Ensuring Balance in National Policies on Controlled Substances can be helpful for achieving this purpose, as well as international guidelines for pain treatment. PMID:26068436
Shi, Runhua; McLarty, Jerry W
2009-10-01
In this article, we introduced basic concepts of statistics, types of distributions, and descriptive statistics. A few examples were also provided. The basic concepts presented herein are only a fraction of the concepts related to descriptive statistics. Also, there are many commonly used distributions not presented herein, such as the Poisson distribution for rare events, and the exponential, F, and logistic distributions. More information can be found in many statistics books and publications. PMID:19891281
Kasap, Burcu; Akbaba, Gülhan; Yeniçeri, Emine N.; Akın, Melike N.; Akbaba, Eren; Öner, Gökalp; Turhan, Nilgün Ö.; Duru, Mehmet E.
2016-01-01
Objectives: To assess current iodine levels and related factors among healthy pregnant women. Methods: In this cross-sectional, hospital-based study, healthy pregnant women (n=135) were scanned for thyroid volume, provided urine samples for urinary iodine concentration and completed a questionnaire including sociodemographic characteristics and dietary habits targeted for iodine consumption at the Department of Obstetrics and Gynecology, School of Medicine, Muğla Sıtkı Koçman University, Muğla, Turkey, between August 2014 and February 2015. Sociodemographic data were analyzed by simple descriptive statistics. Results: Median urinary iodine concentration was 222.0 µg/L, indicating adequate iodine intake during pregnancy. According to World Health Organization (WHO) criteria, 28.1% of subjects had iodine deficiency, 34.1% had adequate iodine intake, 34.8% had more than adequate iodine intake, and 3.0% had excessive iodine intake during pregnancy. Education level, higher monthly income, current employment, consuming iodized salt, and adding salt to food during, or after cooking were associated with higher urinary iodine concentration. Conclusion: Iodine status of healthy pregnant women was adequate, although the percentage of women with more than adequate iodine intake was higher than the reported literature. PMID:27279519
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2008-01-01
As a branch of knowledge, Statistics is ubiquitous and its applications can be found in (almost) every field of human endeavour. In this article, the authors track down the possible source of the link between the "Siren song" and applications of Statistics. Answers to their previous five questions and five new questions on Statistics are presented.
ERIC Educational Resources Information Center
Callamaras, Peter
1983-01-01
This buyer's guide to seven major types of statistics software packages for microcomputers reviews Edu-Ware Statistics 3.0; Financial Planning; Speed Stat; Statistics with DAISY; Human Systems Dynamics package of Stats Plus, ANOVA II, and REGRESS II; Maxistat; and Moore-Barnes' MBC Test Construction and MBC Correlation. (MBR)
ERIC Educational Resources Information Center
Meyer, Donald L.
Bayesian statistical methodology and its possible uses in the behavioral sciences are discussed in relation to the solution of problems in both the use and teaching of fundamental statistical methods, including confidence intervals, significance tests, and sampling. The Bayesian model explains these statistical methods and offers a consistent…
SNAP benefits: Can an adequate benefit be defined?
Yaktine, Ann L; Caswell, Julie A
2014-01-01
The Supplemental Nutrition Assistance Program (SNAP) increases the food purchasing power of participating households. A committee convened by the Institute of Medicine (IOM) examined the question of whether it is feasible to define SNAP allotment adequacy. Total resources; individual, household, and environmental factors; and SNAP program characteristics that affect allotment adequacy were identified from a framework developed by the IOM committee. The committee concluded that it is feasible to define SNAP allotment adequacy; however, such a definition must take into account the degree to which participants' total resources and individual, household, and environmental factors influence the purchasing power of SNAP benefits and the impact of SNAP program characteristics on the calculation of the dollar value of the SNAP allotment. The committee recommended that the USDA Food and Nutrition Service investigate ways to incorporate these factors and program characteristics into research aimed at defining allotment adequacy. PMID:24425718
Vaumourin, Elise; Vourc'h, Gwenaël; Telfer, Sandra; Lambin, Xavier; Salih, Diaeldin; Seitzer, Ulrike; Morand, Serge; Charbonnel, Nathalie; Vayssier-Taussat, Muriel; Gasqui, Patrick
2014-01-01
A growing number of studies are reporting simultaneous infections by parasites in many different hosts. The detection of whether these parasites are significantly associated is important in medicine and epidemiology. Numerous approaches to detect associations are available, but only a few provide statistical tests. Furthermore, they generally test for an overall detection of association and do not identify which parasite is associated with which other one. Here, we developed a new approach, the association screening approach, to detect both the overall and the detail of multi-parasite associations. We studied the power of this new approach and of three other known ones (i.e., the generalized chi-square, the network and the multinomial GLM approaches) to identify parasite associations either due to parasite interactions or to confounding factors. We applied these four approaches to detect associations within two populations of multi-infected hosts: (1) rodents infected with Bartonella sp., Babesia microti and Anaplasma phagocytophilum and (2) a bovine population infected with Theileria sp. and Babesia sp. We found that the best power is obtained with the screening model and the generalized chi-square test. Differentiating between associations due to confounding factors and those due to parasite interactions was not possible. The screening approach significantly identified associations between Bartonella doshiae and B. microti, and between T. parva, T. mutans, and T. velifera. Thus, the screening approach was relevant to test the overall presence of parasite associations and identify the parasite combinations that are significantly over- or under-represented. Unraveling whether the associations are due to real biological interactions or confounding factors should be further investigated. Nevertheless, in the age of genomics and the advent of new technologies, it is a considerable asset to speed up research focusing on the mechanisms driving interactions between
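A minimal stand-in for the kind of overall association test being compared is the Pearson chi-square on a 2x2 presence/absence table. This sketch is illustrative only; the screening approach itself enumerates every infection combination rather than a single pair:

```python
import numpy as np

def cooccurrence_chi2(a, b):
    """Pearson chi-square statistic for a 2x2 presence/absence table of
    two parasites (1 = infected, 0 = uninfected). Expected cell counts
    are assumed to be nonzero."""
    a, b = np.asarray(a), np.asarray(b)
    n = len(a)
    obs = np.array([[np.sum((a == i) & (b == j)) for j in (0, 1)]
                    for i in (0, 1)], dtype=float)
    # expected counts under independence of the two parasites
    exp = obs.sum(axis=1, keepdims=True) * obs.sum(axis=0, keepdims=True) / n
    return float(((obs - exp) ** 2 / exp).sum())

# Perfectly matched infection patterns give the maximal statistic (= n)
print(cooccurrence_chi2([0] * 10 + [1] * 10, [0] * 10 + [1] * 10))
```

A statistic near zero indicates co-infection frequencies close to the independence expectation; large values flag over- or under-represented combinations, which is the signal the screening approach localizes to specific parasite pairs.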
HU, Yi-Juan; SUN, Wei; TZENG, Jung-Ying; PEROU, Charles M.
2015-01-01
Studies of expression quantitative trait loci (eQTLs) offer insight into the molecular mechanisms of loci that were found to be associated with complex diseases and the mechanisms can be classified into cis- and trans-acting regulation. At present, high-throughput RNA sequencing (RNA-seq) is rapidly replacing expression microarrays to assess gene expression abundance. Unlike microarrays that only measure the total expression of each gene, RNA-seq also provides information on allele-specific expression (ASE), which can be used to distinguish cis-eQTLs from trans-eQTLs and, more importantly, enhance cis-eQTL mapping. However, assessing the cis-effect of a candidate eQTL on a gene requires knowledge of the haplotypes connecting the candidate eQTL and the gene, which cannot be inferred with certainty. The existing two-stage approach that first phases the candidate eQTL against the gene and then treats the inferred phase as observed in the association analysis tends to attenuate the estimated cis-effect and reduce the power for detecting a cis-eQTL. In this article, we provide a maximum-likelihood framework for cis-eQTL mapping with RNA-seq data. Our approach integrates the inference of haplotypes and the association analysis into a single stage, and is thus unbiased and statistically powerful. We also develop a pipeline for performing a comprehensive scan of all local eQTLs for all genes in the genome by controlling for false discovery rate, and implement the methods in a computationally efficient software program. The advantages of the proposed methods over the existing ones are demonstrated through realistic simulation studies and an application to empirical breast cancer data from The Cancer Genome Atlas project. PMID:26568645
Vijayaraj, Veeraraghavan; Cheriyadat, Anil M; Bhaduri, Budhendra L; Vatsavai, Raju; Bright, Eddie A
2008-01-01
Statistical properties of high-resolution overhead images representing different land use categories are analyzed using various local and global statistical image properties based on the shape of the power spectrum, image gradient distributions, edge co-occurrence, and inter-scale wavelet coefficient distributions. The analysis was performed on a database of high-resolution (1 meter) overhead images representing a multitude of different downtown, suburban, commercial, agricultural and wooded exemplars. Various statistical properties relating to these image categories and their relationships are discussed. The categorical variations in power spectrum contour shapes, the unique gradient distribution characteristics of wooded categories, the similarity in edge co-occurrence statistics for overhead and natural images, and the unique edge co-occurrence statistics of downtown categories are presented in this work. Though previous work on natural image statistics has shown some of the unique characteristics of different categories, the relationships for overhead images are not well understood. The statistical properties of natural images were used in previous studies to develop prior image models, to predict and index objects in a scene and to improve computer vision models. The results from our research findings can be used to augment and adapt computer vision algorithms that rely on prior image statistics to process overhead images, calibrate the performance of overhead image analysis algorithms, and derive features for better discrimination of overhead image categories.
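The radially averaged power spectrum used to characterize spectral shape across land-use classes can be computed in a few lines. This is an illustrative implementation; the binning choices and even-sized-image assumption are my own, not the paper's:

```python
import numpy as np

def radial_power_spectrum(img, n_bins=30):
    """Radially averaged power spectrum of a 2-D image -- the kind of
    summary used to compare spectral falloff across image categories."""
    img = np.asarray(img, dtype=float)
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(f) ** 2
    h, w = img.shape
    yy, xx = np.indices((h, w))
    r = np.hypot(yy - h / 2, xx - w / 2)        # radial frequency (pixels)
    edges = np.linspace(0.0, r.max(), n_bins + 1)
    which = np.digitize(r.ravel(), edges) - 1    # bin index per pixel
    tot = np.bincount(which, weights=power.ravel(), minlength=n_bins + 1)
    cnt = np.bincount(which, minlength=n_bins + 1)
    return tot[:n_bins] / np.maximum(cnt[:n_bins], 1)
```

A single low-frequency sinusoid, for example, concentrates essentially all of its power in the first radial bin, whereas white noise yields a roughly flat profile; differences in the falloff of this curve are what separate, say, downtown from wooded exemplars.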
Wide Wide World of Statistics: International Statistics on the Internet.
ERIC Educational Resources Information Center
Foudy, Geraldine
2000-01-01
Explains how to find statistics on the Internet, especially international statistics. Discusses advantages over print sources, including convenience, currency of information, cost effectiveness, and value-added formatting; sources of international statistics; United Nations agencies; search engines and power searching; and evaluating sources. (LRW)
Smith, Alwyn
1969-01-01
This paper is based on an analysis of questionnaires sent to the health ministries of Member States of WHO asking for information about the extent, nature, and scope of morbidity statistical information. It is clear that most countries collect some statistics of morbidity and many countries collect extensive data. However, few countries relate their collection to the needs of health administrators for information, and many countries collect statistics principally for publication in annual volumes, which may appear up to three years after the year to which they refer. The desiderata of morbidity statistics may be summarized as reliability, representativeness, and relevance to current health problems. PMID:5306722
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Statistics of football dynamics
NASA Astrophysics Data System (ADS)
Mendes, R. S.; Malacarne, L. C.; Anteneodo, C.
2007-06-01
We investigate the dynamics of football matches. Our goal is to characterize statistically the temporal sequence of ball movements in this collective sport game, searching for traits of complex behavior. Data were collected over a variety of matches in South American, European and World championships throughout 2005 and 2006. We show that the statistics of ball touches present power-law tails and can be described by q-gamma distributions. To explain such behavior we propose a model that provides information on the characteristics of football dynamics. Furthermore, we discuss the statistics of the duration of out-of-play intervals, which are not directly related to the previous scenario.
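The q-gamma form mentioned above is the Tsallis generalization of the gamma density, in which the exponential factor is replaced by a q-exponential; for q > 1 this produces the power-law tails reported for ball-touch statistics. A minimal sketch (the parameter names are ours and the normalization constant is omitted):

```python
import numpy as np

def q_exponential(x, q):
    """Tsallis q-exponential e_q(x); reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = np.asarray(1.0 + (1.0 - q) * x, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

def q_gamma_pdf(t, q, shape, scale):
    """Unnormalized q-gamma density t^(shape-1) * e_q(-t/scale).
    For q > 1 the tail decays as the power law t^(shape - 1 - 1/(q-1)),
    instead of the exponential cutoff of an ordinary gamma density."""
    t = np.asarray(t, dtype=float)
    return t ** (shape - 1.0) * q_exponential(-t / scale, q)
```

Fitting q and the shape parameter to the empirical inter-touch time distribution is what distinguishes a q-gamma fit from an ordinary gamma fit, which cannot reproduce the heavy tail.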
NASA Technical Reports Server (NTRS)
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Percentage of Adults with High Blood Pressure Whose Hypertension Is Adequately Controlled
Survey data reporting the percentage of adults with high blood pressure whose hypertension is controlled, broken down by age group.
The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.
... cancer statistics across the world. U.S. Cancer Mortality Trends The best indicator of progress against cancer is ... the number of cancer survivors has increased. These trends show that progress is being made against the ...
NASA Astrophysics Data System (ADS)
Hermann, Claudine
Statistical physics bridges the macroscopic properties of a system and the microscopic behavior of its constituent particles, a connection that cannot be made by direct computation because of the enormous magnitude of Avogadro's number. Numerous systems underlying today's key technologies, such as semiconductors and lasers, are macroscopic quantum objects; only statistical physics allows an understanding of their fundamentals. Accordingly, this graduate text also focuses on particular applications, such as the properties of electrons in solids, radiation thermodynamics, and the greenhouse effect.