42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2012 CFR
2012-10-01
42 Public Health 3 (2012-10-01). § 417.568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...
Explorations in Statistics: Power
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2010-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four things affect…
INCREASING SCIENTIFIC POWER WITH STATISTICAL POWER
A brief survey of basic ideas in statistical power analysis demonstrates the advantages and ease of using power analysis throughout the design, analysis, and interpretation of research. The power of a statistical test is the probability of rejecting the null hypothesis of the test...
ERIC Educational Resources Information Center
Warkentien, Siri; Grady, Sarah
2009-01-01
This Statistics in Brief contributes to current research by investigating the use of tutoring services among a nationally representative group of public school students enrolled in grades K-12. The report compares students in schools that have not made Adequate Yearly Progress (AYP) for 3 or more years, and were thereby enrolled in schools that…
Calculating statistical power in Mendelian randomization studies.
Brion, Marie-Jo A; Shakhbazov, Konstantin; Visscher, Peter M
2013-10-01
In Mendelian randomization (MR) studies, where genetic variants are used as proxy measures for an exposure trait of interest, obtaining adequate statistical power is frequently a concern due to the small amount of variation in a phenotypic trait that is typically explained by genetic variants. A range of power estimates based on simulations and specific parameters for two-stage least squares (2SLS) MR analyses based on continuous variables has previously been published. However, there are presently no specific equations or software tools one can implement for calculating the power of a given MR study. Using asymptotic theory, we show that in the case of continuous variables and a single instrument, for example a single-nucleotide polymorphism (SNP) or a multiple-SNP predictor, statistical power for a fixed sample size is a function of two parameters: the proportion of variation in the exposure variable explained by the genetic predictor and the true causal association between the exposure and outcome variable. We demonstrate that power for 2SLS MR can be derived using the non-centrality parameter (NCP) of the statistical test that is employed to test whether the 2SLS regression coefficient is zero. We show that the previously published power estimates from simulations can be represented theoretically using this NCP-based approach, with similar estimates observed when the simulation-based estimates are compared with our NCP-based approach. General equations for calculating statistical power for 2SLS MR using the NCP are provided in this note, and we implement the calculations in a web-based application. PMID:24159078
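The NCP-based calculation described above can be sketched in a few lines. This is a minimal illustration, assuming standardized variables so that the non-centrality parameter reduces to NCP ≈ n·R²·β² (a simplification; the paper's general equations are more detailed), and approximating the 1-df test by a two-sided z-test:

```python
import math

def _phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def _phi_inv(p):
    """Inverse normal CDF by bisection (adequate for this sketch)."""
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if _phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def mr_power(n, r2, beta, alpha=0.05):
    """Approximate power of a 2SLS MR test of H0: beta = 0.

    n    -- sample size
    r2   -- proportion of exposure variance explained by the instrument
    beta -- true standardized causal effect of exposure on outcome

    Assumes NCP ~= n * r2 * beta**2; the shifted-normal approximation
    below treats the test as a two-sided z-test.
    """
    ncp = n * r2 * beta ** 2
    z = _phi_inv(1.0 - alpha / 2.0)
    shift = math.sqrt(ncp)
    return 1.0 - _phi(z - shift) + _phi(-z - shift)
```

With no causal effect the rejection rate collapses to the significance level, and power rises with sample size, instrument strength, and effect size, exactly the two-parameter dependence the abstract describes.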
Statistical Power in Meta-Analysis
ERIC Educational Resources Information Center
Liu, Jin
2015-01-01
Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation on two sample mean difference test under different situations: (1) the discrepancy between the analytical power and…
Can loss of balance from mesoscale eddies adequately power deep ocean mixing?
NASA Astrophysics Data System (ADS)
Williams, P. D.; Haine, T. W.; Read, P. L.
2009-12-01
The global ocean thermohaline circulation is partly composed of the sinking of dense surface waters at high latitudes. But in order to close the circulation and maintain the abyssal stratification, the dense waters must rise up again through vertical mixing. This process requires a source of energy roughly estimated to be 2 TW. Previous work has concluded that tides and winds may adequately supply the required power, but the conceivable role of loss of balance from mesoscale eddies, resulting in the generation of internal inertia-gravity waves and associated vertical mixing, has hitherto been considered to be 'of unknown importance' (Wunsch and Ferrari, 2004). We investigate the potential role of loss of balance by studying the generation of internal inertia-gravity waves by balanced flow in a rotating two-layer annulus laboratory experiment (Williams et al., 2008). A photograph from the experiment is shown in the figure. As the Rossby number of the balanced flow decreases, the amplitude of the emitted inertia-gravity waves also decreases, but much less rapidly than is predicted by several dynamical theories. This finding suggests that inertia-gravity waves might be far more energised than previously thought. At the peak of their generation, the balanced flow leaks roughly one per cent of its energy into internal inertia-gravity waves each rotation period. Crude extrapolation of this result to the global ocean suggests that the flux of energy from mesoscale eddies into internal waves may be as large as 1.5 TW. We claim no accuracy for this figure, which is only indicative. Nevertheless, we are persuaded that the generation of inertia-gravity waves from the balanced mesoscale flow may be an important source of energy for deep interior mixing, and deserves further study. Reference: Williams, P. D., Haine, T. W. N. and Read, P. L. (2008). Inertia-Gravity Waves Emitted from Balanced Flow: Observations, Properties, and Consequences. Journal of the Atmospheric Sciences, 65(11), p. 3543.
Statistical Performances of Resistive Active Power Splitter
NASA Astrophysics Data System (ADS)
Lalléchère, Sébastien; Ravelo, Blaise; Thakur, Atul
2016-03-01
In this paper, the synthesis and sensitivity analysis of an active power splitter (PWS) is proposed. It is based on an active cell composed of a Field Effect Transistor in cascade with a shunted resistor at the input and the output (resistive amplifier topology). The PWS uncertainty versus resistance tolerances is assessed using a stochastic method. Furthermore, with the proposed topology, the device gain can be controlled easily by varying a resistance. This provides a useful tool for analysing the statistical sensitivity of the system in an uncertain environment.
Evaluating and Reporting Statistical Power in Counseling Research
ERIC Educational Resources Information Center
Balkin, Richard S.; Sheperis, Carl J.
2011-01-01
Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…
Practical Uses of Statistical Power in Business Research Studies.
ERIC Educational Resources Information Center
Markowski, Edward P.; Markowski, Carol A.
1999-01-01
Proposes the use of statistical power subsequent to the results of hypothesis testing in business research. Describes how posttest use of power might be integrated into business statistics courses. (SK)
Toward "Constructing" the Concept of Statistical Power: An Optical Analogy.
ERIC Educational Resources Information Center
Rogers, Bruce G.
This paper presents a visual analogy that may be used by instructors to teach the concept of statistical power in statistics courses. Statistical power is mathematically defined as the probability of rejecting a null hypothesis when that null hypothesis is false or, equivalently, the probability of detecting a relationship when it exists. The analogy…
The Importance of Teaching Power in Statistical Hypothesis Testing
ERIC Educational Resources Information Center
Olinsky, Alan; Schumacher, Phyllis; Quinn, John
2012-01-01
In this paper, we discuss the importance of teaching power considerations in statistical hypothesis testing. Statistical power analysis determines the ability of a study to detect a meaningful effect size, where the effect size is the difference between the hypothesized value of the population parameter under the null hypothesis and the true value…
Accuracy of Estimates and Statistical Power for Testing Mediation in Latent Growth Curve Modeling
Cheong, JeeWon
2016-01-01
The latent growth curve modeling (LGCM) approach has been increasingly utilized to investigate longitudinal mediation. However, little is known about the accuracy of the estimates and statistical power when mediation is evaluated in the LGCM framework. A simulation study was conducted to address these issues under various conditions including sample size, effect size of mediated effect, number of measurement occasions, and R2 of measured variables. In general, the results showed that relatively large samples were needed to accurately estimate the mediated effects and to have adequate statistical power, when testing mediation in the LGCM framework. Guidelines for designing studies to examine longitudinal mediation and ways to improve the accuracy of the estimates and statistical power were discussed.
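The gap between analytical expectations and simulated power that motivates studies like this one can be illustrated with a bare-bones Monte Carlo: simulate a mediation model X → M → Y many times, test the indirect effect a·b with a Sobel z-test each time, and count rejections. This is a hypothetical single-level sketch, not the paper's far richer LGCM framework, and all parameter values are invented for illustration:

```python
import math
import random

def _ols(x, y):
    """Simple-regression slope and its standard error."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((v - mx) ** 2 for v in x)
    sxy = sum((u - mx) * (v - my) for u, v in zip(x, y))
    slope = sxy / sxx
    resid = [v - my - slope * (u - mx) for u, v in zip(x, y)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    return slope, se

def mediation_power(a, b, n, reps=300, seed=7):
    """Monte Carlo power of the Sobel test for the indirect effect a*b."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(reps):
        x = [rng.gauss(0, 1) for _ in range(n)]
        m = [a * v + rng.gauss(0, 1) for v in x]   # mediator
        y = [b * v + rng.gauss(0, 1) for v in m]   # outcome
        a_hat, se_a = _ols(x, m)
        b_hat, se_b = _ols(m, y)
        sobel_se = math.sqrt(b_hat ** 2 * se_a ** 2 + a_hat ** 2 * se_b ** 2)
        if abs(a_hat * b_hat / sobel_se) > 1.96:
            rejections += 1
    return rejections / reps
```

For example, mediation_power(0.5, 0.5, 100) estimates power for a fairly strong mediated effect; shrinking either path coefficient or the sample size drives the estimate down quickly, mirroring the sample-size and effect-size dependence reported above.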
Statistical modelling of mitochondrial power supply.
James, A T; Wiskich, J T; Conyers, R A
1989-01-01
By experiment and theory, formulae are derived to calculate the response of mitochondrial power supply, in flux and potential, to an ATP-consuming enzyme load, incorporating the effects of varying amounts of (i) enzyme, (ii) total circulating adenylate, and (iii) inhibition of the ATP/ADP translocase. The formulae, which apply between about 20% and 80% of maximum respiration, are the same as for the current and voltage of an electrical circuit in which a battery with potential linear in the logarithm of the total adenylate charges another battery, whose opposing potential is also linear in the same logarithm, through three resistances. These resistances produce loss of potential due to disequilibrium of (i) intramitochondrial oxidative phosphorylation, (ii) the ATP/ADP translocase, and (iii) the ATP-consuming enzyme load. The model is represented geometrically by the following configuration: when potential is plotted against flux, the points lie on two pencils of lines, each concurrent at zero respiration, the two pencils describing the respective characteristics of the mitochondrion and the enzyme. Control coefficients and elasticities are calculated from the formulae. PMID:2708917
New Dynamical-Statistical Techniques for Wind Power Prediction
NASA Astrophysics Data System (ADS)
Stathopoulos, C.; Kaperoni, A.; Galanis, G.; Kallos, G.
2012-04-01
The increased use of renewable energy sources, and especially of wind power, has revealed the significance of accurate environmental and wind power predictions over wind farms, which critically affect the integration of the produced power into the general grid. This issue is studied in the present paper by means of high-resolution physical and statistical models. Two numerical weather prediction (NWP) systems, namely SKIRON and RAMS, are used to simulate the flow characteristics in selected wind farms in Greece. The NWP model output is post-processed by utilizing Kalman and Kolmogorov statistics in order to remove systematic errors. Modeled wind predictions in combination with available on-site observations are used for estimation of the wind power potential by utilizing a variety of statistical power prediction models based on non-linear and hyperbolic functions. The obtained results reveal the strong dependence of the forecast uncertainty on the wind variation, the limited influence of previously recorded power values, and the advantages that non-linear, non-polynomial functions can have in the successful control of power curve characteristics. This methodology is developed within the framework of the FP7 projects WAUDIT and MARINA PLATFORM.
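The Kalman post-processing step mentioned above is, in its simplest form, a scalar filter that tracks the slowly varying systematic error of the NWP wind forecast and subtracts it. The sketch below is a generic textbook version, not the authors' implementation; the noise variances q and r are illustrative tuning values:

```python
class KalmanBias:
    """Scalar Kalman filter tracking the systematic NWP forecast error.

    State: the slowly varying bias b_t of the wind forecast.
    Model: b_t = b_{t-1} + w,  observed error e_t = b_t + v,
    with process-noise variance q and observation-noise variance r
    (both hypothetical tuning values here).
    """
    def __init__(self, q=0.01, r=1.0):
        self.b = 0.0   # current bias estimate
        self.p = 1.0   # variance of the bias estimate
        self.q, self.r = q, r

    def update(self, forecast, observation):
        """Ingest one forecast/observation pair; return corrected forecast."""
        self.p += self.q                   # predict: bias may have drifted
        k = self.p / (self.p + self.r)     # Kalman gain
        err = forecast - observation       # observed forecast error
        self.b += k * (err - self.b)       # correct the bias estimate
        self.p *= (1.0 - k)
        return forecast - self.b           # bias-corrected forecast
```

Feeding the filter a forecast stream carrying a constant +1.5 m/s bias drives the corrected forecast toward the observations within a few dozen updates, which is exactly the systematic-error removal the abstract describes.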
Statistical Power Analysis in Education Research. NCSER 2010-3006
ERIC Educational Resources Information Center
Hedges, Larry V.; Rhoads, Christopher
2010-01-01
This paper provides a guide to calculating statistical power for the complex multilevel designs that are used in most field studies in education research. For multilevel evaluation studies in the field of education, it is important to account for the impact of clustering on the standard errors of estimates of treatment effects. Using ideas from…
Power Curve Modeling in Complex Terrain Using Statistical Models
NASA Astrophysics Data System (ADS)
Bulaevskaya, V.; Wharton, S.; Clifton, A.; Qualley, G.; Miller, W.
2014-12-01
Traditional power output curves typically model power only as a function of the wind speed at the turbine hub height. While the latter is an essential predictor of power output, wind speed information in other parts of the vertical profile, as well as additional atmospheric variables, are also important determinants of power. The goal of this work was to determine the gain in predictive ability afforded by adding wind speed information at other heights, as well as other atmospheric variables, to the power prediction model. Using data from a wind farm with moderately complex terrain in the Altamont Pass region in California, we trained three statistical models (a neural network, a random forest, and a Gaussian process model) to predict power output from various sets of the aforementioned predictors. The comparison of these predictions to the observed power data revealed that considerable improvements in prediction accuracy can be achieved both through the addition of predictors other than the hub-height wind speed and through the use of statistical models. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344 and was funded by the Wind Uncertainty Quantification Laboratory Directed Research and Development Project at LLNL under project tracking code 12-ERD-069.
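The "traditional" baseline that such statistical models are compared against is a deterministic curve mapping hub-height wind speed to power. A minimal idealized version looks like the following; all turbine parameters are invented for illustration and are not taken from the Altamont Pass farm:

```python
def power_curve(v, cut_in=3.0, rated_v=12.0, cut_out=25.0, rated_p=2000.0):
    """Idealized turbine power curve: power (kW) from hub-height speed (m/s).

    Between cut-in and rated speed, power grows with the cube of wind
    speed, reflecting the kinetic-energy flux P ~ 0.5*rho*A*Cp*v**3.
    All numeric defaults are illustrative placeholders.
    """
    if v < cut_in or v >= cut_out:
        return 0.0          # turbine parked or shut down
    if v >= rated_v:
        return rated_p      # pitch control caps output at rated power
    # cubic ramp between cut-in (0 kW) and rated speed (rated_p kW)
    frac = (v ** 3 - cut_in ** 3) / (rated_v ** 3 - cut_in ** 3)
    return rated_p * frac
```

The statistical models in the study improve on this single-input mapping by conditioning on shear, turbulence, and other profile information that a hub-height-only curve cannot see.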
Robust Statistical Detection of Power-Law Cross-Correlation
NASA Astrophysics Data System (ADS)
Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert
2016-06-01
We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.
"Using Power Tables to Compute Statistical Power in Multilevel Experimental Designs"
ERIC Educational Resources Information Center
Konstantopoulos, Spyros
2009-01-01
Power computations for one-level experimental designs that assume simple random samples are greatly facilitated by power tables such as those presented in Cohen's book about statistical power analysis. However, in education and the social sciences experimental designs have naturally nested structures and multilevel models are needed to compute the…
Statistical analysis of cascading failures in power grids
Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin
2010-12-01
We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for the automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators, and lines. Our model is quasi-static in the causal, discrete-time, and sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current (DC) description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39, and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between the average number of removed loads, generators, and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.
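At the core of the DC approximation used in such models is a linear solve for bus voltage angles, B·θ = P, from which line flows follow. Below is a self-contained toy solver, illustrative only: it is not the paper's cascading-failure model and omits line capacities, generator limits, and load shedding:

```python
def dc_power_flow(n_bus, lines, injections, slack=0):
    """Solve the DC power-flow equations B*theta = P and return line flows.

    lines      -- list of (bus_i, bus_j, susceptance)
    injections -- net power injection at each bus (must sum to ~0)
    slack      -- reference bus whose angle is fixed at 0
    """
    # Build the reduced susceptance matrix (slack row/column removed).
    idx = [b for b in range(n_bus) if b != slack]
    pos = {b: k for k, b in enumerate(idx)}
    m = len(idx)
    B = [[0.0] * m for _ in range(m)]
    P = [injections[b] for b in idx]
    for i, j, b in lines:
        for u, v in ((i, j), (j, i)):
            if u != slack:
                B[pos[u]][pos[u]] += b
                if v != slack:
                    B[pos[u]][pos[v]] -= b
    # Gaussian elimination with partial pivoting.
    for c in range(m):
        piv = max(range(c, m), key=lambda r: abs(B[r][c]))
        B[c], B[piv] = B[piv], B[c]
        P[c], P[piv] = P[piv], P[c]
        for r in range(c + 1, m):
            f = B[r][c] / B[c][c]
            for k in range(c, m):
                B[r][k] -= f * B[c][k]
            P[r] -= f * P[c]
    theta_red = [0.0] * m
    for r in range(m - 1, -1, -1):
        s = sum(B[r][k] * theta_red[k] for k in range(r + 1, m))
        theta_red[r] = (P[r] - s) / B[r][r]
    theta = [0.0] * n_bus
    for b in idx:
        theta[b] = theta_red[pos[b]]
    # Flow on line (i, j) is b * (theta_i - theta_j).
    return {(i, j): b * (theta[i] - theta[j]) for i, j, b in lines}
```

On a symmetric 3-bus triangle with unit susceptances and a 0.5 p.u. transfer from bus 1 to bus 2, two-thirds of the power takes the direct line and one-third the detour, the classic DC result.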
The Role of Atmospheric Measurements in Wind Power Statistical Models
NASA Astrophysics Data System (ADS)
Wharton, S.; Bulaevskaya, V.; Irons, Z.; Newman, J. F.; Clifton, A.
2015-12-01
The simplest wind power generation curves model power only as a function of the wind speed at turbine hub-height. While the latter is an essential predictor of power output, it is widely accepted that wind speed information in other parts of the vertical profile, as well as additional atmospheric variables including atmospheric stability, wind veer, and hub-height turbulence are also important factors. The goal of this work is to determine the gain in predictive ability afforded by adding additional atmospheric measurements to the power prediction model. In particular, we are interested in quantifying any gain in predictive ability afforded by measurements taken from a laser detection and ranging (lidar) instrument, as lidar provides high spatial and temporal resolution measurements of wind speed and direction at 10 or more levels throughout the rotor-disk and at heights well above. Co-located lidar and meteorological tower data as well as SCADA power data from a wind farm in Northern Oklahoma will be used to train a set of statistical models. In practice, most wind farms continue to rely on atmospheric measurements taken from less expensive, in situ instruments mounted on meteorological towers to assess turbine power response to a changing atmospheric environment. Here, we compare a large suite of atmospheric variables derived from tower measurements to those taken from lidar to determine if remote sensing devices add any competitive advantage over tower measurements alone to predict turbine power response.
Development and testing of improved statistical wind power forecasting methods.
Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J.
2011-12-06
Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios
Statistical Models of Power-law Distributions in Homogeneous Plasmas
Roth, Ilan
2011-01-04
A variety of in-situ measurements in space plasmas point to the intermittent formation of distribution functions with elongated tails and power laws at high energies. Power laws form a ubiquitous signature of many complex systems, plasma being a good example of non-Boltzmann behavior in the distribution functions of energetic particles. Particles, which either undergo mutual collisions or are scattered in phase space by electromagnetic fluctuations, exhibit statistical properties determined by the transition probability density function of a single interaction, while their non-asymptotic evolution may determine the observed high-energy populations. It is shown that relaxation of the Brownian motion assumptions leads to non-analytical characteristic functions and to a generalization of the Fokker-Planck equation with fractional derivatives, which yields power-law solutions parameterized by the probability density function.
Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques
NASA Technical Reports Server (NTRS)
Kuan, Gary M
2008-01-01
The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be maintained. A method of statistical optical power analysis and budgeting, based on a technique developed for deep-space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
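The statistical budgeting idea, carrying a distribution for each loss contributor instead of a single worst-case number and quoting a confidence level against the requirement, can be sketched with a small Monte Carlo. All dB values below are invented placeholders, not SIM's actual budget:

```python
import random

def power_margin_confidence(p_in_dbm, losses, p_req_dbm,
                            trials=20000, seed=1):
    """Monte Carlo estimate of the probability that delivered optical
    power meets a requirement, given uncertain per-element losses.

    losses -- list of (nominal_db, sigma_db) loss contributors; losses
              in dB add along the chain, so delivered power is
              p_in_dbm - sum(sampled losses).
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        total_loss = sum(rng.gauss(nom, sig) for nom, sig in losses)
        if p_in_dbm - total_loss >= p_req_dbm:
            hits += 1
    return hits / trials
```

With 6.5 dB of nominal loss against a -8 dBm requirement the margin is comfortable and the confidence is near 1; tightening the requirement to -5 dBm collapses it toward 0, which is the kind of numerical confidence statement the method produces.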
Spatial factors affecting statistical power in testing marine fauna displacement.
Pérez Lapeña, B; Wijnberg, K M; Stein, A; Hulscher, S J M H
2011-10-01
Impacts of offshore wind farms on marine fauna are largely unknown. Therefore, one commonly adheres to the precautionary principle, which states that one shall take action to avoid potentially damaging impacts on marine ecosystems, even when full scientific certainty is lacking. We implement this principle by means of a statistical power analysis that includes spatial factors. The implementation is based on geostatistical simulations, accommodating zero-inflation in species data. We investigate scenarios in which an impact assessment still has to be carried out. Our results show that the environmental conditions at the time of the survey have the greatest influence on power, followed by survey effort and species abundance in the reference situation. Spatial dependence in species numbers at local scales affects power, but its effect is smaller for the scenarios investigated. Our findings can be used to improve the effectiveness of the economic investment in monitoring surveys. In addition, unnecessary extra survey effort, and the related costs, can be avoided when spatial dependence in species abundance is present and no improvement in power is achieved. PMID:22073657
Statistical tests for power-law cross-correlated processes
NASA Astrophysics Data System (ADS)
Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H. Eugene
2011-12-01
For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlation analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρDCCA(T,n), where T is the total length of the time series and n the window size. For ρDCCA(T,n), we numerically verified the Cauchy inequality -1 ≤ ρDCCA(T,n) ≤ 1. Here we derive -1 ≤ ρDCCA(T,n) ≤ 1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρDCCA within which the cross-correlations become statistically significant. For overlapping windows we numerically determine, and for nonoverlapping windows we derive, that the standard deviation of ρDCCA(T,n) tends with increasing T to 1/T. Using ρDCCA(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series.
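The coefficient ρDCCA(T,n) itself is straightforward to compute: build the integrated profiles of both series, linearly detrend them in windows of size n, and take the ratio of the average detrended covariance to the geometric mean of the two detrended variances. The sketch below uses non-overlapping windows (the paper also treats overlapping ones):

```python
import math

def _profile(series):
    """Cumulative sum of the mean-subtracted series (the 'profile')."""
    m = sum(series) / len(series)
    total, prof = 0.0, []
    for v in series:
        total += v - m
        prof.append(total)
    return prof

def _detrended_cov(px, py, n):
    """Average covariance of linearly detrended profiles in boxes of size n."""
    N = len(px)
    ts = list(range(n))
    st = sum(ts)
    stt = sum(t * t for t in ts)
    denom = n * stt - st * st
    covs = []
    for start in range(0, N - n + 1, n):      # non-overlapping boxes
        wx = px[start:start + n]
        wy = py[start:start + n]
        def residuals(w):
            # least-squares linear detrend inside the box
            sw = sum(w)
            stw = sum(t * v for t, v in zip(ts, w))
            b = (n * stw - st * sw) / denom
            a = (sw - b * st) / n
            return [v - (a + b * t) for t, v in zip(ts, w)]
        rx, ry = residuals(wx), residuals(wy)
        covs.append(sum(u * v for u, v in zip(rx, ry)) / (n - 1))
    return sum(covs) / len(covs)

def rho_dcca(x, y, n):
    """Detrended cross-correlation coefficient for window size n."""
    px, py = _profile(x), _profile(y)
    f2xy = _detrended_cov(px, py, n)
    f2x = _detrended_cov(px, px, n)
    f2y = _detrended_cov(py, py, n)
    return f2xy / math.sqrt(f2x * f2y)
```

By construction ρDCCA of a series with itself is 1 and the Cauchy inequality bounds it in [-1, 1]; the significance test proposed in the paper asks when values for two different series depart meaningfully from 0.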
Error, Power, and Blind Sentinels: The Statistics of Seagrass Monitoring
Schultz, Stewart T.; Kruschel, Claudia; Bakran-Petricioli, Tatjana; Petricioli, Donat
2015-01-01
We derive statistical properties of standard methods for monitoring of habitat cover worldwide, and criticize them in the context of mandated seagrass monitoring programs, as exemplified by Posidonia oceanica in the Mediterranean Sea. We report the novel result that cartographic methods with non-trivial classification errors are generally incapable of reliably detecting habitat cover losses less than about 30 to 50%, and the field labor required to increase their precision can be orders of magnitude higher than that required to estimate habitat loss directly in a field campaign. We derive a universal utility threshold of classification error in habitat maps that represents the minimum habitat map accuracy above which direct methods are superior. Widespread government reliance on blind-sentinel methods for monitoring seafloor can obscure the gradual and currently ongoing losses of benthic resources until the time has long passed for meaningful management intervention. We find two classes of methods with very high statistical power for detecting small habitat cover losses: 1) fixed-plot direct methods, which are over 100 times as efficient as direct random-plot methods in a variable habitat mosaic; and 2) remote methods with very low classification error such as geospatial underwater videography, which is an emerging, low-cost, non-destructive method for documenting small changes at millimeter visual resolution. General adoption of these methods and their further development will require a fundamental cultural change in conservation and management bodies towards the recognition and promotion of requirements of minimal statistical power and precision in the development of international goals for monitoring these valuable resources and the ecological services they provide. PMID:26367863
Toward improved statistical treatments of wind power forecast errors
NASA Astrophysics Data System (ADS)
Hart, E.; Jacobson, M. Z.
2011-12-01
The ability of renewable resources to reliably supply electric power demand is of considerable interest in the context of growing renewable portfolio standards and the potential for future carbon markets. Toward this end, a number of probabilistic models have been applied to the problem of grid integration of intermittent renewables, such as wind power. Most of these models rely on simple Markov or autoregressive models of wind forecast errors. While these models generally capture the bulk statistics of wind forecast errors, they often fail to reproduce accurate ramp rate distributions and do not accurately describe extreme forecast error events, both of which are of considerable interest to those seeking to comment on system reliability. The problem often lies in characterizing and reproducing not only the magnitude of wind forecast errors, but also the timing or phase errors (i.e., when a front passes over a wind farm). Here we compare time series wind power data produced using different forecast error models to determine the best approach for capturing errors in both magnitude and phase. Additionally, new metrics are presented to characterize forecast quality with respect to both considerations.
Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events
NASA Astrophysics Data System (ADS)
DeChant, C. M.; Moradkhani, H.
2014-12-01
Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
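The Poisson-binomial distribution this abstract recommends is easy to compute exactly. The sketch below builds its pmf by repeated convolution and uses it for a simple reliability check; the helper names and the "sum all outcomes no more likely than the observed one" p-value convention are our illustrative choices, not necessarily the authors' exact procedure:

```python
import numpy as np

def poisson_binomial_pmf(ps):
    """Exact pmf of the number of successes among independent
    Bernoulli trials with probabilities ps, built by repeated
    convolution (dynamic programming)."""
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def event_count_pvalue(ps, k_observed):
    """P-value for observing k_observed events given forecast
    probabilities ps, under the hypothesis that the forecasts are
    reliable: sum the probability of every outcome no more likely
    than the observed one."""
    pmf = poisson_binomial_pmf(ps)
    return float(pmf[pmf <= pmf[k_observed] + 1e-12].sum())
```

With identical probabilities the pmf reduces to the binomial, which makes a convenient sanity check.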
Statistical Properties of Maximum Likelihood Estimators of Power Law Spectra Information
NASA Technical Reports Server (NTRS)
Howell, L. W., Jr.
2003-01-01
A simple power law model consisting of a single spectral index, sigma(sub 1), is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10(exp 13) eV, with a transition at the knee energy, E(sub k), to a steeper spectral index sigma(sub 2) greater than sigma(sub 1) above E(sub k). The maximum likelihood (ML) procedure was developed for estimating the single parameter sigma(sub 1) of a simple power law energy spectrum and generalized to estimate the three spectral parameters of the broken power law energy spectrum from simulated detector responses and real cosmic-ray data. The statistical properties of the ML estimator were investigated and shown to have the three desirable properties: (P1) consistency (asymptotically unbiased), (P2) efficiency (asymptotically attains the Cramer-Rao minimum variance bound), and (P3) asymptotically normally distributed, under a wide range of potential detector response functions. Attainment of these properties necessarily implies that the ML estimation procedure provides the best unbiased estimator possible. While simulation studies can easily determine if a given estimation procedure provides an unbiased estimate of the spectra information, and whether or not the estimator is approximately normally distributed, attainment of the Cramer-Rao bound (CRB) can only be ascertained by calculating the CRB for an assumed energy spectrum-detector response function combination, which can be quite formidable in practice. However, the effort in calculating the CRB is very worthwhile because it provides the necessary means to compare the efficiency of competing estimation techniques and, furthermore, provides a stopping rule in the search for the best unbiased estimator. Consequently, the CRB for both the simple and broken power law energy spectra are derived herein and the conditions under which they are attained in practice are investigated.
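For a simple (unbroken) power law with no detector response, the ML estimator referred to above has a well-known closed form. A hedged sketch, with synthetic data drawn by inverse-transform sampling (all names and parameter values are illustrative, not taken from the report):

```python
import numpy as np

def ml_spectral_index(E, E_min):
    """Maximum-likelihood estimate of a single power-law index alpha
    for f(E) ~ E^(-alpha), E >= E_min (continuous case, no detector
    response): alpha_hat = 1 + N / sum(ln(E_i / E_min))."""
    E = np.asarray(E, float)
    return 1.0 + len(E) / np.log(E / E_min).sum()

# Synthetic check: draw from a pure power law by inverse transform.
rng = np.random.default_rng(1)
alpha_true, E_min = 2.7, 1.0
u = rng.random(200_000)
E = E_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))
alpha_hat = ml_spectral_index(E, E_min)
```

The estimator is consistent and asymptotically efficient, in line with properties (P1)-(P3) listed in the abstract, though folding in a realistic detector response requires the full numerical ML treatment the report describes.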
An Examination of Statistical Power in Multigroup Dynamic Structural Equation Models
ERIC Educational Resources Information Center
Prindle, John J.; McArdle, John J.
2012-01-01
This study used statistical simulation to calculate differential statistical power in dynamic structural equation models with groups (as in McArdle & Prindle, 2008). Patterns of between-group differences were simulated to provide insight into how model parameters influence power approximations. Chi-square and root mean square error of…
Decadal power in land air temperatures: Is it statistically significant?
NASA Astrophysics Data System (ADS)
Thejll, Peter A.
2001-12-01
The geographical distribution and properties of the well-known 10-11 year signal in terrestrial temperature records is investigated. By analyzing the Global Historical Climate Network data for surface air temperatures we verify that the signal is strongest in North America and is similar in nature to that reported earlier by R. G. Currie. The decadal signal is statistically significant for individual stations, but it is not possible to show that the signal is statistically significant globally, using strict tests. In North America, during the twentieth century, the decadal variability in the solar activity cycle is associated with the decadal part of the North Atlantic Oscillation index series in such a way that both of these signals correspond to the same spatial pattern of cooling and warming. A method for testing statistical results with Monte Carlo trials on data fields with specified temporal structure and specific spatial correlation retained is presented.
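The kind of Monte Carlo trial mentioned in the abstract's final sentence can be sketched as a band-power test against autocorrelated surrogates. This is a generic AR(1)-surrogate test of our own construction; the paper's method additionally retains the spatial correlation structure across stations:

```python
import numpy as np

def band_power(x, f_lo, f_hi):
    """Summed periodogram power in the band [f_lo, f_hi] (unit sampling)."""
    x = np.asarray(x, float) - np.mean(x)
    freqs = np.fft.rfftfreq(len(x))
    power = np.abs(np.fft.rfft(x)) ** 2
    return power[(freqs >= f_lo) & (freqs <= f_hi)].sum()

def mc_band_pvalue(x, f_lo, f_hi, n_sur=200, seed=0):
    """Monte Carlo significance of the observed band power against
    AR(1) surrogates matching the record's variance and lag-1
    autocorrelation."""
    x = np.asarray(x, float)
    rng = np.random.default_rng(seed)
    r1 = np.corrcoef(x[:-1], x[1:])[0, 1]
    sigma_e = np.std(x) * np.sqrt(max(1.0 - r1 ** 2, 1e-12))
    obs = band_power(x, f_lo, f_hi)
    hits = 0
    for _ in range(n_sur):
        e = rng.normal(0.0, sigma_e, len(x))
        s = np.zeros(len(x))
        for t in range(1, len(x)):       # AR(1) surrogate series
            s[t] = r1 * s[t - 1] + e[t]
        if band_power(s, f_lo, f_hi) >= obs:
            hits += 1
    return (hits + 1) / (n_sur + 1)      # conservative p-value
```

A clean decadal oscillation yields a small p-value, while red noise alone typically does not, which is the distinction the abstract's strict tests are probing.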
Heidel, R Eric
2016-01-01
Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power. PMID:27073717
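For the simplest design the five components above reduce to, e.g., a two-group comparison of means, the a priori sample size has a closed form under the normal approximation. The sketch below is our own worked example, not taken from the article:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """A priori sample size per group for a two-sided, two-sample
    comparison of means (normal approximation):
        n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2,
    where d is the standardized effect size (Cohen's d)."""
    z = NormalDist().inv_cdf
    return ceil(2.0 * ((z(1.0 - alpha / 2.0) + z(power)) / d) ** 2)
```

For a medium effect (d = 0.5) at alpha = 0.05 and 80% power this gives 63 participants per group, close to the familiar t-test figure of 64.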
Automated FMV image quality assessment based on power spectrum statistics
NASA Astrophysics Data System (ADS)
Kalukin, Andrew
2015-05-01
Factors that degrade image quality in video and other sensor collections, such as noise, blurring, and poor resolution, also affect the spatial power spectrum of imagery. Prior research in human vision and image science from the last few decades has shown that the image power spectrum can be useful for assessing the quality of static images. The research in this article explores the possibility of using the image power spectrum to automatically evaluate full-motion video (FMV) imagery frame by frame. This procedure makes it possible to identify anomalous images and scene changes, and to keep track of gradual changes in quality as collection progresses. This article will describe a method to apply power spectral image quality metrics for images subjected to simulated blurring, blocking, and noise. As a preliminary test on videos from multiple sources, image quality measurements for image frames from 185 videos are compared to analyst ratings based on ground sampling distance. The goal of the research is to develop an automated system for tracking image quality during real-time collection, and to assign ratings to video clips for long-term storage, calibrated to standards such as the National Imagery Interpretability Rating System (NIIRS).
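A radially averaged spatial power spectrum is the usual starting point for such metrics. The sketch below computes it with NumPy; the binning and the blur-detection heuristic are our illustrative choices, not the article's calibrated NIIRS-style metric:

```python
import numpy as np

def radial_power_spectrum(img, n_bins=32):
    """Radially averaged spatial power spectrum of a 2-D image.
    Blurring and defocus suppress the high-frequency tail of this
    curve, which is what power-spectrum quality metrics exploit."""
    img = np.asarray(img, float)
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(f) ** 2
    ny, nx = img.shape
    y, x = np.indices(img.shape)
    r = np.hypot(x - nx // 2, y - ny // 2)          # radial frequency
    bins = np.linspace(0.0, r.max(), n_bins + 1)
    which = np.digitize(r.ravel(), bins) - 1
    pr = power.ravel()
    return np.array([pr[which == i].mean() if np.any(which == i) else 0.0
                     for i in range(n_bins)])
```

Comparing the high-frequency tail of this curve frame by frame is one simple way to flag blurred or degraded FMV frames automatically.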
Statistical Power of Psychological Research: What Have We Gained in 20 Years?
ERIC Educational Resources Information Center
Rossi, Joseph S.
1990-01-01
Calculated power for 6,155 statistical tests in 221 journal articles published in 1982 volumes of "Journal of Abnormal Psychology,""Journal of Consulting and Clinical Psychology," and "Journal of Personality and Social Psychology." Power to detect small, medium, and large effects was .17, .57, and .83, respectively. Concluded that power of…
How Many Studies Do You Need? A Primer on Statistical Power for Meta-Analysis
ERIC Educational Resources Information Center
Valentine, Jeffrey C.; Pigott, Therese D.; Rothstein, Hannah R.
2010-01-01
In this article, the authors outline methods for using fixed and random effects power analysis in the context of meta-analysis. Like statistical power analysis for primary studies, power analysis for meta-analysis can be done either prospectively or retrospectively and requires assumptions about parameters that are unknown. The authors provide…
Statistical Properties of Maximum Likelihood Estimators of Power Law Spectra Information
NASA Technical Reports Server (NTRS)
Howell, L. W.
2002-01-01
A simple power law model consisting of a single spectral index, a is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10(exp 13) eV, with a transition at the knee energy, E(sub k), to a steeper spectral index alpha(sub 2) greater than alpha(sub 1) above E(sub k). The Maximum likelihood (ML) procedure was developed for estimating the single parameter alpha(sub 1) of a simple power law energy spectrum and generalized to estimate the three spectral parameters of the broken power law energy spectrum from simulated detector responses and real cosmic-ray data. The statistical properties of the ML estimator were investigated and shown to have the three desirable properties: (P1) consistency (asymptotically unbiased). (P2) efficiency asymptotically attains the Cramer-Rao minimum variance bound), and (P3) asymptotically normally distributed, under a wide range of potential detector response functions. Attainment of these properties necessarily implies that the ML estimation procedure provides the best unbiased estimator possible. While simulation studies can easily determine if a given estimation procedure provides an unbiased estimate of the spectra information, and whether or not the estimator is approximately normally distributed, attainment of the Cramer-Rao bound (CRB) can only he ascertained by calculating the CRB for an assumed energy spectrum-detector response function combination, which can be quite formidable in practice. However. the effort in calculating the CRB is very worthwhile because it provides the necessary means to compare the efficiency of competing estimation techniques and, furthermore, provides a stopping rule in the search for the best unbiased estimator. Consequently, the CRB for both the simple and broken power law energy spectra are derived herein and the conditions under which they are attained in practice are investigated. The ML technique is then extended to estimate spectra information from
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Estimating statistical power for open-enrollment group treatment trials.
Morgan-Lopez, Antonio A; Saavedra, Lissette M; Hien, Denise A; Fals-Stewart, William
2011-01-01
Modeling turnover in group membership has been identified as a key barrier contributing to a disconnect between the manner in which behavioral treatment is conducted (open-enrollment groups) and the designs of substance abuse treatment trials (closed-enrollment groups, individual therapy). Latent class pattern mixture models (LCPMMs) are emerging tools for modeling data from open-enrollment groups with membership turnover in recently proposed treatment trials. The current article illustrates an approach to conducting power analyses for open-enrollment designs based on the Monte Carlo simulation of LCPMM models using parameters derived from published data from a randomized controlled trial comparing Seeking Safety to a Community Care condition for women presenting with comorbid posttraumatic stress disorder and substance use disorders. The example addresses discrepancies between the analysis framework assumed in power analyses of many recently proposed open-enrollment trials and the proposed use of LCPMM for data analysis. PMID:20832971
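Monte Carlo power analysis of the kind described above follows one template: simulate the planned design under an assumed effect, analyze each replicate with the planned test, and report the rejection rate. The sketch below uses a deliberately simple stand-in analysis (a two-group z-test) rather than the latent class pattern mixture models of the article:

```python
import numpy as np
from statistics import NormalDist

def simulated_power(n_per_group, d, alpha=0.05, n_sims=4000, seed=0):
    """Monte Carlo power: simulate the design under an assumed
    standardized effect d, run the planned test on each synthetic
    data set, and report the rejection rate. A generic sketch of
    the simulation approach, not the LCPMM analysis itself."""
    rng = np.random.default_rng(seed)
    z_crit = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    rejections = 0
    for _ in range(n_sims):
        a = rng.normal(0.0, 1.0, n_per_group)
        b = rng.normal(d, 1.0, n_per_group)
        se = np.sqrt(a.var(ddof=1) / n_per_group + b.var(ddof=1) / n_per_group)
        rejections += abs(b.mean() - a.mean()) / se > z_crit
    return rejections / n_sims
```

Swapping the z-test for any richer model (including one with group-membership turnover) changes only the inner loop, which is what makes the simulation approach attractive for open-enrollment designs.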
Efficiency statistics at all times: Carnot limit at finite power.
Polettini, M; Verley, G; Esposito, M
2015-02-01
We derive the statistics of the efficiency under the assumption that thermodynamic fluxes fluctuate with normal law, parametrizing it in terms of time, macroscopic efficiency, and a coupling parameter ζ. It has a peculiar behavior: no moments, one sub-, and one super-Carnot maxima corresponding to reverse operating regimes (engine or pump), the most probable efficiency decreasing in time. The limit ζ→0 where the Carnot bound can be saturated gives rise to two extreme situations, one where the machine works at its macroscopic efficiency, with Carnot limit corresponding to no entropy production, and one where for a transient time scaling like 1/ζ microscopic fluctuations are enhanced in such a way that the most probable efficiency approaches the Carnot limit at finite entropy production. PMID:25699428
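The claim that the efficiency has no moments can be illustrated numerically: with normally fluctuating fluxes, the efficiency is a ratio of correlated Gaussians, which has a sharp most probable value but heavy tails. All parameter values below are arbitrary illustrations, not the paper's:

```python
import numpy as np

# Sample the ratio of two correlated normal "fluxes" (output work
# rate over input work rate) and inspect its mode and tails.
rng = np.random.default_rng(3)
mean = [1.0, 2.0]                          # mean output / input rates
cov = [[0.5, 0.3], [0.3, 0.5]]             # coupled fluctuations
w_out, w_in = rng.multivariate_normal(mean, cov, 500_000).T
eta = w_out / w_in                         # fluctuating efficiency
hist, edges = np.histogram(eta, bins=400, range=(-2.0, 3.0))
mode = 0.5 * (edges[hist.argmax()] + edges[hist.argmax() + 1])
tail_mass = np.mean(np.abs(eta) > 10.0)    # nonzero: fat tails
```

The mode sits near the macroscopic value (here 1.0/2.0 = 0.5), while the nonzero tail mass reflects the rare events in which the denominator flux passes near zero, the mechanism behind the missing moments.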
Monitoring Statistics Which Have Increased Power over a Reduced Time Range.
ERIC Educational Resources Information Center
Tang, S. M.; MacNeill, I. B.
1992-01-01
The problem of monitoring trends for changes at unknown times is considered. Statistics that permit one to focus high power on a segment of the monitored period are studied. Numerical procedures are developed to compute the null distribution of these statistics. (Author)
The Statistical Power of the Cluster Randomized Block Design with Matched Pairs--A Simulation Study
ERIC Educational Resources Information Center
Dong, Nianbo; Lipsey, Mark
2010-01-01
This study uses simulation techniques to examine the statistical power of the group- randomized design and the matched-pair (MP) randomized block design under various parameter combinations. Both nearest neighbor matching and random matching are used for the MP design. The power of each design for any parameter combination was calculated from…
Asking Sensitive Questions: A Statistical Power Analysis of Randomized Response Models
ERIC Educational Resources Information Center
Ulrich, Rolf; Schroter, Hannes; Striegel, Heiko; Simon, Perikles
2012-01-01
This article derives the power curves for a Wald test that can be applied to randomized response models when small prevalence rates must be assessed (e.g., detecting doping behavior among elite athletes). These curves enable the assessment of the statistical power that is associated with each model (e.g., Warner's model, crosswise model, unrelated…
Iordache, Eugen; Dierker, Lisa; Fifield, Judith; Schensul, Jean J.; Suggs, Suzanne; Barbour, Russell
2015-01-01
The advantages of modeling the unreliability of outcomes when evaluating the comparative effectiveness of health interventions are illustrated. Adding an action-research intervention component to a regular summer job program for youth was expected to help in preventing risk behaviors. A series of simple two-group alternative structural equation models are compared to test the effect of the intervention on one key attitudinal outcome in terms of model fit and statistical power with Monte Carlo simulations. Some models presuming parameters equal across the intervention and comparison groups were underpowered to detect the intervention effect, yet modeling the unreliability of the outcome measure increased their statistical power and helped in the detection of the hypothesized effect. Comparative Effectiveness Research (CER) could benefit from flexible multi-group alternative structural models organized in decision trees, and modeling unreliability of measures can be of tremendous help for both the fit of statistical models to the data and their statistical power. PMID:26640421
On the power for linkage detection using a test based on scan statistics.
Hernández, Sonia; Siegmund, David O; de Gunst, Mathisca
2005-04-01
We analyze some aspects of scan statistics, which have been proposed to help for the detection of weak signals in genetic linkage analysis. We derive approximate expressions for the power of a test based on moving averages of the identity by descent allele sharing proportions for pairs of relatives at several contiguous markers. We confirm these approximate formulae by simulation. The results show that when there is a single trait-locus on a chromosome, the test based on the scan statistic is slightly less powerful than that based on the customary allele sharing statistic. On the other hand, if two genes having a moderate effect on a trait lie close to each other on the same chromosome, scan statistics improve power to detect linkage. PMID:15772104
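The moving-average scan statistic analyzed above can be sketched in a few lines. The null simulation below treats markers as i.i.d., which is a strong simplification of real IBD sharing (correlated along the chromosome); it illustrates the mechanics, not the paper's analytic power approximations:

```python
import numpy as np

def scan_statistic(sharing, window):
    """Maximum moving average of allele-sharing proportions over
    `window` contiguous markers."""
    kernel = np.ones(window) / window
    return np.convolve(np.asarray(sharing, float), kernel, mode="valid").max()

def scan_pvalue(sharing, window, n_sim=500, seed=0):
    """Simulated null for the scan statistic. Simplifying assumption:
    markers are i.i.d. with the observed mean and variance."""
    rng = np.random.default_rng(seed)
    sharing = np.asarray(sharing, float)
    obs = scan_statistic(sharing, window)
    mu, sd = sharing.mean(), sharing.std()
    hits = sum(scan_statistic(rng.normal(mu, sd, sharing.size), window) >= obs
               for _ in range(n_sim))
    return (hits + 1) / (n_sim + 1)
```

A localized region of elevated sharing spanning several markers produces a small p-value, which is the two-nearby-genes scenario in which the abstract finds scan statistics gain power.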
Replication Unreliability in Psychology: Elusive Phenomena or “Elusive” Statistical Power?
Tressoldi, Patrizio E.
2012-01-01
The focus of this paper is to analyze whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Applying the Null Hypothesis Statistical Testing (NHST), still the widest used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as example the results of meta-analyses related to four different controversial phenomena, subliminal semantic priming, incubation effect for problem solving, unconscious thought theory, and non-local perception, it was found that, except for semantic priming on categorization, the statistical power to detect the expected effect size (ES) of the typical study, is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low ESs. We conclude by providing some suggestions on how to increase the statistical power or use different statistical approaches to help discriminate whether the results obtained may or may not be used to support or to refute the reality of a phenomenon with small ES. PMID:22783215
Statistical tests, P values, confidence intervals, and power: a guide to misinterpretations.
Greenland, Sander; Senn, Stephen J; Rothman, Kenneth J; Carlin, John B; Poole, Charles; Goodman, Steven N; Altman, Douglas G
2016-04-01
Misinterpretation and abuse of statistical tests, confidence intervals, and statistical power have been decried for decades, yet remain rampant. A key problem is that there are no interpretations of these concepts that are at once simple, intuitive, correct, and foolproof. Instead, correct use and interpretation of these statistics requires an attention to detail which seems to tax the patience of working scientists. This high cognitive demand has led to an epidemic of shortcut definitions and interpretations that are simply wrong, sometimes disastrously so, and yet these misinterpretations dominate much of the scientific literature. In light of this problem, we provide definitions and a discussion of basic statistics that are more general and critical than typically found in traditional introductory expositions. Our goal is to provide a resource for instructors, researchers, and consumers of statistics whose knowledge of statistical theory and technique may be limited but who wish to avoid and spot misinterpretations. We emphasize how violation of often unstated analysis protocols (such as selecting analyses for presentation based on the P values they produce) can lead to small P values even if the declared test hypothesis is correct, and can lead to large P values even if that hypothesis is incorrect. We then provide an explanatory list of 25 misinterpretations of P values, confidence intervals, and power. We conclude with guidelines for improving statistical interpretation and reporting. PMID:27209009
Narayan, Manjari; Allen, Genevera I.
2016-01-01
Many complex brain disorders, such as autism spectrum disorders, exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches—R2 based on resampling and random effects test statistics, and R3 that additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R2 and R3 have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in autism spectrum disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices. PMID:27147940
NASA Astrophysics Data System (ADS)
Ma, W. T.; Sandri, G. vH.; Sarkar, S.
1991-05-01
We use the convolution power of infinite sequences to obtain a novel representation of exponential functions of power series which often arise in statistical mechanics. We thus obtain new formulas for the configuration and cluster integrals of pairwise interacting systems of molecules in an imperfect gas. We prove that the asymptotic behaviour of the Luria-Delbrück distribution is p_n ~ c n^(-2). We derive a new, simple and computationally efficient recursion relation for p_n.
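The recursion relation mentioned in the last sentence is widely known as the Ma-Sandri-Sarkar algorithm. A direct implementation of its standard form, with the heavy p_n ~ c n^(-2) tail used as a numerical check:

```python
import numpy as np

def luria_delbruck_pmf(m, n_max):
    """Luria-Delbruck distribution via the standard form of the
    Ma-Sandri-Sarkar recursion:
        p_0 = exp(-m),
        p_n = (m / n) * sum_{k=0}^{n-1} p_k / (n - k + 1),
    where m is the expected number of mutations."""
    p = np.zeros(n_max + 1)
    p[0] = np.exp(-m)
    for n in range(1, n_max + 1):
        k = np.arange(n)
        p[n] = (m / n) * np.sum(p[k] / (n - k + 1))
    return p
```

Because the tail decays only like n^(-2), the partial sums approach 1 slowly, which is why the recursion's O(n^2) cost is a worthwhile trade for exactness.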
Prospective active marker motion correction improves statistical power in BOLD fMRI
Ooi, Melvyn B.; Goldman, Robin I.; Krueger, Sascha; Thomas, William J.; Sajda, Paul; Brown, Truman R.
2013-01-01
Group level statistical maps of blood oxygenation level dependent (BOLD) signals acquired using functional magnetic resonance imaging (fMRI) have become a basic measurement for much of systems, cognitive and social neuroscience. A challenge in making inferences from these statistical maps is the noise and potential confounds that arise from the head motion that occurs within and between acquisition volumes. This motion results in the scan plane being misaligned during acquisition, ultimately leading to reduced statistical power when maps are constructed at the group level. In most cases, an attempt is made to correct for this motion through the use of retrospective analysis methods. In this paper, we use a prospective active marker motion correction (PRAMMO) system that uses radio frequency markers for real-time tracking of motion, enabling on-line slice plane correction. We show that the statistical power of the activation maps is substantially increased using PRAMMO compared to conventional retrospective correction. Analysis of our results indicates that the PRAMMO acquisition reduces the variance without decreasing the signal component of the BOLD (beta). Using PRAMMO could thus improve the overall statistical power of fMRI based BOLD measurements, leading to stronger inferences of the nature of processing in the human brain. PMID:23220430
Accuracy of Estimates and Statistical Power for Testing Mediation in Latent Growth Curve Modeling
ERIC Educational Resources Information Center
Cheong, JeeWon
2011-01-01
The latent growth curve modeling (LGCM) approach has been increasingly utilized to investigate longitudinal mediation. However, little is known about the accuracy of the estimates and statistical power when mediation is evaluated in the LGCM framework. A simulation study was conducted to address these issues under various conditions including…
Violation of statistical isotropy and homogeneity in the 21-cm power spectrum
NASA Astrophysics Data System (ADS)
Shiraishi, Maresuke; Muñoz, Julian B.; Kamionkowski, Marc; Raccanelli, Alvise
2016-05-01
Most inflationary models predict primordial perturbations to be statistically isotropic and homogeneous. Cosmic microwave background (CMB) observations, however, indicate a possible departure from statistical isotropy in the form of a dipolar power modulation at large angular scales. Alternative models of inflation, beyond the simplest single-field slow-roll models, can generate a small power asymmetry, consistent with these observations. Observations of the clustering of quasars show, however, agreement with statistical isotropy at much smaller angular scales. Here, we propose to use off-diagonal components of the angular power spectrum of the 21-cm fluctuations during the dark ages to test this power asymmetry. We forecast results for the planned SKA radio array, a future radio array, and the cosmic-variance-limited case as a theoretical proof of principle. Our results show that the 21-cm line power spectrum will enable access to information at very small scales and at different redshift slices, thus improving upon the current CMB constraints by ~2 orders of magnitude for a dipolar asymmetry and by ~1-3 orders of magnitude for a quadrupolar asymmetry.
Statistics of injected power on a bouncing ball subjected to a randomly vibrating piston.
García-Cid, Alfredo; Gutiérrez, Pablo; Falcón, Claudio; Aumaître, Sébastien; Falcon, Eric
2015-09-01
We present an experimental study on the statistical properties of the injected power needed to maintain an inelastic ball bouncing constantly on a randomly accelerating piston in the presence of gravity. We compute the injected power at each collision of the ball with the moving piston by measuring the velocity of the piston and the force exerted on the piston by the ball. The probability density function of the injected power has its most probable value close to zero and displays two asymmetric exponential tails, depending on the restitution coefficient, the piston acceleration, and its frequency content. This distribution can be deduced from a simple model assuming quasi-Gaussian statistics for the force and velocity of the piston. PMID:26465548
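The "quasi-Gaussian force and velocity" model can be illustrated with a short Monte Carlo: if F and V are jointly Gaussian with some correlation, their product I = F*V has its most probable value near zero and asymmetric tails. The correlation value 0.3 below is purely illustrative, not taken from the experiment:

```python
import math
import random

random.seed(42)

def sample_injected_power(n, rho):
    """I = F*V with F, V standard normal and correlation rho
    (a quasi-Gaussian toy model of piston force and velocity)."""
    out = []
    for _ in range(n):
        z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
        f = z1
        v = rho * z1 + math.sqrt(1 - rho ** 2) * z2
        out.append(f * v)
    return out

power = sample_injected_power(200_000, rho=0.3)
mean_p = sum(power) / len(power)                        # E[F*V] = rho
frac_neg = sum(1 for w in power if w < 0) / len(power)  # asymmetry: < 1/2
```

For jointly Gaussian variables, P(FV < 0) = 1/2 - arcsin(rho)/pi, roughly 0.40 here, so negative injected-power events remain frequent even though the mean is positive.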
NASA Astrophysics Data System (ADS)
Asal, F. F.
2012-07-01
Digital elevation data obtained from different engineering surveying techniques are utilized in generating a Digital Elevation Model (DEM), which is employed in many engineering and environmental applications. These data usually come in discrete point format, making it necessary to use an interpolation approach to create the DEM. Quality assessment of the DEM is a vital issue controlling its use in different applications; however, this assessment relies heavily on statistical methods while neglecting visual methods. This research applies visual analysis to DEMs generated using the IDW interpolator at varying powers in order to examine its potential for assessing the effects of variation of the IDW power on DEM quality. Real elevation data were collected in the field with a total station instrument in corrugated terrain. DEMs were generated from the data at a unified cell size using the IDW interpolator with power values ranging from one to ten. Visual analysis was undertaken using 2D and 3D views of the DEM; in addition, statistical analysis was performed to assess the validity of the visual techniques. The visual analysis showed that smoothing of the DEM decreases as the power value increases up to a power of four; however, increasing the power beyond four does not produce noticeable changes in the 2D and 3D views of the DEM. The statistical analysis supported these results, with the Standard Deviation (SD) of the DEM increasing with the power. More specifically, changing the power from one to two produced 36% of the total increase in SD (the increase due to changing the power from one to ten), while changing to powers of three and four gave 60% and 75%, respectively. This points to a decrease in DEM smoothing as the IDW power increases. The study also showed that applying visual methods supported by statistical
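For readers unfamiliar with the interpolator, a minimal IDW sketch (hypothetical sample points, not the study's field data) makes the role of the power parameter concrete: larger exponents weight the nearest samples more heavily, so the surface hugs local values and smooths less:

```python
import math

def idw(points, q, power):
    """Inverse-distance-weighted value at query point q = (x, y).

    points: iterable of (x, y, z) samples; power: the IDW exponent p
    in the weights w_i = 1 / d_i**p.
    """
    num = den = 0.0
    for x, y, z in points:
        d = math.hypot(q[0] - x, q[1] - y)
        if d == 0.0:
            return z                      # query coincides with a sample
        w = 1.0 / d ** power
        num += w * z
        den += w
    return num / den

pts = [(0, 0, 10.0), (1, 0, 20.0), (0, 1, 30.0), (1, 1, 40.0)]
z_p1 = idw(pts, (0.2, 0.2), 1)            # smooth blend of all samples
z_p4 = idw(pts, (0.2, 0.2), 4)            # dominated by the nearest sample
```

The query point sits closest to the z = 10 sample, so the power-4 estimate is pulled toward 10 while the power-1 estimate averages more broadly over all four samples.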
The Effect of Cluster Size Variability on Statistical Power in Cluster-Randomized Trials
Lauer, Stephen A.; Kleinman, Ken P.; Reich, Nicholas G.
2015-01-01
The frequency of cluster-randomized trials (CRTs) in peer-reviewed literature has increased exponentially over the past two decades. CRTs are a valuable tool for studying interventions that cannot be effectively implemented or randomized at the individual level. However, some aspects of the design and analysis of data from CRTs are more complex than those for individually randomized controlled trials. One of the key components to designing a successful CRT is calculating the proper sample size (i.e. number of clusters) needed to attain an acceptable level of statistical power. In order to do this, a researcher must make assumptions about the value of several variables, including a fixed mean cluster size. In practice, cluster size can often vary dramatically. Few studies account for the effect of cluster size variation when assessing the statistical power for a given trial. We conducted a simulation study to investigate how the statistical power of CRTs changes with variable cluster sizes. In general, we observed that increases in cluster size variability lead to a decrease in power. PMID:25830416
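A common closed-form approximation of this effect (not the paper's simulation itself) inflates the usual design effect by the squared coefficient of variation of cluster size, DEFF = 1 + ((CV^2 + 1)*m - 1)*ICC, and propagates it through a normal-approximation power formula. All parameter values below are illustrative:

```python
import math

def crt_power(clusters_per_arm, mean_size, cv, icc, effect, sigma=1.0):
    """Approximate power of a two-arm cluster-randomized trial.

    Cluster-size variability enters through the inflated design effect
    DEFF = 1 + ((cv**2 + 1) * mean_size - 1) * icc.
    """
    deff = 1.0 + ((cv ** 2 + 1.0) * mean_size - 1.0) * icc
    n_eff = clusters_per_arm * mean_size / deff        # effective n per arm
    se = sigma * math.sqrt(2.0 / n_eff)
    z = effect / se - 1.959963984540054                # two-sided 5% test
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))) # standard normal CDF

p_equal = crt_power(20, 50, cv=0.0, icc=0.05, effect=0.25)   # equal clusters
p_varied = crt_power(20, 50, cv=0.8, icc=0.05, effect=0.25)  # variable sizes
```

With these illustrative numbers the power drops from roughly 0.85 to roughly 0.70 purely because of cluster-size variation, matching the qualitative finding above.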
A statistical spatial power spectrum of the Earth's lithospheric magnetic field
NASA Astrophysics Data System (ADS)
Thébault, E.; Vervelidou, F.
2015-05-01
The magnetic field of the Earth's lithosphere arises from rock magnetization contrasts that were shaped over geological times. The field can be described mathematically in spherical harmonics or with distributions of magnetization. We exploit this dual representation and assume that the lithospheric field is induced by spatially varying susceptibility values within a shell of constant thickness. By introducing a statistical assumption about the power spectrum of the susceptibility, we then derive a statistical expression for the spatial power spectrum of the crustal magnetic field for spatial scales ranging from 60 to 2500 km. This expression depends on the mean induced magnetization, the thickness of the shell, and a power-law exponent for the power spectrum of the susceptibility. We test the relevance of this form with a misfit analysis against the observational NGDC-720 lithospheric magnetic field model power spectrum. This allows us to estimate a mean global apparent induced magnetization value between 0.3 and 0.6 A m-1, a mean magnetic crustal thickness value between 23 and 30 km, and a root mean square field value between 190 and 205 nT, all at the 95 per cent confidence level. These estimates are in good agreement with independent models of the crustal magnetization and of the seismic crustal thickness. We carry out the same analysis in the continental and oceanic domains separately. We complement the misfit analyses with a Kolmogorov-Smirnov goodness-of-fit test and conclude that, in each case, the observed power spectrum is consistent with being a sample of the statistical one.
Tests of Mediation: Paradoxical Decline in Statistical Power as a Function of Mediator Collinearity.
Beasley, T Mark
2014-01-01
Increasing the correlation between the independent variable and the mediator (the a coefficient) increases the effect size (ab) for mediation analysis; however, increasing a by definition increases collinearity in mediation models. As a result, the standard error of product tests increases. The variance inflation due to increases in a at some point outweighs the increase in the effect size (ab) and results in a loss of statistical power. This phenomenon also occurs with nonparametric bootstrapping approaches because the variance of the bootstrap distribution of ab approximates the variance expected from normal theory. Both variances increase dramatically when a exceeds the b coefficient, thus explaining the power decline with increases in a. Implications for statistical analysis and applied researchers are discussed. PMID:24954952
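The growth of the product estimator's sampling variance with a can be seen directly in a small Monte Carlo. The sketch below (a single-mediator model with c' = 0 and illustrative coefficient values) estimates the sampling standard deviation of a_hat*b_hat for a small and a large a at fixed b:

```python
import math
import random

random.seed(7)

def ols_slope(x, y):
    """Simple OLS slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx

def product_estimates(a, b, n=50, reps=400):
    """Sampling distribution of the mediated-effect estimate a_hat * b_hat."""
    est = []
    for _ in range(reps):
        x = [random.gauss(0, 1) for _ in range(n)]
        m = [a * xi + random.gauss(0, 1) for xi in x]
        y = [b * mi + random.gauss(0, 1) for mi in m]
        # with c' = 0 the direct X -> Y path is absent, so regressing
        # Y on M alone keeps the sketch short
        est.append(ols_slope(x, m) * ols_slope(m, y))
    return est

def sd(v):
    mu = sum(v) / len(v)
    return math.sqrt(sum((w - mu) ** 2 for w in v) / (len(v) - 1))

sd_small_a = sd(product_estimates(a=0.2, b=0.3))
sd_large_a = sd(product_estimates(a=1.2, b=0.3))
```

The standard deviation roughly doubles as a grows past b, which is the variance inflation driving the power decline described above.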
A Powerful Statistical Approach for Large-Scale Differential Transcription Analysis
Tan, Yuan-De; Chandler, Anita M.; Chaudhury, Arindam; Neilson, Joel R.
2015-01-01
Next generation sequencing (NGS) is increasingly being used for transcriptome-wide analysis of differential gene expression. NGS data are multidimensional count data, so most of the statistical methods that work well for microarray data analysis are not applicable to transcriptomic data. For this reason, a variety of new statistical methods based on count data of transcript reads have been proposed. But due to high cost and limited biological resources, current NGS data are still generated from a few replicate libraries, and some of the existing methods do not always perform well on such count data. Here we developed a powerful and robust statistical method based on the beta and binomial distributions. Our method (mBeta t-test) is specifically applicable to sequence count data from small samples. Both simulated and real transcriptomic data showed that the mBeta t-test significantly outperformed the existing top statistical methods in all 12 given scenarios and performed with high efficiency and high stability. The differentially expressed genes found by our method in real transcriptomic data were validated by qPCR experiments. Our method shows high power in finding truly differential expression, conservatively estimates the FDR, and has high stability on RNA sequence count data derived from small samples. It can also be extended to genome-wide detection of differential splicing events. PMID:25894390
Statistical Design Model (SDM) of power supply and communication subsystem's Satellite
NASA Astrophysics Data System (ADS)
Mirshams, Mehran; Zabihian, Ehsan; Zabihian, Ahmadreza
In designing the power supply and communication subsystems of satellites, most approaches and relations are empirical and statistical; moreover, because the aerospace sciences and their connections with other engineering fields such as electrical engineering are still young, there are no analytic or fully proven empirical relations in many areas. We therefore consider the statistical design of these subsystems. The approach presented in this paper is entirely innovative, and all parts of the power supply and communication subsystems of the satellite are specified. In codifying this approach, data from 602 satellites and software packages such as SPSS have been used. After proposing the design procedure, the approach determines the total power needed by the satellite, the mass of the power supply and communication subsystems, the power needed by the communication subsystem, the working band, the type of antenna, the number of transponders, the material of the solar array, and finally the placement of the arrays on the satellite. All these parts are designed based on the mission of the satellite and its weight class. This procedure increases the performance rate, avoids wasting energy, and reduces costs. Keywords: database, statistical model, design procedure, power supply subsystem, communication subsystem
Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays
NASA Astrophysics Data System (ADS)
Sibatov, R. T.
2011-08-01
A new statistical model of charge transport in colloidal quantum dot arrays is proposed. It takes into account Coulomb blockade, which forbids multiple occupancy of nanocrystals, and the influence of energetic disorder in the interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of Ohm's law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
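The Scher-Montroll picture behind such transients can be reproduced with a minimal continuous-time random walk: heavy-tailed waiting times psi(t) ~ t^(-1-alpha) in the localized states make the mean event rate, a crude proxy for the transient current, decay as t^(alpha-1). The parameter alpha = 0.5 and carrier count below are illustrative, and Coulomb-blockade effects are ignored:

```python
import math
import random

random.seed(3)

alpha = 0.5                                 # dispersion parameter (illustrative)
n_carriers = 20_000
t_max = 200.0
windows = [(10.0, 20.0), (100.0, 200.0)]    # two log-spaced time windows
counts = [0] * len(windows)

for _ in range(n_carriers):
    t = 0.0
    while t < t_max:
        u = 1.0 - random.random()           # u in (0, 1]
        t += u ** (-1.0 / alpha)            # Pareto(alpha) waiting time, min 1
        for k, (lo, hi) in enumerate(windows):
            if lo <= t < hi:
                counts[k] += 1

# event rate per carrier per unit time in each window
rates = [counts[k] / (hi - lo) / n_carriers
         for k, (lo, hi) in enumerate(windows)]
# log-log slope between the windows; theory predicts alpha - 1 = -0.5
slope = math.log(rates[1] / rates[0]) / math.log(10.0)
```

The fitted slope near -0.5 is the power-law current decay I(t) ~ t^(alpha-1) that the fractional-differential description formalizes.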
NASA Astrophysics Data System (ADS)
Najac, Julien
2014-05-01
For many applications in the energy sector, it is crucial to dispose of downscaling methods that enable to conserve space-time dependences at very fine spatial and temporal scales between variables affecting electricity production and consumption. For climate change impact studies, this is an extremely difficult task, particularly as reliable climate information is usually found at regional and monthly scales at best, although many industry oriented applications need further refined information (hydropower production model, wind energy production model, power demand model, power balance model…). Here we thus propose to investigate the question of how to predict and quantify the influence of climate change on climate-related energies and the energy demand. To do so, statistical downscaling methods originally developed for studying climate change impacts on hydrological cycles in France (and which have been used to compute hydropower production in France), have been applied for predicting wind power generation in France and an air temperature indicator commonly used for predicting power demand in France. We show that those methods provide satisfactory results over the recent past and apply this methodology to several climate model runs from the ENSEMBLES project.
NASA Astrophysics Data System (ADS)
Osato, Ken; Shirasaki, Masato; Yoshida, Naoki
2015-06-01
We study the impact of baryonic physics on cosmological parameter estimation with weak-lensing surveys. We run a set of cosmological hydrodynamics simulations with different galaxy formation models. We then perform ray-tracing simulations through the total matter density field to generate 100 independent convergence maps with a field of view of 25 deg², and we use them to examine the ability of the following three lensing statistics as cosmological probes: power spectrum (PS), peak counts, and Minkowski functionals (MFs). For the upcoming wide-field observations, such as the Subaru Hyper Suprime-Cam (HSC) survey with a sky coverage of 1400 deg², these three statistics provide tight constraints on the matter density, density fluctuation amplitude, and dark energy equation of state, but parameter bias is induced by baryonic processes such as gas cooling and stellar feedback. When we use PS, peak counts, and MFs, the magnitude of the relative bias in the dark energy equation of state parameter w is at a level of, respectively, δw ~ 0.017, 0.061, and 0.0011. For the HSC survey, these values are smaller than the statistical errors estimated from Fisher analysis. The bias could be significant when the statistical errors become small in future observations with a much larger survey area. We find that the bias is induced in different directions in the parameter space depending on the statistics employed. While the two-point statistic, i.e., PS, yields robust results against baryonic effects, its overall constraining power is weak compared with peak counts and MFs. On the other hand, using one of peak counts or MFs, or combined analysis with multiple statistics, results in a biased parameter estimate. The bias can be as large as 1σ for the HSC survey and will be more significant for upcoming wider-area surveys. We suggest using an optimized combination so that the baryonic effects on parameter estimation are mitigated. Such a “calibrated” combination can
Power flow as a complement to statistical energy analysis and finite element analysis
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1987-01-01
Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly, while SEA methods can only predict an average level. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. The power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies to verify the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.
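The mobility formulation is easy to state concretely: the time-averaged power injected by a harmonic point force F at frequency w is P = ½|F|²·Re{Y(w)}, where Y is the input mobility. A minimal sketch for a single damped oscillator (illustrative parameter values, not the coupled-beam system of the paper) shows the machinery:

```python
# Time-averaged input power via the point mobility of a damped
# single-degree-of-freedom oscillator: Y(w) = i*w / (k - w**2*m + i*w*c).

def input_power(force_amp, omega, m, k, c):
    """P = 0.5 * |F|^2 * Re(Y) for a harmonic force of amplitude force_amp."""
    mobility = 1j * omega / (k - omega ** 2 * m + 1j * omega * c)
    return 0.5 * force_amp ** 2 * mobility.real

m, k, c = 1.0, 100.0, 0.5                       # resonance at omega_0 = 10 rad/s
p_resonant = input_power(1.0, 10.0, m, k, c)    # Re(Y) = 1/c at resonance
p_off = input_power(1.0, 3.0, m, k, c)          # far below resonance
```

At resonance the mobility is purely real (Y = 1/c), so all injected power is dissipated; off resonance Re{Y} collapses and so does the power input. Transfer mobilities between coupled components generalize the same expression to the flow of energy between structures.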
Assessing statistical power of SNPs for population structure and conservation studies.
Morin, Phillip A; Martien, Karen K; Taylor, Barbara L
2009-01-01
Single nucleotide polymorphisms (SNPs) have been proposed by some as the new frontier for population studies, and several papers have presented theoretical and empirical evidence reporting the advantages and limitations of SNPs. As a practical matter, however, it remains unclear how many SNP markers will be required, or what the optimal characteristics of those markers should be, in order to obtain sufficient statistical power to detect different levels of population differentiation. We use a hypothetical case to illustrate the process of designing a population genetics project, and present results from simulations that address several issues for maximizing statistical power to detect differentiation while minimizing the amount of effort in developing SNPs. Results indicate that (i) while ~30 SNPs should be sufficient to detect moderate (F(ST) = 0.01) levels of differentiation, studies aimed at detecting demographic independence (e.g. F(ST) < 0.005) may require 80 or more SNPs and large sample sizes; (ii) different SNP allele frequencies have little effect on power, and thus selection of SNPs can be relatively unbiased; (iii) increasing the sample size has a strong effect on power, so the number of loci can be minimized when sample number is known, and increasing sample size is almost always beneficial; and (iv) power is increased by including multiple SNPs within loci and inferring haplotypes, rather than trying to use only unlinked SNPs. This also has the practical benefit of reducing the SNP ascertainment effort, and may influence the decision of whether to seek SNPs in coding or noncoding regions. PMID:21564568
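Result (i) can be checked with a compact Monte Carlo under the Balding-Nichols model: population allele frequencies are drawn around an ancestral frequency with variance F_ST·p(1-p), allele counts are sampled, and per-SNP chi-square statistics are summed. Sample sizes, SNP count, and the uniform ancestral-frequency range are illustrative choices, not the paper's exact design:

```python
import random

random.seed(11)

def sim_stat(fst, n_snps, n_diploids):
    """Sum of per-SNP two-proportion chi-square statistics between two
    populations whose allele frequencies follow the Balding-Nichols model."""
    total = 0.0
    n = 2 * n_diploids                           # alleles sampled per population
    for _ in range(n_snps):
        p = random.uniform(0.2, 0.8)             # ancestral frequency
        if fst > 0:
            a = p * (1 - fst) / fst              # Beta(a, b) has mean p and
            b = (1 - p) * (1 - fst) / fst        # variance fst * p * (1 - p)
            p1, p2 = random.betavariate(a, b), random.betavariate(a, b)
        else:
            p1 = p2 = p
        x1 = sum(random.random() < p1 for _ in range(n))
        x2 = sum(random.random() < p2 for _ in range(n))
        pbar = (x1 + x2) / (2 * n)
        if 0.0 < pbar < 1.0:
            total += (x1 / n - x2 / n) ** 2 / (2 * pbar * (1 - pbar) / n)
    return total

null = sorted(sim_stat(0.0, 30, 50) for _ in range(300))
crit = null[int(0.95 * len(null))]           # empirical 5% critical value
power = sum(sim_stat(0.01, 30, 50) > crit for _ in range(300)) / 300
```

With 30 SNPs and 50 diploids per population, the detection power for F_ST = 0.01 comes out well above one half, consistent with the "~30 SNPs should be sufficient" conclusion.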
Statistical analysis of the cosmic microwave background: Power spectra and foregrounds
NASA Astrophysics Data System (ADS)
O'Dwyer, Ian J.
2005-11-01
In this thesis I examine some of the challenges associated with analyzing Cosmic Microwave Background (CMB) data and present a novel approach to solving the problem of power spectrum estimation, called MAGIC (MAGIC Allows Global Inference of Covariance). In light of the computational difficulty of a brute-force approach to power spectrum estimation, I review several approaches that have been applied to the problem and show an example application of such an approximate method to experimental CMB data from the Background Emission Anisotropy Scanning Telescope (BEAST). I then introduce MAGIC, a new approach to power spectrum estimation based on a Bayesian statistical analysis of the data utilizing Gibbs sampling. I demonstrate application of this method to the all-sky Wilkinson Microwave Anisotropy Probe (WMAP) data. The results are in broad agreement with those obtained originally by the WMAP team. Since MAGIC generates a full description of each Cℓ, it is possible to examine several issues raised by the best-fit WMAP power spectrum, for example the perceived lack of power at low ℓ. It is found that the distribution of the Cℓ's at low ℓ is significantly non-Gaussian and, based on the exact analysis presented here, the “low quadrupole issue” can be attributed to a statistical fluctuation. Finally, I examine the effect of Galactic foreground contamination on CMB experiments and describe the principal foregrounds. I show that it is possible to include the foreground components in a self-consistent fashion within the statistical framework of MAGIC and give explicit examples of how this might be achieved. Foreground contamination will become an increasingly important issue in CMB data analysis, and the ability of this new algorithm to produce an exact power spectrum in a computationally feasible time, coupled with foreground component separation and removal, is an exciting development in CMB data analysis. When considered with current algorithmic developments
Statistics of the radiated field of a space-to-earth microwave power transfer system
NASA Technical Reports Server (NTRS)
Stevens, G. H.; Leininger, G.
1976-01-01
Statistics such as average power density pattern, variance of the power density pattern and variance of the beam pointing error are related to hardware parameters such as transmitter rms phase error and rms amplitude error. Also a limitation on spectral width of the phase reference for phase control was established. A 1 km diameter transmitter appears feasible provided the total rms insertion phase errors of the phase control modules does not exceed 10 deg, amplitude errors do not exceed 10% rms, and the phase reference spectral width does not exceed approximately 3 kHz. With these conditions the expected radiation pattern is virtually the same as the error free pattern, and the rms beam pointing error would be insignificant (approximately 10 meters).
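The sensitivity to rms phase and amplitude errors can be sketched with a Monte Carlo over an array of independent elements; the element count and trial count below are arbitrary illustration choices, not the 1 km transmitter design. With 10 deg rms phase error and 10% rms amplitude error, the mean boresight power stays within a few per cent of the error-free value, consistent with the conclusion above:

```python
import math
import random

random.seed(5)

def mean_boresight_power(n_elements, rms_phase_deg, rms_amp, trials=2000):
    """Mean on-axis power of an array with independent per-element phase
    and fractional amplitude errors, relative to the error-free array."""
    sigma = math.radians(rms_phase_deg)
    acc = 0.0
    for _ in range(trials):
        re = im = 0.0
        for _ in range(n_elements):
            amp = 1.0 + random.gauss(0.0, rms_amp)
            phi = random.gauss(0.0, sigma)
            re += amp * math.cos(phi)
            im += amp * math.sin(phi)
        acc += (re * re + im * im) / n_elements ** 2
    return acc / trials

rel_power = mean_boresight_power(256, rms_phase_deg=10.0, rms_amp=0.10)
```

A Ruze-style estimate gives exp(-sigma_phi**2) ≈ 0.97 for the phase term alone, so a rel_power near 0.97 confirms that these tolerances barely perturb the radiation pattern.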
Statistical distribution of pioneer vegetation: the role of local stream power
NASA Astrophysics Data System (ADS)
Crouzy, B.; Edmaier, K.; Pasquale, N.; Perona, P.
2012-12-01
We discuss results of a flume experiment on the colonization of river bars by pioneer vegetation and focus on the role of a non-constant local stream power in determining the statistics of riverbed and uprooted biomass characteristics (root length, number of roots and stem height). We verify the conjecture that the statistical distribution of riverbed vegetation subject to the action of flood disturbances can be obtained from the distribution before the flooding events combined to the relative resilience to floods of plants with given traits. By using fast growing vegetation (Avena sativa) we can access the competition between growth-associated root stabilization and uprooting by floods. We fix the hydrological timescale (in our experiment the arrival time between periodic flooding events) to be comparable with the biological timescales (plant germination and development rates). The sequence of flooding events is repeated until the surviving riverbed vegetation has grown out of scale with the uprooting capacity of the flood and the competition has stopped. We present and compare laboratory results obtained using converging and parallel channel walls to highlight the role of the local stream power in the process. The convergent geometry can be seen as the laboratory analog of different field conditions. At the scale of the bar it represents regions with flow concentration while at a larger scale it is an analog for a river with convergent banks, for an example see the work on the Tagliamento River by Gurnell and Petts (2006). As expected, we observe that for the convergent geometry the variability in the local stream power results in a longer tail of the distribution of root length for uprooted material compared to parallel geometries with an equal flow rate. More surprisingly, the presence of regions with increased stream power in the convergent experiments allows us to access two fundamentally different regimes. We observe that depending on the development stage
Detecting trends in raptor counts: power and type I error rates of various statistical tests
Hatfield, J.S.; Gould, W.R., IV; Hoover, B.A.; Fuller, M.R.; Lindquist, E.L.
1996-01-01
We conducted simulations that estimated power and type I error rates of statistical tests for detecting trends in raptor population count data collected from a single monitoring site. Results of the simulations were used to help analyze count data of bald eagles (Haliaeetus leucocephalus) from 7 national forests in Michigan, Minnesota, and Wisconsin during 1980-1989. Seven statistical tests were evaluated, including simple linear regression on the log scale and linear regression with a permutation test. Using 1,000 replications each, we simulated n = 10 and n = 50 years of count data and trends ranging from -5 to 5% change/year. We evaluated the tests at 3 critical levels (alpha = 0.01, 0.05, and 0.10) for both upper- and lower-tailed tests. Exponential count data were simulated by adding sampling error with a coefficient of variation of 40% from either a log-normal or autocorrelated log-normal distribution. Not surprisingly, tests performed with 50 years of data were much more powerful than tests with 10 years of data. Positive autocorrelation inflated alpha-levels upward from their nominal levels, making the tests less conservative and more likely to reject the null hypothesis of no trend. Of the tests studied, Cox and Stuart's test and Pollard's test clearly had lower power than the others. Surprisingly, the linear regression t-test, Collins' linear regression permutation test, and the nonparametric Lehmann's and Mann's tests all had similar power in our simulations. Analyses of the count data suggested that bald eagles had increasing trends on at least 2 of the 7 national forests during 1980-1989.
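The core of such a simulation fits in a few lines: generate log-normally noisy counts around an exponential trend, regress log counts on year, and count rejections. The sketch below reproduces only the simplest of the seven tests (the linear regression t-test on the log scale, without autocorrelation), with hard-coded two-sided 5% t critical values for the two series lengths used:

```python
import math
import random

random.seed(9)

T_CRIT = {8: 2.306, 48: 2.011}      # two-sided 5% t critical values by df

def trend_test_rejects(years, pct_change, cv):
    """Simulate one count series and t-test the log-scale trend."""
    sigma = math.sqrt(math.log(1.0 + cv ** 2))   # log-normal sd from the CV
    y = [math.log(100.0 * (1.0 + pct_change) ** t) + random.gauss(0.0, sigma)
         for t in range(years)]
    x = list(range(years))
    n = years
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    beta = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    resid = [(yi - my) - beta * (xi - mx) for xi, yi in zip(x, y)]
    s2 = sum(r * r for r in resid) / (n - 2)
    return abs(beta) / math.sqrt(s2 / sxx) > T_CRIT[n - 2]

power_10 = sum(trend_test_rejects(10, 0.05, 0.4) for _ in range(400)) / 400
power_50 = sum(trend_test_rejects(50, 0.05, 0.4) for _ in range(400)) / 400
```

A 5%/year trend that is nearly undetectable in 10 years of counts with 40% sampling error becomes almost certain to be detected in 50 years, mirroring the simulation finding above.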
Notes on the Statistical Power of the Binary State Speciation and Extinction (BiSSE) Model
Gamisch, Alexander
2016-01-01
The Binary State Speciation and Extinction (BiSSE) method is one of the most popular tools for investigating the rates of diversification and character evolution. Yet, based on previous simulation studies, it is commonly held that the BiSSE method requires phylogenetic trees of fairly large sample sizes (>300 taxa) in order to distinguish between the different models of speciation, extinction, or transition rate asymmetry. Here, the power of the BiSSE method is reevaluated by simulating trees of both small and large sample sizes (30, 60, 90, and 300 taxa) under various asymmetry models and root state assumptions. Results show that the power of the BiSSE method can be much higher, also in trees of small sample size, for detecting differences in speciation rate asymmetry than anticipated earlier. This, however, is not a consequence of any conceptual or mathematical flaw in the method per se but rather of assumptions about the character state at the root of the simulated trees and thus the underlying macroevolutionary model, which led to biased results and conclusions in earlier power assessments. As such, these earlier simulation studies used to determine the power of BiSSE were not incorrect but biased, leading to an overestimation of type-II statistical error for detecting differences in speciation rate but not for extinction and transition rates. PMID:27486297
Integrating statistical genetic and geospatial methods brings new power to phylogeography.
Chan, Lauren M; Brown, Jason L; Yoder, Anne D
2011-05-01
The field of phylogeography continues to grow in terms of power and accessibility. Initially uniting population genetics and phylogenetics, it now spans disciplines as diverse as geology, statistics, climatology, ecology, physiology, and bioinformatics to name a few. One major and recent integration driving the field forward is between "statistical phylogeography" and Geographic Information Systems (GIS) (Knowles, 2009). Merging genetic and geospatial data, and their associated methodological toolkits, is helping to bring explicit hypothesis testing to the field of phylogeography. Hypotheses derived from one approach can be reciprocally tested with data derived from the other field and the synthesis of these data can help place demographic events in an historical and spatial context, guide genetic sampling, and point to areas for further investigation. Here, we present three practical examples of empirical analysis that integrate statistical genetic and GIS tools to construct and test phylogeographic hypotheses. Insights into the evolutionary mechanisms underlying recent divergences can benefit from simultaneously considering diverse types of information to iteratively test and reformulate hypotheses. Our goal is to provide the reader with an introduction to the variety of available tools and their potential application to typical questions in phylogeography with the hope that integrative methods will be more broadly and commonly applied to other biological systems and data sets. PMID:21352934
Kinetic power of quasars and statistical excess of MOJAVE superluminal motions
NASA Astrophysics Data System (ADS)
López-Corredoira, M.; Perucho, M.
2012-08-01
Aims: The MOJAVE (MOnitoring of Jets in AGN with VLBA Experiments) survey contains 101 quasars with a total of 354 observed radio components that are different from the radio cores, among which 95% move with apparent projected superluminal velocities with respect to the core, and 45% have projected velocities larger than 10c (with a maximum velocity of 60c). We try to determine whether this distribution is statistically probable, and we make an independent measure of the kinetic power required in the quasars to produce such powerful ejections. Methods: Doppler boosting effects are analyzed to determine the statistics of the superluminal motions. We integrate, over all possible values of the Lorentz factor, the values of the kinetic energy corresponding to each component. The calculation of the mass in the ejection is carried out by assuming the minimum energy state, i.e., that the magnetic field and particle energy distributions are arranged in the most efficient way to produce the observed synchrotron emission. This kinetic energy is multiplied by the frequency at which the portions of the jet fluid identified as "blobs" are produced. Hence, we estimate the average total power released by the quasars in the form of kinetic energy in the long term on pc-scales. Results: A selection effect in which both the core and the blobs of the quasar are affected by huge Doppler-boosting enhancement increases the probability of finding a jet ejected within 10 degrees of the line of sight ≳ 40 times above what one would expect for a random distribution of ejection, which explains the ratios of the very high projected velocities given above. The average total kinetic power of each MOJAVE quasar should be very high to obtain this distribution: ~ 7 × 1047 erg/s. This amount is much higher than previous estimates of kinetic power on kpc-scales based on the analysis of cavities in X-ray gas or radio lobes in samples of objects of much lower radio luminosity but similar black hole
Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants
NASA Astrophysics Data System (ADS)
Rajasekar, Vidyashree
This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and the components/chemicals used in artificial soil formulation are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance, and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si due to its smaller cell size were eliminated on the mono-Si coupons with large cells, yielding highly repeatable measurements. This study indicates that reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years under the cold-dry climate of New York were evaluated. Defect chart, degradation rates (at both string and module levels) and safety map were generated using the field-measured data. A statistical reliability tool, FMECA, which uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly
Case Studies for the Statistical Design of Experiments Applied to Powered Rotor Wind Tunnel Tests
NASA Technical Reports Server (NTRS)
Overmeyer, Austin D.; Tanner, Philip E.; Martin, Preston B.; Commo, Sean A.
2015-01-01
The application of statistical Design of Experiments (DOE) to helicopter wind tunnel testing was explored during two powered rotor wind tunnel entries during the summers of 2012 and 2013. These tests were performed jointly by the U.S. Army Aviation Development Directorate Joint Research Program Office and NASA Rotary Wing Project Office, currently the Revolutionary Vertical Lift Project, at NASA Langley Research Center located in Hampton, Virginia. Both entries were conducted in the 14- by 22-Foot Subsonic Tunnel with a small portion of the overall tests devoted to developing case studies of the DOE approach as it applies to powered rotor testing. A 16-47 times reduction in the number of data points required was estimated by comparing the DOE approach to conventional testing methods. The average error for the DOE surface response model for the OH-58F test was 0.95 percent and 4.06 percent for drag and download, respectively. The DOE surface response model of the Active Flow Control test captured the drag within 4.1 percent of measured data. The operational differences between the two testing approaches are identified, but did not prevent the safe operation of the powered rotor model throughout the DOE test matrices.
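The quoted 16-47 times reduction in required data points reflects the usual run-count arithmetic of designed experiments: a response-surface design replaces an exhaustive grid of test conditions. A back-of-envelope sketch (the factor and level counts here are hypothetical, not the actual rotor test matrices):

```python
def full_grid_runs(levels_per_factor, n_factors):
    # conventional sweep: every combination of every factor level is measured
    return levels_per_factor ** n_factors

def ccd_runs(n_factors, n_center=4):
    # face-centered central composite design: factorial + axial + center points
    return 2 ** n_factors + 2 * n_factors + n_center

conventional = full_grid_runs(levels_per_factor=7, n_factors=3)  # 343 data points
doe = ccd_runs(n_factors=3)                                      # 18 data points
reduction = conventional / doe
```

The DOE run count grows roughly linearly-plus-exponentially in the number of factors, while a full grid grows as a power, which is why the savings widen as more factors are added.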
Conductance statistics for the power-law banded random matrix model
Martinez-Mendoza, A. J.; Mendez-Bermudez, J. A.; Varga, Imre
2010-12-21
We study numerically the conductance statistics of the one-dimensional (1D) Anderson model with random long-range hoppings described by the Power-law Banded Random Matrix (PBRM) model. Within a scattering approach to electronic transport, we consider two scattering setups, in the absence and presence of direct processes: 2M single-mode leads attached to one side and to opposite sides of 1D circular samples. For both setups we show that (i) the probability distribution of the logarithm of the conductance T behaves as w(ln T) ∝ T^(M^2/2), for T <<
NASA Astrophysics Data System (ADS)
Bianucci, M.
2016-01-01
This letter has two main goals. The first one is to give a physically reasonable explanation for the use of stochastic models for mimicking the apparent random features of the El Niño-Southern Oscillation (ENSO) phenomenon. The second one is to obtain, from the theory, an analytical expression for the equilibrium density function of the sea surface temperature anomaly, an expression that fits the data from observations well, reproducing the asymmetry and the power law tail of the histograms of the NIÑO3 index. We succeed in these tasks by exploiting some recent theoretical results of the author in the field of the dynamical origin of stochastic processes. More precisely, we apply this approach to the celebrated recharge oscillator model (ROM), weakly interacting via a multiplicative term with a general deterministic complex forcing (Madden-Julian Oscillations, westerly wind bursts, etc.), and we obtain a Fokker-Planck equation that describes the statistical behavior of the ROM.
Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van't
2012-03-15
Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
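The "repeated cross-validation scheme" used above for a fair comparison of the learning methods amounts to re-randomized K-fold index bookkeeping, so every method is scored on identical held-out folds. A minimal sketch of that bookkeeping (fold and repeat counts are arbitrary):

```python
import random

def repeated_kfold(n_samples, k=5, repeats=3, seed=0):
    """Yield (repeat, fold, train_idx, test_idx); each repeat partitions all samples once."""
    rng = random.Random(seed)
    for r in range(repeats):
        idx = list(range(n_samples))
        rng.shuffle(idx)
        folds = [idx[i::k] for i in range(k)]  # k roughly equal folds
        for f, test in enumerate(folds):
            train = [i for i in idx if i not in set(test)]
            yield r, f, train, test

splits = list(repeated_kfold(20, k=5, repeats=3))  # 15 train/test splits in total
```

Each candidate model (stepwise, LASSO, BMA, etc.) would be fit on `train` and scored on `test` for every split, and the scores averaged; reusing the same splits across methods is what makes the comparison fair.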
Wagner, Tyler; Irwin, Brian J.; Bence, James R.; Hayes, Daniel B.
2016-01-01
Monitoring to detect temporal trends in biological and habitat indices is a critical component of fisheries management. Thus, it is important that management objectives are linked to monitoring objectives. This linkage requires a definition of what constitutes a management-relevant “temporal trend.” It is also important to develop expectations for the amount of time required to detect a trend (i.e., statistical power) and for choosing an appropriate statistical model for analysis. We provide an overview of temporal trends commonly encountered in fisheries management, review published studies that evaluated statistical power of long-term trend detection, and illustrate dynamic linear models in a Bayesian context, as an additional analytical approach focused on shorter term change. We show that monitoring programs generally have low statistical power for detecting linear temporal trends and argue that often management should be focused on different definitions of trends, some of which can be better addressed by alternative analytical approaches.
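The low power reported above for linear trend detection can be reproduced with a generic simulation: generate monitored values with a known slope plus noise, test the slope, and count rejections. A sketch using a permutation test on the OLS slope (all parameter values are illustrative, not taken from the reviewed studies):

```python
import random

def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def trend_power(b, n_years=20, sigma=1.0, n_sim=200, n_perm=199, alpha=0.05, seed=2):
    """Fraction of simulated series in which a permutation test on the slope rejects H0."""
    rng = random.Random(seed)
    xs = list(range(n_years))
    hits = 0
    for _ in range(n_sim):
        ys = [b * x + rng.gauss(0, sigma) for x in xs]
        obs = abs(slope(xs, ys))
        exceed = 0
        for _ in range(n_perm):
            perm = ys[:]
            rng.shuffle(perm)  # break any time ordering under H0
            if abs(slope(xs, perm)) >= obs:
                exceed += 1
        if (exceed + 1) / (n_perm + 1) < alpha:
            hits += 1
    return hits / n_sim

power_strong = trend_power(0.15)  # clear trend relative to the noise
power_null = trend_power(0.0)     # no trend: rejection rate approximates alpha
```

Shrinking `b` or `n_years`, or inflating `sigma`, drives the rejection fraction toward `alpha`, which is the low-power regime the review describes for many monitoring programs.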
Power-law distributions in economics: a nonextensive statistical approach (Invited Paper)
NASA Astrophysics Data System (ADS)
Duarte Queiros, Silvio M.; Anteneodo, Celia; Tsallis, Constantino
2005-05-01
The cornerstone of Boltzmann-Gibbs (BG) statistical mechanics is the Boltzmann-Gibbs-Jaynes-Shannon entropy S_BG ≡ -k ∫ dx f(x) ln f(x), where k is a positive constant and f(x) a probability density function. This theory has exhibited, along more than one century, great success in the treatment of systems where short spatio-temporal correlations dominate. There are, however, anomalous natural and artificial systems that violate the basic requirements for its applicability. Different physical entropies, other than the standard one, appear to be necessary in order to satisfactorily deal with such anomalies. One such entropy is S_q ≡ k (1 - ∫ dx [f(x)]^q)/(q - 1) (with S_1 = S_BG), where the entropic index q is a real parameter. It has been proposed as the basis for a generalization, referred to as nonextensive statistical mechanics, of the BG theory. S_q shares with S_BG four remarkable properties, namely concavity (∀q > 0), Lesche-stability (∀q > 0), finiteness of the entropy production per unit time (q < 2), and additivity (for at least a compact support of q including q = 1). The simultaneous validity of these properties suggests that S_q is appropriate for bridging, at a macroscopic level, with classical thermodynamics itself. In the same natural way that exponential probability functions arise in the standard context, power-law tailed distributions, even with exponents outside the Lévy range, arise in the nonextensive framework. In this review, we intend to show that many processes of interest in economics, for which fat-tailed probability functions are empirically observed, can be described in terms of the statistical mechanics that underlies the nonextensive theory.
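For a discrete distribution the entropies above take the form S_BG = -k Σ p_i ln p_i and S_q = k (1 - Σ p_i^q)/(q - 1), with S_q → S_BG as q → 1. A small numerical check of that limit:

```python
import math

def tsallis_entropy(p, q, k=1.0):
    """Discrete S_q = k (1 - sum p_i^q) / (q - 1); reduces to Shannon entropy as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -k * sum(pi * math.log(pi) for pi in p if pi > 0)
    return k * (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

uniform4 = [0.25] * 4  # for a uniform distribution over N states, S_1 = k ln N
```

For the uniform case the closed form is S_q = k (1 - N^(1-q))/(q - 1), so e.g. S_2 = 1 - 4·(1/16) = 0.75 with k = 1.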
Eisenberg, Dan T.A.; Kuzawa, Christopher W.; Hayes, M. Geoffrey
2015-01-01
Objectives Telomere length (TL) is commonly measured using quantitative PCR (qPCR). Although easier than the southern blot of terminal restriction fragments (TRF) TL measurement method, one drawback of qPCR is that it introduces greater measurement error and thus reduces the statistical power of analyses. To address a potential source of measurement error, we consider the effect of well position on qPCR TL measurements. Methods qPCR TL data from 3,638 people run on a Bio-Rad iCycler iQ are reanalyzed here. To evaluate measurement validity, correspondence with TRF, age and between mother and offspring are examined. Results First, we present evidence for systematic variation in qPCR TL measurements in relation to thermocycler well position. Controlling for these well-position effects consistently improves measurement validity and yields estimated improvements in statistical power equivalent to increasing sample sizes by 16%. We additionally evaluated the linearity of the relationships between telomere and single copy gene control amplicons and between qPCR and TRF measures. We find that, unlike some previous reports, our data exhibit linear relationships. We introduce the standard error in percent, a superior method for quantifying measurement error compared to the commonly used coefficient of variation. Using this measure, we find that excluding samples with high measurement error does not improve measurement validity. Conclusions Future studies using block-based thermocyclers should consider well position effects. Since additional information can be gleaned from well position corrections, re-running analyses of previous results with well position correction could serve as an independent test of the validity of these results. PMID:25757675
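The well-position correction described above can be imitated with a toy plate model: inject a systematic row effect, then remove each row's mean. The plate geometry and effect sizes below are invented for illustration; the study's actual correction was fit to real thermocycler data:

```python
import random
from statistics import pvariance

rng = random.Random(3)
rows, cols = 8, 12  # hypothetical 96-well plate layout
row_bias = [rng.gauss(0, 0.5) for _ in range(rows)]  # systematic well-position effect
plate = [[10.0 + row_bias[r] + rng.gauss(0, 0.3) for _ in range(cols)]
         for r in range(rows)]

raw = [v for row in plate for v in row]
grand = sum(raw) / len(raw)

# correct each well by subtracting its row mean, restoring the grand mean
corrected = []
for row in plate:
    m = sum(row) / len(row)
    corrected.extend(v - m + grand for v in row)

var_raw, var_corr = pvariance(raw), pvariance(corrected)
```

Because total variance decomposes into between-row plus within-row parts, subtracting row means removes the between-row (position) component entirely; the leftover variance is the biological-plus-assay noise, which is what shrinking measurement error to gain statistical power is about.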
Using genomic annotations increases statistical power to detect eGenes
Duong, Dat; Zou, Jennifer; Hormozdiari, Farhad; Sul, Jae Hoon; Ernst, Jason; Han, Buhm; Eskin, Eleazar
2016-01-01
Motivation: Expression quantitative trait loci (eQTLs) are genetic variants that affect gene expression. In eQTL studies, one important task is to find eGenes or genes whose expressions are associated with at least one eQTL. The standard statistical method to determine whether a gene is an eGene requires association testing at all nearby variants and the permutation test to correct for multiple testing. The standard method however does not consider genomic annotation of the variants. In practice, variants near gene transcription start sites (TSSs) or certain histone modifications are likely to regulate gene expression. In this article, we introduce a novel eGene detection method that considers this empirical evidence and thereby increases the statistical power. Results: We applied our method to the liver Genotype-Tissue Expression (GTEx) data using distance from TSSs, DNase hypersensitivity sites, and six histone modifications as the genomic annotations for the variants. Each of these annotations helped us detect more candidate eGenes. Distance from TSS appears to be the most important annotation; specifically, using this annotation, our method discovered 50% more candidate eGenes than the standard permutation method. Contact: buhm.han@amc.seoul.kr or eeskin@cs.ucla.edu PMID:27307612
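The standard eGene test that the authors build on pairs a best-association statistic over nearby variants with a permutation null. A minimal unweighted sketch (the annotation weighting that is the paper's contribution is not shown; data are synthetic):

```python
import random

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

def egene_perm_p(genotypes, expression, n_perm=500, seed=4):
    """Permutation p-value for the best (max |r|) variant-expression association."""
    rng = random.Random(seed)
    obs = max(abs(corr(g, expression)) for g in genotypes)
    exceed = 0
    expr = expression[:]
    for _ in range(n_perm):
        rng.shuffle(expr)  # break genotype-expression pairing under H0
        if max(abs(corr(g, expr)) for g in genotypes) >= obs:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)

# toy data: 40 individuals, 5 variants (0/1/2 allele counts), one truly associated
rng = random.Random(5)
geno = [[rng.choice([0, 1, 2]) for _ in range(40)] for _ in range(5)]
expr = [g + rng.gauss(0, 0.5) for g in geno[0]]
p = egene_perm_p(geno, expr)
```

Taking the maximum over variants inside each permutation is what corrects for multiple testing; the paper's method additionally upweights variants near TSSs and regulatory marks before taking that maximum.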
Statistics of 150-km echoes over Jicamarca based on low-power VHF observations
NASA Astrophysics Data System (ADS)
Chau, J. L.; Kudeki, E.
2006-07-01
In this work we summarize the statistics of the so-called 150-km echoes obtained with a low-power VHF radar operating at the Jicamarca Radio Observatory (11.97 S, 76.87 W, and 1.3 dip angle at 150-km altitude) in Peru. Our results are based on almost four years of observations between August 2001 and July 2005 (approximately 150 days per year). The majority of the observations have been conducted between 08:00 and 17:00 LT. We present the statistics of occurrence of the echoes for each of the four seasons as a function of time of day and altitude. The occurrence frequency of the echoes is ~75% around noon, starts decreasing after 15:00 LT, and the echoes disappear after 17:00 LT in all seasons. As shown in previous campaign observations, the 150-km echoes appear at a higher altitude (>150 km) in narrow layers in the morning, reach lower altitudes (~135 km) around noon, and disappear at higher altitudes (>150 km) after 17:00 LT. We show that although 150-km echoes are observed all year long, they exhibit a clear seasonal variability in altitudinal coverage and in the percentage of occurrence around noon and early in the morning. We also show that there is a strong day-to-day variability and no correlation with magnetic activity. Although our results do not solve the 150-km riddle, they should be taken into account when a reasonable theory is proposed.
NASA Astrophysics Data System (ADS)
Ladoni, Moslem; Kravchenko, Sasha
2014-05-01
Conservation agricultural management has the potential to increase soil organic carbon sequestration. However, due to the typically slow response of soil organic C to management and due to its large spatial variability, many researchers fail to detect statistically significant management effects on soil organic carbon in their studies. One solution that has been commonly applied is to use active fractions of soil organic C for treatment comparisons. Active pools of soil organic C have been shown to respond to management changes faster than total C; however, it is possible that the larger variability associated with these pools makes their use for treatment comparisons more difficult. The objectives of this study are to assess the variability of total C and C active pools and then to use power analysis to investigate the probability of detecting significant differences among the treatments for total C and for different active pools of C. We also explored the benefit of applying additional soil and landscape data as covariates to explain some of the variability and to enhance the statistical power for different pools of C. We collected 66 soil cores from 10 agricultural fields under three different management treatments, namely corn-soybean-wheat rotation systems with 1) conventional chemical inputs, 2) low chemical inputs with cover crops, and 3) organic management with cover crops. The cores were analyzed for total organic carbon (TOC) and for two active C pool characteristics, particulate organic carbon (POC) and short-term mineralizable carbon (SMC). In addition, for each core we determined the values of potential covariates including soil particle size distribution, bulk density, and topographical terrain attributes. Power analysis was conducted using the estimates of variances from the obtained data and a series of hypothesized management effects. The range of considered hypothesized effects consisted of 10-100% increases under low-input, 10
34 CFR 85.900 - Adequate evidence.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Definitions § 85.900 Adequate evidence. Adequate evidence means information sufficient to support the reasonable belief that a particular act or omission has occurred. Authority: E.O. 12549 (3 CFR, 1986 Comp., p. 189); E.O. 12689 (3 CFR, 1989 Comp., p. 235); 20 U.S.C. 1082, 1094, 1221e-3 and 3474; and Sec....
29 CFR 452.110 - Adequate safeguards.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 2 2010-07-01 2010-07-01 false Adequate safeguards. 452.110 Section 452.110 Labor... DISCLOSURE ACT OF 1959 Election Procedures; Rights of Members § 452.110 Adequate safeguards. (a) In addition to the election safeguards discussed in this part, the Act contains a general mandate in section...
Hacke, P.; Spataru, S.
2014-08-01
We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. An activation energy of 0.85 eV was determined for the mean time to failure (1% relative degradation), and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
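The reported 0.85 eV activation energy ties the stress-test and use-condition timescales together through the Arrhenius relation MTTF ∝ exp(Ea/kT). A sketch of that extrapolation (humidity dependence is not modeled here; the 8,000 h figure is taken from the abstract, the rest is illustrative):

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between a use and a stress temperature."""
    t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
    return math.exp(ea_ev / K_B * (1.0 / t_use - 1.0 / t_stress))

af = acceleration_factor(0.85, 25.0, 85.0)  # roughly 250x faster at 85 C than 25 C
mttf_25c = 8000.0            # hours, predicted use-condition value from the abstract
mttf_85c = mttf_25c / af     # implied time to failure under the 85 C stress test
```

This is why a chamber test lasting tens of hours at 85 C / 85% RH can stand in for years of field exposure, provided the same degradation mechanism is active at both temperatures.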
The profound impact of negative power law noise on statistical estimation.
Reinhardt, Victor S
2010-01-01
This paper investigates the profound impact of negative power law (neg-p) noise - that is, noise with a power spectral density L_p(f) ∝ |f|^p for p < 0 - on the ability of practical implementations of statistical estimation or fitting techniques, such as a least squares fit (LSQF) or a Kalman filter, to generate valid results. It demonstrates that such neg-p noise behaves more like systematic error than conventional noise, because neg-p noise is highly correlated, non-stationary, non-mean-ergodic, and has an infinite correlation time τ_c. It is further demonstrated that stationary but correlated noise will also cause invalid estimation behavior when the condition T > τ_c is not met, where T is the data collection interval for estimation. Thus, it is shown that neg-p noise, with its infinite τ_c, can generate anomalous estimation results for all values of T, except in certain circumstances. A covariant theory is developed explaining much of this anomalous estimation behavior. However, simulations of the estimation behavior of neg-p noise demonstrate that the subject cannot be fully understood in terms of covariant theory or mean ergodicity. It is finally conjectured that one must investigate the variance ergodicity properties of neg-p noise through the use of 4th-order correlation theory to fully explain such simulated behavior. PMID:20040429
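Neg-p noise with the stated spectral density can be synthesized as a sum of random-phase cosines with amplitude ∝ f^(p/2), which makes its long correlation time directly visible. A sketch comparing p = -2 noise with flat-spectrum noise (series length and lag choice are arbitrary):

```python
import math
import random

def powerlaw_noise(n, p, seed=6):
    """Synthesize noise with power spectral density ~ |f|^p as random-phase cosines."""
    rng = random.Random(seed)
    xs = [0.0] * n
    for k in range(1, n // 2):
        amp = (k / n) ** (p / 2.0)  # amplitude ~ f^(p/2) so that power ~ f^p
        phi = rng.uniform(0, 2 * math.pi)
        for t in range(n):
            xs[t] += amp * math.cos(2 * math.pi * k * t / n + phi)
    return xs

def lag1_autocorr(xs):
    """Sample autocorrelation at lag 1."""
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs)
    cov = sum((xs[t] - m) * (xs[t + 1] - m) for t in range(n - 1))
    return cov / var

red = powerlaw_noise(256, p=-2.0)   # strongly correlated neg-p noise
white = powerlaw_noise(256, p=0.0)  # flat spectrum for comparison
```

The p = -2 series is dominated by its lowest-frequency components, so neighboring samples are nearly identical; that persistence over the whole record is exactly what defeats estimators that implicitly assume T greatly exceeds the correlation time.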
Perles, Stephanie J.; Wagner, Tyler; Irwin, Brian J.; Manning, Douglas R.; Callahan, Kristina K.; Marshall, Matthew R.
2014-01-01
Forests are socioeconomically and ecologically important ecosystems that are exposed to a variety of natural and anthropogenic stressors. As such, monitoring forest condition and detecting temporal changes therein remain critical to sound public and private forestland management. The National Parks Service’s Vital Signs monitoring program collects information on many forest health indicators, including species richness, cover by exotics, browse pressure, and forest regeneration. We applied a mixed-model approach to partition variability in data for 30 forest health indicators collected from several national parks in the eastern United States. We then used the estimated variance components in a simulation model to evaluate trend detection capabilities for each indicator. We investigated the extent to which the following factors affected ability to detect trends: (a) sample design: using simple panel versus connected panel design, (b) effect size: increasing trend magnitude, (c) sample size: varying the number of plots sampled each year, and (d) stratified sampling: post-stratifying plots into vegetation domains. Statistical power varied among indicators; however, indicators that measured the proportion of a total yielded higher power when compared to indicators that measured absolute or average values. In addition, the total variability for an indicator appeared to influence power to detect temporal trends more than how total variance was partitioned among spatial and temporal sources. Based on these analyses and the monitoring objectives of the Vital Signs program, the current sampling design is likely overly intensive for detecting a 5 % trend·year−1 for all indicators and is appropriate for detecting a 1 % trend·year−1 in most indicators.
Americans Getting Adequate Water Daily, CDC Finds
... medlineplus/news/fullstory_158510.html Americans Getting Adequate Water Daily, CDC Finds Men take in an average ... new government report finds most are getting enough water each day. The data, from the U.S. National ...
GeneMarker® Genotyping Software: Tools to Increase the Statistical Power of DNA Fragment Analysis
Hulce, D.; Li, X.; Snyder-Leiby, T.; Liu, C.S. Jonathan
2011-01-01
The discriminatory power of post-genotyping analyses, such as kinship or clustering analysis, is dependent on the amount of genetic information obtained from the DNA fragment/genotyping analysis. The number of microsatellite loci amplified in one multiplex is limited by the number of dyes and overlapping loci boundaries; requiring researchers to amplify replicate samples with 2 or more multiplexes in order to obtain a genotype for 12–15 loci. AFLP is another method that is limited by the number of dyes, often requiring multiple amplifications of replicate samples to obtain more complete results. Traditionally, researchers export the genotyping results into a spread sheet, manually combine the results for each individual and then import into a third software package for post-genotyping analysis. GeneMarker is highly accurate, user-friendly genotyping software that allows all of these steps to be done in one software package, avoiding potential errors from data transfer to different programs and decreasing the amount of time needed to process the results. The Merge Project tool automatically combines the results from replicate samples processed with different primer sets. Replicate animal (diploid) DNA samples were amplified with three different multiplexes, each multiplex provided information on 4–6 loci. The kinship analysis using the merged results provided a 10^17 increase in statistical power, ranging from 10^8 when 5 loci were used to 10^25 when 15 loci were used to determine potential relationship levels with identity-by-descent calculations. These same sample sets were used in clustering analysis to construct dendrograms. The dendrogram based on a single multiplex resulted in three branches at a given Euclidian distance. In comparison, the dendrogram that was constructed using the merged results had eight branches at the same Euclidian distance.
Socol, Yehoshua; Dobrzyński, Ludwik
2015-01-01
The atomic bomb survivors life-span study (LSS) is often claimed to support the linear no-threshold hypothesis (LNTH) of radiation carcinogenesis. This paper shows that this claim is baseless. The LSS data are equally or better described by an s-shaped dependence on radiation exposure with a threshold of about 0.3 Sievert (Sv) and saturation level at about 1.5 Sv. A Monte-Carlo simulation of possible LSS outcomes demonstrates that, given the weak statistical power, LSS cannot provide support for LNTH. Even if the LNTH is used at low dose and dose rates, its estimation of excess cancer mortality should be communicated as 2.5% per Sv, i.e., an increase of cancer mortality from about 20% spontaneous mortality to about 22.5% per Sv, which is about half of the usually cited value. The impact of the "neutron discrepancy problem" - the apparent difference between the calculated and measured values of neutron flux in Hiroshima - was studied and found to be marginal. Major revision of the radiation risk assessment paradigm is required. PMID:26673526
Nakamura, Kunio; Guizard, Nicolas; Fonov, Vladimir S.; Narayanan, Sridar; Collins, D. Louis; Arnold, Douglas L.
2013-01-01
Gray matter atrophy provides important insights into neurodegeneration in multiple sclerosis (MS) and can be used as a marker of neuroprotection in clinical trials. Jacobian integration is a method for measuring volume change that uses integration of the local Jacobian determinants of the nonlinear deformation field registering two images, and is a promising tool for measuring gray matter atrophy. Our main objective was to compare the statistical power of the Jacobian integration method to commonly used methods in terms of the sample size required to detect a treatment effect on gray matter atrophy. We used multi-center longitudinal data from relapsing–remitting MS patients and evaluated combinations of cross-sectional and longitudinal pre-processing with SIENAX/FSL, SPM, and FreeSurfer, as well as the Jacobian integration method. The Jacobian integration method outperformed these other commonly used methods, reducing the required sample size by a factor of 4–5. The results demonstrate the advantage of using the Jacobian integration method to assess neuroprotection in MS clinical trials. PMID:24266007
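The factor 4-5 sample-size reduction reported above follows from the standard power calculation, where the required n scales with (sigma/delta)^2: halving measurement noise quarters the sample size. A sketch with hypothetical numbers (the normal-approximation constants correspond to 5% two-sided alpha and 80% power; the sigma values are invented, not the study's):

```python
def required_n(sigma, delta, z_alpha=1.96, z_beta=0.84):
    """Per-group sample size to detect a mean change delta (normal approximation)."""
    return ((z_alpha + z_beta) ** 2) * (sigma / delta) ** 2

# hypothetical treatment effect on atrophy rate, measured with two method precisions
n_conventional = required_n(sigma=1.0, delta=0.2)
n_jacobian = required_n(sigma=0.45, delta=0.2)  # ~2.2x lower measurement noise
ratio = n_conventional / n_jacobian
```

Because n depends on sigma squared, a ~2.2x precision gain from a better atrophy measure translates into the ~4-5x smaller trial the abstract describes.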
Kogalovskii, M.R.
1995-03-01
This paper presents a review of problems related to statistical database systems, which are widespread in various fields of activity. Statistical databases (SDBs) are databases used for statistical analysis. Topics under consideration are: SDB peculiarities, properties of data models adequate for SDB requirements, metadata functions, null-value problems, SDB compromise protection problems, stored data compression techniques, and statistical data representation means. Also examined is whether present Database Management Systems (DBMSs) satisfy SDB requirements. Some current research directions in SDB systems are considered.
ERIC Educational Resources Information Center
Daniel, Thomas Dyson
Statistical power in music education was examined by taking an in-depth look at quantitative articles published in the "Journal of Research in Music Education" between 1987 and 1991, inclusive. Of the 109 articles of the period, 78 were quantitative, with both parametric and nonparametric procedures considered. Sample sizes were those reported by…
Asbestos/NESHAP adequately wet guidance
Shafer, R.; Throwe, S.; Salgado, O.; Garlow, C.; Hoerath, E.
1990-12-01
The Asbestos NESHAP requires facility owners and/or operators involved in demolition and renovation activities to control emissions of particulate asbestos to the outside air, because no safe concentration of airborne asbestos has ever been established. The primary method used to control asbestos emissions is to adequately wet the asbestos-containing material (ACM) with a wetting agent before, during, and after demolition/renovation activities. The purpose of this document is to provide guidance to asbestos inspectors and the regulated community on how to determine whether friable ACM is adequately wet, as required by the Asbestos NESHAP.
Statistics of the epoch of reionization 21-cm signal - I. Power spectrum error-covariance
NASA Astrophysics Data System (ADS)
Mondal, Rajesh; Bharadwaj, Somnath; Majumdar, Suman
2016-02-01
The non-Gaussian nature of the epoch of reionization (EoR) 21-cm signal has a significant impact on the error variance of its power spectrum P(k). We have used a large ensemble of seminumerical simulations and an analytical model to estimate the effect of this non-Gaussianity on the entire error-covariance matrix {C}ij. Our analytical model shows that {C}ij has contributions from two sources. One is the usual variance for a Gaussian random field, which scales inversely with the number of modes that go into the estimation of P(k). The other is the trispectrum of the signal. Using the simulated 21-cm Signal Ensemble, an ensemble of the Randomized Signal, and Ensembles of Gaussian Random Ensembles, we have quantified the effect of the trispectrum on the error variance {C}ii. We find that its relative contribution is comparable to or larger than that of the Gaussian term for the k range 0.3 ≤ k ≤ 1.0 Mpc⁻¹, and can be even ~200 times larger at k ~ 5 Mpc⁻¹. We also establish that the off-diagonal terms of {C}ij have statistically significant non-zero values which arise purely from the trispectrum. This further signifies that the errors in different k modes are not independent. We find a strong correlation between the errors at large k values (≥0.5 Mpc⁻¹), and a weak correlation between the smallest and largest k values. There is also a small anticorrelation between the errors in the smallest and intermediate k values. These results are relevant for the k range that will be probed by current and upcoming EoR 21-cm experiments.
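The Gaussian part of the error variance is simply P(k)^2 divided by the number of independent Fourier modes in each k bin. A minimal sketch of the mode counting for a cubic simulation box (our own illustration; function names are not from the paper, and the trispectrum contribution is deliberately left out):

```python
import numpy as np

def count_modes(n, box_len, k_edges):
    """Independent Fourier modes per k bin for a real field on an n^3 grid."""
    kf = 2 * np.pi / box_len                # fundamental mode of the box
    k = np.fft.fftfreq(n, d=1.0 / n) * kf   # grid wavenumbers along one axis
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    kmag = np.sqrt(kx**2 + ky**2 + kz**2).ravel()
    counts, _ = np.histogram(kmag, bins=k_edges)
    return counts // 2                      # conjugate pairs are not independent

def gaussian_pk_variance(pk, nmodes):
    """Gaussian (disconnected) part of C_ii; the trispectrum adds to this."""
    return np.asarray(pk, dtype=float) ** 2 / np.asarray(nmodes)
```

For a given bin the fractional error on P(k) from this term is 1/sqrt(N_modes); the paper's point is that the trispectrum term can dominate this Gaussian floor over much of the observable k range.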
Schroeder, Carl B.; Fawley, William M.; Esarey, Eric
2002-09-24
We investigate the statistical properties (e.g., shot-to-shot power fluctuations) of the radiation from a high-gain free-electron laser (FEL) operating in the nonlinear regime. We consider the case of an FEL amplifier reaching saturation whose shot-to-shot fluctuations in input radiation power follow a gamma distribution. We analyze the corresponding output power fluctuations at and beyond first saturation, including beam energy spread effects, and find that there are well-characterized values of undulator length for which the fluctuation level reaches a minimum.
ERIC Educational Resources Information Center
Texeira, Antonio; Rosa, Alvaro; Calapez, Teresa
2009-01-01
This article presents statistical power analysis (SPA) based on the normal distribution using Excel, adopting textbook and SPA approaches. The objective is to present the latter in a comparative way within a framework that is familiar to textbook level readers, as a first step to understand SPA with other distributions. The analysis focuses on the…
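The normal-distribution power calculation that the article implements in Excel can equally be sketched in plain Python. Below is a generic one-sided z-test with known sigma; the helper names are ours, not the article's spreadsheet layout:

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def norm_ppf(p, lo=-10.0, hi=10.0):
    """Inverse normal CDF by bisection (precision ample for power analysis)."""
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def z_test_power(effect, n, sigma, alpha=0.05):
    """Power of a one-sided z-test for a true mean shift `effect`.

    Under H1 the test statistic is N(effect * sqrt(n) / sigma, 1), so the
    power is the probability it exceeds the critical value z_{1-alpha}.
    """
    z_crit = norm_ppf(1.0 - alpha)
    return 1.0 - norm_cdf(z_crit - effect * sqrt(n) / sigma)
```

With a true effect of half a standard deviation and n = 25, this returns a power of roughly 0.80, the conventional target; with a zero effect it returns exactly alpha, as it should.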
Adequate supervision for children and adolescents.
Anderst, James; Moffatt, Mary
2014-11-01
Primary care providers (PCPs) have the opportunity to improve child health and well-being by addressing supervision issues both before and after an injury or exposure has occurred. Appropriate anticipatory guidance on supervision at well-child visits can improve supervision of children and may prevent future harm. Adequate supervision varies based on the child's development and maturity and on the risks in the child's environment. Consideration should be given to issues as wide ranging as swimming pools, falls, dating violence, and social media. By weighing the likelihood of harm and the severity of the potential harm, caregivers may provide adequate supervision by minimizing risks to the child while still allowing the child to take "small" risks as needed for healthy development. Caregivers should initially focus on direct (visual, auditory, and proximity) supervision of the young child. Supervision should then be adjusted gradually as the child develops, emphasizing a safe environment and safe social interactions with graduated independence. PCPs may foster adequate supervision by providing concrete guidance to caregivers. In addition to preventing injury, supervision includes fostering a safe, stable, and nurturing relationship with every child. PCPs should be familiar with age- and developmentally based supervision risks, adequate supervision based on those risks, characteristics of neglectful supervision based on age/development, and ways to encourage appropriate supervision throughout childhood. PMID:25369578
Small Rural Schools CAN Have Adequate Curriculums.
ERIC Educational Resources Information Center
Loustaunau, Martha
The small rural school's foremost and largest problem is providing an adequate curriculum for students in a changing world. Often the small district cannot or is not willing to pay the per-pupil cost of curriculum specialists, specialized courses using expensive equipment no more than one period a day, and remodeled rooms to accommodate new…
Funding the Formula Adequately in Oklahoma
ERIC Educational Resources Information Center
Hancock, Kenneth
2015-01-01
This report is a longitudinal simulation study examining how the ratio of state support to local support affects the number of school districts that break the common schools' funding formula, which in turn affects the equity of distribution to the common schools. After nearly two decades of adequately supporting the funding formula, Oklahoma…
The Relation of Power of Statistical Tests to Range of Talent: A Correction and Amplification.
ERIC Educational Resources Information Center
Humphreys, Lloyd G.
1991-01-01
The difference in effect on power of restriction of range between treatments under experimental control and categories formed from examinee (individual differences) variables is discussed. Conditions under which there is loss or increase of power are explained. (SLD)
Inference of Statistical Patterns in Complex Geosystems: Fitting Power-law Distributions.
NASA Astrophysics Data System (ADS)
Deluca, Anna; Corral, Alvaro
2014-05-01
Power-law distributions contain precious information about a large variety of physical processes. Although there are sound theoretical grounds for these distributions, the empirical evidence giving support to power laws has traditionally been weak. Recently, Clauset et al. proposed a systematic method to find over which range (if any) a certain distribution behaves as a power law. However, their method fails to recognize true (simulated) power-law tails in some instances, rejecting the power-law hypothesis. Moreover, the method does not perform well when extended to power-law distributions with an upper truncation. We present an alternative procedure, valid for truncated as well as for non-truncated power-law distributions, based on maximum likelihood estimation, the Kolmogorov-Smirnov goodness-of-fit test, and Monte Carlo simulations. We test the performance of our method on several empirical datasets that were previously analyzed with less systematic approaches.
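A compact version of the fitting machinery discussed here, for the non-truncated continuous case, combines the standard maximum-likelihood exponent estimate with a Kolmogorov-Smirnov distance. This is a sketch of the general approach only, not the authors' full procedure, which also handles truncated laws and Monte Carlo p-values:

```python
import math

def fit_power_law(data, xmin):
    """MLE exponent and KS distance for a continuous power-law tail x >= xmin."""
    tail = sorted(x for x in data if x >= xmin)
    n = len(tail)
    # Continuous MLE: alpha = 1 + n / sum(ln(x_i / xmin))
    alpha = 1.0 + n / sum(math.log(x / xmin) for x in tail)
    # Model CDF F(x) = 1 - (x / xmin)^(1 - alpha) vs. the empirical CDF
    ks = max(abs((i + 1) / n - (1.0 - (x / xmin) ** (1.0 - alpha)))
             for i, x in enumerate(tail))
    return alpha, ks
```

In the full method, the fitted KS distance is compared against KS distances of many synthetic power-law samples (Monte Carlo) to turn the goodness of fit into a p-value, and xmin itself is chosen by minimizing the KS distance.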
Sex differences in discriminative power of volleyball game-related statistics.
João, Paulo Vicente; Leite, Nuno; Mesquita, Isabel; Sampaio, Jaime
2010-12-01
To identify sex differences in volleyball game-related statistics, the game-related statistics of several World Championships in 2007 (N = 132) were analyzed using the VIS software from the International Volleyball Federation. Discriminant analysis was used to identify the game-related statistics that best discriminated performances by sex. The analysis emphasized fault serves (SC = -.40), shot spikes (SC = .40), and reception digs (SC = .31). Considerable variability was evident in the game-related statistics profiles: men's volleyball games were better associated with terminal actions (errors of service), whereas women's volleyball games were characterized by continuous actions (in defense and attack). These differences may be related to the anthropometric and physiological differences between women and men and their influence on performance profiles. PMID:21319626
NASA Astrophysics Data System (ADS)
Tsallis, C.; Cirto, L. J. L.
2014-10-01
We briefly review the connection between statistical mechanics and thermodynamics. We show that, in order to satisfy thermodynamics and its Legendre-transformation mathematical frame, the celebrated Boltzmann-Gibbs (BG) statistical mechanics is sufficient but not necessary. Indeed, the N → ∞ limit of statistical mechanics is expected to be consistent with thermodynamics. For systems whose elements are generically independent or quasi-independent in the sense of the theory of probabilities, it is well known that the BG theory (based on the additive BG entropy) does satisfy this expectation. However, in complete analogy, other thermostatistical theories (e.g., q-statistics), based on nonadditive entropic functionals, also satisfy the very same expectation. We illustrate this standpoint with systems whose elements are strongly correlated in a specific manner, such that they escape the BG realm.
Power law statistics of force and acoustic emission from a slowly penetrated granular bed
NASA Astrophysics Data System (ADS)
Matsuyama, K.; Katsuragi, H.
2014-01-01
Penetration-resistant force and acoustic emission (AE) from a plunged granular bed are experimentally investigated through the power-law forms of their distributions. An AE sensor is buried in a glass-bead bed, and the bed is then slowly penetrated by a solid sphere. During penetration, the resistant force exerted on the sphere and the AE signal are measured. The resistant force shows a power-law relation to the penetration depth. The power-law exponent is independent of the penetration speed, while it appears to depend on the container's size. For the AE signal, we find that the size distribution of AE events obeys power laws, with an exponent that depends on grain size. Using energy scaling, the experimentally observed power-law exponents are discussed and compared to the Gutenberg-Richter (GR) law.
Chung, Moo K; Kim, Seung-Goo; Schaefer, Stacey M; van Reekum, Carien M; Peschke-Schmitz, Lara; Sutterer, Matthew J; Davidson, Richard J
2014-03-21
The sparse regression framework has been widely used in medical image processing and analysis. However, it has rarely been used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high-frequency noise, only the first few terms are used in the expansion and the higher-frequency terms are simply discarded. However, some lower-frequency terms may not contribute significantly to reconstructing the surfaces. Motivated by this idea, we present an LB-based method that retains only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which reduces the chance of false negatives; hence statistical power improves. The sparse shape model is then applied to investigate the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power. PMID:25302007
The Power of Student's t and Wilcoxon W Statistics: A Comparison.
ERIC Educational Resources Information Center
Rasmussen, Jeffrey Lee
1985-01-01
A recent study (Blair and Higgins, 1980) indicated a power advantage for the Wilcoxon W test over Student's t test when calculated from a common mixed-normal sample. Results of the present study indicate that the t test corrected for outliers shows a superior power curve to the Wilcoxon W.
ERIC Educational Resources Information Center
Jiang, Depeng; Pepler, Debra; Yao, Hongxing
2010-01-01
Do interventions work and for whom? For this article, we examined the influence of population heterogeneity on power in designing and evaluating interventions. On the basis of Monte Carlo simulations in Study 1, we demonstrated that the power to detect the overall intervention effect is lower for a mixture of two subpopulations than for a…
ERIC Educational Resources Information Center
Spybrook, Jessaca; Hedges, Larry; Borenstein, Michael
2014-01-01
Research designs in which clusters are the unit of randomization are quite common in the social sciences. Given the multilevel nature of these studies, the power analyses for these studies are more complex than in a simple individually randomized trial. Tools are now available to help researchers conduct power analyses for cluster randomized…
Bellan, Steven E.; Pulliam, Juliet R. C.; Pearson, Carl A. B.; Champredon, David; Fox, Spencer J.; Skrip, Laura; Galvani, Alison P.; Gambhir, Manoj; Lopman, Ben A.; Porco, Travis C.; Meyers, Lauren Ancel; Dushoff, Jonathan
2016-01-01
Background: Safe and effective vaccines may help end the ongoing Ebola virus disease (EVD) epidemic in West Africa, and mitigate future outbreaks. We evaluate the statistical validity and power of randomized controlled trial (RCT) and stepped-wedge cluster trial (SWCT) designs in Sierra Leone, where EVD incidence is spatiotemporally heterogeneous and rapidly declining.
Methods: We forecasted district-level EVD incidence over the next six months using a stochastic model fit to data from Sierra Leone. We then simulated RCT and SWCT designs in trial populations comprising geographically distinct clusters at high risk, taking into account realistic logistical constraints as well as both individual-level and cluster-level variation in risk. We assessed false positive rates and power for parametric and nonparametric analyses of simulated trial data, across a range of vaccine efficacies and trial start dates.
Findings: For an SWCT, regional variation in EVD incidence trends produced inflated false positive rates (up to 0.11 at α = 0.05) under standard statistical models, but not when analyzed by a permutation test, whereas all analyses of RCTs remained valid. Assuming a six-month trial starting February 18, 2015, we estimate the power to detect a 90% efficacious vaccine to be between 48% and 89% for an RCT, and between 6.4% and 26% for an SWCT, depending on incidence within the trial population. We estimate that a one-month delay in implementation will reduce the power of the RCT and SWCT by 20% and 49%, respectively.
Interpretation: Spatiotemporal variation in infection risk undermines the SWCT's statistical power. This variation also undercuts the SWCT's expected ethical advantages over the RCT, because the latter but not the former can prioritize high-risk clusters.
Funding: US National Institutes of Health, US National Science Foundation, Canadian Institutes of Health Research. PMID:25886798
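The permutation analysis that kept the false positive rate valid can be illustrated with a generic cluster-level permutation test. This is a toy sketch under our own assumptions (cluster-level incidence summaries, one-sided test for lower incidence in vaccinated clusters), not the trial's analysis code:

```python
import random

def permutation_pvalue(treated, control, n_perm=2000, seed=0):
    """One-sided permutation p-value for lower incidence in treated clusters.

    treated, control: per-cluster incidence summaries. Cluster labels are
    permuted wholesale, so the null distribution reflects the actual
    between-cluster heterogeneity rather than a parametric assumption.
    """
    rng = random.Random(seed)
    pooled = list(treated) + list(control)
    n_t = len(treated)
    obs = sum(control) / len(control) - sum(treated) / n_t
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        t, c = pooled[:n_t], pooled[n_t:]
        if sum(c) / len(c) - sum(t) / len(t) >= obs:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one (permutation) correction
```

Because whole clusters are relabeled, spatiotemporal variation between clusters enters the reference distribution, which is what keeps the test's size near the nominal α even when a parametric model would be misspecified.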
ERIC Educational Resources Information Center
Groth, Randall E.
2013-01-01
A hypothetical framework to characterize statistical knowledge for teaching (SKT) is described. Empirical grounding for the framework is provided by artifacts from an undergraduate course for prospective teachers that concentrated on the development of SKT. The theoretical notion of "key developmental understanding" (KDU) is used to identify…
ERIC Educational Resources Information Center
Endress, Ansgar D.; Mehler, Jacques
2009-01-01
Word-segmentation, that is, the extraction of words from fluent speech, is one of the first problems language learners have to master. It is generally believed that statistical processes, in particular those tracking "transitional probabilities" (TPs), are important to word-segmentation. However, there is evidence that word forms are stored in…
Two universal physical principles shape the power-law statistics of real-world networks
Lorimer, Tom; Gomez, Florian; Stoop, Ruedi
2015-01-01
The study of complex networks has pursued an understanding of macroscopic behaviour by focusing on power-laws in microscopic observables. Here, we uncover two universal fundamental physical principles that are at the basis of complex network generation. These principles together predict the generic emergence of deviations from ideal power laws, which were previously discussed away by reference to the thermodynamic limit. Our approach proposes a paradigm shift in the physics of complex networks, toward the use of power-law deviations to infer meso-scale structure from macroscopic observations. PMID:26202858
Statistical analysis of power-size-redshift distributions of extragalactic jets
NASA Technical Reports Server (NTRS)
Rosen, Alexander; Wiita, Paul J.
1991-01-01
This paper investigates whether a hot, sparse, yet cosmologically significant intergalactic medium (IGM) is consistent with data collected from extragalactic radio sources. This is done using Monte Carlo simulations that employ previously run pseudohydrodynamical simulations to cover an observational parameter space. The observational parameters include the scale height, central density, and temperature of an (isothermal) galactic halo, and the power of the central engine that drives the jet. The Monte Carlo simulations generate distributions of sizes in bins of (received) power and redshift, which are compared with observational data using Kolmogorov-Smirnov tests. Results of this analysis are consistent with the existence of an IGM with the temperature and density mentioned above. In addition, the analysis suggests that the active lifetime of powerful extragalactic radio sources decreases with increasing power.
Statistical evidence for power law temporal correlations in exploratory behaviour of rats.
Yadav, Chetan K; Verma, Mahendra K; Ghosh, Subhendu
2010-01-01
Dynamics of exploratory behaviour of rats and home base establishment is investigated. Time series of instantaneous speed of rats were computed from their positions during exploration. The probability distribution function (PDF) of the speed obeys a power law with exponents ranging from 2.1 to 2.32. The PDF of the recurrence time of large speed also exhibits a power law, P(τ) ~ τ^(-β), with β from 1.56 to 2.30. The power spectrum of the speed is in general agreement with the 1/f spectrum reported earlier. These observations indicate that the acquisition of spatial information during exploration is self-organized, with power law temporal correlations. This provides a possible explanation for the home base behaviour of rats during exploration. The exploratory behaviour of rats resembles other systems exhibiting self-organized criticality, e.g., earthquakes, solar flares, etc. PMID:20688133
NASA Technical Reports Server (NTRS)
Smith, Wayne Farrior
1973-01-01
The effect of finite source size on the power statistics in a reverberant room for pure-tone excitation was investigated. Theoretical results indicate that the standard deviation of low-frequency, pure-tone finite sources is always less than that predicted by point-source theory, and considerably less when the source dimension approaches one-half an acoustic wavelength or greater. A supporting experimental study was conducted using an 8-inch loudspeaker and a 30-inch loudspeaker at eleven source positions. The resulting standard deviation of sound power output of the smaller speaker is in excellent agreement with both the derived finite-source theory and existing point-source theory, if the theoretical data are adjusted to account for incomplete spatial averaging in the experiment. However, the standard deviation of sound power output of the larger speaker is measurably lower than point-source theory indicates, but is in good agreement with the finite-source theory.
NASA Astrophysics Data System (ADS)
Woolley, Thomas W.; Dawson, George O.
It has been two decades since the first power analysis of a psychological journal and 10 years since the Journal of Research in Science Teaching made its contribution to this debate. One purpose of this article is to investigate what power-related changes, if any, have occurred in science education research over the past decade as a result of the earlier survey. In addition, previous recommendations are expanded and expounded upon within the context of more recent work in this area. The absence of any consistent mode of presenting statistical results, as well as little change with regard to power-related issues are reported. Guidelines for reporting the minimal amount of information demanded for clear and independent evaluation of research results by readers are also proposed.
Sensitivity of neutrinos to the supernova turbulence power spectrum: Point source statistics
NASA Astrophysics Data System (ADS)
Kneller, James P.; Kabadi, Neel V.
2015-07-01
The neutrinos emitted from the proto-neutron star created in a core-collapse supernova must run through a significant amount of turbulence before exiting the star. Turbulence can modify the flavor evolution of the neutrinos imprinting itself upon the signal detected here at Earth. The turbulence effect upon individual neutrinos, and the correlation between pairs of neutrinos, might exhibit sensitivity to the power spectrum of the turbulence, and recent analysis of the turbulence in a two-dimensional hydrodynamical simulation of a core-collapse supernova indicates the power spectrum may not be the Kolmogorov 5/3 inverse power law as has been previously assumed. In this paper we study the effect of non-Kolmogorov turbulence power spectra upon neutrinos from a point source as a function of neutrino energy and turbulence amplitude at a fixed postbounce epoch. We find the two effects of turbulence upon the neutrinos—the distorted phase effect and the stimulated transitions—both possess strong and weak limits in which dependence upon the power spectrum is absent or evident, respectively. Since neutrinos of a given energy will exhibit these two effects at different epochs of the supernova each with evolving strength, we find there is sensitivity to the power spectrum present in the neutrino burst signal from a Galactic supernova.
A statistical framework for genetic association studies of power curves in bird flight
Lin, Min; Zhao, Wei
2006-01-01
How the power required for bird flight varies as a function of forward speed can be used to predict the flight style and behavioral strategy of a bird for feeding and migration. A U-shaped relation between power and flight velocity has been observed in many birds, consistent with the theoretical prediction of aerodynamic models. In this article, we present a general genetic model for fine mapping of quantitative trait loci (QTL) responsible for power curves in a sample of birds drawn from a natural population. The model is developed within the maximum likelihood context, implemented with the EM algorithm for estimating the population genetic parameters of QTL and the simplex algorithm for estimating the QTL genotype-specific parameters of power curves. Using Monte Carlo simulation derived from empirical observations of power curves in the European starling (Sturnus vulgaris), we demonstrate how the underlying QTL for power curves can be detected from molecular markers and how the detected QTL affect the most appropriate flight speeds used to design an optimal migration strategy. The results from our model can be directly integrated into a conceptual framework for understanding flight origin and evolution. PMID:17066123
Dolan, T E; Lynch, P D; Karazsia, J L; Serafy, J E
2016-03-01
An expansion is underway of a nuclear power plant on the shoreline of Biscayne Bay, Florida, USA. While the precise effects of its construction and operation are unknown, impacts on surrounding marine habitats and biota are considered by experts to be likely. The objective of the present study was to determine the adequacy of an ongoing monitoring survey of fish communities associated with mangrove habitats directly adjacent to the power plant to detect fish community changes, should they occur, at three spatial scales. Using seasonally resolved data recorded during 532 fish surveys over an 8-year period, power analyses were performed for four mangrove fish metrics (fish diversity, fish density, and the occurrence of two ecologically important fish species: gray snapper (Lutjanus griseus) and goldspotted killifish (Floridichthys carpio)). Results indicated that the monitoring program at current sampling intensity allows for detection of <33% changes in fish density and diversity metrics in both the wet and the dry season in the two larger study areas. Sampling effort was found to be insufficient in either season to detect changes at this level (<33%) in species-specific occurrence metrics for the two fish species examined. The option of supplementing ongoing biological monitoring programs for improved, focused change detection deserves consideration from both ecological and cost-benefit perspectives. PMID:26903208
NASA Astrophysics Data System (ADS)
Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J. J.; Bonaldi, A.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R. C.; Cardoso, J.-F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Chiang, L.-Y.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Comis, B.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Da Silva, A.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dolag, K.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Flores-Cacho, I.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Ganga, K.; Génova-Santos, R. T.; Giard, M.; Giardino, G.; Giraud-Héraud, Y.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Hansen, F. K.; Hanson, D.; Harrison, D.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lacasa, F.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Laureijs, R. J.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marcos-Caballero, A.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Melchiorri, A.; Melin, J.-B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Osborne, S.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Starck, J.-L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L. A.; Wandelt, B. D.; White, S. D. M.; Yvon, D.; Zacchei, A.; Zonca, A.
2014-11-01
We have constructed the first all-sky map of the thermal Sunyaev-Zeldovich (tSZ) effect by applying specifically tailored component separation algorithms to the 100 to 857 GHz frequency channel maps from the Planck survey. This map shows an obvious galaxy cluster tSZ signal that is well matched with blindly detected clusters in the Planck SZ catalogue. To characterize the signal in the tSZ map we have computed its angular power spectrum. At large angular scales (ℓ < 60), the major foreground contaminant is the diffuse thermal dust emission. At small angular scales (ℓ > 500) the clustered cosmic infrared background and residual point sources are the major contaminants. These foregrounds are carefully modelled and subtracted. We thus measure the tSZ power spectrum over angular scales 0.17° ≲ θ ≲ 3.0° that were previously unexplored. The measured tSZ power spectrum is consistent with that expected from the Planck catalogue of SZ sources, with clear evidence of additional signal from unresolved clusters and, potentially, diffuse warm baryons. Marginalized band-powers of the Planck tSZ power spectrum and the best-fit model are given. The non-Gaussianity of the Compton parameter map is further characterized by computing its 1D probability distribution function and its bispectrum. The measured tSZ power spectrum and high order statistics are used to place constraints on σ8.
The power of 41%: A glimpse into the life of a statistic.
Tanis, Justin
2016-01-01
"Forty-one percent?" the man said with anguish on his face as he addressed the author, clutching the author's handout. "We're talking about my granddaughter here." He was referring to the finding from the National Transgender Discrimination Survey (NTDS) that 41% of 6,450 respondents said they had attempted suicide at some point in their lives. The author had passed out the executive summary of the survey's findings during a panel discussion at a family conference to illustrate the critical importance of acceptance of transgender people. During the question and answer period, this gentleman rose to talk about his beloved 8-year-old granddaughter, who was in the process of transitioning socially from male to female in her elementary school. The statistics the author was citing were not just numbers to him; he wanted strategies, effective ones, to keep his granddaughter alive and thriving. The author has observed that the statistic about suicide attempts has, in essence, developed a life of its own. It has had several key audiences: academics and researchers, public policymakers, and members of the community, particularly transgender people and our families. This article explores some of the key takeaways from the survey and the ways in which the 41% statistic has affected conversations about the injustices transgender people face and the importance of family and societal acceptance. (PsycINFO Database Record) PMID:27380151
Real-time determination of total radiated power by bolometric cameras with statistical methods
Maraschek, M.; Fuchs, J.C.; Mast, K.F.; Mertens, V.; Zohm, H.
1998-01-01
A simpler and faster method for determining the total radiated power emitted from a tokamak plasma in real time has been developed. This quantity is normally calculated after the discharge by a deconvolution of line integrals from a bolometer camera. This time-consuming algorithm assumes constant emissivity on closed flux surfaces and therefore needs the exact magnetic equilibrium information. Thus, it is highly desirable to have a different, simpler way to determine the total radiated power in real time without additional magnetic equilibrium information. The real-time calculation of the total radiated power is done by a summation over 10 or 18 lines of sight selected out of a bolometer camera with 40 channels. The number of channels is restricted by the summation hardware. A new selection scheme, which uses a singular value decomposition, has been developed to select the required subset of line integrals from the camera. With this subset, a linear regression analysis was done against the radiated power calculated by the conventional algorithm. The selected channels are finally used with the regression coefficients as weighting factors to determine an estimate of the radiated power for subsequent discharges. This selection and the corresponding weighting factors can only be applied to discharges with a similar plasma shape, e.g., in our case the typical ASDEX Upgrade elliptical divertor plasma. © 1998 American Institute of Physics.
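The selection-and-weighting scheme described above can be sketched in a few lines. The following is a toy reconstruction under stated assumptions (synthetic low-rank channel data, a simple energy score over the leading right-singular vectors, 10 retained channels), not the ASDEX Upgrade implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training discharges: 200 samples of a 40-channel bolometer
# camera whose signals are a low-rank mix of a few radiation patterns.
n_t, n_ch, n_modes = 200, 40, 5
patterns = rng.normal(size=(n_modes, n_ch))
amps = rng.normal(size=(n_t, n_modes))
channels = amps @ patterns + 0.01 * rng.normal(size=(n_t, n_ch))
p_rad = channels.sum(axis=1)      # stand-in for the deconvolved total power

# SVD of the (centred) channel matrix; rank channels by their energy in
# the leading right-singular vectors and keep the 10 most informative.
_, _, vt = np.linalg.svd(channels - channels.mean(axis=0), full_matrices=False)
score = (vt[:n_modes] ** 2).sum(axis=0)
subset = np.argsort(score)[-10:]

# Regress the reference power on the selected channels (plus intercept);
# the coefficients become the real-time weighting factors.
X = np.column_stack([channels[:, subset], np.ones(n_t)])
coef, *_ = np.linalg.lstsq(X, p_rad, rcond=None)

# Apply the weights to a "subsequent discharge" with similar patterns.
amps2 = rng.normal(size=(50, n_modes))
ch2 = amps2 @ patterns + 0.01 * rng.normal(size=(50, n_ch))
pred = np.column_stack([ch2[:, subset], np.ones(50)]) @ coef
truth = ch2.sum(axis=1)
rel_err = np.max(np.abs(pred - truth)) / np.max(np.abs(truth))
```

Because the channel signals are effectively low-rank, a weighted sum of a well-chosen subset reproduces the full-camera total closely, which is the premise of the real-time method.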
Statistical Characterization of Solar Photovoltaic Power Variability at Small Timescales: Preprint
Shedd, S.; Hodge, B.-M.; Florita, A.; Orwig, K.
2012-08-01
Integrating large amounts of variable and uncertain solar photovoltaic power into the electricity grid is a growing concern for power system operators in a number of different regions. Power system operators typically accommodate variability, whether from load, wind, or solar, by carrying reserves that can quickly change their output to match changes in the solar resource. At timescales in the seconds-to-minutes range, this is known as regulation reserve. Previous studies have shown that increasing the geographic diversity of solar resources can reduce the short-term variability of the power output. As the price of solar has decreased, the emergence of very large PV plants (greater than 10 MW) has become more common. These plants present an interesting case because they are large enough to exhibit some spatial smoothing by themselves. This work examines the variability of solar PV output among different arrays in a large (≈50 MW) PV plant in the western United States, including the correlation in power output changes between different arrays, as well as the aggregated plant output, at timescales ranging from one second to five minutes.
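The spatial-smoothing effect the abstract describes is easy to demonstrate: averaging arrays whose fast fluctuations are independent reduces short-timescale ramp variability. A minimal sketch with synthetic one-second data (the signal model and numbers are illustrative assumptions, not plant data):

```python
import numpy as np

rng = np.random.default_rng(4)

# One-second output of 10 arrays in one plant: a shared slow irradiance
# component plus independent fast fluctuations per array.
n_s, n_arrays = 3600, 10
t = np.arange(n_s)
common = 1.0 + 0.1 * np.sin(2 * np.pi * t / 900.0)
arrays = common + 0.05 * rng.normal(size=(n_arrays, n_s))

def ramp_std(p, step):
    """Standard deviation of power changes over `step` seconds."""
    return float(np.std(p[step:] - p[:-step]))

single = ramp_std(arrays[0], 60)             # one array, 1-minute ramps
plant = ramp_std(arrays.mean(axis=0), 60)    # aggregate output smooths them
```

The aggregate retains the slow common ramps but averages down the independent fast fluctuations, so `plant` is markedly smaller than `single`.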
Adequation of mini satellites to oceanic altimetry missions
NASA Astrophysics Data System (ADS)
Bellaieche, G.; Aguttes, J. P.
1993-01-01
The suitability of the minisatellite concept for ocean altimetry missions is discussed. The mission definition and the most constraining requirements (mesoscale observation, for example) show minisatellites to be well suited to such missions. Progress in altimeter characteristics, orbit determination, and position reporting allows ocean altimetry missions using low-Earth-orbit satellites to be considered. The satellite constellation, ground-track keeping and orbital period, and required payload characteristics are presented. The mission requirements covering Sun-synchronous orbit, service area, ground system, and launcher characteristics, as well as the constellation maintenance strategy, are specified. Two options for the satellite, orbital mechanics, propulsion, onboard power and stabilization subsystems, onboard management, satellite-ground links, mechanical and thermal subsystems, budgets, and planning are discussed.
ERIC Educational Resources Information Center
Konstantopoulos, Spyros
2012-01-01
Field experiments with nested structures are becoming increasingly common, especially designs that assign randomly entire clusters such as schools to a treatment and a control group. In such large-scale cluster randomized studies the challenge is to obtain sufficient power of the test of the treatment effect. The objective is to maximize power…
Tests of Independence in Contingency Tables with Small Samples: A Comparison of Statistical Power.
ERIC Educational Resources Information Center
Parshall, Cynthia G.; Kromrey, Jeffrey D.
1996-01-01
Power and Type I error rates were estimated for contingency tables with small sample sizes for the following four types of tests: (1) Pearson's chi-square; (2) chi-square with Yates's continuity correction; (3) the likelihood ratio test; and (4) Fisher's Exact Test. Various marginal distributions, sample sizes, and effect sizes were examined. (SLD)
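The four tests compared in this study are all available in SciPy, so the comparison is easy to reproduce on any small table. A minimal example (the table values are made up for illustration):

```python
import numpy as np
from scipy import stats

table = np.array([[8, 2], [1, 5]])  # hypothetical small-sample 2x2 table

# (1) Pearson's chi-square, no correction
chi2, p_pearson, dof, expected = stats.chi2_contingency(table, correction=False)
# (2) Chi-square with Yates's continuity correction
_, p_yates, _, _ = stats.chi2_contingency(table, correction=True)
# (3) Likelihood-ratio (G) test, via the power-divergence family
_, p_lr, _, _ = stats.chi2_contingency(table, correction=False,
                                       lambda_="log-likelihood")
# (4) Fisher's Exact Test
oddsratio, p_fisher = stats.fisher_exact(table)
```

For 2x2 tables the Yates correction always shrinks the test statistic, so its p-value is never smaller than uncorrected Pearson's, one facet of the power trade-off the study quantifies.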
NASA Astrophysics Data System (ADS)
Barkana, Rennan; Loeb, Abraham
2008-03-01
A new generation of radio telescopes are currently being built with the goal of tracing the cosmic distribution of atomic hydrogen at redshifts 6-15 through its 21-cm line. The observations will probe the large-scale brightness fluctuations sourced by ionization fluctuations during cosmic reionization. Since detailed maps will be difficult to extract due to noise and foreground emission, efforts have focused on a statistical detection of the 21-cm fluctuations. During cosmic reionization, these fluctuations are highly non-Gaussian and thus more information can be extracted than just the one-dimensional function that is usually considered, i.e. the correlation function. We calculate a two-dimensional function that if measured observationally would allow a more thorough investigation of the properties of the underlying ionizing sources. This function is the probability distribution function (PDF) of the difference in the 21-cm brightness temperature between two points, as a function of the separation between the points. While the standard correlation function is determined by a complicated mixture of contributions from density and ionization fluctuations, we show that the difference PDF holds the key to separately measuring the statistical properties of the ionized regions.
NASA Technical Reports Server (NTRS)
Perry, Boyd, III; Pototzky, Anthony S.; Woods, Jessica A.
1989-01-01
The results of a NASA investigation of a claimed 'Overlap' between two gust response analysis methods, the Statistical Discrete Gust (SDG) method and the Power Spectral Density (PSD) method, are presented. The claim is that the ratio of an SDG response to the corresponding PSD response is 10.4. Analytical results presented for several different airplanes at several different flight conditions indicate that such an 'Overlap' does appear to exist. However, the claim was not met precisely: a scatter of up to about 10 percent about the 10.4 factor can be expected.
Is a vegetarian diet adequate for children.
Hackett, A; Nathan, I; Burgess, L
1998-01-01
The number of people who avoid eating meat is growing, especially among young people. Benefits to health from a vegetarian diet have been reported in adults, but it is not clear to what extent these benefits are due to diet or to other aspects of lifestyle. In children, concern has been expressed about the adequacy of vegetarian diets, especially with regard to growth. The risks/benefits seem to be related to the degree of restriction of the diet; anaemia is probably both the main and the most serious risk, but this also applies to omnivores. Vegan diets are more likely to be associated with malnutrition, especially if the diets are the result of authoritarian dogma. Overall, lacto-ovo-vegetarian children consume diets closer to recommendations than omnivores, and their pre-pubertal growth is at least as good. The simplest strategy when becoming vegetarian may involve reliance on vegetarian convenience foods, which are not necessarily superior in nutritional composition. The vegetarian sector of the food industry could do more to produce foods closer to recommendations. Vegetarian diets can be, but are not necessarily, adequate for children, provided vigilance is maintained, particularly to ensure variety. Identical comments apply to omnivorous diets. Three threats to the diet of children are too much reliance on convenience foods, lack of variety and lack of exercise. PMID: 9670174
Statistical modelling and power analysis for detecting trends in total suspended sediment loads
NASA Astrophysics Data System (ADS)
Wang, You-Gan; Wang, Shen S. J.; Dunlop, Jason
2015-01-01
The export of sediments from coastal catchments can have detrimental impacts on estuaries and near-shore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task due to the complex behaviour of constituents in natural streams, the variability of water flows and often a limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising the other time-specific covariates such as flow. This study investigates whether load estimates and resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised, sampling strategy, which adds uniformity to the event sampling strategy, the predictive power was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature.
The power of the optimal asymptotic tests of composite statistical hypotheses.
Singh, A C; Zhurbenko, I G
1975-02-01
The easily computable asymptotic power of the locally asymptotically optimal test of a composite hypothesis, known as the optimal C(α) test, is obtained through a "double" passage to the limit: the number n of observations is indefinitely increased while the conventional measure ξ of the error in the hypothesis tested tends to zero so that ξ_n · n^(1/2) → τ ≠ 0. Contrary to this, practical problems require information on power, say β(ξ, n), for a fixed ξ and for a fixed n. The present paper gives the upper and the lower bounds for β(ξ, n). These bounds can be used to estimate the rate of convergence of β(ξ, n) to unity as n → ∞. The results obtained can be extended to test criteria other than those labeled C(α). The study revealed a difference between situations in which the C(α) test criterion is used to test a simple or a composite hypothesis. This difference affects the rate of convergence of the actual probability of type I error to the preassigned level α. In the case of a simple hypothesis, the rate is of the order of n^(-1/2). In the case of a composite hypothesis, the best that it was possible to show is that the rate of convergence cannot be slower than of the order of n^(-1/2) ln n. PMID: 16592222
Fast fMRI provides high statistical power in the analysis of epileptic networks.
Jacobs, Julia; Stich, Julia; Zahneisen, Benjamin; Assländer, Jakob; Ramantani, Georgia; Schulze-Bonhage, Andreas; Korinthenberg, Rudolph; Hennig, Jürgen; LeVan, Pierre
2014-03-01
EEG-fMRI is a unique method to combine the high temporal resolution of EEG with the high spatial resolution of MRI to study generators of intrinsic brain signals such as sleep grapho-elements or epileptic spikes. While the standard EPI sequence in fMRI experiments has a temporal resolution of around 2.5-3s a newly established fast fMRI sequence called MREG (Magnetic-Resonance-Encephalography) provides a temporal resolution of around 100ms. This technical novelty promises to improve statistics, facilitate correction of physiological artifacts and improve the understanding of epileptic networks in fMRI. The present study compares simultaneous EEG-EPI and EEG-MREG analyzing epileptic spikes to determine the yield of fast MRI in the analysis of intrinsic brain signals. Patients with frequent interictal spikes (>3/20min) underwent EEG-MREG and EEG-EPI (3T, 20min each, voxel size 3×3×3mm, EPI TR=2.61s, MREG TR=0.1s). Timings of the spikes were used in an event-related analysis to generate activation maps of t-statistics. (FMRISTAT, |t|>3.5, cluster size: 7 voxels, p<0.05 corrected). For both sequences, the amplitude and location of significant BOLD activations were compared with the spike topography. 13 patients were recorded and 33 different spike types could be analyzed. Peak T-values were significantly higher in MREG than in EPI (p<0.0001). Positive BOLD effects correlating with the spike topography were found in 8/29 spike types using the EPI and in 22/33 spikes types using the MREG sequence. Negative BOLD responses in the default mode network could be observed in 3/29 spike types with the EPI and in 19/33 with the MREG sequence. With the latter method, BOLD changes were observed even when few spikes occurred during the investigation. Simultaneous EEG-MREG thus is possible with good EEG quality and shows higher sensitivity in regard to the localization of spike-related BOLD responses than EEG-EPI. The development of new methods of analysis for this sequence such as
Statistical power of detecting trends in total suspended sediment loads to the Great Barrier Reef.
Darnell, Ross; Henderson, Brent; Kroon, Frederieke J; Kuhnert, Petra
2012-01-01
The export of pollutant loads from coastal catchments is of primary interest to natural resource management. For example, Reef Plan, a joint initiative by the Australian Government and the Queensland Government, has indicated that a 20% reduction in sediment is required by 2020. There is an obvious need to consider our ability to detect any trend if we are to set realistic targets or to reliably identify changes to catchment loads. We investigate the number of years of monitoring aquatic pollutant loads necessary to detect trends. Instead of modelling the trend in the annual loads directly, given their strong relationship to flow, we consider trends through the reduction in concentration for a given flow. Our simulations show very low power (<40%) of detecting changes of 20% over time periods of several decades, indicating that the chances of detecting trends of reasonable magnitudes over these time frames are very small. PMID:22551850
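The low power reported here can be illustrated with a small Monte Carlo: simulate a 20% reduction in (log) concentration over 20 years with plausible interannual noise and count how often a regression slope test detects it. The noise level below is an illustrative assumption, not a value taken from the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

years = np.arange(20)              # annual observations over 20 years
trend = np.log(0.8) / 19           # 20% total reduction on the log scale
sigma = 0.5                        # assumed interannual noise in log load

n_sim, detected = 2000, 0
for _ in range(n_sim):
    y = trend * years + rng.normal(0.0, sigma, size=years.size)
    fit = stats.linregress(years, y)
    if fit.pvalue < 0.05 and fit.slope < 0:
        detected += 1
power = detected / n_sim           # fraction of simulations detecting the trend
```

With noise of this order the detection rate stays well under 40%, consistent with the abstract's conclusion that trends of reasonable magnitude are very hard to detect over these time frames.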
Weak lensing statistics as a probe of {OMEGA} and power spectrum.
NASA Astrophysics Data System (ADS)
Bernardeau, F.; van Waerbeke, L.; Mellier, Y.
1997-06-01
The possibility of detecting weak lensing effects from deep wide-field imaging surveys has opened new means of probing the large-scale structure of the Universe and measuring cosmological parameters. In this paper we present a systematic study of the expected dependence of the low-order moments of the filtered gravitational local convergence on the power spectrum of the density fluctuations and on the cosmological parameters Ω_0 and Λ. The results show a significant dependence on all these parameters. Though we note that this degeneracy could be partially raised by considering two populations of sources at different redshifts, computing the third moment is more promising since it is expected, in the quasi-linear regime and for Gaussian initial conditions, to be only Ω_0-dependent (with a slight degeneracy with Λ) when it is correctly expressed in terms of the second moment. More precisely, we show that the variance of the convergence varies approximately as P(k) Ω_0^1.5 z_s^1.5, whereas the skewness varies as Ω_0^-0.8 z_s^-1.35, where P(k) is the projected power spectrum and z_s the redshift of the sources. Thus, used jointly, they can provide both P(k) and Ω_0. However, the dependence on the redshift of the sources is large and could be a major concern for a practical implementation. We have estimated the errors expected for these parameters in a realistic scenario and sketched what would be the observational requirements for doing such measurements. A more detailed study of an observational strategy is left for a second paper.
Statistical power of multilevel modelling in dental caries clinical trials: a simulation study.
Burnside, G; Pine, C M; Williamson, P R
2014-01-01
Outcome data from dental caries clinical trials have a naturally hierarchical structure, with surfaces clustered within teeth, clustered within individuals. Data are often aggregated into the DMF index for each individual, losing tooth- and surface-specific information. If these data are to be analysed by tooth or surface, allowing exploration of effects of interventions on different teeth and surfaces, appropriate methods must be used to adjust for the clustered nature of the data. Multilevel modelling allows analysis of clustered data using individual observations without aggregating data, and has been little used in the field of dental caries. A simulation study was conducted to investigate the performance of multilevel modelling methods and standard caries increment analysis. Data sets were simulated from a three-level binomial distribution based on analysis of a caries clinical trial in Scottish adolescents, with varying sample sizes, treatment effects and random tooth level effects based on trials reported in Cochrane reviews of topical fluoride, and analysed to compare the power of multilevel models and traditional analysis. 40,500 data sets were simulated. Analysis showed that estimated power for the traditional caries increment method was similar to that for multilevel modelling, with more variation in smaller data sets. Multilevel modelling may not allow significant reductions in the number of participants required in a caries clinical trial, compared to the use of traditional analyses, but investigators interested in exploring the effect of their intervention in more detail may wish to consider the application of multilevel modelling to their clinical trial data. PMID:24216573
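A common back-of-envelope view of why clustering matters in caries trials is the design effect, DEFF = 1 + (m - 1) * ICC, which inflates the effective variance of surface-level outcomes. The sketch below applies the standard two-proportion sample-size formula with a design-effect correction; the numbers are illustrative assumptions, not values from the trial data analysed in the paper:

```python
import math

def n_per_arm(p0, p1, m, icc):
    """Participants per arm to compare two caries proportions when each
    participant contributes m correlated surface-level outcomes (ICC = icc).
    Normal-approximation formula at alpha = 0.05 (two-sided), 80% power."""
    z_a, z_b = 1.959964, 0.841621
    pbar = (p0 + p1) / 2.0
    # surfaces needed if all observations were independent
    n_flat = ((z_a + z_b) ** 2 * 2.0 * pbar * (1.0 - pbar)) / (p1 - p0) ** 2
    deff = 1.0 + (m - 1.0) * icc           # design effect from clustering
    return math.ceil(n_flat * deff / m)    # convert surfaces to participants

# Clustering raises the requirement substantially:
independent = n_per_arm(0.30, 0.20, m=128, icc=0.0)
clustered = n_per_arm(0.30, 0.20, m=128, icc=0.05)
```

This is only the variance-inflation view; the simulation study above asks the finer question of whether multilevel modelling recovers any of that lost efficiency relative to aggregate increment analysis.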
NASA Astrophysics Data System (ADS)
Dralle, D.; Karst, N.; Thompson, S. E.
2015-12-01
Multiple competing theories suggest that power law behavior governs the observed first-order dynamics of streamflow recessions - the important process by which catchments dry out via the stream network, altering the availability of surface water resources and in-stream habitat. Frequently modeled as dq/dt = -a q^b, recessions typically exhibit a high degree of variability, even within a single catchment, as revealed by significant shifts in the values of "a" and "b" across recession events. One potential source of this variability lies in underlying, hard-to-observe fluctuations in how catchment water storage is partitioned amongst distinct storage elements, each having different discharge behaviors. Testing this and competing hypotheses with widely available streamflow timeseries, however, has been hindered by a power law scaling artifact that obscures meaningful covariation between the recession parameters, "a" and "b". Here we briefly outline a technique that removes this artifact, revealing intriguing new patterns in the joint distribution of recession parameters. Using long-term flow data from catchments in Northern California, we explore temporal variations, and find that the "a" parameter varies strongly with catchment wetness. Then we explore how the "b" parameter changes with "a", and find that measures of its variation are maximized at intermediate "a" values. We propose an interpretation of this pattern based on statistical mechanics, meaning "b" can be viewed as an indicator of the catchment "microstate" - i.e. the partitioning of storage - and "a" as a measure of the catchment macrostate (i.e. the total storage). In statistical mechanics, entropy (i.e. microstate variance, that is the variance of "b") is maximized for intermediate values of extensive variables (i.e. wetness, "a"), as observed in the recession data. This interpretation of "a" and "b" was supported by model runs using a multiple-reservoir catchment toy model, and lends support to the
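The recession model dq/dt = -a q^b can be fitted from a discharge time series by regressing log(-dq/dt) on log q. A self-contained sketch on synthetic data (the parameter values and daily sampling are assumptions for illustration, not the Northern California analysis):

```python
import numpy as np

# Forward-simulate a recession dq/dt = -a * q**b with small Euler steps.
a_true, b_true, dt = 0.05, 1.5, 0.01
q = [10.0]
for _ in range(int(30 / dt)):             # 30 days of recession
    q.append(q[-1] - dt * a_true * q[-1] ** b_true)
q = np.asarray(q)

# Sample daily, estimate -dq/dt by first differences, and regress
# log(-dq/dt) on log(q): the slope recovers b, the intercept log(a).
daily = q[:: int(1 / dt)]
dqdt = -np.diff(daily)                    # per-day recession rate (positive)
q_mid = 0.5 * (daily[1:] + daily[:-1])
b_est, log_a = np.polyfit(np.log(q_mid), np.log(dqdt), 1)
a_est = np.exp(log_a)
```

Repeating this fit event by event yields the cloud of ("a", "b") pairs whose covariation, once the scaling artifact is removed, is the subject of the study.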
Kinetic and Statistical Analysis of Primary Circuit Water Chemistry Data in a VVER Power Plant
Nagy, Gabor; Tilky, Peter; Horvath, Akos; Pinter, Tamas; Schiller, Robert
2001-12-15
The results of chemical and radiochemical analyses of the primary circuit coolant liquid, obtained between 1995 and 1999 at the four VVER-type blocks of the Paks (Hungary) nuclear power station, are assessed. A model has been developed regarding the pressure vessel with its auxiliary parts plus the fuel elements as the zone, with the six steam generators as one single unit. The stream from the steam generator is split, with its larger part returning to the zone through the main circulating pump and the smaller one passing through the purifier column. Based on this flowchart, the formation kinetics of corrosion products and of radioactive substances are evaluated. Correlation analysis is applied to reveal any eventual interdependence of the processes, whereas the range-per-scatter (R/S) method is used to characterize the random or deterministic nature of a process. The evaluation of the t → ∞ limits of the kinetic equations enables one to conclude that (a) the total amount of corrosion products per element during one cycle is almost always <15 kg and (b) the zone acts as a highly efficient filter with an efficiency of ≈1. The R/S results show that the fluctuations in the concentrations of the corrosion products are persistent; this finding indicates that random effects play here little if any role and that the processes in the coolant are under control. Correlation analyses show that the variations of the concentrations are practically uncorrelated and that the processes are independent of each other.
ERIC Educational Resources Information Center
Blair, R. Clifford; Higgins, James J.
1980-01-01
Monte Carlo techniques were used to compare the power of Wilcoxon's rank-sum test to the power of the two independent means t test for situations in which samples were drawn from (1) uniform, (2) Laplace, (3) half-normal, (4) exponential, (5) mixed-normal, and (6) mixed-uniform distributions. (Author/JKS)
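This kind of Monte Carlo comparison is straightforward to reproduce with SciPy. The sketch below estimates the power of both tests for Laplace-distributed samples, one of the six cases studied; the sample size, shift, and simulation count are illustrative assumptions, not the paper's settings:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def mc_power(sampler, shift, n=30, n_sim=1000, alpha=0.05):
    """Monte Carlo power of the two-sample t test and the Wilcoxon
    rank-sum (Mann-Whitney) test for a location shift."""
    t_hits = w_hits = 0
    for _ in range(n_sim):
        x = sampler(n)
        y = sampler(n) + shift
        if stats.ttest_ind(x, y).pvalue < alpha:
            t_hits += 1
        if stats.mannwhitneyu(x, y, alternative="two-sided").pvalue < alpha:
            w_hits += 1
    return t_hits / n_sim, w_hits / n_sim

# Laplace (double-exponential) case: heavy tails favour the rank test.
t_pow, w_pow = mc_power(lambda size: rng.laplace(size=size), shift=0.8)
```

Under heavy-tailed distributions such as the Laplace, the rank-sum test's asymptotic relative efficiency exceeds 1, so its simulated power comes out above the t test's, in line with the study's theme.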
NASA Astrophysics Data System (ADS)
Mittendorfer, J.; Zwanziger, P.
2000-03-01
High-power bipolar semiconductor devices (thyristors and diodes) in a disc-type shape are key components (semiconductor switches) for high-power electronic systems. These systems are important for the economic design of energy transmission systems, e.g. high-power drive systems, static compensation and high-voltage DC transmission lines. At its factory in Pretzfeld, Germany, eupec GmbH+Co.KG (eupec) produces disc-type devices with ceramic encapsulation in the high-end range for the world market. These elements have to fulfil special customer requirements and therefore deliver tailor-made trade-offs between their on-state voltage and dynamic switching behaviour. This can be achieved by applying a dedicated electron irradiation to the semiconductor pellets, which tunes the trade-off. In this paper, the requirements placed on the irradiation company Mediscan GmbH are described from the point of view of the semiconductor manufacturer. The current strategy for controlling the irradiation results to fulfil these requirements is presented, together with the choice of relevant parameters from the viewpoint of the irradiation company. The set of process parameters monitored, using statistical process control (SPC) techniques, includes beam current and energy, conveyor speed and irradiation geometry. The results are highlighted and show the successful co-operation in this business. Conversely, an idea is presented and discussed for developing a highly sensitive dose-detection device from modified diodes, which could serve as accurate yet cheap and easy-to-use routine dosimeters for irradiation institutes.
21 CFR 201.5 - Drugs; adequate directions for use.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Drugs; adequate directions for use. 201.5 Section...) DRUGS: GENERAL LABELING General Labeling Provisions § 201.5 Drugs; adequate directions for use. Adequate directions for use means directions under which the layman can use a drug safely and for the purposes...
21 CFR 201.5 - Drugs; adequate directions for use.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Drugs; adequate directions for use. 201.5 Section...) DRUGS: GENERAL LABELING General Labeling Provisions § 201.5 Drugs; adequate directions for use. Adequate directions for use means directions under which the layman can use a drug safely and for the purposes...
4 CFR 200.14 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 4 Accounts 1 2010-01-01 2010-01-01 false Responsibility for maintaining adequate safeguards. 200.14 Section 200.14 Accounts RECOVERY ACCOUNTABILITY AND TRANSPARENCY BOARD PRIVACY ACT OF 1974 § 200.14 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate technical, physical, and...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate technical, physical, and security...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 4 2012-01-01 2012-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate technical, physical, and security...
4 CFR 200.14 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 4 Accounts 1 2011-01-01 2011-01-01 false Responsibility for maintaining adequate safeguards. 200....14 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining adequate technical, physical, and security safeguards to prevent unauthorized disclosure...
Comnes, G.A.; Belden, T.N.; Kahn, E.P.
1995-02-01
The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been completed or are under development in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively acquired electricity.
NASA Astrophysics Data System (ADS)
Kolchev, K. K.; Mezin, S. V.
2015-07-01
A technique for constructing mathematical models simulating the technological processes in thermal power equipment, developed on the basis of the statistical approximation method, is described. The method was used in the developed software module (plug-in) intended for calculating nonlinear mathematical models of gas turbine units and for diagnosing them. The mathematical models constructed using this module describe the current state of a system. Deviations of the system's actual state from the estimate obtained using the mathematical model point to malfunctions in the operation of this system. The multidimensional interpolation and approximation method and the theory of random functions serve as the theoretical basis of the developed technique. Using the developed technique, it is possible to construct complex static models of plants that are subject to control and diagnostics. The module developed using the proposed technique makes it possible to carry out periodic diagnostics of operating equipment to reveal deviations from its normal mode of operation. The specific features of constructing such mathematical models are considered, and examples of their application to observations obtained on gas turbine equipment are given.
ERIC Educational Resources Information Center
Tabor, Josh
2010-01-01
On the 2009 AP[c] Statistics Exam, students were asked to create a statistic to measure skewness in a distribution. This paper explores several of the most popular student responses and evaluates which statistic performs best when sampling from various skewed populations. (Contains 8 figures, 3 tables, and 4 footnotes.)
NASA Technical Reports Server (NTRS)
Zimmerman, G. A.; Olsen, E. T.
1992-01-01
Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
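The order-statistic and threshold-and-count estimators discussed above can be sketched for exponentially distributed spectral power samples (the natural noise model for a power spectrum bin). The contamination level and thresholds below are illustrative assumptions, not HRMS system parameters:

```python
import numpy as np

rng = np.random.default_rng(3)

# Spectral power samples: exponential noise plus a few strong "signal" bins.
noise_power = 2.0
x = rng.exponential(noise_power, size=4096)
x[:8] += 50.0                     # contaminating signal bins

# Order-statistic estimate: the median of an exponential distribution is
# noise_power * ln 2, so invert that relation (robust to the outliers).
est_median = np.median(x) / np.log(2)

# Single-pass threshold-and-count alternative: count exceedances of a
# trial threshold T; for exponential noise P(x > T) = exp(-T / p), so
# p can be recovered from the exceedance fraction.
T = 3.0
frac = np.count_nonzero(x > T) / x.size
est_count = -T / np.log(frac)

# CFAR detection threshold for a 1e-3 per-bin false-alarm rate.
threshold = -est_median * np.log(1e-3)
```

The threshold-and-count form needs only a comparator and a counter per pass, which is the hardware-simplicity argument the abstract makes; several counters at staggered thresholds extend the usable dynamic range.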
NASA Astrophysics Data System (ADS)
Hoover, F. A.; Bowling, L. C.; Prokopy, L. S.
2015-12-01
Urban stormwater is an ongoing management concern in municipalities of all sizes. In both combined and separate sewer systems, pollutants from stormwater runoff enter the natural waterway system during heavy rain events. Urban flooding during frequent and more intense storms is also a growing concern. Therefore, stormwater best-management practices (BMPs) are being implemented in efforts to reduce and manage stormwater pollution and overflow. The majority of BMP water quality studies focus on the small-scale, individual effects of a BMP and the change in water quality directly in the runoff from these installations. At the watershed scale, it is difficult to establish statistically whether or not these BMPs are making a difference in water quality, given that watershed-scale monitoring is often costly and time consuming, relying on significant sources of funds, which a city may not have. Hence, there is a need to quantify the level of sampling needed to detect the water quality impact of BMPs at the watershed scale. In this study, a power analysis was performed on data from an urban watershed in Lafayette, Indiana, to determine the frequency of sampling required to detect a significant change in water quality measurements. Using the R platform, results indicate that detecting a significant change in watershed-level water quality would require hundreds of weekly measurements, even when improvement is present. The second part of this study investigates whether the difficulty in demonstrating water quality change represents a barrier to adoption of stormwater BMPs. Semi-structured interviews of community residents and organizations in Chicago, IL are being used to investigate residents' understanding of water quality and best management practices and to identify their attitudes and perceptions towards stormwater BMPs. Second-round interviews will examine how information on uncertainty in water quality improvements influences their BMP attitudes and perceptions.
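The kind of power analysis described above can be sketched with a small Monte Carlo simulation (in Python rather than the R platform the study used). Every number here is a hypothetical stand-in, not the Lafayette data: lognormal concentrations with a coefficient of variation of 1.0, an assumed 20% improvement after BMP installation, and a two-sample t-test on log concentrations:

```python
import numpy as np
from scipy import stats

def power_for_n_weeks(n_weeks, improvement=0.20, cv=1.0, n_sims=2000, seed=1):
    """Monte Carlo power of a two-sample t-test comparing n_weeks of weekly
    concentration samples before vs. after BMP installation.

    cv is the coefficient of variation of the lognormal concentrations;
    urban stormwater data are often at least this noisy."""
    rng = np.random.default_rng(seed)
    sigma = np.sqrt(np.log(1.0 + cv**2))   # lognormal log-sd implied by the CV
    mu_before = -0.5 * sigma**2            # so the mean concentration is 1.0
    mu_after = mu_before + np.log(1.0 - improvement)
    hits = 0
    for _ in range(n_sims):
        before = rng.lognormal(mu_before, sigma, n_weeks)
        after = rng.lognormal(mu_after, sigma, n_weeks)
        # test on log-concentrations: a multiplicative change model
        if stats.ttest_ind(np.log(before), np.log(after)).pvalue < 0.05:
            hits += 1
    return hits / n_sims

print(power_for_n_weeks(52))    # one year of weekly samples
print(power_for_n_weeks(260))   # five years of weekly samples
```

With these assumed values, a single year of weekly samples leaves power far below the conventional 0.8 target, echoing the finding that hundreds of measurements are needed even when improvement is present.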
NASA Astrophysics Data System (ADS)
Ruecker, Gernot; Leimbach, David; Guenther, Felix; Barradas, Carol; Hoffmann, Anja
2016-04-01
Fire Radiative Power (FRP) retrieved by infrared sensors, such as those flown on several polar-orbiting and geostationary satellites, has been shown to be proportional to fuel consumption rates in vegetation fires; hence the total radiative energy released by a fire (Fire Radiative Energy, FRE) is proportional to the total amount of biomass burned. However, due to the sparse temporal coverage of polar-orbiting sensors and the coarse spatial resolution of geostationary sensors, it is difficult to estimate fuel consumption for single fire events. Here we explore an approach for estimating FRE through temporal integration of MODIS FRP retrievals over MODIS-derived burned areas. Temporal integration is aided by statistical modelling to estimate missing observations using a generalized additive model (GAM), taking advantage of additional information such as land cover and a global dataset of the Canadian Fire Weather Index (FWI), as well as diurnal and annual FRP fluctuation patterns. Based on results from study areas located in savannah regions of Southern and Eastern Africa and Brazil, we compare this method to estimates based on simple temporal integration of FRP retrievals over the fire lifetime, and estimate the potential variability of FRP integration results across a range of fire sizes. We compare FRE-based fuel consumption against a database of field experiments in similar landscapes. Results show that for larger fires, this method yields realistic estimates and is more robust than simple temporal integration when only a small number of observations is available. Finally, we offer an outlook on the integration of data from other satellites, specifically FireBird, S-NPP VIIRS and Sentinel-3, as well as on using higher-resolution burned area datasets derived from Landsat and similar sensors.
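The simple temporal-integration baseline can be sketched as follows. The overpass times and FRP values are invented for illustration, and the ~0.368 kg of dry biomass per MJ of FRE conversion factor is the commonly cited field-derived value, assumed here rather than taken from this study:

```python
import numpy as np

# Hypothetical FRP retrievals (MW) at irregular overpass times (hours since
# first detection); real MODIS sampling is similarly sparse and uneven.
t_hours = np.array([0.0, 1.5, 12.0, 13.5, 24.0, 36.5, 48.0])
frp_mw = np.array([150.0, 220.0, 90.0, 310.0, 60.0, 40.0, 0.0])

# Simple temporal integration of FRP over the fire lifetime (trapezoidal
# rule): FRE in MJ is the integral of power in MW over time in seconds.
dt_s = np.diff(t_hours) * 3600.0
fre_mj = float(np.sum(0.5 * (frp_mw[1:] + frp_mw[:-1]) * dt_s))

# Fuel consumed, assuming ~0.368 kg of dry biomass per MJ of radiative energy.
fuel_kg = 0.368 * fre_mj
print(fre_mj, fuel_kg)
```

The GAM-based approach in the abstract replaces the straight-line interpolation between sparse overpasses with modelled diurnal and weather-driven FRP fluctuations before integrating.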
7 CFR 4290.200 - Adequate capital for RBICs.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 15 2011-01-01 2011-01-01 false Adequate capital for RBICs. 4290.200 Section 4290.200 Agriculture Regulations of the Department of Agriculture (Continued) RURAL BUSINESS-COOPERATIVE SERVICE AND... Qualifications for the RBIC Program Capitalizing A Rbic § 4290.200 Adequate capital for RBICs. You must meet...
13 CFR 107.200 - Adequate capital for Licensees.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Adequate capital for Licensees... INVESTMENT COMPANIES Qualifying for an SBIC License Capitalizing An Sbic § 107.200 Adequate capital for... Licensee, and to receive Leverage. (a) You must have enough Regulatory Capital to provide...
13 CFR 107.200 - Adequate capital for Licensees.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Adequate capital for Licensees... INVESTMENT COMPANIES Qualifying for an SBIC License Capitalizing An Sbic § 107.200 Adequate capital for... Licensee, and to receive Leverage. (a) You must have enough Regulatory Capital to provide...
7 CFR 4290.200 - Adequate capital for RBICs.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 15 2010-01-01 2010-01-01 false Adequate capital for RBICs. 4290.200 Section 4290.200 Agriculture Regulations of the Department of Agriculture (Continued) RURAL BUSINESS-COOPERATIVE SERVICE AND... Qualifications for the RBIC Program Capitalizing A Rbic § 4290.200 Adequate capital for RBICs. You must meet...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 4 2011-01-01 2011-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Adequate file search. 716.25 Section 716.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 2 2011-07-01 2011-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 2 2012-07-01 2012-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 2 2013-07-01 2013-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 4 2014-01-01 2014-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 4 2013-01-01 2013-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...
10 CFR 503.35 - Inability to obtain adequate capital.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Inability to obtain adequate capital. 503.35 Section 503.35 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS NEW FACILITIES Permanent Exemptions for New Facilities § 503.35 Inability to obtain adequate capital. (a) Eligibility. Section 212(a)(1)(D)...
10 CFR 503.35 - Inability to obtain adequate capital.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 4 2011-01-01 2011-01-01 false Inability to obtain adequate capital. 503.35 Section 503.35 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS NEW FACILITIES Permanent Exemptions for New Facilities § 503.35 Inability to obtain adequate capital. (a) Eligibility. Section 212(a)(1)(D)...
15 CFR 970.404 - Adequate exploration plan.
Code of Federal Regulations, 2011 CFR
2011-01-01
... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must find... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Adequate exploration plan....
15 CFR 970.404 - Adequate exploration plan.
Code of Federal Regulations, 2010 CFR
2010-01-01
... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must find... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Adequate exploration plan....
"Something Adequate"? In Memoriam Seamus Heaney, Sister Quinlan, Nirbhaya
ERIC Educational Resources Information Center
Parker, Jan
2014-01-01
Seamus Heaney talked of poetry's responsibility to represent the "bloody miracle", the "terrible beauty" of atrocity; to create "something adequate". This article asks, what is adequate to the burning and eating of a nun and the murderous gang rape and evisceration of a medical student? It considers Njabulo…
Predict! Teaching Statistics Using Informal Statistical Inference
ERIC Educational Resources Information Center
Makar, Katie
2013-01-01
Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…
NASA Astrophysics Data System (ADS)
Olson, Kyle David
A model is presented and confirmed experimentally that explains the anomalous behavior observed in the continuous wave (CW) excitation of thermally isolated optics. Very low-absorption, highly reflective optical thin-film coatings of HfO2 and SiO2 were prepared. When illuminated with a laser for 30 s, the coatings survived peak irradiances of 13 MW/cm^2. The temperature profile of the optical surfaces was measured using a calibrated thermal imaging camera; about the same peak temperatures were recorded regardless of spot size, which ranged between 500 μm and 5 mm. This phenomenon is explained by solving the heat diffusion equation for an optic of finite dimensions, including the non-idealities of the measurement. An analytical result is also derived showing the transition from millisecond pulses to CW, where the heating is proportional to the laser irradiance (W/m^2) for millisecond pulses, and proportional to the beam radius (W/m) for CW. Contamination-induced laser breakdown is often viewed as random, and simple physical models are difficult to apply. Under continuous-wave illumination conditions, failure appears to be induced by a runaway free-carrier absorption process. High-power laser illumination is absorbed by the contaminant particles or regions, which heat rapidly. Some of this heat transfers to the substrate, raising its temperature towards that of the vaporizing particle. This generates free carriers, causing more absorption and more heating. If a certain threshold concentration is created, the process becomes unstable, thermally heating the material to catastrophic breakdown. Contamination-induced breakdown is exponentially bandgap dependent, and this prediction is borne out in experimental data from TiO2, Ta2O5, HfO2, Al2O3, and SiO2. The spectral dependence of blackbody radiation and thermal photon noise is derived analytically for the first time as a function of spectra and mode density. An algorithm by which the analytical expression for the variance can
ERIC Educational Resources Information Center
Safarkhani, Maryam; Moerbeek, Mirjam
2013-01-01
In a randomized controlled trial, a decision needs to be made about the total number of subjects for adequate statistical power. One way to increase the power of a trial is by including a predictive covariate in the model. In this article, the effects of various covariate adjustment strategies on increasing the power is studied for discrete-time…
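The effect of covariate adjustment on power can be illustrated with a minimal simulation under assumed effect sizes; the ANCOVA-style residualization below is a generic sketch, not the specific trial designs the article studies:

```python
import numpy as np
from scipy import stats

def trial_power(n_per_arm, effect=0.3, rho=0.6, adjust=True, n_sims=3000, seed=7):
    """Monte Carlo power of a two-arm randomized trial analysis.

    The outcome correlates (rho) with a baseline covariate. With
    adjust=True the covariate is regressed out before testing the
    treatment effect (an ANCOVA-style analysis); with adjust=False a
    plain two-sample t-test is used."""
    rng = np.random.default_rng(seed)
    arm = np.r_[np.zeros(n_per_arm), np.ones(n_per_arm)]
    hits = 0
    for _ in range(n_sims):
        x = rng.standard_normal(arm.size)           # predictive baseline covariate
        noise = np.sqrt(1.0 - rho**2) * rng.standard_normal(arm.size)
        y = effect * arm + rho * x + noise
        if adjust:
            slope, intercept = np.polyfit(x, y, 1)  # residualize y on x
            y = y - (slope * x + intercept)
        if stats.ttest_ind(y[arm == 1], y[arm == 0]).pvalue < 0.05:
            hits += 1
    return hits / n_sims

print(trial_power(50, adjust=False))   # unadjusted analysis
print(trial_power(50, adjust=True))    # covariate adjustment raises power
```

Adjustment shrinks the residual standard deviation from 1.0 to sqrt(1 - rho^2), which is why the same total number of subjects buys more power.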
NASA Astrophysics Data System (ADS)
Shergill, H.; Robinson, T. R.; Dhillon, R. S.; Lester, M.; Milan, S. E.; Yeoman, T. K.
2010-05-01
High-power electromagnetic waves can excite a variety of plasma instabilities in Earth's ionosphere. These lead to the growth of plasma waves and plasma density irregularities within the heated volume, including patches of small-scale field-aligned electron density irregularities. This paper reports a statistical study of intensity distributions in patches of these irregularities excited by the European Incoherent Scatter (EISCAT) heater during beam-sweeping experiments. The irregularities were detected by the Co-operative UK Twin Located Auroral Sounding System (CUTLASS) coherent scatter radar located in Finland. During these experiments the heater beam direction is steadily changed from northward to southward pointing. Comparisons are made between statistical parameters of CUTLASS backscatter power distributions and modeled heater beam power distributions provided by the EZNEC version 4 software. In general, good agreement between the statistical parameters and the modeled beam is observed, clearly indicating the direct causal connection between the heater beam and the irregularities, despite the sometimes seemingly unpredictable nature of unaveraged results. The results also give compelling evidence in support of the upper hybrid theory of irregularity excitation.
Dziak, John J.; Lanza, Stephanie T.; Tan, Xianming
2014-01-01
Selecting the number of different classes which will be assumed to exist in the population is an important step in latent class analysis (LCA). The bootstrap likelihood ratio test (BLRT) provides a data-driven way to evaluate the relative adequacy of a (K −1)-class model compared to a K-class model. However, very little is known about how to predict the power or the required sample size for the BLRT in LCA. Based on extensive Monte Carlo simulations, we provide practical effect size measures and power curves which can be used to predict power for the BLRT in LCA given a proposed sample size and a set of hypothesized population parameters. Estimated power curves and tables provide guidance for researchers wishing to size a study to have sufficient power to detect hypothesized underlying latent classes. PMID:25328371
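The Monte Carlo power-curve idea can be sketched with a deliberately simplified stand-in for the BLRT: under a 2-class model two binary items are marginally dependent, while a 1-class model implies independence, so a chi-square test of independence plays the role of the "more than one class" test. The class proportions and item probabilities are invented; the real BLRT refits both latent class models and bootstraps the likelihood ratio statistic:

```python
import numpy as np
from scipy import stats

def simulate_two_class(n, rng, p_class1=0.5, p_yes=(0.8, 0.2)):
    """Two binary items from a 2-class latent class model: items are
    independent within a class, with P(yes) depending on the class."""
    in_class1 = rng.random(n) < p_class1
    p = np.where(in_class1, p_yes[0], p_yes[1])
    item1 = (rng.random(n) < p).astype(int)
    item2 = (rng.random(n) < p).astype(int)
    return item1, item2

def power_curve(sample_sizes, n_sims=1000, seed=11):
    """Power to reject the 1-class model, which here implies item
    independence, via a chi-square test on the 2x2 item table."""
    rng = np.random.default_rng(seed)
    powers = []
    for n in sample_sizes:
        hits = 0
        for _ in range(n_sims):
            a, b = simulate_two_class(n, rng)
            table = np.array([[np.sum((a == i) & (b == j)) for j in (0, 1)]
                              for i in (0, 1)])
            chi2, pvalue, dof, expected = stats.chi2_contingency(table)
            hits += pvalue < 0.05
        powers.append(hits / n_sims)
    return powers

print(power_curve([25, 50, 100, 200]))  # power rises with sample size
```

Reading the required sample size off such a simulated curve, given hypothesized population parameters, is exactly the workflow the article's effect size measures and power tables support for the BLRT.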
Arabidopsis: An Adequate Model for Dicot Root Systems?
Zobel, Richard W
2016-01-01
The Arabidopsis root system is frequently considered to have only three classes of root: primary, lateral, and adventitious. Research with other plant species has suggested up to eight different developmental/functional classes of root for a given plant root system. If Arabidopsis has only three classes of root, it may not be an adequate model for eudicot plant root systems. Recent research, however, can be interpreted to suggest that pre-flowering Arabidopsis does have at least five of these classes of root. This then suggests that Arabidopsis root research can be considered an adequate model for dicot plant root systems. PMID:26904040
Lotterhos, Katie E; Whitlock, Michael C
2015-03-01
Although genome scans have become a popular approach towards understanding the genetic basis of local adaptation, the field still does not have a firm grasp on how sampling design and demographic history affect the performance of genome scans on complex landscapes. To explore these issues, we compared 20 different sampling designs in equilibrium (i.e. island model and isolation by distance) and nonequilibrium (i.e. range expansion from one or two refugia) demographic histories in spatially heterogeneous environments. We simulated spatially complex landscapes, which allowed us to exploit local maxima and minima in the environment in 'pair' and 'transect' sampling strategies. We compared FST outlier and genetic-environment association (GEA) methods for each of two approaches that control for population structure: with a covariance matrix or with latent factors. We show that while the relative power of two methods in the same category (FST or GEA) depended largely on the number of individuals sampled, overall GEA tests had higher power in the island model and FST had higher power under isolation by distance. In the refugia models, however, these methods varied in their power to detect local adaptation at weakly selected loci. At weakly selected loci, paired sampling designs had equal or higher power than transect or random designs to detect local adaptation. Our results can inform sampling designs for studies of local adaptation and have important implications for the interpretation of genome scans based on landscape data. PMID:25648189
Is the Marketing Concept Adequate for Continuing Education?
ERIC Educational Resources Information Center
Rittenburg, Terri L.
1984-01-01
Because educators have a social responsibility to those they teach, the marketing concept may not be adequate as a philosophy for continuing education. In attempting to broaden the audience for continuing education, educators should consider a societal marketing concept to meet the needs of the educationally disadvantaged. (SK)
Comparability and Reliability Considerations of Adequate Yearly Progress
ERIC Educational Resources Information Center
Maier, Kimberly S.; Maiti, Tapabrata; Dass, Sarat C.; Lim, Chae Young
2012-01-01
The purpose of this study is to develop an estimate of Adequate Yearly Progress (AYP) that will allow for reliable and valid comparisons among student subgroups, schools, and districts. A shrinkage-type estimator of AYP using the Bayesian framework is described. Using simulated data, the performance of the Bayes estimator will be compared to…
9 CFR 305.3 - Sanitation and adequate facilities.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY ORGANIZATION AND TERMINOLOGY; MANDATORY MEAT AND POULTRY PRODUCTS INSPECTION AND VOLUNTARY INSPECTION AND CERTIFICATION...
Understanding Your Adequate Yearly Progress (AYP), 2011-2012
ERIC Educational Resources Information Center
Missouri Department of Elementary and Secondary Education, 2011
2011-01-01
The "No Child Left Behind Act (NCLB) of 2001" requires all schools, districts/local education agencies (LEAs) and states to show that students are making Adequate Yearly Progress (AYP). NCLB requires states to establish targets in the following ways: (1) Annual Proficiency Target; (2) Attendance/Graduation Rates; and (3) Participation Rates.…
Assessing Juvenile Sex Offenders to Determine Adequate Levels of Supervision.
ERIC Educational Resources Information Center
Gerdes, Karen E.; And Others
1995-01-01
This study analyzed the internal consistency of four inventories used by Utah probation officers to determine adequate and efficacious supervision levels and placement for juvenile sex offenders. Three factors accounted for 41.2 percent of variance (custodian's and juvenile's attitude toward intervention, offense characteristics, and historical…
34 CFR 200.13 - Adequate yearly progress in general.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 34 Education 1 2011-07-01 2011-07-01 false Adequate yearly progress in general. 200.13 Section 200.13 Education Regulations of the Offices of the Department of Education OFFICE OF ELEMENTARY AND SECONDARY EDUCATION, DEPARTMENT OF EDUCATION TITLE I-IMPROVING THE ACADEMIC ACHIEVEMENT OF THE...
34 CFR 200.20 - Making adequate yearly progress.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 34 Education 1 2011-07-01 2011-07-01 false Making adequate yearly progress. 200.20 Section 200.20 Education Regulations of the Offices of the Department of Education OFFICE OF ELEMENTARY AND SECONDARY EDUCATION, DEPARTMENT OF EDUCATION TITLE I-IMPROVING THE ACADEMIC ACHIEVEMENT OF THE DISADVANTAGED...
Do Beginning Teachers Receive Adequate Support from Their Headteachers?
ERIC Educational Resources Information Center
Menon, Maria Eliophotou
2012-01-01
The article examines the problems faced by beginning teachers in Cyprus and the extent to which headteachers are considered to provide adequate guidance and support to them. Data were collected through interviews with 25 school teachers in Cyprus, who had recently entered teaching (within 1-5 years) in public primary schools. According to the…
This study assessed the pollutant emission offset potential of distributed grid-connected photovoltaic (PV) power systems. Computer-simulated performance results were utilized for 211 PV systems located across the U.S. The PV systems' monthly electrical energy outputs were based ...
34 CFR 200.20 - Making adequate yearly progress.
Code of Federal Regulations, 2010 CFR
2010-07-01
... number of students in the group must be sufficient to yield statistically reliable information under... the reading/language arts and mathematics assessments in the three grade spans required under § 200.5... disabilities, respectively, is sufficient to yield statistically reliable information under § 200.7(a); or...
34 CFR 200.20 - Making adequate yearly progress.
Code of Federal Regulations, 2013 CFR
2013-07-01
... number of students in the group must be sufficient to yield statistically reliable information under... disabilities, respectively, is sufficient to yield statistically reliable information under § 200.7(a); or (B... students. (iii) For the purpose of reporting information on report cards under section 1111(h) of the...
34 CFR 200.20 - Making adequate yearly progress.
Code of Federal Regulations, 2012 CFR
2012-07-01
... number of students in the group must be sufficient to yield statistically reliable information under... disabilities, respectively, is sufficient to yield statistically reliable information under § 200.7(a); or (B... students. (iii) For the purpose of reporting information on report cards under section 1111(h) of the...
34 CFR 200.20 - Making adequate yearly progress.
Code of Federal Regulations, 2014 CFR
2014-07-01
... number of students in the group must be sufficient to yield statistically reliable information under... disabilities, respectively, is sufficient to yield statistically reliable information under § 200.7(a); or (B... students. (iii) For the purpose of reporting information on report cards under section 1111(h) of the...
Maintaining adequate hydration and nutrition in adult enteral tube feeding.
Dunn, Sasha
2015-01-01
Predicting the nutritional and fluid requirements of enterally-fed patients can be challenging and the practicalities of ensuring adequate delivery must be taken into consideration. Patients who are enterally fed can be more reliant on clinicians, family members and carers to meet their nutrition and hydration needs and identify any deficiencies, excesses or problems with delivery. Estimating a patient's requirements can be challenging due to the limitations of using predictive equations in the clinical setting. Close monitoring by all those involved in the patient's care, as well as regular review by a dietitian, is therefore required to balance the delivery of adequate feed and fluids to meet each patient's individual needs and prevent the complications of malnutrition and dehydration. Increasing the awareness of the signs of malnutrition and dehydration in patients receiving enteral tube feeding among those involved in a patient's care will help any deficiencies to be detected early on and rectified before complications occur. PMID:26087203
Assessing juvenile sex offenders to determine adequate levels of supervision.
Gerdes, K E; Gourley, M M; Cash, M C
1995-08-01
The present study analyzed the internal consistency of four inventories currently being used by probation officers in the state of Utah to determine adequate and efficacious supervision levels and placement for juvenile sex offenders. The internal consistency or reliability of the inventories ranged from moderate to good. Factor analysis was utilized to significantly increase the reliability of the four inventories by collapsing them into the following three factors: (a) Custodian's and Juvenile's Attitude Toward Intervention; (b) Offense Characteristics; and (c) Historical Risk Factors. These three inventories/factors explained 41.2% of the variance in the combined inventories' scores. Suggestions are made regarding the creation of an additional inventory, "Characteristics of the Victim," to account for more of the variance. In addition, suggestions as to how these inventories can be used by probation officers to make objective and consistent decisions about adequate supervision levels and placement for juvenile sex offenders are discussed. PMID:7583754
Wharton, S.; Bulaevskaya, V.; Irons, Z.; Qualley, G.; Newman, J. F.; Miller, W. O.
2015-09-28
The goal of our FY15 project was to explore the use of statistical models and high-resolution atmospheric input data to develop more accurate prediction models for turbine power generation. We modeled power for two operational wind farms in two regions of the country. The first site is a 235 MW wind farm in Northern Oklahoma with 140 GE 1.68 turbines. Our second site is a 38 MW wind farm in the Altamont Pass Region of Northern California with 38 Mitsubishi 1 MW turbines. The farms are very different in topography, climatology, and turbine technology; however, both occupy high wind resource areas in the U.S. and are representative of typical wind farms found in their respective areas.
Influence of the fluid density on the statistics of power fluctuations in von Kármán swirling flows
NASA Astrophysics Data System (ADS)
Opazo, A.; Sáez, A.; Bustamante, G.; Labbé, R.
2016-02-01
Here, we report experimental results on the fluctuations of injected power in confined turbulence. Specifically, we have studied a von Kármán swirling flow with constant external torque applied to the stirrers. Two experiments were performed at nearly equal Reynolds numbers, in geometrically similar experimental setups. Air was utilized in one of them and water in the other. With air, it was found that the probability density function of power fluctuations is strongly asymmetric, while with water, it is nearly Gaussian. This suggests that the outcome of a big change of the fluid density in the flow-stirrer interaction is not simply a change in the amplitude of stirrers' response. In the case of water, with a density roughly 830 times greater than air density, the coupling between the flow and the stirrers is stronger, so that they follow more closely the fluctuations of the average rotation of the nearby flow. When the fluid is air, the coupling is much weaker. The result is not just a smaller response of the stirrers to the torque exerted by the flow; the PDF of the injected power becomes strongly asymmetric and its spectrum acquires a broad region that scales as f^-2. Thus, the asymmetry of the probability density functions of torque or angular speed could be related to the inability of the stirrers to respond to flow stresses. This happens, for instance, when the torque exerted by the flow is weak, due to small fluid density, or when the stirrers' moment of inertia is large. Moreover, a correlation analysis reveals that the features of the energy transfer dynamics with water are qualitatively and quantitatively different to what is observed with air as working fluid.
Zhang, Han; Wheeler, William; Hyland, Paula L.; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai
2016-01-01
Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary-based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in neighborhoods of 170 GWAS established T2D loci were excluded, we detected 43 T2D globally significant pathways (with Bonferroni corrected p-values < 0.05), which included the insulin signaling pathway and T2D pathway defined by KEGG, as well as the pathways defined according to specific gene expression patterns on pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed 7 out of the 43 pathways identified in European populations remained significant in eastern Asians at the false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs. PMID:27362418
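The core adaptive rank truncated product combination step can be sketched as follows. This toy version simulates null SNP-level p-values as independent, whereas sARTP draws them using the genotype correlation estimated from a reference panel; the truncation points and example p-values are invented:

```python
import numpy as np
from scipy.stats import rankdata

def artp_pvalue(pvals, truncation_points=(1, 5, 10), n_sim=2000, seed=5):
    """Adaptive rank truncated product (ARTP) combination of SNP p-values.

    For each truncation point K the statistic is the product of the K
    smallest p-values (on the -log scale); the adaptive minimum over K is
    then calibrated against simulated null p-values. Independence of SNPs
    is assumed here, unlike in sARTP."""
    rng = np.random.default_rng(seed)
    pvals = np.asarray(pvals, dtype=float)
    ks = [k for k in truncation_points if k <= pvals.size]

    def stats_for(p):                     # larger = more significant
        s = np.sort(p)
        return np.array([-np.log(s[:k]).sum() for k in ks])

    obs = stats_for(pvals)
    null = np.array([stats_for(rng.random(pvals.size)) for _ in range(n_sim)])
    obs_p = (null >= obs).mean(axis=0)    # per-K empirical p-values
    null_p = 1.0 - (rankdata(null, axis=0) - 1.0) / n_sim
    # adapt over K by taking the minimum, then correct for that adaptivity
    return float((null_p.min(axis=1) <= obs_p.min()).mean())

gene_pvals = np.r_[np.full(10, 0.4), [1e-4, 5e-4, 1e-3, 2e-3, 5e-3]]
p_gene = artp_pvalue(gene_pvals)         # a "gene" with five associated SNPs
p_flat = artp_pvalue(np.full(15, 0.5))   # a gene with no signal
print(p_gene, p_flat)
```

The second calibration layer (comparing the minimum over K against its own null distribution) is what lets the procedure adapt the truncation point without inflating type I error.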
Nagelkerke, Leopold A J; van Densen, Wim L T
2007-02-01
We studied the effects of inter-annual variability and serial correlation on the statistical power of monitoring schemes to detect trends in biomass of bream (Abramis brama) in Lake Veluwemeer (The Netherlands). In order to distinguish between 'true' system variability and sampling variability, we simulated the development of the bream population, using estimates for population structure and growth, and compared the resulting inter-annual variabilities and serial correlations with those from field data. In all cases the inter-annual variability in the field data was larger than in simulated data (e.g. for total biomass of all assessed bream, σ = 0.45 in field data and σ = 0.03-0.14 in simulated data), indicating that sampling variability decreased the statistical power for detecting trends. Moreover, sampling variability obscured the inter-annual dependency (and thus the serial correlation) of biomass, which was expected because in this long-lived population biomass changes are buffered by the many year classes present. We did find the expected serial correlation in our simulation results and concluded that good survey data of long-lived fish populations should show low sampling variability and considerable inter-annual serial correlation. Since serial correlation decreases the power for detecting trends, this means that even if sampling variability were greatly reduced, the number of sampling years needed to detect a change of 15% per year in bream populations (corresponding to a halving or doubling in a six-year period) would in most cases be more than six. This implies that the six-year reporting periods required by the Water Framework Directive of the European Union are too short for the existing fish monitoring schemes. PMID:17219244
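The power calculation behind the "more than six years" conclusion can be approximated with a small Monte-Carlo sketch. The trend, noise, and serial-correlation values below (`trend`, `sigma`, `rho`) are illustrative stand-ins rather than the paper's estimates, and the AR(1) noise model is an assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def trend_power(n_years, trend=0.15, sigma=0.45, rho=0.4, n_sim=1000):
    """Monte-Carlo power of an OLS slope test (alpha = 0.05) for a
    log-linear trend in annual biomass, with lognormal noise of
    standard deviation `sigma` and AR(1) serial correlation `rho`."""
    years = np.arange(n_years)
    hits = 0
    for _ in range(n_sim):
        e = np.empty(n_years)
        e[0] = rng.normal(0.0, sigma)
        for t in range(1, n_years):
            # AR(1) innovations scaled to keep the marginal sd at sigma
            e[t] = rho * e[t - 1] + rng.normal(0.0, sigma * np.sqrt(1 - rho**2))
        log_biomass = np.log(1 + trend) * years + e
        result = stats.linregress(years, log_biomass)
        hits += result.pvalue < 0.05
    return hits / n_sim
```

With noise of the magnitude seen in the field data, the power over a six-year window stays well below conventional targets, which is the abstract's point about the Water Framework Directive reporting period.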
Higgins, P; Murray, M L; Williams, E M
1994-03-01
This descriptive, retrospective study examined levels of self-esteem, social support, and satisfaction with prenatal care in 193 low-risk postpartal women who obtained adequate and inadequate care. The participants were drawn from a regional medical center and university teaching hospital in New Mexico. A demographic questionnaire, the Coopersmith self-esteem inventory, the personal resource questionnaire part 2, and the prenatal care satisfaction inventory were used for data collection. Significant differences were found in the level of education, income, insurance, and ethnicity between women who received adequate prenatal care and those who received inadequate care. Women who were likely to seek either adequate or inadequate prenatal care were those whose total family income was $10,000 to $19,999 per year and high school graduates. Statistically significant differences were found in self-esteem, social support, and satisfaction between the two groups of women. Strategies to enhance self-esteem and social support have to be developed to reach women at risk for receiving inadequate prenatal care. PMID:8155221
Quantifying dose to the reconstructed breast: Can we adequately treat?
Chung, Eugene; Marsh, Robin B.; Griffith, Kent A.; Moran, Jean M.; Pierce, Lori J.
2013-04-01
To evaluate how immediate reconstruction (IR) impacts postmastectomy radiotherapy (PMRT) dose distributions to the reconstructed breast (RB), internal mammary nodes (IMN), heart, and lungs using quantifiable dosimetric end points. 3D conformal plans were developed for 20 IR patients, 10 autologous reconstruction (AR), and 10 expander-implant (EI) reconstruction. For each reconstruction type, 5 right- and 5 left-sided reconstructions were selected. Two plans were created for each patient, 1 with RB coverage alone and 1 with RB + IMN coverage. Left-sided EI plans without IMN coverage had higher heart Dmean than left-sided AR plans (2.97 and 0.84 Gy, p = 0.03). Otherwise, results did not vary by reconstruction type and all remaining metrics were evaluated using a combined AR and EI dataset. RB coverage was adequate regardless of laterality or IMN coverage (Dmean 50.61 Gy, D95 45.76 Gy). When included, IMN Dmean and D95 were 49.57 and 40.96 Gy, respectively. Mean heart doses increased with left-sided treatment plans and IMN inclusion. Right-sided treatment plans and IMN inclusion increased mean lung V20. Using standard field arrangements and 3D planning, we observed excellent coverage of the RB and IMN, regardless of laterality or reconstruction type. Our results demonstrate that adequate doses can be delivered to the RB with or without IMN coverage.
Purchasing a cycle helmet: are retailers providing adequate advice?
Plumridge, E.; McCool, J.; Chetwynd, J.; Langley, J. D.
1996-01-01
OBJECTIVES: The aim of this study was to examine the selling of cycle helmets in retail stores, with particular reference to the adequacy of advice offered about the fit and securing of helmets. METHODS: All 55 retail outlets selling cycle helmets in Christchurch, New Zealand were studied by participant observation. A researcher entered each store as a prospective customer and requested assistance to purchase a helmet. She took detailed field notes of the ensuing encounter and these were subsequently transcribed, coded, and analysed. RESULTS: Adequate advice for helmet purchase was given in less than half of the stores. In general, the sales assistants in specialist cycle shops were better informed and gave more adequate advice than those in department stores. Those who gave good advice also tended to be more active in helping with fitting the helmet. Knowledge of safety standards was apparent in one third of sales assistants. Few stores displayed information for customers about the correct fit of cycle helmets. CONCLUSIONS: These findings suggest that the advice and assistance being given to ensure that cycle helmets fit properly is often inadequate, and thus the helmets may fail to fulfil their purpose of preventing injury. Consultation between retailers and policy makers is a necessary first step to improving this situation. PMID:9346053
Adequate drainage system design for heap leaching structures.
Majdi, Abbas; Amini, Mehdi; Nasab, Saeed Karimi
2007-08-17
The paper describes an optimum design of a drainage system for a heap leaching structure, which has positive impacts on both the mine environment and mine economics. In order to properly design a drainage system, the causes of an increase in the acid level of the heap, which in turn produces severe problems in the hydrometallurgy processes, must be evaluated. One of the most significant negative impacts induced by an increase in the acid level within a heap structure is the increase of pore acid pressure, which in turn increases the potential of a heap slide that may endanger the mine environment. In this paper, the thickness of the gravelly drainage layer is initially determined via existing empirical equations. Then, by assuming that the calculated thickness is constant throughout the heap structure, an approach is proposed to calculate the required internal diameter of the slotted polyethylene pipes used for auxiliary drainage purposes. In order to adequately design this diameter, the pipe's cross-sectional deformation due to the overburden pressure of the stepped heap structure is taken into account. Finally, a design of an adequate drainage system for heap structure 2 at the Sarcheshmeh copper mine is presented and the results are compared with those calculated by existing equations. PMID:17321044
Charro, Elena; Pardo, Rafael; Peña, Víctor
2013-10-01
Coal-fired power plants (CFPP) can be a source of contamination because coal contains trace amounts of natural radionuclides, such as ⁴⁰K, ²³⁸U and ²³²Th and their decay products. These radionuclides can be released as fly ash from the CFPP and deposited from the atmosphere on nearby topsoils, thereby modifying the natural background radioactivity levels and subsequently increasing the total radioactive dose received by the nearby population. In this paper, an area of 64 km² around the CFPP of Velilla del Río Carrión (Spain) has been studied by collecting 67 surface soil samples and measuring the activities of one artificial and six natural radionuclides by gamma spectrometry. The results are similar to natural background levels and ranged from 0 to 209 for ¹³⁷Cs, 11 to 50 for ²³⁸U, 14 to 67 for ²²⁶Ra, 29 to 380 for ²¹⁰Pb, 15 to 68 for ²³²Th, 17 to 78 for ²²⁴Ra, and 97 to 790 for ⁴⁰K (all values in Bq kg⁻¹). Besides the classical radiochemical tools, Analysis of Variance (ANOVA), Principal Component Analysis (PCA), Hierarchical Clustering Analysis (HCA), and kriging mapping have been applied to the experimental dataset, revealing the existence of two different models of spatial distribution around the CFPP. The first, followed by ²³⁸U, ²²⁶Ra, ²³²Th, ²²⁴Ra and ⁴⁰K, can be assigned to 'natural background radioactivity', whereas the second model, followed by ²¹⁰Pb and ¹³⁷Cs, is based on 'atmospheric fallout radioactivity'. The main conclusion of this work is that the CFPP has no influence on the radioactivity levels measured in the studied area, which has a mean annual outdoor effective dose E = 71 ± 22 μSv, very close to the average UNSCEAR value of 70 μSv, thus confirming the almost non-existent radioactive risk posed by the presence of the CFPP. PMID:23680923
Cosmic statistics of statistics
NASA Astrophysics Data System (ADS)
Szapudi, István; Colombi, Stéphane; Bernardeau, Francis
1999-12-01
The errors on statistics measured in finite galaxy catalogues are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly non-linear to weakly non-linear scales. For non-linear functions of unbiased estimators, such as the cumulants, the phenomenon of cosmic bias is identified and computed. Since it is subdued by the cosmic errors in the range of applicability of the theory, correction for it is inconsequential. In addition, the method of Colombi, Szapudi & Szalay concerning sampling effects is generalized, adapting the theory for inhomogeneous galaxy catalogues. While previous work focused on the variance only, the present article calculates the cross-correlations between moments and connected moments as well for a statistically complete description. The final analytic formulae representing the full theory are explicit but somewhat complicated. Therefore we have made available a fortran program capable of calculating the described quantities numerically (for further details e-mail SC at colombi@iap.fr). An important special case is the evaluation of the errors on the two-point correlation function, for which this should be more accurate than any method put forward previously. This tool will be immensely useful in the future for assessing the precision of measurements from existing catalogues, as well as aiding the design of new galaxy surveys. To illustrate the applicability of the results and to explore the numerical aspects of the theory qualitatively and quantitatively, the errors and cross-correlations are predicted under a wide range of assumptions for the future Sloan Digital Sky Survey. The principal results concerning the cumulants ξ, Q3 and Q4 is that
Are Academic Programs Adequate for the Software Profession?
ERIC Educational Resources Information Center
Koster, Alexis
2010-01-01
According to the Bureau of Labor Statistics, close to 1.8 million people, or 77% of all computer professionals, were working in the design, development, deployment, maintenance, and management of software in 2006. The ACM [Association for Computing Machinery] model curriculum for the BS in computer science proposes that about 42% of the core body…
Are PPS payments adequate? Issues for updating and assessing rates
Sheingold, Steven H.; Richter, Elizabeth
1992-01-01
Declining operating margins under Medicare's prospective payment system (PPS) have focused attention on the adequacy of payment rates. The question of whether annual updates to the rates have been too low or cost increases too high has become important. In this article we discuss issues relevant to updating PPS rates and judging their adequacy. We describe a modification to the current framework for recommending annual update factors. This framework is then used to retrospectively assess PPS payment and cost growth since 1985. The preliminary results suggest that current rates are more than adequate to support the cost of efficient care. Also discussed are why using financial margins to evaluate rates is problematic and alternative methods that might be employed. PMID:10127450
Dose Limits for Man do not Adequately Protect the Ecosystem
Higley, Kathryn A.; Alexakhin, Rudolf M.; McDonald, Joseph C.
2004-08-01
It has been known for quite some time that different organisms display differing degrees of sensitivity to the effects of ionizing radiations. Some microorganisms such as the bacterium Micrococcus radiodurans, along with many species of invertebrates, are extremely radio-resistant. Humans might be categorized as being relatively sensitive to radiation, and are a bit more resistant than some pine trees. Therefore, it could be argued that maintaining the dose limits necessary to protect humans will also result in the protection of most other species of flora and fauna. This concept is usually referred to as the anthropocentric approach. In other words, if man is protected then the environment is also adequately protected. The ecocentric approach might be stated as; the health of humans is effectively protected only when the environment is not unduly exposed to radiation. The ICRP is working on new recommendations dealing with the protection of the environment, and this debate should help to highlight a number of relevant issues concerning that topic.
ENSURING ADEQUATE SAFETY WHEN USING HYDROGEN AS A FUEL
Coutts, D
2007-01-22
Demonstration projects using hydrogen as a fuel are becoming very common. Often these projects rely on project-specific risk evaluations to support project safety decisions. This is necessary because regulations, codes, and standards (hereafter referred to as standards) are just being developed. This paper will review some of the approaches being used in these evolving standards, and techniques which demonstration projects can implement to bridge the gap between current requirements and stakeholder desires. Many of the evolving hydrogen-fuel standards use performance-based language, which establishes minimum performance and safety objectives, as compared with prescriptive-based language that prescribes specific design solutions. This is being done for several reasons, including: (1) concern that establishing specific design solutions too early will stifle invention, (2) the sparse performance data necessary to support selection of design approaches, and (3) a risk-averse public which is unwilling to accept losses that were incurred in developing previous prescriptive design standards. The evolving standards often contain words such as: ''The manufacturer shall implement the measures and provide the information necessary to minimize the risk of endangering a person's safety or health''. This typically implies that the manufacturer or project manager must produce and document an acceptable level of risk. If accomplished using a comprehensive and systematic process, the demonstration project risk assessment can ease the transition to widespread commercialization. An approach to adequately evaluate and document the safety risk will be presented.
Adequate peritoneal dialysis: theoretical model and patient treatment.
Tast, C
1998-01-01
The objective of this study was to evaluate the relationship between adequate PD, with a sufficient weekly Kt/V (2.0) and creatinine clearance (CCR) (60 L), and the necessary daily dialysate volume. These recommended parameters were the result of a recent multi-centre study (CANUSA). For this, 40 patients in our hospital who had carried out PD for at least 8 weeks and up to 6 years were examined and compared in 1996. These goals (CANUSA) are easily attainable in the early treatment of many individuals with a low body surface area (BSA). With a higher BSA or missing residual renal function (RRF), the daily dose of dialysis must be adjusted. We found it difficult to obtain the recommended parameters and tried to find a solution to this problem. The simplest method is to increase the volume or exchange rate. The most expensive method is to change from CAPD to APD, with the possibility of higher volumes or exchange rates. Selection of therapy must take into consideration: 1. patient preference, 2. body mass, 3. peritoneal transport rates, 4. ability to perform therapy, 5. cost of therapy and 6. risk of peritonitis. With this information in mind, an individual prescription can be formulated and matched to the appropriate modality of PD. PMID:10392062
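The weekly Kt/V target of 2.0 can be illustrated with a simplified calculation. The function below is a sketch of the standard definition (peritoneal plus residual renal urea clearance, normalized to the urea distribution volume V); real prescriptions use measured 24-hour collections and an estimated V, so the inputs here are illustrative.

```python
def weekly_ktv(daily_drain_l, dp_urea, urea_volume_l, residual_kru_ml_min=0.0):
    # Peritoneal urea clearance per week: drained dialysate volume
    # times the dialysate-to-plasma urea ratio (D/P).
    peritoneal_l_per_week = daily_drain_l * dp_urea * 7
    # Residual renal urea clearance (Kru, ml/min) converted to L/week.
    renal_l_per_week = residual_kru_ml_min * 60 * 24 * 7 / 1000
    # Normalize by the urea distribution volume V (total body water).
    return (peritoneal_l_per_week + renal_l_per_week) / urea_volume_l
```

For example, a CAPD patient draining 10 L/day with D/P 0.9 and V = 35 L reaches a weekly Kt/V of 1.8, short of the 2.0 target unless residual renal function or a larger dialysate volume makes up the difference, which is the adjustment problem the abstract describes for larger patients.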
DARHT - an 'adequate' EIS: A NEPA case study
Webb, M.D.
1997-08-01
The Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility Environmental Impact Statement (EIS) provides a case study that is interesting for many reasons. The EIS was prepared quickly, in the face of a lawsuit, for a project with unforeseen environmental impacts, for a facility that was deemed urgently essential to national security. Following judicial review the EIS was deemed to be "adequate." DARHT is a facility now being built at Los Alamos National Laboratory (LANL) as part of the Department of Energy (DOE) nuclear weapons stockpile stewardship program. DARHT will be used to evaluate the safety and reliability of nuclear weapons, evaluate conventional munitions and study high-velocity impact phenomena. DARHT will be equipped with two accelerator-driven, high-intensity X-ray machines to record images of materials driven by high explosives. DARHT will be used for a variety of hydrodynamic tests, and DOE plans to conduct some dynamic experiments using plutonium at DARHT as well.
Adequate Yearly Progress (AYP) at Your Library Media Center
ERIC Educational Resources Information Center
Anderson, Cynthia
2007-01-01
Together administration and the library media center form a team that can make a difference in student learning and, in turn, in student achievement. The library media center can contribute to improve student learning, and there is an amazingly small cost that administration must pay for this powerful support. This article addresses…
On Adequate Comparisons of Antenna Phase Center Variations
NASA Astrophysics Data System (ADS)
Schoen, S.; Kersten, T.
2013-12-01
One important part of ensuring the high quality of the International GNSS Service's (IGS) products is the collection and publication of receiver and satellite antenna phase center variations (PCV). The PCV are crucial for global and regional networks, since they introduce a global scale factor of up to 16 ppb or changes in the height component of up to 10 cm, respectively. Furthermore, antenna phase center variations are also important for precise orbit determination, navigation and positioning of mobile platforms, such as the GOCE and GRACE gravity missions, or for accurate Precise Point Positioning (PPP) processing. Using the EUREF Permanent Network (EPN), Baire et al. (2012) showed that individual PCV values have a significant impact on geodetic positioning. These statements are further supported by studies of Steigenberger et al. (2013), where the impact of PCV on local ties is analysed. Currently, there are five calibration institutions, including the Institut für Erdmessung (IfE), contributing to the IGS PCV file. Different approaches, such as field calibrations and anechoic chamber measurements, are in use. Additionally, the computation and parameterization of the PCV differ completely between the methods. Therefore, every new approach has to pass a benchmark test in order to ensure that variations of PCV values of an identical antenna obtained from different methods are as consistent as possible. Since the number of approaches to obtain these PCV values rises with the number of calibration institutions, there is the necessity for an adequate comparison concept, taking into account not only the numerical values but also stochastic information and computational issues of the determined PCVs. This is of special importance, since the majority of calibrated receiver antennas published by the IGS originate from absolute field calibrations based on the Hannover Concept, Wübbena et al. (2000). In this contribution, a concept for the adequate
Improving access to adequate pain management in Taiwan.
Scholten, Willem
2015-06-01
There is a global crisis in access to pain management in the world. WHO estimates that 4.65 billion people live in countries where medical opioid consumption is near to zero. For 2010, WHO considered a per capita consumption of 216.7 mg morphine equivalents adequate, while Taiwan had a per capita consumption of 0.05 mg morphine equivalents in 2007. In Asia, the use of opioids is sensitive because of the Opium Wars in the 19th century and for this reason, the focus of controlled substances policies has been on the prevention of diversion and dependence. However, an optimal public health outcome requires that the beneficial aspects of these substances also be acknowledged. Therefore, WHO recommends a policy based on the Principle of Balance: ensuring access for medical and scientific purposes while preventing diversion, harmful use and dependence. Furthermore, international law requires that countries ensure access to opioid analgesics for medical and scientific purposes. There is evidence that opioid analgesics for chronic pain are not associated with a major risk for developing dependence. Barriers to access can be classified in the categories of overly restrictive laws and regulations; insufficient medical training on pain management and problems related to assessment of medical needs; attitudes like an excessive fear of dependence or diversion; and economic and logistical problems. The GOPI project found many examples of such barriers in Asia. Access to opioid medicines in Taiwan can be improved by analysing the national situation and drafting a plan. The WHO policy guidelines Ensuring Balance in National Policies on Controlled Substances can be helpful for achieving this purpose, as well as international guidelines for pain treatment. PMID:26068436
Shi, Runhua; McLarty, Jerry W
2009-10-01
In this article, we introduced basic concepts of statistics, types of distributions, and descriptive statistics. A few examples were also provided. The basic concepts presented herein are only a fraction of those related to descriptive statistics. Also, there are many commonly used distributions not presented herein, such as the Poisson distribution for rare events, and the exponential, F, and logistic distributions. More information can be found in many statistics books and publications. PMID:19891281
Kasap, Burcu; Akbaba, Gülhan; Yeniçeri, Emine N.; Akın, Melike N.; Akbaba, Eren; Öner, Gökalp; Turhan, Nilgün Ö.; Duru, Mehmet E.
2016-01-01
Objectives: To assess current iodine levels and related factors among healthy pregnant women. Methods: In this cross-sectional, hospital-based study, healthy pregnant women (n=135) were scanned for thyroid volume, provided urine samples for urinary iodine concentration and completed a questionnaire including sociodemographic characteristics and dietary habits targeted for iodine consumption at the Department of Obstetrics and Gynecology, School of Medicine, Muğla Sıtkı Koçman University, Muğla, Turkey, between August 2014 and February 2015. Sociodemographic data were analyzed by simple descriptive statistics. Results: Median urinary iodine concentration was 222.0 µg/L, indicating adequate iodine intake during pregnancy. According to World Health Organization (WHO) criteria, 28.1% of subjects had iodine deficiency, 34.1% had adequate iodine intake, 34.8% had more than adequate iodine intake, and 3.0% had excessive iodine intake during pregnancy. Education level, higher monthly income, current employment, consuming iodized salt, and adding salt to food during, or after cooking were associated with higher urinary iodine concentration. Conclusion: Iodine status of healthy pregnant women was adequate, although the percentage of women with more than adequate iodine intake was higher than the reported literature. PMID:27279519
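The WHO categories used above can be sketched as a simple classifier of the median urinary iodine concentration. The cutoffs (150, 250 and 500 µg/L for pregnant women) are the published WHO epidemiological criteria; the label strings follow the abstract's wording.

```python
def who_uic_category(median_uic_ug_per_l):
    # WHO epidemiological criteria for iodine intake in pregnancy,
    # judged from the *median* urinary iodine concentration (ug/L).
    if median_uic_ug_per_l < 150:
        return "deficient"
    if median_uic_ug_per_l < 250:
        return "adequate"
    if median_uic_ug_per_l < 500:
        return "more than adequate"
    return "excessive"
```

The study's median of 222.0 µg/L falls in the "adequate" band, which is why the population-level conclusion is adequacy even though 28.1% of individual women measured below 150 µg/L.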
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2008-01-01
As a branch of knowledge, Statistics is ubiquitous and its applications can be found in (almost) every field of human endeavour. In this article, the authors track down the possible source of the link between the "Siren song" and applications of Statistics. Answers to their previous five questions and five new questions on Statistics are presented.
ERIC Educational Resources Information Center
Callamaras, Peter
1983-01-01
This buyer's guide to seven major types of statistics software packages for microcomputers reviews Edu-Ware Statistics 3.0; Financial Planning; Speed Stat; Statistics with DAISY; Human Systems Dynamics package of Stats Plus, ANOVA II, and REGRESS II; Maxistat; and Moore-Barnes' MBC Test Construction and MBC Correlation. (MBR)
ERIC Educational Resources Information Center
Meyer, Donald L.
Bayesian statistical methodology and its possible uses in the behavioral sciences are discussed in relation to the solution of problems in both the use and teaching of fundamental statistical methods, including confidence intervals, significance tests, and sampling. The Bayesian model explains these statistical methods and offers a consistent…
SNAP benefits: Can an adequate benefit be defined?
Yaktine, Ann L; Caswell, Julie A
2014-01-01
The Supplemental Nutrition Assistance Program (SNAP) increases the food purchasing power of participating households. A committee convened by the Institute of Medicine (IOM) examined the question of whether it is feasible to define SNAP allotment adequacy. Total resources; individual, household, and environmental factors; and SNAP program characteristics that affect allotment adequacy were identified from a framework developed by the IOM committee. The committee concluded that it is feasible to define SNAP allotment adequacy; however, such a definition must take into account the degree to which participants' total resources and individual, household, and environmental factors influence the purchasing power of SNAP benefits and the impact of SNAP program characteristics on the calculation of the dollar value of the SNAP allotment. The committee recommended that the USDA Food and Nutrition Service investigate ways to incorporate these factors and program characteristics into research aimed at defining allotment adequacy. PMID:24425718
Vaumourin, Elise; Vourc'h, Gwenaël; Telfer, Sandra; Lambin, Xavier; Salih, Diaeldin; Seitzer, Ulrike; Morand, Serge; Charbonnel, Nathalie; Vayssier-Taussat, Muriel; Gasqui, Patrick
2014-01-01
A growing number of studies are reporting simultaneous infections by parasites in many different hosts. Detecting whether these parasites are significantly associated is important in medicine and epidemiology. Numerous approaches to detect associations are available, but only a few provide statistical tests. Furthermore, they generally test for an overall association and do not identify which parasite is associated with which other one. Here, we developed a new approach, the association screening approach, to detect multi-parasite associations both overall and in detail. We studied the power of this new approach and of three other known ones (i.e., the generalized chi-square, the network and the multinomial GLM approaches) to identify parasite associations due either to parasite interactions or to confounding factors. We applied these four approaches to detect associations within two populations of multi-infected hosts: (1) rodents infected with Bartonella sp., Babesia microti and Anaplasma phagocytophilum and (2) a bovine population infected with Theileria sp. and Babesia sp. We found that the best power is obtained with the screening model and the generalized chi-square test. Differentiating between associations due to confounding factors and those due to parasite interactions was not possible. The screening approach significantly identified associations between Bartonella doshiae and B. microti, and between T. parva, T. mutans, and T. velifera. Thus, the screening approach was relevant to test the overall presence of parasite associations and to identify the parasite combinations that are significantly over- or under-represented. Unraveling whether the associations are due to real biological interactions or confounding factors should be further investigated. Nevertheless, in the age of genomics and the advent of new technologies, it is a considerable asset to speed up research focusing on the mechanisms driving interactions between
HU, Yi-Juan; SUN, Wei; TZENG, Jung-Ying; PEROU, Charles M.
2015-01-01
Studies of expression quantitative trait loci (eQTLs) offer insight into the molecular mechanisms of loci that were found to be associated with complex diseases and the mechanisms can be classified into cis- and trans-acting regulation. At present, high-throughput RNA sequencing (RNA-seq) is rapidly replacing expression microarrays to assess gene expression abundance. Unlike microarrays that only measure the total expression of each gene, RNA-seq also provides information on allele-specific expression (ASE), which can be used to distinguish cis-eQTLs from trans-eQTLs and, more importantly, enhance cis-eQTL mapping. However, assessing the cis-effect of a candidate eQTL on a gene requires knowledge of the haplotypes connecting the candidate eQTL and the gene, which cannot be inferred with certainty. The existing two-stage approach that first phases the candidate eQTL against the gene and then treats the inferred phase as observed in the association analysis tends to attenuate the estimated cis-effect and reduce the power for detecting a cis-eQTL. In this article, we provide a maximum-likelihood framework for cis-eQTL mapping with RNA-seq data. Our approach integrates the inference of haplotypes and the association analysis into a single stage, and is thus unbiased and statistically powerful. We also develop a pipeline for performing a comprehensive scan of all local eQTLs for all genes in the genome by controlling for false discovery rate, and implement the methods in a computationally efficient software program. The advantages of the proposed methods over the existing ones are demonstrated through realistic simulation studies and an application to empirical breast cancer data from The Cancer Genome Atlas project. PMID:26568645
Vijayaraj, Veeraraghavan; Cheriyadat, Anil M; Bhaduri, Budhendra L; Vatsavai, Raju; Bright, Eddie A
2008-01-01
Statistical properties of high-resolution overhead images representing different land use categories are analyzed using various local and global statistical image properties based on the shape of the power spectrum, image gradient distributions, edge co-occurrence, and inter-scale wavelet coefficient distributions. The analysis was performed on a database of high-resolution (1 meter) overhead images representing a multitude of different downtown, suburban, commercial, agricultural and wooded exemplars. Various statistical properties relating to these image categories and their relationship are discussed. The categorical variations in power spectrum contour shapes, the unique gradient distribution characteristics of wooded categories, the similarity in edge co-occurrence statistics for overhead and natural images, and the unique edge co-occurrence statistics of downtown categories are presented in this work. Though previous work on natural image statistics has shown some of the unique characteristics for different categories, the relationships for overhead images are not well understood. The statistical properties of natural images were used in previous studies to develop prior image models, to predict and index objects in a scene and to improve computer vision models. The results from our research findings can be used to augment and adapt computer vision algorithms that rely on prior image statistics to process overhead images, calibrate the performance of overhead image analysis algorithms, and derive features for better discrimination of overhead image categories.
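The shape-of-the-power-spectrum comparison described above rests on the radially averaged power spectrum of each image. Below is a generic numpy sketch of that summary statistic, not the authors' exact pipeline; the bin count and normalization choices are assumptions.

```python
import numpy as np

def radial_power_spectrum(img, n_bins=32):
    """Radially averaged power spectrum of a 2-D grayscale image:
    average |FFT|^2 over annuli of equal width in spatial frequency."""
    img = img - img.mean()                       # drop the DC component
    f = np.fft.fftshift(np.fft.fft2(img))        # center zero frequency
    power = np.abs(f) ** 2
    h, w = img.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h // 2, x - w // 2)         # radial frequency per pixel
    bins = np.linspace(0, r.max(), n_bins + 1)
    which = np.clip(np.digitize(r.ravel(), bins) - 1, 0, n_bins - 1)
    total = np.bincount(which, weights=power.ravel(), minlength=n_bins)
    counts = np.bincount(which, minlength=n_bins)
    return total / np.maximum(counts, 1)
```

Comparing the falloff of this curve across image categories (e.g. the slower high-frequency decay of downtown scenes versus wooded ones) is the kind of spectral-shape discrimination the abstract refers to.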
Wide Wide World of Statistics: International Statistics on the Internet.
ERIC Educational Resources Information Center
Foudy, Geraldine
2000-01-01
Explains how to find statistics on the Internet, especially international statistics. Discusses advantages over print sources, including convenience, currency of information, cost effectiveness, and value-added formatting; sources of international statistics; United Nations agencies; search engines and power searching; and evaluating sources. (LRW)
Smith, Alwyn
1969-01-01
This paper is based on an analysis of questionnaires sent to the health ministries of Member States of WHO asking for information about the extent, nature, and scope of morbidity statistical information. It is clear that most countries collect some statistics of morbidity and many countries collect extensive data. However, few countries relate their collection to the needs of health administrators for information, and many countries collect statistics principally for publication in annual volumes which may appear anything up to 3 years after the year to which they refer. The desiderata of morbidity statistics may be summarized as reliability, representativeness, and relevance to current health problems. PMID:5306722
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Statistics of football dynamics
NASA Astrophysics Data System (ADS)
Mendes, R. S.; Malacarne, L. C.; Anteneodo, C.
2007-06-01
We investigate the dynamics of football matches. Our goal is to characterize statistically the temporal sequence of ball movements in this collective sport, searching for traits of complex behavior. Data were collected over a variety of matches in South American, European, and World championships throughout 2005 and 2006. We show that the statistics of ball touches present power-law tails and can be described by q-gamma distributions. To explain such behavior we propose a model that provides information on the characteristics of football dynamics. Furthermore, we discuss the statistics of duration of out-of-play intervals, not directly related to the previous scenario.
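The q-gamma fits of the paper are not reproduced here. As a simpler stand-in that shows the general workflow, the following fits an ordinary gamma distribution to simulated inter-touch times with SciPy; the data and parameters are invented.

```python
import numpy as np
from scipy import stats

# Simulated inter-touch times (seconds); the paper's q-gamma fit is
# replaced here by an ordinary gamma fit purely for illustration.
rng = np.random.default_rng(1)
times = rng.gamma(2.0, 1.5, size=5000)

# Maximum-likelihood fit with the location parameter fixed at zero
shape_hat, _, scale_hat = stats.gamma.fit(times, floc=0.0)
```

The q-gamma family generalizes the gamma distribution with an extra parameter that produces the power-law tails reported in the paper; an ordinary gamma fit like this one cannot capture such tails.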
NASA Technical Reports Server (NTRS)
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Percentage of Adults with High Blood Pressure Whose Hypertension Is Adequately Controlled
Survey data on the percentage of adults with high blood pressure whose hypertension is adequately controlled, reported by age group.
The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.
... cancer statistics across the world. U.S. Cancer Mortality Trends The best indicator of progress against cancer is ... the number of cancer survivors has increased. These trends show that progress is being made against the ...
NASA Astrophysics Data System (ADS)
Hermann, Claudine
Statistical physics bridges the properties of a macroscopic system and the microscopic behavior of its constituent particles, a connection that would otherwise be intractable given the enormous magnitude of Avogadro's number. Many systems central to today's key technologies, such as semiconductors and lasers, are macroscopic quantum objects; only statistical physics makes their fundamentals understandable. This graduate text therefore also focuses on particular applications, such as the properties of electrons in solids, radiation thermodynamics, and the greenhouse effect.
21 CFR 514.117 - Adequate and well-controlled studies.
Code of Federal Regulations, 2013 CFR
2013-04-01
... production performance, or biased observation. One or more adequate and well-controlled studies are required... 21 Food and Drugs 6 2013-04-01 2013-04-01 false Adequate and well-controlled studies. 514.117... Applications § 514.117 Adequate and well-controlled studies. (a) Purpose. The primary purpose of...
21 CFR 801.5 - Medical devices; adequate directions for use.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
... HUMAN SERVICES Food and Drug Administration Hemoglobin Standards and Maintaining Adequate Iron Stores in... Standards and Maintaining Adequate Iron Stores in Blood Donors.'' The purpose of this public workshop is to... donor safety and blood availability, and potential measures to maintain adequate iron stores in...
36 CFR 13.960 - Who determines when there is adequate snow cover?
Code of Federal Regulations, 2013 CFR
2013-07-01
... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. PMID:26466186
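As a concrete illustration of the sensitivity, specificity, and likelihood-ratio concepts reviewed above, the following computes them from a 2x2 diagnostic-test table; the counts are invented, not from the article.

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, and likelihood ratios from a 2x2
    diagnostic-test table (tp/fp/fn/tn = true/false positives/negatives)."""
    sens = tp / (tp + fn)          # true-positive rate
    spec = tn / (tn + fp)          # true-negative rate
    lr_pos = sens / (1.0 - spec)   # positive likelihood ratio
    lr_neg = (1.0 - sens) / spec   # negative likelihood ratio
    return sens, spec, lr_pos, lr_neg

# Hypothetical screening-test results: 90 TP, 10 FN, 20 FP, 80 TN
sens, spec, lr_pos, lr_neg = diagnostic_stats(tp=90, fp=20, fn=10, tn=80)
```

Here sensitivity is 0.9 and specificity is 0.8, giving a positive likelihood ratio of 4.5: a positive result makes disease 4.5 times more likely than chance alone in this hypothetical table.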
Statistical Energy Analysis Program
NASA Technical Reports Server (NTRS)
Ferebee, R. C.; Trudell, R. W.; Yano, L. I.; Nygaard, S. I.
1985-01-01
Statistical Energy Analysis (SEA) is a powerful tool for estimating high-frequency vibration spectra of complex structural systems; it has been incorporated into a computer program. The basic SEA analysis procedure is divided into three steps: idealization, parameter generation, and problem solution. The SEA computer program is written in FORTRAN V for batch execution.
Understanding Solar Flare Statistics
NASA Astrophysics Data System (ADS)
Wheatland, M. S.
2005-12-01
A review is presented of work aimed at understanding solar flare statistics, with emphasis on the well known flare power-law size distribution. Although avalanche models are perhaps the favoured model to describe flare statistics, their physical basis is unclear, and they are divorced from developing ideas in large-scale reconnection theory. An alternative model, aimed at reconciling large-scale reconnection models with solar flare statistics, is revisited. The solar flare waiting-time distribution has also attracted recent attention. Observed waiting-time distributions are described, together with what they might tell us about the flare phenomenon. Finally, a practical application of flare statistics to flare prediction is described in detail, including the results of a year of automated (web-based) predictions from the method.
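The flare size distribution discussed above is commonly characterized by a power-law index; a standard maximum-likelihood (Clauset-style) estimator can be sketched as follows, on synthetic data rather than solar observations.

```python
import numpy as np

def powerlaw_index_mle(sizes, xmin):
    """Maximum-likelihood estimate of alpha for a power-law tail
    p(x) ~ x**(-alpha), restricted to x >= xmin."""
    x = np.asarray(sizes, dtype=float)
    x = x[x >= xmin]
    return 1.0 + x.size / np.sum(np.log(x / xmin))

# Synthetic flare "sizes" drawn from a pure power law with alpha = 2
# via inverse-CDF sampling: x = (1 - u)**(-1/(alpha - 1)) for xmin = 1
rng = np.random.default_rng(2)
sizes = (1.0 - rng.random(20000)) ** -1.0

alpha_hat = powerlaw_index_mle(sizes, xmin=1.0)
```

Real flare catalogs complicate this picture with detection thresholds and finite-size cutoffs, which is part of why the physical origin of the observed index remains debated.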
NASA Astrophysics Data System (ADS)
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
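One classic result treated in the book, the statistics of random phasor sums, is easy to verify numerically: the normalized resultant amplitude of many unit-amplitude phasors with uniformly random phases is Rayleigh distributed. A quick simulation, for illustration only:

```python
import numpy as np

# Sum many unit-amplitude phasors with uniformly random phases. After
# normalizing by sqrt(N), each quadrature has variance 1/2, so the
# resultant amplitude is Rayleigh with mean sqrt(pi)/2 ~ 0.886.
rng = np.random.default_rng(3)
n_phasors, n_trials = 500, 5000
phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_trials, n_phasors))
resultant = np.abs(np.exp(1j * phases).sum(axis=1)) / np.sqrt(n_phasors)

mean_amp = resultant.mean()
```

This is the same mechanism that underlies fully developed speckle, whose fourth-order spectral statistics the book also treats.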
ERIC Educational Resources Information Center
Martin, Tammy Faith
2012-01-01
The purpose of this study was to examine principal leadership styles and their influence on school performance as measured by adequate yearly progress at selected Title I schools in South Carolina. The main focus of the research study was to complete descriptive statistics on principal leadership styles in schools that met or did not meet adequate…
Scholfield, D.J.; Fields, M.; Beal, T.; Lewis, C.G.; Behall, K.M. )
1989-02-09
The symptoms of copper (Cu) deficiency are known to be more severe when rats are fed a diet with fructose (F) as the principal carbohydrate. Mortality in males, due to cardiac abnormalities, usually occurs after five weeks of a 62% F, 0.6 ppm Cu deficient diet. These effects are not observed if cornstarch (CS) is the carbohydrate (CHO) source. Studies with F-containing diets have shown increased catecholamine (C) turnover rates, while diets deficient in Cu result in decreased norepinephrine (N) levels in tissues. Dopamine β-hydroxylase (EC 1.14.17.1) is a Cu-dependent enzyme that catalyzes the conversion of dopamine (D) to N. An experiment was designed to investigate the effects of CHO and dietary Cu on levels of three C in cardiac tissue. Thirty-two male and female Sprague-Dawley rats were fed Cu-deficient or Cu-adequate diets with 60% of calories from F or CS for 6 weeks. N, epinephrine (E), and D were measured by HPLC. Statistical analysis indicates that Cu deficiency tends to decrease N levels, while having the reverse effect on E. D did not appear to change. These findings indicate that Cu deficiency, but not dietary CHO, can affect the concentration of N and E in rat cardiac tissue.
ERIC Educational Resources Information Center
Chicot, Katie; Holmes, Hilary
2012-01-01
The use, and misuse, of statistics is commonplace, yet in the printed format data representations can be either over simplified, supposedly for impact, or so complex as to lead to boredom, supposedly for completeness and accuracy. In this article the link to the video clip shows how dynamic visual representations can enliven and enhance the…
ERIC Educational Resources Information Center
Catley, Alan
2007-01-01
Following the announcement last year that there will be no more math coursework assessment at General Certificate of Secondary Education (GCSE), teachers will in the future be able to devote more time to preparing learners for formal examinations. One of the key things that the author has learned when teaching statistics is that it makes for far…
NASA Astrophysics Data System (ADS)
Munson, Chase A.; De Lucia, Frank C.; Piehler, Thuvan; McNesby, Kevin L.; Miziolek, Andrzej W.
2005-08-01
Laser-induced breakdown spectroscopy spectra of bacterial spores, molds, pollens, and nerve agent simulants have been acquired. The performance of several statistical methodologies (linear correlation, principal components analysis, and soft independent modeling of class analogy) has been evaluated for their ability to differentiate between the various samples. The effects of data selection (total spectra, peak intensities, and intensity ratios) and pre-treatments (e.g., averaging) on the statistical models have also been studied. Results indicate that spectral averaging and weighting schemes may be used to significantly improve sample differentiation.
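Of the methodologies listed, principal components analysis is easy to sketch with a plain SVD. The two "classes" of spectra below are synthetic, differing only in one peak intensity; all names and values are invented for the sketch.

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """Project spectra (rows = samples) onto the leading principal
    components via SVD of the mean-centered data matrix."""
    centered = spectra - spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T

# Two synthetic classes of spectra that differ in a single peak
rng = np.random.default_rng(4)
base = rng.random(100)
class_a = base + rng.normal(0.0, 0.01, size=(20, 100))
class_b = base + rng.normal(0.0, 0.01, size=(20, 100))
class_b[:, 50] += 1.0  # extra emission line in class B

scores = pca_scores(np.vstack([class_a, class_b]))
```

The first principal component separates the two classes cleanly because the single differing peak dominates the between-sample variance, which is the intuition behind using PCA scores to discriminate LIBS spectra.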
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
ERIC Educational Resources Information Center
Hassad, Rossi A.
2002-01-01
Statistics is generally classified as a difficult course. In this regard, promoting an active learning environment is a popular instructional strategy. However, there is much evidence in the literature of a literal and simplistic interpretation of "active learning environment". This report describes an approach to facilitating introductory…
ERIC Educational Resources Information Center
Maydeu-Olivares, Alberto; Montano, Rosa
2013-01-01
We investigate the performance of three statistics, R [subscript 1], R [subscript 2] (Glas in "Psychometrika" 53:525-546, 1988), and M [subscript 2] (Maydeu-Olivares & Joe in "J. Am. Stat. Assoc." 100:1009-1020, 2005, "Psychometrika" 71:713-732, 2006) to assess the overall fit of a one-parameter logistic model (1PL) estimated by (marginal) maximum…
ERIC Educational Resources Information Center
Schochet, Peter Z.
2009-01-01
For RCTs of education interventions, it is often of interest to estimate associations between student and mediating teacher practice outcomes, to examine the extent to which the study's conceptual model is supported by the data, and to identify specific mediators that are most associated with student learning. This paper develops statistical power…
NASA Astrophysics Data System (ADS)
Masuta, Taisuke; Gunjikake, Yasutoshi; Yokoyama, Akihiko; Tada, Yasuyuki
Nowadays, electric power systems confront many problems, such as environmental issues, aging infrastructure, energy security, and quality of electricity supply. The smart grid is a new concept for a better future grid, which enables us to solve these problems with Information and Communication Technology (ICT). In this research, a number of Heat Pump Water Heaters (HPWHs), one type of energy-efficient customer equipment, and a Battery Energy Storage System (BESS) are considered as controllable equipment for frequency control. The utilization of customer equipment such as HPWHs for power system control is one of the key elements of the Ubiquitous Power Grid concept, which was proposed by our research group as a smart grid in the Japanese context. Frequency control using a number of HPWHs with thermal storage in the hot water tank is evaluated. Moreover, a novel statistical model of controllable HPWHs that takes into account customers' convenience and uncertainty is proposed.
1986-01-01
Official population data for the USSR are presented for 1985 and 1986. Part 1 (pp. 65-72) contains data on capitals of union republics and cities with over one million inhabitants, including population estimates for 1986 and vital statistics for 1985. Part 2 (p. 72) presents population estimates by sex and union republic, 1986. Part 3 (pp. 73-6) presents data on population growth, including birth, death, and natural increase rates, 1984-1985; seasonal distribution of births and deaths; birth order; age-specific birth rates in urban and rural areas and by union republic; marriages; age at marriage; and divorces. PMID:12178831
2013-01-01
Background Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Methods Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. Results The statistical significance of the RV increased as the magnitude of denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Larger sample size with a fixed denominator F-statistic or more bootstrap replicates (beyond 500) had minimal impact. Conclusions The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r < 0.7). Substantially greater power can be achieved when comparing measures of a very high correlation (r > 0.9). PMID:23721463
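A minimal sketch of the RV-plus-bootstrap idea described above, using ordinary one-way ANOVA F-statistics from SciPy on synthetic data; the group structure, effect sizes, and function names are all invented and this is not the authors' implementation.

```python
import numpy as np
from scipy.stats import f_oneway

def relative_validity(comparator, reference, groups):
    """RV = F(comparator) / F(reference) across clinical groups."""
    labels = np.unique(groups)
    f_comp = f_oneway(*(comparator[groups == g] for g in labels)).statistic
    f_ref = f_oneway(*(reference[groups == g] for g in labels)).statistic
    return f_comp / f_ref

def bootstrap_rv_ci(comparator, reference, groups, n_boot=500, seed=0):
    """95% percentile bootstrap CI for the RV, resampling patients
    with replacement."""
    rng = np.random.default_rng(seed)
    n = len(groups)
    rvs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        if np.unique(groups[idx]).size < np.unique(groups).size:
            continue  # skip resamples that lose an entire group
        rvs.append(relative_validity(comparator[idx], reference[idx],
                                     groups[idx]))
    return np.percentile(rvs, [2.5, 97.5])

# Synthetic data: 3 severity groups; the reference measure separates
# them twice as strongly as the comparator, so RV < 1 is expected.
rng = np.random.default_rng(5)
groups = np.repeat([0, 1, 2], 150)
reference = groups + rng.normal(0.0, 1.0, size=450)
comparator = 0.5 * groups + rng.normal(0.0, 1.0, size=450)

lo, hi = bootstrap_rv_ci(comparator, reference, groups)
```

A confidence interval whose upper bound falls below 1 indicates the comparator discriminates significantly less well than the reference measure, matching the paper's use of the 95% bootstrap interval to quantify RV significance.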
Cappon, Gregg D; Bowman, Christopher J; Hurtt, Mark E; Grantham, Lonnie E
2012-10-01
An important aspect of the enhanced pre- and postnatal developmental (ePPND) toxicity study in nonhuman primates (NHP) is that it combines in utero and postnatal assessments in a single study. However, it is unclear if NHP ePPND studies are suitable to perform all of the evaluations incorporated into rodent PPND studies. To understand the value of including cognitive assessment in a NHP ePPND toxicity study, we performed a power analysis of object discrimination reversal task data using a modified Wisconsin General Testing Apparatus (ODR-WGTA) from two NHP ePPND studies. ODR-WGTA endpoints evaluated were days to learning and to first reversal, and number of reversals. With α = 0.05 and a one-sided t-test, a sample of seven provided 80% power to predict a 100% increase in all three of the ODR-WGTA endpoints; a sample of 25 provided 80% power to predict a 50% increase. Similar power analyses were performed with data from the Cincinnati Water Maze (CWM) and passive avoidance tests from three rat PPND toxicity studies. Groups of 5 and 15 in the CWM and passive avoidance test, respectively, provided 80% power to detect a 100% change. While the power of the CWM is not far superior to the NHP ODR-WGTA, a clear advantage is the routine use of larger sample size, with a group of 20 rats the CWM provides ~90% power to detect a 50% change. Due to the limitations on the number of animals, the ODR-WGTA may not be suitable for assessing cognitive impairment in NHP ePPND studies. PMID:22930561
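The variance assumptions behind the reported power figures are not given here; the following Monte Carlo sketch only shows the general shape of such a calculation for a one-sided two-sample t-test. The coefficient of variation cv=0.5 and all other settings are invented, so the resulting power does not reproduce the study's numbers.

```python
import numpy as np
from scipy import stats

def mc_power(n, effect, cv=0.5, alpha=0.05, n_sim=4000, seed=6):
    """Monte Carlo power of a one-sided two-sample t-test to detect a
    proportional increase `effect` (1.0 = 100%) with `n` animals per
    group and coefficient of variation `cv` (all settings illustrative)."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        control = rng.normal(1.0, cv, size=n)
        treated = rng.normal(1.0 + effect, cv, size=n)
        t, p = stats.ttest_ind(treated, control)
        if t > 0 and p / 2 < alpha:  # one-sided, increase direction
            hits += 1
    return hits / n_sim

power_n7 = mc_power(n=7, effect=1.0)  # detect a 100% increase with n = 7
```

The study's point stands out in such calculations: with endpoints as variable as cognitive-task scores, small NHP group sizes only provide adequate power for very large effects.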
21 CFR 314.126 - Adequate and well-controlled studies.
Code of Federal Regulations, 2011 CFR
2011-04-01
...-evident (general anesthetics, drug metabolism). (3) The method of selection of subjects provides adequate... respect to pertinent variables such as age, sex, severity of disease, duration of disease, and use of... 21 Food and Drugs 5 2011-04-01 2011-04-01 false Adequate and well-controlled studies....
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...
Calculation of the Cost of an Adequate Education in Kentucky: A Professional Judgment Approach
ERIC Educational Resources Information Center
Verstegen, Deborah A.
2004-01-01
What is an adequate education and how much does it cost? In 1989, Kentucky's State Supreme Court found the entire system of education unconstitutional--"all of its parts and parcels". The Court called for all children to have access to an adequate education, one that is uniform and has as its goal the development of seven capacities, including:…
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian and adequate veterinary care. (a)...
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2012 CFR
2012-01-01
... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending Veterinarian and Adequate Veterinary Care §...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-15
... SAFETY BOARD Safety Analysis Requirements for Defining Adequate Protection for the Public and the Workers... TO THE SECRETARY OF ENERGY Safety Analysis Requirements for Defining Adequate Protection for the... safety analysis, or DSA, is to be prepared for every DOE nuclear facility. This DSA, once approved by...
Shoulder Arthroscopy Does Not Adequately Visualize Pathology of the Long Head of Biceps Tendon
Saithna, Adnan; Longo, Alison; Leiter, Jeff; Old, Jason; MacDonald, Peter M.
2016-01-01
Background: Pulling the long head of the biceps tendon into the joint at arthroscopy is a common method for evaluation of tendinopathic lesions. However, the rate of missed diagnoses when using this technique is reported to be as high as 30% to 50%. Hypothesis: Tendon excursion achieved using a standard arthroscopic probe does not allow adequate visualization of extra-articular sites of predilection of tendinopathy. Study Design: Descriptive laboratory study. Methods: Seven forequarter amputation cadaveric specimens were evaluated. The biceps tendon was tagged to mark the intra-articular length and the maximum excursions achieved using a probe and a grasper in both beach-chair and lateral positions. Statistical analyses were performed using analysis of variance to compare means. Results: The mean intra-articular and extra-articular lengths of the tendons were 23.9 and 82.3 mm, respectively. The length of tendon that could be visualized by pulling it into the joint with a probe through the anterior midglenoid portal was not significantly different when using either lateral decubitus (mean ± SD, 29.9 ± 3.89 mm; 95% CI, 25.7-34 mm) or beach-chair positions (32.7 ± 4.23 mm; 95% CI, 28.6-36.8 mm). The maximum length of the overall tendon visualized in any specimen using a standard technique was 37 mm. Although there was a trend to greater excursion using a grasper through the same portal, this was not statistically significant. However, using a grasper through the anterosuperior portal gave a significantly greater mean excursion than any other technique (46.7 ± 4.31 mm; 95% CI, 42.6-50.8 mm), but this still failed to allow evaluation of Denard zone C. Conclusion: Pulling the tendon into the joint with a probe via an anterior portal does not allow visualization of distal sites of predilection of pathology. Surgeons should be aware that this technique is inadequate and can result in missed diagnoses. Clinical Relevance: This study demonstrates that glenohumeral
Technology Transfer Automated Retrieval System (TEKTRAN)
This experiment was designed to determine whether hemoglobin as the sole source of dietary iron (Fe) could sustain normal Fe status in growing rats. Because adequate copper (Cu) status is required for efficient Fe absorption in the rat, we also determined the effects of Cu deficiency on Fe status of...
[Big data in official statistics].
Zwick, Markus
2015-08-01
The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany. PMID:26077871
NASA Astrophysics Data System (ADS)
Lambert, I. B.
2012-04-01
This presentation will consider the adequacy of global uranium and thorium resources to meet realistic nuclear power demand scenarios over the next half century. It is presented on behalf of, and based on evaluations by, the Uranium Group - a joint initiative of the OECD Nuclear Energy Agency and the International Atomic Energy Agency, of which the author is a Vice Chair. The Uranium Group produces a biennial report on Uranium Resources, Production and Demand based on information from some 40 countries involved in the nuclear fuel cycle, which also briefly reviews thorium resources. Uranium: In 2008, world production of uranium amounted to almost 44,000 tonnes (tU). This supplied approximately three-quarters of world reactor requirements (approx. 59,000 tU), the remainder being met by previously mined uranium (so-called secondary sources). Information on availability of secondary sources - which include uranium from excess inventories, dismantling nuclear warheads, tails and spent fuel reprocessing - is incomplete, but such sources are expected to decrease in market importance after 2013. In 2008, the total world Reasonably Assured plus Inferred Resources of uranium (recoverable at less than USD 130/kgU) amounted to 5.4 million tonnes. In addition, it is clear that there are vast amounts of uranium recoverable at higher costs in known deposits, plus many as yet undiscovered deposits. The Uranium Group has concluded that the uranium resource base is more than adequate to meet projected high-case requirements for nuclear power for at least half a century. This conclusion does not assume increasing replacement of uranium by fuels from reprocessing current reactor wastes, or by thorium, nor greater reactor efficiencies, which are likely to ameliorate future uranium demand. However, progressively increasing quantities of uranium will need to be mined, against a backdrop of the relatively small number of producing facilities around the world, geopolitical uncertainties and
Code of Federal Regulations, 2010 CFR
2010-10-01
... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... of maintaining adequate technical, physical, and security safeguards to prevent...
Statistical Physics of Fracture
Alava, Mikko; Nukala, Phani K; Zapperi, Stefano
2006-05-01
Disorder and long-range interactions are two of the key components that make material failure an interesting playfield for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture, in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds is subjected to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among the partly still controversial issues are the scaling and size effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.
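The review above concerns lattice models in which elements with random failure thresholds break under a slowly increasing load. A minimal relative of those models, the equal-load-sharing fiber bundle, can be sketched in a few lines; this is an illustrative toy model for counting failure avalanches under quasi-static loading, not the beam, bond, or fuse lattices discussed in the review:

```python
import random

def fiber_bundle_avalanches(n=10000, seed=1):
    """Quasi-static equal-load-sharing fiber bundle model.

    n fibers with i.i.d. uniform strength thresholds share the external
    load equally.  The load is raised just enough to break the weakest
    surviving fiber; the load shed onto the survivors may break further
    fibers, producing an avalanche.  Returns the list of avalanche sizes.
    """
    rng = random.Random(seed)
    thresholds = sorted(rng.random() for _ in range(n))
    # External load needed to break fiber i once fibers 0..i-1 have failed:
    # each of the (n - i) survivors carries F / (n - i), so F = t_i * (n - i).
    loads = [thresholds[i] * (n - i) for i in range(n)]
    avalanches = []
    i = 0
    while i < n:
        trigger = loads[i]  # raise the external load to this value
        size = 0
        while i < n and loads[i] <= trigger:
            i += 1          # fiber i fails at the current load
            size += 1
        avalanches.append(size)
    return avalanches
```

For the equal-load-sharing bundle the avalanche-size distribution is known analytically to follow a power law (exponent 5/2), which makes this toy model a useful benchmark before moving to full lattice simulations.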
Dios, R.A.
1984-01-01
This dissertation focuses on the field of probabilistic risk assessment and its development, investigating how probabilistic risk assessment developed in nuclear engineering. To provide background, the related areas of population dynamics (demography), epidemiology, and actuarial science are studied by presenting information on how risk has been viewed in these areas over the years. A second major component presents an overview of the mathematical models related to risk analysis for mathematics educators, with recommendations for presenting this theory in classes of probability and statistics for mathematics and engineering majors at the undergraduate and graduate levels.
Barth, Amy E.; Barnes, Marcia; Francis, David J.; Vaughn, Sharon; York, Mary
2015-01-01
Separate mixed model analyses of variance (ANOVA) were conducted to examine the effect of textual distance on the accuracy and speed of text consistency judgments among adequate and struggling comprehenders across grades 6–12 (n = 1203). Multiple regressions examined whether accuracy in text consistency judgments uniquely accounted for variance in comprehension. Results suggest that there is considerable growth across the middle and high school years, particularly for adequate comprehenders in those text integration processes that maintain local coherence. Accuracy in text consistency judgments accounted for significant unique variance for passage-level, but not sentence-level comprehension, particularly for adequate comprehenders. PMID:26166946
Pitfalls in statistical landslide susceptibility modelling
NASA Astrophysics Data System (ADS)
Schröder, Boris; Vorpahl, Peter; Märker, Michael; Elsenbeer, Helmut
2010-05-01
The use of statistical methods is a well-established approach to predict landslide occurrence probabilities and to assess landslide susceptibility. This is achieved by applying statistical methods that relate historical landslide inventories to topographic indices as predictor variables. In our contribution, we compare several new and powerful methods developed in machine learning and well established in landscape ecology and macroecology for predicting the distribution of shallow landslides in tropical mountain rainforests in southern Ecuador (among others: boosted regression trees, multivariate adaptive regression splines, maximum entropy). Although these methods are powerful, we think it is necessary to follow a basic set of guidelines to avoid pitfalls regarding data sampling, predictor selection, and model quality assessment, especially if a comparison of different models is contemplated. We therefore suggest applying a novel toolbox to evaluate approaches to the statistical modelling of landslide susceptibility. Additionally, we propose some methods to open the "black box" inherent in machine learning methods in order to achieve further explanatory insights into the preparatory factors that control landslides. Sampling of training data should be guided by hypotheses regarding the processes that lead to slope failure, taking into account their respective spatial scales. This approach leads to the selection of a set of candidate predictor variables considered on adequate spatial scales. This set should be checked for multicollinearity in order to facilitate interpretation of model response curves. Model quality assessment measures how well a model is able to reproduce independent observations of its response variable. This includes criteria to evaluate different aspects of model performance, i.e. model discrimination, model calibration, and model refinement. In order to assess a possible violation of the assumption of independence in the training samples or a possible
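The multicollinearity check recommended in the abstract is commonly done with variance inflation factors. A minimal NumPy sketch follows; it is a generic implementation, not the authors' toolbox, and the predictor columns in the test are hypothetical:

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of the predictor
    matrix X (rows = observations).  VIF_j = 1 / (1 - R_j^2), where
    R_j^2 is from regressing column j on the other columns plus an
    intercept.  Values above ~10 are a common warning sign of
    problematic collinearity."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        # Design matrix: intercept plus all predictors except column j.
        A = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        r2 = 1.0 - (y - A @ beta).var() / y.var()
        out.append(1.0 / max(1.0 - r2, 1e-12))  # guard exact collinearity
    return out
```

Dropping or combining predictors until all VIFs are modest keeps the fitted response curves interpretable, which is the point made in the abstract.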
The Effects of Flare Definitions on the Statistics of Derived Flare Distributions
NASA Astrophysics Data System (ADS)
Ryan, Daniel; Dominique, Marie; Seaton, Daniel B.; Stegen, Koen; White, Arthur
2016-05-01
The statistical examination of solar flares is crucial to revealing their global characteristics and behaviour. However, statistical flare studies are often performed using standard but basic flare detection algorithms relying on arbitrary thresholds, which may affect the derived flare distributions. We explore the effect of the arbitrary thresholds used in the GOES event list and LYRA Flare Finder algorithms. We find that there is a small but significant relationship between the power-law exponent of the GOES flare peak flux frequency distribution and the algorithms' flare start thresholds. We also find that the power-law exponents of these distributions are not stable but appear to steepen with increasing peak flux. This implies that the observed flare size distribution may not be a power law at all. We show that, depending on the true value of the exponent of the flare size distribution, this deviation from a power law may be due to flares missed by the flare detection algorithms. However, it is not possible to determine the true exponent from GOES/XRS observations. Additionally, we find that the PROBA2/LYRA flare size distributions are clearly not power laws. We show that this is consistent with an insufficient degradation correction, which causes LYRA absolute irradiance values to be unreliable. This means that they should not be used for flare statistics or energetics unless degradation is adequately accounted for. However, they can be used to study time variations over shorter timescales and for space weather monitoring.
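The threshold dependence discussed above can be probed with the standard maximum-likelihood estimator for a continuous power law above a lower cutoff (the Hill/Clauset-style estimator). The sketch below uses synthetic samples, not GOES or LYRA fluxes; for true power-law data, raising the cutoff (analogous to a stricter flare start threshold) should leave the fitted exponent essentially unchanged:

```python
import math
import random

def powerlaw_mle(samples, xmin):
    """Continuous power-law MLE above a lower cutoff xmin:
    alpha_hat = 1 + n / sum(ln(x_i / xmin)) over the x_i >= xmin."""
    tail = [x for x in samples if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

def sample_powerlaw(alpha, xmin, n, seed=0):
    """Inverse-CDF sampling from p(x) ~ x^-alpha for x >= xmin:
    x = xmin * (1 - u)^(-1 / (alpha - 1)) with u uniform on [0, 1)."""
    rng = random.Random(seed)
    return [xmin * (1 - rng.random()) ** (-1.0 / (alpha - 1))
            for _ in range(n)]
```

A systematic drift of the fitted exponent as the cutoff is raised, rather than stability, is the kind of signature the paper interprets as a deviation from power-law behaviour.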
NASA Astrophysics Data System (ADS)
Murphy, Brendan; Simon, Anna
2015-10-01
p-process nucleosynthesis is believed to be the origin of 35 stable, proton-rich nuclei called "p-nuclei" that cannot be synthesized by neutron captures. The complex p-process network includes, among others, (α, γ) reactions, whose cross sections are not very well described by current theoretical models. Here, a collection of experimentally measured (α, γ) reactions from the KADoNiS-p database was used as a test for various models obtained from TALYS, a nuclear reaction program, and NON-SMOKER, the principal theoretical database in this field. Statistical models in this investigation required the alpha optical model potential (aOMP), the gamma strength function (gSF), and the level density model (ld) as input. Permutations of all three were used in theoretical calculations; as there exist 5 separate models for aOMP and gSF, and 6 for ld, there were 150 combinations of interest. After calculating cross sections with these parameters, a χ2 test was used to determine the combination that came closest to the experimental data. The (α, γ) reaction on a 91Zr target is presented as the example case.
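The 150-combination search described above amounts to a grid search minimizing χ2 against the measured cross sections. A sketch follows, with a stand-in `predict` callable in place of actual TALYS runs and purely hypothetical data:

```python
import itertools

def chi2(pred, obs, sigma):
    """Chi-squared of predicted vs. observed values with uncertainties."""
    return sum((p - o) ** 2 / s ** 2 for p, o, s in zip(pred, obs, sigma))

def best_combination(predict, aomps, gsfs, lds, obs, sigma):
    """Evaluate every (aOMP, gSF, ld) combination and return the pair
    (chi2, combo) minimizing chi2 against the measured cross sections.
    `predict(aomp, gsf, ld)` stands in for a model-code run."""
    best = None
    for combo in itertools.product(aomps, gsfs, lds):
        c2 = chi2(predict(*combo), obs, sigma)
        if best is None or c2 < best[0]:
            best = (c2, combo)
    return best
```

With 5 aOMP, 5 gSF, and 6 ld choices, `itertools.product` enumerates exactly the 150 combinations mentioned in the abstract.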
Nonstationary statistical theory for multipactor
Anza, S.; Vicente, C.; Gil, J.
2010-06-15
This work presents a new and general approach to the real dynamics of the multipactor process: the nonstationary statistical multipactor theory. The nonstationary theory removes the stationarity assumption of the classical theory and, as a consequence, it is able to adequately model electron exponential growth as well as absorption processes, above and below the multipactor breakdown level. In addition, it considers both double-surface and single-surface interactions constituting a full framework for nonresonant polyphase multipactor analysis. This work formulates the new theory and validates it with numerical and experimental results with excellent agreement.
Statistical properties of exoplanets
NASA Astrophysics Data System (ADS)
Udry, Stéphane
Since the detection a decade ago of the planetary companion of 51 Peg, more than 165 extrasolar planets have been unveiled by radial-velocity measurements. They present a wide variety of characteristics, such as large masses with small orbital separations, high eccentricities, period resonances in multi-planet systems, etc. Meaningful features of the statistical distributions of the orbital parameters or parent stellar properties have emerged. We discuss them in the context of the constraints they provide for planet-formation models and in comparison to Neptune-mass planets in short-period orbits recently detected by radial-velocity surveys, thanks to new instrumental developments and adequate observing strategies. We expect continued improvement in velocity precision and anticipate the detection of Neptune-mass planets in longer-period orbits and even lower-mass planets in short-period orbits, giving us new information on the mass distribution function of exoplanets. Finally, the role of radial-velocity follow-up measurements of transit candidates is emphasized.
Cosmetic Plastic Surgery Statistics
2014 Cosmetic Plastic Surgery Statistics Cosmetic Procedure Trends 2014 Plastic Surgery Statistics Report Please credit the AMERICAN SOCIETY OF PLASTIC SURGEONS when citing statistical data or using ...
A method for determining adequate resistance form of complete cast crown preparations.
Weed, R M; Baez, R J
1984-09-01
A diagram with various degrees of occlusal convergence, which takes into consideration the length and diameter of complete crown preparations, was designed as a guide to assist the dentist in obtaining adequate resistance form. To test the validity of the diagram, five groups of complete cast crown stainless steel dies were prepared (3.5 mm long; occlusal convergence 10, 13, 16, 19, and 22 degrees). Gold copings were cast for each of the 50 preparations. Displacement force was applied to each casting perpendicular to a simulated 30-degree cuspal incline until the casting was displaced. Castings were deformed at the margins except for the 22-degree group. Castings from this group were displaced without deformation, and it was concluded that there was a lack of adequate resistance form, as predicted by the diagram. The hypothesis that the diagram could be used to predict adequate or inadequate resistance form was confirmed by this study. PMID:6384470
Comparison of four standards for determining adequate water intake of nursing home residents.
Gaspar, Phyllis M
2011-01-01
Adequate hydration for nursing home residents is problematic. The purpose of this study was to compare four standards used to determine a recommended water intake among nursing home residents. Inconsistencies in the amounts of water intake recommended by the standards were identified. The standard based on height and weight provides the most individualized recommendation. An individualized recommendation would facilitate goal setting for the care plan of each older person and assist in the prevention of dehydration. It is essential that a cost-effective and clinically feasible approach to determining adequate water intake be established for this population to prevent the adverse outcomes associated with dehydration. PMID:21469538
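For illustration, some commonly published intake standards can be computed directly. These formulas are assumptions for the sketch: whether they correspond to the four standards compared in the study is not asserted, and the height-and-weight standard the author favors is not reproduced because its formula is not given in the abstract:

```python
def water_recommendations(weight_kg, kcal=None):
    """Daily water-intake recommendations (mL) under commonly published
    standards; illustrative only, not necessarily the study's four."""
    recs = {"30 mL/kg": 30 * weight_kg}
    # 100-50-15 rule: 100 mL/kg for the first 10 kg of body weight,
    # 50 mL/kg for the next 10 kg, 15 mL/kg for each kg above 20 kg.
    w = weight_kg
    recs["100-50-15"] = (100 * min(w, 10)
                         + 50 * min(max(w - 10, 0), 10)
                         + 15 * max(w - 20, 0))
    if kcal is not None:
        recs["1 mL/kcal"] = kcal  # tied to estimated energy intake
    return recs
```

Even for a single 70-kg resident these standards disagree by over 100 mL/day, which illustrates the inconsistency the study reports.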
Statistics in fusion experiments
NASA Astrophysics Data System (ADS)
McNeill, D. H.
1997-11-01
Since the reasons for the variability in data from plasma experiments are often unknown or uncontrollable, statistical methods must be applied. Reliable interpretation and public accountability require full data sets. Two examples of data misrepresentation at PPPL are analyzed: (1) Te > 100 eV on the S-1 spheromak. (M. Yamada, Nucl. Fusion 25, 1327 (1985); reports to DoE; etc.) The reported high values (statistical artifacts of Thomson scattering measurements) were selected from a mass of data with an average of 40 eV or less. "Correlated" spectroscopic data were meaningless. (2) Extrapolation to Q >= 0.5 for DT in TFTR. (D. Meade et al., IAEA Baltimore (1990), V. 1, p. 9; H. P. Furth, Statements to U.S. Congress (1989).) The DD yield used there was the highest through 1990 (>= 50% above average) and the DT to DD power ratio used was about twice any published value. Average DD yields and published yield ratios scale to Q < 0.15 for DT, in accord with the observed performance over the last 3 1/2 years. Press reports of outlier data from TFTR have obscured the fact that the DT behavior follows from trivial scaling of the DD data. Good practice in future fusion research would have confidence intervals and other descriptive statistics accompanying reported numerical values (cf. JAMA).
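The closing recommendation, that reported values carry confidence intervals and other descriptive statistics, can be sketched with the standard library alone. The normal-approximation interval below assumes a reasonably large sample; small samples would need a slightly wider Student-t multiplier:

```python
from statistics import NormalDist, mean, stdev

def report(values, level=0.95):
    """Return (mean, (lo, hi)): the sample mean with a
    normal-approximation confidence interval for the mean."""
    n = len(values)
    m = mean(values)
    se = stdev(values) / n ** 0.5       # standard error of the mean
    z = NormalDist().inv_cdf(0.5 + level / 2)  # ~1.96 for 95%
    return m, (m - z * se, m + z * se)
```

Quoting the interval alongside the mean makes outlier-driven claims, such as reporting only the highest yields from a mass of lower values, easy to spot.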
Bello-Silva, Marina Stella; Wehner, Martin; Eduardo, Carlos de Paula; Lampert, Friedrich; Poprawe, Reinhart; Hermans, Martin; Esteves-Oliveira, Marcella
2013-01-01
This study aimed to evaluate the possibility of introducing ultra-short pulsed lasers (USPL) in restorative dentistry by maintaining the well-known benefits of lasers for caries removal while overcoming disadvantages such as thermal damage of the irradiated substrate. USPL ablation of dental hard tissues was investigated in two phases. Phase 1: different wavelengths (355, 532, 1,045, and 1,064 nm), pulse durations (picoseconds and femtoseconds), and irradiation parameters (scanning speed, output power, and pulse repetition rate) were assessed for enamel and dentin. The ablation rate was determined, and the temperature increase measured in real time. Phase 2: the most favorable laser parameters were evaluated to correlate temperature increase with ablation rate and ablation efficiency. The influence of cooling methods (air, air-water spray) on the ablation process was further analyzed. All parameters tested provided precise and selective tissue ablation. For all lasers, faster scanning speeds resulted in better interaction and reduced temperature increase. The most adequate results were observed for the 1,064-nm ps-laser and the 1,045-nm fs-laser. Forced cooling caused moderate changes in temperature increase but reduced ablation, and was considered unnecessary during irradiation with USPL. For dentin, the correlation between temperature increase and ablation efficiency was satisfactory for both pulse durations, while for enamel, the best correlation was observed for the fs-laser, independently of the power used. USPL may be suitable for cavity preparation in dentin and enamel, since effective ablation and low temperature increase were observed. If adequate laser parameters are selected, this technique seems promising for a laser-assisted, minimally invasive approach. PMID:22565342
Not Available
1994-12-08
This report presents a summary of electric power industry statistics at national, regional, and state levels: generating capability and additions, net generation, fossil-fuel statistics, retail sales and revenue, financial statistics, environmental statistics, power transactions, demand-side management, and nonutility power producers. Its purpose is to provide industry decisionmakers, government policymakers, analysts, and the public with historical data that may be used in understanding US electricity markets.
ERIC Educational Resources Information Center
Ma, Xin; Shen, Jianping; Krenn, Huilan Y.
2014-01-01
Using national data from the 2007-08 School and Staffing Survey, we compared the relationships between parental involvement and school outcomes related to adequate yearly progress (AYP) in urban, suburban, and rural schools. Parent-initiated parental involvement demonstrated significantly positive relationships with both making AYP and staying off…
Influenza 2005-2006: vaccine supplies adequate, but bird flu looms.
Mossad, Sherif B
2005-11-01
Influenza vaccine supplies appear to be adequate for the 2005-2006 season, though delivery has been somewhat delayed. However, in the event of a pandemic of avian flu (considered inevitable by most experts, although no one knows when it will happen), the United States would be woefully unprepared. PMID:16315443
Calculating and Reducing Errors Associated with the Evaluation of Adequate Yearly Progress.
ERIC Educational Resources Information Center
Hill, Richard
In the Spring 1996 issue of "CRESST Line," E. Baker and R. Linn commented that, in efforts to measure the progress of schools, "the fluctuations due to differences in the students themselves could conceal differences in instructional effects." This is particularly true in the context of the evaluation of adequate yearly progress required by…
ERIC Educational Resources Information Center
Daugherty, Lindsay; Dossani, Rafiq; Johnson, Erin-Elizabeth; Wright, Cameron
2014-01-01
To realize the potential benefits of technology use in early childhood education (ECE), and to ensure that technology can help to address the digital divide, providers, families of young children, and young children themselves must have access to an adequate technology infrastructure. The goals for technology use in ECE that a technology…
Prenatal zinc supplementation of zinc-adequate rats adversely affects immunity in offspring
Technology Transfer Automated Retrieval System (TEKTRAN)
We previously showed that zinc (Zn) supplementation of Zn-adequate dams induced immunosuppressive effects that persist in the offspring after weaning. We investigated whether the immunosuppressive effects were due to in utero exposure and/or mediated via milk using a cross-fostering design. Pregnant...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-05
... FR 51735. Executive Order 13132, Federalism. This rule involves no policies that have ] federalism....C. 4001 et seq., Reorganization Plan No. 3 of 1978, 3 CFR, 1978 Comp., p. 329; E.O. 12127, 44 FR... To Maintain Adequate Floodplain Management Regulations AGENCY: Federal Emergency Management...
26 CFR 1.467-2 - Rent accrual for section 467 rental agreements without adequate interest.
Code of Federal Regulations, 2010 CFR
2010-04-01
... provide for a variable rate of interest. For purposes of the adequate interest test under paragraph (b)(1) of this section, if a section 467 rental agreement provides for variable interest, the rental... date as the issue date) for the variable rates called for by the rental agreement. For purposes of...
The Unequal Effect of Adequate Yearly Progress: Evidence from School Visits
ERIC Educational Resources Information Center
Brown, Abigail B.; Clift, Jack W.
2010-01-01
The authors report insights, based on annual site visits to elementary and middle schools in three states from 2004 to 2006, into the incentive effect of the No Child Left Behind Act's requirement that increasing percentages of students make Adequate Yearly Progress (AYP) in every public school. They develop a framework, drawing on the physics…
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 9 Animals and Animal Products 1 2011-01-01 2011-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...
ERIC Educational Resources Information Center
Moser, Sharon
2010-01-01
The 2007-2008 school year marked the first year Florida's Title I schools that did not make Adequate Yearly Progress (AYP) for five consecutive years entered into restructuring as mandated by the "No Child Left Behind Act" of 2001. My study examines the perceptions of teachers entering their first year of school restructuring due to failure to…
A Model for Touch Technique and Computation of Adequate Cane Length.
ERIC Educational Resources Information Center
Plain-Switzer, Karen
1993-01-01
This article presents a model for the motion of a long cane executing the touch technique and presents formulas for the projected length of a cane adequate to protect an individual with blindness against wall-type and pole-type hazards. The paper concludes that the long cane should reach from the floor to the user's armpit. (JDD)
Towards Defining Adequate Lithium Trials for Individuals with Mental Retardation and Mental Illness.
ERIC Educational Resources Information Center
Pary, Robert J.
1991-01-01
Use of lithium with mentally retarded individuals with psychiatric conditions and/or behavior disturbances is discussed. The paper describes components of an adequate clinical trial and reviews case studies and double-blind cases. The paper concludes that aggression is the best indicator for lithium use, and reviews treatment parameters and…
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2014 CFR
2014-01-01
... animal health, behavior, and well-being is conveyed to the attending veterinarian; (4) Guidance to... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION...
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2013 CFR
2013-01-01
... on problems of animal health, behavior, and well-being is conveyed to the attending veterinarian; (4... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND...
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2010 CFR
2010-01-01
... on problems of animal health, behavior, and well-being is conveyed to the attending veterinarian; (4... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND...
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2011 CFR
2011-01-01
... on problems of animal health, behavior, and well-being is conveyed to the attending veterinarian; (4... 9 Animals and Animal Products 1 2011-01-01 2011-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND...
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2013 CFR
2013-01-01
... animal health, behavior, and well-being is conveyed to the attending veterinarian; (4) Guidance to... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION...
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2014 CFR
2014-01-01
... on problems of animal health, behavior, and well-being is conveyed to the attending veterinarian; (4... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND...
ERIC Educational Resources Information Center
Wilcox, Jennifer E.
2011-01-01
This mixed-methods study researched the special education background experience of principals and the effect on students in the subgroup of Students with Disabilities in making Adequate Yearly Progress (AYP). In the state of Ohio, schools and districts are expected to make AYP as a whole and additionally make AYP for each subgroup (various…
ERIC Educational Resources Information Center
Barth, Amy E.; Barnes, Marcia; Francis, David; Vaughn, Sharon; York, Mary
2015-01-01
Separate mixed model analyses of variance were conducted to examine the effect of textual distance on the accuracy and speed of text consistency judgments among adequate and struggling comprehenders across grades 6-12 (n = 1,203). Multiple regressions examined whether accuracy in text consistency judgments uniquely accounted for variance in…
42 CFR 438.207 - Assurances of adequate capacity and services.
Code of Federal Regulations, 2010 CFR
2010-10-01
... with the State's requirements for availability of services, as set forth in § 438.206. (e) CMS' right... HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Quality Assessment and Performance... 42 Public Health 4 2010-10-01 2010-10-01 false Assurances of adequate capacity and services....
42 CFR 438.207 - Assurances of adequate capacity and services.
Code of Federal Regulations, 2014 CFR
2014-10-01
... requirements: (1) Offers an appropriate range of preventive, primary care, and specialty services that is adequate for the anticipated number of enrollees for the service area. (2) Maintains a network of providers... enrollment in its service area in accordance with the State's standards for access to care under this...
42 CFR 438.207 - Assurances of adequate capacity and services.
Code of Federal Regulations, 2012 CFR
2012-10-01
... requirements: (1) Offers an appropriate range of preventive, primary care, and specialty services that is adequate for the anticipated number of enrollees for the service area. (2) Maintains a network of providers... enrollment in its service area in accordance with the State's standards for access to care under this...
42 CFR 438.207 - Assurances of adequate capacity and services.
Code of Federal Regulations, 2013 CFR
2013-10-01
... requirements: (1) Offers an appropriate range of preventive, primary care, and specialty services that is adequate for the anticipated number of enrollees for the service area. (2) Maintains a network of providers... enrollment in its service area in accordance with the State's standards for access to care under this...
Effect of tranquilizers on animal resistance to the adequate stimuli of the vestibular apparatus
NASA Technical Reports Server (NTRS)
Maksimovich, Y. B.; Khinchikashvili, N. V.
1980-01-01
The effect of tranquilizers on vestibulospinal reflexes and motor activity was studied in 900 centrifuged albino mice. Actometric studies have shown that the tranquilizers have a group capacity for increasing animal resistance to the action of adequate stimuli to the vestibular apparatus.
21 CFR 314.126 - Adequate and well-controlled studies.
Code of Federal Regulations, 2010 CFR
2010-04-01
... conducting clinical investigations of a drug is to distinguish the effect of a drug from other influences... recognized by the scientific community as the essentials of an adequate and well-controlled clinical... randomization and blinding of patients or investigators, or both. If the intent of the trial is to...
Final 2004 Report on Adequate Yearly Progress in the Montgomery County Public Schools
ERIC Educational Resources Information Center
Stevenson, Jose W.
2005-01-01
The vast majority of Montgomery County public schools made sufficient progress on state testing and accountability standards in 2004 to comply with the adequate yearly progress (AYP) requirements under the "No Child Left Behind (NCLB) Act of 2001." Information released by the Maryland State Department of Education (MSDE) in October 2004 shows that…
Estimates of Adequate School Spending by State Based on National Average Service Levels.
ERIC Educational Resources Information Center
Miner, Jerry
1983-01-01
Proposes a method for estimating expenditures per student needed to provide educational adequacy in each state. Illustrates the method using U.S., Arkansas, New York, Texas, and Washington State data, covering instruction, special needs, operations and maintenance, administration, and other costs. Estimates ratios of "adequate" to actual spending…
ERIC Educational Resources Information Center
Leapley-Portscheller, Claudia Iris
2008-01-01
Principals are responsible for leading efforts to reach increasingly higher levels of student academic proficiency in schools associated with adequate yearly progress (AYP) requirements. The purpose of this quantitative, correlational study was to identify the degree to which perceptions of principal transformational, transactional, and…
Percentage of Adults with High Cholesterol Whose LDL Cholesterol Levels Are Adequately Controlled
... of Adults with High Cholesterol Whose LDL Cholesterol Levels are Adequately Controlled. High cholesterol can double a ... with High Cholesterol that is Controlled by Education Level ...
42 CFR 413.24 - Adequate cost data and cost finding.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 2 2010-10-01 2010-10-01 false Adequate cost data and cost finding. 413.24 Section 413.24 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE PROGRAM PRINCIPLES OF REASONABLE COST REIMBURSEMENT; PAYMENT FOR END-STAGE RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY...
ERIC Educational Resources Information Center
Meyer, Jadie K.
2012-01-01
The purpose of this study was to examine the perceptions of principals who have met Adequate Yearly Progress (AYP) with the special education subgroup. This was a qualitative study, utilizing interviews to answer the research questions. The first three research questions analyzed the areas of assessment, building-level leadership, and curriculum…
Human milk feeding supports adequate growth in infants
Technology Transfer Automated Retrieval System (TEKTRAN)
Despite current nutritional strategies, premature infants remain at high risk for extrauterine growth restriction. The use of an exclusive human milk-based diet is associated with decreased incidence of necrotizing enterocolitis (NEC), but concerns exist about infants achieving adequate growth. The ...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-30
... November 15, 2010 (75 FR 69648). The corrected text of the recommendation approved by the Board is below... or telephone number (202) 694-7000. Correction: In the Federal Register of November 15, 2010 (75 FR... SAFETY BOARD Safety Analysis Requirements for Defining Adequate Protection for the Public and the...
ERIC Educational Resources Information Center
Lee, Jaekyung
2003-01-01
This article examines major threats to the validity of Adequate Yearly Progress (AYP) in the context of rural schools. Although rural students and their schools made significant academic progress in the past on national and state assessments, the current goal of AYP turns out to be highly unrealistic for them unless states set far lower…
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Exemptions for pesticides adequately regulated by another Federal agency. 152.20 Section 152.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION...
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Exemptions for pesticides adequately regulated by another Federal agency. 152.20 Section 152.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION...
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Exemptions for pesticides adequately regulated by another Federal agency. 152.20 Section 152.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION...
What Is the Cost of an Adequate Vermont High School Education?
ERIC Educational Resources Information Center
Rucker, Frank D.
2010-01-01
Access to an adequate education has been widely considered an undeniable right since Chief Justice Warren stated in his landmark decision that "Today, education is perhaps the most important function of state and local governments...it is doubtful that any child may reasonably be expected to succeed in life if he is denied the opportunity of an…
Three not adequately understood lunar phenomena investigated by the wave planetology
NASA Astrophysics Data System (ADS)
Kochemasov, G. G.
2009-04-01
Three not adequately understood lunar phenomena investigated by the wave planetology. G. Kochemasov, IGEM of the Russian Academy of Sciences, Moscow, Russia, kochem.36@mail.ru. Despite the rather numerous studies of the last 50 years, lunar science still debates some important issues. Three of them concern the origin of mascons, the deepest but low-ferruginous South Pole-Aitken depression, and the strange character of the frequency-crater size curve. Prevailing approaches are based mainly on impacts as the shaper of the present geomorphology of the Moon. However, they largely ignore the antipodality of basins and maria, and the complex character of the frequency-crater size curve, which clearly implies that different sources and processes are responsible for crater formation. Attempts to find impactor sources in various, sometimes very remote, parts of the Solar system are too artificial; besides, they do not explain the very intensive, Moon-like cratering of Mercury. Saturation of the lunar surface with craters of ~70-km diameter is very strange for random impacts from any source; finding a time interval for this saturation is difficult if not impossible because it affects formations of various ages. Lunar basins and maria completely contradict a classical frequency-crater size curve. Their presumed (and measured) different ages make the existence of one specialized impactor source dubious. So, if one accepts impact as the only process responsible for cratering (ring-form development), then the real mess in crater statistics and timing will never be overcome. The wave planetology [1-3 and others], tested on many planets and satellites of the Solar system, has proved to be real. In the case of the Moon it can help answer the above questions. First of all, it should be admitted that the complex lunar crater (ring-form) statistics are due to a superposition and mixing of two main processes (a minor involvement of volcanic features is also present): impacts and wave processes.
Statistical approaches to short-term electricity forecasting
NASA Astrophysics Data System (ADS)
Kellova, Andrea
The study of short-term forecasting of electricity demand has played a key role in the economic optimization of the electric energy industry and is essential for power systems planning and operation. In electric energy markets, accurate short-term forecasting of electricity demand is necessary mainly for economic operations. Our focus is directed to the question of electricity demand forecasting in the Czech Republic. Firstly, we describe the current structure and organization of the Czech, as well as the European, electricity market. Secondly, we provide a comprehensive description of the most powerful external factors influencing electricity consumption. The choice of the most appropriate model is conditioned by these demand-determining factors. Thirdly, we build several types of multivariate forecasting models, both linear and nonlinear: linear regression models and artificial neural networks, respectively. Finally, we compare the forecasting power of both kinds of models using several statistical accuracy measures. Our results suggest that although electricity demand forecasting in the Czech Republic was, for the years considered, a nonlinear rather than a linear problem, simple linear models with nonlinear inputs can be adequate for practical purposes. This is confirmed by the values of the empirical loss function applied to the forecasting results.
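The abstract's finding, that a linear model with nonlinear inputs can be adequate for load forecasting, can be illustrated on synthetic data. Everything below (the data, the quadratic temperature feature, the choice of MAPE as the accuracy measure) is a hypothetical sketch, not the thesis's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily data: load responds nonlinearly (U-shaped) to temperature.
temp = rng.uniform(-5, 30, size=365)                          # deg C
load = 50 + 0.08 * (temp - 18) ** 2 + rng.normal(0, 1, 365)   # GWh, hypothetical

def fit_predict(X, y):
    """Ordinary least squares with an intercept column."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef

def mape(y, yhat):
    """Mean absolute percentage error, a common forecasting accuracy measure."""
    return 100 * np.mean(np.abs((y - yhat) / y))

# Purely linear model: load ~ temp
err_linear = mape(load, fit_predict(temp[:, None], load))

# Still a linear model, but with a nonlinear input: load ~ temp + (temp - 18)^2
X_nl = np.column_stack([temp, (temp - 18) ** 2])
err_nonlin = mape(load, fit_predict(X_nl, load))

print(f"MAPE, linear inputs:    {err_linear:.2f}%")
print(f"MAPE, nonlinear inputs: {err_nonlin:.2f}%")
```

The second model is still estimated by ordinary least squares; only its inputs are nonlinear, which is exactly the kind of compromise the abstract deems adequate in practice.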
Statistics Poker: Reinforcing Basic Statistical Concepts
ERIC Educational Resources Information Center
Leech, Nancy L.
2008-01-01
Learning basic statistical concepts does not need to be tedious or dry; it can be fun and interesting through cooperative learning in the small-group activity of Statistics Poker. This article describes a teaching approach for reinforcing basic statistical concepts that can help students who have high anxiety and makes learning and reinforcing…
Statistical description of turbulent dispersion
NASA Astrophysics Data System (ADS)
Brouwers, J. J. H.
2012-12-01
We derive a comprehensive statistical model for dispersion of passive or almost passive admixture particles such as fine particulate matter, aerosols, smoke, and fumes in turbulent flow. The model rests on the Markov limit for particle velocity. It is in accordance with the asymptotic structure of turbulence at large Reynolds number as described by Kolmogorov. The model consists of Langevin and diffusion equations in which the damping and diffusivity are expressed by expansions in powers of the reciprocal of the Kolmogorov constant C₀. We derive solutions of O(C₀⁰) and O(C₀⁻¹). We truncate at O(C₀⁻²), which is shown to result in an error of a few percent in predicted dispersion statistics for representative cases of turbulent flow. We reveal analogies and remarkable differences between the solutions of classical statistical mechanics and those of statistical turbulence.
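The Langevin building block such a model rests on can be illustrated with a minimal simulation. This is a generic Ornstein-Uhlenbeck velocity process with made-up parameters, not the paper's C₀-expansion:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters, not taken from the paper
tau = 1.0        # velocity decorrelation (Lagrangian) time scale
sigma2 = 2.0     # stationary velocity variance
dt, nsteps, npart = 0.01, 1000, 10000

# Euler-Maruyama integration of the Langevin equation
#   dv = -(v / tau) dt + sqrt(2 * sigma2 / tau) dW
v = np.zeros(npart)
for _ in range(nsteps):
    v += -(v / tau) * dt + np.sqrt(2 * sigma2 / tau * dt) * rng.standard_normal(npart)

# After many decorrelation times the velocity statistics are stationary
print(f"empirical velocity variance: {v.var():.3f}  (theory: {sigma2})")
```

The damping and noise amplitude here are constants; in the paper they are expansions in powers of C₀⁻¹, but the stochastic structure of the velocity equation is of this form.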
Nonlinearity sensing via photon-statistics excitation spectroscopy
Assmann, Marc; Bayer, Manfred
2011-11-15
We propose photon-statistics excitation spectroscopy as an adequate tool to describe the optical response of a nonlinear system. To this end we suggest using optical excitation with varying photon statistics as another spectroscopic degree of freedom to gather information about the system in question. The responses of several simple model systems to excitation beams with different photon statistics are discussed. Possible spectroscopic applications in terms of identifying lasing operation are pointed out.
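Why photon statistics can act as a spectroscopic degree of freedom is captured by a standard textbook contrast: a two-photon (nonlinear) response is proportional to ⟨n(n-1)⟩, which differs between Poisson (laser-like) and thermal light of equal mean intensity. The sketch below is this generic argument, not the authors' method:

```python
import numpy as np
from math import factorial

mu = 3.0                 # mean photon number (hypothetical)
n = np.arange(0, 101)    # photon-number axis (truncated; the tails are negligible)

# Two photon-number distributions with the same mean
poisson = np.exp(-mu) * mu ** n / np.array([float(factorial(k)) for k in n])
thermal = (mu / (1 + mu)) ** n / (1 + mu)   # Bose-Einstein (chaotic light)

def g2(p):
    """Second-order coherence g2(0) = <n(n-1)> / <n>^2. A two-photon
    (nonlinear) response scales with <n(n-1)>, so it distinguishes
    excitation beams that a mean-intensity measurement cannot."""
    mean = (n * p).sum()
    return (n * (n - 1) * p).sum() / mean ** 2

print(f"g2, Poisson (laser-like) light: {g2(poisson):.3f}")  # 1.000
print(f"g2, thermal (bunched) light:    {g2(thermal):.3f}")  # 2.000
```

At equal mean photon number, a nonlinear system thus responds twice as strongly to thermal as to coherent excitation, which is the kind of signature this spectroscopy exploits.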
The concept of adequate causation and Max Weber's comparative sociology of religion.
Buss, A
1999-06-01
Max Weber's The Protestant Ethic and the Spirit of Capitalism, studied in isolation, shows mainly an elective affinity, or an adequacy on the level of meaning, between the Protestant ethic and the 'spirit' of capitalism. Here it is suggested that Weber's subsequent essays on 'The Economic Ethics of World Religions' are the result of his view that adequacy on the level of meaning needs to be, and can be, verified by causal adequacy. After some introductory remarks, particularly on elective affinity, the paper develops the concept of adequate causation and the related concept of objective possibility on the basis of the work of von Kries, on whom Weber relied heavily. In the second part, this concept is used to show how the study of the economic ethics of India, China, Rome and orthodox Russia can support the thesis that the 'spirit' of capitalism, although it may not have been caused by the Protestant ethic, was perhaps adequately caused by it. PMID:15260028
Jain, D; Tolg, R; Katus, H A; Richardt, G
2000-12-01
Resistance was encountered while passing a 3 x 18 mm stent across a lesion in the proximal left anterior descending coronary artery. Successive stent exchanges with repeated balloon dilatations did not succeed. Finally, a 9 mm stent was passed across the lesion and deployed at the site of maximal resistance, and the 18 mm stent was then placed through this stent. A novel strategy is reported for overcoming resistance to stent passage through a lesion after adequate balloon predilatation. PMID:11103034
Myth 19: Is Advanced Placement an Adequate Program for Gifted Students?
ERIC Educational Resources Information Center
Gallagher, Shelagh A.
2009-01-01
Is it a myth that Advanced Placement (AP) is an adequate program for gifted students? AP is so covered with myths and assumptions that it is hard to get a clear view of the issues. In this article, the author finds the answer about AP by looking at current realties. First, AP is hard for gifted students to avoid. Second, AP never was a program…
Maueröder, C; Chaurio, R A; Dumych, T; Podolska, M; Lootsik, M D; Culemann, S; Friedrich, R P; Bilyy, R; Alexiou, C; Schett, G; Berens, C; Herrmann, M; Munoz, L E
2016-06-01
In this study, we deploy a doxycycline-dependent suicide switch integrated in a tumor challenge model. With this experimental setup, we characterized the immunological consequences of cells dying by four distinct cell death stimuli in vivo. We observed that apoptotic cell death induced by expression of the truncated form of BH3 interacting-domain death agonist (tBid) and a constitutively active form of caspase 3 (revC3), respectively, showed higher immunogenicity than cell death induced by expression of the tuberculosis-necrotizing toxin (TNT). Our data indicate that the early release of ATP induces the silent clearance of dying cells, whereas the simultaneous presence of 'find me' signals and danger-associated molecular patterns (DAMPs) promotes inflammatory reactions and increased immunogenicity. This proposed model is supported by findings showing that the production and release of high concentrations of IL-27 by bone-marrow-derived macrophages (BMDM) is limited to BMDM exposed to those forms of death that simultaneously released ATP and the DAMPs heat-shock protein 90 (HSP90) and high-mobility group box-1 protein (HMGB1). These results demonstrate that the tissue microenvironment generated by dying cells may determine the subsequent immune response. PMID:26943324
Wu, Felicia; Stacy, Shaina L; Kensler, Thomas W
2013-09-01
The aflatoxins are a group of fungal metabolites that contaminate a variety of staple crops, including maize and peanuts, and cause an array of acute and chronic human health effects. Aflatoxin B1 in particular is a potent liver carcinogen, and hepatocellular carcinoma (HCC) risk is multiplicatively higher for individuals exposed to both aflatoxin and chronic infection with hepatitis B virus (HBV). In this work, we sought to answer the question: do current aflatoxin regulatory standards around the world adequately protect human health? Depending upon the level of protection desired, the answer to this question varies. Currently, most nations have a maximum tolerable level of total aflatoxins in maize and peanuts ranging from 4 to 20ng/g. If the level of protection desired is that aflatoxin exposures would not increase lifetime HCC risk by more than 1 in 100,000 cases in the population, then most current regulatory standards are not adequately protective even if enforced, especially in low-income countries where large amounts of maize and peanuts are consumed and HBV prevalence is high. At the protection level of 1 in 10,000 lifetime HCC cases in the population, however, almost all aflatoxin regulations worldwide are adequately protective, with the exception of several nations in Africa and Latin America. PMID:23761295
NASA Astrophysics Data System (ADS)
Baranger, Michel
2002-03-01
It is a remarkable fact that the traditional teaching of thermodynamics, as reflected in the textbooks and including the long developments about ensembles and thermodynamic functions, is almost entirely about systems in equilibrium. The time variable does not enter. There is one exception, however. The single most important item, the flagship of the thermodynamic navy, the second law, is about the irreversibility of the time evolution of systems out of equilibrium. This is a bizarre situation, to say the least; a glaring case of the drunk man looking for his key under the lamp-post, when he knows that he lost it in the dark part of the street. The moment has come for us to go looking in the dark part, the behavior of systems as a function of time. We have been given a powerful new flashlight, chaos theory. We should use it. There, on the formerly dark pavement, we can find Tsallis statistics.
Statistical mechanics of complex neural systems and high dimensional data
NASA Astrophysics Data System (ADS)
Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya
2013-03-01
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
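The message-passing idea reviewed in this abstract can be made concrete with a minimal sum-product example on a three-variable chain; the pairwise potentials below are arbitrary illustrative numbers, not taken from the review:

```python
import numpy as np

# Pairwise Markov chain x1 - x2 - x3, each variable binary.
# Hypothetical positive potentials (any positive values work).
psi12 = np.array([[2.0, 1.0], [1.0, 3.0]])
psi23 = np.array([[1.0, 2.0], [4.0, 1.0]])

# --- Message passing (sum-product) ---
m1_to_2 = psi12.sum(axis=0)      # marginalize x1 out of psi12[x1, x2]
m3_to_2 = psi23.sum(axis=1)      # marginalize x3 out of psi23[x2, x3]
belief2 = m1_to_2 * m3_to_2      # incoming messages multiply at the node
belief2 /= belief2.sum()

# --- Brute-force marginal for comparison ---
joint = np.einsum('ab,bc->abc', psi12, psi23)   # unnormalized joint
brute2 = joint.sum(axis=(0, 2))
brute2 /= brute2.sum()

print("BP marginal of x2:   ", belief2)
print("exact marginal of x2:", brute2)
```

On a tree the two computations agree exactly; the point of the distributed message-passing form is that it scales to large graphs where the brute-force sum is infeasible.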
Neuroendocrine Tumor: Statistics
... Tumor > Neuroendocrine Tumor - Statistics. Approved by the Cancer.Net Editorial Board, 04/ ... the body. It is important to remember that statistics on how many people survive this type of ...
Statistical mechanics of shell models for two-dimensional turbulence
NASA Astrophysics Data System (ADS)
Aurell, E.; Boffetta, G.; Crisanti, A.; Frick, P.; Paladin, G.; Vulpiani, A.
1994-12-01
We study shell models that conserve the analogs of energy and enstrophy and hence are designed to mimic fluid turbulence in two dimensions (2D). The main result is that the observed state is well described as a formal statistical equilibrium, closely analogous to the approach to two-dimensional ideal hydrodynamics of Onsager [Nuovo Cimento Suppl. 6, 279 (1949)], Hopf [J. Rat. Mech. Anal. 1, 87 (1952)], and Lee [Q. Appl. Math. 10, 69 (1952)]. In the presence of forcing and dissipation we observe a forward flux of enstrophy and a backward flux of energy. These fluxes can be understood as mean diffusive drifts from a source to two sinks in a system which is close to local equilibrium with Lagrange multipliers ("shell temperatures") changing slowly with scale. This is clear evidence that the simplest shell models are not adequate to reproduce the main features of two-dimensional turbulence. The dimensional predictions on the power spectra from a supposed forward cascade of enstrophy and from one branch of the formal statistical equilibrium coincide in these shell models, in contrast to the corresponding predictions for the Navier-Stokes and Euler equations in 2D. This coincidence has previously led to the mistaken conclusion that shell models exhibit a forward cascade of enstrophy. We also study the dynamical properties of the models and the growth of perturbations.
Statistical Analysis in Climate Research
NASA Astrophysics Data System (ADS)
von Storch, Hans; Zwiers, Francis W.
2002-03-01
The purpose of this book is to help the climatologist understand the basic precepts of the statistician's art and to provide some of the background needed to apply statistical methodology correctly and usefully. The book is self contained: introductory material, standard advanced techniques, and the specialized techniques used specifically by climatologists are all contained within this one source. There are a wealth of real-world examples drawn from the climate literature to demonstrate the need, power and pitfalls of statistical analysis in climate research.
On More Sensitive Periodogram Statistics
NASA Astrophysics Data System (ADS)
Bélanger, G.
2016-05-01
Period searches in event data have traditionally used the Rayleigh statistic, R². For X-ray pulsars, the standard has been the Z² statistic, which sums over more than one harmonic. For γ-rays, the H-test, which optimizes the number of harmonics to sum, is often used. These periodograms all suffer from the same problem, namely artifacts caused by correlations in the Fourier components that arise from testing frequencies with a non-integer number of cycles. This article addresses this problem. The modified Rayleigh statistic is discussed, its generalization to any harmonic, ℛₖ², is formulated, and from the latter, the modified Z² statistic, 𝒵², is constructed. Versions of these statistics for binned data and point measurements are derived, and it is shown that the variance in the uncertainties can have an important influence on the periodogram. It is shown how to combine the information about the signal frequency from the different harmonics to estimate its value with maximum accuracy. The methods are applied to an XMM-Newton observation of the Crab pulsar, for which a decomposition of the pulse profile is presented, showing that most of the power is in the second, third, and fifth harmonics. The statistical detection power of the ℛₖ² statistic is superior to the FFT and equivalent to the Lomb-Scargle (LS) periodogram. Response to gaps in the data is assessed, and it is shown that the LS does not protect against the distortions they cause. The main conclusion of this work is that the classical R² and Z² should be replaced by ℛₖ² and 𝒵² in all applications with event data, and the LS should be replaced by the ℛₖ² when the uncertainty varies from one point measurement to another.
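For readers unfamiliar with these periodograms, here is a sketch of the classical Z²_m statistic for event data (one common convention; normalizations vary, and the article's modified statistics, which correct its artifacts, are not reproduced here):

```python
import numpy as np

def z2(times, freq, m=2):
    """Classical Z^2_m statistic for event times at trial frequency `freq`.
    Sums the Rayleigh power over the first m harmonics of the event phases."""
    phases = 2.0 * np.pi * freq * np.asarray(times, dtype=float)
    total = 0.0
    for k in range(1, m + 1):
        total += np.cos(k * phases).sum() ** 2 + np.sin(k * phases).sum() ** 2
    return 2.0 * total / len(times)

# Toy data: perfectly periodic events at 1 Hz, so all phases align exactly
events = np.arange(100, dtype=float)   # event times 0, 1, ..., 99 s
print(z2(events, 1.0))     # at the true frequency: 2*m*n = 400
print(z2(events, 0.543))   # at an unrelated frequency: near zero
```

Under the null hypothesis of no periodicity, Z²_m is approximately χ² distributed with 2m degrees of freedom, which is what makes large values at a trial frequency significant.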
Adequate Iodine Status in New Zealand School Children Post-Fortification of Bread with Iodised Salt
Jones, Emma; McLean, Rachael; Davies, Briar; Hawkins, Rochelle; Meiklejohn, Eva; Ma, Zheng Feei; Skeaff, Sheila
2016-01-01
Iodine deficiency re-emerged in New Zealand in the 1990s, prompting the mandatory fortification of bread with iodised salt from 2009. This study aimed to determine the iodine status of New Zealand children when the fortification of bread was well established. A cross-sectional survey of children aged 8–10 years was conducted in the cities of Auckland and Christchurch, New Zealand, from March to May 2015. Children provided a spot urine sample for the determination of urinary iodine concentration (UIC), a finger-prick blood sample for thyroglobulin (Tg) concentration, and completed a questionnaire ascertaining socio-demographic information that also included an iodine-specific food frequency questionnaire (FFQ). The FFQ was used to estimate iodine intake from all main food sources including bread and iodised salt. The median UIC for all children (n = 415) was 116 μg/L (females 106 μg/L, males 131 μg/L), indicative of adequate iodine status according to the World Health Organisation (WHO) criterion of a median UIC of 100–199 μg/L. The median Tg concentration was 8.7 μg/L, which was <10 μg/L, confirming adequate iodine status. There was a significant difference in UIC by sex (p = 0.001) and ethnicity (p = 0.006). The mean iodine intake from the food-only model was 65 μg/day. Bread contributed 51% of total iodine intake in the food-only model, providing a mean iodine intake of 35 μg/day. The mean iodine intake from the food-plus-iodised salt model was 101 μg/day. In conclusion, the results of this study confirm that the iodine status in New Zealand school children is now adequate. PMID:27196925
Chronic leg ulcer: does a patient always get a correct diagnosis and adequate treatment?
Mooij, Michael C; Huisman, Laurens C
2016-03-01
Patients with chronic leg ulcers have severely impaired quality of life and account for a high percentage of annual healthcare costs. To establish the cause of a chronic leg ulcer, referral to a center with a multidisciplinary team of professionals is often necessary. Treating the underlying cause diminishes healing time and reduces costs. In venous leg ulcers, adequate compression therapy is still a problem. It can be improved by training professionals with pressure-measuring devices. A perfect fit of elastic stockings is important to prevent venous leg ulcer recurrence. In most cases, custom-made stockings are the best choice for this purpose. PMID:26916772
Determining Adequate Margins in Head and Neck Cancers: Practice and Continued Challenges.
Williams, Michelle D
2016-09-01
Margin assessment remains a critical component of oncologic care for head and neck cancer patients. As an integrated team, both surgeons and pathologists work together to assess margins in these complex patients. Differences in method of margin sampling can impact obtainable information and effect outcomes. Additionally, what distance is an "adequate or clear" margin for patient care continues to be debated. Ultimately, future studies and potentially secondary modalities to augment pathologic assessment of margin assessment (i.e., in situ imaging or molecular assessment) may enhance local control in head and neck cancer patients. PMID:27469263
Family Structure Types and Adequate Utilization of Antenatal Care in Kenya.
Owili, Patrick Opiyo; Muga, Miriam Adoyo; Chou, Yiing-Jenq; Hsu, Yi-Hsin Elsa; Huang, Nicole; Chien, Li-Yin
2016-01-01
Features of the health care delivery system may not be the only factors explaining adequate utilization of antenatal care among women. Other social factors, such as family structure and its environment, contribute to pregnant women's utilization of antenatal care. An understanding of how women in different family structure types and social groups use basic maternal health services is important for developing and implementing maternal health care policy in the post-Millennium Development Goal era, especially in sub-Saharan Africa, where maternal mortality remains high. PMID:27214674
Working group on the "adequate minimum" volcanic observatory
Tilling, R.I.
1982-01-01
A working group consisting of R. I. Tilling (United States, Chairman), M. Espendola (Mexico), E. Malavassi (Costa Rica), L. Villari (Italy), and J. P. Viode (France) met on the island of Guadeloupe on February 20, 1981, to discuss informally the requirements for a "minimum" volcano observatory, one which would have the essential monitoring equipment and staff to provide reliable information on the state of an active volcano. Given the premise that any monitoring of a volcano is better than none at all, the working group then proceeded to consider the concept of an "adequate minimum" observatory.
[Factors associated with adequate prenatal care and delivery in São Tomé and Príncipe, 2008-2009].
Reis, Patrícia Alexandra da Graça Dantas Dos; Pereira, Claudia Cristina de Aguiar; Leite, Iuri da Costa; Theme Filha, Mariza Miranda
2015-09-01
We investigated factors associated with adequacy of prenatal and childbirth care for women in São Tomé and Príncipe. Data were analyzed from the Demographic and Health Survey on a sample of 1,326 newborn infants whose mothers were 15-49 years of age. The survey took place from September 2008 to March 2009. We used multilevel and multinomial logistic regression to analyze the association between demographic and socioeconomic factors and the target outcomes. Prenatal care was adequate in 26% of the sample, and 7% of deliveries were performed by physicians and 76% by nurses or nurse assistants. Statistically significant factors for prenatal care were birth order, maternal schooling, and index of economic well-being. The most important variables for adequate delivery were: birth order, maternal schooling, index of economic well-being, and place of residence. The study showed that socioeconomic factors have the greatest influence on adequate prenatal care and delivery. Future health policies should target social inequalities in São Tomé and Príncipe. PMID:26578017
Statistical Reference Datasets
National Institute of Standards and Technology Data Gateway
Statistical Reference Datasets (Web, free access) The Statistical Reference Datasets is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.
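Software accuracy against the StRD certified computational results is conventionally summarized as a log relative error (LRE), roughly the number of correct significant digits. A minimal sketch, with an illustrative function name and test value rather than an actual certified dataset:

```python
import math

def log_relative_error(estimate, certified):
    """Approximate number of correct significant digits in `estimate`
    relative to the certified value:
    LRE = -log10(|estimate - certified| / |certified|).
    Capped at 15, the precision carried by IEEE doubles."""
    if estimate == certified:
        return 15.0
    if certified == 0:
        lre = -math.log10(abs(estimate))  # absolute-error variant
    else:
        lre = -math.log10(abs(estimate - certified) / abs(certified))
    return max(0.0, min(lre, 15.0))

# A regression slope computed by some package vs. a certified value:
print(log_relative_error(1.0000000123, 1.0))  # ~7.9 correct digits
```

An LRE near the working precision indicates the software reproduces the certified result essentially exactly; low values flag numerically unstable algorithms.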
Beneath the Skin: Statistics, Trust, and Status
ERIC Educational Resources Information Center
Smith, Richard
2011-01-01
Overreliance on statistics, and even faith in them--which Richard Smith in this essay calls a branch of "metricophilia"--is a common feature of research in education and in the social sciences more generally. Of course accurate statistics are important, but they often constitute essentially a powerful form of rhetoric. For purposes of analysis and…
Maintaining Adequate CO2 Washout for an Advanced EMU via a New Rapid Cycle Amine Technology
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Conger, Bruce
2012-01-01
Over the past several years, NASA has realized tremendous progress in Extravehicular Activity (EVA) technology development. This has been evidenced by the progressive development of a new Rapid Cycle Amine (RCA) system for the Advanced Extravehicular Mobility Unit (AEMU) Portable Life Support Subsystem (PLSS). The PLSS is responsible for the life support of the crew member in the spacesuit. The RCA technology is responsible for carbon dioxide (CO2) and humidity control. Another aspect of the RCA is that it is on-back vacuum-regenerable, efficient, and reliable. The RCA also simplifies the PLSS schematic by eliminating the need for a condensing heat exchanger for humidity control in the current EMU. As development progresses on the RCA, it is important that the sizing be optimized so that the demand on the PLSS battery is minimized. As well, maintaining the CO2 washout at adequate levels during an EVA is an absolute requirement of the RCA and associated ventilation system. Testing has been underway in-house at NASA Johnson Space Center and analysis has been initiated to evaluate whether the technology provides exemplary performance in ensuring that the CO2 is removed sufficiently and the ventilation flow is adequate for maintaining CO2 washout in the AEMU spacesuit helmet of the crew member during an EVA. This paper will review the recent developments of the RCA unit, testing planned in-house with a spacesuit simulator, and the associated analytical work along with insights from the medical aspect on the testing.
Taniguchi, Ryosuke; Miura, Yutaka; Koyama, Hiroyuki; Chida, Tsukasa; Anraku, Yasutaka; Kishimura, Akihiro; Shigematsu, Kunihiro; Kataoka, Kazunori; Watanabe, Toshiaki
2016-06-01
In atherosclerotic lesions, the endothelial barrier against the bloodstream can become compromised, resulting in the exposure of the extracellular matrix (ECM) and intimal cells beneath. In theory, this allows adequately sized nanocarriers in circulation to infiltrate into the intimal lesion intravascularly. We sought to evaluate this possibility using rat carotid arteries with induced neointima. Cy5-labeled polyethylene glycol-conjugated polyion complex (PIC) micelles and vesicles, with diameters of 40, 100, or 200 nm (PICs-40, PICs-100, and PICs-200, respectively) were intravenously administered to rats after injury to the carotid artery using a balloon catheter. High accumulation and long retention of PICs-40 in the induced neointima was confirmed by in vivo imaging, while the accumulation of PICs-100 and PICs-200 was limited, indicating that the size of nanocarriers is a crucial factor for efficient delivery. Furthermore, epirubicin-incorporated polymeric micelles with a diameter similar to that of PICs-40 showed significant curative effects in rats with induced neointima, in terms of lesion size and cell number. Specific and effective drug delivery to pre-existing neointimal lesions was demonstrated with adequate size control of the nanocarriers. We consider that this nanocarrier-based drug delivery system could be utilized for the treatment of atherosclerosis. PMID:27183493
Adequate Systemic Perfusion Maintained by a CentriMag during Acute Heart Failure
Favaloro, Roberto R.; Bertolotti, Alejandro; Diez, Mirta; Favaloro, Liliana; Gomez, Carmen; Peradejordi, Margarita; Trentadue, Julio; Hellman, Lorena; Arzani, Yanina; Otero, Pilar Varela
2008-01-01
Mechanical circulatory support during severe acute heart failure presents options for myocardial recovery or cardiac replacement. Short-term circulatory support with the newest generation of magnetically levitated centrifugal-flow pumps affords several potential advantages. Herein, we present our experience with such a pump—the CentriMag® (Levitronix LLC; Waltham, Mass) centrifugal-flow ventricular assist device—in 4 critically ill patients who were in cardiogenic shock. From November 2007 through March 2008, 3 patients were supported after cardiac surgery, and 1 after chronic heart failure worsened. Two patients were bridged to heart transplantation, and 2 died during support. Perfusion during support was evaluated in terms of serum lactic acid levels and oxygenation values. In all of the patients, the CentriMag's pump flow was adequate, and continuous mechanical ventilation support was provided. Lactic acid levels substantially improved with CentriMag support and were maintained at near-normal levels throughout. At the same time, arterial pH, PO2, and carbon dioxide levels remained within acceptable ranges. No thromboembolic events or mechanical failures occurred. Our experience indicates that short-term use of the CentriMag ventricular assist device during acute heart failure can restore and adequately support circulation until recovery or until the application of definitive therapy. PMID:18941648
NASA Astrophysics Data System (ADS)
Nowak, Bernard; Łuczak, Rafał
2015-09-01
The article discusses the improvement of thermal working conditions in underground mine workings, using local refrigeration systems. It considers the efficiency of air cooling with direct action air compression refrigerator of the TS-300B type. As a result of a failure to meet the required operating conditions of the aforementioned air cooling system, frequently there are discrepancies between the predicted (and thus the expected) effects of its work and the reality. Therefore, to improve the operating efficiency of this system, in terms of effective use of the evaporator cooling capacity, quality criteria were developed, which are easy in practical application. They were obtained in the form of statistical models, describing the effect of independent variables, i.e. the parameters of the inlet air to the evaporator (temperature, humidity and volumetric flow rate), as well as the parameters of the water cooling the condenser (temperature and volumetric flow rate), on the thermal power of air cooler, treated as the dependent variable. Statistical equations describing the performance of the analyzed air cooling system were determined, based on the linear and nonlinear multiple regression. The obtained functions were modified by changing the values of the coefficients in the case of linear regression, and of the coefficients and exponents in the case of non-linear regression, with the independent variables. As a result, functions were obtained, which were more convenient in practical applications. Using classical statistics methods, the quality of fitting the regression function to the experimental data was evaluated. Also, the values of the evaporator thermal power of the refrigerator, which were obtained on the basis of the measured air parameters, were compared with the calculated ones, by using the obtained regression functions. These statistical models were built on the basis of the results of measurements in different operating conditions of the TS-300B
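The linear multiple-regression step described above can be sketched with ordinary least squares. The predictor names follow the abstract, but the data and coefficients below are synthetic stand-ins, not the TS-300B measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the predictors named in the abstract:
# inlet air temperature [deg C], relative humidity [%], air flow [m3/min],
# cooling-water temperature [deg C], cooling-water flow [m3/h].
n = 200
X = np.column_stack([
    rng.uniform(25, 35, n),    # air temperature
    rng.uniform(60, 95, n),    # relative humidity
    rng.uniform(300, 600, n),  # air volumetric flow
    rng.uniform(15, 30, n),    # cooling-water temperature
    rng.uniform(5, 15, n),     # cooling-water flow
])
true_beta = np.array([4.0, 0.8, 0.12, -3.0, 2.5])   # illustrative
y = 50 + X @ true_beta + rng.normal(0, 5, n)        # cooler thermal power

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

y_hat = A @ beta
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(beta, 3))
print("R^2:", round(1 - ss_res / ss_tot, 4))
```

The nonlinear variant mentioned in the abstract replaces the linear terms with power-law terms and fits coefficients and exponents jointly, but the goodness-of-fit assessment (residual sums of squares, R^2) is the same.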
Reviewer Bias for Statistically Significant Results: A Reexamination.
ERIC Educational Resources Information Center
Fagley, N. S.; McKinney, I. Jean
1983-01-01
Reexamines the article by Atkinson, Furlong, and Wampold (1982) and questions their conclusion that reviewers were biased toward statistically significant results. A statistical power analysis shows the power of their bogus study was low. Low power in a study reporting nonsignificant findings is a valid reason for recommending not to publish.…
Applications of statistical physics to technology price evolution
NASA Astrophysics Data System (ADS)
McNerney, James
Understanding how changing technology affects the prices of goods is a problem with both rich phenomenology and important policy consequences. Using methods from statistical physics, I model technology-driven price evolution. First, I examine a model for the price evolution of individual technologies. The price of a good often follows a power law equation when plotted against its cumulative production. This observation turns out to have significant consequences for technology policy aimed at mitigating climate change, where technologies are needed that achieve low carbon emissions at low cost. However, no theory adequately explains why technology prices follow power laws. To understand this behavior, I simplify an existing model that treats technologies as machines composed of interacting components. I find that the power law exponent of the price trajectory is inversely related to the number of interactions per component. I extend the model to allow for more realistic component interactions and make a testable prediction. Next, I conduct a case-study on the cost evolution of coal-fired electricity. I derive the cost in terms of various physical and economic components. The results suggest that commodities and technologies fall into distinct classes of price models, with commodities following martingales, and technologies following exponentials in time or power laws in cumulative production. I then examine the network of money flows between industries. This work is a precursor to studying the simultaneous evolution of multiple technologies. Economies resemble large machines, with different industries acting as interacting components with specialized functions. To begin studying the structure of these machines, I examine 20 economies with an emphasis on finding common features to serve as targets for statistical physics models. I find they share the same money flow and industry size distributions. I apply methods from statistical physics to show that industries
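The power-law relation between a good's price and its cumulative production described above is typically estimated by a linear fit in log-log coordinates. A minimal sketch on synthetic data, with an illustrative exponent:

```python
import math
import random

random.seed(1)

# Power-law price trajectory: price ~ c * (cumulative production)^(-alpha),
# with multiplicative noise. The exponent value is illustrative.
alpha_true, c = 0.3, 100.0
cum_production = [10 ** (i / 10) for i in range(1, 41)]   # 10^0.1 .. 10^4
price = [c * q ** (-alpha_true) * math.exp(random.gauss(0, 0.05))
         for q in cum_production]

# Ordinary least squares on log(price) vs. log(cumulative production):
# the slope estimates -alpha.
xs = [math.log(q) for q in cum_production]
ys = [math.log(p) for p in price]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
print(f"fitted exponent: {-slope:.3f}  (true: {alpha_true})")
```

A straight line on a log-log plot is the signature of the power law; a commodity following a martingale would instead show no systematic trend against cumulative production.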
Kashubeck-West, Susan; Coker, Angela D; Awad, Germine H; Stinson, Rebecca D; Bledman, Rashanta; Mintz, Laurie
2013-07-01
This study examines reliability and validity estimates for 3 widely used measures in body image research in a sample of African American college women (N = 278). Internal consistency estimates were adequate (α coefficients above .70) for all measures, and evidence of convergent and discriminant validity was found. Confirmatory factor analyses failed to replicate the hypothesized factor structures of these measures. Exploratory factor analyses indicated that 4 factors found for the Sociocultural Attitudes Toward Appearance Questionnaire were similar to the hypothesized subscales, with fewer items. The factors found for the Multidimensional Body-Self Relations Questionnaire-Appearance Scales and the Body Dissatisfaction subscale of the Eating Disorders Inventory-3 were not similar to the subscales developed by the scale authors. Validity and reliability evidence is discussed for the new factors. PMID:23731233
Beards, S C; Lipman, J; Bothma, P A; Joynt, G M
1994-03-01
A patient with severe tetanus, who had a sympathetic crisis while sedated with 30 mg/h diazepam and 30 mg/h morphine, is described. Satisfactory control of the haemodynamic crisis was achieved with bolus doses of esmolol to a total of 180 mg. A disturbing finding was that although there was adequate control of the tachycardia and hypertension, arterial catecholamine levels remained markedly elevated. Adrenaline levels of 531 pg/ml (normal 10-110 pg/ml) and noradrenaline levels of 1,036 pg/ml (normal 100-500 pg/ml) were recorded when the patient had a systolic arterial pressure of 110 mmHg and a heart rate of 97/min. The implications of this finding are discussed. PMID:11218441
Ben Khedher, Saoussen; Jaoua, Samir; Zouari, Nabil
2014-01-01
Overcoming catabolite repression in bioinsecticide production by the sporeless Bacillus thuringiensis strain S22 was investigated in a fully controlled 3 L fermenter using a glucose-based medium. When an adequate oxygen profile was applied throughout the fermentation period (75% oxygen saturation), it was possible to partially overcome the catabolite repression that normally occurs at high initial glucose concentrations (30 and 40 g/L glucose). Moreover, the toxin production yield of sporeless strain S22 was markedly improved by adopting fed-batch intermittent culture technology. With 22.5 g/L glucose in the culture medium, toxin production improved by about 36% in fed-batch culture compared with a single batch. Consequently, the proposed fed-batch strategy was effective in overcoming carbon catabolite repression, making it possible to overproduce insecticidal crystal proteins in a highly concentrated medium. PMID:25309756
Adequate bases of phase space master integrals for gg → h at NNLO and beyond
NASA Astrophysics Data System (ADS)
Höschele, Maik; Hoff, Jens; Ueda, Takahiro
2014-09-01
We study the master integrals needed to compute the Higgs boson production cross section via gluon fusion in the infinite top quark mass limit, using a canonical form of differential equations for master integrals, recently identified by Henn, which makes their solution possible in a straightforward algebraic way. We apply the known criteria to derive such a suitable basis for all the phase space master integrals in the aforementioned process at next-to-next-to-leading order in QCD, and demonstrate that the method is applicable to next-to-next-to-next-to-leading order as well by solving a non-planar topology. Furthermore, we discuss in great detail how to find an adequate basis using practical examples. Special emphasis is devoted to master integrals which are coupled by their differential equations.
Maintaining Adequate CO2 Washout for an Advanced EMU via a New Rapid Cycle Amine Technology
NASA Technical Reports Server (NTRS)
Chullen, Cinda
2011-01-01
Over the past several years, NASA has realized tremendous progress in Extravehicular Activity (EVA) technology development. This has been evidenced by the progressive development of a new Rapid Cycle Amine (RCA) system for the Advanced Extravehicular Mobility Unit (AEMU) Portable Life Support Subsystem (PLSS). The PLSS is responsible for the life support of the crew member in the spacesuit. The RCA technology is responsible for carbon dioxide (CO2) and humidity control. Another aspect of the RCA is that it is on-back vacuum-regenerable, efficient, and reliable. The RCA also simplifies the PLSS schematic by eliminating the need for a condensing heat exchanger for humidity control in the current EMU. As development progresses on the RCA, it is important that the sizing be optimized so that the demand on the PLSS battery is minimized. As well, maintaining the CO2 washout at adequate levels during an EVA is an absolute requirement of the RCA and associated ventilation system. Testing has been underway in-house at NASA Johnson Space Center, and analysis has been initiated to evaluate whether the technology provides exemplary performance in ensuring that the CO2 is removed sufficiently and the ventilation flow is adequate to maintain CO2 washout in the AEMU spacesuit helmet of the crew member during an EVA. This paper will review the recent developments of the RCA unit, the testing results performed in-house with a spacesuit simulator, and the associated analytical work along with insights from the medical aspect on the testing.
Kahalley, Lisa S.; Wilson, Stephanie J.; Tyc, Vida L.; Conklin, Heather M.; Hudson, Melissa M.; Wu, Shengjie; Xiong, Xiaoping; Stancel, Heather H.; Hinds, Pamela S.
2012-01-01
Objectives To describe the psychological needs of adolescent survivors of acute lymphoblastic leukemia (ALL) or brain tumor (BT), we examined: (a) the occurrence of cognitive, behavioral, and emotional concerns identified during a comprehensive psychological evaluation, and (b) the frequency of referrals for psychological follow-up services to address identified concerns. Methods Psychological concerns were identified on measures according to predetermined criteria for 100 adolescent survivors. Referrals for psychological follow-up services were made for concerns previously unidentified in formal assessment or not adequately addressed by current services. Results Most survivors (82%) exhibited at least one concern across domains: behavioral (76%), cognitive (47%), and emotional (19%). Behavioral concerns emerged most often on scales associated with executive dysfunction, inattention, learning, and peer difficulties. CRT was associated with cognitive concerns, χ2(1,N=100)=5.63, p<0.05. Lower income was associated with more cognitive concerns for ALL survivors, t(47)=3.28, p<0.01, and more behavioral concerns for BT survivors, t(48)=2.93, p<0.01. Of survivors with concerns, 38% were referred for psychological follow-up services. Lower-income ALL survivors received more referrals for follow-up, χ2(1,N=41)=8.05, p<0.01. Referred survivors had more concerns across domains than non-referred survivors, ALL: t(39)=2.96, p<0.01, BT: t(39)=3.52, p<0.01. Trends suggest ALL survivors may be at risk for experiencing unaddressed cognitive needs. Conclusions Many adolescent survivors of cancer experience psychological difficulties that are not adequately managed by current services, underscoring the need for long-term surveillance. In addition to prescribing regular psychological evaluations, clinicians should closely monitor whether current support services appropriately meet survivors’ needs, particularly for lower-income survivors and those treated with CRT. PMID:22278930
ERIC Educational Resources Information Center
Hall, Rogers; Horn, Ilana Seidel
2012-01-01
In this article we ask how concepts that organize work in two professional disciplines change during moments of consultation, which represent concerted efforts by participants to work differently now and in the future. Our analysis compares structures of talk, the adequacy of representations of practice, and epistemic and moral stances deployed…
Thermodynamic Limit in Statistical Physics
NASA Astrophysics Data System (ADS)
Kuzemsky, A. L.
2014-03-01
The thermodynamic limit in statistical thermodynamics of many-particle systems is an important but often overlooked issue in the various applied studies of condensed matter physics. To settle this issue, we review tersely the past and present disposition of thermodynamic limiting procedure in the structure of the contemporary statistical mechanics and our current understanding of this problem. We pick out the ingenious approach by Bogoliubov, who developed a general formalism for establishing the limiting distribution functions in the form of formal series in powers of the density. In that study, he outlined the method of justification of the thermodynamic limit when he derived the generalized Boltzmann equations. To enrich and to weave our discussion, we take this opportunity to give a brief survey of the closely related problems, such as the equipartition of energy and the equivalence and nonequivalence of statistical ensembles. The validity of the equipartition of energy permits one to decide what are the boundaries of applicability of statistical mechanics. The major aim of this work is to provide a better qualitative understanding of the physical significance of the thermodynamic limit in modern statistical physics of the infinite and "small" many-particle systems.
41 CFR 102-75.150 - What happens when GSA determines that the report of excess is adequate?
Code of Federal Regulations, 2010 CFR
2010-07-01
... determines that the report of excess is adequate? 102-75.150 Section 102-75.150 Public Contracts and Property... PROPERTY 75-REAL PROPERTY DISPOSAL Utilization of Excess Real Property Examination for Acceptability § 102-75.150 What happens when GSA determines that the report of excess is adequate? When GSA...
41 CFR 102-75.150 - What happens when GSA determines that the report of excess is adequate?
Code of Federal Regulations, 2014 CFR
2014-01-01
... determines that the report of excess is adequate? 102-75.150 Section 102-75.150 Public Contracts and Property... PROPERTY 75-REAL PROPERTY DISPOSAL Utilization of Excess Real Property Examination for Acceptability § 102-75.150 What happens when GSA determines that the report of excess is adequate? When GSA...
41 CFR 102-75.150 - What happens when GSA determines that the report of excess is adequate?
Code of Federal Regulations, 2011 CFR
2011-01-01
... determines that the report of excess is adequate? 102-75.150 Section 102-75.150 Public Contracts and Property... PROPERTY 75-REAL PROPERTY DISPOSAL Utilization of Excess Real Property Examination for Acceptability § 102-75.150 What happens when GSA determines that the report of excess is adequate? When GSA...
41 CFR 102-75.150 - What happens when GSA determines that the report of excess is adequate?
Code of Federal Regulations, 2013 CFR
2013-07-01
... determines that the report of excess is adequate? 102-75.150 Section 102-75.150 Public Contracts and Property... PROPERTY 75-REAL PROPERTY DISPOSAL Utilization of Excess Real Property Examination for Acceptability § 102-75.150 What happens when GSA determines that the report of excess is adequate? When GSA...
41 CFR 102-75.150 - What happens when GSA determines that the report of excess is adequate?
Code of Federal Regulations, 2012 CFR
2012-01-01
... determines that the report of excess is adequate? 102-75.150 Section 102-75.150 Public Contracts and Property... PROPERTY 75-REAL PROPERTY DISPOSAL Utilization of Excess Real Property Examination for Acceptability § 102-75.150 What happens when GSA determines that the report of excess is adequate? When GSA...
Code of Federal Regulations, 2014 CFR
2014-07-01
... my system's watershed control requirements are adequate? 141.522 Section 141.522 Protection of... Additional Watershed Control Requirements for Unfiltered Systems § 141.522 How does the State determine whether my system's watershed control requirements are adequate? During an onsite inspection...
Code of Federal Regulations, 2012 CFR
2012-07-01
... my system's watershed control requirements are adequate? 141.522 Section 141.522 Protection of... Additional Watershed Control Requirements for Unfiltered Systems § 141.522 How does the State determine whether my system's watershed control requirements are adequate? During an onsite inspection...
Code of Federal Regulations, 2013 CFR
2013-07-01
... my system's watershed control requirements are adequate? 141.522 Section 141.522 Protection of... Additional Watershed Control Requirements for Unfiltered Systems § 141.522 How does the State determine whether my system's watershed control requirements are adequate? During an onsite inspection...
Code of Federal Regulations, 2010 CFR
2010-07-01
... my system's watershed control requirements are adequate? 141.522 Section 141.522 Protection of... Additional Watershed Control Requirements for Unfiltered Systems § 141.522 How does the State determine whether my system's watershed control requirements are adequate? During an onsite inspection...
Code of Federal Regulations, 2012 CFR
2012-04-01
... submit adequate prior notice or otherwise failing to comply with this subpart? 1.284 Section 1.284 Food... failing to submit adequate prior notice or otherwise failing to comply with this subpart? (a) The importing or offering for import into the United States of an article of food in violation of...
Code of Federal Regulations, 2011 CFR
2011-04-01
... submit adequate prior notice or otherwise failing to comply with this subpart? 1.284 Section 1.284 Food... failing to submit adequate prior notice or otherwise failing to comply with this subpart? (a) The importing or offering for import into the United States of an article of food in violation of...
Code of Federal Regulations, 2014 CFR
2014-04-01
... submit adequate prior notice or otherwise failing to comply with this subpart? 1.284 Section 1.284 Food... failing to submit adequate prior notice or otherwise failing to comply with this subpart? (a) The importing or offering for import into the United States of an article of food in violation of...
Code of Federal Regulations, 2013 CFR
2013-04-01
... submit adequate prior notice or otherwise failing to comply with this subpart? 1.284 Section 1.284 Food... failing to submit adequate prior notice or otherwise failing to comply with this subpart? (a) The importing or offering for import into the United States of an article of food in violation of...
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 7 2014-04-01 2014-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 7 2012-04-01 2012-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 7 2013-04-01 2013-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Code of Federal Regulations, 2011 CFR
2011-07-01
... my system's watershed control requirements are adequate? 141.522 Section 141.522 Protection of... Additional Watershed Control Requirements for Unfiltered Systems § 141.522 How does the State determine whether my system's watershed control requirements are adequate? During an onsite inspection...
Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.
2014-01-01
Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748
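The sample-size side of such power calculations can be sketched, for the simplest two-group comparison of means, with a normal approximation; the effect size and error rates below are illustrative, not values from the workshop framework:

```python
import math
from statistics import NormalDist

def samples_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate biospecimens per group for a two-sample comparison
    of means, given a standardized effect size (Cohen's d), two-sided
    significance level alpha, and desired power. Normal approximation
    to the two-sample t-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    n = 2 * ((z_alpha + z_beta) / effect_size) ** 2
    return math.ceil(n)

# Detecting a modest biomarker shift (d = 0.5) at 80% power:
print(samples_per_group(0.5))   # 63 per group
```

The calculation makes the workshop's point concrete: halving the detectable effect size quadruples the required number of biospecimens, which is why underpowered discovery studies are so common.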
Power Plant Water Intake Assessment.
ERIC Educational Resources Information Center
Zeitoun, Ibrahim H.; And Others
1980-01-01
In order to adequately assess the impact of power plant cooling water intake on an aquatic ecosystem, total ecosystem effects must be considered, rather than merely numbers of impinged or entrained organisms. (Author/RE)
First-digit law in nonextensive statistics.
Shao, Lijing; Ma, Bo-Qiang
2010-10-01
Nonextensive statistics, characterized by a nonextensive parameter q, is a promising and practically useful generalization of the Boltzmann statistics to describe power-law behaviors from physical and social observations. We here explore the unevenness of the first-digit distribution of nonextensive statistics analytically and numerically. We find that the first-digit distribution follows Benford's law and fluctuates slightly in a periodical manner with respect to the logarithm of the temperature. The fluctuation decreases when q increases, and the result converges to Benford's law exactly as q approaches 2. The relevant regularities between nonextensive statistics and Benford's law are also presented and discussed. PMID:21230241
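Comparing first-digit frequencies with Benford's law can be sketched numerically. As a stand-in for the nonextensive distributions of the paper (an assumption for illustration), the sample below is log-uniform over several decades, for which the leading digits follow Benford's law exactly in expectation:

```python
import math
import random

def benford_prob(d):
    """Benford's law: P(d) = log10(1 + 1/d) for leading digit d."""
    return math.log10(1 + 1 / d)

def first_digit(x):
    """Leading decimal digit of a positive number."""
    return int(x / 10 ** math.floor(math.log10(x)))

random.seed(42)
# Log-uniform sample spanning five decades.
sample = [10 ** random.uniform(0, 5) for _ in range(100_000)]

counts = [0] * 10
for x in sample:
    counts[first_digit(x)] += 1

for d in range(1, 10):
    print(f"digit {d}: observed {counts[d] / len(sample):.4f}  "
          f"Benford {benford_prob(d):.4f}")
```

Digit 1 should appear about 30.1% of the time and digit 9 about 4.6%; distributions whose logarithms are not close to uniform modulo 1 deviate from these frequencies, which is the kind of fluctuation the paper quantifies.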
Mathematical and statistical analysis
NASA Technical Reports Server (NTRS)
Houston, A. Glen
1988-01-01
The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.
Minnesota Health Statistics 1988.
ERIC Educational Resources Information Center
Minnesota State Dept. of Health, St. Paul.
This document comprises the 1988 annual statistical report of the Minnesota Center for Health Statistics. After introductory technical notes on changes in format, sources of data, and geographic allocation of vital events, an overview is provided of vital health statistics in all areas. Thereafter, separate sections of the report provide tables…
ERIC Educational Resources Information Center
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
ERIC Educational Resources Information Center
Strasser, Nora
2007-01-01
Avoiding statistical mistakes is important for educators at all levels. Basic concepts will help you to avoid making mistakes using statistics and to look at data with a critical eye. Statistical data is used at educational institutions for many purposes. It can be used to support budget requests, changes in educational philosophy, changes to…
Statistical quality management
NASA Astrophysics Data System (ADS)
Vanderlaan, Paul
1992-10-01
Some aspects of statistical quality management are discussed. Quality has to be defined as a concrete, measurable quantity. The concepts of Total Quality Management (TQM), Statistical Process Control (SPC), and inspection are explained. In most cases SPC is better than inspection. It can be concluded that statistics has great possibilities in the field of TQM.
Dogan, Mustafa; Mutlu, Levent C; Yilmaz, İbrahim; Bilir, Bulent; Varol Saracoglu, Gamze; Yildirim Guzelant, Aliye
2016-01-01
The aim of the present study was to increase awareness regarding the rational use of medicines. The data were obtained via the Material Resources Management System Module of the Ministry of Health. The appropriateness of treatments was judged against the Global Initiative for Asthma, the Global Initiative for Chronic Obstructive Lung Disease, and the guidelines for the rational use of medicines. We also investigated whether any de-escalation method or physical exercise was employed. Statistical analyses were performed using descriptive statistics to determine the mean, standard deviation, and frequency. The results showed that healthcare providers ignored potential drug reactions or adverse interactions and, reflecting poor adherence to the current treatment guidelines, irrational use of medicines was recorded in 35.8% of cases. Thus, de-escalation methods should be used to decrease costs or narrow the antibiotic spectrum, antibiotic selection should consider resistance patterns, culturing methods should be analyzed, and monotherapy should be preferred over combination treatments. PMID:26166817
Statistical error in particle simulations of low mach number flows
Hadjiconstantinou, N G; Garcia, A L
2000-11-13
We present predictions for the statistical error due to finite sampling in the presence of thermal fluctuations in molecular simulation algorithms. The expressions are derived using equilibrium statistical mechanics. The results show that the number of samples needed to adequately resolve the flowfield scales as the inverse square of the Mach number. Agreement of the theory with direct Monte Carlo simulations shows that the use of equilibrium theory is justified.
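The reported scaling can be turned into a quick estimate of sampling cost. This is a schematic sketch: the prefactor in Hadjiconstantinou and Garcia's exact expressions differs, but the inverse-square dependence on Mach number is the point:

```python
def samples_needed(mach, rel_error, gamma=5/3):
    """Rough count of independent samples needed to resolve the mean
    flow velocity to relative error `rel_error` at Mach number `mach`.
    Per-sample noise is set by the thermal speed, so the error after
    N samples scales as 1 / (sqrt(N) * sqrt(gamma) * Ma), i.e.
    N ~ 1 / (gamma * Ma**2 * rel_error**2)."""
    return round(1 / (gamma * mach**2 * rel_error**2))

# Halving the Mach number quadruples the required sample count:
print(samples_needed(0.1, 0.05))   # 24000
print(samples_needed(0.05, 0.05))  # 96000
```

This is why particle simulations of low-speed (low Mach number) flows are so expensive: the hydrodynamic signal shrinks relative to the fixed thermal noise floor.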
Adopting adequate leaching requirement for practical response models of basil to salinity
NASA Astrophysics Data System (ADS)
Babazadeh, Hossein; Tabrizi, Mahdi Sarai; Darvishi, Hossein Hassanpour
2016-07-01
Several mathematical models are used for assessing plant response to root zone salinity. The objectives of this study were to quantify the yield threshold of basil with respect to irrigation water salinity and to investigate the possibility of using irrigation water salinity instead of saturated-extract salinity in the available mathematical models for estimating yield. To achieve these objectives, an extensive greenhouse experiment was conducted with 13 irrigation water salinity levels, namely 1.175 dS m-1 (control treatment) and 1.8 to 10 dS m-1. The results indicated that, among these models, the modified discount model (one of the best-known statistically based root water uptake models) simulated the basil yield reduction function most accurately from irrigation water salinities. Overall, the statistical model of Steppuhn et al. applied to the modified discount model and the math-empirical model of van Genuchten and Hoffman provided the best results. In general, all of the statistical models produced very similar results, and their results were better than those of the math-empirical models. It was also concluded that, with adequate leaching, there was no significant difference between models based on soil saturated-extract salinity and those using irrigation water salinity.
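The van Genuchten-Hoffman response function mentioned above has a simple closed form, relative yield Yr = 1 / (1 + (EC/EC50)^p). The sketch below uses illustrative parameter values, not the fitted values from this basil experiment:

```python
def relative_yield(ec, ec50, p=3.0):
    """van Genuchten-Hoffman S-shaped salinity response: relative
    yield falls from 1 toward 0 as salinity `ec` (dS/m) rises past
    `ec50`, the salinity at which yield is halved. `p` controls the
    steepness; both parameter values here are illustrative."""
    return 1.0 / (1.0 + (ec / ec50) ** p)

# Yield curve over the experiment's salinity range (1.8-10 dS/m):
for ec in (1.8, 4.0, 6.0, 8.0, 10.0):
    print(f"{ec:>4} dS/m -> {relative_yield(ec, ec50=6.0):.2f}")
```

Fitting EC50 and p to observed yields, with EC taken as irrigation water salinity rather than saturated-extract salinity, is the kind of substitution the study evaluates.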
Explorations in statistics: statistical facets of reproducibility.
Curran-Everett, Douglas
2016-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science. PMID:27231259
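A small simulation makes the statistical side of reproducibility concrete: when a design has modest power, an exact repeat of a "significant" experiment succeeds at roughly the power, not at 95%. This is a generic illustration (one-sided z-test, arbitrary effect size), not an analysis from the installment:

```python
import random
from statistics import NormalDist, fmean

def significant(effect, n, alpha=0.05):
    """One-sided z-test on the mean of n unit-variance observations
    centred on `effect` (deliberately simplified)."""
    z_crit = NormalDist().inv_cdf(1 - alpha)
    xbar = fmean(random.gauss(effect, 1) for _ in range(n))
    return xbar * n ** 0.5 > z_crit

random.seed(42)
trials = 2000
first = [significant(0.4, 25) for _ in range(trials)]
# Among studies that "worked", how often does an exact repeat work?
repeats = sum(significant(0.4, 25) for hit in first if hit)
print(f"power ~ {sum(first) / trials:.2f}, "
      f"replication rate ~ {repeats / sum(first):.2f}")
```

Both numbers hover around 0.64 here: the probability of replicating a true effect is the power of the replicate study, which is why underpowered designs often fail to reproduce even real phenomena.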
Goutianos, Georgios; Tzioura, Aikaterini; Kyparos, Antonios; Paschalis, Vassilis; Margaritelis, Nikos V; Veskoukis, Aristidis S; Zafeiridis, Andreas; Dipla, Konstantina; Nikolaidis, Michalis G; Vrabas, Ioannis S
2015-02-01
Animal models are widely used in biology and the findings of animal research are traditionally projected to humans. However, recent publications have raised concerns about the extent to which animals and humans respond similarly to physiological stimuli. Original data on direct in vivo comparisons between animals and humans are scarce, and no study has addressed this issue after exercise. We aimed to compare, side by side in the same experimental setup, rat and human responses to an acute exercise bout of matched intensity and duration. Rats and humans ran on a treadmill at 86% of maximal velocity until exhaustion. Pre- and post-exercise we measured 30 blood chemistry parameters, which evaluate iron status, lipid profile, glucose regulation, protein metabolism, and liver and renal function. ANOVA indicated that almost all biochemical parameters followed a similar alteration pattern post-exercise in rats and humans. In fact, there were only 2/30 significant species × exercise interactions (in testosterone and globulins), indicating different responses to exercise between rats and humans. On the contrary, the main effect of exercise was significant in 15/30 parameters and marginally nonsignificant in two others (copper, P = 0.060 and apolipoprotein B, P = 0.058). Our major finding is that the rat adequately mimics human responses to exercise in the basic blood biochemical parameters reported here. The physiological resemblance of rat and human blood responses after exercise to exhaustion on a treadmill indicates that the use of blood chemistry in rats for exercise physiology research is justified. PMID:25677548
Aurally-adequate time-frequency analysis for scattered sound in auditoria
NASA Astrophysics Data System (ADS)
Norris, Molly K.; Xiang, Ning; Kleiner, Mendel
2005-04-01
The goal of this work was to apply an aurally-adequate time-frequency analysis technique to the analysis of sound scattering effects in auditoria. Time-frequency representations were developed as a motivated effort that takes into account binaural hearing, with a specific implementation of the interaural cross-correlation process. A model of the human auditory system was implemented in the MATLAB platform based on two previous models [A. Härmä and K. Palomäki, HUTear, Espoo, Finland; and M. A. Akeroyd, A Binaural Cross-correlogram Toolbox for MATLAB (2001), University of Sussex, Brighton]. These stages include proper frequency selectivity, the conversion of the mechanical motion of the basilar membrane to neural impulses, and binaural hearing effects. The model was then used in the analysis of room impulse responses with varying scattering characteristics. This paper discusses the analysis results using simulated and measured room impulse responses. [Work supported by the Frank H. and Eva B. Buck Foundation.]
[Rhythmic nuclear growth of adequately stimulated ganglia cells of acoustic nuclei (rat)].
Köpf-Maier, P; Wüstenfeld, E
1975-01-01
Ganglia cells of the dorsal and ventral cochlear nuclei of white rats were adequately stimulated for different periods or left untreated, respectively, and investigated karyometrically. The frequency distribution curves of the nuclear volumes were separated by means of an electronic curve resolver into component curves, i.e. into groups of nuclei obeying exactly a Gaussian normal distribution and thus representing biologically uniform populations. The analysis of the mean values of the component curves led to the following results: 1. The mean values of the component curves can be arranged in two series following the pattern V1, V1√2, V2, V2√2, V4, V4√2, ... 2. The series V1, V1√2, V2, V2√2, ... is based on a geometrical series of the general formula a_n = k·q^n. 3. It follows from these results that the nuclear volumes grow rhythmically by a factor of √2 and, consequently, that there is a periodical doubling in the growth of the surface. PMID:1200386
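The geometric series reported above, a_n = k·q^n with q = √2, can be generated directly (the starting value below is an arbitrary illustration, not a measured nuclear volume):

```python
from math import isclose, sqrt

def nuclear_volume_series(v1, steps=6):
    """Geometric series a_n = v1 * sqrt(2)**n of the karyometric
    analysis: each step multiplies the volume by sqrt(2), so every
    second step doubles it (the 'rhythmic' doubling)."""
    return [v1 * sqrt(2) ** n for n in range(steps)]

series = nuclear_volume_series(100.0)
print([round(v, 1) for v in series])  # 100.0, 141.4, 200.0, 282.8, ...
assert isclose(series[2], 2 * series[0])  # doubling every two steps
```

Interleaving the two series offset by one step gives exactly the observed pattern V1, V1√2, V2, V2√2, V4, ...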
Barth, Amy E.; Denton, Carolyn A.; Stuebing, Karla K.; Fletcher, Jack M.; Cirino, Paul T.; Francis, David J.; Vaughn, Sharon
2013-01-01
The cerebellar hypothesis of dyslexia posits that cerebellar deficits are associated with reading disabilities and may explain why some individuals with reading disabilities fail to respond to reading interventions. We tested these hypotheses in a sample of children who participated in a grade 1 reading intervention study (n = 174) and a group of typically achieving children (n = 62). At posttest, children were classified as adequately responding to the intervention (n = 82), inadequately responding with decoding and fluency deficits (n = 36), or inadequately responding with only fluency deficits (n = 56). Based on the Bead Threading and Postural Stability subtests from the Dyslexia Screening Test-Junior, we found little evidence that assessments of cerebellar functions were associated with academic performance or responder status. In addition, we did not find evidence supporting the hypothesis that cerebellar deficits are more prominent for poor readers with “specific” reading disabilities (i.e., with discrepancies relative to IQ) than for poor readers with reading scores consistent with IQ. In contrast, measures of phonological awareness, rapid naming, and vocabulary were strongly associated with responder status and academic outcomes. These results add to accumulating evidence that fails to associate cerebellar functions with reading difficulties. PMID:20298639
The placental pursuit for an adequate oxidant balance between the mother and the fetus
Herrera, Emilio A.; Krause, Bernardo; Ebensperger, German; Reyes, Roberto V.; Casanello, Paola; Parra-Cordero, Mauro; Llanos, Anibal J.
2014-01-01
The placenta is the exchange organ that regulates metabolic processes between the mother and her developing fetus. The adequate function of this organ is clearly vital for a physiologic gestational process and a healthy baby as the final outcome. The umbilico-placental vasculature has the capacity to respond to variations in the materno-fetal milieu. Depending on the intensity and extent of the insult, these responses may be immediate, mediate, or long-lasting, resulting in potential morphostructural and functional changes later in life. These adjustments usually compensate for the initial insults, but occasionally may switch to long-lasting remodeling and dysfunctional processes, giving rise to maladaptation. One of the most challenging conditions in modern perinatology is hypoxia and oxidative stress during development, disorders that occur both at high altitude and in low-altitude placental insufficiency. Hypoxia and oxidative stress may induce endothelial dysfunction and thus reduce placental perfusion, restricting fetal growth and development. This review focuses on placental responses to hypoxic conditions, usually related to high altitude and placental insufficiency, that lead to oxidative stress and vascular disorders, altering fetal and maternal health. Although day-to-day clinical practice and basic and clinical research are clearly providing evidence of the severe impact of oxygen deficiency and of the establishment of oxidative stress during pregnancy, further research on umbilical and placental vascular function under these conditions is badly needed to clarify the myriad of questions still unsettled. PMID:25009498
Said, Rana R; Rosman, N Paul
2004-08-01
A 10-year-old boy with daily headache for 1 month and intermittent diplopia for 1 week was found to have a unilateral partial abducens palsy and bilateral papilledema; otherwise, his neurologic examination showed no abnormalities. A cranial computed tomographic (CT) scan was normal. Lumbar puncture disclosed a markedly elevated opening pressure of > 550 mm of cerebrospinal fluid with normal cerebrospinal fluid. Medical therapy with acetazolamide for presumed pseudotumor cerebri was begun. Magnetic resonance imaging (MRI) of the brain, done several days later because of continuing symptoms, unexpectedly showed multiple hyperintensities of cerebral white matter on T2-weighted and fluid-attenuated inversion recovery images. Despite high-dose intravenous methylprednisolone for possible demyelinating disease, he failed to improve. A left temporal brain biopsy followed and disclosed an anaplastic oligodendroglioma. In a patient with features indicating pseudotumor cerebri, a negative cranial CT scan is not adequate to rule out underlying pathology; thus, MRI of the brain should probably always be performed. A revised definition of pseudotumor cerebri could better include "normal MRI of the brain" rather than "normal neuroimaging." PMID:15605471
Determination of the need for selenium by chicks fed practical diets adequate in vitamin E
Combs, G.F. Jr.; Su, Q.; Liu, C.H.; Sinisalo, M.; Combs, S.B.
1986-03-01
Experiments were conducted to compare the dietary needs for selenium (Se) by chicks fed either purified (amino acid-based) or practical (corn- and soy-based) diets that were adequate with respect to vitamin E (i.e., contained 100 IU/kg) and all other known nutrients with the single exception of Se (i.e., contained only 0.10 ppm Se). Studies were conducted in Ithaca using Single Comb White Leghorn chicks fed the purified basal diet and in Beijing using chicks of the same breed fed either the same purified basal diet or the practical diet formulated to be similar to that used in poultry production in some parts of China and the US. Results showed that each basal diet produced severe depletion of Se-dependent glutathione peroxidase (SeGSHpx) in plasma, liver and pancreas according to the same time-course, but that other consequences of severe uncomplicated Se deficiency were much more severe among chicks fed the purified diet (e.g., growth depression, pancreatic dysfunction as indicated by elevated plasma amylase and abnormal pancreatic histology). Chicks fed the practical Se-deficient diet showed reduced pancreas levels of copper, zinc and molybdenum and elevated plasma levels of iron; they required ca. 0.10 ppm dietary Se to sustain normal SeGSHpx in several tissues and to prevent elevated amylase in plasma. The dietary Se requirement of the chick is, therefore, estimated to be 0.10 ppm.
[Level of awareness and the adequate application of sunscreen by beauticians].
Cortez, Diógenes Aparício Garcia; Machado, Érica Simionato; Vermelho, Sonia Cristina Soares Dias; Teixeira, Jorge Juarez Vieira; Cortez, Lucia Elaine Ranieri
2016-06-01
The scope of this research was to establish the level of awareness of beauticians regarding the importance of the application of sunscreen and to identify whether their patients had been properly instructed by these professionals. It involved a descriptive and exploratory study with interviews applying qualitative methodology among 30 beauticians. Data were gathered using the semi-structured interview technique in Maringá, in the southern state of Paraná. The data were analyzed using Atlas.ti software after applying quantitative analysis and response classification. Of those interviewed, 83.33% had a degree in Aesthetics, 20% attended ongoing training activities on sunscreen and 73.17% acquired sunscreen for its quality, though 86.67% were not familiar with sunscreens with natural anti-free radical components. Of those interviewed, 80% had never treated patients with skin cancer, though they reported having knowledge of care in relation to sun exposure and how to use the sunscreen and the relationship of these practices with the disease. The results showed that the recommendations and use of sunscreen by beauticians and users has been conducted in an adequate and conscientious manner. PMID:27383359
Maharaj, N R; Gangaram, R; Moodley, J
2007-04-01
Recent evidence on the long-term effects of HRT has resulted in increased emphasis being placed on individualised counselling, patient choice and informed consent when managing the menopause. We assessed whether women in an under-resourced country have adequate knowledge of the menopause and HRT to engage in patient-provider discussions and provide fully informed consent for HRT. Specific 'knowledge scores' for the menopause and HRT were developed and utilised in structured questionnaires to determine the existing levels of knowledge in 150 women from different racial, educational and occupational backgrounds. Some 92% were aware of the menopause and 54% were aware of HRT. Specific knowledge about the menopause and HRT overall was low (39% and 38%, respectively). There was a significant association between higher education levels, race and occupational status and knowledge of the menopause, but not of HRT. Television, radio and pamphlets were the preferred sources for gaining further information. There is a need to create awareness and provide further education to women in under-resourced countries about the menopause and HRT to empower them to make informed choices about their health during this period. PMID:17464817
A high UV environment does not ensure adequate Vitamin D status
NASA Astrophysics Data System (ADS)
Kimlin, M. G.; Lang, C. A.; Brodie, A.; Harrison, S.; Nowak, M.; Moore, M. R.
2006-12-01
Queensland has the highest rates of skin cancer in the world, and because of the high levels of solar UV in this region it is assumed that incidental UV exposure should provide adequate vitamin D status for the population. This research was undertaken to test this assumption among healthy free-living adults in south-east Queensland, Australia (27°S), at the end of winter. The research was approved by the Queensland University of Technology Human Research Ethics Committee and conducted under the guidelines of the Declaration of Helsinki. 10.2% of the sample had serum vitamin D levels below 25 nmol/L (deficiency) and a further 32.3% had levels between 25 nmol/L and 50 nmol/L (insufficiency). Vitamin D deficiency and insufficiency can therefore occur at the end of winter, even in sunny climates. The wintertime UV levels in south-east Queensland (UV index 4-6) are equivalent to summertime UV levels in northern regions of Europe and the USA, and these ambient UV levels should be sufficient to support synthesis of vitamin D requirements. We investigated individual UV exposure (through a self-reported sun exposure questionnaire) and found correlations between exposure and vitamin D status. Further research is needed to explore the interactions between the solar UV environment and vitamin D status, particularly in high UV environments such as Queensland.
Deng, Chun-hua; Zhang, Ya-dong; Chen, Xin
2015-01-01
Mild-symptom erectile dysfunction (MSED) is commonly seen in clinical practice, but receives inadequate attention from both the patients and clinicians. Increasing researches have indicated that MSED is associated with not only unhealthy living habits and psychological factors but also the early progression of endothelial, metabolic and endocrine diseases. The diagnosis and treatment of MSED should be based on the relevant guidelines, with consideration of both its specific and common features. The therapeutic principle is a combination of integrated and individual solutions aimed at the causes of the disease. Drug intervention should be initiated if psychological therapy fails. Negligence of MSED may affect the quality of life of the patients and their partners, and what's more, might delay the management of some other severe underlying diseases. Adequate attention to the early diagnosis and treatment for MSED is of great significance for a deeper insight into the etiology of ED, the prevention of potential cardiovascular and metabolic diseases, and the improvement of the overall health of males. PMID:25707132
Improved ASTM G72 Test Method for Ensuring Adequate Fuel-to-Oxidizer Ratios
NASA Technical Reports Server (NTRS)
Juarez, Alfredo; Harper, Susana A.
2016-01-01
The ASTM G72/G72M-15 Standard Test Method for Autogenous Ignition Temperature of Liquids and Solids in a High-Pressure Oxygen-Enriched Environment is currently used to evaluate materials for the ignition susceptibility driven by exposure to external heat in an enriched oxygen environment. Testing performed on highly volatile liquids such as cleaning solvents has proven problematic due to inconsistent test results (non-ignitions). Non-ignition results can be misinterpreted as favorable oxygen compatibility, although they are more likely associated with inadequate fuel-to-oxidizer ratios. Forced evaporation during purging and inadequate sample size were identified as two potential causes for inadequate available sample material during testing. In an effort to maintain adequate fuel-to-oxidizer ratios within the reaction vessel during test, several parameters were considered, including sample size, pretest sample chilling, pretest purging, and test pressure. Tests on a variety of solvents exhibiting a range of volatilities are presented in this paper. A proposed improvement to the standard test protocol as a result of this evaluation is also presented. Execution of the final proposed improved test protocol outlines an incremental step method of determining optimal conditions using increased sample sizes while considering test system safety limits. The proposed improved test method increases confidence in results obtained by utilizing the ASTM G72 autogenous ignition temperature test method and can aid in the oxygen compatibility assessment of highly volatile liquids and other conditions that may lead to false non-ignition results.
Burnier, Michel; Wuerzner, Gregoire; Bochud, Murielle
2015-01-01
Among the various strategies to reduce the incidence of non-communicable diseases, reduction of sodium intake in the general population has been recognized as one of the most cost-effective, because of its potential impact on the development of hypertension and cardiovascular diseases. Yet this strategic health recommendation of the WHO and many other international organizations is far from universally accepted. Indeed, several unresolved scientific and epidemiological questions maintain an ongoing debate: what is an adequate target level of sodium intake to recommend to the general population, and should national strategies be oriented to the overall population or only to higher-risk fractions of the population, such as salt-sensitive patients? In this paper, we review recent results in the literature regarding salt, blood pressure and cardiovascular risk, and we present the recommendations recently proposed by a group of experts from Switzerland. The participating medical societies propose encouraging national health authorities to continue their discussions with the food industry in order to reduce the sodium content of food products, with a target mean salt intake of 5-6 grams per day in the population. Moreover, all initiatives to increase information on the effects of salt on health and on the salt content of food are supported. PMID:26321959
Idvall, J; Aronsen, K F; Lindström, K; Ulmsten, U
1977-09-30
Various catheter-manometer systems suitable for intravascular blood pressure measurements in rats were elaborated and tested in vitro and in vivo. Using a pressure-step calibrator, it was observed from the in vitro studies that microtransducers had a superior frequency response compared with conventional transducers. Of the catheters tested, PE-90 tapered to a 40 mm tip with an inner diameter of 0.3 mm had the best frequency response as judged from fall and settling times. Because of the damping effect, tapering increased the fall time to 1.8 ms, which was still quite acceptable. By the same token, the settling time was minimized to 22.4 ms. With a special calculation method, the theoretical percentage error of the recordings was estimated to be 9.66%. When the measurement error was calculated from the actual in vivo recordings, it was found to be no more than 2.7%. These results show that the technique described is adequate for continuous intravascular blood pressure recordings in small animals. Finally, it is emphasized that careful handling of the catheters and avoidance of stopcocks and air bubbles are essential for obtaining accurate and reproducible values. PMID:928971
Is reimbursement for childhood immunizations adequate? evidence from two rural areas in colorado.
Glazner, J. E.; Steiner, J. F.; Haas, K. J.; Renfrew, B.; Deutchman, M.; Berman, S.
2001-01-01
OBJECTIVE: To assess adequacy of reimbursement for childhood vaccinations in two rural regions in Colorado, the authors measured medical practice costs of providing childhood vaccinations and compared them with reimbursement. METHODS: A "time-motion" method was used to measure labor costs of providing vaccinations in 13 private and public practices. Practices reported non-labor costs. The authors determined reimbursement by record review. RESULTS: The average vaccine delivery cost per dose (excluding vaccine cost) ranged from $4.69 for community health centers to $5.60 for private practices. Average reimbursement exceeded average delivery costs for all vaccines and contributed to overhead in private practices. Average reimbursement was less than total cost (vaccine-delivery costs + overhead) in private practices for most vaccines in one region with significant managed care penetration. Reimbursement to public providers was less than the average vaccine delivery costs. CONCLUSIONS: Current reimbursement may not be adequate to induce private practices to provide childhood vaccinations, particularly in areas with substantial managed care penetration. PMID:12034911
Finkel, Jonathan; Cira, Courtney; Mazzella, Leanne; Bartyzel, Jim; Ramanna, Annisce; Strimel, Kayla; Waturuocha, Amara; Musser, Nathan; Burress, James; Brammer, Sarah; Wetzel, Robert; Horzempa, Joseph
2016-01-01
Vitamin D is a secosterol that is naturally synthesized in the skin upon contact with ultraviolet rays. This vitamin can also be acquired from dietary and nutritional supplements. The active form, vitamin D3, is primarily responsible for calcium homeostasis and bone health. However, many recent studies have associated low levels of vitamin D3 with asthma and food allergies. In this review, we discuss literature to explore the potential that vitamin D3 deficiency may be contributing toward the development of asthma and food allergies. These studies indicate that mothers who supplement with doses of vitamin D3 recommended for daily consumption (400 IU) by the United States Food and Drug Administration is not enough to deliver adequate levels to breastfed infants. Because sufficient vitamin D3 serum levels correlate with a low incidence of asthma and food allergies, high dose vitamin D3 supplementation (4000 IU) by pregnant and breastfeeding women may limit the development of asthma and food allergies in newborns. PMID:27213185
Gasparetto, Emerson Leandro; Alves-Leon, Soniza; Domingues, Flavio Sampaio; Frossard, João Thiago; Lopes, Selva Paraguassu; Souza, Jorge Marcondes de
2016-06-01
Neurocysticercosis (NCC) is an endemic disease and an important public health problem in some areas of the world, and epilepsy is its most common neurological manifestation. Multiple intracranial lesions, commonly calcified, are seen on cranial computed tomography (CT) in the chronic phase of the disease and are considered one of the diagnostic criteria. Magnetic resonance imaging (MRI) is the test that best depicts the different stages of the intracranial cysts, but it does not clearly show calcified lesions. Cerebral cavernous malformations (CCM), also known as cerebral cavernomas, are frequent vascular malformations of the brain, better demonstrated by MRI, and also have epilepsy as their main form of clinical presentation. When occurring in the familial form, cerebral cavernomas typically present with multiple lesions throughout the brain and, very often, with foci of calcification in the lesions on CT imaging. In countries and geographic areas where NCC is an established endemic health problem and neuroimaging screening is done by CT scan, it is important to consider the differential diagnosis between the two diseases, because their appropriate management differs. PMID:27332076
Timens, W; Leemans, R
1992-01-01
The risk of severe infections after splenectomy, even after many years, is now well established. In attempts to prevent these infections, spleen-saving techniques, including autotransplantation of spleen fragments, have been performed, when possible in combination with vaccination. The problem in autotransplantation is the evaluation of functional activity. The results of the tests used until now often do not seem to correlate very well with the risk of developing an overwhelming postsplenectomy infection (OPSI). This may be related to the fact that the tests used evaluate general functions, and not specific spleen-related functions, such as the capacity to mount a primary response to certain polysaccharide antigens present in the capsule of bacteria known to cause OPSI. In this review, the significance of the spleen in the human immune system is discussed and the effects of splenectomy are described, including the precautions that can be taken to diminish the risk of postsplenectomy infections and sepsis. It appears that postsplenectomy vaccination is more successful when recently developed protein-conjugated polysaccharide vaccines are used. Because the present testing of the function of spleen autotransplants is not adequate, we suggest that new tests should be developed, employing appropriate polysaccharide antigens. PMID:1543398
Statistical modeling of laser welding of DP/TRIP steel sheets
NASA Astrophysics Data System (ADS)
Reisgen, U.; Schleser, M.; Mokrov, O.; Ahmed, E.
2012-02-01
In this research work, a statistical analysis of the CO2 laser beam welding of dual phase (DP600)/transformation induced plasticity (TRIP700) steel sheets was performed using response surface methodology. The analysis considered the effect of laser power (2-2.2 kW), welding speed (40-50 mm/s) and focus position (-1 to 0 mm) on the heat input, the weld bead geometry, uniaxial tensile strength, formability limited dome height and welding operation cost. The experimental design was based on the Box-Behnken design, using linear and quadratic polynomial equations for the predictive mathematical models. The results indicate that the proposed models predict the responses adequately within the limits of the welding parameters used and that welding speed is the most significant parameter in the welding process.
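As a rough illustration of the response-surface approach described above, the following sketch fits a full quadratic polynomial model to a three-factor design with ordinary least squares. The design points and response values are invented for illustration; they are not the paper's experimental data.

```python
import numpy as np

# Hypothetical Box-Behnken-style design points for laser power (kW),
# welding speed (mm/s) and focus position (mm), with a synthetic response y.
X_raw = np.array([
    [2.0, 40, -1.0], [2.2, 40, -1.0], [2.0, 50, -1.0], [2.2, 50, -1.0],
    [2.0, 45, -0.5], [2.2, 45, -0.5], [2.1, 40,  0.0], [2.1, 50,  0.0],
    [2.1, 45, -1.0], [2.1, 45,  0.0], [2.1, 45, -0.5], [2.1, 45, -0.5],
])
rng = np.random.default_rng(0)
y = (1.5 * X_raw[:, 0] - 0.02 * X_raw[:, 1] + 0.3 * X_raw[:, 2]
     + rng.normal(0, 0.01, len(X_raw)))          # synthetic response + noise

def quadratic_design_matrix(X):
    """Columns: 1, x1, x2, x3, x1^2, x2^2, x3^2, x1*x2, x1*x3, x2*x3."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1 * x2, x1 * x3, x2 * x3])

A = quadratic_design_matrix(X_raw)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # least-squares fit
y_hat = A @ coef
r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
print(f"R^2 = {r2:.3f}")
```

In a real response-surface study the significance of each term would then be assessed (e.g. by ANOVA) before using the model for prediction within the design limits.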
Code of Federal Regulations, 2010 CFR
2010-10-01
... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of manual and automatic records...
Roos, V; Gunnarsson, L; Fick, J; Larsson, D G J; Rudén, C
2012-04-01
The presence of pharmaceuticals in the aquatic environment, and concern over negative effects on aquatic organisms, has gained increasing attention in recent years. As ecotoxicity data are lacking for most active pharmaceutical ingredients (APIs), it is important to identify strategies to prioritise APIs for ecotoxicity testing and environmental monitoring. We used nine previously proposed prioritisation schemes, both risk- and hazard-based, to rank 582 APIs, and compared the similarities and differences in overall ranking results and input data. Moreover, we analysed how well the methods ranked seven relatively well-studied APIs. We conclude that the hazard-based methods were more successful in correctly ranking the well-studied APIs, although the fish plasma model, which includes human pharmacological data, also showed a high success rate. The analyses show that input data availability varies considerably; some data, such as logP, are available for most APIs, while information on environmental concentrations and bioconcentration is still scarce. The results also suggest that the exposure estimates in risk-based methods need to be improved and that including effect measures at first-tier prioritisation might underestimate risks. We propose that, to develop an adequate prioritisation scheme, improved exposure data, such as degradation, removal in sewage treatment, and bioconcentration ability, should be considered. The use of ATC codes may also be useful for developing a prioritisation scheme that includes the mode of action of pharmaceuticals and, to some extent, mixture effects. PMID:22361586
Emotional Experiences of Obese Women with Adequate Gestational Weight Variation: A Qualitative Study
Faria-Schützer, Débora Bicudo; Surita, Fernanda Garanhani de Castro; Alves, Vera Lucia Pereira; Vieira, Carla Maria; Turato, Egberto Ribeiro
2015-01-01
Background As a result of the growth of the obese population, the number of obese women of fertile age has increased in recent years. Obesity in pregnancy is related to greater levels of anxiety, depression and physical harm; however, pregnancy is an opportune moment for health care professionals to address obesity. The objective of this study was to describe how obese pregnant women emotionally experience success in adequate weight control. Methods and Findings Using a qualitative design that seeks to understand content in the field of health, the sample was selected purposively, with thirteen obese pregnant women participating in individual interviews. Data were analysed by inductive content analysis, which included complete transcription of the interviews, re-readings using suspended attention, categorization into discussion topics, and qualitative and inductive analysis of the content. The analysis revealed four categories, three of which show the trajectory of body care that obese women experience during pregnancy: 1) the obese pregnant woman starts to think about her body; 2) the challenge of the diet for the obese pregnant woman; 3) the relationship of the obese pregnant woman with the antenatal care team. The fourth category reveals the origin of the motivation for change: 4) the potentializing factors for change: the motivation of the obese woman while pregnant. Conclusions During pregnancy, obese women are more in touch with themselves and with their emotional conflicts. Through the transformations of their bodies, women can start a more refined self-care process and experience the body-mind unit. Fear for their own and their baby's lives, due to the risks posed by obesity, appears to be a great potentializing factor for change. The relationship with the professionals of the health care team plays an important role in the motivational support of the obese pregnant woman. PMID:26529600
Goutianos, Georgios; Tzioura, Aikaterini; Kyparos, Antonios; Paschalis, Vassilis; Margaritelis, Nikos V; Veskoukis, Aristidis S; Zafeiridis, Andreas; Dipla, Konstantina; Nikolaidis, Michalis G; Vrabas, Ioannis S
2015-01-01
Animal models are widely used in biology, and the findings of animal research are traditionally projected to humans. However, recent publications have raised concerns about the extent to which animals and humans respond similarly to physiological stimuli. Original data on direct in vivo comparisons between animals and humans are scarce, and no study has addressed this issue after exercise. We aimed to compare, side by side in the same experimental setup, rat and human responses to an acute exercise bout of matched intensity and duration. Rats and humans ran on a treadmill at 86% of maximal velocity until exhaustion. Pre- and post-exercise, we measured 30 blood chemistry parameters that evaluate iron status, lipid profile, glucose regulation, protein metabolism, and liver and renal function. ANOVA indicated that almost all biochemical parameters followed a similar alteration pattern post-exercise in rats and humans. In fact, there were only 2/30 significant species × exercise interactions (in testosterone and globulins), indicating different responses to exercise between rats and humans. On the contrary, the main effect of exercise was significant in 15/30 parameters and marginally nonsignificant in two others (copper, P = 0.060, and apolipoprotein B, P = 0.058). Our major finding is that the rat adequately mimics human responses to exercise in the basic blood biochemical parameters reported here. The physiological resemblance of rat and human blood responses after exercise to exhaustion on a treadmill indicates that the use of blood chemistry in rats for exercise physiology research is justified. PMID:25677548
NASA Astrophysics Data System (ADS)
Petropoulos, Z.; Clavin, C.; Zuckerman, B.
2015-12-01
The 2014 4-Methylcyclohexanemethanol (MCHM) spill in the Elk River of West Virginia highlighted existing gaps in emergency planning for, and response to, large-scale chemical releases in the United States. The Emergency Planning and Community Right-to-Know Act requires that facilities with hazardous substances provide Material Safety Data Sheets (MSDSs), which contain health and safety information on the hazardous substances. The MSDS produced by Eastman Chemical Company, the manufacturer of MCHM, listed "no data available" for various human toxicity subcategories, such as reproductive toxicity and carcinogenicity. As a result of incomplete toxicity data, the public and media received conflicting messages on the safety of the contaminated water from government officials, industry, and the public health community. Two days after the governor lifted the ban on water use, the health department partially retracted the ban by warning pregnant women to continue avoiding the contaminated water, which the Centers for Disease Control and Prevention deemed safe three weeks later. The response in West Virginia represents a failure in risk communication and calls into question whether government officials have sufficient information to support evidence-based decisions during future incidents. Research capabilities, like National Science Foundation RAPID funding, can fill some of the data gaps, such as information on environmental fate in the case of the MCHM spill. To inform policy discussions on this issue, a methodology for assessing the outcomes of RAPID and similar National Institutes of Health grants in the context of emergency response is employed to examine the efficacy of research-based capabilities in enhancing public health decision-making capacity. The results of this assessment highlight potential roles rapid scientific research can fill in ensuring adequate health and safety data are readily available for decision makers during large
Maintaining Adequate Carbon Dioxide Washout for an Advanced Extravehicular Mobility Unit
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Navarro, Moses; Conger, Bruce; Korona, Adam; McMillin, Summer; Norcross, Jason; Swickrath, Mike
2013-01-01
Over the past several years, NASA has realized tremendous progress in technology development that is aimed at the production of an Advanced Extravehicular Mobility Unit (AEMU). Of the many functions provided by the spacesuit and portable life support subsystem within the AEMU, delivering breathing gas to the astronaut along with removing the carbon dioxide (CO2) remains one of the most important environmental functions that the AEMU can control. Carbon dioxide washout is the capability of the ventilation flow in the spacesuit helmet to provide low concentrations of CO2 to the crew member to meet breathing requirements. CO2 washout performance is a critical parameter needed to ensure proper and sufficient designs in a spacesuit and in vehicle applications such as sleep stations and hygiene compartments. Human testing to fully evaluate and validate CO2 washout performance is necessary but also expensive due to the levied safety requirements. Moreover, correlation of math models becomes challenging because of human variability and movement. To supplement human CO2 washout testing, a breathing capability will be integrated into a suited manikin test apparatus to provide a safe, lower cost, stable, easily modeled alternative to human testing. Additionally, this configuration provides NASA Johnson Space Center (JSC) the capability to evaluate CO2 washout under off-nominal conditions that would otherwise be unsafe for human testing or difficult due to fatigue of a test subject. Testing has been under way in-house at JSC, and analysis has been initiated to evaluate whether the technology performs well enough to ensure that CO2 is removed and that ventilation flow is adequate for maintaining CO2 washout in the AEMU spacesuit helmet during an extravehicular activity. This paper will review recent CO2 washout testing and analysis activities, testing planned in-house with a spacesuit simulator, and the associated analytical work.
The adequate stimulus for avian short latency vestibular responses to linear translation
NASA Technical Reports Server (NTRS)
Jones, T. A.; Jones, S. M.; Colbert, S.
1998-01-01
Transient linear acceleration stimuli have been shown to elicit eighth nerve vestibular compound action potentials in birds and mammals. The present study was undertaken to better define the nature of the adequate stimulus for neurons generating the response in the chicken (Gallus domesticus). In particular, the study evaluated the question of whether the neurons studied are most sensitive to the maximum level of linear acceleration achieved or to the rate of change in acceleration (da/dt, or jerk). To do this, vestibular response thresholds were measured as a function of stimulus onset slope. Traditional computer signal averaging was used to record responses to pulsed linear acceleration stimuli. Stimulus onset slope was systematically varied. Acceleration thresholds decreased with increasing stimulus onset slope (decreasing stimulus rise time). When stimuli were expressed in units of jerk (g/ms), thresholds were virtually constant for all stimulus rise times. Moreover, stimuli having identical jerk magnitudes but widely varying peak acceleration levels produced virtually identical responses. Vestibular response thresholds, latencies and amplitudes appear to be determined strictly by stimulus jerk magnitudes. Stimulus attributes such as peak acceleration or rise time alone do not provide sufficient information to predict response parameter quantities. Indeed, the major response parameters were shown to be virtually independent of peak acceleration levels or rise time when these stimulus features were isolated and considered separately. It is concluded that the neurons generating short latency vestibular evoked potentials do so as "jerk encoders" in the chicken. Primary afferents classified as "irregular", and which traditionally fall into the broad category of "dynamic" or "phasic" neurons, would seem to be the most likely candidates for the neural generators of short latency vestibular compound action potentials.
Statistical phenomena in particle beams
Bisognano, J.J.
1984-09-01
Particle beams are subject to a variety of apparently distinct statistical phenomena such as intrabeam scattering, stochastic cooling, electron cooling, coherent instabilities, and radiofrequency noise diffusion. In fact, both the physics and mathematical description of these mechanisms are quite similar, with the notion of correlation as a powerful unifying principle. In this presentation we will attempt to provide both a physical and a mathematical basis for understanding the wide range of statistical phenomena that have been discussed. In the course of this study the tools of the trade will be introduced, e.g., the Vlasov and Fokker-Planck equations, noise theory, correlation functions, and beam transfer functions. Although a major concern will be to provide equations for analyzing machine design, the primary goal is to introduce a basic set of physical concepts having a very broad range of applicability.
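The Fokker-Planck description mentioned in this tutorial can be illustrated with a minimal Langevin sketch (all parameter values are arbitrary, not beam parameters): an ensemble obeying dv = -γv dt + √(2D) dW relaxes toward the stationary variance D/γ predicted by the corresponding Fokker-Planck equation.

```python
import numpy as np

# Euler-Maruyama integration of a damped-diffusive (Ornstein-Uhlenbeck)
# process; the ensemble distribution obeys a Fokker-Planck equation with
# stationary variance D / gamma.
rng = np.random.default_rng(1)
gamma, D = 1.0, 0.5          # damping rate and diffusion coefficient (arbitrary)
dt, steps, n = 2e-3, 5000, 5000
v = np.zeros(n)              # ensemble of n particles, all starting at rest
for _ in range(steps):
    v += -gamma * v * dt + np.sqrt(2 * D * dt) * rng.normal(size=n)
print(np.var(v), D / gamma)  # simulated vs stationary variance
```

The same damping-plus-diffusion balance underlies, e.g., the equilibrium reached between radiofrequency noise diffusion and cooling mechanisms.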
A simple statistical model for geomagnetic reversals
NASA Technical Reports Server (NTRS)
Constable, Catherine
1990-01-01
The diversity of paleomagnetic records of geomagnetic reversals now available indicate that the field configuration during transitions cannot be adequately described by simple zonal or standing field models. A new model described here is based on statistical properties inferred from the present field and is capable of simulating field transitions like those observed. Some insight is obtained into what one can hope to learn from paleomagnetic records. In particular, it is crucial that the effects of smoothing in the remanence acquisition process be separated from true geomagnetic field behavior. This might enable us to determine the time constants associated with the dominant field configuration during a reversal.
NASA Astrophysics Data System (ADS)
Schieve, William C.; Horwitz, Lawrence P.
2009-04-01
1. Foundations of quantum statistical mechanics; 2. Elementary examples; 3. Quantum statistical master equation; 4. Quantum kinetic equations; 5. Quantum irreversibility; 6. Entropy and dissipation: the microscopic theory; 7. Global equilibrium: thermostatics and the microcanonical ensemble; 8. Bose-Einstein ideal gas condensation; 9. Scaling, renormalization and the Ising model; 10. Relativistic covariant statistical mechanics of many particles; 11. Quantum optics and damping; 12. Entanglements; 13. Quantum measurement and irreversibility; 14. Quantum Langevin equation: quantum Brownian motion; 15. Linear response: fluctuation and dissipation theorems; 16. Time dependent quantum Green's functions; 17. Decay scattering; 18. Quantum statistical mechanics, extended; 19. Quantum transport with tunneling and reservoir ballistic transport; 20. Black hole thermodynamics; Appendix; Index.
Statistical distribution sampling
NASA Technical Reports Server (NTRS)
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
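A minimal sketch of determining the distribution of a statistic by sampling (the statistic, sample size, and parent distribution are chosen arbitrarily for illustration):

```python
import numpy as np

# Build the sampling distribution of the sample mean empirically by
# repeated sampling, and compare its spread with the central-limit
# prediction sigma / sqrt(n) (for the unit exponential, sigma = 1).
rng = np.random.default_rng(42)
n, reps = 50, 20000
means = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
print(means.std(), 1.0 / np.sqrt(n))  # empirical vs CLT standard error
```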
Statistical Properties of Online Auctions
NASA Astrophysics Data System (ADS)
Namazi, Alireza; Schadschneider, Andreas
We characterize the statistical properties of a large number of online auctions run on eBay. Both stationary and dynamic properties, like distributions of prices, number of bids etc., as well as relations between these quantities are studied. The analysis of the data reveals surprisingly simple distributions and relations, typically of power-law form. Based on these findings we introduce a simple method to identify suspicious auctions that could be influenced by a form of fraud known as shill bidding. Furthermore the influence of bidding strategies is discussed. The results indicate that the observed behavior is related to a mixture of agents using a variety of strategies.
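A common first step in characterizing power-law-like distributions such as those reported here is a maximum-likelihood (Hill-type) fit of the exponent above a cutoff. The sketch below applies it to synthetic Pareto data; the data and cutoff are illustrative, not eBay auction data.

```python
import numpy as np

def powerlaw_mle(x, x_min):
    """Hill estimator for the exponent alpha of p(x) ~ x^(-alpha), x >= x_min."""
    x = np.asarray(x, dtype=float)
    x = x[x >= x_min]
    return 1.0 + len(x) / np.sum(np.log(x / x_min))

# Synthetic Pareto sample with known exponent, via inverse-CDF sampling.
rng = np.random.default_rng(7)
alpha_true = 2.5
u = rng.random(100000)
samples = (1 - u) ** (-1.0 / (alpha_true - 1.0))  # x_min = 1
print(powerlaw_mle(samples, 1.0))                 # should be close to 2.5
```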
Effects of flare definitions on the statistics of derived flare distributions
NASA Astrophysics Data System (ADS)
Ryan, D. F.; Dominique, M.; Seaton, D.; Stegen, K.; White, A.
2016-08-01
The statistical examination of solar flares is crucial to revealing their global characteristics and behaviour. Such examinations can tackle large-scale science questions or give context to detailed single-event studies. However, they are often performed using standard but basic flare detection algorithms relying on arbitrary thresholds. This arbitrariness may lead to important scientific conclusions being drawn from results caused by subjective choices in algorithms rather than the true nature of the Sun. In this paper, we explore the effect of the arbitrary thresholds used in the Geostationary Operational Environmental Satellite (GOES) event list and Large Yield RAdiometer (LYRA) Flare Finder algorithms. We find that there is a small but significant relationship between the power law exponent of the GOES flare peak flux frequency distribution and the flare start thresholds of the algorithms. We also find that the power law exponents of these distributions are not stable, but appear to steepen with increasing peak flux. This implies that the observed flare size distribution may not be a power law at all. We show that depending on the true value of the exponent of the flare size distribution, this deviation from a power law may be due to flares missed by the flare detection algorithms. However, it is not possible determine the true exponent from GOES/XRS observations. Additionally we find that the PROBA2/LYRA flare size distributions are artificially steep and clearly non-power law. We show that this is consistent with an insufficient degradation correction. This means that PROBA2/LYRA should not be used for flare statistics or energetics unless degradation is adequately accounted for. However, it can be used to study variations over shorter timescales and for space weather monitoring.
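The threshold sensitivity discussed above can be sketched with synthetic data: flares drawn from a pure power law are thinned by a hypothetical detection probability that is incomplete near threshold, and the fitted exponent is biased unless the fit starts well above that threshold. The detection model and all values here are invented for illustration, not GOES or LYRA behavior.

```python
import numpy as np

def hill(x, x_min):
    """Maximum-likelihood exponent of p(x) ~ x^(-alpha) for x >= x_min."""
    x = x[x >= x_min]
    return 1.0 + len(x) / np.sum(np.log(x / x_min))

rng = np.random.default_rng(3)
flux = (1 - rng.random(200000)) ** (-1.0)       # power law with alpha = 2
detect_p = np.clip((flux - 1.0) / 2.0, 0, 1)    # detection incomplete below flux = 3
seen = flux[rng.random(len(flux)) < detect_p]   # flares that survive detection
# Fitting too close to threshold flattens the distribution; fitting well
# above it recovers the true exponent.
print(hill(seen, 1.5), hill(seen, 5.0))
```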
On real statistics of relaxation in gases
NASA Astrophysics Data System (ADS)
Kuzovlev, Yu. E.
2016-02-01
Using the example of a particle interacting with an ideal gas, it is shown that the statistics of collisions in statistical mechanics, at any value of the gas rarefaction parameter, qualitatively differ from those conjugated with Boltzmann's hypothetical molecular chaos and kinetic equation. In reality, the probability of collisions of the particle is itself random. Because of this, the relaxation of the particle velocity acquires a power-law asymptotic behavior. An estimate of its exponent is suggested on the basis of simple kinematic reasoning.
Phase Statistics of Seismic Coda Waves
Anache-Menier, D.; Tiggelen, B. A. van; Margerin, L.
2009-06-19
We report the analysis of the statistics of the phase fluctuations in the coda of earthquakes recorded during a temporary experiment deployed at Pinyon Flats Observatory, California. The observed distributions of the spatial derivatives of the phase in the seismic coda exhibit universal power-law decays whose exponents agree accurately with circular Gaussian statistics. The correlation function of the phase derivative is measured and used to estimate the mean free path of Rayleigh waves. PMID:19659054
Ge, Jiajia; Santanam, Lakshmi; Noel, Camille; Parikh, Parag J.
2013-03-15
Purpose: To evaluate whether planning 4-dimensional computed tomography (4DCT) can adequately represent daily motion of abdominal tumors in regularly fractionated and stereotactic body radiation therapy (SBRT) patients. Methods and Materials: Intrafractional tumor motion of 10 patients with abdominal tumors (4 pancreas-fractionated and 6 liver-stereotactic patients) with implanted fiducials was measured based on daily orthogonal fluoroscopic movies over 38 treatment fractions. The internal margin needed for at least 90% of tumor coverage was calculated based on the 5th and 95th percentiles of daily 3-dimensional tumor motion. The planning internal margin was generated by fusing 4DCT motion from all phase bins. The disagreement between the needed and planning internal margins was analyzed fraction by fraction in 3 motion axes (superior-inferior [SI], anterior-posterior [AP], and left-right [LR]). The 4DCT margin was considered an overestimation/underestimation of daily motion when the disagreement exceeded at least 3 mm in the SI axis and/or 1.2 mm in the AP and LR axes (4DCT image resolution). The underlying reasons for this disagreement were evaluated based on interfractional and intrafractional breathing variation. Results: The 4DCT overestimated daily 3-dimensional motion in 39% of the fractions in 7 of 10 patients and underestimated it in 53% of the fractions in 8 of 10 patients. Median underestimation was 3.9 mm, 3.0 mm, and 1.7 mm in the SI, AP, and LR axes, respectively. The 4DCT was found to capture irregular deep breaths in 3 of 10 patients, with 4DCT motion larger than the mean daily amplitude by 18 to 21 mm. The breathing pattern varied from breath to breath and day to day. The intrafractional variation of amplitude was significantly larger than the interfractional variation (2.7 mm vs 1.3 mm) in the primary motion axis (ie, SI axis). The SBRT patients showed significantly larger intrafractional amplitude variation than fractionated patients (3.0 mm vs 2
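The percentile-based margin calculation described in the Methods can be sketched as follows; the per-axis daily displacement values below are synthetic, not the study's fiducial measurements.

```python
import numpy as np

# For each motion axis, take the 5th and 95th percentiles of the daily
# displacements as the internal margin covering 90% of observed positions.
rng = np.random.default_rng(5)
daily_motion_mm = {                 # hypothetical daily displacements (mm)
    "SI": rng.normal(0, 4.0, 38),   # superior-inferior: largest motion
    "AP": rng.normal(0, 2.0, 38),   # anterior-posterior
    "LR": rng.normal(0, 1.0, 38),   # left-right: smallest motion
}
for axis, d in daily_motion_mm.items():
    lo, hi = np.percentile(d, [5, 95])
    print(f"{axis}: internal margin {lo:+.1f} to {hi:+.1f} mm")
```

Comparing these per-fraction margins against the margin derived from the fused 4DCT phase bins is then what flags each fraction as over- or underestimated.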
Is serum or sputum eosinophil cationic protein level adequate for diagnosis of mild asthma?
Khakzad, Mohammad Reza; Mirsadraee, Majid; Sankian, Mojtaba; Varasteh, Abdolreza; Meshkat, Mojtaba
2009-09-01
Spirometry has been used as a common diagnostic test in asthma, but most patients with mild asthma have an FEV1 within the normal range, so other diagnostic methods are usually needed. The aim of this study was to evaluate whether eosinophil cationic protein (ECP) could be an accurate diagnostic marker of mild asthma. In this study, diagnosis of asthma was made according to internationally accepted criteria, and asthma severity was evaluated according to the frequency of symptoms and FEV1. Adequate sputum samples were obtained from 50 untreated subjects. A control group of 12 normal subjects with a PC20 greater than 8 mg/dL was also examined. Sputum was induced by inhalation of hypertonic saline, and inflammatory cells in sputum smears were assessed semi-quantitatively. ECP and IgE concentrations, eosinophil (EO) percentage, and the ECP/EO ratio in serum and sputum were also determined. Cough and dyspnea were the most frequent clinical findings; dyspnea and wheezing were the symptoms that correlated with asthma staging. FEV1 was within the normal range (more than 80% of predicted) in 22 (44%) subjects. Asthmatic patients showed significantly higher blood eosinophil counts (4.5 +/- 3.1% vs. 1.2 +/- 0.2%, P = 0.009) and higher serum ECP levels than the control group (3.1 +/- 2.6% and 22.6 +/- 15.8 ng/mL, respectively). The sputum ECP level in asthmatics was significantly higher than in non-asthmatics (55.3 +/- 29.8 ng/mL vs. 25.0 +/- 24.7 ng/mL, P = 0.045). Regression analysis showed no significant correlation between spirometric parameters and biomarkers; the only exception was a significant correlation between FEF(25-75) and serum ECP (r = 0.28, P = 0.041). Among clinical symptoms, wheezing was significantly correlated with elevation of most biomarkers. Since serum and sputum ECP levels are elevated in untreated asthmatics, the ECP level could be used for accurate diagnosis of the mild form of asthma, in which spirometry is unremarkable. PMID:20124607
Human milk feeding supports adequate growth in infants ≤ 1250 grams birth weight
2013-01-01
Background Despite current nutritional strategies, premature infants remain at high risk for extrauterine growth restriction. The use of an exclusive human milk-based diet is associated with decreased incidence of necrotizing enterocolitis (NEC), but concerns exist about infants achieving adequate growth. The objective of this study was to evaluate growth velocities and incidence of extrauterine growth restriction in infants ≤ 1250 grams (g) birth weight (BW) receiving an exclusive human milk-based diet with early and rapid advancement of fortification using a donor human milk derived fortifier. Methods In a single center, prospective observational cohort study, preterm infants weighing ≤ 1250 g BW were fed an exclusive human milk-based diet until 34 weeks postmenstrual age. Human milk fortification with donor human milk derived fortifier was started at 60 mL/kg/d and advanced to provide 6 to 8 additional kilocalories per ounce (or 0.21 to 0.28 kilocalories per gram). Data for growth were compared to historical growth standards and previous human milk-fed cohorts. Results We consecutively evaluated 104 infants with mean gestational age of 27.6 ± 2.0 weeks and BW of 913 ± 181 g (mean ± standard deviation). Weight gain was 24.8 ± 5.4 g/kg/day with length 0.99 ± 0.23 cm/week and head circumference 0.72 ± 0.14 cm/week. There were 3 medical NEC cases and 1 surgical NEC case. 22 infants (21%) were small for gestational age at birth. Overall, 45 infants (43%) had extrauterine growth restriction. Weight velocity was affected by day of fortification (p = 0.005) and day of full feeds (p = 0.02). Our cohort had significantly greater growth in weight and length compared to previous entirely human milk-fed cohorts. Conclusions A feeding protocol for infants ≤ 1250 g BW providing an exclusive human milk-based diet with early and rapid advancement of fortification leads to growth meeting targeted standards with a low rate of extrauterine growth restriction. Consistent
Toward Understanding the Role of Technological Tools in Statistical Learning.
ERIC Educational Resources Information Center
Ben-Zvi, Dani
2000-01-01
Begins with some context setting on new views of statistics and statistical education reflected in the introduction of exploratory data analysis (EDA) into the statistics curriculum. Introduces a detailed example of an EDA learning activity in the middle school that makes use of the power of the spreadsheet to mediate students' construction of…
NASA Astrophysics Data System (ADS)
Bieg, Bohdan; Chrzanowski, Janusz; Kravtsov, Yury A.; Orsitto, Francesco
Basic principles and recent findings of the quasi-isotropic approximation (QIA) of the geometrical optics method are presented in a compact manner. QIA was developed in 1969 to describe electromagnetic waves in weakly anisotropic media. QIA represents the wave field as a power series in two small parameters, one of which is the traditional geometrical optics parameter, equal to the ratio of the wavelength to the plasma characteristic scale, and the other is the largest component of the anisotropy tensor. As a result, QIA is ideally suited to tokamak polarimetry/interferometry systems in the submillimeter range, where plasma manifests the properties of a weakly anisotropic medium.
Explorations in Statistics: Regression
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2011-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive connection.…
Multidimensional Visual Statistical Learning
ERIC Educational Resources Information Center
Turk-Browne, Nicholas B.; Isola, Phillip J.; Scholl, Brian J.; Treat, Teresa A.
2008-01-01
Recent studies of visual statistical learning (VSL) have demonstrated that statistical regularities in sequences of visual stimuli can be automatically extracted, even without intent or awareness. Despite much work on this topic, however, several fundamental questions remain about the nature of VSL. In particular, previous experiments have not…
ERIC Educational Resources Information Center
Huberty, Carl J.
An approach to statistical testing, which combines Neyman-Pearson hypothesis testing and Fisher significance testing, is recommended. The use of P-values in this approach is discussed in some detail. The author also discusses some problems which are often found in introductory statistics textbooks. The problems involve the definitions of…
Deconstructing Statistical Analysis
ERIC Educational Resources Information Center
Snell, Joel
2014-01-01
Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or to make a new product or service appear legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…
ERIC Educational Resources Information Center
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
Understanding Undergraduate Statistical Anxiety
ERIC Educational Resources Information Center
McKim, Courtney
2014-01-01
The purpose of this study was to understand undergraduate students' views of statistics. Results reveal that students with less anxiety have a higher interest in statistics and also believe in their ability to perform well in the course. Also students who have a more positive attitude about the class tend to have a higher belief in their…
Croarkin, M. Carroll
2001-01-01
For more than 50 years, the Statistical Engineering Division (SED) has been instrumental in the success of a broad spectrum of metrology projects at NBS/NIST. This paper highlights fundamental contributions of NBS/NIST statisticians to statistics and to measurement science and technology. Published methods developed by SED staff, especially during the early years, endure as cornerstones of statistics not only in metrology and standards applications, but as data-analytic resources used across all disciplines. The history of statistics at NBS/NIST began with the formation of what is now the SED. Examples from the first five decades of the SED illustrate the critical role of the division in the successful resolution of a few of the highly visible, and sometimes controversial, statistical studies of national importance. A review of the history of major early publications of the division on statistical methods, design of experiments, and error analysis and uncertainty is followed by a survey of several thematic areas. The accompanying examples illustrate the importance of SED in the history of statistics, measurements and standards: calibration and measurement assurance, interlaboratory tests, development of measurement methods, Standard Reference Materials, statistical computing, and dissemination of measurement technology. A brief look forward sketches the expanding opportunity and demand for SED statisticians created by current trends in research and development at NIST.
ERIC Educational Resources Information Center
Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain
2004-01-01
Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…
ERIC Educational Resources Information Center
Council of Ontario Universities, Toronto.
Summary statistics on application and registration patterns of applicants wishing to pursue full-time study in first-year places in Ontario universities (for the fall of 1987) are given. Data on registrations were received indirectly from the universities as part of their annual submission of USIS/UAR enrollment data to Statistics Canada and MCU.…
Introduction to Statistical Physics
NASA Astrophysics Data System (ADS)
Casquilho, João Paulo; Ivo Cortez Teixeira, Paulo
2014-12-01
Preface; 1. Random walks; 2. Review of thermodynamics; 3. The postulates of statistical physics. Thermodynamic equilibrium; 4. Statistical thermodynamics – developments and applications; 5. The classical ideal gas; 6. The quantum ideal gas; 7. Magnetism; 8. The Ising model; 9. Liquid crystals; 10. Phase transitions and critical phenomena; 11. Irreversible processes; Appendixes; Index.
Reform in Statistical Education
ERIC Educational Resources Information Center
Huck, Schuyler W.
2007-01-01
Two questions are considered in this article: (a) What should professionals in school psychology do in an effort to stay current with developments in applied statistics? (b) What should they do with their existing knowledge to move from surface understanding of statistics to deep understanding? Written for school psychologists who have completed…
Statistical Mapping by Computer.
ERIC Educational Resources Information Center
Utano, Jack J.
The function of a statistical map is to provide readers with a visual impression of the data so that they may be able to identify any geographic characteristics of the displayed phenomena. The increasingly important role played by the computer in the production of statistical maps is manifested by the varied examples of computer maps in recent…
The purpose of the Disability Statistics Center is to produce and disseminate statistical information on disability and the status of people with disabilities in American society and to establish and monitor indicators of how conditions are changing over time to meet their health...
Januszyk, Michael; Gurtner, Geoffrey C
2011-01-01
The scope of biomedical research has expanded rapidly during the past several decades, and statistical analysis has become increasingly necessary to understand the meaning of large and diverse quantities of raw data. As such, a familiarity with this lexicon is essential for critical appraisal of medical literature. This article attempts to provide a practical overview of medical statistics, with an emphasis on the selection, application, and interpretation of specific tests. This includes a brief review of statistical theory and its nomenclature, particularly with regard to the classification of variables. A discussion of descriptive methods for data presentation is then provided, followed by an overview of statistical inference and significance analysis, and detailed treatment of specific statistical tests and guidelines for their interpretation. PMID:21200241
Statistics of Data Fitting: Flaws and Fixes of Polynomial Analysis of Channeled Spectra
NASA Astrophysics Data System (ADS)
Karstens, William; Smith, David
2013-03-01
Starting from general statistical principles, we have critically examined Baumeister's procedure* for determining the refractive index of thin films from channeled spectra. Briefly, the method assumes that the index and interference fringe order may be approximated by polynomials quadratic and cubic in photon energy, respectively. The coefficients of the polynomials are related by differentiation, which is equivalent to comparing energy differences between fringes. However, we find that when the fringe order is calculated from the published IR index for silicon* and then analyzed with Baumeister's procedure, the results do not reproduce the original index. This problem has been traced to 1. Use of unphysical powers in the polynomials (e.g., time-reversal invariance requires that the index is an even function of photon energy), and 2. Use of insufficient terms of the correct parity. Exclusion of unphysical terms and addition of quartic and quintic terms to the index and order polynomials yields significantly better fits with fewer parameters. This represents a specific example of using statistics to determine if the assumed fitting model adequately captures the physics contained in experimental data. The use of analysis of variance (ANOVA) and the Durbin-Watson statistic to test criteria for the validity of least-squares fitting will be discussed. *D.F. Edwards and E. Ochoa, Appl. Opt. 19, 4130 (1980). Supported in part by the US Department of Energy, Office of Nuclear Physics under contract DE-AC02-06CH11357.
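The Durbin-Watson statistic mentioned at the end of this abstract tests least-squares residuals for serial correlation: values near 2 suggest uncorrelated residuals, while values near 0 or 4 suggest the fitting model misses structure in the data. A minimal sketch with invented residuals:

```python
# Hypothetical residuals (invented) from a least-squares fit, listed in x order.
residuals = [0.5, -0.3, 0.2, -0.4, 0.1, 0.3, -0.2, -0.1]

# Durbin-Watson statistic: sum of squared successive differences
# divided by the sum of squared residuals; it ranges from 0 to 4.
num = sum((residuals[i] - residuals[i - 1]) ** 2 for i in range(1, len(residuals)))
den = sum(r * r for r in residuals)
dw = num / den

print(round(dw, 2))
```

A value well above 2, as here, hints at alternating (negatively correlated) residuals, one symptom of fitting with terms of the wrong parity.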
Statistical regimes of random laser fluctuations
Lepri, Stefano; Cavalieri, Stefano; Oppo, Gian-Luca; Wiersma, Diederik S.
2007-06-15
Statistical fluctuations of the light emitted from amplifying random media are studied theoretically and numerically. The characteristic scales of the diffusive motion of light lead to Gaussian or power-law (Levy) distributed fluctuations depending on external control parameters. In the Levy regime, the output pulse is highly irregular, leading to huge deviations from a mean-field description. Monte Carlo simulations of a simplified model which includes the population of the medium demonstrate the two statistical regimes and provide a comparison with dynamical rate equations. The different statistics of the fluctuations help to explain recent experimental observations reported in the literature.
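The qualitative contrast between the two regimes can be sketched with a toy Monte Carlo experiment; the samplers and parameters below are illustrative, not the authors' model. In a heavy-tailed (Levy-like) regime a single rare event can dominate the total output, while Gaussian-like sums concentrate near their mean:

```python
import random

random.seed(42)

# Fraction of the total signal contributed by the single largest draw.
def max_share(sampler, n=1000):
    draws = [sampler() for _ in range(n)]
    return max(draws) / sum(draws)

# Gaussian-like regime: many comparable contributions, none dominates.
gauss_share = max_share(lambda: abs(random.gauss(1.0, 0.1)))

# Levy-like regime: a heavy-tailed Pareto sampler with alpha < 1, where
# rare huge draws dominate the sum, mimicking highly irregular pulses.
levy_share = max_share(lambda: random.paretovariate(0.8))

print(gauss_share, levy_share)
```

The largest Gaussian draw contributes a negligible fraction of the sum; the largest heavy-tailed draw contributes a macroscopic one.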
Ector, Hugo
2010-12-01
I still remember my first book on statistics: "Elementary Statistics with Applications in Medicine and the Biological Sciences" by Frederick E. Croxton. For me, it was the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long series of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to "P < 0.05 = ok". I do not blame my colleagues who omit the paragraph on statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in well understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar. It is a summary and a transcription of the best pages I have detected. PMID:21302664
NASA Astrophysics Data System (ADS)
Maslov, V. P.; Maslova, T. V.
2013-07-01
We introduce several new notions in mathematical statistics that bridge the gap between this discipline and statistical physics. The analogy between them is useful both for mathematics and for physics. What is more, this new mathematical statistics is adequate for the study of computer networks and self-teaching systems. The role of the web in sociological and economic research is ascertained.
Statistical mechanics of complex networks
NASA Astrophysics Data System (ADS)
Waclaw, B.
2007-04-01
The science of complex networks is a new interdisciplinary branch of science which has arisen recently at the interface of physics, biology, social and computer sciences, and others. Its main goal is to discover the general laws governing the creation and growth of networks such as the Internet, transportation networks, or neural networks, as well as the processes taking place on them. It turned out that most real-world networks cannot simply be reduced to a compound of individual components. Fortunately, statistical mechanics, one of the pillars of modern physics, provides us with a very powerful set of tools and methods for describing and understanding these systems. In this thesis, we present a consistent approach to complex networks based on statistical mechanics, with the central role played by the concept of a statistical ensemble of networks. We show how to construct such a theory and present some practical problems where it can be applied. Among them, we pay attention to the problem of finite-size corrections and to the dynamics of a simple model of mass transport on networks.
Winters, Ryan; Winters, Andrew; Amedee, Ronald G.
2010-01-01
The Accreditation Council for Graduate Medical Education sets forth a number of required educational topics that must be addressed in residency and fellowship programs. We sought to provide a primer on some of the important basic statistical concepts to consider when examining the medical literature. It is not essential to understand the exact workings and methodology of every statistical test encountered, but it is necessary to understand selected concepts such as parametric and nonparametric tests, correlation, and numerical versus categorical data. This working knowledge will allow you to spot obvious irregularities in statistical analyses that you encounter. PMID:21603381
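The distinction between parametric and nonparametric tests that the authors highlight can be sketched directly from the definitions. The group data below are invented, and the Mann-Whitney U is computed from its pair-counting definition rather than a library call:

```python
import math
import statistics

# Invented measurements from two groups (e.g. two treatment arms).
a = [5.1, 4.9, 5.4, 5.0, 5.3]
b = [4.6, 4.8, 4.5, 4.9, 4.7]

# Parametric route: Welch's two-sample t statistic compares means scaled
# by their standard errors and assumes roughly normal numerical data.
va, vb = statistics.variance(a), statistics.variance(b)
t = (statistics.mean(a) - statistics.mean(b)) / math.sqrt(va / len(a) + vb / len(b))

# Nonparametric route: the Mann-Whitney U statistic counts, over all pairs,
# how often an 'a' value exceeds a 'b' value (ties count one half), so it
# uses only order information and needs no normality assumption.
u = sum(1.0 if x > y else 0.5 if x == y else 0.0 for x in a for y in b)

print(t, u)
```

Seeing both computed side by side makes it easier to spot when a paper applies a parametric test to data that only support an ordinal comparison.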
The Importance and Efficacy of Using Statistics in the High School Chemistry Laboratory
ERIC Educational Resources Information Center
Matsumoto, Paul S.
2006-01-01
In high school, many students do not have an opportunity to learn or use statistics. Because statistics is a powerful tool and many students take a statistics class in college, prior exposure to statistics in a chemistry course (or another course) would benefit students. This paper describes some statistical concepts and tests with their…
Playing at Statistical Mechanics
ERIC Educational Resources Information Center
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
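The Fermi-Dirac and Bose-Einstein distributions discussed in the article give the mean occupation of a single-particle state; a minimal sketch (units where k_B = 1):

```python
import math

# Mean occupation number of a single-particle state with energy e,
# chemical potential mu, and temperature T (units where k_B = 1).
def fermi_dirac(e, mu, T):
    # Occupation never exceeds 1: the Pauli exclusion principle.
    return 1.0 / (math.exp((e - mu) / T) + 1.0)

def bose_einstein(e, mu, T):
    # Valid only for e > mu; occupation diverges as e approaches mu.
    return 1.0 / (math.exp((e - mu) / T) - 1.0)

print(fermi_dirac(1.0, 1.0, 0.5))  # at e = mu the occupation is exactly 0.5
```

The sign difference in the denominator is the whole distinction the sorting game illustrates: at the same energy above mu, bosons pile up more readily than fermions.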
Cooperative Learning in Statistics.
ERIC Educational Resources Information Center
Keeler, Carolyn M.; And Others
1994-01-01
Formal use of cooperative learning techniques proved effective in improving student performance and retention in a freshman level statistics course. Lectures interspersed with group activities proved effective in increasing conceptual understanding and overall class performance. (11 references) (Author)
Titanic: A Statistical Exploration.
ERIC Educational Resources Information Center
Takis, Sandra L.
1999-01-01
Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)
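The chi-square analysis of categorical data described here can be sketched directly from the definition of Pearson's statistic. The 2x2 counts below are invented for illustration and are not the actual Titanic figures:

```python
# Invented 2x2 contingency table (not the actual Titanic counts):
# rows = passenger class, columns = survived / died.
table = [[200, 120],
         [150, 530]]

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
grand = sum(row_totals)

# Pearson chi-square: sum over cells of (observed - expected)^2 / expected,
# where expected = row total * column total / grand total.
chi2 = 0.0
for i in range(2):
    for j in range(2):
        expected = row_totals[i] * col_totals[j] / grand
        chi2 += (table[i][j] - expected) ** 2 / expected

print(chi2)  # compare against the chi-square distribution with 1 df
```

A large value relative to the 1-degree-of-freedom chi-square distribution indicates that survival and class are associated, which is the question the classroom activity poses.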
... and Statistics. Plague in the United States: Plague was first introduced ... per year in the United States: 1900-2012. Plague Worldwide: Plague epidemics have occurred in Africa, Asia, ...
NASA Astrophysics Data System (ADS)
Grégoire, G.
2016-05-01
This chapter is devoted to two objectives. The first is to answer the request, expressed by attendees of the first Astrostatistics School (Annecy, October 2013), to be provided with an elementary vademecum of statistics that would facilitate understanding of the courses given. In this spirit we recall very basic notions, that is, definitions and properties that we think sufficient to benefit from the courses given in the Astrostatistics School. Thus we briefly give definitions and elementary properties of random variables and vectors, distributions, estimation and tests, and maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation and, given the space devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: a basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors, and Gaussian mixture models.
Tuberculosis Data and Statistics
... Data and Statistics ... United States publication. PDF [6 MB]. Interactive TB Data Tool: Online Tuberculosis Information System (OTIS). OTIS is ...
NASA Astrophysics Data System (ADS)
Richfield, Jon; bookfeller
2016-07-01
In reply to Ralph Kenna and Pádraig Mac Carron's feature article “Maths meets myths” in which they describe how they are using techniques from statistical physics to characterize the societies depicted in ancient Icelandic sagas.
... facts and statistics here include brain and central nervous system tumors (including spinal cord, pituitary and pineal gland ... U.S. living with a primary brain and central nervous system tumor. This year, nearly 17,000 people will ...
Purposeful Statistical Investigations
ERIC Educational Resources Information Center
Day, Lorraine
2014-01-01
Lorraine Day provides us with a great range of statistical investigations using various resources such as maths300 and TinkerPlots. Each of the investigations links mathematics to students' lives and provides engaging and meaningful contexts for mathematical inquiry.
Röher, Katharina; Göpfert, Matthias S
2015-07-01
In the light of a rising percentage of women among employees in anaesthesia and intensive care, designing adequate workplaces for pregnant employees plays an increasingly important role. Here it is necessary to align the varied interests of the pregnant employee, fellow employees, and the employer, with the legal requirements of the Maternity Protection Act ("Mutterschutzgesetz") forming the statutory framework. This review describes how adequate workplaces for pregnant employees in anaesthesia and intensive care can be established, considering the scientific evidence on the subject. PMID:26230896
Oakland, J.S.
1986-01-01
Addressing the increasing importance for firms to have a thorough knowledge of statistically based quality control procedures, this book presents the fundamentals of statistical process control (SPC) in a non-mathematical, practical way. It provides real-life examples and data drawn from a wide variety of industries. The foundations of good quality management and process control, and control of conformance and consistency during production are given. Offers clear guidance to those who wish to understand and implement modern SPC techniques.
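The SPC techniques the book covers center on control charts; below is a minimal individuals-chart sketch using the common three-sigma rule. The measurements are invented, and sigma here is the plain sample standard deviation, whereas SPC practice usually estimates it from moving ranges:

```python
import statistics

# Invented individual measurements from a production process.
samples = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.0, 10.4]

# Shewhart-style individuals chart: center line at the mean, control
# limits at +/- 3 sigma. Points outside the limits signal that the
# process may be out of statistical control.
center = statistics.mean(samples)
sigma = statistics.stdev(samples)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

out_of_control = [x for x in samples if not (lcl <= x <= ucl)]
print(center, out_of_control)
```

With these stable data no point breaches the limits, which is the "consistency during production" the book's control charts are designed to confirm.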
NASA Technical Reports Server (NTRS)
Levy, R.; Mcginness, H.
1976-01-01
Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
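A stochastic wind-speed simulator of the kind described can be sketched with a Weibull sampler, a standard model for wind-speed statistics; the scale and shape parameters below are illustrative, not the Goldstone estimates:

```python
import random
import statistics

random.seed(1)

# Hourly wind-speed sampler using a Weibull distribution (parameters
# are illustrative, not fitted to the Goldstone records).
def sample_hourly_speeds(n_hours, scale=6.0, shape=2.0):
    return [random.weibullvariate(scale, shape) for _ in range(n_hours)]

speeds = sample_hourly_speeds(24 * 30)  # one month of hourly samples
mean_speed = statistics.mean(speeds)

# Wind power scales with the cube of speed, so the mean of s**3 exceeds
# the cube of the mean speed whenever speeds fluctuate at all.
mean_cubed = statistics.mean(s ** 3 for s in speeds)

print(mean_speed, mean_cubed > mean_speed ** 3)
```

This uncorrelated sampler corresponds to the interim model in the abstract; the stochastic model additionally reproduces the hour-to-hour speed correlations.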
Electric power annual 1995. Volume II
1996-12-01
This document summarizes pertinent statistics on various aspects of the U.S. electric power industry for the year and includes a graphic presentation. Data is included on electric utility retail sales and revenues, financial statistics, environmental statistics of electric utilities, demand-side management, electric power transactions, and non-utility power producers.
Statistical Physics of Particles
NASA Astrophysics Data System (ADS)
Kardar, Mehran
2006-06-01
Statistical physics has its origins in attempts to describe the thermal properties of matter in terms of its constituent particles, and has played a fundamental role in the development of quantum mechanics. Based on lectures for a course in statistical mechanics taught by Professor Kardar at Massachusetts Institute of Technology, this textbook introduces the central concepts and tools of statistical physics. It contains a chapter on probability and related issues such as the central limit theorem and information theory, and covers interacting particles, with an extensive description of the van der Waals equation and its derivation by mean field approximation. It also contains an integrated set of problems, with solutions to selected problems at the end of the book. It will be invaluable for graduate and advanced undergraduate courses in statistical physics. A complete set of solutions is available to lecturers on a password protected website at www.cambridge.org/9780521873420. Based on lecture notes from a course on Statistical Mechanics taught by the author at MIT. Contains 89 exercises, with solutions to selected problems. Contains chapters on probability and interacting particles. Ideal for graduate courses in Statistical Mechanics.
NASA Astrophysics Data System (ADS)
Kardar, Mehran
2006-06-01
While many scientists are familiar with fractals, fewer are familiar with the concepts of scale-invariance and universality which underlie the ubiquity of their shapes. These properties may emerge from the collective behaviour of simple fundamental constituents, and are studied using statistical field theories. Based on lectures for a course in statistical mechanics taught by Professor Kardar at Massachusetts Institute of Technology, this textbook demonstrates how such theories are formulated and studied. Perturbation theory, exact solutions, renormalization groups, and other tools are employed to demonstrate the emergence of scale invariance and universality, and the non-equilibrium dynamics of interfaces and directed paths in random media are discussed. Ideal for advanced graduate courses in statistical physics, it contains an integrated set of problems, with solutions to selected problems at the end of the book. A complete set of solutions is available to lecturers on a password protected website at www.cambridge.org/9780521873413. Based on lecture notes from a course on Statistical Mechanics taught by the author at MIT. Contains 65 exercises, with solutions to selected problems. Features a thorough introduction to the methods of statistical field theory. Ideal for graduate courses in Statistical Physics.
Advice on statistical analysis for Circulation Research.
Kusuoka, Hideo; Hoffman, Julien I E
2002-10-18
Since the late 1970s when many journals published articles warning about the misuse of statistical methods in the analysis of data, researchers have become more careful about statistical analysis, but errors including low statistical power and inadequate analysis of repeated-measurement studies are still prevalent. In this review, several statistical methods are introduced that are not always familiar to basic and clinical cardiologists but may be useful for revealing the correct answer from the data. The aim of this review is not only to draw the attention of investigators to these tests but also to stress the conditions in which they are applicable. These methods are now generally available in statistical program packages. Researchers need not know how to calculate the statistics from the data but are required to select the correct method from the menu and interpret the statistical results accurately. With the choice of appropriate statistical programs, the issue is no longer how to do the test but when to do it. PMID:12386142
Fietkau, Rainer (E-mail: rainer.fietkau@med.uni-rostock.de); Roedel, Claus; Hohenberger, Werner; Raab, Rudolf; Hess, Clemens; Liersch, Torsten; Becker, Heinz; Wittekind, Christian; Hutter, Matthias; Hager, Eva; Karstens, Johann; Ewald, Hermann; Christen, Norbert; Jagoditsch, Michael; Martus, Peter; Sauer, Rolf
2007-03-15
Purpose: The impact of the delivery of radiotherapy (RT) on treatment results in rectal cancer patients is unknown. Methods and Materials: The data from 788 patients with rectal cancer treated within the German CAO/AIO/ARO-94 phase III trial were analyzed concerning the impact of the delivery of RT (adequate RT: minimal RT dose delivered, 4300 cGy for neoadjuvant RT or 4700 cGy for adjuvant RT; completion of RT in <44 days for neoadjuvant RT or <49 days for adjuvant RT) in different centers on the locoregional recurrence rate (LRR) and disease-free survival (DFS) at 5 years. The LRR, DFS, and delivery of RT were analyzed as endpoints in multivariate analysis. Results: A significant difference was found between the centers and the delivery of RT. The overall delivery of RT was a prognostic factor for the LRR (no RT, 29.6% ± 7.8%; inadequate RT, 21.2% ± 5.6%; adequate RT, 6.8% ± 1.4%; p = 0.0001) and DFS (no RT, 55.1% ± 9.1%; inadequate RT, 57.4% ± 6.3%; adequate RT, 69.1% ± 2.3%; p = 0.02). Postoperatively, delivery of RT was a prognostic factor for LRR on multivariate analysis (together with pathologic stage) but not for DFS (independent parameters, pathologic stage and age). Preoperatively, on multivariate analysis, pathologic stage, but not delivery of RT, was an independent prognostic parameter for LRR and DFS (together with adequate chemotherapy). On multivariate analysis, the treatment center, treatment schedule (neoadjuvant vs. adjuvant RT), and gender were prognostic parameters for adequate RT. Conclusion: Delivery of RT should be regarded as a prognostic factor for LRR in rectal cancer and is influenced by the treatment center, treatment schedule, and patient gender.