42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 3 2014-10-01 false Adequate financial records, statistical data, and ... § 417.568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...
Explorations in Statistics: Power
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2010-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four things affect…
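The definition above, power as the probability of rejecting the null hypothesis when it is false, can be made concrete with a short sketch. The scenario below (a two-sided one-sample z-test with known variance; the effect size and sample size are invented for illustration and are not the article's own example) computes power both analytically and by simulation:

```python
import math
import random

def normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ztest_power(delta: float, sigma: float, n: int) -> float:
    """Analytic power of a two-sided one-sample z-test at alpha = 0.05:
    P(reject H0: mu = 0 | true mean = delta), with known sigma."""
    z_crit = 1.959963984540054  # Phi^{-1}(0.975)
    shift = delta * math.sqrt(n) / sigma
    return normal_cdf(-z_crit + shift) + normal_cdf(-z_crit - shift)

def simulated_power(delta: float, sigma: float, n: int,
                    reps: int = 20000, seed: int = 1) -> float:
    """Monte Carlo check: fraction of simulated sample means that
    land in the rejection region when the true mean is delta."""
    rng = random.Random(seed)
    z_crit = 1.959963984540054
    se = sigma / math.sqrt(n)
    hits = sum(abs(rng.gauss(delta, se)) / se > z_crit for _ in range(reps))
    return hits / reps

analytic = ztest_power(delta=0.5, sigma=1.0, n=30)
empirical = simulated_power(delta=0.5, sigma=1.0, n=30)
```

Raising the sample size or effect size, lowering the variability, or relaxing alpha each increase the power, which is the kind of dependence the installment goes on to explore.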
Tan, Ming T; Liu, Jian-ping; Lao, Lixing
2012-08-01
Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, the qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences was discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis, while qualitative analysis can serve as a secondary criterion. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.
Statistical Power in Meta-Analysis
ERIC Educational Resources Information Center
Liu, Jin
2015-01-01
Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation on two sample mean difference test under different situations: (1) the discrepancy between the analytical power and…
Statistical Design of Electric Power Transmission Networks.
NASA Astrophysics Data System (ADS)
Guvenis, Albert
This thesis presents a statistical planning method for expanding electric power transmission networks in the presence of load/generation and network uncertainties. The objective of the proposed algorithm is to obtain a set of alternative network expansion plans which have optimum reliability and expansion cost. Present methods optimize the existing network based on a deterministic performance index. Therefore, they are limited to a nominal design and cannot minimize the loss-of-load probability due to random fluctuations in the power system parameters. This thesis addresses the problem of designing a power transmission network that minimizes the loss-of-load probability (or equivalently maximizes the reliability) under random load/generation and network fluctuations. Two probabilistic indices, reliability and adequacy, are defined in order to quantify the transmission network performance under uncertainties. Reliability is defined as the probability of supplying the random substation load demands under random circuit outages, whereas adequacy is the same index computed under the non-outage condition. The strength of the proposed statistical planning method is its ability to optimize these two indices by using efficient gradient methods. The approach taken is to first optimize the adequacy using a set-imbedding technique. Then the adequate network obtained in the first design stage is reoptimized with respect to reliability using a modified Parametric Sampling technique. The reliability optimization method developed in this thesis optimizes the discrete reliability of the network by successively optimizing approximate continuous functions. It is shown that the solutions of the continuous optimization problems converge to the solution of the discrete reliability optimization problem. Two objectives, reliability and expansion cost, are optimized simultaneously. By weighting these objectives differently, a set of alternative expansion plans is obtained.
Statistical Performances of Resistive Active Power Splitter
NASA Astrophysics Data System (ADS)
Lalléchère, Sébastien; Ravelo, Blaise; Thakur, Atul
2016-03-01
In this paper, the synthesis and sensitivity analysis of an active power splitter (PWS) is proposed. It is based on an active cell composed of a Field Effect Transistor in cascade with a shunted resistor at the input and the output (resistive amplifier topology). The PWS uncertainty versus resistance tolerances is assessed using a stochastic method. Furthermore, with the proposed topology, the device gain can be easily controlled by varying a resistance. This provides a useful tool for analyzing the statistical sensitivity of the system in an uncertain environment.
Evaluating and Reporting Statistical Power in Counseling Research
ERIC Educational Resources Information Center
Balkin, Richard S.; Sheperis, Carl J.
2011-01-01
Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…
Toward "Constructing" the Concept of Statistical Power: An Optical Analogy.
ERIC Educational Resources Information Center
Rogers, Bruce G.
This paper presents a visual analogy that instructors can use to teach the concept of statistical power in statistics courses. Statistical power is mathematically defined as the probability of rejecting a null hypothesis when that null is false, or, equivalently, the probability of detecting a relationship when it exists. The analogy…
Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power
ERIC Educational Resources Information Center
Miciak, Jeremy; Taylor, W. Pat; Stuebing, Karla K.; Fletcher, Jack M.; Vaughn, Sharon
2016-01-01
An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated…
The Importance of Teaching Power in Statistical Hypothesis Testing
ERIC Educational Resources Information Center
Olinsky, Alan; Schumacher, Phyllis; Quinn, John
2012-01-01
In this paper, we discuss the importance of teaching power considerations in statistical hypothesis testing. Statistical power analysis determines the ability of a study to detect a meaningful effect size, where the effect size is the difference between the hypothesized value of the population parameter under the null hypothesis and the true value…
New Dynamical-Statistical Techniques for Wind Power Prediction
NASA Astrophysics Data System (ADS)
Stathopoulos, C.; Kaperoni, A.; Galanis, G.; Kallos, G.
2012-04-01
The increased use of renewable energy sources, and especially of wind power, has revealed the significance of accurate environmental and wind power predictions over wind farms, which critically affect the integration of the produced power into the general grid. This issue is studied in the present paper by means of high-resolution physical and statistical models. Two numerical weather prediction (NWP) systems, namely SKIRON and RAMS, are used to simulate the flow characteristics in selected wind farms in Greece. The NWP model output is post-processed by utilizing Kalman and Kolmogorov statistics in order to remove systematic errors. Modeled wind predictions in combination with available on-site observations are used to estimate the wind power potential by utilizing a variety of statistical power prediction models based on non-linear and hyperbolic functions. The obtained results reveal the strong dependence of forecast uncertainty on wind variation, the limited influence of previously recorded power values, and the advantages that non-linear, non-polynomial functions can have in the successful control of power curve characteristics. This methodology was developed within the framework of the FP7 projects WAUDIT and MARINA PLATFORM.
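One standard way to remove slowly varying systematic error from NWP output, in the spirit of the Kalman post-processing mentioned above, is a scalar Kalman filter that tracks the forecast bias as a random-walk state. The sketch below uses synthetic data and hypothetical noise variances; it is a generic illustration, not the paper's implementation:

```python
import math
import random

def kalman_bias_correct(forecasts, observations, q=0.01, r=1.0):
    """Track the systematic error b_t ~ forecast - truth as a random
    walk and subtract the running estimate from each new forecast.
    q: process-noise variance, r: observation-noise variance."""
    b, p = 0.0, 1.0              # bias estimate and its variance
    corrected = []
    for f, y in zip(forecasts, observations):
        corrected.append(f - b)  # correct using the estimate so far
        p += q                   # predict: bias drifts as a random walk
        k = p / (p + r)          # Kalman gain
        b += k * ((f - y) - b)   # update with the newly observed error
        p *= 1.0 - k
    return corrected

rng = random.Random(7)
truth = [8.0 + 2.0 * math.sin(t / 20.0) for t in range(400)]   # wind speed (m/s)
forecasts = [w + 1.5 + rng.gauss(0.0, 0.5) for w in truth]     # NWP output with a +1.5 m/s bias
observations = [w + rng.gauss(0.0, 0.1) for w in truth]        # on-site measurements

corrected = kalman_bias_correct(forecasts, observations)
mae_raw = sum(abs(f - w) for f, w in zip(forecasts, truth)) / len(truth)
mae_corr = sum(abs(c - w) for c, w in zip(corrected[50:], truth[50:])) / len(truth[50:])
```

After a short burn-in the filter has learned the bias, and the corrected error is dominated by the irreducible forecast noise rather than the systematic offset.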
Improved Noise-Power Estimators Based On Order Statistics
NASA Technical Reports Server (NTRS)
Zimmerman, George A.
1995-01-01
A technique based on order statistics enables the design of improved noise-power estimators. In the original intended application, the noise-power estimators are part of the microwave-signal-processing system of the Search for Extraterrestrial Intelligence project. The technique involves limiting the dynamic range of the value to be estimated, making it possible to achieve the performance of an order-statistical estimator with simple algorithms and equipment and with only one pass over the input data. The technique is also applicable to other signal-detection systems and to image-detection systems required to exhibit constant false-alarm rates.
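The idea behind an order-statistic noise-power estimator can be sketched as follows. Assuming the noise power samples are exponentially distributed (the case for complex Gaussian noise), the sample median divided by ln 2 estimates the mean noise power while remaining nearly insensitive to a minority of strong-signal outliers. This is a generic illustration of the principle, not the SETI system's actual estimator:

```python
import math
import random
import statistics

def median_noise_power(power_samples):
    """Order-statistic estimate of mean noise power: for an
    exponential distribution the median equals mean * ln 2,
    so median / ln 2 recovers the mean."""
    return statistics.median(power_samples) / math.log(2.0)

rng = random.Random(3)
true_power = 2.0
samples = [rng.expovariate(1.0 / true_power) for _ in range(5000)]

# contaminate 2% of the bins with strong narrowband "signals"
for i in range(0, len(samples), 50):
    samples[i] += 100.0

mean_est = sum(samples) / len(samples)      # dragged upward by the signals
robust_est = median_noise_power(samples)    # nearly unaffected
```

The sample mean is biased upward by roughly the contamination fraction times the signal power, while the median-based estimate stays close to the true noise floor, which is the constant-false-alarm-rate property the abstract alludes to.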
Statistical Modeling of Soi Devices for Low-Power Electronics.
NASA Astrophysics Data System (ADS)
Phelps, Mark Joseph
1995-01-01
This dissertation addresses the needs of low-power, large-scale integrated circuit device design, advanced materials technology, and computer simulation for statistical modeling. The main body of work comprises the creation and implementation of a software shell (STADIUM-SOI) that automates the application of statistics to commercial technology computer-aided design tools. The objective is to demonstrate that statistical design of experiments methodology can be employed for the advanced material technology of Silicon-On-Insulator (SOI) devices. The culmination of this effort was the successful modeling of the effect of manufacturing process variation on SOI device characteristics and the automation of this procedure.
Statistical Power Analysis in Education Research. NCSER 2010-3006
ERIC Educational Resources Information Center
Hedges, Larry V.; Rhoads, Christopher
2010-01-01
This paper provides a guide to calculating statistical power for the complex multilevel designs that are used in most field studies in education research. For multilevel evaluation studies in the field of education, it is important to account for the impact of clustering on the standard errors of estimates of treatment effects. Using ideas from…
Power Curve Modeling in Complex Terrain Using Statistical Models
NASA Astrophysics Data System (ADS)
Bulaevskaya, V.; Wharton, S.; Clifton, A.; Qualley, G.; Miller, W.
2014-12-01
Traditional power output curves typically model power only as a function of the wind speed at the turbine hub height. While the latter is an essential predictor of power output, wind speed information in other parts of the vertical profile, as well as additional atmospheric variables, are also important determinants of power. The goal of this work was to determine the gain in predictive ability afforded by adding wind speed information at other heights, as well as other atmospheric variables, to the power prediction model. Using data from a wind farm with a moderately complex terrain in the Altamont Pass region in California, we trained three statistical models, a neural network, a random forest and a Gaussian process model, to predict power output from various sets of aforementioned predictors. The comparison of these predictions to the observed power data revealed that considerable improvements in prediction accuracy can be achieved both through the addition of predictors other than the hub-height wind speed and the use of statistical models. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344 and was funded by Wind Uncertainty Quantification Laboratory Directed Research and Development Project at LLNL under project tracking code 12-ERD-069.
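A hub-height-only baseline of the kind these statistical models improve upon is the "method of bins": average the observed power within wind-speed bins. The sketch below trains such a curve on synthetic turbine data (a cubic power rise to a rated plateau, with all parameters invented for illustration):

```python
import random

def binned_power_curve(speeds, powers, bin_width=1.0):
    """Method of bins: average observed power within each wind-speed
    bin; a simple hub-height-only power curve estimator."""
    sums, counts = {}, {}
    for s, p in zip(speeds, powers):
        b = int(s // bin_width)
        sums[b] = sums.get(b, 0.0) + p
        counts[b] = counts.get(b, 0) + 1
    return {b: sums[b] / counts[b] for b in sums}

def predict(curve, speed, bin_width=1.0):
    """Look up the bin average, falling back to the nearest populated bin."""
    b = int(speed // bin_width)
    if b in curve:
        return curve[b]
    near = min(curve, key=lambda k: abs(k - b))
    return curve[near]

rng = random.Random(11)

def true_power(s):
    # idealized turbine: cubic region capped at a 1500 kW rated plateau
    return min(1500.0, 2.0 * s ** 3)

speeds = [rng.uniform(3.0, 14.0) for _ in range(3000)]
powers = [true_power(s) + rng.gauss(0.0, 50.0) for s in speeds]
curve = binned_power_curve(speeds, powers)
```

Statistical models such as random forests or Gaussian processes extend this by accepting additional predictors (shear, stability, turbulence) instead of a single binned speed.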
Robust Statistical Detection of Power-Law Cross-Correlation
NASA Astrophysics Data System (ADS)
Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert
2016-06-01
We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.
"Using Power Tables to Compute Statistical Power in Multilevel Experimental Designs"
ERIC Educational Resources Information Center
Konstantopoulos, Spyros
2009-01-01
Power computations for one-level experimental designs that assume simple random samples are greatly facilitated by power tables such as those presented in Cohen's book about statistical power analysis. However, in education and the social sciences experimental designs have naturally nested structures and multilevel models are needed to compute the…
Statistical analyses support power law distributions found in neuronal avalanches.
Klaus, Andreas; Yu, Shan; Plenz, Dietmar
2011-01-01
The size distribution of neuronal avalanches in cortical networks has been reported to follow a power law distribution with exponent close to -1.5, which is a reflection of long-range spatial correlations in spontaneous neuronal activity. However, identifying power law scaling in empirical data can be difficult and sometimes controversial. In the present study, we tested the power law hypothesis for neuronal avalanches by using more stringent statistical analyses. In particular, we performed the following steps: (i) analysis of finite-size scaling to identify scale-free dynamics in neuronal avalanches, (ii) model parameter estimation to determine the specific exponent of the power law, and (iii) comparison of the power law to alternative model distributions. Consistent with critical state dynamics, avalanche size distributions exhibited robust scaling behavior in which the maximum avalanche size was limited only by the spatial extent of sampling ("finite size" effect). This scale-free dynamics suggests the power law as a model for the distribution of avalanche sizes. Using both the Kolmogorov-Smirnov statistic and a maximum likelihood approach, we found the slope to be close to -1.5, which is in line with previous reports. Finally, the power law model for neuronal avalanches was compared to the exponential and to various heavy-tail distributions based on the Kolmogorov-Smirnov distance and by using a log-likelihood ratio test. Both the power law distribution without and with exponential cut-off provided significantly better fits to the cluster size distributions in neuronal avalanches than the exponential, the lognormal and the gamma distribution. In summary, our findings strongly support the power law scaling in neuronal avalanches, providing further evidence for critical state dynamics in superficial layers of cortex.
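The maximum-likelihood step for the exponent can be sketched with the standard continuous-power-law estimator of Clauset, Shalizi and Newman. The synthetic exponent below is arbitrary, and the full pipeline used in such studies (xmin selection by Kolmogorov-Smirnov distance, likelihood-ratio comparisons against alternative distributions) is omitted:

```python
import math
import random

def powerlaw_mle(xs, xmin):
    """Continuous power-law MLE: alpha_hat = 1 + n / sum(ln(x / xmin))
    over the tail x >= xmin."""
    tail = [x for x in xs if x >= xmin]
    n = len(tail)
    return 1.0 + n / sum(math.log(x / xmin) for x in tail)

rng = random.Random(5)
alpha, xmin = 2.5, 1.0
# inverse-transform sampling: x = xmin * (1 - u)^(-1/(alpha - 1))
data = [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
        for _ in range(20000)]
alpha_hat = powerlaw_mle(data, xmin)
```

The estimator's standard error is approximately (alpha - 1)/sqrt(n), so with 20,000 samples the recovered exponent is tight; for avalanche size distributions the same machinery applies with an exponent magnitude near 1.5.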
Statistical analysis of cascading failures in power grids
Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin
2010-12-01
We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators and lines. Our model is quasi-static in the causal, discrete-time, and sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.
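The Direct Current (DC) approximation such models build on reduces power flow to a linear system B'θ = P in the bus voltage angles. A minimal sketch on a hypothetical symmetric 3-bus network (bus 1 as slack; reactances and loads invented for illustration, and the cascading logic itself omitted):

```python
# DC power flow on a 3-bus network: solve B' theta = P for the
# non-slack angles, then recover line flows f_ij = (theta_i - theta_j) / x_ij.
x12 = x13 = x23 = 0.1                  # line reactances (p.u.), hypothetical
b12, b13, b23 = 1 / x12, 1 / x13, 1 / x23
p2, p3 = -0.5, -0.3                    # bus injections (p.u.); loads are negative

# reduced susceptance matrix over buses 2 and 3 (slack angle fixed at 0)
B = [[b12 + b23, -b23],
     [-b23, b13 + b23]]
det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
th2 = ( B[1][1] * p2 - B[0][1] * p3) / det
th3 = (-B[1][0] * p2 + B[0][0] * p3) / det

flow12 = (0.0 - th2) * b12             # slack -> bus 2
flow13 = (0.0 - th3) * b13             # slack -> bus 3
flow23 = (th2 - th3) * b23             # bus 2 -> bus 3

# the slack bus supplies exactly the total load
slack_output = flow12 + flow13
```

A cascade simulation would repeatedly remove any line whose flow exceeds its capacity and re-solve this linear system until no overloads remain, which is the sequential failure resolution the abstract describes.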
The Role of Atmospheric Measurements in Wind Power Statistical Models
NASA Astrophysics Data System (ADS)
Wharton, S.; Bulaevskaya, V.; Irons, Z.; Newman, J. F.; Clifton, A.
2015-12-01
The simplest wind power generation curves model power only as a function of the wind speed at turbine hub-height. While the latter is an essential predictor of power output, it is widely accepted that wind speed information in other parts of the vertical profile, as well as additional atmospheric variables including atmospheric stability, wind veer, and hub-height turbulence are also important factors. The goal of this work is to determine the gain in predictive ability afforded by adding additional atmospheric measurements to the power prediction model. In particular, we are interested in quantifying any gain in predictive ability afforded by measurements taken from a laser detection and ranging (lidar) instrument, as lidar provides high spatial and temporal resolution measurements of wind speed and direction at 10 or more levels throughout the rotor-disk and at heights well above. Co-located lidar and meteorological tower data as well as SCADA power data from a wind farm in Northern Oklahoma will be used to train a set of statistical models. In practice, most wind farms continue to rely on atmospheric measurements taken from less expensive, in situ instruments mounted on meteorological towers to assess turbine power response to a changing atmospheric environment. Here, we compare a large suite of atmospheric variables derived from tower measurements to those taken from lidar to determine if remote sensing devices add any competitive advantage over tower measurements alone to predict turbine power response.
Development and testing of improved statistical wind power forecasting methods.
Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J.
2011-12-06
Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios
Statistical Models of Power-law Distributions in Homogeneous Plasmas
Roth, Ilan
2011-01-04
A variety of in-situ measurements in space plasmas point to an intermittent formation of distribution functions with elongated tails and power-law behavior at high energies. Power laws form a ubiquitous signature of many complex systems, plasma being a good example of non-Boltzmann behavior in the distribution functions of energetic particles. Particles, which either undergo mutual collisions or are scattered in phase space by electromagnetic fluctuations, exhibit statistical properties that are determined by the transition probability density function of a single interaction, while their non-asymptotic evolution may determine the observed high-energy populations. It is shown that relaxation of the Brownian motion assumptions leads to non-analytical characteristic functions and to a generalization of the Fokker-Planck equation with fractional derivatives that results in power-law solutions parameterized by the probability density function.
Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques
NASA Technical Reports Server (NTRS)
Kuan, Gary M
2008-01-01
The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of microarcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so an optical power margin can be budgeted. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
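The statistical budgeting idea can be sketched as a Monte Carlo sum of element losses in decibels, with the confidence level read off the resulting distribution. All element values, tolerances, and the requirement below are hypothetical, not SIM's actual budget:

```python
import random

def loss_db_total(rng, nominal_losses_db, tol_db):
    """One Monte Carlo draw of end-to-end attenuation: each element's
    loss varies uniformly within +/- its tolerance of the nominal value."""
    return sum(rng.uniform(l - t, l + t)
               for l, t in zip(nominal_losses_db, tol_db))

rng = random.Random(2)
# hypothetical element budget (dB): coupling, splitter, alignment, attenuation
nominal = [0.5, 3.2, 0.8, 1.0]
tol = [0.2, 0.3, 0.4, 0.3]
source_dbm = 0.0        # 1 mW input, hypothetical
required_dbm = -7.0     # minimum power at the detector, hypothetical

draws = sorted(source_dbm - loss_db_total(rng, nominal, tol)
               for _ in range(20000))
worst_95 = draws[int(0.05 * len(draws))]   # 5th-percentile delivered power
confidence = sum(d >= required_dbm for d in draws) / len(draws)
```

Working in dB makes the element losses additive, so the distribution of the total is easy to sample and the margin against the requirement falls out as a single confidence number.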
Statistical power for detecting trends with applications to seabird monitoring
Hatch, Shyla A.
2003-01-01
Power analysis is helpful in defining goals for ecological monitoring and evaluating the performance of ongoing efforts. I examined detection standards proposed for population monitoring of seabirds using two programs (MONITOR and TRENDS) specially designed for power analysis of trend data. Neither program models within- and among-years components of variance explicitly and independently, thus an error term that incorporates both components is an essential input. Residual variation in seabird counts consisted of day-to-day variation within years and unexplained variation among years in approximately equal parts. The appropriate measure of error for power analysis is the standard error of estimation (S.E.est) from a regression of annual means against year. Replicate counts within years are helpful in minimizing S.E.est but should not be treated as independent samples for estimating power to detect trends. Other issues include a choice of assumptions about variance structure and selection of an exponential or linear model of population change. Seabird count data are characterized by strong correlations between S.D. and mean, thus a constant CV model is appropriate for power calculations. Time series were fit about equally well with exponential or linear models, but log transformation ensures equal variances over time, a basic assumption of regression analysis. Using sample data from seabird monitoring in Alaska, I computed the number of years required (with annual censusing) to detect trends of -1.4% per year (50% decline in 50 years) and -2.7% per year (50% decline in 25 years). At α = 0.05 and a desired power of 0.9, estimated study intervals ranged from 11 to 69 years depending on species, trend, software, and study design. Power to detect a negative trend of 6.7% per year (50% decline in 10 years) is suggested as an alternative standard for seabird monitoring that achieves a reasonable match between statistical and biological significance.
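The power computation described here can be sketched by direct simulation under the constant-CV, log-linear model the author recommends: generate log-scale counts with a fixed annual decline, regress against year, and count detections. The critical value below is hard-coded for n = 15 years (df = 13), and the 10% CV is an invented example, not the Alaska data:

```python
import math
import random

T_CRIT = 2.160  # two-sided 5% critical t for df = 13 (n = 15 years)

def slope_t(ys):
    """OLS slope t-statistic for equally spaced years 0..n-1."""
    n = len(ys)
    tbar = (n - 1) / 2.0
    ybar = sum(ys) / n
    sxx = sum((t - tbar) ** 2 for t in range(n))
    sxy = sum((t - tbar) * (y - ybar) for t, y in enumerate(ys))
    b = sxy / sxx
    resid = [y - ybar - b * (t - tbar) for t, y in enumerate(ys)]
    s2 = sum(r * r for r in resid) / (n - 2)
    return b / math.sqrt(s2 / sxx)

def trend_power(annual_rate, cv, n_years=15, reps=4000, seed=9):
    """Fraction of simulated log-count series in which the decline is
    detected (constant-CV noise becomes additive on the log scale)."""
    rng = random.Random(seed)
    slope = math.log(1.0 + annual_rate)
    hits = 0
    for _ in range(reps):
        ys = [slope * t + rng.gauss(0.0, cv) for t in range(n_years)]
        if slope_t(ys) < -T_CRIT:   # one-tailed detection of a decline
            hits += 1
    return hits / reps

power_strong = trend_power(-0.027, 0.10)  # 50% decline in 25 years
power_weak = trend_power(-0.014, 0.10)    # 50% decline in 50 years
```

The contrast between the two rates mirrors the abstract's finding: the steeper decline is detected reliably within the study interval, while the shallow one often is not.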
Spatial factors affecting statistical power in testing marine fauna displacement.
Pérez Lapeña, B; Wijnberg, K M; Stein, A; Hulscher, S J M H
2011-10-01
Impacts of offshore wind farms on marine fauna are largely unknown. Therefore, one commonly adheres to the precautionary principle, which states that one shall take action to avoid potentially damaging impacts on marine ecosystems, even when full scientific certainty is lacking. We implement this principle by means of a statistical power analysis including spatial factors. Implementation is based on geostatistical simulations, accommodating for zero-inflation in species data. We investigate scenarios in which an impact assessment still has to be carried out. Our results show that the environmental conditions at the time of the survey are the most influential factor affecting power. This is followed by survey effort and species abundance in the reference situation. Spatial dependence in species numbers at local scales affects power, but its effect is smaller for the scenarios investigated. Our findings can be used to improve the effectiveness of the economic investment in monitoring surveys. In addition, unnecessary extra survey effort, and related costs, can be avoided when spatial dependence in species abundance is present and no improvement in power is achieved.
Error, Power, and Blind Sentinels: The Statistics of Seagrass Monitoring
Schultz, Stewart T.; Kruschel, Claudia; Bakran-Petricioli, Tatjana; Petricioli, Donat
2015-01-01
We derive statistical properties of standard methods for monitoring of habitat cover worldwide, and criticize them in the context of mandated seagrass monitoring programs, as exemplified by Posidonia oceanica in the Mediterranean Sea. We report the novel result that cartographic methods with non-trivial classification errors are generally incapable of reliably detecting habitat cover losses less than about 30 to 50%, and the field labor required to increase their precision can be orders of magnitude higher than that required to estimate habitat loss directly in a field campaign. We derive a universal utility threshold of classification error in habitat maps that represents the minimum habitat map accuracy above which direct methods are superior. Widespread government reliance on blind-sentinel methods for monitoring seafloor can obscure the gradual and currently ongoing losses of benthic resources until the time has long passed for meaningful management intervention. We find two classes of methods with very high statistical power for detecting small habitat cover losses: 1) fixed-plot direct methods, which are over 100 times as efficient as direct random-plot methods in a variable habitat mosaic; and 2) remote methods with very low classification error such as geospatial underwater videography, which is an emerging, low-cost, non-destructive method for documenting small changes at millimeter visual resolution. General adoption of these methods and their further development will require a fundamental cultural change in conservation and management bodies towards the recognition and promotion of requirements of minimal statistical power and precision in the development of international goals for monitoring these valuable resources and the ecological services they provide. PMID:26367863
Toward improved statistical treatments of wind power forecast errors
NASA Astrophysics Data System (ADS)
Hart, E.; Jacobson, M. Z.
2011-12-01
The ability of renewable resources to reliably supply electric power demand is of considerable interest in the context of growing renewable portfolio standards and the potential for future carbon markets. Toward this end, a number of probabilistic models have been applied to the problem of grid integration of intermittent renewables, such as wind power. Most of these models rely on simple Markov or autoregressive models of wind forecast errors. While these models generally capture the bulk statistics of wind forecast errors, they often fail to reproduce accurate ramp rate distributions and do not accurately describe extreme forecast error events, both of which are of considerable interest to those assessing system reliability. The problem often lies in characterizing and reproducing not only the magnitude of wind forecast errors, but also the timing or phase errors (i.e., when a front passes over a wind farm). Here we compare time series of wind power data produced using different forecast error models to determine the best approach for capturing errors in both magnitude and phase. Additionally, new metrics are presented to characterize forecast quality with respect to both considerations.
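As a point of reference for the class of models the abstract critiques, a first-order autoregressive (AR(1)) error model can be sketched in a few lines. All parameter values here are illustrative, not taken from the paper:

```python
import random
import statistics

def ar1_forecast_errors(n_steps, phi=0.8, sigma=0.1, seed=42):
    """Simulate wind power forecast errors as an AR(1) process.

    phi sets the temporal persistence of the error; sigma is the
    innovation standard deviation. Both values are illustrative.
    """
    rng = random.Random(seed)
    errors = [0.0]
    for _ in range(n_steps - 1):
        errors.append(phi * errors[-1] + rng.gauss(0.0, sigma))
    return errors

errors = ar1_forecast_errors(10_000)
# Bulk statistics: mean near 0, stationary standard deviation near
# sigma / sqrt(1 - phi**2), about 0.167 for the defaults above.
print(statistics.mean(errors), statistics.stdev(errors))
```

Such a model reproduces the bulk statistics well, which is exactly why it can still misrepresent ramp-rate tails and phase errors, the failure modes the paper targets.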
ERIC Educational Resources Information Center
Spybrook, Jessaca
2008-01-01
This study examines the reporting of power analyses in the group randomized trials funded by the Institute of Education Sciences from 2002 to 2006. A detailed power analysis provides critical information that allows reviewers to (a) replicate the power analysis and (b) assess whether the parameters used in the power analysis are reasonable.…
Statistical Properties of Maximum Likelihood Estimators of Power Law Spectra Information
NASA Technical Reports Server (NTRS)
Howell, L. W., Jr.
2003-01-01
A simple power law model consisting of a single spectral index, σ1, is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10^13 eV, with a transition at the knee energy, E_k, to a steeper spectral index σ2 > σ1 above E_k. The maximum likelihood (ML) procedure was developed for estimating the single parameter σ1 of a simple power law energy spectrum and generalized to estimate the three spectral parameters of the broken power law energy spectrum from simulated detector responses and real cosmic-ray data. The statistical properties of the ML estimator were investigated and shown to have three desirable properties: (P1) consistency (asymptotically unbiased), (P2) efficiency (asymptotically attains the Cramer-Rao minimum variance bound), and (P3) asymptotic normality, under a wide range of potential detector response functions. Attainment of these properties necessarily implies that the ML estimation procedure provides the best unbiased estimator possible. While simulation studies can easily determine whether a given estimation procedure provides an unbiased estimate of the spectral information, and whether or not the estimator is approximately normally distributed, attainment of the Cramer-Rao bound (CRB) can only be ascertained by calculating the CRB for an assumed energy spectrum-detector response function combination, which can be quite formidable in practice. However, the effort in calculating the CRB is very worthwhile because it provides the necessary means to compare the efficiency of competing estimation techniques and, furthermore, provides a stopping rule in the search for the best unbiased estimator. Consequently, the CRB for both the simple and broken power law energy spectra are derived herein, and the conditions under which they are attained in practice are investigated.
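For the single-index case with an idealized (perfect) detector response, the ML estimator has a closed form; a minimal sketch under that simplifying assumption follows (the index 2.7 and threshold are illustrative GCR-like values, not the paper's fitted results):

```python
import math
import random

def sample_power_law(n, alpha, e_min, rng):
    # Inverse-CDF sampling from f(E) proportional to E**(-alpha), E >= e_min
    return [e_min * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
            for _ in range(n)]

def ml_spectral_index(events, e_min):
    # Closed-form ML estimator of a single power-law spectral index:
    # alpha_hat = 1 + n / sum(ln(E_i / e_min))
    return 1.0 + len(events) / sum(math.log(e / e_min) for e in events)

rng = random.Random(1)
events = sample_power_law(50_000, alpha=2.7, e_min=1.0, rng=rng)
alpha_hat = ml_spectral_index(events, 1.0)
print(alpha_hat)
```

With a realistic detector response function, as the abstract notes, no closed form is available and the likelihood must be maximized numerically against the simulated response.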
Relative Costs and Statistical Power in the Extreme Groups Approach
ERIC Educational Resources Information Center
Abrahams, Norman M.; Alf, Edward F., Jr.
1978-01-01
The relationship between variables in applied and experimental research is often investigated by the use of extreme groups. Recent analytical work has provided an extreme-groups procedure that is more powerful than the standard correlational approach. The present article provides procedures to optimize power, and thus resources, in such studies.…
Mar, Raymond A; Spreng, R Nathan; Deyoung, Colin G
2013-09-01
Personality neuroscience involves examining relations between cognitive or behavioral variability and neural variables like brain structure and function. Such studies have uncovered a number of fascinating associations but require large samples, which are expensive to collect. Here, we propose a system that capitalizes on neuroimaging data commonly collected for separate purposes and combines it with new behavioral data to test novel hypotheses. Specifically, we suggest that groups of researchers compile a database of structural (i.e., anatomical) and resting-state functional scans produced for other task-based investigations and pair these data with contact information for the participants who contributed the data. This contact information can then be used to collect additional cognitive, behavioral, or individual-difference data that are then reassociated with the neuroimaging data for analysis. This would allow for novel hypotheses regarding brain-behavior relations to be tested on the basis of large sample sizes (with adequate statistical power) for low additional cost. This idea can be implemented at small scales at single institutions, among a group of collaborating researchers, or perhaps even within a single lab. It can also be implemented at a large scale across institutions, although doing so would entail a number of additional complications.
An Examination of Statistical Power in Multigroup Dynamic Structural Equation Models
ERIC Educational Resources Information Center
Prindle, John J.; McArdle, John J.
2012-01-01
This study used statistical simulation to calculate differential statistical power in dynamic structural equation models with groups (as in McArdle & Prindle, 2008). Patterns of between-group differences were simulated to provide insight into how model parameters influence power approximations. Chi-square and root mean square error of…
Enrichment of statistical power for genome-wide association studies
Technology Transfer Automated Retrieval System (TEKTRAN)
The inheritance of most human diseases and agriculturally important traits is controlled by many genes with small effects. Identifying these genes, while simultaneously controlling false positives, is challenging. Among available statistical methods, the mixed linear model (MLM) has been the most fl...
Forbes, Valery E; Aufderheide, John; Warbritton, Ryan; van der Hoeven, Nelly; Caspers, Norbert
2007-03-01
This study presents results of the effects of bisphenol A (BPA) on adult egg production, egg hatchability, egg development rates and juvenile growth rates in the freshwater gastropod, Marisa cornuarietis. We observed no adult mortality, substantial inter-snail variability in reproductive output, and no effects of BPA on reproduction during 12 weeks of exposure to 0, 0.1, 1.0, 16, 160 or 640 microg/L BPA. We observed no effects of BPA on egg hatchability or timing of egg hatching. Juveniles showed good growth in the control and all treatments, and there were no significant effects of BPA on this endpoint. Our results do not support previous claims of enhanced reproduction in Marisa cornuarietis in response to exposure to BPA. Statistical power analysis indicated high levels of inter-snail variability in the measured endpoints and highlighted the need for sufficient replication when testing treatment effects on reproduction in M. cornuarietis with adequate power.
Mathematical Power: Exploring Critical Pedagogy in Mathematics and Statistics
ERIC Educational Resources Information Center
Lesser, Lawrence M.; Blake, Sally
2007-01-01
Though traditionally viewed as value-free, mathematics is actually one of the most powerful, yet underutilized, venues for working towards the goals of critical pedagogy--social, political and economic justice for all. This emerging awareness is due to how critical mathematics educators such as Frankenstein, Skovsmose and Gutstein have applied the…
Statistical Power of Psychological Research: What Have We Gained in 20 Years?
ERIC Educational Resources Information Center
Rossi, Joseph S.
1990-01-01
Calculated power for 6,155 statistical tests in 221 journal articles published in 1982 volumes of the "Journal of Abnormal Psychology," "Journal of Consulting and Clinical Psychology," and "Journal of Personality and Social Psychology." Power to detect small, medium, and large effects was .17, .57, and .83, respectively. Concluded that power of…
How Many Studies Do You Need? A Primer on Statistical Power for Meta-Analysis
ERIC Educational Resources Information Center
Valentine, Jeffrey C.; Pigott, Therese D.; Rothstein, Hannah R.
2010-01-01
In this article, the authors outline methods for using fixed and random effects power analysis in the context of meta-analysis. Like statistical power analysis for primary studies, power analysis for meta-analysis can be done either prospectively or retrospectively and requires assumptions about parameters that are unknown. The authors provide…
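For the fixed-effect case, prospective power follows directly from the pooled standard error of the weighted mean effect. A minimal sketch under a normal approximation (the study count, per-arm sample size, and effect size below are illustrative, not drawn from the article):

```python
import math

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fixed_effect_meta_power(delta, variances, z_crit=1.959964):
    """Approximate power of the two-sided fixed-effect test of a
    common effect delta, given each study's sampling variance."""
    se_pooled = math.sqrt(1.0 / sum(1.0 / v for v in variances))
    z = delta / se_pooled
    return (1.0 - normal_cdf(z_crit - z)) + normal_cdf(-z_crit - z)

# Illustration: 10 studies of d = 0.2 with about 40 subjects per arm;
# var(d) ~ 2/n + d**2/(4n) is the usual large-sample approximation.
k, n, d = 10, 40, 0.2
v = 2.0 / n + d**2 / (4 * n)
power = fixed_effect_meta_power(d, [v] * k)
print(round(power, 3))
```

Random-effects power works the same way except that the between-study variance is added to each study's sampling variance, which is why it is typically lower for the same studies.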
Estimating Statistical Power for Open Enrollment Group Treatment Trials
Morgan-Lopez, Antonio A.; Saavedra, Lissette M.; Hien, Denise A.; Fals-Stewart, William
2010-01-01
Modeling turnover in group membership has been identified as a key barrier contributing to a disconnect between the manner in which behavioral treatment is conducted (open enrollment groups) and the designs of substance abuse treatment trials (closed enrollment groups, individual therapy). Latent class pattern mixture models (LCPMM) are an emerging tool for modeling data from open enrollment groups with membership turnover in recently proposed treatment trials. The current article illustrates an approach to conducting power analyses for open enrollment designs based on Monte Carlo simulation of LCPMMs, using parameters derived from published data from an RCT comparing Seeking Safety to a Community Care condition for women presenting with comorbid PTSD and substance use disorders. The example addresses discrepancies between the analysis framework assumed in power analyses of many recently proposed open enrollment trials and the proposed use of LCPMMs for data analysis. PMID:20832971
Statistical Properties of Maximum Likelihood Estimators of Power Law Spectra Information
NASA Technical Reports Server (NTRS)
Howell, L. W.
2002-01-01
A simple power law model consisting of a single spectral index, α1, is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10^13 eV, with a transition at the knee energy, E_k, to a steeper spectral index α2 > α1 above E_k. The maximum likelihood (ML) procedure was developed for estimating the single parameter α1 of a simple power law energy spectrum and generalized to estimate the three spectral parameters of the broken power law energy spectrum from simulated detector responses and real cosmic-ray data. The statistical properties of the ML estimator were investigated and shown to have three desirable properties: (P1) consistency (asymptotically unbiased), (P2) efficiency (asymptotically attains the Cramer-Rao minimum variance bound), and (P3) asymptotic normality, under a wide range of potential detector response functions. Attainment of these properties necessarily implies that the ML estimation procedure provides the best unbiased estimator possible. While simulation studies can easily determine whether a given estimation procedure provides an unbiased estimate of the spectral information, and whether or not the estimator is approximately normally distributed, attainment of the Cramer-Rao bound (CRB) can only be ascertained by calculating the CRB for an assumed energy spectrum-detector response function combination, which can be quite formidable in practice. However, the effort in calculating the CRB is very worthwhile because it provides the necessary means to compare the efficiency of competing estimation techniques and, furthermore, provides a stopping rule in the search for the best unbiased estimator. Consequently, the CRB for both the simple and broken power law energy spectra are derived herein, and the conditions under which they are attained in practice are investigated. The ML technique is then extended to estimate spectra information from
Efficiency statistics at all times: Carnot limit at finite power.
Polettini, M; Verley, G; Esposito, M
2015-02-01
We derive the statistics of the efficiency under the assumption that thermodynamic fluxes fluctuate with normal law, parametrizing it in terms of time, macroscopic efficiency, and a coupling parameter ζ. The distribution has a peculiar behavior: it has no moments, and it displays one sub-Carnot and one super-Carnot maximum corresponding to reverse operating regimes (engine or pump), with the most probable efficiency decreasing in time. The limit ζ→0, where the Carnot bound can be saturated, gives rise to two extreme situations: one where the machine works at its macroscopic efficiency, with the Carnot limit corresponding to no entropy production, and one where, for a transient time scaling like 1/ζ, microscopic fluctuations are enhanced in such a way that the most probable efficiency approaches the Carnot limit at finite entropy production.
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
ERIC Educational Resources Information Center
Huston, Holly L.
This paper begins with a general discussion of statistical significance, effect size, and power analysis; and concludes by extending the discussion to the multivariate case (MANOVA). Historically, traditional statistical significance testing has guided researchers' thinking about the meaningfulness of their data. The use of significance testing…
Asking Sensitive Questions: A Statistical Power Analysis of Randomized Response Models
ERIC Educational Resources Information Center
Ulrich, Rolf; Schroter, Hannes; Striegel, Heiko; Simon, Perikles
2012-01-01
This article derives the power curves for a Wald test that can be applied to randomized response models when small prevalence rates must be assessed (e.g., detecting doping behavior among elite athletes). These curves enable the assessment of the statistical power that is associated with each model (e.g., Warner's model, crosswise model, unrelated…
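A rough sketch of how such a power curve arises for Warner's model, using a one-sided normal-approximation Wald test. The design probability q = 0.8 and the sample sizes below are illustrative; the article derives the curves rigorously for several models:

```python
import math

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def warner_power(pi, pi0, n, q=0.8, z_alpha=1.644854):
    """One-sided Wald-test power for Warner's randomized response
    model; q is the design probability that a respondent answers
    about his or her true status (the rest answer about its opposite)."""
    def se(p):
        lam = q * p + (1.0 - q) * (1.0 - p)   # P('yes' response)
        return math.sqrt(lam * (1.0 - lam) / n) / abs(2.0 * q - 1.0)
    crit = pi0 + z_alpha * se(pi0)            # rejection boundary under H0
    return 1.0 - normal_cdf((crit - pi) / se(pi))

# Randomization protects respondents but inflates the variance, so
# detecting a small prevalence (5% vs 0%) needs a large sample:
print(round(warner_power(0.05, 0.0, n=2000), 3))
print(round(warner_power(0.05, 0.0, n=200), 3))
```

The 1/(2q-1)² variance inflation is the statistical price of the privacy protection: as q approaches 0.5 (maximal privacy), the standard error, and hence the required sample size, blows up.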
The Statistical Power of the Cluster Randomized Block Design with Matched Pairs--A Simulation Study
ERIC Educational Resources Information Center
Dong, Nianbo; Lipsey, Mark
2010-01-01
This study uses simulation techniques to examine the statistical power of the group-randomized design and the matched-pair (MP) randomized block design under various parameter combinations. Both nearest neighbor matching and random matching are used for the MP design. The power of each design for any parameter combination was calculated from…
Sormani, M; Molyneux, P; Gasperini, C; Barkhof, F; Yousry, T; Miller, D; Filippi, M
1999-01-01
OBJECTIVES: To evaluate the durations of follow up and the reference population sizes needed to achieve optimal and stable statistical powers for two-period crossover and parallel group design clinical trials in multiple sclerosis, when using the numbers of new enhancing lesions and the numbers of active scans as end point variables. METHODS: The statistical power was calculated by means of computer simulations performed using MRI data obtained from 65 untreated relapsing-remitting or secondary progressive patients who were scanned monthly for 9 months. The statistical power was calculated for follow up durations of 2, 3, 6, and 9 months and for sample sizes of 40-100 patients for parallel group and of 20-80 patients for two-period crossover design studies. The stability of the estimated powers was evaluated by applying the same procedure on random subsets of the original data. RESULTS: When using the number of new enhancing lesions as the end point, the statistical power increased for all the simulated treatment effects with the duration of the follow up until 3 months for the parallel group design and until 6 months for the two-period crossover design. Using the number of active scans as the end point, the statistical power steadily increased until 6 months for the parallel group design and until 9 months for the two-period crossover design. The power estimates in the present sample, and the comparisons of these results with those obtained by previous studies with smaller patient cohorts, suggest that statistical power is significantly overestimated when the size of the reference data set decreases for parallel group design studies or the duration of the follow up decreases for two-period crossover studies. CONCLUSIONS: These results should be used to determine the duration of the follow up and the sample size needed when planning MRI-monitored clinical trials in multiple sclerosis. PMID:10201417
Replication Unreliability in Psychology: Elusive Phenomena or “Elusive” Statistical Power?
Tressoldi, Patrizio E.
2012-01-01
The focus of this paper is to analyze whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Under Null Hypothesis Significance Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as examples the results of meta-analyses related to four different controversial phenomena, subliminal semantic priming, incubation effect for problem solving, unconscious thought theory, and non-local perception, it was found that, except for semantic priming on categorization, the statistical power to detect the expected effect size (ES) of the typical study is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low ESs. We conclude by providing some suggestions on how to increase the statistical power, or use different statistical approaches, to help discriminate whether the results obtained may or may not be used to support or to refute the reality of a phenomenon with a small ES. PMID:22783215
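The underpowering argument can be illustrated with the usual normal approximation for a two-sample comparison; the effect sizes and per-group n below are illustrative, not the meta-analytic estimates from the paper:

```python
import math

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def approx_power_two_sample(d, n_per_group, z_crit=1.959964):
    """Normal approximation to the power of a two-sided two-sample
    test of a standardized mean difference d."""
    ncp = d * math.sqrt(n_per_group / 2.0)   # approximate noncentrality
    return (1.0 - normal_cdf(z_crit - ncp)) + normal_cdf(-z_crit - ncp)

# Small effects with modest samples are badly underpowered, so a
# "failed" quasi-exact replication is the expected outcome, not
# necessarily evidence against the phenomenon:
for d in (0.2, 0.5, 0.8):
    print(d, round(approx_power_two_sample(d, 30), 2))
```

With 30 subjects per group, power for a small effect (d = 0.2) sits barely above the false-positive rate, which is the paper's core diagnosis.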
Monte Carlo based statistical power analysis for mediation models: methods and software.
Zhang, Zhiyong
2014-12-01
The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
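The general recipe, simulate many data sets, run the bootstrap test on each, and count rejections, can be sketched without the bmem package for a simple mediation model. All path values, the sample size, and the replication counts below are illustrative and kept small so the sketch runs quickly; bmem handles the nonnormal-data and latent-variable cases the abstract describes:

```python
import random
import statistics

def indirect_effect(x, m, y):
    # OLS closed forms: path a from m ~ x; path b from y ~ m + x
    mx, mm, my = (statistics.fmean(v) for v in (x, m, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    smm = sum((mi - mm) ** 2 for mi in m)
    sxm = sum((xi - mx) * (mi - mm) for xi, mi in zip(x, m))
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    smy = sum((mi - mm) * (yi - my) for mi, yi in zip(m, y))
    a_hat = sxm / sxx
    b_hat = (smy * sxx - sxy * sxm) / (smm * sxx - sxm ** 2)
    return a_hat * b_hat

def bootstrap_power(n=100, a=0.3, b=0.3, c=0.1,
                    n_rep=100, n_boot=99, seed=7):
    """Fraction of simulated data sets whose 95% percentile
    bootstrap CI for the indirect effect a*b excludes zero."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rep):
        x = [rng.gauss(0, 1) for _ in range(n)]
        m = [a * xi + rng.gauss(0, 1) for xi in x]
        y = [b * mi + c * xi + rng.gauss(0, 1) for xi, mi in zip(x, m)]
        boots = []
        for _ in range(n_boot):
            pick = [rng.randrange(n) for _ in range(n)]
            boots.append(indirect_effect([x[i] for i in pick],
                                         [m[i] for i in pick],
                                         [y[i] for i in pick]))
        boots.sort()
        lo = boots[int(0.025 * n_boot)]   # ~2.5th percentile
        hi = boots[int(0.975 * n_boot)]   # ~97.5th percentile
        hits += (lo > 0) or (hi < 0)
    return hits / n_rep

power = bootstrap_power()
print(power)
```

In real planning one would raise n_rep and n_boot (e.g., to 1000 each) so the Monte Carlo error in the power estimate is small.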
Statistical tests, P values, confidence intervals, and power: a guide to misinterpretations.
Greenland, Sander; Senn, Stephen J; Rothman, Kenneth J; Carlin, John B; Poole, Charles; Goodman, Steven N; Altman, Douglas G
2016-04-01
Misinterpretation and abuse of statistical tests, confidence intervals, and statistical power have been decried for decades, yet remain rampant. A key problem is that there are no interpretations of these concepts that are at once simple, intuitive, correct, and foolproof. Instead, correct use and interpretation of these statistics requires an attention to detail which seems to tax the patience of working scientists. This high cognitive demand has led to an epidemic of shortcut definitions and interpretations that are simply wrong, sometimes disastrously so-and yet these misinterpretations dominate much of the scientific literature. In light of this problem, we provide definitions and a discussion of basic statistics that are more general and critical than typically found in traditional introductory expositions. Our goal is to provide a resource for instructors, researchers, and consumers of statistics whose knowledge of statistical theory and technique may be limited but who wish to avoid and spot misinterpretations. We emphasize how violation of often unstated analysis protocols (such as selecting analyses for presentation based on the P values they produce) can lead to small P values even if the declared test hypothesis is correct, and can lead to large P values even if that hypothesis is incorrect. We then provide an explanatory list of 25 misinterpretations of P values, confidence intervals, and power. We conclude with guidelines for improving statistical interpretation and reporting. PMID:27209009
Narayan, Manjari; Allen, Genevera I.
2016-01-01
Many complex brain disorders, such as autism spectrum disorders, exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches—R2 based on resampling and random effects test statistics, and R3 that additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R2 and R3 have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in autism spectrum disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices. PMID:27147940
NASA Astrophysics Data System (ADS)
Ma, W. T.; Sandri, G. vH.; Sarkar, S.
1991-05-01
We use the convolution power of infinite sequences to obtain a novel representation of exponential functions of power series which often arise in statistical mechanics. We thus obtain new formulas for the configuration and cluster integrals of pairwise interacting systems of molecules in an imperfect gas. We prove that the asymptotic behaviour of the Luria-Delbrück distribution is p_n ~ c n^(-2). We derive a new, simple and computationally efficient recursion relation for p_n.
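The recursion, as it is usually quoted in later literature on the Luria-Delbrück distribution, is a direct transcription into code; running it makes both the recursion and the n^(-2) tail easy to check numerically (m = 1 is an illustrative choice of the expected number of mutations):

```python
import math

def luria_delbruck_pmf(m, n_max):
    """Recursion for the Luria-Delbrück mutant-number distribution
    with expected number of mutations m:
        p_0 = exp(-m)
        p_n = (m / n) * sum_{k=0}^{n-1} p_k / (n - k + 1)
    """
    p = [math.exp(-m)]
    for n in range(1, n_max + 1):
        p.append((m / n) * sum(p[k] / (n - k + 1) for k in range(n)))
    return p

p = luria_delbruck_pmf(m=1.0, n_max=2000)
# The heavy tail p_n ~ c * n**-2 means n**2 * p_n should level off
# at a constant for large n (and the mean of the distribution diverges):
print(1000**2 * p[1000], 2000**2 * p[2000])
```

The O(n²) cost of computing the first n probabilities is what makes this recursion "computationally efficient" relative to evaluating the distribution by numerical inversion of its generating function.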
Prospective active marker motion correction improves statistical power in BOLD fMRI.
Muraskin, Jordan; Ooi, Melvyn B; Goldman, Robin I; Krueger, Sascha; Thomas, William J; Sajda, Paul; Brown, Truman R
2013-03-01
Group level statistical maps of blood oxygenation level dependent (BOLD) signals acquired using functional magnetic resonance imaging (fMRI) have become a basic measurement for much of systems, cognitive and social neuroscience. A challenge in making inferences from these statistical maps is the noise and potential confounds that arise from the head motion that occurs within and between acquisition volumes. This motion results in the scan plane being misaligned during acquisition, ultimately leading to reduced statistical power when maps are constructed at the group level. In most cases, an attempt is made to correct for this motion through the use of retrospective analysis methods. In this paper, we use a prospective active marker motion correction (PRAMMO) system that uses radio frequency markers for real-time tracking of motion, enabling on-line slice plane correction. We show that the statistical power of the activation maps is substantially increased using PRAMMO compared to conventional retrospective correction. Analysis of our results indicates that the PRAMMO acquisition reduces the variance without decreasing the signal component of the BOLD (beta). Using PRAMMO could thus improve the overall statistical power of fMRI based BOLD measurements, leading to stronger inferences of the nature of processing in the human brain.
ERIC Educational Resources Information Center
Porter, Andrew C.; McSweeney, Maryellen
A Monte Carlo technique was used to investigate the small sample goodness of fit and statistical power of several nonparametric tests and their parametric analogues when applied to data which violate parametric assumptions. The motivation was to facilitate choice among three designs, simple random assignment with and without a concomitant variable…
Accuracy of Estimates and Statistical Power for Testing Mediation in Latent Growth Curve Modeling
ERIC Educational Resources Information Center
Cheong, JeeWon
2011-01-01
The latent growth curve modeling (LGCM) approach has been increasingly utilized to investigate longitudinal mediation. However, little is known about the accuracy of the estimates and statistical power when mediation is evaluated in the LGCM framework. A simulation study was conducted to address these issues under various conditions including…
[Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].
Suzukawa, Yumi; Toyoda, Hideki
2012-04-01
This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in fields such as perception, cognition, and learning, the effect sizes were relatively large although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, even trivial effects could be detected. This implies that researchers who could not obtain large enough effect sizes would use larger samples to obtain significant results.
Palmisano, Aldo N.; Elder, N.E.
2001-01-01
We examined, under standardized conditions, seawater survival of chinook salmon Oncorhynchus tshawytscha at the smolt stage to evaluate the experimental hatchery practices applied to their rearing. The experimental rearing practices included rearing fish at different densities; attempting to control bacterial kidney disease with broodstock segregation, erythromycin injection, and an experimental diet; rearing fish on different water sources; and freeze branding the fish. After application of experimental rearing practices in hatcheries, smolts were transported to a rearing facility for about 2-3 months of seawater rearing. Of 16 experiments, 4 yielded statistically significant differences in seawater survival. In general, we found that high variability among replicates, combined with the low number of replicates available, resulted in low statistical power. We recommend including four or five replicates and using α = 0.10 in one-tailed tests of hatchery experiments to try to increase the statistical power to 0.80.
Violation of statistical isotropy and homogeneity in the 21-cm power spectrum
NASA Astrophysics Data System (ADS)
Shiraishi, Maresuke; Muñoz, Julian B.; Kamionkowski, Marc; Raccanelli, Alvise
2016-05-01
Most inflationary models predict primordial perturbations to be statistically isotropic and homogeneous. Cosmic microwave background (CMB) observations, however, indicate a possible departure from statistical isotropy in the form of a dipolar power modulation at large angular scales. Alternative models of inflation, beyond the simplest single-field slow-roll models, can generate a small power asymmetry, consistent with these observations. Observations of clustering of quasars show, however, agreement with statistical isotropy at much smaller angular scales. Here, we propose to use off-diagonal components of the angular power spectrum of the 21-cm fluctuations during the dark ages to test this power asymmetry. We forecast results for the planned SKA radio array, a future radio array, and the cosmic-variance-limited case as a theoretical proof of principle. Our results show that the 21-cm line power spectrum will enable access to information at very small scales and at different redshift slices, thus improving upon the current CMB constraints by ~2 orders of magnitude for a dipolar asymmetry and by ~1-3 orders of magnitude for a quadrupolar asymmetry case.
Beckstead, Jason W
2013-10-01
This is the second in a short series of papers on measurement theory and practice with particular relevance to intervention research in nursing, midwifery, and healthcare. This paper begins with an illustration of how random measurement error decreases the power of statistical tests and a review of the roles of sample size and effect size in hypothesis testing. A simple formula is presented and discussed for calculating sample size during the planning stages of intervention studies. Finally, an approach for incorporating reliability estimates into a priori power analyses is introduced and illustrated with a practical example. The approach permits researchers to compare alternative study designs, in terms of their statistical power. An SPSS program is provided to facilitate this approach and to assist researchers in making optimal decisions when choosing among alternative study designs.
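The attenuation logic can be sketched with the classic Fisher-z sample-size formula; this is a minimal illustration of the idea, not the article's SPSS program, and the reliabilities and target correlation below are illustrative:

```python
import math

def n_for_correlation(r, z_alpha=1.959964, z_beta=0.841621):
    # Fisher-z sample size for a two-sided test of correlation r
    # (defaults: alpha = .05, power = .80)
    return math.ceil(((z_alpha + z_beta) / math.atanh(r)) ** 2 + 3)

def n_with_unreliability(r_true, rel_x, rel_y, **kw):
    # Random measurement error attenuates the observable correlation:
    # r_obs = r_true * sqrt(rel_x * rel_y)
    r_obs = r_true * math.sqrt(rel_x * rel_y)
    return n_for_correlation(r_obs, **kw)

print(n_for_correlation(0.30))               # perfectly reliable measures
print(n_with_unreliability(0.30, 0.7, 0.7))  # both reliabilities .70
```

Dropping both reliabilities from 1.0 to .70 roughly doubles the required sample size for the same true effect, which is why folding reliability estimates into a priori power analysis, as the paper advocates, matters when comparing candidate designs.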
Statistics of injected power on a bouncing ball subjected to a randomly vibrating piston.
García-Cid, Alfredo; Gutiérrez, Pablo; Falcón, Claudio; Aumaître, Sébastien; Falcon, Eric
2015-09-01
We present an experimental study on the statistical properties of the injected power needed to maintain an inelastic ball bouncing constantly on a randomly accelerating piston in the presence of gravity. We compute the injected power at each collision of the ball with the moving piston by measuring the velocity of the piston and the force exerted on the piston by the ball. The probability density function of the injected power has its most probable value close to zero and displays two asymmetric exponential tails, depending on the restitution coefficient, the piston acceleration, and its frequency content. This distribution can be deduced from a simple model assuming quasi-Gaussian statistics for the force and velocity of the piston.
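The "simple model assuming quasi-Gaussian statistics" can be illustrated numerically: the product of two correlated Gaussian variables (standing in for force and piston velocity) already gives a distribution peaked near zero with asymmetric exponential-like tails. The correlation value below is an arbitrary choice for illustration, not a quantity measured in the experiment:

```python
import random
import statistics

random.seed(1)

def injected_power_samples(n=200_000, corr=0.3):
    """Draw products of a quasi-Gaussian force and piston velocity with a
    given correlation -- the simple model invoked for the injected-power
    PDF. The correlation value is an illustrative assumption."""
    samples = []
    for _ in range(n):
        v = random.gauss(0.0, 1.0)
        f = corr * v + (1.0 - corr ** 2) ** 0.5 * random.gauss(0.0, 1.0)
        samples.append(f * v)  # power injected at one collision ~ F.v
    return samples

p = injected_power_samples()
mean_p = statistics.fmean(p)
# Positive force-velocity correlation -> net positive injected power,
# and the far-positive tail is heavier than the far-negative one
pos_tail = sum(1 for x in p if x > 5)
neg_tail = sum(1 for x in p if x < -5)
print(mean_p, pos_tail, neg_tail)
```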
Escalante, Yolanda; Saavedra, Jose M.; Tella, Victor; Mansilla, Mirella; García-Hermoso, Antonio; Dominguez, Ana M.
2012-01-01
The aims of this study were (i) to compare women's water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performances in each phase. The game-related statistics of the 124 women's matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared test. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final phases (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots. Key points: In the preliminary phase, more than one variable was involved in this differentiation, including both offensive and defensive aspects of the game. The game-related statistics were found to have a high discriminatory power in predicting the result of matches, with shots and goalkeeper-blocked shots being discriminatory variables in all three phases. Knowledge of the characteristics of women's water polo game-related statistics of winning teams, and of their power to predict match outcomes, will allow coaches to take these characteristics into account when planning training and match preparation. PMID
The Effect of Cluster Size Variability on Statistical Power in Cluster-Randomized Trials
Lauer, Stephen A.; Kleinman, Ken P.; Reich, Nicholas G.
2015-01-01
The frequency of cluster-randomized trials (CRTs) in peer-reviewed literature has increased exponentially over the past two decades. CRTs are a valuable tool for studying interventions that cannot be effectively implemented or randomized at the individual level. However, some aspects of the design and analysis of data from CRTs are more complex than those for individually randomized controlled trials. One of the key components to designing a successful CRT is calculating the proper sample size (i.e. number of clusters) needed to attain an acceptable level of statistical power. In order to do this, a researcher must make assumptions about the value of several variables, including a fixed mean cluster size. In practice, cluster size can often vary dramatically. Few studies account for the effect of cluster size variation when assessing the statistical power for a given trial. We conducted a simulation study to investigate how the statistical power of CRTs changes with variable cluster sizes. In general, we observed that increases in cluster size variability lead to a decrease in power. PMID:25830416
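The power loss from cluster-size variability can be reproduced with a small Monte Carlo sketch: a two-arm CRT analyzed by a simple test on cluster means, comparing equal cluster sizes with a highly variable mix of the same mean size. All parameter values (effect size, ICC, cluster counts and sizes) are illustrative assumptions, not taken from the paper:

```python
import random
import statistics

random.seed(42)

def simulate_power(cluster_sizes, delta=0.35, icc=0.2, k_per_arm=20, reps=2000):
    """Monte Carlo power of a two-arm CRT analyzed by a test on cluster
    means (normal approximation). Total variance 1 is split into between-
    and within-cluster parts by the ICC."""
    sd_b, sd_w = icc ** 0.5, (1 - icc) ** 0.5
    hits = 0
    for _ in range(reps):
        arms = []
        for arm_effect in (0.0, delta):
            means = []
            for i in range(k_per_arm):
                m = cluster_sizes[i % len(cluster_sizes)]  # cycle through sizes
                b = random.gauss(arm_effect, sd_b)          # cluster random effect
                means.append(statistics.fmean(random.gauss(b, sd_w) for _ in range(m)))
            arms.append(means)
        se = ((statistics.variance(arms[0]) + statistics.variance(arms[1])) / k_per_arm) ** 0.5
        z = (statistics.fmean(arms[1]) - statistics.fmean(arms[0])) / se
        hits += abs(z) > 1.96
    return hits / reps

power_fixed = simulate_power([20])      # every cluster has 20 subjects
power_varied = simulate_power([5, 35])  # same mean size, highly variable
print(power_fixed, power_varied)
```

With the same number of clusters and the same mean cluster size, the variable-size design comes out less powerful, matching the paper's general observation.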
Statistical power of model selection strategies for genome-wide association studies.
Wu, Zheyang; Zhao, Hongyu
2009-07-01
Genome-wide association studies (GWAS) aim to identify genetic variants related to diseases by examining the associations between phenotypes and hundreds of thousands of genotyped markers. Because many genes are potentially involved in common diseases and a large number of markers are analyzed, it is crucial to devise an effective strategy to identify truly associated variants that have individual and/or interactive effects, while controlling false positives at the desired level. Although a number of model selection methods have been proposed in the literature, including marginal search, exhaustive search, and forward search, their relative performance has only been evaluated through limited simulations due to the lack of an analytical approach to calculating the power of these methods. This article develops a novel statistical approach for power calculation, derives accurate formulas for the power of different model selection strategies, and then uses the formulas to evaluate and compare these strategies in genetic model spaces. In contrast to previous studies, our theoretical framework allows for random genotypes, correlations among test statistics, and a false-positive control based on GWAS practice. After the accuracy of our analytical results is validated through simulations, they are utilized to systematically evaluate and compare the performance of these strategies in a wide class of genetic models. For a specific genetic model, our results clearly reveal how different factors, such as effect size, allele frequency, and interaction, jointly affect the statistical power of each strategy. An example is provided for the application of our approach to empirical research. The statistical approach used in our derivations is general and can be employed to address the model selection problems in other random predictor settings. We have developed an R package markerSearchPower to implement our formulas, which can be downloaded from the Comprehensive R Archive Network
Tests of Mediation: Paradoxical Decline in Statistical Power as a Function of Mediator Collinearity.
Beasley, T Mark
2014-01-01
Increasing the correlation between the independent variable and the mediator (the a coefficient) increases the effect size (ab) in mediation analysis; however, increasing a by definition increases collinearity in mediation models. As a result, the standard errors of product tests increase. The variance inflation due to increases in a at some point outweighs the increase in the effect size (ab) and results in a loss of statistical power. This phenomenon also occurs with nonparametric bootstrapping approaches, because the variance of the bootstrap distribution of ab approximates the variance expected from normal theory. Both variances increase dramatically when a exceeds the b coefficient, explaining the power decline as a increases. Implications for statistical analysis and applied researchers are discussed. PMID:24954952
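The variance-inflation mechanism can be made concrete with a sketch of the Sobel (product-of-coefficients) test statistic under a simplified standardized model; the fixed b, n, and residual-variance choices below are illustrative assumptions, not the paper's specification:

```python
from math import sqrt

def sobel_z(a, b=0.3, n=100):
    """Sobel (product-of-coefficients) test statistic for the mediated
    effect a*b in a standardized X -> M -> Y model. Sketch assumes
    unit-variance variables and outcome residual variance 1 - b^2;
    collinearity between X and M inflates the variance of b-hat by
    1/(1 - a^2). The values of b and n are illustrative choices."""
    sa2 = (1 - a * a) / n                  # sampling variance of a-hat
    sb2 = (1 - b * b) / (n * (1 - a * a))  # inflated as a -> 1
    return a * b / sqrt(a * a * sb2 + b * b * sa2)

# The paradox: the test statistic (hence power) first rises with a, then falls
for a in (0.3, 0.6, 0.9):
    print(a, round(sobel_z(a), 2))
```

Even though ab keeps growing with a, the statistic peaks and then declines: exactly the paradoxical power loss the abstract describes.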
Statistical Design Model (SDM) of power supply and communication subsystem's Satellite
NASA Astrophysics Data System (ADS)
Mirshams, Mehran; Zabihian, Ehsan; Zabihian, Ahmadreza
In designing the power supply and communication subsystems of satellites, most approaches and relations are empirical and statistical; because aerospace engineering and its links to other fields such as electrical engineering are relatively young, analytic or fully proven empirical relations are lacking in many areas. We therefore consider a statistical design of these subsystems. The approach presented in this paper is entirely innovative and specifies all parts of the satellite's power supply and communication subsystems. In developing this approach, data from 602 satellites and software such as SPSS were used. After the design procedure is proposed, the total power needed by the satellite, the mass of the power supply and communication subsystems, the power needed by the communication subsystem, the working band, the type of antenna, the number of transponders, the material of the solar arrays, and the placement of these arrays on the satellite are designed. All these parts are designed according to the satellite's mission and weight class. This procedure increases the performance rate, avoids wasting energy, and reduces costs. Keywords: database, statistical model, design procedure, power supply subsystem, communication subsystem
Nateghi, Roshanak; Guikema, Seth D; Quiring, Steven M
2011-12-01
This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Being able to make accurate predictions of power outage durations is valuable because the information can be used by utility companies to plan their restoration efforts more efficiently. This information can also help inform customers and public agencies of the expected outage times, enabling better collective response planning, and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes with the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage duration times caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazard models (Cox PH)) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate adaptive regression splines (MARS)). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy.
Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays
NASA Astrophysics Data System (ADS)
Sibatov, R. T.
2011-08-01
A new statistical model of charge transport in colloidal quantum dot arrays is proposed. It takes into account the Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of the energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. A fractional differential analogue of Ohm's law is found phenomenologically for nanocrystal arrays. The model combines ideas that other authors have considered conflicting: the Scher-Montroll idea of a power-law distribution of waiting times in localized states for disordered semiconductors is applied taking the Coulomb blockade into account; Novikov's condition on the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
In vivo Comet assay--statistical analysis and power calculations of mice testicular cells.
Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne; Boberg, Julie; Kulahci, Murat
2014-11-01
The in vivo Comet assay is a sensitive method for evaluating DNA damage. A recurrent concern is how to analyze the data appropriately and efficiently. A popular approach is to summarize the raw data into a summary statistic prior to the statistical analysis. However, consensus on which summary statistic to use has yet to be reached. Another important consideration concerns the assessment of proper sample sizes in the design of Comet assay studies. This study aims to identify a statistic suitably summarizing the % tail DNA of mice testicular samples in Comet assay studies. A second aim is to provide curves for this statistic outlining the number of animals and gels to use. The current study was based on 11 compounds administered via oral gavage in three doses to male mice: CAS no. 110-26-9, CAS no. 512-56-1, CAS no. 111873-33-7, CAS no. 79-94-7, CAS no. 115-96-8, CAS no. 598-55-0, CAS no. 636-97-5, CAS no. 85-28-9, CAS no. 13674-87-8, CAS no. 43100-38-5 and CAS no. 60965-26-6. Testicular cells were examined using the alkaline version of the Comet assay and the DNA damage was quantified as % tail DNA using a fully automatic scoring system. From the raw data 23 summary statistics were examined. A linear mixed-effects model was fitted to the summarized data and the estimated variance components were used to generate power curves as a function of sample size. The statistic that most appropriately summarized the within-sample distributions was the median of the log-transformed data, as it most consistently conformed to the assumptions of the statistical model. Power curves for 1.5-, 2-, and 2.5-fold changes of the highest dose group compared to the control group when 50 and 100 cells were scored per gel are provided to aid in the design of future Comet assay studies on testicular cells.
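The recommended summary statistic, the median of the log-transformed % tail DNA per gel, is straightforward to compute. A minimal sketch follows; the small offset guarding against log(0) for undamaged cells is this sketch's assumption, not a prescription from the study:

```python
import random
import statistics
from math import log

def summarize_gel(tail_dna_percents):
    """Median of the log-transformed % tail DNA -- the summary statistic the
    study found most consistent with the mixed-model assumptions. The +0.1
    offset avoiding log(0) is an assumption of this sketch."""
    return statistics.median(log(x + 0.1) for x in tail_dna_percents)

# One simulated gel of 100 scored cells with right-skewed damage values
random.seed(3)
gel = [random.lognormvariate(1.0, 0.8) for _ in range(100)]
print(summarize_gel(gel))
```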
Comparisons of power of statistical methods for gene-environment interaction analyses.
Ege, Markus J; Strachan, David P
2013-10-01
Any genome-wide analysis is hampered by reduced statistical power due to multiple comparisons. This is particularly true for interaction analyses, which have lower statistical power than analyses of associations. To assess gene-environment interactions in population settings we have recently proposed a statistical method based on a modified two-step approach, where first genetic loci are selected by their associations with disease and environment, respectively, and subsequently tested for interactions. We have simulated various data sets resembling real world scenarios and compared single-step and two-step approaches with respect to true positive rate (TPR) in 486 scenarios and (study-wide) false positive rate (FPR) in 252 scenarios. Our simulations confirmed that in all two-step methods the two steps are not correlated. In terms of TPR, two-step approaches combining information on gene-disease association and gene-environment association in the first step were superior to all other methods, while preserving a low FPR in over 250 million simulations under the null hypothesis. Our weighted modification yielded the highest power across various degrees of gene-environment association in the controls. An optimal threshold for step 1 depended on the interacting allele frequency and the disease prevalence. In all scenarios, the least powerful method was to proceed directly to an unbiased full interaction model, applying conventional genome-wide significance thresholds. This simulation study confirms the practical advantage of two-step approaches to interaction testing over more conventional one-step designs, at least in the context of dichotomous disease outcomes and other parameters that might apply in real-world settings.
NASA Astrophysics Data System (ADS)
Najac, Julien
2014-05-01
For many applications in the energy sector, it is crucial to have downscaling methods that preserve space-time dependences, at very fine spatial and temporal scales, between the variables affecting electricity production and consumption. For climate change impact studies this is an extremely difficult task, particularly as reliable climate information is usually available at regional and monthly scales at best, although many industry-oriented applications need more finely resolved information (hydropower production models, wind energy production models, power demand models, power balance models...). Here we therefore investigate how to predict and quantify the influence of climate change on climate-related energies and on energy demand. To do so, statistical downscaling methods originally developed for studying climate change impacts on hydrological cycles in France (and which have been used to compute hydropower production in France) have been applied to predicting wind power generation in France and an air temperature indicator commonly used for predicting power demand in France. We show that these methods provide satisfactory results over the recent past and apply the methodology to several climate model runs from the ENSEMBLES project.
Statistical Wind Power Forecasting for U.S. Wind Farms: Preprint
Milligan, M.; Schwartz, M. N.; Wan, Y.
2003-11-01
Electricity markets in the United States are evolving. Accurate wind power forecasts are beneficial for wind plant operators, utility operators, and utility customers. An accurate forecast allows grid operators to schedule economically efficient generation to meet the demand of electrical customers. The evolving markets hold some form of auction for various forward markets, such as hour ahead or day ahead. This paper describes several statistical forecasting models that can be useful in hour-ahead markets. Although longer-term forecasting relies on numerical weather models, the statistical models used here focus on the short-term forecasts that can be useful in the hour-ahead markets. The purpose of the paper is not to develop forecasting models that can compete with commercially available models. Instead, we investigate the extent to which time-series analysis can improve simplistic persistence forecasts. This project applied a class of models known as autoregressive moving average (ARMA) models to both wind speed and wind power output. The results from wind farms in Minnesota, Iowa, and along the Washington-Oregon border indicate that statistical modeling can provide a significant improvement in wind forecasts compared to persistence forecasts.
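The gain of time-series modeling over persistence can be illustrated with a toy AR(1) sketch; the series and its parameters are synthetic stand-ins for illustration, not the Minnesota, Iowa, or Washington-Oregon data:

```python
import random
import statistics

random.seed(0)

# Synthetic "wind anomaly" series: AR(1) with phi = 0.7 (an assumed value
# for illustration, not a parameter fitted to any wind-farm data)
phi, n = 0.7, 20_000
x = [0.0]
for _ in range(n):
    x.append(phi * x[-1] + random.gauss(0.0, 1.0))

# Estimate the AR(1) coefficient from the lag-1 autocovariance
num = sum(a * b for a, b in zip(x, x[1:]))
den = sum(a * a for a in x[:-1])
phi_hat = num / den

# One-step-ahead forecast errors (in-sample, for brevity):
# persistence predicts x_t, the fitted AR model predicts phi_hat * x_t
mse_persist = statistics.fmean((b - a) ** 2 for a, b in zip(x, x[1:]))
mse_ar = statistics.fmean((b - phi_hat * a) ** 2 for a, b in zip(x, x[1:]))
print(phi_hat, mse_persist, mse_ar)
```

Whenever the series is mean-reverting (phi < 1), the fitted one-step AR forecast beats persistence in mean squared error, which is the kind of improvement the paper quantifies on real wind data.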
Gibson, Eli; Fenster, Aaron; Ward, Aaron D
2013-10-01
Novel imaging modalities are pushing the boundaries of what is possible in medical imaging, but their signal properties are not always well understood. The evaluation of these novel imaging modalities is critical to achieving their research and clinical potential. Image registration of novel modalities to accepted reference standard modalities is an important part of characterizing the modalities and elucidating the effect of underlying focal disease on the imaging signal. The strengths of the conclusions drawn from these analyses are limited by statistical power. Based on the observation that in this context, statistical power depends in part on uncertainty arising from registration error, we derive a power calculation formula relating registration error, number of subjects, and the minimum detectable difference between normal and pathologic regions on imaging, for an imaging validation study design that accommodates signal correlations within image regions. Monte Carlo simulations were used to evaluate the derived models and test the strength of their assumptions, showing that the model yielded predictions of the power, the number of subjects, and the minimum detectable difference of simulated experiments accurate to within a maximum error of 1% when the assumptions of the derivation were met, and characterizing sensitivities of the model to violations of the assumptions. The use of these formulae is illustrated through a calculation of the number of subjects required for a case study, modeled closely after a prostate cancer imaging validation study currently taking place at our institution. The power calculation formulae address three central questions in the design of imaging validation studies: (1) What is the maximum acceptable registration error? (2) How many subjects are needed? (3) What is the minimum detectable difference between normal and pathologic image regions?
Power flow as a complement to statistical energy analysis and finite element analysis
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1987-01-01
Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly, while SEA methods can only predict average levels. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. This power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies to verify the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.
Statistical analysis of the cosmic microwave background: Power spectra and foregrounds
NASA Astrophysics Data System (ADS)
O'Dwyer, Ian J.
2005-11-01
In this thesis I examine some of the challenges associated with analyzing Cosmic Microwave Background (CMB) data and present a novel approach to solving the problem of power spectrum estimation, called MAGIC (MAGIC Allows Global Inference of Covariance). In light of the computational difficulty of a brute-force approach to power spectrum estimation, I review several approaches that have been applied to the problem and show an example application of such an approximate method to experimental CMB data from the Background Emission Anisotropy Scanning Telescope (BEAST). I then introduce MAGIC, a new approach to power spectrum estimation based on a Bayesian statistical analysis of the data utilizing Gibbs sampling. I demonstrate application of this method to the all-sky Wilkinson Microwave Anisotropy Probe (WMAP) data. The results are in broad agreement with those obtained originally by the WMAP team. Since MAGIC generates a full description of each Cℓ, it is possible to examine several issues raised by the best-fit WMAP power spectrum, for example the perceived lack of power at low ℓ. It is found that the distribution of Cℓ's at low ℓ is significantly non-Gaussian and, based on the exact analysis presented here, the "low quadrupole issue" can be attributed to a statistical fluctuation. Finally, I examine the effect of Galactic foreground contamination on CMB experiments and describe the principal foregrounds. I show that it is possible to include the foreground components in a self-consistent fashion within the statistical framework of MAGIC and give explicit examples of how this might be achieved. Foreground contamination will become an increasingly important issue in CMB data analysis, and the ability of this new algorithm to produce an exact power spectrum in a computationally feasible time, coupled with the foreground component separation and removal, is an exciting development in CMB data analysis. When considered with current algorithmic developments
Statistical evaluation of telephone noise interference caused by AC power line harmonic currents
Kuussaari, M.
1993-04-01
A statistical approach is applied to the evaluation of limits for harmonic currents in AC power lines, the goal being to prevent excessive telephone noise interference voltages in subscriber cables in rural areas. The analysis is based on Monte-Carlo simulation, which takes into account the experimental probability distributions of the relevant parameters. Under Finnish conditions, the properties of communication cables permit equivalent disturbing phase currents of 8 to 10 A. The digital exchanges permit approximately the same currents. Some new telephone types that have a low balance may make it necessary to limit the currents to a somewhat lower level.
Statistics of the radiated field of a space-to-earth microwave power transfer system
NASA Technical Reports Server (NTRS)
Stevens, G. H.; Leininger, G.
1976-01-01
Statistics such as the average power density pattern, the variance of the power density pattern, and the variance of the beam pointing error are related to hardware parameters such as transmitter rms phase error and rms amplitude error. A limitation on the spectral width of the phase reference for phase control is also established. A 1 km diameter transmitter appears feasible provided the total rms insertion phase errors of the phase control modules do not exceed 10 deg, amplitude errors do not exceed 10% rms, and the phase reference spectral width does not exceed approximately 3 kHz. Under these conditions the expected radiation pattern is virtually the same as the error-free pattern, and the rms beam pointing error would be insignificant (approximately 10 meters).
Statistical distribution of pioneer vegetation: the role of local stream power
NASA Astrophysics Data System (ADS)
Crouzy, B.; Edmaier, K.; Pasquale, N.; Perona, P.
2012-12-01
We discuss results of a flume experiment on the colonization of river bars by pioneer vegetation, focusing on the role of a non-constant local stream power in determining the statistics of riverbed and uprooted biomass characteristics (root length, number of roots, and stem height). We verify the conjecture that the statistical distribution of riverbed vegetation subject to the action of flood disturbances can be obtained from the distribution before the flooding events combined with the relative resilience to floods of plants with given traits. By using fast-growing vegetation (Avena sativa) we can access the competition between growth-associated root stabilization and uprooting by floods. We fix the hydrological timescale (in our experiment, the arrival time between periodic flooding events) to be comparable with the biological timescales (plant germination and development rates). The sequence of flooding events is repeated until the surviving riverbed vegetation has grown out of scale with the uprooting capacity of the flood and the competition has stopped. We present and compare laboratory results obtained using converging and parallel channel walls to highlight the role of the local stream power in the process. The convergent geometry can be seen as the laboratory analog of different field conditions. At the scale of the bar it represents regions with flow concentration, while at a larger scale it is an analog for a river with convergent banks; see, for example, the work on the Tagliamento River by Gurnell and Petts (2006). As expected, we observe that for the convergent geometry the variability in the local stream power results in a longer tail of the distribution of root length for uprooted material compared to parallel geometries with an equal flow rate. More surprisingly, the presence of regions with increased stream power in the convergent experiments allows us to access two fundamentally different regimes. We observe that depending on the development stage
Detecting trends in raptor counts: power and type I error rates of various statistical tests
Hatfield, J.S.; Gould, W.R.; Hoover, B.A.; Fuller, M.R.; Lindquist, E.L.
1996-01-01
We conducted simulations that estimated power and type I error rates of statistical tests for detecting trends in raptor population count data collected from a single monitoring site. Results of the simulations were used to help analyze count data of bald eagles (Haliaeetus leucocephalus) from 7 national forests in Michigan, Minnesota, and Wisconsin during 1980-1989. Seven statistical tests were evaluated, including simple linear regression on the log scale and linear regression with a permutation test. Using 1,000 replications each, we simulated n = 10 and n = 50 years of count data and trends ranging from -5 to 5% change/year. We evaluated the tests at 3 critical levels (alpha = 0.01, 0.05, and 0.10) for both upper- and lower-tailed tests. Exponential count data were simulated by adding sampling error with a coefficient of variation of 40% from either a log-normal or autocorrelated log-normal distribution. Not surprisingly, tests performed with 50 years of data were much more powerful than tests with 10 years of data. Positive autocorrelation inflated alpha-levels upward from their nominal levels, making the tests less conservative and more likely to reject the null hypothesis of no trend. Of the tests studied, Cox and Stuart's test and Pollard's test clearly had lower power than the others. Surprisingly, the linear regression t-test, Collins' linear regression permutation test, and the nonparametric Lehmann's and Mann's tests all had similar power in our simulations. Analyses of the count data suggested that bald eagles had increasing trends on at least 2 of the 7 national forests during 1980-1989.
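A stripped-down version of such a power simulation, log-normal counts with a 40% coefficient of variation and simple linear regression on the log scale, can be sketched in a few lines of stdlib Python (independent errors only; the autocorrelated case and the other six tests are not reproduced, and the trend and count values are illustrative):

```python
import random
import statistics
from math import exp, log, sqrt

random.seed(7)

def reject_trend(counts, t_crit=2.01):
    """Simple linear regression of log(count) on year, two-sided t-test on
    the slope (t_crit is approximately the 5% critical value for df = 48)."""
    y = [log(c) for c in counts]
    n = len(y)
    xbar, ybar = (n - 1) / 2, statistics.fmean(y)
    sxx = sum((i - xbar) ** 2 for i in range(n))
    slope = sum((i - xbar) * (yi - ybar) for i, yi in enumerate(y)) / sxx
    sse = sum((yi - ybar - slope * (i - xbar)) ** 2 for i, yi in enumerate(y))
    se = sqrt(sse / (n - 2) / sxx)
    return abs(slope / se) > t_crit

def simulate(trend, years=50, cv=0.40, reps=2000):
    """Fraction of simulated count series in which the trend test rejects;
    independent log-normal errors with the given coefficient of variation."""
    sd_log = sqrt(log(1 + cv * cv))
    rejections = 0
    for _ in range(reps):
        counts = [exp(log(100) + log(1 + trend) * t + random.gauss(0.0, sd_log))
                  for t in range(years)]
        rejections += reject_trend(counts)
    return rejections / reps

print(simulate(0.01))  # power to detect a 1%/year increase over 50 years
print(simulate(0.0))   # empirical type I error rate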
Nicol, Samuel; Roach, Jennifer K.; Griffith, Brad
2013-01-01
Over the past 50 years, the number and size of high-latitude lakes have decreased throughout many regions; however, individual lake trends have been variable in direction and magnitude. This spatial heterogeneity in lake change makes statistical detection of temporal trends challenging, particularly in small analysis areas where weak trends are difficult to separate from inter- and intra-annual variability. Factors affecting trend detection include inherent variability, trend magnitude, and sample size. In this paper, we investigated how the statistical power to detect average linear trends in lake size of 0.5, 1.0 and 2.0 %/year was affected by the size of the analysis area and the number of years of monitoring in National Wildlife Refuges in Alaska. We estimated power for large (930–4,560 sq km) study areas within refuges and for 2.6, 12.9, and 25.9 sq km cells nested within study areas over temporal extents of 4–50 years. We found that: (1) trends in study areas could be detected within 5–15 years, (2) trends smaller than 2.0 %/year would take >50 years to detect in cells within study areas, and (3) there was substantial spatial variation in the time required to detect change among cells. Power was particularly low in the smallest cells which typically had the fewest lakes. Because small but ecologically meaningful trends may take decades to detect, early establishment of long-term monitoring will enhance power to detect change. Our results have broad applicability and our method is useful for any study involving change detection among variable spatial and temporal extents.
Notes on the Statistical Power of the Binary State Speciation and Extinction (BiSSE) Model
Gamisch, Alexander
2016-01-01
The Binary State Speciation and Extinction (BiSSE) method is one of the most popular tools for investigating the rates of diversification and character evolution. Yet, based on previous simulation studies, it is commonly held that the BiSSE method requires phylogenetic trees of fairly large sample sizes (>300 taxa) in order to distinguish between the different models of speciation, extinction, or transition rate asymmetry. Here, the power of the BiSSE method is reevaluated by simulating trees of both small and large sample sizes (30, 60, 90, and 300 taxa) under various asymmetry models and root state assumptions. Results show that the power of the BiSSE method to detect differences in speciation rate asymmetry can be much higher than anticipated earlier, even in trees of small sample size. This, however, is not a consequence of any conceptual or mathematical flaw in the method per se but rather of assumptions about the character state at the root of the simulated trees and thus the underlying macroevolutionary model, which led to biased results and conclusions in earlier power assessments. As such, these earlier simulation studies used to determine the power of BiSSE were not incorrect but biased, leading to an overestimation of type-II statistical error for detecting differences in speciation rate but not for extinction and transition rates. PMID:27486297
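The state-dependent diversification process underlying BiSSE can be illustrated with a toy forward simulation. This is not the BiSSE likelihood machinery (which is usually run via the R package diversitree); it is a bare Gillespie simulation of a two-state pure-birth process (extinction omitted for brevity) showing how speciation-rate asymmetry skews tip-state counts, and how the root state sets the transient. All rate values are made up.

```python
import random

def simulate_tip_counts(l0, l1, q01, q10, root_state, t_max, rng):
    """Gillespie simulation of a binary-state pure-birth process.
    l0, l1: state-dependent speciation rates; q01, q10: transition rates.
    Returns [tips in state 0, tips in state 1] at time t_max."""
    n = [0, 0]
    n[root_state] = 1
    t = 0.0
    while True:
        rates = [n[0] * l0, n[1] * l1, n[0] * q01, n[1] * q10]
        total = sum(rates)
        t += rng.expovariate(total)
        if t > t_max:
            return n
        u = rng.random() * total
        if u < rates[0]:
            n[0] += 1                      # speciation in state 0
        elif u < rates[0] + rates[1]:
            n[1] += 1                      # speciation in state 1
        elif u < rates[0] + rates[1] + rates[2]:
            n[0] -= 1; n[1] += 1           # transition 0 -> 1
        else:
            n[1] -= 1; n[0] += 1           # transition 1 -> 0

rng = random.Random(42)
totals = [0, 0]
for _ in range(200):                       # replicate trees, root in state 0
    n0, n1 = simulate_tip_counts(0.1, 0.3, 0.1, 0.1, root_state=0,
                                 t_max=20.0, rng=rng)
    totals[0] += n0
    totals[1] += n1
```

Even with the root fixed in the slow state, the faster-speciating state comes to dominate the tips, which is the signal BiSSE exploits; the paper's point is that the assumed root state changes how quickly that signal appears.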
A powerful weighted statistic for detecting group differences of directed biological networks
Yuan, Zhongshang; Ji, Jiadong; Zhang, Xiaoshuai; Xu, Jing; Ma, Daoxin; Xue, Fuzhong
2016-01-01
Complex disease is largely determined by a number of biomolecules interwoven into networks, rather than a single biomolecule. Different physiological conditions such as cases and controls may manifest as different networks. Statistical comparison between biological networks can provide not only new insight into the disease mechanism but also statistical guidance for drug development. However, the methods developed in previous studies are inadequate to capture the changes in both the nodes and edges, and often ignore the network structure. In this study, we present a powerful weighted statistical test for group differences of directed biological networks, which is independent of the network attributes and can capture changes in both the nodes and edges, while simultaneously accounting for the network structure by putting more weight on differences at nodes occupying relatively more important positions. Simulation studies illustrate that this method has better performance than previous ones under various sample sizes and network structures. One application to GWAS of leprosy successfully identifies the specific gene interaction network contributing to leprosy. Another real data analysis significantly identifies a new biological network, which is related to acute myeloid leukemia. One potential network responsible for lung cancer has also been significantly detected. The source R code is available on our website. PMID:27686331
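The general idea — a node-weighted group-difference statistic assessed by a permutation test — can be sketched as follows. This is not the authors' statistic (their method is in R and uses the directed network structure more fully); here node importance is proxied by out-degree on a made-up 4-node network, and all data are synthetic.

```python
import random

# Toy directed network: adjacency[i][j] = 1 if node i regulates node j.
adjacency = [
    [0, 1, 1, 1],   # node 0 regulates everyone: most "important"
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [0, 0, 0, 0],
]
out_deg = [sum(row) for row in adjacency]
weights = [(d + 1) / (sum(out_deg) + len(out_deg)) for d in out_deg]

def weighted_stat(group_a, group_b):
    """Sum over nodes of weight * squared difference of group means."""
    stat = 0.0
    for i, w in enumerate(weights):
        mean_a = sum(s[i] for s in group_a) / len(group_a)
        mean_b = sum(s[i] for s in group_b) / len(group_b)
        stat += w * (mean_a - mean_b) ** 2
    return stat

def permutation_pvalue(group_a, group_b, n_perm=500, seed=7):
    """Permute group labels to get the null distribution of the statistic."""
    rng = random.Random(seed)
    observed = weighted_stat(group_a, group_b)
    pooled = group_a + group_b
    n_a = len(group_a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if weighted_stat(pooled[:n_a], pooled[n_a:]) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

rng = random.Random(0)
# Cases have a shifted mean at the high-importance node 0 only.
cases = [[1.5 + rng.gauss(0, 0.3) if i == 0 else rng.gauss(0, 0.3)
          for i in range(4)] for _ in range(15)]
controls = [[rng.gauss(0, 0.3) for _ in range(4)] for _ in range(15)]
p_value = permutation_pvalue(cases, controls)
```

Because node 0 carries the largest weight, a shift there dominates the statistic, illustrating why up-weighting important nodes boosts power for this kind of alternative.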
Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants
NASA Astrophysics Data System (ADS)
Rajasekar, Vidyashree
This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and components/chemicals used in artificial soil formulation is briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance, and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si, due to its smaller cell size, were eliminated on the mono-Si coupons, whose large cells allowed highly repeatable measurements. This study indicates that the reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field aged PV modules using experimental data obtained in the field and statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years under the cold-dry climate of New York was evaluated. Defect chart, degradation rates (both string and module levels) and safety map were generated using the field measured data. A statistical reliability tool, FMECA that uses Risk Priority Number (RPN) is used to determine the dominant failure or degradation modes in the strings and modules by means of ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly
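The FMECA ranking in Part 2 rests on the standard Risk Priority Number, RPN = severity × occurrence × detection (each scored 1-10). A minimal sketch with hypothetical failure modes and scores (none of these values come from the thesis):

```python
# Hypothetical PV failure modes with (severity, occurrence, detection) scores,
# each on a 1-10 scale; higher detection score = harder to detect.
modes = {
    "encapsulant discoloration": (4, 7, 3),
    "solder bond fatigue":       (7, 5, 6),
    "glass breakage":            (9, 2, 2),
}

# RPN = severity * occurrence * detection; rank modes from highest risk down.
rpn = {name: s * o * d for name, (s, o, d) in modes.items()}
ranked = sorted(rpn, key=rpn.get, reverse=True)
```

Ranking by RPN rather than by severity alone is the point of the method: a moderately severe but frequent, hard-to-detect mode can outrank a catastrophic but rare one.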
Case Studies for the Statistical Design of Experiments Applied to Powered Rotor Wind Tunnel Tests
NASA Technical Reports Server (NTRS)
Overmeyer, Austin D.; Tanner, Philip E.; Martin, Preston B.; Commo, Sean A.
2015-01-01
The application of statistical Design of Experiments (DOE) to helicopter wind tunnel testing was explored during two powered rotor wind tunnel entries during the summers of 2012 and 2013. These tests were performed jointly by the U.S. Army Aviation Development Directorate Joint Research Program Office and NASA Rotary Wing Project Office, currently the Revolutionary Vertical Lift Project, at NASA Langley Research Center located in Hampton, Virginia. Both entries were conducted in the 14- by 22-Foot Subsonic Tunnel with a small portion of the overall tests devoted to developing case studies of the DOE approach as it applies to powered rotor testing. A 16-47 times reduction in the number of data points required was estimated by comparing the DOE approach to conventional testing methods. The average error for the DOE response surface model for the OH-58F test was 0.95 percent and 4.06 percent for drag and download, respectively. The DOE response surface model of the Active Flow Control test captured the drag within 4.1 percent of measured data. The operational differences between the two testing approaches are identified, but did not prevent the safe operation of the powered rotor model throughout the DOE test matrices.
34 CFR 85.900 - Adequate evidence.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 1 2010-07-01 2010-07-01 false Adequate evidence. 85.900 Section 85.900 Education Office of the Secretary, Department of Education GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 85.900 Adequate evidence. Adequate evidence means information sufficient to support...
12 CFR 380.52 - Adequate protection.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 12 Banks and Banking 5 2012-01-01 2012-01-01 false Adequate protection. 380.52 Section 380.52... ORDERLY LIQUIDATION AUTHORITY Receivership Administrative Claims Process § 380.52 Adequate protection. (a... interest of a claimant, the receiver shall provide adequate protection by any of the following means:...
12 CFR 380.52 - Adequate protection.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 12 Banks and Banking 5 2013-01-01 2013-01-01 false Adequate protection. 380.52 Section 380.52... ORDERLY LIQUIDATION AUTHORITY Receivership Administrative Claims Process § 380.52 Adequate protection. (a... interest of a claimant, the receiver shall provide adequate protection by any of the following means:...
12 CFR 380.52 - Adequate protection.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 12 Banks and Banking 5 2014-01-01 2014-01-01 false Adequate protection. 380.52 Section 380.52... ORDERLY LIQUIDATION AUTHORITY Receivership Administrative Claims Process § 380.52 Adequate protection. (a... interest of a claimant, the receiver shall provide adequate protection by any of the following means:...
21 CFR 1404.900 - Adequate evidence.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Adequate evidence. 1404.900 Section 1404.900 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL POLICY GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 1404.900 Adequate evidence. Adequate evidence means information sufficient...
Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van't
2012-03-15
Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
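The reason LASSO yields sparse, interpretable models (unlike BMA) is its soft-thresholding update, which sets weak coefficients exactly to zero. This is not the authors' NTCP pipeline — just a generic, self-contained coordinate-descent LASSO on synthetic data with one real predictor and one pure-noise predictor:

```python
import random

def lasso_cd(X, y, lam, n_iter=200):
    """Minimal coordinate-descent LASSO: minimizes
    0.5 * ||y - X b||^2 + lam * sum(|b_j|)
    by cycling soft-threshold updates over the coefficients."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # Correlation of predictor j with the partial residual.
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * beta[k]
                                            for k in range(p) if k != j))
                      for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            if rho > lam:
                beta[j] = (rho - lam) / z      # soft threshold, positive side
            elif rho < -lam:
                beta[j] = (rho + lam) / z      # soft threshold, negative side
            else:
                beta[j] = 0.0                  # shrunk exactly to zero
    return beta

rng = random.Random(3)
n = 100
x_signal = [rng.gauss(0, 1) for _ in range(n)]
x_noise = [rng.gauss(0, 1) for _ in range(n)]
y = [2.0 * xs + rng.gauss(0, 1) for xs in x_signal]
X = [[x_signal[i], x_noise[i]] for i in range(n)]
beta = lasso_cd(X, y, lam=30.0)
```

With a moderate penalty the noise predictor is dropped entirely while the true predictor survives (shrunk), which is the selection behavior the paper recommends over stepwise regression.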
Conductance statistics for the power-law banded random matrix model
Martinez-Mendoza, A. J.; Mendez-Bermudez, J. A.; Varga, Imre
2010-12-21
We study numerically the conductance statistics of the one-dimensional (1D) Anderson model with random long-range hoppings described by the Power-law Banded Random Matrix (PBRM) model. Within a scattering approach to electronic transport, we consider two scattering setups in absence and presence of direct processes: 2M single-mode leads attached to one side and to opposite sides of 1D circular samples. For both setups we show that (i) the probability distribution of the logarithm of the conductance T behaves as w(ln T) ∝ T^(M²/2), for T <<
NASA Astrophysics Data System (ADS)
Shao, Quanxi; Wang, You-Gan
2009-09-01
Power calculation and sample size determination are critical in designing environmental monitoring programs. The traditional approach based on comparing the mean values may become statistically inappropriate and even invalid when substantial proportions of the response values are below the detection limits or censored because strong distributional assumptions have to be made on the censored observations when implementing the traditional procedures. In this paper, we propose a quantile methodology that is robust to outliers and can also handle data with a substantial proportion of below-detection-limit observations without the need of imputing the censored values. As a demonstration, we applied the methods to a nutrient monitoring project, which is a part of the Perth Long-Term Ocean Outlet Monitoring Program. In this example, the sample size required by our quantile methodology is, in fact, smaller than that by the traditional t-test, illustrating the merit of our method.
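The key property the quantile approach exploits can be shown in a few lines: as long as the censoring fraction is below the quantile of interest, the quantile estimate is identical under any imputation rule for the below-detection-limit values, whereas the mean is not. The data below are made-up nutrient concentrations, not from the Perth monitoring program.

```python
def quantile(values, q):
    """Linear-interpolation sample quantile (values need not be sorted)."""
    s = sorted(values)
    pos = q * (len(s) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (pos - lo) * (s[hi] - s[lo])

detection_limit = 5.0
# Hypothetical concentrations; None marks a below-detection-limit reading.
raw = [None, None, None, 6.1, 7.4, None, 8.0, 9.3, 11.2, 12.5,
       None, 6.7, 7.9, 10.4, None, 13.1, 6.3, None, 9.9, 8.6]

# Two common (and arbitrary) imputation rules for censored values:
as_zero = [0.0 if v is None else v for v in raw]
as_half = [detection_limit / 2 if v is None else v for v in raw]

q80_zero = quantile(as_zero, 0.8)
q80_half = quantile(as_half, 0.8)
mean_zero = sum(as_zero) / len(as_zero)
mean_half = sum(as_half) / len(as_half)
```

With 35% of readings censored, the 80th percentile is untouched by the imputation choice while the mean shifts — so inference on an upper quantile needs no distributional assumption about the censored tail.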
Wagner, Tyler; Irwin, Brian J.; James R. Bence,; Daniel B. Hayes,
2016-01-01
Monitoring to detect temporal trends in biological and habitat indices is a critical component of fisheries management. Thus, it is important that management objectives are linked to monitoring objectives. This linkage requires a definition of what constitutes a management-relevant “temporal trend.” It is also important to develop expectations for the amount of time required to detect a trend (i.e., statistical power) and for choosing an appropriate statistical model for analysis. We provide an overview of temporal trends commonly encountered in fisheries management, review published studies that evaluated statistical power of long-term trend detection, and illustrate dynamic linear models in a Bayesian context, as an additional analytical approach focused on shorter term change. We show that monitoring programs generally have low statistical power for detecting linear temporal trends and argue that often management should be focused on different definitions of trends, some of which can be better addressed by alternative analytical approaches.
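The dynamic linear models the authors advocate are Kalman-filter state-space models; the simplest member of the family is the local-level DLM, sketched below on synthetic data (the observation and state variances are illustrative, and a full Bayesian treatment would also estimate them).

```python
import random

def local_level_filter(ys, obs_var, state_var, m0=0.0, c0=1e6):
    """Kalman filter for the local-level dynamic linear model:
        y_t  = mu_t + v_t,      v_t ~ N(0, obs_var)
        mu_t = mu_{t-1} + w_t,  w_t ~ N(0, state_var)
    Returns the sequence of filtered means of mu_t."""
    m, c = m0, c0                  # diffuse prior on the initial level
    means = []
    for y in ys:
        r = c + state_var          # predictive variance of the state
        q = r + obs_var            # predictive variance of the observation
        k = r / q                  # Kalman gain
        m = m + k * (y - m)        # filtered mean
        c = (1 - k) * r            # filtered variance
        means.append(m)
    return means

rng = random.Random(11)
true_level = [10 + 0.5 * t for t in range(30)]          # a shifting mean
ys = [mu + rng.gauss(0, 1.0) for mu in true_level]
filtered = local_level_filter(ys, obs_var=1.0, state_var=0.5)
```

Unlike a fitted linear trend, the filtered level adapts to recent change, which is why DLMs suit the shorter-term, management-relevant "trends" the paper argues for.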
NASA Astrophysics Data System (ADS)
Ladoni, Moslem; Kravchenko, Sasha
2014-05-01
Conservational agricultural managements have a potential to increase soil organic carbon sequestration. However, due to typically slow response of soil organic C to management and due to its large spatial variability many researchers find themselves failing to detect statistically significant management effects on soil organic carbon in their studies. One solution that has been commonly applied is to use active fractions of soil organic C for treatment comparisons. Active pools of soil organic C have been shown to respond to management changes faster than total C; however, it is possible that larger variability associated with these pools can make their use for treatment comparisons more difficult. The objectives of this study are to assess the variability of total C and C active pools and then to use power analysis to investigate the probability of detecting significant differences among the treatments for total C and for different active pools of C. We also explored the benefit of applying additional soil and landscape data as covariates to explain some of the variability and to enhance the statistical power for different pools of C. We collected 66 soil cores from 10 agricultural fields under three different management treatments, namely corn-soybean-wheat rotation systems with 1) conventional chemical inputs, 2) low chemical inputs with cover crops and 3) organic management with cover crops. The cores were analyzed for total organic carbon (TOC) and for two active C pool characteristics, such as particulate organic carbon (POC) and short-term mineralizable carbon (SMC). In addition, for each core we determined the values of potential covariates including soil particle size distribution, bulk density and topographical terrain attributes. Power analysis was conducted using the estimates of variances from the obtained data and a series of hypothesized management effects. The range of considered hypothesized effects consisted of 10-100% increases under low-input, 10
The profound impact of negative power law noise on statistical estimation.
Reinhardt, Victor S
2010-01-01
This paper investigates the profound impact of negative power law (neg-p) noise - that is, noise with a power spectral density L_p(f) ∝ |f|^p for p < 0 - on the ability of practical implementations of statistical estimation or fitting techniques, such as a least squares fit (LSQF) or a Kalman filter, to generate valid results. It demonstrates that such neg-p noise behaves more like systematic error than conventional noise, because neg-p noise is highly correlated, non-stationary, non-mean-ergodic, and has an infinite correlation time τ_c. It is further demonstrated that stationary but correlated noise will also cause invalid estimation behavior when the condition T > τ_c is not met, where T is the data collection interval for estimation. Thus, it is shown that neg-p noise, with its infinite τ_c, can generate anomalous estimation results for all values of T, except in certain circumstances. A covariant theory is developed explaining much of this anomalous estimation behavior. However, simulations of the estimation behavior of neg-p noise demonstrate that the subject cannot be fully understood in terms of covariant theory or mean ergodicity. It is finally conjectured that one must investigate the variance ergodicity properties of neg-p noise through the use of 4th-order correlation theory to fully explain such simulated behavior. PMID:20040429
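The non-ergodicity of neg-p noise is easy to demonstrate for the special case p = -2 (random-walk noise, the cumulative sum of white noise): the variance of the time-average grows with record length instead of shrinking. This simulation is an illustration of that one property, not of the paper's covariant theory.

```python
import random
from statistics import pvariance

def time_average_variance(n_samples, length, make_series, seed):
    """Variance, across independent realizations, of the time-average."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_samples):
        series = make_series(rng, length)
        means.append(sum(series) / length)
    return pvariance(means)

def white(rng, n):
    """White noise: flat spectrum (p = 0)."""
    return [rng.gauss(0, 1) for _ in range(n)]

def random_walk(rng, n):
    """Cumulative sum of white noise: power-law noise with p = -2."""
    out, x = [], 0.0
    for _ in range(n):
        x += rng.gauss(0, 1)
        out.append(x)
    return out

var_white_T = time_average_variance(300, 100, white, seed=1)
var_white_4T = time_average_variance(300, 400, white, seed=2)
var_walk_T = time_average_variance(300, 100, random_walk, seed=3)
var_walk_4T = time_average_variance(300, 400, random_walk, seed=4)
```

For white noise the scatter of the time-average falls as 1/T; for the p = -2 noise it grows with T, so averaging longer makes the "estimate" worse — the anomalous behavior the paper describes.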
Hacke, P.; Spataru, S.
2014-08-01
We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. Activation energy for the mean time to failure (1% relative) of 0.85 eV was determined and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
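The extrapolation step in this abstract is the standard Arrhenius acceleration model, MTTF(T) ∝ exp(Ea / kT), with the paper's Ea = 0.85 eV. The 30 h stress-test MTTF below is a hypothetical input chosen for illustration, not a value from the paper.

```python
import math

K_B_EV = 8.617e-5   # Boltzmann constant in eV/K

def arrhenius_extrapolate(mttf_stress_h, t_stress_c, t_use_c, ea_ev=0.85):
    """Extrapolate mean time to failure from a stress temperature to a use
    temperature with the Arrhenius model and activation energy ea_ev."""
    t_stress_k = t_stress_c + 273.15
    t_use_k = t_use_c + 273.15
    accel = math.exp(ea_ev / K_B_EV * (1.0 / t_use_k - 1.0 / t_stress_k))
    return mttf_stress_h * accel

# Hypothetical stress result: 30 h MTTF at 85 C, extrapolated to 25 C use.
mttf_25c = arrhenius_extrapolate(30.0, 85.0, 25.0)
```

With Ea = 0.85 eV, going from 85 C to 25 C gives an acceleration factor of a few hundred, which is why short elevated-temperature chamber tests can bound multi-year field lifetimes.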
Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz
2015-03-01
FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power.
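The a priori sample-size calculations reported here follow the standard two-sample formula, n per group ≈ 2 (z_{1-α/2} + z_{power})² / d². A minimal sketch using the normal approximation; the 15% between-subject CV used to turn a "10% difference" into an effect size is an illustrative assumption, not a value from the paper.

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, power=0.8, alpha=0.05):
    """Normal-approximation sample size per group for a two-sided,
    two-sample comparison of means (effect_size is Cohen's d)."""
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)
    z_b = nd.inv_cdf(power)
    return math.ceil(2 * ((z_a + z_b) / effect_size) ** 2)

# A 10% group difference on a measure with an assumed 15% between-subject
# CV corresponds to d = 0.10 / 0.15, roughly 0.67.
n_d067 = n_per_group(0.10 / 0.15)
n_d10 = n_per_group(1.0)
```

The quadratic dependence on 1/d is why the required N in the paper varies so much across cortical measures: measures with more between-subject spread relative to the 10% difference need far larger samples.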
NASA Astrophysics Data System (ADS)
Perles, Stephanie J.; Wagner, Tyler; Irwin, Brian J.; Manning, Douglas R.; Callahan, Kristina K.; Marshall, Matthew R.
2014-09-01
Forests are socioeconomically and ecologically important ecosystems that are exposed to a variety of natural and anthropogenic stressors. As such, monitoring forest condition and detecting temporal changes therein remain critical to sound public and private forestland management. The National Parks Service's Vital Signs monitoring program collects information on many forest health indicators, including species richness, cover by exotics, browse pressure, and forest regeneration. We applied a mixed-model approach to partition variability in data for 30 forest health indicators collected from several national parks in the eastern United States. We then used the estimated variance components in a simulation model to evaluate trend detection capabilities for each indicator. We investigated the extent to which the following factors affected ability to detect trends: (a) sample design: using simple panel versus connected panel design, (b) effect size: increasing trend magnitude, (c) sample size: varying the number of plots sampled each year, and (d) stratified sampling: post-stratifying plots into vegetation domains. Statistical power varied among indicators; however, indicators that measured the proportion of a total yielded higher power when compared to indicators that measured absolute or average values. In addition, the total variability for an indicator appeared to influence power to detect temporal trends more than how total variance was partitioned among spatial and temporal sources. Based on these analyses and the monitoring objectives of the Vital Signs program, the current sampling design is likely overly intensive for detecting a 5 % trend·year-1 for all indicators and is appropriate for detecting a 1 % trend·year-1 in most indicators.
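Jacobian integration — wait, this anchor belongs to the forest-indicators record; see instead the variance-partitioning idea it describes. A minimal sketch of the core step, partitioning total variance into spatial (among-plot) and residual (within-plot, across-year) components from a balanced plots-by-years table, using method-of-moments one-way ANOVA estimators on synthetic data (the authors used mixed models; this is the simplest balanced-design analogue):

```python
import random

def variance_components(data):
    """One-way random-effects variance components for a balanced layout.
    data[i][j] = indicator value for plot i in year j.
    Returns (among-plot variance, residual variance)."""
    n_plots = len(data)
    n_years = len(data[0])
    grand = sum(sum(row) for row in data) / (n_plots * n_years)
    plot_means = [sum(row) / n_years for row in data]
    # Mean squares from the one-way ANOVA decomposition.
    ms_among = n_years * sum((m - grand) ** 2 for m in plot_means) / (n_plots - 1)
    ms_within = sum((x - plot_means[i]) ** 2
                    for i, row in enumerate(data)
                    for x in row) / (n_plots * (n_years - 1))
    var_plot = max(0.0, (ms_among - ms_within) / n_years)
    return var_plot, ms_within

rng = random.Random(9)
# Synthetic indicator: true plot effects sd = 2, residual sd = 1.
data = [[10 + plot_eff + rng.gauss(0, 1.0) for _ in range(8)]
        for plot_eff in (rng.gauss(0, 2.0) for _ in range(40))]
var_plot, var_resid = variance_components(data)
```

As the paper notes, it is the total variability (and especially the year-to-year component) that limits trend-detection power, so estimating these components is the first step of the simulation they describe.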
Nakamura, Kunio; Guizard, Nicolas; Fonov, Vladimir S.; Narayanan, Sridar; Collins, D. Louis; Arnold, Douglas L.
2013-01-01
Gray matter atrophy provides important insights into neurodegeneration in multiple sclerosis (MS) and can be used as a marker of neuroprotection in clinical trials. Jacobian integration is a method for measuring volume change that uses integration of the local Jacobian determinants of the nonlinear deformation field registering two images, and is a promising tool for measuring gray matter atrophy. Our main objective was to compare the statistical power of the Jacobian integration method to commonly used methods in terms of the sample size required to detect a treatment effect on gray matter atrophy. We used multi-center longitudinal data from relapsing–remitting MS patients and evaluated combinations of cross-sectional and longitudinal pre-processing with SIENAX/FSL, SPM, and FreeSurfer, as well as the Jacobian integration method. The Jacobian integration method outperformed these other commonly used methods, reducing the required sample size by a factor of 4–5. The results demonstrate the advantage of using the Jacobian integration method to assess neuroprotection in MS clinical trials. PMID:24266007
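Jacobian integration measures volume change by integrating the local Jacobian determinant of the deformation field that registers the two images. A minimal 2-D sketch on a synthetic deformation (a uniform 10% expansion in x), with central differences on a grid — the real method operates on 3-D nonlinear registration output:

```python
def jacobian_integration_area(phi, nx, ny, h=1.0):
    """Integrate local Jacobian determinants of a 2-D deformation field
    phi[(i, j)] = (x, y) over the interior of an nx-by-ny grid.
    Central differences; h is the grid spacing."""
    total = 0.0
    for i in range(1, nx - 1):
        for j in range(1, ny - 1):
            dx_di = (phi[(i + 1, j)][0] - phi[(i - 1, j)][0]) / (2 * h)
            dy_di = (phi[(i + 1, j)][1] - phi[(i - 1, j)][1]) / (2 * h)
            dx_dj = (phi[(i, j + 1)][0] - phi[(i, j - 1)][0]) / (2 * h)
            dy_dj = (phi[(i, j + 1)][1] - phi[(i, j - 1)][1]) / (2 * h)
            total += dx_di * dy_dj - dy_di * dx_dj   # local det J
    return total * h * h

# Uniform 10% expansion in x: det J = 1.1 everywhere, so integrating over
# the interior cells should report exactly a 10% area increase.
nx = ny = 20
phi = {(i, j): (1.1 * i, 1.0 * j) for i in range(nx) for j in range(ny)}
interior_cells = (nx - 2) * (ny - 2)
area_ratio = jacobian_integration_area(phi, nx, ny) / interior_cells
```

Summing det J over a region gives that region's volume in the second image directly, which avoids the segmentation noise that inflates variance in the other pipelines the paper compares against.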
Socol, Yehoshua; Dobrzyński, Ludwik
2015-01-01
The atomic bomb survivors life-span study (LSS) is often claimed to support the linear no-threshold hypothesis (LNTH) of radiation carcinogenesis. This paper shows that this claim is baseless. The LSS data are equally or better described by an s-shaped dependence on radiation exposure with a threshold of about 0.3 Sievert (Sv) and saturation level at about 1.5 Sv. A Monte-Carlo simulation of possible LSS outcomes demonstrates that, given the weak statistical power, LSS cannot provide support for LNTH. Even if the LNTH is used at low dose and dose rates, its estimation of excess cancer mortality should be communicated as 2.5% per Sv, i.e., an increase of cancer mortality from about 20% spontaneous mortality to about 22.5% per Sv, which is about half of the usually cited value. The impact of the "neutron discrepancy problem" - the apparent difference between the calculated and measured values of neutron flux in Hiroshima - was studied and found to be marginal. Major revision of the radiation risk assessment paradigm is required.
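The two dose-response shapes being compared can be written down directly. The threshold (0.3 Sv), saturation (1.5 Sv), and 2.5%/Sv LNT slope are the abstract's numbers; the piecewise-linear form and the 3% saturated excess are illustrative stand-ins for the paper's fitted s-shaped curve.

```python
def excess_mortality_s_shaped(dose_sv, threshold=0.3, saturation=1.5,
                              max_excess=0.03):
    """Piecewise-linear stand-in for an s-shaped dose response: zero excess
    below the threshold, rising linearly, flat above saturation."""
    if dose_sv <= threshold:
        return 0.0
    if dose_sv >= saturation:
        return max_excess
    return max_excess * (dose_sv - threshold) / (saturation - threshold)

def excess_mortality_lnt(dose_sv, slope=0.025):
    """Linear no-threshold: 2.5% excess cancer mortality per Sv."""
    return slope * dose_sv
```

The qualitative contrast the paper draws is visible immediately: the two models agree at high dose but diverge completely below the threshold, which is exactly where the LSS data lack the statistical power to discriminate.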
Statistics of the epoch of reionization 21-cm signal - I. Power spectrum error-covariance
NASA Astrophysics Data System (ADS)
Mondal, Rajesh; Bharadwaj, Somnath; Majumdar, Suman
2016-02-01
The non-Gaussian nature of the epoch of reionization (EoR) 21-cm signal has a significant impact on the error variance of its power spectrum P(k). We have used a large ensemble of seminumerical simulations and an analytical model to estimate the effect of this non-Gaussianity on the entire error-covariance matrix C_ij. Our analytical model shows that C_ij has contributions from two sources. One is the usual variance for a Gaussian random field, which scales inversely with the number of modes that go into the estimation of P(k). The other is the trispectrum of the signal. Using the simulated 21-cm Signal Ensemble, an ensemble of the Randomized Signal and Ensembles of Gaussian Random Ensembles we have quantified the effect of the trispectrum on the error variance C_ii. We find that its relative contribution is comparable to or larger than that of the Gaussian term for the k range 0.3 ≤ k ≤ 1.0 Mpc⁻¹, and can be even ~200 times larger at k ~ 5 Mpc⁻¹. We also establish that the off-diagonal terms of C_ij have statistically significant non-zero values which arise purely from the trispectrum. This further signifies that the errors in different k modes are not independent. We find a strong correlation between the errors at large k values (≥0.5 Mpc⁻¹), and a weak correlation between the smallest and largest k values. There is also a small anticorrelation between the errors in the smallest and intermediate k values. These results are relevant for the k range that will be probed by the current and upcoming EoR 21-cm experiments.
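The Gaussian part of the error budget — variance of the P(k) estimator scaling inversely with the number of Fourier modes in the bin — can be checked with a toy Monte Carlo. This sketch covers only the Gaussian term; the paper's point is that the trispectrum adds a second term this model omits.

```python
import random
from statistics import pvariance

def power_estimates(n_modes, n_realizations, p_true=1.0, seed=2):
    """Estimate P(k) as the average of |delta_k|^2 over n_modes independent
    complex Gaussian modes; repeat to measure the estimator's scatter."""
    rng = random.Random(seed)
    sigma = (p_true / 2.0) ** 0.5   # so E[re^2 + im^2] = p_true
    estimates = []
    for _ in range(n_realizations):
        total = 0.0
        for _ in range(n_modes):
            re = rng.gauss(0.0, sigma)
            im = rng.gauss(0.0, sigma)
            total += re * re + im * im
        estimates.append(total / n_modes)
    return estimates

var_few = pvariance(power_estimates(50, 400, seed=2))
var_many = pvariance(power_estimates(200, 400, seed=3))
```

For a Gaussian field the variance is P²/N_modes, so quadrupling the modes should cut the scatter about fourfold; any excess over this floor in a real signal is the trispectrum contribution C_ii quantifies.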
Schroeder, Carl B.; Fawley, William M.; Esarey, Eric
2002-09-24
We investigate the statistical properties (e.g., shot-to-shot power fluctuations) of the radiation from a high-gain free-electron laser (FEL) operating in the nonlinear regime. We consider the case of an FEL amplifier reaching saturation whose shot-to-shot fluctuations in input radiation power follow a gamma distribution. We analyze the corresponding output power fluctuations at and beyond first saturation, including beam energy spread effects, and find that there are well-characterized values of undulator length for which the fluctuation level reaches a minimum.
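The gamma-distributed input power assumed in this analysis is the standard SASE start-up statistics: with M coherent modes, shot power follows a gamma distribution with shape M, so the relative fluctuation is 1/sqrt(M). A quick check with stdlib sampling (the mode count 4 is illustrative):

```python
import random
from statistics import mean, pstdev

def sase_shot_powers(n_shots, m_modes, mean_power=1.0, seed=5):
    """Shot-to-shot SASE start-up power: gamma-distributed with shape
    M (number of coherent modes), so sigma/mean = 1/sqrt(M)."""
    rng = random.Random(seed)
    scale = mean_power / m_modes     # gamma mean = shape * scale
    return [rng.gammavariate(m_modes, scale) for _ in range(n_shots)]

powers = sase_shot_powers(5000, m_modes=4)
rel_fluct = pstdev(powers) / mean(powers)   # should sit near 1/sqrt(4) = 0.5
```

The paper's question is how the saturating FEL maps this input distribution to the output: saturation compresses the fluctuations, with a minimum at well-chosen undulator lengths.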
ERIC Educational Resources Information Center
Texeira, Antonio; Rosa, Alvaro; Calapez, Teresa
2009-01-01
This article presents statistical power analysis (SPA) based on the normal distribution using Excel, adopting textbook and SPA approaches. The objective is to present the latter in a comparative way within a framework that is familiar to textbook level readers, as a first step to understand SPA with other distributions. The analysis focuses on the…
Gaskin, Cadeyrn J; Happell, Brenda
2013-02-01
Having sufficient power to detect effect sizes of an expected magnitude is a core consideration when designing studies in which inferential statistics will be used. The main aim of this study was to investigate the statistical power in studies published in the International Journal of Mental Health Nursing. From volumes 19 (2010) and 20 (2011) of the journal, studies were analysed for their power to detect small, medium, and large effect sizes, according to Cohen's guidelines. The power of the 23 studies included in this review to detect small, medium, and large effects was 0.34, 0.79, and 0.94, respectively. In 90% of papers, no adjustments for experiment-wise error were reported. With a median of nine inferential tests per paper, the mean experiment-wise error rate was 0.51. A priori power analyses were only reported in 17% of studies. Although effect sizes for correlations and regressions were routinely reported, effect sizes for other tests (χ²-tests, t-tests, ANOVA/MANOVA) were largely absent from the papers. All types of effect sizes were infrequently interpreted. Researchers are strongly encouraged to conduct power analyses when designing studies, and to avoid scattergun approaches to data analysis (i.e. undertaking large numbers of tests in the hope of finding 'significant' results). Because reviewing effect sizes is essential for determining the clinical significance of study findings, researchers would better serve the field of mental health nursing if they reported and interpreted effect sizes.
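Power figures of this kind for Cohen's small, medium, and large effects can be reproduced with a short script. A minimal sketch using the normal approximation to the two-sided, two-sample t-test (stdlib only; the exact noncentral-t calculation used by dedicated power software would give slightly different numbers):

```python
from statistics import NormalDist

def two_sample_power(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided, two-sample t-test.
    Under the alternative, the test statistic is roughly
    N(d * sqrt(n/2), 1), where d is Cohen's d."""
    nd = NormalDist()
    z = nd.inv_cdf(1 - alpha / 2)           # two-sided critical value, ~1.96
    ncp = d * (n_per_group / 2) ** 0.5      # noncentrality parameter
    # reject when |Z| > z: sum both tail probabilities
    return (1 - nd.cdf(z - ncp)) + nd.cdf(-z - ncp)

# Cohen's small/medium/large effects with 64 participants per group
for d in (0.2, 0.5, 0.8):
    print(f"d = {d}: power ~ {two_sample_power(d, 64):.2f}")
```

With 64 per group, a medium effect (d = 0.5) lands near the conventional 0.80 power target, while a small effect (d = 0.2) is badly underpowered.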
Asbestos/NESHAP adequately wet guidance
Shafer, R.; Throwe, S.; Salgado, O.; Garlow, C.; Hoerath, E.
1990-12-01
The Asbestos NESHAP requires facility owners and/or operators involved in demolition and renovation activities to control emissions of particulate asbestos to the outside air, because no safe concentration of airborne asbestos has ever been established. The primary method used to control asbestos emissions is to adequately wet the asbestos-containing material (ACM) with a wetting agent prior to, during, and after demolition/renovation activities. The purpose of this document is to provide guidance to asbestos inspectors and the regulated community on how to determine whether friable ACM is adequately wet, as required by the Asbestos NESHAP.
Chen, W M; Deng, H W
2001-07-01
The transmission disequilibrium test (TDT) is a nuclear-family-based analysis that can test for linkage in the presence of association. It has gained extensive attention in theoretical investigation and in practical application; in both cases, the accuracy and generality of the power computation of the TDT are crucial. Despite extensive investigation, previous approaches for computing the statistical power of the TDT are neither accurate nor general. In this paper, we develop a general and highly accurate approach to analytically compute the power of the TDT. We compare the results from our approach with those from several other recent papers, all against results obtained from computer simulations. We show that the results computed from our approach are more accurate than, or at least the same as, those from other approaches. More importantly, our approach can handle various situations, including (1) families that consist of one or more children and that have any configuration of affected and nonaffected sibs; (2) families ascertained through the affection status of parent(s); (3) any mixed sample with different types of families in (1) and (2); (4) cases where the marker locus is not a disease-susceptibility locus; and (5) the existence of allelic heterogeneity. We implement this approach in a user-friendly computer program, the TDT Power Calculator, and demonstrate its applications. The approach and program developed here should be significant both for theoreticians accurately investigating the statistical power of the TDT in various situations and for empirical geneticists planning efficient studies using the TDT.
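For context, the classical textbook approximation that such papers improve upon can be sketched in a few lines. This is the standard normal approximation to the binomial transmission count, not the paper's more general method; the parameter values are illustrative:

```python
from statistics import NormalDist

def tdt_power(p, n, alpha=0.05):
    """Approximate power of the TDT given n informative transmissions
    from heterozygous parents and transmission probability p under the
    alternative. The transmission count b ~ Binomial(n, p), and the TDT
    statistic (b - c)^2 / (b + c) corresponds to Z = (2b - n)/sqrt(n)."""
    nd = NormalDist()
    z = nd.inv_cdf(1 - alpha / 2)
    mu = (2 * p - 1) * n ** 0.5         # mean of Z under the alternative
    sigma = 2 * (p * (1 - p)) ** 0.5    # its standard deviation
    return (1 - nd.cdf((z - mu) / sigma)) + nd.cdf((-z - mu) / sigma)

print(f"power with 200 transmissions, p = 0.6: {tdt_power(0.6, 200):.2f}")
```

Under the null (p = 0.5) the formula collapses to the nominal α, a quick sanity check on the approximation.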
ERIC Educational Resources Information Center
Spybrook, Jessaca; Hedges, Larry; Borenstein, Michael
2014-01-01
Research designs in which clusters are the unit of randomization are quite common in the social sciences. Given the multilevel nature of these studies, the power analyses for these studies are more complex than in a simple individually randomized trial. Tools are now available to help researchers conduct power analyses for cluster randomized…
The Power of Student's t and Wilcoxon W Statistics: A Comparison.
ERIC Educational Resources Information Center
Rasmussen, Jeffrey Lee
1985-01-01
A recent study (Blair and Higgins, 1980) indicated a power advantage for the Wilcoxon W test over Student's t-test when calculated from a common mixed-normal sample. Results of the present study indicate that the t-test corrected for outliers shows a superior power curve to the Wilcoxon W.
ERIC Educational Resources Information Center
Jiang, Depeng; Pepler, Debra; Yao, Hongxing
2010-01-01
Do interventions work and for whom? For this article, we examined the influence of population heterogeneity on power in designing and evaluating interventions. On the basis of Monte Carlo simulations in Study 1, we demonstrated that the power to detect the overall intervention effect is lower for a mixture of two subpopulations than for a…
Supervision of Student Teachers: How Adequate?
ERIC Educational Resources Information Center
Dean, Ken
This study attempted to ascertain how adequately student teachers are supervised by college supervisors and supervising teachers. Questions to be answered were as follows: a) How do student teachers rate the adequacy of supervision given them by college supervisors and supervising teachers? and b) Are there significant differences between ratings…
Small Rural Schools CAN Have Adequate Curriculums.
ERIC Educational Resources Information Center
Loustaunau, Martha
The small rural school's foremost and largest problem is providing an adequate curriculum for students in a changing world. Often the small district cannot or is not willing to pay the per-pupil cost of curriculum specialists, specialized courses using expensive equipment no more than one period a day, and remodeled rooms to accommodate new…
Toward More Adequate Quantitative Instructional Research.
ERIC Educational Resources Information Center
VanSickle, Ronald L.
1986-01-01
Sets an agenda for improving instructional research conducted with classical quantitative experimental or quasi-experimental methodology. Includes guidelines regarding the role of a social perspective, adequate conceptual and operational definition, quality instrumentation, control of threats to internal and external validity, and the use of…
An Adequate Education Defined. Fastback 476.
ERIC Educational Resources Information Center
Thomas, M. Donald; Davis, E. E. (Gene)
Court decisions historically have dealt with educational equity; now they are helping to establish "adequacy" as a standard in education. Legislatures, however, have been slow to enact remedies. One debate over education adequacy, though, is settled: Schools are not financed at an adequate level. This fastback is divided into three sections.…
Funding the Formula Adequately in Oklahoma
ERIC Educational Resources Information Center
Hancock, Kenneth
2015-01-01
This report is a longitudinal, simulation-based study that looks at how the ratio of state support to local support affects the number of school districts that break the common school funding formula, which in turn affects the equity of distribution to the common schools. After nearly two decades of adequately supporting the funding formula, Oklahoma…
NASA Astrophysics Data System (ADS)
Chung, Moo K.; Kim, Seung-Goo; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matthew J.; Davidson, Richard J.
2014-03-01
The sparse regression framework has been widely used in medical image processing and analysis. However, it has been rarely used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high-frequency noise, only the first few terms are used in the expansion and higher-frequency terms are simply thrown away. However, some lower-frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we present an LB-based method that filters out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which reduces the error of incorrectly detecting false negatives. Hence the statistical power improves. The sparse shape model is then applied in investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power.
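When the basis is orthonormal, as the LB eigenfunctions are, an L1 (sparse) penalty on the expansion coefficients decouples into simple per-coefficient soft-thresholding. A minimal one-dimensional sketch with hypothetical coefficients (not the paper's surface data):

```python
def soft_threshold(c, lam):
    """Solution of argmin_b 0.5*(c - b)**2 + lam*|b|:
    the form an L1 penalty takes for each coefficient when the
    basis (e.g. Laplace-Beltrami eigenfunctions) is orthonormal."""
    if c > lam:
        return c - lam
    if c < -lam:
        return c + lam
    return 0.0

# Hypothetical expansion coefficients: a few large terms carry signal,
# small noisy terms get zeroed out rather than truncated by frequency.
coeffs = [5.0, -3.2, 0.1, 2.4, -0.05, 0.2]
sparse = [soft_threshold(c, lam=0.5) for c in coeffs]
print(sparse)  # small terms -> 0.0, large terms shrunk toward zero
```

Unlike a hard frequency cutoff, thresholding keeps any term, at any frequency, whose coefficient is large enough to matter.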
Bellan, Steven E.; Pulliam, Juliet R. C.; Pearson, Carl A. B.; Champredon, David; Fox, Spencer J.; Skrip, Laura; Galvani, Alison P.; Gambhir, Manoj; Lopman, Ben A.; Porco, Travis C.; Meyers, Lauren Ancel; Dushoff, Jonathan
2016-01-01
Background: Safe and effective vaccines may help end the ongoing Ebola virus disease (EVD) epidemic in West Africa, and mitigate future outbreaks. We evaluate the statistical validity and power of randomized controlled trial (RCT) and stepped-wedge cluster trial (SWCT) designs in Sierra Leone, where EVD incidence is spatiotemporally heterogeneous and rapidly declining. Methods: We forecasted district-level EVD incidence over the next six months using a stochastic model fit to data from Sierra Leone. We then simulated RCT and SWCT designs in trial populations comprising geographically distinct clusters at high risk, taking into account realistic logistical constraints as well as both individual-level and cluster-level variation in risk. We assessed false positive rates and power for parametric and nonparametric analyses of simulated trial data, across a range of vaccine efficacies and trial start dates. Findings: For an SWCT, regional variation in EVD incidence trends produced inflated false positive rates (up to 0.11 at α = 0.05) under standard statistical models, but not when analyzed by a permutation test, whereas all analyses of RCTs remained valid. Assuming a six-month trial starting February 18, 2015, we estimate the power to detect a 90% efficacious vaccine to be between 48% and 89% for an RCT and between 6.4% and 26% for an SWCT, depending on incidence within the trial population. We estimate that a one-month delay in implementation will reduce the power of the RCT and SWCT by 20% and 49%, respectively. Interpretation: Spatiotemporal variation in infection risk undermines the SWCT's statistical power. This variation also undercuts the SWCT's expected ethical advantages over the RCT, because the latter but not the former can prioritize high-risk clusters. Funding: US National Institutes of Health, US National Science Foundation, Canadian Institutes of Health Research. PMID: 25886798
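The permutation analysis that keeps the false positive rate valid for cluster designs can be illustrated with an exact cluster-level permutation test. A small sketch with hypothetical attack rates (not the trial's data); with few clusters, all reassignments can be enumerated:

```python
from itertools import combinations

def cluster_permutation_pvalue(cluster_rates, treated_idx):
    """Exact permutation test for a cluster-level treatment effect:
    enumerate every way the treated label could have been assigned and
    compare each difference-in-means to the observed one."""
    n = len(cluster_rates)
    k = len(treated_idx)

    def diff(idx):
        t = [cluster_rates[i] for i in idx]
        c = [cluster_rates[i] for i in range(n) if i not in idx]
        return sum(t) / len(t) - sum(c) / len(c)

    observed = abs(diff(set(treated_idx)))
    perms = [abs(diff(set(idx))) for idx in combinations(range(n), k)]
    # two-sided p-value: fraction of reassignments at least as extreme
    return sum(p >= observed - 1e-12 for p in perms) / len(perms)

# Hypothetical attack rates in 8 clusters; the first 4 were vaccinated
rates = [0.02, 0.01, 0.03, 0.02, 0.08, 0.07, 0.09, 0.06]
print(cluster_permutation_pvalue(rates, treated_idx=[0, 1, 2, 3]))  # 2/70
```

Because the reference distribution is generated by relabeling clusters, the test makes no parametric assumption about how incidence varies across regions, which is why it stays valid where model-based analyses do not.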
Two universal physical principles shape the power-law statistics of real-world networks.
Lorimer, Tom; Gomez, Florian; Stoop, Ruedi
2015-01-01
The study of complex networks has pursued an understanding of macroscopic behaviour by focusing on power-laws in microscopic observables. Here, we uncover two universal fundamental physical principles that are at the basis of complex network generation. These principles together predict the generic emergence of deviations from ideal power laws, which were previously discussed away by reference to the thermodynamic limit. Our approach proposes a paradigm shift in the physics of complex networks, toward the use of power-law deviations to infer meso-scale structure from macroscopic observations.
Blinking in quantum dots: The origin of the grey state and power law statistics.
Ye, Mao; Searson, Peter C
2011-09-16
Quantum dot (QD) blinking is characterized by switching between an "on" state and an "off" state, and a power-law distribution of on and off times with exponents from 1.0 to 2.0. The origin of blinking behavior in QDs, however, has remained a mystery. Here we describe an energy-band model for QDs that captures the full range of blinking behavior reported in the literature and provides new insight into features such as the gray state, the power-law distribution of on and off times, and the power-law exponents.
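Power-law on/off-time distributions of the kind described here can be simulated by inverse-CDF sampling, and the exponent recovered with the standard continuous maximum-likelihood (Hill) estimator. A stdlib-only sketch; the exponent, cutoff, and sample size are illustrative:

```python
import math
import random

def sample_power_law(alpha, t_min, n, rng):
    """Inverse-CDF sampling from p(t) ~ t^(-alpha) for t >= t_min
    (valid for alpha > 1), as used to model blinking on/off times."""
    return [t_min * rng.random() ** (-1.0 / (alpha - 1.0)) for _ in range(n)]

def mle_exponent(times, t_min):
    """Continuous maximum-likelihood (Hill) estimate of the exponent:
    alpha_hat = 1 + n / sum(ln(t_i / t_min))."""
    s = sum(math.log(t / t_min) for t in times)
    return 1.0 + len(times) / s

rng = random.Random(42)
times = sample_power_law(alpha=1.5, t_min=1.0, n=20000, rng=rng)
print(mle_exponent(times, t_min=1.0))  # close to the true exponent 1.5
```

Fitting by MLE rather than by regressing a log-log histogram avoids the well-known binning biases that plague empirical power-law exponents.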
NASA Astrophysics Data System (ADS)
Białous, Małgorzata; Yunko, Vitalii; Bauch, Szymon; Ławniczak, Michał; Dietz, Barbara; Sirko, Leszek
2016-09-01
We present experimental studies of the power spectrum and other fluctuation properties in the spectra of microwave networks simulating chaotic quantum graphs with violated time reversal invariance. On the basis of our data sets, we demonstrate that the power spectrum in combination with other long-range and also short-range spectral fluctuations provides a powerful tool for the identification of the symmetries and the determination of the fraction of missing levels. Such a procedure is indispensable for the evaluation of the fluctuation properties in the spectra of real physical systems like, e.g., nuclei or molecules, where one has to deal with the problem of missing levels.
ERIC Educational Resources Information Center
Porter, Kristin E.
2016-01-01
In education research and in many other fields, researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple…
ERIC Educational Resources Information Center
Endress, Ansgar D.; Mehler, Jacques
2009-01-01
Word-segmentation, that is, the extraction of words from fluent speech, is one of the first problems language learners have to master. It is generally believed that statistical processes, in particular those tracking "transitional probabilities" (TPs), are important to word-segmentation. However, there is evidence that word forms are stored in…
Modern Robust Statistical Methods: An Easy Way to Maximize the Accuracy and Power of Your Research
ERIC Educational Resources Information Center
Erceg-Hurn, David M.; Mirosevich, Vikki M.
2008-01-01
Classic parametric statistical significance tests, such as analysis of variance and least squares regression, are widely used by researchers in many disciplines, including psychology. For classic parametric tests to produce accurate results, the assumptions underlying them (e.g., normality and homoscedasticity) must be satisfied. These assumptions…
ERIC Educational Resources Information Center
Groth, Randall E.
2013-01-01
A hypothetical framework to characterize statistical knowledge for teaching (SKT) is described. Empirical grounding for the framework is provided by artifacts from an undergraduate course for prospective teachers that concentrated on the development of SKT. The theoretical notion of "key developmental understanding" (KDU) is used to identify…
Dolan, T E; Lynch, P D; Karazsia, J L; Serafy, J E
2016-03-01
An expansion is underway of a nuclear power plant on the shoreline of Biscayne Bay, Florida, USA. While the precise effects of its construction and operation are unknown, impacts on surrounding marine habitats and biota are considered by experts to be likely. The objective of the present study was to determine the adequacy of an ongoing monitoring survey of fish communities associated with mangrove habitats directly adjacent to the power plant to detect fish community changes, should they occur, at three spatial scales. Using seasonally resolved data recorded during 532 fish surveys over an 8-year period, power analyses were performed for four mangrove fish metrics: fish diversity, fish density, and the occurrence of two ecologically important fish species, gray snapper (Lutjanus griseus) and goldspotted killifish (Floridichthys carpio). Results indicated that the monitoring program at current sampling intensity allows for detection of <33% changes in fish density and diversity metrics in both the wet and the dry season in the two larger study areas. Sampling effort was found to be insufficient in either season to detect changes at this level (<33%) in species-specific occurrence metrics for the two fish species examined. The option of supplementing ongoing biological monitoring programs for improved, focused change detection deserves consideration from both ecological and cost-benefit perspectives. PMID: 26903208
Tang, Jau; Marcus, R A
2005-09-01
A mechanism involving diffusion-controlled electron transfer processes in Debye and non-Debye dielectric media is proposed to elucidate the power-law distribution for the lifetime of a blinking quantum dot. This model leads to two complementary regimes of power law with a sum of the exponents equal to 2, and to a specific value for the exponent in terms of a distribution of the diffusion correlation times. It also links the exponential bending tail with energetic and kinetic parameters.
Konstantopoulos, Spyros
2012-06-18
Field experiments with nested structures are becoming increasingly common, especially designs that randomly assign entire clusters, such as schools, to a treatment and a control group. In such large-scale cluster randomized studies the challenge is to obtain sufficient power for the test of the treatment effect. The objective is to maximize power without adding many clusters, which would make the study much more expensive. In this article I discuss how power estimates of tests of treatment effects in balanced cluster randomized designs are affected by covariates at different levels. I use third-grade data from Project STAR, a field experiment on class size, to demonstrate how covariates that explain a considerable proportion of variance in outcomes increase power significantly. When lower-level covariates are group-mean centered and clustering effects are larger, top-level covariates increase power more than lower-level covariates. In contrast, when clustering effects are smaller and lower-level covariates are grand-mean centered or uncentered, lower-level covariates increase power more than top-level covariates.
A statistical framework for genetic association studies of power curves in bird flight.
Lin, Min; Zhao, Wei; Wu, Rongling
2006-01-01
How the power required for bird flight varies as a function of forward speed can be used to predict the flight style and behavioral strategy of a bird for feeding and migration. A U-shaped relationship between power and flight velocity has been observed in many birds, consistent with the theoretical prediction of aerodynamic models. In this article, we present a general genetic model for fine mapping of quantitative trait loci (QTL) responsible for power curves in a sample of birds drawn from a natural population. This model is developed within the maximum likelihood context, implemented with the EM algorithm for estimating the population genetic parameters of QTL and the simplex algorithm for estimating the QTL genotype-specific parameters of power curves. Using Monte Carlo simulation derived from empirical observations of power curves in the European starling (Sturnus vulgaris), we demonstrate how the underlying QTL for power curves can be detected from molecular markers and how the detected QTL affect the most appropriate flight speeds used to design an optimal migration strategy. The results from our model can be directly integrated into a conceptual framework for understanding the origin and evolution of flight.
Sensitivity of neutrinos to the supernova turbulence power spectrum: Point source statistics
Kneller, James P.; Kabadi, Neel V.
2015-07-16
The neutrinos emitted from the proto-neutron star created in a core-collapse supernova must run through a significant amount of turbulence before exiting the star. Turbulence can modify the flavor evolution of the neutrinos imprinting itself upon the signal detected here at Earth. The turbulence effect upon individual neutrinos, and the correlation between pairs of neutrinos, might exhibit sensitivity to the power spectrum of the turbulence, and recent analysis of the turbulence in a two-dimensional hydrodynamical simulation of a core-collapse supernova indicates the power spectrum may not be the Kolmogorov 5 /3 inverse power law as has been previously assumed. In this paper we study the effect of non-Kolmogorov turbulence power spectra upon neutrinos from a point source as a function of neutrino energy and turbulence amplitude at a fixed postbounce epoch. We find the two effects of turbulence upon the neutrinos—the distorted phase effect and the stimulated transitions—both possess strong and weak limits in which dependence upon the power spectrum is absent or evident, respectively. Furthermore, since neutrinos of a given energy will exhibit these two effects at different epochs of the supernova each with evolving strength, we find there is sensitivity to the power spectrum present in the neutrino burst signal from a Galactic supernova.
NASA Technical Reports Server (NTRS)
Smith, Wayne Farrior
1973-01-01
The effect of finite source size on the power statistics in a reverberant room for pure tone excitation was investigated. Theoretical results indicate that the standard deviation of low frequency, pure tone finite sources is always less than that predicted by point source theory and considerably less when the source dimension approaches one-half an acoustic wavelength or greater. A supporting experimental study was conducted utilizing an eight inch loudspeaker and a 30 inch loudspeaker at eleven source positions. The resulting standard deviation of sound power output of the smaller speaker is in excellent agreement with both the derived finite source theory and existing point source theory, if the theoretical data is adjusted to account for experimental incomplete spatial averaging. However, the standard deviation of sound power output of the larger speaker is measurably lower than point source theory indicates, but is in good agreement with the finite source theory.
NASA Astrophysics Data System (ADS)
Woolley, Thomas W.; Dawson, George O.
It has been two decades since the first power analysis of a psychological journal and 10 years since the Journal of Research in Science Teaching made its contribution to this debate. One purpose of this article is to investigate what power-related changes, if any, have occurred in science education research over the past decade as a result of the earlier survey. In addition, previous recommendations are expanded and expounded upon within the context of more recent work in this area. The absence of any consistent mode of presenting statistical results, as well as little change with regard to power-related issues, is reported. Guidelines for reporting the minimal amount of information demanded for clear and independent evaluation of research results by readers are also proposed.
NASA Astrophysics Data System (ADS)
Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J. J.; Bonaldi, A.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R. C.; Cardoso, J.-F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Chiang, L.-Y.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Comis, B.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Da Silva, A.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dolag, K.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Flores-Cacho, I.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Ganga, K.; Génova-Santos, R. T.; Giard, M.; Giardino, G.; Giraud-Héraud, Y.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Hansen, F. K.; Hanson, D.; Harrison, D.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lacasa, F.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Laureijs, R. J.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marcos-Caballero, A.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Melchiorri, A.; Melin, J.-B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Osborne, S.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Starck, J.-L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L. A.; Wandelt, B. D.; White, S. D. M.; Yvon, D.; Zacchei, A.; Zonca, A.
2014-11-01
We have constructed the first all-sky map of the thermal Sunyaev-Zeldovich (tSZ) effect by applying specifically tailored component separation algorithms to the 100 to 857 GHz frequency channel maps from the Planck survey. This map shows an obvious galaxy cluster tSZ signal that is well matched with blindly detected clusters in the Planck SZ catalogue. To characterize the signal in the tSZ map we have computed its angular power spectrum. At large angular scales (ℓ < 60), the major foreground contaminant is the diffuse thermal dust emission. At small angular scales (ℓ > 500) the clustered cosmic infrared background and residual point sources are the major contaminants. These foregrounds are carefully modelled and subtracted. We thus measure the tSZ power spectrum over angular scales 0.17° ≲ θ ≲ 3.0° that were previously unexplored. The measured tSZ power spectrum is consistent with that expected from the Planck catalogue of SZ sources, with clear evidence of additional signal from unresolved clusters and, potentially, diffuse warm baryons. Marginalized band-powers of the Planck tSZ power spectrum and the best-fit model are given. The non-Gaussianity of the Compton parameter map is further characterized by computing its 1D probability distribution function and its bispectrum. The measured tSZ power spectrum and high order statistics are used to place constraints on σ8.
Statistical Characterization of Solar Photovoltaic Power Variability at Small Timescales: Preprint
Shedd, S.; Hodge, B.-M.; Florita, A.; Orwig, K.
2012-08-01
Integrating large amounts of variable and uncertain solar photovoltaic power into the electricity grid is a growing concern for power system operators in a number of different regions. Power system operators typically accommodate variability, whether from load, wind, or solar, by carrying reserves that can quickly change their output to match the changes in the solar resource. At timescales in the seconds-to-minutes range, this is known as regulation reserve. Previous studies have shown that increasing the geographic diversity of solar resources can reduce the short-term variability of the power output. As the price of solar has decreased, very large PV plants (greater than 10 MW) have become more common. These plants present an interesting case because they are large enough to exhibit some spatial smoothing by themselves. This work examines the variability of solar PV output among different arrays in a large (≈50 MW) PV plant in the western United States, including the correlation in power output changes between different arrays, as well as the aggregated plant output, at timescales ranging from one second to five minutes.
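The spatial-smoothing effect described in this abstract is easy to illustrate with a toy simulation (all numbers below are invented, not the plant's data): independent second-scale fluctuations across arrays average out in the aggregate output, while the shared slow irradiance signal survives.

```python
import numpy as np

def ramp_std(power, step=1):
    """Standard deviation of step-to-step power changes."""
    return np.diff(power[::step]).std()

rng = np.random.default_rng(42)
t = np.arange(10_000)

# Hypothetical plant: 20 arrays share a slow irradiance signal but
# have independent second-scale fluctuations (all numbers invented).
slow = np.sin(2 * np.pi * t / 5000)
arrays = [slow + 0.3 * rng.normal(size=t.size) for _ in range(20)]
plant = np.mean(arrays, axis=0)

single_ramp = ramp_std(arrays[0])   # one array: noisy ramps
plant_ramp = ramp_std(plant)        # aggregate: smoothed ramps
```

With 20 arrays the short-timescale ramp magnitude drops by roughly a factor of sqrt(20) relative to a single array, while the slow diurnal-style component is unchanged.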
The power of 41%: A glimpse into the life of a statistic.
Tanis, Justin
2016-01-01
"Forty-one percent?" the man said with anguish on his face as he addressed the author, clutching the author's handout. "We're talking about my granddaughter here." He was referring to the finding from the National Transgender Discrimination Survey (NTDS) that 41% of 6,450 respondents said they had attempted suicide at some point in their lives. The author had passed out the executive summary of the survey's findings during a panel discussion at a family conference to illustrate the critical importance of acceptance of transgender people. During the question and answer period, this gentleman rose to talk about his beloved 8-year-old granddaughter, who was in the process of transitioning socially from male to female in her elementary school. The statistics the author was citing were not just numbers to him; he wanted strategies, effective ones, to keep his granddaughter alive and thriving. The author has observed that the statistic about suicide attempts has, in essence, developed a life of its own. It has had several key audiences: academics and researchers, public policymakers, and members of the community, particularly transgender people and our families. This article explores some of the key takeaways from the survey and the ways in which the 41% statistic has affected conversations about the injustices transgender people face and the importance of family and societal acceptance. PMID: 27380151
Discriminative power of basketball game-related statistics by level of competition and sex.
Sampaio, Jaime; Godoy, Sergio Ibáñez; Feu, Sebastian
2004-12-01
The purpose of this study was to identify the basketball game-related statistics that best discriminate performances by sex of players and level of competition. Archival data were obtained from the International Basketball Federation box scores for all games during the men's senior (n=62), men's junior (n=64), women's senior (n=62), and women's junior (n=42) World Championships. The game-related statistics gathered included 2- and 3-point field goals (both successful and unsuccessful), free throws (both successful and unsuccessful), defensive and offensive rebounds, blocks, assists, fouls, steals and turnovers. Only close games (n=105; point differences of 1 to 12) were selected for the analysis. Men's teams were discriminated from women's teams by their higher percentage of blocks and lower percentages of steals and unsuccessful 2-point field goals. Junior teams were discriminated from senior teams by their lower percentage of assists and higher percentage of turnovers. In the two-factor interaction, the teams were mainly discriminated by the game-related statistics identified for level of competition.
ERIC Educational Resources Information Center
Konstantopoulos, Spyros
2012-01-01
Field experiments with nested structures are becoming increasingly common, especially designs that randomly assign entire clusters, such as schools, to a treatment and a control group. In such large-scale cluster randomized studies the challenge is to obtain sufficient power for the test of the treatment effect. The objective is to maximize power…
A Bootstrap Method for Statistical Power System Mode Estimation and Probing Signal Selection
Zhou, Ning; Pierre, John W; Trudnowski, Daniel
2006-06-30
Near real-time measurement-based electromechanical-mode estimation offers considerable potential for many future power-system operation and control strategies. Recent research has investigated the use of low-level Pseudo Random Noise (PRN) probing signals injected into power systems to estimate the low-frequency electromechanical modes. Because of the random nature of a power system, estimating the modes from a single probing experiment is very difficult. Ideally, one would use a Monte Carlo approach with multiple independent probing experiments resulting in a mode estimation distribution. Then, one could state that the mode is within a region of the complex plane. Unfortunately, conducting multiple probing experiments is prohibitive for most power-system applications. This paper presents a methodology for estimating the mode distribution based on one probing test using a Bootstrap algorithm. The proposed method is applied to both simulation data and actual-system measurement data to illustrate its performance and application. It is demonstrated that the method can provide valuable information to PRN tests and guide future PRN probing signal design and selection.
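The paper's bootstrap idea can be sketched with a generic AR(2) signal model standing in for a real power-system identification (the model form, seed, and sizes below are illustrative assumptions, not the authors' method): fit the model once to the measured record, resample its residuals to synthesize surrogate records, and refit each surrogate to obtain a distribution of mode estimates from a single probing test.

```python
import numpy as np

def fit_ar2(y):
    # Least-squares fit of y[t] = a1*y[t-1] + a2*y[t-2] + e[t]
    X = np.column_stack([y[1:-1], y[:-2]])
    coef, *_ = np.linalg.lstsq(X, y[2:], rcond=None)
    resid = y[2:] - X @ coef
    return coef, resid

def ar2_mode(coef):
    # Dominant mode: root of z^2 - a1*z - a2 = 0 with largest modulus
    roots = np.roots([1.0, -coef[0], -coef[1]])
    return roots[np.argmax(np.abs(roots))]

def bootstrap_modes(y, n_boot=200, seed=0):
    """Residual bootstrap: one record in, a mode distribution out."""
    rng = np.random.default_rng(seed)
    coef, resid = fit_ar2(y)
    modes = []
    for _ in range(n_boot):
        e = rng.choice(resid, size=len(y))      # resample residuals
        yb = np.zeros(len(y))
        yb[:2] = y[:2]
        for t in range(2, len(y)):              # simulate surrogate record
            yb[t] = coef[0] * yb[t-1] + coef[1] * yb[t-2] + e[t]
        modes.append(ar2_mode(fit_ar2(yb)[0]))  # refit and store the mode
    return np.array(modes)
```

The spread of the returned modes plays the role of the Monte Carlo distribution that multiple real probing experiments would provide.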
NASA Technical Reports Server (NTRS)
Perry, Boyd, III; Pototzky, Anthony S.; Woods, Jessica A.
1989-01-01
This paper presents the results of a NASA investigation of a claimed 'Overlap' between two gust response analysis methods: the Statistical Discrete Gust (SDG) method and the Power Spectral Density (PSD) method. The claim is that the ratio of an SDG response to the corresponding PSD response is 10.4. Analytical results presented in this paper for several different airplanes at several different flight conditions indicate that such an 'Overlap' does appear to exist. However, the claim was not met precisely: a scatter of up to about 10 percent about the 10.4 factor can be expected.
Statistical modelling and power analysis for detecting trends in total suspended sediment loads
NASA Astrophysics Data System (ADS)
Wang, You-Gan; Wang, Shen S. J.; Dunlop, Jason
2015-01-01
The export of sediments from coastal catchments can have detrimental impacts on estuaries and nearshore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task, due to the complex behaviour of constituents in natural streams, the variability of water flows, and often a limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising for time-specific covariates such as flow. This study investigates whether load estimates and the resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding-errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised sampling strategy, which adds uniformity to the event sampling strategy, the predictive power was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature.
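The notion of "power to detect a trend" used here can be sketched with a much simpler Monte Carlo experiment than the paper's compounding-errors model (the decline rate, noise level, and log-linear form below are hypothetical): simulate noisy declining concentrations, fit a regression each time, and count how often the downward slope is declared significant.

```python
import numpy as np
from scipy import stats

def trend_power(n_years, annual_decline=0.01, sigma=0.3,
                n_sims=500, alpha=0.05, seed=1):
    """Monte Carlo power of detecting a log-linear downward trend."""
    rng = np.random.default_rng(seed)
    years = np.arange(n_years)
    hits = 0
    for _ in range(n_sims):
        # log-concentration declines linearly, plus sampling noise
        logc = -annual_decline * years + rng.normal(0, sigma, n_years)
        res = stats.linregress(years, logc)
        if res.pvalue < alpha and res.slope < 0:
            hits += 1
    return hits / n_sims
```

Even under these optimistic assumptions, power climbs only slowly with record length, which is why monitoring-duration questions like the one posed in this paper matter.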
Statistical Methods for Quantifying Uncertainty in ENSO on Wind Power in the Northern Great Plains
NASA Astrophysics Data System (ADS)
Harper, B. R.
2006-12-01
The El Niño-Southern Oscillation (ENSO) is a well-known source of inter-annual climate variability for both precipitation and temperature in the northern Great Plains. The northern Great Plains also have the largest wind resource in the United States. With the continued growth of wind energy, ENSO's effect on wind speed needs to be examined because of our current lack of understanding of how wind speeds are affected by inter-annual variability. Having previously established that a teleconnection to ENSO exists, we set out in this study to quantify the uncertainty in this relationship. Our method uses the sign test and resampling of hourly airport wind speed measurements for the past half-century at four airports in North Dakota and South Dakota. Airport data are useful in this case because they provide very long and continuous records of hourly wind speed. With these data, we were able to show that ENSO did have an effect on wind speeds as well as on wind power. The warm (El Niño) phase, in particular, was correlated with the largest reductions in wind speed in South Dakota. In North Dakota, it was the cold phase that produced the largest reduction in wind power. The largest differences occurred in April, while the smallest occurred in July. It is our hope that this method will also be a useful tool for wind farm developers across the country to more accurately assess the value of their sites based on limited in-situ data.
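The sign test at the core of this method is straightforward: under the null hypothesis of no ENSO effect, paired wind-speed differences are equally likely to be positive or negative, so the number of positive differences follows a binomial(n, 1/2) distribution. A minimal sketch (the resampling wrapper used in the study is omitted):

```python
import numpy as np
from scipy import stats

def sign_test(diffs):
    """Two-sided sign test on paired differences; ties are dropped."""
    d = np.asarray(diffs, dtype=float)
    d = d[d != 0]                      # discard exact ties
    n_pos = int((d > 0).sum())         # count of positive differences
    # Under H0, n_pos ~ Binomial(len(d), 0.5)
    return stats.binomtest(n_pos, len(d), 0.5).pvalue
```

Because it uses only signs, the test is robust to the skewed, non-Gaussian distributions typical of wind-speed data.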
Carvajal-Rodríguez, Antonio; de Uña-Alvarez, Jacobo; Rolán-Alvarez, Emilio
2009-01-01
Background: The detection of true significant cases under multiple testing is becoming a fundamental issue when analyzing high-dimensional biological data. Unfortunately, known multitest adjustments lose statistical power as the number of tests increases. We propose a new multitest adjustment, based on a sequential goodness-of-fit metatest (SGoF), whose statistical power increases with the number of tests. The method is compared with Bonferroni and FDR-based alternatives by simulating a multitest context via two different kinds of tests: (1) the one-sample t-test, and (2) the homogeneity G-test. Results: SGoF behaves especially well with small sample sizes when (1) the alternative hypothesis is weakly to moderately deviated from the null model, (2) there are widespread effects across the family of tests, and (3) the number of tests is large. Conclusion: SGoF should become an important tool for multitest adjustment when working with high-dimensional biological data. PMID: 19586526
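The SGoF idea can be sketched as follows (this simplified rejection rule is an illustration and differs in detail from the published procedure): treat the count of small p-values as a binomial outcome, and declare significant only the excess over what uniform null p-values would produce.

```python
import numpy as np
from scipy import stats

def sgof(pvalues, gamma=0.05, alpha=0.05):
    """Sketch of a sequential goodness-of-fit (SGoF-style) adjustment.

    Counts p-values at or below `gamma` and asks, with a one-sided
    binomial metatest, whether that count exceeds what uniform null
    p-values predict; the excess over the null expectation is the
    number of tests declared significant.
    """
    p = np.asarray(pvalues, dtype=float)
    n = len(p)
    k = int((p <= gamma).sum())
    meta_p = stats.binomtest(k, n, gamma, alternative="greater").pvalue
    if meta_p >= alpha:
        return 0                         # no evidence of real effects
    expected = int(round(n * gamma))     # null-expected small p-values
    return max(k - expected, 0)
```

Note the property claimed in the abstract: because the metatest is on the aggregate count, its sensitivity grows, rather than shrinks, as the number of tests increases.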
Guilera, Georgina; Gómez-Benito, Juana; Hidalgo, Maria Dolores; Sánchez-Meca, Julio
2013-12-01
This article presents a meta-analysis of studies investigating the effectiveness of the Mantel-Haenszel (MH) procedure when used to detect differential item functioning (DIF). Studies were located electronically in the main databases, representing the codification of 3,774 different simulation conditions, 1,865 related to Type I error and 1,909 to statistical power. The homogeneity of effect-size distributions was assessed by the Q statistic. The extremely high heterogeneity in both error rates (I² = 94.70) and power (I² = 99.29), due to the fact that numerous studies test the procedure in extreme conditions, means that the main interest of the results lies in explaining the variability in detection rates. One-way analysis of variance was used to determine the effects of each variable on detection rates, showing that the MH test was more effective when purification procedures were used, when the data fitted the Rasch model, when test contamination was below 20%, and with sample sizes above 500. The results imply a series of recommendations for practitioners who wish to study DIF with the MH test. A limitation, one inherent to all meta-analyses, is that not all the possible moderator variables, or the levels of variables, have been explored. This serves to remind us of certain gaps in the scientific literature (i.e., regarding the direction of DIF or variances in ability distribution) and is an aspect that methodologists should consider in future simulation studies. PMID:24127986
Statistical power of detecting trends in total suspended sediment loads to the Great Barrier Reef.
Darnell, Ross; Henderson, Brent; Kroon, Frederieke J; Kuhnert, Petra
2012-01-01
The export of pollutant loads from coastal catchments is of primary interest to natural resource management. For example, Reef Plan, a joint initiative by the Australian Government and the Queensland Government, has indicated that a 20% reduction in sediment is required by 2020. There is an obvious need to consider our ability to detect any trend if we are to set realistic targets or to reliably identify changes to catchment loads. We investigate the number of years of monitoring aquatic pollutant loads necessary to detect trends. Instead of modelling the trend in the annual loads directly, given their strong relationship to flow, we consider trends through the reduction in concentration for a given flow. Our simulations show very low power (<40%) of detecting changes of 20% over time periods of several decades, indicating that the chances of detecting trends of reasonable magnitudes over these time frames are very small. PMID:22551850
Wille, Anja; Gruissem, Wilhelm; Bühlmann, Peter; Hennig, Lars
2007-11-01
Accurately identifying differentially expressed genes from microarray data is not a trivial task, partly because of poor variance estimates of gene expression signals. Here, after analyzing 380 replicated microarray experiments, we found that probesets have typical, distinct variances that can be estimated based on a large number of microarray experiments. These probeset-specific variances depend at least in part on the function of the probed gene: genes for ribosomal or structural proteins often have a small variance, while genes implicated in stress responses often have large variances. We used these variance estimates to develop a statistical test for differentially expressed genes called EVE (external variance estimation). The EVE algorithm performs better than the t-test and LIMMA on some real-world data, where external information from appropriate databases is available. Thus, EVE helps to maximize the information gained from a typical microarray experiment. Nonetheless, only a large number of replicates will guarantee identification of nearly all truly differentially expressed genes. However, our simulation studies suggest that even limited numbers of replicates will usually result in good coverage of strongly differentially expressed genes.
NASA Astrophysics Data System (ADS)
Dralle, D.; Karst, N.; Thompson, S. E.
2015-12-01
Multiple competing theories suggest that power law behavior governs the observed first-order dynamics of streamflow recessions, the important process by which catchments dry out via the stream network, altering the availability of surface water resources and in-stream habitat. Frequently modeled as dq/dt = -a q^b, recessions typically exhibit a high degree of variability, even within a single catchment, as revealed by significant shifts in the values of "a" and "b" across recession events. One potential source of this variability lies in underlying, hard-to-observe fluctuations in how catchment water storage is partitioned amongst distinct storage elements, each having different discharge behaviors. Testing this and competing hypotheses with widely available streamflow time series, however, has been hindered by a power law scaling artifact that obscures meaningful covariation between the recession parameters "a" and "b". Here we briefly outline a technique that removes this artifact, revealing intriguing new patterns in the joint distribution of recession parameters. Using long-term flow data from catchments in Northern California, we explore temporal variations, and find that the "a" parameter varies strongly with catchment wetness. Then we explore how the "b" parameter changes with "a", and find that measures of its variation are maximized at intermediate "a" values. We propose an interpretation of this pattern based on statistical mechanics, meaning "b" can be viewed as an indicator of the catchment "microstate" (i.e. the partitioning of storage) and "a" as a measure of the catchment "macrostate" (i.e. the total storage). In statistical mechanics, entropy (i.e. microstate variance, that is the variance of "b") is maximized for intermediate values of extensive variables (i.e. wetness, "a"), as observed in the recession data. This interpretation of "a" and "b" was supported by model runs using a multiple-reservoir catchment toy model, and lends support to the
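For reference, the recession parameters "a" and "b" in dq/dt = -a q^b are typically extracted from a single recession limb by log-log regression of -dq/dt against q. A minimal sketch (the artifact-removal technique discussed in the abstract is beyond this snippet):

```python
import numpy as np

def fit_recession(q, dt=1.0):
    """Estimate (a, b) in dq/dt = -a * q**b from one recession limb
    by log-log least squares: log(-dq/dt) = log(a) + b*log(q)."""
    q = np.asarray(q, dtype=float)
    dqdt = np.diff(q) / dt                 # finite-difference dq/dt
    qmid = 0.5 * (q[1:] + q[:-1])          # midpoint discharge
    keep = dqdt < 0                        # recession points only
    b, log_a = np.polyfit(np.log(qmid[keep]), np.log(-dqdt[keep]), 1)
    return np.exp(log_a), b
```

Fitting the same catchment's many recession events this way yields the event-to-event scatter in (a, b) whose interpretation the paper addresses.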
Kinetic and Statistical Analysis of Primary Circuit Water Chemistry Data in a VVER Power Plant
Nagy, Gabor; Tilky, Peter; Horvath, Akos; Pinter, Tamas; Schiller, Robert
2001-12-15
The results of chemical and radiochemical analyses of the primary circuit coolant liquid, obtained between 1995 and 1999 at the four VVER-type blocks of the Paks (Hungary) nuclear power station, are assessed. A model has been developed regarding the pressure vessel with its auxiliary parts plus the fuel elements as the zone, with the six steam generators as one single unit. The stream from the steam generator is split, with its larger part returning to the zone through the main circulating pump and the smaller one passing through the purifier column. Based on this flowchart, the formation kinetics of corrosion products and of radioactive substances are evaluated. Correlation analysis is applied to reveal any eventual interdependence of the processes, whereas the range-per-scatter (R/S) method is used to characterize the random or deterministic nature of a process. The evaluation of the t → ∞ limits of the kinetic equations enables one to conclude that (a) the total amount of corrosion products per element during one cycle is almost always <15 kg and (b) the zone acts as a highly efficient filter with an efficiency of approximately 1. The R/S results show that the fluctuations in the concentrations of the corrosion products are persistent; this finding indicates that random effects play little if any role here and that the processes in the coolant are under control. Correlation analyses show that the variations of the concentrations are practically uncorrelated and that the processes are independent of each other.
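The R/S method mentioned above estimates a Hurst-type exponent: values near 0.5 indicate uncorrelated (random) fluctuations, while values approaching 1 indicate persistent, deterministic-like behaviour. A generic implementation sketch (not the plant's actual analysis code):

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic: range of cumulative deviations over the std."""
    x = np.asarray(x, dtype=float)
    z = np.cumsum(x - x.mean())
    return (z.max() - z.min()) / x.std()

def hurst_rs(x, min_chunk=16):
    """Estimate the Hurst exponent by regressing log(R/S) on log(n)
    over non-overlapping chunks of doubling size n."""
    x = np.asarray(x, dtype=float)
    sizes, rs = [], []
    n = min_chunk
    while n <= len(x) // 2:
        chunks = [x[i:i + n] for i in range(0, len(x) - n + 1, n)]
        rs.append(np.mean([rescaled_range(c) for c in chunks]))
        sizes.append(n)
        n *= 2
    h, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return h
```

Persistent concentration series, as reported for the corrosion products, would give exponents well above the ~0.5 expected for purely random fluctuations.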
Is a vegetarian diet adequate for children?
Hackett, A; Nathan, I; Burgess, L
1998-01-01
The number of people who avoid eating meat is growing, especially among young people. Benefits to health from a vegetarian diet have been reported in adults, but it is not clear to what extent these benefits are due to diet or to other aspects of lifestyle. In children, concern has been expressed about the adequacy of vegetarian diets, especially with regard to growth. The risks and benefits seem to be related to the degree of restriction of the diet; anaemia is probably both the main and the most serious risk, but this also applies to omnivores. Vegan diets are more likely to be associated with malnutrition, especially if the diets are the result of authoritarian dogma. Overall, lacto-ovo-vegetarian children consume diets closer to recommendations than omnivores, and their pre-pubertal growth is at least as good. The simplest strategy when becoming vegetarian may involve reliance on vegetarian convenience foods, which are not necessarily superior in nutritional composition. The vegetarian sector of the food industry could do more to produce foods closer to recommendations. Vegetarian diets can be, but are not necessarily, adequate for children, provided vigilance is maintained, particularly to ensure variety. Identical comments apply to omnivorous diets. Three threats to the diet of children are too much reliance on convenience foods, lack of variety and lack of exercise.
Blalock, E M; Chen, K-C; Stromberg, A J; Norris, C M; Kadish, I; Kraner, S D; Porter, N M; Landfield, P W
2005-11-01
During normal brain aging, numerous alterations develop in the physiology, biochemistry and structure of neurons and glia. Aging changes occur in most brain regions and, in the hippocampus, have been linked to declining cognitive performance in both humans and animals. Age-related changes in hippocampal regions also may be harbingers of more severe decrements to come from neurodegenerative disorders such as Alzheimer's disease (AD). However, unraveling the mechanisms underlying brain aging, AD and impaired function has been difficult because of the complexity of the networks that drive these aging-related changes. Gene microarray technology allows massively parallel analysis of most genes expressed in a tissue, and therefore is an important new research tool that potentially can provide the investigative power needed to address the complexity of brain aging/neurodegenerative processes. However, along with this new analytic power, microarrays bring several major bioinformatics and resource problems that frequently hinder the optimal application of this technology. In particular, microarray analyses generate extremely large and unwieldy data sets and are subject to high false positive and false negative rates. Concerns also have been raised regarding their accuracy and uniformity. Furthermore, microarray analyses can result in long lists of altered genes, most of which may be difficult to evaluate for functional relevance. These and other problems have led to some skepticism regarding the reliability and functional usefulness of microarray data and to a general view that microarray data should be validated by an independent method. Given recent progress, however, we suggest that the major problem for current microarray research is no longer validity of expression measurements, but rather, the reliability of inferences from the data, an issue more appropriately redressed by statistical approaches than by validation with a separate method. If tested using statistically
ERIC Educational Resources Information Center
Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.
2010-01-01
This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…
ERIC Educational Resources Information Center
Blair, R. Clifford; Higgins, James J.
1980-01-01
Monte Carlo techniques were used to compare the power of Wilcoxon's rank-sum test to the power of the two independent means t test for situations in which samples were drawn from (1) uniform, (2) Laplace, (3) half-normal, (4) exponential, (5) mixed-normal, and (6) mixed-uniform distributions. (Author/JKS)
Adequate mathematical modelling of environmental processes
NASA Astrophysics Data System (ADS)
Chashechkin, Yu. D.
2012-04-01
In environmental observations and laboratory visualizations, both large-scale flow components, such as currents, jets, vortices and waves, and a fine structure are registered (different examples are given). Conventional mathematical modelling, both analytical and numerical, is directed mostly at describing the energetically important flow components; the role of the fine structure still remains obscure. The variety of existing models makes it difficult to choose the most adequate one and to assess their degree of correspondence. The goal of the talk is to give a careful analysis of the kinematics and dynamics of flows. A difference between the concept of "motion", as a transformation of a vector space into itself with distance conservation, and the concept of "flow", as displacement and rotation of deformable "fluid particles", is underlined. Basic physical quantities of the flow, namely density, momentum, energy (entropy) and admixture concentration, are selected as physical parameters defined by the fundamental set, which includes the differential D'Alembert, Navier-Stokes, Fourier and/or Fick equations and a closing equation of state. All of them are observable and independent. Calculations of continuous Lie groups show that only the fundamental set is characterized by the ten-parameter Galilean group reflecting the basic principles of mechanics. The presented analysis demonstrates that conventionally used approximations dramatically change the symmetries of the governing equation sets, which leads to their incompatibility or even degeneration. The fundamental set is analyzed taking into account the condition of compatibility. The high order of the set indicates a complex structure of the complete solutions, corresponding to the physical structure of real flows. Analytical solutions of a number of problems, including flows induced by diffusion on topography and generation of periodic internal waves by compact sources in weakly dissipative media, as well as numerical solutions of the same
NASA Astrophysics Data System (ADS)
Mittendorfer, J.; Zwanziger, P.
2000-03-01
High-power bipolar semiconductor devices (thyristors and diodes) in a disc-type shape are key components (semiconductor switches) for high-power electronic systems. These systems are important for the economic design of energy transmission systems, e.g. high-power drive systems, static compensation and high-voltage DC transmission lines. In their factory located in Pretzfeld, Germany, the company eupec GmbH+Co.KG (eupec) produces disc-type devices with ceramic encapsulation in the high-end range for the world market. These elements have to fulfil special customer requirements and therefore deliver tailor-made trade-offs between their on-state voltage and dynamic switching behaviour. This can be achieved by applying a dedicated electron irradiation to the semiconductor pellets, which tunes this trade-off. In this paper, the requirements placed on the irradiation company Mediscan GmbH are described from the point of view of the semiconductor manufacturer. The actual strategy for controlling the irradiation results to fulfil these requirements is presented, together with the choice of relevant parameters from the viewpoint of the irradiation company. The set of process parameters monitored, using statistical process control (SPC) techniques, includes beam current and energy, conveyor speed and irradiation geometry. The results are highlighted and show the successful co-operation in this business. Viewing the process from the other direction, an idea is presented and discussed for developing a highly sensitive dose detection device based on modified diodes, which could serve as accurate yet cheap and easy-to-use routine dosimeters for irradiation institutes.
Comnes, G.A.; Belden, T.N.; Kahn, E.P.
1995-02-01
The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been or are being developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.
ERIC Educational Resources Information Center
Tabor, Josh
2010-01-01
On the 2009 AP® Statistics Exam, students were asked to create a statistic to measure skewness in a distribution. This paper explores several of the most popular student responses and evaluates which statistic performs best when sampling from various skewed populations. (Contains 8 figures, 3 tables, and 4 footnotes.)
NASA Astrophysics Data System (ADS)
Hoover, F. A.; Bowling, L. C.; Prokopy, L. S.
2015-12-01
Urban stormwater is an ongoing management concern in municipalities of all sizes. In both combined and separate sewer systems, pollutants from stormwater runoff enter the natural waterway system during heavy rain events. Urban flooding during frequent and more intense storms is also a growing concern. Therefore, stormwater best-management practices (BMPs) are being implemented in efforts to reduce and manage stormwater pollution and overflow. The majority of BMP water quality studies focus on the small-scale, individual effects of the BMP and the change in water quality directly from the runoff of these infrastructures. At the watershed scale, it is difficult to establish statistically whether or not these BMPs are making a difference in water quality, given that watershed-scale monitoring is often costly and time consuming, relying on significant sources of funds that a city may not have. Hence, there is a need to quantify the level of sampling needed to detect the water quality impact of BMPs at the watershed scale. In this study, a power analysis was performed on data from an urban watershed in Lafayette, Indiana, to determine the frequency of sampling required to detect a significant change in water quality measurements. Using the R platform, results indicate that detecting a significant change in watershed-level water quality would require hundreds of weekly measurements, even when improvement is present. The second part of this study investigates whether the difficulty in demonstrating water quality change represents a barrier to adoption of stormwater BMPs. Semi-structured interviews of community residents and organizations in Chicago, IL, are being used to investigate residents' understanding of water quality and best-management practices and to identify their attitudes and perceptions towards stormwater BMPs. Second-round interviews will examine how information on uncertainty in water quality improvements influences their BMP attitudes and perceptions.
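The order of magnitude quoted here ("hundreds of weekly measurements") is consistent with the standard normal-approximation sample-size formula for detecting a mean shift; the 0.2-standard-deviation effect size below is a hypothetical choice for illustration, not the study's estimate.

```python
import numpy as np
from scipy import stats

def samples_per_group(effect, sigma, alpha=0.05, power=0.80):
    """Samples per group needed to detect a mean shift `effect`
    with a two-sided z-test at the given alpha and power
    (normal approximation, equal group sizes and variances)."""
    z_a = stats.norm.ppf(1 - alpha / 2)   # critical value
    z_b = stats.norm.ppf(power)           # power quantile
    return int(np.ceil(2 * ((z_a + z_b) * sigma / effect) ** 2))
```

A shift of 0.2 standard deviations, a plausible size for a watershed-scale BMP signal, needs roughly 400 samples per group at 80% power, i.e. many years of weekly sampling, whereas a 0.5-standard-deviation shift needs only about 60.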
NASA Astrophysics Data System (ADS)
Ruecker, Gernot; Leimbach, David; Guenther, Felix; Barradas, Carol; Hoffmann, Anja
2016-04-01
Fire Radiative Power (FRP) retrieved by infrared sensors, such as those flown on several polar orbiting and geostationary satellites, has been shown to be proportional to fuel consumption rates in vegetation fires, and hence the total radiative energy released by a fire (Fire Radiative Energy, FRE) is proportional to the total amount of biomass burned. However, due to the sparse temporal coverage of polar orbiting sensors and the coarse spatial resolution of geostationary sensors, it is difficult to estimate fuel consumption for single fire events. Here we explore an approach for estimating FRE through temporal integration of MODIS FRP retrievals over MODIS-derived burned areas. Temporal integration is aided by statistical modelling to estimate missing observations using a generalized additive model (GAM), taking advantage of additional information such as land cover and a global dataset of the Canadian Fire Weather Index (FWI), as well as diurnal and annual FRP fluctuation patterns. Based on results from study areas located in savannah regions of Southern and Eastern Africa and Brazil, we compare this method to estimates based on simple temporal integration of FRP retrievals over the fire lifetime, and estimate the potential variability of FRP integration results across a range of fire sizes. We compare FRE-based fuel consumption against a database of field experiments in similar landscapes. Results show that for larger fires, this method yields realistic estimates and is more robust than simple temporal integration when only a small number of observations is available. Finally, we offer an outlook on the integration of data from other satellites, specifically FireBird, S-NPP VIIRS and Sentinel-3, as well as on using higher resolution burned area data sets derived from Landsat and similar sensors.
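The core of the FRE approach, integrating FRP over the fire's lifetime and converting energy to fuel mass, can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the overpass times and FRP values are invented, the GAM gap-filling step is omitted, and the ~0.368 kg/MJ energy-to-biomass conversion (commonly cited for vegetation fires) is an assumption here:

```python
import numpy as np

def fre_from_frp(times_s, frp_mw):
    """Integrate FRP (MW) over time (s) with the trapezoidal rule,
    giving Fire Radiative Energy in MJ (1 MW * 1 s = 1 MJ)."""
    return np.trapz(frp_mw, times_s)

# Hypothetical FRP retrievals at irregular overpass times:
t = np.array([0.0, 3600.0, 9000.0, 14400.0])   # seconds
frp = np.array([50.0, 120.0, 80.0, 10.0])      # MW
fre_mj = fre_from_frp(t, frp)
# Biomass burned via the assumed ~0.368 kg/MJ conversion factor:
fuel_kg = 0.368 * fre_mj
```

In practice the sparse retrievals would first be densified with the statistical model before integration; the trapezoidal step is the "simple temporal integration" baseline the abstract compares against.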
Keller, Jacob; Keller, Jacob Pearson; Homma, Kazuaki; Dallos, Peter
2013-01-01
Especially in the last decade or so, there have been dramatic advances in fluorescence-based imaging methods designed to measure a multitude of functions in living cells. Despite this, many of the methods used to analyze the resulting images are limited. Perhaps the most common mode of analysis is the choice of regions of interest (ROIs), followed by quantification of the signal contained therein in comparison with another "control" ROI. While this method has several advantages, such as flexibility and capitalization on the power of human visual recognition capabilities, it has the drawbacks of potential subjectivity and lack of precisely defined criteria for ROI selection. This can lead to analyses which are less precise or accurate than the data might allow for, and generally a regrettable loss of information. Herein, we explore the possibility of abandoning the use of conventional ROIs, and instead propose treating individual pixels as ROIs, such that all information can be extracted systematically with the various statistical cutoffs we discuss. As a test case for this approach, we monitored intracellular pH in cells transfected with the chloride/bicarbonate transporter slc26a3 using the ratiometric dye SNARF-5F under various conditions. We performed a parallel analysis using two different levels of stringency in conventional ROI analysis as well as the pixels-as-ROIs (PAR) approach, and found that pH differences between control and transfected cells were accentuated by ~50-100% by using the PAR approach. We therefore consider this approach worthy of adoption, especially in cases in which higher accuracy and precision are required.
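A minimal sketch of the pixels-as-ROIs idea: compute the ratiometric signal for every pixel and apply an objective intensity cutoff instead of hand-drawn ROIs. The two-channel data, the 1.5x relationship, and the cutoff below are all synthetic stand-ins, not the SNARF-5F measurements:

```python
import numpy as np

def pixels_as_rois(ch1, ch2, min_intensity):
    """Treat every pixel as its own ROI: keep pixels bright enough
    in the denominator channel, return the per-pixel ratio."""
    mask = ch1 > min_intensity
    return ch2[mask] / ch1[mask]

# Synthetic two-channel image: ch2 is 1.5x ch1 plus noise.
rng = np.random.default_rng(0)
ch1 = rng.uniform(0.0, 100.0, (64, 64))
ch2 = 1.5 * ch1 + rng.normal(0.0, 5.0, (64, 64))
ratios = pixels_as_rois(ch1, ch2, min_intensity=20.0)
```

The resulting per-pixel ratio distribution can then be compared across conditions with the statistical cutoffs the authors discuss, rather than collapsing each cell to a single hand-selected ROI mean.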
NASA Technical Reports Server (NTRS)
Zimmerman, G. A.; Olsen, E. T.
1992-01-01
Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
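A toy version of the threshold-and-count idea: for exponentially distributed noise power (a common model for single-channel power samples), the fraction of samples exceeding a fixed threshold can be inverted analytically into a noise-power estimate in a single pass. The exponential model and the threshold choice are assumptions for illustration, not the HRMS design:

```python
import numpy as np

def threshold_and_count_power(samples, threshold):
    """Single-pass noise-power estimate: for exponential power
    samples, P(X > T) = exp(-T / mu), so mu = T / -ln(exceedance)."""
    frac = np.mean(samples > threshold)
    return threshold / -np.log(frac)

rng = np.random.default_rng(1)
noise = rng.exponential(scale=2.0, size=100_000)   # true power = 2.0
mu_hat = threshold_and_count_power(noise, threshold=3.0)
```

Running several such counters in parallel at different thresholds, as the abstract describes, extends the usable dynamic range: whichever threshold yields a well-conditioned exceedance fraction provides the estimate.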
Hollenbeck, John R; DeRue, D Scott; Mannor, Michael
2006-01-01
Comments on the original article "The impact of chief executive officer personality on top management team dynamics: One mechanism by which leadership affects organizational performance", by R. S. Peterson et al. This comment illustrates how small sample sizes, when combined with many statistical tests, can generate unstable parameter estimates and invalid inferences. Although statistical power for 1 test in a small-sample context is too low, the experimentwise power is often high when many tests are conducted, thus leading to Type I errors that will not replicate when retested. This comment's results show how radically the specific conclusions and inferences in R. S. Peterson, D. B. Smith, P. V. Martorana, and P. D. Owens's (2003) study changed with the inclusion or exclusion of 1 data point. When a more appropriate experimentwise statistical test was applied, the instability in the inferences was eliminated, but all the inferences became nonsignificant, thus changing the positive conclusions.
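The experimentwise inflation described here follows directly from running many tests; a sketch (assuming independent tests, which real test batteries only approximate):

```python
def familywise_alpha(alpha_per_test, n_tests):
    """Chance of at least one Type I error across n independent
    tests, each run at level alpha_per_test."""
    return 1.0 - (1.0 - alpha_per_test) ** n_tests

# 20 tests at alpha = .05: roughly a 64% chance of at least one
# spurious "finding" even when every null hypothesis is true.
fw = familywise_alpha(0.05, 20)
# A Bonferroni-corrected per-test level keeps the family near .05:
fw_bonf = familywise_alpha(0.05 / 20, 20)
```

This is why a single significant result among many small-sample tests is weak evidence, and why the comment's experimentwise correction rendered the original inferences nonsignificant.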
Quantifying variability within water samples: the need for adequate subsampling.
Donohue, Ian; Irvine, Kenneth
2008-01-01
Accurate and precise determination of the concentration of nutrients and other substances in waterbodies is an essential requirement for supporting effective management and legislation. Owing primarily to logistic and financial constraints, however, national and regional agencies responsible for monitoring surface waters tend to quantify chemical indicators of water quality using a single sample from each waterbody, thus largely ignoring spatial variability. We show here that total sample variability, which comprises both analytical variability and within-sample heterogeneity, of a number of important chemical indicators of water quality (chlorophyll a, total phosphorus, total nitrogen, soluble molybdate-reactive phosphorus and dissolved inorganic nitrogen) varies significantly both over time and among determinands, and can be extremely high. Within-sample heterogeneity, whose mean contribution to total sample variability ranged between 62% and 100%, was significantly higher in samples taken from rivers compared with those from lakes, and was shown to be reduced by filtration. Our results show clearly that neither a single sample nor even two sub-samples from that sample is adequate for the reliable, and statistically robust, detection of changes in the quality of surface waters. We recommend strongly that, in situations where it is practicable to take only a single sample from a waterbody, a minimum of three sub-samples be analysed from that sample for robust quantification of both the concentrations of determinands and total sample variability. PMID:17706740
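The benefit of averaging sub-samples can be sketched with a simple two-component variance model. The SD values below are hypothetical; the 62-100% heterogeneity contribution reported above motivates making the within-sample term dominate:

```python
import math

def subsample_mean_sd(within_sample_sd, analytical_sd, k):
    """SD of the mean of k sub-samples when total sample variability
    combines within-sample heterogeneity and analytical error
    (assumed independent, so their variances add)."""
    total_var = within_sample_sd ** 2 + analytical_sd ** 2
    return math.sqrt(total_var / k)

# Hypothetical SDs with heterogeneity dominating, as reported:
sd_one = subsample_mean_sd(8.0, 2.0, k=1)
sd_three = subsample_mean_sd(8.0, 2.0, k=3)   # ~42% lower
```

Averaging three sub-samples cuts the uncertainty of the estimate by a factor of sqrt(3), and three replicates additionally allow the total sample variability itself to be quantified, which one or two cannot.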
Predict! Teaching Statistics Using Informal Statistical Inference
ERIC Educational Resources Information Center
Makar, Katie
2013-01-01
Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 2 2013-07-01 2013-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 2 2012-07-01 2012-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...
13 CFR 108.200 - Adequate capital for NMVC Companies.
Code of Federal Regulations, 2010 CFR
2010-01-01
... VENTURE CAPITAL ("NMVC") PROGRAM Qualifications for the NMVC Program Capitalizing A Nmvc Company § 108.200 Adequate capital for NMVC Companies. You must meet the requirements of §§ 108.200-108.230 in order to... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Adequate capital for...
34 CFR 200.20 - Making adequate yearly progress.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 34 Education 1 2012-07-01 2012-07-01 false Making adequate yearly progress. 200.20 Section 200.20... Basic Programs Operated by Local Educational Agencies Adequate Yearly Progress (ayp) § 200.20 Making... State data system; (vi) Include, as separate factors in determining whether schools are making AYP for...
34 CFR 200.20 - Making adequate yearly progress.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 34 Education 1 2013-07-01 2013-07-01 false Making adequate yearly progress. 200.20 Section 200.20... Basic Programs Operated by Local Educational Agencies Adequate Yearly Progress (ayp) § 200.20 Making... State data system; (vi) Include, as separate factors in determining whether schools are making AYP for...
34 CFR 200.20 - Making adequate yearly progress.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 1 2010-07-01 2010-07-01 false Making adequate yearly progress. 200.20 Section 200.20... Basic Programs Operated by Local Educational Agencies Adequate Yearly Progress (ayp) § 200.20 Making... State data system; (vi) Include, as separate factors in determining whether schools are making AYP for...
34 CFR 200.20 - Making adequate yearly progress.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 34 Education 1 2014-07-01 2014-07-01 false Making adequate yearly progress. 200.20 Section 200.20... Basic Programs Operated by Local Educational Agencies Adequate Yearly Progress (ayp) § 200.20 Making... State data system; (vi) Include, as separate factors in determining whether schools are making AYP for...
34 CFR 200.20 - Making adequate yearly progress.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 34 Education 1 2011-07-01 2011-07-01 false Making adequate yearly progress. 200.20 Section 200.20... Basic Programs Operated by Local Educational Agencies Adequate Yearly Progress (ayp) § 200.20 Making... State data system; (vi) Include, as separate factors in determining whether schools are making AYP for...
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...
9 CFR 305.3 - Sanitation and adequate facilities.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...
9 CFR 305.3 - Sanitation and adequate facilities.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 2 2011-07-01 2011-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 4 2011-01-01 2011-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 4 2014-01-01 2014-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 4 2013-01-01 2013-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 4 2012-01-01 2012-01-01 false Responsibility for maintaining adequate safeguards. 1304.114 Section 1304.114 Energy NUCLEAR WASTE TECHNICAL REVIEW BOARD PRIVACY ACT OF 1974 § 1304.114 Responsibility for maintaining adequate safeguards. The Board has the responsibility for maintaining...
13 CFR 107.200 - Adequate capital for Licensees.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Adequate capital for Licensees. 107.200 Section 107.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL BUSINESS INVESTMENT COMPANIES Qualifying for an SBIC License Capitalizing An Sbic § 107.200 Adequate capital...
21 CFR 201.5 - Drugs; adequate directions for use.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Drugs; adequate directions for use. 201.5 Section 201.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL LABELING General Labeling Provisions § 201.5 Drugs; adequate directions for use....
21 CFR 201.5 - Drugs; adequate directions for use.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Drugs; adequate directions for use. 201.5 Section 201.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL LABELING General Labeling Provisions § 201.5 Drugs; adequate directions for use....
7 CFR 4290.200 - Adequate capital for RBICs.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 15 2010-01-01 2010-01-01 false Adequate capital for RBICs. 4290.200 Section 4290.200 Agriculture Regulations of the Department of Agriculture (Continued) RURAL BUSINESS-COOPERATIVE SERVICE AND... Qualifications for the RBIC Program Capitalizing A Rbic § 4290.200 Adequate capital for RBICs. You must meet...
"Something Adequate"? In Memoriam Seamus Heaney, Sister Quinlan, Nirbhaya
ERIC Educational Resources Information Center
Parker, Jan
2014-01-01
Seamus Heaney talked of poetry's responsibility to represent the "bloody miracle", the "terrible beauty" of atrocity; to create "something adequate". This article asks, what is adequate to the burning and eating of a nun and the murderous gang rape and evisceration of a medical student? It considers Njabulo…
42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICARE PROGRAM HEALTH MAINTENANCE ORGANIZATIONS... capital assets, rather than the expenditure for the capital asset, is allowable. (c) Provider...
Dziak, John J.; Lanza, Stephanie T.; Tan, Xianming
2014-01-01
Selecting the number of different classes which will be assumed to exist in the population is an important step in latent class analysis (LCA). The bootstrap likelihood ratio test (BLRT) provides a data-driven way to evaluate the relative adequacy of a (K −1)-class model compared to a K-class model. However, very little is known about how to predict the power or the required sample size for the BLRT in LCA. Based on extensive Monte Carlo simulations, we provide practical effect size measures and power curves which can be used to predict power for the BLRT in LCA given a proposed sample size and a set of hypothesized population parameters. Estimated power curves and tables provide guidance for researchers wishing to size a study to have sufficient power to detect hypothesized underlying latent classes. PMID:25328371
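Estimating power for the BLRT itself requires repeatedly fitting latent class models, but the underlying Monte Carlo logic the authors use is generic: simulate data under the alternative and count how often the test rejects. The sketch below illustrates that logic with a simple t-test stand-in rather than the BLRT:

```python
import numpy as np
from scipy.stats import ttest_ind

def monte_carlo_power(effect, n, alpha=0.05, reps=2000, seed=0):
    """Estimate power by simulation: draw data under the alternative
    hypothesis, count the fraction of replications that reject."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(reps):
        a = rng.normal(0.0, 1.0, n)          # group 1
        b = rng.normal(effect, 1.0, n)       # group 2, shifted mean
        if ttest_ind(a, b).pvalue < alpha:
            rejections += 1
    return rejections / reps

# One point on a power curve: d = 0.5 with 64 subjects per group.
p = monte_carlo_power(0.5, 64)
```

Sweeping `n` traces out a power curve like those the article tabulates; for the BLRT, each replication would instead fit the (K-1)- and K-class models and bootstrap the likelihood-ratio statistic.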
García-Hermoso, Antonio; Dávila-Romero, Carlos; Saavedra, Jose M
2013-02-01
This study compared volleyball game-related statistics by outcome (winners and losers of sets) and set number (total, initial, and last) to identify characteristics that discriminated game performance. Game-related statistics from 314 sets (44 matches) played by teams of male 14- to 15-year-olds in a regional volleyball championship were analysed (2011). Differences between contexts (winning or losing teams) and "set number" (total, initial, and last) were assessed. A discriminant analysis was then performed according to outcome (winners and losers of sets) and "set number" (total, initial, and last). The results showed differences (winning or losing sets) in several variables of Complexes I (attack point and error reception) and II (serve and aces). Game-related statistics which discriminate performance in the sets index the serve, positive reception, and attack point. The predictors of performance at these ages when players are still learning could help coaches plan their training. PMID:23829141
ERIC Educational Resources Information Center
Safarkhani, Maryam; Moerbeek, Mirjam
2013-01-01
In a randomized controlled trial, a decision needs to be made about the total number of subjects for adequate statistical power. One way to increase the power of a trial is by including a predictive covariate in the model. In this article, the effects of various covariate adjustment strategies on increasing the power is studied for discrete-time…
ERIC Educational Resources Information Center
Stone, Clement A.
2003-01-01
Developed and investigated a goodness-of-fit statistic that considers the uncertainty with which ability is estimated and a resampling-based hypothesis testing procedure. Simulation study results indicate that the procedure should be useful for evaluating goodness-of-fit item response theory models for most testing applications when uncertainty in…
This study assessed the pollutant emission offset potential of distributed grid-connected photovoltaic (PV) power systems. Computer-simulated performance results were utilized for 211 PV systems located across the U.S. The PV systems' monthly electrical energy outputs were based ...
NASA Astrophysics Data System (ADS)
Gorbunov, Oleg A.; Sugavanam, Srikanth; Churkin, Dmitry V.
2014-05-01
In the present paper we numerically study the instrumental impact on the statistical properties of a quasi-CW Raman fiber laser using a simple model of multimode laser radiation. The effects with the most influence are the limited electrical bandwidth of the measurement equipment and noise. To assess this influence, we developed a simple model of multimode quasi-CW generation with exponential statistics (i.e., uncorrelated modes). We found that the area near zero intensity in the probability density function (PDF) is strongly affected by both factors; for example, both lead to the formation of a negative wing of the intensity distribution. The far-wing slope of the PDF, however, is not affected by noise and, for a moderate mismatch between optical and electrical bandwidth, is only slightly affected by bandwidth limitation. The generation spectrum often becomes broader at higher power in experiments, so the spectral/electrical bandwidth mismatch factor increases with power, which can lead to an artificial dependence of the PDF slope on power. We also found that both effects influence the ACF background level: noise decreases it, while limited bandwidth increases it.
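The two instrumental effects have a simple toy analogue: smoothing an exponentially distributed intensity series (a running average standing in for finite electrical bandwidth) suppresses the far tail of the measured PDF, while additive detector noise produces samples below zero, the artificial negative wing. All parameters below are illustrative, not the paper's model settings:

```python
import numpy as np

rng = np.random.default_rng(3)
# Exponential intensity statistics (uncorrelated multimode model):
intensity = rng.exponential(1.0, 500_000)
# A running average stands in for limited electrical bandwidth:
kernel = np.ones(8) / 8.0
measured = np.convolve(intensity, kernel, mode="valid")
# Tail suppression: rare high-intensity events are smoothed away.
p_true = np.mean(intensity > 4.0)
p_meas = np.mean(measured > 4.0)
# Additive detector noise creates the artificial negative wing:
noisy = measured + rng.normal(0.0, 0.3, measured.size)
frac_negative = np.mean(noisy < 0.0)
```

Even this crude low-pass model shows why a growing spectral/electrical bandwidth mismatch can masquerade as a power-dependent change in PDF slope.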
NASA Astrophysics Data System (ADS)
Zack, J. W.
2015-12-01
Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors including the limited space-time resolution of the NWP models and shortcomings in the models' representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques, often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day-ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression, and (6) analog ensemble.
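The baseline-versus-machine-learning comparison can be sketched generically: fit a linear MOS baseline and a nonlinear learner to the same NWP-to-power mapping and compare holdout error. The data, the invented "power curve" relationship, and the model settings below are all illustrative and do not reproduce the study:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Invented stand-in: raw NWP wind speed vs. observed generation,
# related through a nonlinear, saturating "power curve" plus noise.
nwp = rng.uniform(0.0, 20.0, 3000)
obs = np.clip((nwp - 3.0) ** 2 / 20.0, 0.0, 10.0) + rng.normal(0.0, 0.5, 3000)
X = nwp.reshape(-1, 1)
X_tr, X_te, y_tr, y_te = X[:2000], X[2000:], obs[:2000], obs[2000:]

lin = LinearRegression().fit(X_tr, y_tr)   # linear-regression baseline
rf = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_tr, y_tr)

mae_lin = np.mean(np.abs(lin.predict(X_te) - y_te))
mae_rf = np.mean(np.abs(rf.predict(X_te) - y_te))
```

Because the wind-to-power relationship is strongly nonlinear, the tree ensemble captures structure the linear baseline cannot; in an operational MOS setting the feature set would also include many more NWP predictors than a single wind speed.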
Wharton, S.; Bulaevskaya, V.; Irons, Z.; Qualley, G.; Newman, J. F.; Miller, W. O.
2015-09-28
The goal of our FY15 project was to explore the use of statistical models and high-resolution atmospheric input data to develop more accurate prediction models for turbine power generation. We modeled power for two operational wind farms in two regions of the country. The first site is a 235 MW wind farm in Northern Oklahoma with 140 GE 1.68 turbines. Our second site is a 38 MW wind farm in the Altamont Pass Region of Northern California with 38 Mitsubishi 1 MW turbines. The farms are very different in topography, climatology, and turbine technology; however, both occupy high wind resource areas in the U.S. and are representative of typical wind farms found in their respective areas.
Influence of the fluid density on the statistics of power fluctuations in von Kármán swirling flows
NASA Astrophysics Data System (ADS)
Opazo, A.; Sáez, A.; Bustamante, G.; Labbé, R.
2016-02-01
Here, we report experimental results on the fluctuations of injected power in confined turbulence. Specifically, we have studied a von Kármán swirling flow with constant external torque applied to the stirrers. Two experiments were performed at nearly equal Reynolds numbers, in geometrically similar experimental setups. Air was utilized in one of them and water in the other. With air, it was found that the probability density function (PDF) of power fluctuations is strongly asymmetric, while with water, it is nearly Gaussian. This suggests that the outcome of a big change of the fluid density in the flow-stirrer interaction is not simply a change in the amplitude of the stirrers' response. In the case of water, with a density roughly 830 times greater than air density, the coupling between the flow and the stirrers is stronger, so that they follow more closely the fluctuations of the average rotation of the nearby flow. When the fluid is air, the coupling is much weaker. The result is not just a smaller response of the stirrers to the torque exerted by the flow; the PDF of the injected power becomes strongly asymmetric and its spectrum acquires a broad region that scales as f^-2. Thus, the asymmetry of the probability density functions of torque or angular speed could be related to the inability of the stirrers to respond to flow stresses. This happens, for instance, when the torque exerted by the flow is weak, due to small fluid density, or when the stirrers' moment of inertia is large. Moreover, a correlation analysis reveals that the features of the energy transfer dynamics with water are qualitatively and quantitatively different from what is observed with air as the working fluid.
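The Gaussian-versus-asymmetric contrast can be quantified with the sample skewness. The sketch below uses synthetic stand-ins (a Gaussian for the water-like case, a shifted gamma for the air-like case), not the experimental data:

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(2)
# Stand-ins for normalized injected-power fluctuations:
water_like = rng.normal(0.0, 1.0, 200_000)      # near-Gaussian PDF
air_like = rng.gamma(2.0, 1.0, 200_000) - 2.0   # strongly asymmetric PDF

s_water = skew(water_like)   # near zero for the Gaussian case
s_air = skew(air_like)       # clearly positive for the gamma case
```

A skewness near zero is the signature of the strongly coupled (water) regime, while a large skewness flags the weakly coupled (air) regime in which the stirrers cannot track the flow stresses.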
Understanding Your Adequate Yearly Progress (AYP), 2011-2012
ERIC Educational Resources Information Center
Missouri Department of Elementary and Secondary Education, 2011
2011-01-01
The "No Child Left Behind Act (NCLB) of 2001" requires all schools, districts/local education agencies (LEAs) and states to show that students are making Adequate Yearly Progress (AYP). NCLB requires states to establish targets in the following ways: (1) Annual Proficiency Target; (2) Attendance/Graduation Rates; and (3) Participation Rates.…
15 CFR 970.404 - Adequate exploration plan.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Adequate exploration plan. 970.404...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE GENERAL REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of...
15 CFR 970.404 - Adequate exploration plan.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 15 Commerce and Foreign Trade 3 2012-01-01 2012-01-01 false Adequate exploration plan. 970.404...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE GENERAL REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of...
15 CFR 970.404 - Adequate exploration plan.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 15 Commerce and Foreign Trade 3 2013-01-01 2013-01-01 false Adequate exploration plan. 970.404...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE GENERAL REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of...
15 CFR 970.404 - Adequate exploration plan.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Adequate exploration plan. 970.404...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE GENERAL REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of...
15 CFR 970.404 - Adequate exploration plan.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Adequate exploration plan. 970.404...) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE GENERAL REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of...
Adequate Schools and Inadequate Education: An Anthropological Perspective.
ERIC Educational Resources Information Center
Wolcott, Harry F.
To illustrate his claim that schools generally do a remarkably good job of schooling while the society makes inadequate use of other means to educate young people, the author presents a case history of a young American (identified pseudonymously as "Brad") whose schooling was adequate but whose education was not. Brad, jobless and homeless,…
Comparability and Reliability Considerations of Adequate Yearly Progress
ERIC Educational Resources Information Center
Maier, Kimberly S.; Maiti, Tapabrata; Dass, Sarat C.; Lim, Chae Young
2012-01-01
The purpose of this study is to develop an estimate of Adequate Yearly Progress (AYP) that will allow for reliable and valid comparisons among student subgroups, schools, and districts. A shrinkage-type estimator of AYP using the Bayesian framework is described. Using simulated data, the performance of the Bayes estimator will be compared to…
13 CFR 107.200 - Adequate capital for Licensees.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Adequate capital for Licensees. 107.200 Section 107.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL BUSINESS... operate actively in accordance with your Articles and within the context of your business plan,...
Assessing Juvenile Sex Offenders to Determine Adequate Levels of Supervision.
ERIC Educational Resources Information Center
Gerdes, Karen E.; And Others
1995-01-01
This study analyzed the internal consistency of four inventories used by Utah probation officers to determine adequate and efficacious supervision levels and placement for juvenile sex offenders. Three factors accounted for 41.2 percent of variance (custodian's and juvenile's attitude toward intervention, offense characteristics, and historical…
4 CFR 200.14 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2010 CFR
2010-01-01
... identifiable personal data and automated systems shall be adequately trained in the security and privacy of... records in which identifiable personal data are processed or maintained, including all reports and output... personal records or data; must minimize, to the extent practicable, the risk that skilled technicians...
Do Beginning Teachers Receive Adequate Support from Their Headteachers?
ERIC Educational Resources Information Center
Menon, Maria Eliophotou
2012-01-01
The article examines the problems faced by beginning teachers in Cyprus and the extent to which headteachers are considered to provide adequate guidance and support to them. Data were collected through interviews with 25 school teachers in Cyprus, who had recently entered teaching (within 1-5 years) in public primary schools. According to the…
NASA Astrophysics Data System (ADS)
Soua, S.; Bridge, B.; Cebulski, L.; Gan, T.-H.
2012-03-01
The use of a shock accelerometer for the continuous in-service monitoring of wear of the slip ring on a wind turbine generator is proposed and supporting results are presented. Five wear defects, in the form of out-of-round sections of the ring circumference, with average radial dimensions in the range 5.9-25 µm were studied. A theoretical model of the acceleration at a point on the circumference of a ring as a function of the defect profile is presented. Acceleration data as a continuous function of time have been obtained for ring rotation frequencies that span the range of frequencies arising with the variation of wind speeds experienced under all in-service conditions. As a result, the measured RMS acceleration is proven to follow an overall increasing trend with frequency for all defects at all brush pressures. A statistical analysis of the root mean square of the time acceleration data as a function of the defect profiles, rotation speeds and brush contact pressure has been performed. The detection performance is considered in terms of the achievement of a signal-to-noise ratio exceeding 3 (99.997% defect detection probability). Under all conditions of rotation speed and pressure, this performance was achieved for average defect sizes as small as 10 µm, which is only 0.004% of the ring diameter. These results form the basis of a very sensitive defect alarm system.
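The abstract's detection criterion (an RMS acceleration whose signal-to-noise ratio exceeds 3) can be sketched in a few lines; the sample values below are hypothetical, not data from the study:

```python
import math

def rms(samples):
    """Root mean square of a sequence of acceleration samples."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

def defect_detected(signal_rms, noise_rms, threshold=3.0):
    """Flag a defect when the signal-to-noise ratio exceeds the threshold;
    the abstract uses SNR > 3 as its detection criterion."""
    return signal_rms / noise_rms > threshold

# Hypothetical readings: broadband noise floor vs. a defect signature.
noise_level = rms([0.10, -0.12, 0.09, -0.11])
signal_level = rms([0.50, -0.45, 0.55, -0.50])
```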
Zhang, Han; Wheeler, William; Hyland, Paula L.; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai
2016-01-01
Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in neighborhoods of 170 GWAS established T2D loci were excluded, we detected 43 T2D globally significant pathways (with Bonferroni corrected p-values < 0.05), which included the insulin signaling pathway and T2D pathway defined by KEGG, as well as the pathways defined according to specific gene expression patterns on pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed 7 out of the 43 pathways identified in European populations remained to be significant in eastern Asians at the false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs. PMID:27362418
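sARTP builds on the Adaptive Rank Truncated Product statistic. A minimal, non-adaptive sketch of the core combining rule is shown below on hypothetical p-values; the published method additionally estimates genotype correlation from a reference panel and chooses the truncation point adaptively:

```python
import math

def rank_truncated_product(pvals, k):
    """Rank Truncated Product statistic: the product of the k smallest
    p-values, returned on a -log scale for numerical stability.
    Larger values indicate stronger combined evidence."""
    smallest = sorted(pvals)[:k]
    return -sum(math.log(p) for p in smallest)

# Hypothetical SNP-level p-values for one pathway.
statistic = rank_truncated_product([0.50, 0.01, 0.20, 0.04], k=2)
```

In ARTP the significance of this statistic is then assessed by resampling, which the summary-based version emulates using the estimated genotype correlation.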
Cafri, Guy; Kromrey, Jeffrey D; Brannick, Michael T
2010-03-31
This article uses meta-analyses published in Psychological Bulletin from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual moderators in multivariate analyses, and tests of residual variability within individual levels of categorical moderators had the lowest and most concerning levels of power. Using methods of calculating power prospectively for significance tests in meta-analysis, we illustrate how power varies as a function of the number of effect sizes, the average sample size per effect size, effect size magnitude, and level of heterogeneity of effect sizes. In most meta-analyses many significance tests were conducted, resulting in a sizable estimated probability of a Type I error, particularly for tests of means within levels of a moderator, univariate categorical moderators, and residual variability within individual levels of a moderator. Across all surveyed studies, the median effect size and the median difference between two levels of study level moderators were smaller than Cohen's (1988) conventions for a medium effect size for a correlation or difference between two correlations. The median Birge's (1932) ratio was larger than the convention of medium heterogeneity proposed by Hedges and Pigott (2001) and indicates that the typical meta-analysis shows variability in underlying effects well beyond that expected by sampling error alone. Fixed-effects models were used with greater frequency than random-effects models; however, random-effects models were used with increased frequency over time. Results related to model selection of this study are carefully compared with those from Schmidt, Oh, and Hayes (2009), who independently designed and produced a study similar to the one reported here. Recommendations for conducting future meta
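The sizable probability of a Type I error from running many significance tests follows directly from the familywise error rate; a minimal sketch using the standard formulas (not code from the article):

```python
def familywise_error(alpha, m):
    """Probability of at least one Type I error across m independent
    tests, each run at level alpha."""
    return 1.0 - (1.0 - alpha) ** m

def bonferroni_level(alpha, m):
    """Per-test level that caps the familywise error rate at alpha."""
    return alpha / m
```

For example, 20 independent tests at alpha = 0.05 carry roughly a 64% chance of at least one false positive.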
NASA Astrophysics Data System (ADS)
Slaski, G.; Ohde, B.
2016-09-01
The article presents the results of a statistical dispersion analysis of an energy and power demand for tractive purposes of a battery electric vehicle. The authors compare data distribution for different values of an average speed in two approaches, namely a short and long period of observation. The short period of observation (generally around several hundred meters) results from a previously proposed macroscopic energy consumption model based on an average speed per road section. This approach yielded high values of standard deviation and coefficient of variation (the ratio between standard deviation and the mean) around 0.7-1.2. The long period of observation (about several kilometers long) is similar in length to standardized speed cycles used in testing a vehicle energy consumption and available range. The data were analysed to determine the impact of observation length on the energy and power demand variation. The analysis was based on a simulation of electric power and energy consumption performed with speed profiles data recorded in Poznan agglomeration.
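The dispersion measure quoted above (a coefficient of variation around 0.7-1.2 for short observation windows) is simply the standard-deviation-to-mean ratio; a minimal sketch with hypothetical energy-demand values:

```python
import math

def coefficient_of_variation(values):
    """Population standard deviation divided by the mean."""
    mean = sum(values) / len(values)
    variance = sum((v - mean) ** 2 for v in values) / len(values)
    return math.sqrt(variance) / mean

# Hypothetical per-section energy demands (kWh) at one average speed.
cv = coefficient_of_variation([0.8, 1.9, 0.4, 2.6, 1.1])
```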
Glazowski, Christopher; Rajadhyaksha, Milind
2012-08-01
Coherent speckle influences the resulting image when narrow spectral line-width and single spatial mode illumination are used, though these are the same light-source properties that provide the best radiance-to-cost ratio. However, a suitable size of the detection pinhole can be chosen to maintain adequate optical sectioning while making the probability density of the speckle noise more normal and reducing its effect. The result is a qualitatively better image with improved contrast, which is easier to read. With theoretical statistics and experimental results, we show that the detection pinhole size is a fundamental parameter for designing imaging systems for use in turbid media.
NASA Astrophysics Data System (ADS)
Bahauddin, Shah Mohammad; Mehedi Faruk, Mir
2016-09-01
From the unified statistical thermodynamics of quantum gases, the virial coefficients of ideal Bose and Fermi gases, trapped under generic power law potential are derived systematically. From the general result of virial coefficients, one can produce the known results in d = 3 and d = 2. But more importantly we found that, the virial coefficients of Bose and Fermi gases become identical (except the second virial coefficient, where the sign is different) when the gases are trapped under harmonic potential in d = 1. This result suggests the equivalence between Bose and Fermi gases established in d = 1 (J. Stat. Phys. DOI 10.1007/s10955-015-1344-4). Also, it is found that the virial coefficients of two-dimensional free Bose (Fermi) gas are equal to the virial coefficients of one-dimensional harmonically trapped Bose (Fermi) gas.
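For context, the virial coefficients referred to here are the coefficients of the equation-of-state expansion; in the standard free ideal-quantum-gas form (a textbook definition, not this paper's trap-specific result):

```latex
\frac{PV}{N k_B T} = \sum_{l=1}^{\infty} a_l \left( n \lambda^d \right)^{l-1},
\qquad a_1 = 1, \qquad a_2 = \mp\, 2^{-(d+2)/2},
```

where n is the number density, \lambda the thermal de Broglie wavelength, d the dimension, and the upper (lower) sign in a_2 holds for bosons (fermions); this sign flip is the second-virial-coefficient difference noted in the abstract.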
Cosmic statistics of statistics
NASA Astrophysics Data System (ADS)
Szapudi, István; Colombi, Stéphane; Bernardeau, Francis
1999-12-01
The errors on statistics measured in finite galaxy catalogues are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly non-linear to weakly non-linear scales. For non-linear functions of unbiased estimators, such as the cumulants, the phenomenon of cosmic bias is identified and computed. Since it is subdued by the cosmic errors in the range of applicability of the theory, correction for it is inconsequential. In addition, the method of Colombi, Szapudi & Szalay concerning sampling effects is generalized, adapting the theory for inhomogeneous galaxy catalogues. While previous work focused on the variance only, the present article calculates the cross-correlations between moments and connected moments as well for a statistically complete description. The final analytic formulae representing the full theory are explicit but somewhat complicated. Therefore we have made available a fortran program capable of calculating the described quantities numerically (for further details e-mail SC at colombi@iap.fr). An important special case is the evaluation of the errors on the two-point correlation function, for which this should be more accurate than any method put forward previously. This tool will be immensely useful in the future for assessing the precision of measurements from existing catalogues, as well as aiding the design of new galaxy surveys. To illustrate the applicability of the results and to explore the numerical aspects of the theory qualitatively and quantitatively, the errors and cross-correlations are predicted under a wide range of assumptions for the future Sloan Digital Sky Survey. The principal results concerning the cumulants ξ, Q3 and Q4 is that
da Costa Lobato, Tarcísio; Hauser-Davis, Rachel Ann; de Oliveira, Terezinha Ferreira; Maciel, Marinalva Cardoso; Tavares, Maria Regina Madruga; da Silveira, Antônio Morais; Saraiva, Augusto Cesar Fonseca
2015-02-15
The Amazon area has been increasingly suffering from anthropogenic impacts, especially due to the construction of hydroelectric power plant reservoirs. The analysis and categorization of the trophic status of these reservoirs are of interest to indicate man-made changes in the environment. In this context, the present study aimed to categorize the trophic status of a hydroelectric power plant reservoir located in the Brazilian Amazon by constructing a novel Water Quality Index (WQI) and Trophic State Index (TSI) for the reservoir using major ion concentrations and physico-chemical water parameters determined in the area and taking into account the sampling locations and the local hydrological regimes. After applying statistical analyses (factor analysis and cluster analysis) and establishing a rule base of a fuzzy system to these indicators, the results obtained by the proposed method were then compared to the generally applied Carlson and a modified Lamparelli trophic state index (TSI), specific for trophic regions. The categorization of the trophic status by the proposed fuzzy method was shown to be more reliable, since it takes into account the specificities of the study area, while the Carlson and Lamparelli TSI do not, and, thus, tend to over or underestimate the trophic status of these ecosystems. The statistical techniques proposed and applied in the present study, are, therefore, relevant in cases of environmental management and policy decision-making processes, aiding in the identification of the ecological status of water bodies. With this, it is possible to identify which factors should be further investigated and/or adjusted in order to attempt the recovery of degraded water bodies.
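As a point of comparison for the fuzzy index, the commonly used Carlson (1977) trophic state index has a simple closed form; the chlorophyll-a variant is sketched below (a standard literature formula cited from memory, so verify the constants before reuse):

```python
import math

def carlson_tsi_chla(chla_ug_per_l):
    """Carlson (1977) Trophic State Index from chlorophyll-a (µg/L);
    values near 40-50 are usually read as mesotrophic."""
    return 9.81 * math.log(chla_ug_per_l) + 30.6
```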
[Abdominal cure procedures. Adequate use of Nobecutan Spray].
López Soto, Rosa María
2009-12-01
Open abdominal wounds, complicated by infection and/or risk of eventration, tend to become chronic and usually require frequent, prolonged care. Habitual changing of bandages develops into one of the clearest risk factors leading to the deterioration of perilesional cutaneous integrity. This brings with it new complications which prolong the evolution of the process, provoking an important deterioration in quality of life for the person who suffers them and a considerable increase in health costs. What is needed is a product and a procedure which control the risk of irritation, protect the skin, favor the patient's comfort and shorten treatment requirements while lowering health care expenses. This report invites medical personnel to think seriously about the scientific rationale and treatment practice behind why and how to apply Nobecutan adequately, and it concludes by stating the benefits of the adequate use of this product. The objective of this report is to guarantee the adequate use of this product in the treatment of complicated abdominal wounds. The product responds to the needs present in these clinical cases, favoring apt isolation and protection of the skin while, at the same time, facilitating the placement and stability of the dressings and bandages used to treat the wounds. For this to happen, the correct use of the product is essential; medical personnel must pay attention to the precautions and recommendations for proper application. The author's experience in the habitual handling of this product over several years, included in the standardized procedures for treating these wounds, corroborates its usefulness; the author considers the product highly effective and simple to apply, and its use provides quality care while optimizing the resources employed.
Quantifying dose to the reconstructed breast: Can we adequately treat?
Chung, Eugene; Marsh, Robin B.; Griffith, Kent A.; Moran, Jean M.; Pierce, Lori J.
2013-04-01
To evaluate how immediate reconstruction (IR) impacts postmastectomy radiotherapy (PMRT) dose distributions to the reconstructed breast (RB), internal mammary nodes (IMN), heart, and lungs using quantifiable dosimetric end points. 3D conformal plans were developed for 20 IR patients, 10 with autologous reconstruction (AR) and 10 with expander-implant (EI) reconstruction. For each reconstruction type, 5 right- and 5 left-sided reconstructions were selected. Two plans were created for each patient, 1 with RB coverage alone and 1 with RB + IMN coverage. Left-sided EI plans without IMN coverage had higher heart Dmean than left-sided AR plans (2.97 and 0.84 Gy, p = 0.03). Otherwise, results did not vary by reconstruction type and all remaining metrics were evaluated using a combined AR and EI dataset. RB coverage was adequate regardless of laterality or IMN coverage (Dmean 50.61 Gy, D95 45.76 Gy). When included, IMN Dmean and D95 were 49.57 and 40.96 Gy, respectively. Mean heart doses increased with left-sided treatment plans and IMN inclusion. Right-sided treatment plans and IMN inclusion increased mean lung V20. Using standard field arrangements and 3D planning, we observed excellent coverage of the RB and IMN, regardless of laterality or reconstruction type. Our results demonstrate that adequate doses can be delivered to the RB with or without IMN coverage.
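The dosimetric end points quoted above (Dmean, D95) can be computed from a flat list of per-voxel doses; a minimal sketch using the common convention that D95 is the minimum dose received by the best-covered 95% of the volume (hypothetical doses, not the study's data):

```python
def dose_metrics(voxel_doses_gy):
    """Return (Dmean, D95) for a structure given per-voxel doses in Gy."""
    ranked = sorted(voxel_doses_gy, reverse=True)
    dmean = sum(ranked) / len(ranked)
    k = max(1, round(0.95 * len(ranked)))  # voxel rank covering 95% of volume
    d95 = ranked[k - 1]
    return dmean, d95
```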
Statistical design for microwave systems
NASA Technical Reports Server (NTRS)
Cooke, Roland; Purviance, John
1991-01-01
This paper presents an introduction to statistical system design. Basic ideas needed to understand statistical design and a method for implementing statistical design are presented. The nonlinear characteristics of the system amplifiers and mixers are accounted for in the given examples. The specification of group delay, signal-to-noise ratio and output power are considered in these statistical designs.
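The statistical design idea described here (accounting for component variation when specifying quantities such as output power) is commonly implemented as a Monte Carlo yield estimate; the gain model and specification below are hypothetical placeholders, not the paper's examples:

```python
import random

def monte_carlo_yield(build, meets_spec, n_trials=10_000, seed=42):
    """Fraction of randomly perturbed builds that meet the specification."""
    rng = random.Random(seed)
    return sum(meets_spec(build(rng)) for _ in range(n_trials)) / n_trials

def build_amplifier(rng):
    """Hypothetical amplifier: 0 dBm input plus a gain drawn normally
    around a 20 dB nominal with a 1 dB standard deviation."""
    return 0.0 + rng.gauss(20.0, 1.0)  # output power, dBm

yield_estimate = monte_carlo_yield(build_amplifier, lambda p: p >= 18.0)
```

Seeding the generator makes the estimate reproducible from run to run.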
Choices for achieving adequate dietary calcium with a vegetarian diet.
Weaver, C M; Proulx, W R; Heaney, R
1999-09-01
To achieve adequate dietary calcium intake, several choices are available that accommodate a variety of lifestyles and tastes. Liberal consumption of dairy products in the diet is the approach of most Americans. Some plants provide absorbable calcium, but the quantity of vegetables required to reach sufficient calcium intake makes an exclusively plant-based diet impractical for most individuals unless fortified foods or supplements are included. Also, dietary constituents that decrease calcium retention, such as salt, protein, and caffeine, can be high in the vegetarian diet. Although it is possible to obtain calcium balance from a plant-based diet in a Western lifestyle, it may be more convenient to achieve calcium balance by increasing calcium consumption than by limiting other dietary factors.
Genetic Modification of Preimplantation Embryos: Toward Adequate Human Research Policies
Dresser, Rebecca
2004-01-01
Citing advances in transgenic animal research and setbacks in human trials of somatic cell genetic interventions, some scientists and others want to begin planning for research involving the genetic modification of human embryos. Because this form of genetic modification could affect later-born children and their offspring, the protection of human subjects should be a priority in decisions about whether to proceed with such research. Yet because of gaps in existing federal policies, embryo modification proposals might not receive adequate scientific and ethical scrutiny. This article describes current policy shortcomings and recommends policy actions designed to ensure that the investigational genetic modification of embryos meets accepted standards for research on human subjects. PMID:15016248
Are Academic Programs Adequate for the Software Profession?
ERIC Educational Resources Information Center
Koster, Alexis
2010-01-01
According to the Bureau of Labor Statistics, close to 1.8 million people, or 77% of all computer professionals, were working in the design, development, deployment, maintenance, and management of software in 2006. The ACM [Association for Computing Machinery] model curriculum for the BS in computer science proposes that about 42% of the core body…
Vaumourin, Elise; Vourc'h, Gwenaël; Telfer, Sandra; Lambin, Xavier; Salih, Diaeldin; Seitzer, Ulrike; Morand, Serge; Charbonnel, Nathalie; Vayssier-Taussat, Muriel; Gasqui, Patrick
2014-01-01
A growing number of studies are reporting simultaneous infections by parasites in many different hosts. The detection of whether these parasites are significantly associated is important in medicine and epidemiology. Numerous approaches to detect associations are available, but only a few provide statistical tests. Furthermore, they generally test for an overall detection of association and do not identify which parasite is associated with which other one. Here, we developed a new approach, the association screening approach, to detect the overall and the detail of multi-parasite associations. We studied the power of this new approach and of three other known ones (i.e., the generalized chi-square, the network and the multinomial GLM approaches) to identify parasite associations either due to parasite interactions or to confounding factors. We applied these four approaches to detect associations within two populations of multi-infected hosts: (1) rodents infected with Bartonella sp., Babesia microti and Anaplasma phagocytophilum and (2) bovine population infected with Theileria sp. and Babesia sp. We found that the best power is obtained with the screening model and the generalized chi-square test. The differentiation between associations, which are due to confounding factors and parasite interactions was not possible. The screening approach significantly identified associations between Bartonella doshiae and B. microti, and between T. parva, T. mutans, and T. velifera. Thus, the screening approach was relevant to test the overall presence of parasite associations and identify the parasite combinations that are significantly over- or under-represented. Unraveling whether the associations are due to real biological interactions or confounding factors should be further investigated. Nevertheless, in the age of genomics and the advent of new technologies, it is a considerable asset to speed up researches focusing on the mechanisms driving interactions between
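One of the baselines compared above is a chi-square test; the classic Pearson statistic for a 2x2 co-infection table (a minimal illustration, not the generalized version used in the paper) is:

```python
def chi_square_2x2(both, only_a, only_b, neither):
    """Pearson chi-square statistic for a 2x2 table of hosts carrying
    both parasites, only one of them, or neither (1 degree of freedom)."""
    n = both + only_a + only_b + neither
    num = n * (both * neither - only_a * only_b) ** 2
    den = ((both + only_a) * (only_b + neither)
           * (both + only_b) * (only_a + neither))
    return num / den
```

Values above about 3.84 reject independence of the two parasites at the 5% level (1 df).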
HU, Yi-Juan; SUN, Wei; TZENG, Jung-Ying; PEROU, Charles M.
2015-01-01
Studies of expression quantitative trait loci (eQTLs) offer insight into the molecular mechanisms of loci that were found to be associated with complex diseases and the mechanisms can be classified into cis- and trans-acting regulation. At present, high-throughput RNA sequencing (RNA-seq) is rapidly replacing expression microarrays to assess gene expression abundance. Unlike microarrays that only measure the total expression of each gene, RNA-seq also provides information on allele-specific expression (ASE), which can be used to distinguish cis-eQTLs from trans-eQTLs and, more importantly, enhance cis-eQTL mapping. However, assessing the cis-effect of a candidate eQTL on a gene requires knowledge of the haplotypes connecting the candidate eQTL and the gene, which cannot be inferred with certainty. The existing two-stage approach that first phases the candidate eQTL against the gene and then treats the inferred phase as observed in the association analysis tends to attenuate the estimated cis-effect and reduce the power for detecting a cis-eQTL. In this article, we provide a maximum-likelihood framework for cis-eQTL mapping with RNA-seq data. Our approach integrates the inference of haplotypes and the association analysis into a single stage, and is thus unbiased and statistically powerful. We also develop a pipeline for performing a comprehensive scan of all local eQTLs for all genes in the genome by controlling for false discovery rate, and implement the methods in a computationally efficient software program. The advantages of the proposed methods over the existing ones are demonstrated through realistic simulation studies and an application to empirical breast cancer data from The Cancer Genome Atlas project. PMID:26568645
Dose Limits for Man do not Adequately Protect the Ecosystem
Higley, Kathryn A.; Alexakhin, Rudolf M.; McDonald, Joseph C.
2004-08-01
It has been known for quite some time that different organisms display differing degrees of sensitivity to the effects of ionizing radiations. Some microorganisms such as the bacterium Micrococcus radiodurans, along with many species of invertebrates, are extremely radio-resistant. Humans might be categorized as being relatively sensitive to radiation, and are a bit more resistant than some pine trees. Therefore, it could be argued that maintaining the dose limits necessary to protect humans will also result in the protection of most other species of flora and fauna. This concept is usually referred to as the anthropocentric approach. In other words, if man is protected then the environment is also adequately protected. The ecocentric approach might be stated as: the health of humans is effectively protected only when the environment is not unduly exposed to radiation. The ICRP is working on new recommendations dealing with the protection of the environment, and this debate should help to highlight a number of relevant issues concerning that topic.
DARHT - an "adequate" EIS: A NEPA case study
Webb, M.D.
1997-08-01
The Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility Environmental Impact Statement (EIS) provides a case study that is interesting for many reasons. The EIS was prepared quickly, in the face of a lawsuit, for a project with unforeseen environmental impacts, for a facility that was deemed urgently essential to national security. Following judicial review the EIS was deemed to be "adequate." DARHT is a facility now being built at Los Alamos National Laboratory (LANL) as part of the Department of Energy (DOE) nuclear weapons stockpile stewardship program. DARHT will be used to evaluate the safety and reliability of nuclear weapons, evaluate conventional munitions and study high-velocity impact phenomena. DARHT will be equipped with two accelerator-driven, high-intensity X-ray machines to record images of materials driven by high explosives. DARHT will be used for a variety of hydrodynamic tests, and DOE plans to conduct some dynamic experiments using plutonium at DARHT as well.
ENSURING ADEQUATE SAFETY WHEN USING HYDROGEN AS A FUEL
Coutts, D
2007-01-22
Demonstration projects using hydrogen as a fuel are becoming very common. Often these projects rely on project-specific risk evaluations to support project safety decisions. This is necessary because regulations, codes, and standards (hereafter referred to as standards) are just being developed. This paper will review some of the approaches being used in these evolving standards, and techniques which demonstration projects can implement to bridge the gap between current requirements and stakeholder desires. Many of the evolving standards for hydrogen fuel use performance-based language, which establishes minimum performance and safety objectives, as compared with prescriptive-based language that prescribes specific design solutions. This is being done for several reasons including: (1) concern that establishing specific design solutions too early will stifle invention, (2) sparse performance data necessary to support selection of design approaches, and (3) a risk-averse public which is unwilling to accept losses that were incurred in developing previous prescriptive design standards. The evolving standards often contain words such as: "The manufacturer shall implement the measures and provide the information necessary to minimize the risk of endangering a person's safety or health". This typically implies that the manufacturer or project manager must produce and document an acceptable level of risk. If accomplished using a comprehensive and systematic process, the demonstration project risk assessment can ease the transition to widespread commercialization. An approach to adequately evaluate and document the safety risk will be presented.
On Adequate Comparisons of Antenna Phase Center Variations
NASA Astrophysics Data System (ADS)
Schoen, S.; Kersten, T.
2013-12-01
One important part of ensuring the high quality of the International GNSS Service's (IGS) products is the collection and publication of receiver and satellite antenna phase center variations (PCV). The PCV are crucial for global and regional networks, since they introduce a global scale factor of up to 16 ppb or changes in the height component of up to 10 cm, respectively. Furthermore, antenna phase center variations are also important for precise orbit determination, navigation and positioning of mobile platforms, e.g., the GOCE and GRACE gravity missions, or for accurate Precise Point Positioning (PPP) processing. Using the EUREF Permanent Network (EPN), Baire et al. (2012) showed that individual PCV values have a significant impact on geodetic positioning. These statements are further supported by studies of Steigenberger et al. (2013), where the impact of PCV on local ties is analysed. Currently, there are five calibration institutions, including the Institut für Erdmessung (IfE), contributing to the IGS PCV file. Different approaches such as field calibrations and anechoic chamber measurements are in use. Additionally, the computation and parameterization of the PCV differ completely between the methods. Therefore, every new approach has to pass a benchmark test in order to ensure that variations of PCV values of an identical antenna obtained from different methods are as consistent as possible. Since the number of approaches to obtain these PCV values rises with the number of calibration institutions, there is the necessity for an adequate comparison concept, taking into account not only the numerical values but also stochastic information and computational issues of the determined PCVs. This is of special importance, since the majority of calibrated receiver antennas published by the IGS originate from absolute field calibrations based on the Hannover Concept, Wübbena et al. (2000). In this contribution, a concept for the adequate comparison of antenna phase center variations is presented.
Are women with psychosis receiving adequate cervical cancer screening?
Tilbrook, Devon; Polsky, Jane; Lofters, Aisha
2010-01-01
ABSTRACT OBJECTIVE To investigate the rates of cervical cancer screening among female patients with psychosis compared with similar patients without psychosis, as an indicator of the quality of primary preventive health care. DESIGN A retrospective cohort study using medical records between November 1, 2004, and November 1, 2007. SETTING Two urban family medicine clinics associated with an academic hospital in Toronto, Ont. PARTICIPANTS A random sample of female patients with and without psychosis between the ages of 20 and 69 years. MAIN OUTCOME MEASURES Number of Papanicolaou tests in a 3-year period. RESULTS Charts for 51 female patients with psychosis and 118 female patients without psychosis were reviewed. Of those women with psychosis, 62.7% were diagnosed with schizophrenia, 19.6% with bipolar disorder, 17.6% with schizoaffective disorder, and 29.4% with other psychotic disorders. Women in both groups were similar in age, rate of comorbidities, and number of full physical examinations. Women with psychosis were significantly more likely to smoke (P < .0001), to have more primary care appointments (P = .035), and to miss appointments (P = .0002) than women without psychosis. After adjustment for age, other psychiatric illnesses, number of physical examinations, number of missed appointments, and having a gynecologist, women with psychosis were significantly less likely to have had a Pap test in the previous 3 years compared with women without psychosis (47.1% vs 73.7%, respectively; odds ratio 0.19, 95% confidence interval 0.06 to 0.58). CONCLUSION Women with psychosis are more than 5 times less likely to receive adequate Pap screening compared with the general population despite their increased rates of smoking and increased number of primary care visits. PMID:20393098
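The odds ratio reported in the abstract is adjusted for covariates; a crude odds ratio can be computed directly from the two screening proportions it reports (47.1% vs. 73.7%). A minimal sketch, using only the abstract's numbers; the adjusted value of 0.19 differs because of the regression adjustment:

```python
def odds(p):
    """Convert a proportion to odds."""
    return p / (1.0 - p)

def odds_ratio(p_exposed, p_reference):
    """Crude odds ratio comparing two group proportions."""
    return odds(p_exposed) / odds(p_reference)

# Screening proportions reported in the abstract: 47.1% of women with
# psychosis vs. 73.7% of women without psychosis had a Pap test.
crude_or = odds_ratio(0.471, 0.737)
print(round(crude_or, 2))  # 0.32 (crude); the reported 0.19 is adjusted
```

The gap between the crude 0.32 and the adjusted 0.19 illustrates why the study controlled for age, missed appointments, and gynecologist access.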
Improving access to adequate pain management in Taiwan.
Scholten, Willem
2015-06-01
There is a global crisis in access to pain management in the world. WHO estimates that 4.65 billion people live in countries where medical opioid consumption is near to zero. For 2010, WHO considered a per capita consumption of 216.7 mg morphine equivalents adequate, while Taiwan had a per capita consumption of 0.05 mg morphine equivalents in 2007. In Asia, the use of opioids is sensitive because of the Opium Wars in the 19th century and for this reason, the focus of controlled substances policies has been on the prevention of diversion and dependence. However, an optimal public health outcome requires that also the beneficial aspects of these substances are acknowledged. Therefore, WHO recommends a policy based on the Principle of Balance: ensuring access for medical and scientific purposes while preventing diversion, harmful use and dependence. Furthermore, international law requires that countries ensure access to opioid analgesics for medical and scientific purposes. There is evidence that opioid analgesics for chronic pain are not associated with a major risk for developing dependence. Barriers for access can be classified in the categories of overly restrictive laws and regulations; insufficient medical training on pain management and problems related to assessment of medical needs; attitudes like an excessive fear for dependence or diversion; and economic and logistical problems. The GOPI project found many examples of such barriers in Asia. Access to opioid medicines in Taiwan can be improved by analysing the national situation and drafting a plan. The WHO policy guidelines Ensuring Balance in National Policies on Controlled Substances can be helpful for achieving this purpose, as well as international guidelines for pain treatment. PMID:26068436
Kasap, Burcu; Akbaba, Gülhan; Yeniçeri, Emine N.; Akın, Melike N.; Akbaba, Eren; Öner, Gökalp; Turhan, Nilgün Ö.; Duru, Mehmet E.
2016-01-01
Objectives: To assess current iodine levels and related factors among healthy pregnant women. Methods: In this cross-sectional, hospital-based study, healthy pregnant women (n=135) were scanned for thyroid volume, provided urine samples for urinary iodine concentration and completed a questionnaire including sociodemographic characteristics and dietary habits targeted for iodine consumption at the Department of Obstetrics and Gynecology, School of Medicine, Muğla Sıtkı Koçman University, Muğla, Turkey, between August 2014 and February 2015. Sociodemographic data were analyzed by simple descriptive statistics. Results: Median urinary iodine concentration was 222.0 µg/L, indicating adequate iodine intake during pregnancy. According to World Health Organization (WHO) criteria, 28.1% of subjects had iodine deficiency, 34.1% had adequate iodine intake, 34.8% had more than adequate iodine intake, and 3.0% had excessive iodine intake during pregnancy. Education level, higher monthly income, current employment, consuming iodized salt, and adding salt to food during, or after cooking were associated with higher urinary iodine concentration. Conclusion: Iodine status of healthy pregnant women was adequate, although the percentage of women with more than adequate iodine intake was higher than the reported literature. PMID:27279519
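The four intake categories used in the study follow WHO criteria for pregnancy. A minimal sketch of that classification; the numeric cutoffs below (insufficient < 150, adequate 150-249, more than adequate 250-499, excessive >= 500 µg/L) are the commonly cited WHO values and should be treated as assumptions here, not as quoted from this paper:

```python
def who_iodine_category(uic_ug_per_l):
    """Classify urinary iodine concentration (ug/L) in pregnancy using
    assumed WHO cutoffs: <150 insufficient, 150-249 adequate,
    250-499 more than adequate, >=500 excessive."""
    if uic_ug_per_l < 150:
        return "insufficient"
    if uic_ug_per_l < 250:
        return "adequate"
    if uic_ug_per_l < 500:
        return "more than adequate"
    return "excessive"

# The study's median of 222.0 ug/L falls in the adequate range.
print(who_iodine_category(222.0))  # adequate
```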
SNAP benefits: Can an adequate benefit be defined?
Yaktine, Ann L; Caswell, Julie A
2014-01-01
The Supplemental Nutrition Assistance Program (SNAP) increases the food purchasing power of participating households. A committee convened by the Institute of Medicine (IOM) examined the question of whether it is feasible to define SNAP allotment adequacy. Total resources; individual, household, and environmental factors; and SNAP program characteristics that affect allotment adequacy were identified from a framework developed by the IOM committee. The committee concluded that it is feasible to define SNAP allotment adequacy; however, such a definition must take into account the degree to which participants' total resources and individual, household, and environmental factors influence the purchasing power of SNAP benefits and the impact of SNAP program characteristics on the calculation of the dollar value of the SNAP allotment. The committee recommended that the USDA Food and Nutrition Service investigate ways to incorporate these factors and program characteristics into research aimed at defining allotment adequacy. PMID:24425718
Smith, Alwyn
1969-01-01
This paper is based on an analysis of questionnaires sent to the health ministries of Member States of WHO asking for information about the extent, nature, and scope of morbidity statistical information. It is clear that most countries collect some statistics of morbidity and many countries collect extensive data. However, few countries relate their collection to the needs of health administrators for information, and many countries collect statistics principally for publication in annual volumes which may appear anything up to 3 years after the year to which they refer. The desiderata of morbidity statistics may be summarized as reliability, representativeness, and relevance to current health problems. PMID:5306722
ERIC Educational Resources Information Center
Hsiung, Tung-Hsing; Olejnik, Stephen
This study investigated the robustness of the James second-order test (James 1951; Wilcox, 1989) and the univariate F test under a two-factor fixed-effect analysis of variance (ANOVA) model in which cell variances were heterogeneous and/or distributions were nonnormal. With computer-simulated data, Type I error rates and statistical power for the…
Wide Wide World of Statistics: International Statistics on the Internet.
ERIC Educational Resources Information Center
Foudy, Geraldine
2000-01-01
Explains how to find statistics on the Internet, especially international statistics. Discusses advantages over print sources, including convenience, currency of information, cost effectiveness, and value-added formatting; sources of international statistics; United Nations agencies; search engines and power searching; and evaluating sources. (LRW)
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the statistical…
Adequate iron stores and the 'Nil nocere' principle.
Hollán, S; Johansen, K S
1993-01-01
There is a need to change the policy of unselective iron supplementation during periods of life with physiologically increased cell proliferation. Levels of iron stores to be regarded as adequate during infancy and pregnancy are still not well established. Recent data support the view that it is not justified to interfere with physiological adaptations developed through millions of years by sophisticated and precisely coordinated regulation of iron absorption, utilization and storage. Recent data suggest that the chelatable intracellular iron pool regulates the expression of proteins with central importance in cellular iron metabolism (TfR, ferritin, and erythroid 5-aminolevulinic synthetase) in a coordinately controlled way through an iron dependent cytosolic mRNA binding protein, the iron regulating factor (IRF). This factor is simultaneously a sensor and a regulator of iron levels. The reduction of ferritin levels during highly increased cell proliferation is a mirror of the increased density of TfRs. An abundance of data support the vigorous competition for growth-essential iron between microbial pathogens and their vertebrate hosts. The highly coordinated regulation of iron metabolism is probably crucial in achieving a balance between the blockade of readily accessible iron to invading organisms and yet providing sufficient iron for the immune system of the host. The most evident adverse clinical effects of excess iron have been observed in immunodeficient patients in tropical countries and in AIDS patients. Excess iron also increases the risk of initiation and promotion of malignant processes by iron binding to DNA and by the iron-catalysed release of free radicals. Oxygen radicals were shown to damage critical biomolecules leading, apart from cancer, to a variety of human disease states, including inflammation and atherosclerosis. They are also involved in processes of aging and thrombosis. Recent clinical trials have suggested that the use of iron
NASA Technical Reports Server (NTRS)
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Freedman, Laurence S; Tasevska, Natasa; Kipnis, Victor; Schatzkin, Arthur; Mares, Julie; Tinker, Lesley; Potischman, Nancy
2010-10-01
A major problem in detecting diet-disease associations in nutritional cohort studies is measurement error in self-reported intakes, which causes loss of statistical power. The authors propose using biomarkers correlated with dietary intake to strengthen analyses of diet-disease hypotheses and to increase statistical power. They consider combining self-reported intakes and biomarker levels using principal components or a sum of ranks and relating the combined measure to disease in conventional regression analyses. They illustrate their method in a study of the inverse association of dietary lutein plus zeaxanthin with nuclear cataracts, using serum lutein plus zeaxanthin as the biomarker, with data from the Carotenoids in Age-Related Eye Disease Study (United States, 2001-2004). This example demonstrates that the combined measure provides higher statistical significance than the dietary measure or the serum measure alone, and it potentially provides sample savings of 8%-53% over analysis with dietary intake alone and of 6%-48% over analysis with serum level alone, depending on the definition of the outcome variable and the choice of confounders entered into the regression model. The authors conclude that combining appropriate biomarkers with dietary data in a cohort can strengthen the investigation of diet-disease associations by increasing the statistical power to detect them.
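The sum-of-ranks combination the authors describe can be sketched in plain Python. This is a minimal illustration of the idea, not the authors' code; the subject values below are hypothetical, and the paper also considers a principal-components combination:

```python
def ranks(values):
    """Ordinal ranks (1 = smallest). Ties broken by original order."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def combined_rank_score(self_report, biomarker):
    """Sum-of-ranks exposure measure combining self-reported intake
    with a correlated biomarker measured on the same subjects."""
    r1, r2 = ranks(self_report), ranks(biomarker)
    return [a + b for a, b in zip(r1, r2)]

# Hypothetical lutein + zeaxanthin values for five subjects:
diet = [1.2, 0.4, 2.0, 0.9, 1.5]        # self-reported intake (mg/day)
serum = [0.30, 0.15, 0.45, 0.35, 0.40]  # serum level (umol/L)
print(combined_rank_score(diet, serum))  # [5, 2, 10, 5, 8]
```

The combined score would then enter a conventional regression as the exposure variable in place of diet or serum alone.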
The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.
... cancer statistics across the world. U.S. Cancer Mortality Trends The best indicator of progress against cancer is ... the number of cancer survivors has increased. These trends show that progress is being made against the ...
NASA Astrophysics Data System (ADS)
Hermann, Claudine
Statistical physics bridges the properties of a macroscopic system and the microscopic behavior of its constituent particles, a connection otherwise impossible to establish because of the giant magnitude of Avogadro's number. Numerous systems of today's key technologies, such as semiconductors or lasers, are macroscopic quantum objects; only statistical physics allows their fundamentals to be understood. Therefore, this graduate text also focuses on particular applications, such as the properties of electrons in solids, and on radiation thermodynamics and the greenhouse effect.
Dumenko, V N; Kozlov, M K
2006-01-01
Power spectra over the frequency range 1-225 Hz in short-term (less than 1 sec) EEG reactions arising in different areas of the cerebral cortex in response to presentation of differential signals were investigated in dogs during operant feeding behavior, in conditions of both adequate and erroneous responses. The energy levels of these reactions decreased several-fold as compared with responses to positive signals, mainly because of frequencies in the high-frequency range (90-225 Hz), where power exceeded that of not only the traditional 1-30 Hz range but also the 30-80 Hz gamma range. The frequency composition of EEG reactions in adequate responses was determined by a series of discrete frequency subgroups belonging predominantly to the high-frequency band. In erroneous reactions, the discrete structure of the corresponding EEG reactions was lost.
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. PMID:26466186
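Several of the test-performance measures the review introduces follow directly from a 2x2 table of test results against true disease status. A generic illustration of those definitions (the counts below are hypothetical, not from the review):

```python
def diagnostic_summary(tp, fp, fn, tn):
    """Basic diagnostic-test statistics from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)               # true-positive rate
    specificity = tn / (tn + fp)               # true-negative rate
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_positive = sensitivity / (1 - specificity)  # positive likelihood ratio
    return sensitivity, specificity, accuracy, lr_positive

# Hypothetical screening study: 90 true positives, 10 false negatives,
# 30 false positives, 870 true negatives.
sens, spec, acc, lr = diagnostic_summary(tp=90, fp=30, fn=10, tn=870)
print(sens, acc)  # 0.9 0.96
```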
Percentage of Adults with High Blood Pressure Whose Hypertension Is Adequately Controlled
... is Adequately Controlled Percentage of Adults with High Blood Pressure Whose Hypertension is Adequately Controlled Heart disease ... Survey. Age Group Percentage of People with High Blood Pressure that is Controlled by Age Group f94q- ...
Understanding Solar Flare Statistics
NASA Astrophysics Data System (ADS)
Wheatland, M. S.
2005-12-01
A review is presented of work aimed at understanding solar flare statistics, with emphasis on the well known flare power-law size distribution. Although avalanche models are perhaps the favoured model to describe flare statistics, their physical basis is unclear, and they are divorced from developing ideas in large-scale reconnection theory. An alternative model, aimed at reconciling large-scale reconnection models with solar flare statistics, is revisited. The solar flare waiting-time distribution has also attracted recent attention. Observed waiting-time distributions are described, together with what they might tell us about the flare phenomenon. Finally, a practical application of flare statistics to flare prediction is described in detail, including the results of a year of automated (web-based) predictions from the method.
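The power-law size distribution mentioned above is usually characterized by its index. A minimal sketch of the standard continuous maximum-likelihood estimator for that index, assuming p(x) ∝ x^(-alpha) above a threshold x_min; this is a generic illustration, not code from the review:

```python
import math

def power_law_alpha(sizes, x_min):
    """MLE of the power-law index alpha for event sizes x >= x_min,
    assuming p(x) ~ x**(-alpha):  alpha = 1 + n / sum(ln(x_i / x_min))."""
    tail = [x for x in sizes if x >= x_min]
    n = len(tail)
    return 1.0 + n / sum(math.log(x / x_min) for x in tail)

# Sanity check: if every event size is e * x_min, each log term is 1,
# so the sum is n and alpha = 1 + n/n = 2.
print(power_law_alpha([math.e] * 4, x_min=1.0))  # 2.0
```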
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
... HUMAN SERVICES Food and Drug Administration Hemoglobin Standards and Maintaining Adequate Iron Stores in... Standards and Maintaining Adequate Iron Stores in Blood Donors.'' The purpose of this public workshop is to... donor safety and blood availability, and potential measures to maintain adequate iron stores in...
21 CFR 801.5 - Medical devices; adequate directions for use.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...
36 CFR 13.960 - Who determines when there is adequate snow cover?
Code of Federal Regulations, 2013 CFR
2013-07-01
... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...
36 CFR 13.960 - Who determines when there is adequate snow cover?
Code of Federal Regulations, 2012 CFR
2012-07-01
... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...
36 CFR 13.960 - Who determines when there is adequate snow cover?
Code of Federal Regulations, 2014 CFR
2014-07-01
... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...
36 CFR 13.960 - Who determines when there is adequate snow cover?
Code of Federal Regulations, 2011 CFR
2011-07-01
... adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE... Preserve Snowmachine (snowmobile) Operations § 13.960 Who determines when there is adequate snow cover? The superintendent will determine when snow cover is adequate for snowmachine use. The superintendent will follow...
ERIC Educational Resources Information Center
Maydeu-Olivares, Alberto; Montano, Rosa
2013-01-01
We investigate the performance of three statistics, R [subscript 1], R [subscript 2] (Glas in "Psychometrika" 53:525-546, 1988), and M [subscript 2] (Maydeu-Olivares & Joe in "J. Am. Stat. Assoc." 100:1009-1020, 2005, "Psychometrika" 71:713-732, 2006) to assess the overall fit of a one-parameter logistic model (1PL) estimated by (marginal) maximum…
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume II W. Edwards Deming Sample Design in Business Research
NASA Astrophysics Data System (ADS)
Masuta, Taisuke; Gunjikake, Yasutoshi; Yokoyama, Akihiko; Tada, Yasuyuki
Nowadays, electric power systems face many problems, such as environmental issues, aging infrastructure, energy security, and the quality of electricity supply. The smart grid is a new concept for a better future grid that enables these problems to be solved with Information and Communication Technology (ICT). In this research, a number of Heat Pump Water Heaters (HPWHs), a type of energy-efficient customer equipment, and a Battery Energy Storage System (BESS) are considered as controllable equipment for frequency control. The utilization of customer equipment such as HPWHs for power system control is one of the key elements in the concept of the Ubiquitous Power Grid, which was proposed by our research group as a smart grid in the Japanese context. The frequency control using a number of HPWHs with thermal storage in the hot water tank is evaluated. Moreover, a novel statistical model of controllable HPWHs taking into account customers' convenience and uncertainty is proposed.
ERIC Educational Resources Information Center
Martin, Tammy Faith
2012-01-01
The purpose of this study was to examine principal leadership styles and their influence on school performance as measured by adequate yearly progress at selected Title I schools in South Carolina. The main focus of the research study was to complete descriptive statistics on principal leadership styles in schools that met or did not meet adequate…
Scholfield, D.J.; Fields, M.; Beal, T.; Lewis, C.G.; Behall, K.M. )
1989-02-09
The symptoms of copper (Cu) deficiency are known to be more severe when rats are fed a diet with fructose (F) as the principal carbohydrate. Mortality, in males, due to cardiac abnormalities usually occurs after five weeks of a 62% F, 0.6 ppm Cu deficient diet. These effects are not observed if cornstarch (CS) is the carbohydrate (CHO) source. Studies with F-containing diets have shown increased catecholamine (C) turnover rates, while diets deficient in Cu result in decreased norepinephrine (N) levels in tissues. Dopamine β-hydroxylase (EC 1.14.17.1) is a Cu-dependent enzyme which catalyzes the conversion of dopamine (D) to N. An experiment was designed to investigate the effects of CHO and dietary Cu on levels of three catecholamines in cardiac tissue. Thirty-two male and female Sprague-Dawley rats were fed Cu-deficient or -adequate diets with 60% of calories from F or CS for 6 weeks. N, epinephrine (E) and D were measured by HPLC. Statistical analysis indicates that Cu deficiency tends to decrease N levels, while having the reverse effect on E. D did not appear to change. These findings indicate that Cu deficiency, but not dietary CHO, can affect the concentration of N and E in rat cardiac tissue.
1986-01-01
Official population data for the USSR are presented for 1985 and 1986. Part 1 (pp. 65-72) contains data on capitals of union republics and cities with over one million inhabitants, including population estimates for 1986 and vital statistics for 1985. Part 2 (p. 72) presents population estimates by sex and union republic, 1986. Part 3 (pp. 73-6) presents data on population growth, including birth, death, and natural increase rates, 1984-1985; seasonal distribution of births and deaths; birth order; age-specific birth rates in urban and rural areas and by union republic; marriages; age at marriage; and divorces. PMID:12178831
ERIC Educational Resources Information Center
Kromrey, Jeffrey D.; Dickinson, Wendy B.
1996-01-01
Empirical estimates of the power and Type I error rate of the test of the classrooms-within-treatments effect in the nested analysis of variance approach are provided for a variety of nominal alpha levels and a range of classroom effect sizes and research designs. (SLD)
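Empirical power and Type I error rates of this kind are typically obtained by Monte Carlo simulation: generate many data sets under a known truth and count rejections. The sketch below is a hypothetical illustration using a simple two-sample comparison with a large-sample normal approximation, not the nested classrooms-within-treatments ANOVA of the study; the function names and parameter choices are assumptions made for the example.

```python
import random
from statistics import mean, stdev

def two_sample_z(x, y):
    """Large-sample two-sample test statistic (normal approximation to t)."""
    se = (stdev(x) ** 2 / len(x) + stdev(y) ** 2 / len(y)) ** 0.5
    return (mean(x) - mean(y)) / se

def mc_rejection_rate(effect, n=50, trials=2000, crit=1.96, seed=4):
    """Fraction of simulated experiments rejecting H0 at the nominal
    two-sided 5% level. effect = 0 estimates the Type I error rate;
    effect > 0 estimates power against that alternative."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(trials):
        x = [rng.gauss(0.0, 1.0) for _ in range(n)]
        y = [rng.gauss(effect, 1.0) for _ in range(n)]
        if abs(two_sample_z(x, y)) > crit:
            rejections += 1
    return rejections / trials
```

Running mc_rejection_rate(0.0) should land near the nominal 0.05, while mc_rejection_rate(0.5) estimates power against a half-standard-deviation effect.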
2013-01-01
Background: Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Methods: Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. Results: The statistical significance of the RV increased as the magnitude of denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Larger sample size with a fixed denominator F-statistic or more bootstrap replicates (beyond 500) had minimal impact. Conclusions: The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r < 0.7). Substantially greater power can be achieved when comparing measures of a very high correlation (r > 0.9). PMID:23721463
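The patient-level bootstrap described above can be sketched as follows. This is a minimal illustration under assumed names and synthetic data, not the authors' code: an RV is a ratio of one-way ANOVA F-statistics across clinical groups, and its percentile bootstrap interval is obtained by resampling patients with replacement.

```python
import random
from statistics import mean

def f_stat(groups):
    """One-way ANOVA F statistic: between-group MS over within-group MS."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ms_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups) / (k - 1)
    ms_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups) / (n - k)
    return ms_between / ms_within

def relative_validity(records):
    """RV = F(comparator) / F(reference) over clinically defined groups.
    records: list of (group_label, comparator_score, reference_score)."""
    labels = sorted({g for g, _, _ in records})
    comp = [[c for g, c, _ in records if g == lab] for lab in labels]
    ref = [[r for g, _, r in records if g == lab] for lab in labels]
    return f_stat(comp) / f_stat(ref)

def bootstrap_rv_ci(records, n_boot=500, alpha=0.05, seed=1):
    """95% percentile bootstrap CI for the RV, resampling patients."""
    rng = random.Random(seed)
    rvs = []
    for _ in range(n_boot):
        sample = [rng.choice(records) for _ in records]
        if len({g for g, _, _ in sample}) < 2:  # skip degenerate resamples
            continue
        rvs.append(relative_validity(sample))
    rvs.sort()
    return rvs[int(len(rvs) * alpha / 2)], rvs[int(len(rvs) * (1 - alpha / 2)) - 1]

# synthetic cohort: the reference measure discriminates three groups
# strongly, the comparator only weakly, so the RV and its CI sit below 1
rng = random.Random(0)
records = [(g, rng.gauss(0.5 * g, 1.0), rng.gauss(2.0 * g, 1.0))
           for g in (0, 1, 2) for _ in range(20)]
rv = relative_validity(records)
lo, hi = bootstrap_rv_ci(records)
```

If the interval (lo, hi) excludes 1, the comparator's discriminative validity differs significantly from the reference's.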
[Big data in official statistics].
Zwick, Markus
2015-08-01
The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany. PMID:26077871
Statistical Physics of Fracture
Alava, Mikko; Nukala, Phani K; Zapperi, Stefano
2006-05-01
Disorder and long-range interactions are two of the key components that make material failure an interesting playfield for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture, in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds is subjected to an increasing external load. These models describe, on a qualitative level, the failure processes of real brittle or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among these partly still controversial issues are the scaling and size effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.
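As a flavor of the lattice-model approach, here is a minimal equal-load-sharing fiber-bundle simulation (a generic textbook-style sketch, not a model taken from this review): fibers with random strength thresholds share the applied load equally, each failure shifts load onto the survivors, and the resulting avalanches culminate in catastrophic failure near the critical stress, which is 1/4 for uniform thresholds in the infinite-bundle limit.

```python
import random

def fiber_bundle(n_fibers=10000, seed=2):
    """Quasi-static loading of an equal-load-sharing fiber bundle with
    uniform(0, 1) failure thresholds. Returns the peak stress per
    original fiber (theory: 1/4 as n_fibers -> infinity) and the list
    of avalanche sizes (fibers broken per external load increment)."""
    rng = random.Random(seed)
    thresholds = sorted(rng.random() for _ in range(n_fibers))
    peak, avalanches, i = 0.0, [], 0
    while i < n_fibers:
        # stress (per original fiber) needed to break the weakest survivor
        stress = thresholds[i] * (n_fibers - i) / n_fibers
        peak = max(peak, stress)
        start = i
        i += 1
        # avalanche: the redistributed load keeps breaking fibers whose
        # threshold lies below the current per-fiber load
        while i < n_fibers and thresholds[i] * (n_fibers - i) / n_fibers <= stress:
            i += 1
        avalanches.append(i - start)
    return peak, avalanches
```

The largest avalanche is the catastrophic one that destroys the surviving half of the bundle once peak stress is reached; the distribution of smaller avalanches is the "crackling noise" analogue discussed in the text.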
Calculation of the Cost of an Adequate Education in Kentucky: A Professional Judgment Approach
ERIC Educational Resources Information Center
Verstegen, Deborah A.
2004-01-01
What is an adequate education and how much does it cost? In 1989, Kentucky's State Supreme Court found the entire system of education unconstitutional--"all of its parts and parcels". The Court called for all children to have access to an adequate education, one that is uniform and has as its goal the development of seven capacities, including:…
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...
21 CFR 801.5 - Medical devices; adequate directions for use.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Medical devices; adequate directions for use. 801.5 Section 801.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate...
21 CFR 801.5 - Medical devices; adequate directions for use.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Medical devices; adequate directions for use. 801.5 Section 801.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate...
21 CFR 801.5 - Medical devices; adequate directions for use.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Medical devices; adequate directions for use. 801.5 Section 801.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate...
21 CFR 801.5 - Medical devices; adequate directions for use.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Medical devices; adequate directions for use. 801.5 Section 801.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate...
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2013 CFR
2013-07-01
... regulated by another Federal agency. 152.20 Section 152.20 Protection of Environment ENVIRONMENTAL... Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The pesticides... has determined, in accordance with FIFRA sec. 25(b)(1), that they are adequately regulated by...
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2012 CFR
2012-07-01
... regulated by another Federal agency. 152.20 Section 152.20 Protection of Environment ENVIRONMENTAL... Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The pesticides... has determined, in accordance with FIFRA sec. 25(b)(1), that they are adequately regulated by...
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2014 CFR
2014-07-01
... regulated by another Federal agency. 152.20 Section 152.20 Protection of Environment ENVIRONMENTAL... Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The pesticides... has determined, in accordance with FIFRA sec. 25(b)(1), that they are adequately regulated by...
When are studies adequate for regulatory purposes? View of one regulated.
Bundy, M
1981-01-01
The question of adequacy of studies for regulatory purposes has been debated for years. Nine questions need answers to determine adequacy: (1) Does the study deal with a defined problem or a defined segment of it? (2) Do the study data justify the conclusions drawn? (3) Were appropriate statistical analyses used? Is there evidence of bias versus objectivity in the collection or analysis of data? (4) Does the study support, supplement (or complement) or refute information in the literature? Is the study truly new information? (5) Does the study conform to the Interagency Regulatory Liaison Group (IRLG) guidelines for documentation of Epidemiologic Studies? (6) Does the study stand up to peer review? (7) Have other investigators been able to confirm the findings by duplicating the study? (8) Is the study acceptable, or can it be made acceptable, for publication in a reputable scientific journal? (9) Is the problem of such magnitude or significance that regulation is required? Because there is no such thing as a risk-free environment or absolute safety, and there is no definitive "yes" answer to each of the questions, the regulated would hope--yes, insist--that the regulators exercise judgment with great skill in the promulgation of rules or regulations. The application of safety factors and the determination of acceptable levels of risk should be social decisions. A discussion of instances where the "regulated" believes that studies have not been adequate, or others have been ignored or misinterpreted for regulatory purposes, is included. A method of settling controversial questions to eliminate the litigation route is proposed. Judgment, which is so often eliminated by regulation, needs to find its way back into the regulatory process. The regulated recognize the need for regulations. However, when these regulations are based on less than good scientific judgment, harm will be done to the regulatory process itself in the long run. PMID:7333262
Shoulder Arthroscopy Does Not Adequately Visualize Pathology of the Long Head of Biceps Tendon
Saithna, Adnan; Longo, Alison; Leiter, Jeff; Old, Jason; MacDonald, Peter M.
2016-01-01
Background: Pulling the long head of the biceps tendon into the joint at arthroscopy is a common method for evaluation of tendinopathic lesions. However, the rate of missed diagnoses when using this technique is reported to be as high as 30% to 50%. Hypothesis: Tendon excursion achieved using a standard arthroscopic probe does not allow adequate visualization of extra-articular sites of predilection of tendinopathy. Study Design: Descriptive laboratory study. Methods: Seven forequarter amputation cadaveric specimens were evaluated. The biceps tendon was tagged to mark the intra-articular length and the maximum excursions achieved using a probe and a grasper in both beach-chair and lateral positions. Statistical analyses were performed using analysis of variance to compare means. Results: The mean intra-articular and extra-articular lengths of the tendons were 23.9 and 82.3 mm, respectively. The length of tendon that could be visualized by pulling it into the joint with a probe through the anterior midglenoid portal was not significantly different when using either lateral decubitus (mean ± SD, 29.9 ± 3.89 mm; 95% CI, 25.7-34 mm) or beach-chair positions (32.7 ± 4.23 mm; 95% CI, 28.6-36.8 mm). The maximum length of the overall tendon visualized in any specimen using a standard technique was 37 mm. Although there was a trend to greater excursion using a grasper through the same portal, this was not statistically significant. However, using a grasper through the anterosuperior portal gave a significantly greater mean excursion than any other technique (46.7 ± 4.31 mm; 95% CI, 42.6-50.8 mm), but this still failed to allow evaluation of Denard zone C. Conclusion: Pulling the tendon into the joint with a probe via an anterior portal does not allow visualization of distal sites of predilection of pathology. Surgeons should be aware that this technique is inadequate and can result in missed diagnoses. Clinical Relevance: This study demonstrates that glenohumeral
NASA Astrophysics Data System (ADS)
Lambert, I. B.
2012-04-01
This presentation will consider the adequacy of global uranium and thorium resources to meet realistic nuclear power demand scenarios over the next half century. It is presented on behalf of, and based on evaluations by, the Uranium Group - a joint initiative of the OECD Nuclear Energy Agency and the International Atomic Energy Agency, of which the author is a Vice Chair. The Uranium Group produces a biennial report on Uranium Resources, Production and Demand based on information from some 40 countries involved in the nuclear fuel cycle, which also briefly reviews thorium resources. Uranium: In 2008, world production of uranium amounted to almost 44,000 tonnes (tU). This supplied approximately three-quarters of world reactor requirements (approx. 59,000 tU), the remainder being met by previously mined uranium (so-called secondary sources). Information on availability of secondary sources - which include uranium from excess inventories, dismantling nuclear warheads, tails and spent fuel reprocessing - is incomplete, but such sources are expected to decrease in market importance after 2013. In 2008, the total world Reasonably Assured plus Inferred Resources of uranium (recoverable at less than $130/kgU) amounted to 5.4 million tonnes. In addition, it is clear that there are vast amounts of uranium recoverable at higher costs in known deposits, plus many as yet undiscovered deposits. The Uranium Group has concluded that the uranium resource base is more than adequate to meet projected high-case requirements for nuclear power for at least half a century. This conclusion does not assume increasing replacement of uranium by fuels from reprocessing current reactor wastes, or by thorium, nor greater reactor efficiencies, which are likely to ameliorate future uranium demand. However, progressively increasing quantities of uranium will need to be mined, against a backdrop of the relatively small number of producing facilities around the world, geopolitical uncertainties and
Pitfalls in statistical landslide susceptibility modelling
NASA Astrophysics Data System (ADS)
Schröder, Boris; Vorpahl, Peter; Märker, Michael; Elsenbeer, Helmut
2010-05-01
The use of statistical methods is a well-established approach to predict landslide occurrence probabilities and to assess landslide susceptibility. This is achieved by applying statistical methods relating historical landslide inventories to topographic indices as predictor variables. In our contribution, we compare several new and powerful methods developed in machine learning and well-established in landscape ecology and macroecology for predicting the distribution of shallow landslides in tropical mountain rainforests in southern Ecuador (among others: boosted regression trees, multivariate adaptive regression splines, maximum entropy). Although these methods are powerful, we think it is necessary to follow a basic set of guidelines to avoid some pitfalls regarding data sampling, predictor selection, and model quality assessment, especially if a comparison of different models is contemplated. We therefore suggest applying a novel toolbox to evaluate approaches to the statistical modelling of landslide susceptibility. Additionally, we propose some methods to open the "black box" as an inherent part of machine learning methods in order to achieve further explanatory insights into preparatory factors that control landslides. Sampling of training data should be guided by hypotheses regarding processes that lead to slope failure, taking into account their respective spatial scales. This approach leads to the selection of a set of candidate predictor variables considered on adequate spatial scales. This set should be checked for multicollinearity in order to facilitate model response curve interpretation. Model quality assessment evaluates how well a model is able to reproduce independent observations of its response variable. This includes criteria to evaluate different aspects of model performance, i.e. model discrimination, model calibration, and model refinement. In order to assess a possible violation of the assumption of independence in the training samples or a possible
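The multicollinearity check recommended above can be as simple as screening pairwise correlations among candidate predictors before fitting. The sketch below is a generic illustration; the predictor names, the 0.7 cutoff, and the synthetic data are assumptions, not part of the authors' toolbox.

```python
import random
from statistics import mean

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    ma, mb = mean(a), mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def collinear_pairs(predictors, threshold=0.7):
    """Flag predictor pairs whose absolute correlation exceeds threshold."""
    names = list(predictors)
    flagged = []
    for i, ni in enumerate(names):
        for nj in names[i + 1:]:
            r = pearson(predictors[ni], predictors[nj])
            if abs(r) > threshold:
                flagged.append((ni, nj, round(r, 2)))
    return flagged

# hypothetical terrain predictors: slope is derived from elevation and
# therefore nearly collinear with it; rainfall is independent
rng = random.Random(5)
elevation = [rng.gauss(0, 1) for _ in range(200)]
slope = [e * 0.9 + rng.gauss(0, 0.2) for e in elevation]
rainfall = [rng.gauss(0, 1) for _ in range(200)]
flagged = collinear_pairs({"elevation": elevation, "slope": slope,
                           "rainfall": rainfall})
```

Only the elevation-slope pair should be flagged here; one member of each flagged pair would then be dropped, or the pair replaced by a derived variable, before model fitting.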
Code of Federal Regulations, 2010 CFR
2010-10-01
... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... of maintaining adequate technical, physical, and security safeguards to prevent...
NASA Astrophysics Data System (ADS)
Faruk, Mir Mehedi
2015-11-01
A unified description of the Bose and Fermi gases trapped in an external generic power-law potential U = \sum_{i=1}^{d} c_i |x_i/a_i|^{n_i} is presented using the grand potential of the system in d-dimensional space. The thermodynamic quantities of the quantum gases are derived from the grand potential. An equivalence between the trapped Bose and Fermi gases is constructed from the thermodynamic quantities in one dimension (d=1) using the Landen relation. It is also found that the established equivalence between the ideal free Bose and Fermi gases in d=2 (Lee in Phys Rev E 55:1518, 1997) is lost when an external potential is applied.
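For context, the semiclassical grand potential behind such derivations takes a standard polylogarithm form. The expressions below are a textbook-style sketch consistent with the abstract's setup, not the paper's exact formulas:

```latex
% Grand potential of an ideal quantum gas (upper signs: bosons, lower: fermions)
\ln \Xi = \mp \sum_k \ln\!\bigl(1 \mp z\, e^{-\beta \varepsilon_k}\bigr)
% Semiclassically, for the trap U = \sum_{i=1}^{d} c_i |x_i/a_i|^{n_i},
% the sums reduce to polylogarithms with an effective exponent
\chi = \frac{d}{2} + \sum_{i=1}^{d} \frac{1}{n_i},
\qquad
N \propto \pm\,\mathrm{Li}_{\chi}(\pm z),
\qquad
\ln \Xi \propto \pm\,\mathrm{Li}_{\chi+1}(\pm z)
```

As a consistency check, the harmonic trap (n_i = 2) gives \chi = d, while the free-gas limit (n_i \to \infty) reduces to \chi = d/2, matching the d = 2 free-gas case cited from Lee (1997).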
Coburn, Timothy C
2004-09-01
An extensive experimental program has been conducted to evaluate the comparative effects of California Air Resources Board diesel fuel and an ultra-low-sulfur (S) diesel (with and without aftermarket passive filtering devices) on mass emissions of particulate matter (PM) in heavy vehicles. Tests have been performed on 20 Class 8 trucks at two high-mileage levels using two different driving schedules. The design of the test program facilitates the use of mixed-model statistical analysis, which allows more appropriate treatment of the explanatory variables than is normally achieved. The analysis suggests that the ultra-low-S diesel fuel yields extremely low mean PM emissions when used in conjunction with a particulate filter, even at high mileage, but that the estimates are highly variable. The high degree of uncertainty, caused at least in part by large vehicle-to-vehicle variation, may obscure the true PM response and adversely impact attainment of increasingly more stringent diesel PM emissions standards in the United States.
The Effects of Flare Definitions on the Statistics of Derived Flare Distributions
NASA Astrophysics Data System (ADS)
Ryan, Daniel; Dominique, Marie; Seaton, Daniel B.; Stegen, Koen; White, Arthur
2016-05-01
The statistical examination of solar flares is crucial to revealing their global characteristics and behaviour. However, statistical flare studies are often performed using standard but basic flare detection algorithms relying on arbitrary thresholds, which may affect the derived flare distributions. We explore the effect of the arbitrary thresholds used in the GOES event list and LYRA Flare Finder algorithms. We find that there is a small but significant relationship between the power law exponent of the GOES flare peak flux frequency distribution and the algorithms’ flare start thresholds. We also find that the power law exponents of these distributions are not stable but appear to steepen with increasing peak flux. This implies that the observed flare size distribution may not be a power law at all. We show that depending on the true value of the exponent of the flare size distribution, this deviation from a power law may be due to flares missed by the flare detection algorithms. However, it is not possible to determine the true exponent from GOES/XRS observations. Additionally, we find that the PROBA2/LYRA flare size distributions are clearly not power laws. We show that this is consistent with an insufficient degradation correction, which causes LYRA absolute irradiance values to be unreliable. This means that they should not be used for flare statistics or energetics unless degradation is adequately accounted for. However, they can be used to study time variations over shorter timescales and for space weather monitoring.
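One way to see the threshold effects discussed here is with the continuous power-law maximum-likelihood estimator, alpha_hat = 1 + n / sum(ln(x_i / xmin)) over events above a threshold xmin. The sketch below uses synthetic Pareto "flares" (an assumption for illustration, not GOES or LYRA data): for a true power law the fitted exponent is stable as the threshold rises, so a systematic drift of the estimate with threshold, as reported in the abstract, signals a deviation from power-law form.

```python
import math
import random

def powerlaw_mle(data, xmin):
    """Continuous power-law MLE for the exponent alpha of p(x) ~ x^-alpha,
    fitted to the tail x >= xmin (the Hill estimator)."""
    tail = [x for x in data if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# synthetic flare peak fluxes drawn from a pure power law with alpha = 2,
# via inverse-CDF sampling: x = x0 * (1 - u) ** (-1 / (alpha - 1))
rng = random.Random(3)
alpha_true, x0 = 2.0, 1.0
fluxes = [x0 * (1.0 - rng.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(50000)]

# raising the detection threshold truncates the sample but, for a true
# power law, should leave the fitted exponent essentially unchanged
for xmin in (1.0, 2.0, 5.0):
    print(xmin, round(powerlaw_mle(fluxes, xmin), 2))
```

In practice one would repeat the fit over a grid of thresholds and inspect the stability of alpha_hat before trusting any single fitted exponent.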
Nonstationary statistical theory for multipactor
Anza, S.; Vicente, C.; Gil, J.
2010-06-15
This work presents a new and general approach to the real dynamics of the multipactor process: the nonstationary statistical multipactor theory. The nonstationary theory removes the stationarity assumption of the classical theory and, as a consequence, it is able to adequately model electron exponential growth as well as absorption processes, above and below the multipactor breakdown level. In addition, it considers both double-surface and single-surface interactions constituting a full framework for nonresonant polyphase multipactor analysis. This work formulates the new theory and validates it with numerical and experimental results with excellent agreement.
The Need for Domestic Violence Laws with Adequate Legal and Social Support Services.
ERIC Educational Resources Information Center
Hemmons, Willa M.
1981-01-01
Describes the need for comprehensive domestic violence programs that include medical, legal, economic, psychological, and child care services. Although most states have family violence legislation, more work is needed to adequately implement these programs. (Author/JAC)
Schmidt, Barbara; Seshia, Mary; Shankaran, Seetha; Mildenhall, Lindsay; Tyson, Jon; Lui, Kei; Fok, Tai; Roberts, Robin
2012-01-01
Objective To examine if antenatal steroids modify the immediate and long-term effects of prophylactic indomethacin in extremely low birth weight infants. Design Post-hoc subgroup analysis of data from the Trial of Indomethacin Prophylaxis in Preterms. Setting Thirty-two neonatal intensive care units in Canada, the United States, Australia, New Zealand, and Hong Kong. Participants A total of 1195 infants with birth weights of 500 to 999 g and known exposure to antenatal steroids. We defined as “adequate” any exposure to antenatal steroids that occurred at least 24 hours before delivery. Intervention Indomethacin or placebo intravenously once daily for the first three days. Outcome Measures Death or survival to 18 months with 1 or more of cerebral palsy, cognitive delay, severe hearing loss, and bilateral blindness; severe peri-and intraventricular hemorrhage; patent ductus arteriosus; and surgical closure of a patent ductus arteriosus. Results Of the 1195 infants in this analysis cohort, 670 had adequate and 525 had inadequate exposure to antenatal steroids. There was little statistical evidence of heterogeneity in the effects of prophylactic indomethacin between the subgroups for any of the outcomes. The adjusted p values for interaction were as low as 0.15 for the end point of death or impairment at 18 months, and as high as 0.80 for the outcome of surgical duct closure. Conclusion There was little evidence that the effects of prophylactic indomethacin vary in extremely low birth weight infants with and without adequate exposure to antenatal steroids. PMID:21727276
Cosmetic Plastic Surgery Statistics
2014 Cosmetic Plastic Surgery Statistics Cosmetic Procedure Trends 2014 Plastic Surgery Statistics Report Please credit the AMERICAN SOCIETY OF PLASTIC SURGEONS when citing statistical data or using ...
General statistical framework for quantitative proteomics by stable isotope labeling.
Navarro, Pedro; Trevisan-Herraz, Marco; Bonzon-Kulichenko, Elena; Núñez, Estefanía; Martínez-Acedo, Pablo; Pérez-Hernández, Daniel; Jorge, Inmaculada; Mesa, Raquel; Calvo, Enrique; Carrascal, Montserrat; Hernáez, María Luisa; García, Fernando; Bárcena, José Antonio; Ashman, Keith; Abian, Joaquín; Gil, Concha; Redondo, Juan Miguel; Vázquez, Jesús
2014-03-01
The combination of stable isotope labeling (SIL) with mass spectrometry (MS) allows comparison of the abundance of thousands of proteins in complex mixtures. However, interpretation of the large data sets generated by these techniques remains a challenge because appropriate statistical standards are lacking. Here, we present a generally applicable model that accurately explains the behavior of data obtained using current SIL approaches, including (18)O, iTRAQ, and SILAC labeling, and different MS instruments. The model decomposes the total technical variance into the spectral, peptide, and protein variance components, and its general validity was demonstrated by testing 48 experimental distributions against 18 different null hypotheses. In addition to its general applicability, the performance of the algorithm was at least comparable to that of other existing methods. The model also provides a general framework to fully integrate quantitative and error information, allowing a comparative analysis of the results obtained from different SIL experiments. The model was applied to the global analysis of protein alterations induced by low H₂O₂ concentrations in yeast, demonstrating the increased statistical power that may be achieved by rigorous data integration. Our results highlight the importance of establishing an adequate and validated statistical framework for the analysis of high-throughput data.
Patient acceptance of adequately filled breast implants using the tilt test.
Tebbetts, J B
2000-07-01
Adequate fill of any breast implant, regardless of shell characteristics, shape, or filler material, is important to prevent implant shell wrinkling, folding, or collapse that could potentially decrease the life of the implant. Implant shell life is a major factor that affects reoperation rates. The greater the necessity of reoperations, regardless of implant type, the greater the rate of local complications, necessitating additional surgery with additional risks and costs to patients. Palpable shell folding, visible wrinkling or rippling, palpable shifts of filler material, sloshing, and compromised aesthetic results can result from an under-filled implant. Any of these complications can necessitate reoperations with increased risks and costs to patients. This is a study of 609 consecutive patients from January of 1993 to December of 1998 who were given detailed preoperative informed consent and a choice of implant shape and type and who chose the increased firmness associated with an implant that is adequately filled to pass the tilt test. This study addresses two questions: (1) Will patients accept the increased firmness of an implant that is filled to pass the tilt test? and (2) Is adequate fill by the tilt test useful clinically to help reduce the incidence of postoperative rippling, wrinkling, and spontaneous deflation in saline implants? Patients were followed by postoperative examinations and questionnaires. No patient requested implant replacement to a softer implant postoperatively, and no reoperations were performed for visible rippling or wrinkling. The spontaneous deflation rate over this 6-year period was 9 of 1218 implants, or 0.739 percent. If patients will accept more firmness with an adequately filled implant, regardless of the filler material, surgeons might worry less about recommending an adequately filled implant to patients, and manufacturers might feel more comfortable producing adequately filled implants and redefining fill volumes for
Not Available
1994-12-08
This report presents a summary of electric power industry statistics at the national, regional, and state levels: generating capability and additions, net generation, fossil-fuel statistics, retail sales and revenue, financial statistics, environmental statistics, power transactions, demand-side management, and nonutility power producers. Its purpose is to provide industry decisionmakers, government policymakers, analysts, and the public with historical data that may be used in understanding US electricity markets.
Montoliu, Lluís; Whitelaw, C Bruce A
2011-04-01
Mice provide an unlimited source of animal models to study mammalian gene function and human diseases. The powerful genetic modification toolbox existing for the mouse genome enables the creation of, literally, thousands of genetically modified mouse strains, carrying spontaneous or induced mutations, transgenes or knock-out/knock-in alleles which, in addition, can exist in hundreds of different genetic backgrounds. Such an immense diversity of individuals needs to be adequately annotated, to ensure that the most relevant information is kept associated with the name of each mouse line and, hence, that the scientific community can correctly interpret and benefit from the reported animal model. Therefore, rules and guidelines for correctly naming genes, alleles, and mouse strains are required. The Mouse Genome Informatics Database is the authoritative source of official names for mouse genes, alleles, and strains. Nomenclature follows the rules and guidelines established by the International Committee on Standardized Genetic Nomenclature for Mice. Herewith, on behalf of both the International Society for Transgenic Technologies (ISTT) and the scientific journal Transgenic Research, we would like to encourage all our colleagues to adhere to and correctly follow the standard nomenclature rules when describing mouse models. The entire scientific community using genetically modified mice in experiments will benefit.
Broadband inversion of 1J(CC) responses in 1,n-ADEQUATE spectra.
Reibarkh, Mikhail; Williamson, R Thomas; Martin, Gary E; Bermel, Wolfgang
2013-11-01
Establishing the carbon skeleton of a molecule greatly facilitates the process of structure elucidation, both manual and computer-assisted. Recent advances in the family of ADEQUATE experiments demonstrated their potential in this regard. 1,1-ADEQUATE, which provides direct ¹³C–¹³C correlation via ¹J(CC), and 1,n-ADEQUATE, which typically yields ³J(CC) and ¹J(CC) correlations, are more sensitive and more widely applicable experiments than INADEQUATE and PANACEA. A recently reported modified pulse sequence that semi-selectively inverts ¹J(CC) correlations in 1,n-ADEQUATE spectra provided a significant improvement, allowing ¹J(CC) and ⁿJ(CC) correlations to be discerned in the same spectrum. However, the reported experiment requires careful matching of the amplitude transfer function with the ¹J(CC) coupling constants in order to achieve the inversion, and even then some ¹J(CC) correlations can still have positive intensity owing to the oscillatory nature of the transfer function. Both shortcomings limit the practicality of the method. We now report a new, dual-optimized inverted ¹J(CC) 1,n-ADEQUATE experiment, which provides more uniform inversion of ¹J(CC) correlations across the range of 29-82 Hz. Unlike the original method, the dual-optimization experiment does not require fine-tuning for the molecule's ¹J(CC) coupling constant values. Even more usefully, the dual-optimized version provides up to a two-fold improvement in signal-to-noise for some long-range correlations. Using modern, cryogenically cooled probes, the experiment can be successfully applied to samples of ~1 mg under favorable circumstances. The improvements afforded by the dual-optimized inverted ¹J(CC) 1,n-ADEQUATE experiment make it a useful and practical tool for NMR structure elucidation and should facilitate its implementation and utilization.
Goodman, Melody S; Gaskin, Darrell J; Si, Xuemei; Stafford, Jewel D; Lachance, Christina; Kaphingst, Kimberly A
2012-09-01
Residential segregation has been shown to be associated with health outcomes and health care utilization. We examined the association between racial composition of five physical environments throughout the life course and adequate health literacy among 836 community health center patients in Suffolk County, NY. Respondents who attended a mostly White junior high school or currently lived in a mostly White neighborhood were more likely to have adequate health literacy compared to those educated or living in predominantly minority or diverse environments. This association was independent of the respondent's race, ethnicity, age, education, and country of birth.
Bello-Silva, Marina Stella; Wehner, Martin; Eduardo, Carlos de Paula; Lampert, Friedrich; Poprawe, Reinhart; Hermans, Martin; Esteves-Oliveira, Marcella
2013-01-01
This study aimed to evaluate the possibility of introducing ultra-short pulsed lasers (USPL) in restorative dentistry by maintaining the well-known benefits of lasers for caries removal, but also overcoming disadvantages, such as thermal damage of irradiated substrate. USPL ablation of dental hard tissues was investigated in two phases. Phase 1--different wavelengths (355, 532, 1,045, and 1,064 nm), pulse durations (picoseconds and femtoseconds) and irradiation parameters (scanning speed, output power, and pulse repetition rate) were assessed for enamel and dentin. Ablation rate was determined, and the temperature increase measured in real time. Phase 2--the most favorable laser parameters were evaluated to correlate temperature increase to ablation rate and ablation efficiency. The influence of cooling methods (air, air-water spray) on ablation process was further analyzed. All parameters tested provided precise and selective tissue ablation. For all lasers, faster scanning speeds resulted in better interaction and reduced temperature increase. The most adequate results were observed for the 1064-nm ps-laser and the 1045-nm fs-laser. Forced cooling caused moderate changes in temperature increase, but reduced ablation, being considered unnecessary during irradiation with USPL. For dentin, the correlation between temperature increase and ablation efficiency was satisfactory for both pulse durations, while for enamel, the best correlation was observed for fs-laser, independently of the power used. USPL may be suitable for cavity preparation in dentin and enamel, since effective ablation and low temperature increase were observed. If adequate laser parameters are selected, this technique seems to be promising for promoting the laser-assisted, minimally invasive approach.
Statistical approaches to short-term electricity forecasting
NASA Astrophysics Data System (ADS)
Kellova, Andrea
The study of short-term forecasting of electricity demand has played a key role in the economic optimization of the electric energy industry and is essential for power systems planning and operation. In electric energy markets, accurate short-term forecasting of electricity demand is necessary mainly for economic operations. Our focus is the question of electricity demand forecasting in the Czech Republic. Firstly, we describe the current structure and organization of the Czech, as well as the European, electricity market. Secondly, we provide a comprehensive description of the most powerful external factors influencing electricity consumption; the choice of the most appropriate model is conditioned by these demand-determining factors. Thirdly, we build several types of multivariate forecasting models, both linear and nonlinear: linear regression models and artificial neural networks, respectively. Finally, we compare the forecasting power of both kinds of models using several statistical accuracy measures. Our results suggest that although electricity demand forecasting in the Czech Republic is, for the years considered, a nonlinear rather than a linear problem, for practical purposes simple linear models with nonlinear inputs can be adequate. This is confirmed by the values of the empirical loss function applied to the forecasting results.
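The comparison the abstract describes, a linear model on raw inputs versus a linear model with nonlinear (transformed) inputs, scored by a statistical accuracy measure, can be sketched on synthetic data. The demand model, variable names, and the choice of MAPE below are illustrative assumptions, not the study's actual data or models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hourly demand: a daily sinusoidal cycle plus a quadratic
# temperature effect (heating/cooling), plus noise.
n = 500
hour = rng.integers(0, 24, n)
temp = rng.normal(10.0, 8.0, n)
demand = (50.0 + 10.0 * np.sin(2 * np.pi * hour / 24)
          + 0.05 * (temp - 18.0) ** 2 + rng.normal(0.0, 1.0, n))

def fit_predict(X, y):
    """Ordinary least squares with an intercept (numpy only)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return X1 @ coef

def mape(y, yhat):
    """Mean absolute percentage error, one common accuracy measure."""
    return float(np.mean(np.abs((y - yhat) / y)) * 100)

# Raw linear inputs vs. a linear model with nonlinear transformed inputs.
linear = fit_predict(np.column_stack([hour, temp]), demand)
nonlin = fit_predict(np.column_stack([np.sin(2 * np.pi * hour / 24),
                                      (temp - 18.0) ** 2]), demand)
print(f"MAPE, raw linear inputs: {mape(demand, linear):.2f}%")
print(f"MAPE, nonlinear inputs:  {mape(demand, nonlin):.2f}%")
```

On data of this kind the linear model with nonlinear inputs fits markedly better, mirroring the paper's conclusion that such models can be adequate in practice.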
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-05
... FR 51735. Executive Order 13132, Federalism. This rule involves no policies that have federalism....C. 4001 et seq., Reorganization Plan No. 3 of 1978, 3 CFR, 1978 Comp., p. 329; E.O. 12127, 44 FR... To Maintain Adequate Floodplain Management Regulations AGENCY: Federal Emergency Management...
26 CFR 1.467-2 - Rent accrual for section 467 rental agreements without adequate interest.
Code of Federal Regulations, 2011 CFR
2011-04-01
... interest (the stated rate of interest) on deferred or prepaid fixed rent at a single fixed rate (as defined in § 1.1273-1(c)(1)(iii)); (B) The stated rate of interest on fixed rent is no lower than 110 percent... provide for a variable rate of interest. For purposes of the adequate interest test under paragraph...
Towards Defining Adequate Lithium Trials for Individuals with Mental Retardation and Mental Illness.
ERIC Educational Resources Information Center
Pary, Robert J.
1991-01-01
Use of lithium with mentally retarded individuals with psychiatric conditions and/or behavior disturbances is discussed. The paper describes components of an adequate clinical trial and reviews case studies and double-blind cases. The paper concludes that aggression is the best indicator for lithium use, and reviews treatment parameters and…
ERIC Educational Resources Information Center
Daugherty, Lindsay; Dossani, Rafiq; Johnson, Erin-Elizabeth; Wright, Cameron
2014-01-01
To realize the potential benefits of technology use in early childhood education (ECE), and to ensure that technology can help to address the digital divide, providers, families of young children, and young children themselves must have access to an adequate technology infrastructure. The goals for technology use in ECE that a technology…
Evaluating the Reliability of Selected School-Based Indices of Adequate Reading Progress
ERIC Educational Resources Information Center
Wheeler, Courtney E.
2010-01-01
The present study examined the stability (i.e., 4-month and 12-month test-retest reliability) of six selected school-based indices of adequate reading progress. The total sampling frame included between 3970 and 5655 schools depending on the index and research question. Each school had at least 40 second-grade students that had complete Oral…
Understanding the pelvic pain mechanism is key to find an adequate therapeutic approach.
Van Kerrebroeck, Philip
2016-06-25
Pain is a natural response to actual or potential tissue damage and involves both a sensory and an emotional experience. In chronic pelvic pain, localisation of pain can be widespread and can cause considerable distress. A multidisciplinary approach is needed in order to fully understand the pelvic pain mechanism and to identify an adequate therapeutic approach.
33 CFR 155.4050 - Ensuring that the salvors and marine firefighters are adequate.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Ensuring that the salvors and marine firefighters are adequate. 155.4050 Section 155.4050 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) POLLUTION OIL OR HAZARDOUS MATERIAL POLLUTION...
ERIC Educational Resources Information Center
Hemelt, Steven W.
2011-01-01
As the No Child Left Behind (NCLB) law moves through the reauthorization process, it is important to understand the basic performance impacts of its central structure of accountability. In this paper, I examine the effects of failure to make Adequate Yearly Progress (AYP) under NCLB on subsequent student math and reading performance at the school…
Determining Adequate Yearly Progress in a State Performance or Proficiency Index Model
ERIC Educational Resources Information Center
Erpenbach, William J.
2009-01-01
The purpose of this paper is to present an overview regarding how several states use a performance or proficiency index in their determination of adequate yearly progress (AYP) under the No Child Left Behind Act of 2001 (NCLB). Typically, indexes are based on one of two weighting schemes: (1) either they weight academic performance levels--also…
ERIC Educational Resources Information Center
Ma, Xin; Shen, Jianping; Krenn, Huilan Y.
2014-01-01
Using national data from the 2007-08 School and Staffing Survey, we compared the relationships between parental involvement and school outcomes related to adequate yearly progress (AYP) in urban, suburban, and rural schools. Parent-initiated parental involvement demonstrated significantly positive relationships with both making AYP and staying off…
Effect of tranquilizers on animal resistance to the adequate stimuli of the vestibular apparatus
NASA Technical Reports Server (NTRS)
Maksimovich, Y. B.; Khinchikashvili, N. V.
1980-01-01
The effect of tranquilizers on vestibulospinal reflexes and motor activity was studied in 900 centrifuged albino mice. Actometric studies have shown that the tranquilizers have a group capacity for increasing animal resistance to the action of adequate stimuli to the vestibular apparatus.
Human milk feeding supports adequate growth in infants
Technology Transfer Automated Retrieval System (TEKTRAN)
Despite current nutritional strategies, premature infants remain at high risk for extrauterine growth restriction. The use of an exclusive human milk-based diet is associated with decreased incidence of necrotizing enterocolitis (NEC), but concerns exist about infants achieving adequate growth. The ...
Costa, Larissa da Cunha Feio; Vasconcelos, Francisco de Assis Guedes de; Corso, Arlete Catarina Tittoni
2012-06-01
This study aimed to estimate fruit and vegetable intake and identify associated factors among schoolchildren in Santa Catarina State, Brazil. A cross-sectional study was conducted with 4,964 students from public and private schools in eight districts in the State, analyzing socioeconomic and anthropometric data and dietary intake. Adequate fruit and vegetable intake was defined as five or more servings per day. Poisson regression was performed to test associations between fruit and vegetable intake and independent variables (p < 0.05). Adequate intake was found in 2.7% of children, while 26.6% of the sample did not consume any fruits and vegetables. In the analysis of the association between independent variables and adequate fruit and vegetable intake in the total sample, only geographic region (residents in western Santa Catarina) and consumption of candy were significantly associated. In the stratified analysis by sex, for boys, only geographic region was associated, while among girls, region and candy consumption were significantly associated with adequate fruit and vegetable intake. The findings indicate the need for specific strategies in the school community to improve fruit and vegetable intake by schoolchildren.
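As a rough illustration of the Poisson regression used above to test associations with adequate intake, here is a minimal sketch; the binary exposure, effect size, and the IRLS fitting routine are hypothetical, not the study's data or software:

```python
import numpy as np

rng = np.random.default_rng(2)

def poisson_irls(X, y, n_iter=25):
    """Poisson regression with log link, fitted by Newton's method
    (iteratively reweighted least squares)."""
    X1 = np.column_stack([np.ones(len(X)), X])  # add intercept column
    beta = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X1 @ beta)                  # fitted means
        H = X1.T @ (mu[:, None] * X1)           # Fisher information
        beta += np.linalg.solve(H, X1.T @ (y - mu))
    return beta

# Hypothetical binary exposure with a true ratio of exp(0.7) ~ 2.0.
x = rng.integers(0, 2, 5000).astype(float)
y = rng.poisson(np.exp(-1.0 + 0.7 * x))
coef = poisson_irls(x[:, None], y)
print("estimated ratio:", np.exp(coef[1]))
```

Exponentiating a coefficient gives the ratio associated with the exposure, which is why Poisson regression is often preferred over logistic regression when prevalence ratios are of interest.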
42 CFR 438.207 - Assurances of adequate capacity and services.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 4 2014-10-01 2014-10-01 false Assurances of adequate capacity and services. 438.207 Section 438.207 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Quality Assessment and...
42 CFR 438.207 - Assurances of adequate capacity and services.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 4 2011-10-01 2011-10-01 false Assurances of adequate capacity and services. 438.207 Section 438.207 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Quality Assessment and...
42 CFR 438.207 - Assurances of adequate capacity and services.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 4 2013-10-01 2013-10-01 false Assurances of adequate capacity and services. 438.207 Section 438.207 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Quality Assessment and...
42 CFR 438.207 - Assurances of adequate capacity and services.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 4 2012-10-01 2012-10-01 false Assurances of adequate capacity and services. 438.207 Section 438.207 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Quality Assessment and...
Percentage of Adults with High Cholesterol Whose LDL Cholesterol Levels Are Adequately Controlled
High cholesterol can double a ... The dataset reports the percentage of adults with high cholesterol that is controlled, by education level.
ERIC Educational Resources Information Center
Moser, Sharon
2010-01-01
The 2007-2008 school year marked the first year that Florida's Title I schools that did not make Adequate Yearly Progress (AYP) for five consecutive years entered into restructuring as mandated by the "No Child Left Behind Act" of 2001. My study examines the perceptions of teachers entering their first year of school restructuring due to failure to…
The Unequal Effect of Adequate Yearly Progress: Evidence from School Visits
ERIC Educational Resources Information Center
Brown, Abigail B.; Clift, Jack W.
2010-01-01
The authors report insights, based on annual site visits to elementary and middle schools in three states from 2004 to 2006, into the incentive effect of the No Child Left Behind Act's requirement that increasing percentages of students make Adequate Yearly Progress (AYP) in every public school. They develop a framework, drawing on the physics…
Influenza 2005-2006: vaccine supplies adequate, but bird flu looms.
Mossad, Sherif B
2005-11-01
Influenza vaccine supplies appear to be adequate for the 2005-2006 season, though delivery has been somewhat delayed. However, in the event of an avian flu pandemic, considered inevitable by most experts although no one knows when it will happen, the United States would be woefully unprepared. PMID:16315443
Prenatal zinc supplementation of zinc-adequate rats adversely affects immunity in offspring
Technology Transfer Automated Retrieval System (TEKTRAN)
We previously showed that zinc (Zn) supplementation of Zn-adequate dams induced immunosuppressive effects that persist in the offspring after weaning. We investigated whether the immunosuppressive effects were due to in utero exposure and/or mediated via milk using a cross-fostering design. Pregnant...
ERIC Educational Resources Information Center
Barth, Amy E.; Barnes, Marcia; Francis, David; Vaughn, Sharon; York, Mary
2015-01-01
Separate mixed model analyses of variance were conducted to examine the effect of textual distance on the accuracy and speed of text consistency judgments among adequate and struggling comprehenders across grades 6-12 (n = 1,203). Multiple regressions examined whether accuracy in text consistency judgments uniquely accounted for variance in…
What Is the Cost of an Adequate Vermont High School Education?
ERIC Educational Resources Information Center
Rucker, Frank D.
2010-01-01
Access to an adequate education has been widely considered an undeniable right since Chief Justice Warren stated in his landmark decision that "Today, education is perhaps the most important function of state and local governments...it is doubtful that any child may reasonably be expected to succeed in life if he is denied the opportunity of an…
Calculating and Reducing Errors Associated with the Evaluation of Adequate Yearly Progress.
ERIC Educational Resources Information Center
Hill, Richard
In the Spring, 1996, issue of "CRESST Line," E. Baker and R. Linn commented that, in efforts to measure the progress of schools, "the fluctuations due to differences in the students themselves could conceal differences in instructional effects." This is particularly true in the context of the evaluation of adequate yearly progress required by…
26 CFR 1.467-2 - Rent accrual for section 467 rental agreements without adequate interest.
Code of Federal Regulations, 2010 CFR
2010-04-01
... provide for a variable rate of interest. For purposes of the adequate interest test under paragraph (b)(1) of this section, if a section 467 rental agreement provides for variable interest, the rental... date as the issue date) for the variable rates called for by the rental agreement. For purposes of...
26 CFR 1.467-2 - Rent accrual for section 467 rental agreements without adequate interest.
Code of Federal Regulations, 2012 CFR
2012-04-01
... provide for a variable rate of interest. For purposes of the adequate interest test under paragraph (b)(1) of this section, if a section 467 rental agreement provides for variable interest, the rental... date as the issue date) for the variable rates called for by the rental agreement. For purposes of...
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2012 CFR
2012-01-01
... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2014 CFR
2014-01-01
... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 9 Animals and Animal Products 1 2011-01-01 2011-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2013 CFR
2013-01-01
... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2011 CFR
2011-01-01
... 9 Animals and Animal Products 1 2011-01-01 2011-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2010 CFR
2010-01-01
... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Attending veterinarian and adequate veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian...
36 CFR 13.960 - Who determines when there is adequate snow cover?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park...
Harris, Alex; Reeder, Rachelle; Hyun, Jenny
2011-01-01
The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.
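One of the gaps the editors cite, the missing a priori power analysis, can be illustrated with a minimal sketch. The two-sample setting, the normal approximation, and the helper names below are assumptions for illustration, not the survey's own procedure:

```python
from math import sqrt
from statistics import NormalDist

def power_two_sample(effect_size, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample comparison of means,
    using the normal approximation to the t-test."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    noncentrality = effect_size * sqrt(n_per_group / 2)
    return NormalDist().cdf(noncentrality - z_crit)

def n_for_power(effect_size, target=0.80, alpha=0.05):
    """Smallest per-group sample size reaching the target power."""
    n = 2
    while power_two_sample(effect_size, n, alpha) < target:
        n += 1
    return n

# For a medium effect (Cohen's d = 0.5) and 80% power, this normal
# approximation gives 63 per group (an exact t-test gives ~64).
print(n_for_power(0.5))
```

Running such a calculation before data collection, rather than after, is precisely the practice the surveyed editors report as frequently missing.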
Statistical description of turbulent dispersion
NASA Astrophysics Data System (ADS)
Brouwers, J. J. H.
2012-12-01
We derive a comprehensive statistical model for dispersion of passive or almost passive admixture particles such as fine particulate matter, aerosols, smoke, and fumes in turbulent flow. The model rests on the Markov limit for particle velocity. It is in accordance with the asymptotic structure of turbulence at large Reynolds number as described by Kolmogorov. The model consists of Langevin and diffusion equations in which the damping and diffusivity are expressed by expansions in powers of the reciprocal Kolmogorov constant C0. We derive solutions of O(C0^0) and O(C0^-1). We truncate at O(C0^-2), which is shown to result in an error of a few percent in predicted dispersion statistics for representative cases of turbulent flow. We reveal analogies and remarkable differences between the solutions of classical statistical mechanics and those of statistical turbulence.
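A minimal numerical sketch of a Langevin model of the kind described: a one-dimensional Ornstein-Uhlenbeck velocity process with damping time T_L and diffusion amplitude sqrt(C0*eps). The parameter values are assumptions for the sketch, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative 1-D Langevin model for the particle velocity u(t):
#     du = -(u / T_L) dt + sqrt(C0 * eps) dW,
# with Lagrangian time scale T_L, dissipation rate eps, and Kolmogorov
# constant C0 (assumed values; Euler-Maruyama time stepping).
C0, eps, T_L = 6.0, 0.1, 1.0
dt, n_steps, n_particles = 1e-3, 10000, 2000

u = np.zeros(n_particles)  # velocities
x = np.zeros(n_particles)  # positions, i.e. dispersion
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_particles)
    u += -(u / T_L) * dt + np.sqrt(C0 * eps) * dW
    x += u * dt

# For this Ornstein-Uhlenbeck process the stationary velocity variance
# is C0 * eps * T_L / 2, which the simulation should approach.
print("simulated var(u): ", u.var())
print("stationary var(u):", C0 * eps * T_L / 2)
```

The simulated velocity variance converging to C0*eps*T_L/2 is the standard consistency check for this class of dispersion models.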
Rossell, David
2016-01-01
Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies. PMID:27722040
Recent Developments in Energy Level Statistics in Generic Systems between Integrability and Chaos
NASA Astrophysics Data System (ADS)
Robnik, M.
During the past decade or so there has been growing theoretical, numerical and experimental support for the Bohigas-Giannoni-Schmit conjecture (1984) on the applicability of random matrix theory statistics (GOE, GUE) to classically ergodic quantal Hamiltonian systems. In classically integrable systems the spectral fluctuations of the corresponding quantal Hamiltonians are well described by Poissonian statistics. In the present paper we discuss the statistical properties of energy spectra of generic Hamiltonians in the transition region between integrability and ergodicity (KAM systems). We present convincing, statistically highly significant evidence for fractional power-law level repulsion (in the non-semiclassical or near-semiclassical limit), which is quite well fitted by the Brody distribution and even better by the Izrailev distribution. However, at sufficiently large level spacings, say S > 1, the Berry-Robnik formulae for the level spacing distribution are found to be adequate. We discuss possible theoretical approaches and explanations. The phenomenon of power-law level repulsion is partially understood in terms of sparsed banded random matrix ensembles (SBRME).
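The Brody distribution mentioned above has the standard closed form P(s) = (beta+1) b s^beta exp(-b s^(beta+1)) with b = Gamma((beta+2)/(beta+1))^(beta+1). A short sketch (the grid resolution and beta values are arbitrary choices) verifies that it interpolates between Poisson and Wigner statistics with unit mean spacing:

```python
import numpy as np
from math import gamma

def brody(s, beta):
    """Brody level-spacing distribution P(s): beta=0 gives the Poisson
    law exp(-s), beta=1 the Wigner surmise (pi/2) s exp(-pi s^2 / 4)."""
    b = gamma((beta + 2.0) / (beta + 1.0)) ** (beta + 1.0)
    return (beta + 1.0) * b * s**beta * np.exp(-b * s**(beta + 1.0))

# Check numerically that P(s) is normalized with unit mean spacing for
# any repulsion exponent beta (simple Riemann sum on a fine grid).
s = np.linspace(0.0, 12.0, 200001)
ds = s[1] - s[0]
for beta in (0.0, 0.3, 1.0):
    p = brody(s, beta)
    print(f"beta={beta}: integral={np.sum(p) * ds:.4f}, "
          f"mean spacing={np.sum(s * p) * ds:.4f}")
```

Intermediate beta values, such as the fractional exponents reported in the abstract, give the partial level repulsion characteristic of the KAM transition region.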
Three not adequately understood lunar phenomena investigated by the wave planetology
NASA Astrophysics Data System (ADS)
Kochemasov, G. G.
2009-04-01
Three not adequately understood lunar phenomena investigated by the wave planetology. G. Kochemasov, IGEM of the Russian Academy of Sciences, Moscow, Russia, kochem.36@mail.ru. Notwithstanding the rather numerous studies of the last 50 years, lunar science still debates some important issues. Three of them concern the origin of mascons, the deep but low-ferruginous South Pole-Aitken depression, and the strange character of the frequency-crater size curve. Prevailing approaches are mainly based on impacts having shaped the present geomorphology of the Moon. However, the antipodality of basins and maria, and the complex character of the frequency-crater size curve, which obviously implies several different sources and processes of crater formation, are practically ignored. Attempts to find impactor sources in various, sometimes very remote, parts of the Solar system are too artificial; besides, they do not explain the very intensive, Moon-like cratering of Mercury. Saturation of the lunar surface by ~70-km diameter craters is very strange for random impacts from any source; finding a time interval for this saturation is difficult if not impossible, because it affects formations of various ages. Lunar basins and maria completely contradict a classical frequency-crater size curve. Their presumed (and measured) different ages make the existence of one specialized impactor source dubious. So, if one accepts impacts as the only process responsible for cratering (ring-form development), the real mess in crater statistics and timing will never be overcome. The wave planetology [1-3 & others], tested on many planets and satellites of the Solar system, has proved to be real. In the case of the Moon it can help in answering the above questions. First of all, it should be admitted that the complex lunar crater (ring-form) statistics is due to a superposition and mixing of two main processes (a minor involvement of volcanic features is also present): impacts and wave
Chapman, S; Liberman, J
2005-01-01
The right to information is a fundamental consumer value. Following the advent of health warnings, the tobacco industry has repeatedly asserted that smokers are fully informed of the risks they take, while evidence demonstrates widespread superficial levels of awareness and understanding. There remains much that tobacco companies could do to fulfil their responsibilities to inform smokers. We explore issues involved in the meaning of "adequately informed" smoking and discuss some of the key policy and regulatory implications. We use the idea of a smoker licensing scheme—under which it would be illegal to sell to smokers who had not demonstrated an adequate level of awareness—as a device to explore some of these issues. We also explore some of the difficulties that addiction poses for the notion that smokers might ever voluntarily assume the risks of smoking. PMID:16046703
The concept of adequate causation and Max Weber's comparative sociology of religion.
Buss, A
1999-06-01
Max Weber's The Protestant Ethic and the Spirit of Capitalism, studied in isolation, shows mainly an elective affinity or an adequacy on the level of meaning between the Protestant ethic and the 'spirit' of capitalism. Here it is suggested that Weber's subsequent essays on 'The Economic Ethics of World Religions' are the result of his opinion that adequacy on the level of meaning needs and can be verified by causal adequacy. After some introductory remarks, particularly on elective affinity, the paper tries to develop the concept of adequate causation and the related concept of objective possibility on the basis of the work of v. Kries on whom Weber heavily relied. In the second part, this concept is used to show how the study of the economic ethics of India, China, Rome and orthodox Russia can support the thesis that the 'spirit' of capitalism, although it may not have been caused by the Protestant ethic, was perhaps adequately caused by it. PMID:15260028
Myth 19: Is Advanced Placement an Adequate Program for Gifted Students?
ERIC Educational Resources Information Center
Gallagher, Shelagh A.
2009-01-01
Is it a myth that Advanced Placement (AP) is an adequate program for gifted students? AP is so covered with myths and assumptions that it is hard to get a clear view of the issues. In this article, the author finds the answer about AP by looking at current realties. First, AP is hard for gifted students to avoid. Second, AP never was a program…
Bioelement effects on thyroid gland in children living in iodine-adequate territory.
Gorbachev, Anatoly L; Skalny, Anatoly V; Koubassov, Roman V
2007-01-01
Endemic goitre is a primary pathology of the thyroid gland and a critical medico-social problem in many countries. The dominant cause of endemic goitre is iodine deficiency. However, besides primary iodine deficiency, goitre may also develop due to the effects of imbalances in other bioelements essential to the maintenance of thyroid function. Here we studied 44 cases of endemic goitre in prepubertal children (7-10 y.o.) living in an iodine-adequate territory. Thyroid volume was estimated by ultrasonometry. Main bioelements (Al, Ca, Cd, Co, Cr, Cu, Fe, Hg, I, Mg, Mn, Pb, Se, Si, Zn) were determined in hair samples by ICP-OES/ICP-MS. Relationships between the hair content of bioelements and thyroid gland size were estimated by multiple regression. The regression model revealed significant positive relations between thyroid volume and Cr, Si and Mn contents; however, the only actual factor of thyroid gland enlargement was Si excess in the organism. Significant negative relations of thyroid volume were revealed with I, Mg, Zn, Se, Co and Cd, of which the actual factors of thyroid enlargement were I, Co, Mg and Se deficiency. The total bioelement contribution to thyroid impairment was estimated at 24%. Thus, it is suggested that endemic goitre in an iodine-adequate territory can be formed by bioelement imbalances, namely Si excess and Co, Mg and Se shortage, as well as endogenous I deficiency in spite of the iodine-adequate environment.
Wu, Felicia
2013-01-01
The aflatoxins are a group of fungal metabolites that contaminate a variety of staple crops, including maize and peanuts, and cause an array of acute and chronic human health effects. Aflatoxin B1 in particular is a potent liver carcinogen, and hepatocellular carcinoma (HCC) risk is multiplicatively higher for individuals exposed to both aflatoxin and chronic infection with hepatitis B virus (HBV). In this work, we sought to answer the question: do current aflatoxin regulatory standards around the world adequately protect human health? Depending upon the level of protection desired, the answer to this question varies. Currently, most nations have a maximum tolerable level of total aflatoxins in maize and peanuts ranging from 4 to 20ng/g. If the level of protection desired is that aflatoxin exposures would not increase lifetime HCC risk by more than 1 in 100,000 cases in the population, then most current regulatory standards are not adequately protective even if enforced, especially in low-income countries where large amounts of maize and peanuts are consumed and HBV prevalence is high. At the protection level of 1 in 10,000 lifetime HCC cases in the population, however, almost all aflatoxin regulations worldwide are adequately protective, with the exception of several nations in Africa and Latin America. PMID:23761295
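The relationship between a regulatory standard and the resulting lifetime risk rests on simple exposure arithmetic: exposure (ng per kg body weight per day) equals contamination times consumption divided by body weight, scaled by a cancer potency factor. A back-of-the-envelope sketch of this arithmetic follows; the potency, consumption and body-weight values are illustrative assumptions, not numbers taken from the study.

```python
# Exposure and risk arithmetic behind aflatoxin standards. All numeric inputs
# below are illustrative assumptions, not values from the study.
potency = 0.3e-5               # lifetime HCC risk per (ng aflatoxin / kg bw / day); assumed
standard_ng_per_g = 20.0       # upper end of the 4-20 ng/g range of tolerable levels
consumption_g_per_day = 400.0  # daily maize intake; assumed
body_weight_kg = 60.0          # assumed

# Daily exposure in ng aflatoxin per kg body weight per day.
exposure = standard_ng_per_g * consumption_g_per_day / body_weight_kg
lifetime_risk = potency * exposure

# Compare against a 1-in-100,000 lifetime-risk protection target.
exceeds_target = lifetime_risk > 1e-5
```

Under these assumed inputs, a high-consumption population can exceed a 1-in-100,000 target even when the standard is enforced, which is the qualitative point the abstract makes.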
Ye, Fanghua; Albarouki, Emad; Lingam, Brahmasivasenkar; Deising, Holger B; von Wirén, Nicolaus
2014-07-01
Iron (Fe) is an essential element for plant pathogens as well as for their host plants. As Fe plays a central role in pathogen virulence, most plants have evolved Fe-withholding strategies to reduce Fe availability to pathogens. On the other hand, plants need Fe for an oxidative burst in their basal defense response against pathogens. To investigate how the plant Fe nutritional status affects plant tolerance to a hemibiotrophic fungal pathogen, we employed the maize-Colletotrichum graminicola pathosystem. Fungal infection progressed rapidly via biotrophic to necrotrophic growth in Fe-deficient leaves, while an adequate Fe nutritional status suppressed the formation of infection structures of C. graminicola already during the early biotrophic growth phase. As indicated by Prussian blue and 3,3'-diaminobenzidine (DAB) staining, the retarding effect of an adequate Fe nutritional status on fungal development coincided temporally and spatially with the recruitment of Fe to infection sites and a local production of H2O2. A similar coincidence between local Fe and H2O2 accumulation was found in a parallel approach employing C. graminicola mutants affected in Fe acquisition and differing in virulence. These results indicate that an adequate Fe nutritional status delays and partially suppresses the fungal infection process and the biotrophic growth phase of C. graminicola, most likely via the recruitment of free Fe to the fungal infection site for a timely oxidative burst.
Statistical mechanics of shell models for two-dimensional turbulence
NASA Astrophysics Data System (ADS)
Aurell, E.; Boffetta, G.; Crisanti, A.; Frick, P.; Paladin, G.; Vulpiani, A.
1994-12-01
We study shell models that conserve the analogs of energy and enstrophy and hence are designed to mimic fluid turbulence in two dimensions (2D). The main result is that the observed state is well described as a formal statistical equilibrium, closely analogous to the approach to two-dimensional ideal hydrodynamics of Onsager [Nuovo Cimento Suppl. 6, 279 (1949)], Hopf [J. Rat. Mech. Anal. 1, 87 (1952)], and Lee [Q. Appl. Math. 10, 69 (1952)]. In the presence of forcing and dissipation we observe a forward flux of enstrophy and a backward flux of energy. These fluxes can be understood as mean diffusive drifts from a source to two sinks in a system which is close to local equilibrium with Lagrange multipliers ("shell temperatures") changing slowly with scale. This is clear evidence that the simplest shell models are not adequate to reproduce the main features of two-dimensional turbulence. The dimensional predictions on the power spectra from a supposed forward cascade of enstrophy and from one branch of the formal statistical equilibrium coincide in these shell models, in contrast to the corresponding predictions for the Navier-Stokes and Euler equations in 2D. This coincidence has previously led to the mistaken conclusion that shell models exhibit a forward cascade of enstrophy. We also study the dynamical properties of the models and the growth of perturbations.
On More Sensitive Periodogram Statistics
NASA Astrophysics Data System (ADS)
Bélanger, G.
2016-05-01
Period searches in event data have traditionally used the Rayleigh statistic, R^2. For X-ray pulsars, the standard has been the Z^2 statistic, which sums over more than one harmonic. For γ-rays, the H-test, which optimizes the number of harmonics to sum, is often used. These periodograms all suffer from the same problem, namely artifacts caused by correlations in the Fourier components that arise from testing frequencies with a non-integer number of cycles. This article addresses this problem. The modified Rayleigh statistic is discussed, its generalization to any harmonic, R_k^2, is formulated, and from the latter, the modified Z^2 statistic is constructed. Versions of these statistics for binned data and point measurements are derived, and it is shown that the variance in the uncertainties can have an important influence on the periodogram. It is shown how to combine the information about the signal frequency from the different harmonics to estimate its value with maximum accuracy. The methods are applied to an XMM-Newton observation of the Crab pulsar, for which a decomposition of the pulse profile is presented, showing that most of the power is in the second, third, and fifth harmonics. The statistical detection power of the R_k^2 statistic is superior to the FFT and equivalent to the Lomb-Scargle (LS). Response to gaps in the data is assessed, and it is shown that the LS does not protect against the distortions they cause. The main conclusion of this work is that the classical R^2 and Z^2 should be replaced by the modified R_k^2 and Z^2 in all applications with event data, and the LS should be replaced by the R_k^2 when the uncertainty varies from one point measurement to another.
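The classical statistics the article sets out to improve are straightforward to compute from a list of event arrival times. A minimal sketch of the classical R^2 and Z^2 follows (the modified, correlation-corrected versions derived in the article are not reproduced here; the simulated event times are illustrative):

```python
import numpy as np

def rayleigh_power(t, nu, k=1):
    """Rayleigh power (2/N) * (C^2 + S^2) at harmonic k of test frequency nu."""
    phase = 2.0 * np.pi * k * nu * t
    n = t.size
    return (2.0 / n) * (np.cos(phase).sum() ** 2 + np.sin(phase).sum() ** 2)

def z2_statistic(t, nu, m=2):
    """Classical Z^2_m statistic: sum of Rayleigh powers over the first m harmonics."""
    return sum(rayleigh_power(t, nu, k) for k in range(1, m + 1))

rng = np.random.default_rng(0)
# Simulated event times: a pulsed signal at nu0 = 2 Hz plus a uniform background.
nu0 = 2.0
t_signal = (rng.integers(0, 20, 500) + rng.normal(0.25, 0.03, 500)) / nu0
t_background = rng.uniform(0.0, 10.0, 500)
t = np.concatenate([t_signal, t_background])

# The periodogram peaks at the true frequency and is near the chi-squared
# noise floor at an unrelated test frequency.
assert z2_statistic(t, nu0, m=2) > z2_statistic(t, 1.37, m=2)
```

This classical form is exactly the one whose correlation artifacts at non-integer numbers of cycles motivate the modified statistics in the article.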
NASA Astrophysics Data System (ADS)
Nowak, Bernard; Łuczak, Rafał
2015-09-01
The article discusses the improvement of thermal working conditions in underground mine workings using local refrigeration systems. It considers the efficiency of air cooling with the TS-300B direct-action air compression refrigerator. As a result of a failure to meet the required operating conditions of this air cooling system, there are frequently discrepancies between the predicted (and thus expected) effects of its work and the reality. Therefore, to improve the operating efficiency of this system in terms of effective use of the evaporator cooling capacity, quality criteria were developed that are easy to apply in practice. They were obtained in the form of statistical models describing the effect of the independent variables, i.e. the parameters of the air entering the evaporator (temperature, humidity and volumetric flow rate) and the parameters of the water cooling the condenser (temperature and volumetric flow rate), on the thermal power of the air cooler, treated as the dependent variable. Statistical equations describing the performance of the analyzed air cooling system were determined based on linear and nonlinear multiple regression. The obtained functions were modified by changing the values of the coefficients in the case of linear regression, and of the coefficients and exponents of the independent variables in the case of nonlinear regression. As a result, functions were obtained that are more convenient in practical applications. Using classical statistical methods, the quality of fit of the regression functions to the experimental data was evaluated. The values of the evaporator thermal power of the refrigerator obtained on the basis of the measured air parameters were also compared with those calculated using the obtained regression functions. These statistical models were built on the basis of the results of measurements in different operating conditions of the TS-300B
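As a sketch of the kind of statistical model described, a linear multiple regression of cooler thermal power on the air and water parameters can be fitted by ordinary least squares. The variables follow the abstract, but the data ranges, coefficient values and noise level below are synthetic stand-ins, not the article's measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data (the article's measurements are not reproduced here):
# inlet air temperature [deg C], specific humidity [g/kg], air flow [m^3/s],
# cooling-water temperature [deg C], water flow [m^3/h].
n = 80
X = np.column_stack([
    rng.uniform(25, 35, n),   # t_air
    rng.uniform(10, 20, n),   # x_air
    rng.uniform(3, 6, n),     # V_air
    rng.uniform(15, 25, n),   # t_w
    rng.uniform(5, 10, n),    # V_w
])
# Hypothetical "true" linear model for the cooler's thermal power [kW] plus noise.
beta_true = np.array([40.0, 6.0, 3.5, 25.0, -4.0, 2.0])  # intercept first
y = beta_true[0] + X @ beta_true[1:] + rng.normal(0, 5, n)

# Ordinary least squares: thermal power regressed on the five parameters.
A = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)

# Coefficient of determination R^2 as a goodness-of-fit check.
resid = y - A @ beta_hat
r2 = 1.0 - resid.var() / y.var()
assert r2 > 0.9
```

The nonlinear (power-law) variant mentioned in the abstract is typically fitted the same way after taking logarithms, which turns the exponents of the independent variables into linear coefficients.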
Statistical Reference Datasets
National Institute of Standards and Technology Data Gateway
Statistical Reference Datasets (Web, free access) The Statistical Reference Datasets project is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.
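The usage pattern is simple: run the statistical software under test on a reference dataset and compare its output with the certified values. A hedged sketch of that pattern follows; the tiny dataset and its "certified" values below are illustrative placeholders, not an actual StRD problem, whose real certified values come from the NIST site:

```python
import numpy as np

# Pattern used with statistical reference datasets: fit the reference data,
# then check the computed results against the certified values.
# Illustrative placeholder problem: exact linear data with known answers.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * x + 1.0

certified = {"slope": 2.0, "intercept": 1.0}

slope, intercept = np.polyfit(x, y, 1)

# A software package "passes" if it reproduces the certified values to within
# an agreed number of significant digits.
assert np.isclose(slope, certified["slope"])
assert np.isclose(intercept, certified["intercept"])
```

In practice the comparison is reported as the number of agreeing significant digits, which is what makes the evaluation of different packages objective.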
Explorations in statistics: statistical facets of reproducibility.
Curran-Everett, Douglas
2016-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.
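One statistical facet of reproducibility can be made concrete with a small simulation: the probability that an independent repeat of a "significant" experiment is itself significant equals the test's power, not 1 - α. A minimal sketch follows (a simplified two-sided z-test with known unit variance; the effect size and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)

def significant(delta=0.5, n=20):
    """One simulated two-group experiment; True if a two-sided z-test
    (known unit variance, a simplifying assumption) gives p < 0.05."""
    a = rng.normal(0.0, 1.0, n)
    b = rng.normal(delta, 1.0, n)
    z = (b.mean() - a.mean()) / np.sqrt(2.0 / n)
    return abs(z) > 1.96

trials = 4000
first = np.array([significant() for _ in range(trials)])
second = np.array([significant() for _ in range(trials)])

# Among experiments that "worked" once, how often does an independent repeat
# also reach significance? This equals the test's power (about 0.35 here),
# well below what one might naively expect from p < 0.05.
replication_rate = second[first].mean()
```

The simulation makes the article's point numerically: with modest power, failure to replicate a genuine effect is the expected outcome, not evidence of error.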
Beneath the Skin: Statistics, Trust, and Status
ERIC Educational Resources Information Center
Smith, Richard
2011-01-01
Overreliance on statistics, and even faith in them--which Richard Smith in this essay calls a branch of "metricophilia"--is a common feature of research in education and in the social sciences more generally. Of course accurate statistics are important, but they often constitute essentially a powerful form of rhetoric. For purposes of analysis and…
Adequate Iodine Status in New Zealand School Children Post-Fortification of Bread with Iodised Salt.
Jones, Emma; McLean, Rachael; Davies, Briar; Hawkins, Rochelle; Meiklejohn, Eva; Ma, Zheng Feei; Skeaff, Sheila
2016-01-01
Iodine deficiency re-emerged in New Zealand in the 1990s, prompting the mandatory fortification of bread with iodised salt from 2009. This study aimed to determine the iodine status of New Zealand children when the fortification of bread was well established. A cross-sectional survey of children aged 8-10 years was conducted in the cities of Auckland and Christchurch, New Zealand, from March to May 2015. Children provided a spot urine sample for the determination of urinary iodine concentration (UIC), a fingerpick blood sample for Thyroglobulin (Tg) concentration, and completed a questionnaire ascertaining socio-demographic information that also included an iodine-specific food frequency questionnaire (FFQ). The FFQ was used to estimate iodine intake from all main food sources including bread and iodised salt. The median UIC for all children (n = 415) was 116 μg/L (females 106 μg/L, males 131 μg/L) indicative of adequate iodine status according to the World Health Organisation (WHO, i.e., median UIC of 100-199 μg/L). The median Tg concentration was 8.7 μg/L, which was <10 μg/L confirming adequate iodine status. There was a significant difference in UIC by sex (p = 0.001) and ethnicity (p = 0.006). The mean iodine intake from the food-only model was 65 μg/day. Bread contributed 51% of total iodine intake in the food-only model, providing a mean iodine intake of 35 μg/day. The mean iodine intake from the food-plus-iodised salt model was 101 μg/day. In conclusion, the results of this study confirm that the iodine status in New Zealand school children is now adequate. PMID:27196925
Kimlin, Michael; Sun, Jiandong; Sinclair, Craig; Heward, Sue; Hill, Jane; Dunstone, Kimberley; Brodie, Alison
2016-01-01
An adequate vitamin D status, as measured by serum 25-hydroxyvitamin D (25(OH)D) concentration, is important in humans for maintenance of healthy bones and muscle function. Serum 25(OH)D concentration was assessed in participants from Melbourne, Australia (37.81S, 144.96E), who were provided with the current Australian guidelines on sun exposure for 25(OH)D adequacy (25(OH)D ≥50 nmol/L). Participants were interviewed in February (summer, n=104) and August (winter, n=99) of 2013. Serum 25(OH)D concentration was examined as a function of measures of sun exposure and sun protection habits with control of key characteristics such as dietary intake of vitamin D, body mass index (BMI) and skin colour, that may modify this relationship. The mean 25(OH)D concentration in participants who complied with the current sun exposure guidelines was 67.3 nmol/L in summer and 41.9 nmol/L in winter. At the end of the study, 69.3% of participants who complied with the summer sun exposure guidelines were 25(OH)D adequate, while only 27.6% of participants who complied with the winter sun exposure guidelines were 25(OH)D adequate at the end of the study. The results suggest that the current Australian guidelines for sun exposure for 25(OH)D adequacy are effective for most in summer and ineffective for most in winter. This article is part of a Special Issue entitled '17th Vitamin D Workshop'.
Wong, Raymond K; Sleep, Joseph R; Visner, Allison J; Raasch, David J; Lanza, Louis A; DeValeria, Patrick A; Torloni, Antonio S; Arabia, Francisco A
2011-03-01
The intrinsic and extrinsic activation pathways of the hemostatic system converge when prothrombin is converted to thrombin. The ability to generate an adequate thrombin burst is the most central aspect of the coagulation cascade. The thrombin-generating potential in patients following cardiopulmonary bypass (CPB) may be indicative of their hemostatic status. In this report, thrombography, a unique technique for directly measuring the potential of patients' blood samples to generate adequate thrombin bursts, is used to characterize the coagulopathic profile in post-CPB patients. Post-CPB hemostasis is typically achieved with protamine reversal of heparin anticoagulation and occasionally supplemented with blood product component transfusions. In this pilot study, platelet poor plasma samples were derived from 11 primary cardiac surgery patients at five time points: prior to CPB, immediately post-protamine, upon arrival to the intensive care unit (ICU), 3 hours post-ICU admission, and 24 hours after ICU arrival. Thrombography revealed that the Endogenous Thrombin Potential (ETP) was not different between [Baseline] and [PostProtamine] but proceeded to deteriorate in the immediate postoperative period. At the [3HourPostICU] time point, the ETP was significantly lower than the [Baseline] values, 1233 +/- 591 versus 595 +/- 379 nM.min (mean +/- SD; n=9, p < .005), despite continued adequacy of hemostasis. ETPs returned to baseline values the day after surgery. Transfusions received, conventional blood coagulation testing results, and blood loss volumes are also presented. Despite adequate hemostasis, thrombography reveals an underlying coagulopathic process that could put some cardiac surgical patients at risk for postoperative bleeding. Thrombography is a novel technique that could be developed into a useful tool for perfusionists and physicians to identify coagulopathies and optimize blood management following CPB. PMID:21449230
Applications of statistical physics to technology price evolution
NASA Astrophysics Data System (ADS)
McNerney, James
Understanding how changing technology affects the prices of goods is a problem with both rich phenomenology and important policy consequences. Using methods from statistical physics, I model technology-driven price evolution. First, I examine a model for the price evolution of individual technologies. The price of a good often follows a power law equation when plotted against its cumulative production. This observation turns out to have significant consequences for technology policy aimed at mitigating climate change, where technologies are needed that achieve low carbon emissions at low cost. However, no theory adequately explains why technology prices follow power laws. To understand this behavior, I simplify an existing model that treats technologies as machines composed of interacting components. I find that the power law exponent of the price trajectory is inversely related to the number of interactions per component. I extend the model to allow for more realistic component interactions and make a testable prediction. Next, I conduct a case-study on the cost evolution of coal-fired electricity. I derive the cost in terms of various physical and economic components. The results suggest that commodities and technologies fall into distinct classes of price models, with commodities following martingales, and technologies following exponentials in time or power laws in cumulative production. I then examine the network of money flows between industries. This work is a precursor to studying the simultaneous evolution of multiple technologies. Economies resemble large machines, with different industries acting as interacting components with specialized functions. To begin studying the structure of these machines, I examine 20 economies with an emphasis on finding common features to serve as targets for statistical physics models. I find they share the same money flow and industry size distributions. I apply methods from statistical physics to show that industries
Statistical Risk Estimation for Communication System Design --- A Preliminary Look
NASA Astrophysics Data System (ADS)
Babuscia, A.; Cheung, K.-M.
2012-02-01
Spacecraft are complex systems that involve different subsystems with multiple relationships among them. For these reasons, the design of a spacecraft is a time-evolving process that starts from requirements and evolves over time across different design phases. During this process, a lot of changes can happen. They can affect mass and power at the component level, at the subsystem level, and even at the system level. Each spacecraft has to respect the overall constraints in terms of mass and power: for this reason, it is important to be sure that the design does not exceed these limitations. Current practice in system models primarily deals with this problem, allocating margins on individual components and on individual subsystems. However, a statistical characterization of the fluctuations in mass and power of the overall system (i.e., the spacecraft) is missing. This lack of adequate statistical characterization would result in a risky spacecraft design that might not fit the mission constraints and requirements, or in a conservative design that might not fully utilize the available resources. Due to the complexity of the problem and to the different expertise and knowledge required to develop a complete risk model for a spacecraft design, this article is focused on risk estimation for a specific spacecraft subsystem: the communication subsystem. The current research aims to be a proof of concept of a risk-based design optimization approach, which can then be further expanded to the design of other subsystems as well as to the whole spacecraft. The objective of this research is to develop a mathematical approach to quantify the likelihood that the major design drivers of mass and power of a space communication system would meet the spacecraft and mission requirements and constraints through the mission design lifecycle. Using this approach, the communication system designers will be able to evaluate and to compare different communication architectures in a risk
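The flavor of such a statistical characterization can be sketched with a small Monte Carlo over component masses. The component names, means, deviations and the mass limit below are hypothetical, and independent normal distributions are a simplifying assumption:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical communication-subsystem components: each mass [kg] is modeled
# as an uncertain quantity (mean, standard deviation).
components = {
    "antenna":     (3.0, 0.3),
    "transponder": (4.5, 0.5),
    "amplifier":   (2.0, 0.4),
    "cabling":     (1.5, 0.3),
}
mass_limit = 12.5  # allocated subsystem mass limit [kg]; assumed

# Monte Carlo estimate of the probability that the total mass exceeds the limit.
n = 100_000
total = sum(rng.normal(mu, sd, n) for mu, sd in components.values())
p_exceed = (total > mass_limit).mean()
```

Comparing `p_exceed` across candidate architectures is one way such a risk estimate could feed a design trade, instead of relying on fixed per-component margins alone.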
Chronic leg ulcer: does a patient always get a correct diagnosis and adequate treatment?
Mooij, Michael C; Huisman, Laurens C
2016-03-01
Patients with chronic leg ulcers have severely impaired quality of life and account for a high percentage of annual healthcare costs. To establish the cause of a chronic leg ulcer, referral to a center with a multidisciplinary team of professionals is often necessary. Treating the underlying cause diminishes healing time and reduces costs. In venous leg ulcers adequate compression therapy is still a problem. It can be improved by training the professionals with pressure measuring devices. A perfect fitting of elastic stockings is important to prevent venous leg ulcer recurrence. In most cases, custom-made stockings are the best choice for this purpose. PMID:26916772
Determining Adequate Margins in Head and Neck Cancers: Practice and Continued Challenges.
Williams, Michelle D
2016-09-01
Margin assessment remains a critical component of oncologic care for head and neck cancer patients. As an integrated team, both surgeons and pathologists work together to assess margins in these complex patients. Differences in method of margin sampling can impact obtainable information and effect outcomes. Additionally, what distance is an "adequate or clear" margin for patient care continues to be debated. Ultimately, future studies and potentially secondary modalities to augment pathologic assessment of margin assessment (i.e., in situ imaging or molecular assessment) may enhance local control in head and neck cancer patients. PMID:27469263
Nebulized antibiotics. An adequate option for treating ventilator-associated respiratory infection?
Rodríguez, A; Barcenilla, F
2015-03-01
Ventilator-associated tracheobronchitis (VAT) is a frequent complication in critical patients. Ninety percent of those who develop it receive broad-spectrum antibiotic (ATB) treatment, without any strong evidence of a favorable impact. The use of nebulized ATB could be a valid treatment option to reduce the use of systemic ATB and the selection pressure on the local flora. Several studies suggest that an adequate nebulization technique can ensure high ATB levels even in areas of lung consolidation and achieve clinical and microbiological cure. New studies are needed to properly assess the impact of treatment with nebulized ATB on the emergence of resistance.
Bidwell, Duane R
2002-01-01
Psychosocial interventions and systematic theology are primary resources for chaplains and congregational pastors who care for victims of physical trauma. Yet these resources may not be adequate to address the spiritual impacts of trauma. This article proposes a preliminary "pneumatraumatology," drawing on early Christian asceticism and Buddhist mysticism to describe one way of understanding the spiritual impacts of traumatic injury. It also suggests possible responses to these impacts informed by narrative/constructionist perspectives and Brueggemann's understanding of the dimensions of spiritual transformation in the Hebrew Bible.
Power Plant Water Intake Assessment.
ERIC Educational Resources Information Center
Zeitoun, Ibrahim H.; And Others
1980-01-01
In order to adequately assess the impact of power plant cooling water intake on an aquatic ecosystem, total ecosystem effects must be considered, rather than merely numbers of impinged or entrained organisms. (Author/RE)
Maintaining Adequate CO2 Washout for an Advanced EMU via a New Rapid Cycle Amine Technology
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Conger, Bruce
2012-01-01
Over the past several years, NASA has realized tremendous progress in Extravehicular Activity (EVA) technology development. This has been evidenced by the progressive development of a new Rapid Cycle Amine (RCA) system for the Advanced Extravehicular Mobility Unit (AEMU) Portable Life Support Subsystem (PLSS). The PLSS is responsible for the life support of the crew member in the spacesuit. The RCA technology is responsible for carbon dioxide (CO2) and humidity control. Another aspect of the RCA is that it is on-back vacuum-regenerable, efficient, and reliable. The RCA also simplifies the PLSS schematic by eliminating the need for a condensing heat exchanger for humidity control in the current EMU. As development progresses on the RCA, it is important that the sizing be optimized so that the demand on the PLSS battery is minimized. As well, maintaining the CO2 washout at adequate levels during an EVA is an absolute requirement of the RCA and associated ventilation system. Testing has been underway in-house at NASA Johnson Space Center and analysis has been initiated to evaluate whether the technology provides exemplary performance in ensuring that the CO2 is removed sufficiently and the ventilation flow is adequate for maintaining CO2 washout in the AEMU spacesuit helmet of the crew member during an EVA. This paper will review the recent developments of the RCA unit, testing planned in-house with a spacesuit simulator, and the associated analytical work along with insights from the medical aspect on the testing.
Developments in Statistical Education.
ERIC Educational Resources Information Center
Kapadia, Ramesh
1980-01-01
The current status of statistics education at the secondary level is reviewed, with particular attention focused on the various instructional programs in England. A description and preliminary evaluation of the Schools Council Project on Statistical Education is included. (MP)
Mathematical and statistical analysis
NASA Technical Reports Server (NTRS)
Houston, A. Glen
1988-01-01
The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.
Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S
2013-12-01
Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in the discovery and verification stages prior to the clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for the statistical design of proteomics biomarker discovery and verification research.
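As an illustration of the kind of calculation such a statistical framework formalizes, the classical normal-approximation formula gives a per-group sample size for comparing two means at a standardized effect size d. This textbook formula is a sketch only; the workshop's approach sets alpha, power, and the relevance threshold from quantitative clinical judgments rather than fixed defaults.

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided, two-sample
    comparison of means at standardized effect size d, using the normal
    approximation: n = 2 * (z_{1-alpha/2} + z_{power})^2 / d^2.
    Illustrative textbook formula, not the workshop's specific method."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_b = NormalDist().inv_cdf(power)          # quantile for target power
    return math.ceil(2 * (z_a + z_b) ** 2 / effect_size ** 2)
```

For a medium effect (d = 0.5) at 5% significance and 80% power this gives roughly 63 specimens per group, which makes concrete why small proteomics cohorts are underpowered.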
ERIC Educational Resources Information Center
Hall, Rogers; Horn, Ilana Seidel
2012-01-01
In this article we ask how concepts that organize work in two professional disciplines change during moments of consultation, which represent concerted efforts by participants to work differently now and in the future. Our analysis compares structures of talk, the adequacy of representations of practice, and epistemic and moral stances deployed…
ERIC Educational Resources Information Center
Bopp, Richard E.; Van Der Laan, Sharon J.
1985-01-01
Presents a search strategy for locating time-series or cross-sectional statistical data in published sources which was designed for undergraduate students who require 30 units of data for five separate variables in a statistical model. Instructional context and the broader applicability of the search strategy for general statistical research is…
ERIC Educational Resources Information Center
Strasser, Nora
2007-01-01
Avoiding statistical mistakes is important for educators at all levels. Basic concepts will help you to avoid making mistakes using statistics and to look at data with a critical eye. Statistical data is used at educational institutions for many purposes. It can be used to support budget requests, changes in educational philosophy, changes to…
ERIC Educational Resources Information Center
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Statistical quality management
NASA Astrophysics Data System (ADS)
Vanderlaan, Paul
1992-10-01
Some aspects of statistical quality management are discussed. Quality has to be defined as a concrete, measurable quantity. The concepts of Total Quality Management (TQM), Statistical Process Control (SPC), and inspection are explained. In most cases SPC is better than inspection. It can be concluded that statistics offers great possibilities in the field of TQM.
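As a concrete example of the SPC toolkit mentioned above, Shewhart-style control limits place a center line at the process mean with limits three standard deviations away. A minimal sketch; production charts usually estimate sigma from subgroup or moving ranges rather than the raw sample standard deviation used here.

```python
import statistics

def control_limits(samples):
    """Shewhart-style 3-sigma limits for a simple control chart:
    center line at the sample mean, control limits at mean +/- 3
    sample standard deviations. Illustrative SPC sketch only."""
    center = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return center - 3 * sigma, center, center + 3 * sigma
```

Points falling outside the returned limits signal that the process may be out of statistical control and worth investigating, which is the sense in which SPC prevents defects rather than merely detecting them by inspection.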
Adequate bases of phase space master integrals for gg → h at NNLO and beyond
NASA Astrophysics Data System (ADS)
Höschele, Maik; Hoff, Jens; Ueda, Takahiro
2014-09-01
We study master integrals needed to compute the Higgs boson production cross section via gluon fusion in the infinite top quark mass limit, using a canonical form of differential equations for master integrals, recently identified by Henn, which makes their solution possible in a straightforward algebraic way. We apply the known criteria to derive such a suitable basis for all the phase space master integrals in the aforementioned process at next-to-next-to-leading order in QCD, and demonstrate that the method is applicable at next-to-next-to-next-to-leading order as well by solving a non-planar topology. Furthermore, we discuss in great detail how to find an adequate basis using practical examples. Special emphasis is devoted to master integrals that are coupled by their differential equations.
Tsuboyama-Kasaoka, Nobuyo; Takizawa, Asuka; Tsubota-Utsugi, Megumi; Nakade, Makiko; Imai, Eri; Kondo, Akiko; Yoshida, Kazue; Okuda, Nagako; Nishi, Nobuo; Takimoto, Hidemi
2013-01-01
The Adequate Intake (AI) values in the Dietary Reference Intakes for Japanese (DRIs-J) 2010 were mainly determined based on the median intakes from 2 y of pooled data (2005-2006) from the National Health and Nutrition Survey-Japan (NHNS-J). However, it remains unclear whether 2 y of pooled data from the NHNS-J are appropriate for evaluating the intake of the population. To clarify the differences in nutrient intakes determined from 2 and 7 y of pooled data, we analyzed selected nutrient intake levels by sex and age groups using NHNS-J data. Intake data were obtained from 64,624 individuals (age: ≥1 y; 47.4% men) who completed a semi-weighed 1-d household dietary record that was part of the NHNS-J conducted annually in Japan from 2003 to 2009. There were no large differences between the median intakes calculated from 2 or 7 y of pooled data for n-6 or n-3 polyunsaturated fatty acids (PUFAs), vitamin D, pantothenic acid, potassium, or phosphorus. When the AI values and median intakes were compared, there was no large difference in the values for n-6 or n-3 PUFAs, pantothenic acid, or phosphorus. Conversely, the AI values for vitamin D and potassium differed from the median intakes of these nutrients for specific sex and age groups, because values were not based on NHNS-J data. Our results indicate that 2 y of pooled data from the NHNS-J adequately reflect the population's intake, and that the current system for determination of AI values will be applicable for future revisions.
Hahm, Hyeouk Chris; Cook, Benjamin; Ault-Brutus, Andrea; Alegria, Margarita
2015-01-01
Objectives This study was conducted to understand the interaction of race/ethnicity and gender in depression screening, any mental health care, and adequate care. Methods 2010-2012 electronic health record data for adult primary care patients from a New England urban health care system were used (n = 65,079). Multivariate logit regression models were used to assess the associations of race/ethnicity, gender, and other covariates with depression screening, any depression care among those who screened positive, and adequate depression care among users. Second, disparities were evaluated by race/ethnicity and gender, incorporating differences due to insurance, marital status, and area-level SES measures. Findings Black and Asian males and females were less likely to be screened for depression than their white counterparts, while Latino males and females were more likely to be screened. Among those who screened positive (PHQ-9 > 10), black males and females, Latino males, and Asian males and females were less likely to receive any mental health care than their white counterparts. The black-white disparity in screening was greater for females than for males. The Latino-white disparity in any mental health care and adequacy of care was greater for males than for females. Conclusions Our approach underscores the importance of identifying disparities at each step of depression care by both race/ethnicity and gender. Targeting certain groups at specific stages of care (i.e., screening of black females; any mental health care and adequacy of care for Latino males) would be more effective than a blanket approach to disparities reduction. PMID:25727113
Kahalley, Lisa S.; Wilson, Stephanie J.; Tyc, Vida L.; Conklin, Heather M.; Hudson, Melissa M.; Wu, Shengjie; Xiong, Xiaoping; Stancel, Heather H.; Hinds, Pamela S.
2012-01-01
Objectives To describe the psychological needs of adolescent survivors of acute lymphoblastic leukemia (ALL) or brain tumor (BT), we examined: (a) the occurrence of cognitive, behavioral, and emotional concerns identified during a comprehensive psychological evaluation, and (b) the frequency of referrals for psychological follow-up services to address identified concerns. Methods Psychological concerns were identified on measures according to predetermined criteria for 100 adolescent survivors. Referrals for psychological follow-up services were made for concerns previously unidentified in formal assessment or not adequately addressed by current services. Results Most survivors (82%) exhibited at least one concern across domains: behavioral (76%), cognitive (47%), and emotional (19%). Behavioral concerns emerged most often on scales associated with executive dysfunction, inattention, learning, and peer difficulties. CRT was associated with cognitive concerns, χ2(1,N=100)=5.63, p<0.05. Lower income was associated with more cognitive concerns for ALL survivors, t(47)=3.28, p<0.01, and more behavioral concerns for BT survivors, t(48)=2.93, p<0.01. Of survivors with concerns, 38% were referred for psychological follow-up services. Lower-income ALL survivors received more referrals for follow-up, χ2(1,N=41)=8.05, p<0.01. Referred survivors had more concerns across domains than non-referred survivors, ALL: t(39)=2.96, p<0.01, BT: t(39)=3.52, p<0.01. Trends suggest ALL survivors may be at risk for experiencing unaddressed cognitive needs. Conclusions Many adolescent survivors of cancer experience psychological difficulties that are not adequately managed by current services, underscoring the need for long-term surveillance. In addition to prescribing regular psychological evaluations, clinicians should closely monitor whether current support services appropriately meet survivors’ needs, particularly for lower-income survivors and those treated with CRT. PMID:22278930
Tetens, Inge; Dejgård Jensen, Jørgen; Smed, Sinne; Gabrijelčič Blenkuš, Mojca; Rayner, Mike; Darmon, Nicole; Robertson, Aileen
2016-01-01
Background Food-Based Dietary Guidelines (FBDGs) are developed to promote healthier eating patterns, but increasing food prices may make healthy eating less affordable. The aim of this study was to design a range of cost-minimized, nutritionally adequate, health-promoting food baskets (FBs) that help prevent both micronutrient inadequacy and diet-related non-communicable diseases at the lowest cost. Methods Average prices for 312 foods were collected within the Greater Copenhagen area. The cost and nutrient content of five different cost-minimized FBs for a family of four were calculated per day using linear programming. The FBs were defined using five different constraints: cultural acceptability (CA), or dietary guidelines (DG), or nutrient recommendations (N), or cultural acceptability and nutrient recommendations (CAN), or dietary guidelines and nutrient recommendations (DGN). The variety and number of foods in each of the resulting five baskets were increased by limiting the relative share of individual foods. Results The one-day version of N contained only 12 foods at the minimum cost of DKK 27 (€ 3.6). The CA, DG, and DGN cost about twice this, and the CAN cost ~DKK 81 (€ 10.8). The baskets with the greater variety of foods contained from 70 (CAN) to 134 (DGN) foods and cost between DKK 60 (€ 8.1, N) and DKK 125 (€ 16.8, DGN). Ensuring that the food baskets cover both dietary guidelines and nutrient recommendations doubled the cost, while cultural acceptability (CAN) tripled it. Conclusion Use of linear programming facilitates the generation of low-cost food baskets that are nutritionally adequate, health promoting, and culturally acceptable. PMID:27760131
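The cost minimization described above is a classic diet problem: minimize total price subject to nutrient constraints. A toy sketch of that linear program, assuming SciPy's `linprog`; the food prices and nutrient contents below are invented placeholders, not the Copenhagen survey data.

```python
import numpy as np
from scipy.optimize import linprog

def cheapest_basket(prices, nutrients, requirements):
    """Diet-problem sketch: choose food amounts x >= 0 minimizing
    prices @ x subject to nutrients.T @ x >= requirements.
    `nutrients` has one row per food, one column per nutrient.
    Toy illustration of the linear programming approach, not the
    study's actual constraint sets (CA, DG, N, CAN, DGN)."""
    res = linprog(
        c=prices,
        A_ub=-np.asarray(nutrients, dtype=float).T,  # minimums in <= form
        b_ub=-np.asarray(requirements, dtype=float),
        bounds=[(0, None)] * len(prices),
        method="highs",
    )
    return res.x, res.fun
```

With two foods and one nutrient, the solver picks whichever food meets the requirement at lower total cost, which is the same mechanism the study scales up to 312 foods and full nutrient recommendations.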
Maintaining Adequate CO2 Washout for an Advanced EMU via a New Rapid Cycle Amine Technology
NASA Technical Reports Server (NTRS)
Chullen, Cinda
2011-01-01
Over the past several years, NASA has realized tremendous progress in Extravehicular Activity (EVA) technology development. This has been evidenced by the progressive development of a new Rapid Cycle Amine (RCA) system for the Advanced Extravehicular Mobility Unit (AEMU) Portable Life Support Subsystem (PLSS). The PLSS is responsible for the life support of the crew member in the spacesuit. The RCA technology is responsible for carbon dioxide (CO2) and humidity control. Another aspect of the RCA is that it is on-back vacuum-regenerable, efficient, and reliable. The RCA also simplifies the PLSS schematic by eliminating the need for a condensing heat exchanger for humidity control in the current EMU. As development progresses on the RCA, it is important that the sizing be optimized so that the demand on the PLSS battery is minimized. As well, maintaining the CO2 washout at adequate levels during an EVA is an absolute requirement of the RCA and associated ventilation system. Testing has been underway in-house at NASA Johnson Space Center and analysis has been initiated to evaluate whether the technology provides exemplary performance in ensuring that the CO2 is removed sufficiently and the ventilation flow is adequate to maintain CO2 washout in the AEMU spacesuit helmet of the crew member during an EVA. This paper will review the recent developments of the RCA unit, the results of testing performed in-house with a spacesuit simulator, and the associated analytical work, along with insights from the medical aspect of the testing.
Code of Federal Regulations, 2013 CFR
2013-07-01
... my system's watershed control requirements are adequate? 141.522 Section 141.522 Protection of... Additional Watershed Control Requirements for Unfiltered Systems § 141.522 How does the State determine whether my system's watershed control requirements are adequate? During an onsite inspection...
Code of Federal Regulations, 2014 CFR
2014-07-01
... my system's watershed control requirements are adequate? 141.522 Section 141.522 Protection of... Additional Watershed Control Requirements for Unfiltered Systems § 141.522 How does the State determine whether my system's watershed control requirements are adequate? During an onsite inspection...
Code of Federal Regulations, 2012 CFR
2012-07-01
... my system's watershed control requirements are adequate? 141.522 Section 141.522 Protection of... Additional Watershed Control Requirements for Unfiltered Systems § 141.522 How does the State determine whether my system's watershed control requirements are adequate? During an onsite inspection...
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 7 2014-04-01 2014-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 7 2012-04-01 2012-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 7 2013-04-01 2013-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Statistical error in particle simulations of low Mach number flows
Hadjiconstantinou, N G; Garcia, A L
2000-11-13
We present predictions for the statistical error due to finite sampling in the presence of thermal fluctuations in molecular simulation algorithms. The expressions are derived using equilibrium statistical mechanics. The results show that the number of samples needed to adequately resolve the flowfield scales as the inverse square of the Mach number. Agreement of the theory with direct Monte Carlo simulations shows that the use of equilibrium theory is justified.
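The inverse-square scaling above lends itself to a one-line estimate: if the relative error in the sampled mean velocity behaves as 1/(Ma*sqrt(gamma*M)), then the number of independent samples M needed for a target error grows as Ma^-2. A rough sketch; the gamma prefactor and exact functional form are illustrative assumptions, not values quoted from the abstract.

```python
def samples_needed(mach, rel_error, gamma=5.0 / 3.0):
    """Estimate the number of independent samples M needed to resolve
    the mean flow velocity to relative error E, assuming the scaling
    E ~ 1 / (Ma * sqrt(gamma * M)), i.e. M ~ 1 / (gamma * Ma^2 * E^2).
    The prefactor is an assumption for illustration only."""
    return 1.0 / (gamma * mach**2 * rel_error**2)
```

Halving the Mach number quadruples the required samples, which is why low-speed (near-incompressible) flows are so expensive to resolve with particle methods.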
Multivariate statistical analysis of environmental monitoring data
Ross, D.L.
1997-11-01
EPA requires statistical procedures to determine whether soil or ground water adjacent to or below waste units is contaminated. These statistical procedures are often based on comparisons between two sets of data: one representing background conditions, and one representing site conditions. Since the statistical requirements were originally promulgated in the 1980s, EPA has made several improvements and modifications. There are, however, problems which remain. One problem is that the regulations do not require a minimum probability that contaminated sites will be correctly identified. Another problem is that the effect of testing several correlated constituents on the probable outcome of the statistical tests has not been quantified. Results from computer simulations to determine power functions for realistic monitoring situations are presented here. Power functions for two statistical procedures, the Student's t-test and the multivariate Hotelling's T² test, are compared. The comparisons indicate that the multivariate test is often more powerful when the tests are applied with significance levels chosen to control the probability of falsely identifying clean sites as contaminated. This program could also be used to verify that statistical procedures achieve some minimum power standard at a regulated waste unit.
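Power functions of the kind described can be estimated by direct Monte Carlo simulation. A minimal univariate sketch, assuming normal data with known variance and a one-sided test at the 5% level; the multivariate Hotelling's T² analogue follows the same pattern with a vector mean shift. The critical value and sample sizes here are illustrative, not those of the study.

```python
import numpy as np

def simulated_power(delta, n=20, alpha_crit=1.645, trials=2000, seed=0):
    """Monte Carlo power of a two-sample z-style test for a mean shift
    `delta` between 'background' and 'site' samples of size n each.
    Illustrative sketch: standard-normal data, known-variance critical
    value (1.645 = one-sided 5% level)."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(trials):
        background = rng.normal(0.0, 1.0, n)
        site = rng.normal(delta, 1.0, n)
        z = (site.mean() - background.mean()) / np.sqrt(2.0 / n)
        rejections += z > alpha_crit
    return rejections / trials
```

Evaluating this over a grid of `delta` values traces out the power function; the rejection rate at delta = 0 recovers the false-positive rate, the quantity the study's significance levels are chosen to control.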
Thermal hydraulic limits analysis using statistical propagation of parametric uncertainties
Chiang, K. Y.; Hu, L. W.; Forget, B.
2012-07-01
The MIT Research Reactor (MITR) is evaluating the conversion from highly enriched uranium (HEU) to low-enrichment uranium (LEU) fuel. In addition to the fuel element re-design, a reactor power upgrade from 6 MW to 7 MW is proposed in order to maintain the same reactor performance as the HEU core. The previous approach to analyzing the impact of engineering uncertainties on thermal hydraulic limits, via engineering hot channel factors (EHCFs), was unable to explicitly quantify the uncertainty and confidence level in reactor parameters. The objective of this study is to develop a methodology for MITR thermal hydraulic limits analysis by statistically combining engineering uncertainties, with the aim of eliminating unnecessary conservatism inherent in traditional analyses. This method was employed to analyze the Limiting Safety System Settings (LSSS) for the MITR, defined as avoidance of the onset of nucleate boiling (ONB). Key parameters, such as coolant channel tolerances and heat transfer coefficients, were treated as normal distributions using Oracle Crystal Ball to calculate ONB. The LSSS power is determined with a 99.7% confidence level. The LSSS power calculated using this new methodology is 9.1 MW, based on a core outlet coolant temperature of 60°C and a primary coolant flow rate of 1800 gpm, compared to 8.3 MW obtained from the analytical method using the EHCFs under the same operating conditions. The same methodology was also used to calculate the safety limit (SL) for the MITR, conservatively determined using onset of flow instability (OFI) as the criterion, to verify that adequate safety margin exists between the LSSS and SL. The calculated SL is 10.6 MW, which is 1.5 MW higher than the LSSS. (authors)
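The statistical-propagation idea can be sketched without a commercial tool: draw each uncertain parameter from its normal distribution, evaluate the limiting power for every draw, and take the percentile matching the desired confidence level. The toy model and distribution parameters below are invented for illustration and are not the MITR values.

```python
import numpy as np

def limit_power_mc(nominal_mw=9.5, trials=50_000, seed=1):
    """Sketch of statistical propagation of parametric uncertainties.
    Hypothetical model: allowable power before ONB scales with a heat
    transfer coefficient and a channel-gap factor, each sampled as a
    normal distribution (means/sigmas invented for illustration).
    Returns the power at 99.7% confidence (the 0.3rd percentile)."""
    rng = np.random.default_rng(seed)
    htc = rng.normal(1.0, 0.05, trials)  # relative heat transfer coeff.
    gap = rng.normal(1.0, 0.03, trials)  # relative channel gap tolerance
    allowable = nominal_mw * htc * np.sqrt(gap)
    return float(np.percentile(allowable, 0.3))
```

Unlike stacking worst-case hot channel factors, which implicitly assumes every uncertainty is simultaneously at its bound, the percentile of the sampled distribution carries an explicit confidence statement, which is the conservatism reduction the study is after.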
Adopting adequate leaching requirement for practical response models of basil to salinity
NASA Astrophysics Data System (ADS)
Babazadeh, Hossein; Tabrizi, Mahdi Sarai; Darvishi, Hossein Hassanpour
2016-07-01
Several mathematical models are used for assessing plant response to salinity of the root zone. The objectives of this study were to quantify the yield salinity threshold of basil plants with respect to irrigation water salinity, and to investigate the possibility of using irrigation water salinity instead of saturated-extract salinity in the available mathematical models for estimating yield. To achieve these objectives, an extensive greenhouse experiment was conducted with 13 irrigation water salinity levels, namely 1.175 dS m-1 (control treatment) and 1.8 to 10 dS m-1. The results indicated that, among these models, the modified discount model (a well-known, statistically based root water uptake model) produced more accurate results in simulating the basil yield reduction function from irrigation water salinities. Overall, the statistical model of Steppuhn et al. applied to the modified discount model and the math-empirical model of van Genuchten and Hoffman provided the best results. In general, all of the statistical models produced very similar results, and their results were better than those of the math-empirical models. It was also concluded that, if enough leaching was present, there was no significant difference between models based on soil saturated-extract salinity and models using irrigation water salinity.
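A common functional form underlying such yield-response models is the threshold-slope relation of Maas and Hoffman: relative yield stays at 1 up to a salinity threshold, then declines linearly. A sketch with illustrative parameter values, not the fitted basil coefficients from this study.

```python
def relative_yield(ec, threshold=1.8, slope=0.08):
    """Threshold-slope (Maas-Hoffman type) salinity response:
    relative yield Yr = 1 for EC <= threshold, otherwise
    Yr = 1 - slope * (EC - threshold), floored at 0.
    EC is salinity in dS/m; threshold and slope are illustrative."""
    yr = 1.0 if ec <= threshold else 1.0 - slope * (ec - threshold)
    return max(0.0, yr)
```

Fitting `threshold` and `slope` to observed yields against irrigation water salinity (rather than saturated-extract salinity) is, in essence, the substitution the study evaluates.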
Statistical phenomena in particle beams
Bisognano, J.J.
1984-09-01
Particle beams are subject to a variety of apparently distinct statistical phenomena such as intrabeam scattering, stochastic cooling, electron cooling, coherent instabilities, and radiofrequency noise diffusion. In fact, both the physics and mathematical description of these mechanisms are quite similar, with the notion of correlation as a powerful unifying principle. In this presentation we will attempt to provide both a physical and a mathematical basis for understanding the wide range of statistical phenomena that have been discussed. In the course of this study the tools of the trade will be introduced, e.g., the Vlasov and Fokker-Planck equations, noise theory, correlation functions, and beam transfer functions. Although a major concern will be to provide equations for analyzing machine design, the primary goal is to introduce a basic set of physical concepts having a very broad range of applicability.
Statistical modeling of laser welding of DP/TRIP steel sheets
NASA Astrophysics Data System (ADS)
Reisgen, U.; Schleser, M.; Mokrov, O.; Ahmed, E.
2012-02-01
In this research work, a statistical analysis of the CO2 laser beam welding of dual-phase (DP600)/transformation-induced plasticity (TRIP700) steel sheets was performed using response surface methodology. The analysis considered the effect of laser power (2-2.2 kW), welding speed (40-50 mm/s), and focus position (-1 to 0 mm) on the heat input, the weld bead geometry, uniaxial tensile strength, formability (limiting dome height), and welding operation cost. The experimental design was based on a Box-Behnken design, using linear and quadratic polynomial equations for the predictive mathematical models. The results indicate that the proposed models predict the responses adequately within the limits of the welding parameters used, and that welding speed is the most significant parameter in the welding process.
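The quadratic polynomial models used in response surface methodology can be fitted by ordinary least squares. A minimal two-factor sketch; a real Box-Behnken analysis would additionally use coded factor levels, three factors, replication, and ANOVA-based adequacy checks.

```python
import numpy as np

def fit_quadratic_rsm(X, y):
    """Fit a two-factor quadratic response-surface model
    y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
    by ordinary least squares. Illustrative sketch of the model form
    used in RSM, not the study's actual fitted welding models."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef
```

Once fitted, the significance of each term (e.g., the welding-speed coefficients) can be judged from its standard error, which is how such studies identify the dominant process parameter.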
Tannery, Nancy Hrinya; Silverman, Deborah L; Epstein, Barbara A
2002-01-01
Online use statistics can provide libraries with a tool to be used when developing an online collection of resources. Statistics can provide information on overall use of a collection, individual print and electronic journal use, and collection use by specific user populations. They can also be used to determine the number of user licenses to purchase. This paper focuses on the issue of use statistics made available for one collection of online resources.
Statistical distribution sampling
NASA Technical Reports Server (NTRS)
Johnson, E. S.
1975-01-01
The determination of the distributions of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
Koko, Abdelmoniem K; Onuora, Vincent C; Al Turki, Mohammed A; Mesbed, Ahmed H; Al Jawini, Nasser A
2003-01-01
Between 1990 and 1999, a total of 186 patients with staghorn renal stones were treated in our unit. Of them, 76 patients were managed by extracorporeal shockwave lithotripsy (ESWL) alone, using a third-generation Siemens Lithostar Plus lithotriptor. Sixty-one of these patients, who completed a follow-up of 41 months, formed the subjects of this study. ESWL was performed after routine stenting of the affected side in all cases except one. The mean number of ESWL sessions was 5.2, delivering an average of 15,940 shocks per patient. The average hospital stay was 21.68 days, and the duration of treatment was 1-41 months (mean 6.75 months). Significant complications occurred in 35 patients (57.4%), eight of whom sustained multiple significant complications. A total of 162 auxiliary procedures were used in conjunction with ESWL and in the management of complications. The stone-free rate at three months was 18%, but rose to 63.9% by the end of the treatment period (41 months). Our study indicates that ESWL monotherapy is associated with high morbidity rates, high rates of unplanned invasive procedures, and prolonged treatment periods and hospitalization. Thus, ESWL monotherapy is not adequate for the management of staghorn calculi.
Improved ASTM G72 Test Method for Ensuring Adequate Fuel-to-Oxidizer Ratios
NASA Technical Reports Server (NTRS)
Juarez, Alfredo; Harper, Susana Tapia
2016-01-01
The ASTM G72/G72M-15 Standard Test Method for Autogenous Ignition Temperature of Liquids and Solids in a High-Pressure Oxygen-Enriched Environment is currently used to evaluate materials for the ignition susceptibility driven by exposure to external heat in an enriched oxygen environment. Testing performed on highly volatile liquids such as cleaning solvents has proven problematic due to inconsistent test results (non-ignitions). Non-ignition results can be misinterpreted as favorable oxygen compatibility, although they are more likely associated with inadequate fuel-to-oxidizer ratios. Forced evaporation during purging and inadequate sample size were identified as two potential causes for inadequate available sample material during testing. In an effort to maintain adequate fuel-to-oxidizer ratios within the reaction vessel during test, several parameters were considered, including sample size, pretest sample chilling, pretest purging, and test pressure. Tests on a variety of solvents exhibiting a range of volatilities are presented in this paper. A proposed improvement to the standard test protocol as a result of this evaluation is also presented. Execution of the final proposed improved test protocol outlines an incremental step method of determining optimal conditions using increased sample sizes while considering test system safety limits. The proposed improved test method increases confidence in results obtained by utilizing the ASTM G72 autogenous ignition temperature test method and can aid in the oxygen compatibility assessment of highly volatile liquids and other conditions that may lead to false non-ignition results.
Determination of the need for selenium by chicks fed practical diets adequate in vitamin E
Combs, G.F. Jr.; Su, Q.; Liu, C.H.; Sinisalo, M.; Combs, S.B.
1986-03-01
Experiments were conducted to compare the dietary needs for selenium (Se) by chicks fed either purified (amino acid-based) or practical (corn- and soy-based) diets that were adequate with respect to vitamin E (i.e., contained 100 IU/kg) and all other known nutrients with the single exception of Se (i.e., contained only 0.10 ppm Se). Studies were conducted in Ithaca using Single Comb White Leghorn chicks fed the purified basal diet and in Beijing using chicks of the same breed fed either the same purified basal diet or the practical diet formulated to be similar to that used in poultry production in some parts of China and the US. Results showed that each basal diet produced severe depletion of Se-dependent glutathione peroxidase (SeGSHpx) in plasma, liver and pancreas according to the same time-course, but that other consequences of severe uncomplicated Se deficiency were much more severe among chicks fed the purified diet (e.g., growth depression, pancreatic dysfunction as indicated by elevated plasma amylase and abnormal pancreatic histology). Chicks fed the practical Se-deficient diet showed reduced pancreas levels of copper, zinc and molybdenum and elevated plasma levels of iron; they required ca. 0.10 ppm dietary Se to sustain normal SeGSHpx in several tissues and to prevent elevated amylase in plasma. The dietary Se requirement of the chick is, therefore, estimated to be 0.10 ppm.
Bhattarai, M D
2012-09-01
On the one hand, health coverage of the rural population is clearly inadequate; on the other, densely populated urban areas face the triple burden of increasing non-communicable and communicable health problems and rising health costs. Postgraduate medical training is closely interrelated with adequate health service delivery and health economics. In relation to the prevailing situation, modern medical education trends point to five vital issues: i) opportunity needs to be given to all MBBS graduates for General Specialist and Sub-Specialist Training inside the country to complete their medical education; ii) urgent need for review of PG residential training criteria, including appropriate bed and teacher criteria as well as entry and eligibility criteria; iii) involvement of all available hospital units fulfilling the requirements of the residential PG training criteria; iv) PG residential training involves doing the required work in the hospitals, entitling trainees to full pay and continuation of service without any training or tuition fee; and v) planning of the proportions of General Specialty and Sub-Specialty Training fields, particularly General Practice (GP), including its career path and female participation. With the increased number of medical graduates, it now seems possible to plan for optimal health coverage of the population through appropriate postgraduate medical training. Medical professionals and public health workers must make the Government aware of this vital responsibility and the holistic approach required.
Twenty-Four-Hour Urine Osmolality as a Physiological Index of Adequate Water Intake
Perrier, Erica T.; Buendia-Jimenez, Inmaculada; Vecchio, Mariacristina; Armstrong, Lawrence E.; Tack, Ivan; Klein, Alexis
2015-01-01
While associations exist between water, hydration, and disease risk, research quantifying the dose-response effect of water on health is limited. Thus, the water intake necessary to maintain optimal hydration from a physiological and health standpoint remains unclear. The aim of this analysis was to derive a 24 h urine osmolality (UOsm) threshold that would provide an index of “optimal hydration,” sufficient to compensate water losses and also be biologically significant relative to the risk of disease. Ninety-five adults (31.5 ± 4.3 years, 23.2 ± 2.7 kg·m−2) collected 24 h urine, provided morning blood samples, and completed food and fluid intake diaries over 3 consecutive weekdays. A UOsm threshold was derived using 3 approaches, taking into account European dietary reference values for water; total fluid intake, and urine volumes associated with reduced risk for lithiasis and chronic kidney disease and plasma vasopressin concentration. The aggregate of these approaches suggest that a 24 h urine osmolality ≤500 mOsm·kg−1 may be a simple indicator of optimal hydration, representing a total daily fluid intake adequate to compensate for daily losses, ensure urinary output sufficient to reduce the risk of urolithiasis and renal function decline, and avoid elevated plasma vasopressin concentrations mediating the increased antidiuretic effort. PMID:25866433
Mensah, Patience; Tomkins, Andrew
2003-03-01
Plant-based complementary foods are the main source of nutrients for many young children in developing countries. They may, however, present problems in providing nutritionally adequate and safe diets for older infants and young children. The high starch content leads to low-nutrient diets that are bulky and dense, with high levels of antinutritive factors such as phytates, tannins, lectins, and enzyme inhibitors. Phytates impair mineral bioavailability, lectins interfere with intestinal structure, and enzyme inhibitors inhibit digestive enzymes. In addition, there is often microbial contamination, which leads to diarrhea, growth-faltering, and impaired development, and the presence of chemical contaminants may lead to neurological disease and goiter. The fact that some fruits containing carotenoids are only available seasonally contributes to the vulnerability of children receiving predominantly plant-based diets. Traditional household food technologies have been used for centuries to improve the quality and safety of complementary foods. These include dehulling, peeling, soaking, germination, fermentation, and drying. While modern communities tend to reject these technologies in favor of more convenient fast-food preparations, there is now a resurgence of interest in older technologies as a possible means of improving the quality and safety of complementary foods when the basic diet cannot be changed for economic reasons. This paper describes the biology, safety, practicability, and acceptability of these traditional processes at the household or community level, as well as the gaps in research, so that more effective policies and programs can be implemented to improve the quality and safety of complementary foods.
Burnier, Michel; Wuerzner, Gregoire; Bochud, Murielle
2015-01-01
Among the various strategies to reduce the incidence of non-communicable diseases, reduction of sodium intake in the general population has been recognized as one of the most cost-effective means because of its potential impact on the development of hypertension and cardiovascular diseases. Yet this strategic health recommendation of the WHO and many other international organizations is far from universally accepted. Indeed, several unresolved scientific and epidemiological questions maintain an ongoing debate: what low level of sodium intake is adequate to recommend to the general population, and whether national strategies should target the overall population or only higher-risk fractions such as salt-sensitive patients. In this paper, we review recent results in the literature regarding salt, blood pressure, and cardiovascular risk, and we present the recommendations recently proposed by a group of Swiss experts. The participating medical societies propose that national health authorities continue their discussion with the food industry in order to reduce the sodium content of food products, with a target mean salt intake of 5–6 grams per day in the population. Moreover, all initiatives to increase information on the effect of salt on health and on the salt content of food are supported. PMID:26321959
Gasparetto, Emerson Leandro; Alves-Leon, Soniza; Domingues, Flavio Sampaio; Frossard, João Thiago; Lopes, Selva Paraguassu; Souza, Jorge Marcondes de
2016-06-01
Neurocysticercosis (NCC) is an endemic disease and an important public health problem in some areas of the world, and epilepsy is its most common neurological manifestation. Multiple intracranial lesions, commonly calcified, are seen on cranial computed tomography (CT) in the chronic phase of the disease and are considered one of the diagnostic criteria. Magnetic resonance imaging (MRI) is the test that best depicts the different stages of the intracranial cysts but does not clearly show calcified lesions. Cerebral cavernous malformations (CCM), also known as cerebral cavernomas, are frequent vascular malformations of the brain, better demonstrated by MRI, and also have epilepsy as their main form of clinical presentation. In the familial form, cerebral cavernomas typically present with multiple lesions throughout the brain and, very often, with foci of calcification on CT imaging. In countries and geographic areas where NCC is an established endemic health problem and neuroimaging screening is done by CT scan, it is important to consider the differential diagnosis between the two diseases because their adequate management differs. PMID:27332076
[Level of awareness and the adequate application of sunscreen by beauticians].
Cortez, Diógenes Aparício Garcia; Machado, Érica Simionato; Vermelho, Sonia Cristina Soares Dias; Teixeira, Jorge Juarez Vieira; Cortez, Lucia Elaine Ranieri
2016-06-01
The scope of this research was to establish the level of awareness of beauticians regarding the importance of the application of sunscreen and to identify whether their patients had been properly instructed by these professionals. It involved a descriptive and exploratory study with interviews applying qualitative methodology among 30 beauticians. Data were gathered using the semi-structured interview technique in Maringá, in the southern state of Paraná. The data were analyzed using Atlas.ti software after quantitative analysis and response classification. Of those interviewed, 83.33% had a degree in Aesthetics, 20% attended ongoing training activities on sunscreen, and 73.17% acquired sunscreen for its quality, though 86.67% were not familiar with sunscreens with natural anti-free-radical components. Of those interviewed, 80% had never treated patients with skin cancer, though they reported having knowledge of care in relation to sun exposure, how to use sunscreen, and the relationship of these practices with the disease. The results showed that the recommendation and use of sunscreen by beauticians and users have been conducted in an adequate and conscientious manner. PMID:27383359
Idvall, J; Aronsen, K F; Lindström, K; Ulmsten, U
1977-09-30
Various catheter-manometer systems suitable for intravascular blood pressure measurements in rats were elaborated and tested in vitro and in vivo. Using a pressure-step calibrator, in vitro studies showed that microtransducers had superior frequency response compared to conventional transducers. Of the catheters tested, PE-90 tapered to a 40 mm tip with an inner diameter of 0.3 mm had the best frequency response as judged from fall and settling times. Because of the damping effect, tapering increased fall time to 1.8 ms, which was still quite acceptable. By the same token, settling time was minimized to 22.4 ms. Using a special calculation method, the theoretical percentage error of the recordings was estimated at 9.66%. When the measurement error was calculated from the actual in vivo recordings, it was found to be no more than 2.7%. These results show that the technique described is adequate for continuous intravascular blood pressure recordings in small animals. Finally, it is emphasized that careful handling of the catheters and avoidance of stopcocks and air bubbles are essential for obtaining accurate and reproducible values. PMID:928971
A high UV environment does not ensure adequate Vitamin D status
NASA Astrophysics Data System (ADS)
Kimlin, M. G.; Lang, C. A.; Brodie, A.; Harrison, S.; Nowak, M.; Moore, M. R.
2006-12-01
Queensland has the highest rates of skin cancer in the world, and because of the high levels of solar UV in this region it is assumed that incidental UV exposure should provide adequate vitamin D status for the population. This research was undertaken to test this assumption among healthy free-living adults in south-east Queensland, Australia (27°S), at the end of winter. The study was approved by the Queensland University of Technology Human Research Ethics Committee and conducted under the guidelines of the Declaration of Helsinki. 10.2% of the sample had serum vitamin D levels below 25 nmol/L (deficiency) and a further 32.3% had levels between 25 nmol/L and 50 nmol/L (insufficiency). Vitamin D deficiency and insufficiency can thus occur at the end of winter, even in sunny climates. The wintertime UV levels in south-east Queensland (UV index 4-6) are equivalent to summertime UV levels in northern regions of Europe and the USA, and these ambient UV levels are sufficient for synthesis of vitamin D requirements. We investigated individual UV exposure through a self-reported sun exposure questionnaire and found correlations between exposure and vitamin D status. Further research is needed to explore the interactions between the solar UV environment and vitamin D status, particularly in high UV environments such as Queensland.
The placental pursuit for an adequate oxidant balance between the mother and the fetus
Herrera, Emilio A.; Krause, Bernardo; Ebensperger, German; Reyes, Roberto V.; Casanello, Paola; Parra-Cordero, Mauro; Llanos, Anibal J.
2014-01-01
The placenta is the exchange organ that regulates metabolic processes between the mother and her developing fetus. Adequate function of this organ is clearly vital for a physiologic gestational process and a healthy baby as the final outcome. The umbilico-placental vasculature has the capacity to respond to variations in the materno-fetal milieu. Depending on the intensity and extent of the insult, these responses may be immediate, mediate, or long-lasting, leading to potential morphostructural and functional changes later in life. These adjustments usually compensate for the initial insults, but occasionally may switch to long-lasting remodeling and dysfunctional processes, giving rise to maladaptation. One of the most challenging conditions in modern perinatology is hypoxia and oxidative stress during development, disorders occurring both at high altitude and in low-altitude placental insufficiency. Hypoxia and oxidative stress may induce endothelial dysfunction and thus reduce perfusion of the placenta and restrict fetal growth and development. This review focuses on placental responses to hypoxic conditions, usually related to high altitude and placental insufficiency, that result in oxidative stress and vascular disorders, altering fetal and maternal health. Although day-to-day clinical practice and basic and clinical research are clearly providing evidence of the severe impact of oxygen deficiency and oxidative stress during pregnancy, further research on umbilical and placental vascular function under these conditions is badly needed to clarify the myriad of questions still unsettled. PMID:25009498
Aurally-adequate time-frequency analysis for scattered sound in auditoria
NASA Astrophysics Data System (ADS)
Norris, Molly K.; Xiang, Ning; Kleiner, Mendel
2005-04-01
The goal of this work was to apply an aurally-adequate time-frequency analysis technique to the analysis of sound scattering effects in auditoria. Time-frequency representations were developed as a motivated effort that takes into account binaural hearing, with a specific implementation of the interaural cross-correlation process. A model of the human auditory system was implemented on the MATLAB platform based on two previous models [A. Härmä and K. Palomäki, HUTear, Espoo, Finland; and M. A. Akeroyd, A Binaural Cross-correlogram Toolbox for MATLAB (2001), University of Sussex, Brighton]. The model stages include proper frequency selectivity, the conversion of the mechanical motion of the basilar membrane to neural impulses, and binaural hearing effects. The model was then used in the analysis of room impulse responses with varying scattering characteristics. This paper discusses the analysis results using simulated and measured room impulse responses. [Work supported by the Frank H. and Eva B. Buck Foundation.]
Statistical Properties of Online Auctions
NASA Astrophysics Data System (ADS)
Namazi, Alireza; Schadschneider, Andreas
We characterize the statistical properties of a large number of online auctions run on eBay. Both stationary and dynamic properties, like distributions of prices, number of bids etc., as well as relations between these quantities are studied. The analysis of the data reveals surprisingly simple distributions and relations, typically of power-law form. Based on these findings we introduce a simple method to identify suspicious auctions that could be influenced by a form of fraud known as shill bidding. Furthermore the influence of bidding strategies is discussed. The results indicate that the observed behavior is related to a mixture of agents using a variety of strategies.
Effects of flare definitions on the statistics of derived flare distributions
NASA Astrophysics Data System (ADS)
Ryan, D. F.; Dominique, M.; Seaton, D.; Stegen, K.; White, A.
2016-08-01
The statistical examination of solar flares is crucial to revealing their global characteristics and behaviour. Such examinations can tackle large-scale science questions or give context to detailed single-event studies. However, they are often performed using standard but basic flare detection algorithms relying on arbitrary thresholds. This arbitrariness may lead to important scientific conclusions being drawn from results caused by subjective choices in algorithms rather than the true nature of the Sun. In this paper, we explore the effect of the arbitrary thresholds used in the Geostationary Operational Environmental Satellite (GOES) event list and Large Yield RAdiometer (LYRA) Flare Finder algorithms. We find that there is a small but significant relationship between the power law exponent of the GOES flare peak flux frequency distribution and the flare start thresholds of the algorithms. We also find that the power law exponents of these distributions are not stable, but appear to steepen with increasing peak flux. This implies that the observed flare size distribution may not be a power law at all. We show that depending on the true value of the exponent of the flare size distribution, this deviation from a power law may be due to flares missed by the flare detection algorithms. However, it is not possible to determine the true exponent from GOES/XRS observations. Additionally, we find that the PROBA2/LYRA flare size distributions are artificially steep and clearly non-power law. We show that this is consistent with an insufficient degradation correction. This means that PROBA2/LYRA should not be used for flare statistics or energetics unless degradation is adequately accounted for. However, it can be used to study variations over shorter timescales and for space weather monitoring.
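The threshold dependence described above can be illustrated with a small simulation: for samples drawn from a true power law, the maximum-likelihood exponent estimate is stable as the detection threshold rises, so a systematic drift of the fitted exponent with threshold signals a deviation from power-law form. A minimal sketch (not the GOES/LYRA pipeline; the exponent, sample size, and thresholds are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha_true = 2.0   # true power-law exponent (arbitrary)
xmin = 1.0

# Draw synthetic "flare peak fluxes" from p(x) ~ x^-alpha via inverse transform
u = rng.random(200_000)
fluxes = xmin * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

def mle_alpha(x, threshold):
    """Maximum-likelihood power-law exponent for samples above a threshold."""
    above = x[x >= threshold]
    return 1.0 + above.size / np.sum(np.log(above / threshold))

# For a true power law the fitted exponent stays near alpha_true as the
# detection threshold rises; systematic steepening, as reported for the
# observed distributions, indicates departure from a pure power law.
fits = {thr: mle_alpha(fluxes, thr) for thr in (1.0, 2.0, 5.0)}
```

This stability under truncation holds because a power law conditioned on exceeding a higher threshold is again a power law with the same exponent.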
Phase statistics of seismic coda waves.
Anache-Ménier, D; van Tiggelen, B A; Margerin, L
2009-06-19
We report the analysis of the statistics of the phase fluctuations in the coda of earthquakes recorded during a temporary experiment deployed at Pinyon Flats Observatory, California. The observed distributions of the spatial derivatives of the phase in the seismic coda exhibit universal power-law decays whose exponents agree accurately with circular Gaussian statistics. The correlation function of the phase derivative is measured and used to estimate the mean free path of Rayleigh waves.
On real statistics of relaxation in gases
NASA Astrophysics Data System (ADS)
Kuzovlev, Yu. E.
2016-02-01
Using the example of a particle interacting with an ideal gas, it is shown that the statistics of collisions in statistical mechanics, at any value of the gas rarefaction parameter, differ qualitatively from those conjugated with Boltzmann's hypothetical molecular chaos and kinetic equation. In reality, the collision probability of the particle is itself random. Because of this, the relaxation of the particle velocity acquires a power-law asymptotic behavior. An estimate of its exponent is suggested on the basis of simple kinematic reasoning.
Multidimensional Visual Statistical Learning
ERIC Educational Resources Information Center
Turk-Browne, Nicholas B.; Isola, Phillip J.; Scholl, Brian J.; Treat, Teresa A.
2008-01-01
Recent studies of visual statistical learning (VSL) have demonstrated that statistical regularities in sequences of visual stimuli can be automatically extracted, even without intent or awareness. Despite much work on this topic, however, several fundamental questions remain about the nature of VSL. In particular, previous experiments have not…
Croarkin, M. Carroll
2001-01-01
For more than 50 years, the Statistical Engineering Division (SED) has been instrumental in the success of a broad spectrum of metrology projects at NBS/NIST. This paper highlights fundamental contributions of NBS/NIST statisticians to statistics and to measurement science and technology. Published methods developed by SED staff, especially during the early years, endure as cornerstones of statistics not only in metrology and standards applications, but as data-analytic resources used across all disciplines. The history of statistics at NBS/NIST began with the formation of what is now the SED. Examples from the first five decades of the SED illustrate the critical role of the division in the successful resolution of a few of the highly visible, and sometimes controversial, statistical studies of national importance. A review of the history of major early publications of the division on statistical methods, design of experiments, and error analysis and uncertainty is followed by a survey of several thematic areas. The accompanying examples illustrate the importance of SED in the history of statistics, measurements and standards: calibration and measurement assurance, interlaboratory tests, development of measurement methods, Standard Reference Materials, statistical computing, and dissemination of measurement technology. A brief look forward sketches the expanding opportunity and demand for SED statisticians created by current trends in research and development at NIST. PMID:27500023
Explorations in Statistics: Regression
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2011-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive connection.…
ERIC Educational Resources Information Center
Huberty, Carl J.
An approach to statistical testing, which combines Neyman-Pearson hypothesis testing and Fisher significance testing, is recommended. The use of P-values in this approach is discussed in some detail. The author also discusses some problems which are often found in introductory statistics textbooks. The problems involve the definitions of…
Reform in Statistical Education
ERIC Educational Resources Information Center
Huck, Schuyler W.
2007-01-01
Two questions are considered in this article: (a) What should professionals in school psychology do in an effort to stay current with developments in applied statistics? (b) What should they do with their existing knowledge to move from surface understanding of statistics to deep understanding? Written for school psychologists who have completed…
Demonstrating Poisson Statistics.
ERIC Educational Resources Information Center
Vetterling, William T.
1980-01-01
Describes an apparatus that offers a very lucid demonstration of Poisson statistics as applied to electrical currents, and the manner in which such statistics account for shot noise when applied to macroscopic currents. The experiment described is intended for undergraduate physics students. (HM)
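The shot-noise connection in the abstract above rests on a basic property of Poisson statistics: the variance of the count equals its mean, so current fluctuations grow with the mean current. A minimal simulation of this equality (illustrative only, not the described apparatus; the mean count and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# Electrons crossing a detector per time interval follow Poisson statistics.
lam = 50.0                               # mean count per interval (arbitrary)
counts = rng.poisson(lam, size=100_000)  # simulated interval counts

# For a Poisson process the variance equals the mean; this equality is what
# makes shot-noise power proportional to the mean current.
mean_count = counts.mean()
var_count = counts.var()
fano = var_count / mean_count            # Fano factor, ~1 for Poisson counts
```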
Statistical Summaries: Public Institutions.
ERIC Educational Resources Information Center
Virginia State Council of Higher Education, Richmond.
This document presents a statistical portrait of Virginia's 17 public higher education institutions. Data provided include: enrollment figures (broken down in categories such as sex, residency, full- and part-time status, residence, ethnicity, age, and level of postsecondary education); FTE figures; admissions statistics (such as number…
ERIC Educational Resources Information Center
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
ERIC Educational Resources Information Center
Council of Ontario Universities, Toronto.
Summary statistics on application and registration patterns of applicants wishing to pursue full-time study in first-year places in Ontario universities (for the fall of 1987) are given. Data on registrations were received indirectly from the universities as part of their annual submission of USIS/UAR enrollment data to Statistics Canada and MCU.…
Introduction to Statistical Physics
NASA Astrophysics Data System (ADS)
Casquilho, João Paulo; Ivo Cortez Teixeira, Paulo
2014-12-01
Preface; 1. Random walks; 2. Review of thermodynamics; 3. The postulates of statistical physics. Thermodynamic equilibrium; 4. Statistical thermodynamics – developments and applications; 5. The classical ideal gas; 6. The quantum ideal gas; 7. Magnetism; 8. The Ising model; 9. Liquid crystals; 10. Phase transitions and critical phenomena; 11. Irreversible processes; Appendixes; Index.
Deconstructing Statistical Analysis
ERIC Educational Resources Information Center
Snell, Joel
2014-01-01
Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or of making a new product or service appear legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…
ERIC Educational Resources Information Center
Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain
2004-01-01
Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…
Understanding Undergraduate Statistical Anxiety
ERIC Educational Resources Information Center
McKim, Courtney
2014-01-01
The purpose of this study was to understand undergraduate students' views of statistics. Results reveal that students with less anxiety have a higher interest in statistics and also believe in their ability to perform well in the course. Also students who have a more positive attitude about the class tend to have a higher belief in their…
Explorations in Statistics: Correlation
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2010-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This sixth installment of "Explorations in Statistics" explores correlation, a familiar technique that estimates the magnitude of a straight-line relationship between two variables. Correlation is meaningful only when the…
Code of Federal Regulations, 2010 CFR
2010-07-01
... Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) NATIONAL PRIMARY DRINKING WATER REGULATIONS Enhanced Filtration and Disinfection-Systems Serving Fewer Than 10,000 People... is adequate to limit potential contamination by Cryptosporidium oocysts. The adequacy of the...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) NATIONAL PRIMARY DRINKING WATER REGULATIONS Enhanced Filtration and Disinfection-Systems Serving Fewer Than 10,000 People... is adequate to limit potential contamination by Cryptosporidium oocysts. The adequacy of the...
NASA Astrophysics Data System (ADS)
Petropoulos, Z.; Clavin, C.; Zuckerman, B.
2015-12-01
The 2014 4-Methylcyclohexanemethanol (MCHM) spill in the Elk River of West Virginia highlighted existing gaps in emergency planning for, and response to, large-scale chemical releases in the United States. The Emergency Planning and Community Right-to-Know Act requires that facilities with hazardous substances provide Material Safety Data Sheets (MSDSs), which contain health and safety information on the hazardous substances. The MSDS produced by Eastman Chemical Company, the manufacturer of MCHM, listed "no data available" for various human toxicity subcategories, such as reproductive toxicity and carcinogenicity. As a result of incomplete toxicity data, the public and media received conflicting messages on the safety of the contaminated water from government officials, industry, and the public health community. Two days after the governor lifted the ban on water use, the health department partially retracted the ban by warning pregnant women to continue avoiding the contaminated water, which the Centers for Disease Control and Prevention deemed safe three weeks later. The response in West Virginia represents a failure in risk communication and calls into question whether government officials have sufficient information to support evidence-based decisions during future incidents. Research capabilities, like the National Science Foundation RAPID funding, can provide a solution to some of the data gaps, such as information on environmental fate in the case of the MCHM spill. In order to inform policy discussions on this issue, a methodology for assessing the outcomes of RAPID and similar National Institutes of Health grants in the context of emergency response is employed to examine the efficacy of research-based capabilities in enhancing public health decision making capacity. The results of this assessment highlight potential roles rapid scientific research can fill in ensuring adequate health and safety data is readily available for decision makers during large
Maintaining Adequate Carbon Dioxide Washout for an Advanced Extravehicular Mobility Unit
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Navarro, Moses; Conger, Bruce; Korona, Adam; McMillin, Summer; Norcross, Jason; Swickrath, Mike
2013-01-01
Over the past several years, NASA has realized tremendous progress in technology development that is aimed at the production of an Advanced Extravehicular Mobility Unit (AEMU). Of the many functions provided by the spacesuit and portable life support subsystem within the AEMU, delivering breathing gas to the astronaut along with removing the carbon dioxide (CO2) remains one of the most important environmental functions that the AEMU can control. Carbon dioxide washout is the capability of the ventilation flow in the spacesuit helmet to provide low concentrations of CO2 to the crew member to meet breathing requirements. CO2 washout performance is a critical parameter needed to ensure proper and sufficient designs in a spacesuit and in vehicle applications such as sleep stations and hygiene compartments. Human testing to fully evaluate and validate CO2 washout performance is necessary but also expensive due to the levied safety requirements. Moreover, correlation of math models becomes challenging because of human variability and movement. To supplement human CO2 washout testing, a breathing capability will be integrated into a suited manikin test apparatus to provide a safe, lower-cost, stable, easily modeled alternative to human testing. Additionally, this configuration provides NASA Johnson Space Center (JSC) the capability to evaluate CO2 washout under off-nominal conditions that would otherwise be unsafe for human testing or difficult due to fatigue of a test subject. Testing has been under way in-house at JSC, and analysis has been initiated to evaluate whether the technology removes CO2 sufficiently and provides ventilation flow adequate for maintaining CO2 washout in the AEMU spacesuit helmet of the crew member during an extravehicular activity. This paper will review recent CO2 washout testing and analysis activities, testing planned in-house with a spacesuit simulator, and the associated analytical work.
Do Foley Catheters Adequately Drain the Bladder? Evidence from CT Imaging Studies
Avulova, Svetlana; Li, Valery J.; Khusid, Johnathan A.; Choi, Woo S.; Weiss, Jeffrey P.
2015-01-01
ABSTRACT Introduction: The Foley catheter has been widely assumed to be an effective means of draining the bladder. However, recent studies have called its efficacy into question. The objective of our study was to further assess the adequacy of the Foley catheter for complete drainage of the bladder. Materials and Methods: Consecutive catheterized patients were identified from a retrospective review of contrast-enhanced and non-contrast-enhanced computed tomographic (CT) abdomen and pelvis studies completed from 7/1/2011 to 6/30/2012. Residual urine volume (RUV) was measured using 5 mm axial CT sections as follows: the length (L) and width (W) of the bladder in the section with the greatest cross-sectional area were combined with the bladder height (H), determined from multiplanar reformatted images, to calculate RUV by applying the formula for the volume (V) of a sphere in a cube: V=(π/6)*(L*W*H). Results: RUVs were calculated for 167 consecutively catheterized patients (mean age 67; 72 men, 95 women) identified from CT abdomen and pelvis studies. The mean RUV was 13.2 mL (range: 0.0-859.1 mL; standard deviation: 75.9 mL; margin of error at 95% confidence: 11.6 mL). Four (2.4%) catheterized patients had RUVs of >50 mL, two of whom had an improperly placed catheter tip noted on their CT reports. Conclusions: Previous studies have shown that up to 43% of catheterized patients had a RUV greater than 50 mL, suggesting inadequacy of bladder drainage via the Foley catheter. Our study indicated that the vast majority of patients with Foley catheters (97.6%) had adequately drained bladders with volumes of <50 mL. PMID:26200550
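The study's RUV formula is simple enough to sketch directly. The bladder dimensions below are hypothetical, not taken from the study:

```python
import math

def residual_urine_volume(length_cm, width_cm, height_cm):
    """Estimate residual urine volume (mL) from three orthogonal bladder
    dimensions (cm) using the study's "sphere in a cube" correction:
    V = (pi/6) * L * W * H."""
    return (math.pi / 6.0) * length_cm * width_cm * height_cm

# Hypothetical bladder measuring 6 x 5 x 4 cm:
v = residual_urine_volume(6.0, 5.0, 4.0)
print(round(v, 1))  # 62.8 -> would exceed the study's 50 mL adequacy cutoff
```

Since the dimensions are in centimeters, the product is in cubic centimeters, which equal milliliters directly.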
Goutianos, Georgios; Tzioura, Aikaterini; Kyparos, Antonios; Paschalis, Vassilis; Margaritelis, Nikos V; Veskoukis, Aristidis S; Zafeiridis, Andreas; Dipla, Konstantina; Nikolaidis, Michalis G; Vrabas, Ioannis S
2015-01-01
Animal models are widely used in biology, and the findings of animal research are traditionally projected to humans. However, recent publications have raised concerns about the extent to which animals and humans respond similarly to physiological stimuli. Original data directly comparing animals and humans in vivo are scarce, and no study has addressed this issue after exercise. We aimed to compare, side by side in the same experimental setup, rat and human responses to an acute exercise bout of matched intensity and duration. Rats and humans ran on a treadmill at 86% of maximal velocity until exhaustion. Pre- and post-exercise we measured 30 blood chemistry parameters, which evaluate iron status, lipid profile, glucose regulation, protein metabolism, and liver and renal function. ANOVA indicated that almost all biochemical parameters followed a similar alteration pattern post-exercise in rats and humans. In fact, there were only 2/30 significant species × exercise interactions (in testosterone and globulins), indicating different responses to exercise between rats and humans. On the contrary, the main effect of exercise was significant in 15/30 parameters and marginally nonsignificant in two others (copper, P = 0.060, and apolipoprotein B, P = 0.058). Our major finding is that the rat adequately mimics human responses to exercise in the basic blood biochemical parameters reported here. The physiological resemblance of rat and human blood responses after exercise to exhaustion on a treadmill indicates that the use of blood chemistry in rats for exercise physiology research is justified. PMID:25677548
Emotional Experiences of Obese Women with Adequate Gestational Weight Variation: A Qualitative Study
Faria-Schützer, Débora Bicudo; Surita, Fernanda Garanhani de Castro; Alves, Vera Lucia Pereira; Vieira, Carla Maria; Turato, Egberto Ribeiro
2015-01-01
Background: As a result of the growth of the obese population, the number of obese women of fertile age has increased in recent years. Obesity in pregnancy is related to greater levels of anxiety, depression and physical harm. However, pregnancy is an opportune moment for health care professionals to intervene and address obesity. The objective of this study was to describe how obese pregnant women emotionally experience success in adequate weight control. Methods and Findings: Using a qualitative design that seeks to understand content in the field of health, the sample was selected deliberately, with thirteen obese pregnant women participating in individual interviews. Data were analysed by inductive content analysis, which includes complete transcription of the interviews, re-readings using suspended attention, categorization into discussion topics, and qualitative and inductive analysis of the content. The analysis revealed four categories, three of which show the trajectory of body care that obese women experience during pregnancy: 1) the obese pregnant woman starts to think about her body; 2) the challenge of the diet for the obese pregnant woman; 3) the relation of the obese pregnant woman with the team of antenatal professionals. The fourth category reveals the origin of the motivation for change: 4) the potentializing factors for change: the motivation of the obese woman while pregnant. Conclusions: During pregnancy, obese women are more in touch with themselves and with their emotional conflicts. Through the transformations of their bodies, women can start a more refined self-care process and experience of the body-mind unit. The fear for their own and their baby's life, due to the risks posed by obesity, appears to be a great potentializing factor for change. The relationship with the professionals of the health care team plays an important role in the motivational support of the obese pregnant woman. PMID:26529600
Roos, V; Gunnarsson, L; Fick, J; Larsson, D G J; Rudén, C
2012-04-01
The presence of pharmaceuticals in the aquatic environment, and concern about negative effects on aquatic organisms, has gained increasing attention in recent years. As ecotoxicity data are lacking for most active pharmaceutical ingredients (APIs), it is important to identify strategies for prioritising APIs for ecotoxicity testing and environmental monitoring. We used nine previously proposed prioritisation schemes, both risk- and hazard-based, to rank 582 APIs, and compared the similarities and differences in overall ranking results and input data. Moreover, we analysed how well the methods ranked seven relatively well-studied APIs. We conclude that the hazard-based methods were more successful in correctly ranking the well-studied APIs, but the fish plasma model, which includes human pharmacological data, also showed a high success rate. The analyses show that input data availability varies significantly; some data, such as logP, are available for most APIs, while information about environmental concentrations and bioconcentration is still scarce. The results also suggest that the exposure estimates in risk-based methods need to be improved and that including effect measures at first-tier prioritisation might underestimate risks. We propose that, to develop an adequate prioritisation scheme, improved data on exposure (such as degradation and sewage treatment removal) and on bioconcentration ability should be considered. The use of ATC codes may also be useful for developing a prioritisation scheme that includes the mode of action of pharmaceuticals and, to some extent, mixture effects. PMID:22361586
Delange, François; de Benoist, Bruno; Burgi, Hans
2002-01-01
OBJECTIVE: Urinary iodine concentration is the prime indicator of nutritional iodine status and is used to evaluate population-based iodine supplementation. In 1994, WHO, UNICEF and ICCIDD recommended median urinary iodine concentrations for populations of 100-200 µg/l, assuming the 100 µg/l threshold would limit concentrations <50 µg/l to 100 µg/l. The total population was 55 892, including 35 661 (64%) schoolchildren. Median urinary iodine concentrations were 111-540 (median 201) µg/l for all populations, 100-199 µg/l in 23 (48%) populations and ≥200 µg/l in 25 (52%). The frequencies of values <50 µg/l were 0-20.8% (mean 4.8%) overall, and 7.2% and 2.5% in populations with medians of 100-199 µg/l and >200 µg/l, respectively. The frequency reached 20% only in two places where iodine had been supplemented for <2 years. CONCLUSION: The frequency of urinary iodine concentrations <50 µg/l in populations with median urinary iodine concentrations ≥100 µg/l has been overestimated. The threshold of 100 µg/l does not need to be increased. In populations, median urinary iodine concentrations of 100-200 µg/l indicate adequate iodine intake and optimal iodine nutrition. PMID:12219154
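The two summary statistics this record turns on, the population median and the frequency of values below 50 µg/l, can be computed as follows. The simulated concentrations are illustrative, not the surveyed data:

```python
import random
import statistics

random.seed(1)
# Simulated urinary iodine concentrations (ug/l) for a hypothetical
# population with adequate intake; values are illustrative only.
samples = [random.lognormvariate(5.0, 0.45) for _ in range(1000)]

median_ui = statistics.median(samples)  # the prime indicator
frac_below_50 = sum(1 for x in samples if x < 50) / len(samples)
print(round(median_ui), round(100 * frac_below_50, 1))
```

By the paper's criterion, a median of 100-200 µg/l with few values under 50 µg/l indicates adequate iodine intake.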
The adequate stimulus for avian short latency vestibular responses to linear translation
NASA Technical Reports Server (NTRS)
Jones, T. A.; Jones, S. M.; Colbert, S.
1998-01-01
Transient linear acceleration stimuli have been shown to elicit eighth nerve vestibular compound action potentials in birds and mammals. The present study was undertaken to better define the nature of the adequate stimulus for neurons generating the response in the chicken (Gallus domesticus). In particular, the study evaluated the question of whether the neurons studied are most sensitive to the maximum level of linear acceleration achieved or to the rate of change in acceleration (da/dt, or jerk). To do this, vestibular response thresholds were measured as a function of stimulus onset slope. Traditional computer signal averaging was used to record responses to pulsed linear acceleration stimuli. Stimulus onset slope was systematically varied. Acceleration thresholds decreased with increasing stimulus onset slope (decreasing stimulus rise time). When stimuli were expressed in units of jerk (g/ms), thresholds were virtually constant for all stimulus rise times. Moreover, stimuli having identical jerk magnitudes but widely varying peak acceleration levels produced virtually identical responses. Vestibular response thresholds, latencies and amplitudes appear to be determined strictly by stimulus jerk magnitudes. Stimulus attributes such as peak acceleration or rise time alone do not provide sufficient information to predict response parameter quantities. Indeed, the major response parameters were shown to be virtually independent of peak acceleration levels or rise time when these stimulus features were isolated and considered separately. It is concluded that the neurons generating short latency vestibular evoked potentials do so as "jerk encoders" in the chicken. Primary afferents classified as "irregular", and which traditionally fall into the broad category of "dynamic" or "phasic" neurons, would seem to be the most likely candidates for the neural generators of short latency vestibular compound action potentials.
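For a linear onset ramp, the jerk-encoding result reduces to a simple relation: jerk is peak acceleration divided by rise time. A minimal sketch with hypothetical stimulus values:

```python
def onset_jerk(peak_g, rise_ms):
    """Jerk of a linear acceleration onset ramp, in g per millisecond."""
    return peak_g / rise_ms

# Two stimuli with different peak accelerations but identical onset slope:
a = onset_jerk(0.5, 2.0)   # 0.5 g reached in 2 ms
b = onset_jerk(1.0, 4.0)   # 1.0 g reached in 4 ms
print(a, b)  # both 0.25 g/ms -> per the study, equally effective stimuli
```

This is why thresholds expressed in g/ms were virtually constant across rise times while thresholds expressed in peak g were not.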
Carnegie Mellon Course Dissects Statistics about Sexual Orientation
ERIC Educational Resources Information Center
Keller, Josh
2007-01-01
Most statistics courses emphasize the power of statistics. Michele DiPietro's course focuses on the failures. Gay and lesbian studies are certainly fertile ground for bad guesses and unreliable statistics. The most famous number, that 10 percent of the population is gay, was taken from a biased Kinsey sample of white men ages 16 to 55 in 1948, and…
Kodama, Momoko; Uenishi, Kazuhiro
2010-06-01
Childhood and adolescence are important periods for body growth. Calcium is one of the critical dietary factors, especially for bone growth. Although the recommended dietary allowance (RDA) for calcium was set higher in the Dietary Reference Intakes for Japanese, 2010, the calcium intake of Japanese children and adolescents is not necessarily adequate. Furthermore, the number of breakfast skippers in this age group tends to increase. It is therefore very important to acquire adequate dietary habits from childhood and adolescence. PMID:20513944
Statistical regimes of random laser fluctuations
Lepri, Stefano; Cavalieri, Stefano; Oppo, Gian-Luca; Wiersma, Diederik S.
2007-06-15
Statistical fluctuations of the light emitted from amplifying random media are studied theoretically and numerically. The characteristic scales of the diffusive motion of light lead to Gaussian or power-law (Lévy) distributed fluctuations, depending on external control parameters. In the Lévy regime, the output pulse is highly irregular, leading to huge deviations from a mean-field description. Monte Carlo simulations of a simplified model that includes the population of the medium demonstrate the two statistical regimes and provide a comparison with dynamical rate equations. The different statistics of the fluctuations help to explain recent experimental observations reported in the literature.
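The Gaussian-versus-Lévy distinction can be illustrated with a toy sampler. This is not the paper's Monte Carlo model of the amplifying medium; a Pareto draw with exponent below 2 simply stands in for heavy-tailed, Lévy-like emission fluctuations:

```python
import random
import statistics

random.seed(0)
N = 5000
gaussian = [random.gauss(0.0, 1.0) for _ in range(N)]
# Pareto with alpha < 2 has diverging variance; it serves here as a
# stand-in heavy-tailed sampler for the Levy-like regime (illustrative
# only, not the paper's model of the medium).
heavy = [random.paretovariate(1.5) for _ in range(N)]

# Ratio of the largest event to the typical (median) event size:
g_ratio = max(gaussian) / statistics.median(abs(x) for x in gaussian)
h_ratio = max(heavy) / statistics.median(heavy)
print(round(g_ratio, 1), round(h_ratio, 1))  # heavy tail: far larger ratio
```

In the heavy-tailed regime a single rare event dominates the sample, which is why the output pulse looks highly irregular and mean-field descriptions fail.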
The insignificance of statistical significance testing
Johnson, Douglas H.
1999-01-01
Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.
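A quick sketch of the estimation-with-confidence-interval alternative the paper advocates, using hypothetical clutch-size data from two habitats and a large-sample interval with z = 1.96 for simplicity:

```python
import math
import statistics

# Hypothetical clutch sizes from two habitats (illustrative data only).
a = [5, 7, 6, 8, 6, 7, 9, 6, 5, 7]
b = [4, 5, 6, 5, 4, 6, 5, 7, 4, 5]

# Estimate the difference in means and its standard error.
diff = statistics.mean(a) - statistics.mean(b)
se = math.sqrt(statistics.variance(a) / len(a) + statistics.variance(b) / len(b))
lo, hi = diff - 1.96 * se, diff + 1.96 * se  # approximate 95% CI
print(round(diff, 2), (round(lo, 2), round(hi, 2)))
```

The interval reports both the direction and the plausible magnitude of the effect, which a bare reject/fail-to-reject decision does not.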
LED champing: statistically blessed?
Wang, Zhuo
2015-06-10
LED champing (smart mixing of individual LEDs to match the desired color and lumens) and color mixing strategies have been widely used to maintain the color consistency of light engines. Light engines with champed LEDs can easily achieve color consistency within a couple of MacAdam steps, even starting from widely distributed LEDs. From a statistical point of view, the distributions of the color coordinates and the flux after champing are studied. The related statistical parameters are derived, which facilitate process improvements such as Six Sigma and are instrumental to statistical quality control for mass production. PMID:26192863
Winters, Ryan; Winters, Andrew; Amedee, Ronald G.
2010-01-01
The Accreditation Council for Graduate Medical Education sets forth a number of required educational topics that must be addressed in residency and fellowship programs. We sought to provide a primer on some of the important basic statistical concepts to consider when examining the medical literature. It is not essential to understand the exact workings and methodology of every statistical test encountered, but it is necessary to understand selected concepts such as parametric and nonparametric tests, correlation, and numerical versus categorical data. This working knowledge will allow you to spot obvious irregularities in statistical analyses that you encounter. PMID:21603381
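One distinction the primer names, parametric versus nonparametric tests, can be made concrete with a hand-rolled Mann-Whitney U statistic: being rank-based, it needs no normality assumption, unlike the t-test. The data are hypothetical, and a real analysis would use a statistics library:

```python
def mann_whitney_u(x, y):
    """U statistic for sample x versus y (no tie correction).
    Counts, over all pairs, how often a value from x exceeds one from y."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

x = [12, 15, 11, 18, 14]
y = [9, 10, 13, 8, 7]
print(mann_whitney_u(x, y))  # 23.0 of a possible 25 pairs favor x
```

The statistic depends only on the ordering of the observations, so it is unaffected by skew or outliers that would distort a parametric test.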
NASA Technical Reports Server (NTRS)
Young, M.; Koslovsky, M.; Schaefer, Caroline M.; Feiveson, A. H.
2017-01-01
Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.
NASA Technical Reports Server (NTRS)
Levy, R.; Mcginness, H.
1976-01-01
Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
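The interim model's idea, uncorrelated hourly speed draws that reproduce a fitted distribution, can be sketched with a Weibull sampler. The shape and scale below are generic illustrative values, not fitted Goldstone parameters:

```python
import random

random.seed(42)
# Hourly wind-speed sampler in the spirit of the interim model:
# independent draws from a fitted distribution. A Weibull with shape
# k = 2 (Rayleigh) is a common wind model; scale is illustrative.
k, c = 2.0, 6.0  # shape, scale (m/s)
speeds = [random.weibullvariate(c, k) for _ in range(24)]  # one day, hourly

# Available wind power scales with the cube of speed, so the mean cubed
# speed (not the mean speed) is the relevant summary for power prediction.
mean_cubed = sum(s ** 3 for s in speeds) / len(speeds)
print(round(mean_cubed, 1))
```

A stochastic model with correlated samples, as the abstract notes, would additionally condition each draw on the previous hour's speed.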
Ge, Jiajia; Santanam, Lakshmi; Noel, Camille; Parikh, Parag J.
2013-03-15
Purpose: To evaluate whether planning 4-dimensional computed tomography (4DCT) can adequately represent daily motion of abdominal tumors in regularly fractionated and stereotactic body radiation therapy (SBRT) patients. Methods and Materials: Intrafractional tumor motion of 10 patients with abdominal tumors (4 pancreas-fractionated and 6 liver-stereotactic patients) with implanted fiducials was measured based on daily orthogonal fluoroscopic movies over 38 treatment fractions. The internal margin needed for at least 90% tumor coverage was calculated based on the 95th and 5th percentiles of daily 3-dimensional tumor motion. The planning internal margin was generated by fusing 4DCT motion from all phase bins. The disagreement between the needed and planning internal margins was analyzed fraction by fraction in 3 motion axes (superior-inferior [SI], anterior-posterior [AP], and left-right [LR]). The 4DCT margin was considered an overestimation/underestimation of daily motion when disagreement exceeded at least 3 mm in the SI axis and/or 1.2 mm in the AP and LR axes (the 4DCT image resolution). Results: The 4DCT overestimated daily 3-dimensional motion in 39% of the fractions in 7 of 10 patients and underestimated it in 53% of the fractions in 8 of 10 patients. Median underestimation was 3.9 mm, 3.0 mm, and 1.7 mm in the SI axis, AP axis, and LR axis, respectively. The 4DCT was found to capture irregular deep breaths in 3 of 10 patients, with 4DCT motion larger than mean daily amplitude by 18 to 21 mm. The breathing pattern varied from breath to breath and day to day. The intrafractional variation of amplitude was significantly larger than the interfractional variation (2.7 mm vs 1.3 mm) in the primary motion axis (ie, SI axis). The SBRT patients showed significantly larger intrafractional amplitude variation than fractionated patients (3.0 mm vs 2
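The needed-margin calculation from the Methods, keeping the central 90% of daily motion via the 5th and 95th percentiles, can be sketched as follows. The SI excursions are hypothetical, not the study's measurements:

```python
import statistics

# Hypothetical daily superior-inferior tumor excursions (mm), one value
# per treatment fraction; the 21.0 mm entry mimics an irregular deep breath.
daily_si_motion = [8.1, 9.4, 7.2, 12.5, 8.8, 10.1, 7.9, 21.0, 9.0, 8.5]

# statistics.quantiles with n=20 returns 19 cut points at 5% steps,
# so the first and last are the 5th and 95th percentiles.
qs = statistics.quantiles(daily_si_motion, n=20)
p5, p95 = qs[0], qs[-1]
print(round(p5, 1), round(p95, 1))
```

A planning margin built from a single 4DCT session, by contrast, reflects only the breathing captured that day, which is why it can over- or underestimate the percentile-based daily range.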
NASA Astrophysics Data System (ADS)
Bieg, Bohdan; Chrzanowski, Janusz; Kravtsov, Yury A.; Orsitto, Francesco
Basic principles and recent findings of the quasi-isotropic approximation (QIA) of the geometrical optics method are presented in a compact manner. QIA was developed in 1969 to describe electromagnetic waves in weakly anisotropic media. QIA represents the wave field as a power series in two small parameters, one of which is the traditional geometrical optics parameter, equal to the ratio of the wavelength to the plasma characteristic scale, and the other of which is the largest component of the anisotropy tensor. As a result, QIA ideally suits tokamak polarimetry/interferometry systems in the submillimeter range, where the plasma manifests the properties of a weakly anisotropic medium.
NASA Astrophysics Data System (ADS)
Richfield, Jon; bookfeller
2016-07-01
In reply to Ralph Kenna and Pádraig Mac Carron's feature article “Maths meets myths” in which they describe how they are using techniques from statistical physics to characterize the societies depicted in ancient Icelandic sagas.
... facts and statistics here include brain and central nervous system tumors (including spinal cord, pituitary and pineal gland ... U.S. living with a primary brain and central nervous system tumor. This year, nearly 17,000 people will ...
Titanic: A Statistical Exploration.
ERIC Educational Resources Information Center
Takis, Sandra L.
1999-01-01
Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)
NASA Astrophysics Data System (ADS)
Grégoire, G.
2016-05-01
This chapter is devoted to two objectives. The first one is to answer the request expressed by attendees of the first Astrostatistics School (Annecy, October 2013) to be provided with an elementary vademecum of statistics that would facilitate understanding of the given courses. In this spirit we recall very basic notions, that is definitions and properties that we think sufficient to benefit from courses given in the Astrostatistical School. Thus we give briefly definitions and elementary properties on random variables and vectors, distributions, estimation and tests, maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation, and due to the place devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors and Gaussian mixture models.
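As a taste of the maximum likelihood methodology the chapter introduces: for a Gaussian sample the ML estimates have a closed form, the sample mean and the (biased) sample variance. A quick numerical check on simulated data:

```python
import math
import random

random.seed(3)
# Simulated sample from a Gaussian with known mean 2.0 and sd 0.5.
data = [random.gauss(2.0, 0.5) for _ in range(500)]

# Closed-form maximum likelihood estimates for a Gaussian:
mu_hat = sum(data) / len(data)
var_hat = sum((x - mu_hat) ** 2 for x in data) / len(data)  # biased (1/n) form
print(round(mu_hat, 2), round(math.sqrt(var_hat), 2))
```

The estimates land close to the true parameters, illustrating the consistency property that makes maximum likelihood the workhorse of estimation and of the Gaussian mixture models mentioned at the end of the abstract.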
... and Statistics Recommend on Facebook Tweet Share Compartir Plague in the United States Plague was first introduced ... per year in the United States: 1900-2012. Plague Worldwide Plague epidemics have occurred in Africa, Asia, ...
Cooperative Learning in Statistics.
ERIC Educational Resources Information Center
Keeler, Carolyn M.; And Others
1994-01-01
Formal use of cooperative learning techniques proved effective in improving student performance and retention in a freshman level statistics course. Lectures interspersed with group activities proved effective in increasing conceptual understanding and overall class performance. (11 references) (Author)
Purposeful Statistical Investigations
ERIC Educational Resources Information Center
Day, Lorraine
2014-01-01
Lorraine Day provides us with a great range of statistical investigations using various resources such as maths300 and TinkerPlots. Each of the investigations link mathematics to students' lives and provide engaging and meaningful contexts for mathematical inquiry.
Tuberculosis Data and Statistics
... Organization Chart Advisory Groups Federal TB Task Force Data and Statistics Language: English Español (Spanish) Recommend on ... United States publication. PDF [6 MB] Interactive TB Data Tool Online Tuberculosis Information System (OTIS) OTIS is ...
Electric power annual 1995. Volume II
1996-12-01
This document summarizes pertinent statistics on various aspects of the U.S. electric power industry for the year and includes a graphic presentation. Data is included on electric utility retail sales and revenues, financial statistics, environmental statistics of electric utilities, demand-side management, electric power transactions, and non-utility power producers.
Oakland, J.S.
1986-01-01
Addressing the increasing need for firms to have a thorough knowledge of statistically based quality control procedures, this book presents the fundamentals of statistical process control (SPC) in a non-mathematical, practical way. It provides real-life examples and data drawn from a wide variety of industries. The foundations of good quality management and process control, and the control of conformance and consistency during production, are covered. The book offers clear guidance to those who wish to understand and implement modern SPC techniques.
NASA Astrophysics Data System (ADS)
Kardar, Mehran
2006-06-01
While many scientists are familiar with fractals, fewer are familiar with the concepts of scale-invariance and universality which underlie the ubiquity of their shapes. These properties may emerge from the collective behaviour of simple fundamental constituents, and are studied using statistical field theories. Based on lectures for a course in statistical mechanics taught by Professor Kardar at Massachusetts Institute of Technology, this textbook demonstrates how such theories are formulated and studied. Perturbation theory, exact solutions, renormalization groups, and other tools are employed to demonstrate the emergence of scale invariance and universality, and the non-equilibrium dynamics of interfaces and directed paths in random media are discussed. Ideal for advanced graduate courses in statistical physics, it contains an integrated set of problems, with solutions to selected problems at the end of the book. A complete set of solutions is available to lecturers on a password protected website at www.cambridge.org/9780521873413. Based on lecture notes from a course on statistical mechanics taught by the author at MIT; contains 65 exercises, with solutions to selected problems; features a thorough introduction to the methods of statistical field theory; ideal for graduate courses in statistical physics.
Statistical Physics of Particles
NASA Astrophysics Data System (ADS)
Kardar, Mehran
2006-06-01
Statistical physics has its origins in attempts to describe the thermal properties of matter in terms of its constituent particles, and has played a fundamental role in the development of quantum mechanics. Based on lectures for a course in statistical mechanics taught by Professor Kardar at Massachusetts Institute of Technology, this textbook introduces the central concepts and tools of statistical physics. It contains a chapter on probability and related issues such as the central limit theorem and information theory, and covers interacting particles, with an extensive description of the van der Waals equation and its derivation by mean field approximation. It also contains an integrated set of problems, with solutions to selected problems at the end of the book. It will be invaluable for graduate and advanced undergraduate courses in statistical physics. A complete set of solutions is available to lecturers on a password protected website at www.cambridge.org/9780521873420. Based on lecture notes from a course on statistical mechanics taught by the author at MIT; contains 89 exercises, with solutions to selected problems; contains chapters on probability and interacting particles; ideal for graduate courses in statistical mechanics.
NASA Astrophysics Data System (ADS)
Maslov, V. P.; Maslova, T. V.
2013-07-01
We introduce several new notions in mathematical statistics that bridge the gap between this discipline and statistical physics. The analogy between them is useful both for mathematics and for physics. What is more, this new mathematical statistics is adequate for the study of computer networks and self-teaching systems. The role of the web in sociological and economic research is ascertained.
F-specific RNA bacteriophages are adequate model organisms for enteric viruses in fresh water.
Havelaar, A H; van Olphen, M; Drost, Y C
1993-01-01
Culturable enteroviruses were detected by applying concentration techniques and by inoculating the concentrates on the BGM cell line. Samples were obtained from a wide variety of environments, including raw sewage, secondary effluent, coagulated effluent, chlorinated and UV-irradiated effluents, river water, coagulated river water, and lake water. The virus concentrations varied widely between 0.001 and 570/liter. The same cell line also supported growth of reoviruses, which were abundant in winter (up to 95% of the viruses detected) and scarce in summer (less than 15%). The concentrations of three groups of model organisms in relation to virus concentrations were also studied. The concentrations of bacteria (thermotolerant coliforms and fecal streptococci) were significantly correlated with virus concentrations in river water and coagulated secondary effluent, but were relatively low in disinfected effluents and relatively high in surface water open to nonhuman fecal pollution. The concentrations of F-specific RNA bacteriophages (FRNA phages) were highly correlated with virus concentrations in all environments studied except raw and biologically treated sewage. Numerical relationships were consistent over the whole range of environments; the regression equations for FRNA phages on viruses in river water and lake water were statistically equivalent. These relationships support the possibility that enteric virus concentrations can be predicted from FRNA phage data. PMID:8215367
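The paper's key numerical relationship is a regression of virus concentration on FRNA phage concentration. A minimal least-squares fit on log-transformed, hypothetical concentrations (not the study's data) looks like this:

```python
import math

# Hypothetical paired concentrations (per liter); illustrative only.
phage = [1e2, 5e2, 1e3, 5e3, 1e4, 5e4]
virus = [0.05, 0.2, 0.5, 2.0, 5.0, 20.0]

# Fit log10(virus) = intercept + slope * log10(phage) by least squares.
x = [math.log10(v) for v in phage]
y = [math.log10(v) for v in virus]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
intercept = my - slope * mx
print(round(slope, 2), round(intercept, 2))
```

With such a fitted equation in hand, an enteric virus concentration could be predicted from a phage count alone, which is the practical use the abstract suggests.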
The Importance and Efficacy of Using Statistics in the High School Chemistry Laboratory
ERIC Educational Resources Information Center
Matsumoto, Paul S.
2006-01-01
In high school, many students do not have an opportunity to learn or use statistics. Because statistics is a powerful tool and many students take a statistics class in college, prior exposure to statistics in a chemistry course (or another course) would benefit students. This paper describes some statistical concepts and tests with their…
GeoGebra for Mathematical Statistics
ERIC Educational Resources Information Center
Hewson, Paul
2009-01-01
The GeoGebra software is attracting a lot of interest in the mathematical community, consequently there is a wide range of experience and resources to help use this application. This article briefly outlines how GeoGebra will be of great value in statistical education. The release of GeoGebra is an excellent example of the power of free software…
Anitua, Eduardo; Alkhraisat, Mohammad Hamdan; Piñas, Laura; Orive, Gorka
2015-05-01
The primary stability of dental implants is essentially influenced by the quality and quantity of hosting bone. To study the effects of adaptation of the drilling protocol to the biological quality of bone estimated by bone density and cortical/cancellous bone ratio, 8.5mm-short implants were placed in different bone types by adapting the drilling protocol to result in a socket under-preparation by 0.2, 0.4, 0.7, 1 and 1.2mm in bone types I, II, III, IV and V, respectively. The effect of the drilling protocol was studied on implant insertion torque and osseointegration. Additionally, we analyzed the relationship of demographic data and social habits to bone type and insertion torque. Then the correlation between insertion torque and bone quality was tested. One hundred ninety two patients (mean age: 62 ± 11 years) participated with 295 implants. The most common bone type at implant site was type III (47.1%) followed by type II (28.1%). Data analysis indicated that gender, age, and social habits had neither correlation with bone type nor with insertion torque. The insertion torque was 59.29 ± 7.27 Ncm for bone type I, 56.51 ± 1.62 Ncm for bone type II, 46.40 ± 1.60 Ncm for bone type III, 34.84 ± 2.38 Ncm for bone type IV and 5 Ncm for bone type V. Statistically significant correlation was found between bone type and insertion torque. The followed drilling protocol adapts socket under-preparation to the needs of establishing a sufficient primary stability for implant osseointegration.
Statistical analyses to support guidelines for marine avian sampling. Final report
Kinlan, Brian P.; Zipkin, Elise; O'Connell, Allan F.; Caldow, Chris
2012-01-01
distribution to describe counts of a given species in a particular region and season. 4. Using a large database of historical at-sea seabird survey data, we applied this technique to identify appropriate statistical distributions for modeling a variety of species, allowing the distribution to vary by season. For each species and season, we used the selected distribution to calculate and map retrospective statistical power to detect hotspots and coldspots, and to map p-values from Monte Carlo significance tests of hotspots and coldspots, in discrete lease blocks designated by the U.S. Department of Interior, Bureau of Ocean Energy Management (BOEM). 5. Because our definition of hotspots and coldspots does not explicitly include variability over time, we examine the relationship between the temporal scale of sampling and the proportion of variance captured in time series of key environmental correlates of marine bird abundance, as well as available marine bird abundance time series, and use these analyses to develop recommendations for the temporal distribution of sampling to adequately represent both short-term and long-term variability. We conclude by presenting a schematic “decision tree” showing how this power analysis approach would fit in a general framework for avian survey design, and discuss implications of model assumptions and results. We discuss avenues for future development of this work, and recommendations for practical implementation in the context of siting and wildlife assessment for offshore renewable energy development projects.
Data Torturing and the Misuse of Statistical Tools
Abate, Marcey L.
1999-08-16
Statistical concepts, methods, and tools are often used in the implementation of statistical thinking. Unfortunately, statistical tools are all too often misused by not applying them in the context of statistical thinking that focuses on processes, variation, and data. The consequences of this misuse may be "data torturing" or going beyond reasonable interpretation of the facts due to a misunderstanding of the processes creating the data or the misinterpretation of variability in the data. In the hope of averting future misuse and data torturing, examples are provided where the application of common statistical tools, in the absence of statistical thinking, provides deceptive results by not adequately representing the underlying process and variability. For each of the examples, a discussion is provided on how applying the concepts of statistical thinking may have prevented the data torturing. The lessons learned from these examples will provide an increased awareness of the potential for many statistical methods to mislead and a better understanding of how statistical thinking broadens and increases the effectiveness of statistical tools.
Helping Alleviate Statistical Anxiety with Computer Aided Statistical Classes
ERIC Educational Resources Information Center
Stickels, John W.; Dobbs, Rhonda R.
2007-01-01
This study, Helping Alleviate Statistical Anxiety with Computer Aided Statistics Classes, investigated whether undergraduate students' anxiety about statistics changed when statistics was taught using computers compared to the traditional method. Two groups of students were questioned concerning their anxiety about statistics. One group was taught…
[Functional restoration--it depends on an adequate mixture of treatment].
Pfingsten, M
2001-12-01
impairment as well as physical variables (mobility, strength) have limited predictive value. Return to work and pain reduction are much better predicted by length of absence from work, application for pension, and the patients' disability in daily-life activities. In the last five years another important variable of success has been identified: avoidance behavior has been suspected to be a major contributor to the initiation and maintenance of chronic low back pain. The perpetuation of avoidance behavior beyond normal healing time subsequently leads to negative consequences such as "disuse syndrome", which is associated with physical deconditioning, sick role behavior, psychosocial withdrawal and negative affect. Accordingly, fear-avoidance beliefs were strongly related to absenteeism from work due to back pain and were the best predictors of therapy outcome in 300 acute low back pain patients. In a prospective study on 87 patients with chronic low back pain (CLBP) we demonstrated that fear-avoidance beliefs were the strongest predictors of return to work after a functional restoration treatment program. Although nonspecific mechanisms such as emotional disturbance, helplessness, pain anticipation, disability, and job circumstances could be identified as influencing the chronic pain process, we have to remember that long-lasting experience of pain is usually a very individual process in which several conditions may work together in a unique combination. Treatment procedures must consider this variability by focusing on general mechanisms, as well as on individual conditions and deficits. FR treatment strongly depends on behavioral principles that rule the whole therapeutic process: Adequate information is necessary to overcome unhelpful beliefs; information has to be related to the patients' daily experiences and their mental capability to understand them. 
Pacing, goal-setting, graded exposure with exercise quotas and permanent feedback as well as contingent motivation
Suite versus composite statistics
Balsillie, J.H.; Tanner, W.F.
1999-01-01
Suite and composite methodologies, two statistically valid approaches for producing statistical descriptive measures, are investigated for sample groups representing a probability distribution in which each sample is itself a probability distribution. Suite and composite means (first moment measures) are always equivalent. Composite standard deviations (second moment measures) are always larger than suite standard deviations. Suite and composite values for higher moment measures have more complex relationships. Very seldom, however, are they equivalent, and the differences between them are normally statistically significant. Multiple samples are preferable to single samples (including composites) because they permit the investigator to examine sample-to-sample variability. These and other relationships for suite and composite probability distribution analyses are investigated and reported using granulometric data.
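For equal-sized samples, the equal-means and larger-composite-spread relationships can be verified directly. A minimal Python sketch with synthetic numbers, not the paper's granulometric data:

```python
import statistics

# Equal-sized samples, each itself a distribution of measurements
# (synthetic values, purely for illustration)
samples = [
    [2.1, 2.4, 2.2, 2.6, 2.3],
    [3.0, 3.3, 2.9, 3.1, 3.2],
    [2.5, 2.8, 2.6, 2.7, 2.9],
]

# Suite statistics: compute the measure per sample, then average the measures
suite_mean = statistics.mean(statistics.mean(s) for s in samples)
suite_std = statistics.mean(statistics.pstdev(s) for s in samples)

# Composite statistics: pool all observations, then compute the measure once
pooled = [x for s in samples for x in s]
composite_mean = statistics.mean(pooled)
composite_std = statistics.pstdev(pooled)
```

The composite standard deviation exceeds the suite value because pooling adds the sample-to-sample spread of the means on top of the within-sample spread, while the means agree whenever the samples are the same size.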
Candidate Assembly Statistical Evaluation
1998-07-15
The Savannah River Site (SRS) receives aluminum clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels utilizing Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted it is feasible to consider building statistical models that will provide reasonable estimations of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.
Perception in statistical graphics
NASA Astrophysics Data System (ADS)
VanderPlas, Susan Ruth
There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.
NASA Astrophysics Data System (ADS)
Inomata, Akira
1997-03-01
To understand possible physical consequences of quantum deformation, we investigate the statistical behavior of a quon gas. The quon is an object that obeys the minimally deformed commutator (or q-mutator): a a† - q a†a = 1 with -1 ≤ q ≤ 1. Although q = 1 and q = -1 appear to correspond to boson and fermion statistics, respectively, it is not easy to create a gas which unifies the boson gas and the fermion gas. We present a model that is able to interpolate between the two limits. The quon gas shows Bose-Einstein condensation near the boson limit in two dimensions.
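A standard consequence of the stated q-mutator is that the number operator a†a has eigenvalue [n]_q = 1 + q + … + q^(n-1) on the n-quantum state, obtained by iterating [n+1] = 1 + q·[n] from [0] = 0. A minimal numerical check of this textbook recursion (not the paper's interpolating gas model):

```python
def q_bracket(n: int, q: float) -> float:
    """q-deformed number [n]_q: eigenvalue of a†a on the n-quantum
    state, from the recursion [n+1] = 1 + q*[n] with [0] = 0."""
    val = 0.0
    for _ in range(n):
        val = 1.0 + q * val
    return val
```

At q = 1, [n]_1 = n recovers bosonic occupation numbers; at q = -1, [2]_{-1} = 0, so the doubly occupied state has zero norm, reproducing fermionic exclusion.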
Statistical anisotropy in the inflationary universe
Shtanov, Yuri; Pyatkovska, Hanna
2009-07-15
During cosmological inflation, quasiclassical perturbations are continually generated on super-Hubble spatial scales, their power spectrum being determined by the fundamental principles of quantum field theory. By the end of inflation, they serve as primeval seeds for structure formation in the universe. At early stages of inflation, such perturbations break the homogeneity and isotropy of the inflationary background. In the present paper, we perturbatively take into account this quasiclassical background inhomogeneity of the inflationary universe while considering the evolution of small-scale (sub-Hubble) quantum modes. As a result, the power spectrum of primordial perturbations develops a statistical anisotropy, which can subsequently manifest itself in the large-scale structure and cosmic microwave background. The statistically anisotropic contribution to the primordial power spectrum is predicted to have an almost scale-invariant form dominated by a quadrupole. The theoretical expectation of the magnitude of this anisotropy depends on assumptions about the physics in the trans-Planckian region of wave numbers.
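A quadrupole-dominated statistical anisotropy of this kind is conventionally parametrized as a direction-dependent modulation of the isotropic spectrum; the amplitude g(k) and preferred direction n̂ below are the generic placeholders of that parametrization, not the specific result derived in the paper:

```latex
\mathcal{P}(\mathbf{k}) \;=\; \mathcal{P}_0(k)\left[\,1 + g(k)\,(\hat{\mathbf{k}}\cdot\hat{\mathbf{n}})^{2}\right]
```

In this notation, an almost scale-invariant anisotropic contribution corresponds to g(k) being nearly constant in k.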
Statistical characterization of real-world illumination.
Dror, Ron O; Willsky, Alan S; Adelson, Edward H
2004-09-28
Although studies of vision and graphics often assume simple illumination models, real-world illumination is highly complex, with reflected light incident on a surface from almost every direction. One can capture the illumination from every direction at one point photographically using a spherical illumination map. This work illustrates, through analysis of photographically acquired, high dynamic range illumination maps, that real-world illumination possesses a high degree of statistical regularity. The marginal and joint wavelet coefficient distributions and harmonic spectra of illumination maps resemble those documented in the natural image statistics literature. However, illumination maps differ from typical photographs in that illumination maps are statistically nonstationary and may contain localized light sources that dominate their power spectra. Our work provides a foundation for statistical models of real-world illumination, thereby facilitating the understanding of human material perception, the design of robust computer vision systems, and the rendering of realistic computer graphics imagery. PMID:15493972
Statistical criteria for characterizing irradiance time series.
Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.
2010-10-01
We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time, and they are either measured or simulated using models. Simulations of irradiance are often calibrated to, or generated from, statistics for observed irradiance, and the simulations are validated by comparing their output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
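The three statistics examined in the report are not named in this abstract; the sketch below assumes a representative trio (mean, standard deviation, lag-1 autocorrelation) and a relative-tolerance comparison, purely to illustrate the criterion-based validation of a simulated series against a measured one:

```python
import statistics

def lag1_autocorr(x):
    """Lag-1 autocorrelation: a simple measure of the temporal
    persistence (ramp behavior) of an irradiance series."""
    mu = statistics.mean(x)
    num = sum((a - mu) * (b - mu) for a, b in zip(x, x[1:]))
    den = sum((a - mu) ** 2 for a in x)
    return num / den

def series_criteria(x):
    """Assumed summary criteria for an irradiance series (W/m^2)."""
    return {"mean": statistics.mean(x),
            "std": statistics.pstdev(x),
            "lag1": lag1_autocorr(x)}

def compare(measured, simulated, tol=0.1):
    """True per criterion when the simulated value is within a
    relative tolerance of the measured value."""
    m, s = series_criteria(measured), series_criteria(simulated)
    return {k: abs(s[k] - m[k]) <= tol * max(abs(m[k]), 1e-12) for k in m}
```

A simulation passing on mean and standard deviation but failing on lag-1 autocorrelation would, for example, reproduce the energy yield while misrepresenting ramp rates, which is why the criteria must match the intended analysis.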
Options for Affordable Fission Surface Power Systems
NASA Technical Reports Server (NTRS)
Houts, Mike; Gaddis, Steve; Porter, Ron; VanDyke, Melissa; Martin Jim; Godfroy, Tom; Bragg-Sitton, Shannon; Garber, Anne; Pearson, Boise
2006-01-01
Fission surface power systems could provide abundant power anywhere on the surface of the moon or Mars. Locations could include permanently shaded regions on the moon and high latitudes on Mars. To be fully utilized, however, fission surface power systems must be safe, have adequate performance, and be affordable. This paper discusses options for the design and development of such systems.
Options for Affordable Fission Surface Power Systems
Houts, Mike; Gaddis, Steve; Porter, Ron; Van Dyke, Melissa; Martin, Jim; Godfroy, Tom; Bragg-Sitton, Shannon; Garber, Anne; Pearson, Boise
2006-07-01
Fission surface power systems could provide abundant power anywhere on the surface of the moon or Mars. Locations could include permanently shaded regions on the moon and high latitudes on Mars. To be fully utilized, however, fission surface power systems must be safe, have adequate performance, and be affordable. This paper discusses options for the design and development of such systems. (authors)
Statistical anisotropy from inflationary magnetogenesis
NASA Astrophysics Data System (ADS)
Giovannini, Massimo
2016-02-01
Provided the quantum fluctuations are amplified in the presence of a classical gauge field configuration, the resulting curvature perturbations exhibit a mild statistical anisotropy, which should be sufficiently weak not to conflict with current observational data. The curvature power spectra induced by weakly anisotropic initial states are computed here for the first time when the electric and the magnetic gauge couplings evolve at different rates, as happens, for instance, in the relativistic theory of van der Waals interactions. After recovering the results valid for coincident gauge couplings, the constraints imposed by the isotropy and homogeneity of the initial states are discussed. The obtained bounds turn out to be more stringent than naively expected and cannot be ignored when discussing the underlying magnetogenesis scenarios.
Statistical anisotropies in gravitational waves in solid inflation
Akhshik, Mohammad; Emami, Razieh; Firouzjahi, Hassan; Wang, Yi
2014-09-01
Solid inflation can support a long period of anisotropic inflation. We calculate the statistical anisotropies in the scalar and tensor power spectra and their cross-correlation in anisotropic solid inflation. The tensor-scalar cross-correlation can be either positive or negative, and it impacts the statistical anisotropies of the TT and TB spectra in the CMB map more significantly than the tensor self-correlation does. The tensor power spectrum contains potentially comparable contributions from quadrupole and octopole angular patterns, which differs from the power spectra of the scalar, the cross-correlation, and the scalar bispectrum, where the quadrupole-type statistical anisotropy dominates over the octopole.
Valley Fever (Coccidioidomycosis) Statistics
... military trainees, 6 , 7 archeological workers, 8 - 11 solar farm workers, 12 and in people exposed to ... McDowell A et al. Coccidioidomycosis among Workers Constructing Solar Power Farms, California, USA, 2011-2014. Emerg Infect ...
Statistical insight: a review.
Vardell, Emily; Garcia-Barcena, Yanira
2012-01-01
Statistical Insight is a database that offers the ability to search across multiple sources of data, including the federal government, private organizations, research centers, and international intergovernmental organizations in one search. Two sample searches on the same topic, a basic and an advanced, were conducted to evaluate the database.