42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 3 2013-10-01 2013-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...
42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 3 2014-10-01 2014-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...
42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 3 2012-10-01 2012-10-01 false Adequate financial records, statistical data, and....568 Adequate financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination...
Explorations in Statistics: Power
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2010-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four…
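The definition of power given above, the probability of rejecting the null hypothesis when it is false, can be made concrete with a small Monte Carlo sketch. This is an illustrative example, not code from the article; the function name and parameter choices are ours.

```python
import numpy as np
from scipy import stats

def simulated_power(effect_size, n, alpha=0.05, n_sims=5000, seed=0):
    """Estimate the power of a two-sample t-test by simulation.

    Power is the fraction of simulated experiments in which H0 (equal
    means) is rejected when the true standardized mean difference is
    `effect_size`.
    """
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sims):
        a = rng.normal(0.0, 1.0, n)          # control group
        b = rng.normal(effect_size, 1.0, n)  # treatment group
        _, p = stats.ttest_ind(a, b)
        if p < alpha:
            rejections += 1
    return rejections / n_sims

# A medium effect (d = 0.5) with n = 64 per group gives roughly 80% power.
print(simulated_power(0.5, 64))
```

Setting `effect_size=0.0` recovers the Type I error rate: the test then rejects in about 5% of simulations, as the alpha level dictates.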
42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 3 2011-10-01 2011-10-01 false Adequate financial records, statistical data, and... financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination of costs payable by...
42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 3 2010-10-01 2010-10-01 false Adequate financial records, statistical data, and... financial records, statistical data, and cost finding. (a) Maintenance of records. (1) An HMO or CMP must maintain sufficient financial records and statistical data for proper determination of costs payable by...
Achieving statistical power through research design sensitivity.
Beck, C T
1994-11-01
The challenge for nurse researchers is to design their intervention studies with sufficient sensitivity to detect the treatment effects they are investigating. In order to meet this challenge, researchers must understand the factors that influence statistical power. Underpowered studies can result in a majority of null results in a research area when, in fact, the interventions are effective. The sensitivity of a research design is not a function of just one element of the design but of the entire research design: its plan, implementation and statistical analysis. When discussing factors that can increase a research design's statistical power, attention is most often focused on increasing sample size. This paper addresses a variety of factors and techniques, other than increasing sample size, that nurse researchers can use to enhance the sensitivity of a research design so that it can attain adequate power.
Tan, Ming T; Liu, Jian-ping; Lao, Lixing
2012-08-01
Recently, proper use of statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis methods have been widely used in acupuncture and TCM clinical trials, while between-group quantitative analysis of clinical symptom scores is commonly used in the West. The evidence for and against these analytical differences was discussed based on data from RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis, while qualitative analysis can serve as a secondary criterion. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.
Statistical Power in Meta-Analysis
ERIC Educational Resources Information Center
Liu, Jin
2015-01-01
Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation for the two-sample mean difference test under different situations: (1) the discrepancy between the analytical power and…
Statistical Power Analysis of Rehabilitation Counseling Research.
ERIC Educational Resources Information Center
Kosciulek, John F.; Szymanski, Edna Mora
1993-01-01
Provided initial assessment of the statistical power of rehabilitation counseling research published in selected rehabilitation journals. From 5 relevant journals, found 32 articles that contained statistical tests that could be power analyzed. Findings indicated that rehabilitation counselor researchers had little chance of finding small…
Statistical Power of Randomization Tests Used with Multiple-Baseline Designs.
ERIC Educational Resources Information Center
Ferron, John; Sentovich, Chris
2002-01-01
Estimated statistical power for three randomization tests used with multiple-baseline designs using Monte Carlo methods. For an effect size of 0.5, none of the tests provided an adequate level of power, and for an effect size of 1.0, power was adequate for the Koehler-Levin test and the Marascuilo-Busk test only when the series length was long and…
Evaluating and Reporting Statistical Power in Counseling Research
ERIC Educational Resources Information Center
Balkin, Richard S.; Sheperis, Carl J.
2011-01-01
Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…
Practical Uses of Statistical Power in Business Research Studies.
ERIC Educational Resources Information Center
Markowski, Edward P.; Markowski, Carol A.
1999-01-01
Proposes the use of statistical power subsequent to the results of hypothesis testing in business research. Describes how posttest use of power might be integrated into business statistics courses. (SK)
Toward "Constructing" the Concept of Statistical Power: An Optical Analogy.
ERIC Educational Resources Information Center
Rogers, Bruce G.
This paper presents a visual analogy that may be used by instructors to teach the concept of statistical power in statistical courses. Statistical power is mathematically defined as the probability of rejecting a null hypothesis when that null is false, or, equivalently, the probability of detecting a relationship when it exists. The analogy…
Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power
ERIC Educational Resources Information Center
Miciak, Jeremy; Taylor, W. Pat; Stuebing, Karla K.; Fletcher, Jack M.; Vaughn, Sharon
2016-01-01
An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated…
The Importance of Teaching Power in Statistical Hypothesis Testing
ERIC Educational Resources Information Center
Olinsky, Alan; Schumacher, Phyllis; Quinn, John
2012-01-01
In this paper, we discuss the importance of teaching power considerations in statistical hypothesis testing. Statistical power analysis determines the ability of a study to detect a meaningful effect size, where the effect size is the difference between the hypothesized value of the population parameter under the null hypothesis and the true value…
Study on the statistical characteristics of solar power
NASA Astrophysics Data System (ADS)
Liu, Jun
2017-01-01
Solar power in China has grown rapidly in recent years. Solar power output varies because of cloud and dust conditions, a pattern of variability quite different from that of wind. A sound way to evaluate the statistical characteristics of solar power is important for the analysis of power system planning and operation. In this study, a multi-scale spatial and temporal framework of evaluation indices was established to describe the variation of solar power's own natural features and the interaction between solar generation, load, and the grid. Finally, we present a case study on variation, comparison, penetration, etc.
Return time statistic of wind power ramp events
NASA Astrophysics Data System (ADS)
Calif, Rudy; Schmitt, François G.
2015-04-01
Detection and forecasting of wind power ramp events is a critical issue for managing the power generated by a wind turbine or a cluster of wind turbines. Wind power ramp events occur suddenly, with large changes (increases or decreases) in wind power output. In this work, the statistics and dynamics of wind power ramp events are examined. To that end, we analyze several datasets of wind power output with different sampling rates and durations, delivered by five wind farms and two single turbines located at different geographic sites. From these datasets, we extract the return time series τ_r of wind power ramp events, i.e., the time between two successive ramps above a given threshold Δp. The return time statistic is investigated by plotting the complementary cumulative distribution C(τ_r) in log-log representation. Using a robust method developed by Clauset et al., combining maximum-likelihood fitting with goodness-of-fit tests based on the Kolmogorov-Smirnov statistic, we show a scaling behavior of the return time statistic of the form C(τ_r) ~ k τ_r^(-α), where k is a positive constant and the exponent α is called the tail exponent of the distribution. In this study, the value of α ranges from 1.68 to 2.20. This result provides potentially useful information for estimating the risk of wind power generation from the return time series. Clauset A, Shalizi CR, Newman MEJ. Power-law distributions in empirical data. SIAM Review 2009;51(4):661-703.
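The two computational steps the abstract describes, extracting return times between threshold-crossing ramps and estimating the tail exponent α by maximum likelihood in the style of Clauset et al., can be sketched as follows. The function names and the synthetic threshold are illustrative; the actual wind-farm datasets are not public.

```python
import numpy as np

def return_times(power, delta_p):
    """Times (in samples) between successive ramp events, where a ramp
    is a step-to-step change in power output of magnitude >= delta_p."""
    ramps = np.flatnonzero(np.abs(np.diff(power)) >= delta_p)
    return np.diff(ramps)

def tail_exponent(tau, tau_min):
    """Maximum-likelihood (Hill-type) estimate of the CCDF tail exponent
    alpha for a continuous power law C(tau) ~ tau**(-alpha), tau >= tau_min,
    following Clauset, Shalizi and Newman (2009); the pdf exponent is
    alpha + 1."""
    tau = np.asarray(tau, dtype=float)
    tau = tau[tau >= tau_min]
    return len(tau) / np.log(tau / tau_min).sum()
```

For example, drawing synthetic return times from a power law with CCDF exponent 2 and feeding them to `tail_exponent` recovers a value close to 2, inside the 1.68 to 2.20 range reported in the abstract.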
Power Curve Modeling in Complex Terrain Using Statistical Models
NASA Astrophysics Data System (ADS)
Bulaevskaya, V.; Wharton, S.; Clifton, A.; Qualley, G.; Miller, W.
2014-12-01
Traditional power output curves typically model power only as a function of the wind speed at the turbine hub height. While the latter is an essential predictor of power output, wind speed information in other parts of the vertical profile, as well as additional atmospheric variables, are also important determinants of power. The goal of this work was to determine the gain in predictive ability afforded by adding wind speed information at other heights, as well as other atmospheric variables, to the power prediction model. Using data from a wind farm in moderately complex terrain in the Altamont Pass region of California, we trained three statistical models (a neural network, a random forest, and a Gaussian process model) to predict power output from various sets of the aforementioned predictors. The comparison of these predictions to the observed power data revealed that considerable improvements in prediction accuracy can be achieved both through the addition of predictors other than the hub-height wind speed and through the use of statistical models. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344 and was funded by the Wind Uncertainty Quantification Laboratory Directed Research and Development Project at LLNL under project tracking code 12-ERD-069.
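The core comparison in this abstract, a hub-height-only power curve versus a statistical model with extra atmospheric predictors, can be sketched on synthetic data. Everything below is illustrative: the predictor names, the toy cubic power response, and the use of scikit-learn's random forest stand in for the (unavailable) Altamont Pass data and the paper's actual models.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 2000
ws_hub = rng.uniform(3.0, 15.0, n)   # hub-height wind speed (m/s)
shear = rng.normal(0.2, 0.1, n)      # rotor-disk wind shear exponent
ti = rng.uniform(0.05, 0.25, n)      # turbulence intensity
# Toy power response: cubic in wind speed up to rated, degraded by turbulence.
power = np.clip(ws_hub, 0.0, 12.0) ** 3 * (1.0 - ti) + rng.normal(0.0, 30.0, n)

X_full = np.column_stack([ws_hub, shear, ti])
X_hub = ws_hub.reshape(-1, 1)
train, test = slice(0, 1500), slice(1500, None)

rf_full = RandomForestRegressor(n_estimators=100, random_state=0)
rf_full.fit(X_full[train], power[train])
rf_hub = RandomForestRegressor(n_estimators=100, random_state=0)
rf_hub.fit(X_hub[train], power[train])

print("hub-only R^2:", rf_hub.score(X_hub[test], power[test]))
print("full R^2:    ", rf_full.score(X_full[test], power[test]))
```

On this synthetic data the full model scores a visibly higher held-out R^2 than the hub-only model, mirroring the paper's qualitative finding that predictors beyond hub-height wind speed improve accuracy.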
Robust Statistical Detection of Power-Law Cross-Correlation
NASA Astrophysics Data System (ADS)
Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert
2016-06-01
We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.
Statistical analyses support power law distributions found in neuronal avalanches.
Klaus, Andreas; Yu, Shan; Plenz, Dietmar
2011-01-01
The size distribution of neuronal avalanches in cortical networks has been reported to follow a power law distribution with exponent close to -1.5, which is a reflection of long-range spatial correlations in spontaneous neuronal activity. However, identifying power law scaling in empirical data can be difficult and sometimes controversial. In the present study, we tested the power law hypothesis for neuronal avalanches by using more stringent statistical analyses. In particular, we performed the following steps: (i) analysis of finite-size scaling to identify scale-free dynamics in neuronal avalanches, (ii) model parameter estimation to determine the specific exponent of the power law, and (iii) comparison of the power law to alternative model distributions. Consistent with critical state dynamics, avalanche size distributions exhibited robust scaling behavior in which the maximum avalanche size was limited only by the spatial extent of sampling ("finite size" effect). This scale-free dynamics suggests the power law as a model for the distribution of avalanche sizes. Using both the Kolmogorov-Smirnov statistic and a maximum likelihood approach, we found the slope to be close to -1.5, which is in line with previous reports. Finally, the power law model for neuronal avalanches was compared to the exponential and to various heavy-tail distributions based on the Kolmogorov-Smirnov distance and by using a log-likelihood ratio test. Both the power law distribution with and without an exponential cut-off provided significantly better fits to the cluster size distributions in neuronal avalanches than the exponential, the lognormal and the gamma distribution. In summary, our findings strongly support the power law scaling in neuronal avalanches, providing further evidence for critical state dynamics in superficial layers of cortex.
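The Kolmogorov-Smirnov comparison mentioned in steps (ii) and (iii) measures how far the empirical distribution sits from a fitted power law. A minimal sketch of that distance computation for a continuous power law is below; it is our illustration (with a one-sided empirical CDF for brevity), not the authors' code.

```python
import numpy as np

def ks_distance_powerlaw(x, xmin, alpha):
    """Kolmogorov-Smirnov distance between the data and a fitted continuous
    power law with density p(x) proportional to x**(-alpha) for x >= xmin
    (alpha > 1), whose CDF is 1 - (x/xmin)**(1 - alpha)."""
    x = np.sort(np.asarray(x, dtype=float))
    x = x[x >= xmin]
    n = len(x)
    cdf_model = 1.0 - (x / xmin) ** (1.0 - alpha)  # model CDF
    cdf_emp = np.arange(1, n + 1) / n              # empirical CDF
    return np.abs(cdf_emp - cdf_model).max()
```

Data actually drawn from the fitted power law yields a small distance (shrinking roughly as 1/sqrt(n)), whereas a mis-specified exponent yields a distance that stays large regardless of sample size; the same distance can then be compared across candidate distributions, as the abstract describes.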
Statistical analysis of cascading failures in power grids
Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin
2010-12-01
We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators and lines. Our model is quasi-static in the causal, discrete-time and sequential resolution of individual failures. The model, in its simplest realization based on the direct current (DC) description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between the average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.
The Role of Atmospheric Measurements in Wind Power Statistical Models
NASA Astrophysics Data System (ADS)
Wharton, S.; Bulaevskaya, V.; Irons, Z.; Newman, J. F.; Clifton, A.
2015-12-01
The simplest wind power generation curves model power only as a function of the wind speed at turbine hub-height. While the latter is an essential predictor of power output, it is widely accepted that wind speed information in other parts of the vertical profile, as well as additional atmospheric variables including atmospheric stability, wind veer, and hub-height turbulence are also important factors. The goal of this work is to determine the gain in predictive ability afforded by adding additional atmospheric measurements to the power prediction model. In particular, we are interested in quantifying any gain in predictive ability afforded by measurements taken from a laser detection and ranging (lidar) instrument, as lidar provides high spatial and temporal resolution measurements of wind speed and direction at 10 or more levels throughout the rotor-disk and at heights well above. Co-located lidar and meteorological tower data as well as SCADA power data from a wind farm in Northern Oklahoma will be used to train a set of statistical models. In practice, most wind farms continue to rely on atmospheric measurements taken from less expensive, in situ instruments mounted on meteorological towers to assess turbine power response to a changing atmospheric environment. Here, we compare a large suite of atmospheric variables derived from tower measurements to those taken from lidar to determine if remote sensing devices add any competitive advantage over tower measurements alone to predict turbine power response.
Joyce, Karen E; Hayasaka, Satoru
2012-10-01
Although there are a number of statistical software tools for voxel-based massively univariate analysis of neuroimaging data, such as fMRI (functional MRI), PET (positron emission tomography), and VBM (voxel-based morphometry), very few software tools exist for power and sample size calculation for neuroimaging studies. Unlike typical biomedical studies, outcomes from neuroimaging studies are 3D images of correlated voxels, requiring a correction for massive multiple comparisons. Thus, a specialized power calculation tool is needed for planning neuroimaging studies. To facilitate this process, we developed a software tool specifically designed for neuroimaging data. The software tool, called PowerMap, implements theoretical power calculation algorithms based on non-central random field theory. It can also calculate power for statistical analyses with FDR (false discovery rate) corrections. This GUI (graphical user interface)-based tool enables neuroimaging researchers without advanced knowledge in imaging statistics to calculate power and sample size in the form of 3D images. In this paper, we provide an overview of the statistical framework behind the PowerMap tool. Three worked examples are also provided, a regression analysis, an ANOVA (analysis of variance), and a two-sample T-test, in order to demonstrate the study planning process with PowerMap. We envision that PowerMap will be a great aide for future neuroimaging research.
Statistics of power input into a vibrated granular system
NASA Astrophysics Data System (ADS)
Wang, Hongqiang; Feitosa, Klebert; Menon, Narayanan
2004-03-01
Motivated by the recent fluctuation theorem of Gallavotti and Cohen, we present a numerical and experimental exploration of the fluctuations in power input and energy dissipation in a sub-volume of a vibrated granular system. Both experimental and simulation results are in accord with the fluctuation relation, even for short-time fluctuations. In the simulations, we are also able to compare power fluctuations in rotational and translational modes; we discuss the effective temperatures arising from this fluctuation relation. Finally, in the simulations, we also study the dependence of our results on the size of the sub-volume considered in the system. Supported by NSF DMR 9878433, DMR 0216719.
Development and testing of improved statistical wind power forecasting methods.
Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J.
2011-12-06
Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios
Defects and statistical degradation analysis of photovoltaic power plants
NASA Astrophysics Data System (ADS)
Sundarajan, Prasanna
As photovoltaic (PV) power plants age in the field, the PV modules degrade and develop visible and invisible defects. A defect and statistical degradation rate analysis of PV power plants is presented in this two-part thesis. The first part of the thesis deals with the defect analysis and the second part deals with the statistical degradation rate analysis. In the first part, a detailed analysis of the performance and financial risk related to each defect found in multiple PV power plants across various climatic regions of the USA is presented by assigning a risk priority number (RPN). The RPN for all the defects in each PV plant is determined based on two databases: a degradation rate database and a defect rate database. In this analysis it is determined that the RPN for each plant is dictated by the technology type (crystalline silicon or thin-film), climate and age. PV modules aged between 3 and 19 years in four different climates of hot-dry, hot-humid, cold-dry and temperate are investigated in this study. In the second part, a statistical degradation analysis is performed to determine whether the degradation rates are linear for crystalline silicon power plants exposed in a hot-dry climate. This linearity degradation analysis is performed using data obtained through two methods: the current-voltage method and the metered kWh method. For the current-voltage method, the annual power degradation data of hundreds of individual modules in six crystalline silicon power plants of different ages is used. For the metered kWh method, a residual plot analysis using Winters' statistical method is performed for two crystalline silicon plants of different ages. The metered kWh data typically consists of signal and noise components. Smoothers remove the noise component from the data by taking the average of the current and the previous observations. Once this is done, a residual plot analysis of the error component is
Metrology optical power budgeting in SIM using statistical analysis techniques
NASA Astrophysics Data System (ADS)
Kuan, Gary M.
2008-07-01
The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be budgeted. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques
NASA Technical Reports Server (NTRS)
Kuan, Gary M
2008-01-01
The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be budgeted. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
Robust Statistical Detection of Power-Law Cross-Correlation
Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert
2016-01-01
We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram. PMID:27250630
Error, Power, and Blind Sentinels: The Statistics of Seagrass Monitoring
Schultz, Stewart T.; Kruschel, Claudia; Bakran-Petricioli, Tatjana; Petricioli, Donat
2015-01-01
We derive statistical properties of standard methods for monitoring of habitat cover worldwide, and criticize them in the context of mandated seagrass monitoring programs, as exemplified by Posidonia oceanica in the Mediterranean Sea. We report the novel result that cartographic methods with non-trivial classification errors are generally incapable of reliably detecting habitat cover losses less than about 30 to 50%, and the field labor required to increase their precision can be orders of magnitude higher than that required to estimate habitat loss directly in a field campaign. We derive a universal utility threshold of classification error in habitat maps that represents the minimum habitat map accuracy above which direct methods are superior. Widespread government reliance on blind-sentinel methods for monitoring seafloor can obscure the gradual and currently ongoing losses of benthic resources until the time has long passed for meaningful management intervention. We find two classes of methods with very high statistical power for detecting small habitat cover losses: 1) fixed-plot direct methods, which are over 100 times as efficient as direct random-plot methods in a variable habitat mosaic; and 2) remote methods with very low classification error such as geospatial underwater videography, which is an emerging, low-cost, non-destructive method for documenting small changes at millimeter visual resolution. General adoption of these methods and their further development will require a fundamental cultural change in conservation and management bodies towards the recognition and promotion of requirements of minimal statistical power and precision in the development of international goals for monitoring these valuable resources and the ecological services they provide. PMID:26367863
Toward improved statistical treatments of wind power forecast errors
NASA Astrophysics Data System (ADS)
Hart, E.; Jacobson, M. Z.
2011-12-01
The ability of renewable resources to reliably supply electric power demand is of considerable interest in the context of growing renewable portfolio standards and the potential for future carbon markets. Toward this end, a number of probabilistic models have been applied to the problem of grid integration of intermittent renewables, such as wind power. Most of these models rely on simple Markov or autoregressive models of wind forecast errors. While these models generally capture the bulk statistics of wind forecast errors, they often fail to reproduce accurate ramp rate distributions and do not accurately describe extreme forecast error events, both of which are of considerable interest to those seeking to comment on system reliability. The problem often lies in characterizing and reproducing not only the magnitude of wind forecast errors, but also the timing or phase errors (i.e., when a front passes over a wind farm). Here we compare time series wind power data produced using different forecast error models to determine the best approach for capturing errors in both magnitude and phase. Additionally, new metrics are presented to characterize forecast quality with respect to both considerations.
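As a point of reference, the baseline these authors compare against can be sketched as a stationary AR(1) error model. The function names below are hypothetical, and a real study would fit `phi` and `sigma` to forecast data; the point is only that a simple autoregressive error model fixes the marginal variance but implies a specific, often too-thin, ramp-rate distribution:

```python
import random
from math import sqrt

def ar1_errors(phi, sigma, n, seed=0):
    """Generate a stationary AR(1) wind-forecast-error series with
    marginal standard deviation sigma (a common baseline model)."""
    rng = random.Random(seed)
    innov = sigma * sqrt(1.0 - phi * phi)   # innovation SD keeps Var = sigma^2
    e = [rng.gauss(0.0, sigma)]             # start from the stationary law
    for _ in range(n - 1):
        e.append(phi * e[-1] + rng.gauss(0.0, innov))
    return e

def ramp_rates(series):
    # one-step changes; their spread is what simple AR models often misstate
    return [b - a for a, b in zip(series, series[1:])]
```

Under this model the ramp-rate standard deviation is pinned at sigma * sqrt(2 * (1 - phi)), which is exactly the kind of rigid coupling between bulk statistics and ramp statistics that motivates richer error models.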
Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.
Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg
2009-11-01
G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
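As a rough illustration of the kind of computation G*Power performs for correlation tests, a normal-approximation power function based on the Fisher z transform can be sketched in a few lines of Python. The function names are hypothetical and the approximation ignores small-sample bias, which G*Power's exact routines handle:

```python
from math import erf, sqrt, atanh

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def norm_ppf(p):
    """Inverse normal CDF by bisection (plenty accurate for a sketch)."""
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def correlation_power(rho, n, alpha=0.05):
    """Approximate power of the two-sided test of H0: rho = 0 for a
    Pearson correlation: atanh(r) is roughly normal with mean
    atanh(rho) and standard deviation 1 / sqrt(n - 3)."""
    z_crit = norm_ppf(1.0 - alpha / 2.0)
    shift = sqrt(n - 3.0) * atanh(rho)
    return norm_cdf(shift - z_crit) + norm_cdf(-shift - z_crit)
```

For rho = 0.3 and n = 84 this gives roughly 0.80, in line with the textbook benchmark that exact power software reproduces.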
Statistical effects in high-power microwave beam propagation
NASA Astrophysics Data System (ADS)
Alvarez, R. A.; Bolton, P. R.; Sieger, G. E.
1988-06-01
At very high power levels pulsed microwave beams can generate air-breakdown plasmas which may limit the fluence that the beam can transport through the atmosphere. Conventional air breakdown is an avalanche process wherein free electrons, driven by the microwave fields, produce ionization through collisions with air molecules. Propagation of a beam is affected when the plasma electron density approaches the critical density for the particular microwave frequency. The rate of growth of the plasma depends on the competition between the ionization probability and electron loss processes such as attachment and diffusion. The physics of the avalanche process is reasonably well understood, and fluence limits can be fairly accurately predicted, so long as there are free seed electrons to initiate the breakdown. At sea level and low altitudes, seed electrons are, in fact, expected to be fairly rare, and air breakdown, and the consequences for beam propagation, must be treated as a statistical problem; the effective fluence limit may be much greater than would be predicted on the basis of conventional breakdown thresholds. The statistical effects are currently being investigated.
Statistical learning: a powerful mechanism that operates by mere exposure.
Aslin, Richard N
2017-01-01
How do infants learn so rapidly and with little apparent effort? In 1996, Saffran, Aslin, and Newport reported that 8-month-old human infants could learn the underlying temporal structure of a stream of speech syllables after only 2 min of passive listening. This demonstration of what was called statistical learning, involving no instruction, reinforcement, or feedback, led to dozens of confirmations of this powerful mechanism of implicit learning in a variety of modalities, domains, and species. These findings reveal that infants are not nearly as dependent on explicit forms of instruction as we might have assumed from studies of learning in which children or adults are taught facts such as math or problem solving skills. Instead, at least in some domains, infants soak up the information around them by mere exposure. Learning and development in these domains thus appear to occur automatically and with little active involvement by an instructor (parent or teacher). The details of this statistical learning mechanism are discussed, including how exposure to specific types of information can, under some circumstances, generalize to never-before-observed information, thereby enabling transfer of learning. WIREs Cogn Sci 2017, 8:e1373. doi: 10.1002/wcs.1373
The Power of Statistical Tests for Moderators in Meta-Analysis
ERIC Educational Resources Information Center
Hedges, Larry V.; Pigott, Therese D.
2004-01-01
Calculation of the statistical power of statistical tests is important in planning and interpreting the results of research studies, including meta-analyses. It is particularly important in moderator analyses in meta-analysis, which are often used as sensitivity analyses to rule out moderator effects but also may have low statistical power. This…
ERIC Educational Resources Information Center
Spybrook, Jessaca
2008-01-01
This study examines the reporting of power analyses in the group randomized trials funded by the Institute of Education Sciences from 2002 to 2006. A detailed power analysis provides critical information that allows reviewers to (a) replicate the power analysis and (b) assess whether the parameters used in the power analysis are reasonable.…
Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events
NASA Astrophysics Data System (ADS)
DeChant, C. M.; Moradkhani, H.
2014-12-01
Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
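The Poisson-binomial computation at the heart of this verification idea is easy to sketch. The following Python (illustrative names; the exact test construction in the study may differ in detail) builds the exact pmf of the number of observed events given per-forecast probabilities, from which a tail probability for a reliability test can be read off:

```python
def poisson_binomial_pmf(probs):
    """Exact pmf of the number of successes among independent Bernoulli
    trials with unequal probabilities, by dynamic programming."""
    pmf = [1.0]
    for p in probs:
        nxt = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            nxt[k] += mass * (1.0 - p)      # trial fails: count unchanged
            nxt[k + 1] += mass * p          # trial succeeds: count + 1
        pmf = nxt
    return pmf

def reliability_p_value(probs, observed):
    """Exact-test style p-value: total mass of outcomes no more likely
    than the observed event count, if the forecasts are reliable."""
    pmf = poisson_binomial_pmf(probs)
    point = pmf[observed]
    return sum(m for m in pmf if m <= point + 1e-12)
```

Because the pmf is exact rather than a Brier-style summary, a small observed/expected discrepancy is judged against the correct sampling distribution, which is the source of the extra statistical power the abstract describes.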
Statistical Properties of Maximum Likelihood Estimators of Power Law Spectra Information
NASA Technical Reports Server (NTRS)
Howell, L. W., Jr.
2003-01-01
A simple power law model consisting of a single spectral index, sigma(sub 1), is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10(exp 13) eV, with a transition at the knee energy, E(sub k), to a steeper spectral index sigma(sub 2) greater than sigma(sub 1) above E(sub k). The maximum likelihood (ML) procedure was developed for estimating the single parameter sigma(sub 1) of a simple power law energy spectrum and generalized to estimate the three spectral parameters of the broken power law energy spectrum from simulated detector responses and real cosmic-ray data. The statistical properties of the ML estimator were investigated and shown to have the three desirable properties: (P1) consistency (asymptotically unbiased), (P2) efficiency (asymptotically attains the Cramer-Rao minimum variance bound), and (P3) asymptotically normally distributed, under a wide range of potential detector response functions. Attainment of these properties necessarily implies that the ML estimation procedure provides the best unbiased estimator possible. While simulation studies can easily determine if a given estimation procedure provides an unbiased estimate of the spectra information, and whether or not the estimator is approximately normally distributed, attainment of the Cramer-Rao bound (CRB) can only be ascertained by calculating the CRB for an assumed energy spectrum-detector response function combination, which can be quite formidable in practice. However, the effort in calculating the CRB is very worthwhile because it provides the necessary means to compare the efficiency of competing estimation techniques and, furthermore, provides a stopping rule in the search for the best unbiased estimator. Consequently, the CRB for both the simple and broken power law energy spectra are derived herein and the conditions under which they are attained in practice are investigated.
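For the single-index case the ML estimator has a closed form. A minimal sketch, assuming a continuous simple power law above a threshold energy and no detector response (the paper's detector-response generalization is not reproduced here; names are illustrative):

```python
import random
from math import log

def ml_power_law_index(energies, e_min):
    """Closed-form ML estimate of alpha for f(E) proportional to
    E**(-alpha) on E >= e_min (continuous single-index case)."""
    logs = [log(e / e_min) for e in energies if e >= e_min]
    return 1.0 + len(logs) / sum(logs)

def sample_power_law(alpha, e_min, n, rng):
    """Inverse-transform samples from the same power law, to check the
    estimator's consistency in simulation."""
    return [e_min * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
            for _ in range(n)]
```

The estimator's standard error scales as (alpha - 1) / sqrt(n), which is the Cramer-Rao bound for this model, so a simulation check against that rate is a quick way to see the efficiency property the abstract describes.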
Statistical optics applied to high-power glass lasers
Manes, K.R.; Simmons, W.W.
1985-04-01
Multiterawatt laser systems, particularly the Novette system at the Lawrence Livermore National Laboratory, are simulated using statistical-optics techniques. The results are compared with experimental observations.
An Examination of Statistical Power in Multigroup Dynamic Structural Equation Models
ERIC Educational Resources Information Center
Prindle, John J.; McArdle, John J.
2012-01-01
This study used statistical simulation to calculate differential statistical power in dynamic structural equation models with groups (as in McArdle & Prindle, 2008). Patterns of between-group differences were simulated to provide insight into how model parameters influence power approximations. Chi-square and root mean square error of…
Enrichment of statistical power for genome-wide association studies
Technology Transfer Automated Retrieval System (TEKTRAN)
The inheritance of most human diseases and agriculturally important traits is controlled by many genes with small effects. Identifying these genes, while simultaneously controlling false positives, is challenging. Among available statistical methods, the mixed linear model (MLM) has been the most fl...
Power-law statistics for avalanches in a martensitic transformation.
Ahluwalia, R; Ananthakrishna, G
2001-04-30
We devise a two-dimensional model that mimics the recently observed power-law distributions for the amplitudes and durations of the acoustic emission signals observed during martensitic transformation [Vives et al., Phys. Rev. Lett. 72, 1694 (1994)]. We include a threshold mechanism, long-range interaction between the transformed domains, inertial effects, and dissipation arising due to the motion of the interface. The model exhibits thermal hysteresis and, more importantly, it shows that the energy is released in the form of avalanches with power-law distributions for their amplitudes and durations. Computer simulations also reveal morphological features similar to those observed in real systems.
Automated FMV image quality assessment based on power spectrum statistics
NASA Astrophysics Data System (ADS)
Kalukin, Andrew
2015-05-01
Factors that degrade image quality in video and other sensor collections, such as noise, blurring, and poor resolution, also affect the spatial power spectrum of imagery. Prior research in human vision and image science from the last few decades has shown that the image power spectrum can be useful for assessing the quality of static images. The research in this article explores the possibility of using the image power spectrum to automatically evaluate full-motion video (FMV) imagery frame by frame. This procedure makes it possible to identify anomalous images and scene changes, and to keep track of gradual changes in quality as collection progresses. This article will describe a method to apply power spectral image quality metrics for images subjected to simulated blurring, blocking, and noise. As a preliminary test on videos from multiple sources, image quality measurements for image frames from 185 videos are compared to analyst ratings based on ground sampling distance. The goal of the research is to develop an automated system for tracking image quality during real-time collection, and to assign ratings to video clips for long-term storage, calibrated to standards such as the National Imagery Interpretability Rating System (NIIRS).
Statistical Power of Psychological Research: What Have We Gained in 20 Years?
ERIC Educational Resources Information Center
Rossi, Joseph S.
1990-01-01
Calculated power for 6,155 statistical tests in 221 journal articles published in 1982 volumes of "Journal of Abnormal Psychology,""Journal of Consulting and Clinical Psychology," and "Journal of Personality and Social Psychology." Power to detect small, medium, and large effects was .17, .57, and .83, respectively. Concluded that power of…
Heidel, R. Eric
2016-01-01
Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power. PMID:27073717
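As one concrete instance of such an a priori calculation, the normal-approximation sample size for comparing two group means can be sketched as follows. Function names are hypothetical, and real planning should use exact noncentral-t routines (as in dedicated power software) rather than this approximation:

```python
from math import erf, sqrt, ceil

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def norm_ppf(p):
    """Inverse normal CDF by bisection (adequate for a sketch)."""
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def n_per_group(d, alpha=0.05, power=0.80):
    """Per-group n for a two-sided, two-sample comparison of means with
    standardized effect size d = (mu1 - mu2) / sigma (normal approx.)."""
    z_a = norm_ppf(1.0 - alpha / 2.0)
    z_b = norm_ppf(power)
    return ceil(2.0 * ((z_a + z_b) / d) ** 2)
```

For a medium effect (d = 0.5) this yields 63 per group; the exact t-based answer is 64, so the normal approximation errs slightly small. The formula also makes the isomorphism point above tangible: halving the effect size quadruples the required sample size.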
Statistical Properties of Maximum Likelihood Estimators of Power Law Spectra Information
NASA Technical Reports Server (NTRS)
Howell, L. W.
2002-01-01
A simple power law model consisting of a single spectral index, alpha(sub 1), is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10(exp 13) eV, with a transition at the knee energy, E(sub k), to a steeper spectral index alpha(sub 2) greater than alpha(sub 1) above E(sub k). The maximum likelihood (ML) procedure was developed for estimating the single parameter alpha(sub 1) of a simple power law energy spectrum and generalized to estimate the three spectral parameters of the broken power law energy spectrum from simulated detector responses and real cosmic-ray data. The statistical properties of the ML estimator were investigated and shown to have the three desirable properties: (P1) consistency (asymptotically unbiased), (P2) efficiency (asymptotically attains the Cramer-Rao minimum variance bound), and (P3) asymptotically normally distributed, under a wide range of potential detector response functions. Attainment of these properties necessarily implies that the ML estimation procedure provides the best unbiased estimator possible. While simulation studies can easily determine if a given estimation procedure provides an unbiased estimate of the spectra information, and whether or not the estimator is approximately normally distributed, attainment of the Cramer-Rao bound (CRB) can only be ascertained by calculating the CRB for an assumed energy spectrum-detector response function combination, which can be quite formidable in practice. However, the effort in calculating the CRB is very worthwhile because it provides the necessary means to compare the efficiency of competing estimation techniques and, furthermore, provides a stopping rule in the search for the best unbiased estimator. Consequently, the CRB for both the simple and broken power law energy spectra are derived herein and the conditions under which they are attained in practice are investigated. The ML technique is then extended to estimate spectra information from
2010-01-01
Background A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the source. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. Conclusions The GAM permutation testing methods
Statistical Power Flow Analysis of an Imperfect Ribbed Cylinder
NASA Astrophysics Data System (ADS)
Blakemore, M.; Woodhouse, J.; Hardie, D. J. W.
1999-05-01
Prediction of the noise transmitted from machinery and flow sources on a submarine to the sonar arrays poses a complex problem. Vibrations in the pressure hull provide the main transmission mechanism. The pressure hull is characterised by a very large number of modes over the frequency range of interest (at least 100,000) and by high modal overlap, both of which place its analysis beyond the scope of finite element or boundary element methods. A method for calculating the transmission is presented, which is broadly based on Statistical Energy Analysis, but extended in two important ways: (1) a novel subsystem breakdown which exploits the particular geometry of a submarine pressure hull; (2) explicit modelling of energy density variation within a subsystem due to damping. The method takes account of fluid-structure interaction, the underlying pass/stop band characteristics resulting from the near-periodicity of the pressure hull construction, the effect of vibration isolators such as bulkheads, and the cumulative effect of irregularities (e.g., attachments and penetrations).
Lee, Chaeyoung
2012-11-01
Epistasis that may explain a large portion of the phenotypic variation for complex economic traits of animals has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs by using this method. Data were simulated by combined designs of number of loci, within genotype variance, and sample size in unbalanced designs with or without null combined genotype cells. Mean empirical statistical power was estimated for testing posterior mean estimate of combined genotype effect. A practical example for obtaining empirical statistical power estimates with a given sample size was provided under unbalanced designs. The empirical statistical powers would be useful for determining an optimal design when interactive associations of multiple loci with complex phenotypes were examined.
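The simulation logic behind such empirical power estimates is straightforward. A stripped-down Monte Carlo sketch for an unbalanced two-group design follows (a z-test stand-in for the paper's Gibbs-sampler machinery; names and defaults are hypothetical):

```python
import random
from math import sqrt

def empirical_power(n1, n2, delta, sigma=1.0, reps=4000, seed=7):
    """Monte Carlo estimate of the power of a two-sided z-test (known
    sigma, alpha = 0.05) to detect a mean difference delta under an
    unbalanced design with group sizes n1 and n2."""
    rng = random.Random(seed)
    z_crit = 1.959963985                       # Phi^-1(0.975)
    se = sigma * sqrt(1.0 / n1 + 1.0 / n2)     # SE of the mean difference
    hits = 0
    for _ in range(reps):
        m1 = sum(rng.gauss(0.0, sigma) for _ in range(n1)) / n1
        m2 = sum(rng.gauss(delta, sigma) for _ in range(n2)) / n2
        if abs(m2 - m1) / se > z_crit:
            hits += 1
    return hits / reps
```

The same loop generalizes directly: replace the z-test with any estimator (including a posterior-mean test from a Gibbs sampler) and the rejection count still yields the empirical power for that design.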
Low statistical power in biomedical science: a review of three human research domains
Dumas-Mallet, Estelle; Button, Katherine S.; Boraud, Thomas; Gonon, Francois
2017-01-01
Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0–10% or 11–20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation. PMID:28386409
Monitoring Statistics Which Have Increased Power over a Reduced Time Range.
ERIC Educational Resources Information Center
Tang, S. M.; MacNeill, I. B.
1992-01-01
The problem of monitoring trends for changes at unknown times is considered. Statistics that permit one to focus high power on a segment of the monitored period are studied. Numerical procedures are developed to compute the null distribution of these statistics. (Author)
Asking Sensitive Questions: A Statistical Power Analysis of Randomized Response Models
ERIC Educational Resources Information Center
Ulrich, Rolf; Schroter, Hannes; Striegel, Heiko; Simon, Perikles
2012-01-01
This article derives the power curves for a Wald test that can be applied to randomized response models when small prevalence rates must be assessed (e.g., detecting doping behavior among elite athletes). These curves enable the assessment of the statistical power that is associated with each model (e.g., Warner's model, crosswise model, unrelated…
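The flavor of such a power curve is easy to reproduce for Warner's model: a respondent answers the sensitive statement with probability p and its negation otherwise, so the probability of a "yes" is pi*p + (1 - pi)*(1 - p). A one-sided Wald-test power sketch under the normal approximation follows (hypothetical names; the article's exact derivation may parameterize differently):

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def warner_power(pi1, p_design, n, alpha=0.05):
    """Approximate power of the one-sided Wald test of H0: pi = 0
    against a true prevalence pi1 under Warner's design (alpha fixed
    at 0.05 via the hard-coded critical value below)."""
    lam0 = 1.0 - p_design                          # P(yes) if pi = 0
    lam1 = pi1 * p_design + (1.0 - pi1) * (1.0 - p_design)
    scale = abs(2.0 * p_design - 1.0)              # design efficiency term
    se0 = sqrt(lam0 * (1.0 - lam0) / n) / scale    # SE under H0
    se1 = sqrt(lam1 * (1.0 - lam1) / n) / scale    # SE under H1
    z_crit = 1.644853627                           # Phi^-1(0.95)
    return norm_cdf((pi1 - z_crit * se0) / se1)
```

The (2p - 1) term in the denominator is what drives the article's message: as the design probability approaches 1/2 (maximal privacy protection) the standard errors blow up and enormous samples are needed for any power against small prevalence rates.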
Iordache, Eugen; Dierker, Lisa; Fifield, Judith; Schensul, Jean J.; Suggs, Suzanne; Barbour, Russell
2015-01-01
The advantages of modeling the unreliability of outcomes when evaluating the comparative effectiveness of health interventions are illustrated. Adding an action-research intervention component to a regular summer job program for youth was expected to help in preventing risk behaviors. A series of simple two-group alternative structural equation models are compared to test the effect of the intervention on one key attitudinal outcome in terms of model fit and statistical power with Monte Carlo simulations. Some models presuming parameters equal across the intervention and comparison groups were underpowered to detect the intervention effect, yet modeling the unreliability of the outcome measure increased their statistical power and helped in the detection of the hypothesized effect. Comparative Effectiveness Research (CER) could benefit from flexible multi-group alternative structural models organized in decision trees, and modeling unreliability of measures can be of tremendous help for both the fit of statistical models to the data and their statistical power. PMID:26640421
Novel Resampling Improves Statistical Power for Multiple-Trait QTL Mapping
Cheng, Riyan; Doerge, R. W.; Borevitz, Justin
2017-01-01
Multiple-trait analysis typically employs models that associate a quantitative trait locus (QTL) with all of the traits. As a result, statistical power for QTL detection may not be optimal if the QTL contributes to the phenotypic variation in only a small proportion of the traits. Excluding QTL effects that contribute little to the test statistic can improve statistical power. In this article, we show that an optimal power can be achieved when the number of QTL effects is best estimated, and that a stringent criterion for QTL effect selection may improve power when the number of QTL effects is small but can reduce power otherwise. We investigate strategies for excluding trivial QTL effects, and propose a method that improves statistical power when the number of QTL effects is relatively small, and fairly maintains the power when the number of QTL effects is large. The proposed method first uses resampling techniques to determine the number of nontrivial QTL effects, and then selects QTL effects by the backward elimination procedure for significance test. We also propose a method for testing QTL-trait associations that are desired for biological interpretation in applications. We validate our methods using simulations and Arabidopsis thaliana transcript data. PMID:28064191
Power-law statistics and universal scaling in the absence of criticality.
Touboul, Jonathan; Destexhe, Alain
2017-01-01
Critical states are sometimes identified experimentally through power-law statistics or universal scaling functions. We show here that such features naturally emerge from networks in self-sustained irregular regimes away from criticality. In these regimes, statistical physics theory of large interacting systems predicts a regime where the nodes have independent and identically distributed dynamics. We thus investigated the statistics of a system in which units are replaced by independent stochastic surrogates and found the same power-law statistics, indicating that these are not sufficient to establish criticality. We rather suggest that these are universal features of large-scale networks when considered macroscopically. These results urge caution in the interpretation of scaling laws found in nature.
NASA Astrophysics Data System (ADS)
Ma, W. T.; Sandri, G. vH.; Sarkar, S.
1991-05-01
We use the convolution power of infinite sequences to obtain a novel representation of exponential functions of power series which often arise in statistical mechanics. We thus obtain new formulas for the configuration and cluster integrals of pairwise interacting systems of molecules in an imperfect gas. We prove that the asymptotic behaviour of the Luria-Delbrück distribution is p_n ~ c n^(-2). We derive a new, simple and computationally efficient recursion relation for p_n.
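The recursion in question is short enough to state in code. The sketch below uses the widely cited published form of the Ma-Sandri-Sarkar recursion, with m the mean number of mutation events per culture (written from the standard form in the literature, not copied from the paper itself):

```python
from math import exp

def luria_delbruck_pmf(m, n_max):
    """Probabilities p_0 .. p_n_max of observing n mutants, via the
    recursion p_0 = exp(-m), p_n = (m/n) * sum_{i<n} p_i / (n - i + 1)."""
    p = [exp(-m)]
    for n in range(1, n_max + 1):
        p.append((m / n) * sum(p[i] / (n - i + 1.0) for i in range(n)))
    return p
```

The heavy p_n ~ c n^(-2) tail proved in the paper is visible directly from this output: successive probabilities fall off roughly fourfold when n doubles, so the distribution has no finite mean.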
Mieth, Bettina; Kloft, Marius; Rodríguez, Juan Antonio; Sonnenburg, Sören; Vobruba, Robin; Morcillo-Suárez, Carlos; Farré, Xavier; Marigorta, Urko M.; Fehr, Ernst; Dickhaus, Thorsten; Blanchard, Gilles; Schunk, Daniel; Navarro, Arcadi; Müller, Klaus-Robert
2016-01-01
The standard approach to the analysis of genome-wide association studies (GWAS) is based on testing each position in the genome individually for statistical significance of its association with the phenotype under investigation. To improve the analysis of GWAS, we propose a combination of machine learning and statistical testing that takes correlation structures within the set of SNPs under investigation in a mathematically well-controlled manner into account. The novel two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidate SNPs and then performs hypothesis tests for these SNPs together with an adequate threshold correction. Applying COMBI to data from a WTCCC study (2007) and measuring performance as replication by independent GWAS published within the 2008–2015 period, we show that our method outperforms ordinary raw p-value thresholding as well as other state-of-the-art methods. COMBI presents higher power and precision than the examined alternatives while yielding fewer false (i.e. non-replicated) and more true (i.e. replicated) discoveries when its results are validated on later GWAS studies. More than 80% of the discoveries made by COMBI upon WTCCC data have been validated by independent studies. Implementations of the COMBI method are available as a part of the GWASpi toolbox 2.0. PMID:27892471
Narayan, Manjari; Allen, Genevera I.
2016-01-01
Many complex brain disorders, such as autism spectrum disorders, exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches—R2 based on resampling and random effects test statistics, and R3 that additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R2 and R3 have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in autism spectrum disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices. PMID:27147940
Palmisano, Aldo N.; Elder, N.E.
2001-01-01
We examined, under standardized conditions, seawater survival of chinook salmon Oncorhynchus tshawytscha at the smolt stage to evaluate the experimental hatchery practices applied to their rearing. The experimental rearing practices included rearing fish at different densities; attempting to control bacterial kidney disease with broodstock segregation, erythromycin injection, and an experimental diet; rearing fish on different water sources; and freeze branding the fish. After application of experimental rearing practices in hatcheries, smolts were transported to a rearing facility for about 2-3 months of seawater rearing. Of 16 experiments, 4 yielded statistically significant differences in seawater survival. In general we found that high variability among replicates, plus the low numbers of replicates available, resulted in low statistical power. We recommend including four or five replicates and using α = 0.10 in 1-tailed tests of hatchery experiments to try to increase the statistical power to 0.80.
[Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].
Suzukawa, Yumi; Toyoda, Hideki
2012-04-01
This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in fields like perception, cognition, or learning, the effect sizes were relatively large, although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, meaningless effects could be detected. This implies that researchers who cannot obtain sufficiently large effect sizes compensate with larger samples in order to obtain significant results.
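The power calculations surveyed above follow a standard pattern. A minimal sketch of the usual normal approximation for a two-sample comparison, relating Cohen's d, per-group sample size, and power (the specific effect sizes and sample sizes below are illustrative, not taken from the journal survey):

```python
from math import erf, sqrt, ceil

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

Z_ALPHA = 1.959964  # two-sided alpha = 0.05
Z_BETA = 0.841621   # corresponds to power = 0.80

def power_two_sample(d, n_per_group):
    """Approximate power of a two-sample z/t test for effect size d."""
    return phi(d * sqrt(n_per_group / 2.0) - Z_ALPHA)

def n_for_power(d):
    """Per-group sample size needed to detect d with 80% power."""
    return ceil(2.0 * ((Z_ALPHA + Z_BETA) / d) ** 2)

small_n_power = power_two_sample(0.5, 20)   # medium effect, small sample
large_n_power = power_two_sample(0.2, 400)  # small effect, large sample
```

The two cases mirror the survey's finding: a medium effect studied with 20 subjects per group is badly underpowered, while 400 subjects per group give roughly 80% power even for a small effect.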
Statistics of injected power on a bouncing ball subjected to a randomly vibrating piston.
García-Cid, Alfredo; Gutiérrez, Pablo; Falcón, Claudio; Aumaître, Sébastien; Falcon, Eric
2015-09-01
We present an experimental study on the statistical properties of the injected power needed to maintain an inelastic ball bouncing constantly on a randomly accelerating piston in the presence of gravity. We compute the injected power at each collision of the ball with the moving piston by measuring the velocity of the piston and the force exerted on the piston by the ball. The probability density function of the injected power has its most probable value close to zero and displays two asymmetric exponential tails, depending on the restitution coefficient, the piston acceleration, and its frequency content. This distribution can be deduced from a simple model assuming quasi-Gaussian statistics for the force and velocity of the piston.
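The quasi-Gaussian model sketched above can be reproduced in a few lines: if force and velocity are jointly Gaussian with a positive correlation, their product (the injected power per collision) concentrates near zero yet has a positive mean. A Monte Carlo sketch, with an assumed correlation coefficient standing in for the measured one:

```python
import random
random.seed(0)

N = 200_000
c = 0.3  # assumed force-velocity correlation; positive, so net power is injected
samples = []
for _ in range(N):
    v = random.gauss(0.0, 1.0)          # piston velocity at a collision
    f = c * v + random.gauss(0.0, 1.0)  # force, partially correlated with v
    samples.append(f * v)               # injected power at that collision

mean_power = sum(samples) / N
near_zero = sum(1 for s in samples if abs(s) < 0.25)
tail_band = sum(1 for s in samples if 1.0 < abs(s) < 1.25)
```

The near-zero band is far more populated than an equal-width band out in the tails, reflecting the cusp of the product-of-Gaussians density at zero, while the mean stays at c, the injected power needed to sustain the motion.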
NASA Astrophysics Data System (ADS)
Chang, Xiaoyen Y.; Sewell, Thomas D.; Raff, Lionel M.; Thompson, Donald L.
1992-11-01
The possibility of utilizing different types of power spectra obtained from classical trajectories as a diagnostic tool to identify the presence of nonstatistical dynamics is explored by using the unimolecular bond-fission reactions of 1,2-difluoroethane and the 2-chloroethyl radical as test cases. In previous studies, the reaction rates for these systems were calculated by using a variational transition-state theory and classical trajectory methods. A comparison of the results showed that 1,2-difluoroethane is a nonstatistical system, while the 2-chloroethyl radical behaves statistically. Power spectra for these two systems have been generated under various conditions. The characteristics of these spectra are as follows: (1) The spectra for the 2-chloroethyl radical are always broader and more coupled to other modes than is the case for 1,2-difluoroethane. This is true even at very low levels of excitation. (2) When an internal energy near or above the dissociation threshold is initially partitioned into a local C-H stretching mode, the power spectra for 1,2-difluoroethane broaden somewhat, but discrete and somewhat isolated bands are still clearly evident. In contrast, the analogous power spectra for the 2-chloroethyl radical exhibit a near complete absence of isolated bands. The general appearance of the spectrum suggests a very high level of mode-to-mode coupling, large intramolecular vibrational energy redistribution (IVR) rates, and global statistical behavior. (3) The appearance of the power spectrum for the 2-chloroethyl radical is unaltered regardless of whether the initial C-H excitation is in the CH2 or the CH2Cl group. This result also suggests statistical behavior. These results are interpreted to mean that power spectra may be used as a diagnostic tool to assess the statistical character of a system. The presence of a diffuse spectrum exhibiting a nearly complete loss of isolated structures indicates that the dissociation dynamics of the molecule will
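The diagnostic above rests on computing power spectra from trajectory time series. A minimal sketch of a periodogram via a direct DFT (the mode frequency and record length below are arbitrary choices, not values from the study):

```python
import cmath
from math import pi, sin

N = 64
f0 = 5  # mode frequency, in cycles per record
signal = [sin(2 * pi * f0 * n / N) for n in range(N)]

def power_spectrum(x):
    """Periodogram via a direct DFT (O(N^2); fine for short trajectories)."""
    n_pts = len(x)
    spec = []
    for k in range(n_pts // 2 + 1):
        X = sum(x[n] * cmath.exp(-2j * pi * k * n / n_pts)
                for n in range(n_pts))
        spec.append(abs(X) ** 2 / n_pts)
    return spec

spec = power_spectrum(signal)
peak_bin = max(range(len(spec)), key=spec.__getitem__)
```

A single uncoupled mode produces one isolated line; strong mode-to-mode coupling would spread this power into the broad, diffuse bands that the abstract associates with statistical (RRKM-like) behavior.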
Escalante, Yolanda; Saavedra, Jose M.; Tella, Victor; Mansilla, Mirella; García-Hermoso, Antonio; Dominguez, Ana M.
2012-01-01
The aims of this study were (i) to compare women’s water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performances for each phase. The game-related statistics of the 124 women’s matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared test. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final phases (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots. Key points: It was in the preliminary phase that more than one variable was involved in this differentiation, including both offensive and defensive aspects of the game. The game-related statistics were found to have a high discriminatory power in predicting the result of matches, with shots and goalkeeper-blocked shots being discriminatory variables in all three phases. Knowledge of the characteristics of the winning teams’ game-related statistics in women’s water polo, and of their power to predict match outcomes, will allow coaches to take these characteristics into account when planning training and match preparation. PMID
Anomalous wave function statistics on a one-dimensional lattice with power-law disorder.
Titov, M; Schomerus, H
2003-10-24
Within a general framework, we discuss the wave function statistics in the Lloyd model of Anderson localization on a one-dimensional lattice with a Cauchy distribution for random on-site potential. We demonstrate that already in leading order in the disorder strength, there exists a hierarchy of anomalies in the probability distributions of the wave function, the conductance, and the local density of states, for every energy which corresponds to a rational ratio of wavelength to lattice constant. Power-law rather than log-normal tails dominate the short-distance wave-function statistics.
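The localization underlying these statistics can be probed numerically with the standard transfer-matrix recursion for the 1D Anderson model with Cauchy (Lloyd) disorder, for which the Lyapunov exponent (inverse localization length) is known exactly from the Thouless formula. A sketch, with an assumed disorder strength and band-centre energy:

```python
import random
from math import pi, tan, log, hypot, acosh, sqrt
random.seed(2)

gamma_dis = 0.5   # Cauchy disorder scale (assumed)
E = 0.0           # band-centre energy
N = 200_000       # chain length

psi, psi_prev = 1.0, 0.0
log_growth = 0.0
for _ in range(N):
    V = gamma_dis * tan(pi * (random.random() - 0.5))  # Cauchy on-site potential
    # Tight-binding recursion: psi_{n+1} = (E - V_n) psi_n - psi_{n-1}
    psi, psi_prev = (E - V) * psi - psi_prev, psi
    norm = hypot(psi, psi_prev)          # renormalize to avoid overflow
    log_growth += log(norm)
    psi, psi_prev = psi / norm, psi_prev / norm

lyap = log_growth / N  # Lyapunov exponent = inverse localization length
# Exact Thouless result for the Lloyd model:
exact = acosh((sqrt((E + 2) ** 2 + gamma_dis ** 2)
               + sqrt((E - 2) ** 2 + gamma_dis ** 2)) / 4.0)
```

The anomalies discussed in the abstract are corrections to wave-function statistics beyond this typical exponential decay, at energies commensurate with the lattice.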
Asking sensitive questions: a statistical power analysis of randomized response models.
Ulrich, Rolf; Schröter, Hannes; Striegel, Heiko; Simon, Perikles
2012-12-01
This article derives the power curves for a Wald test that can be applied to randomized response models when small prevalence rates must be assessed (e.g., detecting doping behavior among elite athletes). These curves enable the assessment of the statistical power that is associated with each model (e.g., Warner's model, crosswise model, unrelated question model, forced-choice models, item count model, cheater detection model). This power analysis can help in choosing the optimal model and sample size and in setting model parameters in survey studies. The general framework can be applied to all existing randomized response model versions. The Appendix of this article contains worked-out numerical examples to demonstrate the power analysis for each specific model.
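The flavor of such a power curve can be sketched for Warner's original design, where a respondent is directed with probability p to the sensitive statement and otherwise to its negation, so P("yes") = p·π + (1−p)(1−π). This is a simplified normal-approximation version of a Wald power calculation, with illustrative prevalences, design probability, and sample sizes (not the article's worked examples):

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def warner_power(pi0, pi1, p, n, z_alpha=1.959964):
    """Approximate power of a Wald test of H0: prevalence = pi0 vs pi1
    under Warner's design with randomization probability p."""
    lam1 = p * pi1 + (1 - p) * (1 - pi1)  # P("yes") under the alternative
    se1 = sqrt(lam1 * (1 - lam1) / n) / abs(2 * p - 1)
    return phi(abs(pi1 - pi0) / se1 - z_alpha)

pw_small = warner_power(0.05, 0.10, p=0.8, n=1000)
pw_large = warner_power(0.05, 0.10, p=0.8, n=4000)
pw_less_private = warner_power(0.05, 0.10, p=0.95, n=1000)
```

The factor 1/|2p−1| in the standard error makes the privacy-power trade-off explicit: moving p toward 1 (less privacy protection) sharply increases power at fixed n, which is exactly the kind of design decision the article's curves are meant to inform.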
Statistical power of likelihood ratio and Wald tests in latent class models with covariates.
Gudicha, Dereje W; Schmittmann, Verena D; Vermunt, Jeroen K
2016-12-30
This paper discusses power and sample-size computation for likelihood ratio and Wald testing of the significance of covariate effects in latent class models. For both tests, asymptotic distributions can be used; that is, the test statistic can be assumed to follow a central Chi-square under the null hypothesis and a non-central Chi-square under the alternative hypothesis. Power or sample-size computation using these asymptotic distributions requires specification of the non-centrality parameter, which in practice is rarely known. We show how to calculate this non-centrality parameter using a large simulated data set from the model under the alternative hypothesis. A simulation study is conducted evaluating the adequacy of the proposed power analysis methods, determining the key study design factor affecting the power level, and comparing the performance of the likelihood ratio and Wald test. The proposed power analysis methods turn out to perform very well for a broad range of conditions. Moreover, apart from effect size and sample size, an important factor affecting the power is the class separation, implying that when class separation is low, rather large sample sizes are needed to achieve a reasonable power level.
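Once the non-centrality parameter has been obtained (e.g. from a large simulated data set, as the paper proposes), power is just a tail probability of a non-central chi-square. A sketch that evaluates it by Monte Carlo for one degree of freedom, with an illustrative non-centrality value:

```python
import random
random.seed(3)

CRIT = 3.841459  # 0.95 quantile of the central chi-square, df = 1

def power_mc(ncp, n_draws=100_000):
    """Monte Carlo power: P(noncentral chi-square(df=1, ncp) > CRIT).
    A df=1 noncentral chi-square draw is (Z + delta)^2 with delta = sqrt(ncp)."""
    delta = ncp ** 0.5
    hits = sum(1 for _ in range(n_draws)
               if (random.gauss(0.0, 1.0) + delta) ** 2 > CRIT)
    return hits / n_draws

pw = power_mc(4.0)  # ncp = 4, an assumed value for illustration
```

In practice the non-centrality grows linearly with sample size, so the same routine inverted over n gives the sample-size calculation the paper describes.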
Statistical Design Model (SDM) of power supply and communication subsystem's Satellite
NASA Astrophysics Data System (ADS)
Mirshams, Mehran; Zabihian, Ehsan; Zabihian, Ahmadreza
In this paper, we consider the statistical design of the power supply and communication subsystems of satellites. The approach is motivated by the fact that most design approaches and relations for these subsystems are empirical and statistical, and that aerospace science is young relative to other engineering disciplines such as electrical engineering, so that in many areas no analytic or fully proven empirical relations exist. The approach presented in this paper is entirely innovative, and all parts of the power supply and communication subsystems of the satellite are specified. In codifying this approach, data from 602 satellites and software packages such as SPSS have been used. In this approach, after proposing the design procedure, the total power needed by the satellite, the mass of the power supply and communication subsystems, the power needed by the communication subsystem, the working band, the type of antenna, the number of transponders, the material of the solar arrays, and finally the placement of these arrays on the satellite are designed. All these parts are designed based on the mission of the satellite and its weight class. This procedure increases the performance rate, avoids wasting energy, and reduces costs. Keywords: database, statistical model, design procedure, power supply subsystem, communication subsystem
Foster, Jonathan; Krompholz, Hermann; Neuber, Andreas
2011-11-15
The physical mechanisms that contribute to atmospheric breakdown induced by high power microwaves (HPMs) are of particular interest for the further development of high power microwave systems and related technologies. For a system in which HPM is produced in a vacuum environment for the purpose of radiating into atmosphere, it is necessary to separate the atmospheric environment from the vacuum environment with a dielectric interface. Breakdown across this interface on the atmospheric side and plasma development to densities prohibiting further microwave propagation are of special interest. In this paper, the delay time between microwave application and plasma emergence is investigated. Various external parameters, such as UV illumination or the presence of small metallic points on the surface, provide sources for electron field emission and influence the delay time which yields crucial information on the breakdown mechanisms involved. Due to the inherent statistical appearance of initial electrons and the statistics of the charge carrier amplification mechanisms, the flashover delay times deviate by as much as ±50% from the average, for the investigated case of discharges in N₂ at pressures of 60-140 Torr and a microwave frequency of 2.85 GHz with 3 μs pulse duration, 50 ns pulse risetime, and MW/cm² power densities. The statistical model described in this paper demonstrates how delay times for HPM surface flashover events can be effectively predicted for various conditions given sufficient knowledge about ionization rate coefficients as well as the production rate for breakdown initiating electrons.
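The scatter in delay times can be illustrated with the classical decomposition into a deterministic formative lag plus an exponentially distributed statistical lag (the waiting time for an initiating electron). This is a toy sketch, not the paper's model; both lag parameters are assumed values:

```python
import random
from math import sqrt
random.seed(4)

T_FORMATIVE = 100.0   # ns, assumed deterministic formative lag
MEAN_STAT_LAG = 50.0  # ns, assumed mean wait for a breakdown-initiating electron

N = 50_000
delays = [T_FORMATIVE + random.expovariate(1.0 / MEAN_STAT_LAG)
          for _ in range(N)]

mean_delay = sum(delays) / N
sd_delay = sqrt(sum((d - mean_delay) ** 2 for d in delays) / (N - 1))
rel_spread = sd_delay / mean_delay
```

Even with a fixed formative lag, the exponential statistical lag alone produces a relative spread of tens of percent, of the same order as the ±50% scatter reported above; raising the initial-electron production rate (e.g. by UV illumination) shrinks the statistical lag and hence the spread.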
Stift, M; Reeve, R; van Tienderen, P H
2010-07-01
In their recent article, Albertin et al. (2009) suggest an autotetraploid origin of 10 tetraploid strains of baker's yeast (Saccharomyces cerevisiae), supported by the frequent observation of double reduction meiospores. However, the presented inheritance results were puzzling and seemed to contradict the authors' interpretation that segregation ratios support a tetrasomic model of inheritance. Here, we provide an overview of the expected segregation ratios at the tetrad and meiospore level given scenarios of strict disomic and tetrasomic inheritance, for cases with and without recombination between locus and centromere. We also use a power analysis to derive adequate sample sizes to distinguish alternative models. Closer inspection of the Albertin et al. data reveals that strict disomy can be rejected in most cases. However, disomic inheritance with strong but imperfect preferential pairing could not be excluded with the sample sizes used. The possibility of tetrad analysis in tetraploid yeast offers a valuable opportunity to improve our understanding of meiosis and inheritance of tetraploids.
NASA Astrophysics Data System (ADS)
Najac, Julien
2014-05-01
For many applications in the energy sector, it is crucial to have downscaling methods that preserve the space-time dependences, at very fine spatial and temporal scales, between the variables affecting electricity production and consumption. For climate change impact studies this is an extremely difficult task, particularly as reliable climate information is usually found at regional and monthly scales at best, although many industry-oriented applications need more refined information (hydropower production models, wind energy production models, power demand models, power balance models…). Here we thus investigate how to predict and quantify the influence of climate change on climate-related energies and on energy demand. To do so, statistical downscaling methods originally developed for studying climate change impacts on hydrological cycles in France (and which have been used to compute hydropower production in France) have been applied to predicting wind power generation in France and an air temperature indicator commonly used for predicting power demand in France. We show that these methods provide satisfactory results over the recent past and apply this methodology to several climate model runs from the ENSEMBLES project.
Nateghi, Roshanak; Guikema, Seth D; Quiring, Steven M
2011-12-01
This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Being able to make accurate predictions of power outage durations is valuable because the information can be used by utility companies to plan their restoration efforts more efficiently. This information can also help inform customers and public agencies of the expected outage times, enabling better collective response planning, and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes with the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage duration times caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazard models (Cox PH)) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate additive regression splines). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy.
Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays
NASA Astrophysics Data System (ADS)
Sibatov, R. T.
2011-08-01
A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of the Ohm law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
More powerful genetic association testing via a new statistical framework for integrative genomics
Zhao, Sihai D.; Cai, T. Tony; Li, Hongzhe
2015-01-01
Integrative genomics offers a promising approach to more powerful genetic association studies. The hope is that combining outcome and genotype data with other types of genomic information can lead to more powerful SNP detection. We present a new association test based on a statistical model that explicitly assumes that genetic variations affect the outcome through perturbing gene expression levels. It is shown analytically that the proposed approach can have more power to detect SNPs that are associated with the outcome through transcriptional regulation, compared to tests using the outcome and genotype data alone, and simulations show that our method is relatively robust to misspecification. We also provide a strategy for applying our approach to high-dimensional genomic data. We use this strategy to identify a potentially new association between a SNP and a yeast cell’s response to the natural product tomatidine, which standard association analysis did not detect. PMID:24975802
Zhang, Kai; Traskin, Mikhail; Small, Dylan S
2012-03-01
For group-randomized trials, randomization inference based on rank statistics provides robust, exact inference against nonnormal distributions. However, in a matched-pair design, the currently available rank-based statistics lose significant power compared to normal linear mixed model (LMM) test statistics when the LMM is true. In this article, we investigate and develop an optimal test statistic over all statistics in the form of the weighted sum of signed Mann-Whitney-Wilcoxon statistics under certain assumptions. This test is almost as powerful as the LMM even when the LMM is true, but it is much more powerful for heavy tailed distributions. A simulation study is conducted to examine the power.
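The statistic described above, a weighted sum of signed Mann-Whitney-Wilcoxon statistics over matched pairs, admits exact randomization inference by flipping the sign of each pair's contribution. A sketch with equal weights and a small invented data set (the optimal weighting derived in the article would be data-driven):

```python
from itertools import product

def mww_centered(x, y):
    """Centered Mann-Whitney statistic: #{x_i > y_j} - len(x)*len(y)/2."""
    u = sum(1 for xi in x for yj in y if xi > yj)
    return u - len(x) * len(y) / 2.0

# Six matched pairs (treated group, control group), invented for illustration;
# every pair shows a clear treatment effect.
pairs = [([6, 7, 8, 9, 10], [1, 2, 3, 4, 5]) for _ in range(6)]
weights = [1.0] * 6

stats = [mww_centered(x, y) for x, y in pairs]
t_obs = sum(w * s for w, s in zip(weights, stats))

# Exact randomization distribution: flip which group in each pair is "treated".
count = sum(1 for signs in product([1, -1], repeat=len(stats))
            if abs(sum(w * s * e for w, s, e in zip(weights, stats, signs)))
            >= abs(t_obs))
p_value = count / 2 ** len(stats)
```

With six pairs the reference distribution has only 2^6 = 64 points, so the smallest attainable two-sided p-value is 2/64; this floor is one reason matched-pair group-randomized trials need careful test-statistic design.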
In vivo Comet assay--statistical analysis and power calculations of mice testicular cells.
Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne; Boberg, Julie; Kulahci, Murat
2014-11-01
The in vivo Comet assay is a sensitive method for evaluating DNA damage. A recurrent concern is how to analyze the data appropriately and efficiently. A popular approach is to summarize the raw data into a summary statistic prior to the statistical analysis. However, consensus on which summary statistic to use has yet to be reached. Another important consideration concerns the assessment of proper sample sizes in the design of Comet assay studies. This study aims to identify a statistic suitably summarizing the % tail DNA of mice testicular samples in Comet assay studies. A second aim is to provide curves for this statistic outlining the number of animals and gels to use. The current study was based on 11 compounds administered via oral gavage in three doses to male mice: CAS no. 110-26-9, CAS no. 512-56-1, CAS no. 111873-33-7, CAS no. 79-94-7, CAS no. 115-96-8, CAS no. 598-55-0, CAS no. 636-97-5, CAS no. 85-28-9, CAS no. 13674-87-8, CAS no. 43100-38-5 and CAS no. 60965-26-6. Testicular cells were examined using the alkaline version of the Comet assay and the DNA damage was quantified as % tail DNA using a fully automatic scoring system. From the raw data 23 summary statistics were examined. A linear mixed-effects model was fitted to the summarized data and the estimated variance components were used to generate power curves as a function of sample size. The statistic that most appropriately summarized the within-sample distributions was the median of the log-transformed data, as it most consistently conformed to the assumptions of the statistical model. Power curves for 1.5-, 2-, and 2.5-fold changes of the highest dose group compared to the control group when 50 and 100 cells were scored per gel are provided to aid in the design of future Comet assay studies on testicular cells.
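The power-curve construction described above can be sketched from the fitted mixed-model variance components: the variance of a group mean of the summary statistic (median log % tail DNA) combines the animal and gel components. The variance components and design sizes below are assumed placeholders, not the study's estimates:

```python
from math import erf, sqrt, log

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Assumed variance components on the log scale (placeholders).
VAR_ANIMAL = 0.15
VAR_GEL = 0.05

def power_fold_change(fold, n_animals, n_gels, z_alpha=1.959964):
    """Approximate power to detect a `fold`-change between a dose group and
    control, each with n_animals animals and n_gels gels per animal."""
    delta = log(fold)  # fold-change becomes an additive shift on the log scale
    var_group_mean = (VAR_ANIMAL + VAR_GEL / n_gels) / n_animals
    return phi(delta / sqrt(2.0 * var_group_mean) - z_alpha)
```

Because the animal component is not divided by the number of gels, adding animals buys more power than adding gels once the gel component is small, which is the kind of design guidance the published curves provide.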
Gibson, Eli; Fenster, Aaron; Ward, Aaron D
2013-10-01
Novel imaging modalities are pushing the boundaries of what is possible in medical imaging, but their signal properties are not always well understood. The evaluation of these novel imaging modalities is critical to achieving their research and clinical potential. Image registration of novel modalities to accepted reference standard modalities is an important part of characterizing the modalities and elucidating the effect of underlying focal disease on the imaging signal. The strengths of the conclusions drawn from these analyses are limited by statistical power. Based on the observation that in this context, statistical power depends in part on uncertainty arising from registration error, we derive a power calculation formula relating registration error, number of subjects, and the minimum detectable difference between normal and pathologic regions on imaging, for an imaging validation study design that accommodates signal correlations within image regions. Monte Carlo simulations were used to evaluate the derived models and test the strength of their assumptions, showing that the model yielded predictions of the power, the number of subjects, and the minimum detectable difference of simulated experiments accurate to within a maximum error of 1% when the assumptions of the derivation were met, and characterizing sensitivities of the model to violations of the assumptions. The use of these formulae is illustrated through a calculation of the number of subjects required for a case study, modeled closely after a prostate cancer imaging validation study currently taking place at our institution. The power calculation formulae address three central questions in the design of imaging validation studies: (1) What is the maximum acceptable registration error? (2) How many subjects are needed? (3) What is the minimum detectable difference between normal and pathologic image regions?
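The three design questions can be connected by a simple normal-approximation sample-size sketch in which registration error inflates the effective per-subject noise. This is a simplified stand-in for the derived formulae (which additionally handle within-region signal correlations); the noise magnitudes below are assumed:

```python
from math import sqrt, ceil

Z_ALPHA = 1.959964  # two-sided 5% level
Z_BETA = 0.841621   # 80% power

def n_subjects(min_diff, sd_signal, sd_registration):
    """Subjects needed to detect `min_diff` between normal and pathologic
    regions when registration error adds independent noise to the signal."""
    sd_total = sqrt(sd_signal ** 2 + sd_registration ** 2)
    return ceil(2.0 * ((Z_ALPHA + Z_BETA) * sd_total / min_diff) ** 2)

n_perfect = n_subjects(0.5, sd_signal=1.0, sd_registration=0.0)
n_noisy = n_subjects(0.5, sd_signal=1.0, sd_registration=0.5)
```

Inverting the same relation for `min_diff` or `sd_registration` answers the other two questions: the minimum detectable difference at fixed n, and the maximum registration error that still permits the study's goals.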
Power flow as a complement to statistical energy analysis and finite element analysis
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1987-01-01
Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes finite element methods too costly, while SEA methods can only predict an average level. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. The power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies, providing a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.
Statistical analysis of the cosmic microwave background: Power spectra and foregrounds
NASA Astrophysics Data System (ADS)
O'Dwyer, Ian J.
2005-11-01
In this thesis I examine some of the challenges associated with analyzing Cosmic Microwave Background (CMB) data and present a novel approach to solving the problem of power spectrum estimation, called MAGIC (MAGIC Allows Global Inference of Covariance). In light of the computational difficulty of a brute-force approach to power spectrum estimation, I review several approaches which have been applied to the problem and show an example application of such an approximate method to experimental CMB data from the Background Emission Anisotropy Scanning Telescope (BEAST). I then introduce MAGIC, a new approach to power spectrum estimation based on a Bayesian statistical analysis of the data utilizing Gibbs sampling. I demonstrate application of this method to the all-sky Wilkinson Microwave Anisotropy Probe (WMAP) data. The results are in broad agreement with those obtained originally by the WMAP team. Since MAGIC generates a full description of each Cℓ, it is possible to examine several issues raised by the best-fit WMAP power spectrum, for example the perceived lack of power at low ℓ. It is found that the distributions of the Cℓ's at low ℓ are significantly non-Gaussian and, based on the exact analysis presented here, the "low quadrupole issue" can be attributed to a statistical fluctuation. Finally, I examine the effect of Galactic foreground contamination on CMB experiments and describe the principal foregrounds. I show that it is possible to include the foreground components in a self-consistent fashion within the statistical framework of MAGIC and give explicit examples of how this might be achieved. Foreground contamination will become an increasingly important issue in CMB data analysis, and the ability of this new algorithm to produce an exact power spectrum in a computationally feasible time, coupled with the foreground component separation and removal, is an exciting development in CMB data analysis. When considered with current algorithmic developments
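The Gibbs-sampling idea behind MAGIC can be illustrated with a drastically simplified toy: independent Gaussian "modes" observed in noise, alternating between drawing the signal given the current power and drawing the power given the signal. All sizes and the true power below are assumed; the real algorithm works with correlated spherical-harmonic coefficients, beams, and masks:

```python
import random
from math import sqrt
random.seed(5)

M = 2000        # number of independent modes (toy stand-in for the a_lm)
C_TRUE = 4.0    # true signal power (assumed)
N0 = 1.0        # known noise power (assumed)

data = [random.gauss(0.0, sqrt(C_TRUE + N0)) for _ in range(M)]

C = 1.0  # arbitrary starting point for the chain
chain = []
for it in range(500):
    # Draw the signal given C and the data: Wiener-filter mean plus fluctuation.
    wf = C / (C + N0)
    s = [random.gauss(wf * d, sqrt(wf * N0)) for d in data]
    # Draw C given the signal: inverse-gamma, i.e. sum(s^2) over a chi-square.
    chi2 = 2.0 * random.gammavariate(M / 2.0, 1.0)
    C = sum(si * si for si in s) / chi2
    if it >= 100:  # discard burn-in
        chain.append(C)

C_post = sum(chain) / len(chain)
```

The chain's stationary distribution is the exact posterior of the power given the data, which is what lets MAGIC characterize the full non-Gaussian distribution of each Cℓ rather than just a point estimate.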
Statistics of the radiated field of a space-to-earth microwave power transfer system
NASA Technical Reports Server (NTRS)
Stevens, G. H.; Leininger, G.
1976-01-01
Statistics such as the average power density pattern, the variance of the power density pattern, and the variance of the beam pointing error are related to hardware parameters such as transmitter rms phase error and rms amplitude error. Also, a limitation on the spectral width of the phase reference for phase control was established. A 1 km diameter transmitter appears feasible provided the total rms insertion phase errors of the phase control modules do not exceed 10 deg, amplitude errors do not exceed 10% rms, and the phase reference spectral width does not exceed approximately 3 kHz. With these conditions the expected radiation pattern is virtually the same as the error free pattern, and the rms beam pointing error would be insignificant (approximately 10 meters).
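The claim that a 10 deg rms phase error leaves the pattern "virtually the same as the error free pattern" matches the classical array result that random phase errors reduce on-axis gain by roughly exp(−σφ²). A Monte Carlo sketch with an assumed module count:

```python
import random
import cmath
from math import exp, radians
random.seed(6)

SIGMA_PHI = radians(10.0)  # 10 deg rms module phase error
N_MODULES = 1000           # assumed number of phase-control modules
TRIALS = 200

gain_ratios = []
for _ in range(TRIALS):
    # On-axis field of N modules with random phase errors, relative to the
    # error-free field (all phases equal).
    field = sum(cmath.exp(1j * random.gauss(0.0, SIGMA_PHI))
                for _ in range(N_MODULES)) / N_MODULES
    gain_ratios.append(abs(field) ** 2)

mean_ratio = sum(gain_ratios) / TRIALS
predicted = exp(-SIGMA_PHI ** 2)  # classical rms-phase-error gain loss
```

At 10 deg rms the predicted loss is about 3% of on-axis power, consistent with the abstract's conclusion; the power removed from the main beam reappears as a low sidelobe floor spread over the pattern.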
Statistics of power injection in a plate set into chaotic vibration
NASA Astrophysics Data System (ADS)
Cadot, O.; Boudaoud, A.; Touzé, C.
2008-12-01
A vibrating plate is set into a chaotic state of wave turbulence by either a periodic or a random local forcing. Correlations between the forcing and the local velocity response of the plate at the forcing point are studied. Statistical models with fairly good agreement with the experiments are proposed for each forcing. Both distributions of injected power have a logarithmic cusp for zero power, while the tails are Gaussian for the periodic driving and exponential for the random one. The distributions of injected work over long time intervals are investigated in the framework of the fluctuation theorem, also known as the Gallavotti-Cohen theorem. It appears that the conclusions of the theorem are verified only for the periodic, deterministic forcing. Using independent estimates of the phase space contraction, this result is discussed in the light of available theoretical framework.
Nicol, Samuel; Roach, Jennifer K.; Griffith, Brad
2013-01-01
Over the past 50 years, the number and size of high-latitude lakes have decreased throughout many regions; however, individual lake trends have been variable in direction and magnitude. This spatial heterogeneity in lake change makes statistical detection of temporal trends challenging, particularly in small analysis areas where weak trends are difficult to separate from inter- and intra-annual variability. Factors affecting trend detection include inherent variability, trend magnitude, and sample size. In this paper, we investigated how the statistical power to detect average linear trends in lake size of 0.5, 1.0 and 2.0 %/year was affected by the size of the analysis area and the number of years of monitoring in National Wildlife Refuges in Alaska. We estimated power for large (930–4,560 sq km) study areas within refuges and for 2.6, 12.9, and 25.9 sq km cells nested within study areas over temporal extents of 4–50 years. We found that: (1) trends in study areas could be detected within 5–15 years, (2) trends smaller than 2.0 %/year would take >50 years to detect in cells within study areas, and (3) there was substantial spatial variation in the time required to detect change among cells. Power was particularly low in the smallest cells which typically had the fewest lakes. Because small but ecologically meaningful trends may take decades to detect, early establishment of long-term monitoring will enhance power to detect change. Our results have broad applicability and our method is useful for any study involving change detection among variable spatial and temporal extents.
Detecting trends in raptor counts: power and type I error rates of various statistical tests
Hatfield, J.S.; Gould, W.R.; Hoover, B.A.; Fuller, M.R.; Lindquist, E.L.
1996-01-01
We conducted simulations that estimated power and type I error rates of statistical tests for detecting trends in raptor population count data collected from a single monitoring site. Results of the simulations were used to help analyze count data of bald eagles (Haliaeetus leucocephalus) from 7 national forests in Michigan, Minnesota, and Wisconsin during 1980-1989. Seven statistical tests were evaluated, including simple linear regression on the log scale and linear regression with a permutation test. Using 1,000 replications each, we simulated n = 10 and n = 50 years of count data and trends ranging from -5 to 5% change/year. We evaluated the tests at 3 critical levels (alpha = 0.01, 0.05, and 0.10) for both upper- and lower-tailed tests. Exponential count data were simulated by adding sampling error with a coefficient of variation of 40% from either a log-normal or autocorrelated log-normal distribution. Not surprisingly, tests performed with 50 years of data were much more powerful than tests with 10 years of data. Positive autocorrelation inflated alpha-levels upward from their nominal levels, making the tests less conservative and more likely to reject the null hypothesis of no trend. Of the tests studied, Cox and Stuart's test and Pollard's test clearly had lower power than the others. Surprisingly, the linear regression t-test, Collins' linear regression permutation test, and the nonparametric Lehmann's and Mann's tests all had similar power in our simulations. Analyses of the count data suggested that bald eagles had increasing trends on at least 2 of the 7 national forests during 1980-1989.
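The simulation recipe described above (an exponential population trend, log-normal sampling error with a 40% coefficient of variation, and a linear regression t-test on the log scale) can be sketched as follows. This is a minimal illustration of the approach, not the authors' code; the function name and default values are illustrative:

```python
import numpy as np
from scipy import stats

def trend_power(trend_pct, n_years, cv=0.40, n_sims=1000, alpha=0.05, seed=0):
    """Monte Carlo power of a log-scale linear regression t-test for an
    exponential count trend, with log-normal sampling error (CV = 40%)."""
    rng = np.random.default_rng(seed)
    years = np.arange(n_years)
    sigma = np.sqrt(np.log(1 + cv**2))  # log-scale SD that matches the CV
    mu_log = np.log(100.0) + years * np.log(1 + trend_pct / 100)
    rejections = 0
    for _ in range(n_sims):
        counts = np.exp(rng.normal(mu_log, sigma))
        res = stats.linregress(years, np.log(counts))
        if res.pvalue < alpha:
            rejections += 1
    return rejections / n_sims
```

With no trend, the rejection rate recovers the nominal alpha level; as in the study, a 5%/year trend is detected far more reliably with 50 years of data than with 10.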
The statistical power to detect cross-scale interactions at macroscales
Wagner, Tyler; Fergus, C. Emi; Stow, Craig A.; Cheruvelil, Kendra S.; Soranno, Patricia A.
2016-01-01
Macroscale studies of ecological phenomena are increasingly common because stressors such as climate and land-use change operate at large spatial and temporal scales. Cross-scale interactions (CSIs), where ecological processes operating at one spatial or temporal scale interact with processes operating at another scale, have been documented in a variety of ecosystems and contribute to complex system dynamics. However, studies investigating CSIs are often dependent on compiling multiple data sets from different sources to create multithematic, multiscaled data sets, which results in structurally complex, and sometimes incomplete, data sets. The statistical power to detect CSIs needs to be evaluated because of their importance and the challenge of quantifying CSIs using data sets with complex structures and missing observations. We studied this problem using a spatially hierarchical model that measures CSIs between regional agriculture and its effects on the relationship between lake nutrients and lake productivity. We used an existing large multithematic, multiscaled database, the LAke multi-scaled GeOSpatial and temporal database (LAGOS), to parameterize the power analysis simulations. We found that the power to detect CSIs was more strongly related to the number of regions in the study than to the number of lakes nested within each region. CSI power analyses will not only help ecologists design large-scale studies aimed at detecting CSIs, but will also focus attention on CSI effect sizes and the degree to which they are ecologically relevant and detectable with large data sets.
Notes on the Statistical Power of the Binary State Speciation and Extinction (BiSSE) Model
Gamisch, Alexander
2016-01-01
The Binary State Speciation and Extinction (BiSSE) method is one of the most popular tools for investigating the rates of diversification and character evolution. Yet, based on previous simulation studies, it is commonly held that the BiSSE method requires phylogenetic trees of fairly large sample sizes (>300 taxa) in order to distinguish between the different models of speciation, extinction, or transition rate asymmetry. Here, the power of the BiSSE method is reevaluated by simulating trees of both small and large sample sizes (30, 60, 90, and 300 taxa) under various asymmetry models and root state assumptions. Results show that the power of the BiSSE method can be much higher, also in trees of small sample size, for detecting differences in speciation rate asymmetry than anticipated earlier. This, however, is not a consequence of any conceptual or mathematical flaw in the method per se but rather of assumptions about the character state at the root of the simulated trees and thus the underlying macroevolutionary model, which led to biased results and conclusions in earlier power assessments. As such, these earlier simulation studies used to determine the power of BiSSE were not incorrect but biased, leading to an overestimation of type-II statistical error for detecting differences in speciation rate but not for extinction and transition rates. PMID:27486297
Power law distribution in statistics of failures in operation of spacecraft onboard equipment
NASA Astrophysics Data System (ADS)
Karimova, L. M.; Kruglun, O. A.; Makarenko, N. G.; Romanova, N. V.
2011-10-01
The possibility of using the statistics of recurrence times for extreme events is studied in this paper with a view to the problems of control and prediction of failures in spacecraft operation. Information about failures onboard satellites of various types, provided by the US National Geophysical Data Center, was analyzed. It was found that the probability density of recurrence intervals follows a power law of the Pareto type with an index equal to 2.3. The obtained result is consistent both with the theory of normal catastrophes and with the principle of self-organized criticality for metastable active heterogeneous media. A practical consequence of this result is that predictions of these extreme events should not rely on traditional models with second-order Pearson statistics. Predictive models must instead account for the power-law distribution of recurrence intervals between satellite failures, treating the failures as extreme events connected with space environment factors.
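A Pareto-type tail index like the 2.3 reported above is commonly estimated from interval data with the maximum-likelihood (Hill) estimator. A minimal sketch, assuming a pure Pareto tail above a chosen cutoff `x_min` (the cutoff-selection step used in practice is omitted here):

```python
import numpy as np

def pareto_index_mle(intervals, x_min):
    """Maximum-likelihood (Hill) estimate of the Pareto density exponent
    alpha for p(x) ~ x^(-alpha), x >= x_min, from the intervals that
    exceed x_min:  alpha_hat = 1 + n / sum(ln(x_i / x_min))."""
    x = np.asarray(intervals, dtype=float)
    tail = x[x >= x_min]
    return 1.0 + tail.size / np.log(tail / x_min).sum()
```

On synthetic Pareto-distributed intervals with a true exponent of 2.3, the estimator recovers the index closely.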
A powerful weighted statistic for detecting group differences of directed biological networks
Yuan, Zhongshang; Ji, Jiadong; Zhang, Xiaoshuai; Xu, Jing; Ma, Daoxin; Xue, Fuzhong
2016-01-01
Complex disease is largely determined by a number of biomolecules interwoven into networks, rather than by a single biomolecule. Different physiological conditions such as cases and controls may manifest as different networks. Statistical comparison between biological networks can provide not only new insight into the disease mechanism but also statistical guidance for drug development. However, the methods developed in previous studies are inadequate to capture changes in both the nodes and the edges, and often ignore the network structure. In this study, we present a powerful weighted statistical test for group differences of directed biological networks that is independent of the network attributes and can capture changes in both the nodes and the edges, while simultaneously accounting for the network structure by placing more weight on differences at nodes that occupy relatively more important positions. Simulation studies illustrate that this method had better performance than previous ones under various sample sizes and network structures. One application to GWAS of leprosy successfully identifies the specific gene interaction network contributing to leprosy. Another real data analysis significantly identifies a new biological network, which is related to acute myeloid leukemia. One potential network responsible for lung cancer has also been significantly detected. The source R code is available on our website. PMID:27686331
Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants
NASA Astrophysics Data System (ADS)
Rajasekar, Vidyashree
This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and the components/chemicals used in artificial soil formulation are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance, and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si, due to its smaller cells, were eliminated on the mono-Si coupons with large cells, yielding highly repeatable measurements. This study indicates that reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years under the cold-dry climate of New York were evaluated. A defect chart, degradation rates (at both string and module levels), and a safety map were generated using the field-measured data. A statistical reliability tool, FMECA, which uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly
Air-chemistry "turbulence": power-law scaling and statistical regularity
NASA Astrophysics Data System (ADS)
Hsu, H.-M.; Lin, C.-Y.; Guenther, A.; Tribbia, J. J.; Liu, S. C.
2011-03-01
With the intent to gain further knowledge on the spectral structures and statistical regularities of surface atmospheric chemistry, the chemical gases (NO, NO2, NOx, CO, SO2, and O3) and aerosol (PM10) measured at 74 air quality monitoring stations over the island of Taiwan are analyzed for the year 2004 at hourly resolution. They represent a range of surface air quality with a mixed combination of geographic settings, and include urban/rural, coastal/inland, and plain/hill locations. In addition to the well-known semi-diurnal and diurnal oscillations, weekly, intermediate (20 ~ 30 days) and intraseasonal (30 ~ 100 days) peaks are also identified with the continuous wavelet transform (CWT). The spectra indicate power-law scaling regions for the frequencies higher than the diurnal and those lower than the diurnal, with average exponents of -5/3 and -1, respectively. These dual exponents are corroborated by detrended fluctuation analysis in the corresponding time-lag regions. After spectral coefficients from the CWT decomposition are grouped according to the spectral bands and inverted separately, the PDFs of the reconstructed time series for the high-frequency band consistently demonstrate a striking statistical regularity: -3 power-law scaling in the heavy tails. Such spectral peaks, dual-exponent structures, and power-law scaling in heavy tails are intriguing, but their relations to turbulence and mesoscale variability require further investigation. This could lead to a better understanding of the processes controlling air quality.
Air-chemistry "turbulence": power-law scaling and statistical regularity
NASA Astrophysics Data System (ADS)
Hsu, H.-M.; Lin, C.-Y.; Guenther, A.; Tribbia, J. J.; Liu, S. C.
2011-08-01
With the intent to gain further knowledge on the spectral structures and statistical regularities of surface atmospheric chemistry, the chemical gases (NO, NO2, NOx, CO, SO2, and O3) and aerosol (PM10) measured at 74 air quality monitoring stations over the island of Taiwan are analyzed for the year 2004 at hourly resolution. They represent a range of surface air quality with a mixed combination of geographic settings, and include urban/rural, coastal/inland, plain/hill, and industrial/agricultural locations. In addition to the well-known semi-diurnal and diurnal oscillations, weekly and intermediate (20 ~ 30 days) peaks are also identified with the continuous wavelet transform (CWT). The spectra indicate power-law scaling regions for the frequencies higher than the diurnal and those lower than the diurnal, with average exponents of -5/3 and -1, respectively. These dual exponents are corroborated by detrended fluctuation analysis in the corresponding time-lag regions. These exponents are mostly independent of the averages and standard deviations of time series measured at various geographic settings, i.e., the spatial inhomogeneities. In other words, they possess dominant universal structures. After spectral coefficients from the CWT decomposition are grouped according to the spectral bands and inverted separately, the PDFs of the reconstructed time series for the high-frequency band consistently demonstrate a striking statistical regularity: -3 power-law scaling in the heavy tails. Such spectral peaks, dual-exponent structures, and power-law scaling in heavy tails are important structural information, but their relations to turbulence and mesoscale variability require further investigation. This could lead to a better understanding of the processes controlling air quality.
Amigo, Jorge; González-Manteiga, Wenceslao
2013-01-01
Background Mitochondrial DNA (mtDNA) variation (i.e. haplogroups) has been analyzed in regards to a number of multifactorial diseases. The statistical power of a case-control study determines the a priori probability to reject the null hypothesis of homogeneity between cases and controls. Methods/Principal Findings We critically review previous approaches to the estimation of statistical power based on the restricted scenario where the number of cases equals the number of controls, and propose a methodology that broadens procedures to more general situations. We developed statistical procedures that consider different disease scenarios, variable sample sizes in cases and controls, and variable numbers of haplogroups and effect sizes. The results indicate that the statistical power of a particular study can improve substantially by increasing the number of controls with respect to cases. In the opposite direction, the power decreases substantially when testing a growing number of haplogroups. We developed mitPower (http://bioinformatics.cesga.es/mitpower/), a web-based interface that implements the new statistical procedures and allows for the computation of the a priori statistical power in variable scenarios of case-control study designs, or, e.g., the number of controls needed to reach fixed effect sizes. Conclusions/Significance The present study provides statistical procedures for the computation of statistical power in common as well as complex case-control study designs involving 2×k tables, with special (though not exclusive) application to mtDNA studies. In order to reach a wide range of researchers, we also provide a friendly web-based tool, mitPower, that can be used in both retrospective and prospective case-control disease studies.
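The a priori power for a 2×k case-control table of the kind described above is conventionally computed from the noncentral chi-square distribution with Cohen's effect size w. The sketch below follows that textbook approximation; it is illustrative and may differ in detail from the mitPower implementation:

```python
import numpy as np
from scipy import stats

def case_control_power(p_controls, p_cases, n_controls, n_cases, alpha=0.05):
    """A priori power of the Pearson chi-square test of homogeneity for a
    2 x k case-control table (e.g. k mtDNA haplogroups), via the
    noncentral chi-square approximation with Cohen's effect size w."""
    p, q = np.asarray(p_controls, float), np.asarray(p_cases, float)
    n = n_controls + n_cases
    f_ca, f_co = n_cases / n, n_controls / n
    joint_h1 = np.concatenate([f_ca * q, f_co * p])      # true cell probs
    col = f_ca * q + f_co * p                            # pooled column margins
    joint_h0 = np.concatenate([f_ca * col, f_co * col])  # independence
    w2 = np.sum((joint_h1 - joint_h0) ** 2 / joint_h0)   # Cohen's w squared
    df = len(p) - 1                                      # (2-1)(k-1)
    crit = stats.chi2.ppf(1 - alpha, df)
    return 1 - stats.ncx2.cdf(crit, df, n * w2)
```

Consistent with the abstract's finding, adding controls while holding the number of cases fixed raises the computed power.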
Case Studies for the Statistical Design of Experiments Applied to Powered Rotor Wind Tunnel Tests
NASA Technical Reports Server (NTRS)
Overmeyer, Austin D.; Tanner, Philip E.; Martin, Preston B.; Commo, Sean A.
2015-01-01
The application of statistical Design of Experiments (DOE) to helicopter wind tunnel testing was explored during two powered rotor wind tunnel entries during the summers of 2012 and 2013. These tests were performed jointly by the U.S. Army Aviation Development Directorate Joint Research Program Office and the NASA Rotary Wing Project Office, currently the Revolutionary Vertical Lift Project, at NASA Langley Research Center in Hampton, Virginia. Both entries were conducted in the 14- by 22-Foot Subsonic Tunnel, with a small portion of the overall tests devoted to developing case studies of the DOE approach as it applies to powered rotor testing. A 16-47 times reduction in the number of data points required was estimated by comparing the DOE approach to conventional testing methods. The average error of the DOE surface response model for the OH-58F test was 0.95 percent for drag and 4.06 percent for download. The DOE surface response model of the Active Flow Control test captured the drag within 4.1 percent of measured data. Operational differences between the two testing approaches were identified, but they did not prevent the safe operation of the powered rotor model throughout the DOE test matrices.
A statistical survey of ultralow-frequency wave power and polarization in the Hermean magnetosphere.
James, Matthew K; Bunce, Emma J; Yeoman, Timothy K; Imber, Suzanne M; Korth, Haje
2016-09-01
We present a statistical survey of ultralow-frequency wave activity within the Hermean magnetosphere using the entire MErcury Surface, Space ENvironment, GEochemistry, and Ranging magnetometer data set. This study is focused upon wave activity with frequencies <0.5 Hz, typically below local ion gyrofrequencies, in order to determine if field line resonances similar to those observed in the terrestrial magnetosphere may be present. Wave activity is mapped to the magnetic equatorial plane of the magnetosphere and to magnetic latitude and local times on Mercury using the KT14 magnetic field model. Wave power mapped to the planetary surface indicates the average location of the polar cap boundary. Compressional wave power is dominant throughout most of the magnetosphere, while azimuthal wave power close to the dayside magnetopause provides evidence that interactions between the magnetosheath and the magnetopause such as the Kelvin-Helmholtz instability may be driving wave activity. Further evidence of this is found in the average wave polarization: left-handed polarized waves dominate the dawnside magnetosphere, while right-handed polarized waves dominate the duskside. A possible field line resonance event is also presented, where a time-of-flight calculation is used to provide an estimated local plasma mass density of ∼240 amu cm⁻³.
Westfall, Jacob; Kenny, David A; Judd, Charles M
2014-10-01
Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
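The power ceiling described above (power that saturates below unity as participants increase while the stimulus sample stays fixed) can be demonstrated with a deliberately simplified Monte Carlo sketch: a stimuli-between design analyzed with a t-test on by-stimulus mean ratings, rather than the authors' mixed-model machinery. All parameter values are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def crossed_design_power(n_part, n_stim, d=0.5, sd_stim=0.5, sd_resid=1.0,
                         n_sims=500, alpha=0.05, seed=0):
    """Monte Carlo power for a design where every participant rates every
    stimulus, each stimulus belongs to one of two conditions, and the test
    is a two-sample t-test on by-stimulus mean ratings. Because stimulus
    intercepts are a random factor, power saturates as participants grow
    while the stimulus sample stays fixed."""
    rng = np.random.default_rng(seed)
    cond = np.repeat([0.0, d], n_stim // 2)      # condition effect d
    hits = 0
    for _ in range(n_sims):
        stim_fx = cond + rng.normal(0, sd_stim, n_stim)
        # participant x stimulus residual noise, averaged over participants
        data = stim_fx + rng.normal(0, sd_resid, (n_part, n_stim))
        means = data.mean(axis=0)
        _, p = stats.ttest_ind(means[: n_stim // 2], means[n_stim // 2:])
        hits += p < alpha
    return hits / n_sims
```

Adding stimuli raises power substantially, whereas a twentyfold increase in participants with 20 fixed stimuli leaves power well short of unity, the qualitative pattern the article reports.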
Schliekelman, Paul
2008-01-01
A number of recent genomewide surveys have found numerous QTL for gene expression, often with intermediate to high heritability values. As a result, there is currently a great deal of interest in genetical genomics—that is, the combination of genomewide expression data and molecular marker data to elucidate the genetics of complex traits. To date, most genetical genomics studies have focused on generating candidate genes for previously known trait loci or have otherwise leveraged existing knowledge about trait-related genes. The purpose of this study is to explore the potential for genetical genomics approaches in the context of genomewide scans for complex trait loci. I explore the expected strength of association between expression-level traits and a clinical trait, as a function of the underlying genetic model in natural populations. I give calculations of statistical power for detecting differential expression between affected and unaffected individuals. I model both reactive and causative expression-level traits with both additive and multiplicative multilocus models for the relationship between phenotype and genotype and explore a variety of assumptions about dominance, number of segregating loci, and other parameters. There are two key results. If a transcript is causative for the disease (in the sense that disease risk depends directly on transcript level), then the power to detect association between transcript and disease is quite good. Sample sizes on the order of 100 are sufficient for 80% power. On the other hand, if the transcript is reactive to a disease locus, then the correlation between expression-level traits and disease is low unless the expression-level trait shares several causative loci with the disease—that is, the expression-level trait itself is a complex trait. Thus, there is a trade-off between the power to show association between a reactive expression-level trait and the clinical trait of interest and the power to map expression
NASA Astrophysics Data System (ADS)
Bianucci, M.
2016-01-01
This letter has two main goals. The first is to give a physically reasonable explanation for the use of stochastic models to mimic the apparently random features of the El Niño-Southern Oscillation (ENSO) phenomenon. The second is to obtain, from the theory, an analytical expression for the equilibrium density function of the sea surface temperature anomaly, an expression that fits the observational data well, reproducing the asymmetry and the power-law tail of the histograms of the NIÑO3 index. We accomplish these tasks by exploiting some recent theoretical results of the author in the field of the dynamical origin of stochastic processes. More precisely, we apply this approach to the celebrated recharge oscillator model (ROM), weakly interacting through a multiplicative term with a general deterministic complex forcing (Madden-Julian Oscillations, westerly wind bursts, etc.), and we obtain a Fokker-Planck equation that describes the statistical behavior of the ROM.
Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van't
2012-03-15
Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
Conductance statistics for the power-law banded random matrix model
Martinez-Mendoza, A. J.; Mendez-Bermudez, J. A.; Varga, Imre
2010-12-21
We study numerically the conductance statistics of the one-dimensional (1D) Anderson model with random long-range hoppings described by the Power-law Banded Random Matrix (PBRM) model. Within a scattering approach to electronic transport, we consider two scattering setups, in the absence and presence of direct processes: 2M single-mode leads attached to one side, or to opposite sides, of 1D circular samples. For both setups we show that (i) the probability distribution of the logarithm of the conductance T behaves as w(ln T) ∝ T^(M²/2) for T ≪
Statistical connection of peak counts to power spectrum and moments in weak-lensing field
NASA Astrophysics Data System (ADS)
Shirasaki, Masato
2017-02-01
The number density of local maxima of weak-lensing field, referred to as weak-lensing peak counts, can be used as a cosmological probe. However, its relevant cosmological information is still unclear. We study the relationship between the peak counts and other statistics in weak-lensing field by using 1000 ray-tracing simulations. We construct a local transformation of lensing field K to a new Gaussian field y, named local-Gaussianized transformation. We calibrate the transformation with numerical simulations so that the one-point distribution and the power spectrum of K can be reproduced from a single Gaussian field y and monotonic relation between y and K. Therefore, the correct information of two-point clustering and any order of moments in weak-lensing field should be preserved under local-Gaussianized transformation. We then examine if local-Gaussianized transformation can predict weak-lensing peak counts in simulations. The local-Gaussianized transformation is insufficient to explain weak-lensing peak counts in the absence of shape noise. The prediction by local-Gaussianized transformation underestimates the simulated peak counts with a level of ∼20-30 per cent over a wide range of peak heights. Local-Gaussianized transformation can predict the weak-lensing peak counts with an ∼10 per cent accuracy in the presence of shape noise. Our analyses suggest that the cosmological information beyond power spectrum and its moments would be necessary to predict the weak-lensing peak counts with a percent-level accuracy, which is an expected statistical uncertainty in upcoming wide-field galaxy surveys.
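The core of the local-Gaussianized transformation described above is a monotonic, pointwise map from the lensing field to a field with a standard normal one-point PDF. A minimal rank-based sketch of that idea follows; it omits the paper's simulation-based calibration of the power spectrum, and the function name is illustrative:

```python
import numpy as np
from scipy import stats

def gaussianize(kappa):
    """Monotonic local transformation of a lensing-like field kappa to a
    field y whose one-point PDF is standard normal (rank-based mapping).
    Sketches the idea behind the local-Gaussianized transformation."""
    flat = kappa.ravel()
    ranks = stats.rankdata(flat)              # 1..N, ties averaged
    u = ranks / (flat.size + 1.0)             # uniform scores in (0, 1)
    return stats.norm.ppf(u).reshape(kappa.shape)
```

Applied to a skewed (e.g. lognormal-like) field, the output has near-zero mean and unit variance while preserving the ordering of pixel values, so any monotone statistic of the original field is recoverable.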
Wagner, Tyler; Irwin, Brian J.; Bence, James R.; Hayes, Daniel B.
2016-01-01
Monitoring to detect temporal trends in biological and habitat indices is a critical component of fisheries management. Thus, it is important that management objectives are linked to monitoring objectives. This linkage requires a definition of what constitutes a management-relevant “temporal trend.” It is also important to develop expectations for the amount of time required to detect a trend (i.e., statistical power) and for choosing an appropriate statistical model for analysis. We provide an overview of temporal trends commonly encountered in fisheries management, review published studies that evaluated statistical power of long-term trend detection, and illustrate dynamic linear models in a Bayesian context, as an additional analytical approach focused on shorter term change. We show that monitoring programs generally have low statistical power for detecting linear temporal trends and argue that often management should be focused on different definitions of trends, some of which can be better addressed by alternative analytical approaches.
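The statistical-power evaluations described above can be sketched with a small Monte Carlo: simulate an annual index with a fixed linear trend plus noise, and count how often a regression test rejects the null (a generic illustration, not the authors' models; all parameter values are made up):

```python
import numpy as np
from scipy import stats

def trend_power(slope, sigma, n_years, n_sim=2000, alpha=0.05, seed=0):
    """Estimate power to detect a linear trend of `slope` units/year in an
    annual index with residual SD `sigma`, via Monte Carlo simulation."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_years)
    hits = 0
    for _ in range(n_sim):
        y = slope * t + rng.normal(0.0, sigma, n_years)  # simulated monitoring series
        if stats.linregress(t, y).pvalue < alpha:
            hits += 1
    return hits / n_sim
```

Runs like `trend_power(0.05, 1.0, 10)` illustrate the article's point: over a decade, a weak trend buried in year-to-year noise is detected only rarely.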
Hacke, P.; Spataru, S.
2014-08-01
We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. Activation energy for the mean time to failure (1% relative) of 0.85 eV was determined and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
Perles, Stephanie J.; Wagner, Tyler; Irwin, Brian J.; Manning, Douglas R.; Callahan, Kristina K.; Marshall, Matthew R.
2014-01-01
Forests are socioeconomically and ecologically important ecosystems that are exposed to a variety of natural and anthropogenic stressors. As such, monitoring forest condition and detecting temporal changes therein remain critical to sound public and private forestland management. The National Park Service's Vital Signs monitoring program collects information on many forest health indicators, including species richness, cover by exotics, browse pressure, and forest regeneration. We applied a mixed-model approach to partition variability in data for 30 forest health indicators collected from several national parks in the eastern United States. We then used the estimated variance components in a simulation model to evaluate trend detection capabilities for each indicator. We investigated the extent to which the following factors affected the ability to detect trends: (a) sample design: using simple panel versus connected panel design, (b) effect size: increasing trend magnitude, (c) sample size: varying the number of plots sampled each year, and (d) stratified sampling: post-stratifying plots into vegetation domains. Statistical power varied among indicators; however, indicators that measured the proportion of a total yielded higher power when compared to indicators that measured absolute or average values. In addition, the total variability for an indicator appeared to influence power to detect temporal trends more than how total variance was partitioned among spatial and temporal sources. Based on these analyses and the monitoring objectives of the Vital Signs program, the current sampling design is likely overly intensive for detecting a 5% trend·year⁻¹ for all indicators and is appropriate for detecting a 1% trend·year⁻¹ in most indicators.
Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz
2015-03-01
FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples, or have examined only one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10 mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power.
Inter-replicate variance and statistical power of electrofishing data from low gradient streams
Paller, M.H.
1994-03-01
Electrofishing data from streams on the coastal plain of South Carolina were used to assess the relationship between inter-replicate variance and the mean for catch per unit effort expressed as number of fish per 100 m² of stream surface area. Variance and mean were strongly and positively related. This relationship was unaffected by type and quantity of instream structure and was consistent among schooling and nonschooling species. Computations of statistical power indicated that large numbers of samples (>25) were required to detect small (<20%) differences among means and that the required number of samples was greater when catch per unit effort was low. Inter-replicate variance increased as the reach length of individual replicates decreased, with the greatest increases occurring at replicate reach lengths under 60 m. Inter-replicate variance can be decreased by increasing sampling effort (i.e., multiple passes) and by employing replicate reaches that are long. However, the most cost-effective approach may be to employ single passes and replicates of moderate size.
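Sample-size requirements like the ">25 samples to detect a <20% difference" figure above follow from the usual normal-approximation two-sample formula; a rough sketch (function name and numbers are illustrative, not the paper's):

```python
import math
from statistics import NormalDist

def replicates_needed(mean, var, rel_diff, alpha=0.05, power=0.8):
    """Approximate replicates per group to detect a relative difference
    `rel_diff` between two means, given sampling variance `var`
    (two-sided two-sample normal-approximation formula)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value
    z_b = NormalDist().inv_cdf(power)          # power quantile
    delta = rel_diff * mean                    # absolute difference to detect
    return math.ceil(2 * (z_a + z_b) ** 2 * var / delta ** 2)
```

With a variance as large as the mean squared (CV ≈ 1), as is common for catch-per-unit-effort data, the formula returns sample sizes in the hundreds, consistent with the abstract's conclusion that small differences are hard to detect.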
Socol, Yehoshua; Dobrzyński, Ludwik
2015-01-01
The atomic bomb survivors life-span study (LSS) is often claimed to support the linear no-threshold hypothesis (LNTH) of radiation carcinogenesis. This paper shows that this claim is baseless. The LSS data are equally or better described by an s-shaped dependence on radiation exposure with a threshold of about 0.3 Sievert (Sv) and saturation level at about 1.5 Sv. A Monte-Carlo simulation of possible LSS outcomes demonstrates that, given the weak statistical power, LSS cannot provide support for LNTH. Even if the LNTH is used at low dose and dose rates, its estimation of excess cancer mortality should be communicated as 2.5% per Sv, i.e., an increase of cancer mortality from about 20% spontaneous mortality to about 22.5% per Sv, which is about half of the usually cited value. The impact of the "neutron discrepancy problem" - the apparent difference between the calculated and measured values of neutron flux in Hiroshima - was studied and found to be marginal. Major revision of the radiation risk assessment paradigm is required.
A Powerful Statistical Method for Identifying Differentially Methylated Markers in Complex Diseases
Ahn, Surin; Wang, Tao
2013-01-01
DNA methylation is an important epigenetic modification that regulates transcriptional expression and plays an important role in complex diseases, such as cancer. Genome-wide methylation patterns have unique features and hence require the development of new analytic approaches. One important feature is that methylation levels in disease tissues often differ from those in normal tissues with respect to both average and variability. In this paper, we propose a new score test to identify methylation markers of disease. This approach simultaneously utilizes information from the first and second moments of methylation distribution to improve statistical efficiency. Because the proposed score test is derived from a generalized regression model, it can be used for analyzing both categorical and continuous disease phenotypes, and for adjusting for covariates. We evaluate the performance of the proposed method and compare it to other tests including the most commonly-used t-test through simulations. The simulation results show that the validity of the proposed method is robust to departures from the normal assumption of methylation levels and can be substantially more powerful than the t-test in the presence of heterogeneity of methylation variability between disease and normal tissues. We demonstrate our approach by analyzing the methylation dataset of an ovarian cancer study and identify novel methylation loci not identified by the t-test. PMID:23424113
21 CFR 1404.900 - Adequate evidence.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 9 2012-04-01 2012-04-01 false Adequate evidence. 1404.900 Section 1404.900 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL POLICY GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 1404.900 Adequate evidence. Adequate evidence means information sufficient...
21 CFR 1404.900 - Adequate evidence.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Adequate evidence. 1404.900 Section 1404.900 Food and Drugs OFFICE OF NATIONAL DRUG CONTROL POLICY GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 1404.900 Adequate evidence. Adequate evidence means information sufficient...
Colegrave, Nick; Ruxton, Graeme D
2017-03-29
A common approach to the analysis of experimental data across much of the biological sciences is test-qualified pooling. Here non-significant terms are dropped from a statistical model, effectively pooling the variation associated with each removed term with the error term used to test hypotheses (or estimate effect sizes). This pooling is only carried out if statistical testing on the basis of applying that data to a previous more complicated model provides motivation for this model simplification; hence the pooling is test-qualified. In pooling, the researcher increases the degrees of freedom of the error term with the aim of increasing statistical power to test their hypotheses of interest. Despite this approach being widely adopted and explicitly recommended by some of the most widely cited statistical textbooks aimed at biologists, here we argue that (except in highly specialized circumstances that we can identify) the hoped-for improvement in statistical power will be small or non-existent, and there is likely to be much reduced reliability of the statistical procedures through deviation of type I error rates from nominal levels. We thus call for greatly reduced use of test-qualified pooling across experimental biology, more careful justification of any use that continues, and a different philosophy for initial selection of statistical models in the light of this change in procedure.
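The type I error inflation from test-qualified procedures can be illustrated with a simpler cousin of the setting above: pooling two sample variances only when a preliminary variance-ratio test is non-significant. This is an analogous two-stage procedure, not the authors' model-simplification setting, and all parameter values are illustrative:

```python
import numpy as np
from scipy import stats

def type1_rate(n1=5, n2=20, sd1=3.0, sd2=1.0, n_sim=4000, seed=1):
    """Empirical type I error of a test-qualified procedure: use a pooled
    t-test when a preliminary two-sided F-test on the variances is
    non-significant, otherwise a Welch t-test. Means are equal (H0 true)."""
    rng = np.random.default_rng(seed)
    rejects = 0
    for _ in range(n_sim):
        a = rng.normal(0.0, sd1, n1)
        b = rng.normal(0.0, sd2, n2)
        # preliminary two-sided F-test on the variance ratio
        f = np.var(a, ddof=1) / np.var(b, ddof=1)
        p_var = 2 * min(stats.f.cdf(f, n1 - 1, n2 - 1),
                        stats.f.sf(f, n1 - 1, n2 - 1))
        # pool only if the preliminary test is non-significant
        p = stats.ttest_ind(a, b, equal_var=(p_var > 0.05)).pvalue
        rejects += p < 0.05
    return rejects / n_sim
```

With unequal group sizes and the smaller group more variable, the simulated rejection rate under a true null noticeably exceeds the nominal 5%, echoing the paper's warning about deviation of type I error rates from nominal levels.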
Sofer, Tamar; Heller, Ruth; Bogomolov, Marina; Avery, Christy L; Graff, Mariaelisa; North, Kari E; Reiner, Alex P; Thornton, Timothy A; Rice, Kenneth; Benjamini, Yoav; Laurie, Cathy C; Kerr, Kathleen F
2017-01-15
In genome-wide association studies (GWAS), "generalization" is the replication of genotype-phenotype association in a population with different ancestry than the population in which it was first identified. Current practices for declaring generalizations rely on testing associations while controlling the family-wise error rate (FWER) in the discovery study, then separately controlling error measures in the follow-up study. This approach does not guarantee control over the FWER or false discovery rate (FDR) of the generalization null hypotheses. It also fails to leverage the two-stage design to increase power for detecting generalized associations. We provide a formal statistical framework for quantifying the evidence of generalization that accounts for the (in)consistency between the directions of associations in the discovery and follow-up studies. We develop the directional generalization FWER (FWERg) and FDR (FDRg) controlling r-values, which are used to declare associations as generalized. This framework extends to generalization testing when applied to a published list of single nucleotide polymorphism (SNP)-trait associations. Our methods control FWERg or FDRg under various SNP selection rules based on P-values in the discovery study. We find that it is often beneficial to use a more lenient P-value threshold than the genome-wide significance threshold. In a GWAS of total cholesterol in the Hispanic Community Health Study/Study of Latinos (HCHS/SOL), when testing all SNPs with P-values < 5×10⁻⁸ (15 genomic regions) for generalization in a large GWAS of whites, we generalized SNPs from 15 regions. But when testing all SNPs with P-values < 6.6×10⁻⁵ (89 regions), we generalized SNPs from 27 regions.
5 CFR 919.900 - Adequate evidence.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Adequate evidence. 919.900 Section 919.900 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) GOVERNMENTWIDE DEBARMENT AND SUSPENSION (NONPROCUREMENT) Definitions § 919.900 Adequate...
Sex differences in discriminative power of volleyball game-related statistics.
João, Paulo Vicente; Leite, Nuno; Mesquita, Isabel; Sampaio, Jaime
2010-12-01
To identify sex differences in volleyball game-related statistics, the game-related statistics of several World Championships in 2007 (N=132) were analyzed using the software VIS from the International Volleyball Federation. Discriminant analysis was used to identify the game-related statistics which best discriminated performances by sex. The analysis yielded an emphasis on fault serves (SC = -.40), shot spikes (SC = .40), and reception digs (SC = .31). Considerable variability was evident in the game-related statistics profiles: men's games were better characterized by terminal actions (errors of service), whereas women's games were characterized by continuous actions (in defense and attack). These differences may be related to the anthropometric and physiological differences between women and men and their influence on performance profiles.
NASA Astrophysics Data System (ADS)
Xu, Ding; Li, Qun
2017-01-01
This paper addresses the power allocation problem for cognitive radio (CR) based on hybrid-automatic-repeat-request (HARQ) with Chase combining (CC) in Nakagami-m slow fading channels. We assume that, instead of the perfect instantaneous channel state information (CSI), only the statistical CSI is available at the secondary user (SU) transmitter. The aim is to minimize the SU outage probability under the primary user (PU) interference outage constraint. Using the Lagrange multiplier method, an iterative and recursive algorithm is derived to obtain the optimal power allocation for each transmission round. Extensive numerical results are presented to illustrate the performance of the proposed algorithm.
ERIC Educational Resources Information Center
Spybrook, Jessaca; Hedges, Larry; Borenstein, Michael
2014-01-01
Research designs in which clusters are the unit of randomization are quite common in the social sciences. Given the multilevel nature of these studies, the power analyses for these studies are more complex than in a simple individually randomized trial. Tools are now available to help researchers conduct power analyses for cluster randomized…
Fraley, R. Chris; Vazire, Simine
2014-01-01
The authors evaluate the quality of research reported in major journals in social-personality psychology by ranking those journals with respect to their N-pact Factors (NF)—the statistical power of the empirical studies they publish to detect typical effect sizes. Power is a particularly important attribute for evaluating research quality because, relative to studies that have low power, studies that have high power are more likely to (a) provide accurate estimates of effects, (b) produce literatures with low false positive rates, and (c) lead to replicable findings. The authors show that the average sample size in social-personality research is 104 and that the power to detect the typical effect size in the field is approximately 50%. Moreover, they show that there is considerable variation among journals in sample sizes and power of the studies they publish, with some journals consistently publishing higher power studies than others. The authors hope that these rankings will be of use to authors who are choosing where to submit their best work, provide hiring and promotion committees with a superior way of quantifying journal quality, and encourage competition among journals to improve their NF rankings. PMID:25296159
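The ~50% power figure for N = 104 can be reproduced approximately with a Fisher-z calculation, assuming the typical effect size is a correlation of roughly r = .21 (an assumption here; the authors' exact procedure may differ):

```python
import math
from statistics import NormalDist

def corr_power(r, n, alpha=0.05):
    """Approximate power of a two-sided test of H0: rho = 0 using the
    Fisher z transformation (normal approximation)."""
    z = 0.5 * math.log((1 + r) / (1 - r))   # Fisher z of the true correlation
    se = 1 / math.sqrt(n - 3)               # standard error of z
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    nd = NormalDist()
    # probability of landing beyond either critical boundary
    return nd.cdf(z / se - z_crit) + nd.cdf(-z / se - z_crit)
```

For r = .21 and n = 104 this lands in the neighborhood of one-half, matching the abstract's headline figure.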
Dolan, T E; Lynch, P D; Karazsia, J L; Serafy, J E
2016-03-01
An expansion is underway of a nuclear power plant on the shoreline of Biscayne Bay, Florida, USA. While the precise effects of its construction and operation are unknown, impacts on surrounding marine habitats and biota are considered by experts to be likely. The objective of the present study was to determine the adequacy of an ongoing monitoring survey of fish communities associated with mangrove habitats directly adjacent to the power plant to detect fish community changes, should they occur, at three spatial scales. Using seasonally resolved data recorded during 532 fish surveys over an 8-year period, power analyses were performed for four mangrove fish metrics (fish diversity, fish density, and the occurrence of two ecologically important fish species: gray snapper (Lutjanus griseus) and goldspotted killifish (Floridichthys carpio). Results indicated that the monitoring program at current sampling intensity allows for detection of <33% changes in fish density and diversity metrics in both the wet and the dry season in the two larger study areas. Sampling effort was found to be insufficient in either season to detect changes at this level (<33%) in species-specific occurrence metrics for the two fish species examined. The option of supplementing ongoing, biological monitoring programs for improved, focused change detection deserves consideration from both ecological and cost-benefit perspectives.
NASA Astrophysics Data System (ADS)
Chung, Moo K.; Kim, Seung-Goo; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matthew J.; Davidson, Richard J.
2014-03-01
The sparse regression framework has been widely used in medical image processing and analysis. However, it has been rarely used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high frequency noise, only the first few terms are used in the expansion and higher frequency terms are simply thrown away. However, some lower frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we present a LB-based method to filter out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which reduces false negatives; hence the statistical power improves. The sparse shape model is then applied in investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power.
Blinking in quantum dots: The origin of the grey state and power law statistics
NASA Astrophysics Data System (ADS)
Ye, Mao; Searson, Peter C.
2011-09-01
Quantum dot (QD) blinking is characterized by switching between an “on” state and an “off” state, and a power-law distribution of on and off times with exponents from 1.0 to 2.0. The origin of blinking behavior in QDs, however, has remained a mystery. Here we describe an energy-band model for QDs that captures the full range of blinking behavior reported in the literature and provides new insight into features such as the gray state, the power-law distribution of on and off times, and the power-law exponents.
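Power-law exponents like those reported for on/off-time distributions are commonly estimated by maximum likelihood rather than log-log regression; a minimal continuous-case sketch (in the style of Clauset-type estimators; not the authors' code, and the function name is hypothetical):

```python
import math

def powerlaw_alpha_mle(times, t_min):
    """Continuous power-law exponent MLE:
    alpha = 1 + n / sum(ln(t / t_min)) over observations t >= t_min."""
    data = [t for t in times if t >= t_min]
    n = len(data)
    return 1 + n / sum(math.log(t / t_min) for t in data)
```

Feeding the estimator a sample drawn by inverse-CDF from a power law with exponent 1.5 recovers an exponent close to 1.5, within the 1.0-2.0 range quoted for blinking statistics.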
NASA Astrophysics Data System (ADS)
Białous, Małgorzata; Yunko, Vitalii; Bauch, Szymon; Ławniczak, Michał; Dietz, Barbara; Sirko, Leszek
2016-09-01
We present experimental studies of the power spectrum and other fluctuation properties in the spectra of microwave networks simulating chaotic quantum graphs with violated time reversal invariance. On the basis of our data sets, we demonstrate that the power spectrum in combination with other long-range and also short-range spectral fluctuations provides a powerful tool for the identification of the symmetries and the determination of the fraction of missing levels. Such a procedure is indispensable for the evaluation of the fluctuation properties in the spectra of real physical systems like, e.g., nuclei or molecules, where one has to deal with the problem of missing levels.
Methods of anaerobic power assessment (a statistical program for the IBM PC)
Francis, K
1987-02-01
Many sports activities that involve brief, high-intensity exercise tend to be anaerobic. To improve or optimize anaerobic performance, assessment of anaerobic power is necessary. A variety of tests have been developed, three of which are used widely; these are the Margaria step test, the Wingate cycle ergometer test, and the 50-yd dash. Estimates of anaerobic power requiring a minimal amount of equipment can be obtained using field tests such as the 50-yd dash. More quantitative measurements can be made using the Margaria step test or the Wingate cycle ergometer test. Although these tests have identifiable limitations, they are relatively easy to administer and require only a minimum of equipment. Because maximal anaerobic power is important in many common sports activities, its measurement should be considered with other routine assessments for optimizing performance.
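For the Margaria step test, power is computed as the rate of work against gravity over the timed stair climb, i.e. the standard P = mgh/t relation; a minimal sketch (the body-mass, rise, and time values in the test are made up for illustration):

```python
def margaria_power(mass_kg, vertical_height_m, time_s, g=9.81):
    """Anaerobic power in watts from the Margaria step test:
    P = m * g * h / t, where h is the vertical rise timed over t seconds."""
    return mass_kg * g * vertical_height_m / time_s
```

For example, a 70 kg athlete rising 1.05 m in 0.5 s produces about 1.4 kW of anaerobic power.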
Konstantopoulos, Spyros
2012-06-18
Field experiments with nested structures are becoming increasingly common, especially designs that randomly assign entire clusters, such as schools, to a treatment and a control group. In such large-scale cluster randomized studies the challenge is to obtain sufficient power of the test of the treatment effect. The objective is to maximize power without adding many clusters that make the study much more expensive. In this article I discuss how power estimates of tests of treatment effects in balanced cluster randomized designs are affected by covariates at different levels. I use third-grade data from Project STAR, a field experiment about class size, to demonstrate how covariates that explain a considerable proportion of variance in outcomes increase power significantly. When lower level covariates are group-mean centered and clustering effects are larger, top-level covariates increase power more than lower level covariates. In contrast, when clustering effects are smaller and lower level covariates are grand-mean centered or uncentered, lower level covariates increase power more than top-level covariates.
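The role of covariates at each level can be made concrete with the standard normal-approximation power formula for a balanced two-arm cluster randomized design, where level-specific R² values discount the between- and within-cluster variance components (a sketch in the spirit of such calculations, not the article's exact computations; all parameter values are illustrative):

```python
import math
from statistics import NormalDist

def cluster_power(delta, rho, J, n, R2_between=0.0, R2_within=0.0, alpha=0.05):
    """Approximate two-sided power for a balanced two-arm cluster randomized
    trial: standardized effect `delta`, ICC `rho`, J clusters in total,
    n units per cluster, covariate R^2 at each level."""
    # variance of the estimated standardized treatment effect
    var = 4 * (rho * (1 - R2_between) + (1 - rho) * (1 - R2_within) / n) / J
    ncp = delta / math.sqrt(var)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    nd = NormalDist()
    return nd.cdf(ncp - z_crit) + nd.cdf(-ncp - z_crit)
```

Raising `R2_between` (a cluster-level covariate that soaks up between-cluster variance) boosts power markedly when the ICC is sizable, which is the qualitative pattern the article reports.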
Sensitivity of neutrinos to the supernova turbulence power spectrum: Point source statistics
Kneller, James P.; Kabadi, Neel V.
2015-07-16
The neutrinos emitted from the proto-neutron star created in a core-collapse supernova must run through a significant amount of turbulence before exiting the star. Turbulence can modify the flavor evolution of the neutrinos imprinting itself upon the signal detected here at Earth. The turbulence effect upon individual neutrinos, and the correlation between pairs of neutrinos, might exhibit sensitivity to the power spectrum of the turbulence, and recent analysis of the turbulence in a two-dimensional hydrodynamical simulation of a core-collapse supernova indicates the power spectrum may not be the Kolmogorov 5 /3 inverse power law as has been previously assumed. In this paper we study the effect of non-Kolmogorov turbulence power spectra upon neutrinos from a point source as a function of neutrino energy and turbulence amplitude at a fixed postbounce epoch. We find the two effects of turbulence upon the neutrinos—the distorted phase effect and the stimulated transitions—both possess strong and weak limits in which dependence upon the power spectrum is absent or evident, respectively. Furthermore, since neutrinos of a given energy will exhibit these two effects at different epochs of the supernova each with evolving strength, we find there is sensitivity to the power spectrum present in the neutrino burst signal from a Galactic supernova.
A statistical framework for genetic association studies of power curves in bird flight
Lin, Min; Zhao, Wei
2006-01-01
How the power required for bird flight varies as a function of forward speed can be used to predict the flight style and behavioral strategy of a bird for feeding and migration. A U-shaped relationship between power and flight velocity has been observed in many birds, consistent with the theoretical predictions of aerodynamic models. In this article, we present a general genetic model for fine mapping of quantitative trait loci (QTL) responsible for power curves in a sample of birds drawn from a natural population. This model is developed within the maximum likelihood context, implemented with the EM algorithm for estimating the population genetic parameters of QTL and the simplex algorithm for estimating the QTL genotype-specific parameters of power curves. Using Monte Carlo simulation derived from empirical observations of power curves in the European starling (Sturnus vulgaris), we demonstrate how the underlying QTL for power curves can be detected from molecular markers and how the QTL detected affect the most appropriate flight speeds used to design an optimal migration strategy. The results from our model can be directly integrated into a conceptual framework for understanding flight origin and evolution. PMID:17066123
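The U-shaped dependence of flight power on speed is conventionally modeled as induced power falling like 1/v plus parasite/profile power rising like v³; a toy sketch of that aerodynamic form (the coefficients here are arbitrary, not starling data):

```python
def flight_power(v, a=100.0, b=0.01):
    """Idealized U-shaped flight power curve:
    induced power ~ a / v, parasite power ~ b * v**3."""
    return a / v + b * v ** 3

def min_power_speed(a=100.0, b=0.01):
    """Speed minimizing P(v): setting dP/dv = -a/v^2 + 3*b*v^2 = 0
    gives v = (a / (3*b)) ** 0.25."""
    return (a / (3 * b)) ** 0.25
```

Genotype-specific curves in the paper's setting would correspond to different (a, b) pairs, each with its own minimum-power and maximum-range speeds.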
NASA Technical Reports Server (NTRS)
Smith, Wayne Farrior
1973-01-01
The effect of finite source size on the power statistics in a reverberant room for pure tone excitation was investigated. Theoretical results indicate that the standard deviation of low frequency, pure tone finite sources is always less than that predicted by point source theory and considerably less when the source dimension approaches one-half an acoustic wavelength or greater. A supporting experimental study was conducted utilizing an eight inch loudspeaker and a 30 inch loudspeaker at eleven source positions. The resulting standard deviation of sound power output of the smaller speaker is in excellent agreement with both the derived finite source theory and existing point source theory, if the theoretical data is adjusted to account for experimental incomplete spatial averaging. However, the standard deviation of sound power output of the larger speaker is measurably lower than point source theory indicates, but is in good agreement with the finite source theory.
NASA Astrophysics Data System (ADS)
Woolley, Thomas W.; Dawson, George O.
It has been two decades since the first power analysis of a psychological journal and 10 years since the Journal of Research in Science Teaching made its contribution to this debate. One purpose of this article is to investigate what power-related changes, if any, have occurred in science education research over the past decade as a result of the earlier survey. In addition, previous recommendations are expanded and expounded upon within the context of more recent work in this area. The absence of any consistent mode of presenting statistical results is reported, along with little change on power-related issues. Guidelines for reporting the minimal amount of information needed for clear and independent evaluation of research results by readers are also proposed.
ERIC Educational Resources Information Center
Porter, Kristin E.
2016-01-01
In education research and in many other fields, researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple…
ERIC Educational Resources Information Center
Endress, Ansgar D.; Mehler, Jacques
2009-01-01
Word-segmentation, that is, the extraction of words from fluent speech, is one of the first problems language learners have to master. It is generally believed that statistical processes, in particular those tracking "transitional probabilities" (TPs), are important to word-segmentation. However, there is evidence that word forms are stored in…
ERIC Educational Resources Information Center
Groth, Randall E.
2013-01-01
A hypothetical framework to characterize statistical knowledge for teaching (SKT) is described. Empirical grounding for the framework is provided by artifacts from an undergraduate course for prospective teachers that concentrated on the development of SKT. The theoretical notion of "key developmental understanding" (KDU) is used to identify…
Statistical tests with accurate size and power for balanced linear mixed models.
Muller, Keith E; Edwards, Lloyd J; Simpson, Sean L; Taylor, Douglas J
2007-08-30
The convenience of linear mixed models for Gaussian data has led to their widespread use. Unfortunately, standard mixed model tests often have greatly inflated test size in small samples. Many applications with correlated outcomes in medical imaging and other fields have simple properties which do not require the generality of a mixed model. Alternatively, stating the special cases as a general linear multivariate model allows analysing them with either the univariate or multivariate approach to repeated measures (UNIREP, MULTIREP). Even in small samples, an appropriate UNIREP or MULTIREP test always controls test size and has a good power approximation, in sharp contrast to mixed model tests. Hence, mixed model tests should never be used when one of the UNIREP tests (uncorrected, Huynh-Feldt, Geisser-Greenhouse, Box conservative) or MULTIREP tests (Wilks, Hotelling-Lawley, Roy's, Pillai-Bartlett) apply. Convenient methods give exact power for the uncorrected and Box conservative tests. Simulations demonstrate that new power approximations for all four UNIREP tests eliminate most inaccuracy in existing methods. In turn, free software implements the approximations to give a better choice of sample size. Two repeated measures power analyses illustrate the methods. The examples highlight the advantages of examining the entire response surface of power as a function of sample size, mean differences, and variability.
Statistical Characterization of Solar Photovoltaic Power Variability at Small Timescales: Preprint
Shedd, S.; Hodge, B.-M.; Florita, A.; Orwig, K.
2012-08-01
Integrating large amounts of variable and uncertain solar photovoltaic power into the electricity grid is a growing concern for power system operators in a number of different regions. Power system operators typically accommodate variability, whether from load, wind, or solar, by carrying reserves that can quickly change their output to match changes in the solar resource. At timescales in the seconds-to-minutes range, this is known as regulation reserve. Previous studies have shown that increasing the geographic diversity of solar resources can reduce the short-term variability of the power output. As the price of solar has decreased, very large PV plants (greater than 10 MW) have become more common. These plants present an interesting case because they are large enough to exhibit some spatial smoothing by themselves. This work examines the variability of solar PV output among different arrays in a large (~50 MW) PV plant in the western United States, including the correlation in power output changes between different arrays, as well as the aggregated plant output, at timescales ranging from one second to five minutes.
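The timescale-dependent variability such a study quantifies can be computed from a power time series as the spread of step changes over each timescale. A toy sketch, with a synthetic random walk standing in for measured plant telemetry (all numbers are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-second plant output (MW): a random walk around 40 MW stands in
# for telemetry from one array of a hypothetical ~50 MW plant (one day of data).
power = 40.0 + np.cumsum(rng.normal(0.0, 0.02, size=86_400))

def ramp_std(p, step):
    """Standard deviation of power changes over a timescale of `step` samples."""
    deltas = p[step:] - p[:-step]
    return deltas.std()

# Variability grows with timescale; spatial aggregation across arrays lowers it.
for step in (1, 60, 300):  # 1 s, 1 min, 5 min
    print(step, ramp_std(power, step))
```

Applying the same statistic to the sum of several arrays' outputs, versus a single array, is one way to exhibit the spatial smoothing effect the abstract describes.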
Discriminatory power of water polo game-related statistics at the 2008 Olympic Games.
Escalante, Yolanda; Saavedra, Jose M; Mansilla, Mirella; Tella, Victor
2011-02-01
The aims of this study were (1) to compare water polo game-related statistics by context (winning and losing teams) and sex (men and women), and (2) to identify characteristics discriminating the performances for each sex. The game-related statistics of the 64 matches (44 men's and 20 women's) played in the final phase of the Olympic Games held in Beijing in 2008 were analysed. Unpaired t-tests compared winners and losers and men and women, and confidence intervals and effect sizes of the differences were calculated. The results were subjected to a discriminant analysis to identify the differentiating game-related statistics of the winning and losing teams. The results showed the differences between winning and losing men's teams to be in both defence and offence, whereas in women's teams they were only in offence. In men's games, passing (assists), aggressive play (exclusions), centre position effectiveness (centre shots), and goalkeeper defence (goalkeeper-blocked 5-m shots) predominated, whereas in women's games the play was more dynamic (possessions). The variable that most discriminated performance in men was goalkeeper-blocked shots, and in women shooting effectiveness (shots). These results should help coaches when planning training and competition.
NASA Astrophysics Data System (ADS)
Kato, Takeyoshi; Minagata, Atsushi; Suzuoki, Yasuo
This paper discusses the influence of the mass installation of home co-generation systems (H-CGS) using polymer electrolyte fuel cells (PEFC) on the voltage profile of a power distribution system in a residential area. The influence of H-CGS is compared with that of photovoltaic power generation systems (PV systems). The operation pattern of H-CGS is assumed based on the electricity and hot-water demand observed in 10 households over a year. The main results are as follows. With clustered H-CGS, the voltage of each bus is higher by about 1-3% compared with the conventional system without any distributed generators. Because H-CGS tends to increase its output during the early evening, it helps recover the voltage drop at that time, resulting in smaller voltage variation in the distribution system throughout the day. Because of the small rated power output of about 1 kW, the influence on the voltage profile of clustered H-CGS is smaller than that of clustered PV systems. The highest voltage during the daytime is not as high as in a distribution system with clustered PV systems, even if reverse power flow from the H-CGS is allowed.
ERIC Educational Resources Information Center
Konstantopoulos, Spyros
2012-01-01
Field experiments with nested structures are becoming increasingly common, especially designs that randomly assign entire clusters, such as schools, to a treatment and a control group. In such large-scale cluster randomized studies, the challenge is to obtain sufficient power for the test of the treatment effect. The objective is to maximize power…
Prasifka, J R; Hellmich, R L; Dively, G P; Higgins, L S; Dixon, P M; Duan, J J
2008-02-01
One of the possible adverse effects of transgenic insecticidal crops is the unintended decline in the abundance of nontarget arthropods. Field trials designed to evaluate potential nontarget effects can be more complex than expected because decisions to conduct field trials and the selection of taxa to include are not always guided by the results of laboratory tests. Also, recent studies emphasize the potential for indirect effects (adverse impacts to nontarget arthropods without feeding directly on plant tissues), which are difficult to predict because of interactions among nontarget arthropods, target pests, and transgenic crops. As a consequence, field studies may attempt to monitor expansive lists of arthropod taxa, making the design of such broad studies more difficult and reducing the likelihood of detecting any negative effects that might be present. To improve the taxonomic focus and statistical rigor of future studies, existing field data and corresponding power analysis may provide useful guidance. Analysis of control data from several nontarget field trials using repeated-measures designs suggests that while detection of small effects may require considerable increases in replication, there are taxa from different ecological roles that are sampled effectively using standard methods. The use of statistical power to guide selection of taxa for nontarget trials reflects scientists' inability to predict the complex interactions among arthropod taxa, particularly when laboratory trials fail to provide guidance on which groups are more likely to be affected. However, scientists still may exercise judgment, including taxa that are not included in or supported by power analyses.
Jacobs, Kevin B; Yeager, Meredith; Wacholder, Sholom; Craig, David; Kraft, Peter; Hunter, David J; Paschal, Justin; Manolio, Teri A; Tucker, Margaret; Hoover, Robert N; Thomas, Gilles D; Chanock, Stephen J; Chatterjee, Nilanjan
2009-11-01
Aggregate results from genome-wide association studies (GWAS), such as genotype frequencies for cases and controls, were until recently often made available on public websites because they were thought to disclose negligible information concerning an individual's participation in a study. Homer et al. recently suggested that a method for forensic detection of an individual's contribution to an admixed DNA sample could be applied to aggregate GWAS data. Using a likelihood-based statistical framework, we developed an improved statistic that uses genotype frequencies and individual genotypes to infer whether a specific individual or any close relatives participated in the GWAS and, if so, what the participant's phenotype status is. Our statistic compares the logarithm of genotype frequencies, in contrast to that of Homer et al., which is based on differences in either SNP probe intensity or allele frequencies. We derive the theoretical power of our test statistics and explore the empirical performance in scenarios with varying numbers of randomly chosen or top-associated SNPs.
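The core idea of the statistic, comparing the logarithm of an individual's genotype frequencies in the study sample versus a reference population, can be sketched as follows. All frequencies here are synthetic and the function is a simplification of the idea, not the authors' exact statistic:

```python
import numpy as np

rng = np.random.default_rng(1)
n_snps = 1000

# Hypothetical per-SNP genotype frequencies (AA, Aa, aa) for the study sample
# and for a reference population; the small perturbation mimics the subtle
# case-group differences that make the inference possible.
ref = rng.dirichlet([10.0, 10.0, 10.0], size=n_snps)
study = np.clip(ref + rng.normal(0.0, 0.01, size=ref.shape), 1e-6, None)
study /= study.sum(axis=1, keepdims=True)

def sample_genotypes(freq):
    """Draw one genotype (coded 0, 1, 2) per SNP from per-SNP frequencies."""
    u = rng.random(len(freq))
    return (u[:, None] > freq.cumsum(axis=1)).sum(axis=1)

def membership_statistic(g, study_freq, ref_freq):
    """Simplified log-frequency-ratio statistic in the spirit of the paper:
    large positive values suggest the individual (or a close relative)
    contributed to the study's aggregate frequencies."""
    idx = np.arange(len(g))
    return np.log(study_freq[idx, g] / ref_freq[idx, g]).sum()
```

Sampling genotypes from the study frequencies (an "insider") versus the reference frequencies (an outsider) and comparing the statistic illustrates the separation the authors quantify analytically.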
Statistical power of latent growth curve models to detect quadratic growth.
Diallo, Thierno M O; Morin, Alexandre J S; Parker, Philip D
2014-06-01
Latent curve models (LCMs) have been used extensively to analyze longitudinal data. However, little is known about the power of LCMs to detect nonlinear trends when they are present in the data. For this study, we utilized simulated data to investigate the power of LCMs to detect the mean of the quadratic slope, Type I error rates, and rates of nonconvergence during the estimation of quadratic LCMs. Five factors were examined: the number of time points, growth magnitude, interindividual variability, sample size, and the R²s of the measured variables. The results showed that the empirical Type I error rates were close to the nominal value of 5%. The empirical power to detect the mean of the quadratic slope was affected by the simulation factors. Finally, a substantial proportion of samples failed to converge under conditions of no to small variation in the quadratic factor, small sample sizes, and small R² of the repeated measures. In general, we recommend that quadratic LCMs be based on samples of (a) at least 250 but ideally 400, when four measurement points are available; (b) at least 100 but ideally 150, when six measurement points are available; (c) at least 50 but ideally 100, when ten measurement points are available.
Non-detection of a statistically anisotropic power spectrum in large-scale structure
Pullen, Anthony R.; Hirata, Christopher M. E-mail: chirata@tapir.caltech.edu
2010-05-01
We search a sample of photometric luminous red galaxies (LRGs) measured by the Sloan Digital Sky Survey (SDSS) for a quadrupolar anisotropy in the primordial power spectrum, in which the power spectrum P(k) is an isotropic spectrum P̄(k) multiplied by a quadrupolar modulation pattern. We first place limits on the 5 coefficients of a general quadrupole anisotropy. We also consider axisymmetric quadrupoles of the form P(k) = P̄(k){1 + g*[(k̂·n̂)² − 1/3]}, where k̂ is the unit wavevector and n̂ is the axis of the anisotropy. When we force the symmetry axis n̂ to be in the direction (l,b) = (94°, 26°) identified in the recent Groeneboom et al. analysis of the cosmic microwave background, we find g* = 0.006 ± 0.036 (1σ). With uniform priors on n̂ and g* we find that −0.41 < g* < +0.38 with 95% probability, with the wide range due mainly to the large uncertainty of asymmetries aligned with the Galactic plane. In none of these three analyses do we detect evidence for quadrupolar power anisotropy in large-scale structure.
On Statistics of Log-Ratio of Arithmetic Mean to Geometric Mean for Nakagami-m Fading Power
NASA Astrophysics Data System (ADS)
Wang, Ning; Cheng, Julian; Tellambura, Chintha
To assess the performance of maximum-likelihood (ML) based Nakagami-m parameter estimators, current methods rely on Monte Carlo simulation. In order to enable the analytical performance evaluation of ML-based m parameter estimators, we study the statistical properties of a parameter Δ, defined as the log-ratio of the arithmetic mean to the geometric mean of Nakagami-m fading power. Closed-form expressions are derived for the probability density function (PDF) of Δ. It is found that for large sample sizes, the PDF of Δ can be well approximated by a two-parameter Gamma PDF.
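Because Nakagami-m fading power is Gamma-distributed with shape m, the Δ statistic doubles as the input to a closed-form approximation of the ML shape estimate. The sketch below uses Thom's formula, a standard Gamma-distribution result rather than anything specific to this paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Nakagami-m fading power is Gamma-distributed with shape m (unit mean power).
m_true = 2.0
power = rng.gamma(shape=m_true, scale=1.0 / m_true, size=100_000)

# Delta: log-ratio of the arithmetic mean to the geometric mean of the power.
delta = np.log(power.mean()) - np.log(power).mean()

# Thom's closed-form approximation to the ML Gamma-shape estimate (a standard
# Gamma result, used here in place of the paper's own analysis):
m_hat = (3.0 - delta + np.sqrt((delta - 3.0) ** 2 + 24.0 * delta)) / (12.0 * delta)
print(delta, m_hat)  # m_hat should be close to m_true = 2.0
```

Repeating this over many smaller samples gives the Monte Carlo spread of the m estimator that the paper's closed-form PDF of Δ replaces analytically.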
Asbestos/NESHAP adequately wet guidance
Shafer, R.; Throwe, S.; Salgado, O.; Garlow, C.; Hoerath, E.
1990-12-01
The Asbestos NESHAP requires facility owners and/or operators involved in demolition and renovation activities to control emissions of particulate asbestos to the outside air, because no safe concentration of airborne asbestos has ever been established. The primary method used to control asbestos emissions is to adequately wet the Asbestos Containing Material (ACM) with a wetting agent prior to, during, and after demolition/renovation activities. The purpose of this document is to provide guidance to asbestos inspectors and the regulated community on how to determine whether friable ACM is adequately wet, as required by the Asbestos NESHAP.
Cuijpers, Pim
2016-05-01
More than 100 comparative outcome trials, directly comparing 2 or more psychotherapies for adult depression, have been published. We first examined whether these comparative trials had sufficient statistical power to detect clinically relevant differences between therapies, defined as an effect size of d = 0.24. Power calculations showed that a trial would need to include 548 patients to detect such an effect size. We selected 3 recent meta-analyses of psychotherapies for adult depression (cognitive behaviour therapy (CBT), interpersonal psychotherapy and non-directive counselling) and examined the number of patients included in the trials directly comparing other psychotherapies. The largest trial comparing CBT with another therapy included 178 patients and had enough power to detect a differential effect size of only d = 0.42. None of the trials in the 3 meta-analyses had enough power to detect effect sizes smaller than d = 0.34, though some came close to the threshold for detecting a clinically relevant effect size of d = 0.24. Meta-analyses may be able to solve the problem of the low power of individual trials. However, many of these studies have considerable risk of bias, and if we only focused on trials with low risk of bias, there would no longer be enough studies to detect clinically relevant effects. We conclude that individual trials are heavily underpowered and do not even come close to having sufficient power for detecting clinically relevant effect sizes. Despite the large number of trials, it is still not clear whether there are clinically relevant differences between these therapies.
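The sample sizes quoted above can be reproduced, up to the small t-versus-normal correction, with the standard two-group sample-size formula (80% power, two-sided α = 0.05 assumed, as is conventional):

```python
from math import ceil
from statistics import NormalDist

def total_n_two_groups(d, alpha=0.05, power=0.80):
    """Total sample size for a two-arm trial to detect a standardized mean
    difference d with a two-sided test (normal approximation; the exact
    t-based answer is slightly larger)."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)
    z_b = z.inv_cdf(power)
    n_per_group = 2 * ((z_a + z_b) / d) ** 2
    return 2 * ceil(n_per_group)

print(total_n_two_groups(0.24))  # 546, close to the 548 quoted above
print(total_n_two_groups(0.42))  # 178, matching the largest CBT trial
```

The inverse-square dependence on d is why halving the detectable effect size roughly quadruples the required trial.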
Statistical characteristics of the observed Ly-α forest and the shape of initial power spectrum
NASA Astrophysics Data System (ADS)
Demiański, M.; Doroshkevich, A. G.; Turchaninov, V.
2003-04-01
Properties of approximately 4500 observed Lyα absorbers are investigated using a model of the formation and evolution of dark matter (DM) structure elements based on the modified Zel'dovich theory. This model is generally consistent with simulations of absorber formation, describes the large-scale structure (LSS) observed in the galaxy distribution at small redshifts reasonably well, and emphasizes the generic similarity of the LSS and absorbers. The simple physical model of absorbers asserts that they are composed of DM and gaseous matter. It allows us to estimate the column density and overdensity of the DM and gaseous components and the entropy of the gas trapped within the DM potential wells. The parameters of the DM component are found to be consistent with theoretical expectations for Gaussian initial perturbations with a warm dark matter-like power spectrum. The basic physical factors responsible for the evolution of the absorbers are discussed. The analysis of the redshift distribution of absorbers confirms the self-consistency of the adopted physical model and the Gaussianity of the initial perturbations, and allows one to estimate the shape of the initial power spectrum at small scales, which, in turn, restricts the mass of the dominant fraction of DM particles to M_DM ≥ 1.5-5 keV. Our results indicate possible redshift variations of the intensity of the ultraviolet background by approximately a factor of 2-3 at redshifts z ≈ 2-3.
Funding the Formula Adequately in Oklahoma
ERIC Educational Resources Information Center
Hancock, Kenneth
2015-01-01
This report is a long-term simulation study that looks at how the ratio of state support to local support affects the number of school districts that break the common school funding formula, which in turn affects the equity of distribution to the common schools. After nearly two decades of adequately supporting the funding formula, Oklahoma…
Guilera, Georgina; Gómez-Benito, Juana; Hidalgo, Maria Dolores; Sánchez-Meca, Julio
2013-12-01
This article presents a meta-analysis of studies investigating the effectiveness of the Mantel-Haenszel (MH) procedure when used to detect differential item functioning (DIF). Studies were located electronically in the main databases, representing the codification of 3,774 different simulation conditions, 1,865 related to Type I error and 1,909 to statistical power. The homogeneity of effect-size distributions was assessed by the Q statistic. The extremely high heterogeneity in both error rates (I² = 94.70) and power (I² = 99.29), due to the fact that numerous studies test the procedure in extreme conditions, means that the main interest of the results lies in explaining the variability in detection rates. One-way analysis of variance was used to determine the effects of each variable on detection rates, showing that the MH test was more effective when purification procedures were used, when the data fitted the Rasch model, when test contamination was below 20%, and with sample sizes above 500. The results imply a series of recommendations for practitioners who wish to study DIF with the MH test. A limitation, one inherent to all meta-analyses, is that not all the possible moderator variables, or the levels of variables, have been explored. This serves to remind us of certain gaps in the scientific literature (i.e., regarding the direction of DIF or variances in ability distribution) and is an aspect that methodologists should consider in future simulation studies.
Wille, Anja; Gruissem, Wilhelm; Bühlmann, Peter; Hennig, Lars
2007-11-01
Accurately identifying differentially expressed genes from microarray data is not a trivial task, partly because of poor variance estimates of gene expression signals. Here, after analyzing 380 replicated microarray experiments, we found that probesets have typical, distinct variances that can be estimated based on a large number of microarray experiments. These probeset-specific variances depend at least in part on the function of the probed gene: genes for ribosomal or structural proteins often have a small variance, while genes implicated in stress responses often have large variances. We used these variance estimates to develop a statistical test for differentially expressed genes called EVE (external variance estimation). The EVE algorithm performs better than the t-test and LIMMA on some real-world data, where external information from appropriate databases is available. Thus, EVE helps to maximize the information gained from a typical microarray experiment. Nonetheless, only a large number of replicates will guarantee the identification of nearly all truly differentially expressed genes. However, our simulation studies suggest that even limited numbers of replicates will usually result in good coverage of strongly differentially expressed genes.
NASA Astrophysics Data System (ADS)
Dralle, D.; Karst, N.; Thompson, S. E.
2015-12-01
Multiple competing theories suggest that power law behavior governs the observed first-order dynamics of streamflow recessions - the important process by which catchments dry out via the stream network, altering the availability of surface water resources and in-stream habitat. Frequently modeled as dq/dt = -aq^b, recessions typically exhibit a high degree of variability, even within a single catchment, as revealed by significant shifts in the values of "a" and "b" across recession events. One potential source of this variability lies in underlying, hard-to-observe fluctuations in how catchment water storage is partitioned amongst distinct storage elements, each having different discharge behaviors. Testing this and competing hypotheses with widely available streamflow timeseries, however, has been hindered by a power law scaling artifact that obscures meaningful covariation between the recession parameters "a" and "b". Here we briefly outline a technique that removes this artifact, revealing intriguing new patterns in the joint distribution of recession parameters. Using long-term flow data from catchments in Northern California, we explore temporal variations, and find that the "a" parameter varies strongly with catchment wetness. Then we explore how the "b" parameter changes with "a", and find that measures of its variation are maximized at intermediate "a" values. We propose an interpretation of this pattern based on statistical mechanics, in which "b" can be viewed as an indicator of the catchment "microstate" - i.e. the partitioning of storage - and "a" as a measure of the catchment macrostate (i.e. the total storage). In statistical mechanics, entropy (i.e. microstate variance, that is the variance of "b") is maximized for intermediate values of extensive variables (i.e. wetness, "a"), as observed in the recession data. This interpretation of "a" and "b" was supported by model runs using a multiple-reservoir catchment toy model, and lends support to the
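The recession fit in question can be sketched by generating a synthetic recession from dq/dt = -aq^b and recovering "a" and "b" with a log-log regression of the flow decline against flow. This is the naive point-slope approach whose scaling artifact the study addresses; parameter values below are illustrative, not fitted to any real catchment:

```python
import numpy as np

# Generate a synthetic recession by stepping dq/dt = -a * q**b daily,
# with illustrative parameter values (not from any real catchment).
a_true, b_true = 0.05, 1.5
q = [10.0]
for _ in range(60):
    q.append(q[-1] - a_true * q[-1] ** b_true)
q = np.array(q)

# Naive point-slope fit: regress log(-dq/dt) on log(q) to recover a and b.
dq = np.diff(q)
b_hat, log_a_hat = np.polyfit(np.log(q[:-1]), np.log(-dq), 1)
a_hat = np.exp(log_a_hat)
print(a_hat, b_hat)  # recovers a_true and b_true on this noise-free series
```

On noise-free synthetic data the fit is exact; applied event-by-event to real hydrographs, the same procedure produces the spuriously correlated (a, b) clouds that motivate the artifact-removal technique.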
Fanson, Benjamin G.; Beckmann, Christa; Biro, Peter A.
2016-01-01
There is a long-standing interest in behavioural ecology, exploring the causes and correlates of consistent individual differences in mean behavioural traits ('personality') and the response to the environment ('plasticity'). Recently, it has been observed that individuals also consistently differ in their residual intraindividual variability (rIIV). This variation will probably have broad biological and methodological implications for the study of trait variation in labile traits, such as behaviour and physiology, though we currently need studies to quantify variation in rIIV using more standardized and powerful methodology. Focusing on activity rates in guppies (Poecilia reticulata), we provide a model example, from sampling design to data analysis, of how to quantify rIIV in labile traits. Building on the doubly hierarchical generalized linear model recently used to quantify individual differences in rIIV, we extend the model to evaluate the covariance between individual mean values and their rIIV. After accounting for time-related change in behaviour, our guppies substantially differed in rIIV, and it was the active individuals that tended to be more consistent (lower rIIV). We provide annotated data analysis code to implement these complex models, and discuss how to further generalize the model to evaluate covariances with other aspects of phenotypic variation. PMID:27853550
Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents and Accidents.
Wheatley, Spencer; Sovacool, Benjamin; Sornette, Didier
2017-01-01
We perform a statistical study of risk in nuclear energy systems. This study provides and analyzes a data set that is twice the size of the previous best data set on nuclear incidents and accidents, comparing three measures of severity: the industry standard International Nuclear Event Scale, the Nuclear Accident Magnitude Scale of radiation release, and cost in U.S. dollars. The rate of nuclear accidents with cost above 20 MM 2013 USD, per reactor per year, has decreased from the 1970s until the present time. Along the way, the rate dropped significantly after Chernobyl (April 1986) and is expected to be roughly stable around a level of 0.003, suggesting an average of just over one event per year across the current global fleet. The distribution of costs appears to have changed following the Three Mile Island major accident (March 1979). The median cost became approximately 3.5 times smaller, but an extremely heavy tail emerged, being well described by a Pareto distribution with parameter α = 0.5-0.6. For instance, the cost of the two largest events, Chernobyl and Fukushima (March 2011), is equal to nearly five times the sum of the 173 other events. We also document a significant runaway disaster regime in both radiation release and cost data, which we associate with the "dragon-king" phenomenon. Since the major accident at Fukushima (March 2011) occurred recently, we are unable to quantify an impact of the industry response to this disaster. Excluding such improvements, in terms of costs, our range of models suggests that there is presently a 50% chance that (i) a Fukushima event (or larger) occurs every 60-150 years, and (ii) that a Three Mile Island event (or larger) occurs every 10-20 years. Further, even assuming that it is no longer possible to suffer an event more costly than Chernobyl or Fukushima, the expected annual cost and its standard error bracket the cost of a new plant. This highlights the importance of improvements not only immediately following
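A Pareto tail index like the α ≈ 0.5-0.6 reported above is commonly estimated from the largest observations with a Hill estimator. The costs below are synthetic stand-ins, not the study's actual records:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic event costs (million USD) with a Pareto tail of index ~0.55,
# standing in for the 173-event incident data set analyzed in the paper.
alpha_true, c_min = 0.55, 20.0
costs = c_min * (1.0 - rng.random(173)) ** (-1.0 / alpha_true)

def hill_alpha(x, k):
    """Hill estimator of the Pareto tail index from the k largest observations."""
    xs = np.sort(x)[::-1]
    return k / np.sum(np.log(xs[:k] / xs[k]))

alpha_hat = hill_alpha(costs, k=50)
print(alpha_hat)
```

A tail index below 1 implies an infinite mean in the pure Pareto limit, which is consistent with the paper's observation that the two largest events dwarf the sum of the remaining 173.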
ERIC Educational Resources Information Center
Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.
2010-01-01
This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…
ERIC Educational Resources Information Center
Blair, R. Clifford; Higgins, James J.
1980-01-01
Monte Carlo techniques were used to compare the power of Wilcoxon's rank-sum test to the power of the two independent means t test for situations in which samples were drawn from (1) uniform, (2) Laplace, (3) half-normal, (4) exponential, (5) mixed-normal, and (6) mixed-uniform distributions. (Author/JKS)
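A comparable Monte Carlo power comparison is straightforward to set up. The sketch below uses Laplace errors, one of the six distributions listed, where the rank-sum test is expected to have the advantage; sample size, shift, and replication count are illustrative choices:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

def power_mc(sampler, shift, n=30, reps=1000, alpha=0.05):
    """Monte Carlo power of the two-sample t test and the Wilcoxon rank-sum
    (Mann-Whitney) test for a location shift between independent samples."""
    t_rej = w_rej = 0
    for _ in range(reps):
        x, y = sampler(n), sampler(n) + shift
        t_rej += stats.ttest_ind(x, y).pvalue < alpha
        w_rej += stats.mannwhitneyu(x, y).pvalue < alpha
    return t_rej / reps, w_rej / reps

# Heavy-tailed Laplace (double-exponential) errors:
laplace = lambda n: rng.laplace(0.0, 1.0, n)
t_pow, w_pow = power_mc(laplace, shift=0.8)
print(t_pow, w_pow)  # the rank-sum test should show higher power here
```

Swapping in a uniform sampler reverses the comparison, which is the pattern of distribution-dependent relative power the study maps out.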
Comnes, G.A.; Belden, T.N.; Kahn, E.P.
1995-02-01
The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been or are being developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.
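The levelized price used to compare contracts is the discounted sum of payments divided by the discounted sum of energy delivered. A minimal sketch, where the discount rate, plant size, and capacity factor are illustrative assumptions rather than the paper's conventions:

```python
def levelized_price(annual_costs, annual_energy_kwh, discount_rate=0.10):
    """Levelized price ($/kWh): present value of payments divided by present
    value of energy. Illustrative method; the paper's exact conventions differ."""
    pv_cost = sum(c / (1 + discount_rate) ** t
                  for t, c in enumerate(annual_costs, start=1))
    pv_energy = sum(e / (1 + discount_rate) ** t
                    for t, e in enumerate(annual_energy_kwh, start=1))
    return pv_cost / pv_energy

# 20-year contract: flat output from a hypothetical 100 MW plant at an 80%
# capacity factor, paid a flat 0.07 $/kWh each year.
years = 20
energy = [100_000 * 8760 * 0.8] * years   # kWh per year
costs = [0.07 * e for e in energy]        # annual payments in dollars
print(levelized_price(costs, energy))     # flat case recovers 0.07
```

With escalating payments or varying dispatch, the levelized figure departs from any single year's price, which is why a consistent levelization method is needed to compare contracts at a common capacity factor.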
Wang, Zhaojun; Zhang, Hanyi; Zhang, Xiaowen; Sun, Jie; Han, Cheng; Li, Chenyan; Li, Yongze; Teng, Xiaochun; Fan, Chenling; Liu, Aihua; Shan, Zhongyan; Liu, Chao; Weng, Jianping; Teng, Weiping
2016-11-01
The purpose of this study was to establish normal thyroglobulin (Tg) reference intervals (RIs) in regions with adequate and more than adequate iodine intake according to the National Academy of Clinical Biochemistry (NACB) guidelines and to investigate the relationships between Tg and other factors. A total of 1317 thyroid disease-free adult subjects (578 men, 739 nonpregnant women) from 2 cities (Guangzhou and Nanjing) were enrolled in this retrospective, observational study. Each subject completed a questionnaire and underwent physical and ultrasonic examination. Serum Tg, thyroid-stimulating hormone (TSH), thyroid peroxidase antibody (TPOAb), Tg antibody (TgAb), and urinary iodine concentration (UIC) were measured. Reference groups were established on the basis of TSH levels: 0.5 to 2.0 and 0.27 to 4.2 mIU/L. The Tg RIs for Guangzhou and Nanjing were 1.6 to 30.0 and 1.9 to 25.8 ng/mL, respectively. No significant differences in Tg were found between genders or among different reference groups. Stepwise linear regression analyses showed that TgAb, thyroid volume, goiter, gender, age, and TSH levels were correlated with Tg. In adults from regions with adequate and more than adequate iodine intake, we found that Tg may be a suitable marker of iodine status; gender-specific Tg RI was unnecessary; there was no difference between Tg RIs in regions with adequate and more than adequate iodine intake; and the TSH criterion for selecting the Tg reference population could follow the local TSH reference rather than 0.5 to 2.0 mIU/L.
Lombardo, Michael V; Auyeung, Bonnie; Holt, Rosemary J; Waldman, Jack; Ruigrok, Amber N V; Mooney, Natasha; Bullmore, Edward T; Baron-Cohen, Simon; Kundu, Prantik
2016-11-15
Functional magnetic resonance imaging (fMRI) research is routinely criticized for being statistically underpowered due to characteristically small sample sizes and much larger sample sizes are being increasingly recommended. Additionally, various sources of artifact inherent in fMRI data can have detrimental impact on effect size estimates and statistical power. Here we show how specific removal of non-BOLD artifacts can improve effect size estimation and statistical power in task-fMRI contexts, with particular application to the social-cognitive domain of mentalizing/theory of mind. Non-BOLD variability identification and removal is achieved in a biophysical and statistically principled manner by combining multi-echo fMRI acquisition and independent components analysis (ME-ICA). Without smoothing, group-level effect size estimates on two different mentalizing tasks were enhanced by ME-ICA at a median rate of 24% in regions canonically associated with mentalizing, while much more substantial boosts (40-149%) were observed in non-canonical cerebellar areas. Effect size boosting occurs via reduction of non-BOLD noise at the subject-level and consequent reductions in between-subject variance at the group-level. Smoothing can attenuate ME-ICA-related effect size improvements in certain circumstances. Power simulations demonstrate that ME-ICA-related effect size enhancements enable much higher-powered studies at traditional sample sizes. Cerebellar effects observed after applying ME-ICA may be unobservable with conventional imaging at traditional sample sizes. Thus, ME-ICA allows for principled design-agnostic non-BOLD artifact removal that can substantially improve effect size estimates and statistical power in task-fMRI contexts. ME-ICA could mitigate some issues regarding statistical power in fMRI studies and enable novel discovery of aspects of brain organization that are currently under-appreciated and not well understood.
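The power gain from a boosted effect size can be illustrated with a quick Monte Carlo sketch. This is not the paper's simulation pipeline; the 24% boost and n = 20 are round example numbers, and a normal critical value stands in for the exact t quantile:

```python
import math
import random
import statistics

def sim_power(effect_size, n, n_sims=2000, seed=1):
    """Monte Carlo power of a two-sided one-sample test at alpha ~= 0.05
    (normal critical value 1.96 used in place of the exact t quantile)."""
    random.seed(seed)
    hits = 0
    for _ in range(n_sims):
        xs = [random.gauss(effect_size, 1.0) for _ in range(n)]
        t = statistics.mean(xs) / (statistics.stdev(xs) / math.sqrt(n))
        if abs(t) > 1.96:
            hits += 1
    return hits / n_sims

p_base = sim_power(0.5, n=20)          # baseline standardized effect
p_boost = sim_power(0.5 * 1.24, n=20)  # same design after a 24% effect size boost
```

Even a modest effect size improvement translates into a substantially higher rejection rate at a fixed, traditional sample size, which is the mechanism the authors describe.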
ERIC Educational Resources Information Center
Tabor, Josh
2010-01-01
On the 2009 AP® Statistics Exam, students were asked to create a statistic to measure skewness in a distribution. This paper explores several of the most popular student responses and evaluates which statistic performs best when sampling from various skewed populations. (Contains 8 figures, 3 tables, and 4 footnotes.)
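Two statistics of the kind students might propose can be sketched as follows (the function names and data are illustrative, not taken from the exam responses):

```python
import statistics

def pearson_skew(xs):
    """Pearson's second skewness coefficient: 3 * (mean - median) / sd."""
    return 3 * (statistics.mean(xs) - statistics.median(xs)) / statistics.stdev(xs)

def moment_skew(xs):
    """Standardized third central moment (population version)."""
    m = statistics.mean(xs)
    s = statistics.pstdev(xs)
    return sum((x - m) ** 3 for x in xs) / (len(xs) * s ** 3)

# a small right-skewed sample: both statistics should come out positive
right_skewed = [1, 1, 2, 2, 3, 3, 4, 5, 8, 15]
```

Both measures are zero for a perfectly symmetric sample and positive for a right-skewed one; they differ in how strongly outliers in the tail dominate the value.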
Adequate mathematical modelling of environmental processes
NASA Astrophysics Data System (ADS)
Chashechkin, Yu. D.
2012-04-01
In environmental observations and laboratory visualizations, both large-scale flow components, such as currents, jets, vortices, and waves, and a fine structure are registered (different examples are given). Conventional mathematical modeling, both analytical and numerical, is directed mostly at the description of the energetically important flow components; the role of the fine structure still remains obscure. The variety of existing models makes it difficult to choose the most adequate one and to assess their mutual correspondence. The goal of the talk is to give a careful analysis of the kinematics and dynamics of flows. A distinction is drawn between the concept of "motion", as a transformation of a vector space into itself that conserves distances, and the concept of "flow", as displacement and rotation of deformable "fluid particles". The basic physical quantities of the flow, namely density, momentum, energy (entropy), and admixture concentration, are selected as the physical parameters defined by the fundamental set of equations, which includes the differential d'Alembert, Navier-Stokes, Fourier and/or Fick equations and a closing equation of state. All of them are observable and independent. Calculations of continuous Lie groups show that only the fundamental set is characterized by the ten-parameter Galilean group reflecting the basic principles of mechanics. The presented analysis demonstrates that conventionally used approximations dramatically change the symmetries of the governing sets of equations, which leads to their incompatibility or even degeneration. The fundamental set is analyzed taking into account the condition of compatibility. The high order of the set indicates a complex structure of the complete solutions, corresponding to the physical structure of real flows. Analytical solutions of a number of problems, including flows induced by diffusion on topography and generation of periodic internal waves by compact sources in weakly dissipative media, as well as numerical solutions of the same
NASA Astrophysics Data System (ADS)
Hoover, F. A.; Bowling, L. C.; Prokopy, L. S.
2015-12-01
Urban stormwater is an on-going management concern in municipalities of all sizes. In both combined and separated sewer systems, pollutants from stormwater runoff enter the natural waterway system during heavy rain events. Urban flooding during frequent and more intense storms is also a growing concern. Therefore, stormwater best-management practices (BMPs) are being implemented in efforts to reduce and manage stormwater pollution and overflow. The majority of BMP water quality studies focus on the small-scale, individual effects of a BMP and the change in water quality directly in the runoff from these installations. At the watershed scale, it is difficult to establish statistically whether or not these BMPs are making a difference in water quality, given that watershed-scale monitoring is often costly and time consuming, relying on significant sources of funds that a city may not have. Hence, there is a need to quantify the level of sampling needed to detect the water quality impact of BMPs at the watershed scale. In this study, a power analysis was performed on data from an urban watershed in Lafayette, Indiana, to determine the frequency of sampling required to detect a significant change in water quality measurements. Using the R platform, results indicate that detecting a significant change in watershed-level water quality would require hundreds of weekly measurements, even when improvement is present. The second part of this study investigates whether the difficulty in demonstrating water quality change represents a barrier to adoption of stormwater BMPs. Semi-structured interviews of community residents and organizations in Chicago, IL, are being used to investigate residents' understanding of water quality and best management practices and to identify their attitudes and perceptions towards stormwater BMPs. Second-round interviews will examine how information on uncertainty in water quality improvements influences their BMP attitudes and perceptions.
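The kind of power analysis described can be sketched with a simple before/after simulation. All numbers here are assumed (a 10% drop in a noisy concentration, unit variance); this is not the study's R code, and a z-test stands in for whatever test the authors used:

```python
import math
import random
import statistics

def detect_rate(n_weeks, before_mean, after_mean, sd, n_sims=1000, seed=7):
    """Fraction of simulated monitoring campaigns in which a two-sample
    z-test (two-sided, alpha ~= 0.05) detects the before/after change."""
    random.seed(seed)
    hits = 0
    for _ in range(n_sims):
        before = [random.gauss(before_mean, sd) for _ in range(n_weeks)]
        after = [random.gauss(after_mean, sd) for _ in range(n_weeks)]
        se = math.sqrt(statistics.variance(before) / n_weeks +
                       statistics.variance(after) / n_weeks)
        if abs(statistics.mean(before) - statistics.mean(after)) / se > 1.96:
            hits += 1
    return hits / n_sims

# a 10% improvement in a noisy pollutant concentration (units arbitrary)
power_half_year = detect_rate(26, 2.0, 1.8, sd=1.0)    # ~6 months of weekly samples
power_four_years = detect_rate(208, 2.0, 1.8, sd=1.0)  # ~4 years of weekly samples
```

With half a year of weekly data the improvement is detected only rarely; hundreds of weekly measurements are needed before power becomes respectable, consistent with the study's conclusion.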
NASA Astrophysics Data System (ADS)
Ruecker, Gernot; Leimbach, David; Guenther, Felix; Barradas, Carol; Hoffmann, Anja
2016-04-01
Fire Radiative Power (FRP) retrieved by infrared sensors, such as those flown on several polar-orbiting and geostationary satellites, has been shown to be proportional to fuel consumption rates in vegetation fires, and hence the total radiative energy released by a fire (Fire Radiative Energy, FRE) is proportional to the total amount of biomass burned. However, due to the sparse temporal coverage of polar-orbiting sensors and the coarse spatial resolution of geostationary sensors, it is difficult to estimate fuel consumption for single fire events. Here we explore an approach for estimating FRE through temporal integration of MODIS FRP retrievals over MODIS-derived burned areas. Temporal integration is aided by statistical modelling to estimate missing observations using a generalized additive model (GAM), taking advantage of additional information such as land cover and a global dataset of the Canadian Fire Weather Index (FWI), as well as diurnal and annual FRP fluctuation patterns. Based on results from study areas located in savannah regions of Southern and Eastern Africa and Brazil, we compare this method to estimates based on simple temporal integration of FRP retrievals over the fire lifetime, and estimate the potential variability of FRP integration results across a range of fire sizes. We compare FRE-based fuel consumption against a database of field experiments in similar landscapes. Results show that for larger fires this method yields realistic estimates and is more robust than simple temporal integration when only a small number of observations is available. Finally, we offer an outlook on the integration of data from other satellites, specifically FireBird, S-NPP VIIRS and Sentinel-3, as well as on using higher-resolution burned area datasets derived from Landsat and similar sensors.
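The simple temporal-integration baseline the authors compare against can be sketched as trapezoidal integration of FRP observations over the fire lifetime. The overpass times and FRP values below are invented, and the biomass conversion factor (~0.368 kg of dry fuel per MJ of FRE) is the value commonly used in the FRP literature, not necessarily the one used in this study:

```python
def fre_trapezoid(times_s, frp_mw):
    """Trapezoidal integration of FRP (MW) over time (s) -> FRE (MJ)."""
    fre = 0.0
    for i in range(1, len(times_s)):
        fre += 0.5 * (frp_mw[i - 1] + frp_mw[i]) * (times_s[i] - times_s[i - 1])
    return fre

# four hypothetical MODIS overpasses, 6 h apart, over one fire event
times = [0, 21600, 43200, 64800]  # seconds
frp = [50.0, 120.0, 80.0, 10.0]   # MW
fre_mj = fre_trapezoid(times, frp)
fuel_kg = 0.368 * fre_mj          # dry biomass consumed, kg (assumed literature factor)
```

With only a handful of overpasses, the trapezoid between observations can badly miss the true diurnal FRP peak, which is exactly the gap the GAM-based gap-filling in the abstract is meant to close.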
2014-01-01
Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304
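The qualitative behavior described (ANCOVA unbiased under baseline imbalance, while ANOVA and change-score analysis are pulled in opposite directions) can be reproduced in a single simulated trial. A minimal sketch with assumed parameters: 500 subjects per arm, pretest-posttest correlation 0.6, a baseline imbalance of +0.3 in the treatment arm, and a true effect of 0.5:

```python
import random
import statistics

random.seed(42)
n, rho, true_effect = 500, 0.6, 0.5

def simulate_arm(n, baseline_shift, effect):
    """Pre/post scores with corr(pre, post) = rho and unit variances."""
    pre = [random.gauss(baseline_shift, 1.0) for _ in range(n)]
    post = [rho * x + random.gauss(0.0, (1 - rho ** 2) ** 0.5) + effect
            for x in pre]
    return pre, post

pre_t, post_t = simulate_arm(n, 0.3, true_effect)  # treatment arm, imbalanced
pre_c, post_c = simulate_arm(n, 0.0, 0.0)          # control arm

# ANOVA: difference in post-test means (ignores baseline entirely)
anova_est = statistics.mean(post_t) - statistics.mean(post_c)

# Change-score analysis: difference in mean (post - pre)
csa_est = (statistics.mean(post_t) - statistics.mean(pre_t)) - \
          (statistics.mean(post_c) - statistics.mean(pre_c))

# ANCOVA: post difference adjusted by the pooled within-group slope on pretest
def centered(xs):
    m = statistics.mean(xs)
    return [x - m for x in xs]

cxt, cyt = centered(pre_t), centered(post_t)
cxc, cyc = centered(pre_c), centered(post_c)
slope = (sum(a * b for a, b in zip(cxt, cyt)) +
         sum(a * b for a, b in zip(cxc, cyc))) / \
        (sum(a * a for a in cxt) + sum(a * a for a in cxc))
ancova_est = anova_est - slope * (statistics.mean(pre_t) - statistics.mean(pre_c))
```

With a positive baseline imbalance and 0 < rho < 1, ANOVA overestimates and change-score analysis underestimates the effect, while the ANCOVA estimate stays near the true value, matching the paper's findings.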
NASA Astrophysics Data System (ADS)
Olson, Kyle David
A model is presented and confirmed experimentally that explains the anomalous behavior observed in the continuous wave (CW) excitation of thermally isolated optics. Very low-absorption, highly reflective optical thin-film coatings of HfO2 and SiO2 were prepared. When illuminated with a laser for 30 s, the coatings survived peak irradiances of 13 MW/cm². The temperature profile of the optical surfaces was measured using a calibrated thermal imaging camera; about the same peak temperatures were recorded regardless of spot size, which ranged between 500 µm and 5 mm. This phenomenon is explained by solving the heat diffusion equation for an optic of finite dimensions, including the non-idealities of the measurement. An analytical result is also derived showing the transition from millisecond pulses to CW, where the heating is proportional to the laser irradiance (W/m²) for millisecond pulses, and proportional to the beam radius (W/m) for CW. Contamination-induced laser breakdown is often viewed as random, and simple physical models are difficult to apply. Under continuous wave illumination conditions, failure appears to be induced by a runaway free-carrier absorption process. High-power laser illumination is absorbed by the contaminant particles or regions, which heat rapidly. Some of this heat transfers to the substrate, raising its temperature towards that of the vaporizing particle. This generates free carriers, causing more absorption and more heating. If a certain threshold concentration is created, the process becomes unstable, thermally heating the material to catastrophic breakdown. Contamination-induced breakdown is exponentially bandgap dependent, and this prediction is borne out in experimental data from TiO2, Ta2O5, HfO2, Al2O3, and SiO2. The spectral dependence of blackbody radiation and thermal photon noise is derived analytically for the first time as a function of spectra and mode density. An algorithm by which the analytical expression for the variance can
Flynn, Kevin; Swintek, Joe; Johnson, Rodney
2017-02-01
Because of various Congressional mandates to protect the environment from endocrine disrupting chemicals (EDCs), the United States Environmental Protection Agency (USEPA) initiated the Endocrine Disruptor Screening Program. In the context of this framework, the Office of Research and Development within the USEPA developed the Medaka Extended One Generation Reproduction Test (MEOGRT) to characterize the endocrine action of a suspected EDC. One important endpoint of the MEOGRT is fecundity of medaka breeding pairs. Power analyses were conducted to determine the number of replicates needed in proposed test designs and to determine the effects that varying reproductive parameters (e.g. mean fecundity, variance, and days with no egg production) would have on the statistical power of the test. The MEOGRT Reproduction Power Analysis Tool (MRPAT) is a software tool developed to expedite these power analyses by both calculating estimates of the needed reproductive parameters (e.g. population mean and variance) and performing the power analysis under user-specified scenarios. Example scenarios are detailed that highlight the importance of the reproductive parameters on statistical power. When control fecundity is increased from 21 to 38 eggs per pair per day and the variance decreased from 49 to 20, the gain in power is equivalent to increasing replication by 2.5 times. On the other hand, if 10% of the breeding pairs, including controls, do not spawn, the power to detect a 40% decrease in fecundity drops to 0.54 from nearly 0.98 when all pairs have some level of egg production. Perhaps most importantly, MRPAT was used to inform the decision-making process that led to the final recommendation of the MEOGRT to have 24 control breeding pairs and 12 breeding pairs in each exposure group.
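The kind of scenario MRPAT evaluates can be re-created with a hedged Monte Carlo sketch. A normal approximation and a one-sided z-test stand in for the tool's actual statistical model; the design numbers (24 control vs. 12 exposed pairs, 40% decrease, mean 21, variance 49, 10% non-spawning pairs) come from the abstract, everything else is assumed:

```python
import math
import random
import statistics

def fecundity_power(ctrl_mean, variance, decline, n_ctrl, n_trt,
                    p_zero=0.0, n_sims=1000, seed=3):
    """Monte Carlo power of a one-sided z-test for a fecundity decrease.
    p_zero = probability that a breeding pair produces no eggs at all."""
    random.seed(seed)
    sd = math.sqrt(variance)
    trt_mean = ctrl_mean * (1.0 - decline)

    def draw(mean):
        if random.random() < p_zero:
            return 0.0                       # non-spawning pair
        return max(0.0, random.gauss(mean, sd))

    hits = 0
    for _ in range(n_sims):
        ctrl = [draw(ctrl_mean) for _ in range(n_ctrl)]
        trt = [draw(trt_mean) for _ in range(n_trt)]
        se = math.sqrt(statistics.variance(ctrl) / n_ctrl +
                       statistics.variance(trt) / n_trt)
        if (statistics.mean(ctrl) - statistics.mean(trt)) / se > 1.645:
            hits += 1
    return hits / n_sims

p_all_spawn = fecundity_power(21, 49, 0.40, 24, 12)
p_ten_pct_nonspawning = fecundity_power(21, 49, 0.40, 24, 12, p_zero=0.10)
```

Even this crude approximation shows the abstract's qualitative point: a modest fraction of non-spawning pairs inflates the variance and noticeably erodes power.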
ERIC Educational Resources Information Center
Safarkhani, Maryam; Moerbeek, Mirjam
2013-01-01
In a randomized controlled trial, a decision needs to be made about the total number of subjects for adequate statistical power. One way to increase the power of a trial is by including a predictive covariate in the model. In this article, the effects of various covariate adjustment strategies on increasing the power is studied for discrete-time…
Dziak, John J; Lanza, Stephanie T; Tan, Xianming
2014-01-01
Selecting the number of different classes which will be assumed to exist in the population is an important step in latent class analysis (LCA). The bootstrap likelihood ratio test (BLRT) provides a data-driven way to evaluate the relative adequacy of a (K-1)-class model compared to a K-class model. However, very little is known about how to predict the power or the required sample size for the BLRT in LCA. Based on extensive Monte Carlo simulations, we provide practical effect size measures and power curves which can be used to predict power for the BLRT in LCA given a proposed sample size and a set of hypothesized population parameters. Estimated power curves and tables provide guidance for researchers wishing to size a study to have sufficient power to detect hypothesized underlying latent classes.
Gorynin, I.V.; Filatov, V.M.; Ignatov, V.A.; Timofeev, B.T.; Zvezdin, Yu. I.
1986-07-01
Data from plant radiographic inspection of reactor vessels and boilers in power plants of 440- and 1000-MW capacity were subjected to statistical analysis. The statistical analysis showed that, given the current technology for making and constructing 440- and 1000-MW power plants, the limiting defect size in the plant vessels is no more than 10% of the wall thickness. This finding makes it possible to increase the tolerable stresses by a factor of 1.6 compared to the current estimate of resistance to brittle fracture, which presumes the presence of a semielliptical surface crack of a depth corresponding to 25% of the wall thickness. The fracture resistance of the steel increases with a decrease in defect size and as a result of the damping capacity of the applied anticorrosive hardfacing.
NASA Astrophysics Data System (ADS)
Shergill, H.; Robinson, T. R.; Dhillon, R. S.; Lester, M.; Milan, S. E.; Yeoman, T. K.
2010-05-01
High-power electromagnetic waves can excite a variety of plasma instabilities in Earth's ionosphere. These lead to the growth of plasma waves and plasma density irregularities within the heated volume, including patches of small-scale field-aligned electron density irregularities. This paper reports a statistical study of intensity distributions in patches of these irregularities excited by the European Incoherent Scatter (EISCAT) heater during beam-sweeping experiments. The irregularities were detected by the Co-operative UK Twin Located Auroral Sounding System (CUTLASS) coherent scatter radar located in Finland. During these experiments the heater beam direction is steadily changed from northward to southward pointing. Comparisons are made between statistical parameters of CUTLASS backscatter power distributions and modeled heater beam power distributions provided by the EZNEC version 4 software. In general, good agreement between the statistical parameters and the modeled beam is observed, clearly indicating the direct causal connection between the heater beam and the irregularities, despite the sometimes seemingly unpredictable nature of unaveraged results. The results also give compelling evidence in support of the upper hybrid theory of irregularity excitation.
García-Hermoso, Antonio; Dávila-Romero, Carlos; Saavedra, Jose M
2013-02-01
This study compared volleyball game-related statistics by outcome (winners and losers of sets) and set number (total, initial, and last) to identify characteristics that discriminated game performance. Game-related statistics from 314 sets (44 matches) played by teams of male 14- to 15-year-olds in a regional volleyball championship were analysed (2011). Differences between contexts (winning or losing teams) and "set number" (total, initial, and last) were assessed. A discriminant analysis was then performed according to outcome (winners and losers of sets) and "set number" (total, initial, and last). The results showed differences (winning or losing sets) in several variables of Complexes I (attack point and error reception) and II (serve and aces). The game-related statistics that discriminated performance in the sets were the serve, positive reception, and attack point. These predictors of performance at ages when players are still learning could help coaches plan their training.
Turgeon, Alain; Bourdages, Michel; Levallois, Patrick; Gauvin, Denis; Gingras, Suzanne; Deadman, Jan Erik; Goulet, Daniel L; Plante, Michel
2004-07-01
This study was designed to provide an experimental validation for a statistical model predicting past or future exposures to magnetic fields (MF) from power lines. The model estimates exposure, combining the distribution of ambient MF in the absence of power lines with the distribution of past or future MF produced by power lines. In the study, validation is carried out by comparing exposures predicted by the model with the actual measurements obtained from a large-scale epidemiological study. The comparison was made for a group of 220 women living near a 735 kV power line. Knowing that the individual arithmetic means of MF exposures follow a log-normal distribution, the Pearson correlation between the log-transformed measured means and the calculated ones was determined and found to be 0.77. Predicted values of MF exposures were slightly lower than measured values. The calculated geometric mean of the group was 0.33 µT, compared to 0.38 µT for the measured geometric mean. The present study shows good agreement between the measured MF exposure of an individual inside a house near a 735 kV line and the MF exposure calculated using a statistical model.
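The validation statistic used, a Pearson correlation between log-transformed exposure means, is straightforward to compute. A sketch with made-up exposure values (six pairs for illustration, not the study's 220 measurements):

```python
import math
import statistics

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

# hypothetical arithmetic-mean MF exposures in microtesla
measured = [0.38, 0.25, 0.60, 0.31, 0.45, 0.20]
modeled = [0.33, 0.22, 0.50, 0.35, 0.40, 0.19]

# log-transform first, since the means follow a log-normal distribution
r = pearson([math.log(x) for x in measured],
            [math.log(y) for y in modeled])
```

Log-transforming before correlating is what makes the Pearson coefficient appropriate here: on the log scale the log-normally distributed means become approximately normal.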
Predict! Teaching Statistics Using Informal Statistical Inference
ERIC Educational Resources Information Center
Makar, Katie
2013-01-01
Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…
NASA Astrophysics Data System (ADS)
Alisultanov, Z. Z.; Meilanov, R. P.
2012-10-01
We consider the problem of the effective interaction potential in a quantum many-particle system leading to the fractional-power dispersion law. We show that passing to fractional-order derivatives is equivalent to introducing a pair interparticle potential. We consider the case of a degenerate electron gas. Using the van der Waals equation, we study the equation of state for systems with a fractional-power spectrum. We obtain a relation between the van der Waals constant and the phenomenological parameter α, the fractional-derivative order. We obtain a relation between energy, pressure, and volume for such systems: the coefficient of the thermal energy is a simple function of α. We consider Bose-Einstein condensation in a system with a fractional-power spectrum. The critical condensation temperature for 1 < α < 2 is greater in the case under consideration than in the case of an ideal system, where α = 2.
ERIC Educational Resources Information Center
Liu, Xiaofeng
2003-01-01
This article considers optimal sample allocation between the treatment and control condition in multilevel designs when the costs per sampling unit vary due to treatment assignment. Optimal unequal allocation may reduce the cost from that of a balanced design without sacrificing any power. The optimum sample allocation ratio depends only on the…
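A standard cost-optimal allocation result of this kind, shown here for a simple two-group mean comparison (the article's multilevel formula may differ), is that under a fixed budget the variance of the estimated treatment effect is minimized when n_t / n_c = sqrt(c_c / c_t), where c_t and c_c are the per-subject costs:

```python
import math

def optimal_ratio(cost_treatment, cost_control):
    """Cost-optimal n_treatment / n_control for a two-group mean comparison."""
    return math.sqrt(cost_control / cost_treatment)

def var_for_ratio(r, budget, cost_treatment, cost_control, sigma2=1.0):
    """Variance of the effect estimate when n_t = r * n_c under a fixed budget."""
    n_c = budget / (cost_treatment * r + cost_control)
    n_t = r * n_c
    return sigma2 / n_t + sigma2 / n_c

# treatment costs 4x as much per subject as control
ratio = optimal_ratio(4.0, 1.0)                    # enroll half as many treated subjects
v_opt = var_for_ratio(ratio, 120.0, 4.0, 1.0)
v_balanced = var_for_ratio(1.0, 120.0, 4.0, 1.0)   # balanced design, same budget
```

The numerical check confirms the article's premise: under unequal costs, the unbalanced optimum buys a smaller effect-estimate variance (hence more power) than a balanced design at the same total cost.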
This study assessed the pollutant emission offset potential of distributed grid-connected photovoltaic (PV) power systems. Computer-simulated performance results were utilized for 211 PV systems located across the U.S. The PV systems' monthly electrical energy outputs were based ...
Statistical characteristics of the observed Lyα forest and the shape of the initial power spectrum
NASA Astrophysics Data System (ADS)
Demiański, M.; Doroshkevich, A. G.; Turchaninov, V. I.
2006-09-01
We analyse the basic properties of about 6000 Lyman α absorbers observed in the high-resolution spectra of 19 quasars. We compare their observed characteristics with the predictions of our model of formation and evolution of absorbers and dark matter (DM) pancakes and voids based on the Zel'dovich theory of gravitational instability. This model asserts that absorbers are formed in the course of both linear and non-linear adiabatic and shock compression of DM and gaseous matter. Our model is consistent with simulations of structure formation, describes reasonably well the large-scale structure (LSS) observed in the distribution of galaxies at small redshifts, and emphasizes the generic similarity of the process of formation of LSS and absorbers. Using this model, we are able to link the column density and overdensity of the DM and gaseous components with the observed column density of neutral hydrogen, redshifts and Doppler parameters of absorbers. We show that the colder absorbers are associated with rapidly expanded underdense regions of galactic scale. We extend an existing method of measuring the power spectrum of initial perturbations. The observed separations between absorbers and their DM column density are linked with the correlation function of the initial velocity field. Applying this method to our sample of absorbers, we recover the cold dark matter (CDM)-like power spectrum at scales of 10 h^-1 Mpc >= D >= 0.15 h^-1 Mpc with a precision of ~15 per cent. However, at scales of ~3-150 h^-1 kpc, the measured and CDM-like spectra are different. This result suggests a possible complex inflation with generation of excess power at small scales. Both confirmation of the CDM-like shape of the initial power spectrum and detection of its distortions at small scales are equally important for the widely discussed problems of physics of the early Universe, galaxy formation, and reheating of the Universe.
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Adequate file search. 716.25 Section 716.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of...
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Adequate file search. 716.25 Section 716.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of...
"Something Adequate"? In Memoriam Seamus Heaney, Sister Quinlan, Nirbhaya
ERIC Educational Resources Information Center
Parker, Jan
2014-01-01
Seamus Heaney talked of poetry's responsibility to represent the "bloody miracle", the "terrible beauty" of atrocity; to create "something adequate". This article asks, what is adequate to the burning and eating of a nun and the murderous gang rape and evisceration of a medical student? It considers Njabulo Ndebele's…
40 CFR 716.25 - Adequate file search.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 2 2013-07-01 2013-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 2 2011-07-01 2011-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
40 CFR 51.354 - Adequate tools and resources.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 2 2012-07-01 2012-07-01 false Adequate tools and resources. 51.354... Requirements § 51.354 Adequate tools and resources. (a) Administrative resources. The program shall maintain the administrative resources necessary to perform all of the program functions including...
9 CFR 305.3 - Sanitation and adequate facilities.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...
9 CFR 305.3 - Sanitation and adequate facilities.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...
9 CFR 305.3 - Sanitation and adequate facilities.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...
9 CFR 305.3 - Sanitation and adequate facilities.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...
9 CFR 305.3 - Sanitation and adequate facilities.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Sanitation and adequate facilities. 305.3 Section 305.3 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... OF VIOLATION § 305.3 Sanitation and adequate facilities. Inspection shall not be inaugurated if...
21 CFR 201.5 - Drugs; adequate directions for use.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Drugs; adequate directions for use. 201.5 Section 201.5 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL LABELING General Labeling Provisions § 201.5 Drugs; adequate directions for use....
Is clinical measurement of anatomic axis of the femur adequate?
Wu, Chi-Chuan
2017-03-23
Background and purpose - The accuracy of using clinical measurement from the anterior superior iliac spine (ASIS) to the center of the knee to determine an anatomic axis of the femur has rarely been studied. A radiographic technique with a full-length standing scanogram (FLSS) was used to assess the adequacy of the clinical measurement. Patients and methods - 100 consecutive young adult patients (mean age 34 (20-40) years) with chronic unilateral lower extremity injuries were studied. The pelvis and intact contralateral lower extremity images in the FLSS were selected for study. The angles between the tibial axis and the femoral shaft anatomic axis (S-AA), the piriformis anatomic axis (P-AA), the clinical anatomic axis (C-AA), and the mechanical axis (MA) were compared between sexes. Results - Only the S-AA and C-AA angles were statistically significantly different in the 100 patients (3.6° vs. 2.8°; p = 0.03). There was a strong correlation between S-AA, P-AA, and C-AA angles (r > 0.9). The average intersecting angle between MA and S-AA in the femur in the 100 patients was 5.5°, and it was 4.8° between MA and C-AA. Interpretation - Clinical measurement of an anatomic axis from the ASIS to the center of the knee may be an adequate and acceptable method to determine lower extremity alignment. The optimal inlet for antegrade femoral intramedullary nailing may be the lateral edge of the piriformis fossa.
NASA Astrophysics Data System (ADS)
Zack, J. W.
2015-12-01
Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors including the limited space-time resolution of the NWP models and shortcomings in the model's representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble
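As a toy illustration of the MOS idea described above (our sketch on synthetic data, not the study's actual pipeline), the baseline method, screening multiple linear regression, reduces in the single-predictor case to ordinary least squares mapping raw NWP forecasts to observations:

```python
import numpy as np

# Hypothetical sketch of a MOS-style bias correction: fit a linear map from
# raw NWP wind-speed forecasts to observed values (all data synthetic).
rng = np.random.default_rng(0)
raw_nwp = rng.uniform(3.0, 12.0, size=200)                       # raw forecasts (m/s)
observed = 0.8 * raw_nwp - 1.0 + rng.normal(0, 0.5, size=200)    # synthetic "truth"

# Ordinary least squares with an intercept: the one-predictor special case
# of the screening multiple linear regression baseline named above.
X = np.column_stack([np.ones_like(raw_nwp), raw_nwp])
coef, *_ = np.linalg.lstsq(X, observed, rcond=None)
corrected = X @ coef

raw_bias = np.mean(observed - raw_nwp)
mos_bias = np.mean(observed - corrected)
print(f"mean bias before MOS: {raw_bias:+.2f} m/s, after MOS: {mos_bias:+.2f} m/s")
```

The same fit-then-adjust structure carries over to the nonlinear methods in the list; only the regression model changes.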
Wharton, S.; Bulaevskaya, V.; Irons, Z.; Qualley, G.; Newman, J. F.; Miller, W. O.
2015-09-28
The goal of our FY15 project was to explore the use of statistical models and high-resolution atmospheric input data to develop more accurate prediction models for turbine power generation. We modeled power for two operational wind farms in two regions of the country. The first site is a 235 MW wind farm in Northern Oklahoma with 140 GE 1.68 turbines. Our second site is a 38 MW wind farm in the Altamont Pass Region of Northern California with 38 Mitsubishi 1 MW turbines. The farms are very different in topography, climatology, and turbine technology; however, both occupy high wind resource areas in the U.S. and are representative of typical wind farms found in their respective areas.
NASA Astrophysics Data System (ADS)
Soua, S.; Bridge, B.; Cebulski, L.; Gan, T.-H.
2012-03-01
The use of a shock accelerometer for the continuous in-service monitoring of wear of the slip ring on a wind turbine generator is proposed and supporting results are presented. Five wear defects in the form of out-of-round circumference acceleration data with average radial dimensions in the range 5.9-25 µm were studied. A theoretical model of the acceleration at a point on the circumference of a ring as a function of the defect profile is presented. Acceleration data as a continuous function of time have been obtained for ring rotation frequencies that span the range of frequencies arising with the variation of wind speeds experienced under all in-service conditions. As a result, the measured RMS acceleration is proven to follow an overall increasing trend with frequency for all defects at all brush pressures. A statistical analysis of the root mean square of the time acceleration data as a function of the defect profiles, rotation speeds and brush contact pressure has been performed. The detection performance is considered in terms of the achievement of a signal to noise ratio exceeding 3 (99.997% defect detection probability). Under all conditions of rotation speed and pressure, this performance was achieved for average defect sizes as small as 10 µm, which is only 0.004% of the ring diameter. These results form the basis of a very sensitive defect alarm system.
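The detection criterion described above can be sketched numerically (a hypothetical signal, not the authors' data; the 25 Hz defect tone, noise level, and sampling rate are invented for illustration):

```python
import numpy as np

# Sketch of the RMS-based criterion: flag a defect when the RMS of the
# measured acceleration exceeds 3x the noise-only RMS (SNR > 3, the
# threshold quoted above for a 99.997% detection probability).
rng = np.random.default_rng(1)
fs = 10_000                                   # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)

noise = rng.normal(0.0, 0.01, t.size)         # baseline sensor noise
defect = 0.05 * np.sin(2 * np.pi * 25 * t)    # once-per-revolution defect signature

def rms(x):
    return np.sqrt(np.mean(x ** 2))

snr = rms(noise + defect) / rms(noise)
detected = snr > 3
print(f"SNR = {snr:.1f}, defect detected: {detected}")
```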
Region 9: Arizona Adequate Letter (10/14/2003)
This is a letter from Jack P. Broadbent, Director, to Nancy Wrona and Dennis Smith informing them that Maricopa County's motor vehicle emissions budgets in the 2003 MAG CO Maintenance Plan are adequate for transportation conformity purposes.
Region 6: Texas Adequate Letter (4/16/2010)
This letter from EPA to the Texas Commission on Environmental Quality determined the 2021 motor vehicle emission budgets for nitrogen oxides (NOx) and volatile organic compounds (VOCs) for the Beaumont/Port Arthur area adequate for transportation conformity purposes.
Region 2: New Jersey Adequate Letter (5/23/2002)
This April 22, 2002 letter from EPA to the New Jersey Department of Environmental Protection determined the 2007 and 2014 Carbon Monoxide (CO) Mobile Source Emissions Budgets adequate for transportation conformity purposes; the finding will be announced in the Federal Register.
Region 8: Colorado Adequate Letter (10/29/2001)
This letter from EPA to the Colorado Department of Public Health and Environment determined Denver's particulate matter (PM10) maintenance plan Motor Vehicle Emissions Budgets adequate for transportation conformity purposes.
Region 1: New Hampshire Adequate Letter (8/12/2008)
This July 9, 2008 letter from EPA to the New Hampshire Department of Environmental Services, determined the 2009 Motor Vehicle Emissions Budgets (MVEBs) are adequate for transportation conformity purposes and will be announced in the Federal Register (FR).
Region 8: Colorado Adequate Letter (1/20/2004)
This letter from EPA to the Colorado Department of Public Health and Environment determined Greeley's Carbon Monoxide (CO) maintenance plan Motor Vehicle Emissions Budgets adequate for transportation conformity purposes; the finding will be announced in the FR.
Region 8: Utah Adequate Letter (6/10/2005)
This letter from EPA to the Utah Department of Environmental Quality determined Salt Lake City's and Ogden's Carbon Monoxide (CO) maintenance plan Motor Vehicle Emissions Budgets adequate for transportation conformity purposes.
15 CFR 970.404 - Adequate exploration plan.
Code of Federal Regulations, 2012 CFR
2012-01-01
... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must...
15 CFR 970.404 - Adequate exploration plan.
Code of Federal Regulations, 2011 CFR
2011-01-01
... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must...
15 CFR 970.404 - Adequate exploration plan.
Code of Federal Regulations, 2013 CFR
2013-01-01
... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must...
15 CFR 970.404 - Adequate exploration plan.
Code of Federal Regulations, 2010 CFR
2010-01-01
... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must...
15 CFR 970.404 - Adequate exploration plan.
Code of Federal Regulations, 2014 CFR
2014-01-01
... ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR EXPLORATION LICENSES Certification of Applications § 970.404 Adequate exploration plan. Before he may certify an application, the Administrator must...
Region 6: New Mexico Adequate Letter (8/21/2003)
This is a letter from Carl Edlund, Director, to Alfredo Santistevan regarding MVEB's contained in the latest revision to the Albuquerque Carbon Monoxide State Implementation Plan (SIP) are adequate for transportation conformity purposes.
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2011 CFR
2011-01-01
... contained in a system of records are adequately trained to protect the security and privacy of such records..., by degaussing or by overwriting with the appropriate security software, in accordance...
4 CFR 200.14 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2010 CFR
2010-01-01
... require access to and use of records contained in a system of records are adequately trained to protect... with the appropriate security software, in accordance with regulations of the Archivist of the...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2012 CFR
2012-01-01
... contained in a system of records are adequately trained to protect the security and privacy of such records..., by degaussing or by overwriting with the appropriate security software, in accordance...
4 CFR 200.14 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2013 CFR
2013-01-01
... require access to and use of records contained in a system of records are adequately trained to protect... with the appropriate security software, in accordance with regulations of the Archivist of the...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2014 CFR
2014-01-01
... contained in a system of records are adequately trained to protect the security and privacy of such records..., by degaussing or by overwriting with the appropriate security software, in accordance...
10 CFR 1304.114 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2013 CFR
2013-01-01
... contained in a system of records are adequately trained to protect the security and privacy of such records..., by degaussing or by overwriting with the appropriate security software, in accordance...
4 CFR 200.14 - Responsibility for maintaining adequate safeguards.
Code of Federal Regulations, 2012 CFR
2012-01-01
... require access to and use of records contained in a system of records are adequately trained to protect... with the appropriate security software, in accordance with regulations of the Archivist of the...
Region 9: Nevada Adequate Letter (3/30/2006)
This is a letter from Deborah Jordan, Director, to Leo M. Drozdoff regarding Nevada's motor vehicle emissions budgets in the 2005 Truckee Meadows CO Redesignation Request and Maintenance Plan are adequate for transportation conformity decisions.
Zhang, Han; Wheeler, William; Hyland, Paula L.; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai
2016-01-01
Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways, from which genes in the neighborhoods of 170 GWAS-established T2D loci were excluded, we detected 43 T2D globally significant pathways (with Bonferroni corrected p-values < 0.05), which included the insulin signaling pathway and T2D pathway defined by KEGG, as well as the pathways defined according to specific gene expression patterns on pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed that 7 of the 43 pathways identified in European populations remained significant in eastern Asians at a false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs. PMID:27362418
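A drastically simplified sketch of the rank-truncated-product idea at the core of sARTP (our illustration with simulated p-values; the real method additionally adapts the truncation point and uses a genotype correlation panel to account for linkage disequilibrium among SNPs):

```python
import numpy as np

# Rank truncated product (RTP): combine the k smallest SNP p-values in a
# gene and calibrate the statistic by Monte Carlo simulation under the null.
rng = np.random.default_rng(5)

def rtp_stat(pvals, k):
    """Log-product of the k smallest p-values (smaller = stronger signal)."""
    return np.sum(np.log(np.sort(pvals)[:k]))

gene_pvals = rng.uniform(0, 1, 50)            # 50 SNPs, mostly null
gene_pvals[:3] = [1e-4, 5e-4, 2e-3]           # three truly associated SNPs
k = 5
obs = rtp_stat(gene_pvals, k)

# Null distribution assuming independent SNPs (the real method replaces
# this with draws respecting the estimated SNP correlation).
null = np.array([rtp_stat(rng.uniform(0, 1, 50), k) for _ in range(5000)])
p_gene = np.mean(null <= obs)                 # Monte Carlo gene-level p-value
print(f"gene-level RTP p-value ~ {p_gene:.4f}")
```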
Kipiński, Lech; König, Reinhard; Sielużycki, Cezary; Kordecki, Wojciech
2011-10-01
Stationarity is a crucial yet rarely questioned assumption in the analysis of time series of magneto- (MEG) or electroencephalography (EEG). One key drawback of the commonly used tests for stationarity of encephalographic time series is the fact that conclusions on stationarity are only indirectly inferred either from the Gaussianity (e.g. the Shapiro-Wilk test or Kolmogorov-Smirnov test) or the randomness of the time series and the absence of trend using very simple time-series models (e.g. the sign and trend tests by Bendat and Piersol). We present a novel approach to the analysis of the stationarity of MEG and EEG time series by applying modern statistical methods which were specifically developed in econometrics to verify the hypothesis that a time series is stationary. We report our findings of the application of three different tests of stationarity--the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test for trend or mean stationarity, the Phillips-Perron (PP) test for the presence of a unit root and the White test for homoscedasticity--on an illustrative set of MEG data. For five stimulation sessions, we found already for short epochs of duration of 250 and 500 ms that, although the majority of the studied epochs of single MEG trials were usually mean-stationary (KPSS test and PP test), they were classified as nonstationary due to their heteroscedasticity (White test). We also observed that the presence of external auditory stimulation did not significantly affect the findings regarding the stationarity of the data. We conclude that the combination of these tests allows a refined analysis of the stationarity of MEG and EEG time series.
ERIC Educational Resources Information Center
Kazdin, Alan E.; Bass, Debra
1989-01-01
Drew 85 outcome studies from 9 journals from 1984 through 1986. Examined data in each article to provide estimates of effect sizes and to evaluate statistical power at posttreatment and follow-up. Findings indicated that power of studies to detect differences between treatment and no treatment was generally quite adequate given relatively large…
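The kind of calculation behind such a survey can be sketched with statsmodels (a hedged illustration; the effect sizes and per-arm sample size below are hypothetical, not figures from the review):

```python
from statsmodels.stats.power import TTestIndPower

# Power of a two-sample t-test at alpha = 0.05 for a "large" versus a
# "small" standardized effect, with 20 subjects per arm (hypothetical).
analysis = TTestIndPower()
power_large = analysis.power(effect_size=0.8, nobs1=20, alpha=0.05)
power_small = analysis.power(effect_size=0.2, nobs1=20, alpha=0.05)
print(f"power at d=0.8: {power_large:.2f}")   # adequate-ish for large effects
print(f"power at d=0.2: {power_small:.2f}")   # badly underpowered
```

This mirrors the review's finding: treatment-versus-no-treatment contrasts (large effects) were generally adequately powered, while subtler contrasts were not.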
NASA Astrophysics Data System (ADS)
Slaski, G.; Ohde, B.
2016-09-01
The article presents the results of a statistical dispersion analysis of the energy and power demand for tractive purposes of a battery electric vehicle. The authors compare data distributions for different values of average speed in two approaches, namely a short and a long period of observation. The short period of observation (generally around several hundred meters) results from a previously proposed macroscopic energy consumption model based on an average speed per road section. This approach yielded high values of standard deviation and of the coefficient of variation (the ratio between standard deviation and the mean), around 0.7-1.2. The long period of observation (about several kilometers long) is similar in length to the standardized speed cycles used in testing a vehicle's energy consumption and available range. The data were analysed to determine the impact of observation length on the variation in energy and power demand. The analysis was based on a simulation of electric power and energy consumption performed with speed-profile data recorded in the Poznan agglomeration.
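The dispersion effect described, high per-section scatter that shrinks as sections are aggregated into longer routes, can be sketched as follows (synthetic energy figures; the gamma shape is chosen only so the per-section coefficient of variation lands in the reported 0.7-1.2 band):

```python
import numpy as np

# Coefficient of variation (std/mean) of energy demand, short vs long windows.
rng = np.random.default_rng(3)

# Hypothetical per-section energy draws (Wh) for one average-speed bin.
per_section = rng.gamma(shape=1.2, scale=100.0, size=1000)
# Aggregate consecutive sections into 50 longer "routes" of 20 sections each.
per_route = np.add.reduceat(per_section, np.arange(0, 1000, 20))

def cv(x):
    return np.std(x) / np.mean(x)

print(f"CV per section: {cv(per_section):.2f}")   # high, as in the short window
print(f"CV per route:   {cv(per_route):.2f}")     # much lower after aggregation
```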
NASA Astrophysics Data System (ADS)
Bahauddin, Shah Mohammad; Mehedi Faruk, Mir
2016-09-01
From the unified statistical thermodynamics of quantum gases, the virial coefficients of ideal Bose and Fermi gases, trapped under generic power law potential are derived systematically. From the general result of virial coefficients, one can produce the known results in d = 3 and d = 2. But more importantly we found that, the virial coefficients of Bose and Fermi gases become identical (except the second virial coefficient, where the sign is different) when the gases are trapped under harmonic potential in d = 1. This result suggests the equivalence between Bose and Fermi gases established in d = 1 (J. Stat. Phys. DOI 10.1007/s10955-015-1344-4). Also, it is found that the virial coefficients of two-dimensional free Bose (Fermi) gas are equal to the virial coefficients of one-dimensional harmonically trapped Bose (Fermi) gas.
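The scaling that drives both equivalences can be sketched as follows (our reconstruction in generic notation, not the paper's):

```latex
% Cluster integrals of an ideal quantum gas in d dimensions confined by a
% power-law potential U(r) \propto r^n (upper signs: Bose, lower: Fermi):
\begin{equation}
  b_\ell \propto (\pm 1)^{\ell+1}\,\ell^{-\chi},
  \qquad
  \chi = \frac{d}{2} + \frac{d}{n} + 1 .
\end{equation}
% Consistency checks: free 3D gas (n \to \infty): \chi = 5/2, the textbook
% \ell^{-5/2} law; 1D harmonic trap (d = 1, n = 2): \chi = 2; free 2D gas
% (d = 2, n \to \infty): \chi = 2. Equal |b_\ell| yields equal virial
% coefficients up to the sign of the second, matching the equivalence of
% the 1D harmonically trapped and 2D free gases stated above.
```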
Regulatory requirements for providing adequate veterinary care to research animals.
Pinson, David M
2013-09-01
Provision of adequate veterinary care is a required component of animal care and use programs in the United States. Program participants other than veterinarians, including non-medically trained research personnel and technicians, also provide veterinary care to animals, and administrators are responsible for assuring compliance with federal mandates regarding adequate veterinary care. All program participants therefore should understand the regulatory requirements for providing such care. The author provides a training primer on the US regulatory requirements for the provision of veterinary care to research animals. Understanding the legal basis and conditions of a program of veterinary care will help program participants to meet the requirements advanced in the laws and policies.
Comparability and Reliability Considerations of Adequate Yearly Progress
ERIC Educational Resources Information Center
Maier, Kimberly S.; Maiti, Tapabrata; Dass, Sarat C.; Lim, Chae Young
2012-01-01
The purpose of this study is to develop an estimate of Adequate Yearly Progress (AYP) that will allow for reliable and valid comparisons among student subgroups, schools, and districts. A shrinkage-type estimator of AYP using the Bayesian framework is described. Using simulated data, the performance of the Bayes estimator will be compared to…
13 CFR 107.200 - Adequate capital for Licensees.
Code of Federal Regulations, 2010 CFR
2010-01-01
... operate actively in accordance with your Articles and within the context of your business plan, as... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Adequate capital for Licensees. 107.200 Section 107.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL...
13 CFR 107.200 - Adequate capital for Licensees.
Code of Federal Regulations, 2011 CFR
2011-01-01
... operate actively in accordance with your Articles and within the context of your business plan, as... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Adequate capital for Licensees. 107.200 Section 107.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL...
Is the Stock of VET Skills Adequate? Assessment Methodologies.
ERIC Educational Resources Information Center
Blandy, Richard; Freeland, Brett
In Australia and elsewhere, four approaches have been used to determine whether stocks of vocational education and training (VET) skills are adequate to meet industry needs. The four methods are as follows: (1) the manpower requirements approach; (2) the international, national, and industry comparisons approach; (3) the labor market analysis…
Do Beginning Teachers Receive Adequate Support from Their Headteachers?
ERIC Educational Resources Information Center
Menon, Maria Eliophotou
2012-01-01
The article examines the problems faced by beginning teachers in Cyprus and the extent to which headteachers are considered to provide adequate guidance and support to them. Data were collected through interviews with 25 school teachers in Cyprus, who had recently entered teaching (within 1-5 years) in public primary schools. According to the…
13 CFR 108.200 - Adequate capital for NMVC Companies.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Companies. 108.200 Section 108.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM Qualifications for the NMVC Program Capitalizing A Nmvc Company § 108.200 Adequate capital for NMVC Companies. You must meet the requirements of §§ 108.200-108.230 in order...
13 CFR 108.200 - Adequate capital for NMVC Companies.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Companies. 108.200 Section 108.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM Qualifications for the NMVC Program Capitalizing A Nmvc Company § 108.200 Adequate capital for NMVC Companies. You must meet the requirements of §§ 108.200-108.230 in order...
13 CFR 108.200 - Adequate capital for NMVC Companies.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Companies. 108.200 Section 108.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM Qualifications for the NMVC Program Capitalizing A Nmvc Company § 108.200 Adequate capital for NMVC Companies. You must meet the requirements of §§ 108.200-108.230 in order...
13 CFR 108.200 - Adequate capital for NMVC Companies.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Companies. 108.200 Section 108.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM Qualifications for the NMVC Program Capitalizing A Nmvc Company § 108.200 Adequate capital for NMVC Companies. You must meet the requirements of §§ 108.200-108.230 in order...
13 CFR 108.200 - Adequate capital for NMVC Companies.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Companies. 108.200 Section 108.200 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM Qualifications for the NMVC Program Capitalizing A Nmvc Company § 108.200 Adequate capital for NMVC Companies. You must meet the requirements of §§ 108.200-108.230 in order...
Understanding Your Adequate Yearly Progress (AYP), 2011-2012
ERIC Educational Resources Information Center
Missouri Department of Elementary and Secondary Education, 2011
2011-01-01
The "No Child Left Behind Act (NCLB) of 2001" requires all schools, districts/local education agencies (LEAs) and states to show that students are making Adequate Yearly Progress (AYP). NCLB requires states to establish targets in the following ways: (1) Annual Proficiency Target; (2) Attendance/Graduation Rates; and (3) Participation…
34 CFR 200.13 - Adequate yearly progress in general.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 34 Education 1 2011-07-01 2011-07-01 false Adequate yearly progress in general. 200.13 Section 200.13 Education Regulations of the Offices of the Department of Education OFFICE OF ELEMENTARY AND SECONDARY EDUCATION, DEPARTMENT OF EDUCATION TITLE I-IMPROVING THE ACADEMIC ACHIEVEMENT OF THE...
34 CFR 200.13 - Adequate yearly progress in general.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 34 Education 1 2013-07-01 2013-07-01 false Adequate yearly progress in general. 200.13 Section 200.13 Education Regulations of the Offices of the Department of Education OFFICE OF ELEMENTARY AND SECONDARY EDUCATION, DEPARTMENT OF EDUCATION TITLE I-IMPROVING THE ACADEMIC ACHIEVEMENT OF THE...
Region 9: Arizona Adequate Letter (11/1/2001)
This is a letter from Jack P. Broadbent, Director, Air Division, to Nancy Wrona and James Bourney informing them of the adequacy of the Revised MAG 1999 Serious Area Carbon Monoxide Plan and that the MAG CO Plan is adequate for Maricopa County.
Army General Fund Adjustments Not Adequately Documented or Supported
2016-07-26
statements were unreliable and lacked an adequate audit trail. Furthermore, DoD and Army managers could not rely on the data in their accounting...risk that AGF financial statements will be materially misstated and the Army will not achieve audit readiness by the congressionally mandated...and $6.5 trillion in yearend adjustments made to Army General Fund data during FY 2015 financial statement compilation. We conducted this audit in
Vaumourin, Elise; Vourc'h, Gwenaël; Telfer, Sandra; Lambin, Xavier; Salih, Diaeldin; Seitzer, Ulrike; Morand, Serge; Charbonnel, Nathalie; Vayssier-Taussat, Muriel; Gasqui, Patrick
2014-01-01
A growing number of studies are reporting simultaneous infections by parasites in many different hosts. Detecting whether these parasites are significantly associated is important in medicine and epidemiology. Numerous approaches to detect associations are available, but only a few provide statistical tests. Furthermore, they generally test for an overall presence of association and do not identify which parasite is associated with which other one. Here, we developed a new approach, the association screening approach, to detect both the overall pattern and the detail of multi-parasite associations. We studied the power of this new approach and of three other known ones (i.e., the generalized chi-square, the network and the multinomial GLM approaches) to identify parasite associations due either to parasite interactions or to confounding factors. We applied these four approaches to detect associations within two populations of multi-infected hosts: (1) rodents infected with Bartonella sp., Babesia microti and Anaplasma phagocytophilum and (2) a bovine population infected with Theileria sp. and Babesia sp. We found that the best power is obtained with the screening model and the generalized chi-square test. Differentiating between associations due to confounding factors and those due to parasite interactions was not possible. The screening approach significantly identified associations between Bartonella doshiae and B. microti, and between T. parva, T. mutans, and T. velifera. Thus, the screening approach was relevant for testing the overall presence of parasite associations and for identifying the parasite combinations that are significantly over- or under-represented. Whether the associations are due to real biological interactions or to confounding factors should be further investigated. Nevertheless, in the age of genomics and the advent of new technologies, it is a considerable asset to speed up research focusing on the mechanisms driving interactions between
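The simplest statistical building block mentioned above, a test of pairwise association between two parasites, can be sketched with a chi-square test on a 2x2 co-infection table (the counts below are invented for illustration):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Host counts cross-classified by infection status for two parasites.
#                  parasite B+  parasite B-
table = np.array([[30,          10],     # parasite A+
                  [20,          90]])    # parasite A-

chi2, p, dof, expected = chi2_contingency(table)
associated = p < 0.05
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}, associated: {associated}")
```

The screening approach in the abstract generalizes this idea to many parasites at once while controlling for multiple testing.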
Hehir-Kwa, Jayne Y; Egmont-Petersen, Michael; Janssen, Irene M; Smeets, Dominique; van Kessel, Ad Geurts; Veltman, Joris A
2007-02-28
Recently, comparative genomic hybridization onto bacterial artificial chromosome (BAC) arrays (array-based comparative genomic hybridization) has proved to be successful for the detection of submicroscopic DNA copy-number variations in health and disease. Technological improvements to achieve a higher resolution have resulted in the generation of additional microarray platforms encompassing larger numbers of shorter DNA targets (oligonucleotides). Here, we present a novel method to estimate the ability of a microarray to detect genomic copy-number variations of different sizes and types (i.e. deletions or duplications). We applied our method, which is based on statistical power analysis, to four widely used high-density genomic microarray platforms. By doing so, we found that the high-density oligonucleotide platforms are superior to the BAC platform for the genome-wide detection of copy-number variations smaller than 1 Mb. The capacity to reliably detect single copy-number variations below 100 kb, however, appeared to be limited for all platforms tested. In addition, our analysis revealed an unexpected platform-dependent difference in sensitivity to detect a single copy-number loss and a single copy-number gain. These analyses provide a first objective insight into the true capacities and limitations of different genomic microarrays to detect and define DNA copy-number variations.
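The platform comparison rests on a power calculation of roughly this shape (a hedged sketch: the per-probe noise level, alpha, and the idealized log2(1/2) = -1 shift for a heterozygous deletion are our assumptions; the key point is that higher-resolution platforms place more probes inside a copy-number variant of a given size):

```python
import numpy as np
from scipy.stats import norm

sigma = 1.5    # per-probe log2-ratio noise (hypothetical)
shift = 1.0    # |log2(1/2)|: idealized single-copy-loss shift
alpha = 0.01
z_crit = norm.ppf(1 - alpha / 2)

# Two-sided z-test power for the mean log2 ratio over n probes in the CNV.
powers = {}
for n_probes in (3, 10, 30):
    ncp = shift / (sigma / np.sqrt(n_probes))    # shift in standard-error units
    powers[n_probes] = norm.sf(z_crit - ncp) + norm.cdf(-z_crit - ncp)
    print(f"{n_probes:>3} probes in the CNV: power ~ {powers[n_probes]:.2f}")
```

Small CNVs covered by only a handful of probes remain hard to call on any platform, consistent with the limited sensitivity below 100 kb reported above.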
Ard, M Colin; Galasko, Douglas R; Edland, Steven D
2013-01-01
Discovery of effective treatment for Alzheimer disease (AD) depends upon the availability of outcome measures that exhibit good sensitivity to rates of longitudinal decline on global functional performance. The Alzheimer's Disease Cooperative Study-Activities of Daily Living inventory (ADCS-ADL) is a frequently used functional endpoint in clinical trials for AD that assesses patient functional ability on the basis of informant ratings of patient performance on a variety of everyday tasks. Previous research has shown that the items comprising the ADCS-ADL are sensitive to characteristic longitudinal trajectories in AD. However, standard procedures for combining information from individual items into an overall test score may not make full use of the information provided by informant responses. The current study explored an application of item-response theory (IRT) techniques to the calculation of test scores on the ADCS-ADL. Using data from 2 ADCS clinical trials on mild-to-moderate AD patients we found that IRT based scoring increased sensitivity to change in functional ability and improved prospective statistical power of the ADCS-ADL as an outcome measure in clinical trials.
Pestana, Dinis
2013-01-01
Statistics is a privileged tool for building knowledge from information, since its purpose is to extend conclusions drawn from the limited information in a sample to the whole population. The pervasive use of statistical software (which always provides an answer, whether the question is adequate or not), and the use of statistics merely to confer a scientific flavour on so much bad science, have fostered a certain disbelief in statistical research. Were Lord Rutherford alive today, it is almost certain that he would not condemn the use of statistics in research, as he did at the dawn of the 20th century. But he would indeed urge everyone to use statistics quantum satis, since bad data, too many data, and statistics applied to irrelevant questions are all sources of bad science, not least because with too many data we can establish the statistical significance of irrelevant results. This is an important point that addicts of evidence-based medicine should be aware of, since the meta-analysis of too many data will inevitably establish senseless results.
Genetic modification of preimplantation embryos: toward adequate human research policies.
Dresser, Rebecca
2004-01-01
Citing advances in transgenic animal research and setbacks in human trials of somatic cell genetic interventions, some scientists and others want to begin planning for research involving the genetic modification of human embryos. Because this form of genetic modification could affect later-born children and their offspring, the protection of human subjects should be a priority in decisions about whether to proceed with such research. Yet because of gaps in existing federal policies, embryo modification proposals might not receive adequate scientific and ethical scrutiny. This article describes current policy shortcomings and recommends policy actions designed to ensure that the investigational genetic modification of embryos meets accepted standards for research on human subjects.
Elements for adequate informed consent in the surgical context.
Abaunza, Hernando; Romero, Klaus
2014-07-01
Given a history of atrocities and violations of ethical principles, several documents and regulations have been issued by a wide variety of organizations. They aim at ensuring that health care and clinical research adhere to defined ethical principles. A fundamental component was devised to ensure that the individual has been provided the necessary information to make an informed decision regarding health care or participation in clinical research. This article summarizes the history and regulations for informed consent and discusses suggested components for adequate consent forms for daily clinical practice in surgery as well as clinical research.
Quantifying dose to the reconstructed breast: Can we adequately treat?
Chung, Eugene; Marsh, Robin B.; Griffith, Kent A.; Moran, Jean M.; Pierce, Lori J.
2013-04-01
To evaluate how immediate reconstruction (IR) impacts postmastectomy radiotherapy (PMRT) dose distributions to the reconstructed breast (RB), internal mammary nodes (IMN), heart, and lungs using quantifiable dosimetric end points. 3D conformal plans were developed for 20 IR patients: 10 with autologous reconstruction (AR) and 10 with expander-implant (EI) reconstruction. For each reconstruction type, 5 right- and 5 left-sided reconstructions were selected. Two plans were created for each patient, 1 with RB coverage alone and 1 with RB + IMN coverage. Left-sided EI plans without IMN coverage had higher heart Dmean than left-sided AR plans (2.97 and 0.84 Gy, p = 0.03). Otherwise, results did not vary by reconstruction type and all remaining metrics were evaluated using a combined AR and EI dataset. RB coverage was adequate regardless of laterality or IMN coverage (Dmean 50.61 Gy, D95 45.76 Gy). When included, IMN Dmean and D95 were 49.57 and 40.96 Gy, respectively. Mean heart doses increased with left-sided treatment plans and IMN inclusion. Right-sided treatment plans and IMN inclusion increased mean lung V20. Using standard field arrangements and 3D planning, we observed excellent coverage of the RB and IMN, regardless of laterality or reconstruction type. Our results demonstrate that adequate doses can be delivered to the RB with or without IMN coverage.
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
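The matrix formulation described above can be sketched in a few lines: the parameter estimates are beta = (XᵀWX)⁻¹XᵀWy, and the inverse of XᵀWX gives the variance-covariance matrix from which standard errors and correlations follow by propagation of error. The calibration data below are made up for illustration.

```python
import numpy as np

def linear_least_squares(X, y, sigma=None):
    """Linear least squares in matrix notation.

    Returns beta = (X^T W X)^{-1} X^T W y together with the parameter
    variance-covariance matrix. If per-point uncertainties sigma are
    given, a weighted fit is performed; otherwise the covariance is
    scaled by the residual variance.
    """
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    W = np.eye(len(y)) if sigma is None else np.diag(1.0 / np.asarray(sigma) ** 2)
    cov = np.linalg.inv(X.T @ W @ X)
    beta = cov @ X.T @ W @ y
    if sigma is None:
        # Unweighted fit: estimate the data variance from the residuals
        resid = y - X @ beta
        cov = cov * (resid @ resid) / (len(y) - X.shape[1])
    return beta, cov

# Straight-line calibration y = a + b*x with made-up data
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
X = np.column_stack([np.ones_like(x), x])
y = np.array([0.1, 1.9, 4.1, 6.0, 8.1])
beta, cov = linear_least_squares(X, y)
# beta[0] is the intercept, beta[1] the slope;
# np.sqrt(np.diag(cov)) gives the parameter standard errors.
```

The off-diagonal elements of `cov` carry the parameter correlation that the text emphasizes, which is what propagation of error needs when predictions combine both parameters.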
Pataky, Todd C; Robinson, Mark A; Vanrenterghem, Jos
2016-01-01
One-dimensional (1D) kinematic, force, and EMG trajectories are often analyzed using zero-dimensional (0D) metrics like local extrema. Recently whole-trajectory 1D methods have emerged in the literature as alternatives. Since 0D and 1D methods can yield qualitatively different results, the two approaches may appear to be theoretically distinct. The purposes of this paper were (a) to clarify that 0D and 1D approaches are actually just special cases of a more general region-of-interest (ROI) analysis framework, and (b) to demonstrate how ROIs can augment statistical power. We first simulated millions of smooth, random 1D datasets to validate theoretical predictions of the 0D, 1D and ROI approaches and to emphasize how ROIs provide a continuous bridge between 0D and 1D results. We then analyzed a variety of public datasets to demonstrate potential effects of ROIs on biomechanical conclusions. Results showed, first, that a priori ROI particulars can qualitatively affect the biomechanical conclusions that emerge from analyses and, second, that ROIs derived from exploratory/pilot analyses can detect smaller biomechanical effects than are detectable using full 1D methods. We recommend regarding ROIs, like data filtering particulars and Type I error rate, as parameters which can affect hypothesis testing results, and thus as sensitivity analysis tools to ensure arbitrary decisions do not influence scientific interpretations. Last, we describe open-source Python and MATLAB implementations of 1D ROI analysis for arbitrary experimental designs ranging from one-sample t tests to MANOVA.
Prostate cancer between prognosis and adequate/proper therapy
Grozescu, T; Popa, F
2017-01-01
Knowing the indolent, non-invasive nature of most types of prostate cancer, as well as the simple fact that the disease seems more likely to be associated with age rather than with other factors (50% of men at the age of 50 and 80% at the age of 80 have it [1], with or without presenting any symptom), the big challenge of this clinical entity was to determine severity indicators (so far insufficient) to guide the physician towards an adequate attitude in the clinical setting. The risk of over-diagnosing and over-treating many prostate cancer cases (indicated by all the major European and American studies) is real and poses many question marks. The present paper was meant to deliver new research data and to reset the clinical approach in prostate cancer cases. PMID:28255369
The cerebellopontine angle: does the translabyrinthine approach give adequate access?
Fagan, P A; Sheehy, J P; Chang, P; Doust, B D; Coakley, D; Atlas, M D
1998-05-01
A long-standing but unfounded criticism of the translabyrinthine approach is the misperception that this approach does not give adequate access to the cerebellopontine angle. Because of what is perceived as limited visualization and operating space within the cerebellopontine angle, some surgeons still believe that the translabyrinthine approach is inappropriate for large acoustic tumors. In this study, the surgical access to the cerebellopontine angle by virtue of the translabyrinthine approach is measured and analyzed. The parameters are compared with those measured for the retrosigmoid approach. This series objectively confirms that the translabyrinthine approach offers the neurotologic surgeon a shorter operative depth to the tumor, via a similar-sized craniotomy. This permits superior visualization by virtue of a wider angle of surgical access. Such access is achieved with the merit of minimal cerebellar retraction.
Barriers to adequate prenatal care utilization in American Samoa
Hawley, Nicola L; Brown, Carolyn; Nu’usolia, Ofeira; Ah-Ching, John; Muasau-Howard, Bethel; McGarvey, Stephen T
2013-01-01
Objective To describe the utilization of prenatal care in American Samoan women and to identify socio-demographic predictors of inadequate prenatal care utilization. Methods Using data from prenatal clinic records, women (n=692) were categorized according to the Adequacy of Prenatal Care Utilization Index as having received adequate plus, adequate, intermediate or inadequate prenatal care during their pregnancy. Categorical socio-demographic predictors of the timing of initiation of prenatal care (week of gestation) and the adequacy of received services were identified using one-way Analysis of Variance (ANOVA) and independent samples t-tests. Results Between 2001 and 2008, 85.4% of women received inadequate prenatal care. Parity (P=0.02), maternal unemployment (P=0.03), and both parents being unemployed (P=0.03) were negatively associated with the timing of prenatal care initiation. Giving birth in 2007–2008, after a prenatal care incentive scheme had been introduced in the major hospital, was associated with earlier initiation of prenatal care (20.75 versus 25.12 weeks; P<0.01) and improved adequacy of received services (95.04% versus 83.8%; P=0.02). Conclusion The poor prenatal care utilization in American Samoa is a major concern. Improving healthcare accessibility will be key in encouraging women to attend prenatal care. The significant improvements in the adequacy of prenatal care seen in 2007–2008 suggest that the prenatal care incentive program implemented in 2006 may be a very positive step toward addressing issues of prenatal care utilization in this population. PMID:24045912
Are Academic Programs Adequate for the Software Profession?
ERIC Educational Resources Information Center
Koster, Alexis
2010-01-01
According to the Bureau of Labor Statistics, close to 1.8 million people, or 77% of all computer professionals, were working in the design, development, deployment, maintenance, and management of software in 2006. The ACM [Association for Computing Machinery] model curriculum for the BS in computer science proposes that about 42% of the core body…
Shi, Runhua; McLarty, Jerry W
2009-10-01
In this article, we introduced basic concepts of statistics, types of distributions, and descriptive statistics. A few examples were also provided. The basic concepts presented herein are only a fraction of the concepts related to descriptive statistics. Also, there are many commonly used distributions not presented herein, such as Poisson distributions for rare events, exponential distributions, F distributions, and logistic distributions. More information can be found in many statistics books and publications.
ERIC Educational Resources Information Center
Callamaras, Peter
1983-01-01
This buyer's guide to seven major types of statistics software packages for microcomputers reviews Edu-Ware Statistics 3.0; Financial Planning; Speed Stat; Statistics with DAISY; Human Systems Dynamics package of Stats Plus, ANOVA II, and REGRESS II; Maxistat; and Moore-Barnes' MBC Test Construction and MBC Correlation. (MBR)
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2008-01-01
As a branch of knowledge, Statistics is ubiquitous and its applications can be found in (almost) every field of human endeavour. In this article, the authors track down the possible source of the link between the "Siren song" and applications of Statistics. Answers to their previous five questions and five new questions on Statistics are presented.
Nicodemus, Kristin K; Luna, Augustin; Shugart, Yin Yao
2007-01-01
Researchers conducting family-based association studies have a wide variety of transmission/disequilibrium (TD)-based methods to choose from, but few guidelines exist in the selection of a particular method to apply to available data. Using a simulation study design, we compared the power and type I error of eight popular TD-based methods under different family structures, frequencies of missing parental data, genetic models, and population stratifications. No method was uniformly most powerful under all conditions, but type I error was appropriate for nearly every test statistic under all conditions. Power varied widely across methods, with a 46.5% difference in power observed between the most powerful and the least powerful method when 50% of families consisted of an affected sib pair and one parent genotyped under an additive genetic model and a 35.2% difference when 50% of families consisted of a single affection-discordant sibling pair without parental genotypes available under an additive genetic model. Methods were generally robust to population stratification, although some slightly less so than others. The choice of a TD-based test statistic should be dependent on the predominant family structure ascertained, the frequency of missing parental genotypes, and the assumed genetic model.
Smoking Cessation/Prevention in the Air Force: How Adequate
2007-11-02
A sample of charts and client interviews was employed to compare providers' documented practice protocols with established guidelines set by the Department of … An instrument assessing validity in smoking cessation practices was utilized. A pilot study was done to determine intercoder reliability. Descriptive statistics were utilized to …
Adequate Yearly Progress (AYP) at Your Library Media Center
ERIC Educational Resources Information Center
Anderson, Cynthia
2007-01-01
Together administration and the library media center form a team that can make a difference in student learning and, in turn, in student achievement. The library media center can contribute to improve student learning, and there is an amazingly small cost that administration must pay for this powerful support. This article addresses…
Systemic Crisis of Civilization: In Search for Adequate Solution
NASA Astrophysics Data System (ADS)
Khozin, Grigori
In December 1972 a jumbo jet crashed in the Florida Everglades with the loss of 101 lives. The pilot, distracted by a minor malfunction, failed to note until too late the warning signal that, correctly, indicated an impending disaster. His last words were a sudden, astonished cry: "Hey, what's happening here?" [1]. Three decades after this tragic episode, as humankind approaches the threshold of the third Millennium, the problem of reacting adequately to warning signals of different kinds, and of distinguishing minor malfunctions in the everyday life of society, in economy and technology, as well as in the evolution of the biosphere, from grave threats to the world community and to the phenomenon of life on our planet, remains crucial to human survival and the future of Civilization. Rational use of the knowledge and technology available to the world community remains, in this context, the cornerstone of discussions on the destiny of intelligent life both on the planet Earth and in the Universe (intelligent life elsewhere in the Universe has yet to be detected by humankind)…
ENSURING ADEQUATE SAFETY WHEN USING HYDROGEN AS A FUEL
Coutts, D
2007-01-22
Demonstration projects using hydrogen as a fuel are becoming very common. Often these projects rely on project-specific risk evaluations to support project safety decisions. This is necessary because regulations, codes, and standards (hereafter referred to as standards) are just being developed. This paper will review some of the approaches being used in these evolving standards, and techniques which demonstration projects can implement to bridge the gap between current requirements and stakeholder desires. Many of the evolving hydrogen-fuel standards use performance-based language, which establishes minimum performance and safety objectives, as compared with prescriptive language, which mandates specific design solutions. This is being done for several reasons, including: (1) concern that establishing specific design solutions too early will stifle invention, (2) the sparse performance data available to support selection of design approaches, and (3) a risk-averse public which is unwilling to accept losses that were incurred in developing previous prescriptive design standards. The evolving standards often contain words such as: "The manufacturer shall implement the measures and provide the information necessary to minimize the risk of endangering a person's safety or health." This typically implies that the manufacturer or project manager must produce and document an acceptable level of risk. If accomplished using a comprehensive and systematic process, the demonstration project risk assessment can ease the transition to widespread commercialization. An approach to adequately evaluate and document the safety risk will be presented.
DARHT -- an adequate EIS: A NEPA case study
Webb, M.D.
1997-08-01
In April 1996 the US District Court in Albuquerque ruled that the Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility Environmental Impact Statement (EIS), prepared by the Los Alamos Area Office, US Department of Energy (DOE), was adequate. The DARHT EIS had been prepared in the face of a lawsuit in only 10 months, a third of the time usually allotted for a DOE EIS, and for only a small fraction of the cost of a typical DOE EIS. Its subject was the first major facility to be built in decades for the DOE nuclear weapons stockpile stewardship program. It was the first EIS to be prepared for a proposal at DOE's Los Alamos National Laboratory since 1979, and the first ever prepared by the Los Alamos Area Office. Much of the subject matter was classified. The facility had been specially designed to minimize impacts to a nearby prehistoric Native American ruin, and extensive consultation with American Indian Pueblos was required. The week that the draft EIS was published, Laboratory biologists identified a previously unknown pair of Mexican spotted owls in the immediate vicinity of the project, bringing into play the consultation requirements of the Endangered Species Act. In spite of these obstacles, the resultant DARHT EIS was reviewed by the court and found to meet all statutory and regulatory requirements; the court praised the treatment of the classified material which served as a basis for the environmental analysis.
Dose Limits for Man do not Adequately Protect the Ecosystem
Higley, Kathryn A.; Alexakhin, Rudolf M.; McDonald, Joseph C.
2004-08-01
It has been known for quite some time that different organisms display differing degrees of sensitivity to the effects of ionizing radiations. Some microorganisms, such as the bacterium Micrococcus radiodurans, along with many species of invertebrates, are extremely radio-resistant. Humans might be categorized as being relatively sensitive to radiation, and are a bit more resistant than some pine trees. Therefore, it could be argued that maintaining the dose limits necessary to protect humans will also result in the protection of most other species of flora and fauna. This concept is usually referred to as the anthropocentric approach: in other words, if man is protected then the environment is also adequately protected. The ecocentric approach might be stated as: the health of humans is effectively protected only when the environment is not unduly exposed to radiation. The ICRP is working on new recommendations dealing with the protection of the environment, and this debate should help to highlight a number of relevant issues concerning that topic.
Vijayaraj, Veeraraghavan; Cheriyadat, Anil M; Bhaduri, Budhendra L; Vatsavai, Raju; Bright, Eddie A
2008-01-01
Statistical properties of high-resolution overhead images representing different land use categories are analyzed using various local and global statistical image properties based on the shape of the power spectrum, image gradient distributions, edge co-occurrence, and inter-scale wavelet coefficient distributions. The analysis was performed on a database of high-resolution (1 meter) overhead images representing a multitude of different downtown, suburban, commercial, agricultural and wooded exemplars. Various statistical properties relating to these image categories and their relationships are discussed. The categorical variations in power spectrum contour shapes, the unique gradient distribution characteristics of wooded categories, the similarity in edge co-occurrence statistics for overhead and natural images, and the unique edge co-occurrence statistics of downtown categories are presented in this work. Though previous work on natural image statistics has shown some of the unique characteristics for different categories, the relationships for overhead images are not well understood. The statistical properties of natural images were used in previous studies to develop prior image models, to predict and index objects in a scene and to improve computer vision models. The results from our research findings can be used to augment and adapt computer vision algorithms that rely on prior image statistics to process overhead images, calibrate the performance of overhead image analysis algorithms, and derive features for better discrimination of overhead image categories.
On Adequate Comparisons of Antenna Phase Center Variations
NASA Astrophysics Data System (ADS)
Schoen, S.; Kersten, T.
2013-12-01
One important part of ensuring the high quality of the International GNSS Service's (IGS) products is the collection and publication of receiver and satellite antenna phase center variations (PCV). The PCV are crucial for global and regional networks, since they introduce a global scale factor of up to 16 ppb or changes in the height component of up to 10 cm, respectively. Furthermore, antenna phase center variations are also important for precise orbit determination, navigation and positioning of mobile platforms, e.g., the GOCE and GRACE gravity missions, or for accurate Precise Point Positioning (PPP) processing. Using the EUREF Permanent Network (EPN), Baire et al. (2012) showed that individual PCV values have a significant impact on geodetic positioning. These statements are further supported by studies of Steigenberger et al. (2013), where the impact of PCV on local ties is analysed. Currently, there are five calibration institutions, including the Institut für Erdmessung (IfE), contributing to the IGS PCV file. Different approaches, like field calibrations and anechoic chamber measurements, are in use. Additionally, the computation and parameterization of the PCV differ completely between the methods. Therefore, every new approach has to pass a benchmark test in order to ensure that variations of PCV values of an identical antenna obtained from different methods are as consistent as possible. Since the number of approaches to obtain these PCV values rises with the number of calibration institutions, there is the necessity for an adequate comparison concept, taking into account not only the numerical values but also stochastic information and computational issues of the determined PCVs. This is of special importance, since the majority of calibrated receiver antennas published by the IGS originate from absolute field calibrations based on the Hannover Concept, Wübbena et al. (2000). In this contribution, a concept for the adequate…
Data series embedding and scale invariant statistics.
Michieli, I; Medved, B; Ristov, S
2010-06-01
Data sequences acquired from bio-systems such as human gait data, heart rate interbeat data, or DNA sequences exhibit complex dynamics that is frequently described by a long-memory or power-law decay of the autocorrelation function. One way of characterizing that dynamics is through scale invariant statistics or "fractal-like" behavior. Several methods have been proposed for quantifying the scale invariant parameters of physiological signals. Among them the most common are detrended fluctuation analysis, sample mean variance analyses, power spectral density analysis, R/S analysis, and recently, in the realm of the multifractal approach, wavelet analysis. In this paper it is demonstrated that embedding the time series data in a high-dimensional pseudo-phase space reveals scale invariant statistics in a simple fashion. The procedure is applied to different stride interval data sets from human gait measurement time series (PhysioBank data library). Results show that the introduced mapping adequately separates long-memory from random behavior. Smaller gait data sets were analyzed, and scale-free trends for limited scale intervals were successfully detected. The method was verified on artificially produced time series with known scaling behavior and with varying content of noise. The possibility of the method falsely detecting long-range dependence in artificially generated short-range dependence series was investigated.
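Of the methods the abstract lists, detrended fluctuation analysis is perhaps the most widely used baseline; a minimal sketch on synthetic data (not the embedding method the paper itself proposes) shows how a scaling exponent is read off the log-log slope of the fluctuation function:

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: returns F(n) for each scale n.

    The slope of log F(n) versus log n estimates the scaling exponent
    alpha (alpha ~ 0.5 for uncorrelated noise, ~1.0 for 1/f noise)."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        segments = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        fluct = []
        for seg in segments:
            # Least-squares linear detrend within each segment
            coef = np.polyfit(t, seg, 1)
            fluct.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(fluct)))
    return np.array(F)

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)              # uncorrelated surrogate series
scales = np.array([8, 16, 32, 64, 128])
F = dfa(white, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

For the white-noise surrogate the fitted exponent comes out near 0.5, the signature of random (memoryless) behavior that the paper's mapping is designed to separate from long memory.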
Bridge, P D; Sawilowsky, S S
1999-03-01
To effectively evaluate medical literature, practicing physicians and medical researchers must understand the impact of statistical tests on research outcomes. Applying inefficient statistics not only increases the need for resources, but more importantly increases the probability of committing a Type I or Type II error. The t-test is one of the most prevalent tests used in the medical field and is the uniformly most powerful unbiased (UMPU) test under normal curve theory. But does it maintain its UMPU properties when assumptions of normality are violated? A Monte Carlo investigation evaluates the comparative power of the independent samples t-test and its nonparametric counterpart, the Wilcoxon Rank-Sum (WRS) test, under violations of population normality, using three commonly occurring distributions and small sample sizes. The t-test was more powerful under relatively symmetric distributions, although the magnitude of the differences was moderate. Under distributions with extreme skews, the WRS held large power advantages. When distributions consist of heavier tails or extreme skews, the WRS should be the test of choice. In turn, when population characteristics are unknown, the WRS is recommended, based on the magnitude of these power differences in extreme skews and the modest variation in symmetric distributions.
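A Monte Carlo comparison of this kind can be sketched as follows. The critical values are hardcoded approximations (t with 38 df ≈ 2.024, standard normal ≈ 1.96), and the lognormal parent merely stands in for an extreme-skew distribution; neither is taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def t_stat(a, b):
    # Pooled-variance two-sample t statistic
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(sp2 * (1 / na + 1 / nb))

def wrs_stat(a, b):
    # Wilcoxon rank-sum statistic, standardized under the null (no ties)
    na, nb = len(a), len(b)
    ranks = np.argsort(np.argsort(np.concatenate([a, b]))) + 1
    W = ranks[:na].sum()
    mu = na * (na + nb + 1) / 2.0
    sd = np.sqrt(na * nb * (na + nb + 1) / 12.0)
    return (W - mu) / sd

def mc_power(sampler, shift, n=20, n_rep=2000):
    """Estimate two-sided rejection rates of both tests for a pure
    location shift between two groups of size n."""
    t_rej = w_rej = 0
    for _ in range(n_rep):
        a, b = sampler(n), sampler(n) + shift
        t_rej += abs(t_stat(a, b)) > 2.024   # approx. t crit, df = 38
        w_rej += abs(wrs_stat(a, b)) > 1.96  # normal approximation
    return t_rej / n_rep, w_rej / n_rep

# Extreme skew: lognormal parent with a pure location shift
t_pow, w_pow = mc_power(lambda n: rng.lognormal(0.0, 1.0, n), shift=1.0)
```

Under this heavily skewed parent the rank-sum test rejects far more often than the t-test, reproducing the qualitative pattern the abstract reports; rerunning with a normal sampler reverses the (much smaller) advantage.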
Are Vancomycin Trough Concentrations Adequate for Optimal Dosing?
Youn, Gilmer; Jones, Brenda; Jelliffe, Roger W.; Drusano, George L.; Rodvold, Keith A.; Lodise, Thomas P.
2014-01-01
The current vancomycin therapeutic guidelines recommend the use of only trough concentrations to manage the dosing of adults with Staphylococcus aureus infections. Both vancomycin efficacy and toxicity are likely to be related to the area under the plasma concentration-time curve (AUC). We assembled richly sampled vancomycin pharmacokinetic data from three studies comprising 47 adults with various levels of renal function. With Pmetrics, the nonparametric population modeling package for R, we compared AUCs estimated from models derived from trough-only and peak-trough depleted versions of the full data set and characterized the relationship between the vancomycin trough concentration and AUC. The trough-only and peak-trough depleted data sets underestimated the true AUCs compared to the full model by a mean (95% confidence interval) of 23% (11 to 33%; P = 0.0001) and 14% (7 to 19%; P < 0.0001), respectively. In contrast, using the full model as a Bayesian prior with trough-only data allowed 97% (93 to 102%; P = 0.23) accurate AUC estimation. On the basis of 5,000 profiles simulated from the full model, among adults with normal renal function and a therapeutic AUC of ≥400 mg · h/liter for an organism for which the vancomycin MIC is 1 mg/liter, approximately 60% are expected to have a trough concentration below the suggested minimum target of 15 mg/liter for serious infections, which could result in needlessly increased doses and a risk of toxicity. Our data indicate that adjustment of vancomycin doses on the basis of trough concentrations without a Bayesian tool results in poor achievement of maximally safe and effective drug exposures in plasma and that many adults can have an adequate vancomycin AUC with a trough concentration of <15 mg/liter. PMID:24165176
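The trough/AUC mismatch can be illustrated with a simple steady-state one-compartment simulation. All numbers below (clearance and volume distributions, the IV-bolus approximation, the dosing regimen) are assumed for illustration and are not the study's population model; the point is only that many simulated adults reach the AUC target while their trough stays below 15 mg/liter.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical between-subject variability in one-compartment parameters
n = 5000
CL = rng.lognormal(np.log(4.0), 0.25, n)   # clearance (liters/h), assumed
V = rng.lognormal(np.log(50.0), 0.20, n)   # volume of distribution (liters), assumed
dose, tau = 1000.0, 12.0                   # 1 g IV every 12 h at steady state

# Steady-state 24-h AUC depends only on daily dose and clearance
auc24 = 2.0 * dose / CL
# Steady-state trough under an IV-bolus approximation
k = CL / V
trough = (dose / V) * np.exp(-k * tau) / (1.0 - np.exp(-k * tau))

therapeutic = auc24 >= 400.0               # target for MIC = 1 mg/liter
frac_low_trough = np.mean(trough[therapeutic] < 15.0)
```

Even in this toy model a large share of subjects with an adequate AUC show troughs under the 15 mg/liter target, which is the mechanism behind the abstract's warning about needless dose increases.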
Matching occupation and self: does matching theory adequately model children's thinking?
Watson, Mark; McMahon, Mary
2004-10-01
The present exploratory-descriptive cross-national study focused on the career development of 11- to 14-yr.-old children, in particular whether they can match their personal characteristics with their occupational aspirations. Further, the study explored whether their matching may be explained in terms of a fit between person and environment using Holland's theory as an example. Participants included 511 South African and 372 Australian children. Findings relate to two items of the Revised Career Awareness Survey that require children to relate personal-social knowledge to their favorite occupation. Data were analyzed in three stages using descriptive statistics, i.e., mean scores, frequencies, and percentage agreement. The study indicated that children perceived their personal characteristics to be related to their occupational aspirations. However, how this matching takes place is not adequately accounted for in terms of a career theory such as that of Holland.
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
NASA Technical Reports Server (NTRS)
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
ERIC Educational Resources Information Center
Maydeu-Olivares, Alberto; Montano, Rosa
2013-01-01
We investigate the performance of three statistics, R₁, R₂ (Glas in "Psychometrika" 53:525-546, 1988), and M₂ (Maydeu-Olivares & Joe in "J. Am. Stat. Assoc." 100:1009-1020, 2005; "Psychometrika" 71:713-732, 2006), to assess the overall fit of a one-parameter logistic model…
Cappon, Gregg D; Bowman, Christopher J; Hurtt, Mark E; Grantham, Lonnie E
2012-10-01
An important aspect of the enhanced pre- and postnatal developmental (ePPND) toxicity study in nonhuman primates (NHP) is that it combines in utero and postnatal assessments in a single study. However, it is unclear if NHP ePPND studies are suitable to perform all of the evaluations incorporated into rodent PPND studies. To understand the value of including cognitive assessment in a NHP ePPND toxicity study, we performed a power analysis of object discrimination reversal task data using a modified Wisconsin General Testing Apparatus (ODR-WGTA) from two NHP ePPND studies. ODR-WGTA endpoints evaluated were days to learning and to first reversal, and number of reversals. With α = 0.05 and a one-sided t-test, a sample of seven provided 80% power to predict a 100% increase in all three of the ODR-WGTA endpoints; a sample of 25 provided 80% power to predict a 50% increase. Similar power analyses were performed with data from the Cincinnati Water Maze (CWM) and passive avoidance tests from three rat PPND toxicity studies. Groups of 5 and 15 in the CWM and passive avoidance test, respectively, provided 80% power to detect a 100% change. While the power of the CWM is not far superior to the NHP ODR-WGTA, a clear advantage is the routine use of larger sample size, with a group of 20 rats the CWM provides ~90% power to detect a 50% change. Due to the limitations on the number of animals, the ODR-WGTA may not be suitable for assessing cognitive impairment in NHP ePPND studies.
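Power analyses of the kind described above can be approximated by Monte Carlo simulation. The sketch below estimates power for a generic one-sided two-sample t-test with an assumed standardized effect of 1.5 SD; it is illustrative only and does not reproduce the authors' ODR-WGTA analysis:

```python
import random
import statistics

def tstat(a, b):
    """Two-sample t statistic with pooled variance (b is the treated group)."""
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (statistics.mean(b) - statistics.mean(a)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

def mc_power(n, effect, alpha=0.05, reps=4000, seed=1):
    """Monte Carlo power of a one-sided two-sample t-test on unit-SD normals."""
    rng = random.Random(seed)
    # Empirical critical value from the null distribution of the t statistic
    null = sorted(tstat([rng.gauss(0, 1) for _ in range(n)],
                        [rng.gauss(0, 1) for _ in range(n)]) for _ in range(reps))
    crit = null[int((1 - alpha) * reps)]
    # Fraction of alternative-hypothesis replicates exceeding the critical value
    hits = sum(tstat([rng.gauss(0, 1) for _ in range(n)],
                     [rng.gauss(effect, 1) for _ in range(n)]) > crit
               for _ in range(reps))
    return hits / reps

# With n = 7 per group and an assumed large effect of 1.5 SD,
# power is in the neighborhood of the 80% figure reported for n = 7.
power_n7 = mc_power(7, 1.5)
```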
... population, or about 25 million Americans, has experienced tinnitus lasting at least five minutes in the past ... by NIDCD Epidemiology and Statistics Program staff: (1) tinnitus prevalence was obtained from the 2008 National Health ...
Statistical Energy Analysis Program
NASA Technical Reports Server (NTRS)
Ferebee, R. C.; Trudell, R. W.; Yano, L. I.; Nygaard, S. I.
1985-01-01
Statistical Energy Analysis (SEA) is a powerful tool for estimating high-frequency vibration spectra of complex structural systems, and it has been incorporated into a computer program. The basic SEA analysis procedure is divided into three steps: idealization, parameter generation, and problem solution. The SEA computer program is written in FORTRAN V for batch execution.
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced.
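The diagnostic-test measures discussed in the review follow directly from a 2x2 contingency table. A minimal sketch with hypothetical counts:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Core diagnostic-test statistics from a 2x2 table of test vs. disease."""
    sens = tp / (tp + fn)          # sensitivity: P(test+ | disease)
    spec = tn / (tn + fp)          # specificity: P(test- | no disease)
    acc = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sens / (1 - spec)     # positive likelihood ratio
    lr_neg = (1 - sens) / spec     # negative likelihood ratio
    return {"sensitivity": sens, "specificity": spec,
            "accuracy": acc, "LR+": lr_pos, "LR-": lr_neg}

# Hypothetical screening study: 90 true positives, 10 false negatives,
# 20 false positives, 180 true negatives.
m = diagnostic_metrics(tp=90, fp=20, fn=10, tn=180)
```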
Statistical inference involving binomial and negative binomial parameters.
García-Pérez, Miguel A; Núñez-Antón, Vicente
2009-05-01
Statistical inference about two binomial parameters implies that they are both estimated by binomial sampling. There are occasions in which one aims at testing the equality of two binomial parameters before and after the occurrence of the first success along a sequence of Bernoulli trials. In these cases, the binomial parameter before the first success is estimated by negative binomial sampling whereas that after the first success is estimated by binomial sampling, and both estimates are related. This paper derives statistical tools to test two hypotheses, namely, that both binomial parameters equal some specified value and that both parameters are equal though unknown. Simulation studies are used to show that in small samples both tests are accurate in keeping the nominal Type-I error rates, and also to determine sample size requirements to detect large, medium, and small effects with adequate power. Additional simulations also show that the tests are sufficiently robust to certain violations of their assumptions.
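The two sampling schemes contrasted in the abstract can be illustrated by simulation: the parameter before the first success is estimated from the number of trials needed to reach it (negative binomial/geometric sampling), while the parameter after is estimated from a fixed number of trials (binomial sampling). This sketch shows the setup only; it does not implement the paper's actual test statistics:

```python
import random

def simulate_split_estimates(p_before, p_after, n_after, rng):
    """Estimate a success probability before and after the first success."""
    # Negative binomial (geometric) phase: trials up to and including
    # the first success; the natural point estimate is 1/trials.
    trials = 1
    while rng.random() >= p_before:
        trials += 1
    p_hat_before = 1.0 / trials
    # Binomial phase: fixed number of trials after the first success.
    successes = sum(rng.random() < p_after for _ in range(n_after))
    p_hat_after = successes / n_after
    return p_hat_before, p_hat_after

rng = random.Random(42)
ests = [simulate_split_estimates(0.3, 0.3, 50, rng) for _ in range(2000)]
mean_after = sum(e[1] for e in ests) / len(ests)  # binomial estimate, ~0.3
```

The binomial-phase estimate is unbiased; the geometric-phase estimate 1/trials is not, which is one reason the paper needs dedicated test machinery rather than a naive comparison.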
Kawano, Toshihiko
2015-11-10
This theoretical treatment of low-energy compound nucleus reactions begins with the Bohr hypothesis, with corrections, and various statistical theories. The author investigates the statistical properties of the scattering matrix containing a Gaussian Orthogonal Ensemble (GOE) Hamiltonian in the propagator. The following conclusions are reached: For all parameter values studied, the numerical average of MC-generated cross sections coincides with the result of the Verbaarschot, Weidenmueller, Zirnbauer triple-integral formula. Energy average and ensemble average agree reasonably well when the width Γ is one or two orders of magnitude larger than the average resonance spacing d. In the strong-absorption limit, the channel degree of freedom ν_a is 2. The direct reaction increases the inelastic cross sections while the elastic cross section is reduced.
Understanding Solar Flare Statistics
NASA Astrophysics Data System (ADS)
Wheatland, M. S.
2005-12-01
A review is presented of work aimed at understanding solar flare statistics, with emphasis on the well known flare power-law size distribution. Although avalanche models are perhaps the favoured model to describe flare statistics, their physical basis is unclear, and they are divorced from developing ideas in large-scale reconnection theory. An alternative model, aimed at reconciling large-scale reconnection models with solar flare statistics, is revisited. The solar flare waiting-time distribution has also attracted recent attention. Observed waiting-time distributions are described, together with what they might tell us about the flare phenomenon. Finally, a practical application of flare statistics to flare prediction is described in detail, including the results of a year of automated (web-based) predictions from the method.
21 CFR 801.5 - Medical devices; adequate directions for use.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...
21 CFR 801.5 - Medical devices; adequate directions for use.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...
21 CFR 801.5 - Medical devices; adequate directions for use.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...
21 CFR 801.5 - Medical devices; adequate directions for use.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...
21 CFR 801.5 - Medical devices; adequate directions for use.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Medical devices; adequate directions for use. 801... (CONTINUED) MEDICAL DEVICES LABELING General Labeling Provisions § 801.5 Medical devices; adequate directions for use. Adequate directions for use means directions under which the layman can use a device...
ERIC Educational Resources Information Center
Chicot, Katie; Holmes, Hilary
2012-01-01
The use, and misuse, of statistics is commonplace, yet in the printed format data representations can be either over simplified, supposedly for impact, or so complex as to lead to boredom, supposedly for completeness and accuracy. In this article the link to the video clip shows how dynamic visual representations can enliven and enhance the…
NASA Astrophysics Data System (ADS)
Khan, Shahjahan
Often scientific information on various data generating processes is presented in the form of numerical and categorical data. Except on some very rare occasions, such data generally represent a small part of the population, or selected outcomes of the data generating process. Although valuable and useful information is lurking in the array of scientific data, it is generally unavailable to the users. Appropriate statistical methods are essential to reveal the hidden "jewels" in the mess of the raw data. Exploratory data analysis methods are used to uncover such valuable characteristics of the observed data. Statistical inference provides techniques to make valid conclusions about the unknown characteristics or parameters of the population from which scientifically drawn sample data are selected. Usually, statistical inference includes estimation of population parameters as well as tests of hypotheses on the parameters. However, prediction of future responses and determination of prediction distributions are also part of statistical inference. Both Classical (or Frequentist) and Bayesian approaches are used in statistical inference. The commonly used Classical approach is based on the sample data alone. In contrast, the increasingly popular Bayesian approach uses a prior distribution on the parameters along with the sample data to make inferences. Non-parametric and robust methods are also used in situations where commonly used model assumptions are unsupported. In this chapter, we cover the philosophical and methodological aspects of both the Classical and Bayesian approaches. Moreover, some aspects of predictive inference are also included. In the absence of any evidence to support assumptions regarding the distribution of the underlying population, or if the variable is measured only on an ordinal scale, non-parametric methods are used. Robust methods are employed to avoid any significant changes in the results due to deviations from the model
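The contrast drawn between the Classical and Bayesian approaches can be made concrete for a binomial proportion: the Classical interval uses the sample alone, while the Bayesian posterior combines a prior with the data. A minimal sketch with hypothetical counts (a Wald interval and a conjugate Beta prior, chosen for simplicity):

```python
import math

def classical_ci(successes, n, z=1.96):
    """Wald 95% confidence interval for a binomial proportion (sample only)."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

def bayesian_posterior_mean(successes, n, a=1.0, b=1.0):
    """Posterior mean under a Beta(a, b) prior, conjugate to the binomial."""
    return (successes + a) / (n + a + b)

# Hypothetical data: 18 successes in 60 Bernoulli trials.
lo, hi = classical_ci(18, 60)                 # frequentist: data alone
post_mean = bayesian_posterior_mean(18, 60)   # Bayesian: prior + data
```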
Rendón-Macías, Mario Enrique; Villasís-Keever, Miguel Ángel; Miranda-Novales, María Guadalupe
2016-01-01
Descriptive statistics is the branch of statistics that gives recommendations on how to summarize research data clearly and simply in tables, figures, charts, or graphs. Before performing a descriptive analysis it is paramount to state its goal or goals, and to identify the measurement scales of the different variables recorded in the study. Tables or charts aim to provide timely information on the results of an investigation. Graphs show trends and can be histograms, pie charts, "box and whiskers" plots, line graphs, or scatter plots. Images serve as examples to reinforce concepts or facts. The choice of a chart, graph, or image must be based on the study objectives. Usually it is not recommended to use more than seven in an article, depending also on its length.
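The summary measures that such descriptive tables typically report can be computed with a few lines of standard-library code; the data values here are hypothetical:

```python
import statistics

# Hypothetical continuous measurements from a study variable
data = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.0, 4.4]

summary = {
    "n": len(data),
    "mean": statistics.mean(data),
    "median": statistics.median(data),
    "sd": statistics.stdev(data),      # sample standard deviation
    "min": min(data),
    "max": max(data),
}
```

Which measures belong in the table depends on the measurement scale: mean and SD suit roughly symmetric continuous data, while median and range are preferred for skewed or ordinal variables.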
Order Statistics and Nonparametric Statistics.
2014-09-26
Topics investigated include the following: probability that a fuze will fire; moving order statistics; distribution theory and properties of the...problem posed by an Army scientist: A fuze will fire when at least n-1 (or n-2) of n detonators function within time span t. What is the probability of
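The fuze question quoted above is a straightforward binomial tail probability, assuming the detonators function independently with a common probability p. A sketch with illustrative values (n = 4 and p = 0.9 are assumptions, not figures from the report):

```python
from math import comb

def prob_at_least(k, n, p):
    """P(at least k of n independent detonators function), each with prob p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Fuze fires if at least n-1 of n detonators function within the time span:
# here n = 4, so at least 3 must function, each with probability 0.9.
p_fire = prob_at_least(3, 4, 0.9)
```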
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
ERIC Educational Resources Information Center
Martin, Tammy Faith
2012-01-01
The purpose of this study was to examine principal leadership styles and their influence on school performance as measured by adequate yearly progress at selected Title I schools in South Carolina. The main focus of the research study was to complete descriptive statistics on principal leadership styles in schools that met or did not meet adequate…
Scholfield, D.J.; Fields, M.; Beal, T.; Lewis, C.G.; Behall, K.M.
1989-02-09
The symptoms of copper (Cu) deficiency are known to be more severe when rats are fed a diet with fructose (F) as the principal carbohydrate. Mortality, in males, due to cardiac abnormalities usually occurs after five weeks of a 62% F, 0.6 ppm Cu deficient diet. These effects are not observed if cornstarch (CS) is the carbohydrate (CHO) source. Studies with F containing diets have shown increased catecholamine (C) turnover rates while diets deficient in Cu result in decreased norepinephrine (N) levels in tissues. Dopamine B-hydroxylase (EC 1.14.17.1) is a Cu dependent enzyme which catalyzes the conversion of dopamine (D) to N. An experiment was designed to investigate the effects of CHO and dietary Cu on levels of three C in cardiac tissue. Thirty-two male and female Sprague-Dawley rats were fed Cu deficient or adequate diets with 60% of calories from F or CS for 6 weeks. N, epinephrine (E) and D were measured by HPLC. Statistical analysis indicates that Cu deficiency tends to decrease N levels, while having the reverse effect on E. D did not appear to change. These findings indicate that Cu deficiency but not dietary CHO can affect the concentration of N and E in rat cardiac tissue.
Adequate nutrient intake can reduce cardiovascular disease risk in African Americans.
Reusser, Molly E; DiRienzo, Douglas B; Miller, Gregory D; McCarron, David A
2003-03-01
Cardiovascular disease kills nearly as many Americans each year as the next seven leading causes of death combined. The prevalence of cardiovascular disease and most of its associated risk factors is markedly higher and increasing more rapidly among African Americans than in any other racial or ethnic group. Improving these statistics may be simply a matter of improving diet quality. In recent years, a substantial and growing body of evidence has revealed that dietary patterns complete in all food groups, including nutrient-rich dairy products, are essential for preventing and reducing cardiovascular disease and the conditions that contribute to it. Several cardiovascular risk factors, including hypertension, insulin resistance syndrome, and obesity, have been shown to be positively influenced by dietary patterns that include adequate intake of dairy products. The benefits of nutrient-rich dietary patterns have been specifically tested in randomized, controlled trials emphasizing African American populations. These studies demonstrated proportionally greater benefits for African Americans without evidence of adverse effects such as symptoms of lactose intolerance. As currently promoted for the prevention of certain cancers and osteoporosis, regular consumption of diets that meet recommended nutrient intake levels might also be the most effective approach for reducing cardiovascular disease risk in African Americans.
Gaussian membership functions are most adequate in representing uncertainty in measurements
NASA Technical Reports Server (NTRS)
Kreinovich, V.; Quintana, C.; Reznik, L.
1992-01-01
In rare situations, like fundamental physics, we perform experiments without knowing what their results will be. In the majority of real-life measurement situations, we more or less know beforehand what kind of results we will get. Of course, this is not the precise knowledge of the type 'the result will be between alpha - beta and alpha + beta,' because in this case, we would not need any measurements at all. This is usually knowledge that is best represented in uncertain terms, like 'perhaps (or 'most likely', etc.) the measured value x is between alpha - beta and alpha + beta.' Traditional statistical methods neglect this additional knowledge and process only the measurement results. So it is desirable to be able to process this uncertain knowledge as well. A natural way to process it is by using fuzzy logic. But there is a problem: we can use different membership functions to represent the same uncertain statements, and different functions lead to different results. What membership function do we choose? In the present paper, we show that under some reasonable assumptions, Gaussian functions μ(x) = exp(−βx²) are the most adequate choice of membership functions for representing uncertainty in measurements. This representation was efficiently used in testing jet engines for airplanes and spaceships.
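The Gaussian membership function advocated in the abstract is easy to evaluate; the sketch below centers it at an expected value alpha, matching the 'between alpha - beta and alpha + beta' framing, with illustrative numbers:

```python
import math

def gaussian_membership(x, alpha, beta):
    """Degree to which measured value x matches 'around alpha'; beta sets width."""
    return math.exp(-beta * (x - alpha) ** 2)

# Degree to which a reading of 10.3 matches prior knowledge "about 10"
# (alpha = 10.0 and beta = 4.0 are illustrative values):
mu = gaussian_membership(10.3, alpha=10.0, beta=4.0)
```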
NASA Astrophysics Data System (ADS)
Mondal, Rajesh; Bharadwaj, Somnath; Majumdar, Suman
2017-01-01
The epoch of reionization (EoR) 21-cm signal is expected to become highly non-Gaussian as reionization progresses. This severely affects the error-covariance of the EoR 21-cm power spectrum, which is important for predicting the prospects of a detection with ongoing and future experiments. Most earlier works have assumed that the EoR 21-cm signal is a Gaussian random field where (1) the error-variance depends only on the power spectrum and the number of Fourier modes in the particular k bin, and (2) the errors in the different k bins are uncorrelated. Here, we use an ensemble of simulated 21-cm maps to analyse the error-covariance at various stages of reionization. We find that even at the very early stages of reionization (x̄_HI ≈ 0.9), the error-variance significantly exceeds the Gaussian predictions at small length-scales (k > 0.5 Mpc⁻¹) while they are consistent at larger scales. The errors in most k bins (both large and small scales) are however found to be correlated. Considering the later stages (x̄_HI = 0.15), the error-variance shows an excess in all k bins with k ≥ 0.1 Mpc⁻¹, and it is around 200 times larger than the Gaussian prediction at k ≈ 1 Mpc⁻¹. The errors in the different k bins are also all highly correlated, barring the two smallest k bins, which are anti-correlated with the other bins. Our results imply that predictions for different 21-cm experiments based on the Gaussian assumption underestimate the errors, and it is necessary to incorporate the non-Gaussianity for more realistic predictions.
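The Gaussian baseline against which the paper measures the excess, an error-variance of P²/N_k for a bin containing N_k independent Fourier modes, can be checked with a toy ensemble of Gaussian modes (a drastic simplification of the actual 21-cm map simulations):

```python
import random

def binned_power_variance(n_maps, n_modes, p_true, rng):
    """Ensemble variance of a binned power-spectrum estimate, Gaussian field.

    Each map contributes n_modes independent complex Fourier modes with
    <|d_k|^2> = p_true; the binned estimate averages |d_k|^2 over the bin.
    """
    estimates = []
    for _ in range(n_maps):
        modes = [complex(rng.gauss(0, (p_true / 2) ** 0.5),
                         rng.gauss(0, (p_true / 2) ** 0.5))
                 for _ in range(n_modes)]
        estimates.append(sum(abs(d) ** 2 for d in modes) / n_modes)
    mean = sum(estimates) / n_maps
    return sum((e - mean) ** 2 for e in estimates) / (n_maps - 1)

rng = random.Random(0)
var_est = binned_power_variance(n_maps=500, n_modes=100, p_true=1.0, rng=rng)
gaussian_prediction = 1.0 ** 2 / 100  # P^2 / N_k for a Gaussian field
```

For a genuinely Gaussian field the ensemble variance matches P²/N_k; the paper's point is that the reionization signal violates this, by up to a factor of ~200 at small scales.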
Not Available
1994-12-08
This report presents a summary of electric power industry statistics at the national, regional, and state levels: generating capability and additions, net generation, fossil-fuel statistics, retail sales and revenue, financial statistics, environmental statistics, power transactions, demand-side management, and nonutility power producers. Its purpose is to provide industry decisionmakers, government policymakers, analysts, and the public with historical data that may be used in understanding US electricity markets.
Thiele, Bret
2002-01-01
The human right to adequate housing is enshrined in international law. The right to adequate housing can be traced to the Universal Declaration of Human Rights, which was unanimously adopted by the world community in 1948. Since that time, the right to adequate housing has been reaffirmed on numerous occasions and further defined and elaborated. A key component of this right is habitability of housing, which should comply with health and safety standards. Therefore, the right to adequate housing provides an additional tool for advocates and others interested in promoting healthful housing and living conditions and thereby protecting individual and community health. PMID:11988432
Statistical Physics of Fracture
Alava, Mikko; Nukala, Phani K; Zapperi, Stefano
2006-05-01
Disorder and long-range interactions are two of the key components that make material failure an interesting playfield for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture, in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds is subject to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among these partly still controversial issues are the scaling and size effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.
NASA Astrophysics Data System (ADS)
Paine, Gregory Harold
1982-03-01
The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better
[Big data in official statistics].
Zwick, Markus
2015-08-01
The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.
The Effects of Flare Definitions on the Statistics of Derived Flare Distributions
NASA Astrophysics Data System (ADS)
Ryan, Daniel; Dominique, Marie; Seaton, Daniel B.; Stegen, Koen; White, Arthur
2016-05-01
The statistical examination of solar flares is crucial to revealing their global characteristics and behaviour. However, statistical flare studies are often performed using standard but basic flare detection algorithms relying on arbitrary thresholds, which may affect the derived flare distributions. We explore the effect of the arbitrary thresholds used in the GOES event list and LYRA Flare Finder algorithms. We find that there is a small but significant relationship between the power-law exponent of the GOES flare peak flux frequency distribution and the algorithms’ flare start thresholds. We also find that the power-law exponents of these distributions are not stable but appear to steepen with increasing peak flux. This implies that the observed flare size distribution may not be a power law at all. We show that depending on the true value of the exponent of the flare size distribution, this deviation from a power law may be due to flares missed by the flare detection algorithms. However, it is not possible to determine the true exponent from GOES/XRS observations. Additionally, we find that the PROBA2/LYRA flare size distributions are clearly non-power-law. We show that this is consistent with an insufficient degradation correction, which causes LYRA absolute irradiance values to be unreliable. This means that they should not be used for flare statistics or energetics unless degradation is adequately accounted for. However, they can be used to study time variations over shorter timescales and for space weather monitoring.
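The power-law exponent at issue here is typically fitted by maximum likelihood. This toy sketch draws synthetic "peak fluxes" from a known power law and recovers the exponent with the standard Hill estimator; it is not the GOES or LYRA pipeline, and the parameter values are illustrative:

```python
import math
import random

def sample_power_law(alpha, xmin, n, rng):
    """Draw n samples from p(x) ~ x^-alpha for x >= xmin (alpha > 1)."""
    return [xmin * (1 - rng.random()) ** (-1 / (alpha - 1)) for _ in range(n)]

def mle_exponent(xs, xmin):
    """Maximum-likelihood (Hill) estimate of the power-law exponent."""
    xs = [x for x in xs if x >= xmin]
    return 1 + len(xs) / sum(math.log(x / xmin) for x in xs)

rng = random.Random(7)
fluxes = sample_power_law(alpha=2.0, xmin=1.0, n=20000, rng=rng)
# Fitting above the true lower cutoff recovers the exponent; fitting a
# sample censored by an arbitrary detection threshold with the wrong
# xmin is exactly the kind of bias the paper investigates.
alpha_ok = mle_exponent(fluxes, xmin=1.0)
```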
NASA Astrophysics Data System (ADS)
Lambert, I. B.
2012-04-01
This presentation will consider the adequacy of global uranium and thorium resources to meet realistic nuclear power demand scenarios over the next half century. It is presented on behalf of, and based on evaluations by, the Uranium Group - a joint initiative of the OECD Nuclear Energy Agency and the International Atomic Energy Agency, of which the author is a Vice Chair. The Uranium Group produces a biennial report on Uranium Resources, Production and Demand based on information from some 40 countries involved in the nuclear fuel cycle, which also briefly reviews thorium resources. Uranium: In 2008, world production of uranium amounted to almost 44,000 tonnes (tU). This supplied approximately three-quarters of world reactor requirements (approx. 59,000 tU), the remainder being met by previously mined uranium (so-called secondary sources). Information on availability of secondary sources - which include uranium from excess inventories, dismantling nuclear warheads, tails and spent fuel reprocessing - is incomplete, but such sources are expected to decrease in market importance after 2013. In 2008, the total world Reasonably Assured plus Inferred Resources of uranium (recoverable at less than USD 130/kgU) amounted to 5.4 million tonnes. In addition, it is clear that there are vast amounts of uranium recoverable at higher costs in known deposits, plus many as yet undiscovered deposits. The Uranium Group has concluded that the uranium resource base is more than adequate to meet projected high-case requirements for nuclear power for at least half a century. This conclusion does not assume increasing replacement of uranium by fuels from reprocessing current reactor wastes, or by thorium, nor greater reactor efficiencies, which are likely to ameliorate future uranium demand. However, progressively increasing quantities of uranium will need to be mined, against a backdrop of the relatively small number of producing facilities around the world, geopolitical uncertainties and
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2010 CFR
2010-01-01
... veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT... and Adequate Veterinary Care § 2.40 Attending veterinarian and adequate veterinary care (dealers and... veterinary care to its animals in compliance with this section. (1) Each dealer and exhibitor shall employ...
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2010 CFR
2010-01-01
... veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE... adequate veterinary care. (a) Each research facility shall have an attending veterinarian who shall provide adequate veterinary care to its animals in compliance with this section: (1) Each research facility...
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...
40 CFR 152.20 - Exemptions for pesticides adequately regulated by another Federal agency.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Exemptions for pesticides adequately... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Exemptions § 152.20 Exemptions for pesticides adequately regulated by another Federal agency. The...
42 CFR 438.207 - Assurances of adequate capacity and services.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 4 2012-10-01 2012-10-01 false Assurances of adequate capacity and services. 438... Improvement Access Standards § 438.207 Assurances of adequate capacity and services. (a) Basic rule. The State... provides supporting documentation that demonstrates that it has the capacity to serve the...
42 CFR 438.207 - Assurances of adequate capacity and services.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 4 2014-10-01 2014-10-01 false Assurances of adequate capacity and services. 438... Improvement Access Standards § 438.207 Assurances of adequate capacity and services. (a) Basic rule. The State... provides supporting documentation that demonstrates that it has the capacity to serve the...
42 CFR 438.207 - Assurances of adequate capacity and services.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 4 2011-10-01 2011-10-01 false Assurances of adequate capacity and services. 438... Improvement Access Standards § 438.207 Assurances of adequate capacity and services. (a) Basic rule. The State... provides supporting documentation that demonstrates that it has the capacity to serve the...
42 CFR 438.207 - Assurances of adequate capacity and services.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 4 2013-10-01 2013-10-01 false Assurances of adequate capacity and services. 438... Improvement Access Standards § 438.207 Assurances of adequate capacity and services. (a) Basic rule. The State... provides supporting documentation that demonstrates that it has the capacity to serve the...
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2013 CFR
2013-01-01
..., DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian and adequate veterinary care. (a) Each research facility shall have an attending veterinarian who shall provide adequate veterinary care to its animals in compliance with this section: (1) Each research facility...
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2012 CFR
2012-01-01
..., DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian and adequate veterinary care. (a) Each research facility shall have an attending veterinarian who shall provide adequate veterinary care to its animals in compliance with this section: (1) Each research facility...
Statistical Challenges of Astronomy
NASA Astrophysics Data System (ADS)
Feigelson, Eric D.; Babu, G. Jogesh
Digital sky surveys, data from orbiting telescopes, and advances in computation have increased the quantity and quality of astronomical data by several orders of magnitude in recent years. Making sense of this wealth of data requires sophisticated statistical and data-analytic techniques. Fortunately, statistical methodologies have similarly made great strides in recent years. Powerful synergies thus emerge when astronomers and statisticians join in examining astrostatistical problems and approaches. The volume focuses on several themes:
· The increasing power of Bayesian approaches to modeling astronomical data
· The growth of enormous databases, leading to an emerging federated Virtual Observatory, and their impact on modern astronomical research
· Statistical modeling of critical datasets, such as galaxy clustering and fluctuations in the microwave background radiation, leading to a new era of precision cosmology
· Methodologies for uncovering clusters and patterns in multivariate data
· The characterization of multiscale patterns in imaging and time series data
As in earlier volumes in this series, research contributions discussing topics in one field are joined with commentary from scholars in the other. Short contributed papers covering dozens of astrostatistical topics are also included.
Code of Federal Regulations, 2011 CFR
2011-10-01
... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... of maintaining adequate technical, physical, and security safeguards to prevent...
Code of Federal Regulations, 2014 CFR
2014-10-01
... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of... of maintaining adequate technical, physical, and security safeguards to prevent...
Statistical considerations in monitoring birds over large areas
Johnson, D.H.
2000-01-01
The proper design of a monitoring effort depends primarily on the objectives desired, constrained by the resources available to conduct the work. Typically, managers have numerous objectives, such as determining abundance of the species, detecting changes in population size, evaluating responses to management activities, and assessing habitat associations. A design that is optimal for one objective will likely not be optimal for others. Careful consideration of the importance of the competing objectives may lead to a design that adequately addresses the priority concerns, although it may not be optimal for any individual objective. Poor design or inadequate sample sizes may result in such weak conclusions that the effort is wasted. Statistical expertise can be used at several stages, such as estimating power of certain hypothesis tests, but is perhaps most useful in fundamental considerations of describing objectives and designing sampling plans.
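The power estimation mentioned above can be sketched with a small Monte Carlo simulation; the effect size, variability, and test choice below are invented for illustration and are not taken from the paper:

```python
import random
import statistics

random.seed(42)

def simulated_power(n_sites, decline, sd, n_sims=2000):
    """Monte Carlo power: fraction of simulated surveys in which a
    one-sided z-style test on per-site between-year changes detects a
    mean decline at the 5% level. All effect sizes here are invented."""
    z_crit = 1.645  # one-sided 5% critical value
    hits = 0
    for _ in range(n_sims):
        # per-site change in log abundance; true mean change = -decline
        diffs = [random.gauss(-decline, sd) for _ in range(n_sites)]
        mean = statistics.fmean(diffs)
        se = statistics.stdev(diffs) / n_sites ** 0.5
        if mean / se < -z_crit:  # significant decline detected
            hits += 1
    return hits / n_sims

# More monitored sites -> higher power to detect the same decline.
print(simulated_power(20, decline=0.10, sd=0.25))
print(simulated_power(80, decline=0.10, sd=0.25))
```

Runs like this make the paper's trade-off concrete: a design with too few sites yields power so low that a real decline will usually go undetected.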
Barth, Amy E.; Barnes, Marcia; Francis, David J.; Vaughn, Sharon; York, Mary
2015-01-01
Separate mixed model analyses of variance (ANOVA) were conducted to examine the effect of textual distance on the accuracy and speed of text consistency judgments among adequate and struggling comprehenders across grades 6–12 (n = 1203). Multiple regressions examined whether accuracy in text consistency judgments uniquely accounted for variance in comprehension. Results suggest that there is considerable growth across the middle and high school years, particularly for adequate comprehenders in those text integration processes that maintain local coherence. Accuracy in text consistency judgments accounted for significant unique variance for passage-level, but not sentence-level comprehension, particularly for adequate comprehenders. PMID:26166946
Using Multitheory Model of Health Behavior Change to Predict Adequate Sleep Behavior.
Knowlden, Adam P; Sharma, Manoj; Nahar, Vinayak K
The purpose of this article was to use the multitheory model of health behavior change in predicting adequate sleep behavior in college students. A valid and reliable survey was administered in a cross-sectional design (n = 151). For initiation of adequate sleep behavior, the construct of behavioral confidence (P < .001) was found to be significant and accounted for 24.4% of the variance. For sustenance of adequate sleep behavior, changes in social environment (P < .02), emotional transformation (P < .001), and practice for change (P < .001) were significant and accounted for 34.2% of the variance.
Nonstationary statistical theory for multipactor
Anza, S.; Vicente, C.; Gil, J.
2010-06-15
This work presents a new and general approach to the real dynamics of the multipactor process: the nonstationary statistical multipactor theory. The nonstationary theory removes the stationarity assumption of the classical theory and, as a consequence, it is able to adequately model electron exponential growth as well as absorption processes, above and below the multipactor breakdown level. In addition, it considers both double-surface and single-surface interactions constituting a full framework for nonresonant polyphase multipactor analysis. This work formulates the new theory and validates it with numerical and experimental results with excellent agreement.
Photovoltaic power systems workshop
NASA Technical Reports Server (NTRS)
Killian, H. J.; Given, R. W.
1978-01-01
Discussions are presented on apparent deficiencies in NASA planning and technology development relating to a standard power module (25-35 kW) and to future photovoltaic power systems in general. Topics of discussion consider the following: (1) adequate studies on power systems; (2) whether a standard power system module should be developed from a standard spacecraft; (3) identification of proper approaches to cost reduction; (4) energy storage avoidance; (5) attitude control; (6) thermal effects of heat rejection on solar array configuration stability; (7) assembly of large power systems in space; and (8) factoring terrestrial photovoltaic work into space power systems for possible payoff.
Region 8: Colorado Lamar and Steamboat Springs Adequate Letter (11/12/2002)
This letter from EPA to the Colorado Department of Public Health and Environment determined that the Motor Vehicle Emissions Budgets in the Lamar and Steamboat Springs particulate matter (PM10) maintenance plans are adequate for transportation conformity purposes.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-05
... To Maintain Adequate Floodplain Management Regulations AGENCY: Federal Emergency Management Agency... floodplain management regulations meeting minimum requirements under the National Flood Insurance Program... they have brought their floodplain management regulations into compliance with the NFIP...
Region 9: California Adequate / Inadequate Letter Attachment (5/30/2008)
This document states that certain 8-hour ozone and PM2.5 motor vehicle emissions budgets in the 2007 South Coast State Implementation Plan have been found adequate for transportation conformity purposes.
General statistical framework for quantitative proteomics by stable isotope labeling.
Navarro, Pedro; Trevisan-Herraz, Marco; Bonzon-Kulichenko, Elena; Núñez, Estefanía; Martínez-Acedo, Pablo; Pérez-Hernández, Daniel; Jorge, Inmaculada; Mesa, Raquel; Calvo, Enrique; Carrascal, Montserrat; Hernáez, María Luisa; García, Fernando; Bárcena, José Antonio; Ashman, Keith; Abian, Joaquín; Gil, Concha; Redondo, Juan Miguel; Vázquez, Jesús
2014-03-07
The combination of stable isotope labeling (SIL) with mass spectrometry (MS) allows comparison of the abundance of thousands of proteins in complex mixtures. However, interpretation of the large data sets generated by these techniques remains a challenge because appropriate statistical standards are lacking. Here, we present a generally applicable model that accurately explains the behavior of data obtained using current SIL approaches, including (18)O, iTRAQ, and SILAC labeling, and different MS instruments. The model decomposes the total technical variance into the spectral, peptide, and protein variance components, and its general validity was demonstrated by confronting 48 experimental distributions against 18 different null hypotheses. In addition to its general applicability, the performance of the algorithm was at least comparable to that of other existing methods. The model also provides a general framework to integrate quantitative and error information fully, allowing a comparative analysis of the results obtained from different SIL experiments. The model was applied to the global analysis of protein alterations induced by low H₂O₂ concentrations in yeast, demonstrating the increased statistical power that may be achieved by rigorous data integration. Our results highlight the importance of establishing an adequate and validated statistical framework for the analysis of high-throughput data.
Individual and contextual determinants of adequate maternal health care services in Kenya.
Achia, Thomas N O; Mageto, Lillian E
2015-01-01
This study aimed to examine individual and community level factors associated with adequate use of maternal antenatal health services in Kenya. Individual and community level factors associated with adequate use of maternal health care (MHC) services were obtained from the 2008-09 Kenya Demographic and Health Survey data set. Multilevel partial-proportional odds logit models were fitted using STATA 13.0 to quantify the relations of the selected covariates to adequate MHC use, defined as a three-category ordinal variable. The sample consisted of 3,621 women who had at least one live birth in the five-year period preceding this survey. Only 18 percent of the women had adequate use of MHC services. Greater educational attainment by the woman or her partner, higher socioeconomic status, access to medical insurance coverage, and greater media exposure were the individual-level factors associated with adequate use of MHC services. Greater community ethnic diversity, higher community-level socioeconomic status, and greater community-level health facility deliveries were the contextual-level factors associated with adequate use of MHC. To improve the use of MHC services in Kenya, the government needs to design and implement programs that target underlying individual and community level factors, providing focused and sustained health education to promote the use of antenatal, delivery, and postnatal care.
Statistics in fusion experiments
NASA Astrophysics Data System (ADS)
McNeill, D. H.
1997-11-01
Since the reasons for the variability in data from plasma experiments are often unknown or uncontrollable, statistical methods must be applied. Reliable interpretation and public accountability require full data sets. Two examples of data misrepresentation at PPPL are analyzed: (1) Te > 100 eV on the S-1 spheromak.(M. Yamada, Nucl. Fusion 25, 1327 (1985); reports to DoE; etc.) The reported high values (statistical artifacts of Thomson scattering measurements) were selected from a mass of data with an average of 40 eV or less. ``Correlated'' spectroscopic data were meaningless. (2) Extrapolation to Q >= 0.5 for DT in TFTR.(D. Meade et al., IAEA Baltimore (1990), V. 1, p. 9; H. P. Furth, Statements to U. S. Congress (1989).) The DD yield used there was the highest through 1990 (>= 50% above average) and the DT to DD power ratio used was about twice any published value. Average DD yields and published yield ratios scale to Q < 0.15 for DT, in accord with the observed performance over the last 3 1/2 years. Press reports of outlier data from TFTR have obscured the fact that the DT behavior follows from trivial scaling of the DD data. Good practice in future fusion research would have confidence intervals and other descriptive statistics accompanying reported numerical values (cf. JAMA).
Sample size estimation and power analysis for clinical research studies
Suresh, KP; Chandrashekara, S
2012-01-01
Determining the optimal sample size for a study assures adequate power to detect statistical significance; hence, it is a critical step in the design of a planned research protocol. Using too many participants in a study is expensive and exposes more subjects than necessary to the procedure. Similarly, if the study is underpowered, it will be statistically inconclusive and may make the whole protocol a failure. This paper covers the essentials in calculating power and sample size for a variety of applied study designs. Sample size computation for a single group mean, survey-type studies, two-group studies based on means and proportions or rates, correlation studies, and case-control studies assessing a categorical outcome are presented in detail. PMID:22870008
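The classic two-group calculation covered by papers like this one can be sketched as follows; the z quantiles are the standard values for two-sided alpha = 0.05 and 80% power, hard-coded to keep the example self-contained:

```python
import math

Z_ALPHA = 1.96    # z quantile for two-sided alpha = 0.05
Z_BETA = 0.8416   # z quantile for power = 0.80

def n_per_group(sigma, delta, z_alpha=Z_ALPHA, z_beta=Z_BETA):
    """Participants needed per arm to detect a mean difference delta
    between two groups with common standard deviation sigma:
    n = 2 * (z_alpha + z_beta)^2 * sigma^2 / delta^2, rounded up."""
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2)

# Example: detect a 5-unit mean difference when sd = 10.
print(n_per_group(sigma=10, delta=5))  # -> 63 per group
```

The familiar rule of thumb falls out directly: for a standardized effect of 1 SD (sigma = delta), the formula gives 16 per group.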
Power Plant Water Intake Assessment.
ERIC Educational Resources Information Center
Zeitoun, Ibrahim H.; And Others
1980-01-01
In order to adequately assess the impact of power plant cooling water intake on an aquatic ecosystem, total ecosystem effects must be considered, rather than merely numbers of impinged or entrained organisms. (Author/RE)
Temptations of Power and Certainty.
ERIC Educational Resources Information Center
Amundson, Jon; And Others
1993-01-01
Contends that, when therapists do not adequately account for the position of clients, they fall prey to temptation of certainty and that, when therapists attempt to impose corrections from such certainty, they fall prey to temptation of power. Offers suggestions for sidestepping power/certainty by contrasting therapies of power and certainty with…
Wang, Zhifang; Zhu, Wenming; Mo, Zhe; Wang, Yuanyang; Mao, Guangming; Wang, Xiaofeng; Lou, Xiaoming
2017-01-01
Universal salt iodization (USI) has been implemented for two decades in China. It is crucial to periodically monitor iodine status in the most vulnerable populations, such as pregnant women. A cross-sectional study was carried out in a province previously shown to be iodine-sufficient to evaluate iodine intake in pregnancy. According to the WHO/UNICEF/ICCIDD recommended criteria for adequate iodine intake in pregnancy (150–249 µg/L), the median urinary iodine concentration (UIC) of the 8159 recruited pregnant women was 147.5 µg/L, which indicated that pregnant women had iodine deficiency at the province level. Overall, 51.0% of the study participants had iodine deficiency with a UIC < 150 µg/L and only 32.9% had adequate iodine. Participants living in coastal areas had iodine deficiency with a median UIC of 130.1 µg/L, while those in inland areas had marginally adequate iodine intake with a median UIC of 158.1 µg/L (p < 0.001). Among the study participants, 450 pregnant women consuming non-iodized salt had mild-to-moderate iodine deficiency with a median UIC of 99.6 µg/L; 7363 pregnant women consuming adequately iodized salt had a slightly, but statistically significantly, higher median UIC of 151.9 µg/L, compared with the adequate level recommended by the WHO/UNICEF/ICCIDD (p < 0.001). Consuming adequately iodized salt seemed to slightly increase the median UIC level, but it may not be enough to correct iodine nutrition status to the optimum level recommended by the WHO/UNICEF/ICCIDD. We therefore suggest that, besides strengthening the USI policy, additional interventive measures may be needed to improve iodine intake in pregnancy. PMID:28230748
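The WHO/UNICEF/ICCIDD banding the study applies (150–249 µg/L = adequate in pregnancy) can be expressed as a small classifier; the labels for the bands outside that range follow the same WHO scheme and are stated here as assumptions rather than quoted from the paper:

```python
def classify_uic_pregnancy(median_uic):
    """Classify a population's median urinary iodine concentration (µg/L)
    in pregnancy. The 150-249 µg/L 'adequate' band is cited in the study;
    the other band labels are the standard WHO categories (assumption)."""
    if median_uic < 150:
        return "insufficient"
    if median_uic < 250:
        return "adequate"
    if median_uic < 500:
        return "above requirements"
    return "excessive"

# Medians reported in the study:
print(classify_uic_pregnancy(147.5))  # province overall -> insufficient
print(classify_uic_pregnancy(130.1))  # coastal areas    -> insufficient
print(classify_uic_pregnancy(158.1))  # inland areas     -> adequate
```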
Harris, Alex; Reeder, Rachelle; Hyun, Jenny
2011-01-01
The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.
Khalfaoui, M; Knani, S; Hachicha, M A; Lamine, A Ben
2003-07-15
New theoretical expressions to model the five adsorption isotherm types have been established. Using the grand canonical ensemble in statistical physics, we give an analytical expression for each of the five physical adsorption isotherm types classified by Brunauer, Emmett, and Teller, often called BET isotherms. The establishment of these expressions is based on statistical physics and theoretical considerations. This method allowed estimation of all the mathematical parameters in the models. The physicochemical parameters intervening in the adsorption process that the models present could be deduced directly from the experimental adsorption isotherms by numerical simulation. We determine the adequate model for each type of isotherm, which fixes by direct numerical simulation the monolayer, multilayer, or condensation character. New equations are discussed and the results obtained are verified against experimental data from the literature. The new theoretical expressions that we have proposed, based on statistical physics treatment, are rather powerful for better understanding and interpreting the five physical adsorption isotherm types at a microscopic level.
Bello-Silva, Marina Stella; Wehner, Martin; Eduardo, Carlos de Paula; Lampert, Friedrich; Poprawe, Reinhart; Hermans, Martin; Esteves-Oliveira, Marcella
2013-01-01
This study aimed to evaluate the possibility of introducing ultra-short pulsed lasers (USPL) in restorative dentistry by maintaining the well-known benefits of lasers for caries removal, but also overcoming disadvantages, such as thermal damage of irradiated substrate. USPL ablation of dental hard tissues was investigated in two phases. Phase 1--different wavelengths (355, 532, 1,045, and 1,064 nm), pulse durations (picoseconds and femtoseconds) and irradiation parameters (scanning speed, output power, and pulse repetition rate) were assessed for enamel and dentin. Ablation rate was determined, and the temperature increase measured in real time. Phase 2--the most favorable laser parameters were evaluated to correlate temperature increase to ablation rate and ablation efficiency. The influence of cooling methods (air, air-water spray) on ablation process was further analyzed. All parameters tested provided precise and selective tissue ablation. For all lasers, faster scanning speeds resulted in better interaction and reduced temperature increase. The most adequate results were observed for the 1064-nm ps-laser and the 1045-nm fs-laser. Forced cooling caused moderate changes in temperature increase, but reduced ablation, being considered unnecessary during irradiation with USPL. For dentin, the correlation between temperature increase and ablation efficiency was satisfactory for both pulse durations, while for enamel, the best correlation was observed for fs-laser, independently of the power used. USPL may be suitable for cavity preparation in dentin and enamel, since effective ablation and low temperature increase were observed. If adequate laser parameters are selected, this technique seems to be promising for promoting the laser-assisted, minimally invasive approach.
Brady, T.J.; Thrall, J.H.; Lo, K.; Pitt, B.
1980-12-01
Rest and exercise radionuclide ventriculograms were obtained on 77 symptomatic patients without prior documented coronary artery disease (CAD). Coronary artery disease was present by angiograms in 48. Radionuclide ventriculography (RNV) was abnormal in 41 patients (overall sensitivity 85%). In 29 patients with normal coronary arteries, RNV was normal in 24 (specificity 83%). To determine if the exercise level affects sensitivity, the studies were graded for adequacy of exercise. It was considered adequate if patients developed (a) chest pain, or (b) ST segment depression of at least 1 mm, or (c) if they achieved a pressure rate product greater than 250. Among the 48 patients with coronary artery disease, 35 achieved adequate exercise. Thirty-three had an abnormal RNV (sensitivity 94%). In 13 patients who failed to achieve adequate exercise, RNV was abnormal in eight (sensitivity of only 62%). Some patients with coronary artery disease may have a normal ventricular response at inadequate levels of stress.
Borruat, F X; Buechi, E R; Piguet, B; Fitting, P; Zografos, L; Herbort, C P
1991-05-01
We compared the frequency of severe ocular complications secondary to Herpes Zoster Ophthalmicus (HZO) in 232 patients. They were divided into three groups: 1) patients without treatment (n = 164); 2) patients treated adequately (n = 48) with acyclovir (ACV; 5 x 800 mg/d orally and ophthalmic ointment 5 x /d for a minimum of 7 days, given within three days after skin eruption); and 3) patients treated inadequately (n = 20) with ACV (only topical treatment, insufficient doses, interrupted treatment, delayed treatment). Patients with no treatment or with inadequate treatment showed the same frequency of severe ocular complications (21% (34/164) and 25% (5/20), respectively). In contrast, when adequate treatment with ACV was given, complications occurred in only 4% (2/48) of cases. This study emphasizes the need for prompt (within three days after skin eruption) and adequate (5 x 800 mg/d for at least 7 days) treatment with ACV to prevent the severe complications of HZO.
Broadband inversion of 1J(CC) responses in 1,n-ADEQUATE spectra.
Reibarkh, Mikhail; Williamson, R Thomas; Martin, Gary E; Bermel, Wolfgang
2013-11-01
Establishing the carbon skeleton of a molecule greatly facilitates the process of structure elucidation, both manual and computer-assisted. Recent advances in the family of ADEQUATE experiments demonstrated their potential in this regard. 1,1-ADEQUATE, which provides direct (13)C-(13)C correlation via (1)J(CC), and 1,n-ADEQUATE, which typically yields (3)J(CC) and (1)J(CC) correlations, are more sensitive and more widely applicable experiments than INADEQUATE and PANACEA. A recently reported modified pulse sequence that semi-selectively inverts (1)J(CC) correlations in 1,n-ADEQUATE spectra provided a significant improvement, allowing (1)J(CC) and (n)J(CC) correlations to be discerned in the same spectrum. However, the reported experiment requires a careful matching of the amplitude transfer function with (1)J(CC) coupling constants in order to achieve the inversion, and even then some (1)J(CC) correlations could still have positive intensity due to the oscillatory nature of the transfer function. Both shortcomings limit the practicality of the method. We now report a new, dual-optimized inverted (1)J(CC) 1,n-ADEQUATE experiment, which provides more uniform inversion of (1)J(CC) correlations across the range of 29-82 Hz. Unlike the original method, the dual optimization experiment does not require fine-tuning for the molecule's (1)J(CC) coupling constant values. Even more usefully, the dual-optimized version provides up to two-fold improvement in signal-to-noise for some long-range correlations. Using modern, cryogenically-cooled probes, the experiment can be successfully applied to samples of ~1 mg under favorable circumstances. The improvements afforded by dual optimization inverted (1)J(CC) 1,n-ADEQUATE experiment make it a useful and practical tool for NMR structure elucidation and should facilitate the implementation and utilization of the experiment.
Goodman, Melody S; Gaskin, Darrell J; Si, Xuemei; Stafford, Jewel D; Lachance, Christina; Kaphingst, Kimberly A
2012-09-01
Residential segregation has been shown to be associated with health outcomes and health care utilization. We examined the association between racial composition of five physical environments throughout the life course and adequate health literacy among 836 community health center patients in Suffolk County, NY. Respondents who attended a mostly White junior high school or currently lived in a mostly White neighborhood were more likely to have adequate health literacy compared to those educated or living in predominantly minority or diverse environments. This association was independent of the respondent's race, ethnicity, age, education, and country of birth.
Gaskin, Darrell J.; Si, Xuemei; Stafford, Jewel D.; Lachance, Christina; Kaphingst, Kimberly A.
2012-01-01
Residential segregation has been shown to be associated with health outcomes and health care utilization. We examined the association between racial composition of five physical environments throughout the life course and adequate health literacy among 836 community health center patients in Suffolk County, NY. Respondents who attended a mostly White junior high school or currently lived in a mostly White neighborhood were more likely to have adequate health literacy compared to those educated or living in predominantly minority or diverse environments. This association was independent of the respondent’s race, ethnicity, age, education, and country of birth. PMID:22658579
NASA Astrophysics Data System (ADS)
Nowak, Bernard; Łuczak, Rafał
2015-09-01
The article discusses the improvement of thermal working conditions in underground mine workings using local refrigeration systems. It considers the efficiency of air cooling with the TS-300B direct-action air compression refrigerator. Because the required operating conditions of this air cooling system are frequently not met, discrepancies arise between the predicted (and thus expected) effects of its operation and reality. Therefore, to improve the operating efficiency of this system in terms of effective use of the evaporator cooling capacity, quality criteria that are easy to apply in practice were developed. They were obtained in the form of statistical models describing the effect of the independent variables, i.e. the parameters of the air at the evaporator inlet (temperature, humidity, and volumetric flow rate) and the parameters of the water cooling the condenser (temperature and volumetric flow rate), on the thermal power of the air cooler, treated as the dependent variable. Statistical equations describing the performance of the analyzed air cooling system were determined based on linear and nonlinear multiple regression. The obtained functions were modified by changing the values of the coefficients in the case of linear regression, and of the coefficients and exponents in the case of nonlinear regression, with the independent variables. As a result, functions more convenient in practical applications were obtained. Using classical statistical methods, the quality of fit of the regression functions to the experimental data was evaluated. The values of the evaporator thermal power of the refrigerator obtained from the measured air parameters were also compared with those calculated using the obtained regression functions. These statistical models were built on the basis of the results of measurements in different operating conditions of the TS-300B
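The modeling step described above can be sketched numerically. The code below is a minimal illustration, not the paper's actual model: it fits a linear multiple regression for a cooler's thermal power as a function of inlet-air parameters via ordinary least squares, using entirely synthetic data and hypothetical coefficient values.

```python
import numpy as np

# Hypothetical illustration of the approach: fit Q = b0 + b1*t_in + b2*x_in + b3*v_air
# for evaporator thermal power Q given inlet-air temperature, humidity, and
# volumetric flow rate. All data and coefficients below are synthetic.
rng = np.random.default_rng(0)
n = 200
t_in = rng.uniform(25.0, 35.0, n)    # inlet air temperature, degC (made up)
x_in = rng.uniform(10.0, 20.0, n)    # specific humidity, g/kg (made up)
v_air = rng.uniform(4.0, 8.0, n)     # volumetric flow rate, m3/s (made up)
q_true = 5.0 + 3.2 * t_in + 1.5 * x_in + 9.0 * v_air
q_meas = q_true + rng.normal(0.0, 2.0, n)   # measurement noise

# Design matrix with an intercept column; ordinary least squares fit.
X = np.column_stack([np.ones(n), t_in, x_in, v_air])
coef, *_ = np.linalg.lstsq(X, q_meas, rcond=None)

# Quality of fit: coefficient of determination R^2.
resid = q_meas - X @ coef
r2 = 1.0 - resid.var() / q_meas.var()
print(coef.round(2), round(r2, 3))
```

With clean synthetic data the recovered coefficients track the generating values closely; on real cooler measurements the fit quality would be judged with the same R^2-style criteria the abstract mentions.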
1993-05-14
p. 15. 15 Ibid p. 15. 16 John Morgan Dederer, Making Bricks Without Straw (Manhattan, Kansas: Sunflower University Press, 1983), p. 2 17 W. Robert... Dederer, John M. Making Bricks Without Straw. Manhattan, Kansas: Sunflower University Press, 1983. Evans, Geoffrey. Slim as Military Commander
Statistics Poker: Reinforcing Basic Statistical Concepts
ERIC Educational Resources Information Center
Leech, Nancy L.
2008-01-01
Learning basic statistical concepts does not need to be tedious or dry; it can be fun and interesting through cooperative learning in the small-group activity of Statistics Poker. This article describes a teaching approach for reinforcing basic statistical concepts that can help students who have high anxiety and makes learning and reinforcing…
Rossell, David
2016-01-01
Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies. PMID:27722040
A derivation of the statistical characteristics of SAR imagery data. [Rayleigh speckle statistics
NASA Technical Reports Server (NTRS)
Wu, C.
1981-01-01
Basic statistical properties of the speckle effect and the associated spatial correlation of SAR image data are discussed. Statistics of SAR sensed measurement and their relationships to the surface mean power reflectivity are derived. The Rayleigh speckle model is reviewed. Applications of the derived statistics to SAR radiometric measures and image processing are considered.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-15
... ENERGY Safety Analysis Requirements for Defining Adequate Protection for the Public and the Workers... designed to hold firmly in place. 10 CFR Part 830 imposes a requirement that a documented safety analysis... provide guidance on meeting the requirements imposed by DOE Order 5480.23, Nuclear Safety Analysis...
42 CFR 413.24 - Adequate cost data and cost finding.
Code of Federal Regulations, 2010 CFR
2010-10-01
.... Adequate data capable of being audited is consistent with good business concepts and effective and efficient management of any organization, whether it is operated for profit or on a nonprofit basis. It is a... contract for services (for example, a management contract), directly assigning the costs to the...
Prenatal zinc supplementation of zinc-adequate rats adversely affects immunity in offspring
Technology Transfer Automated Retrieval System (TEKTRAN)
We previously showed that zinc (Zn) supplementation of Zn-adequate dams induced immunosuppressive effects that persist in the offspring after weaning. We investigated whether the immunosuppressive effects were due to in utero exposure and/or mediated via milk using a cross-fostering design. Pregnant...
Towards Defining Adequate Lithium Trials for Individuals with Mental Retardation and Mental Illness.
ERIC Educational Resources Information Center
Pary, Robert J.
1991-01-01
Use of lithium with mentally retarded individuals with psychiatric conditions and/or behavior disturbances is discussed. The paper describes components of an adequate clinical trial and reviews case studies and double-blind cases. The paper concludes that aggression is the best indicator for lithium use, and reviews treatment parameters and…
ADEQUATE SHELTERS AND QUICK REACTIONS TO WARNING: A KEY TO CIVIL DEFENSE.
LYNCH, F X
1963-11-08
Case histories collected by investigators in Japan during 1945 illustrate both the effectiveness of shelters and the dangers inherent in apathy of the population, which suffered needless casualties by ignoring air raid warnings. Adequate shelters and immediate response to warnings are essential to survival in a nuclear attack.
ERIC Educational Resources Information Center
Moser, Sharon
2010-01-01
The 2007-2008 school year marked the first year Florida's Title I schools that did not make Adequate Yearly Progress (AYP) for five consecutive years entered into restructuring as mandated by the "No Child Left Behind Act" of 2001. My study examines the perceptions of teachers entering their first year of school restructuring due to…
Code of Federal Regulations, 2010 CFR
2010-10-01
... record systems. These security safeguards shall apply to all systems in which identifiable personal data... data and automated systems shall be adequately trained in the security and privacy of personal data. (4... technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of manual...
Effect of tranquilizers on animal resistance to the adequate stimuli of the vestibular apparatus
NASA Technical Reports Server (NTRS)
Maksimovich, Y. B.; Khinchikashvili, N. V.
1980-01-01
The effect of tranquilizers on vestibulospinal reflexes and motor activity was studied in 900 centrifuged albino mice. Actometric studies showed that the tranquilizers, as a group, are capable of increasing animal resistance to adequate stimuli of the vestibular apparatus.
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2012 CFR
2012-01-01
... 9 Animals and Animal Products 1 2012-01-01 2012-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2014 CFR
2014-01-01
... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2011 CFR
2011-01-01
... 9 Animals and Animal Products 1 2011-01-01 2011-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...
9 CFR 2.40 - Attending veterinarian and adequate veterinary care (dealers and exhibitors).
Code of Federal Regulations, 2013 CFR
2013-01-01
... 9 Animals and Animal Products 1 2013-01-01 2013-01-01 false Attending veterinarian and adequate veterinary care (dealers and exhibitors). 2.40 Section 2.40 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Attending...
ERIC Educational Resources Information Center
Ma, Xin; Shen, Jianping; Krenn, Huilan Y.
2014-01-01
Using national data from the 2007-08 School and Staffing Survey, we compared the relationships between parental involvement and school outcomes related to adequate yearly progress (AYP) in urban, suburban, and rural schools. Parent-initiated parental involvement demonstrated significantly positive relationships with both making AYP and staying off…
Godrich, Stephanie L; Lo, Johnny; Davies, Christina R; Darby, Jill; Devine, Amanda
2017-01-03
Improving the suboptimal vegetable consumption among the majority of Australian children is imperative in reducing chronic disease risk. The objective of this research was to determine whether there was a relationship between food security determinants (FSD) (i.e., food availability, access, and utilisation dimensions) and adequate vegetable consumption among children living in regional and remote Western Australia (WA). Caregiver-child dyads (n = 256) living in non-metropolitan/rural WA completed cross-sectional surveys that included questions on FSD, demographics and usual vegetable intake. A total of 187 dyads were included in analyses, which included descriptive and logistic regression analyses via IBM SPSS (version 23). A total of 13.4% of children in this sample had adequate vegetable intake. FSD that met inclusion criteria (p ≤ 0.20) for multivariable regression analyses included price; promotion; quality; location of food outlets; variety of vegetable types; financial resources; and transport to outlets. After adjustment for potential demographic confounders, the FSD that predicted adequate vegetable consumption were variety of vegetable types consumed (p = 0.007); promotion (p = 0.017); location of food outlets (p = 0.027); and price (p = 0.043). Food retail outlets should ensure that adequate varieties of vegetable types (i.e., fresh, frozen, tinned) are available; vegetable messages should be promoted through food retail outlets and in community settings; and towns should include a range of vegetable purchasing options, increase their reliance on a local food supply, and increase transport options to enable affordable vegetable purchasing.
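The study's multivariable logistic regression can be sketched as follows. This is a hedged illustration only: the authors used IBM SPSS, and the data, predictor names, and effect sizes below are invented; the sketch fits a logistic model by plain gradient ascent and reports odds ratios, mirroring the kind of output such an analysis yields.

```python
import numpy as np

# Synthetic example: model the odds of adequate vegetable intake from two
# binary food-security determinants ("variety" and "price"; names hypothetical).
rng = np.random.default_rng(1)
n = 500
variety = rng.integers(0, 2, n)   # 1 = adequate variety of vegetable types
price = rng.integers(0, 2, n)     # 1 = vegetables affordable
logit = -2.0 + 1.2 * variety + 0.8 * price          # made-up true effects
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))    # simulated outcomes

# Maximum-likelihood fit by gradient ascent on the log-likelihood.
X = np.column_stack([np.ones(n), variety, price])
beta = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.5 * X.T @ (y - p) / n

odds_ratios = np.exp(beta[1:])    # effect of each determinant on the odds
print(odds_ratios.round(2))
```

In a real analysis one would also adjust for demographic confounders (age, education, etc.) by adding them as columns of the design matrix, as the abstract describes.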
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2011 CFR
2011-01-01
... veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian and adequate veterinary care. (a) Each research facility shall have an attending veterinarian who shall...
9 CFR 2.33 - Attending veterinarian and adequate veterinary care.
Code of Federal Regulations, 2014 CFR
2014-01-01
... veterinary care. 2.33 Section 2.33 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ANIMAL WELFARE REGULATIONS Research Facilities § 2.33 Attending veterinarian and adequate veterinary care. (a) Each research facility shall have an attending veterinarian who shall...
Identifying the Factors Impacting the Adequately Yearly Progress Performance in the United States
ERIC Educational Resources Information Center
Hsieh, Ju-Shan
2013-01-01
The NCLB (No Child Left Behind Act) specifies that states must develop AYP (adequate yearly progress) statewide measurable objectives for improved achievement by all students, including economically disadvantaged students, students from minority races, students with disabilities, and students with limited English proficiency. By the 2013-2014…
42 CFR 438.207 - Assurances of adequate capacity and services.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 4 2010-10-01 2010-10-01 false Assurances of adequate capacity and services. 438.207 Section 438.207 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Quality Assessment and...
36 CFR 13.960 - Who determines when there is adequate snow cover?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 36 Parks, Forests, and Public Property 1 2011-07-01 2011-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park...
36 CFR 13.960 - Who determines when there is adequate snow cover?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
... HUMAN SERVICES Food and Drug Administration Hemoglobin Standards and Maintaining Adequate Iron Stores in... workshop. The Food and Drug Administration (FDA) is announcing a public workshop entitled: ``Hemoglobin... discuss blood donor hemoglobin and hematocrit qualification standards in the United States, its impact...
21 CFR 514.117 - Adequate and well-controlled studies.
Code of Federal Regulations, 2010 CFR
2010-04-01
... adequate and well-controlled studies of a new animal drug is to distinguish the effect of the new animal... with one or more controls to provide a quantitative evaluation of drug effects. The protocol and the... for special circumstances. Examples include studies in which the effect of the new animal drug is...
21 CFR 514.117 - Adequate and well-controlled studies.
Code of Federal Regulations, 2011 CFR
2011-04-01
... adequate and well-controlled studies of a new animal drug is to distinguish the effect of the new animal... with one or more controls to provide a quantitative evaluation of drug effects. The protocol and the... for special circumstances. Examples include studies in which the effect of the new animal drug is...
21 CFR 514.117 - Adequate and well-controlled studies.
Code of Federal Regulations, 2014 CFR
2014-04-01
... adequate and well-controlled studies of a new animal drug is to distinguish the effect of the new animal... with one or more controls to provide a quantitative evaluation of drug effects. The protocol and the... for special circumstances. Examples include studies in which the effect of the new animal drug is...
21 CFR 514.117 - Adequate and well-controlled studies.
Code of Federal Regulations, 2012 CFR
2012-04-01
... adequate and well-controlled studies of a new animal drug is to distinguish the effect of the new animal... with one or more controls to provide a quantitative evaluation of drug effects. The protocol and the... for special circumstances. Examples include studies in which the effect of the new animal drug is...
21 CFR 514.117 - Adequate and well-controlled studies.
Code of Federal Regulations, 2013 CFR
2013-04-01
... adequate and well-controlled studies of a new animal drug is to distinguish the effect of the new animal... with one or more controls to provide a quantitative evaluation of drug effects. The protocol and the... for special circumstances. Examples include studies in which the effect of the new animal drug is...
30 CFR 227.801 - What if a State does not adequately perform a delegated function?
Code of Federal Regulations, 2010 CFR
2010-07-01
... delegated function? 227.801 Section 227.801 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR MINERALS REVENUE MANAGEMENT DELEGATION TO STATES Performance Review § 227.801 What if a State does not adequately perform a delegated function? If your performance of the delegated function does...
Science Education as a Contributor to Adequate Yearly Progress and Accountability Programs
ERIC Educational Resources Information Center
Judson, Eugene
2010-01-01
The No Child Left Behind (NCLB) Act requires states to measure the adequate yearly progress (AYP) of each public school and local educational agency (LEA) and to hold schools and LEAs accountable for failing to make AYP. Although it is required that science be assessed in at least three grades, the achievement results from science examinations are…
Understanding the pelvic pain mechanism is key to find an adequate therapeutic approach.
Van Kerrebroeck, Philip
2016-06-25
Pain is a natural response to actual or potential tissue damage and involves both a sensory and an emotional experience. In chronic pelvic pain, localisation of pain can be widespread and can cause considerable distress. A multidisciplinary approach is needed in order to fully understand the pelvic pain mechanism and to identify an adequate therapeutic approach.
36 CFR 13.960 - Who determines when there is adequate snow cover?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 36 Parks, Forests, and Public Property 1 2012-07-01 2012-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park...
36 CFR 13.960 - Who determines when there is adequate snow cover?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 36 Parks, Forests, and Public Property 1 2014-07-01 2014-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park...
36 CFR 13.960 - Who determines when there is adequate snow cover?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 36 Parks, Forests, and Public Property 1 2013-07-01 2013-07-01 false Who determines when there is adequate snow cover? 13.960 Section 13.960 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR NATIONAL PARK SYSTEM UNITS IN ALASKA Special Regulations-Denali National Park...
Human milk feeding supports adequate growth in infants
Technology Transfer Automated Retrieval System (TEKTRAN)
Despite current nutritional strategies, premature infants remain at high risk for extrauterine growth restriction. The use of an exclusive human milk-based diet is associated with decreased incidence of necrotizing enterocolitis (NEC), but concerns exist about infants achieving adequate growth. The ...
ERIC Educational Resources Information Center
Vannest, Kimberly J.; Temple-Harvey, Kimberly K.; Mason, Benjamin A.
2009-01-01
Because schools are held accountable for the academic performance of all students, it is important to focus on academics and the need for effective teaching practices. Adequate yearly progress, a method of accountability that is part of the No Child Left Behind Act (2001), profoundly affects the education of students who have emotional and…
Influenza 2005-2006: vaccine supplies adequate, but bird flu looms.
Mossad, Sherif B
2005-11-01
Influenza vaccine supplies appear to be adequate for the 2005-2006 season, though delivery has been somewhat delayed. However, in the event of a pandemic of avian flu (considered inevitable by most experts, although no one knows when it will happen), the United States would be woefully unprepared.
ERIC Educational Resources Information Center
Daugherty, Lindsay; Dossani, Rafiq; Johnson, Erin-Elizabeth; Wright, Cameron
2014-01-01
To realize the potential benefits of technology use in early childhood education (ECE), and to ensure that technology can help to address the digital divide, providers, families of young children, and young children themselves must have access to an adequate technology infrastructure. The goals for technology use in ECE that a technology…
Statistical mechanics of complex neural systems and high dimensional data
NASA Astrophysics Data System (ADS)
Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya
2013-03-01
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
Progressive statistics for studies in sports medicine and exercise science.
Hopkins, William G; Marshall, Stephen W; Batterham, Alan M; Hanin, Juri
2009-01-01
Statistical guidelines and expert statements are now available to assist in the analysis and reporting of studies in some biomedical disciplines. We present here a more progressive resource for sample-based studies, meta-analyses, and case studies in sports medicine and exercise science. We offer forthright advice on the following controversial or novel issues: using precision of estimation for inferences about population effects in preference to null-hypothesis testing, which is inadequate for assessing clinical or practical importance; justifying sample size via acceptable precision or confidence for clinical decisions rather than via adequate power for statistical significance; showing SD rather than SEM, to better communicate the magnitude of differences in means and nonuniformity of error; avoiding purely nonparametric analyses, which cannot provide inferences about magnitude and are unnecessary; using regression statistics in validity studies, in preference to the impractical and biased limits of agreement; making greater use of qualitative methods to enrich sample-based quantitative projects; and seeking ethics approval for public access to the depersonalized raw data of a study, to address the need for more scrutiny of research and better meta-analyses. Advice on less contentious issues includes the following: using covariates in linear models to adjust for confounders, to account for individual differences, and to identify potential mechanisms of an effect; using log transformation to deal with nonuniformity of effects and error; identifying and deleting outliers; presenting descriptive, effect, and inferential statistics in appropriate formats; and contending with bias arising from problems with sampling, assignment, blinding, measurement error, and researchers' prejudices. This article should advance the field by stimulating debate, promoting innovative approaches, and serving as a useful checklist for authors, reviewers, and editors.
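One of the guidelines above, reporting SD rather than SEM, is easy to demonstrate numerically. The snippet below (with made-up data) shows why: the SEM shrinks with sample size while the SD does not, so only the SD conveys the spread of individual values.

```python
import math

# Made-up measurements; compute mean, sample SD, and SEM.
values = [4.8, 5.1, 5.5, 4.9, 5.3, 5.0, 5.2, 5.4]
n = len(values)
mean = sum(values) / n
sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))  # spread of individuals
sem = sd / math.sqrt(n)                                         # precision of the mean
print(round(mean, 2), round(sd, 3), round(sem, 3))
```

Quoting the SEM alone here would make the data look several times less variable than the individual measurements actually are.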
Statistics used in current nursing research.
Zellner, Kathleen; Boerst, Connie J; Tabb, Wil
2007-02-01
Undergraduate nursing research courses should emphasize the statistics most commonly used in the nursing literature to strengthen students' and beginning researchers' understanding of them. To determine the most commonly used statistics, we reviewed all quantitative research articles published in 13 nursing journals in 2000. The findings supported Beitz's categorization of kinds of statistics. Ten primary statistics used in 80% of nursing research published in 2000 were identified. We recommend that the appropriate use of those top 10 statistics be emphasized in undergraduate nursing education and that the nursing profession continue to advocate for the use of methods (e.g., power analysis, odds ratio) that may contribute to the advancement of nursing research.
Neuroendocrine Tumor: Statistics
Adrenal Gland Tumors: Statistics
On More Sensitive Periodogram Statistics
NASA Astrophysics Data System (ADS)
Bélanger, G.
2016-05-01
Period searches in event data have traditionally used the Rayleigh statistic, R^2. For X-ray pulsars, the standard has been the Z^2 statistic, which sums over more than one harmonic. For γ-rays, the H-test, which optimizes the number of harmonics to sum, is often used. These periodograms all suffer from the same problem, namely artifacts caused by correlations in the Fourier components that arise from testing frequencies with a non-integer number of cycles. This article addresses this problem. The modified Rayleigh statistic is discussed, its generalization to any harmonic, R_k^2, is formulated, and from the latter the modified Z^2 statistic is constructed. Versions of these statistics for binned data and point measurements are derived, and it is shown that the variance in the uncertainties can have an important influence on the periodogram. It is shown how to combine the information about the signal frequency from the different harmonics to estimate its value with maximum accuracy. The methods are applied to an XMM-Newton observation of the Crab pulsar, for which a decomposition of the pulse profile is presented, showing that most of the power is in the second, third, and fifth harmonics. Statistical detection power of the R_k^2 statistic is superior to the FFT and equivalent to the Lomb-Scargle (LS). Response to gaps in the data is assessed, and it is shown that the LS does not protect against the distortions they cause. The main conclusion of this work is that the classical R^2 and Z^2 should be replaced by the modified R_k^2 and Z^2 in all applications with event data, and the LS should be replaced by the R_k^2 when the uncertainty varies from one point measurement to another.
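The classical statistics the abstract starts from can be written down compactly. The sketch below implements the standard Rayleigh power and the Z^2_m statistic (not the paper's modified versions, which correct for Fourier-component correlations) for a list of event arrival times folded at a trial frequency; the event train is synthetic.

```python
import math

def rayleigh_k(times, freq, k=1):
    """Rayleigh power of harmonic k for a list of event arrival times."""
    n = len(times)
    c = sum(math.cos(2 * math.pi * k * freq * t) for t in times)
    s = sum(math.sin(2 * math.pi * k * freq * t) for t in times)
    return 2.0 * (c * c + s * s) / n

def z2(times, freq, m=2):
    """Classical Z^2_m statistic: sum of Rayleigh powers over harmonics 1..m."""
    return sum(rayleigh_k(times, freq, k) for k in range(1, m + 1))

# A strictly periodic event train at 2 Hz gives large power at the true
# frequency and near-zero power at an unrelated trial frequency.
events = [i * 0.5 for i in range(100)]   # one event every 0.5 s
print(z2(events, 2.0), z2(events, 1.37))
```

Scanning `z2` over a grid of trial frequencies produces the periodogram; the artifacts discussed in the abstract appear precisely at trial frequencies where the observation span contains a non-integer number of cycles.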
(*STATISTICAL ANALYSIS, REPORTS), (*PROBABILITY, REPORTS), INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION
Reviewer Bias for Statistically Significant Results: A Reexamination.
ERIC Educational Resources Information Center
Fagley, N. S.; McKinney, I. Jean
1983-01-01
Reexamines the article by Atkinson, Furlong, and Wampold (1982) and questions their conclusion that reviewers were biased toward statistically significant results. A statistical power analysis shows the power of their bogus study was low. Low power in a study reporting nonsignificant findings is a valid reason for recommending not to publish.…
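A statistical power analysis of the kind invoked above can be reproduced with a short calculation. This is a hedged sketch using the normal approximation (the exact noncentral-t computation differs slightly): it estimates the power of a two-sided, two-sample test for effect size d (Cohen's d) with n subjects per group.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def inv_phi(p, lo=-10.0, hi=10.0):
    """Inverse normal CDF by bisection (adequate for illustration)."""
    for _ in range(100):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if phi(mid) < p else (lo, mid)
    return (lo + hi) / 2

def power_two_sample(d, n, alpha=0.05):
    """Approximate power of a two-sided two-sample test, n per group."""
    z_crit = inv_phi(1 - alpha / 2)      # two-sided critical value
    shift = d * math.sqrt(n / 2)         # noncentrality of the test statistic
    return 1 - phi(z_crit - shift) + phi(-z_crit - shift)

# A small study of a medium effect is badly underpowered,
# while n = 64 per group brings power up to roughly the conventional 0.80.
print(round(power_two_sample(0.5, 15), 2), round(power_two_sample(0.5, 64), 2))
```

A nonsignificant result from a low-powered study like the first case says little either way, which is exactly the point the reexamination makes about recommending against publication.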
The concept of adequate causation and Max Weber's comparative sociology of religion.
Buss, A
1999-06-01
Max Weber's The Protestant Ethic and the Spirit of Capitalism, studied in isolation, shows mainly an elective affinity or an adequacy on the level of meaning between the Protestant ethic and the 'spirit' of capitalism. Here it is suggested that Weber's subsequent essays on 'The Economic Ethics of World Religions' are the result of his opinion that adequacy on the level of meaning needs to be, and can be, verified by causal adequacy. After some introductory remarks, particularly on elective affinity, the paper tries to develop the concept of adequate causation and the related concept of objective possibility on the basis of the work of v. Kries, on whom Weber heavily relied. In the second part, this concept is used to show how the study of the economic ethics of India, China, Rome and orthodox Russia can support the thesis that the 'spirit' of capitalism, although it may not have been caused by the Protestant ethic, was perhaps adequately caused by it.
Chapman, S; Liberman, J
2005-01-01
The right to information is a fundamental consumer value. Following the advent of health warnings, the tobacco industry has repeatedly asserted that smokers are fully informed of the risks they take, while evidence demonstrates widespread superficial levels of awareness and understanding. There remains much that tobacco companies could do to fulfil their responsibilities to inform smokers. We explore issues involved in the meaning of "adequately informed" smoking and discuss some of the key policy and regulatory implications. We use the idea of a smoker licensing scheme—under which it would be illegal to sell to smokers who had not demonstrated an adequate level of awareness—as a device to explore some of these issues. We also explore some of the difficulties that addiction poses for the notion that smokers might ever voluntarily assume the risks of smoking. PMID:16046703
Beneath the Skin: Statistics, Trust, and Status
ERIC Educational Resources Information Center
Smith, Richard
2011-01-01
Overreliance on statistics, and even faith in them--which Richard Smith in this essay calls a branch of "metricophilia"--is a common feature of research in education and in the social sciences more generally. Of course accurate statistics are important, but they often constitute essentially a powerful form of rhetoric. For purposes of analysis and…
Myth 19: Is Advanced Placement an Adequate Program for Gifted Students?
ERIC Educational Resources Information Center
Gallagher, Shelagh A.
2009-01-01
Is it a myth that Advanced Placement (AP) is an adequate program for gifted students? AP is so covered with myths and assumptions that it is hard to get a clear view of the issues. In this article, the author finds the answer about AP by looking at current realities. First, AP is hard for gifted students to avoid. Second, AP never was a program…
Godrich, Stephanie L.; Lo, Johnny; Davies, Christina R.; Darby, Jill; Devine, Amanda
2017-01-01
Improving the suboptimal vegetable consumption among the majority of Australian children is imperative in reducing chronic disease risk. The objective of this research was to determine whether there was a relationship between food security determinants (FSD) (i.e., food availability, access, and utilisation dimensions) and adequate vegetable consumption among children living in regional and remote Western Australia (WA). Caregiver-child dyads (n = 256) living in non-metropolitan/rural WA completed cross-sectional surveys that included questions on FSD, demographics and usual vegetable intake. A total of 187 dyads were included in analyses, which included descriptive and logistic regression analyses via IBM SPSS (version 23). A total of 13.4% of children in this sample had adequate vegetable intake. FSD that met inclusion criteria (p ≤ 0.20) for multivariable regression analyses included price; promotion; quality; location of food outlets; variety of vegetable types; financial resources; and transport to outlets. After adjustment for potential demographic confounders, the FSD that predicted adequate vegetable consumption were, variety of vegetable types consumed (p = 0.007), promotion (p = 0.017), location of food outlets (p = 0.027), and price (p = 0.043). Food retail outlets should ensure that adequate varieties of vegetable types (i.e., fresh, frozen, tinned) are available, vegetable messages should be promoted through food retail outlets and in community settings, towns should include a range of vegetable purchasing options, increase their reliance on a local food supply and increase transport options to enable affordable vegetable purchasing. PMID:28054955
Wu, Felicia; Stacy, Shaina L; Kensler, Thomas W
2013-09-01
The aflatoxins are a group of fungal metabolites that contaminate a variety of staple crops, including maize and peanuts, and cause an array of acute and chronic human health effects. Aflatoxin B1 in particular is a potent liver carcinogen, and hepatocellular carcinoma (HCC) risk is multiplicatively higher for individuals exposed to both aflatoxin and chronic infection with hepatitis B virus (HBV). In this work, we sought to answer the question: do current aflatoxin regulatory standards around the world adequately protect human health? Depending upon the level of protection desired, the answer to this question varies. Currently, most nations have a maximum tolerable level of total aflatoxins in maize and peanuts ranging from 4 to 20 ng/g. If the level of protection desired is that aflatoxin exposures would not increase lifetime HCC risk by more than 1 in 100,000 cases in the population, then most current regulatory standards are not adequately protective even if enforced, especially in low-income countries where large amounts of maize and peanuts are consumed and HBV prevalence is high. At the protection level of 1 in 10,000 lifetime HCC cases in the population, however, almost all aflatoxin regulations worldwide are adequately protective, with the exception of several nations in Africa and Latin America.
Wu, Felicia
2013-01-01
The aflatoxins are a group of fungal metabolites that contaminate a variety of staple crops, including maize and peanuts, and cause an array of acute and chronic human health effects. Aflatoxin B1 in particular is a potent liver carcinogen, and hepatocellular carcinoma (HCC) risk is multiplicatively higher for individuals exposed to both aflatoxin and chronic infection with hepatitis B virus (HBV). In this work, we sought to answer the question: do current aflatoxin regulatory standards around the world adequately protect human health? Depending upon the level of protection desired, the answer to this question varies. Currently, most nations have a maximum tolerable level of total aflatoxins in maize and peanuts ranging from 4 to 20 ng/g. If the level of protection desired is that aflatoxin exposures would not increase lifetime HCC risk by more than 1 in 100,000 cases in the population, then most current regulatory standards are not adequately protective even if enforced, especially in low-income countries where large amounts of maize and peanuts are consumed and HBV prevalence is high. At the protection level of 1 in 10,000 lifetime HCC cases in the population, however, almost all aflatoxin regulations worldwide are adequately protective, with the exception of several nations in Africa and Latin America. PMID:23761295
Current strategies for the restoration of adequate lordosis during lumbar fusion.
Barrey, Cédric; Darnis, Alice
2015-01-18
Failure to restore adequate lumbar lordosis during lumbar fusion surgery may result in mechanical low back pain, sagittal imbalance and adjacent segment degeneration. The objective of this work is to describe current strategies and concepts for the restoration of adequate lordosis during fusion surgery. Theoretical lordosis can be evaluated from the measurement of the pelvic incidence and from the analysis of the spatial organization of the lumbar spine, with 2/3 of the lordosis given by the L4-S1 segment and 85% by the L3-S1 segment. Technical aspects involve patient positioning on the operating table, release maneuvers, the type of instrumentation used (rod, screw-rod connection, interbody cages), the surgical sequence and the overall surgical strategy. Spinal osteotomies may be required in the case of a fixed kyphotic spine. Combined anterior-posterior (AP) surgery is particularly efficient in restoring lordosis at the L5-S1 level and should be recommended. Finally, not one but several strategies may be used to restore adequate lordosis during fusion surgery.
Oil & gas in the 1990s and beyond: Adequate supplies, growing demand, flat prices
Kennedy, J.L.
1995-06-01
Long-term petroleum market fundamentals are clear: supplies are adequate and world demand will continue to grow steadily. Adequate supplies ensure that prices will not increase significantly, on average, through the end of the 1990s and probably much beyond. Despite plentiful supply and modest price increases, there will be peaks and valleys in the price graph as productive capacity is used up, then expanded. Tens of billions of dollars will be needed over the next decade to expand producing capacity. World oil consumption will increase at about 1.5% per year, at least for the next decade. Demand in Asia and Latin America will grow several times faster than this average world rate. World natural gas demand will grow at more than 2% per year well past 2000. Oil and gas companies around the world have changed the way they operate to survive the market realities of the 1990s. Restructuring, outsourcing, and partnering will continue as increasing costs and flat prices squeeze profits. Energy use patterns will change. Fuel and other product specifications will change. Market shares of oil and gas will shift. But opportunities abound in this new market environment. Growing markets always provide opportunities. Technology has helped operators dramatically lower finding, developing, and producing costs. The petroleum age is far from over. Growing markets, adequate supply, affordable products, and a 60% market share: those are the signs of an industry with a bright future.
Current strategies for the restoration of adequate lordosis during lumbar fusion
Barrey, Cédric; Darnis, Alice
2015-01-01
Failure to restore adequate lumbar lordosis during lumbar fusion surgery may result in mechanical low back pain, sagittal imbalance and adjacent segment degeneration. The objective of this work is to describe current strategies and concepts for the restoration of adequate lordosis during fusion surgery. Theoretical lordosis can be evaluated from the measurement of the pelvic incidence and from the analysis of the spatial organization of the lumbar spine, with 2/3 of the lordosis given by the L4-S1 segment and 85% by the L3-S1 segment. Technical aspects involve patient positioning on the operating table, release maneuvers, the type of instrumentation used (rod, screw-rod connection, interbody cages), the surgical sequence and the overall surgical strategy. Spinal osteotomies may be required in the case of a fixed kyphotic spine. Combined anterior-posterior (AP) surgery is particularly efficient in restoring lordosis at the L5-S1 level and should be recommended. Finally, not one but several strategies may be used to restore adequate lordosis during fusion surgery. PMID:25621216
A test for adequate wastewater treatment based on glutathione S transferase isoenzyme profile.
Grammou, A; Samaras, P; Papadimitriou, C; Papadopoulos, A I
2013-04-01
Discharge of treated or untreated municipal wastewater to the environment poses several threats to coastal and estuarine ecosystems that are difficult to assess. In our study we evaluate the use of the isoenzyme profile of glutathione S-transferase (GST), in combination with the kinetic characteristics of the whole enzyme and of heme peroxidase, as a test of adequate treatment of municipal wastewater. For this purpose, Artemia nauplii were incubated in artificial seawater prepared from wastewater samples, such as secondary municipal effluents produced by a conventional activated sludge unit and advanced treated effluents produced by coagulation, activated carbon adsorption and chlorination, applied as single or combined processes. Characteristic changes in the isoenzyme pattern and in the enzymes' kinetic properties were caused by chlorinated secondary municipal effluent or by secondary non-chlorinated effluent. Advanced treatment by a combination of coagulation and/or carbon adsorption resulted in less prominent changes, suggesting more adequate treatment. Our results suggest that the GST isoenzyme profile, in combination with the kinetic properties of the total enzyme family, is a sensitive test for evaluating the adequacy of treatment of reclaimed wastewater and the reduction of potentially harmful compounds. Potentially, it may offer a 'fingerprint' characteristic of a particular effluent, and probably of the treatment level to which it has been subjected.
Applications of statistical physics to technology price evolution
NASA Astrophysics Data System (ADS)
McNerney, James
Understanding how changing technology affects the prices of goods is a problem with both rich phenomenology and important policy consequences. Using methods from statistical physics, I model technology-driven price evolution. First, I examine a model for the price evolution of individual technologies. The price of a good often follows a power law equation when plotted against its cumulative production. This observation turns out to have significant consequences for technology policy aimed at mitigating climate change, where technologies are needed that achieve low carbon emissions at low cost. However, no theory adequately explains why technology prices follow power laws. To understand this behavior, I simplify an existing model that treats technologies as machines composed of interacting components. I find that the power law exponent of the price trajectory is inversely related to the number of interactions per component. I extend the model to allow for more realistic component interactions and make a testable prediction. Next, I conduct a case-study on the cost evolution of coal-fired electricity. I derive the cost in terms of various physical and economic components. The results suggest that commodities and technologies fall into distinct classes of price models, with commodities following martingales, and technologies following exponentials in time or power laws in cumulative production. I then examine the network of money flows between industries. This work is a precursor to studying the simultaneous evolution of multiple technologies. Economies resemble large machines, with different industries acting as interacting components with specialized functions. To begin studying the structure of these machines, I examine 20 economies with an emphasis on finding common features to serve as targets for statistical physics models. I find they share the same money flow and industry size distributions. I apply methods from statistical physics to show that industries
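The power-law relation between price and cumulative production that this work starts from is usually fit by least squares in log-log space. A minimal sketch with synthetic data (the function name and the A = 100, b = 0.3 values are illustrative only, not from the dissertation):

```python
import math

def fit_power_law(cumulative_production, prices):
    """Fit price = A * Q**(-b) by ordinary least squares in log-log space.

    Returns (A, b), where b is the experience-curve exponent: taking logs
    gives log p = log A - b * log Q, a straight line.
    """
    xs = [math.log(q) for q in cumulative_production]
    ys = [math.log(p) for p in prices]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - slope * mx), -slope

# Synthetic data generated from an exact power law p = 100 * Q**-0.3
Q = [1, 2, 4, 8, 16, 32]
P = [100.0 * q ** -0.3 for q in Q]
A, b = fit_power_law(Q, P)
print(A, b)  # recovers A ~ 100 and b ~ 0.3
```

On real price series the residuals around the fitted line carry the interesting structure; the dissertation's component-interaction model is one attempt to explain where the exponent b comes from.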
Statistical Reference Datasets
National Institute of Standards and Technology Data Gateway
Statistical Reference Datasets (Web, free access) The Statistical Reference Datasets is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.
Explorations in statistics: statistical facets of reproducibility.
Curran-Everett, Douglas
2016-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.
Melvin R. Novick: His Work in Bayesian Statistics.
ERIC Educational Resources Information Center
Lindley, D. V.
1987-01-01
Discusses Melvin R. Novick's work in the area of Bayesian statistics. This area of statistics was seen as a powerful scientific tool that allows educational researchers to have a better understanding of their data. (RB)
NASA Technical Reports Server (NTRS)
Levy, R.; Mcginness, H.
1976-01-01
Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
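An interim simulation model of the kind described — uncorrelated hourly speed samples reproducing a target distribution — is commonly built by inverse-CDF sampling from a fitted Weibull distribution. A sketch under assumed shape and scale parameters (illustrative values, not the Goldstone fit):

```python
import math
import random

def sample_weibull_speeds(n_hours, shape=2.0, scale=7.0, seed=0):
    """Draw uncorrelated hourly wind-speed samples (m/s) from a Weibull
    distribution via inverse-CDF sampling:
        v = scale * (-ln(1 - u))**(1/shape),  u ~ Uniform(0, 1).
    shape=2 (the Rayleigh case) and scale=7 m/s are assumed values."""
    rng = random.Random(seed)
    return [scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)
            for _ in range(n_hours)]

speeds = sample_weibull_speeds(24 * 365)  # one year of hourly samples
mean_speed = sum(speeds) / len(speeds)
# Available wind power scales with v**3, so the mean cube of speed,
# not the mean speed, governs the power prediction.
mean_cube = sum(v ** 3 for v in speeds) / len(speeds)
print(mean_speed, mean_cube)
```

A stochastic model like the one the report goes on to describe would additionally impose hour-to-hour correlation (e.g., an autoregressive process on transformed speeds) rather than drawing independent samples.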
Kimlin, Michael; Sun, Jiandong; Sinclair, Craig; Heward, Sue; Hill, Jane; Dunstone, Kimberley; Brodie, Alison
2016-01-01
An adequate vitamin D status, as measured by serum 25-hydroxyvitamin D (25(OH)D) concentration, is important in humans for the maintenance of healthy bones and muscle function. Serum 25(OH)D concentration was assessed in participants from Melbourne, Australia (37.81°S, 144.96°E), who were provided with the current Australian guidelines on sun exposure for 25(OH)D adequacy (25(OH)D ≥50 nmol/L). Participants were interviewed in February (summer, n=104) and August (winter, n=99) of 2013. Serum 25(OH)D concentration was examined as a function of measures of sun exposure and sun protection habits, with control of key characteristics such as dietary intake of vitamin D, body mass index (BMI) and skin colour that may modify this relationship. The mean 25(OH)D concentration in participants who complied with the current sun exposure guidelines was 67.3 nmol/L in summer and 41.9 nmol/L in winter. At the end of the study, 69.3% of participants who complied with the summer sun exposure guidelines were 25(OH)D adequate, while only 27.6% of those who complied with the winter guidelines were. The results suggest that the current Australian guidelines for sun exposure for 25(OH)D adequacy are effective for most people in summer and ineffective for most in winter. This article is part of a Special Issue entitled '17th Vitamin D Workshop'.
Adequate Iodine Status in New Zealand School Children Post-Fortification of Bread with Iodised Salt
Jones, Emma; McLean, Rachael; Davies, Briar; Hawkins, Rochelle; Meiklejohn, Eva; Ma, Zheng Feei; Skeaff, Sheila
2016-01-01
Iodine deficiency re-emerged in New Zealand in the 1990s, prompting the mandatory fortification of bread with iodised salt from 2009. This study aimed to determine the iodine status of New Zealand children when the fortification of bread was well established. A cross-sectional survey of children aged 8–10 years was conducted in the cities of Auckland and Christchurch, New Zealand, from March to May 2015. Children provided a spot urine sample for the determination of urinary iodine concentration (UIC), a fingerpick blood sample for Thyroglobulin (Tg) concentration, and completed a questionnaire ascertaining socio-demographic information that also included an iodine-specific food frequency questionnaire (FFQ). The FFQ was used to estimate iodine intake from all main food sources including bread and iodised salt. The median UIC for all children (n = 415) was 116 μg/L (females 106 μg/L, males 131 μg/L) indicative of adequate iodine status according to the World Health Organisation (WHO, i.e., median UIC of 100–199 μg/L). The median Tg concentration was 8.7 μg/L, which was <10 μg/L confirming adequate iodine status. There was a significant difference in UIC by sex (p = 0.001) and ethnicity (p = 0.006). The mean iodine intake from the food-only model was 65 μg/day. Bread contributed 51% of total iodine intake in the food-only model, providing a mean iodine intake of 35 μg/day. The mean iodine intake from the food-plus-iodised salt model was 101 μg/day. In conclusion, the results of this study confirm that the iodine status in New Zealand school children is now adequate. PMID:27196925
Welcome, Menizibeya Osain
2011-01-01
Objectives: As an important element of national security, public health not only functions to provide adequate and timely medical care but also to track, monitor, and control disease outbreaks. The Nigerian health care system has suffered repeated infectious disease outbreaks year after year; hence, there is a need to tackle the problem. This study aims to review the state of the Nigerian health care system and to provide possible recommendations for the worsening state of health care in the country. To give up-to-date recommendations for the Nigerian health care system, this study also reviews the dynamics of health care in the United States, Britain, and Europe with regard to methods of medical intelligence and surveillance. Materials and Methods: Databases were searched for relevant literature using the following keywords: Nigerian health care, Nigerian health care system, and Nigerian primary health care system. Additional keywords used in the search were: United States (OR Europe) health care dynamics, medical intelligence, medical intelligence systems, public health surveillance systems, Nigerian medical intelligence, Nigerian surveillance systems, and Nigerian health information system. The literature was searched in the scientific databases PubMed and African Journals OnLine; internet searches were based on Google and Search Nigeria. Results: Medical intelligence and surveillance are very useful components of a health care system for controlling disease outbreaks, bioattacks, etc. There is an increasing role for automated medical intelligence and surveillance systems, in addition to the traditional manual pattern of document retrieval, in advanced medical settings such as those in Western and European countries. Conclusion: The Nigerian health care system is poorly developed; no adequate and functional surveillance systems have been developed. To achieve success in health care in this modern era, a system well grounded in routine surveillance and medical
Nebulized antibiotics. An adequate option for treating ventilator-associated respiratory infection?
Rodríguez, A; Barcenilla, F
2015-03-01
Ventilator-associated tracheobronchitis (VAT) is a frequent complication in critical patients. Ninety percent of those who develop it receive broad-spectrum antibiotic (ATB) treatment, without any strong evidence of a favorable impact. The use of nebulized ATB could be a valid treatment option to reduce the use of systemic ATB and the selection pressure on the local flora. Several studies suggest that an adequate nebulization technique can ensure high ATB levels even in areas of lung consolidation, and can achieve clinical and microbiological cure. New studies are needed to properly assess the impact of treatment with nebulized ATB on the emergence of resistance.
Computer synthesis of human motion as a part of an adequate motion analysis experiment
NASA Astrophysics Data System (ADS)
Ivanov, Alexandre A.; Sholukha, Victor A.; Zinkovsky, Anatoly V.
1999-05-01
The role of the problem of computer synthesis of human motion in the traditional problem of determining generalized control and muscular forces is discussed. The significance of the choice of computer model for adequate analysis of kinematic and dynamic experimental data is emphasized. On the basis of an imitation computer model, the influence of the model's parameter values is demonstrated. With the help of non-stationary constraints, we can simulate human motions that satisfy the most significant parameters of the class of motion concerned. Some results of the simulation are discussed. We conclude that, for correct interpretation of an experiment, the mixed problem of the dynamics of a system of bodies must be solved.
Thermodynamic Limit in Statistical Physics
NASA Astrophysics Data System (ADS)
Kuzemsky, A. L.
2014-03-01
The thermodynamic limit in the statistical thermodynamics of many-particle systems is an important but often overlooked issue in the various applied studies of condensed matter physics. To settle this issue, we briefly review the past and present disposition of the thermodynamic limiting procedure in the structure of contemporary statistical mechanics and our current understanding of this problem. We pick out the ingenious approach of Bogoliubov, who developed a general formalism for establishing the limiting distribution functions in the form of formal series in powers of the density. In that study, he outlined the method of justification of the thermodynamic limit when he derived the generalized Boltzmann equations. To enrich the discussion, we take this opportunity to give a brief survey of closely related problems, such as the equipartition of energy and the equivalence and nonequivalence of statistical ensembles. The validity of the equipartition of energy permits one to decide what the boundaries of applicability of statistical mechanics are. The major aim of this work is to provide a better qualitative understanding of the physical significance of the thermodynamic limit in the modern statistical physics of infinite and "small" many-particle systems.
Statistical Mechanics of Money
NASA Astrophysics Data System (ADS)
Dragulescu, Adrian; Yakovenko, Victor
2000-03-01
We study a network of agents exchanging money between themselves. We find that the stationary probability distribution of money M is the Gibbs distribution exp(-M/T), where T is an effective "temperature" equal to the average amount of money per agent. This is in agreement with the general laws of statistical mechanics, because money is conserved during each transaction and the number of agents is held constant. We have verified the emergence of the Gibbs distribution in computer simulations of various trading rules and models. When the time-reversal symmetry of the trading rules is explicitly broken, deviations from the Gibbs distribution may occur, as follows from the Boltzmann-equation approach to the problem. The money distribution characterizes the purchasing power of a system. A seller would maximize his/her income by setting the price of a product equal to the temperature T of the system. Buying products from a system of temperature T1 and selling them to a system of temperature T2 would generate a profit T2 - T1 > 0, as in a thermal machine.
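The conserved-money exchange model is straightforward to reproduce. A sketch of one trading rule of this kind (a random pair pools and randomly re-splits its money; the rule details and names here are ours, chosen for simplicity) that relaxes to the exponential Gibbs distribution:

```python
import random

def simulate_money_exchange(n_agents=500, steps=100_000, start=100.0, seed=42):
    """Each step, a random pair of agents pools its money and splits the
    total at a random fraction. Total money is conserved, so the stationary
    distribution approaches the Gibbs form exp(-m/T), with T equal to the
    average money per agent."""
    rng = random.Random(seed)
    money = [start] * n_agents
    for _ in range(steps):
        i = rng.randrange(n_agents)
        j = rng.randrange(n_agents)
        if i == j:
            continue
        total = money[i] + money[j]
        eps = rng.random()
        money[i] = eps * total
        money[j] = total - money[i]
    return money

money = simulate_money_exchange()
T = sum(money) / len(money)  # effective "temperature" = average money
frac_below_T = sum(m < T for m in money) / len(money)
print(T, frac_below_T)  # for exp(-m/T), the fraction below T is 1 - 1/e ~ 0.63
```

Checking the fraction of agents below the mean is a quick diagnostic: an exponential distribution gives about 0.63, whereas a Gaussian centered on the mean would give about 0.5.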
Kindermann, G; Jung, E M; Maassen, V; Bise, K
1996-08-01
The Problem of an Adequate Surgical Approach: According to the literature, the frequency of malignant teratomas is 2%-10%; in our own 194 cases (1983-1993) it was 1.5%. We found one squamous cell carcinoma (0.5%) and, additionally, 2 immature teratomas (1%). We point out the different biological behaviour of malignant mature teratomas and immature teratomas. We agree with the majority of authors that the method of choice is the intact removal of all teratomas, without iatrogenic rupture or contamination of the abdominal cavity by the contents of the teratoma. This adequate surgical procedure can and should be performed by laparotomy, or by laparoscopy with an endobag. The often-practised method of cutting open the cyst during laparoscopy and sucking off the contents, or of cutting the teratoma into pieces, has been proven to lead to implantation and to worsen the prognosis in the case of a malignant teratoma. Even the rinsing of the abdominal cavity usually carried out with this method cannot always compensate for the disadvantage of this "dirty" endoscopic method compared with usual oncological standards. This is pointed out by case reports in the literature and by the first analysis of a German survey, with early follow-up, of 192 laparoscopically managed ovarian malignancies [11a]. The principle of intact removal of every teratoma should again be kept in mind.
MRI can determine the adequate area for debridement in the case of Fournier's gangrene.
Yoneda, Akira; Fujita, Fumihiko; Tokai, Hirotaka; Ito, Yuichiro; Haraguchi, Masashi; Tajima, Yoshitsugu; Kanematsu, Takashi
2010-01-01
A 57-year-old man was transferred to our hospital because of gluteal pain. His right buttock showed redness and swelling. A complete blood count showed leukocytosis, and renal failure was evident. Pelvic computed tomography (CT) revealed a gas-containing abscess spreading widely into the hypodermal tissue of the right buttock. Fournier's gangrene was suspected, and immediate drainage of the right buttock was performed. The symptoms and the patient's condition improved rapidly, but on the day after the operation the patient became drowsy and fell into endotoxic shock. Magnetic resonance imaging (MRI) revealed strong inflammation along the entire fascia of the right femur and necrotizing fasciitis; MRI was very useful for identifying the extent of necrosis. An emergency operation was immediately performed: 3 wide incisions were made on the right thigh and crus for drainage. The patient was cared for intensively under sedation, and irrigation and debridement were repeated every day. Culture of the pus revealed a mixed infection of Escherichia coli and anaerobic bacteria, and large quantities of antimicrobial drugs were used. The inflammatory reaction decreased, and the patient's general condition tentatively improved. In Fournier's gangrene, initiating adequate surgical and medical treatment is essential. Therefore, MRI should be used early for an exact diagnosis of this disease, to establish the extent of necrosis and to determine the adequate area for debridement.
Ko, Yousang; Shin, Jeong Hwan; Lee, Hyun-Kyung; Lee, Young Seok; Lee, Suh-Young; Park, So Young; Mo, Eun-Kyung; Kim, Changhwan
2017-01-01
Background A sputum culture is the most reliable indicator of the infectiousness of pulmonary tuberculosis (PTB); however, a spontaneous sputum specimen may not be suitable. The aim of this study was to evaluate the infectious period in patients with non–drug-resistant (DR) PTB receiving adequate standard chemotherapy, using induced sputum (IS) specimens. Methods We evaluated the duration of infectiousness of PTB using a retrospective cohort design. Results Among the 35 patients with PTB, 22 were smear-positive. The rates of IS culture positivity from baseline to the sixth week of anti-tuberculosis medication in the smear-positive PTB group were 100%, 100%, 91%, 73%, 36%, and 18%, respectively. For smear-positive PTB cases, the median time of conversion to culture negativity was 35.0 days (range, 28.0–42.0 days). In the smear-negative PTB group (n=13), the weekly rates of positive IS culture were 100%, 77%, 39%, 8%, 0%, and 0%, respectively, and the median time to conversion to culture-negative was 21.0 days (range, 17.5–28.0 days). Conclusion The infectiousness of PTB, under adequate therapy, may persist longer than previously reported, even in patients with non-DR PTB. PMID:28119744
Siahkouhian, Marefat; Khodadadi, Davar
2013-01-01
Purpose: Optimal training intensity and the adequate exercise level for physical fitness are among the most important interests of coaches and sports physiologists. The aim of this study was to investigate the validity of the Narita et al target heart rate equation for the adequate exercise training level in sedentary young boys. Methods: Forty-two sedentary young boys (19.07±1.16 years) undertook a blood lactate transition threshold maximal treadmill test to volitional exhaustion, with continuous respiratory gas measurements according to the Craig method. The anaerobic threshold (AT) of the participants was then calculated using the Narita target heart rate equation. Results: Hopkins's spreadsheet, used to obtain the confidence limits and the chance of a true difference between the gas measurements and the Narita target heart rate equation, revealed that the Narita equation most likely underestimates the measured anaerobic threshold in sedentary young boys (168.76±15 vs. 130.08±14.36; difference ±90% confidence limit: 38.1±18). The intraclass correlation coefficient (ICC) showed poor agreement between the criterion method and the Narita equation (ICC = 0.03). Conclusion: According to the results, the Narita equation underestimates the measured AT. It seems that the Narita equation is a good predictor of the aerobic threshold rather than the AT, which can be investigated in future studies. PMID:24427475
Maintaining Adequate CO2 Washout for an Advanced EMU via a New Rapid Cycle Amine Technology
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Conger, Bruce
2012-01-01
Over the past several years, NASA has realized tremendous progress in Extravehicular Activity (EVA) technology development. This has been evidenced by the progressive development of a new Rapid Cycle Amine (RCA) system for the Advanced Extravehicular Mobility Unit (AEMU) Portable Life Support Subsystem (PLSS). The PLSS is responsible for the life support of the crew member in the spacesuit. The RCA technology is responsible for carbon dioxide (CO2) and humidity control. Another aspect of the RCA is that it is on-back vacuum-regenerable, efficient, and reliable. The RCA also simplifies the PLSS schematic by eliminating the need for a condensing heat exchanger for humidity control in the current EMU. As development progresses on the RCA, it is important that the sizing be optimized so that the demand on the PLSS battery is minimized. In addition, maintaining the CO2 washout at adequate levels during an EVA is an absolute requirement of the RCA and associated ventilation system. Testing has been underway in-house at NASA Johnson Space Center and analysis has been initiated to evaluate whether the technology provides exemplary performance in ensuring that the CO2 is removed sufficiently and the ventilation flow is adequate for maintaining CO2 washout in the AEMU spacesuit helmet of the crew member during an EVA. This paper will review the recent developments of the RCA unit, testing planned in-house with a spacesuit simulator, and the associated analytical work, along with insights from the medical aspect of the testing.
Electric power annual 1995. Volume II
1996-12-01
This document summarizes pertinent statistics on various aspects of the U.S. electric power industry for the year and includes a graphic presentation. Data is included on electric utility retail sales and revenues, financial statistics, environmental statistics of electric utilities, demand-side management, electric power transactions, and non-utility power producers.
Mathematical and statistical analysis
NASA Technical Reports Server (NTRS)
Houston, A. Glen
1988-01-01
The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.
Experiment in Elementary Statistics
ERIC Educational Resources Information Center
Fernando, P. C. B.
1976-01-01
Presents an undergraduate laboratory exercise in elementary statistics in which students verify empirically the various aspects of the Gaussian distribution. Sampling techniques and other commonly used statistical procedures are introduced. (CP)
ERIC Educational Resources Information Center
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Teaching Statistics Using SAS.
ERIC Educational Resources Information Center
Mandeville, Garrett K.
The Statistical Analysis System (SAS) is presented as the single most appropriate statistical package to use as an aid in teaching statistics. A brief review of literature in which SAS is compared to SPSS, BMDP, and other packages is followed by six examples which demonstrate features unique to SAS which have pedagogical utility. Of particular…
Minnesota Health Statistics 1988.
ERIC Educational Resources Information Center
Minnesota State Dept. of Health, St. Paul.
This document comprises the 1988 annual statistical report of the Minnesota Center for Health Statistics. After introductory technical notes on changes in format, sources of data, and geographic allocation of vital events, an overview is provided of vital health statistics in all areas. Thereafter, separate sections of the report provide tables…
Statistical error in particle simulations of low mach number flows
Hadjiconstantinou, N G; Garcia, A L
2000-11-13
We present predictions for the statistical error due to finite sampling in the presence of thermal fluctuations in molecular simulation algorithms. The expressions are derived using equilibrium statistical mechanics. The results show that the number of samples needed to adequately resolve the flowfield scales as the inverse square of the Mach number. Agreement of the theory with direct Monte Carlo simulations shows that the use of equilibrium theory is justified.
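The inverse-square scaling with Mach number quoted above can be illustrated with a short numeric sketch (the proportionality constant here is illustrative, not a value from the paper):

```python
# Sketch: samples needed to resolve a flowfield at a given Mach number,
# assuming the reported scaling N ~ 1/Ma^2 (the prefactor is illustrative).

def required_samples(mach, prefactor=1.0):
    """Samples needed for a fixed relative statistical error: N = prefactor / Ma**2."""
    return prefactor / mach ** 2

# Halving the Mach number quadruples the number of samples needed.
print(required_samples(0.1) / required_samples(0.2))
```

This is why low-Mach (nearly incompressible) flows are so expensive to resolve with fluctuating molecular methods such as direct simulation Monte Carlo.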
Statistical Methods for Astronomy
NASA Astrophysics Data System (ADS)
Feigelson, Eric D.; Babu, G. Jogesh
Statistical methodology, with deep roots in probability theory, provides quantitative procedures for extracting scientific knowledge from astronomical data and for testing astrophysical theory. In recent decades, statistics has enormously increased in scope and sophistication. After a historical perspective, this review outlines concepts of mathematical statistics, elements of probability theory, hypothesis tests, and point estimation. Least squares, maximum likelihood, and Bayesian approaches to statistical inference are outlined. Resampling methods, particularly the bootstrap, provide valuable procedures when distribution functions of statistics are not known. Several approaches to model selection and goodness of fit are considered.
Statistical analysis principles for Omics data.
Dunkler, Daniela; Sánchez-Cabo, Fátima; Heinze, Georg
2011-01-01
In Omics experiments, typically thousands of hypotheses are tested simultaneously, each based on very few independent replicates. Traditional tests like the t-test were shown to perform poorly with this new type of data. Furthermore, simultaneous consideration of many hypotheses, each prone to a decision error, requires powerful adjustments for this multiple testing situation. After a general introduction to statistical testing, we present the moderated t-statistic, the SAM statistic, and the RankProduct statistic which have been developed to evaluate hypotheses in typical Omics experiments. We also provide an introduction to the multiple testing problem and discuss some state-of-the-art procedures to address this issue. The presented test statistics are subjected to a comparative analysis of a microarray experiment comparing tissue samples of two groups of tumors. All calculations can be done using the freely available statistical software R. Accompanying, commented code is available at: http://www.meduniwien.ac.at/msi/biometrie/MIMB.
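To make the multiple-testing adjustment discussed in this abstract concrete, here is a minimal Python sketch of the standard Benjamini-Hochberg step-up procedure (the paper's own examples use R; this is not the authors' code):

```python
# Benjamini-Hochberg step-up procedure: convert raw p-values from many
# simultaneous tests into FDR-adjusted p-values (q-values).

def benjamini_hochberg(pvals):
    """Return BH-adjusted p-values, preserving the input order."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    prev = 1.0
    # Walk down from the largest p-value, enforcing monotonicity.
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        prev = min(prev, pvals[i] * m / rank)
        adjusted[i] = prev
    return adjusted

print(benjamini_hochberg([0.01, 0.04, 0.03, 0.50]))
```

Rejecting hypotheses whose adjusted value falls below 0.05 then controls the false discovery rate at 5% across all tests, rather than per test.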
Code of Federal Regulations, 2010 CFR
2010-10-01
... disclosure or destruction of manual and automatic record systems. These security safeguards shall apply to... use of records contained in a system of records are adequately trained to protect the security and... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction...
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 7 2012-04-01 2012-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 7 2014-04-01 2014-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 7 2013-04-01 2013-04-01 false Labeling of cosmetic products for which adequate..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS COSMETIC PRODUCT WARNING STATEMENTS Warning Statements § 740.10 Labeling of cosmetic products for which adequate substantiation of safety has not...
Adequate connexin-mediated coupling is required for proper insulin production
1995-01-01
To assess whether connexin (Cx) expression contributes to insulin secretion, we have investigated normal and tumoral insulin-producing cells for connexins, gap junctions, and coupling. We have found that the glucose-sensitive cells of pancreatic islets and of a rat insulinoma are functionally coupled by gap junctions made of Cx43. In contrast, cells of several lines secreting insulin abnormally do not express Cx43, gap junctions, and coupling. After correction of these defects by stable transfection of Cx43 cDNA, cells expressing modest levels of Cx43 and coupling, as observed in native beta-cells, showed an expression of the insulin gene and an insulin content that were markedly elevated, compared with those observed in both wild-type (uncoupled) cells and in transfected cells overexpressing Cx43. These findings indicate that adequate levels of Cx-mediated coupling are required for proper insulin production and storage. PMID:8522612
Cheema, U; Alekseeva, T; Abou-Neel, E A; Brown, R A
2010-10-06
A major question in biomimetic tissue engineering is how much of the structure/function of native vasculature needs to be reproduced for effective tissue perfusion. O2 supplied to cells in 3D scaffolds in vitro is initially dependent upon diffusion through the scaffold and cell consumption. Low O2 (3%) enhances specific cell behaviours, but where O2 is critically low (pathological hypoxia) cell survival becomes compromised. We measured real-time O2 in 3D scaffolds and introduced micro-channelled architecture to controllably increase delivery of O2 to cells and switch off the hypoxic response. Simple static micro-channelling gives adequate perfusion and can be used to control cell generated hypoxia-induced signalling.
NASA Astrophysics Data System (ADS)
Wei, Zhao; Tao, Feng; Jun, Wang
2013-10-01
An efficient, robust, and accurate approach is developed for image registration, which is especially suitable for large-scale change and arbitrary rotation. It is named the adequately sampling polar transform and weighted angular projection function (ASPT-WAPF). The proposed ASPT model overcomes the oversampling problem of conventional log-polar transform. Additionally, the WAPF presented as the feature descriptor is robust to the alteration in the fovea area of an image, and reduces the computational cost of the following registration process. The experimental results show two major advantages of the proposed method. First, it can register images with high accuracy even when the scale factor is up to 10 and the rotation angle is arbitrary. However, the maximum scaling estimated by the state-of-the-art algorithms is 6. Second, our algorithm is more robust to the size of the sampling region while not decreasing the accuracy of the registration.
NASA Astrophysics Data System (ADS)
Hikov, Todor; Pecheva, Emilia; Montgomery, Paul; Antoni, Frederic; Leong-Hoi, Audrey; Petrov, Todor
2017-01-01
This work aims at evaluating the possibility of introducing state-of-the-art commercial femtosecond laser system in restorative dentistry by maintaining well-known benefits of lasers for caries removal, but also in overcoming disadvantages such as thermal damage of irradiated substrate. Femtosecond ablation of dental hard tissue is investigated by changing the irradiation parameters (pulsed laser energy, scanning speed and pulse repetition rate), assessed for enamel and dentin. The femtosecond laser system used in this work may be suitable for cavity preparation in dentin and enamel, due to the expected effective ablation and low temperature increase when using ultra short laser pulses. If adequate laser parameters are selected, this system seems to be promising for promoting a laser-assisted, minimally invasive approach in restorative dentistry.
J-modulated ADEQUATE experiments using different kinds of refocusing pulses.
Thiele, Christina M; Bermel, Wolfgang
2007-10-01
Owing to the recent developments concerning residual dipolar couplings (RDCs), interest in methods for the accurate determination of coupling constants is being renewed. We intended to use the J-modulated ADEQUATE experiment by Kövér et al. for the measurement of (13)C-(13)C coupling constants at natural abundance. The use of adiabatic composite chirp pulses instead of the conventional 180-degree pulses, which compensate for the offset dependence of (13)C 180-degree pulses, led to irregularities of the line shapes in the indirect dimension, causing deviations in the extracted coupling constants. This behaviour was attributed to coupling evolution, during the time of the adiabatic pulse (2 ms), in the J-modulation spin echo. The replacement of this pulse by different kinds of refocusing pulses indicated that a pair of BIPs (broadband inversion pulses), which behave only partially adiabatically, leads to correct line shapes and coupling constants while conserving the good sensitivity obtained with adiabatic pulses.
A Nomogram to Predict Adequate Lymph Node Recovery before Resection of Colorectal Cancer
Zhang, Zhen-yu; Li, Cong; Gao, Wei; Yin, Xiao-wei; Luo, Qi-feng; Liu, Nan; Basnet, Shiva; Dai, Zhen-ling; Ge, Hai-yan
2016-01-01
Increased lymph node count (LNC) has been associated with prolonged survival in colorectal cancer (CRC), but the underlying mechanisms are still poorly understood. The study aims to identify new predictors and develop a preoperative nomogram for predicting the probability of adequate LNC (≥ 12). 501 eligible patients were retrospectively selected to identify clinical-pathological factors associated with LNC ≥ 12 through univariate and multivariate logistic regression analyses. The nomogram was built according to multivariate analyses of preoperative factors. Model performance was assessed with concordance index (c-index) and area under the receiver operating characteristic curve (AUC), followed by internal validation and calibration using 1000-resample bootstrapping. Clinical validity of the nomogram and LNC impact on stage migration were also evaluated. Multivariate analyses showed patient age, CA19-9, circulating lymphocytes, neutrophils, platelets, tumor diameter, histology and deposit significantly correlated with LNC (P < 0.05). The effects were marginal for CEA, anemia and CRC location (0.05 < P < 0.1). The multivariate analyses of preoperative factors suggested decreased age, CEA, CA19-9, neutrophils, proximal location, and increased platelets and diameter were significantly associated with increased probability of LNC ≥ 12 (P < 0.05). The nomogram achieved c-indexes of 0.75 and 0.73 before and after correction for overfitting. The AUC was 0.75 (95% CI, 0.70–0.79) and the clinically valid threshold probabilities were between 10% and 60% for the nomogram to predict LNC < 12. Additionally, increased probability of adequate LNC before surgery was associated with increased LNC and negative lymph nodes rather than increased positive lymph nodes, lymph node ratio, pN stages or AJCC stages. Collectively, the results indicate the LNC is multifactorial and irrelevant to stage migration. The significant correlations with preoperative circulating markers may
Tetens, Inge; Dejgård Jensen, Jørgen; Smed, Sinne; Gabrijelčič Blenkuš, Mojca; Rayner, Mike; Darmon, Nicole; Robertson, Aileen
2016-01-01
Background Food-Based Dietary Guidelines (FBDGs) are developed to promote healthier eating patterns, but increasing food prices may make healthy eating less affordable. The aim of this study was to design a range of cost-minimized nutritionally adequate health-promoting food baskets (FBs) that help prevent both micronutrient inadequacy and diet-related non-communicable diseases at lowest cost. Methods Average prices for 312 foods were collected within the Greater Copenhagen area. The cost and nutrient content of five different cost-minimized FBs for a family of four were calculated per day using linear programming. The FBs were defined using five different constraints: cultural acceptability (CA), or dietary guidelines (DG), or nutrient recommendations (N), or cultural acceptability and nutrient recommendations (CAN), or dietary guidelines and nutrient recommendations (DGN). The variety and number of foods in each of the resulting five baskets was increased through limiting the relative share of individual foods. Results The one-day version of N contained only 12 foods at the minimum cost of DKK 27 (€ 3.6). The CA, DG, and DGN were about twice of this and the CAN cost ~DKK 81 (€ 10.8). The baskets with the greater variety of foods contained from 70 (CAN) to 134 (DGN) foods and cost between DKK 60 (€ 8.1, N) and DKK 125 (€ 16.8, DGN). Ensuring that the food baskets cover both dietary guidelines and nutrient recommendations doubled the cost while cultural acceptability (CAN) tripled it. Conclusion Use of linear programming facilitates the generation of low-cost food baskets that are nutritionally adequate, health promoting, and culturally acceptable. PMID:27760131
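The cost-minimization idea underlying these food baskets can be sketched in miniature. The study solved a linear program over 312 priced foods; the toy below instead grid-searches two hypothetical foods against two nutrient floors (all names, prices, and nutrient values are invented for illustration):

```python
# Toy cost-minimized "food basket": choose amounts of foods (in 100 g units)
# that meet nutrient floors at least cost. Prices and nutrient contents are
# hypothetical; the real study used linear programming over 312 foods.

FOODS = {          # price per 100 g, (energy kJ, protein g) per 100 g
    "oats":  (0.30, (1550, 13.0)),
    "beans": (0.55, (1350, 21.0)),
}
REQUIRED = (9000, 60.0)   # daily floors: energy (kJ), protein (g)

def cheapest_basket(step=0.5, max_units=20):
    """Grid-search amounts of the two foods meeting both floors at least cost."""
    best = None
    n = int(max_units / step) + 1
    for i in range(n):
        for j in range(n):
            a, b = i * step, j * step
            energy = a * FOODS["oats"][1][0] + b * FOODS["beans"][1][0]
            protein = a * FOODS["oats"][1][1] + b * FOODS["beans"][1][1]
            if energy >= REQUIRED[0] and protein >= REQUIRED[1]:
                cost = a * FOODS["oats"][0] + b * FOODS["beans"][0]
                if best is None or cost < best[0]:
                    best = (cost, a, b)
    return best

print(cheapest_basket())
```

Adding cultural-acceptability or dietary-guideline constraints, as in the study, simply adds rows to the constraint set, which is why those baskets come out more expensive.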
[Quality of Mesorectal Excision ("Plane of Surgery") - Which Quality Targets are Adequate?].
Hermanek, P; Merkel, S; Ptok, H; Hohenberger, W
2015-12-01
Today, the examination of rectal cancer specimens includes the obligate macroscopic assessment of the quality of mesorectal excision by the pathologist reporting the plane of surgery. The frequency of operations in the muscularis propria plane of surgery (earlier described as incomplete mesorectal excision) is essential. The quality of mesorectal excision is important for the prognosis, especially as local recurrences are observed more frequently after operations in the muscularis propria plane of surgery. For the definition of quality targets, data of 13 studies published between 2006 and 2012, each with more than 100 patients and adequate specialisation and experience of the surgeons (5413 patients), data of the prospective multicentric observation study "Quality Assurance - Rectal Cancer" (at the Institute for Quality Assurance in Operative Medicine at the Otto-von-Guericke University at Magdeburg) from 2005 to 2010 (8044 patients) and data of the Department of Surgery, University Hospital Erlangen, from 1998 to 2011 (991 patients) were analysed. The total incidence of operations in the muscularis propria plane of surgery was 5.0 % (721/14 448). Even with adequate specialisation and experience of the surgeon, the frequency of operations in the muscularis propria plane of surgery is higher in abdominoperineal excisions than in sphincter-preserving surgery (8.4 vs. 2.8 %, p < 0.001). Thus, the quality target for the frequency of operations in the muscularis propria plane should be defined as < 5 % for sphincter-preserving procedures and as < 10 % for abdominoperineal excisions.
Maintaining Adequate CO2 Washout for an Advanced EMU via a New Rapid Cycle Amine Technology
NASA Technical Reports Server (NTRS)
Chullen, Cinda
2011-01-01
Over the past several years, NASA has realized tremendous progress in Extravehicular Activity (EVA) technology development. This has been evidenced by the progressive development of a new Rapid Cycle Amine (RCA) system for the Advanced Extravehicular Mobility Unit (AEMU) Portable Life Support Subsystem (PLSS). The PLSS is responsible for the life support of the crew member in the spacesuit. The RCA technology is responsible for carbon dioxide (CO2) and humidity control. Another aspect of the RCA is that it is on-back vacuum-regenerable, efficient, and reliable. The RCA also simplifies the PLSS schematic by eliminating the need for a condensing heat exchanger for humidity control in the current EMU. As development progresses on the RCA, it is important that the sizing be optimized so that the demand on the PLSS battery is minimized. In addition, maintaining the CO2 washout at adequate levels during an EVA is an absolute requirement of the RCA and associated ventilation system. Testing has been underway in-house at NASA Johnson Space Center and analysis has been initiated to evaluate whether the technology provides exemplary performance in ensuring that the CO2 is removed sufficiently and the ventilation flow is adequate to maintain CO2 washout in the AEMU spacesuit helmet of the crew member during an EVA. This paper will review the recent developments of the RCA unit, the results of testing performed in-house with a spacesuit simulator, and the associated analytical work, along with insights from the medical aspect of the testing.
Peek-Asa, C; Allareddy, V; Yang, J; Taylor, C; Lundell, J; Zwerling, C
2005-01-01
Objective: The National Fire Protection Association (NFPA) has specific recommendations about the number, location, and type of smoke alarms that are needed to provide maximum protection for a household. No previous studies have examined whether or not homes are completely protected according to these guidelines. The authors describe the prevalence and home characteristics associated with compliance to recommendations for smoke alarm installation by the NFPA. Design, setting, and subjects: Data are from the baseline on-site survey of a randomized trial to measure smoke alarm effectiveness. The trial was housed in a longitudinal cohort study in a rural Iowa county. Of 1005 homes invited, 691 (68.8%) participated. Main outcome measures: Information about smoke alarm type, placement, and function, as well as home and occupant characteristics, was collected through an on-site household survey. Results: Although 86.0% of homes had at least one smoke alarm, only 22.3% of homes (approximately one in five) were adequately protected according to NFPA guidelines. Fourteen percent of homes had no functioning smoke alarms. More than half of the homes with smoke alarms did not have enough of them or had installed them incorrectly, and 42.4% of homes with alarms had at least one alarm that did not operate. Homes with at least one high school graduate were nearly four times more likely to be fully protected. Homes that had multiple levels, a basement, or were cluttered or poorly cleaned were significantly less likely to be fully protected. Conclusion: These findings indicate that consumers may not be knowledgeable about the number of alarms they need or how to properly install them. Occupants are also not adequately maintaining the alarms that are installed. PMID:16326772
Sair, A I; Booren, A M; Berry, B W; Smith, D M
1999-02-01
The objectives were to (i) compare the use of triose phosphate isomerase (TPI) activity and internal color scores for determination of cooking adequacy of beef patties and (ii) determine the effect of frozen storage and fat content on residual TPI activity in ground beef. Ground beef patties (24.4% fat) were cooked to five temperatures ranging from 60.0 to 82.2 degrees C. TPI activity decreased as beef patty cooking temperature was increased from 60.0 to 71.1 degrees C; however, no difference (P > 0.05) in activity (6.3 U/kg meat) was observed in patties cooked to 71.1 degrees C and above. Degree of doneness color scores, a* values and b* values, of ground beef patties decreased as internal temperature was increased from 60.0 to 71.1 degrees C; however, temperature had no effect on L* values. TPI activity in raw ground beef after five freeze-thaw cycles did not differ from the control. Three freeze-thaw cycles of raw ground beef resulted in a 57.2% decrease in TPI activity after cooking. TPI activity of cooked beef increased during 2 months of frozen storage, but TPI activity in ground beef stored for 3 months or longer did not differ from the unfrozen control. While past research has shown color to be a poor indicator of adequate thermal processing, our results suggest that undercooked ground beef patties could be distinguished from those that had been adequately cooked following U.S. Department of Agriculture guidelines using residual TPI activity as a marker.
Options for Affordable Fission Surface Power Systems
NASA Technical Reports Server (NTRS)
Houts, Mike; Gaddis, Steve; Porter, Ron; VanDyke, Melissa; Martin Jim; Godfroy, Tom; Bragg-Sitton, Shannon; Garber, Anne; Pearson, Boise
2006-01-01
Fission surface power systems could provide abundant power anywhere on the surface of the moon or Mars. Locations could include permanently shaded regions on the moon and high latitudes on Mars. To be fully utilized, however, fission surface power systems must be safe, have adequate performance, and be affordable. This paper discusses options for the design and development of such systems.
Effects of flare definitions on the statistics of derived flare distributions
NASA Astrophysics Data System (ADS)
Ryan, D. F.; Dominique, M.; Seaton, D.; Stegen, K.; White, A.
2016-08-01
The statistical examination of solar flares is crucial to revealing their global characteristics and behaviour. Such examinations can tackle large-scale science questions or give context to detailed single-event studies. However, they are often performed using standard but basic flare detection algorithms relying on arbitrary thresholds. This arbitrariness may lead to important scientific conclusions being drawn from results caused by subjective choices in algorithms rather than the true nature of the Sun. In this paper, we explore the effect of the arbitrary thresholds used in the Geostationary Operational Environmental Satellite (GOES) event list and Large Yield RAdiometer (LYRA) Flare Finder algorithms. We find that there is a small but significant relationship between the power law exponent of the GOES flare peak flux frequency distribution and the flare start thresholds of the algorithms. We also find that the power law exponents of these distributions are not stable, but appear to steepen with increasing peak flux. This implies that the observed flare size distribution may not be a power law at all. We show that depending on the true value of the exponent of the flare size distribution, this deviation from a power law may be due to flares missed by the flare detection algorithms. However, it is not possible to determine the true exponent from GOES/XRS observations. Additionally, we find that the PROBA2/LYRA flare size distributions are artificially steep and clearly non-power law. We show that this is consistent with an insufficient degradation correction. This means that PROBA2/LYRA should not be used for flare statistics or energetics unless degradation is adequately accounted for. However, it can be used to study variations over shorter timescales and for space weather monitoring.
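Power-law exponents like those discussed here are commonly fitted with the maximum-likelihood estimator for a continuous power law, alpha = 1 + n / sum(ln(x_i / x_min)), where x_min plays the role of the detection threshold. The sketch below is a generic illustration of that estimator on synthetic data, not the GOES or LYRA detection code:

```python
# Maximum-likelihood fit of a continuous power-law exponent above a threshold
# x_min, demonstrated on synthetic flare-like data drawn by inverse transform.
import math
import random

def powerlaw_mle(samples, x_min):
    """MLE exponent alpha for samples assumed to follow p(x) ~ x**-alpha above x_min."""
    tail = [x for x in samples if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

random.seed(42)
alpha_true = 2.0
# Inverse-transform sampling: x = x_min * (1 - u)**(-1 / (alpha - 1)).
xs = [(1 - random.random()) ** (-1 / (alpha_true - 1)) for _ in range(20000)]
print(round(powerlaw_mle(xs, 1.0), 2))
```

Re-running the fit with a raised x_min mimics a stricter flare-start threshold; for a true power law the estimate should stay stable, so a drift with threshold (as the paper reports) signals a deviation from power-law form or missed events.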
Statistical phenomena in particle beams
Bisognano, J.J.
1984-09-01
Particle beams are subject to a variety of apparently distinct statistical phenomena such as intrabeam scattering, stochastic cooling, electron cooling, coherent instabilities, and radiofrequency noise diffusion. In fact, both the physics and mathematical description of these mechanisms are quite similar, with the notion of correlation as a powerful unifying principle. In this presentation we will attempt to provide both a physical and a mathematical basis for understanding the wide range of statistical phenomena that have been discussed. In the course of this study the tools of the trade will be introduced, e.g., the Vlasov and Fokker-Planck equations, noise theory, correlation functions, and beam transfer functions. Although a major concern will be to provide equations for analyzing machine design, the primary goal is to introduce a basic set of physical concepts having a very broad range of applicability.
Not Available
1994-01-06
The Electric Power Annual presents a summary of electric utility statistics at national, regional and State levels. The objective of the publication is to provide industry decisionmakers, government policymakers, analysts and the general public with historical data that may be used in understanding US electricity markets. The Electric Power Annual is prepared by the Survey Management Division; Office of Coal, Nuclear, Electric and Alternate Fuels; Energy Information Administration (EIA); US Department of Energy. The "US Electric Power Industry at a Glance" section presents a profile of the electric power industry ownership and performance, and a review of key statistics for the year. Subsequent sections present data on generating capability, including proposed capability additions; net generation; fossil-fuel statistics; retail sales; revenue; financial statistics; environmental statistics; electric power transactions; demand-side management; and nonutility power producers. In addition, the appendices provide supplemental data on major disturbances and unusual occurrences in US electricity power systems. Each section contains related text and tables and refers the reader to the appropriate publication that contains more detailed data on the subject matter. Monetary values in this publication are expressed in nominal terms.
Statistical Properties of Online Auctions
NASA Astrophysics Data System (ADS)
Namazi, Alireza; Schadschneider, Andreas
We characterize the statistical properties of a large number of online auctions run on eBay. Both stationary and dynamic properties, like distributions of prices, number of bids etc., as well as relations between these quantities are studied. The analysis of the data reveals surprisingly simple distributions and relations, typically of power-law form. Based on these findings we introduce a simple method to identify suspicious auctions that could be influenced by a form of fraud known as shill bidding. Furthermore the influence of bidding strategies is discussed. The results indicate that the observed behavior is related to a mixture of agents using a variety of strategies.
Phase statistics of seismic coda waves.
Anache-Ménier, D; van Tiggelen, B A; Margerin, L
2009-06-19
We report the analysis of the statistics of the phase fluctuations in the coda of earthquakes recorded during a temporary experiment deployed at Pinyon Flats Observatory, California. The observed distributions of the spatial derivatives of the phase in the seismic coda exhibit universal power-law decays whose exponents agree accurately with circular Gaussian statistics. The correlation function of the phase derivative is measured and used to estimate the mean free path of Rayleigh waves.
Statistical distribution sampling
NASA Technical Reports Server (NTRS)
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
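Determining the distribution of a statistic by sampling can be sketched with a small Monte Carlo routine (an illustration of the general idea, not the report's method):

```python
import random
import statistics

def sampling_distribution(statistic, draw, n, reps, seed=0):
    """Empirical sampling distribution: compute `statistic` on `reps`
    independent samples of size n, each drawn by `draw(rng)`."""
    rng = random.Random(seed)
    return [statistic([draw(rng) for _ in range(n)]) for _ in range(reps)]

# Sampling distribution of the mean of n = 25 standard-normal draws;
# theory predicts mean 0 and standard deviation 1/sqrt(25) = 0.2.
means = sampling_distribution(statistics.fmean, lambda r: r.gauss(0.0, 1.0), 25, 2000)
spread = statistics.stdev(means)   # close to 0.2
```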
Adopting adequate leaching requirement for practical response models of basil to salinity
NASA Astrophysics Data System (ADS)
Babazadeh, Hossein; Tabrizi, Mahdi Sarai; Darvishi, Hossein Hassanpour
2016-07-01
Several mathematical models are used for assessing plant response to root-zone salinity. The objectives of this study were to quantify the yield salinity threshold of basil with respect to irrigation water salinity and to investigate the possibility of using irrigation water salinity, instead of saturated-extract salinity, in the available mathematical models for estimating yield. To achieve these objectives, an extensive greenhouse experiment was conducted with 13 irrigation water salinity levels: 1.175 dS m-1 (control treatment) and 1.8 to 10 dS m-1. The results indicated that, among these models, the modified discount model (one of the best-known statistically based root water uptake models) was the most accurate in simulating the basil yield reduction function from irrigation water salinities. Overall, the statistical model of Steppuhn et al. (the modified discount model) and the math-empirical model of van Genuchten and Hoffman provided the best results. In general, all of the statistical models produced very similar results, and their results were better than those of the math-empirical models. It was also concluded that, provided leaching was adequate, there was no significant difference between models based on soil saturated-extract salinity and models using irrigation water salinity.
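The S-shaped response function of van Genuchten and Hoffman referred to above can be sketched as follows. The parameter values are illustrative only (EC50 and p here are hypothetical, not fitted to the basil data):

```python
def relative_yield(ec, ec50, p=3.0):
    """van Genuchten-Hoffman S-shaped salinity response:
    Yr = 1 / (1 + (EC / EC50)**p), where EC is salinity (dS/m) and
    EC50 is the salinity at which yield is reduced by half."""
    return 1.0 / (1.0 + (ec / ec50) ** p)

# Hypothetical EC50 = 5 dS/m: yield is nearly unaffected at low salinity,
# halved at EC50, and strongly depressed well above it.
curve = [round(relative_yield(ec, ec50=5.0), 3) for ec in (1.0, 5.0, 10.0)]
```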
Toward Understanding the Role of Technological Tools in Statistical Learning.
ERIC Educational Resources Information Center
Ben-Zvi, Dani
2000-01-01
Begins with some context setting on new views of statistics and statistical education reflected in the introduction of exploratory data analysis (EDA) into the statistics curriculum. Introduces a detailed example of an EDA learning activity in the middle school that makes use of the power of the spreadsheet to mediate students' construction of…
Carnegie Mellon Course Dissects Statistics about Sexual Orientation
ERIC Educational Resources Information Center
Keller, Josh
2007-01-01
Most statistics courses emphasize the power of statistics. Michele DiPietro's course focuses on the failures. Gay and lesbian studies are certainly fertile ground for bad guesses and unreliable statistics. The most famous number, that 10 percent of the population is gay, was taken from a biased Kinsey sample of white men ages 16 to 55 in 1948, and…
Ray, Amrita; Weeks, Daniel E
2008-05-01
Linkage analysis programs invariably assume that the stated familial relationships are correct. Thus, it is common practice to resolve relationship errors by either discarding individuals with erroneous relationships or using an inferred alternative pedigree structure. These approaches are less than ideal because discarding data is wasteful and using inferred data can be statistically unsound. We have developed two linkage statistics that model relationship uncertainty by weighting over the possible true relationships. Simulations of data containing relationship errors were used to assess our statistics and compare them to the maximum-likelihood statistic (MLS) and the Sall non-parametric LOD score using true and discarded (where problematic individuals with erroneous relationships are discarded from the pedigree) structures. We simulated both small pedigree (SP) and large pedigree (LP) data sets typed genome-wide. Both data sets have several underlying true relationships; SP has one apparent relationship--full sibling--and LP has several different apparent relationship types. The results show that for both SP and LP, our relationship uncertainty linkage statistics (RULS) have power nearly as high as the MLS and Sall using the true structure. Also, the RULS have greater power to detect linkage than the MLS and Sall using the discarded structure. For example, for the SP data set and a dominant disease model, both RULS had power of about 93%, while Sall and MLS had 90% and 83% power on the discarded structure. Thus, our RULS provide a statistically sound and powerful approach to the commonly encountered problem of relationship errors.
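The weighting idea described above — averaging linkage evidence over the possible true relationships rather than committing to a single pedigree — can be sketched as follows. The relationship labels, LOD values, and prior weights are hypothetical, and this is not the RULS implementation:

```python
import math

def relationship_weighted_lod(lod_by_rel, weights):
    """Average the likelihood ratios over candidate true relationships.
    LODs are log10 likelihood ratios, so we average the ratios themselves
    (not the LODs) and then return to the log scale."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    ratio = sum(w * 10.0 ** lod_by_rel[rel] for rel, w in weights.items())
    return math.log10(ratio)

# Hypothetical pair stated to be full siblings but possibly half siblings:
lods = {"full-sib": 1.2, "half-sib": 0.4}
priors = {"full-sib": 0.9, "half-sib": 0.1}
score = relationship_weighted_lod(lods, priors)  # lies between 0.4 and 1.2
```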
Optical rogue wave statistics in laser filamentation.
Kasparian, Jérôme; Béjot, Pierre; Wolf, Jean-Pierre; Dudley, John M
2009-07-06
We experimentally observed optical rogue wave statistics during high power femtosecond pulse filamentation in air. We characterized wavelength-dependent intensity fluctuations across 300 nm broadband filament spectra generated by pulses with several times the critical power for filamentation. We show how the statistics vary from a near-Gaussian distribution in the vicinity of the pump to a long tailed "L-shaped" distribution at the short wavelength and long wavelength edges. The results are interpreted in terms of pump noise transfer via self-phase modulation.
2006-02-01
The Marine Air-Ground Team: Still Not Adequately Training for the Urban Fight (Subject Area: Training, EWS 2006). Submitted by Captain RC Rybka to Majors GC Schreffler and RR...
Statistics of Data Fitting: Flaws and Fixes of Polynomial Analysis of Channeled Spectra
NASA Astrophysics Data System (ADS)
Karstens, William; Smith, David
2013-03-01
Starting from general statistical principles, we have critically examined Baumeister's procedure* for determining the refractive index of thin films from channeled spectra. Briefly, the method assumes that the index and interference fringe order may be approximated by polynomials quadratic and cubic in photon energy, respectively. The coefficients of the polynomials are related by differentiation, which is equivalent to comparing energy differences between fringes. However, we find that when the fringe order is calculated from the published IR index for silicon* and then analyzed with Baumeister's procedure, the results do not reproduce the original index. This problem has been traced to (1) use of unphysical powers in the polynomials (e.g., time-reversal invariance requires that the index be an even function of photon energy), and (2) use of insufficient terms of the correct parity. Exclusion of unphysical terms and addition of quartic and quintic terms to the index and order polynomials yields significantly better fits with fewer parameters. This represents a specific example of using statistics to determine whether the assumed fitting model adequately captures the physics contained in experimental data. The use of analysis of variance (ANOVA) and the Durbin-Watson statistic to test criteria for the validity of least-squares fitting will be discussed. *D.F. Edwards and E. Ochoa, Appl. Opt. 19, 4130 (1980). Supported in part by the US Department of Energy, Office of Nuclear Physics under contract DE-AC02-06CH11357.
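The Durbin-Watson statistic mentioned above flags serially correlated residuals, the signature of a fitting model that misses structure in the data (for instance, a polynomial of the wrong parity). A minimal sketch, not tied to the channeled-spectra data:

```python
import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic: ~2 for uncorrelated residuals,
    near 0 for strongly positively correlated (wavy) residuals."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Fitting an even curve with an odd-parity (linear) polynomial leaves
# smooth, serially correlated residuals; white noise does not.
x = np.linspace(-1.0, 1.0, 201)
y = x ** 2
wavy = y - np.polyval(np.polyfit(x, y, 1), x)      # wrong-parity fit
dw_wavy = durbin_watson(wavy)                      # near 0: model misses structure

rng = np.random.default_rng(0)
dw_white = durbin_watson(rng.standard_normal(500)) # near 2: no serial correlation
```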
The placental pursuit for an adequate oxidant balance between the mother and the fetus
Herrera, Emilio A.; Krause, Bernardo; Ebensperger, German; Reyes, Roberto V.; Casanello, Paola; Parra-Cordero, Mauro; Llanos, Anibal J.
2014-01-01
The placenta is the exchange organ that regulates metabolic processes between the mother and her developing fetus. The adequate function of this organ is clearly vital for a physiologic gestational process and a healthy baby as the final outcome. The umbilico-placental vasculature has the capacity to respond to variations in the materno-fetal milieu. Depending on the intensity and extent of the insult, these responses may be immediate, mediate, or long-lasting, resulting in potential morphostructural and functional changes later in life. These adjustments usually compensate for the initial insults, but occasionally may switch to long-lasting remodeling and dysfunctional processes, giving rise to maladaptation. One of the most challenging conditions in modern perinatology is hypoxia and oxidative stress during development, both disorders occurring in high-altitude and in low-altitude placental insufficiency. Hypoxia and oxidative stress may induce endothelial dysfunction and thus reduce perfusion of the placenta and restrict fetal growth and development. This review focuses on placental responses to hypoxic conditions, usually related to high altitude and placental insufficiency, leading to oxidative stress and vascular disorders that alter fetal and maternal health. Although day-to-day clinical practice and basic and clinical research clearly provide evidence of the severe impact of oxygen deficiency and oxidative stress during pregnancy, further research on umbilical and placental vascular function under these conditions is badly needed to clarify the myriad of questions still unsettled. PMID:25009498
Prevention of mother to child transmission lay counsellors: Are they adequately trained?
Thurling, Catherine H; Harris, Candice
2012-06-05
South Africa's high prevalence of human immunodeficiency virus (HIV) infected women requires a comprehensive health care approach to pregnancy because of the added risk of their HIV status. As a result of the shortage of health care workers in South Africa, lay counsellors play important roles in the prevention of mother to child transmission of HIV (PMTCT). There is no standardization of the training of lay counsellors in South Africa, and training varies in length depending on the training organisation. The study aimed to investigate the training of lay counsellors by analysing their training curricula and interviewing lay counsellors about their perceptions of their training. A two-phase research method was applied. Phase one documented an analysis of the training curricula; phase two consisted of semi-structured interviews with the participants. Purposive sampling was undertaken for this study. The total sample size was 13 people, with a final sample of 9 participants, determined at the point of data saturation. The research was qualitative, descriptive and contextual in design. The curricula analysed had different styles of delivery, and the approaches to learning and courses varied, resulting in inconsistent training outcomes. A need for supervision and mentorship in the working environment was also noted. The training of lay counsellors needs to be adapted to meet the extended roles that they are playing in PMTCT. The standardization of training programmes, and the incorporation of a system of mentorship in the work environment, would ensure that lay counsellors are adequately prepared for their role in PMTCT.
Improved ASTM G72 Test Method for Ensuring Adequate Fuel-to-Oxidizer Ratios
NASA Technical Reports Server (NTRS)
Juarez, Alfredo; Harper, Susana Tapia
2016-01-01
The ASTM G72/G72M-15 Standard Test Method for Autogenous Ignition Temperature of Liquids and Solids in a High-Pressure Oxygen-Enriched Environment is currently used to evaluate materials for the ignition susceptibility driven by exposure to external heat in an enriched oxygen environment. Testing performed on highly volatile liquids such as cleaning solvents has proven problematic due to inconsistent test results (non-ignitions). Non-ignition results can be misinterpreted as favorable oxygen compatibility, although they are more likely associated with inadequate fuel-to-oxidizer ratios. Forced evaporation during purging and inadequate sample size were identified as two potential causes for inadequate available sample material during testing. In an effort to maintain adequate fuel-to-oxidizer ratios within the reaction vessel during test, several parameters were considered, including sample size, pretest sample chilling, pretest purging, and test pressure. Tests on a variety of solvents exhibiting a range of volatilities are presented in this paper. A proposed improvement to the standard test protocol as a result of this evaluation is also presented. Execution of the final proposed improved test protocol outlines an incremental step method of determining optimal conditions using increased sample sizes while considering test system safety limits. The proposed improved test method increases confidence in results obtained by utilizing the ASTM G72 autogenous ignition temperature test method and can aid in the oxygen compatibility assessment of highly volatile liquids and other conditions that may lead to false non-ignition results.
Kuon, Eberhard; Dorn, Christian; Schmitt, Moritz; Dahm, Johannes B
2003-05-01
In this study, the cinegraphic image intensifier entrance dose level for coronary angiography was changed in four steps from dose level A (0.041 microGy frame(-1)), allowing high contrast, but coarse mottled background, to level D (0.164 microGy frame(-1)), affording high transparency and sharpness. Using this new approach throughout the course of 404 consecutive cardiac catheterizations, we reduced patient radiation exposures down to 11 to 16% of currently typical values: i.e., mean dose area products of 5.97 Gy cm2 (n = 91), 6.73 (n = 113), 8.11 (n = 91), and 8.90 (n = 109); cinegraphic dose area products of 2.34, 3.64, 4.56, and 5.49; and cinegraphic dose area products frame(-1) of 13.3, 19.8, 27.0, and 30.2 mGy cm2, for levels A, B, C, and D, respectively. The number of cinegraphic frames ranged within 168 to 182 per case. Our results show that during catheterization interventionalists should vary image intensifier entrance dose levels in accordance with documented structure, angulation, and body mass index. With the exception of cases with special requirements, lower dose levels typically guarantee an adequate image quality.
Saugstad, Letten F
2006-01-01
The present mismatch between what our brain needs and the modern diet neglects our marine heritage. Last century, the priority in nutrition and food production was to achieve a high-protein diet and somatic growth and function. The dietary content of omega-3 (N-3) required by the brain was neglected, although evidence for the essentiality of certain fatty acids was published in 1929 and specifically re-affirmed for omega-3 in the brain in the 1970s. Cognitive decline with age and neurodegenerative disorder with dementia are now rising. This review describes signs of N-3 deficit in Alzheimer and Parkinson disease, where maximum change involves the primary sites: olfactory cortex and the hippocampus. The olfactory agnosia observed in schizophrenia supports an N-3 deficit, as does a reduction of key oligodendrocyte- and myelin-related genes in this disorder and in affective disorder, where a rise in dementia accords with a deficit of N-3 as well. N-3 normalizes cerebral excitability at all levels. That the two disorders are localized at the extremes of excitability is supported by their opposing treatments: convulsant neuroleptics and anti-epileptic antidepressants. An adequate N-3 diet will probably prevent most psychotic episodes and prove that neurodegenerative disorder with dementia is also, to a large extent, not only preventable but avoidable.
Are image quality metrics adequate to evaluate the quality of geometric objects?
NASA Astrophysics Data System (ADS)
Rogowitz, Bernice E.; Rushmeier, Holly E.
2001-06-01
Geometric objects are often represented by many millions of triangles or polygons, which limits the ease with which they can be transmitted and displayed electronically. This has led to the development of many algorithms for simplifying geometric models, and to the recognition that metrics are required to evaluate their success. The goal is to create computer graphic renderings of the object that do not appear to be degraded to a human observer. The perceptual evaluation of simplified objects is a new topic. One approach has been to use image-based metrics to predict the perceived degradation of simplified 3D models. Since 2D images of 3D objects can have significantly different perceived quality, depending on the direction of the illumination, 2D measures of image quality may not adequately capture the perceived quality of 3D objects. To address this question, we conducted experiments in which we explicitly compared the perceived quality of animated 3D objects and their corresponding 2D still image projections. Our results suggest that 2D judgements do not provide a good predictor of 3D image quality, and identify a need to develop 'object quality metrics.'
A high UV environment does not ensure adequate Vitamin D status
NASA Astrophysics Data System (ADS)
Kimlin, M. G.; Lang, C. A.; Brodie, A.; Harrison, S.; Nowak, M.; Moore, M. R.
2006-12-01
Queensland has the highest rates of skin cancer in the world, and due to the high levels of solar UV in this region it is assumed that incidental UV exposure should provide adequate vitamin D status for the population. This research was undertaken to test this assumption among healthy free-living adults in south-east Queensland, Australia (27°S), at the end of winter. This research was approved by the Queensland University of Technology Human Research Ethics Committee and conducted under the guidelines of the Declaration of Helsinki. 10.2% of the sample had serum vitamin D levels below 25 nmol/L (deficiency) and a further 32.3% had levels between 25 nmol/L and 50 nmol/L (insufficiency). Vitamin D deficiency and insufficiency can occur at the end of winter, even in sunny climates. The wintertime UV levels in south-east Queensland (UV index 4-6) are equivalent to summertime UV levels in northern regions of Europe and the USA. These ambient UV levels are sufficient to ensure synthesis of vitamin D requirements. We investigated individual UV exposure (through a self-reported sun exposure questionnaire) and found correlations between exposure and vitamin D status. Further research is needed to explore the interactions between the solar UV environment and vitamin D status, particularly in high UV environments, such as Queensland.
Determination of the need for selenium by chicks fed practical diets adequate in vitamin E
Combs, G.F. Jr.; Su, Q.; Liu, C.H.; Sinisalo, M.; Combs, S.B.
1986-03-01
Experiments were conducted to compare the dietary needs for selenium (Se) by chicks fed either purified (amino acid-based) or practical (corn- and soy-based) diets that were adequate with respect to vitamin E (i.e., contained 100 IU/kg) and all other known nutrients with the single exception of Se (i.e., contained only 0.10 ppm Se). Studies were conducted in Ithaca using Single Comb White Leghorn chicks fed the purified basal diet and in Beijing using chicks of the same breed fed either the same purified basal diet or the practical diet formulated to be similar to that used in poultry production in some parts of China and the US. Results showed that each basal diet produced severe depletion of Se-dependent glutathione peroxidase (SeGSHpx) in plasma, liver and pancreas according to the same time-course, but that other consequences of severe uncomplicated Se deficiency were much more severe among chicks fed the purified diet (e.g., growth depression, pancreatic dysfunction as indicated by elevated plasma amylase and abnormal pancreatic histology). Chicks fed the practical Se-deficient diet showed reduced pancreas levels of copper, zinc and molybdenum and elevated plasma levels of iron; they required ca. 0.10 ppm dietary Se to sustain normal SeGSHpx in several tissues and to prevent elevated amylase in plasma. The dietary Se requirement of the chick is, therefore, estimated to be 0.10 ppm.
Burnier, Michel; Wuerzner, Gregoire; Bochud, Murielle
2015-01-01
Among the various strategies to reduce the incidence of non-communicable diseases, reduction of sodium intake in the general population has been recognized as one of the most cost-effective means because of its potential impact on the development of hypertension and cardiovascular diseases. Yet this strategic health recommendation of the WHO and many other international organizations is far from being universally accepted. Indeed, there are still several unresolved scientific and epidemiological questions that maintain an ongoing debate: what is the adequate low level of sodium intake to recommend to the general population, and should national strategies be oriented to the overall population or only to higher-risk fractions of the population, such as salt-sensitive patients? In this paper, we review recent results of the literature regarding salt, blood pressure and cardiovascular risk, and we present the recommendations recently proposed by a group of experts of Switzerland. The propositions of the participating medical societies are to encourage national health authorities to continue their discussion with the food industry in order to reduce the sodium intake of food products, with a target mean salt intake of 5–6 grams per day in the population. Moreover, all initiatives to increase information on the effect of salt on health and on the salt content of food are supported. PMID:26321959
Bhattarai, M D
2012-09-01
On the one hand there is obviously inadequate health coverage of the rural population; on the other, the densely populated urban areas face the triple burden of increasing non-communicable and communicable health problems and rising health costs. Postgraduate medical training is closely interrelated with adequate health service delivery and health economics. In relation to the prevailing situation, the modern medical education trend points to five vital issues: i) opportunity needs to be given to all MBBS graduates for General Specialist and Sub-Specialist Training inside the country to complete their medical education; ii) urgent need for review of PG residential training criteria, including appropriate bed and teacher criteria as well as entry and eligibility criteria; iii) involvement of all available hospital units fulfilling the requirements of the residential PG training criteria; iv) PG residential training involves doing the required work in the hospitals, entitling trainees to full pay and continuation of service without any training fee or tuition fee; and v) planning of the proportions of General Specialty and Sub-Specialty Training fields, particularly General Practice (GP), including its career path and female participation. With the increased number of medical graduates, it now seems possible to plan for optimal health coverage of the populations with appropriate postgraduate medical training. Medical professionals and public health workers must make the Government aware of this vital responsibility and the holistic approach required.
Barth, Amy E; Denton, Carolyn A; Stuebing, Karla K; Fletcher, Jack M; Cirino, Paul T; Francis, David J; Vaughn, Sharon
2010-05-01
The cerebellar hypothesis of dyslexia posits that cerebellar deficits are associated with reading disabilities and may explain why some individuals with reading disabilities fail to respond to reading interventions. We tested these hypotheses in a sample of children who participated in a grade 1 reading intervention study (n = 174) and a group of typically achieving children (n = 62). At posttest, children were classified as adequately responding to the intervention (n = 82), inadequately responding with decoding and fluency deficits (n = 36), or inadequately responding with only fluency deficits (n = 56). Based on the Bead Threading and Postural Stability subtests from the Dyslexia Screening Test-Junior, we found little evidence that assessments of cerebellar functions were associated with academic performance or responder status. In addition, we did not find evidence supporting the hypothesis that cerebellar deficits are more prominent for poor readers with "specific" reading disabilities (i.e., with discrepancies relative to IQ) than for poor readers with reading scores consistent with IQ. In contrast, measures of phonological awareness, rapid naming, and vocabulary were strongly associated with responder status and academic outcomes. These results add to accumulating evidence that fails to associate cerebellar functions with reading difficulties.
Yamamoto, Chiho; Miyoshi, Hideaki; Ono, Kota; Sugawara, Hajime; Kameda, Reina; Ichiyama, Mei; Yamamoto, Kohei; Nomoto, Hiroshi; Nakamura, Akinobu; Atsumi, Tatsuya
2016-06-30
To investigate whether ipragliflozin, a novel sodium-glucose co-transporter 2 inhibitor, alters body composition, and to identify variables associated with reductions in visceral adipose tissue in Japanese patients with type 2 diabetes mellitus. This prospective observational study enrolled Japanese participants with type 2 diabetes mellitus. Subjects were administered ipragliflozin (50 mg/day) once daily for 16 weeks. Body composition, visceral adipose tissue volume and plasma variables were measured at 0, 8, and 16 weeks. The subjects' lifestyle habits, including diet and exercise, were evaluated at baseline and 16 weeks. The primary endpoint was defined as the decrease of visceral adipose tissue mass. Twenty-four of 26 enrolled participants completed the study. The visceral adipose tissue decreased significantly (110 ± 33 to 101 ± 36 cm(2), p = 0.005), as did other parameters of metabolic dysfunction including hemoglobin A1c. Seventy-one percent of the total body weight reduction (-2.49 kg) was accounted for by a decrease in fat mass (-1.77 kg), and the remaining reduction (22%) by water volume (-0.55 kg). A minor but significant reduction in the skeletal muscle index was also observed. Correlation analyses were performed to identify variables associated with changes in visceral adipose tissue, and the only significant variable identified was diet therapy (Spearman's r = -0.416, p = 0.043). Ipragliflozin significantly decreased visceral adipose tissue and improved parameters of metabolic dysfunction. Adequate diet therapy would be necessary to induce and enhance the therapeutic merit.
Aurally-adequate time-frequency analysis for scattered sound in auditoria
NASA Astrophysics Data System (ADS)
Norris, Molly K.; Xiang, Ning; Kleiner, Mendel
2005-04-01
The goal of this work was to apply an aurally-adequate time-frequency analysis technique to the analysis of sound scattering effects in auditoria. Time-frequency representations were developed as a motivated effort that takes into account binaural hearing, with a specific implementation of the interaural cross-correlation process. A model of the human auditory system was implemented in the MATLAB platform based on two previous models [A. Härmä and K. Palomäki, HUTear, Espoo, Finland; and M. A. Akeroyd, A Binaural Cross-correlogram Toolbox for MATLAB (2001), University of Sussex, Brighton]. These stages include proper frequency selectivity, the conversion of the mechanical motion of the basilar membrane to neural impulses, and binaural hearing effects. The model was then used in the analysis of room impulse responses with varying scattering characteristics. This paper discusses the analysis results using simulated and measured room impulse responses. [Work supported by the Frank H. and Eva B. Buck Foundation.]
Statistical regimes of random laser fluctuations
Lepri, Stefano; Cavalieri, Stefano; Oppo, Gian-Luca; Wiersma, Diederik S.
2007-06-15
Statistical fluctuations of the light emitted from amplifying random media are studied theoretically and numerically. The characteristic scales of the diffusive motion of light lead to Gaussian or power-law (Lévy) distributed fluctuations depending on external control parameters. In the Lévy regime, the output pulse is highly irregular, leading to huge deviations from a mean-field description. Monte Carlo simulations of a simplified model which includes the population of the medium demonstrate the two statistical regimes and provide a comparison with dynamical rate equations. The different statistics of the fluctuations help to explain recent experimental observations reported in the literature.
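The contrast between the two regimes can be sketched numerically: light-tailed (Gaussian) fluctuations stay close to their mean, while power-law (Lévy-like) fluctuations let a single event dominate the whole record. The distributions and parameters below are illustrative stand-ins, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Light-tailed emission: intensities clustered near their mean.
gaussian = np.abs(rng.normal(1.0, 0.1, n))

# Heavy-tailed emission: Pareto tail with exponent 1.5 (Levy-like regime).
levy_like = rng.pareto(1.5, n) + 1.0

# In the heavy-tailed regime a single shot can dwarf the mean, producing
# the "huge deviations from a mean-field description".
gauss_peak = gaussian.max() / gaussian.mean()   # stays close to 1
levy_peak = levy_like.max() / levy_like.mean()  # orders of magnitude larger
```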
Teaching Statistics without Sadistics.
ERIC Educational Resources Information Center
Forte, James A.
1995-01-01
Five steps designed to take anxiety out of statistics for social work students are outlined. First, statistics anxiety is identified as an educational problem. Second, instructional objectives and procedures to achieve them are presented and methods and tools for evaluating the course are explored. Strategies for, and obstacles to, making…
STATSIM: Exercises in Statistics.
ERIC Educational Resources Information Center
Thomas, David B.; And Others
A computer-based learning simulation was developed at Florida State University which allows for high interactive responding via a time-sharing terminal for the purpose of demonstrating descriptive and inferential statistics. The statistical simulation (STATSIM) is comprised of four modules--chi square, t, z, and F distribution--and elucidates the…
Understanding Undergraduate Statistical Anxiety
ERIC Educational Resources Information Center
McKim, Courtney
2014-01-01
The purpose of this study was to understand undergraduate students' views of statistics. Results reveal that students with less anxiety have a higher interest in statistics and also believe in their ability to perform well in the course. Also students who have a more positive attitude about the class tend to have a higher belief in their…
ERIC Educational Resources Information Center
Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain
2004-01-01
Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…
Towards Statistically Undetectable Steganography
2011-06-30
Contract number: FA9550-08-1-0084. Author: Prof. Jessica... Approved for public release; distribution is unlimited. Abstract: Fundamental asymptotic laws for imperfect steganography ...formats. Subject terms: steganography, covert communication, statistical detectability, asymptotic performance, secure payload, minimum...
Explorations in Statistics: Regression
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2011-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive…
ERIC Educational Resources Information Center
Singer, Arlene
This guide outlines a one semester Option Y course, which has seven learner objectives. The course is designed to provide students with an introduction to the concerns and methods of statistics, and to equip them to deal with the many statistical matters of importance to society. Topics covered include graphs and charts, collection and…
ERIC Educational Resources Information Center
Huberty, Carl J.
An approach to statistical testing, which combines Neyman-Pearson hypothesis testing and Fisher significance testing, is recommended. The use of P-values in this approach is discussed in some detail. The author also discusses some problems which are often found in introductory statistics textbooks. The problems involve the definitions of…
Croarkin, M. Carroll
2001-01-01
For more than 50 years, the Statistical Engineering Division (SED) has been instrumental in the success of a broad spectrum of metrology projects at NBS/NIST. This paper highlights fundamental contributions of NBS/NIST statisticians to statistics and to measurement science and technology. Published methods developed by SED staff, especially during the early years, endure as cornerstones of statistics not only in metrology and standards applications, but as data-analytic resources used across all disciplines. The history of statistics at NBS/NIST began with the formation of what is now the SED. Examples from the first five decades of the SED illustrate the critical role of the division in the successful resolution of a few of the highly visible, and sometimes controversial, statistical studies of national importance. A review of the history of major early publications of the division on statistical methods, design of experiments, and error analysis and uncertainty is followed by a survey of several thematic areas. The accompanying examples illustrate the importance of SED in the history of statistics, measurements and standards: calibration and measurement assurance, interlaboratory tests, development of measurement methods, Standard Reference Materials, statistical computing, and dissemination of measurement technology. A brief look forward sketches the expanding opportunity and demand for SED statisticians created by current trends in research and development at NIST. PMID:27500023
Reform in Statistical Education
ERIC Educational Resources Information Center
Huck, Schuyler W.
2007-01-01
Two questions are considered in this article: (a) What should professionals in school psychology do in an effort to stay current with developments in applied statistics? (b) What should they do with their existing knowledge to move from surface understanding of statistics to deep understanding? Written for school psychologists who have completed…
Deconstructing Statistical Analysis
ERIC Educational Resources Information Center
Snell, Joel
2014-01-01
Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or of making a new product or service appear legitimate, needs to be monitored and questioned for accuracy. (1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…
ERIC Educational Resources Information Center
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
The insignificance of statistical significance testing
Johnson, Douglas H.
1999-01-01
Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.
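The alternative Johnson advocates, reporting an effect estimate with a confidence interval rather than a bare P-value, can be sketched as follows; the numbers and the normal-approximation 95% interval are assumptions of this sketch, not taken from the paper:

```python
# Sketch: contrast a P-value with an estimate-plus-confidence-interval
# summary for the difference between two group means. Data are
# illustrative; the interval uses the normal approximation.
import math
from statistics import NormalDist

def mean_diff_summary(x, y):
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    diff = mx - my
    se = math.sqrt(vx / nx + vy / ny)
    z = diff / se
    p = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided P-value
    half = 1.96 * se                        # ~95% normal-approximation interval
    return diff, (diff - half, diff + half), p

x = [5.1, 4.8, 5.6, 5.0, 5.3, 4.9]
y = [4.6, 4.4, 5.0, 4.7, 4.5, 4.8]
diff, ci, p = mean_diff_summary(x, y)
# The interval conveys both the direction and the magnitude of the
# effect; the P-value alone conveys neither.
```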
Code of Federal Regulations, 2013 CFR
2013-10-01
... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of manual and automatic records...
Code of Federal Regulations, 2011 CFR
2011-10-01
... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of manual and automatic records...
Code of Federal Regulations, 2014 CFR
2014-10-01
... adequate technical, physical, and security safeguards to prevent unauthorized disclosure or destruction of manual and automatic records...
Statistical Mechanics of Confined Biological Materials
NASA Astrophysics Data System (ADS)
El Kinani, R.; Benhamou, M.; Kaïdi, H.
2017-03-01
In this work, we propose a model to study the statistical mechanics of a confined bilayer membrane that fluctuates between two interacting flat substrates. From the scaling-laws point of view, bilayer membranes and strings are very similar, so it is sufficient to consider only the problem of a string. We assume that the bilayer membrane (or string) interacts with the substrates via a double Morse potential that reproduces well the characteristics of the real interaction. We show that the statistical mechanics of the string can be adequately described by the Schrödinger-equation approach, which we solve exactly using the Bethe method. Finally, from the exact value of the ground-state energy, we extract the expression of the free-energy density as well as the specific heat.
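The double Morse potential mentioned in the abstract can be sketched as the sum of one Morse well per substrate; the parameterization below (wall separation L, depth D, range a, equilibrium offset r0) is an assumption for illustration, not the authors' exact form:

```python
# Sketch: a double Morse potential for a string confined between two
# walls at z = -L/2 and z = +L/2. Parameter values are illustrative,
# not taken from the paper.
import math

def morse(r, D=1.0, a=1.0, r0=1.0):
    """Single Morse well: minimum value -D at r = r0, tending to 0 as r grows."""
    e = math.exp(-a * (r - r0))
    return D * (e * e - 2.0 * e)

def double_morse(z, L=6.0, D=1.0, a=1.0, r0=1.0):
    """One Morse well per wall, each a function of the distance to that wall."""
    return morse(L / 2 + z, D, a, r0) + morse(L / 2 - z, D, a, r0)
```

By construction the potential is symmetric about the midplane z = 0, as the confinement geometry requires.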
Saurí, Josep; Bermel, Wolfgang; Buevich, Alexei V; Sherer, Edward C; Joyce, Leo A; Sharaf, Maged H M; Schiff, Paul L; Parella, Teodor; Williamson, R Thomas; Martin, Gary E
2015-08-24
Cryptospirolepine is the most structurally complex alkaloid discovered and characterized thus far from any Cryptolepis species. Characterization of several degradants of the original, sealed NMR sample a decade after the initial report called the validity of the originally proposed structure into question. We now report the development of improved, homodecoupled variants of the 1,1- and 1,n-ADEQUATE (HD-ADEQUATE) NMR experiments; utilization of these techniques was critical to resolving long-standing structural questions associated with cryptospirolepine.
Is the Current US Navy Pacific Basing Structure Adequate for the Twenty-First Century?
2006-12-15
fiscal constraints, and without an Asian continental power clearly capable of mounting a credible military and naval threat to America, a reassessment...the devastating effects of Mount Pinatubo's eruption on Clark Air Force Base resulted in the closure of the US naval base in Subic Bay, Philippines...Seventh Fleet ships converged on the Philippines to evacuate US military personnel and families after the eruption of Mount Pinatubo. During Operation Restore Hope
Redshift data and statistical inference
NASA Technical Reports Server (NTRS)
Newman, William I.; Haynes, Martha P.; Terzian, Yervant
1994-01-01
Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than statistical theory advises, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.
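The over-binning problem the authors describe, using far more class intervals than statistical theory advises, is easy to reproduce. A minimal sketch comparing Sturges' rule with an excessive bin count (the uniform sample and counts are illustrative, not from the paper):

```python
# Sketch: over-binning a histogram produces spuriously empty and spiky
# bins. Sturges' rule, k = ceil(log2(n)) + 1, is one common guideline;
# the uniform sample here is illustrative.
import math
import random

random.seed(0)
n = 200
data = [random.random() for _ in range(n)]

def bin_counts(data, k):
    """Counts for k equal-width bins spanning the data range."""
    lo, hi = min(data), max(data)
    width = (hi - lo) / k
    counts = [0] * k
    for v in data:
        i = min(int((v - lo) / width), k - 1)  # clamp the maximum into the last bin
        counts[i] += 1
    return counts

k_sturges = math.ceil(math.log2(n)) + 1  # 9 bins for n = 200
few = bin_counts(data, k_sturges)
many = bin_counts(data, 200)             # far too many bins
# With 200 bins for 200 points, many bins are empty and the histogram
# shows "structure" that is pure sampling noise.
```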