Sample records for sufficient statistical power

  1. Targeted On-Demand Team Performance App Development

    DTIC Science & Technology

    2016-10-01

    from three sites; 6) Preliminary analysis indicates a larger than estimated effect size and the study is sufficiently powered for generalizable outcomes... statistical analyses, and examine any resulting qualitative data for trends or connections to statistical outcomes. On Schedule 21 Predictive... Preliminary analysis indicates a larger than estimated effect size and the study is sufficiently powered for generalizable outcomes. What opportunities for

  2. The Performance of a PN Spread Spectrum Receiver Preceded by an Adaptive Interference Suppression Filter.

    DTIC Science & Technology

    1982-12-01

    Sequence dj Estimate of the Desired Signal DEL Sampling Time Interval DS Direct Sequence c Sufficient Statistic E/T Signal Power Erfc Complementary Error... Namely, a white Gaussian noise (WGN) generator was added. Also, a statistical subroutine was added in order to assess performance improvement at the... reference code and then passed through a correlation detector whose output is the sufficient statistic, e. Using a threshold device and the sufficient

  3. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    PubMed

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz

    2015-03-01

    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power. Copyright © 2014 Elsevier Inc. All rights reserved.
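
    The required-N figures quoted above come from an a priori power analysis. As a rough illustration of that kind of calculation (assuming a two-sample t-test framing; the mean and SD values below are hypothetical, not taken from the study), a 10% group difference can be converted to Cohen's d and solved for the required sample size:

    ```python
    # Minimal a priori power sketch (assumed two-sample t-test framing);
    # the mean and SD values below are illustrative, not taken from the study.
    from statsmodels.stats.power import TTestIndPower

    mean_thickness = 2.5                            # hypothetical mean cortical thickness (mm)
    sd_thickness = 0.25                             # hypothetical between-subject SD (mm)
    group_difference = 0.10 * mean_thickness        # the 10% difference discussed above

    effect_size = group_difference / sd_thickness   # Cohen's d
    n_per_group = TTestIndPower().solve_power(effect_size=effect_size,
                                              power=0.8, alpha=0.05,
                                              alternative='two-sided')
    print(f"Cohen's d = {effect_size:.2f}, required N per group ≈ {n_per_group:.0f}")
    ```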

  4. Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics

    PubMed Central

    Dowding, Irene; Haufe, Stefan

    2018-01-01

    Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
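
    The core idea of the sufficient-summary-statistic approach described above is to weight each subject's summary by its precision rather than treating all subject means equally. The following sketch illustrates that general idea with simulated data; it is a generic inverse-variance-weighting illustration, not a reproduction of the authors' exact estimator.

    ```python
    # Hedged sketch: precision-weighted group-level effect vs. the "naive" t-test
    # on subject means. Illustrates the use of within-subject variances only.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_subjects, trials = 20, 50
    true_effect = 0.3

    subject_means, subject_vars = [], []
    for _ in range(n_subjects):
        noise_sd = rng.uniform(0.5, 3.0)             # heterogeneous within-subject noise
        x = rng.normal(true_effect, noise_sd, trials)
        subject_means.append(x.mean())
        subject_vars.append(x.var(ddof=1) / trials)  # variance of the subject mean

    subject_means = np.array(subject_means)
    subject_vars = np.array(subject_vars)

    # Naive approach: one-sample t-test on subject means, ignoring their precisions
    t_naive, p_naive = stats.ttest_1samp(subject_means, 0.0)

    # Precision-weighted estimate: weight each subject by 1 / var(subject mean)
    w = 1.0 / subject_vars
    effect_hat = np.sum(w * subject_means) / np.sum(w)
    se_hat = np.sqrt(1.0 / np.sum(w))
    p_weighted = 2 * stats.norm.sf(abs(effect_hat / se_hat))   # normal approximation

    print(f"naive p = {p_naive:.4f}, weighted p = {p_weighted:.4f}")
    ```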

  5. Alignment-free sequence comparison (II): theoretical power of comparison statistics.

    PubMed

    Wan, Lin; Reinert, Gesine; Sun, Fengzhu; Waterman, Michael S

    2010-11-01

    Rapid methods for alignment-free sequence comparison make large-scale comparisons between sequences increasingly feasible. Here we study the power of the statistic D2, which counts the number of matching k-tuples between two sequences, as well as D2*, which uses centralized counts, and D2S, which is a self-standardized version, both from a theoretical viewpoint and numerically, providing an easy-to-use program. The power is assessed under two alternative hidden Markov models; the first one assumes that the two sequences share a common motif, whereas the second model is a pattern transfer model; the null model is that the two sequences are composed of independent and identically distributed letters and they are independent. Under the first alternative model, the means of the tuple counts in the individual sequences change, whereas under the second alternative model, the marginal means are the same as under the null model. Using the limit distributions of the count statistics under the null and the alternative models, we find that generally, asymptotically D2S has the largest power, followed by D2*, whereas the power of D2 can even be zero in some cases. In contrast, even for sequences of length 140,000 bp, in simulations D2* generally has the largest power. Under the first alternative model of a shared motif, the power of D2* approaches 100% when sufficiently many motifs are shared, and we recommend the use of D2* for such practical applications. Under the second alternative model of pattern transfer, the power for all three count statistics does not increase with sequence length when the sequence is sufficiently long, and hence none of the three statistics under consideration can be recommended in such a situation. We illustrate the approach on 323 transcription factor binding motifs with length at most 10 from JASPAR CORE (October 12, 2009 version), verifying that D2* is generally more powerful than D2. The program to calculate the power of D2, D2* and D2S can be downloaded from http://meta.cmb.usc.edu/d2. Supplementary Material is available at www.liebertonline.com/cmb.
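
    For concreteness, D2 is simply the number of matching k-tuples, i.e. the dot product of the two sequences' k-mer count vectors. A minimal sketch (the centralized D2* and self-standardized D2S variants additionally subtract expected counts and rescale, and are not shown):

    ```python
    # Minimal sketch of the D2 statistic: the number of matching k-tuples between
    # two sequences, computed as a dot product of k-mer count vectors.
    from collections import Counter

    def kmer_counts(seq, k):
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    def d2(seq_a, seq_b, k):
        counts_a = kmer_counts(seq_a, k)
        counts_b = kmer_counts(seq_b, k)
        return sum(counts_a[w] * counts_b[w] for w in counts_a if w in counts_b)

    print(d2("ACGTACGTAC", "TTACGTTTAC", k=3))
    ```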

  6. Seven ways to increase power without increasing N.

    PubMed

    Hansen, W B; Collins, L M

    1994-01-01

    Many readers of this monograph may wonder why a chapter on statistical power was included. After all, by now the issue of statistical power is in many respects mundane. Everyone knows that statistical power is a central research consideration, and certainly most National Institute on Drug Abuse grantees or prospective grantees understand the importance of including a power analysis in research proposals. However, there is ample evidence that, in practice, prevention researchers are not paying sufficient attention to statistical power. If they were, the findings observed by Hansen (1992) in a recent review of the prevention literature would not have emerged. Hansen (1992) examined statistical power based on 46 cohorts followed longitudinally, using nonparametric assumptions given the subjects' age at posttest and the numbers of subjects. Results of this analysis indicated that, in order for a study to attain 80-percent power for detecting differences between treatment and control groups, the difference between groups at posttest would need to be at least 8 percent (in the best studies) and as much as 16 percent (in the weakest studies). In order for a study to attain 80-percent power for detecting group differences in pre-post change, 22 of the 46 cohorts would have needed relative pre-post reductions of greater than 100 percent. Thirty-three of the 46 cohorts had less than 50-percent power to detect a 50-percent relative reduction in substance use. These results are consistent with other review findings (e.g., Lipsey 1990) that have shown a similar lack of power in a broad range of research topics. Thus, it seems that, although researchers are aware of the importance of statistical power (particularly of the necessity for calculating it when proposing research), they somehow are failing to end up with adequate power in their completed studies. This chapter argues that the failure of many prevention studies to maintain adequate statistical power is due to an overemphasis on sample size (N) as the only, or even the best, way to increase statistical power. It is easy to see how this overemphasis has come about. Sample size is easy to manipulate, has the advantage of being related to power in a straight-forward way, and usually is under the direct control of the researcher, except for limitations imposed by finances or subject availability. Another option for increasing power is to increase the alpha used for hypothesis-testing but, as very few researchers seriously consider significance levels much larger than the traditional .05, this strategy seldom is used. Of course, sample size is important, and the authors of this chapter are not recommending that researchers cease choosing sample sizes carefully. Rather, they argue that researchers should not confine themselves to increasing N to enhance power. It is important to take additional measures to maintain and improve power over and above making sure the initial sample size is sufficient. The authors recommend two general strategies. One strategy involves attempting to maintain the effective initial sample size so that power is not lost needlessly. The other strategy is to take measures to maximize the third factor that determines statistical power: effect size.
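
    The argument that N is not the only lever can be made concrete with a quick calculation: for a fixed sample size, increasing the effect size (for example through more reliable measurement or stronger implementation) can buy as much power as a substantial increase in N. The numbers below are purely illustrative and assume a two-sample t-test:

    ```python
    # Illustrative comparison: doubling N vs. increasing the effect size at fixed N.
    from statsmodels.stats.power import TTestIndPower

    power = TTestIndPower()
    base = power.power(effect_size=0.3, nobs1=50, alpha=0.05)
    more_n = power.power(effect_size=0.3, nobs1=100, alpha=0.05)
    bigger_effect = power.power(effect_size=0.45, nobs1=50, alpha=0.05)

    print(f"d=0.30, n=50:  power = {base:.2f}")
    print(f"d=0.30, n=100: power = {more_n:.2f}")
    print(f"d=0.45, n=50:  power = {bigger_effect:.2f}")
    ```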

  7. Publication Bias in "Red, Rank, and Romance in Women Viewing Men," by Elliot et al. (2010)

    ERIC Educational Resources Information Center

    Francis, Gregory

    2013-01-01

    Elliot et al. (2010) reported multiple experimental findings that the color red modified women's ratings of attractiveness, sexual desirability, and status of a photographed man. An analysis of the reported statistics of these studies indicates that the experiments lack sufficient power to support these claims. Given the power of the experiments,…

  8. Wind speed statistics for Goldstone, California, anemometer sites

    NASA Technical Reports Server (NTRS)

    Berg, M.; Levy, R.; Mcginness, H.; Strain, D.

    1981-01-01

    An exploratory wind survey at an antenna complex was summarized statistically for application to future windmill designs. Data were collected at six locations from a total of 10 anemometers. Statistics include means, standard deviations, cubes, pattern factors, correlation coefficients, and exponents for power law profile of wind speed. Curves presented include: mean monthly wind speeds, moving averages, and diurnal variation patterns. It is concluded that three of the locations have sufficiently strong winds to justify consideration for windmill sites.

  9. Non-gaussian statistics of pencil beam surveys

    NASA Technical Reports Server (NTRS)

    Amendola, Luca

    1994-01-01

    We study the effect of the non-Gaussian clustering of galaxies on the statistics of pencil beam surveys. We derive the probability from the power spectrum peaks by means of Edgeworth expansion and find that the higher order moments of the galaxy distribution play a dominant role. The probability of obtaining the 128 Mpc/h periodicity found in pencil beam surveys is raised by more than one order of magnitude, up to 1%. Further data are needed to decide if non-Gaussian distribution alone is sufficient to explain the 128 Mpc/h periodicity, or if extra large-scale power is necessary.

  10. Using public control genotype data to increase power and decrease cost of case-control genetic association studies.

    PubMed

    Ho, Lindsey A; Lange, Ethan M

    2010-12-01

    Genome-wide association (GWA) studies are a powerful approach for identifying novel genetic risk factors associated with human disease. A GWA study typically requires the inclusion of thousands of samples to have sufficient statistical power to detect single nucleotide polymorphisms that are associated with only modest increases in risk of disease given the heavy burden of a multiple test correction that is necessary to maintain valid statistical tests. Low statistical power and the high financial cost of performing a GWA study remains prohibitive for many scientific investigators anxious to perform such a study using their own samples. A number of remedies have been suggested to increase statistical power and decrease cost, including the utilization of free publicly available genotype data and multi-stage genotyping designs. Herein, we compare the statistical power and relative costs of alternative association study designs that use cases and screened controls to study designs that are based only on, or additionally include, free public control genotype data. We describe a novel replication-based two-stage study design, which uses free public control genotype data in the first stage and follow-up genotype data on case-matched controls in the second stage that preserves many of the advantages inherent when using only an epidemiologically matched set of controls. Specifically, we show that our proposed two-stage design can substantially increase statistical power and decrease cost of performing a GWA study while controlling the type-I error rate that can be inflated when using public controls due to differences in ancestry and batch genotype effects.
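
    A rough sense of why thousands of samples are needed comes from a single-SNP power calculation at a genome-wide significance threshold. The sketch below uses illustrative allele frequencies and sample sizes, not figures from the paper:

    ```python
    # Hedged sketch: power of a single-SNP allele-frequency comparison at a
    # genome-wide significance threshold. All numbers are illustrative.
    from statsmodels.stats.proportion import proportion_effectsize
    from statsmodels.stats.power import NormalIndPower

    p_case, p_control = 0.25, 0.20   # risk-allele frequency in cases vs. controls
    alpha_gw = 5e-8                  # conventional genome-wide threshold

    es = proportion_effectsize(p_case, p_control)
    solver = NormalIndPower()
    for n in (2000, 5000, 10000):
        pw = solver.power(effect_size=es, nobs1=n, alpha=alpha_gw, ratio=1.0)
        print(f"{n} cases / {n} controls: power ≈ {pw:.2f}")
    ```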

  11. Is Neurofeedback an Efficacious Treatment for ADHD? A Randomised Controlled Clinical Trial

    ERIC Educational Resources Information Center

    Gevensleben, Holger; Holl, Birgit; Albrecht, Bjorn; Vogel, Claudia; Schlamp, Dieter; Kratz, Oliver; Studer, Petra; Rothenberger, Aribert; Moll, Gunther H.; Heinrich, Hartmut

    2009-01-01

    Background: For children with attention deficit/hyperactivity disorder (ADHD), a reduction of inattention, impulsivity and hyperactivity by neurofeedback (NF) has been reported in several studies. But so far, unspecific training effects have not been adequately controlled for and/or studies do not provide sufficient statistical power. To overcome…

  12. Prefrontal left--dominant hemisphere--gamma and delta oscillators in general anaesthesia with volatile anaesthetics during open thoracic surgery.

    PubMed

    Saniova, Beata; Drobny, Michal; Drobna, Eva; Hamzik, Julian; Bakosova, Erika; Fischer, Martin

    2016-01-01

    The main objective was to identify sufficient general anaesthesia (GA) inhibition to prevent negative experiences during GA. We investigated a group of patients (n = 17, mean age 63.59 years; 9 male, 65.78 years; 8 female, 61.13 years) during GA in open thorax surgery and analysed the EEG signal by power spectra (pEEG) of delta (DR) and gamma rhythms (GR). EEG was performed at OP0, the day before surgery, and in surgery phases OP1-OP5 during GA. The particular GA phases were: OP1 = after pre-medication, OP2 = surgery onset, OP3 = surgery with one-side lung ventilation, OP4 = end of surgery, both sides ventilation, OP5 = end of GA. pEEG was recorded in the left frontal region (Fp1-A1 montage) in 17 right-handed persons. Mean DR power in phase OP2 is significantly higher than in phase OP5, and mean DR power in OP3 is higher than in OP5. One-lung ventilation did not change the minimal alveolar concentration, so the gases should not accelerate the decrease in mean DR power. The higher mean GR power in OP0 than in OP3 was statistically significant. Mean GR power in OP3 is statistically significantly lower than in OP4, with the same gas concentrations in OP3 and OP4. Our results showed that DR power decreased from OP2 until the end of GA, meaning that the inhibition represented by the steadily decreasing DR power is sufficient for GA depth. GR power decay near working memory could reduce conscious cognition and unpleasant explicit experience in GA.

  13. Properties of different selection signature statistics and a new strategy for combining them.

    PubMed

    Ma, Y; Ding, X; Qanbari, S; Weigend, S; Zhang, Q; Simianer, H

    2015-11-01

    Identifying signatures of recent or ongoing selection is of high relevance in livestock population genomics. From a statistical perspective, determining a proper testing procedure and combining various test statistics is challenging. On the basis of extensive simulations in this study, we discuss the statistical properties of eight different established selection signature statistics. In the considered scenario, we show that a reasonable power to detect selection signatures is achieved with high marker density (>1 SNP/kb) as obtained from sequencing, while rather small sample sizes (~15 diploid individuals) appear to be sufficient. Most selection signature statistics such as composite likelihood ratio and cross population extended haplotype homozygosity have the highest power when fixation of the selected allele is reached, while integrated haplotype score has the highest power when selection is ongoing. We suggest a novel strategy, called de-correlated composite of multiple signals (DCMS), to combine different statistics for detecting selection signatures while accounting for the correlation between the different selection signature statistics. When examined with simulated data, DCMS consistently has a higher power than most of the single statistics and shows a reliable positional resolution. We illustrate the new statistic on the established selective sweep around the lactase gene in human HapMap data, providing further evidence of the reliability of this new statistic. Then, we apply it to scan for selection signatures in two chicken samples with diverse skin color. Our analysis suggests that a set of well-known genes such as BCO2, MC1R, ASIP and TYR were involved in the divergent selection for this trait.
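
    The following sketch illustrates the general idea of a de-correlated composite: each statistic's evidence is down-weighted according to how strongly it correlates with the other statistics before the signals are summed. It is an assumption-laden illustration in the spirit of DCMS, not the published definition:

    ```python
    # Illustrative combination of correlated selection-signature statistics; the
    # weighting scheme below is an assumption, not the exact DCMS formula.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n_loci, n_stats = 1000, 4

    # Simulated, mutually correlated test statistics at each locus
    cov = 0.5 * np.ones((n_stats, n_stats)) + 0.5 * np.eye(n_stats)
    z = rng.multivariate_normal(np.zeros(n_stats), cov, size=n_loci)
    pvals = 2 * stats.norm.sf(np.abs(z))

    # Down-weight each statistic by its total absolute correlation with the others
    r = np.corrcoef(z, rowvar=False)
    weights = 1.0 / (1.0 + (np.abs(r).sum(axis=1) - 1.0))

    composite = (-np.log10(pvals) * weights).sum(axis=1)
    print("top candidate loci:", np.argsort(composite)[-5:])
    ```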

  14. Using the bootstrap to establish statistical significance for relative validity comparisons among patient-reported outcome measures

    PubMed Central

    2013-01-01

    Background Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Methods Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. Results The statistical significance of the RV increased as the magnitude of denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Larger sample size with a fixed denominator F-statistic or more bootstrap replicates (beyond 500) had minimal impact. Conclusions The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r < 0.7). Substantially greater power can be achieved when comparing measures of a very high correlation (r > 0.9). PMID:23721463
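
    A bootstrap confidence interval for the RV (ratio of ANOVA F-statistics) can be sketched as follows; the data here are simulated stand-ins for the CKD measures, and the percentile interval is one common choice:

    ```python
    # Sketch of a bootstrap 95% CI for relative validity (ratio of ANOVA F-statistics),
    # using simulated data in place of the PRO measures described above.
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(0)
    groups = np.repeat([0, 1, 2], 150)                        # three clinical groups
    reference = groups * 0.6 + rng.normal(0, 1, groups.size)  # most discriminating measure
    comparator = groups * 0.4 + rng.normal(0, 1, groups.size)

    def rv(ref, comp, g):
        f_ref = f_oneway(*(ref[g == k] for k in np.unique(g))).statistic
        f_comp = f_oneway(*(comp[g == k] for k in np.unique(g))).statistic
        return f_comp / f_ref

    boot = []
    for _ in range(1000):
        idx = rng.integers(0, groups.size, groups.size)       # resample patients
        boot.append(rv(reference[idx], comparator[idx], groups[idx]))

    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"RV = {rv(reference, comparator, groups):.2f}, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")
    ```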

  15. The bias of the log power spectrum for discrete surveys

    NASA Astrophysics Data System (ADS)

    Repp, Andrew; Szapudi, István

    2018-03-01

    A primary goal of galaxy surveys is to tighten constraints on cosmological parameters, and the power spectrum P(k) is the standard means of doing so. However, at translinear scales P(k) is blind to much of these surveys' information - information which the log density power spectrum recovers. For discrete fields (such as the galaxy density), A* denotes the statistic analogous to the log density: A* is a `sufficient statistic' in that its power spectrum (and mean) capture virtually all of a discrete survey's information. However, the power spectrum of A* is biased with respect to the corresponding log spectrum for continuous fields, and to use P_{A^*}(k) to constrain the values of cosmological parameters, we require some means of predicting this bias. Here, we present a prescription for doing so; for Euclid-like surveys (with cubical cells 16h-1 Mpc across) our bias prescription's error is less than 3 per cent. This prediction will facilitate optimal utilization of the information in future galaxy surveys.

  16. The wave-tide-river delta classification revisited: Introducing the effects of Humans on delta equilibriu

    NASA Astrophysics Data System (ADS)

    Besset, M.; Anthony, E.; Sabatier, F.

    2016-12-01

    The influence of physical processes on river deltas has long been identified, mainly on the basis of delta morphology. A cuspate delta is considered as wave-dominated, a delta with finger-like extensions is characterized as river-dominated, and a delta with estuarine re-entrants is considered tide-dominated (Galloway, 1975). The need for a more quantitative classification is increasingly recognized, and is achievable through quantified combinations, a good example being Syvitski and Saito (2007) wherein the joint influence of marine power - wave and tides - is compared to that of river influence. This need is further justified as deltas become more and more vulnerable. Going forward from the Syvitski and Saito (2007) approach, we confront, from a large database on 60 river deltas, the maximum potential power of waves and the tidal range (both representing marine power), and the specific stream power and river sediment supply reflecting an increasingly human-impacted river influence. The results show that 45 deltas (75%) have levels of marine power that are significantly higher than those of specific stream power. Five deltas have sufficient stream power to counterbalance marine power but a present sediment supply inadequate for them to be statistically considered as river-dominated. Six others have a sufficient sediment supply but a specific stream power that is not high enough for them to be statistically river-dominated. A major manifestation of the interplay of these parameters is accelerated delta erosion worldwide, shifting the balance towards marine power domination. Deltas currently eroding are mainly influenced by marine power (93%), and small deltas (< 300 km2 of deltaic protuberance) are the most vulnerable (82%). These high levels of erosion domination, compounded by accelerated subsidence, are related to human-induced sediment supply depletion and changes in water discharge in the face of the sediment-dispersive capacity of waves and currents.

  17. Simultaneous Use of Multiple Answer Copying Indexes to Improve Detection Rates

    ERIC Educational Resources Information Center

    Wollack, James A.

    2006-01-01

    Many of the currently available statistical indexes to detect answer copying lack sufficient power at small [alpha] levels or when the amount of copying is relatively small. Furthermore, there is no one index that is uniformly best. Depending on the type or amount of copying, certain indexes are better than others. The purpose of this article was…

  18. Statistical Learning in a Natural Language by 8-Month-Old Infants

    PubMed Central

    Pelucchi, Bruna; Hay, Jessica F.; Saffran, Jenny R.

    2013-01-01

    Numerous studies over the past decade support the claim that infants are equipped with powerful statistical language learning mechanisms. The primary evidence for statistical language learning in word segmentation comes from studies using artificial languages, continuous streams of synthesized syllables that are highly simplified relative to real speech. To what extent can these conclusions be scaled up to natural language learning? In the current experiments, English-learning 8-month-old infants’ ability to track transitional probabilities in fluent infant-directed Italian speech was tested (N = 72). The results suggest that infants are sensitive to transitional probability cues in unfamiliar natural language stimuli, and support the claim that statistical learning is sufficiently robust to support aspects of real-world language acquisition. PMID:19489896

  19. Statistical learning in a natural language by 8-month-old infants.

    PubMed

    Pelucchi, Bruna; Hay, Jessica F; Saffran, Jenny R

    2009-01-01

    Numerous studies over the past decade support the claim that infants are equipped with powerful statistical language learning mechanisms. The primary evidence for statistical language learning in word segmentation comes from studies using artificial languages, continuous streams of synthesized syllables that are highly simplified relative to real speech. To what extent can these conclusions be scaled up to natural language learning? In the current experiments, English-learning 8-month-old infants' ability to track transitional probabilities in fluent infant-directed Italian speech was tested (N = 72). The results suggest that infants are sensitive to transitional probability cues in unfamiliar natural language stimuli, and support the claim that statistical learning is sufficiently robust to support aspects of real-world language acquisition.

  20. Narrative Review of Statistical Reporting Checklists, Mandatory Statistical Editing, and Rectifying Common Problems in the Reporting of Scientific Articles.

    PubMed

    Dexter, Franklin; Shafer, Steven L

    2017-03-01

    Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. These studies show that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally poorly assess statistical quality. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.

  1. Discovering human germ cell mutagens with whole genome sequencing: Insights from power calculations reveal the importance of controlling for between-family variability.

    PubMed

    Webster, R J; Williams, A; Marchetti, F; Yauk, C L

    2018-07-01

    Mutations in germ cells pose potential genetic risks to offspring. However, de novo mutations are rare events that are spread across the genome and are difficult to detect. Thus, studies in this area have generally been under-powered, and no human germ cell mutagen has been identified. Whole Genome Sequencing (WGS) of human pedigrees has been proposed as an approach to overcome these technical and statistical challenges. WGS enables analysis of a much wider breadth of the genome than traditional approaches. Here, we performed power analyses to determine the feasibility of using WGS in human families to identify germ cell mutagens. Different statistical models were compared in the power analyses (ANOVA and multiple regression for one-child families, and mixed effect models sampling between two and four siblings per family). Assumptions were made based on parameters from the existing literature, such as the mutation-by-paternal age effect. We explored two scenarios: a constant effect due to an exposure that occurred in the past, and an accumulating effect where the exposure is continuing. Our analysis revealed the importance of modeling inter-family variability of the mutation-by-paternal age effect. Statistical power was improved by models accounting for the family-to-family variability. Our power analyses suggest that sufficient statistical power can be attained with 4-28 four-sibling families per treatment group, when the increase in mutations ranges from 40% to 10%, respectively. Modeling family variability using mixed effect models provided a reduction in sample size compared to a multiple regression approach. Much larger sample sizes were required to detect an interaction effect between environmental exposures and paternal age. These findings inform study design and statistical modeling approaches to improve power and reduce sequencing costs for future studies in this area. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.
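
    A simulation-based power analysis of this general kind can be sketched compactly. The example below simulates de novo mutation counts with a family-specific paternal-age slope and an exposure effect, and uses family-clustered standard errors as a simple stand-in for a full mixed-effect model; all parameter values are illustrative, not those of the paper:

    ```python
    # Simulation-based power sketch: de novo mutation counts with a family-specific
    # paternal-age slope and an exposure effect. All parameter values are illustrative.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(42)

    def estimate_power(n_families=20, effect=1.2, n_sims=200, alpha=0.05):
        hits = 0
        for _ in range(n_sims):
            rows = []
            for fam in range(n_families):
                exposed = fam % 2                    # half the families exposed
                slope = rng.normal(1.5, 0.4)         # family-specific paternal-age slope
                for _ in range(4):                   # four siblings per family
                    age = rng.uniform(20, 45)
                    rate = 10 + slope * (age - 30) / 10
                    rate *= effect if exposed else 1.0
                    rows.append((exposed, age, rng.poisson(max(rate, 0.1))))
            x = np.array([(1, e, a) for e, a, _ in rows], dtype=float)  # intercept, exposure, age
            y = np.array([c for *_, c in rows], dtype=float)
            groups = np.repeat(np.arange(n_families), 4)
            fit = sm.OLS(y, x).fit(cov_type="cluster", cov_kwds={"groups": groups})
            hits += fit.pvalues[1] < alpha           # test of the exposure coefficient
        return hits / n_sims

    print("estimated power:", estimate_power())
    ```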

  2. Maximum likelihood estimation of signal-to-noise ratio and combiner weight

    NASA Technical Reports Server (NTRS)

    Kalson, S.; Dolinar, S. J.

    1986-01-01

    An algorithm for estimating signal to noise ratio and combiner weight parameters for a discrete time series is presented. The algorithm is based upon the joint maximum likelihood estimate of the signal and noise power. The discrete-time series are the sufficient statistics obtained after matched filtering of a biphase modulated signal in additive white Gaussian noise, before maximum likelihood decoding is performed.

  3. Tonopen XL assessment of intraocular pressure through silicone hydrogel contact lenses.

    PubMed

    Schornack, Muriel; Rice, Melissa; Hodge, David

    2012-09-01

    To assess the accuracy of Tonopen XL measurement of intraocular pressure (IOP) through low-power (-0.25 to -3.00) and high power (-3.25 to -6.00) silicone hydrogel lenses of 3 different materials (galyfilcon A, senofilcon A, and lotrafilcon B). Seventy-eight patients were recruited for participation in this study. All were habitual wearers of silicone hydrogel contact lenses, and none had been diagnosed with glaucoma, ocular hypertension, or anterior surface disease. IOP was measured with and without lenses in place in the right eye only. Patients were randomized to initial measurement either with or without the lens in place. A single examiner collected all data. No statistically significant differences were noted between IOP measured without lenses and IOP measured through low-power lotrafilcon B lenses or high-power or low-power galyfilcon A and senofilcon A lenses. However, we did find a statistically significant difference between IOP measured without lenses and IOP measured through high-power lotrafilcon B lenses. In general, Tonopen XL measurement of IOP through silicone hydrogel lenses may be sufficiently accurate for clinical purposes. However, Tonopen XL may overestimate IOP if performed through a silicone hydrogel lens of relatively high modulus.

  4. Six Guidelines for Interesting Research.

    PubMed

    Gray, Kurt; Wegner, Daniel M

    2013-09-01

    There are many guides on proper psychology, but far fewer on interesting psychology. This article presents six guidelines for interesting research. The first three (Phenomena First; Be Surprising; Grandmothers, Not Scientists) suggest how to choose your research question; the last three (Be The Participant; Simple Statistics; Powerful Beginnings) suggest how to answer your research question and offer perspectives on experimental design, statistical analysis, and effective communication. These guidelines serve as reminders that replicability is necessary but not sufficient for compelling psychological science. Interesting research considers subjective experience; it listens to the music of the human condition. © The Author(s) 2013.

  5. Anomaly detection of turbopump vibration in Space Shuttle Main Engine using statistics and neural networks

    NASA Technical Reports Server (NTRS)

    Lo, C. F.; Wu, K.; Whitehead, B. A.

    1993-01-01

    The statistical and neural networks methods have been applied to investigate the feasibility of detecting anomalies in turbopump vibration of the SSME. The anomalies are detected based on the amplitude of peaks of fundamental and harmonic frequencies in the power spectral density. These data are reduced to the proper format from sensor data measured by strain gauges and accelerometers. Both methods are feasible for detecting the vibration anomalies. The statistical method requires sufficient data points to establish a reasonable statistical distribution data bank. This method is applicable for on-line operation. The neural networks method also needs a sufficient data basis to train the neural networks. The testing procedure can be utilized at any time so long as the characteristics of components remain unchanged.

  6. Breast Reference Set Application: Chris Li-FHCRC (2014) — EDRN Public Portal

    Cancer.gov

    This application proposes to use Reference Set #1. We request access to serum samples collected at the time of breast biopsy from subjects with IC (n=30) or benign disease without atypia (n=30). Statistical power: With 30 BC cases and 30 normal controls, a 25% difference in mean metabolite levels can be detected between groups with 80% power and α=0.05, assuming coefficients of variation of 30%, consistent with our past studies. These sample sizes appear sufficient to enable detection of changes similar in magnitude to those previously reported in pre-clinical (BC recurrence) specimens (20).
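
    The quoted design can be sanity-checked under a two-sample t-test framing (an assumption of this sketch, not necessarily the planned analysis): a 25% mean difference with a 30% coefficient of variation corresponds to Cohen's d ≈ 0.83, which with 30 per group gives roughly 90% power at α = 0.05:

    ```python
    # Quick check of the quoted design under a two-sample t-test framing.
    from statsmodels.stats.power import TTestIndPower

    d = 0.25 / 0.30                                   # 25% difference / 30% CV
    power = TTestIndPower().power(effect_size=d, nobs1=30, alpha=0.05)
    print(f"d = {d:.2f}, power with 30 per group ≈ {power:.2f}")   # roughly 0.9
    ```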

  7. Characterizations of linear sufficient statistics

    NASA Technical Reports Server (NTRS)

    Peters, B. C., Jr.; Reoner, R.; Decell, H. P., Jr.

    1977-01-01

    Conditions are given under which a surjective bounded linear operator T from a Banach space X to a Banach space Y is a sufficient statistic for a dominated family of probability measures defined on the Borel sets of X. These results were applied to characterize linear sufficient statistics for families of the exponential type, including as special cases the Wishart and multivariate normal distributions. The latter result was used to establish precisely which procedures for sampling from a normal population had the property that the sample mean was a sufficient statistic.

  8. Load- and skill-related changes in segmental contributions to a weightlifting movement.

    PubMed

    Enoka, R M

    1988-04-01

    An exemplary short duration, high-power, weightlifting event was examined to determine whether the ability to lift heavier loads and whether variations in the level of skill were accompanied by quantitative changes in selected aspects of lower extremity joint power-time histories. Six experienced weightlifters, three skilled and three less skilled, performed the double-knee-bend execution of the pull in Olympic weightlifting, a movement which lasted almost 1 s. Analysis-of-variance statistics were performed on selected peak and average values of power generated by the three skilled subjects as they lifted three loads (69, 77, and 86% of their competition maximum). The results indicated that the skilled subjects lifted heavier loads by increasing the average power, but not the peak power, about the knee and ankle joints. In addition, the changes with load were more subtle than a mere quantitative scaling and also seemed to be associated with a skill element in the form of variation in the duration of the phases of power production and absorption. Similarly, statistical differences (independent t-test) due to skill did not involve changes in the magnitude of power but rather the temporal organization of the movement. Thus, the ability to successfully execute the double-knee-bend movement depends on an athlete's ability to both generate a sufficient magnitude of joint power and to organize the phases of power production and absorption into an appropriate temporal sequence.

  9. Cosmic microwave background power asymmetry from non-Gaussian modulation.

    PubMed

    Schmidt, Fabian; Hui, Lam

    2013-01-04

    Non-Gaussianity in the inflationary perturbations can couple observable scales to modes of much longer wavelength (even superhorizon), leaving as a signature a large-angle modulation of the observed cosmic microwave background power spectrum. This provides an alternative origin for a power asymmetry that is otherwise often ascribed to a breaking of statistical isotropy. The non-Gaussian modulation effect can be significant even for typical ~10⁻⁵ perturbations while respecting current constraints on non-Gaussianity if the squeezed limit of the bispectrum is sufficiently infrared divergent. Just such a strongly infrared-divergent bispectrum has been claimed for inflation models with a non-Bunch-Davies initial state, for instance. Upper limits on the observed cosmic microwave background power asymmetry place stringent constraints on the duration of inflation in such models.

  10. Role of sufficient statistics in stochastic thermodynamics and its implication to sensory adaptation

    NASA Astrophysics Data System (ADS)

    Matsumoto, Takumi; Sagawa, Takahiro

    2018-04-01

    A sufficient statistic is a significant concept in statistics, which means a probability variable that has sufficient information required for an inference task. We investigate the roles of sufficient statistics and related quantities in stochastic thermodynamics. Specifically, we prove that for general continuous-time bipartite networks, the existence of a sufficient statistic implies that an informational quantity called the sensory capacity takes the maximum. Since the maximal sensory capacity imposes a constraint that the energetic efficiency cannot exceed one-half, our result implies that the existence of a sufficient statistic is inevitably accompanied by energetic dissipation. We also show that, in a particular parameter region of linear Langevin systems there exists the optimal noise intensity at which the sensory capacity, the information-thermodynamic efficiency, and the total entropy production are optimized at the same time. We apply our general result to a model of sensory adaptation of E. coli and find that the sensory capacity is nearly maximal with experimentally realistic parameters.

  11. Distinguishing Positive Selection From Neutral Evolution: Boosting the Performance of Summary Statistics

    PubMed Central

    Lin, Kao; Li, Haipeng; Schlötterer, Christian; Futschik, Andreas

    2011-01-01

    Summary statistics are widely used in population genetics, but they suffer from the drawback that no simple sufficient summary statistic exists, which captures all information required to distinguish different evolutionary hypotheses. Here, we apply boosting, a recent statistical method that combines simple classification rules to maximize their joint predictive performance. We show that our implementation of boosting has a high power to detect selective sweeps. Demographic events, such as bottlenecks, do not result in a large excess of false positives. A comparison to other neutrality tests shows that our boosting implementation performs well compared to other neutrality tests. Furthermore, we evaluated the relative contribution of different summary statistics to the identification of selection and found that for recent sweeps integrated haplotype homozygosity is very informative whereas older sweeps are better detected by Tajima's π. Overall, Watterson's θ was found to contribute the most information for distinguishing between bottlenecks and selection. PMID:21041556

  12. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arc second resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be tracked. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
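
    The statistical budgeting idea can be illustrated with a small Monte Carlo: uncertain loss terms (in dB) are summed and the fraction of trials meeting a required power level is read off as a confidence. All loss distributions and the requirement below are illustrative assumptions, not SIM values:

    ```python
    # Monte Carlo sketch of statistical power budgeting: sum uncertain optical losses
    # (in dB) and estimate the confidence of meeting a required power level.
    import numpy as np

    rng = np.random.default_rng(7)
    n_trials = 100_000

    launch_dbm = 0.0                                   # fiber-delivered power, dBm (assumed)
    losses_db = (
        rng.normal(1.5, 0.3, n_trials)                 # coupling efficiency
        + rng.normal(2.0, 0.5, n_trials)               # element misalignment
        + rng.uniform(0.5, 1.5, n_trials)              # material attenuation / aging
        + rng.normal(1.0, 0.2, n_trials)               # diffraction
    )
    delivered_dbm = launch_dbm - losses_db

    required_dbm = -7.0                                # illustrative requirement
    confidence = np.mean(delivered_dbm >= required_dbm)
    print(f"P(delivered power >= {required_dbm} dBm) ≈ {confidence:.3f}")
    ```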

  13. Assessment of economic factors affecting the satellite power system. Volume 2: The systems implications of rectenna siting issues

    NASA Technical Reports Server (NTRS)

    Chapman, P. K.; Bugos, B. J.; Csigi, K. I.; Glaser, P. E.; Schimke, G. R.; Thomas, R. G.

    1979-01-01

    The feasibility of finding potential sites in the continental United States for Solar Power Satellite (SPS) receiving antennas (rectennas) was evaluated, in sufficient numbers to permit the SPS to make a major contribution to U.S. generating facilities and to give statistical validity to an assessment of the characteristics of such sites and their implications for the design of the SPS system. It is found that the cost-optimum power output of the SPS does not depend on the particular value assigned to the cost per unit area of a rectenna and its site, as long as it is independent of rectenna area. Many characteristics of the sites chosen affect the optimum design of the rectenna itself.

  14. Implications of clinical trial design on sample size requirements.

    PubMed

    Leon, Andrew C

    2008-07-01

    The primary goal in designing a randomized controlled clinical trial (RCT) is to minimize bias in the estimate of treatment effect. Randomized group assignment, double-blinded assessments, and control or comparison groups reduce the risk of bias. The design must also provide sufficient statistical power to detect a clinically meaningful treatment effect and maintain a nominal level of type I error. An attempt to integrate neurocognitive science into an RCT poses additional challenges. Two particularly relevant aspects of such a design often receive insufficient attention in an RCT. Multiple outcomes inflate type I error, and an unreliable assessment process introduces bias and reduces statistical power. Here we describe how both unreliability and multiple outcomes can increase the study costs and duration and reduce the feasibility of the study. The objective of this article is to consider strategies that overcome the problems of unreliability and multiplicity.

  15. Human dynamics scaling characteristics for aerial inbound logistics operation

    NASA Astrophysics Data System (ADS)

    Wang, Qing; Guo, Jin-Li

    2010-05-01

    In recent years, the study of power-law scaling characteristics of real-life networks has attracted much interest from scholars, since such behavior deviates from the Poisson process. In this paper, we take the whole process of aerial inbound operation in a logistics company as the empirical object. The main aim of this work is to study the statistical scaling characteristics of the task-restricted work patterns. We found that the statistical variables have the scaling characteristics of a unimodal distribution with a power-law tail in five statistical distributions - that is to say, there is an obvious peak in each distribution, the shape of the left part is close to a Poisson distribution, and the right part has heavy-tailed scaling statistics. Furthermore, to our surprise, there is only one distribution whose right part can be approximated by the power-law form with exponent α=1.50. The others are larger than 1.50 (three of four are about 2.50, one of four is about 3.00). We then draw two inferences from these empirical results: first, human behavior is probably close to both Poisson statistics and power-law distributions at certain levels, and human-computer interaction behaviors may be the most common in logistics operational areas, even in the whole area of task-restricted work patterns. Second, the hypothesis in Vázquez et al. (2006) [A. Vázquez, J. G. Oliveira, Z. Dezsö, K.-I. Goh, I. Kondor, A.-L. Barabási. Modeling burst and heavy tails in human dynamics, Phys. Rev. E 73 (2006) 036127] is probably not sufficient; it claimed that human dynamics can be classified into two discrete universality classes. There may be a new human dynamics mechanism that is different from the classical Barabási models.

  16. Power analysis as a tool to identify statistically informative indicators for monitoring coral reef disturbances.

    PubMed

    Van Wynsberge, Simon; Gilbert, Antoine; Guillemot, Nicolas; Heintz, Tom; Tremblay-Boyer, Laura

    2017-07-01

    Extensive biological field surveys are costly and time consuming. To optimize sampling and ensure regular monitoring on the long term, identifying informative indicators of anthropogenic disturbances is a priority. In this study, we used 1800 candidate indicators by combining metrics measured from coral, fish, and macro-invertebrate assemblages surveyed from 2006 to 2012 in the vicinity of an ongoing mining project in the Voh-Koné-Pouembout lagoon, New Caledonia. We performed a power analysis to identify a subset of indicators which would best discriminate temporal changes due to a simulated chronic anthropogenic impact. Only 4% of tested indicators were likely to detect a 10% annual decrease of values with sufficient power (>0.80). Corals generally exerted higher statistical power than macro-invertebrates and fishes because of lower natural variability and higher occurrence. For the same reasons, higher taxonomic ranks provided higher power than lower taxonomic ranks. Nevertheless, a number of families of common sedentary or sessile macro-invertebrates and fishes also performed well in detecting changes: Echinometridae, Isognomidae, Muricidae, Tridacninae, Arcidae, and Turbinidae for macro-invertebrates and Pomacentridae, Labridae, and Chaetodontidae for fishes. Interestingly, these families did not provide high power in all geomorphological strata, suggesting that the ability of indicators in detecting anthropogenic impacts was closely linked to reef geomorphology. This study provides a first operational step toward identifying statistically relevant indicators of anthropogenic disturbances in New Caledonia's coral reefs, which can be useful in similar tropical reef ecosystems where little information is available regarding the responses of ecological indicators to anthropogenic disturbances.
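
    The kind of power analysis described above can be sketched by simulation: impose a 10% annual decline on an indicator with lognormal natural variability and count how often a trend test detects it. All values below are illustrative:

    ```python
    # Simulation sketch: power to detect a 10% annual decline in an indicator with
    # lognormal natural (between-survey) variability. All values are illustrative.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    def trend_power(years=7, decline=0.10, cv=0.4, n_sims=2000, alpha=0.05):
        t = np.arange(years)
        detections = 0
        for _ in range(n_sims):
            expected = 100 * (1 - decline) ** t
            observed = expected * rng.lognormal(0, cv, years)   # natural variability
            slope, _, _, p, _ = stats.linregress(t, np.log(observed))
            detections += (p < alpha) and (slope < 0)
        return detections / n_sims

    print("power:", trend_power())
    ```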

  17. Journal news

    USGS Publications Warehouse

    Conroy, M.J.; Samuel, M.D.; White, Joanne C.

    1995-01-01

    Statistical power (and conversely, Type II error) is often ignored by biologists. Power is important to consider in the design of studies, to ensure that sufficient resources are allocated to address a hypothesis under examination. Deter- mining appropriate sample size when designing experiments or calculating power for a statistical test requires an investigator to consider the importance of making incorrect conclusions about the experimental hypothesis and the biological importance of the alternative hypothesis (or the biological effect size researchers are attempting to measure). Poorly designed studies frequently provide results that are at best equivocal, and do little to advance science or assist in decision making. Completed studies that fail to reject Ho should consider power and the related probability of a Type II error in the interpretation of results, particularly when implicit or explicit acceptance of Ho is used to support a biological hypothesis or management decision. Investigators must consider the biological question they wish to answer (Tacha et al. 1982) and assess power on the basis of biologically significant differences (Taylor and Gerrodette 1993). Power calculations are somewhat subjective, because the author must specify either f or the minimum difference that is biologically important. Biologists may have different ideas about what values are appropriate. While determining biological significance is of central importance in power analysis, it is also an issue of importance in wildlife science. Procedures, references, and computer software to compute power are accessible; therefore, authors should consider power. We welcome comments or suggestions on this subject.

  18. An asymptotic theory for cross-correlation between auto-correlated sequences and its application on neuroimaging data.

    PubMed

    Zhou, Yunyi; Tao, Chenyang; Lu, Wenlian; Feng, Jianfeng

    2018-04-20

    Functional connectivity is among the most important tools to study the brain. The correlation coefficient between time series of different brain areas is the most popular method to quantify functional connectivity. In practical use, the correlation coefficient assumes the data to be temporally independent. However, the time series data of the brain can manifest significant temporal auto-correlation. A widely applicable method is proposed for correcting temporal auto-correlation. We considered two types of time series models: (1) auto-regressive-moving-average model, (2) nonlinear dynamical system model with noisy fluctuations, and derived their respective asymptotic distributions of the correlation coefficient. These two types of models are most commonly used in neuroscience studies. We show that the respective asymptotic distributions share a unified expression. We verified the validity of our method and showed that it exhibits sufficient statistical power for detecting true correlations in numerical experiments. Employing our method on a real dataset yields a more robust functional network and higher classification accuracy than conventional methods. Our method robustly controls the type I error while maintaining sufficient statistical power for detecting true correlations in numerical experiments, where existing methods measuring association (linear and nonlinear) fail. In this work, we proposed a widely applicable approach for correcting the effect of temporal auto-correlation on functional connectivity. Empirical results favor the use of our method in functional network analysis. Copyright © 2018. Published by Elsevier B.V.
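
    One widely used correction of this general kind adjusts the effective sample size of the correlation test using the product of the two series' autocorrelation functions (in the spirit of Pyper and Peterman, 1998). The sketch below shows that generic adjustment; it is not the specific asymptotic distribution derived in the paper:

    ```python
    # Generic effective-sample-size correction for correlation between autocorrelated
    # series; an illustration, not the paper's asymptotic result.
    import numpy as np
    from scipy import stats

    def autocorr(x, max_lag):
        x = x - x.mean()
        denom = np.dot(x, x)
        return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

    def corrected_corr_test(x, y, max_lag=10):
        n = len(x)
        r = stats.pearsonr(x, y)[0]
        rho_x, rho_y = autocorr(x, max_lag), autocorr(y, max_lag)
        lags = np.arange(1, max_lag + 1)
        inv_n_eff = 1 / n + (2 / n) * np.sum((n - lags) / n * rho_x * rho_y)
        n_eff = min(n, 1 / inv_n_eff)                      # effective sample size
        t = r * np.sqrt((n_eff - 2) / (1 - r ** 2))
        p = 2 * stats.t.sf(abs(t), df=n_eff - 2)
        return r, n_eff, p

    rng = np.random.default_rng(0)
    x = np.zeros(200); y = np.zeros(200)
    for i in range(1, 200):                                # two independent AR(1) series
        x[i] = 0.8 * x[i - 1] + rng.normal()
        y[i] = 0.8 * y[i - 1] + rng.normal()
    print(corrected_corr_test(x, y))
    ```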

  19. Power of mental health nursing research: a statistical analysis of studies in the International Journal of Mental Health Nursing.

    PubMed

    Gaskin, Cadeyrn J; Happell, Brenda

    2013-02-01

    Having sufficient power to detect effect sizes of an expected magnitude is a core consideration when designing studies in which inferential statistics will be used. The main aim of this study was to investigate the statistical power in studies published in the International Journal of Mental Health Nursing. From volumes 19 (2010) and 20 (2011) of the journal, studies were analysed for their power to detect small, medium, and large effect sizes, according to Cohen's guidelines. The power of the 23 studies included in this review to detect small, medium, and large effects was 0.34, 0.79, and 0.94, respectively. In 90% of papers, no adjustments for experiment-wise error were reported. With a median of nine inferential tests per paper, the mean experiment-wise error rate was 0.51. A priori power analyses were only reported in 17% of studies. Although effect sizes for correlations and regressions were routinely reported, effect sizes for other tests (χ²-tests, t-tests, ANOVA/MANOVA) were largely absent from the papers. All types of effect sizes were infrequently interpreted. Researchers are strongly encouraged to conduct power analyses when designing studies, and to avoid scattergun approaches to data analysis (i.e. undertaking large numbers of tests in the hope of finding 'significant' results). Because reviewing effect sizes is essential for determining the clinical significance of study findings, researchers would better serve the field of mental health nursing if they reported and interpreted effect sizes. © 2012 The Authors. International Journal of Mental Health Nursing © 2012 Australian College of Mental Health Nurses Inc.
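
    Two of the quantities discussed above are easy to reproduce in a few lines: power at Cohen's benchmark effect sizes for a given sample size, and the experiment-wise error rate for a given number of tests. The sample size below is hypothetical, and the error-rate formula assumes independent tests:

    ```python
    # Power at Cohen's benchmark effect sizes (hypothetical n) and the
    # experiment-wise error rate for 9 tests, assuming independent tests.
    from statsmodels.stats.power import TTestIndPower

    n_per_group = 64                      # hypothetical study size
    for label, d in (("small", 0.2), ("medium", 0.5), ("large", 0.8)):
        p = TTestIndPower().power(effect_size=d, nobs1=n_per_group, alpha=0.05)
        print(f"power to detect a {label} effect (d={d}): {p:.2f}")

    alpha, k = 0.05, 9
    print("experiment-wise error rate:", round(1 - (1 - alpha) ** k, 2))   # ≈ 0.37
    ```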

  20. Energy Efficiency Optimization in Relay-Assisted MIMO Systems With Perfect and Statistical CSI

    NASA Astrophysics Data System (ADS)

    Zappone, Alessio; Cao, Pan; Jorswieck, Eduard A.

    2014-01-01

    A framework for energy-efficient resource allocation in a single-user, amplify-and-forward relay-assisted MIMO system is devised in this paper. Previous results in this area have focused on rate maximization or sum power minimization problems, whereas fewer results are available when bits/Joule energy efficiency (EE) optimization is the goal. The performance metric to optimize is the ratio between the system's achievable rate and the total consumed power. The optimization is carried out with respect to the source and relay precoding matrices, subject to QoS and power constraints. Such a challenging non-convex problem is tackled by means of fractional programming and alternating maximization algorithms, for various CSI assumptions at the source and relay. In particular, the scenarios of perfect CSI and those of statistical CSI for either the source-relay or the relay-destination channel are addressed. Moreover, sufficient conditions for beamforming optimality are derived, which is useful in simplifying the system design. Numerical results are provided to corroborate the validity of the theoretical findings.

  1. On base station cooperation using statistical CSI in jointly correlated MIMO downlink channels

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Jiang, Bin; Jin, Shi; Gao, Xiqi; Wong, Kai-Kit

    2012-12-01

    This article studies the transmission of a single cell-edge user's signal using statistical channel state information at cooperative base stations (BSs) with a general jointly correlated multiple-input multiple-output (MIMO) channel model. We first present an optimal scheme to maximize the ergodic sum capacity with per-BS power constraints, revealing that the transmitted signals of all BSs are mutually independent and the optimum transmit directions for each BS align with the eigenvectors of the BS's own transmit correlation matrix of the channel. Then, we employ matrix permanents to derive a closed-form tight upper bound for the ergodic sum capacity. Based on these results, we develop a low-complexity power allocation solution using convex optimization techniques and a simple iterative water-filling algorithm (IWFA) for power allocation. Finally, we derive a necessary and sufficient condition for which a beamforming approach achieves capacity for all BSs. Simulation results demonstrate that the upper bound of ergodic sum capacity is tight and the proposed cooperative transmission scheme increases the downlink system sum capacity considerably.

  2. Hypothesis-Testing Demands Trustworthy Data—A Simulation Approach to Inferential Statistics Advocating the Research Program Strategy

    PubMed Central

    Krefeld-Schwalb, Antonia; Witte, Erich H.; Zenker, Frank

    2018-01-01

    In psychology as elsewhere, the main statistical inference strategy to establish empirical effects is null-hypothesis significance testing (NHST). The recent failure to replicate allegedly well-established NHST-results, however, implies that such results lack sufficient statistical power, and thus feature unacceptably high error-rates. Using data-simulation to estimate the error-rates of NHST-results, we advocate the research program strategy (RPS) as a superior methodology. RPS integrates Frequentist with Bayesian inference elements, and leads from a preliminary discovery against a (random) H0-hypothesis to a statistical H1-verification. Not only do RPS-results feature significantly lower error-rates than NHST-results, RPS also addresses key-deficits of a “pure” Frequentist and a standard Bayesian approach. In particular, RPS aggregates underpowered results safely. RPS therefore provides a tool to regain the trust the discipline had lost during the ongoing replicability-crisis. PMID:29740363
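
    The data-simulation step described above can be illustrated with a minimal Monte Carlo sketch (Python; the per-group sample size, true effect size, and number of replications are assumptions, not values from the paper): under H0 the rejection rate recovers the nominal α, while an underpowered design under H1 rejects far less often than 80%.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n, reps, alpha = 20, 10_000, 0.05          # assumed per-group n and number of simulated studies

    def rejection_rate(true_d):
        hits = 0
        for _ in range(reps):
            a = rng.normal(0.0, 1.0, n)
            b = rng.normal(true_d, 1.0, n)
            hits += stats.ttest_ind(a, b).pvalue < alpha
        return hits / reps

    print("type I error rate under H0:", rejection_rate(0.0))   # close to alpha
    print("power at true d = 0.3:     ", rejection_rate(0.3))   # far below 0.80
    ```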

  3. Hypothesis-Testing Demands Trustworthy Data-A Simulation Approach to Inferential Statistics Advocating the Research Program Strategy.

    PubMed

    Krefeld-Schwalb, Antonia; Witte, Erich H; Zenker, Frank

    2018-01-01

    In psychology as elsewhere, the main statistical inference strategy to establish empirical effects is null-hypothesis significance testing (NHST). The recent failure to replicate allegedly well-established NHST-results, however, implies that such results lack sufficient statistical power, and thus feature unacceptably high error-rates. Using data-simulation to estimate the error-rates of NHST-results, we advocate the research program strategy (RPS) as a superior methodology. RPS integrates Frequentist with Bayesian inference elements, and leads from a preliminary discovery against a (random) H0-hypothesis to a statistical H1-verification. Not only do RPS-results feature significantly lower error-rates than NHST-results, RPS also addresses key-deficits of a "pure" Frequentist and a standard Bayesian approach. In particular, RPS aggregates underpowered results safely. RPS therefore provides a tool to regain the trust the discipline had lost during the ongoing replicability-crisis.

  4. Statistical Approaches to Assess Biosimilarity from Analytical Data.

    PubMed

    Burdick, Richard; Coffey, Todd; Gutka, Hiten; Gratzl, Gyöngyi; Conlon, Hugh D; Huang, Chi-Ting; Boyne, Michael; Kuehne, Henriette

    2017-01-01

    Protein therapeutics have unique critical quality attributes (CQAs) that define their purity, potency, and safety. The analytical methods used to assess CQAs must be able to distinguish clinically meaningful differences in comparator products, and the most important CQAs should be evaluated with the most statistical rigor. High-risk CQA measurements assess the most important attributes that directly impact the clinical mechanism of action or have known implications for safety, while the moderate- to low-risk characteristics may have a lower direct impact and thereby may have a broader range to establish similarity. Statistical equivalence testing is applied for high-risk CQA measurements to establish the degree of similarity (e.g., highly similar fingerprint, highly similar, or similar) of selected attributes. Notably, some high-risk CQAs (e.g., primary sequence or disulfide bonding) are qualitative (e.g., the same as the originator or not the same) and therefore not amenable to equivalence testing. For biosimilars, an important step is the acquisition of a sufficient number of unique originator drug product lots to measure the variability in the originator drug manufacturing process and provide sufficient statistical power for the analytical data comparisons. Together, these analytical evaluations, along with PK/PD and safety data (immunogenicity), provide the data necessary to determine if the totality of the evidence warrants a designation of biosimilarity and subsequent licensure for marketing in the USA. In this paper, a case study approach is used to provide examples of analytical similarity exercises and the appropriateness of statistical approaches for the example data.
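
    For a quantitative high-risk CQA, the equivalence-testing step can be sketched with a two one-sided tests (TOST) procedure; in the Python sketch below the lot values and the 1.5 × originator-SD margin are illustrative assumptions, not data or acceptance criteria from the case studies.

    ```python
    import numpy as np
    from scipy import stats

    originator = np.array([98.1, 97.9, 98.4, 98.0, 98.2, 97.8, 98.3, 98.1])   # hypothetical CQA values
    biosimilar = np.array([97.9, 98.0, 98.2, 97.7, 98.1, 97.9])
    margin = 1.5 * originator.std(ddof=1)        # one common margin choice (assumption)

    diff = biosimilar.mean() - originator.mean()
    se = np.sqrt(biosimilar.var(ddof=1) / biosimilar.size + originator.var(ddof=1) / originator.size)
    df = biosimilar.size + originator.size - 2   # simple pooled df; Welch df would be more careful

    p_lower = stats.t.sf((diff + margin) / se, df)    # tests H0: diff <= -margin
    p_upper = stats.t.cdf((diff - margin) / se, df)   # tests H0: diff >= +margin
    print(f"diff = {diff:.3f}; equivalence if both p < 0.05: {p_lower:.4f}, {p_upper:.4f}")
    ```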

  5. Statistical foundations of liquid-crystal theory

    PubMed Central

    Seguin, Brian; Fried, Eliot

    2013-01-01

    We develop a mechanical theory for systems of rod-like particles. Central to our approach is the assumption that the external power expenditure for any subsystem of rods is independent of the underlying frame of reference. This assumption is used to derive the basic balance laws for forces and torques. By considering inertial forces on par with other forces, these laws hold relative to any frame of reference, inertial or noninertial. Finally, we introduce a simple set of constitutive relations to govern the interactions between rods and find restrictions necessary and sufficient for these laws to be consistent with thermodynamics. Our framework provides a foundation for a statistical mechanical derivation of the macroscopic balance laws governing liquid crystals. PMID:23772091

  6. Use of power analysis to develop detectable significance criteria for sea urchin toxicity tests

    USGS Publications Warehouse

    Carr, R.S.; Biedenbach, J.M.

    1999-01-01

    When sufficient data are available, the statistical power of a test can be determined using power analysis procedures. The term “detectable significance” has been coined to refer to this criterion based on power analysis and past performance of a test. This power analysis procedure has been performed with sea urchin (Arbacia punctulata) fertilization and embryological development data from sediment porewater toxicity tests. Data from 3100 and 2295 tests for the fertilization and embryological development tests, respectively, were used to calculate the criteria and regression equations describing the power curves. Using Dunnett's test, minimum significant differences (MSDs) (β = 0.05) of 15.5% and 19% for the fertilization test, and 16.4% and 20.6% for the embryological development test, for α ≤ 0.05 and α ≤ 0.01, respectively, were determined. The use of this second criterion reduces type I (false positive) errors and helps to establish a critical level of difference based on the past performance of the test.
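
    In its simplest two-group form, the minimum significant difference is the smallest drop in mean response that would just reach significance given historical control variability. The Python sketch below uses an ordinary one-sided two-sample comparison with an invented control SD and replicate counts; the paper itself uses Dunnett's multiple-comparison critical values, which give somewhat different numbers.

    ```python
    import numpy as np
    from scipy import stats

    control_sd = 9.0            # assumed SD of % fertilization across replicates
    n_control, n_treat = 5, 5   # assumed replicate counts
    alpha = 0.05

    df = n_control + n_treat - 2
    msd = stats.t.ppf(1 - alpha, df) * control_sd * np.sqrt(1 / n_control + 1 / n_treat)
    print(f"minimum significant difference at alpha = {alpha}: {msd:.1f} percentage points")
    ```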

  7. Assessment and Implication of Prognostic Imbalance in Randomized Controlled Trials with a Binary Outcome – A Simulation Study

    PubMed Central

    Chu, Rong; Walter, Stephen D.; Guyatt, Gordon; Devereaux, P. J.; Walsh, Michael; Thorlund, Kristian; Thabane, Lehana

    2012-01-01

    Background Chance imbalance in baseline prognosis of a randomized controlled trial can lead to over or underestimation of treatment effects, particularly in trials with small sample sizes. Our study aimed to (1) evaluate the probability of imbalance in a binary prognostic factor (PF) between two treatment arms, (2) investigate the impact of prognostic imbalance on the estimation of a treatment effect, and (3) examine the effect of sample size (n) in relation to the first two objectives. Methods We simulated data from parallel-group trials evaluating a binary outcome by varying the risk of the outcome, effect of the treatment, power and prevalence of the PF, and n. Logistic regression models with and without adjustment for the PF were compared in terms of bias, standard error, coverage of confidence interval and statistical power. Results For a PF with a prevalence of 0.5, the probability of a difference in the frequency of the PF≥5% reaches 0.42 with 125/arm. Ignoring a strong PF (relative risk = 5) leads to underestimating the strength of a moderate treatment effect, and the underestimate is independent of n when n is >50/arm. Adjusting for such PF increases statistical power. If the PF is weak (RR = 2), adjustment makes little difference in statistical inference. Conditional on a 5% imbalance of a powerful PF, adjustment reduces the likelihood of large bias. If an absolute measure of imbalance ≥5% is deemed important, including 1000 patients/arm provides sufficient protection against such an imbalance. Two thousand patients/arm may provide an adequate control against large random deviations in treatment effect estimation in the presence of a powerful PF. Conclusions The probability of prognostic imbalance in small trials can be substantial. Covariate adjustment improves estimation accuracy and statistical power, and hence should be performed when strong PFs are observed. PMID:22629322
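
    A stripped-down version of this kind of simulation is sketched below (Python with statsmodels): two arms of 125 patients, a binary prognostic factor of prevalence 0.5, and logistic models fitted with and without adjustment. The baseline log-odds and the log-odds-ratio effect sizes are assumptions chosen for illustration; the paper parameterizes effects as relative risks and explores a much larger grid of scenarios.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n_per_arm, n_sims = 125, 500
    beta_trt, beta_pf = np.log(0.6), np.log(5.0)     # assumed treatment and prognostic-factor effects

    unadj, adj = [], []
    for _ in range(n_sims):
        trt = np.r_[np.ones(n_per_arm), np.zeros(n_per_arm)]
        pf = rng.binomial(1, 0.5, 2 * n_per_arm)
        p = 1.0 / (1.0 + np.exp(-(-1.0 + beta_trt * trt + beta_pf * pf)))   # assumed baseline log-odds of -1
        y = rng.binomial(1, p)
        X = sm.add_constant(np.column_stack([trt, pf]))
        adj.append(sm.Logit(y, X).fit(disp=0).params[1])
        unadj.append(sm.Logit(y, X[:, :2]).fit(disp=0).params[1])

    print("true treatment log-OR:   ", round(float(beta_trt), 3))
    print("mean unadjusted estimate:", round(float(np.mean(unadj)), 3))   # attenuated toward zero
    print("mean adjusted estimate:  ", round(float(np.mean(adj)), 3))
    ```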

  8. Statistical properties of Galactic CMB foregrounds: dust and synchrotron

    NASA Astrophysics Data System (ADS)

    Kandel, D.; Lazarian, A.; Pogosyan, D.

    2018-07-01

    Recent Planck observations have revealed some of the important statistical properties of synchrotron and dust polarization, namely, the B to E mode power and temperature-E (TE) mode cross-correlation. In this paper, we extend our analysis in Kandel et al. that studied the B to E mode power ratio for polarized dust emission to include TE cross-correlation and develop an analogous formalism for synchrotron signal, all using a realistic model of magnetohydrodynamical turbulence. Our results suggest that the Planck results for both synchrotron and dust polarization can be understood if the turbulence in the Galaxy is sufficiently sub-Alfvénic. Making use of the observed poor magnetic field-density correlation, we show that the observed positive TE correlation for dust corresponds to our theoretical expectations. We also show how the B to E ratio as well as the TE cross-correlation can be used to study media magnetization, compressibility, and level of density-magnetic field correlation.

  9. Falsifiability is not optional.

    PubMed

    LeBel, Etienne P; Berger, Derek; Campbell, Lorne; Loving, Timothy J

    2017-08-01

    Finkel, Eastwick, and Reis (2016; FER2016) argued the post-2011 methodological reform movement has focused narrowly on replicability, neglecting other essential goals of research. We agree multiple scientific goals are essential but argue that a more fine-grained language, conceptualization, and approach to replication is needed to accomplish these goals. Replication is the general empirical mechanism for testing and falsifying theory. Sufficiently methodologically similar replications, also known as direct replications, test the basic existence of phenomena and ensure cumulative progress is possible a priori. In contrast, increasingly methodologically dissimilar replications, also known as conceptual replications, test the relevance of auxiliary hypotheses (e.g., manipulation and measurement issues, contextual factors) required to productively investigate validity and generalizability. Without prioritizing replicability, a field is not empirically falsifiable. We also disagree with FER2016's position that "bigger samples are generally better, but . . . that very large samples could have the downside of commandeering resources that would have been better invested in other studies" (abstract). We identify problematic assumptions involved in FER2016's modifications of our original research-economic model, and present an improved model that quantifies when (and whether) it is reasonable to worry that increasing statistical power will engender potential trade-offs. Sufficiently powering studies (i.e., >80%) maximizes both research efficiency and confidence in the literature (research quality). Given that we are in agreement with FER2016 on all key open science points, we are eager to start seeing the accelerated rate of cumulative knowledge development of social psychological phenomena such a sufficiently transparent, powered, and falsifiable approach will generate. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. An Integrative Account of Constraints on Cross-Situational Learning

    PubMed Central

    Yurovsky, Daniel; Frank, Michael C.

    2015-01-01

    Word-object co-occurrence statistics are a powerful information source for vocabulary learning, but there is considerable debate about how learners actually use them. While some theories hold that learners accumulate graded, statistical evidence about multiple referents for each word, others suggest that they track only a single candidate referent. In two large-scale experiments, we show that neither account is sufficient: Cross-situational learning involves elements of both. Further, the empirical data are captured by a computational model that formalizes how memory and attention interact with co-occurrence tracking. Together, the data and model unify opposing positions in a complex debate and underscore the value of understanding the interaction between computational and algorithmic levels of explanation. PMID:26302052

  11. Statistical foundations of liquid-crystal theory: I. Discrete systems of rod-like molecules.

    PubMed

    Seguin, Brian; Fried, Eliot

    2012-12-01

    We develop a mechanical theory for systems of rod-like particles. Central to our approach is the assumption that the external power expenditure for any subsystem of rods is independent of the underlying frame of reference. This assumption is used to derive the basic balance laws for forces and torques. By considering inertial forces on par with other forces, these laws hold relative to any frame of reference, inertial or noninertial. Finally, we introduce a simple set of constitutive relations to govern the interactions between rods and find restrictions necessary and sufficient for these laws to be consistent with thermodynamics. Our framework provides a foundation for a statistical mechanical derivation of the macroscopic balance laws governing liquid crystals.

  12. Power calculation for overall hypothesis testing with high-dimensional commensurate outcomes.

    PubMed

    Chi, Yueh-Yun; Gribbin, Matthew J; Johnson, Jacqueline L; Muller, Keith E

    2014-02-28

    The complexity of systems biology means that any metabolic, genetic, or proteomic pathway typically includes so many components (e.g., molecules) that statistical methods specialized for overall testing of high-dimensional and commensurate outcomes are required. While many overall tests have been proposed, very few have power and sample size methods. We develop accurate power and sample size methods and software to facilitate study planning for high-dimensional pathway analysis. With an account of any complex correlation structure between high-dimensional outcomes, the new methods allow power calculation even when the sample size is less than the number of variables. We derive the exact (finite-sample) and approximate non-null distributions of the 'univariate' approach to repeated measures test statistic, as well as power-equivalent scenarios useful to generalize our numerical evaluations. Extensive simulations of group comparisons support the accuracy of the approximations even when the ratio of number of variables to sample size is large. We derive a minimum set of constants and parameters sufficient and practical for power calculation. Using the new methods and specifying the minimum set to determine power for a study of metabolic consequences of vitamin B6 deficiency helps illustrate the practical value of the new results. Free software implementing the power and sample size methods applies to a wide range of designs, including one group pre-intervention and post-intervention comparisons, multiple parallel group comparisons with one-way or factorial designs, and the adjustment and evaluation of covariate effects. Copyright © 2013 John Wiley & Sons, Ltd.

  13. Statistical inference involving binomial and negative binomial parameters.

    PubMed

    García-Pérez, Miguel A; Núñez-Antón, Vicente

    2009-05-01

    Statistical inference about two binomial parameters implies that they are both estimated by binomial sampling. There are occasions in which one aims at testing the equality of two binomial parameters before and after the occurrence of the first success along a sequence of Bernoulli trials. In these cases, the binomial parameter before the first success is estimated by negative binomial sampling whereas that after the first success is estimated by binomial sampling, and both estimates are related. This paper derives statistical tools to test two hypotheses, namely, that both binomial parameters equal some specified value and that both parameters are equal though unknown. Simulation studies are used to show that in small samples both tests are accurate in keeping the nominal Type-I error rates, and also to determine sample size requirements to detect large, medium, and small effects with adequate power. Additional simulations also show that the tests are sufficiently robust to certain violations of their assumptions.

  14. Sample Size in Clinical Cardioprotection Trials Using Myocardial Salvage Index, Infarct Size, or Biochemical Markers as Endpoint.

    PubMed

    Engblom, Henrik; Heiberg, Einar; Erlinge, David; Jensen, Svend Eggert; Nordrehaug, Jan Erik; Dubois-Randé, Jean-Luc; Halvorsen, Sigrun; Hoffmann, Pavel; Koul, Sasha; Carlsson, Marcus; Atar, Dan; Arheden, Håkan

    2016-03-09

    Cardiac magnetic resonance (CMR) can quantify myocardial infarct (MI) size and myocardium at risk (MaR), enabling assessment of myocardial salvage index (MSI). We assessed how MSI impacts the number of patients needed to reach statistical power in relation to MI size alone and levels of biochemical markers in clinical cardioprotection trials, and how scan day affects sample size. Controls (n=90) from the recent CHILL-MI and MITOCARE trials were included. MI size, MaR, and MSI were assessed from CMR. High-sensitivity troponin T (hsTnT) and creatine kinase isoenzyme MB (CKMB) levels were assessed in CHILL-MI patients (n=50). Utilizing the distributions of these variables, 100 000 clinical trials were simulated for calculation of the sample size required to reach sufficient power. For a treatment effect of 25% decrease in outcome variables, 50 patients were required in each arm using MSI compared to 93, 98, 120, 141, and 143 for MI size alone, hsTnT (area under the curve [AUC] and peak), and CKMB (AUC and peak) in order to reach a power of 90%. If the average CMR scan day between treatment and control arms differs by 1 day, the sample size needs to be increased by 54% (77 vs 50) to avoid scan-day bias masking a treatment effect of 25%. Sample size in cardioprotection trials can be reduced 46% to 65% without compromising statistical power when using MSI by CMR as an outcome variable instead of MI size alone or biochemical markers. It is essential to ensure lack of bias in scan day between treatment and control arms to avoid compromising statistical power. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
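
    The heart of the comparison is that a less variable endpoint needs fewer patients to show the same 25% relative reduction. A minimal analytic sketch is given below (Python, normal approximation); the control-arm means and SDs are invented placeholders, not the CHILL-MI/MITOCARE distributions used in the paper's simulations.

    ```python
    import math
    from scipy.stats import norm

    def n_per_arm(sd, mean_control, reduction=0.25, alpha=0.05, power=0.90):
        """Per-arm sample size for a two-sided, two-sample comparison of means."""
        delta = reduction * mean_control
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        return math.ceil(2 * (z * sd / delta) ** 2)

    print("low-variability endpoint (MSI-like):       ", n_per_arm(sd=0.22, mean_control=0.55))
    print("higher-variability endpoint (MI-size-like):", n_per_arm(sd=9.0, mean_control=17.0))
    ```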

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merkley, Eric D.; Sego, Landon H.; Lin, Andy

    Adaptive processes in bacterial species can occur rapidly in laboratory culture, leading to genetic divergence between naturally occurring and laboratory-adapted strains. Differentiating wild and closely-related laboratory strains is clearly important for biodefense and bioforensics; however, DNA sequence data alone has thus far not provided a clear signature, perhaps due to lack of understanding of how diverse genome changes lead to adapted phenotypes. Protein abundance profiles from mass spectrometry-based proteomics analyses are a molecular measure of phenotype. Proteomics data contains sufficient information that powerful statistical methods can uncover signatures that distinguish wild strains of Yersinia pestis from laboratory-adapted strains.

  16. Minimal sufficient positive-operator valued measure on a separable Hilbert space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuramochi, Yui, E-mail: kuramochi.yui.22c@st.kyoto-u.ac.jp

    We introduce a concept of a minimal sufficient positive-operator valued measure (POVM), which is the least redundant POVM among the POVMs that have the equivalent information about the measured quantum system. Assuming the system Hilbert space to be separable, we show that for a given POVM, a sufficient statistic called a Lehmann-Scheffé-Bahadur statistic induces a minimal sufficient POVM. We also show that every POVM has an equivalent minimal sufficient POVM and that such a minimal sufficient POVM is unique up to relabeling neglecting null sets. We apply these results to discrete POVMs and information conservation conditions proposed by the author.

  17. On sufficient statistics of least-squares superposition of vector sets.

    PubMed

    Konagurthu, Arun S; Kasarapu, Parthan; Allison, Lloyd; Collier, James H; Lesk, Arthur M

    2015-06-01

    The problem of superposition of two corresponding vector sets by minimizing their sum-of-squares error under orthogonal transformation is a fundamental task in many areas of science, notably structural molecular biology. This problem can be solved exactly using an algorithm whose time complexity grows linearly with the number of correspondences. This efficient solution has facilitated the widespread use of the superposition task, particularly in studies involving macromolecular structures. This article formally derives a set of sufficient statistics for the least-squares superposition problem. These statistics are additive. This permits a highly efficient (constant time) computation of superpositions (and sufficient statistics) of vector sets that are composed from its constituent vector sets under addition or deletion operation, where the sufficient statistics of the constituent sets are already known (that is, the constituent vector sets have been previously superposed). This results in a drastic improvement in the run time of the methods that commonly superpose vector sets under addition or deletion operations, where previously these operations were carried out ab initio (ignoring the sufficient statistics). We experimentally demonstrate the improvement our work offers in the context of protein structural alignment programs that assemble a reliable structural alignment from well-fitting (substructural) fragment pairs. A C++ library for this task is available online under an open-source license.
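
    A minimal sketch of the idea follows (Python/NumPy; an illustration, not the authors' C++ library): the optimal superposition depends on the data only through the point count, the two coordinate sums, and the raw cross-moment matrix, all of which simply add when vector sets are merged, so the union of two already-summarized fragments can be superposed without revisiting the coordinates. The rotation is recovered from those statistics by SVD.

    ```python
    import numpy as np

    def suff_stats(A, B):
        """Additive sufficient statistics for superposing corresponding sets A, B (n x 3)."""
        return len(A), A.sum(0), B.sum(0), B.T @ A        # count, coordinate sums, raw cross-moments

    def combine(s, t):
        return tuple(x + y for x, y in zip(s, t))

    def optimal_rotation(stats_):
        n, sa, sb, M = stats_
        C = M - np.outer(sb, sa) / n                      # centred cross-covariance
        U, _, Vt = np.linalg.svd(C)
        d = np.sign(np.linalg.det(U @ Vt))                # guard against improper rotations
        return U @ np.diag([1.0, 1.0, d]) @ Vt            # rotation mapping A onto B

    rng = np.random.default_rng(3)
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    R_true = Q if np.linalg.det(Q) > 0 else -Q            # a random proper rotation
    A1, A2 = rng.normal(size=(10, 3)), rng.normal(size=(6, 3))
    B1, B2 = A1 @ R_true.T, A2 @ R_true.T                 # noiseless rotated copies

    R = optimal_rotation(combine(suff_stats(A1, B1), suff_stats(A2, B2)))
    print(np.allclose(R, R_true))                         # True: the union is superposed from statistics alone
    ```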

  18. Design and Feasibility Assessment of a Retrospective Epidemiological Study of Coal-Fired Power Plant Emissions in the Pittsburgh Pennsylvania Region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard A. Bilonick; Daniel Connell; Evelyn Talbott

    2006-12-20

    Eighty-nine (89) percent of the electricity supplied in the 35-county Pittsburgh region (comprising parts of the states of Pennsylvania, Ohio, West Virginia, and Maryland) is generated by coal-fired power plants making this an ideal region in which to study the effects of the fine airborne particulates designated as PM2.5 emitted by the combustion of coal. This report demonstrates that during the period from 1999-2006 (1) sufficient and extensive exposure data, in particular samples of speciated PM2.5 components from 1999 to 2003, and including gaseous co-pollutants and weather have been collected, (2) sufficient and extensive mortality, morbidity, and related health outcomes data are readily available, and (3) the relationship between health effects and fine particulates can most likely be satisfactorily characterized using a combination of sophisticated statistical methodologies including latent variable modeling (LVM) and generalized linear autoregressive moving average (GLARMA) time series analysis. This report provides detailed information on the available exposure data and the available health outcomes data for the construction of a comprehensive database suitable for analysis, illustrates the application of various statistical methods to characterize the relationship between health effects and exposure, and provides a road map for conducting the proposed study. In addition, a detailed work plan for conducting the study is provided and includes a list of tasks and an estimated budget. A substantial portion of the total study cost is attributed to the cost of analyzing a large number of archived PM2.5 filters. Analysis of a representative sample of the filters supports the reliability of this invaluable but as-yet untapped resource. These filters hold the key to having sufficient data on the components of PM2.5 but have a limited shelf life. If the archived filters are not analyzed promptly the important and costly information they contain will be lost.

  19. Information trimming: Sufficient statistics, mutual information, and predictability from effective channel states

    NASA Astrophysics Data System (ADS)

    James, Ryan G.; Mahoney, John R.; Crutchfield, James P.

    2017-06-01

    One of the most basic characterizations of the relationship between two random variables, X and Y, is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that either X (or Y) can be replaced by its minimal sufficient statistic about Y (or X) while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X's minimal sufficient statistic preserves about Y is exactly the information that Y's minimal sufficient statistic preserves about X. We call this procedure information trimming. As an important corollary, we consider the case where one variable is a stochastic process' past and the other its future. In this case, the mutual information is the channel transmission rate between the channel's effective states. That is, the past-future mutual information (the excess entropy) is the amount of information about the future that can be predicted using the past. Translating our result about minimal sufficient statistics, this is equivalent to the mutual information between the forward- and reverse-time causal states of computational mechanics. We close by discussing multivariate extensions to this use of minimal sufficient statistics.
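
    A toy discrete example (Python; the joint distribution is invented) illustrates the one-variable version of this fact: lumping together x-values that share the same conditional p(y|x), which is exactly the minimal sufficient statistic construction, leaves the mutual information with Y unchanged.

    ```python
    import numpy as np

    def mutual_info(p):
        """Mutual information in bits of a joint pmf given as a 2-D array."""
        px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
        m = p > 0
        return float((p[m] * np.log2(p[m] / (px @ py)[m])).sum())

    p = np.array([[0.10, 0.10],      # rows 0 and 1 share the conditional p(y|x) = (0.5, 0.5),
                  [0.15, 0.15],      # so the minimal sufficient statistic merges them
                  [0.25, 0.05],
                  [0.05, 0.15]])

    cond = np.round(p / p.sum(1, keepdims=True), 12)
    _, labels = np.unique(cond, axis=0, return_inverse=True)
    q = np.zeros((labels.max() + 1, p.shape[1]))
    for x, lab in enumerate(labels):
        q[lab] += p[x]               # joint pmf of (f(X), Y)

    print(mutual_info(p), mutual_info(q))   # equal: I(X;Y) = I(f(X);Y)
    ```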

  20. [An investigation of the statistical power of the effect size in randomized controlled trials for the treatment of patients with type 2 diabetes mellitus using Chinese medicine].

    PubMed

    Ma, Li-Xin; Liu, Jian-Ping

    2012-01-01

    To investigate whether the power of the effect size was based on adequate sample size in randomized controlled trials (RCTs) for the treatment of patients with type 2 diabetes mellitus (T2DM) using Chinese medicine. China Knowledge Resource Integrated Database (CNKI), VIP Database for Chinese Technical Periodicals (VIP), Chinese Biomedical Database (CBM), and Wangfang Data were systematically searched using terms like "Xiaoke" or diabetes, Chinese herbal medicine, patent medicine, traditional Chinese medicine, randomized, controlled, blinded, and placebo-controlled. The search was limited to trials with an intervention course of ≥ 3 months in order to identify the information on outcome assessment and the sample size. Data collection forms were made according to the checking lists found in the CONSORT statement. Independent double data extractions were performed on all included trials. The statistical power of the effect size for each RCT was assessed using sample size calculation equations. (1) A total of 207 RCTs were included, including 111 superiority trials and 96 non-inferiority trials. (2) Among the 111 superiority trials, fasting plasma glucose (FPG) and glycosylated hemoglobin (HbA1c) outcome measures were reported in 9% and 12% of the RCTs, respectively, with a sample size > 150 in each trial. For the outcome of HbA1c, only 10% of the RCTs had more than 80% power. For FPG, 23% of the RCTs had more than 80% power. (3) In the 96 non-inferiority trials, the outcomes FPG and HbA1c were reported in 31% and 36% of the RCTs, respectively, with a sample size > 150. For HbA1c, only 36% of the RCTs had more than 80% power. For FPG, only 27% of the studies had more than 80% power. The sample size for statistical analysis was distressingly low and most RCTs did not achieve 80% power. In order to obtain sufficient statistical power, it is recommended that clinical trials first establish a clear research objective and hypothesis, and choose a scientific, evidence-based study design and outcome measurements. At the same time, the required sample size should be calculated to ensure a precise research conclusion.
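
    The per-trial power check can be reproduced with a simple normal-approximation formula. The Python sketch below is illustrative only; the HbA1c difference, its SD, and the per-arm size are assumptions rather than values extracted from any of the reviewed RCTs.

    ```python
    from scipy.stats import norm

    def power_two_sample(delta, sd, n_per_arm, alpha=0.05):
        """Approximate power of a two-sided, two-sample comparison of means."""
        z = abs(delta) / (sd * (2.0 / n_per_arm) ** 0.5)
        return norm.cdf(z - norm.ppf(1 - alpha / 2))

    # e.g. a 0.5% absolute HbA1c difference, SD 1.5%, 60 patients per arm (assumed values)
    print(f"power: {power_two_sample(delta=0.5, sd=1.5, n_per_arm=60):.2f}")   # well under 0.80
    ```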

  1. Remote detection of radioactive material using high-power pulsed electromagnetic radiation.

    PubMed

    Kim, Dongsung; Yu, Dongho; Sawant, Ashwini; Choe, Mun Seok; Lee, Ingeun; Kim, Sung Gug; Choi, EunMi

    2017-05-09

    Remote detection of radioactive materials is impossible when the measurement location is far from the radioactive source such that the leakage of high-energy photons or electrons from the source cannot be measured. Current technologies are less effective in this respect because they only allow the detection at distances to which the high-energy photons or electrons can reach the detector. Here we demonstrate an experimental method for remote detection of radioactive materials by inducing plasma breakdown with the high-power pulsed electromagnetic waves. Measurements of the plasma formation time and its dispersion lead to enhanced detection sensitivity compared to the theoretically predicted one based only on the plasma on and off phenomena. We show that lower power of the incident electromagnetic wave is sufficient for plasma breakdown in atmospheric-pressure air and the elimination of the statistical distribution is possible in the presence of radioactive material.

  2. Mining protein-protein interaction networks: denoising effects

    NASA Astrophysics Data System (ADS)

    Marras, Elisabetta; Capobianco, Enrico

    2009-01-01

    A typical instrument in complex network studies is the analysis of statistical distributions. These are usually computed for measures which characterize network topology, and are aimed at capturing both structural and dynamic aspects. Protein-protein interaction networks (PPIN) have also been studied through several measures. In general, a power law is expected to characterize scale-free networks. However, mixing the original noise cover with outlying information and other system-dependent fluctuations makes the empirical detection of the power law a difficult task. As a result the uncertainty level increases when looking at the observed sample; in particular, one may wonder whether the computed features may be sufficient to explain the interactome. We then address noise problems by implementing both decomposition and denoising techniques that reduce the impact of factors known to affect the accuracy of power law detection.
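
    One standard ingredient of empirical power-law detection, which denoising is meant to make more reliable, is the maximum-likelihood estimate of the tail exponent above a cutoff degree. A minimal sketch follows (Python, a Clauset-style estimator applied to synthetic degrees, not to interactome data).

    ```python
    import numpy as np

    def powerlaw_alpha(degrees, k_min):
        """ML estimate of the power-law exponent for degrees >= k_min (discrete approximation)."""
        k = np.asarray([d for d in degrees if d >= k_min], dtype=float)
        alpha = 1.0 + k.size / np.log(k / (k_min - 0.5)).sum()
        return alpha, (alpha - 1.0) / np.sqrt(k.size)     # estimate and its standard error

    rng = np.random.default_rng(7)
    degrees = np.floor(rng.pareto(1.5, 5000) + 1).astype(int)   # synthetic heavy-tailed degrees
    print(powerlaw_alpha(degrees, k_min=5))                     # roughly (2.5, small standard error)
    ```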

  3. Similarity principles for the biology of pelagic animals

    PubMed Central

    Barenblatt, G. I.; Monin, A. S.

    1983-01-01

    A similarity principle is formulated according to which the statistical pattern of the pelagic population is identical in all scales sufficiently large in comparison with the molecular one. From this principle, a power law is obtained analytically for the pelagic animal biomass distribution over the animal sizes. A hypothesis is presented according to which, under fixed external conditions, the oxygen exchange intensity of an animal is governed only by its mass and density and by the specific absorbing capacity of the animal's respiratory organ. From this hypothesis a power law is obtained by the method of dimensional analysis for the exchange intensity mass dependence. The known empirical values of the exponent of this power law are interpreted as an indication that the oxygen-absorbing organs of the animals can be represented as so-called fractal surfaces. In conclusion the biological principle of the decrease in specific exchange intensity with increase in animal mass is discussed. PMID:16593327

  4. Remote detection of radioactive material using high-power pulsed electromagnetic radiation

    PubMed Central

    Kim, Dongsung; Yu, Dongho; Sawant, Ashwini; Choe, Mun Seok; Lee, Ingeun; Kim, Sung Gug; Choi, EunMi

    2017-01-01

    Remote detection of radioactive materials is impossible when the measurement location is far from the radioactive source such that the leakage of high-energy photons or electrons from the source cannot be measured. Current technologies are less effective in this respect because they only allow the detection at distances to which the high-energy photons or electrons can reach the detector. Here we demonstrate an experimental method for remote detection of radioactive materials by inducing plasma breakdown with the high-power pulsed electromagnetic waves. Measurements of the plasma formation time and its dispersion lead to enhanced detection sensitivity compared to the theoretically predicted one based only on the plasma on and off phenomena. We show that lower power of the incident electromagnetic wave is sufficient for plasma breakdown in atmospheric-pressure air and the elimination of the statistical distribution is possible in the presence of radioactive material. PMID:28486438

  5. A simple formula for insertion loss prediction of large acoustical enclosures using statistical energy analysis method

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Sil; Kim, Jae-Seung; Lee, Seong-Hyun; Seo, Yun-Ho

    2014-12-01

    Insertion loss prediction of large acoustical enclosures using the Statistical Energy Analysis (SEA) method is presented. The SEA model consists of three elements: sound field inside the enclosure, vibration energy of the enclosure panel, and sound field outside the enclosure. It is assumed that the space surrounding the enclosure is sufficiently large so that there is no energy flow from the outside to the wall panel or to air cavity inside the enclosure. The comparison of the predicted insertion loss to the measured data for typical large acoustical enclosures shows good agreement. It is found that if the critical frequency of the wall panel falls above the frequency region of interest, insertion loss is dominated by the sound transmission loss of the wall panel and averaged sound absorption coefficient inside the enclosure. However, if the critical frequency of the wall panel falls into the frequency region of interest, acoustic power from the sound radiation by the wall panel must be added to the acoustic power from transmission through the panel.

  6. Evaluating the statistical power of DNA-based identification, exemplified by 'The missing grandchildren of Argentina'.

    PubMed

    Kling, Daniel; Egeland, Thore; Piñero, Mariana Herrera; Vigeland, Magnus Dehli

    2017-11-01

    Methods and implementations of DNA-based identification are well established in several forensic contexts. However, assessing the statistical power of these methods has been largely overlooked, except in the simplest cases. In this paper we outline general methods for such power evaluation, and apply them to a large set of family reunification cases, where the objective is to decide whether a person of interest (POI) is identical to the missing person (MP) in a family, based on the DNA profile of the POI and available family members. As such, this application closely resembles database searching and disaster victim identification (DVI). If parents or children of the MP are available, they will typically provide sufficient statistical evidence to settle the case. However, if one must resort to more distant relatives, it is not a priori obvious that a reliable conclusion is likely to be reached. In these cases power evaluation can be highly valuable, for instance in the recruitment of additional family members. To assess the power in an identification case, we advocate the combined use of two statistics: the Probability of Exclusion, and the Probability of Exceedance. The former is the probability that the genotypes of a random, unrelated person are incompatible with the available family data. If this is close to 1, it is likely that a conclusion will be achieved regarding general relatedness, but not necessarily the specific relationship. To evaluate the ability to recognize a true match, we use simulations to estimate exceedance probabilities, i.e. the probability that the likelihood ratio will exceed a given threshold, assuming that the POI is indeed the MP. All simulations are done conditionally on available family data. Such conditional simulations have a long history in medical linkage analysis, but to our knowledge this is the first systematic forensic genetics application. Also, for forensic markers, mutations cannot be ignored and therefore current models and implementations must be extended. All the tools are freely available in Familias (http://www.familias.no) empowered by the R library paramlink. The above approach is applied to a large and important data set: 'The missing grandchildren of Argentina'. We evaluate the power of 196 families from the DNA reference databank (Banco Nacional de Datos Genéticos, http://www.bndg.gob.ar). As a result, we show that 58 of the families have poor statistical power and require additional genetic data to enable a positive identification. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Accessible Information Without Disturbing Partially Known Quantum States on a von Neumann Algebra

    NASA Astrophysics Data System (ADS)

    Kuramochi, Yui

    2018-04-01

    This paper addresses the problem of how much information we can extract without disturbing a statistical experiment, which is a family of partially known normal states on a von Neumann algebra. We define the classical part of a statistical experiment as the restriction of the equivalent minimal sufficient statistical experiment to the center of the outcome space, which, in the case of density operators on a Hilbert space, corresponds to the classical probability distributions appearing in the maximal decomposition by Koashi and Imoto (Phys. Rev. A 66, 022,318 2002). We show that we can access by a Schwarz or completely positive channel at most the classical part of a statistical experiment if we do not disturb the states. We apply this result to the broadcasting problem of a statistical experiment. We also show that the classical part of the direct product of statistical experiments is the direct product of the classical parts of the statistical experiments. The proof of the latter result is based on the theorem that the direct product of minimal sufficient statistical experiments is also minimal sufficient.

  8. Statistical parity-time-symmetric lasing in an optical fibre network.

    PubMed

    Jahromi, Ali K; Hassan, Absar U; Christodoulides, Demetrios N; Abouraddy, Ayman F

    2017-11-07

    Parity-time (PT)-symmetry in optics is a condition whereby the real and imaginary parts of the refractive index across a photonic structure are deliberately balanced. This balance can lead to interesting optical phenomena, such as unidirectional invisibility, loss-induced lasing, single-mode lasing from multimode resonators, and non-reciprocal effects in conjunction with nonlinearities. Because PT-symmetry has been thought of as fragile, experimental realisations to date have usually been restricted to on-chip micro-devices. Here, we demonstrate that certain features of PT-symmetry are sufficiently robust to survive the statistical fluctuations associated with a macroscopic optical cavity. We examine the lasing dynamics in optical fibre-based coupled cavities more than a kilometre in length with balanced gain and loss. Although fluctuations can detune the cavity by more than the free spectral range, the behaviour of the lasing threshold and the laser power is that expected from a PT-stable system. Furthermore, we observe a statistical symmetry breaking upon varying the cavity loss.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Xin, E-mail: xinshih86029@gmail.com; Zhao, Xiangmo, E-mail: xinshih86029@gmail.com; Hui, Fei, E-mail: xinshih86029@gmail.com

    Clock synchronization in wireless sensor networks (WSNs) has been studied extensively in recent years and many protocols are put forward based on the point of statistical signal processing, which is an effective way to optimize accuracy. However, the accuracy derived from the statistical data can be improved mainly by sufficient packets exchange, which will consume the limited power resources greatly. In this paper, a reliable clock estimation using linear weighted fusion based on pairwise broadcast synchronization is proposed to optimize sync accuracy without expending additional sync packets. As a contribution, a linear weighted fusion scheme for multiple clock deviations is constructed with the collaborative sensing of clock timestamp. And the fusion weight is defined by the covariance of sync errors for different clock deviations. Extensive simulation results show that the proposed approach can achieve better performance in terms of sync overhead and sync accuracy.

  10. Bootstrapping in a language of thought: a formal model of numerical concept learning.

    PubMed

    Piantadosi, Steven T; Tenenbaum, Joshua B; Goodman, Noah D

    2012-05-01

    In acquiring number words, children exhibit a qualitative leap in which they transition from understanding a few number words, to possessing a rich system of interrelated numerical concepts. We present a computational framework for understanding this inductive leap as the consequence of statistical inference over a sufficiently powerful representational system. We provide an implemented model that is powerful enough to learn number word meanings and other related conceptual systems from naturalistic data. The model shows that bootstrapping can be made computationally and philosophically well-founded as a theory of number learning. Our approach demonstrates how learners may combine core cognitive operations to build sophisticated representations during the course of development, and how this process explains observed developmental patterns in number word learning. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. An empirical approach to sufficient similarity in dose-responsiveness: Utilization of statistical distance as a similarity measure.

    EPA Science Inventory

    Using statistical equivalence testing logic and mixed model theory an approach has been developed, that extends the work of Stork et al (JABES,2008), to define sufficient similarity in dose-response for chemical mixtures containing the same chemicals with different ratios ...

  12. On the brain structure heterogeneity of autism: Parsing out acquisition site effects with significance-weighted principal component analysis.

    PubMed

    Martinez-Murcia, Francisco Jesús; Lai, Meng-Chuan; Górriz, Juan Manuel; Ramírez, Javier; Young, Adam M H; Deoni, Sean C L; Ecker, Christine; Lombardo, Michael V; Baron-Cohen, Simon; Murphy, Declan G M; Bullmore, Edward T; Suckling, John

    2017-03-01

    Neuroimaging studies have reported structural and physiological differences that could help understand the causes and development of Autism Spectrum Disorder (ASD). Many of them rely on multisite designs, with the recruitment of larger samples increasing statistical power. However, recent large-scale studies have put some findings into question, considering the results to be strongly dependent on the database used, and demonstrating the substantial heterogeneity within this clinically defined category. One major source of variance may be the acquisition of the data in multiple centres. In this work we analysed the differences found in the multisite, multi-modal neuroimaging database from the UK Medical Research Council Autism Imaging Multicentre Study (MRC AIMS) in terms of both diagnosis and acquisition sites. Since the dissimilarities between sites were higher than between diagnostic groups, we developed a technique called Significance Weighted Principal Component Analysis (SWPCA) to reduce the undesired intensity variance due to acquisition site and to increase the statistical power in detecting group differences. After eliminating site-related variance, statistically significant group differences were found, including Broca's area and the temporo-parietal junction. However, discriminative power was not sufficient to classify diagnostic groups, yielding accuracies close to random. Our work supports recent claims that ASD is a highly heterogeneous condition that is difficult to globally characterize by neuroimaging, and therefore different (and more homogenous) subgroups should be defined to obtain a deeper understanding of ASD. Hum Brain Mapp 38:1208-1223, 2017. © 2016 Wiley Periodicals, Inc.

  13. The problem is not just sample size: the consequences of low base rates in policing experiments in smaller cities.

    PubMed

    Hinkle, Joshua C; Weisburd, David; Famega, Christine; Ready, Justin

    2013-01-01

    Hot spots policing is one of the most influential police innovations, with a strong body of experimental research showing it to be effective in reducing crime and disorder. However, most studies have been conducted in major cities, and we thus know little about whether it is effective in smaller cities, which account for a majority of police agencies. The lack of experimental studies in smaller cities is likely in part due to challenges designing statistically powerful tests in such contexts. The current article explores the challenges of statistical power and "noise" resulting from low base rates of crime in smaller cities and provides suggestions for future evaluations to overcome these limitations. Data from a randomized experimental evaluation of broken windows policing in hot spots are used to illustrate the challenges that low base rates present for evaluating hot spots policing programs in smaller cities. Analyses show low base rates make it difficult to detect treatment effects. Very large effect sizes would be required to reach sufficient power, and random fluctuations around low base rates make detecting treatment effects difficult, irrespective of power, by masking differences between treatment and control groups. Low base rates present strong challenges to researchers attempting to evaluate hot spots policing in smaller cities. As such, base rates must be taken directly into account when designing experimental evaluations. The article offers suggestions for researchers attempting to expand the examination of hot spots policing and other microplace-based interventions to smaller jurisdictions.
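
    The base-rate problem can be made concrete with a small simulation (Python; the number of hot spots, baseline crime counts, and effect size are assumptions, not data from the cited experiment): the same 30% reduction that is readily detected at high-crime hot spots is mostly invisible when baseline counts are low.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n_spots, effect, reps, alpha = 25, 0.30, 2000, 0.05   # assumed design: 25 hot spots per arm

    def power(base_rate):
        hits = 0
        for _ in range(reps):
            control = rng.poisson(base_rate, n_spots)
            treated = rng.poisson(base_rate * (1 - effect), n_spots)
            hits += stats.ttest_ind(control, treated, alternative="greater").pvalue < alpha
        return hits / reps

    for base_rate in (1, 5, 20):   # mean crimes per hot spot over the study period (assumed)
        print(f"baseline of {base_rate:>2} crimes per spot: power ~ {power(base_rate):.2f}")
    ```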

  14. Kidney function endpoints in kidney transplant trials: a struggle for power.

    PubMed

    Ibrahim, A; Garg, A X; Knoll, G A; Akbari, A; White, C A

    2013-03-01

    Kidney function endpoints are commonly used in randomized controlled trials (RCTs) in kidney transplantation (KTx). We conducted this study to estimate the proportion of ongoing RCTs with kidney function endpoints in KTx where the proposed sample size is large enough to detect meaningful differences in glomerular filtration rate (GFR) with adequate statistical power. RCTs were retrieved using the key word "kidney transplantation" from the National Institute of Health online clinical trial registry. Included trials had at least one measure of kidney function tracked for at least 1 month after transplant. We determined the proportion of two-arm parallel trials that had sufficient sample sizes to detect a minimum 5, 7.5 and 10 mL/min difference in GFR between arms. Fifty RCTs met inclusion criteria. Only 7% of the trials were above a sample size of 562, the number needed to detect a minimum 5 mL/min difference between the groups should one exist (assumptions: α = 0.05; power = 80%, 10% loss to follow-up, common standard deviation of 20 mL/min). The result increased modestly to 36% of trials when a minimum 10 mL/min difference was considered. Only a minority of ongoing trials have adequate statistical power to detect between-group differences in kidney function using conventional sample size estimating parameters. For this reason, some potentially effective interventions which ultimately could benefit patients may be abandoned from future assessment. © Copyright 2013 The American Society of Transplantation and the American Society of Transplant Surgeons.
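
    The 562-patient figure follows from a standard two-sample calculation. The sketch below (Python, normal approximation, using the assumptions stated in the abstract: SD 20 mL/min, α = 0.05, 80% power, 10% loss to follow-up) gives about 560 patients in total for a 5 mL/min difference, matching the quoted number to within rounding conventions.

    ```python
    import math
    from scipy.stats import norm

    def total_n(delta, sd=20.0, alpha=0.05, power=0.80, loss=0.10):
        """Total patients (both arms) needed to detect a GFR difference of `delta` mL/min."""
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        per_arm = 2 * (z * sd / delta) ** 2
        return 2 * math.ceil(per_arm / (1 - loss))        # inflate for loss to follow-up

    for delta in (5, 7.5, 10):
        print(f"{delta:>4} mL/min: {total_n(delta)} patients in total")
    ```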

  15. Cognitive and Neural Correlates of Mathematical Giftedness in Adults and Children: A Review

    PubMed Central

    Myers, Timothy; Carey, Emma; Szűcs, Dénes

    2017-01-01

    Most mathematical cognition research has focused on understanding normal adult function and child development as well as mildly and moderately impaired mathematical skill, often labeled developmental dyscalculia and/or mathematical learning disability. In contrast, much less research is available on cognitive and neural correlates of gifted/excellent mathematical knowledge in adults and children. In order to facilitate further inquiry into this area, here we review 40 available studies, which examine the cognitive and neural basis of gifted mathematics. Studies associated a large number of cognitive factors with gifted mathematics, with spatial processing and working memory being the most frequently identified contributors. However, the current literature suffers from low statistical power, which most probably contributes to variability across findings. Other major shortcomings include failing to establish domain and stimulus specificity of findings, suggesting causation without sufficient evidence and the frequent use of invalid backward inference in neuro-imaging studies. Future studies must increase statistical power and neuro-imaging studies must rely on supporting behavioral data when interpreting findings. Studies should investigate the factors shown to correlate with math giftedness in a more specific manner and determine exactly how individual factors may contribute to gifted math ability. PMID:29118725

  16. Spatial Pattern Classification for More Accurate Forecasting of Variable Energy Resources

    NASA Astrophysics Data System (ADS)

    Novakovskaia, E.; Hayes, C.; Collier, C.

    2014-12-01

    The accuracy of solar and wind forecasts is becoming increasingly essential as grid operators continue to integrate additional renewable generation onto the electric grid. Forecast errors affect rate payers, grid operators, wind and solar plant maintenance crews and energy traders through increases in prices, project down time or lost revenue. While extensive and beneficial efforts were undertaken in recent years to improve physical weather models for a broad spectrum of applications, these improvements have generally not been sufficient to meet the accuracy demands of system planners. For renewables, these models are often used in conjunction with additional statistical models utilizing both meteorological observations and the power generation data. Forecast accuracy can be dependent on specific weather regimes for a given location. To account for these dependencies, it is important that parameterizations used in statistical models change as the regime changes. An automated tool, based on an artificial neural network model, has been developed to identify different weather regimes as they impact power output forecast accuracy at wind or solar farms. In this study, improvements in forecast accuracy were analyzed for varying time horizons for wind farms and utility-scale PV plants located in different geographical regions.

  17. Standardised method of determining vibratory perception thresholds for diagnosis and screening in neurological investigation.

    PubMed Central

    Goldberg, J M; Lindblom, U

    1979-01-01

    Vibration threshold determinations were made by means of an electromagnetic vibrator at three sites (carpal, tibial, and tarsal), which were primarily selected for examining patients with polyneuropathy. Because of the vast variation demonstrated for both vibrator output and tissue damping, the thresholds were expressed in terms of amplitude of stimulator movement measured by means of an accelerometer, instead of applied voltage which is commonly used. Statistical analysis revealed a higher power of discrimination for amplitude measurements at all three stimulus sites. Digital read-out gave the best statistical result and was also most practical. Reference values obtained from 110 healthy males, 10 to 74 years of age, were highly correlated with age for both upper and lower extremities. The variance of the vibration perception threshold was less than that of the disappearance threshold, and determination of the perception threshold alone may be sufficient in most cases. PMID:501379

  18. Goodness-of-fit tests for open capture-recapture models

    USGS Publications Warehouse

    Pollock, K.H.; Hines, J.E.; Nichols, J.D.

    1985-01-01

    General goodness-of-fit tests for the Jolly-Seber model are proposed. These tests are based on conditional arguments using minimal sufficient statistics. The tests are shown to be of simple hypergeometric form so that a series of independent contingency table chi-square tests can be performed. The relationship of these tests to other proposed tests is discussed. This is followed by a simulation study of the power of the tests to detect departures from the assumptions of the Jolly-Seber model. Some meadow vole capture-recapture data are used to illustrate the testing procedure which has been implemented in a computer program available from the authors.
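
    One component of such a test series is a contingency table comparing subsequent recaptures between animals that differ only in capture history. The sketch below (Python, with invented counts) runs the chi-square test and, for sparse tables, its hypergeometric (Fisher exact) counterpart.

    ```python
    import numpy as np
    from scipy import stats

    # Animals released on the same occasion, split by prior capture history (rows)
    # versus whether they were ever recaptured later (columns); counts are invented.
    table = np.array([[34, 66],     # newly marked:      recaptured / never recaptured
                      [45, 55]])    # previously marked: recaptured / never recaptured

    chi2, p, dof, _ = stats.chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")
    odds_ratio, fisher_p = stats.fisher_exact(table)      # hypergeometric form for small counts
    print(f"Fisher exact p = {fisher_p:.3f}")
    ```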

  19. Geoscience Education Research Methods: Thinking About Sample Size

    NASA Astrophysics Data System (ADS)

    Slater, S. J.; Slater, T. F.; CenterAstronomy; Physics Education Research

    2011-12-01

    Geoscience education research is at a critical point in which conditions are sufficient to propel our field forward toward meaningful improvements in geosciences education practices. Our field has now reached a point where the outcomes of our research are deemed important to end-users and funding agencies, and where we now have a large number of scientists who are either formally trained in geosciences education research, or who have dedicated themselves to excellence in this domain. At this point we now must collectively work through our epistemology, our rules of what methodologies will be considered sufficiently rigorous, and what data and analysis techniques will be acceptable for constructing evidence. In particular, we have to work out our answer to that most difficult of research questions: "How big should my 'N' be??" This paper presents a very brief answer to that question, addressing both quantitative and qualitative methodologies. Research question/methodology alignment, effect size and statistical power will be discussed, in addition to a defense of the notion that bigger is not always better.

  20. Georg Rasch and Benjamin Wright's Struggle with the Unidimensional Polytomous Model with Sufficient Statistics

    ERIC Educational Resources Information Center

    Andrich, David

    2016-01-01

    This article reproduces correspondence between Georg Rasch of The University of Copenhagen and Benjamin Wright of The University of Chicago in the period from January 1966 to July 1967. This correspondence reveals their struggle to operationalize a unidimensional measurement model with sufficient statistics for responses in a set of ordered…

  1. Universal Power Law Governing Pedestrian Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karamouzas, Ioannis; Skinner, Brian; Guy, Stephen J.

    2014-12-01

    Human crowds often bear a striking resemblance to interacting particle systems, and this has prompted many researchers to describe pedestrian dynamics in terms of interaction forces and potential energies. The correct quantitative form of this interaction, however, has remained an open question. Here, we introduce a novel statistical-mechanical approach to directly measure the interaction energy between pedestrians. This analysis, when applied to a large collection of human motion data, reveals a simple power-law interaction that is based not on the physical separation between pedestrians but on their projected time to a potential future collision, and is therefore fundamentally anticipatory in nature. Remarkably, this simple law is able to describe human interactions across a wide variety of situations, speeds, and densities. We further show, through simulations, that the interaction law we identify is sufficient to reproduce many known crowd phenomena.

  2. Statistical variances of diffusional properties from ab initio molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    He, Xingfeng; Zhu, Yizhou; Epstein, Alexander; Mo, Yifei

    2018-12-01

    Ab initio molecular dynamics (AIMD) simulation is widely employed in studying diffusion mechanisms and in quantifying diffusional properties of materials. However, AIMD simulations are often limited to a few hundred atoms and a short, sub-nanosecond physical timescale, which leads to models that include only a limited number of diffusion events. As a result, the diffusional properties obtained from AIMD simulations are often plagued by poor statistics. In this paper, we re-examine the process to estimate diffusivity and ionic conductivity from the AIMD simulations and establish the procedure to minimize the fitting errors. In addition, we propose methods for quantifying the statistical variance of the diffusivity and ionic conductivity from the number of diffusion events observed during the AIMD simulation. Since an adequate number of diffusion events must be sampled, AIMD simulations should be sufficiently long and can only be performed on materials with reasonably fast diffusion. We chart the ranges of materials and physical conditions that can be accessible by AIMD simulations in studying diffusional properties. Our work provides the foundation for quantifying the statistical confidence levels of diffusion results from AIMD simulations and for correctly employing this powerful technique.
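
    A minimal sketch of the standard diffusivity estimate that the paper re-examines: fit the mean squared displacement (MSD) against time and take D = slope/(2d). The trajectory below is a synthetic random walk standing in for unwrapped AIMD ion positions, and the paper's variance-quantification procedure itself is not reproduced.

```python
# Minimal sketch of the standard diffusivity estimate: fit MSD versus time
# and use D = slope / (2 * d) in d spatial dimensions. The random-walk
# trajectory below is synthetic and stands in for unwrapped AIMD positions.
import numpy as np

rng = np.random.default_rng(0)
dt = 2e-15                                   # timestep in seconds (assumed)
steps, n_ions, dim = 5000, 20, 3
traj = np.cumsum(rng.normal(scale=1e-11, size=(steps, n_ions, dim)), axis=0)

lags = np.arange(1, 500)
msd = np.array([np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=-1))
                for lag in lags])            # MSD averaged over ions and origins

slope, _ = np.polyfit(lags * dt, msd, 1)     # linear fit of MSD vs time
D = slope / (2 * dim)
print(f"estimated diffusivity D = {D:.3e} m^2/s")
# Repeating the fit over independent trajectory segments gives a feel for
# the statistical variance that the paper quantifies.
```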

  3. New Insights into Handling Missing Values in Environmental Epidemiological Studies

    PubMed Central

    Roda, Célina; Nicolis, Ioannis; Momas, Isabelle; Guihenneuc, Chantal

    2014-01-01

    Missing data are unavoidable in environmental epidemiologic surveys. The aim of this study was to compare methods for handling large amounts of missing values: omission of missing values, single and multiple imputations (through linear regression or partial least squares regression), and a fully Bayesian approach. These methods were applied to the PARIS birth cohort, where indoor domestic pollutant measurements were performed in a random sample of babies' dwellings. A simulation study was conducted to assess the performance of the different approaches with a high proportion of missing values (from 50% to 95%). Different simulation scenarios were carried out, controlling the true value of the association (odds ratio of 1.0, 1.2, and 1.4), and varying the health outcome prevalence. When a large amount of data is missing, omitting these missing data reduced statistical power and inflated standard errors, which affected the significance of the association. Single imputation underestimated the variability, and considerably increased risk of type I error. All approaches were conservative, except the Bayesian joint model. In the case of a common health outcome, the fully Bayesian approach is the most efficient approach (low root mean square error, reasonable type I error, and high statistical power). Nevertheless, for a less prevalent event, the type I error is increased and the statistical power is reduced. The estimated posterior distribution of the OR is useful to refine the conclusion. Among the methods for handling missing values, no single approach is best in all respects, but when the usual approaches (e.g., single imputation) are not sufficient, jointly modelling the missingness process and the health association is more efficient when large amounts of data are missing. PMID:25226278

  4. The Global Error Assessment (GEA) model for the selection of differentially expressed genes in microarray data.

    PubMed

    Mansourian, Robert; Mutch, David M; Antille, Nicolas; Aubert, Jerome; Fogel, Paul; Le Goff, Jean-Marc; Moulin, Julie; Petrov, Anton; Rytz, Andreas; Voegel, Johannes J; Roberts, Matthew-Alan

    2004-11-01

    Microarray technology has become a powerful research tool in many fields of study; however, the cost of microarrays often results in the use of a low number of replicates (k). Under circumstances where k is low, it becomes difficult to perform standard statistical tests to extract the most biologically significant experimental results. Other more advanced statistical tests have been developed; however, their use and interpretation often remain difficult to implement in routine biological research. The present work outlines a method that achieves sufficient statistical power for selecting differentially expressed genes under conditions of low k, while remaining an intuitive and computationally efficient procedure. The present study describes a Global Error Assessment (GEA) methodology to select differentially expressed genes in microarray datasets, and was developed using an in vitro experiment that compared control and interferon-gamma treated skin cells. In this experiment, up to nine replicates were used to confidently estimate error, thereby enabling methods of different statistical power to be compared. Genes with similar absolute expression levels are binned, enabling a highly accurate local estimate of the mean squared error within conditions. The model then relates variability of gene expression in each bin to absolute expression levels and uses this in a test derived from the classical ANOVA. The GEA selection method is compared with both the classical and permutational ANOVA tests, and demonstrates increased stability, robustness and confidence in gene selection. A subset of the selected genes was validated by real-time reverse transcription-polymerase chain reaction (RT-PCR). All these results suggest that the GEA methodology is (i) suitable for selection of differentially expressed genes in microarray data, (ii) intuitive and computationally efficient and (iii) especially advantageous under conditions of low k. The GEA code for R software is freely available upon request to the authors.
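
    A minimal sketch of the binning idea the abstract describes (pool genes of similar absolute expression to obtain a local error estimate, then use it in an ANOVA-like test), run on synthetic data; this is not the published GEA code, and the degrees-of-freedom treatment is deliberately simplified.

```python
# Minimal sketch of the binning idea behind GEA: pool genes with similar
# absolute expression to obtain a local error estimate, then use that pooled
# variance in an ANOVA-like test. Data are synthetic; this is not the
# published GEA implementation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_genes, k = 2000, 3                               # k replicates per condition
mu = rng.uniform(6.0, 12.0, n_genes)               # gene-specific expression levels
control = mu[:, None] + rng.normal(0.0, 0.3, (n_genes, k))
treated = mu[:, None] + rng.normal(0.0, 0.3, (n_genes, k))
treated[:50] += 1.5                                # 50 truly changed genes

mean_expr = np.hstack([control, treated]).mean(axis=1)
edges = np.quantile(mean_expr, np.linspace(0, 1, 21))       # 20 bins (assumed)
bin_id = np.clip(np.digitize(mean_expr, edges) - 1, 0, 19)

# residual variance within conditions, pooled over all genes of a bin
resid = np.hstack([control - control.mean(axis=1, keepdims=True),
                   treated - treated.mean(axis=1, keepdims=True)])
gene_mse = (resid ** 2).sum(axis=1) / (2 * (k - 1))
binned_mse = np.array([gene_mse[bin_id == b].mean() for b in range(20)])

# ANOVA-like statistic with the locally pooled error in the denominator;
# because the pooled error rests on many genes, it is treated here as known.
diff = treated.mean(axis=1) - control.mean(axis=1)
stat = diff ** 2 / (binned_mse[bin_id] * 2.0 / k)
p = stats.chi2.sf(stat, df=1)
print("genes flagged at p < 1e-4:", int((p < 1e-4).sum()), "of 50 truly changed")
```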

  5. Performance analysis of optimal power allocation in wireless cooperative communication systems

    NASA Astrophysics Data System (ADS)

    Babikir Adam, Edriss E.; Samb, Doudou; Yu, Li

    2013-03-01

    Cooperative communication has recently been proposed for wireless communication systems to exploit the inherent spatial diversity of relay channels. Amplify-and-Forward (AF) cooperation protocols with multiple relays have not been sufficiently investigated, even though they have low implementation complexity. In this work we consider a cooperative diversity system in which a source transmits information to a destination with the help of multiple relay nodes using AF protocols, and we investigate optimal power allocation between the source and the relays by optimizing the symbol error rate (SER) performance in an efficient way. First, we derive a closed-form SER formulation for MPSK signals using the moment generating function and statistical approximations valid at high signal-to-noise ratio (SNR) for the system under study. We then find a tight corresponding lower bound, which converges to the same limit as the theoretical upper bound, and develop an optimal power allocation (OPA) technique based on mean channel gains to minimize the SER. Simulation results show that our scheme outperforms the equal power allocation (EPA) scheme and closely matches the theoretical approximation based on the SER upper bound at high SNR for different numbers of relays.

  6. Statistics and topology of the COBE differential microwave radiometer first-year sky maps

    NASA Technical Reports Server (NTRS)

    Smoot, G. F.; Tenorio, L.; Banday, A. J.; Kogut, A.; Wright, E. L.; Hinshaw, G.; Bennett, C. L.

    1994-01-01

    We use statistical and topological quantities to test the Cosmic Background Explorer (COBE) Differential Microwave Radiometer (DMR) first-year sky maps against the hypothesis that the observed temperature fluctuations reflect Gaussian initial density perturbations with random phases. Recent papers discuss specific quantities as discriminators between Gaussian and non-Gaussian behavior, but the treatment of instrumental noise on the data is largely ignored. The presence of noise in the data biases many statistical quantities in a manner dependent on both the noise properties and the unknown cosmic microwave background temperature field. Appropriate weighting schemes can minimize this effect, but it cannot be completely eliminated. Analytic expressions are presented for these biases, and Monte Carlo simulations are used to assess the best strategy for determining cosmologically interesting information from noisy data. The genus is a robust discriminator that can be used to estimate the power-law quadrupole-normalized amplitude, Q_rms-PS, independently of the two-point correlation function. The genus of the DMR data is consistent with Gaussian initial fluctuations with Q_rms-PS = (15.7 ± 2.2) - (6.6 ± 0.3)(n - 1) μK, where n is the power-law index. Fitting the rms temperature variations at various smoothing angles gives Q_rms-PS = 13.2 ± 2.5 μK and n = 1.7 (+0.3/-0.6). While consistent with Gaussian fluctuations, the first-year data are only sufficient to rule out strongly non-Gaussian distributions of fluctuations.

  7. Protein abundances can distinguish between naturally-occurring and laboratory strains of Yersinia pestis, the causative agent of plague

    DOE PAGES

    Merkley, Eric D.; Sego, Landon H.; Lin, Andy; ...

    2017-08-30

    Adaptive processes in bacterial species can occur rapidly in laboratory culture, leading to genetic divergence between naturally occurring and laboratory-adapted strains. Differentiating wild and closely-related laboratory strains is clearly important for biodefense and bioforensics; however, DNA sequence data alone has thus far not provided a clear signature, perhaps due to lack of understanding of how diverse genome changes lead to adapted phenotypes. Protein abundance profiles from mass spectrometry-based proteomics analyses are a molecular measure of phenotype. Proteomics data contains sufficient information that powerful statistical methods can uncover signatures that distinguish wild strains of Yersinia pestis from laboratory-adapted strains.

  8. Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling

    PubMed Central

    Wood, John

    2017-01-01

    Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered—some very seriously so—but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. PMID:28706080
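
    A minimal sketch of the mixture-modeling step described above: fit Gaussian mixtures to a set of study-level power estimates and choose the number of components by BIC. The power values are synthetic, not the Button et al. (2013) data.

```python
# Minimal sketch: fit Gaussian mixtures to study-level power estimates and
# pick the number of components by BIC, as in the mixture-modeling reanalysis.
# The "power" values below are synthetic, not the Button et al. (2013) data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
power = np.concatenate([rng.beta(2, 12, 400),      # a low-power subgroup
                        rng.beta(12, 3, 330)])     # a well-powered subgroup
X = power.reshape(-1, 1)

models = {k: GaussianMixture(n_components=k, random_state=0).fit(X)
          for k in range(1, 5)}
bic = {k: m.bic(X) for k, m in models.items()}
best_k = min(bic, key=bic.get)
print("BIC by k:", {k: round(v, 1) for k, v in bic.items()})
print("selected components:", best_k,
      "with means", models[best_k].means_.ravel().round(2))
```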

  9. High-Dimensional Multivariate Repeated Measures Analysis with Unequal Covariance Matrices.

    PubMed

    Harrar, Solomon W; Kong, Xiaoli

    2015-03-01

    In this paper, test statistics for repeated measures designs are introduced when the dimension is large. By large dimension it is meant that the number of repeated measures and the total sample size grow together, but either one could be larger than the other. Asymptotic distributions of the statistics are derived for the equal as well as unequal covariance cases in the balanced as well as unbalanced cases. The asymptotic framework considered requires proportional growth of the sample sizes and the dimension of the repeated measures in the unequal covariance case. In the equal covariance case, one can grow at a much faster rate than the other. The derivations of the asymptotic distributions mimic that of the Central Limit Theorem with some important peculiarities addressed with sufficient rigor. Consistent and unbiased estimators of the asymptotic variances, which make efficient use of all the observations, are also derived. A simulation study provides favorable evidence for the accuracy of the asymptotic approximation under the null hypothesis. Power simulations have shown that the new methods have comparable power with a popular method known to work well in low-dimensional situations, but the new methods show an enormous advantage when the dimension is large. Data from an electroencephalography (EEG) experiment are analyzed to illustrate the application of the results.

  10. High-Dimensional Multivariate Repeated Measures Analysis with Unequal Covariance Matrices

    PubMed Central

    Harrar, Solomon W.; Kong, Xiaoli

    2015-01-01

    In this paper, test statistics for repeated measures designs are introduced when the dimension is large. By large dimension it is meant that the number of repeated measures and the total sample size grow together, but either one could be larger than the other. Asymptotic distributions of the statistics are derived for the equal as well as unequal covariance cases in the balanced as well as unbalanced cases. The asymptotic framework considered requires proportional growth of the sample sizes and the dimension of the repeated measures in the unequal covariance case. In the equal covariance case, one can grow at a much faster rate than the other. The derivations of the asymptotic distributions mimic that of the Central Limit Theorem with some important peculiarities addressed with sufficient rigor. Consistent and unbiased estimators of the asymptotic variances, which make efficient use of all the observations, are also derived. A simulation study provides favorable evidence for the accuracy of the asymptotic approximation under the null hypothesis. Power simulations have shown that the new methods have comparable power with a popular method known to work well in low-dimensional situations, but the new methods show an enormous advantage when the dimension is large. Data from an electroencephalography (EEG) experiment are analyzed to illustrate the application of the results. PMID:26778861

  11. Discrete Scale Invariance of Human Large EEG Voltage Deflections is More Prominent in Waking than Sleep Stage 2.

    PubMed

    Zorick, Todd; Mandelkern, Mark A

    2015-01-01

    Electroencephalography (EEG) is typically viewed through the lens of spectral analysis. Recently, multiple lines of evidence have demonstrated that the underlying neuronal dynamics are characterized by scale-free avalanches. These results suggest that techniques from statistical physics may be used to analyze EEG signals. We utilized a publicly available database of fourteen subjects with waking and sleep stage 2 EEG tracings per subject, and observe that power-law dynamics of critical-state neuronal avalanches are not sufficient to fully describe essential features of EEG signals. We hypothesized that this could reflect the phenomenon of discrete scale invariance (DSI) in EEG large voltage deflections (LVDs) as being more prominent in waking consciousness. We isolated LVDs, and analyzed logarithmically transformed LVD size probability density functions (PDF) to assess for DSI. We find evidence of increased DSI in waking, as opposed to sleep stage 2 consciousness. We also show that the signatures of DSI are specific for EEG LVDs, and not a general feature of fractal simulations with similar statistical properties to EEG. Removing only LVDs from waking EEG produces a reduction in power in the alpha and beta frequency bands. These findings may represent a new insight into the understanding of the cortical dynamics underlying consciousness.

  12. Characterizing multi-scale self-similar behavior and non-statistical properties of fluctuations in financial time series

    NASA Astrophysics Data System (ADS)

    Ghosh, Sayantan; Manimaran, P.; Panigrahi, Prasanta K.

    2011-11-01

    We make use of the wavelet transform to study the multi-scale, self-similar behavior, and deviations thereof, in the stock prices of large companies belonging to different economic sectors. The stock market returns exhibit multi-fractal characteristics, with some of the companies showing deviations at small and large scales. The fact that wavelets belonging to the Daubechies (Db) basis enable one to isolate local polynomial trends of different degrees plays the key role in isolating fluctuations at different scales. One of the primary motivations of this work is to study the emergence of the k^-3 behavior [X. Gabaix, P. Gopikrishnan, V. Plerou, H. Stanley, A theory of power law distributions in financial market fluctuations, Nature 423 (2003) 267-270] of the fluctuations starting with high frequency fluctuations. We make use of Db4 and Db6 basis sets to respectively isolate local linear and quadratic trends at different scales in order to study the statistical characteristics of these financial time series. The fluctuations reveal fat-tailed non-Gaussian behavior and unstable periodic modulations at finer scales, from which the characteristic k^-3 power-law behavior emerges at sufficiently large scales. We further identify stable periodic behavior through the continuous Morlet wavelet.

  13. Evaluating and Reporting Statistical Power in Counseling Research

    ERIC Educational Resources Information Center

    Balkin, Richard S.; Sheperis, Carl J.

    2011-01-01

    Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…

  14. Statistical analyses to support guidelines for marine avian sampling. Final report

    USGS Publications Warehouse

    Kinlan, Brian P.; Zipkin, Elise; O'Connell, Allan F.; Caldow, Chris

    2012-01-01

    Interest in development of offshore renewable energy facilities has led to a need for high-quality, statistically robust information on marine wildlife distributions. A practical approach is described to estimate the amount of sampling effort required to have sufficient statistical power to identify species-specific “hotspots” and “coldspots” of marine bird abundance and occurrence in an offshore environment divided into discrete spatial units (e.g., lease blocks), where “hotspots” and “coldspots” are defined relative to a reference (e.g., regional) mean abundance and/or occurrence probability for each species of interest. For example, a location with average abundance or occurrence that is three times larger than the mean (3x effect size) could be defined as a “hotspot,” and a location that is three times smaller than the mean (1/3x effect size) as a “coldspot.” The choice of the effect size used to define hot and coldspots will generally depend on a combination of ecological and regulatory considerations. A method is also developed for testing the statistical significance of possible hotspots and coldspots. Both methods are illustrated with historical seabird survey data from the USGS Avian Compendium Database. Our approach consists of five main components: 1. A review of the primary scientific literature on statistical modeling of animal group size and avian count data to develop a candidate set of statistical distributions that have been used or may be useful to model seabird counts. 2. Statistical power curves for one-sample, one-tailed Monte Carlo significance tests of differences of observed small-sample means from a specified reference distribution. These curves show the power to detect "hotspots" or "coldspots" of occurrence and abundance at a range of effect sizes, given assumptions which we discuss. 3. A model selection procedure, based on maximum likelihood fits of models in the candidate set, to determine an appropriate statistical distribution to describe counts of a given species in a particular region and season. 4. Using a large database of historical at-sea seabird survey data, we applied this technique to identify appropriate statistical distributions for modeling a variety of species, allowing the distribution to vary by season. For each species and season, we used the selected distribution to calculate and map retrospective statistical power to detect hotspots and coldspots, and map p-values from Monte Carlo significance tests of hotspots and coldspots, in discrete lease blocks designated by the U.S. Department of Interior, Bureau of Ocean Energy Management (BOEM). 5. Because our definition of hotspots and coldspots does not explicitly include variability over time, we examine the relationship between the temporal scale of sampling and the proportion of variance captured in time series of key environmental correlates of marine bird abundance, as well as available marine bird abundance time series, and use these analyses to develop recommendations for the temporal distribution of sampling to adequately represent both short-term and long-term variability. We conclude by presenting a schematic “decision tree” showing how this power analysis approach would fit in a general framework for avian survey design, and discuss implications of model assumptions and results. We discuss avenues for future development of this work, and recommendations for practical implementation in the context of siting and wildlife assessment for offshore renewable energy development projects.
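
    A minimal sketch of the one-sample, one-tailed Monte Carlo significance test described above, applied to a small-sample mean count against a reference distribution, together with a power estimate for a 3x hotspot; the negative-binomial reference and its parameters are illustrative assumptions, not fitted survey data.

```python
# Minimal sketch of the one-sample, one-tailed Monte Carlo test described
# above: compare a small-sample mean count against a reference (regional)
# distribution, and estimate power to flag a 3x "hotspot". The negative
# binomial reference and its parameters are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_surveys, n_sims = 10, 5000
ref = stats.nbinom(n=1.0, p=1.0 / 3.0)         # reference mean = 2 birds/survey

def mc_pvalue(observed_mean):
    """P(reference sample mean >= observed mean), estimated by simulation."""
    sim_means = ref.rvs(size=(n_sims, n_surveys), random_state=rng).mean(axis=1)
    return (np.sum(sim_means >= observed_mean) + 1) / (n_sims + 1)

# Power: how often does a true 3x hotspot yield p < 0.05 with n_surveys visits?
hot = stats.nbinom(n=1.0, p=1.0 / 7.0)         # mean = 6, i.e. 3x the reference
rejections = sum(
    mc_pvalue(hot.rvs(size=n_surveys, random_state=rng).mean()) < 0.05
    for _ in range(200))
print(f"estimated power at 3x effect size: {rejections / 200:.2f}")
```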

  15. Density-based empirical likelihood procedures for testing symmetry of data distributions and K-sample comparisons.

    PubMed

    Vexler, Albert; Tanajian, Hovig; Hutson, Alan D

    In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies to a test for symmetry of data distributions and compares K-sample distributions. Recognizing that recent statistical software packages do not sufficiently address K-sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p-values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p-value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p-value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.

  16. Modeling Traffic on the Web Graph

    NASA Astrophysics Data System (ADS)

    Meiss, Mark R.; Gonçalves, Bruno; Ramasco, José J.; Flammini, Alessandro; Menczer, Filippo

    Analysis of aggregate and individual Web requests shows that PageRank is a poor predictor of traffic. We use empirical data to characterize properties of Web traffic not reproduced by Markovian models, including both aggregate statistics such as page and link traffic, and individual statistics such as entropy and session size. As no current model reconciles all of these observations, we present an agent-based model that explains them through realistic browsing behaviors: (1) revisiting bookmarked pages; (2) backtracking; and (3) seeking out novel pages of topical interest. The resulting model can reproduce the behaviors we observe in empirical data, especially heterogeneous session lengths, reconciling the narrowly focused browsing patterns of individual users with the extreme variance in aggregate traffic measurements. We can thereby identify a few salient features that are necessary and sufficient to interpret Web traffic data. Beyond the descriptive and explanatory power of our model, these results may lead to improvements in Web applications such as search and crawling.
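
    A minimal sketch of an agent-based surfer with the three behaviors listed above (bookmark revisits, backtracking, and jumps to novel pages of interest); the toy graph, probabilities, and stopping rule are assumptions made for illustration, not the calibrated model of the paper.

```python
# Minimal sketch of an agent-based surfer with the three behaviors listed
# above: revisiting bookmarks, backtracking, and jumping to novel pages of
# interest. The graph, probabilities, and stopping rule are illustrative
# assumptions, not the calibrated model from the paper.
import random

random.seed(0)
n_pages = 200
links = {p: random.sample(range(n_pages), 5) for p in range(n_pages)}  # toy web graph

def session(p_bookmark=0.15, p_back=0.25, p_jump=0.05, p_stop=0.04):
    page, history, bookmarks, visits = 0, [], {0}, [0]
    while random.random() > p_stop:            # geometric session length
        r = random.random()
        if r < p_bookmark and bookmarks:
            page = random.choice(sorted(bookmarks))       # revisit a bookmark
        elif r < p_bookmark + p_back and history:
            page = history.pop()                          # backtrack
        elif r < p_bookmark + p_back + p_jump:
            page = random.randrange(n_pages)              # novel page of interest
        else:
            history.append(page)
            page = random.choice(links[page])             # follow an outgoing link
        if random.random() < 0.1:
            bookmarks.add(page)                           # occasionally bookmark
        visits.append(page)
    return visits

lengths = [len(session()) for _ in range(2000)]
print("mean session length:", sum(lengths) / len(lengths))
```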

  17. Are X-rays the key to integrated computational materials engineering?

    DOE PAGES

    Ice, Gene E.

    2015-11-01

    The ultimate dream of materials science is to predict materials behavior from composition and processing history. Owing to the growing power of computers, this long-time dream has recently found expression through worldwide excitement in a number of computation-based thrusts: integrated computational materials engineering, materials by design, computational materials design, three-dimensional materials physics and mesoscale physics. However, real materials have important crystallographic structures at multiple length scales, which evolve during processing and in service. Moreover, real materials properties can depend on the extreme tails in their structural and chemical distributions. This makes it critical to map structural distributions with sufficient resolution to resolve small structures and with sufficient statistics to capture the tails of distributions. For two-dimensional materials, there are high-resolution nondestructive probes of surface and near-surface structures with atomic or near-atomic resolution that can provide detailed structural, chemical and functional distributions over important length scales. However, there are no nondestructive three-dimensional probes with atomic resolution over the multiple length scales needed to understand most materials.

  18. Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling.

    PubMed

    Nord, Camilla L; Valton, Vincent; Wood, John; Roiser, Jonathan P

    2017-08-23

    Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered-some very seriously so-but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. Copyright © 2017 Nord, Valton et al.

  19. Composition and energy spectra of cosmic ray nuclei above 500 GeV/nucleon from the JACEE emulsion chambers

    NASA Technical Reports Server (NTRS)

    Burnett, T. H.; Dake, S.; Derrickson, J. H.; Fountain, W. F.; Fuki, M.; Gregory, J. C.; Hayashi, T.; Holynski, R.; Iwai, J.; Jones, W. V.

    1985-01-01

    The composition and energy spectra of charge groups (C - O), (Ne - S), and (Z approximately 17) above 500 GeV/nucleon from the experiments of the JACEE series of balloonborne emulsion chambers are reported. Studies of cosmic ray elemental composition at higher energies provide information on propagation through interstellar space, acceleration mechanisms, and their sources. One of the present interests is the elemental composition at energies above 100 GeV/nucleon. Statistically sufficient data in this energy region can be decisive in judging propagation models from the ratios of SECONDARY/PRIMARY and source spectra (acceleration mechanism), as well as speculative contributions of different sources from the ratios of PRIMARY/PRIMARY. At much higher energies, i.e., around 10^15 eV, data from direct observation will give hints on the knee problem, as to whether they favor an escape effect possibly governed by magnetic rigidity above 10^16 eV.

  20. An Ultra-Wideband Cross-Correlation Radiometer for Mesoscopic Experiments

    NASA Astrophysics Data System (ADS)

    Toonen, Ryan; Haselby, Cyrus; Qin, Hua; Eriksson, Mark; Blick, Robert

    2007-03-01

    We have designed, built and tested a cross-correlation radiometer for detecting statistical order in the quantum fluctuations of mesoscopic experiments at sub-Kelvin temperatures. Our system utilizes a fully analog front-end--operating over the X- and Ku-bands (8 to 18 GHz)--for computing the cross-correlation function. Digital signal processing techniques are used to provide robustness against instrumentation drifts and offsets. The economized version of our instrument can measure, with sufficient correlation efficiency, noise signals having power levels as low as 10 fW. We show that, if desired, we can improve this performance by including cryogenic preamplifiers which boost the signal-to-noise ratio near the signal source. By adding a few extra components, we can measure both the real and imaginary parts of the cross-correlation function--improving the overall signal-to-noise ratio by a factor of √2. We demonstrate the utility of our cross-correlator with noise power measurements from a quantum point contact.

  1. Are the Nonparametric Person-Fit Statistics More Powerful than Their Parametric Counterparts? Revisiting the Simulations in Karabatsos (2003)

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2017-01-01

    Karabatsos compared the power of 36 person-fit statistics using receiver operating characteristics curves and found the "H[superscript T]" statistic to be the most powerful in identifying aberrant examinees. He found three statistics, "C", "MCI", and "U3", to be the next most powerful. These four statistics,…

  2. First Year Wilkinson Microwave Anisotropy Probe(WMAP)Observations: The Angular Power Spectrum

    NASA Technical Reports Server (NTRS)

    Hinshaw, G.; Spergel, D. N.; Verde, L.; Hill, R. S.; Meyer, S. S.; Barnes, C.; Bennett, C. L.; Halpern, M.; Jarosik, N.; Kogut, A.

    2003-01-01

    We present the angular power spectrum derived from the first-year Wilkinson Microwave Anisotropy Probe (WMAP) sky maps. We study a variety of power spectrum estimation methods and data combinations and demonstrate that the results are robust. The data are modestly contaminated by diffuse Galactic foreground emission, but we show that a simple Galactic template model is sufficient to remove the signal. Point sources produce a modest contamination in the low frequency data. After masking approximately 700 known bright sources from the maps, we estimate residual sources contribute approximately 3500 μK² at 41 GHz, and approximately 130 μK² at 94 GHz, to the power spectrum [ℓ(ℓ + 1)C_ℓ/2π] at ℓ = 1000. Systematic errors are negligible compared to the (modest) level of foreground emission. Our best estimate of the power spectrum is derived from 28 cross-power spectra of statistically independent channels. The final spectrum is essentially independent of the noise properties of an individual radiometer. The resulting spectrum provides a definitive measurement of the CMB power spectrum, with uncertainties limited by cosmic variance, up to ℓ ≈ 350. The spectrum clearly exhibits a first acoustic peak at ℓ = 220 and a second acoustic peak at ℓ ≈ 540, and it provides strong support for adiabatic initial conditions. Researchers have analyzed the TE cross-power spectrum, and present evidence for a relatively high optical depth, and an early period of cosmic reionization. Among other things, this implies that the temperature power spectrum has been suppressed by approximately 30% on degree angular scales, due to secondary scattering.

  3. Next-generation prognostic assessment for diffuse large B-cell lymphoma

    PubMed Central

    Staton, Ashley D; Koff, Jean L; Chen, Qiushi; Ayer, Turgay; Flowers, Christopher R

    2015-01-01

    Current standard of care therapy for diffuse large B-cell lymphoma (DLBCL) cures a majority of patients with additional benefit in salvage therapy and autologous stem cell transplant for patients who relapse. The next generation of prognostic models for DLBCL aims to more accurately stratify patients for novel therapies and risk-adapted treatment strategies. This review discusses the significance of host genetic and tumor genomic alterations seen in DLBCL, clinical and epidemiologic factors, and how each can be integrated into risk stratification algorithms. In the future, treatment prediction and prognostic model development and subsequent validation will require data from a large number of DLBCL patients to establish sufficient statistical power to correctly predict outcome. Novel modeling approaches can augment these efforts. PMID:26289217

  4. Next-generation prognostic assessment for diffuse large B-cell lymphoma.

    PubMed

    Staton, Ashley D; Koff, Jean L; Chen, Qiushi; Ayer, Turgay; Flowers, Christopher R

    2015-01-01

    Current standard of care therapy for diffuse large B-cell lymphoma (DLBCL) cures a majority of patients with additional benefit in salvage therapy and autologous stem cell transplant for patients who relapse. The next generation of prognostic models for DLBCL aims to more accurately stratify patients for novel therapies and risk-adapted treatment strategies. This review discusses the significance of host genetic and tumor genomic alterations seen in DLBCL, clinical and epidemiologic factors, and how each can be integrated into risk stratification algorithms. In the future, treatment prediction and prognostic model development and subsequent validation will require data from a large number of DLBCL patients to establish sufficient statistical power to correctly predict outcome. Novel modeling approaches can augment these efforts.

  5. Precision constraints on the top-quark effective field theory at future lepton colliders

    NASA Astrophysics Data System (ADS)

    Durieux, G.

    We examine the constraints that future lepton colliders would impose on the effective field theory describing modifications of top-quark interactions beyond the standard model, through measurements of the $e^+e^-\\to bW^+\\:\\bar bW^-$ process. Statistically optimal observables are exploited to constrain simultaneously and efficiently all relevant operators. Their constraining power is sufficient for quadratic effective-field-theory contributions to have negligible impact on limits which are therefore basis independent. This is contrasted with the measurements of cross sections and forward-backward asymmetries. An overall measure of constraints strength, the global determinant parameter, is used to determine which run parameters impose the strongest restriction on the multidimensional effective-field-theory parameter space.

  6. Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.

    We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure of a reduced number of points, yet preserving the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach with time series from both the logistic map and experimental fluid flows, which show that our method distinguishes different dynamics as well as traditional recurrence analysis. Therefore, the proposed methodology characterizes the recurrence matrix adequately, while using a reduced set of points from the original recurrence plots.
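
    A minimal sketch of the recurrence-network construction that RDE-CN builds on: threshold the recurrence matrix of a logistic-map series, read it as an adjacency matrix, and compute network statistics; the point-reduction step that is specific to RDE-CN is not reproduced here, and the recurrence threshold is an assumed value.

```python
# Minimal sketch of the recurrence-network construction that RDE-CN builds
# on: threshold the recurrence matrix of a logistic-map time series, read it
# as an adjacency matrix, and compute network statistics. The point-reduction
# step specific to RDE-CN is not reproduced here.
import numpy as np
import networkx as nx

def logistic(r=4.0, x0=0.4, n=400):
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1 - x[i - 1])
    return x

x = logistic()
dist = np.abs(x[:, None] - x[None, :])         # pairwise distances (1-d embedding)
eps = 0.05 * (x.max() - x.min())               # recurrence threshold (assumed)
R = (dist < eps).astype(int)
np.fill_diagonal(R, 0)                         # drop self-loops

G = nx.from_numpy_array(R)
print("mean degree:", 2 * G.number_of_edges() / G.number_of_nodes())
print("average clustering:", round(nx.average_clustering(G), 3))
```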

  7. Are power calculations useful? A multicentre neuroimaging study

    PubMed Central

    Suckling, John; Henty, Julian; Ecker, Christine; Deoni, Sean C; Lombardo, Michael V; Baron-Cohen, Simon; Jezzard, Peter; Barnes, Anna; Chakrabarti, Bhismadev; Ooi, Cinly; Lai, Meng-Chuan; Williams, Steven C; Murphy, Declan GM; Bullmore, Edward

    2014-01-01

    There are now many reports of imaging experiments with small cohorts of typical participants that precede large-scale, often multicentre studies of psychiatric and neurological disorders. Data from these calibration experiments are sufficient to make estimates of statistical power and predictions of sample size and minimum observable effect sizes. In this technical note, we suggest how previously reported voxel-based power calculations can support decision making in the design, execution and analysis of cross-sectional multicentre imaging studies. The choice of MRI acquisition sequence, distribution of recruitment across acquisition centres, and changes to the registration method applied during data analysis are considered as examples. The consequences of modification are explored in quantitative terms by assessing the impact on sample size for a fixed effect size and detectable effect size for a fixed sample size. The calibration experiment dataset used for illustration was a precursor to the now complete Medical Research Council Autism Imaging Multicentre Study (MRC-AIMS). Validation of the voxel-based power calculations is made by comparing the predicted values from the calibration experiment with those observed in MRC-AIMS. The effect of non-linear mappings during image registration to a standard stereotactic space on the prediction is explored with reference to the amount of local deformation. In summary, power calculations offer a validated, quantitative means of making informed choices on important factors that influence the outcome of studies that consume significant resources. PMID:24644267

  8. Behavior, sensitivity, and power of activation likelihood estimation characterized by massive empirical simulation.

    PubMed

    Eickhoff, Simon B; Nichols, Thomas E; Laird, Angela R; Hoffstaedter, Felix; Amunts, Katrin; Fox, Peter T; Bzdok, Danilo; Eickhoff, Claudia R

    2016-08-15

    Given the increasing number of neuroimaging publications, the automated extraction of knowledge on brain-behavior associations by quantitative meta-analyses has become a highly important and rapidly growing field of research. Among several methods to perform coordinate-based neuroimaging meta-analyses, Activation Likelihood Estimation (ALE) has been widely adopted. In this paper, we addressed two pressing questions related to ALE meta-analysis: i) Which thresholding method is most appropriate to perform statistical inference? ii) Which sample size, i.e., number of experiments, is needed to perform robust meta-analyses? We provided quantitative answers to these questions by simulating more than 120,000 meta-analysis datasets using empirical parameters (i.e., number of subjects, number of reported foci, distribution of activation foci) derived from the BrainMap database. This allowed us to characterize the behavior of ALE analyses, to derive first power estimates for neuroimaging meta-analyses, and to thus formulate recommendations for future ALE studies. We could show as a first consequence that cluster-level family-wise error (FWE) correction represents the most appropriate method for statistical inference, while voxel-level FWE correction is valid but more conservative. In contrast, uncorrected inference and false-discovery rate correction should be avoided. As a second consequence, researchers should aim to include at least 20 experiments in an ALE meta-analysis to achieve sufficient power for moderate effects. We would like to note, though, that these calculations and recommendations are specific to ALE and may not be extrapolated to other approaches for (neuroimaging) meta-analysis. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Machine learning for the New York City power grid.

    PubMed

    Rudin, Cynthia; Waltz, David; Anderson, Roger N; Boulanger, Albert; Salleb-Aouissi, Ansaf; Chow, Maggie; Dutta, Haimonti; Gross, Philip N; Huang, Bert; Ierome, Steve; Isaac, Delfina F; Kressner, Arthur; Passonneau, Rebecca J; Radeva, Axinia; Wu, Leon

    2012-02-01

    Power companies can benefit from the use of knowledge discovery methods and statistical machine learning for preventive maintenance. We introduce a general process for transforming historical electrical grid data into models that aim to predict the risk of failures for components and systems. These models can be used directly by power companies to assist with prioritization of maintenance and repair work. Specialized versions of this process are used to produce 1) feeder failure rankings, 2) cable, joint, terminator, and transformer rankings, 3) feeder Mean Time Between Failure (MTBF) estimates, and 4) manhole events vulnerability rankings. The process in its most general form can handle diverse, noisy sources that are historical (static), semi-real-time, or real-time, incorporates state-of-the-art machine learning algorithms for prioritization (supervised ranking or MTBF), and includes an evaluation of results via cross-validation and blind test. Above and beyond the ranked lists and MTBF estimates are business management interfaces that allow the prediction capability to be integrated directly into corporate planning and decision support; such interfaces rely on several important properties of our general modeling approach: that machine learning features are meaningful to domain experts, that the processing of data is transparent, and that prediction results are accurate enough to support sound decision making. We discuss the challenges in working with historical electrical grid data that were not designed for predictive purposes. The “rawness” of these data contrasts with the accuracy of the statistical models that can be obtained from the process; these models are sufficiently accurate to assist in maintaining New York City’s electrical grid.

  10. Behavior, Sensitivity, and power of activation likelihood estimation characterized by massive empirical simulation

    PubMed Central

    Eickhoff, Simon B.; Nichols, Thomas E.; Laird, Angela R.; Hoffstaedter, Felix; Amunts, Katrin; Fox, Peter T.

    2016-01-01

    Given the increasing number of neuroimaging publications, the automated extraction of knowledge on brain-behavior associations by quantitative meta-analyses has become a highly important and rapidly growing field of research. Among several methods to perform coordinate-based neuroimaging meta-analyses, Activation Likelihood Estimation (ALE) has been widely adopted. In this paper, we addressed two pressing questions related to ALE meta-analysis: i) Which thresholding method is most appropriate to perform statistical inference? ii) Which sample size, i.e., number of experiments, is needed to perform robust meta-analyses? We provided quantitative answers to these questions by simulating more than 120,000 meta-analysis datasets using empirical parameters (i.e., number of subjects, number of reported foci, distribution of activation foci) derived from the BrainMap database. This allowed us to characterize the behavior of ALE analyses, to derive first power estimates for neuroimaging meta-analyses, and to thus formulate recommendations for future ALE studies. We could show as a first consequence that cluster-level family-wise error (FWE) correction represents the most appropriate method for statistical inference, while voxel-level FWE correction is valid but more conservative. In contrast, uncorrected inference and false-discovery rate correction should be avoided. As a second consequence, researchers should aim to include at least 20 experiments in an ALE meta-analysis to achieve sufficient power for moderate effects. We would like to note, though, that these calculations and recommendations are specific to ALE and may not be extrapolated to other approaches for (neuroimaging) meta-analysis. PMID:27179606

  11. Bias and power in group-based epidemiologic studies of low-back pain exposure and outcome--effects of study size and exposure measurement efforts.

    PubMed

    Coenen, Pieter; Mathiassen, Svend Erik; Kingma, Idsart; Boot, Cécile R L; Bongers, Paulien M; van Dieën, Jaap H

    2015-05-01

    Exposure-outcome studies, for instance on work-related low-back pain (LBP), often classify workers into groups for which exposures are estimated from measurements on a sample of workers within or outside the specific study. The present study investigated how the sizes of the total study population and of the sample used to estimate exposures influence bias and power in exposure-outcome associations. At baseline, lifting, trunk flexion, and trunk rotation were observed for 371 of 1131 workers allocated to 19 a-priori defined occupational groups. LBP (dichotomous) was reported by all workers during 3 years of follow-up. All three exposures were associated with LBP in this parent study (P < 0.01). All 21 combinations of n = 10, 20, 30 workers per group with an outcome, and k = 1, 2, 3, 5, 10, 15, 20 workers actually being observed were investigated using bootstrapping, repeating each combination 10000 times. Odds ratios (OR) with P values were determined for each of these virtual studies. Average OR and statistical power (P < 0.05 and P < 0.01) were determined from the bootstrap distributions at each (n, k) combination. For lifting and flexed trunk, studies including n ≥ 20 workers, with k ≥ 5 observed, led to an almost unbiased OR and a power >0.80 (P level = 0.05). A similar performance required n ≥ 30 workers for rotated trunk. Small numbers of observed workers (k) resulted in biased OR, while power was, in general, more sensitive to the total number of workers (n). In epidemiologic studies using a group-based exposure assessment strategy, statistical performance may be sufficient if outcome is obtained from a reasonably large number of workers, even if exposure is estimated from only a few workers per group. © The Author 2014. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
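
    A minimal sketch of the kind of simulation described above: group exposure estimated from k observed workers, outcomes known for n workers per group, and the resulting OR and power summarized over repeated virtual studies; the effect size, variance components, and group structure are illustrative assumptions, not the parent study's data.

```python
# Minimal sketch of a group-based exposure simulation: group exposure is
# estimated from k observed workers, the outcome is known for n workers per
# group, and bias and power are summarized over repeated virtual studies.
# All effect sizes and variance components are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_groups, n, k, n_iter = 19, 20, 5, 500
true_or = 1.5                                  # assumed true OR per unit exposure

def one_study():
    group_mean = rng.normal(0.0, 1.0, n_groups)
    y_all, x_all = [], []
    for g in range(n_groups):
        worker_exp = group_mean[g] + rng.normal(0.0, 1.0, n)   # true exposures
        p = 1 / (1 + np.exp(-(-1.0 + np.log(true_or) * worker_exp)))
        y = rng.binomial(1, p)                                  # outcomes, n workers
        est_exp = worker_exp[:k].mean()                         # exposure from k observed
        y_all.append(y)
        x_all.append(np.full(n, est_exp))                       # group-based exposure
    X = sm.add_constant(np.concatenate(x_all))
    fit = sm.Logit(np.concatenate(y_all), X).fit(disp=0)
    return np.exp(fit.params[1]), fit.pvalues[1]

results = np.array([one_study() for _ in range(n_iter)])
print(f"mean OR = {results[:, 0].mean():.2f} (true {true_or}), "
      f"power at p<0.05 = {(results[:, 1] < 0.05).mean():.2f}")
```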

  12. 46 CFR 112.05-1 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS... dedicated emergency power source with sufficient capacity to supply those services that are necessary for... power source, except: (1) A load required by this part to be powered from the emergency power source; (2...

  13. 46 CFR 112.05-1 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS... dedicated emergency power source with sufficient capacity to supply those services that are necessary for... power source, except: (1) A load required by this part to be powered from the emergency power source; (2...

  14. Statistical power in parallel group point exposure studies with time-to-event outcomes: an empirical comparison of the performance of randomized controlled trials and the inverse probability of treatment weighting (IPTW) approach.

    PubMed

    Austin, Peter C; Schuster, Tibor; Platt, Robert W

    2015-10-15

    Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly-structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power compared to an analysis of a similarly-structured RCT. The difference in statistical power increased as the magnitude of the treatment-selection model increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly-structured RCT.
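
    A minimal sketch of the IPTW construction whose power the study evaluates: estimate a propensity score by logistic regression and form stabilized inverse-probability-of-treatment weights, then check covariate balance; the covariates and treatment model are synthetic, and the time-to-event (Cox) stage of the analysis is omitted.

```python
# Minimal sketch of the IPTW construction evaluated in the study: estimate a
# propensity score by logistic regression and form stabilized inverse-
# probability-of-treatment weights. Covariates and the treatment model are
# synthetic; the time-to-event (Cox) stage is omitted.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 2000
x = rng.normal(size=(n, 3))                             # confounders
lin = 0.4 * x[:, 0] - 0.3 * x[:, 1] + 0.2 * x[:, 2]
treat = rng.binomial(1, 1 / (1 + np.exp(-lin)))         # non-random assignment

ps_model = sm.Logit(treat, sm.add_constant(x)).fit(disp=0)
ps = ps_model.predict(sm.add_constant(x))               # propensity scores
p_treat = treat.mean()
weights = np.where(treat == 1, p_treat / ps, (1 - p_treat) / (1 - ps))

# Balance check: weighted covariate means should be similar across arms
for j in range(3):
    m1 = np.average(x[treat == 1, j], weights=weights[treat == 1])
    m0 = np.average(x[treat == 0, j], weights=weights[treat == 0])
    print(f"covariate {j}: weighted mean treated {m1:+.3f} vs control {m0:+.3f}")
```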

  15. Statistical Power in Meta-Analysis

    ERIC Educational Resources Information Center

    Liu, Jin

    2015-01-01

    Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation on two sample mean difference test under different situations: (1) the discrepancy between the analytical power and…

  16. Determining the Statistical Power of the Kolmogorov-Smirnov and Anderson-Darling Goodness-of-Fit Tests via Monte Carlo Simulation

    DTIC Science & Technology

    2016-12-01

    KS and AD Statistical Power via Monte Carlo Simulation: Statistical power is the probability of correctly rejecting the null hypothesis when the ... real-world data to test the accuracy of the simulation. Statistical comparison of these metrics can be necessary when making such a determination.
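
    A minimal sketch of Monte Carlo power estimation for a goodness-of-fit test, in the spirit of the title above: draw repeated samples from an alternative distribution, test them against the null with the one-sample KS test, and record the rejection rate; the null, alternatives, and sample size are illustrative choices.

```python
# Minimal sketch of Monte Carlo power estimation for a goodness-of-fit test:
# draw samples from an alternative distribution, test them against the null
# with the one-sample KS test, and record the rejection rate. The choice of
# null, alternatives, and sample size is illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, n_sims, alpha = 50, 2000, 0.05

def ks_power(sampler):
    rejections = 0
    for _ in range(n_sims):
        sample = sampler(n)
        _, p = stats.kstest(sample, 'norm')     # null: standard normal
        rejections += (p < alpha)
    return rejections / n_sims

print("size (data truly N(0,1)):", ks_power(lambda m: rng.normal(0, 1, m)))
print("power vs N(0.5, 1):      ", ks_power(lambda m: rng.normal(0.5, 1, m)))
print("power vs t(3):           ", ks_power(lambda m: rng.standard_t(3, m)))
```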

  17. Data analysis of gravitational-wave signals from spinning neutron stars. III. Detection statistics and computational requirements

    NASA Astrophysics Data System (ADS)

    Jaranowski, Piotr; Królak, Andrzej

    2000-03-01

    We develop the analytic and numerical tools for data analysis of the continuous gravitational-wave signals from spinning neutron stars for ground-based laser interferometric detectors. The statistical data analysis method that we investigate is maximum likelihood detection, which for the case of Gaussian noise reduces to matched filtering. We study in detail the statistical properties of the optimum functional that needs to be calculated in order to detect the gravitational-wave signal and estimate its parameters. We find it particularly useful to divide the parameter space into elementary cells such that the values of the optimal functional are statistically independent in different cells. We derive formulas for false alarm and detection probabilities both for the optimal and the suboptimal filters. We assess the computational requirements needed to do the signal search. We compare a number of criteria to build sufficiently accurate templates for our data analysis scheme. We verify the validity of our concepts and formulas by means of Monte Carlo simulations. We present algorithms by which one can estimate the parameters of the continuous signals accurately. We find, confirming earlier work of other authors, that given 100 Gflops of computational power, an all-sky search over an observation time of 7 days and a directed search over an observation time of 120 days are possible, whereas an all-sky search over 120 days of observation is computationally prohibitive.

  18. An audit of the statistics and the comparison with the parameter in the population

    NASA Astrophysics Data System (ADS)

    Bujang, Mohamad Adam; Sa'at, Nadiah; Joys, A. Reena; Ali, Mariana Mohamad

    2015-10-01

    The sample size that is sufficient to closely estimate the statistics for particular parameters has long been an issue. Although the sample size may have been calculated with reference to the objective of the study, it is difficult to confirm whether the resulting statistics are close to the parameters of a particular population. Meanwhile, a guideline based on a p-value of less than 0.05 is widely used as inferential evidence. This study therefore audited results computed from various subsamples and statistical analyses and compared them with the parameters in three different populations. Eight types of statistical analysis, with eight subsamples for each, were examined. The statistics were consistent and close to the parameters when the sample covered at least 15% to 35% of the population. A larger sample size is needed to estimate parameters involving categorical variables than those involving numerical variables. Sample sizes of 300 to 500 are sufficient to estimate the parameters of a medium-sized population.

  19. Matching of energetic, mechanic and control characteristics of positioning actuator

    NASA Astrophysics Data System (ADS)

    Nosova, N. Y.; Misyurin, S. Yu; Kreinin, G. V.

    2017-12-01

    The problem of the preliminary choice of parameters for the power channel of an automated drive is discussed. The drive of a mechatronic complex is divided into two main units - power and control. The first determines the energy capabilities and, as a rule, the overall dimensions of the complex. Sufficient capacity of the power unit is a necessary condition for successfully solving control tasks without excessive complication of the control system structure. Preliminary selection of parameters is carried out based on the condition of providing the necessary drive power. The proposed approach is based on: study of a sufficiently detailed but not overly complex dynamic model of the power block with the help of a conditional test control system; transition to a normalized model with the formation of similarity criteria; and construction of the synthesis procedure.

  20. Infants Segment Continuous Events Using Transitional Probabilities

    ERIC Educational Resources Information Center

    Stahl, Aimee E.; Romberg, Alexa R.; Roseberry, Sarah; Golinkoff, Roberta Michnick; Hirsh-Pasek, Kathryn

    2014-01-01

    Throughout their 1st year, infants adeptly detect statistical structure in their environment. However, little is known about whether statistical learning is a primary mechanism for event segmentation. This study directly tests whether statistical learning alone is sufficient to segment continuous events. Twenty-eight 7- to 9-month-old infants…

  1. Statistical Learning of Phonetic Categories: Insights from a Computational Approach

    ERIC Educational Resources Information Center

    McMurray, Bob; Aslin, Richard N.; Toscano, Joseph C.

    2009-01-01

    Recent evidence (Maye, Werker & Gerken, 2002) suggests that statistical learning may be an important mechanism for the acquisition of phonetic categories in the infant's native language. We examined the sufficiency of this hypothesis and its implications for development by implementing a statistical learning mechanism in a computational model…

  2. Variability and reliability analysis in self-assembled multichannel carbon nanotube field-effect transistors

    NASA Astrophysics Data System (ADS)

    Hu, Zhaoying; Tulevski, George S.; Hannon, James B.; Afzali, Ali; Liehr, Michael; Park, Hongsik

    2015-06-01

    Carbon nanotubes (CNTs) have been widely studied as a channel material of scaled transistors for high-speed and low-power logic applications. In order to have sufficient drive current, it is widely assumed that CNT-based logic devices will have multiple CNTs in each channel. Understanding the effects of the number of CNTs on device performance can aid in the design of CNT field-effect transistors (CNTFETs). We have fabricated multi-CNT-channel CNTFETs with an 80-nm channel length using precise self-assembly methods. We describe compact statistical models and Monte Carlo simulations to analyze failure probability and the variability of the on-state current and threshold voltage. The results show that multichannel CNTFETs are more resilient to process variation and random environmental fluctuations than single-CNT devices.
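
    The paper's compact statistical models are not reproduced here; as a toy stand-in, assume each of n CNTs in a channel fails independently with probability p and that the normalized on-current is the fraction of working tubes. A short Monte Carlo then shows why multichannel devices are more resilient than single-CNT devices:

```python
# Toy Monte Carlo illustration (assumptions, not the paper's compact model):
# each of n_tubes CNTs fails independently with probability p_fail; the channel
# fails only if every tube fails, and the on-current scales with the number of
# working tubes.
import numpy as np

rng = np.random.default_rng(1)
p_fail, n_devices = 0.1, 100_000

for n_tubes in (1, 4, 8):
    working = rng.binomial(n_tubes, 1 - p_fail, size=n_devices)
    fail_prob = np.mean(working == 0)
    i_on = working / n_tubes                 # normalized on-current per device
    cv = i_on.std() / i_on.mean()            # relative variability of the on-current
    print(f"{n_tubes} tubes: failure probability {fail_prob:.4f}, on-current CV {cv:.3f}")
```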

  3. Failure to replicate the Mehta and Zhu (2009) color-priming effect on anagram solution times.

    PubMed

    Steele, Kenneth M

    2014-06-01

    Mehta and Zhu (Science, 323, 1226-1229, 2009) hypothesized that the color red induces avoidance motivation and that the color blue induces approach motivation. In one experiment, they reported that anagrams of avoidance motivation words were solved more quickly on red backgrounds and that approach motivation anagrams were solved more quickly on blue backgrounds. Reported here is a direct replication of that experiment, using the same anagrams, instructions, and colors, with more than triple the number of participants used in the original study. The results did not show the Mehta and Zhu color-priming effects, even though statistical power was sufficient to detect the effect. The results call into question the existence of their color-priming effect on the solution of anagrams.

  4. Metaresearch for Evaluating Reproducibility in Ecology and Evolution.

    PubMed

    Fidler, Fiona; Chee, Yung En; Wintle, Bonnie C; Burgman, Mark A; McCarthy, Michael A; Gordon, Ascelin

    2017-03-01

    Recent replication projects in other disciplines have uncovered disturbingly low levels of reproducibility, suggesting that those research literatures may contain unverifiable claims. The conditions contributing to irreproducibility in other disciplines are also present in ecology. These include a large discrepancy between the proportion of "positive" or "significant" results and the average statistical power of empirical research, incomplete reporting of sampling stopping rules and results, journal policies that discourage replication studies, and a prevailing publish-or-perish research culture that encourages questionable research practices. We argue that these conditions constitute sufficient reason to systematically evaluate the reproducibility of the evidence base in ecology and evolution. In some cases, the direct replication of ecological research is difficult because of strong temporal and spatial dependencies, so here, we propose metaresearch projects that will provide proxy measures of reproducibility.

  5. Recruitment of Older Adults: Success May Be in the Details

    PubMed Central

    McHenry, Judith C.; Insel, Kathleen C.; Einstein, Gilles O.; Vidrine, Amy N.; Koerner, Kari M.; Morrow, Daniel G.

    2015-01-01

    Purpose: Describe recruitment strategies used in a randomized clinical trial of a behavioral prospective memory intervention to improve medication adherence for older adults taking antihypertensive medication. Results: Recruitment strategies represent 4 themes: accessing an appropriate population, communication and trust-building, providing comfort and security, and expressing gratitude. Recruitment activities resulted in 276 participants with a mean age of 76.32 years, and study enrollment included 207 women, 69 men, and 54 persons representing ethnic minorities. Recruitment success was linked to cultivating relationships with community-based organizations, face-to-face contact with potential study participants, and providing service (e.g., blood pressure checks) as an access point to eligible participants. Seventy-two percent of potential participants who completed a follow-up call and met eligibility criteria were enrolled in the study. The attrition rate was 14.34%. Implications: The projected increase in the number of older adults intensifies the need to study interventions that improve health outcomes. The challenge is to recruit sufficient numbers of participants who are also representative of older adults to test these interventions. Failing to recruit a sufficient and representative sample can compromise statistical power and the generalizability of study findings. PMID:22899424

  6. Evaluation of the Association between Persistent Organic ...

    EPA Pesticide Factsheets

    Background: Diabetes is a major threat to public health in the United States and worldwide. Understanding the role of environmental chemicals in the development or progression of diabetes is an emerging issue in environmental health.Objective: We assessed the epidemiologic literature for evidence of associations between persistent organic pollutants (POPs) and type 2 diabetes.Methods: Using a PubMed search and reference lists from relevant studies or review articles, we identified 72 epidemiological studies that investigated associations of persistent organic pollutants (POPs) with diabetes. We evaluated these studies for consistency, strengths and weaknesses of study design (including power and statistical methods), clinical diagnosis, exposure assessment, study population characteristics, and identification of data gaps and areas for future research.Conclusions: Heterogeneity of the studies precluded conducting a meta-analysis, but the overall evidence is sufficient for a positive association of some organochlorine POPs with type 2 diabetes. Collectively, these data are not sufficient to establish causality. Initial data mining revealed that the strongest positive correlation of diabetes with POPs occurred with organochlorine compounds, such as trans-nonachlor, dichlorodiphenyldichloroethylene (DDE), polychlorinated biphenyls (PCBs), and dioxins and dioxin-like chemicals. There is less indication of an association between other nonorganochlorine POPs, such as

  7. The accurate assessment of small-angle X-ray scattering data

    DOE PAGES

    Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; ...

    2015-01-23

    Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. Studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.

  8. Bi-PROF

    PubMed Central

    Gries, Jasmin; Schumacher, Dirk; Arand, Julia; Lutsik, Pavlo; Markelova, Maria Rivera; Fichtner, Iduna; Walter, Jörn; Sers, Christine; Tierling, Sascha

    2013-01-01

    The use of next generation sequencing has expanded our view on whole mammalian methylome patterns. In particular, it provides a genome-wide insight into local DNA methylation diversity at the single nucleotide level and enables the examination of single chromosome sequence sections with sufficient statistical power. We describe a bisulfite-based sequence profiling pipeline, Bi-PROF, which is based on the 454 GS-FLX Titanium technology and allows one to obtain up to one million sequence stretches at single base pair resolution without laborious subcloning. To illustrate the performance of the experimental workflow connected to a bioinformatics program pipeline (BiQ Analyzer HT) we present a test analysis set of 68 different epigenetic marker regions (amplicons) in five individual patient-derived xenograft tissue samples of colorectal cancer and one healthy colon epithelium sample as a control. After the 454 GS-FLX Titanium run, sequence read processing and sample decoding, the obtained alignments are quality controlled and statistically evaluated. Comprehensive methylation pattern interpretation (profiling), assessed by analyzing 10²-10⁴ sequence reads per amplicon, allows an unprecedentedly deep view of pattern formation and methylation marker heterogeneity in tissues affected by complex diseases like cancer. PMID:23803588

  9. Propagation of radially polarized multi-cosine Gaussian Schell-model beams in non-Kolmogorov turbulence

    NASA Astrophysics Data System (ADS)

    Tang, Miaomiao; Zhao, Daomu; Li, Xinzhong; Wang, Jingge

    2018-01-01

    Recently, we introduced a new class of radially polarized beams with a multi-cosine Gaussian Schell-model (MCGSM) correlation function based on partially coherent theory (Tang et al., 2017). In this manuscript, we extend the work to study the statistical properties such as the spectral density, the degree of coherence, the degree of polarization, and the state of polarization of the beam propagating in isotropic turbulence with a non-Kolmogorov power spectrum. Analytical formulas for the cross-spectral density matrix elements of a radially polarized MCGSM beam in non-Kolmogorov turbulence are derived. Numerical results show that the lattice-like intensity pattern of the beam, which remains propagation-invariant in free space, is destroyed by the turbulence once the beam has propagated sufficiently far from the source. It is also shown that the polarization properties are mainly affected by the source correlation functions, while changes in the turbulence statistics have a relatively small effect. In addition, the polarization state exhibits a self-splitting property and each beamlet evolves into a radially polarized structure upon propagation.

  10. Comparison of cosmology and seabed acoustics measurements using statistical inference from maximum entropy

    NASA Astrophysics Data System (ADS)

    Knobles, David; Stotts, Steven; Sagers, Jason

    2012-03-01

    Why can one obtain from similar measurements a greater amount of information about cosmological parameters than seabed parameters in ocean waveguides? The cosmological measurements are in the form of a power spectrum constructed from spatial correlations of temperature fluctuations within the microwave background radiation. The seabed acoustic measurements are in the form of spatial correlations along the length of a spatial aperture. This study explores the above question from the perspective of posterior probability distributions obtained from maximizing a relative entropy functional. An answer is in part that the seabed in shallow ocean environments generally has large temporal and spatial inhomogeneities, whereas the early universe was a nearly homogeneous cosmological soup with small but important fluctuations. Acoustic propagation models used in shallow water acoustics generally do not capture spatial and temporal variability sufficiently well, which leads to model error dominating the statistical inference problem. This is not the case in cosmology. Further, the physics of the acoustic modes in cosmology is that of a standing wave with simple initial conditions, whereas for underwater acoustics it is a traveling wave in a strongly inhomogeneous bounded medium.

  11. An application of statistical mechanics for representing equilibrium perimeter distributions of tropical convective clouds

    NASA Astrophysics Data System (ADS)

    Garrett, T. J.; Alva, S.; Glenn, I. B.; Krueger, S. K.

    2015-12-01

    There are two possible approaches for parameterizing sub-grid cloud dynamics in a coarser grid model. The most common is to use a fine-scale model to explicitly resolve the mechanistic details of clouds to the best extent possible, and then to parameterize the resulting cloud-state behaviors for the coarser grid. A second is to invoke physical intuition and some very general theoretical principles from equilibrium statistical mechanics. This approach avoids any requirement to resolve time-dependent processes in order to arrive at a suitable solution. The second approach is widely used elsewhere in the atmospheric sciences: for example the Planck function for blackbody radiation is derived this way, where no mention is made of the complexities of modeling a large ensemble of time-dependent radiation-dipole interactions in order to obtain the "grid-scale" spectrum of thermal emission by the blackbody as a whole. We find that this statistical approach may be equally suitable for modeling convective clouds. Specifically, we make the physical argument that the dissipation of buoyant energy in convective clouds is done through mixing across a cloud perimeter. From thermodynamic reasoning, one might then anticipate that vertically stacked isentropic surfaces are characterized by a power law d lnN/d lnP = -1, where N(P) is the number of clouds of perimeter P. In a Giga-LES simulation of convective clouds within a 100 km square domain we find that such a power law does appear to characterize simulated cloud perimeters along isentropes, provided a sufficiently large sample of clouds. The suggestion is that it may be possible to parameterize certain important aspects of cloud state without appealing to computationally expensive dynamic simulations.
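
    To make the scaling claim concrete, the slope d lnN/d lnP can be estimated from any sample of cloud perimeters by binning on a logarithmic grid and fitting a line in log-log space. The Pareto-distributed perimeters below are a synthetic stand-in for the Giga-LES output, chosen so that the expected slope is -1:

```python
# Sketch: estimate the slope d(lnN)/d(lnP) from a sample of cloud perimeters.
# The synthetic perimeters are an assumption standing in for simulation output;
# a Pareto tail index of 1 yields counts per log bin scaling as P^(-1).
import numpy as np

rng = np.random.default_rng(2)
perimeters = (rng.pareto(a=1.0, size=50_000) + 1.0) * 100.0   # e.g. metres

bins = np.logspace(np.log10(perimeters.min()), np.log10(perimeters.max()), 25)
counts, edges = np.histogram(perimeters, bins=bins)
centers = np.sqrt(edges[:-1] * edges[1:])          # geometric bin centers
mask = counts > 0
slope, intercept = np.polyfit(np.log(centers[mask]), np.log(counts[mask]), 1)
print(f"fitted d(lnN)/d(lnP) ≈ {slope:.2f}")       # expected ≈ -1
```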

  12. Constructing and Modifying Sequence Statistics for relevent Using informR in 𝖱

    PubMed Central

    Marcum, Christopher Steven; Butts, Carter T.

    2015-01-01

    The informR package greatly simplifies the analysis of complex event histories in 𝖱 by providing user friendly tools to build sufficient statistics for the relevent package. Historically, building sufficient statistics to model event sequences (of the form a→b) using the egocentric generalization of Butts’ (2008) relational event framework for modeling social action has been cumbersome. The informR package simplifies the construction of the complex list of arrays needed by the rem() model fitting for a variety of cases involving egocentric event data, multiple event types, and/or support constraints. This paper introduces these tools using examples from real data extracted from the American Time Use Survey. PMID:26185488

  13. 47 CFR 80.1015 - Power supply.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 5 2011-10-01 2011-10-01 false Power supply. 80.1015 Section 80.1015... MARITIME SERVICES Radiotelephone Installations Required by the Bridge-to-Bridge Act § 80.1015 Power supply. (a) There must be readily available for use under normal load conditions, a power supply sufficient...

  14. 47 CFR 80.1015 - Power supply.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Power supply. 80.1015 Section 80.1015... MARITIME SERVICES Radiotelephone Installations Required by the Bridge-to-Bridge Act § 80.1015 Power supply. (a) There must be readily available for use under normal load conditions, a power supply sufficient...

  15. Forecasting volatility with neural regression: a contribution to model adequacy.

    PubMed

    Refenes, A N; Holt, W T

    2001-01-01

    Neural nets' usefulness for forecasting is limited by problems of overfitting and the lack of rigorous procedures for model identification, selection and adequacy testing. This paper describes a methodology for neural model misspecification testing. We introduce a generalization of the Durbin-Watson statistic for neural regression and discuss the general issues of misspecification testing using residual analysis. We derive a generalized influence matrix for neural estimators which enables us to evaluate the distribution of the statistic. We deploy Monte Carlo simulation to compare the power of the test for neural and linear regressors. While residual testing is not a sufficient condition for model adequacy, it is nevertheless a necessary condition to demonstrate that the model is a good approximation to the data generating process, particularly as neural-network estimation procedures are susceptible to partial convergence. The work is also an important step toward developing rigorous procedures for neural model identification, selection and adequacy testing which have started to appear in the literature. We demonstrate its applicability in the nontrivial problem of forecasting implied volatility innovations using high-frequency stock index options. Each step of the model building process is validated using statistical tests to verify variable significance and model adequacy with the results confirming the presence of nonlinear relationships in implied volatility innovations.
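
    The neural-regression generalization developed in the paper is not reproduced here; for reference, the classical Durbin-Watson statistic on a vector of residuals is just the ratio computed below (values near 2 indicate no lag-1 autocorrelation; statsmodels also ships an equivalent function).

```python
# Classical Durbin-Watson statistic for serial correlation in regression residuals.
# (The paper generalizes this to neural regressors; only the classical form is shown.)
import numpy as np

def durbin_watson(residuals):
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)   # ≈ 2 means no lag-1 autocorrelation

rng = np.random.default_rng(3)
white = rng.normal(size=500)                  # uncorrelated residuals
ar1 = np.empty(500)
ar1[0] = white[0]
for t in range(1, 500):
    ar1[t] = 0.7 * ar1[t - 1] + white[t]      # positively autocorrelated residuals

print(durbin_watson(white), durbin_watson(ar1))   # ≈ 2.0 versus well below 2
```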

  16. Combining Shapley value and statistics to the analysis of gene expression data in children exposed to air pollution

    PubMed Central

    Moretti, Stefano; van Leeuwen, Danitsja; Gmuender, Hans; Bonassi, Stefano; van Delft, Joost; Kleinjans, Jos; Patrone, Fioravante; Merlo, Domenico Franco

    2008-01-01

    Background In gene expression analysis, statistical tests for differential gene expression provide lists of candidate genes having, individually, a sufficiently low p-value. However, the interpretation of each single p-value within complex systems involving several interacting genes is problematic. In parallel, in the last sixty years, game theory has been applied to political and social problems to assess the power of interacting agents in forcing a decision and, more recently, to represent the relevance of genes in response to certain conditions. Results In this paper we introduce a Bootstrap procedure to test the null hypothesis that each gene has the same relevance between two conditions, where the relevance is represented by the Shapley value of a particular coalitional game defined on a microarray data-set. This method, which is called Comparative Analysis of Shapley value (shortly, CASh), is applied to data concerning the gene expression in children differentially exposed to air pollution. The results provided by CASh are compared with the results from a parametric statistical test for testing differential gene expression. Both lists of genes provided by CASh and t-test are informative enough to discriminate exposed subjects on the basis of their gene expression profiles. While many genes are selected in common by CASh and the parametric test, it turns out that the biological interpretation of the differences between these two selections is more interesting, suggesting a different interpretation of the main biological pathways in gene expression regulation for exposed individuals. A simulation study suggests that CASh offers more power than t-test for the detection of differential gene expression variability. Conclusion CASh is successfully applied to gene expression analysis of a data-set where the joint expression behavior of genes may be critical to characterize the expression response to air pollution. We demonstrate a synergistic effect between coalitional games and statistics that resulted in a selection of genes with a potential impact in the regulation of complex pathways. PMID:18764936

  17. Low statistical power in biomedical science: a review of three human research domains.

    PubMed

    Dumas-Mallet, Estelle; Button, Katherine S; Boraud, Thomas; Gonon, Francois; Munafò, Marcus R

    2017-02-01

    Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0-10% or 11-20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation.

  18. Low statistical power in biomedical science: a review of three human research domains

    PubMed Central

    Dumas-Mallet, Estelle; Button, Katherine S.; Boraud, Thomas; Gonon, Francois

    2017-01-01

    Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0–10% or 11–20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation. PMID:28386409

  19. 1/f² Characteristics and Isotropy in the Fourier Power Spectra of Visual Art, Cartoons, Comics, Mangas, and Different Categories of Photographs

    PubMed Central

    Koch, Michael; Denzler, Joachim; Redies, Christoph

    2010-01-01

    Art images and natural scenes have in common that their radially averaged (1D) Fourier spectral power falls according to a power-law with increasing spatial frequency (1/f² characteristics), which implies that the power spectra have scale-invariant properties. In the present study, we show that other categories of man-made images, cartoons and graphic novels (comics and mangas), have similar properties. Further on, we extend our investigations to 2D power spectra. In order to determine whether the Fourier power spectra of man-made images differed from those of other categories of images (photographs of natural scenes, objects, faces and plants and scientific illustrations), we analyzed their 2D power spectra by principal component analysis. Results indicated that the first fifteen principal components allowed a partial separation of the different image categories. The differences between the image categories were studied in more detail by analyzing whether the mean power and the slope of the power gradients from low to high spatial frequencies varied across orientations in the power spectra. Mean power was generally higher in cardinal orientations both in real-world photographs and artworks, with no systematic difference between the two types of images. However, the slope of the power gradients showed a lower degree of mean variability across spectral orientations (i.e., more isotropy) in art images, cartoons and graphic novels than in photographs of comparable subject matters. Taken together, these results indicate that art images, cartoons and graphic novels possess relatively uniform 1/f² characteristics across all orientations. In conclusion, the man-made stimuli studied, which were presumably produced to evoke pleasant and/or enjoyable visual perception in human observers, form a subset of all images and share statistical properties in their Fourier power spectra. Whether these properties are necessary or sufficient to induce aesthetic perception remains to be investigated. PMID:20808863

  20. 1/f² Characteristics and isotropy in the Fourier power spectra of visual art, cartoons, comics, mangas, and different categories of photographs.

    PubMed

    Koch, Michael; Denzler, Joachim; Redies, Christoph

    2010-08-19

    Art images and natural scenes have in common that their radially averaged (1D) Fourier spectral power falls according to a power-law with increasing spatial frequency (1/f² characteristics), which implies that the power spectra have scale-invariant properties. In the present study, we show that other categories of man-made images, cartoons and graphic novels (comics and mangas), have similar properties. Further on, we extend our investigations to 2D power spectra. In order to determine whether the Fourier power spectra of man-made images differed from those of other categories of images (photographs of natural scenes, objects, faces and plants and scientific illustrations), we analyzed their 2D power spectra by principal component analysis. Results indicated that the first fifteen principal components allowed a partial separation of the different image categories. The differences between the image categories were studied in more detail by analyzing whether the mean power and the slope of the power gradients from low to high spatial frequencies varied across orientations in the power spectra. Mean power was generally higher in cardinal orientations both in real-world photographs and artworks, with no systematic difference between the two types of images. However, the slope of the power gradients showed a lower degree of mean variability across spectral orientations (i.e., more isotropy) in art images, cartoons and graphic novels than in photographs of comparable subject matters. Taken together, these results indicate that art images, cartoons and graphic novels possess relatively uniform 1/f² characteristics across all orientations. In conclusion, the man-made stimuli studied, which were presumably produced to evoke pleasant and/or enjoyable visual perception in human observers, form a subset of all images and share statistical properties in their Fourier power spectra. Whether these properties are necessary or sufficient to induce aesthetic perception remains to be investigated.

  1. Toward "Constructing" the Concept of Statistical Power: An Optical Analogy.

    ERIC Educational Resources Information Center

    Rogers, Bruce G.

    This paper presents a visual analogy that may be used by instructors to teach the concept of statistical power in statistical courses. Statistical power is mathematically defined as the probability of rejecting a null hypothesis when that null is false, or, equivalently, the probability of detecting a relationship when it exists. The analogy…

  2. Statistics and Title VII Proof: Prima Facie Case and Rebuttal.

    ERIC Educational Resources Information Center

    Whitten, David

    1978-01-01

    The method and means by which statistics can raise a prima facie case of Title VII violation are analyzed. A standard is identified that can be applied to determine whether a statistical disparity is sufficient to shift the burden to the employer to rebut a prima facie case of discrimination. (LBH)

  3. Self sufficient wireless transmitter powered by foot-pumped urine operating wearable MFC.

    PubMed

    Taghavi, M; Stinchcombe, A; Greenman, J; Mattoli, V; Beccai, L; Mazzolai, B; Melhuish, C; Ieropoulos, I A

    2015-12-10

    The first self-sufficient system, powered by a wearable energy generator based on microbial fuel cell (MFC) technology, is introduced. MFCs made from compliant material were developed within the frame of a pair of socks, which were fed with urine via a manual gaiting pump. The simple, single-loop circulatory system of fish was used as the inspiration for the design of the manual pump. A wireless programmable communication module, engineered to operate within the range of the generated electricity, was employed, which opens a new avenue for research into the utilisation of waste products for powering portable as well as wearable electronics.

  4. 47 CFR 80.875 - VHF radiotelephone power supply.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false VHF radiotelephone power supply. 80.875 Section... to Subpart W § 80.875 VHF radiotelephone power supply. (a) There must be readily available for use under normal load conditions a power supply sufficient to simultaneously energize the VHF transmitter at...

  5. 47 CFR 80.875 - VHF radiotelephone power supply.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 5 2011-10-01 2011-10-01 false VHF radiotelephone power supply. 80.875 Section... to Subpart W § 80.875 VHF radiotelephone power supply. (a) There must be readily available for use under normal load conditions a power supply sufficient to simultaneously energize the VHF transmitter at...

  6. Risk Estimates and Risk Factors Related to Psychiatric Inpatient Suicide—An Overview

    PubMed Central

    Madsen, Trine; Erlangsen, Annette; Nordentoft, Merete

    2017-01-01

    People with mental illness have an increased risk of suicide. The aim of this paper is to provide an overview of suicide risk estimates among psychiatric inpatients based on the body of evidence found in scientific peer-reviewed literature; primarily focusing on the relative risks, rates, time trends, and socio-demographic and clinical risk factors of suicide in psychiatric inpatients. Psychiatric inpatients have a very high risk of suicide relative to the background population, but it remains challenging for clinicians to identify those patients that are most likely to die from suicide during admission. Most studies are based on low power, thus compromising quality and generalisability. The few studies with sufficient statistical power mainly identified non-modifiable risk predictors such as male gender, diagnosis, or recent deliberate self-harm. Also, the predictive value of these predictors is low. It would be of great benefit if future studies would be based on large samples while focusing on modifiable predictors over the course of an admission, such as hopelessness, depressive symptoms, and family/social situations. This would improve our chances of developing better risk assessment tools. PMID:28257103

  7. Multivariate Welch t-test on distances

    PubMed Central

    2016-01-01

    Motivation: Permutational non-Euclidean analysis of variance, PERMANOVA, is routinely used in exploratory analysis of multivariate datasets to draw conclusions about the significance of patterns visualized through dimension reduction. This method recognizes that the pairwise distance matrix between observations is sufficient to compute the within- and between-group sums of squares necessary to form the (pseudo) F statistic. Moreover, not only Euclidean, but arbitrary distances can be used. This method, however, suffers from loss of power and type I error inflation in the presence of heteroscedasticity and sample size imbalances. Results: We develop a solution in the form of a distance-based Welch t-test, TW2, for two sample potentially unbalanced and heteroscedastic data. We demonstrate empirically the desirable type I error and power characteristics of the new test. We compare the performance of PERMANOVA and TW2 in reanalysis of two existing microbiome datasets, where the methodology has originated. Availability and Implementation: The source code for methods and analysis of this article is available at https://github.com/alekseyenko/Tw2. Further guidance on application of these methods can be obtained from the author. Contact: alekseye@musc.edu PMID:27515741
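
    The TW2 statistic itself is defined in the paper and not reproduced here; the sketch below shows the standard PERMANOVA construction the abstract alludes to, in which the pairwise distance matrix alone yields the within- and between-group sums of squares and hence the pseudo-F (a permutation over group labels would then give the p-value):

```python
# Pseudo-F of PERMANOVA computed directly from a pairwise distance matrix,
# illustrating that the distance matrix is sufficient for the within/between
# sums of squares. (This is the standard PERMANOVA construction, not TW2.)
import numpy as np
from scipy.spatial.distance import pdist, squareform

def pseudo_F(D, labels):
    labels = np.asarray(labels)
    n = len(labels)
    groups = np.unique(labels)
    ss_total = np.sum(np.triu(D, 1) ** 2) / n        # sum of squared distances / n
    ss_within = 0.0
    for g in groups:
        idx = np.where(labels == g)[0]
        Dg = D[np.ix_(idx, idx)]
        ss_within += np.sum(np.triu(Dg, 1) ** 2) / len(idx)
    ss_between = ss_total - ss_within
    k = len(groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Toy example: Euclidean distances between two shifted Gaussian clouds
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0.0, 1.0, (20, 5)), rng.normal(0.8, 1.0, (25, 5))])
labels = [0] * 20 + [1] * 25
print(pseudo_F(squareform(pdist(X)), labels))
```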

  8. Multivariate Welch t-test on distances.

    PubMed

    Alekseyenko, Alexander V

    2016-12-01

    Permutational non-Euclidean analysis of variance, PERMANOVA, is routinely used in exploratory analysis of multivariate datasets to draw conclusions about the significance of patterns visualized through dimension reduction. This method recognizes that the pairwise distance matrix between observations is sufficient to compute the within- and between-group sums of squares necessary to form the (pseudo) F statistic. Moreover, not only Euclidean, but arbitrary distances can be used. This method, however, suffers from loss of power and type I error inflation in the presence of heteroscedasticity and sample size imbalances. We develop a solution in the form of a distance-based Welch t-test, TW2, for two-sample, potentially unbalanced and heteroscedastic data. We demonstrate empirically the desirable type I error and power characteristics of the new test. We compare the performance of PERMANOVA and TW2 in reanalysis of two existing microbiome datasets, where the methodology has originated. The source code for methods and analysis of this article is available at https://github.com/alekseyenko/Tw2. Further guidance on application of these methods can be obtained from the author: alekseye@musc.edu. © The Author 2016. Published by Oxford University Press.

  9. Assessment of risk factors for death in electrical injury.

    PubMed

    Dokov, William

    2009-02-01

    Fatal high-voltage injuries present a problem which has not yet been studied sufficiently in the context of interaction between the human body and electricity as a technical, anthropogenic and natural phenomenon. The forensic medicine records of 291 cases of death caused by high-voltage current over a 41-year period (1965-2006) were examined retrospectively. The descriptive statistical analyses were made using the SPSS 11.0 software. Death was found to result most commonly from contact between the deceased and elements of the power transmission and distribution grid (41.24%) and from the action of lightning (32.3%), the difference in their relative shares being insignificant. Much more rarely, death was due to contact with construction and repair electrical devices (7.56%) or with elements of the power transport railway infrastructure (6.87%). Death resulting from contact with agricultural electrical devices was only occasional (0.68%). The victims' average age was 36.19 years. Our analysis indicates that the relative share of victims is highest (43.98%) in the age range of 25 to 44 years. The ratio between women and men is 1:21.38.

  10. Retrocausal Effects As A Consequence of Orthodox Quantum Mechanics Refined To Accommodate The Principle Of Sufficient Reason

    NASA Astrophysics Data System (ADS)

    Stapp, Henry P.

    2011-11-01

    The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level, in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were to be biased in this way then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearances of backward-time-effects of the kind that have been reported in the scientific literature.

  11. Quantum theory of multiscale coarse-graining.

    PubMed

    Han, Yining; Jin, Jaehyeok; Wagner, Jacob W; Voth, Gregory A

    2018-03-14

    Coarse-grained (CG) models serve as a powerful tool to simulate molecular systems at much longer temporal and spatial scales. Previously, CG models and methods have been built upon classical statistical mechanics. The present paper develops a theory and numerical methodology for coarse-graining in quantum statistical mechanics, by generalizing the multiscale coarse-graining (MS-CG) method to quantum Boltzmann statistics. A rigorous derivation of the sufficient thermodynamic consistency condition is first presented via imaginary time Feynman path integrals. It identifies the optimal choice of CG action functional and effective quantum CG (qCG) force field to generate a quantum MS-CG (qMS-CG) description of the equilibrium system that is consistent with the quantum fine-grained model projected onto the CG variables. A variational principle then provides a class of algorithms for optimally approximating the qMS-CG force fields. Specifically, a variational method based on force matching, which was also adopted in the classical MS-CG theory, is generalized to quantum Boltzmann statistics. The qMS-CG numerical algorithms and practical issues in implementing this variational minimization procedure are also discussed. Then, two numerical examples are presented to demonstrate the method. Finally, as an alternative strategy, a quasi-classical approximation for the thermal density matrix expressed in the CG variables is derived. This approach provides an interesting physical picture for coarse-graining in quantum Boltzmann statistical mechanics in which the consistency with the quantum particle delocalization is obviously manifest, and it opens up an avenue for using path integral centroid-based effective classical force fields in a coarse-graining methodology.

  12. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size.

    PubMed

    Heidel, R Eric

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
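
    As a concrete counterpart to those five components, here is a hedged sketch of an a priori sample size calculation for a two-group comparison of a continuous outcome; the effect size, alpha, and power values are placeholders, and statsmodels' power module is assumed to be available.

```python
# A priori sample size for a two-sample t-test given an expected standardized
# effect size, alpha, and desired power. The numeric values are placeholders.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,        # Cohen's d (magnitude/variance of effect)
                                   alpha=0.05,             # significance level
                                   power=0.80,             # desired statistical power
                                   alternative='two-sided')
print(round(n_per_group))   # ≈ 64 participants per group
```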

  13. Vibration Transmission through Rolling Element Bearings in Geared Rotor Systems

    DTIC Science & Technology

    1990-11-01

    ...dynamic finite element techniques are used to develop the discrete vibration models, while the statistical energy analysis method is used for the broad... bearing system studies, geared rotor system studies, and statistical energy analysis. Each chapter is self-sufficient since it is written in a

  14. The Statistical Power of Planned Comparisons.

    ERIC Educational Resources Information Center

    Benton, Roberta L.

    Basic principles underlying statistical power are examined; and issues pertaining to effect size, sample size, error variance, and significance level are highlighted via the use of specific hypothetical examples. Analysis of variance (ANOVA) and related methods remain popular, although other procedures sometimes have more statistical power against…

  15. Statistical investigation of avalanches of three-dimensional small-world networks and their boundary and bulk cross-sections

    NASA Astrophysics Data System (ADS)

    Najafi, M. N.; Dashti-Naserabadi, H.

    2018-03-01

    In many situations we are interested in the propagation of energy in some portions of a three-dimensional system with dilute long-range links. In this paper, a sandpile model is defined on the three-dimensional small-world network with real dissipative boundaries and the energy propagation is studied in three dimensions as well as in the two-dimensional cross-sections. Two types of cross-sections are defined in the system, one in the bulk and another on the system boundary. The motivation of this is to make clear how the statistics of the avalanches in the bulk cross-section tend to the statistics of the dissipative avalanches, defined on the boundaries, as the concentration of long-range links (α) increases. This trend is numerically shown to be a power law in a manner described in the paper. Two regimes of α are considered in this work. For sufficiently small α the dominant behavior of the system is just like that of the regular BTW model, whereas for intermediate values the behavior is nontrivial with some exponents that are reported in the paper. It is shown that the spatial extent up to which the statistics is similar to the regular BTW model scales with α just like the dissipative BTW model with the dissipation factor (mass in the corresponding ghost model) m² ∼ α for the three-dimensional system as well as its two-dimensional cross-sections.

  16. Distribution of guidance models for cardiac resynchronization therapy in the setting of multi-center clinical trials

    NASA Astrophysics Data System (ADS)

    Rajchl, Martin; Abhari, Kamyar; Stirrat, John; Ukwatta, Eranga; Cantor, Diego; Li, Feng P.; Peters, Terry M.; White, James A.

    2014-03-01

    Multi-center trials provide the unique ability to investigate novel techniques across a range of geographical sites with sufficient statistical power, while the inclusion of multiple operators establishes feasibility under a wider array of clinical environments and workflows. For this purpose, we introduce a new means of distributing pre-procedural cardiac models for image-guided interventions across a large-scale multi-center trial. In this method, a single core facility is responsible for image processing, employing a novel web-based interface for model visualization and distribution. The requirements for such a WebGL-based interface are minimal and well within reach of participating centers. We then demonstrate the accuracy of our approach using a single-center pacemaker lead implantation trial with generic planning models.

  17. The impact of the Sarbanes Oxley Act on auditing fees: An empirical study of the oil and gas industry

    NASA Astrophysics Data System (ADS)

    Ezelle, Ralph Wayne, Jr.

    2011-12-01

    This study examines the auditing of energy firms before and after the Sarbanes-Oxley Act of 2002. The research explores factors affecting the asset-adjusted audit fees of oil and gas companies and specifically examines the effect of the Sarbanes-Oxley Act. It analyzes multiple years of audit fees for firms engaged in the oil and gas industry. Pooled samples were created to improve statistical power, with sample sizes sufficient to test for medium and large effect sizes. The Sarbanes-Oxley Act significantly increases a firm's asset-adjusted audit fees. Additional findings are that part of the variance in audit fees was attributable to the market value of the enterprise, the number of subsidiaries, receivables and inventory, the debt ratio, non-profitability, and receipt of a going concern report.

  18. Measuring technical and mathematical investigation of multiple reignitions at the switching of a motor using vacuum circuit breakers

    NASA Astrophysics Data System (ADS)

    Luxa, Andreas

    The necessary conditions in switching system and vacuum circuit breaker for the occurrence of multiple re-ignitions and accompanying effects were examined. The shape of the occurring voltages was determined in relationship to other types of overvoltage. A phenomenological model of the arc, based on an extension of the Mayr equation for arcs was used with the simulation program NETOMAC for the switching transients. Factors which affect the arc parameters were analyzed. The results were statistically verified by 3000 three-phase switching tests on 3 standard vacuum circuit breakers under realistic systems conditions; the occurring overvoltage level was measured. Dimensioning criteria for motor simulation circuits in power plants were formulated on the basis of a theoretical equivalence analysis and experimental studies. The simulation model allows a sufficiently correct estimation of all effects.

  19. Metaresearch for Evaluating Reproducibility in Ecology and Evolution

    PubMed Central

    Fidler, Fiona; Chee, Yung En; Wintle, Bonnie C.; Burgman, Mark A.; McCarthy, Michael A.; Gordon, Ascelin

    2017-01-01

    Abstract Recent replication projects in other disciplines have uncovered disturbingly low levels of reproducibility, suggesting that those research literatures may contain unverifiable claims. The conditions contributing to irreproducibility in other disciplines are also present in ecology. These include a large discrepancy between the proportion of “positive” or “significant” results and the average statistical power of empirical research, incomplete reporting of sampling stopping rules and results, journal policies that discourage replication studies, and a prevailing publish-or-perish research culture that encourages questionable research practices. We argue that these conditions constitute sufficient reason to systematically evaluate the reproducibility of the evidence base in ecology and evolution. In some cases, the direct replication of ecological research is difficult because of strong temporal and spatial dependencies, so here, we propose metaresearch projects that will provide proxy measures of reproducibility. PMID:28596617

  20. Analyzing the Impacts of Increased Wind Power on Generation Revenue Sufficiency: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Wu, Hongyu; Tan, Jin

    2016-08-01

    The Revenue Sufficiency Guarantee (RSG), as part of make-whole (or uplift) payments in electricity markets, is designed to recover the generation resources' offer-based production costs that are not otherwise covered by their market revenues. Increased penetrations of wind power will bring significant impacts to the RSG payments in the markets. However, literature related to this topic is sparse. This paper first reviews the industrial practices of implementing RSG in major U.S. independent system operators (ISOs) and regional transmission operators (RTOs) and then develops a general RSG calculation method. Finally, an 18-bus test system is adopted to demonstrate the impacts of increased wind power on RSG payments.

  1. Analyzing the Impacts of Increased Wind Power on Generation Revenue Sufficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Wu, Hongyu; Tan, Jin

    2016-11-14

    The Revenue Sufficiency Guarantee (RSG), as part of make-whole (or uplift) payments in electricity markets, is designed to recover the generation resources' offer-based production costs that are not otherwise covered by their market revenues. Increased penetrations of wind power will bring significant impacts to the RSG payments in the markets. However, literature related to this topic is sparse. This paper first reviews the industrial practices of implementing RSG in major U.S. independent system operators (ISOs) and regional transmission operators (RTOs) and then develops a general RSG calculation method. Finally, an 18-bus test system is adopted to demonstrate the impacts of increased wind power on RSG payments.

  2. Improving the analysis of composite endpoints in rare disease trials.

    PubMed

    McMenamin, Martina; Berglind, Anna; Wason, James M S

    2018-05-22

    Composite endpoints are recommended in rare diseases to increase power and/or to sufficiently capture complexity. Often, they are in the form of responder indices which contain a mixture of continuous and binary components. Analyses of these outcomes typically treat them as binary, thus only using the dichotomisations of continuous components. The augmented binary method offers a more efficient alternative and is therefore especially useful for rare diseases. Previous work has indicated the method may have poorer statistical properties when the sample size is small. Here we investigate small sample properties and implement small sample corrections. We re-sample from a previous trial with sample sizes varying from 30 to 80. We apply the standard binary and augmented binary methods and determine the power, type I error rate, coverage and average confidence interval width for each of the estimators. We implement Firth's adjustment for the binary component models and a small sample variance correction for the generalized estimating equations, applying the small sample adjusted methods to each sub-sample as before for comparison. For the log-odds treatment effect the power of the augmented binary method is 20-55% compared to 12-20% for the standard binary method. Both methods have approximately nominal type I error rates. The difference in response probabilities exhibit similar power but both unadjusted methods demonstrate type I error rates of 6-8%. The small sample corrected methods have approximately nominal type I error rates. On both scales, the reduction in average confidence interval width when using the adjusted augmented binary method is 17-18%. This is equivalent to requiring a 32% smaller sample size to achieve the same statistical power. The augmented binary method with small sample corrections provides a substantial improvement for rare disease trials using composite endpoints. We recommend the use of the method for the primary analysis in relevant rare disease trials. We emphasise that the method should be used alongside other efforts in improving the quality of evidence generated from rare disease trials rather than replace them.
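
    The reported equivalence between a 17-18% narrower confidence interval and a roughly 32% smaller required sample size follows from the interval width scaling approximately as 1/sqrt(n); a quick check of that arithmetic:

```python
# Back-of-the-envelope check: CI width scales roughly as 1/sqrt(n), so a width
# reduction r implies a required-sample-size ratio of about (1 - r)^2.
for r in (0.17, 0.18):
    print(f"width reduction {r:.0%} -> sample size ratio {(1 - r) ** 2:.2f}")
# ≈ 0.69 and 0.67, i.e. roughly a 32% smaller sample for the same power
```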

  3. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    PubMed

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.
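
    The power estimates above come from simulation; the general simulate-test-count-rejections recipe is sketched below with an ordinary two-sample t-test standing in for the proportional-hazards checks, since the paper's survival data-generating models and test procedures are not reproduced here.

```python
# Generic Monte Carlo power estimation: simulate data under the alternative,
# apply the test, and record the fraction of rejections. A two-sample t-test
# stands in for the proportional-hazards assessments studied in the paper.
import numpy as np
from scipy import stats

def simulated_power(effect=0.5, n=50, alpha=0.05, n_sim=5_000, seed=0):
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sim):
        a = rng.normal(0.0, 1.0, n)
        b = rng.normal(effect, 1.0, n)          # data generated under the alternative
        if stats.ttest_ind(a, b).pvalue < alpha:
            rejections += 1
    return rejections / n_sim

print(simulated_power())   # ≈ 0.70 for d = 0.5 with n = 50 per group
```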

  4. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model

    PubMed Central

    Austin, Peter C.

    2017-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest. PMID:29321694

  5. Retrocausal Effects as a Consequence of Quantum Mechanics Refined to Accommodate the Principle of Sufficient Reason

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stapp, Henry P.

    2011-05-10

    The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level, in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were to be biased in this way then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearances of backward-time-effects of the kind that have been reported in the scientific literature.

  6. 75 FR 45197 - Notice of Buy America Waiver Request by Northern New England Passenger Rail Authority To Purchase...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-02

    ... in the United States in a sufficient and reasonably available amount or are not of a satisfactory... sufficient and reasonably available amount or are not of a satisfactory quality; (C) rolling stock or power...

  7. Explorations in Statistics: Power

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2010-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four…

  8. An Examination of Statistical Power in Multigroup Dynamic Structural Equation Models

    ERIC Educational Resources Information Center

    Prindle, John J.; McArdle, John J.

    2012-01-01

    This study used statistical simulation to calculate differential statistical power in dynamic structural equation models with groups (as in McArdle & Prindle, 2008). Patterns of between-group differences were simulated to provide insight into how model parameters influence power approximations. Chi-square and root mean square error of…

  9. Direct and indirect comparison meta-analysis of levetiracetam versus phenytoin or valproate for convulsive status epilepticus.

    PubMed

    Brigo, Francesco; Bragazzi, Nicola; Nardone, Raffaele; Trinka, Eugen

    2016-11-01

    The aim of this study was to conduct a meta-analysis of published studies to directly compare intravenous (IV) levetiracetam (LEV) with IV phenytoin (PHT) or IV valproate (VPA) as second-line treatment of status epilepticus (SE), to indirectly compare IV LEV with IV VPA using common reference-based indirect comparison meta-analysis, and to verify whether results of indirect comparisons are consistent with results of head-to-head randomized controlled trials (RCTs) directly comparing IV LEV with IV VPA. Random-effects Mantel-Haenszel meta-analyses were used to obtain odds ratios (ORs) for the efficacy and safety of LEV versus VPA and of LEV or VPA versus PHT. Adjusted indirect comparisons between LEV and VPA were used. Two RCTs comparing LEV with PHT (144 episodes of SE) and 3 RCTs comparing VPA with PHT (227 episodes of SE) were included. Direct comparisons showed no difference in clinical seizure cessation, either between VPA and PHT (OR: 1.07; 95% CI: 0.57 to 2.03) or between LEV and PHT (OR: 1.18; 95% CI: 0.50 to 2.79). Indirect comparisons showed no difference between LEV and VPA for clinical seizure cessation (OR: 1.16; 95% CI: 0.45 to 2.97). Results of indirect comparisons are consistent with results of a recent RCT directly comparing LEV with VPA. The absence of a statistically significant difference in direct and indirect comparisons is due to the lack of sufficient statistical power to detect a difference. Conducting an RCT that does not include enough participants to detect a clinically important difference or to estimate an effect with sufficient precision can be regarded as a waste of time and resources and may raise several ethical concerns, especially in RCTs on SE. Copyright © 2016 Elsevier Inc. All rights reserved.
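
    A standard way to perform the adjusted indirect comparison mentioned above is the Bucher method: subtract the log odds ratios that share the common comparator (PHT) and add their variances. Whether this matches the authors' exact procedure is not stated; the sketch below applies the formula to the odds ratios quoted in the abstract, so the result will differ slightly from the published 1.16 (0.45 to 2.97) because the confidence limits are reported to only two decimals.

```python
import math

def se_from_ci(lower, upper, z=1.96):
    """Standard error of a log odds ratio recovered from its 95% CI."""
    return (math.log(upper) - math.log(lower)) / (2 * z)

def bucher_indirect(or_ab, ci_ab, or_cb, ci_cb, z=1.96):
    """Adjusted indirect comparison of A vs C via common comparator B:
    log OR_AC = log OR_AB - log OR_CB, and the variances add."""
    log_or = math.log(or_ab) - math.log(or_cb)
    se = math.hypot(se_from_ci(*ci_ab), se_from_ci(*ci_cb))
    return (math.exp(log_or),
            math.exp(log_or - z * se),
            math.exp(log_or + z * se))

# LEV vs PHT: OR 1.18 (0.50-2.79); VPA vs PHT: OR 1.07 (0.57-2.03), from the abstract
or_est, lo, hi = bucher_indirect(1.18, (0.50, 2.79), 1.07, (0.57, 2.03))
print(f"Indirect LEV vs VPA: OR {or_est:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```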

  10. Analysis of Clinical Cohort Data Using Nested Case-control and Case-cohort Sampling Designs. A Powerful and Economical Tool.

    PubMed

    Ohneberg, K; Wolkewitz, M; Beyersmann, J; Palomar-Martinez, M; Olaechea-Astigarraga, P; Alvarez-Lerma, F; Schumacher, M

    2015-01-01

    Sampling from a large cohort in order to derive a subsample sufficient for statistical analysis is a frequently used method for handling large data sets in epidemiological studies with limited resources for exposure measurement. For clinical studies, however, when interest is in the influence of a potential risk factor, cohort studies are often the first choice, with all individuals entering the analysis. Our aim is to close the gap between epidemiological and clinical studies with respect to design and power considerations. Schoenfeld's formula for the number of events required for a Cox proportional hazards model is fundamental. Our objective is to compare the power of analyzing the full cohort with the power of a nested case-control and a case-cohort design. We compare formulas for power for sampling designs and cohort studies. In our data example we simultaneously apply a nested case-control design with a varying number of controls matched to each case, a case-cohort design with varying subcohort size, a random subsample, and a full cohort analysis. For each design we calculate the standard error of the estimated regression coefficients and the mean number of distinct persons for whom covariate information is required. The formulas for the power of a nested case-control design and of a case-cohort design are directly connected to the power of a cohort study through the well-known Schoenfeld formula. The loss in precision of parameter estimates is relatively small compared with the saving in resources. Nested case-control and case-cohort studies, but not random subsamples, yield an attractive alternative for analyzing clinical studies in the situation of a low event rate. Power calculations can be conducted straightforwardly to quantify the loss of power relative to the savings in the number of patients when using a sampling design instead of analyzing the full cohort.
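
    For reference, a small sketch of Schoenfeld's event-number formula that the abstract builds on, for a binary covariate in a Cox model; the hazard ratio and exposure proportion used below are illustrative inputs, not values from the paper.

```python
import math
from scipy.stats import norm

def schoenfeld_events(hazard_ratio, p_exposed=0.5, alpha=0.05, power=0.80):
    """Events needed for a two-sided test of a binary covariate in a Cox model
    (Schoenfeld's formula). p_exposed is the proportion in the exposed group."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return z ** 2 / (p_exposed * (1 - p_exposed) * math.log(hazard_ratio) ** 2)

# Illustrative input: a hazard ratio of 1.5 with a 1:1 exposure split
print(math.ceil(schoenfeld_events(hazard_ratio=1.5)))   # roughly 190 events
```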

  11. Ignition in tokamaks with modulated source of auxiliary heating

    NASA Astrophysics Data System (ADS)

    Morozov, D. Kh

    2017-12-01

    It is shown that ignition may be achieved in tokamaks with a modulated power source. The time-averaged source power may be smaller than the steady-state source power that would be sufficient for ignition. Nevertheless, the peak power must be large enough, because ignition must be achieved within a finite time interval.

  12. Fluidic-thermochromic display device

    NASA Technical Reports Server (NTRS)

    Grafstein, D.; Hilborn, E. H.

    1968-01-01

    A fluidic decoder and display device has low power requirements for temperature control of thermochromic materials. An electro-to-fluid converter translates incoming electrical signals into pneumatic signals of sufficient power to operate the fluidic logic elements.

  13. Electric power annual 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The Electric Power Annual presents a summary of electric utility statistics at national, regional and State levels. The objective of the publication is to provide industry decisionmakers, government policymakers, analysts and the general public with historical data that may be used in understanding US electricity markets. The Electric Power Annual is prepared by the Survey Management Division; Office of Coal, Nuclear, Electric and Alternate Fuels; Energy Information Administration (EIA); US Department of Energy. "The US Electric Power Industry at a Glance" section presents a profile of the electric power industry ownership and performance, and a review of key statistics for the year. Subsequent sections present data on generating capability, including proposed capability additions; net generation; fossil-fuel statistics; retail sales; revenue; financial statistics; environmental statistics; electric power transactions; demand-side management; and nonutility power producers. In addition, the appendices provide supplemental data on major disturbances and unusual occurrences in US electricity power systems. Each section contains related text and tables and refers the reader to the appropriate publication that contains more detailed data on the subject matter. Monetary values in this publication are expressed in nominal terms.

  14. Comparison of methods for calculating conditional expectations of sufficient statistics for continuous time Markov chains.

    PubMed

    Tataru, Paula; Hobolth, Asger

    2011-12-05

    Continuous time Markov chains (CTMCs) are a widely used model for describing the evolution of DNA sequences on the nucleotide, amino acid or codon level. The sufficient statistics for CTMCs are the time spent in a state and the number of changes between any two states. In applications, past evolutionary events (exact times and types of changes) are inaccessible and the past must be inferred from DNA sequence data observed in the present. We describe and implement three algorithms for computing linear combinations of expected values of the sufficient statistics, conditioned on the end-points of the chain, and compare their performance with respect to accuracy and running time. The first algorithm is based on an eigenvalue decomposition of the rate matrix (EVD), the second on uniformization (UNI), and the third on integrals of matrix exponentials (EXPM). The implementation in R of the algorithms is available at http://www.birc.au.dk/~paula/. We use two different models to analyze the accuracy and eight experiments to investigate the speed of the three algorithms. We find that they have similar accuracy and that EXPM is the slowest method. Furthermore we find that UNI is usually faster than EVD.
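
    A compact sketch of the EXPM idea for one of the sufficient statistics (the expected time spent in a state, conditioned on the end-points), using Van Loan's block-matrix identity with SciPy; this is not the authors' R implementation, and the toy rate matrix below is illustrative.

```python
import numpy as np
from scipy.linalg import expm

def expected_time_in_state(Q, T, a):
    """Conditional expectation of the time spent in state `a` on [0, T],
    given the start and end states. Van Loan's identity: the upper-right block
    of expm([[Q, E_a], [0, Q]] * T) equals int_0^T expm(Q s) E_a expm(Q (T-s)) ds,
    where E_a has a single 1 at position (a, a)."""
    n = Q.shape[0]
    E_a = np.zeros((n, n))
    E_a[a, a] = 1.0
    block = np.zeros((2 * n, 2 * n))
    block[:n, :n] = Q
    block[:n, n:] = E_a
    block[n:, n:] = Q
    integral = expm(block * T)[:n, n:]
    P_T = expm(Q * T)                 # transition probabilities over [0, T]
    return integral / P_T             # entry (i, j): E[time in a | X(0)=i, X(T)=j]

# A toy 2-state rate matrix (rows sum to zero)
Q = np.array([[-1.0, 1.0],
              [ 2.0, -2.0]])
print(expected_time_in_state(Q, T=1.0, a=0))
```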

  15. 46 CFR 58.05-5 - Astern power.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Astern power. 58.05-5 Section 58.05-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING MAIN AND AUXILIARY MACHINERY AND RELATED SYSTEMS Main Propulsion Machinery § 58.05-5 Astern power. (a) All vessels shall have sufficient...

  16. 46 CFR 58.05-5 - Astern power.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Astern power. 58.05-5 Section 58.05-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING MAIN AND AUXILIARY MACHINERY AND RELATED SYSTEMS Main Propulsion Machinery § 58.05-5 Astern power. (a) All vessels shall have sufficient...

  17. 46 CFR 58.05-5 - Astern power.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Astern power. 58.05-5 Section 58.05-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING MAIN AND AUXILIARY MACHINERY AND RELATED SYSTEMS Main Propulsion Machinery § 58.05-5 Astern power. (a) All vessels shall have sufficient...

  18. 46 CFR 58.05-5 - Astern power.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Astern power. 58.05-5 Section 58.05-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING MAIN AND AUXILIARY MACHINERY AND RELATED SYSTEMS Main Propulsion Machinery § 58.05-5 Astern power. (a) All vessels shall have sufficient...

  19. 46 CFR 58.05-5 - Astern power.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Astern power. 58.05-5 Section 58.05-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING MAIN AND AUXILIARY MACHINERY AND RELATED SYSTEMS Main Propulsion Machinery § 58.05-5 Astern power. (a) All vessels shall have sufficient...

  20. Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power

    ERIC Educational Resources Information Center

    Miciak, Jeremy; Taylor, W. Pat; Stuebing, Karla K.; Fletcher, Jack M.; Vaughn, Sharon

    2016-01-01

    An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated…

  1. The Importance of Teaching Power in Statistical Hypothesis Testing

    ERIC Educational Resources Information Center

    Olinsky, Alan; Schumacher, Phyllis; Quinn, John

    2012-01-01

    In this paper, we discuss the importance of teaching power considerations in statistical hypothesis testing. Statistical power analysis determines the ability of a study to detect a meaningful effect size, where the effect size is the difference between the hypothesized value of the population parameter under the null hypothesis and the true value…

  2. Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power

    PubMed Central

    Miciak, Jeremy; Taylor, W. Pat; Stuebing, Karla K.; Fletcher, Jack M.; Vaughn, Sharon

    2016-01-01

    An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated measures. This can result in attenuated pretest-posttest correlations, reducing the variance explained by the pretest covariate. We investigated the implications of two potential range restriction scenarios: direct truncation on a selection measure and indirect range restriction on correlated measures. Empirical and simulated data indicated direct range restriction on the pretest covariate greatly reduced statistical power and necessitated sample size increases of 82%–155% (dependent on selection criteria) to achieve equivalent statistical power to parameters with unrestricted samples. However, measures demonstrating indirect range restriction required much smaller sample size increases (32%–71%) under equivalent scenarios. Additional analyses manipulated the correlations between measures and pretest-posttest correlations to guide planning experiments. Results highlight the need to differentiate between selection measures and potential covariates and to investigate range restriction as a factor impacting statistical power. PMID:28479943

  3. Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power.

    PubMed

    Miciak, Jeremy; Taylor, W Pat; Stuebing, Karla K; Fletcher, Jack M; Vaughn, Sharon

    2016-01-01

    An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated measures. This can result in attenuated pretest-posttest correlations, reducing the variance explained by the pretest covariate. We investigated the implications of two potential range restriction scenarios: direct truncation on a selection measure and indirect range restriction on correlated measures. Empirical and simulated data indicated direct range restriction on the pretest covariate greatly reduced statistical power and necessitated sample size increases of 82%-155% (dependent on selection criteria) to achieve equivalent statistical power to parameters with unrestricted samples. However, measures demonstrating indirect range restriction required much smaller sample size increases (32%-71%) under equivalent scenarios. Additional analyses manipulated the correlations between measures and pretest-posttest correlations to guide planning experiments. Results highlight the need to differentiate between selection measures and potential covariates and to investigate range restriction as a factor impacting statistical power.
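
    A small simulation in the spirit of the abstract, with hypothetical parameter values (pretest-posttest correlation 0.70, bottom 30% selected on the pretest): direct truncation attenuates the correlation, and because adding a baseline covariate to a randomized comparison scales the required sample size by roughly (1 - r^2), the attenuation translates into a sample-size inflation.

```python
import numpy as np

rng = np.random.default_rng(0)

def attenuated_correlation(rho=0.70, keep_fraction=0.30, n=200_000):
    """Pretest-posttest correlation after direct truncation: only the bottom
    `keep_fraction` of the pretest distribution enters the study."""
    pre = rng.standard_normal(n)
    post = rho * pre + np.sqrt(1 - rho ** 2) * rng.standard_normal(n)
    cut = np.quantile(pre, keep_fraction)
    sel = pre <= cut
    return np.corrcoef(pre[sel], post[sel])[0, 1]

rho_full = 0.70
rho_trunc = attenuated_correlation(rho_full)

# For a randomized pretest-posttest design, the pretest covariate scales the
# required sample size by (1 - r^2); the inflation from truncation is the ratio.
inflation = (1 - rho_trunc ** 2) / (1 - rho_full ** 2)
print(f"correlation after truncation: {rho_trunc:.2f}")
print(f"sample size inflation vs. unrestricted sample: {inflation:.2f}x")
```

    Stronger truncation (smaller keep_fraction) attenuates the correlation further and pushes the inflation factor toward the upper end of the range reported in the abstract.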

  4. Recruitment of Older Adults: Success May Be in the Details.

    PubMed

    McHenry, Judith C; Insel, Kathleen C; Einstein, Gilles O; Vidrine, Amy N; Koerner, Kari M; Morrow, Daniel G

    2015-10-01

    Describe recruitment strategies used in a randomized clinical trial of a behavioral prospective memory intervention to improve medication adherence for older adults taking antihypertensive medication. Recruitment strategies represent 4 themes: accessing an appropriate population, communication and trust-building, providing comfort and security, and expressing gratitude. Recruitment activities resulted in 276 participants with a mean age of 76.32 years, and study enrollment included 207 women, 69 men, and 54 persons representing ethnic minorities. Recruitment success was linked to cultivating relationships with community-based organizations, face-to-face contact with potential study participants, and providing service (e.g., blood pressure checks) as an access point to eligible participants. Seventy-two percent of potential participants who completed a follow-up call and met eligibility criteria were enrolled in the study. The attrition rate was 14.34%. The projected increase in the number of older adults intensifies the need to study interventions that improve health outcomes. The challenge is to recruit sufficient numbers of participants who are also representative of older adults to test these interventions. Failing to recruit a sufficient and representative sample can compromise statistical power and the generalizability of study findings. © The Author 2012. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. The nature-disorder paradox: A perceptual study on how nature is disorderly yet aesthetically preferred.

    PubMed

    Kotabe, Hiroki P; Kardan, Omid; Berman, Marc G

    2017-08-01

    Natural environments have powerful aesthetic appeal linked to their capacity for psychological restoration. In contrast, disorderly environments are aesthetically aversive, and have various detrimental psychological effects. But in our research, we have repeatedly found that natural environments are perceptually disorderly. What could explain this paradox? We present 3 competing hypotheses: the aesthetic preference for naturalness is more powerful than the aesthetic aversion to disorder (the nature-trumps-disorder hypothesis ); disorder is trivial to aesthetic preference in natural contexts (the harmless-disorder hypothesis ); and disorder is aesthetically preferred in natural contexts (the beneficial-disorder hypothesis ). Utilizing novel methods of perceptual study and diverse stimuli, we rule in the nature-trumps-disorder hypothesis and rule out the harmless-disorder and beneficial-disorder hypotheses. In examining perceptual mechanisms, we find evidence that high-level scene semantics are both necessary and sufficient for the nature-trumps-disorder effect. Necessity is evidenced by the effect disappearing in experiments utilizing only low-level visual stimuli (i.e., where scene semantics have been removed) and experiments utilizing a rapid-scene-presentation procedure that obscures scene semantics. Sufficiency is evidenced by the effect reappearing in experiments utilizing noun stimuli which remove low-level visual features. Furthermore, we present evidence that the interaction of scene semantics with low-level visual features amplifies the nature-trumps-disorder effect-the effect is weaker both when statistically adjusting for quantified low-level visual features and when using noun stimuli which remove low-level visual features. These results have implications for psychological theories bearing on the joint influence of low- and high-level perceptual inputs on affect and cognition, as well as for aesthetic design. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. The War Powers Resolution: Intent Implementation and Impact

    DTIC Science & Technology

    1993-04-01

    separation of powers, the authority as Commander-in-Chief is also specifically delegated to the President. The clear intent of the founders of our nation was... separation of powers spelled out -- in sufficient detail, they thought -- so that there would be little or no ambiguity over who exercised what powers...OF CONFLICT SEPARATION OF POWERS As discussed in the opening paragraphs of this paper, the founding fathers intentionally delegated 20 separate powers

  7. New powerful statistics for alignment-free sequence comparison under a pattern transfer model.

    PubMed

    Liu, Xuemei; Wan, Lin; Li, Jing; Reinert, Gesine; Waterman, Michael S; Sun, Fengzhu

    2011-09-07

    Alignment-free sequence comparison is widely used for comparing gene regulatory regions and for identifying horizontally transferred genes. Recent studies on the power of a widely used alignment-free comparison statistic D2 and its variants D2* and D2s showed that their power approximates a limit smaller than 1 as the sequence length tends to infinity under a pattern transfer model. We develop new alignment-free statistics based on D2, D2* and D2s by comparing local sequence pairs and then summing over all the local sequence pairs of certain length. We show that the new statistics are much more powerful than the corresponding statistics and the power tends to 1 as the sequence length tends to infinity under the pattern transfer model. Copyright © 2011 Elsevier Ltd. All rights reserved.
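
    For orientation, a sketch of the basic D2 word-count statistic that the new local statistics build on; the local-pair summation introduced in the paper is not reproduced here, and the toy sequences and word length are arbitrary.

```python
from collections import Counter

def kmer_counts(seq, k):
    """Counts of all overlapping k-words in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k):
    """The basic D2 statistic: the sum over k-words of the product of their
    counts in the two sequences."""
    counts_a = kmer_counts(seq_a, k)
    counts_b = kmer_counts(seq_b, k)
    return sum(counts_a[w] * counts_b[w] for w in counts_a.keys() & counts_b.keys())

print(d2("ACGTACGTGG", "ACGTTTACGA", k=3))
```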

  8. New Powerful Statistics for Alignment-free Sequence Comparison Under a Pattern Transfer Model

    PubMed Central

    Liu, Xuemei; Wan, Lin; Li, Jing; Reinert, Gesine; Waterman, Michael S.; Sun, Fengzhu

    2011-01-01

    Alignment-free sequence comparison is widely used for comparing gene regulatory regions and for identifying horizontally transferred genes. Recent studies on the power of a widely used alignment-free comparison statistic D2 and its variants D2∗ and D2s showed that their power approximates a limit smaller than 1 as the sequence length tends to infinity under a pattern transfer model. We develop new alignment-free statistics based on D2, D2∗ and D2s by comparing local sequence pairs and then summing over all the local sequence pairs of certain length. We show that the new statistics are much more powerful than the corresponding statistics and the power tends to 1 as the sequence length tends to infinity under the pattern transfer model. PMID:21723298

  9. Center for Space Power, Texas A and M University

    NASA Astrophysics Data System (ADS)

    Jones, Ken

    Johnson Controls is a 106 year old company employing 42,000 people worldwide with $4.7 billion annual sales. Though we are new to the aerospace industry we are a world leader in automobile battery manufacturing, automotive seating, plastic bottling, and facilities environment controls. The battery division produces over 24,000,000 batteries annually under private label for the new car manufacturers and the replacement market. We are entering the aerospace market with the nickel hydrogen battery with the help of NASA's Center for Space Power at Texas A&M. Unlike traditional nickel hydrogen battery manufacturers, we are reaching beyond the space applications to the higher volume markets of aircraft starting and utility load leveling. Though space applications alone will not provide sufficient volume to support the economies of scale and opportunities for statistical process control, these additional terrestrial applications will. For example, nickel hydrogen batteries do not have the environmental problems of nickel cadmium or lead acid and may someday start your car or power your electric vehicle. However you envision the future, keep in mind that no manufacturer moves into a large volume market without fine tuning their process. The Center for Space Power at Texas A&M is providing indepth technical analysis of all of the materials and fabricated parts of our battery as well as thermal and mechanical design computer modeling. Several examples of what we are doing with nickel hydrogen chemistry to lead to these production efficiencies are presented.

  10. Center for Space Power, Texas A and M University

    NASA Technical Reports Server (NTRS)

    Jones, Ken

    1991-01-01

    Johnson Controls is a 106 year old company employing 42,000 people worldwide with $4.7 billion annual sales. Though we are new to the aerospace industry we are a world leader in automobile battery manufacturing, automotive seating, plastic bottling, and facilities environment controls. The battery division produces over 24,000,000 batteries annually under private label for the new car manufacturers and the replacement market. We are entering the aerospace market with the nickel hydrogen battery with the help of NASA's Center for Space Power at Texas A&M. Unlike traditional nickel hydrogen battery manufacturers, we are reaching beyond the space applications to the higher volume markets of aircraft starting and utility load leveling. Though space applications alone will not provide sufficient volume to support the economies of scale and opportunities for statistical process control, these additional terrestrial applications will. For example, nickel hydrogen batteries do not have the environmental problems of nickel cadmium or lead acid and may someday start your car or power your electric vehicle. However you envision the future, keep in mind that no manufacturer moves into a large volume market without fine tuning their process. The Center for Space Power at Texas A&M is providing indepth technical analysis of all of the materials and fabricated parts of our battery as well as thermal and mechanical design computer modeling. Several examples of what we are doing with nickel hydrogen chemistry to lead to these production efficiencies are presented.

  11. Sample size determination for mediation analysis of longitudinal data.

    PubMed

    Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying

    2018-03-27

    Sample size planning for longitudinal data is crucial when designing mediation studies because sufficient statistical power is not only required in grant applications and peer-reviewed publications, but is essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of longitudinal designs. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides the sample size required to achieve 80% power by simulations under various sizes of the mediation effect, within-subject correlations and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution of the product method and the bootstrap method. Among the three methods of testing the mediation effects, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution of the product method performed similarly and were more powerful than Sobel's method, as reflected by the relatively smaller sample sizes. For all three methods, the sample size required to achieve 80% power depended on the value of the ICC (i.e., within-subject correlation). A larger value of ICC typically required a larger sample size to achieve 80% power. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the scenarios most often encountered in practice have also been provided for convenient use. An extensive simulation study showed that the distribution of the product method and the bootstrapping method outperform Sobel's method, but the distribution of the product method was recommended for use in practice because of its lower computational load compared with bootstrapping. An R package has been developed for sample size determination with the product method in longitudinal mediation study designs.
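
    A simplified, single-level sketch of the Monte Carlo approach to power for Sobel's test (the paper's multilevel model, ICC, and repeated measures are not reproduced); the path coefficients, sample size, and replication count below are illustrative.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(42)

def sobel_power(n=100, a=0.30, b=0.30, n_sims=1000, alpha=0.05):
    """Monte Carlo power of Sobel's test for the indirect effect a*b in a
    single-level mediation model X -> M -> Y with standard-normal errors."""
    z_crit = stats.norm.ppf(1 - alpha / 2)
    hits = 0
    for _ in range(n_sims):
        x = rng.standard_normal(n)
        m = a * x + rng.standard_normal(n)
        y = b * m + rng.standard_normal(n)
        fit_a = sm.OLS(m, sm.add_constant(x)).fit()                        # path a
        fit_b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit()  # path b, X as covariate
        a_hat, se_a = fit_a.params[1], fit_a.bse[1]
        b_hat, se_b = fit_b.params[1], fit_b.bse[1]
        z = a_hat * b_hat / np.sqrt(a_hat**2 * se_b**2 + b_hat**2 * se_a**2)
        hits += abs(z) > z_crit
    return hits / n_sims

print("Estimated power of Sobel's test:", sobel_power())
```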

  12. Replication Unreliability in Psychology: Elusive Phenomena or “Elusive” Statistical Power?

    PubMed Central

    Tressoldi, Patrizio E.

    2012-01-01

    The focus of this paper is to analyze whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Under Null Hypothesis Statistical Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as examples the results of meta-analyses related to four different controversial phenomena (subliminal semantic priming, the incubation effect for problem solving, unconscious thought theory, and non-local perception), it was found that, except for semantic priming on categorization, the statistical power to detect the expected effect size (ES) of the typical study is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low ESs. We conclude by providing some suggestions on how to increase the statistical power or use different statistical approaches to help discriminate whether the results obtained may or may not be used to support or to refute the reality of a phenomenon with small ES. PMID:22783215
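
    A quick way to make the paper's point concrete is to compute the power of a typical study for a small-to-moderate effect; the sketch below assumes the statsmodels package and uses illustrative numbers (d = 0.3, n = 25 per group) rather than values from the meta-analyses.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Power of a typical two-group study (n = 25 per group) to detect a small effect
power = analysis.solve_power(effect_size=0.3, nobs1=25, alpha=0.05, ratio=1.0)
print(f"power with n = 25 per group and d = 0.3: {power:.2f}")

# Sample size per group needed to reach 80% power for the same effect
n_needed = analysis.solve_power(effect_size=0.3, power=0.80, alpha=0.05, ratio=1.0)
print(f"n per group for 80% power: {n_needed:.0f}")
```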

  13. Human metabolic profiles are stably controlled by genetic and environmental variation

    PubMed Central

    Nicholson, George; Rantalainen, Mattias; Maher, Anthony D; Li, Jia V; Malmodin, Daniel; Ahmadi, Kourosh R; Faber, Johan H; Hallgrímsdóttir, Ingileif B; Barrett, Amy; Toft, Henrik; Krestyaninova, Maria; Viksna, Juris; Neogi, Sudeshna Guha; Dumas, Marc-Emmanuel; Sarkans, Ugis; The MolPAGE Consortium; Silverman, Bernard W; Donnelly, Peter; Nicholson, Jeremy K; Allen, Maxine; Zondervan, Krina T; Lindon, John C; Spector, Tim D; McCarthy, Mark I; Holmes, Elaine; Baunsgaard, Dorrit; Holmes, Chris C

    2011-01-01

    1H Nuclear Magnetic Resonance spectroscopy (1H NMR) is increasingly used to measure metabolite concentrations in sets of biological samples for top-down systems biology and molecular epidemiology. For such purposes, knowledge of the sources of human variation in metabolite concentrations is valuable, but currently sparse. We conducted and analysed a study to create such a resource. In our unique design, identical and non-identical twin pairs donated plasma and urine samples longitudinally. We acquired 1H NMR spectra on the samples, and statistically decomposed variation in metabolite concentration into familial (genetic and common-environmental), individual-environmental, and longitudinally unstable components. We estimate that stable variation, comprising familial and individual-environmental factors, accounts on average for 60% (plasma) and 47% (urine) of biological variation in 1H NMR-detectable metabolite concentrations. Clinically predictive metabolic variation is likely nested within this stable component, so our results have implications for the effective design of biomarker-discovery studies. We provide a power-calculation method which reveals that sample sizes of a few thousand should offer sufficient statistical precision to detect 1H NMR-based biomarkers quantifying predisposition to disease. PMID:21878913

  14. Integrated Surface Power Strategy for Mars

    NASA Technical Reports Server (NTRS)

    Rucker, Michelle

    2015-01-01

    A National Aeronautics and Space Administration (NASA) study team evaluated surface power needs for a conceptual crewed 500-day Mars mission. This study had four goals: 1. Determine estimated surface power needed to support the reference mission; 2. Explore alternatives to minimize landed power system mass; 3. Explore alternatives to minimize Mars Lander power self-sufficiency burden; and 4. Explore alternatives to minimize power system handling and surface transportation mass. The study team concluded that Mars Ascent Vehicle (MAV) oxygen propellant production drives the overall surface power needed for the reference mission. Switching to multiple, small Kilopower fission systems can potentially save four to eight metric tons of landed mass, as compared to a single, large Fission Surface Power (FSP) concept. Breaking the power system up into modular packages creates new operational opportunities, with benefits ranging from reduced lander self-sufficiency for power, to extending the exploration distance from a single landing site. Although a large FSP trades well for operational complexity, a modular approach potentially allows Program Managers more flexibility to absorb late mission changes with less schedule or mass risk, better supports small precursor missions, and allows a program to slowly build up mission capability over time. A number of Kilopower disadvantages-and mitigation strategies-were also explored.

  15. "Using Power Tables to Compute Statistical Power in Multilevel Experimental Designs"

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2009-01-01

    Power computations for one-level experimental designs that assume simple random samples are greatly facilitated by power tables such as those presented in Cohen's book about statistical power analysis. However, in education and the social sciences experimental designs have naturally nested structures and multilevel models are needed to compute the…

  16. "Falsifiability is not optional": Correction to LeBel et al. (2017).

    PubMed

    2017-11-01

    Reports an error in "Falsifiability is not optional" by Etienne P. LeBel, Derek Berger, Lorne Campbell and Timothy J. Loving ( Journal of Personality and Social Psychology , 2017[Aug], Vol 113[2], 254-261). In the reply, there were two errors in the References list. The publishing year for the 14th and 21st articles was cited incorrectly as 2016. The in-text acronym associated with these citations should read instead as FER2017 and LCL2017. The correct References list citations should read as follows, respectively: Finkel, E. J., Eastwick, P. W., & Reis, H. T. (2017). Replicability and other features of a high-quality science: Toward a balanced and empirical approach. Journal of Personality and Social Psychology , 113, 244-253. http://dx.doi.org/10.1037/pspi0000075 LeBel, E. P., Campbell, L., & Loving, T. J. (2017). Benefits of open and high-powered research outweigh costs. Journal of Personality and Social Psychology , 113, 230-243. http://dx.doi.org/10 .1037/pspi0000049. The online version of this article has been corrected. (The following abstract of the original article appeared in record 2017-30567-003.) Finkel, Eastwick, and Reis (2016; FER2016) argued the post-2011 methodological reform movement has focused narrowly on replicability, neglecting other essential goals of research. We agree multiple scientific goals are essential, but argue, however, a more fine-grained language, conceptualization, and approach to replication is needed to accomplish these goals. Replication is the general empirical mechanism for testing and falsifying theory. Sufficiently methodologically similar replications, also known as direct replications, test the basic existence of phenomena and ensure cumulative progress is possible a priori. In contrast, increasingly methodologically dissimilar replications, also known as conceptual replications, test the relevance of auxiliary hypotheses (e.g., manipulation and measurement issues, contextual factors) required to productively investigate validity and generalizability. Without prioritizing replicability, a field is not empirically falsifiable. We also disagree with FER2016's position that "bigger samples are generally better, but . . . that very large samples could have the downside of commandeering resources that would have been better invested in other studies" (abstract). We identify problematic assumptions involved in FER2016's modifications of our original research-economic model, and present an improved model that quantifies when (and whether) it is reasonable to worry that increasing statistical power will engender potential trade-offs. Sufficiently powering studies (i.e., >80%) maximizes both research efficiency and confidence in the literature (research quality). Given that we are in agreement with FER2016 on all key open science points, we are eager to start seeing the accelerated rate of cumulative knowledge development of social psychological phenomena such a sufficiently transparent, powered, and falsifiable approach will generate. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. A new u-statistic with superior design sensitivity in matched observational studies.

    PubMed

    Rosenbaum, Paul R

    2011-09-01

    In an observational or nonrandomized study of treatment effects, a sensitivity analysis indicates the magnitude of bias from unmeasured covariates that would need to be present to alter the conclusions of a naïve analysis that presumes adjustments for observed covariates suffice to remove all bias. The power of sensitivity analysis is the probability that it will reject a false hypothesis about treatment effects allowing for a departure from random assignment of a specified magnitude; in particular, if this specified magnitude is "no departure" then this is the same as the power of a randomization test in a randomized experiment. A new family of u-statistics is proposed that includes Wilcoxon's signed rank statistic but also includes other statistics with substantially higher power when a sensitivity analysis is performed in an observational study. Wilcoxon's statistic has high power to detect small effects in large randomized experiments-that is, it often has good Pitman efficiency-but small effects are invariably sensitive to small unobserved biases. Members of this family of u-statistics that emphasize medium to large effects can have substantially higher power in a sensitivity analysis. For example, in one situation with 250 pair differences that are Normal with expectation 1/2 and variance 1, the power of a sensitivity analysis that uses Wilcoxon's statistic is 0.08 while the power of another member of the family of u-statistics is 0.66. The topic is examined by performing a sensitivity analysis in three observational studies, using an asymptotic measure called the design sensitivity, and by simulating power in finite samples. The three examples are drawn from epidemiology, clinical medicine, and genetic toxicology. © 2010, The International Biometric Society.
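
    A sketch of the randomization-test end of the example quoted above (250 paired differences that are Normal with expectation 1/2 and variance 1); the sensitivity-analysis allowance for unobserved bias, which drives the 0.08 versus 0.66 comparison, is not implemented here.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(7)

def wilcoxon_power(n_pairs=250, mean=0.5, n_sims=1000, alpha=0.05):
    """Monte Carlo power of Wilcoxon's signed-rank test for paired differences
    drawn from Normal(mean, 1); this is the 'no unobserved bias' end of a
    sensitivity analysis."""
    hits = 0
    for _ in range(n_sims):
        diffs = rng.normal(loc=mean, scale=1.0, size=n_pairs)
        _, p = wilcoxon(diffs)
        hits += p < alpha
    return hits / n_sims

print("Estimated power without hidden bias:", wilcoxon_power())
```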

  18. Statistical Analysis Experiment for Freshman Chemistry Lab.

    ERIC Educational Resources Information Center

    Salzsieder, John C.

    1995-01-01

    Describes a laboratory experiment dissolving zinc from galvanized nails in which data can be gathered very quickly for statistical analysis. The data have sufficient significant figures and the experiment yields a nice distribution of random errors. Freshman students can gain an appreciation of the relationships between random error, number of…

  19. Statistical correlations in an ideal gas of particles obeying fractional exclusion statistics.

    PubMed

    Pellegrino, F M D; Angilella, G G N; March, N H; Pucci, R

    2007-12-01

    After a brief discussion of the concepts of fractional exchange and fractional exclusion statistics, we report partly analytical and partly numerical results on thermodynamic properties of assemblies of particles obeying fractional exclusion statistics. The effect of dimensionality is one focal point, the ratio μ/(k_B T) of chemical potential to thermal energy being obtained numerically as a function of a scaled particle density. Pair correlation functions are also presented as a function of the statistical parameter, with Friedel oscillations developing close to the fermion limit, for sufficiently large density.

  20. The Power of Instructions: Proactive Configuration of Stimulus-Response Translation

    ERIC Educational Resources Information Center

    Meiran, Nachshon; Pereg, Maayan; Kessler, Yoav; Cole, Michael W.; Braver, Todd S.

    2015-01-01

    Humans are characterized by an especially highly developed ability to use instructions to prepare toward upcoming events; yet, it is unclear just how powerful instructions can be. Although prior work provides evidence that instructions can be sufficiently powerful to proactively program working memory to execute stimulus-response (S-R)…

  1. 18 CFR 12.43 - Power and communication lines and gas pipelines.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... reasonable specifications that may be provided by the Regional Engineer, to ensure that any power or... waters must be at least sufficient to conform to any applicable requirements of the National Electrical... Engineer may require a licensee or applicant to provide signs at or near power or communication lines to...

  2. 18 CFR 12.43 - Power and communication lines and gas pipelines.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... reasonable specifications that may be provided by the Regional Engineer, to ensure that any power or... waters must be at least sufficient to conform to any applicable requirements of the National Electrical... Engineer may require a licensee or applicant to provide signs at or near power or communication lines to...

  3. 18 CFR 12.43 - Power and communication lines and gas pipelines.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... reasonable specifications that may be provided by the Regional Engineer, to ensure that any power or... waters must be at least sufficient to conform to any applicable requirements of the National Electrical... Engineer may require a licensee or applicant to provide signs at or near power or communication lines to...

  4. 18 CFR 12.43 - Power and communication lines and gas pipelines.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... reasonable specifications that may be provided by the Regional Engineer, to ensure that any power or... waters must be at least sufficient to conform to any applicable requirements of the National Electrical... Engineer may require a licensee or applicant to provide signs at or near power or communication lines to...

  5. 18 CFR 12.43 - Power and communication lines and gas pipelines.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... reasonable specifications that may be provided by the Regional Engineer, to ensure that any power or... waters must be at least sufficient to conform to any applicable requirements of the National Electrical... Engineer may require a licensee or applicant to provide signs at or near power or communication lines to...

  6. 48 CFR 28.101-3 - Authority of an attorney-in-fact for a bid bond.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... responsiveness; and (2) Treat questions regarding the authenticity and enforceability of the power of attorney at..., or a photocopy or facsimile of an original, power of attorney is sufficient evidence of such... and dates on the power of attorney shall be considered original signatures, seals and dates, without...

  7. 7 CFR 1717.306 - RUS required rates.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...-emption in Rate Making in Connection With Power Supply Borrowers § 1717.306 RUS required rates. (a) Upon... of RUS that are sufficient to satisfy the requirements of the RUS wholesale power contract and other... with terms of the RUS wholesale power contract and other RUS documents in a timely fashion, RUS may...

  8. 7 CFR 1717.306 - RUS required rates.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...-emption in Rate Making in Connection With Power Supply Borrowers § 1717.306 RUS required rates. (a) Upon... of RUS that are sufficient to satisfy the requirements of the RUS wholesale power contract and other... with terms of the RUS wholesale power contract and other RUS documents in a timely fashion, RUS may...

  9. Telecommunications equipment power supply in the Arctic by means of solar panels

    NASA Astrophysics Data System (ADS)

    Terekhin, Vladimir; Lagunov, Alexey

    2016-09-01

    Development of the Arctic region is one of the priorities in the Russian Federation. Amongst other things, a reliable telecommunications infrastructure in the Arctic is required. Petrol and diesel generators are traditionally employed, but their use has considerable environmental impact. Solar panels can be used as an alternative power source. The electricity generated is sufficient to supply small telecommunications equipment with a total power of over 80 watts. An installation consisting of solar modules, a charge controller, batteries, an inverter and a load was designed. Tests were conducted at Cape Desire on Novaya Zemlya. The solar panels provided in excess of 80 W from 7 a.m. to 11 p.m. The battery charge accumulated during this time was sufficient to supply the communication equipment during the night, from 11 p.m. to 7 a.m. The maximum generated power of 638 W was observed at 3 p.m.; the minimum of 46 W was at 4 a.m. The solar modules can thus be used during the polar day to power the telecommunications equipment.
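
    The battery sizing implied by the abstract can be checked with simple arithmetic: an 80 W load carried through the eight-hour polar night. The depth-of-discharge and inverter-efficiency figures below are illustrative assumptions, not values reported in the study.

```python
def required_battery_capacity_wh(load_w=80, night_hours=8,
                                 depth_of_discharge=0.5, inverter_efficiency=0.9):
    """Battery capacity (Wh) needed to carry the telecom load through the night.
    Depth of discharge and inverter efficiency are illustrative assumptions."""
    energy_needed = load_w * night_hours / inverter_efficiency
    return energy_needed / depth_of_discharge

capacity = required_battery_capacity_wh()
print(f"required battery capacity: {capacity:.0f} Wh "
      f"(about {capacity / 12:.0f} Ah at 12 V)")
```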

  10. Identifying Electricity Capacity at Risk to Changes in Climate and Water Resources in the United States

    NASA Astrophysics Data System (ADS)

    Miara, A.; Macknick, J.; Vorosmarty, C. J.; Corsi, F.; Fekete, B. M.; Newmark, R. L.; Tidwell, V. C.; Cohen, S. M.

    2016-12-01

    Thermoelectric plants supply 85% of electricity generation in the United States. Under a warming climate, the performance of these power plants may be reduced, as thermoelectric generation is dependent upon cool ambient temperatures and sufficient water supplies at adequate temperatures. In this study, we assess the vulnerability and reliability of 1,100 operational power plants (2015) across the contiguous United States under a comprehensive set of climate scenarios (five Global Circulation Models each with four Representative Concentration Pathways). We model individual power plant capacities using the Thermoelectric Power and Thermal Pollution model (TP2M) coupled with the Water Balance Model (WBM) at a daily temporal resolution and 5x5 km spatial resolution. Together, these models calculate power plant capacity losses that account for geophysical constraints and river network dynamics. Potential losses at the single-plant level are put into a regional energy security context by assessing the collective system-level reliability at the North-American Electricity Reliability Corporation (NERC) regions. Results show that the thermoelectric sector at the national level has low vulnerability under the contemporary climate and that system-level reliability in terms of available thermoelectric resources relative to thermoelectric demand is sufficient. Under future climates scenarios, changes in water availability and warm ambient temperatures lead to constraints on operational capacity and increased vulnerability at individual power plant sites across all regions in the United States. However, there is a strong disparity in regional vulnerability trends and magnitudes that arise from each region's climate, hydrology and technology mix. Despite increases in vulnerabilities at the individual power plant level, regional energy systems may still be reliable (with no system failures) due to sufficient back-up reserve capacities.

  11. Efficient Scores, Variance Decompositions and Monte Carlo Swindles.

    DTIC Science & Technology

    1984-08-28

    ... Then a version of Pythagoras' theorem gives the variance decomposition (6.1) var_P0(T) = var_P0(S) + var_P0(T - S). One way to see this is to note...complete sufficient statistics for (β, σ), and that the standardized residuals (y - Xβ)/σ are ancillary. Basu's sufficiency-ancillarity theorem
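
    A classic illustration of how this decomposition is used as a Monte Carlo swindle, in a minimal sketch: to estimate the variance of the sample median of normal data, simulate only the small quantity T - S (median minus mean) and add the exactly known variance of the sufficient statistic S; by Basu's theorem the two pieces are independent, so the decomposition holds.

```python
import numpy as np

rng = np.random.default_rng(3)
n, n_sims = 25, 20_000

samples = rng.standard_normal((n_sims, n))
medians = np.median(samples, axis=1)   # T: the statistic whose variance we want
means = samples.mean(axis=1)           # S: sufficient statistic with known variance 1/n

naive = medians.var(ddof=1)                            # plain Monte Carlo estimate
swindle = 1.0 / n + (medians - means).var(ddof=1)      # var(T) = var(S) + var(T - S)

print(f"naive estimate of var(median):   {naive:.5f}")
print(f"swindle estimate of var(median): {swindle:.5f}")
```

    Because most of var(T) comes from the exactly known var(S), the swindle estimate carries far less Monte Carlo noise than the naive one for the same number of replications.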

  12. Automated medication reconciliation and complexity of care transitions.

    PubMed

    Silva, Pamela A Bozzo; Bernstam, Elmer V; Markowitz, Eliz; Johnson, Todd R; Zhang, Jiajie; Herskovic, Jorge R

    2011-01-01

    Medication reconciliation is a National Patient Safety Goal (NPSG) from The Joint Commission (TJC) that entails reviewing all medications a patient takes after a health care transition. Medication reconciliation is a resource-intensive, error-prone task, and the resources to accomplish it may not be routinely available. Computer-based methods have the potential to overcome these barriers. We designed and explored a rule-based medication reconciliation algorithm to accomplish this task across different healthcare transitions. We tested our algorithm on a random sample of 94 transitions from the Clinical Data Warehouse at the University of Texas Health Science Center at Houston. We found that the algorithm reconciled, on average, 23.4% of the potentially reconcilable medications. Our study did not have sufficient statistical power to establish whether the kind of transition affects reconcilability. We conclude that automated reconciliation is possible and will help accomplish the NPSG.

  13. Nanoscale temperature mapping in operating microelectronic devices

    DOE PAGES

    Mecklenburg, Matthew; Hubbard, William A.; White, E. R.; ...

    2015-02-05

    We report that modern microelectronic devices have nanoscale features that dissipate power nonuniformly, but fundamental physical limits frustrate efforts to detect the resulting temperature gradients. Contact thermometers disturb the temperature of a small system, while radiation thermometers struggle to beat the diffraction limit. Exploiting the same physics as Fahrenheit's glass-bulb thermometer, we mapped the thermal expansion of Joule-heated, 80-nanometer-thick aluminum wires by precisely measuring changes in density. With a scanning transmission electron microscope (STEM) and electron energy loss spectroscopy (EELS), we quantified the local density via the energy of aluminum's bulk plasmon. Rescaling density to temperature yields maps with a statistical precision of 3 kelvin hertz^(-1/2), an accuracy of 10%, and nanometer-scale resolution. Lastly, many common metals and semiconductors have sufficiently sharp plasmon resonances to serve as their own thermometers.

  14. The preparation effect in task switching: carryover of SOA.

    PubMed

    Altmann, Erik M

    2004-01-01

    A common finding in task-switching studies is switch preparation (commonly known as the preparation effect), in which a longer interval between task cue and trial stimulus (i.e., a longer stimulus onset asynchrony, or SOA) reduces the cost of switching to a different task. Three experiments link switch preparation to within-subjects manipulations of SOA. In Experiment 1, SOA was randomized within subjects, producing switch preparation that was more pronounced when the SOA switched from the previous trial than when the SOA repeated. In Experiment 2, SOA was blocked within subjects, producing switch preparation but not on the first block of trials. In Experiment 3, SOA was manipulated between subjects with sufficient statistical power to detect switch preparation, but the effect was absent. The results favor an encoding view of cognitive control, but show that any putative switching mechanism reacts lazily when exposed to only one SOA.

  15. Toward a functional definition of a "rare disease" for regulatory authorities and funding agencies.

    PubMed

    Clarke, Joe T R; Coyle, Doug; Evans, Gerald; Martin, Janet; Winquist, Eric

    2014-12-01

    The designation of a disease as "rare" is associated with some substantial benefits for companies involved in new drug development, including expedited review by regulatory authorities and relaxed criteria for reimbursement. How "rare disease" is defined therefore has major financial implications, both for pharmaceutical companies and for insurers or public drug reimbursement programs. All existing definitions are based, somewhat arbitrarily, on disease incidence or prevalence. What is proposed here is a functional definition of rare based on an assessment of the feasibility of measuring the efficacy of a new treatment in conventional randomized controlled trials, to inform regulatory authorities and funding agencies charged with assessing new therapies being considered for public funding. It involves a five-step process, involving significant negotiations between patient advocacy groups, pharmaceutical companies, physicians, and public drug reimbursement programs, designed to establish the feasibility of carrying out a randomized controlled trial with sufficient statistical power to show a clinically significant treatment effect. The steps are as follows: 1) identification of a specific disease, including appropriate genetic definition; 2) identification of clinically relevant outcomes to evaluate efficacy; 3) establishment of the inherent variability of measurements of clinically relevant outcomes; 4) calculation of the sample size required to assess the efficacy of a new treatment with acceptable statistical power; and 5) estimation of the difficulty of recruiting an adequate sample size given the estimated prevalence or incidence of the disorder in the population and the inclusion criteria to be used. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
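
    A sketch of step 4 using the standard two-sample formula for a continuous outcome; the outcome variability and minimal clinically important difference below are hypothetical placeholders for what steps 2 and 3 would supply.

```python
import math
from scipy.stats import norm

def n_per_group(sd, delta, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sided comparison of means; step 4 asks
    whether this many patients can realistically be recruited in a rare disease."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return math.ceil(2 * (z * sd / delta) ** 2)

# Hypothetical outcome SD (step 3) and clinically important difference (step 2)
print(n_per_group(sd=12.0, delta=6.0), "participants needed per arm")
```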

  16. Extending the scope of pooled analyses of individual patient biomarker data from heterogeneous laboratory platforms and cohorts using merging algorithms.

    PubMed

    Burke, Órlaith; Benton, Samantha; Szafranski, Pawel; von Dadelszen, Peter; Buhimschi, S Catalin; Cetin, Irene; Chappell, Lucy; Figueras, Francesc; Galindo, Alberto; Herraiz, Ignacio; Holzman, Claudia; Hubel, Carl; Knudsen, Ulla; Kronborg, Camilla; Laivuori, Hannele; Lapaire, Olav; McElrath, Thomas; Moertl, Manfred; Myers, Jenny; Ness, Roberta B; Oliveira, Leandro; Olson, Gayle; Poston, Lucilla; Ris-Stalpers, Carrie; Roberts, James M; Schalekamp-Timmermans, Sarah; Schlembach, Dietmar; Steegers, Eric; Stepan, Holger; Tsatsaris, Vassilis; van der Post, Joris A; Verlohren, Stefan; Villa, Pia M; Williams, David; Zeisler, Harald; Redman, Christopher W G; Staff, Anne Cathrine

    2016-01-01

    A common challenge in medicine, exemplified in the analysis of biomarker data, is that large studies are needed for sufficient statistical power. Often, this may only be achievable by aggregating multiple cohorts. However, different studies may use disparate platforms for laboratory analysis, which can hinder merging. Using circulating placental growth factor (PlGF), a potential biomarker for hypertensive disorders of pregnancy (HDP) such as preeclampsia, as an example, we investigated how such issues can be overcome by inter-platform standardization and merging algorithms. We studied 16,462 pregnancies from 22 study cohorts. PlGF measurements (gestational age ⩾20 weeks) analyzed on one of four platforms: R&D Systems, AlereTriage, RocheElecsys or AbbottArchitect, were available for 13,429 women. Two merging algorithms, using Z-Score and Multiple of Median transformations, were applied. Best reference curves (BRC), based on merged, transformed PlGF measurements in uncomplicated pregnancy across six gestational age groups, were estimated. Identification of HDP by these PlGF-BRCs was compared to that of platform-specific curves. We demonstrate the feasibility of merging PlGF concentrations from different analytical platforms. Overall BRC identification of HDP performed at least as well as platform-specific curves. Our method can be extended to any set of biomarkers obtained from different laboratory platforms in any field. Merged biomarker data from multiple studies will improve statistical power and enlarge our understanding of the pathophysiology and management of medical syndromes. Copyright © 2015 International Society for the Study of Hypertension in Pregnancy. Published by Elsevier B.V. All rights reserved.
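
    A sketch of the multiple-of-median transformation used as one of the merging algorithms, assuming a pandas DataFrame with hypothetical column names; the Z-score variant and the reference-curve estimation are not shown.

```python
import pandas as pd

def to_mom(df, value_col="plgf", platform_col="platform",
           ga_col="ga_group", reference_col="uncomplicated"):
    """Express each measurement as a multiple of the median (MoM) of
    uncomplicated pregnancies measured on the same platform and in the same
    gestational-age group, so values from different platforms can be pooled.
    Column names are hypothetical."""
    ref = (df[df[reference_col]]
           .groupby([platform_col, ga_col])[value_col]
           .median()
           .reset_index()
           .rename(columns={value_col: "ref_median"}))
    merged = df.merge(ref, on=[platform_col, ga_col], how="left")
    return (merged[value_col] / merged["ref_median"]).to_numpy()

# Example usage with toy data
toy = pd.DataFrame({
    "plgf": [120.0, 300.0, 80.0, 250.0],
    "platform": ["Roche", "Roche", "Abbott", "Abbott"],
    "ga_group": ["28-32", "28-32", "28-32", "28-32"],
    "uncomplicated": [True, True, True, True],
})
toy["plgf_mom"] = to_mom(toy)
print(toy)
```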

  17. Examining the effects of birth order on personality

    PubMed Central

    Rohrer, Julia M.; Egloff, Boris; Schmukle, Stefan C.

    2015-01-01

    This study examined the long-standing question of whether a person’s position among siblings has a lasting impact on that person’s life course. Empirical research on the relation between birth order and intelligence has convincingly documented that performances on psychometric intelligence tests decline slightly from firstborns to later-borns. By contrast, the search for birth-order effects on personality has not yet resulted in conclusive findings. We used data from three large national panels from the United States (n = 5,240), Great Britain (n = 4,489), and Germany (n = 10,457) to resolve this open research question. This database allowed us to identify even very small effects of birth order on personality with sufficiently high statistical power and to investigate whether effects emerge across different samples. We furthermore used two different analytical strategies by comparing siblings with different birth-order positions (i) within the same family (within-family design) and (ii) between different families (between-family design). In our analyses, we confirmed the expected birth-order effect on intelligence. We also observed a significant decline of a 10th of a SD in self-reported intellect with increasing birth-order position, and this effect persisted after controlling for objectively measured intelligence. Most important, however, we consistently found no birth-order effects on extraversion, emotional stability, agreeableness, conscientiousness, or imagination. On the basis of the high statistical power and the consistent results across samples and analytical designs, we must conclude that birth order does not have a lasting effect on broad personality traits outside of the intellectual domain. PMID:26483461

  18. Examining the effects of birth order on personality.

    PubMed

    Rohrer, Julia M; Egloff, Boris; Schmukle, Stefan C

    2015-11-17

    This study examined the long-standing question of whether a person's position among siblings has a lasting impact on that person's life course. Empirical research on the relation between birth order and intelligence has convincingly documented that performances on psychometric intelligence tests decline slightly from firstborns to later-borns. By contrast, the search for birth-order effects on personality has not yet resulted in conclusive findings. We used data from three large national panels from the United States (n = 5,240), Great Britain (n = 4,489), and Germany (n = 10,457) to resolve this open research question. This database allowed us to identify even very small effects of birth order on personality with sufficiently high statistical power and to investigate whether effects emerge across different samples. We furthermore used two different analytical strategies by comparing siblings with different birth-order positions (i) within the same family (within-family design) and (ii) between different families (between-family design). In our analyses, we confirmed the expected birth-order effect on intelligence. We also observed a significant decline of a 10th of a SD in self-reported intellect with increasing birth-order position, and this effect persisted after controlling for objectively measured intelligence. Most important, however, we consistently found no birth-order effects on extraversion, emotional stability, agreeableness, conscientiousness, or imagination. On the basis of the high statistical power and the consistent results across samples and analytical designs, we must conclude that birth order does not have a lasting effect on broad personality traits outside of the intellectual domain.

  19. High-Density Signal Interface Electromagnetic Radiation Prediction for Electromagnetic Compatibility Evaluation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halligan, Matthew

    Radiated power calculation approaches for practical scenarios of incomplete high-density interface characterization information and incomplete incident power information are presented. The suggested approaches build upon a method that characterizes power losses through the definition of power loss constant matrices. Potential radiated power estimates include using total power loss information, partial radiated power loss information, worst case analysis, and statistical bounding analysis. A method is also proposed to calculate radiated power when incident power information is not fully known for non-periodic signals at the interface. Incident data signals are modeled from a two-state Markov chain where bit state probabilities are derived. The total spectrum for windowed signals is postulated as the superposition of spectra from individual pulses in a data sequence. Statistical bounding methods are proposed as a basis for the radiated power calculation due to the statistical calculation complexity to find a radiated power probability density function.
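
    As a rough illustration of the statistical bounding idea for non-periodic signals, the sketch below draws bit sequences from a two-state Markov chain, computes the power spectrum of each windowed realization, and takes a per-frequency upper percentile as a bound. The transition probability, sequence length and percentile are assumptions, not values from the report.

        import numpy as np

        rng = np.random.default_rng(0)
        P_STAY, N_BITS, N_RUNS = 0.7, 256, 2000   # assumed chain and simulation parameters

        def markov_bits(n):
            bits = np.empty(n)
            bits[0] = rng.integers(0, 2)
            for i in range(1, n):
                bits[i] = bits[i - 1] if rng.random() < P_STAY else 1.0 - bits[i - 1]
            return 2.0 * bits - 1.0               # map {0, 1} to {-1, +1} signalling levels

        window = np.hanning(N_BITS)
        spectra = np.empty((N_RUNS, N_BITS // 2 + 1))
        for k in range(N_RUNS):
            x = markov_bits(N_BITS) * window
            spectra[k] = np.abs(np.fft.rfft(x)) ** 2 / N_BITS   # periodogram of one realization

        bound_95 = np.percentile(spectra, 95, axis=0)   # statistical upper bound per frequency bin
        print(bound_95.max())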

  20. Laser beamed power: Satellite demonstration applications

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.; Westerlund, Larry H.

    1992-01-01

    It is possible to use a ground-based laser to beam light to the solar arrays of orbiting satellites, to a level sufficient to provide all or some of the operating power required. Near-term applications of this technology for providing supplemental power to existing satellites are discussed. Two missions with significant commercial pay-off are supplementing solar power for radiation-degraded arrays and providing satellite power during eclipse for satellites with failed batteries.

  1. Voids and constraints on nonlinear clustering of galaxies

    NASA Technical Reports Server (NTRS)

    Vogeley, Michael S.; Geller, Margaret J.; Park, Changbom; Huchra, John P.

    1994-01-01

    Void statistics of the galaxy distribution in the Center for Astrophysics Redshift Survey provide strong constraints on galaxy clustering in the nonlinear regime, i.e., on scales R equal to or less than 10/h Mpc. Computation of high-order moments of the galaxy distribution requires a sample that (1) densely traces the large-scale structure and (2) covers sufficient volume to obtain good statistics. The CfA redshift survey densely samples structure on scales equal to or less than 10/h Mpc and has sufficient depth and angular coverage to approach a fair sample on these scales. In the nonlinear regime, the void probability function (VPF) for CfA samples exhibits apparent agreement with hierarchical scaling (such scaling implies that the N-point correlation functions for N greater than 2 depend only on pairwise products of the two-point function xi(r)). However, simulations of cosmological models show that this scaling in redshift space does not necessarily imply such scaling in real space, even in the nonlinear regime; peculiar velocities cause distortions which can yield erroneous agreement with hierarchical scaling. The underdensity probability measures the frequency of 'voids' with density rho less than 0.2 rho-bar, i.e., less than 20% of the mean density. This statistic reveals a paucity of very bright galaxies (L greater than L asterisk) in the 'voids.' Underdensities are equal to or greater than 2 sigma more frequent in bright galaxy samples than in samples that include fainter galaxies. Comparison of void statistics of CfA samples with simulations of a range of cosmological models favors models with Gaussian primordial fluctuations and Cold Dark Matter (CDM)-like initial power spectra. Biased models tend to produce voids that are too empty. We also compare these data with three specific models of the Cold Dark Matter cosmogony: an unbiased, open universe CDM model (omega = 0.4, h = 0.5) provides a good match to the VPF of the CfA samples. Biasing of the galaxy distribution in the 'standard' CDM model (omega = 1, b = 1.5; see below for definitions) and nonzero cosmological constant CDM model (omega = 0.4, h = 0.6, lambda(sub 0) = 0.6, b = 1.3) produce voids that are too empty. All three simulations match the observed VPF and underdensity probability for samples of very bright (M less than M asterisk = -19.2) galaxies, but produce voids that are too empty when compared with samples that include fainter galaxies.

  2. Relative risk estimates from spatial and space-time scan statistics: Are they biased?

    PubMed Central

    Prates, Marcos O.; Kulldorff, Martin; Assunção, Renato M.

    2014-01-01

    The purely spatial and space-time scan statistics have been successfully used by many scientists to detect and evaluate geographical disease clusters. Although the scan statistic has high power in correctly identifying a cluster, no study has considered the estimates of the cluster relative risk in the detected cluster. In this paper we evaluate whether there is any bias in these estimated relative risks. Intuitively, one may expect that the estimated relative risks have an upward bias, since the scan statistic cherry picks high rate areas to include in the cluster. We show that this intuition is correct for clusters with low statistical power, but with medium to high power the bias becomes negligible. The same behaviour is not observed for the prospective space-time scan statistic, where there is an increasing conservative downward bias of the relative risk as the power to detect the cluster increases. PMID:24639031

  3. Energy-efficient lighting system for television

    DOEpatents

    Cawthorne, Duane C.

    1987-07-21

    A light control system for a television camera comprises an artificial light control system which is cooperative with an iris control system. This artificial light control system adjusts the power to the lamps illuminating the camera viewing area so that only the artificial illumination necessary to produce an adequate video signal is provided when the camera iris is substantially open.

  4. RERTR-7 Irradiation Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. M. Perez; M. A. Lillo; G. S. Chang

    2011-12-01

    The Reduced Enrichment for Research and Test Reactor (RERTR) experiment RERTR-7A was designed to test several modified fuel designs to target fission densities representative of a peak low enriched uranium (LEU) burnup in excess of 90% U-235 at peak experiment power sufficient to generate a peak surface heat flux of approximately 300 W/cm2. The RERTR-7B experiment was designed as a high power test of 'second generation' dispersion fuels at peak experiment power sufficient to generate a surface heat flux on the order of 230 W/cm2. The following report summarizes the life of the RERTR-7A and RERTR-7B experiments through end of irradiation, including as-run neutronic analyses, thermal analyses and hydraulic testing results.

  5. The statistical challenge of constraining the low-mass IMF in Local Group dwarf galaxies

    NASA Astrophysics Data System (ADS)

    El-Badry, Kareem; Weisz, Daniel R.; Quataert, Eliot

    2017-06-01

    We use Monte Carlo simulations to explore the statistical challenges of constraining the characteristic mass (mc) and width (σ) of a lognormal sub-solar initial mass function (IMF) in Local Group dwarf galaxies using direct star counts. For a typical Milky Way (MW) satellite (MV = -8), jointly constraining mc and σ to a precision of ≲ 20 per cent requires that observations be complete to ≲ 0.2 M⊙, if the IMF is similar to the MW IMF. A similar statistical precision can be obtained if observations are only complete down to 0.4 M⊙, but this requires measurement of nearly 100× more stars, and thus, a significantly more massive satellite (MV ˜ -12). In the absence of sufficiently deep data to constrain the low-mass turnover, it is common practice to fit a single-sloped power law to the low-mass IMF, or to fit mc for a lognormal while holding σ fixed. We show that the former approximation leads to best-fitting power-law slopes that vary with the mass range observed and can largely explain existing claims of low-mass IMF variations in MW satellites, even if satellite galaxies have the same IMF as the MW. In addition, fixing σ during fitting leads to substantially underestimated uncertainties in the recovered value of mc (by a factor of ˜4 for typical observations). If the IMFs of nearby dwarf galaxies are lognormal and do vary, observations must reach down to ˜mc in order to robustly detect these variations. The high-sensitivity, near-infrared capabilities of the James Webb Space Telescope and Wide-Field Infrared Survey Telescope have the potential to dramatically improve constraints on the low-mass IMF. We present an efficient observational strategy for using these facilities to measure the IMFs of Local Group dwarf galaxies.
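
    The kind of fit discussed above can be sketched as a truncated-lognormal maximum-likelihood estimate of (mc, sigma), with the truncation standing in for the photometric completeness limit. The sample below is synthetic and all parameter values are assumptions for illustration.

        import numpy as np
        from scipy import optimize, stats

        rng = np.random.default_rng(1)
        TRUE_MC, TRUE_SIGMA, M_COMPLETE = 0.22, 0.57, 0.2   # solar masses; illustrative values

        # Draw a synthetic stellar sample and keep only masses above the completeness limit.
        masses = rng.lognormal(mean=np.log(TRUE_MC), sigma=TRUE_SIGMA, size=20000)
        masses = masses[masses > M_COMPLETE]

        def neg_log_like(params):
            log_mc, log_sigma = params
            dist = stats.lognorm(s=np.exp(log_sigma), scale=np.exp(log_mc))
            tail = dist.sf(M_COMPLETE)   # normalization of the lognormal truncated at the limit
            return -(np.sum(dist.logpdf(masses)) - masses.size * np.log(tail))

        res = optimize.minimize(neg_log_like, x0=[np.log(0.3), np.log(0.5)], method="Nelder-Mead")
        mc_hat, sigma_hat = np.exp(res.x)
        print(mc_hat, sigma_hat)   # recovered characteristic mass and width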

  6. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    PubMed

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present a real challenge in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases, acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
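
    A minimal sketch of the additive (average of P values) combination on which the bi-level framework builds: under the null hypothesis, independent P values are uniform on [0, 1], so by the Central Limit Theorem their mean is approximately normal with mean 1/2 and variance 1/(12n). This illustrates the principle only and is not the authors' implementation.

        import numpy as np
        from scipy import stats

        def additive_combined_p(p_values):
            # Combine independent P values by their mean, using the CLT under the null.
            p = np.asarray(p_values, dtype=float)
            n = p.size
            z = (p.mean() - 0.5) / np.sqrt(1.0 / (12.0 * n))
            return stats.norm.cdf(z)   # small mean P values give a small combined P

        print(additive_combined_p([0.02, 0.10, 0.04, 0.07]))   # illustrative values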

  7. ViPAR: a software platform for the Virtual Pooling and Analysis of Research Data.

    PubMed

    Carter, Kim W; Francis, Richard W; Carter, K W; Francis, R W; Bresnahan, M; Gissler, M; Grønborg, T K; Gross, R; Gunnes, N; Hammond, G; Hornig, M; Hultman, C M; Huttunen, J; Langridge, A; Leonard, H; Newman, S; Parner, E T; Petersson, G; Reichenberg, A; Sandin, S; Schendel, D E; Schalkwyk, L; Sourander, A; Steadman, C; Stoltenberg, C; Suominen, A; Surén, P; Susser, E; Sylvester Vethanayagam, A; Yusof, Z

    2016-04-01

    Research studies exploring the determinants of disease require sufficient statistical power to detect meaningful effects. Sample size is often increased through centralized pooling of disparately located datasets, though ethical, privacy and data ownership issues can often hamper this process. Methods that facilitate the sharing of research data that are sympathetic with these issues and which allow flexible and detailed statistical analyses are therefore in critical need. We have created a software platform for the Virtual Pooling and Analysis of Research data (ViPAR), which employs free and open source methods to provide researchers with a web-based platform to analyse datasets housed in disparate locations. Database federation permits controlled access to remotely located datasets from a central location. The Secure Shell protocol allows data to be securely exchanged between devices over an insecure network. ViPAR combines these free technologies into a solution that facilitates 'virtual pooling' where data can be temporarily pooled into computer memory and made available for analysis without the need for permanent central storage. Within the ViPAR infrastructure, remote sites manage their own harmonized research dataset in a database hosted at their site, while a central server hosts the data federation component and a secure analysis portal. When an analysis is initiated, requested data are retrieved from each remote site and virtually pooled at the central site. The data are then analysed by statistical software and, on completion, results of the analysis are returned to the user and the virtually pooled data are removed from memory. ViPAR is a secure, flexible and powerful analysis platform built on open source technology that is currently in use by large international consortia, and is made publicly available at [http://bioinformatics.childhealthresearch.org.au/software/vipar/]. © The Author 2015. Published by Oxford University Press on behalf of the International Epidemiological Association.

  8. Power transfer for rotating medical machine.

    PubMed

    Sofia, A; Tavilla, A C; Gardenghi, R; Nicolis, D; Stefanini, I

    2016-08-01

    Biological tissues often need to be treated inside a biomedical centrifuge, even during the centrifugation step, without interrupting the process. In this paper, an energy transfer method capable of providing sufficient electric power to the rotating, active part is presented.

  9. Role of sufficient phosphorus in biodiesel production from diatom Phaeodactylum tricornutum.

    PubMed

    Yu, Shi-Jin; Shen, Xiao-Fei; Ge, Huo-Qing; Zheng, Hang; Chu, Fei-Fei; Hu, Hao; Zeng, Raymond J

    2016-08-01

    In order to study the role of sufficient phosphorus (P) in biodiesel production by microalgae, Phaeodactylum tricornutum were cultivated in six different media treatments with combinations of nitrogen (N) sufficiency/deprivation and phosphorus sufficiency/limitation/deprivation. Profiles of N and P, biomass, and fatty acids (FAs) content and compositions were measured during a 7-day cultivation period. The results showed that the FA content in microalgae biomass was promoted by P deprivation. However, statistical analysis showed that FA productivity had no significant difference (p = 0.63, >0.05) under the treatments of N deprivation with P sufficiency (N-P) and N deprivation with P deprivation (N-P-), indicating P sufficiency in N deprivation medium has little effect on increasing biodiesel productivity from P. tricornutum. It was also found that the P absorption in N-P medium was 1.41 times higher than that in N sufficiency and P sufficiency (NP) medium. N deprivation with P limitation (N-P-l) was the optimal treatment for producing biodiesel from P. tricornutum because of both the highest FA productivity and good biodiesel quality.

  10. Statistical power and optimal design in experiments in which samples of participants respond to samples of stimuli.

    PubMed

    Westfall, Jacob; Kenny, David A; Judd, Charles M

    2014-10-01

    Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
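
    The plateau described above can be illustrated with a small Monte Carlo sketch: data are simulated from a crossed participants-by-stimuli model with the condition varying between stimuli, and each simulated experiment is analysed with a simple two-sample t-test on stimulus means. The variance components are assumptions, and the analysis is a deliberate simplification of the mixed-model treatment in the article.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        D, SD_STIM, SD_SUBJ, SD_ERR = 0.4, 0.8, 0.8, 3.0   # assumed effect size and variance components
        N_STIM_PER_COND, N_SIMS = 10, 1000

        def estimated_power(n_participants):
            hits = 0
            for _ in range(N_SIMS):
                stim_fx = rng.normal(0, SD_STIM, 2 * N_STIM_PER_COND)   # stimulus random effects
                subj_fx = rng.normal(0, SD_SUBJ, n_participants)        # participant random effects
                cond = np.repeat([0.0, D], N_STIM_PER_COND)             # condition varies between stimuli
                noise = rng.normal(0, SD_ERR, (n_participants, 2 * N_STIM_PER_COND))
                y = (cond + stim_fx)[None, :] + subj_fx[:, None] + noise
                stim_means = y.mean(axis=0)                             # average each stimulus over participants
                _, p = stats.ttest_ind(stim_means[N_STIM_PER_COND:], stim_means[:N_STIM_PER_COND])
                hits += p < 0.05
            return hits / N_SIMS

        for n in (10, 50, 500):
            print(n, estimated_power(n))   # power rises with participants but levels off, limited by stimuli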

  11. Monte Carlo based statistical power analysis for mediation models: methods and software.

    PubMed

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
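
    The approach can be illustrated with a generic simulation: generate data from a simple one-mediator model with skewed errors, form a percentile bootstrap confidence interval for the indirect effect a*b in each replication, and count how often the interval excludes zero. This is a sketch of the idea, not the bmem package itself; the sample size, path coefficients and error distribution are assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        N, A, B, N_SIMS, N_BOOT = 100, 0.3, 0.3, 200, 500   # assumed sample size, paths, replication counts

        def indirect_effect(x, m, y):
            a = np.polyfit(x, m, 1)[0]                            # path a: M regressed on X
            design = np.column_stack([np.ones_like(x), x, m])
            b = np.linalg.lstsq(design, y, rcond=None)[0][2]      # path b: Y regressed on X and M
            return a * b

        hits = 0
        for _ in range(N_SIMS):
            x = rng.normal(size=N)
            m = A * x + rng.exponential(1.0, N) - 1.0             # skewed (non-normal) errors
            y = B * m + rng.exponential(1.0, N) - 1.0
            boots = np.empty(N_BOOT)
            for k in range(N_BOOT):
                idx = rng.integers(0, N, N)                       # resample cases with replacement
                boots[k] = indirect_effect(x[idx], m[idx], y[idx])
            lo, hi = np.percentile(boots, [2.5, 97.5])
            hits += (lo > 0) or (hi < 0)                          # interval excludes zero: effect detected

        print("estimated power:", hits / N_SIMS)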

  12. STATE EXECUTIVE AUTHORITY TO PROMOTE CIVIL RIGHTS, AN ACTION PROGRAM FOR THE 1960'S.

    ERIC Educational Resources Information Center

    SILARD, JOHN

    THE QUESTIONS OF THE GOVERNOR'S POWER REGARDING CIVIL RIGHTS ISSUES WAS DISCUSSED. THROUGH THE "GOVERNOR'S CODE OF FAIR PRACTICES," WHICH BRIEFLY STATED THAT THE STATE'S BASIC POLICY WAS AGAINST DISCRIMINATION, THE GOVERNOR AS WELL AS ALL STATE OFFICIALS HAD SUFFICIENT POWER TO FIGHT DISCRIMINATION. THE OFFICIALS HAD FURTHER POWER WITH…

  13. 77 FR 12885 - Millstone Power Station, Units 1, 2 and 3, Dominion Nuclear Connecticut, Inc.; Exemption

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... high wind conditions pass, wind damage to the plant and surrounding area might preclude a sufficient... Power Station, Units 1, 2 and 3, Dominion Nuclear Connecticut, Inc.; Exemption 1.0 Background Dominion..., DPR-65 and NPF-49, which authorize operation of the Millstone Power Station, Unit Nos. 1, 2 and 3...

  14. Stationary conditions for stochastic differential equations

    NASA Technical Reports Server (NTRS)

    Adomian, G.; Walker, W. W.

    1972-01-01

    This is a preliminary study of possible necessary and sufficient conditions to insure stationarity in the solution process for a stochastic differential equation. It indirectly sheds some light on ergodicity properties and shows that the spectral density is generally inadequate as a statistical measure of the solution. Further work is proceeding on a more general theory which gives necessary and sufficient conditions in a form useful for applications.

  15. Quantum fluctuation theorems and power measurements

    NASA Astrophysics Data System (ADS)

    Prasanna Venkatesh, B.; Watanabe, Gentaro; Talkner, Peter

    2015-07-01

    Work in the paradigm of the quantum fluctuation theorems of Crooks and Jarzynski is determined by projective measurements of energy at the beginning and end of the force protocol. In analogy to classical systems, we consider an alternative definition of work given by the integral of the supplied power determined by integrating up the results of repeated measurements of the instantaneous power during the force protocol. We observe that such a definition of work, in spite of taking account of the process dependence, has different possible values and statistics from the work determined by the conventional two energy measurement approach (TEMA). In the limit of many projective measurements of power, the system’s dynamics is frozen in the power measurement basis due to the quantum Zeno effect leading to statistics only trivially dependent on the force protocol. In general the Jarzynski relation is not satisfied except for the case when the instantaneous power operator commutes with the total Hamiltonian at all times. We also consider properties of the joint statistics of power-based definition of work and TEMA work in protocols where both values are determined. This allows us to quantify their correlations. Relaxing the projective measurement condition, weak continuous measurements of power are considered within the stochastic master equation formalism. Even in this scenario the power-based work statistics is in general not able to reproduce qualitative features of the TEMA work statistics.

  16. Statistical Power of Psychological Research: What Have We Gained in 20 Years?

    ERIC Educational Resources Information Center

    Rossi, Joseph S.

    1990-01-01

    Calculated power for 6,155 statistical tests in 221 journal articles published in 1982 volumes of "Journal of Abnormal Psychology,""Journal of Consulting and Clinical Psychology," and "Journal of Personality and Social Psychology." Power to detect small, medium, and large effects was .17, .57, and .83, respectively. Concluded that power of…

  17. Photovoltaic receivers for laser beamed power in space

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.

    1991-01-01

    There has recently been a resurgence of interest in the use of beamed power to support space exploration activities. One of the most promising beamed power concepts uses a laser beam to transmit power to a remote photovoltaic array. Large lasers can be located on cloud-free sites at one or more ground locations and illuminate solar arrays to a level sufficient to provide operating power. Issues involved in providing photovoltaic receivers for such applications are discussed.

  18. Convexity of Energy-Like Functions: Theoretical Results and Applications to Power System Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dvijotham, Krishnamurthy; Low, Steven; Chertkov, Michael

    2015-01-12

    Power systems are undergoing unprecedented transformations with increased adoption of renewables and distributed generation, as well as the adoption of demand response programs. All of these changes, while making the grid more responsive and potentially more efficient, pose significant challenges for power systems operators. Conventional operational paradigms are no longer sufficient as the power system may no longer have big dispatchable generators with sufficient positive and negative reserves. This increases the need for tools and algorithms that can efficiently predict safe regions of operation of the power system. In this paper, we study energy functions as a tool to design algorithms for various operational problems in power systems. These have a long history in power systems and have been primarily applied to transient stability problems. In this paper, we take a new look at power systems, focusing on an aspect that has previously received little attention: Convexity. We characterize the domain of voltage magnitudes and phases within which the energy function is convex in these variables. We show that this corresponds naturally with standard operational constraints imposed in power systems. We show that power flow equations can be solved using this approach, as long as the solution lies within the convexity domain. We outline various desirable properties of solutions in the convexity domain and present simple numerical illustrations supporting our results.

  19. Microresonator Frequency Comb Optical Clock

    DTIC Science & Technology

    2014-07-22

    lithic construction with small size and power consumption. Microcomb development has included frequency control of their spectra [8–11...frequency f eo and amplified to a maximum of 140 mW. The first-order sideband powers are approximately 3 dB lower than the pump, and the piece of highly...resonator offers sufficient peak power for our experiments and is stable and repeatable even for different settings of pump frequency and power

  20. Experimental design, power and sample size for animal reproduction experiments.

    PubMed

    Chapman, Phillip L; Seidel, George E

    2008-01-01

    The present paper concerns statistical issues in the design of animal reproduction experiments, with emphasis on the problems of sample size determination and power calculations. We include examples and non-technical discussions aimed at helping researchers avoid serious errors that may invalidate or seriously impair the validity of conclusions from experiments. Screen shots from interactive power calculation programs and basic SAS power calculation programs are presented to aid in understanding statistical power and computing power in some common experimental situations. Practical issues that are common to most statistical design problems are briefly discussed. These include one-sided hypothesis tests, power level criteria, equality of within-group variances, transformations of response variables to achieve variance equality, optimal specification of treatment group sizes, 'post hoc' power analysis and arguments for the increased use of confidence intervals in place of hypothesis tests.

  1. Distribution pattern of benthic invertebrates in Danish estuaries: The use of Taylor's power law as a species-specific indicator of dispersion and behavior

    NASA Astrophysics Data System (ADS)

    Kristensen, Erik; Delefosse, Matthieu; Quintana, Cintia O.; Banta, Gary T.; Petersen, Hans Christian; Jørgensen, Bent

    2013-03-01

    The lack of a common statistical approach describing the distribution and dispersion pattern of marine benthic animals has often hampered the comparability among studies. The purpose of this study is therefore to apply an alternative approach, Taylor's power law, to data on spatial and temporal distribution of 9 dominating benthic invertebrate species from two study areas, the estuaries Odense Fjord and Roskilde Fjord, Denmark. The slope (b) obtained from the power relationship of sample variance (s2) versus mean (μ) appears to be species-specific and independent of location and time. It ranges from a low of ~ 1 for large-bodied (> 1 mg AFDW) species (e.g. Marenzelleria viridis, Nereis diversicolor) to a high of 1.6-1.9 for small-bodied (< 1 mg AFDW) species (e.g. Pygospio elegans and Tubificoides benedii). Accordingly, b is apparently a valuable species-specific dispersion index based on biological factors such as behavior and intraspecific interactions. Thus, at the examined spatial scale, the more intense intraspecific interactions (e.g. territoriality) cause less aggregated distribution patterns among large- than small-bodied invertebrates. The species-specific interactions seem sufficiently strong to override environmental influences (e.g. water depth and sediment type). The strong linear relationship between the slope b and intercept log(a) from the power relationship is remarkably similar for all surveys providing a common slope of -1.63 with the present sampling approach. We suggest that this relationship is an inherent characteristic of Taylor's power law, and that b as a dispersion index may be biased by e.g. sampling errors when this relationship is weak. The correlation strength between b and log(a) could therefore be envisioned as a data quality check.
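
    Taylor's power law, s2 = a * mu^b, is fitted as a straight line in log-log space, with the slope giving the dispersion index b. A minimal sketch on synthetic count data follows; the negative-binomial counts and survey layout are illustrative assumptions, not the Odense or Roskilde data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)

        # Synthetic survey: for each station, counts of one species in replicate samples.
        # Negative-binomial counts give aggregated (clumped) distributions, so b > 1 is expected.
        station_means = np.geomspace(1, 200, 25)
        means, variances = [], []
        for mu in station_means:
            counts = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu), size=12)   # mean ~ mu
            means.append(counts.mean())
            variances.append(counts.var(ddof=1))

        means, variances = np.array(means), np.array(variances)
        ok = (means > 0) & (variances > 0)
        slope_b, intercept_log_a, r, p, se = stats.linregress(np.log10(means[ok]),
                                                              np.log10(variances[ok]))
        print("Taylor slope b =", slope_b, "log10(a) =", intercept_log_a)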

  2. Electrolytic plating apparatus for discrete microsized particles

    DOEpatents

    Mayer, Anton

    1976-11-30

    Method and apparatus are disclosed for electrolytically producing very uniform coatings of a desired material on discrete microsized particles. Agglomeration or bridging of the particles during the deposition process is prevented by imparting a sufficiently random motion to the particles that they are not in contact with a powered cathode for a time sufficient for such to occur.

  3. Statistical power analysis of cardiovascular safety pharmacology studies in conscious rats.

    PubMed

    Bhatt, Siddhartha; Li, Dingzhou; Flynn, Declan; Wisialowski, Todd; Hemkens, Michelle; Steidl-Nichols, Jill

    2016-01-01

    Cardiovascular (CV) toxicity and related attrition are a major challenge for novel therapeutic entities and identifying CV liability early is critical for effective derisking. CV safety pharmacology studies in rats are a valuable tool for early investigation of CV risk. Thorough understanding of data analysis techniques and statistical power of these studies is currently lacking and is imperative for enabling sound decision-making. Data from 24 crossover and 12 parallel design CV telemetry rat studies were used for statistical power calculations. Average values of telemetry parameters (heart rate, blood pressure, body temperature, and activity) were logged every 60 s (from 1 h predose to 24 h post-dose) and reduced to 15 min mean values. These data were subsequently binned into super intervals for statistical analysis. A repeated measure analysis of variance was used for statistical analysis of crossover studies and a repeated measure analysis of covariance was used for parallel studies. Statistical power analysis was performed to generate power curves and establish relationships between detectable CV (blood pressure and heart rate) changes and statistical power. Additionally, data from a crossover CV study with phentolamine at 4, 20 and 100 mg/kg are reported as a representative example of data analysis methods. Phentolamine produced a CV profile characteristic of alpha adrenergic receptor antagonism, evidenced by a dose-dependent decrease in blood pressure and reflex tachycardia. Detectable blood pressure changes at 80% statistical power for crossover studies (n=8) were 4-5 mmHg. For parallel studies (n=8), detectable changes at 80% power were 6-7 mmHg. Detectable heart rate changes for both study designs were 20-22 bpm. Based on our results, the conscious rat CV model is a sensitive tool to detect and mitigate CV risk in early safety studies. Furthermore, these results will enable informed selection of appropriate models and study design for early stage CV studies. Copyright © 2016 Elsevier Inc. All rights reserved.
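
    The link between detectable change and power reported above can be approximated with standard two-sided t-test power calculations. The sketch below assumes illustrative within-animal and between-animal standard deviations rather than the study's telemetry variance components.

        from statsmodels.stats.power import TTestPower, TTestIndPower

        N, ALPHA, POWER = 8, 0.05, 0.8
        SD_WITHIN, SD_BETWEEN = 4.0, 6.0   # mmHg; assumed values, for illustration only

        # Crossover design: each animal is its own control, so a paired (one-sample) t-test applies.
        d_cross = TTestPower().solve_power(effect_size=None, nobs=N, alpha=ALPHA, power=POWER)

        # Parallel design: two independent groups of N animals each.
        d_par = TTestIndPower().solve_power(effect_size=None, nobs1=N, alpha=ALPHA, power=POWER)

        print("crossover detectable change ~", round(d_cross * SD_WITHIN, 1), "mmHg")
        print("parallel detectable change  ~", round(d_par * SD_BETWEEN, 1), "mmHg")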

  4. Identifying natural flow regimes using fish communities

    NASA Astrophysics Data System (ADS)

    Chang, Fi-John; Tsai, Wen-Ping; Wu, Tzu-Ching; Chen, Hung-kwai; Herricks, Edwin E.

    2011-10-01

    Modern water resources management has adopted natural flow regimes as reasonable targets for river restoration and conservation. The characterization of a natural flow regime begins with the development of hydrologic statistics from flow records. However, little guidance exists for defining the period of record needed for regime determination. In Taiwan, the Taiwan Eco-hydrological Indicator System (TEIS), a group of hydrologic statistics selected for fisheries relevance, is being used to evaluate ecological flows. The TEIS consists of a group of hydrologic statistics selected to characterize the relationships between flow and the life history of indigenous species. Using the TEIS and biosurvey data for Taiwan, this paper identifies the length of hydrologic record sufficient for natural flow regime characterization. To define the ecological hydrology of fish communities, this study connected hydrologic statistics to fish communities by using methods to define antecedent conditions that influence existing community composition. A moving average method was applied to TEIS statistics to reflect the effects of antecedent flow condition and a point-biserial correlation method was used to relate fisheries collections with TEIS statistics. The resulting fish species-TEIS (FISH-TEIS) hydrologic statistics matrix takes full advantage of historical flows and fisheries data. The analysis indicates that, in the watersheds analyzed, averaging TEIS statistics for the present year and 3 years prior to the sampling date, termed MA(4), is sufficient to develop a natural flow regime. This result suggests that flow regimes based on hydrologic statistics for the period of record can be replaced by regimes developed for sampled fish communities.

  5. Report on cancer risks associated with the ingestion of asbestos. DHHS Committee to Coordinate Environmental and Related Programs.

    PubMed Central

    1987-01-01

    This report is an assessment of all available literature that pertains to the potential risk of cancer associated with ingestion of asbestos. It was compiled by a working group to assist policy makers in the Department of Health and Human Services determine if adequate information was available for a definitive risk assessment on this potential problem and evaluate if the weight of evidence was sufficient to prioritize this issue for new policy recommendations. The work group considered the basis for concern over this problem, the body of toxicology experiments, the individual epidemiologic studies which have attempted to investigate this issue, and the articles that discuss components of risk assessment pertaining to the ingestion of asbestos. In the report, the work group concluded: that no direct, definitive risk assessment can be conducted at this time; that further epidemiologic investigations will be very costly and only possess sufficient statistical power to detect relatively large excesses in cancers related to asbestos ingestion; and that probably the most pertinent toxicologic experiments relate to resolving the differences in how inhaled asbestos, which is eventually swallowed, is biologically processed by humans, compared to how ingested asbestos is processed. The work group believes that the cancer risk associated with asbestos ingestion should not be perceived as one of the most pressing potential public health hazards facing the nation. However, the work group does not believe that information was sufficient to assess the level of cancer risk associated with the ingestion and therefore, this potential hazard should not be discounted, and ingestion exposure to asbestos should be eliminated whenever possible. PMID:3304998

  6. How Many Studies Do You Need? A Primer on Statistical Power for Meta-Analysis

    ERIC Educational Resources Information Center

    Valentine, Jeffrey C.; Pigott, Therese D.; Rothstein, Hannah R.

    2010-01-01

    In this article, the authors outline methods for using fixed and random effects power analysis in the context of meta-analysis. Like statistical power analysis for primary studies, power analysis for meta-analysis can be done either prospectively or retrospectively and requires assumptions about parameters that are unknown. The authors provide…

  7. Vibration harvesting in traffic tunnels to power wireless sensor nodes

    NASA Astrophysics Data System (ADS)

    Wischke, M.; Masur, M.; Kröner, M.; Woias, P.

    2011-08-01

    Monitoring the traffic and the structural health of traffic tunnels requires numerous sensors. Powering these remote and partially embedded sensors from ambient energies will reduce maintenance costs, and improve the sensor network performance. This work reports on vibration levels detected in railway and road tunnels as a potential energy source for embedded sensors. The measurement results showed that the vibrations at any location in the road tunnel and at the wall in the railway tunnel are too small for useful vibration harvesting. In contrast, the railway sleeper features usable vibrations and sufficient mounting space. For this application site, a robust piezoelectric vibration harvester was designed and equipped with a power interface circuit. Within the field test, it is demonstrated that sufficient energy is harvested to supply a microcontroller with a radio frequency (RF) interface.

  8. Monitoring Statistics Which Have Increased Power over a Reduced Time Range.

    ERIC Educational Resources Information Center

    Tang, S. M.; MacNeill, I. B.

    1992-01-01

    The problem of monitoring trends for changes at unknown times is considered. Statistics that permit one to focus high power on a segment of the monitored period are studied. Numerical procedures are developed to compute the null distribution of these statistics. (Author)

  9. Boundary Work and Power in the Controversy over Therapeutic Touch in Finnish Nursing Science

    ERIC Educational Resources Information Center

    Vuolanto, Pia

    2015-01-01

    The boundary work approach has been established as one of the main ways to study controversies in science. However, it has been proposed that it does not meet the power dynamics of the scientific field sufficiently. This article concentrates on the intertwining of boundary work and power. It combines the boundary work approach developed by Thomas…

  10. Modelling and stability analysis of switching impulsive power systems with multiple equilibria

    NASA Astrophysics Data System (ADS)

    Zhu, Liying; Qiu, Jianbin; Chadli, Mohammed

    2017-12-01

    This paper models power systems accompanied by a series of faults in the form of switched impulsive Hamiltonian systems (SIHSs) with multiple equilibria (ME) and unstable subsystems (US), and then analyzes long-term stability issues of such power systems from a mathematical viewpoint. According to the complex phenomena of switching actions of stages and generators, impulses of state, and existence of multiple equilibria, this paper first introduces an SIHS with ME and US to formulate a switching impulsive power system composed of an active generator, a standby generator, and an infinite load. Then, based on special system structures, a unique compact region containing all ME is determined, and novel stability concepts of region stability (RS), asymptotic region stability (ARS), and exponential region stability (ERS) are defined for such SIHS with respect to the region. Third, based on the introduced stability concepts, this paper proposes a necessary and sufficient condition of RS and ARS and a sufficient condition of ERS for the power system with respect to the region via the maximum energy function method. Finally, numerical simulations are carried out for a power system to show the effectiveness and practicality of the obtained novel results.

  11. The 1993 Mississippi river flood: A one hundred or a one thousand year event?

    USGS Publications Warehouse

    Malamud, B.D.; Turcotte, D.L.; Barton, C.C.

    1996-01-01

    Power-law (fractal) extreme-value statistics are applicable to many natural phenomena under a wide variety of circumstances. Data from a hydrologic station in Keokuk, Iowa, shows the great flood of the Mississippi River in 1993 has a recurrence interval on the order of 100 years using power-law statistics applied to partial-duration flood series and on the order of 1,000 years using a log-Pearson type 3 (LP3) distribution applied to annual series. The LP3 analysis is the federally adopted probability distribution for flood-frequency estimation of extreme events. We suggest that power-law statistics are preferable to LP3 analysis. As a further test of the power-law approach we consider paleoflood data from the Colorado River. We compare power-law and LP3 extrapolations of historical data with these paleofloods. The results are remarkably similar to those obtained for the Mississippi River: Recurrence intervals from power-law statistics applied to Lees Ferry discharge data are generally consistent with inferred 100- and 1,000-year paleofloods, whereas LP3 analysis gives recurrence intervals that are orders of magnitude longer. For both the Keokuk and Lees Ferry gauges, the use of an annual series introduces an artificial curvature in log-log space that leads to an underestimate of severe floods. Power-law statistics predict much shorter recurrence intervals than the federally adopted LP3 statistics. We suggest that if power-law behavior is applicable, then the likelihood of severe floods is much higher. More conservative dam designs and land-use restrictions may be required.
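
    The power-law approach on a partial-duration series can be sketched as follows: rank the flood peaks, assign each an exceedance recurrence interval, fit a straight line in log-log space, and extrapolate to the desired return period. The discharges below are synthetic, not the Keokuk or Lees Ferry records.

        import numpy as np

        rng = np.random.default_rng(5)
        YEARS = 80
        # Synthetic partial-duration series: all independent flood peaks above a base threshold.
        peaks = 1.0e4 * (1.0 + rng.pareto(2.5, size=3 * YEARS))   # discharge, arbitrary units

        ranked = np.sort(peaks)[::-1]
        recurrence_yr = (YEARS + 1.0) / np.arange(1, ranked.size + 1)   # Weibull-type plotting position

        # Fit log(discharge) versus log(recurrence interval): a straight line under power-law scaling.
        slope, intercept = np.polyfit(np.log10(recurrence_yr), np.log10(ranked), 1)

        def discharge_for_return_period(t_years):
            return 10 ** (intercept + slope * np.log10(t_years))

        print("100-year flood estimate:", discharge_for_return_period(100.0))
        print("1000-year flood estimate:", discharge_for_return_period(1000.0))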

  12. Cost Analysis of the Discrete Address Beacon System for the Low-Performance General Aviation Aircraft Community.

    DTIC Science & Technology

    1981-09-01

    power supplies of the transponder to provide a maximum 23.5 dBW power output. Tables 3-5 and 3-6 present the cost development for this configuration...configurations studied the cavity oscillator tube provides the necessary output characteristics for proper operation of the DABS transponder. Power supplies, however, are affected by each configuration. The power supply was designed to provide 141 watts peak power at the antenna and sufficient capacity in

  13. Photovoltaic power system for a lunar base

    NASA Astrophysics Data System (ADS)

    Karia, Kris

    An assessment is provided of the viability of using photovoltaic power technology for lunar base application during the initial phase of the mission. The initial user power demands were assumed to be 25 kW (daytime) and 12.5 kW (night time). The effect of lunar adverse environmental conditions were also considered in deriving the photovoltaic power system concept. The solar cell array was found to impose no more design constraints than those solar arrays currently being designed for spacecraft and the Space Station Freedom. The long lunar night and the need to store sufficient energy to sustain a lunar facility during this period was found to be a major design driver. A photovoltaic power system concept was derived using high efficiency thin GaAs solar cells on a deployable flexible Kapton blanket. The solar array design was sized to generate sufficient power for daytime use and for a regenerative fuel cell (RFC) energy storage system to provide power during the night. Solar array sun-tracking is also proposed to maximize the array power output capability. The system launch mass was estimated to be approximately 10 metric tons. For mission application of photovoltaic technology other issues have to be addressed including the constraints imposed by launch vehicle, safety, and cost. For the initial phase of the mission a photovoltaic power system offers a safe option.
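
    The dominant sizing driver described above, the roughly 354-hour lunar night, can be illustrated with a simple energy balance. The daytime and nighttime loads match those stated in the abstract, but the regenerative-fuel-cell efficiency is an assumption for illustration.

        # Rough lunar-base power sizing; the RFC efficiency is an illustrative assumption.
        DAY_H = NIGHT_H = 354.0          # lunar day and night each last roughly 354 hours
        P_DAY_KW, P_NIGHT_KW = 25.0, 12.5
        RFC_ROUNDTRIP_EFF = 0.55         # assumed regenerative-fuel-cell round-trip efficiency

        night_energy_kwh = P_NIGHT_KW * NIGHT_H                       # energy delivered at night
        recharge_energy_kwh = night_energy_kwh / RFC_ROUNDTRIP_EFF    # array energy needed to refill RFC
        array_power_kw = P_DAY_KW + recharge_energy_kwh / DAY_H       # daytime load plus recharge

        print(f"night storage: {night_energy_kwh:.0f} kWh")
        print(f"required array output: {array_power_kw:.1f} kW")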

  14. 46 CFR 112.05-1 - Purpose; preemptive effect.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS General § 112.05-1 Purpose; preemptive effect. (a) The purpose of this part is to ensure a dependable, independent, and dedicated emergency power source with sufficient capacity to supply...

  15. 46 CFR 112.05-1 - Purpose; preemptive effect.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS General § 112.05-1 Purpose; preemptive effect. (a) The purpose of this part is to ensure a dependable, independent, and dedicated emergency power source with sufficient capacity to supply...

  16. 46 CFR 112.05-1 - Purpose; preemptive effect.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS General § 112.05-1 Purpose; preemptive effect. (a) The purpose of this part is to ensure a dependable, independent, and dedicated emergency power source with sufficient capacity to supply...

  17. Toward Self Sufficiency: Social Issues in the Nineties. Proceedings of the National Association for Welfare Research and Statistics (33rd, Scottsdale, Arizona, August 7-11, 1993).

    ERIC Educational Resources Information Center

    National Association for Welfare Research and Statistics, Olympia, WA.

    The presentations compiled in these proceedings on welfare and self-sufficiency reflect much of the current research in areas of housing, health, employment and training, welfare and reform, nutrition, child support, child care, and youth. The first section provides information on the conference and on the National Association for Welfare Research…

  18. Radar System Characterization Extended to Hardware-in-the-Loop Simulation for the Lab-Volt (Trademark) Training System

    DTIC Science & Technology

    2007-09-01

    devices such as klystrons, magnetrons, and traveling wave tubes. These microwave devices produce high power levels but may have limited bandwidths [20...diagram. The specific arrangement of components within a RADAR transmitter varies with operational specifications. Two options exist to produce high power ...cascading to generate sufficient power [20]. The second option to generate high power levels is to replace RF oscillators and amplifiers with microwave

  19. Solar Power Generation in Extreme Space Environments

    NASA Technical Reports Server (NTRS)

    Elliott, Frederick W.; Piszczor, Michael F.

    2016-01-01

    The exploration of space requires power for guidance, navigation, and control; instrumentation; thermal control; communications and data handling; and many subsystems and activities. Generating sufficient and reliable power in deep space through the use of solar arrays becomes even more challenging as solar intensity decreases and high radiation levels begin to degrade the performance of photovoltaic devices. The Extreme Environments Solar Power (EESP) project goal is to develop advanced photovoltaic technology to address these challenges.

  20. Which Variables Associated with Data-Driven Instruction Are Believed to Best Predict Urban Student Achievement?

    ERIC Educational Resources Information Center

    Greer, Wil

    2013-01-01

    This study identified the variables associated with data-driven instruction (DDI) that are perceived to best predict student achievement. Of the DDI variables discussed in the literature, 51 of them had a sufficient enough research base to warrant statistical analysis. Of them, 26 were statistically significant. Multiple regression and an…

  1. Explanatory power does not equal clinical importance: study of the use of the Brief ICF Core Sets for Spinal Cord Injury with a purely statistical approach.

    PubMed

    Ballert, C; Oberhauser, C; Biering-Sørensen, F; Stucki, G; Cieza, A

    2012-10-01

    Psychometric study analyzing the data of a cross-sectional, multicentric study with 1048 persons with spinal cord injury (SCI). To shed light on how to apply the Brief Core Sets for SCI of the International Classification of Functioning, Disability and Health (ICF) by determining whether the ICF categories contained in the Core Sets capture differences in overall health. Lasso regression was applied using overall health, rated by the patients and health professionals, as dependent variables and the ICF categories of the Comprehensive ICF Core Sets for SCI as independent variables. The ICF categories that best capture differences in overall health refer to areas of life such as self-care, relationships, economic self-sufficiency and community life. Only about 25% of the ICF categories of the Brief ICF Core Sets for the early post-acute and for long-term contexts were selected in the Lasso regression and differentiate, therefore, among levels of overall health. ICF categories such as d570 Looking after one's health, d870 Economic self-sufficiency, d620 Acquisition of goods and services and d910 Community life, which capture changes in overall health in patients with SCI, should be considered in addition to those of the Brief ICF Core Sets in clinical and epidemiological studies in persons with SCI.
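
    The analytic step can be sketched generically: Lasso regression shrinks most coefficients to zero and so selects the subset of ICF categories that best captures differences in rated overall health. The snippet below runs on synthetic data; the dimensions, qualifier scale and "true" categories are assumptions, not the SCI dataset.

        import numpy as np
        from sklearn.linear_model import LassoCV
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(6)
        n_persons, n_icf_categories = 1000, 120          # illustrative dimensions

        # Synthetic ICF qualifier ratings (0-4) and an overall-health rating driven by a few of them.
        X = rng.integers(0, 5, size=(n_persons, n_icf_categories)).astype(float)
        true_coefs = np.zeros(n_icf_categories)
        true_coefs[[3, 17, 42, 80]] = [-0.8, -0.6, -0.5, -0.4]   # e.g. self-care, community life
        overall_health = 80 + X @ true_coefs + rng.normal(0, 5, n_persons)

        X_std = StandardScaler().fit_transform(X)
        model = LassoCV(cv=5).fit(X_std, overall_health)
        selected = np.flatnonzero(model.coef_ != 0)
        print("ICF categories retained by the Lasso:", selected)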

  2. [Effect of vitamin beverages on vitamin sufficiency of the workers of Pskov Hydroelectric Power-Plant].

    PubMed

    Spiricheva, T V; Vrezhesinskaia, O A; Beketova, N A; Pereverzeva, O G; Kosheleva, O V; Kharitonchik, L A; Kodentsova, V M; Iudina, A V; Spirichev, V B

    2010-01-01

    The influence of vitamin complexes, taken as a drink or kissel, on the vitamin status of working persons was investigated. Long-term inclusion (6.5 months) in the diet of vitamin drinks containing about 80% of the recommended daily intake of vitamins was accompanied by a significant improvement in vitamin C and B6 status and prevented the seasonal deterioration of beta-carotene status. Because the participants were initially well supplied with vitamins A and E, no increase in the blood serum levels of these vitamins occurred.

  3. Robust inference for group sequential trials.

    PubMed

    Ganju, Jitendra; Lin, Yunzhi; Zhou, Kefei

    2017-03-01

    For ethical reasons, group sequential trials were introduced to allow trials to stop early in the event of extreme results. Endpoints in such trials are usually mortality or irreversible morbidity. For a given endpoint, the norm is to use a single test statistic and to use that same statistic for each analysis. This approach is risky because the test statistic has to be specified before the study is unblinded, and there is loss in power if the assumptions that ensure optimality for each analysis are not met. To minimize the risk of moderate to substantial loss in power due to a suboptimal choice of a statistic, a robust method was developed for nonsequential trials. The concept is analogous to diversification of financial investments to minimize risk. The method is based on combining P values from multiple test statistics for formal inference while controlling the type I error rate at its designated value. This article evaluates the performance of 2 P value combining methods for group sequential trials. The emphasis is on time to event trials although results from less complex trials are also included. The gain or loss in power with the combination method relative to a single statistic is asymmetric in its favor. Depending on the power of each individual test, the combination method can give more power than any single test or give power that is closer to the test with the most power. The versatility of the method is that it can combine P values from different test statistics for analysis at different times. The robustness of results suggests that inference from group sequential trials can be strengthened with the use of combined tests. Copyright © 2017 John Wiley & Sons, Ltd.
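
    Two standard P-value combination rules of the kind evaluated in such work are Fisher's and Stouffer's; a minimal generic sketch follows (these are textbook formulas, not the article's group-sequential machinery).

        import numpy as np
        from scipy import stats

        def fisher_combined(p_values):
            p = np.asarray(p_values, dtype=float)
            chi2 = -2.0 * np.log(p).sum()              # ~ chi-square with 2k df under the null
            return stats.chi2.sf(chi2, df=2 * p.size)

        def stouffer_combined(p_values, weights=None):
            p = np.asarray(p_values, dtype=float)
            z = stats.norm.isf(p)                      # one-sided Z score for each test
            w = np.ones_like(z) if weights is None else np.asarray(weights, dtype=float)
            return stats.norm.sf((w * z).sum() / np.sqrt((w ** 2).sum()))

        p_from_two_tests = [0.03, 0.20]                # e.g. P values from two different test statistics
        print(fisher_combined(p_from_two_tests), stouffer_combined(p_from_two_tests))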

  4. Exploiting excess sharing: a more powerful test of linkage for affected sib pairs than the transmission/disequilibrium test.

    PubMed Central

    Wicks, J

    2000-01-01

    The transmission/disequilibrium test (TDT) is a popular, simple, and powerful test of linkage, which can be used to analyze data consisting of transmissions to the affected members of families with any kind of pedigree structure, including affected sib pairs (ASPs). Although it is based on the preferential transmission of a particular marker allele across families, it is not a valid test of association for ASPs. Martin et al. devised a similar statistic for ASPs, Tsp, which is also based on preferential transmission of a marker allele but which is a valid test of both linkage and association for ASPs. It is, however, less powerful than the TDT as a test of linkage for ASPs. What I show is that the differences between the TDT and Tsp are due to the fact that, although both statistics are based on preferential transmission of a marker allele, the TDT also exploits excess sharing in identity-by-descent transmissions to ASPs. Furthermore, I show that both of these statistics are members of a family of "TDT-like" statistics for ASPs. The statistics in this family are based on preferential transmission but also, to varying extents, exploit excess sharing. From this family of statistics, we see that, although the TDT exploits excess sharing to some extent, it is possible to do so to a greater extent, and thus produce a more powerful test of linkage, for ASPs, than is provided by the TDT. Power simulations conducted under a number of disease models are used to verify that the most powerful member of this family of TDT-like statistics is more powerful than the TDT for ASPs. PMID:10788332
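
    For reference, the basic TDT statistic that this family of "TDT-like" statistics generalizes is a McNemar-type comparison of transmission counts from heterozygous parents; a minimal sketch with illustrative counts is given below.

        from scipy import stats

        def tdt_statistic(b, c):
            # b = transmissions of allele A1 from heterozygous parents, c = non-transmissions.
            # Under the null of no linkage, (b - c)^2 / (b + c) is approximately chi-square(1).
            chi2 = (b - c) ** 2 / (b + c)
            return chi2, stats.chi2.sf(chi2, df=1)

        print(tdt_statistic(b=62, c=38))   # illustrative transmitted vs untransmitted counts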

  5. Exploiting excess sharing: a more powerful test of linkage for affected sib pairs than the transmission/disequilibrium test.

    PubMed

    Wicks, J

    2000-06-01

    The transmission/disequilibrium test (TDT) is a popular, simple, and powerful test of linkage, which can be used to analyze data consisting of transmissions to the affected members of families with any kind of pedigree structure, including affected sib pairs (ASPs). Although it is based on the preferential transmission of a particular marker allele across families, it is not a valid test of association for ASPs. Martin et al. devised a similar statistic for ASPs, Tsp, which is also based on preferential transmission of a marker allele but which is a valid test of both linkage and association for ASPs. It is, however, less powerful than the TDT as a test of linkage for ASPs. What I show is that the differences between the TDT and Tsp are due to the fact that, although both statistics are based on preferential transmission of a marker allele, the TDT also exploits excess sharing in identity-by-descent transmissions to ASPs. Furthermore, I show that both of these statistics are members of a family of "TDT-like" statistics for ASPs. The statistics in this family are based on preferential transmission but also, to varying extents, exploit excess sharing. From this family of statistics, we see that, although the TDT exploits excess sharing to some extent, it is possible to do so to a greater extent, and thus produce a more powerful test of linkage, for ASPs, than is provided by the TDT. Power simulations conducted under a number of disease models are used to verify that the most powerful member of this family of TDT-like statistics is more powerful than the TDT for ASPs.

  6. 29 CFR 1918.86 - Roll-on roll-off (Ro-Ro) operations (see also § 1918.2, Ro-Ro operations, and § 1918.25).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., Ro-Ro operations, and § 1918.25). 9 [Reserved] (a) Traffic control system. An organized system of... simultaneous use of the ramp by vehicles and pedestrians. (d) Ramp maintenance. Ramps shall be properly...: (1) Sufficient power to ascend ramp inclines safely; and (2) Sufficient braking capacity to descend...

  7. 18 CFR 33.5 - Proposed accounting entries.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Proposed accounting entries. 33.5 Section 33.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... present proposed accounting entries showing the effect of the transaction with sufficient detail to...

  8. 18 CFR 33.5 - Proposed accounting entries.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Proposed accounting entries. 33.5 Section 33.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... present proposed accounting entries showing the effect of the transaction with sufficient detail to...

  9. 18 CFR 33.5 - Proposed accounting entries.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Proposed accounting entries. 33.5 Section 33.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... present proposed accounting entries showing the effect of the transaction with sufficient detail to...

  10. 18 CFR 33.5 - Proposed accounting entries.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Proposed accounting entries. 33.5 Section 33.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION... present proposed accounting entries showing the effect of the transaction with sufficient detail to...

  11. 76 FR 42567 - Reporting Requirements for U.S. Providers of International Telecommunications Services

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-19

    ... or transfers, unjust enrichment issues are implicated. 25. Wireless Telecommunications Carriers... they: (1) Have sufficient market power at the foreign end of an international route to affect... concerns that overseas incumbent or monopoly telecommunications providers might use their market power to...

  12. 14 CFR 29.1357 - Circuit protective devices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... devices in the generating system must be designed to de-energize and disconnect faulty power sources and power transmission equipment from their associated buses with sufficient rapidity to provide protection... be designed so that, when an overload or circuit fault exists, it will open the circuit regardless of...

  13. 14 CFR 29.1357 - Circuit protective devices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... devices in the generating system must be designed to de-energize and disconnect faulty power sources and power transmission equipment from their associated buses with sufficient rapidity to provide protection... be designed so that, when an overload or circuit fault exists, it will open the circuit regardless of...

  14. 14 CFR 29.1357 - Circuit protective devices.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... devices in the generating system must be designed to de-energize and disconnect faulty power sources and power transmission equipment from their associated buses with sufficient rapidity to provide protection... be designed so that, when an overload or circuit fault exists, it will open the circuit regardless of...

  15. 14 CFR 29.1357 - Circuit protective devices.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... devices in the generating system must be designed to de-energize and disconnect faulty power sources and power transmission equipment from their associated buses with sufficient rapidity to provide protection... be designed so that, when an overload or circuit fault exists, it will open the circuit regardless of...

  16. Solar powered actuator with continuously variable auxiliary power control

    NASA Technical Reports Server (NTRS)

    Nola, F. J. (Inventor)

    1984-01-01

    A solar powered system is disclosed in which a load such as a compressor is driven by a main induction motor powered by a solar array. An auxiliary motor, which shares the load with the solar powered motor in proportion to the amount of sunlight available, is provided with a power factor controller for controlling the voltage applied to the auxiliary motor in accordance with the loading on that motor. In one embodiment, when sufficient power is available from the solar cell, the auxiliary motor is driven as a generator by excess power from the main motor so as to return electrical energy to the power company utility lines.

  17. An Analysis Pipeline with Statistical and Visualization-Guided Knowledge Discovery for Michigan-Style Learning Classifier Systems

    PubMed Central

    Urbanowicz, Ryan J.; Granizo-Mackenzie, Ambrose; Moore, Jason H.

    2014-01-01

    Michigan-style learning classifier systems (M-LCSs) represent an adaptive and powerful class of evolutionary algorithms which distribute the learned solution over a sizable population of rules. However, their application to complex real world data mining problems, such as genetic association studies, has been limited. Traditional knowledge discovery strategies for M-LCS rule populations involve sorting and manual rule inspection. While this approach may be sufficient for simpler problems, the confounding influence of noise and the need to discriminate between predictive and non-predictive attributes call for additional strategies. Additionally, tests of significance must be adapted to M-LCS analyses in order to make them a viable option within fields that require such analyses to assess confidence. In this work we introduce an M-LCS analysis pipeline that combines uniquely applied visualizations with objective statistical evaluation for the identification of predictive attributes and reliable rule generalizations in noisy single-step data mining problems. This work considers an alternative paradigm for knowledge discovery in M-LCSs, shifting the focus from individual rules to a global, population-wide perspective. We demonstrate the efficacy of this pipeline applied to the identification of epistasis (i.e., attribute interaction) and heterogeneity in noisy simulated genetic association data. PMID:25431544

  18. Modality-Driven Classification and Visualization of Ensemble Variance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bensema, Kevin; Gosink, Luke; Obermaier, Harald

    Advances in computational power now enable domain scientists to address conceptual and parametric uncertainty by running simulations multiple times in order to sufficiently sample the uncertain input space. While this approach helps address conceptual and parametric uncertainties, the ensemble datasets produced by this technique present a special challenge to visualization researchers as the ensemble dataset records a distribution of possible values for each location in the domain. Contemporary visualization approaches that rely solely on summary statistics (e.g., mean and variance) cannot convey the detailed information encoded in ensemble distributions that are paramount to ensemble analysis; summary statistics provide no information about modality classification and modality persistence. To address this problem, we propose a novel technique that classifies high-variance locations based on the modality of the distribution of ensemble predictions. Additionally, we develop a set of confidence metrics to inform the end-user of the quality of fit between the distribution at a given location and its assigned class. We apply a similar method to time-varying ensembles to illustrate the relationship between peak variance and bimodal or multimodal behavior. These classification schemes enable a deeper understanding of the behavior of the ensemble members by distinguishing between distributions that can be described by a single tendency and distributions which reflect divergent trends in the ensemble.
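
    (Illustrative sketch only, not the authors' classifier or confidence metrics: one simple way to label a location's ensemble distribution as unimodal or multimodal is to compare BIC scores of Gaussian mixtures with increasing numbers of components.)

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def classify_modality(samples, max_components=3):
        """Return the mixture order (1 = unimodal, 2+ = multimodal) with the lowest BIC."""
        x = np.asarray(samples, dtype=float).reshape(-1, 1)
        bics = [GaussianMixture(n_components=k, random_state=0).fit(x).bic(x)
                for k in range(1, max_components + 1)]
        return int(np.argmin(bics)) + 1

    rng = np.random.default_rng(1)
    unimodal = rng.normal(0.0, 1.0, 200)
    bimodal = np.concatenate([rng.normal(-3, 1, 100), rng.normal(3, 1, 100)])
    print(classify_modality(unimodal), classify_modality(bimodal))   # typically 1 and 2
    ```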

  19. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither of them alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting for not only formal verification and program testing, but also the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in the statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or higher). The coming years shall address the methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. A) Combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing with the aim of gaining higher confidence in software reliability assessment for high-assurance applications. B) Quantify the impact of these methods on software reliability. C) Demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a certain confidence level. D) Quantify and justify the reliability estimate for systems developed using various methods.

  20. Assistive technology: a health care reform for people with disabilities.

    PubMed

    Santiago-Pintor, Jorge; Hernández-Maldonado, María; Correa-Colón, Angela; Méndez-Fernández, Héctor L

    2009-03-01

    Assistive technology has become one of the most powerful tools in helping people with disabilities fight for social equality, both in Puerto Rico and in other places worldwide. In spite of this, the availability of assistive technology equipment alone does not ensure that people with disabilities have all the technology resources needed to be independent and productive in a society as competitive as ours. An assistive technology evaluation process is recommended in order to achieve an optimum level of self-sufficiency in people with disabilities. The evaluation process should take into consideration both the individual's needs and strengths and the advantages and disadvantages of the equipment. The main purpose of this research was to determine the satisfaction level of 69 consumers evaluated at the Assistive Technology Integrated Services Center. These evaluations were conducted during 2001-2005. Statistical tests, including frequency distributions, chi-square, bivariate analysis, and analysis of variance, were performed in order to determine whether an association existed between the consumers' level of satisfaction with the services and the assisted conditions. The data analysis showed significant differences in satisfaction level by consumer age, type of disability, and recommended equipment acquisition. In addition, statistical associations were established between general satisfaction concept dimensions, type of disability, and consumers' particular characteristics.

  1. A Large Scale (N=400) Investigation of Gray Matter Differences in Schizophrenia Using Optimized Voxel-based Morphometry

    PubMed Central

    Meda, Shashwath A.; Giuliani, Nicole R.; Calhoun, Vince D.; Jagannathan, Kanchana; Schretlen, David J.; Pulver, Anne; Cascella, Nicola; Keshavan, Matcheri; Kates, Wendy; Buchanan, Robert; Sharma, Tonmoy; Pearlson, Godfrey D.

    2008-01-01

    Background Many studies have employed voxel-based morphometry (VBM) of MRI images as an automated method of investigating cortical gray matter differences in schizophrenia. However, results from these studies vary widely, likely due to different methodological or statistical approaches. Objective To use VBM to investigate gray matter differences in schizophrenia in a sample significantly larger than any published to date, and to increase statistical power sufficiently to reveal differences missed in smaller analyses. Methods Magnetic resonance whole brain images were acquired from four geographic sites, all using the same model 1.5T scanner and software version, and combined to form a sample of 200 patients with both first episode and chronic schizophrenia and 200 healthy controls, matched for age, gender and scanner location. Gray matter concentration was assessed and compared using optimized VBM. Results Compared to the healthy controls, schizophrenia patients showed significantly less gray matter concentration in multiple cortical and subcortical regions, some previously unreported. Overall, we found lower concentrations of gray matter in regions identified in prior studies, most of which reported only subsets of the affected areas. Conclusions Gray matter differences in schizophrenia are most comprehensively elucidated using a large, diverse and representative sample. PMID:18378428

  2. Spurious correlations and inference in landscape genetics

    Treesearch

    Samuel A. Cushman; Erin L. Landguth

    2010-01-01

    Reliable interpretation of landscape genetic analyses depends on statistical methods that have high power to identify the correct process driving gene flow while rejecting incorrect alternative hypotheses. Little is known about statistical power and inference in individual-based landscape genetics. Our objective was to evaluate the power of causal modelling with partial...

  3. Estimation of thyroid radiation doses for the hanford thyroid disease study: results and implications for statistical power of the epidemiological analyses.

    PubMed

    Kopecky, Kenneth J; Davis, Scott; Hamilton, Thomas E; Saporito, Mark S; Onstad, Lynn E

    2004-07-01

    Residents of eastern Washington, northeastern Oregon, and western Idaho were exposed to 131I released into the atmosphere from operations at the Hanford Nuclear Site from 1944 through 1972, especially in the late 1940s and early 1950s. This paper describes the estimated doses to the thyroid glands of the 3,440 evaluable participants in the Hanford Thyroid Disease Study, which investigated whether thyroid morbidity was increased in people exposed to radioactive iodine from Hanford during 1944-1957. The participants were born during 1940-1946 to mothers living in Benton, Franklin, Walla Walla, Adams, Okanogan, Ferry, or Stevens Counties in Washington State. Whenever possible, someone with direct knowledge of the participant's early life (preferably the participant's mother) was interviewed about the participant's individual dose-determining characteristics (residence history, sources and quantities of food, milk, and milk products consumed, production and processing techniques for home-grown food and milk products). Default information was used if no interview respondent was available. Thyroid doses were estimated using the computer program Calculation of Individual Doses from Environmental Radionuclides (CIDER) developed by the Hanford Environmental Dose Reconstruction Project. CIDER provided 100 sets of doses to represent the uncertainty of the estimates. These sets were not generated independently for each participant, but reflected the effects of uncertainties in characteristics shared by participants. Estimated doses (medians of each participant's 100 realizations) ranged from 0.0029 mGy to 2823 mGy, with mean and median of 174 and 97 mGy, respectively. The distribution of estimated doses provided the Hanford Thyroid Disease Study with sufficient statistical power to test for dose-response relationships between thyroid outcomes and exposure to Hanford's 131I.

  4. Association of Epsilon-Aminocaproic Acid With Blood Loss and Risk of Transfusion After Periacetabular Osteotomy: A Retrospective Cohort Study.

    PubMed

    McLawhorn, Alexander S; Levack, Ashley E; Fields, Kara G; Sheha, Evan D; DelPizzo, Kathryn R; Sink, Ernest L

    2016-03-01

    Periacetabular osteotomy (PAO) reorients the acetabular cartilage through a complex series of pelvic osteotomies, which risks significant blood loss often necessitating blood transfusion. Therefore, it is important to identify effective strategies to manage blood loss and decrease morbidity after PAO. The purpose of this study was to determine the association of epsilon-aminocaproic acid (EACA), an antifibrinolytic agent, with blood loss from PAO. Ninety-three of 110 consecutive patients who underwent unilateral PAO for acetabular dysplasia met inclusion criteria. Fifty patients received EACA intraoperatively. Demographics, autologous blood predonation, anesthetic type, intraoperative estimated blood loss (EBL), cell-saver utilization, and transfusions were recorded. Total blood loss was calculated. Two-sample t-test and chi-square or Fisher's exact test were used as appropriate. The associations between EACA administration and calculated EBL, cell-saver utilization, intraoperative EBL, and maximum difference in postoperative hemoglobin were assessed via multiple regression, adjusting for confounders. Post hoc power analysis demonstrated sufficient power to detect a 250-mL difference in calculated EBL between groups. Alpha level was 0.05 for all tests. No demographic differences existed between groups. Mean blood loss and allogeneic transfusion rates did not differ significantly between groups (P = .093 and .170, respectively). There were no differences in cell-saver utilization, intraoperative EBL, or postoperative hemoglobin. There was a higher rate of autologous blood utilization in the group not receiving EACA because of a clinical practice change. EACA administration was not associated with a statistically significant reduction in blood loss or allogeneic transfusion in patients undergoing PAO. Copyright © 2016 Elsevier Inc. All rights reserved.
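
    (A sketch of the kind of post hoc two-sample power calculation described, using statsmodels; the assumed standard deviation of calculated blood loss is a placeholder, not a value reported in the study.)

    ```python
    from statsmodels.stats.power import TTestIndPower

    assumed_sd = 400.0                    # placeholder assumption, in mL
    effect_size = 250.0 / assumed_sd      # Cohen's d for a 250-mL difference
    power = TTestIndPower().power(effect_size=effect_size,
                                  nobs1=50, ratio=43 / 50, alpha=0.05)
    print(f"post hoc power to detect a 250-mL difference: {power:.2f}")
    ```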

  5. Computer simulation of magnetization-controlled shunt reactors for calculating electromagnetic transients in power systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpov, A. S.

    2013-01-15

    A computer procedure for simulating magnetization-controlled dc shunt reactors is described, which enables the electromagnetic transients in electric power systems to be calculated. It is shown that, by taking technically simple measures in the control system, one can obtain reactor response speeds sufficient for many purposes and dispense with the use of high-power devices for compensating higher harmonic components.

  6. Generalizing Terwilliger's likelihood approach: a new score statistic to test for genetic association.

    PubMed

    el Galta, Rachid; Uitte de Willige, Shirley; de Visser, Marieke C H; Helmer, Quinta; Hsu, Li; Houwing-Duistermaat, Jeanine J

    2007-09-24

    In this paper, we propose a one degree of freedom test for association between a candidate gene and a binary trait. This method is a generalization of Terwilliger's likelihood ratio statistic and is especially powerful for the situation of one associated haplotype. As an alternative to the likelihood ratio statistic, we derive a score statistic, which has a tractable expression. For haplotype analysis, we assume that phase is known. By means of a simulation study, we compare the performance of the score statistic to Pearson's chi-square statistic and the likelihood ratio statistic proposed by Terwilliger. We illustrate the method on three candidate genes studied in the Leiden Thrombophilia Study. We conclude that the statistic follows a chi-square distribution under the null hypothesis and that the score statistic is more powerful than Terwilliger's likelihood ratio statistic when the associated haplotype has a frequency between 0.1 and 0.4 and has a small impact on the studied disorder. With regard to Pearson's chi-square statistic, the score statistic has more power when the associated haplotype has a frequency above 0.2 and the number of variants is above five.
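
    (Not the score statistic derived in the paper: as a hedged illustration of the Pearson chi-square comparator used in the simulation study, one can test a known-phase haplotype-by-case/control count table directly; the counts below are invented.)

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    # rows = haplotypes, columns = (cases, controls); invented counts
    table = np.array([[120, 80],
                      [ 60, 70],
                      [ 20, 50]])
    chi2_stat, p_value, dof, _ = chi2_contingency(table)
    print(f"Pearson chi-square = {chi2_stat:.2f}, df = {dof}, p = {p_value:.4f}")
    ```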

  7. Optoelectronic Devices Based on Novel Semiconductor Structures

    DTIC Science & Technology

    2006-06-14

    superlattices 4. TEM study and band-filling effects in quantum-well dots 5. Improvements on tuning ranges and output powers for widely-tunable THz sources ...as the pump power increases, the relative strength of the QW emission in the QWD sample also increases. Eventually, at a sufficiently high pump power ...Ahopelto, Appl. Phys. Lett. 66, 2364 (1995). 5. A monochromatic and high-power THz source tunable in the ranges of 2.7-38.4 μm and 58.2-3540 μm for

  8. Why Current Statistics of Complementary Alternative Medicine Clinical Trials is Invalid.

    PubMed

    Pandolfi, Maurizio; Carreras, Giulia

    2018-06-07

    It is not sufficiently known that frequentist statistics cannot provide direct information on the probability that the research hypothesis tested is correct. The error resulting from this misunderstanding is compounded when the hypotheses under scrutiny have precarious scientific bases, as is generally the case for those of complementary alternative medicine (CAM). In such cases, it is mandatory to use inferential methods that take into account the prior probability that the hypothesis tested is true, such as Bayesian statistics. The authors show that, under such circumstances, no real statistical significance can be achieved in CAM clinical trials. In this respect, CAM trials involving human material are also hardly defensible from an ethical viewpoint.
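
    (Numerical illustration of the paper's point with assumed values: when the prior probability that a CAM hypothesis is true is low, a "significant" result at alpha = 0.05 still leaves a low posterior probability that the effect is real.)

    ```python
    def posterior_given_significance(prior, power=0.8, alpha=0.05):
        """P(hypothesis true | significant result) via Bayes' theorem."""
        true_pos = power * prior
        false_pos = alpha * (1.0 - prior)
        return true_pos / (true_pos + false_pos)

    for prior in (0.5, 0.1, 0.01):
        print(f"prior = {prior:.2f} -> posterior = {posterior_given_significance(prior):.2f}")
    ```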

  9. Integrated Cognitive-neuroscience Architectures for Understanding Sensemaking (ICArUS): A Computational Basis for ICArUS Challenge Problem Design

    DTIC Science & Technology

    2014-11-01

    Kullback, S., & Leibler, R. (1951). On information and sufficiency. Annals of Mathematical Statistics, 22, 79... cognitive challenges of sensemaking only informally using conceptual notions like "framing" and "re-framing", which are not sufficient to support T&E in... appropriate frame(s) from memory. Assess the Frame: Evaluate the quality of fit between data and frame. Generate Hypotheses: Use the current

  10. 29 CFR 1910.66 - Powered platforms for building maintenance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... used to supply electrical power and/or control current for equipment or to provide voice communication... access to, and egress from, the equipment and sufficient space to conduct necessary maintenance of the... in use; and (vi) An effective two-way voice communication system shall be provided between the...

  11. 29 CFR 1910.66 - Powered platforms for building maintenance.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... used to supply electrical power and/or control current for equipment or to provide voice communication... access to, and egress from, the equipment and sufficient space to conduct necessary maintenance of the... in use; and (vi) An effective two-way voice communication system shall be provided between the...

  12. 46 CFR 197.332 - PVHO-Decompression chambers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... dogs, from both sides of a closed hatch; (e) Have interior illumination sufficient to allow visual... (m) Have a sound-powered headset or telephone as a backup to the communications system required by § 197.328(c) (5) and (6), except when that communications system is a sound-powered system. ...

  13. Power Politics of Family Psychotherapy.

    ERIC Educational Resources Information Center

    Whitaker, Carl A.

    It is postulated that the standard framework for psychotherapy, a cooperative transference neurosis, does not validly carry over to the successful psychotherapy of a two-generation family group. In many disturbed families, the necessary and sufficient dynamics for change must be initiated, controlled, and augmented by a group dynamic power-play,…

  14. 29 CFR 1926.303 - Abrasive wheels and tools.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... and tools. (a) Power. All grinding machines shall be supplied with sufficient power to maintain the spindle speed at safe levels under all conditions of normal operation. (b) Guarding. (1) Grinding machines..., nut, and outer flange may be exposed on machines designed as portable saws. (c) Use of abrasive wheels...

  15. 29 CFR 1926.303 - Abrasive wheels and tools.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... and tools. (a) Power. All grinding machines shall be supplied with sufficient power to maintain the spindle speed at safe levels under all conditions of normal operation. (b) Guarding. (1) Grinding machines..., nut, and outer flange may be exposed on machines designed as portable saws. (c) Use of abrasive wheels...

  16. 29 CFR 1926.303 - Abrasive wheels and tools.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... and tools. (a) Power. All grinding machines shall be supplied with sufficient power to maintain the spindle speed at safe levels under all conditions of normal operation. (b) Guarding. (1) Grinding machines..., nut, and outer flange may be exposed on machines designed as portable saws. (c) Use of abrasive wheels...

  17. Low Power Switching for Antenna Reconfiguration

    NASA Technical Reports Server (NTRS)

    Bauhahn, Paul E. (Inventor); Becker, Robert C. (Inventor); Meyers, David W. (Inventor); Muldoon, Kelly P. (Inventor)

    2008-01-01

    Methods and systems for low power switching are provided. In one embodiment, an optical switching system is provided. The system comprises at least one optically controlled switch adapted to maintain one of an open state and a closed state based on an associated light signal; and at least one light source adapted to output the associated light signal to the at least one switch, wherein the at least one light source cycles the light signal on and off, wherein the at least one light source is cycled on for a sufficient duration of time and with a sufficient periodicity to maintain the optically controlled switch in one of an open state and a closed state.

  18. Evaluating sufficient similarity for drinking-water disinfection by-product (DBP) mixtures with bootstrap hypothesis test procedures.

    PubMed

    Feder, Paul I; Ma, Zhenxu J; Bull, Richard J; Teuschler, Linda K; Rice, Glenn

    2009-01-01

    In chemical mixtures risk assessment, the use of dose-response data developed for one mixture to estimate risk posed by a second mixture depends on whether the two mixtures are sufficiently similar. While evaluations of similarity may be made using qualitative judgments, this article uses nonparametric statistical methods based on the "bootstrap" resampling technique to address the question of similarity among mixtures of chemical disinfectant by-products (DBP) in drinking water. The bootstrap resampling technique is a general-purpose, computer-intensive approach to statistical inference that substitutes empirical sampling for theoretically based parametric mathematical modeling. Nonparametric, bootstrap-based inference involves fewer assumptions than parametric normal theory based inference. The bootstrap procedure is appropriate, at least in an asymptotic sense, whether or not the parametric, distributional assumptions hold, even approximately. The statistical analysis procedures in this article are initially illustrated with data from 5 water treatment plants (Schenck et al., 2009), and then extended using data developed from a study of 35 drinking-water utilities (U.S. EPA/AMWA, 1989), which permits inclusion of a greater number of water constituents and increased structure in the statistical models.
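
    (Rough sketch on invented data of a resampling test in this spirit: compare a similarity metric for two DBP mixture profiles, here the summed absolute difference in mean constituent proportions, against its distribution when both groups are resampled from the pooled data. The metric and data are illustrative, not those of the cited studies.)

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    mixture_a = rng.dirichlet(alpha=[4, 3, 2, 1], size=30)   # 30 samples x 4 DBP constituents
    mixture_b = rng.dirichlet(alpha=[4, 3, 2, 1], size=30)

    def distance(a, b):
        return np.abs(a.mean(axis=0) - b.mean(axis=0)).sum()

    observed = distance(mixture_a, mixture_b)
    pooled = np.vstack([mixture_a, mixture_b])
    null = []
    for _ in range(2000):
        # resample both groups from the pooled data (null of "same mixture")
        boot_a = pooled[rng.integers(0, len(pooled), len(mixture_a))]
        boot_b = pooled[rng.integers(0, len(pooled), len(mixture_b))]
        null.append(distance(boot_a, boot_b))
    p_value = np.mean(np.asarray(null) >= observed)
    print(f"observed distance = {observed:.3f}, bootstrap p = {p_value:.3f}")
    ```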

  19. Multiplicative point process as a model of trading activity

    NASA Astrophysics Data System (ADS)

    Gontis, V.; Kaulakys, B.

    2004-11-01

    Analysis of signals consisting of a sequence of pulses shows that the inherent origin of 1/f noise is a Brownian fluctuation of the average interevent time between subsequent pulses of the pulse sequence. In this paper, we generalize the model of interevent time to reproduce a variety of self-affine time series exhibiting power spectral density S(f) scaling as a power of the frequency f. Furthermore, we analyze the relation between the power-law correlations and the origin of the power-law probability distribution of the signal intensity. We introduce a stochastic multiplicative model for the time intervals between point events and analyze the statistical properties of the signal analytically and numerically. Such a model system exhibits power-law spectral density S(f) ~ 1/f^β for various values of β, including β = 1/2, 1, and 3/2. Explicit expressions for the power spectra in the low-frequency limit and for the distribution density of the interevent time are obtained. The counting statistics of the events are analyzed analytically and numerically as well. The specific interest of our analysis relates to the financial markets, where long-range correlations of price fluctuations largely depend on the number of transactions. We analyze the spectral density and counting statistics of the number of transactions. The model reproduces the spectral properties of real markets and explains the mechanism behind the power-law distribution of trading activity. The study provides evidence that the statistical properties of the financial markets are contained in the statistics of the time intervals between trades. A multiplicative point process serves as a consistent model generating these statistics.
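
    (Toy sketch, not the authors' exact stochastic model: simulate a point process whose interevent time fluctuates slowly around a mean value, bin the events into a counting signal, and inspect the low-frequency end of its power spectral density, which for such signals rises roughly as a power of 1/f.)

    ```python
    import numpy as np
    from scipy.signal import periodogram

    rng = np.random.default_rng(0)
    n_events, tau, t = 100_000, 1.0, 0.0
    times = np.empty(n_events)
    for k in range(n_events):
        # slowly wandering (mean-reverting) interevent time
        tau += 0.01 * (1.0 - tau) + 0.1 * np.sqrt(max(tau, 1e-6)) * rng.standard_normal()
        tau = max(tau, 1e-6)
        t += tau
        times[k] = t

    # bin events into a counting signal and estimate its spectrum
    counts, _ = np.histogram(times, bins=np.arange(0.0, times[-1], 1.0))
    freqs, psd = periodogram(counts - counts.mean(), fs=1.0)
    print(freqs[1:6], psd[1:6])   # low-frequency end of the spectrum
    ```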

  20. Association between arsenic exposure from a coal-burning power plant and urinary arsenic concentrations in Prievidza District, Slovakia.

    PubMed

    Ranft, Ulrich; Miskovic, Peter; Pesch, Beate; Jakubis, Pavel; Fabianova, Elenora; Keegan, Tom; Hergemöller, Andre; Jakubis, Marian; Nieuwenhuijsen, Mark J

    2003-06-01

    To assess the arsenic exposure of a population living in the vicinity of a coal-burning power plant with high arsenic emission in the Prievidza District, Slovakia, 548 spot urine samples were speciated for inorganic As (Asinorg), monomethylarsonic acid (MMA), dimethylarsinic acid (DMA), and their sum (Assum). The urine samples were collected from the population of a case-control study on nonmelanoma skin cancer (NMSC). A total of 411 samples with complete As speciations and sufficient urine quality and without fish consumption were used for statistical analysis. Although current environmental As exposure and urinary As concentrations were low (median As in soil within 5 km distance to the power plant, 41 µg/g; median urinary Assum, 5.8 µg/L), there was a significant but weak association between As in soil and urinary Assum (r = 0.21, p < 0.01). We performed a multivariate regression analysis to calculate adjusted regression coefficients for environmental As exposure and other determinants of urinary As. Persons living in the vicinity of the plant had 27% higher Assum values (p < 0.01), based on elevated concentrations of the methylated species. A 32% increase of MMA occurred among subjects who consumed homegrown food (p < 0.001). NMSC cases had significantly higher levels of Assum, DMA, and Asinorg. The methylation index Asinorg/(MMA + DMA) was about 20% lower among cases (p < 0.05) and in men (p < 0.05) compared with controls and females, respectively.

  1. Association between arsenic exposure from a coal-burning power plant and urinary arsenic concentrations in Prievidza District, Slovakia.

    PubMed Central

    Ranft, Ulrich; Miskovic, Peter; Pesch, Beate; Jakubis, Pavel; Fabianova, Elenora; Keegan, Tom; Hergemöller, Andre; Jakubis, Marian; Nieuwenhuijsen, Mark J

    2003-01-01

    To assess the arsenic exposure of a population living in the vicinity of a coal-burning power plant with high arsenic emission in the Prievidza District, Slovakia, 548 spot urine samples were speciated for inorganic As (Asinorg), monomethylarsonic acid (MMA), dimethylarsinic acid (DMA), and their sum (Assum). The urine samples were collected from the population of a case-control study on nonmelanoma skin cancer (NMSC). A total of 411 samples with complete As speciations and sufficient urine quality and without fish consumption were used for statistical analysis. Although current environmental As exposure and urinary As concentrations were low (median As in soil within 5 km distance to the power plant, 41 µg/g; median urinary Assum, 5.8 µg/L), there was a significant but weak association between As in soil and urinary Assum (r = 0.21, p < 0.01). We performed a multivariate regression analysis to calculate adjusted regression coefficients for environmental As exposure and other determinants of urinary As. Persons living in the vicinity of the plant had 27% higher Assum values (p < 0.01), based on elevated concentrations of the methylated species. A 32% increase of MMA occurred among subjects who consumed homegrown food (p < 0.001). NMSC cases had significantly higher levels of Assum, DMA, and Asinorg. The methylation index Asinorg/(MMA + DMA) was about 20% lower among cases (p < 0.05) and in men (p < 0.05) compared with controls and females, respectively. PMID:12782488

  2. Ft. McHenry tunnel study: Source profiles and mercury emissions from diesel and gasoline powered vehicles

    NASA Astrophysics Data System (ADS)

    Landis, Matthew S.; Lewis, Charles W.; Stevens, Robert K.; Keeler, Gerald J.; Dvonch, J. Timothy; Tremblay, Raphael T.

    During the fall of 1998, the US Environmental Protection Agency and the Florida Department of Environmental Protection sponsored a 7-day study at the Ft. McHenry tunnel in Baltimore, MD with the objective of obtaining PM2.5 vehicle source profiles for use in atmospheric mercury source apportionment studies. PM2.5 emission profiles from gasoline and diesel powered vehicles were developed from analysis of trace elements, polycyclic aromatic hydrocarbons (PAH), and condensed aliphatic hydrocarbons. PM2.5 samples were collected using commercially available sampling systems and were extracted and analyzed using conventional well-established methods. Both inorganic and organic profiles were sufficiently unique to mathematically discriminate the contributions from each source type using a chemical mass balance source apportionment approach. However, only the organic source profiles provided unique PAH tracers (e.g., fluoranthene, pyrene, and chrysene) for diesel combustion that could be used to identify source contributions generated using multivariate statistical receptor modeling approaches. In addition, the study found significant emission of gaseous elemental mercury (Hg0), divalent reactive gaseous mercury (RGM), and particulate mercury (Hg(p)) from gasoline but not from diesel powered motor vehicles. Fuel analysis supported the tunnel measurement results, showing that total mercury content in all grades of gasoline (284±108 ng/L) was substantially higher than total mercury content in diesel fuel (62±37 ng/L) collected contemporaneously at local Baltimore retailers.

  3. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    PubMed

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
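
    (Usage sketch for the package described above, fit to a synthetic heavy-tailed sample; with empirical data, pass your own one-dimensional array.)

    ```python
    import numpy as np
    import powerlaw   # pip install powerlaw

    rng = np.random.default_rng(42)
    data = rng.random(10_000) ** (-1.0 / 1.5)     # Pareto-like sample, density exponent ~2.5

    fit = powerlaw.Fit(data)                      # estimates xmin and the scaling exponent
    print(fit.power_law.alpha, fit.power_law.xmin)

    # compare the power-law fit against an exponential alternative
    R, p = fit.distribution_compare('power_law', 'exponential')
    print(f"loglikelihood ratio R = {R:.2f}, p = {p:.3f}")
    ```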

  4. Back-side hydrogenation technique for defect passivation in silicon solar cells

    DOEpatents

    Sopori, Bhushan L.

    1994-01-01

    A two-step back-side hydrogenation process includes the steps of first bombarding the back side of the silicon substrate with hydrogen ions with intensities and for a time sufficient to implant enough hydrogen atoms into the silicon substrate to potentially passivate substantially all of the defects and impurities in the silicon substrate, and then illuminating the silicon substrate with electromagnetic radiation to activate the implanted hydrogen, so that it can passivate the defects and impurities in the substrate. The illumination step also annihilates the hydrogen-induced defects. The illumination step is carried out according to a two-stage illumination schedule, the first or low-power stage of which subjects the substrate to electromagnetic radiation that has sufficient intensity to activate the implanted hydrogen, yet not drive the hydrogen from the substrate. The second or high-power illumination stage subjects the substrate to higher intensity electromagnetic radiation, which is sufficient to annihilate the hydrogen-induced defects and sinter/alloy the metal contacts.

  5. Back-side hydrogenation technique for defect passivation in silicon solar cells

    DOEpatents

    Sopori, B.L.

    1994-04-19

    A two-step back-side hydrogenation process includes the steps of first bombarding the back side of the silicon substrate with hydrogen ions with intensities and for a time sufficient to implant enough hydrogen atoms into the silicon substrate to potentially passivate substantially all of the defects and impurities in the silicon substrate, and then illuminating the silicon substrate with electromagnetic radiation to activate the implanted hydrogen, so that it can passivate the defects and impurities in the substrate. The illumination step also annihilates the hydrogen-induced defects. The illumination step is carried out according to a two-stage illumination schedule, the first or low-power stage of which subjects the substrate to electromagnetic radiation that has sufficient intensity to activate the implanted hydrogen, yet not drive the hydrogen from the substrate. The second or high-power illumination stage subjects the substrate to higher intensity electromagnetic radiation, which is sufficient to annihilate the hydrogen-induced defects and sinter/alloy the metal contacts. 3 figures.

  6. Statistical power as a function of Cronbach alpha of instrument questionnaire items.

    PubMed

    Heo, Moonseong; Kim, Namhee; Faith, Myles S

    2015-10-14

    In countless clinical trials, measurement of outcomes relies on instrument questionnaire items, which, however, often suffer from measurement error that in turn affects the statistical power of study designs. The Cronbach alpha or coefficient alpha, here denoted by C(α), can be used as a measure of internal consistency of parallel instrument items that are developed to measure a target unidimensional outcome construct. The scale score for the target construct is often represented by the sum of the item scores. However, power functions based on C(α) have been lacking for various study designs. We formulate a statistical model for parallel items to derive power functions as a function of C(α) under several study designs. To this end, we assume a fixed true-score variance, as opposed to the usual fixed total variance assumption. That assumption is critical and practically relevant to show that smaller measurement errors are associated with higher inter-item correlations, and thus that greater C(α) is associated with greater statistical power. We compare the derived theoretical statistical power with empirical power obtained through Monte Carlo simulations for the following comparisons: one-sample comparison of pre- and post-treatment mean differences, two-sample comparison of pre-post mean differences between groups, and two-sample comparison of mean differences between groups. It is shown that C(α) is the same as a test-retest correlation of the scale scores of parallel items, which enables testing the significance of C(α). Closed-form power functions and sample size determination formulas are derived in terms of C(α) for all of the aforementioned comparisons. Power functions are shown to be an increasing function of C(α), regardless of the comparison of interest. The derived power functions are well validated by simulation studies showing that the magnitudes of theoretical power are virtually identical to those of the empirical power. Regardless of research designs or settings, in order to increase statistical power, development and use of instruments with greater C(α), or equivalently with greater inter-item correlations, is crucial for trials that intend to use questionnaire items for measuring research outcomes. Further development of the power functions for binary or ordinal item scores and under more general item correlation structures reflecting more real-world situations would be a valuable future study.
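
    (Small sketch with simulated parallel items: compute Cronbach's alpha from a subjects-by-items score matrix; greater inter-item correlation, i.e., smaller measurement error, yields a larger alpha. The data and error scale are invented.)

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_subjects, k_items) array of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

    rng = np.random.default_rng(0)
    n_subjects, k_items = 200, 5
    true_score = rng.normal(size=(n_subjects, 1))                      # shared construct
    items = true_score + 0.8 * rng.normal(size=(n_subjects, k_items))  # parallel items + error
    print(f"Cronbach alpha = {cronbach_alpha(items):.2f}")
    ```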

  7. Evaluation of oral hygiene products: science is true; don't be misled by the facts.

    PubMed

    Addy, M; Moran, J M

    1997-10-01

    Most people in industrialized countries use oral hygiene products. When an oral health benefit is expected, it is important that sufficient scientific evidence exist to support such claims. Ideally, data should be cumulative, derived from studies in vitro and in vivo. The data should be available to the profession for evaluation by publication in refereed scientific journals. Terms and phrases require clarification, and claims made by implication or derived by inference must be avoided. Similarity in products is not necessarily proof per se of efficacy. Studies in vitro and in vivo should follow the basic principles of scientific research. Studies must be ethical, avoid bias and be suitably controlled. The choice of controls will vary depending on whether an agent or a whole product is evaluated and on the development stage of a formulation. Where appropriate, new products should be compared with products already available and used by the general public. Conformity with the guidelines for good clinical practice appears to be a useful way of validating studies and a valuable guide to the profession. Studies should be designed with sufficient power to detect statistically significant differences if these exist. However, consideration must be given to the clinical significance of statistically significant differences between formulations, since these are not necessarily the same. Studies in vitro provide supportive data, but extrapolation to clinical effect is difficult and even misleading, and such data should not stand alone as proof of efficacy of a product. Short-term studies in vivo provide useful information, particularly at the development stage. Ideally, however, products should be proved effective when used in the circumstances for which they are developed. Nevertheless, a variety of variables influence the outcome of home-use studies, and the influence of these variables cannot usually be calculated. Although rarely considered, the cost-benefit ratio of some oral hygiene products needs to be considered.

  8. Laser beamed power - Satellite demonstration applications

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.; Westerlund, Larry H.

    1992-01-01

    Feasibility of using a ground-based laser to beam light to the solar arrays of orbiting satellites to a level sufficient to provide the operating power required is discussed. An example case of a GEO communications satellite near the end of life due to radiation damage of the solar arrays or battery failure is considered. It is concluded that the commercial satellite industry should be able to reap significant economic benefits through the use of power beaming which is capable of providing supplemental power for satellites with failing arrays, or primary power for failed batteries.

  9. An approach of ionic liquids/lithium salts based microwave irradiation pretreatment followed by ultrasound-microwave synergistic extraction for two coumarins preparation from Cortex fraxini.

    PubMed

    Liu, Zaizhi; Gu, Huiyan; Yang, Lei

    2015-10-23

    An ionic liquids/lithium salts solvent system was successfully introduced into the separation technique for the preparation of two coumarins (aesculin and aesculetin) from Cortex fraxini. An ionic liquids/lithium salts based microwave irradiation pretreatment followed by ultrasound-microwave synergistic extraction (ILSMP-UMSE) procedure was developed and optimized for the sufficient extraction of these two analytes. Several variables that can potentially influence the extraction yields, including pretreatment time and temperature, [C4mim]Br concentration, LiAc content, ultrasound-microwave synergistic extraction (UMSE) time, liquid-solid ratio, and UMSE power, were optimized by Plackett-Burman design. Among the seven variables, UMSE time, liquid-solid ratio, and UMSE power were the statistically significant ones, and these three factors were further optimized by Box-Behnken design to predict optimal extraction conditions and identify operability ranges with maximum extraction yields. Under the optimum operating conditions, ILSMP-UMSE showed higher extraction yields of the two target compounds than those obtained with reference extraction solvents. Method validation studies also showed that ILSMP-UMSE is reliable for the preparation of two coumarins from Cortex fraxini. This study indicates that the proposed procedure has broad application prospects for the preparation of natural products from plant materials. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Statistical power, the Belmont report, and the ethics of clinical trials.

    PubMed

    Vollmer, Sara H; Howard, George

    2010-12-01

    Achieving a good clinical trial design increases the likelihood that a trial will take place as planned, including that data will be obtained from a sufficient number of participants and that the total number of participants will be the minimum required to gain the knowledge sought. A good trial design also increases the likelihood that the knowledge sought by the experiment will be forthcoming. Achieving such a design is more than good sense; it is ethically required in experiments in which participants are at risk of harm. This paper argues that doing a power analysis effectively contributes to ensuring that a trial design is good. The ethical importance of good trial design has long been recognized for trials in which there is risk of serious harm to participants. However, whether the quality of a trial design is an ethical issue when the risk to participants is only minimal is rarely discussed. This paper argues that even in cases when the risk is minimal, the quality of the trial design is an ethical issue, and that this is reflected in the emphasis the Belmont Report places on the importance of the benefit of knowledge gained by society. The paper also argues that good trial design is required for true informed consent.

  11. Unraveling the Mysteries of Turbulence Transport in a Wind Farm

    DOE PAGES

    Jha, Pankaj K.; Duque, Earl P. N.; Bashioum, Jessica L.; ...

    2015-06-26

    A true physical understanding of the mysteries involved in the recovery process of the wake momentum deficit, downstream of utility-scale wind turbines in the atmosphere, has not been obtained to date. Field data are not acquired at sufficient spatial and temporal resolutions to dissect some of the mysteries of wake turbulence. It is here that the actuator line method has evolved to become the technology standard in the wind energy community. This work presents the actuator line method embedded into an Open source Field Operation and Manipulation (OpenFOAM) large-eddy simulation solver and applies it to two small wind farms, the first one consisting of an array of two National Renewable Energy Laboratory 5 Megawatt (NREL 5-MW) turbines separated by seven rotor diameters in neutral and unstable atmospheric boundary-layer flow and the second one consisting of five NREL 5-MW wind turbines in unstable atmospheric conditions arranged in two staggered arrays of two and three turbines, respectively. Detailed statistics involving power spectral density (PSD) of turbine power along with standard deviations reveal the effects of atmospheric turbulence and its space and time scales. In conclusion, high-resolution surface data extracts provide new insight into the complex recovery process of the wake momentum deficit governed by turbulence transport phenomena.
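
    (Minimal sketch on a synthetic signal, not the simulation output: estimating the power spectral density of a turbine power time series with Welch's method, the kind of PSD statistic reported for these runs. Sampling rate, modulation frequency, and amplitudes are invented.)

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 10.0                                   # assumed output rate, samples/s
    t = np.arange(0.0, 600.0, 1.0 / fs)
    rng = np.random.default_rng(0)
    # mean power + slow atmospheric modulation + broadband turbulence "noise" (W)
    turbine_power = (5.0e6 + 2.0e5 * np.sin(2 * np.pi * 0.01 * t)
                     + 1.0e5 * rng.standard_normal(t.size))

    freqs, psd = welch(turbine_power - turbine_power.mean(), fs=fs, nperseg=2048)
    print(f"dominant frequency ~ {freqs[np.argmax(psd)]:.3f} Hz")
    ```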

  12. Experimental Investigation of Very Large Model Wind Turbine Arrays

    NASA Astrophysics Data System (ADS)

    Charmanski, Kyle; Wosnik, Martin

    2013-11-01

    The decrease in energy yield in large wind farms (array losses) and associated revenue losses can be significant. When arrays are sufficiently large they can reach what is known as a fully developed wind turbine array boundary layer, or fully developed wind farm condition. This occurs when the turbulence statistics and the structure of the turbulence, within and above a wind farm, as well as the performance of the turbines remain the same from one row to the next. The study of this condition and how it is affected by parameters such as turbine spacing, power extraction, tip speed ratio, etc. is important for the optimization of large wind farms. An experimental investigation of the fully developed wind farm condition was conducted using a large array of porous disks (upstream) and realistically scaled 3-bladed wind turbines with a diameter of 0.25 m. The turbines and porous disks were placed inside a naturally grown turbulent boundary layer in the 6 m × 2.5 m × 72 m test section of the UNH Flow Physics Facility, which can achieve test section velocities of up to 14 m/s and Reynolds numbers δ+ = δu_τ/ν ≈ 20,000. Power, rate of rotation and rotor thrust were measured for select turbines, and hot-wire anemometry was used for flow measurements.

  13. Unraveling the Mysteries of Turbulence Transport in a Wind Farm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jha, Pankaj K.; Duque, Earl P. N.; Bashioum, Jessica L.

    A true physical understanding of the mysteries involved in the recovery process of the wake momentum deficit, downstream of utility-scale wind turbines in the atmosphere, has not been obtained to date. Field data are not acquired at sufficient spatial and temporal resolutions to dissect some of the mysteries of wake turbulence. It is here that the actuator line method has evolved to become the technology standard in the wind energy community. This work presents the actuator line method embedded into an Open source Field Operation and Manipulation (OpenFOAM) large-eddy simulation solver and applies it to two small wind farms, the first one consisting of an array of two National Renewable Energy Laboratory 5 Megawatt (NREL 5-MW) turbines separated by seven rotor diameters in neutral and unstable atmospheric boundary-layer flow and the second one consisting of five NREL 5-MW wind turbines in unstable atmospheric conditions arranged in two staggered arrays of two and three turbines, respectively. Detailed statistics involving power spectral density (PSD) of turbine power along with standard deviations reveal the effects of atmospheric turbulence and its space and time scales. In conclusion, high-resolution surface data extracts provide new insight into the complex recovery process of the wake momentum deficit governed by turbulence transport phenomena.

  14. The power and robustness of maximum LOD score statistics.

    PubMed

    Yoo, Y J; Mendell, N R

    2008-07-01

    The maximum LOD score statistic is extremely powerful for gene mapping when calculated using the correct genetic parameter value. When the mode of genetic transmission is unknown, the maximum of the LOD scores obtained using several genetic parameter values is reported. This latter statistic requires a higher critical value than the maximum LOD score statistic calculated from a single genetic parameter value. In this paper, we compare the power of maximum LOD scores based on three fixed sets of genetic parameter values with the power of the LOD score obtained after maximizing over the entire range of genetic parameter values. We simulate family data under nine generating models. For generating models with non-zero phenocopy rates, LOD scores maximized over the entire range of genetic parameters yielded greater power than maximum LOD scores for fixed sets of parameter values with zero phenocopy rates. No maximum LOD score was consistently more powerful than the others for generating models with a zero phenocopy rate. The power loss of the LOD score maximized over the entire range of genetic parameters, relative to the maximum LOD score calculated using the correct genetic parameter value, appeared to be robust to the generating models.

  15. Power Enhancement in High Dimensional Cross-Sectional Tests

    PubMed Central

    Fan, Jianqing; Liao, Yuan; Yao, Jiawei

    2016-01-01

    We propose a novel technique to boost the power of testing a high-dimensional vector H0 : θ = 0 against sparse alternatives where the null hypothesis is violated only by a couple of components. Existing tests based on quadratic forms, such as the Wald statistic, often suffer from low power due to the accumulation of errors in estimating high-dimensional parameters. More powerful tests for sparse alternatives, such as thresholding and extreme-value tests, on the other hand, require either stringent conditions or bootstrap to derive the null distribution and often suffer from size distortions due to slow convergence. Based on a screening technique, we introduce a "power enhancement component", which is zero under the null hypothesis with high probability, but diverges quickly under sparse alternatives. The proposed test statistic combines the power enhancement component with an asymptotically pivotal statistic, and strengthens the power under sparse alternatives. The null distribution does not require stringent regularity conditions, and is completely determined by that of the pivotal statistic. As specific applications, the proposed methods are applied to testing the factor pricing models and validating the cross-sectional independence in panel data models. PMID:26778846
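
    (Schematic sketch of the idea, not the paper's exact construction or constants: a screening-based enhancement term that is negligible under the null but diverges under sparse alternatives is added to a standardized quadratic-form statistic.)

    ```python
    import numpy as np

    def power_enhanced_statistic(theta_hat, se, n):
        """theta_hat: estimated p-vector; se: its standard errors; n: sample size."""
        p = theta_hat.size
        z = theta_hat / se
        delta = 2.0 * np.sqrt(np.log(p))            # screening threshold ~ sqrt(log p)
        j0 = np.sqrt(n) * np.sum(np.abs(z) * (np.abs(z) > delta))   # enhancement component
        j1 = (np.sum(z ** 2) - p) / np.sqrt(2.0 * p)                # pivotal quadratic form
        return j0 + j1

    rng = np.random.default_rng(0)
    p, n = 500, 200
    z_null = rng.normal(size=p)
    z_sparse = z_null.copy()
    z_sparse[:3] += 8.0                              # three violated components
    print(power_enhanced_statistic(z_null, np.ones(p), n),
          power_enhanced_statistic(z_sparse, np.ones(p), n))
    ```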

  16. Statistical Power of Alternative Structural Models for Comparative Effectiveness Research: Advantages of Modeling Unreliability.

    PubMed

    Coman, Emil N; Iordache, Eugen; Dierker, Lisa; Fifield, Judith; Schensul, Jean J; Suggs, Suzanne; Barbour, Russell

    2014-05-01

    The advantages of modeling the unreliability of outcomes when evaluating the comparative effectiveness of health interventions are illustrated. Adding an action-research intervention component to a regular summer job program for youth was expected to help in preventing risk behaviors. A series of simple two-group alternative structural equation models are compared to test the effect of the intervention on one key attitudinal outcome in terms of model fit and statistical power, with Monte Carlo simulations. Some models presuming parameters equal across the intervention and comparison groups were underpowered to detect the intervention effect, yet modeling the unreliability of the outcome measure increased their statistical power and helped in the detection of the hypothesized effect. Comparative Effectiveness Research (CER) could benefit from flexible multi-group alternative structural models organized in decision trees, and modeling the unreliability of measures can be of tremendous help for both the fit of statistical models to the data and their statistical power.

  17. Detecting temporal change in freshwater fisheries surveys: statistical power and the important linkages between management questions and monitoring objectives

    USGS Publications Warehouse

    Wagner, Tyler; Irwin, Brian J.; James R. Bence,; Daniel B. Hayes,

    2016-01-01

    Monitoring to detect temporal trends in biological and habitat indices is a critical component of fisheries management. Thus, it is important that management objectives are linked to monitoring objectives. This linkage requires a definition of what constitutes a management-relevant “temporal trend.” It is also important to develop expectations for the amount of time required to detect a trend (i.e., statistical power) and for choosing an appropriate statistical model for analysis. We provide an overview of temporal trends commonly encountered in fisheries management, review published studies that evaluated statistical power of long-term trend detection, and illustrate dynamic linear models in a Bayesian context, as an additional analytical approach focused on shorter term change. We show that monitoring programs generally have low statistical power for detecting linear temporal trends and argue that often management should be focused on different definitions of trends, some of which can be better addressed by alternative analytical approaches.
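    A simple way to build the expectations discussed above is a Monte Carlo power calculation: simulate an index with a specified annual trend and noise level, and count how often a trend test rejects. The sketch below is a hedged illustration rather than the models reviewed in the paper; real monitoring data would also need site and observation-error components. It shows how slowly power accumulates for modest linear trends.

```python
import numpy as np
from scipy import stats

def trend_power(slope, sigma, n_years, n_sims=2000, alpha=0.05, seed=0):
    """Monte Carlo power of a simple regression t-test for a linear trend."""
    rng = np.random.default_rng(seed)
    years = np.arange(n_years)
    hits = 0
    for _ in range(n_sims):
        y = slope * years + rng.normal(0.0, sigma, n_years)
        hits += stats.linregress(years, y).pvalue < alpha
    return hits / n_sims

# A 2%-per-year change against noisy annual indices is hard to detect quickly.
print(trend_power(slope=0.02, sigma=0.25, n_years=10))
print(trend_power(slope=0.02, sigma=0.25, n_years=25))
```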

  18. Statistical Analysis of Large-Scale Structure of Universe

    NASA Astrophysics Data System (ADS)

    Tugay, A. V.

    While galaxy cluster catalogs were compiled many decades ago, other structural elements of the cosmic web have been reliably detected only in the most recent works. For example, extragalactic filaments have been described in recent years using velocity fields and the SDSS galaxy distribution. The large-scale structure of the Universe could also be mapped in the future using ATHENA observations in X-rays and SKA in the radio band. Until detailed observations become available for most of the volume of the Universe, some integral statistical parameters can be used for its description. Methods such as the galaxy correlation function, power spectrum, statistical moments, and peak statistics are commonly used for this purpose. The parameters of the power spectrum and other statistics are important for constraining models of dark matter, dark energy, inflation, and brane cosmology. In the present work we describe the growth of large-scale density fluctuations in the one- and three-dimensional cases with Fourier harmonics of hydrodynamical parameters. As a result, we obtain a power-law relation for the matter power spectrum.
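    The basic Fourier estimator behind such power spectrum statistics can be sketched in a few lines. The example below is an illustrative one-dimensional sketch with arbitrary normalization conventions, not the author's code; it estimates P(k) for a density contrast field sampled on a regular grid.

```python
import numpy as np

def power_spectrum_1d(delta, box_size):
    """Estimate the power spectrum P(k) of a 1D density contrast field."""
    n = delta.size
    delta_k = np.fft.rfft(delta) / n                  # discrete Fourier modes
    k = 2.0 * np.pi * np.fft.rfftfreq(n, d=box_size / n)
    pk = np.abs(delta_k) ** 2 * box_size              # prefactor convention varies
    return k[1:], pk[1:]                              # drop the k = 0 (mean) mode

# Example: white-noise field in a box of side 1000 (arbitrary units), 4096 cells.
rng = np.random.default_rng(0)
delta = rng.normal(size=4096)
k, pk = power_spectrum_1d(delta, box_size=1000.0)
```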

  19. Challenges for future space power systems

    NASA Technical Reports Server (NTRS)

    Brandhorst, Henry W., Jr.

    1989-01-01

    Forecasts of space power needs are presented. The needs fall into three broad categories: survival, self-sufficiency, and industrialization. The cost of delivering payloads to orbital locations and from Low Earth Orbit (LEO) to Mars are determined. Future launch cost reductions are predicted. From these projections the performances necessary for future solar and nuclear space power options are identified. The availability of plentiful cost effective electric power and of low cost access to space are identified as crucial factors in the future extension of human presence in space.

  20. Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions.

    PubMed

    Liu, Hongcheng; Yao, Tao; Li, Runze; Ye, Yinyu

    2017-11-01

    This paper concerns the folded concave penalized sparse linear regression (FCPSLR), a class of popular sparse recovery methods. Although FCPSLR yields desirable recovery performance when solved globally, computing a global solution is NP-complete. Despite some existing statistical performance analyses on local minimizers or on specific FCPSLR-based learning algorithms, it remains an open question whether local solutions that are known to admit fully polynomial-time approximation schemes (FPTAS) may already be sufficient to ensure the statistical performance, and whether that statistical performance can be non-contingent on the specific designs of computing procedures. To address these questions, this paper presents the following threefold results: (i) any local solution (stationary point) is a sparse estimator, under some conditions on the parameters of the folded concave penalties. (ii) Perhaps more importantly, any local solution satisfying a significant subspace second-order necessary condition (S³ONC), which is weaker than the second-order KKT condition, yields a bounded error in approximating the true parameter with high probability. In addition, if the minimal signal strength is sufficient, the S³ONC solution likely recovers the oracle solution. This result also explicates that the goal of improving the statistical performance is consistent with the optimization criteria of minimizing the suboptimality gap in solving the non-convex programming formulation of FCPSLR. (iii) We apply (ii) to the special case of FCPSLR with the minimax concave penalty (MCP) and show that under the restricted eigenvalue condition, any S³ONC solution with a better objective value than the Lasso solution entails the strong oracle property. In addition, such a solution generates a model error (ME) comparable to the optimal but exponential-time sparse estimator given a sufficient sample size, while the worst-case ME is comparable to the Lasso in general. Furthermore, computing a solution guaranteed to satisfy the S³ONC admits an FPTAS.
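    For concreteness, the minimax concave penalty (MCP) mentioned in point (iii) and its coordinate-wise thresholding (proximal) operator can be sketched as follows; any stationary point reached by a coordinate descent solver built on this operator is a "local solution" in the sense discussed above. The formulas are the standard MCP definitions (for a standardized design and gamma > 1), not code from the paper.

```python
import numpy as np

def mcp_penalty(t, lam, gamma):
    """Minimax concave penalty, one member of the folded concave family."""
    a = np.abs(t)
    return np.where(a <= gamma * lam,
                    lam * a - a**2 / (2.0 * gamma),
                    0.5 * gamma * lam**2)

def mcp_prox(z, lam, gamma):
    """Coordinate-wise thresholding operator for MCP (standardized design,
    gamma > 1), used inside coordinate descent solvers."""
    soft = np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)
    return np.where(np.abs(z) <= gamma * lam, soft / (1.0 - 1.0 / gamma), z)
```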

  1. Output power fluctuations due to different weights of macro particles used in particle-in-cell simulations of Cerenkov devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bao, Rong; Li, Yongdong; Liu, Chunliang

    2016-07-15

    The output power fluctuations caused by the weights of macro particles used in particle-in-cell (PIC) simulations of a backward wave oscillator and a travelling wave tube are statistically analyzed. It is found that the velocities of electrons that have passed a specific slow-wave structure form a specific electron velocity distribution. The electron velocity distribution obtained in a PIC simulation with a relatively small weight of macro particles is considered as an initial distribution. By analyzing this initial distribution with a statistical method, estimations of the output power fluctuations caused by different weights of macro particles are obtained. The statistical method is verified by comparing the estimations with the simulation results. The fluctuations become stronger with increasing weight of macro particles, which can also be determined reversely from estimations of the output power fluctuations. With the weights of macro particles optimized by the statistical method, the output power fluctuations in PIC simulations are relatively small and acceptable.

  2. Statistical power of intervention analyses: simulation and empirical application to treated lumber prices

    Treesearch

    Jeffrey P. Prestemon

    2009-01-01

    Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...

  3. Analysis and model on space-time characteristics of wind power output based on the measured wind speed data

    NASA Astrophysics Data System (ADS)

    Shi, Wenhui; Feng, Changyou; Qu, Jixian; Zha, Hao; Ke, Dan

    2018-02-01

    Most existing studies on wind power output focus on the fluctuation of wind farms, while the spatial self-complementarity of wind power output time series has been ignored. Therefore, the existing probability models cannot reflect the features of power systems incorporating wind farms. This paper analyzed the spatial self-complementarity of wind power and proposed a probability model which can reflect temporal characteristics of wind power on seasonal and diurnal timescales, based on sufficient measured data and an improved clustering method. This model could provide an important reference for power system simulation incorporating wind farms.

  4. An entropy-based statistic for genomewide association studies.

    PubMed

    Zhao, Jinying; Boerwinkle, Eric; Xiong, Momiao

    2005-07-01

    Efficient genotyping methods and the availability of a large collection of single-nucleotide polymorphisms provide valuable tools for genetic studies of human disease. The standard χ² statistic for case-control studies, which uses a linear function of allele frequencies, has limited power when the number of marker loci is large. We introduce a novel test statistic for genetic association studies that uses Shannon entropy and a nonlinear function of allele frequencies to amplify the differences in allele and haplotype frequencies to maintain statistical power with large numbers of marker loci. We investigate the relationship between the entropy-based test statistic and the standard χ² statistic and show that, in most cases, the power of the entropy-based statistic is greater than that of the standard χ² statistic. The distribution of the entropy-based statistic and the type I error rates are validated using simulation studies. Finally, we apply the new entropy-based test statistic to two real data sets, one for the COMT gene and schizophrenia and one for the MMP-2 gene and esophageal carcinoma, to evaluate the performance of the new method for genetic association studies. The results show that the entropy-based statistic obtained smaller P values than did the standard χ² statistic.
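    The contrast between a linear and a nonlinear (entropy-based) function of allele frequencies can be illustrated with a short sketch. The entropy contrast below is only a schematic stand-in for the statistic derived in the paper, shown next to the standard chi-square test on the same 2 x k allele-count table.

```python
import numpy as np
from scipy.stats import chi2_contingency

def shannon_entropy(p):
    p = np.asarray(p, float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def entropy_contrast(case_counts, control_counts):
    """Schematic nonlinear contrast of allele frequencies between cases and
    controls; illustrative only, not the paper's exact statistic."""
    p_case = np.asarray(case_counts, float) / np.sum(case_counts)
    p_ctrl = np.asarray(control_counts, float) / np.sum(control_counts)
    return abs(shannon_entropy(p_case) - shannon_entropy(p_ctrl))

# Toy allele counts for cases and controls at one multi-allelic locus.
case, ctrl = [120, 60, 20], [100, 80, 20]
chi2_stat, p_value, _, _ = chi2_contingency([case, ctrl])
print(chi2_stat, p_value, entropy_contrast(case, ctrl))
```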

  5. The risk of paradoxical embolism (RoPE) study: initial description of the completed database.

    PubMed

    Thaler, David E; Di Angelantonio, Emanuele; Di Tullio, Marco R; Donovan, Jennifer S; Griffith, John; Homma, Shunichi; Jaigobin, Cheryl; Mas, Jean-Louis; Mattle, Heinrich P; Michel, Patrik; Mono, Marie-Luise; Nedeltchev, Krassen; Papetti, Federica; Ruthazer, Robin; Serena, Joaquín; Weimar, Christian; Elkind, Mitchell S V; Kent, David M

    2013-12-01

    Detecting a benefit from closure of patent foramen ovale in patients with cryptogenic stroke is hampered by low rates of stroke recurrence and uncertainty about the causal role of patent foramen ovale in the index event. A method to predict patent foramen ovale-attributable recurrence risk is needed. However, individual databases generally have too few stroke recurrences to support risk modeling. Prior studies of this population have been limited by low statistical power for examining factors related to recurrence. The aim of this study was to develop a database to support modeling of patent foramen ovale-attributable recurrence risk by combining extant data sets. We identified investigators with extant databases including subjects with cryptogenic stroke investigated for patent foramen ovale, determined the availability and characteristics of data in each database, collaboratively specified the variables to be included in the Risk of Paradoxical Embolism database, harmonized the variables across databases, and collected new primary data when necessary and feasible. The Risk of Paradoxical Embolism database has individual clinical, radiologic, and echocardiographic data from 12 component databases, including subjects with cryptogenic stroke both with (n = 1925) and without (n = 1749) patent foramen ovale. In the patent foramen ovale subjects, a total of 381 outcomes (stroke, transient ischemic attack, death) occurred (median follow-up 2·2 years). While there were substantial variations in data collection between studies, there was sufficient overlap to define a common set of variables suitable for risk modeling. While individual studies are inadequate for modeling patent foramen ovale-attributable recurrence risk, collaboration between investigators has yielded a database with sufficient power to identify those patients at highest risk for a patent foramen ovale-related stroke recurrence who may have the greatest potential benefit from patent foramen ovale closure. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.

  6. Quantum Monte Carlo: Faster, More Reliable, And More Accurate

    NASA Astrophysics Data System (ADS)

    Anderson, Amos Gerald

    2010-06-01

    The Schrodinger Equation has been available for about 83 years, but today we still strain to apply it accurately to molecules of interest. The difficulty is not theoretical in nature, but practical, since we are held back by a lack of sufficient computing power. Consequently, effort is applied to find acceptable approximations to facilitate real-time solutions. In the meantime, computer technology has begun rapidly advancing and changing the way we think about efficient algorithms. For those who can reorganize their formulas to take advantage of these changes and thereby lift some approximations, incredible new opportunities await. Over the last decade, we have seen the emergence of a new kind of computer processor, the graphics card. Designed to accelerate computer games by optimizing for processor quantity rather than quality, graphics cards have become of sufficient quality to be useful to some scientists. In this thesis, we explore the first known application of a graphics card to computational chemistry by rewriting our Quantum Monte Carlo software into the requisite "data parallel" formalism. We find that, notwithstanding precision considerations, we are able to speed up our software by about a factor of 6. The success of a Quantum Monte Carlo calculation depends on more than just processing power. It also requires the scientist to carefully design the trial wavefunction used to guide simulated electrons. We have studied the use of Generalized Valence Bond wavefunctions to simply, yet effectively, capture the essential static correlation in atoms and molecules. Furthermore, we have developed significantly improved two-particle correlation functions, designed with both flexibility and simplicity in mind, representing an effective and reliable way to add the necessary dynamic correlation. Lastly, we present our method for stabilizing the statistical nature of the calculation by manipulating configuration weights, thus facilitating efficient and robust calculations. Our combination of Generalized Valence Bond wavefunctions, improved correlation functions, and stabilized weighting techniques for calculations run on graphics cards represents a new way of using Quantum Monte Carlo to study arbitrarily sized molecules.

  7. The development of the Pictorial Thai Quality of Life.

    PubMed

    Phattharayuttawat, Sucheera; Ngamthipwatthana, Thienchai; Pitiyawaranun, Buncha

    2005-11-01

    "Quality of life" has become a main focus of interest in medicine. The Pictorial Thai Quality of Life (PTQL) was developed in order to measure the Thai mental illness both in a clinical setting and community. The purpose of this study was to develop the Pictorial Thai Quality of Life (PTQL), having adequate and sufficient construct validity, discriminant power, concurrent validity, and reliability. To develop the Pictorial Thai Quality of Life Test, two samples groups were used in the present study: (1) pilot study samples: 30 samples and (2) survey samples were 672 samples consisting of normal, and psychiatric patients. The developing tests items were collected from a review of the literature in which all the items were based on the WHO definition of Quality of Life. Then, experts judgment by the Delphi technique was used in the first stage. After that a pilot study was used to evaluate the testing administration, and wording of the tests items. The final stage was collected data from the survey samples. The results of the present study showed that the final test was composed 25 items. The construct validity of this test consists of six domains: Physical, Cognitive, Affective, Social Function, Economic and Self-Esteem. All the PTQL items have sufficient discriminant power It was found to be statistically significant different at the. 001 level between those people with mental disorders and normal people. There was a high level of concurrent validity association with WHOQOL-BREF, Pearson correlation coefficient and Area under ROC curve were 0.92 and 0.97 respectively. The reliability coefficients for the Alpha coefficients of the PTQL total test was 0.88. The values of the six scales were from 0.81 to 0:91. The present study was directed at developing an effective psychometric properties pictorial quality of life questionnaire. The result will be a more direct and meaningful application of an instrument to detect the mental health illness poor quality of life in Thai communities.

  8. 46 CFR 38.20-10 - Ventilation-T/ALL.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... equipped with power ventilation of the exhaust type having capacity sufficient to effect a complete change of air in not more than 3 minutes equal to the volume of the compartment and associated trunks. (b) The power ventilation units shall not produce a source of vapor ignition in either the compartment or...

  9. 78 FR 42323 - Pilot Certification and Qualification Requirements for Air Carrier Operations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-15

    ... sufficient. \\4\\ In addition, military PIC time (up to 500 hours) in a multiengine turbine-powered, fixed-wing... aerodynamic stall (insufficient airflow over the wings). The flightcrew's response to the stall warning system.... Military PIC time in a multiengine turbine-powered, fixed-wing airplane in an operation requiring more than...

  10. 10 CFR 431.325 - Units to be tested.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... EQUIPMENT Metal Halide Lamp Ballasts and Fixtures Test Procedures § 431.325 Units to be tested. For each basic model of metal halide lamp ballast selected for testing, a sample of sufficient size, no less than... energy efficiency calculated as the measured output power to the lamp divided by the measured input power...

  11. Power in urban social-ecological systems: Processes and practices of governance and marginalization

    Treesearch

    Lindsay K. Campbell; Nate Gabriel

    2016-01-01

    Historically, the urban forestry literature, including the workfeatured in Urban Forestry and Urban Greening, has focused primarily on either quantitative, positivistic analyses of human-environment dynamics, or applied research to inform the management of natural resources, without sufficiently problematizing the effects of power within these processes (Bentsen et al...

  12. Comprehensive Genome-wide Screen for Genes with Cis-acting Regulatory Elements That Respond to Marek's Disease Virus Infection

    USDA-ARS?s Scientific Manuscript database

    The comprehensive identification of genes underlying phenotypic variation of complex traits such as disease resistance remains one of the greatest challenges in biology despite having genome sequences and more powerful tools. Most genome-wide screens lack sufficient resolving power as they typically...

  13. A Monte Carlo Simulation Comparing the Statistical Precision of Two High-Stakes Teacher Evaluation Methods: A Value-Added Model and a Composite Measure

    ERIC Educational Resources Information Center

    Spencer, Bryden

    2016-01-01

    Value-added models are a class of growth models used in education to assign responsibility for student growth to teachers or schools. For value-added models to be used fairly, sufficient statistical precision is necessary for accurate teacher classification. Previous research indicated precision below practical limits. An alternative approach has…

  14. The Ironic Effect of Significant Results on the Credibility of Multiple-Study Articles

    ERIC Educational Resources Information Center

    Schimmack, Ulrich

    2012-01-01

    Cohen (1962) pointed out the importance of statistical power for psychology as a science, but statistical power of studies has not increased, while the number of studies in a single article has increased. It has been overlooked that multiple studies with modest power have a high probability of producing nonsignificant results because power…

  15. The Statistical Power of the Cluster Randomized Block Design with Matched Pairs--A Simulation Study

    ERIC Educational Resources Information Center

    Dong, Nianbo; Lipsey, Mark

    2010-01-01

    This study uses simulation techniques to examine the statistical power of the group- randomized design and the matched-pair (MP) randomized block design under various parameter combinations. Both nearest neighbor matching and random matching are used for the MP design. The power of each design for any parameter combination was calculated from…

  16. Asking Sensitive Questions: A Statistical Power Analysis of Randomized Response Models

    ERIC Educational Resources Information Center

    Ulrich, Rolf; Schroter, Hannes; Striegel, Heiko; Simon, Perikles

    2012-01-01

    This article derives the power curves for a Wald test that can be applied to randomized response models when small prevalence rates must be assessed (e.g., detecting doping behavior among elite athletes). These curves enable the assessment of the statistical power that is associated with each model (e.g., Warner's model, crosswise model, unrelated…

  17. Avalanches and diffusion in bubble rafts

    NASA Astrophysics Data System (ADS)

    Maloney, C. E.

    2015-07-01

    Energy dissipation distributions and particle displacement statistics are studied in the mean-field version of Durian's bubble model. A two-dimensional (2D) bi-disperse mixture is simulated at various strain rates, γ̇, and packing ratios, φ, above the rigidity onset at φ = φ_c. Well above φ_c, and at sufficiently low γ̇, the system responds in a highly bursty way, reminiscent of other dynamically critical systems with a power-law distribution of energy dissipation. As one increases γ̇ at fixed φ or tunes φ → φ_c at fixed γ̇, the bursty behavior vanishes. Displacement distributions are non-Fickian at short times but cross over to a Fickian regime at a universal strain, Δγ*, independent of γ̇ and φ. Despite the profound differences in short-time dynamics, at intermediate Δγ the systems exhibit qualitatively similar spatial patterns of deformation with lines of slip extending across large fractions of the simulation cell. These deformation patterns explain the observed diffusion constants and the universal crossover time to Fickian behavior.

  18. Indoor radon and childhood leukaemia.

    PubMed

    Raaschou-Nielsen, Ole

    2008-01-01

    This paper summarises the epidemiological literature on domestic exposure to radon and risk for childhood leukaemia. The results of 12 ecological studies show a consistent pattern of higher incidence and mortality rates for childhood leukaemia in areas with higher average indoor radon concentrations. Although the results of such studies are useful to generate hypotheses, they must be interpreted with caution, as the data were aggregated and analysed for geographical areas and not for individuals. The seven available case-control studies of childhood leukaemia with measurement of radon concentrations in the residences of cases and controls gave mixed results, however, with some indication of a weak (relative risk < 2) association with acute lymphoblastic leukaemia. The epidemiological evidence to date suggests that an association between indoor exposure to radon and childhood leukaemia might exist, but is weak. More case-control studies are needed, with sufficient statistical power to detect weak associations and based on designs and methods that minimise misclassification of exposure and provide a high participation rate and low potential selection bias.

  19. Vascular surgical data registries for small computers.

    PubMed

    Kaufman, J L; Rosenberg, N

    1984-08-01

    Recent designs for computer-based vascular surgical registries and clinical databases have employed large centralized systems with formal programming and mass storage. Small computers, of the types created for office use or for word processing, now contain sufficient speed and memory storage capacity to allow construction of decentralized office-based registries. Using a standardized dictionary of terms and a method of data organization adapted to word processing, we have created a new vascular surgery data registry, "VASREG." Data files are organized without programming, and a limited number of powerful logical statements in English are used for sorting. The capacity is 25,000 records with current inexpensive memory technology. VASREG is adaptable to computers made by a variety of manufacturers, and interface programs are available for conversion of the word-processor-formatted registry data into forms suitable for analysis by programs written in a standard programming language. This is a low-cost clinical data registry available to any physician. With a standardized dictionary, preparation of regional and national statistical summaries may be facilitated.

  20. Sampling in Developmental Science: Situations, Shortcomings, Solutions, and Standards.

    PubMed

    Bornstein, Marc H; Jager, Justin; Putnick, Diane L

    2013-12-01

    Sampling is a key feature of every study in developmental science. Although sampling has far-reaching implications, too little attention is paid to sampling. Here, we describe, discuss, and evaluate four prominent sampling strategies in developmental science: population-based probability sampling, convenience sampling, quota sampling, and homogeneous sampling. We then judge these sampling strategies by five criteria: whether they yield representative and generalizable estimates of a study's target population, whether they yield representative and generalizable estimates of subsamples within a study's target population, the recruitment efforts and costs they entail, whether they yield sufficient power to detect subsample differences, and whether they introduce "noise" related to variation in subsamples and whether that "noise" can be accounted for statistically. We use sample composition of gender, ethnicity, and socioeconomic status to illustrate and assess the four sampling strategies. Finally, we tally the use of the four sampling strategies in five prominent developmental science journals and make recommendations about best practices for sample selection and reporting.

  1. Sampling in Developmental Science: Situations, Shortcomings, Solutions, and Standards

    PubMed Central

    Bornstein, Marc H.; Jager, Justin; Putnick, Diane L.

    2014-01-01

    Sampling is a key feature of every study in developmental science. Although sampling has far-reaching implications, too little attention is paid to sampling. Here, we describe, discuss, and evaluate four prominent sampling strategies in developmental science: population-based probability sampling, convenience sampling, quota sampling, and homogeneous sampling. We then judge these sampling strategies by five criteria: whether they yield representative and generalizable estimates of a study’s target population, whether they yield representative and generalizable estimates of subsamples within a study’s target population, the recruitment efforts and costs they entail, whether they yield sufficient power to detect subsample differences, and whether they introduce “noise” related to variation in subsamples and whether that “noise” can be accounted for statistically. We use sample composition of gender, ethnicity, and socioeconomic status to illustrate and assess the four sampling strategies. Finally, we tally the use of the four sampling strategies in five prominent developmental science journals and make recommendations about best practices for sample selection and reporting. PMID:25580049

  2. Efficient model for low-energy transverse beam dynamics in a nine-cell 1.3 GHz cavity

    NASA Astrophysics Data System (ADS)

    Hellert, Thorsten; Dohlus, Martin; Decking, Winfried

    2017-10-01

    FLASH and the European XFEL are SASE-FEL user facilities, at which superconducting TESLA cavities are operated in a pulsed mode to accelerate long bunch-trains. Several cavities are powered by one klystron. While the low-level rf system is able to stabilize the vector sum of the accelerating gradient of one rf station sufficiently, the rf parameters of individual cavities vary within the bunch-train. In correlation with misalignments, intra-bunch-train trajectory variations are induced. An efficient model is developed to describe the effect at low beam energy, using numerically adjusted transfer matrices and discrete coupler kick coefficients. Comparison with start-to-end tracking and dedicated experiments at the FLASH injector will be shown. The short computation time of the derived model allows for comprehensive numerical studies on the impact of misalignments and variable rf parameters on the transverse intra-bunch-train beam stability at the injector module. Results from both statistical multibunch performance studies and the deduction of misalignments from multibunch experiments are presented.

  3. Flexible, fast and accurate sequence alignment profiling on GPGPU with PaSWAS.

    PubMed

    Warris, Sven; Yalcin, Feyruz; Jackson, Katherine J L; Nap, Jan Peter

    2015-01-01

    To obtain large-scale sequence alignments in a fast and flexible way is an important step in the analyses of next generation sequencing data. Applications based on the Smith-Waterman (SW) algorithm are often either not fast enough, limited to dedicated tasks or not sufficiently accurate due to statistical issues. Current SW implementations that run on graphics hardware do not report the alignment details necessary for further analysis. With the Parallel SW Alignment Software (PaSWAS) it is possible (a) to have easy access to the computational power of NVIDIA-based general purpose graphics processing units (GPGPUs) to perform high-speed sequence alignments, and (b) retrieve relevant information such as score, number of gaps and mismatches. The software reports multiple hits per alignment. The added value of the new SW implementation is demonstrated with two test cases: (1) tag recovery in next generation sequence data and (2) isotype assignment within an immunoglobulin 454 sequence data set. Both cases show the usability and versatility of the new parallel Smith-Waterman implementation.
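    For readers unfamiliar with the underlying algorithm, a minimal CPU Smith-Waterman sketch is shown below; it reports the local alignment score together with gap and mismatch counts, i.e. the kind of alignment detail the GPGPU implementation makes available at scale. This is an illustrative reimplementation of the classic algorithm, not PaSWAS code.

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Minimal Smith-Waterman local alignment: best score, gaps, mismatches."""
    n, m = len(a), len(b)
    H = [[0] * (m + 1) for _ in range(n + 1)]
    best, best_pos = 0, (0, 0)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            if H[i][j] > best:
                best, best_pos = H[i][j], (i, j)
    i, j = best_pos
    gaps = mismatches = 0
    while i > 0 and j > 0 and H[i][j] > 0:      # traceback of the best hit
        if H[i][j] == H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch):
            mismatches += a[i-1] != b[j-1]
            i, j = i - 1, j - 1
        elif H[i][j] == H[i-1][j] + gap:
            gaps, i = gaps + 1, i - 1
        else:
            gaps, j = gaps + 1, j - 1
    return best, gaps, mismatches

print(smith_waterman("ACACACTA", "AGCACACA"))
```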

  4. Robust short-term memory without synaptic learning.

    PubMed

    Johnson, Samuel; Marro, J; Torres, Joaquín J

    2013-01-01

    Short-term memory in the brain cannot in general be explained the way long-term memory can--as a gradual modification of synaptic weights--since it takes place too quickly. Theories based on some form of cellular bistability, however, do not seem able to account for the fact that noisy neurons can collectively store information in a robust manner. We show how a sufficiently clustered network of simple model neurons can be instantly induced into metastable states capable of retaining information for a short time (a few seconds). The mechanism is robust to different network topologies and kinds of neural model. This could constitute a viable means available to the brain for sensory and/or short-term memory with no need of synaptic learning. Relevant phenomena described by neurobiology and psychology, such as local synchronization of synaptic inputs and power-law statistics of forgetting avalanches, emerge naturally from this mechanism, and we suggest possible experiments to test its viability in more biological settings.

  5. Determinants of Whether or not Mixtures of Disinfection By-products are Similar

    EPA Science Inventory

    This project summary and its related publications provide information on the development of chemical, toxicological and statistical criteria for determining the sufficient similarity of complex chemical mixtures.

  6. The Angular Correlation Function of Galaxies from Early Sloan Digital Sky Survey Data

    NASA Astrophysics Data System (ADS)

    Connolly, Andrew J.; Scranton, Ryan; Johnston, David; Dodelson, Scott; Eisenstein, Daniel J.; Frieman, Joshua A.; Gunn, James E.; Hui, Lam; Jain, Bhuvnesh; Kent, Stephen; Loveday, Jon; Nichol, Robert C.; O'Connell, Liam; Postman, Marc; Scoccimarro, Roman; Sheth, Ravi K.; Stebbins, Albert; Strauss, Michael A.; Szalay, Alexander S.; Szapudi, István; Tegmark, Max; Vogeley, Michael S.; Zehavi, Idit; Annis, James; Bahcall, Neta; Brinkmann, J.; Csabai, István; Doi, Mamoru; Fukugita, Masataka; Hennessy, G. S.; Hindsley, Robert; Ichikawa, Takashi; Ivezić, Željko; Kim, Rita S. J.; Knapp, Gillian R.; Kunszt, Peter; Lamb, D. Q.; Lee, Brian C.; Lupton, Robert H.; McKay, Timothy A.; Munn, Jeff; Peoples, John; Pier, Jeff; Rockosi, Constance; Schlegel, David; Stoughton, Christopher; Tucker, Douglas L.; Yanny, Brian; York, Donald G.

    2002-11-01

    The Sloan Digital Sky Survey is one of the first multicolor photometric and spectroscopic surveys designed to measure the statistical properties of galaxies within the local universe. In this paper we present some of the initial results on the angular two-point correlation function measured from the early SDSS galaxy data. The form of the correlation function, over the magnitude interval 18
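    Angular two-point correlation functions of this kind are typically estimated from pair counts in data and random catalogs. The sketch below shows the widely used Landy-Szalay estimator as an illustration of the measurement; the actual SDSS analysis additionally handles survey masks, weights, and covariance estimation, and may use other estimators as well.

```python
import numpy as np

def landy_szalay(dd, dr, rr, n_data, n_random):
    """Landy-Szalay estimator w(theta) = (DD - 2DR + RR) / RR from raw pair
    counts per angular bin, normalized by the numbers of data and random points."""
    dd = np.asarray(dd, float) / (n_data * (n_data - 1) / 2.0)
    dr = np.asarray(dr, float) / (n_data * n_random)
    rr = np.asarray(rr, float) / (n_random * (n_random - 1) / 2.0)
    return (dd - 2.0 * dr + rr) / rr
```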

  7. On the Spike Train Variability Characterized by Variance-to-Mean Power Relationship.

    PubMed

    Koyama, Shinsuke

    2015-07-01

    We propose a statistical method for modeling the non-Poisson variability of spike trains observed in a wide range of brain regions. Central to our approach is the assumption that the variance and the mean of interspike intervals are related by a power function characterized by two parameters: the scale factor and exponent. It is shown that this single assumption allows the variability of spike trains to have an arbitrary scale and various dependencies on the firing rate in the spike count statistics, as well as in the interval statistics, depending on the two parameters of the power function. We also propose a statistical model for spike trains that exhibits the variance-to-mean power relationship. Based on this, a maximum likelihood method is developed for inferring the parameters from rate-modulated spike trains. The proposed method is illustrated on simulated and experimental spike trains.
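    The assumed relationship Var(ISI) = scale * Mean(ISI)^exponent can be estimated crudely by regression in log-log space across firing-rate conditions, as in the hedged sketch below (the paper itself develops a maximum likelihood method for rate-modulated spike trains).

```python
import numpy as np

def fit_variance_mean_power(isi_groups):
    """Fit Var = scale * Mean**exponent across groups of interspike intervals
    by least squares in log-log space (illustrative, not the paper's MLE)."""
    means = np.array([np.mean(g) for g in isi_groups])
    variances = np.array([np.var(g, ddof=1) for g in isi_groups])
    exponent, log_scale = np.polyfit(np.log(means), np.log(variances), 1)
    return np.exp(log_scale), exponent

# Example: gamma-distributed ISIs at four different firing rates.
rng = np.random.default_rng(1)
groups = [rng.gamma(shape=2.0, scale=s, size=500) for s in (0.05, 0.1, 0.2, 0.4)]
print(fit_variance_mean_power(groups))
```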

  8. Joint probability of statistical success of multiple phase III trials.

    PubMed

    Zhang, Jianliang; Zhang, Jenny J

    2013-01-01

    In drug development, after completion of phase II proof-of-concept trials, the sponsor needs to make a go/no-go decision to start expensive phase III trials. The probability of statistical success (PoSS) of the phase III trials based on data from earlier studies is an important factor in that decision-making process. Instead of statistical power, the predictive power of a phase III trial, which takes into account the uncertainty in the estimation of treatment effect from earlier studies, has been proposed to evaluate the PoSS of a single trial. However, regulatory authorities generally require statistical significance in two (or more) trials for marketing licensure. We show that the predictive statistics of two future trials are statistically correlated through use of the common observed data from earlier studies. Thus, the joint predictive power should not be evaluated as a simplistic product of the predictive powers of the individual trials. We develop the relevant formulae for the appropriate evaluation of the joint predictive power and provide numerical examples. Our methodology is further extended to the more complex phase III development scenario comprising more than two (K > 2) trials, that is, the evaluation of the PoSS of at least k₀ (k₀≤ K) trials from a program of K total trials. Copyright © 2013 John Wiley & Sons, Ltd.
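    The effect of the shared phase II estimate can be illustrated with a small Monte Carlo sketch: draw the true effect from its phase II posterior (a flat prior is assumed), compute each phase III trial's conditional probability of significance, and average. Because both trials share the same uncertain effect, the joint probability exceeds the naive product of the two marginal predictive powers. This is a hedged illustration; the paper derives the corresponding formulae analytically.

```python
import numpy as np
from scipy import stats

def joint_predictive_power(delta_hat, se_phase2, se_phase3,
                           alpha=0.025, n_draws=100_000, seed=0):
    """Monte Carlo sketch of the joint PoSS of two identical phase III trials."""
    rng = np.random.default_rng(seed)
    z_crit = stats.norm.ppf(1 - alpha)
    delta = rng.normal(delta_hat, se_phase2, n_draws)   # uncertainty from phase II
    p_sig = stats.norm.cdf(delta / se_phase3 - z_crit)  # power given the true effect
    joint = np.mean(p_sig ** 2)        # trials independent given the true effect
    naive = np.mean(p_sig) ** 2        # ignores the shared phase II uncertainty
    return joint, naive

print(joint_predictive_power(delta_hat=0.4, se_phase2=0.15, se_phase3=0.1))
```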

  9. Physics Leads to Free Elections in the Nuclear Age.

    NASA Astrophysics Data System (ADS)

    Synek, Miroslav

    2001-10-01

    Complex historical development on our planet, utilizing the knowledge of physics, has reached a powerful technology of nuclear intercontinental missiles, conceivably controllable through a computerized "push-button". Whenever this technology falls under the control of an irresponsible, miscalculating, or insane dictator, with sufficiently powerful means of a huge, mass-produced nuclear-missile build-up, anywhere on our planet, the very survival of all humanity on our planet could be threatened. Therefore, it is a historical urgency that this technology be under the control of a government of the people, by the people and for the people, based on a sufficiently reliable system of free elections, in any country on our planet, wherever and whenever a total nuclear holocaust could originate.

  10. An Example of an Improvable Rao-Blackwell Improvement, Inefficient Maximum Likelihood Estimator, and Unbiased Generalized Bayes Estimator.

    PubMed

    Galili, Tal; Meilijson, Isaac

    2016-01-02

    The Rao-Blackwell theorem offers a procedure for converting a crude unbiased estimator of a parameter θ into a "better" one, in fact unique and optimal if the improvement is based on a minimal sufficient statistic that is complete. In contrast, behind every minimal sufficient statistic that is not complete, there is an improvable Rao-Blackwell improvement. This is illustrated via a simple example based on the uniform distribution, in which a rather natural Rao-Blackwell improvement is uniformly improvable. Furthermore, in this example the maximum likelihood estimator is inefficient, and an unbiased generalized Bayes estimator performs exceptionally well. Counterexamples of this sort can be useful didactic tools for explaining the true nature of a methodology and possible consequences when some of the assumptions are violated. [Received December 2014. Revised September 2015.].
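    The basic Rao-Blackwell mechanism is easy to demonstrate numerically. The sketch below uses the simpler textbook case of U(0, theta), where the minimal sufficient statistic (the sample maximum) is complete, so conditioning the crude estimator 2*X_1 on it gives the usual optimal unbiased estimator; the article's counterexample concerns a uniform family whose minimal sufficient statistic is not complete.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 5.0, 10, 20_000

crude, rb = [], []
for _ in range(reps):
    x = rng.uniform(0.0, theta, n)
    crude.append(2.0 * x[0])                 # crude unbiased estimator 2*X_1
    m = x.max()                              # minimal sufficient (and complete) statistic
    rb.append((n + 1) / n * m)               # E[2*X_1 | max] = (n+1)/n * max

print(np.mean(crude), np.var(crude))         # unbiased but very noisy
print(np.mean(rb), np.var(rb))               # same mean, far smaller variance
```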

  11. Volcanic ash and daily mortality in Sweden after the Icelandic volcano eruption of May 2011.

    PubMed

    Oudin, Anna; Carlsen, Hanne K; Forsberg, Bertil; Johansson, Christer

    2013-12-10

    In the aftermath of the Icelandic volcano Grimsvötn's eruption on 21 May 2011, volcanic ash reached Northern Europe. Elevated levels of ambient particles (PM) were registered in mid Sweden. The aim of the present study was to investigate if the Grimsvötn eruption had an effect on mortality in Sweden. Based on PM measurements at 16 sites across Sweden, data were classified into an ash exposed data set (Ash area) and an unexposed data set (No ash area). Data on daily all-cause mortality were obtained from Statistics Sweden for the time period 1 April through 31 July 2011. Mortality ratios were calculated as the ratio between the daily number of deaths in the Ash area and the No ash area. The exposure period was defined as the week following the days with elevated particle concentrations, namely 24 May through 31 May. The control period was defined as 1 April through 23 May and 1 June through 31 July. There was no absolute increase in mortality during the exposure period. However, during the exposure period the mean mortality ratio was 2.42 compared with 2.17 during the control period, implying a relatively higher number of deaths in the Ash area than in the No ash area. The differences in ratios were mostly due to a single day, 31 May, and were not statistically significant when tested with a Mann-Whitney non-parametric test (p > 0.3). The statistical power was low with only 8 days in the exposure period (24 May through 31 May). Assuming that the observed relative differences were not due to chance, the results would imply an increase of 128 deaths during the exposure period 24-31 May. If 31 May was excluded, the number of extra deaths was reduced to 20. The results of the present study are contradictory and inconclusive, but may indicate that all-cause mortality was increased by the ash-fall from the Grimsvötn eruption. Meta-analysis or pooled analysis of data from neighboring countries might make it possible to reach sufficient statistical power to study effects of the Grimsvötn ash on morbidity and mortality. Such studies would be of particular importance for European societies preparing for future large scale volcanic eruptions in Iceland.

  12. Volcanic Ash and Daily Mortality in Sweden after the Icelandic Volcano Eruption of May 2011

    PubMed Central

    Oudin, Anna; Carlsen, Hanne K.; Forsberg, Bertil; Johansson, Christer

    2013-01-01

    In the aftermath of the Icelandic volcano Grimsvötn’s eruption on 21 May 2011, volcanic ash reached Northern Europe. Elevated levels of ambient particles (PM) were registered in mid Sweden. The aim of the present study was to investigate if the Grimsvötn eruption had an effect on mortality in Sweden. Based on PM measurements at 16 sites across Sweden, data were classified into an ash exposed data set (Ash area) and an unexposed data set (No ash area). Data on daily all-cause mortality were obtained from Statistics Sweden for the time period 1 April through 31 July 2011. Mortality ratios were calculated as the ratio between the daily number of deaths in the Ash area and the No ash area. The exposure period was defined as the week following the days with elevated particle concentrations, namely 24 May through 31 May. The control period was defined as 1 April through 23 May and 1 June through 31 July. There was no absolute increase in mortality during the exposure period. However, during the exposure period the mean mortality ratio was 2.42 compared with 2.17 during the control period, implying a relatively higher number of deaths in the Ash area than in the No ash area. The differences in ratios were mostly due to a single day, 31 May, and were not statistically significant when tested with a Mann-Whitney non-parametric test (p > 0.3). The statistical power was low with only 8 days in the exposure period (24 May through 31 May). Assuming that the observed relative differences were not due to chance, the results would imply an increase of 128 deaths during the exposure period 24–31 May. If 31 May was excluded, the number of extra deaths was reduced to 20. The results of the present study are contradicting and inconclusive, but may indicate that all-cause mortality was increased by the ash-fall from the Grimsvötn eruption. Meta-analysis or pooled analysis of data from neighboring countries might make it possible to reach sufficient statistical power to study effects of the Grimsvötn ash on morbidity and mortality. Such studies would be of particular importance for European societies preparing for future large scale volcanic eruptions in Iceland. PMID:24336019

  13. Power of tests for comparing trend curves with application to national immunization survey (NIS).

    PubMed

    Zhao, Zhen

    2011-02-28

    To develop statistical tests for comparing trend curves of study outcomes between two socio-demographic strata across consecutive time points, and to compare the statistical power of the proposed tests under different trend-curve data, three statistical tests were proposed. For large sample sizes with an independent normal assumption among strata and across consecutive time points, the Z and Chi-square test statistics were developed; these are functions of the outcome estimates and standard errors at each of the study time points for the two strata. For small sample sizes with an independent normal assumption, the F-test statistic was derived, which is a function of the sample sizes of the two strata and the estimated parameters across the study period. If two trend curves are approximately parallel, the power of the Z-test is consistently higher than that of both the Chi-square and F-tests. If two trend curves cross with low interaction, the power of the Z-test is higher than or equal to that of both the Chi-square and F-tests; however, with high interaction, the powers of the Chi-square and F-tests are higher than that of the Z-test. A measure of the interaction of two trend curves was defined. These tests were applied to the comparison of trend curves of vaccination coverage estimates for standard vaccine series with National Immunization Survey (NIS) 2000-2007 data. Copyright © 2011 John Wiley & Sons, Ltd.
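    As a rough illustration of the large-sample case, a Z-type statistic can be formed by aggregating the stratum differences across time points under the stated independence and normality assumptions. The aggregation below is an illustrative sketch, not necessarily the exact statistic developed in the paper.

```python
import numpy as np
from scipy import stats

def trend_difference_z(est1, se1, est2, se2):
    """Aggregate Z-statistic for an overall difference between two trend
    curves, given point estimates and standard errors at each time point."""
    est1, se1, est2, se2 = map(np.asarray, (est1, se1, est2, se2))
    z = np.sum(est1 - est2) / np.sqrt(np.sum(se1**2 + se2**2))
    return z, 2.0 * stats.norm.sf(abs(z))

# Example: coverage estimates (%) for two strata over five survey years.
z, p = trend_difference_z([78, 80, 82, 83, 85], [1.2] * 5,
                          [74, 76, 79, 81, 84], [1.4] * 5)
print(z, p)
```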

  14. Performance of an 8 kW Hall Thruster

    DTIC Science & Technology

    2000-01-12

    For the purpose of either orbit raising and/or repositioning, the Hall thruster must be capable of delivering sufficient thrust to minimize transfer...time. This, coupled with the increasing on-board electric power capacity of military and commercial satellites, requires a high power Hall thruster that...development of a novel, high power Hall thruster, capable of efficient operation over a broad range of Isp and thrust. We call such a thruster the bi

  15. Solid-state resistor for pulsed power machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoltzfus, Brian; Savage, Mark E.; Hutsel, Brian Thomas

    2016-12-06

    A flexible solid-state resistor comprises a string of ceramic resistors that can be used to charge the capacitors of a linear transformer driver (LTD) used in a pulsed power machine. The solid-state resistor is able to absorb the energy of a switch prefire, thereby limiting LTD cavity damage, yet has a sufficiently low RC charge time to allow the capacitor to be recharged without disrupting the operation of the pulsed power machine.

  16. Translations from the Soviet Journal of Atomic Energy

    DTIC Science & Technology

    1962-02-15

    constructing a new communist society. Atomic energy, in its role as a new and powerful source of highly concentrated energy, can effect a considerable...problem have provided sufficient evidence of the pernicious effects of radioactive contamination on human beings and require the development of special...be necessary to effect a considerable decrease in the cost of electrical power produced at atomic electric power stations. One of the most

  17. Interventions for reducing self-stigma in people with mental illnesses: a systematic review of randomized controlled trials.

    PubMed

    Büchter, Roland Brian; Messer, Melanie

    2017-01-01

    Background: Self-stigma occurs when people with mental illnesses internalize negative stereotypes and prejudices about their condition. It can reduce help-seeking behaviour and treatment adherence. The effectiveness of interventions aimed at reducing self-stigma in people with mental illness is systematically reviewed. Results are discussed in the context of a logic model of the broader social context of mental illness stigma. Methods: Medline, Embase, PsycINFO, ERIC, and CENTRAL were searched for randomized controlled trials in November 2013. Studies were assessed with the Cochrane risk of bias tool. Results: Five trials were eligible for inclusion, four of which provided data for statistical analyses. Four studies had a high risk of bias. The quality of evidence was very low for each set of interventions and outcomes. The interventions studied included various group-based anti-stigma interventions and an anti-stigma booklet. The intensity and fidelity of most interventions were high. Two studies were considered to be sufficiently homogeneous to be pooled for the outcome self-stigma. The meta-analysis did not find a statistically significant effect (SMD [95% CI] at 3 months: -0.26 [-0.64, 0.12], I² = 0%, n=108). None of the individual studies found sustainable effects on other outcomes, including recovery, help-seeking behaviour and self-stigma. Conclusions: The effectiveness of interventions against self-stigma is uncertain. Previous studies lacked statistical power, used questionable outcome measures and had a high risk of bias. Future studies should be based on robust methods and consider practical implications regarding intervention development (relevance, implementability, and placement in routine services).

  18. Mathematical aspects of assessing extreme events for the safety of nuclear plants

    NASA Astrophysics Data System (ADS)

    Potempski, Slawomir; Borysiewicz, Mieczyslaw

    2015-04-01

    In the paper, a review of mathematical methodologies applied for assessing the low frequencies of rare natural events such as earthquakes, tsunamis, hurricanes or tornadoes, floods (in particular flash floods and surge storms), lightning, solar flares, etc., will be given from the perspective of the safety assessment of nuclear plants. The statistical methods are usually based on extreme value theory, which deals with the analysis of extreme deviations from the median (or the mean). In this respect the application of various mathematical tools can be useful, such as the extreme value theorem of Fisher-Tippett-Gnedenko, leading to possible choices of generalized extreme value distributions, the Pickands-Balkema-de Haan theorem for tail fitting, or methods related to large deviation theory. In the paper the most important stochastic distributions relevant for performing rare-event statistical analysis will be presented. This concerns, for example, the analysis of data on annual extreme values (maxima - "Annual Maxima Series" - or minima), of peak values exceeding given thresholds over some period of interest ("Peak Over Threshold"), or the estimation of the size of exceedance. Despite the fact that there is a lack of sufficient statistical data directly containing rare events, in some cases it is still possible to extract useful information from existing larger data sets. As an example, one can consider data sets available from web sites for floods, earthquakes or natural hazards in general. Some aspects of such data sets will also be presented, taking into account their usefulness for the practical assessment of the risk to nuclear power plants from extreme weather conditions.
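    The "Annual Maxima Series" approach mentioned above can be illustrated with a short fit of a generalized extreme value (GEV) distribution and an estimate of a long return-period level. The sketch uses scipy's genextreme parameterization and synthetic data; a real safety assessment would add uncertainty quantification and goodness-of-fit checks.

```python
import numpy as np
from scipy.stats import genextreme

def fit_annual_maxima(maxima, return_period=10_000):
    """Fit a GEV to an annual maxima series and estimate the level exceeded
    on average once per return_period years."""
    shape, loc, scale = genextreme.fit(np.asarray(maxima, float))
    level = genextreme.isf(1.0 / return_period, shape, loc=loc, scale=scale)
    return (shape, loc, scale), level

# Synthetic example standing in for 60 years of annual flood maxima.
rng = np.random.default_rng(2)
maxima = rng.gumbel(loc=100.0, scale=15.0, size=60)
params, level_10k = fit_annual_maxima(maxima)
print(params, level_10k)
```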

  19. A Meta-Meta-Analysis: Empirical Review of Statistical Power, Type I Error Rates, Effect Sizes, and Model Selection of Meta-Analyses Published in Psychology

    ERIC Educational Resources Information Center

    Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.

    2010-01-01

    This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…

  20. A General Framework for Power Analysis to Detect the Moderator Effects in Two- and Three-Level Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben

    2016-01-01

    The purpose of this study is to propose a general framework for power analyses to detect the moderator effects in two- and three-level cluster randomized trials (CRTs). The study specifically aims to: (1) develop the statistical formulations for calculating statistical power, minimum detectable effect size (MDES) and its confidence interval to…

  1. Enhanced statistical tests for GWAS in admixed populations: assessment using African Americans from CARe and a Breast Cancer Consortium.

    PubMed

    Pasaniuc, Bogdan; Zaitlen, Noah; Lettre, Guillaume; Chen, Gary K; Tandon, Arti; Kao, W H Linda; Ruczinski, Ingo; Fornage, Myriam; Siscovick, David S; Zhu, Xiaofeng; Larkin, Emma; Lange, Leslie A; Cupples, L Adrienne; Yang, Qiong; Akylbekova, Ermeg L; Musani, Solomon K; Divers, Jasmin; Mychaleckyj, Joe; Li, Mingyao; Papanicolaou, George J; Millikan, Robert C; Ambrosone, Christine B; John, Esther M; Bernstein, Leslie; Zheng, Wei; Hu, Jennifer J; Ziegler, Regina G; Nyante, Sarah J; Bandera, Elisa V; Ingles, Sue A; Press, Michael F; Chanock, Stephen J; Deming, Sandra L; Rodriguez-Gil, Jorge L; Palmer, Cameron D; Buxbaum, Sarah; Ekunwe, Lynette; Hirschhorn, Joel N; Henderson, Brian E; Myers, Simon; Haiman, Christopher A; Reich, David; Patterson, Nick; Wilson, James G; Price, Alkes L

    2011-04-01

    While genome-wide association studies (GWAS) have primarily examined populations of European ancestry, more recent studies often involve additional populations, including admixed populations such as African Americans and Latinos. In admixed populations, linkage disequilibrium (LD) exists both at a fine scale in ancestral populations and at a coarse scale (admixture-LD) due to chromosomal segments of distinct ancestry. Disease association statistics in admixed populations have previously considered SNP association (LD mapping) or admixture association (mapping by admixture-LD), but not both. Here, we introduce a new statistical framework for combining SNP and admixture association in case-control studies, as well as methods for local ancestry-aware imputation. We illustrate the gain in statistical power achieved by these methods by analyzing data of 6,209 unrelated African Americans from the CARe project genotyped on the Affymetrix 6.0 chip, in conjunction with both simulated and real phenotypes, as well as by analyzing the FGFR2 locus using breast cancer GWAS data from 5,761 African-American women. We show that, at typed SNPs, our method yields an 8% increase in statistical power for finding disease risk loci compared to the power achieved by standard methods in case-control studies. At imputed SNPs, we observe an 11% increase in statistical power for mapping disease loci when our local ancestry-aware imputation framework and the new scoring statistic are jointly employed. Finally, we show that our method increases statistical power in regions harboring the causal SNP in the case when the causal SNP is untyped and cannot be imputed. Our methods and our publicly available software are broadly applicable to GWAS in admixed populations.

  2. Application and verification of ECMWF seasonal forecast for wind energy

    NASA Astrophysics Data System (ADS)

    Žagar, Mark; Marić, Tomislav; Qvist, Martin; Gulstad, Line

    2015-04-01

    A good understanding of long-term annual energy production (AEP) is crucial when assessing the business case of investing in green energy like wind power. Wind-resource assessment has emerged as a scientific discipline in its own right and has advanced at a high pace over the last decade. This has resulted in continuous improvement of the AEP accuracy and, therefore, an increase in business-case certainty. Harvesting the full potential output of a wind farm or a portfolio of wind farms depends heavily on optimizing operation and management strategy. The necessary information for short-term planning (up to 14 days) is provided by standard weather and power forecasting services, and the long-term plans are based on climatology. However, the wind-power industry lacks quality information, on intermediate timescales, about the expected seasonal and intra-annual variability and its geographical distribution. The seasonal power forecast presented here is designed to bridge this gap. The seasonal power production forecast is based on the ECMWF seasonal weather forecast and Vestas' high-resolution mesoscale weather library. The seasonal weather forecast is enriched through a layer of statistical post-processing added to relate large-scale wind speed anomalies to mesoscale climatology. The resulting predicted energy production anomalies thus include mesoscale effects not captured by the global forecasting systems. The turbine power output is non-linearly related to the wind speed, which has important implications for the wind power forecast. In theory, the wind power is proportional to the cube of wind speed. However, due to the nature of turbine design, this exponent is close to 3 only at low wind speeds, becomes smaller as the wind speed increases, and above 11-13 m/s the power output remains constant, called the rated power. The non-linear relationship between wind speed and power output generally increases the sensitivity of the forecasted power to wind speed anomalies. On the other hand, in cases and areas where turbines operate close to or above rated power, the sensitivity of the power forecast is reduced. Thus, to have acceptable skill, the seasonal power forecasting system requires good knowledge of changes in the frequency of events with sufficient wind speeds. The scientific background for the Vestas seasonal power forecasting system is described and the relationship between predicted monthly wind speed anomalies and observed wind energy production is investigated for a number of operating wind farms in different climate zones. Current challenges will be discussed and some future research and development areas identified.
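    The saturating, piecewise power curve described above can be sketched with an idealized function; all parameter values below (cut-in, rated, and cut-out speeds, rated power) are illustrative assumptions rather than a specific turbine model. The sketch makes explicit why wind-speed anomalies matter most in the below-rated regime.

```python
import numpy as np

def turbine_power(v, cut_in=3.0, rated_speed=12.0, cut_out=25.0, rated_power=3.0):
    """Idealized power curve in MW: cubic growth between cut-in and rated
    speed, constant at rated power up to cut-out, zero otherwise."""
    v = np.asarray(v, float)
    cubic = rated_power * ((v - cut_in) / (rated_speed - cut_in)) ** 3
    p = np.where((v >= cut_in) & (v < rated_speed), cubic, 0.0)
    p = np.where((v >= rated_speed) & (v < cut_out), rated_power, p)
    return p

# Sensitivity to a wind-speed anomaly vanishes once the turbine is at rated power.
print(turbine_power([5.0, 8.0, 11.0, 14.0, 26.0]))
```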

  3. The Scientific Method, Diagnostic Bayes, and How to Detect Epistemic Errors

    NASA Astrophysics Data System (ADS)

    Vrugt, J. A.

    2015-12-01

    In the past decades, Bayesian methods have found widespread application and use in environmental systems modeling. Bayes' theorem states that the posterior probability, P(H|D̂), of a hypothesis, H, is proportional to the product of the prior probability, P(H), of this hypothesis and the likelihood, L(H|D̂), of the same hypothesis given the new/incoming observations, D̂. In science and engineering, H often constitutes some numerical simulation model, D = F(x,·), which summarizes, using algebraic, empirical, and differential equations, state variables and fluxes, all our theoretical and/or practical knowledge of the system of interest, and x are the d unknown parameters which are subject to inference using some data, D̂, of the observed system response. The Bayesian approach is intimately related to the scientific method and uses an iterative cycle of hypothesis formulation (model), experimentation and data collection, and theory/hypothesis refinement to elucidate the rules that govern the natural world. Unfortunately, model refinement has proven to be very difficult, in large part because of the poor diagnostic power of residual-based likelihood functions (Gupta et al., 2008). This has inspired Vrugt and Sadegh (2013) to advocate 'likelihood-free' inference using approximate Bayesian computation (ABC). This approach uses one or more summary statistics, S(D̂), of the original data, D̂, designed ideally to be sensitive only to one particular process in the model. Any mismatch between the observed and simulated summary metrics is then easily linked to a specific model component. A recurrent issue with the application of ABC is self-sufficiency of the summary statistics. In theory, S(·) should contain as much information as the original data itself, yet complex systems rarely admit sufficient statistics. In this article, we propose to combine the ideas of ABC and regular Bayesian inference to guarantee that no information is lost in diagnostic model evaluation. This hybrid approach, coined diagnostic Bayes, uses the summary metrics as prior distribution and the original data in the likelihood function, or P(x|D̂) ∝ P(x|S(D̂)) L(x|D̂). A case study illustrates the ability of the proposed methodology to diagnose epistemic errors and provide guidance on model refinement.
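
    A minimal one-parameter sketch of the diagnostic Bayes factorization P(x|D̂) ∝ P(x|S(D̂)) L(x|D̂), assuming a toy Gaussian model in which the unknown parameter x is the mean and the summary statistic is the sample mean; the tolerance and grid are assumptions, and this is not the hydrologic case study of the paper.

```python
# Toy sketch of diagnostic Bayes: the summary statistic S(D_obs) defines the
# prior, while the full data set enters the likelihood. The model (x is the
# mean of a unit-variance Gaussian), tolerance and grid are illustrative.
import numpy as np

rng = np.random.default_rng(1)
x_true = 2.0
d_obs = rng.normal(x_true, 1.0, size=50)             # "observed" data
s_obs = d_obs.mean()                                  # summary statistic S(D_obs)

xs = np.linspace(0.0, 4.0, 401)                       # parameter grid
dx = xs[1] - xs[0]

# ABC-style summary prior: Gaussian kernel on the summary-statistic mismatch
# (for this toy model the expected summary for parameter x is x itself).
eps = 0.2                                             # tolerance (assumed)
summary_prior = np.exp(-0.5 * ((xs - s_obs) / eps) ** 2)

# Full-data Gaussian log-likelihood evaluated on the grid.
loglik = np.array([-0.5 * np.sum((d_obs - x) ** 2) for x in xs])
posterior = summary_prior * np.exp(loglik - loglik.max())
posterior /= posterior.sum() * dx

print(f"posterior mean ~ {(xs * posterior).sum() * dx:.3f} (true value {x_true})")
```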

  4. Long Working Hours and Subsequent Use of Psychotropic Medicine: A Study Protocol

    PubMed Central

    Albertsen, Karen

    2014-01-01

    Background Mental ill health is the most frequent cause of long-term sickness absence and disability retirement in Denmark. Some instances of mental ill health might be due to long working hours. A recent large cross-sectional study of a general working population in Norway found that not only “very much overtime”, but also “moderate overtime” (41-48 work hours/week) was significantly associated with increased levels of both anxiety and depression. These findings have not been sufficiently confirmed in longitudinal studies. Objective The objective of the study is to give a detailed plan for a research project aimed at investigating the possibility of a prospective association between weekly working hours and use of psychotropic medicine in the general working population of Denmark. Methods People from the general working population of Denmark have been surveyed, at various occasions in the time period 1995-2010, and interviewed about their work environment. The present study will link interview data from these surveys to national registers covering all inhabitants of Denmark. The participants will be followed for the first occurrence of redeemed prescriptions for psychotropic medicine. Poisson regression will be used to analyze incidence rates as a function of weekly working hours (32-40; 41-48; > 48 hours/week). The analyses will be controlled for gender, age, sample, shift work, and socioeconomic status. According to our feasibility studies, the statistical power is sufficient and the exposure is stable enough to make the study worth the while. Results The publication of the present study protocol ends the design phase of the project. In the next phase, the questionnaire data will be forwarded to Statistics Denmark where they will be linked to data on deaths, migrations, socioeconomic status, and redeemed prescriptions for psychotropic medication. We expect the analysis to be completed by the end of 2014 and the results to be published mid 2015. Conclusions The proposed project will be free from hindsight bias, since all hypotheses and statistical models are completely defined, peer-reviewed, and published before we link the exposure data to the outcome data. The results of the project will indicate to what extent and in what direction the national burden of mental ill health in Denmark has been influenced by long working hours. PMID:25239125

  5. Long working hours and subsequent use of psychotropic medicine: a study protocol.

    PubMed

    Hannerz, Harald; Albertsen, Karen

    2014-09-19

    Mental ill health is the most frequent cause of long-term sickness absence and disability retirement in Denmark. Some instances of mental ill health might be due to long working hours. A recent large cross-sectional study of a general working population in Norway found that not only "very much overtime", but also "moderate overtime" (41-48 work hours/week) was significantly associated with increased levels of both anxiety and depression. These findings have not been sufficiently confirmed in longitudinal studies. The objective of the study is to give a detailed plan for a research project aimed at investigating the possibility of a prospective association between weekly working hours and use of psychotropic medicine in the general working population of Denmark. People from the general working population of Denmark have been surveyed, at various occasions in the time period 1995-2010, and interviewed about their work environment. The present study will link interview data from these surveys to national registers covering all inhabitants of Denmark. The participants will be followed for the first occurrence of redeemed prescriptions for psychotropic medicine. Poisson regression will be used to analyze incidence rates as a function of weekly working hours (32-40; 41-48; > 48 hours/week). The analyses will be controlled for gender, age, sample, shift work, and socioeconomic status. According to our feasibility studies, the statistical power is sufficient and the exposure is stable enough to make the study worth the while. The publication of the present study protocol ends the design phase of the project. In the next phase, the questionnaire data will be forwarded to Statistics Denmark where they will be linked to data on deaths, migrations, socioeconomic status, and redeemed prescriptions for psychotropic medication. We expect the analysis to be completed by the end of 2014 and the results to be published mid 2015. The proposed project will be free from hindsight bias, since all hypotheses and statistical models are completely defined, peer-reviewed, and published before we link the exposure data to the outcome data. The results of the project will indicate to what extent and in what direction the national burden of mental ill health in Denmark has been influenced by long working hours.
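
    A minimal sketch of the planned type of analysis, Poisson regression of event counts on working-hour category with person-years as exposure; the data frame below is simulated for illustration and the covariate set is reduced, so it is not the study's actual model specification.

```python
# Simulated illustration of Poisson regression of incidence rates on weekly
# working-hour categories, with person-years as the exposure offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "hours": rng.choice(["32-40", "41-48", ">48"], size=n, p=[0.75, 0.18, 0.07]),
    "female": rng.integers(0, 2, size=n),
    "age": rng.integers(20, 60, size=n),
    "pyrs": rng.uniform(1.0, 10.0, size=n),           # person-years at risk
})
rate = 0.02 * np.where(df["hours"] == "32-40", 1.0,
               np.where(df["hours"] == "41-48", 1.2, 1.4))
df["events"] = rng.poisson(rate * df["pyrs"])          # simulated prescription events

model = smf.glm("events ~ C(hours, Treatment('32-40')) + female + age",
                data=df, family=sm.families.Poisson(),
                offset=np.log(df["pyrs"])).fit()
print(np.exp(model.params))                            # incidence rate ratios
```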

  6. Design of pilot studies to inform the construction of composite outcome measures.

    PubMed

    Edland, Steven D; Ard, M Colin; Li, Weiwei; Jiang, Lingjing

    2017-06-01

    Composite scales have recently been proposed as outcome measures for clinical trials. For example, the Prodromal Alzheimer's Cognitive Composite (PACC) is the sum of z-score normed component measures assessing episodic memory, timed executive function, and global cognition. Alternative methods of calculating composite total scores, using the weighted sum of the component measures that maximizes the signal-to-noise ratio of the resulting composite score, have been proposed. Optimal weights can be estimated from pilot data, but it is an open question how large a pilot trial is required to reliably estimate optimal weights. In this manuscript, we describe the calculation of optimal weights and use large-scale computer simulations to investigate how large a pilot study sample is required to inform the calculation of optimal weights. The simulations are informed by the pattern of decline observed in cognitively normal subjects enrolled in the Alzheimer's Disease Cooperative Study (ADCS) Prevention Instrument cohort study, restricting to n=75 subjects age 75 and over with an ApoE E4 risk allele and therefore likely to have an underlying Alzheimer neurodegenerative process. In the context of secondary prevention trials in Alzheimer's disease, and using the components of the PACC, we found that pilot studies as small as 100 subjects are sufficient to meaningfully inform weighting parameters. Regardless of the pilot study sample size used to inform weights, the optimally weighted PACC consistently outperformed the standard PACC in terms of statistical power to detect treatment effects in a clinical trial. Pilot studies of size 300 produced weights that achieved near-optimal statistical power and reduced the required sample size relative to the standard PACC by more than half. These simulations suggest that modestly sized pilot studies, comparable to a phase 2 clinical trial, are sufficient to inform the construction of composite outcome measures. Although these findings apply only to the PACC in the context of prodromal AD, the observation that weights only have to approximate the optimal weights to achieve near-optimal performance should generalize. Performing a pilot study or phase 2 trial to inform the weighting of proposed composite outcome measures is highly cost-effective. The net effect of more efficient outcome measures is that smaller trials will be required to test novel treatments. Alternatively, second-generation trials can use prior clinical trial data to inform weighting, so that greater efficiency can be achieved as we move forward.
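
    For a weighted sum of component change scores with mean decline vector μ and covariance Σ, the mean-to-standard-deviation ratio is maximized by weights proportional to Σ⁻¹μ. The sketch below estimates such weights from simulated pilot data; it is a generic construction consistent with the description above, not necessarily the authors' exact estimator.

```python
# Estimate signal-to-noise-optimal composite weights from (simulated) pilot
# data: for change scores with mean mu and covariance Sigma, the weighted sum
# w'X has maximal mean/SD when w is proportional to Sigma^{-1} mu.
import numpy as np

rng = np.random.default_rng(3)
n_pilot, k = 100, 3                                  # pilot subjects, components
mu_true = np.array([0.30, 0.20, 0.10])               # mean annual decline (assumed)
cov_true = np.array([[1.0, 0.3, 0.2],
                     [0.3, 1.0, 0.4],
                     [0.2, 0.4, 1.0]])                # component covariance (assumed)
pilot = rng.multivariate_normal(mu_true, cov_true, size=n_pilot)

mu_hat = pilot.mean(axis=0)
sigma_hat = np.cov(pilot, rowvar=False)
w = np.linalg.solve(sigma_hat, mu_hat)               # w proportional to Sigma^{-1} mu
w /= np.abs(w).sum()                                 # overall scale is arbitrary

def snr(weights):
    return weights @ mu_hat / np.sqrt(weights @ sigma_hat @ weights)

print("optimal weights:", np.round(w, 3))
print("SNR optimal vs equal weights:", round(snr(w), 3), round(snr(np.ones(k) / k), 3))
```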

  7. Effect size and statistical power in the rodent fear conditioning literature - A systematic review.

    PubMed

    Carneiro, Clarissa F D; Moulin, Thiago C; Macleod, Malcolm R; Amaral, Olavo B

    2018-01-01

    Proposals to increase research reproducibility frequently call for focusing on effect sizes instead of p values, as well as for increasing the statistical power of experiments. However, it is unclear to what extent these two concepts are indeed taken into account in basic biomedical science. To study this in a real-case scenario, we performed a systematic review of effect sizes and statistical power in studies on learning of rodent fear conditioning, a widely used behavioral task to evaluate memory. Our search criteria yielded 410 experiments comparing control and treated groups in 122 articles. Interventions had a mean effect size of 29.5%, and amnesia caused by memory-impairing interventions was nearly always partial. Mean statistical power to detect the average effect size observed in well-powered experiments with significant differences (37.2%) was 65%, and was lower among studies with non-significant results. Only one article reported a sample size calculation, and our estimated sample size to achieve 80% power considering typical effect sizes and variances (15 animals per group) was reached in only 12.2% of experiments. Actual effect sizes correlated with effect size inferences made by readers on the basis of textual descriptions of results only when findings were non-significant, and neither effect size nor power correlated with study quality indicators, number of citations or impact factor of the publishing journal. In summary, effect sizes and statistical power have a wide distribution in the rodent fear conditioning literature, but do not seem to have a large influence on how results are described or cited. Failure to take these concepts into consideration might limit attempts to improve reproducibility in this field of science.
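
    The kind of power and sample-size arithmetic discussed above can be reproduced with a standard two-sample t-test calculation; in the sketch below the standardized effect size (Cohen's d) is an assumed illustrative value, not a quantity taken from the review.

```python
# Two-sample t-test power calculation: sample size per group needed for 80%
# power at alpha = 0.05, and the power achieved by a smaller group size.
# The effect size d = 1.05 is an assumed value for illustration.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
d = 1.05                                                   # assumed standardized effect size

n_per_group = analysis.solve_power(effect_size=d, alpha=0.05, power=0.80)
power_at_8 = analysis.solve_power(effect_size=d, alpha=0.05, nobs1=8)
print(f"n per group for 80% power: {n_per_group:.1f}")
print(f"power with 8 animals/group: {power_at_8:.2f}")
```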

  8. Effect size and statistical power in the rodent fear conditioning literature – A systematic review

    PubMed Central

    Macleod, Malcolm R.

    2018-01-01

    Proposals to increase research reproducibility frequently call for focusing on effect sizes instead of p values, as well as for increasing the statistical power of experiments. However, it is unclear to what extent these two concepts are indeed taken into account in basic biomedical science. To study this in a real-case scenario, we performed a systematic review of effect sizes and statistical power in studies on learning of rodent fear conditioning, a widely used behavioral task to evaluate memory. Our search criteria yielded 410 experiments comparing control and treated groups in 122 articles. Interventions had a mean effect size of 29.5%, and amnesia caused by memory-impairing interventions was nearly always partial. Mean statistical power to detect the average effect size observed in well-powered experiments with significant differences (37.2%) was 65%, and was lower among studies with non-significant results. Only one article reported a sample size calculation, and our estimated sample size to achieve 80% power considering typical effect sizes and variances (15 animals per group) was reached in only 12.2% of experiments. Actual effect sizes correlated with effect size inferences made by readers on the basis of textual descriptions of results only when findings were non-significant, and neither effect size nor power correlated with study quality indicators, number of citations or impact factor of the publishing journal. In summary, effect sizes and statistical power have a wide distribution in the rodent fear conditioning literature, but do not seem to have a large influence on how results are described or cited. Failure to take these concepts into consideration might limit attempts to improve reproducibility in this field of science. PMID:29698451

  9. On information, negentropy and H-theorem

    NASA Astrophysics Data System (ADS)

    Chakrabarti, C. G.; Sarker, N. G.

    1983-09-01

    The paper deals with the importance of the Kullback discrimination information in the statistical characterization of the negentropy of a non-equilibrium state and the irreversibility of a classical dynamical system. The theory, based on the Kullback discrimination information as the H-function, gives new insight into the interrelation between the concepts of coarse-graining and the principle of sufficiency, leading to an important statistical characterization of the thermal equilibrium of a closed system.

  10. Purpose Restrictions on Information Use

    DTIC Science & Technology

    2013-06-03

    Employees are authorized to access Customer Information for business purposes only.” [5]. The HIPAA Privacy Rule requires that healthcare providers in the...outcomes can be probabilistic since the network does not know what ad will be best for each visitor but does have statistical information about various...beliefs as such beliefs are a sufficient statistic . Thus, the agent need only consider for each possible belief β it can have, what action it would

  11. 47 CFR 43.51 - Contracts and concessions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... market power on the foreign end of one or more of the U.S.-international routes included in the contract... International Bureau's World Wide Web site at http://www.fcc.gov/ib. The Commission will include on the list of... markets on the foreign end of the route or that it nevertheless lacks sufficient market power on the...

  12. 47 CFR 43.51 - Contracts and concessions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... market power on the foreign end of one or more of the U.S.-international routes included in the contract... International Bureau's World Wide Web site at http://www.fcc.gov/ib. The Commission will include on the list of... markets on the foreign end of the route or that it nevertheless lacks sufficient market power on the...

  13. 47 CFR 43.51 - Contracts and concessions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... market power on the foreign end of one or more of the U.S.-international routes included in the contract... International Bureau's World Wide Web site at http://www.fcc.gov/ib. The Commission will include on the list of... markets on the foreign end of the route or that it nevertheless lacks sufficient market power on the...

  14. Identifying Specific Genes Controlling Complex Traits Through A Genome-Wide Screen For cis-Acting Regulatory Elements - An Example Using Marek's Disease

    USDA-ARS?s Scientific Manuscript database

    The identification of specific genes underlying phenotypic variation of complex traits remains one of the greatest challenges in biology despite having genome sequences and more powerful tools. Most genome-wide screens lack sufficient resolving power as they typically depend on linkage. One altern...

  15. Detecting Genomic Clustering of Risk Variants from Sequence Data: Cases vs. Controls

    PubMed Central

    Schaid, Daniel J.; Sinnwell, Jason P.; McDonnell, Shannon K.; Thibodeau, Stephen N.

    2013-01-01

    As the ability to measure dense genetic markers approaches the limit of the DNA sequence itself, taking advantage of possible clustering of genetic variants in, and around, a gene would benefit genetic association analyses, and likely provide biological insights. The greatest benefit might be realized when multiple rare variants cluster in a functional region. Several statistical tests have been developed, one of which is based on the popular Kulldorff scan statistic for spatial clustering of disease. We extended another popular spatial clustering method – Tango's statistic – to genomic sequence data. An advantage of Tango's method is that it is rapid to compute, and when a single test statistic is computed, its distribution is well approximated by a scaled chi-square distribution, making computation of p-values very rapid. We compared the Type-I error rates and power of several clustering statistics, as well as the omnibus sequence kernel association test (SKAT). Although our version of Tango's statistic, which we call the "Kernel Distance" statistic, took approximately half the time to compute compared to the Kulldorff scan statistic, it had slightly less power than the scan statistic. Our results showed that the Ionita-Laza version of Kulldorff's scan statistic had the greatest power over a range of clustering scenarios. PMID:23842950

  16. Multifunctional Inflatable Structure Being Developed for the PowerSphere Concept

    NASA Technical Reports Server (NTRS)

    Peterson, Todd T.

    2003-01-01

    The continuing development of microsatellites and nanosatellites for low Earth orbits requires the collection of sufficient power for instruments onboard a low-weight, low-volume spacecraft. Because the overall surface area of a microsatellite or nanosatellite is small, body-mounted solar cells cannot provide enough power. The deployment of traditional, rigid, solar arrays necessitates larger satellite volumes and weights, and also requires extra apparatus for pointing. One solution to this power choke problem is the deployment of a spherical, inflatable power system. This power system, termed the "PowerSphere," has several advantages, including a high collection area, low weight and stowage volume, and the elimination of solar array pointing mechanisms.

  17. FUSION WELDING METHOD AND APPARATUS

    DOEpatents

    Wyman, W.L.; Steinkamp, W.I.

    1961-01-17

    An apparatus for the fusion welding of metal pieces at a joint is described. The apparatus comprises a high-vacuum chamber enclosing the metal pieces and a thermionic filament emitter. Sufficient power is applied to the emitter so that when the electron emission therefrom is focused on the joint, it has sufficient energy to melt the metal pieces, ionize the metallic vapor above the molten metal, and establish an arc discharge between the joint and the emitter.

  18. Sufficient condition for a finite-time singularity in a high-symmetry Euler flow: Analysis and statistics

    NASA Astrophysics Data System (ADS)

    Ng, C. S.; Bhattacharjee, A.

    1996-08-01

    A sufficient condition is obtained for the development of a finite-time singularity in a highly symmetric Euler flow, first proposed by Kida [J. Phys. Soc. Jpn. 54, 2132 (1985)] and recently simulated by Boratav and Pelz [Phys. Fluids 6, 2757 (1994)]. It is shown that if the second-order spatial derivative of the pressure (p_xx) is positive following a Lagrangian element (on the x axis), then a finite-time singularity must occur. Under some assumptions, this Lagrangian sufficient condition can be reduced to an Eulerian sufficient condition which requires that the fourth-order spatial derivative of the pressure (p_xxxx) at the origin be positive for all times leading up to the singularity. Analytical as well as direct numerical evaluation over a large ensemble of initial conditions demonstrates that for fixed total energy, p_xxxx is predominantly positive, with the average value growing with the number of modes.

  19. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study.

    PubMed

    Egbewale, Bolaji E; Lewis, Martyn; Sim, Julius

    2014-04-09

    Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. 126 hypothetical trial scenarios were evaluated (126,000 datasets), each with continuous data simulated using a combination of levels of: treatment effect; pretest-posttest correlation; and direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when the pretest-posttest correlation is ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. The apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Across a range of correlations between pre- and post-treatment scores and at varying levels and directions of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power.
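
    A minimal simulation in the spirit of the study: one trial with baseline imbalance is generated and the treatment effect is estimated by ANOVA on the post-score, change-score analysis, and ANCOVA. The correlation, imbalance, and effect values are illustrative, not the paper's 126 scenarios.

```python
# Compare ANOVA (post-score only), change-score analysis (CSA) and ANCOVA
# on one simulated trial with baseline imbalance. Parameter values are
# illustrative assumptions, not the scenarios evaluated in the paper.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n, rho, effect, imbalance = 200, 0.6, 0.5, 0.3
group = np.repeat([0, 1], n // 2)
pre = rng.normal(imbalance * group, 1.0)                        # baseline with imbalance
post = rho * pre + np.sqrt(1 - rho**2) * rng.normal(size=n) + effect * group
df = pd.DataFrame({"group": group, "pre": pre, "post": post, "change": post - pre})

anova = smf.ols("post ~ group", df).fit().params["group"]
csa = smf.ols("change ~ group", df).fit().params["group"]
ancova = smf.ols("post ~ group + pre", df).fit().params["group"]
print(f"true effect {effect}: ANOVA={anova:.3f}, CSA={csa:.3f}, ANCOVA={ancova:.3f}")
```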

  20. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study

    PubMed Central

    2014-01-01

    Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304

  1. On the structure and phase transitions of power-law Poissonian ensembles

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Oshanin, Gleb

    2012-10-01

    Power-law Poissonian ensembles are Poisson processes that are defined on the positive half-line, and that are governed by power-law intensities. Power-law Poissonian ensembles are stochastic objects of fundamental significance; they uniquely display an array of fractal features and they uniquely generate a span of important applications. In this paper we apply three different methods—oligarchic analysis, Lorenzian analysis and heterogeneity analysis—to explore power-law Poissonian ensembles. The amalgamation of these analyses, combined with the topology of power-law Poissonian ensembles, establishes a detailed and multi-faceted picture of the statistical structure and the statistical phase transitions of these elemental ensembles.

  2. Capacity Adequacy and Revenue Sufficiency in Electricity Markets With Wind Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levin, Todd; Botterud, Audun

    2015-05-01

    We present a computationally efficient mixed-integer program (MIP) that determines optimal generator expansion decisions, as well as periodic unit commitment and dispatch. The model is applied to analyze the impact of increasing wind power capacity on the optimal generation mix and the profitability of thermal generators. In a case study, we find that increasing wind penetration reduces energy prices while the prices for operating reserves increase. Moreover, scarcity pricing for operating reserves through reserve shortfall penalties significantly impacts the prices and profitability of thermal generators. Without scarcity pricing, no thermal units are profitable; however, scarcity pricing can ensure profitability for peaking units at high wind penetration levels. Capacity payments can also ensure profitability, but the payments required for baseload units to break even increase with the amount of wind power. The results indicate that baseload units are most likely to experience revenue sufficiency problems when wind penetration increases and new baseload units are only developed when natural gas prices are high and wind penetration is low.

  3. Visual and Statistical Analysis of Digital Elevation Models Generated Using Idw Interpolator with Varying Powers

    NASA Astrophysics Data System (ADS)

    Asal, F. F.

    2012-07-01

    Digital elevation data obtained from different engineering surveying techniques are utilized in generating Digital Elevation Models (DEMs), which are employed in many engineering and environmental applications. These data are usually in discrete point format, making it necessary to utilize an interpolation approach for the creation of the DEM. Quality assessment of the DEM is a vital issue controlling its use in different applications; however, this assessment has relied heavily on statistical methods while neglecting visual methods. This research applies visual analysis to DEMs generated using the IDW interpolator with varying powers in order to examine its potential for assessing the effects of the variation of the IDW power on the quality of the DEMs. Real elevation data were collected in the field using a total station instrument over corrugated terrain. DEMs were generated from the data at a unified cell size using the IDW interpolator with power values ranging from one to ten. Visual analysis was undertaken using 2D and 3D views of the DEM; in addition, statistical analysis was performed to assess the validity of the visual techniques in such analysis. Visual analysis showed that smoothing of the DEM decreases as the power value increases up to a power of four; however, increasing the power beyond four does not produce noticeable changes in the 2D and 3D views of the DEM. The statistical analysis supported these results, where the value of the Standard Deviation (SD) of the DEM increased with increasing power. More specifically, changing the power from one to two produced 36% of the total increase in SD (the increase due to changing the power from one to ten), and changing to powers of three and four gave 60% and 75%, respectively. This reflects the decrease in DEM smoothing as the IDW power increases. The study also showed that visual methods supported by statistical analysis have good potential in DEM quality assessment.
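
    A minimal inverse distance weighting (IDW) interpolator illustrating how the power parameter controls the weighting, and hence the smoothing, of the estimate; the scattered elevation points and query location below are synthetic, not the surveyed data used in the paper.

```python
# Minimal IDW interpolator: the estimate at a query point is a weighted mean
# of the sample elevations, with weights 1/d^power. Data are synthetic.
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """IDW estimate of z at query points from scattered (x, y, z) samples."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, eps) ** power            # inverse-distance weights
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(5)
pts = rng.uniform(0, 100, size=(50, 2))               # surveyed (x, y) locations (synthetic)
elev = 10 + 0.1 * pts[:, 0] + rng.normal(0, 0.5, 50)  # synthetic elevations
query = np.array([[50.0, 50.0]])

for p in (1, 2, 4, 10):
    print(f"power {p:>2}: z = {idw(pts, elev, query, power=p)[0]:.2f}")
```

    Low powers average over many distant points (smoother surfaces), while high powers let the nearest samples dominate, consistent with the decreasing smoothness reported above.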

  4. Evolution of coalitionary killing.

    PubMed

    Wrangham, R W

    1999-01-01

    Warfare has traditionally been considered unique to humans. It has, therefore, often been explained as deriving from features that are unique to humans, such as the possession of weapons or the adoption of a patriarchal ideology. Mounting evidence suggests, however, that coalitional killing of adults in neighboring groups also occurs regularly in other species, including wolves and chimpanzees. This implies that selection can favor components of intergroup aggression important to human warfare, including lethal raiding. Here I present the principal adaptive hypothesis for explaining the species distribution of intergroup coalitional killing. This is the "imbalance-of-power hypothesis," which suggests that coalitional killing is the expression of a drive for dominance over neighbors. Two conditions are proposed to be both necessary and sufficient to account for coalitional killing of neighbors: (1) a state of intergroup hostility; (2) sufficient imbalances of power between parties that one party can attack the other with impunity. Under these conditions, it is suggested, selection favors the tendency to hunt and kill rivals when the costs are sufficiently low. The imbalance-of-power hypothesis has been criticized on a variety of empirical and theoretical grounds which are discussed. To be further tested, studies of the proximate determinants of aggression are needed. However, current evidence supports the hypothesis that selection has favored a hunt-and-kill propensity in chimpanzees and humans, and that coalitional killing has a long history in the evolution of both species.

  5. Revenue Sufficiency and Reliability in a Zero Marginal Cost Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frew, Bethany A.

    Features of existing wholesale electricity markets, such as administrative pricing rules and policy-based reliability standards, can distort market incentives from allowing generators sufficient opportunities to recover both fixed and variable costs. Moreover, these challenges can be amplified by other factors, including (1) inelastic demand resulting from a lack of price signal clarity, (2) low- or near-zero marginal cost generation, particularly arising from low natural gas fuel prices and variable generation (VG), such as wind and solar, and (3) the variability and uncertainty of this VG. As power systems begin to incorporate higher shares of VG, many questions arise about the suitability of the existing marginal-cost-based price formation, primarily within an energy-only market structure, to ensure the economic viability of resources that might be needed to provide system reliability. This article discusses these questions and provides a summary of completed and ongoing modelling-based work at the National Renewable Energy Laboratory to better understand the impacts of evolving power systems on reliability and revenue sufficiency.

  6. Revenue Sufficiency and Reliability in a Zero Marginal Cost Future: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frew, Bethany A.; Milligan, Michael; Brinkman, Greg

    Features of existing wholesale electricity markets, such as administrative pricing rules and policy-based reliability standards, can distort market incentives from allowing generators sufficient opportunities to recover both fixed and variable costs. Moreover, these challenges can be amplified by other factors, including (1) inelastic demand resulting from a lack of price signal clarity, (2) low- or near-zero marginal cost generation, particularly arising from low natural gas fuel prices and variable generation (VG), such as wind and solar, and (3) the variability and uncertainty of this VG. As power systems begin to incorporate higher shares of VG, many questions arise about the suitability of the existing marginal-cost-based price formation, primarily within an energy-only market structure, to ensure the economic viability of resources that might be needed to provide system reliability. This article discusses these questions and provides a summary of completed and ongoing modelling-based work at the National Renewable Energy Laboratory to better understand the impacts of evolving power systems on reliability and revenue sufficiency.

  7. Power spectra as a diagnostic tool in probing statistical/nonstatistical behavior in unimolecular reactions

    NASA Astrophysics Data System (ADS)

    Chang, Xiaoyen Y.; Sewell, Thomas D.; Raff, Lionel M.; Thompson, Donald L.

    1992-11-01

    The possibility of utilizing different types of power spectra obtained from classical trajectories as a diagnostic tool to identify the presence of nonstatistical dynamics is explored by using the unimolecular bond-fission reactions of 1,2-difluoroethane and the 2-chloroethyl radical as test cases. In previous studies, the reaction rates for these systems were calculated by using a variational transition-state theory and classical trajectory methods. A comparison of the results showed that 1,2-difluoroethane is a nonstatistical system, while the 2-chloroethyl radical behaves statistically. Power spectra for these two systems have been generated under various conditions. The characteristics of these spectra are as follows: (1) The spectra for the 2-chloroethyl radical are always broader and more coupled to other modes than is the case for 1,2-difluoroethane. This is true even at very low levels of excitation. (2) When an internal energy near or above the dissociation threshold is initially partitioned into a local C-H stretching mode, the power spectra for 1,2-difluoroethane broaden somewhat, but discrete and somewhat isolated bands are still clearly evident. In contrast, the analogous power spectra for the 2-chloroethyl radical exhibit a near complete absence of isolated bands. The general appearance of the spectrum suggests a very high level of mode-to-mode coupling, large intramolecular vibrational energy redistribution (IVR) rates, and global statistical behavior. (3) The appearance of the power spectrum for the 2-chloroethyl radical is unaltered regardless of whether the initial C-H excitation is in the CH2 or the CH2Cl group. This result also suggests statistical behavior. These results are interpreted to mean that power spectra may be used as a diagnostic tool to assess the statistical character of a system. The presence of a diffuse spectrum exhibiting a nearly complete loss of isolated structures indicates that the dissociation dynamics of the molecule will be well described by statistical theories. If, however, the power spectrum maintains its discrete, isolated character, as is the case for 1,2-difluoroethane, the opposite conclusion is suggested. Since power spectra are very easily computed, this diagnostic method may prove to be useful.
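
    The diagnostic itself is straightforward to compute: the power spectrum of a mode coordinate taken from a trajectory. The sketch below uses a synthetic signal (one sharp oscillation plus weak noise) rather than an actual trajectory; the frequency and timestep values are illustrative.

```python
# Power spectrum of a (synthetic) mode-coordinate time series via FFT. Narrow,
# isolated peaks suggest weak mode coupling; a broad, diffuse spectrum suggests
# strong coupling and fast IVR, as discussed above.
import numpy as np

dt = 1.0e-16                                        # timestep in seconds (illustrative)
t = np.arange(2**14) * dt
# Synthetic "C-H stretch" signal: one sharp mode plus weak broadband noise.
signal = np.cos(2 * np.pi * 9.0e13 * t) + 0.1 * np.random.default_rng(6).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal * np.hanning(t.size))) ** 2
freqs = np.fft.rfftfreq(t.size, d=dt)
print("dominant frequency (THz):", freqs[spectrum.argmax()] / 1e12)
```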

  8. Anomalous Fluctuations in Autoregressive Models with Long-Term Memory

    NASA Astrophysics Data System (ADS)

    Sakaguchi, Hidetsugu; Honjo, Haruo

    2015-10-01

    An autoregressive model with a power-law type memory kernel is studied as a stochastic process that exhibits a self-affine-fractal-like behavior for a small time scale. We find numerically that the root-mean-square displacement Δ(m) for the time interval m increases with a power law as m^α with α < 1/2 for small m but saturates at sufficiently large m. The exponent α changes with the power exponent of the memory kernel.
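
    A sketch of one realization of such a process: an autoregressive update whose coefficients follow a power-law memory kernel, together with the root-mean-square displacement Δ(m) at several lags. The kernel exponent, normalization, and series length are assumptions for illustration, not the paper's parameter values.

```python
# Autoregressive sequence with a power-law memory kernel, and its
# root-mean-square displacement Delta(m) over time lags m.
import numpy as np

rng = np.random.default_rng(7)
N, gamma = 5000, 1.5                         # series length and kernel exponent (assumed)
kernel = np.arange(1, N + 1, dtype=float) ** (-gamma)
kernel /= kernel.sum()                       # normalize memory weights (assumed convention)

x = np.zeros(N)
for n in range(1, N):
    past = x[:n][::-1]                       # most recent value first
    x[n] = np.dot(kernel[:n], past) + rng.normal()

def rms_displacement(series, m):
    d = series[m:] - series[:-m]
    return np.sqrt(np.mean(d ** 2))

for m in (1, 10, 100, 1000, 2000):
    print(m, round(rms_displacement(x, m), 3))
```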

  9. The Army Communications Objectives Measurement System (ACOMS): Survey Design

    DTIC Science & Technology

    1988-04-01

    monthly basis so that the annual sample includes sufficient Hispanics to detect at the .80 power level: (1) Year-to-year changes of 3% in item...Hispanics. The requirements are listed in terms of power level and must be translated into requisite sample sizes. The requirements are expressed as the...annual samples needed to detect certain differences at the 80% power level. Differences in both directions are to be examined, so that a two-tailed

  10. A powerful approach for association analysis incorporating imprinting effects

    PubMed Central

    Xia, Fan; Zhou, Ji-Yuan; Fung, Wing Kam

    2011-01-01

    Motivation: For a diallelic marker locus, the transmission disequilibrium test (TDT) is a simple and powerful design for genetic studies. The TDT was originally proposed for use in families with both parents available (complete nuclear families) and has further been extended to the 1-TDT for use in families with only one of the parents available (incomplete nuclear families). Increasing interest in the influence of parental imprinting on heritability highlights the importance of incorporating imprinting effects into the mapping of association variants. Results: In this article, we extend the TDT-type statistics to incorporate imprinting effects and develop a series of new test statistics in a general two-stage framework for association studies. Our test statistics retain the advantages of family-based designs and require no assumption of Hardy–Weinberg equilibrium. Also, the proposed methods accommodate complete and incomplete nuclear families with one or more affected children. In the simulation study, we verify the validity of the proposed test statistics under various scenarios and compare the powers of the proposed statistics with some existing test statistics. It is shown that our methods greatly improve the power for detecting association in the presence of imprinting effects. We further demonstrate the advantage of our methods by applying the proposed test statistics to a rheumatoid arthritis dataset. Contact: wingfung@hku.hk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21798962

  11. A powerful approach for association analysis incorporating imprinting effects.

    PubMed

    Xia, Fan; Zhou, Ji-Yuan; Fung, Wing Kam

    2011-09-15

    For a diallelic marker locus, the transmission disequilibrium test (TDT) is a simple and powerful design for genetic studies. The TDT was originally proposed for use in families with both parents available (complete nuclear families) and has further been extended to the 1-TDT for use in families with only one of the parents available (incomplete nuclear families). Increasing interest in the influence of parental imprinting on heritability highlights the importance of incorporating imprinting effects into the mapping of association variants. In this article, we extend the TDT-type statistics to incorporate imprinting effects and develop a series of new test statistics in a general two-stage framework for association studies. Our test statistics retain the advantages of family-based designs and require no assumption of Hardy-Weinberg equilibrium. Also, the proposed methods accommodate complete and incomplete nuclear families with one or more affected children. In the simulation study, we verify the validity of the proposed test statistics under various scenarios and compare the powers of the proposed statistics with some existing test statistics. It is shown that our methods greatly improve the power for detecting association in the presence of imprinting effects. We further demonstrate the advantage of our methods by applying the proposed test statistics to a rheumatoid arthritis dataset. wingfung@hku.hk Supplementary data are available at Bioinformatics online.
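
    For reference, the basic TDT that the article extends (not the imprinting-aware statistics it develops) is a McNemar-type chi-square comparing transmissions and non-transmissions of a candidate allele from heterozygous parents; the counts below are illustrative.

```python
# Basic transmission disequilibrium test: among heterozygous parents, compare
# transmissions (b) and non-transmissions (c) of the candidate allele to
# affected children with a 1-df McNemar-type chi-square. Counts are illustrative.
from scipy import stats

b, c = 140, 98                        # illustrative transmission counts
tdt_chi2 = (b - c) ** 2 / (b + c)     # 1-df chi-square statistic
p_value = stats.chi2.sf(tdt_chi2, df=1)
print(f"TDT chi2 = {tdt_chi2:.2f}, p = {p_value:.4f}")
```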

  12. Experimental investigation of fan-folded piezoelectric energy harvesters for powering pacemakers

    PubMed Central

    Ansari, M H; Karami, M Amin

    2018-01-01

    This paper studies the fabrication and testing of a magnet-free piezoelectric energy harvester (EH) for powering biomedical devices and sensors inside the body. The design for the EH is a fan-folded structure consisting of bimorph piezoelectric beams folding on top of each other. An actual size experimental prototype is fabricated to verify the developed analytical models. The model is verified by matching the analytical results of the tip acceleration frequency response functions (FRF) and voltage FRF with the experimental results. The generated electricity is measured when the EH is excited by the heartbeat. A closed loop shaker system is utilized to reproduce the heartbeat vibrations. Achieving a low fundamental natural frequency is a key factor in generating sufficient energy for pacemakers using heartbeat vibrations. It is shown that the natural frequency of the small-scale device is less than 20 Hz due to its unique fan-folded design. The experimental results show that the small-scale EH generates sufficient power for state of the art pacemakers. The 1 cm³ EH with an 18.4 g tip mass generates more than 16 μW of power from a normal heartbeat waveform. The robustness of the device to the heart rate is also studied by measuring the relation between the power output and the heart rate. PMID:29674807

  13. Experimental investigation of fan-folded piezoelectric energy harvesters for powering pacemakers.

    PubMed

    Ansari, M H; Karami, M Amin

    2017-06-01

    This paper studies the fabrication and testing of a magnet-free piezoelectric energy harvester (EH) for powering biomedical devices and sensors inside the body. The design for the EH is a fan-folded structure consisting of bimorph piezoelectric beams folding on top of each other. An actual size experimental prototype is fabricated to verify the developed analytical models. The model is verified by matching the analytical results of the tip acceleration frequency response functions (FRF) and voltage FRF with the experimental results. The generated electricity is measured when the EH is excited by the heartbeat. A closed loop shaker system is utilized to reproduce the heartbeat vibrations. Achieving a low fundamental natural frequency is a key factor in generating sufficient energy for pacemakers using heartbeat vibrations. It is shown that the natural frequency of the small-scale device is less than 20 Hz due to its unique fan-folded design. The experimental results show that the small-scale EH generates sufficient power for state of the art pacemakers. The 1 cm³ EH with an 18.4 g tip mass generates more than 16 μW of power from a normal heartbeat waveform. The robustness of the device to the heart rate is also studied by measuring the relation between the power output and the heart rate.

  14. Design and simulation of front end power converter for a microgrid with fuel cells and solar power sources

    NASA Astrophysics Data System (ADS)

    Jeevargi, Chetankumar; Lodhi, Anuj; Sateeshkumar, Allu; Elangovan, D.; Arunkumar, G.

    2017-11-01

    The need for Renewable Energy Sources (RES) is increasing because of growing demand for power and because RES are environmentally friendly. In recent years, the cost of generating power from RES has decreased. This paper aims to design the front end power converter required for integrating fuel cells and solar power sources into the microgrid. The simulation of the designed front end converter is carried out in the PSIM 9.1.1 software. The results show that the designed front end power converter is sufficient for integrating the microgrid with fuel cells and solar power sources.

  15. Density dependence of the nuclear energy-density functional

    NASA Astrophysics Data System (ADS)

    Papakonstantinou, Panagiota; Park, Tae-Sun; Lim, Yeunhwan; Hyun, Chang Ho

    2018-01-01

    Background: The explicit density dependence in the coupling coefficients entering the nonrelativistic nuclear energy-density functional (EDF) is understood to encode effects of three-nucleon forces and dynamical correlations. The necessity for the density-dependent coupling coefficients to assume the form of a preferably small fractional power of the density ρ is empirical and the power is often chosen arbitrarily. Consequently, precision-oriented parametrizations risk overfitting in the regime of saturation and extrapolations in dilute or dense matter may lose predictive power. Purpose: Beginning with the observation that the Fermi momentum k_F, i.e., the cubic root of the density, is a key variable in the description of Fermi systems, we first wish to examine if a power hierarchy in a k_F expansion can be inferred from the properties of homogeneous matter in a domain of densities, which is relevant for nuclear structure and neutron stars. For subsequent applications we want to determine a functional that is of good quality but not overtrained. Method: For the EDF, we fit systematically polynomial and other functions of ρ^{1/3} to existing microscopic, variational calculations of the energy of symmetric and pure neutron matter (pseudodata) and analyze the behavior of the fits. We select a form and a set of parameters, which we found robust, and examine the parameters' naturalness and the quality of resulting extrapolations. Results: A statistical analysis confirms that low-order terms such as ρ^{1/3} and ρ^{2/3} are the most relevant ones in the nuclear EDF beyond lowest order. It also hints at a different power hierarchy for symmetric vs. pure neutron matter, supporting the need for more than one density-dependent term in nonrelativistic EDFs. The functional we propose easily accommodates known or adopted properties of nuclear matter near saturation. More importantly, upon extrapolation to dilute or asymmetric matter, it reproduces a range of existing microscopic results, to which it has not been fitted. It also predicts a neutron-star mass-radius relation consistent with observations. The coefficients display naturalness. Conclusions: Having been already determined for homogeneous matter, a functional of the present form can be mapped onto extended Skyrme-type functionals in a straightforward manner, as we outline here, for applications to finite nuclei. At the same time, the statistical analysis can be extended to higher orders and for different microscopic (ab initio) calculations with sufficient pseudodata points and for polarized matter.
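
    A sketch of the fitting strategy, expressing the energy per particle as a low-order polynomial in ρ^{1/3} (equivalently, powers of k_F) and fitting the coefficients by least squares; the pseudodata below are a synthetic placeholder curve, not the microscopic calculations used in the paper.

```python
# Fit a low-order polynomial in rho^(1/3) to a synthetic energy-per-particle
# curve for homogeneous matter. The pseudodata are placeholders; the chosen
# powers (2..5 of rho^(1/3)) are one illustrative truncation of the expansion.
import numpy as np

rho = np.linspace(0.02, 0.32, 16)                   # densities in fm^-3
e_pseudo = -16.0 + 120.0 * (rho - 0.16) ** 2        # synthetic E/A curve (MeV)

x = rho ** (1.0 / 3.0)                              # expansion variable ~ k_F
design = np.column_stack([x**2, x**3, x**4, x**5])  # kinetic-like term plus corrections
coeffs, *_ = np.linalg.lstsq(design, e_pseudo, rcond=None)
print("fitted coefficients for powers 2..5 of rho^(1/3):", np.round(coeffs, 2))
```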

  16. A novel approach for choosing summary statistics in approximate Bayesian computation.

    PubMed

    Aeschbacher, Simon; Beaumont, Mark A; Futschik, Andreas

    2012-11-01

    The choice of summary statistics is a crucial step in approximate Bayesian computation (ABC). Since statistics are often not sufficient, this choice involves a trade-off between loss of information and reduction of dimensionality. The latter may increase the efficiency of ABC. Here, we propose an approach for choosing summary statistics based on boosting, a technique from the machine-learning literature. We consider different types of boosting and compare them to partial least-squares regression as an alternative. To mitigate the lack of sufficiency, we also propose an approach for choosing summary statistics locally, in the putative neighborhood of the true parameter value. We study a demographic model motivated by the reintroduction of Alpine ibex (Capra ibex) into the Swiss Alps. The parameters of interest are the mean and standard deviation across microsatellites of the scaled ancestral mutation rate (θ_anc = 4N_e·u) and the proportion of males obtaining access to matings per breeding season (ω). By simulation, we assess the properties of the posterior distribution obtained with the various methods. According to our criteria, ABC with summary statistics chosen locally via boosting with the L2-loss performs best. Applying that method to the ibex data, we estimate θ_anc ≈ 1.288 and find that most of the variation across loci of the ancestral mutation rate u is between 7.7 × 10⁻⁴ and 3.5 × 10⁻³ per locus per generation. The proportion of males with access to matings is estimated as ω ≈ 0.21, which is in good agreement with recent independent estimates.
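
    A minimal ABC rejection sketch showing where summary statistics enter, using a toy normal model rather than the ibex demographic model: parameters whose simulated summaries fall closest to the observed summaries are retained as an approximate posterior sample. The sample sizes, prior range, and acceptance quantile are assumptions.

```python
# ABC rejection with hand-picked summary statistics (toy normal model, not the
# Alpine ibex model of the study): keep the prior draws whose simulated
# summaries are closest to the observed summaries.
import numpy as np

rng = np.random.default_rng(8)
obs = rng.normal(1.5, 1.0, size=100)                      # "observed" data
s_obs = np.array([obs.mean(), obs.std()])                 # chosen summary statistics

n_sim, tol = 20_000, 0.05
theta = rng.uniform(-5, 5, n_sim)                         # prior draws for the mean
sims = rng.normal(theta[:, None], 1.0, size=(n_sim, 100)) # simulated datasets
s_sim = np.stack([sims.mean(axis=1), sims.std(axis=1)], axis=1)

dist = np.linalg.norm(s_sim - s_obs, axis=1)              # summary-statistic mismatch
accepted = theta[dist < np.quantile(dist, tol)]           # ABC posterior sample
print(f"ABC posterior mean {accepted.mean():.3f} ± {accepted.std():.3f}")
```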

  17. A Novel Approach for Choosing Summary Statistics in Approximate Bayesian Computation

    PubMed Central

    Aeschbacher, Simon; Beaumont, Mark A.; Futschik, Andreas

    2012-01-01

    The choice of summary statistics is a crucial step in approximate Bayesian computation (ABC). Since statistics are often not sufficient, this choice involves a trade-off between loss of information and reduction of dimensionality. The latter may increase the efficiency of ABC. Here, we propose an approach for choosing summary statistics based on boosting, a technique from the machine-learning literature. We consider different types of boosting and compare them to partial least-squares regression as an alternative. To mitigate the lack of sufficiency, we also propose an approach for choosing summary statistics locally, in the putative neighborhood of the true parameter value. We study a demographic model motivated by the reintroduction of Alpine ibex (Capra ibex) into the Swiss Alps. The parameters of interest are the mean and standard deviation across microsatellites of the scaled ancestral mutation rate (θ_anc = 4N_e·u) and the proportion of males obtaining access to matings per breeding season (ω). By simulation, we assess the properties of the posterior distribution obtained with the various methods. According to our criteria, ABC with summary statistics chosen locally via boosting with the L2-loss performs best. Applying that method to the ibex data, we estimate θ̂_anc ≈ 1.288 and find that most of the variation across loci of the ancestral mutation rate u is between 7.7 × 10⁻⁴ and 3.5 × 10⁻³ per locus per generation. The proportion of males with access to matings is estimated as ω̂ ≈ 0.21, which is in good agreement with recent independent estimates. PMID:22960215

  18. Determinants of 25(OH)D sufficiency in obese minority children: selecting outcome measures and analytic approaches.

    PubMed

    Zhou, Ping; Schechter, Clyde; Cai, Ziyong; Markowitz, Morri

    2011-06-01

    To highlight complexities in defining vitamin D sufficiency in children. Serum 25-(OH) vitamin D [25(OH)D] levels from 140 healthy obese children age 6 to 21 years living in the inner city were compared with multiple health outcome measures, including bone biomarkers and cardiovascular risk factors. Several statistical analytic approaches were used, including Pearson correlation, analysis of covariance (ANCOVA), and "hockey stick" regression modeling. Potential threshold levels for vitamin D sufficiency varied by outcome variable and analytic approach. Only systolic blood pressure (SBP) was significantly correlated with 25(OH)D (r = -0.261; P = .038). ANCOVA revealed that SBP and triglyceride levels were statistically significant in the test groups [25(OH)D <10, <15 and <20 ng/mL] compared with the reference group [25(OH)D >25 ng/mL]. ANCOVA also showed that only children with severe vitamin D deficiency [25(OH)D <10 ng/mL] had significantly higher parathyroid hormone levels (Δ = 15; P = .0334). Hockey stick model regression analyses found evidence of a threshold level in SBP, with a 25(OH)D breakpoint of 27 ng/mL, along with a 25(OH)D breakpoint of 18 ng/mL for triglycerides, but no relationship between 25(OH)D and parathyroid hormone. Defining vitamin D sufficiency should take into account different vitamin D-related health outcome measures and analytic methodologies. Copyright © 2011 Mosby, Inc. All rights reserved.
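
    A sketch of the "hockey stick" (breakpoint) regression idea used above: the outcome responds linearly to 25(OH)D below a breakpoint and is flat above it, with the breakpoint chosen by a grid search over candidate values. The simulated data and the 20 ng/mL true breakpoint are illustrative, not the study's measurements.

```python
# Hockey-stick (breakpoint) regression by grid search: fit a piecewise-linear
# model (sloped below the breakpoint, flat above) and pick the breakpoint that
# minimizes the residual sum of squares. Data are simulated for illustration.
import numpy as np

rng = np.random.default_rng(9)
vitd = rng.uniform(5, 40, 200)                               # 25(OH)D, ng/mL (simulated)
sbp = 110 + 0.8 * np.maximum(20 - vitd, 0) + rng.normal(0, 3, 200)

def rss_at(bp):
    """Residual sum of squares of the piecewise-linear fit with breakpoint bp."""
    X = np.column_stack([np.ones_like(vitd), np.maximum(bp - vitd, 0)])
    _, rss, *_ = np.linalg.lstsq(X, sbp, rcond=None)
    return rss[0] if rss.size else np.inf

candidates = np.arange(10, 35, 0.5)
best_bp = candidates[np.argmin([rss_at(b) for b in candidates])]
print(f"estimated breakpoint: {best_bp:.1f} ng/mL")
```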

  19. Single photon laser altimeter simulator and statistical signal processing

    NASA Astrophysics Data System (ADS)

    Vacek, Michael; Prochazka, Ivan

    2013-05-01

    Spaceborne altimeters are common instruments onboard deep space rendezvous spacecraft. They provide range and topographic measurements critical for spacecraft navigation. Simultaneously, the receiver part may be utilized for an Earth-to-satellite link, one-way time transfer, and precise optical radiometry. The main advantage of the single photon counting approach is the ability to process signals with very low signal-to-noise ratio, eliminating the need for large telescopes and a high power laser source. Extremely small, rugged and compact microchip lasers can be employed. The major limiting factor, on the other hand, is the acquisition time needed to gather a sufficient volume of data in repetitive measurements in order to process and evaluate the data appropriately. Statistical signal processing is adopted to detect signals with average strength much lower than one photon per measurement. A comprehensive simulator design and range signal processing algorithm are presented to identify a mission specific altimeter configuration. Typical mission scenarios (celestial body surface landing and topographical mapping) are simulated and evaluated. The most promising single photon altimeter applications are low-orbit (~10 km) and low-radial-velocity (several m/s) topographical mapping (asteroids, Phobos and Deimos) and landing altimetry (~10 km), where range evaluation repetition rates of ~100 Hz and 0.1 m precision may be achieved. Moon landing and asteroid Itokawa topographical mapping scenario simulations are discussed in more detail.

  20. Impact of a family medicine resident wellness curriculum: a feasibility study.

    PubMed

    Runyan, Christine; Savageau, Judith A; Potts, Stacy; Weinreb, Linda

    2016-01-01

    Background Up to 60% of practicing physicians report symptoms of burnout, which often peak during residency. Residency is also a relevant time for habits of self-care and resiliency to be emphasized. A growing literature underscores the importance of this; however, evidence about effective burnout prevention curriculum during residency remains limited. Objectives The purpose of this project is to evaluate the impact of a new, 1-month wellness curriculum for 12 second-year family medicine residents on burnout, empathy, stress, and self-compassion. Methods The pilot program, introduced during a new rotation emphasizing competencies around leadership, focused on teaching skills to cultivate mindfulness and self-compassion in order to enhance empathy and reduce stress. Pre-assessments and 3-month follow-up assessments on measures of burnout, empathy, self-compassion, and perceived stress were collected to evaluate the impact of the curriculum. It was hypothesized that this curriculum would enhance empathy and self-compassion as well as reduce stress and burnout among family medicine residents. Results Descriptive statistics revealed positive trends on the mean scores of all the measures, particularly the Mindfulness Scale of the Self-Compassion Inventory and the Jefferson Empathy Scale. However, the small sample size and lack of sufficient power to detect meaningful differences limited the use of inferential statistics. Conclusions This feasibility study demonstrates how a residency wellness curriculum can be developed, implemented, and evaluated with promising results, including high participant satisfaction.

  1. Redshift distributions of galaxies in the Dark Energy Survey Science Verification shear catalogue and implications for weak lensing

    DOE PAGES

    Bonnett, C.; Troxel, M. A.; Hartley, W.; ...

    2016-08-30

    Here we present photometric redshift estimates for galaxies used in the weak lensing analysis of the Dark Energy Survey Science Verification (DES SV) data. Four model- or machine learning-based photometric redshift methods—annz2, bpz calibrated against BCC-Ufig simulations, skynet, and tpz—are analyzed. For training, calibration, and testing of these methods, we construct a catalogue of spectroscopically confirmed galaxies matched against DES SV data. The performance of the methods is evaluated against the matched spectroscopic catalogue, focusing on metrics relevant for weak lensing analyses, with additional validation against COSMOS photo-z’s. From the galaxies in the DES SV shear catalogue, which have mean redshift 0.72±0.01 over the range 0.38 [...] of approximately 3%. This shift is within the one sigma statistical errors on σ8 for the DES SV shear catalogue. We further study the potential impact of systematic differences on the critical surface density, Σcrit, finding levels of bias safely less than the statistical power of DES SV data. In conclusion, we recommend a final Gaussian prior for the photo-z bias in the mean of n(z) of width 0.05 for each of the three tomographic bins, and show that this is a sufficient bias model for the corresponding cosmology analysis.

  2. Automated use of mutagenesis data in structure prediction.

    PubMed

    Nanda, Vikas; DeGrado, William F

    2005-05-15

    In the absence of experimental structural determination, numerous methods are available to indirectly predict or probe the structure of a target molecule. Genetic modification of a protein sequence is a powerful tool for identifying key residues involved in binding reactions or protein stability. Mutagenesis data is usually incorporated into the modeling process either through manual inspection of model compatibility with empirical data, or through the generation of geometric constraints linking sensitive residues to a binding interface. We present an approach derived from statistical studies of lattice models for introducing mutation information directly into the fitness score. The approach takes into account the phenotype of mutation (neutral or disruptive) and calculates the energy for a given structure over an ensemble of sequences. The structure prediction procedure searches for the optimal conformation where neutral sequences either have no impact or improve stability and disruptive sequences reduce stability relative to wild type. We examine three types of sequence ensembles: information from saturation mutagenesis, scanning mutagenesis, and homologous proteins. Incorporating multiple sequences into a statistical ensemble serves to energetically separate the native state and misfolded structures. As a result, the prediction of structure with a poor force field is sufficiently enhanced by mutational information to improve accuracy. Furthermore, by separating misfolded conformations from the target score, the ensemble energy serves to speed up conformational search algorithms such as Monte Carlo-based methods. Copyright 2005 Wiley-Liss, Inc.

  3. First Results on the Variability of Mid- and High-Latitude Ionospheric Electric Fields at 1- Second Time Scales

    NASA Astrophysics Data System (ADS)

    Ruohoniemi, J. M.; Greenwald, R. A.; Oksavik, K.; Baker, J. B.

    2007-12-01

    The electric fields at high latitudes are often modeled as a static pattern in the absence of variation in solar wind parameters or geomagnetic disturbance. However, temporal variability in the local electric fields on time scales of minutes for stable conditions has been reported and characterized statistically as an intrinsic property amounting to turbulence. We describe the results of applying a new technique to SuperDARN HF radar observations of ionospheric plasma convection at middle and high latitudes that gives views of the variability of the electric fields at sub-second time scales. We address the question of whether there is a limit to the temporal scale of the electric field variability and consider whether the turbulence on minute time scales is due to organized but unresolved behavior. The basis of the measurements is the ability to record raw samples from the individual multipulse sequences that are transmitted during the standard 3- or 6-second SuperDARN integration period; a backscattering volume is then effectively sampled at a cadence of 200 ms. The returns from the individual sequences are often sufficiently well-ordered to permit a sequence-by-sequence characterization of the electric field and backscattered power. We attempt a statistical characterization of the variability at these heretofore inaccessible time scales and consider how variability is influenced by solar wind and magnetospheric factors.

  4. Impact of a family medicine resident wellness curriculum: a feasibility study.

    PubMed

    Runyan, Christine; Savageau, Judith A; Potts, Stacy; Weinreb, Linda

    2016-01-01

    Up to 60% of practicing physicians report symptoms of burnout, which often peak during residency. Residency is also a relevant time for habits of self-care and resiliency to be emphasized. A growing literature underscores the importance of this; however, evidence about effective burnout prevention curricula during residency remains limited. The purpose of this project is to evaluate the impact of a new, 1-month wellness curriculum for 12 second-year family medicine residents on burnout, empathy, stress, and self-compassion. The pilot program, introduced during a new rotation emphasizing competencies around leadership, focused on teaching skills to cultivate mindfulness and self-compassion in order to enhance empathy and reduce stress. Pre-assessments and 3-month follow-up assessments on measures of burnout, empathy, self-compassion, and perceived stress were collected to evaluate the impact of the curriculum. It was hypothesized that this curriculum would enhance empathy and self-compassion as well as reduce stress and burnout among family medicine residents. Descriptive statistics revealed positive trends on the mean scores of all the measures, particularly the Mindfulness Scale of the Self-Compassion Inventory and the Jefferson Empathy Scale. However, the small sample size and lack of sufficient power to detect meaningful differences limited the use of inferential statistics. This feasibility study demonstrates how a residency wellness curriculum can be developed, implemented, and evaluated with promising results, including high participant satisfaction.

  5. Redshift distributions of galaxies in the Dark Energy Survey Science Verification shear catalogue and implications for weak lensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonnett, C.; Troxel, M. A.; Hartley, W.

    Here we present photometric redshift estimates for galaxies used in the weak lensing analysis of the Dark Energy Survey Science Verification (DES SV) data. Four model- or machine learning-based photometric redshift methods—annz2, bpz calibrated against BCC-Ufig simulations, skynet, and tpz—are analyzed. For training, calibration, and testing of these methods, we construct a catalogue of spectroscopically confirmed galaxies matched against DES SV data. The performance of the methods is evaluated against the matched spectroscopic catalogue, focusing on metrics relevant for weak lensing analyses, with additional validation against COSMOS photo-z’s. From the galaxies in the DES SV shear catalogue, which have mean redshift 0.72±0.01 over the range 0.38 [...] of approximately 3%. This shift is within the one sigma statistical errors on σ8 for the DES SV shear catalogue. We further study the potential impact of systematic differences on the critical surface density, Σcrit, finding levels of bias safely less than the statistical power of DES SV data. In conclusion, we recommend a final Gaussian prior for the photo-z bias in the mean of n(z) of width 0.05 for each of the three tomographic bins, and show that this is a sufficient bias model for the corresponding cosmology analysis.

  6. Engagement of large-scale networks is related to individual differences in inhibitory control

    PubMed Central

    Congdon, Eliza; Mumford, Jeanette A.; Cohen, Jessica R.; Galvan, Adriana; Aron, Adam R.; Xue, Gui; Miller, Eric; Poldrack, Russell A.

    2010-01-01

    Understanding which brain regions regulate the execution, and suppression, of goal-directed behavior has implications for a number of areas of research. In particular, understanding which brain regions are engaged during tasks requiring the execution and inhibition of a motor response provides insight into the mechanisms underlying individual differences in response inhibition ability. However, neuroimaging studies examining the relation between activation and stopping have been inconsistent regarding the direction of the relationship, and also regarding the anatomical location of regions that correlate with behavior. These limitations likely arise from the relatively low power of voxelwise correlations with small sample sizes. Here, we pooled data over five separate fMRI studies of the Stop-signal task in order to obtain a sufficiently large sample size to robustly detect brain/behavior correlations. In addition, rather than performing mass univariate correlation analysis across all voxels, we increased statistical power by reducing the dimensionality of the data set using independent components analysis and then examined correlations between behavior and the resulting component scores. We found that components reflecting activity in regions thought to be involved in stopping were associated with better stopping ability, while activity in a default-mode network was associated with poorer stopping ability across individuals. These results clearly show a relationship between individual differences in stopping ability and engagement of specific networks, including regions known to be critical for the behavior. The results also highlight the usefulness of using dimensionality reduction to increase the power to detect brain/behavior correlations in individual differences research. PMID:20600962
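
    The dimensionality-reduction strategy mentioned above can be sketched in a few lines: reduce a subjects-by-voxels matrix to component scores and then correlate those scores with behaviour. The example below uses simulated data and scikit-learn's FastICA purely as an illustration; it is not the pipeline or data used in the study.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.decomposition import FastICA

# Toy stand-in for the dimensionality-reduction strategy: reduce a
# subjects-by-voxels activation matrix to component scores, then correlate
# each subject's scores with a behavioural measure (e.g., stop-signal RT).
rng = np.random.default_rng(1)
n_subjects, n_voxels, n_components = 120, 5_000, 10

activation = rng.normal(size=(n_subjects, n_voxels))
behaviour = rng.normal(size=n_subjects)            # simulated behavioural score
# embed a weak network whose expression tracks behaviour
network = rng.normal(size=n_voxels)
activation += 0.3 * np.outer(behaviour, network)

ica = FastICA(n_components=n_components, random_state=0)
scores = ica.fit_transform(activation)             # per-subject component scores

for k in range(n_components):
    r, p = pearsonr(scores[:, k], behaviour)
    if p < 0.05 / n_components:                    # Bonferroni across components
        print(f"component {k}: r={r:.2f}, p={p:.1e}")
```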

  7. Modified Whittaker plots as an assessment and monitoring tool for vegetation in a lowland tropical rainforest.

    PubMed

    Campbell, Patrick; Comiskey, James; Alonso, Alfonso; Dallmeier, Francisco; Nuñez, Percy; Beltran, Hamilton; Baldeon, Severo; Nauray, William; de la Colina, Rafael; Acurio, Lucero; Udvardy, Shana

    2002-05-01

    Resource exploitation in lowland tropical forests is increasing and causing loss of biodiversity. Effective evaluation and management of the impacts of development on tropical forests requires appropriate assessment and monitoring tools. We propose the use of 0.1-ha multi-scale, modified Whittaker plots (MWPs) to assess and monitor vegetation in lowland tropical rainforests. We established MWPs at 4 sites to: (1) describe and compare composition and structure of the sites using MWPs, (2) compare these results to those of 1-ha permanent vegetation plots (BDPs), and (3) evaluate the ability of MWPs to detect changes in populations (statistical power). We recorded more than 400 species at each site. Species composition among the sites was distinctive, while mean abundance and basal area were similar. Comparisons between MWPs and BDPs show that they record similar species composition and abundance and that both perform equally well at detecting rare species. However, MWPs tend to record more species, and power analysis studies show that MWPs were more effective at detecting changes in the mean number of species of trees ≥ 10 cm in diameter at breast height (dbh) and in herbaceous plants. Ten MWPs were sufficient to detect a change of 11% in the mean number of herb species, and they were able to detect a 14% change in the mean number of species of trees ≥ 10 cm dbh. The value of MWPs for assessment and monitoring is discussed, along with recommendations for improving the sampling design to increase power.

  8. Aerobic power and flight capacity in birds: a phylogenetic test of the heart-size hypothesis.

    PubMed

    Nespolo, Roberto F; González-Lagos, César; Solano-Iguaran, Jaiber J; Elfwing, Magnus; Garitano-Zavala, Alvaro; Mañosa, Santiago; Alonso, Juan Carlos; Altimiras, Jordi

    2018-01-09

    Flight capacity is one of the most important innovations in animal evolution; it only evolved in insects, birds, mammals and the extinct pterodactyls. Given that powered flight represents a demanding aerobic activity, an efficient cardiovascular system is essential for the continuous delivery of oxygen to the pectoral muscles during flight. It is well known that the limiting step in the circulation is stroke volume (the volume of blood pumped from the ventricle to the body during each beat), which is determined by the size of the ventricle. Thus, the fresh mass of the heart represents a simple and repeatable anatomical measure of the aerobic power of an animal. Although several authors have compared heart masses across bird species, a phylogenetic comparative analysis is still lacking. By compiling heart sizes for 915 species and applying several statistical procedures controlling for body size and/or testing for adaptive trends in the dataset (e.g. model selection approaches, phylogenetic generalized linear models), we found that (residuals of) heart size is consistently associated with four categories of flight capacity. In general, our results indicate that species exhibiting continuous hovering flight (i.e. hummingbirds) have substantially larger hearts than other groups, species that use flapping flight and gliding show intermediate values, and that species categorized as poor flyers show the smallest values. Our study reveals that on a broad scale, routine flight modes seem to have shaped the energetic requirements of birds sufficiently to be anatomically detected at the comparative level. © 2018. Published by The Company of Biologists Ltd.

  9. More Power to America.

    ERIC Educational Resources Information Center

    Miller, Willis H.

    1979-01-01

    Surveys America's current energy situation and considers means of attaining domestic energy self-sufficiency. Information is presented on hazards of decreasing energy production, traditional energy sources, and exotic energy sources. (Author/DB)

  10. UNCERTAINTY ON RADIATION DOSES ESTIMATED BY BIOLOGICAL AND RETROSPECTIVE PHYSICAL METHODS.

    PubMed

    Ainsbury, Elizabeth A; Samaga, Daniel; Della Monaca, Sara; Marrale, Maurizio; Bassinet, Celine; Burbidge, Christopher I; Correcher, Virgilio; Discher, Michael; Eakins, Jon; Fattibene, Paola; Güçlü, Inci; Higueras, Manuel; Lund, Eva; Maltar-Strmecki, Nadica; McKeever, Stephen; Rääf, Christopher L; Sholom, Sergey; Veronese, Ivan; Wieser, Albrecht; Woda, Clemens; Trompier, Francois

    2018-03-01

    Biological and physical retrospective dosimetry are recognised as key techniques to provide individual estimates of dose following unplanned exposures to ionising radiation. Whilst there has been a relatively large amount of recent development in the biological and physical procedures, development of statistical analysis techniques has failed to keep pace. The aim of this paper is to review the current state of the art in uncertainty analysis techniques across the 'EURADOS Working Group 10-Retrospective dosimetry' members, to give concrete examples of implementation of the techniques recommended in the international standards, and to further promote the use of Monte Carlo techniques to support characterisation of uncertainties. It is concluded that sufficient techniques are available and in use by most laboratories for acute, whole body exposures to highly penetrating radiation, but further work will be required to ensure that statistical analysis is always wholly sufficient for the more complex exposure scenarios.
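
    The Monte Carlo characterisation of uncertainty promoted above can be illustrated with a toy dose-estimation example: propagate uncertainty in a linear-quadratic calibration curve and in the observed aberration count through to a distribution of estimated doses. All coefficients and counts below are invented placeholders, not EURADOS values.

```python
import numpy as np

# Hedged illustration of Monte Carlo uncertainty propagation for a
# linear-quadratic calibration curve y = c + a*D + b*D^2 (aberrations per cell).
rng = np.random.default_rng(2)
n_mc = 100_000

c, a, b = 0.001, 0.02, 0.06           # calibration coefficients (placeholders)
sc, sa, sb = 0.0005, 0.005, 0.01      # their standard uncertainties
observed_aberrations, cells_scored = 25, 500

doses = []
for _ in range(n_mc):
    ci = rng.normal(c, sc)
    ai = rng.normal(a, sa)
    bi = rng.normal(b, sb)
    # resample the observed count assuming Poisson scatter
    y = rng.poisson(observed_aberrations) / cells_scored
    disc = ai**2 - 4 * bi * (ci - y)
    if disc > 0 and bi > 0:
        doses.append((-ai + np.sqrt(disc)) / (2 * bi))

doses = np.array(doses)
lo, hi = np.percentile(doses, [2.5, 97.5])
print(f"dose estimate ~ {doses.mean():.2f} Gy (95% interval {lo:.2f}-{hi:.2f} Gy)")
```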

  11. [Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].

    PubMed

    Suzukawa, Yumi; Toyoda, Hideki

    2012-04-01

    This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in fields like perception, cognition or learning, the effect sizes were relatively large, although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In other fields, because of the large sample sizes, even trivial effects could be detected. This implies that researchers who could not obtain large enough effect sizes would use larger samples to obtain significant results.

  12. Wireless power charging using point of load controlled high frequency power converters

    DOEpatents

    Miller, John M.; Campbell, Steven L.; Chambon, Paul H.; Seiber, Larry E.; White, Clifford P.

    2015-10-13

    An apparatus for wirelessly charging a battery of an electric vehicle is provided with a point of load control. The apparatus includes a base unit for generating a direct current (DC) voltage. The base unit is regulated by a power level controller. One or more point of load converters can be connected to the base unit by a conductor, with each point of load converter comprising a control signal generator that transmits a signal to the power level controller. The output power level of the DC voltage provided by the base unit is controlled by the power level controller such that the power level is sufficient to power all active load converters when commanded to do so by any of the active controllers, without generating excessive power that may otherwise be wasted.

  13. Determination of Type I Error Rates and Power of Answer Copying Indices under Various Conditions

    ERIC Educational Resources Information Center

    Yormaz, Seha; Sünbül, Önder

    2017-01-01

    This study aims to determine the Type I error rates and power of the S1 and S2 indices and the kappa statistic at detecting copying on multiple-choice tests under various conditions. It also aims to determine how the way in which copying groups are created to calculate the kappa statistic affects its Type I error rates and power. In this study,…

  14. Statistical Power Analysis with Microsoft Excel: Normal Tests for One or Two Means as a Prelude to Using Non-Central Distributions to Calculate Power

    ERIC Educational Resources Information Center

    Texeira, Antonio; Rosa, Alvaro; Calapez, Teresa

    2009-01-01

    This article presents statistical power analysis (SPA) based on the normal distribution using Excel, adopting textbook and SPA approaches. The objective is to present the latter in a comparative way within a framework that is familiar to textbook level readers, as a first step to understand SPA with other distributions. The analysis focuses on the…

  15. On the analysis of very small samples of Gaussian repeated measurements: an alternative approach.

    PubMed

    Westgate, Philip M; Burchett, Woodrow W

    2017-03-15

    The analysis of very small samples of Gaussian repeated measurements can be challenging. First, due to a very small number of independent subjects contributing outcomes over time, statistical power can be quite small. Second, nuisance covariance parameters must be appropriately accounted for in the analysis in order to maintain the nominal test size. However, available statistical strategies that ensure valid statistical inference may lack power, whereas more powerful methods may have the potential for inflated test sizes. Therefore, we explore an alternative approach to the analysis of very small samples of Gaussian repeated measurements, with the goal of maintaining valid inference while also improving statistical power relative to other valid methods. This approach uses generalized estimating equations with a bias-corrected empirical covariance matrix that accounts for all small-sample aspects of nuisance correlation parameter estimation in order to maintain valid inference. Furthermore, the approach utilizes correlation selection strategies with the goal of choosing the working structure that will result in the greatest power. In our study, we show that when accurate modeling of the nuisance correlation structure impacts the efficiency of regression parameter estimation, this method can improve power relative to existing methods that yield valid inference. Copyright © 2017 John Wiley & Sons, Ltd.
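
    A minimal sketch in the spirit of this approach, using statsmodels' GEE with its bias-reduced sandwich covariance (a Mancl-DeRouen-style correction, not necessarily the authors' exact estimator) and an exchangeable working correlation on simulated data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Sketch: fit GEE to a very small sample of Gaussian repeated measurements and
# request a bias-reduced empirical covariance instead of the usual robust one.
rng = np.random.default_rng(3)
n_subjects, n_times = 10, 4

subject = np.repeat(np.arange(n_subjects), n_times)
time = np.tile(np.arange(n_times), n_subjects)
group = np.repeat(rng.integers(0, 2, n_subjects), n_times)   # treatment indicator
y = 0.5 * group + 0.2 * time + rng.normal(scale=1.0, size=n_subjects * n_times)

data = pd.DataFrame({"y": y, "group": group, "time": time, "subject": subject})

model = sm.GEE.from_formula(
    "y ~ group + time", groups="subject", data=data,
    family=sm.families.Gaussian(),
    cov_struct=sm.cov_struct.Exchangeable(),   # candidate working correlation
)
result = model.fit(cov_type="bias_reduced")    # small-sample corrected covariance
print(result.summary())
```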

  16. Précis of statistical significance: rationale, validity, and utility.

    PubMed

    Chow, S L

    1998-04-01

    The null-hypothesis significance-test procedure (NHSTP) is defended in the context of the theory-corroboration experiment, as well as the following contrasts: (a) substantive hypotheses versus statistical hypotheses, (b) theory corroboration versus statistical hypothesis testing, (c) theoretical inference versus statistical decision, (d) experiments versus nonexperimental studies, and (e) theory corroboration versus treatment assessment. The null hypothesis can be true because it is the hypothesis that errors are randomly distributed in data. Moreover, the null hypothesis is never used as a categorical proposition. Statistical significance means only that chance influences can be excluded as an explanation of data; it does not identify the nonchance factor responsible. The experimental conclusion is drawn with the inductive principle underlying the experimental design. A chain of deductive arguments gives rise to the theoretical conclusion via the experimental conclusion. The anomalous relationship between statistical significance and the effect size often used to criticize NHSTP is more apparent than real. The absolute size of the effect is not an index of evidential support for the substantive hypothesis. Nor is the effect size, by itself, informative as to the practical importance of the research result. Being a conditional probability, statistical power cannot be the a priori probability of statistical significance. The validity of statistical power is debatable because statistical significance is determined with a single sampling distribution of the test statistic based on H0, whereas it takes two distributions to represent statistical power or effect size. Sample size should not be determined in the mechanical manner envisaged in power analysis. It is inappropriate to criticize NHSTP for nonstatistical reasons. At the same time, neither effect size, nor confidence interval estimate, nor posterior probability can be used to exclude chance as an explanation of data. Neither can any of them fulfill the nonstatistical functions expected of them by critics.

  17. The use of imputed sibling genotypes in sibship-based association analysis: on modeling alternatives, power and model misspecification.

    PubMed

    Minică, Camelia C; Dolan, Conor V; Hottenga, Jouke-Jan; Willemsen, Gonneke; Vink, Jacqueline M; Boomsma, Dorret I

    2013-05-01

    When phenotypic, but no genotypic, data are available for relatives of participants in genetic association studies, previous research has shown that family-based imputed genotypes can boost the statistical power when included in such studies. Here, using simulations, we compared the performance of two statistical approaches suitable to model imputed genotype data: the mixture approach, which involves the full distribution of the imputed genotypes, and the dosage approach, where the mean of the conditional distribution features as the imputed genotype. Simulations were run by varying sibship size, size of the phenotypic correlations among siblings, imputation accuracy and minor allele frequency of the causal SNP. Furthermore, as imputing sibling data and extending the model to include sibships of size two or greater requires modeling the familial covariance matrix, we inquired whether model misspecification affects power. Finally, the results obtained via simulations were empirically verified in two datasets with continuous phenotype data (height) and with a dichotomous phenotype (smoking initiation). Across the settings considered, the mixture and the dosage approach are equally powerful and both produce unbiased parameter estimates. In addition, the likelihood-ratio test in the linear mixed model appears to be robust to the considered misspecification in the background covariance structure, given low to moderate phenotypic correlations among siblings. Empirical results show that the inclusion in association analysis of imputed sibling genotypes does not always result in a larger test statistic. The actual test statistic may drop in value due to small effect sizes. That is, if the power benefit is small (i.e., the change in the distribution of the test statistic under the alternative is relatively small), the probability of obtaining a smaller test statistic is greater. As the genetic effects are typically hypothesized to be small, in practice, the decision on whether family-based imputation could be used as a means to increase power should be informed by prior power calculations and by the consideration of the background correlation.

  18. The Explicit and Implicit Organizational Structures for the Collective Bargaining Process under the California Legislation.

    ERIC Educational Resources Information Center

    Criswell, Larry W.

    Douglas Mitchell suggests that statute construction issues arise from the interaction between the realities of power resources and the goal of giving each interest group sufficient power to protect and pursue its own interests while preserving the rights or interests of others. California SB 160 explicitly limits the scope of bargaining to wages,…

  19. Electrical Power Quality--What's behind the Outlet?

    ERIC Educational Resources Information Center

    Baird, William H.; Secrest, Jeffrey; Padgett, Clifford

    2017-01-01

    Although we may consider the power outlets in our homes to be nearly ideal voltage sources, a variety of influences in and around the home can cause departures from the nominal 60 Hz, 110-120 V root-mean-square (rms) of the North American grid. Even without instrumentation, we can see that a large motor starting from rest can be sufficient to…

  20. Global Optimization of Low-Thrust Interplanetary Trajectories Subject to Operational Constraints

    NASA Technical Reports Server (NTRS)

    Englander, Jacob Aldo; Vavrina, Matthew; Hinckley, David

    2016-01-01

    Low-thrust electric propulsion provides many advantages for missions to difficult targets: comets and asteroids, Mercury, and the outer planets (with a sufficient power supply). Low-thrust electric propulsion is characterized by high power requirements but also very high specific impulse (Isp), leading to very good mass fractions. Low-thrust trajectory design is a very different process from chemical trajectory design.

  1. An experimental study of potential residential and commercial applications of small-scale hybrid power systems

    NASA Astrophysics Data System (ADS)

    Acosta, Michael Anthony

    The research presented in this thesis provides an understanding of small-scale hybrid power systems. Experiments were conducted to identify potential applications of renewable energy in residential and commercial settings in the Rio Grande Valley of Texas. Solar and wind energy converted into electric energy was stored in batteries and inverted to power common household and commercial appliances. Several small to medium size hybrid power systems were set up and utilized to conduct numerous tests to study renewable energy prospects and feasibility for various applications. The experimental results obtained indicate that carefully constructed solar power systems can provide people living in isolated communities with sufficient energy to consistently meet their basic power needs.

  2. New heterogeneous test statistics for the unbalanced fixed-effect nested design.

    PubMed

    Guo, Jiin-Huarng; Billard, L; Luh, Wei-Ming

    2011-05-01

    When the underlying variances are unknown or/and unequal, using the conventional F test is problematic in the two-factor hierarchical data structure. Prompted by the approximate test statistics (Welch and Alexander-Govern methods), the authors develop four new heterogeneous test statistics to test factor A and factor B nested within A for the unbalanced fixed-effect two-stage nested design under variance heterogeneity. The actual significance levels and statistical power of the test statistics were compared in a simulation study. The results show that the proposed procedures maintain better Type I error rate control and have greater statistical power than those obtained by the conventional F test in various conditions. Therefore, the proposed test statistics are recommended in terms of robustness and easy implementation. ©2010 The British Psychological Society.
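
    The motivation for heterogeneous statistics can be seen in a simplified two-group simulation (not the authors' nested-design procedures): under unequal variances and unbalanced cells, the pooled test's Type I error drifts from nominal while a Welch-type statistic stays close to it.

```python
import numpy as np
from scipy.stats import ttest_ind

# Two-group illustration of the problem motivating heterogeneous test statistics:
# both groups have mean 0, so every rejection is a Type I error.
rng = np.random.default_rng(4)
n_sims, alpha = 20_000, 0.05
n1, n2 = 8, 30            # unbalanced cells
sd1, sd2 = 3.0, 1.0       # heterogeneous variances

rej_pooled = rej_welch = 0
for _ in range(n_sims):
    x = rng.normal(0, sd1, n1)
    y = rng.normal(0, sd2, n2)
    if ttest_ind(x, y, equal_var=True).pvalue < alpha:
        rej_pooled += 1
    if ttest_ind(x, y, equal_var=False).pvalue < alpha:
        rej_welch += 1

print(f"empirical Type I error: pooled={rej_pooled / n_sims:.3f}, "
      f"Welch={rej_welch / n_sims:.3f} (nominal {alpha})")
```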

  3. Implantable radio frequency identification sensors: wireless power and communication.

    PubMed

    Hutchens, Chriswell; Rennaker, Robert L; Venkataraman, Srinivasan; Ahmed, Rehan; Liao, Ran; Ibrahim, Tamer

    2011-01-01

    There are significant technical challenges in the development of a fully implantable wirelessly powered neural interface. Challenges include wireless transmission of sufficient power to the implanted device to ensure reliable operation for decades without replacement, minimizing tissue heating, and adequate reliable communications bandwidth. Overcoming these challenges is essential for the development of implantable closed-loop systems for the treatment of disorders ranging from epilepsy, incontinence, and stroke to spinal cord injury. We discuss the development of the wireless power, communication and control for a Radio-Frequency Identification Sensor (RFIDS) system with a targeted power range for a 700 mV, 30 to 40 µA load attained at -2 dBm.

  4. Correlation techniques and measurements of wave-height statistics

    NASA Technical Reports Server (NTRS)

    Guthart, H.; Taylor, W. C.; Graf, K. A.; Douglas, D. G.

    1972-01-01

    Statistical measurements of wave height fluctuations have been made in a wind wave tank. The power spectral density function of temporal wave height fluctuations evidenced second-harmonic components and an f^-5 power-law decay beyond the second harmonic. The observations of second harmonic effects agreed very well with a theoretical prediction. From the wave statistics, surface drift currents were inferred and compared to experimental measurements with satisfactory agreement. Measurements were made of the two-dimensional correlation coefficient at 15 deg increments in angle with respect to the wind vector. An estimate of the two-dimensional spatial power spectral density function was also made.

  5. Control system design method

    DOEpatents

    Wilson, David G [Tijeras, NM; Robinett, III, Rush D.

    2012-02-21

    A control system design method and concomitant control system comprising representing a physical apparatus to be controlled as a Hamiltonian system, determining elements of the Hamiltonian system representation which are power generators, power dissipators, and power storage devices, analyzing stability and performance of the Hamiltonian system based on the results of the determining step and determining necessary and sufficient conditions for stability of the Hamiltonian system, creating a stable control system based on the results of the analyzing step, and employing the resulting control system to control the physical apparatus.

  6. Estimation of power in low velocity vertical axis wind turbine

    NASA Astrophysics Data System (ADS)

    Sampath, S. S.; Shetty, Sawan; Chithirai Pon Selvan, M.

    2015-06-01

    The present work involves the construction of a vertical axis wind turbine and the determination of its power output. Various different types of turbine blades are considered and the optimum blade is selected. Mechanical components of the entire setup are built to obtain maximum rotations per minute. The mechanical energy is converted into electrical energy by coupling the shaft coaxially to the generator. This setup produces sufficient power for household consumption and is economical and readily available.

  7. Contribution of Apollo lunar photography to the establishment of selenodetic control

    NASA Technical Reports Server (NTRS)

    Dermanis, A.

    1975-01-01

    Among the various types of available data relevant to the establishment of geometric control on the moon, the only one covering significant portions of the lunar surface (20%) with sufficient information content is lunar photography taken in the proximity of the moon from lunar orbiters. The idea of free geodetic networks is introduced as a tool for the statistical comparison of the geometric aspects of the various data used. Methods were developed for updating the statistics of observations and the a priori parameter estimates to obtain statistically consistent solutions by means of the optimum relative weighting concept.

  8. Power, effects, confidence, and significance: an investigation of statistical practices in nursing research.

    PubMed

    Gaskin, Cadeyrn J; Happell, Brenda

    2014-05-01

    To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. The use, reporting, and interpretation of inferential statistics in nursing research need substantial improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
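
    The kind of calculation underlying the reported power medians can be reproduced with statsmodels' power module; the sample size below (30 per group) is a hypothetical illustration, not a figure from the review.

```python
from statsmodels.stats.power import TTestIndPower

# Power of an independent-samples t test for Cohen's d of 0.2, 0.5 and 0.8,
# at a hypothetical 30 participants per group and alpha = .05.
analysis = TTestIndPower()
for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
    power = analysis.power(effect_size=d, nobs1=30, ratio=1.0, alpha=0.05)
    print(f"{label} effect (d={d}): power = {power:.2f}")

# The same object solves for the a priori sample size instead:
n_needed = analysis.solve_power(effect_size=0.5, power=0.80, alpha=0.05)
print(f"n per group for 80% power at d=0.5: {n_needed:.0f}")
```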

  9. Ecological statistics of Gestalt laws for the perceptual organization of contours.

    PubMed

    Elder, James H; Goldberg, Richard M

    2002-01-01

    Although numerous studies have measured the strength of visual grouping cues for controlled psychophysical stimuli, little is known about the statistical utility of these various cues for natural images. In this study, we conducted experiments in which human participants trace perceived contours in natural images. These contours are automatically mapped to sequences of discrete tangent elements detected in the image. By examining relational properties between pairs of successive tangents on these traced curves, and between randomly selected pairs of tangents, we are able to estimate the likelihood distributions required to construct an optimal Bayesian model for contour grouping. We employed this novel methodology to investigate the inferential power of three classical Gestalt cues for contour grouping: proximity, good continuation, and luminance similarity. The study yielded a number of important results: (1) these cues, when appropriately defined, are approximately uncorrelated, suggesting a simple factorial model for statistical inference; (2) moderate image-to-image variation of the statistics indicates the utility of general probabilistic models for perceptual organization; (3) these cues differ greatly in their inferential power, proximity being by far the most powerful; and (4) statistical modeling of the proximity cue indicates a scale-invariant power law in close agreement with prior psychophysics.

  10. A clinicomicrobiological study to evaluate the efficacy of manual and powered toothbrushes among autistic patients

    PubMed Central

    Vajawat, Mayuri; Deepika, P. C.; Kumar, Vijay; Rajeshwari, P.

    2015-01-01

    Aim: To compare the efficacy of powered toothbrushes in improving gingival health and reducing salivary red complex counts, as compared to manual toothbrushes, among autistic individuals. Materials and Methods: Forty autistic individuals were selected. The test group received powered toothbrushes, and the control group received manual toothbrushes. Plaque index and gingival index were recorded. Unstimulated saliva was collected for analysis of red complex organisms using polymerase chain reaction. Results: A statistically significant reduction in the plaque scores was seen over a period of 12 weeks in both groups (P < 0.001 for tests and P = 0.002 for controls). This reduction was statistically more significant in the test group (P = 0.024). A statistically significant reduction in the gingival scores was seen over a period of 12 weeks in both groups (P < 0.001 for tests and P = 0.001 for controls). This reduction was statistically more significant in the test group (P = 0.042). No statistically significant reduction in the detection rate of red complex organisms was seen at 4 weeks in either group. Conclusion: Powered toothbrushes result in a significant overall improvement in gingival health when constant reinforcement of oral hygiene instructions is given. PMID:26681855

  11. Statistical properties of radiation power levels from a high-gain free-electron laser at and beyond saturation

    NASA Astrophysics Data System (ADS)

    Schroeder, C. B.; Fawley, W. M.; Esarey, E.

    2003-07-01

    We investigate the statistical properties (e.g., shot-to-shot power fluctuations) of the radiation from a high-gain free-electron laser (FEL) operating in the nonlinear regime. We consider the case of an FEL amplifier reaching saturation whose shot-to-shot fluctuations in input radiation power follow a gamma distribution. We analyze the corresponding output power fluctuations at and beyond saturation, including beam energy spread effects, and find that there are well-characterized values of undulator length for which the fluctuations reach a minimum.

  12. Disease severity in familial cases of IBD.

    PubMed

    Andreu, M; Márquez, L; Domènech, E; Gisbert, J P; García, V; Marín-Jiménez, I; Peñalva, M; Gomollón, F; Calvet, X; Merino, O; Garcia-Planella, E; Vázquez-Romero, N; Esteve, M; Nos, P; Gutiérrez, A; Vera, I; Cabriada, J L; Martín, M D; Cañas-Ventura, A; Panés, J

    2014-03-01

    Phenotypic traits of familial IBD relative to sporadic cases are controversial, probably owing to the limited statistical power of published evidence. To determine whether there are phenotype differences between familial and sporadic IBD, we evaluated the prospective Spanish registry (ENEIDA) with 11,983 cases. 5783 patients (48.3%) had ulcerative colitis (UC) and 6200 (51.7%) Crohn's disease (CD). Cases with one or more 1st, 2nd or 3rd degree relatives affected by UC/CD were defined as familial cases. In UC and CD, familial cases compared with sporadic cases had an earlier disease onset (UC: 33 years [IQR 25-44] vs 37 years [IQR 27-49]; p<0.0001) (CD: 27 years [IQR 21-35] vs 29 years [IQR 22-40]; p<0.0001) and a higher prevalence of extraintestinal immune-related manifestations (EIMs) (UC: 17.2% vs 14%; p=0.04) (CD: 30.1% vs 23.6%; p<0.0001). Familial CD had a higher percentage of ileocolic location (42.7% vs 51.8%; p=0.0001), penetrating behavior (21% vs 17.6%; p=0.01) and perianal disease (32% vs 27.1%; p=0.003). Differences are not influenced by the degree of consanguinity. When a sufficiently powered cohort is evaluated, familial aggregation in IBD is associated with an earlier disease onset, more EIMs and a more severe phenotype in CD. These features should be taken into account when establishing predictors of disease course. © 2013.

  13. Association analysis using next-generation sequence data from publicly available control groups: the robust variance score statistic

    PubMed Central

    Derkach, Andriy; Chiang, Theodore; Gong, Jiafen; Addis, Laura; Dobbins, Sara; Tomlinson, Ian; Houlston, Richard; Pal, Deb K.; Strug, Lisa J.

    2014-01-01

    Motivation: Sufficiently powered case–control studies with next-generation sequence (NGS) data remain prohibitively expensive for many investigators. If feasible, a more efficient strategy would be to include publicly available sequenced controls. However, these studies can be confounded by differences in sequencing platform; alignment, single nucleotide polymorphism and variant calling algorithms; read depth; and selection thresholds. Assuming one can match cases and controls on the basis of ethnicity and other potential confounding factors, and one has access to the aligned reads in both groups, we investigate the effect of systematic differences in read depth and selection threshold when comparing allele frequencies between cases and controls. We propose a novel likelihood-based method, the robust variance score (RVS), that substitutes genotype calls by their expected values given observed sequence data. Results: We show theoretically that the RVS eliminates read depth bias in the estimation of minor allele frequency. We also demonstrate that, using simulated and real NGS data, the RVS method controls Type I error and has comparable power to the ‘gold standard’ analysis with the true underlying genotypes for both common and rare variants. Availability and implementation: An RVS R script and instructions can be found at strug.research.sickkids.ca, and at https://github.com/strug-lab/RVS. Contact: lisa.strug@utoronto.ca Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24733292
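
    The core substitution in the RVS approach, using E[G | reads] in place of hard genotype calls, can be sketched as follows; the genotype likelihoods and allele frequency are toy values, and the variance correction that completes the RVS is omitted for brevity.

```python
import numpy as np

# Simplified sketch of expected genotype dosage from genotype likelihoods.
def expected_dosage(genotype_likelihoods, maf):
    """genotype_likelihoods: (n_samples, 3) array of P(reads | G=0,1,2)."""
    p = np.array([(1 - maf) ** 2, 2 * maf * (1 - maf), maf ** 2])  # HWE prior
    post = genotype_likelihoods * p                 # unnormalised P(G | reads)
    post /= post.sum(axis=1, keepdims=True)
    return post @ np.array([0.0, 1.0, 2.0])         # E[G | reads] per sample

# Toy likelihoods: high-depth cases (confident calls) vs low-depth controls.
cases = np.array([[0.01, 0.98, 0.01], [0.97, 0.02, 0.01], [0.01, 0.01, 0.98]])
controls = np.array([[0.40, 0.35, 0.25], [0.55, 0.30, 0.15], [0.50, 0.30, 0.20]])

maf = 0.2
d_cases, d_controls = expected_dosage(cases, maf), expected_dosage(controls, maf)
print("mean expected dosage, cases vs controls:",
      d_cases.mean().round(2), d_controls.mean().round(2))
```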

  14. Cerebral oxygenation in traumatic brain injury; Can a non-invasive frequency domain near-infrared spectroscopy device detect changes in brain tissue oxygen tension as well as the established invasive monitor?

    PubMed

    Davies, David James; Clancy, Michael; Dehghani, Hamid; Lucas, Samuel John Edwin; Forcione, Mario; Yakoub, Kamal Makram; Belli, Antonio

    2018-06-07

    The cost and highly invasive nature of current brain monitoring modalities in traumatic brain injury patients restrict their utility to specialist neurological intensive care settings. We aim to test the ability of a frequency domain near-infrared spectroscopy (FD-NIRS) device to predict changes in invasively measured brain tissue oxygen tension. Individuals admitted to a United Kingdom specialist major trauma centre were contemporaneously monitored with an FD-NIRS device and an invasive brain tissue oxygen tension probe. Area under the curve receiver operating characteristic (AUROC) statistical analysis was utilised to assess the predictive power of FD-NIRS in detecting both moderate and severe hypoxia (20 and 10 mmHg, respectively), as measured invasively. Sixteen individuals were prospectively recruited to the investigation. Severe hypoxic episodes were detected in 9 of these individuals, with the NIRS demonstrating a broad range of predictive abilities (AUROC 0.68-0.88), from relatively poor to good. Moderate hypoxic episodes were detected in seven individuals, with similar predictive performance (AUROC 0.576-0.905). A variable performance in the predictive power of this FD-NIRS device to detect changes in brain tissue oxygen was demonstrated. Consequently, this enhanced NIRS technology has not demonstrated sufficient ability to replace the established invasive measurement.

  15. [Lymphocytic infiltration in uveal melanoma].

    PubMed

    Sach, J; Kocur, J

    1993-11-01

    After our observation of lymphocytic infiltration in uveal melanomas, we present a theoretical review of this interesting topic. Due to the relatively low incidence of this feature, we do not yet have a sufficiently large collection of cases to present statistically significant conclusions.

  16. BioCapacitor: A novel principle for biosensors.

    PubMed

    Sode, Koji; Yamazaki, Tomohiko; Lee, Inyoung; Hanashi, Takuya; Tsugawa, Wakako

    2016-02-15

    Studies regarding biofuel cells utilizing biocatalysts such as enzymes and microorganisms as electrocatalysts have been vigorously conducted over the last two decades. Because of their environmental safety and sustainability, biofuel cells are expected to be used as clean power generators. Among several principles of biofuel cells, enzyme fuel cells have attracted significant attention for their use as alternative energy sources for future implantable devices, such as implantable insulin pumps and glucose sensors in artificial pancreas and pacemakers. However, the inherent issue of the biofuel cell principle is the low power of a single biofuel cell. The theoretical voltage of biofuel cells is limited by the redox potential of cofactors and/or mediators employed in the anode and cathode, which are inadequate for operating any devices used for biomedical application. These limitations inspired us to develop a novel biodevice based on an enzyme fuel cell that generates sufficient stable power to operate electric devices, designated "BioCapacitor." To increase voltage, the enzyme fuel cell is connected to a charge pump. To obtain a sufficient power and voltage to operate an electric device, a capacitor is used to store the potential generated by the charge pump. Using the combination of a charge pump and capacitor with an enzyme fuel cell, high voltages with sufficient temporary currents to operate an electric device were generated without changing the design and construction of the enzyme fuel cell. In this review, the BioCapacitor principle is described. The three different representative categories of biodevices employing the BioCapacitor principle are introduced. Further, the recent challenges in the developments of self-powered stand-alone biodevices employing enzyme fuel cells combined with charge pumps and capacitors are introduced. Finally, the future prospects of biodevices employing the BioCapacitor principle are addressed. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Waveguide Multimode Directional Coupler for Harvesting Harmonic Power from the Output of Traveling-Wave Tube Amplifiers

    NASA Technical Reports Server (NTRS)

    Simons, Rainee N.; Wintucky, Edwin G.

    2017-01-01

    This paper presents the design, fabrication, and test results for a novel waveguide multimode directional coupler (MDC). The coupler, fabricated from dissimilar frequency band waveguides, is capable of isolating power at the second-harmonic frequency from the fundamental power at the output port of traveling-wave tube amplifiers. Test results from proof-of-concept demonstrations are presented for Ku/Ka-band and Ka/E-band MDCs, which demonstrate sufficient power in the second harmonic for a spaceborne beacon source for mm-wave atmospheric propagation studies.

  18. High-power and highly efficient diode-cladding-pumped holmium-doped fluoride fiber laser operating at 2.94 µm.

    PubMed

    Jackson, Stuart D

    2009-08-01

    A high-power diode-cladding-pumped Ho(3+), Pr(3+)-doped fluoride glass fiber laser is demonstrated. The laser produced a maximum output power of 2.5 W at a slope efficiency of 32% using diode lasers emitting at 1,150 nm. The long emission wavelength of 2.94 µm measured at maximum pump power, which is particularly suited to medical applications, indicates that tailoring the proportion of Pr(3+) ions can provide specific emission wavelengths while providing sufficient de-excitation of the lower laser level.

  19. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    PubMed

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
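
    A rough Python counterpart of one of the listed procedures (power for a test of a single correlation), using the Fisher z approximation rather than G*Power's exact routines:

```python
import numpy as np
from scipy.stats import norm

# Approximate power for testing H0: rho = 0 against a two-sided alternative,
# using the Fisher z transformation of the sample correlation.
def correlation_power(rho, n, alpha=0.05):
    z_rho = np.arctanh(rho)                 # Fisher z of the population correlation
    se = 1.0 / np.sqrt(n - 3)
    z_crit = norm.ppf(1 - alpha / 2)
    # probability that |z_hat| / se exceeds the critical value under H1
    return norm.sf(z_crit - z_rho / se) + norm.cdf(-z_crit - z_rho / se)

for n in (30, 50, 100, 200):
    print(f"n={n:4d}: power to detect rho=0.3 ~ {correlation_power(0.3, n):.2f}")
```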

  20. How Statisticians Speak Risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Redus, K.S.

    2007-07-01

    The foundation of statistics deals with (a) how to measure and collect data and (b) how to identify models using estimates of statistical parameters derived from the data. Risk is a term used by the statistical community and those that employ statistics to express the results of a statistically based study. Statistical risk is represented as a probability that, for example, a statistical model is sufficient to describe a data set; but risk is also interpreted as a measure of worth of one alternative when compared to another. The common thread of any risk-based problem is the combination of (a) the chance an event will occur, with (b) the value of the event. This paper presents an introduction to, and some examples of, statistical risk-based decision making from a quantitative, visual, and linguistic sense. This should help in understanding areas of radioactive waste management that can be suitably expressed using statistical risk and vice-versa. (authors)

  1. Metacontrast Inferred from Reaction Time and Verbal Report: Replication and Comments on the Fehrer-Biederman Experiment

    ERIC Educational Resources Information Center

    Amundson, Vickie E.; Bernstein, Ira H.

    1973-01-01

    Authors note that Fehrer and Biederman's two statistical tests were not of equal power and that their conclusion could be a statistical artifact of both the lesser power of the verbal report comparison and the insensitivity of their particular verbal report indicator. (Editor)

  2. Prudence and Technology

    ERIC Educational Resources Information Center

    Weinberg, Alvin M.

    1971-01-01

    Argues that perfected technology, not a neo-Luddite response, is necessary for the solution of world food and resource problems. Although energy supply will ultimately limit available food, reactors can supply sufficient power for a population of 15 billion. (AL)

  3. Specious causal attributions in the social sciences: the reformulated stepping-stone theory of heroin use as exemplar.

    PubMed

    Baumrind, D

    1983-12-01

    The claims based on causal models employing either statistical or experimental controls are examined and found to be excessive when applied to social or behavioral science data. An exemplary case, in which strong causal claims are made on the basis of a weak version of the regularity model of cause, is critiqued. O'Donnell and Clayton claim that in order to establish that marijuana use is a cause of heroin use (their "reformulated stepping-stone" hypothesis), it is necessary and sufficient to demonstrate that marijuana use precedes heroin use and that the statistically significant association between the two does not vanish when the effects of other variables deemed to be prior to both of them are removed. I argue that O'Donnell and Clayton's version of the regularity model is not sufficient to establish cause and that the planning of social interventions both presumes and requires a generative rather than a regularity causal model. Causal modeling using statistical controls is of value when it compels the investigator to make explicit and to justify a causal explanation but not when it is offered as a substitute for a generative analysis of causal connection.

  4. Got power? A systematic review of sample size adequacy in health professions education research.

    PubMed

    Cook, David A; Hatala, Rose

    2015-03-01

    Many education research studies employ small samples, which in turn lowers statistical power. We re-analyzed the results of a meta-analysis of simulation-based education to determine study power across a range of effect sizes, and the smallest effect that could be plausibly excluded. We systematically searched multiple databases through May 2011, and included all studies evaluating simulation-based education for health professionals in comparison with no intervention or another simulation intervention. Reviewers working in duplicate abstracted information to calculate standardized mean differences (SMDs). We included 897 original research studies. Among the 627 no-intervention-comparison studies, the median sample size was 25. Only two studies (0.3%) had ≥80% power to detect a small difference (SMD > 0.2 standard deviations) and 136 (22%) had power to detect a large difference (SMD > 0.8). 110 no-intervention-comparison studies failed to find a statistically significant difference, but none excluded a small difference and only 47 (43%) excluded a large difference. Among 297 studies comparing alternate simulation approaches, the median sample size was 30. Only one study (0.3%) had ≥80% power to detect a small difference and 79 (27%) had power to detect a large difference. Of the 128 studies that did not detect a statistically significant effect, 4 (3%) excluded a small difference and 91 (71%) excluded a large difference. In conclusion, most education research studies are powered only to detect effects of large magnitude. For most studies that do not reach statistical significance, the possibility of large and important differences still exists.
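
    A back-of-the-envelope check of this conclusion, treating the median sample size of 25 as if it were 25 per comparison group (an optimistic, illustrative simplification) and using statsmodels:

```python
from statsmodels.stats.power import TTestIndPower

# What standardized mean difference (SMD) can a two-group comparison with
# ~25 participants per arm detect with 80% power at alpha = .05?
analysis = TTestIndPower()
detectable = analysis.solve_power(nobs1=25, power=0.80, alpha=0.05,
                                  ratio=1.0, alternative="two-sided")
print(f"minimum detectable SMD at n=25 per group: {detectable:.2f}")

# And the power such a study has against a genuinely small effect (SMD = 0.2):
print(f"power against SMD=0.2: {analysis.power(0.2, nobs1=25, alpha=0.05):.2f}")
```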

  5. Product plots.

    PubMed

    Wickham, Hadley; Hofmann, Heike

    2011-12-01

    We propose a new framework for visualising tables of counts, proportions and probabilities. We call our framework product plots, alluding to the computation of area as a product of height and width, and the statistical concept of generating a joint distribution from the product of conditional and marginal distributions. The framework, with extensions, is sufficient to encompass over 20 visualisations previously described in fields of statistical graphics and infovis, including bar charts, mosaic plots, treemaps, equal area plots and fluctuation diagrams. © 2011 IEEE
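
    The geometry behind the framework (area as the product of a marginal width and a conditional height) can be illustrated for a 2x2 table of made-up counts; this is only a schematic of the construction, not the authors' implementation.

```python
import numpy as np

# For a 2x2 table, widths encode the marginal distribution of one variable and
# heights the conditional distribution of the other, so each rectangle's area
# equals the joint proportion (area = width * height).
counts = np.array([[30, 10],    # rows: variable A, cols: variable B
                   [20, 40]])
total = counts.sum()

col_marginal = counts.sum(axis=0) / total          # widths: P(B)
cond_rows = counts / counts.sum(axis=0)            # heights: P(A | B)

x = 0.0
for j, width in enumerate(col_marginal):
    y = 0.0
    for i in range(counts.shape[0]):
        height = cond_rows[i, j]
        area = width * height                      # equals counts[i, j] / total
        print(f"cell A={i}, B={j}: x={x:.2f}, y={y:.2f}, "
              f"w={width:.2f}, h={height:.2f}, area={area:.3f}")
        y += height
    x += width
```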

  6. Power estimation using simulations for air pollution time-series studies

    PubMed Central

    2012-01-01

    Background: Estimation of power to assess associations of interest can be challenging for time-series studies of the acute health effects of air pollution because there are two dimensions of sample size (time-series length and daily outcome counts), and because these studies often use generalized linear models to control for complex patterns of covariation between pollutants and time trends, meteorology and possibly other pollutants. In general, statistical software packages for power estimation rely on simplifying assumptions that may not adequately capture this complexity. Here we examine the impact of various factors affecting power using simulations, with comparison of power estimates obtained from simulations with those obtained using statistical software. Methods: Power was estimated for various analyses within a time-series study of air pollution and emergency department visits using simulations for specified scenarios. Mean daily emergency department visit counts, model parameter value estimates and daily values for air pollution and meteorological variables from actual data (8/1/98 to 7/31/99 in Atlanta) were used to generate simulated daily outcome counts with specified temporal associations with air pollutants and randomly generated error based on a Poisson distribution. Power was estimated by conducting analyses of the association between simulated daily outcome counts and air pollution in 2000 data sets for each scenario. Power estimates from simulations and statistical software (G*Power and PASS) were compared. Results: In the simulation results, increasing time-series length and average daily outcome counts both increased power to a similar extent. Our results also illustrate the low power that can result from using outcomes with low daily counts or short time series, and the reduction in power that can accompany use of multipollutant models. Power estimates obtained using standard statistical software were very similar to those from the simulations when properly implemented; implementation, however, was not straightforward. Conclusions: These analyses demonstrate the similar impact on power of increasing time-series length versus increasing daily outcome counts, which has not previously been reported. Implementation of power software for these studies is discussed and guidance is provided. PMID:22995599
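
    A condensed sketch of this simulation strategy (illustrative rates and effect sizes, not the Atlanta values): simulate Poisson daily counts with a specified pollutant effect, refit the GLM on each simulated series, and take power as the fraction of runs with p < 0.05.

```python
import numpy as np
import statsmodels.api as sm

# Simulation-based power estimate for a Poisson time-series model with a
# pollutant term and a crude seasonal trend. All parameter values are invented.
rng = np.random.default_rng(5)
n_days, n_sims = 365, 500
baseline_rate = 40.0                      # mean daily emergency department visits
true_rr_per_unit = 1.02                   # rate ratio per 1-unit pollutant increase

day = np.arange(n_days)
pollutant = rng.gamma(shape=4.0, scale=1.0, size=n_days)      # stand-in exposure
season = np.sin(2 * np.pi * day / 365)                        # crude time trend
X = sm.add_constant(np.column_stack([pollutant, season]))

log_mu = (np.log(baseline_rate)
          + np.log(true_rr_per_unit) * pollutant
          + 0.1 * season)

significant = 0
for _ in range(n_sims):
    counts = rng.poisson(np.exp(log_mu))
    fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
    if fit.pvalues[1] < 0.05:             # pollutant coefficient
        significant += 1

print(f"estimated power: {significant / n_sims:.2f}")
```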

  7. Power estimation using simulations for air pollution time-series studies.

    PubMed

    Winquist, Andrea; Klein, Mitchel; Tolbert, Paige; Sarnat, Stefanie Ebelt

    2012-09-20

    Estimation of power to assess associations of interest can be challenging for time-series studies of the acute health effects of air pollution because there are two dimensions of sample size (time-series length and daily outcome counts), and because these studies often use generalized linear models to control for complex patterns of covariation between pollutants and time trends, meteorology and possibly other pollutants. In general, statistical software packages for power estimation rely on simplifying assumptions that may not adequately capture this complexity. Here we examine the impact of various factors affecting power using simulations, with comparison of power estimates obtained from simulations with those obtained using statistical software. Power was estimated for various analyses within a time-series study of air pollution and emergency department visits using simulations for specified scenarios. Mean daily emergency department visit counts, model parameter value estimates and daily values for air pollution and meteorological variables from actual data (8/1/98 to 7/31/99 in Atlanta) were used to generate simulated daily outcome counts with specified temporal associations with air pollutants and randomly generated error based on a Poisson distribution. Power was estimated by conducting analyses of the association between simulated daily outcome counts and air pollution in 2000 data sets for each scenario. Power estimates from simulations and statistical software (G*Power and PASS) were compared. In the simulation results, increasing time-series length and average daily outcome counts both increased power to a similar extent. Our results also illustrate the low power that can result from using outcomes with low daily counts or short time series, and the reduction in power that can accompany use of multipollutant models. Power estimates obtained using standard statistical software were very similar to those from the simulations when properly implemented; implementation, however, was not straightforward. These analyses demonstrate the similar impact on power of increasing time-series length versus increasing daily outcome counts, which has not previously been reported. Implementation of power software for these studies is discussed and guidance is provided.

  8. Thermal electric vapor trap arrangement and method

    DOEpatents

    Alger, Terry

    1988-01-01

    A technique for trapping vapor within a section of a tube is disclosed herein. This technique utilizes a conventional, readily providable thermal electric device having a hot side and a cold side and means for powering the device to accomplish this. The cold side of this device is positioned sufficiently close to a predetermined section of the tube and is made sufficiently cold so that any condensable vapor passing through the predetermined tube section is condensed and trapped, preferably within the predetermined tube section itself.

  9. Thermal electric vapor trap arrangement and method

    DOEpatents

    Alger, T.

    1988-03-15

    A technique for trapping vapor within a section of a tube is disclosed herein. This technique utilizes a conventional, readily providable thermal electric device having a hot side and a cold side and means for powering the device to accomplish this. The cold side of this device is positioned sufficiently close to a predetermined section of the tube and is made sufficiently cold so that any condensable vapor passing through the predetermined tube section is condensed and trapped, preferably within the predetermined tube section itself. 4 figs.

  10. Gene-Based Association Analysis for Censored Traits Via Fixed Effect Functional Regressions.

    PubMed

    Fan, Ruzong; Wang, Yifan; Yan, Qi; Ding, Ying; Weeks, Daniel E; Lu, Zhaohui; Ren, Haobo; Cook, Richard J; Xiong, Momiao; Swaroop, Anand; Chew, Emily Y; Chen, Wei

    2016-02-01

    Genetic studies of survival outcomes have been proposed and conducted recently, but statistical methods for identifying genetic variants that affect disease progression are rarely developed. Motivated by our ongoing real studies, here we develop Cox proportional hazard models using functional regression (FR) to perform gene-based association analysis of survival traits while adjusting for covariates. The proposed Cox models are fixed effect models where the genetic effects of multiple genetic variants are assumed to be fixed. We introduce likelihood ratio test (LRT) statistics to test for associations between the survival traits and multiple genetic variants in a genetic region. Extensive simulation studies demonstrate that the proposed Cox FR LRT statistics have well-controlled type I error rates. To evaluate power, we compare the Cox FR LRT with the previously developed burden test (BT) in a Cox model and sequence kernel association test (SKAT), which is based on mixed effect Cox models. The Cox FR LRT statistics have higher power than or similar power as Cox SKAT LRT except when 50%/50% causal variants had negative/positive effects and all causal variants are rare. In addition, the Cox FR LRT statistics have higher power than Cox BT LRT. The models and related test statistics can be useful in the whole genome and whole exome association studies. An age-related macular degeneration dataset was analyzed as an example. © 2016 WILEY PERIODICALS, INC.

  11. Gene-based Association Analysis for Censored Traits Via Fixed Effect Functional Regressions

    PubMed Central

    Fan, Ruzong; Wang, Yifan; Yan, Qi; Ding, Ying; Weeks, Daniel E.; Lu, Zhaohui; Ren, Haobo; Cook, Richard J; Xiong, Momiao; Swaroop, Anand; Chew, Emily Y.; Chen, Wei

    2015-01-01

    Summary: Genetic studies of survival outcomes have been proposed and conducted recently, but statistical methods for identifying genetic variants that affect disease progression are rarely developed. Motivated by our ongoing real studies, we develop here Cox proportional hazard models using functional regression (FR) to perform gene-based association analysis of survival traits while adjusting for covariates. The proposed Cox models are fixed effect models where the genetic effects of multiple genetic variants are assumed to be fixed. We introduce likelihood ratio test (LRT) statistics to test for associations between the survival traits and multiple genetic variants in a genetic region. Extensive simulation studies demonstrate that the proposed Cox FR LRT statistics have well-controlled type I error rates. To evaluate power, we compare the Cox FR LRT with the previously developed burden test (BT) in a Cox model and sequence kernel association test (SKAT) which is based on mixed effect Cox models. The Cox FR LRT statistics have higher power than or similar power as Cox SKAT LRT except when 50%/50% causal variants had negative/positive effects and all causal variants are rare. In addition, the Cox FR LRT statistics have higher power than Cox BT LRT. The models and related test statistics can be useful in the whole genome and whole exome association studies. An age-related macular degeneration dataset was analyzed as an example. PMID:26782979

  12. The statistical overlap theory of chromatography using power law (fractal) statistics.

    PubMed

    Schure, Mark R; Davis, Joe M

    2011-12-30

    The chromatographic dimensionality was recently proposed as a measure of retention time spacing based on a power law (fractal) distribution. Using this model, a statistical overlap theory (SOT) for chromatographic peaks is developed that estimates the number of peak maxima as a function of the chromatographic dimension, saturation and scale. Power law models exhibit a threshold region whereby below a critical saturation value no loss of peak maxima due to peak fusion occurs as saturation increases. At moderate saturation, behavior is similar to the random (Poisson) peak model. At still higher saturation, the power law model shows loss of peaks nearly independent of the scale and dimension of the model. The physicochemical meaning of the power law scale parameter is discussed and shown to be equal to the Boltzmann-weighted free energy of transfer over the scale limits. The scale is discussed. Small scale range (small β) is shown to generate more uniform chromatograms. Large scale range chromatograms (large β) are shown to give occasional large excursions of retention times; this is a property of power laws where "wild" behavior is noted to occasionally occur. Both cases are shown to be useful depending on the chromatographic saturation. A scale-invariant model of the SOT shows very simple relationships between the fraction of peak maxima and the saturation, peak width and number of theoretical plates. These equations provide much insight into separations which follow power law statistics. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. The relation between statistical power and inference in fMRI

    PubMed Central

    Wager, Tor D.; Yarkoni, Tal

    2017-01-01

    Statistically underpowered studies can result in experimental failure even when all other experimental considerations have been addressed impeccably. In fMRI the combination of a large number of dependent variables, a relatively small number of observations (subjects), and a need to correct for multiple comparisons can decrease statistical power dramatically. This problem has been clearly addressed yet remains controversial—especially in regards to the expected effect sizes in fMRI, and especially for between-subjects effects such as group comparisons and brain-behavior correlations. We aimed to clarify the power problem by considering and contrasting two simulated scenarios of such possible brain-behavior correlations: weak diffuse effects and strong localized effects. Sampling from these scenarios shows that, particularly in the weak diffuse scenario, common sample sizes (n = 20–30) display extremely low statistical power, poorly represent the actual effects in the full sample, and show large variation on subsequent replications. Empirical data from the Human Connectome Project resembles the weak diffuse scenario much more than the localized strong scenario, which underscores the extent of the power problem for many studies. Possible solutions to the power problem include increasing the sample size, using less stringent thresholds, or focusing on a region-of-interest. However, these approaches are not always feasible and some have major drawbacks. The most prominent solutions that may help address the power problem include model-based (multivariate) prediction methods and meta-analyses with related synthesis-oriented approaches. PMID:29155843
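
    A toy simulation in the spirit of the scenarios discussed above (the effect sizes and sample sizes here are illustrative assumptions, not the paper's) shows how quickly power falls for brain-behavior correlations at common fMRI sample sizes.

    ```python
    # Hedged sketch: power of a single Pearson brain-behavior correlation test
    # for a weak (r = 0.2) and a strong (r = 0.5) true effect.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n_sims = 2000
    for true_r in (0.2, 0.5):
        for n in (20, 30, 100):
            hits = 0
            for _ in range(n_sims):
                x = rng.standard_normal(n)
                y = true_r * x + np.sqrt(1 - true_r**2) * rng.standard_normal(n)
                r, p = stats.pearsonr(x, y)
                hits += p < 0.05
            print(f"true r={true_r}, n={n}: power ~ {hits / n_sims:.2f}")
    ```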

  14. Nuclear Power for Catalonia: The Role of the Official Chamber of Industry of Barcelona, 1953-1962

    ERIC Educational Resources Information Center

    Salom, Francesc X. Barca

    2005-01-01

    Between 1939 and 1959, the regime led by General Franco pursued a policy of economic self-sufficiency. This policy inflicted great injury on Spanish science and industry, not least in Catalonia, and in its capital, Barcelona. In response, Catalan industry looked to a future made more promising by the advent of nuclear power. This paper describes…

  15. Broadband Sources in the 1-3 THz Range

    NASA Technical Reports Server (NTRS)

    Mehdi, Imran; Ward, John; Maestrini, Alain; Chattopadhyay, Goutam; Schlecht, Erich; Thomas, Bertrand; Lin, Robert; Lee, Choonsup; Gill, John

    2009-01-01

    Broadband electronically tunable sources in the terahertz range are a critical technology for enabling space-borne as well as ground-based applications. By power-combining MMIC amplifier and frequency tripler chips, we have recently demonstrated >1 mW of output power at 900 GHz. This source provides a stepping stone to enable sources in the 2-3 THz range that can sufficiently pump multi-pixel imaging arrays.

  16. Integrated Renewable Hydrogen Utility System (IRHUS) business plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1999-03-01

    This business plan is for a proposed legal entity named IRHUS, Inc. which is to be formed as a subsidiary of Energy Partners, L.C. (EP) of West Palm Beach, Florida. EP is a research and development company specializing in hydrogen proton exchange membrane (PEM) fuel cells and systems. A fuel cell is an engine with no moving parts that takes in hydrogen and produces electricity. The purpose of IRHUS, Inc. is to develop and manufacture a self-sufficient energy system based on the fuel cell and other new technology that produces hydrogen and electricity. The product is called the Integrated Renewable Hydrogen Utility System (IRHUS). IRHUS, Inc. plans to start limited production of the IRHUS in 2002. The IRHUS is a unique product with an innovative concept in that it provides continuous electrical power in places with no electrical infrastructure, i.e., in remote and island locations. The IRHUS is a zero emissions, self-sufficient, hydrogen fuel generation system that produces electricity on a continuous basis by combining any renewable power source with hydrogen technology. Current plans are to produce a 10 kilowatt IRHUS MP (medium power). Future plans are to design and manufacture IRHUS models to provide power for a variety of power ranges for identified attractive market segments. The technological components of the IRHUS include an electrolyzer, hydrogen and oxygen storage subsystems, fuel cell system, and power control system. The IRHUS product is to be integrated with a variety of renewable energy technologies. 5 figs., 10 tabs.

  17. Comparison of Time-to-First Event and Recurrent Event Methods in Randomized Clinical Trials.

    PubMed

    Claggett, Brian; Pocock, Stuart; Wei, L J; Pfeffer, Marc A; McMurray, John J V; Solomon, Scott D

    2018-03-27

    Background: Most Phase-3 trials feature time-to-first event endpoints for their primary and/or secondary analyses. In chronic diseases where a clinical event can occur more than once, recurrent-event methods have been proposed to more fully capture disease burden and have been assumed to improve statistical precision and power compared to conventional "time-to-first" methods. Methods: To better characterize factors that influence statistical properties of recurrent-events and time-to-first methods in the evaluation of randomized therapy, we repeatedly simulated trials with 1:1 randomization of 4000 patients to active vs control therapy, with true patient-level risk reduction of 20% (i.e. RR=0.80). For patients who discontinued active therapy after a first event, we assumed their risk reverted subsequently to their original placebo-level risk. Through simulation, we varied a) the degree of between-patient heterogeneity of risk and b) the extent of treatment discontinuation. Findings were compared with those from actual randomized clinical trials. Results: As the degree of between-patient heterogeneity of risk was increased, both time-to-first and recurrent-events methods lost statistical power to detect a true risk reduction and confidence intervals widened. The recurrent-events analyses continued to estimate the true RR=0.80 as heterogeneity increased, while the Cox model produced estimates that were attenuated. The power of recurrent-events methods declined as the rate of study drug discontinuation post-event increased. Recurrent-events methods provided greater power than time-to-first methods in scenarios where drug discontinuation was ≤30% following a first event, lesser power with drug discontinuation rates of ≥60%, and comparable power otherwise. We confirmed in several actual trials in chronic heart failure that treatment effect estimates were attenuated when estimated via the Cox model and that increased statistical power from recurrent-events methods was most pronounced in trials with lower treatment discontinuation rates. Conclusions: We find that the statistical power of both recurrent-events and time-to-first methods is reduced by increasing heterogeneity of patient risk, a parameter not included in conventional power and sample size formulas. Data from real clinical trials are consistent with simulation studies, confirming that the greatest statistical gains from use of recurrent-events methods occur in the presence of high patient heterogeneity and low rates of study drug discontinuation.
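
    A greatly simplified simulation in the same spirit is sketched below: gamma frailties supply between-patient heterogeneity, the true rate ratio is 0.8, and follow-up is equal for all patients. The two analyses are crude stand-ins for first-event and recurrent-event comparisons (a proportions test and Poisson regression with robust variance), not the trial designs or methods used by the authors.

    ```python
    # Hedged sketch: first-event vs count-based recurrent-event power under
    # gamma frailty heterogeneity (illustrative parameters only).
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.proportion import proportions_ztest

    rng = np.random.default_rng(2)
    n_per_arm, base_rate, rr, frailty_var, n_sims = 2000, 0.5, 0.8, 1.0, 300

    p_first, p_recurrent = 0, 0
    treat = np.r_[np.zeros(n_per_arm), np.ones(n_per_arm)]
    X = sm.add_constant(treat)
    for _ in range(n_sims):
        frailty = rng.gamma(1 / frailty_var, frailty_var, size=2 * n_per_arm)
        rate = frailty * base_rate * rr ** treat
        counts = rng.poisson(rate)                  # events during follow-up
        any_event = counts > 0

        # "time-to-first"-style analysis: difference in first-event proportions
        stat, p1 = proportions_ztest([any_event[treat == 1].sum(),
                                      any_event[treat == 0].sum()],
                                     [n_per_arm, n_per_arm])
        # recurrent-event-style analysis: rate ratio from counts, robust variance
        fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit(cov_type="HC0")
        p_first += p1 < 0.05
        p_recurrent += fit.pvalues[1] < 0.05

    print(f"power, first-event analysis:     {p_first / n_sims:.2f}")
    print(f"power, recurrent-event analysis: {p_recurrent / n_sims:.2f}")
    ```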

  18. 30 CFR 57.6405 - Firing devices.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... sufficient current to energize all electric detonators to be fired with the type of circuits used. Storage or dry cell batteries are not permitted as power sources. (b) Blasting machines shall be tested, repaired...

  19. 30 CFR 57.6405 - Firing devices.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... sufficient current to energize all electric detonators to be fired with the type of circuits used. Storage or dry cell batteries are not permitted as power sources. (b) Blasting machines shall be tested, repaired...

  20. 30 CFR 57.6405 - Firing devices.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sufficient current to energize all electric detonators to be fired with the type of circuits used. Storage or dry cell batteries are not permitted as power sources. (b) Blasting machines shall be tested, repaired...

  1. 30 CFR 57.6405 - Firing devices.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... sufficient current to energize all electric detonators to be fired with the type of circuits used. Storage or dry cell batteries are not permitted as power sources. (b) Blasting machines shall be tested, repaired...

  2. 30 CFR 57.6405 - Firing devices.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... sufficient current to energize all electric detonators to be fired with the type of circuits used. Storage or dry cell batteries are not permitted as power sources. (b) Blasting machines shall be tested, repaired...

  3. 33 CFR 155.1125 - Additional response plan requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of sufficient numbers of trained personnel with the necessary technical skills to remove, to the... recommended procedures, to include— (A) Start-up and running under load of all electrical motors, pumps, power...

  4. Radiographic screen-film noise power spectrum: variation with microdensitometer slit length.

    PubMed

    Sandrik, J M; Wagner, R F

    1981-08-15

    When the noise power spectrum (NPS) of a radiographic screen-film system is measured by microdensitometrically scanning the film with a long narrow slit, sufficient slit length allows estimation of a section of the 2-D NPS from the 1-D film scans; insufficient length causes underestimation of the NPS, particularly at low frequencies (≳1 cycle/mm). Spectra of Hi-Plus, Par Speed, and Detail screens used with XRP films measured as a function of microdensitometer slit length tended to plateau at long slit lengths. The slit length was considered sufficient when NPS components at 0.4 cycle/mm were within 5% of the plateau. This occurred for slit lengths of at least 4.2, 2.6, and 2.5 mm for Hi-Plus, Par Speed, and Detail systems, respectively.

  5. A Note on Comparing the Power of Test Statistics at Low Significance Levels.

    PubMed

    Morris, Nathan; Elston, Robert

    2011-01-01

    It is an obvious fact that the power of a test statistic is dependent upon the significance (alpha) level at which the test is performed. It is perhaps a less obvious fact that the relative performance of two statistics in terms of power is also a function of the alpha level. Through numerous personal discussions, we have noted that even some competent statisticians have the mistaken intuition that relative power comparisons at traditional levels such as α = 0.05 will be roughly similar to relative power comparisons at very low levels, such as the level α = 5 × 10⁻⁸, which is commonly used in genome-wide association studies. In this brief note, we demonstrate that this notion is in fact quite wrong, especially with respect to comparing tests with differing degrees of freedom. In fact, at very low alpha levels the cost of additional degrees of freedom is often comparatively low. Thus we recommend that statisticians exercise caution when interpreting the results of power comparison studies which use alpha levels that will not be used in practice.
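
    The point is easy to reproduce numerically. The sketch below, with assumed noncentrality parameters, compares chi-square tests with 1 and 4 degrees of freedom at α = 0.05 and at the genome-wide level α = 5 × 10⁻⁸, so the reader can see how the gap between the two tests changes with the alpha level.

    ```python
    # Hedged sketch: power of central-threshold chi-square tests with different
    # degrees of freedom but the same noncentrality, at two alpha levels.
    from scipy.stats import chi2, ncx2

    for alpha in (0.05, 5e-8):
        crit = {df: chi2.ppf(1 - alpha, df) for df in (1, 4)}
        for nc in (10.0, 40.0):            # noncentrality (signal strength)
            for df in (1, 4):
                power = ncx2.sf(crit[df], df, nc)
                print(f"alpha={alpha:g}, df={df}, ncp={nc}: power={power:.3f}")
    ```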

  6. The role of reference in cross-situational word learning.

    PubMed

    Wang, Felix Hao; Mintz, Toben H

    2018-01-01

    Word learning involves massive ambiguity, since in a particular encounter with a novel word, there are an unlimited number of potential referents. One proposal for how learners surmount the problem of ambiguity is that learners use cross-situational statistics to constrain the ambiguity: When a word and its referent co-occur across multiple situations, learners will associate the word with the correct referent. Yu and Smith (2007) propose that these co-occurrence statistics are sufficient for word-to-referent mapping. Alternative accounts hold that co-occurrence statistics alone are insufficient to support learning, and that learners are further guided by knowledge that words are referential (e.g., Waxman & Gelman, 2009). However, no behavioral word learning studies we are aware of explicitly manipulate subjects' prior assumptions about the role of the words in the experiments in order to test the influence of these assumptions. In this study, we directly test whether, when faced with referential ambiguity, co-occurrence statistics are sufficient for word-to-referent mappings in adult word-learners. Across a series of cross-situational learning experiments, we varied the degree to which there was support for the notion that the words were referential. At the same time, the statistical information about the words' meanings was held constant. When we overrode support for the notion that words were referential, subjects failed to learn the word-to-referent mappings, but otherwise they succeeded. Thus, cross-situational statistics were useful only when learners had the goal of discovering mappings between words and referents. We discuss the implications of these results for theories of word learning in children's language acquisition. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Interventions for reducing self-stigma in people with mental illnesses: a systematic review of randomized controlled trials

    PubMed Central

    Büchter, Roland Brian; Messer, Melanie

    2017-01-01

    Background: Self-stigma occurs when people with mental illnesses internalize negative stereotypes and prejudices about their condition. It can reduce help-seeking behaviour and treatment adherence. The effectiveness of interventions aimed at reducing self-stigma in people with mental illness is systematically reviewed. Results are discussed in the context of a logic model of the broader social context of mental illness stigma. Methods: Medline, Embase, PsycINFO, ERIC, and CENTRAL were searched for randomized controlled trials in November 2013. Studies were assessed with the Cochrane risk of bias tool. Results: Five trials were eligible for inclusion, four of which provided data for statistical analyses. Four studies had a high risk of bias. The quality of evidence was very low for each set of interventions and outcomes. The interventions studied included various group based anti-stigma interventions and an anti-stigma booklet. The intensity and fidelity of most interventions was high. Two studies were considered to be sufficiently homogeneous to be pooled for the outcome self-stigma. The meta-analysis did not find a statistically significant effect (SMD [95% CI] at 3 months: –0.26 [–0.64, 0.12], I2=0%, n=108). None of the individual studies found sustainable effects on other outcomes, including recovery, help-seeking behaviour and self-stigma. Conclusions: The effectiveness of interventions against self-stigma is uncertain. Previous studies lacked statistical power, used questionable outcome measures and had a high risk of bias. Future studies should be based on robust methods and consider practical implications regarding intervention development (relevance, implementability, and placement in routine services). PMID:28496396

  8. A framework for relating the structures and recovery statistics in pressure time-series surveys for dust devils

    NASA Astrophysics Data System (ADS)

    Jackson, Brian; Lorenz, Ralph; Davis, Karan

    2018-01-01

    Dust devils are likely the dominant source of dust for the martian atmosphere, but the amount and frequency of dust-lifting depend on the statistical distribution of dust devil parameters. Dust devils exhibit pressure perturbations and, if they pass near a barometric sensor, they may register as a discernible dip in a pressure time-series. Leveraging this fact, several surveys using barometric sensors on landed spacecraft have revealed dust devil structures and occurrence rates. However powerful they are, though, such surveys suffer from non-trivial biases that skew the inferred dust devil properties. For example, such surveys are most sensitive to dust devils with the widest and deepest pressure profiles, but the recovered profiles will be distorted, broader and shallow than the actual profiles. In addition, such surveys often do not provide wind speed measurements alongside the pressure time series, and so the durations of the dust devil signals in the time series cannot be directly converted to profile widths. Fortunately, simple statistical and geometric considerations can de-bias these surveys, allowing conversion of the duration of dust devil signals into physical widths, given only a distribution of likely translation velocities, and the recovery of the underlying distributions of physical parameters. In this study, we develop a scheme for de-biasing such surveys. Applying our model to an in-situ survey using data from the Phoenix lander suggests a larger dust flux and a dust devil occurrence rate about ten times larger than previously inferred. Comparing our results to dust devil track surveys suggests only about one in five low-pressure cells lifts sufficient dust to leave a visible track.

  9. Biosimilarity and Interchangeability: Principles and Evidence: A Systematic Review.

    PubMed

    McKinnon, Ross A; Cook, Matthew; Liauw, Winston; Marabani, Mona; Marschner, Ian C; Packer, Nicolle H; Prins, Johannes B

    2018-02-01

    The efficacy, safety and immunogenicity risk of switching between an originator biologic and a biosimilar or from one biosimilar to another are of potential concern. The aim was to conduct a systematic literature review of the outcomes of switching between biologics and their biosimilars and identify any evidence gaps. A systematic literature search was conducted in PubMed, EMBASE and Cochrane Library from inception to June 2017. Relevant societal meetings were also checked. Peer-reviewed studies reporting efficacy and/or safety data on switching between originator and biosimilar products or from one biosimilar to another were selected. Studies with fewer than 20 switched patients were excluded. Data were extracted on interventions, study population, reason for treatment switching, efficacy outcomes, safety and anti-drug antibodies. The systematic literature search identified 63 primary publications covering 57 switching studies. The reason for switching was reported as non-medical in 50 studies (23 clinical, 27 observational). Seven studies (all observational) did not report whether the reasons for switching were medical or non-medical. In 38 of the 57 studies, fewer than 100 patients were switched. Follow-up after switching went beyond 1 year in eight of the 57 studies. Of the 57 studies, 33 included statistical analysis of disease activity or patient outcomes; the majority of these studies found no statistically significant differences between groups for main efficacy parameters (based on P < 0.05 or predefined acceptance ranges), although some studies observed changes for some parameters. Most studies reported similar safety profiles between groups. There are important evidence gaps around the safety of switching between biologics and their biosimilars. Sufficiently powered and appropriately statistically analysed clinical trials and pharmacovigilance studies, with long-term follow-ups and multiple switches, are needed to support decision-making around biosimilar switching.

  10. Immunochip Analyses of Epistasis in Rheumatoid Arthritis Confirm Multiple Interactions within MHC and Suggest Novel Non-MHC Epistatic Signals.

    PubMed

    Wei, Wen-Hua; Loh, Chia-Yin; Worthington, Jane; Eyre, Stephen

    2016-05-01

    Studying statistical gene-gene interactions (epistasis) has been limited by the difficulties in performance, both statistically and computationally, in large enough sample numbers to gain sufficient power. Three large Immunochip datasets from cohort samples recruited in the United Kingdom, United States, and Sweden with European ancestry were used to examine epistasis in rheumatoid arthritis (RA). A full pairwise search was conducted in the UK cohort using a high-throughput tool and the resultant significant epistatic signals were tested for replication in the United States and Swedish cohorts. A forward selection approach was applied to remove redundant signals, while conditioning on the preidentified additive effects. We detected abundant genome-wide significant (p < 1.0e-13) epistatic signals, all within the MHC region. These signals were reduced substantially, but a proportion remained significant (p < 1.0e-03) in conditional tests. We identified 11 independent epistatic interactions across the entire MHC, each explaining on average 0.12% of the phenotypic variance, nearly all replicated in both replication cohorts. We also identified non-MHC epistatic interactions between RA susceptible loci LOC100506023 and IRF5 with Immunochip-wide significance (p < 1.1e-08) and between 2 neighboring single-nucleotide polymorphism near PTPN22 that were in low linkage disequilibrium with independent interaction (p < 1.0e-05). Both non-MHC epistatic interactions were statistically replicated with a similar interaction pattern in the US cohort only. There are multiple but relatively weak interactions independent of the additive effects in RA and a larger sample number is required to confidently assign additional non-MHC epistasis.
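
    For a single candidate pair, an epistasis test of the kind run genome-wide above can be sketched as a likelihood-ratio test between logistic models with and without an interaction term. The genotypes and effect sizes below are simulated for illustration, not Immunochip data.

    ```python
    # Hedged sketch: SNP-by-SNP interaction LRT on simulated case-control data.
    import numpy as np
    import statsmodels.api as sm
    from scipy import stats

    rng = np.random.default_rng(7)
    n = 5000
    snp1 = rng.binomial(2, 0.3, n).astype(float)
    snp2 = rng.binomial(2, 0.2, n).astype(float)
    logit = -1.0 + 0.2 * snp1 + 0.1 * snp2 + 0.15 * snp1 * snp2
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X_add = sm.add_constant(np.column_stack([snp1, snp2]))               # additive only
    X_int = sm.add_constant(np.column_stack([snp1, snp2, snp1 * snp2]))  # + interaction
    ll_add = sm.Logit(y, X_add).fit(disp=0).llf
    ll_int = sm.Logit(y, X_int).fit(disp=0).llf
    lrt = 2 * (ll_int - ll_add)
    p = stats.chi2.sf(lrt, df=1)
    print(f"interaction LRT = {lrt:.2f}, p = {p:.2g}")
    ```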

  11. Detecting higher spin fields through statistical anisotropy in the CMB and galaxy power spectra

    NASA Astrophysics Data System (ADS)

    Bartolo, Nicola; Kehagias, Alex; Liguori, Michele; Riotto, Antonio; Shiraishi, Maresuke; Tansella, Vittorio

    2018-01-01

    Primordial inflation may represent the most powerful collider to test high-energy physics models. In this paper we study the impact on the inflationary power spectrum of the comoving curvature perturbation in the specific model where massive higher spin fields are rendered effectively massless during a de Sitter epoch through suitable couplings to the inflaton field. In particular, we show that such fields with spin s induce a distinctive statistically anisotropic signal on the power spectrum, in such a way that not only the usual g2M statistical anisotropy coefficients, but also higher-order ones (i.e., g4M, g6M, …, g(2s-2)M and g(2s)M) are nonvanishing. We examine their imprints in the cosmic microwave background and galaxy power spectra. Our Fisher matrix forecasts indicate that the detectability of gLM depends very weakly on L: all coefficients could be detected in the near future if their magnitudes are bigger than about 10⁻³.
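
    For reference, the gLM coefficients above are conventionally defined through a spherical-harmonic expansion of the curvature power spectrum; the parameterization below is the standard one used in the statistical-anisotropy literature and is assumed here, not quoted from the paper.

    ```latex
    % Standard statistically anisotropic power spectrum parameterization
    % (convention assumed, not taken from the paper); the g_{LM} are the
    % anisotropy coefficients discussed in the abstract.
    P_\zeta(\mathbf{k}) \;=\; P_0(k)\left[\,1 + \sum_{L>0}\;\sum_{M=-L}^{L} g_{LM}\, Y_{LM}(\hat{\mathbf{k}})\,\right]
    ```

    For a spin-s field coupled in this way, the abstract indicates that the nonvanishing terms are the even multipoles up to L = 2s, which is why coefficients up to g(2s)M appear.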

  12. ["Wise be aware of your sayings"--about gaps between epidemiological data base, experimental results and decision making in health administration].

    PubMed

    Hefer, Elioz

    2005-12-01

    When scientific research is published, one should carefully consider the different possible influences that may change the results. These influences may be of two kinds: non-causal explanations and causal explanations. Researchers may arrive at their results without having considered all the causal explanations. Occam's Razor is the basic rule by which the most reasonable explanation is chosen. A statistical result together with a simple theory to explain it is not sufficient to prove a causal effect. In many cases, though, the media and the public tend to accept a statistically significant result as if it were a proven cause-and-effect relation. There are several conditions, the Bradford Hill criteria, of which epidemiological data and results are only one; the more of the Bradford Hill criteria that are satisfied, the better the chance that the examined variable is the cause of the effect. Finally, there is a gap between a proven causal factor for disease, or for the harmful effects of a treatment, and a "clear cut" health policy. Several powerful intermediate influences are involved in the process of setting a new health policy; among others, these include the involvement of decision makers, political influences and civil service professionals. As examples, three issues with well-proven clinical research will be presented: the research on the cardiac effects of rofecoxib ("Vioxx"), the research on the health effects of hormone replacement treatment in postmenopausal women, and the research on the health risks of mobile phone use. Although the results of these studies were statistically significant, the resulting health policy in each case is different and less clear. Health policy is not based solely on figures and statistical results, but rather on far wider and more complex influences and judgment.

  13. Arsenic exposure and bladder cancer: quantitative assessment of studies in human populations to detect risks at low doses.

    PubMed

    Tsuji, Joyce S; Alexander, Dominik D; Perez, Vanessa; Mink, Pamela J

    2014-03-20

    While exposures to high levels of arsenic in drinking water are associated with excess cancer risk (e.g., skin, bladder, and lung), exposures at lower levels (e.g., <100-200 µg/L) generally are not. Lack of significant associations may result from methodological issues (e.g., inadequate statistical power, exposure misclassification), or a different dose-response relationship at low exposures, possibly associated with a toxicological mode of action that requires a sufficient dose for increased tumor formation. The extent to which bladder cancer risk for low-level arsenic exposure can be statistically measured by epidemiological studies was examined using an updated meta-analysis of bladder cancer risk with data from two new publications. The summary relative risk estimate (SRRE) for all nine studies was elevated slightly, but not significantly (1.07; 95% confidence interval [CI]: 0.95-1.21, p-Heterogeneity [p-H]=0.543). The SRRE among never smokers was 0.85 (95% CI: 0.66-1.08, p-H=0.915), whereas the SRRE was positive and more heterogeneous among ever smokers (1.18; 95% CI: 0.97-1.44, p-H=0.034). The SRRE was statistically significantly lower than relative risks predicted for never smokers in the United States based on linear extrapolation of risks from higher doses in southwest Taiwan to arsenic water exposures >10 µg/L for more than one-third of a lifetime. By contrast, for all study subjects, relative risks predicted for one-half of lifetime exposure to 50 µg/L were just above the upper 95% CI on the SRRE. Thus, results from low-exposure studies, particularly for never smokers, were statistically inconsistent with predicted risk based on high-dose extrapolation. Additional studies that better characterize tobacco use and stratify analyses of arsenic and bladder cancer by smoking status are necessary to further examine risks of arsenic exposure for smokers. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.

  14. Modern Air&Space Power and political goals at war

    NASA Astrophysics Data System (ADS)

    Özer, Güngör.

    2014-05-01

    Modern Air and Space Power is increasingly becoming a political tool. This article discusses Air and Space Power as a political tool. Its primary purpose is to examine how Air and Space Power can contribute to security, and to determine whether it can achieve political goals on its own in war, using the SWOT analysis method and analysing the role of Air and Space Power in Operation Unified Protector (Libya) as a case study. In conclusion, Air and Space Power may not be sufficient to achieve political goals on its own. However, depending on the situation, it may partially achieve political aims against an adversary on its own. Moreover, it can by itself persuade the adversary to alter its behaviour in war.

  15. Design of fuel cell powered data centers for sufficient reliability and availability

    NASA Astrophysics Data System (ADS)

    Ritchie, Alexa J.; Brouwer, Jacob

    2018-04-01

    It is challenging to design a sufficiently reliable fuel cell electrical system for use in data centers, which require 99.9999% uptime. Such a system could lower emissions and increase data center efficiency, but the reliability and availability of such a system must be analyzed and understood. Currently, extensive backup equipment is used to ensure electricity availability. The proposed design alternative uses multiple fuel cell systems each supporting a small number of servers to eliminate backup power equipment provided the fuel cell design has sufficient reliability and availability. Potential system designs are explored for the entire data center and for individual fuel cells. Reliability block diagram analysis of the fuel cell systems was accomplished to understand the reliability of the systems without repair or redundant technologies. From this analysis, it was apparent that redundant components would be necessary. A program was written in MATLAB to show that the desired system reliability could be achieved by a combination of parallel components, regardless of the number of additional components needed. Having shown that the desired reliability was achievable through some combination of components, a dynamic programming analysis was undertaken to assess the ideal allocation of parallel components.
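
    The parallel-redundancy argument can be made concrete with a one-line reliability formula. The sketch below uses illustrative per-unit reliabilities (not figures from the study) and the 99.9999% target mentioned above: for independently failing units, a parallel group works if at least one unit works, so R = 1 - (1 - r)^n.

    ```python
    # Hedged sketch: smallest number of independent parallel units needed to
    # reach a system reliability target, for illustrative unit reliabilities.
    import math

    def units_needed(r_component: float, r_target: float) -> int:
        """Smallest n with 1 - (1 - r_component)**n >= r_target."""
        return math.ceil(math.log(1 - r_target) / math.log(1 - r_component))

    target = 0.999999          # the "six nines" figure cited above
    for r in (0.95, 0.99, 0.999):
        n = units_needed(r, target)
        print(f"unit reliability {r}: {n} parallel units "
              f"-> system reliability {1 - (1 - r) ** n:.8f}")
    ```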

  16. [Application of statistics on chronic-diseases-relating observational research papers].

    PubMed

    Hong, Zhi-heng; Wang, Ping; Cao, Wei-hua

    2012-09-01

    To study the application of statistics in chronic-disease-related observational research papers recently published in Chinese Medical Association journals with an influence index above 0.5. Using a self-developed assessment criterion, two investigators individually assessed the application of statistics in these journals; disagreements were resolved through discussion. A total of 352 papers from 6 journals, including the Chinese Journal of Epidemiology, Chinese Journal of Oncology, Chinese Journal of Preventive Medicine, Chinese Journal of Cardiology, Chinese Journal of Internal Medicine and Chinese Journal of Endocrinology and Metabolism, were reviewed. The rates of clearly stating the research objectives, target audience, sampling issues, inclusion criteria and variable definitions were 99.43%, 98.57%, 95.43%, 92.86% and 96.87%, respectively. The rates of correctly describing quantitative and qualitative data were 90.94% and 91.46%, respectively. The rates of correctly presenting the results of statistical inference for quantitative data, qualitative data and modeling were 100%, 95.32% and 87.19%, respectively. In 89.49% of the papers the conclusions responded directly to the research objectives. However, 69.60% of the papers did not state the exact name of the study design used, and 11.14% lacked a statement of the exclusion criteria. Only 5.16% of the papers clearly explained how the sample size was estimated, and only 24.21% clearly described the variable value assignment. Only 24.15% of the papers described how the statistical analyses and database management were conducted, and 18.75% did not describe the statistical inference methods sufficiently. A quarter of the papers did not use 'standardization' appropriately. Regarding statistical inference, only 24.12% of the papers described whether the prerequisites of the statistical tests were met, while 9.94% did not employ the statistical inference methods that should have been used. The main deficiencies in the application of statistics in chronic-disease-related observational research papers were as follows: lack of sample-size determination, insufficient description of variable value assignment, statistical methods not introduced clearly or properly, and lack of consideration of the prerequisites for statistical inference.

  17. Sufficient Forecasting Using Factor Models

    PubMed Central

    Fan, Jianqing; Xue, Lingzhou; Yao, Jiawei

    2017-01-01

    We consider forecasting a single time series when there is a large number of predictors and a possible nonlinear effect. The dimensionality was first reduced via a high-dimensional (approximate) factor model implemented by the principal component analysis. Using the extracted factors, we develop a novel forecasting method called the sufficient forecasting, which provides a set of sufficient predictive indices, inferred from high-dimensional predictors, to deliver additional predictive power. The projected principal component analysis will be employed to enhance the accuracy of inferred factors when a semi-parametric (approximate) factor model is assumed. Our method is also applicable to cross-sectional sufficient regression using extracted factors. The connection between the sufficient forecasting and the deep learning architecture is explicitly stated. The sufficient forecasting correctly estimates projection indices of the underlying factors even in the presence of a nonparametric forecasting function. The proposed method extends the sufficient dimension reduction to high-dimensional regimes by condensing the cross-sectional information through factor models. We derive asymptotic properties for the estimate of the central subspace spanned by these projection directions as well as the estimates of the sufficient predictive indices. We further show that the natural method of running multiple regression of target on estimated factors yields a linear estimate that actually falls into this central subspace. Our method and theory allow the number of predictors to be larger than the number of observations. We finally demonstrate that the sufficient forecasting improves upon the linear forecasting in both simulation studies and an empirical study of forecasting macroeconomic variables. PMID:29731537
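
    A compact sketch of the two-stage idea (principal-component factor extraction followed by sliced inverse regression on the estimated factors) is given below with simulated data. The dimensions and the nonlinear target are illustrative assumptions, and this is not the authors' implementation.

    ```python
    # Hedged sketch: PCA factors from a large predictor panel, then sliced
    # inverse regression on the factors to obtain sufficient predictive indices.
    import numpy as np

    rng = np.random.default_rng(3)
    T, p, K, H = 400, 100, 5, 10          # time points, predictors, factors, slices

    # simulate a factor model x_t = Lambda f_t + noise and a nonlinear target
    F = rng.standard_normal((T, K))
    Lam = rng.standard_normal((p, K))
    X = F @ Lam.T + rng.standard_normal((T, p))
    y = np.sin(F[:, 0]) + 0.5 * F[:, 1] ** 2 + 0.1 * rng.standard_normal(T)

    # stage 1: estimated factors = leading principal component scores of X
    Xc = X - X.mean(0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    F_hat = U[:, :K] * s[:K]

    # stage 2: sliced inverse regression on the (whitened) estimated factors
    L = np.linalg.cholesky(np.cov(F_hat.T))
    Z = (F_hat - F_hat.mean(0)) @ np.linalg.inv(L.T)
    order = np.argsort(y)
    M = np.zeros((K, K))
    for idx in np.array_split(order, H):
        m = Z[idx].mean(0)
        M += len(idx) / T * np.outer(m, m)
    eigval, eigvec = np.linalg.eigh(M)
    directions = eigvec[:, ::-1][:, :2]   # leading SIR directions
    indices = Z @ directions              # sufficient predictive indices
    print("leading SIR eigenvalues:", np.round(eigval[::-1][:3], 3))
    ```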

  18. Extended performance electric propulsion power processor design study. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Inouye, L. Y.; Schoenfeld, A. D.

    1977-01-01

    Several power processor design concepts were evaluated and compared. Emphasis was placed on a 30cm ion thruster power processor with a beam supply rating of 2.2kW to 10kW. Extensions in power processor performance were defined and were designed in sufficient detail to determine efficiency, component weight, part count, reliability and thermal control. Preliminary electrical design, mechanical design, and thermal analysis were performed on a 6kW power transformer for the beam supply. Bi-Mod mechanical, structural, and thermal control configurations were evaluated for the power processor, and preliminary estimates of mechanical weight were determined. A program development plan was formulated that outlines the work breakdown structure for the development, qualification and fabrication of the power processor flight hardware.

  19. Power-gated 32 bit microprocessor with a power controller circuit activated by deep-sleep-mode instruction achieving ultra-low power operation

    NASA Astrophysics Data System (ADS)

    Koike, Hiroki; Ohsawa, Takashi; Miura, Sadahiko; Honjo, Hiroaki; Ikeda, Shoji; Hanyu, Takahiro; Ohno, Hideo; Endoh, Tetsuo

    2015-04-01

    A spintronic-based power-gated micro-processing unit (MPU) is proposed. It includes a power control circuit activated by a newly supported power-off instruction for the deep-sleep mode, which allows the power-off procedure for the MPU to be executed appropriately. A test chip was designed and fabricated using a 90 nm CMOS process with an additional 100 nm MTJ process, and was operated successfully. A guideline for the energy reduction achievable with this MPU is presented, based on estimates derived from the test-chip measurements. The results show that a large reduction in operating energy, to 1/28, can be achieved when the operation duty is 10%, provided a sufficient number of idle clock cycles is available.

  20. The value of repeating studies and multiple controls: replicated 28-day growth studies of rainbow trout exposed to clofibric acid.

    PubMed

    Owen, Stewart F; Huggett, Duane B; Hutchinson, Thomas H; Hetheridge, Malcolm J; McCormack, Paul; Kinter, Lewis B; Ericson, Jon F; Constantine, Lisa A; Sumpter, John P

    2010-12-01

    Two studies to examine the effect of waterborne clofibric acid (CA) on growth-rate and condition of rainbow trout were conducted using accepted regulatory tests (Organisation for Economic Co-operation and Development [OECD] 215). The first study (in 2005) showed significant reductions after 21 d of exposure (21-d growth lowest-observed-effect concentration [LOEC] = 0.1 µg/L, 21-d condition LOEC = 0.1 µg/L) that continued to 28 d. Growth rate was reduced by approximately 50% (from 5.27 to 2.67% per day), while the condition of the fish reduced in a concentration-dependant manner. Additionally, in a concentration-dependent manner, significant changes in relative liver size were observed, such that increasing concentrations of CA resulted in smaller livers after 28-d exposure. A no-observed-effect concentration (NOEC) was not achieved in the 2005 study. An expanded second study (in 2006) that included a robust bridge to the 2005 study, with four replicate tanks of eight individual fish per concentration, did not repeat the 2005 findings. In the 2006 study, no significant effect on growth rate, condition, or liver biometry was observed after 21 or 28 d (28-d growth NOEC = 10 µg/L, 28-d condition NOEC = 10 µg/L), contrary to the 2005 findings. We do not dismiss either of these findings and suggest both are relevant and stand for comparison. However, the larger 2006 study carries more statistical power and multiple-tank replication, so probably produced the more robust findings. Despite sufficient statistical power in each study, interpretation of these and similar studies should be conducted with caution, because much significance is placed on the role of limited numbers of individual and tank replicates and the influence of control animals. Copyright © 2010 SETAC.

  1. WebDISCO: a web service for distributed cox model learning without patient-level data sharing.

    PubMed

    Lu, Chia-Lun; Wang, Shuang; Ji, Zhanglong; Wu, Yuan; Xiong, Li; Jiang, Xiaoqian; Ohno-Machado, Lucila

    2015-11-01

    The Cox proportional hazards model is a widely used method for analyzing survival data. Achieving sufficient statistical power in a survival analysis usually requires a large amount of data. Data sharing across institutions could be a potential workaround for providing this added power. The authors develop a web service for distributed Cox model learning (WebDISCO), which focuses on the proof-of-concept and algorithm development for federated survival analysis. The sensitive patient-level data can be processed locally and only the less-sensitive intermediate statistics are exchanged to build a global Cox model. Mathematical derivation shows that the proposed distributed algorithm is identical to the centralized Cox model. The authors evaluated the proposed framework at the University of California, San Diego (UCSD), Emory, and Duke. The experimental results show that both distributed and centralized models result in near-identical model coefficients with differences in the range [Formula: see text] to [Formula: see text]. The results confirm the mathematical derivation and show that the implementation of the distributed model can achieve the same results as the centralized implementation. The proposed method serves as a proof of concept, in which a publicly available dataset was used to evaluate the performance. The authors do not intend to suggest that this method can resolve policy and engineering issues related to the federated use of institutional data, but they should serve as evidence of the technical feasibility of the proposed approach. Conclusions: WebDISCO (Web-based Distributed Cox Regression Model; https://webdisco.ucsd-dbmi.org:8443/cox/) provides a proof-of-concept web service that implements a distributed algorithm to conduct distributed survival analysis without sharing patient level data. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
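
    The kind of aggregation WebDISCO describes can be illustrated on simulated data: each "site" transmits only risk-set sums at shared event times, and a coordinator assembles the global Breslow score and information matrix for Newton-Raphson updates. The sketch below is a toy proof of concept under those assumptions, not the WebDISCO code.

    ```python
    # Hedged sketch: federated Cox fitting from per-site aggregates only.
    import numpy as np

    rng = np.random.default_rng(4)

    def make_site(n, beta_true):
        """Simulate one site's private survival data under a true Cox model."""
        X = rng.standard_normal((n, len(beta_true)))
        t = rng.exponential(1.0 / np.exp(X @ beta_true))   # event times
        c = rng.exponential(2.0, n)                        # censoring times
        return X, np.minimum(t, c), (t <= c).astype(float)

    beta_true = np.array([0.5, -0.3])
    sites = [make_site(n, beta_true) for n in (150, 200, 250)]

    # shared, non-sensitive grid of distinct event times across all sites
    event_times = np.unique(np.concatenate([T[d == 1] for _, T, d in sites]))

    def site_aggregates(X, T, d, beta):
        """Quantities a site would transmit: risk-set sums and event sums at
        each shared event time (no patient-level rows leave the site)."""
        w = np.exp(X @ beta)
        a0 = np.array([w[T >= t].sum() for t in event_times])
        a1 = np.array([(w[T >= t, None] * X[T >= t]).sum(0) for t in event_times])
        a2 = np.array([(w[T >= t, None, None] * X[T >= t, :, None]
                        * X[T >= t, None, :]).sum(0) for t in event_times])
        dd = np.array([((T == t) & (d == 1)).sum() for t in event_times])
        sd = np.array([X[(T == t) & (d == 1)].sum(0) for t in event_times])
        return a0, a1, a2, dd, sd

    beta = np.zeros(2)
    for _ in range(8):                                     # Newton-Raphson
        parts = [site_aggregates(X, T, d, beta) for X, T, d in sites]
        A0 = sum(p[0] for p in parts)
        A1 = sum(p[1] for p in parts)
        A2 = sum(p[2] for p in parts)
        D = sum(p[3] for p in parts)
        S = sum(p[4] for p in parts)
        mu = A1 / A0[:, None]
        score = (S - D[:, None] * mu).sum(0)               # Breslow score
        info = sum(D[k] * (A2[k] / A0[k] - np.outer(mu[k], mu[k]))
                   for k in range(len(event_times)))       # information matrix
        beta = beta + np.linalg.solve(info, score)

    print("distributed Cox estimate:", np.round(beta, 3), " true:", beta_true)
    ```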

  2. Design and Control of a Pneumatically Actuated Transtibial Prosthesis.

    PubMed

    Zheng, Hao; Shen, Xiangrong

    2015-04-01

    This paper presents the design and control of a pneumatically actuated transtibial prosthesis, which utilizes a pneumatic cylinder-type actuator to power the prosthetic ankle joint to support the user's locomotion. The pneumatic actuator has multiple advantages over the traditional electric motor, such as light weight, low cost, and high power-to-weight ratio. The objective of this work is to develop a compact and lightweight transtibial prosthesis, leveraging the multiple advantages provided by this highly competitive actuator. In this paper, the design details of the prosthesis are described, including the determination of performance specifications, the layout of the actuation mechanism, and the calculation of the torque capacity. Through the authors' design calculation, the prosthesis is able to provide sufficient range of motion and torque capacity to support the locomotion of a 75 kg individual. The controller design is also described, including the underlying biomechanical analysis and the formulation of the finite-state impedance controller. Finally, the human subject testing results are presented, with the data indicating that the prosthesis is able to generate a natural walking gait and sufficient power output for its amputee user.
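
    A generic finite-state impedance controller of the kind described can be sketched in a few lines. The phase names, gains, and equilibrium angles below are illustrative placeholders, not the authors' values.

    ```python
    # Hedged sketch: phase-dependent impedance law for an ankle joint,
    # torque = k * (theta_eq - theta) - b * velocity.
    from dataclasses import dataclass

    @dataclass
    class PhaseImpedance:
        stiffness: float      # N*m/rad
        damping: float        # N*m*s/rad
        equilibrium: float    # rad

    PHASES = {
        "early_stance": PhaseImpedance(300.0, 5.0, 0.00),
        "late_stance":  PhaseImpedance(450.0, 3.0, -0.15),   # push-off
        "swing":        PhaseImpedance(50.0,  2.0, 0.05),
    }

    def ankle_torque(phase: str, angle: float, velocity: float) -> float:
        p = PHASES[phase]
        return p.stiffness * (p.equilibrium - angle) - p.damping * velocity

    print(ankle_torque("late_stance", angle=-0.05, velocity=0.8))
    ```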

  3. Design and Control of a Pneumatically Actuated Transtibial Prosthesis

    PubMed Central

    Zheng, Hao; Shen, Xiangrong

    2015-01-01

    This paper presents the design and control of a pneumatically actuated transtibial prosthesis, which utilizes a pneumatic cylinder-type actuator to power the prosthetic ankle joint to support the user's locomotion. The pneumatic actuator has multiple advantages over the traditional electric motor, such as light weight, low cost, and high power-to-weight ratio. The objective of this work is to develop a compact and lightweight transtibial prosthesis, leveraging the multiple advantages provided by this highly competitive actuator. In this paper, the design details of the prosthesis are described, including the determination of performance specifications, the layout of the actuation mechanism, and the calculation of the torque capacity. Through the authors’ design calculation, the prosthesis is able to provide sufficient range of motion and torque capacity to support the locomotion of a 75 kg individual. The controller design is also described, including the underlying biomechanical analysis and the formulation of the finite-state impedance controller. Finally, the human subject testing results are presented, with the data indicating that the prosthesis is able to generate a natural walking gait and sufficient power output for its amputee user. PMID:26146497

  4. Extended performance electric propulsion power processor design study. Volume 2: Technical summary

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Inouye, L. Y.; Schoenfeld, A. D.

    1977-01-01

    Electric propulsion power processor technology has progressed during the past decade to the point that it is considered ready for application. Several power processor design concepts were evaluated and compared. Emphasis was placed on a 30 cm ion thruster power processor with a beam supply power rating of 2.2 kW to 10 kW for the main propulsion power stage. Extensions in power processor performance were defined and were designed in sufficient detail to determine efficiency, component weight, part count, reliability and thermal control. A detailed design was performed on a microprocessor as the thyristor power processor controller. A reliability analysis was performed to evaluate the effect of the control electronics redesign. Preliminary electrical design, mechanical design and thermal analysis were performed on a 6 kW power transformer for the beam supply. Bi-Mod mechanical, structural and thermal control configurations were evaluated for the power processor and preliminary estimates of mechanical weight were determined.

  5. An electrical betweenness approach for vulnerability assessment of power grids considering the capacity of generators and load

    NASA Astrophysics Data System (ADS)

    Wang, Kai; Zhang, Bu-han; Zhang, Zhe; Yin, Xiang-gen; Wang, Bo

    2011-11-01

    Most existing research on the vulnerability of power grids based on complex networks ignores the electrical characteristics and the capacity of generators and load. In this paper, the electrical betweenness is defined by considering the maximal demand of load and the capacity of generators in power grids. The loss of load, which reflects the ability of power grids to provide sufficient power to customers, is introduced to measure the vulnerability together with the size of the largest cluster. The simulation results of the IEEE-118 bus system and the Central China Power Grid show that the cumulative distributions of node electrical betweenness follow a power-law and that the nodes with high electrical betweenness play critical roles in both topological structure and power transmission of power grids. The results prove that the model proposed in this paper is effective for analyzing the vulnerability of power grids.
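
    One plausible reading of a capacity-weighted betweenness (the paper's exact definition may differ) can be sketched with networkx: each generator-load pair contributes the smaller of the generation capacity and the load demand to the nodes on a shortest path between them. The toy grid, capacities, and demands below are invented.

    ```python
    # Hedged sketch of a capacity- and demand-weighted node betweenness.
    from collections import defaultdict
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([(1, 2), (2, 3), (3, 4), (2, 5), (5, 4), (4, 6)])
    generators = {1: 300.0, 5: 150.0}      # node: generation capacity (MW)
    loads = {3: 120.0, 6: 200.0}           # node: maximal load demand (MW)

    electrical_betweenness = defaultdict(float)
    for g, capacity in generators.items():
        for load_node, demand in loads.items():
            weight = min(capacity, demand)
            path = nx.shortest_path(G, g, load_node)
            for node in path[1:-1]:        # intermediate nodes carry the transfer
                electrical_betweenness[node] += weight

    for node, score in sorted(electrical_betweenness.items(), key=lambda kv: -kv[1]):
        print(f"node {node}: electrical betweenness {score:.0f}")
    ```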

  6. On the Relationship between Energy Density and Net Power (Intensity) in Coupled One-Dimensional Dynamic Systems

    DTIC Science & Technology

    1990-03-01

    equation of the statistical energy analysis (SEA) using the procedure indicated in equation (13) [8, 9]. Similarly, one may state the quantities ... and ... CONGRESS ON ACOUSTICS, July 24-31 1986, Toronto, Canada, Paper D6-1. 5. CUSCHIERI, J.M., Power flow as a complement to statistical energy analysis and ... "Random response of identical one-dimensional subsystems", Journal of Sound and Vibration, 1980, Vol. 70, p. 343-353. 8. LYON, R.H., Statistical Energy Analysis of

  7. An analysis of I/O efficient order-statistic-based techniques for noise power estimation in the HRMS sky survey's operational system

    NASA Technical Reports Server (NTRS)

    Zimmerman, G. A.; Olsen, E. T.

    1992-01-01

    Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
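
    The order-statistic and threshold-and-count ideas can be illustrated for idealized exponential-power noise (the HRMS implementation details are not reproduced here): both estimators invert the known noise distribution, one from a sample quantile and one from the fraction of samples below a fixed threshold.

    ```python
    # Hedged sketch: two single-pass noise power estimators for exponential
    # (unit chi-square-like) power samples with true mean power `true_power`.
    import numpy as np

    rng = np.random.default_rng(5)
    true_power = 3.0
    samples = rng.exponential(true_power, size=4096)   # spectral power bins

    # order-statistic estimate: q-th sample quantile divided by the
    # corresponding quantile of a unit-mean exponential distribution
    q = 0.5
    os_estimate = np.quantile(samples, q) / -np.log(1 - q)

    # threshold-and-count estimate: invert the exponential CDF at a threshold
    threshold = 2.0
    frac_below = np.mean(samples < threshold)
    tc_estimate = -threshold / np.log(1 - frac_below)

    print(f"order-statistic estimate:     {os_estimate:.2f}")
    print(f"threshold-and-count estimate: {tc_estimate:.2f}")
    print(f"true noise power:             {true_power:.2f}")
    ```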

  8. Standardized seawater rearing of chinook salmon smolts to evaluate hatchery practices showed low statistical power

    USGS Publications Warehouse

    Palmisano, Aldo N.; Elder, N.E.

    2001-01-01

    We examined, under standardized conditions, seawater survival of chinook salmon Oncorhynchus tshawytscha at the smolt stage to evaluate the experimental hatchery practices applied to their rearing. The experimental rearing practices included rearing fish at different densities; attempting to control bacterial kidney disease with broodstock segregation, erythromycin injection, and an experimental diet; rearing fish on different water sources; and freeze branding the fish. After application of experimental rearing practices in hatcheries, smolts were transported to a rearing facility for about 2-3 months of seawater rearing. Of 16 experiments, 4 yielded statistically significant differences in seawater survival. In general we found that high variability among replicates, plus the low numbers of replicates available, resulted in low statistical power. We recommend including four or five replicates and using α = 0.10 in 1-tailed tests of hatchery experiments to try to increase the statistical power to 0.80.
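
    A quick way to see the power implications of that recommendation is a standard two-sample power calculation with α = 0.10 and a one-tailed alternative. The sketch below uses statsmodels; the effect size (Cohen's d) is an illustrative assumption, not a value from the study.

      from statsmodels.stats.power import TTestIndPower

      # Power of a one-tailed two-sample t-test at alpha = 0.10 for 4 or 5
      # replicates per group, assuming a (large, hypothetical) effect size d = 1.5.
      analysis = TTestIndPower()
      for n_rep in (4, 5):
          power = analysis.power(effect_size=1.5, nobs1=n_rep, ratio=1.0,
                                 alpha=0.10, alternative="larger")
          print(f"{n_rep} replicates per group: power = {power:.2f}")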

  9. A pilot single-blind multicentre randomized controlled trial to evaluate the potential benefits of computer-assisted arm rehabilitation gaming technology on the arm function of children with spastic cerebral palsy.

    PubMed

    Preston, Nick; Weightman, Andrew; Gallagher, Justin; Levesley, Martin; Mon-Williams, Mark; Clarke, Mike; O'Connor, Rory J

    2016-10-01

    To evaluate the potential benefits of computer-assisted arm rehabilitation gaming technology on arm function of children with spastic cerebral palsy. A single-blind randomized controlled trial design. Power calculations indicated that 58 children would be required to demonstrate a clinically important difference. Intervention was home-based; recruitment took place in regional spasticity clinics. A total of 15 children with cerebral palsy aged five to 12 years were recruited; eight were allocated to the device group. Both study groups received 'usual follow-up treatment' following spasticity treatment with botulinum toxin; the intervention group also received a rehabilitation gaming device. ABILHAND-kids and the Canadian Occupational Performance Measure were administered by blinded assessors at baseline, six, and 12 weeks. An analysis of covariance showed no group differences in mean ABILHAND-kids scores between time points. A non-parametric analysis of variance on Canadian Occupational Performance Measure scores showed a statistically significant improvement across time points (χ²(2, 15) = 6.778, p = 0.031), but this improvement did not reach the minimal clinically important difference. Mean daily device use was seven minutes. Recruitment did not reach target owing to unanticipated staff shortages in clinical services. Feedback from children and their families indicated that the games were not engaging enough to promote a level of use likely to result in functional benefits. This study suggests that computer-assisted arm rehabilitation gaming does not benefit arm function, but a Type II error cannot be ruled out. © The Author(s) 2015.
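
    One common non-parametric repeated-measures analysis for scores at three time points is the Friedman test; the sketch below runs such a test on simulated data for 15 children. Whether the trial used exactly this test is an assumption, and the numbers are simulated, not the trial's data.

      import numpy as np
      from scipy.stats import friedmanchisquare

      # Simulated outcome scores at baseline, 6 weeks, and 12 weeks for 15 children.
      rng = np.random.default_rng(1)
      n_children = 15
      baseline = rng.normal(4.0, 1.0, n_children)
      week6 = baseline + rng.normal(0.5, 0.8, n_children)    # small simulated gain
      week12 = baseline + rng.normal(0.8, 0.8, n_children)

      stat, p = friedmanchisquare(baseline, week6, week12)
      print(f"chi2(2) = {stat:.3f}, p = {p:.3f}")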

  10. Statistically rigorous calculations do not support common input and long-term synchronization of motor-unit firings

    PubMed Central

    Kline, Joshua C.

    2014-01-01

    Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles—a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficiently rigorous statistical tests to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. PMID:25210152
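
    For orientation, the sketch below implements a generic cross-correlogram peak check of the kind the paper scrutinizes: firing-time differences near zero lag are histogrammed and each bin is compared against a flat baseline. It is not the SigMax algorithm, the significance rule is a simple assumption, and the firing trains are simulated.

      import numpy as np

      def correlogram_peaks(train_a, train_b, max_lag=0.05, bin_width=0.001, z_crit=3.0):
          # Histogram firing-time differences within +/- max_lag seconds and flag
          # bins whose counts exceed the mean bin count by more than z_crit
          # standard deviations under a Poisson-like null.
          diffs = []
          for t in train_a:
              near = train_b[np.abs(train_b - t) <= max_lag]
              diffs.extend(near - t)
          edges = np.arange(-max_lag, max_lag + bin_width, bin_width)
          counts, _ = np.histogram(diffs, bins=edges)
          baseline = counts.mean()
          z = (counts - baseline) / np.sqrt(baseline + 1e-12)
          lags = 0.5 * (edges[:-1] + edges[1:])
          return lags[z > z_crit]          # lags (s) with a significant excess

      rng = np.random.default_rng(2)
      train_a = np.sort(rng.uniform(0, 60, 600))   # simulated firing times (s)
      train_b = np.sort(rng.uniform(0, 60, 600))
      print(correlogram_peaks(train_a, train_b))   # typically empty: no synchronization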

  11. How often should we expect to be wrong? Statistical power, P values, and the expected prevalence of false discoveries.

    PubMed

    Marino, Michael J

    2018-05-01

    There is a clear perception in the literature that there is a crisis in reproducibility in the biomedical sciences. Many underlying factors contributing to the prevalence of irreproducible results have been highlighted with a focus on poor design and execution of experiments along with the misuse of statistics. While these factors certainly contribute to irreproducibility, relatively little attention outside of the specialized statistical literature has focused on the expected prevalence of false discoveries under idealized circumstances. In other words, when everything is done correctly, how often should we expect to be wrong? Using a simple simulation of an idealized experiment, it is possible to show the central role of sample size and the related quantity of statistical power in determining the false discovery rate, and in accurate estimation of effect size. According to our calculations, based on current practice many subfields of biomedical science may expect their discoveries to be false at least 25% of the time, and the only viable course to correct this is to require the reporting of statistical power and a minimum of 80% power (1 - β = 0.80) for all studies. Copyright © 2017 Elsevier Inc. All rights reserved.
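
    The arithmetic behind the expected prevalence of false discoveries can be written in a few lines: given α, the power (1 − β), and a prior probability that a tested hypothesis is truly non-null, the expected false discovery rate among "significant" results follows directly. The prior of 0.25 below is an illustrative assumption.

      # Expected false discovery rate = alpha*(1-prior) / (alpha*(1-prior) + power*prior)
      def expected_fdr(alpha, power, prior_true):
          false_pos = alpha * (1.0 - prior_true)
          true_pos = power * prior_true
          return false_pos / (false_pos + true_pos)

      for power in (0.20, 0.50, 0.80):
          fdr = expected_fdr(alpha=0.05, power=power, prior_true=0.25)
          print(f"power = {power:.2f}: expected FDR = {fdr:.2f}")
      # With low power (0.20) roughly 4 in 10 discoveries are expected to be false;
      # at 80% power the expected rate drops to about 16%.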

  12. Modeling the complexity of acoustic emission during intermittent plastic deformation: Power laws and multifractal spectra

    NASA Astrophysics Data System (ADS)

    Kumar, Jagadish; Ananthakrishna, G.

    2018-01-01

    Scale-invariant power-law distributions for acoustic emission signals are ubiquitous in several plastically deforming materials. However, power-law distributions for acoustic emission energies are reported in distinctly different plastically deforming situations such as hcp and fcc single and polycrystalline samples exhibiting smooth stress-strain curves and in dilute metallic alloys exhibiting discontinuous flow. This is surprising since the underlying dislocation mechanisms in these two types of deformations are very different. So far, there have been no models that predict the power-law statistics for discontinuous flow. Furthermore, the statistics of the acoustic emission signals in jerky flow is even more complex, requiring multifractal measures for a proper characterization. There has been no model that explains the complex statistics either. Here we address the problem of statistical characterization of the acoustic emission signals associated with the three types of the Portevin-Le Chatelier bands. Following our recently proposed general framework for calculating acoustic emission, we set up a wave equation for the elastic degrees of freedom with a plastic strain rate as a source term. The energy dissipated during acoustic emission is represented by the Rayleigh-dissipation function. Using the plastic strain rate obtained from the Ananthakrishna model for the Portevin-Le Chatelier effect, we compute the acoustic emission signals associated with the three Portevin-Le Chatelier bands and the Lüders-like band. The so-calculated acoustic emission signals are used for further statistical characterization. Our results show that the model predicts power-law statistics for all the acoustic emission signals associated with the three types of Portevin-Le Chatelier bands, with the exponent values increasing with increasing strain rate. The calculated multifractal spectra corresponding to the acoustic emission signals associated with the three band types have the maximum spread for the type C bands, decreasing for types B and A. We further show that the acoustic emission signals associated with the Lüders-like band also exhibit a power-law distribution and multifractality.
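
    A common way to quantify such power-law statistics from measured acoustic emission energies is the continuous maximum-likelihood (Clauset-style) estimator of the exponent above a lower cutoff. The sketch below applies it to synthetic Pareto-distributed energies; the exponent and cutoff are chosen for illustration only.

      import numpy as np

      # MLE for a continuous power law p(x) ~ x^(-alpha), x >= x_min:
      # alpha_hat = 1 + n / sum(ln(x_i / x_min)), stderr ~ (alpha_hat - 1)/sqrt(n).
      def powerlaw_mle(x, x_min):
          tail = x[x >= x_min]
          n = tail.size
          alpha_hat = 1.0 + n / np.sum(np.log(tail / x_min))
          stderr = (alpha_hat - 1.0) / np.sqrt(n)
          return alpha_hat, stderr

      rng = np.random.default_rng(3)
      true_alpha, x_min = 1.8, 1.0
      u = rng.uniform(size=50_000)
      energies = x_min * (1.0 - u) ** (-1.0 / (true_alpha - 1.0))  # inverse-CDF sampling
      print(powerlaw_mle(energies, x_min))   # close to (1.8, small stderr)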

  13. Modeling the complexity of acoustic emission during intermittent plastic deformation: Power laws and multifractal spectra.

    PubMed

    Kumar, Jagadish; Ananthakrishna, G

    2018-01-01

    Scale-invariant power-law distributions for acoustic emission signals are ubiquitous in several plastically deforming materials. However, power-law distributions for acoustic emission energies are reported in distinctly different plastically deforming situations such as hcp and fcc single and polycrystalline samples exhibiting smooth stress-strain curves and in dilute metallic alloys exhibiting discontinuous flow. This is surprising since the underlying dislocation mechanisms in these two types of deformations are very different. So far, there have been no models that predict the power-law statistics for discontinuous flow. Furthermore, the statistics of the acoustic emission signals in jerky flow is even more complex, requiring multifractal measures for a proper characterization. There has been no model that explains the complex statistics either. Here we address the problem of statistical characterization of the acoustic emission signals associated with the three types of the Portevin-Le Chatelier bands. Following our recently proposed general framework for calculating acoustic emission, we set up a wave equation for the elastic degrees of freedom with a plastic strain rate as a source term. The energy dissipated during acoustic emission is represented by the Rayleigh-dissipation function. Using the plastic strain rate obtained from the Ananthakrishna model for the Portevin-Le Chatelier effect, we compute the acoustic emission signals associated with the three Portevin-Le Chatelier bands and the Lüders-like band. The so-calculated acoustic emission signals are used for further statistical characterization. Our results show that the model predicts power-law statistics for all the acoustic emission signals associated with the three types of Portevin-Le Chatelier bands, with the exponent values increasing with increasing strain rate. The calculated multifractal spectra corresponding to the acoustic emission signals associated with the three band types have the maximum spread for the type C bands, decreasing for types B and A. We further show that the acoustic emission signals associated with the Lüders-like band also exhibit a power-law distribution and multifractality.

  14. Chuck for delicate drills

    NASA Technical Reports Server (NTRS)

    Copeland, C. S.

    1972-01-01

    Development of an oil film technique to couple power between the drive spindle and drill chuck for delicate drilling operations is discussed. The oil film permits application of sufficient pressure yet allows rotation to stop when the drill jams. An illustration of the equipment is provided.

  15. Modality-Driven Classification and Visualization of Ensemble Variance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bensema, Kevin; Gosink, Luke; Obermaier, Harald

    Paper for the IEEE Visualization Conference. Advances in computational power now enable domain scientists to address conceptual and parametric uncertainty by running simulations multiple times in order to sufficiently sample the uncertain input space.

  16. Statistical issues on the analysis of change in follow-up studies in dental research.

    PubMed

    Blance, Andrew; Tu, Yu-Kang; Baelum, Vibeke; Gilthorpe, Mark S

    2007-12-01

    To provide an overview of the problems in study design and associated analyses of follow-up studies in dental research, particularly addressing three issues: treatment-baseline interactions; statistical power; and nonrandomization. Our previous work has shown that many studies purport an interaction between change (from baseline) and baseline values, which is often based on inappropriate statistical analyses. A priori power calculations are essential for randomized controlled trials (RCTs), but in the pre-test/post-test RCT design it is not well known to dental researchers that the choice of statistical method affects power, and that power is affected by treatment-baseline interactions. A common (good) practice in the analysis of RCT data is to adjust for baseline outcome values using ANCOVA, thereby increasing statistical power. However, an important requirement for ANCOVA is that there be no interaction between the groups and baseline outcome (i.e. effective randomization); the patient-selection process should not cause differences in mean baseline values across groups. This assumption is often violated for nonrandomized (observational) studies, and the use of ANCOVA is thus problematic, potentially giving biased estimates, invoking Lord's paradox, and leading to difficulties in the interpretation of results. Baseline interaction issues can be overcome by statistical methods not widely practiced in dental research: Oldham's method and multilevel modelling; the latter is preferred for its greater flexibility to deal with more than one follow-up occasion as well as additional covariates. To illustrate these three key issues, hypothetical examples are considered from the fields of periodontology, orthodontics, and oral implantology. Caution needs to be exercised when considering the design and analysis of follow-up studies. ANCOVA is generally inappropriate for nonrandomized studies, and causal inferences from observational data should be avoided.
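
    A minimal ANCOVA for a pre-test/post-test RCT of the kind discussed above adjusts the follow-up outcome for its baseline value and tests the group effect. The sketch below uses statsmodels on simulated probing-depth data; all values and effect sizes are purely illustrative.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Simulated pre-test/post-test RCT: follow-up depends on baseline plus a
      # treatment effect; ANCOVA recovers the adjusted group difference.
      rng = np.random.default_rng(4)
      n = 60
      group = np.repeat(["control", "treatment"], n // 2)
      baseline = rng.normal(5.0, 1.0, n)                      # e.g. probing depth (mm)
      effect = np.where(group == "treatment", -0.6, 0.0)
      followup = 0.7 * baseline + effect + rng.normal(0, 0.5, n)

      df = pd.DataFrame({"group": group, "baseline": baseline, "followup": followup})
      model = smf.ols("followup ~ baseline + C(group)", data=df).fit()
      print(model.summary().tables[1])   # group coefficient = adjusted treatment effect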

  17. Resin embedded multicycle imaging (REMI): a tool to evaluate protein domains.

    PubMed

    Busse, B L; Bezrukov, L; Blank, P S; Zimmerberg, J

    2016-08-08

    Protein complexes associated with cellular processes comprise a significant fraction of all biology, but our understanding of their heterogeneous organization remains inadequate, particularly for physiological densities of multiple protein species. Towards resolving this limitation, we here present a new technique based on resin-embedded multicycle imaging (REMI) of proteins in-situ. By stabilizing protein structure and antigenicity in acrylic resins, affinity labels were repeatedly applied, imaged, removed, and replaced. In principle, an arbitrarily large number of proteins of interest may be imaged on the same specimen with subsequent digital overlay. A series of novel preparative methods were developed to address the problem of imaging multiple protein species in areas of the plasma membrane or volumes of cytoplasm of individual cells. For multiplexed examination of antibody staining we used straightforward computational techniques to align sequential images, and super-resolution microscopy was used to further define membrane protein colocalization. We give one example of a fibroblast membrane with eight multiplexed proteins. A simple statistical analysis of this limited membrane proteomic dataset is sufficient to demonstrate the analytical power contributed by additional imaged proteins when studying membrane protein domains.

  18. EviNet: a web platform for network enrichment analysis with flexible definition of gene sets.

    PubMed

    Jeggari, Ashwini; Alekseenko, Zhanna; Petrov, Iurii; Dias, José M; Ericson, Johan; Alexeyenko, Andrey

    2018-06-09

    The new web resource EviNet provides an easily run interface to network enrichment analysis for exploration of novel, experimentally defined gene sets. The major advantages of this analysis are (i) applicability to any genes found in the global network rather than only to those with pathway/ontology term annotations, (ii) ability to connect genes via different molecular mechanisms rather than within one high-throughput platform, and (iii) statistical power sufficient to detect enrichment of very small sets, down to individual genes. The users' gene sets are either defined prior to upload or derived interactively from an uploaded file by differential expression criteria. The pathways and networks used in the analysis can be chosen from the collection menu. The calculation is typically done within seconds or minutes and the stable URL is provided immediately. The results are presented in both visual (network graphs) and tabular formats using jQuery libraries. Uploaded data and analysis results are kept in separated project directories not accessible by other users. EviNet is available at https://www.evinet.org/.
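
    For contrast with the network-based approach, the sketch below shows a conventional overlap-based enrichment test (hypergeometric), which needs annotated genes and a sizeable overlap to reach significance; this is exactly where very small or unannotated gene sets lose power. The gene counts are illustrative, and this is not EviNet's algorithm.

      from scipy.stats import hypergeom

      genome = 20000        # annotated background genes
      pathway = 150         # genes annotated to the pathway
      user_set = 5          # genes in the user's (small) experimental set
      overlap = 2           # genes shared between the two

      # P(overlap >= 2) under random sampling without replacement.
      p = hypergeom.sf(overlap - 1, genome, pathway, user_set)
      print(f"hypergeometric enrichment p = {p:.3g}")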

  19. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for the operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactors' safety assessments, and the estimates available at this time are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from a broad information gathering. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.
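
    A minimal one-at-a-time screening pass of the kind used to reduce a large parameter set could look like the sketch below: each candidate parameter is swept over its range while the others stay at nominal values, and parameters are ranked by the output spread they induce. The toy model and parameter ranges are placeholders, not a validated fusion source-term model.

      import numpy as np

      # Placeholder dust source-term model: the functional form and parameters
      # are purely illustrative.
      def dust_model(x):
          plasma_energy, wall_temp, dust_inventory = x
          return 1e-3 * plasma_energy * dust_inventory * np.exp(wall_temp / 600.0)

      names = ["plasma_energy", "wall_temp", "dust_inventory"]
      nominal = np.array([350.0, 450.0, 10.0])
      ranges = np.array([[200.0, 500.0], [300.0, 600.0], [1.0, 100.0]])

      spread = {}
      for i, name in enumerate(names):
          outputs = []
          for value in np.linspace(ranges[i, 0], ranges[i, 1], 11):
              x = nominal.copy()
              x[i] = value                    # vary one parameter at a time
              outputs.append(dust_model(x))
          spread[name] = max(outputs) - min(outputs)

      for name, s in sorted(spread.items(), key=lambda kv: -kv[1]):
          print(f"{name}: output spread = {s:.2f}")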

  20. Evaluation of Cross-Protocol Stability of a Fully Automated Brain Multi-Atlas Parcellation Tool.

    PubMed

    Liang, Zifei; He, Xiaohai; Ceritoglu, Can; Tang, Xiaoying; Li, Yue; Kutten, Kwame S; Oishi, Kenichi; Miller, Michael I; Mori, Susumu; Faria, Andreia V

    2015-01-01

    Brain parcellation tools based on multiple-atlas algorithms have recently emerged as a promising method with which to accurately define brain structures. When dealing with data from various sources, it is crucial that these tools are robust for many different imaging protocols. In this study, we tested the robustness of a multiple-atlas, likelihood fusion algorithm using Alzheimer's Disease Neuroimaging Initiative (ADNI) data with six different protocols, comprising three manufacturers and two magnetic field strengths. The entire brain was parceled into five different levels of granularity. In each level, which defines a set of brain structures, ranging from eight to 286 regions, we evaluated the variability of brain volumes related to the protocol, age, and diagnosis (healthy or Alzheimer's disease). Our results indicated that, with proper pre-processing steps, the impact of different protocols is minor compared to biological effects, such as age and pathology. A precise knowledge of the sources of data variation enables sufficient statistical power and ensures the reliability of an anatomical analysis when using this automated brain parcellation tool on datasets from various imaging protocols, such as clinical databases.
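
    One way to check that protocol effects are minor compared with biological effects is to fit a linear model of a structure's volume on protocol, age, and diagnosis and compare the fitted terms. The sketch below does this on simulated volumes; the structure, protocol labels, and effect sizes are illustrative assumptions, not ADNI results.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Simulated volumes: strong age and diagnosis effects, small protocol offsets.
      rng = np.random.default_rng(5)
      n = 300
      protocol = rng.choice(["GE_1.5T", "GE_3T", "Siemens_3T", "Philips_3T"], n)
      age = rng.uniform(60, 90, n)
      diagnosis = rng.choice(["healthy", "AD"], n)

      protocol_offset = {"GE_1.5T": 0.0, "GE_3T": 20.0,
                         "Siemens_3T": -15.0, "Philips_3T": 10.0}
      volume = (4000.0
                - 15.0 * (age - 60)                                   # age-related atrophy
                - 400.0 * (diagnosis == "AD")                         # disease effect
                + pd.Series(protocol).map(protocol_offset).to_numpy() # small protocol bias
                + rng.normal(0, 150, n))                              # residual variability

      df = pd.DataFrame({"volume": volume, "protocol": protocol,
                         "age": age, "diagnosis": diagnosis})
      fit = smf.ols("volume ~ C(protocol) + age + C(diagnosis)", data=df).fit()
      print(fit.summary().tables[1])   # protocol terms small relative to age/diagnosis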
