Sample records for range statistics

  1. Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power

    PubMed Central

    Miciak, Jeremy; Taylor, W. Pat; Stuebing, Karla K.; Fletcher, Jack M.; Vaughn, Sharon

    2016-01-01

    An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated measures. This can result in attenuated pretest-posttest correlations, reducing the variance explained by the pretest covariate. We investigated the implications of two potential range restriction scenarios: direct truncation on a selection measure and indirect range restriction on correlated measures. Empirical and simulated data indicated direct range restriction on the pretest covariate greatly reduced statistical power and necessitated sample size increases of 82%–155% (dependent on selection criteria) to achieve equivalent statistical power to parameters with unrestricted samples. However, measures demonstrating indirect range restriction required much smaller sample size increases (32%–71%) under equivalent scenarios. Additional analyses manipulated the correlations between measures and pretest-posttest correlations to guide planning experiments. Results highlight the need to differentiate between selection measures and potential covariates and to investigate range restriction as a factor impacting statistical power. PMID:28479943

  2. Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power.

    PubMed

    Miciak, Jeremy; Taylor, W Pat; Stuebing, Karla K; Fletcher, Jack M; Vaughn, Sharon

    2016-01-01

    An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated measures. This can result in attenuated pretest-posttest correlations, reducing the variance explained by the pretest covariate. We investigated the implications of two potential range restriction scenarios: direct truncation on a selection measure and indirect range restriction on correlated measures. Empirical and simulated data indicated direct range restriction on the pretest covariate greatly reduced statistical power and necessitated sample size increases of 82%-155% (dependent on selection criteria) to achieve equivalent statistical power to parameters with unrestricted samples. However, measures demonstrating indirect range restriction required much smaller sample size increases (32%-71%) under equivalent scenarios. Additional analyses manipulated the correlations between measures and pretest-posttest correlations to guide planning experiments. Results highlight the need to differentiate between selection measures and potential covariates and to investigate range restriction as a factor impacting statistical power.
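
    The attenuation mechanism this abstract describes is easy to reproduce in simulation. The sketch below is not the authors' code; the correlation values and cutoff are illustrative assumptions. It draws a jointly normal selection measure, pretest, and posttest, then truncates on the pretest itself (direct restriction) and on the correlated selection measure (indirect restriction) to show how the pretest-posttest correlation, and hence the covariate's explanatory power, shrinks:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    rho_xy = 0.7   # pretest-posttest correlation (assumed for illustration)
    rho_sx = 0.8   # selection measure vs. pretest correlation (assumed)

    # Jointly normal selection measure s, pretest x, posttest y; s relates
    # to y only through x, so cov(s, y) = rho_sx * rho_xy.
    cov = np.array([[1.0,             rho_sx, rho_sx * rho_xy],
                    [rho_sx,          1.0,    rho_xy],
                    [rho_sx * rho_xy, rho_xy, 1.0]])
    s, x, y = rng.multivariate_normal(np.zeros(3), cov, size=n).T

    full = np.corrcoef(x, y)[0, 1]
    direct = np.corrcoef(x[x < -1.0], y[x < -1.0])[0, 1]    # truncate on pretest
    indirect = np.corrcoef(x[s < -1.0], y[s < -1.0])[0, 1]  # truncate on correlate

    print(f"unrestricted r        = {full:.2f}")
    print(f"direct truncation r   = {direct:.2f}")   # strongest attenuation
    print(f"indirect truncation r = {indirect:.2f}") # milder attenuation
    ```

    The attenuated correlations translate directly into the larger sample sizes the study reports, since the variance explained by the covariate scales with r².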

  3. A Frequency Domain Approach to Pretest Analysis Model Correlation and Model Updating for the Mid-Frequency Range

    DTIC Science & Technology

    2009-02-01

…frequency range of modal analysis and the high frequency region of statistical energy analysis is referred to as the mid-frequency range. … The averaging process is consistent with the averaging done in statistical energy analysis for stochastic systems. The FEM will always…

  4. Mathematical and Statistical Software Index.

    DTIC Science & Technology

    1986-08-01

…(geometric) mean; HMEAN - harmonic mean; MEDIAN - median; MODE - mode; QUANT - quantiles; OGIVE - distribution curve; IQRNG - interpercentile range; RANGE - range … multiphase pivoting algorithm; cross-classification; multiple discriminant analysis; cross-tabulation; multiple-objective model; curve fitting … *RANGEX (Correct Correlations for Curtailment of Range) … *RUMMAGE II (Analysis…

  5. Designing Intervention Studies: Selected Populations, Range Restrictions, and Statistical Power

    ERIC Educational Resources Information Center

    Miciak, Jeremy; Taylor, W. Pat; Stuebing, Karla K.; Fletcher, Jack M.; Vaughn, Sharon

    2016-01-01

    An appropriate estimate of statistical power is critical for the design of intervention studies. Although the inclusion of a pretest covariate in the test of the primary outcome can increase statistical power, samples selected on the basis of pretest performance may demonstrate range restriction on the selection measure and other correlated…

  6. A Model for Developing and Assessing Community College Students' Conceptions of the Range, Interquartile Range, and Standard Deviation

    ERIC Educational Resources Information Center

    Turegun, Mikhail

    2011-01-01

    Traditional curricular materials and pedagogical strategies have not been effective in developing conceptual understanding of statistics topics and statistical reasoning abilities of students. Much of the changes proposed by statistics education research and the reform movement over the past decade have supported efforts to transform teaching…

  7. An analysis of I/O efficient order-statistic-based techniques for noise power estimation in the HRMS sky survey's operational system

    NASA Technical Reports Server (NTRS)

    Zimmerman, G. A.; Olsen, E. T.

    1992-01-01

    Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
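
    As a rough illustration of the two estimator families compared in this record, the hedged sketch below treats the noise power samples as exponentially distributed (a common model for noise power spectra, assumed here, not stated in the abstract) and contrasts a median-based order-statistic estimator with a single threshold-and-count estimator inverted through the exponential tail. Several counters at different thresholds, run in parallel, would extend the dynamic range as the abstract describes:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    P_true = 3.0
    x = rng.exponential(P_true, size=100_000)  # spectral power samples (noise only)

    # Order-statistic estimator: for exponential noise, median = P * ln 2.
    P_med = np.median(x) / np.log(2)

    # Threshold-and-count estimator: P(x > T) = exp(-T/P), so invert the
    # exceedance fraction; a bank of thresholds T widens the usable range.
    T = 2.0
    frac = np.mean(x > T)
    P_cnt = -T / np.log(frac)

    print(f"median-based: {P_med:.3f}, threshold-and-count: {P_cnt:.3f}")
    ```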

  8. Poisson filtering of laser ranging data

    NASA Technical Reports Server (NTRS)

    Ricklefs, Randall L.; Shelus, Peter J.

    1993-01-01

    The filtering of data in a high noise, low signal strength environment is a situation encountered routinely in lunar laser ranging (LLR) and, to a lesser extent, in artificial satellite laser ranging (SLR). The use of Poisson statistics as one of the tools for filtering LLR data is described first in a historical context. The more recent application of this statistical technique to noisy SLR data is also described.
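
    A minimal sketch of Poisson filtering in this spirit (illustrative parameters, not the LLR/SLR pipeline): bin the returns by range, model the background as Poisson, and keep only bins whose counts are implausible under noise alone.

    ```python
    import numpy as np
    from scipy.stats import poisson

    rng = np.random.default_rng(2)
    background = 2.0                     # mean noise counts per range bin (assumed)
    counts = rng.poisson(background, 500)
    counts[250] += 12                    # inject a weak signal return

    # Keep bins whose counts are implausible under pure Poisson noise.
    threshold = poisson.ppf(1 - 1e-4, background)   # ~99.99% noise rejection
    candidates = np.flatnonzero(counts > threshold)
    print("candidate signal bins:", candidates)
    ```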

  9. Estimating order statistics of network degrees

    NASA Astrophysics Data System (ADS)

    Chu, J.; Nadarajah, S.

    2018-01-01

    We model the order statistics of network degrees of big data sets by a range of generalised beta distributions. A three parameter beta distribution due to Libby and Novick (1982) is shown to give the best overall fit for at least four big data sets. The fit of this distribution is significantly better than the fit suggested by Olhede and Wolfe (2012) across the whole range of order statistics for all four data sets.

  10. Patients and medical statistics. Interest, confidence, and ability.

    PubMed

    Woloshin, Steven; Schwartz, Lisa M; Welch, H Gilbert

    2005-11-01

    People are increasingly presented with medical statistics. There are no existing measures to assess their level of interest or confidence in using medical statistics. To develop 2 new measures, the STAT-interest and STAT-confidence scales, and assess their reliability and validity. Survey with retest after approximately 2 weeks. Two hundred and twenty-four people were recruited from advertisements in local newspapers, an outpatient clinic waiting area, and a hospital open house. We developed and revised 5 items on interest in medical statistics and 3 on confidence understanding statistics. Study participants were mostly college graduates (52%); 25% had a high school education or less. The mean age was 53 (range 20 to 84) years. Most paid attention to medical statistics (6% paid no attention). The mean (SD) STAT-interest score was 68 (17) and ranged from 15 to 100. Confidence in using statistics was also high: the mean (SD) STAT-confidence score was 65 (19) and ranged from 11 to 100. STAT-interest and STAT-confidence scores were moderately correlated (r=.36, P<.001). Both scales demonstrated good test-retest repeatability (r=.60, .62, respectively), internal consistency reliability (Cronbach's alpha=0.70 and 0.78), and usability (individual item nonresponse ranged from 0% to 1.3%). Scale scores correlated only weakly with scores on a medical data interpretation test (r=.15 and .26, respectively). The STAT-interest and STAT-confidence scales are usable and reliable. Interest and confidence were only weakly related to the ability to actually use data.

  11. Patients and Medical Statistics

    PubMed Central

    Woloshin, Steven; Schwartz, Lisa M; Welch, H Gilbert

    2005-01-01

    BACKGROUND People are increasingly presented with medical statistics. There are no existing measures to assess their level of interest or confidence in using medical statistics. OBJECTIVE To develop 2 new measures, the STAT-interest and STAT-confidence scales, and assess their reliability and validity. DESIGN Survey with retest after approximately 2 weeks. SUBJECTS Two hundred and twenty-four people were recruited from advertisements in local newspapers, an outpatient clinic waiting area, and a hospital open house. MEASURES We developed and revised 5 items on interest in medical statistics and 3 on confidence understanding statistics. RESULTS Study participants were mostly college graduates (52%); 25% had a high school education or less. The mean age was 53 (range 20 to 84) years. Most paid attention to medical statistics (6% paid no attention). The mean (SD) STAT-interest score was 68 (17) and ranged from 15 to 100. Confidence in using statistics was also high: the mean (SD) STAT-confidence score was 65 (19) and ranged from 11 to 100. STAT-interest and STAT-confidence scores were moderately correlated (r=.36, P<.001). Both scales demonstrated good test–retest repeatability (r=.60, .62, respectively), internal consistency reliability (Cronbach's α=0.70 and 0.78), and usability (individual item nonresponse ranged from 0% to 1.3%). Scale scores correlated only weakly with scores on a medical data interpretation test (r=.15 and .26, respectively). CONCLUSION The STAT-interest and STAT-confidence scales are usable and reliable. Interest and confidence were only weakly related to the ability to actually use data. PMID:16307623

  12. Numerical Analysis of Stochastic Dynamical Systems in the Medium-Frequency Range

    DTIC Science & Technology

    2003-02-01

…frequency vibration analysis such as the statistical energy analysis (SEA), the traditional modal analysis (well-suited for high and low frequency) … that the first few structural normal modes primarily constitute the total response. In the higher frequency range, the statistical energy analysis (SEA)…

  13. Statistical yearbook

    DOT National Transportation Integrated Search

    2010-01-01

    The Statistical Yearbook is an annual compilation of a wide range of international economic, social and environmental statistics on over 200 countries and areas, compiled from sources including UN agencies and other international, national and specia...

  14. Reynolds number dependence of relative dispersion statistics in isotropic turbulence

    NASA Astrophysics Data System (ADS)

    Sawford, Brian L.; Yeung, P. K.; Hackl, Jason F.

    2008-06-01

Direct numerical simulation results for a range of relative dispersion statistics over Taylor-scale Reynolds numbers up to 650 are presented in an attempt to observe and quantify inertial subrange scaling and, in particular, Richardson's t³ law. The analysis includes the mean-square separation and a range of important but less-studied differential statistics for which the motion is defined relative to that at time t = 0. It seeks to unambiguously identify and quantify the Richardson scaling by demonstrating convergence with both the Reynolds number and initial separation. According to these criteria, the standard compensated plots for these statistics in inertial subrange scaling show clear evidence of a Richardson range but with an imprecise estimate for the Richardson constant. A modified version of the cube-root plots introduced by Ott and Mann [J. Fluid Mech. 422, 207 (2000)] confirms such convergence. It has been used to yield more precise estimates for Richardson's constant g, which decrease with Taylor-scale Reynolds number over the range 140-650. Extrapolation to the large Reynolds number limit gives an asymptotic value for Richardson's constant in the range g = 0.55-0.57, depending on the functional form used to make the extrapolation.

  15. Towards Enhanced Underwater Lidar Detection via Source Separation

    NASA Astrophysics Data System (ADS)

    Illig, David W.

Interest in underwater optical sensors has grown as technologies enabling autonomous underwater vehicles have been developed. Propagation of light through water is complicated by the dual challenges of absorption and scattering. While absorption can be reduced by operating in the blue-green region of the visible spectrum, reducing scattering is a more significant challenge. Collection of scattered light negatively impacts underwater optical ranging, imaging, and communications applications. This thesis concentrates on the ranging application, where scattering reduces operating range as well as range accuracy. The focus of this thesis is the problem of backscatter, which can create a "clutter" return that may obscure submerged target(s) of interest. The main contributions of this thesis are explorations of signal processing approaches to increase the separation between the target and backscatter returns. Increasing this separation allows detection of weak targets in the presence of strong scatter, increasing both operating range and range accuracy. Simulation and experimental results are presented for a variety of approaches as functions of water clarity and target position. This work provides several novel contributions to the underwater lidar field:

    1. Quantification of temporal separation approaches: While temporal separation has been studied extensively, this work provides a quantitative assessment of the extent to which both high frequency modulation and spatial filter approaches improve the separation between target and backscatter.

    2. Development and assessment of frequency separation: This work includes the first frequency-based separation approach for underwater lidar, in which the channel frequency response is measured with a wideband waveform. Transforming to the time domain gives a channel impulse response, in which target and backscatter returns may appear in unique range bins and thus be separated.

    3. Development and assessment of statistical separation: The first investigations of statistical separation approaches for underwater lidar are presented. By demonstrating that target and backscatter returns have different statistical properties, a new separation axis is opened. This work investigates and quantifies the performance of three statistical separation approaches.

    4. Application of detection theory to underwater lidar: While many similar applications use detection theory to assess performance, less development has occurred in the underwater lidar field. This work applies these concepts to statistical separation approaches, providing another perspective in which to assess performance. In addition, by using detection theory approaches, statistical metrics can be used to associate a level of confidence with each ranging measurement.

    5. Preliminary investigation of forward scatter suppression: If backscatter is sufficiently suppressed, forward scattering becomes a performance-limiting factor. This work presents a proof-of-concept demonstration of the potential for statistical separation approaches to suppress both forward and backward scatter.

    These results demonstrate the capability of signal processing to improve separation between target and backscatter. Separation capability improves in the transition from temporal to frequency to statistical separation approaches, with the statistical separation approaches improving target detection sensitivity by as much as 30 dB. Ranging and detection results demonstrate the enhanced performance this would allow in ranging applications. This increased performance is an important step in moving underwater lidar capability toward the requirements of the next generation of sensors.

  16. Statistical Physics in the Era of Big Data

    ERIC Educational Resources Information Center

    Wang, Dashun

    2013-01-01

With the wealth of data provided by a wide range of high-throughput measurement tools and technologies, statistical physics of complex systems is entering a new phase, impacting in a meaningful fashion a wide range of fields, from cell biology to computer science to economics. In this dissertation, by applying tools and techniques developed in…

  17. Ladar range image denoising by a nonlocal probability statistics algorithm

    NASA Astrophysics Data System (ADS)

    Xia, Zhi-Wei; Li, Qi; Xiong, Zhi-Peng; Wang, Qi

    2013-01-01

    According to the characteristic of range images of coherent ladar and the basis of nonlocal means (NLM), a nonlocal probability statistics (NLPS) algorithm is proposed in this paper. The difference is that NLM performs denoising using the mean of the conditional probability distribution function (PDF) while NLPS using the maximum of the marginal PDF. In the algorithm, similar blocks are found out by the operation of block matching and form a group. Pixels in the group are analyzed by probability statistics and the gray value with maximum probability is used as the estimated value of the current pixel. The simulated range images of coherent ladar with different carrier-to-noise ratio and real range image of coherent ladar with 8 gray-scales are denoised by this algorithm, and the results are compared with those of median filter, multitemplate order mean filter, NLM, median nonlocal mean filter and its incorporation of anatomical side information, and unsupervised information-theoretic adaptive filter. The range abnormality noise and Gaussian noise in range image of coherent ladar are effectively suppressed by NLPS.
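
    A simplified, illustrative rendering of the NLPS idea follows; the function name, parameter choices, and implementation details are my own assumptions (the published algorithm differs in detail), and integer gray levels are assumed. It matches blocks around each pixel, groups the most similar ones, and takes the modal gray value of the group centers as the estimate:

    ```python
    import numpy as np

    def nlps_denoise(img, patch=3, search=7, n_similar=16, levels=256):
        """Simplified sketch of nonlocal probability statistics denoising:
        for each pixel, gather the most similar patches in a search window
        and replace the pixel with the most probable (modal) gray value
        among their center pixels. Assumes integer gray levels 0..levels-1."""
        pad = search // 2 + patch // 2
        padded = np.pad(img, pad, mode="reflect")
        out = np.empty_like(img)
        p, s = patch // 2, search // 2
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                ci, cj = i + pad, j + pad
                ref = padded[ci - p:ci + p + 1, cj - p:cj + p + 1]
                dists, centers = [], []
                for di in range(-s, s + 1):
                    for dj in range(-s, s + 1):
                        blk = padded[ci + di - p:ci + di + p + 1,
                                     cj + dj - p:cj + dj + p + 1]
                        dists.append(np.sum((blk - ref) ** 2))  # block matching
                        centers.append(padded[ci + di, cj + dj])
                idx = np.argsort(dists)[:n_similar]             # similar group
                grp = np.asarray(centers, dtype=int)[idx]
                out[i, j] = np.bincount(grp, minlength=levels).argmax()  # mode
        return out
    ```

    This differs from plain NLM exactly as the abstract says: NLM would take a weighted mean of the group, whereas here the maximum of the marginal distribution (the mode) is used, which is what suppresses impulsive range-anomaly noise.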

  18. 17 CFR 229.1111 - (Item 1111) Pool assets.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

…information for the asset pool, including statistical information regarding delinquencies and losses. … Present statistical information in tabular or graphical format, if such presentation will aid understanding. Present statistical information in appropriate distributional groups or incremental ranges in…

  19. Testing statistical isotropy in cosmic microwave background polarization maps

    NASA Astrophysics Data System (ADS)

    Rath, Pranati K.; Samal, Pramoda Kumar; Panda, Srikanta; Mishra, Debesh D.; Aluri, Pavan K.

    2018-04-01

We apply our symmetry-based Power tensor technique to test conformity of PLANCK polarization maps with statistical isotropy. On a wide range of angular scales (l = 40 - 150), our preliminary analysis detects many statistically anisotropic multipoles in foreground-cleaned full-sky PLANCK polarization maps, viz. COMMANDER and NILC. We also study the effect of residual foregrounds that may still be present in the Galactic plane, using both the common UPB77 polarization mask and the polarization masks specific to the individual component separation methods. However, some of the statistically anisotropic modes still persist, significantly so in the NILC map. We further probed the data for any coherent alignments across multipoles in several bins from the chosen multipole range.

  20. Statistical methods for estimating normal blood chemistry ranges and variance in rainbow trout (Salmo gairdneri), Shasta Strain

    USGS Publications Warehouse

    Wedemeyer, Gary A.; Nelson, Nancy C.

    1975-01-01

Gaussian and nonparametric (percentile estimate and tolerance interval) statistical methods were used to estimate normal ranges for blood chemistry (bicarbonate, bilirubin, calcium, hematocrit, hemoglobin, magnesium, mean cell hemoglobin concentration, osmolality, inorganic phosphorus, and pH) for juvenile rainbow trout (Salmo gairdneri, Shasta strain) held under defined environmental conditions. The percentile estimate and Gaussian methods gave similar normal ranges, whereas the tolerance interval method gave consistently wider ranges for all blood variables except hemoglobin. If the underlying frequency distribution is unknown, the percentile estimate procedure would be the method of choice.
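
    The three estimation methods named here are straightforward to compare on synthetic data. The sketch below uses invented values (not the study's data): a Gaussian 95% normal range, an empirical percentile estimate, and an approximate two-sided normal tolerance interval via Howe's method, which is typically the widest of the three, matching the pattern the study reports:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    x = rng.normal(10.0, 1.5, size=60)   # synthetic blood-chemistry values
    n = len(x)

    # Gaussian method: mean +/- 1.96 SD covers ~95% of a normal population.
    g_lo, g_hi = x.mean() - 1.96 * x.std(ddof=1), x.mean() + 1.96 * x.std(ddof=1)

    # Percentile estimate: empirical 2.5th and 97.5th percentiles.
    p_lo, p_hi = np.percentile(x, [2.5, 97.5])

    # Two-sided normal tolerance interval (Howe's approximation): covers 95%
    # of the population with 95% confidence; usually widest of the three.
    z = stats.norm.ppf(0.975)
    k = z * np.sqrt((n - 1) * (1 + 1 / n) / stats.chi2.ppf(0.05, n - 1))
    t_lo, t_hi = x.mean() - k * x.std(ddof=1), x.mean() + k * x.std(ddof=1)

    print(f"Gaussian:   ({g_lo:.2f}, {g_hi:.2f})")
    print(f"Percentile: ({p_lo:.2f}, {p_hi:.2f})")
    print(f"Tolerance:  ({t_lo:.2f}, {t_hi:.2f})")
    ```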

  1. Estimability and simple dynamical analyses of range (range-rate range-difference) observations to artificial satellites. [laser range observations to LAGEOS using non-Bayesian statistics

    NASA Technical Reports Server (NTRS)

    Vangelder, B. H. W.

    1978-01-01

Non-Bayesian statistics were used in simulation studies centered around laser range observations to LAGEOS. The capabilities of satellite laser ranging, especially in connection with relative station positioning, are evaluated. The satellite measurement system under investigation may fall short in precise determinations of the earth's orientation (precession and nutation) and earth's rotation, as opposed to systems such as very long baseline interferometry (VLBI) and lunar laser ranging (LLR). Relative station positioning, determination of (differential) polar motion, positioning of stations with respect to the earth's center of mass, and determination of the earth's gravity field should be easily realized by satellite laser ranging (SLR). The last two features should be considered as best (or solely) determinable by SLR in contrast to VLBI and LLR.

  2. Pupil Size in Outdoor Environments

    DTIC Science & Technology

    2007-04-06

Excerpt from the report's list of tables: Table 3, descriptive statistics for pupils measured over luminance range; Table 4, N in each strata for all pupil measurements; Table 5, descriptive statistics stratified against eye color; Table 6, descriptive statistics stratified against gender; Table 7, descriptive…

  3. The extraction and integration framework: a two-process account of statistical learning.

    PubMed

    Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G

    2013-07-01

The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other.
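
    The conditional relations the framework starts from are concretely just transitional probabilities. A toy computation of P(next | current) over a made-up syllable stream:

    ```python
    from collections import Counter

    syllables = "pa bi ku ti bu do pa bi ku go la tu ti bu do".split()  # toy stream
    pairs = Counter(zip(syllables, syllables[1:]))
    firsts = Counter(syllables[:-1])

    # Transitional probability: P(next | current) = count(current, next) / count(current)
    tp = {(a, b): c / firsts[a] for (a, b), c in pairs.items()}
    for (a, b), p in sorted(tp.items()):
        print(f"P({b} | {a}) = {p:.2f}")
    ```

    High-probability transitions (here "pa -> bi -> ku") mark word-internal structure; dips mark candidate word boundaries, which is the extraction step the framework builds on.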

  4. Statistics & Input-Output Measures for School Libraries in Colorado, 2002.

    ERIC Educational Resources Information Center

    Colorado State Library, Denver.

    This document presents statistics and input-output measures for K-12 school libraries in Colorado for 2002. Data are presented by type and size of school, i.e., high schools (six categories ranging from 2,000 and over to under 300), junior high/middle schools (five categories ranging from 1,000-1,999 to under 300), elementary schools (four…

  5. Studies in Mathematics Education: The Teaching of Statistics, Volume 7.

    ERIC Educational Resources Information Center

    Morris, Robert, Ed.

    This volume examines the teaching of statistics in the whole range of education, but concentrates on primary and secondary schools. It is based upon selected topics from the Second International Congress on Teaching Statistics (ICOTS 2), convened in Canada in August 1986. The contents of this volume divide broadly into four parts: statistics in…

  6. Network Data: Statistical Theory and New Models

    DTIC Science & Technology

    2016-02-17

During this period of review, Bin Yu worked on many thrusts of high-dimensional statistical theory and methodologies. Her research covered a wide range of topics in statistics, including analysis and methods for spectral clustering for sparse and structured networks [2,7,8,21], sparse modeling (e.g., Lasso) [4,10,11,17,18,19], statistical guarantees for the EM algorithm [3], and statistical analysis of algorithm leveraging.

  7. An analytic technique for statistically modeling random atomic clock errors in estimation

    NASA Technical Reports Server (NTRS)

    Fell, P. J.

    1981-01-01

Minimum variance estimation requires that the statistics of random observation errors be modeled properly. If measurements are derived through the use of atomic frequency standards, then one source of error affecting the observable is random fluctuation in frequency. This is the case, for example, with range and integrated Doppler measurements from satellites of the Global Positioning System, and with baseline determination for geodynamic applications. An analytic method is presented which approximates the statistics of this random process. The procedure starts with a model of the Allan variance for a particular oscillator and develops the statistics of range and integrated Doppler measurements. A series of five first-order Markov processes is used to approximate the power spectral density obtained from the Allan variance.
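
    The starting point, the Allan variance, has a simple estimator. A minimal sketch (white frequency noise is used here as stand-in data, an assumption for illustration) of the non-overlapping Allan variance at averaging time tau = m * tau0:

    ```python
    import numpy as np

    def allan_variance(y, m):
        """Non-overlapping Allan variance of fractional-frequency data y
        at averaging factor m: 0.5 * mean of squared successive differences
        of the tau-averaged frequencies."""
        n = len(y) // m
        ybar = y[:n * m].reshape(n, m).mean(axis=1)   # averages over tau
        return 0.5 * np.mean(np.diff(ybar) ** 2)

    rng = np.random.default_rng(4)
    y = rng.normal(0, 1e-12, 100_000)                 # white frequency noise
    for m in (1, 10, 100, 1000):
        print(m, allan_variance(y, m))                # scales ~1/m for white FM
    ```

    Fitting a sum of first-order Markov (exponentially correlated) processes to the spectrum implied by such a curve is the approximation step the abstract describes.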

  8. Monitoring Statistics Which Have Increased Power over a Reduced Time Range.

    ERIC Educational Resources Information Center

    Tang, S. M.; MacNeill, I. B.

    1992-01-01

    The problem of monitoring trends for changes at unknown times is considered. Statistics that permit one to focus high power on a segment of the monitored period are studied. Numerical procedures are developed to compute the null distribution of these statistics. (Author)

  9. Frequency-selective fading statistics of shallow-water acoustic communication channel with a few multipaths

    NASA Astrophysics Data System (ADS)

    Bae, Minja; Park, Jihyun; Kim, Jongju; Xue, Dandan; Park, Kyu-Chil; Yoon, Jong Rak

    2016-07-01

    The bit error rate of an underwater acoustic communication system is related to multipath fading statistics, which determine the signal-to-noise ratio. The amplitude and delay of each path depend on sea surface roughness, propagation medium properties, and source-to-receiver range as a function of frequency. Therefore, received signals will show frequency-dependent fading. A shallow-water acoustic communication channel generally shows a few strong multipaths that interfere with each other and the resulting interference affects the fading statistics model. In this study, frequency-selective fading statistics are modeled on the basis of the phasor representation of the complex path amplitude. The fading statistics distribution is parameterized by the frequency-dependent constructive or destructive interference of multipaths. At a 16 m depth with a muddy bottom, a wave height of 0.2 m, and source-to-receiver ranges of 100 and 400 m, fading statistics tend to show a Rayleigh distribution at a destructive interference frequency, but a Rice distribution at a constructive interference frequency. The theoretical fading statistics well matched the experimental ones.
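
    The Rayleigh-versus-Rice distinction can be checked by maximum likelihood. A hedged sketch follows, using a synthetic envelope with an assumed steady-path amplitude of 2 and unit-variance diffuse scatter (illustrative values, not the experiment's data); it fits both models with scipy and compares log-likelihoods:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    # Synthetic envelope: steady path + diffuse multipath -> Rician fading.
    env = np.abs(2.0 + rng.normal(0, 1, 20_000) + 1j * rng.normal(0, 1, 20_000))

    # Fit both candidate fading models (location pinned at 0).
    b, _, s_rice = stats.rice.fit(env, floc=0)
    _, s_ray = stats.rayleigh.fit(env, floc=0)

    ll_rice = stats.rice.logpdf(env, b, scale=s_rice).sum()
    ll_ray = stats.rayleigh.logpdf(env, scale=s_ray).sum()
    print(f"log-likelihood: Rice {ll_rice:.0f} vs Rayleigh {ll_ray:.0f}")
    ```

    At a destructive-interference frequency the steady component vanishes and the Rayleigh fit wins; at a constructive one the Rice fit wins, which is the frequency-selective pattern the study reports.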

  10. A computational statistics approach for estimating the spatial range of morphogen gradients

    PubMed Central

    Kanodia, Jitendra S.; Kim, Yoosik; Tomer, Raju; Khan, Zia; Chung, Kwanghun; Storey, John D.; Lu, Hang; Keller, Philipp J.; Shvartsman, Stanislav Y.

    2011-01-01

    A crucial issue in studies of morphogen gradients relates to their range: the distance over which they can act as direct regulators of cell signaling, gene expression and cell differentiation. To address this, we present a straightforward statistical framework that can be used in multiple developmental systems. We illustrate the developed approach by providing a point estimate and confidence interval for the spatial range of the graded distribution of nuclear Dorsal, a transcription factor that controls the dorsoventral pattern of the Drosophila embryo. PMID:22007136

  11. Selected 1966-69 interior Alaska wildfire statistics with long-term comparisons.

    Treesearch

    Richard J. Barney

    1971-01-01

    This paper presents selected interior Alaska forest and range wildfire statistics for the period 1966-69. Comparisons are made with the decade 1956-65 and the 30-year period 1940-69, which are essentially the total recorded statistical history on wildfires available for Alaska.

  12. Explorations in Statistics: Confidence Intervals

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This third installment of "Explorations in Statistics" investigates confidence intervals. A confidence interval is a range that we expect, with some level of confidence, to include the true value of a population parameter…

  13. Resampling: A Marriage of Computers and Statistics. ERIC/TM Digest.

    ERIC Educational Resources Information Center

    Rudner, Lawrence M.; Shafer, Mary Morello

    Advances in computer technology are making it possible for educational researchers to use simpler statistical methods to address a wide range of questions with smaller data sets and fewer, and less restrictive, assumptions. This digest introduces computationally intensive statistics, collectively called resampling techniques. Resampling is a…

  14. State Comparisons of Education Statistics: 1969-70 to 1996-97.

    ERIC Educational Resources Information Center

    Snyder, Thomas D.; Hoffman, Charlene M.

    Information on elementary and secondary schools and institutions of higher learning aggregated at a state level is presented. The report contains a wide array of statistical data ranging from enrollments and enrollment ratios to teacher salaries and institutional finances. The state-level statistics most frequently requested from the National…

  15. Statistical Analysis For Nucleus/Nucleus Collisions

    NASA Technical Reports Server (NTRS)

    Mcguire, Stephen C.

    1989-01-01

    Report describes use of several statistical techniques to charactertize angular distributions of secondary particles emitted in collisions of atomic nuclei in energy range of 24 to 61 GeV per nucleon. Purpose of statistical analysis to determine correlations between intensities of emitted particles and angles comfirming existence of quark/gluon plasma.

  16. Spectral statistics of random geometric graphs

    NASA Astrophysics Data System (ADS)

    Dettmann, C. P.; Georgiou, O.; Knight, G.

    2017-04-01

We use random matrix theory to study the spectrum of random geometric graphs, a fundamental model of spatial networks. Considering ensembles of random geometric graphs, we look at short-range correlations in the level spacings of the spectrum via the nearest-neighbour and next-nearest-neighbour spacing distribution, and long-range correlations via the spectral rigidity Δ3 statistic. These correlations in the level spacings give information about localisation of eigenvectors, level of community structure, and the level of randomness within the networks. We find a parameter-dependent transition between Poisson and Gaussian orthogonal ensemble statistics. That is, the spectral statistics of spatial random geometric graphs fit the universality of random matrix theory found in other models such as Erdős-Rényi, Barabási-Albert and Watts-Strogatz random graphs.

  17. Gene flow analysis method, the D-statistic, is robust in a wide parameter space.

    PubMed

    Zheng, Yichen; Janke, Axel

    2018-01-08

We evaluated the sensitivity of the D-statistic, a parsimony-like method widely used to detect gene flow between closely related species. This method has been applied to a variety of taxa with a wide range of divergence times. However, its parameter space and thus its applicability to a wide taxonomic range has not been systematically studied. Divergence time, population size, time of gene flow, distance of outgroup and number of loci were examined in a sensitivity analysis. The sensitivity study shows that the primary determinant of the D-statistic is the relative population size, i.e. the population size scaled by the number of generations since divergence. This is consistent with the fact that the main confounding factor in gene flow detection is incomplete lineage sorting by diluting the signal. The sensitivity of the D-statistic is also affected by the direction of gene flow, size and number of loci. In addition, we examined the ability of the f-statistics, [Formula: see text] and [Formula: see text], to estimate the fraction of a genome affected by gene flow; while these statistics are difficult to apply to practical questions in biology due to lack of knowledge of when the gene flow happened, they can be used to compare datasets with identical or similar demographic background. The D-statistic, as a method to detect gene flow, is robust against a wide range of genetic distances (divergence times) but it is sensitive to population size. The D-statistic should only be applied with critical reservation to taxa where population sizes are large relative to branch lengths in generations.
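
    For reference, the D-statistic itself is a simple function of ABBA and BABA site-pattern counts; the sketch below uses made-up counts purely for illustration:

    ```python
    def d_statistic(n_abba, n_baba):
        """Patterson's D from counts of ABBA and BABA site patterns in a
        (((P1, P2), P3), outgroup) alignment; D != 0 suggests gene flow,
        D == 0 is expected under incomplete lineage sorting alone."""
        return (n_abba - n_baba) / (n_abba + n_baba)

    # Toy counts: an excess of ABBA sites hints at P2-P3 gene flow.
    print(d_statistic(n_abba=620, n_baba=410))  # ~0.20
    ```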

  18. Comparison of photorefractive keratectomy and laser in situ keratomileusis for myopia of -6 D or less using the Nidek EC-5000 laser.

    PubMed

    Fernández, A P; Jaramillo, J; Jaramillo, M

    2000-01-01

    We compared the efficacy, predictability, and safety of photorefractive keratectomy (PRK) and laser in situ keratomileusis (LASIK) for the surgical correction of low and moderate myopia. A retrospective study was performed to evaluate uncorrected and spectacle-corrected visual acuity, and manifest refraction 1 year after PRK or LASIK. All procedures were done using an automatic microkeratome (Chiron Ophthalmic) and the Nidek EC-5000 excimer laser. PRK was performed in 75 eyes of 45 patients and LASIK in 133 eyes of 77 patients. Mean age for PRK patients was 32.8 years (range, 18 to 52 yr) and LASIK patients was 29.6 years (range, 18 to 49 yr). Mean preoperative spherical equivalent refraction for PRK patients was -3.28 D (range, -1.00 to -6.00 D) and LASIK, -3.86 D (range, -1.00 to -6.00 D). One year after surgery, mean spherical equivalent refraction for Group 1 (baseline, -1.00 to -3.00 D) PRK eyes was -0.18 +/- 0.61 D (range, -1.50 to +0.75 D) and for LASIK eyes, -0.08 +/- 0.61 D (range, -1.50 to +1.62 D), with no statistically significant difference. For Group 2 eyes (baseline, -3.25 to -6.00 D), mean spherical equivalent refraction for PRK eyes was -0.44 +/- 0.87 D (range, -2.00 to +2.12 D) and for LASIK eyes, -0.09 +/- 0.83 D (range, -1.50 to +1.75 D), with no statistically significant difference. The antilogarithm of the mean UCVA (antilogUCVA) in Group 1 for PRK was 0.79 +/- 0.21 (20/25) and for LASIK was 0.87 +/- 0.19 (20/23), with no statistically significant difference. The antilogUCVA in Group 2 for PRK eyes was 0.70 +/- 0.24 (20/28) and for LASIK eyes was 0.83 +/- 0.18 (20/24), with a statistically significant difference (0.7 vs. 0.83, P < .005). The percentage of eyes with a postoperative UCVA >20/40 in Group 1 for PRK was 91.5% (38 eyes) and for LASIK was 95% (50 eyes) (no statistically significant difference), and in Group 2 for PRK eyes, it was 82% (27 eyes) and 97.5% (78 eyes) for LASIK (statistically significant difference, P < .05). PRK and LASIK with the Nidek EC-5000 excimer laser are effective and safe for correcting low to moderate myopia, but LASIK eyes showed better results for moderate myopia in terms of uncorrected visual acuity.

  19. Long-ranged Fermi-Pasta-Ulam systems in thermal contact: Crossover from q-statistics to Boltzmann-Gibbs statistics

    NASA Astrophysics Data System (ADS)

    Bagchi, Debarshee; Tsallis, Constantino

    2017-04-01

The relaxation to equilibrium of two long-range-interacting Fermi-Pasta-Ulam-like models (β type) in thermal contact is numerically studied. These systems, with different sizes and energy densities, are coupled to each other by a few thermal contacts which are short-range harmonic springs. By using the kinetic definition of temperature, we compute the time evolution of temperature and energy density of the two systems. Eventually, for some time t > t_eq, the temperature and energy density of the coupled system equilibrate to values consistent with standard Boltzmann-Gibbs thermostatistics. The equilibration time t_eq depends on the system size N as t_eq ∼ N^γ, where γ ≃ 1.8. We compute the velocity distribution P(v) of the oscillators of the two systems during the relaxation process. We find that P(v) is non-Gaussian and is remarkably close to a q-Gaussian distribution for all times before thermal equilibrium is reached. During the relaxation process we observe q > 1, while close to t = t_eq the value of q converges to unity and P(v) approaches a Gaussian. Thus the relaxation phenomenon in long-ranged systems connected by a thermal contact can be generically described as a crossover from q-statistics to Boltzmann-Gibbs statistics.

  20. Statistical properties of MHD fluctuations associated with high speed streams from HELIOS 2 observations

    NASA Technical Reports Server (NTRS)

    Bavassano, B.; Dobrowolny, H.; Fanfoni, G.; Mariani, F.; Ness, N. F.

    1981-01-01

Helios 2 magnetic data were used to obtain several statistical properties of MHD fluctuations associated with the trailing edge of a given stream observed in different solar rotations. Eigenvalues and eigenvectors of the variance matrix, total power, and degree of compressibility of the fluctuations were derived and discussed both as a function of distance from the Sun and as a function of the frequency range included in the sample. The results obtained add new information to the picture of MHD turbulence in the solar wind. In particular, a dependence on frequency range of the radial gradients of various statistical quantities is obtained.

  1. Reliability and validity of a nutrition and physical activity environmental self-assessment for child care

    PubMed Central

    Benjamin, Sara E; Neelon, Brian; Ball, Sarah C; Bangdiwala, Shrikant I; Ammerman, Alice S; Ward, Dianne S

    2007-01-01

    Background Few assessment instruments have examined the nutrition and physical activity environments in child care, and none are self-administered. Given the emerging focus on child care settings as a target for intervention, a valid and reliable measure of the nutrition and physical activity environment is needed. Methods To measure inter-rater reliability, 59 child care center directors and 109 staff completed the self-assessment concurrently, but independently. Three weeks later, a repeat self-assessment was completed by a sub-sample of 38 directors to assess test-retest reliability. To assess criterion validity, a researcher-administered environmental assessment was conducted at 69 centers and was compared to a self-assessment completed by the director. A weighted kappa test statistic and percent agreement were calculated to assess agreement for each question on the self-assessment. Results For inter-rater reliability, kappa statistics ranged from 0.20 to 1.00 across all questions. Test-retest reliability of the self-assessment yielded kappa statistics that ranged from 0.07 to 1.00. The inter-quartile kappa statistic ranges for inter-rater and test-retest reliability were 0.45 to 0.63 and 0.27 to 0.45, respectively. When percent agreement was calculated, questions ranged from 52.6% to 100% for inter-rater reliability and 34.3% to 100% for test-retest reliability. Kappa statistics for validity ranged from -0.01 to 0.79, with an inter-quartile range of 0.08 to 0.34. Percent agreement for validity ranged from 12.9% to 93.7%. Conclusion This study provides estimates of criterion validity, inter-rater reliability and test-retest reliability for an environmental nutrition and physical activity self-assessment instrument for child care. Results indicate that the self-assessment is a stable and reasonably accurate instrument for use with child care interventions. We therefore recommend the Nutrition and Physical Activity Self-Assessment for Child Care (NAP SACC) instrument to researchers and practitioners interested in conducting healthy weight intervention in child care. However, a more robust, less subjective measure would be more appropriate for researchers seeking an outcome measure to assess intervention impact. PMID:17615078
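
    The agreement measure used throughout this study, the weighted kappa, can be computed directly from two raters' scores. A minimal implementation follows (the function and the example ratings are my own illustration, with linear or quadratic disagreement weights):

    ```python
    import numpy as np

    def weighted_kappa(r1, r2, k, weights="linear"):
        """Cohen's weighted kappa for two raters on an ordinal scale 0..k-1:
        kappa_w = 1 - sum(w * observed) / sum(w * expected)."""
        obs = np.zeros((k, k))
        for a, b in zip(r1, r2):
            obs[a, b] += 1
        obs /= obs.sum()
        exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))  # chance agreement
        i, j = np.indices((k, k))
        w = np.abs(i - j) if weights == "linear" else (i - j) ** 2
        return 1 - (w * obs).sum() / (w * exp).sum()

    # Hypothetical director vs. staff ratings on one 4-point item.
    print(weighted_kappa([0, 1, 2, 3, 3, 2], [0, 1, 1, 3, 2, 2], k=4))
    ```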

  2. A new statistic for the analysis of circular data in gamma-ray astronomy

    NASA Technical Reports Server (NTRS)

    Protheroe, R. J.

    1985-01-01

    A new statistic is proposed for the analysis of circular data. The statistic is designed specifically for situations where a test of uniformity is required which is powerful against alternatives in which a small fraction of the observations is grouped in a small range of directions, or phases.
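
    For context, the classical baseline that such specialized statistics aim to improve on is the Rayleigh test, which is known to be weak against exactly this alternative: a small fraction of events clustered in a narrow phase range. A sketch with synthetic phases (this is the standard Rayleigh test, not the new statistic proposed in the record):

    ```python
    import numpy as np

    def rayleigh_test(phases):
        """Classical Rayleigh test of uniformity for circular data;
        phases in [0, 1). Returns the large-n approximate p-value."""
        n = len(phases)
        theta = 2 * np.pi * np.asarray(phases)
        R2 = np.cos(theta).sum() ** 2 + np.sin(theta).sum() ** 2
        return np.exp(-R2 / n)

    rng = np.random.default_rng(8)
    # 95 uniform events plus a narrow burst of 5: hard for the Rayleigh test.
    phases = np.concatenate([rng.uniform(0, 1, 95),
                             rng.normal(0.5, 0.01, 5) % 1])
    print(f"Rayleigh p = {rayleigh_test(phases):.3f}")  # weak against bursts
    ```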

  3. Applying Statistical Process Quality Control Methodology to Educational Settings.

    ERIC Educational Resources Information Center

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the null X (mean), R (Range), X (individual observations), MR (moving…

  4. Facts about Newspapers '86: A Statistical Summary of the Newspaper Business.

    ERIC Educational Resources Information Center

    American Newspaper Publishers Association, Washington, DC.

    Attesting to the continuing economic strength and institutional vitality of the newspaper business in 1985, this booklet presents a statistical summary of the industry in the United States and Canada. The statistics cover a wide range of topics, including (1) number of daily newspapers, (2) daily newspaper circulation, (3) daily newspapers by…

  5. Issues affecting the interpretation of eastern hardwood resource statistics

    Treesearch

    William G. Luppold; William H. McWilliams

    2000-01-01

Forest inventory statistics developed by the USDA Forest Service are used by customers ranging from forest industry to state and local economic development groups. In recent years, these statistics have been used increasingly to justify greater utilization of the eastern hardwood resource or to evaluate the sustainability of expanding demand for hardwood roundwood and...

  6. Statistical properties of measures of association and the Kappa statistic for assessing the accuracy of remotely sensed data using double sampling

    Treesearch

    Mohammed A. Kalkhan; Robin M. Reich; Raymond L. Czaplewski

    1996-01-01

A Monte Carlo simulation was used to evaluate the statistical properties of measures of association and the Kappa statistic under double sampling with replacement. Three error matrices were used, representing three levels of classification accuracy of Landsat TM data consisting of four forest cover types in North Carolina. The overall accuracy of the five indices ranged from 0.35...

  7. Prediction of drug transport processes using simple parameters and PLS statistics. The use of ACD/logP and ACD/ChemSketch descriptors.

    PubMed

    Osterberg, T; Norinder, U

    2001-01-01

    A method of modelling and predicting biopharmaceutical properties using simple theoretically computed molecular descriptors and multivariate statistics has been investigated for several data sets related to solubility, IAM chromatography, permeability across Caco-2 cell monolayers, human intestinal perfusion, brain-blood partitioning, and P-glycoprotein ATPase activity. The molecular descriptors (e.g. molar refractivity, molar volume, index of refraction, surface tension and density) and logP were computed with ACD/ChemSketch and ACD/logP, respectively. Good statistical models were derived that permit simple computational prediction of biopharmaceutical properties. All final models derived had R(2) values ranging from 0.73 to 0.95 and Q(2) values ranging from 0.69 to 0.86. The RMSEP values for the external test sets ranged from 0.24 to 0.85 (log scale).

  8. (FEDSTATS)

    EPA Science Inventory

    Federal Statistics (FedStats) offers the full range of official statistical information available to the public from the Federal Government. It uses the Internet's powerful linking and searching capabilities to track economic and population trends, education, health care costs, a...

  9. Temporal Variability of Upper-level Winds at the Eastern Range, Western Range and Wallops Flight Facility

    NASA Technical Reports Server (NTRS)

    Decker, Ryan K.; Barbre, Robert E., Jr.

    2014-01-01

Space launch vehicles incorporate upper-level wind profiles to determine wind effects on the vehicle and for a commit-to-launch decision. These assessments incorporate wind profiles measured hours prior to launch and may not represent the actual wind the vehicle will fly through. Uncertainty in the upper-level winds over the time period between the assessment and launch can be mitigated by a statistical analysis of wind change over time periods of interest using historical data from the launch range. Five sets of temporal wind pairs at various time separations (0.75, 1.5, 2, 3, and 4 hr) at the Eastern Range, Western Range, and Wallops Flight Facility were developed for use in upper-level wind assessments. Database development procedures as well as statistical analysis of temporal wind variability at each launch range will be presented.

  10. [Statistical approach to evaluate the occurrence of out-of acceptable ranges and accuracy for antimicrobial susceptibility tests in inter-laboratory quality control program].

    PubMed

    Ueno, Tamio; Matuda, Junichi; Yamane, Nobuhisa

    2013-03-01

To evaluate the occurrence of out-of-acceptable-range results and the accuracy of antimicrobial susceptibility tests, we applied a new statistical tool to the Inter-Laboratory Quality Control Program established by the Kyushu Quality Control Research Group. First, we defined acceptable ranges of minimum inhibitory concentration (MIC) for broth microdilution tests and of inhibitory zone diameter for disk diffusion tests on the basis of Clinical and Laboratory Standards Institute (CLSI) M100-S21. In the analysis, more than two out-of-acceptable-range results in the 20 tests were considered not allowable according to the CLSI document. Of the 90 participating laboratories, 46 (51%) experienced one or more occurrences of out-of-acceptable-range results. Then, a binomial test was applied to each participating laboratory. The results indicated that the occurrences of out-of-acceptable-range results in 11 laboratories were significantly higher than the CLSI recommendation (allowable rate < or = 0.05). The standard deviation indices (SDI) were calculated by using the reported results and the mean and standard deviation values for the respective antimicrobial agents tested. In the evaluation of accuracy, the mean value from each laboratory was statistically compared with zero using a Student's t-test. The results revealed that 5 of the 11 above laboratories reported erroneous test results that systematically drifted to the side of resistance. In conclusion, our statistical approach has enabled us to detect significantly higher occurrences and sources of interpretive errors in antimicrobial susceptibility tests; therefore, this approach can provide us with additional information that can improve the accuracy of the test results in clinical microbiology laboratories.
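
    The screening step described here is an ordinary one-sided binomial test of each laboratory's error count against the allowable rate. A sketch using scipy (the count of 3 errors in 20 tests is illustrative):

    ```python
    from scipy.stats import binomtest

    # Each laboratory runs 20 proficiency tests; under the CLSI allowance an
    # out-of-range result should occur with probability <= 0.05.
    result = binomtest(k=3, n=20, p=0.05, alternative="greater")
    print(f"P(>= 3 out-of-range | p = 0.05) = {result.pvalue:.3f}")
    ```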

  11. Finite-range Coulomb gas models of banded random matrices and quantum kicked rotors

    NASA Astrophysics Data System (ADS)

    Pandey, Akhilesh; Kumar, Avanish; Puri, Sanjay

    2017-11-01

Dyson demonstrated an equivalence between infinite-range Coulomb gas models and classical random matrix ensembles for the study of eigenvalue statistics. We introduce finite-range Coulomb gas (FRCG) models via a Brownian matrix process, and study them analytically and by Monte Carlo simulations. These models yield new universality classes, and provide a theoretical framework for the study of banded random matrices (BRMs) and quantum kicked rotors (QKRs). We demonstrate that, for a BRM of bandwidth b and a QKR of chaos parameter α, the appropriate FRCG model has the effective range d = b²/N = α²/N, for large matrix dimensionality N. As d increases, there is a transition from Poisson to classical random matrix statistics.

  12. Finite-range Coulomb gas models of banded random matrices and quantum kicked rotors.

    PubMed

    Pandey, Akhilesh; Kumar, Avanish; Puri, Sanjay

    2017-11-01

Dyson demonstrated an equivalence between infinite-range Coulomb gas models and classical random matrix ensembles for the study of eigenvalue statistics. We introduce finite-range Coulomb gas (FRCG) models via a Brownian matrix process, and study them analytically and by Monte Carlo simulations. These models yield new universality classes, and provide a theoretical framework for the study of banded random matrices (BRMs) and quantum kicked rotors (QKRs). We demonstrate that, for a BRM of bandwidth b and a QKR of chaos parameter α, the appropriate FRCG model has the effective range d = b²/N = α²/N, for large matrix dimensionality N. As d increases, there is a transition from Poisson to classical random matrix statistics.

  13. Many-body localization in a long range XXZ model with random-field

    NASA Astrophysics Data System (ADS)

    Li, Bo

    2016-12-01

Many-body localization (MBL) in a long range interaction XXZ model with a random field is investigated. Using the exact diagonalization method, the MBL phase diagram is obtained for different tuning parameters and interaction ranges. It is found that the phase diagram from finite size results supplies strong evidence that the threshold interaction exponent is α = 2. The tuning parameter Δ can efficiently change the MBL edge in high energy density states, so the system can be driven from the thermal phase to the MBL phase by changing Δ. The energy level statistics data are consistent with the MBL phase diagram. However, energy level statistics cannot detect the thermal phase correctly in the extreme long range case.

  14. Parametric vs. non-parametric statistics of low resolution electromagnetic tomography (LORETA).

    PubMed

    Thatcher, R W; North, D; Biver, C

    2005-01-01

This study compared the relative statistical sensitivity of non-parametric and parametric statistics of 3-dimensional current sources as estimated by the EEG inverse solution Low Resolution Electromagnetic Tomography (LORETA). One would expect approximately 5% false positives (classification of a normal as abnormal) at the P < .025 level of probability (two-tailed test) and approximately 1% false positives at the P < .005 level. EEG digital samples (2-second intervals sampled at 128 Hz, 1 to 2 minutes eyes closed) from 43 normal adult subjects were imported into the Key Institute's LORETA program. We then used the Key Institute's cross-spectrum and the Key Institute's LORETA output files (*.lor) as the 2,394 gray matter pixel representation of 3-dimensional currents at different frequencies. The mean and standard deviation *.lor files were computed for each of the 2,394 gray matter pixels for each of the 43 subjects. Tests of Gaussianity and different transforms were computed in order to best approximate a normal distribution for each frequency and gray matter pixel. The relative sensitivity of parametric vs. non-parametric statistics was compared using a "leave-one-out" cross-validation method in which individual normal subjects were withdrawn and then statistically classified as being either normal or abnormal based on the remaining subjects. Log10 transforms approximated a Gaussian distribution with 95% to 99% accuracy. Parametric Z score tests at P < .05 cross-validation demonstrated an average misclassification rate of approximately 4.25%, with a range over the 2,394 gray matter pixels of 27.66% to 0.11%. At P < .01, parametric Z score cross-validation false positives averaged 0.26% and ranged from 6.65% to 0%. The non-parametric Key Institute's t-max statistic at P < .05 had an average misclassification error rate of 7.64% and ranged from 43.37% to 0.04% false positives. The non-parametric t-max at P < .01 had an average misclassification rate of 6.67% and ranged from 41.34% to 0% false positives over the 2,394 gray matter pixels for any cross-validated normal subject. In conclusion, adequate approximation to a Gaussian distribution and high cross-validation accuracy can be achieved by the Key Institute's LORETA programs by using a log10 transform and parametric statistics, and parametric normative comparisons had lower false positive rates than the non-parametric tests.

  15. Descriptive Statistics: Reporting the Answers to the 5 Basic Questions of Who, What, Why, When, Where, and a Sixth, So What?

    PubMed

    Vetter, Thomas R

    2017-11-01

    Descriptive statistics are specific methods basically used to calculate, describe, and summarize collected research data in a logical, meaningful, and efficient way. Descriptive statistics are reported numerically in the manuscript text and/or in its tables, or graphically in its figures. This basic statistical tutorial discusses a series of fundamental concepts about descriptive statistics and their reporting. The mean, median, and mode are 3 measures of the center or central tendency of a set of data. In addition to a measure of its central tendency (mean, median, or mode), another important characteristic of a research data set is its variability or dispersion (ie, spread). In simplest terms, variability is how much the individual recorded scores or observed values differ from one another. The range, standard deviation, and interquartile range are 3 measures of variability or dispersion. The standard deviation is typically reported for a mean, and the interquartile range for a median. Testing for statistical significance, along with calculating the observed treatment effect (or the strength of the association between an exposure and an outcome), and generating a corresponding confidence interval are 3 tools commonly used by researchers (and their collaborating biostatistician or epidemiologist) to validly make inferences and more generalized conclusions from their collected data and descriptive statistics. A number of journals, including Anesthesia & Analgesia, strongly encourage or require the reporting of pertinent confidence intervals. A confidence interval can be calculated for virtually any variable or outcome measure in an experimental, quasi-experimental, or observational research study design. Generally speaking, in a clinical trial, the confidence interval is the range of values within which the true treatment effect in the population likely resides. In an observational study, the confidence interval is the range of values within which the true strength of the association between the exposure and the outcome (eg, the risk ratio or odds ratio) in the population likely resides. There are many possible ways to graphically display or illustrate different types of data. While there is often latitude as to the choice of format, ultimately, the simplest and most comprehensible format is preferred. Common examples include a histogram, bar chart, line chart or line graph, pie chart, scatterplot, and box-and-whisker plot. Valid and reliable descriptive statistics can answer basic yet important questions about a research data set, namely: "Who, What, Why, When, Where, How, How Much?"
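
    Most of the quantities this tutorial covers reduce to a few lines of code. A hedged sketch (synthetic scores, not data from any study) computing central tendency, dispersion, and a 95% t-based confidence interval for the mean:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    x = rng.normal(50, 10, size=40)        # synthetic outcome scores

    mean, median = x.mean(), np.median(x)
    sd = x.std(ddof=1)
    q1, q3 = np.percentile(x, [25, 75])    # interquartile range endpoints

    # 95% confidence interval for the mean (t distribution, n - 1 df).
    sem = sd / np.sqrt(len(x))
    lo, hi = stats.t.interval(0.95, df=len(x) - 1, loc=mean, scale=sem)

    print(f"mean {mean:.1f} (SD {sd:.1f}), median {median:.1f} (IQR {q1:.1f}-{q3:.1f})")
    print(f"95% CI for the mean: ({lo:.1f}, {hi:.1f})")
    ```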

  16. Purposeful Statistical Investigations

    ERIC Educational Resources Information Center

    Day, Lorraine

    2014-01-01

    Lorraine Day provides us with a great range of statistical investigations using various resources such as maths300 and TinkerPlots. Each of the investigations links mathematics to students' lives and provides engaging and meaningful contexts for mathematical inquiry.

  17. Bias Reduction and Filter Convergence for Long Range Stereo

    NASA Technical Reports Server (NTRS)

    Sibley, Gabe; Matthies, Larry; Sukhatme, Gaurav

    2005-01-01

    We are concerned here with improving long range stereo by filtering image sequences. Traditionally, measurement errors from stereo camera systems have been approximated as 3-D Gaussians, where the mean is derived by triangulation and the covariance by linearized error propagation. However, two problems arise when filtering such 3-D measurements. First, stereo triangulation suffers from a range-dependent statistical bias; when filtering, this leads to over-estimating the true range. Second, filtering 3-D measurements derived via linearized error propagation leads to apparent filter divergence; the estimator is biased to under-estimate range. To address the first issue, we examine the statistical behavior of stereo triangulation and show how to remove the bias by series expansion. The solution to the second problem is to filter with image coordinates as measurements instead of triangulated 3-D coordinates.
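
    The over-estimation bias is easy to reproduce numerically: range varies as the reciprocal of disparity, so zero-mean noise on the disparity inflates the expected range. A hedged sketch with arbitrary parameter values, including a second-order correction in the spirit of the series expansion described above:

        import numpy as np

        f_b = 1000.0    # focal length x baseline (arbitrary units)
        d_true = 2.0    # true disparity in pixels; small disparity = long range
        sigma = 0.3     # standard deviation of disparity noise

        rng = np.random.default_rng(1)
        d_meas = d_true + sigma * rng.normal(size=1_000_000)
        z = f_b / d_meas                       # triangulated ranges
        z_true = f_b / d_true

        print(z.mean() - z_true)               # positive: range is over-estimated
        # Second-order expansion gives E[1/(d+n)] ~ (1/d)(1 + sigma^2/d^2); in
        # practice the measured disparity would stand in for the unknown true one:
        z_corrected = z / (1 + sigma**2 / d_true**2)
        print(z_corrected.mean() - z_true)     # residual bias is much smaller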

  18. Ignition delay times of benzene and toluene with oxygen in argon mixtures

    NASA Technical Reports Server (NTRS)

    Burcat, A.; Snyder, C.; Brabbs, T.

    1985-01-01

    The ignition delay times of benzene and toluene with oxygen diluted in argon were investigated over a wide range of conditions. For benzene the concentration ranges were 0.42 to 1.69 percent fuel and 3.78 to 20.3 percent oxygen. The temperature range was 1212 to 1748 K and the reflected shock pressures were 1.7 to 7.89 atm. Statistical evaluation of the benzene experiments provided an overall equation which is given. For toluene the concentration ranges were 0.5 to 1.5 percent fuel and 4.48 to 13.45 percent oxygen. The temperature range was 1339 to 1797 K and the reflected shock pressures were 1.95 to 8.85 atm. The overall ignition delay equation for toluene after a statistical evaluation is also given. Detailed experimental information is provided.
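
    The abstract does not reproduce the fitted equations; shock-tube studies of this kind commonly correlate the data in the form tau = A * exp(E/RT) * [fuel]^a * [O2]^b, which is linear in the logarithm and can be fit by least squares. A sketch with placeholder observations (not the paper's data):

        import numpy as np

        T    = np.array([1250., 1350., 1450., 1550., 1650., 1750.])         # K
        fuel = np.array([1.2, 0.8, 1.5, 1.0, 0.6, 1.1]) * 1e-7              # mol/cm^3
        o2   = np.array([9.0, 6.0, 12., 8.0, 5.0, 9.5]) * 1e-7              # mol/cm^3
        tau  = np.array([8.0e-4, 3.5e-4, 1.4e-4, 7.0e-5, 3.5e-5, 1.6e-5])   # s

        # log tau = log A + E/(R T) + a log[fuel] + b log[O2]
        R = 1.987  # cal/(mol K)
        X = np.column_stack([np.ones_like(T), 1/(R*T), np.log(fuel), np.log(o2)])
        coef, *_ = np.linalg.lstsq(X, np.log(tau), rcond=None)
        logA, E, a, b = coef
        print(f"A = {np.exp(logA):.3e}, E = {E:.0f} cal/mol, a = {a:.2f}, b = {b:.2f}")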

  19. A Classroom Note on the Binomial and Poisson Distributions: Biomedical Examples for Use in Teaching Introductory Statistics

    ERIC Educational Resources Information Center

    Holland, Bart K.

    2006-01-01

    A generally-educated individual should have some insight into how decisions are made in the very wide range of fields that employ statistical and probabilistic reasoning. Also, students of introductory probability and statistics are often best motivated by specific applications rather than by theory and mathematical development, because most…

  20. Against Inferential Statistics: How and Why Current Statistics Teaching Gets It Wrong

    ERIC Educational Resources Information Center

    White, Patrick; Gorard, Stephen

    2017-01-01

    Recent concerns about a shortage of capacity for statistical and numerical analysis skills among social science students and researchers have prompted a range of initiatives aiming to improve teaching in this area. However, these projects have rarely re-evaluated the content of what is taught to students and have instead focussed primarily on…

  1. Facts about Newspapers '87: A Statistical Summary of the Newspaper Business.

    ERIC Educational Resources Information Center

    American Newspaper Publishers Association, Washington, DC.

    Attesting to the continuing economic strength and institutional vitality of the newspaper business in 1987, this booklet presents a statistical summary of the industry in the United States and Canada. The statistics cover a wide range of topics, including (1) number of daily newspapers; (2) daily newspaper circulation; (3) single copy sales price;…

  2. Assessing Climate Change Impacts for DoD installations in the Southwest United States During the Warm Season

    DTIC Science & Technology

    2017-03-10

    [Fragment of the report's table of contents and list of figures. Recoverable content: Section 4, "Statistical analysis methods to characterize distributions and trends"; Figure 60, "...duration precipitation diagram from convective-permitting simulations for Barry Goldwater Range, Arizona" (caption truncated); a companion figure, "Same as Fig. 60 for other DoD facilities in the Southwest as labeled"; Figure 62, "Statistically significant model ensemble changes in rainfall".]

  3. A Mediation Model to Explain the Role of Mathematics Skills and Probabilistic Reasoning on Statistics Achievement

    ERIC Educational Resources Information Center

    Primi, Caterina; Donati, Maria Anna; Chiesi, Francesca

    2016-01-01

    Among the wide range of factors related to the acquisition of statistical knowledge, competence in basic mathematics, including basic probability, has received much attention. In this study, a mediation model was estimated to derive the total, direct, and indirect effects of mathematical competence on statistics achievement taking into account…

  4. Data Analysis and Statistical Methods for the Assessment and Interpretation of Geochronologic Data

    NASA Astrophysics Data System (ADS)

    Reno, B. L.; Brown, M.; Piccoli, P. M.

    2007-12-01

    Ages are traditionally reported as a weighted mean with an uncertainty based on least-squares analysis of the analytical error on individual dates. This method does not take geological uncertainty into account and cannot accommodate asymmetries in the data. In most instances it will understate the uncertainty on a given age, which may lead to over-interpretation of age data. Geological uncertainty is difficult to quantify but is typically greater than analytical uncertainty. These factors make traditional statistical approaches inadequate for fully evaluating geochronologic data. We propose a protocol to assess populations within multi-event datasets and to calculate an age and uncertainty from each population of dates interpreted to represent a single geologic event, using robust and resistant statistical methods. To assess whether populations thought to represent different events are statistically separate, exploratory data analysis is undertaken using a box plot, in which the range of the data is represented by a 'box' of length given by the interquartile range, divided at the median of the data, with 'whiskers' that extend to the furthest data point lying within 1.5 times the interquartile range beyond the box. If the boxes representing the populations do not overlap, they are interpreted to represent statistically different sets of dates. Ages are calculated from statistically distinct populations using a robust tool such as the tanh method of Kelsey et al. (2003, CMP, 146, 326-340), which is insensitive to assumptions about the underlying probability distribution from which the data are drawn; it therefore takes the full range of the data into account and is not drastically affected by outliers. The interquartile range of each population of dates gives a first-pass expression of uncertainty that accommodates asymmetry in the dataset, with outliers having only a minor effect. To better quantify the uncertainty, a resistant tool that is insensitive to local misbehavior of the data is preferred, such as the normalized median absolute deviation proposed by Powell et al. (2002, Chem Geol, 185, 191-204). We illustrate the method using a dataset of 152 monazite dates determined from EPMA chemical data for a single sample from the Neoproterozoic Brasília Belt, Brazil, and compare the results with ages and uncertainties calculated using traditional methods to demonstrate the differences. The dataset was manually culled into three populations representing discrete compositional domains within chemically zoned monazite grains. The weighted mean ages and least-squares uncertainties for these populations are 633±6 (2σ) Ma for a core domain, 614±5 (2σ) Ma for an intermediate domain, and 595±6 (2σ) Ma for a rim domain. Probability distribution plots indicate asymmetric distributions for all populations, which cannot be accounted for with traditional statistical tools. The three domains record distinct ages with non-overlapping interquartile ranges: the core domain lies in the subrange 642-624 Ma, the intermediate domain in 617-609 Ma, and the rim domain in 606-589 Ma. The tanh estimator yields ages of 631±7 (2σ) Ma for the core domain, 616±7 (2σ) Ma for the intermediate domain, and 601±8 (2σ) Ma for the rim domain. Whereas the uncertainties derived using a resistant statistical tool are larger than those derived from traditional statistical tools, they are more realistic: they better reflect the spread in the dataset and account for its asymmetry.
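
    A compact sketch of the two resistant ingredients described above: the box-overlap test on quartiles and a normalized-median-absolute-deviation spread (the 1.4826 factor scales MAD to match a Gaussian standard deviation). The simulated dates are illustrative only:

        import numpy as np

        def box(dates):
            """Quartiles of a date population, for the box-overlap test."""
            return np.percentile(dates, [25, 50, 75])

        def nmad(dates):
            """Normalized median absolute deviation, a resistant spread measure."""
            med = np.median(dates)
            return 1.4826 * np.median(np.abs(dates - med))

        rng = np.random.default_rng(2)
        core = rng.normal(633, 4, 60)    # illustrative core-domain dates (Ma)
        rim  = rng.normal(595, 4, 50)    # illustrative rim-domain dates (Ma)

        q1c, medc, q3c = box(core)
        q1r, medr, q3r = box(rim)
        print("boxes overlap:", not (q3r < q1c or q3c < q1r))
        print(f"core: {medc:.0f} +/- {2*nmad(core):.0f} Ma (resistant 2-sigma)")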

  5. The power and robustness of maximum LOD score statistics.

    PubMed

    Yoo, Y J; Mendell, N R

    2008-07-01

    The maximum LOD score statistic is extremely powerful for gene mapping when calculated using the correct genetic parameter value. When the mode of genetic transmission is unknown, the maximum of the LOD scores obtained using several genetic parameter values is reported. This latter statistic requires a higher critical value than the maximum LOD score statistic calculated from a single genetic parameter value. In this paper, we compare the power of maximum LOD scores based on three fixed sets of genetic parameter values with the power of the LOD score obtained after maximizing over the entire range of genetic parameter values. We simulate family data under nine generating models. For generating models with non-zero phenocopy rates, LOD scores maximized over the entire range of genetic parameters yielded greater power than maximum LOD scores based on fixed sets of parameter values with zero phenocopy rates. No maximum LOD score was consistently more powerful than the others for generating models with a zero phenocopy rate. The power loss of the LOD score maximized over the entire range of genetic parameters, relative to the maximum LOD score calculated using the correct genetic parameter value, appeared to be robust to the generating models.

  6. The Time in Therapeutic Range and Bleeding Complications of Warfarin in Different Geographic Regions of Turkey: A Subgroup Analysis of WARFARIN-TR Study.

    PubMed

    Kılıç, Salih; Çelik, Ahmet; Çakmak, Hüseyin Altuğ; Afşin, Abdülmecit; Tekkeşin, Ahmet İlker; Açıksarı, Gönül; Memetoğlu, Mehmet Erdem; Özpamuk Karadeniz, Fatma; Şahan, Ekrem; Alıcı, Mehmet Hayri; Dereli, Yüksel; Sinan, Ümit Yaşar; Zoghi, Mehdi

    2017-08-04

    The time in therapeutic range values may vary between different geographical regions of Turkey in patients receiving vitamin K antagonist therapy. To evaluate the time in therapeutic range percentages and the efficacy, safety and awareness of warfarin according to the different geographical regions in patients who participated in the WARFARIN-TR study (The Awareness, Efficacy, Safety and Time in Therapeutic Range of Warfarin in the Turkish population) in Turkey. Cross-sectional study. The WARFARIN-TR study includes 4987 patients using warfarin and involved regular international normalized ratio monitoring between January 1, 2014 and December 31, 2014. Patients attended follow-ups for 12 months. The sample sizes were calculated according to the density of the regional population, based on Turkish Statistical Institute data. The time in therapeutic range was calculated according to F.R. Rosendaal's algorithm. Awareness was evaluated based on the patients' knowledge of the effect of warfarin and of food-drug interactions, using simple questions developed from a literature review. The Turkey-wide time in therapeutic range was reported as 49.5%±22.9 in the WARFARIN-TR study. There were statistically significant differences between regions in terms of time in therapeutic range (p<0.001). The highest rate was reported in the Marmara region (54.99%±20.91) and the lowest in the South-eastern Anatolia region (41.95%±24.15) (p<0.001). Bleeding events were most frequently seen in Eastern Anatolia (41.6%), with major bleeding most frequent in the Aegean region (5.11%) and South-eastern Anatolia (5.36%). There were statistically significant differences between the regions in terms of awareness (p<0.001). Statistically significant differences were observed in the efficacy, safety and awareness of warfarin therapy across the different geographical regions of Turkey.
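
    For reference, the Rosendaal method linearly interpolates the INR between successive visits and counts the interpolated time spent inside the target range; a sketch assuming a 2.0-3.0 target range and illustrative visit data:

        import numpy as np

        def ttr_rosendaal(days, inr, lo=2.0, hi=3.0, steps_per_day=10):
            """Time in therapeutic range via linear interpolation between visits."""
            in_range = total = 0
            for i in range(len(days) - 1):
                n = max(int((days[i+1] - days[i]) * steps_per_day), 1)
                t = np.linspace(days[i], days[i+1], n, endpoint=False)
                v = np.interp(t, [days[i], days[i+1]], [inr[i], inr[i+1]])
                in_range += np.count_nonzero((v >= lo) & (v <= hi))
                total += n
            return 100.0 * in_range / total

        days = [0, 14, 28, 49, 70]           # visit days (illustrative)
        inr  = [1.8, 2.4, 3.4, 2.6, 2.1]     # INR at each visit
        print(f"TTR = {ttr_rosendaal(days, inr):.1f}%")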

  7. The distinct character of anisotropy and intermittency in inertial and kinetic range solar wind plasma turbulence

    NASA Astrophysics Data System (ADS)

    Kiyani, Khurom; Chapman, Sandra; Osman, Kareem; Sahraoui, Fouad; Hnat, Bogdan

    2014-05-01

    The anisotropic nature of the scaling properties of solar wind magnetic turbulence fluctuations is investigated scale by scale using high-cadence in situ magnetic field measurements from the Cluster, ACE and STEREO spacecraft missions in both fast and slow quiet solar wind conditions. The data span five decades in scale, from the inertial range to the electron Larmor radius. We find a clear transition in scaling behaviour between the inertial and kinetic ranges of scales, which provides a direct, quantitative constraint on the physical processes that mediate the cascade of energy through these scales. In the inertial (magnetohydrodynamic) range, the statistical nature of turbulent fluctuations is known to be anisotropic, both in the vector components of the magnetic field fluctuations (variance anisotropy) and in the spatial scales of these fluctuations (wavevector or k-anisotropy). We show for the first time that, when measuring parallel to the local magnetic field direction, the full statistical signature of the magnetic and Elsasser field fluctuations is that of a non-Gaussian, globally scale-invariant process. This is distinct from the classic multi-exponent statistics observed when the local magnetic field is perpendicular to the flow direction. These observations suggest the weakness, or absence, of a parallel magnetofluid turbulence energy cascade. In contrast to the inertial range, there is a successive increase toward isotropy between parallel and transverse power at scales below the ion Larmor radius, with isotropy being achieved at the electron Larmor radius. Computing higher-order statistics, we show that the full statistical signature of both parallel and perpendicular fluctuations at scales below the ion Larmor radius is that of an isotropic, globally scale-invariant, non-Gaussian process. Lastly, we survey multiple intervals of quiet solar wind sampled under different plasma conditions (fast and slow wind, plasma beta, etc.) and find that the above results on the scaling transition between inertial and kinetic range scales are qualitatively robust, with a quantitative spread in the values of the scaling exponents.
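
    The higher-order statistics referred to above are typically structure functions S_q(tau) = <|b(t+tau) - b(t)|^q>: a globally scale-invariant (mono-fractal) signal has exponents zeta(q) linear in q, whereas multi-exponent, intermittent turbulence bends that curve. A generic sketch on a Brownian stand-in for field data:

        import numpy as np

        def structure_functions(b, lags, qs):
            """S_q(tau) = <|b(t+tau) - b(t)|^q> for each lag and moment order."""
            return np.array([[np.mean(np.abs(b[l:] - b[:-l])**q) for q in qs]
                             for l in lags])

        rng = np.random.default_rng(4)
        b = np.cumsum(rng.normal(size=200_000))   # Brownian stand-in signal
        lags = np.unique(np.logspace(0, 3, 20).astype(int))
        qs = [1, 2, 3, 4]

        S = structure_functions(b, lags, qs)
        # Scaling exponents zeta(q) from log-log slopes; a linear zeta(q)
        # indicates a globally scale-invariant (mono-fractal) process:
        zeta = [np.polyfit(np.log(lags), np.log(S[:, j]), 1)[0]
                for j in range(len(qs))]
        print(zeta)   # ~ [0.5, 1.0, 1.5, 2.0] for Brownian motion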

  8. Statistical analysis of hydrological response in urbanising catchments based on adaptive sampling using inter-amount times

    NASA Astrophysics Data System (ADS)

    ten Veldhuis, Marie-Claire; Schleiss, Marc

    2017-04-01

    In this study, we introduce an alternative approach for the analysis of hydrological flow time series, using an adaptive sampling framework based on inter-amount times (IATs). The main difference from conventional flow time series is the rate at which low and high flows are sampled: the unit of analysis for IATs is a fixed flow amount instead of a fixed time window. We analysed statistical distributions of flows and IATs across a wide range of sampling scales to investigate the sensitivity of statistical properties such as quantiles, variance, skewness, scaling parameters and flashiness indicators to the sampling scale. We did this based on streamflow time series for 17 (semi)urbanised basins in North Carolina, US, ranging from 13 km² to 238 km² in size. Results showed that adaptive sampling of flow time series based on inter-amounts leads to a more balanced representation of low-flow and peak-flow values in the statistical distribution. While conventional sampling gives a lot of weight to low flows, as these are most ubiquitous in flow time series, IAT sampling gives relatively more weight to high-flow values, for which a given flow amount is accumulated in a shorter time. As a consequence, IAT sampling gives more information about the tail of the distribution associated with high flows, while conventional sampling gives relatively more information about low-flow periods. We will present results of statistical analyses across a range of subdaily to seasonal scales and will highlight some insights that can be derived from IAT statistics with respect to basin flashiness and the impact of urbanisation on hydrological response.
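
    A minimal sketch of the inter-amount-time sampling itself, assuming a regularly sampled flow series; the fixed accumulation unit sets the sampling scale:

        import numpy as np

        def inter_amount_times(flow, dt, amount):
            """Times needed to accumulate each fixed flow amount.

            flow   : flow rates sampled at interval dt
            amount : the fixed accumulation unit defining the IAT scale
            """
            cum = np.cumsum(flow) * dt                    # cumulative volume
            targets = np.arange(amount, cum[-1], amount)  # successive amount levels
            # Time at which each target volume is first reached (linear in between):
            t = np.interp(targets, cum, np.arange(1, len(cum) + 1) * dt)
            return np.diff(np.concatenate([[0.0], t]))    # inter-amount times

        flow = np.array([0.2, 0.1, 0.1, 3.0, 5.0, 1.0, 0.3, 0.2])  # illustrative
        iats = inter_amount_times(flow, dt=1.0, amount=1.0)
        print(iats)   # short IATs during the peak, long IATs during low flow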

  9. Development of a mathematical model for the dissolution of uranium dioxide. II. Statistical model for the dissolution of uranium dioxide tablets in nitric acid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhukovskii, Yu.M.; Luksha, O.P.; Nenarokomov, E.A.

    1988-03-01

    We have derived a statistical model for the dissolution of uranium dioxide tablets for the 6 to 12 M nitric acid concentration range and temperatures from 80°C to the boiling point. The model differs qualitatively from the dissolution model for ground uranium dioxide. In the indicated range of experimental conditions, the mean-square deviation of the model curves from the experimental curves is not greater than 6%.

  10. Statistical tables and charts showing geochemical variation in the Mesoproterozoic Big Creek, Apple Creek, and Gunsight formations, Lemhi group, Salmon River Mountains and Lemhi Range, central Idaho

    USGS Publications Warehouse

    Lindsey, David A.; Tysdal, Russell G.; Taggart, Joseph E.

    2002-01-01

    The principal purpose of this report is to provide a reference archive for results of a statistical analysis of geochemical data for metasedimentary rocks of Mesoproterozoic age of the Salmon River Mountains and Lemhi Range, central Idaho. Descriptions of geochemical data sets, statistical methods, rationale for interpretations, and references to the literature are provided. Three methods of analysis are used: R-mode factor analysis of major oxide and trace element data for identifying petrochemical processes, analysis of variance for effects of rock type and stratigraphic position on chemical composition, and major-oxide ratio plots for comparison with the chemical composition of common clastic sedimentary rocks.

  11. Genital Herpes - Initial Visits to Physicians' Offices, United States, 1966-2012

    MedlinePlus

    [Interactive-page fragment. Recoverable content: Figure 48, "Genital Herpes — Initial Visits to Physicians' Offices, United States, 1966-2012"; a note states that the relative standard errors for genital herpes estimates of more than 100,000 range from a value truncated in the source.]

  12. Cost-Efficient and Multi-Functional Secure Aggregation in Large Scale Distributed Application

    PubMed Central

    Zhang, Ping; Li, Wenjun; Sun, Hua

    2016-01-01

    Secure aggregation is an essential component of modern distributed applications and data mining platforms. Aggregated statistical results are typically adopted in constructing a data cube for data analysis at multiple abstraction levels in data warehouse platforms. Generating different types of statistical results efficiently at the same time (referred to as multi-functional support) is a fundamental requirement in practice. However, most existing schemes support a very limited number of statistics, and securely obtaining typical statistical results simultaneously in a distributed system, without recovering the original data, is still an open problem. In this paper, we present SEDAR, a SEcure Data Aggregation scheme under the Range segmentation model. The range segmentation model is proposed to reduce communication cost by capturing the data characteristics; each range uses a different aggregation strategy. For raw data in the dominant range, SEDAR encodes them into well-defined vectors that provide value preservation and order preservation, and thus provides the basis for multi-functional aggregation. A homomorphic encryption scheme is used to achieve data privacy. We also present two enhanced versions: a Random-based SEDAR (REDAR) and a Compression-based SEDAR (CEDAR). Both can significantly reduce communication cost, at the trade-off of lower security and lower accuracy, respectively. Experimental evaluations, based on six different scenes of real data, show that all of the schemes have excellent performance on cost and accuracy. PMID:27551747

  13. Cost-Efficient and Multi-Functional Secure Aggregation in Large Scale Distributed Application.

    PubMed

    Zhang, Ping; Li, Wenjun; Sun, Hua

    2016-01-01

    Secure aggregation is an essential component of modern distributed applications and data mining platforms. Aggregated statistical results are typically adopted in constructing a data cube for data analysis at multiple abstraction levels in data warehouse platforms. Generating different types of statistical results efficiently at the same time (referred to as multi-functional support) is a fundamental requirement in practice. However, most existing schemes support a very limited number of statistics, and securely obtaining typical statistical results simultaneously in a distributed system, without recovering the original data, is still an open problem. In this paper, we present SEDAR, a SEcure Data Aggregation scheme under the Range segmentation model. The range segmentation model is proposed to reduce communication cost by capturing the data characteristics; each range uses a different aggregation strategy. For raw data in the dominant range, SEDAR encodes them into well-defined vectors that provide value preservation and order preservation, and thus provides the basis for multi-functional aggregation. A homomorphic encryption scheme is used to achieve data privacy. We also present two enhanced versions: a Random-based SEDAR (REDAR) and a Compression-based SEDAR (CEDAR). Both can significantly reduce communication cost, at the trade-off of lower security and lower accuracy, respectively. Experimental evaluations, based on six different scenes of real data, show that all of the schemes have excellent performance on cost and accuracy.
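
    The abstract does not spell out SEDAR's encoding, so the following is only an illustrative stand-in for the general idea: order-preserving bucket vectors let a single summed vector answer several statistics at once (count, min/max buckets, median bucket). The homomorphic-encryption layer is omitted; only the summed vector would ever be shared:

        import numpy as np

        def encode(value, edges):
            """One-hot bucket vector; bucket order preserves value order."""
            v = np.zeros(len(edges) - 1, dtype=int)
            v[np.searchsorted(edges, value, side='right') - 1] = 1
            return v

        edges = np.linspace(0, 100, 21)            # 20 buckets over the dominant range
        readings = [12.0, 47.5, 49.1, 63.0, 88.2]  # raw data held by different nodes

        agg = sum(encode(r, edges) for r in readings)   # vectors summed, not raw data
        count = agg.sum()
        nonzero = np.nonzero(agg)[0]
        print("count:", count)
        print("min bucket:", edges[nonzero[0]], "-", edges[nonzero[0] + 1])
        print("max bucket:", edges[nonzero[-1]], "-", edges[nonzero[-1] + 1])
        median_bucket = np.searchsorted(np.cumsum(agg), (count + 1) / 2)
        print("median bucket:", edges[median_bucket], "-", edges[median_bucket + 1])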

  14. Statistical regularities of art images and natural scenes: spectra, sparseness and nonlinearities.

    PubMed

    Graham, Daniel J; Field, David J

    2007-01-01

    Paintings are the product of a process that begins with ordinary vision in the natural world and ends with manipulation of pigments on canvas. Because artists must produce images that can be seen by a visual system that is thought to exploit statistical regularities in natural scenes, artists are likely to replicate many of these regularities in their painted art. We have tested this notion by computing basic statistical properties and modeled cell response properties for a large set of digitized paintings and natural scenes. We find that both representational and non-representational (abstract) paintings from our sample (124 images) show basic similarities to a sample of natural scenes in terms of their spatial frequency amplitude spectra, but the paintings and natural scenes show significantly different mean amplitude spectrum slopes. We also find that the intensity distributions of paintings show lower skewness and sparseness than natural scenes. We account for this by considering the range of luminances found in the environment compared with the range available in the medium of paint: a painting's range is limited by the reflective properties of its materials. We argue that artists do not simply scale the intensity range down but use a compressive nonlinearity. In our studies, modeled retinal and cortical filter responses to the images were less sparse for the paintings than for the natural scenes, but when a compressive nonlinearity was applied to the images, both the paintings' intensity distributions and the modeled responses to the paintings showed the same or greater sparseness compared with the natural scenes. This suggests that artists achieve some degree of nonlinear compression in their paintings. Because paintings have captivated humans for millennia, finding basic statistical regularities in paintings' spatial structure could grant insights into the range of spatial patterns that humans find compelling.
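
    Two of the measurements described above, the amplitude-spectrum slope and the intensity skewness before and after a compressive nonlinearity, can be sketched as follows. Random pixels stand in for a digitized painting; natural scenes typically show slopes near -1 on these log-log axes, while white noise sits near 0:

        import numpy as np
        from scipy.stats import skew

        def spectrum_slope(img):
            """Slope of the radially averaged amplitude spectrum, log-log axes."""
            amp = np.abs(np.fft.fftshift(np.fft.fft2(img)))
            n = img.shape[0]
            y, x = np.indices(img.shape)
            r = np.hypot(x - n//2, y - n//2).astype(int)
            radial = np.bincount(r.ravel(), amp.ravel()) / np.bincount(r.ravel())
            f = np.arange(1, n//2)                  # skip DC, stay under Nyquist
            return np.polyfit(np.log(f), np.log(radial[1:n//2]), 1)[0]

        rng = np.random.default_rng(5)
        img = rng.random((256, 256))                # stand-in for a digitized image
        print("slope:", spectrum_slope(img))

        # Skewness before and after a compressive (power-law) nonlinearity:
        print(skew(img.ravel()), skew((img.ravel() + 1e-6) ** 0.5))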

  15. Mississippi Public Junior Colleges Statistical Data, 1985-86.

    ERIC Educational Resources Information Center

    Moody, George V.; And Others

    Statistical data for the 1985-86 academic year are presented here for Mississippi's 15 public junior colleges, including information on enrollments, degrees and certificates awarded, revenues, expenditures, academic salary ranges, transportation services, dormitory utilization, and auxiliary enterprises. Introductory remarks and the Board of…

  16. Mississippi Community and Junior Colleges. Statistical Data, 1986-87.

    ERIC Educational Resources Information Center

    Moody, George V.; And Others

    Statistical data for Mississippi's 15 public community and junior college districts are presented in this document, providing information on enrollments, degrees and certificates awarded, revenues, expenditures, academic salary ranges, learning resources, transportation services, dormitory utilization, and auxiliary enterprises in 1986-87.…

  17. Methods for detrending success metrics to account for inflationary and deflationary factors*

    NASA Astrophysics Data System (ADS)

    Petersen, A. M.; Penner, O.; Stanley, H. E.

    2011-01-01

    Time-dependent economic, technological, and social factors can artificially inflate or deflate quantitative measures of career success. Here we develop and test a statistical method for normalizing career success metrics across time-dependent factors. In particular, this method addresses the long-standing question: how do we compare the career achievements of professional athletes from different historical eras? Developing an objective approach will be of particular importance over the next decade as major league baseball (MLB) players from the "steroids era" become eligible for Hall of Fame induction. Some experts are calling for asterisks (*) to be placed next to the career statistics of athletes found guilty of using performance-enhancing drugs (PED). Here we address this issue, as well as the general problem of comparing statistics from distinct eras, by detrending the seasonal statistics of professional baseball players. We detrend player statistics by normalizing achievements to seasonal averages, which accounts for changes in relative player ability resulting from a range of factors. Our methods are general and can be extended to various arenas of competition where time-dependent factors play a key role. For five statistical categories, we compare the probability density function (pdf) of detrended career statistics to the pdf of raw career statistics calculated for all player careers in the 90-year period 1920-2009. We find that the functional form of these pdfs is stationary under detrending. This stationarity implies that the statistical regularity observed in the right-skewed distributions for longevity and success in professional sports arises from both the wide range of intrinsic talent among athletes and the underlying nature of competition. We fit the pdfs for career success with the Gamma distribution in order to calculate objective benchmarks, based on extreme statistics, which can be used for the identification of extraordinary careers.
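
    One simple reading of "normalizing achievements to seasonal averages" is to rescale each season total by that season's league average, so that era-wide inflation or deflation cancels out; all numbers below are placeholders, not the paper's data:

        import numpy as np

        # Hypothetical season totals: {season: {player: home_runs}}
        seasons = {
            1927: {"player_a": 60, "player_b": 47, "player_c": 30},
            2001: {"player_d": 73, "player_e": 64, "player_f": 49},
        }

        baseline = np.mean([hr for s in seasons.values() for hr in s.values()])

        detrended = {}
        for year, totals in seasons.items():
            season_avg = np.mean(list(totals.values()))
            for player, hr in totals.items():
                # Rescaling by the season average removes era-wide effects
                # (equipment, rules, PED use, ...) from the comparison.
                detrended[player] = hr * baseline / season_avg
        print(detrended)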

  18. Computing physical properties with quantum Monte Carlo methods with statistical fluctuations independent of system size.

    PubMed

    Assaraf, Roland

    2014-12-01

    We show that the recently proposed correlated sampling without reweighting procedure extends the locality (asymptotic independence of the system size) of a physical property to the statistical fluctuations of its estimator. This makes the approach potentially vastly more efficient for computing space-localized properties in large systems compared with standard correlated methods. A proof is given for a large collection of noninteracting fragments. Calculations on hydrogen chains suggest that this behavior holds not only for systems displaying short-range correlations, but also for systems with long-range correlations.

  19. Theoretical Studies of Kinetic Mechanisms of Negative Ion Formation in Plasmas.

    DTIC Science & Technology

    1987-06-01

    [Extraction-garbled fragment of the report's list of illustrations and text. Recoverable content: Figure 1, "Long-Range Behavior of Excited States of Li2"; Figure 2, "Long-Range Behavior of Excited States of Li2" (state labels garbled in the source). The text fragment reports that one potential yields a statistically better fit (χ² = 0.002) than the Ceperley and Partridge potential (χ² = 0.01; Ref. 24), that other potentials, including those reported by Jordan and Amdur (Ref. 37), yield significantly poorer statistical fits, and that the new potential of Nitz et al. has not been analyzed (text truncated).]

  20. Entropic Repulsion Between Fluctuating Surfaces

    NASA Astrophysics Data System (ADS)

    Janke, W.

    The statistical mechanics of fluctuating surfaces plays an important role in a variety of physical systems, ranging from biological membranes to world sheets of strings in theories of fundamental interactions. In many applications it is a good approximation to assume that the surfaces possess no tension. Their statistical properties are then governed by curvature energies only, which allow for gigantic out-of-plane undulations. These fluctuations are the “entropic” origin of long-range repulsive forces in layered surface systems. Theoretical estimates of these forces for simple model surfaces are surveyed and compared with recent Monte Carlo simulations.

  1. Comparison of 2- and 10-micron coherent Doppler lidar performance

    NASA Technical Reports Server (NTRS)

    Frehlich, Rod

    1995-01-01

    The performance of 2- and 10-micron coherent Doppler lidar is presented in terms of the statistical distribution of the maximum-likelihood velocity estimator from simulations for fixed range resolution and fixed velocity search space as a function of the number of coherent photoelectrons per estimate. The wavelength dependence of the aerosol backscatter coefficient, the detector quantum efficiency, and the atmospheric extinction produce a simple shift of the performance curves. Results are presented for a typical boundary layer measurement and a space-based measurement for two regimes: the pulse-dominated regime where the signal statistics are determined by the transmitted pulse, and the atmospheric-dominated regime where the signal statistics are determined by the velocity fluctuations over the range gate. The optimal choice of wavelength depends on the problem under consideration.

  2. Measurements of Transatmospheric Attenuation Statistics at the Microwave Frequencies : 15, 19, and 34 GHz

    DOT National Transportation Integrated Search

    1971-06-01

    Attenuation statistics resulting from a twelve month observation program are presented. The sun is used as a source of microwave radiation. The dynamic range of atmospheric attenuation measurement capability is in excess of 30 dB. Solar radiation cha...

  3. Statistical Interpretation of the Local Field Inside Dielectrics.

    ERIC Educational Resources Information Center

    Berrera, Ruben G.; Mello, P. A.

    1982-01-01

    Compares several derivations of the Clausius-Mossotti relation to analyze consistently the nature of approximations used and their range of applicability. Also presents a statistical-mechanical calculation of the local field for classical system of harmonic oscillators interacting via the Coulomb potential. (Author/SK)

  4. Methods for estimating selected low-flow frequency statistics and harmonic mean flows for streams in Iowa

    USGS Publications Warehouse

    Eash, David A.; Barnes, Kimberlee K.

    2017-01-01

    A statewide study was conducted to develop regression equations for estimating six selected low-flow frequency statistics and harmonic mean flows for ungaged stream sites in Iowa. The estimation equations developed for the six low-flow frequency statistics include: the annual 1-, 7-, and 30-day mean low flows for a recurrence interval of 10 years, the annual 30-day mean low flow for a recurrence interval of 5 years, and the seasonal (October 1 through December 31) 1- and 7-day mean low flows for a recurrence interval of 10 years. Estimation equations also were developed for the harmonic-mean-flow statistic. Estimates of these seven selected statistics are provided for 208 U.S. Geological Survey continuous-record streamgages using data through September 30, 2006. The study area comprises streamgages located within Iowa and 50 miles beyond the State's borders. Because trend analyses indicated statistically significant positive trends when considering the entire period of record for the majority of the streamgages, the longest, most recent period of record without a significant trend was determined for each streamgage for use in the study. The median number of years of record used to compute each of these seven selected statistics was 35. Geographic information system software was used to measure 54 selected basin characteristics for each streamgage. Following the removal of two streamgages from the initial data set, data collected for 206 streamgages were compiled to investigate three approaches for regionalization of the seven selected statistics. Regionalization, a process using statistical regression analysis, provides a relation for efficiently transferring information from a group of streamgages in a region to ungaged sites in the region. The three regionalization approaches tested included statewide, regional, and region-of-influence regressions. For the regional regression, the study area was divided into three low-flow regions on the basis of hydrologic characteristics, landform regions, and soil regions. A comparison of root mean square errors and average standard errors of prediction for the statewide, regional, and region-of-influence regressions determined that the regional regression provided the best estimates of the seven selected statistics at ungaged sites in Iowa. Because a significant number of streams in Iowa reach zero flow as their minimum flow during low-flow years, four different types of regression analyses were used: left-censored, logistic, generalized-least-squares, and weighted-least-squares regression. A total of 192 streamgages were included in the development of 27 regression equations for the three low-flow regions. For the northeast and northwest regions, a censoring threshold was used to develop 12 left-censored regression equations to estimate the 6 low-flow frequency statistics for each region. For the southern region a total of 12 regression equations were developed; 6 logistic regression equations were developed to estimate the probability of zero flow for the 6 low-flow frequency statistics and 6 generalized least-squares regression equations were developed to estimate the 6 low-flow frequency statistics, if nonzero flow is estimated first by use of the logistic equations. A weighted-least-squares regression equation was developed for each region to estimate the harmonic-mean-flow statistic. 
Average standard errors of estimate for the left-censored equations for the northeast region range from 64.7 to 88.1 percent and for the northwest region range from 85.8 to 111.8 percent. Misclassification percentages for the logistic equations for the southern region range from 5.6 to 14.0 percent. Average standard errors of prediction for generalized least-squares equations for the southern region range from 71.7 to 98.9 percent and pseudo coefficients of determination for the generalized-least-squares equations range from 87.7 to 91.8 percent. Average standard errors of prediction for weighted-least-squares equations developed for estimating the harmonic-mean-flow statistic for each of the three regions range from 66.4 to 80.4 percent. The regression equations are applicable only to stream sites in Iowa with low flows not significantly affected by regulation, diversion, or urbanization and with basin characteristics within the range of those used to develop the equations. If the equations are used at ungaged sites on regulated streams, or on streams affected by water-supply and agricultural withdrawals, then the estimates will need to be adjusted by the amount of regulation or withdrawal to estimate the actual flow conditions if that is of interest. Caution is advised when applying the equations for basins with characteristics near the applicable limits of the equations and for basins located in karst topography. A test of two drainage-area ratio methods using 31 pairs of streamgages, for the annual 7-day mean low-flow statistic for a recurrence interval of 10 years, indicates a weighted drainage-area ratio method provides better estimates than regional regression equations for an ungaged site on a gaged stream in Iowa when the drainage-area ratio is between 0.5 and 1.4. These regression equations will be implemented within the U.S. Geological Survey StreamStats web-based geographic-information-system tool. StreamStats allows users to click on any ungaged site on a river and compute estimates of the seven selected statistics; in addition, 90-percent prediction intervals and the measured basin characteristics for the ungaged sites also are provided. StreamStats also allows users to click on any streamgage in Iowa and estimates computed for these seven selected statistics are provided for the streamgage.
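
    For the drainage-area ratio transfer mentioned above, the basic relation is Q_u = Q_g * (A_u / A_g)^b; the report's specific weighting with the regional-regression estimate is not reproduced here, and the exponent below is only a placeholder:

        def drainage_area_ratio(q_gaged, a_gaged, a_ungaged, b=1.0):
            """Transfer a flow statistic from a streamgage to an ungaged site
            on the same stream: Q_u = Q_g * (A_u / A_g)**b. The exponent b
            and any weighting with the regression estimate would follow the
            report; b = 1.0 here is illustrative."""
            return q_gaged * (a_ungaged / a_gaged) ** b

        # Ungaged site with 80% of the gaged drainage area, i.e. a ratio
        # within the 0.5-1.4 window noted above:
        print(drainage_area_ratio(q_gaged=12.5, a_gaged=250.0, a_ungaged=200.0))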

  5. Image dynamic range test and evaluation of Gaofen-2 dual cameras

    NASA Astrophysics Data System (ADS)

    Zhang, Zhenhua; Gan, Fuping; Wei, Dandan

    2015-12-01

    In order to fully understand the dynamic range of Gaofen-2 satellite data and to support data processing, application, and the development of subsequent satellites, this article evaluates the dynamic range by calculating statistics such as the maximum, minimum, mean, and standard deviation of four images obtained at the same time by the Gaofen-2 dual cameras over the Beijing area. The same four statistics were then calculated for each longitudinal overlap of PMS1 and PMS2 to evaluate the dynamic-range consistency within each camera, and for each latitudinal overlap of PMS1 and PMS2 to evaluate the dynamic-range consistency between the two cameras. The results suggest that the images obtained by PMS1 and PMS2 have a wide dynamic range of DN values and contain rich information on ground objects. In general, the dynamic ranges of images from a single camera agree closely, with only small differences, as do those of the dual cameras; the consistency of dynamic range between single-camera images is better than that between the dual cameras.
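
    The four statistics themselves are a one-line computation per overlap window; a sketch with synthetic digital numbers standing in for the Gaofen-2 scenes:

        import numpy as np

        def overlap_stats(img, rows, cols):
            """Dynamic-range statistics over an overlap window of an image."""
            win = img[rows[0]:rows[1], cols[0]:cols[1]].astype(float)
            return win.min(), win.max(), win.mean(), win.std()

        scene = np.random.default_rng(7).integers(0, 1024, size=(1000, 1000))
        print(overlap_stats(scene, rows=(0, 200), cols=(0, 1000)))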

  6. Numerical solutions for patterns statistics on Markov chains.

    PubMed

    Nuel, Gregory

    2006-01-01

    We propose here a review of the methods available to compute pattern statistics on text generated by a Markov source. Theoretical as well as numerical aspects are detailed for a wide range of techniques (exact, Gaussian, large deviations, binomial and compound Poisson). The SPatt package (Statistics for Pattern, free software available at http://stat.genopole.cnrs.fr/spatt), implementing all these methods, is then used to compare the approaches in terms of computational time and reliability in the most complete pattern statistics benchmark available at the present time.
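
    As a flavor of the exact computations such packages perform, the expected count of a pattern under a stationary first-order Markov source has a closed form; a sketch (not the SPatt implementation):

        import numpy as np

        def expected_pattern_count(pattern, P, mu, n):
            """Expected occurrences of `pattern` in a length-n text emitted by
            a stationary first-order Markov chain with transition matrix P and
            stationary distribution mu."""
            p = mu[pattern[0]]
            for a, b in zip(pattern, pattern[1:]):
                p *= P[a, b]
            return (n - len(pattern) + 1) * p

        # Two-letter alphabet {0, 1} with a persistence bias:
        P = np.array([[0.7, 0.3],
                      [0.4, 0.6]])
        mu = np.array([4/7, 3/7])          # stationary: mu @ P == mu
        print(expected_pattern_count([0, 1, 1], P, mu, n=10_000))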

  7. Analysis of Two Different Arthroscopic Broström Repair Constructs for Treatment of Chronic Lateral Ankle Instability in 110 Patients: A Retrospective Cohort Study.

    PubMed

    Cottom, James M; Baker, Joseph; Plemmons, Britton S

    Chronic lateral ankle instability is a common condition treated by most foot and ankle surgeons. Once conservative treatment has failed, patients often undergo surgical reconstruction, either anatomic or nonanatomic. The present retrospective cohort study compared the clinical outcomes of 2 different arthroscopic Broström procedures. A total of 110 patients (83 females [75.5%] and 27 males [24.5%]) were treated with 1 of the 2 lateral ankle stabilization techniques from October 1, 2014 to December 31, 2015. Of the 110 patients, 75 were included in the arthroscopic lateral ankle stabilization group with an additional suture anchor used proximally and 35 were included in the arthroscopic lateral ankle stabilization group using the knotless design. The mean age of the cohort was 46.05 ± 17.89 (range 12 to 83) years, and the mean body mass index was 30.03 ± 7.42 (range 18.3 to 52.5) kg/m². Of the 110 patients, 25 (22.7%) had undergone concomitant procedures during lateral ankle stabilization. Overall, postoperative complications occurred in 14 patients (12.7%). No statistically significant differences were found between the 2 groups regarding the complication rates, use of concomitant procedures, or the presence of diabetes and workers' compensation claims. No statistically significant differences were found in the mean age, body mass index, or gender distribution between the 2 groups. The preoperative American Orthopaedic Foot and Ankle Society (AOFAS) Ankle-Hindfoot scores were 50.85 ± 13.56 (range 18 to 76) and 51.26 ± 13.32 (range 18 to 69) in groups 1 and 2, respectively. The postoperative AOFAS Ankle-Hindfoot scores were 88.19 ± 10.72 (range 54 to 100) and 84 ± 15.41 (range 16 to 100) in groups 1 and 2, respectively; this difference was not statistically significant. The preoperative visual analog scale scores were 7.45 ± 1.39 (range 3 to 10) and 6.97 ± 1.25 (range 5 to 10), which improved to 1.12 ± 1.38 (range 0 to 5) and 1.8 ± 1.98 (range 1 to 9) postoperatively for groups 1 and 2, respectively. The difference in the postoperative visual analog scale score between the 2 groups was statistically significant. The preoperative and postoperative AOFAS scale, Foot Function Index, and Karlsson-Peterson scores showed no statistically significant differences between the 2 groups. In our experience, either procedure is an acceptable treatment option for chronic lateral ankle instability, with the knotless technique showing a trend toward more complications. Copyright © 2017 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  8. Sensors and signal processing for high accuracy passenger counting : final report.

    DOT National Transportation Integrated Search

    2009-03-05

    It is imperative for a transit system to track statistics about their ridership in order to plan bus routes. There exists a wide variety of methods for obtaining these statistics that range from relying on the driver to count people to utilizing came...

  9. 76 FR 11195 - Request for Nominations of Members To Serve on the Census Scientific Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-01

    [Fragment of the notice. Recoverable content: the Census Bureau seeks members with technical expertise in disciplines including demography, economics, geography, psychology, statistics, survey methodology, social and behavioral sciences, information technology and computing, econometrics, cognitive psychology, and computer science, as they pertain to the full range of Census Bureau programs (text truncated in source).]

  10. Statistical Model of Dynamic Markers of the Alzheimer's Pathological Cascade.

    PubMed

    Balsis, Steve; Geraci, Lisa; Benge, Jared; Lowe, Deborah A; Choudhury, Tabina K; Tirso, Robert; Doody, Rachelle S

    2018-05-05

    Alzheimer's disease (AD) is a progressive disease reflected in markers across assessment modalities, including neuroimaging, cognitive testing, and evaluation of adaptive function. Identifying a single continuum of decline across assessment modalities in a single sample is statistically challenging because of the multivariate nature of the data. To address this challenge, we implemented advanced statistical analyses designed specifically to model complex data across a single continuum. We analyzed data from the Alzheimer's Disease Neuroimaging Initiative (ADNI; N = 1,056), focusing on indicators from the assessments of magnetic resonance imaging (MRI) volume, fluorodeoxyglucose positron emission tomography (FDG-PET) metabolic activity, cognitive performance, and adaptive function. Item response theory was used to identify the continuum of decline. Then, through a process of statistical scaling, indicators across all modalities were linked to that continuum and analyzed. Findings revealed that measures of MRI volume, FDG-PET metabolic activity, and adaptive function added measurement precision beyond that provided by cognitive measures, particularly in the relatively mild range of disease severity. More specifically, MRI volume, and FDG-PET metabolic activity become compromised in the very mild range of severity, followed by cognitive performance and finally adaptive function. Our statistically derived models of the AD pathological cascade are consistent with existing theoretical models.

  11. Using the Bootstrap Method to Evaluate the Critical Range of Misfit for Polytomous Rasch Fit Statistics.

    PubMed

    Seol, Hyunsoo

    2016-06-01

    The purpose of this study was to apply the bootstrap procedure to evaluate how the bootstrapped confidence intervals (CIs) for polytomous Rasch fit statistics differ according to sample size and test length, in comparison with the rule-of-thumb critical values for misfit. A total of 25 simulated data sets were generated to fit the Rasch measurement model, and 1,000 replications were then conducted to compute the bootstrapped CIs under each of the 25 testing conditions. The results showed that rule-of-thumb critical values for assessing the magnitude of misfit were not applicable, because the infit and outfit mean square error statistics showed different magnitudes of variability across testing conditions and the standardized fit statistics did not exactly follow the standard normal distribution. Further, the item and person misfit statistics do not share the same critical range. Based on the results of the study, the bootstrapped CIs offer a reasonable alternative for identifying misfitting items or persons, especially when the distributions of the infit and outfit statistics are not well known and depend on sample size. © The Author(s) 2016.
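
    A generic percentile-bootstrap interval, of the kind used here in place of rule-of-thumb cutoffs, can be sketched as follows; the lognormal values merely stand in for observed fit statistics, and the study's Rasch-specific resampling is not reproduced:

        import numpy as np

        def bootstrap_ci(sample, statistic, n_boot=1000, alpha=0.05, seed=0):
            """Percentile bootstrap confidence interval for any statistic."""
            rng = np.random.default_rng(seed)
            boots = [statistic(rng.choice(sample, size=len(sample), replace=True))
                     for _ in range(n_boot)]
            return np.percentile(boots, [100*alpha/2, 100*(1 - alpha/2)])

        # Illustrative "infit mean square"-like values for a set of items:
        fit_stats = np.random.default_rng(1).lognormal(0.0, 0.15, size=40)
        lo, hi = bootstrap_ci(fit_stats, np.mean)
        print(f"95% CI: [{lo:.3f}, {hi:.3f}]")
        # Items whose observed fit falls outside the bootstrapped interval
        # would be flagged as misfitting, rather than applying a fixed cutoff.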

  12. Dissipative Effects on Inertial-Range Statistics at High Reynolds Numbers.

    PubMed

    Sinhuber, Michael; Bewley, Gregory P; Bodenschatz, Eberhard

    2017-09-29

    Using the unique capabilities of the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, Göttingen, we report experimental measurements in classical grid turbulence that uncover oscillations of the velocity structure functions in the inertial range. This was made possible by measuring extremely long time series of up to 10^10 samples of the turbulent fluctuating velocity, corresponding to O(10^7) integral length scales. The measurements were conducted in a well-controlled environment at a wide range of high Reynolds numbers, from R_λ = 110 up to R_λ = 1600, using both traditional hot-wire probes and the nanoscale thermal anemometry probe developed at Princeton University. An implication of the observed oscillations is that dissipation influences the inertial-range statistics of turbulent flows at scales significantly larger than predicted by current models and theories.

  13. Turbulent statistics and intermittency enhancement in coflowing superfluid 4He

    NASA Astrophysics Data System (ADS)

    Biferale, L.; Khomenko, D.; L'vov, V.; Pomyalov, A.; Procaccia, I.; Sahoo, G.

    2018-02-01

    The large-scale turbulent statistics of mechanically driven superfluid 4He were shown experimentally to follow their classical counterpart. In this paper, we use direct numerical simulations to study the whole range of scales over a range of temperatures T ∈ [1.3, 2.1] K. The numerics employ self-consistent and nonlinearly coupled normal and superfluid components. The main results are that (i) the velocity fluctuations of the normal and super components are well correlated in the inertial range of scales but decorrelate at small scales; (ii) the energy transfer by mutual friction between components is particularly efficient in the temperature range between 1.8 and 2 K, leading to enhancement of small-scale intermittency at these temperatures; and (iii) at low T and close to Tλ, the scaling properties of the energy spectra and structure functions of the two components approach those of classical hydrodynamic turbulence.

  14. Directional change of fluid particles in two-dimensional turbulence and of football players

    NASA Astrophysics Data System (ADS)

    Kadoch, Benjamin; Bos, Wouter J. T.; Schneider, Kai

    2017-06-01

    Multiscale directional statistics are investigated in two-dimensional incompressible turbulence. It is shown that the short-time behavior of the mean angle of directional change of fluid particles is linearly dependent on the time lag and that no inertial range behavior is observed in the directional change associated with the enstrophy-cascade range. In simulations of the inverse-cascade range, the directional change shows a power law behavior at inertial range time scales. By comparing the directional change in space-periodic and wall-bounded flow, it is shown that the probability density function of the directional change at long times carries the signature of the confinement. The geometrical origin of this effect is validated by Monte Carlo simulations. The same effect is also observed in the directional statistics computed from the trajectories of football players (soccer players in American English).

  15. Model for neural signaling leap statistics

    NASA Astrophysics Data System (ADS)

    Chevrollier, Martine; Oriá, Marcos

    2011-03-01

    We present a simple model for neural signaling leaps in the brain that considers only the thermodynamic (Nernst) potential in neuron cells and brain temperature. We numerically simulated connections between arbitrarily localized neurons and analyzed the frequency distribution of the distances reached. We observed a qualitative change between Normal statistics (T = 37.5°C, awake regime) and Lévy statistics (T = 35.5°C, sleeping period), the latter characterized by rare events of long-range connections.

  16. Statistical Approach To Extraction Of Texture In SAR

    NASA Technical Reports Server (NTRS)

    Rignot, Eric J.; Kwok, Ronald

    1992-01-01

    Improved statistical method of extraction of textural features in synthetic-aperture-radar (SAR) images takes account of effects of the scheme used to sample raw SAR data, system noise, resolution of the radar equipment, and speckle. Speckle is treated as part of an overall statistical treatment, together with system noise and natural variations in texture. One computes the speckle autocorrelation function from the system transfer function, which expresses the effect of the radar aperture and incorporates the range and azimuth resolutions.

  17. HHE/LORAN-C Surveying.

    DTIC Science & Technology

    1982-11-01

    [OCR-garbled report documentation page and table of contents; most of the text is unrecoverable. Legible entries include: Pre-Survey Planning; Overview; Waypoint Definition; TDSS Statistics Summary; Example of Range-Range Waypoint Calculation; Summary of Range-Range Waypoint (truncated). The documentation page appears to carry a November 1982 date and the report number CG-D-Y4-82 (possibly garbled).]

  18. Qualitative Meta-Analysis on the Hospital Task: Implications for Research

    ERIC Educational Resources Information Center

    Noll, Jennifer; Sharma, Sashi

    2014-01-01

    The "law of large numbers" indicates that as sample size increases, sample statistics become less variable and more closely estimate their corresponding population parameters. Different research studies investigating how people consider sample size when evaluating the reliability of a sample statistic have found a wide range of…

  19. Appalachian Children and Their Families. A Statistical Profile.

    ERIC Educational Resources Information Center

    CSR, Inc., Arlington, VA.

    A statistical profile of Appalachia's young children, from birth to 9 years, was compiled from federal and state data sources. The profile provides information important in making immediate and long range plans for improving the status of Appalachian children and their families. An examination of family living conditions suggests that Appalachian…

  20. Children in the States, 2000.

    ERIC Educational Resources Information Center

    Andrejack, Kate, Comp.; Judge, Amy, Comp.; Simons, Janet, Comp.

    This data book provides statistics on a range of indicators that measure critical aspects of children's lives in each of the 50 states and the District of Columbia. Statistics are provided in the following categories: (1) national rankings in population and family characteristics; (2) health and disabilities (including children lacking health…

  1. Children in the States, 2001.

    ERIC Educational Resources Information Center

    Judge, Amy, Comp.

    This data book provides statistics on a range of indicators that measure critical aspects of children's lives in each of the 50 states and the District of Columbia. Statistics are provided in the following categories: (1) child health, including uninsured children, low birth weight babies, infant deaths, and immunizations; (2) child care and early…

  2. Statistical behavior of the tensile property of heated cotton fiber

    USDA-ARS?s Scientific Manuscript database

    The temperature dependence of the tensile properties of single cotton fibers was studied in the range of 160-300°C using the Favimat test, and its statistical behavior was interpreted in terms of structural changes. The tenacity of the control cotton fiber was well described by the single Weibull distribution,...

  3. 75 FR 51335 - Revised Medical Criteria for Evaluating Mental Disorders

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-19

    ... Statistical Manual of Mental Disorders (DSM). We have also gained considerable adjudicative experience in... "mild" range in the current edition of the DSM, the Diagnostic and Statistical Manual of Mental... sometimes four, parts. The first part of every mental disorder listing is a brief introductory paragraph...

  4. Exploring Statistics Anxiety: Contrasting Mathematical, Academic Performance and Trait Psychological Predictors

    ERIC Educational Resources Information Center

    Bourne, Victoria J.

    2018-01-01

    Statistics anxiety is experienced by a large number of psychology students, and previous research has examined a range of potential correlates, including academic performance, mathematical ability and psychological predictors. These varying predictors are often considered separately, although there may be shared variance between them. In the…

  5. Spatiotemporal correlation of optical coherence tomography in-vivo images of rabbit airway for the diagnosis of edema

    NASA Astrophysics Data System (ADS)

    Kang, DongYel; Wang, Alex; Volgger, Veronika; Chen, Zhongping; Wong, Brian J. F.

    2015-07-01

    Detection of an early stage of subglottic edema is vital for airway management and the prevention of stenosis, a life-threatening condition in critically ill neonates. As an observer for the task of diagnosing edema in vivo, we investigated the spatiotemporal correlation (STC) of full-range optical coherence tomography (OCT) images acquired in the rabbit airway with experimentally simulated edema. Operating the STC observer on OCT images generates STC coefficients as test statistics for the statistical decision task. From these, receiver operating characteristic (ROC) curves for the diagnosis of airway edema with full-range OCT in vivo images were extracted and the areas under the ROC curves were calculated. These statistically quantified results demonstrate the potential clinical feasibility of the STC method as a means to identify early airway edema.
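
    Extracting an ROC curve and its area from a scalar test statistic is generic; a sketch with illustrative stand-ins for the STC coefficients of the two classes (the AUC uses the Mann-Whitney identity):

        import numpy as np

        def roc_auc(stat_pos, stat_neg):
            """Empirical ROC points and AUC from two classes of test statistics."""
            thresholds = np.sort(np.concatenate([stat_pos, stat_neg]))[::-1]
            tpr = np.array([(stat_pos >= t).mean() for t in thresholds])
            fpr = np.array([(stat_neg >= t).mean() for t in thresholds])
            # AUC via the Mann-Whitney identity: P(stat_pos > stat_neg)
            auc = (stat_pos[:, None] > stat_neg[None, :]).mean() \
                + 0.5 * (stat_pos[:, None] == stat_neg[None, :]).mean()
            return fpr, tpr, auc

        rng = np.random.default_rng(6)
        edema  = rng.normal(0.7, 0.15, 50)   # illustrative STC coefficients
        normal = rng.normal(0.4, 0.15, 50)
        fpr, tpr, auc = roc_auc(edema, normal)
        print(f"area under ROC curve: {auc:.2f}")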

  6. Thermodynamics of ideal quantum gas with fractional statistics in D dimensions.

    PubMed

    Potter, Geoffrey G; Müller, Gerhard; Karbach, Michael

    2007-06-01

    We present exact and explicit results for the thermodynamic properties (isochores, isotherms, isobars, response functions, velocity of sound) of a quantum gas in dimensions D ≥ 1 and with fractional exclusion statistics 0 ≤ g ≤ 1 connecting bosons (g = 0) and fermions (g = 1). In D = 1 the results are equivalent to those of the Calogero-Sutherland model. Emphasis is given to the crossover between bosonlike and fermionlike features, caused by aspects of the statistical interaction that mimic long-range attraction and short-range repulsion. A phase transition along the isobar occurs at a nonzero temperature in all dimensions. The T dependence of the velocity of sound is in simple relation to isochores and isobars. The effects of soft container walls are accounted for rigorously for the case of a pure power-law potential.

  7. Primer of statistics in dental research: part I.

    PubMed

    Shintani, Ayumi

    2014-01-01

    Statistics play essential roles in evidence-based dentistry (EBD) practice and research, ranging widely from formulating scientific questions, designing studies, and collecting and analyzing data to interpreting, reporting, and presenting study findings. Mastering statistical concepts appears to be an unreachable goal for many dental researchers, in part because statistical authorities often cannot explain statistical principles to health researchers without resorting to complex mathematical concepts. This series of two articles aims to introduce dental researchers to 9 essential topics in statistics for conducting EBD, with intuitive examples. Part I of the series covers the first 5 topics: (1) statistical graphs, (2) how to deal with outliers, (3) p-values and confidence intervals, (4) testing equivalence, and (5) multiplicity adjustment. Part II will follow to cover the remaining topics: (6) selecting the proper statistical tests, (7) repeated measures analysis, (8) epidemiological considerations for causal association, and (9) analysis of agreement. Copyright © 2014. Published by Elsevier Ltd.

  8. Meteor trail footprint statistics

    NASA Astrophysics Data System (ADS)

    Mui, S. Y.; Ellicott, R. C.

    Footprint statistics derived from field-test data are presented. The statistics are the probability that two receivers will lie in the same footprint. The dependence of the footprint statistics on transmitter range, link orientation, and antenna polarization is examined. Empirical expressions for the footprint statistics are presented. The need to distinguish the instantaneous footprint, which is the area illuminated at a particular instant, from the composite footprint, which is the total area illuminated during the lifetime of the meteor trail, is explained. The statistics for the instantaneous and composite footprints have been found to be similar. The only significant difference lies in the parameter that represents the probability of two colocated receivers being in the same footprint. The composite footprint statistics can be used to calculate the space diversity gain of a multiple-receiver system. The instantaneous footprint statistics are useful in the evaluation of the interference probability in a network of meteor burst communication nodes.

  9. Statistical Studies of the Electric Breakdown in Nitrogen in the Duration Range of 3 ms-60 min

    NASA Astrophysics Data System (ADS)

    Gorokhov, V. V.; Karelin, V. I.; Perminov, A. V.; Repin, P. B.

    2018-05-01

    The statistical characteristics of electric breakdown in nitrogen in a spike (cathode)-plane gap have been studied over the duration range of 3 × 10^-3 s to 3600 s at voltages close to the static breakdown voltage. The probability of gap breakdown was found to be nonmonotonically distributed over time. The presence of maxima in the probability distribution confirms the contribution of processes that both stimulate and suppress breakdown. The typical times of these processes are 30 ms, 10^-1 s, and 300 s.

  10. Lagrangian statistics in weakly forced two-dimensional turbulence.

    PubMed

    Rivera, Michael K; Ecke, Robert E

    2016-01-01

    Measurements of Lagrangian single-point and multiple-point statistics in a quasi-two-dimensional stratified layer system are reported. The system consists of a layer of salt water over an immiscible layer of Fluorinert and is forced electromagnetically so that mean-squared vorticity is injected at a well-defined spatial scale r_i. Simultaneous cascades develop in which enstrophy flows predominately to small scales whereas energy cascades, on average, to larger scales. Lagrangian correlations and one- and two-point displacements are measured for random initial conditions and for initial positions within topological centers and saddles. Some of the behavior of these quantities can be understood in terms of the trapping characteristics of long-lived centers, the slow motion near strong saddles, and the rapid fluctuations outside of either centers or saddles. We also present statistics of Lagrangian velocity fluctuations using energy spectra in frequency space and structure functions in real space. We compare with complementary Eulerian velocity statistics. We find that simultaneous inverse energy and enstrophy ranges present in spectra are not directly echoed in real-space moments of velocity difference. Nevertheless, the spectral ranges line up well with features of moment ratios, indicating that although the moments are not exhibiting unambiguous scaling, the behavior of the probability distribution functions is changing over short ranges of length scales. Implications for understanding weakly forced 2D turbulence with simultaneous inverse and direct cascades are discussed.

  11. The U.S. geological survey rass-statpac system for management and statistical reduction of geochemical data

    USGS Publications Warehouse

    VanTrump, G.; Miesch, A.T.

    1977-01-01

    RASS is an acronym for Rock Analysis Storage System and STATPAC, for Statistical Package. The RASS and STATPAC computer programs are integrated into the RASS-STATPAC system for the management and statistical reduction of geochemical data. The system, in its present form, has been in use for more than 9 yr by scores of U.S. Geological Survey geologists, geochemists, and other scientists engaged in a broad range of geologic and geochemical investigations. The principal advantage of the system is the flexibility afforded the user both in data searches and retrievals and in the manner of statistical treatment of data. The statistical programs provide for most types of statistical reduction normally used in geochemistry and petrology, but also contain bridges to other program systems for statistical processing and automatic plotting. © 1977.

  12. Experimental toxicology: Issues of statistics, experimental design, and replication.

    PubMed

    Briner, Wayne; Kirwan, Jeral

    2017-01-01

    The difficulty of replicating experiments has drawn considerable attention. Issues with replication occur for a variety of reasons ranging from experimental design to laboratory errors to inappropriate statistical analysis. Here we review a variety of guidelines for statistical analysis, design, and execution of experiments in toxicology. In general, replication can be improved by using hypothesis driven experiments with adequate sample sizes, randomization, and blind data collection techniques. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Delay, change and bifurcation of the immunofluorescence distribution attractors in health statuses diagnostics and in medical treatment

    NASA Astrophysics Data System (ADS)

    Galich, Nikolay E.; Filatov, Michael V.

    2008-07-01

    This communication describes immunology experiments and the treatment of the experimental data. New nonlinear methods for the statistical analysis of the immunofluorescence of peripheral blood neutrophils have been developed. We used the respiratory burst reaction of DNA fluorescence in neutrophil cell nuclei due to oxidative activity. Histograms of photon count statistics for radiant neutrophil populations in flow cytometry experiments are considered. Distributions of fluorescence flash frequency as functions of fluorescence intensity are analyzed. Statistical peculiarities of the histogram sets for healthy and unhealthy donors allow all histograms to be divided into three classes. The classification is based on three different types of smoothed, long-range-scale-averaged immunofluorescence distributions and their bifurcations. Heterogeneity peculiarities of the long-range-scale immunofluorescence distributions likewise divide the histograms into three groups. The first group belongs to healthy donors; the two others belong to donors with autoimmune and inflammatory diseases, some of which are not diagnosed by standard biochemical methods. Medical standards and statistical data from the immunofluorescence histograms for identifying health and illness are interconnected. The possibilities and alterations of immunofluorescence statistics in the registration, diagnostics, and monitoring of different diseases under various medical treatments have been demonstrated. Health and illness criteria are connected with statistical features of the immunofluorescence histograms. Neutrophil population fluorescence provides a sensitive, clear indicator of health status.

  14. Probability of detection of internal voids in structural ceramics using microfocus radiography

    NASA Technical Reports Server (NTRS)

    Baaklini, G. Y.; Roth, D. J.

    1986-01-01

    The reliability of microfocus X-radiography for detecting subsurface voids in structural ceramic test specimens was statistically evaluated. The microfocus system was operated in the projection mode using low X-ray photon energies (20 keV) and a 10 μm focal spot. The statistics were developed for implanted subsurface voids in green and sintered silicon carbide and silicon nitride test specimens. These statistics were compared with previously obtained statistics for implanted surface voids in similar specimens. Problems associated with void implantation are discussed. Statistical results are given as probability-of-detection curves at a 95 percent confidence level for voids ranging in size from 20 to 528 μm in diameter.

  15. Probability of detection of internal voids in structural ceramics using microfocus radiography

    NASA Technical Reports Server (NTRS)

    Baaklini, G. Y.; Roth, D. J.

    1985-01-01

    The reliability of microfocus X-radiography for detecting subsurface voids in structural ceramic test specimens was statistically evaluated. The microfocus system was operated in the projection mode using low X-ray photon energies (20 keV) and a 10 μm focal spot. The statistics were developed for implanted subsurface voids in green and sintered silicon carbide and silicon nitride test specimens. These statistics were compared with previously obtained statistics for implanted surface voids in similar specimens. Problems associated with void implantation are discussed. Statistical results are given as probability-of-detection curves at a 95 percent confidence level for voids ranging in size from 20 to 528 μm in diameter.

  16. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research.

    PubMed

    Tang, Qi-Yi; Zhang, Chuan-Xi

    2013-04-01

    A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.

  17. Projecting Range Limits with Coupled Thermal Tolerance - Climate Change Models: An Example Based on Gray Snapper (Lutjanus griseus) along the U.S. East Coast

    PubMed Central

    Hare, Jonathan A.; Wuenschel, Mark J.; Kimball, Matthew E.

    2012-01-01

    We couple a species range limit hypothesis with the output of an ensemble of general circulation models to project the poleward range limit of gray snapper. Using laboratory-derived thermal limits and statistical downscaling from IPCC AR4 general circulation models, we project that gray snapper will shift northwards; the magnitude of this shift is dependent on the magnitude of climate change. We also evaluate the uncertainty in our projection and find that statistical uncertainty associated with the experimentally-derived thermal limits is the largest contributor (∼ 65%) to overall quantified uncertainty. This finding argues for more experimental work aimed at understanding and parameterizing the effects of climate change and variability on marine species. PMID:23284974

  18. Predicted range expansion of Chinese tallow tree (Triadica sebifera) in forestlands of the southern United States

    Treesearch

    Hsiao-Hsuan Wang; William Grant; Todd Swannack; Jianbang Gan; William Rogers; Tomasz Koralewski; James Miller; John W. Taylor Jr.

    2011-01-01

    We present an integrated approach for predicting future range expansion of an invasive species (Chinese tallow tree) that incorporates statistical forecasting and analytical techniques within a spatially explicit, agent-based, simulation framework.

  19. Regional Regression Equations to Estimate Flow-Duration Statistics at Ungaged Stream Sites in Connecticut

    USGS Publications Warehouse

    Ahearn, Elizabeth A.

    2010-01-01

    Multiple linear regression equations for determining flow-duration statistics were developed to estimate select flow exceedances ranging from 25 to 99 percent for six 'bioperiods' (Salmonid Spawning (November), Overwinter (December-February), Habitat Forming (March-April), Clupeid Spawning (May), Resident Spawning (June), and Rearing and Growth (July-October)) in Connecticut. Regression equations also were developed to estimate the 25- and 99-percent flow exceedances without reference to a bioperiod. In total, 32 equations were developed. The predictive equations were based on regression analyses relating flow statistics from streamgages to GIS-determined basin and climatic characteristics for the drainage areas of those streamgages. Thirty-nine streamgages (and an additional 6 short-term streamgages and 28 partial-record sites for the non-bioperiod 99-percent exceedance) in Connecticut and adjacent areas of neighboring States were used in the regression analysis. Weighted least squares regression analysis was used to determine the predictive equations; weights were assigned based on record length. The basin characteristics (drainage area, percentage of area with coarse-grained stratified deposits, percentage of area with wetlands, mean monthly precipitation (November), mean seasonal precipitation (December, January, and February), and mean basin elevation) are used as explanatory variables in the equations. Standard errors of estimate of the 32 equations ranged from 10.7 to 156 percent, with medians of 19.2 and 55.4 percent for predicting the 25- and 99-percent exceedances, respectively. Regression equations to estimate high and median flows (25- to 75-percent exceedances) are better predictors (smaller variability of the residual values around the regression line) than the equations to estimate low flows (greater than 75-percent exceedance). The Habitat Forming (March-April) bioperiod had the smallest standard errors of estimate, ranging from 10.7 to 20.9 percent. In contrast, the Rearing and Growth (July-October) bioperiod had the largest standard errors, ranging from 30.9 to 156 percent. The adjusted coefficient of determination of the equations ranged from 77.5 to 99.4 percent, with medians of 98.5 and 90.6 percent for predicting the 25- and 99-percent exceedances, respectively. Descriptive information on the streamgages used in the regression, measured basin and climatic characteristics, and estimated flow-duration statistics are provided in this report. Flow-duration statistics and the 32 regression equations for estimating flow-duration statistics in Connecticut are stored in the U.S. Geological Survey web application "StreamStats" (http://water.usgs.gov/osw/streamstats/index.html). The regression equations developed in this report can be used to produce unbiased estimates of select flow exceedances statewide.
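
    A minimal sketch of the weighted-least-squares pattern the report describes, with synthetic data: a log-transformed flow statistic regressed on basin characteristics, weighted by record length. The variable names, log-log form, and coefficients are assumptions, not the report's equations:

        # Weighted least squares with record-length weights (synthetic data).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 39                                   # streamgages
        drainage_area = rng.uniform(5, 500, n)   # mi^2 (synthetic)
        precip = rng.uniform(3, 6, n)            # monthly precipitation, in.
        record_years = rng.integers(10, 60, n)   # record length for weighting

        # Synthetic "true" relation in log space plus noise.
        log_q25 = (0.9 * np.log10(drainage_area)
                   + 0.5 * np.log10(precip)
                   + rng.normal(0, 0.1, n))

        X = sm.add_constant(np.column_stack([np.log10(drainage_area),
                                             np.log10(precip)]))
        fit = sm.WLS(log_q25, X, weights=record_years).fit()
        print(fit.params)        # intercept and log-linear coefficients
        print(fit.rsquared_adj)  # adjusted coefficient of determination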

  20. Testing for clustering at many ranges inflates family-wise error rate (FWE).

    PubMed

    Loop, Matthew Shane; McClure, Leslie A

    2015-01-15

    Testing for clustering at multiple ranges within a single dataset is a common practice in spatial epidemiology. It is not documented whether this approach has an impact on the type 1 error rate. We estimated the family-wise error rate (FWE) for the difference in Ripley's K functions test, when testing at an increasing number of ranges at an alpha-level of 0.05. Case and control locations were generated from a Cox process on a square area the size of the continental US (≈3,000,000 mi²). Two thousand Monte Carlo replicates were used to estimate the FWE with 95% confidence intervals when testing for clustering at one range, as well as 10, 50, and 100 equidistant ranges. The estimated FWE and 95% confidence intervals when testing 10, 50, and 100 ranges were 0.22 (0.20 - 0.24), 0.34 (0.31 - 0.36), and 0.36 (0.34 - 0.38), respectively. Testing for clustering at multiple ranges within a single dataset inflated the FWE above the nominal level of 0.05. Investigators should construct simultaneous critical envelopes (available in the spatstat package in R), or use a test statistic that integrates the test statistics from each range, as suggested by the creators of the difference in Ripley's K functions test.
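
    For reference (this is not the paper's Cox-process simulation), if the tests at k ranges were independent, the family-wise error rate would be 1 - (1 - alpha)^k; the observed rates of 0.22, 0.34, and 0.36 fall below these bounds because Ripley's K statistics at nearby ranges are positively correlated. A two-line check:

        # Independence bound on the family-wise error rate for k tests.
        alpha = 0.05
        for k in (1, 10, 50, 100):
            fwe_independent = 1.0 - (1.0 - alpha) ** k
            print(f"k = {k:3d} ranges -> independence FWE = "
                  f"{fwe_independent:.3f}")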

  1. Importance of regional variation in conservation planning: A rangewide example of the Greater Sage-Grouse

    USGS Publications Warehouse

    Doherty, Kevin E.; Evans, Jeffrey S.; Coates, Peter S.; Juliusson, Lara; Fedy, Bradley C.

    2016-01-01

    We developed rangewide population and habitat models for Greater Sage-Grouse (Centrocercus urophasianus) that account for regional variation in habitat selection and relative densities of birds for use in conservation planning and risk assessments. We developed a probabilistic model of occupied breeding habitat by statistically linking habitat characteristics within 4 miles of an occupied lek using a nonlinear machine learning technique (Random Forests). Habitat characteristics used were quantified in GIS and represent standard abiotic and biotic variables related to sage-grouse biology. Statistical model fit was high (mean correctly classified = 82.0%, range = 75.4–88.0%) as were cross-validation statistics (mean = 80.9%, range = 75.1–85.8%). We also developed a spatially explicit model to quantify the relative density of breeding birds across each Greater Sage-Grouse management zone. The models demonstrate distinct clustering of relative abundance of sage-grouse populations across all management zones. On average, approximately half of the breeding population is predicted to be within 10% of the occupied range. We also found that 80% of sage-grouse populations were contained in 25–34% of the occupied range within each management zone. Our rangewide population and habitat models account for regional variation in habitat selection and the relative densities of birds, and thus, they can serve as a consistent and common currency to assess how sage-grouse habitat and populations overlap with conservation actions or threats over the entire sage-grouse range. We also quantified differences in functional habitat responses and disturbance thresholds across the Western Association of Fish and Wildlife Agencies (WAFWA) management zones using statistical relationships identified during habitat modeling. Even for a species as specialized as Greater Sage-Grouse, our results show that ecological context matters in both the strength of habitat selection (i.e., functional response curves) and response to disturbance.
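
    A hedged sketch of the modeling pattern described (a Random Forests classifier of occupied versus available habitat built from GIS covariates, summarized by cross-validated percent correctly classified). The features, data, and decision rule below are synthetic placeholders, not the study's variables:

        # Random Forests habitat-classification pattern on synthetic data.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(8)
        n = 1000
        X = np.column_stack([
            rng.uniform(0, 100, n),    # % sagebrush cover (synthetic)
            rng.uniform(0, 30, n),     # % cropland (synthetic)
            rng.uniform(900, 2500, n), # elevation, m (synthetic)
        ])
        # Toy rule: occupancy more likely with more sagebrush, less cropland.
        y = (0.04 * X[:, 0] - 0.1 * X[:, 1] + rng.normal(0, 1, n)) > 1.0

        model = RandomForestClassifier(n_estimators=500, random_state=0)
        scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
        print(f"cross-validated accuracy: {scores.mean():.1%} "
              f"+/- {scores.std():.1%}")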

  2. Timber resource statistics for eastern Washington, 1995.

    Treesearch

    Neil McKay; Patricia M. Bassett; Colin D. MacLean

    1995-01-01

    This report summarizes a 1990-91 timber resource inventory of Washington east of the crest of the Cascade Range. The inventory was conducted on all private and public lands except National Forests. Timber resource statistics from National Forest inventories also are presented. Detailed tables provide estimates of forest area, timber volume, growth, mortality, and...

  3. The Robustness of the Studentized Range Statistic to Violations of the Normality and Homogeneity of Variance Assumptions.

    ERIC Educational Resources Information Center

    Ramseyer, Gary C.; Tcheng, Tse-Kia

    The present study was directed at determining the extent to which the Type I Error rate is affected by violations in the basic assumptions of the q statistic. Monte Carlo methods were employed, and a variety of departures from the assumptions were examined. (Author)
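
    A Monte Carlo check in the spirit of the study can be sketched as follows: estimate the Type I error rate of the studentized range statistic q when its assumptions hold (normality, equal variances), as the baseline against which violations would be compared. The group count, sample size, and replicate count are arbitrary choices here:

        # Type I error of the studentized range statistic under the null.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        k, n, reps, alpha = 4, 10, 20_000, 0.05
        df = k * (n - 1)
        q_crit = stats.studentized_range.ppf(1 - alpha, k, df)

        rejections = 0
        for _ in range(reps):
            groups = rng.normal(0.0, 1.0, size=(k, n))
            means = groups.mean(axis=1)
            s_pooled = np.sqrt(groups.var(axis=1, ddof=1).mean())
            q = (means.max() - means.min()) / (s_pooled / np.sqrt(n))
            rejections += q > q_crit

        print(f"estimated Type I error: {rejections / reps:.4f} "
              f"(nominal {alpha})")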

  4. Children in the States, 1999.

    ERIC Educational Resources Information Center

    Children's Defense Fund, Washington, DC.

    This data book provides statistics on a range of indicators that measure critical aspects of children's lives in each of the 50 states and the District of Columbia. Statistics are provided in the following categories: (1) population and family characteristics (including number of children under age 18 and age 5, percentage of population under age…

  5. Statistical properties of DNA sequences

    NASA Technical Reports Server (NTRS)

    Peng, C. K.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Mantegna, R. N.; Simons, M.; Stanley, H. E.

    1995-01-01

    We review evidence supporting the idea that the DNA sequence in genes containing non-coding regions is correlated, and that the correlation is remarkably long range--indeed, nucleotides thousands of base pairs distant are correlated. We do not find such a long-range correlation in the coding regions of the gene. We resolve the problem of the "non-stationarity" feature of the sequence of base pairs by applying a new algorithm called detrended fluctuation analysis (DFA). We address the claim of Voss that there is no difference in the statistical properties of coding and non-coding regions of DNA by systematically applying the DFA algorithm, as well as standard FFT analysis, to every DNA sequence (33301 coding and 29453 non-coding) in the entire GenBank database. Finally, we describe briefly some recent work showing that the non-coding sequences have certain statistical features in common with natural and artificial languages. Specifically, we adapt to DNA the Zipf approach to analyzing linguistic texts. These statistical properties of non-coding sequences support the possibility that non-coding regions of DNA may carry biological information.
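
    A minimal DFA sketch, assuming first-order (linear) detrending and a synthetic binary sequence standing in for a DNA walk; for uncorrelated data the scaling exponent should come out near 0.5, while the long-range correlations reported for non-coding DNA would give a larger value:

        # Detrended fluctuation analysis with linear detrending.
        import numpy as np

        def dfa(x, window_sizes):
            """Return the fluctuation F(n) for each window size n."""
            y = np.cumsum(x - np.mean(x))        # integrated profile
            fluctuations = []
            for n in window_sizes:
                n_windows = len(y) // n
                f2 = 0.0
                for i in range(n_windows):
                    seg = y[i * n:(i + 1) * n]
                    t = np.arange(n)
                    coef = np.polyfit(t, seg, 1) # local linear trend
                    f2 += np.mean((seg - np.polyval(coef, t)) ** 2)
                fluctuations.append(np.sqrt(f2 / n_windows))
            return np.array(fluctuations)

        rng = np.random.default_rng(3)
        seq = rng.integers(0, 2, 2 ** 14)        # uncorrelated stand-in
        ns = np.array([8, 16, 32, 64, 128, 256])
        F = dfa(seq, ns)
        # Exponent from the log-log slope: ~0.5 for uncorrelated data,
        # >0.5 for long-range correlated sequences.
        alpha = np.polyfit(np.log(ns), np.log(F), 1)[0]
        print(f"DFA exponent alpha = {alpha:.2f}")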

  6. Impact of statin adherence on cardiovascular disease and mortality outcomes: a systematic review

    PubMed Central

    De Vera, Mary A; Bhole, Vidula; Burns, Lindsay C; Lacaille, Diane

    2014-01-01

    Aims: While suboptimal adherence to statin medication has been quantified in real-world patient settings, a better understanding of its impact is needed, particularly with respect to distinct problems of medication taking. Our aim was to synthesize current evidence on the impacts of statin adherence, discontinuation and persistence on cardiovascular disease and mortality outcomes. Methods: We conducted a systematic review of peer-reviewed studies using a mapped search of Medline, Embase and International Pharmaceutical Abstracts databases. Observational studies that met the following criteria were included: defined patient population; statin adherence exposure; defined study outcome [i.e. cardiovascular disease (CVD), mortality]; and reporting of statin-specific results. Results: Overall, 28 studies were included, with 19 studies evaluating outcomes associated with statin adherence, six with statin discontinuation and three with statin persistence. Among adherence studies, the proportion of days covered was the most widely used measure, with the majority of studies reporting increased risk of CVD (statistically significant risk estimates ranging from 1.22 to 5.26) and mortality (statistically significant risk estimates ranging from 1.25 to 2.54) among non-adherent individuals. There was greater methodological variability in discontinuation and persistence studies. However, findings of increased CVD (statistically significant risk estimates ranging from 1.22 to 1.67) and mortality (statistically significant risk estimates ranging from 1.79 to 5.00) among nonpersistent individuals were also consistently reported. Conclusions: Observational studies consistently report an increased risk of adverse outcomes associated with poor statin adherence. These findings have important implications for patients and physicians and emphasize the importance of monitoring and encouraging adherence to statin therapy. PMID:25364801

  7. Methods for estimating the magnitude and frequency of peak streamflows at ungaged sites in and near the Oklahoma Panhandle

    USGS Publications Warehouse

    Smith, S. Jerrod; Lewis, Jason M.; Graves, Grant M.

    2015-09-28

    Generalized-least-squares multiple-linear regression analysis was used to formulate regression relations between peak-streamflow frequency statistics and basin characteristics. Contributing drainage area was the only basin characteristic determined to be statistically significant for all annual exceedance probabilities and was the only basin characteristic used in regional regression equations for estimating peak-streamflow frequency statistics on unregulated streams in and near the Oklahoma Panhandle. The regression model pseudo-coefficient of determination, converted to percent, for the Oklahoma Panhandle regional regression equations ranged from about 38 to 63 percent. The standard errors of prediction and the standard model errors for the Oklahoma Panhandle regional regression equations ranged from about 84 to 148 percent and from about 76 to 138 percent, respectively. These errors were comparable to those reported for regional peak-streamflow frequency regression equations for the High Plains areas of Texas and Colorado. The root mean square errors for the Oklahoma Panhandle regional regression equations (ranging from 3,170 to 92,000 cubic feet per second) were less than the root mean square errors for the Oklahoma statewide regression equations (ranging from 18,900 to 412,000 cubic feet per second); therefore, the Oklahoma Panhandle regional regression equations produce more accurate peak-streamflow statistic estimates for the irrigated period of record in the Oklahoma Panhandle than do the Oklahoma statewide regression equations. The regression equations developed in this report are applicable to streams that are not substantially affected by regulation, impoundment, or surface-water withdrawals. These regression equations are intended for use for stream sites with contributing drainage areas less than or equal to about 2,060 square miles, the maximum value for the independent variable used in the regression analysis.

  8. CROSS-DISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Statistical interior properties of globular proteins

    NASA Astrophysics Data System (ADS)

    Jiang, Zhou-Ting; Zhang, Lin-Xi; Sun, Ting-Ting; Wu, Tai-Quan

    2009-10-01

    The character of long-range contact formation deeply affects the three-dimensional structure of globular proteins. Because the 20 types of amino acids and the 4 categories of globular proteins differ in their ability to form long-range contacts, their statistical properties are thoroughly discussed in this paper. Two parameters, NC and ND, are defined to delimit the valid residues in detail. The relationship between hydrophobicity scales and the valid-residue percentage of each amino acid is given in the present work, and linear functions are shown in our statistical results. It is concluded that the hydrophobicity scale defined by chemical derivatives of the amino acids and the nonpolar phase of large unilamellar vesicle membranes is the most effective technique for characterizing the hydrophobic behavior of amino acid residues. Meanwhile, the residue percentage Pi and sequential residue length Li of a given protein i are calculated under different conditions. The statistical results show that the average values of Pi and Li for all-α proteins are the smallest among these 4 classes of globular proteins, indicating that all-α proteins are hardly capable of forming long-range contacts one by one along their linear amino acid sequences. All-β proteins have a higher tendency to construct long-range contacts along their primary sequences, related to the secondary configurations, i.e. the parallel and anti-parallel configurations of β sheets. This investigation of the interior properties of globular proteins connects the three-dimensional structure with the primary sequence data and secondary configurations, and helps us understand protein structure and the folding process.

  9. Statistical and Spatial Analysis of Bathymetric Data for the St. Clair River, 1971-2007

    USGS Publications Warehouse

    Bennion, David

    2009-01-01

    To address questions concerning ongoing geomorphic processes in the St. Clair River, selected bathymetric datasets spanning 36 years were analyzed. Comparisons of recent high-resolution datasets covering the upper river indicate a highly variable, active environment. Although statistical and spatial comparisons of the datasets show that some changes to the channel size and shape have taken place during the study period, uncertainty associated with various survey methods and interpolation processes limits the statistical certainty of the results. The methods used to spatially compare the datasets are sensitive to small variations in position and depth that are within the range of uncertainty associated with the datasets. Characteristics of the data, such as the density of measured points and the range of values surveyed, can also influence the results of spatial comparison. With due consideration of these limitations, apparently active and ongoing areas of elevation change in the river are mapped and discussed.

  10. Proceedings of the NASTRAN (Tradename) Users’ Colloquium (15th) Held in Kansas City, Missouri on 4-8 May 1987

    DTIC Science & Technology

    1987-08-01

    HVAC duct hanger system over an extensive frequency range. The finite element, component mode synthesis, and statistical energy analysis methods are...800-5,000 Hz) analysis was conducted with Statistical Energy Analysis (SEA) coupled with a closed-form harmonic beam analysis program. These...resonances may be obtained by using a finer frequency increment. Statistical Energy Analysis The basic assumption used in SEA analysis is that within each band

  11. Variations of attractors and wavelet spectra of the immunofluorescence distributions for women in the pregnant period

    NASA Astrophysics Data System (ADS)

    Galich, Nikolay E.

    2008-07-01

    This communication describes the treatment of immunology data. New nonlinear methods for the statistical analysis of the immunofluorescence of peripheral blood neutrophils have been developed. We used the respiratory burst reaction of DNA fluorescence in neutrophil cell nuclei due to oxidative activity. Histograms of photon count statistics for radiant neutrophil populations in flow cytometry experiments are considered. Distributions of fluorescence flash frequency as functions of fluorescence intensity are analyzed. Statistical peculiarities of the histogram sets for women in the pregnant period allow all histograms to be divided into three classes. The classification is based on three different types of smoothed, long-range-scale-averaged immunofluorescence distributions, their bifurcations, and their wavelet spectra. Heterogeneity peculiarities of the long-range-scale immunofluorescence distributions, together with peculiarities of the wavelet spectra, likewise divide the histograms into three groups. The first group belongs to healthy donors; the two others belong to donors with autoimmune and inflammatory diseases, some of which are not diagnosed by standard biochemical methods. Medical standards and statistical data from the immunofluorescence histograms for identifying health and illness are interconnected. Peculiarities of immunofluorescence for women in the pregnant period are classified. Health and illness criteria are connected with statistical features of the immunofluorescence histograms. Neutrophil population fluorescence provides a sensitive, clear indicator of health status.

  12. A hybrid model for predicting carbon monoxide from vehicular exhausts in urban environments

    NASA Astrophysics Data System (ADS)

    Gokhale, Sharad; Khare, Mukesh

    Several deterministic air quality models evaluate and predict frequently occurring pollutant concentrations well but are, in general, incapable of predicting the 'extreme' concentrations. In contrast, statistical distribution models overcome this limitation of the deterministic models and predict the 'extreme' concentrations. However, environmental damage is caused both by the extremes and by the sustained average concentration of pollutants. Hence, a model should predict not only the 'extreme' ranges but also the 'middle' ranges of pollutant concentrations, i.e. the entire range. Hybrid modelling is one technique that estimates/predicts the 'entire range' of the distribution of pollutant concentrations by combining deterministic models with suitable statistical distribution models (Jakeman et al., 1988). In the present paper, a hybrid model has been developed to predict the carbon monoxide (CO) concentration distributions at the Income Tax Office (ITO) traffic intersection in Delhi, where the meteorology is tropical and the traffic is heterogeneous in nature, consisting of light vehicles, heavy vehicles, three-wheelers (auto rickshaws), and two-wheelers (scooters, motorcycles, etc.). The model combines the general finite line source model (GFLSM) as its deterministic component and the log-logistic distribution (LLD) model as its statistical component. The hybrid (GFLSM-LLD) model is then applied at the ITO intersection. The results show that the hybrid model predictions match the observed CO concentration data within the 5-99 percentile range. The model is further validated at a different street location, the Sirifort roadway. The validation results show that the model predicts CO concentrations fairly well (d = 0.91) in the 10-95 percentile range. A regulatory compliance analysis is also developed to estimate the probability of hourly CO concentrations exceeding the National Ambient Air Quality Standards (NAAQS) of India.

  13. Calculation of recoil implantation profiles using known range statistics

    NASA Technical Reports Server (NTRS)

    Fung, C. D.; Avila, R. E.

    1985-01-01

    A method has been developed to calculate the depth distribution of recoil atoms that result from ion implantation onto a substrate covered with a thin surface layer. The calculation includes first order recoils considering projected range straggles, and lateral straggles of recoils but neglecting lateral straggles of projectiles. Projectile range distributions at intermediate energies in the surface layer are deduced from look-up tables of known range statistics. A great saving of computing time and human effort is thus attained in comparison with existing procedures. The method is used to calculate recoil profiles of oxygen from implantation of arsenic through SiO2 and of nitrogen from implantation of phosphorus through Si3N4 films on silicon. The calculated recoil profiles are in good agreement with results obtained by other investigators using the Boltzmann transport equation and they also compare very well with available experimental results in the literature. The deviation between calculated and experimental results is discussed in relation to lateral straggles. From this discussion, a range of surface layer thickness for which the method applies is recommended.

  14. A study of the effects of strong magnetic fields on the image resolution of PET scanners

    NASA Astrophysics Data System (ADS)

    Burdette, Don J.

    Very high resolution images can be achieved in small animal PET systems utilizing solid state silicon pad detectors. In such systems using detectors with sub-millimeter intrinsic resolutions, the range of the positron is the largest contribution to the image blur. The size of the positron range effect depends on the initial positron energy and hence the radioactive tracer used. For higher energy positron emitters, such as 68Ga and 94mTc, the variation of the annihilation point dominates the spatial resolution. In this study two techniques are investigated to improve the image resolution of PET scanners limited by the range of the positron. First, the positron range can be reduced by embedding the PET field of view in a strong magnetic field; we have developed a silicon pad detector based PET instrument that can operate in strong magnetic fields with an image resolution of 0.7 mm FWHM to study this effect. Second, iterative reconstruction methods can be used to statistically correct for the range of the positron. Both strong magnetic fields and iterative reconstruction algorithms that statistically account for the positron range distribution are investigated in this work.

  15. Radar prediction of absolute rain fade distributions for earth-satellite paths and general methods for extrapolation of fade statistics to other locations

    NASA Technical Reports Server (NTRS)

    Goldhirsh, J.

    1982-01-01

    The first absolute rain fade distribution method described establishes absolute fade statistics at a given site by means of a sampled radar data base. The second method extrapolates absolute fade statistics from one location to another, given simultaneously measured fade and rain rate statistics at the former. Both methods employ similar conditional fade statistic concepts and long term rain rate distributions. Probability deviations in the 2-19% range, with an 11% average, were obtained upon comparison of measured and predicted levels at given attenuations. The extrapolation of fade distributions to other locations at 28 GHz showed very good agreement with measured data at three sites located in the continental temperate region.

  16. Velocity distributions of granular gases with drag and with long-range interactions.

    PubMed

    Kohlstedt, K; Snezhko, A; Sapozhnikov, M V; Aranson, I S; Olafsen, J S; Ben-Naim, E

    2005-08-05

    We study velocity statistics of electrostatically driven granular gases. For two different experiments, (i) nonmagnetic particles in a viscous fluid and (ii) magnetic particles in air, the velocity distribution is non-Maxwellian, and its high-energy tail is exponential, P(v) ~ exp(-|v|). This behavior is consistent with the kinetic theory of driven dissipative particles. For particles immersed in a fluid, viscous damping is responsible for the exponential tail, while for magnetic particles, long-range interactions cause the exponential tail. We conclude that velocity statistics of dissipative gases are sensitive to the fluid environment and to the form of the particle interaction.

  17. Statistical physics in foreign exchange currency and stock markets

    NASA Astrophysics Data System (ADS)

    Ausloos, M.

    2000-09-01

    Problems in economics and finance have attracted the interest of statistical physicists all over the world. Fundamental problems pertain to the existence or not of long-, medium-, and/or short-range power-law correlations in various economic systems, to the presence of financial cycles, and to economic considerations, including economic policy. A method like detrended fluctuation analysis is recalled, emphasizing its value in sorting out correlation ranges and thereby leading to predictability at short horizon. The (m, k)-Zipf method is presented for sorting out short-range correlations in the sign and amplitude of the fluctuations. A well-known financial analysis technique, the so-called moving average, is shown to raise questions for physicists about fractional Brownian motion properties. Among spectacular results, the possibility of crash predictions has been demonstrated through the log-periodicity of financial index oscillations.

  18. Research study on neutral thermodynamic atmospheric model. [for space shuttle mission and abort trajectory

    NASA Technical Reports Server (NTRS)

    Hargraves, W. R.; Delulio, E. B.; Justus, C. G.

    1977-01-01

    The Global Reference Atmospheric Model is used along with the revised perturbation statistics to evaluate and computer-graph various atmospheric statistics along a space shuttle reference mission and abort trajectory. The trajectory plots are height vs. ground range, with height from ground level to 155 km and ground range along the reentry trajectory. Cross sectional plots, height vs. latitude or longitude, are also generated for 80 deg longitude, with heights from 30 km to 90 km and latitude from -90 deg to +90 deg, and for 45 deg latitude, with heights from 30 km to 90 km and longitudes from 180 deg E to 180 deg W. The variables plotted are monthly average pressure, density, temperature, wind components, and wind speed, together with the standard deviation and 99th inter-percentile range for each of these variables.

  19. Prospective, randomized trial comparing diathermy excision and diathermy coagulation for symptomatic, prolapsed hemorrhoids.

    PubMed

    Quah, H M; Seow-Choen, F

    2004-03-01

    This study was designed to compare diathermy excision and diathermy coagulation in the treatment of symptomatic prolapsed piles. Forty-five consecutive patients were randomly assigned to diathermy excision hemorrhoidectomy (Group A, n = 25) and diathermy coagulation (Group B, n = 20) under general anesthesia. The median duration of surgery was ten minutes for both groups. There was no statistical difference in the severity of postoperative pain at rest between the two groups, but Group A patients felt less pain during defecation on the third postoperative day (median, 5 (interquartile range, 3-7) vs. 8 (4-9); P = 0.04) and on the sixth postoperative day (median, 5 (interquartile range, 2-6) vs. 9 (5-10); P = 0.02). There was, however, no statistical difference in postoperative oral analgesic use and patients' satisfaction scores between the two groups. Complication rates were similar except that diathermy coagulation tended to leave some residual skin components of external hemorrhoid, especially in very large prolapsed piles. Group A patients resumed work earlier (mean, 12 (range, 4-20) vs. 17 (11-21) days); however, this was not statistically significant (P = 0.1). Diathermy coagulation of hemorrhoids is a simple technique and may be considered in suitable cases.

  20. Nonlinear histogram binning for quantitative analysis of lung tissue fibrosis in high-resolution CT data

    NASA Astrophysics Data System (ADS)

    Zavaletta, Vanessa A.; Bartholmai, Brian J.; Robb, Richard A.

    2007-03-01

    Diffuse lung diseases, such as idiopathic pulmonary fibrosis (IPF), can be characterized and quantified by analysis of volumetric high resolution CT scans of the lungs. These data sets typically have dimensions of 512 × 512 × 400. It is too subjective and labor intensive for a radiologist to analyze each slice and quantify regional abnormalities manually, so computer-aided techniques are necessary, particularly texture analysis techniques that classify various lung tissue types. Second and higher order statistics, which relate the spatial variation of the intensity values, are good discriminatory features for various textures. The intensity values in lung CT scans lie in the range [-1024, 1024]. Calculating second order statistics over this full range is too computationally intensive, so the data are typically binned into 16 or 32 gray levels. There are more effective ways of binning the gray-level range to improve classification. An optimal and very efficient way to nonlinearly bin the histogram is to use a dynamic programming algorithm. The objective of this paper is to show that nonlinear binning using dynamic programming is computationally efficient and improves the discriminatory power of the second and higher order statistics for more accurate quantification of diffuse lung disease.
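
    The following is a hedged sketch of dynamic-programming histogram binning, one way to realize the nonlinear binning described above: partition the sorted gray levels into K contiguous bins that minimize the total count-weighted within-bin squared error. The objective and the toy histogram are illustrative assumptions; the paper's exact criterion may differ:

        import numpy as np

        def optimal_bins(levels, counts, K):
            """DP: split sorted gray levels into K contiguous bins minimizing
            total count-weighted within-bin squared error."""
            m = len(levels)
            w   = np.concatenate(([0.0], np.cumsum(counts)))
            wx  = np.concatenate(([0.0], np.cumsum(counts * levels)))
            wx2 = np.concatenate(([0.0], np.cumsum(counts * levels ** 2)))

            def sse(a, b):                      # weighted SSE of levels[a:b]
                n = w[b] - w[a]
                if n <= 0.0:
                    return 0.0
                s = wx[b] - wx[a]
                return (wx2[b] - wx2[a]) - s * s / n

            D = np.full((K + 1, m + 1), np.inf)
            arg = np.zeros((K + 1, m + 1), dtype=int)
            D[0, 0] = 0.0
            for k in range(1, K + 1):
                for b in range(k, m + 1):
                    for a in range(k - 1, b):
                        c = D[k - 1, a] + sse(a, b)
                        if c < D[k, b]:
                            D[k, b], arg[k, b] = c, a
            cuts, b = [], m                     # backtrack bin boundaries
            for k in range(K, 0, -1):
                b = arg[k, b]
                cuts.append(b)
            return [levels[i] for i in sorted(cuts)[1:]]  # interior cuts

        # Toy bimodal histogram on a coarsened CT gray-level axis.
        levels = np.arange(-1024.0, 1025.0, 8.0)
        counts = (np.exp(-0.5 * ((levels + 800.0) / 120.0) ** 2)
                  + 0.3 * np.exp(-0.5 * ((levels - 40.0) / 60.0) ** 2))
        print(optimal_bins(levels, counts, K=16))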

  1. Sandpile-based model for capturing magnitude distributions and spatiotemporal clustering and separation in regional earthquakes

    NASA Astrophysics Data System (ADS)

    Batac, Rene C.; Paguirigan, Antonino A., Jr.; Tarun, Anjali B.; Longjas, Anthony G.

    2017-04-01

    We propose a cellular automata model for earthquake occurrences patterned after the sandpile model of self-organized criticality (SOC). By incorporating a single parameter describing the probability to target the most susceptible site, the model successfully reproduces the statistical signatures of seismicity. The energy distributions closely follow power-law probability density functions (PDFs) with a scaling exponent of around -1.6, consistent with the expectations of the Gutenberg-Richter (GR) law, for a wide range of the targeted triggering probability values. Additionally, for targeted triggering probabilities within the range 0.004-0.007, we observe spatiotemporal distributions that show bimodal behavior, which is not observed previously for the original sandpile. For this critical range of values for the probability, model statistics show remarkable comparison with long-period empirical data from earthquakes from different seismogenic regions. The proposed model has key advantages, the foremost of which is the fact that it simultaneously captures the energy, space, and time statistics of earthquakes by just introducing a single parameter, while introducing minimal parameters in the simple rules of the sandpile. We believe that the critical targeting probability parameterizes the memory that is inherently present in earthquake-generating regions.

  2. Computer program documentation for the pasture/range condition assessment processor

    NASA Technical Reports Server (NTRS)

    Mcintyre, K. S.; Miller, T. G. (Principal Investigator)

    1982-01-01

    This processor, which drives the RANGE software, allows the user to analyze LANDSAT data containing pasture and rangeland. Analysis includes mapping, generating statistics, calculating vegetative indexes, and plotting vegetative indexes. Routines for using the processor are given. A flow diagram is included.

  3. Feasibility study of using statistical process control to customized quality assurance in proton therapy.

    PubMed

    Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-09-01

    To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology, the authors investigated the consistency checks of dose per monitor unit (D/MU) and range in proton beams to see whether they were within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
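
    A minimal sketch of the process-capability arithmetic behind such customized tolerance levels, using synthetic daily range errors and the ±0.5 mm specification quoted above (the data and the Cp/Cpk summary are assumptions, not the authors' exact indices):

        # Process capability indices for a +/-0.5 mm range tolerance.
        import numpy as np

        rng = np.random.default_rng(4)
        range_error_mm = rng.normal(0.05, 0.12, 60)  # measured - planned

        LSL, USL = -0.5, 0.5                  # daily-QA tolerance limits
        mu = range_error_mm.mean()
        sigma = range_error_mm.std(ddof=1)

        Cp = (USL - LSL) / (6 * sigma)        # potential (centered) capability
        Cpk = min(USL - mu, mu - LSL) / (3 * sigma)  # allows for offset
        print(f"Cp = {Cp:.2f}, Cpk = {Cpk:.2f}")     # > ~1.33 often "capable"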

  4. On Statistical Approaches for Demonstrating Analytical Similarity in the Presence of Correlation.

    PubMed

    Yang, Harry; Novick, Steven; Burdick, Richard K

    Analytical similarity is the foundation for demonstration of biosimilarity between a proposed product and a reference product. For this assessment, the U.S. Food and Drug Administration (FDA) currently recommends a tiered system in which quality attributes are categorized into three tiers commensurate with their risk, and approaches of varying statistical rigor are subsequently used for the three tiers of quality attributes. Key to the analyses of Tier 1 and Tier 2 quality attributes is the establishment of an equivalence acceptance criterion and a quality range. For particular licensure applications, the FDA has provided advice on statistical methods for demonstration of analytical similarity. For example, for Tier 1 assessment, an equivalence test can be used based on an equivalence margin of 1.5σ_R, where σ_R is the reference product variability estimated by the sample standard deviation S_R from a sample of reference lots. The quality range for demonstrating Tier 2 analytical similarity is of the form X̄_R ± K × σ_R, where the constant K is appropriately justified. To demonstrate Tier 2 analytical similarity, a large percentage (e.g., 90%) of test product lots must fall in the quality range. In this paper, through both theoretical derivations and simulations, we show that when the reference drug product lots are correlated, the sample standard deviation S_R underestimates the true reference product variability σ_R. As a result, substituting S_R for σ_R in the Tier 1 equivalence acceptance criterion and the Tier 2 quality range inappropriately reduces the statistical power and the ability to declare analytical similarity. Also explored is the impact of correlation among drug product lots on Type I error rate and power. Three methods based on generalized pivotal quantities are introduced, and their performance is compared against a two one-sided tests (TOST) approach. Finally, strategies to mitigate the risk of correlation among the reference product lots are discussed. A biosimilar is a generic version of an original biological drug product. A key component of biosimilar development is the demonstration of analytical similarity between the biosimilar and the reference product. Such demonstration relies on the application of statistical methods to establish a similarity margin and an appropriate test for equivalence between the two products. This paper discusses statistical issues with demonstration of analytical similarity and provides alternate approaches to potentially mitigate these problems. © PDA, Inc. 2016.
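
    A hedged sketch of the two tiers described above, with synthetic lot data: Tier 1 as a two one-sided tests (TOST) procedure with margin 1.5 S_R, and Tier 2 as a quality range X̄_R ± K S_R. The value K = 3, the sample sizes, and the simple pooled degrees of freedom are illustrative assumptions:

        # Tier 1 (TOST equivalence) and Tier 2 (quality range) on toy data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        ref = rng.normal(100.0, 2.0, size=10)    # reference lots
        test = rng.normal(100.5, 2.0, size=8)    # proposed-product lots

        S_R = ref.std(ddof=1)
        margin = 1.5 * S_R                       # Tier 1 equivalence margin

        # Tier 1: two one-sided t-tests of H0: |mu_T - mu_R| >= margin.
        diff = test.mean() - ref.mean()
        se = np.sqrt(test.var(ddof=1) / len(test) + ref.var(ddof=1) / len(ref))
        df = len(test) + len(ref) - 2            # simple df choice for sketch
        t_lower = (diff + margin) / se           # against diff <= -margin
        t_upper = (diff - margin) / se           # against diff >= +margin
        p_tost = max(1 - stats.t.cdf(t_lower, df), stats.t.cdf(t_upper, df))
        print(f"TOST p-value = {p_tost:.3f}  (equivalent if < 0.05)")

        # Tier 2: fraction of test lots inside the quality range.
        K = 3.0
        lo, hi = ref.mean() - K * S_R, ref.mean() + K * S_R
        inside = np.mean((test >= lo) & (test <= hi))
        print(f"{inside:.0%} of test lots inside [{lo:.1f}, {hi:.1f}]")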

  5. Use of the Global Test Statistic as a Performance Measurement in a Reanalysis of Environmental Health Data

    PubMed Central

    Dymova, Natalya; Hanumara, R. Choudary; Gagnon, Ronald N.

    2009-01-01

    Performance measurement is increasingly viewed as an essential component of environmental and public health protection programs. In characterizing program performance over time, investigators often observe multiple changes resulting from a single intervention across a range of categories. Although a variety of statistical tools allow evaluation of data one variable at a time, the global test statistic is uniquely suited for analyses of categories or groups of interrelated variables. Here we demonstrate how the global test statistic can be applied to environmental and occupational health data for the purpose of making overall statements on the success of targeted intervention strategies. PMID:19696393

  6. Use of the global test statistic as a performance measurement in a reanalysis of environmental health data.

    PubMed

    Dymova, Natalya; Hanumara, R Choudary; Enander, Richard T; Gagnon, Ronald N

    2009-10-01

    Performance measurement is increasingly viewed as an essential component of environmental and public health protection programs. In characterizing program performance over time, investigators often observe multiple changes resulting from a single intervention across a range of categories. Although a variety of statistical tools allow evaluation of data one variable at a time, the global test statistic is uniquely suited for analyses of categories or groups of interrelated variables. Here we demonstrate how the global test statistic can be applied to environmental and occupational health data for the purpose of making overall statements on the success of targeted intervention strategies.

  7. Tables of square-law signal detection statistics for Hann spectra with 50 percent overlap

    NASA Technical Reports Server (NTRS)

    Deans, Stanley R.; Cullers, D. Kent

    1991-01-01

    The Search for Extraterrestrial Intelligence, currently being planned by NASA, will require that an enormous amount of data be analyzed in real time by special purpose hardware. It is expected that overlapped Hann data windows will play an important role in this analysis. In order to understand the statistical implication of this approach, it has been necessary to compute detection statistics for overlapped Hann spectra. Tables of signal detection statistics are given for false alarm rates from 10^-14 to 10^-1 and signal detection probabilities from 0.50 to 0.99; the number of computed spectra ranges from 4 to 2000.
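
    The published tables model Hann windowing and 50 percent overlap, which this sketch does not attempt; it only illustrates the idealized square-law arithmetic, in units where the summed noise power of N averaged spectra is chi-squared with 2N degrees of freedom and a steady signal adds a noncentrality term:

        # Idealized square-law detection thresholds (not the published values).
        from scipy import stats
        from scipy.optimize import brentq

        for N in (4, 100, 2000):                 # number of averaged spectra
            for pfa in (1e-1, 1e-7, 1e-14):      # per-bin false alarm rate
                thresh = stats.chi2.isf(pfa, df=2 * N)
                # Noncentrality lam giving a 50% detection probability.
                lam = brentq(lambda lam:
                             stats.ncx2.sf(thresh, 2 * N, lam) - 0.5,
                             1e-6, 1e6)
                print(f"N={N:5d}  Pfa={pfa:.0e}  threshold={thresh:9.1f}  "
                      f"signal lam={lam:9.1f}")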

  8. Detector noise statistics in the non-linear regime

    NASA Technical Reports Server (NTRS)

    Shopbell, P. L.; Bland-Hawthorn, J.

    1992-01-01

    The statistical behavior of an idealized linear detector in the presence of threshold and saturation levels is examined. It is assumed that the noise is governed by the statistical fluctuations in the number of photons emitted by the source during an exposure. Since physical detectors cannot have infinite dynamic range, our model illustrates that all devices have non-linear regimes, particularly at high count rates. The primary effect is a decrease in the statistical variance about the mean signal due to a portion of the expected noise distribution being removed via clipping. Higher order statistical moments are also examined, in particular, skewness and kurtosis. In principle, the expected distortion in the detector noise characteristics can be calibrated using flatfield observations with count rates matched to the observations. For this purpose, some basic statistical methods that utilize Fourier analysis techniques are described.
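
    A small Monte Carlo sketch of the clipping effect described above, assuming Poisson photon counts and a hypothetical saturation level: clipping removes part of the upper tail, shrinking the variance and skewing the distribution:

        # Saturation clipping of Poisson counts: variance and moment changes.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        mean_counts = 1000.0
        saturation = 1030.0                       # hypothetical full-well limit

        raw = rng.poisson(mean_counts, size=200_000).astype(float)
        clipped = np.minimum(raw, saturation)     # remove the upper tail

        for name, x in (("ideal", raw), ("clipped", clipped)):
            print(f"{name:8s} mean={x.mean():8.2f} var={x.var():8.2f} "
                  f"skew={stats.skew(x):+.3f} kurtosis={stats.kurtosis(x):+.3f}")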

  9. Statistics on Blindness in the Model Reporting Area 1969-1970.

    ERIC Educational Resources Information Center

    Kahn, Harold A.; Moorhead, Helen B.

    Presented in the form of 30 tables are statistics on blindness in 16 states which have agreed to uniform definitions and procedures to improve reliability of data regarding blind persons. The data indicates that rates of blindness were generally higher for nonwhites than for whites with the ratio ranging from almost 10 for glaucoma to minimal for…

  10. Statistical Misconceptions and Rushton's Writings on Race.

    ERIC Educational Resources Information Center

    Cernovsky, Zack Z.

    The term "statistical significance" is often misunderstood or abused to imply a large effect size. A recent example is in the work of J. P. Rushton (1988, 1990) on differences between Negroids and Caucasoids. Rushton used brain size and cranial size as indicators of intelligence, using Pearson "r"s ranging from 0.03 to 0.35.…

  11. Children in the States Data Book, 1998.

    ERIC Educational Resources Information Center

    Children's Defense Fund, Washington, DC.

    This data book from the Children's Defense Fund includes statistics on a range of indicators that measure critical aspects of children's lives in each of the states and the United States as a whole. Statistics are provided in the following categories: (1) population and family characteristics (number of children under age 18 and age 6, number of…

  12. GeoGebra for Mathematical Statistics

    ERIC Educational Resources Information Center

    Hewson, Paul

    2009-01-01

    The GeoGebra software is attracting a lot of interest in the mathematical community; consequently, there is a wide range of experience and resources to help use this application. This article briefly outlines how GeoGebra will be of great value in statistical education. The release of GeoGebra is an excellent example of the power of free software…

  13. Australian Vocational Education and Training Statistics: Young People in Education & Training 2013

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2014

    2014-01-01

    The Australian education and training system offers a range of options for young people. This publication provides a summary of the statistics relating to young people aged 15 to 19 years who participated in an education and training activity during 2013. Information on participation is presented for VET in Schools students, higher education…

  14. Australian Vocational Education and Training Statistics: Young People in Education and Training, 2011

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2012

    2012-01-01

    The Australian education and training system offers a range of options for young people. This publication provides a summary of the statistics relating to young people aged 15 to 19 years who participated in an education and training activity during 2011. Information on participation is presented for VET in Schools students, school students,…

  15. DNA viewed as an out-of-equilibrium structure

    NASA Astrophysics Data System (ADS)

    Provata, A.; Nicolis, C.; Nicolis, G.

    2014-05-01

    The complexity of the primary structure of human DNA is explored using methods from nonequilibrium statistical mechanics, dynamical systems theory, and information theory. A collection of statistical analyses is performed on the DNA data and the results are compared with sequences derived from different stochastic processes. The use of χ2 tests shows that DNA cannot be described as a low-order Markov chain of order up to r = 6. Although detailed balance seems to hold at the level of a binary alphabet, it fails when all four base pairs are considered, suggesting spatial asymmetry and irreversibility. Furthermore, the block entropy does not increase linearly with the block size, reflecting the long-range nature of the correlations in the human genomic sequences. To probe locally the spatial structure of the chain, we study the exit distances from a specific symbol, the distribution of recurrence distances, and the Hurst exponent, all of which show power-law tails and long-range characteristics. These results suggest that human DNA can be viewed as a nonequilibrium structure maintained in its state through interactions with a constantly changing environment. Based solely on the exit distance distribution accounting for the nonequilibrium statistics and using the Monte Carlo rejection sampling method, we construct a model DNA sequence. This method allows us to keep both long- and short-range statistical characteristics of the native DNA data. The model sequence presents the same characteristic exponents as the natural DNA but fails to capture spatial correlations and point-to-point details.
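
    As a concrete illustration of one statistic used above, the sketch below computes the block entropy H(n) of a symbolic sequence. A memoryless four-letter sequence (used here as a stand-in for real genomic data) gives H(n) growing close to linearly at 2 bits per symbol; the sublinear growth reported for human DNA is the signature of long-range correlations.

    ```python
    # Block entropy of a symbol sequence: H(n) = -sum p(block) log2 p(block)
    # over all observed blocks of length n.
    import numpy as np
    from collections import Counter

    def block_entropy(seq, n):
        blocks = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
        total = sum(blocks.values())
        p = np.array([c / total for c in blocks.values()])
        return -np.sum(p * np.log2(p))

    rng = np.random.default_rng(1)
    seq = "".join(rng.choice(list("ACGT"), size=20_000))
    for n in range(1, 7):
        print(n, block_entropy(seq, n))   # ~2n bits for this random sequence
    ```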

  16. Predicting the potential distribution of invasive exotic species using GIS and information-theoretic approaches: A case of ragweed (Ambrosia artemisiifolia L.) distribution in China

    USGS Publications Warehouse

    Hao, Chen; LiJun, Chen; Albright, Thomas P.

    2007-01-01

    Invasive exotic species pose a growing threat to the economy, public health, and ecological integrity of nations worldwide. Explaining and predicting the spatial distribution of invasive exotic species is of great importance to prevention and early warning efforts. We investigated the potential distribution of invasive exotic species, the environmental factors that influence these distributions, and the ability to predict them using statistical and information-theoretic approaches. For some species, detailed presence/absence occurrence data are available, allowing the use of a variety of standard statistical techniques. However, for most species, absence data are not available. Presented with the challenge of developing a model based on presence-only information, we developed an improved logistic regression approach using Information Theory and Frequency Statistics to produce a relative suitability map. This paper generated a variety of distributions of ragweed (Ambrosia artemisiifolia L.) from logistic regression models applied to herbarium specimen location data and a suite of GIS layers including climatic, topographic, and land cover information. Our logistic regression model was based on Akaike's Information Criterion (AIC) from a suite of ecologically reasonable predictor variables. Based on the results, we provided a new Frequency Statistical method to compartmentalize habitat suitability in the native range. Finally, we used the model and the compartmentalized criterion developed in native ranges to "project" a potential distribution onto the exotic ranges to build habitat-suitability maps. © Science in China Press 2007.
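
    A hedged sketch of the AIC-based model selection step described above, on synthetic data: the predictor names (temp, precip, elev) and coefficients are invented placeholders rather than the study's variables, and statsmodels stands in for whatever software the authors used.

    ```python
    # Fit candidate logistic regression models and rank them by AIC.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 500
    X = np.column_stack([rng.normal(size=n) for _ in range(3)])  # temp, precip, elev
    logit_p = 0.8 * X[:, 0] - 0.5 * X[:, 1]        # elev is irrelevant by design
    y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(float)

    candidates = {"temp": [0], "temp+precip": [0, 1], "temp+precip+elev": [0, 1, 2]}
    for name, cols in candidates.items():
        model = sm.Logit(y, sm.add_constant(X[:, cols])).fit(disp=0)
        print(f"{name:>18}: AIC = {model.aic:.1f}")  # choose the smallest AIC
    ```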

  17. DNA viewed as an out-of-equilibrium structure.

    PubMed

    Provata, A; Nicolis, C; Nicolis, G

    2014-05-01

    The complexity of the primary structure of human DNA is explored using methods from nonequilibrium statistical mechanics, dynamical systems theory, and information theory. A collection of statistical analyses is performed on the DNA data and the results are compared with sequences derived from different stochastic processes. The use of χ2 tests shows that DNA cannot be described as a low-order Markov chain of order up to r = 6. Although detailed balance seems to hold at the level of a binary alphabet, it fails when all four base pairs are considered, suggesting spatial asymmetry and irreversibility. Furthermore, the block entropy does not increase linearly with the block size, reflecting the long-range nature of the correlations in the human genomic sequences. To probe locally the spatial structure of the chain, we study the exit distances from a specific symbol, the distribution of recurrence distances, and the Hurst exponent, all of which show power-law tails and long-range characteristics. These results suggest that human DNA can be viewed as a nonequilibrium structure maintained in its state through interactions with a constantly changing environment. Based solely on the exit distance distribution accounting for the nonequilibrium statistics and using the Monte Carlo rejection sampling method, we construct a model DNA sequence. This method allows us to keep both long- and short-range statistical characteristics of the native DNA data. The model sequence presents the same characteristic exponents as the natural DNA but fails to capture spatial correlations and point-to-point details.

  18. Lagrangian statistics in weakly forced two-dimensional turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivera, Michael K.; Ecke, Robert E.

    Measurements of Lagrangian single-point and multiple-point statistics in a quasi-two-dimensional stratified layer system are reported. The system consists of a layer of salt water over an immiscible layer of Fluorinert and is forced electromagnetically so that mean-squared vorticity is injected at a well-defined spatial scale r_i. Simultaneous cascades develop in which enstrophy flows predominately to small scales whereas energy cascades, on average, to larger scales. Lagrangian correlations and one- and two-point displacements are measured for random initial conditions and for initial positions within topological centers and saddles. Some of the behavior of these quantities can be understood in terms of the trapping characteristics of long-lived centers, the slow motion near strong saddles, and the rapid fluctuations outside of either centers or saddles. We also present statistics of Lagrangian velocity fluctuations using energy spectra in frequency space and structure functions in real space. We compare with complementary Eulerian velocity statistics. We find that simultaneous inverse energy and enstrophy ranges present in spectra are not directly echoed in real-space moments of velocity difference. Nevertheless, the spectral ranges line up well with features of moment ratios, indicating that although the moments are not exhibiting unambiguous scaling, the behavior of the probability distribution functions is changing over short ranges of length scales. Furthermore, implications for understanding weakly forced 2D turbulence with simultaneous inverse and direct cascades are discussed.

  19. Lagrangian statistics in weakly forced two-dimensional turbulence

    DOE PAGES

    Rivera, Michael K.; Ecke, Robert E.

    2016-01-14

    Measurements of Lagrangian single-point and multiple-point statistics in a quasi-two-dimensional stratified layer system are reported. The system consists of a layer of salt water over an immiscible layer of Fluorinert and is forced electromagnetically so that mean-squared vorticity is injected at a well-defined spatial scale r_i. Simultaneous cascades develop in which enstrophy flows predominately to small scales whereas energy cascades, on average, to larger scales. Lagrangian correlations and one- and two-point displacements are measured for random initial conditions and for initial positions within topological centers and saddles. Some of the behavior of these quantities can be understood in terms of the trapping characteristics of long-lived centers, the slow motion near strong saddles, and the rapid fluctuations outside of either centers or saddles. We also present statistics of Lagrangian velocity fluctuations using energy spectra in frequency space and structure functions in real space. We compare with complementary Eulerian velocity statistics. We find that simultaneous inverse energy and enstrophy ranges present in spectra are not directly echoed in real-space moments of velocity difference. Nevertheless, the spectral ranges line up well with features of moment ratios, indicating that although the moments are not exhibiting unambiguous scaling, the behavior of the probability distribution functions is changing over short ranges of length scales. Furthermore, implications for understanding weakly forced 2D turbulence with simultaneous inverse and direct cascades are discussed.

  20. Computed statistics at streamgages, and methods for estimating low-flow frequency statistics and development of regional regression equations for estimating low-flow frequency statistics at ungaged locations in Missouri

    USGS Publications Warehouse

    Southard, Rodney E.

    2013-01-01

    The weather and precipitation patterns in Missouri vary considerably from year to year. In 2008, the statewide average rainfall was 57.34 inches and in 2012, the statewide average rainfall was 30.64 inches. This variability in precipitation and resulting streamflow in Missouri underlies the necessity for water managers and users to have reliable streamflow statistics and a means to compute select statistics at ungaged locations for a better understanding of water availability. Knowledge of surface-water availability is dependent on the streamflow data that have been collected and analyzed by the U.S. Geological Survey for more than 100 years at approximately 350 streamgages throughout Missouri. The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, computed streamflow statistics at streamgages through the 2010 water year, defined periods of drought, defined methods to estimate streamflow statistics at ungaged locations, and developed regional regression equations to compute selected streamflow statistics at ungaged locations. Streamflow statistics and flow durations were computed for 532 streamgages in Missouri and in neighboring States. For streamgages with more than 10 years of record, Kendall's tau was computed to evaluate trends in streamflow data. If trends were detected, the variable length method was used to define the period of no trend. Water years were removed from the dataset from the beginning of the record for a streamgage until no trend was detected. Low-flow frequency statistics were then computed for the entire period of record and for the period of no trend if 10 or more years of record were available for each analysis. Three methods are presented for computing selected streamflow statistics at ungaged locations. The first method uses power curve equations developed for 28 selected streams in Missouri and neighboring States that have multiple streamgages on the same streams. Statistics for one of these streams can be estimated at an ungaged location whose drainage area is between 40 percent of the drainage area of the farthest upstream streamgage and 150 percent of the drainage area of the farthest downstream streamgage along the stream of interest. The second method may be used on any stream with a streamgage that has operated for 10 years or longer and for which anthropogenic effects have not changed the low-flow characteristics at the ungaged location since collection of the streamflow data. A ratio of the drainage area of the stream at the ungaged location to the drainage area of the stream at the streamgage was computed to estimate the statistic at the ungaged location. The range of applicability is between 40 and 150 percent of the drainage area of the streamgage, and the ungaged location must be located on the same stream as the streamgage. The third method uses regional regression equations to estimate selected low-flow frequency statistics for unregulated streams in Missouri. This report presents regression equations to estimate frequency statistics for the 10-year recurrence interval and for the N-day durations of 1, 2, 3, 7, 10, 30, and 60 days. Basin and climatic characteristics were computed using geographic information system software and digital geospatial data. A total of 35 characteristics were computed for use in preliminary statewide and regional regression analyses based on existing digital geospatial data and previous studies.
Spatial analyses for geographical bias in the predictive accuracy of the regional regression equations defined three low-flow regions within the State, representing the three major physiographic provinces in Missouri. Region 1 includes the Central Lowlands, Region 2 includes the Ozark Plateaus, and Region 3 includes the Mississippi Alluvial Plain. A total of 207 streamgages were used in the regression analyses for the regional equations. Of the 207 U.S. Geological Survey streamgages, 77 were located in Region 1, 120 were located in Region 2, and 10 were located in Region 3. Streamgages located outside of Missouri were selected to extend the range of data used for the independent variables in the regression analyses. Streamgages included in the regression analyses had 10 or more years of record and were considered to be affected minimally by anthropogenic activities or trends. Regional regression analyses identified three characteristics as statistically significant for the development of regional equations. For Region 1, drainage area, longest flow path, and streamflow-variability index were statistically significant. The range in the standard error of estimate for Region 1 is 79.6 to 94.2 percent. For Region 2, drainage area and streamflow-variability index were statistically significant, and the range in the standard error of estimate is 48.2 to 72.1 percent. For Region 3, drainage area and streamflow-variability index also were statistically significant, with a range in the standard error of estimate of 48.1 to 96.2 percent. Limitations on estimating low-flow frequency statistics at ungaged locations depend on the method used. The first method, power curve equations, was developed to estimate the selected statistics for ungaged locations on 28 selected streams with multiple streamgages located on the same stream. The second method uses a drainage-area ratio to compute statistics at an ungaged location using data from a single streamgage on the same stream with 10 or more years of record. Ungaged locations on these streams may use the ratio of the drainage area at the ungaged location to the drainage area at the streamgage location to scale the selected statistic value from the streamgage location to the ungaged location. This method can be used if the drainage area of the ungaged location is within 40 to 150 percent of the streamgage drainage area. The third method is the use of the regional regression equations. The limits for the use of these equations are based on the ranges of the characteristics used as independent variables and on the requirement that streams be affected minimally by anthropogenic activities.
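
    The drainage-area-ratio method (the report's second method) reduces to one line of arithmetic plus the 40-150 percent applicability check; the sketch below uses invented numbers, not values from the report.

    ```python
    # Transfer a low-flow statistic from a streamgage to an ungaged site on
    # the same stream by scaling with the drainage-area ratio.
    def drainage_area_ratio_estimate(stat_gaged, area_gaged, area_ungaged):
        ratio = area_ungaged / area_gaged
        if not 0.40 <= ratio <= 1.50:
            raise ValueError("ungaged area outside 40-150% of gaged area")
        return stat_gaged * ratio

    # Hypothetical example: a 7-day, 10-year low flow of 12 ft3/s at a gage
    # draining 250 mi2, transferred to an ungaged site draining 180 mi2.
    print(drainage_area_ratio_estimate(12.0, 250.0, 180.0))   # 8.64 ft3/s
    ```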

  1. Accuracy evaluation of contour next compared with five blood glucose monitoring systems across a wide range of blood glucose concentrations occurring in a clinical research setting.

    PubMed

    Klaff, Leslie J; Brazg, Ronald; Hughes, Kristen; Tideman, Ann M; Schachner, Holly C; Stenger, Patricia; Pardo, Scott; Dunne, Nancy; Parkes, Joan Lee

    2015-01-01

    This study evaluated the accuracy of Contour(®) Next (CN; Bayer HealthCare LLC, Diabetes Care, Whippany, NJ) compared with five blood glucose monitoring systems (BGMSs) across a wide range of clinically occurring blood glucose levels. Subjects (n=146) were ≥ 18 years and had type 1 or type 2 diabetes. Subjects' glucose levels were safely lowered or raised to provide a wide range of glucose values. Capillary blood samples were tested on six BGMSs and a YSI glucose analyzer (YSI Life Sciences, Inc., Yellow Springs, OH) as the reference. Extreme glucose values were achieved by glucose modification of the blood sample. System accuracy was assessed by mean absolute difference (MAD) and mean absolute relative difference (MARD) across several glucose ranges, with <70 mg/dL evaluated by MAD as the primary end point. In the low glucose range (<70 mg/dL), MAD values were as follows: Accu-Chek(®) Aviva Nano (Roche Diagnostics, Indianapolis, IN), 3.34 mg/dL; CN, 2.03 mg/dL; FreeStyle Lite(®) (FSL; Abbott Diabetes Care, Inc., Alameda, CA), 2.77 mg/dL; OneTouch(®) Ultra(®) 2 (LifeScan, Inc., Milpitas, CA), 10.20 mg/dL; OneTouch(®) Verio(®) Pro (LifeScan, Inc.), 4.53 mg/dL; and Truetrack(®) (Nipro Diagnostics, Inc., Fort Lauderdale, FL), 11.08 mg/dL. The lowest MAD in the low glucose range, from CN, was statistically significantly lower than those of the other BGMSs with the exception of the FSL. CN also had a statistically significantly lower MARD than all other BGMSs in the low glucose range. In the overall glucose range (21-496 mg/dL), CN yielded the lowest MAD and MARD values, which were statistically significantly lower in comparison with the other BGMSs. When compared with other BGMSs, CN demonstrated the lowest mean deviation from the reference value (by MAD and MARD) across multiple glucose ranges.
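
    The two accuracy metrics used in the study are straightforward to compute from paired meter/reference readings; the sketch below uses invented values, not study data.

    ```python
    # MAD (mean absolute difference, mg/dL) and MARD (mean absolute relative
    # difference, percent) of meter readings against a reference analyzer.
    import numpy as np

    def mad(meter, reference):
        return np.mean(np.abs(meter - reference))

    def mard(meter, reference):
        return 100 * np.mean(np.abs(meter - reference) / reference)

    meter = np.array([62.0, 55.0, 68.0, 49.0])       # hypothetical readings
    reference = np.array([60.0, 58.0, 65.0, 52.0])   # YSI-style reference
    print(mad(meter, reference), mard(meter, reference))
    ```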

  2. Evidence-based orthodontics. Current statistical trends in published articles in one journal.

    PubMed

    Law, Scott V; Chudasama, Dipak N; Rinchuse, Donald J

    2010-09-01

    To ascertain the number, type, and overall usage of statistics in American Journal of Orthodontics and Dentofacial Orthopedics (AJODO) articles for 2008. These data were then compared to data from three previous years: 1975, 1985, and 2003. Original AJODO articles for 2008 were dichotomized into those using statistics and those not using statistics. Statistical procedures were then broadly divided into descriptive statistics (mean, standard deviation, range, percentage) and inferential statistics (t-test, analysis of variance). Descriptive statistics were used to make comparisons. In 1975, 1985, 2003, and 2008, AJODO published 72, 87, 134, and 141 original articles, respectively. The percentage of original articles using statistics was 43.1% in 1975, 75.9% in 1985, 94.0% in 2003, and 92.9% in 2008; the proportion of original articles using statistics stayed relatively the same from 2003 to 2008, with only a small 1.1% decrease. The percentage of articles using inferential statistical analyses was 23.7% in 1975, 74.2% in 1985, 92.9% in 2003, and 84.4% in 2008. Comparing AJODO publications in 2003 and 2008, there was an 8.5% increase in articles using only descriptive statistics (from 7.1% to 15.6%), and there was an 8.5% decrease in articles using inferential statistics (from 92.9% to 84.4%).

  3. Zonation in the deep benthic megafauna : Application of a general test.

    PubMed

    Gardiner, Frederick P; Haedrich, Richard L

    1978-01-01

    A test based on Maxwell-Boltzmann statistics, instead of the formerly suggested but inappropriate Bose-Einstein statistics (Pielou and Routledge, 1976), examines the distribution of the boundaries of species' ranges along a gradient and indicates whether they are random or clustered (zoned). The test is most useful as a preliminary to the application of more instructive but less statistically rigorous methods such as cluster analysis. The test indicates that zonation is marked in the deep benthic megafauna living between 200 and 3000 m, but below 3000 m little zonation may be found.

  4. Record statistics of financial time series and geometric random walks

    NASA Astrophysics Data System (ADS)

    Sabir, Behlool; Santhanam, M. S.

    2014-09-01

    The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of select stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of the geometric random walk series are in good agreement with those obtained from empirical stock data.
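
    A minimal simulation in the spirit of the study: generate a geometric random walk, locate its upper records, and collect the record ages (waiting times between successive record highs). The drift and volatility values are arbitrary.

    ```python
    # Record ages of a geometric random walk.
    import numpy as np

    rng = np.random.default_rng(3)
    steps = rng.normal(loc=0.0005, scale=0.01, size=100_000)
    prices = 100 * np.exp(np.cumsum(steps))       # geometric random walk

    record_times = [0]
    for t in range(1, len(prices)):
        if prices[t] > prices[record_times[-1]]:  # new running maximum
            record_times.append(t)
    ages = np.diff(record_times)                  # ages of broken records
    print(len(record_times), ages.max(), np.median(ages))
    ```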

  5. The Relationship between Visual Analysis and Five Statistical Analyses in a Simple AB Single-Case Research Design

    ERIC Educational Resources Information Center

    Brossart, Daniel F.; Parker, Richard I.; Olson, Elizabeth A.; Mahadevan, Lakshmi

    2006-01-01

    This study explored some practical issues for single-case researchers who rely on visual analysis of graphed data, but who also may consider supplemental use of promising statistical analysis techniques. The study sought to answer three major questions: (a) What is a typical range of effect sizes from these analytic techniques for data from…

  6. Effect of Table Tennis Trainings on Biomotor Capacities in Boys

    ERIC Educational Resources Information Center

    Tas, Murat

    2017-01-01

    The aim of this study is to investigate whether the biomotor capacities of boys doing table tennis trainings are affected. A total of 40 students, as randomly selected 20 test groups and 20 control groups at an age range of 10-12 participated in the research. Statistical analysis of data was performed using Statistic Package for Social Science…

  7. Differences among Myopes, Emmetropes, and Hyperopes.

    DTIC Science & Technology

    1980-04-01

    parasympathetic activity (see Wenger & Cullen, 1965). The second index, Cw, is a coherence statistic (a normalized function with values ranging between zero...farther distances. ACKNOWLEDGMENT The literature search and the statistical analyses presented in this report were conducted at New Mexico State...in infant and child. New York: Paul B. Hoeber, 1949. Gould, G. M. Diagnosis, diseases and therapeutics of ametropia . British Journal of Ophthalmology

  8. Statistical Theory for the "RCT-YES" Software: Design-Based Causal Inference for RCTs. NCEE 2015-4011

    ERIC Educational Resources Information Center

    Schochet, Peter Z.

    2015-01-01

    This report presents the statistical theory underlying the "RCT-YES" software that estimates and reports impacts for RCTs for a wide range of designs used in social policy research. The report discusses a unified, non-parametric design-based approach for impact estimation using the building blocks of the Neyman-Rubin-Holland causal…

  9. Private School Statistics: A Review of Private and Federal Data Concerns. Special Report.

    ERIC Educational Resources Information Center

    Orr, David B.

    In 1986, the Center for Education Statistics (CES) initiated a series of meetings with a wide range of private school representatives. At these meetings, a need for more complete information on the data collection efforts of the various private groups was identified, and as a result, CES agreed to investigate the extent and nature of the education…

  10. Actitudes de Estudiantes Universitarios que Tomaron Cursos Introductorios de Estadistica y su Relacion con el Exito Academico en La Disciplina

    ERIC Educational Resources Information Center

    Colon-Rosa, Hector Wm.

    2012-01-01

    Considering the range of changes in the instruction and learning of statistics, several questions emerge regarding how those changes influence students' attitudes. Equally, other questions emerge to reflect that statistics is a fundamental course in the university academic programs because of its relevance to the professional development of the…

  11. The Effects of Sweet, Bitter, Salty and Sour Stimuli on Alpha Rhythm. A MEG Study.

    PubMed

    Kotini, Athanasia; Anninos, Photios; Gemousakakis, Triandafillos; Adamopoulos, Adam

    2016-09-01

    The possible differences in processing gustatory stimuli in healthy subjects were investigated by magnetoencephalography (MEG). MEG recordings were evaluated for 10 healthy volunteers (3 men within the age range 20-46 years, 7 women within the age range 10-28 years), with four different gustatory stimuli: sweet, bitter, sour and salty. Fast Fourier transform was performed on MEG epochs recorded for the above conditions and the effect of each kind of stimulus on alpha rhythm was examined. A significantly higher percentage of alpha power was found irrespective of hemispheric side in all gustatory states, located mainly at the occipital and left and right parietal lobes. One female volunteer showed no statistically significant change when comparing the normal state with the salty and sour tastes, respectively. Two female volunteers exhibited no statistically significant change when comparing their normal with their salty taste. One male volunteer showed no statistically significant change when comparing the normal-bitter and normal-salty states, respectively. All the other subjects showed statistically significant changes in alpha power for the 4 gustatory stimuli. The pattern of activation caused by the four stimuli indicated elevated gustatory processing mechanisms. This cortical activation might have applicability in modulation of brain status.

  12. Statistical analysis of Geopotential Height (GH) timeseries based on Tsallis non-extensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Karakatsanis, L. P.; Iliopoulos, A. C.; Pavlos, E. G.; Pavlos, G. P.

    2018-02-01

    In this paper, we perform statistical analysis of time series deriving from Earth's climate. The time series are concerned with Geopotential Height (GH) and correspond to temporal and spatial components of the global distribution of month average values, during the period (1948-2012). The analysis is based on Tsallis non-extensive statistical mechanics and in particular on the estimation of Tsallis' q-triplet, namely {q_stat, q_sens, q_rel}, the reconstructed phase space and the estimation of correlation dimension and the Hurst exponent of rescaled range analysis (R/S). The deviation of the Tsallis q-triplet from unity indicates non-Gaussian (Tsallis q-Gaussian) non-extensive character with heavy-tailed probability density functions (PDFs), multifractal behavior and long-range dependences for all time series considered. Noticeable differences in the q-triplet estimates were also found in the time series at distinct local or temporal regions. Moreover, the reconstructed phase space revealed a lower-dimensional fractal set in the GH dynamical phase space (strong self-organization), and the estimation of the Hurst exponent indicated multifractality, non-Gaussianity and persistence. The analysis gives significant information for identifying and characterizing the dynamical characteristics of the earth's climate.
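
    Of the statistics listed above, the rescaled-range (R/S) Hurst exponent is the simplest to sketch. The toy estimator below, applied to white noise, should return a value near 0.5; persistent series such as the GH data give values above 0.5. This is a bare-bones illustration, not the authors' implementation.

    ```python
    # Rescaled-range (R/S) estimate of the Hurst exponent.
    import numpy as np

    def rs_hurst(x, window_sizes):
        points = []
        for w in window_sizes:
            rs = []
            for start in range(0, len(x) - w + 1, w):
                seg = x[start:start + w]
                dev = np.cumsum(seg - seg.mean())   # cumulative deviations
                r, s = dev.max() - dev.min(), seg.std()
                if s > 0:
                    rs.append(r / s)
            points.append((np.log(w), np.log(np.mean(rs))))
        slope, _ = np.polyfit(*zip(*points), 1)     # H is the log-log slope
        return slope

    x = np.random.default_rng(4).normal(size=4096)
    print(rs_hurst(x, [16, 32, 64, 128, 256]))      # ~0.5 for white noise
    ```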

  13. Statistical approach to tunneling time in attosecond experiments

    NASA Astrophysics Data System (ADS)

    Demir, Durmuş; Güner, Tuğrul

    2017-11-01

    Tunneling, transport of particles through classically forbidden regions, is a pure quantum phenomenon. It governs numerous phenomena ranging from single-molecule electronics to donor-acceptor transition reactions. The main problem is the absence of a universal method to compute tunneling time. This problem has been attacked in various ways in the literature. In the present work, we show that a statistical approach to the problem, motivated by the imaginary nature of time in the forbidden regions, leads to a novel tunneling time formula which is real and subluminal (in contrast to various known time definitions implying superluminal tunneling). In addition, we show explicitly that the entropic time formula is in good agreement with the tunneling time measurements in laser-driven He ionization. Moreover, it sets an accurate range for long-range electron transfer reactions. The entropic time formula is general enough to extend to photon and phonon tunneling phenomena.

  14. Beam-spin asymmetries from semi-inclusive pion electroproduction

    NASA Astrophysics Data System (ADS)

    Gohn, W.; Avakian, H.; Joo, K.; Ungaro, M.; Adhikari, K. P.; Aghasyan, M.; Amaryan, M. J.; Anderson, M. D.; Anefalos Pereira, S.; Ball, J.; Baltzell, N. A.; Battaglieri, M.; Biselli, A. S.; Bono, J.; Briscoe, W. J.; Brooks, W. K.; Burkert, V. D.; Carman, D. S.; Celentano, A.; Chandavar, S.; Charles, G.; Cole, P. L.; Contalbrigo, M.; Cortes, O.; Crede, V.; D'Angelo, A.; Dashyan, N.; De Vita, R.; De Sanctis, E.; Djalali, C.; Doughty, D.; Dupre, R.; El Alaoui, A.; El Fassi, L.; Eugenio, P.; Fedotov, G.; Fleming, J. A.; Forest, T.; Garçon, M.; Ghandilyan, Y.; Gilfoyle, G. P.; Giovanetti, K. L.; Girod, F. X.; Gothe, R. W.; Griffioen, K. A.; Guegan, B.; Guo, L.; Hafidi, K.; Hanretty, C.; Harrison, N.; Hattawy, Mohammad; Hicks, K.; Ho, D.; Holtrop, M.; Hyde, C.; Ilieva, Y.; Ireland, D. G.; Ishkhanov, B. S.; Jo, H. S.; Keller, D.; Khandaker, M.; Khetarpal, P.; Kim, W.; Klein, F. J.; Koirala, S.; Kubarovsky, V.; Kuhn, S. E.; Kuleshov, S. V.; Lenisa, P.; Livingston, K.; Lu, H. Y.; MacGregor, I. J. D.; Markov, N.; Mayer, M.; McKinnon, B.; Mineeva, T.; Mirazita, M.; Mokeev, V.; Movsisyan, A.; Nadel-Turonski, P.; Niccolai, S.; Niculescu, I.; Osipenko, M.; Ostrovidov, A. I.; Pappalardo, L. L.; Paremuzyan, R.; Park, K.; Pasyuk, E.; Peng, P.; Phillips, J. J.; Pisano, S.; Pozdniakov, S.; Price, J. W.; Procureur, S.; Prok, Y.; Puckett, A. J. R.; Raue, B. A.; Ripani, M.; Ritchie, B. G.; Rizzo, A.; Rosner, G.; Rossi, P.; Roy, P.; Sabatié, F.; Salgado, C.; Schott, D.; Schumacher, R. A.; Seder, E.; Seraydaryan, H.; Sharabian, Y. G.; Simonyan, A.; Smith, G. D.; Sober, D. I.; Sokhan, D.; Stoler, P.; Strakovsky, I. I.; Stepanyan, S.; Strauch, S.; Tang, W.; Tkachenko, S.; Vernarsky, B.; Voskanyan, H.; Voutier, E.; Walford, N. K.; Watts, D. P.; Weinstein, L. B.; Wood, M. H.; Zachariou, N.; Zana, L.; Zhang, J.; Zonta, I.; CLAS Collaboration

    2014-04-01

    We have measured the moment A_LU^{sinφ} corresponding to the polarized electron beam-spin asymmetry in semi-inclusive deep inelastic scattering. A_LU^{sinφ} is a twist-3 quantity providing information about quark-gluon correlations. Data were taken with the CLAS Spectrometer at Jefferson Lab using a 5.498 GeV longitudinally polarized electron beam and an unpolarized liquid-hydrogen target. All three pion channels (π+, π0 and π-) were measured simultaneously over a large range of kinematics within the virtuality range Q² ≈ 1.0-4.5 GeV². The observable was measured with better than 1% statistical precision over a large range of z, P_T, x_B, and Q², which permits comparison with several reaction models. The discussed measurements provide an upgrade in statistics over previous measurements, and serve as the first evidence for the negative sign of the π- sinφ moment.

  15. Theory connecting nonlocal sediment transport, earth surface roughness, and the Sadler effect

    NASA Astrophysics Data System (ADS)

    Schumer, Rina; Taloni, Alessandro; Furbish, David Jon

    2017-03-01

    Earth surface evolution, like many natural phenomena typified by fluctuations on a wide range of scales and deterministic smoothing, results in a statistically rough surface. We present theory demonstrating that scaling exponents of topographic and stratigraphic statistics arise from long-time averaging of noisy surface evolution rather than specific landscape evolution processes. This is demonstrated through use of "elastic" Langevin equations that generically describe disturbance from a flat earth surface using a noise term that is smoothed deterministically via sediment transport. When smoothing due to transport is a local process, the geologic record self organizes such that a specific Sadler effect and topographic power spectral density (PSD) emerge. Variations in PSD slope reflect the presence or absence and character of nonlocality of sediment transport. The range of observed stratigraphic Sadler slopes captures the same smoothing feature combined with the presence of long-range spatial correlation in topographic disturbance.

  16. The consentaneous model of the financial markets exhibiting spurious nature of long-range memory

    NASA Astrophysics Data System (ADS)

    Gontis, V.; Kononovicius, A.

    2018-09-01

    It is widely accepted that there is strong persistence in the volatility of financial time series. The origin of the observed persistence, or long-range memory, is still an open problem, as the observed phenomenon could be a spurious effect. Earlier we proposed the consentaneous model of the financial markets based on non-linear stochastic differential equations. The consentaneous model successfully reproduces the empirical probability and power spectral densities of volatility. This approach is qualitatively different from models built using fractional Brownian motion. In this contribution we investigate burst and inter-burst duration statistics of volatility in the financial markets employing the consentaneous model. Our analysis provides evidence that the empirical statistical properties of burst and inter-burst duration can be explained by non-linear stochastic differential equations driving the volatility in the financial markets. This serves as a strong argument that long-range memory in finance can have a spurious nature.

  17. Long Range Earthquake Interaction in Iceland

    NASA Astrophysics Data System (ADS)

    Goltz, C.

    2003-12-01

    It has been observed that earthquakes can be triggered by similarly sized events at large distances. The phenomenon has recently been shown to be statistically significant at a range up to several source dimensions in global earthquake data. The most appropriate explanation of the phenomenon seems to be criticality of the Earth's crust as e.g. changes in static and dynamic stresses would otherwise be too small to trigger remote events. I present results for a regional (as opposed to global) study of seismicity in Iceland which is based on a high quality reprocessed catalogue. Results include the time-dependent determination of the maximum range of interaction and the correlation length and also address the question whether small events can trigger larger ones. Pitfalls such as data accuracy and geometry as well as boundary effects are thoroughly discussed. A comparison with surrogate data helps to assess the statistical significance of the results.

  18. The contribution of statistical physics to evolutionary biology.

    PubMed

    de Vladar, Harold P; Barton, Nicholas H

    2011-08-01

    Evolutionary biology shares many concepts with statistical physics: both deal with populations, whether of molecules or organisms, and both seek to simplify evolution in very many dimensions. Often, methodologies have undergone parallel and independent development, as with stochastic methods in population genetics. Here, we discuss aspects of population genetics that have embraced methods from physics: non-equilibrium statistical mechanics, travelling waves and Monte-Carlo methods, among others, have been used to study polygenic evolution, rates of adaptation and range expansions. These applications indicate that evolutionary biology can further benefit from interactions with other areas of statistical physics; for example, by following the distribution of paths taken by a population through time. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Methods for estimating selected spring and fall low-flow frequency statistics for ungaged stream sites in Iowa, based on data through June 2014

    USGS Publications Warehouse

    Eash, David A.; Barnes, Kimberlee K.; O'Shea, Padraic S.

    2016-09-19

    A statewide study was conducted to develop regression equations for estimating three selected spring and three selected fall low-flow frequency statistics for ungaged stream sites in Iowa. The estimation equations developed for the six low-flow frequency statistics include spring (April through June) 1-, 7-, and 30-day mean low flows for a recurrence interval of 10 years and fall (October through December) 1-, 7-, and 30-day mean low flows for a recurrence interval of 10 years. Estimates of the three selected spring statistics are provided for 241 U.S. Geological Survey continuous-record streamgages, and estimates of the three selected fall statistics are provided for 238 of these streamgages, using data through June 2014. Because only 9 years of fall streamflow record were available, three streamgages included in the development of the spring regression equations were not included in the development of the fall regression equations. Because of regulation, diversion, or urbanization, 30 of the 241 streamgages were not included in the development of the regression equations. The study area includes Iowa and adjacent areas within 50 miles of the Iowa border. Because trend analyses indicated statistically significant positive trends when considering the period of record for most of the streamgages, the longest, most recent period of record without a significant trend was determined for each streamgage for use in the study. Geographic information system software was used to measure 63 selected basin characteristics for each of the 211 streamgages used to develop the regional regression equations. The study area was divided into three low-flow regions that were defined in a previous study for the development of regional regression equations. Because several streamgages included in the development of regional regression equations have estimates of zero flow calculated from observed streamflow for selected spring and fall low-flow frequency statistics, the final equations for the three low-flow regions were developed using two types of regression analyses—left-censored and generalized-least-squares regression analyses. A total of 211 streamgages were included in the development of nine spring regression equations—three equations for each of the three low-flow regions. A total of 208 streamgages were included in the development of nine fall regression equations—three equations for each of the three low-flow regions. A censoring threshold was used to develop 15 left-censored regression equations to estimate the three fall low-flow frequency statistics for each of the three low-flow regions and to estimate the three spring low-flow frequency statistics for the southern and northwest regions. For the northeast region, generalized-least-squares regression was used to develop three equations to estimate the three spring low-flow frequency statistics. For the northeast region, average standard errors of prediction range from 32.4 to 48.4 percent for the spring equations and average standard errors of estimate range from 56.4 to 73.8 percent for the fall equations. For the northwest region, average standard errors of estimate range from 58.9 to 62.1 percent for the spring equations and from 83.2 to 109.4 percent for the fall equations.
For the southern region, average standard errors of estimate range from 43.2 to 64.0 percent for the spring equations and from 78.1 to 78.7 percent for the fall equations.The regression equations are applicable only to stream sites in Iowa with low flows not substantially affected by regulation, diversion, or urbanization and with basin characteristics within the range of those used to develop the equations. The regression equations will be implemented within the U.S. Geological Survey StreamStats Web-based geographic information system application. StreamStats allows users to click on any ungaged stream site and compute estimates of the six selected spring and fall low-flow statistics; in addition, 90-percent prediction intervals and the measured basin characteristics for the ungaged site are provided. StreamStats also allows users to click on any Iowa streamgage to obtain computed estimates for the six selected spring and fall low-flow statistics.

  20. Dynamic principle for ensemble control tools.

    PubMed

    Samoletov, A; Vasiev, B

    2017-11-28

    Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
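
    As a generic illustration of the kind of scheme the principle covers (not the authors' construction), overdamped Langevin dynamics is among the simplest stochastic thermostats whose stationary distribution is the canonical measure exp(-U(x)/kT):

    ```python
    # Overdamped Langevin thermostat: dx = (F/gamma) dt + sqrt(2 kT dt/gamma) xi.
    import numpy as np

    def langevin_step(x, force, dt, kT, gamma, rng):
        noise = np.sqrt(2 * kT * dt / gamma) * rng.normal(size=x.shape)
        return x + dt * force(x) / gamma + noise

    rng = np.random.default_rng(5)
    force = lambda x: -x                  # harmonic potential U(x) = x^2 / 2
    x = np.zeros(10_000)                  # ensemble of independent particles
    for _ in range(2_000):
        x = langevin_step(x, force, dt=0.01, kT=1.0, gamma=1.0, rng=rng)
    print(x.var())                        # ~kT, the canonical variance here
    ```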

  1. Automatic Adaptation to Fast Input Changes in a Time-Invariant Neural Circuit

    PubMed Central

    Bharioke, Arjun; Chklovskii, Dmitri B.

    2015-01-01

    Neurons must faithfully encode signals that can vary over many orders of magnitude despite having only limited dynamic ranges. For a correlated signal, this dynamic range constraint can be relieved by subtracting away components of the signal that can be predicted from the past, a strategy known as predictive coding, which relies on learning the input statistics. However, the statistics of natural input signals can also vary over very short time scales, e.g., following saccades across a visual scene. To maintain a reduced transmission cost to signals with rapidly varying statistics, neuronal circuits implementing predictive coding must also rapidly adapt their properties. Experimentally, in different sensory modalities, sensory neurons have shown such adaptations within 100 ms of an input change. Here, we show first that linear neurons connected in a feedback inhibitory circuit can implement predictive coding. We then show that adding a rectification nonlinearity to such a feedback inhibitory circuit allows it to automatically adapt and approximate the performance of an optimal linear predictive coding network, over a wide range of inputs, while keeping its underlying temporal and synaptic properties unchanged. We demonstrate that the resulting changes to the linearized temporal filters of this nonlinear network match the fast adaptations observed experimentally in different sensory modalities, in different vertebrate species. Therefore, the nonlinear feedback inhibitory network can provide automatic adaptation to fast varying signals, maintaining the dynamic range necessary for accurate neuronal transmission of natural inputs. PMID:26247884
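
    A toy version of the core idea, under the simplifying assumption of a first-order autoregressive input (the paper's circuit model is richer): subtracting the prediction from the past shrinks the variance, and hence the dynamic range, that must be transmitted.

    ```python
    # Predictive coding on an AR(1) signal: transmit only the residual.
    import numpy as np

    rng = np.random.default_rng(6)
    n, rho = 10_000, 0.95
    s = np.zeros(n)
    for t in range(1, n):
        s[t] = rho * s[t - 1] + rng.normal()   # correlated input signal

    residual = s[1:] - rho * s[:-1]            # prediction subtracted away
    print(s.var(), residual.var())             # ~10x variance reduction here
    ```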

  2. Low back pain in professional golfers: the role of associated hip and low back range-of-motion deficits.

    PubMed

    Vad, Vijay B; Bhat, Atul L; Basrai, Dilshaad; Gebeh, Ansu; Aspergren, Donald D; Andrews, James R

    2004-03-01

    Low back pain is fairly prevalent among golfers; however, its precise biomechanical mechanism is often debated. There is a positive correlation of decreased lead hip rotation and lumbar range of motion with a prior history of low back pain in professional golfers. A cross-sectional study. Forty-two consecutive professional male golfers were categorized as group 1 (history of low back pain greater than 2 weeks affecting quality of play within the past 1 year) and group 2 (no previous such history). All underwent measurements of hip and lumbar range of motion, FABERE's distance, and finger-to-floor distance. Differences in measurements were analyzed using the Wilcoxon signed rank test. Thirty-three percent of golfers had previously experienced low back pain. A statistically significant correlation (P < .05) was observed between a history of low back pain and decreased lead hip internal rotation, FABERE's distance, and lumbar extension. No statistically significant difference was noted in nonlead hip range of motion or finger-to-floor distance with history of low back pain. Range-of-motion deficits in lead hip rotation and lumbar spine extension correlated with a history of low back pain in golfers.

  3. Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review.

    PubMed

    Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C

    2018-03-07

    Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
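
    Two of the practical approximations discussed above are simple enough to state directly; both are rough fallbacks (the review compares them against omitting trials), and the inputs below are invented.

    ```python
    # Approximate a missing SD from the reported range (range / 4), and a
    # missing mean from the quartiles ((q1 + median + q3) / 3).
    def sd_from_range(minimum, maximum):
        return (maximum - minimum) / 4.0

    def mean_from_quartiles(q1, median, q3):
        return (q1 + median + q3) / 3.0

    print(sd_from_range(10.0, 42.0))              # 8.0
    print(mean_from_quartiles(18.0, 24.0, 33.0))  # 25.0
    ```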

  4. Statistics Refresher for Molecular Imaging Technologists, Part 2: Accuracy of Interpretation, Significance, and Variance.

    PubMed

    Farrell, Mary Beth

    2018-06-01

    This article is the second part of a continuing education series reviewing basic statistics that nuclear medicine and molecular imaging technologists should understand. In this article, the statistics for evaluating interpretation accuracy, significance, and variance are discussed. Throughout the article, actual statistics are pulled from the published literature. We begin by explaining 2 methods for quantifying interpretive accuracy: interreader and intrareader reliability. Agreement among readers can be expressed simply as a percentage. However, the Cohen κ-statistic is a more robust measure of agreement that accounts for chance. The higher the κ-statistic is, the higher is the agreement between readers. When 3 or more readers are being compared, the Fleiss κ-statistic is used. Significance testing determines whether the difference between 2 conditions or interventions is meaningful. Statistical significance is usually expressed using a number called a probability ( P ) value. Calculation of P value is beyond the scope of this review. However, knowing how to interpret P values is important for understanding the scientific literature. Generally, a P value of less than 0.05 is considered significant and indicates that the results of the experiment are due to more than just chance. Variance, standard deviation (SD), confidence interval, and standard error (SE) explain the dispersion of data around a mean of a sample drawn from a population. SD is commonly reported in the literature. A small SD indicates that there is not much variation in the sample data. Many biologic measurements fall into what is referred to as a normal distribution taking the shape of a bell curve. In a normal distribution, 68% of the data will fall within 1 SD, 95% will fall within 2 SDs, and 99.7% will fall within 3 SDs. Confidence interval defines the range of possible values within which the population parameter is likely to lie and gives an idea of the precision of the statistic being measured. A wide confidence interval indicates that if the experiment were repeated multiple times on other samples, the measured statistic would lie within a wide range of possibilities. The confidence interval relies on the SE. © 2018 by the Society of Nuclear Medicine and Molecular Imaging.
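
    A worked example of the Cohen κ-statistic described above, for two hypothetical readers rating the same 100 scans positive or negative (the counts are made up):

    ```python
    # Cohen's kappa: agreement between two readers corrected for chance.
    import numpy as np

    confusion = np.array([[40, 5],    # A positive: B positive, B negative
                          [10, 45]])  # A negative: B positive, B negative
    n = confusion.sum()
    p_observed = np.trace(confusion) / n
    # Chance agreement from each reader's marginal rating frequencies.
    p_chance = (confusion.sum(axis=1) / n) @ (confusion.sum(axis=0) / n)
    kappa = (p_observed - p_chance) / (1 - p_chance)
    print(round(kappa, 3))            # 0.70: substantial agreement
    ```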

  5. High intensity click statistics from a 10 × 10 avalanche photodiode array

    NASA Astrophysics Data System (ADS)

    Kröger, Johannes; Ahrens, Thomas; Sperling, Jan; Vogel, Werner; Stolz, Heinrich; Hage, Boris

    2017-11-01

    Photon-number measurements are a fundamental technique for the discrimination and characterization of quantum states of light. Beyond the abilities of state-of-the-art devices, we present measurements with an array of 100 avalanche photodiodes exposed to photon-numbers ranging from well below to significantly above one photon per diode. Despite each single diode only discriminating between zero and non-zero photon-numbers we were able to extract a second order moment, which acts as a nonclassicality indicator. We demonstrate a vast enhancement of the applicable intensity range by two orders of magnitude relative to the standard application of such devices. It turns out that the probabilistic mapping of arbitrary photon-numbers on a finite number of registered clicks is not per se a disadvantage compared with true photon counters. Such detector arrays can bridge the gap between single-photon and linear detection, by investigation of the click statistics, without the necessity of photon statistics reconstruction.

  6. Workplace accidents and self-organized criticality

    NASA Astrophysics Data System (ADS)

    Mauro, John C.; Diehl, Brett; Marcellin, Richard F.; Vaughn, Daniel J.

    2018-09-01

    The occurrence of workplace accidents is described within the context of self-organized criticality, a theory from statistical physics that governs a wide range of phenomena across physics, biology, geosciences, economics, and the social sciences. Workplace accident data from the U.S. Bureau of Labor Statistics reveal a power-law relationship between the number of accidents and their severity as measured by the number of days lost from work. This power-law scaling is indicative of workplace accidents being governed by self-organized criticality, suggesting that nearly all workplace accidents have a common underlying cause, independent of their severity. Such power-law scaling is found for all labor categories documented by the U.S. Bureau of Labor Statistics. Our results provide scientific support for the Heinrich accident triangle, with the practical implication that suppressing the rate of severe accidents requires changing the attitude toward workplace safety in general. By creating a culture that values safety, empowers individuals, and strives to continuously improve, accident rates can be suppressed across the full range of severities.
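
    The power-law check at the heart of the argument amounts to a straight-line fit on log-log axes. The sketch below uses fabricated counts, not Bureau of Labor Statistics data.

    ```python
    # Fit log(count) against log(severity); the slope is the scaling exponent.
    import numpy as np
    from scipy.stats import linregress

    days_lost = np.array([1, 2, 4, 8, 16, 32, 64])            # severity bins
    accidents = np.array([5200, 2700, 1350, 700, 340, 180, 90])
    fit = linregress(np.log(days_lost), np.log(accidents))
    print(fit.slope, fit.rvalue**2)   # slope near -1 => power-law scaling
    ```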

  7. Ball-joint versus single monolateral external fixators for definitive treatment of tibial shaft fractures.

    PubMed

    Beltsios, Michail; Mavrogenis, Andreas F; Savvidou, Olga D; Karamanis, Eirineos; Kokkalis, Zinon T; Papagelopoulos, Panayiotis J

    2014-07-01

    To compare modular monolateral external fixators with single monolateral external fixators for the treatment of open and complex tibial shaft fractures, to determine the optimal construct for fracture union. A total of 223 tibial shaft fractures in 212 patients were treated with a monolateral external fixator from 2005 to 2011; 112 fractures were treated with a modular external fixator with ball-joints (group A), and 111 fractures were treated with a single external fixator without ball-joints (group B). The mean follow-up was 2.9 years. We retrospectively evaluated the operative time for fracture reduction with the external fixator, pain and range of motion of the knee and ankle joints, time to union, rate of malunion, reoperations and revisions of the external fixators, and complications. The time for fracture reduction was statistically higher in group B; the rate of union was statistically higher in group B; the rate of nonunion was statistically higher in group A; the mean time to union was statistically higher in group A; the rate of reoperations was statistically higher in group A; and the rate of revision of the external fixator was statistically higher in group A. Pain, range of motion of the knee and ankle joints, rates of delayed union, malunion and complications were similar. Although modular external fixators are associated with faster intraoperative fracture reduction with the external fixator, single external fixators are associated with significantly better rates of union and reoperations; the rates of delayed union, malunion and complications are similar.

  8. Watershed Regressions for Pesticides (WARP) models for predicting stream concentrations of multiple pesticides

    USGS Publications Warehouse

    Stone, Wesley W.; Crawford, Charles G.; Gilliom, Robert J.

    2013-01-01

    Watershed Regressions for Pesticides for multiple pesticides (WARP-MP) are statistical models developed to predict concentration statistics for a wide range of pesticides in unmonitored streams. The WARP-MP models use the national atrazine WARP models in conjunction with an adjustment factor for each additional pesticide. The WARP-MP models perform best for pesticides with application timing and methods similar to those used with atrazine. For other pesticides, WARP-MP models tend to overpredict concentration statistics for the model development sites. For WARP and WARP-MP, the less-than-ideal sampling frequency for the model development sites leads to underestimation of the shorter-duration concentration statistics; hence, the WARP models tend to underpredict 4- and 21-d maximum moving-average concentrations, with median errors ranging from 9 to 38%. As a result of this sampling bias, pesticides that performed well with the model development sites are expected to have predictions that are biased low for these shorter-duration concentration statistics. The overprediction by WARP-MP apparent for some of the pesticides is variably offset by underestimation of the model development concentration statistics. Of the 112 pesticides used in the WARP-MP application to stream segments nationwide, 25 were predicted to have concentration statistics with a 50% or greater probability of exceeding one or more aquatic life benchmarks in one or more stream segments. Geographically, many of the modeled streams in the Corn Belt Region were predicted to have one or more pesticides that exceeded an aquatic life benchmark during 2009, indicating the potential vulnerability of streams in this region.

  9. Technical Note: Statistical dependences between channels in radiochromic film readings. Implications in multichannel dosimetry.

    PubMed

    González-López, Antonio; Vera-Sánchez, Juan Antonio; Ruiz-Morales, Carmen

    2016-05-01

    This note studies the statistical relationships between color channels in radiochromic film readings with flatbed scanners. The same relationships are studied for noise. Finally, their implications for multichannel film dosimetry are discussed. Radiochromic films exposed to wedged fields of 6 MV energy were read in a flatbed scanner. The joint histograms of pairs of color channels were used to obtain the joint and conditional probability density functions between channels. Then, the conditional expectations and variances of one channel given another channel were obtained. Noise was extracted from film readings by means of a multiresolution analysis. Two different dose ranges were analyzed, the first one ranging from 112 to 473 cGy and the second one from 52 to 1290 cGy. For the smallest dose range, the conditional expectations of one channel given another channel can be approximated by linear functions, while the conditional variances are fairly constant. The slopes of the linear relationships between channels can be used to simplify the expression that estimates the dose by means of the multichannel method. The slopes of the linear relationships between each channel and the red one can also be interpreted as weights in the final contribution to dose estimation. However, for the largest dose range, the conditional expectations of one channel given another channel are no longer linear functions. Finally, noises in different channels were found to correlate weakly. Signals present in different channels of radiochromic film readings show a strong statistical dependence. By contrast, noise correlates weakly between channels. For the smallest dose range analyzed, the linear behavior of the conditional expectation of one channel given another can be used to simplify calculations in multichannel film dosimetry.
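
    As a rough illustration of the histogram-based estimate described here, the following sketch bins one channel, averages the other within each bin to get the conditional expectation, and fits the slope that the note proposes as a weight; the synthetic channel readings are assumptions standing in for real film scans:

```python
import numpy as np

rng = np.random.default_rng(0)
red = rng.uniform(0.3, 0.8, 100_000)                      # stand-in red-channel readings
green = 0.6 * red + 0.1 + rng.normal(0, 0.01, red.size)   # correlated green channel

bins = np.linspace(red.min(), red.max(), 50)
idx = np.digitize(red, bins)
centers, cond_mean, cond_var = [], [], []
for b in range(1, len(bins)):
    g = green[idx == b]
    if g.size > 100:                  # require enough samples per bin
        centers.append(0.5 * (bins[b - 1] + bins[b]))
        cond_mean.append(g.mean())    # E[green | red in bin]
        cond_var.append(g.var())      # Var[green | red in bin]

slope, intercept = np.polyfit(centers, cond_mean, 1)
print(f"fitted slope {slope:.3f} (candidate weight for the multichannel estimate)")
print(f"conditional variance spread: {min(cond_var):.2e} to {max(cond_var):.2e}")
```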

  10. 18F-choline PET/MRI in suspected recurrence of prostate carcinoma.

    PubMed

    Riola-Parada, C; Carreras-Delgado, J L; Pérez-Dueñas, V; Garcerant-Tafur, M; García-Cañamaque, L

    2018-05-21

    To evaluate the usefulness of simultaneous 18F-choline PET/MRI in suspected prostate cancer recurrence and to relate the 18F-choline PET/MRI detection rate to analytical and pathological variables. 27 patients with prostate cancer who received local therapy as primary treatment underwent 18F-choline PET/MRI due to suspicion of recurrence (persistently rising serum PSA level). 18F-choline PET/MRI findings were validated by anatomopathological analysis, other imaging tests, or biochemical response to oncological treatment. 18F-choline PET/MRI detected disease in 15 of 27 patients (detection rate 55.56%); 4 (15%) presented exclusively local recurrence, 5 (18%) lymph node metastases, and 7 (26%) bone metastases. Mean PSA (PSAmed) at study time was 2.94 ng/mL (range 0.18-10 ng/mL). PSAmed in patients with positive PET/MRI was 3.70 ng/mL (range 0.24-10 ng/mL), higher than in patients with negative PET/MRI (PSAmed 1.97 ng/mL, range 0.18-4.38 ng/mL), although without statistically significant differences. The Gleason score at diagnosis was 7.33 (range 6-9) in patients with a positive study and 7 (range 6-9) in patients with a negative study, without statistically significant differences. The 18F-choline PET/MRI detection rate was considerable despite the relatively low PSA values in our sample. The influence of Gleason score and PSA level on the detection rate was not statistically significant. Copyright © 2018 Sociedad Española de Medicina Nuclear e Imagen Molecular. Published by Elsevier España, S.L.U. All rights reserved.

  11. Technical Note: Statistical dependences between channels in radiochromic film readings. Implications in multichannel dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    González-López, Antonio, E-mail: antonio.gonzalez7@carm.es; Vera-Sánchez, Juan Antonio; Ruiz-Morales, Carmen

    Purpose: This note studies the statistical relationships between color channels in radiochromic film readings with flatbed scanners. The same relationships are studied for noise. Finally, their implications for multichannel film dosimetry are discussed. Methods: Radiochromic films exposed to wedged fields of 6 MV energy were read in a flatbed scanner. The joint histograms of pairs of color channels were used to obtain the joint and conditional probability density functions between channels. Then, the conditional expectations and variances of one channel given another channel were obtained. Noise was extracted from film readings by means of a multiresolution analysis. Two different dose ranges were analyzed, the first one ranging from 112 to 473 cGy and the second one from 52 to 1290 cGy. Results: For the smallest dose range, the conditional expectations of one channel given another channel can be approximated by linear functions, while the conditional variances are fairly constant. The slopes of the linear relationships between channels can be used to simplify the expression that estimates the dose by means of the multichannel method. The slopes of the linear relationships between each channel and the red one can also be interpreted as weights in the final contribution to dose estimation. However, for the largest dose range, the conditional expectations of one channel given another channel are no longer linear functions. Finally, noises in different channels were found to correlate weakly. Conclusions: Signals present in different channels of radiochromic film readings show a strong statistical dependence. By contrast, noise correlates weakly between channels. For the smallest dose range analyzed, the linear behavior of the conditional expectation of one channel given another can be used to simplify calculations in multichannel film dosimetry.

  12. Measuring the Number of M Dwarfs per M Dwarf Using Kepler Eclipsing Binaries

    NASA Astrophysics Data System (ADS)

    Shan, Yutong; Johnson, John A.; Morton, Timothy D.

    2015-11-01

    We measure the binarity of detached M dwarfs in the Kepler field with orbital periods in the range of 1-90 days. Kepler's photometric precision and nearly continuous monitoring of stellar targets over time baselines ranging from 3 months to 4 years make its detection efficiency for eclipsing binaries nearly complete over this period range and for all radius ratios. Our investigation employs a statistical framework akin to that used for inferring planetary occurrence rates from planetary transits. The obvious simplification is that eclipsing binaries have a vastly improved detection efficiency that is limited chiefly by their geometric probabilities to eclipse. For the M-dwarf sample observed by the Kepler Mission, the fractional incidence of eclipsing binaries implies that there are 0.11 (+0.02, -0.04) close stellar companions per apparently single M dwarf. Our measured binarity is higher than previous inferences of the occurrence rate of close binaries via radial velocity techniques, at roughly the 2σ level. This study represents the first use of eclipsing binary detections from a high-quality transiting planet mission to infer binary statistics. Application of this statistical framework to the eclipsing binaries discovered by future transit surveys will establish better constraints on the short-period M+M binary rate, as well as binarity measurements for stars of other spectral types.
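
    The geometric correction at the core of this framework is easy to state: each detected eclipsing binary is up-weighted by the inverse of its probability of eclipsing. A minimal sketch, with a made-up sample size and made-up scaled separations:

```python
import numpy as np

def p_eclipse(a_over_rsum):
    """Eclipse probability for a circular orbit, roughly (R1 + R2) / a."""
    return 1.0 / a_over_rsum

n_m_dwarfs = 4000                                          # hypothetical sample size
detected_a_over_rsum = np.array([8.0, 15.0, 30.0, 60.0])   # hypothetical detections

# Sum of 1/p_eclipse over detections, divided by the number of stars surveyed.
occurrence = np.sum(1.0 / p_eclipse(detected_a_over_rsum)) / n_m_dwarfs
print(f"inferred close companions per M dwarf: {occurrence:.3f}")
```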

  13. Informativeness of Diagnostic Marker Values and the Impact of Data Grouping.

    PubMed

    Ma, Hua; Bandos, Andriy I; Gur, David

    2018-01-01

    Assessing performance of diagnostic markers is a necessary step for their use in decision making regarding various conditions of interest in diagnostic medicine and other fields. Globally useful markers could, however, have ranges of values that are "diagnostically non-informative". This paper demonstrates that the presence of marker values from diagnostically non-informative ranges could lead to a loss in statistical efficiency during nonparametric evaluation and shows that grouping non-informative values provides a natural resolution to this problem. These points are theoretically proven and an extensive simulation study is conducted to illustrate the possible benefits of using grouped marker values in a number of practically reasonable scenarios. The results contradict the common conjecture regarding the detrimental effect of grouped marker values during performance assessments. Specifically, contrary to the common assumption that grouped marker values lead to bias, grouping non-informative values does not introduce bias and could substantially reduce sampling variability. The proven concept that grouped marker values could be statistically beneficial without detrimental consequences implies that in practice, tied values do not always require resolution, whereas the use of continuous diagnostic results without addressing diagnostically non-informative ranges could be statistically detrimental. Based on these findings, more efficient methods for evaluating diagnostic markers could be developed.

  14. An Open Task to Promote Students to Create Statistical Concepts through Modelling

    ERIC Educational Resources Information Center

    Albarracín, Lluís; Aymerich, Àngels; Gorgorió, Núria

    2017-01-01

    This article reports on the solutions of a group of 22 students, aged 15/16 years old, when facing a statistical modelling activity. They were given the salary lists of 5 companies and were asked what could be said about their salary structure; no hint was given. The results show that students not only used a wide range of data concepts and…

  15. Statistical uncertainties of a chiral interaction at next-to-next-to leading order

    DOE PAGES

    Ekström, A.; Carlsson, B. D.; Wendt, K. A.; ...

    2015-02-05

    In this paper, we have quantified the statistical uncertainties of the low-energy coupling constants (LECs) of an optimized nucleon–nucleon interaction from chiral effective field theory at next-to-next-to-leading order. In addition, we have propagated the impact of the uncertainties of the LECs to two-nucleon scattering phase shifts, effective range parameters, and deuteron observables.

  16. Interim Guidance on the Use of SiteStat/GridStats and Other Army Corps of Engineers Statistical Techniques to Characterize Military Ranges

    EPA Pesticide Factsheets

    The purpose of this memorandum is to inform recipients of concerns regarding Army Corps of Engineers statistical techniques, provide a list of installations and FWS where SiteStat/GridStats (SS/GS) have been used, and to provide direction on communicating with the public on the use of these 'tools' by USACE.

  17. Methods for estimating selected low-flow frequency statistics for unregulated streams in Kentucky

    USGS Publications Warehouse

    Martin, Gary R.; Arihood, Leslie D.

    2010-01-01

    This report provides estimates of, and presents methods for estimating, selected low-flow frequency statistics for unregulated streams in Kentucky including the 30-day mean low flows for recurrence intervals of 2 and 5 years (30Q2 and 30Q5) and the 7-day mean low flows for recurrence intervals of 5, 10, and 20 years (7Q2, 7Q10, and 7Q20). Estimates of these statistics are provided for 121 U.S. Geological Survey streamflow-gaging stations with data through the 2006 climate year, which is the 12-month period ending March 31 of each year. Data were screened to identify the periods of homogeneous, unregulated flows for use in the analyses. Logistic-regression equations are presented for estimating the annual probability of the selected low-flow frequency statistics being equal to zero. Weighted-least-squares regression equations were developed for estimating the magnitude of the nonzero 30Q2, 30Q5, 7Q2, 7Q10, and 7Q20 low flows. Three low-flow regions were defined for estimating the 7-day low-flow frequency statistics. The explicit explanatory variables in the regression equations include total drainage area and the mapped streamflow-variability index measured from a revised statewide coverage of this characteristic. The percentage of the station low-flow statistics correctly classified as zero or nonzero by use of the logistic-regression equations ranged from 87.5 to 93.8 percent. The average standard errors of prediction of the weighted-least-squares regression equations ranged from 108 to 226 percent. The 30Q2 regression equations have the smallest standard errors of prediction, and the 7Q20 regression equations have the largest standard errors of prediction. The regression equations are applicable only to stream sites with low flows unaffected by regulation from reservoirs and local diversions of flow and to drainage basins in specified ranges of basin characteristics. Caution is advised when applying the equations for basins with characteristics near the applicable limits and for basins with karst drainage features.
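
    A minimal sketch of the two-stage approach the report describes, on fabricated data: a logistic regression for the probability that a low-flow statistic is zero, then weighted least squares for the magnitude of the nonzero statistic. The predictors follow the report (drainage area, streamflow-variability index), but the values, coefficients, and weights here are assumptions:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 121
log_area = rng.uniform(0.5, 3.0, n)         # log10 drainage area
var_index = rng.uniform(0.2, 1.2, n)        # streamflow-variability index
X = sm.add_constant(np.column_stack([log_area, var_index]))

# Stage 1: P(low-flow statistic == 0 | basin characteristics).
p_zero = 1.0 / (1.0 + np.exp(3.0 * (log_area - var_index)))
is_zero = (rng.uniform(size=n) < p_zero).astype(int)
stage1 = sm.Logit(is_zero, X).fit(disp=0)

# Stage 2: WLS for the magnitude of the nonzero statistic (log space).
nz = is_zero == 0
log_q = 1.5 * log_area[nz] - 0.8 * var_index[nz] + rng.normal(0, 0.2, nz.sum())
weights = rng.uniform(0.5, 1.5, nz.sum())   # e.g., record-length-based weights
stage2 = sm.WLS(log_q, X[nz], weights=weights).fit()

print(stage1.params, stage2.params)
```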

  18. The effects of sampling frequency on the climate statistics of the European Centre for Medium-Range Weather Forecasts

    NASA Astrophysics Data System (ADS)

    Phillips, Thomas J.; Gates, W. Lawrence; Arpe, Klaus

    1992-12-01

    The effects of sampling frequency on the first- and second-moment statistics of selected European Centre for Medium-Range Weather Forecasts (ECMWF) model variables are investigated in a simulation of "perpetual July" with a diurnal cycle included and with surface and atmospheric fields saved at hourly intervals. The shortest characteristic time scales (as determined by the e-folding time of lagged autocorrelation functions) are those of ground heat fluxes and temperatures, precipitation and runoff, convective processes, cloud properties, and atmospheric vertical motion, while the longest time scales are exhibited by soil temperature and moisture, surface pressure, and atmospheric specific humidity, temperature, and wind. The time scales of surface heat and momentum fluxes and of convective processes are substantially shorter over land than over oceans. An appropriate sampling frequency for each model variable is obtained by comparing the estimates of first- and second-moment statistics determined at intervals ranging from 2 to 24 hours with the "best" estimates obtained from hourly sampling. Relatively accurate estimation of first- and second-moment climate statistics (10% errors in means, 20% errors in variances) can be achieved by sampling a model variable at intervals that usually are longer than the bandwidth of its time series but that often are shorter than its characteristic time scale. For the surface variables, sampling at intervals that are nonintegral divisors of a 24-hour day yields relatively more accurate time-mean statistics because of a reduction in errors associated with aliasing of the diurnal cycle and higher-frequency harmonics. The superior estimates of first-moment statistics are accompanied by inferior estimates of the variance of the daily means due to the presence of systematic biases, but these probably can be avoided by defining a different measure of low-frequency variability. Estimates of the intradiurnal variance of accumulated precipitation and surface runoff also are strongly impacted by the length of the storage interval. In light of these results, several alternative strategies for storage of the ECMWF model variables are recommended.
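
    A minimal sketch of the e-folding-time estimate used above, on a synthetic AR(1) series (the AR(1) form and hourly spacing are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
tau_true = 12.0                       # hours: the decorrelation scale we build in
phi = np.exp(-1.0 / tau_true)
x = np.zeros(24 * 365)
for t in range(1, x.size):            # AR(1) process with known e-folding time
    x[t] = phi * x[t - 1] + rng.normal()

x -= x.mean()
acf = np.correlate(x, x, mode="full")[x.size - 1:]
acf /= acf[0]                         # lagged autocorrelation, lags 0..N-1

e_fold = int(np.argmax(acf < np.exp(-1.0)))   # first lag below 1/e
print(f"estimated e-folding time: {e_fold} hours (construction used {tau_true})")
```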

  19. Detecting response of Douglas-fir plantations to urea fertilizer at three locations in the Oregon Coast Range.

    Treesearch

    Richard E. Miller; Jim Smith; Harry Anderson

    2001-01-01

    Fertilizer trials in coast Douglas-fir (Pseudotsuga menziesii var. menziesii (Mirb.) Franco) in the Oregon Coast Range usually indicate small and statistically nonsignificant response to nitrogen (N) fertilizers. Inherently weak experimental designs of past trials could make them too insensitive to detect growth differences...

  20. Observation of prethermalization in long-range interacting spin chains

    PubMed Central

    Neyenhuis, Brian; Zhang, Jiehang; Hess, Paul W.; Smith, Jacob; Lee, Aaron C.; Richerme, Phil; Gong, Zhe-Xuan; Gorshkov, Alexey V.; Monroe, Christopher

    2017-01-01

    Although statistical mechanics describes thermal equilibrium states, these states may or may not emerge dynamically for a subsystem of an isolated quantum many-body system. For instance, quantum systems that are near-integrable usually fail to thermalize in an experimentally realistic time scale, and instead relax to quasi-stationary prethermal states that can be described by statistical mechanics, when approximately conserved quantities are included in a generalized Gibbs ensemble (GGE). We experimentally study the relaxation dynamics of a chain of up to 22 spins evolving under a long-range transverse-field Ising Hamiltonian following a sudden quench. For sufficiently long-range interactions, the system relaxes to a new type of prethermal state that retains a strong memory of the initial conditions. However, the prethermal state in this case cannot be described by a standard GGE; it rather arises from an emergent double-well potential felt by the spin excitations. This result shows that prethermalization occurs in a broader context than previously thought, and reveals new challenges for a generic understanding of the thermalization of quantum systems, particularly in the presence of long-range interactions. PMID:28875166

  1. Feasibility study of using statistical process control to customized quality assurance in proton therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho

    Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
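
    A minimal sketch of the capability-index arithmetic behind such an analysis, applied to the ±0.5 mm range tolerance quoted above; the daily measurements are fabricated:

```python
import numpy as np

rng = np.random.default_rng(3)
range_error_mm = rng.normal(loc=0.05, scale=0.12, size=60)  # measured minus planned

lsl, usl = -0.5, 0.5                         # customized specification limits (mm)
mu, sigma = range_error_mm.mean(), range_error_mm.std(ddof=1)

cp = (usl - lsl) / (6 * sigma)               # potential capability, ignores centering
cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # actual capability, penalizes off-center
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f} (>= 1.33 is a common 'capable' threshold)")
```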

  2. Laparoscopic and open subtotal colectomies have similar short-term results.

    PubMed

    Hoogenboom, Froukje J; Bosker, Robbert J I; Groen, Henk; Meijerink, Wilhelmus J H J; Lamme, Bas; Pierie, Jean Pierre E N

    2013-01-01

    Laparoscopic subtotal colectomy (STC) is a complex procedure. It is possible that short-term benefits for segmental resections cannot be attributed to this complex procedure. This study aims to assess differences in short-term results for laparoscopic versus open STC during a 15-year single-institute experience. We reviewed consecutive patients undergoing laparoscopic or open elective or subacute STC from January 1997 to December 2012. Fifty-six laparoscopic and 50 open STCs were performed. The operation time was significantly longer in the laparoscopic group, median 266 min (range 121-420 min), compared to 153 min (range 90-408 min) in the open group (p < 0.001). Median hospital stay showed no statistical difference, 14 days (range 1-129 days) in the laparoscopic and 13 days (range 1-85 days) in the open group. Between-group postoperative complications were not statistically different. Laparoscopic STC has short-term results similar to the open procedure, except for a longer operation time. The laparoscopic approach for STC is therefore only advisable in selected patients combined with extensive preoperative counseling. Copyright © 2013 S. Karger AG, Basel.

  3. Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows

    NASA Astrophysics Data System (ADS)

    Qi, Di; Majda, Andrew J.

    2018-04-01

    Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.

  4. Determination of reference ranges for elements in human scalp hair.

    PubMed

    Druyan, M E; Bass, D; Puchyr, R; Urek, K; Quig, D; Harmon, E; Marquardt, W

    1998-06-01

    Expected values, reference ranges, or reference limits are necessary to enable clinicians to apply analytical chemical data in the delivery of health care. Determination of reference ranges is not straightforward in terms of either selecting a reference population or performing statistical analysis. In light of logistical, scientific, and economic obstacles, it is understandable that clinical laboratories often combine approaches in developing health-associated reference values. A laboratory may choose to: 1. Validate either the reference ranges of other laboratories or published data from clinical research, or both, through comparison with patients' test data. 2. Base the laboratory's reference values on statistical analysis of results from specimens assayed by the clinical reference laboratory itself. 3. Adopt standards or recommendations of regulatory agencies and governmental bodies. 4. Initiate population studies to validate transferred reference ranges or to determine them anew. Effects of external contamination and anecdotal information from clinicians may also be considered. The clinical utility of hair analysis is well accepted for some elements; for others, it remains in the realm of clinical investigation. This article elucidates an approach for the establishment of reference ranges for elements in human scalp hair. Observed levels of analytes from hair specimens from both our laboratory's total patient population and from a physician-defined healthy American population have been evaluated. Examination of the levels of elements often associated with toxicity serves to exemplify the process of determining reference ranges in hair. In addition, the approach serves as a model for setting reference ranges for analytes in a variety of matrices.

  5. Study of pre-seismic kHz EM emissions by means of complex systems

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Papadimitriou, Constantinos; Eftaxias, Konstantinos

    2010-05-01

    The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. A corollary is that transferring ideas and results between investigators in hitherto disparate areas will cross-fertilize and lead to important new results. It is well known that Boltzmann-Gibbs statistical mechanics works best in dealing with systems composed of subsystems that are either independent or interacting via short-range forces, and whose subsystems can access all the available phase space. For systems exhibiting long-range correlations, memory, or fractal properties, non-extensive Tsallis statistical mechanics becomes the most appropriate mathematical framework. A central property of the magnetic storm, solar flare, and earthquake preparation process is the possible occurrence of coherent large-scale collective behavior with a very rich structure, resulting from the repeated nonlinear interactions among its constituents. Consequently, non-extensive statistical mechanics is an appropriate framework in which to investigate universality, if any, in magnetic storm, solar flare, earthquake, and pre-failure EM emission occurrence. A model for earthquake dynamics based on a non-extensive Tsallis formulation, starting from first principles, has recently been introduced. This approach leads to a Gutenberg-Richter-type law for the magnitude distribution of earthquakes which provides an excellent fit to seismicities generated in various large geographic areas usually identified as "seismic regions". We examine whether the Gutenberg-Richter law corresponding to non-extensive Tsallis statistics is able to describe the distribution of amplitudes of earthquakes, pre-seismic kHz EM emissions (electromagnetic earthquakes), solar flares, and magnetic storms. The analysis shows that the introduced non-extensive model provides an excellent fit to the experimental data, incorporating the characteristics of universality by means of non-extensive statistics into the extreme events under study.
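
    For orientation, the generic q-exponential underlying this framework can be written schematically as follows; this is the standard textbook form, not the specific fragment-asperity parameterization fitted in the work cited:

```latex
% Generic q-exponential of non-extensive (Tsallis) statistics:
\exp_q(x) \;=\; \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{1/(1-q)},
\qquad \lim_{q \to 1} \exp_q(x) = e^{x},
% so that a Gutenberg-Richter-type cumulative magnitude law takes the
% heavy-tailed schematic form (M_0 a characteristic magnitude scale):
\frac{N(>M)}{N_{\mathrm{tot}}} \;=\; \exp_q\!\left(-\frac{M}{M_0}\right).
```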

  6. Arthroscopic Debridement for Primary Degenerative Osteoarthritis of the Elbow Leads to Significant Improvement in Range of Motion and Clinical Outcomes: A Systematic Review.

    PubMed

    Sochacki, Kyle R; Jack, Robert A; Hirase, Takashi; McCulloch, Patrick C; Lintner, David M; Liberman, Shari R; Harris, Joshua D

    2017-12-01

    The purpose of this investigation was to determine whether arthroscopic debridement of primary elbow osteoarthritis results in statistically significant and clinically relevant improvement in (1) elbow range of motion and (2) clinical outcomes with (3) low complication and reoperation rates. A systematic review was registered with PROSPERO and performed using PRISMA guidelines. Databases were searched for studies that investigated the outcomes of arthroscopic debridement for the treatment of primary osteoarthritis of the elbow in adult human patients. Study methodological quality was analyzed. Studies that included post-traumatic arthritis were excluded. Elbow motion and all elbow-specific patient-reported outcome scores were eligible for analysis. Comparisons between preoperative and postoperative values from each study were made using 2-sample Z-tests (http://in-silico.net/tools/statistics/ztest) using a P value < .05. Nine articles (209 subjects, 213 elbows, 187 males, 22 females, mean age 45.7 ± 7.1 years, mean follow-up 41.7 ± 16.3 months; 75% right, 25% left; 79% dominant elbow, 21% nondominant) were analyzed. Elbow extension (23.4°-10.7°, Δ 12.7°), flexion (115.9°-128.7°, Δ 12.8°), and global arc of motion (94.5°-117.6°, Δ 23.1°) had statistically significant and clinically relevant improvement following arthroscopic debridement (P < .0001 for all). There was also a statistically significant (P < .0001) and clinically relevant improvement in the Mayo Elbow Performance Score (60.7-84.6, Δ 23.9) postoperatively. Six patients (2.8%) had postoperative complications. Nine (4.2%) underwent reoperation. Elbow arthroscopic debridement for primary degenerative osteoarthritis results in statistically significant and clinically relevant improvement in elbow range of motion and clinical outcomes with low complication and reoperation rates. Systematic review of level IV studies. Copyright © 2017 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  7. The Prognostic Significance of Elevated Serum Ferritin Levels Prior to Transplantation in Patients With Lymphoma Who Underwent Autologous Hematopoietic Stem Cell Transplantation (autoHSCT): Role of Iron Overload.

    PubMed

    Sivgin, Serdar; Karamustafaoglu, Mehmet Fatih; Yildizhan, Esra; Zararsiz, Gokmen; Kaynar, Leylagul; Eser, Bulent; Cetin, Mustafa; Unal, Ali

    2016-08-01

    Hematopoietic stem cell transplantation is a common and preferred treatment of lymphomas in many centers. Our goal was to determine the association between pretransplant iron overload and survival in patients who underwent autologous hematopoietic stem cell transplantation (autoHSCT). A total of 165 patients with lymphoma, who underwent autoHSCT between the years of 2007 and 2014, were included in this study. Ferritin levels were used to determine iron status; the cut-off value was 500 ng/mL. The relationship between iron overload and survival was assessed by statistical analysis. The median ferritin level in the normal-ferritin (ferritin < 500) group was 118 ng/mL (range, 9-494 ng/mL) and in the high-ferritin group (ferritin ≥ 500), it was 908 ng/mL (range, 503-4549 ng/mL). A total of 64 (38.8%) patients died during follow-up. Of these patients that died, 52 (81.25%) were in the high-ferritin group, and 12 (18.75%) were in the normal-ferritin group (P ≤ .001). Twelve (14.1%) of 85 patients died in the normal-ferritin group, and 52 (65.0%) of 80 patients died in the high-ferritin group. The overall mortality was significantly higher in the high-ferritin group (P < .001). The median overall survival was 42 months (range, 25-56 months) in the normal-ferritin group and 20 months (range, 5-46 months) in the high-ferritin group. The difference between the groups was statistically significant (P < .001). The median disease-free survival was 39 months (range, 16-56 months) in the normal-ferritin group and 10 months (range, 3-29 months) in the high-ferritin group. The difference between the groups was statistically significant (P < .001). Elevated serum ferritin levels might predict poorer survival in autoHSCT recipients. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Topographically supported customized ablation for the management of decentered laser in situ keratomileusis.

    PubMed

    Kymionis, George D; Panagopoulou, Sophia I; Aslanides, Ioannis M; Plainis, Sotiris; Astyrakakis, Nikolaos; Pallikaris, Ioannis G

    2004-05-01

    To evaluate the efficacy, predictability, and safety of topographically supported customized ablations (TOSCAs) for decentered ablations following laser in situ keratomileusis (LASIK). Prospective nonrandomized clinical trial. Nine patients (11 eyes) with LASIK-induced decentered ablations underwent TOSCA following flap lifting. Topographically supported customized ablation was performed using a corneal topographer to obtain a customized ablation profile, combined with a flying spot laser. Mean follow-up was 9.22 +/- 2.82 months (range 6-12 months). No intra- or postoperative complications were observed. Manifest refraction (spherical equivalent) did not change significantly (pre-TOSCA: -0.14 +/- 1.58 diopters [range, -1.75 to +3.00 diopters] to +0.46 +/- 1.02 diopters [range, -1.00 to +1.75 diopters]; P =.76), whereas there was a statistically significant reduction in the refractive astigmatism (pre-TOSCA: -1.55 +/- 0.60 diopters [range, -3.00 to -0.75 diopters] to -0.70 +/- 0.56 diopters [range, -2.00 to -0.25 diopters]; P =.003). Mean uncorrected visual acuity improved significantly (P <.001) from 0.45 +/- 0.16 (range, 0.2-0.7) to 0.76 +/- 0.29 (range, 0.2-1.2) at last follow-up. Mean best-corrected visual acuity improved from 0.74 +/- 0.22 (range, 0.4-1.0) to 0.95 +/- 0.20 (range, 0.6-1.2; P =.002). Eccentricity showed a statistically significant reduction after TOSCA treatment (pre-TOSCA: 1.59 +/- 0.46 mm [range, 0.88-2.23 mm]; post-TOSCA: 0.29 +/- 0.09 mm [range, 0.18-0.44 mm]; P <.001). In our small sample, enhancement LASIK procedures with TOSCA appear to improve uncorrected and best-corrected visual acuity as well as eccentricity in patients with LASIK-induced decentered ablation.

  9. Establishing a learning foundation in a dynamically changing world: Insights from artificial language work

    NASA Astrophysics Data System (ADS)

    Gonzales, Kalim

    It is argued that infants build a foundation for learning about the world through their incidental acquisition of the spatial and temporal regularities surrounding them. A challenge is that learning occurs across multiple contexts whose statistics can differ greatly. Two artificial language studies with 12-month-olds demonstrate that infants come prepared to parse statistics across contexts using the temporal and perceptual features that distinguish one context from another. These results suggest that infants can organize their statistical input using a wider range of features than typically considered. Possible attention, decision-making, and memory mechanisms are discussed.

  10. Covariance approximation for fast and accurate computation of channelized Hotelling observer statistics

    NASA Astrophysics Data System (ADS)

    Bonetto, P.; Qi, Jinyi; Leahy, R. M.

    2000-08-01

    Describes a method for computing linear observer statistics for maximum a posteriori (MAP) reconstructions of PET images. The method is based on a theoretical approximation for the mean and covariance of MAP reconstructions. In particular, the authors derive here a closed form for the channelized Hotelling observer (CHO) statistic applied to 2D MAP images. The theoretical analysis models both the Poisson statistics of PET data and the inhomogeneity of tracer uptake. The authors show reasonably good correspondence between these theoretical results and Monte Carlo studies. The accuracy and low computational cost of the approximation allow the authors to analyze the observer performance over a wide range of operating conditions and parameter settings for the MAP reconstruction algorithm.
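
    A minimal sketch of a CHO computation in its standard linear form (channel outputs, pooled covariance, Hotelling template); the random channel matrix and synthetic lesion below are stand-ins, not the paper's MAP-reconstruction setup:

```python
import numpy as np

rng = np.random.default_rng(4)
n_pix, n_ch, n_img = 64 * 64, 6, 200
T = rng.normal(size=(n_ch, n_pix))          # stand-in channel matrix (e.g., Gabor-like)

signal = np.zeros(n_pix)
signal[2000:2050] = 0.5                     # hypothetical lesion profile

v0 = rng.normal(size=(n_img, n_pix)) @ T.T             # channel outputs, background
v1 = (rng.normal(size=(n_img, n_pix)) + signal) @ T.T  # channel outputs, signal present

S = 0.5 * (np.cov(v0, rowvar=False) + np.cov(v1, rowvar=False))  # pooled covariance
w = np.linalg.solve(S, v1.mean(axis=0) - v0.mean(axis=0))        # Hotelling template

t0, t1 = v0 @ w, v1 @ w                     # CHO test statistic per image
snr = (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t1.var() + t0.var()))
print(f"CHO detectability SNR: {snr:.2f}")
```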

  11. SU-F-J-197: A Novel Intra-Beam Range Detection and Adaptation Strategy for Particle Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, M; Jiang, S; Shao, Y

    2016-06-15

    Purpose: In-vivo range detection/verification is crucial in particle therapy for effective and safe delivery. The state-of-the-art techniques are not sufficient for in-vivo on-line range verification due to conflicts among patient dose, signal statistics, and imaging time. We propose a novel intra-beam range detection and adaptation strategy for particle therapy. Methods: This strategy uses the planned mid-range spots as probing beams without adding extra radiation to patients. Such a choice of probing beams ensures that the Bragg peaks remain inside the tumor even with significant range variation from the plan. It offers sufficient signal statistics for in-beam positron emission tomography (PET) due to the high positron activity of the therapeutic dose. The probing beam signal can be acquired and reconstructed using in-beam PET, which allows for delineation of the Bragg peaks and detection of range shift, with ease of detection enabled by single-layered spots. If the detected range shift is within a pre-defined tolerance, the remaining spots will be delivered as in the original plan. Otherwise, a fast re-optimization using range-shifted beamlets and accounting for the probing beam dose is applied to consider the tradeoffs posed by the online anatomy. Simulated planning and delivery studies were used to demonstrate the effectiveness of the proposed techniques. Results: Simulations with online range variations due to shifts of various foreign objects into the beam path showed successful delineation of the Bragg peaks as a result of delivering probing beams. Without on-line delivery adaptation, the dose distribution was significantly distorted. In contrast, delivery adaptation incorporating the detected range shift recovered the planned dose well. Conclusion: The proposed intra-beam range detection and adaptation utilizing the planned mid-range spots as probing beams, which illuminate the beam range with strong and accurate PET signals, is a safe, practical, yet effective approach to address range uncertainty issues in particle therapy.

  12. Evaluation of the performance of statistical tests used in making cleanup decisions at Superfund sites. Part 1: Choosing an appropriate statistical test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berman, D.W.; Allen, B.C.; Van Landingham, C.B.

    1998-12-31

    The decision rules commonly employed to determine the need for cleanup are evaluated both to identify conditions under which they lead to erroneous conclusions and to quantify the rate at which such errors occur. Their performance is also compared with that of other applicable decision rules. The authors based the evaluation of decision rules on simulations. Results are presented as power curves. These curves demonstrate that the degree of statistical control achieved is independent of the form of the null hypothesis. The loss of statistical control that occurs when a decision rule is applied to a data set that does not satisfy the rule's validity criteria is also clearly demonstrated. Some of the rules evaluated do not offer the formal statistical control that is an inherent design feature of other rules. Nevertheless, results indicate that such informal decision rules may provide superior overall control of error rates when their application is restricted to data exhibiting particular characteristics. The results reported here are limited to decision rules applied to uncensored and lognormally distributed data. To optimize decision rules, it is necessary to evaluate their behavior when applied to data exhibiting a range of characteristics that bracket those common to field data. The performance of decision rules applied to data sets exhibiting a broader range of characteristics is reported in the second paper of this study.

  13. Long-range correlation in cosmic microwave background radiation.

    PubMed

    Movahed, M Sadegh; Ghasemi, F; Rahvar, Sohrab; Tabar, M Reza Rahimi

    2011-08-01

    We investigate the statistical anisotropy and Gaussianity of temperature fluctuations of Cosmic Microwave Background (CMB) radiation data from the Wilkinson Microwave Anisotropy Probe survey, using the Multifractal Detrended Fluctuation Analysis, Rescaled Range, and Scaled Windowed Variance methods. Multifractal Detrended Fluctuation Analysis shows that the CMB fluctuations have a long-range correlation function with multifractal behavior. By comparing the shuffled and surrogate series of the CMB data, we conclude that the multifractal nature of the temperature fluctuations of the CMB radiation is mainly due to long-range correlations, and the map is consistent with a Gaussian distribution.
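
    Of the three methods named, classical rescaled-range (R/S) analysis is the simplest to sketch: the Hurst exponent is the slope of log(R/S) against log(window length), with H near 0.5 for uncorrelated noise and H above 0.5 signalling long-range correlation. The white-noise test series below is an assumption for illustration:

```python
import numpy as np

def rescaled_range(x, n):
    """Mean R/S over non-overlapping windows of length n."""
    rs = []
    for start in range(0, len(x) - n + 1, n):
        w = x[start:start + n]
        z = np.cumsum(w - w.mean())    # cumulative deviation from the window mean
        r = z.max() - z.min()          # range of the cumulative deviation
        s = w.std()
        if s > 0:
            rs.append(r / s)
    return np.mean(rs)

rng = np.random.default_rng(5)
x = rng.normal(size=2**14)             # white noise: expect H near 0.5
sizes = 2 ** np.arange(4, 11)
rs_vals = [rescaled_range(x, n) for n in sizes]
H = np.polyfit(np.log(sizes), np.log(rs_vals), 1)[0]
print(f"estimated Hurst exponent: {H:.2f}")
```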

  14. Detecting Genomic Clustering of Risk Variants from Sequence Data: Cases vs. Controls

    PubMed Central

    Schaid, Daniel J.; Sinnwell, Jason P.; McDonnell, Shannon K.; Thibodeau, Stephen N.

    2013-01-01

    As the ability to measure dense genetic markers approaches the limit of the DNA sequence itself, taking advantage of possible clustering of genetic variants in, and around, a gene would benefit genetic association analyses and likely provide biological insights. The greatest benefit might be realized when multiple rare variants cluster in a functional region. Several statistical tests have been developed, one of which is based on the popular Kulldorff scan statistic for spatial clustering of disease. We extended another popular spatial clustering method, Tango's statistic, to genomic sequence data. An advantage of Tango's method is that it is rapid to compute, and when a single test statistic is computed, its distribution is well approximated by a scaled chi-square distribution, making computation of p-values very rapid. We compared the Type-I error rates and power of several clustering statistics, as well as the omnibus sequence kernel association test (SKAT). Although our version of Tango's statistic, which we call the "Kernel Distance" statistic, took approximately half as long to compute as the Kulldorff scan statistic, it had slightly less power than the scan statistic. Our results showed that the Ionita-Laza version of Kulldorff's scan statistic had the greatest power over a range of clustering scenarios. PMID:23842950

  15. Atmospheric microwave refractivity and refraction

    NASA Technical Reports Server (NTRS)

    Yu, E.; Hodge, D. B.

    1980-01-01

    The atmospheric refractivity can be expressed as a function of temperature, pressure, water vapor content, and operating frequency. Based on twenty-year meteorological data, statistics of the atmospheric refractivity were obtained. These statistics were used to estimate the variation of dispersion, attenuation, and refraction effects on microwave and millimeter wave signals propagating along atmospheric paths. Bending angle, elevation angle error, and range error were also developed for an exponentially tapered, spherical atmosphere.
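
    A minimal sketch using the widely quoted Smith-Weintraub approximation for radio refractivity (non-dispersive, so the frequency dependence mentioned above is neglected); the sample inputs are arbitrary sea-level-like values:

```python
def refractivity(P_hpa, e_hpa, T_kelvin):
    """N-units: N = 77.6*P/T + 3.73e5*e/T^2, refractive index n = 1 + N*1e-6.

    P_hpa: total pressure (hPa); e_hpa: water-vapor partial pressure (hPa);
    T_kelvin: temperature (K)."""
    return 77.6 * P_hpa / T_kelvin + 3.73e5 * e_hpa / T_kelvin**2

N = refractivity(P_hpa=1013.0, e_hpa=10.0, T_kelvin=288.0)
print(f"N = {N:.1f} N-units, n = {1 + N * 1e-6:.6f}")
```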

  16. Growth of Jobs with Above Average Earnings Projected at All Education Levels. Issues in Labor Statistics. Summary 94-2.

    ERIC Educational Resources Information Center

    Bureau of Labor Statistics, Washington, DC.

    The Bureau of Labor Statistics projects national employment to grow by almost 26.4 million over the 1992-2005 period. The majority of these new jobs will be in higher-paying occupations. Entry requirements of the new jobs in occupations having above-average earnings will range from no more than a high school education to a bachelor's degree or…

  17. The Southampton-York Natural Scenes (SYNS) dataset: Statistics of surface attitude

    PubMed Central

    Adams, Wendy J.; Elder, James H.; Graf, Erich W.; Leyland, Julian; Lugtigheid, Arthur J.; Muryy, Alexander

    2016-01-01

    Recovering 3D scenes from 2D images is an under-constrained task; optimal estimation depends upon knowledge of the underlying scene statistics. Here we introduce the Southampton-York Natural Scenes dataset (SYNS: https://syns.soton.ac.uk), which provides comprehensive scene statistics useful for understanding biological vision and for improving machine vision systems. In order to capture the diversity of environments that humans encounter, scenes were surveyed at random locations within 25 indoor and outdoor categories. Each survey includes (i) spherical LiDAR range data, (ii) high-dynamic-range spherical imagery, and (iii) a panorama of stereo image pairs. We envisage many uses for the dataset and present one example: an analysis of surface attitude statistics, conditioned on scene category and viewing elevation. Surface normals were estimated using a novel adaptive scale selection algorithm. Across categories, surface attitude below the horizon is dominated by the ground plane (0° tilt). Near the horizon, probability density is elevated at 90°/270° tilt due to vertical surfaces (trees, walls). Above the horizon, probability density is elevated near 0° slant due to overhead structure such as ceilings and leaf canopies. These structural regularities represent potentially useful prior assumptions for human and machine observers, and may predict human biases in perceived surface attitude. PMID:27782103

  18. Multivariate analysis, mass balance techniques, and statistical tests as tools in igneous petrology: application to the Sierra de las Cruces volcanic range (Mexican Volcanic Belt).

    PubMed

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the commingled lavas (binary mixtures).

  19. Gene coexpression measures in large heterogeneous samples using count statistics.

    PubMed

    Wang, Y X Rachel; Waterman, Michael S; Huang, Haiyan

    2014-11-18

    With the advent of high-throughput technologies making large-scale gene expression data readily available, developing appropriate computational tools to process these data and distill insights into systems biology has been an important part of the "big data" challenge. Gene coexpression is one of the earliest techniques developed that is still widely in use for functional annotation, pathway analysis, and, most importantly, the reconstruction of gene regulatory networks, based on gene expression data. However, most coexpression measures do not specifically account for local features in expression profiles. For example, it is very likely that the patterns of gene association may change or only exist in a subset of the samples, especially when the samples are pooled from a range of experiments. We propose two new gene coexpression statistics based on counting local patterns of gene expression ranks to take into account the potentially diverse nature of gene interactions. In particular, one of our statistics is designed for time-course data with local dependence structures, such as time series coupled over a subregion of the time domain. We provide asymptotic analysis of their distributions and power, and evaluate their performance against a wide range of existing coexpression measures on simulated and real data. Our new statistics are fast to compute, robust against outliers, and show comparable and often better general performance.

  20. A comparative study of internal laser-assisted and conventional liposuction: a look at the influence of drugs and major surgery on laboratory postoperative values

    PubMed Central

    Przylipiak, Andrzej Feliks; Galicka, Elżbieta; Donejko, Magdalena; Niczyporuk, Marek; Przylipiak, Jerzy

    2013-01-01

    Background: Liposuction is a type of aesthetic surgery that has been performed on humans for decades. There is little literature addressing pre- and post-surgery blood parameters, although this information is of considerable interest. Documentation on patients who received laser-assisted liposuction treatment is particularly scarce. Until now, there has been no literature showing values of platelets, lymphocytes, and neutrophils after liposuction. Purpose: The aim of this work is to analyze and interpret values of platelets, lymphocytes, and neutrophils in patient blood before and after liposuction, a surgery in which an extraordinarily large amount of potent drugs is used, and to compare the changes in these values between conventional and laser-assisted liposuction patients. Material and methods: We evaluated standard blood samples from patients prior to and after liposuction. This paper covers the numbers of platelets, lymphocytes, and neutrophils. A total of 54 patients were examined. We also compared the change in postoperative values in laser-assisted liposuction patients with that in conventional liposuction patients. A paired two-sided Student's t-test was used for statistical evaluation. P < 0.005 was taken to be statistically significant. Results: Platelet values were raised in both conventional and laser-assisted liposuction patients, but the difference was statistically non-significant and platelet levels remained within the normal range for healthy patients. Neutrophil values rose by up to 79.49% ± 7.74% standard deviation (SD) and lymphocyte values dropped by up to 12.68% ± 5.61% SD. The before/after changes in conventional tumescent local anesthesia liposuction and in laser-assisted liposuction were similar for all measured parameters and showed no statistically significant differences between before and after surgery. The mean total operation time without laser assistance was 3 hours 42 minutes (±57 minutes SD, range 2 hours 50 minutes to 5 hours 10 minutes). Surgeries with laser assistance were on average 16 minutes shorter, with a mean duration of 3 hours 26 minutes (±45 minutes SD, range 2 hours 40 minutes to 4 hours 10 minutes); the difference was not statistically significant (P < 0.06). The mean aspirate volume for liposuctions performed without laser support was 2,618 mL (±633.7 SD, range 700 mL to 3,500 mL). Mean aspirate volume for liposuctions with laser assistance was 61 mL larger (2,677 mL ± 499.5 SD, range 1,800 mL to 3,500 mL); the difference was not statistically significant (P < 0.71). Conclusion: We conclude that conventional liposuction and laser-assisted liposuction have a similar influence on platelets, lymphocytes, and neutrophils. Moreover, laser-assisted liposuction appears to be less time consuming than conventional liposuction. PMID:24143076

  1. A comparative study of internal laser-assisted and conventional liposuction: a look at the influence of drugs and major surgery on laboratory postoperative values.

    PubMed

    Przylipiak, Andrzej Feliks; Galicka, Elżbieta; Donejko, Magdalena; Niczyporuk, Marek; Przylipiak, Jerzy

    2013-01-01

    Liposuction is a type of aesthetic surgery that has been performed on humans for decades. There is little literature addressing pre- and post-surgery blood parameters, although this information is of considerable interest. Documentation on patients who received laser-assisted liposuction treatment is particularly scarce. Until now, there has been no literature showing values of platelets, lymphocytes, and neutrophils after liposuction. The aim of this work is to analyze and interpret values of platelets, lymphocytes, and neutrophils in patient blood before and after liposuction, a surgery in which an extraordinarily large amount of potent drugs is used, and to compare the changes in these values between conventional and laser-assisted liposuction patients. We evaluated standard blood samples from patients prior to and after liposuction. This paper covers the numbers of platelets, lymphocytes, and neutrophils. A total of 54 patients were examined. We also compared the change in postoperative values in laser-assisted liposuction patients with that in conventional liposuction patients. A paired two-sided Student's t-test was used for statistical evaluation. P < 0.005 was taken to be statistically significant. Platelet values were raised in both conventional and laser-assisted liposuction patients, but the difference was statistically non-significant and platelet levels remained within the normal range for healthy patients. Neutrophil values rose by up to 79.49% ± 7.74% standard deviation (SD) and lymphocyte values dropped by up to 12.68% ± 5.61% SD. The before/after changes in conventional tumescent local anesthesia liposuction and in laser-assisted liposuction were similar for all measured parameters and showed no statistically significant differences between before and after surgery. The mean total operation time without laser assistance was 3 hours 42 minutes (± 57 minutes SD, range 2 hours 50 minutes to 5 hours 10 minutes). Surgeries with laser assistance were on average 16 minutes shorter, with a mean duration of 3 hours 26 minutes (± 45 minutes SD, range 2 hours 40 minutes to 4 hours 10 minutes); the difference was not statistically significant (P < 0.06). The mean aspirate volume for liposuctions performed without laser support was 2,618 mL (± 633.7 SD, range 700 mL to 3,500 mL). Mean aspirate volume for liposuctions with laser assistance was 61 mL larger (2,677 mL ± 499.5 SD, range 1,800 mL to 3,500 mL); the difference was not statistically significant (P < 0.71). We conclude that conventional liposuction and laser-assisted liposuction have a similar influence on platelets, lymphocytes, and neutrophils in patients. Moreover, laser-assisted liposuction appears to be less time consuming than conventional liposuction.

  2. Anchor enhanced capsulorraphy in bunionectomies using an L-shaped capsulotomy.

    PubMed

    Gould, John S; Ali, Sheriff; Fowler, Rachel; Fleisig, Glenn S

    2003-01-01

    The objective of this study was to investigate the potential benefit of a suture anchor-enhanced capsulorraphy in the early maintenance of correction in bunionectomies. We retrospectively compared, in successive series, the loss of correction of the hallux valgus (HV) and intermetatarsal (IM) angles in cases repaired with an L-shaped capsulorraphy enhanced with anchors to those without. Intraoperative and second-week postoperative simulated weightbearing anterior-posterior (AP) X-rays were used to evaluate results. By using only intraoperative and early postoperative X-rays, we should have effectively eliminated extraneous factors that might have influenced our results. A total of 106 cases were investigated, 65 of which were repaired using anchors, the remaining 41 without. In the anchor group, 38 underwent a proximal metatarsal concentric shelf osteotomy (CSO)/modified McBride procedure, while the remaining 27 had a distal Chevron correction. In the without-anchor group, 21 had a CSO/modified McBride procedure while 20 underwent the Chevron procedure. In the without-anchor group, the average HV and IM loss of correction was 4.60 degrees (range, -2 to 21 degrees) and 0.6 degrees (range, -1 to 9 degrees), respectively. In the anchor group, the corresponding loss was 2.8 degrees (range, -3 to 17 degrees) and 0.6 degrees (range, -2 to 14 degrees), respectively. These results, when statistically analyzed, demonstrated that while the IM angle change was not statistically significant, the HV angle change was, implying that the anchor plays a significant role in maintaining the surgical correction in both the distal Chevron and CSO/modified McBride bunionectomies.

  3. Fourier Descriptor Analysis and Unification of Voice Range Profile Contours: Method and Applications

    ERIC Educational Resources Information Center

    Pabon, Peter; Ternstrom, Sten; Lamarche, Anick

    2011-01-01

    Purpose: To describe a method for unified description, statistical modeling, and comparison of voice range profile (VRP) contours, even from diverse sources. Method: A morphologic modeling technique, which is based on Fourier descriptors (FDs), is applied to the VRP contour. The technique, which essentially involves resampling of the curve of the…

  4. The ideal Kolmogorov inertial range and constant

    NASA Technical Reports Server (NTRS)

    Zhou, YE

    1993-01-01

    The energy transfer statistics measured in numerically simulated flows are found to be nearly self-similar for wavenumbers in the inertial range. Using the measured self-similar form, an 'ideal' energy transfer function and the corresponding energy flux rate were deduced. From this flux rate, the Kolmogorov constant was calculated to be 1.5, in excellent agreement with experiments.
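
    For reference, the standard inertial-range form at issue, with the constant value reported above:

```latex
% Kolmogorov inertial-range energy spectrum: \varepsilon is the energy flux
% (dissipation) rate and C_K the Kolmogorov constant, found here to be about 1.5.
E(k) \;=\; C_K\, \varepsilon^{2/3}\, k^{-5/3} .
```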

  5. Characterization Methods for Small Estuarine Systems in the Mid-Atlantic Region of the United States

    EPA Science Inventory

    Various statistical methods were applied to spatially discrete data from 14 intensively sampled small estuarine systems in the mid-Atlantic U.S. The number of sites per system ranged from 6 to 37. The surface area of the systems ranged from 1.9 to 193.4 km2. Parameters examined ...

  6. Cyberspace Math Models

    DTIC Science & Technology

    2013-06-01

    or indicators are used as long range memory measurements. Hurst and Holder exponents are the most important and popular parameters. Traditionally...the relation between two important parameters, the Hurst exponent (measurement of global long range memory) and the Entropy (measurement of...empirical results and future study. II. BACKGROUND We recall briefly the mathematical and statistical definitions and properties of the Hurst exponents

  7. A Stochastic Model of Space-Time Variability of Mesoscale Rainfall: Statistics of Spatial Averages

    NASA Technical Reports Server (NTRS)

    Kundu, Prasun K.; Bell, Thomas L.

    2003-01-01

    A characteristic feature of rainfall statistics is that they depend on the space and time scales over which rain data are averaged. A previously developed spectral model of rain statistics that is designed to capture this property, predicts power law scaling behavior for the second moment statistics of area-averaged rain rate on the averaging length scale L as L → 0. In the present work a more efficient method of estimating the model parameters is presented, and used to fit the model to the statistics of area-averaged rain rate derived from gridded radar precipitation data from TOGA COARE. Statistical properties of the data and the model predictions are compared over a wide range of averaging scales. An extension of the spectral model scaling relations to describe the dependence of the average fraction of grid boxes within an area containing nonzero rain (the "rainy area fraction") on the grid scale L is also explored.
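
    The power-law scaling of second-moment statistics with averaging scale can be checked directly by block-averaging a gridded field. A hedged sketch in Python (the field is synthetic, not TOGA COARE radar data):

      import numpy as np

      rng = np.random.default_rng(0)
      field = rng.lognormal(mean=0.0, sigma=1.0, size=(256, 256))  # stand-in rain field

      scales, variances = [], []
      for L in (1, 2, 4, 8, 16, 32):
          n = 256 // L
          blocks = field[:n * L, :n * L].reshape(n, L, n, L).mean(axis=(1, 3))
          scales.append(L)
          variances.append(blocks.var())             # second moment at averaging scale L

      slope = np.polyfit(np.log(scales), np.log(variances), 1)[0]
      print(f"estimated scaling exponent: {slope:.2f}")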

  8. A systematic review of the quality of statistical methods employed for analysing quality of life data in cancer randomised controlled trials.

    PubMed

    Hamel, Jean-Francois; Saulnier, Patrick; Pe, Madeline; Zikos, Efstathios; Musoro, Jammbe; Coens, Corneel; Bottomley, Andrew

    2017-09-01

    Over the last decades, Health-related Quality of Life (HRQoL) end-points have become an important outcome of randomised controlled trials (RCTs). HRQoL methodology in RCTs has improved following international consensus recommendations. However, no international recommendations exist concerning the statistical analysis of such data. The aim of our study was to identify and characterise the quality of the statistical methods commonly used for analysing HRQoL data in cancer RCTs. Building on our recently published systematic review, we analysed a total of 33 published RCTs, examining the HRQoL methods reported in RCTs since 1991. We focussed on the ability of the methods to deal with the three major problems commonly encountered when analysing HRQoL data: their multidimensional and longitudinal structure and the commonly high rate of missing data. All studies reported HRQoL being assessed repeatedly over time for a period ranging from 2 to 36 months. Missing data were common, with compliance rates ranging from 45% to 90%. From the 33 studies considered, 12 different statistical methods were identified. Twenty-nine studies analysed each of the questionnaire sub-dimensions without type I error adjustment. Thirteen studies repeated the HRQoL analysis at each assessment time, again without type I error adjustment. Only 8 studies used methods suitable for repeated measurements. Our findings show a lack of consistency in statistical methods for analysing HRQoL data. Problems related to multiple comparisons were rarely considered, leading to a high risk of false positive results. It is therefore critical that international recommendations for improving such statistical practices are developed.

  9. Statistical scaling of geometric characteristics in stochastically generated pore microstructures

    DOE PAGES

    Hyman, Jeffrey D.; Guadagnini, Alberto; Winter, C. Larrabee

    2015-05-21

    In this study, we analyze the statistical scaling of structural attributes of virtual porous microstructures that are stochastically generated by thresholding Gaussian random fields. Characterization of the extent to which randomly generated pore spaces can be considered representative of a particular rock sample depends on the metrics employed to compare the virtual sample against its physical counterpart. Typically, comparisons against features and/or patterns of geometric observables, e.g., porosity and specific surface area, flow-related macroscopic parameters, e.g., permeability, or autocorrelation functions are used to assess the representativeness of a virtual sample, and thereby the quality of the generation method. Here, we rely on manifestations of statistical scaling of geometric observables which were recently observed in real millimeter-scale rock samples [13] as additional relevant metrics by which to characterize a virtual sample. We explore the statistical scaling of two geometric observables, namely porosity (Φ) and specific surface area (SSA), of porous microstructures generated using the method of Smolarkiewicz and Winter [42] and Hyman and Winter [22]. Our results suggest that the method can produce virtual pore space samples displaying the symptoms of statistical scaling observed in real rock samples. Order-q sample structure functions (statistical moments of absolute increments) of Φ and SSA scale as a power of the separation distance (lag) over a range of lags, and extended self-similarity (linear relationship between log structure functions of successive orders) appears to be an intrinsic property of the generated media. The width of the range of lags where power-law scaling is observed and the Hurst coefficient associated with the variables we consider can be controlled by the generation parameters of the method.
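
    As a rough illustration of the diagnostics named above, order-q sample structure functions and an extended self-similarity check can be computed from a one-dimensional profile in a few lines of Python (the profile here is a synthetic random walk, not porosity data from the paper):

      import numpy as np

      rng = np.random.default_rng(1)
      phi = np.cumsum(rng.standard_normal(4096))     # synthetic 1-D profile

      lags = np.array([1, 2, 4, 8, 16, 32, 64])
      S = {q: np.array([np.mean(np.abs(phi[l:] - phi[:-l]) ** q) for l in lags])
           for q in (1, 2, 3)}

      for q in (1, 2, 3):                            # power-law scaling exponents
          zeta_q = np.polyfit(np.log(lags), np.log(S[q]), 1)[0]
          print(f"q={q}: scaling exponent {zeta_q:.2f}")

      # Extended self-similarity: log S_3 vs log S_2 should be close to linear.
      ess_slope = np.polyfit(np.log(S[2]), np.log(S[3]), 1)[0]
      print(f"ESS slope (order 3 vs 2): {ess_slope:.2f}")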

  10. Low statistical power in biomedical science: a review of three human research domains.

    PubMed

    Dumas-Mallet, Estelle; Button, Katherine S; Boraud, Thomas; Gonon, Francois; Munafò, Marcus R

    2017-02-01

    Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0-10% or 11-20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation.
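
    For context, the kind of power calculation underlying such reviews can be sketched with a normal approximation for a two-sample comparison (illustrative only; the alpha level and effect size d below are assumptions, not values from the review):

      from scipy.stats import norm

      def approx_power(d, n_per_group, alpha=0.05):
          """Two-sided, two-sample power under a normal approximation (unit SD)."""
          se = (2.0 / n_per_group) ** 0.5        # SE of the difference in means
          z_crit = norm.ppf(1 - alpha / 2)
          shift = abs(d) / se
          return norm.sf(z_crit - shift) + norm.cdf(-z_crit - shift)

      # A small study of a modest effect lands in the low-power range reported above.
      print(f"power = {approx_power(d=0.3, n_per_group=30):.2f}")   # roughly 0.2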

  11. Low statistical power in biomedical science: a review of three human research domains

    PubMed Central

    Dumas-Mallet, Estelle; Button, Katherine S.; Boraud, Thomas; Gonon, Francois

    2017-01-01

    Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0–10% or 11–20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation. PMID:28386409

  12. Combining super-ensembles and statistical emulation to improve a regional climate and vegetation model

    NASA Astrophysics Data System (ADS)

    Hawkins, L. R.; Rupp, D. E.; Li, S.; Sarah, S.; McNeall, D. J.; Mote, P.; Betts, R. A.; Wallom, D.

    2017-12-01

    Changing regional patterns of surface temperature, precipitation, and humidity may cause ecosystem-scale changes in vegetation, altering the distribution of trees, shrubs, and grasses. A changing vegetation distribution, in turn, alters the albedo, latent heat flux, and carbon exchanged with the atmosphere, with resulting feedbacks onto the regional climate. However, a wide range of earth-system processes that affect the carbon, energy, and hydrologic cycles occur at subgrid scales in climate models and must be parameterized. The appropriate parameter values in such parameterizations are often poorly constrained, leading to uncertainty in predictions of how the ecosystem will respond to changes in forcing. To better understand the sensitivity of regional climate to parameter selection and to improve regional climate and vegetation simulations, we used a large perturbed-physics ensemble and a suite of statistical emulators. We dynamically downscaled a super-ensemble (multiple parameter sets and multiple initial conditions) of global climate simulations using the 25-km-resolution regional climate model HadRM3p with the land-surface scheme MOSES2 and dynamic vegetation module TRIFFID. We simultaneously perturbed land surface parameters relating to the exchange of carbon, water, and energy between the land surface and atmosphere in a large super-ensemble of regional climate simulations over the western US. Statistical emulation was used as a computationally cost-effective tool to explore uncertainties in interactions. Regions of parameter space that did not satisfy observational constraints were eliminated, and an ensemble of parameter sets that reduce regional biases and span a range of plausible interactions among earth system processes was selected. This study demonstrated that by combining super-ensemble simulations with statistical emulation, simulations of regional climate could be improved while simultaneously accounting for a range of plausible land-atmosphere feedback strengths.
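
    A minimal sketch of the emulation step described above, assuming a Gaussian-process surrogate (scikit-learn) fitted to a handful of ensemble runs and then queried densely across parameter space; the parameters, response, and constraint are all synthetic stand-ins:

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(7)
      params = rng.uniform(0, 1, size=(30, 2))       # 30 ensemble members, 2 land-surface parameters
      response = np.sin(3 * params[:, 0]) + params[:, 1] ** 2  # stand-in model output (e.g., regional bias)

      emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.3)).fit(params, response)
      query = rng.uniform(0, 1, size=(1000, 2))      # dense sweep, far cheaper than new model runs
      mean, sd = emulator.predict(query, return_std=True)
      plausible = query[np.abs(mean - 0.8) < 0.2]    # toy observational constraint
      print(f"{len(plausible)} of {len(query)} parameter sets satisfy the constraint")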

  13. A perceptual space of local image statistics.

    PubMed

    Victor, Jonathan D; Thengone, Daniel J; Rizvi, Syed M; Conte, Mary M

    2015-12-01

    Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice - a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space, and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules.

  14. A perceptual space of local image statistics

    PubMed Central

    Victor, Jonathan D.; Thengone, Daniel J.; Rizvi, Syed M.; Conte, Mary M.

    2015-01-01

    Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice – a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space, and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. PMID:26130606

  15. The international forum of ophthalmic simulation: developing a virtual reality training curriculum for ophthalmology.

    PubMed

    Saleh, George M; Lamparter, Julia; Sullivan, Paul M; O'Sullivan, Fiona; Hussain, Badrul; Athanasiadis, Ioannis; Litwin, Andre S; Gillan, Stewart N

    2013-06-01

    To investigate the effect of a structured, supervised, cataract simulation programme on ophthalmic surgeons in their first year of training, and to evaluate the level of skill transfer. Trainees with minimal intraocular and simulator experience in their first year of ophthalmology undertook a structured, sequential, customised, virtual reality (VR) cataract training programme developed through the International Forum of Ophthalmic Simulation. A set of one-handed, bimanual, static and dynamic tasks was evaluated before and after the course, and scores were obtained. Statistical significance was evaluated with the Wilcoxon signed-rank test. The median precourse score of 101.50/400 (IQR 58.75-145.75) was significantly improved after completing the training programme (postcourse score: 302/400, range: 266.25-343; p<0.001). While improvement was evident and statistically significant in all parameters, the greatest improvements were found for capsulorhexis and antitremor training (capsulorhexis: precourse score=0/100, range 0-4.5; postcourse score=81/100, range 13-87.75; p=0.002; antitremor training: precourse score=0/100, range 0-0; postcourse score=80/100, range 60.25-91.50; p=0.001). Structured and supervised VR training can offer a significant level of skills transfer to novice ophthalmic surgeons. VR training at the earliest stage of ophthalmic surgical training may, therefore, be of benefit.

  16. Ladar imaging detection of salient map based on PWVD and Rényi entropy

    NASA Astrophysics Data System (ADS)

    Xu, Yuannan; Zhao, Yuan; Deng, Rong; Dong, Yanbing

    2013-10-01

    Spatial-frequency information of a given image can be extracted by associating the grey-level spatial data with one of the well-known spatial/spatial-frequency distributions. The Wigner-Ville distribution (WVD) has the useful property that images can be represented in joint spatial/spatial-frequency domains. For intensity and range images from ladar, the statistical properties of the Rényi entropy are studied through the pseudo Wigner-Ville distribution (PWVD) using one- or two-dimensional windows. We also analyze how the statistical properties of the Rényi entropy change in ladar intensity and range images when man-made objects appear. On this foundation, a novel method for generating a saliency map based on the PWVD and Rényi entropy is proposed. Target detection is then completed by segmenting the saliency map with a simple and convenient threshold method. For ladar intensity and range images, experimental results show the proposed method can effectively detect military vehicles against complex terrestrial backgrounds with a low false alarm rate.
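
    The Rényi entropy at the heart of the method is straightforward to compute from a local grey-level histogram. A hedged sketch (the image patch is random noise, and the order alpha=2 is an illustrative choice, not necessarily the paper's):

      import numpy as np

      def renyi_entropy(image, alpha=2.0, bins=256):
          hist, _ = np.histogram(image.ravel(), bins=bins)
          p = hist / hist.sum()
          p = p[p > 0]
          if alpha == 1.0:                               # limiting case: Shannon entropy
              return -np.sum(p * np.log2(p))
          return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

      patch = np.random.default_rng(2).integers(0, 256, size=(64, 64))  # toy image patch
      print(f"Renyi entropy (alpha=2): {renyi_entropy(patch):.2f} bits")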

  17. Statistical testing of the full-range leadership theory in nursing.

    PubMed

    Kanste, Outi; Kääriäinen, Maria; Kyngäs, Helvi

    2009-12-01

    The aim of this study is to test statistically the structure of the full-range leadership theory in nursing. The data were gathered by postal questionnaires from nurses and nurse leaders working in healthcare organizations in Finland. A follow-up study was performed 1 year later. The sample consisted of 601 nurses and nurse leaders, and the follow-up study had 78 respondents. The theory was tested through structural equation modelling, standard regression analysis, and two-way ANOVA. Rewarding transformational leadership seems to promote, and passive laissez-faire leadership to reduce, willingness to exert extra effort, perceptions of leader effectiveness, and satisfaction with the leader. Active management-by-exception seems to reduce willingness to exert extra effort and perception of leader effectiveness. Rewarding transformational leadership remained a strong explanatory factor of all outcome variables measured 1 year later. The data supported the main structure of the full-range leadership theory, lending support to the universal nature of the theory.

  18. The reliability of endoscopic examination in assessment of arytenoid cartilage movement in horses. Part I: Subjective and objective laryngeal evaluation.

    PubMed

    Hackett, R P; Ducharme, N G; Fubini, S L; Erb, H N

    1991-01-01

    Videorecordings of the laryngeal activity of 108 unsedated horses were obtained at rest by passing a flexible videoendoscope into the nasopharynx through the right ventral meatus. All videotaped images were reviewed once, and 72 were reviewed twice, by three veterinarians. Laryngeal cartilage movement was assessed subjectively with a five-tier grading system. The mean intraobserver agreement was 83.3% (range, 75.0%-90.2%) with a kappa statistic of .65 to .98. The mean interobserver agreement was 79.0% (range, 70.4%-80.6%) with a kappa statistic of .51 to .90. A computer program was developed to measure the left:right ratio of the rima glottidis. The mean left:right ratio for horses assigned a median laryngeal grade of I was 0.84 (range, 0.55-1.03); for grade II, 0.82 (0.50-1.12); for grade III, 0.59 (0.39-0.91); and for grade IV, 0.24 (0.07-0.35).
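
    The kappa statistic reported above measures chance-corrected agreement between graders. A minimal sketch using scikit-learn's implementation (the two rating vectors are hypothetical five-tier grades, not the study's data):

      from sklearn.metrics import cohen_kappa_score

      rater_a = [1, 2, 2, 3, 1, 4, 2, 1, 3, 2]   # hypothetical five-tier grades, observer A
      rater_b = [1, 2, 3, 3, 1, 4, 2, 2, 3, 2]   # hypothetical grades, observer B
      print(f"kappa = {cohen_kappa_score(rater_a, rater_b):.2f}")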

  19. Multiple alignment-free sequence comparison

    PubMed Central

    Ren, Jie; Song, Kai; Sun, Fengzhu; Deng, Minghua; Reinert, Gesine

    2013-01-01

    Motivation: Recently, a range of new statistics has become available for the alignment-free comparison of two sequences based on k-tuple word content. Here, we extend these statistics to the simultaneous comparison of more than two sequences. Our suite of statistics contains, first, extensions of pairwise statistics to the joint k-tuple content of all the sequences and, second, averages of sums of pairwise comparison statistics. The two tasks we consider are, first, to identify sequences that are similar to a set of target sequences and, second, to measure the similarity within a set of sequences. Results: Our investigation uses both simulated data as well as cis-regulatory module data, where the task is to identify cis-regulatory modules with similar transcription factor binding sites. We find that although for real data all of our statistics show a similar performance, on simulated data the Shepp-type statistics are in some instances outperformed by star-type statistics. The multiple alignment-free statistics are more sensitive to contamination in the data than the pairwise average statistics. Availability: Our implementation of the five statistics is available as an R package named ‘multiAlignFree’ at http://www-rcf.usc.edu/∼fsun/Programs/multiAlignFree/multiAlignFreemain.html. Contact: reinert@stats.ox.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23990418
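
    The k-tuple word counts underlying these statistics are easy to sketch. The snippet below computes k-mer count vectors and the classic D2 statistic (the inner product of counts), one of the pairwise building blocks such suites extend; it is an illustration, not the paper's R implementation:

      from collections import Counter

      def kmer_counts(seq, k):
          # Count every overlapping k-tuple word in the sequence.
          return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

      def d2(seq_x, seq_y, k=3):
          cx, cy = kmer_counts(seq_x, k), kmer_counts(seq_y, k)
          return sum(cx[w] * cy[w] for w in cx.keys() & cy.keys())

      print(d2("ACGTACGTGACG", "ACGTTTACGGAC", k=3))   # toy sequences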

  20. Statistical competencies for medical research learners: What is fundamental?

    PubMed

    Enders, Felicity T; Lindsell, Christopher J; Welty, Leah J; Benn, Emma K T; Perkins, Susan M; Mayo, Matthew S; Rahbar, Mohammad H; Kidwell, Kelley M; Thurston, Sally W; Spratt, Heidi; Grambow, Steven C; Larson, Joseph; Carter, Rickey E; Pollock, Brad H; Oster, Robert A

    2017-06-01

    It is increasingly essential for medical researchers to be literate in statistics, but the requisite degree of literacy is not the same for every statistical competency in translational research. Statistical competency can range from 'fundamental' (necessary for all) to 'specialized' (necessary for only some). In this study, we determine the degree to which each competency is fundamental or specialized. We surveyed members of 4 professional organizations, targeting doctorally trained biostatisticians and epidemiologists who taught statistics to medical research learners in the past 5 years. Respondents rated 24 educational competencies on a 5-point Likert scale anchored by 'fundamental' and 'specialized.' There were 112 responses. Nineteen of 24 competencies were fundamental. The competencies considered most fundamental were assessing sources of bias and variation (95%), recognizing one's own limits with regard to statistics (93%), and identifying the strengths and limitations of study designs (93%). The least endorsed items were meta-analysis (34%) and stopping rules (18%). We have identified the statistical competencies needed by all medical researchers. These competencies should be considered when designing statistical curricula for medical researchers and should inform which topics are taught in graduate programs and evidence-based medicine courses where learners need to read and understand the medical research literature.

  1. Appraising the self-assessed support needs of Turkish women with breast cancer.

    PubMed

    Erci, B; Karabulut, N

    2007-03-01

    The purposes of this study were to establish the range of needs of women with breast cancer and to examine how women's needs might form clusters that could provide the basis for developing a standardized scale of needs for use by local breast care nurses in the evaluation of care. The sample consisted of 143 women with breast cancer who were admitted to the outpatient and inpatient oncology clinics in a university hospital in Erzurum, Turkey. The data were collected by questionnaire, and included demographic characteristics and the self-assessed support needs of women with breast cancer. Statistical analyses showed that the standardized scale of needs has statistically acceptable levels of reliability and validity. The women's support needs mostly clustered in Family and Friends (79%) and After Care (78.3%). The most frequently required support category was Family and Friends; however, the women needed support in all categories. Across age groups, there were statistically significant differences for two of the seven categories: Femininity and Body Image, and Family and Friends. Women experienced a high level of need associated with a diagnosis of breast cancer. The results of this study should increase awareness among cancer care professionals of the range of psychosocial needs and may help them target particular patient groups for particular support interventions.

  2. Statistical and linguistic features of DNA sequences

    NASA Technical Reports Server (NTRS)

    Havlin, S.; Buldyrev, S. V.; Goldberger, A. L.; Mantegna, R. N.; Peng, C. K.; Simons, M.; Stanley, H. E.

    1995-01-01

    We present evidence supporting the idea that the DNA sequence in genes containing noncoding regions is correlated, and that the correlation is remarkably long range - indeed, bases thousands of base pairs apart are correlated. We do not find such a long-range correlation in the coding regions of the gene. We resolve the problem of the "non-stationary" feature of the sequence of base pairs by applying a new algorithm called Detrended Fluctuation Analysis (DFA). We address the claim of Voss that there is no difference in the statistical properties of coding and noncoding regions of DNA by systematically applying the DFA algorithm, as well as standard FFT analysis, to all eukaryotic DNA sequences (33 301 coding and 29 453 noncoding) in the entire GenBank database. We describe a simple model to account for the presence of long-range power-law correlations, based upon a generalization of the classic Lévy walk. Finally, we describe briefly some recent work showing that the noncoding sequences have certain statistical features in common with natural languages. Specifically, we adapt to DNA the Zipf approach to analyzing linguistic texts, and the Shannon approach to quantifying the "redundancy" of a linguistic text in terms of a measurable entropy function. We suggest that noncoding regions in plants and invertebrates may display a smaller entropy and larger redundancy than coding regions, further supporting the possibility that noncoding regions of DNA may carry biological information.
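
    A minimal sketch of the DFA algorithm mentioned above, applied to a symbolic sequence mapped to a ±1 numeric walk (the random sequence is a stand-in for GenBank data; for an uncorrelated sequence the DFA exponent should come out near 0.5):

      import numpy as np

      def dfa_fluctuation(x, window):
          y = np.cumsum(x - np.mean(x))                    # integrated profile
          n_seg = len(y) // window
          f2 = []
          for i in range(n_seg):
              seg = y[i * window:(i + 1) * window]
              t = np.arange(window)
              trend = np.polyval(np.polyfit(t, seg, 1), t) # local linear detrending
              f2.append(np.mean((seg - trend) ** 2))
          return np.sqrt(np.mean(f2))

      rng = np.random.default_rng(3)
      walk = rng.choice([-1.0, 1.0], size=8192)            # purine/pyrimidine stand-in
      windows = [8, 16, 32, 64, 128, 256]
      F = [dfa_fluctuation(walk, w) for w in windows]
      alpha = np.polyfit(np.log(windows), np.log(F), 1)[0]
      print(f"DFA exponent alpha = {alpha:.2f}  (~0.5 for an uncorrelated sequence)")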

  3. High resolution probabilistic precipitation forecast over Spain combining the statistical downscaling tool PROMETEO and the AEMET short range EPS system (AEMET/SREPS)

    NASA Astrophysics Data System (ADS)

    Cofino, A. S.; Santos, C.; Garcia-Moya, J. A.; Gutierrez, J. M.; Orfila, B.

    2009-04-01

    The Short-Range Ensemble Prediction System (SREPS) is a multi-LAM (UM, HIRLAM, MM5, LM and HRM), multi-analysis/boundary-condition (ECMWF, UKMetOffice, DWD and GFS) system run twice a day by AEMET (72 hours lead time) over a European domain, with a total of 5 (LAMs) x 4 (GCMs) = 20 members. One of the main goals of this project is to analyze the impact of models and boundary conditions on the short-range high-resolution forecasted precipitation. A previous validation of this method was carried out on a set of climate networks in Spain, France and Germany, by interpolating the prediction to the gauge locations (SREPS, 2008). In this work we compare those results with results obtained by using a statistical downscaling method to post-process the global predictions, obtaining an "advanced interpolation" for the local precipitation from climate-network precipitation observations. In particular, we apply the PROMETEO downscaling system based on analogs and compare the SREPS ensemble of 20 members with the PROMETEO statistical ensemble of 5 (analog ensemble) x 4 (GCMs) = 20 members. Moreover, we also compare the performance of a combined approach post-processing the SREPS outputs using the PROMETEO system. References: SREPS 2008. 2008 EWGLAM-SRNWP Meeting (http://www.aemet.es/documentos/va/divulgacion/conferencias/prediccion/Ewglam/PRED_CSantos.pdf)

  4. Assessment of the midflexion rotational laxity in posterior-stabilized total knee arthroplasty.

    PubMed

    Hino, Kazunori; Kutsuna, Tatsuhiko; Oonishi, Yoshio; Watamori, Kunihiko; Kiyomatsu, Hiroshi; Iseki, Yasutake; Watanabe, Seiji; Ishimaru, Yasumitsu; Miura, Hiromasa

    2017-11-01

    To evaluate changes in midflexion rotational laxity before and after posterior-stabilized (PS) total knee arthroplasty (TKA). Twenty-nine knees that underwent PS-TKA were evaluated. Manual mild passive rotational stress was applied to the knees, and the internal-external rotational angle was measured automatically by a navigation system at 30°, 45°, 60°, and 90° of knee flexion. The post-operative internal rotational laxity was statistically significantly increased compared with the preoperative level at 30°, 45°, 60°, and 90° of flexion. The post-operative external rotational laxity was statistically significantly decreased compared with the preoperative level at 45° and 60° of flexion. The post-operative internal-external rotational laxity was statistically significantly increased compared with the preoperative level only at 30° of flexion. The preoperative and post-operative rotational laxity showed a significant correlation at 30°, 45°, 60°, and 90° of flexion. Internal-external rotational laxity increases in the initial flexion range due to resection of both the anterior and posterior cruciate ligaments and retention of the collateral ligaments in PS-TKA. Preoperative and post-operative rotational laxity showed a significant correlation in the midflexion range. This study showed that a large preoperative rotational laxity increased the risk of a large post-operative laxity, especially in the initial flexion range in PS-TKA. Level of evidence: III.

  5. Effect of ultrasound frequency on the Nakagami statistics of human liver tissues.

    PubMed

    Tsui, Po-Hsiang; Zhou, Zhuhuang; Lin, Ying-Hsiu; Hung, Chieh-Ming; Chung, Shih-Jou; Wan, Yung-Liang

    2017-01-01

    The analysis of the backscattered statistics using the Nakagami parameter is an emerging ultrasound technique for assessing hepatic steatosis and fibrosis. Previous studies indicated that the echo amplitude distribution of a normal liver follows the Rayleigh distribution (the Nakagami parameter m is close to 1). However, using different frequencies may change the backscattered statistics of normal livers. This study explored the frequency dependence of the backscattered statistics in human livers and then discussed the sources of ultrasound scattering in the liver. A total of 30 healthy participants were enrolled to undergo a standard care ultrasound examination of the liver, which is a natural model containing diffuse and coherent scatterers. The liver of each volunteer was scanned from the right intercostal view to obtain raw image data at different central frequencies ranging from 2 to 3.5 MHz. Phantoms with diffuse scatterers only were also made and scanned with the same protocol for comparison with the clinical data. The Nakagami parameter-frequency correlation was evaluated using Pearson correlation analysis. The median and interquartile range of the Nakagami parameter obtained from livers was 1.00 (0.98-1.05) for 2 MHz, 0.93 (0.89-0.98) for 2.3 MHz, 0.87 (0.84-0.92) for 2.5 MHz, 0.82 (0.77-0.88) for 3.3 MHz, and 0.81 (0.76-0.88) for 3.5 MHz. The Nakagami parameter decreased with increasing central frequency (r = -0.67, p < 0.0001). However, this effect of ultrasound frequency on the statistical distribution of the backscattered envelopes was not found in the phantom results (r = -0.147, p = 0.0727). The current results demonstrated that the backscattered statistics of normal livers are frequency-dependent. Moreover, coherent scatterers may be the primary factor dominating the frequency dependence of the backscattered statistics in a liver.
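
    The Nakagami parameter m can be estimated from envelope samples with the standard moment-based (inverse normalized variance) estimator. A hedged sketch with simulated Rayleigh data, which should return m close to 1, consistent with the value reported for normal livers at 2 MHz:

      import numpy as np

      def nakagami_m(envelope):
          r2 = np.asarray(envelope, dtype=float) ** 2
          return np.mean(r2) ** 2 / np.var(r2)       # inverse normalized variance of intensity

      rng = np.random.default_rng(4)
      env = rng.rayleigh(scale=1.0, size=100_000)    # diffuse-scatterer (Rayleigh) model
      print(f"estimated m = {nakagami_m(env):.3f}")  # ~1.0, as for a Rayleigh envelope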

  6. Regional regression equations for estimation of natural streamflow statistics in Colorado

    USGS Publications Warehouse

    Capesius, Joseph P.; Stephens, Verlin C.

    2009-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Colorado Water Conservation Board and the Colorado Department of Transportation, developed regional regression equations for estimation of various streamflow statistics that are representative of natural streamflow conditions at ungaged sites in Colorado. The equations define the statistical relations between streamflow statistics (response variables) and basin and climatic characteristics (predictor variables). The equations were developed using generalized least-squares and weighted least-squares multilinear regression reliant on logarithmic variable transformation. Streamflow statistics were derived from at least 10 years of streamflow data through about 2007 from selected USGS streamflow-gaging stations in the study area that are representative of natural-flow conditions. Basin and climatic characteristics used for equation development are drainage area, mean watershed elevation, mean watershed slope, percentage of drainage area above 7,500 feet of elevation, mean annual precipitation, and 6-hour, 100-year precipitation. For each of five hydrologic regions in Colorado, peak-streamflow equations that are based on peak-streamflow data from selected stations are presented for the 2-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-year instantaneous-peak streamflows. For four of the five hydrologic regions, equations based on daily-mean streamflow data from selected stations are presented for 7-day minimum 2-, 10-, and 50-year streamflows and for 7-day maximum 2-, 10-, and 50-year streamflows. Other equations presented for the same four hydrologic regions include those for estimation of annual- and monthly-mean streamflow and streamflow-duration statistics for exceedances of 10, 25, 50, 75, and 90 percent. All equations are reported along with salient diagnostic statistics and the ranges of basin and climatic characteristics on which each equation is based, together with commentary on potential bias, identified from interpretation of residual plots, that is not otherwise removed by log-transformation of the variables. The predictor-variable ranges can be used to assess equation applicability for ungaged sites in Colorado.

  7. Statistical detection of patterns in unidimensional distributions by continuous wavelet transforms

    NASA Astrophysics Data System (ADS)

    Baluev, R. V.

    2018-04-01

    Objective detection of specific patterns in statistical distributions, like groupings, gaps, or abrupt transitions between different subsets, is a task with a rich range of applications in astronomy: Milky Way stellar population analysis, investigations of exoplanet diversity, Solar System minor-body statistics, extragalactic studies, etc. We adapt the powerful technique of wavelet transforms to this generalized task, placing strong emphasis on assessing the significance of detected patterns. Among other things, our method also involves optimal minimum-noise wavelets and minimum-noise reconstruction of the distribution density function. Based on this development, we construct a self-contained algorithmic pipeline for processing statistical samples. It is currently applicable to one-dimensional distributions only, but it is flexible enough to undergo further generalization and development.
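
    As a toy illustration of the approach (without the paper's significance machinery or minimum-noise wavelets), a Ricker wavelet can be scanned across a binned sample density to flag a gap between two modes; all data and scales below are made up:

      import numpy as np

      def ricker(t, s):
          u = t / s
          return (1.0 - u ** 2) * np.exp(-u ** 2 / 2.0)   # "Mexican hat" wavelet

      rng = np.random.default_rng(5)
      sample = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(2, 0.5, 500)])
      hist, edges = np.histogram(sample, bins=200, range=(-5, 5), density=True)
      centers = 0.5 * (edges[:-1] + edges[1:])
      dx = centers[1] - centers[0]

      for s in (0.25, 0.5, 1.0):                          # wavelet scales in data units
          kernel = ricker(np.arange(-4 * s, 4 * s, dx), s)
          response = np.convolve(hist, kernel, mode='same')
          print(f"scale {s}: strongest gap response near x = {centers[np.argmin(response)]:.2f}")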

  8. Psychophysical Map Stability in Bilateral Sequential Cochlear Implantation: Comparing Current Audiology Methods to a New Statistical Definition.

    PubMed

    Domville-Lewis, Chloe; Santa Maria, Peter L; Upson, Gemma; Chester-Browne, Ronel; Atlas, Marcus D

    2015-01-01

    The purpose of this study was to establish a statistical definition for stability in cochlear implant maps. Once defined, this study aimed to compare the duration taken to achieve a stable map in first and second implants in patients who underwent sequential bilateral cochlear implantation. This article also sought to evaluate a number of factors that potentially affect map stability. A retrospective cohort study was performed of 33 patients with sensorineural hearing loss who received sequential bilateral cochlear implantation (Cochlear, Sydney, Australia), all performed by the senior author. Psychophysical parameters of hearing threshold scores, comfort scores, and the dynamic range were measured for the apical, medial, and basal portions of the cochlear implant electrode at a range of intervals postimplantation. Stability was defined statistically as a less than 10% difference in threshold, comfort, and dynamic range scores over three consecutive mapping sessions. A senior cochlear implant audiologist, blinded to implant order and the statistical results, separately analyzed these psychophysical map parameters using current assessment methods. First and second implants were compared for duration to achieve stability; age, gender, duration of deafness, etiology of deafness, time between the insertion of the first and second implants, and the presence or absence of preoperative hearing aids were evaluated for their relationship to stability. Statistical analysis included two-tailed Student's t-tests and least squares regression analysis, with statistical significance set at p ≤ 0.05. There was a significant positive correlation between the devised statistical definition and the current audiology methods for assessing stability, with a Pearson correlation coefficient r = 0.36 and a least squares regression slope (b) of 0.41, df(58), 95% confidence interval 0.07 to 0.55 (p = 0.004). The average duration from device switch-on to stability in the first implant was 87 days using current audiology methods and 81 days using the statistical definition, with no statistically significant difference between assessment methods (p = 0.2). The duration to achieve stability in the second implant was 51 days using current audiology methods and 60 days using the statistical method, again with no difference between the two assessment methods (p = 0.13). There was a significant reduction in the time to achieve stability in second implants for both the audiology and statistical methods (p < 0.001 and p = 0.02, respectively). There was a difference in duration to achieve stability based on electrode array region, with basal portions taking longer to stabilize than apical portions in first implants (p = 0.02) and than both apical and medial segments in second implants (p = 0.004 and p = 0.01, respectively). No factor evaluated in this study, including gender, age, etiology of deafness, duration of deafness, time between implant insertions, and preoperative hearing aid status, was correlated with stability duration in either stability assessment method. Our statistical definition can accurately predict cochlear implant map stability when compared with current audiology practices. Cochlear implants that are implanted second tend to stabilize sooner than the first, which has a significant impact on counseling before a second implant. No factor evaluated affected the duration required to achieve stability in this study.

  9. Measurement-device-independent quantum key distribution with source state errors and statistical fluctuation

    NASA Astrophysics Data System (ADS)

    Jiang, Cong; Yu, Zong-Wen; Wang, Xiang-Bin

    2017-03-01

    We show how to calculate the secure final key rate in the four-intensity decoy-state measurement-device-independent quantum key distribution protocol with both source errors and statistical fluctuations with a certain failure probability. Our results rely only on the range of a few parameters in the source state. All imperfections in this protocol have been taken into consideration without assuming any specific error patterns of the source.

  10. SEPEM: A tool for statistical modeling the solar energetic particle environment

    NASA Astrophysics Data System (ADS)

    Crosby, Norma; Heynderickx, Daniel; Jiggens, Piers; Aran, Angels; Sanahuja, Blai; Truscott, Pete; Lei, Fan; Jacobs, Carla; Poedts, Stefaan; Gabriel, Stephen; Sandberg, Ingmar; Glover, Alexi; Hilgers, Alain

    2015-07-01

    Solar energetic particle (SEP) events are a serious radiation hazard for spacecraft as well as a severe health risk to humans traveling in space. Indeed, accurate modeling of the SEP environment constitutes a priority requirement for astrophysics and solar system missions and for human exploration in space. The European Space Agency's Solar Energetic Particle Environment Modelling (SEPEM) application server is a World Wide Web interface to a complete set of cross-calibrated data ranging from 1973 to 2013 as well as new SEP engineering models and tools. Both statistical and physical modeling techniques have been included, in order to cover the environment not only at 1 AU but also in the inner heliosphere ranging from 0.2 AU to 1.6 AU using a newly developed physics-based shock-and-particle model to simulate particle flux profiles of gradual SEP events. With SEPEM, SEP peak flux and integrated fluence statistics can be studied, as well as durations of high SEP flux periods. Furthermore, effects tools are also included to allow calculation of single event upset rate and radiation doses for a variety of engineering scenarios.

  11. A randomized evaluation of a computer-based physician's workstation: design considerations and baseline results.

    PubMed Central

    Rotman, B. L.; Sullivan, A. N.; McDonald, T.; DeSmedt, P.; Goodnature, D.; Higgins, M.; Suermondt, H. J.; Young, C. Y.; Owens, D. K.

    1995-01-01

    We are performing a randomized, controlled trial of a Physician's Workstation (PWS), an ambulatory care information system, developed for use in the General Medical Clinic (GMC) of the Palo Alto VA. Goals for the project include selecting appropriate outcome variables and developing a statistically powerful experimental design with a limited number of subjects. As PWS provides real-time drug-ordering advice, we retrospectively examined drug costs and drug-drug interactions in order to select outcome variables sensitive to our short-term intervention as well as to estimate the statistical efficiency of alternative design possibilities. Drug cost data revealed the mean daily cost per physician per patient was 99.3 cents +/- 13.4 cents, with a range from $0.77 to $1.37. The rate of major interactions per prescription for each physician was 2.9% +/- 1%, with a range from 1.5% to 4.8%. Based on these baseline analyses, we selected a two-period parallel design for the evaluation, which maximized statistical power while minimizing sources of bias. PMID:8563376

  12. Multiplicative processes in visual cognition

    NASA Astrophysics Data System (ADS)

    Credidio, H. F.; Teixeira, E. N.; Reis, S. D. S.; Moreira, A. A.; Andrade, J. S.

    2014-03-01

    The Central Limit Theorem (CLT) is certainly one of the most important results in statistics. The simple fact that the addition of many random variables can generate the same probability curve elucidated the underlying process for a broad spectrum of natural systems, ranging from the statistical distribution of human heights to the distribution of measurement errors, to mention a few. An extension of the CLT applies to multiplicative processes, where a given measure is the result of the product of many random variables. The statistical signature of these processes is rather ubiquitous, appearing in a diverse range of natural phenomena, including the distributions of incomes, body weights, rainfall, and fragment sizes in a rock crushing process. Here we corroborate results from previous studies which indicate the presence of multiplicative processes in a particular type of visual cognition task, namely, the visual search for hidden objects. Precisely, our results from eye-tracking experiments show that the distribution of fixation times during visual search obeys a log-normal pattern, while the fixational radii of gyration follow power-law behavior.
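
    The multiplicative CLT invoked here is easy to demonstrate numerically: the product of many positive random factors is approximately log-normal, so its logarithm is approximately normal. A quick sketch with synthetic factors:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      factors = rng.uniform(0.5, 1.5, size=(10_000, 200))  # positive random factors
      products = factors.prod(axis=1)                      # one product per sample

      # The raw products are strongly right-skewed; their logs are nearly symmetric.
      print(f"skewness of products:      {stats.skew(products):.2f}")
      print(f"skewness of log(products): {stats.skew(np.log(products)):.2f}")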

  13. Rasch fit statistics and sample size considerations for polytomous data.

    PubMed

    Smith, Adam B; Rush, Robert; Fallowfield, Lesley J; Velikova, Galina; Sharpe, Michael

    2008-05-29

    Previous research on educational data has demonstrated that Rasch fit statistics (mean squares and t-statistics) are highly susceptible to sample size variation for dichotomously scored rating data, although little is known about this relationship for polytomous data. These statistics help inform researchers about how well items fit to a unidimensional latent trait, and are an important adjunct to modern psychometrics. Given the increasing use of Rasch models in health research, the purpose of this study was therefore to explore the relationship between fit statistics and sample size for polytomous data. Data were collated from a heterogeneous sample of cancer patients (n = 4072) who had completed both the Patient Health Questionnaire - 9 and the Hospital Anxiety and Depression Scale. Ten samples were drawn with replacement for each of eight sample sizes (n = 25 to n = 3200). The Rating and Partial Credit Models were applied and the mean square and t-fit statistics (infit/outfit) derived for each model. The results demonstrated that t-statistics were highly sensitive to sample size, whereas mean square statistics remained relatively stable for polytomous data. It was concluded that mean square statistics were relatively independent of sample size for polytomous data and that misfit to the model could be identified using published recommended ranges.
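
    For reference, the infit/outfit mean-square statistics discussed here reduce to simple functions of standardized residuals. A hedged sketch given model-expected scores and variances (all values are made up; real Rasch software derives them from the estimated item and person parameters):

      import numpy as np

      # All values are hypothetical placeholders, not fitted Rasch quantities.
      observed = np.array([2, 1, 3, 0, 2, 1], dtype=float)   # polytomous responses
      expected = np.array([1.6, 1.2, 2.4, 0.5, 1.9, 1.4])    # model-expected scores
      variance = np.array([0.8, 0.7, 0.6, 0.4, 0.9, 0.7])    # model variances

      z2 = (observed - expected) ** 2 / variance             # squared standardized residuals
      outfit_ms = z2.mean()                                  # unweighted mean square
      infit_ms = ((observed - expected) ** 2).sum() / variance.sum()  # information-weighted
      print(f"outfit MS = {outfit_ms:.2f}, infit MS = {infit_ms:.2f}")
      # Values near 1 indicate fit; misfit is flagged against published ranges.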

  14. Rasch fit statistics and sample size considerations for polytomous data

    PubMed Central

    Smith, Adam B; Rush, Robert; Fallowfield, Lesley J; Velikova, Galina; Sharpe, Michael

    2008-01-01

    Background Previous research on educational data has demonstrated that Rasch fit statistics (mean squares and t-statistics) are highly susceptible to sample size variation for dichotomously scored rating data, although little is known about this relationship for polytomous data. These statistics help inform researchers about how well items fit to a unidimensional latent trait, and are an important adjunct to modern psychometrics. Given the increasing use of Rasch models in health research, the purpose of this study was therefore to explore the relationship between fit statistics and sample size for polytomous data. Methods Data were collated from a heterogeneous sample of cancer patients (n = 4072) who had completed both the Patient Health Questionnaire – 9 and the Hospital Anxiety and Depression Scale. Ten samples were drawn with replacement for each of eight sample sizes (n = 25 to n = 3200). The Rating and Partial Credit Models were applied and the mean square and t-fit statistics (infit/outfit) derived for each model. Results The results demonstrated that t-statistics were highly sensitive to sample size, whereas mean square statistics remained relatively stable for polytomous data. Conclusion It was concluded that mean square statistics were relatively independent of sample size for polytomous data and that misfit to the model could be identified using published recommended ranges. PMID:18510722

  15. Statistical analysis of Thematic Mapper Simulator data for the geobotanical discrimination of rock types in southwest Oregon

    NASA Technical Reports Server (NTRS)

    Morrissey, L. A.; Weinstock, K. J.; Mouat, D. A.; Card, D. H.

    1984-01-01

    This research evaluates Thematic Mapper Simulator (TMS) data for the geobotanical discrimination of rock types based on vegetative cover characteristics. A methodology for accomplishing this evaluation using univariate and multivariate techniques is presented. TMS data acquired with a Daedalus DEI-1260 multispectral scanner were integrated with vegetation and geologic information for subsequent statistical analyses, which included a chi-square test, an analysis of variance, stepwise discriminant analysis, and Duncan's multiple range test. Results indicate that ultramafic rock types are spectrally separable from nonultramafics based on vegetative cover through the use of statistical analyses.

  16. The changing landscape of astrostatistics and astroinformatics

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.

    2017-06-01

    The history and current status of the cross-disciplinary fields of astrostatistics and astroinformatics are reviewed. Astronomers need a wide range of statistical methods for both data reduction and science analysis. With the proliferation of high-throughput telescopes, efficient large scale computational methods are also becoming essential. However, astronomers receive only weak training in these fields during their formal education. Interest in the fields is rapidly growing with conferences organized by scholarly societies, textbooks and tutorial workshops, and research studies pushing the frontiers of methodology. R, the premier language of statistical computing, can provide an important software environment for the incorporation of advanced statistical and computational methodology into the astronomical community.

  17. Examination of Solar Cycle Statistical Model and New Prediction of Solar Cycle 23

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Wilson, John W.

    2000-01-01

    Sunspot numbers in the current solar cycle 23 were estimated using a statistical model with the accumulating cycle sunspot data, based on the odd-even behavior of historical sunspot cycles 1 through 22. Since cycle 23 has progressed and the solar minimum occurrence has now been accurately determined, the statistical model is validated by comparing the previous prediction with the newly measured sunspot numbers, and the short-range sunspot projection is improved accordingly. The current cycle is expected to have a moderate level of activity. Errors of this model are shown to be self-correcting as cycle observations become available.

  18. Statistical summaries of water-quality data for two coal areas of Jackson County, Colorado

    USGS Publications Warehouse

    Kuhn, Gerhard

    1982-01-01

    Statistical summaries of water-quality data are compiled for eight streams in two separate coal areas of Jackson County, Colo. The quality-of-water data were collected from October 1976 to September 1980. For inorganic constituents, the maximum, minimum, and mean concentrations, as well as other statistics are presented; for minor elements, only the maximum, minimum, and mean values are included. Least-squares equations (regressions) are also given relating specific conductance of the streams to the concentration of the major ions. The observed range of specific conductance was 85 to 1,150 micromhos per centimeter for the eight sites. (USGS)

  19. Dual Level Statistical Investigation of Equilibrium Solubility in Simulated Fasted and Fed Intestinal Fluid

    PubMed Central

    2017-01-01

    The oral route is the preferred option for drug administration but carries the inherent issue of drug absorption from the gastro-intestinal tract (GIT) in order to elicit systemic activity. A prerequisite for absorption is drug dissolution, which is dependent upon drug solubility in the variable milieu of GIT fluid, with poorly soluble drugs presenting a formulation and biopharmaceutical challenge. Multiple factors within GIT fluid influence solubility, ranging from pH to the concentration and ratio of amphiphilic substances such as phospholipid, bile salt, monoglyceride, and cholesterol. To aid in vitro investigation, simulated intestinal fluids (SIF) covering the fasted and fed states have been developed. SIF media are complex, and statistical design of experiment (DoE) investigations have revealed the range of solubility values possible within each state due to physiological variability, along with the media factors and factor interactions that influence solubility. However, these studies require large numbers of experiments (>60) and are not feasible or sensible within a drug development setting. In the current study, a smaller dual-level DoE with a reduced number of experiments (20), providing three arms covering the fasted and fed states along with a combined analysis, was investigated. The results indicate that this small-scale investigation is feasible and provides solubility ranges that encompass published data in human and simulated fasted and fed fluids. The measured fasted and fed solubility ranges agree with published large-scale DoE results in around half of the cases, with the differences due to changes in media composition between studies, indicating that drug-specific behaviors are being determined and that careful selection of media factors and concentration levels is required in order to determine a physiologically relevant solubility range. The study also correctly identifies the major single factor or factors that influence solubility, but it is evident that lower-significance factors (for example bile salt) are not picked up due to the lower sample number employed. A similar issue is present with factor interactions, with only a limited number available for study and generally not determined to have a significant solubility impact due to the lower statistical power of the study. The study indicates that a reduced-experimental-number DoE is feasible and will provide solubility range results with identification of major solubility factors; however, statistical limitations restrict the analysis. The approach therefore represents a useful initial screening tool that can guide further in-depth analysis of a drug's behavior in gastrointestinal fluids. PMID:29072917
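
    The flavour of such a two-level design can be sketched by enumerating factor-level combinations, from which a reduced design keeps a fraction of the runs; the factors and levels below are illustrative placeholders, not the study's actual media composition:

      from itertools import product

      # Hypothetical two-level factors; values are illustrative, not the study's.
      factors = {
          "pH": (5.0, 7.0),
          "bile_salt_mM": (3.0, 15.0),
          "phospholipid_mM": (0.2, 4.0),
          "monoglyceride_mM": (0.5, 5.0),
      }

      runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
      print(f"{len(runs)} runs in the full two-level design")  # a reduced DoE keeps a fraction
      for run in runs[:3]:
          print(run)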

  20. Modeling forest biomass and growth: Coupling long-term inventory and LiDAR data

    Treesearch

    Chad Babcock; Andrew O. Finley; Bruce D. Cook; Aaron Weiskittel; Christopher W. Woodall

    2016-01-01

    Combining spatially-explicit long-term forest inventory and remotely sensed information from Light Detection and Ranging (LiDAR) datasets through statistical models can be a powerful tool for predicting and mapping above-ground biomass (AGB) at a range of geographic scales. We present and examine a novel modeling approach to improve prediction of AGB and estimate AGB...

  1. The effects of range-of-motion therapy on the plantar pressures of patients with diabetes mellitus.

    PubMed

    Goldsmith, Jon R; Lidtke, Roy H; Shott, Susan

    2002-10-01

    A randomized controlled study of 19 patients with diabetes mellitus (10 men, 9 women) was undertaken to determine the effects of home exercise therapy on joint mobility and plantar pressures. Of the 19 subjects, 9 performed unsupervised active and passive range-of-motion exercises of the joints in their feet. Each subject was evaluated for joint stiffness and peak plantar pressures at the beginning and conclusion of the study. After only 1 month of therapy, a statistically significant average decrease of 4.2% in peak plantar pressures was noted in the subjects performing the range-of-motion exercises, whereas the control group showed an average increase of 4.4% in peak plantar pressures. Although the joint mobility data revealed no statistically significant differences between the groups, there was a trend toward decreased joint stiffness in the treatment group. The results of this study demonstrate that an unsupervised range-of-motion exercise program can reduce peak plantar pressures in the diabetic foot. Given that high plantar pressures have been linked to diabetic neuropathic ulceration, it may be possible to reduce the risk of such ulceration with this therapy.

  2. A simulation of GPS and differential GPS sensors

    NASA Technical Reports Server (NTRS)

    Rankin, James M.

    1993-01-01

    The Global Positioning System (GPS) is a revolutionary advance in navigation. Users can determine latitude, longitude, and altitude by receiving range information from at least four satellites. The statistical accuracy of the user's position is directly proportional to the statistical accuracy of the range measurement. Range errors are caused by clock errors, ephemeris errors, atmospheric delays, multipath errors, and receiver noise. Selective Availability, which the military uses to intentionally degrade accuracy for non-authorized users, is a major error source. The proportionality constant relating position errors to range errors is the Dilution of Precision (DOP) which is a function of the satellite geometry. Receivers separated by relatively short distances have the same satellite and atmospheric errors. Differential GPS (DGPS) removes these errors by transmitting pseudorange corrections from a fixed receiver to a mobile receiver. The corrected pseudorange at the moving receiver is now corrupted only by errors from the receiver clock, multipath, and measurement noise. This paper describes a software package that models position errors for various GPS and DGPS systems. The error model is used in the Real-Time Simulator and Cockpit Technology workstation simulations at NASA-LaRC. The GPS/DGPS sensor can simulate enroute navigation, instrument approaches, or on-airport navigation.
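
    The Dilution of Precision factor mentioned above follows directly from the satellite geometry matrix. A minimal sketch of that computation (the four line-of-sight unit vectors are invented for illustration):

      import numpy as np

      # Hypothetical unit line-of-sight vectors to four satellites (local ENU frame).
      los = np.array([[ 0.866,  0.0,   0.5],
                      [-0.433,  0.75,  0.5],
                      [-0.433, -0.75,  0.5],
                      [ 0.0,    0.0,   1.0]])
      G = np.hstack([los, np.ones((4, 1))])   # geometry matrix: position + clock columns

      Q = np.linalg.inv(G.T @ G)              # covariance shape for unit range noise
      gdop = np.sqrt(np.trace(Q))             # geometric DOP
      pdop = np.sqrt(np.trace(Q[:3, :3]))     # position DOP
      print(f"GDOP = {gdop:.2f}, PDOP = {pdop:.2f}")
      # Position error sigma is roughly DOP times range error sigma, the
      # proportionality noted in the abstract.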

  3. Application of the Kombucha 'tea fungus' for the enhancement of antioxidant and starch hydrolase inhibitory properties of ten herbal teas.

    PubMed

    Watawana, Mindani I; Jayawardena, Nilakshi; Choo, Candy; Waisundara, Viduranga Y

    2016-03-01

    Ten herbal teas (Acacia arabica, Aegle marmelos flower, A. marmelos root bark, Aerva lanata, Asteracantha longifolia, Cassia auriculata, Hemidesmus indicus, Hordeum vulgare, Phyllanthus emblica, Tinospora cordifolia) were fermented with the Kombucha 'tea fungus'. The pH values of the fermented beverages ranged from 4.0 to 6.0 by day 7, while the titratable acidity ranged from 2.5 to 5.0 g/mL (P<0.05). Gallic acid content had increased statistically significantly (P<0.05) in almost all the samples by day 7. The oxygen radical absorbance capacity assay indicated 5 of the Kombucha beverages to have statistically significant increases (P<0.05) by day 7. The α-amylase inhibitory activities ranged from 52.5 to 67.2 μg/mL in terms of IC50 values following fermentation, while the α-glucosidase inhibitory activities ranged from 95.2 to 196.1 μg/mL. In conclusion, an enhancement of the antioxidant and starch hydrolase inhibitory potential of the herbal teas was observed by adding the tea fungus. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Effects of spatial frequency bands on perceptual decision: it is not the stimuli but the comparison.

    PubMed

    Rotshtein, Pia; Schofield, Andrew; Funes, María J; Humphreys, Glyn W

    2010-08-24

    Observers performed three between- and two within-category perceptual decisions with hybrid stimuli comprising low and high spatial frequency (SF) images. We manipulated (a) attention to, and (b) congruency of information in the two SF bands. Processing difficulty of the different SF bands varied across different categorization tasks: house-flower, face-house, and valence decisions were easier when based on high SF bands, while flower-face and gender categorizations were easier when based on low SF bands. Larger interference also arose from response-relevant distracters that were presented in the "preferred" SF range of the task. Low SF effects were facilitated by short exposure durations. The results demonstrate that decisions are affected by an interaction of task and SF range and that the information from the non-attended SF range interfered at the decision level. A further analysis revealed that overall differences in the statistics of image features, in particular differences of orientation information between two categories, were associated with decision difficulty. We concluded that the advantage of using information from one SF range over another depends on the specific task requirements that build on the differences of the statistical properties between the compared categories.

  5. The Statistical Analysis of Global Oxygen ENAs Sky Maps from IBEX-Lo: Implication on the ENA sources

    NASA Astrophysics Data System (ADS)

    Park, J.; Kucharek, H.; Moebius, E.; Bochsler, P. A.

    2013-12-01

    Energetic Neutral Atoms (ENAs) created in the interstellar medium and heliospheric interface have been observed by the Interstellar Boundary Explorer (IBEX), orbiting the Earth on a highly elliptical trajectory, since 2008. The science payload on this small spacecraft consists of two highly sensitive single-pixel ENA cameras: the IBEX-Lo sensor covering the energy range from 0.01 to 2 keV and the IBEX-Hi sensor covering the energy range from 0.3 to 6 keV. In order to measure the incident ENAs, the IBEX-Lo sensor uses a conversion surface to convert neutrals to negative ions. After passing an electrostatic analyzer, they are separated by species (H and heavier species) via a time-of-flight mass spectrometer. All-sky H ENA maps over three years were completed and show two significant features: the interstellar H and He neutral flow appears at low energies (0.01 to 0.11 keV) and the ribbon appears at higher energies (0.21 to 1.35 keV). As in the hydrogen sky maps, the interstellar O+Ne neutral flow appears in the all-sky O ENA maps at energies from 0.21 to 0.87 keV. The distributed heliospheric oxygen ENA flux over the entire energy range is determined from very low counting statistics. In this study, we therefore apply Cash's C statistic (Cash, 1979) and determine the upper and lower confidence limits (Gehrels, 1986) for the statistical significance among all events in the all-sky O ENA maps. These newly created sky maps specifically show the distributed heliospheric O ENA flux surrounding the interstellar O+Ne neutral flow. This enhanced distributed ENA flux will provide new insights into the ion population creating the ENA emission. There seems to be no signature of the ribbon in the all-sky O ENA maps. If one assumes that the generation mechanism of the ribbon is the same for hydrogen and oxygen, the location of the source ion population may be closer to the heliosheath. In this poster we will discuss all the results of this study and their implications for the source regions and populations in detail.
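
    A minimal sketch of the low-count machinery cited above (Cash, 1979; Gehrels, 1986), with illustrative counts: one common form of Cash's C statistic for Poisson-distributed pixel counts against a model, and the well-known Gehrels approximations for 1-sigma (84.13%) lower and upper confidence limits.

        import numpy as np

        def cash_c(counts, model):
            """One common form of Cash's C statistic for Poisson counts."""
            counts = np.asarray(counts, dtype=float)
            model = np.asarray(model, dtype=float)
            safe = np.where(counts > 0, counts, 1.0)  # avoid log(0)
            term = np.where(counts > 0, counts * np.log(safe / model), 0.0)
            return 2.0 * np.sum(model - counts + term)

        def gehrels_limits(n):
            """Approximate 84.13% (1-sigma) lower/upper limits for n counts."""
            upper = n + np.sqrt(n + 0.75) + 1.0
            lower = n * (1.0 - 1.0 / (9.0 * n) - 1.0 / (3.0 * np.sqrt(n))) ** 3 if n > 0 else 0.0
            return lower, upper

        print(cash_c([0, 1, 3, 0], [0.5, 0.8, 2.0, 0.2]))  # model fit quality
        print(gehrels_limits(3))  # e.g. three O ENA events in one sky-map pixel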

  6. Establishing Consensus Turbulence Statistics for Hot Subsonic Jets

    NASA Technical Reports Server (NTRS)

    Bridges, James; Werner, Mark P.

    2010-01-01

    Many tasks in fluids engineering require knowledge of the turbulence in jets. There is a strong, although fragmented, literature base for low-order statistics, such as jet spread and other mean-velocity field characteristics. Some sources, particularly for low-speed cold jets, also provide turbulence intensities that are required for validating Reynolds-averaged Navier-Stokes (RANS) Computational Fluid Dynamics (CFD) codes. There are far fewer sources for jet spectra and for space-time correlations of turbulent velocity required for aeroacoustics applications, although there have been many singular publications with various unique statistics, such as Proper Orthogonal Decomposition, designed to uncover an underlying low-order dynamical description of turbulent jet flow. As the complexity of the statistic increases, the number of flows for which the data has been categorized and assembled decreases, making it difficult to systematically validate prediction codes that require high-level statistics over a broad range of jet flow conditions. For several years, researchers at NASA have worked on developing and validating jet noise prediction codes. One such class of codes, loosely called CFD-based or statistical methods, uses RANS CFD to predict jet mean and turbulent intensities in velocity and temperature. These flow quantities serve as the input to the acoustic source models and flow-sound interaction calculations that yield predictions of far-field jet noise. To develop this capability, a catalog of turbulent jet flows has been created with statistics ranging from mean velocity to space-time correlations of Reynolds stresses. The present document describes this catalog and assesses the accuracy of the data, e.g., establishing uncertainties for the data. This paper covers the following five tasks: (1) document the acquisition and processing procedures used to create the particle image velocimetry (PIV) datasets; (2) compare PIV data with hotwire and laser Doppler velocimetry (LDV) data published in the open literature; (3) compare different datasets acquired at roughly the same flow conditions to establish uncertainties; (4) create a consensus dataset for a range of hot jet flows, including uncertainty bands; and (5) analyze this consensus dataset for self-consistency and compare jet characteristics to those of the open literature. One final objective fulfilled by this work was the demonstration of a universal scaling for the jet flow fields, at least within the region of interest to aeroacoustics. The potential core length and the spread rate of the half-velocity radius were used to collapse the mean and turbulent velocity fields over the first 20 jet diameters in a highly satisfying manner.

  7. KERNELHR: A program for estimating animal home ranges

    USGS Publications Warehouse

    Seaman, D.E.; Griffith, B.; Powell, R.A.

    1998-01-01

    Kernel methods are state of the art for estimating animal home-range area and utilization distribution (UD). The KERNELHR program was developed to provide researchers and managers a tool to implement this extremely flexible set of methods with many variants. KERNELHR runs interactively or from the command line on any personal computer (PC) running DOS. KERNELHR provides output of fixed and adaptive kernel home-range estimates, as well as density values in a format suitable for in-depth statistical and spatial analyses. An additional package of programs creates contour files for plotting in geographic information systems (GIS) and estimates core areas of ranges.
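
    KERNELHR itself is an interactive DOS program; as a minimal Python sketch of the fixed-kernel utilization distribution it estimates, the fragment below builds a Gaussian KDE over simulated relocation points and thresholds the density on a grid to obtain a 95% home-range area. The coordinates, sample size, and default smoothing are all illustrative.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(1)
        locs = rng.normal([0.0, 0.0], [250.0, 180.0], size=(200, 2))  # relocations (m)

        kde = gaussian_kde(locs.T)  # fixed Gaussian kernel over all points
        xs = np.linspace(locs[:, 0].min() - 500, locs[:, 0].max() + 500, 200)
        ys = np.linspace(locs[:, 1].min() - 500, locs[:, 1].max() + 500, 200)
        gx, gy = np.meshgrid(xs, ys)
        dens = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

        # Lowest density level whose enclosed probability mass reaches 95%:
        cell = (xs[1] - xs[0]) * (ys[1] - ys[0])
        order = np.sort(dens.ravel())[::-1]
        cum = np.cumsum(order) * cell
        level = order[min(np.searchsorted(cum, 0.95), order.size - 1)]
        area_ha = dens[dens >= level].size * cell / 1e4
        print(f"95% fixed-kernel home range: about {area_ha:.1f} ha")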

  8. Fragility of Results in Ophthalmology Randomized Controlled Trials: A Systematic Review.

    PubMed

    Shen, Carl; Shamsudeen, Isabel; Farrokhyar, Forough; Sabri, Kourosh

    2018-05-01

    Evidence-based medicine is guided by our interpretation of randomized controlled trials (RCTs) that address important clinical questions. Evaluation of the robustness of statistically significant outcomes adds a crucial element to the global assessment of trial findings. The purpose of this systematic review was to determine the robustness of ophthalmology RCTs through application of the Fragility Index (FI), a novel metric of the robustness of statistically significant outcomes. Systematic review. A literature search (MEDLINE) was performed for all RCTs published in top ophthalmology journals and ophthalmology-related RCTs published in high-impact journals in the past 10 years. Two reviewers independently screened 1811 identified articles for inclusion if they (1) were a human ophthalmology-related trial, (2) had a 1:1 prospective study design, and (3) reported a statistically significant dichotomous outcome in the abstract. All relevant data, including outcome, P value, number of patients in each group, number of events in each group, number of patients lost to follow-up, and trial characteristics, were extracted. The FI of each RCT was calculated and multivariate regression applied to determine predictive factors. The 156 trials had a median sample size of 91.5 (range, 13-2593) patients/eyes, and a median of 28 (range, 4-2217) events. The median FI of the included trials was 2 (range, 0-48), meaning that if 2 non-events were switched to events in the treatment group, the result would lose its statistical significance. A quarter of all trials had an FI of 1 or less, and 75% of trials had an FI of 6 or less. The FI was less than the number of missing data points in 52.6% of trials. Predictive factors for FI by multivariate regression included smaller P value (P < 0.001), larger sample size (P = 0.001), larger number of events (P = 0.011), and journal impact factor (P = 0.029). In ophthalmology trials, statistically significant dichotomous results are often fragile, meaning that a difference of only a couple of events can change the statistical significance. An application of the FI in RCTs may aid in the interpretation of results and assessment of quality of evidence. Copyright © 2017 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
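
    A minimal sketch of the Fragility Index calculation described above, assuming the usual convention of switching non-events to events in the group with fewer events and re-running Fisher's exact test until p >= 0.05; the trial counts are hypothetical.

        from scipy.stats import fisher_exact

        def fragility_index(events_a, n_a, events_b, n_b, alpha=0.05):
            """Switch non-events to events in group A until p >= alpha."""
            fi, e_a = 0, events_a
            while e_a < n_a:
                _, p = fisher_exact([[e_a, n_a - e_a], [events_b, n_b - events_b]])
                if p >= alpha:
                    return fi
                e_a += 1
                fi += 1
            return fi

        # Hypothetical trial: 5/50 vs 15/50 events, initially significant.
        print("Fragility Index:", fragility_index(5, 50, 15, 50))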

  9. Predicting Outcomes After Chemo-Embolization in Patients with Advanced-Stage Hepatocellular Carcinoma: An Evaluation of Different Radiologic Response Criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunn, Andrew J., E-mail: agunn@uabmc.edu; Sheth, Rahul A.; Luber, Brandon

    2017-01-15

    Purpose: The purpose of this study was to evaluate the ability of various radiologic response criteria to predict patient outcomes after trans-arterial chemo-embolization with drug-eluting beads (DEB-TACE) in patients with advanced-stage (BCLC C) hepatocellular carcinoma (HCC). Materials and methods: Hospital records from 2005 to 2011 were retrospectively reviewed. Non-infiltrative lesions were measured at baseline and on follow-up scans after DEB-TACE according to various common radiologic response criteria, including guidelines of the World Health Organization (WHO), Response Evaluation Criteria in Solid Tumors (RECIST), the European Association for the Study of the Liver (EASL), and modified RECIST (mRECIST). Statistical analysis was performed to see which, if any, of the response criteria could be used as a predictor of overall survival (OS) or time-to-progression (TTP). Results: 75 patients met inclusion criteria. Median OS and TTP were 22.6 months (95 % CI 11.6–24.8) and 9.8 months (95 % CI 7.1–21.6), respectively. Univariate and multivariate Cox analyses revealed that none of the evaluated criteria had the ability to be used as a predictor for OS or TTP. Analysis of the C index in both univariate and multivariate models showed that the evaluated criteria were not accurate predictors of either OS (C-statistic range: 0.51–0.58 in the univariate model; range: 0.54–0.58 in the multivariate model) or TTP (C-statistic range: 0.55–0.59 in the univariate model; range: 0.57–0.61 in the multivariate model). Conclusion: Current response criteria are not accurate predictors of OS or TTP in patients with advanced-stage HCC after DEB-TACE.

  10. Predicting Outcomes After Chemo-Embolization in Patients with Advanced-Stage Hepatocellular Carcinoma: An Evaluation of Different Radiologic Response Criteria.

    PubMed

    Gunn, Andrew J; Sheth, Rahul A; Luber, Brandon; Huynh, Minh-Huy; Rachamreddy, Niranjan R; Kalva, Sanjeeva P

    2017-01-01

    The purpose of this study was to evaluate the ability of various radiologic response criteria to predict patient outcomes after trans-arterial chemo-embolization with drug-eluting beads (DEB-TACE) in patients with advanced-stage (BCLC C) hepatocellular carcinoma (HCC). Hospital records from 2005 to 2011 were retrospectively reviewed. Non-infiltrative lesions were measured at baseline and on follow-up scans after DEB-TACE according to various common radiologic response criteria, including guidelines of the World Health Organization (WHO), Response Evaluation Criteria in Solid Tumors (RECIST), the European Association for the Study of the Liver (EASL), and modified RECIST (mRECIST). Statistical analysis was performed to see which, if any, of the response criteria could be used as a predictor of overall survival (OS) or time-to-progression (TTP). 75 patients met inclusion criteria. Median OS and TTP were 22.6 months (95 % CI 11.6-24.8) and 9.8 months (95 % CI 7.1-21.6), respectively. Univariate and multivariate Cox analyses revealed that none of the evaluated criteria had the ability to be used as a predictor for OS or TTP. Analysis of the C index in both univariate and multivariate models showed that the evaluated criteria were not accurate predictors of either OS (C-statistic range: 0.51-0.58 in the univariate model; range: 0.54-0.58 in the multivariate model) or TTP (C-statistic range: 0.55-0.59 in the univariate model; range: 0.57-0.61 in the multivariate model). Current response criteria are not accurate predictors of OS or TTP in patients with advanced-stage HCC after DEB-TACE.

  11. Statistical Validation of Image Segmentation Quality Based on a Spatial Overlap Index

    PubMed Central

    Zou, Kelly H.; Warfield, Simon K.; Bharatha, Aditya; Tempany, Clare M.C.; Kaus, Michael R.; Haker, Steven J.; Wells, William M.; Jolesz, Ferenc A.; Kikinis, Ron

    2005-01-01

    Rationale and Objectives: To examine a statistical validation method based on the spatial overlap between two sets of segmentations of the same anatomy. Materials and Methods: The Dice similarity coefficient (DSC) was used as a statistical validation metric to evaluate the performance of both the reproducibility of manual segmentations and the spatial overlap accuracy of automated probabilistic fractional segmentation of MR images, illustrated on two clinical examples. Example 1: 10 consecutive cases of prostate brachytherapy patients underwent both preoperative 1.5T and intraoperative 0.5T MR imaging. For each case, 5 repeated manual segmentations of the prostate peripheral zone were performed separately on preoperative and on intraoperative images. Example 2: A semi-automated probabilistic fractional segmentation algorithm was applied to MR imaging of 9 cases with 3 types of brain tumors. DSC values were computed and logit-transformed values were compared in the mean with the analysis of variance (ANOVA). Results: Example 1: The mean DSCs of 0.883 (range, 0.876–0.893) with 1.5T preoperative MRI and 0.838 (range, 0.819–0.852) with 0.5T intraoperative MRI (P < .001) were within and at the margin of the range of good reproducibility, respectively. Example 2: Wide ranges of DSC were observed in brain tumor segmentations: meningiomas (0.519–0.893), astrocytomas (0.487–0.972), and other mixed gliomas (0.490–0.899). Conclusion: The DSC value is a simple and useful summary measure of spatial overlap, which can be applied to studies of reproducibility and accuracy in image segmentation. We observed generally satisfactory but variable validation results in two clinical applications. This metric may be adapted for similar validation tasks. PMID:14974593
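
    A minimal sketch of the metric itself: DSC = 2|A ∩ B| / (|A| + |B|) for two binary segmentation masks, here applied to toy rectangular masks. The often-quoted DSC > 0.7 threshold for good overlap is a common convention, assumed here for illustration.

        import numpy as np

        def dice(a, b):
            """DSC = 2|A n B| / (|A| + |B|) for boolean masks a and b."""
            a, b = np.asarray(a, bool), np.asarray(b, bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        seg1 = np.zeros((64, 64), bool); seg1[10:40, 10:40] = True
        seg2 = np.zeros((64, 64), bool); seg2[15:45, 12:42] = True
        print(f"DSC = {dice(seg1, seg2):.3f}")  # > 0.7 often read as good overlap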

  12. Wave chaos in a randomly inhomogeneous waveguide: spectral analysis of the finite-range evolution operator.

    PubMed

    Makarov, D V; Kon'kov, L E; Uleysky, M Yu; Petrov, P S

    2013-01-01

    The problem of sound propagation in a randomly inhomogeneous oceanic waveguide is considered. An underwater sound channel in the Sea of Japan is taken as an example. Our attention is concentrated on the domains of finite-range ray stability in phase space and their influence on wave dynamics. These domains can be found by means of the one-step Poincaré map. To study manifestations of finite-range ray stability, we introduce the finite-range evolution operator (FREO) describing transformation of a wave field in the course of propagation along a finite segment of a waveguide. Carrying out statistical analysis of the FREO spectrum, we estimate the contribution of regular domains and explore their evanescence with increasing length of the segment. We utilize several methods of spectral analysis: analysis of eigenfunctions by expanding them over modes of the unperturbed waveguide, approximation of level-spacing statistics by means of the Berry-Robnik distribution, and the procedure used by A. Relano and coworkers [Relano et al., Phys. Rev. Lett. 89, 244102 (2002); Relano, Phys. Rev. Lett. 100, 224101 (2008)]. Comparing the results obtained with different methods, we find that the method based on the statistical analysis of FREO eigenfunctions is the most favorable for estimating the contribution of regular domains. It allows one to find directly the waveguide modes whose refraction is regular despite the random inhomogeneity. For example, it is found that near-axial sound propagation in the Sea of Japan preserves stability even over distances of hundreds of kilometers due to the presence of a shearless torus in the classical phase space. Increasing the acoustic wavelength degrades scattering, resulting in recovery of eigenfunction localization near periodic orbits of the one-step Poincaré map.

  13. Nocturnal oxygen saturation profiles of healthy term infants

    PubMed Central

    Terrill, Philip Ian; Dakin, Carolyn; Hughes, Ian; Yuill, Maggie; Parsley, Chloe

    2015-01-01

    Objective: Pulse oximetry is used extensively in hospital and home settings to measure arterial oxygen saturation (SpO2). Interpretation of the trend and range of SpO2 values observed in infants is currently limited by a lack of reference ranges using current devices, and may be augmented by development of cumulative frequency (CF) reference curves. This study aims to provide reference oxygen saturation values from a prospective longitudinal cohort of healthy infants. Design: Prospective longitudinal cohort study. Setting: Sleep laboratory. Patients: 34 healthy term infants were enrolled, and studied at 2 weeks, 3, 6, 12 and 24 months of age (N=30, 25, 27, 26, 20, respectively). Interventions: Full overnight polysomnography, including 2 s averaging pulse oximetry (Masimo Radical). Main outcome measurements: Summary SpO2 statistics (mean, median, 5th and 10th percentiles) and SpO2 CF plots were calculated for each recording. CF reference curves were then generated for each study age. Analyses were repeated with sleep-state stratifications and inclusion of manual artefact removal. Results: Median nocturnal SpO2 values ranged between 98% and 99% over the first 2 years of life and the CF reference curves shift right by 1% between 2 weeks and 3 months. CF reference curves did not change with manual artefact removal during sleep and did not vary between rapid eye movement (REM) and non-REM sleep. Manual artefact removal did significantly change summary statistics and CF reference curves during wake. Conclusions: SpO2 CF curves provide an intuitive visual tool for evaluating whether an individual's nocturnal SpO2 distribution falls within the range of healthy age-matched infants, thereby complementing summary statistics in the interpretation of extended oximetry recordings in infants. PMID:25063836
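
    A minimal sketch of the cumulative frequency (CF) presentation described above: for a simulated overnight SpO2 recording (2-s samples over roughly six hours are assumed), compute the fraction of the night spent at or below each saturation value. The simulated values stand in for real oximetry data and carry no clinical meaning.

        import numpy as np

        rng = np.random.default_rng(2)
        # Simulated overnight recording: 2-s samples, clipped to 85-100%.
        spo2 = np.clip(np.round(rng.normal(98.0, 1.2, 6 * 60 * 60 // 2)), 85, 100)

        values = np.arange(85, 101)
        cf = np.array([(spo2 <= v).mean() for v in values])  # cumulative frequency

        print("median SpO2:", np.median(spo2))
        for v, f in zip(values, cf):
            if 0.0 < f < 1.0:
                print(f"SpO2 <= {v:3.0f}%: {100 * f:5.1f}% of the night")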

  14. PV System Component Fault and Failure Compilation and Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Geoffrey Taylor; Lavrova, Olga; Gooding, Renee Lynne

    This report describes data collection and analysis of solar photovoltaic (PV) equipment events, which consist of faults and failures that occur during the normal operation of a distributed PV system or PV power plant. We present summary statistics from locations where maintenance data is being collected at various intervals, as well as reliability statistics gathered from that data, consisting of fault/failure distributions and repair distributions for a wide range of PV equipment types.

  15. Improving Range Estimation of a 3-Dimensional Flash Ladar via Blind Deconvolution

    DTIC Science & Technology

    2010-09-01

    2.1.4 Optical Imaging as a Linear and Nonlinear System; 2.1.5 Coherence Theory and Laser Light Statistics; 2.2 Deconvolution... rather than deconvolution. 2.1.5 Coherence Theory and Laser Light Statistics. Using [24] and [25], this section serves as background on coherence theory... the laser light incident on the detector surface. The image intensity related to different types of coherence is governed by the laser light's spatial...

  16. United States Air Force Statistical Digest, Fiscal Year 1975. 13th Edition

    DTIC Science & Technology

    1976-04-15

    USAF Statistical Digest. FUNCTIONS: The Forces have the following primary tasks: STRATEGIC OFFENSIVE and STRATEGIC DEFENSIVE (long-range weapons delivery)... Tables (garbled OCR): Functional Mission, as of end FY 1975; Inventory by Functional Distribution, by Mission and Design, as of end of FY 1975.

  17. Limited data tomographic image reconstruction via dual formulation of total variation minimization

    NASA Astrophysics Data System (ADS)

    Jang, Kwang Eun; Sung, Younghun; Lee, Kangeui; Lee, Jongha; Cho, Seungryong

    2011-03-01

    X-ray mammography is the primary imaging modality for breast cancer screening. For the dense breast, however, the mammogram is usually difficult to read due to the tissue overlap problem caused by the superposition of normal tissues. Digital breast tomosynthesis (DBT), which measures several low-dose projections over a limited angle range, may be an alternative modality for breast imaging, since it allows the visualization of the cross-sectional information of the breast. DBT, however, may suffer from aliasing artifacts and severe noise corruption. To overcome these problems, a total variation (TV) regularized statistical reconstruction algorithm is presented. Inspired by the dual formulation of TV minimization in denoising and deblurring problems, we derived a gradient-type algorithm based on the statistical model of X-ray tomography. The objective function is comprised of a data fidelity term derived from the statistical model and a TV regularization term. The gradient of the objective function can be easily calculated using simple operations in terms of auxiliary variables. After a descent step, the data fidelity term is updated in each iteration. Since the proposed algorithm can be implemented without sophisticated operations such as matrix inversion, it provides an efficient way to include the TV regularization in the statistical reconstruction method, which results in a fast and robust estimation for low-dose projections over the limited angle range. Initial tests with an experimental DBT system confirmed our finding.
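
    The authors' reconstruction couples a statistical data-fidelity term with the dual formulation of TV; as a hedged illustration of just the dual TV machinery they build on, the sketch below implements the classic Chambolle-style dual projection algorithm for plain TV denoising, not the full tomographic algorithm. The step size tau <= 1/8 and the weight lam are standard choices for this toy setting.

        import numpy as np

        def grad(u):
            gx, gy = np.zeros_like(u), np.zeros_like(u)
            gx[:-1, :] = u[1:, :] - u[:-1, :]   # forward differences
            gy[:, :-1] = u[:, 1:] - u[:, :-1]
            return gx, gy

        def div(px, py):                         # negative adjoint of grad
            d = np.zeros_like(px)
            d[0, :] = px[0, :]; d[1:-1, :] = px[1:-1, :] - px[:-2, :]; d[-1, :] = -px[-2, :]
            d[:, 0] += py[:, 0]; d[:, 1:-1] += py[:, 1:-1] - py[:, :-2]; d[:, -1] -= py[:, -2]
            return d

        def tv_denoise(f, lam=0.5, tau=0.125, n_iter=100):
            """Chambolle-style dual projection for TV denoising."""
            px, py = np.zeros_like(f), np.zeros_like(f)
            for _ in range(n_iter):
                gx, gy = grad(div(px, py) - f / lam)
                norm = 1.0 + tau * np.sqrt(gx ** 2 + gy ** 2)
                px, py = (px + tau * gx) / norm, (py + tau * gy) / norm
            return f - lam * div(px, py)

        rng = np.random.default_rng(7)
        clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
        noisy = clean + 0.3 * rng.normal(size=clean.shape)
        print("noisy error   :", float(np.abs(noisy - clean).mean()))
        print("denoised error:", float(np.abs(tv_denoise(noisy) - clean).mean()))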

  18. Three-dimensional Visualization of Ultrasound Backscatter Statistics by Window-modulated Compounding Nakagami Imaging.

    PubMed

    Zhou, Zhuhuang; Wu, Shuicai; Lin, Man-Yen; Fang, Jui; Liu, Hao-Li; Tsui, Po-Hsiang

    2018-05-01

    In this study, the window-modulated compounding (WMC) technique was integrated into three-dimensional (3D) ultrasound Nakagami imaging for improving the spatial visualization of backscatter statistics. A 3D WMC Nakagami image was produced by summing and averaging a number of 3D Nakagami images (number of frames denoted as N) formed using sliding cubes with varying side lengths ranging from 1 to N times the transducer pulse. To evaluate the performance of the proposed 3D WMC Nakagami imaging method, agar phantoms with scatterer concentrations ranging from 2 to 64 scatterers/mm 3 were made, and six stages of fatty liver (zero, one, two, four, six, and eight weeks) were induced in rats by methionine-choline-deficient diets (three rats for each stage, total n = 18). A mechanical scanning system with a 5-MHz focused single-element transducer was used for ultrasound radiofrequency data acquisition. The experimental results showed that 3D WMC Nakagami imaging was able to characterize different scatterer concentrations. Backscatter statistics were visualized with various numbers of frames; N = 5 reduced the estimation error of 3D WMC Nakagami imaging in visualizing the backscatter statistics. Compared with conventional 3D Nakagami imaging, 3D WMC Nakagami imaging improved the image smoothness without significant image resolution degradation, and it can thus be used for describing different stages of fatty liver in rats.
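
    A minimal 1-D sketch of the WMC idea, assuming simulated Rayleigh envelope data and a hypothetical pulse length in samples: Nakagami m maps are computed with sliding windows of 1 to N pulse lengths, using the moment estimator m = E[R^2]^2 / Var(R^2), and then averaged to form the compounded map.

        import numpy as np

        def nakagami_m(env2):
            """Moment estimator m = E[R^2]^2 / Var(R^2) from squared envelope."""
            var2 = env2.var()
            return env2.mean() ** 2 / var2 if var2 > 0 else np.inf

        def wmc_nakagami(envelope, pulse_len, n_frames=5):
            maps = []
            for k in range(1, n_frames + 1):
                w = k * pulse_len                       # window of k pulse lengths
                m_map = np.full(envelope.size, np.nan)
                for i in range(w, envelope.size - w):
                    m_map[i] = nakagami_m(envelope[i - w // 2 : i + w // 2 + 1] ** 2)
                maps.append(m_map)
            return np.nanmean(np.stack(maps), axis=0)   # compound the N frames

        rng = np.random.default_rng(3)
        env = np.abs(rng.normal(size=4000) + 1j * rng.normal(size=4000))  # Rayleigh
        print(np.nanmean(wmc_nakagami(env, pulse_len=20)))  # ~1 for Rayleigh speckle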

  19. Implementation of an F-statistic all-sky search for continuous gravitational waves in Virgo VSR1 data

    NASA Astrophysics Data System (ADS)

    Aasi, J.; Abbott, B. P.; Abbott, R.; Abbott, T.; Abernathy, M. R.; Accadia, T.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Affeldt, C.; Agathos, M.; Aggarwal, N.; Aguiar, O. D.; Ain, A.; Ajith, P.; Alemic, A.; Allen, B.; Allocca, A.; Amariutei, D.; Andersen, M.; Anderson, R.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C.; Areeda, J.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Austin, L.; Aylott, B. E.; Babak, S.; Baker, P. T.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barbet, M.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barton, M. A.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Bauchrowitz, J.; Bauer, Th S.; Behnke, B.; Bejger, M.; Beker, M. G.; Belczynski, C.; Bell, A. S.; Bell, C.; Bergmann, G.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Beyersdorf, P. T.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Biscans, S.; Bitossi, M.; Bizouard, M. A.; Black, E.; Blackburn, J. K.; Blackburn, L.; Blair, D.; Bloemen, S.; Blom, M.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, G.; Bogan, C.; Bond, C.; Bondu, F.; Bonelli, L.; Bonnand, R.; Bork, R.; Born, M.; Borkowski, K.; Boschi, V.; Bose, Sukanta; Bosi, L.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Bridges, D. O.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brückner, F.; Buchman, S.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Burman, R.; Buskulic, D.; Buy, C.; Cadonati, L.; Cagnoli, G.; Calderón Bustillo, J.; Calloni, E.; Camp, J. B.; Campsie, P.; Cannon, K. C.; Canuel, B.; Cao, J.; Capano, C. D.; Carbognani, F.; Carbone, L.; Caride, S.; Castiglia, A.; Caudill, S.; Cavalier, F.; Cavalieri, R.; Celerier, C.; Cella, G.; Cepeda, C.; Cesarini, E.; Chakraborty, R.; Chalermsongsak, T.; Chamberlin, S. J.; Chao, S.; Charlton, P.; Chassande Mottin, E.; Chen, X.; Chen, Y.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Chow, J.; Christensen, N.; Chu, Q.; Chua, S. S. Y.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Cleva, F.; Coccia, E.; Cohadon, P. F.; Colla, A.; Collette, C.; Colombini, M.; Cominsky, L.; Conte, A.; Cook, D.; Corbitt, T. R.; Cordier, M.; Cornish, N.; Corpuz, A.; Corsi, A.; Costa, C. A.; Coughlin, M. W.; Coughlin, S.; Coulon, J. P.; Countryman, S.; Couvares, P.; Coward, D. M.; Cowart, M.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dahl, K.; Dal Canton, T.; Damjanic, M.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Dattilo, V.; Daveloza, H.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; Dayanga, T.; Debreczeni, G.; Degallaix, J.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.; De Rosa, R.; DeRosa, R. T.; DeSalvo, R.; Dhurandhar, S.; Díaz, M.; Di Fiore, L.; Di Lieto, A.; Di Palma, I.; Di Virgilio, A.; Donath, A.; Donovan, F.; Dooley, K. L.; Doravari, S.; Dorosh, O.; Dossa, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Dwyer, S.; Eberle, T.; Edo, T.; Edwards, M.; Effler, A.; Eggenstein, H.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Endrőczi, G.; Essick, R.; Etzel, T.; Evans, M.; Evans, T.; Factourovich, M.; Fafone, V.; Fairhurst, S.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fehrmann, H.; Fejer, M. M.; Feldbaum, D.; Feroz, F.; Ferrante, I.; Ferrini, F.; Fidecaro, F.; Finn, L. S.; Fiori, I.; Fisher, R. P.; Flaminio, R.; Fournier, J. 
D.; Franco, S.; Frasca, S.; Frasconi, F.; Frede, M.; Frei, Z.; Freise, A.; Frey, R.; Fricke, T. T.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gair, J.; Gammaitoni, L.; Gaonkar, S.; Garufi, F.; Gehrels, N.; Gemme, G.; Genin, E.; Gennai, A.; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, C.; Gleason, J.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gordon, N.; Gorodetsky, M. L.; Gossan, S.; Goßler, S.; Gouaty, R.; Gräf, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greenhalgh, R. J. S.; Gretarsson, A. M.; Groot, P.; Grote, H.; Grover, K.; Grunewald, S.; Guidi, G. M.; Guido, C.; Gushwa, K.; Gustafson, E. K.; Gustafson, R.; Hammer, D.; Hammond, G.; Hanke, M.; Hanks, J.; Hanna, C.; Hanson, J.; Harms, J.; Harry, G. M.; Harry, I. W.; Harstad, E. D.; Hart, M.; Hartman, M. T.; Haster, C. J.; Haughian, K.; Heidmann, A.; Heintze, M.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Heptonstall, A. W.; Heurs, M.; Hewitson, M.; Hild, S.; Hoak, D.; Hodge, K. A.; Holt, K.; Hooper, S.; Hopkins, P.; Hosken, D. J.; Hough, J.; Howell, E. J.; Hu, Y.; Huerta, E.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh, M.; Huynh Dinh, T.; Ingram, D. R.; Inta, R.; Isogai, T.; Ivanov, A.; Iyer, B. R.; Izumi, K.; Jacobson, M.; James, E.; Jang, H.; Jaranowski, P.; Ji, Y.; Jiménez Forteza, F.; Johnson, W. W.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; K, Haris; Kalmus, P.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. B.; Karlen, J.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, H.; Kawabe, K.; Kawazoe, F.; Kéfélian, F.; Keiser, G. M.; Keitel, D.; Kelley, D. B.; Kells, W.; Khalaidovski, A.; Khalili, F. Y.; Khazanov, E. A.; Kim, C.; Kim, K.; Kim, N.; Kim, N. G.; Kim, Y. M.; King, E. J.; King, P. J.; Kinzel, D. L.; Kissel, J. S.; Klimenko, S.; Kline, J.; Koehlenbeck, S.; Kokeyama, K.; Kondrashov, V.; Koranda, S.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kremin, A.; Kringel, V.; Krishnan, B.; Królak, A.; Kuehn, G.; Kumar, A.; Kumar, P.; Kumar, R.; Kuo, L.; Kutynia, A.; Kwee, P.; Landry, M.; Lantz, B.; Larson, S.; Lasky, P. D.; Lawrie, C.; Lazzarini, A.; Lazzaro, C.; Leaci, P.; Leavey, S.; Lebigot, E. O.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Lee, J.; Leonardi, M.; Leong, J. R.; Le Roux, A.; Leroy, N.; Letendre, N.; Levin, Y.; Levine, B.; Lewis, J.; Li, T. G. F.; Libbrecht, K.; Libson, A.; Lin, A. C.; Littenberg, T. B.; Litvine, V.; Lockerbie, N. A.; Lockett, V.; Lodhia, D.; Loew, K.; Logue, J.; Lombardi, A. L.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J.; Lubinski, M. J.; Lück, H.; Luijten, E.; Lundgren, A. P.; Lynch, R.; Ma, Y.; Macarthur, J.; Macdonald, E. P.; MacDonald, T.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magana Sandoval, F.; Mageswaran, M.; Maglione, C.; Mailand, K.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Manca, G. M.; Mandel, I.; Mandic, V.; Mangano, V.; Mangini, N.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markosyan, A.; Maros, E.; Marque, J.; Martelli, F.; Martin, I. W.; Martin, R. M.; Martinelli, L.; Martynov, D.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Matichard, F.; Matone, L.; Matzner, R. A.; Mavalvala, N.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McGuire, S. C.; McIntyre, G.; McIver, J.; McLin, K.; Meacher, D.; Meadors, G. D.; Mehmet, M.; Meidam, J.; Meinders, M.; Melatos, A.; Mendell, G.; Mercer, R. A.; Meshkov, S.; Messenger, C.; Meyers, P.; Miao, H.; Michel, C.; Mikhailov, E. 
E.; Milano, L.; Milde, S.; Miller, J.; Minenkov, Y.; Mingarelli, C. M. F.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moe, B.; Moesta, P.; Mohan, M.; Mohapatra, S. R. P.; Moraru, D.; Moreno, G.; Morgado, N.; Morriss, S. R.; Mossavi, K.; Mours, B.; Lowry, C. M. Mow; Mueller, C. L.; Mueller, G.; Mukherjee, S.; Mullavey, A.; Munch, J.; Murphy, D.; Murray, P. G.; Mytidis, A.; Nagy, M. F.; Nanda Kumar, D.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Necula, V.; Nelemans, G.; Neri, I.; Neri, M.; Newton, G.; Nguyen, T.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E. N.; Nuttall, L. K.; Ochsner, E.; O'Dell, J.; Oelker, E.; Oh, J. J.; Oh, S. H.; Ohme, F.; Oppermann, P.; O'Reilly, B.; O'Shaughnessy, R.; Osthelder, C.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Padilla, C.; Pai, A.; Palashov, O.; Palomba, C.; Pan, H.; Pan, Y.; Pankow, C.; Paoletti, F.; Paoletti, R.; Papa, M. A.; Paris, H.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Pedraza, M.; Penn, S.; Perreca, A.; Phelps, M.; Pichot, M.; Pickenpack, M.; Piergiovanni, F.; Pierro, V.; Pietka, M.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poeld, J.; Poggiani, R.; Poteomkin, A.; Powell, J.; Prasad, J.; Premachandra, S.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Privitera, S.; Prodi, G. A.; Prokhorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Qin, J.; Quetschke, V.; Quintero, E.; Quiroga, G.; Quitzow James, R.; Raab, F. J.; Rabeling, D. S.; Rácz, I.; Radkins, H.; Raffai, P.; Raja, S.; Rajalakshmi, G.; Rakhmanov, M.; Ramet, C.; Ramirez, K.; Rapagnani, P.; Raymond, V.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Reid, S.; Reitze, D. H.; Rhoades, E.; Ricci, F.; Riles, K.; Robertson, N. A.; Robinet, F.; Rocchi, A.; Rodruck, M.; Rolland, L.; Rollins, J. G.; Romano, R.; Romanov, G.; Romie, J. H.; Rosińska, D.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Salemi, F.; Sammut, L.; Sandberg, V.; Sanders, J. R.; Sannibale, V.; Santiago Prieto, I.; Saracco, E.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Savage, R.; Scheuer, J.; Schilling, R.; Schnabel, R.; Schofield, R. M. S.; Schreiber, E.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, S. M.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Shaddock, D.; Shah, S.; Shahriar, M. S.; Shaltev, M.; Shapiro, B.; Shawhan, P.; Shoemaker, D. H.; Sidery, T. L.; Siellez, K.; Siemens, X.; Sigg, D.; Simakov, D.; Singer, A.; Singer, L.; Singh, R.; Sintes, A. M.; Slagmolen, B. J. J.; Slutsky, J.; Smith, J. R.; Smith, M.; Smith, R. J. E.; Smith Lefebvre, N. D.; Son, E. J.; Sorazu, B.; Souradeep, T.; Sperandio, L.; Staley, A.; Stebbins, J.; Steinlechner, J.; Steinlechner, S.; Stephens, B. C.; Steplewski, S.; Stevenson, S.; Stone, R.; Stops, D.; Strain, K. A.; Straniero, N.; Strigin, S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Susmithan, S.; Sutton, P. J.; Swinkels, B.; Tacca, M.; Talukder, D.; Tanner, D. B.; Tarabrin, S. P.; Taylor, R.; ter Braack, A. P. M.; Thirugnanasambandam, M. P.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Toncelli, A.; Tonelli, M.; Torre, O.; Torres, C. V.; Torrie, C. I.; Travasso, F.; Traylor, G.; Tse, M.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Urbanek, K.; Vahlbruch, H.; Vajente, G.; Valdes, G.; Vallisneri, M.; vanden Brand, J. F. J.; VanDen Broeck, C.; vander Putten, S.; vander Sluys, M. V.; van Heijningen, J.; van Veggel, A. A.; Vass, S.; Vasúth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. 
J.; Venkateswara, K.; Verkindt, D.; Verma, S. S.; Vetrano, F.; Viceré, A.; Finley, R. Vincent; Vinet, J. Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Vousden, W. D.; Vyachanin, S. P.; Wade, A.; Wade, L.; Wade, M.; Walker, M.; Wallace, L.; Wang, M.; Wang, X.; Ward, R. L.; Was, M.; Weaver, B.; Wei, L. W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.; Wessels, P.; West, M.; Westphal, T.; Wette, K.; Whelan, J. T.; White, D. J.; Whiting, B. F.; Wiesner, K.; Wilkinson, C.; Williams, K.; Williams, L.; Williams, R.; Williams, T.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M.; Winkler, W.; Wipf, C. C.; Wiseman, A. G.; Wittel, H.; Woan, G.; Worden, J.; Yablon, J.; Yakushin, I.; Yamamoto, H.; Yancey, C. C.; Yang, H.; Yang, Z.; Yoshida, S.; Yvert, M.; Zadrożny, A.; Zanolin, M.; Zendri, J. P.; Zhang, Fan; Zhang, L.; Zhao, C.; Zhu, X. J.; Zucker, M. E.; Zuraw, S.; Zweizig, J.

    2014-08-01

    We present an implementation of the F-statistic to carry out the first search in data from the Virgo laser interferometric gravitational wave detector for periodic gravitational waves from a priori unknown, isolated rotating neutron stars. We searched a frequency f0 range from 100 Hz to 1 kHz and a frequency-dependent spindown f1 range from −1.6(f0/100 Hz) × 10^−9 Hz s^−1 to zero. A large part of this frequency-spindown space was unexplored by any of the all-sky searches published so far. Our method consisted of a coherent search over two-day periods using the F-statistic, followed by a search for coincidences among the candidates from the two-day segments. We have introduced a number of novel techniques and algorithms that allow the use of the fast Fourier transform (FFT) algorithm in the coherent part of the search, resulting in a fifty-fold speed-up in computation of the F-statistic with respect to the algorithm used in the other pipelines. No significant gravitational wave signal was found. The sensitivity of the search was estimated by injecting signals into the data. In the most sensitive parts of the detector band, more than 90% of signals would have been detected with dimensionless gravitational-wave amplitude greater than 5 × 10^−24.

  20. Statistical scaling of pore-scale Lagrangian velocities in natural porous media.

    PubMed

    Siena, M; Guadagnini, A; Riva, M; Bijeljic, B; Pereira Nunes, J P; Blunt, M J

    2014-08-01

    We investigate the scaling behavior of sample statistics of pore-scale Lagrangian velocities in two different rock samples, Bentheimer sandstone and Estaillades limestone. The samples are imaged using x-ray computer tomography with micron-scale resolution. The scaling analysis relies on the study of the way qth-order sample structure functions (statistical moments of order q of absolute increments) of Lagrangian velocities depend on separation distances, or lags, traveled along the mean flow direction. In the sandstone block, sample structure functions of all orders exhibit a power-law scaling within a clearly identifiable intermediate range of lags. Sample structure functions associated with the limestone block display two diverse power-law regimes, which we infer to be related to two overlapping spatially correlated structures. In both rocks and for all orders q, we observe linear relationships between logarithmic structure functions of successive orders at all lags (a phenomenon that is typically known as extended power scaling, or extended self-similarity). The scaling behavior of Lagrangian velocities is compared with the one exhibited by porosity and specific surface area, which constitute two key pore-scale geometric observables. The statistical scaling of the local velocity field reflects the behavior of these geometric observables, with the occurrence of power-law-scaling regimes within the same range of lags for sample structure functions of Lagrangian velocity, porosity, and specific surface area.
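
    A minimal sketch of the scaling analysis on a toy 1-D signal: qth-order structure functions versus lag, a power-law exponent from a log-log fit, and the extended self-similarity check that log S_3 is linear in log S_2. The cumulative-sum "velocity" series is a stand-in for the pore-scale Lagrangian velocities, not the paper's data.

        import numpy as np

        def structure_function(v, lags, q):
            """S_q(l) = mean |v(x+l) - v(x)|^q along the series."""
            return np.array([np.mean(np.abs(v[lag:] - v[:-lag]) ** q) for lag in lags])

        rng = np.random.default_rng(4)
        v = np.cumsum(rng.normal(size=20000))          # toy correlated signal
        lags = np.unique(np.logspace(0, 3, 20).astype(int))

        s2 = structure_function(v, lags, 2)
        s3 = structure_function(v, lags, 3)

        zeta2 = np.polyfit(np.log(lags), np.log(s2), 1)[0]  # power-law exponent
        ess = np.polyfit(np.log(s2), np.log(s3), 1)[0]      # ESS slope ~ zeta3/zeta2
        print(f"zeta_2 ~ {zeta2:.2f}; ESS slope of S3 vs S2 ~ {ess:.2f}")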

  1. Long-range memory and non-Markov statistical effects in human sensorimotor coordination

    NASA Astrophysics Data System (ADS)

    Yulmetyev, Renat M.; Emelyanova, Natalya; Hänggi, Peter; Gafarov, Fail; Prokhorov, Alexander

    2002-12-01

    In this paper, the non-Markov statistical processes and long-range memory effects in human sensorimotor coordination are investigated. The theoretical basis of this study is the statistical theory of non-stationary discrete non-Markov processes in complex systems (Phys. Rev. E 62, 6178 (2000)). Human sensorimotor coordination was studied experimentally by means of a standard dynamical tapping test on a group of 32 young people, with tap numbers up to 400. This test was carried out separately for the right and the left hand according to the degree of domination of each brain hemisphere. The numerical analysis of the experimental results was made with the help of power spectra of the initial time correlation function, the memory functions of low orders, and the first three points of the statistical spectrum of the non-Markovity parameter. Our observations demonstrate that, with regard to the results of the standard dynamic tapping test, it is possible to divide all examinees into five different dynamic types. We have introduced the conflict coefficient to estimate quantitatively the order-disorder effects underlying life systems. The latter reflects the existence of an imbalance between nervous and motor coordination in humans. The suggested classification of the neurophysiological activity represents a dynamic generalization of the well-known neuropsychological types and provides a new approach in modern neuropsychology.

  2. Normal Distribution of CD8+ T-Cell-Derived ELISPOT Counts within Replicates Justifies the Reliance on Parametric Statistics for Identifying Positive Responses.

    PubMed

    Karulin, Alexey Y; Caspell, Richard; Dittrich, Marcus; Lehmann, Paul V

    2015-03-02

    Accurate assessment of positive ELISPOT responses for low frequencies of antigen-specific T-cells is controversial. In particular, it is still unknown whether ELISPOT counts within replicate wells follow a theoretical distribution function, and thus whether high-power parametric statistics can be used to discriminate between positive and negative wells. We studied experimental distributions of spot counts for up to 120 replicate wells of IFN-γ production by CD8+ T-cells responding to EBV LMP2A (426 - 434) peptide in human PBMC. The cells were tested in serial dilutions covering a wide range of average spot counts per condition, from just a few to hundreds of spots per well. Statistical analysis of the data using diagnostic Q-Q plots and the Shapiro-Wilk normality test showed that, in the entire dynamic range of ELISPOT, spot counts within replicate wells followed a normal distribution. This result implies that Student's t-test and ANOVA are suited to identify positive responses. We also show experimentally that borderline responses can be reliably detected by involving more replicate wells, plating higher numbers of PBMC, addition of IL-7, or a combination of these. Furthermore, we have experimentally verified that the number of replicates needed for detection of weak responses can be calculated using parametric statistics.
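
    A minimal sketch of the workflow this result justifies, with simulated Poisson-like well counts standing in for real ELISPOT data: check within-condition normality with the Shapiro-Wilk test, then compare antigen-stimulated and control replicates with a t-test.

        import numpy as np
        from scipy.stats import shapiro, ttest_ind

        rng = np.random.default_rng(5)
        control = rng.poisson(3, size=24).astype(float)  # medium-only replicates
        antigen = rng.poisson(8, size=24).astype(float)  # antigen-stimulated wells

        for name, wells in [("control", control), ("antigen", antigen)]:
            _, p = shapiro(wells)
            print(f"{name}: Shapiro-Wilk p = {p:.3f}")   # p > 0.05: ~normal

        _, p = ttest_ind(antigen, control, equal_var=False)
        print(f"Welch t-test p = {p:.2g}", "(positive)" if p < 0.05 else "(n.s.)")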

  3. Potential Mediators in Parenting and Family Intervention: Quality of Mediation Analyses

    PubMed Central

    Patel, Chandni C.; Fairchild, Amanda J.; Prinz, Ronald J.

    2017-01-01

    Parenting and family interventions have repeatedly shown effectiveness in preventing and treating a range of youth outcomes. Accordingly, investigators in this area have conducted a number of studies using statistical mediation to examine some of the potential mechanisms of action by which these interventions work. This review examined, from a methodological perspective, in what ways and how well the family-based intervention studies tested statistical mediation. A systematic search identified 73 published outcome studies that tested mediation for family-based interventions across a wide range of child and adolescent outcomes (i.e., externalizing, internalizing, and substance-abuse problems; high-risk sexual activity; and academic achievement), for putative mediators pertaining to positive and negative parenting, family functioning, youth beliefs and coping skills, and peer relationships. Taken as a whole, the studies used designs that adequately addressed temporal precedence. The majority of studies used the product of coefficients approach to mediation, which is preferred and less limiting than the causal steps approach. Statistical significance testing did not always make use of the most recently developed approaches, which would better accommodate small sample sizes and more complex functions. Specific recommendations are offered for future mediation studies in this area with respect to full longitudinal design, mediation approach, significance testing method, documentation and reporting of statistics, testing of multiple mediators, and control for Type I error. PMID:28028654

  4. Multivariate Analysis, Mass Balance Techniques, and Statistical Tests as Tools in Igneous Petrology: Application to the Sierra de las Cruces Volcanic Range (Mexican Volcanic Belt)

    PubMed Central

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate the tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures). PMID:24737994
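
    A minimal sketch of the binary-mixing mass balance, with hypothetical weight-percent compositions: the dacite fraction x in C_mix ≈ x·C_dacite + (1−x)·C_andesite is found by one-parameter least squares over the oxides, and the residuals indicate how well the mixture closes.

        import numpy as np

        oxides = ["SiO2", "Al2O3", "FeO", "MgO", "CaO"]
        dacite = np.array([65.0, 16.5, 4.5, 2.0, 4.5])    # wt%, hypothetical
        andesite = np.array([58.0, 17.5, 7.0, 4.0, 7.0])  # wt%, hypothetical
        mixed = np.array([62.3, 16.9, 5.4, 2.8, 5.5])     # comingled lava

        # C_mix - C_and = x * (C_dac - C_and): one-parameter least squares.
        d = dacite - andesite
        x = np.dot(d, mixed - andesite) / np.dot(d, d)
        resid = mixed - (x * dacite + (1 - x) * andesite)

        print(f"dacite fraction x = {x:.2f}")
        for ox, r in zip(oxides, resid):
            print(f"  {ox}: residual {r:+.2f} wt%")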

  5. Average absorption cross-section of the human body measured at 1-12 GHz in a reverberant chamber: results of a human volunteer study

    NASA Astrophysics Data System (ADS)

    Flintoft, I. D.; Robinson, M. P.; Melia, G. C. R.; Marvin, A. C.; Dawson, J. F.

    2014-07-01

    The electromagnetic absorption cross-section (ACS) averaged over polarization and angle-of-incidence of 60 ungrounded adult subjects was measured at microwave frequencies of 1-12 GHz in a reverberation chamber. Average ACS is important in non-ionizing dosimetry and exposure studies, and is closely related to the whole-body averaged specific absorption rate (WBSAR). The average ACS was measured with a statistical uncertainty of less than 3% and high frequency resolution for individuals with a range of body shapes and sizes allowing the statistical distribution of WBSAR over a real population with individual internal and external morphologies to be determined. The average ACS of all subjects was found to vary from 0.15 to 0.4 m2; for an individual subject it falls with frequency over 1-6 GHz, and then rises slowly over the 6-12 GHz range in which few other studies have been conducted. Average ACS and WBSAR are then used as a surrogate for worst-case ACS/WBSAR, in order to study their variability across a real population compared to literature results from simulations using numerical phantoms with a limited range of anatomies. Correlations with body morphological parameters such as height, mass and waist circumference have been investigated: the strongest correlation is with body surface area (BSA) at all frequencies above 1 GHz, however direct proportionality to BSA is not established until above 5 GHz. When the average ACS is normalized to the BSA, the resulting absorption efficiency shows a negative correlation with the estimated thickness of subcutaneous body fat. Surrogate models and statistical analysis of the measurement data are presented and compared to similar models from the literature. The overall dispersion of measured average WBSAR of the sample of the UK population studied is consistent with the dispersion of simulated worst-case WBSAR across multiple numerical phantom families. The statistical results obtained allow the calibration of human exposure assessments made with particular phantoms to a population with a range of individual morphologies.

  6. Average absorption cross-section of the human body measured at 1-12 GHz in a reverberant chamber: results of a human volunteer study.

    PubMed

    Flintoft, I D; Robinson, M P; Melia, G C R; Marvin, A C; Dawson, J F

    2014-07-07

    The electromagnetic absorption cross-section (ACS) averaged over polarization and angle-of-incidence of 60 ungrounded adult subjects was measured at microwave frequencies of 1-12 GHz in a reverberation chamber. Average ACS is important in non-ionizing dosimetry and exposure studies, and is closely related to the whole-body averaged specific absorption rate (WBSAR). The average ACS was measured with a statistical uncertainty of less than 3% and high frequency resolution for individuals with a range of body shapes and sizes allowing the statistical distribution of WBSAR over a real population with individual internal and external morphologies to be determined. The average ACS of all subjects was found to vary from 0.15 to 0.4 m(2); for an individual subject it falls with frequency over 1-6 GHz, and then rises slowly over the 6-12 GHz range in which few other studies have been conducted. Average ACS and WBSAR are then used as a surrogate for worst-case ACS/WBSAR, in order to study their variability across a real population compared to literature results from simulations using numerical phantoms with a limited range of anatomies. Correlations with body morphological parameters such as height, mass and waist circumference have been investigated: the strongest correlation is with body surface area (BSA) at all frequencies above 1 GHz, however direct proportionality to BSA is not established until above 5 GHz. When the average ACS is normalized to the BSA, the resulting absorption efficiency shows a negative correlation with the estimated thickness of subcutaneous body fat. Surrogate models and statistical analysis of the measurement data are presented and compared to similar models from the literature. The overall dispersion of measured average WBSAR of the sample of the UK population studied is consistent with the dispersion of simulated worst-case WBSAR across multiple numerical phantom families. The statistical results obtained allow the calibration of human exposure assessments made with particular phantoms to a population with a range of individual morphologies.

  7. Statistics in the pharmacy literature.

    PubMed

    Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R

    2004-09-01

    Research in statistical methods is essential for maintenance of high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals. We obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi(2) (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified to facilitate the appropriate appraisal and consequential utilization of the information available in research articles.

  8. Streamflow monitoring and statistics for development of water rights claims for Wild and Scenic Rivers, Owyhee Canyonlands Wilderness, Idaho, 2012

    USGS Publications Warehouse

    Wood, Molly S.; Fosness, Ryan L.

    2013-01-01

    The U.S. Geological Survey, in cooperation with the Bureau of Land Management (BLM), collected streamflow data in 2012 and estimated streamflow statistics for stream segments designated "Wild," "Scenic," or "Recreational" under the National Wild and Scenic Rivers System in the Owyhee Canyonlands Wilderness in southwestern Idaho. The streamflow statistics were used by BLM to develop and file a draft federal reserved water right claim in autumn 2012 to protect federally designated "outstanding remarkable values" in the stream segments. BLM determined that the daily mean streamflows that are equaled or exceeded 20 and 80 percent of the time during bimonthly periods (two periods per month), together with the bankfull streamflow, are important streamflow thresholds for maintaining outstanding remarkable values. Prior to this study, streamflow statistics estimated using available datasets and tools for the Owyhee Canyonlands Wilderness were inaccurate for use in the water rights claim. Streamflow measurements were made at varying intervals during February–September 2012 at 14 monitoring sites; 2 of the monitoring sites were equipped with telemetered streamgaging equipment. Synthetic streamflow records were created for 11 of the 14 monitoring sites using a partial-record method or a drainage-area-ratio method. Streamflow records were obtained directly from an operating, long-term streamgage at one monitoring site, and from discontinued streamgages at two monitoring sites. For 10 sites analyzed using the partial-record method, discrete measurements were related to daily mean streamflow at a nearby, telemetered "index" streamgage. Resulting regression equations were used to estimate daily mean and annual peak streamflow at the monitoring sites during the full period of record for the index sites. A synthetic streamflow record for Sheep Creek was developed using a drainage-area-ratio method, because measured streamflows did not relate well to any index site to allow use of the partial-record method. The synthetic and actual daily mean streamflow records were used to estimate daily mean streamflow that was exceeded 80, 50, and 20 percent of the time (80-, 50-, and 20-percent exceedances) for bimonthly and annual periods. Bankfull streamflow statistics were calculated by fitting the synthetic and actual annual peak streamflow records to a log Pearson Type III distribution using Bulletin 17B guidelines in the U.S. Geological Survey PeakFQ program. The coefficients of determination (R2) for the regressions between the monitoring and index sites ranged from 0.74 for Wickahoney Creek to 0.98 for the West Fork Bruneau River and Deep Creek. Confidence in computed streamflow statistics is highest for the East Fork Owyhee River and the West Fork Bruneau River on the basis of regression statistics, visual fit of the related data, and the range and number of streamflow measurements. Streamflow statistics for sites with the greatest uncertainty included Big Jacks, Little Jacks, Cottonwood, Wickahoney, and Sheep Creeks. The uncertainty in computed streamflow statistics was due to a number of factors that included the distance of index sites relative to monitoring sites, relatively low streamflow conditions that occurred during the study, and the limited number and range of streamflow measurements. However, the computed streamflow statistics are considered the best possible estimates given available datasets in the remote study area.
Streamflow measurements over a wider range of hydrologic and climatic conditions would improve the relations between streamflow characteristics at monitoring and index sites. Additionally, field surveys are needed to verify if the streamflows selected for the water rights claims are sufficient for maintaining outstanding remarkable values in the Wild and Scenic rivers included in the study.
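
    The bimonthly exceedance statistics described above are, at bottom, empirical percentiles of the daily mean streamflow record, and the drainage-area-ratio transfer is a simple scaling of an index-gage record. A minimal sketch of both computations, assuming a NumPy array of daily mean flows; the drainage areas and the unit scaling exponent are illustrative, not values from the study:

    ```python
    import numpy as np

    def exceedance_flows(daily_q, percents=(80, 50, 20)):
        """Daily mean streamflow equaled or exceeded the given percent of the time.

        The p-percent exceedance flow is the (100 - p)th percentile of the record.
        """
        return {p: np.percentile(daily_q, 100 - p) for p in percents}

    def drainage_area_ratio(q_index, area_index_km2, area_ungauged_km2, exponent=1.0):
        """Transfer flow from an index gage to an ungauged site by drainage-area ratio.

        exponent=1.0 assumes flow scales linearly with drainage area; regional
        studies often calibrate this exponent instead.
        """
        return q_index * (area_ungauged_km2 / area_index_km2) ** exponent

    # Synthetic log-normal daily flows, purely for illustration.
    rng = np.random.default_rng(0)
    daily_q = rng.lognormal(mean=2.0, sigma=0.8, size=365)
    print(exceedance_flows(daily_q))                       # 80-, 50-, 20-percent exceedances
    print(drainage_area_ratio(daily_q, 520.0, 210.0)[:3])  # scaled to the ungauged site
    ```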

  9. Comparison of unitary associations and probabilistic ranking and scaling as applied to Mesozoic radiolarians

    NASA Astrophysics Data System (ADS)

    Baumgartner, Peter O.

    A database on Middle Jurassic-Early Cretaceous radiolarians, consisting of first and final occurrences of 110 species in 226 samples from 43 localities, was used to compute Unitary Associations and probabilistic ranking and scaling (RASC) in order to test deterministic versus probabilistic quantitative biostratigraphic methods. Because the Mesozoic radiolarian fossil record is mainly dissolution-controlled, the sequence of events differs greatly from section to section. The scatter of local first and final appearances along a time scale is large compared to the species range; it is asymmetrical, with a maximum near the ends of the range, and it is non-random. Thus, these data do not satisfy the statistical assumptions made in ranking and scaling. Unitary Associations produce maximum ranges of the species relative to each other by stacking co-occurrence data from all sections and therefore compensate for the local dissolution effects. Ranking and scaling, based on the assumption of a normal random distribution of the events, produces average ranges which are for most species much shorter than the maximum UA ranges. There are, however, a number of species with similar ranges in both solutions. These species are believed to be the most dissolution-resistant and, therefore, the most reliable ones for the definition of biochronozones. The comparison of maximum and average ranges may be a powerful tool for testing the reliability of species for biochronology. Dissolution-controlled fossil data yield high crossover frequencies and therefore small, statistically insignificant interfossil distances. Scaling has not produced a useful sequence for this type of data.

  10. A survey of statistics in three UK general practice journals

    PubMed Central

    Rigby, Alan S; Armstrong, Gillian K; Campbell, Michael J; Summerton, Nick

    2004-01-01

    Background Many medical specialities have reviewed the statistical content of their journals. To our knowledge this has not been done in general practice. Given the main role of a general practitioner as a diagnostician, we thought it would be of interest to see whether the statistical methods reported reflect the diagnostic process. Methods Hand search of three UK journals of general practice, namely the British Medical Journal (general practice section), the British Journal of General Practice, and Family Practice, over a one-year period (1 January to 31 December 2000). Results A wide variety of statistical techniques were used. The most common methods included t-tests and Chi-squared tests. There were few articles reporting likelihood ratios and other useful diagnostic methods. There was evidence that the journals with the more thorough statistical review process reported a more complex and wider variety of statistical techniques. Conclusions The BMJ had a wider range and greater diversity of statistical methods than the other two journals. However, in all three journals there was a dearth of papers reflecting the diagnostic process. Across all three journals there were relatively few papers describing randomised controlled trials, thus recognising the difficulty of implementing this design in general practice. PMID:15596014

  11. Predicting and downscaling ENSO impacts on intraseasonal precipitation statistics in California: The 1997/98 event

    USGS Publications Warehouse

    Gershunov, A.; Barnett, T.P.; Cayan, D.R.; Tubbs, T.; Goddard, L.

    2000-01-01

    Three long-range forecasting methods have been evaluated for prediction and downscaling of seasonal and intraseasonal precipitation statistics in California. Full-statistical, hybrid dynamical-statistical, and full-dynamical approaches have been used to forecast El Niño-Southern Oscillation (ENSO)-related total precipitation, daily precipitation frequency, and average intensity anomalies during the January-March season. For El Niño winters, the hybrid approach emerges as the best performer, while La Niña forecasting skill is poor. The full-statistical forecasting method features reasonable forecasting skill for both La Niña and El Niño winters. The performance of the full-dynamical approach could not be evaluated as rigorously as that of the other two forecasting schemes. Although the full-dynamical forecasting approach is expected to outperform simpler forecasting schemes in the long run, evidence is presented to conclude that, at present, the full-dynamical forecasting approach is the least viable of the three, at least in California. The authors suggest that operational forecasting of any intraseasonal temperature, precipitation, or streamflow statistic derivable from the available records is possible now for ENSO-extreme years.

  12. On the fractal characterization of Paretian Poisson processes

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo I.; Sokolov, Igor M.

    2012-06-01

    Paretian Poisson processes are Poisson processes which are defined on the positive half-line, have maximal points, and are quantified by power-law intensities. Paretian Poisson processes are elemental in statistical physics, and are the bedrock of a host of power-law statistics ranging from Pareto's law to anomalous diffusion. In this paper we establish evenness-based fractal characterizations of Paretian Poisson processes. Considering an array of socioeconomic evenness-based measures of statistical heterogeneity, we show that, amongst the realm of Poisson processes which are defined on the positive half-line and have maximal points, Paretian Poisson processes are the unique class of 'fractal processes' exhibiting scale-invariance. The results established in this paper are diametric to previous results asserting that, with respect to physical randomness-based measures of statistical heterogeneity, the scale-invariance of Poisson processes is characterized by exponential Poissonian intensities.
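
    A Paretian Poisson process with intensity λ(x) = αx^(-α-1) can be simulated directly: if Γ1 < Γ2 < ... are the arrival times of a unit-rate Poisson process, then the transformed points Γk^(-1/α) form the desired process in decreasing order, with Γ1^(-1/α) the maximal point. A sketch under that construction (the truncation at n points is for illustration only):

    ```python
    import numpy as np

    def paretian_poisson_points(alpha, n, rng=None):
        """First n points (largest first) of a Poisson process on (0, inf)
        with power-law intensity lambda(x) = alpha * x**(-alpha - 1).

        Construction: map unit-rate Poisson arrival times Gamma_k through
        x = Gamma_k**(-1/alpha); the image is the desired process.
        """
        rng = rng or np.random.default_rng()
        gamma = np.cumsum(rng.exponential(1.0, size=n))  # unit-rate arrivals
        return gamma ** (-1.0 / alpha)                   # decreasing point sequence

    pts = paretian_poisson_points(alpha=1.5, n=10_000, rng=np.random.default_rng(1))
    print(pts[0])                          # the maximal point
    # Power-law check: the expected number of points above x is x**(-alpha).
    print((pts > 0.5).sum(), 0.5 ** -1.5)  # observed count vs. expectation
    ```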

  13. Self-organization of cosmic radiation pressure instability. II - One-dimensional simulations

    NASA Technical Reports Server (NTRS)

    Hogan, Craig J.; Woods, Jorden

    1992-01-01

    The clustering of statistically uniform discrete absorbing particles moving solely under the influence of radiation pressure from uniformly distributed emitters is studied in a simple one-dimensional model. Radiation pressure tends to amplify statistical clustering in the absorbers; the absorbing material is swept into empty bubbles, the biggest bubbles grow bigger almost as they would in a uniform medium, and the smaller ones get crushed and disappear. Numerical simulations of a one-dimensional system are used to support the conjecture that the system is self-organizing. Simple statistics indicate that a wide range of initial conditions produce structure approaching the same self-similar statistical distribution, whose scaling properties follow those of the attractor solution for an isolated bubble. The importance of the process for large-scale structuring of the interstellar medium is briefly discussed.

  14. Statistical significance test for transition matrices of atmospheric Markov chains

    NASA Technical Reports Server (NTRS)

    Vautard, Robert; Mo, Kingtse C.; Ghil, Michael

    1990-01-01

    Low-frequency variability of large-scale atmospheric dynamics can be represented schematically by a Markov chain of multiple flow regimes. This Markov chain contains useful information for the long-range forecaster, provided that the statistical significance of the associated transition matrix can be reliably tested. Monte Carlo simulation yields a very reliable significance test for the elements of this matrix. The results of this test agree with previously used empirical formulae when each cluster of maps identified as a distinct flow regime is sufficiently large and when they all contain a comparable number of maps. Monte Carlo simulation provides a more reliable way to test the statistical significance of transitions to and from small clusters. It can determine the most likely transitions, as well as the most unlikely ones, with a prescribed level of statistical significance.
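
    In practice, the Monte Carlo test amounts to comparing each observed transition count against its distribution under a null ensemble in which regime labels keep their frequencies but lose their temporal ordering. A minimal sketch using random permutations of the regime sequence as that ensemble (the sequence here is synthetic; a real application would use the classified flow-regime series):

    ```python
    import numpy as np

    def transition_counts(seq, k):
        """k x k matrix of observed one-step transitions in a regime sequence."""
        counts = np.zeros((k, k), dtype=int)
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1
        return counts

    def transition_pvalues(seq, k, n_sim=2000, rng=None):
        """One-sided Monte Carlo p-values: the probability, under random
        shuffling of the sequence, of a transition count at least as large
        as the one observed. Small p flags a significantly likely transition."""
        rng = rng or np.random.default_rng()
        observed = transition_counts(seq, k)
        exceed = np.zeros((k, k))
        for _ in range(n_sim):
            exceed += transition_counts(rng.permutation(seq), k) >= observed
        return exceed / n_sim

    rng = np.random.default_rng(2)
    seq = rng.integers(0, 3, size=500)  # synthetic 3-regime sequence
    print(transition_pvalues(seq, k=3, rng=rng).round(3))
    ```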

  15. Launch commit criteria performance trending analysis, phase 1, revision A. SRM and QA mission services

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An assessment is made of quantitative methods and measures for trending launch commit criteria (LCC) performance. A statistical performance trending analysis pilot study was processed and compared to STS-26 mission data. This study used four selected shuttle measurement types (solid rocket booster, external tank, space shuttle main engine, and range safety switch safe and arm device) from the five missions prior to mission 51-L. After obtaining raw data coordinates, each set of measurements was processed to obtain statistical confidence bounds and mean data profiles for each of the selected measurement types. STS-26 measurements were compared to the statistical database profiles to verify the statistical capability of assessing occurrences of data trend anomalies and abnormal time-varying operational conditions associated with data amplitude and phase shifts.

  16. Statistics of the residual refraction errors in laser ranging data

    NASA Technical Reports Server (NTRS)

    Gardner, C. S.

    1977-01-01

    A theoretical model for the range error covariance was derived by assuming that the residual refraction errors are due entirely to errors in the meteorological data which are used to calculate the atmospheric correction. The properties of the covariance function are illustrated by evaluating the theoretical model for the special case of a dense network of weather stations uniformly distributed within a circle.

  17. Report on the lunar ranging at McDonald Observatory, 1 February - 31 May 1976

    NASA Technical Reports Server (NTRS)

    Palm, C. S.; Wiant, J. R.

    1976-01-01

    The four spring lunations produced 105 acquisitions, including the 2000th range measurement made at McDonald Observatory. Statistics were normal for the spring months. Laser and electronics problems are noted. The Loran-C station delay was corrected. Preliminary doubles data is shown. New magnetic tape data formats are presented. R and D efforts include a new laser modification design.

  18. Sediment oxygen demand in the lower Willamette River, Oregon, 1994

    USGS Publications Warehouse

    Caldwell, James M.; Doyle, Micelis C.

    1995-01-01

    Sediment samples were collected near each chamber and analyzed for percent water, percent sand, and percent organics. The sand content ranged from 0.1 to 6.2 percent and averaged 1.8 percent. The organic content ranged from 1.4 to 9.6 percent and averaged 5.6 percent. No statistically significant correlations were found between these sediment characteristics and sediment oxygen demand.

  19. Measurements of small-scale statistics and probability density functions in passively heated shear flow

    NASA Astrophysics Data System (ADS)

    Ferchichi, Mohsen

    This study is an experimental investigation consisting of two parts. In the first part, the fine structure of uniformly sheared turbulence was investigated within the framework of Kolmogorov's (1941) similarity hypotheses. The second part consisted of a study of scalar mixing in uniformly sheared turbulence with an imposed mean scalar gradient, with the emphasis on measurements relevant to the probability density function formulation and on scalar derivative statistics. The velocity fine structure was inferred from statistics of the streamwise and transverse derivatives of the streamwise velocity, as well as velocity differences and structure functions, measured with hot wire anemometry for turbulence Reynolds numbers, Reλ, in the range between 140 and 660. The streamwise derivative skewness and flatness agreed with previously reported results in that they increased with increasing Reλ, with the flatness increasing at a higher rate. The skewness of the transverse derivative decreased with increasing Reλ, and the flatness of this derivative increased with Reλ but at a lower rate than the streamwise derivative flatness. The high-order (up to sixth) transverse structure functions of the streamwise velocity showed the same trends as the corresponding streamwise structure functions. In the second part of this experimental study, an array of heated ribbons was introduced into the flow to produce a constant mean temperature gradient, such that the temperature acted as a passive scalar. The Reλ in this study varied from 184 to 253. Cold wire thermometry and hot wire anemometry were used for simultaneous measurements of temperature and velocity. The scalar pdf was found to be nearly Gaussian. Various tests of joint statistics of the scalar and its rate of destruction revealed that the scalar dissipation rate was essentially independent of the scalar value. The measured joint statistics of the scalar and the velocity suggested that they were nearly jointly normal and that the normalized conditional expectations varied linearly with the scalar, with slopes corresponding to the scalar-velocity correlation coefficients. Finally, the measured streamwise and transverse scalar derivatives and differences revealed that the scalar fine structure was intermittent not only in the dissipative range, but in the inertial range as well.

  20. Using the U.S. Geological Survey National Water Quality Laboratory LT-MDL to Evaluate and Analyze Data

    USGS Publications Warehouse

    Bonn, Bernadine A.

    2008-01-01

    A long-term method detection level (LT-MDL) and laboratory reporting level (LRL) are used by the U.S. Geological Survey's National Water Quality Laboratory (NWQL) when reporting results from most chemical analyses of water samples. Changing to this method provided data users with additional information about their data and often resulted in more reported values in the low concentration range. Before this method was implemented, many of these values would have been censored. The use of the LT-MDL and LRL presents some challenges for the data user. Interpreting data in the low concentration range increases the need for adequate quality assurance because even small contamination or recovery problems can be relatively large compared to concentrations near the LT-MDL and LRL. In addition, the definition of the LT-MDL, as well as the inclusion of low values, can result in complex data sets with multiple censoring levels and reported values that are less than a censoring level. Improper interpretation or statistical manipulation of low-range results in these data sets can result in bias and incorrect conclusions. This document is designed to help data users use and interpret data reported with the LT-MDL/LRL method. The calculation and application of the LT-MDL and LRL are described. This document shows how to extract statistical information from the LT-MDL and LRL and how to use that information in USGS investigations, such as assessing the quality of field data, interpreting field data, and planning data collection for new projects. A set of 19 detailed examples is included in this document to help data users think about their data and properly interpret low-range data without introducing bias. Although this document is not meant to be a comprehensive resource of statistical methods, several useful methods of analyzing censored data are demonstrated, including Regression on Order Statistics and Kaplan-Meier Estimation. These two statistical methods handle complex censored data sets without resorting to substitution, thereby avoiding a common source of bias and inaccuracy.
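
    Regression on Order Statistics, one of the two methods named above, fits a distribution to the detected values' normal scores and models the censored fraction from that fit instead of substituting arbitrary values. A deliberately simplified sketch for a single censoring level and a lognormal assumption (real NWQL data sets with multiple reporting levels need the full ROS plotting-position machinery):

    ```python
    import numpy as np
    from scipy import stats

    def simple_ros_mean(detects, n_censored, reporting_level):
        """Simplified Regression on Order Statistics, one censoring level.

        Fits a lognormal to the detected values via normal scores of their
        plotting positions, imputes the censored observations from the fitted
        lower tail, and returns the mean of the combined data set. Assumes
        every censored value ranks below every detected value.
        """
        n = len(detects) + n_censored
        # Plotting positions: censored values occupy the lowest ranks.
        pp_detect = np.arange(n_censored + 1, n + 1) / (n + 1.0)
        z = stats.norm.ppf(pp_detect)
        slope, intercept, *_ = stats.linregress(z, np.log(np.sort(detects)))
        z_cens = stats.norm.ppf(np.arange(1, n_censored + 1) / (n + 1.0))
        imputed = np.exp(intercept + slope * z_cens)
        imputed = np.minimum(imputed, reporting_level)  # safeguard: stay below RL
        return np.concatenate([imputed, detects]).mean()

    detects = np.array([0.8, 1.1, 1.5, 2.3, 3.9, 6.2])  # invented concentrations
    print(simple_ros_mean(detects, n_censored=4, reporting_level=0.5))
    ```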

  1. Acetabular revisions using porous tantalum components: A retrospective study with 5-10 years follow-up

    PubMed Central

    Evola, Francesco Roberto; Costarella, Luciano; Evola, Giuseppe; Barchitta, Martina; Agodi, Antonella; Sessa, Giuseppe

    2017-01-01

    AIM To evaluate the clinical and X-ray results of acetabular components and tantalum augments in prosthetic hip revisions. METHODS Fifty-eight hip prostheses with primary failure of the acetabular component were revised with tantalum implants. The clinical records and X-rays of these cases were retrospectively reviewed. Bone defect evaluations were based on preoperative CT scans and classified according to the Paprosky criteria; radiolucent lines, periprosthetic gaps, implant mobilization, and osteolysis were evaluated by X-ray. An ad hoc database was created and statistical analyses were performed with SPSS software (IBM SPSS Statistics for Windows, version 23.0). Statistical analyses were carried out using the Student’s t test for independent and paired samples. A P value of < 0.05 was considered statistically significant, and cumulative survival was calculated by the Kaplan-Meier method. RESULTS The mean follow-up was 87.6 ± 25.6 mo (range 3-120 mo). Twenty-five cases (43.1%) were classified as minor defects, and 33 cases (56.9%) as major defects. The preoperative HHS rating improved significantly from a mean of 40.7 ± 6.1 (range: 29-53) before revision to a mean of 85.8 ± 6.1 (range: 70-94) at the end of the follow-up (Student’s t test for paired samples: P < 0.001). Considering HHS only at the end of follow-up, no statistically significant difference was observed between patients with a major or minor defect (Student’s t test for independent samples: P > 0.05). Radiolucent lines were found in 4 implants (6.9%). Postoperative acetabular gaps were observed in 5 hips (8.6%). No signs of implant mobilization or areas of periprosthetic osteolysis were found in the X-rays at the final follow-up. Only 3 implants failed: 1 case of infection and 2 cases of instability. With failure defined as the end-point, cumulative survival at 10 years was 95% for failure for any reason and 100% for aseptic loosening of the acetabular component. CONCLUSION The medium-term use of prosthetic tantalum components in prosthetic hip revisions is safe and effective in a wide variety of acetabular bone defects. PMID:28808626

  2. Acetabular revisions using porous tantalum components: A retrospective study with 5-10 years follow-up.

    PubMed

    Evola, Francesco Roberto; Costarella, Luciano; Evola, Giuseppe; Barchitta, Martina; Agodi, Antonella; Sessa, Giuseppe

    2017-07-18

    To evaluate the clinical and X-ray results of acetabular components and tantalum augments in prosthetic hip revisions. Fifty-eight hip prostheses with primary failure of the acetabular component were revised with tantalum implants. The clinical records and X-rays of these cases were retrospectively reviewed. Bone defect evaluations were based on preoperative CT scans and classified according to the Paprosky criteria; radiolucent lines, periprosthetic gaps, implant mobilization, and osteolysis were evaluated by X-ray. An ad hoc database was created and statistical analyses were performed with SPSS software (IBM SPSS Statistics for Windows, version 23.0). Statistical analyses were carried out using the Student's t test for independent and paired samples. A P value of < 0.05 was considered statistically significant, and cumulative survival was calculated by the Kaplan-Meier method. The mean follow-up was 87.6 ± 25.6 mo (range 3-120 mo). Twenty-five cases (43.1%) were classified as minor defects, and 33 cases (56.9%) as major defects. The preoperative HHS rating improved significantly from a mean of 40.7 ± 6.1 (range: 29-53) before revision to a mean of 85.8 ± 6.1 (range: 70-94) at the end of the follow-up (Student's t test for paired samples: P < 0.001). Considering HHS only at the end of follow-up, no statistically significant difference was observed between patients with a major or minor defect (Student's t test for independent samples: P > 0.05). Radiolucent lines were found in 4 implants (6.9%). Postoperative acetabular gaps were observed in 5 hips (8.6%). No signs of implant mobilization or areas of periprosthetic osteolysis were found in the X-rays at the final follow-up. Only 3 implants failed: 1 case of infection and 2 cases of instability. With failure defined as the end-point, cumulative survival at 10 years was 95% for failure for any reason and 100% for aseptic loosening of the acetabular component. The medium-term use of prosthetic tantalum components in prosthetic hip revisions is safe and effective in a wide variety of acetabular bone defects.

  3. PERSEUS QC: preparing statistic data sets

    NASA Astrophysics Data System (ADS)

    Belokopytov, Vladimir; Khaliulin, Alexey; Ingerov, Andrey; Zhuk, Elena; Gertman, Isaac; Zodiatis, George; Nikolaidis, Marios; Nikolaidis, Andreas; Stylianou, Stavros

    2017-09-01

    The Desktop Oceanographic Data Processing Module was developed for visual analysis of interdisciplinary cruise measurements. The program provides the possibility of data selection based on different criteria, map plotting, horizontal sea sections, and vertical sea depth profiles. The data selection in the area of interest can be specified according to a set of different physical and chemical parameters, complemented by additional parameters such as the cruise number, ship name, and time period. The visual analysis of a set of vertical profiles in the selected area makes it possible to determine the quality of the data and the location and time of the in-situ measurements, and to exclude any questionable data from the statistical analysis. For each selected set of profiles, the average vertical profile, the minimal and maximal values of the parameter under examination, and the root mean square (r.m.s.) are estimated. These estimates are compared with the parameter ranges set for each sub-region by the MEDAR/MEDATLAS-II and SeaDataNet2 projects. In the framework of the PERSEUS project, certain parameters which lacked a range were calculated from scratch, while some of the previously used ranges were re-defined using more comprehensive data sets based on the SeaDataNet2, SESAME, and PERSEUS projects. In some cases we have used additional sub-regions to redefine the ranges more precisely. The recalculated ranges are used to improve the PERSEUS Data Quality Control.
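
    At its core, the range-based quality control described here flags values falling outside per-subregion, per-parameter climatological bounds and summarizes each selected profile set. A minimal sketch; the region names, parameter bounds, and example profiles are all illustrative stand-ins for the MEDAR/MEDATLAS-II and SeaDataNet2 range tables:

    ```python
    import numpy as np

    # Illustrative (min, max) ranges per (sub-region, parameter).
    RANGES = {("levantine", "temperature"): (12.0, 30.0),
              ("levantine", "salinity"): (38.0, 39.9)}

    def qc_flags(values, region, parameter):
        """True where a value falls outside the sub-regional climatological range."""
        lo, hi = RANGES[(region, parameter)]
        values = np.asarray(values)
        return (values < lo) | (values > hi)

    def profile_stats(profiles):
        """Average profile, extremes, and r.m.s. over a set of vertical profiles
        (rows = profiles, columns = depth levels), as computed per selected set."""
        profiles = np.asarray(profiles)
        return {"mean": profiles.mean(axis=0), "min": profiles.min(axis=0),
                "max": profiles.max(axis=0), "rms": profiles.std(axis=0)}

    temps = [[18.2, 16.1, 14.9], [18.6, 16.4, 15.1], [31.5, 16.0, 15.0]]
    print(qc_flags(temps, "levantine", "temperature"))  # flags the 31.5 outlier
    print(profile_stats(temps)["mean"])
    ```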

  4. Online incidental statistical learning of audiovisual word sequences in adults: a registered report.

    PubMed

    Kuppuraj, Sengottuvel; Duta, Mihaela; Thompson, Paul; Bishop, Dorothy

    2018-02-01

    Statistical learning has been proposed as a key mechanism in language learning. Our main goal was to examine whether adults are capable of simultaneously extracting statistical dependencies in a task where stimuli include a range of structures amenable to statistical learning within a single paradigm. We devised an online statistical learning task using real word auditory-picture sequences that vary in two dimensions: (i) predictability and (ii) adjacency of dependent elements. This task was followed by an offline recall task to probe learning of each sequence type. We registered three hypotheses with specific predictions. First, adults would extract regular patterns from a continuous stream (effect of grammaticality). Second, within grammatical conditions, they would show differential speeding up for each condition as a function of the statistical complexity of the condition and of exposure. Third, our novel approach to measuring online statistical learning would be reliable in showing individual differences in statistical learning ability. Further, we explored the relation between statistical learning and a measure of verbal short-term memory (STM). Forty-two participants were tested and retested after an interval of at least 3 days on our novel statistical learning task. We analysed the reaction time data using a novel regression discontinuity approach. Consistent with predictions, participants showed a grammaticality effect, agreeing with the predicted order of difficulty for learning different statistical structures. Furthermore, a learning index from the task showed acceptable test-retest reliability (r = 0.67). However, STM did not correlate with statistical learning. We discuss the findings, noting the benefits of online measures in tracking the learning process.

  5. Online incidental statistical learning of audiovisual word sequences in adults: a registered report

    PubMed Central

    Duta, Mihaela; Thompson, Paul

    2018-01-01

    Statistical learning has been proposed as a key mechanism in language learning. Our main goal was to examine whether adults are capable of simultaneously extracting statistical dependencies in a task where stimuli include a range of structures amenable to statistical learning within a single paradigm. We devised an online statistical learning task using real word auditory–picture sequences that vary in two dimensions: (i) predictability and (ii) adjacency of dependent elements. This task was followed by an offline recall task to probe learning of each sequence type. We registered three hypotheses with specific predictions. First, adults would extract regular patterns from a continuous stream (effect of grammaticality). Second, within grammatical conditions, they would show differential speeding up for each condition as a function of the statistical complexity of the condition and of exposure. Third, our novel approach to measuring online statistical learning would be reliable in showing individual differences in statistical learning ability. Further, we explored the relation between statistical learning and a measure of verbal short-term memory (STM). Forty-two participants were tested and retested after an interval of at least 3 days on our novel statistical learning task. We analysed the reaction time data using a novel regression discontinuity approach. Consistent with predictions, participants showed a grammaticality effect, agreeing with the predicted order of difficulty for learning different statistical structures. Furthermore, a learning index from the task showed acceptable test–retest reliability (r = 0.67). However, STM did not correlate with statistical learning. We discuss the findings, noting the benefits of online measures in tracking the learning process. PMID:29515876

  6. Alignments of parity even/odd-only multipoles in CMB

    NASA Astrophysics Data System (ADS)

    Aluri, Pavan K.; Ralston, John P.; Weltman, Amanda

    2017-12-01

    We compare the statistics of parity even and odd multipoles of the cosmic microwave background (CMB) sky from Planck full mission temperature measurements. An excess power in odd multipoles compared to even multipoles has previously been found on large angular scales. Motivated by this apparent parity asymmetry, we evaluate directional statistics associated with even compared to odd multipoles, along with their significances. Primary tools are the Power tensor and Alignment tensor statistics. We limit our analysis to the first 60 multipoles i.e. l = [2, 61]. We find no evidence for statistically unusual alignments of even parity multipoles. More than one independent statistic finds evidence for alignments of anisotropy axes of odd multipoles, with a significance equivalent to ∼2σ or more. The robustness of alignment axes is tested by making Galactic cuts and varying the multipole range. Very interestingly, the region spanned by the (a)symmetry axes is found to broadly contain other parity (a)symmetry axes previously observed in the literature.

  7. Sparse approximation of currents for statistics on curves and surfaces.

    PubMed

    Durrleman, Stanley; Pennec, Xavier; Trouvé, Alain; Ayache, Nicholas

    2008-01-01

    Computing, processing, and visualizing statistics on shapes like curves or surfaces is a real challenge, with many applications ranging from medical image analysis to computational geometry. Modelling such geometrical primitives with currents avoids feature-based approaches as well as point-correspondence methods. This framework has proved powerful for registering brain surfaces and for measuring geometrical invariants. However, while state-of-the-art methods perform pairwise registrations efficiently, new numerical schemes are required to process groupwise statistics, due to the increasing complexity as the size of the database grows. Statistics such as the mean and principal modes of a set of shapes often have a heavy and highly redundant representation. We therefore propose to find an adapted basis on which the mean and principal modes have a sparse decomposition. Besides the computational improvement, this sparse representation offers a way to visualize and interpret statistics on currents. Experiments show the relevance of the approach on 34 sets of 70 sulcal lines and on 50 sets of 10 meshes of deep brain structures.

  8. Evaluating Random Error in Clinician-Administered Surveys: Theoretical Considerations and Clinical Applications of Interobserver Reliability and Agreement.

    PubMed

    Bennett, Rebecca J; Taljaard, Dunay S; Olaithe, Michelle; Brennan-Jones, Chris; Eikelboom, Robert H

    2017-09-18

    The purpose of this study is to raise awareness of interobserver concordance and the differences between interobserver reliability and agreement when evaluating the responsiveness of a clinician-administered survey and, specifically, to demonstrate the clinical implications of data types (nominal/categorical, ordinal, interval, or ratio) and statistical index selection (for example, Cohen's kappa, Krippendorff's alpha, or intraclass correlation). In this prospective cohort study, 3 clinical audiologists, who were masked to each other's scores, administered the Practical Hearing Aid Skills Test-Revised to 18 adult owners of hearing aids. Interobserver concordance was examined using a range of reliability and agreement statistical indices. The importance of selecting statistical measures of concordance was demonstrated with a worked example, wherein the level of interobserver concordance achieved varied from "no agreement" to "almost perfect agreement" depending on the data type and statistical index selected. This study demonstrates that the methodology used to evaluate survey score concordance can influence the statistical results obtained and thus affect clinical interpretations.
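
    The worked example above turns on the fact that different concordance indices answer different questions. Cohen's kappa, one of the indices named, corrects raw percent agreement for the agreement expected by chance alone. A minimal two-rater sketch on nominal data (the ratings are invented):

    ```python
    import numpy as np

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters on nominal categories:
        kappa = (p_observed - p_chance) / (1 - p_chance)."""
        a, b = np.asarray(rater_a), np.asarray(rater_b)
        categories = np.union1d(a, b)
        p_obs = np.mean(a == b)
        # Chance agreement: product of the raters' marginal proportions, summed.
        p_chance = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
        return (p_obs - p_chance) / (1.0 - p_chance)

    rater_a = [1, 1, 2, 2, 3, 3, 1, 2]
    rater_b = [1, 1, 2, 3, 3, 3, 1, 1]
    print(cohens_kappa(rater_a, rater_b))  # ~0.63, although raw agreement is 0.75
    ```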

  9. Statistical Analysis of Compressive and Flexural Test Results on the Sustainable Adobe Reinforced with Steel Wire Mesh

    NASA Astrophysics Data System (ADS)

    Jokhio, Gul A.; Syed Mohsin, Sharifah M.; Gul, Yasmeen

    2018-04-01

    It has been established that Adobe provides, in addition to being sustainable and economical, better indoor air quality without the extensive energy consumption of modern synthetic materials. The material, however, suffers from weak structural behaviour when subjected to adverse loading conditions. A wide range of mechanical properties has been reported in the literature owing to a lack of research and standardization. The present paper presents a statistical analysis of the results obtained through compressive and flexural tests on Adobe samples. Adobe specimens with and without wire mesh reinforcement were tested and the results were reported. It has been found that the compressive strength of Adobe increases by about 43% after adding a single layer of wire mesh reinforcement, and this increase is statistically significant. The flexural response of Adobe also improved with the addition of wire mesh reinforcement; however, the statistical significance of this improvement could not be established.

  10. Proposal for a biometrics of the cortical surface: a statistical method for relative surface distance metrics

    NASA Astrophysics Data System (ADS)

    Bookstein, Fred L.

    1995-08-01

    Recent advances in computational geometry have greatly extended the range of neuroanatomical questions that can be approached by rigorous quantitative methods. One of the major current challenges in this area is to describe the variability of human cortical surface form and its implications for individual differences in neurophysiological functioning. Existing techniques for representation of stochastically invaginated surfaces do not conduce to the necessary parametric statistical summaries. In this paper, following a hint from David Van Essen and Heather Drury, I sketch a statistical method customized for the constraints of this complex data type. Cortical surface form is represented by its Riemannian metric tensor and averaged according to parameters of a smooth averaged surface. Sulci are represented by integral trajectories of the smaller principal strains of this metric, and their statistics follow the statistics of that relative metric. The diagrams visualizing this tensor analysis look like alligator leather but summarize all aspects of cortical surface form in between the principal sulci, the reliable ones; no flattening is required.

  11. Extreme-value statistics of work done in stretching a polymer in a gradient flow.

    PubMed

    Vucelja, M; Turitsyn, K S; Chertkov, M

    2015-02-01

    We analyze the statistics of work generated by a gradient flow to stretch a nonlinear polymer. We obtain the large deviation function (LDF) of the work in the full range of appropriate parameters by combining analytical and numerical tools. The LDF shows two distinct asymptotes: "near tails" are linear in work and dominated by coiled polymer configurations, while "far tails" are quadratic in work and correspond to preferentially fully stretched polymers. We find the extreme value statistics of work for several singular elastic potentials, as well as the mean and the dispersion of work near the coil-stretch transition. The dispersion shows a maximum at the transition.

  12. Energy-density field approach for low- and medium-frequency vibroacoustic analysis of complex structures using a statistical computational model

    NASA Astrophysics Data System (ADS)

    Kassem, M.; Soize, C.; Gagliardini, L.

    2009-06-01

    In this paper, an energy-density field approach applied to the vibroacoustic analysis of complex industrial structures in the low- and medium-frequency ranges is presented. This approach uses a statistical computational model. The analyzed system consists of an automotive vehicle structure coupled with its internal acoustic cavity. The objective of this paper is to make use of the statistical properties of the frequency response functions of the vibroacoustic system observed from previous experimental and numerical work. The frequency response functions are expressed in terms of a dimensionless matrix which is estimated using the proposed energy approach. Using this dimensionless matrix, a simplified vibroacoustic model is proposed.

  13. Natural Gas Pipeline Statistics

    DOT National Transportation Integrated Search

    1980-01-01

    Federal regulation CFR 49, part 191 requires that all gas pipeline operators file annual reports with the U.S. Department of Transportation's Materials Transportation Bureau. These reports contain a wide range of safety and operational data involving...

  14. Forest statistics for eastern Oregon, 1977.

    Treesearch

    Thomas O. Farrenkopf

    1982-01-01

    This report summarizes a 1977 inventory of timber resources in 17 Oregon counties east of the crest of the Cascade Range. Detailed data on forest area, timber volume, growth, mortality, and harvest are presented.

  15. Inactivation disinfection property of Moringa Oleifera seed extract: optimization and kinetic studies

    NASA Astrophysics Data System (ADS)

    Idris, M. A.; Jami, M. S.; Hammed, A. M.

    2017-05-01

    This paper presents a statistical optimization study of the disinfection inactivation parameters of defatted Moringa oleifera seed extract on Pseudomonas aeruginosa bacterial cells. A three-level factorial design was used to estimate the optimum range, and a kinetic study of the inactivation process was also carried out. The inactivation process was analyzed by comparing different disinfection models: the Chick-Watson, Collins-Selleck, and Hom models. The results from the analysis of variance (ANOVA) of the statistical optimization process revealed that only contact time was significant. The optimum disinfection conditions for the seed extract were 125 mg/L, 30 minutes, and 120 rpm agitation. At the optimum dose, the inactivation kinetics followed the Collins-Selleck model with a coefficient of determination (R2) of 0.6320. This study is the first of its kind in determining the inactivation kinetics of Pseudomonas aeruginosa using the defatted seed extract.
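
    The Chick-Watson model referenced above predicts log-linear inactivation, ln(N/N0) = -k C^n t, for disinfectant concentration C and contact time t. Taking logarithms twice makes both the rate constant k and the dilution coefficient n estimable by ordinary least squares; a sketch with invented survival data:

    ```python
    import numpy as np

    def fit_chick_watson(C, t, survival):
        """Fit ln(N/N0) = -k * C**n * t via the linearization
        log(-ln(S)) - log(t) = log(k) + n * log(C)   (requires S < 1)."""
        y = np.log(-np.log(survival)) - np.log(t)
        A = np.column_stack([np.ones_like(C), np.log(C)])
        (log_k, n), *_ = np.linalg.lstsq(A, y, rcond=None)
        return np.exp(log_k), n

    # Invented survival fractions at several doses (mg/L) and times (min).
    C = np.array([50.0, 50.0, 125.0, 125.0, 200.0])
    t = np.array([10.0, 30.0, 10.0, 30.0, 10.0])
    S = np.array([0.60, 0.22, 0.28, 0.02, 0.12])
    k, n = fit_chick_watson(C, t, S)
    print(f"k = {k:.3g}, n = {n:.2f}")
    ```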

  16. Nonlinear multi-analysis of agent-based financial market dynamics by epidemic system

    NASA Astrophysics Data System (ADS)

    Lu, Yunfan; Wang, Jun; Niu, Hongli

    2015-10-01

    Based on an epidemic dynamical system, we construct a new agent-based financial time series model. To check and verify its rationality, we compare the statistical properties of the time series model with those of the real stock market indices, the Shanghai Stock Exchange Composite Index and the Shenzhen Stock Exchange Component Index. For analyzing the statistical properties, we combine multi-parameter analysis with tail distribution analysis, modified rescaled range analysis, and multifractal detrended fluctuation analysis. For a better perspective, three-dimensional diagrams are used to present the analysis results. The empirical research in this paper indicates that the long-range dependence property and the multifractal phenomenon exist in both the real returns and the proposed model. Therefore, the new agent-based financial model can reproduce some important features of real stock markets.
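
    Of the three diagnostics combined above, rescaled range analysis is the most compact to illustrate: the Hurst exponent H is the slope of log(R/S) against log(window length), with H > 0.5 indicating long-range dependence. A sketch of the classical statistic (the modified version of Lo adds a lag-weighted variance correction, omitted here):

    ```python
    import numpy as np

    def rescaled_range(x, window):
        """Average R/S statistic over non-overlapping windows of the series."""
        rs = []
        for start in range(0, len(x) - window + 1, window):
            seg = x[start:start + window]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviations from the mean
            s = seg.std()
            if s > 0:
                rs.append((dev.max() - dev.min()) / s)
        return np.mean(rs)

    def hurst_exponent(x, windows=(16, 32, 64, 128, 256)):
        """Slope of log(R/S) vs. log(window); ~0.5 for i.i.d. increments."""
        lw = np.log(windows)
        lrs = np.log([rescaled_range(x, w) for w in windows])
        return np.polyfit(lw, lrs, 1)[0]

    returns = np.random.default_rng(3).normal(size=4096)  # i.i.d. benchmark
    print(hurst_exponent(returns))  # near 0.5, modest small-sample bias aside
    ```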

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burton, E.; Wang, L.; Gonder, J.

    This presentation discusses the fuel savings potential from future in-motion wireless power transfer. There is an extensive overlap in road usage apparent across regional vehicle populations, which occurs primarily on high-capacity roads: 1% of roads are used for 25% of the vehicle miles traveled. Interstates and highways make up between 2.5% and 4% of the total roads within the Consolidated Statistical Areas (CSAs), which represent groupings of metropolitan and/or micropolitan statistical areas. Mileage traveled on the interstates and highways ranges from 54% in California to 24% in Chicago. Road electrification could remove range restrictions of electric vehicles and increase the fuel savings of PHEVs or HEVs if implemented on a large scale. If 1% of the road miles within a geographic area are electrified, 25% of the fuel used by a 'fleet' of vehicles enabled with the technology could be displaced.

  18. Sensitivity of submersed freshwater macrophytes and endpoints in laboratory toxicity tests.

    PubMed

    Arts, Gertie H P; Belgers, J Dick M; Hoekzema, Conny H; Thissen, Jac T N M

    2008-05-01

    The toxicological sensitivity and variability of a range of macrophyte endpoints were statistically tested with data from chronic, non-axenic, macrophyte toxicity tests. Five submersed freshwater macrophytes, four pesticides/biocides and 13 endpoints were included in the statistical analyses. Root endpoints, reflecting root growth, were most sensitive in the toxicity tests, while endpoints relating to biomass, growth and shoot length were less sensitive. The endpoints with the lowest coefficients of variation were not necessarily the endpoints, which were toxicologically most sensitive. Differences in sensitivity were in the range of 10-1000 for different macrophyte-specific endpoints. No macrophyte species was consistently the most sensitive. Criteria to select endpoints in macrophyte toxicity tests should include toxicological sensitivity, variance and ecological relevance. Hence, macrophyte toxicity tests should comprise an array of endpoints, including very sensitive endpoints like those relating to root growth.

  19. Statistical average estimates of high latitude field-aligned currents from the STARE and SABRE coherent VHF radar systems

    NASA Astrophysics Data System (ADS)

    Kosch, M. J.; Nielsen, E.

    Two bistatic VHF radar systems, STARE and SABRE, have been employed to estimate ionospheric electric fields in the geomagnetic latitude range 61.1 - 69.3° (geographic latitude range 63.8 - 72.6°) over northern Scandinavia. 173 days of good backscatter from all four radars have been analysed during the period 1982 to 1986, from which the average ionospheric divergence electric field versus latitude and time is calculated. The average magnetic field-aligned currents are computed using an AE-dependent empirical model of the ionospheric conductance. Statistical Birkeland current estimates are presented for high and low values of the Kp and AE indices as well as positive and negative orientations of the IMF B z component. The results compare very favourably to other ground-based and satellite measurements.

  20. Optimization models for degrouping population data.

    PubMed

    Bermúdez, Silvia; Blanquero, Rafael

    2016-07-01

    In certain countries population data are available in grouped form only, usually as quinquennial age groups plus a large open-ended range for the elderly. However, official statistics call for data by individual age since many statistical operations, such as the calculation of demographic indicators, require the use of ungrouped population data. In this paper a number of mathematical models are proposed which, starting from population data given in age groups, enable these ranges to be degrouped into age-specific population values without leaving a fractional part. Unlike other existing procedures for disaggregating demographic data, ours makes it possible to process several years' data simultaneously in a coherent way, and provides accurate results longitudinally as well as transversally. This procedure is also shown to be helpful in dealing with degrouped population data affected by noise, such as those affected by the age-heaping phenomenon.

  1. Comparison of fluorescence microscopy and solid-phase cytometry methods for counting bacteria in water

    USGS Publications Warehouse

    Lisle, John T.; Hamilton, Martin A.; Willse, Alan R.; McFeters, Gordon A.

    2004-01-01

    Total direct counts of bacterial abundance are central in assessing the biomass and bacteriological quality of water in ecological and industrial applications. Several factors have been identified that contribute to the variability in bacterial abundance counts when using fluorescent microscopy, the most significant of which is retaining an adequate number of cells per filter to ensure an acceptable level of statistical confidence in the resulting data. Previous studies that have assessed the components of total-direct-count methods that contribute to this variance have attempted to maintain a bacterial cell abundance value per filter of approximately 106 cells filter-1. In this study we have established the lower limit for the number of bacterial cells per filter at which the statistical reliability of the abundance estimate is no longer acceptable. Our results indicate that when the numbers of bacterial cells per filter were progressively reduced below 105, the microscopic methods increasingly overestimated the true bacterial abundance (range, 15.0 to 99.3%). The solid-phase cytometer only slightly overestimated the true bacterial abundances and was more consistent over the same range of bacterial abundances per filter (range, 8.9 to 12.5%). The solid-phase cytometer method for conducting total direct counts of bacteria was less biased and performed significantly better than any of the microscope methods. It was also found that microscopic count data from counting 5 fields on three separate filters were statistically equivalent to data from counting 20 fields on a single filter.

  2. Stability of INFIT and OUTFIT Compared to Simulated Estimates in Applied Setting.

    PubMed

    Hodge, Kari J; Morgan, Grant B

    Residual-based fit statistics are commonly used as an indication of the extent to which item response data fit the Rasch model. Fit statistic estimates are influenced by sample size, and rule-of-thumb critical values may result in incorrect conclusions about the extent to which the model fits the data. Estimates obtained in this analysis were compared to 250 simulated data sets to examine the stability of the estimates. All INFIT estimates were within the rule-of-thumb range of 0.7 to 1.3. However, only 82% of the INFIT estimates fell within the 2.5th and 97.5th percentiles of the simulated items' INFIT distributions using this 95% confidence-like interval, an 18 percentage point difference in items that were classified as acceptable. Forty-eight percent of OUTFIT estimates fell within the 0.7 to 1.3 rule-of-thumb range, whereas 34% of OUTFIT estimates fell within the 2.5th and 97.5th percentiles of the simulated items' OUTFIT distributions, a 14 percentage point difference in items that were classified as acceptable. When using the rule-of-thumb ranges for fit estimates, the magnitude of misfit was smaller than with the 95% confidence interval of the simulated distribution. The findings indicate that the use of confidence intervals as critical values for fit statistics leads to different model-data fit conclusions than traditional rule-of-thumb critical values.
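
    The comparison above boils down to replacing fixed cutoffs with percentile intervals taken from item-specific simulated fit distributions. Given a matrix of simulated INFIT or OUTFIT values (one row per simulated data set, one column per item), the classification is a percentile check per item; a sketch with invented stand-in values rather than a full Rasch simulation:

    ```python
    import numpy as np

    def classify_fit(observed, simulated, lo_pct=2.5, hi_pct=97.5):
        """Compare each item's observed fit statistic with the percentile
        interval of its simulated distribution (rows = replications)."""
        lo = np.percentile(simulated, lo_pct, axis=0)
        hi = np.percentile(simulated, hi_pct, axis=0)
        return (observed >= lo) & (observed <= hi)   # True = acceptable fit

    rng = np.random.default_rng(4)
    simulated = rng.normal(1.0, 0.08, size=(250, 5))   # 250 replications, 5 items
    observed = np.array([0.95, 1.12, 1.29, 0.78, 1.02])
    print(classify_fit(observed, simulated))           # simulation-based verdicts
    print((observed >= 0.7) & (observed <= 1.3))       # rule-of-thumb verdicts
    ```

    Items such as the third (1.29) pass the rule of thumb yet fall outside the simulated interval, which is precisely the kind of discrepancy the study quantifies.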

  3. Body mass index and acoustic voice parameters: is there a relationship.

    PubMed

    Souza, Lourdes Bernadete Rocha de; Santos, Marquiony Marques Dos

    2017-05-06

    Specific elements such as weight and body volume can interfere in voice production and consequently in its acoustic parameters, which is why it is important for the clinician to be aware of these relationships. To investigate the relationship between body mass index and average acoustic voice parameters. Observational, cross-sectional descriptive study. The sample consisted of 84 women, aged between 18 and 40 years, with a mean age of 26.83 (±6.88) years. The subjects were grouped according to body mass index: 19 underweight, 23 normal range, 20 overweight, and 22 obese. The fundamental frequency of the sustained vowel [a] and the maximum phonation times of the vowels [a], [i], [u] were evaluated using PRAAT software. The data were submitted to the Kruskal-Wallis test to verify whether there were differences between the groups regarding the study variables. All variables showed statistically significant results and were subjected to the non-parametric Mann-Whitney test. Regarding the average fundamental frequency, there were statistically significant differences between the underweight group and the overweight and obese groups, and between the normal range group and the overweight and obese groups. The average maximum phonation time revealed statistically significant differences between underweight and obese individuals, between normal range and obese individuals, and between overweight and obese individuals. Body mass index influenced the average fundamental frequency of the overweight and obese individuals evaluated in this study. Obesity contributed to a reduction in the average maximum phonation time. Copyright © 2017 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  4. Determining the contribution of long-range transport, regional and local source areas, to PM10 mass loading in Hessen, Germany using a novel multi-receptor based statistical approach

    NASA Astrophysics Data System (ADS)

    Garg, Saryu; Sinha, Baerbel

    2017-10-01

    This study uses two newly developed statistical source apportionment models, MuSAM and MuReSAM, to perform quantitative statistical source apportionment of PM10 at multiple receptor sites in South Hessen. MuSAM uses multi-site back trajectory data to quantify the contribution of long-range transport, while MuReSAM uses wind speed and direction as proxy for regional transport and quantifies the contribution of regional source areas. On average, between 7.8 and 9.1 μg/m3 of PM10 (∼50%) at receptor sites in South Hessen is contributed by long-range transport. The dominant source regions are Eastern, South Eastern, and Southern Europe. 32% of the PM10 at receptor sites in South Hessen is contributed by regional source areas (2.8-9.41 μg/m3). This fraction varies from <20% at remote sites to >40% for urban stations. Sources located within a 2 km radius around the receptor site are responsible for 7%-20% of the total PM10 mass (0.7-4.4 μg/m3). The perturbation study of the traffic flow due to the closing and reopening of the Schiersteiner Brücke revealed that the contribution of the bridge to PM10 mass loadings at two nearby receptor sites increased by approximately 120% after it reopened and became a bottleneck, although in absolute terms, the increase is small.

  5. Unlawful Discrimination DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    Included is a review of the 4.0 description and items, followed by the proposed modifications to the factor. The current DEOCS (4.0) contains multiple... [Sample demographics table fragment: Senior Enlisted (E7-E9), 586, 10.8%; Junior Officer (O1-O3), 474, 9%; Senior Officer (O4 and above), 391, 6.1%.] Descriptive Statistics and Reliability: This section displays descriptive statistics for the items on the Unlawful Discrimination scale. All items had a range from 1 to 7 (strongly disagree to strongly agree).

  6. Orchestrating high-throughput genomic analysis with Bioconductor

    PubMed Central

    Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin

    2015-01-01

    Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503

  7. Lies, Damned Lies, and Statistics (in Geology)

    NASA Astrophysics Data System (ADS)

    Vermeesch, Pieter

    2009-11-01

    According to Karl Popper's epistemology of critical rationalism, scientists should formulate falsifiable hypotheses rather than produce ad hoc answers to empirical observations. In other words, we should predict and test rather than merely explain [Popper, 1959]. Sometimes, statistical tests such as chi-square, t, or Kolmogorov-Smirnov are used to make deductions more “objective.” Such tests are used in a wide range of geological subdisciplines [see Reimann and Filzmoser, 2000; Anderson and Johnson, 1999; Lørup et al., 1998; Sircombe and Hazelton, 2004].

  8. Manual of Protective Action Guides and Protective Actions for Nuclear Incidents. Revision

    DTIC Science & Technology

    1980-06-01

    acceptable dose. Since the PAG is based on a projected dose, it is used only in an ex post facto effort to minimize the risk from an event which is occurring...statistical evaluation of epidemiological studies in groups of people who had been exposed to radiation. Decisions concerning statistical effects on...protective actions. The Reactor Safety Study indicates, for example, that major releases may begin in the range of one-half hour to as much as 30 hours after an

  9. Success rates of a skeletal anchorage system in orthodontics: A retrospective analysis.

    PubMed

    Lam, Raymond; Goonewardene, Mithran S; Allan, Brent P; Sugawara, Junji

    2018-01-01

    To evaluate the premise that skeletal anchorage with SAS miniplates is highly successful and predictable for a range of complex orthodontic movements. This retrospective cross-sectional analysis consisted of 421 bone plates placed by one clinician in 163 patients (95 female, 68 male, mean age 29.4 years ± 12.02). Simple descriptive statistics were performed for a wide range of malocclusions and desired movements to obtain success, complication, and failure rates. The success rate of skeletal anchorage system miniplates was 98.6%, although approximately 40% of cases experienced mild complications. The most common complication was soft tissue inflammation, which was amenable to focused oral hygiene and antiseptic rinses. Infection occurred in approximately 15% of patients, with a statistically significant correlation with poor oral hygiene. The most common movements were distalization and intrusion of teeth. More than a third of the cases involved complex movements in more than one plane of space. The success rate of skeletal anchorage system miniplates is high and predictable for a wide range of complex orthodontic movements.

  10. The introduction of hydrogen bond and hydrophobicity effects into the rotational isomeric states model for conformational analysis of unfolded peptides.

    PubMed

    Engin, Ozge; Sayar, Mehmet; Erman, Burak

    2009-01-13

    Relative contributions of local and non-local interactions to the unfolded conformations of peptides are examined by using the rotational isomeric states model which is a Markov model based on pairwise interactions of torsion angles. The isomeric states of a residue are well described by the Ramachandran map of backbone torsion angles. The statistical weight matrices for the states are determined by molecular dynamics simulations applied to monopeptides and dipeptides. Conformational properties of tripeptides formed from combinations of alanine, valine, tyrosine and tryptophan are investigated based on the Markov model. Comparison with molecular dynamics simulation results on these tripeptides identifies the sequence-distant long-range interactions that are missing in the Markov model. These are essentially the hydrogen bond and hydrophobic interactions that are obtained between the first and the third residue of a tripeptide. A systematic correction is proposed for incorporating these long-range interactions into the rotational isomeric states model. Preliminary results suggest that the Markov assumption can be improved significantly by renormalizing the statistical weight matrices to include the effects of the long-range correlations.
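
    In the rotational isomeric states formalism, chain statistics follow from products of per-residue statistical weight matrices, exactly as in any transfer-matrix Markov model. A minimal three-state sketch; the weight values are illustrative, not the simulation-derived matrices of this study, and a uniform end vector is used in place of the conventional J* = (1, 0, 0) for simplicity:

    ```python
    import numpy as np

    # U[i, j]: statistical weight of state j for residue k, given state i for
    # residue k-1 (three Ramachandran basins, illustrative values).
    U = np.array([[1.0, 0.5, 0.2],
                  [0.6, 1.0, 0.1],
                  [0.3, 0.2, 1.0]])

    def partition_function(U_list):
        """Z as a product of weight matrices contracted with end vectors."""
        v = np.ones(U_list[0].shape[0])
        for U_k in U_list:
            v = v @ U_k
        return v.sum()

    def state_probabilities(U):
        """Stationary state probabilities of the Markov chain obtained by
        row-normalizing U (left eigenvector of the transition matrix)."""
        P = U / U.sum(axis=1, keepdims=True)
        vals, vecs = np.linalg.eig(P.T)
        pi = np.real(vecs[:, np.argmax(np.real(vals))])
        return pi / pi.sum()

    print(partition_function([U] * 4))   # chain with four inter-residue couplings
    print(state_probabilities(U))
    ```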

  11. Extended Statistical Short-Range Guidance for Peak Wind Speed Analyses at the Shuttle Landing Facility: Phase II Results

    NASA Technical Reports Server (NTRS)

    Lambert, Winifred C.

    2003-01-01

    This report describes the results from Phase II of the AMU's Short-Range Statistical Forecasting task for peak winds at the Shuttle Landing Facility (SLF). The peak wind speeds are an important forecast element for the Space Shuttle and Expendable Launch Vehicle programs. The 45th Weather Squadron and the Spaceflight Meteorology Group indicate that peak winds are challenging to forecast. The Applied Meteorology Unit was tasked to develop tools that aid in short-range forecasts of peak winds at tower sites of operational interest. A seven-year record of wind tower data was used in the analysis. Hourly and directional climatologies by tower and month were developed to determine the seasonal behavior of the average and peak winds. Probability density functions (PDF) of peak wind speed were calculated to determine the distribution of peak speed with average speed. These provide forecasters with a means of determining the probability of meeting or exceeding a certain peak wind given an observed or forecast average speed. A PC-based Graphical User Interface (GUI) tool was created to display the data quickly.
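
    The PDFs described above let a forecaster turn an observed or forecast average speed into the probability of meeting or exceeding a peak-speed threshold. An empirical conditional-probability sketch; the wind data are synthetic, and an operational version would condition on tower, month, and hour as the report does:

    ```python
    import numpy as np

    def prob_peak_exceeds(avg_obs, peak_obs, avg_forecast, threshold, half_width=1.0):
        """Empirical P(peak >= threshold | average near avg_forecast), from
        historical (average, peak) pairs within +/- half_width of the forecast."""
        avg_obs, peak_obs = np.asarray(avg_obs), np.asarray(peak_obs)
        near = np.abs(avg_obs - avg_forecast) <= half_width
        if not near.any():
            return np.nan                      # no climatology near this speed
        return (peak_obs[near] >= threshold).mean()

    rng = np.random.default_rng(5)
    avg = rng.gamma(4.0, 2.0, size=5000)               # synthetic average speeds
    peak = avg * rng.lognormal(0.45, 0.15, size=5000)  # synthetic gust factors
    print(prob_peak_exceeds(avg, peak, avg_forecast=10.0, threshold=20.0))
    ```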

  12. The introduction of hydrogen bond and hydrophobicity effects into the rotational isomeric states model for conformational analysis of unfolded peptides

    NASA Astrophysics Data System (ADS)

    Engin, Ozge; Sayar, Mehmet; Erman, Burak

    2009-03-01

    Relative contributions of local and non-local interactions to the unfolded conformations of peptides are examined by using the rotational isomeric states model which is a Markov model based on pairwise interactions of torsion angles. The isomeric states of a residue are well described by the Ramachandran map of backbone torsion angles. The statistical weight matrices for the states are determined by molecular dynamics simulations applied to monopeptides and dipeptides. Conformational properties of tripeptides formed from combinations of alanine, valine, tyrosine and tryptophan are investigated based on the Markov model. Comparison with molecular dynamics simulation results on these tripeptides identifies the sequence-distant long-range interactions that are missing in the Markov model. These are essentially the hydrogen bond and hydrophobic interactions that are obtained between the first and the third residue of a tripeptide. A systematic correction is proposed for incorporating these long-range interactions into the rotational isomeric states model. Preliminary results suggest that the Markov assumption can be improved significantly by renormalizing the statistical weight matrices to include the effects of the long-range correlations.

  13. The joint methane profiles retrieval approach from GOSAT TIR and SWIR spectra

    NASA Astrophysics Data System (ADS)

    Zadvornykh, Ilya V.; Gribanov, Konstantin G.; Zakharov, Vyacheslav I.; Imasu, Ryoichi

    2017-11-01

    In this paper we present a method, using methane as an example, that allows more accurate retrieval of greenhouse gases in the Earth's atmosphere. Using the new version of the FIRE-ARMS software, supplemented with the VLIDORT vector radiative transfer model, we carried out joint methane retrieval from TIR (Thermal Infrared Range) and SWIR (Short-Wavelength Infrared Range) GOSAT spectra using the optimal estimation method. MACC reanalysis data from the European Centre for Medium-Range Weather Forecasts (ECMWF), supplemented by data from aircraft measurements of the HIPPO experiment, were used as a statistical ensemble.
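
    The optimal estimation method named here has a standard linearized update; the sketch below shows it with toy matrices (the real Jacobians come from the FIRE-ARMS/VLIDORT forward model and the prior from MACC, neither of which is reproduced).

        import numpy as np

        # One Gauss-Newton optimal-estimation step:
        #   x_hat = xa + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - F(xa))
        n_state, n_obs = 4, 6                     # e.g. CH4 layer amounts; radiances
        rng = np.random.default_rng(1)
        K = rng.normal(size=(n_obs, n_state))     # toy forward-model Jacobian
        Sa = np.diag(np.full(n_state, 0.1**2))    # prior covariance
        Se = np.diag(np.full(n_obs, 0.02**2))     # measurement-noise covariance
        xa = np.zeros(n_state)                    # prior state
        residual = rng.normal(scale=0.02, size=n_obs)   # stands in for y - F(xa)

        Se_inv = np.linalg.inv(Se)
        gain = np.linalg.solve(K.T @ Se_inv @ K + np.linalg.inv(Sa), K.T @ Se_inv)
        x_hat = xa + gain @ residual
        A = gain @ K                              # averaging kernel
        print(x_hat, np.trace(A))                 # trace(A): degrees of freedom for signal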

  14. Associating an ionospheric parameter with major earthquake occurrence throughout the world

    NASA Astrophysics Data System (ADS)

    Ghosh, D.; Midya, S. K.

    2014-02-01

    With time, ionospheric variation analysis is gaining ground over lithospheric monitoring as a source of precursors for earthquake forecasting. The current paper highlights the association of major (Ms ≥ 6.0) and medium (4.0 ≤ Ms < 6.0) earthquake occurrences throughout the world with different ranges of the Ionospheric Earthquake Parameter (IEP), where Ms is earthquake magnitude on the Richter scale. From statistical and graphical analyses, it is concluded that the probability of earthquake occurrence is maximum when the defined parameter lies within the range of 0-75 (lower range). In the higher ranges, earthquake occurrence probability gradually decreases. A probable explanation is also suggested.

  15. Effects of the interaction range on structural phases of flexible polymers.

    PubMed

    Gross, J; Neuhaus, T; Vogel, T; Bachmann, M

    2013-02-21

    We systematically investigate how the range of interaction between non-bonded monomers influences the formation of structural phases of elastic, flexible polymers. Massively parallel replica-exchange simulations of a generic, coarse-grained model, performed partly on graphics processing units and in multiple-Gaussian modified ensembles, pave the way for the construction of the structural phase diagram, parametrized by interaction range and temperature. Conformational transitions between gas-like, liquid, and diverse solid (pseudo) phases are identified by microcanonical statistical inflection-point analysis. We find evidence for finite-size effects that cause the crossover of "collapse" and "freezing" transitions for very short interaction ranges.
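
    Microcanonical inflection-point analysis, named above, locates transitions from the curvature of the microcanonical entropy; a minimal sketch on an invented S(E) (not the polymer data) is:

        import numpy as np

        # Toy entropy with a back-bending inverse temperature; transitions are
        # signalled by inflection points of beta(E) = dS/dE.
        E = np.linspace(-2.0, 2.0, 2001)
        S = 1.5 * E - 0.2 * np.tanh(4.0 * E)      # hypothetical S(E)

        beta = np.gradient(S, E)                  # microcanonical inverse temperature
        gamma = np.gradient(beta, E)              # its slope; peaks mark inflections

        i = np.argmax(gamma)
        print(f"inflection near E = {E[i]:.3f}, beta = {beta[i]:.3f}")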

  16. Range of interaction in an opinion evolution model of ideological self-positioning: Contagion, hesitance and polarization

    NASA Astrophysics Data System (ADS)

    Gimenez, M. Cecilia; Paz García, Ana Pamela; Burgos Paci, Maxi A.; Reinaudi, Luis

    2016-04-01

    The evolution of public opinion using tools and concepts borrowed from Statistical Physics is an emerging area within the field of Sociophysics. In the present paper, a Statistical Physics model was developed to study the evolution of the ideological self-positioning of an ensemble of agents. The model consists of an array of L components, each one of which represents the ideology of an agent. The proposed mechanism is based on the "voter model", in which one agent can adopt the opinion of another one if the difference of their opinions lies within a certain range. The existence of "undecided" agents (i.e. agents with no definite opinion) was implemented in the model. The possibility of radicalization of an agent's opinion upon interaction with another one was also implemented. The results of our simulations are compared to statistical data taken from the Latinobarómetro databank for the cases of Argentina, Chile, Brazil and Uruguay in the last decade. Among other results, the effect of taking into account the undecided agents is the formation of a single peak at the middle of the ideological spectrum (which corresponds to a centrist ideological position), in agreement with the real cases studied.
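
    A minimal sketch of the mechanism as described (assumed details: opinions on a continuous 1-10 scale, a fixed interaction range, undecided agents that adopt any offered opinion):

        import numpy as np

        rng = np.random.default_rng(42)
        L, steps, reach = 1000, 200000, 2.0       # agents, updates, opinion range
        opinion = rng.uniform(1.0, 10.0, size=L)
        undecided = rng.random(L) < 0.2           # 20% start with no firm opinion

        for _ in range(steps):
            i, j = rng.integers(L, size=2)
            # voter-model step restricted to a finite opinion distance
            if undecided[i] or abs(opinion[i] - opinion[j]) <= reach:
                opinion[i] = opinion[j]
                undecided[i] = False

        hist, _ = np.histogram(opinion, bins=np.arange(1.0, 11.0))
        print(hist)  # undecided agents tend to produce a single centrist peak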

  17. A Systematic Review and Meta-Regression Analysis of Lung Cancer Risk and Inorganic Arsenic in Drinking Water.

    PubMed

    Lamm, Steven H; Ferdosi, Hamid; Dissen, Elisabeth K; Li, Ji; Ahn, Jaeil

    2015-12-07

    High levels (> 200 µg/L) of inorganic arsenic in drinking water are known to be a cause of human lung cancer, but the evidence at lower levels is uncertain. We have sought the epidemiological studies that have examined the dose-response relationship between arsenic levels in drinking water and the risk of lung cancer over a range that includes both high and low levels of arsenic. Regression analysis, based on six studies identified from an electronic search, examined the relationship between the log of the relative risk and the log of the arsenic exposure over a range of 1-1000 µg/L. The best-fitting continuous meta-regression model was sought and found to be a no-constant linear-quadratic analysis where both the risk and the exposure had been logarithmically transformed. This yielded both a statistically significant positive coefficient for the quadratic term and a statistically significant negative coefficient for the linear term. Sub-analyses by study design yielded results that were similar for both ecological studies and non-ecological studies. Statistically significant X-intercepts consistently found no increased level of risk at approximately 100-150 µg/L arsenic.
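
    The no-constant linear-quadratic fit on log-transformed axes is reproducible in a few lines; the data points below are invented for illustration and are not the six pooled studies.

        import numpy as np

        # ln(RR) = b1*ln(C) + b2*(ln C)^2, no intercept; C in ug/L
        conc = np.array([1.0, 10.0, 50.0, 100.0, 300.0, 1000.0])
        rr = np.array([1.0, 0.95, 0.90, 1.00, 1.60, 4.00])   # hypothetical relative risks

        x = np.log(conc)
        X = np.column_stack([x, x**2])            # no constant column
        b, *_ = np.linalg.lstsq(X, np.log(rr), rcond=None)
        print("linear, quadratic coefficients:", b)

        # ln(RR) = x*(b1 + b2*x) = 0 at C = 1 and at the X-intercept below
        print("no-excess-risk concentration (ug/L):", np.exp(-b[0] / b[1]))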

  18. Dynamics and Statistical Mechanics of Rotating and non-Rotating Vortical Flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Chjan

    Three projects were analyzed with the overall aim of developing a computational/analytical model for estimating values of the energy, angular momentum, enstrophy and total variation of fluid height at phase transitions between disordered and self-organized flow states in planetary atmospheres. It is believed that these transitions in equilibrium statistical mechanics models play a role in the construction of large-scale, stable structures including super-rotation in the Venusian atmosphere and the formation of the Great Red Spot on Jupiter. Exact solutions of the spherical energy-enstrophy models for rotating planetary atmospheres by Kac's method of steepest descent predicted phase transitions to super-rotating solid-body flows at high energy-to-enstrophy ratio for all planetary spins, and to sub-rotating modes if the planetary spin is large enough. These canonical statistical ensembles are well-defined for the long-range energy interactions that arise from 2D fluid flows on compact oriented manifolds such as the surface of the sphere and torus. This is because in Fourier space, available through Hodge theory, the energy terms are exactly diagonalizable and hence have zero range, leading to well-defined heat baths.

  19. Small sample mediation testing: misplaced confidence in bootstrapped confidence intervals.

    PubMed

    Koopman, Joel; Howe, Michael; Hollenbeck, John R; Sin, Hock-Peng

    2015-01-01

    Bootstrapping is an analytical tool commonly used in psychology to test the statistical significance of the indirect effect in mediation models. Bootstrapping proponents have particularly advocated for its use for samples of 20-80 cases. This advocacy has been heeded, especially in the Journal of Applied Psychology, as researchers are increasingly utilizing bootstrapping to test mediation with samples in this range. We discuss reasons to be concerned with this escalation, and in a simulation study focused specifically on this range of sample sizes, we demonstrate not only that bootstrapping has insufficient statistical power to provide a rigorous hypothesis test in most conditions but also that bootstrapping has a tendency to exhibit an inflated Type I error rate. We then extend our simulations to investigate an alternative empirical resampling method as well as a Bayesian approach and demonstrate that they exhibit comparable statistical power to bootstrapping in small samples without the associated inflated Type I error. Implications for researchers testing mediation hypotheses in small samples are presented. For researchers wishing to use these methods in their own research, we have provided R syntax in the online supplemental materials. (c) 2015 APA, all rights reserved.
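
    For orientation, a bare-bones percentile-bootstrap test of the indirect effect a·b on simulated data (the paper's simulations, and the R syntax in its supplement, are far more extensive):

        import numpy as np

        rng = np.random.default_rng(7)
        n = 50                                    # inside the criticized 20-80 range
        x = rng.normal(size=n)
        m = 0.4 * x + rng.normal(size=n)          # mediator
        y = 0.4 * m + rng.normal(size=n)          # outcome

        def ab_estimate(x, m, y):
            """a from OLS of M on X; b is the M coefficient in Y ~ M + X."""
            a = np.linalg.lstsq(np.column_stack([np.ones(n), x]), m, rcond=None)[0][1]
            b = np.linalg.lstsq(np.column_stack([np.ones(n), m, x]), y, rcond=None)[0][1]
            return a * b

        boot = np.empty(5000)
        for r in range(boot.size):
            idx = rng.integers(n, size=n)         # resample cases with replacement
            boot[r] = ab_estimate(x[idx], m[idx], y[idx])

        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"ab = {ab_estimate(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")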

  20. Networking—a statistical physics perspective

    NASA Astrophysics Data System (ADS)

    Yeung, Chi Ho; Saad, David

    2013-03-01

    Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly complex, the ever-increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve both for gaining a better understanding of the properties of networking systems at the macroscopic level and for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches, as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, and probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications.

  1. A Systematic Review and Meta-Regression Analysis of Lung Cancer Risk and Inorganic Arsenic in Drinking Water

    PubMed Central

    Lamm, Steven H.; Ferdosi, Hamid; Dissen, Elisabeth K.; Li, Ji; Ahn, Jaeil

    2015-01-01

    High levels (> 200 µg/L) of inorganic arsenic in drinking water are known to be a cause of human lung cancer, but the evidence at lower levels is uncertain. We have sought the epidemiological studies that have examined the dose-response relationship between arsenic levels in drinking water and the risk of lung cancer over a range that includes both high and low levels of arsenic. Regression analysis, based on six studies identified from an electronic search, examined the relationship between the log of the relative risk and the log of the arsenic exposure over a range of 1–1000 µg/L. The best-fitting continuous meta-regression model was sought and found to be a no-constant linear-quadratic analysis where both the risk and the exposure had been logarithmically transformed. This yielded both a statistically significant positive coefficient for the quadratic term and a statistically significant negative coefficient for the linear term. Sub-analyses by study design yielded results that were similar for both ecological studies and non-ecological studies. Statistically significant X-intercepts consistently found no increased level of risk at approximately 100–150 µg/L arsenic. PMID:26690190

  2. Sedimentological analysis and bed thickness statistics from a Carboniferous deep-water channel-levee complex: Myall Trough, SE Australia

    NASA Astrophysics Data System (ADS)

    Palozzi, Jason; Pantopoulos, George; Maravelis, Angelos G.; Nordsvan, Adam; Zelilidis, Avraam

    2018-02-01

    This investigation presents an outcrop-based integrated study of internal division analysis and statistical treatment of turbidite bed thickness applied to a Carboniferous deep-water channel-levee complex in the Myall Trough, southeast Australia. Turbidite beds of the studied succession are characterized by a range of sedimentary structures grouped into two main associations, a thick-bedded and a thin-bedded one, that reflect channel-fill and overbank/levee deposits, respectively. Three vertically stacked channel-levee cycles have been identified. Results of statistical analysis of bed thickness, grain-size and internal division patterns applied on the studied channel-levee succession, indicate that turbidite bed thickness data seem to be well characterized by a bimodal lognormal distribution, which is possibly reflecting the difference between deposition from lower-density flows (in a levee/overbank setting) and very high-density flows (in a channel fill setting). Power law and exponential distributions were observed to hold only for the thick-bedded parts of the succession and cannot characterize the whole bed thickness range of the studied sediments. The succession also exhibits non-random clustering of bed thickness and grain-size measurements. The studied sediments are also characterized by the presence of statistically detected fining-upward sandstone packets. A novel quantitative approach (change-point analysis) is proposed for the detection of those packets. Markov permutation statistics also revealed the existence of order in the alternation of internal divisions in the succession expressed by an optimal internal division cycle reflecting two main types of gravity flow events deposited within both thick-bedded conglomeratic and thin-bedded sandstone associations. The analytical methods presented in this study can be used as additional tools for quantitative analysis and recognition of depositional environments in hydrocarbon-bearing research of ancient deep-water channel-levee settings.

  3. Lower incisor inclination regarding different reference planes.

    PubMed

    Zataráin, Brenda; Avila, Josué; Moyaho, Angeles; Carrasco, Rosendo; Velasco, Carmen

    2016-09-01

    The purpose of this study was to assess the degree of lower incisor inclination with respect to different reference planes. It was an observational, analytical, longitudinal, prospective study conducted on 100 lateral cephalograms, which were corrected according to the photograph in natural head position in order to draw the true vertical plane (TVP). The incisor mandibular plane angle (IMPA) was compensated to eliminate the variation of the mandibular plane growth type with the formula: (patient's FMA − 25) + patient's IMPA = compensated IMPA (IMPACOM). As the data followed a normal distribution as determined by the Kolmogorov-Smirnov test, parametric tests were used for the statistical analysis: the t-test, ANOVA and the Pearson correlation coefficient test. Statistical analysis was performed using a statistical significance level of p < 0.05. There was correlation between the TVP and the NB line (NB) (0.8614), the Frankfort mandibular incisor angle (FMIA) (0.8894), the IMPA (0.6351), the APo line (APo) (0.609), the IMPACOM (0.8895) and the McHorris angle (MH) (0.7769). ANOVA showed statistically significant differences among the means of the 7 variables at the 95% confidence level, P=0.0001. The multiple range test showed no significant difference between the means of the pairs APo-NB (0.88), IMPA-MH (0.36), IMPA-NB (0.65), FMIA-IMPACOM (0.01), FMIA-TVP (0.18) and TVP-IMPACOM (0.17). There was correlation among all reference planes. There were statistically significant differences among the means of the planes measured, except for the IMPACOM, FMIA and TVP. The IMPA differed significantly from the IMPACOM. The compensated IMPA and the FMIA did not differ significantly from the TVP. The true horizontal plane was mismatched with the Frankfort plane in 84% of the sample, with a range of 19°. The true vertical plane is adequate for measuring lower incisor inclination. Sociedad Argentina de Investigación Odontológica.

  4. Statistical Analysis of the Skaion Network Security Dataset

    DTIC Science & Technology

    2012-09-01

    (Indexing excerpt of the report's Excel VBA data-preparation macros:)

        Selection.TextToColumns Destination:=Range("E1"), DataType:=xlDelimited, _
            TextQualifier:=xlDoubleQuote, ConsecutiveDelimiter:=True, Tab:=False, _
            Semicolon:=False, Comma:=False, Space...
        Columns("F:F").Select
        Selection.TextToColumns Destination:=Range("F1"), DataType:=xlDelimited, _
            TextQualifier:=xlDoubleQuote...

  5. A hint of Poincaré dodecahedral topology in the WMAP first year sky map

    NASA Astrophysics Data System (ADS)

    Roukema, B. F.; Lew, B.; Cechowska, M.; Marecki, A.; Bajtlik, S.

    2004-09-01

    It has recently been suggested by Luminet et al. (2003) that the WMAP data are better matched by a geometry in which the topology is that of a Poincaré dodecahedral model and the curvature is "slightly" spherical, rather than by an (effectively) infinite flat model. A general back-to-back matched circles analysis by Cornish et al. (2003) for angular radii in the range 25-90°, using a correlation statistic for signal detection, failed to support this. In this paper, a matched circles analysis specifically designed to detect dodecahedral patterns of matched circles is performed over angular radii in the range 1-40° on the one-year WMAP data. Signal detection is attempted via a correlation statistic and an rms difference statistic. Extreme value distributions of these statistics are calculated for one orientation of the 36° "screw motion" (Clifford translation) when matching circles, for the opposite screw motion, and for a zero (unphysical) rotation. The most correlated circles appear for circle radii of α = 11° ± 1°, for the left-handed screw motion, but not for the right-handed one, nor for the zero rotation. The favoured six dodecahedral face centres in galactic coordinates are (l, b) ≈ (252°, +65°), (51°, +51°), (144°, +38°), (207°, +10°), (271°, +3°), (332°, +25°) and their opposites. The six pairs of circles independently each favour a circle angular radius of 11° ± 1°. The temperature fluctuations along the matched circles are plotted and are clearly highly correlated. Whether or not these six circle pairs centred on dodecahedral faces match via a 36° rotation only due to unexpected statistical properties of the WMAP ILC map, or whether they match due to global geometry, it is clear that the WMAP ILC map has some unusual statistical properties which mimic a potentially interesting cosmological signal.

  6. Broadband classification and statistics of echoes from aggregations of fish measured by long-range, mid-frequency sonar.

    PubMed

    Jones, Benjamin A; Stanton, Timothy K; Colosi, John A; Gauss, Roger C; Fialkowski, Joseph M; Michael Jech, J

    2017-06-01

    For horizontal-looking sonar systems operating at mid-frequencies (1-10 kHz), scattering by fish with resonant gas-filled swimbladders can dominate seafloor and surface reverberation at long ranges (i.e., distances much greater than the water depth). This source of scattering, which can be difficult to distinguish from other sources of scattering in the water column or at the boundaries, can add spatio-temporal variability to an already complex acoustic record. Sparsely distributed, spatially compact fish aggregations were measured in the Gulf of Maine using a long-range broadband sonar with continuous spectral coverage from 1.5 to 5 kHz. Observed echoes that are at least 15 decibels above background levels in the horizontal-looking sonar data are classified spectrally, by their resonance features, as due to swimbladder-bearing fish. Contemporaneous multi-frequency echosounder measurements (18, 38, and 120 kHz) and net samples are used in conjunction with physics-based acoustic models to validate this approach. Furthermore, the fish aggregations are statistically characterized in the long-range data by highly non-Rayleigh distributions of the echo magnitudes. These distributions are accurately predicted by a computationally efficient, physics-based model. The model accounts for beam-pattern and waveguide effects as well as the scattering response of aggregations of fish.

  7. CADDIS Volume 4. Data Analysis: Basic Analyses

    EPA Pesticide Factsheets

    Use of statistical tests to determine whether an observation is outside the normal range of expected values; details of CART, regression analysis, quantile regression analysis, CART in causal analysis, and simplifying or pruning resulting trees.

  8. Timber resource statistics for eastern Washington.

    Treesearch

    Patricia M. Bassett; Daniel D. Oswald

    1983-01-01

    This report summarizes a 1980 timber resource inventory of the 16 forested counties in Washington east of the crest of the Cascade Range. Detailed tables of forest area, timber volume, growth, mortality, and harvest are presented.

  9. Semi-Poisson statistics in quantum chaos.

    PubMed

    García-García, Antonio M; Wang, Jiao

    2006-03-01

    We investigate the quantum properties of a nonrandom Hamiltonian with a steplike singularity. It is shown that the eigenfunctions are multifractals and, in a certain range of parameters, the level statistics is described exactly by semi-Poisson statistics (SP) typical of pseudointegrable systems. It is also shown that our results are universal, namely, they depend exclusively on the presence of the steplike singularity and are not modified by smooth perturbations of the potential or the addition of a magnetic flux. Although the quantum properties of our system are similar to those of a disordered conductor at the Anderson transition, we report important quantitative differences in both the level statistics and the multifractal dimensions controlling the transition. Finally, the study of quantum transport properties suggests that the classical singularity induces quantum anomalous diffusion. We discuss how these findings may be experimentally corroborated by using ultracold atoms techniques.
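
    The semi-Poisson spacing law P(s) = 4s·exp(−2s) is easy to check numerically; the sketch below compares a histogram of synthetic unfolded spacings (drawn, by construction, from that very law) against the formula.

        import numpy as np

        rng = np.random.default_rng(3)
        # Gamma(k=2, theta=0.5) has density 4s*exp(-2s) and unit mean
        s = rng.gamma(shape=2.0, scale=0.5, size=100000)
        s /= s.mean()                             # unfolding: enforce mean spacing 1

        hist, edges = np.histogram(s, bins=60, range=(0, 4), density=True)
        centers = 0.5 * (edges[:-1] + edges[1:])
        theory = 4.0 * centers * np.exp(-2.0 * centers)
        print("max |empirical - theory|:", np.abs(hist - theory).max())

    With real spectra, s would instead be the nearest-neighbor spacings of the unfolded eigenvalues.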

  10. Dynamic heterogeneity and non-Gaussian statistics for acetylcholine receptors on live cell membrane

    NASA Astrophysics Data System (ADS)

    He, W.; Song, H.; Su, Y.; Geng, L.; Ackerson, B. J.; Peng, H. B.; Tong, P.

    2016-05-01

    The Brownian motion of molecules at thermal equilibrium usually has a finite correlation time and will eventually be randomized after a long delay time, so that their displacement follows the Gaussian statistics. This is true even when the molecules have experienced a complex environment with a finite correlation time. Here, we report that the lateral motion of the acetylcholine receptors on live muscle cell membranes does not follow the Gaussian statistics for normal Brownian diffusion. From a careful analysis of a large volume of the protein trajectories obtained over a wide range of sampling rates and long durations, we find that the normalized histogram of the protein displacements shows an exponential tail, which is robust and universal for cells under different conditions. The experiment indicates that the observed non-Gaussian statistics and dynamic heterogeneity are inherently linked to the slow-active remodelling of the underlying cortical actin network.
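
    The exponential-tail signature is simple to test: on a log scale, an exponential tail of the displacement histogram is a straight line. The sketch below does this for synthetic Laplace-distributed displacements standing in for the tracking data.

        import numpy as np

        rng = np.random.default_rng(2)
        dx = rng.laplace(scale=1.0, size=200000)  # surrogate displacements
        dx /= dx.std()                            # normalize as in the paper's histograms

        hist, edges = np.histogram(np.abs(dx), bins=50, range=(0, 5), density=True)
        centers = 0.5 * (edges[:-1] + edges[1:])
        tail = centers > 2.0
        slope = np.polyfit(centers[tail], np.log(hist[tail]), 1)[0]
        print("log-density tail slope:", slope)   # near-constant slope => exponential tail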

  11. Statistical characteristics of the spatial distribution of territorial contamination by radionuclides from the Chernobyl accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arutyunyan, R.V.; Bol`shov, L.A.; Vasil`ev, S.K.

    1994-06-01

    The objective of this study was to clarify a number of issues related to the spatial distribution of contaminants from the Chernobyl accident. The effects of local statistics were addressed by collecting and analyzing (for Cesium-137) soil samples from a number of regions, and it was found that sample activity differed by a factor of 3-5. The effect of local non-uniformity was estimated by modeling the distribution of the average activity of a set of five samples for each of the regions, with the spread in the activities for a ±2 range being equal to 25%. The statistical characteristics of the distribution of contamination were then analyzed and found to follow a log-normal distribution with the standard deviation being a function of test area. All data for the Bryanskaya Oblast area were analyzed statistically and were adequately described by a log-normal function.

  12. On the Spike Train Variability Characterized by Variance-to-Mean Power Relationship.

    PubMed

    Koyama, Shinsuke

    2015-07-01

    We propose a statistical method for modeling the non-Poisson variability of spike trains observed in a wide range of brain regions. Central to our approach is the assumption that the variance and the mean of interspike intervals are related by a power function characterized by two parameters: the scale factor and exponent. It is shown that this single assumption allows the variability of spike trains to have an arbitrary scale and various dependencies on the firing rate in the spike count statistics, as well as in the interval statistics, depending on the two parameters of the power function. We also propose a statistical model for spike trains that exhibits the variance-to-mean power relationship. Based on this, a maximum likelihood method is developed for inferring the parameters from rate-modulated spike trains. The proposed method is illustrated on simulated and experimental spike trains.
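
    A sketch of how the two parameters of the assumed relation Var = φ·Mean^α can be recovered by log-log regression across firing-rate conditions, using gamma-distributed surrogate interspike intervals rather than recorded spike trains:

        import numpy as np

        rng = np.random.default_rng(11)
        phi_true, alpha_true = 0.5, 1.5
        mean_isis = np.linspace(0.02, 0.5, 20)    # one mean ISI per condition (s)

        log_m, log_v = [], []
        for mu in mean_isis:
            var = phi_true * mu**alpha_true
            # gamma ISIs with the prescribed mean and variance
            isi = rng.gamma(mu**2 / var, var / mu, size=2000)
            log_m.append(np.log(isi.mean()))
            log_v.append(np.log(isi.var()))

        A = np.column_stack([np.ones(len(log_m)), log_m])
        intercept, slope = np.linalg.lstsq(A, np.array(log_v), rcond=None)[0]
        print(f"phi ~ {np.exp(intercept):.3f}, alpha ~ {slope:.3f}")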

  13. Condensate statistics and thermodynamics of weakly interacting Bose gas: Recursion relation approach

    NASA Astrophysics Data System (ADS)

    Dorfman, K. E.; Kim, M.; Svidzinsky, A. A.

    2011-03-01

    We study condensate statistics and thermodynamics of a weakly interacting Bose gas with a fixed total number N of particles in a cubic box. We find the exact recursion relation for the canonical ensemble partition function. Using this relation, we calculate the distribution function of condensate particles for N=200. We also calculate the distribution function based on a multinomial expansion of the characteristic function. Similar to the ideal gas, both approaches give exact statistical moments for all temperatures in the framework of the Bogoliubov model. We compare them with the results of the unconstrained canonical-ensemble quasiparticle formalism and the hybrid master equation approach. The present recursion relation can be used for any external potential and boundary conditions. We investigate the temperature dependence of the first few statistical moments of condensate fluctuations as well as thermodynamic potentials and heat capacity analytically and numerically in the whole temperature range.
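
    In the ideal-gas limit the canonical recursion takes the well-known form Z_N(β) = (1/N) Σ_{k=1..N} Z_1(kβ) Z_{N−k}(β) with Z_0 = 1; the sketch below implements that limit for bosons in a cubic box (the interacting case in the paper uses the same bookkeeping with Bogoliubov quasiparticle energies).

        import numpy as np

        def z1(beta, nmax=30):
            """Single-particle partition sum for a cubic box, E ~ nx^2 + ny^2 + nz^2."""
            n = np.arange(1, nmax + 1)
            e = n[:, None, None]**2 + n[None, :, None]**2 + n[None, None, :]**2
            return np.exp(-beta * e).sum()

        def canonical_Z(N, beta):
            z = np.array([1.0] + [z1(k * beta) for k in range(1, N + 1)])
            Z = np.empty(N + 1)
            Z[0] = 1.0
            for m in range(1, N + 1):
                # Z_m = (1/m) * sum_k z1(k*beta) * Z_{m-k}
                Z[m] = np.dot(z[1:m + 1], Z[m - 1::-1]) / m
            return Z

        Z = canonical_Z(200, 0.05)                # N = 200 as in the abstract
        print("Z_200 =", Z[200])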

  14. Absolute fragmentation cross sections in atom-molecule collisions: Scaling laws for non-statistical fragmentation of polycyclic aromatic hydrocarbon molecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, T.; Gatchell, M.; Stockett, M. H.

    2014-06-14

    We present scaling laws for absolute cross sections for non-statistical fragmentation in collisions between Polycyclic Aromatic Hydrocarbons (PAH/PAH⁺) and hydrogen or helium atoms with kinetic energies ranging from 50 eV to 10 keV. Further, we calculate the total fragmentation cross sections (including statistical fragmentation) for 110 eV PAH/PAH⁺ + He collisions, and show that they compare well with experimental results. We demonstrate that non-statistical fragmentation becomes dominant for large PAHs and that it yields highly reactive fragments forming strong covalent bonds with atoms (H and N) and molecules (C₆H₅). Thus non-statistical fragmentation may be an effective initial step in the formation of, e.g., Polycyclic Aromatic Nitrogen Heterocycles (PANHs). This relates to recent discussions on the evolution of PANHs in space and the reactivities of defect graphene structures.

  15. Statistical Analysis of Spectral Properties and Prosodic Parameters of Emotional Speech

    NASA Astrophysics Data System (ADS)

    Přibil, J.; Přibilová, A.

    2009-01-01

    The paper addresses the reflection of microintonation and spectral properties in male and female acted emotional speech. The microintonation component of speech melody is analyzed with regard to its spectral and statistical parameters. According to psychological research on emotional speech, different emotions are accompanied by different spectral noise. We control its amount by spectral flatness, according to which high-frequency noise is mixed into voiced frames during cepstral speech synthesis. Our experiments are aimed at statistical analysis of cepstral coefficient values and ranges of spectral flatness in three emotions (joy, sadness, anger) and a neutral state for comparison. Calculated histograms of spectral flatness distribution are visually compared and modelled by a Gamma probability distribution. Histograms of cepstral coefficient distribution are evaluated and compared using skewness and kurtosis. The statistical results show good correlation between male and female voices for all emotional states portrayed by several Czech and Slovak professional actors.
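
    A toy version of the spectral-flatness measurement and a moment-based Gamma fit, on synthetic harmonic-plus-noise frames rather than the acted speech corpus:

        import numpy as np

        rng = np.random.default_rng(5)

        def spectral_flatness(frame):
            """Geometric mean over arithmetic mean of the power spectrum."""
            p = np.abs(np.fft.rfft(frame))**2 + 1e-12
            return np.exp(np.mean(np.log(p))) / np.mean(p)

        t = np.arange(512) / 8000.0               # 64 ms frame at 8 kHz
        sf = []
        for _ in range(500):
            noise = rng.uniform(0.05, 0.5)        # stands in for emotion-dependent noise
            frame = np.sin(2 * np.pi * 120 * t) + noise * rng.normal(size=t.size)
            sf.append(spectral_flatness(frame))
        sf = np.array(sf)

        # Gamma(k, theta) by matching moments: k = m^2/v, theta = v/m
        m, v = sf.mean(), sf.var()
        print(f"Gamma fit: k = {m * m / v:.2f}, theta = {v / m:.5f}")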

  16. Multiple Versus Single Set Validation of Multivariate Models to Avoid Mistakes.

    PubMed

    Harrington, Peter de Boves

    2018-01-02

    Validation of multivariate models is of current importance for a wide range of chemical applications. Although important, it is neglected. The common practice is to use a single external validation set for evaluation. This approach is deficient and may mislead investigators with results that are specific to the single validation set of data. In addition, no statistics are available regarding the precision of a derived figure of merit (FOM). A statistical approach using bootstrapped Latin partitions is advocated. This validation method makes efficient use of the data because each object is used once for validation. Although the method was reviewed a decade earlier, primarily for the optimization of chemometric models, this review presents the reasons it should be used for generalized statistical validation. Average FOMs with confidence intervals are reported, and powerful matched-sample statistics may be applied for comparing models and methods. Examples demonstrate the problems with single validation sets.

  17. Selected Streamflow Statistics and Regression Equations for Predicting Statistics at Stream Locations in Monroe County, Pennsylvania

    USGS Publications Warehouse

    Thompson, Ronald E.; Hoffman, Scott A.

    2006-01-01

    A suite of 28 streamflow statistics, ranging from extreme low to high flows, was computed for 17 continuous-record streamflow-gaging stations and predicted for 20 partial-record stations in Monroe County and contiguous counties in northeastern Pennsylvania. The predicted statistics for the partial-record stations were based on regression analyses relating intermittent flow measurements made at the partial-record stations indexed to concurrent daily mean flows at continuous-record stations during base-flow conditions. The same statistics also were predicted for 134 ungaged stream locations in Monroe County on the basis of regression analyses relating the statistics to GIS-determined basin characteristics for the continuous-record station drainage areas. The prediction methodology used to develop the regression equations for estimating the statistics was originally developed for estimating low-flow frequencies. This study and a companion study found that the methodology also has application potential for predicting intermediate- and high-flow statistics. The statistics included mean monthly flows, mean annual flow, 7-day low flows for three recurrence intervals, nine flow durations, mean annual base flow, and annual mean base flows for two recurrence intervals. Low standard errors of prediction and high coefficients of determination (R2) indicated good results in using the regression equations to predict the statistics. Regression equations for the larger flow statistics tended to have lower standard errors of prediction and higher coefficients of determination (R2) than equations for the smaller flow statistics. The report discusses the methodologies used in determining the statistics and the limitations of the statistics and the equations used to predict the statistics. Caution is indicated in using the predicted statistics for small drainage area situations. Study results constitute input needed by water-resource managers in Monroe County for planning purposes and evaluation of water-resources availability.

  18. Relevance of the c-statistic when evaluating risk-adjustment models in surgery.

    PubMed

    Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Dimick, Justin B; Wang, Edward; Chow, Warren B; Ko, Clifford Y; Bilimoria, Karl Y

    2012-05-01

    The measurement of hospital quality based on outcomes requires risk adjustment. The c-statistic is a popular tool used to judge model performance, but can be limited, particularly when evaluating specific operations in focused populations. Our objectives were to examine the interpretation and relevance of the c-statistic when used in models with increasingly similar case mix and to consider an alternative perspective on model calibration based on a graphical depiction of model fit. From the American College of Surgeons National Surgical Quality Improvement Program (2008-2009), patients were identified who underwent a general surgery procedure, and procedure groups were increasingly restricted: colorectal-all, colorectal-elective cases only, and colorectal-elective cancer cases only. Mortality and serious morbidity outcomes were evaluated using logistic regression-based risk adjustment, and model c-statistics and calibration curves were used to compare model performance. During the study period, 323,427 general, 47,605 colorectal-all, 39,860 colorectal-elective, and 21,680 colorectal cancer patients were studied. Mortality ranged from 1.0% in general surgery to 4.1% in the colorectal-all group, and serious morbidity ranged from 3.9% in general surgery to 12.4% in the colorectal-all procedural group. As case mix was restricted, c-statistics progressively declined from the general to the colorectal cancer surgery cohorts for both mortality and serious morbidity (mortality: 0.949 to 0.866; serious morbidity: 0.861 to 0.668). Calibration was evaluated graphically by examining predicted vs observed number of events over risk deciles. For both mortality and serious morbidity, there was no qualitative difference in calibration identified between the procedure groups. In the present study, we demonstrate how the c-statistic can become less informative and, in certain circumstances, can lead to incorrect model-based conclusions, as case mix is restricted and patients become more homogenous. Although it remains an important tool, caution is advised when the c-statistic is advanced as the sole measure of a model performance. Copyright © 2012 American College of Surgeons. All rights reserved.
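
    Both quantities are simple to compute; a sketch evaluating the rank-based c-statistic and a predicted-versus-observed decile table on simulated risks:

        import numpy as np

        rng = np.random.default_rng(9)
        n = 20000
        p = rng.beta(1.0, 20.0, size=n)           # predicted risks
        y = rng.random(n) < p                     # outcomes generated to be well calibrated

        def c_statistic(p, y):
            """P(predicted risk of a random case exceeds that of a random non-case)."""
            ranks = np.argsort(np.argsort(p)) + 1.0   # 1-based ranks (ties ignored)
            n1 = y.sum()
            return (ranks[y].sum() - n1 * (n1 + 1) / 2) / (n1 * (len(y) - n1))

        print("c =", round(c_statistic(p, y), 3))

        # calibration: predicted vs observed events across risk deciles
        deciles = np.digitize(p, np.quantile(p, np.linspace(0.1, 0.9, 9)))
        for d in range(10):
            sel = deciles == d
            print(d, f"predicted {p[sel].sum():8.1f}  observed {int(y[sel].sum()):6d}")

    Restricting case mix narrows the spread of p, which lowers c even when the decile calibration stays good — the paper's central point.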

  19. Connectivity-based fixel enhancement: Whole-brain statistical analysis of diffusion MRI measures in the presence of crossing fibres

    PubMed Central

    Raffelt, David A.; Smith, Robert E.; Ridgway, Gerard R.; Tournier, J-Donald; Vaughan, David N.; Rose, Stephen; Henderson, Robert; Connelly, Alan

    2015-01-01

    In brain regions containing crossing fibre bundles, voxel-average diffusion MRI measures such as fractional anisotropy (FA) are difficult to interpret, and lack within-voxel single fibre population specificity. Recent work has focused on the development of more interpretable quantitative measures that can be associated with a specific fibre population within a voxel containing crossing fibres (herein we use fixel to refer to a specific fibre population within a single voxel). Unfortunately, traditional 3D methods for smoothing and cluster-based statistical inference cannot be used for voxel-based analysis of these measures, since the local neighbourhood for smoothing and cluster formation can be ambiguous when adjacent voxels may have different numbers of fixels, or ill-defined when they belong to different tracts. Here we introduce a novel statistical method to perform whole-brain fixel-based analysis called connectivity-based fixel enhancement (CFE). CFE uses probabilistic tractography to identify structurally connected fixels that are likely to share underlying anatomy and pathology. Probabilistic connectivity information is then used for tract-specific smoothing (prior to the statistical analysis) and enhancement of the statistical map (using a threshold-free cluster enhancement-like approach). To investigate the characteristics of the CFE method, we assessed sensitivity and specificity using a large number of combinations of CFE enhancement parameters and smoothing extents, using simulated pathology generated with a range of test-statistic signal-to-noise ratios in five different white matter regions (chosen to cover a broad range of fibre bundle features). The results suggest that CFE input parameters are relatively insensitive to the characteristics of the simulated pathology. We therefore recommend a single set of CFE parameters that should give near optimal results in future studies where the group effect is unknown. We then demonstrate the proposed method by comparing apparent fibre density between motor neurone disease (MND) patients with control subjects. The MND results illustrate the benefit of fixel-specific statistical inference in white matter regions that contain crossing fibres. PMID:26004503

  20. Serum adipokines and HIV viral replication in patients undergoing antiretroviral therapy

    PubMed Central

    Aramă, Victoria; Tilişcan, Cătălin; Ion, Daniela Adriana; Mihăilescu, Raluca; Munteanu, Daniela; Streinu-Cercel, Anca; Tudor, Ana Maria; Hristea, Adriana; Leoveanu, Viorica; Olaru, Ioana; Aramă, Ştefan Sorin

    2012-01-01

    Introduction Several studies have reported that cytokines secreted by adipose tissue (adipokines) may be linked to HIV replication. The aim of the study was to evaluate the relationship between HIV replication and serum levels of adipokines, in a Caucasian HIV-infected population of men and women undergoing complex antiretroviral therapy. Methods A cross-sectional study was conducted in an unselected sample of 77 HIV-1-positive patients. Serum adipokine levels were measured including circulating adiponectin, leptin, resistin, tumor necrosis factor alpha (TNF-alpha) and interleukin-6 (IL-6). Patients were divided into two groups: Group 1 - with undetectable viral load and Group 2 - with persistent HIV viral replication. Differences between groups were tested using the independent-sample t-test for Gaussian variables and the Mann–Whitney–Wilcoxon test for non-parametric variables. Pearson's chi-squared test was used for correlation analysis. Results A total of 77 patients (age range: 17-65, mean: 32.5 years) including 44 men (57.1% men, age range: 17–63 years, mean: 34.1 years) and 33 women (42.9% women, age range: 19–65 years, mean: 30.3 years) were included in the study. TNF-alpha had significantly higher serum levels in patients with detectable viral load (16.89 vs. 9.35 pg/mL; p=0.043), but correlation analysis lacked statistical significance. Adiponectin had median serum levels of 9.22 µg/mL in Group 1 vs. 16.50 µg/mL in Group 2, but the results lacked statistical significance (p=0.059). Higher leptin, IL-6 and resistin serum levels were noted in patients with undetectable HIV viral load, without statistical significance. Conclusions The present study reported higher TNF-alpha serum levels in patients with persistent HIV viral load. We found no statistically significant correlations between adiponectin, leptin, resistin and IL-6 and HIV viral load in our Caucasian HIV-positive study population undergoing antiretroviral therapy. PMID:24432258

  1. Which are the most useful scales for predicting repeat self-harm? A systematic review evaluating risk scales using measures of diagnostic accuracy

    PubMed Central

    Quinlivan, L; Cooper, J; Davies, L; Hawton, K; Gunnell, D; Kapur, N

    2016-01-01

    Objectives The aims of this review were to calculate the diagnostic accuracy statistics of risk scales following self-harm and consider which might be the most useful scales in clinical practice. Design Systematic review. Methods We based our search terms on those used in the systematic reviews carried out for the National Institute for Health and Care Excellence self-harm guidelines (2012) and evidence update (2013), and updated the searches through to February 2015 (CINAHL, EMBASE, MEDLINE, and PsycINFO). Methodological quality was assessed and three reviewers extracted data independently. We limited our analysis to cohort studies in adults using the outcome of repeat self-harm or attempted suicide. We calculated diagnostic accuracy statistics including measures of global accuracy. Statistical pooling was not possible due to heterogeneity. Results The eight papers included in the final analysis varied widely according to methodological quality and the content of scales employed. Overall, sensitivity of scales ranged from 6% (95% CI 5% to 6%) to 97% (95% CI 94% to 98%). The positive predictive value (PPV) ranged from 5% (95% CI 3% to 9%) to 84% (95% CI 80% to 87%). The diagnostic OR ranged from 1.01 (95% CI 0.434 to 2.5) to 16.3 (95% CI 12.5 to 21.4). Scales with high sensitivity tended to have low PPVs. Conclusions It is difficult to be certain which, if any, are the most useful scales for self-harm risk assessment. No scales perform sufficiently well so as to be recommended for routine clinical use. Further robust prospective studies are warranted to evaluate risk scales following an episode of self-harm. Diagnostic accuracy statistics should be considered in relation to the specific service needs, and scales should only be used as an adjunct to assessment. PMID:26873046
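
    The reported quantities follow directly from a 2×2 table; the sketch below computes them (with an approximate log-scale CI for the diagnostic odds ratio) from invented counts.

        import numpy as np

        # hypothetical scale-positive/negative counts vs repeat self-harm
        tp, fp, fn, tn = 120, 600, 30, 1250

        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)
        dor = (tp * tn) / (fp * fn)               # diagnostic odds ratio

        # approximate 95% CI for the DOR via the log-odds standard error
        se = np.sqrt(1/tp + 1/fp + 1/fn + 1/tn)
        lo, hi = np.exp(np.log(dor) + np.array([-1.96, 1.96]) * se)
        print(f"sens={sensitivity:.2f} spec={specificity:.2f} PPV={ppv:.2f}")
        print(f"DOR={dor:.1f} (95% CI {lo:.1f}-{hi:.1f})")

    Note how this invented scale has high sensitivity (0.80) but a PPV of only 0.17 — the trade-off the review found.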

  2. Magnitude and frequency of low flows in the Suwannee River Water Management District, Florida

    USGS Publications Warehouse

    Giese, G.L.; Franklin, M.A.

    1996-01-01

    Low-flow frequency statistics for 20 gaging stations having at least 10 years of continuous record and 31 other stations having less than 10 years of continuous record or a series of at least two low-flow measurements are presented for unregulated streams in the Suwannee River Water Management District in north-central Florida. Statistics for the 20 continuous-record stations included are the annual and monthly minimum consecutive-day average low-flow magnitudes for 1, 3, 7, 14, and 30 consecutive days for recurrence intervals of 2, 5, 10, 20, and, for some long-term stations, 50 years, based on records available through the 1994 climatic year. Only the annual statistics are given for the 31 other stations; these are for the 7- and 30-consecutive-day periods only and for recurrence intervals of 2 and 10 years only. Annual low-flow frequency statistics range from zero for many small streams to 5,500 cubic feet per second for the annual 30-consecutive-day average flow with a recurrence interval of 2 years for the Suwannee River near Wilcox (station 02323500). Monthly low-flow frequency statistics range from zero for many small streams to 13,800 cubic feet per second for the minimum 30-consecutive-day average flow with a 2-year recurrence interval for the month of March for the same station. Generally, low-flow characteristics of streams in the Suwannee River Water Management District are controlled by climatic, topographic, and geologic factors. The carbonate Floridan aquifer system underlies, or is at the surface of, the entire District. The terrane's karstic nature results in many sinkholes and springs. In some places, springs may contribute greatly to low streamflow and the contributing areas of such springs may include areas outside the presumed surface drainage area of the springs. In other places, water may enter sinkholes within a drainage basin, then reappear in springs downstream from a gage. Many of the smaller streams in the District go dry or have no flow for several months in many years. In addition to the low-flow statistics, four synoptic low-flow measurement surveys were conducted on 161 sites during 1990, 1995, and 1996. The measurements were made to provide "snapshots" of flow conditions of streams throughout the Suwannee River Water Management District. Magnitudes of low flows during the 1990 series of measurements were in the range associated with the minimum 7-consecutive-day 50-year recurrence interval to the minimum 7-consecutive-day 20-year recurrence interval, except in Taylor and Dixie Counties, where the magnitudes ranged from the minimum 7-consecutive-day 5-year flow level to the 7-consecutive-day 2-year flow level. The magnitudes were all greater than the minimum 7-consecutive-day 2-year flow level during 1995 and 1996. Observations of no flow were recorded at many of the sites for all four series of measurements.

  3. Can Opposite Clear Corneal Incisions Have a Role with Post-laser In Situ Keratomileusis Astigmatism?

    PubMed Central

    El-Awady, Hatem; Ghanem, Asaad A.

    2012-01-01

    Purpose: To evaluate the astigmatic correcting effect of paired opposite clear corneal incisions (OCCIs) on the steep axis in patients with residual astigmatism after laser in situ keratomileusis (LASIK). Materials and Methods: Thirty-one eyes of 24 patients with a mean age of 28.4 ± 2.46 years (range, 19-36 years) were recruited for the study. Inclusion criteria included residual astigmatism of ≥1.5 diopters (D) after LASIK with inadequate residual stromal bed thickness that precluded ablation. The cohort was divided into two groups: group I (with astigmatism ranging from -1.5 D to -2.5 D) and group II (with astigmatism > -2.5 D). The steep axis was marked prior to surgery. Paired three-step self-sealing opposite clear corneal incisions were performed 1 mm anterior to the limbus on the steep axis with a 3.2-mm keratome for group I and a 4.1-mm keratome for group II. Patients were examined 1 day, 1 week, 1 month, 3 months and 6 months postoperatively. Visual acuity, refraction, keratometry, and corneal topography were evaluated preoperatively and postoperatively. Analysis of the difference between groups was performed with the Student t-test. P<0.05 was considered statistically significant. Results: The mean uncorrected visual acuity (UCVA) improved from 0.35±0.13 (range, 0.1-0.6) to 0.78±0.19 (range, 0.5-1) in group I and from 0.26±0.19 (range, 0.1-0.5) to 0.7±0.18 (range, 0.4-1) in group II. The increase in UCVA was statistically significant in both groups (P=0.001, both cases). The mean preoperative and postoperative keratometric astigmatism in group I was 2.0±0.48 D (range, 1.5-2.5 D) and 0.8±0.37 D (range, 0.1-1.4 D), respectively. The decrease in keratometric astigmatism was highly statistically significant in group II (P=0.001). Mean surgically induced astigmatic reduction by vector analysis was 1.47±0.85 D and 2.21±0.97 D in groups I and II, respectively. There were no incision-related complications. Conclusions: Paired OCCIs were predictable and effective in correcting post-LASIK astigmatism and required no extra surgical skill or expensive instruments. OCCIs are especially useful in eyes with insufficient corneal thickness for LASIK retreatment. PMID:22623863

  4. Data and material of the Safe-Range-Inventory: An assistance tool helping to improve the charging infrastructure for electric vehicles.

    PubMed

    Carbon, Claus-Christian; Gebauer, Fabian

    2017-10-01

    The Safe-Range-Inventory (SRI) was constructed in order to help public authorities improve the charging infrastructure for electric vehicles [1; 10.1016/j.trf.2017.04.011]. Specifically, the impact of fast (vs. slow) charging stations on people's range anxiety was examined. Ninety-seven electric vehicle users from Germany (81 male; mean age = 46.3 years, SD = 12.1) were recruited to participate in the experimental design. Statistical analyses were conducted using a repeated-measures ANOVA to test for interaction effects of available charging stations and remaining range on the dependent variable, range anxiety. The full data set is publicly available via https://osf.io/bveyw/ (Carbon and Gebauer, 2017) [2].

  5. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    PubMed

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.
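
    For readers who want the flavor of such parsers, here is a much simpler single-sequence dynamic program (Nussinov base-pair maximization) — not TORNADO's grammar, but the same DP skeleton that SCFG-based methods generalize with probabilistic scores:

        # Nussinov algorithm: maximize the number of nested base pairs,
        # with a minimum hairpin loop length.
        def nussinov(seq, min_loop=3):
            pairs = {("A", "U"), ("U", "A"), ("C", "G"),
                     ("G", "C"), ("G", "U"), ("U", "G")}
            n = len(seq)
            N = [[0] * n for _ in range(n)]
            for span in range(min_loop + 1, n):
                for i in range(n - span):
                    j = i + span
                    best = N[i][j - 1]            # j unpaired
                    for k in range(i, j - min_loop):
                        if (seq[k], seq[j]) in pairs:      # pair k with j
                            left = N[i][k - 1] if k > i else 0
                            best = max(best, left + 1 + N[k + 1][j - 1])
                    N[i][j] = best
            return N[0][n - 1]

        print(nussinov("GGGAAAUCC"))              # -> 3 nested pairs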

  6. The statistical overlap theory of chromatography using power law (fractal) statistics.

    PubMed

    Schure, Mark R; Davis, Joe M

    2011-12-30

    The chromatographic dimensionality was recently proposed as a measure of retention time spacing based on a power law (fractal) distribution. Using this model, a statistical overlap theory (SOT) for chromatographic peaks is developed that estimates the number of peak maxima as a function of the chromatographic dimension, saturation and scale. Power law models exhibit a threshold region whereby below a critical saturation value no loss of peak maxima due to peak fusion occurs as saturation increases. At moderate saturation, behavior is similar to the random (Poisson) peak model. At still higher saturation, the power law model shows loss of peaks nearly independent of the scale and dimension of the model. The physicochemical meaning of the power law scale parameter is discussed and shown to be equal to the Boltzmann-weighted free energy of transfer over the scale limits. A small scale range (small β) is shown to generate more uniform chromatograms, whereas a large scale range (large β) gives occasional large excursions of retention times; this is a property of power laws, where "wild" behavior is noted to occasionally occur. Both cases are shown to be useful depending on the chromatographic saturation. A scale-invariant model of the SOT shows very simple relationships between the fraction of peak maxima and the saturation, peak width and number of theoretical plates. These equations provide much insight into separations that follow power law statistics. Copyright © 2011 Elsevier B.V. All rights reserved.

  7. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more

    PubMed Central

    Rivas, Elena; Lang, Raymond; Eddy, Sean R.

    2012-01-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases. PMID:22194308

  8. Statistical investigation of avalanches of three-dimensional small-world networks and their boundary and bulk cross-sections

    NASA Astrophysics Data System (ADS)

    Najafi, M. N.; Dashti-Naserabadi, H.

    2018-03-01

    In many situations we are interested in the propagation of energy in some portions of a three-dimensional system with dilute long-range links. In this paper, a sandpile model is defined on the three-dimensional small-world network with real dissipative boundaries and the energy propagation is studied in three dimensions as well as in the two-dimensional cross-sections. Two types of cross-sections are defined in the system, one in the bulk and another on the system boundary. The motivation of this is to make clear how the statistics of the avalanches in the bulk cross-section tend to the statistics of the dissipative avalanches, defined in the boundaries, as the concentration of long-range links (α) increases. This trend is numerically shown to be a power law in a manner described in the paper. Two regimes of α are considered in this work. For sufficiently small α the dominant behavior of the system is just like that of the regular BTW model, whereas for intermediate values the behavior is nontrivial, with some exponents that are reported in the paper. It is shown that the spatial extent up to which the statistics is similar to the regular BTW model scales with α just like the dissipative BTW model with dissipation factor (mass in the corresponding ghost model) m² ∼ α, for the three-dimensional system as well as its two-dimensional cross-sections.
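
    A much-reduced sketch (two dimensions, invented parameters) of a BTW-style sandpile with dilute shortcuts and dissipative boundaries, to make the setup concrete:

        import numpy as np

        rng = np.random.default_rng(13)
        side, alpha, n_grains = 32, 0.05, 20000
        z = np.zeros((side, side), dtype=int)

        # each site gets, with probability alpha, one long-range partner
        shortcut = {(x, y): (int(rng.integers(side)), int(rng.integers(side)))
                    for x in range(side) for y in range(side) if rng.random() < alpha}

        sizes = []
        for _ in range(n_grains):
            x, y = rng.integers(side, size=2)
            z[x, y] += 1
            size, unstable = 0, [(int(x), int(y))]
            while unstable:
                i, j = unstable.pop()
                if z[i, j] < 4:
                    continue
                z[i, j] -= 4                      # topple
                size += 1
                targets = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
                if (i, j) in shortcut:            # one grain rides the shortcut
                    targets[0] = shortcut[(i, j)]
                for a, b in targets:
                    if 0 <= a < side and 0 <= b < side:   # off-lattice = dissipated
                        z[a, b] += 1
                        if z[a, b] >= 4:
                            unstable.append((a, b))
            if size:
                sizes.append(size)

        print("avalanches:", len(sizes), "mean size:", np.mean(sizes))

    Sweeping alpha and comparing avalanche-size statistics in bulk versus boundary regions would mimic the crossover studied in the paper.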

  9. Estimating Low-Flow Frequency Statistics and Hydrologic Analysis of Selected Streamflow-Gaging Stations, Nooksack River Basin, Northwestern Washington and Canada

    USGS Publications Warehouse

    Curran, Christopher A.; Olsen, Theresa D.

    2009-01-01

    Low-flow frequency statistics were computed at 17 continuous-record streamflow-gaging stations and 8 miscellaneous measurement sites in and near the Nooksack River basin in northwestern Washington and Canada, including the 1, 3, 7, 15, 30, and 60 consecutive-day low flows with recurrence intervals of 2 and 10 years. Using these low-flow statistics, 12 regional regression equations were developed for estimating the same low-flow statistics at ungaged sites in the Nooksack River basin using a weighted-least-squares method. Adjusted R2 (coefficient of determination) values for the equations ranged from 0.79 to 0.93 and the root-mean-squared error (RMSE) expressed as a percentage ranged from 77 to 560 percent. Streamflow records from six gaging stations located in mountain-stream or lowland-stream subbasins of the Nooksack River basin were analyzed to determine if any of the gaging stations could be removed from the network without significant loss of information. Using methods of hydrograph comparison, daily-value correlation, variable space, and flow-duration ratios, and other factors relating to individual subbasins, the six gaging stations were prioritized from most to least important as follows: Skookum Creek (12209490), Anderson Creek (12210900), Warm Creek (12207750), Fishtrap Creek (12212050), Racehorse Creek (12206900), and Clearwater Creek (12207850). The optimum streamflow-gaging station network would contain all gaging stations except Clearwater Creek, and the minimum network would include Skookum Creek and Anderson Creek.
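
    The weighted-least-squares machinery behind such regional regression equations fits in a few lines. Everything below (basin characteristics, flows, record-length weights) is invented for illustration; the study's equations use its own predictors and weighting:

        import numpy as np

        # predictors: intercept, log drainage area (mi^2), log mean annual precip (in)
        X = np.column_stack([np.ones(5),
                             np.log([12.0, 48.0, 95.0, 160.0, 310.0]),
                             np.log([40.0, 55.0, 60.0, 75.0, 90.0])])
        y = np.log([0.8, 4.1, 9.5, 22.0, 61.0])      # 7-day, 2-year low flow (ft^3/s)
        w = np.array([8.0, 15.0, 20.0, 30.0, 45.0])  # e.g., years of record

        # weighted least squares: minimize sum_i w_i * (y_i - x_i b)^2
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        print("intercept, log-area, log-precip coefficients:", beta)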

  10. (Finite) statistical size effects on compressive strength.

    PubMed

    Weiss, Jérôme; Girard, Lucas; Gimbert, Florent; Amitrano, David; Vandembroucq, Damien

    2014-04-29

    The larger structures are, the lower their mechanical strength. Already discussed by Leonardo da Vinci and Edmé Mariotte several centuries ago, size effects on strength remain of crucial importance in modern engineering for the elaboration of safety regulations in structural design or the extrapolation of laboratory results to geophysical field scales. Under tensile loading, statistical size effects are traditionally modeled with a weakest-link approach. One of its prominent results is a prediction of vanishing strength at large scales that can be quantified in the framework of extreme value statistics. Despite a frequent use outside its range of validity, this approach remains the dominant tool in the field of statistical size effects. Here we focus on compressive failure, which concerns a wide range of geophysical and geotechnical situations. We show on historical and recent experimental data that weakest-link predictions are not obeyed. In particular, the mechanical strength saturates at a nonzero value toward large scales. Accounting explicitly for the elastic interactions between defects during the damage process, we build a formal analogy of compressive failure with the depinning transition of an elastic manifold. This critical transition interpretation naturally entails finite-size scaling laws for the mean strength and its associated variability. Theoretical predictions are in remarkable agreement with measurements reported for various materials such as rocks, ice, coal, or concrete. This formalism, which can also be extended to the flowing instability of granular media under multiaxial compression, has important practical consequences for future design rules.
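
    The contrast drawn here can be stated compactly. In the standard weakest-link (Weibull) picture the survival probability of a volume V at stress σ, and hence the mean strength, behave as follows (generic textbook notation, not necessarily the authors' exact form):

        \[
          P_s(\sigma, V) = \exp\!\left[-\frac{V}{V_0}\left(\frac{\sigma}{\sigma_0}\right)^{m}\right]
          \quad\Rightarrow\quad
          \langle\sigma\rangle \propto V^{-1/m} \xrightarrow{\,V\to\infty\,} 0,
        \]
        \[
          \text{whereas a critical (depinning) transition implies the finite-size scaling}
          \quad
          \langle\sigma(L)\rangle = \sigma_\infty + A\,L^{-1/\nu}, \qquad \sigma_\infty > 0 .
        \]

    The qualitative difference, vanishing versus saturating strength at large scales, is precisely what the compressive-failure data are used to discriminate.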

  11. Optimism bias leads to inconclusive results - an empirical study

    PubMed Central

    Djulbegovic, Benjamin; Kumar, Ambuj; Magazin, Anja; Schroen, Anneke T.; Soares, Heloisa; Hozo, Iztok; Clarke, Mike; Sargent, Daniel; Schell, Michael J.

    2010-01-01

    Objective: Optimism bias refers to unwarranted belief in the efficacy of new therapies. We assessed the impact of optimism bias on the proportion of trials that did not answer their research question successfully, and explored whether poor accrual or optimism bias is responsible for inconclusive results. Study Design: Systematic review. Setting: Retrospective analysis of a consecutive series of phase III randomized controlled trials (RCTs) performed under the aegis of National Cancer Institute Cooperative groups. Results: 359 trials (374 comparisons) enrolling 150,232 patients were analyzed. 70% (262/374) of the trials generated conclusive results according to the statistical criteria. Investigators made definitive statements related to the treatment preference in 73% (273/374) of studies. Investigators' judgments and statistical inferences were concordant in 75% (279/374) of trials. Investigators consistently overestimated their expected treatment effects, but to a significantly larger extent for inconclusive trials. The median ratio of expected over observed hazard ratio or odds ratio was 1.34 (range 0.19 – 15.40) in conclusive trials compared to 1.86 (range 1.09 – 12.00) in inconclusive studies (p<0.0001). Only 17% of the trials had treatment effects that matched original researchers' expectations. Conclusion: Formal statistical inference is sufficient to answer the research question in 75% of RCTs. The answers to the other 25% depend mostly on subjective judgments, which at times are in conflict with statistical inference. Optimism bias significantly contributes to inconclusive results. PMID:21163620

  12. Optimism bias leads to inconclusive results-an empirical study.

    PubMed

    Djulbegovic, Benjamin; Kumar, Ambuj; Magazin, Anja; Schroen, Anneke T; Soares, Heloisa; Hozo, Iztok; Clarke, Mike; Sargent, Daniel; Schell, Michael J

    2011-06-01

    Optimism bias refers to unwarranted belief in the efficacy of new therapies. We assessed the impact of optimism bias on the proportion of trials that did not answer their research question successfully and explored whether poor accrual or optimism bias is responsible for inconclusive results. Systematic review. Retrospective analysis of a consecutive series of phase III randomized controlled trials (RCTs) performed under the aegis of National Cancer Institute Cooperative groups. Three hundred fifty-nine trials (374 comparisons) enrolling 150,232 patients were analyzed. Seventy percent (262 of 374) of the trials generated conclusive results according to the statistical criteria. Investigators made definitive statements related to the treatment preference in 73% (273 of 374) of studies. Investigators' judgments and statistical inferences were concordant in 75% (279 of 374) of trials. Investigators consistently overestimated their expected treatment effects but to a significantly larger extent for inconclusive trials. The median ratio of expected and observed hazard ratio or odds ratio was 1.34 (range: 0.19-15.40) in conclusive trials compared with 1.86 (range: 1.09-12.00) in inconclusive studies (P<0.0001). Only 17% of the trials had treatment effects that matched original researchers' expectations. Formal statistical inference is sufficient to answer the research question in 75% of RCTs. The answers to the other 25% depend mostly on subjective judgments, which at times are in conflict with statistical inference. Optimism bias significantly contributes to inconclusive results. Copyright © 2011 Elsevier Inc. All rights reserved.

  13. On the Statistical Properties of Cospectra

    NASA Astrophysics Data System (ADS)

    Huppenkothen, D.; Bachetti, M.

    2018-05-01

    In recent years, the cross-spectrum has received considerable attention as a means of characterizing the variability of astronomical sources as a function of wavelength. The cospectrum has only recently been understood as a means of mitigating instrumental effects dependent on temporal frequency in astronomical detectors, as well as a method of characterizing the coherent variability in two wavelength ranges on different timescales. In this paper, we lay out the statistical foundations of the cospectrum, starting with the simplest case of detecting a periodic signal in the presence of white noise, under the assumption that the same source is observed simultaneously in independent detectors in the same energy range. This case is especially relevant for detecting faint X-ray pulsars in detectors heavily affected by instrumental effects, including NuSTAR, Astrosat, and IXPE, which allow for even sampling and where the cospectrum can act as an effective way to mitigate dead time. We show that the statistical distributions of both single and averaged cospectra differ considerably from those for standard periodograms. While a single cospectrum follows a Laplace distribution exactly, averaged cospectra are approximated by a Gaussian distribution only for more than ∼30 averaged segments, dependent on the number of trials. We provide an instructive example of a quasi-periodic oscillation in NuSTAR and show that applying standard periodogram statistics leads to underestimated tail probabilities for period detection. We also demonstrate the application of these distributions to a NuSTAR observation of the X-ray pulsar Hercules X-1.
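
    The distributional claim is easy to check numerically. A minimal sketch with white-noise light curves in two hypothetical independent detectors (segment length and count are illustrative):

        import numpy as np

        rng = np.random.default_rng(1)

        def cospectrum(x, y):
            """Cospectrum = real part of the cross spectrum of two time series."""
            fx, fy = np.fft.rfft(x), np.fft.rfft(y)
            return (fx * np.conj(fy)).real[1:]   # drop the DC bin

        n, nseg = 1024, 64
        single = cospectrum(rng.normal(size=n), rng.normal(size=n))
        avg = np.mean([cospectrum(rng.normal(size=n), rng.normal(size=n))
                       for _ in range(nseg)], axis=0)

        # Laplace-like heavy tails for a single cospectrum; the 64-segment
        # average is much closer to Gaussian (tail fraction near 0.0027).
        print("single   P(|C| > 3 sigma):", np.mean(np.abs(single) > 3 * single.std()))
        print("averaged P(|C| > 3 sigma):", np.mean(np.abs(avg) > 3 * avg.std()))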

  14. Chemical freezeout parameters within generic nonextensive statistics

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel; Yassin, Hayam; Abo Elyazeed, Eman R.

    2018-06-01

    Particle production in relativistic heavy-ion collisions seems to occur in a dynamically disordered system that is best described by an extended exponential entropy. In distinguishing between the applicability of this and Boltzmann-Gibbs (BG) statistics in generating various particle ratios, generic (non)extensive statistics is introduced to the hadron resonance gas model. Accordingly, the degree of (non)extensivity is determined by the possible modifications in the phase space. Both BG extensivity and Tsallis nonextensivity are included as very special cases defined by specific values of the equivalence classes (c, d). We found that the particle ratios at energies ranging between 3.8 and 2760 GeV are best reproduced by nonextensive statistics, where c and d range between ∼0.9 and ∼1. The present work aims at illustrating that the proposed approach is well capable of capturing the statistical nature of the system of interest. We do not aim to highlight deeper physical insights. In other words, while the resulting nonextensivity is neither BG nor Tsallis, the freezeout parameters are found to be very compatible with BG and accordingly with the well-known freezeout phase diagram, which is in excellent agreement with recent lattice calculations. We conclude that particle production is nonextensive but need not be accompanied by a radical change in the intensive or extensive thermodynamic quantities, such as internal energy and temperature. Only the two critical exponents defining the equivalence classes (c, d) are the physical parameters characterizing the (non)extensivity.

  15. Effects of McGill stabilization exercises and conventional physiotherapy on pain, functional disability and active back range of motion in patients with chronic non-specific low back pain.

    PubMed

    Ghorbanpour, Arsalan; Azghani, Mahmoud Reza; Taghipour, Mohammad; Salahzadeh, Zahra; Ghaderi, Fariba; Oskouei, Ali E

    2018-04-01

    [Purpose] The aim of this study was to compare the effects of "McGill stabilization exercises" and "conventional physiotherapy" on pain, functional disability, and active back flexion and extension range of motion in patients with chronic non-specific low back pain. [Subjects and Methods] Thirty-four patients with chronic non-specific low back pain were randomly assigned to the McGill stabilization exercises group (n=17) or the conventional physiotherapy group (n=17). In both groups, patients performed the corresponding exercises for six weeks. The visual analog scale (VAS), Quebec Low Back Pain Disability Scale Questionnaire, and inclinometer were used to measure pain, functional disability, and active back flexion and extension range of motion, respectively. [Results] Statistically significant improvements were observed in pain, functional disability, and active back extension range of motion in the McGill stabilization exercises group. However, active back flexion range of motion was the only clinical symptom that statistically increased in patients who performed conventional physiotherapy. There was no significant difference in clinical characteristics when the two groups were compared. [Conclusion] The results of this study indicated that McGill stabilization exercises and conventional physiotherapy provided approximately similar improvement in pain, functional disability, and active back range of motion in patients with chronic non-specific low back pain. However, it appears that McGill stabilization exercises provide an additional benefit to patients with chronic non-specific low back pain, especially in pain and functional disability improvement.

  16. Anyon black holes

    NASA Astrophysics Data System (ADS)

    Aghaei Abchouyeh, Maryam; Mirza, Behrouz; Karimi Takrami, Moein; Younesizadeh, Younes

    2018-05-01

    We propose a correspondence between an Anyon Van der Waals fluid and a (2 + 1) dimensional AdS black hole. Anyons are particles with intermediate statistics that interpolate between Fermi-Dirac statistics and Bose-Einstein statistics. A parameter α (0 < α < 1) characterizes this intermediate statistics of Anyons. The equation of state for the Anyon Van der Waals fluid shows that it has quasi Fermi-Dirac statistics for α > αc, but quasi Bose-Einstein statistics for α < αc. By defining a general form of the metric for the (2 + 1) dimensional AdS black hole and considering the temperature of the black hole to be equal to that of the Anyon Van der Waals fluid, we construct the exact form of the metric for a (2 + 1) dimensional AdS black hole. The thermodynamic properties of this black hole are consistent with those of the Anyon Van der Waals fluid. For α < αc, the solution exhibits quasi Bose-Einstein statistics. For α > αc and a range of values of the cosmological constant, there is, however, no event horizon, so there is no black hole solution. Thus, for these values of the cosmological constant, the AdS Anyon Van der Waals black holes have only quasi Bose-Einstein statistics.

  17. EHME: a new word database for research in Basque language.

    PubMed

    Acha, Joana; Laka, Itziar; Landa, Josu; Salaburu, Pello

    2014-11-14

    This article presents EHME, the frequency dictionary of Basque structure, an online program that enables researchers in psycholinguistics to extract word and nonword stimuli, based on a broad range of statistics concerning the properties of Basque words. The database consists of 22.7 million tokens, and properties available include morphological structure frequency and word-similarity measures, apart from classical indexes: word frequency, orthographic structure, orthographic similarity, bigram and biphone frequency, and syllable-based measures. Measures are indexed at the lemma, morpheme and word level. We include reliability and validation analysis. The application is freely available, and enables the user to extract words based on concrete statistical criteria, as well as to obtain statistical characteristics from a list of words.

  18. Statistical strategy for anisotropic adventitia modelling in IVUS.

    PubMed

    Gil, Debora; Hernández, Aura; Rodriguez, Oriol; Mauri, Josepa; Radeva, Petia

    2006-06-01

    Vessel plaque assessment by analysis of intravascular ultrasound sequences is a useful tool for cardiac disease diagnosis and intervention. Manual detection of luminal (inner) and media-adventitia (external) vessel borders is the main activity of physicians in the process of lumen narrowing (plaque) quantification. Difficult definition of vessel border descriptors, as well as shades, artifacts, and blurred signal response due to ultrasound physical properties, troubles automated adventitia segmentation. In order to efficiently approach such a complex problem, we propose blending advanced anisotropic filtering operators and statistical classification techniques into a vessel border modelling strategy. Our systematic statistical analysis shows that the reported adventitia detection achieves an accuracy in the range of interobserver variability regardless of plaque nature, vessel geometry, and incomplete vessel borders.

  19. OmniStats

    DOT National Transportation Integrated Search

    2008-09-01

    OmniStats is an irregular newsletter that looks at a single topic in each 2-3 page issue, drawing statistics from the BTS monthly Omnibus Household Survey. Topics range from air travel security to disposal of used motor...

  20. Flood- and drought-related natural hazards activities of the U.S. Geological Survey in New England

    USGS Publications Warehouse

    Lombard, Pamela J.

    2016-03-23

    Tools for natural hazard assessment and mitigation:
    • Light detection and ranging (lidar) remote sensing technology
    • StreamStats Web-based tool for streamflow statistics
    • Flood inundation mapper

  1. Non-Gaussian information from weak lensing data via deep learning

    NASA Astrophysics Data System (ADS)

    Gupta, Arushi; Matilla, José Manuel Zorrilla; Hsu, Daniel; Haiman, Zoltán

    2018-05-01

    Weak lensing maps contain information beyond two-point statistics on small scales. Much recent work has tried to extract this information through a range of different observables or via nonlinear transformations of the lensing field. Here we train and apply a two-dimensional convolutional neural network to simulated noiseless lensing maps covering 96 different cosmological models over a range of {Ωm,σ8} . Using the area of the confidence contour in the {Ωm,σ8} plane as a figure of merit, derived from simulated convergence maps smoothed on a scale of 1.0 arcmin, we show that the neural network yields ≈5 × tighter constraints than the power spectrum, and ≈4 × tighter than the lensing peaks. Such gains illustrate the extent to which weak lensing data encode cosmological information not accessible to the power spectrum or even other, non-Gaussian statistics such as lensing peaks.
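
    A minimal sketch of such a map-to-parameters network is shown below, assuming PyTorch; the layer sizes and map dimensions are illustrative guesses, not the architecture used by the authors:

        import torch
        import torch.nn as nn

        # maps a single-channel convergence map to (Omega_m, sigma_8)
        model = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.AvgPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.AvgPool2d(2),
            nn.Flatten(),
            nn.LazyLinear(64), nn.ReLU(),
            nn.Linear(64, 2),                # regression output: (Omega_m, sigma_8)
        )

        kappa = torch.randn(8, 1, 64, 64)    # a batch of mock 64x64 convergence maps
        print(model(kappa).shape)            # torch.Size([8, 2])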

  2. Operational planning using Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS)

    NASA Astrophysics Data System (ADS)

    O'Connor, Alison; Kirtman, Benjamin; Harrison, Scott; Gorman, Joe

    2016-05-01

    The US Navy faces several limitations in forecasting environmental conditions when planning operations. Currently, mission analysis and planning tools rely heavily on short-term (less than a week) forecasts or long-term statistical climate products. However, newly available weather forecast ensembles provide dynamical and statistical extended-range predictions that can be more accurate if ensemble members are combined correctly. Charles River Analytics is designing the Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS), which performs data fusion over extended-range multi-model ensembles, such as the North American Multi-Model Ensemble (NMME), to produce a unified forecast for several weeks to several seasons in the future. We evaluated thirty years of forecasts using machine learning to select predictions for an all-encompassing and superior forecast that can be used to inform the Navy's decision planning process.

  3. Comparison of DSM-IV-TR and DSM-5 Criteria in Diagnosing Autism Spectrum Disorders in Singapore.

    PubMed

    Sung, Min; Goh, Tze Jui; Tan, Bei Lin Joelene; Chan, Jialei Stephanie; Liew, Hwee Sen Alvin

    2018-04-28

    Our study examines the Diagnostic and Statistical Manual-Fifth Edition (DSM-5) and Diagnostic and Statistical Manual-Fourth Edition, Text Revision (DSM-IV-TR) when applied concurrently against the best estimate clinical diagnoses for 110 children (5.1-19.6 years old) referred for diagnostic assessments of Autism Spectrum Disorder (ASD) in a Singaporean outpatient speciality clinic. DSM-IV-TR performed slightly better, yielding sensitivity of 0.946 and specificity of 0.889, compared to DSM-5 (sensitivity = 0.837; specificity = 0.833). When considering the ASD sub-categories, sensitivity ranged from 0.667 to 0.933, and specificity ranged from 0.900 to 0.975. More participants with a PDD-NOS best estimate clinical diagnosis (40%) were misclassified on DSM-5. Merits and weaknesses to both classification systems, and implications for access to services and policy changes are discussed.
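
    The reported figures are simple functions of a 2x2 classification table. The counts below are invented, chosen only so that the arithmetic reproduces the DSM-5 sensitivity and specificity quoted above:

        # hypothetical 2x2 table: 41/8 ASD cases detected/missed,
        # 15/3 non-ASD children correctly excluded/misclassified
        tp, fn = 41, 8
        tn, fp = 15, 3

        sensitivity = tp / (tp + fn)   # 0.837
        specificity = tn / (tn + fp)   # 0.833
        print(f"sensitivity={sensitivity:.3f}  specificity={specificity:.3f}")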

  4. Statistical characterization of multiple-reaction monitoring mass spectrometry (MRM-MS) assays for quantitative proteomics

    PubMed Central

    2012-01-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) with stable isotope dilution (SID) is increasingly becoming a widely accepted assay for the quantification of proteins and peptides. These assays have shown great promise in relatively high throughput verification of candidate biomarkers. While the use of MRM-MS assays is well established in the small molecule realm, their introduction and use in proteomics is relatively recent. As such, statistical and computational methods for the analysis of MRM-MS data from proteins and peptides are still being developed. Based on our extensive experience with analyzing a wide range of SID-MRM-MS data, we set forth a methodology for analysis that encompasses significant aspects ranging from data quality assessment, assay characterization including calibration curves, limits of detection (LOD) and quantification (LOQ), and measurement of intra- and interlaboratory precision. We draw upon publicly available seminal datasets to illustrate our methods and algorithms. PMID:23176545

  5. Statistical characterization of multiple-reaction monitoring mass spectrometry (MRM-MS) assays for quantitative proteomics.

    PubMed

    Mani, D R; Abbatiello, Susan E; Carr, Steven A

    2012-01-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) with stable isotope dilution (SID) is increasingly becoming a widely accepted assay for the quantification of proteins and peptides. These assays have shown great promise in relatively high throughput verification of candidate biomarkers. While the use of MRM-MS assays is well established in the small molecule realm, their introduction and use in proteomics is relatively recent. As such, statistical and computational methods for the analysis of MRM-MS data from proteins and peptides are still being developed. Based on our extensive experience with analyzing a wide range of SID-MRM-MS data, we set forth a methodology for analysis that encompasses significant aspects ranging from data quality assessment, assay characterization including calibration curves, limits of detection (LOD) and quantification (LOQ), and measurement of intra- and interlaboratory precision. We draw upon publicly available seminal datasets to illustrate our methods and algorithms.
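
    One assay-characterization step described here, fitting a calibration curve and deriving limits of detection and quantification, can be sketched as follows. The data are synthetic, and the 3.3/10 residual-based convention is one common choice rather than necessarily the authors' procedure:

        import numpy as np

        # spiked concentration (fmol/uL) vs. measured peak-area ratio
        conc  = np.array([0.0, 0.5, 1.0, 2.5, 5.0, 10.0, 25.0])
        ratio = np.array([0.002, 0.011, 0.021, 0.053, 0.104, 0.209, 0.517])

        slope, intercept = np.polyfit(conc, ratio, 1)
        resid = ratio - (slope * conc + intercept)
        s_y = resid.std(ddof=2)            # residual standard deviation

        lod = 3.3 * s_y / slope            # limit of detection
        loq = 10.0 * s_y / slope           # limit of quantification
        print(f"slope={slope:.4f}  LOD={lod:.2f}  LOQ={loq:.2f} fmol/uL")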

  6. Inflated Uncertainty in Multimodel-Based Regional Climate Projections.

    PubMed

    Madsen, Marianne Sloth; Langen, Peter L; Boberg, Fredrik; Christensen, Jens Hesselbjerg

    2017-11-28

    Multimodel ensembles are widely analyzed to estimate the range of future regional climate change projections. For an ensemble of climate models, the result is often portrayed by showing maps of the geographical distribution of the multimodel mean results and associated uncertainties represented by model spread at the grid point scale. Here we use a set of CMIP5 models to show that presenting statistics this way results in an overestimation of the projected range leading to physically implausible patterns of change on global but also on regional scales. We point out that similar inconsistencies occur in impact analyses relying on multimodel information extracted using statistics at the regional scale, for example, when a subset of CMIP models is selected to represent regional model spread. Consequently, the risk of unwanted impacts may be overestimated at larger scales as climate change impacts will never be realized as the worst (or best) case everywhere.

  7. Asymmetry of projected increases in extreme temperature distributions

    PubMed Central

    Kodra, Evan; Ganguly, Auroop R.

    2014-01-01

    A statistical analysis reveals projections of consistently larger increases in the highest percentiles of summer and winter temperature maxima and minima versus the respective lowest percentiles, resulting in a wider range of temperature extremes in the future. These asymmetric changes in tail distributions of temperature appear robust when explored through 14 CMIP5 climate models and three reanalysis datasets. Asymmetry of projected increases in temperature extremes generalizes widely. Magnitude of the projected asymmetry depends significantly on region, season, land-ocean contrast, and climate model variability as well as whether the extremes of consideration are seasonal minima or maxima events. An assessment of potential physical mechanisms provides support for asymmetric tail increases and hence wider temperature extremes ranges, especially for northern winter extremes. These results offer statistically grounded perspectives on projected changes in the IPCC-recommended extremes indices relevant for impacts and adaptation studies. PMID:25073751
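
    The idea of asymmetric tail changes can be made concrete with a toy calculation (synthetic temperatures, not CMIP5 output): shift and stretch a baseline distribution of seasonal maxima, then compare the two tails:

        import numpy as np

        rng = np.random.default_rng(2)

        base   = rng.normal(30.0, 3.0, 50_000)    # present-day summer maxima, deg C
        future = 31.0 + 1.25 * (base - 30.0)      # warmer mean, inflated variance

        for q in (5, 95):
            d = np.percentile(future, q) - np.percentile(base, q)
            print(f"P{q:02d} change: {d:+.2f} C")
        # the 95th percentile rises more than the 5th: a wider range of extremes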

  8. Temperature and composition dependence of short-range order and entropy, and statistics of bond length: the semiconductor alloy (GaN)(1-x)(ZnO)(x).

    PubMed

    Liu, Jian; Pedroza, Luana S; Misch, Carissa; Fernández-Serra, Maria V; Allen, Philip B

    2014-07-09

    We present total energy and force calculations for the (GaN)(1-x)(ZnO)(x) alloy. Site-occupancy configurations are generated from Monte Carlo (MC) simulations, on the basis of a cluster expansion model proposed in a previous study. Local atomic coordinate relaxations of surprisingly large magnitude are found via density-functional calculations using a 432-atom periodic supercell, for three representative configurations at x = 0.5. These are used to generate bond-length distributions. The configurationally averaged composition- and temperature-dependent short-range order (SRO) parameters of the alloys are discussed. The entropy is approximated in terms of pair distribution statistics and thus related to SRO parameters. This approximate entropy is compared with accurate numerical values from MC simulations. An empirical model for the dependence of the bond length on the local chemical environments is proposed.

  9. Higher order statistics of planetary gravities and topographies

    NASA Technical Reports Server (NTRS)

    Kaula, William M.

    1993-01-01

    The statistical properties of Earth, Venus, Mars, Moon, and a 3-D mantle convection model are compared. The higher order properties are expressed by third and fourth moments: i.e., as mean products over equilateral triangles (defined as coskewance) and equilateral quadrangles (defined as coexance). For point values, all the fields of real planets have positive skewness, ranging from slightly above zero for Lunar gravity to 2.6 sigma(exp 3) for Martian gravity (sigma is rms magnitude). Six of the eight excesses are greater than Gaussian (3 sigma(exp 4)), ranging from 2.0 sigma(exp 4) for Earth topography to 18.6 sigma(exp 4), for Martian topography. The coskewances and coexances drop off to zero within 20 deg arc in most cases. The mantle convective model has zero skewness and excess slightly less than Gaussian, probably arising from viscosity variations being only radial.
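
    In the paper's normalization a Gaussian field has zero skewness and an excess of 3 sigma(exp 4); with moments normalized by sigma, the Gaussian reference values are 0 and 3. A quick illustration on synthetic fields (not planetary data):

        import numpy as np

        rng = np.random.default_rng(4)

        def skewness(f):
            f = f - f.mean()
            return np.mean(f**3) / f.std()**3      # <f^3>/sigma^3

        def fourth_moment(f):
            f = f - f.mean()
            return np.mean(f**4) / f.std()**4      # <f^4>/sigma^4, Gaussian -> 3

        g = rng.normal(size=100_000)
        print(skewness(g), fourth_moment(g))       # ~0 and ~3
        print(skewness(np.exp(g)))                 # a lognormal field is strongly skewed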

  10. Cape Canaveral, Florida range reference atmosphere 0-70 km altitude

    NASA Technical Reports Server (NTRS)

    Tingle, A. (Editor)

    1983-01-01

    The RRA contains tabulations for monthly and annual means, standard deviations, and skewness coefficients for wind speed, pressure, temperature, density, water vapor pressure, virtual temperature, and dew-point temperature, and the means and standard deviations for the zonal and meridional wind components and the linear (product moment) correlation coefficient between the wind components. These statistical parameters are tabulated at the station elevation and at 1 km intervals from sea level to 30 km and at 2 km intervals from 30 to 90 km altitude. The wind statistics are given at approximately 10 m above the station elevations and at altitudes with respect to mean sea level thereafter. For those range sites without rocketsonde measurements, the RRAs terminate at 30 km altitude or they are extended, if required, when rocketsonde data from a nearby launch site are available. There are four sets of tables for each of the 12 monthly reference periods and the annual reference period.

  11. Implications of MOLA Global Roughness, Statistics, and Topography

    NASA Technical Reports Server (NTRS)

    Aharonson, O.; Zuber, M. T.; Neumann, G. A.

    1999-01-01

    New insights are emerging as the ongoing high-quality measurements of the Martian surface topography by Mars Orbiter Laser Altimeter (MOLA) on board the Mars Global Surveyor (MGS) spacecraft increase in coverage, resolution, and diversity. For the first time, a global characterization of the statistical properties of topography is possible. The data were collected during the aerobraking hiatus, science phasing, and mapping orbits of MGS, and have a resolution of 300-400 m along track, a range resolution of 37.5 cm, a range precision of 1-10 m for surface slopes up to 30 deg., and an absolute accuracy of topography of 13 m. The spacecraft's orbit inclination dictates that nadir observations have latitude coverage of about 87.1S to 87.1N; the addition of observations obtained during a period of off-nadir pointing over the north pole extended coverage to 90N. Additional information is contained in the original extended abstract.

  12. Tailored Algorithm for Sensitivity Enhancement of Gas Concentration Sensors Based on Tunable Laser Absorption Spectroscopy.

    PubMed

    Vargas-Rodriguez, Everardo; Guzman-Chavez, Ana Dinora; Baeza-Serrato, Roberto

    2018-06-04

    In this work, a novel tailored algorithm to enhance the overall sensitivity of gas concentration sensors based on the Direct Absorption Tunable Laser Absorption Spectroscopy (DA-ATLAS) method is presented. By using this algorithm, the sensor sensitivity can be custom-designed to be quasi constant over a much larger dynamic range compared with that obtained by typical methods based on a single statistics feature of the sensor signal output (peak amplitude, area under the curve, mean or RMS). Additionally, it is shown that with our algorithm, an optimal function can be tailored to get a quasi linear relationship between the concentration and some specific statistics features over a wider dynamic range. In order to test the viability of our algorithm, a basic C2H2 sensor based on DA-ATLAS was implemented, and its experimental measurements support the simulated results provided by our algorithm.
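
    The four single-feature statistics the tailored algorithm improves upon are straightforward to compute. A sketch with a synthetic Gaussian absorption feature standing in for the DA-ATLAS signal (all shapes and values are assumptions):

        import numpy as np

        def signal_features(y, dx):
            """Peak amplitude, area under the curve, mean, and RMS of a signal."""
            return {"peak": float(y.max()),
                    "area": float(np.sum(y) * dx),   # rectangle-rule integral
                    "mean": float(y.mean()),
                    "rms":  float(np.sqrt(np.mean(y**2)))}

        x = np.linspace(-1.0, 1.0, 500)
        for depth in (0.05, 0.2, 0.8):               # stand-in for concentration
            y = depth * np.exp(-x**2 / 0.02)         # synthetic absorption line
            print(depth, signal_features(y, dx=x[1] - x[0]))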

  13. An empirical evaluation of genetic distance statistics using microsatellite data from bear (Ursidae) populations.

    PubMed

    Paetkau, D; Waits, L P; Clarkson, P L; Craighead, L; Strobeck, C

    1997-12-01

    A large microsatellite data set from three species of bear (Ursidae) was used to empirically test the performance of six genetic distance measures in resolving relationships at a variety of scales ranging from adjacent areas in a continuous distribution to species that diverged several million years ago. At the finest scale, while some distance measures performed extremely well, statistics developed specifically to accommodate the mutational processes of microsatellites performed relatively poorly, presumably because of the relatively higher variance of these statistics. At the other extreme, no statistic was able to resolve the close sister relationship of polar bears and brown bears from more distantly related pairs of species. This failure is most likely due to constraints on allele distributions at microsatellite loci. At intermediate scales, both within continuous distributions and in comparisons to insular populations of late Pleistocene origin, it was not possible to define the point where linearity was lost for each of the statistics, except that it is clearly lost after relatively short periods of independent evolution. All of the statistics were affected by the amount of genetic diversity within the populations being compared, significantly complicating the interpretation of genetic distance data.

  14. An Empirical Evaluation of Genetic Distance Statistics Using Microsatellite Data from Bear (Ursidae) Populations

    PubMed Central

    Paetkau, D.; Waits, L. P.; Clarkson, P. L.; Craighead, L.; Strobeck, C.

    1997-01-01

    A large microsatellite data set from three species of bear (Ursidae) was used to empirically test the performance of six genetic distance measures in resolving relationships at a variety of scales ranging from adjacent areas in a continuous distribution to species that diverged several million years ago. At the finest scale, while some distance measures performed extremely well, statistics developed specifically to accommodate the mutational processes of microsatellites performed relatively poorly, presumably because of the relatively higher variance of these statistics. At the other extreme, no statistic was able to resolve the close sister relationship of polar bears and brown bears from more distantly related pairs of species. This failure is most likely due to constraints on allele distributions at microsatellite loci. At intermediate scales, both within continuous distributions and in comparisons to insular populations of late Pleistocene origin, it was not possible to define the point where linearity was lost for each of the statistics, except that it is clearly lost after relatively short periods of independent evolution. All of the statistics were affected by the amount of genetic diversity within the populations being compared, significantly complicating the interpretation of genetic distance data. PMID:9409849

  15. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    PubMed

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
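
    The positive-semidefiniteness requirement is easy to verify for a concrete kernel. A minimal sketch with a hypothetical genotype matrix and a linear, genetic-relationship-style kernel:

        import numpy as np

        rng = np.random.default_rng(3)

        # toy genotype matrix: 6 subjects x 10 variants coded 0/1/2 (hypothetical)
        G = rng.integers(0, 3, size=(6, 10)).astype(float)

        # linear kernel on centered genotypes: K[i, j] = similarity of subjects i, j
        Gc = G - G.mean(axis=0)
        K = Gc @ Gc.T / G.shape[1]

        # a valid kernel must produce a positive semidefinite matrix
        print("min eigenvalue:", np.linalg.eigvalsh(K).min())   # >= 0 up to round-off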

  16. Reproducibility-optimized test statistic for ranking genes in microarray studies.

    PubMed

    Elo, Laura L; Filén, Sanna; Lahesmaa, Riitta; Aittokallio, Tero

    2008-01-01

    A principal goal of microarray studies is to identify the genes showing differential expression under distinct conditions. In such studies, the selection of an optimal test statistic is a crucial challenge, which depends on the type and amount of data under analysis. While previous studies on simulated or spike-in datasets do not provide practical guidance on how to choose the best method for a given real dataset, we introduce an enhanced reproducibility-optimization procedure, which enables the selection of a suitable gene-ranking statistic directly from the data. In comparison with existing ranking methods, the reproducibility-optimized statistic shows good performance consistently under various simulated conditions and on an Affymetrix spike-in dataset. Further, the feasibility of the novel statistic is confirmed in a practical research setting using data from an in-house cDNA microarray study of asthma-related gene expression changes. These results suggest that the procedure facilitates the selection of an appropriate test statistic for a given dataset without relying on a priori assumptions, which may bias the findings and their interpretation. Moreover, the general reproducibility-optimization procedure is not limited to detecting differential expression only but could be extended to a wide range of other applications as well.

  17. Statistical dielectronic recombination rates for multielectron ions in plasma

    NASA Astrophysics Data System (ADS)

    Demura, A. V.; Leont'iev, D. S.; Lisitsa, V. S.; Shurygin, V. A.

    2017-10-01

    We describe the general analytic derivation of the dielectronic recombination (DR) rate coefficient for multielectron ions in a plasma based on the statistical theory of an atom in terms of the spatial distribution of the atomic electron density. The dielectronic recombination rates for complex multielectron tungsten ions are calculated numerically in a wide range of variation of the plasma temperature, which is important for modern nuclear fusion studies. The results of statistical theory are compared with the data obtained using level-by-level codes ADPAK, FAC, HULLAC, and experimental results. We consider different statistical DR models based on the Thomas-Fermi distribution, viz., integral and differential with respect to the orbital angular momenta of the ion core and the trapped electron, as well as the Rost model, which is an analog of the Frank-Condon model as applied to atomic structures. In view of its universality and relative simplicity, the statistical approach can be used for obtaining express estimates of the dielectronic recombination rate coefficients in complex calculations of the parameters of the thermonuclear plasmas. The application of statistical methods also provides information for the dielectronic recombination rates with much smaller computer time expenditures as compared to available level-by-level codes.

  18. The initial safe range of motion of the ankle joint after three methods of internal fixation of simulated fractures of the medial malleolus.

    PubMed

    Shimamura, Yoshio; Kaneko, Kazuo; Kume, Kazuhiko; Maeda, Mutsuhiro; Iwase, Hideaki

    2006-07-01

    Previous studies have demonstrated the safe passive range of ankle motion for inter-bone stiffness after internal fixation under load, but there is a lack of information about the safe range of ankle motion for early rehabilitation in the absence of loading. The present study was designed to assess the effect of ankle movement on inter-bone displacement characteristics of medial malleolus fractures following three types of internal fixation, to determine the safe range of motion. Five lower legs obtained during autopsy were used to assess three types of internal fixation (two with Kirschner-wires alone; two with Kirschner-wires plus tension band wiring; and one with an AO/ASIF malleolar screw alone). Following a simulated fracture by sawing through the medial malleolus, the displacement between the fractured bone ends was measured during a passive range of movement with continuous monitoring using omega (Ω)-shaped transducers and a biaxial flexible goniometer. Statistical analysis was performed with repeated measures analysis of variance. Inter-bone displacement was not proportional to the magnitude of movement throughout the range of ankle motion as, when separation exceeded 25 microm, there was increasingly wide separation as plantar-flexion or dorsal-flexion was increased. There was no statistically significant difference in the small amounts of inter-bone displacement observed with the three types of fixation within the safe range of dorsal-flexion and plantar-flexion for early rehabilitation. However, the inter-bone separation when fixation utilized two Kirschner-wires alone tended to be greater than with the other two types of fixation during dorsal-flexion and eversion. The present study revealed a reproducible range of ankle motion for early rehabilitation, which was estimated to be within the range of 20 degrees of dorsal-flexion and 10 degrees of plantar-flexion without eversion. Also, internal fixation with two Kirschner-wires alone does not seem to provide the stability achieved by the other two forms of fixation.

  19. A New Paradigm in Modeling and Simulations of Complex Oxidation Chemistry Using a Statistical Approach

    DTIC Science & Technology

    2009-03-31

    equivalence ratios from 0.125 to 8. This range encompasses diesel, HCCI and gas turbine engines, including cold ignition; and NOx, CO and soot pollutant formation in the lean and... California Institute of Technology, Mechanical Engineering Department, Pasadena, CA 91125. This report describes a study

  20. Influence of Waveform Characteristics on LiDAR Ranging Accuracy and Precision

    PubMed Central

    Yang, Bingwei; Xie, Xinhao; Li, Duan

    2018-01-01

    Time of flight (TOF) based light detection and ranging (LiDAR) calculates distance from the time of flight between start and stop signals. In the lab-built LiDAR, two ranging systems measure this flight time: a time-to-digital converter (TDC), which counts the time between trigger signals, and an analog-to-digital converter (ADC), which processes the sampled start/stop pulse waveforms for time estimation. We study the influence of waveform characteristics on the range accuracy and precision of the two kinds of ranging system. Comparing waveform based ranging (WR) with analog discrete return system based ranging (AR), a peak detection method (WR-PK) shows the best ranging performance because of less execution time, high ranging accuracy, and stable precision. Based on the maximal information coefficient (MIC), a novel statistical method, WR-PK precision has a high linear relationship with the standard deviation of the received pulse width. Thus, keeping the received pulse width as stable as possible when measuring a constant distance can improve ranging precision. PMID:29642639
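
    A peak-detection range estimate in the spirit of WR-PK can be sketched as follows; the pulses are synthetic and the parabolic sub-sample refinement is a generic trick, not necessarily the paper's exact processing:

        import numpy as np

        C = 299_792_458.0                      # speed of light, m/s

        def refined_peak(y):
            """Sample index of the maximum, refined by a parabola through
            the peak sample and its two neighbors."""
            i = int(np.argmax(y))
            if 0 < i < len(y) - 1:
                a, b, c = y[i-1], y[i], y[i+1]
                return i + 0.5 * (a - c) / (a - 2*b + c)
            return float(i)

        t = np.arange(0.0, 200e-9, 0.5e-9)     # 2 GS/s digitizer
        start = np.exp(-((t - 20.0e-9) / 3e-9) ** 2)
        stop  = 0.6 * np.exp(-((t - 86.7e-9) / 3e-9) ** 2)

        dt = t[1] - t[0]
        flight = (refined_peak(stop) - refined_peak(start)) * dt
        print(f"estimated range: {0.5 * C * flight:.2f} m")    # ~10 m (two-way time halved)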

  1. Study of aluminum particle combustion in solid propellant plumes using digital in-line holography and imaging pyrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yi; Guildenbecher, Daniel R.; Hoffmeister, Kathryn N. G.

    The combustion of molten metals is an important area of study with applications ranging from solid aluminized rocket propellants to fireworks displays. Our work uses digital in-line holography (DIH) to experimentally quantify the three-dimensional position, size, and velocity of aluminum particles during combustion of ammonium perchlorate (AP) based solid-rocket propellants. Additionally, spatially resolved particle temperatures are simultaneously measured using two-color imaging pyrometry. To allow for fast characterization of the properties of tens of thousands of particles, automated data processing routines are proposed. In using these methods, statistics from aluminum particles with diameters ranging from 15 to 900 µm are collected at an ambient pressure of 83 kPa. In the first set of DIH experiments, increasing initial propellant temperature is shown to enhance the agglomeration of nascent aluminum at the burning surface, resulting in ejection of large molten aluminum particles into the exhaust plume. The resulting particle number and volume distributions are quantified. In the second set of simultaneous DIH and pyrometry experiments, particle size and velocity relationships as well as temperature statistics are explored. The average measured temperatures are found to be 2640 ± 282 K, which compares well with previous estimates of the range of particle and gas-phase temperatures. The novel methods proposed here represent new capabilities for simultaneous quantification of the joint size, velocity, and temperature statistics during the combustion of molten metal particles. The proposed techniques are expected to be useful for detailed performance assessment of metalized solid-rocket propellants.

  2. Development of a new pan-European testate amoeba transfer function for reconstructing peatland palaeohydrology

    NASA Astrophysics Data System (ADS)

    Amesbury, Matthew J.; Swindles, Graeme T.; Bobrov, Anatoly; Charman, Dan J.; Holden, Joseph; Lamentowicz, Mariusz; Mallon, Gunnar; Mazei, Yuri; Mitchell, Edward A. D.; Payne, Richard J.; Roland, Thomas P.; Turner, T. Edward; Warner, Barry G.

    2016-11-01

    In the decade since the first pan-European testate amoeba-based transfer function for peatland palaeohydrological reconstruction was published, a vast amount of additional data collection has been undertaken by the research community. Here, we expand the pan-European dataset from 128 to 1799 samples, spanning 35° of latitude and 55° of longitude. After the development of a new taxonomic scheme to permit compilation of data from a wide range of contributors and the removal of samples with high pH values, we developed ecological transfer functions using a range of model types and a dataset of ∼1300 samples. We rigorously tested the efficacy of these models using both statistical validation and independent test sets with associated instrumental data. Model performance measured by statistical indicators was comparable to other published models. Comparison to test sets showed that taxonomic resolution did not impair model performance and that the new pan-European model can therefore be used as an effective tool for palaeohydrological reconstruction. Our results question the efficacy of relying on statistical validation of transfer functions alone and support a multi-faceted approach to the assessment of new models. We substantiated recent advice that model outputs should be standardised and presented as residual values in order to focus interpretation on secure directional shifts, avoiding potentially inaccurate conclusions relating to specific water-table depths. The extent and diversity of the dataset highlighted that, at the taxonomic resolution applied, a majority of taxa had broad geographic distributions, though some morphotypes appeared to have restricted ranges.

  3. [Case-control study on cable-pin system in the treatment of olecranon fractures].

    PubMed

    Ma, Hu-Jing; Shan, Lei; Zhou, Jun-Lin; Liu, Qing-He; Lu, Tie; Sun, Song

    2012-05-01

    To prospectively evaluate the clinical result of the Cable-Pin system in the treatment of olecranon fractures and compare it with the tension band wiring (TBW) method. From March 2008 to June 2010, 65 patients with olecranon fractures were divided into two groups: 32 patients in the Cable-Pin group were treated with the Cable-Pin system, including 18 males and 14 females, ranging in age from 21 to 69 years, with an average of (53.69 +/- 13.42) years; 33 patients in the TBW group were treated with a Kirschner tension band, including 20 males and 13 females, ranging in age from 20 to 70 years, with an average of (53.18 +/- 13.36) years. The incision length, operation time, postoperative hemoglobin, fracture healing time, complications, and HSS elbow scores were recorded and analyzed statistically. The follow-up period ranged from 12 to 24 months, with an average period of 18.4 months. There were statistical differences (P<0.05) in fracture healing time (t=2.588, P=0.012), complication rate (chi2=4.534, P=0.033), and HSS elbow joint scores (Z=-2.039, P=0.041) between the two groups, all favoring the Cable-Pin group over TBW. There were no statistical differences (P>0.05) in the length of incision (t=0.416, P=0.679), operation time (t=0.816, P=0.417), or the postoperative amounts of hemoglobin (t=-0.553, P=0.294) between the two groups. The Cable-Pin system is an easy and reliable method for the treatment of olecranon fractures, with fewer complications and better function than TBW.

  4. Clinical application of the FACES score for face transplantation.

    PubMed

    Chopra, Karan; Susarla, Srinivas M; Goodrich, Danielle; Bernard, Steven; Zins, James E; Papay, Frank; Lee, W P Andrew; Gordon, Chad R

    2014-01-01

    This study aimed to systematically evaluate all reported outcomes of facial allotransplantation (FT) using the previously described FACES scoring instrument. This was a retrospective study of all consecutive face transplants to date (January 2012). Candidates were identified using medical and general internet database searches. Medical literature and media reports were reviewed for details regarding demographic, operative, anatomic, and psychosocial data, which were then used to formulate FACES scores. Pre-transplant and post-transplant scores for "functional status", "aesthetic deformity", "co-morbidities", "exposed tissue", and "surgical history" were calculated. Scores were statistically compared using paired-samples analyses. Twenty consecutive patients were identified, with 18 surviving recipients. The sample was composed of 3 females and 17 males, with a mean age of 35.0 ± 11.0 years (range: 19-57 years). Overall, data reporting for functional parameters was poor. Six subjects had complete pre-transplant and post-transplant data available for all 5 FACES domains. The mean pre-transplant FACES score was 33.5 ± 8.8 (range: 23-44); the mean post-transplant score was 21.5 ± 5.9 (range: 14-32) and was statistically significantly lower than the pre-transplant score (P = 0.02). Among the individual domains, FT conferred a statistically significant improvement in aesthetic defect scores and exposed tissue scores (P ≤ 0.01) while, at the same time, it displayed no significant increases in co-morbidity (P = 0.17). There is a significant deficiency in functional outcome reports thus far. Moreover, FT resulted in improved overall FACES score, with the most dramatic improvements noted in aesthetic defect and exposed tissue scores.

  5. Study of aluminum particle combustion in solid propellant plumes using digital in-line holography and imaging pyrometry

    DOE PAGES

    Chen, Yi; Guildenbecher, Daniel R.; Hoffmeister, Kathryn N. G.; ...

    2017-05-05

    The combustion of molten metals is an important area of study with applications ranging from solid aluminized rocket propellants to fireworks displays. Our work uses digital in-line holography (DIH) to experimentally quantify the three-dimensional position, size, and velocity of aluminum particles during combustion of ammonium perchlorate (AP) based solid-rocket propellants. Additionally, spatially resolved particle temperatures are simultaneously measured using two-color imaging pyrometry. To allow for fast characterization of the properties of tens of thousands of particles, automated data processing routines are proposed. In using these methods, statistics from aluminum particles with diameters ranging from 15 to 900 µm are collected at an ambient pressure of 83 kPa. In the first set of DIH experiments, increasing initial propellant temperature is shown to enhance the agglomeration of nascent aluminum at the burning surface, resulting in ejection of large molten aluminum particles into the exhaust plume. The resulting particle number and volume distributions are quantified. In the second set of simultaneous DIH and pyrometry experiments, particle size and velocity relationships as well as temperature statistics are explored. The average measured temperatures are found to be 2640 ± 282 K, which compares well with previous estimates of the range of particle and gas-phase temperatures. The novel methods proposed here represent new capabilities for simultaneous quantification of the joint size, velocity, and temperature statistics during the combustion of molten metal particles. The proposed techniques are expected to be useful for detailed performance assessment of metalized solid-rocket propellants.

  6. Risk-based Methodology for Validation of Pharmaceutical Batch Processes.

    PubMed

    Wiles, Frederick

    2013-01-01

    In January 2011, the U.S. Food and Drug Administration published new process validation guidance for pharmaceutical processes. The new guidance debunks the long-held industry notion that three consecutive validation batches or runs are all that are required to demonstrate that a process is operating in a validated state. Instead, the new guidance now emphasizes that the level of monitoring and testing performed during process performance qualification (PPQ) studies must be sufficient to demonstrate statistical confidence both within and between batches. In some cases, three qualification runs may not be enough. Nearly two years after the guidance was first published, little has been written defining a statistical methodology for determining the number of samples and qualification runs required to satisfy Stage 2 requirements of the new guidance. This article proposes using a combination of risk assessment, control charting, and capability statistics to define the monitoring and testing scheme required to show that a pharmaceutical batch process is operating in a validated state. In this methodology, an assessment of process risk is performed through application of a process failure mode, effects, and criticality analysis (PFMECA). The output of PFMECA is used to select appropriate levels of statistical confidence and coverage which, in turn, are used in capability calculations to determine when significant Stage 2 (PPQ) milestones have been met. The achievement of Stage 2 milestones signals the release of batches for commercial distribution and the reduction of monitoring and testing to commercial production levels. Individuals, moving range, and range/sigma charts are used in conjunction with capability statistics to demonstrate that the commercial process is operating in a state of statistical control. The new process validation guidance published by the U.S. Food and Drug Administration in January of 2011 indicates that the number of process validation batches or runs required to demonstrate that a pharmaceutical process is operating in a validated state should be based on sound statistical principles. The old rule of "three consecutive batches and you're done" is no longer sufficient. The guidance, however, does not provide any specific methodology for determining the number of runs required, and little has been published to augment this shortcoming. The paper titled "Risk-based Methodology for Validation of Pharmaceutical Batch Processes" describes a statistically sound methodology for determining when a statistically valid number of validation runs has been acquired based on risk assessment and calculation of process capability.
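
    The individuals/moving-range charting and capability arithmetic invoked here is standard SPC. A sketch with invented assay values and specification limits:

        import numpy as np

        # synthetic PPQ assay results, % of label claim (hypothetical)
        x = np.array([99.1, 100.4, 99.8, 100.9, 99.5, 100.2,
                      99.9, 100.6, 99.3, 100.1, 100.0, 99.7])

        mr = np.abs(np.diff(x))            # moving ranges of consecutive results
        sigma_hat = mr.mean() / 1.128      # d2 = 1.128 for subgroups of size 2

        center = x.mean()                  # individuals-chart centerline and limits
        ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

        usl, lsl = 105.0, 95.0             # illustrative specification limits
        ppk = min(usl - center, center - lsl) / (3 * x.std(ddof=1))
        print(f"I-chart limits: {lcl:.2f} .. {ucl:.2f}   Ppk = {ppk:.2f}")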

  7. Publication of statistically significant research findings in prosthodontics & implant dentistry in the context of other dental specialties.

    PubMed

    Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos

    2015-10-01

    To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry with random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the significant results ranging from 47% to 86%. Multivariable modeling identified that geographical area and involvement of a statistician were predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to various study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. An instrument to assess the statistical intensity of medical research papers.

    PubMed

    Nieminen, Pentti; Virtanen, Jorma I; Vähänikkilä, Hannu

    2017-01-01

    There is widespread evidence that statistical methods play an important role in original research articles, especially in medical research. The evaluation of statistical methods and reporting in journals suffers from a lack of standardized methods for assessing the use of statistics. The objective of this study was to develop and evaluate an instrument to assess the statistical intensity of research articles in a standardized way. A checklist-type measurement scale was developed by selecting and refining items from previous reports about the statistical contents of medical journal articles and from published guidelines for statistical reporting. A total of 840 original medical research articles that were published between 2007 and 2015 in 16 journals were evaluated to test the scoring instrument. The total sum of all items was used to compare the intensity between sub-fields and journals. Inter-rater agreement was examined using a random sample of 40 articles. Four raters read and evaluated the selected articles using the developed instrument. The scale consisted of 66 items. The total summary score adequately discriminated between research articles according to their study design characteristics. The new instrument could also discriminate between journals according to their statistical intensity. The inter-observer agreement measured by the ICC was 0.88 between all four raters. Individual item analysis showed very high agreement between the rater pairs; percentage agreement ranged from 91.7% to 95.2%. A reliable and applicable instrument for evaluating the statistical intensity of research papers was developed. It is a helpful tool for comparing the statistical intensity between sub-fields and journals. The novel instrument may be applied in manuscript peer review to identify papers in need of additional statistical review.

  9. Confidence interval or p-value?: part 4 of a series on evaluation of scientific publications.

    PubMed

    du Prel, Jean-Baptist; Hommel, Gerhard; Röhrig, Bernd; Blettner, Maria

    2009-05-01

    An understanding of p-values and confidence intervals is necessary for the evaluation of scientific articles. This article will inform the reader of the meaning and interpretation of these two statistical concepts. The uses of these two statistical concepts and the differences between them are discussed on the basis of a selective literature search concerning the methods employed in scientific articles. P-values in scientific studies are used to determine whether a null hypothesis formulated before the performance of the study is to be accepted or rejected. In exploratory studies, p-values enable the recognition of any statistically noteworthy findings. Confidence intervals provide information about a range in which the true value lies with a certain degree of probability, as well as about the direction and strength of the demonstrated effect. This enables conclusions to be drawn about the statistical plausibility and clinical relevance of the study findings. It is often useful for both statistical measures to be reported in scientific articles, because they provide complementary types of information.
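
    The article's point can be illustrated with a minimal sketch computing both quantities for the same comparison; the two samples below are invented, and the pooled-variance t procedure is one common choice, not the authors' prescription.

```python
# The two complementary quantities discussed above, computed for one invented
# two-sample comparison; the pooled-variance t procedure is one common choice,
# not the authors' prescription.
import numpy as np
from scipy import stats

a = np.array([5.1, 4.8, 5.6, 5.0, 5.3, 4.9])
b = np.array([4.6, 4.4, 4.9, 4.5, 4.7, 4.3])

t, p = stats.ttest_ind(a, b)               # p-value for the null of equal means

df = len(a) + len(b) - 2
sp2 = ((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1)) / df
se = np.sqrt(sp2 * (1 / len(a) + 1 / len(b)))
diff = a.mean() - b.mean()
half = stats.t.ppf(0.975, df) * se         # half-width of the 95% CI
print(f"t = {t:.2f}, p = {p:.4f}; 95% CI for the difference: "
      f"{diff - half:.2f} to {diff + half:.2f}")
```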

  10. Statistics of Dark Matter Halos from Gravitational Lensing.

    PubMed

    Jain; Van Waerbeke L

    2000-02-10

    We present a new approach to measure the mass function of dark matter halos and to discriminate models with differing values of Omega through weak gravitational lensing. We measure the distribution of peaks from simulated lensing surveys and show that the lensing signal due to dark matter halos can be detected for a wide range of peak heights. Even when the signal-to-noise ratio is well below the limit for detection of individual halos, projected halo statistics can be constrained for halo masses spanning galactic to cluster halos. The use of peak statistics relies on an analytical model of the noise due to the intrinsic ellipticities of source galaxies. The noise model has been shown to accurately describe simulated data for a variety of input ellipticity distributions. We show that the measured peak distribution has distinct signatures of gravitational lensing, and its non-Gaussian shape can be used to distinguish models with different values of Omega. The use of peak statistics is complementary to the measurement of field statistics, such as the ellipticity correlation function, and is possibly not susceptible to the same systematic errors.

  11. Lagrangian statistics of turbulent dispersion from 8192³ direct numerical simulation of isotropic turbulence

    NASA Astrophysics Data System (ADS)

    Buaria, Dhawal; Yeung, P. K.; Sawford, B. L.

    2016-11-01

    An efficient massively parallel algorithm has allowed us to obtain the trajectories of 300 million fluid particles in an 8192³ simulation of isotropic turbulence at Taylor-scale Reynolds number 1300. Conditional single-particle statistics are used to investigate the effect of extreme events in dissipation and enstrophy on turbulent dispersion. The statistics of pairs and tetrads, both forward and backward in time, are obtained via post-processing of single-particle trajectories. For tetrads, since memory of shape is known to be short, we focus, for convenience, on samples which are initially regular, with all sides of comparable length. The statistics of tetrad size show similar behavior as the two-particle relative dispersion, i.e., stronger backward dispersion at intermediate times with larger backward Richardson constant. In contrast, the statistics of tetrad shape show more robust inertial range scaling, in both forward and backward frames. However, the distortion of shape is stronger for backward dispersion. Our results suggest that the Reynolds number reached in this work is sufficient to settle some long-standing questions concerning Lagrangian scale similarity. Supported by NSF Grants CBET-1235906 and ACI-1036170.

  12. Enforcing Job Safety: A Managerial View

    ERIC Educational Resources Information Center

    Barnako, Frank R.

    1975-01-01

    The views of management or of employees regarding enforcement of the job safety law range from general satisfaction to calls for repeal of the act. The complexity of standards, statistics and recordkeeping, and enforcement procedures are major areas of concern. (MW)

  13. Development of modelling algorithm of technological systems by statistical tests

    NASA Astrophysics Data System (ADS)

    Shemshura, E. A.; Otrokov, A. V.; Chernyh, V. G.

    2018-03-01

    The paper tackles the problem of economic assessment of design efficiency for various technological systems at the stage of their operation. The modelling algorithm for a technological system, built on statistical tests and accounting for a reliability index, allows the level of machinery technical excellence to be estimated and the efficiency of design reliability to be assessed against actual performance. Economic feasibility of its application is to be determined from the service quality of the technological system, with further forecasting of the volume and range of spare-parts supply.

  14. Statistical reporting of clinical pharmacology research.

    PubMed

    Ring, Arne; Schall, Robert; Loke, Yoon K; Day, Simon

    2017-06-01

    Research in clinical pharmacology covers a wide range of experiments, trials and investigations: clinical trials, systematic reviews and meta-analyses of drug usage after market approval, the investigation of pharmacokinetic-pharmacodynamic relationships, the search for mechanisms of action or for potential signals for efficacy and safety using biomarkers. Often these investigations are exploratory in nature, which has implications for the way the data should be analysed and presented. Here we summarize some of the statistical issues that are of particular importance in clinical pharmacology research. © 2017 The British Pharmacological Society.

  15. Comment on 'Imaging of prompt gamma rays emitted during delivery of clinical proton beams with a Compton camera: feasibility studies for range verification'.

    PubMed

    Sitek, Arkadiusz

    2016-12-21

    The origin ensemble (OE) algorithm is a new method used for image reconstruction from nuclear tomographic data. The main advantage of this algorithm is the ease of implementation for complex tomographic models and the sound statistical theory. In this comment, the author provides the basics of the statistical interpretation of OE and gives suggestions for the improvement of the algorithm in the application to prompt gamma imaging as described in Polf et al (2015 Phys. Med. Biol. 60 7085).

  16. Comment on ‘Imaging of prompt gamma rays emitted during delivery of clinical proton beams with a Compton camera: feasibility studies for range verification’

    NASA Astrophysics Data System (ADS)

    Sitek, Arkadiusz

    2016-12-01

    The origin ensemble (OE) algorithm is a new method used for image reconstruction from nuclear tomographic data. The main advantage of this algorithm is the ease of implementation for complex tomographic models and the sound statistical theory. In this comment, the author provides the basics of the statistical interpretation of OE and gives suggestions for the improvement of the algorithm in the application to prompt gamma imaging as described in Polf et al (2015 Phys. Med. Biol. 60 7085).

  17. On the Development of Turbulent Wakes from Vortex Streets

    NASA Technical Reports Server (NTRS)

    Roshko, Anatol

    1954-01-01

    Wake development behind circular cylinders at Reynolds numbers from 40 to 10,000 was investigated in a low-speed wind tunnel. Standard hot-wire techniques were used to study the velocity fluctuations. The Reynolds number range of periodic vortex shedding is divided into two distinct subranges. At R = 40 to 150, called the stable range, regular vortex streets are formed and no turbulent velocity fluctuations accompany the periodic formation of vortices. The range R = 150 to 300 is a transition range to a regime called the irregular range, in which turbulent velocity fluctuations accompany the periodic formation of vortices. The turbulence is initiated by laminar-turbulent transition in the free layers which spring from the separation points on the cylinder. The transition first occurs in the range R = 150 to 300. Spectrum and statistical measurements were made to study the velocity fluctuations.

  18. High dynamic range subjective testing

    NASA Astrophysics Data System (ADS)

    Allan, Brahim; Nilsson, Mike

    2016-09-01

    This paper describes a set of subjective tests that the authors have carried out to assess the end user perception of video encoded with High Dynamic Range technology when viewed in a typical home environment. Viewers scored individual clips of content, presented in High Definition (HD) and Ultra High Definition (UHD), in Standard Dynamic Range (SDR), and in High Dynamic Range (HDR) using both the Perceptual Quantizer (PQ) and Hybrid Log Gamma (HLG) transfer characteristics, and presented in SDR as the backwards compatible rendering of the HLG representation. The quality of SDR HD was improved by approximately equal amounts by either increasing the dynamic range or increasing the resolution to UHD. A further smaller increase in quality was observed in the Mean Opinion Scores of the viewers by increasing both the dynamic range and the resolution, but this was not quite statistically significant.

  19. Intraoperative detection of 18F-FDG-avid tissue sites using the increased probe counting efficiency of the K-alpha probe design and variance-based statistical analysis with the three-sigma criteria

    PubMed Central

    2013-01-01

    Background Intraoperative detection of 18F-FDG-avid tissue sites during 18F-FDG-directed surgery can be very challenging when utilizing gamma detection probes that rely on a fixed target-to-background (T/B) ratio (ratiometric threshold) for determination of probe positivity. The purpose of our study was to evaluate the counting efficiency and the success rate of in situ intraoperative detection of 18F-FDG-avid tissue sites (using the three-sigma statistical threshold criteria method and the ratiometric threshold criteria method) for three different gamma detection probe systems. Methods Of 58 patients undergoing 18F-FDG-directed surgery for known or suspected malignancy using gamma detection probes, we identified nine 18F-FDG-avid tissue sites (from amongst seven patients) that were seen on same-day preoperative diagnostic PET/CT imaging, and for which each 18F-FDG-avid tissue site underwent attempted in situ intraoperative detection concurrently using three gamma detection probe systems (K-alpha probe, and two commercially-available PET-probe systems), and was subsequently surgically excised. Results The mean relative probe counting efficiency ratio was 6.9 (± 4.4, range 2.2–15.4) for the K-alpha probe, as compared to 1.5 (± 0.3, range 1.0–2.1) and 1.0 (± 0, range 1.0–1.0), respectively, for two commercially-available PET-probe systems (P < 0.001). Successful in situ intraoperative detection of 18F-FDG-avid tissue sites was more frequently accomplished with each of the three gamma detection probes tested by using the three-sigma statistical threshold criteria method than by using the ratiometric threshold criteria method, specifically with the three-sigma statistical threshold criteria method being significantly better than the ratiometric threshold criteria method for determining probe positivity for the K-alpha probe (P = 0.05). Conclusions Our results suggest that the improved probe counting efficiency of the K-alpha probe design used in conjunction with the three-sigma statistical threshold criteria method can allow for improved detection of 18F-FDG-avid tissue sites when a low in situ T/B ratio is encountered. PMID:23496877
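
    A hedged sketch of the three-sigma statistical threshold criterion described above: a tissue site is declared probe-positive when its count rate exceeds the mean background rate by three standard deviations. The counts below are invented, and using the sample standard deviation of background counts is one plausible reading of the variance-based method, not necessarily the authors' exact implementation.

```python
# Three-sigma threshold criterion: a site is probe-positive when its count
# rate exceeds the mean background rate by three standard deviations.
# Counts are invented; the sample SD of background counts is an assumption.
import numpy as np

background = np.array([112, 108, 115, 110, 109, 113, 111, 107])  # counts/s
target = 151                                                      # counts/s

threshold = background.mean() + 3 * background.std(ddof=1)
verdict = "positive" if target > threshold else "negative"
print(f"threshold = {threshold:.1f} counts/s -> site is {verdict}")
```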

  20. Short-term intraocular pressure trends following intravitreal ranibizumab injections for neovascular age-related macular degeneration-the role of oral acetazolamide in protecting glaucoma patients.

    PubMed

    Murray, C D; Wood, D; Allgar, V; Walters, G; Gale, R P

    2014-10-01

    To determine the effect of oral acetazolamide on lowering the peak and duration of intraocular pressure (IOP) rise in glaucoma and glaucoma suspect patients, following intravitreal injection of ranibizumab for neovascular age-related macular degeneration. The study was an open-label, parallel, randomised, controlled trial (EudraCT Number: 2010-023037-35). Twenty-four glaucoma or glaucoma suspect patients received either 500 mg acetazolamide or no treatment 60-90 min before 0.5 mg ranibizumab. The primary outcome measure was the difference in IOP immediately after injection (T0) and 5, 10, and 30 min following injection. ANCOVA was used to compare groups, adjusting for baseline IOP. The study was powered to detect a 9-mm Hg difference at T0. The IOP at T0 was 2.3 mm Hg higher in the non-treated group (mean 44.5 mm Hg, range (19-86 mm Hg)) compared with the treated group (mean 42.2 mm Hg, range (25-58 mm Hg)), but was not statistically significant after adjusting for baseline IOP (P=0.440). At 30 min, IOP was 4.9 mm Hg higher in the non-treated group (mean 20.6 mm Hg, range (11-46 mm Hg)) compared with the treated group (mean 15.7 mm Hg, range (8-21 mm Hg)). This was statistically significant after adjusting for baseline IOP (P=0.013). Although the primary end points were not reached, 500 mg oral acetazolamide, 60-90 min before intravitreal injection, results in a statistically significant reduction in IOP at 30 min post injection. Prophylactic treatment may be considered as an option to minimise neuro-retinal rim damage in high-risk glaucoma patients who are most vulnerable to IOP spikes and undergoing repeated intravitreal injections of ranibizumab.

  1. Physical and cognitive fitness in young adulthood and risk of amyotrophic lateral sclerosis at an early age.

    PubMed

    Longinetti, E; Mariosa, D; Larsson, H; Almqvist, C; Lichtenstein, P; Ye, W; Fang, F

    2017-01-01

    There is a clinical impression that patients with amyotrophic lateral sclerosis (ALS) have a higher level of physical fitness and lower body mass index (BMI) than average. However, there is a lack of literature examining the relationship between cognitive fitness and ALS risk. In this study we explored the associations of both physical and cognitive fitness with future risk of ALS. Data on physical fitness, BMI, intelligence quotient (IQ) and stress resilience were collected from 1 838 376 Swedish men aged 17-20 years at conscription during 1968-2010. Their subsequent ALS diagnoses were identified through the Swedish Patient Register. Hazard ratios (HRs) and 95% CIs from flexible parametric models were used to assess age-specific associations of physical fitness, BMI, IQ and stress resilience with ALS. We identified 439 incident ALS cases during follow-up (mean age at diagnosis: 48 years). Individuals with physical fitness above the highest tertile tended to have a higher risk of ALS before the age of 45 years (range of HRs: 1.42-1.75; statistically significant associations at age 41-43 years) compared with others. Individuals with BMI ≥ 25 tended to have a lower risk of ALS at all ages (range of HRs: 0.42-0.80; statistically significant associations at age 42-48 years) compared with those with BMI < 25. Individuals with IQ above the highest tertile had a statistically significantly increased risk of ALS at an age of 56 years and above (range of HRs: 1.33-1.81), whereas individuals with stress resilience above the highest tertile had a lower risk of ALS at an age of 55 years and below (range of HRs: 0.47-0.73). Physical fitness, BMI, IQ and stress resilience in young adulthood might be associated with the development of ALS at an early age. © 2016 EAN.

  2. Relating N2O emissions during biological nitrogen removal with operating conditions using multivariate statistical techniques.

    PubMed

    Vasilaki, V; Volcke, E I P; Nandi, A K; van Loosdrecht, M C M; Katsou, E

    2018-04-26

    Multivariate statistical analysis was applied to investigate the dependencies and underlying patterns between N2O emissions and online operational variables (dissolved oxygen and nitrogen component concentrations, temperature and influent flow-rate) during biological nitrogen removal from wastewater. The system under study was a full-scale reactor, for which hourly sensor data were available. The 15-month long monitoring campaign was divided into 10 sub-periods based on the profile of N2O emissions, using Binary Segmentation. The dependencies between operating variables and N2O emissions fluctuated according to Spearman's rank correlation. The correlation between N2O emissions and nitrite concentrations ranged between 0.51 and 0.78. Correlation >0.7 between N2O emissions and nitrate concentrations was observed at sub-periods with average temperature lower than 12 °C. Hierarchical k-means clustering and principal component analysis linked N2O emission peaks with precipitation events and ammonium concentrations higher than 2 mg/L, especially in sub-periods characterized by low N2O fluxes. Additionally, the highest ranges of measured N2O fluxes belonged to clusters corresponding with NO3-N concentration less than 1 mg/L in the upstream plug-flow reactor (middle of oxic zone), indicating slow nitrification rates. The results showed that the range of N2O emissions partially depends on the prior behavior of the system. The principal component analysis validated the findings from the clustering analysis and showed that ammonium, nitrate, nitrite and temperature explained a considerable percentage of the variance in the system for the majority of the sub-periods. The applied statistical methods linked the different ranges of emissions with the system variables, provided insights on the effect of operating conditions on N2O emissions in each sub-period, and can be integrated into N2O emissions data processing at wastewater treatment plants. Copyright © 2018. Published by Elsevier Ltd.
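
    The sketch below illustrates two of the analyses named above, Spearman rank correlation and k-means clustering, on synthetic sensor data; the variable ranges and the three-cluster choice are assumptions for illustration only.

```python
# Synthetic-data sketch of Spearman rank correlation between nitrite and N2O
# flux, followed by k-means to group operating states. Variable ranges and the
# three-cluster choice are assumptions.
import numpy as np
from scipy.stats import spearmanr
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
no2 = rng.uniform(0.1, 1.5, 200)               # mg/L, hourly nitrite
n2o = 5 * no2 + rng.normal(0, 1, 200)          # g/h, hourly N2O flux

rho, p = spearmanr(no2, n2o)
print(f"Spearman rho = {rho:.2f} (p = {p:.1e})")

X = np.column_stack([no2, n2o])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", np.bincount(labels))
```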

  3. The effect of static cyclotorsion compensation on refractive and visual outcomes using the Schwind Amaris laser platform for the correction of high astigmatism.

    PubMed

    Aslanides, Ioannis M; Toliou, Georgia; Padroni, Sara; Arba Mosquera, Samuel; Kolli, Sai

    2011-06-01

    To compare the refractive and visual outcomes using the Schwind Amaris excimer laser in patients with high astigmatism (>1D) with and without the static cyclotorsion compensation (SCC) algorithm available with this new laser platform. 70 consecutive eyes with ≥1D astigmatism were randomized to treatment with compensation of static cyclotorsion (SCC group, 35 eyes) or without it (control group, 35 eyes). A previously validated optimized aspheric ablation algorithm profile was used in every case. All patients underwent LASIK with a microkeratome cut flap. The SCC and control group did not differ preoperatively, in terms of refractive error, magnitude of astigmatism or in terms of cardinal or oblique astigmatism. Following treatment, average deviation from target was SEq +0.16D, SD±0.52 D, range -0.98 D to +1.71 D in the SCC group compared to +0.46 D, SD±0.61 D, range -0.25 D to +2.35 D in the control group, which was statistically significant (p<0.05). Following treatment, average astigmatism was 0.24 D (SD±0.28 D, range -1.01 D to 0.00 D) in the SCC group compared to 0.46 D (SD±0.42 D, range -1.80 D to 0.00 D) in the control group, which was highly statistically significant (p<0.005). There was no statistical difference in the postoperative uncorrected vision when the aspheric algorithm was used although there was a trend to increased number of lines gained in the SCC group. This study shows that static cyclotorsion is accurately compensated for by the Schwind Amaris laser platform. The compensation of static cyclotorsion in patients with moderate astigmatism produces a significant improvement in refractive and astigmatic outcomes compared with uncompensated treatment. Copyright © 2011 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  4. Nocturnal oxygen saturation profiles of healthy term infants.

    PubMed

    Terrill, Philip Ian; Dakin, Carolyn; Hughes, Ian; Yuill, Maggie; Parsley, Chloe

    2015-01-01

    Pulse oximetry is used extensively in hospital and home settings to measure arterial oxygen saturation (SpO2). Interpretation of the trend and range of SpO2 values observed in infants is currently limited by a lack of reference ranges using current devices, and may be augmented by development of cumulative frequency (CF) reference-curves. This study aims to provide reference oxygen saturation values from a prospective longitudinal cohort of healthy infants. Prospective longitudinal cohort study. Sleep-laboratory. 34 healthy term infants were enrolled, and studied at 2 weeks, 3, 6, 12 and 24 months of age (N=30, 25, 27, 26, 20, respectively). Full overnight polysomnography, including 2 s averaging pulse oximetry (Masimo Radical). Summary SpO2 statistics (mean, median, 5th and 10th percentiles) and SpO2 CF plots were calculated for each recording. CF reference-curves were then generated for each study age. Analyses were repeated with sleep-state stratifications and inclusion of manual artefact removal. Median nocturnal SpO2 values ranged between 98% and 99% over the first 2 years of life and the CF reference-curves shift right by 1% between 2 weeks and 3 months. CF reference-curves did not change with manual artefact removal during sleep and did not vary between rapid eye movement (REM) and non-REM sleep. Manual artefact removal did significantly change summary statistics and CF reference-curves during wake. SpO2 CF curves provide an intuitive visual tool for evaluating whether an individual's nocturnal SpO2 distribution falls within the range of healthy age-matched infants, thereby complementing summary statistics in the interpretation of extended oximetry recordings in infants. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
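
    As a minimal illustration of the cumulative-frequency (CF) curves and summary statistics described above, the following sketch builds a CF curve from a simulated overnight SpO2 recording; the simulated distribution is arbitrary and not a reference range.

```python
# CF curve and summary percentiles for a simulated overnight SpO2 recording;
# the distribution is arbitrary, not a reference range.
import numpy as np

rng = np.random.default_rng(0)
spo2 = np.clip(rng.normal(98.5, 1.0, 3600), 85, 100)   # ~2 h of 2-s samples, %

values = np.sort(spo2)
cum_freq = np.arange(1, values.size + 1) / values.size  # the CF curve one would plot

for q in (5, 10, 50):                                   # summary percentiles
    print(f"{q}th percentile: {np.percentile(spo2, q):.1f}%")
print(f"fraction of samples below 95%: {(spo2 < 95).mean():.3f}")
```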

  5. Retrospective review of efficacy of radiofrequency ablation for treatment of colorectal cancer liver metastases from a Canadian perspective.

    PubMed

    Kwan, Benjamin Y M; Kielar, Ania Z; El-Maraghi, Robert H; Garcia, Lourdes M

    2014-02-01

    A retrospective single-center review of ultrasound-guided radiofrequency ablation (RFA) treatment of colorectal cancer liver metastases was performed. This study reviews the primary and secondary technical effectiveness, overall survival of patients, recurrence-free survival, tumour-free survival, rates of local recurrence, and postprocedural RFA complications. Technical effectiveness and rates of complication with respect to tumour location and size were evaluated. Our results were compared with similar studies from Europe and North America. A total of 63 patients (109 tumours) treated with RFA between February 2004 and December 2009 were reviewed. Average and median follow-up times were 19.4 and 16.5 months, respectively (range, 1-54 months). Data from patient charts, pathology, and the Picture Archiving and Communication System were integrated into an Excel database. Statistical Analysis Software was used for statistical analysis. Primary and secondary technical effectiveness of percutaneous and intraoperative RFA were 90.8% and 92.7%, respectively. Average (SE) tumour-free survival was 14.4 ± 1.4 months (range, 1-43 months), and average (SE) recurrence-free survival was 33.5 ± 2.3 months (range, 2-50 months). Local recurrence was seen in 31.2% (34/109) of treated tumours. Overall survival was 89.4% at 1 year, 70.0% at 2 years, and 38.1% at 3 years, with an average (SE) overall survival of 37.0 ± 2.8 months. There were 14 postprocedural complications. There was no statistically significant difference in technical effectiveness for small tumours (1-2 cm) vs intermediate ones (3-5 cm). There was no difference in technical effectiveness for peripheral vs parenchymal tumours. This study demonstrated good-quality outcomes for RFA treatment of colorectal cancer liver metastases from a Canadian perspective and compared favorably with published studies. Copyright © 2014 Canadian Association of Radiologists. Published by Elsevier Inc. All rights reserved.

  6. Statewide analysis of the drainage-area ratio method for 34 streamflow percentile ranges in Texas

    USGS Publications Warehouse

    Asquith, William H.; Roussel, Meghan C.; Vrabel, Joseph

    2006-01-01

    The drainage-area ratio method commonly is used to estimate streamflow for sites where no streamflow data are available using data from one or more nearby streamflow-gaging stations. The method is intuitive and straightforward to implement and is in widespread use by analysts and managers of surface-water resources. The method equates the ratio of streamflow at two stream locations to the ratio of the respective drainage areas. In practice, unity often is assumed as the exponent on the drainage-area ratio, and unity also is assumed as a multiplicative bias correction. These two assumptions are evaluated in this investigation through statewide analysis of daily mean streamflow in Texas. The investigation was made by the U.S. Geological Survey in cooperation with the Texas Commission on Environmental Quality. More than 7.8 million values of daily mean streamflow for 712 U.S. Geological Survey streamflow-gaging stations in Texas were analyzed. To account for the influence of streamflow probability on the drainage-area ratio method, 34 percentile ranges were considered. The 34 ranges are the 4 quartiles (0-25, 25-50, 50-75, and 75-100 percent), the 5 intervals of the lower tail of the streamflow distribution (0-1, 1-2, 2-3, 3-4, and 4-5 percent), the 20 quintiles of the 4 quartiles (0-5, 5-10, 10-15, 15-20, 20-25, 25-30, 30-35, 35-40, 40-45, 45-50, 50-55, 55-60, 60-65, 65-70, 70-75, 75-80, 80-85, 85-90, 90-95, and 95-100 percent), and the 5 intervals of the upper tail of the streamflow distribution (95-96, 96-97, 97-98, 98-99 and 99-100 percent). For each of the 253,116 (712 × 711/2) unique pairings of stations and for each of the 34 percentile ranges, the concurrent daily mean streamflow values available for the two stations provided for station-pair application of the drainage-area ratio method. For each station pair, specific statistical summarization (median, mean, and standard deviation) of both the exponent and bias-correction components of the drainage-area ratio method were computed. Statewide statistics (median, mean, and standard deviation) of the station-pair specific statistics subsequently were computed and are tabulated herein. A separate analysis considered conditioning station pairs to those stations within 100 miles of each other and with the absolute value of the logarithm (base-10) of the ratio of the drainage areas greater than or equal to 0.25. Statewide statistics of the conditional station-pair specific statistics were computed and are tabulated. The conditional analysis is preferable because of the anticipation that small separation distances reflect similar hydrologic conditions and the observation of large variation in exponent estimates for similar-sized drainage areas. The conditional analysis determined that the exponent is about 0.89 for streamflow percentiles from 0 to about 50 percent, is about 0.92 for percentiles from about 50 to about 65 percent, and is about 0.93 for percentiles from about 65 to about 85 percent. The exponent decreases rapidly to about 0.70 for percentiles nearing 100 percent. The computation of the bias-correction factor is sensitive to the range analysis interval (range of streamflow percentile); however, evidence suggests that in practice the drainage-area method can be considered unbiased. Finally, for general application, suggested values of the exponent are tabulated for 54 percentiles of daily mean streamflow in Texas; when these values are used, the bias correction is unity.
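
    The core of the drainage-area ratio method summarized above reduces to scaling gauged streamflow by the ratio of drainage areas raised to an exponent, optionally with a multiplicative bias correction. The sketch below applies the roughly 0.89 exponent reported for the lower percentile ranges; the site areas and flow value are invented.

```python
# Drainage-area ratio method in code: flow at an ungauged site is the gauged
# flow scaled by the drainage-area ratio raised to an exponent (about 0.89 for
# lower percentile ranges, per the conditional analysis above), with an
# optional bias correction. Site values are invented.
def dar_streamflow(q_gauged, area_gauged, area_ungauged, exponent=0.89, bias=1.0):
    """Estimate daily mean streamflow at an ungauged site (units of q_gauged)."""
    return bias * q_gauged * (area_ungauged / area_gauged) ** exponent

# gauged site: 450 sq mi and 120 cfs today; ungauged site: 210 sq mi
print(f"estimated flow: {dar_streamflow(120.0, 450.0, 210.0):.1f} cfs")
```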

  7. Trends in precipitation, streamflow, reservoir pool elevations, and reservoir releases in Arkansas and selected sites in Louisiana, Missouri, and Oklahoma, 1951–2011

    USGS Publications Warehouse

    Wagner, Daniel M.; Krieger, Joshua D.; Merriman, Katherine R.

    2014-01-01

    The U.S. Geological Survey (USGS) and the U.S. Army Corps of Engineers (USACE) conducted a statistical analysis of trends in precipitation, streamflow, reservoir pool elevations, and reservoir releases in Arkansas and selected sites in Louisiana, Missouri, and Oklahoma for the period 1951–2011. The Mann-Kendall test was used to test for trends in annual and seasonal precipitation, annual and seasonal streamflows of 42 continuous-record USGS streamflow-gaging stations, annual pool elevations and releases from 16 USACE reservoirs, and annual releases from 11 dams on the Arkansas River. A statistically significant (p≤0.10) upward trend was observed in annual precipitation for the State, with a Sen slope of approximately 0.10 inch per year. Autumn and winter were the only seasons that had statistically significant trends in precipitation. Five of six physiographic sections and six of seven 4-digit hydrologic unit code (HUC) regions in Arkansas had statistically significant upward trends in autumn precipitation, with Sen slopes of approximately 0.06 to 0.10 inch per year. Sixteen sites had statistically significant upward trends in the annual mean daily streamflow and were located on streams that drained regions with statistically significant upward trends in annual precipitation. Expected annual rates of change corresponding to statistically significant trends in annual mean daily streamflows, which ranged from 0.32 to 0.88 percent, were greater than those corresponding to regions with statistically significant upward trends in annual precipitation, which ranged from 0.19 to 0.28 percent, suggesting that the observed trends in regional annual precipitation do not fully account for the observed trends in annual mean daily streamflows. Trends in annual maximum daily streamflows were similar to trends in the annual mean daily streamflows but were only statistically significant at seven sites. There were more statistically significant trends (28 of 42 sites) in the annual minimum daily streamflows than in the annual means or maximums. Statistically significant trends in the annual minimum daily streamflows were upward at 18 sites and downward at 10 sites. Despite autumn being the only season that had statistically significant upward trends in seasonal precipitation, statistically significant upward trends in seasonal mean streamflows occurred in every season but spring. Trends in the annual mean, maximum, and minimum daily pool elevations of USACE reservoirs were consistent between metrics for reservoirs in the White, Arkansas, and Ouachita River watersheds, while trends varied between metrics at DeQueen Lake, Millwood Lake, and Lake Chicot. Most of the statistically significant trends in pool elevation metrics were upward and gradual—Sen slopes were less than 0.37 foot per year—and were likely the result of changes in reservoir regulation plans. Trends in the annual mean and maximum daily releases from USACE reservoirs were generally upward in all HUC regions. There were few statistically significant trends in the annual mean daily releases because the reservoirs are operated to maintain a regulation stage at a downstream site according to guidelines set forth in the regulation plans of the reservoirs. The annual number of low-flow days was both increasing and decreasing for reservoirs in northern Arkansas and southern Missouri and generally increasing for reservoirs in southern Arkansas.
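
    A small sketch of the trend machinery used above: a Mann-Kendall-style monotonic-trend test (via Kendall's tau against time) and the Sen slope (Theil-Sen estimator), applied to a synthetic annual precipitation series. The series and its built-in trend are fabricated for illustration.

```python
# Mann-Kendall-style trend test (Kendall's tau against time) and Sen slope on
# a synthetic annual precipitation series with a built-in trend.
import numpy as np
from scipy import stats

years = np.arange(1951, 2012)
rng = np.random.default_rng(7)
precip = 48 + 0.10 * (years - 1951) + rng.normal(0, 4, years.size)  # inches

tau, p = stats.kendalltau(years, precip)
slope, intercept, lo, hi = stats.theilslopes(precip, years)         # 95% CI
print(f"tau = {tau:.2f}, p = {p:.3f}; Sen slope = {slope:.3f} in/yr "
      f"(95% CI {lo:.3f} to {hi:.3f})")
```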

  8. Measurement of Nonverbal IQ in Autism Spectrum Disorder: Scores in Young Adulthood compared to Early Childhood

    PubMed Central

    Bishop, Somer L.; Farmer, Cristan; Thurm, Audrey

    2014-01-01

    Nonverbal IQ (NVIQ) was examined in 84 individuals with ASD followed from age 2 to 19. Most adults who scored in the range of ID also received scores below 70 as children, and the majority of adults with scores in the average range had scored in this range by age 3. However, within the lower ranges of ability, actual scores declined from age 2 to 19, likely due in part to limitations of appropriate tests. Use of Vineland-II DLS scores in place of NVIQ did not statistically improve the correspondence between age 2 and age 19 scores. Clinicians and researchers should use caution when making comparisons based on exact scores or specific ability ranges within or across individuals with ASD of different ages. PMID:25239176

  9. Theoretical analysis of HVAC duct hanger systems

    NASA Technical Reports Server (NTRS)

    Miller, R. D.

    1987-01-01

    Several methods are presented which, together, may be used in the analysis of duct hanger systems over a wide range of frequencies. The finite element method (FEM) and component mode synthesis (CMS) method are used for low- to mid-frequency range computations and have been shown to yield reasonably close results. The statistical energy analysis (SEA) method yields predictions which agree with the CMS results for the 800 to 1000 Hz range provided that a sufficient number of modes participate. The CMS approach has been shown to yield valuable insight into the mid-frequency range of the analysis. It has been demonstrated that it is possible to conduct an analysis of a duct/hanger system in a cost-effective way for a wide frequency range, using several methods which overlap for several frequency bands.

  10. Adaptation and colonization history affect the evolution of clines in two introduced species.

    PubMed

    Keller, Stephen R; Sowell, Dexter R; Neiman, Maurine; Wolfe, Lorne M; Taylor, Douglas R

    2009-08-01

    Phenotypic and genetic clines have long been synonymous with adaptive evolution. However, other processes (for example, migration, range expansion, invasion) may generate clines in traits or loci across geographical and environmental gradients. It is therefore important to distinguish between clines that represent adaptive evolution and those that result from selectively neutral demographic or genetic processes. We tested for the differentiation of phenotypic traits along environmental gradients using two species in the genus Silene, whilst statistically controlling for colonization history and founder effects. We sampled seed families from across the native and introduced ranges, genotyped individuals and estimated phenotypic differentiation in replicated common gardens. The results suggest that post-glacial expansion of S. vulgaris and S. latifolia involved both neutral and adaptive genetic differentiation (clines) of life history traits along major axes of environmental variation in Europe and North America. Phenotypic clines generally persisted when tested against the neutral expectation, although some clines disappeared (and one cline emerged) when the effects of genetic ancestry were statistically removed. Colonization history, estimated using genetic markers, is a useful null model for tests of adaptive trait divergence, especially during range expansion and invasion when selection and gene flow may not have reached equilibrium.

  11. Generalization of symmetric α-stable Lévy distributions for q > 1

    NASA Astrophysics Data System (ADS)

    Umarov, Sabir; Tsallis, Constantino; Gell-Mann, Murray; Steinberg, Stanly

    2010-03-01

    The α-stable distributions introduced by Lévy play an important role in probabilistic theoretical studies and their various applications, e.g., in statistical physics, life sciences, and economics. In the present paper we study sequences of long-range dependent random variables whose distributions have asymptotic power-law decay, and which are called (q,α)-stable distributions. These sequences are generalizations of independent and identically distributed α-stable distributions and have not been previously studied. Long-range dependent (q,α)-stable distributions might arise in the description of anomalous processes in nonextensive statistical mechanics, cell biology, and finance. The parameter q controls dependence. If q = 1 then they are classical independent and identically distributed with α-stable Lévy distributions. In the present paper we establish basic properties of (q,α)-stable distributions and generalize the result of Umarov et al. [Milan J. Math. 76, 307 (2008)], where the particular case α = 2, q ∈ [1,3) was considered, to the whole range of stability and nonextensivity parameters α ∈ (0,2] and q ∈ [1,3), respectively. We also discuss possible further extensions of the results that we obtain and formulate some conjectures.

  12. Effect of Control Mode and Test Rate on the Measured Fracture Toughness of Advanced Ceramics

    NASA Technical Reports Server (NTRS)

    Hausmann, Bronson D.; Salem, Jonathan A.

    2018-01-01

    The effects of control mode and test rate on the measured fracture toughness of ceramics were evaluated by using chevron-notched flexure specimens in accordance with ASTM C1421. The use of stroke control gave consistent results with about 2% (statistically insignificant) variation in measured fracture toughness for a very wide range of rates (0.005 to 0.5 mm/min). Use of strain or crack mouth opening displacement (CMOD) control gave approx. 5% (statistically significant) variation over a very wide range of rates (1 to 80 µm/m/s), with the measurements being a function of rate. However, the rate effect was eliminated by use of dry nitrogen, implying a stress corrosion effect rather than a stability effect. With the use of a nitrogen environment during strain controlled tests, fracture toughness values were within about 1% over a wide range of rates (1 to 80 µm/m/s). CMOD or strain control did allow stable crack extension well past maximum force, and thus is preferred for energy calculations. The effort is being used to confirm recommendations in ASTM Test Method C1421 on fracture toughness measurement.

  13. The Impact of Azilsartan Medoxomil Treatment (Capsule Formulation) at Doses Ranging From 10 to 80 mg: Significant, Rapid Reductions in Clinic Diastolic and Systolic Blood Pressure.

    PubMed

    Perez, Alfonso; Cao, Charlie

    2017-03-01

    In this phase 2, multicenter, parallel-group, double-blind, dose-ranging study, hypertensive adults (n=449) were randomized to receive one of five doses of a capsule formulation of azilsartan medoxomil (AZL-M; 5, 10, 20, 40, 80 mg), olmesartan medoxomil (OLM) 20 mg, or placebo once daily. The primary endpoint was change in trough clinic diastolic blood pressure (DBP) at week 8. AZL-M provided rapid statistically and clinically significant reductions in DBP and systolic blood pressure (SBP) vs placebo at all doses except 5 mg. Placebo-subtracted changes were greatest with the 40 mg dose (DBP, -5.7 mm Hg; SBP, -12.3 mm Hg). Clinic changes with AZL-M (all doses) were statistically indistinguishable vs OLM, although there were greater reductions with AZL-M 40 mg using 24-hour ambulatory blood pressure. Adverse event frequency was similar in the AZL-M and placebo groups. Based on these and other findings, subsequent trials investigated the commercial AZL-M tablet in the dose range of 20 to 80 mg/d. ©2016 Wiley Periodicals, Inc.

  14. Needs of Foreign Students from Developing Nations at U.S. Colleges and Universities.

    DTIC Science & Technology

    1980-03-01

    variations among the students. Sex, age, and marital status were part of these variations. English language proficiency, as measured by TOEFL scores...varies by the command of English students have. The command of English was measured by two measures: (1) TOEFL score ranges, and (2) the self evaluation...coefficients from a statistical point of view. However, when the coefficients were examined substantively, TOEFL score ranges did not account for 5% or more of

  15. Agent based reasoning for the non-linear stochastic models of long-range memory

    NASA Astrophysics Data System (ADS)

    Kononovicius, A.; Gontis, V.

    2012-02-01

    We extend Kirman's model by introducing a variable event time scale. The proposed flexible time scale is equivalent to the variable trading activity observed in financial markets. The stochastic version of the extended Kirman agent-based model is compared to the non-linear stochastic models of long-range memory in financial markets. The agent-based model, providing a matching macroscopic description, serves as a microscopic reasoning for the earlier proposed stochastic model exhibiting power-law statistics.
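
    For orientation, the sketch below simulates the basic fixed-time-scale Kirman herding model that the abstract extends: at each step, one agent either switches opinion idiosyncratically or is recruited by an agent of the other type. The parameter values are arbitrary assumptions, not those of the paper.

```python
# Basic Kirman herding model: each step, one agent switches opinion
# idiosyncratically (eps) or is recruited by an agent of the other type (h).
# Parameter values are arbitrary assumptions.
import numpy as np

N, steps = 100, 50_000
eps, h = 0.002, 0.02
rng = np.random.default_rng(3)

n = N // 2                                  # agents currently holding opinion A
frac = np.empty(steps)
for t in range(steps):
    p_up = (N - n) / N * (eps + h * n / (N - 1))     # a B-agent becomes A
    p_down = n / N * (eps + h * (N - n) / (N - 1))   # an A-agent becomes B
    u = rng.random()
    n += (u < p_up) - (p_up <= u < p_up + p_down)
    frac[t] = n / N
print(f"mean fraction holding A = {frac.mean():.2f}, std = {frac.std():.2f}")
```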

  16. The optical diagnostics of parameters of biological tissues of human intact skin in near-infrared range

    NASA Astrophysics Data System (ADS)

    Petruk, Vasyl; Kvaternyuk, Sergii; Bolyuh, Boris; Bolyuh, Dmitry; Dronenko, Vladimir; Harasim, Damian; Annabayev, Azamat

    2016-09-01

    Skin melanoma is difficult to diagnose in the early stages of development despite its superficial location, and is difficult to differentiate visually from benign melanocytic nevi. In this work we investigated parameters of intact human skin in the near-infrared range for different racial and gender groups. This makes it possible to analyze statistical differences in the diffuse reflection coefficient and to use them in the differential diagnosis of cancer by optical methods.

  17. Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.

    2015-10-30

    The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes to assess potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of the tests of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.

  18. Conservation status of polar bears (Ursus maritimus) in relation to projected sea-ice declines

    NASA Astrophysics Data System (ADS)

    Laidre, K. L.; Regehr, E. V.; Akcakaya, H. R.; Amstrup, S. C.; Atwood, T.; Lunn, N.; Obbard, M.; Stern, H. L., III; Thiemann, G.; Wiig, O.

    2016-12-01

    Loss of Arctic sea ice due to climate change is the most serious threat to polar bears (Ursus maritimus) throughout their circumpolar range. We performed a data-based sensitivity analysis with respect to this threat by evaluating the potential response of the global polar bear population to projected sea-ice conditions. We conducted (1) an assessment of generation length for polar bears; (2) development of a standardized sea-ice metric representing important habitat characteristics for the species; and (3) population projections over three generations, using computer simulation and statistical models representing alternative relationships between sea ice and polar bear abundance. Using three separate approaches, the median percent change in mean global population size for polar bears between 2015 and 2050 ranged from -4% (95% CI = -62%, 50%) to -43% (95% CI = -76%, -20%). Results highlight the potential for large reductions in the global population if sea-ice loss continues. They also highlight the large amount of uncertainty in statistical projections of polar bear abundance and the sensitivity of projections to plausible alternative assumptions. The median probability of a reduction in the mean global population size of polar bears greater than 30% over three generations was approximately 0.71 (range 0.20-0.95). The median probability of a reduction greater than 50% was approximately 0.07 (range 0-0.35), and the probability of a reduction greater than 80% was negligible.

  19. Conservation status of polar bears (Ursus maritimus) in relation to projected sea-ice declines.

    PubMed

    Regehr, Eric V; Laidre, Kristin L; Akçakaya, H Resit; Amstrup, Steven C; Atwood, Todd C; Lunn, Nicholas J; Obbard, Martyn; Stern, Harry; Thiemann, Gregory W; Wiig, Øystein

    2016-12-01

    Loss of Arctic sea ice owing to climate change is the primary threat to polar bears throughout their range. We evaluated the potential response of polar bears to sea-ice declines by (i) calculating generation length (GL) for the species, which determines the timeframe for conservation assessments; (ii) developing a standardized sea-ice metric representing important habitat; and (iii) using statistical models and computer simulation to project changes in the global population under three approaches relating polar bear abundance to sea ice. Mean GL was 11.5 years. Ice-covered days declined in all subpopulation areas during 1979-2014 (median −1.26 days year⁻¹). The estimated probabilities that reductions in the mean global population size of polar bears will be greater than 30%, 50% and 80% over three generations (35-41 years) were 0.71 (range 0.20-0.95), 0.07 (range 0-0.35) and less than 0.01 (range 0-0.02), respectively. According to IUCN Red List reduction thresholds, which provide a common measure of extinction risk across taxa, these results are consistent with listing the species as vulnerable. Our findings support the potential for large declines in polar bear numbers owing to sea-ice loss, and highlight near-term uncertainty in statistical projections as well as the sensitivity of projections to different plausible assumptions. © 2016 The Authors.

  20. A quality control circle process to improve implementation effect of prevention measures for high-risk patients.

    PubMed

    Feng, Haixia; Li, Guohong; Xu, Cuirong; Ju, Changping; Suo, Peiheng

    2017-12-01

    The aim of the study was to analyse the influence of prevention measures on pressure injuries for high-risk patients and to establish the most appropriate methods of implementation. Nurses assessed patients using a checklist, and factors influencing the prevention of a pressure injury were determined by brainstorming. A specific series of measures was drawn up, and an estimate of risk of pressure injury was determined using the Braden Scale, analysis of nursing documents, implementation of prevention measures for pressure sores, and awareness of the system both before and after carrying out a quality control circle (QCC) process. The overall scores of implementation of prevention measures ranged from 74.86 ± 14.24 to 87.06 ± 17.04, a result that was statistically significant (P < 0.0025). The Braden Scale scores ranged from 8.53 ± 3.21 to 13.48 ± 3.57. The nursing document scores ranged from 7.67 ± 3.98 to 10.12 ± 1.63; prevention measure scores ranged from 11.48 ± 4.18 to 13.96 ± 3.92. Differences in all of the above results are statistically significant (P < 0.05). Implementation of a QCC can standardise and improve the prevention measures for patients who are vulnerable to pressure sores and is of practical importance to their prevention and control. © 2017 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  1. Conservation status of polar bears (Ursus maritimus) in relation to projected sea-ice declines

    USGS Publications Warehouse

    Regehr, Eric V.; Laidre, Kristin L.; Akçakaya, H. Resit; Amstrup, Steven C.; Atwood, Todd C.; Lunn, Nicholas J.; Obbard, Martyn E.; Stern, Harry; Thiemann, Gregory W.; Wiig, Øystein

    2016-01-01

    Loss of Arctic sea ice owing to climate change is the primary threat to polar bears throughout their range. We evaluated the potential response of polar bears to sea-ice declines by (i) calculating generation length (GL) for the species, which determines the timeframe for conservation assessments; (ii) developing a standardized sea-ice metric representing important habitat; and (iii) using statistical models and computer simulation to project changes in the global population under three approaches relating polar bear abundance to sea ice. Mean GL was 11.5 years. Ice-covered days declined in all subpopulation areas during 1979–2014 (median −1.26 days year−1). The estimated probabilities that reductions in the mean global population size of polar bears will be greater than 30%, 50% and 80% over three generations (35–41 years) were 0.71 (range 0.20–0.95), 0.07 (range 0–0.35) and less than 0.01 (range 0–0.02), respectively. According to IUCN Red List reduction thresholds, which provide a common measure of extinction risk across taxa, these results are consistent with listing the species as vulnerable. Our findings support the potential for large declines in polar bear numbers owing to sea-ice loss, and highlight near-term uncertainty in statistical projections as well as the sensitivity of projections to different plausible assumptions.

  2. Multimodel predictive system for carbon dioxide solubility in saline formation waters.

    PubMed

    Wang, Zan; Small, Mitchell J; Karamalidis, Athanasios K

    2013-02-05

    The prediction of carbon dioxide solubility in brine at conditions relevant to carbon sequestration (i.e., high temperature, pressure, and salt concentration (T-P-X)) is crucial when this technology is applied. Eleven mathematical models for predicting CO2 solubility in brine are compared and considered for inclusion in a multimodel predictive system. Model goodness of fit is evaluated over the temperature range 304-433 K, pressure range 74-500 bar, and salt concentration range 0-7 m (NaCl equivalent), using 173 published CO2 solubility measurements, particularly selected for those conditions. The performance of each model is assessed using various statistical methods, including the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). Different models emerge as best fits for different subranges of the input conditions. A classification tree is generated using machine learning methods to predict the best-performing model under different T-P-X subranges, allowing development of a multimodel predictive system (MMoPS) that selects and applies the model expected to yield the most accurate CO2 solubility prediction. Statistical analysis of the MMoPS predictions, including a stratified 5-fold cross validation, shows that MMoPS outperforms each individual model and increases the overall accuracy of CO2 solubility prediction across the range of T-P-X conditions likely to be encountered in carbon sequestration applications.
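
    A minimal sketch of AIC-based model ranking, one of the criteria named above; here the candidate "models" are simple polynomial fits to invented solubility data, standing in for the eleven literature models.

```python
# AIC-based ranking of candidate models; the "models" are polynomial fits to
# invented solubility data, standing in for the eleven literature models.
import numpy as np

rng = np.random.default_rng(5)
T = rng.uniform(304, 433, 173)                       # K, matching the stated range
sol = 1.2 - 0.002 * (T - 350) + rng.normal(0, 0.05, T.size)   # mol/kg, synthetic

def aic(y, yhat, k):
    """Gaussian-likelihood AIC with k fitted parameters."""
    n = y.size
    return n * np.log(np.sum((y - yhat) ** 2) / n) + 2 * k

for degree in (1, 2, 3):
    coef = np.polyfit(T, sol, degree)
    print(f"degree {degree}: AIC = {aic(sol, np.polyval(coef, T), degree + 1):.1f}")
```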

  3. Similar range of motion and function after resurfacing large-head or standard total hip arthroplasty

    PubMed Central

    2013-01-01

    Background and purpose Large-size hip articulations may improve range of motion (ROM) and function compared to a 28-mm THA, and the low risk of dislocation allows the patients more activity postoperatively. On the other hand, the greater extent of surgery for resurfacing hip arthroplasty (RHA) could impair rehabilitation. We investigated the effect of head size and surgical procedure on postoperative rehabilitation in a randomized clinical trial (RCT). Methods We followed randomized groups of RHAs, large-head THAs and standard THAs at 2 months, 6 months, 1 and 2 years postoperatively, recording clinical rehabilitation parameters. Results Large articulations increased the mean total range of motion by 13° during the first 6 postoperative months. The increase was not statistically significant and was transient. The 2-year total ROM (SD) for RHA, standard THA, and large-head THA was 221° (35), 232° (36), and 225° (30) respectively, but the differences were not statistically significant. The 3 groups were similar regarding Harris hip score, UCLA activity score, step rate, and sick leave. Interpretation Head size had no influence on range of motion. The lack of restriction allowed for large articulations did not improve the clinical and patient-perceived outcomes. The more extensive surgical procedure of RHA did not impair the rehabilitation. This project is registered at ClinicalTrials.gov under # NCT01113762. PMID:23530872

  4. Improved Statistics for Genome-Wide Interaction Analysis

    PubMed Central

    Ueki, Masao; Cordell, Heather J.

    2012-01-01

    Recently, Wu and colleagues [1] proposed two novel statistics for genome-wide interaction analysis using case/control or case-only data. In computer simulations, their proposed case/control statistic outperformed competing approaches, including the fast-epistasis option in PLINK and logistic regression analysis under the correct model; however, reasons for its superior performance were not fully explored. Here we investigate the theoretical properties and performance of Wu et al.'s proposed statistics and explain why, in some circumstances, they outperform competing approaches. Unfortunately, we find minor errors in the formulae for their statistics, resulting in tests that have higher than nominal type 1 error. We also find minor errors in PLINK's fast-epistasis and case-only statistics, although theory and simulations suggest that these errors have only negligible effect on type 1 error. We propose adjusted versions of all four statistics that, both theoretically and in computer simulations, maintain correct type 1 error rates under the null hypothesis. We also investigate statistics based on correlation coefficients that maintain similar control of type 1 error. Although designed to test specifically for interaction, we show that some of these previously-proposed statistics can, in fact, be sensitive to main effects at one or both loci, particularly in the presence of linkage disequilibrium. We propose two new “joint effects” statistics that, provided the disease is rare, are sensitive only to genuine interaction effects. In computer simulations we find, in most situations considered, that highest power is achieved by analysis under the correct genetic model. Such an analysis is unachievable in practice, as we do not know this model. However, generally high power over a wide range of scenarios is exhibited by our joint effects and adjusted Wu statistics. We recommend use of these alternative or adjusted statistics and urge caution when using Wu et al.'s originally-proposed statistics, on account of the inflated error rate that can result. PMID:22496670

  5. Scenarios for Motivating the Learning of Variability: An Example in Finances

    ERIC Educational Resources Information Center

    Cordani, Lisbeth K.

    2013-01-01

    This article explores an example from finance in order to motivate the learning of random variables for beginners in statistics. In addition, it offers a relationship between the standard deviation and the range in a very specific situation.
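
    As a worked illustration of one such standard-deviation/range relationship (a generic simulation sketch, not taken from the article): for small samples from a normal population, the expected range is a fixed multiple of the standard deviation, the control-chart constant d2, so the range divided by d2 gives a quick estimate of the standard deviation.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_true = 10.0         # known population SD for the simulation
n, trials = 5, 100_000    # small samples, many repetitions

samples = rng.normal(0.0, sigma_true, size=(trials, n))
ranges = samples.max(axis=1) - samples.min(axis=1)

# Empirical d2 = E[range] / sigma for samples of size n from a normal population
d2 = ranges.mean() / sigma_true
print(f"estimated d2 for n={n}: {d2:.3f}")  # tabulated value is about 2.326

# Range-based SD estimate for one fresh sample
x = rng.normal(0.0, sigma_true, size=n)
print(f"sigma estimated from range: {(x.max() - x.min()) / d2:.2f}")
```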

  6. Criterion 6, indicator 30 : value and volume in round wood equivalents of exports and imports of wood products

    Treesearch

    James Howard; Rebecca Westby; Kenneth Skog

    2010-01-01

    This report provides a wide range of specific and statistical information on forest products markets in terms of production, trade, prices and consumption, employment, and other factors influencing forest sustainability.

  7. Efficient Coding and Statistically Optimal Weighting of Covariance among Acoustic Attributes in Novel Sounds

    PubMed Central

    Stilp, Christian E.; Kluender, Keith R.

    2012-01-01

    To the extent that sensorineural systems are efficient, redundancy should be extracted to optimize transmission of information, but perceptual evidence for this has been limited. Stilp and colleagues recently reported efficient coding of robust correlation (r = .97) among complex acoustic attributes (attack/decay, spectral shape) in novel sounds. Discrimination of sounds orthogonal to the correlation was initially inferior but later comparable to that of sounds obeying the correlation. These effects were attenuated for less-correlated stimuli (r = .54) for reasons that are unclear. Here, statistical properties of correlation among acoustic attributes essential for perceptual organization are investigated. Overall, simple strength of the principal correlation is inadequate to predict listener performance. Initial superiority of discrimination for statistically consistent sound pairs was relatively insensitive to decreased physical acoustic/psychoacoustic range of evidence supporting the correlation, and to more frequent presentations of the same orthogonal test pairs. However, increased range supporting an orthogonal dimension has substantial effects upon perceptual organization. Connectionist simulations and eigenvalues from closed-form calculations of principal components analysis (PCA) reveal that perceptual organization is near-optimally weighted to shared versus unshared covariance in experienced sound distributions. Implications of reduced perceptual dimensionality for speech perception and plausible neural substrates are discussed. PMID:22292057
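
    The shared-versus-unshared covariance point has a compact closed-form core: for two attributes correlated at r (unit variances), the covariance eigenvalues are 1 + r and 1 - r, so the principal dimension carries almost all variance at r = .97 but much less at r = .54. A small sketch with synthetic data (not the study's stimuli):

```python
import numpy as np

def eigen_split(r, n=10_000, seed=0):
    """Eigenvalues of the sample covariance of two attributes correlated at r."""
    rng = np.random.default_rng(seed)
    x = rng.multivariate_normal([0.0, 0.0], [[1.0, r], [r, 1.0]], size=n)
    evals = np.linalg.eigvalsh(np.cov(x.T))  # ascending order
    return evals[::-1]                       # principal eigenvalue first

for r in (0.97, 0.54):
    lam1, lam2 = eigen_split(r)
    print(f"r={r}: shared {lam1:.2f}, unshared {lam2:.2f}, ratio {lam1 / lam2:.1f}")
```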

  8. A simple microstructure return model explaining microstructure noise and Epps effects

    NASA Astrophysics Data System (ADS)

    Saichev, A.; Sornette, D.

    2014-01-01

    We present a novel simple microstructure model of financial returns that combines (i) the well-known ARFIMA process applied to tick-by-tick returns, (ii) the bid-ask bounce effect, (iii) the fat tail structure of the distribution of returns and (iv) the non-Poissonian statistics of inter-trade intervals. This model allows us to explain both qualitatively and quantitatively important stylized facts observed in the statistics of both microstructure and macrostructure returns, including the short-ranged correlation of returns, the long-ranged correlations of absolute returns, the microstructure noise and Epps effects. According to the microstructure noise effect, volatility is a decreasing function of the time-scale used to estimate it. The Epps effect states that cross correlations between asset returns are increasing functions of the time-scale at which the returns are estimated. The microstructure noise is explained as the result of the negative return correlations inherent in the definition of the bid-ask bounce component (ii). In the presence of a genuine correlation between the returns of two assets, the Epps effect is due to an average statistical overlap of the momentum of the returns of the two assets defined over a finite time-scale in the presence of the long memory process (i).

  9. Data series embedding and scale invariant statistics.

    PubMed

    Michieli, I; Medved, B; Ristov, S

    2010-06-01

    Data sequences acquired from bio-systems, such as human gait data, heart rate interbeat data, or DNA sequences, exhibit complex dynamics that are frequently described by a long-memory or power-law decay of the autocorrelation function. One way of characterizing that dynamics is through scale invariant statistics or "fractal-like" behavior. Several methods have been proposed for quantifying scale invariant parameters of physiological signals. Among them the most common are detrended fluctuation analysis, sample mean variance analyses, power spectral density analysis, R/S analysis, and, recently in the realm of the multifractal approach, wavelet analysis. In this paper it is demonstrated that embedding the time series data in a high-dimensional pseudo-phase space reveals scale invariant statistics in a simple fashion. The procedure is applied to different stride interval data sets from human gait measurement time series (PhysioBank data library). Results show that the introduced mapping adequately separates long-memory from random behavior. Smaller gait data sets were analyzed, and scale-free trends over limited scale intervals were successfully detected. The method was verified on artificially produced time series with known scaling behavior and with varying content of noise. The possibility for the method to falsely detect long-range dependence in artificially generated short-range dependence series was investigated. (c) 2009 Elsevier B.V. All rights reserved.
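
    A minimal sketch of the embedding step, time-delay embedding of a scalar series into a pseudo-phase space (the dimension and lag below are illustrative choices, not the paper's settings):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Map a 1-D series into dim-dimensional pseudo-phase space with lag tau.

    Row i is the vector (x[i], x[i+tau], ..., x[i+(dim-1)*tau]).
    """
    x = np.asarray(x)
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this (dim, tau)")
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Toy usage on a stride-interval-like series
t = np.arange(1000)
series = np.sin(0.05 * t) + 0.1 * np.random.default_rng(1).standard_normal(1000)
vectors = delay_embed(series, dim=3, tau=10)
print(vectors.shape)  # (980, 3)
```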

  10. Thermal heterogeneity within aqueous materials quantified by 1H NMR spectroscopy: Multiparametric validation in silico and in vitro

    NASA Astrophysics Data System (ADS)

    Lutz, Norbert W.; Bernard, Monique

    2018-02-01

    We recently suggested a new paradigm for statistical analysis of thermal heterogeneity in (semi-)aqueous materials by 1H NMR spectroscopy, using water as a temperature probe. Here, we present a comprehensive in silico and in vitro validation that demonstrates the ability of this new technique to provide accurate quantitative parameters characterizing the statistical distribution of temperature values in a volume of (semi-)aqueous matter. First, line shape parameters of numerically simulated water 1H NMR spectra are systematically varied to study a range of mathematically well-defined temperature distributions. Then, corresponding models based on measured 1H NMR spectra of agarose gel are analyzed. In addition, dedicated samples based on hydrogels or biological tissue are designed to produce temperature gradients changing over time, and dynamic NMR spectroscopy is employed to analyze the resulting temperature profiles at sub-second temporal resolution. Accuracy and consistency of the previously introduced statistical descriptors of temperature heterogeneity are determined: weighted median and mean temperature, standard deviation, temperature range, temperature mode(s), kurtosis, skewness, entropy, and relative areas under temperature curves. Potential and limitations of this method for quantitative analysis of thermal heterogeneity in (semi-)aqueous materials are discussed in view of prospective applications in materials science as well as biology and medicine.
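
    The statistical descriptors named above are all standard weighted moments of a distribution. A rough sketch computing them from a hypothetical temperature histogram (the conversion from an NMR line shape to a temperature distribution is the paper's contribution and is omitted here):

```python
import numpy as np

def weighted_descriptors(temps, weights):
    """Weighted statistics of a temperature distribution."""
    t = np.asarray(temps, dtype=float)   # assumed sorted ascending
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()

    mean = np.sum(w * t)
    sd = np.sqrt(np.sum(w * (t - mean) ** 2))
    skew = np.sum(w * (t - mean) ** 3) / sd ** 3
    kurt = np.sum(w * (t - mean) ** 4) / sd ** 4
    median = t[np.searchsorted(np.cumsum(w), 0.5)]
    mode = t[np.argmax(w)]
    entropy = -np.sum(w[w > 0] * np.log(w[w > 0]))
    return dict(mean=mean, median=median, sd=sd, range=t.max() - t.min(),
                mode=mode, skewness=skew, kurtosis=kurt, entropy=entropy)

# Toy bimodal temperature profile over 30-45 degrees C
t = np.linspace(30, 45, 151)
w = np.exp(-((t - 34) ** 2) / 2) + 0.5 * np.exp(-((t - 41) ** 2) / 2)
print(weighted_descriptors(t, w))
```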

  11. A graph theory approach to identify resonant and non-resonant transmission paths in statistical modal energy distribution analysis

    NASA Astrophysics Data System (ADS)

    Aragonès, Àngels; Maxit, Laurent; Guasch, Oriol

    2015-08-01

    Statistical modal energy distribution analysis (SmEdA) extends classical statistical energy analysis (SEA) to the mid frequency range by establishing power balance equations between modes in different subsystems. This circumvents the SEA requirement of modal energy equipartition and enables applying SmEdA to cases of low modal overlap and locally excited subsystems, and to deal with complex heterogeneous subsystems as well. Yet, widening the range of application of SEA comes at the price of large models, because the number of modes per subsystem can become considerable as the frequency increases. Therefore, it would be worthwhile to have at one's disposal tools for a quick identification and ranking of the resonant and non-resonant paths involved in modal energy transmission between subsystems. It will be shown that previously developed graph theory algorithms for transmission path analysis (TPA) in SEA can be adapted to SmEdA and prove useful for that purpose. The case of airborne transmission between two cavities separated by homogeneous and ribbed plates will first be addressed to illustrate the potential of the graph approach. A more complex case representing transmission between non-contiguous cavities in a shipbuilding structure will also be presented.
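
    The graph idea can be sketched compactly: treat modes as nodes, give each edge the weight -log(coupling), and the strongest paths (largest coupling products) become the shortest paths, which standard algorithms can enumerate in order. The node names and coupling values below are invented, and this is only an illustration of the ranking principle, not the SmEdA formulation itself:

```python
import math
import networkx as nx

# Hypothetical modal network: nodes are subsystem modes, values are
# dimensionless coupling strengths between modes (all numbers invented).
couplings = {
    ("cavity1_m1", "plate_m1"): 0.30, ("cavity1_m1", "plate_m2"): 0.05,
    ("plate_m1", "cavity2_m1"): 0.20, ("plate_m2", "cavity2_m1"): 0.40,
    ("cavity1_m1", "cavity2_m1"): 0.02,  # weak non-resonant "direct" path
}

G = nx.DiGraph()
for (u, v), k in couplings.items():
    # -log turns "maximize the product of couplings" into "minimize the sum"
    G.add_edge(u, v, weight=-math.log(k))

paths = nx.shortest_simple_paths(G, "cavity1_m1", "cavity2_m1", weight="weight")
for rank, p in enumerate(paths, start=1):
    strength = math.exp(-nx.path_weight(G, p, weight="weight"))
    print(f"{rank}. {' -> '.join(p)}  (coupling product {strength:.3f})")
```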

  12. Statistical analysis of the electrocatalytic activity of Pt nanoparticles supported on novel functionalized reduced graphene oxide-chitosan for methanol electrooxidation

    NASA Astrophysics Data System (ADS)

    Ekrami-Kakhki, Mehri-Saddat; Abbasi, Sedigheh; Farzaneh, Nahid

    2018-01-01

    The purpose of this study is to statistically analyze the anodic current density and peak potential of methanol oxidation at Pt nanoparticles supported on functionalized reduced graphene oxide (RGO), using design of experiments methodology. RGO is functionalized with methyl viologen (MV) and chitosan (CH). The novel Pt/MV-RGO-CH catalyst is successfully prepared and characterized by transmission electron microscopy (TEM). The electrocatalytic activity of the Pt/MV-RGO-CH catalyst is experimentally evaluated for methanol oxidation. The effects of methanol concentration and scan rate factors are also investigated experimentally and statistically. The effects of these two main factors and their interactions are investigated using analysis of variance, Duncan's multiple range test, and the response surface method. The results of the analysis of variance show that all the main factors and their interactions have a significant effect on the anodic current density and peak potential of methanol oxidation at α = 0.05. The suggested models, which encompass the significant factors, can predict the variation of the anodic current density and peak potential of methanol oxidation. The results of Duncan's multiple range test confirm that there is a significant difference between the studied levels of the main factors.

  13. Evaluation and comparison of statistical methods for early temporal detection of outbreaks: A simulation-based study

    PubMed Central

    Le Strat, Yann

    2017-01-01

    The objective of this paper is to evaluate a panel of statistical algorithms for temporal outbreak detection. Based on a large dataset of simulated weekly surveillance time series, we performed a systematic assessment of 21 statistical algorithms, 19 implemented in the R package surveillance and two other methods. We estimated false positive rate (FPR), probability of detection (POD), probability of detection during the first week, sensitivity, specificity, negative and positive predictive values and F1-measure for each detection method. Then, to identify the factors associated with these performance measures, we ran multivariate Poisson regression models adjusted for the characteristics of the simulated time series (trend, seasonality, dispersion, outbreak sizes, etc.). The FPR ranged from 0.7% to 59.9% and the POD from 43.3% to 88.7%. Some methods had a very high specificity, up to 99.4%, but a low sensitivity. Methods with a high sensitivity (up to 79.5%) had a low specificity. All methods had a high negative predictive value, over 94%, while positive predictive values ranged from 6.5% to 68.4%. Multivariate Poisson regression models showed that performance measures were strongly influenced by the characteristics of time series. Past or current outbreak size and duration strongly influenced detection performances. PMID:28715489
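
    A minimal sketch of the week-by-week performance measures used in such an evaluation, computed from boolean alarm and outbreak series (toy data, not the simulated surveillance set):

```python
import numpy as np

def detection_metrics(alarm, outbreak):
    """Confusion-matrix measures for weekly alarms against true outbreak weeks."""
    alarm, outbreak = np.asarray(alarm, bool), np.asarray(outbreak, bool)
    tp = np.sum(alarm & outbreak)
    fp = np.sum(alarm & ~outbreak)
    tn = np.sum(~alarm & ~outbreak)
    fn = np.sum(~alarm & outbreak)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return dict(sensitivity=sens, specificity=spec, FPR=fp / (fp + tn),
                PPV=ppv, NPV=tn / (tn + fn), F1=2 * ppv * sens / (ppv + sens))

# Toy 10-week series: outbreak in weeks 4-6, alarms raised in weeks 5-7
outbreak = [0, 0, 0, 1, 1, 1, 0, 0, 0, 0]
alarm    = [0, 0, 0, 0, 1, 1, 1, 0, 0, 0]
print(detection_metrics(alarm, outbreak))
```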

  14. Simplified Approach to Predicting Rough Surface Transition

    NASA Technical Reports Server (NTRS)

    Boyle, Robert J.; Stripf, Matthias

    2009-01-01

    Turbine vane heat transfer predictions are given for smooth and rough vanes where the experimental data show transition moving forward on the vane as the surface roughness physical height increases. Consistent with smooth vane heat transfer, the transition moves forward for a fixed roughness height as the Reynolds number increases. Comparisons are presented with published experimental data. Some of the data are for a regular roughness geometry with a range of roughness heights, Reynolds numbers, and inlet turbulence intensities. The approach taken in this analysis is to treat the roughness in a statistical sense, consistent with what would be obtained from blades measured after exposure to actual engine environments. An approach is given to determine the equivalent sand grain roughness from the statistics of the regular geometry. This approach is guided by the experimental data. A roughness transition criterion is developed, and comparisons are made with experimental data over the entire range of experimental test conditions. Additional comparisons are made with experimental heat transfer data, where the roughness geometries are both regular as well as statistical. Using the developed analysis, heat transfer calculations are presented for the second stage vane of a high pressure turbine at hypothetical engine conditions.

  15. WASP (Write a Scientific Paper) using Excel - 2: Pivot tables.

    PubMed

    Grech, Victor

    2018-02-01

    Data analysis at the descriptive stage and the eventual presentation of results requires the tabulation and summarisation of data. This exercise should always precede inferential statistics. Pivot tables and pivot charts are one of Excel's most powerful and underutilised features, with tabulation functions that immensely facilitate descriptive statistics. Pivot tables permit users to dynamically summarise and cross-tabulate data, create tables in several dimensions, offer a range of summary statistics and can be modified interactively with instant outputs. Large and detailed datasets are thereby easily manipulated making pivot tables arguably the best way to explore, summarise and present data from many different angles. This second paper in the WASP series in Early Human Development provides pointers for pivot table manipulation in Excel™. Copyright © 2018 Elsevier B.V. All rights reserved.
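
    For readers working outside Excel, the same dynamic cross-tabulation exists in Python's pandas; the dataset below is invented and the call is only an analogue of the spreadsheet workflow the paper describes:

```python
import pandas as pd

# Hypothetical neonatal dataset (all values invented for illustration)
df = pd.DataFrame({
    "sex":         ["M", "F", "M", "F", "M", "F", "M", "F"],
    "year":        [2016, 2016, 2016, 2016, 2017, 2017, 2017, 2017],
    "birthweight": [3.2, 2.9, 3.6, 3.1, 3.4, 3.0, 2.8, 3.3],
})

# Mean and count of birthweight by sex (rows) and year (columns), with totals
table = pd.pivot_table(df, values="birthweight", index="sex", columns="year",
                       aggfunc=["mean", "count"], margins=True)
print(table)
```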

  16. Set statistics in conductive bridge random access memory device with Cu/HfO2/Pt structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Meiyun; Long, Shibing, E-mail: longshibing@ime.ac.cn; Wang, Guoming

    2014-11-10

    The switching parameter variation of resistive switching memory is one of the most important challenges in its application. In this letter, we have studied the set statistics of conductive bridge random access memory with a Cu/HfO2/Pt structure. The experimental distributions of the set parameters in several off resistance ranges are shown to nicely fit a Weibull model. The Weibull slopes of the set voltage and current increase and decrease logarithmically with off resistance, respectively. This experimental behavior is perfectly captured by a Monte Carlo simulator based on the cell-based set voltage statistics model and the Quantum Point Contact electron transport model. Our work provides indications for the improvement of the switching uniformity.

  17. Does RAIM with Correct Exclusion Produce Unbiased Positions?

    PubMed Central

    Teunissen, Peter J. G.; Imparato, Davide; Tiberius, Christian C. J. M.

    2017-01-01

    As the navigation solution of exclusion-based RAIM follows from a combination of least-squares estimation and a statistically based exclusion-process, the computation of the integrity of the navigation solution has to take the propagated uncertainty of the combined estimation-testing procedure into account. In this contribution, we analyse, theoretically as well as empirically, the effect that this combination has on the first statistical moment, i.e., the mean, of the computed navigation solution. It will be shown, although statistical testing is intended to remove biases from the data, that biases will always remain under the alternative hypothesis, even when the correct alternative hypothesis is properly identified. The a posteriori exclusion of a biased satellite range from the position solution will therefore never remove the bias in the position solution completely. PMID:28672862

  18. Football goal distributions and extremal statistics

    NASA Astrophysics Data System (ADS)

    Greenhough, J.; Birch, P. C.; Chapman, S. C.; Rowlands, G.

    2002-12-01

    We analyse the distributions of the number of goals scored by home teams, away teams, and the total scored in the match, in domestic football games from 169 countries between 1999 and 2001. The probability density functions (PDFs) of goals scored are too heavy-tailed to be fitted over their entire ranges by Poisson or negative binomial distributions which would be expected for uncorrelated processes. Log-normal distributions cannot include zero scores and here we find that the PDFs are consistent with those arising from extremal statistics. In addition, we show that it is sufficient to model English top division and FA Cup matches in the seasons of 1970/71-2000/01 on Poisson or negative binomial distributions, as reported in analyses of earlier seasons, and that these are not consistent with extremal statistics.

  19. Do neural nets learn statistical laws behind natural language?

    PubMed

    Takahashi, Shuntaro; Tanaka-Ishii, Kumiko

    2017-01-01

    The performance of deep learning in natural language processing has been spectacular, but the reasons for this success remain unclear because of the inherent complexity of deep learning. This paper provides empirical evidence of its effectiveness and of a limitation of neural networks for language engineering. Precisely, we demonstrate that a neural language model based on long short-term memory (LSTM) effectively reproduces Zipf's law and Heaps' law, two representative statistical properties underlying natural language. We discuss the quality of reproducibility and the emergence of Zipf's law and Heaps' law as training progresses. We also point out that the neural language model has a limitation in reproducing long-range correlation, another statistical property of natural language. This understanding could provide a direction for improving the architectures of neural networks.
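
    The two laws themselves are easy to measure on any token sequence; a minimal sketch (toy corpus, not the paper's LSTM experiments):

```python
from collections import Counter

def zipf_heaps(tokens):
    """Rank-frequency pairs (Zipf) and the vocabulary-growth curve (Heaps)."""
    freqs = sorted(Counter(tokens).values(), reverse=True)
    zipf = list(enumerate(freqs, start=1))  # (rank, frequency)

    seen, heaps = set(), []
    for i, tok in enumerate(tokens, start=1):
        seen.add(tok)
        heaps.append((i, len(seen)))        # (tokens read, vocabulary size)
    return zipf, heaps

tokens = "the cat sat on the mat and the dog sat on the cat".split()
zipf, heaps = zipf_heaps(tokens)
print(zipf[:3])   # Zipf: frequency roughly proportional to 1/rank
print(heaps[-1])  # Heaps: vocabulary grows like (corpus size)**beta, beta < 1
```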

  20. Bayes and the Law

    PubMed Central

    Fenton, Norman; Neil, Martin; Berger, Daniel

    2016-01-01

    Although the last forty years has seen considerable growth in the use of statistics in legal proceedings, it is primarily classical statistical methods rather than Bayesian methods that have been used. Yet the Bayesian approach avoids many of the problems of classical statistics and is also well suited to a broader range of problems. This paper reviews the potential and actual use of Bayes in the law and explains the main reasons for its lack of impact on legal practice. These include misconceptions by the legal community about Bayes’ theorem, over-reliance on the use of the likelihood ratio and the lack of adoption of modern computational methods. We argue that Bayesian Networks (BNs), which automatically produce the necessary Bayesian calculations, provide an opportunity to address most concerns about using Bayes in the law. PMID:27398389

  1. Maximum entropy models as a tool for building precise neural controls.

    PubMed

    Savin, Cristina; Tkačik, Gašper

    2017-10-01

    Neural responses are highly structured, with population activity restricted to a small subset of the astronomical range of possible activity patterns. Characterizing these statistical regularities is important for understanding circuit computation, but challenging in practice. Here we review recent approaches based on the maximum entropy principle used for quantifying collective behavior in neural activity. We highlight recent models that capture population-level statistics of neural data, yielding insights into the organization of the neural code and its biological substrate. Furthermore, the MaxEnt framework provides a general recipe for constructing surrogate ensembles that preserve aspects of the data, but are otherwise maximally unstructured. This idea can be used to generate a hierarchy of controls against which rigorous statistical tests are possible. Copyright © 2017 Elsevier Ltd. All rights reserved.
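
    The simplest instance of such a surrogate control is the independent maximum entropy model: it preserves each neuron's firing rate but is otherwise maximally unstructured, so any correlation present in the data but absent in the surrogate is evidence of collective structure. A sketch on synthetic binary activity (fitting pairwise MaxEnt models is considerably more involved and is omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "recorded" activity: 200 patterns over 10 neurons, with built-in
# pairwise structure (neuron 1 tends to copy neuron 0).
data = (rng.random((200, 10)) < 0.2).astype(int)
data[:, 1] = np.where(rng.random(200) < 0.7, data[:, 0], data[:, 1])

# Independent MaxEnt surrogate: match firing rates, impose no correlations
rates = data.mean(axis=0)
surrogate = (rng.random(data.shape) < rates).astype(int)

def mean_abs_corr(x):
    """Mean absolute off-diagonal pairwise correlation."""
    c = np.corrcoef(x.T)
    return np.nanmean(np.abs(c[~np.eye(c.shape[0], dtype=bool)]))

print("rates data     :", np.round(rates, 2))
print("rates surrogate:", np.round(surrogate.mean(axis=0), 2))
print(f"|corr| data: {mean_abs_corr(data):.3f}  surrogate: {mean_abs_corr(surrogate):.3f}")
```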

  2. Statistical characteristics of the gas-liquid flow in a vertical minichannel

    NASA Astrophysics Data System (ADS)

    Kozulin, I. A.; Kuznetsov, V. V.

    2010-03-01

    The gas-liquid upward flow was studied in a rectangular minichannel of 1.75×3.8 mm and length of 0.7 m. The experiments were carried out within the range of gas superficial velocities from 0.1 to 10 m/s and liquid superficial velocities from 0.07 to 0.7 m/s for the co-current H2O/CO2 flow under saturation conditions. A two-beam laser scanning method was developed for determining the structure and statistical characteristics of the two-phase flow. The slug-bubble, slug, transitional, churn, and annular flow regimes were distinguished. The statistical characteristics of liquid and gas phase motion in a minichannel, including the phase velocities, were obtained for the first time.

  3. Normal probabilities for Vandenberg AFB wind components - monthly reference periods for all flight azimuths, 0- to 70-km altitudes

    NASA Technical Reports Server (NTRS)

    Falls, L. W.

    1975-01-01

    Vandenberg Air Force Base (AFB), California, wind component statistics are presented to be used for aerospace engineering applications that require component wind probabilities for various flight azimuths and selected altitudes. The normal (Gaussian) distribution is presented as a statistical model to represent component winds at Vandenberg AFB. Head-, tail-, and crosswind components are tabulated for all flight azimuths for altitudes from 0 to 70 km by monthly reference periods. Wind components are given for 11 selected percentiles ranging from 0.135 percent to 99.865 percent for each month. The results of statistical goodness-of-fit tests are presented to verify the use of the Gaussian distribution as an adequate model to represent component winds at Vandenberg AFB.

  4. Normal probabilities for Cape Kennedy wind components: Monthly reference periods for all flight azimuths. Altitudes 0 to 70 kilometers

    NASA Technical Reports Server (NTRS)

    Falls, L. W.

    1973-01-01

    This document replaces Cape Kennedy empirical wind component statistics which are presently being used for aerospace engineering applications that require component wind probabilities for various flight azimuths and selected altitudes. The normal (Gaussian) distribution is presented as an adequate statistical model to represent component winds at Cape Kennedy. Head-, tail-, and crosswind components are tabulated for all flight azimuths for altitudes from 0 to 70 km by monthly reference periods. Wind components are given for 11 selected percentiles ranging from 0.135 percent to 99.865 percent for each month. Results of statistical goodness-of-fit tests are presented to verify the use of the Gaussian distribution as an adequate model to represent component winds at Cape Kennedy, Florida.
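
    Under the Gaussian model, each tabulated value is simply the mean plus the standard deviation times the standard normal quantile of the percentile. A sketch of how one row of such a table could be generated (the mean, SD, and the nine interior percentiles are assumed for illustration; only the 0.135 and 99.865 endpoints are stated in the abstract):

```python
from scipy.stats import norm

# Hypothetical monthly headwind component statistics at one altitude (invented)
mu, sigma = 12.0, 8.5  # mean and SD in m/s

# Eleven percentiles spanning 0.135% to 99.865% (interior values assumed)
pcts = [0.135, 2.28, 5.0, 15.9, 30.0, 50.0, 70.0, 84.1, 95.0, 97.72, 99.865]
for p in pcts:
    print(f"{p:7.3f}%  ->  {mu + sigma * norm.ppf(p / 100):7.2f} m/s")
```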

  5. On the statistics of increments in strong Alfvenic turbulence

    NASA Astrophysics Data System (ADS)

    Palacios, J. C.; Perez, J. C.

    2017-12-01

    In-situ measurements have shown that the solar wind is dominated by non-compressive Alfvén-like fluctuations of plasma velocity and magnetic field over a broad range of scales. In this work, we present recent progress in understanding intermittency in Alfvenic turbulence by investigating the statistics of Elsasser increments from simulations of steadily driven Reduced MHD with numerical resolutions up to 2048^3. The nature of these statistics bears a close relation to the fundamental properties of the small-scale structures in which the turbulence is ultimately dissipated, and therefore has profound implications for the possible contribution of turbulence to the heating of the solar wind. We extensively investigate the properties and three-dimensional structure of probability density functions (PDFs) of increments and compare them with recent phenomenological models of intermittency in MHD turbulence.

  6. Bayes and the Law.

    PubMed

    Fenton, Norman; Neil, Martin; Berger, Daniel

    2016-06-01

    Although the last forty years has seen considerable growth in the use of statistics in legal proceedings, it is primarily classical statistical methods rather than Bayesian methods that have been used. Yet the Bayesian approach avoids many of the problems of classical statistics and is also well suited to a broader range of problems. This paper reviews the potential and actual use of Bayes in the law and explains the main reasons for its lack of impact on legal practice. These include misconceptions by the legal community about Bayes' theorem, over-reliance on the use of the likelihood ratio and the lack of adoption of modern computational methods. We argue that Bayesian Networks (BNs), which automatically produce the necessary Bayesian calculations, provide an opportunity to address most concerns about using Bayes in the law.

  7. VizieR Online Data Catalog: Supernova matter EOS (Buyukcizmeci+, 2014)

    NASA Astrophysics Data System (ADS)

    Buyukcizmeci, N.; Botvina, A. S.; Mishustin, I. N.

    2017-03-01

    The Statistical Model for Supernova Matter (SMSM) was developed in Botvina & Mishustin (2004, PhLB, 584, 233; 2010, NuPhA, 843, 98) as a direct generalization of the Statistical Multifragmentation Model (SMM; Bondorf et al. 1995, PhR, 257, 133). We treat supernova matter as a mixture of nuclear species, electrons, and photons in statistical equilibrium. The SMSM EOS tables cover the following ranges of control parameters: 1. Temperature: T = 0.2-25 MeV, for 35 T values. 2. Electron fraction Ye = 0.02-0.56, with a linear mesh of 0.02, giving 28 Ye values; it is equal to the total proton fraction Xp, due to charge neutrality. 3. Baryon number density fraction ρ/ρ0 = 10^-8 to 0.32, giving 31 ρ/ρ0 values. (2 data files).

  8. Do neural nets learn statistical laws behind natural language?

    PubMed Central

    Takahashi, Shuntaro

    2017-01-01

    The performance of deep learning in natural language processing has been spectacular, but the reasons for this success remain unclear because of the inherent complexity of deep learning. This paper provides empirical evidence of its effectiveness and of a limitation of neural networks for language engineering. Precisely, we demonstrate that a neural language model based on long short-term memory (LSTM) effectively reproduces Zipf’s law and Heaps’ law, two representative statistical properties underlying natural language. We discuss the quality of reproducibility and the emergence of Zipf’s law and Heaps’ law as training progresses. We also point out that the neural language model has a limitation in reproducing long-range correlation, another statistical property of natural language. This understanding could provide a direction for improving the architectures of neural networks. PMID:29287076

  9. Estimating selected low-flow frequency statistics and harmonic-mean flows for ungaged, unregulated streams in Indiana

    USGS Publications Warehouse

    Martin, Gary R.; Fowler, Kathleen K.; Arihood, Leslie D.

    2016-09-06

    Information on low-flow characteristics of streams is essential for the management of water resources. This report provides equations for estimating the 1-, 7-, and 30-day mean low flows for a recurrence interval of 10 years and the harmonic-mean flow at ungaged, unregulated stream sites in Indiana. These equations were developed using the low-flow statistics and basin characteristics for 108 continuous-record streamgages in Indiana with at least 10 years of daily mean streamflow data through the 2011 climate year (April 1 through March 31). The equations were developed in cooperation with the Indiana Department of Environmental Management. Regression techniques were used to develop the equations for estimating low-flow frequency statistics and the harmonic-mean flows on the basis of drainage-basin characteristics. A geographic information system was used to measure basin characteristics for selected streamgages. A final set of 25 basin characteristics measured at all the streamgages were evaluated to choose the best predictors of the low-flow statistics. Logistic-regression equations applicable statewide are presented for estimating the probability that selected low-flow frequency statistics equal zero. These equations use the explanatory variables total drainage area, average transmissivity of the full thickness of the unconsolidated deposits within 1,000 feet of the stream network, and latitude of the basin outlet. The percentage of the streamgage low-flow statistics correctly classified as zero or nonzero using the logistic-regression equations ranged from 86.1 to 88.9 percent. Generalized-least-squares regression equations applicable statewide for estimating nonzero low-flow frequency statistics use total drainage area, the average hydraulic conductivity of the top 70 feet of unconsolidated deposits, the slope of the basin, and the index of permeability and thickness of the Quaternary surficial sediments as explanatory variables. The average standard error of prediction of these regression equations ranges from 55.7 to 61.5 percent. Regional weighted-least-squares regression equations were developed for estimating the harmonic-mean flows by dividing the State into three low-flow regions. The Northern region uses total drainage area and the average transmissivity of the entire thickness of unconsolidated deposits as explanatory variables. The Central region uses total drainage area, the average hydraulic conductivity of the entire thickness of unconsolidated deposits, and the index of permeability and thickness of the Quaternary surficial sediments. The Southern region uses total drainage area and the percent of the basin covered by forest. The average standard error of prediction for these equations ranges from 39.3 to 66.7 percent. The regional regression equations are applicable only to stream sites with low flows unaffected by regulation and to stream sites with drainage basin characteristic values within specified limits. Caution is advised when applying the equations for basins with characteristics near the applicable limits, for basins with karst drainage features, and for urbanized basins. Extrapolations near and beyond the applicable basin characteristic limits will have unknown errors that may be large. Equations are presented for use in estimating the 90-percent prediction interval of the low-flow statistics estimated by use of the regression equations at a given stream site. The regression equations are to be incorporated into the U.S. Geological Survey StreamStats Web-based application for Indiana. StreamStats allows users to select a stream site on a map and automatically measure the needed basin characteristics and compute the estimated low-flow statistics and associated prediction intervals.
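
    A schematic sketch of the two-step estimation logic (invented basin characteristics and coefficients; ordinary least squares stands in for the report's generalized- and weighted-least-squares fits):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical characteristics for 108 gauges: log drainage area,
# a transmissivity proxy, and basin slope (all values invented)
X = np.column_stack([rng.normal(2.0, 0.6, 108),
                     rng.normal(1.0, 0.3, 108),
                     rng.normal(0.5, 0.2, 108)])
q7_10 = np.exp(1.5 * X[:, 0] + 0.8 * X[:, 1] - 0.5 + rng.normal(0, 0.4, 108))
is_zero = rng.random(108) < 0.15     # some sites have a zero 7-day, 10-year flow
q7_10[is_zero] = 0.0

# Step 1: statewide logistic regression for P(low-flow statistic = 0)
clf = LogisticRegression().fit(X, is_zero)

# Step 2: regression in log space for the nonzero sites
nz = ~is_zero
reg = LinearRegression().fit(X[nz], np.log(q7_10[nz]))

x_new = np.array([[2.2, 1.1, 0.4]])  # an ungaged site's characteristics
p_zero = clf.predict_proba(x_new)[0, 1]
q_est = np.exp(reg.predict(x_new)[0])
print(f"P(7Q10 = 0) = {p_zero:.2f}; if nonzero, estimated 7Q10 = {q_est:.1f}")
```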

  10. Statistical learning and auditory processing in children with music training: An ERP study.

    PubMed

    Mandikal Vasuki, Pragati Rao; Sharma, Mridula; Ibrahim, Ronny; Arciuli, Joanne

    2017-07-01

    The question of whether musical training is associated with enhanced auditory and cognitive abilities in children is of considerable interest. In the present study, we compared children with music training versus those without music training across a range of auditory and cognitive measures, including the ability to detect implicitly statistical regularities in input (statistical learning). Statistical learning of regularities embedded in auditory and visual stimuli was measured in musically trained and age-matched untrained children between the ages of 9 and 11 years. In addition to collecting behavioural measures, we recorded electrophysiological measures to obtain an online measure of segmentation during the statistical learning tasks. Musically trained children showed better performance on melody discrimination, rhythm discrimination, frequency discrimination, and auditory statistical learning. Furthermore, grand-averaged ERPs showed that triplet onset (initial stimulus) elicited larger responses in the musically trained children during both auditory and visual statistical learning tasks. In addition, children's music skills were associated with performance on auditory and visual behavioural statistical learning tasks. Our data suggest that individual differences in musical skills are associated with children's ability to detect regularities. The ERP data suggest that musical training is associated with better encoding of both auditory and visual stimuli. Although causality must be explored in further research, these results may have implications for developing music-based remediation strategies for children with learning impairments. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.

  11. Stationary statistical theory of two-surface multipactor regarding all impacts for efficient threshold analysis

    NASA Astrophysics Data System (ADS)

    Lin, Shu; Wang, Rui; Xia, Ning; Li, Yongdong; Liu, Chunliang

    2018-01-01

    Statistical multipactor theories are critical prediction approaches for multipactor breakdown determination. However, these approaches still require a trade-off between calculation efficiency and accuracy. This paper presents an improved stationary statistical theory for efficient threshold analysis of two-surface multipactor. A general integral equation over the distribution function of the electron emission phase, with both single-sided and double-sided impacts considered, is formulated. The modeling results indicate that the improved stationary statistical theory matches the accuracy of multipactor threshold calculation of the nonstationary statistical theory while achieving high calculation efficiency. By using this improved stationary statistical theory, the total time consumption in calculating full multipactor susceptibility zones of parallel plates can be decreased by as much as a factor of four relative to the nonstationary statistical theory. It also shows that the effect of single-sided impacts is indispensable for accurate multipactor prediction of coaxial lines and is even more significant for high order multipactor. Finally, the influence of secondary emission yield (SEY) properties on the multipactor threshold is further investigated. It is observed that the first cross energy and the energy range between the first cross and the SEY maximum both play a significant role in determining the multipactor threshold, which agrees with numerical simulation results in the literature.

  12. Statistics for X-chromosome associations.

    PubMed

    Özbek, Umut; Lin, Hui-Min; Lin, Yan; Weeks, Daniel E; Chen, Wei; Shaffer, John R; Purcell, Shaun M; Feingold, Eleanor

    2018-06-13

    In a genome-wide association study (GWAS), association between genotype and phenotype at autosomal loci is generally tested by regression models. However, X-chromosome data are often excluded from published analyses of autosomes because of the difference between males and females in number of X chromosomes. Failure to analyze X-chromosome data at all is obviously less than ideal, and can lead to missed discoveries. Even when X-chromosome data are included, they are often analyzed with suboptimal statistics. Several mathematically sensible statistics for X-chromosome association have been proposed. The optimality of these statistics, however, is based on very specific simple genetic models. In addition, while previous simulation studies of these statistics have been informative, they have focused on single-marker tests and have not considered the types of error that occur even under the null hypothesis when the entire X chromosome is scanned. In this study, we comprehensively tested several X-chromosome association statistics using simulation studies that include the entire chromosome. We also considered a wide range of trait models for sex differences and phenotypic effects of X inactivation. We found that models that do not incorporate a sex effect can have large type I error in some cases. We also found that many of the best statistics perform well even when there are modest deviations, such as trait variance differences between the sexes or small sex differences in allele frequencies, from assumptions. © 2018 WILEY PERIODICALS, INC.

  13. Prognostic value of coronary computed tomographic angiography findings in asymptomatic individuals: a 6-year follow-up from the prospective multicentre international CONFIRM study.

    PubMed

    Cho, Iksung; Al'Aref, Subhi J; Berger, Adam; Ó Hartaigh, Bríain; Gransar, Heidi; Valenti, Valentina; Lin, Fay Y; Achenbach, Stephan; Berman, Daniel S; Budoff, Matthew J; Callister, Tracy Q; Al-Mallah, Mouaz H; Cademartiri, Filippo; Chinnaiyan, Kavitha; Chow, Benjamin J W; DeLago, Augustin; Villines, Todd C; Hadamitzky, Martin; Hausleiter, Joerg; Leipsic, Jonathon; Shaw, Leslee J; Kaufmann, Philipp A; Feuchtner, Gudrun; Kim, Yong-Jin; Maffei, Erica; Raff, Gilbert; Pontone, Gianluca; Andreini, Daniele; Marques, Hugo; Rubinshtein, Ronen; Chang, Hyuk-Jae; Min, James K

    2018-03-14

    The long-term prognostic benefit of coronary computed tomographic angiography (CCTA) findings of coronary artery disease (CAD) in asymptomatic populations is unknown. From the prospective multicentre international CONFIRM long-term study, we evaluated asymptomatic subjects without known CAD who underwent both coronary artery calcium scoring (CACS) and CCTA (n = 1226). Coronary computed tomographic angiography findings included the severity of coronary artery stenosis, plaque composition, and coronary segment location. Using the C-statistic and likelihood ratio tests, we evaluated the incremental prognostic utility of CCTA findings over a base model that included a panel of traditional risk factors (RFs) as well as CACS to predict long-term all-cause mortality. During a mean follow-up of 5.9 ± 1.2 years, 78 deaths occurred. Compared with the traditional RF alone (C-statistic 0.64), CCTA findings including coronary stenosis severity, plaque composition, and coronary segment location demonstrated improved incremental prognostic utility beyond traditional RF alone (C-statistics range 0.71-0.73, all P < 0.05; incremental χ2 range 20.7-25.5, all P < 0.001). However, no added prognostic benefit was offered by CCTA findings when added to a base model containing both traditional RF and CACS (P > 0.05 for all C-statistics). Coronary computed tomographic angiography improved prognostication of 6-year all-cause mortality beyond a set of conventional RF alone, although no further incremental value was offered when CCTA findings were added to a model incorporating RF and CACS.

  14. Analysis of Machine Learning Techniques for Heart Failure Readmissions.

    PubMed

    Mortazavi, Bobak J; Downing, Nicholas S; Bucholz, Emily M; Dharmarajan, Kumar; Manhapra, Ajay; Li, Shu-Xia; Negahban, Sahand N; Krumholz, Harlan M

    2016-11-01

    The current ability to predict readmissions in patients with heart failure is modest at best. It is unclear whether machine learning techniques that address higher dimensional, nonlinear relationships among variables would enhance prediction. We sought to compare the effectiveness of several machine learning algorithms for predicting readmissions. Using data from the Telemonitoring to Improve Heart Failure Outcomes trial, we compared the effectiveness of random forests, boosting, random forests combined hierarchically with support vector machines or logistic regression (LR), and Poisson regression against traditional LR to predict 30- and 180-day all-cause readmissions and readmissions because of heart failure. We randomly selected 50% of patients for a derivation set, and a validation set comprised the remaining patients, validated using 100 bootstrapped iterations. We compared C statistics for discrimination and distributions of observed outcomes in risk deciles for predictive range. In 30-day all-cause readmission prediction, the best performing machine learning model, random forests, provided a 17.8% improvement over LR (mean C statistics, 0.628 and 0.533, respectively). For readmissions because of heart failure, boosting improved the C statistic by 24.9% over LR (mean C statistic 0.678 and 0.543, respectively). For 30-day all-cause readmission, the observed readmission rates in the lowest and highest deciles of predicted risk with random forests (7.8% and 26.2%, respectively) showed a much wider separation than LR (14.2% and 16.4%, respectively). Machine learning methods improved the prediction of readmission after hospitalization for heart failure compared with LR and provided the greatest predictive range in observed readmission rates. © 2016 American Heart Association, Inc.
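
    A minimal sketch of the core comparison, the C statistic of a random forest versus logistic regression on held-out data (synthetic stand-in data; for a binary outcome the C statistic equals the area under the ROC curve):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic imbalanced outcome as a stand-in for 30-day readmission
X, y = make_classification(n_samples=2000, n_features=30, n_informative=8,
                           weights=[0.8], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(n_estimators=300, random_state=0),
}
for name, m in models.items():
    p = m.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(f"{name}: C statistic = {roc_auc_score(y_te, p):.3f}")
```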

  15. Evaluation of orthognathic surgery on articular disc position and temporomandibular joint symptoms in skeletal class II patients: A Magnetic Resonance Imaging study.

    PubMed

    Firoozei, Gholamreza; Shahnaseri, Shirin; Momeni, Hasan; Soltani, Parisa

    2017-08-01

    The purpose of orthognathic surgery is to correct facial deformity and dental malocclusion and to obtain normal orofacial function. However, there are controversies over whether orthognathic surgery might have any negative influence on the temporomandibular (TM) joint. The purpose of this study was to evaluate the influence of orthognathic surgery on articular disc position and temporomandibular joint symptoms of skeletal Cl II patients by means of magnetic resonance imaging. For this purpose, fifteen patients with skeletal Cl II malocclusion, aged 19-32 years (mean 23 years), 10 women and 5 men, from the Isfahan Department of Oral and Maxillofacial Surgery were studied. All received LeFort I and bilateral sagittal split osteotomy (BSSO) osteotomies, and all patients received pre- and post-surgical orthodontic treatment. Magnetic resonance imaging was performed 1 day preoperatively and 3 months postoperatively. Descriptive statistics and the Wilcoxon and McNemar tests were used for statistical analysis. P < 0.05 was considered significant. Disc position ranged between 4.25 and 8.09 prior to surgery (mean=5.74±1.21). After surgery the disc position range was 4.36 to 7.40 (mean=5.65±1.06). Statistical analysis showed that although the TM disc tended to move anteriorly after BSSO surgery, this difference was not statistically significant (p value > 0.05). The findings of the present study revealed that orthognathic surgery does not alter the disc and condyle relationship. Therefore, it has minimal effects on an intact and functional TM joint. Key words: Orthognathic surgery, skeletal class 2, magnetic resonance imaging, temporomandibular disc.

  16. [Acetabular anteversion angle of the hip in the Mexican adult population measured with computed tomography].

    PubMed

    Rubalcava, J; Gómez-García, F; Ríos-Reina, J L

    2012-01-01

    Knowledge of the radiogrametric characteristics of a specific skeletal segment in a healthy population is of the utmost clinical importance. The main justification for this study is that there is no published description of the radiogrametric parameter of acetabular anteversion in a healthy Mexican adult population. A prospective, descriptive and cross-sectional study was conducted. Individuals of both genders older than 18 years and orthopedically healthy were included. They underwent a two-dimensional axial tomographic study of both hips to measure the acetabular anteversion angles. The statistical analysis consisted of obtaining central trend and scatter measurements. A multivariate analysis of variance (ANOVA) was performed and statistical significance was assessed. 118 individuals were studied, 60 males and 58 females, with a mean age of 47.7 +/- 16.7 years and a range of 18-85 years. The anteversion of the entire group was 18.6 degrees +/- 4.1 degrees. Anteversion in males was 17.3 degrees +/- 3.5 degrees (10 degrees - 25 degrees) and in females 19.8 degrees +/- 4.7 degrees (10 degrees - 31 degrees). There were no statistically significant differences (p > or = 0.05) between right and left anteversion in the entire group. However, there were statistically significant differences (p < or = 0.005) on both the right and left sides when males and females were compared. Our study showed that there are great variations in the anteversion ranges of a healthy population. When our results are compared with those published by other authors, the mean of most measurements exceeds 15 degrees. This should be useful to make therapeutic decisions that involve acetabular anteversion.

  17. Fermi-Pasta-Ulam-Tsingou problems: Passage from Boltzmann to q-statistics

    NASA Astrophysics Data System (ADS)

    Bagchi, Debarshee; Tsallis, Constantino

    2018-02-01

    The Fermi-Pasta-Ulam (FPU) one-dimensional Hamiltonian includes a quartic term which guarantees ergodicity of the system in the thermodynamic limit. Consistently, the Boltzmann factor P(ε) ∼ e^(-βε) describes its equilibrium distribution of one-body energies, and its velocity distribution is Maxwellian, i.e., P(v) ∼ e^(-βv²/2). We consider here a generalized system where the quartic coupling constant between sites decays as 1/d_ij^α (α ≥ 0; d_ij = 1, 2, …). Through first-principle molecular dynamics we demonstrate that, for large α (above α ≃ 1), i.e., short-range interactions, Boltzmann statistics (based on the additive entropic functional S_B[P(z)] = -k ∫ dz P(z) ln P(z)) is verified. However, for small values of α (below α ≃ 1), i.e., long-range interactions, Boltzmann statistics dramatically fails and is replaced by q-statistics (based on the nonadditive entropic functional S_q[P(z)] = k(1 - ∫ dz [P(z)]^q)/(q - 1), with S_1 = S_B). Indeed, the one-body energy distribution is q-exponential, P(ε) ∼ e_qε^(-β_ε ε) ≡ [1 + (q_ε - 1)β_ε ε]^(-1/(q_ε - 1)) with q_ε > 1, and its velocity distribution is given by P(v) ∼ e_qv^(-β_v v²/2) with q_v > 1. Moreover, within small error bars, we verify q_ε = q_v = q, which decreases from an extrapolated value q ≃ 5/3 to q = 1 when α increases from zero to α ≃ 1, and remains q = 1 thereafter.
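
    A small numeric sketch of the q-exponential used above, showing that it reduces to the Boltzmann exponential as q -> 1 and develops a heavier, power-law tail for q > 1 (the β and ε values are chosen only for illustration):

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential e_q(x) = [1 + (1-q) x]^(1/(1-q)); exp(x) as q -> 1.

    For the q >= 1, x <= 0 cases exercised here the bracket stays positive.
    """
    if np.isclose(q, 1.0):
        return np.exp(x)
    return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))

beta, eps = 1.0, np.linspace(0, 5, 6)
for q in (1.0, 1.2, 5 / 3):
    print(f"q={q:.2f}:", np.round(q_exp(-beta * eps, q), 4))
```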

  18. Validation of non-stationary precipitation series for site-specific impact assessment: comparison of two statistical downscaling techniques

    NASA Astrophysics Data System (ADS)

    Mullan, Donal; Chen, Jie; Zhang, Xunchang John

    2016-02-01

    Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This infers that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.

  19. The joint space-time statistics of macroweather precipitation, space-time statistical factorization and macroweather models.

    PubMed

    Lovejoy, S; de Lima, M I P

    2015-07-01

    Over the range of time scales from about 10 days to 30-100 years, in addition to the familiar weather and climate regimes, there is an intermediate "macroweather" regime characterized by negative temporal fluctuation exponents: implying that fluctuations tend to cancel each other out so that averages tend to converge. We show theoretically and numerically that macroweather precipitation can be modeled by a stochastic weather-climate model (the Climate Extended Fractionally Integrated Flux model, CEFIF), first proposed for macroweather temperatures, and we show numerically that a four parameter space-time CEFIF model can approximately reproduce eight or so empirical space-time exponents. In spite of this success, CEFIF is theoretically and numerically difficult to manage. We therefore propose a simplified stochastic model in which the temporal behavior is modeled as a fractional Gaussian noise but the spatial behavior as a multifractal (climate) cascade: a spatial extension of the recently introduced ScaLIng Macroweather Model, SLIMM. Both the CEFIF and this spatial SLIMM model have a property often implicitly assumed by climatologists that climate statistics can be "homogenized" by normalizing them with the standard deviation of the anomalies. Physically, it means that the spatial macroweather variability corresponds to different climate zones that multiplicatively modulate the local, temporal statistics. This simplified macroweather model provides a framework for macroweather forecasting that exploits the system's long range memory and spatial correlations; for it, the forecasting problem has been solved. We test this factorization property and the model with the help of three centennial, global scale precipitation products that we analyze jointly in space and in time.
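
    The temporal ingredient of the simplified model can be sketched by spectral synthesis: white noise shaped in Fourier space to a power-law spectrum gives a long-memory Gaussian series (the β value is assumed; the spatial multifractal cascade factor is omitted):

```python
import numpy as np

def power_law_noise(n, beta, seed=0):
    """Gaussian series with power spectrum ~ f**(-beta) via spectral synthesis.

    beta = 0 is white noise; 0 < beta < 1 gives the negative fluctuation
    exponents (cancelling, converging averages) characteristic of macroweather.
    """
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-beta / 2.0)  # amplitude = sqrt(power)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(freqs))
    x = np.fft.irfft(amp * np.exp(1j * phases), n)
    return (x - x.mean()) / x.std()

series = power_law_noise(4096, beta=0.6)
print(series[:5])
```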

  20. Enhancing the mathematical properties of new haplotype homozygosity statistics for the detection of selective sweeps.

    PubMed

    Garud, Nandita R; Rosenberg, Noah A

    2015-06-01

    Soft selective sweeps represent an important form of adaptation in which multiple haplotypes bearing adaptive alleles rise to high frequency. Most statistical methods for detecting selective sweeps from genetic polymorphism data, however, have focused on identifying hard selective sweeps in which a favored allele appears on a single haplotypic background; these methods might be underpowered to detect soft sweeps. Among exceptions is the set of haplotype homozygosity statistics introduced for the detection of soft sweeps by Garud et al. (2015). These statistics, examining frequencies of multiple haplotypes in relation to each other, include H12, a statistic designed to identify both hard and soft selective sweeps, and H2/H1, a statistic that conditional on high H12 values seeks to distinguish between hard and soft sweeps. A challenge in the use of H2/H1 is that its range depends on the associated value of H12, so that equal H2/H1 values might provide different levels of support for a soft sweep model at different values of H12. Here, we enhance the H12 and H2/H1 haplotype homozygosity statistics for selective sweep detection by deriving the upper bound on H2/H1 as a function of H12, thereby generating a statistic that normalizes H2/H1 to lie between 0 and 1. Through a reanalysis of resequencing data from inbred lines of Drosophila, we show that the enhanced statistic both strengthens interpretations obtained with the unnormalized statistic and leads to empirical insights that are less readily apparent without the normalization. Copyright © 2015 Elsevier Inc. All rights reserved.
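
    The statistics themselves are short formulas over ranked haplotype frequencies, following the definitions summarized above (the haplotype counts are invented to contrast hard- and soft-sweep-like windows):

```python
import numpy as np

def haplotype_homozygosity(counts):
    """H1, H2, H12 and H2/H1 from haplotype counts in a genomic window."""
    p = np.sort(np.asarray(counts, dtype=float))[::-1]
    p = p / p.sum()                                # frequencies, descending
    h1 = np.sum(p ** 2)                            # haplotype homozygosity
    h2 = h1 - p[0] ** 2                            # H1 without the top haplotype
    h12 = (p[0] + p[1]) ** 2 + np.sum(p[2:] ** 2)  # top two haplotypes pooled
    return h1, h2, h12, h2 / h1

hard = [90, 3, 3, 2, 2]   # one dominant haplotype: hard-sweep-like
soft = [47, 45, 4, 2, 2]  # two dominant haplotypes: soft-sweep-like
for name, c in (("hard", hard), ("soft", soft)):
    h1, h2, h12, ratio = haplotype_homozygosity(c)
    print(f"{name}: H12 = {h12:.3f}  H2/H1 = {ratio:.3f}")
```

    Both toy windows score high H12, but only the soft-sweep-like window retains a large H2/H1, which is exactly the discrimination the two statistics are designed to provide.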

  1. Variability of Diurnal Temperature Range During Winter Over Western Himalaya: Range- and Altitude-Wise Study

    NASA Astrophysics Data System (ADS)

    Shekhar, M. S.; Devi, Usha; Dash, S. K.; Singh, G. P.; Singh, Amreek

    2018-04-01

    The current trends in diurnal temperature range, maximum temperature, minimum temperature, mean temperature, and sunshine hours over different ranges and altitudes of Western Himalaya during winter have been studied. Analysis of 25 years of data shows an increasing trend in diurnal temperature range over all the ranges and altitudes of Western Himalaya during winter, thereby confirming warming of the region due to present climate change and global warming. Statistical studies show a significant increasing trend in maximum temperature over all the ranges and altitudes of Western Himalaya. Minimum temperature shows a significant decreasing trend over the Pir Panjal and Shamshawari ranges and a significant increasing trend over the higher altitudes of Western Himalaya. Similarly, sunshine hours show a significant decreasing trend over the Karakoram range. There exists a strong positive correlation between diurnal temperature range and maximum temperature for all the ranges and altitudes of Western Himalaya. A strong negative correlation exists between diurnal temperature range and minimum temperature over the Shamshawari and Great Himalaya ranges and the lower altitudes of Western Himalaya. Sunshine hours show a strong positive correlation with diurnal temperature range over the Pir Panjal and Great Himalaya ranges and the lower and higher altitudes.

  2. The clinical pattern of nephrotic syndrome in children has no effect on the concentration of soluble urokinase receptor (suPAR) in serum and urine.

    PubMed

    Ochocińska, Agnieszka; Jarmużek, Wioletta; Janas, Roman

    2018-04-23

    Concentration of soluble urokinase receptor (suPAR) has been regarded as a viable marker to differentiate focal segmental glomerulosclerosis (FSGS) from other glomerulopathies and also as a predictive parameter for progression of renal disease. The aim of this study was to evaluate serum (s-suPAR) and urine (u-suPAR) concentrations in steroid-sensitive and steroid-resistant nephrotic children treated with different (double- and triple-drug) regimens. Overall 43 children were evaluated, including 14 patients with steroid-resistant nephrotic syndrome (SRNS) aged 9±6 years and 29 with steroid-sensitive nephrotic syndrome (SSNS) aged 9±5 years, as well as a control group (n=59). The concentration of suPAR was measured with an ELISA kit (R&D Systems Inc.). There was no difference in serum suPAR level between SRNS (6404, range: 4613-9575 pg/mL) and SSNS (5745, range: 4666-8246 pg/mL) patients, nor in urinary suPAR: SRNS (2877, range: 847-19121 pg/mL) and SSNS (2854, range: 328-7434 pg/mL), respectively. There was no statistically significant difference in serum biomarker concentrations between patients with a severe course of the disease, on combination therapy with three drugs: CsA + MMF + Pred (5968, range: 4613-9575 pg/mL), and patients receiving double therapy: CsA + Pred or MMF + Pred (5449, range: 4666-6623 pg/mL and 5905, range: 5102-6730 pg/mL, respectively). SuPAR concentration in the urine of patients treated with Pred + MMF was lower (1493, range: 328-4444 pg/mL) than in patients receiving Pred + CsA (3193, range: 629-7434 pg/mL), as well as lower than in patients on the triple combination of drugs (3318, range: 448-5570 pg/mL); however, the differences were not statistically significant. Serum and urine concentrations of suPAR did not differ between different clinical patterns of nephrotic syndrome in children, regardless of the immunosuppressive treatment used. © 2018 MEDPRESS.

  3. Agriculture, population growth, and statistical analysis of the radiocarbon record.

    PubMed

    Zahid, H Jabran; Robinson, Erick; Kelly, Robert L

    2016-01-26

    The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide.
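
    The growth rates above come from statistical analysis of radiocarbon date frequencies. A minimal sketch of the general idea, not the authors' pipeline: sample dates from an exponentially growing population, bin them, and recover the long-term annual growth rate by log-linear regression.

        import numpy as np
        from scipy.stats import linregress

        rng = np.random.default_rng(2)
        r_true = 4e-4          # 0.04% per year, the rate reported above
        T = 12000.0            # length of the record, years

        # Inverse-CDF sampling of event times from N(t) ~ exp(r*t) on [0, T].
        u = rng.uniform(size=50000)
        t = np.log1p(u * np.expm1(r_true * T)) / r_true

        # Bin into 100-year intervals; the slope of log(count) vs time is ~r.
        counts, edges = np.histogram(t, bins=np.arange(0.0, T + 100.0, 100.0))
        mid = 0.5 * (edges[:-1] + edges[1:])
        ok = counts > 0
        fit = linregress(mid[ok], np.log(counts[ok]))
        print(f"estimated annual growth rate: {fit.slope:.2e} (true {r_true:.2e})")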

  4. Behind the statistics: the ethnography of suicide in Palestine.

    PubMed

    Dabbagh, Nadia

    2012-06-01

    As part of the first anthropological study of suicide in the modern Arab world, statistics gathered from the Ramallah region of the West Bank in Palestine painted a picture remarkably similar to that found in Western countries such as the UK and France. More men than women completed suicide; more women than men attempted suicide. Men used more violent methods, such as hanging, while women used softer methods, such as medication overdose. Completed suicide was more frequent in the older age range, attempted suicide in the younger. However, ethnographic fieldwork and detailed examination of the case studies and suicide narratives, gathered and analysed within their cultural, political, and economic contexts, illustrated more starkly the differences in suicidal practices between Palestinian West Bank society of the 1990s and other regions of the world. The central argument of the paper is that although statistics tell a very important story, ethnography uncovers a multitude of stories 'behind the statistics', and thus helps us to make sense of both cultural context and subjective experience.

  5. Performance evaluation of spectral vegetation indices using a statistical sensitivity function

    USGS Publications Warehouse

    Ji, Lei; Peters, Albert J.

    2007-01-01

    A great number of spectral vegetation indices (VIs) have been developed to estimate biophysical parameters of vegetation. Traditional techniques for evaluating the performance of VIs are regression-based statistics, such as the coefficient of determination and root mean square error. These statistics, however, are not capable of quantifying the detailed relationship between VIs and biophysical parameters because the sensitivity of a VI is usually a function of the biophysical parameter instead of a constant. To better quantify this relationship, we developed a “sensitivity function” for measuring the sensitivity of a VI to biophysical parameters. The sensitivity function is defined as the first derivative of the regression function, divided by the standard error of the dependent variable prediction. The function elucidates the change in sensitivity over the range of the biophysical parameter. The Student's t- or z-statistic can be used to test the significance of VI sensitivity. Additionally, we developed a “relative sensitivity function” that compares the sensitivities of two VIs when the biophysical parameters are unavailable.
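
    A minimal sketch of the sensitivity function defined above: the first derivative of a fitted regression function divided by the standard error of the VI prediction. Here a cubic polynomial fit and a constant residual standard error stand in for the paper's full procedure, and the LAI/VI data are synthetic.

        import numpy as np

        # Synthetic biophysical parameter (LAI) vs. a saturating NDVI-like index.
        rng = np.random.default_rng(3)
        lai = np.sort(rng.uniform(0.0, 6.0, 200))
        vi = 0.9 * (1.0 - np.exp(-0.7 * lai)) + rng.normal(0.0, 0.03, lai.size)

        # Fit VI = f(LAI) with a cubic polynomial and differentiate the fit.
        f = np.poly1d(np.polyfit(lai, vi, 3))
        df = f.deriv()

        # Approximate the prediction standard error by the residual standard error.
        se = np.sqrt(np.sum((vi - f(lai)) ** 2) / (lai.size - 4))

        # Sensitivity falls off as the index saturates at high LAI.
        for x in np.linspace(0.5, 5.5, 6):
            print(f"LAI = {x:3.1f}  sensitivity = {df(x) / se:6.2f}")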

  6. The Web as an educational tool for/in learning/teaching bioinformatics statistics.

    PubMed

    Oliver, J; Pisano, M E; Alonso, T; Roca, P

    2005-12-01

    Statistics provides essential tools in bioinformatics for interpreting the results of a database search and for managing the enormous amounts of information generated by genomics, proteomics, and metabolomics. The goal of this project was the development of a software tool that would be as simple as possible to demonstrate the use of statistics in bioinformatics. Computer Simulation Methods (CSMs) developed using Microsoft Excel were chosen for their broad range of applications, immediate and easy formula calculation, immediate testing, easy graphical representation, and general use and acceptance by the scientific community. The result of these endeavours is a set of utilities which can be accessed from the following URL: http://gmein.uib.es/bioinformatica/statistics. When tested on students with previous coursework under traditional statistical teaching methods, the general consensus was that Web-based instruction had numerous advantages, but that traditional methods with manual calculations were still needed for theory and practice. Once the basic statistical formulas had been mastered, Excel spreadsheets and graphics proved very useful for trying many parameters rapidly without tedious calculations. CSMs will be of great importance for the training of students and professionals in the field of bioinformatics, and for upcoming applications in self-directed learning and continuing education.

  7. Study of pseudo noise CW diode laser for ranging applications

    NASA Technical Reports Server (NTRS)

    Lee, Hyo S.; Ramaswami, Ravi

    1992-01-01

    A new Pseudo Random Noise (PN) modulated CW diode laser radar system is being developed for real-time ranging of targets at both close and large distances (greater than 10 km), to satisfy a wide range of applications from robotics to future space missions. Results from computer modeling and statistical analysis, along with some preliminary data obtained from a prototype system, are presented. The received signal is averaged for a short time to recover the target response function. It is found that even with uncooperative targets, based on the design parameters used (200-mW laser and 20-cm receiver), accurate ranging is possible up to about 15 km, beyond which the signal-to-noise ratio (SNR) becomes too small for real-time analog detection.
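
    A minimal sketch of the core PN-ranging idea: cross-correlate the received signal with the transmitted pseudonoise code and convert the correlation-peak lag into range. The chip rate, code length, and noise level are illustrative assumptions, not the paper's design parameters.

        import numpy as np

        rng = np.random.default_rng(4)
        c = 3e8              # speed of light, m/s
        chip_rate = 1e6      # assumed chip rate, chips/s
        n_chips = 4095

        # Transmit a random +/-1 pseudonoise code (an m-sequence in practice).
        code = rng.choice([-1.0, 1.0], n_chips)

        # Simulate a 100-chip round-trip delay buried in receiver noise.
        delay = 100
        received = np.roll(code, delay) + rng.normal(0.0, 2.0, n_chips)

        # Circular cross-correlation via FFT; the peak lag estimates the delay.
        corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code))).real
        lag = int(np.argmax(corr))
        range_m = c * (lag / chip_rate) / 2.0  # halve for two-way propagation
        print(f"estimated delay = {lag} chips, range = {range_m / 1e3:.1f} km")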

  8. Geospatial methods and data analysis for assessing distribution of grazing livestock

    USDA-ARS's Scientific Manuscript database

    Free-ranging livestock research must begin with a well-conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...

  9. Long-range correlation properties of coding and noncoding DNA sequences: GenBank analysis.

    PubMed

    Buldyrev, S V; Goldberger, A L; Havlin, S; Mantegna, R N; Matsa, M E; Peng, C K; Simons, M; Stanley, H E

    1995-05-01

    An open question in computational molecular biology is whether long-range correlations are present in both coding and noncoding DNA or only in the latter. To answer this question, we consider all 33301 coding and all 29453 noncoding eukaryotic sequences--each of length larger than 512 base pairs (bp)--in the present release of GenBank to determine whether there is any statistically significant distinction in their long-range correlation properties. Standard fast Fourier transform (FFT) analysis indicates that coding sequences have practically no correlations in the range from 10 bp to 100 bp (spectral exponent beta=0.00 +/- 0.04, where the uncertainty is two standard deviations). In contrast, for noncoding sequences, the average value of the spectral exponent beta is positive (0.16 +/- 0.05), which unambiguously shows the presence of long-range correlations. We also separately analyze the 874 coding and the 1157 noncoding sequences that have more than 4096 bp and find a larger region of power-law behavior. We calculate the probability that these two data sets (coding and noncoding) were drawn from the same distribution and find that it is less than 10^(-10). We obtain independent confirmation of these findings using the method of detrended fluctuation analysis (DFA), which is designed to treat sequences with statistical heterogeneity, such as DNA's known mosaic structure ("patchiness") arising from the nonstationarity of nucleotide concentration. The near-perfect agreement between the two independent analysis methods, FFT and DFA, increases the confidence in the reliability of our conclusion.
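
    A minimal sketch of detrended fluctuation analysis (DFA), the second method named above, applied to a +/-1 random-walk encoding of a sequence (e.g., purine = +1, pyrimidine = -1). The sequence here is synthetic and uncorrelated, so the expected exponent is about 0.5; long-range correlations give values above 0.5.

        import numpy as np

        def dfa(x, scales):
            """DFA-1: rms fluctuation F(n) of the integrated, locally detrended profile."""
            y = np.cumsum(x - np.mean(x))
            F = []
            for n in scales:
                n_win = len(y) // n
                segs = y[:n_win * n].reshape(n_win, n)
                t = np.arange(n)
                # Remove a linear trend from each window, then take the rms residual.
                ms = [np.mean((s - np.polyval(np.polyfit(t, s, 1), t)) ** 2) for s in segs]
                F.append(np.sqrt(np.mean(ms)))
            return np.array(F)

        rng = np.random.default_rng(5)
        seq = rng.choice([-1.0, 1.0], 8192)   # synthetic purine/pyrimidine walk
        scales = np.array([8, 16, 32, 64, 128, 256])
        alpha = np.polyfit(np.log(scales), np.log(dfa(seq, scales)), 1)[0]
        print(f"DFA exponent alpha = {alpha:.2f}")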

  10. The effect of asymptomatic histological prostatitis on sexual function and lower urinary tract symptoms.

    PubMed

    Urkmez, Ahmet; Yuksel, Ozgur Haki; Uruc, Fatih; Akan, Serkan; Yildirim, Caglar; Sahin, Aytac; Verit, Ayhan

    2016-05-01

    Prostatitis affects 10-14% of men of all ages and ethnicities, and more than 50% of men experience an episode of prostatitis at some time in their lives. Patients with chronic prostatitis (CP) typically have long-lasting genitourinary/pelvic pain and obstructive and/or irritative voiding symptoms, frequently accompanied by sexual dysfunction and psychological symptoms. We investigated the relationship between sexual function, lower urinary tract symptoms, and asymptomatic histological prostatitis detected on transrectal ultrasound-guided (TRUS) biopsy performed for elevated PSA levels. Sixty cases meeting the study criteria, drawn from patients who underwent prostate biopsies between September 2014 and June 2015 for elevated PSA levels, were included in the study. All patients completed the IIEF-5 and IPSS forms one day before the procedure. Based on histological analysis of the biopsy materials, patients were allocated to a BPH group (simple BPH without histological prostatitis; n=30) or a histological chronic prostatitis (HCP) group (combination of BPH and histological prostatitis; n=30). Mean age of the cases was 65.73±5.01 years (range, 56-75). PSA levels ranged between 4 and 15 ng/mL. No statistically significant intergroup differences were found in mean age, BMI, PSA level, or incidence of hypertension and coronary artery disease (p>0.05). Prostate volumes in the HCP group were significantly higher than in the BPH group (p:0.001; p<0.01). On statistical evaluation of the questionnaire forms, mean IPSS score in the HCP group was significantly higher than in the BPH group (p:0.016; p<0.05), whereas mean IIEF score was significantly higher in the BPH group than in the HCP group (p:0.039; p<0.05). These findings suggest a correlation between chronic inflammation and lower urinary tract symptoms (LUTS). In addition, the significantly lower IIEF values in patients with histological chronic prostatitis relative to those without suggest that even asymptomatic inflammation negatively affects sexual function and the mechanism of erection.

  11. Assessment of Current Jet Noise Prediction Capabilities

    NASA Technical Reports Server (NTRS)

    Hunter, Craig A.; Bridges, James E.; Khavaran, Abbas

    2008-01-01

    An assessment was made of the capability of jet noise prediction codes over a broad range of jet flows, with the objective of quantifying current capabilities and identifying areas requiring future research investment. Three separate codes in NASA's possession, representative of two classes of jet noise prediction codes, were evaluated: one empirical and two statistical. The empirical code is the Stone Jet Noise Module (ST2JET) contained within the ANOPP aircraft noise prediction code. It is well documented and represents the state of the art in semi-empirical acoustic prediction codes, in which virtual sources are attributed to various aspects of noise generation in each jet. These sources, in combination, predict the spectral directivity of a jet plume. A total of 258 jet noise cases were examined with the ST2JET code, each run requiring only fractions of a second to complete. Two statistical jet noise prediction codes were also evaluated: JeNo v1 and Jet3D. Fewer cases were run for the statistical prediction methods because they require substantially more resources, typically a Reynolds-Averaged Navier-Stokes solution of the jet, volume integration of the source statistical models over the entire plume, and a numerical solution of the governing propagation equation within the jet. In the evaluation process, the experimental datasets used in the evaluations were carefully justified. In the end, none of the current codes can predict jet noise within experimental uncertainty. The empirical code came within 2 dB on a 1/3-octave spectral basis for a wide range of flows. The statistical code Jet3D was within experimental uncertainty at broadside angles for hot supersonic jets, but errors in peak frequency and amplitude put it outside experimental uncertainty at cooler, lower-speed conditions. Jet3D did not predict changes in directivity at the downstream angles. The statistical code JeNo v1 was within experimental uncertainty when predicting noise from cold subsonic jets at all angles, but did not predict changes with heating of the jet and did not account for directivity changes at supersonic conditions. The shortcomings addressed here give direction for future work relevant to the statistical prediction methods. A full report will be released as a chapter in a NASA publication assessing the state of the art in aircraft noise prediction.

  12. Summary Statistics for Homemade “Play Dough” -- Data Acquired at LLNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kallman, J S; Morales, K E; Whipple, R E

    Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a homemade Play Dough(TM)-like material, designated as PDA. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2700 LMHU_D at 100 kVp to a low of about 1200 LMHU_D at 300 kVp. The standard deviation of each measurement is around 10% to 15% of the mean. The entropy covers the range from 6.0 to 7.4. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the detailed chemical composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the value of the effective atomic number, Z_eff, to be near 10. LLNL prepared about 50 mL of the homemade 'Play Dough' in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in the data capture and image reconstruction are given in this report. Additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those comprised of voids and inclusions, are included in the statistics. We then calculated the mean value, standard deviation, and entropy for (a) the four image segments and (b) their digital gradient images. (A digital gradient image of a given image was obtained by taking the absolute value of the difference between the initial image and that same image offset by one voxel horizontally, parallel to the rows of the x-ray detector array.) The statistics of the initial image of LAC values are called 'first-order statistics'; those of the gradient image, 'second-order statistics.'
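
    A minimal sketch of the first- and second-order statistics described above: mean, standard deviation, and entropy of an image segment and of its one-voxel horizontal digital gradient. The image is synthetic, and the entropy is a histogram-based Shannon entropy in bits, an assumption standing in for the report's KDE-based analysis.

        import numpy as np

        def image_stats(img, bins=256):
            """Mean, standard deviation, and histogram-based Shannon entropy (bits)."""
            counts, _ = np.histogram(img, bins=bins)
            p = counts[counts > 0] / img.size
            return img.mean(), img.std(), -np.sum(p * np.log2(p))

        # Synthetic stand-in for a central-core CT segment of LAC values.
        rng = np.random.default_rng(6)
        core = rng.normal(2700.0, 300.0, size=(128, 128))

        # Digital gradient: |image minus itself offset one voxel along the rows|.
        grad = np.abs(core[:, 1:] - core[:, :-1])

        for label, img in [("first-order (image)", core), ("second-order (gradient)", grad)]:
            m, s, h = image_stats(img)
            print(f"{label}: mean = {m:.0f}, std = {s:.0f}, entropy = {h:.2f} bits")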

  13. Whole-Range Assessment: A Simple Method for Analysing Allelopathic Dose-Response Data

    PubMed Central

    An, Min; Pratley, J. E.; Haig, T.; Liu, D.L.

    2005-01-01

    Based on the typical biological responses of an organism to allelochemicals (hormesis), the concepts of whole-range assessment and an inhibition index were developed for improved analysis of allelopathic data. Examples of their application are presented using data drawn from the literature. The method is concise and comprehensive; it makes data grouping and multiple comparisons simple and logical, improves data interpretation, enhances research outcomes, and provides a statistically efficient summary of plant response profiles. PMID:19330165

  14. Normal and abnormal human vestibular ocular function

    NASA Technical Reports Server (NTRS)

    Peterka, R. J.; Black, F. O.

    1986-01-01

    The major motivation of this research was to understand the role the vestibular system plays in sensorimotor interactions that result in spatial disorientation and motion sickness. A second goal was to explore the range of abnormality as reflected in quantitative measures of vestibular reflex responses. The results of a study of vestibular reflex measurements in normal subjects, together with preliminary results in abnormal subjects, are presented in this report. Statistical methods were used to define the range of normal responses and to determine age-related changes in function.

  15. On the Relationship Between Transfer Function-derived Response Times and Hydrograph Analysis Timing Parameters: Are there Similarities?

    NASA Astrophysics Data System (ADS)

    Bansah, S.; Ali, G.; Haque, M. A.; Tang, V.

    2017-12-01

    The proportion of precipitation that becomes streamflow is a function of internal catchment characteristics - including geology, landscape attributes, and vegetation - that influence overall storage dynamics. The timing and quantity of water discharged by a catchment are embedded in event hydrographs. Event hydrograph timing parameters, such as the response lag and time of concentration, are important descriptors of how long it takes a catchment to respond to input precipitation and how long it takes the latter to filter through the catchment. However, the extent to which hydrograph timing parameters relate to average response times derived from fitting transfer functions to annual hydrographs is unknown. In this study, we used a gamma transfer function to determine catchment average response times as well as event-specific hydrograph parameters across a network of eight nested prairie catchments, ranging from 0.19 km2 to 74.6 km2, located in south-central Manitoba (Canada). Various statistical analyses were then performed to correlate average response times - estimated using the parameters of the fitted gamma transfer function - with event-specific hydrograph parameters. Preliminary results show significant interannual variations in response times and hydrograph timing parameters: the former were on the order of a few hours to days, while the latter ranged from a few days to weeks. Some statistically significant relationships were detected between response times and event-specific hydrograph parameters. Future analyses will involve comparing the statistical distributions of event-specific hydrograph parameters with those of runoff response times and baseflow transit times in order to quantify catchment storage dynamics across a range of temporal scales.
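
    A minimal sketch of the transfer-function approach: convolve rainfall with a gamma impulse-response function, recover the shape and scale parameters by least squares, and report the mean response time as shape x scale. The daily series and parameter values are synthetic, not from the study catchments.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import gamma as gamma_dist

        def simulate_flow(precip, shape, scale, n_irf=200):
            """Convolve precipitation with a unit-volume gamma impulse-response function."""
            t = np.arange(n_irf) + 0.5   # offset avoids the singularity at t = 0
            irf = gamma_dist.pdf(t, a=shape, scale=scale)
            irf /= irf.sum()
            return np.convolve(precip, irf)[:len(precip)]

        rng = np.random.default_rng(7)
        precip = rng.exponential(2.0, 365) * (rng.uniform(size=365) < 0.3)

        # "Observed" flow generated with shape = 2, scale = 5 days, plus noise.
        flow = simulate_flow(precip, 2.0, 5.0) + rng.normal(0.0, 0.01, 365)

        popt, _ = curve_fit(simulate_flow, precip, flow, p0=[1.0, 2.0],
                            bounds=([0.1, 0.1], [10.0, 50.0]))
        print(f"shape = {popt[0]:.2f}, scale = {popt[1]:.2f} d, "
              f"mean response time = {popt[0] * popt[1]:.1f} d")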

  16. TURBULENCE-INDUCED RELATIVE VELOCITY OF DUST PARTICLES. IV. THE COLLISION KERNEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pan, Liubin; Padoan, Paolo, E-mail: lpan@cfa.harvard.edu, E-mail: ppadoan@icc.ub.edu

    Motivated by its importance for modeling dust particle growth in protoplanetary disks, we study turbulence-induced collision statistics of inertial particles as a function of the particle friction time, τ_p. We show that turbulent clustering significantly enhances the collision rate for particles of similar sizes with τ_p corresponding to the inertial range of the flow. If the friction time, τ_{p,h}, of the larger particle is in the inertial range, the collision kernel per unit cross section increases with increasing friction time, τ_{p,l}, of the smaller particle and reaches its maximum at τ_{p,l} = τ_{p,h}, where the clustering effect peaks. This feature is not captured by the commonly used kernel formula, which neglects the effect of clustering. We argue that turbulent clustering helps alleviate the bouncing barrier problem for planetesimal formation. We also investigate the collision velocity statistics using a collision-rate weighting factor to account for the higher collision frequency of particle pairs with larger relative velocity. For τ_{p,h} in the inertial range, the rms relative velocity with collision-rate weighting is found to be invariant with τ_{p,l} and scales with τ_{p,h} roughly as ∝ τ_{p,h}^{1/2}. The weighting factor favors collisions with larger relative velocity, and including it leads to more destructive and less sticking collisions. We compare two collision kernel formulations based on spherical and cylindrical geometries. The two formulations give consistent results for the collision rate and the collision-rate weighted statistics, except that the spherical formulation predicts more head-on collisions than the cylindrical formulation.

  17. Statistical properties of excited nuclei in the mass range 47 ≤ A ≤ 59

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhuravlev, B. V., E-mail: zhurav@ippe.ru; Lychagin, A. A., E-mail: Lychagin1@yandex.ru; Titarenko, N. N.

    Level densities and their energy dependences for nuclei in the mass range of 47 ≤ A ≤ 59 were determined from the results obtained by measuring neutron-evaporation spectra in respective (p, n) reactions. The spectra of neutrons originating from the (p, n) reactions on ⁴⁷Ti, ⁴⁸Ti, ⁴⁹Ti, ⁵³Cr, ⁵⁴Cr, ⁵⁷Fe, and ⁵⁹Co nuclei were measured in the proton-energy range of 7-11 MeV. These measurements were performed with the aid of a fast-neutron spectrometer by the time-of-flight method over the base of the EGP-15 pulsed tandem accelerator installed at the Institute for Physics and Power Engineering (Obninsk, Russia). The high resolution of the spectrometer and its stability in the time of flight made it possible to reliably identify discrete low-lying levels along with the continuum part of the neutron spectra. Our measured data were analyzed within the statistical equilibrium and preequilibrium models of nuclear reactions. The respective calculations were performed with the aid of the Hauser-Feshbach formalism of statistical theory supplemented with the generalized model of a superfluid nucleus, the back-shifted Fermi gas model, and the Gilbert-Cameron composite formula for nuclear level densities. Nuclear level densities for ⁴⁷V, ⁴⁸V, ⁴⁹V, ⁵³Mn, ⁵⁴Mn, ⁵⁷Co, and ⁵⁹Ni and their energy dependences were determined. The results are discussed and compared with available experimental data and with recommendations of model-based systematics.
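
    A minimal sketch of one of the level-density models named above, the back-shifted Fermi gas model in its standard form rho(U) = exp(2*sqrt(a*U)) / (12*sqrt(2) * sigma * a^(1/4) * U^(5/4)) with U = E - Delta; the parameter values below are illustrative assumptions, not the paper's fitted values.

        import numpy as np

        def bsfg_level_density(E, a, delta, sigma):
            """Back-shifted Fermi gas level density (levels per MeV).

            E: excitation energy (MeV); a: level-density parameter (1/MeV);
            delta: energy backshift (MeV); sigma: spin cutoff parameter.
            Valid only for E > delta.
            """
            U = E - delta   # effective excitation energy above the backshift
            return np.exp(2.0 * np.sqrt(a * U)) / (
                12.0 * np.sqrt(2.0) * sigma * a**0.25 * U**1.25)

        # Illustrative parameters for a nucleus in this mass region (assumed).
        a, delta, sigma = 6.0, 1.0, 3.5
        for E in [3.0, 5.0, 7.0, 9.0]:
            rho = bsfg_level_density(E, a, delta, sigma)
            print(f"E = {E:.1f} MeV: rho = {rho:.3e} levels/MeV")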

  18. Testing for X-Ray–SZ Differences and Redshift Evolution in the X-Ray Morphology of Galaxy Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nurgaliev, D.; McDonald, M.; Benson, B. A.

    We present a quantitative study of the X-ray morphology of galaxy clusters as a function of their detection method and redshift. We analyze two separate samples of galaxy clusters: a sample of 36 clusters at 0.35 < z < 0.9 selected in the X-ray with the ROSAT PSPC 400 deg² survey, and a sample of 90 clusters at 0.25 < z < 1.2 selected via the Sunyaev-Zel'dovich (SZ) effect with the South Pole Telescope. Clusters from both samples have similar-quality Chandra observations, which allow us to quantify their X-ray morphologies via two distinct methods: centroid shifts (w) and photon asymmetry (A_phot). The latter technique provides nearly unbiased morphology estimates for clusters spanning a broad range of redshift and data quality. We further compare the X-ray morphologies of X-ray- and SZ-selected clusters with those of simulated clusters. We do not find a statistically significant difference in the measured X-ray morphology of X-ray- and SZ-selected clusters over the redshift range probed by these samples, suggesting that the two are probing similar populations of clusters. We find that the X-ray morphologies of simulated clusters are statistically indistinguishable from those of X-ray- or SZ-selected clusters, implying that the most important physics for dictating the large-scale gas morphology (outside of the core) is well approximated in these simulations. Finally, we find no statistically significant redshift evolution in the X-ray morphology (both for observed and simulated clusters) over the range of z ~ 0.3 to z ~ 1, seemingly in contradiction with the redshift-dependent halo merger rate predicted by simulations.
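
    A minimal sketch of the centroid-shift statistic w under a common definition: the standard deviation of the offsets between the X-ray peak and the flux centroid computed in a series of shrinking circular apertures, normalized by the largest aperture radius. The aperture scheme and the synthetic two-component cluster image are assumptions; see the paper for the exact implementation.

        import numpy as np

        def centroid_shift(img, cx, cy, r_max, n_ap=10):
            """w = std of peak-to-centroid offsets in shrinking apertures, over r_max."""
            yy, xx = np.indices(img.shape)
            offsets = []
            for frac in np.linspace(1.0, 0.1, n_ap):
                mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= (frac * r_max) ** 2
                w_img = img * mask
                gx = (w_img * xx).sum() / w_img.sum()
                gy = (w_img * yy).sum() / w_img.sum()
                offsets.append(np.hypot(gx - cx, gy - cy))
            return float(np.std(offsets) / r_max)

        # Synthetic surface-brightness map: a main halo plus an offset subcluster.
        yy, xx = np.indices((201, 201))
        img = (1.0 + ((xx - 100.0) ** 2 + (yy - 100.0) ** 2) / 20.0**2) ** -1.5
        img += 0.3 * (1.0 + ((xx - 130.0) ** 2 + (yy - 120.0) ** 2) / 15.0**2) ** -1.5

        peak_y, peak_x = np.unravel_index(np.argmax(img), img.shape)
        print(f"w = {centroid_shift(img, peak_x, peak_y, r_max=80.0):.4f}")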

  19. Estimating Water Supply Arsenic Levels in the New England Bladder Cancer Study

    PubMed Central

    Freeman, Laura E. Beane; Lubin, Jay H.; Airola, Matthew S.; Baris, Dalsu; Ayotte, Joseph D.; Taylor, Anne; Paulu, Chris; Karagas, Margaret R.; Colt, Joanne; Ward, Mary H.; Huang, An-Tsun; Bress, William; Cherala, Sai; Silverman, Debra T.; Cantor, Kenneth P.

    2011-01-01

    Background: Ingestion of inorganic arsenic in drinking water is recognized as a cause of bladder cancer when levels are relatively high (≥ 150 µg/L). The epidemiologic evidence is less clear at the low-to-moderate concentrations typically observed in the United States. Accurate retrospective exposure assessment over a long time period is a major challenge in conducting epidemiologic studies of environmental factors and diseases with long latency, such as cancer. Objective: We estimated arsenic concentrations in the water supplies of 2,611 participants in a population-based case–control study in northern New England. Methods: Estimates covered the lifetimes of most study participants and were based on a combination of arsenic measurements at the homes of the participants and statistical modeling of arsenic concentrations in the water supply of both past and current homes. We assigned a residential water supply arsenic concentration for 165,138 (95%) of the total 173,361 lifetime exposure years (EYs) and a workplace water supply arsenic level for 85,195 EYs (86% of reported occupational years). Results: Three methods accounted for 93% of the residential estimates of arsenic concentration: direct measurement of water samples (27%; median, 0.3 µg/L; range, 0.1–11.5), statistical models of water utility measurement data (49%; median, 0.4 µg/L; range, 0.3–3.3), and statistical models of arsenic concentrations in wells using aquifers in New England (17%; median, 1.6 µg/L; range, 0.6–22.4). Conclusions: We used a different validation procedure for each of the three methods, and found our estimated levels to be comparable with available measured concentrations. This methodology allowed us to calculate potential drinking water exposure over long periods. PMID:21421449
