Sample records for full statistical analysis

  1. Generalized Full-Information Item Bifactor Analysis

    ERIC Educational Resources Information Center

    Cai, Li; Yang, Ji Seung; Hansen, Mark

    2011-01-01

    Full-information item bifactor analysis is an important statistical method in psychological and educational measurement. Current methods are limited to single-group analysis and inflexible in the types of item response models supported. We propose a flexible multiple-group item bifactor analysis framework that supports a variety of…

  2. Analysis of a Rocket Based Combined Cycle Engine during Rocket Only Operation

    NASA Technical Reports Server (NTRS)

    Smith, T. D.; Steffen, C. J., Jr.; Yungster, S.; Keller, D. J.

    1998-01-01

The all rocket mode of operation is a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. However, outside of performing experiments or a full three dimensional analysis, there are no first order parametric models to estimate performance. As a result, an axisymmetric RBCC engine was used to analytically determine specific impulse efficiency values based upon both full flow and gas generator configurations. Design of experiments methodology was used to construct a test matrix and statistical regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, percent of injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inlet diameter ratio. A perfect gas computational fluid dynamics analysis was performed to obtain values of vacuum specific impulse. Statistical regression analysis was performed based on both full flow and gas generator engine cycles. Results were also found to be dependent upon the engine cycle assumptions. The statistical regression analysis determined that there were five significant linear effects, six interactions, and one second-order effect. Two parametric models were created to provide performance assessments of an RBCC engine in the all rocket mode of operation.
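
    The regression-model construction described above is easy to illustrate. The sketch below fits a least-squares model with linear, interaction, and second-order terms to a coded DOE test matrix; the three factors shown and all response values are invented placeholders, not the study's data.

```python
# Hedged sketch: fitting a parametric performance model from a DOE test
# matrix with ordinary least squares. Factor names follow the abstract;
# the response values are made up for illustration.
import numpy as np

# Coded levels (-1/+1) for three of the six factors, hypothetical runs.
X_raw = np.array([
    [-1, -1, -1],
    [ 1, -1, -1],
    [-1,  1, -1],
    [ 1,  1, -1],
    [-1, -1,  1],
    [ 1, -1,  1],
    [-1,  1,  1],
    [ 1,  1,  1],
    [ 0,  0,  0],   # center point to resolve a second-order effect
])
isp = np.array([0.82, 0.85, 0.84, 0.88, 0.80, 0.83, 0.86, 0.90, 0.86])

pc, ar, ld = X_raw.T   # chamber pressure, area ratio, length-to-diameter
# Model: intercept + linear effects + one interaction + one quadratic term.
X = np.column_stack([np.ones(len(isp)), pc, ar, ld, pc * ar, ar**2])
coef, *_ = np.linalg.lstsq(X, isp, rcond=None)
print(dict(zip(["b0", "pc", "ar", "ld", "pc*ar", "ar^2"], coef.round(4))))
```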

  3. Rock Statistics at the Mars Pathfinder Landing Site, Roughness and Roving on Mars

    NASA Technical Reports Server (NTRS)

    Haldemann, A. F. C.; Bridges, N. T.; Anderson, R. C.; Golombek, M. P.

    1999-01-01

Several rock counts have been carried out at the Mars Pathfinder landing site, producing consistent statistics of rock coverage and size-frequency distributions. These rock statistics provide a primary element of "ground truth" for anchoring remote sensing information used to pick the Pathfinder, and future, landing sites. The observed rock population statistics should also be consistent with the emplacement and alteration processes postulated to govern the landing site landscape. The rock population databases can, however, be used in ways that go beyond the calculation of cumulative number and cumulative area distributions versus rock diameter and height. Since the spatial parameters measured to characterize each rock are determined with stereo image pairs, the rock database serves as a subset of the full landing site digital terrain model (DTM). Insofar as a rock count can be carried out in a speedier, albeit coarser, manner than the full DTM analysis, rock counting offers several operational and scientific products in the near term. Quantitative rock mapping adds further information to the geomorphic study of the landing site, and can also be used for rover traverse planning. Statistical analysis of surface roughness using the rock-count proxy DTM is sufficiently accurate, when compared to the full DTM, to support comparison with radar remote sensing roughness measures and with rover traverse profiles.
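
    As a small illustration of the cumulative size-frequency statistics mentioned above, the following sketch computes the cumulative number of rocks per unit area at or above each diameter; the diameters and mapped area are invented.

```python
# Minimal sketch: cumulative number of rocks per unit area versus diameter,
# the kind of size-frequency curve the abstract describes. Diameters are
# hypothetical; the area is an assumed count region in square meters.
import numpy as np

diameters_m = np.array([0.12, 0.25, 0.08, 0.40, 0.15, 0.30, 0.10, 0.55, 0.22, 0.18])
area_m2 = 100.0  # assumed mapped area

d_sorted = np.sort(diameters_m)[::-1]                    # largest first
cum_number = np.arange(1, len(d_sorted) + 1) / area_m2   # N(>= D) per m^2
for d, n in zip(d_sorted, cum_number):
    print(f"D >= {d:.2f} m : {n:.3f} rocks/m^2")
```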

  4. Computer program uses Monte Carlo techniques for statistical system performance analysis

    NASA Technical Reports Server (NTRS)

    Wohl, D. P.

    1967-01-01

    Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
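
    A minimal sketch of the sampling idea, assuming an invented system model: component disturbances are drawn from their full distributions and pushed through the system function, and the resulting sample approximates the system performance distribution without bias.

```python
# Hedged sketch of the Monte Carlo idea: draw component misalignments from
# their full distributions and propagate them through a system model to get
# an unbiased performance distribution. The system function is invented.
import numpy as np

rng = np.random.default_rng(1967)
n = 100_000
tilt = rng.normal(0.0, 0.05, n)       # assumed component tilt error (deg)
offset = rng.normal(0.0, 0.2, n)      # assumed lateral offset (mm)
gain = rng.uniform(0.95, 1.05, n)     # assumed component gain tolerance

# Hypothetical system performance model combining the disturbances.
performance = gain * np.exp(-(tilt / 0.1) ** 2) * (1 - 0.01 * offset**2)

print(f"mean = {performance.mean():.4f}, std = {performance.std():.4f}")
print(f"P(performance < 0.9) = {(performance < 0.9).mean():.4f}")
```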

  5. Waiting time distribution revealing the internal spin dynamics in a double quantum dot

    NASA Astrophysics Data System (ADS)

    Ptaszyński, Krzysztof

    2017-07-01

Waiting time distribution and the zero-frequency full counting statistics of unidirectional electron transport through a double quantum dot molecule attached to spin-polarized leads are analyzed using the quantum master equation. The waiting time distribution exhibits a nontrivial dependence on the value of the exchange coupling between the dots and the gradient of the applied magnetic field, which reveals the oscillations between the spin states of the molecule. The zero-frequency full counting statistics, on the other hand, is independent of the aforementioned quantities, thus giving no insight into the internal dynamics. The fact that the waiting time distribution and the zero-frequency full counting statistics give nonequivalent information is associated with two factors. Firstly, it can be explained by the sensitivity to different timescales of the dynamics of the system. Secondly, it is associated with the presence of correlation between subsequent waiting times, which makes the renewal theory, relating the full counting statistics and the waiting time distribution, no longer applicable. The study highlights the particular usefulness of the waiting time distribution for the analysis of the internal dynamics of mesoscopic systems.
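
    The waiting-time machinery can be illustrated classically. In the sketch below, the master-equation generator L is split into a counted-jump superoperator J and a no-jump part L0 = L - J, and the waiting time density is evaluated as w(τ) ∝ 1·J·exp(L0 τ)·J·ρss; the two-state rates are placeholders, not the paper's double-dot model.

```python
# A minimal classical sketch of the waiting-time machinery: split the
# master-equation generator L into a counted-jump part J and a no-jump
# part L0 = L - J; then w(tau) ~ 1 . J . expm(L0*tau) . J . rho_ss.
import numpy as np
from scipy.linalg import expm

g_in, g_out = 1.0, 0.5                     # assumed tunneling rates
L = np.array([[-g_in,  g_out],
              [ g_in, -g_out]])            # d(rho)/dt = L rho (column form)
J = np.array([[0.0, g_out],
              [0.0, 0.0]])                 # counted jump: occupied -> empty
L0 = L - J

rho_ss = np.array([g_out, g_in]) / (g_in + g_out)   # steady state of L
ones = np.ones(2)
norm = ones @ J @ rho_ss

for tau in np.linspace(0.0, 10.0, 6):
    w = ones @ J @ expm(L0 * tau) @ J @ rho_ss / norm
    print(f"w({tau:4.1f}) = {w:.4f}")
```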

  6. Proceedings of the second annual Forest Inventory and Analysis symposium; Salt Lake City, UT. October 17-18, 2000

    Treesearch

Gregory A. Reams; Ronald E. McRoberts; Paul C. van Deusen; [Editors]

    2001-01-01

    Documents progress in developing techniques in remote sensing, statistics, information management, and analysis required for full implementation of the national Forest Inventory and Analysis program’s annual forest inventory system.

  7. Do-it-yourself statistics: A computer-assisted likelihood approach to analysis of data from genetic crosses.

    PubMed Central

    Robbins, L G

    2000-01-01

Graduate school programs in genetics have become so full that courses in statistics have often been eliminated. In addition, typical introductory statistics courses for the "statistics user" rather than the nascent statistician are laden with methods for analysis of measured variables while genetic data are most often discrete numbers. These courses are often seen by students and genetics professors alike as largely irrelevant cookbook courses. The powerful methods of likelihood analysis, although commonly employed in human genetics, are much less often used in other areas of genetics, even though current computational tools make this approach readily accessible. This article introduces the MLIKELY.PAS computer program and the logic of do-it-yourself maximum-likelihood statistics. The program itself, course materials, and expanded discussions of some examples that are only summarized here are available at http://www.unisi.it/ricerca/dip/bio_evol/sitomlikely/mlikely.html. PMID:10628965
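
    In the do-it-yourself spirit of the article, here is a minimal likelihood sketch for discrete genetic data: estimating a recombination fraction r from backcross class counts by numerically maximizing the multinomial log-likelihood. The counts are invented, and this is an illustration, not the MLIKELY.PAS program.

```python
# Maximum-likelihood estimation of a recombination fraction r from
# hypothetical backcross counts, via numeric optimization. The closed-form
# answer (recombinants / total) is printed alongside as a check.
import numpy as np
from scipy.optimize import minimize_scalar

counts = {"AB": 41, "ab": 39, "Ab": 11, "aB": 9}   # hypothetical offspring

def neg_log_lik(r):
    p = {"AB": (1 - r) / 2, "ab": (1 - r) / 2, "Ab": r / 2, "aB": r / 2}
    return -sum(n * np.log(p[c]) for c, n in counts.items())

fit = minimize_scalar(neg_log_lik, bounds=(1e-6, 0.5 - 1e-6), method="bounded")
n_rec, n_tot = counts["Ab"] + counts["aB"], sum(counts.values())
print(f"numeric MLE r = {fit.x:.4f}; closed form {n_rec}/{n_tot} = {n_rec/n_tot:.4f}")
```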

  8. Tunneling Statistics for Analysis of Spin-Readout Fidelity

    NASA Astrophysics Data System (ADS)

    Gorman, S. K.; He, Y.; House, M. G.; Keizer, J. G.; Keith, D.; Fricke, L.; Hile, S. J.; Broome, M. A.; Simmons, M. Y.

    2017-09-01

    We investigate spin and charge dynamics of a quantum dot of phosphorus atoms coupled to a radio-frequency single-electron transistor (SET) using full counting statistics. We show how the magnetic field plays a role in determining the bunching or antibunching tunneling statistics of the donor dot and SET system. Using the counting statistics, we show how to determine the lowest magnetic field where spin readout is possible. We then show how such a measurement can be used to investigate and optimize single-electron spin-readout fidelity.

  9. Rolling-Element Fatigue Testing and Data Analysis - A Tutorial

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Zaretsky, Erwin V.

    2011-01-01

In order to rank bearing materials, lubricants and other design variables using rolling-element bench-type fatigue testing of bearing components and full-scale rolling-element bearing tests, the investigator needs to be cognizant of the variables that affect rolling-element fatigue life and be able to maintain and control them within an acceptable experimental tolerance. Once these variables are controlled, the number of tests and the test conditions must be specified to assure reasonable statistical certainty of the final results. There is a reasonable correlation between the results from elemental test rigs and those obtained with full-scale bearings. Using the statistical methods of W. Weibull and L. Johnson, the minimum number of tests required can be determined. This paper brings together and discusses the technical aspects of rolling-element fatigue testing and data analysis as well as making recommendations to assure quality and reliable testing of rolling-element specimens and full-scale rolling-element bearings.
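
    A compact sketch of the Weibull-Johnson style of analysis referred to above: order the fatigue lives, assign median ranks (Benard's approximation), and fit the Weibull line by least squares to estimate the slope, characteristic life, and L10 life. The lives are invented for illustration.

```python
# Hedged sketch of a Weibull fatigue-life fit using median ranks.
# Lives (millions of stress cycles) are made-up data.
import numpy as np

lives = np.sort(np.array([12.1, 18.4, 23.0, 31.5, 40.2, 55.7, 73.9, 101.0]))
n = len(lives)
ranks = np.arange(1, n + 1)
F = (ranks - 0.3) / (n + 0.4)          # Benard's median-rank approximation

x = np.log(lives)
y = np.log(np.log(1.0 / (1.0 - F)))    # Weibull probability coordinates
slope, intercept = np.polyfit(x, y, 1)
beta = slope                            # Weibull slope (scatter parameter)
eta = np.exp(-intercept / slope)        # characteristic life (63.2 % failed)
L10 = eta * (-np.log(0.9)) ** (1.0 / beta)   # life at 90 % reliability
print(f"beta = {beta:.2f}, eta = {eta:.1f}, L10 = {L10:.1f} (million cycles)")
```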

  10. Nebraska's forests, 2005: statistics, methods, and quality assurance

    Treesearch

    Patrick D. Miles; Dacia M. Meneguzzo; Charles J. Barnett

    2011-01-01

The first full annual inventory of Nebraska's forests was completed in 2005 after 8,335 plots were selected and 274 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Tables of various important resource statistics are presented. Detailed analysis of the inventory data are...

  11. Kansas's forests, 2005: statistics, methods, and quality assurance

    Treesearch

    Patrick D. Miles; W. Keith Moser; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Kansas's forests was completed in 2005 after 8,868 plots were selected and 468 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of Kansas inventory is presented...

  12. Performance of an Axisymmetric Rocket Based Combined Cycle Engine During Rocket Only Operation Using Linear Regression Analysis

    NASA Technical Reports Server (NTRS)

    Smith, Timothy D.; Steffen, Christopher J., Jr.; Yungster, Shaye; Keller, Dennis J.

    1998-01-01

    The all rocket mode of operation is shown to be a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. An axisymmetric RBCC engine was used to determine specific impulse efficiency values based upon both full flow and gas generator configurations. Design of experiments methodology was used to construct a test matrix and multiple linear regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inlet diameter ratio. A perfect gas computational fluid dynamics analysis, using both the Spalart-Allmaras and k-omega turbulence models, was performed with the NPARC code to obtain values of vacuum specific impulse. Results from the multiple linear regression analysis showed that for both the full flow and gas generator configurations increasing mixer-ejector area ratio and rocket area ratio increase performance, while increasing mixer-ejector inlet area ratio and mixer-ejector length-to-diameter ratio decrease performance. Increasing injected secondary flow increased performance for the gas generator analysis, but was not statistically significant for the full flow analysis. Chamber pressure was found to be not statistically significant.

13. Evaluation of a segment-based LANDSAT full-frame approach to crop area estimation

    NASA Technical Reports Server (NTRS)

    Bauer, M. E. (Principal Investigator); Hixson, M. M.; Davis, S. M.

    1981-01-01

As the registration of LANDSAT full frames enters the realm of current technology, sampling methods should be examined which utilize other than the segment data used for LACIE. The effect of separating the functions of sampling for training and sampling for area estimation was examined. The frame selected for analysis was acquired over north central Iowa on August 9, 1978. A stratification of the full frame was defined. Training data came from segments within the frame. Two classification and estimation procedures were compared: statistics developed on one segment were used to classify that segment, and pooled statistics from the segments were used to classify a systematic sample of pixels. Comparisons to USDA/ESCS estimates illustrate that the full-frame sampling approach can provide accurate and precise area estimates.

  14. Full information acquisition in scanning probe microscopy and spectroscopy

    DOEpatents

    Jesse, Stephen; Belianinov, Alex; Kalinin, Sergei V.; Somnath, Suhas

    2017-04-04

    Apparatus and methods are described for scanning probe microscopy and spectroscopy based on acquisition of full probe response. The full probe response contains valuable information about the probe-sample interaction that is lost in traditional scanning probe microscopy and spectroscopy methods. The full probe response is analyzed post data acquisition using fast Fourier transform and adaptive filtering, as well as multivariate analysis. The full response data is further compressed to retain only statistically significant components before being permanently stored.

  15. North Dakota's forests, 2005: statistics, methods, and quality assurance

    Treesearch

    Patrick D. Miles; David E. Haugen; Charles J. Barnett

    2011-01-01

    The first full annual inventory of North Dakota's forests was completed in 2005 after 7,622 plots were selected and 164 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the North Dakota...

  16. Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance

    Treesearch

    Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield

    2013-01-01

    The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...

  17. South Dakota's forests, 2005: statistics, methods, and quality assurance

    Treesearch

    Patrick D. Miles; Ronald J. Piva; Charles J. Barnett

    2011-01-01

    The first full annual inventory of South Dakota's forests was completed in 2005 after 8,302 plots were selected and 325 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the South Dakota...

  18. Effect and safety of early weight-bearing on the outcome after open-wedge high tibial osteotomy: a systematic review and meta-analysis.

    PubMed

    Lee, O-Sung; Ahn, Soyeon; Lee, Yong Seuk

    2017-07-01

The purpose of this systematic review and meta-analysis was to evaluate the effectiveness and safety of early weight-bearing by comparing clinical and radiological outcomes between early and traditional delayed weight-bearing after OWHTO. A rigorous and systematic approach was used. The methodological quality was also assessed. Results that could be compared across two or more articles were presented as forest plots. A 95% confidence interval was calculated for each effect size, and we calculated the I² statistic, which presents the percentage of total variation attributable to heterogeneity among studies. The random-effects model was used to calculate the effect size. Six articles were included in the final analysis. All case groups were composed of early full weight-bearing within 2 weeks. All control groups were composed of late full weight-bearing between 6 weeks and 2 months. Pooled analysis was possible for the improvement in Lysholm score, but there was no statistically significant difference shown between groups. Other clinical results were also similar between groups. Four studies reported mechanical femorotibial angle (mFTA), and this result showed no statistically significant difference between groups in the pooled analysis. Furthermore, early weight-bearing showed more favorable results in some radiologic results (osseointegration and patellar height) and complications (thrombophlebitis and recurrence). Our analysis supports that early full weight-bearing after OWHTO using a locking plate leads to improvement in outcomes comparable to delayed weight-bearing in terms of clinical and radiological outcomes. Moreover, early weight-bearing was more favorable with respect to some radiologic parameters and complications compared with delayed weight-bearing.
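
    The pooling machinery named in this abstract (a random-effects model plus the I² heterogeneity statistic) is compact enough to sketch; the per-study effects and variances below are invented, not the review's data.

```python
# DerSimonian-Laird random-effects pooling with the I^2 statistic.
# Effect sizes and variances are hypothetical placeholders.
import numpy as np

y = np.array([1.2, 0.4, 2.1, -0.3, 0.8, 1.5])   # per-study effects
v = np.array([0.5, 0.7, 1.1, 0.6, 0.4, 0.9])    # per-study variances

w = 1.0 / v
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)               # Cochran's Q
df = len(y) - 1
I2 = max(0.0, (Q - df) / Q) * 100.0              # heterogeneity percentage

C = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)                    # between-study variance
w_re = 1.0 / (v + tau2)
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"I^2 = {I2:.1f}%, pooled effect = {y_re:.2f} "
      f"(95% CI {y_re - 1.96*se_re:.2f} to {y_re + 1.96*se_re:.2f})")
```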

  19. A Matlab user interface for the statistically assisted fluid registration algorithm and tensor-based morphometry

    NASA Astrophysics Data System (ADS)

    Yepes-Calderon, Fernando; Brun, Caroline; Sant, Nishita; Thompson, Paul; Lepore, Natasha

    2015-01-01

Tensor-Based Morphometry (TBM) is an increasingly popular method for group analysis of brain MRI data. The main steps in the analysis consist of a nonlinear registration to align each individual scan to a common space, and a subsequent statistical analysis to determine morphometric differences, or difference in fiber structure between groups. Recently, we implemented the Statistically-Assisted Fluid Registration Algorithm, or SAFIRA, which is designed for tracking morphometric differences among populations. To this end, SAFIRA allows the inclusion of statistical priors extracted from the populations being studied as regularizers in the registration. This flexibility and degree of sophistication limit the tool to expert use, even more so considering that SAFIRA was initially implemented in command line mode. Here, we introduce a new, intuitive, easy to use, Matlab-based graphical user interface for SAFIRA's multivariate TBM. The interface also generates different choices for the TBM statistics, including both the traditional univariate statistics on the Jacobian matrix, and comparison of the full deformation tensors. This software will be freely disseminated to the neuroimaging research community.

  20. Statistics on continuous IBD data: Exact distribution evaluation for a pair of full(half)-sibs and a pair of a (great-) grandchild with a (great-) grandparent

    PubMed Central

    Stefanov, Valeri T

    2002-01-01

Background: Pairs of related individuals are widely used in linkage analysis. Most of the tests for linkage analysis are based on statistics associated with identity by descent (IBD) data. Current biotechnology provides data on very densely packed loci, and therefore it may provide almost continuous IBD data for pairs of closely related individuals. The distribution theory for statistics on continuous IBD data is therefore of interest. In particular, distributional results which allow the evaluation of p-values for relevant tests are of importance. Results: A technology is provided for numerical evaluation, with any given accuracy, of the cumulative probabilities of some statistics on continuous genome data for pairs of closely related individuals. In the case of a pair of full-sibs, the following statistics are considered: (i) the proportion of genome with 2 (at least 1) haplotypes shared identical-by-descent (IBD) on a chromosomal segment, (ii) the number of distinct pieces (subsegments) of a chromosomal segment, on each of which exactly 2 (at least 1) haplotypes are shared IBD. The natural counterparts of these statistics for the other relationships are also considered. Relevant Maple codes are provided for a rapid evaluation of the cumulative probabilities of such statistics. The genomic continuum model, with Haldane's model for the crossover process, is assumed. Conclusions: A technology, together with relevant software codes for its automated implementation, is provided for exact evaluation of the distributions of relevant statistics associated with continuous genome data on closely related individuals. PMID:11996673
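
    The exact evaluation belongs to the paper's Maple codes, but the quantity itself is easy to approximate by simulation under the same genomic continuum model: each transmitted haplotype's grandparental origin switches along the segment at rate 1 per Morgan (Haldane), and the pointwise IBD count is the number of parental sides on which the two sibs inherited the same origin.

```python
# Monte Carlo sketch (not the paper's exact method): proportion of a
# 1-Morgan segment on which full sibs share 2 (or >= 1) haplotypes IBD.
import numpy as np

rng = np.random.default_rng(2002)
length_M, dx = 1.0, 0.001                  # 1 Morgan segment, fine grid
n_steps = int(length_M / dx)

def origin_process():
    """Grandparental-origin indicator (0/1) along the segment."""
    state = int(rng.integers(0, 2))
    out = np.empty(n_steps, dtype=int)
    for i in range(n_steps):
        out[i] = state
        if rng.random() < dx:              # switch with prob ~ rate * dx
            state ^= 1
    return out

n_rep, share2, share1 = 300, [], []
for _ in range(n_rep):
    pat = origin_process() == origin_process()   # paternal allele IBD?
    mat = origin_process() == origin_process()   # maternal allele IBD?
    ibd = pat.astype(int) + mat.astype(int)
    share2.append((ibd == 2).mean())
    share1.append((ibd >= 1).mean())

print(f"E[prop IBD=2]  ~ {np.mean(share2):.3f} (pointwise theory 0.25)")
print(f"E[prop IBD>=1] ~ {np.mean(share1):.3f} (pointwise theory 0.75)")
```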

  1. An Asynchronous Many-Task Implementation of In-Situ Statistical Analysis using Legion.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-11-01

In this report, we propose a framework for the design and implementation of in-situ analyses using an asynchronous many-task (AMT) model, using the Legion programming model together with the MiniAero mini-application as a surrogate for full-scale parallel scientific computing applications. The bulk of this work consists of converting the Learn/Derive/Assess model, which we had initially developed for parallel statistical analysis using MPI [PTBM11], from an SPMD to an AMT model. To this end, we propose an original use of the concept of Legion logical regions as a replacement for the parallel communication schemes used for the only operation of the statistics engines that requires explicit communication. We then evaluate this proposed scheme in a shared memory environment, using the Legion port of MiniAero as a proxy for a full-scale scientific application, as a means to provide input data sets of variable size for the in-situ statistical analyses in an AMT context. We demonstrate in particular that the approach has merit, and warrants further investigation, in collaboration with ongoing efforts to improve the overall parallel performance of the Legion system.
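
    The statistics engines referenced above rest on mergeable partial aggregates; the sketch below shows the standard pairwise-update algebra for means and variances that lets per-task results be combined in a single communication step. It is a generic illustration, not the Legion implementation.

```python
# Pairwise merging of (count, mean, M2) partial statistics, the pattern
# behind parallel/in-situ statistics engines (Chan et al. update algebra).
import numpy as np

def local_moments(x):
    return len(x), x.mean(), ((x - x.mean()) ** 2).sum()

def merge(a, b):
    n1, m1, M2a = a
    n2, m2, M2b = b
    n = n1 + n2
    delta = m2 - m1
    mean = m1 + delta * n2 / n
    M2 = M2a + M2b + delta**2 * n1 * n2 / n
    return n, mean, M2

rng = np.random.default_rng(42)
chunks = [rng.normal(3.0, 2.0, size=s) for s in (1000, 2500, 500)]
n, mean, M2 = local_moments(chunks[0])
for c in chunks[1:]:                        # merge per-task aggregates
    n, mean, M2 = merge((n, mean, M2), local_moments(c))
print(f"merged mean = {mean:.4f}, var = {M2 / (n - 1):.4f}")
print(f"direct mean = {np.concatenate(chunks).mean():.4f}")
```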

  2. Bayesian networks and statistical analysis application to analyze the diagnostic test accuracy

    NASA Astrophysics Data System (ADS)

    Orzechowski, P.; Makal, Jaroslaw; Onisko, A.

    2005-02-01

A computer-aided BPH diagnosis system based on a Bayesian network is described in the paper. First results are compared to a given statistical method. Different statistical methods have been used successfully in medicine for years. However, the undoubted advantages of probabilistic methods make them useful in newly created systems, which are frequent in medicine but do not have full and competent knowledge. The article presents the advantages of the computer-aided BPH diagnosis system in clinical practice for urologists.

  3. Full-text publication of abstracts presented at European Orthodontic Society congresses.

    PubMed

    Livas, Christos; Pandis, Nikolaos; Ren, Yijin

    2014-10-01

Empirical evidence has indicated that only a subsample of studies conducted reach full-text publication and this phenomenon has become known as publication bias. A form of publication bias is the selectively delayed full publication of conference abstracts. The objective of this article was to examine the publication status of oral abstracts and poster-presentation abstracts, included in the scientific program of the 82nd and 83rd European Orthodontic Society (EOS) congresses, held in 2006 and 2007, and to identify factors associated with full-length publication. A systematic search of PubMed and Google Scholar databases was performed in April 2013 using author names and keywords from the abstract title to locate abstract and full-article publications. Information regarding mode of presentation, type of affiliation, geographical origin, statistical results, and publication details were collected and analyzed using univariable and multivariable logistic regression. Approximately 51 per cent of the EOS 2006 and 55 per cent of the EOS 2007 abstracts appeared in print more than 5 years post congress. A mean period of 1.32 years elapsed between conference and publication date. Mode of presentation (oral or poster), use of statistical analysis, and research subject area were significant predictors for publication success. Inherent discrepancies of abstract reporting, mainly related to presentation of preliminary results and incomplete description of methods, may be considered in analogous studies. On average 52.2 per cent of the abstracts presented at the two EOS conferences reached full publication. Abstracts presented orally and those including statistical analysis were more likely to be published.

  4. The role of environmental heterogeneity in meta-analysis of gene-environment interactions with quantitative traits.

    PubMed

    Li, Shi; Mukherjee, Bhramar; Taylor, Jeremy M G; Rice, Kenneth M; Wen, Xiaoquan; Rice, John D; Stringham, Heather M; Boehnke, Michael

    2014-07-01

With challenges in data harmonization and environmental heterogeneity across various data sources, meta-analysis of gene-environment interaction studies can often involve subtle statistical issues. In this paper, we study the effect of environmental covariate heterogeneity (within and between cohorts) on two approaches for fixed-effect meta-analysis: the standard inverse-variance weighted meta-analysis and a meta-regression approach. Akin to the results in Simmonds and Higgins (), we obtain analytic efficiency results for both methods under certain assumptions. The relative efficiency of the two methods depends on the ratio of within- versus between-cohort variability of the environmental covariate. We propose to use an adaptively weighted estimator (AWE), between meta-analysis and meta-regression, for the interaction parameter. The AWE retains full efficiency of the joint analysis using individual level data under certain natural assumptions. Lin and Zeng (2010a, b) showed that a multivariate inverse-variance weighted estimator retains full efficiency as joint analysis using individual level data, if the estimates with full covariance matrices for all the common parameters are pooled across all studies. We show consistency of our work with Lin and Zeng (2010a, b). Without sacrificing much efficiency, the AWE uses only univariate summary statistics from each study, and bypasses issues with sharing individual level data or full covariance matrices across studies. We compare the performance of the methods both analytically and numerically. The methods are illustrated through meta-analysis of interaction between Single Nucleotide Polymorphisms in the FTO gene and body mass index on high-density lipoprotein cholesterol data from a set of eight studies of type 2 diabetes.

  5. Urological research in sub-Saharan Africa: a retrospective cohort study of abstracts presented at the Nigerian Association of Urological Surgeons conferences.

    PubMed

    Bello, Jibril Oyekunle

    2013-11-14

    Nigeria is one of the top three countries in Africa in terms of science research output and Nigerian urologists' biomedical research output contributes to this. Each year, urologists in Nigeria gather to present their recent research at the conference of the Nigerian Association of Urological Surgeons (NAUS). These abstracts are not thoroughly vetted as are full length manuscripts published in peer reviewed journals but the information they disseminate may affect clinical practice of attendees. This study aims to describe the characteristics of abstracts presented at the annual conferences of NAUS, the quality of the abstracts as determined by the subsequent publication of full length manuscripts in peer-review indexed journals and the factors that influence such successful publication. Abstracts presented at the 2007 to 2010 NAUS conferences were identified through conference abstracts books. Using a strict search protocol, publication in peer-reviewed journals was determined. The abstracts characteristics were analyzed and their quality judged by subsequent successful publishing of full length manuscripts. Statistical analysis was performed using SPSS 16.0 software to determine factors predictive of successful publication. Only 75 abstracts were presented at the NAUS 2007 to 2010 conferences; a quarter (24%) of the presented abstracts was subsequently published as full length manuscripts. Median time to publication was 15 months (range 2-40 months). Manuscripts whose result data were analyzed with 'beyond basic' statistics of frequencies and averages were more likely to be published than those with basic or no statistics. Quality of the abstracts and thus subsequent publication success is influenced by the use of 'beyond basic' statistics in analysis of the result data presented. There is a need for improvement in the quality of urological research from Nigeria.

  6. An Interinstitutional Analysis of Faculty Teaching Load.

    ERIC Educational Resources Information Center

    Ahrens, Stephen W.

    A two-year interinstitutional study among 15 cooperating universities was conducted to determine whether significant differences exist in teaching loads among the selected universities as measured by student credit hours produced by full-time equivalent faculty. The statistical model was a multivariate analysis of variance with fixed effects and…

  7. The potential of statistical shape modelling for geometric morphometric analysis of human teeth in archaeological research

    PubMed Central

    Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia

    2017-01-01

    This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide high detail qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199

  8. [Comparative study of the repair of full thickness tear of the supraspinatus by means of "single row" or "suture bridge" techniques].

    PubMed

    Arroyo-Hernández, M; Mellado-Romero, M A; Páramo-Díaz, P; Martín-López, C M; Cano-Egea, J M; Vilá Y Rico, J

    2015-01-01

The purpose of this study is to analyze whether there is any difference between arthroscopic repair of full-thickness supraspinatus tears with the single row technique versus the suture bridge technique. We conducted a retrospective study of 123 patients with full-thickness supraspinatus tears treated between January 2009 and January 2013 in our hospital. There were 60 single row repairs and 63 suture bridge repairs. The mean age was 62.9 years in the single row group and 63.3 years in the suture bridge group. There were more women than men in both groups (67%). All patients were evaluated using the Constant test. The mean Constant score was 76.7 in the suture bridge group and 72.4 in the single row group. We also performed a statistical analysis of each Constant item. Strength was higher in the suture bridge group, with a statistically significant difference (p = 0.04). The range of movement was also greater in the suture bridge group, but the difference was not statistically significant. The suture bridge technique has better clinical results than single row repair, but the difference is not statistically significant (p = 0.298).

  9. Multilevel Structural Equation Models for the Analysis of Comparative Data on Educational Performance

    ERIC Educational Resources Information Center

    Goldstein, Harvey; Bonnet, Gerard; Rocher, Thierry

    2007-01-01

    The Programme for International Student Assessment comparative study of reading performance among 15-year-olds is reanalyzed using statistical procedures that allow the full complexity of the data structures to be explored. The article extends existing multilevel factor analysis and structural equation models and shows how this can extract richer…

  10. A full-spectrum analysis of high-speed train interior noise under multi-physical-field coupling excitations

    NASA Astrophysics Data System (ADS)

    Zheng, Xu; Hao, Zhiyong; Wang, Xu; Mao, Jie

    2016-06-01

High-speed-railway-train interior noise at low, medium, and high frequencies can be simulated by finite element analysis (FEA) or boundary element analysis (BEA), hybrid finite element analysis-statistical energy analysis (FEA-SEA), and statistical energy analysis (SEA), respectively. First, a new method named statistical acoustic energy flow (SAEF) is proposed, which can be applied to full-spectrum HST interior noise simulation (covering low, medium, and high frequencies) with only one model. In an SAEF model, the corresponding multi-physical-field coupling excitations are first fully considered and coupled to excite the interior noise. The interior noise attenuated by the sound insulation panels of the carriage is simulated by modeling the inflow of acoustic energy from the exterior excitations into the interior acoustic cavities. Rigid multi-body dynamics, fast multi-pole BEA, and large-eddy simulation with indirect boundary element analysis are employed to extract the multi-physical-field excitations, which include the wheel-rail interaction forces/secondary suspension forces, the wheel-rail rolling noise, and aerodynamic noise, respectively. All the peak values and their frequency bands of the simulated acoustic excitations are validated against those from a noise source identification test. In addition, the measured equipment noise inside the equipment compartment is used as one of the excitation sources contributing to the interior noise. Second, a fully trimmed FE carriage model is constructed, and the simulated modal shapes and frequencies agree well with the measured ones, which validates the global FE carriage model as well as the local FE models of the aluminum alloy-trim composite panel. Thus, the sound transmission loss model of any composite panel has been indirectly validated. Finally, the SAEF model of the carriage is constructed based on the validated FE model and driven by the multi-physical-field excitations. The results show that the trend of the simulated 1/3-octave-band sound pressure spectrum agrees well with that of the on-site measurement. The deviation between the simulated and measured overall sound pressure level (SPL) is 2.6 dB(A), well within the engineering tolerance limit, which validates the SAEF model for full-spectrum analysis of high-speed-train interior noise.
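
    One small numeric step from the validation above can be made concrete: combining a 1/3-octave band spectrum into an overall sound pressure level by energy summation. The band levels below are illustrative, not the measured interior noise.

```python
# Overall SPL from 1/3-octave band levels by energy (power) summation.
import numpy as np

band_spl_dBA = np.array([55.0, 58.0, 62.0, 66.0, 64.0, 60.0, 57.0])  # fake
overall = 10.0 * np.log10(np.sum(10.0 ** (band_spl_dBA / 10.0)))
print(f"overall SPL = {overall:.1f} dB(A)")
```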

  11. An Analysis Methodology for the Gamma-ray Large Area Space Telescope

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.; Cohen-Tanugi, Johann

    2004-01-01

    The Large Area Telescope (LAT) instrument on the Gamma Ray Large Area Space Telescope (GLAST) has been designed to detect high-energy gamma rays and determine their direction of incidence and energy. We propose a reconstruction algorithm based on recent advances in statistical methodology. This method, alternative to the standard event analysis inherited from high energy collider physics experiments, incorporates more accurately the physical processes occurring in the detector, and makes full use of the statistical information available. It could thus provide a better estimate of the direction and energy of the primary photon.

  12. Sensitivity of the Halstead and Wechsler Test Batteries to brain damage: Evidence from Reitan's original validation sample.

    PubMed

    Loring, David W; Larrabee, Glenn J

    2006-06-01

The Halstead-Reitan Battery has been instrumental in the development of neuropsychological practice in the United States. Although Reitan administered both the Wechsler-Bellevue Intelligence Scale and Halstead's test battery when evaluating Halstead's theory of biologic intelligence, the relative sensitivity of each test battery to brain damage continues to be an area of controversy. Because Reitan did not perform direct parametric analysis to contrast group performances, we reanalyze Reitan's original validation data from both the Halstead (Reitan, 1955) and Wechsler (Reitan, 1959a) batteries and calculate effect sizes and probability levels using traditional parametric approaches. Eight of the 10 tests comprising Halstead's original Impairment Index, as well as the Impairment Index itself, statistically differentiated patients with unequivocal brain damage from controls. In addition, 13 of 14 Wechsler measures including Full-Scale IQ also differed statistically between groups (Brain Damage Full-Scale IQ = 96.2; Control Group Full-Scale IQ = 112.6). We suggest that differences in the statistical properties of each battery (e.g., raw scores vs. standardized scores) likely contribute to classification characteristics including test sensitivity and specificity.
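
    A reanalysis of the kind described, working from group summary statistics, can be sketched as follows; the group means are those quoted in the abstract, while the standard deviations and group sizes are assumed placeholders.

```python
# Parametric group comparison and effect size from summary statistics.
import numpy as np
from scipy.stats import ttest_ind_from_stats

m_bd, m_ctl = 96.2, 112.6          # Full-Scale IQ means from the abstract
sd_bd, sd_ctl = 15.0, 12.0         # assumed standard deviations
n_bd, n_ctl = 50, 50               # assumed group sizes

t, p = ttest_ind_from_stats(m_bd, sd_bd, n_bd, m_ctl, sd_ctl, n_ctl)
sd_pooled = np.sqrt(((n_bd - 1) * sd_bd**2 + (n_ctl - 1) * sd_ctl**2)
                    / (n_bd + n_ctl - 2))
d = (m_ctl - m_bd) / sd_pooled      # Cohen's d effect size
print(f"t = {t:.2f}, p = {p:.2e}, Cohen's d = {d:.2f}")
```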

  13. The Effect of Missing Data Handling Methods on Goodness of Fit Indices in Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Köse, Alper

    2014-01-01

The primary objective of this study was to examine the effect of missing data on goodness of fit statistics in confirmatory factor analysis (CFA). For this aim, four missing data handling methods, namely listwise deletion, full information maximum likelihood, regression imputation, and expectation maximization (EM) imputation, were examined in terms of…

  14. Valid Statistical Analysis for Logistic Regression with Multiple Sources

    NASA Astrophysics Data System (ADS)

    Fienberg, Stephen E.; Nardi, Yuval; Slavković, Aleksandra B.

    Considerable effort has gone into understanding issues of privacy protection of individual information in single databases, and various solutions have been proposed depending on the nature of the data, the ways in which the database will be used and the precise nature of the privacy protection being offered. Once data are merged across sources, however, the nature of the problem becomes far more complex and a number of privacy issues arise for the linked individual files that go well beyond those that are considered with regard to the data within individual sources. In the paper, we propose an approach that gives full statistical analysis on the combined database without actually combining it. We focus mainly on logistic regression, but the method and tools described may be applied essentially to other statistical models as well.
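
    An illustrative sketch of the general idea (not the authors' privacy-preserving protocol): logistic regression can be fitted across sources by exchanging only aggregate Newton-Raphson terms per iteration, so row-level records never leave their source.

```python
# Logistic regression over two "databases" by pooling only aggregate
# gradients and Hessians each Newton-Raphson step. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)

def make_site(n):                       # synthetic per-site data
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    p = 1 / (1 + np.exp(-(0.5 + 1.2 * X[:, 1])))
    return X, rng.random(n) < p

sites = [make_site(400), make_site(600)]
beta = np.zeros(2)
for _ in range(25):                     # Newton-Raphson on pooled aggregates
    H = np.zeros((2, 2)); g = np.zeros(2)
    for X, y in sites:                  # each site shares only H and g
        p = 1 / (1 + np.exp(-X @ beta))
        W = p * (1 - p)
        H += X.T @ (X * W[:, None])
        g += X.T @ (y - p)
    beta += np.linalg.solve(H, g)
print("pooled-equivalent coefficients:", beta.round(3))
```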

  15. Testing statistical isotropy in cosmic microwave background polarization maps

    NASA Astrophysics Data System (ADS)

    Rath, Pranati K.; Samal, Pramoda Kumar; Panda, Srikanta; Mishra, Debesh D.; Aluri, Pavan K.

    2018-04-01

We apply our symmetry-based Power tensor technique to test conformity of PLANCK polarization maps with statistical isotropy. On a wide range of angular scales (l = 40 - 150), our preliminary analysis detects many statistically anisotropic multipoles in the foreground-cleaned full-sky PLANCK polarization maps, viz., COMMANDER and NILC. We also study the effect of residual foregrounds that may still be present in the Galactic plane, using both the common UPB77 polarization mask and the polarization masks specific to each component separation method. However, some of the statistically anisotropic modes still persist, significantly so in the NILC map. We further probed the data for coherent alignments across multipoles in several bins within the chosen multipole range.

  16. A Three Dimensional Kinematic and Kinetic Study of the Golf Swing

    PubMed Central

    Nesbit, Steven M.

    2005-01-01

This paper discusses the three-dimensional kinematics and kinetics of a golf swing as performed by 84 male and one female amateur subjects of various skill levels. The analysis was performed using a variable full-body computer model of a human coupled with a flexible model of a golf club. Data to drive the model was obtained from subject swings recorded using a multi-camera motion analysis system. Model output included club trajectories, golfer/club interaction forces and torques, work and power, and club deflections. These data formed the basis for a statistical analysis of all subjects, and a detailed analysis and comparison of the swing characteristics of four of the subjects. The analysis generated much new data concerning the mechanics of the golf swing. It revealed that a golf swing is a highly coordinated and individual motion and subject-to-subject variations were significant. The study highlighted the importance of the wrists in generating club head velocity and orienting the club face. The trajectory of the hands and the ability to do work were the factors most closely related to skill level. Key Points: Full-body model of the golf swing. Mechanical description of the golf swing. Statistical analysis of golf swing mechanics. Comparisons of subject swing mechanics. PMID:24627665

  17. A three dimensional kinematic and kinetic study of the golf swing.

    PubMed

    Nesbit, Steven M

    2005-12-01

This paper discusses the three-dimensional kinematics and kinetics of a golf swing as performed by 84 male and one female amateur subjects of various skill levels. The analysis was performed using a variable full-body computer model of a human coupled with a flexible model of a golf club. Data to drive the model was obtained from subject swings recorded using a multi-camera motion analysis system. Model output included club trajectories, golfer/club interaction forces and torques, work and power, and club deflections. These data formed the basis for a statistical analysis of all subjects, and a detailed analysis and comparison of the swing characteristics of four of the subjects. The analysis generated much new data concerning the mechanics of the golf swing. It revealed that a golf swing is a highly coordinated and individual motion and subject-to-subject variations were significant. The study highlighted the importance of the wrists in generating club head velocity and orienting the club face. The trajectory of the hands and the ability to do work were the factors most closely related to skill level. Key Points: Full-body model of the golf swing. Mechanical description of the golf swing. Statistical analysis of golf swing mechanics. Comparisons of subject swing mechanics.

  18. Enabling High-Energy, High-Voltage Lithium-Ion Cells: Standardization of Coin-Cell Assembly, Electrochemical Testing, and Evaluation of Full Cells

    DOE PAGES

    Long, Brandon R.; Rinaldo, Steven G.; Gallagher, Kevin G.; ...

    2016-11-09

Coin-cells are often the test format of choice for laboratories engaged in battery research and development as they provide a convenient platform for rapid testing of new materials on a small scale. However, reliable, reproducible data via the coin-cell format is inherently difficult, particularly in the full-cell configuration. In addition, statistical evaluation to prove the consistency and reliability of such data is often neglected. Herein we report on several studies aimed at formalizing physical process parameters and coin-cell construction related to full cells. Statistical analysis and performance benchmarking approaches are advocated as a means to more confidently track changes in cell performance. Finally, we show that trends in the electrochemical data obtained from coin-cells can be reliable and informative when standardized approaches are implemented in a consistent manner.

  19. Statistical analysis of sperm sorting

    NASA Astrophysics Data System (ADS)

    Koh, James; Marcos, Marcos

    2017-11-01

The success rate of assisted reproduction depends on the proportion of morphologically normal sperm. It is possible to use an external field for manipulation and sorting. Depending on their morphology, the extent of response varies. Due to the wide distribution in sperm morphology even among individuals, the resulting distribution of kinematic behaviour, and consequently the feasibility of sorting, should be analysed statistically. In this theoretical work, Resistive Force Theory and Slender Body Theory will be applied and compared.

  20. Scalability of Several Asynchronous Many-Task Models for In Situ Statistical Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre; Bennett, Janine Camille; Kolla, Hemanth

This report is a sequel to [PB16], in which we provided a first progress report on research and development towards a scalable, asynchronous many-task, in situ statistical analysis engine using the Legion runtime system. This earlier work included a prototype implementation of a proposed solution, using a proxy mini-application as a surrogate for a full-scale scientific simulation code. The first scalability studies were conducted with the above on modestly-sized experimental clusters. In contrast, in the current work we have integrated our in situ analysis engines with a full-size scientific application (S3D, using the Legion-SPMD model), and have conducted numerical tests on the largest computational platform currently available for DOE science applications. We also provide details regarding the design and development of a light-weight asynchronous collectives library. We describe how this library is utilized within our SPMD-Legion S3D workflow, and compare the data aggregation technique deployed herein to the approach taken within our previous work.

  1. Principal component analysis of normalized full spectrum mass spectrometry data in multiMS-toolbox: An effective tool to identify important factors for classification of different metabolic patterns and bacterial strains.

    PubMed

    Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora

    2018-06-15

Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for application of principal component analysis (PCA) in mass spectrometry and focused on two whole spectrum based normalization techniques and their application in the analysis of registered peak data and, in comparison, in full spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities, the ms-alone, a python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis, were implemented. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex speed MALDI-TOF mass spectrometer. For three tested cultivation media only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied on data normalized by two different normalization techniques. Results from matched peak data and subsequent detailed full spectrum analysis identified only two different metabolic patterns - a cultivation on Enterobacter sakazakii Isolation Agar showed significant differences to the cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved the dependence on cultivation time. Both whole spectrum based normalization techniques together with the full spectrum PCA allow identification of important discriminative factors in experiments with several variable condition factors, avoiding any problems with improper identification of peaks or emphasis on below-threshold peak data. The amounts of processed data remain still manageable. Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms.
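
    A minimal sketch of the described workflow, assuming synthetic spectra: whole-spectrum normalization (total-ion-current scaling, one common choice) followed by PCA. The real ms-alone/multiMS-toolbox utilities automate this on measured data.

```python
# TIC normalization of full spectra followed by PCA; rows are spectra,
# columns are m/z bins. The spectra matrix and group structure are fake.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
spectra = rng.poisson(5, size=(30, 2000)).astype(float)
spectra[:15, 100:110] += 40.0         # fake pattern separating two groups

tic = spectra.sum(axis=1, keepdims=True)
norm = spectra / tic                  # whole-spectrum TIC normalization

scores = PCA(n_components=2).fit_transform(norm)
print("PC1, group A:", scores[:15, 0].round(4))
print("PC1, group B:", scores[15:, 0].round(4))
```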

  2. Comparative analysis of positive and negative attitudes toward statistics

    NASA Astrophysics Data System (ADS)

    Ghulami, Hassan Rahnaward; Ab Hamid, Mohd Rashid; Zakaria, Roslinazairimah

    2015-02-01

Many statistics lecturers and statistics education researchers are interested in their students' attitudes toward statistics during the statistics course. A positive attitude toward statistics is vital because it encourages students to take an interest in the course and to master its core content. Students who have negative attitudes toward statistics, by contrast, may feel depressed, especially in group assignments, be at risk of failure, be highly emotional, and struggle to move forward. Therefore, this study investigates students' attitudes towards learning statistics. Six latent constructs measure students' attitudes toward learning statistics: affect, cognitive competence, value, difficulty, interest, and effort. The questionnaire was adapted from the reliable and validated Survey of Attitudes Towards Statistics (SATS) instrument. This study was conducted among undergraduate engineering students at Universiti Malaysia Pahang (UMP). The respondents were students from different faculties taking the applied statistics course. The analysis found the questionnaire acceptable, and relationships among the constructs were proposed and investigated. Students showed full effort to master the statistics course, found it enjoyable, were confident in their intellectual capacity, and held more positive than negative attitudes toward learning statistics. In conclusion, positive attitudes were mostly exhibited in the affect, cognitive competence, value, interest, and effort constructs, while negative attitudes were mostly exhibited in the difficulty construct.

  3. Comparison of Accuracy Between a Conventional and Two Digital Intraoral Impression Techniques.

    PubMed

    Malik, Junaid; Rodriguez, Jose; Weisbloom, Michael; Petridis, Haralampos

    To compare the accuracy (ie, precision and trueness) of full-arch impressions fabricated using either a conventional polyvinyl siloxane (PVS) material or one of two intraoral optical scanners. Full-arch impressions of a reference model were obtained using addition silicone impression material (Aquasil Ultra; Dentsply Caulk) and two optical scanners (Trios, 3Shape, and CEREC Omnicam, Sirona). Surface matching software (Geomagic Control, 3D Systems) was used to superimpose the scans within groups to determine the mean deviations in precision and trueness (μm) between the scans, which were calculated for each group and compared statistically using one-way analysis of variance with post hoc Bonferroni (trueness) and Games-Howell (precision) tests (IBM SPSS ver 24, IBM UK). Qualitative analysis was also carried out from three-dimensional maps of differences between scans. Means and standard deviations (SD) of deviations in precision for conventional, Trios, and Omnicam groups were 21.7 (± 5.4), 49.9 (± 18.3), and 36.5 (± 11.12) μm, respectively. Means and SDs for deviations in trueness were 24.3 (± 5.7), 87.1 (± 7.9), and 80.3 (± 12.1) μm, respectively. The conventional impression showed statistically significantly improved mean precision (P < .006) and mean trueness (P < .001) compared to both digital impression procedures. There were no statistically significant differences in precision (P = .153) or trueness (P = .757) between the digital impressions. The qualitative analysis revealed local deviations along the palatal surfaces of the molars and incisal edges of the anterior teeth of < 100 μm. Conventional full-arch PVS impressions exhibited improved mean accuracy compared to two direct optical scanners. No significant differences were found between the two digital impression methods.
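
    The group comparison reported above can be sketched with a one-way ANOVA on per-scan deviation values; the samples below are drawn to match the abstract's means and SDs rather than the measured scans, and the post hoc Bonferroni and Games-Howell steps are omitted.

```python
# One-way ANOVA across three impression methods on simulated deviations.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(24)
conventional = rng.normal(21.7, 5.4, 10)   # precision deviations, um
trios = rng.normal(49.9, 18.3, 10)
omnicam = rng.normal(36.5, 11.1, 10)

F, p = f_oneway(conventional, trios, omnicam)
print(f"F = {F:.2f}, p = {p:.4f}")
```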

  4. Methods for evaluating temporal groundwater quality data and results of decadal-scale changes in chloride, dissolved solids, and nitrate concentrations in groundwater in the United States, 1988-2010

    USGS Publications Warehouse

    Lindsey, Bruce D.; Rupert, Michael G.

    2012-01-01

    Decadal-scale changes in groundwater quality were evaluated by the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program. Samples of groundwater collected from wells during 1988-2000 - a first sampling event representing the decade ending the 20th century - were compared on a pair-wise basis to samples from the same wells collected during 2001-2010 - a second sampling event representing the decade beginning the 21st century. The data set consists of samples from 1,236 wells in 56 well networks, representing major aquifers and urban and agricultural land-use areas, with analytical results for chloride, dissolved solids, and nitrate. Statistical analysis was done on a network basis rather than by individual wells. Although spanning slightly more or less than a 10-year period, the two-sample comparison between the first and second sampling events is referred to as an analysis of decadal-scale change based on a step-trend analysis. The 22 principal aquifers represented by these 56 networks account for nearly 80 percent of the estimated withdrawals of groundwater used for drinking-water supply in the Nation. Well networks where decadal-scale changes in concentrations were statistically significant were identified using the Wilcoxon-Pratt signed-rank test. For the statistical analysis of chloride, dissolved solids, and nitrate concentrations at the network level, more than half revealed no statistically significant change over the decadal period. However, for networks that had statistically significant changes, increased concentrations outnumbered decreased concentrations by a large margin. Statistically significant increases of chloride concentrations were identified for 43 percent of 56 networks. Dissolved solids concentrations increased significantly in 41 percent of the 54 networks with dissolved solids data, and nitrate concentrations increased significantly in 23 percent of 56 networks. At least one of the three - chloride, dissolved solids, or nitrate - had a statistically significant increase in concentration in 66 percent of the networks. Statistically significant decreases in concentrations were identified in 4 percent of the networks for chloride, 2 percent of the networks for dissolved solids, and 9 percent of the networks for nitrate. A larger percentage of urban land-use networks had statistically significant increases in chloride, dissolved solids, and nitrate concentrations than agricultural land-use networks. In order to assess the magnitude of statistically significant changes, the median of the differences between constituent concentrations from the first full-network sampling event and those from the second full-network sampling event was calculated using the Turnbull method. The largest median decadal increases in chloride concentrations were in networks in the Upper Illinois River Basin (67 mg/L) and in the New England Coastal Basins (34 mg/L), whereas the largest median decadal decrease in chloride concentrations was in the Upper Snake River Basin (1 mg/L). The largest median decadal increases in dissolved solids concentrations were in networks in the Rio Grande Valley (260 mg/L) and the Upper Illinois River Basin (160 mg/L). The largest median decadal decrease in dissolved solids concentrations was in the Apalachicola-Chattahoochee-Flint River Basin (6.0 mg/L). 
The largest median decadal increases in nitrate as nitrogen (N) concentrations were in networks in the South Platte River Basin (2.0 mg/L as N) and the San Joaquin-Tulare Basins (1.0 mg/L as N). The largest median decadal decrease in nitrate concentrations was in the Santee River Basin and Coastal Drainages (0.63 mg/L). The magnitude of change in networks with statistically significant increases typically was much larger than the magnitude of change in networks with statistically significant decreases. The magnitude of change was greatest for chloride in the urban land-use networks and greatest for dissolved solids and nitrate in the agricultural land-use networks. Analysis of data from all networks combined indicated statistically significant increases for chloride, dissolved solids, and nitrate. Although chloride, dissolved solids, and nitrate concentrations were typically less than the drinking-water standards and guidelines, a statistical test was used to determine whether or not the proportion of samples exceeding the drinking-water standard or guideline changed significantly between the first and second full-network sampling events. The proportion of samples exceeding the U.S. Environmental Protection Agency (USEPA) Secondary Maximum Contaminant Level for dissolved solids (500 milligrams per liter) increased significantly between the first and second full-network sampling events when evaluating all networks combined at the national level. Also, for all networks combined, the proportion of samples exceeding the USEPA Maximum Contaminant Level (MCL) of 10 mg/L as N for nitrate increased significantly. One network in the Delmarva Peninsula had a significant increase in the proportion of samples exceeding the MCL for nitrate. A subset of 261 wells was sampled every other year (biennially) to evaluate decadal-scale changes using a time-series analysis. The analysis of the biennial data set showed that changes were generally similar to the findings from the analysis of decadal-scale change that was based on a step-trend analysis. Because of the small number of wells in a network with biennial data (typically 4-5 wells), the time-series analysis is more useful for understanding water-quality responses to changes in site-specific conditions rather than as an indicator of the change for the entire network.
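
    A minimal sketch of the network-level step-trend test described above, assuming paired concentrations from the same wells in the two decadal sampling events. The data values and well count are illustrative, and the plain median shown here stands in for the Turnbull estimator the report uses for the median difference.

        # Sketch: network-level step-trend test (Wilcoxon-Pratt signed-rank)
        # on paired samples from the same wells in two decades.
        # The concentrations below are synthetic, not USGS data.
        import numpy as np
        from scipy.stats import wilcoxon

        rng = np.random.default_rng(0)
        first_decade = rng.lognormal(mean=3.0, sigma=0.5, size=25)    # mg/L, 1988-2000
        second_decade = first_decade * rng.lognormal(0.1, 0.2, 25)    # mg/L, 2001-2010

        # zero_method='pratt' keeps zero differences in the ranking,
        # matching the Wilcoxon-Pratt variant named in the report.
        stat, p = wilcoxon(first_decade, second_decade, zero_method="pratt")
        median_change = np.median(second_decade - first_decade)  # report uses Turnbull
        print(f"p = {p:.3f}, median decadal change = {median_change:.1f} mg/L")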

  5. Cochlear Implant Electrode Array From Partial to Full Insertion in Non-Human Primate Model.

    PubMed

    Manrique-Huarte, Raquel; Calavia, Diego; Gallego, Maria Antonia; Manrique, Manuel

    2018-04-01

    To determine the feasibility of progressive insertion (two sequential surgeries: partial to full insertion) of an electrode array and to compare functional outcomes. 8 normal-hearing animals (Macaca fascicularis (MF)) were included. A 14-contact electrode array, suitably sized for the MF cochlea, was partially inserted (PI) in 16 ears. After 3 months of follow-up, revision surgery advanced the electrode to a full insertion (FI) in 8 ears. Radiological examination and auditory testing were performed monthly for 6 months. To compare the values, a two-way repeated-measures ANOVA was used. A p-value below 0.05 was considered statistically significant. IBM SPSS Statistics V20 was used. The surgical procedure was completed in all cases with no complications. Mean auditory threshold shift (ABR click tones) after 6 months of follow-up was 19 dB for the PI group and 27 dB for the FI group. For frequencies 4, 6, 8, 12, and 16 kHz in the FI group, tone-burst auditory thresholds increased after the revision surgery, showing no recovery thereafter. Mean threshold shift at 6 months of follow-up was 19.8 dB (range 2 to 36 dB) for the PI group and 33.14 dB (range 8 to 48 dB) for the FI group. Statistical analysis yields no significant differences between groups. It is feasible to perform a partial insertion of an electrode array and to advance it, in a second surgical procedure, to a full insertion (up to 270°). Hearing preservation is feasible for both procedures. Note that a minimal threshold deterioration is observed in the full insertion group, especially at high frequencies, with no statistical differences.
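
    A short sketch of the two-way repeated-measures ANOVA used above. The factor layout, column names, and synthetic threshold shifts are assumptions for illustration, not the study's data.

        # Sketch: two-way repeated-measures ANOVA on auditory threshold shifts,
        # with hypothetical within-subject factors 'insertion' (PI/FI) and
        # 'month' (1-6). All values are synthetic.
        import numpy as np
        import pandas as pd
        from statsmodels.stats.anova import AnovaRM

        rng = np.random.default_rng(1)
        rows = []
        for ear in range(8):
            for insertion in ("PI", "FI"):
                for month in range(1, 7):
                    shift = (20 if insertion == "PI" else 28) + rng.normal(0, 5)
                    rows.append({"ear": ear, "insertion": insertion,
                                 "month": month, "shift_db": shift})
        df = pd.DataFrame(rows)

        res = AnovaRM(df, depvar="shift_db", subject="ear",
                      within=["insertion", "month"]).fit()
        print(res.anova_table)  # F tests for insertion, month, interaction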

  6. Association of Chairmen of Departments of Physiology Analysis of Annual Questionnaire--1983/84.

    ERIC Educational Resources Information Center

    Physiologist, 1984

    1984-01-01

    Presents the full questionnaire sent to chairmen of physiology departments, with statistical data (grand totals and means per department) provided for each item on the questionnaire. Includes histograms of faculty salaries and data on departmental budgets and space. (JN)

  7. Lehrer in der Bundesrepublik Deutschland. Eine Kritische Analyse Statistischer Daten über das Lehrpersonal an Allgemeinbildenden Schulen. (Teachers in the Federal Republic of Germany. A Critical Analysis of Statistical Data on Teachers in Schools of General Education.)

    ERIC Educational Resources Information Center

    Kohler, Helmut

    The purpose of this study was to analyze the available statistics concerning teachers in schools of general education in the Federal Republic of Germany. An analysis of the demographic structure of the pool of full-time teachers showed that in 1971, 30 percent of the teachers were under age 30 and 50 percent were under age 35. It was expected that…

  8. Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review.

    PubMed

    Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C

    2018-03-07

    Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
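
    The review's better-performing approximations can be illustrated with two small helper functions. The specific formulas (SD estimated as range/4; mean estimated as the average of lower quartile, median, and upper quartile) are common conventions assumed here for illustration, not necessarily the exact formulas the review recommends.

        # Sketch: simple imputations for missing summary statistics in a
        # meta-analysis, of the kind discussed above. Formulas are assumed
        # conventions (range/4; quartile average), shown for illustration.
        def sd_from_range(minimum: float, maximum: float) -> float:
            """Approximate the SD as range/4 (a common rule of thumb)."""
            return (maximum - minimum) / 4.0

        def mean_from_quartiles(q1: float, median: float, q3: float) -> float:
            """Approximate the mean from the median and quartiles."""
            return (q1 + median + q3) / 3.0

        print(sd_from_range(2.0, 18.0))              # -> 4.0
        print(mean_from_quartiles(4.0, 7.0, 13.0))   # -> 8.0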

  9. Mass detection, localization and estimation for wind turbine blades based on statistical pattern recognition

    NASA Astrophysics Data System (ADS)

    Colone, L.; Hovgaard, M. K.; Glavind, L.; Brincker, R.

    2018-07-01

    A method for mass change detection on wind turbine blades using natural frequencies is presented. The approach is based on two statistical tests. The first test decides if there is a significant mass change and the second test is a statistical group classification based on Linear Discriminant Analysis. The frequencies are identified by means of Operational Modal Analysis using natural excitation. Based on the assumption of Gaussianity of the frequencies, a multi-class statistical model is developed by combining finite element model sensitivities in 10 classes of change location on the blade, the smallest area being 1/5 of the span. The method is experimentally validated for a full scale wind turbine blade in a test setup and loaded by natural wind. Mass change from natural causes was imitated with sand bags and the algorithm was observed to perform well with an experimental detection rate of 1, localization rate of 0.88 and mass estimation rate of 0.72.
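
    A sketch of the second-stage classification step, assuming each observation is a vector of identified natural-frequency shifts and each class is one of the 10 blade regions. The synthetic class centers stand in for the finite-element model sensitivities; sizes and noise levels are illustrative.

        # Sketch: localizing a mass change by Linear Discriminant Analysis
        # over classes of change location, as described above. Data synthetic.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(2)
        n_classes, n_freqs = 10, 6               # 10 span regions, 6 tracked modes
        centers = rng.normal(0, 1, (n_classes, n_freqs))   # stand-in FE sensitivities
        X = np.vstack([c + rng.normal(0, 0.2, (30, n_freqs)) for c in centers])
        y = np.repeat(np.arange(n_classes), 30)

        lda = LinearDiscriminantAnalysis().fit(X, y)
        new_obs = centers[3] + rng.normal(0, 0.2, n_freqs)  # simulated sand bag
        print("predicted mass-change region:", lda.predict([new_obs])[0])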

  10. Impact of Satellite Viewing-Swath Width on Global and Regional Aerosol Optical Thickness Statistics and Trends

    NASA Technical Reports Server (NTRS)

    Colarco, P. R.; Kahn, R. A.; Remer, L. A.; Levy, R. C.

    2014-01-01

    We use the Moderate Resolution Imaging Spectroradiometer (MODIS) satellite aerosol optical thickness (AOT) product to assess the impact of reduced swath width on global and regional AOT statistics and trends. Along-track and across-track sampling strategies are employed, in which the full MODIS data set is sub-sampled with various narrow-swath (approximately 400-800 km) and single-pixel-width (approximately 10 km) configurations. Although view-angle artifacts in the MODIS AOT retrieval confound direct comparisons between averages derived from different sub-samples, careful analysis shows that with many portions of the Earth essentially unobserved, spatial sampling introduces uncertainty in the derived seasonal-regional mean AOT. These AOT spatial sampling artifacts comprise up to 60% of the full-swath AOT value under moderate aerosol loading, and can be as large as 0.1 in some regions under high aerosol loading. Compared to full-swath observations, narrower-swath and single-pixel-width sampling exhibits a reduced ability to detect AOT trends with statistical significance. On the other hand, estimates of the global, annual mean AOT do not vary significantly from the full-swath values as spatial sampling is reduced. Aggregation of the MODIS data at coarse grid scales (10 deg) shows consistency in the aerosol trends across sampling strategies, with increased statistical confidence, but quantitative errors in the derived trends are found even for the full-swath data when compared to high-spatial-resolution (0.5 deg) aggregations. Using results of a model-derived aerosol reanalysis, we find consistency in our conclusions about a seasonal-regional spatial sampling artifact in AOT. Furthermore, the model shows that reduced spatial sampling can amount to uncertainty in computed shortwave top-of-atmosphere aerosol radiative forcing of 2-3 W m^-2. These artifacts are lower bounds, as possibly other unconsidered sampling strategies would perform less well. These results suggest that future aerosol satellite missions having significantly less than full-swath viewing are unlikely to sample the true AOT distribution well enough to obtain the statistics needed to reduce uncertainty in aerosol direct forcing of climate.
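
    The core of the sub-sampling experiment can be sketched as follows, with a synthetic AOT field carrying a longitudinal gradient and an off-center 400-column strip standing in for a narrow swath. The geometry and values are illustrative assumptions, not the MODIS processing.

        # Sketch: how narrowing the sampled swath biases a regional-mean AOT.
        import numpy as np

        rng = np.random.default_rng(3)
        # synthetic AOT field (rows = along-track, cols across a ~2330 km swath)
        # with a cross-swath gradient so that where you sample matters
        gradient = np.linspace(0.0, 0.3, 2300)[None, :]
        aot = 0.1 + gradient + rng.gamma(2.0, 0.05, size=(500, 2300))

        full_mean = aot.mean()
        narrow_mean = aot[:, :400].mean()   # an off-center ~400-column strip
        artifact_pct = 100 * abs(narrow_mean - full_mean) / full_mean
        print(f"full {full_mean:.3f}, narrow {narrow_mean:.3f}, artifact {artifact_pct:.0f}%")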

  11. A primer on the study of transitory dynamics in ecological series using the scale-dependent correlation analysis.

    PubMed

    Rodríguez-Arias, Miquel Angel; Rodó, Xavier

    2004-03-01

    Here we describe a practical, step-by-step primer to scale-dependent correlation (SDC) analysis. The analysis of transitory processes is an important but often neglected topic in ecological studies because only a few statistical techniques detect temporary features accurately enough. We introduce here the SDC analysis, a statistical and graphical method to study transitory processes at any temporal or spatial scale. SDC analysis, thanks to the combination of conventional procedures and simple, well-known statistical techniques, becomes an improved time-domain analogue of wavelet analysis. We use several simple synthetic series to describe the method, a more complex example, full of transitory features, to compare SDC and wavelet analysis, and finally we analyze some selected ecological series to illustrate the methodology. The SDC analysis of time series of copepod abundances in the North Sea indicates that ENSO is the primary climatic driver of short-term changes in population dynamics. SDC also uncovers some long-term, unexpected features in the population. Similarly, the SDC analysis of Nicholson's blowflies data locates where the proposed models fail and provides new insights about the mechanism that drives the apparent vanishing of the population cycle during the second half of the series.
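
    A simplified two-series version of the SDC idea, correlating all pairs of length-s fragments of two series; this follows the general description above, not the authors' implementation, and the series are synthetic.

        # Sketch: a basic scale-dependent correlation (SDC) surface.
        import numpy as np

        def sdc(x, y, s):
            """Return matrix r[i, j] = corr(x[i:i+s], y[j:j+s])."""
            n = len(x) - s + 1
            r = np.full((n, n), np.nan)
            for i in range(n):
                for j in range(n):
                    r[i, j] = np.corrcoef(x[i:i+s], y[j:j+s])[0, 1]
            return r

        t = np.arange(200)
        x = np.sin(2 * np.pi * t / 24)
        y = np.roll(x, 5) + np.random.default_rng(4).normal(0, 0.3, 200)
        surface = sdc(x, y, s=24)
        # strong |r| away from the main diagonal flags lagged, transitory coupling
        print(np.nanmax(np.abs(surface)))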

  12. NEUROBEHAVIORAL EVALUATIONS OF BINARY AND TERTIARY MIXTURES OF CHEMICALS: LESSONS LEARNED.

    EPA Science Inventory

    The classical approach to the statistical analysis of binary chemical mixtures is to construct full dose-response curves for one compound in the presence of a range of doses of the second compound (isobolographic analyses). For interaction studies using more than two chemicals, ...

  13. Statistical properties of the ice particle distribution in stratiform clouds

    NASA Astrophysics Data System (ADS)

    Delanoe, J.; Tinel, C.; Testud, J.

    2003-04-01

    This paper presents an extensive analysis of several microphysical data bases (CEPEX, EUCREX, CLARE and CARL) to determine statistical properties of the Particle Size Distribution (PSD). The data base covers different types of stratiform clouds: tropical cirrus (CEPEX), mid-latitude cirrus (EUCREX), and mid-latitude cirrus and stratus (CARL, CLARE). The approach for analysis uses the concept of normalisation of the PSD developed by Testud et al. (2001). The normalisation aims at isolating three independent characteristics of the PSD: its "intrinsic" shape, the "average size" of the spectrum, and the ice water content IWC; by "average size" is meant the mean mass-weighted diameter. It is shown that concentration should be normalised by N_0^* proportional to IWC/D_m^4. The "intrinsic" shape is defined as F(Deq/D_m) = N(Deq)/N_0^*, where Deq is the equivalent melted diameter. The "intrinsic" shape is found to be very stable in the range 0 < Deq/D_m < 1.5; for Deq/D_m > 1.5, more scatter is observed, and future analysis should decide if it is representative of real physical variation or statistical "error" due to counting problems. Considering the overall statistics over the full data base, a large scatter of the N_0^* against D_m plot is found. But in the case of a particular event or a particular leg of a flight, the N_0^* vs. D_m plot is much less scattered and shows a systematic trend for decay of N_0^* as D_m increases. This trend is interpreted as the manifestation of the predominance of the aggregation process. Finally, an important point for cloud remote sensing is investigated: the normalised relationship of IWC/N_0^* against Z/N_0^* is much less scattered than the classical relationship of IWC against Z, the radar reflectivity factor.
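
    A sketch of the normalisation on a synthetic spectrum, following the Testud et al. (2001) formulation (D_m as the ratio of fourth to third moments, N_0^* proportional to IWC/D_m^4). The bin layout and the exponential spectrum are assumptions for illustration.

        # Sketch: PSD normalisation as described above, on synthetic data.
        import numpy as np

        d = np.linspace(0.1e-3, 8e-3, 80)          # melted-equivalent diameter (m)
        dd = d[1] - d[0]
        n = 8e6 * np.exp(-1.5e3 * d)               # N(Deq), synthetic spectrum

        m3 = np.sum(n * d**3 * dd)                 # 3rd moment, proportional to IWC
        m4 = np.sum(n * d**4 * dd)
        dm = m4 / m3                               # mean mass-weighted diameter
        n0_star = (4.0**4 / 6.0) * m3**5 / m4**4   # intercept ~ IWC / Dm^4

        shape = n / n0_star                        # "intrinsic" shape F(Deq/Dm)
        print(f"Dm = {dm*1e3:.2f} mm, N0* = {n0_star:.3e} m^-4")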

  14. Psychology, Science, and Knowledge Construction: Broadening Perspectives from the Replication Crisis.

    PubMed

    Shrout, Patrick E; Rodgers, Joseph L

    2018-01-04

    Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.

  15. Baseline estimation in flame's spectra by using neural networks and robust statistics

    NASA Astrophysics Data System (ADS)

    Garces, Hugo; Arias, Luis; Rojas, Alejandro

    2014-09-01

    This work presents a baseline estimation method for flame spectra based on an artificial intelligence structure, a neural network, combining robust statistics with multivariate analysis to automatically discriminate measured wavelengths belonging to the continuous (baseline) feature for model adaptation, removing the restriction of having to measure the target baseline for training. The main contributions of this paper are: to analyze a flame spectra database by computing Jolliffe statistics from Principal Components Analysis, detecting wavelengths not correlated with most of the measured data and therefore corresponding to baseline; to systematically determine the optimal number of neurons in hidden layers based on Akaike's Final Prediction Error; to estimate the baseline over the full wavelength range of the sampled measured spectra; and to train a neural network that generalizes the relation between measured and baseline spectra. The main application of our research is to compute total radiation with baseline information, allowing diagnosis of the combustion process state for optimization in early stages.

  16. Student Participation in Dual Enrollment and College Success

    ERIC Educational Resources Information Center

    Jones, Stephanie J.

    2014-01-01

    The study investigated the impact of dual enrollment participation on the academic preparation of first-year full-time college students at a large comprehensive community college and a large research university. The research design was causal-comparative and utilized descriptive and inferential statistics. Multivariate analysis of variances were…

  17. The Learning Organization Model across Vocational and Academic Teacher Groups

    ERIC Educational Resources Information Center

    Park, Joo Ho; Rojewski, Jay W.

    2006-01-01

    Multiple-group confirmatory factor analysis was used to investigate factorial invariance between vocational and academic teacher groups on a measure of the learning organization concept. Participants were 488 full-time teachers of public trade industry-technical and business schools located within Seoul, South Korea. Statistically significant…

  18. Predicting and downscaling ENSO impacts on intraseasonal precipitation statistics in California: The 1997/98 event

    USGS Publications Warehouse

    Gershunov, A.; Barnett, T.P.; Cayan, D.R.; Tubbs, T.; Goddard, L.

    2000-01-01

    Three long-range forecasting methods have been evaluated for prediction and downscaling of seasonal and intraseasonal precipitation statistics in California. Full-statistical, hybrid dynamical-statistical, and full-dynamical approaches have been used to forecast El Niño-Southern Oscillation (ENSO)-related total precipitation, daily precipitation frequency, and average intensity anomalies during the January-March season. For El Niño winters, the hybrid approach emerges as the best performer, while La Niña forecasting skill is poor. The full-statistical forecasting method features reasonable forecasting skill for both La Niña and El Niño winters. The performance of the full-dynamical approach could not be evaluated as rigorously as that of the other two forecasting schemes. Although the full-dynamical forecasting approach is expected to outperform simpler forecasting schemes in the long run, evidence is presented to conclude that, at present, the full-dynamical forecasting approach is the least viable of the three, at least in California. The authors suggest that operational forecasting of any intraseasonal temperature, precipitation, or streamflow statistic derivable from the available records is possible now for ENSO-extreme years.

  19. Statistical testing of the full-range leadership theory in nursing.

    PubMed

    Kanste, Outi; Kääriäinen, Maria; Kyngäs, Helvi

    2009-12-01

    The aim of this study is to test statistically the structure of the full-range leadership theory in nursing. The data were gathered by postal questionnaires from nurses and nurse leaders working in healthcare organizations in Finland. A follow-up study was performed 1 year later. The sample consisted of 601 nurses and nurse leaders, and the follow-up study had 78 respondents. The theory was tested through structural equation modelling, standard regression analysis and two-way ANOVA. Rewarding transformational leadership seems to promote, and passive laissez-faire leadership to reduce, willingness to exert extra effort, perceptions of leader effectiveness and satisfaction with the leader. Active management-by-exception seems to reduce willingness to exert extra effort and perception of leader effectiveness. Rewarding transformational leadership remained a strong explanatory factor for all outcome variables measured 1 year later. The data supported the main structure of the full-range leadership theory, lending support to the universal nature of the theory.

  20. Four modes of optical parametric operation for squeezed state generation

    NASA Astrophysics Data System (ADS)

    Andersen, U. L.; Buchler, B. C.; Lam, P. K.; Wu, J. W.; Gao, J. R.; Bachor, H.-A.

    2003-11-01

    We report a versatile instrument, based on a monolithic optical parametric amplifier, which reliably generates four different types of squeezed light. We obtained vacuum squeezing, low power amplitude squeezing, phase squeezing and bright amplitude squeezing. We show a complete analysis of this light, including a full quantum state tomography. In addition we demonstrate the direct detection of the squeezed state statistics without the aid of a spectrum analyser. This technique makes the nonclassical properties directly visible and allows complete measurement of the statistical moments of the squeezed quadrature.

  1. Discrepancies Between Plastic Surgery Meeting Abstracts and Subsequent Full-Length Manuscript Publications.

    PubMed

    Denadai, Rafael; Araujo, Gustavo Henrique; Pinho, Andre Silveira; Denadai, Rodrigo; Samartine, Hugo; Raposo-Amaral, Cassio Eduardo

    2016-10-01

    The purpose of this bibliometric study was to assess the discrepancies between plastic surgery meeting abstracts and subsequent full-length manuscript publications. Abstracts presented at the Brazilian Congress of Plastic Surgery from 2010 to 2011 were compared with matching manuscript publications. Discrepancies between the abstract and the subsequent manuscript were categorized as major (changes in the purpose, methods, study design, sample size, statistical analysis, results, and conclusions) and minor (changes in the title and authorship) variations. The overall discrepancy rate was 96%, with at least one major (76%) and/or minor (96%) variation. There were inconsistencies between the study title (56%), authorship (92%), purpose (6%), methods (20%), study design (36%), sample size (51.2%), statistical analysis (14%), results (20%), and conclusions (8%) of manuscripts compared with their corresponding meeting abstracts. As changes occur before manuscript publication of plastic surgery meeting abstracts, caution should be exercised in referencing abstracts or altering surgical practices based on abstracts' content. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors at www.springer.com/00266.

  2. Full Counting Statistics for Interacting Fermions with Determinantal Quantum Monte Carlo Simulations.

    PubMed

    Humeniuk, Stephan; Büchler, Hans Peter

    2017-12-08

    We present a method for computing the full probability distribution function of quadratic observables such as particle number or magnetization for the Fermi-Hubbard model within the framework of determinantal quantum Monte Carlo calculations. Especially in cold atom experiments with single-site resolution, such a full counting statistics can be obtained from repeated projective measurements. We demonstrate that the full counting statistics can provide important information on the size of preformed pairs. Furthermore, we compute the full counting statistics of the staggered magnetization in the repulsive Hubbard model at half filling and find excellent agreement with recent experimental results. We show that current experiments are capable of probing the difference between the Hubbard model and the limiting Heisenberg model.

  3. Non-Markovian full counting statistics in quantum dot molecules

    PubMed Central

    Xue, Hai-Bin; Jiao, Hu-Jun; Liang, Jiu-Qing; Liu, Wu-Ming

    2015-01-01

    Full counting statistics of electron transport is a powerful diagnostic tool for probing the nature of quantum transport beyond what is obtainable from the average current or conductance measurement alone. In particular, the non-Markovian dynamics of quantum dot molecule plays an important role in the nonequilibrium electron tunneling processes. It is thus necessary to understand the non-Markovian full counting statistics in a quantum dot molecule. Here we study the non-Markovian full counting statistics in two typical quantum dot molecules, namely, serially coupled and side-coupled double quantum dots with high quantum coherence in a certain parameter regime. We demonstrate that the non-Markovian effect manifests itself through the quantum coherence of the quantum dot molecule system, and has a significant impact on the full counting statistics in the high quantum-coherent quantum dot molecule system, which depends on the coupling of the quantum dot molecule system with the source and drain electrodes. The results indicated that the influence of the non-Markovian effect on the full counting statistics of electron transport, which should be considered in a high quantum-coherent quantum dot molecule system, can provide a better understanding of electron transport through quantum dot molecules. PMID:25752245

  4. Statistical Analysis of Small-Scale Magnetic Flux Emergence Patterns: A Useful Subsurface Diagnostic?

    NASA Astrophysics Data System (ADS)

    Lamb, Derek A.

    2016-10-01

    While sunspots follow a well-defined pattern of emergence in space and time, small-scale flux emergence is assumed to occur randomly at all times in the quiet Sun. HMI's full-disk coverage, high cadence, spatial resolution, and duty cycle allow us to probe that basic assumption. Some case studies of emergence suggest that temporal clustering on spatial scales of 50-150 Mm may occur. If clustering is present, it could serve as a diagnostic of large-scale subsurface magnetic field structures. We present the results of a manual survey of small-scale flux emergence events over a short time period, and a statistical analysis addressing the question of whether these events show spatio-temporal behavior that is anything other than random.

  5. Negotiated Wages and Working Conditions in Ontario Hospitals: 1973.

    ERIC Educational Resources Information Center

    Ontario Dept. of Labour, Toronto. Research Branch.

    This report is a statistical analysis of provisions in collective agreements covering approximately 38,000 full-time employees in 156 hospitals in the Province of Ontario. Part 1 consists of 56 tables giving information on the geographical distribution of hospital contracts, the unions that are party to them, their duration, and the sizes and…

  6. Local image statistics: maximum-entropy constructions and perceptual salience

    PubMed Central

    Victor, Jonathan D.; Conte, Mary M.

    2012-01-01

    The space of visual signals is high-dimensional and natural visual images have a highly complex statistical structure. While many studies suggest that only a limited number of image statistics are used for perceptual judgments, a full understanding of visual function requires analysis not only of the impact of individual image statistics, but also, how they interact. In natural images, these statistical elements (luminance distributions, correlations of low and high order, edges, occlusions, etc.) are intermixed, and their effects are difficult to disentangle. Thus, there is a need for construction of stimuli in which one or more statistical elements are introduced in a controlled fashion, so that their individual and joint contributions can be analyzed. With this as motivation, we present algorithms to construct synthetic images in which local image statistics—including luminance distributions, pair-wise correlations, and higher-order correlations—are explicitly specified and all other statistics are determined implicitly by maximum-entropy. We then apply this approach to measure the sensitivity of the human visual system to local image statistics and to sample their interactions. PMID:22751397
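
    One narrow instance of such a controlled construction: a binary texture whose horizontal pair correlation is set exactly while everything else stays as random as possible (a row-wise two-state Markov chain, whose stay-probability p gives an adjacent-pixel correlation of 2p - 1). The paper's algorithms are far more general; this is only an assumed, minimal analogue.

        # Sketch: binary texture with a specified horizontal pair correlation.
        import numpy as np

        def binary_texture(rows, cols, pair_corr, seed=0):
            rng = np.random.default_rng(seed)
            p_stay = (1 + pair_corr) / 2          # corr = 2 * p_stay - 1
            img = np.empty((rows, cols), dtype=int)
            img[:, 0] = rng.integers(0, 2, rows)  # uniform first column
            for j in range(1, cols):
                stay = rng.random(rows) < p_stay
                img[:, j] = np.where(stay, img[:, j - 1], 1 - img[:, j - 1])
            return img

        img = binary_texture(256, 256, pair_corr=0.4)
        x = 2 * img - 1                            # map {0,1} -> {-1,+1}
        print(np.mean(x[:, :-1] * x[:, 1:]))       # approximately 0.4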

  7. 78 FR 65318 - National Committee on Vital and Health Statistics: Meeting Full Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-31

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Committee on Vital and Health Statistics: Meeting... Health Statistics (NCVHS), Full Committee Meeting. Time and Date: November 13, 2013 9:00 a.m.-2:15 p.m..., National Center for Health Statistics, 3311 Toledo Road, Auditorium B & C, Hyattsville, Maryland 20782...

  8. 78 FR 54470 - National Committee on Vital and Health Statistics: Meeting Full Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-04

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Committee on Vital and Health Statistics: Meeting... Health Statistics (NCVHS); Full Committee Meeting. Time and Date: September 16, 2013 9:00 a.m.-2:45 p.m... Statistics, Centers for Disease Control and Prevention, 3311 Toledo Road, Room 2402, Hyattsville, Maryland...

  9. Stationary statistical theory of two-surface multipactor regarding all impacts for efficient threshold analysis

    NASA Astrophysics Data System (ADS)

    Lin, Shu; Wang, Rui; Xia, Ning; Li, Yongdong; Liu, Chunliang

    2018-01-01

    Statistical multipactor theories are critical prediction approaches for multipactor breakdown determination. However, these approaches still require a negotiation between the calculation efficiency and accuracy. This paper presents an improved stationary statistical theory for efficient threshold analysis of two-surface multipactor. A general integral equation over the distribution function of the electron emission phase with both the single-sided and double-sided impacts considered is formulated. The modeling results indicate that the improved stationary statistical theory can not only obtain equally good accuracy of multipactor threshold calculation as the nonstationary statistical theory, but also achieve high calculation efficiency concurrently. By using this improved stationary statistical theory, the total time consumption in calculating full multipactor susceptibility zones of parallel plates can be decreased by as much as a factor of four relative to the nonstationary statistical theory. It also shows that the effect of single-sided impacts is indispensable for accurate multipactor prediction of coaxial lines and also more significant for the high order multipactor. Finally, the influence of secondary emission yield (SEY) properties on the multipactor threshold is further investigated. It is observed that the first cross energy and the energy range between the first cross and the SEY maximum both play a significant role in determining the multipactor threshold, which agrees with the numerical simulation results in the literature.

  10. Part-time versus full-time occlusion therapy for treatment of amblyopia: A meta-analysis.

    PubMed

    Yazdani, Negareh; Sadeghi, Ramin; Momeni-Moghaddam, Hamed; Zarifmahmoudi, Leili; Ehsaei, Asieh; Barrett, Brendan T

    2017-06-01

    To compare full-time occlusion (FTO) and part-time occlusion (PTO) therapy in the treatment of amblyopia, with the secondary aim of evaluating the minimum number of hours of part-time patching required for maximal effect from occlusion. A literature search was performed in PubMed, Scopus, Science Direct, Ovid, Web of Science and the Cochrane library. Methodological quality of the literature was evaluated according to the Oxford Center for Evidence Based Medicine and the modified Newcastle-Ottawa scale. Statistical analyses were performed using Comprehensive Meta-Analysis (version 2, Biostat Inc., USA). The present meta-analysis included six studies [three randomized controlled trials (RCTs) and three non-RCTs]. The pooled standardized difference in the mean changes in visual acuity was 0.337 [lower and upper limits: -0.009, 0.683] higher in the FTO as compared to the PTO group; however, this difference was not statistically significant (P = 0.056, Cochrane Q value = 20.4 (P = 0.001), I² = 75.49%). Egger's regression intercept was 5.46 (P = 0.04). The pooled standardized difference in means of visual acuity changes was 1.097 [lower and upper limits: 0.68, 1.513] higher in the FTO arm (P < 0.001), and 0.7 [lower and upper limits: 0.315, 1.085] higher in the PTO arm (P < 0.001), compared to PTO of less than two hours. This meta-analysis shows no statistically significant difference between PTO and FTO in the treatment of amblyopia. However, our results suggest that the minimum effective PTO duration to observe maximal improvement in visual acuity is six hours per day.
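
    The pooling and heterogeneity statistics quoted above can be reproduced in miniature. The fixed-effect inverse-variance weighting and the six synthetic (SMD, variance) pairs below are illustrative assumptions, not the study data.

        # Sketch: inverse-variance pooling of standardized mean differences
        # plus the Cochran Q and I^2 heterogeneity summaries.
        import numpy as np

        smd = np.array([0.6, 0.1, 0.9, -0.1, 0.5, 0.2])    # per-study SMDs
        var = np.array([0.05, 0.04, 0.09, 0.06, 0.05, 0.07])

        w = 1 / var
        pooled = np.sum(w * smd) / np.sum(w)
        se = np.sqrt(1 / np.sum(w))
        q = np.sum(w * (smd - pooled) ** 2)
        df = len(smd) - 1
        i2 = max(0.0, (q - df) / q) * 100
        print(f"pooled SMD = {pooled:.3f} +/- {1.96*se:.3f}, Q = {q:.1f}, I2 = {i2:.0f}%")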

  11. [Survey and analysis of radiation safety education at radiological technology schools].

    PubMed

    Ohba, Hisateru; Ogasawara, Katsuhiko; Aburano, Tamio

    2004-10-01

    We carried out a questionnaire survey of all radiological technology schools to investigate the status of radiation safety education. The questionnaire consisted of questions concerning full-time teachers, measures being taken for the Radiation Protection Supervisor Qualifying Examination, equipment available for radiation safety education, radiation safety education for other departments, the curriculum of radiation safety education, and related problems. The returned questionnaires were analyzed according to different groups categorized by form of education and type of establishment. The overall response rate was 55%, and there were statistically significant differences in the response rates among the different forms of education. No statistically significant differences were found in the items relating to full-time teachers, measures for the Radiation Protection Supervisor Qualifying Examination, and radiation safety education for other departments, either for the form of education or the type of establishment. Queries on the equipment used for radiation safety education revealed a statistically significant difference in the availability of unsealed radioisotope facilities among the forms of education. In terms of curriculum, the percentage of radiological technology schools which dealt with neither the shielding calculation method for radiation facilities nor the control of medical waste was found to be approximately 10%. Other educational problems that were indicated included shortages of full-time teachers and of equipment for radiation safety education. In the future, in order to improve radiation safety education at radiological technology schools, we consider it necessary to develop unsealed radioisotope facilities, to appoint more full-time teachers, and to educate students about risk communication.

  12. Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS

    NASA Technical Reports Server (NTRS)

    Long, S. M.; Grosfils, E. B.

    2005-01-01

    Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted, based on pixel latitude, back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
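
    A sketch of the DN-to-backscatter conversion described above. The Muhleman-law constants are the commonly used Venus values; the linear DN-to-dB scaling and its parameters are stated assumptions here, not the exact Magellan product definition.

        # Sketch: converting image DN toward backscatter coefficients so that
        # unit statistics can be computed on physical values.
        import numpy as np

        def muhleman_sigma0(theta_rad, k1=0.0118, k2=0.111):
            """Muhleman-law backscatter for Venus at incidence angle theta."""
            c, s = np.cos(theta_rad), np.sin(theta_rad)
            return k1 * c / (s + k2 * c) ** 3

        def dn_to_sigma0(dn, theta_rad, db_per_dn=0.2, dn_zero=101):
            """Assumed linear DN-to-dB mapping about the Muhleman value."""
            rel_db = (dn - dn_zero) * db_per_dn
            return muhleman_sigma0(theta_rad) * 10 ** (rel_db / 10)

        # incidence angle would come from the pixel's latitude in practice
        print(dn_to_sigma0(dn=120, theta_rad=np.radians(35)))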

  13. Gamma-ray Full Spectrum Analysis for Environmental Radioactivity by HPGe Detector

    NASA Astrophysics Data System (ADS)

    Jeong, Meeyoung; Lee, Kyeong Beom; Kim, Kyeong Ja; Lee, Min-Kie; Han, Ju-Bong

    2014-12-01

    Odyssey, a spacecraft in NASA's Mars exploration program, and SELENE (Kaguya), a Japanese lunar orbiter, each carry a Gamma-Ray Spectrometer (GRS) payload for analyzing radioactive chemical elements of the atmosphere and the surface. Today, gamma-ray spectroscopy with a High-Purity Germanium (HPGe) detector is widely used for activity measurements of natural radionuclides contained in the soil of the Earth. The energy spectra obtained by HPGe detectors have generally been analyzed by means of the Window Analysis (WA) method, in which activity concentrations are determined from the net counts of an energy window around each individual peak. An alternative method, the so-called Full Spectrum Analysis (FSA) method, uses count numbers not only from full-absorption peaks but also from the contributions of Compton scattering due to gamma-rays. Consequently, while it takes substantial time to obtain a statistically significant result with the WA method, the FSA method requires a much shorter time to reach the same level of statistical significance. This study shows the validation results of the FSA method. We have compared the activity concentrations of 40K, 232Th and 238U in soil measured by the WA method and the FSA method, respectively. The gamma-ray spectra of reference materials (RGU, RGTh, and KCl) and soil samples were measured by a 120% HPGe detector with a cosmic-muon veto detector. According to the comparison of activity concentrations between the FSA and the WA, we conclude that the FSA method is validated against the WA method. This study implies that the FSA method can be used in a harsh measurement environment, such as gamma-ray measurement on the Moon, in which the required level of statistical significance must be reached in a much shorter data acquisition time than the WA method allows.
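
    The essence of FSA is fitting the whole measured spectrum as a mixture of reference standard spectra plus background. A sketch using non-negative least squares (one possible solver; the paper does not specify one), with fully synthetic spectra:

        # Sketch: full-spectrum decomposition into 40K / 232Th / 238U components.
        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(5)
        n_ch = 1024                                # spectrum channels
        std_k = rng.random(n_ch)                   # stand-in reference, 40K
        std_th = rng.random(n_ch)                  # stand-in reference, 232Th
        std_u = rng.random(n_ch)                   # stand-in reference, 238U
        background = 0.2 * np.ones(n_ch)

        true_act = np.array([1.5, 0.8, 0.4])       # "true" activity multipliers
        measured = (true_act @ np.vstack([std_k, std_th, std_u])
                    + background + rng.normal(0, 0.05, n_ch))

        A = np.column_stack([std_k, std_th, std_u, background])
        coeffs, _ = nnls(A, measured)              # non-negative activities
        print("fitted activities (K, Th, U):", coeffs[:3].round(2))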

  14. ROOT: A C++ framework for petabyte data storage, statistical analysis and visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antcheva, I.; /CERN; Ballintijn, M.

    2009-01-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting, while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece in these analysis tools are the histogram classes which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like Postscript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g. data mining in HEP - by using PROOF, which will take care of optimally distributing the work over the available resources in a transparent way.

  15. The Ups and Downs of Repeated Cleavage and Internal Fragment Production in Top-Down Proteomics.

    PubMed

    Lyon, Yana A; Riggs, Dylan; Fornelli, Luca; Compton, Philip D; Julian, Ryan R

    2018-01-01

    Analysis of whole proteins by mass spectrometry, or top-down proteomics, has several advantages over methods relying on proteolysis. For example, proteoforms can be unambiguously identified and examined. However, from a gas-phase ion-chemistry perspective, proteins are enormous molecules that present novel challenges relative to peptide analysis. Herein, the statistics of cleaving the peptide backbone multiple times are examined to evaluate the inherent propensity for generating internal versus terminal ions. The raw statistics reveal an inherent bias favoring production of terminal ions, which holds true regardless of protein size. Importantly, even if the full suite of internal ions is generated by statistical dissociation, terminal ions are predicted to account for at least 50% of the total ion current, regardless of protein size, if there are three backbone dissociations or fewer. Top-down analysis should therefore be a viable approach for examining proteins of significant size. Comparison of the purely statistical analysis with actual top-down data derived from ultraviolet photodissociation (UVPD) and higher-energy collisional dissociation (HCD) reveals that terminal ions account for much of the total ion current in both experiments. Terminal ion production is more favored in UVPD relative to HCD, which is likely due to differences in the mechanisms controlling fragmentation. Importantly, internal ions are not found to dominate from either the theoretical or experimental point of view.
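
    The terminal-versus-internal bookkeeping is simple enough to state in a few lines: cleaving a linear backbone at k sites yields k+1 fragments, of which 2 are terminal and k-1 are internal. A small sketch of that counting:

        # Sketch: terminal vs. internal fragment counts per cleaved molecule.
        def fragment_counts(k: int):
            """Return (terminal, internal) fragment counts for k cleavages."""
            if k == 0:
                return (1, 0)          # intact protein
            return (2, k - 1)

        for k in range(1, 6):
            t, i = fragment_counts(k)
            print(f"{k} cleavages: {t} terminal, {i} internal, "
                  f"terminal share {t/(t+i):.0%}")
        # k = 3 gives a 50% terminal share, consistent with the statement
        # above that terminal ions carry >= 50% of the current for <= 3
        # backbone dissociations.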

  16. The Ups and Downs of Repeated Cleavage and Internal Fragment Production in Top-Down Proteomics

    NASA Astrophysics Data System (ADS)

    Lyon, Yana A.; Riggs, Dylan; Fornelli, Luca; Compton, Philip D.; Julian, Ryan R.

    2018-01-01

    Analysis of whole proteins by mass spectrometry, or top-down proteomics, has several advantages over methods relying on proteolysis. For example, proteoforms can be unambiguously identified and examined. However, from a gas-phase ion-chemistry perspective, proteins are enormous molecules that present novel challenges relative to peptide analysis. Herein, the statistics of cleaving the peptide backbone multiple times are examined to evaluate the inherent propensity for generating internal versus terminal ions. The raw statistics reveal an inherent bias favoring production of terminal ions, which holds true regardless of protein size. Importantly, even if the full suite of internal ions is generated by statistical dissociation, terminal ions are predicted to account for at least 50% of the total ion current, regardless of protein size, if there are three backbone dissociations or fewer. Top-down analysis should therefore be a viable approach for examining proteins of significant size. Comparison of the purely statistical analysis with actual top-down data derived from ultraviolet photodissociation (UVPD) and higher-energy collisional dissociation (HCD) reveals that terminal ions account for much of the total ion current in both experiments. Terminal ion production is more favored in UVPD relative to HCD, which is likely due to differences in the mechanisms controlling fragmentation. Importantly, internal ions are not found to dominate from either the theoretical or experimental point of view.

  17. A Preliminary Analysis of LANDSAT-4 Thematic Mapper Radiometric Performance

    NASA Technical Reports Server (NTRS)

    Justice, C.; Fusco, L.; Mehl, W.

    1984-01-01

    Analysis was performed to characterize the radiometry of three Thematic Mapper (TM) digital products of a scene of Arkansas. The three digital products examined were the NASA raw (BT) product, the radiometrically corrected (AT) product and the radiometrically and geometrically corrected (PT) product. The frequency distribution of the digital data, the statistical correlation between the bands, and the variability between the detectors within a band were examined on a series of image subsets from the full scene. The results are presented from one 1024 x 1024 pixel subset of Reelfoot Lake, Tennessee, which displayed a representative range of ground conditions and cover types occurring within the full frame image. Bands 1, 2 and 5 of the sample area are presented. The subsets were extracted from the three digital data products to cover the same geographic area. This analysis provides the first step towards a full appraisal of the TM radiometry being performed as part of the ESA/CEC contribution to the NASA/LIDQA program.

  18. Generalized Full-Information Item Bifactor Analysis

    PubMed Central

    Cai, Li; Yang, Ji Seung; Hansen, Mark

    2011-01-01

    Full-information item bifactor analysis is an important statistical method in psychological and educational measurement. Current methods are limited to single group analysis and inflexible in the types of item response models supported. We propose a flexible multiple-group item bifactor analysis framework that supports a variety of multidimensional item response theory models for an arbitrary mixing of dichotomous, ordinal, and nominal items. The extended item bifactor model also enables the estimation of latent variable means and variances when data from more than one group are present. Generalized user-defined parameter restrictions are permitted within or across groups. We derive an efficient full-information maximum marginal likelihood estimator. Our estimation method achieves substantial computational savings by extending Gibbons and Hedeker’s (1992) bifactor dimension reduction method so that the optimization of the marginal log-likelihood only requires two-dimensional integration regardless of the dimensionality of the latent variables. We use simulation studies to demonstrate the flexibility and accuracy of the proposed methods. We apply the model to study cross-country differences, including differential item functioning, using data from a large international education survey on mathematics literacy. PMID:21534682

  19. The effect of leverage and/or influential on structure-activity relationships.

    PubMed

    Bolboacă, Sorana D; Jäntschi, Lorentz

    2013-05-01

    In the spirit of reporting valid and reliable Quantitative Structure-Activity Relationship (QSAR) models, the aim of our research was to assess how leverage (analyzed with the hat matrix, h(i)) and influence (analyzed with Cook's distance, D(i)) reflect the reliability and characteristics of QSAR models. The datasets included in this research were collected from previously published papers. Seven datasets which fulfilled the imposed inclusion criteria were analyzed. Three models were obtained for each dataset (full-model, h(i)-model and D(i)-model) and several statistical validation criteria were applied to the models. In 5 out of 7 sets the correlation coefficient increased when compounds with either h(i) or D(i) higher than the threshold were removed. Withdrawn compounds varied from 2 to 4 for h(i)-models and from 1 to 13 for D(i)-models. Validation statistics showed that D(i)-models possess systematically better agreement than both full-models and h(i)-models. Removal of influential compounds from the training set significantly improves the model and is recommended in the process of developing quantitative structure-activity relationships. The Cook's distance approach should be combined with hat matrix analysis in order to identify compounds that are candidates for removal.
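
    A sketch of flagging high-leverage and influential compounds before refitting, as in the h(i)- and D(i)-models above. The thresholds (2p/n for leverage, 4/n for Cook's distance) are common defaults assumed here, not necessarily the paper's cut-offs, and the data are synthetic.

        # Sketch: leverage (hat diagonal) and Cook's distance screening.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(6)
        X = sm.add_constant(rng.normal(size=(40, 3)))   # 3 molecular descriptors
        y = X @ np.array([1.0, 0.5, -0.3, 0.8]) + rng.normal(0, 0.5, 40)

        fit = sm.OLS(y, X).fit()
        infl = fit.get_influence()
        h = infl.hat_matrix_diag
        cooks_d, _ = infl.cooks_distance

        n, p = X.shape
        high_leverage = np.where(h > 2 * p / n)[0]      # h(i)-model candidates
        influential = np.where(cooks_d > 4 / n)[0]      # D(i)-model candidates
        print("leverage outliers:", high_leverage, "influential:", influential)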

  20. Statistical analysis of landing contact conditions for three lifting body research vehicles

    NASA Technical Reports Server (NTRS)

    Larson, R. R.

    1972-01-01

    The landing contact conditions for the HL-10, M2-F2/F3, and the X-24A lifting body vehicles are analyzed statistically for 81 landings. The landing contact parameters analyzed are true airspeed, peak normal acceleration at the center of gravity, roll angle, and roll velocity. Ground measurement parameters analyzed are lateral and longitudinal distance from intended touchdown, lateral distance from touchdown to full stop, and rollout distance. The results are presented in the form of histograms for frequency distributions and cumulative frequency distribution probability curves with a Pearson Type 3 curve fit for extrapolation purposes.
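
    A sketch of fitting a Pearson Type 3 curve for extrapolation, as described above, using synthetic landing airspeeds; the distribution parameters and the 280-knot query are illustrative assumptions.

        # Sketch: Pearson Type III fit and exceedance-probability extrapolation.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        airspeed = rng.gamma(shape=9.0, scale=10.0, size=81) + 150  # knots, synthetic

        skew, loc, scale = stats.pearson3.fit(airspeed)
        # probability of touching down faster than 280 knots under the fit
        p_exceed = stats.pearson3.sf(280, skew, loc=loc, scale=scale)
        print(f"P(airspeed > 280 kt) = {p_exceed:.3f}")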

  1. Using independent component analysis for electrical impedance tomography

    NASA Astrophysics Data System (ADS)

    Yan, Peimin; Mo, Yulong

    2004-05-01

    Independent component analysis (ICA) is a way to resolve signals into independent components based on the statistical characteristics of the signals. It is a method for factoring probability densities of measured signals into a set of densities that are as statistically independent as possible under the assumptions of a linear model. Electrical impedance tomography (EIT) is used to detect variations of the electric conductivity of the human body. Because there are variations of the conductivity distributions inside the body, EIT presents multi-channel data. In order to get all the information contained in different locations of tissue, it is necessary to image the individual conductivity distribution. In this paper we consider applying ICA to EIT on the signal subspace (individual conductivity distribution). Using ICA, the signal subspace is then decomposed into statistically independent components. The individual conductivity distribution can be reconstructed by the sensitivity theorem in this paper. Computer simulations show that the full information contained in the multi-conductivity distribution can be obtained by this method.
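
    A minimal sketch of the unmixing step, assuming the multi-channel EIT data are a linear mixture of independent sources; the sources, mixing matrix, and channel count below are synthetic stand-ins.

        # Sketch: unmixing multi-channel signals into independent components.
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(8)
        t = np.linspace(0, 10, 2000)
        sources = np.c_[np.sin(3 * t),                 # smooth source
                        np.sign(np.sin(7 * t)),        # square-wave source
                        rng.laplace(size=t.size)]      # noise-like source
        mixing = rng.normal(size=(8, 3))               # 8 measurement channels
        observed = sources @ mixing.T

        ica = FastICA(n_components=3, random_state=0)
        recovered = ica.fit_transform(observed)        # independent components
        print(recovered.shape)                         # (2000, 3)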

  2. Full in-vitro analyses of new-generation bulk fill dental composites cured by halogen light.

    PubMed

    Tekin, Tuçe Hazal; Kantürk Figen, Aysel; Yılmaz Atalı, Pınar; Coşkuner Filiz, Bilge; Pişkin, Mehmet Burçin

    2017-08-01

    The objective of this study was to investigate the full in-vitro analyses of new-generation bulk-fill dental composites cured by halogen light (HLG). Four composites of two types were studied: Surefill SDR (SDR) and Xtra Base (XB) as bulk-fill flowable materials; QuixFill (QF) and XtraFill (XF) as packable bulk-fill materials. Samples were prepared for each analysis and test by applying the same procedure, but with different diameters and thicknesses appropriate to the analysis and test requirements. Thermal properties were determined by thermogravimetric analysis (TG/DTG) and differential scanning calorimetry (DSC); the Vickers microhardness (VHN) was measured after 1, 7, 15 and 30 days of storage in water. The degree of conversion values for the materials (DC, %) were immediately measured using near-infrared spectroscopy (FT-IR). The surface morphology of the composites was investigated by scanning electron microscopy (SEM) and atomic-force microscopy (AFM). The sorption and solubility measurements were also performed after 1, 7, 15 and 30 days of storage in water. In addition to this, the data were statistically analyzed using one-way analysis of variance, and both the Newman-Keuls and Tukey multiple comparison tests. The statistical significance level was established at p<0.05. According to the ISO 4049 standards, all the tested materials showed acceptable water sorption and solubility, and a halogen light source was an option to polymerize bulk-fill, resin-based dental composites. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. A user-friendly workflow for analysis of Illumina gene expression bead array data available at the arrayanalysis.org portal.

    PubMed

    Eijssen, Lars M T; Goelela, Varshna S; Kelder, Thomas; Adriaens, Michiel E; Evelo, Chris T; Radonjic, Marijana

    2015-06-30

    Illumina whole-genome expression bead arrays are a widely used platform for transcriptomics. Most of the tools available for the analysis of the resulting data are not easily applicable by less experienced users. ArrayAnalysis.org provides researchers with an easy-to-use and comprehensive interface to the functionality of R and Bioconductor packages for microarray data analysis. As a modular open source project, it allows developers to contribute modules that provide support for additional types of data or extend workflows. To enable data analysis of Illumina bead arrays for a broad user community, we have developed a module for ArrayAnalysis.org that provides a free and user-friendly web interface for quality control and pre-processing for these arrays. This module can be used together with existing modules for statistical and pathway analysis to provide a full workflow for Illumina gene expression data analysis. The module accepts data exported from Illumina's GenomeStudio, and provides the user with quality control plots and normalized data. The outputs are directly linked to the existing statistics module of ArrayAnalysis.org, but can also be downloaded for further downstream analysis in third-party tools. The Illumina bead arrays analysis module is available at http://www.arrayanalysis.org . A user guide, a tutorial demonstrating the analysis of an example dataset, and R scripts are available. The module can be used as a starting point for statistical evaluation and pathway analysis provided on the website or to generate processed input data for a broad range of applications in life sciences research.

  4. Interventions for unilateral refractive amblyopia.

    PubMed

    Shotton, Kate; Powell, Christine; Voros, Gerasimos; Hatt, Sarah R

    2008-10-08

    Unilateral refractive amblyopia is a common cause of reduced visual acuity in childhood, but optimal treatment is not well defined. This review examined the treatment effect from spectacles and conventional occlusion. Evaluation of the evidence of the effectiveness of spectacles and/or occlusion in the treatment of unilateral refractive amblyopia. We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE and LILACS. Relevant conference proceedings were manually searched. There were no date or language restrictions. The searches were last run on 7 July 2008. Randomised controlled trials of treatment for unilateral refractive amblyopia by spectacles, with or without occlusion, were eligible. We included studies with participants of any age. Two authors independently assessed abstracts identified by the searches. We obtained full text copies and contacted study authors where necessary. Eight trials were eligible for inclusion. Data were extracted from seven. No meta-analysis was performed. For all studies, mean acuity (standard deviation (SD)) in the amblyopic eye post treatment is reported. Comparison: Spectacles only versus no treatment (Clarke 2003). Mean (SD) visual acuity: spectacles group 0.31 (0.17); no treatment group 0.42 (0.19). Mean difference (MD) between groups -0.11 (borderline statistical significance: 95% confidence interval (CI) -0.22 to 0.00). Comparison: Spectacles plus occlusion versus no treatment (Clarke 2003). Mean (SD) visual acuity: full treatment 0.22 (0.13); no treatment 0.42 (0.19). Mean difference between the groups -0.20 (statistically significant: 95% CI -0.30 to -0.10). Comparison: Spectacles plus occlusion versus spectacles only: Clarke 2003: MD -0.09 (borderline statistical significance: 95% CI -0.18 to 0.00); PEDIG 2005b: MD -0.15 (not statistically significant: 95% CI -0.32 to 0.02); PEDIG 2006a: MD 0.01 (not statistically significant: 95% CI -0.08 to 0.10). Comparison: Occlusion regimes. PEDIG 2003a: 2 hours versus 6 hours for moderate amblyopia: MD 0.01 (not statistically significant: 95% CI -0.06 to 0.08); PEDIG 2003b: 6 hours versus full-time for severe amblyopia: MD 0.03 (not statistically significant: 95% CI -0.08 to 0.14); Stewart 2007a: 6 hours versus full-time occlusion: MD -0.12 (not statistically significant: 95% CI -0.27 to 0.03). In some cases of unilateral refractive amblyopia it appears that there is a treatment benefit from refractive correction alone. Where amblyopia persists there is some evidence that adding occlusion further improves vision. It remains unclear which treatment regimes are optimal for individual patients. The nature of any dose/response effect from occlusion still needs to be clarified.

  5. Bayesian models based on test statistics for multiple hypothesis testing problems.

    PubMed

    Ji, Yuan; Lu, Yiling; Mills, Gordon B

    2008-04-01

    We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check whether our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. Finally, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
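
    A minimal sketch of the kind of Bayesian FDR control described: posterior null probabilities are computed from an assumed two-component model of the test statistics, and the largest rejection set whose average posterior null probability stays below a target level is reported. The mixture weights and component distributions are toy assumptions, not the authors' model.

        import numpy as np
        from scipy.stats import norm

        def bayesian_fdr_reject(post_null, alpha=0.05):
            """Reject the largest set whose mean posterior null probability <= alpha."""
            order = np.argsort(post_null)              # strongest evidence first
            running_fdr = np.cumsum(post_null[order]) / np.arange(1, post_null.size + 1)
            k = np.searchsorted(running_fdr, alpha, side="right")
            rejected = np.zeros(post_null.size, dtype=bool)
            rejected[order[:k]] = True
            return rejected

        # Toy two-component model: 90% null N(0,1), 10% alternative N(3,1).
        rng = np.random.default_rng(0)
        t = np.concatenate([rng.normal(0, 1, 900), rng.normal(3, 1, 100)])
        pi0 = 0.9
        post_null = pi0 * norm.pdf(t) / (pi0 * norm.pdf(t) + (1 - pi0) * norm.pdf(t, 3, 1))
        print(bayesian_fdr_reject(post_null).sum(), "hypotheses rejected")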

  6. Superstition and post-tonsillectomy hemorrhage.

    PubMed

    Kumar, Veena V; Kumar, Naveen V; Isaacson, Glenn

    2004-11-01

    The objective was to determine whether post-tonsillectomy hemorrhages occur more frequently in redheaded children, in patterns of threes, on Friday-the-13th days, or with the full moon. Case-control analysis. The authors performed multiple statistical analyses of all children undergoing tonsillectomy at Temple University Children's Medical Center (Philadelphia, PA) during a 29-month period. Children readmitted to the hospital with or without surgical control of bleeding were compared with children who did not bleed. Relation of post-tonsillectomy hemorrhages to the phase of the moon was evaluated using a standard normal deviate. The frequency of surgery performed on Friday-the-13th days was compared with a differently dated Friday chosen at random. Clusters of three hemorrhages in a 7-day period were recorded. Families of children were contacted and asked whether their child had red hair. A chi-square analysis compared redheaded and non-redheaded tonsillectomy patients. Twenty-eight of the 589 tonsillectomies performed required readmission for bleeding events. Twenty tonsillectomies occurred on a full-moon day, resulting in one bleeding event. One cluster of three post-tonsillectomy hemorrhages occurred in a 7-day period. Four of the children who bled had red hair. Two tonsillectomies occurred on Friday the 13th, with no associated hemorrhage. Statistical analysis revealed a random pattern to post-tonsillectomy hemorrhage. Post-tonsillectomy hemorrhages do not occur in clusters of three and are not more frequent with the full moon or on Friday the 13th. The bleeding rate among children with red hair is similar to that of non-redheaded children.

  7. In vivo evaluation of the effect of stimulus distribution on FIR statistical efficiency in event-related fMRI

    PubMed Central

    Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L

    2013-01-01

    Technical developments in MRI have improved the signal-to-noise ratio, allowing the use of analysis methods such as finite impulse response (FIR) analysis of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while more design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence-based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, with the level a function of multicollinearity. Experiment protocols varied by up to 55.4% in standard deviation. The results confirm that the quality of fMRI in an FIR analysis can significantly and substantially vary with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. PMID:23473798
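
    For illustration, a sketch of how FIR design efficiency is commonly quantified from the stimulus distribution; the efficiency formula 1/trace((X'X)^-1) and the random protocol below are standard design-optimization conventions assumed here, not this paper's exact pipeline.

        import numpy as np

        def fir_design_matrix(onsets, n_scans, n_lags):
            """0/1 FIR regressors: one column per post-stimulus lag (in scans)."""
            X = np.zeros((n_scans, n_lags))
            for t in onsets:
                for lag in range(n_lags):
                    if t + lag < n_scans:
                        X[t + lag, lag] = 1.0
            return X

        def efficiency(X):
            return 1.0 / np.trace(np.linalg.inv(X.T @ X))

        rng = np.random.default_rng(1)
        onsets = np.sort(rng.choice(200, size=40, replace=False))   # one random protocol
        X = fir_design_matrix(onsets, n_scans=220, n_lags=10)
        print(f"efficiency = {efficiency(X):.2f}, "
              f"condition number = {np.linalg.cond(X.T @ X):.1f}")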

  8. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates.

    PubMed

    Xia, Li C; Steele, Joshua A; Cram, Jacob A; Cardon, Zoe G; Simmons, Sheri L; Vallino, Joseph J; Fuhrman, Jed A; Sun, Fengzhu

    2011-01-01

    The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique into an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond that of ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa.
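
    A compact sketch of the core local similarity computation, simplified from the published dynamic programming: both association signs are scanned under a maximum delay constraint. The z-score normalization and the toy data are illustrative assumptions.

        import numpy as np

        def local_similarity(x, y, max_delay=3):
            """Highest locally accumulated (z-scored) co-variation, either sign."""
            x = (x - x.mean()) / x.std()
            y = (y - y.mean()) / y.std()
            n, best = len(x), 0.0
            for sign in (+1.0, -1.0):
                P = np.zeros((n + 1, n + 1))
                for i in range(1, n + 1):
                    for j in range(max(1, i - max_delay), min(n, i + max_delay) + 1):
                        P[i, j] = max(0.0, P[i - 1, j - 1] + sign * x[i - 1] * y[j - 1])
                        best = max(best, P[i, j])
            return best / n

        rng = np.random.default_rng(2)
        a = rng.normal(size=50)
        b = np.roll(a, 2) + 0.3 * rng.normal(size=50)   # y lags x by two steps
        print(f"LS score = {local_similarity(a, b):.2f}")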

  9. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates

    PubMed Central

    2011-01-01

    Background The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. Results We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique into an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. Conclusions The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond that of ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa. PMID:22784572

  10. Alignments of parity even/odd-only multipoles in CMB

    NASA Astrophysics Data System (ADS)

    Aluri, Pavan K.; Ralston, John P.; Weltman, Amanda

    2017-12-01

    We compare the statistics of parity even and odd multipoles of the cosmic microwave background (CMB) sky from Planck full mission temperature measurements. An excess of power in odd multipoles compared to even multipoles has previously been found on large angular scales. Motivated by this apparent parity asymmetry, we evaluate directional statistics associated with even compared to odd multipoles, along with their significances. The primary tools are the Power tensor and Alignment tensor statistics. We limit our analysis to the first 60 multipoles, i.e., l = [2, 61]. We find no evidence for statistically unusual alignments of even parity multipoles. More than one independent statistic finds evidence for alignments of anisotropy axes of odd multipoles, with a significance equivalent to ∼2σ or more. The robustness of the alignment axes is tested by making Galactic cuts and varying the multipole range. Very interestingly, the region spanned by the (a)symmetry axes is found to broadly contain other parity (a)symmetry axes previously observed in the literature.

  11. Using Geographic Information Science to Explore Associations between Air Pollution, Environmental Amenities, and Preterm Births

    PubMed Central

    Ogneva-Himmelberger, Yelena; Dahlberg, Tyler; Kelly, Kristen; Simas, Tiffany A. Moore

    2015-01-01

    The study uses geographic information science (GIS) and statistics to find out if there are statistical differences between full term and preterm births to non-Hispanic white, non-Hispanic Black, and Hispanic mothers in their exposure to air pollution and access to environmental amenities (green space and vendors of healthy food) in the second largest city in New England, Worcester, Massachusetts. Proximity to a Toxic Release Inventory site has a statistically significant effect on preterm birth regardless of race. The air-pollution hazard score from the Risk Screening Environmental Indicators Model is also a statistically significant factor when preterm births are categorized into three groups based on the degree of prematurity. Proximity to green space and to a healthy food vendor did not have an effect on preterm births. The study also used cluster analysis and found statistically significant spatial clusters of high preterm birth volume for non-Hispanic white, non-Hispanic Black, and Hispanic mothers. PMID:29546120

  12. Using Geographic Information Science to Explore Associations between Air Pollution, Environmental Amenities, and Preterm Births.

    PubMed

    Ogneva-Himmelberger, Yelena; Dahlberg, Tyler; Kelly, Kristen; Simas, Tiffany A Moore

    2015-01-01

    The study uses geographic information science (GIS) and statistics to find out if there are statistical differences between full term and preterm births to non-Hispanic white, non-Hispanic Black, and Hispanic mothers in their exposure to air pollution and access to environmental amenities (green space and vendors of healthy food) in the second largest city in New England, Worcester, Massachusetts. Proximity to a Toxic Release Inventory site has a statistically significant effect on preterm birth regardless of race. The air-pollution hazard score from the Risk Screening Environmental Indicators Model is also a statistically significant factor when preterm births are categorized into three groups based on the degree of prematurity. Proximity to green space and to a healthy food vendor did not have an effect on preterm births. The study also used cluster analysis and found statistically significant spatial clusters of high preterm birth volume for non-Hispanic white, non-Hispanic Black, and Hispanic mothers.

  13. Charm dimuon production in neutrino-nucleon interactions in the NOMAD experiment

    NASA Astrophysics Data System (ADS)

    Petti, Roberto; Samoylov, Oleg

    2012-09-01

    We present our new measurement of charm dimuon production in neutrino-iron interactions based upon the full statistics collected by the NOMAD experiment. After background subtraction we observe 15,340 charm dimuon events, providing the largest sample currently available. The analysis exploits the large inclusive charged current sample (about 9 million events after all analysis cuts) to constrain the total systematic uncertainty to about 2%. The extraction of strange sea and charm production parameters is also discussed.

  14. Charm dimuon production in neutrino-nucleon interactions in the NOMAD experiment

    NASA Astrophysics Data System (ADS)

    Petti, R.; Samoylov, O. B.

    2011-12-01

    We present our new measurement of charm dimuon production in neutrino-iron interactions based upon the full statistics collected by the NOMAD experiment. After background subtraction we observe 15,340 charm dimuon events, providing the largest sample currently available. The analysis exploits the large inclusive charged current sample (about 9 million events after all analysis cuts) to constrain the total systematic uncertainty to ˜2%. The extraction of strange sea and charm production parameters is also discussed.

  15. Evaluating statistical cloud schemes: What can we gain from ground-based remote sensing?

    NASA Astrophysics Data System (ADS)

    Grützun, V.; Quaas, J.; Morcrette, C. J.; Ament, F.

    2013-09-01

    Statistical cloud schemes with prognostic probability distribution functions have become more important in atmospheric modeling, especially since they are in principle scale adaptive and capture cloud physics in more detail. While in theory the schemes have great potential, their accuracy is still questionable. High-resolution three-dimensional observational data of water vapor and cloud water, which could be used for testing them, are missing. We explore the potential of ground-based remote sensing such as lidar, microwave, and radar to evaluate prognostic distribution moments using the "perfect model approach." This means that we employ a high-resolution weather model as virtual reality and retrieve full three-dimensional atmospheric quantities and virtual ground-based observations. We then use statistics from the virtual observations to validate the modeled 3-D statistics. Since the data are entirely consistent, any discrepancy occurring is due to the method. Focusing on the total water mixing ratio, we find that the mean can be evaluated decently, but whether the variance and skewness are reliable depends strongly on the meteorological conditions. Using some simple schematic descriptions of different synoptic conditions, we show how statistics obtained from point or line measurements can be poor at representing the full three-dimensional distribution of water in the atmosphere. We argue that a careful analysis of measurement data and detailed knowledge of the meteorological situation are necessary to judge whether we can use the data for an evaluation of higher moments of the humidity distribution used by a statistical cloud scheme.
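
    A schematic of the perfect-model comparison: moments of a synthetic 3-D "total water" field versus those sampled by a single virtual column. The gamma-distributed field is a stand-in for model output, chosen only for illustration.

        import numpy as np
        from scipy.stats import skew

        rng = np.random.default_rng(3)
        qt = rng.gamma(shape=2.0, scale=1.5, size=(64, 64, 40))   # synthetic 3-D field
        column = qt[32, 32, :]                                    # one virtual profiler

        for name, data in [("full 3-D", qt.ravel()), ("single column", column)]:
            print(f"{name:13s} mean={data.mean():.2f} "
                  f"var={data.var():.2f} skew={skew(data):.2f}")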

  16. Statistical Literacy in Public Debate--Examples from the UK 2015 General Election

    ERIC Educational Resources Information Center

    Arnold, Phoebe

    2017-01-01

    Full Fact is an independent, non-partisan fact-checking charity. A particular focus is the analysis of factual claims in political debate in the UK; for example, fact-checking claims and counterclaims made during Prime Minister's questions. Facts do not appear in a vacuum as they are often used as key elements in an effort to make a coherent…

  17. The Effect of Casting Ring Liner Length and Prewetting on the Marginal Adaptation and Dimensional Accuracy of Full Crown Castings.

    PubMed

    Haralur, Satheesh B; Hamdi, Osama A; Al-Shahrani, Abdulaziz A; Alhasaniah, Sultan

    2017-01-01

    To evaluate the effect of varying cellulose casting ring liner length and its prewetting on the marginal adaptation and dimensional accuracy of full veneer metal castings. The master die was milled in stainless steel to fabricate the wax pattern. Sixty wax patterns were fabricated with a uniform thickness of 1.5 mm at the occlusal surface and 1 mm at the axial surface, a cervical width of 13.5 mm, and a 10 mm cuspal height. The samples were divided into six groups (n = 10). Group I and II samples had the full-length cellulose prewet and dry ring liner, respectively. Groups III and IV had a 2 mm short prewet and dry cellulose ring liner, respectively, whereas groups V and VI were invested in a 6 mm short ring liner. The wax patterns were immediately invested in phosphate-bonded investment, and the casting procedure was completed with nickel-chrome alloy. The castings were cleaned, and the mean of measurements at four reference points for marginal adaptation, casting height, and cervical width was calculated. The marginal adaptation was calculated with ImageJ software, whereas the casting height and cervical width were determined using a digital scale. The data were subjected to one-way analysis of variance and Tukey post hoc statistical analysis with Statistical Package for the Social Sciences version 20 software. Group II had the best marginal adaptation, with a gap of 63.786 μm, followed by group I (65.185 μm), group IV (87.740 μm), and group III (101.455 μm). A large marginal gap was observed in group V, at 188.871 μm. Cuspal height was most accurate in group V (10.428 mm), group VI (10.421 mm), and group II (10.488 mm). The cervical width was approximately similar in groups I, III, and V. A statistically significant difference was observed in the Tukey post hoc analysis between groups V and VI and all the other groups with regard to marginal adaptation. The dry cellulose ring liners provided better marginal adaptation in comparison to prewet cellulose ring liners. Accurate cuspal height was obtained with shorter ring liners in comparison to full-length cellulose ring liners.
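
    The reported workflow (one-way ANOVA followed by Tukey's post hoc test) can be sketched in Python in place of SPSS. The group means and spreads below are placeholders loosely echoing the reported gaps, not the study's raw measurements.

        import numpy as np
        from scipy.stats import f_oneway
        from statsmodels.stats.multicomp import pairwise_tukeyhsd

        rng = np.random.default_rng(4)
        means = dict(zip(["I", "II", "III", "IV", "V", "VI"], [65, 64, 101, 88, 189, 170]))
        groups = {g: rng.normal(mu, 8.0, size=10) for g, mu in means.items()}  # n = 10

        F, p = f_oneway(*groups.values())
        print(f"one-way ANOVA: F = {F:.1f}, p = {p:.3g}")

        values = np.concatenate(list(groups.values()))
        labels = np.repeat(list(groups.keys()), 10)
        print(pairwise_tukeyhsd(values, labels))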

  18. The Effect of Casting Ring Liner Length and Prewetting on the Marginal Adaptation and Dimensional Accuracy of Full Crown Castings

    PubMed Central

    Haralur, Satheesh B.; Hamdi, Osama A.; Al-Shahrani, Abdulaziz A.; Alhasaniah, Sultan

    2017-01-01

    Aim: To evaluate the effect of varying cellulose casting ring liner length and its prewetting on the marginal adaptation and dimensional accuracy of full veneer metal castings. Materials and Methods: The master die was milled in stainless steel to fabricate the wax pattern. Sixty wax patterns were fabricated with a uniform thickness of 1.5 mm at the occlusal surface and 1 mm at the axial surface, a cervical width of 13.5 mm, and a 10 mm cuspal height. The samples were divided into six groups (n = 10). Group I and II samples had the full-length cellulose prewet and dry ring liner, respectively. Groups III and IV had a 2 mm short prewet and dry cellulose ring liner, respectively, whereas groups V and VI were invested in a 6 mm short ring liner. The wax patterns were immediately invested in phosphate-bonded investment, and the casting procedure was completed with nickel-chrome alloy. The castings were cleaned, and the mean of measurements at four reference points for marginal adaptation, casting height, and cervical width was calculated. The marginal adaptation was calculated with ImageJ software, whereas the casting height and cervical width were determined using a digital scale. The data were subjected to one-way analysis of variance and Tukey post hoc statistical analysis with Statistical Package for the Social Sciences version 20 software. Results: Group II had the best marginal adaptation, with a gap of 63.786 μm, followed by group I (65.185 μm), group IV (87.740 μm), and group III (101.455 μm). A large marginal gap was observed in group V, at 188.871 μm. Cuspal height was most accurate in group V (10.428 mm), group VI (10.421 mm), and group II (10.488 mm). The cervical width was approximately similar in groups I, III, and V. A statistically significant difference was observed in the Tukey post hoc analysis between groups V and VI and all the other groups with regard to marginal adaptation. Conclusion: The dry cellulose ring liners provided better marginal adaptation in comparison to prewet cellulose ring liners. Accurate cuspal height was obtained with shorter ring liners in comparison to full-length cellulose ring liners. PMID:28316950

  19. GARNET--gene set analysis with exploration of annotation relations.

    PubMed

    Rho, Kyoohyoung; Kim, Bumjin; Jang, Youngjun; Lee, Sanghyun; Bae, Taejeong; Seo, Jihae; Seo, Chaehwa; Lee, Jihyun; Kang, Hyunjung; Yu, Ungsik; Kim, Sunghoon; Lee, Sanghyuk; Kim, Wan Kyu

    2011-02-15

    Gene set analysis is a powerful method of deducing biological meaning for an a priori defined set of genes. Numerous tools have been developed to test statistical enrichment or depletion in specific pathways or gene ontology (GO) terms. Major difficulties towards biological interpretation are integrating diverse types of annotation categories and exploring the relationships between annotation terms of similar information. GARNET (Gene Annotation Relationship NEtwork Tools) is an integrative platform for gene set analysis with many novel features. It includes tools for retrieval of genes from annotation databases, statistical analysis & visualization of annotation relationships, and managing gene sets. In an effort to allow access to a full spectrum of amassed biological knowledge, we have integrated a variety of annotation data that include the GO, domain, disease, drug, chromosomal location, and custom-defined annotations. Diverse types of molecular networks (pathways, transcription and microRNA regulations, protein-protein interaction) are also included. The pair-wise relationship between annotation gene sets was calculated using kappa statistics. GARNET consists of three modules--gene set manager, gene set analysis and gene set retrieval, which are tightly integrated to provide virtually automatic analysis for gene sets. A dedicated viewer for annotation network has been developed to facilitate exploration of the related annotations. GARNET (gene annotation relationship network tools) is an integrative platform for diverse types of gene set analysis, where complex relationships among gene annotations can be easily explored with an intuitive network visualization tool (http://garnet.isysbio.org/ or http://ercsb.ewha.ac.kr/garnet/).
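
    A minimal sketch of the kappa calculation for a pair of annotation gene sets over a shared gene universe: a 2x2 agreement table is formed from set membership, and chance agreement is subtracted out. The sets themselves are toy examples.

        def kappa(set_a, set_b, universe):
            """Cohen's kappa for two gene sets viewed as binary annotations."""
            n = len(universe)
            a11 = len(set_a & set_b)
            a10 = len(set_a - set_b)
            a01 = len(set_b - set_a)
            a00 = n - a11 - a10 - a01
            po = (a11 + a00) / n                        # observed agreement
            pe = ((a11 + a10) * (a11 + a01)
                  + (a01 + a00) * (a10 + a00)) / n**2   # chance agreement
            return (po - pe) / (1 - pe)

        universe = {f"gene{i}" for i in range(100)}
        go_term = {f"gene{i}" for i in range(0, 30)}
        pathway = {f"gene{i}" for i in range(10, 40)}
        print(f"kappa = {kappa(go_term, pathway, universe):.2f}")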

  20. Sports hernia in National Hockey League players: does surgery affect performance?

    PubMed

    Jakoi, Andre; O'Neill, Craig; Damsgaard, Christopher; Fehring, Keith; Tom, James

    2013-01-01

    Athletic pubalgia is a complex injury that results in loss of play in competitive athletes, especially hockey players. The number of reported sports hernias has been increasing, and the importance of their management is vital. There are no studies reporting whether athletes can return to play at preinjury levels. The focus of this study was to evaluate the productivity of professional hockey players before an established athletic pubalgia diagnosis contrasted with the productivity after sports hernia repair. Cohort study; Level of evidence, 3. Professional National Hockey League (NHL) players who were reported to have a sports hernia and who underwent surgery from 2001 to 2008 were identified. Statistics were gathered on the players' previous 2 full seasons and compared with statistics from the 2 full seasons after surgery. Data concerning games played, goals, average time on ice, time of productivity, and assists were gathered. Players were divided into 3 groups: group A incorporated all players, group B comprised players with 6 or fewer seasons of play, and group C consisted of players with 7 or more seasons of play. A control group was chosen to compare player deterioration or improvement over a career; each player selected for the study had a corresponding control player with the same tenure in his career and position during the same years. Forty-three hockey players were identified to have had sports hernia repairs from 2001 to 2008; ultimately, 80% would return to play 2 or more full seasons. Group A had statistically significant decreases in games played, goals scored, and assists. Versus the control group, the decreases in games played and assists were supported. Statistical analysis showed significant decreases in games played, goals scored, assists, and average time on ice in the following 2 seasons in group C, which was also seen in comparison with the control group. Group B (16 players) showed statistical significance only in games played versus the control group. Players who undergo sports hernia surgeries return to play and often perform similarly to their presurgery level. Players with 7 or more full seasons returned, but with significant decreases in their overall performance levels. Less veteran players were able to return to play without any statistical decrease in performance and are likely the best candidates for repair once injured.
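
    A hedged sketch of the pre/post comparison underlying such results, using a paired t-test on synthetic per-player season counts; the study's exact statistical procedure is not specified in the abstract, so this is illustrative only.

        import numpy as np
        from scipy.stats import ttest_rel

        rng = np.random.default_rng(5)
        games_before = rng.normal(75, 6, size=43)                 # 43 players, as reported
        games_after = games_before - rng.normal(6, 4, size=43)    # simulated decline

        t, p = ttest_rel(games_before, games_after)
        print(f"paired t = {t:.2f}, p = {p:.3g}")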

  1. DETECTING UNSPECIFIED STRUCTURE IN LOW-COUNT IMAGES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stein, Nathan M.; Dyk, David A. van; Kashyap, Vinay L.

    Unexpected structure in images of astronomical sources often presents itself upon visual inspection of the image, but such apparent structure may either correspond to true features in the source or be due to noise in the data. This paper presents a method for testing whether inferred structure in an image with Poisson noise represents a significant departure from a baseline (null) model of the image. To infer image structure, we conduct a Bayesian analysis of a full model that uses a multiscale component to allow flexible departures from the posited null model. As a test statistic, we use a tail probability of the posterior distribution under the full model. This choice of test statistic allows us to estimate a computationally efficient upper bound on a p-value that enables us to draw strong conclusions even when there are limited computational resources that can be devoted to simulations under the null model. We demonstrate the statistical performance of our method on simulated images. Applying our method to an X-ray image of the quasar 0730+257, we find significant evidence against the null model of a single point source and uniform background, lending support to the claim of an X-ray jet.
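
    A generic sketch of the testing logic only (not the paper's multiscale model): simulate Poisson images under a flat-background null, recompute a crude departure statistic on each, and report the tail fraction as a p-value.

        import numpy as np

        rng = np.random.default_rng(6)
        observed = rng.poisson(2.0, size=(32, 32))
        observed[12:16, 12:16] += rng.poisson(3.0, size=(4, 4))   # a hidden "jet"

        def departure_stat(img):
            return img.max() - img.mean()                         # crude structure proxy

        rate = observed.mean()                                    # null: flat background
        sims = np.array([departure_stat(rng.poisson(rate, size=observed.shape))
                         for _ in range(2000)])
        print("tail p-value:", np.mean(sims >= departure_stat(observed)))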

  2. Statistical hadronization with exclusive channels in e+e- annihilation

    DOE PAGES

    Ferroni, L.; Becattini, F.

    2012-01-01

    We present a systematic analysis of exclusive hadronic channels in e+e- collisions at centre-of-mass energies between 2.1 and 2.6 GeV within the statistical hadronization model. Because of the low multiplicities involved, calculations have been carried out in the full microcanonical ensemble, including conservation of energy-momentum, angular momentum, parity, isospin, and all relevant charges. We show that the data are in overall good agreement with the model for an energy density of about 0.5 GeV/fm^3 and an extra strangeness suppression parameter γ_S ≈ 0.7, essentially the same values found with fits to inclusive multiplicities at higher energy.

  3. Integrated GIS and multivariate statistical analysis for regional scale assessment of heavy metal soil contamination: A critical review.

    PubMed

    Hou, Deyi; O'Connor, David; Nathanail, Paul; Tian, Li; Ma, Yan

    2017-12-01

    Heavy metal soil contamination is associated with potential toxicity to humans or ecotoxicity. Scholars have increasingly used a combination of geographical information science (GIS) with geostatistical and multivariate statistical analysis techniques to examine the spatial distribution of heavy metals in soils at a regional scale. A review of such studies showed that most soil sampling programs were based on grid patterns and composite sampling methodologies. Many programs intended to characterize various soil types and land use types. The most often used sampling depth intervals were 0-0.10 m, or 0-0.20 m, below surface; and the sampling densities used ranged from 0.0004 to 6.1 samples per km^2, with a median of 0.4 samples per km^2. The most widely used spatial interpolators were inverse distance weighted interpolation and ordinary kriging; and the most often used multivariate statistical analysis techniques were principal component analysis and cluster analysis. The review also identified several determining and correlating factors in heavy metal distribution in soils, including soil type, soil pH, soil organic matter, land use type, Fe, Al, and heavy metal concentrations. The major natural and anthropogenic sources of heavy metals were found to derive from lithogenic origin, roadway and transportation, atmospheric deposition, wastewater and runoff from industrial and mining facilities, fertilizer application, livestock manure, and sewage sludge. This review argues that the full potential of integrated GIS and multivariate statistical analysis for assessing heavy metal distribution in soils on a regional scale has not yet been fully realized. It is proposed that future research be conducted to map multivariate results in GIS to pinpoint specific anthropogenic sources, to analyze temporal trends in addition to spatial patterns, to optimize modeling parameters, and to expand the use of different multivariate analysis tools beyond principal component analysis (PCA) and cluster analysis (CA). Copyright © 2017 Elsevier Ltd. All rights reserved.
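
    For reference, a minimal implementation of inverse distance weighted (IDW) interpolation, the spatial interpolator most often used in the reviewed studies; the coordinates and concentrations are synthetic stand-ins for soil sampling data.

        import numpy as np

        def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
            """Predict each query point as a distance-weighted mean of samples."""
            d = np.linalg.norm(xy_known[None, :, :] - xy_query[:, None, :], axis=2)
            w = 1.0 / (d + eps) ** power
            return (w * values).sum(axis=1) / w.sum(axis=1)

        rng = np.random.default_rng(7)
        sites = rng.uniform(0, 10, size=(50, 2))                  # sample locations (km)
        cd = rng.lognormal(0.0, 0.5, size=50)                     # e.g., Cd (mg/kg)
        grid = np.array([[x, y] for x in range(11) for y in range(11)], dtype=float)
        print(idw(sites, cd, grid)[:5].round(2))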

  4. Reusable, extensible, and modifiable R scripts and Kepler workflows for comprehensive single set ChIP-seq analysis.

    PubMed

    Cormier, Nathan; Kolisnik, Tyler; Bieda, Mark

    2016-07-05

    There has been an enormous expansion of use of chromatin immunoprecipitation followed by sequencing (ChIP-seq) technologies. Analysis of large-scale ChIP-seq datasets involves a complex series of steps and production of several specialized graphical outputs. A number of systems have emphasized custom development of ChIP-seq pipelines. These systems are primarily based on custom programming of a single, complex pipeline or supply libraries of modules and do not produce the full range of outputs commonly produced for ChIP-seq datasets. It is desirable to have more comprehensive pipelines, in particular ones addressing common metadata tasks, such as pathway analysis, and pipelines producing standard complex graphical outputs. It is advantageous if these are highly modular systems, available as both turnkey pipelines and individual modules, that are easily comprehensible, modifiable and extensible to allow rapid alteration in response to new analysis developments in this growing area. Furthermore, it is advantageous if these pipelines allow data provenance tracking. We present a set of 20 ChIP-seq analysis software modules implemented in the Kepler workflow system; most (18/20) were also implemented as standalone, fully functional R scripts. The set consists of four full turnkey pipelines and 16 component modules. The turnkey pipelines in Kepler allow data provenance tracking. Implementation emphasized use of common R packages and widely-used external tools (e.g., MACS for peak finding), along with custom programming. This software presents comprehensive solutions and easily repurposed code blocks for ChIP-seq analysis and pipeline creation. Tasks include mapping raw reads, peak finding via MACS, summary statistics, peak location statistics, summary plots centered on the transcription start site (TSS), gene ontology, pathway analysis, and de novo motif finding, among others. These pipelines range from those performing a single task to those performing full analyses of ChIP-seq data. The pipelines are supplied as both Kepler workflows, which allow data provenance tracking, and, in the majority of cases, as standalone R scripts. These pipelines are designed for ease of modification and repurposing.

  5. Global Sensitivity Analysis of Environmental Systems via Multiple Indices based on Statistical Moments of Model Outputs

    NASA Astrophysics Data System (ADS)

    Guadagnini, A.; Riva, M.; Dell'Oca, A.

    2017-12-01

    We propose to ground the sensitivity analysis of uncertain parameters of environmental models on a set of indices based on the main (statistical) moments, i.e., mean, variance, skewness and kurtosis, of the probability density function (pdf) of a target model output. This enables us to perform Global Sensitivity Analysis (GSA) of a model in terms of multiple statistical moments and yields a quantification of the impact of model parameters on the features driving the shape of the pdf of the model output. Our GSA approach can be coupled with the construction of a reduced-complexity model that approximates the full model response at a reduced computational cost. We demonstrate our approach through a variety of test cases. These include a commonly used analytical benchmark, a simplified model representing pumping in a coastal aquifer, a laboratory-scale tracer experiment, and the migration of fracturing fluid through a naturally fractured reservoir (source) to reach an overlying formation (target). Our strategy allows discriminating the relative importance of model parameters to the four statistical moments considered. We also provide an appraisal of the error associated with the evaluation of our sensitivity metrics when the original system model is replaced by the selected surrogate model. Our results suggest that one might need to construct a surrogate model with an increasing level of accuracy depending on the statistical moment considered in the GSA. The methodological framework we propose can assist the development of analysis techniques targeted to model calibration, design of experiments, uncertainty quantification and risk assessment.
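
    A sketch in the spirit of the proposed indices: for each parameter, conditional moments of the output (parameter fixed via quantile binning of Monte Carlo samples) are compared with the unconditional moments. The test function and the normalization are illustrative assumptions, not the authors' exact definitions.

        import numpy as np
        from scipy.stats import skew, kurtosis

        def moment_indices(X, y, n_bins=20):
            """Mean |shift| of each output moment when one parameter is fixed (binned)."""
            out = {}
            for name, m in [("mean", np.mean), ("var", np.var),
                            ("skew", skew), ("kurt", kurtosis)]:
                m_full, idx = m(y), []
                for j in range(X.shape[1]):
                    edges = np.quantile(X[:, j], np.linspace(0, 1, n_bins + 1))
                    b = np.clip(np.digitize(X[:, j], edges) - 1, 0, n_bins - 1)
                    cond = np.array([m(y[b == k]) for k in range(n_bins)])
                    idx.append(np.mean(np.abs(m_full - cond)) / (abs(m_full) + 1e-12))
                out[name] = np.round(idx, 2)
            return out

        rng = np.random.default_rng(8)
        X = rng.uniform(-1, 1, size=(20000, 3))
        y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=20000)  # x3 inactive
        for name, v in moment_indices(X, y).items():
            print(name, v)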

  6. In vivo evaluation of the effect of stimulus distribution on FIR statistical efficiency in event-related fMRI.

    PubMed

    Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L

    2013-05-15

    Technical developments in MRI have improved the signal-to-noise ratio, allowing the use of analysis methods such as finite impulse response (FIR) analysis of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while more design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence-based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, with the level a function of multicollinearity. Experiment protocols varied by up to 55.4% in standard deviation. The results confirm that the quality of fMRI in an FIR analysis can significantly and substantially vary with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. Published by Elsevier B.V.

  7. Fully Bayesian tests of neutrality using genealogical summary statistics.

    PubMed

    Drummond, Alexei J; Suchard, Marc A

    2008-10-31

    Many data summary statistics have been developed to detect departures from the neutral expectations of evolutionary models. However, questions about the neutrality of the evolution of genetic loci within natural populations remain difficult to assess. One critical cause of this difficulty is that most methods for testing neutrality make simplifying assumptions simultaneously about the mutational model and the population size model. Consequently, rejecting the null hypothesis of neutrality under these methods could result from violations of either or both assumptions, making interpretation troublesome. Here we harness posterior predictive simulation to exploit summary statistics of both the data and model parameters to test the goodness-of-fit of standard models of evolution. We apply the method to test the selective neutrality of molecular evolution in non-recombining gene genealogies, and we demonstrate the utility of our method on four real data sets, identifying significant departures from neutrality in human influenza A virus, even after controlling for variation in population size. Importantly, by employing a full model-based Bayesian analysis, our method separates the effects of demography from the effects of selection. The method also allows multiple summary statistics to be used in concert, thus potentially increasing sensitivity. Furthermore, our method remains useful in situations where analytical expectations and variances of summary statistics are not available. This aspect has great potential for the analysis of temporally spaced data, an expanding area previously underexplored owing to the limited availability of theory and methods.

  8. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    NASA Technical Reports Server (NTRS)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draws conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  9. A perceptual space of local image statistics.

    PubMed

    Victor, Jonathan D; Thengone, Daniel J; Rizvi, Syed M; Conte, Mary M

    2015-12-01

    Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice - a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. A perceptual space of local image statistics

    PubMed Central

    Victor, Jonathan D.; Thengone, Daniel J.; Rizvi, Syed M.; Conte, Mary M.

    2015-01-01

    Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice – a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. PMID:26130606

  11. Inverse statistics in the foreign exchange market

    NASA Astrophysics Data System (ADS)

    Jensen, M. H.; Johansen, A.; Petroni, F.; Simonsen, I.

    2004-09-01

    We investigate intra-day foreign exchange (FX) time series using the inverse statistics analysis developed by Simonsen et al. (Eur. Phys. J. 27 (2002) 583) and Jensen et al. (Physica A 324 (2003) 338). Specifically, we study the time-averaged distributions of waiting times needed to obtain a certain increase (decrease) ρ in the price of an investment. The analysis is performed for the Deutsche Mark (DM) against the US dollar for the full year of 1998, but similar results are obtained for the Japanese Yen against the US dollar. With high statistical significance, the presence of “resonance peaks” in the waiting time distributions is established. Such peaks are a consequence of the trading habits of the market participants, as they are not present in the corresponding tick (business) waiting time distributions. Furthermore, a new stylized fact is observed for the (normalized) waiting time distribution in the form of a power-law pdf. This result is achieved by rescaling the physical waiting time by the corresponding tick time, thereby partially removing scale-dependent features of the market activity.
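
    A minimal sketch of the inverse-statistics construction: for every starting tick, record the waiting time until the log-price first gains at least ρ (losses follow symmetrically with -ρ). A geometric random walk stands in for the FX data here.

        import numpy as np

        def waiting_times(log_price, rho):
            """First-passage times (in ticks) to a log-gain of at least rho."""
            waits = []
            for t in range(len(log_price) - 1):
                hit = np.nonzero(log_price[t + 1:] - log_price[t] >= rho)[0]
                if hit.size:
                    waits.append(hit[0] + 1)
            return np.array(waits)

        rng = np.random.default_rng(9)
        lp = np.cumsum(rng.normal(0, 5e-4, size=5000))   # synthetic tick-level log price
        w = waiting_times(lp, rho=0.005)
        print(f"{w.size} events, median waiting time = {np.median(w):.0f} ticks")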

  12. Analysis of the Einstein sample of early-type galaxies

    NASA Technical Reports Server (NTRS)

    Eskridge, Paul B.; Fabbiano, Giuseppina

    1993-01-01

    The EINSTEIN galaxy catalog contains X-ray data for 148 early-type (E and S0) galaxies. A detailed analysis of the global properties of this sample is presented. By comparing the X-ray properties with other tracers of the ISM, as well as with observables related to the stellar dynamics and populations of the sample, we expect to determine more clearly the physical relationships that determine the evolution of early-type galaxies. Previous studies with smaller samples have explored the relationships between X-ray luminosity (L(sub X)) and luminosities in other bands. Using our larger sample and the statistical techniques of survival analysis, a number of these earlier analyses were repeated. For our full sample, a strong statistical correlation is found between L(sub X) and L(sub B) (the probability that the null hypothesis is upheld is P less than 10(exp -4) from a variety of rank correlation tests). Regressions with several algorithms yield consistent results.

  13. A study of tensile test on open-cell aluminum foam sandwich

    NASA Astrophysics Data System (ADS)

    Ibrahim, N. A.; Hazza, M. H. F. Al; Adesta, E. Y. T.; Abdullah Sidek, Atiah Bt.; Endut, N. A.

    2018-01-01

    Aluminum foam sandwich (AFS) panels are among the growing materials in various industries because of their lightweight behavior. AFS is also known for its excellent stiffness-to-weight ratio and high energy absorption. Because of these advantages, many researchers have shown interest in aluminum foam materials and in expanding the use of foam structures. However, a gap still needs to be filled in order to develop reliable data on the mechanical behavior of AFS with different parameters and analysis-method approaches. Few researchers have focused on open-cell aluminum foam and statistical analysis. Thus, this research was conducted using an open-cell aluminum foam core of grade 6101 with aluminum sheet skins, tested under tension. The data were analyzed using a full factorial design in the JMP statistical analysis software (version 11). The ANOVA results show a significant model value of less than 0.500, while the scatter diagram and 3D surface profiler plot show that skin thickness has a significant impact on the stress/strain values compared with core thickness.

  14. Markov chain Monte Carlo estimation of quantum states

    NASA Astrophysics Data System (ADS)

    Diguglielmo, James; Messenger, Chris; Fiurášek, Jaromír; Hage, Boris; Samblowski, Aiko; Schmidt, Tabea; Schnabel, Roman

    2009-03-01

    We apply a Bayesian data analysis scheme known as Markov chain Monte Carlo to the tomographic reconstruction of quantum states. This method yields a vector, known as the Markov chain, which contains the full statistical information concerning all reconstruction parameters, including their statistical correlations, with no a priori assumptions as to the form of the distribution from which it has been obtained. From this vector we can derive, e.g., the marginal distributions and uncertainties of all model parameters, and also of other quantities such as the purity of the reconstructed state. We demonstrate the utility of this scheme by reconstructing the Wigner function of phase-diffused squeezed states. These states possess non-Gaussian statistics and therefore represent a nontrivial case of tomographic reconstruction. We compare our results to those obtained through pure maximum-likelihood and Fisher information approaches.
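
    A toy Metropolis-Hastings sketch echoing the approach: a Markov chain over a single state parameter is drawn from measurement counts, and posterior summaries of a derived quantity (here the purity of a diagonal qubit state) are read directly off the chain. This is a schematic stand-in for full tomographic reconstruction, with the data and model invented for illustration.

        import numpy as np

        rng = np.random.default_rng(10)
        n, k = 1000, 620                     # measurements; counts of outcome |0>

        def log_post(p):                     # binomial likelihood, flat prior on (0,1)
            if not 0.0 < p < 1.0:
                return -np.inf
            return k * np.log(p) + (n - k) * np.log(1 - p)

        chain, p = [], 0.5
        for _ in range(20000):               # random-walk Metropolis-Hastings
            prop = p + 0.02 * rng.normal()
            if np.log(rng.uniform()) < log_post(prop) - log_post(p):
                p = prop
            chain.append(p)
        chain = np.array(chain[2000:])       # discard burn-in
        purity = chain**2 + (1 - chain)**2   # derived quantity, per sample
        print(f"p = {chain.mean():.3f} +/- {chain.std():.3f}, "
              f"purity = {purity.mean():.3f} +/- {purity.std():.3f}")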

  15. Music Structure Analysis from Acoustic Signals

    NASA Astrophysics Data System (ADS)

    Dannenberg, Roger B.; Goto, Masataka

    Music is full of structure, including sections, sequences of distinct musical textures, and the repetition of phrases or entire sections. The analysis of music audio relies upon feature vectors that convey information about music texture or pitch content. Texture generally refers to the average spectral shape and statistical fluctuation, often reflecting the set of sounding instruments, e.g., strings, vocal, or drums. Pitch content reflects melody and harmony, which is often independent of texture. Structure is found in several ways. Segment boundaries can be detected by observing marked changes in locally averaged texture.
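
    One standard way to detect such texture changes, sketched below: build a self-similarity matrix from frame features and slide a checkerboard kernel along its diagonal to obtain a novelty curve whose peaks mark candidate boundaries. The random features are stand-ins for, e.g., spectral descriptors of audio frames.

        import numpy as np

        def novelty_curve(features, k=8):
            """Checkerboard-kernel novelty along the self-similarity diagonal."""
            sim = features @ features.T
            v = np.concatenate([np.ones(k), -np.ones(k)])
            kernel = np.outer(v, v)
            nov = np.zeros(len(features))
            for t in range(k, len(features) - k):
                nov[t] = np.sum(kernel * sim[t - k:t + k, t - k:t + k])
            return nov

        rng = np.random.default_rng(11)
        offset = np.array([3.0] + [0.0] * 11)
        feats = np.vstack([rng.normal(size=(60, 12)) + offset,    # section A
                           rng.normal(size=(60, 12)) - offset])   # section B
        print("strongest boundary near frame", int(np.argmax(novelty_curve(feats))))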

  16. Advanced support systems development and supporting technologies for Controlled Ecological Life Support Systems (CELSS)

    NASA Technical Reports Server (NTRS)

    Simon, William E.; Li, Ku-Yen; Yaws, Carl L.; Mei, Harry T.; Nguyen, Vinh D.; Chu, Hsing-Wei

    1994-01-01

    A methyl acetate reactor was developed to perform a subscale kinetic investigation in the design and optimization of a full-scale metabolic simulator for long term testing of life support systems. Other tasks in support of the closed ecological life support system test program included: (1) heating, ventilation and air conditioning analysis of a variable pressure growth chamber, (2) experimental design for statistical analysis of plant crops, (3) resource recovery for closed life support systems, and (4) development of data acquisition software for automating an environmental growth chamber.

  17. [Analysis of microbial community structure at full-scale wastewater treatment plants by oxidation ditch].

    PubMed

    Guo, Yun; Yang, Dian-hai; Lu, Wen-jian

    2012-08-01

    The microbial populations of the oxidation ditch process at full-scale municipal wastewater treatment plants (WWTPs) in a city in north China were analyzed by fluorescent in situ hybridization (FISH). The fraction structure, variation, and distribution characteristics of Accumulibacter, as potential phosphorus-accumulating organisms (PAOs), and Competibacter, as potential glycogen-accumulating organisms (GAOs), were quantified. The results indicated that Accumulibacter comprised around 2.0% +/- 0.6%, 3.4% +/- 0.6% and 3.5% +/- 1.2% of the total biomass in the anaerobic tank, anoxic zone and aerobic zone, respectively, while the corresponding values for Competibacter were 25.3% +/- 8.7%, 30.3% +/- 7.1% and 24.4% +/- 6.1%. Lower Accumulibacter fractions were found compared with previous full-scale reports (7%-22%), indicating low phosphorus removal efficiency in the oxidation ditch system. Statistical analysis indicated that the amount of PAOs was significantly higher in the anoxic zone and the aerobic zone than in the anaerobic tank, while GAOs remained at the same level.

  18. Effect of lunar phase on frequency of psychogenic nonepileptic events in the EMU.

    PubMed

    Bolen, Robert D; Campbell, Zeke; Dennis, William A; Koontz, Elizabeth H; Pritchard, Paul B

    2016-06-01

    Studies of the effect of a full moon on seizures have yielded mixed results, despite a continuing prevailing belief regarding the association of lunar phase with human behavior. The potential effect of a full moon on psychogenic nonepileptic events has not been as well studied, despite what anecdotal accounts from most epilepsy monitoring unit (EMU) staff would suggest. We obtained the dates and times of all events from patients diagnosed with psychogenic nonepileptic events discharged from our EMU over a two-year period. The events were then plotted on a 29.5-day lunar calendar. Events were also broken down into lunar quarters for statistical analysis. We found a statistically significant increase in psychogenic nonepileptic events during the new moon quarter in our EMU during our studied timeframe. Our results are not concordant with the results of a similarly designed past study, raising the possibility that psychogenic nonepileptic events are not influenced by lunar phase. Copyright © 2016 Elsevier Inc. All rights reserved.
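
    The quarter-based comparison can be sketched as a goodness-of-fit test of event counts against a uniform expectation; the counts below are hypothetical placeholders, not the study's data.

        from scipy.stats import chisquare

        events = [38, 29, 31, 27]            # new, first, full, last quarter (hypothetical)
        stat, p = chisquare(events)          # uniform expected counts by default
        print(f"chi2 = {stat:.2f}, p = {p:.3f}")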

  19. Effects of the water level on the flow topology over the Bolund island

    NASA Astrophysics Data System (ADS)

    Cuerva-Tejero, A.; Yeow, T. S.; Gallego-Castillo, C.; Lopez-Garcia, O.

    2014-06-01

    We have analyzed the influence of the actual height of the Bolund island above water level on different full-scale statistics of the velocity field over the peninsula. Our analysis is focused on the database of 10-minute statistics provided by Risø-DTU for the Bolund Blind Experiment. We have considered 10-minute periods with near-neutral atmospheric conditions, mean wind speed values in the interval [5,20] m/s, and westerly wind directions. As expected, statistics such as speed-up, normalized increase of turbulent kinetic energy, and probability of recirculating flow show a large dependence on the emerged height of the island for the locations close to the escarpment. For the published ensemble mean values of speed-up and normalized increase of turbulent kinetic energy at these locations, we propose that some amount of the uncertainty could be explained as a deterministic dependence of the flow field statistics upon the actual height of the Bolund island above sea level.

  20. Computer Administering of the Psychological Investigations: Set-Relational Representation

    NASA Astrophysics Data System (ADS)

    Yordzhev, Krasimir

    Computer administering of a psychological investigation is the computer representation of the entire procedure of psychological assessment - test construction, test implementation, results evaluation, storage and maintenance of the developed database, its statistical processing, analysis and interpretation. A mathematical description of psychological assessment with the aid of personality tests is discussed in this article. Set theory and relational algebra are used in this description. A relational model of the data needed to design a computer system for the automation of certain psychological assessments is given. Some finite sets, and relations on them, which are necessary for creating a personality psychological test, are described. The described model could be used to develop real software for computer administering of any psychological test, with full automation of the whole process: test construction, test implementation, result evaluation, storage of the developed database, statistical processing, analysis and interpretation. A software project for computer administering of personality psychological tests is suggested.

  1. Probing the statistical properties of CMB B-mode polarization through Minkowski functionals

    NASA Astrophysics Data System (ADS)

    Santos, Larissa; Wang, Kai; Zhao, Wen

    2016-07-01

    The detection of the magnetic-type (B-mode) polarization is the main goal of future cosmic microwave background (CMB) experiments. In the standard model, the B-mode map is a strongly non-Gaussian field due to the CMB lensing component. Besides the two-point correlation function, other statistics are also very important for extracting the information contained in the polarization map. In this paper, we employ the Minkowski functionals to study the morphological properties of lensed B-mode maps. We find that the deviations from Gaussianity are very significant for both full- and partial-sky surveys. As an application of the analysis, we investigate the morphological imprints of foreground residuals in the B-mode map. We find that even very tiny foreground residuals have effects on the map that can be detected by the Minkowski functional analysis. Therefore, it provides a complementary way to investigate foreground contamination in CMB studies.
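
    A rough sketch of map-level Minkowski functionals: threshold a smoothed 2-D field at several levels and track the area fraction, boundary length, and Euler characteristic of the excursion set. The Gaussian random field stands in for a B-mode map, and scikit-image is assumed for the morphological measures.

        import numpy as np
        from scipy.ndimage import gaussian_filter
        from skimage.measure import euler_number, perimeter

        rng = np.random.default_rng(12)
        field = gaussian_filter(rng.normal(size=(256, 256)), sigma=4)
        field /= field.std()

        for nu in (-1.0, 0.0, 1.0):          # threshold levels in units of sigma
            mask = field > nu
            print(f"nu={nu:+.1f}  V0={mask.mean():.3f}  "
                  f"V1={perimeter(mask):.0f} px  V2={euler_number(mask):+d}")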

  2. Local coexistence of VO2 phases revealed by deep data analysis

    DOE PAGES

    Strelcov, Evgheni; Ievlev, Anton; Tselev, Alexander; ...

    2016-07-07

    We report a synergistic approach of micro-Raman spectroscopic mapping and deep data analysis to study the distribution of crystallographic phases and ferroelastic domains in a defected Al-doped VO2 microcrystal. Bayesian linear unmixing revealed an uneven distribution of the T phase, which is stabilized by surface defects and uneven local doping and which went undetected by other classical analysis techniques such as PCA and SIMPLISMA. This work demonstrates the impact of information recovery via statistical analysis and full mapping in spectroscopic studies of vanadium dioxide systems, which is commonly substituted by averaging or single point-probing approaches, both of which suffer from information misinterpretation due to low resolving power.

  3. Harmonic statistics

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2017-05-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a single statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed the harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
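
    A harmonic intensity λ(x) = c/x gives an expected point count of c·ln(b/a) on any interval [a, b], which depends only on the ratio b/a; this is the scale invariance the abstract alludes to. The following minimal sketch (the constant c and the intervals are hypothetical, chosen purely for illustration) samples such a process by inverse-transform sampling and checks the scale-invariant mean count:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sample_harmonic_poisson(a, b, c=1.0):
        # Point count on [a, b] is Poisson with mean c * ln(b / a).
        n = rng.poisson(c * np.log(b / a))
        # Given n, points are i.i.d. with density proportional to 1/x on
        # [a, b]; the inverse CDF is x = a * (b / a) ** u for uniform u.
        return np.sort(a * (b / a) ** rng.random(n))

    # The mean count on [a, 2a] should be ln(2) regardless of a.
    for a in (1.0, 10.0, 100.0):
        counts = [sample_harmonic_poisson(a, 2 * a).size for _ in range(20000)]
        print(f"[{a:g}, {2 * a:g}]: mean count {np.mean(counts):.3f} "
              f"(theory {np.log(2):.3f})")
    ```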

  4. MetAlign: interface-driven, versatile metabolomics tool for hyphenated full-scan mass spectrometry data preprocessing.

    PubMed

    Lommen, Arjen

    2009-04-15

    Hyphenated full-scan MS technology creates large amounts of data. A versatile, easy-to-handle automation tool aiding in the data analysis is very important in handling such a data stream. MetAlign software, as described in this manuscript, handles a broad range of accurate mass and nominal mass GC/MS and LC/MS data. It is capable of automatic format conversions, accurate mass calculations, baseline corrections, peak-picking, saturation and mass-peak artifact filtering, as well as alignment of up to 1000 data sets. A 100- to 1000-fold data reduction is achieved. MetAlign software output is compatible with most multivariate statistics programs.

  5. Lognormal-like statistics of a stochastic squeeze process

    NASA Astrophysics Data System (ADS)

    Shapira, Dekel; Cohen, Doron

    2017-10-01

    We analyze the full statistics of a stochastic squeeze process. The model's two parameters are the bare stretching rate w and the angular diffusion coefficient D. We carry out an exact analysis to determine the drift and the diffusion coefficient of log(r), where r is the radial coordinate. The results go beyond the heuristic lognormal description that is implied by the central limit theorem. Contrary to the common "quantum Zeno" approximation, the radial diffusion is not simply D_r = (1/8)w^2/D but has a nonmonotonic dependence on w/D. Furthermore, the calculation of the radial moments is dominated by the far non-Gaussian tails of the log(r) distribution.
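
    For intuition, here is one plausible toy realization of a stochastic squeeze process (the concrete dynamics below are an assumption for illustration, not the paper's model): the squeezing-axis angle θ diffuses with coefficient D while log(r) accumulates increments w·cos(2θ)·dt. In the fast-diffusion regime this toy reproduces the heuristic w^2/(8D) value; the paper's exact analysis is what reveals the deviations from it.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def log_r_diffusion(w, D, T=100.0, dt=0.01, n_traj=1000):
        # Toy squeeze process: theta diffuses (Var theta = 2*D*t) and the
        # radial log-coordinate is stretched along the instantaneous axis.
        theta = rng.uniform(0.0, 2.0 * np.pi, n_traj)
        log_r = np.zeros(n_traj)
        for _ in range(int(T / dt)):
            log_r += w * np.cos(2.0 * theta) * dt
            theta += np.sqrt(2.0 * D * dt) * rng.standard_normal(n_traj)
        return log_r.var() / (2.0 * T)   # empirical diffusion coefficient

    w, D = 1.0, 4.0
    print("empirical D_r:", round(log_r_diffusion(w, D), 4))
    print("heuristic w^2/(8D):", round(w**2 / (8 * D), 4))
    ```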

  6. TOMS and SBUV Data: Comparison to 3D Chemical-Transport Model Results

    NASA Technical Reports Server (NTRS)

    Stolarski, Richard S.; Douglass, Anne R.; Steenrod, Steve; Frith, Stacey

    2003-01-01

    We have updated our merged ozone data (MOD) set using the TOMS data from the new version 8 algorithm. We then analyzed these data for contributions from solar cycle, volcanoes, QBO, and halogens using a standard statistical time series model. We have recently completed a hindcast run of our 3D chemical-transport model for the same years. This model uses off-line winds from the finite-volume GCM, a full stratospheric photochemistry package, and time-varying forcing due to halogens, solar uv, and volcanic aerosols. We will report on a parallel analysis of these model results using the same statistical time series technique as used for the MOD data.

  7. Andreev Bound States Formation and Quasiparticle Trapping in Quench Dynamics Revealed by Time-Dependent Counting Statistics.

    PubMed

    Souto, R Seoane; Martín-Rodero, A; Yeyati, A Levy

    2016-12-23

    We analyze the quantum quench dynamics in the formation of a phase-biased superconducting nanojunction. We find that in the absence of an external relaxation mechanism and for very general conditions the system gets trapped in a metastable state, corresponding to a nonequilibrium population of the Andreev bound states. The use of the time-dependent full counting statistics analysis allows us to extract information on the asymptotic population of even and odd many-body states, demonstrating that a universal behavior, dependent only on the Andreev state energy, is reached in the quantum point contact limit. These results shed light on recent experimental observations on quasiparticle trapping in superconducting atomic contacts.

  8. AMSARA: Accession Medical Standards Analysis and Research Activity. Report of 2006 Attrition and Morbidity Data for 2005 Accessions

    DTIC Science & Technology

    2007-12-17

    Accession Medical Standards Analysis and Research Activity (AMSARA), Department of Epidemiology, Division of Preventive Medicine, Walter Reed Army Institute of Research, 503 Robert... Keywords: Hospitalizations, Recruits, Epidemiology, Attrition, Disability, Statistics, Preventive Medicine, Physical Fitness, Motivation, Accession, Waiver, Existing Prior to... These findings on active duty Army enlistees are presented in abstract form in this report and as a full manuscript submitted to Military Medicine.

  9. Statistical mixture design and multivariate analysis of inkjet printed a-WO3/TiO2/WOX electrochromic films.

    PubMed

    Wojcik, Pawel Jerzy; Pereira, Luís; Martins, Rodrigo; Fortunato, Elvira

    2014-01-13

    An efficient mathematical strategy in the field of solution processed electrochromic (EC) films is outlined as a combination of an experimental work, modeling, and information extraction from massive computational data via statistical software. Design of Experiment (DOE) was used for statistical multivariate analysis and prediction of mixtures through a multiple regression model, as well as the optimization of a five-component sol-gel precursor subjected to complex constraints. This approach significantly reduces the number of experiments to be realized, from 162 in the full factorial (L=3) and 72 in the extreme vertices (D=2) approach down to only 30 runs, while still maintaining a high accuracy of the analysis. By carrying out a finite number of experiments, the empirical modeling in this study shows reasonably good prediction ability in terms of the overall EC performance. An optimized ink formulation was employed in a prototype of a passive EC matrix fabricated in order to test and trial this optically active material system together with a solid-state electrolyte for the prospective application in EC displays. Coupling of DOE with chromogenic material formulation shows the potential to maximize the capabilities of these systems and ensures increased productivity in many potential solution-processed electrochemical applications.

  10. Impact of non-specific normal databases on perfusion quantification of low-dose myocardial SPECT studies.

    PubMed

    Scabbio, Camilla; Zoccarato, Orazio; Malaspina, Simona; Lucignani, Giovanni; Del Sole, Angelo; Lecchi, Michela

    2017-10-17

    To evaluate the impact of non-specific normal databases on the percent summed rest score (SR%) and stress score (SS%) from simulated low-dose SPECT studies obtained by shortening the acquisition time per projection. Forty normal-weight and 40 overweight/obese patients underwent myocardial studies with a conventional gamma-camera (BrightView, Philips) using three different acquisition times per projection: 30, 15, and 8 s (100%-counts, 50%-counts, and 25%-counts scans, respectively), reconstructed using the iterative algorithm with resolution recovery (IRR) Astonish™ (Philips). Three sets of normal databases were used: (1) full-counts IRR; (2) half-counts IRR; and (3) full-counts traditional reconstruction algorithm database (TRAD). The impact of these databases and the acquired count statistics on the SR% and SS% was assessed by ANOVA and the Tukey test (P < 0.05). Significantly higher SR% and SS% values (> 40%) were found for the full-counts TRAD databases compared with the IRR databases. For overweight/obese patients, significantly higher SS% values for 25%-counts scans (+19%) were confirmed compared with those of 50%-counts scans, independently of whether the half-counts or the full-counts IRR databases were used. Astonish™ therefore requires the adoption of its own specific normal databases in order to prevent very high overestimation of both stress and rest perfusion scores. Conversely, the count statistics of the normal databases seem not to influence the quantification scores.

  11. Probabilistic Modeling and Visualization of the Flexibility in Morphable Models

    NASA Astrophysics Data System (ADS)

    Lüthi, M.; Albrecht, T.; Vetter, T.

    Statistical shape models, and in particular morphable models, have gained widespread use in computer vision, computer graphics and medical imaging. Researchers have started to build models of almost any anatomical structure in the human body. While these models provide a useful prior for many image analysis tasks, relatively little of the information about the shape represented by the morphable model is exploited. We propose a method for computing and visualizing the remaining flexibility when a part of the shape is fixed. Our method, which is based on Probabilistic PCA, not only leads to an approach for reconstructing the full shape from partial information, but also allows us to investigate and visualize the uncertainty of a reconstruction. To show the feasibility of our approach we performed experiments on a statistical model of the human face and the femur bone. The visualization of the remaining flexibility allows for greater insight into the statistical properties of the shape.

  12. 'Chain pooling' model selection as developed for the statistical analysis of a rotor burst protection experiment

    NASA Technical Reports Server (NTRS)

    Holms, A. G.

    1977-01-01

    A statistical decision procedure called chain pooling had been developed for model selection in fitting the results of a two-level fixed-effects full or fractional factorial experiment not having replication. The basic strategy included the use of one nominal level of significance for a preliminary test and a second nominal level of significance for the final test. The subject has been reexamined from the point of view of using as many as three successive statistical model deletion procedures in fitting the results of a single experiment. The investigation consisted of random number studies intended to simulate the results of a proposed aircraft turbine-engine rotor-burst-protection experiment. As a conservative approach, population model coefficients were chosen to represent a saturated 2^4 experiment with a distribution of parameter values unfavorable to the decision procedures. Three model selection strategies were developed.
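
    The mechanics of pooling in an unreplicated two-level factorial can be sketched as follows (the data, effect sizes, and one-shot pooling rule below are hypothetical; the actual chain-pooling procedure applies staged nominal significance levels as described above): all 15 contrasts of a saturated 2^4 design are estimated, the smallest ones are pooled into an error mean square, and the remainder are tested against it.

    ```python
    import numpy as np
    from itertools import combinations
    from scipy import stats

    rng = np.random.default_rng(2)

    # Saturated model matrix for an unreplicated 2^4 factorial.
    levels = np.array([[1 if (run >> bit) & 1 else -1 for bit in range(4)]
                       for run in range(16)])
    names, cols = [], []
    for k in range(1, 5):
        for combo in combinations(range(4), k):
            names.append("".join("ABCD"[i] for i in combo))
            cols.append(np.prod(levels[:, list(combo)], axis=1))
    X = np.column_stack(cols)

    # Simulated response: two real effects (A and AB) plus unit noise.
    y = 10 + 3.0 * X[:, names.index("A")] + 2.0 * X[:, names.index("AB")] \
        + rng.normal(0, 1, 16)

    coefs = X.T @ y / 16                  # each has variance sigma^2/16
    order = np.argsort(np.abs(coefs))
    pool, keep = order[:10], order[10:]   # pool the 10 smallest contrasts
    ms_error = np.mean(coefs[pool] ** 2)  # estimates sigma^2/16
    for i in keep:
        F = coefs[i] ** 2 / ms_error
        p = stats.f.sf(F, 1, pool.size)
        print(f"{names[i]:>4}: effect {2 * coefs[i]:+.2f}, p = {p:.3f}")
    ```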

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiang, N. B.; Kong, D. F., E-mail: nanbin@ynao.ac.cn

    The Physikalisch-Meteorologisches Observatorium Davos total solar irradiance (TSI), Active Cavity Radiometer Irradiance Monitoring TSI, and Royal Meteorological Institute of Belgium TSI are three typical TSI composites. The Magnetic Plage Strength Index (MPSI) and Mount Wilson Sunspot Index (MWSI) should indicate the weak and strong magnetic field activity on the solar full disk, respectively. Cross-correlation (CC) analysis of MWSI with the three TSI composites shows that TSI should be weakly correlated with MWSI, and not be in phase with MWSI at timescales of solar cycles. The wavelet coherence (WTC) and partial wavelet coherence (PWC) of TSI with MWSI indicate that the inter-solar-cycle variation of TSI is also not related to solar strong magnetic field activity, which is represented by MWSI. However, CC analysis of MPSI with the three TSI composites indicates that TSI should be moderately correlated and accurately in phase with MPSI at timescales of solar cycles, and the statistical significance test indicates that the correlation coefficient of the three TSI composites with MPSI is statistically significantly higher than that of the three TSI composites with MWSI. Furthermore, the cross wavelet transform (XWT) and WTC of TSI with MPSI show that the TSI is highly related and actually in phase with MPSI at the timescale of a solar cycle as well. Consequently, the CC analysis, XWT, and WTC indicate that the solar weak magnetic activity on the full disk, which is represented by MPSI, dominates the inter-solar-cycle variation of TSI.

  14. [Burnout syndrome in teachers from two universities in Popayán, Colombia].

    PubMed

    Correa-Correa, Zamanda; Muñoz-Zambrano, Isabel; Chaparro, Andrés F

    2010-08-01

    Evaluating professional exhaustion or burnout syndrome (background, syndrome and consequences) amongst half-time and full-time staff working in two private universities in the city of Popayán during 2008. The study population included 44 male and female participants aged 20 to 40 who were evaluated by using a brief burnout questionnaire (BBQ), which had been validated for Latin America and for teachers. The questionnaire was not exclusively focused on the structure of the syndrome itself but rather included background elements and consequences. The study was quantitative and cross-sectional, with a hypothetico-deductive methodological focus. Descriptive statistics and the Chi-square test were used for data analysis, accepting p<0.05 statistical significance. The analysis was univariate and bivariate. The results indicated a low burnout syndrome frequency in the study population. However, a high depersonalization frequency of 9.1% was found (i.e. teachers had developed negative attitudes and were insensitive to those receiving their services), along with frequencies of 15.9% and 9.1% for high physical and social consequences, respectively. Bivariate analysis revealed significant associations for several factors; those highly associated with physical and social consequences were: being male, aged 20 to 40, having a marital relationship with a habitual partner, working full-time, working at home and spending more than 75% of the working day interacting with the beneficiaries of the services being provided.

  15. Full-scale measurements and system identification on Sutong cable-stayed bridge during Typhoon Fung-Wong.

    PubMed

    Wang, Hao; Tao, Tianyou; Guo, Tong; Li, Jian; Li, Aiqun

    2014-01-01

    The structural health monitoring system (SHMS) provides an effective tool to conduct full-scale measurements on existing bridges for essential research on bridge wind engineering. In July 2008, Typhoon Fung-Wong lashed China and hit the Sutong cable-stayed bridge (SCB). During the typhoon period, full-scale measurements were conducted to record the wind data, and the structural vibration responses were collected by the SHMS installed on the SCB. Based on the statistical method and the spectral analysis technique, the measured data are analyzed to obtain the typical parameters and characteristics. Furthermore, this paper analyzes the measured structural vibration responses and describes the vibration characteristics of the stay cable and the deck, the relationship between structural vibrations and wind speed, the comparison of upstream and downstream cable vibrations, the effectiveness of cable dampers, and so forth. Considering the significance of damping ratio in vibration mitigation, the modal damping ratios of the SCB are identified based on the Hilbert-Huang transform (HHT) combined with the random decrement technique (RDT). The analysis results can be used to validate the current dynamic characteristic analysis methods, buffeting calculation methods, and wind tunnel test results of long-span cable-stayed bridges.

  16. Full-Scale Measurements and System Identification on Sutong Cable-Stayed Bridge during Typhoon Fung-Wong

    PubMed Central

    Tao, Tianyou; Li, Aiqun

    2014-01-01

    The structural health monitoring system (SHMS) provides an effective tool to conduct full-scale measurements on existing bridges for essential research on bridge wind engineering. In July 2008, Typhoon Fung-Wong lashed China and hit the Sutong cable-stayed bridge (SCB). During the typhoon period, full-scale measurements were conducted to record the wind data, and the structural vibration responses were collected by the SHMS installed on the SCB. Based on the statistical method and the spectral analysis technique, the measured data are analyzed to obtain the typical parameters and characteristics. Furthermore, this paper analyzes the measured structural vibration responses and describes the vibration characteristics of the stay cable and the deck, the relationship between structural vibrations and wind speed, the comparison of upstream and downstream cable vibrations, the effectiveness of cable dampers, and so forth. Considering the significance of damping ratio in vibration mitigation, the modal damping ratios of the SCB are identified based on the Hilbert-Huang transform (HHT) combined with the random decrement technique (RDT). The analysis results can be used to validate the current dynamic characteristic analysis methods, buffeting calculation methods, and wind tunnel test results of long-span cable-stayed bridges. PMID:24995367

  17. Efficacy of platelet-rich plasma in arthroscopic repair of full-thickness rotator cuff tears: a meta-analysis.

    PubMed

    Cai, You-zhi; Zhang, Chi; Lin, Xiang-jin

    2015-12-01

    The use of platelet-rich plasma (PRP) is an innovative clinical therapy, especially in arthroscopic rotator cuff repair. The purpose of this study was to compare the clinical improvement and tendon-to-bone healing with and without PRP therapy in arthroscopic rotator cuff repair. A systematic search was done in the major medical databases to evaluate the studies using PRP therapy (PRP+) or with no PRP (PRP-) for the treatment of patients with rotator cuff tears. We reviewed clinical scores such as the Constant score, the American Shoulder and Elbow Surgeons score, the University of California at Los Angeles (UCLA) Shoulder Rating Scale, the Simple Shoulder Test, and the failure-to-heal rate by magnetic resonance imaging between PRP+ and PRP- groups. Five studies included in this review were used for a meta-analysis based on data availability. There were no statistically significant differences between PRP+ and PRP- groups for overall outcome scores (P > .05). However, the PRP+ group exhibited better healing rates postoperatively than the PRP- group (P = .03) in small/moderate full-thickness tears. The use of PRP therapy in full-thickness rotator cuff repairs showed no statistically significant difference compared with no PRP therapy in clinical outcome scores, but the failure-to-heal rate was significantly decreased when PRP was used for treatment of small-to-moderately sized tears. PRP therapy may improve tendon-to-bone healing in patients with small or moderate rotator cuff tears. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  18. Computational Analysis for Rocket-Based Combined-Cycle Systems During Rocket-Only Operation

    NASA Technical Reports Server (NTRS)

    Steffen, C. J., Jr.; Smith, T. D.; Yungster, S.; Keller, D. J.

    2000-01-01

    A series of Reynolds-averaged Navier-Stokes calculations was employed to study the performance of rocket-based combined-cycle systems operating in an all-rocket mode. This parametric series of calculations was executed within a statistical framework, commonly known as design of experiments. The parametric design space included four geometric and two flowfield variables set at three levels each, for a total of 729 possible combinations. A D-optimal design strategy was selected. It required that only 36 separate computational fluid dynamics (CFD) solutions be performed to develop a full response surface model, which quantified the linear, bilinear, and curvilinear effects of the six experimental variables. The axisymmetric, Reynolds-averaged Navier-Stokes simulations were executed with the NPARC v3.0 code. The response used in the statistical analysis was created from Isp efficiency data integrated from the 36 CFD simulations. The influence of turbulence modeling was analyzed by using both one- and two-equation models. Careful attention was also given to quantifying the influence of mesh dependence, iterative convergence, and artificial viscosity upon the resulting statistical model. Thirteen statistically significant effects were observed to have an influence on rocket-based combined-cycle nozzle performance. It was apparent that the free-expansion process, directly downstream of the rocket nozzle, can influence the Isp efficiency. Numerical schlieren images and particle traces have been used to further understand the physical phenomena behind several of the statistically significant results.
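
    The response-surface step can be reproduced in miniature with synthetic data (the 36 runs, the factor effects, and the noise level below are hypothetical stand-ins, not the study's CFD results): a degree-2 polynomial in six coded factors captures exactly the linear, bilinear, and curvilinear effects that such an analysis screens for.

    ```python
    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(3)

    # Hypothetical stand-in for 36 design runs: 6 coded factors, 3 levels.
    X = rng.choice([-1.0, 0.0, 1.0], size=(36, 6))
    # Synthetic Isp-efficiency response with linear, bilinear and
    # curvature terms plus small noise.
    y = (0.90 + 0.02 * X[:, 0] - 0.015 * X[:, 3]
         + 0.01 * X[:, 0] * X[:, 1] - 0.012 * X[:, 2] ** 2
         + rng.normal(0, 0.003, 36))

    poly = PolynomialFeatures(degree=2, include_bias=False)
    Z = poly.fit_transform(X)
    model = LinearRegression().fit(Z, y)

    # Report the largest fitted effects, as a response-surface screen would.
    terms = poly.get_feature_names_out([f"x{i}" for i in range(6)])
    for i in np.argsort(-np.abs(model.coef_))[:5]:
        print(f"{terms[i]:>8}: {model.coef_[i]:+.4f}")
    ```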

  19. Improving the Prognostic Ability through Better Use of Standard Clinical Data - The Nottingham Prognostic Index as an Example

    PubMed Central

    Winzer, Klaus-Jürgen; Buchholz, Anika; Schumacher, Martin; Sauerbrei, Willi

    2016-01-01

    Background Prognostic factors and prognostic models play a key role in medical research and patient management. The Nottingham Prognostic Index (NPI) is a well-established prognostic classification scheme for patients with breast cancer. In a very simple way, it combines the information from tumor size, lymph node stage and tumor grade. For the resulting index cutpoints are proposed to classify it into three to six groups with different prognosis. As not all prognostic information from the three and other standard factors is used, we will consider improvement of the prognostic ability using suitable analysis approaches. Methods and Findings Reanalyzing overall survival data of 1560 patients from a clinical database by using multivariable fractional polynomials and further modern statistical methods we illustrate suitable multivariable modelling and methods to derive and assess the prognostic ability of an index. Using a REMARK type profile we summarize relevant steps of the analysis. Adding the information from hormonal receptor status and using the full information from the three NPI components, specifically concerning the number of positive lymph nodes, an extended NPI with improved prognostic ability is derived. Conclusions The prognostic ability of even one of the best established prognostic index in medicine can be improved by using suitable statistical methodology to extract the full information from standard clinical data. This extended version of the NPI can serve as a benchmark to assess the added value of new information, ranging from a new single clinical marker to a derived index from omics data. An established benchmark would also help to harmonize the statistical analyses of such studies and protect against the propagation of many false promises concerning the prognostic value of new measurements. Statistical methods used are generally available and can be used for similar analyses in other diseases. PMID:26938061

  20. Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.

    PubMed

    Festing, M F

    2001-01-01

    In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
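
    As a concrete illustration of the randomised-block case described above (all numbers below are hypothetical), a two-way ANOVA with blocks as an additive factor separates run-to-run variation from the treatment effects of interest:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    rng = np.random.default_rng(4)

    # Hypothetical randomized-block in vitro experiment: 4 treatments
    # replicated in 3 independent runs (blocks, e.g. days).
    treatments = np.tile(["control", "A", "B", "C"], 3)
    blocks = np.repeat(["day1", "day2", "day3"], 4)
    block_eff = {"day1": 0.0, "day2": 0.5, "day3": -0.3}
    treat_eff = {"control": 0.0, "A": 1.2, "B": 0.2, "C": 0.1}
    y = [10 + treat_eff[t] + block_eff[b] + rng.normal(0, 0.4)
         for t, b in zip(treatments, blocks)]
    df = pd.DataFrame({"treatment": treatments, "block": blocks, "y": y})

    # Two-way ANOVA without interaction: blocks absorb day-to-day shifts.
    model = ols("y ~ C(treatment) + C(block)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))
    ```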

  1. Lung Cancer Risk Prediction Model Incorporating Lung Function: Development and Validation in the UK Biobank Prospective Cohort Study.

    PubMed

    Muller, David C; Johansson, Mattias; Brennan, Paul

    2017-03-10

    Purpose Several lung cancer risk prediction models have been developed, but none to date have assessed the predictive ability of lung function in a population-based cohort. We sought to develop and internally validate a model incorporating lung function using data from the UK Biobank prospective cohort study. Methods This analysis included 502,321 participants without a previous diagnosis of lung cancer, predominantly between 40 and 70 years of age. We used flexible parametric survival models to estimate the 2-year probability of lung cancer, accounting for the competing risk of death. Models included predictors previously shown to be associated with lung cancer risk, including sex, variables related to smoking history and nicotine addiction, medical history, family history of lung cancer, and lung function (forced expiratory volume in 1 second [FEV1]). Results During accumulated follow-up of 1,469,518 person-years, there were 738 lung cancer diagnoses. A model incorporating all predictors had excellent discrimination (concordance (c)-statistic [95% CI] = 0.85 [0.82 to 0.87]). Internal validation suggested that the model will discriminate well when applied to new data (optimism-corrected c-statistic = 0.84). The full model, including FEV1, also had modestly superior discriminatory power to a model based solely on questionnaire variables (c-statistic = 0.84 [0.82 to 0.86]; optimism-corrected c-statistic = 0.83; p_FEV1 = 3.4 × 10^-13). The full model had better discrimination than standard lung cancer screening eligibility criteria (c-statistic = 0.66 [0.64 to 0.69]). Conclusion A risk prediction model that includes lung function has strong predictive ability, which could improve eligibility criteria for lung cancer screening programs.
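
    For a fixed horizon with complete follow-up, the c-statistic reduces to the area under the ROC curve. The sketch below uses synthetic risks and outcomes (the real analysis used flexible parametric survival models with competing risks, which this deliberately ignores) to show the basic computation together with a naive bootstrap:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(5)

    # Hypothetical cohort: predicted 2-year risks and binary outcomes.
    n = 5000
    risk = rng.beta(1, 200, n)              # mostly small predicted risks
    outcome = rng.random(n) < np.clip(4 * risk, 0, 1)

    print("c-statistic:", round(roc_auc_score(outcome, risk), 3))

    # Bootstrap spread of the c-statistic (no model refitting here, so
    # this gauges sampling variability rather than optimism).
    boots = [roc_auc_score(outcome[i], risk[i])
             for i in (rng.integers(0, n, n) for _ in range(200))]
    print("bootstrap 95% interval:",
          np.round(np.percentile(boots, [2.5, 97.5]), 3))
    ```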

  2. Comparison of full-mouth disinfection and quadrant-wise scaling in the treatment of adult chronic periodontitis: a systematic review and meta-analysis.

    PubMed

    Fang, H; Han, M; Li, Q-L; Cao, C Y; Xia, R; Zhang, Z-H

    2016-08-01

    Scaling and root planing are widely considered as effective methods for treating chronic periodontitis. A meta-analysis published in 2008 showed no statistically significant differences between full-mouth disinfection (FMD) or full-mouth scaling and root planing (FMS) and quadrant scaling and root planing (Q-SRP). The FMD approach only resulted in modest additional improvements in several indices. Whether differences exist between these two approaches requires further validation. Accordingly, a study was conducted to further validate whether FMD with antiseptics or FMS without the use of antiseptics within 24 h provides greater clinical improvement than Q-SRP in patients with chronic periodontitis. Medline (via OVID), EMBASE (via OVID), PubMed and CENTRAL databases were searched up to 27 January 2015. Randomized controlled trials comparing FMD or FMS with Q-SRP after at least 3 mo were included. Meta-analysis was performed to obtain the weighted mean difference (WMD), together with the corresponding 95% confidence intervals. Thirteen articles were included in the meta-analysis. The WMD of probing pocket depth reduction was 0.25 mm (p < 0.05) for FMD vs. Q-SRP in single-rooted teeth with moderate pockets, and clinical attachment level gain in single- and multirooted teeth with moderate pockets was 0.33 mm (p < 0.05) for FMD vs. Q-SRP. Except for those, no statistically significant differences were found in the other subanalyses of FMD vs. Q-SRP, FMS vs. Q-SRP and FMD vs. FMS. Therefore, the meta-analysis results showed that FMD was better than Q-SRP for achieving probing pocket depth reduction and clinical attachment level gain in moderate pockets. Additionally, regardless of the treatment, no serious complications were observed. FMD, FMS and Q-SRP are all effective for the treatment of adult chronic periodontitis, and they do not lead to any obvious discomfort among patients. Moreover, FMD had modest additional clinical benefits over Q-SRP, so we prefer to recommend FMD as the first choice for the treatment of adult chronic periodontitis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  3. The impact of the 2007-2009 recession on workers' health coverage.

    PubMed

    Fronstin, Paul

    2011-04-01

    IMPACT OF THE RECESSION: The 2007-2009 recession has taken its toll on the percentage of the population with employment-based health coverage. While, since 2000, there has been a slow erosion in the percentage of individuals under age 65 with employment-based health coverage, 2009 was the first year in which the percentage fell below 60 percent, and marked the largest one-year decline in coverage. FEWER WORKERS WITH COVERAGE: The percentage of workers with coverage through their own job fell from 53.2 percent in 2008 to 52 percent in 2009, a 2.4 percent decline in the likelihood that a worker has coverage through his or her own job. The percentage of workers with coverage as a dependent fell from 17 percent in 2008 to 16.3 percent in 2009, a 4.5 percent drop in the likelihood that a worker has coverage as a dependent. These declines occurred as the unemployment rate increased from an average of 5.8 percent in 2008 to 9.3 percent in 2009 (and reached a high of 10.1 percent during 2009). FIRM SIZE/INDUSTRY: The decline in the percentage of workers with coverage from their own job affected workers in private-sector firms of all sizes. Among public-sector workers, the decline from 73.4 percent to 73 percent was not statistically significant. Workers in all private-sector industries experienced a statistically significant decline in coverage between 2008 and 2009. HOURS WORKED: Full-time workers experienced a decline in coverage that was statistically significant while part-time workers did not. Among full-time workers, those employed full year experienced a statistically significant decline in coverage from their own job. Those employed full time but for only part of the year did not experience a statistically significant change in coverage. Among part-time workers, those employed full year experienced a statistically significant increase in the likelihood of having coverage in their own name, as did part-time workers employed for only part of the year. ANNUAL EARNINGS: The decline in the percentage of workers with coverage through their own job was limited to workers with lower annual earnings. Statistically significant declines were not found among any group of workers with annual earnings of at least $40,000. Workers with a high school education or less experienced a statistically significant decline in the likelihood of having coverage. Neither workers with a college degree nor those with a graduate degree experienced a statistically significant decline in coverage through their own job. Workers of all races experienced statistically significant declines in coverage between 2008 and 2009. Both men and women experienced a statistically significant decline in the percentage with health coverage through their own job. IMPACT OF STRUCTURAL CHANGES TO THE WORK FORCE: The movement of workers from the manufacturing industry to the service sector continued between 2008 and 2009. The percentage of workers employed on a full-time basis decreased while the percentage working part time increased. While there was an overall decline in the percentage of full-time workers, that decline was limited to workers employed full year. The percentage of workers employed on a full-time, part-year basis increased between 2008 and 2009. The distribution of workers by annual earnings shifted from middle-income workers to lower-income workers between 2008 and 2009.

  4. The Mindful Attention Awareness Scale: Further Examination of Dimensionality, Reliability, and Concurrent Validity Estimates.

    PubMed

    Osman, Augustine; Lamis, Dorian A; Bagge, Courtney L; Freedenthal, Stacey; Barnes, Sean M

    2016-01-01

    We examined the factor structure and psychometric properties of the Mindful Attention Awareness Scale (MAAS) in a sample of 810 undergraduate students. Using common exploratory factor analysis (EFA), we obtained evidence for a 1-factor solution (41.84% common variance). To confirm unidimensionality of the 15-item MAAS, we conducted a 1-factor confirmatory factor analysis (CFA). Results of the EFA and CFA, respectively, provided support for a unidimensional model. Using differential item functioning analysis methods within item response theory modeling (IRT-based DIF), we found that individuals with high and low levels of nonattachment responded similarly to the MAAS items. Following a detailed item analysis, we proposed a 5-item short version of the instrument and present descriptive statistics and composite score reliability for the short and full versions of the MAAS. Finally, correlation analyses showed that scores on the full and short versions of the MAAS were associated with measures assessing related constructs. The 5-item MAAS is as useful as the original MAAS in enhancing our understanding of the mindfulness construct.
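
    A one-factor solution of the kind reported can be checked with a maximum-likelihood factor analysis. The sketch below uses simulated item responses with hypothetical loadings (not the MAAS data) and recovers the loadings and the share of common variance:

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(6)

    # Hypothetical responses: 15 items driven by one latent factor,
    # mimicking the unidimensional structure reported for the MAAS.
    n, items = 810, 15
    latent = rng.normal(size=(n, 1))
    loadings = rng.uniform(0.5, 0.8, size=(1, items))
    scores = latent @ loadings + rng.normal(0, 0.7, size=(n, items))

    fa = FactorAnalysis(n_components=1).fit(scores)
    communal = fa.components_[0] ** 2
    print("estimated loadings:", np.round(fa.components_[0], 2))
    print("common variance explained:",
          f"{communal.sum() / scores.var(axis=0).sum():.1%}")
    ```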

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  6. Imperial College near infrared spectroscopy neuroimaging analysis framework.

    PubMed

    Orihuela-Espina, Felipe; Leff, Daniel R; James, David R C; Darzi, Ara W; Yang, Guang-Zhong

    2018-01-01

    This paper describes the Imperial College near infrared spectroscopy neuroimaging analysis (ICNNA) software tool for functional near infrared spectroscopy neuroimaging data. ICNNA is a MATLAB-based object-oriented framework encompassing an application programming interface and a graphical user interface. ICNNA incorporates reconstruction based on the modified Beer-Lambert law and basic processing and data validation capabilities. Emphasis is placed on the full experiment rather than individual neuroimages as the central element of analysis. The software offers three types of analyses including classical statistical methods based on comparison of changes in relative concentrations of hemoglobin between the task and baseline periods, graph theory-based metrics of connectivity and, distinctively, an analysis approach based on manifold embedding. This paper presents the different capabilities of ICNNA in its current version.

  7. Usage Statistics

    MedlinePlus

    MedlinePlus usage statistics page, reporting quarterly page views and unique visitors: https://medlineplus.gov/usestatistics.html

  8. Hydrostratigraphic analysis of the MADE site with full-resolution GPR and direct-push hydraulic profiling

    USGS Publications Warehouse

    Dogan, M.; Van Dam, R. L.; Bohling, Geoffrey C.; Butler, J.J.; Hyndman, D.W.

    2011-01-01

    Full-resolution 3D Ground-Penetrating Radar (GPR) data were combined with high-resolution hydraulic conductivity (K) data from vertical Direct-Push (DP) profiles to characterize a portion of the highly heterogeneous MAcro Dispersion Experiment (MADE) site. This is an important first step to better understand the influence of aquifer heterogeneities on observed anomalous transport. Statistical evaluation of DP data indicates non-normal distributions that have much higher similarity within each GPR facies than between facies. The analysis of GPR and DP data provides high-resolution estimates of the 3D geometry of hydrostratigraphic zones, which can then be populated with stochastic K fields. The lack of such estimates has been a significant limitation for testing and parameterizing a range of novel transport theories at sites where the traditional advection-dispersion model has proven inadequate. © 2011 by the American Geophysical Union.

  9. Detector Development for the MARE Neutrino Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galeazzi, M.; Bogorin, D.; Molina, R.

    2009-12-16

    The MARE experiment is designed to measure the mass of the neutrino with sub-eV sensitivity by measuring the beta decay of ^187Re with cryogenic microcalorimeters. A preliminary analysis shows that, to achieve the necessary statistics, between 10,000 and 50,000 detectors are likely necessary. We have fabricated and characterized Iridium transition edge sensors with high reproducibility and uniformity for such a large scale experiment. We have also started a full scale simulation of the experimental setup for MARE, including thermalization in the absorber, detector response, and optimum filter analysis, to understand the issues related to reaching a sub-eV sensitivity and to optimize the design of the MARE experiment. We present our characterization of the Ir devices, including reproducibility, uniformity, and sensitivity, and we discuss the implementation and capabilities of our full scale simulation.

  10. Fractographic study of the behavior of different ceramic veneers on full coverage crowns in relation to supporting core materials

    PubMed Central

    Agustín-Panadero, Rubén; Román-Rodriguez, Juan L.; Solá-Ruíz, María F.; Granell-Ruíz, María; Fons-Font, Antonio

    2013-01-01

    Objectives: To observe porcelain veneer behavior of zirconia and metal-ceramic full coverage crowns when subjected to compression testing, comparing zirconia cores to metal cores. Study Design: The porcelain fracture surfaces of 120 full coverage crowns (60 with a metal core and 60 with a zirconia core) subjected to static load (compression) testing were analyzed. Image analysis was performed using macroscopic processing with 8x and 12x enlargement. Five samples from each group were prepared and underwent scanning electron microscope (SEM) analysis in order to make a fractographic study of fracture propagation in the contact area and composition analysis in the most significant areas of the specimen. Results: Statistically significant differences in fracture type (cohesive or adhesive) were found between the metal-ceramic and zirconia groups: the incidence of adhesive fracture was seen to be greater in metal-ceramic groups (92%) and cohesive fracture was more frequent in zirconium oxide groups (72%). The fracture propagation pattern was on the periphery of the contact area in the full coverage crown restorations selected for fractographic study. Conclusions: The greater frequency of cohesive fracture in restorations with zirconia cores indicates that their behavior is inadequate compared to metal-ceramic restorations and that further research is needed to improve their clinical performance. Key words: Zirconia, zirconium oxide, fractography, composition, porcelain veneers, fracture, cohesive, adhesive. PMID:24455092

  11. Precision of guided scanning procedures for full-arch digital impressions in vivo.

    PubMed

    Zimmermann, Moritz; Koller, Christina; Rumetsch, Moritz; Ender, Andreas; Mehl, Albert

    2017-11-01

    System-specific scanning strategies have been shown to influence the accuracy of full-arch digital impressions. Special guided scanning procedures have been implemented for specific intraoral scanning systems with special regard to the digital orthodontic workflow. The aim of this study was to evaluate the precision of guided scanning procedures compared to conventional impression techniques in vivo. Two intraoral scanning systems with implemented full-arch guided scanning procedures (Cerec Omnicam Ortho; Ormco Lythos) were included along with one conventional impression technique with irreversible hydrocolloid material (alginate). Full-arch impressions were taken three times each from 5 participants (n = 15). Impressions were then compared within the test groups using a point-to-surface distance method after best-fit model matching (OraCheck). Precision was calculated using the (90-10%)/2 quantile and statistical analysis with one-way repeated measures ANOVA and post hoc Bonferroni test was performed. The conventional impression technique with alginate showed the lowest precision for full-arch impressions with 162.2 ± 71.3 µm. Both guided scanning procedures performed statistically significantly better than the conventional impression technique (p < 0.05). Mean values for group Cerec Omnicam Ortho were 74.5 ± 39.2 µm and for group Ormco Lythos 91.4 ± 48.8 µm. The in vivo precision of guided scanning procedures exceeds conventional impression techniques with the irreversible hydrocolloid material alginate. Guided scanning procedures may be highly promising for clinical applications, especially for digital orthodontic workflows.

  12. Does concomitant acromioplasty facilitate arthroscopic repair of full-thickness rotator cuff tears? A meta-analysis with trial sequential analysis of randomized controlled trials.

    PubMed

    Song, Lei; Miao, Ling; Zhang, Peng; Wang, Wen-Liang

    2016-01-01

    To conduct a meta-analysis with randomized controlled trials (RCTs) published in full text to determine the benefits of concomitant acromioplasty in repairing full-thickness rotator cuff tears. A literature search was performed in PubMed, Embase and the Cochrane Library from database inception through February 2016 to identify RCTs evaluating the efficacy of performing a concomitant acromioplasty. Statistical heterogeneity among studies was quantitatively evaluated by the I-squared index (I²) and trial sequential analysis (TSA) was applied to control random errors. Five RCTs totaling 523 patients were included. There was no statistically significant difference in Constant score (WMD = 1.00; 95 % CI -4.40 to 6.41; P = 0.72), University of California-Los Angeles (UCLA) score (WMD = 0.48; 95 % CI -0.79 to 1.76; P = 0.46), visual analog scale (VAS) for pain (WMD = -0.23; 95 % CI -0.58 to 0.11; P = 0.19) and re-tear rate (RR = 0.46; 95 % CI 0.14 to 1.53; P = 0.21) between the acromioplasty and nonacromioplasty groups. However, it was found to be related to a greater increase in American Shoulder and Elbow Surgeons (ASES) score (WMD = 3.02; 95 % CI 0.24 to 5.80; P = 0.03). Unfortunately, this difference was not reinforced by subsequent TSA. In addition, subgroup analysis showed no substantial difference of ASES score in patients with type-1 (WMD = -8.21; 95 % CI -23.55 to 7.14; P = 0.29), type-2 (WMD = 0.97; 95 % CI -5.10 to 7.05; P = 0.75), or type-3 (WMD = 2.32; 95 % CI -9.96 to 14.61; P = 0.71) acromion. A significantly higher ASES score was observed during the comparison despite lacking reinforcement by TSA. No difference was found in Constant score, UCLA score, VAS, or re-tear rate, and subgroup analysis did not confirm the impact of acromion type on eventual therapeutic outcome. Future studies with large numbers of participants, long-term follow-ups, data on patient-reported outcomes and stratification for acromion type are of the essence for demonstrating whether functional or structural differences exist in patients undergoing arthroscopic repair of full-thickness rotator cuff tears with or without acromioplasty.
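
    The pooling arithmetic behind such WMD estimates is compact. The sketch below (made-up study-level differences and standard errors, not the trials above) implements inverse-variance fixed-effect pooling, Cochran's Q, I², and a DerSimonian-Laird random-effects estimate:

    ```python
    import numpy as np

    # Hypothetical per-study weighted mean differences and standard errors.
    wmd = np.array([1.8, -0.5, 2.4, 0.3, 1.1])
    se = np.array([2.1, 1.9, 2.8, 1.5, 2.3])

    w = 1 / se**2                                   # fixed-effect weights
    fixed = np.sum(w * wmd) / np.sum(w)
    Q = np.sum(w * (wmd - fixed) ** 2)              # Cochran's Q
    df = len(wmd) - 1
    I2 = max(0.0, (Q - df) / Q) if Q > 0 else 0.0   # I^2 heterogeneity
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (se**2 + tau2)                       # DerSimonian-Laird weights
    random = np.sum(w_re * wmd) / np.sum(w_re)
    ci = 1.96 / np.sqrt(np.sum(w_re))
    print(f"fixed WMD = {fixed:.2f}, I2 = {I2:.0%}")
    print(f"random WMD = {random:.2f} "
          f"(95% CI {random - ci:.2f} to {random + ci:.2f})")
    ```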

  13. Investigating the collision energy dependence of η /s in the beam energy scan at the BNL Relativistic Heavy Ion Collider using Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Auvinen, Jussi; Bernhard, Jonah E.; Bass, Steffen A.; Karpenko, Iurii

    2018-04-01

    We determine the probability distributions of the shear viscosity over the entropy density ratio η/s in the quark-gluon plasma formed in Au + Au collisions at √(s_NN) = 19.6, 39, and 62.4 GeV, using Bayesian inference and Gaussian process emulators for a model-to-data statistical analysis that probes the full input parameter space of a transport + viscous hydrodynamics hybrid model. We find the most likely value of η/s to be larger at smaller √(s_NN), although the uncertainties still allow for a constant value between 0.10 and 0.15 for the investigated collision energy range.
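
    The emulator-based workflow can be miniaturized to one input parameter (everything below is a toy stand-in: the "model", the observation, and its uncertainty are hypothetical): a Gaussian process is trained on a handful of design runs and then scored against the datum on a dense grid, with the emulator's own uncertainty folded into the likelihood.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(7)

    def expensive_model(eta_s):            # hypothetical observable
        return 0.3 - 0.8 * eta_s + 0.05 * np.sin(20 * eta_s)

    design = np.linspace(0.0, 0.3, 12)[:, None]      # training design
    runs = expensive_model(design[:, 0]) + rng.normal(0, 0.005, 12)
    gp = GaussianProcessRegressor(kernel=RBF(0.1),
                                  alpha=0.005**2).fit(design, runs)

    observed, sigma = 0.21, 0.01                     # hypothetical datum
    grid = np.linspace(0.0, 0.3, 301)[:, None]
    mu, std = gp.predict(grid, return_std=True)
    # Gaussian likelihood with emulator variance added; flat prior.
    log_post = -0.5 * (observed - mu) ** 2 / (sigma**2 + std**2)
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    print("posterior mean eta/s:",
          round(float(np.sum(grid[:, 0] * post)), 3))
    ```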

  14. [Pseudoreplication, chatter, and the international nature of science: a response to D.V. Tatarnikov].

    PubMed

    Kozlov, M V; Hurlbert, S H

    2006-01-01

    The commentary by Tatarnikov (2005) on the design and analysis of manipulative experiments in ecology represents an obvious danger to readers with poor knowledge of modern statistics due to its erroneous interpretation of pseudoreplication and statistical independence. Here we offer clarification of those concepts, and of related ones such as experimental unit and evaluation unit, by reference to studies cited by Tatarnikov (2005). We stress the necessity of learning from the accumulated experience of the international scientific community in order not to repeat the errors found in earlier publications that have already been analyzed and widely written about. (An English translation of the full article is available as a PDF file from either of the authors.)

  15. Hyperspectral imaging coupled with chemometric analysis for non-invasive differentiation of black pens

    NASA Astrophysics Data System (ADS)

    Chlebda, Damian K.; Majda, Alicja; Łojewski, Tomasz; Łojewska, Joanna

    2016-11-01

    Differentiation of written text can be performed with a non-invasive and non-contact tool that combines conventional imaging methods with spectroscopy. Hyperspectral imaging (HSI) is a relatively new and rapid analytical technique that can be applied in forensic science disciplines. It allows an image of the sample to be acquired, with full spectral information within every pixel. For this paper, HSI and three statistical methods (hierarchical cluster analysis, principal component analysis, and spectral angle mapper) were used to distinguish between traces of modern black gel pen inks. Non-invasiveness and high efficiency are among the unquestionable advantages of ink differentiation using HSI. It is also less time-consuming than traditional methods such as chromatography. In this study, a set of 45 modern gel pen ink marks deposited on a paper sheet was registered. The spectral characteristics embodied in every pixel were extracted from an image and analysed using statistical methods, externally and directly on the hypercube. As a result, different black gel inks deposited on paper can be distinguished and classified into several groups in a non-invasive manner.

  16. Kinetic analysis of single molecule FRET transitions without trajectories

    NASA Astrophysics Data System (ADS)

    Schrangl, Lukas; Göhring, Janett; Schütz, Gerhard J.

    2018-03-01

    Single molecule Förster resonance energy transfer (smFRET) is a popular tool to study biological systems that undergo topological transitions on the nanometer scale. smFRET experiments typically require recording of long smFRET trajectories and subsequent statistical analysis to extract parameters such as the states' lifetimes. Alternatively, analysis of probability distributions exploits the shapes of smFRET distributions at well chosen exposure times and hence works without the acquisition of time traces. Here, we describe a variant that utilizes statistical tests to compare experimental datasets with Monte Carlo simulations. For a given model, parameters are varied to cover the full realistic parameter space. As output, the method yields p-values that quantify the likelihood for each parameter setting to be consistent with the experimental data. The method provides suitable results even if the actual lifetimes differ by an order of magnitude. We also demonstrate the robustness of the method to inaccurately determined input parameters. As proof of concept, the new method was applied to the determination of transition rate constants for Holliday junctions.
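
    The core of the approach, simulating the exposure-time FRET distribution for each parameter setting and attaching a p-value via a statistical test, can be sketched as follows (the two-state generative model, FRET values, and rate constants below are crude assumptions for illustration, not the paper's Monte Carlo):

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(8)

    def simulate_fret(k12, k21, exposure, n=20000):
        # Toy two-state model: the apparent FRET value in one exposure is
        # a time-weighted mixture of two states; state-1 occupancy is
        # approximated by a Beta distribution whose shape tightens as
        # transitions become fast relative to the exposure time.
        a = np.maximum(k21 * exposure, 0.05)
        b = np.maximum(k12 * exposure, 0.05)
        occ = rng.beta(a, b, n)
        return 0.9 * occ + 0.2 * (1 - occ) + rng.normal(0, 0.02, n)

    data = simulate_fret(5.0, 8.0, 0.1)    # stand-in for the experiment

    # Scan the parameter space; keep the KS p-value for each setting.
    for k12 in (2.0, 5.0, 10.0):
        for k21 in (4.0, 8.0, 16.0):
            stat, p = ks_2samp(data, simulate_fret(k12, k21, 0.1))
            print(f"k12={k12:5.1f}, k21={k21:5.1f}: p={p:.3f}")
    ```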

  17. Impact of specimen adequacy on the assessment of renal allograft biopsy specimens.

    PubMed

    Cimen, S; Geldenhuys, L; Guler, S; Imamoglu, A; Molinari, M

    2016-01-01

    The Banff classification was introduced to achieve uniformity in the assessment of renal allograft biopsies. The primary aim of this study was to evaluate the impact of specimen adequacy on the Banff classification. All renal allograft biopsies obtained between July 2010 and June 2012 for suspicion of acute rejection were included. Pre-biopsy clinical data on suspected diagnosis and time from renal transplantation were provided to a nephropathologist who was blinded to the original pathological report. Second pathological readings were compared with the original to assess agreement stratified by specimen adequacy. Cohen's kappa test and Fisher's exact test were used for statistical analyses. Forty-nine specimens were reviewed. Among these specimens, 81.6% were classified as adequate, 6.12% as minimal, and 12.24% as unsatisfactory. The agreement analysis between the first and second readings revealed a kappa value of 0.97. Full agreement between readings was found in 75% of the adequate specimens, and in 66.7 and 50% of the minimal and unsatisfactory specimens, respectively. There was no agreement between readings in 5% of the adequate specimens and 16.7% of the unsatisfactory specimens. For the entire sample, full agreement was found in 71.4%, partial agreement in 20.4% and no agreement in 8.2% of the specimens. Statistical analysis using Fisher's exact test yielded a P value above 0.25, showing that, probably due to the small sample size, the results were not statistically significant. Specimen adequacy may be a determinant of diagnostic agreement in renal allograft specimen assessment. While additional studies including larger case numbers are required to further delineate the impact of specimen adequacy on the reliability of histopathological assessments, specimen quality must be considered during clinical decision making when dealing with biopsy reports based on minimal or unsatisfactory specimens.
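
    Agreement between two readers is conventionally summarized by Cohen's kappa, which corrects raw agreement for chance. A minimal sketch with hypothetical paired readings (the categories and counts are invented, not the study's data):

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical paired categorical readings for 49 biopsies.
    first = ["rejection"] * 20 + ["borderline"] * 10 + ["no_rejection"] * 19
    second = list(first)
    second[0] = "borderline"       # two illustrative disagreements
    second[25] = "rejection"

    print("Cohen's kappa:", round(cohen_kappa_score(first, second), 2))
    ```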

  18. Detection of Fatty Acids from Intact Microorganisms by Molecular Beam Static Secondary Ion Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ingram, Jani Cheri; Lehman, Richard Michael; Bauer, William Francis

    We report the use of a surface analysis approach, static secondary ion mass spectrometry (SIMS) equipped with a molecular (ReO4^-) ion primary beam, to analyze the surface of intact microbial cells. SIMS spectra of 28 microorganisms were compared to fatty acid profiles determined by gas chromatographic analysis of transesterified fatty acids extracted from the same organisms. The results indicate that surface bombardment using the molecular primary beam cleaved the ester linkage characteristic of bacteria at the glycerophosphate backbone of the phospholipid components of the cell membrane. This cleavage enables direct detection of the fatty acid conjugate base of intact microorganisms by static SIMS. The limit of detection for this approach is approximately 10^7 bacterial cells/cm^2. Multivariate statistical methods were applied in a graded approach to the SIMS microbial data. The results showed that the full data set could initially be statistically grouped based upon major differences in biochemical composition of the cell wall. The gram-positive bacteria were further statistically analyzed, followed by final analysis of a specific bacterial genus that was successfully grouped by species. Additionally, the use of SIMS to detect microbes on mineral surfaces is demonstrated by an analysis of Shewanella oneidensis on crushed hematite. The results of this study provide evidence for the potential of static SIMS to rapidly detect bacterial species based on ion fragments originating from cell membrane lipids directly from sample surfaces.

  19. Atomic-scale phase composition through multivariate statistical analysis of atom probe tomography data.

    PubMed

    Keenan, Michael R; Smentkowski, Vincent S; Ulfig, Robert M; Oltman, Edward; Larson, David J; Kelly, Thomas F

    2011-06-01

    We demonstrate for the first time that multivariate statistical analysis techniques can be applied to atom probe tomography data to estimate the chemical composition of a sample at the full spatial resolution of the atom probe in three dimensions. Whereas the raw atom probe data provide the specific identity of an atom at a precise location, the multivariate results can be interpreted in terms of the probabilities that an atom representing a particular chemical phase is situated there. When aggregated to the size scale of a single atom (∼0.2 nm), atom probe spectral-image datasets are huge and extremely sparse. In fact, the average spectrum will have somewhat less than one total count per spectrum due to imperfect detection efficiency. These conditions, under which the variance in the data is completely dominated by counting noise, test the limits of multivariate analysis, and an extensive discussion of how to extract the chemical information is presented. Efficient numerical approaches to performing principal component analysis (PCA) on these datasets, which may number hundreds of millions of individual spectra, are put forward, and it is shown that PCA can be computed in a few seconds on a typical laptop computer.
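
    At these data sizes, PCA is typically computed without densifying the spectral-image matrix. The sketch below uses a small synthetic stand-in (two latent "phases" with invented peak signatures and Poisson counting noise, with well under one count per spectrum on average) and applies a sparse truncated SVD in the role of PCA:

    ```python
    import numpy as np
    import scipy.sparse as sp
    from sklearn.decomposition import TruncatedSVD

    rng = np.random.default_rng(9)

    # Synthetic spectral image: 20,000 voxel spectra, 200 mass channels.
    n_voxels, n_channels = 20_000, 200
    phase = rng.random(n_voxels)                    # phase-A fraction
    sig_a = np.zeros(n_channels); sig_a[[10, 50, 90]] = [0.4, 0.3, 0.3]
    sig_b = np.zeros(n_channels); sig_b[[20, 60, 120]] = [0.5, 0.25, 0.25]
    rates = 0.04 * (phase[:, None] * sig_a + (1 - phase[:, None]) * sig_b)
    counts = sp.csr_matrix(rng.poisson(rates))      # counting-noise dominated

    # TruncatedSVD stands in for PCA: it accepts sparse input directly,
    # which matters when the full matrix cannot be held densely.
    svd = TruncatedSVD(n_components=3, random_state=0).fit(counts)
    print("explained variance ratio:",
          np.round(svd.explained_variance_ratio_, 4))
    ```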

  20. Expression Profiling of Nonpolar Lipids in Meibum From Patients With Dry Eye: A Pilot Study

    PubMed Central

    Chen, Jianzhong; Keirsey, Jeremy K.; Green, Kari B.; Nichols, Kelly K.

    2017-01-01

    Purpose: The purpose of this investigation was to characterize differentially expressed lipids in meibum samples from patients with dry eye disease (DED) in order to better understand the underlying pathologic mechanisms. Methods: Meibum samples were collected from postmenopausal women with DED (PW-DED; n = 5) and a control group of postmenopausal women without DED (n = 4). Lipid profiles were analyzed by direct infusion full-scan electrospray ionization mass spectrometry (ESI-MS). An initial analysis of 145 representative peaks from four classes of lipids in PW-DED samples revealed that additional manual corrections for peak overlap and isotopes only slightly affected the statistical analysis. Therefore, analysis of uncorrected data, which can be applied to a greater number of peaks, was used to compare more than 500 lipid peaks common to PW-DED and control samples. Statistical analysis of peak intensities identified several lipid species that differed significantly between the two groups. Data from contact lens wearers with DED (CL-DED; n = 5) were also analyzed. Results: Many species of the two types of diesters (DE) and very long chain wax esters (WE) were decreased by ∼20% in PW-DED, whereas levels of triacylglycerols were increased by an average of 39% ± 3% in meibum from PW-DED compared to that in the control group. Approximately the same reduction (20%) of similar DE and WE was observed for CL-DED. Conclusions: Statistical analysis of peak intensities from direct infusion ESI-MS results identified differentially expressed lipids in meibum from dry eye patients. Further studies are warranted to support these findings. PMID:28426869
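
    A minimal sketch of this kind of per-peak comparison, assuming hypothetical intensity matrices and adding a false-discovery-rate correction (the study's exact statistical procedure may differ):

      import numpy as np
      from scipy.stats import ttest_ind
      from statsmodels.stats.multitest import multipletests

      rng = np.random.default_rng(1)
      n_peaks = 500
      dry_eye = rng.lognormal(0.0, 0.3, size=(5, n_peaks))   # 5 PW-DED samples
      control = rng.lognormal(0.1, 0.3, size=(4, n_peaks))   # 4 control samples

      pvals = np.array([ttest_ind(dry_eye[:, i], control[:, i]).pvalue
                        for i in range(n_peaks)])
      reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
      print(f"{reject.sum()} peaks differ at FDR 0.05")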

  1. Quality classification of Spanish olive oils by untargeted gas chromatography coupled to hybrid quadrupole-time of flight mass spectrometry with atmospheric pressure chemical ionization and metabolomics-based statistical approach.

    PubMed

    Sales, C; Cervera, M I; Gil, R; Portolés, T; Pitarch, E; Beltran, J

    2017-02-01

    The novel atmospheric pressure chemical ionization (APCI) source has been used in combination with gas chromatography (GC) coupled to hybrid quadrupole time-of-flight (QTOF) mass spectrometry (MS) for the determination of volatile components of olive oil, enhancing its potential for classifying olive oil samples according to their quality using a metabolomics-based approach. Full-spectrum acquisition allowed the detection of volatile organic compounds (VOCs) in olive oil samples of Extra Virgin, Virgin and Lampante qualities. A dynamic headspace extraction with cartridge solvent elution was applied. The metabolomics strategy consisted of three steps: full mass-spectral alignment of the GC-MS data using MzMine 2.0, multivariate analysis using Ez-Info, and creation of the statistical model with combinations of responses for molecular fragments. The model was finally validated using blind samples, achieving 70% accuracy in oil classification relative to the officially established reference method, the "PANEL TEST".
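
    The classification step can be sketched generically in Python as a dimension-reduction-plus-classifier pipeline with cross-validation; the feature matrix below is random, and the study itself used MzMine 2.0 and Ez-Info rather than scikit-learn.

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(2)
      X = rng.normal(size=(60, 300))    # 60 oils x 300 aligned VOC features
      y = rng.integers(0, 3, size=60)   # 0 = Extra Virgin, 1 = Virgin, 2 = Lampante

      model = make_pipeline(StandardScaler(), PCA(n_components=10),
                            LinearDiscriminantAnalysis())
      print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())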

  2. A statistical approach to optimizing concrete mixture design.

    PubMed

    Ahmad, Shamsad; Alghamdi, Saeid A

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m³), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options.
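
    The design and analysis described here translate directly into a response-surface fit; a sketch with statsmodels, using simulated strength values in place of the measured ones:

      import numpy as np, pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf
      from itertools import product

      levels = {"wcm": [0.38, 0.43, 0.48],   # water/cementitious ratio
                "cm": [350, 375, 400],       # cementitious content, kg/m^3
                "fa": [0.35, 0.40, 0.45]}    # fine/total aggregate ratio
      df = pd.DataFrame([dict(zip(levels, c)) for c in product(*levels.values())] * 3)
      rng = np.random.default_rng(3)
      df["strength"] = 90 - 80 * df["wcm"] + 0.05 * df["cm"] + rng.normal(0, 2, len(df))

      # Main effects, two-way interactions, and quadratic terms of the 3^3 design.
      model = smf.ols("strength ~ (wcm + cm + fa)**2 + I(wcm**2) + I(cm**2) + I(fa**2)",
                      data=df).fit()
      print(sm.stats.anova_lm(model, typ=2))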

  3. A Statistical Approach to Optimizing Concrete Mixture Design

    PubMed Central

    Alghamdi, Saeid A.

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m³), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options. PMID:24688405

  4. Performance of the S-χ² Statistic for Full-Information Bifactor Models

    ERIC Educational Resources Information Center

    Li, Ying; Rupp, Andre A.

    2011-01-01

    This study investigated the Type I error rate and power of the multivariate extension of the S-χ² statistic using unidimensional and multidimensional item response theory (UIRT and MIRT, respectively) models as well as full-information bifactor (FI-bifactor) models through simulation. Manipulated factors included test length, sample…

  5. Planck 2015 results. XVI. Isotropy and statistics of the CMB

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Akrami, Y.; Aluri, P. K.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Basak, S.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Casaponsa, B.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Contreras, D.; Couchot, F.; Coulais, A.; Crill, B. P.; Cruz, M.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Fantaye, Y.; Fergusson, J.; Fernandez-Cobos, R.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Frolov, A.; Galeotta, S.; Galli, S.; Ganga, K.; Gauthier, C.; Ghosh, T.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huang, Z.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kim, J.; Kisner, T. S.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Liu, H.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Marinucci, D.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; McGehee, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mikkelsen, K.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Pant, N.; Paoletti, D.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Popa, L.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Rotti, A.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Souradeep, T.; Spencer, L. D.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Trombetti, T.; Tucci, M.; Tuovinen, J.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zibin, J. P.; Zonca, A.

    2016-09-01

    We test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The "Cold Spot" is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.

  6. Planck 2015 results: XVI. Isotropy and statistics of the CMB

    DOE PAGES

    Ade, P. A. R.; Aghanim, N.; Akrami, Y.; ...

    2016-09-20

    In this paper, we test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The “Cold Spot” is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Finally, where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.

  7. Scan statistics with local vote for target detection in distributed system

    NASA Astrophysics Data System (ADS)

    Luo, Junhai; Wu, Qi

    2017-12-01

    Target detection occupies a pivotal position in distributed systems. Scan statistics, one of the most efficient detection methods, has been applied to a variety of anomaly detection problems and significantly improves the probability of detection. However, scan statistics cannot achieve the expected performance when the noise intensity is strong or the signal emitted by the target is weak. The local vote algorithm can also achieve a high target detection rate; after the local vote, the counting rule is usually adopted for decision fusion. The counting rule, however, does not use information about the contiguity of sensors but takes all sensors' data into consideration, which makes the result undesirable. In this paper, we propose a scan statistics with local vote (SSLV) method, which combines scan statistics with a local vote decision. Before the scan statistics, each sensor executes a local vote decision based on its own data and that of its neighbors. By combining the advantages of both, our method obtains a higher detection rate in low signal-to-noise-ratio environments than scan statistics alone. After the local vote decision, the distribution of sensors that have detected the target becomes more concentrated. To make full use of the local vote decision, we introduce a variable step parameter for the SSLV, which significantly shortens the scan period, especially when the target is absent. Analysis and simulations are presented to demonstrate the performance of our method.
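
    A toy Python version of the two stages for a 1-D line of sensors (vote threshold, neighbourhood, and window size are hypothetical, not the paper's settings):

      import numpy as np

      rng = np.random.default_rng(4)
      decisions = rng.random(200) < 0.1            # noisy one-bit detections
      decisions[90:110] |= rng.random(20) < 0.6    # sensors near the target

      # Local vote: keep a "1" only if at least 3 of 5 neighbours agree.
      votes = np.convolve(decisions, np.ones(5), mode="same")
      relabeled = votes >= 3

      # Scan statistic: maximum count over any window of w consecutive sensors.
      w = 20
      scan = np.convolve(relabeled, np.ones(w), mode="valid").max()
      print("scan statistic:", int(scan))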

  8. A national streamflow network gap analysis

    USGS Publications Warehouse

    Kiang, Julie E.; Stewart, David W.; Archfield, Stacey A.; Osborne, Emily B.; Eng, Ken

    2013-01-01

    The U.S. Geological Survey (USGS) conducted a gap analysis to evaluate how well the USGS streamgage network meets a variety of needs, focusing on the ability to calculate various statistics at locations that have streamgages (gaged) and that do not have streamgages (ungaged). This report presents the results of analysis to determine where there are gaps in the network of gaged locations, how accurately desired statistics can be calculated with a given length of record, and whether the current network allows for estimation of these statistics at ungaged locations. The analysis indicated that there is variability across the Nation’s streamflow data-collection network in terms of the spatial and temporal coverage of streamgages. In general, the Eastern United States has better coverage than the Western United States. The arid Southwestern United States, Alaska, and Hawaii were observed to have the poorest spatial coverage, using the dataset assembled for this study. Except in Hawaii, these areas also tended to have short streamflow records. Differences in hydrology lead to differences in the uncertainty of statistics calculated in different regions of the country. Arid and semiarid areas of the Central and Southwestern United States generally exhibited the highest levels of interannual variability in flow, leading to larger uncertainty in flow statistics. At ungaged locations, information can be transferred from nearby streamgages if there is sufficient similarity between the gaged watersheds and the ungaged watersheds of interest. Areas where streamgages exhibit high correlation are most likely to be suitable for this type of information transfer. The areas with the most highly correlated streamgages appear to coincide with mountainous areas of the United States. Lower correlations are found in the Central United States and coastal areas of the Southeastern United States. Information transfer from gaged basins to ungaged basins is also most likely to be successful when basin attributes show high similarity. At the scale of the analysis completed in this study, the attributes of basins upstream of USGS streamgages cover the full range of basin attributes observed at potential locations of interest fairly well. Some exceptions included very high or very low elevation areas and very arid areas.

  9. Comparative evaluation of apically extruded debris during root canal preparation using ProTaper™, Hyflex™ and Waveone™ rotary systems

    PubMed Central

    Surakanti, Jayaprada Reddy; Venkata, Ravi Chandra Polavarapu; Vemisetty, Hari Kumar; Dandolu, Ram Kiran; Jaya, Nagendra Krishna Muppalla; Thota, Shirisha

    2014-01-01

    Background and Aims: Extrusion of any debris during endodontic treatment may potentially cause post-operative complications such as flare-ups. The purpose of this in vitro study was to assess the amount of apically extruded debris during root canal preparation using rotary and reciprocating nickel-titanium instrumentation systems. Materials and Methods: In this study, 60 human mandibular first premolars were randomly assigned to 3 groups (n = 20 teeth/group). The root canals were instrumented according to the manufacturers’ instructions using the reciprocating single-file system WaveOne™ (Dentsply Maillefer, Ballaigues, Switzerland) and the full-sequence rotary Hyflex CM™ (Coltene Whaledent, Allstetten, Switzerland) and ProTaper™ (Dentsply Maillefer, Ballaigues, Switzerland) instruments. The canals were then irrigated using bidistilled water. The debris extruded apically was collected in preweighed Eppendorf tubes, assessed with an electronic balance, and compared. Statistical Analysis Used: The debris extrusion was compared and statistically analyzed using analysis of variance and the post hoc Student-Newman-Keuls test. Results: The WaveOne™ and ProTaper™ instruments produced significantly more debris compared with Hyflex CM™ rotary instruments (P < 0.05). Conclusions: Under the conditions of this study, all systems used resulted in extrusion of apical debris. Full-sequence rotary instrumentation was associated with less debris extrusion compared with the use of reciprocating single-file systems. PMID:24778507

  10. The relationship between eating disorder not otherwise specified (EDNOS) and officially recognized eating disorders: meta-analysis and implications for DSM.

    PubMed

    Thomas, Jennifer J; Vartanian, Lenny R; Brownell, Kelly D

    2009-05-01

    Eating disorder not otherwise specified (EDNOS) is the most prevalent eating disorder (ED) diagnosis. In this meta-analysis, the authors aimed to inform Diagnostic and Statistical Manual of Mental Disorders revisions by comparing the psychopathology of EDNOS with that of the officially recognized EDs: anorexia nervosa (AN), bulimia nervosa (BN), and binge eating disorder (BED). A comprehensive literature search identified 125 eligible studies (published and unpublished) appearing in the literature from 1987 to 2007. Random effects analyses indicated that whereas EDNOS did not differ significantly from AN and BED on eating pathology or general psychopathology, BN exhibited greater eating and general psychopathology than EDNOS. Moderator analyses indicated that EDNOS groups who met all diagnostic criteria for AN except for amenorrhea did not differ significantly from full syndrome cases. Similarly, EDNOS groups who met all criteria for BN or BED except for binge frequency did not differ significantly from full syndrome cases. Results suggest that EDNOS represents a set of disorders associated with substantial psychological and physiological morbidity. Although certain EDNOS subtypes could be incorporated into existing Diagnostic and Statistical Manual of Mental Disorders (4th ed.; American Psychiatric Association, 1994) categories, others, such as purging disorder and non-fat-phobic AN, may be best conceptualized as distinct syndromes.

  11. Quantifying long-term human impact in contrasting environments: Statistical analysis of modern and fossil pollen records

    NASA Astrophysics Data System (ADS)

    Broothaerts, Nils; López-Sáez, José Antonio; Verstraeten, Gert

    2017-04-01

    Reconstructing and quantifying human impact is an important step towards understanding human-environment interactions in the past. Quantitative measures of human impact on the landscape are needed to fully understand the long-term influence of anthropogenic land cover changes on the global climate, ecosystems and geomorphic processes. Nevertheless, quantifying past human impact is not straightforward. Recently, multivariate statistical analysis of fossil pollen records has been proposed to characterize vegetation changes and to gain insight into past human impact. Although statistical analysis of fossil pollen data can provide useful insights into anthropogenically driven vegetation changes, it still cannot be used as an absolute quantification of past human impact. To overcome this shortcoming, in this study fossil pollen records were included in a multivariate statistical analysis (cluster analysis and non-metric multidimensional scaling (NMDS)) together with modern pollen data and modern vegetation data. The modern pollen and vegetation dataset supports a better interpretation of the representativeness of the fossil pollen records, and can result in a full quantification of human impact in the past. This methodology was applied in two contrasting environments: SW Turkey and Central Spain. For each region, fossil pollen data from different study sites were integrated, together with modern pollen data and information on modern vegetation. In this way, arboreal cover, grazing pressure and agricultural activities in the past were reconstructed and quantified. The data from SW Turkey provide new integrated information on changing human impact through time in the Sagalassos territory, and show that human impact was most intense during the Hellenistic and Roman Period (ca. 2200-1750 cal a BP) and decreased and changed in nature afterwards. The data from central Spain show for several sites that arboreal cover decreased below 5% from the Feudal period onwards (ca. 850 cal a BP), related to increasing human impact on the landscape. At other study sites arboreal cover remained above 25% despite significant human impact. Overall, the presented examples from two contrasting environments show how cluster analysis and NMDS of modern and fossil pollen data can help to provide quantitative insights into anthropogenic land cover changes. Our study extensively discusses and illustrates the possibilities and limitations of statistical analysis of pollen data to quantify human-induced land use changes.

  12. Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling

    PubMed Central

    Wood, John

    2017-01-01

    Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents, so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT: Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered, some very seriously so, but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. PMID:28706080
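
    The mixture-modeling idea can be sketched in a few lines of Python: fit Gaussian mixtures with increasing numbers of components to the power estimates and let BIC choose among them. The power values below are simulated, not the 730 studies analyzed here.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(5)
      power = np.clip(np.concatenate([rng.normal(0.12, 0.05, 400),
                                      rng.normal(0.55, 0.10, 200),
                                      rng.normal(0.90, 0.05, 130)]), 0.01, 0.99)
      power = power.reshape(-1, 1)

      # Fit mixtures with 1-5 components; the lowest BIC picks the model.
      fits = [GaussianMixture(k, random_state=0).fit(power) for k in range(1, 6)]
      best = min(fits, key=lambda m: m.bic(power))
      print("components:", best.n_components, "means:", best.means_.ravel().round(2))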

  13. Impact of full mental health and substance abuse parity for children in the Federal Employees Health Benefits Program.

    PubMed

    Azrin, Susan T; Huskamp, Haiden A; Azzone, Vanessa; Goldman, Howard H; Frank, Richard G; Burnam, M Audrey; Normand, Sharon-Lise T; Ridgely, M Susan; Young, Alexander S; Barry, Colleen L; Busch, Alisa B; Moran, Garrett

    2007-02-01

    The Federal Employees Health Benefits Program implemented full mental health and substance abuse parity in January 2001. Evaluation of this policy revealed that parity increased adult beneficiaries' financial protection by lowering mental health and substance abuse out-of-pocket costs for service users in most plans studied but did not increase rates of service use or spending among adult service users. This study examined the effects of full mental health and substance abuse parity for children. Employing a quasiexperimental design, we compared children in 7 Federal Employees Health Benefits plans from 1999 to 2002 with children in a matched set of plans that did not have a comparable change in mental health and substance abuse coverage. Using a difference-in-differences analysis, we examined the likelihood of child mental health and substance abuse service use, total spending among child service users, and out-of-pocket spending. The apparent increase in the rate of children's mental health and substance abuse service use after implementation of parity was almost entirely due to secular trends of increased service utilization. Estimates for children's mental health and substance abuse spending conditional on this service use showed significant decreases in spending per user attributable to parity for 2 plans; spending estimates for the other plans were not statistically significant. Children using these services in 3 of 7 plans experienced statistically significant reductions in out-of-pocket spending attributable to the parity policy, and the average dollar savings was sizeable for users in those 3 plans. In the remaining 4 plans, out-of-pocket spending also decreased, but these decreases were not statistically significant. Full mental health and substance abuse parity for children, within the context of managed care, can achieve equivalence of benefits in health insurance coverage and improve financial protection without adversely affecting health care costs but may not expand access for children who need these services.
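
    A minimal sketch of the difference-in-differences estimate used in such evaluations, with simulated out-of-pocket spending; the parity effect is the coefficient on the treated-by-post interaction.

      import numpy as np, pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(6)
      n = 2000
      df = pd.DataFrame({"treated": rng.integers(0, 2, n),   # 1 = parity plan
                         "post": rng.integers(0, 2, n)})     # 1 = after Jan 2001
      df["oop"] = 100 - 15 * df["treated"] * df["post"] + rng.normal(0, 30, n)

      did = smf.ols("oop ~ treated * post", data=df).fit(cov_type="HC1")
      print(did.params["treated:post"], did.pvalues["treated:post"])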

  14. Analysis of tribological behaviour of zirconia reinforced Al-SiC hybrid composites using statistical and artificial neural network technique

    NASA Astrophysics Data System (ADS)

    Arif, Sajjad; Tanwir Alam, Md; Ansari, Akhter H.; Bilal Naim Shaikh, Mohd; Arif Siddiqui, M.

    2018-05-01

    The tribological performance of aluminium hybrid composites reinforced with micro SiC (5 wt%) and nano zirconia (0, 3, 6 and 9 wt%) fabricated through a powder metallurgy technique was investigated using statistical and artificial neural network (ANN) approaches. The influence of zirconia reinforcement, sliding distance and applied load was analyzed with tests based on a full factorial design of experiments. Analysis of variance (ANOVA) was used to evaluate the percentage contribution of each process parameter to wear loss. The ANOVA approach suggested that wear loss is mainly influenced by sliding distance, followed by zirconia reinforcement and applied load. Further, a feed-forward back-propagation neural network was applied to the input/output data for predicting and analyzing the wear behaviour of the fabricated composite. A very close correlation between experimental and ANN outputs was achieved by implementing the model. Finally, the ANN model was effectively used to find the influence of the various control factors on the wear behaviour of the hybrid composites.
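
    In the same spirit, a small feed-forward network can be fitted to (reinforcement, distance, load) inputs; the wear data below are synthetic stand-ins, not the measured values.

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(7)
      X = rng.uniform([0, 200, 10], [9, 1800, 50], size=(120, 3))  # wt%, m, N
      y = 0.02 * X[:, 1] + 5 * X[:, 2] - 30 * X[:, 0] + rng.normal(0, 20, 120)

      ann = make_pipeline(StandardScaler(),
                          MLPRegressor((8, 8), max_iter=5000, random_state=0))
      ann.fit(X, y)
      print("training R^2:", round(ann.score(X, y), 3))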

  15. Optimized design and analysis of preclinical intervention studies in vivo

    PubMed Central

    Laajala, Teemu D.; Jumppanen, Mikael; Huhtaniemi, Riikka; Fey, Vidal; Kaur, Amanpreet; Knuuttila, Matias; Aho, Eija; Oksala, Riikka; Westermarck, Jukka; Mäkelä, Sari; Poutanen, Matti; Aittokallio, Tero

    2016-01-01

    Recent reports have called into question the reproducibility, validity and translatability of the preclinical animal studies due to limitations in their experimental design and statistical analysis. To this end, we implemented a matching-based modelling approach for optimal intervention group allocation, randomization and power calculations, which takes full account of the complex animal characteristics at baseline prior to interventions. In prostate cancer xenograft studies, the method effectively normalized the confounding baseline variability, and resulted in animal allocations which were supported by RNA-seq profiling of the individual tumours. The matching information increased the statistical power to detect true treatment effects at smaller sample sizes in two castration-resistant prostate cancer models, thereby leading to saving of both animal lives and research costs. The novel modelling approach and its open-source and web-based software implementations enable the researchers to conduct adequately-powered and fully-blinded preclinical intervention studies, with the aim to accelerate the discovery of new therapeutic interventions. PMID:27480578
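
    A simplified stand-in for the matching idea, in Python: standardize baseline covariates, greedily pair the most similar animals, and split each pair across the two arms. The covariates and pairing rule are hypothetical; the published method solves the allocation more rigorously.

      import numpy as np
      from scipy.spatial.distance import pdist, squareform

      rng = np.random.default_rng(8)
      baseline = np.column_stack([rng.normal(200, 40, 20),   # tumour volume
                                  rng.normal(25, 3, 20)])    # body weight
      z = (baseline - baseline.mean(0)) / baseline.std(0)
      d = squareform(pdist(z))
      np.fill_diagonal(d, np.inf)

      unused, alloc = set(range(20)), {}
      while unused:
          i = unused.pop()
          j = min(unused, key=lambda k: d[i, k])  # nearest available match
          unused.remove(j)
          arm = rng.random() < 0.5                # randomize within each pair
          alloc[i], alloc[j] = ("treatment", "control") if arm else ("control", "treatment")
      print(alloc)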

  16. Optimized design and analysis of preclinical intervention studies in vivo.

    PubMed

    Laajala, Teemu D; Jumppanen, Mikael; Huhtaniemi, Riikka; Fey, Vidal; Kaur, Amanpreet; Knuuttila, Matias; Aho, Eija; Oksala, Riikka; Westermarck, Jukka; Mäkelä, Sari; Poutanen, Matti; Aittokallio, Tero

    2016-08-02

    Recent reports have called into question the reproducibility, validity and translatability of the preclinical animal studies due to limitations in their experimental design and statistical analysis. To this end, we implemented a matching-based modelling approach for optimal intervention group allocation, randomization and power calculations, which takes full account of the complex animal characteristics at baseline prior to interventions. In prostate cancer xenograft studies, the method effectively normalized the confounding baseline variability, and resulted in animal allocations which were supported by RNA-seq profiling of the individual tumours. The matching information increased the statistical power to detect true treatment effects at smaller sample sizes in two castration-resistant prostate cancer models, thereby leading to saving of both animal lives and research costs. The novel modelling approach and its open-source and web-based software implementations enable the researchers to conduct adequately-powered and fully-blinded preclinical intervention studies, with the aim to accelerate the discovery of new therapeutic interventions.

  17. Endogenous Retrovirus Insertion in the KIT Oncogene Determines White and White spotting in Domestic Cats

    PubMed Central

    David, Victor A.; Menotti-Raymond, Marilyn; Wallace, Andrea Coots; Roelke, Melody; Kehler, James; Leighty, Robert; Eizirik, Eduardo; Hannah, Steven S.; Nelson, George; Schäffer, Alejandro A.; Connelly, Catherine J.; O’Brien, Stephen J.; Ryugo, David K.

    2014-01-01

    The Dominant White locus (W) in the domestic cat demonstrates pleiotropic effects exhibiting complete penetrance for absence of coat pigmentation and incomplete penetrance for deafness and iris hypopigmentation. We performed linkage analysis using a pedigree segregating White to identify KIT (Chr. B1) as the feline W locus. Segregation and sequence analysis of the KIT gene in two pedigrees (P1 and P2) revealed the remarkable retrotransposition and evolution of a feline endogenous retrovirus (FERV1) as responsible for two distinct phenotypes of the W locus, Dominant White, and white spotting. A full-length (7125 bp) FERV1 element is associated with white spotting, whereas a FERV1 long terminal repeat (LTR) is associated with all Dominant White individuals. For purposes of statistical analysis, the alternatives of wild-type sequence, FERV1 element, and LTR-only define a triallelic marker. Taking into account pedigree relationships, deafness is genetically linked and associated with this marker; estimated P values for association are in the range of 0.007 to 0.10. The retrotransposition interrupts a DNase I hypersensitive site in KIT intron 1 that is highly conserved across mammals and was previously demonstrated to regulate temporal and tissue-specific expression of KIT in murine hematopoietic and melanocytic cells. A large-population genetic survey of cats (n = 270), representing 30 cat breeds, supports our findings and demonstrates statistical significance of the FERV1 LTR and full-length element with Dominant White/blue iris (P < 0.0001) and white spotting (P < 0.0001), respectively. PMID:25085922

  18. Endogenous retrovirus insertion in the KIT oncogene determines white and white spotting in domestic cats.

    PubMed

    David, Victor A; Menotti-Raymond, Marilyn; Wallace, Andrea Coots; Roelke, Melody; Kehler, James; Leighty, Robert; Eizirik, Eduardo; Hannah, Steven S; Nelson, George; Schäffer, Alejandro A; Connelly, Catherine J; O'Brien, Stephen J; Ryugo, David K

    2014-08-01

    The Dominant White locus (W) in the domestic cat demonstrates pleiotropic effects exhibiting complete penetrance for absence of coat pigmentation and incomplete penetrance for deafness and iris hypopigmentation. We performed linkage analysis using a pedigree segregating White to identify KIT (Chr. B1) as the feline W locus. Segregation and sequence analysis of the KIT gene in two pedigrees (P1 and P2) revealed the remarkable retrotransposition and evolution of a feline endogenous retrovirus (FERV1) as responsible for two distinct phenotypes of the W locus, Dominant White, and white spotting. A full-length (7125 bp) FERV1 element is associated with white spotting, whereas a FERV1 long terminal repeat (LTR) is associated with all Dominant White individuals. For purposes of statistical analysis, the alternatives of wild-type sequence, FERV1 element, and LTR-only define a triallelic marker. Taking into account pedigree relationships, deafness is genetically linked and associated with this marker; estimated P values for association are in the range of 0.007 to 0.10. The retrotransposition interrupts a DNase I hypersensitive site in KIT intron 1 that is highly conserved across mammals and was previously demonstrated to regulate temporal and tissue-specific expression of KIT in murine hematopoietic and melanocytic cells. A large-population genetic survey of cats (n = 270), representing 30 cat breeds, supports our findings and demonstrates statistical significance of the FERV1 LTR and full-length element with Dominant White/blue iris (P < 0.0001) and white spotting (P < 0.0001), respectively.

  19. Bilateral filtering using the full noise covariance matrix applied to x-ray phase-contrast computed tomography.

    PubMed

    Allner, S; Koehler, T; Fehringer, A; Birnbacher, L; Willner, M; Pfeiffer, F; Noël, P B

    2016-05-21

    The purpose of this work is to develop an image-based de-noising algorithm that exploits complementary information and noise statistics from multi-modal images, as they emerge in x-ray tomography techniques, for instance grating-based phase-contrast CT and spectral CT. Among the noise reduction methods, image-based de-noising is one popular approach and the so-called bilateral filter is a well known algorithm for edge-preserving filtering. We developed a generalization of the bilateral filter for the case where the imaging system provides two or more perfectly aligned images. The proposed generalization is statistically motivated and takes the full second order noise statistics of these images into account. In particular, it includes a noise correlation between the images and spatial noise correlation within the same image. The novel generalized three-dimensional bilateral filter is applied to the attenuation and phase images created with filtered backprojection reconstructions from grating-based phase-contrast tomography. In comparison to established bilateral filters, we obtain improved noise reduction and at the same time a better preservation of edges in the images on the examples of a simulated soft-tissue phantom, a human cerebellum and a human artery sample. The applied full noise covariance is determined via cross-correlation of the image noise. The filter results yield an improved feature recovery based on enhanced noise suppression and edge preservation as shown here on the example of attenuation and phase images captured with grating-based phase-contrast computed tomography. This is supported by quantitative image analysis. Without being bound to phase-contrast imaging, this generalized filter is applicable to any kind of noise-afflicted image data with or without noise correlation. Therefore, it can be utilized in various imaging applications and fields.
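
    For orientation, a plain single-image bilateral filter in Python is sketched below; the paper's generalization additionally weights by the full inter-image noise covariance, which this sketch omits. Kernel widths are arbitrary.

      import numpy as np

      def bilateral(img, radius=3, sigma_s=2.0, sigma_r=0.1):
          out = np.zeros_like(img)
          pad = np.pad(img, radius, mode="reflect")
          ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
          spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))  # domain kernel
          for i in range(img.shape[0]):
              for j in range(img.shape[1]):
                  patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                  # Range kernel: down-weight neighbours with dissimilar values.
                  w = spatial * np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
                  out[i, j] = (w * patch).sum() / w.sum()
          return out

      noisy = 0.5 + np.random.default_rng(9).normal(0, 0.05, (64, 64))
      smooth = bilateral(noisy)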

  20. Integrative microbial community analysis reveals full-scale enhanced biological phosphorus removal under tropical conditions

    PubMed Central

    Law, Yingyu; Kirkegaard, Rasmus Hansen; Cokro, Angel Anisa; Liu, Xianghui; Arumugam, Krithika; Xie, Chao; Stokholm-Bjerregaard, Mikkel; Drautz-Moses, Daniela I.; Nielsen, Per Halkjær; Wuertz, Stefan; Williams, Rohan B. H.

    2016-01-01

    Management of phosphorus discharge from human waste is essential for the control of eutrophication in surface waters. Enhanced biological phosphorus removal (EBPR) is a sustainable, efficient way of removing phosphorus from waste water without employing chemical precipitation, but is assumed unachievable in tropical temperatures due to conditions that favour glycogen accumulating organisms (GAOs) over polyphosphate accumulating organisms (PAOs). Here, we show these assumptions are unfounded by studying comparative community dynamics in a full-scale plant following systematic perturbation of operational conditions, which modified community abundance, function and physicochemical state. A statistically significant increase in the relative abundance of the PAO Accumulibacter was associated with improved EBPR activity. GAO relative abundance also increased, challenging the assumption of competition. An Accumulibacter bin-genome was identified from a whole community metagenomic survey, and comparative analysis against extant Accumulibacter genomes suggests a close relationship to Type II. Analysis of the associated metatranscriptome data revealed that genes encoding proteins involved in the tricarboxylic acid cycle and glycolysis pathways were highly expressed, consistent with metabolic modelling results. Our findings show that tropical EBPR is indeed possible, highlight the translational potential of studying competition dynamics in full-scale waste water communities and carry implications for plant design in tropical regions. PMID:27193869

  1. Integrative microbial community analysis reveals full-scale enhanced biological phosphorus removal under tropical conditions

    NASA Astrophysics Data System (ADS)

    Law, Yingyu; Kirkegaard, Rasmus Hansen; Cokro, Angel Anisa; Liu, Xianghui; Arumugam, Krithika; Xie, Chao; Stokholm-Bjerregaard, Mikkel; Drautz-Moses, Daniela I.; Nielsen, Per Halkjær; Wuertz, Stefan; Williams, Rohan B. H.

    2016-05-01

    Management of phosphorus discharge from human waste is essential for the control of eutrophication in surface waters. Enhanced biological phosphorus removal (EBPR) is a sustainable, efficient way of removing phosphorus from waste water without employing chemical precipitation, but is assumed unachievable in tropical temperatures due to conditions that favour glycogen accumulating organisms (GAOs) over polyphosphate accumulating organisms (PAOs). Here, we show these assumptions are unfounded by studying comparative community dynamics in a full-scale plant following systematic perturbation of operational conditions, which modified community abundance, function and physicochemical state. A statistically significant increase in the relative abundance of the PAO Accumulibacter was associated with improved EBPR activity. GAO relative abundance also increased, challenging the assumption of competition. An Accumulibacter bin-genome was identified from a whole community metagenomic survey, and comparative analysis against extant Accumulibacter genomes suggests a close relationship to Type II. Analysis of the associated metatranscriptome data revealed that genes encoding proteins involved in the tricarboxylic acid cycle and glycolysis pathways were highly expressed, consistent with metabolic modelling results. Our findings show that tropical EBPR is indeed possible, highlight the translational potential of studying competition dynamics in full-scale waste water communities and carry implications for plant design in tropical regions.

  2. A Tablet-PC Software Application for Statistics Classes

    ERIC Educational Resources Information Center

    Probst, Alexandre C.

    2014-01-01

    A significant deficiency in the area of introductory statistics education exists: Student performance on standardized assessments after a full semester statistics course is poor and students report a very low desire to learn statistics. Research on the current generation of students indicates an affinity for technology and for multitasking.…

  3. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android statistics data analysis application that can be accessed through mobile devices, making statistical tools easier for users to reach. The Statistics Data Analysis application covers various topics in basic statistics along with parametric statistical data analysis. The output of the application is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed in the Java programming language. The server-side programming language is PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology used is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and to make it easier for students to understand statistical analysis on mobile devices.

  4. Uninduced adipose-derived stem cells repair the defect of full-thickness hyaline cartilage.

    PubMed

    Zhang, Hai-Ning; Li, Lei; Leng, Ping; Wang, Ying-Zhen; Lv, Cheng-Yu

    2009-04-01

    To test the effect of stem cells derived from widely distributed fat tissue on repairing full-thickness hyaline cartilage defects. Adipose-derived stem cells (ADSCs) were derived from adipose tissue and cultured in vitro. Twenty-seven New Zealand white rabbits were divided randomly into three groups. The cultured ADSCs mixed with calcium alginate gel were used to fill full-thickness hyaline cartilage defects created at the patellofemoral joint; defects repaired with gel alone or left untreated served as control groups. After 4, 8 and 12 weeks, the reconstructed tissue was evaluated macroscopically and microscopically. Histological analysis and qualitative scoring were also performed to assess the outcome. Full-thickness hyaline cartilage defects were repaired completely with ADSC-derived tissue, and the result was better in the ADSC group than in the controls. The microstructure of the tissue reconstructed with ADSCs was similar to that of hyaline cartilage and contained more cells and regular matrix fibers than in the other groups. Plenty of collagen fibers around the cells could be seen under transmission electron microscopy. Statistical analysis revealed a significant difference in comparison with the other groups at each time point (t = 4.360, P < 0.01). These results indicate that stem cells derived from mature adipose tissue without induction possess the ability to repair cartilage defects.

  5. Change detection in a time series of polarimetric SAR data by an omnibus test statistic and its factorization (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Nielsen, Allan A.; Conradsen, Knut; Skriver, Henning

    2016-10-01

    Test statistics for comparison of real (as opposed to complex) variance-covariance matrices exist in the statistics literature [1]. In earlier publications we have described a test statistic for the equality of two variance-covariance matrices following the complex Wishart distribution, with an associated p-value [2]. We showed their application to bitemporal change detection and to edge detection [3] in multilook, polarimetric synthetic aperture radar (SAR) data in the covariance matrix representation [4]. The test statistic and the associated p-value are described in [5] also. In [6] we focussed on the block-diagonal case, elaborated on some computer implementation issues, and gave examples of the application to change detection in both full and dual polarization bitemporal, bifrequency, multilook SAR data. In [7] we described an omnibus test statistic Q for the equality of k variance-covariance matrices following the complex Wishart distribution. We also described a factorization Q = R2 R3 … Rk, where Q and the Rj determine if and when a difference occurs. Additionally, we gave p-values for Q and the Rj. Finally, we demonstrated the use of Q, the Rj, and the p-values for change detection in truly multitemporal, full polarization SAR data. Here we illustrate the methods by means of airborne L-band SAR data (EMISAR) [8,9]. The methods may also be applied to other polarimetric SAR data, such as data from Sentinel-1, COSMO-SkyMed, TerraSAR-X, ALOS, and RadarSat-2, and to single-pol data. The account given here closely follows that given in our recent IEEE TGRS paper [7]. Selected References [1] Anderson, T. W., An Introduction to Multivariate Statistical Analysis, John Wiley, New York, third ed. (2003). [2] Conradsen, K., Nielsen, A. A., Schou, J., and Skriver, H., "A test statistic in the complex Wishart distribution and its application to change detection in polarimetric SAR data," IEEE Transactions on Geoscience and Remote Sensing 41(1): 4-19, 2003. [3] Schou, J., Skriver, H., Nielsen, A. A., and Conradsen, K., "CFAR edge detector for polarimetric SAR images," IEEE Transactions on Geoscience and Remote Sensing 41(1): 20-32, 2003. [4] van Zyl, J. J. and Ulaby, F. T., "Scattering matrix representation for simple targets," in Radar Polarimetry for Geoscience Applications, Ulaby, F. T. and Elachi, C., eds., Artech, Norwood, MA (1990). [5] Canty, M. J., Image Analysis, Classification and Change Detection in Remote Sensing, with Algorithms for ENVI/IDL and Python, Taylor & Francis, CRC Press, third revised ed. (2014). [6] Nielsen, A. A., Conradsen, K., and Skriver, H., "Change detection in full and dual polarization, single- and multi-frequency SAR data," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 8(8): 4041-4048, 2015. [7] Conradsen, K., Nielsen, A. A., and Skriver, H., "Determining the points of change in time series of polarimetric SAR data," IEEE Transactions on Geoscience and Remote Sensing 54(5): 3007-3024, 2016. [9] Christensen, E. L., Skou, N., Dall, J., Woelders, K., Jørgensen, J. H. J., Granholm, J., and Madsen, S. N., "EMISAR: An absolutely calibrated polarimetric L- and C-band SAR," IEEE Transactions on Geoscience and Remote Sensing 36: 1852-1865 (1998).
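
    For the bitemporal case, the log of the test statistic from [2] and [6] has a closed form; a Python sketch with stand-in covariance sums X and Y (each the sum of n looks of p x p Hermitian covariance observations):

      import numpy as np

      def ln_q(X, Y, n, p):
          """ln Q = n(2p ln 2 + ln|X| + ln|Y| - 2 ln|X + Y|); equal matrices give 0."""
          logdet = lambda A: np.log(np.abs(np.linalg.det(A)))
          return n * (2 * p * np.log(2) + logdet(X) + logdet(Y) - 2 * logdet(X + Y))

      p, n = 3, 13
      X = n * np.diag([1.0, 0.5, 0.25])   # hypothetical covariance sums
      Y = n * np.diag([1.2, 0.5, 0.25])
      print(ln_q(X, Y, n, p))             # strongly negative values indicate change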

  6. Automated Solar Flare Detection and Feature Extraction in High-Resolution and Full-Disk Hα Images

    NASA Astrophysics Data System (ADS)

    Yang, Meng; Tian, Yu; Liu, Yangyi; Rao, Changhui

    2018-05-01

    In this article, an automated solar flare detection method applied to both full-disk and local high-resolution Hα images is proposed. An adaptive gray threshold and an area threshold are used to segment the flare region. Features of each detected flare event are extracted, e.g. the start, peak, and end time, the importance class, and the brightness class. Experimental results have verified that the proposed method can obtain more stable and accurate segmentation results than previous works on full-disk images from Big Bear Solar Observatory (BBSO) and Kanzelhöhe Observatory for Solar and Environmental Research (KSO), and satisfying segmentation results on high-resolution images from the Goode Solar Telescope (GST). Moreover, the extracted flare features correlate well with the data given by KSO. The method may be able to implement a more complicated statistical analysis of Hα solar flares.

  7. Prior topical anesthesia reduces time to full cycloplegia in Chinese.

    PubMed

    Siu, A W; Sum, A C; Lee, D T; Tam, K W; Chan, S W

    1999-01-01

    To investigate the effect of prior anesthesia on the time to full cycloplegia in young Chinese subjects. The amplitude of accommodation was monitored over a 50-minute interval after the application of 1% cyclopentolate hydrochloride with a pretreatment of 0.4% benoxinate (oxybuprocaine) or 0.9% saline solution (control). Using a nonlinear mathematical model, the rate of accommodative loss (k) and the time required for 95% of total cycloplegia (T95%) were determined. Statistical analysis revealed a significantly faster rate of accommodative loss (P < .0001) after prior anesthesia (0.129 +/- 0.05) compared with the controls (0.103 +/- 0.04). T95% was noted at 26.43 +/- 10.22 minutes after prior anesthesia, which was significantly shorter (P < .0001) than that after the saline treatment (35.28 +/- 16.51 minutes). Prior application of topical anesthetic can shorten the time to full cycloplegia for people, such as the Chinese, with dark irides.
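
    One way to realize such a model is an exponential decay A(t) = A0·exp(-kt), for which 95% loss occurs at T95 = ln(20)/k; the fit below uses simulated amplitudes, and the study's actual model may differ.

      import numpy as np
      from scipy.optimize import curve_fit

      t = np.arange(0, 51, 5, dtype=float)    # minutes after instillation
      rng = np.random.default_rng(10)
      amp = 8.0 * np.exp(-0.13 * t) + rng.normal(0, 0.2, t.size)  # diopters

      (a0, k), _ = curve_fit(lambda t, a0, k: a0 * np.exp(-k * t), t, amp, p0=(8, 0.1))
      print(f"k = {k:.3f} /min, T95% = {np.log(20) / k:.1f} min")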

  8. Non-arbitrage in financial markets: A Bayesian approach for verification

    NASA Astrophysics Data System (ADS)

    Cerezetti, F. V.; Stern, Julio Michael

    2012-10-01

    The concept of non-arbitrage plays an essential role in finance theory. Under certain regularity conditions, the Fundamental Theorem of Asset Pricing states that, in non-arbitrage markets, prices of financial instruments are martingale processes. In this theoretical framework, the analysis of the statistical distributions of financial assets can assist in understanding how participants behave in the markets, and may or may not engender arbitrage conditions. Assuming an underlying Variance Gamma statistical model, this study aims to test, using the FBST (Full Bayesian Significance Test), whether there is a relevant price difference between essentially the same financial asset traded at two distinct locations. Specifically, we investigate and compare the behavior of call options on the BOVESPA Index traded at (a) the Equities Segment and (b) the Derivatives Segment of BM&FBovespa. Our results seem to point out significant statistical differences. To what extent this evidence is actually the expression of perennial arbitrage opportunities is still an open question.

  9. Statistical summaries of fatigue data for design purposes

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.

    1983-01-01

    Two methods are discussed for constructing a design curve on the safe side of fatigue data. Both the tolerance interval and equivalent prediction interval (EPI) concepts provide such a curve while accounting for both the distribution of the estimators in small samples and the data scatter. The EPI is also useful as a mechanism for providing necessary statistics on S-N data for a full reliability analysis which includes uncertainty in all fatigue design factors. Examples of statistical analyses of the general strain-life relationship are presented. The tolerance limit and EPI techniques for defining a design curve are demonstrated. Examples using WASPALOY B and RQC-100 data demonstrate that a reliability model could be constructed by considering the fatigue strength and fatigue ductility coefficients as two independent random variables. A technique given for establishing the fatigue strength for high cycle lives relies on extrapolation and also accounts for "runners." A reliability model or design value can be specified.
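
    As a concrete example of the tolerance-interval idea, the sketch below computes a one-sided lower tolerance bound (90% coverage, 95% confidence) on lognormal fatigue lives at a single stress level, using the noncentral-t tolerance factor; the data are synthetic.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(11)
      log_life = rng.normal(5.0, 0.15, 15)   # log10 cycles to failure, n = 15
      n = log_life.size

      coverage, confidence = 0.90, 0.95
      delta = stats.norm.ppf(coverage) * np.sqrt(n)
      k = stats.nct.ppf(confidence, df=n - 1, nc=delta) / np.sqrt(n)
      design = log_life.mean() - k * log_life.std(ddof=1)
      print(f"design curve point: 10^{design:.2f} cycles")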

  10. Identifying natural flow regimes using fish communities

    NASA Astrophysics Data System (ADS)

    Chang, Fi-John; Tsai, Wen-Ping; Wu, Tzu-Ching; Chen, Hung-kwai; Herricks, Edwin E.

    2011-10-01

    Modern water resources management has adopted natural flow regimes as reasonable targets for river restoration and conservation. The characterization of a natural flow regime begins with the development of hydrologic statistics from flow records. However, little guidance exists for defining the period of record needed for regime determination. In Taiwan, the Taiwan Eco-hydrological Indicator System (TEIS), a group of hydrologic statistics selected for fisheries relevance, is being used to evaluate ecological flows. The TEIS consists of a group of hydrologic statistics selected to characterize the relationships between flow and the life history of indigenous species. Using the TEIS and biosurvey data for Taiwan, this paper identifies the length of hydrologic record sufficient for natural flow regime characterization. To define the ecological hydrology of fish communities, this study connected hydrologic statistics to fish communities by using methods to define antecedent conditions that influence existing community composition. A moving average method was applied to TEIS statistics to reflect the effects of antecedent flow condition and a point-biserial correlation method was used to relate fisheries collections with TEIS statistics. The resulting fish species-TEIS (FISH-TEIS) hydrologic statistics matrix takes full advantage of historical flows and fisheries data. The analysis indicates that, in the watersheds analyzed, averaging TEIS statistics for the present year and 3 years prior to the sampling date, termed MA(4), is sufficient to develop a natural flow regime. This result suggests that flow regimes based on hydrologic statistics for the period of record can be replaced by regimes developed for sampled fish communities.
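
    The MA(4) construction and the point-biserial screening can be sketched as follows, with hypothetical annual statistics and presence/absence records:

      import numpy as np
      import pandas as pd
      from scipy.stats import pointbiserialr

      rng = np.random.default_rng(12)
      annual = pd.Series(rng.normal(50, 10, 30))   # one TEIS-style statistic per year
      ma4 = annual.rolling(window=4).mean()        # sampling year + 3 antecedent years

      present = (ma4 + rng.normal(0, 5, 30) > 50).astype(float)  # species presence
      valid = ma4.notna()
      r, p = pointbiserialr(present[valid], ma4[valid])
      print(f"point-biserial r = {r:.2f} (P = {p:.3f})")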

  11. Genome-wide comparisons of phylogenetic similarities between partial genomic regions and the full-length genome in Hepatitis E virus genotyping.

    PubMed

    Wang, Shuai; Wei, Wei; Luo, Xuenong; Cai, Xuepeng

    2014-01-01

    Besides the complete genome, different partial genomic sequences of Hepatitis E virus (HEV) have been used in genotyping studies, making it difficult to compare the results based on them. No commonly agreed partial region for HEV genotyping has been determined. In this study, we used a statistical method to evaluate the phylogenetic performance of partial genomic sequences across the whole genome, comparing evolutionary distances between genomic regions and the full-length genomes of 101 HEV isolates to identify short genomic regions that can reproduce HEV genotype assignments based on full-length genomes. Several genomic regions, especially one genomic region at the 3' end of the papain-like cysteine protease domain, were detected to have relatively high phylogenetic correlations with the full-length genome. Phylogenetic analyses confirmed that these regions perform identically to the full-length genome in genotyping, in which the HEV isolates involved could be divided into reasonable genotypes. This analysis may be of value in developing a partial sequence-based consensus classification of HEV species.
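
    A minimal sketch of the comparison idea, with random toy sequences standing in for aligned HEV genomes: compute pairwise p-distances on a candidate partial region and on the full alignment, then correlate the two distance vectors.

    ```python
    # Sketch of the region-versus-genome comparison: pairwise p-distances on a
    # partial region correlated with those on the full alignment. Real HEV
    # alignments are replaced by random toy sequences.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(2)
    n_isolates, genome_len = 12, 1200
    alignment = rng.integers(0, 4, size=(n_isolates, genome_len))  # toy aligned genomes

    def pairwise_p_distances(seqs):
        """Condensed vector of pairwise proportions of differing sites."""
        n = len(seqs)
        return np.array([np.mean(seqs[i] != seqs[j])
                         for i in range(n) for j in range(i + 1, n)])

    full = pairwise_p_distances(alignment)
    region = pairwise_p_distances(alignment[:, 300:600])   # one candidate partial region

    r, p = pearsonr(region, full)
    print(f"region/full distance correlation: r = {r:.2f} (p = {p:.2g})")
    ```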

  12. Wash-out in N{sub 2}-dominated leptogenesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hahn-Woernle, F., E-mail: fhahnwo@mppmu.mpg.de

    2010-08-01

    We study the wash-out of a cosmological baryon asymmetry produced via leptogenesis by subsequent interactions. To this end, we focus on a scenario in which a lepton asymmetry is established in the out-of-equilibrium decays of the next-to-lightest right-handed neutrino. We apply the full classical Boltzmann equations without the assumption of kinetic equilibrium and including all quantum statistical factors to calculate the wash-out of the lepton asymmetry by interactions of the lightest right-handed state. We include scattering processes with top quarks in our analysis. This is of particular interest since the wash-out is enhanced by scatterings and the use of mode equations with quantum statistical distribution functions. In this way we provide a restriction on the parameter space for this scenario.

  13. VizieR Online Data Catalog: HARPS timeseries data for HD41248 (Jenkins+, 2014)

    NASA Astrophysics Data System (ADS)

    Jenkins, J. S.; Tuomi, M.

    2017-05-01

    We modeled the HARPS radial velocities of HD 41248 by adopting the analysis techniques and the statistical model applied in Tuomi et al. (2014, arXiv:1405.2016). This model contains Keplerian signals, a linear trend, a moving average component with exponential smoothing, and linear correlations with activity indices, namely, BIS, FWHM, and the chromospheric activity S index. We applied our statistical model outlined above to the full data set of radial velocities for HD 41248, combining the previously published data in Jenkins et al. (2013ApJ...771...41J) with the newly published data in Santos et al. (2014, J/A+A/566/A35), giving rise to a total time series of 223 HARPS (Mayor et al. 2003Msngr.114...20M) velocities. (1 data file).

  14. An investigation into pilot and system response to critical in-flight events, volume 1

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Giffin, W. C.

    1981-01-01

    The scope of a critical in-flight event (CIFE) with emphasis on pilot management of available resources is described. Detailed scenarios for both full mission simulation and written testing of pilot responses to CIFEs, and statistical relationships among pilot characteristics and observed responses, are developed. A model developed to describe pilot response to CIFEs and an analysis of professional flight crews' compliance with specified operating procedures and the relationship with in-flight errors are included.

  15. Full kinetic chain manual and manipulative therapy plus exercise compared with targeted manual and manipulative therapy plus exercise for symptomatic osteoarthritis of the hip: a randomized controlled trial.

    PubMed

    Brantingham, James W; Parkin-Smith, Gregory; Cassa, Tammy Kay; Globe, Gary A; Globe, Denise; Pollard, Henry; deLuca, Katie; Jensen, Muffit; Mayer, Stephan; Korporaal, Charmaine

    2012-02-01

    To determine the short-term effectiveness of full kinematic chain manual and manipulative therapy (MMT) plus exercise compared with targeted hip MMT plus exercise for symptomatic mild to moderate hip osteoarthritis (OA). Parallel-group randomized trial with 3-month follow-up. Two chiropractic outpatient teaching clinics. A convenience sample of eligible participants (N=111) with symptomatic hip OA was consented and randomly allocated to receive either the experimental or comparison treatment, respectively. Participants in the experimental group received full kinematic chain MMT plus exercise while those in the comparison group received targeted hip MMT plus exercise. Participants in both groups received 9 treatments over a 5-week period. Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), Harris hip score (HHS), and Overall Therapy Effectiveness, alongside estimation of clinically meaningful outcomes. Total dropout was 9% (n=10) with 7% of total data missing, replaced using a multiple imputation method. No statistically significant differences were found between the 2 groups for any of the outcome measures (analysis of covariance, P=.45 and P=.79 for the WOMAC and HHS, respectively). There were no statistically significant differences in the primary or secondary outcome scores when comparing full kinematic chain MMT plus exercise with targeted hip MMT plus exercise for mild to moderate symptomatic hip OA. Consequently, the nonsignificant findings suggest that there would also be no clinically meaningful difference between the 2 groups. The results of this study provide guidance to musculoskeletal practitioners who regularly use MMT that the full kinematic chain approach does not appear to have any benefit over targeted treatment. Copyright © 2012 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  16. Full Wave Analysis of RF Signal Attenuation in a Lossy Cave using a High Order Time Domain Vector Finite Element Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pingenot, J; Rieben, R; White, D

    2004-12-06

    We present a computational study of signal propagation and attenuation of a 200 MHz dipole antenna in a cave environment. The cave is modeled as a straight, lossy tunnel with randomly rough walls. To simulate a broad frequency band, the full wave Maxwell equations are solved directly in the time domain via a high order vector finite element discretization using the massively parallel CEM code EMSolve. The simulation is performed for a series of random meshes in order to generate statistical data for the propagation and attenuation properties of the cave environment. Results for the power spectral density and phase of the electric field vector components are presented and discussed.

  17. Bayesian Orbit Computation Tools for Objects on Geocentric Orbits

    NASA Astrophysics Data System (ADS)

    Virtanen, J.; Granvik, M.; Muinonen, K.; Oszkiewicz, D.

    2013-08-01

    We consider the space-debris orbital inversion problem via the concept of Bayesian inference. The methodology was put forward for the orbital analysis of solar system small bodies in the early 1990s [7] and results in a full solution of the statistical inverse problem, given in terms of an a posteriori probability density function (PDF) for the orbital parameters. We demonstrate the applicability of our statistical orbital analysis software to Earth-orbiting objects, both using well-established Monte Carlo (MC) techniques (for a review, see, e.g., [13]) as well as recently developed Markov-chain MC (MCMC) techniques (e.g., [9]). In particular, we exploit the novel virtual observation MCMC method [8], which is based on the characterization of the phase-space volume of orbital solutions before the actual MCMC sampling. Our statistical methods and the resulting PDFs immediately enable probabilistic impact predictions to be carried out. Furthermore, this can readily be done also for very sparse data sets and data sets of poor quality, provided that some a priori information on the observational uncertainty is available. For asteroids, impact probabilities with the Earth from the discovery night onwards have been provided, e.g., by [11] and [10]; the latter study includes the sampling of the observational-error standard deviation as a random variable.
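
    The MCMC machinery referred to can be sketched generically; below is a random-walk Metropolis sampler for a two-parameter toy observation model standing in for the full six-parameter orbit problem, with invented observations.

    ```python
    # Generic random-walk Metropolis sketch of the Bayesian inversion idea:
    # sample a posterior over model parameters given noisy observations. A
    # 2-parameter toy model stands in for the 6-parameter orbit problem.
    import numpy as np

    rng = np.random.default_rng(3)
    t_obs = np.linspace(0.0, 1.0, 15)
    true = np.array([1.3, 4.0])                          # toy "orbital" parameters
    y_obs = true[0] * np.sin(true[1] * t_obs) + rng.normal(0.0, 0.05, t_obs.size)

    def log_posterior(theta, sigma=0.05):
        model = theta[0] * np.sin(theta[1] * t_obs)
        return -0.5 * np.sum((y_obs - model) ** 2) / sigma**2   # flat prior assumed

    theta = np.array([1.0, 3.5])
    samples, logp = [], log_posterior(theta)
    for _ in range(20_000):
        proposal = theta + rng.normal(0.0, 0.02, size=2)
        logp_new = log_posterior(proposal)
        if np.log(rng.uniform()) < logp_new - logp:      # Metropolis acceptance
            theta, logp = proposal, logp_new
        samples.append(theta.copy())

    samples = np.array(samples[5_000:])                  # drop burn-in
    print("posterior means:", samples.mean(axis=0))
    ```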

  18. Bayesian approach for counting experiment statistics applied to a neutrino point source analysis

    NASA Astrophysics Data System (ADS)

    Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.

    2013-12-01

    In this paper we present a model-independent analysis method following Bayesian statistics to analyse data from a generic counting experiment and apply it to the search for neutrinos from point sources. We discuss a test statistic defined following a Bayesian framework that will be used in the search for a signal. In case no signal is found, we derive an upper limit without the introduction of approximations. The Bayesian approach allows us to obtain the full probability density function for both the background and the signal rate. As such, we have direct access to any signal upper limit. The upper limit derivation directly compares with a frequentist approach and is robust in the case of low-counting observations. Furthermore, it also allows us to account for previous upper limits obtained by other analyses via the concept of prior information, without the need for ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we have applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and we have obtained a flux upper limit, which is in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube, using the same data set.
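
    The core of the counting-experiment construction can be sketched numerically: a Poisson likelihood with known background and a flat prior on the signal rate yields a posterior whose 90% quantile is the upper limit. The counts below are illustrative, not IceCube values.

    ```python
    # Bayesian upper limit for a counting experiment: Poisson likelihood with
    # known background b, flat prior on the signal rate s. Numbers are made up.
    import numpy as np

    n_obs, b = 4, 2.7                                # observed counts, expected background
    s = np.linspace(0.0, 30.0, 30_001)

    log_post = n_obs * np.log(s + b) - (s + b)       # unnormalized log posterior
    post = np.exp(log_post - log_post.max())

    cdf = np.cumsum(post)
    cdf /= cdf[-1]                                   # normalize on the grid
    upper_90 = s[np.searchsorted(cdf, 0.90)]
    print(f"90% credible upper limit on the signal rate: {upper_90:.2f} events")
    ```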

  19. Analysis of variance to assess statistical significance of Laplacian estimation accuracy improvement due to novel variable inter-ring distances concentric ring electrodes.

    PubMed

    Makeyev, Oleksandr; Joe, Cody; Lee, Colin; Besio, Walter G

    2017-07-01

    Concentric ring electrodes have shown promise in non-invasive electrophysiological measurement demonstrating their superiority to conventional disc electrodes, in particular, in accuracy of Laplacian estimation. Recently, we have proposed novel variable inter-ring distances concentric ring electrodes. Analytic and finite element method modeling results for linearly increasing distances electrode configurations suggested they may decrease the truncation error resulting in more accurate Laplacian estimates compared to currently used constant inter-ring distances configurations. This study assesses statistical significance of Laplacian estimation accuracy improvement due to novel variable inter-ring distances concentric ring electrodes. Full factorial design of analysis of variance was used with one categorical and two numerical factors: the inter-ring distances, the electrode diameter, and the number of concentric rings in the electrode. The response variables were the Relative Error and the Maximum Error of Laplacian estimation computed using a finite element method model for each of the combinations of levels of three factors. Effects of the main factors and their interactions on Relative Error and Maximum Error were assessed and the obtained results suggest that all three factors have statistically significant effects in the model confirming the potential of using inter-ring distances as a means of improving accuracy of Laplacian estimation.
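
    A compact sketch of such a full factorial ANOVA, assuming hypothetical Relative Error responses over the three factors named in the abstract (inter-ring distances as a categorical factor, electrode diameter, and number of rings):

    ```python
    # Full factorial ANOVA sketch: Relative Error modeled with main effects and
    # interactions of inter-ring distances (categorical), electrode diameter,
    # and number of rings. Response values are placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(4)
    grid = pd.DataFrame([(d, diam, r)
                         for d in ("constant", "linearly_increasing")
                         for diam in (5.0, 10.0, 20.0)
                         for r in (2, 3, 4)],
                        columns=["distances", "diameter", "rings"])
    grid["relative_error"] = rng.normal(0.10, 0.02, len(grid))  # placeholder responses

    model = smf.ols("relative_error ~ C(distances) * diameter * rings", data=grid).fit()
    print(anova_lm(model, typ=2))
    ```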

  20. Clinical periodontal variables in patients with and without dementia-a systematic review and meta-analysis.

    PubMed

    Maldonado, Alejandra; Laugisch, Oliver; Bürgin, Walter; Sculean, Anton; Eick, Sigrun

    2018-06-22

    Considering the increasing number of elderly people, dementia has gained an important role in today's society. Although the contributing factors for dementia have not been fully understood, chronic periodontitis (CP) seems to have a possible link to dementia. To conduct a systematic review including meta-analysis in order to assess potential differences in clinical periodontal variables between patients with dementia and non-demented individuals. The following focused question was evaluated: is periodontitis associated with dementia? Electronic searches in two databases, MEDLINE and EMBASE, were conducted. Meta-analysis was performed with the collected data in order to find a statistically significant difference in clinical periodontal variables between the group of dementia and the cognitive normal controls. Forty-two articles remained for full text reading. Finally, seven articles met the inclusion criteria and only five studies provided data suitable for meta-analysis. Periodontal probing depth (PPD), bleeding on probing (BOP), gingival bleeding index (GBI), clinical attachment level (CAL), and plaque index (PI) were included as periodontal variables in the meta-analysis. Each variable revealed a statistically significant difference between the groups. In an attempt to reveal an overall difference between the periodontal variables in dementia patients and non-demented individuals, the chosen variables were transformed into units that resulted in a statistically significant overall difference (p < 0.00001). The current findings indicate that compared to systemically healthy individuals, demented patients show significantly worse clinical periodontal variables. However, further epidemiological studies including high numbers of participants, using exact definitions of both dementia and chronic periodontitis, and adjusting for confounders are warranted. These findings appear to support the putative link between CP and dementia. Consequently, the need for periodontal screening and treatment of elderly demented people should be emphasized.
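
    A random-effects pooling of the kind used in such meta-analyses can be sketched with the DerSimonian-Laird estimator; the five effect sizes and standard errors below are placeholders, not the review's data.

    ```python
    # DerSimonian-Laird random-effects pooling, sketched for a periodontal
    # variable (e.g. mean CAL difference, dementia minus controls). The five
    # effects and standard errors are invented for illustration.
    import numpy as np

    effects = np.array([0.55, 0.80, 0.30, 1.10, 0.65])   # per-study mean differences
    se = np.array([0.20, 0.35, 0.15, 0.40, 0.25])

    w = 1.0 / se**2                                       # fixed-effect weights
    theta_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - theta_fe) ** 2)             # Cochran's Q
    df = len(effects) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_re = 1.0 / (se**2 + tau2)                           # random-effects weights
    theta_re = np.sum(w_re * effects) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    print(f"pooled difference = {theta_re:.2f} ± {1.96 * se_re:.2f} (tau^2 = {tau2:.3f})")
    ```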

  1. Descriptive data analysis.

    PubMed

    Thompson, Cheryl Bagley

    2009-01-01

    This 13th article of the Basics of Research series is the first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.

  2. Best estimate plus uncertainty analysis of departure from nucleate boiling limiting case with CASL core simulator VERA-CS in response to PWR main steam line break event

    DOE PAGES

    Brown, Cameron S.; Zhang, Hongbin; Kucukboyaci, Vefa; ...

    2016-09-07

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics subchannel code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS was used to simulate a typical pressurized water reactor (PWR) full core response with 17x17 fuel assemblies for a main steam line break (MSLB) accident scenario with the most reactive rod cluster control assembly stuck out of the core. The accident scenario was initiated at the hot zero power (HZP) at the end of the first fuel cycle with return to power state points that were determined by a system analysis code and the most limiting state point was chosen for core analysis. The best estimate plus uncertainty (BEPU) analysis method was applied using Wilks' nonparametric statistical approach. In this way, 59 full core simulations were performed to provide the minimum departure from nucleate boiling ratio (MDNBR) at the 95/95 (95% probability with 95% confidence level) tolerance limit. The results show that this typical PWR core remains within MDNBR safety limits for the MSLB accident.

  3. Best estimate plus uncertainty analysis of departure from nucleate boiling limiting case with CASL core simulator VERA-CS in response to PWR main steam line break event

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Cameron S.; Zhang, Hongbin; Kucukboyaci, Vefa

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics subchannel code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS was used to simulate a typical pressurized water reactor (PWR) full core response with 17x17 fuel assemblies for a main steam line break (MSLB) accident scenario with the most reactive rod cluster control assembly stuck out of the core. The accident scenario was initiated at the hot zero power (HZP) at the end of the first fuel cycle with return to power state points that were determined by a system analysis code and the most limiting state point was chosen for core analysis. The best estimate plus uncertainty (BEPU) analysis method was applied using Wilks' nonparametric statistical approach. In this way, 59 full core simulations were performed to provide the minimum departure from nucleate boiling ratio (MDNBR) at the 95/95 (95% probability with 95% confidence level) tolerance limit. The results show that this typical PWR core remains within MDNBR safety limits for the MSLB accident.
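
    The choice of 59 runs follows from Wilks' first-order, one-sided nonparametric tolerance bound: the sample minimum covers 95% of the population with 95% confidence once 1 - 0.95^n >= 0.95. A sketch, with made-up MDNBR values:

    ```python
    # Why 59 runs: smallest n whose sample minimum is a 95%-probability /
    # 95%-confidence one-sided tolerance limit. MDNBR values are invented.
    import numpy as np

    n = 1
    while 1.0 - 0.95**n < 0.95:
        n += 1
    print(f"required runs: {n}")             # -> 59

    rng = np.random.default_rng(5)
    mdnbr = rng.normal(2.1, 0.15, n)         # placeholder MDNBR from 59 core simulations
    print(f"95/95 MDNBR tolerance limit = min of {n} runs = {mdnbr.min():.3f}")
    ```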

  4. Uncertainty Analysis and Order-by-Order Optimization of Chiral Nuclear Interactions

    DOE PAGES

    Carlsson, Boris; Forssen, Christian; Fahlin Strömberg, D.; ...

    2016-02-24

    Chiral effective field theory (χEFT) provides a systematic approach to describe low-energy nuclear forces. Moreover, χEFT is able to provide well-founded estimates of statistical and systematic uncertainties, although this unique advantage has not yet been fully exploited. We fill this gap by performing an optimization and statistical analysis of all the low-energy constants (LECs) up to next-to-next-to-leading order. Our optimization protocol corresponds to a simultaneous fit to scattering and bound-state observables in the pion-nucleon, nucleon-nucleon, and few-nucleon sectors, thereby utilizing the full model capabilities of χEFT. Finally, we study the effect on other observables by demonstrating forward-error-propagation methods that can easily be adopted by future works. We employ mathematical optimization and implement automatic differentiation to attain efficient and machine-precise first- and second-order derivatives of the objective function with respect to the LECs. This is also vital for the regression analysis. We use power-counting arguments to estimate the systematic uncertainty that is inherent to χEFT and we construct chiral interactions at different orders with quantified uncertainties. Statistical error propagation is compared with Monte Carlo sampling, showing that statistical errors are in general small compared to systematic ones. In conclusion, we find that a simultaneous fit to different sets of data is critical to (i) identify the optimal set of LECs, (ii) capture all relevant correlations, (iii) reduce the statistical uncertainty, and (iv) attain order-by-order convergence in χEFT. Furthermore, certain systematic uncertainties in the few-nucleon sector are shown to get substantially magnified in the many-body sector, in particular when varying the cutoff in the chiral potentials. The methodology and results presented in this paper open a new frontier for uncertainty quantification in ab initio nuclear theory.

  5. An R2 statistic for fixed effects in the linear mixed model.

    PubMed

    Edwards, Lloyd J; Muller, Keith E; Wolfinger, Russell D; Qaqish, Bahjat F; Schabenberger, Oliver

    2008-12-20

    Statisticians most often use the linear mixed model to analyze Gaussian longitudinal data. The value and familiarity of the R² statistic in the linear univariate model naturally creates great interest in extending it to the linear mixed model. We define and describe how to compute a model R² statistic for the linear mixed model by using only a single model. The proposed R² statistic measures multivariate association between the repeated outcomes and the fixed effects in the linear mixed model. The R² statistic arises as a one-to-one function of an appropriate F statistic for testing all fixed effects (except typically the intercept) in a full model. The statistic compares the full model with a null model with all fixed effects deleted (except typically the intercept) while retaining exactly the same covariance structure. Furthermore, the R² statistic leads immediately to a natural definition of a partial R² statistic. A mixed model in which ethnicity gives a very small p-value as a longitudinal predictor of blood pressure (BP) compellingly illustrates the value of the statistic. In sharp contrast to the extreme p-value, a very small R², a measure of statistical and scientific importance, indicates that ethnicity has an almost negligible association with the repeated BP outcomes for the study.
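
    The abstract states that the R² statistic is a one-to-one function of an F statistic for the fixed effects. By analogy with the univariate identity, a plausible form is sketched below; consult Edwards et al. (2008) for the exact definition.

    ```python
    # Hypothetical sketch: model R^2 implied by an F test, by analogy with the
    # univariate identity R^2 = q F / (q F + ddf). Not the paper's exact formula.
    def r2_from_f(f_stat: float, num_df: int, den_df: int) -> float:
        """R^2 implied by an F statistic with num_df and den_df degrees of freedom."""
        return (num_df * f_stat) / (num_df * f_stat + den_df)

    # Example: F = 7.5 on (3, 120) degrees of freedom.
    print(f"R^2 = {r2_from_f(7.5, 3, 120):.3f}")
    ```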

  6. The association between workplace smoking bans and self-perceived, work-related stress among smoking workers

    PubMed Central

    2012-01-01

    Background There is substantial empirical evidence on the benefits of smoking bans; however, the unintended consequences of this anti-smoking measure have received little attention. This paper examines whether workplace smoking bans (WSBs) are associated with higher self-perceived, work-related stress among smoking workers. Methods A longitudinal representative sample of 3,237 individuals from the Canadian National Population Health Survey from 2000 to 2008 is used. Work-related stress is derived from a 12-item job questionnaire. Two categories of WSBs, full and partial, are included in the analysis, with no ban being the reference category. The analysis also controls for individual socio-demographic characteristics, health status, and provincial and occupational fixed effects. We use fixed-effects linear regression to control for individual time-invariant confounders, both measured and unmeasured, which can affect the relationship between WSBs and work-related stress. To examine the heterogeneous effects of WSBs, the analysis is stratified by gender and age. We check the robustness of our results by re-estimating the baseline specification with the addition of different control variables and a separate analysis for non-smokers. Results Multivariate analysis reveals a positive and statistically significant association between full (β = 0.75, CI = 0.19-1.32) or partial (β = 0.69, CI = 0.12-1.26) WSBs and the level of self-perceived, work-related stress among smoking workers compared to those with no WSB. We also find that this association varies by gender and age. In particular, WSBs are significantly associated with higher work stress only for males and young adults (aged 18-40). No statistically significant association is found between WSBs and the level of self-perceived, work-related stress among non-smoking workers. Conclusion The results of this study do not imply that WSBs are the main determinant of self-perceived, work-related stress among smokers, but they provide suggestive evidence that these may be positively related. PMID:22329920

  7. The association between workplace smoking bans and self-perceived, work-related stress among smoking workers.

    PubMed

    Azagba, Sunday; Sharaf, Mesbah F

    2012-02-13

    There is substantial empirical evidence on the benefits of smoking bans; however, the unintended consequences of this anti-smoking measure have received little attention. This paper examines whether workplace smoking bans (WSBs) are associated with higher self-perceived, work-related stress among smoking workers. A longitudinal representative sample of 3,237 individuals from the Canadian National Population Health Survey from 2000 to 2008 is used. Work-related stress is derived from a 12-item job questionnaire. Two categories of WSBs, full and partial, are included in the analysis, with no ban being the reference category. The analysis also controls for individual socio-demographic characteristics, health status, and provincial and occupational fixed effects. We use fixed-effects linear regression to control for individual time-invariant confounders, both measured and unmeasured, which can affect the relationship between WSBs and work-related stress. To examine the heterogeneous effects of WSBs, the analysis is stratified by gender and age. We check the robustness of our results by re-estimating the baseline specification with the addition of different control variables and a separate analysis for non-smokers. Multivariate analysis reveals a positive and statistically significant association between full (β = 0.75, CI = 0.19-1.32) or partial (β = 0.69, CI = 0.12-1.26) WSBs and the level of self-perceived, work-related stress among smoking workers compared to those with no WSB. We also find that this association varies by gender and age. In particular, WSBs are significantly associated with higher work stress only for males and young adults (aged 18-40). No statistically significant association is found between WSBs and the level of self-perceived, work-related stress among non-smoking workers. The results of this study do not imply that WSBs are the main determinant of self-perceived, work-related stress among smokers, but they provide suggestive evidence that these may be positively related.
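
    The fixed-effects (within) estimator used in both versions of this study can be sketched by demeaning outcome and regressors within each person, so that time-invariant individual confounders drop out; the panel below is simulated, with the true effect set to 0.75 to echo the reported coefficient.

    ```python
    # Within (fixed-effects) estimator sketch: demean stress and the ban
    # indicator within each person, then regress. All data are simulated.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(6)
    n_people, n_waves = 200, 5
    df = pd.DataFrame({
        "person": np.repeat(np.arange(n_people), n_waves),
        "full_ban": rng.integers(0, 2, n_people * n_waves).astype(float),
    })
    person_effect = np.repeat(rng.normal(0, 2, n_people), n_waves)  # unobserved heterogeneity
    df["stress"] = 0.75 * df["full_ban"] + person_effect + rng.normal(0, 1, len(df))

    within = df.groupby("person")[["stress", "full_ban"]].transform(lambda s: s - s.mean())
    beta = (within["full_ban"] * within["stress"]).sum() / (within["full_ban"] ** 2).sum()
    print(f"within estimate of the full-ban effect: {beta:.2f} (true 0.75)")
    ```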

  8. Segment and fit thresholding: a new method for image analysis applied to microarray and immunofluorescence data.

    PubMed

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E; Allen, Peter J; Sempere, Lorenzo F; Haab, Brian B

    2015-10-06

    Experiments involving the high-throughput quantification of image data require algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multicolor, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu's method for selected images. SFT promises to advance the goal of full automation in image analysis.
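
    A heavily reduced sketch of the segment-statistics idea (not the published algorithm): tile the image, compute per-segment statistics, call low-signal tiles background, and derive the threshold from them.

    ```python
    # Toy version of the segment-statistics idea behind SFT: per-tile means and
    # standard deviations, a crude background call, and a threshold set from the
    # background tiles. Not the published algorithm; image is synthetic.
    import numpy as np

    rng = np.random.default_rng(7)
    image = rng.normal(100.0, 5.0, (256, 256))           # background
    image[96:160, 96:160] += 60.0                         # one bright "spot"

    seg = 16
    tiles = image.reshape(256 // seg, seg, 256 // seg, seg).swapaxes(1, 2).reshape(-1, seg, seg)
    means = tiles.mean(axis=(1, 2))
    stds = tiles.std(axis=(1, 2))

    background = means < np.percentile(means, 75)         # crude background call
    mu, sigma = means[background].mean(), stds[background].mean()
    threshold = mu + 3.0 * sigma                          # signal = pixels above background
    print(f"threshold = {threshold:.1f}, signal fraction = {(image > threshold).mean():.3f}")
    ```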

  9. Neandertal admixture in Eurasia confirmed by maximum-likelihood analysis of three genomes.

    PubMed

    Lohse, Konrad; Frantz, Laurent A F

    2014-04-01

    Although there has been much interest in estimating histories of divergence and admixture from genomic data, it has proved difficult to distinguish recent admixture from long-term structure in the ancestral population. Thus, recent genome-wide analyses based on summary statistics have sparked controversy about the possibility of interbreeding between Neandertals and modern humans in Eurasia. Here we derive the probability of full mutational configurations in nonrecombining sequence blocks under both admixture and ancestral structure scenarios. Dividing the genome into short blocks gives an efficient way to compute maximum-likelihood estimates of parameters. We apply this likelihood scheme to triplets of human and Neandertal genomes and compare the relative support for a model of admixture from Neandertals into Eurasian populations after their expansion out of Africa against a history of persistent structure in their common ancestral population in Africa. Our analysis allows us to conclusively reject a model of ancestral structure in Africa and instead reveals strong support for Neandertal admixture in Eurasia at a higher rate (3.4-7.3%) than suggested previously. Using analysis and simulations we show that our inference is more powerful than previous summary statistics and robust to realistic levels of recombination.

  10. Neandertal Admixture in Eurasia Confirmed by Maximum-Likelihood Analysis of Three Genomes

    PubMed Central

    Lohse, Konrad; Frantz, Laurent A. F.

    2014-01-01

    Although there has been much interest in estimating histories of divergence and admixture from genomic data, it has proved difficult to distinguish recent admixture from long-term structure in the ancestral population. Thus, recent genome-wide analyses based on summary statistics have sparked controversy about the possibility of interbreeding between Neandertals and modern humans in Eurasia. Here we derive the probability of full mutational configurations in nonrecombining sequence blocks under both admixture and ancestral structure scenarios. Dividing the genome into short blocks gives an efficient way to compute maximum-likelihood estimates of parameters. We apply this likelihood scheme to triplets of human and Neandertal genomes and compare the relative support for a model of admixture from Neandertals into Eurasian populations after their expansion out of Africa against a history of persistent structure in their common ancestral population in Africa. Our analysis allows us to conclusively reject a model of ancestral structure in Africa and instead reveals strong support for Neandertal admixture in Eurasia at a higher rate (3.4−7.3%) than suggested previously. Using analysis and simulations we show that our inference is more powerful than previous summary statistics and robust to realistic levels of recombination. PMID:24532731

  11. Segment and Fit Thresholding: A New Method for Image Analysis Applied to Microarray and Immunofluorescence Data

    PubMed Central

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M.; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E.; Allen, Peter J.; Sempere, Lorenzo F.; Haab, Brian B.

    2016-01-01

    Certain experiments involve the high-throughput quantification of image data, thus requiring algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multi-color, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu’s method for selected images. SFT promises to advance the goal of full automation in image analysis. PMID:26339978

  12. Summary and Statistical Analysis of the First AIAA Sonic Boom Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Morgenstern, John M.

    2014-01-01

    A summary is provided for the First AIAA Sonic Boom Workshop held 11 January 2014 in conjunction with AIAA SciTech 2014. Near-field pressure signatures extracted from computational fluid dynamics solutions are gathered from nineteen participants representing three countries for the two required cases, an axisymmetric body and simple delta wing body. Structured multiblock, unstructured mixed-element, unstructured tetrahedral, overset, and Cartesian cut-cell methods are used by the participants. Participants provided signatures computed on participant generated and solution adapted grids. Signatures are also provided for a series of uniformly refined workshop provided grids. These submissions are propagated to the ground and loudness measures are computed. This allows the grid convergence of a loudness measure and a validation metric (difference norm between computed and wind-tunnel-measured near-field signatures) to be studied for the first time. Statistical analysis is also presented for these measures. An optional configuration includes fuselage, wing, tail, flow-through nacelles, and blade sting. This full configuration exhibits more variation in eleven submissions than the sixty submissions provided for each required case. Recommendations are provided for potential improvements to the analysis methods and a possible subsequent workshop.

  13. The impact of particle size and initial solid loading on thermochemical pretreatment of wheat straw for improving sugar recovery.

    PubMed

    Rojas-Rejón, Oscar A; Sánchez, Arturo

    2014-07-01

    This work studies the effect of initial solid load (4-32 %; w/v, DS) and particle size (0.41-50 mm) on monosaccharide yield of wheat straw subjected to dilute H(2)SO(4) (0.75 %, v/v) pretreatment and enzymatic saccharification. Response surface methodology (RSM) based on a full factorial design (FFD) was used for the statistical analysis of pretreatment and enzymatic hydrolysis. The highest xylose yield obtained during pretreatment (ca. 86 % of theoretical) was achieved at 4 % (w/v, DS) and 25 mm. The solid fraction obtained from the first set of experiments was subjected to enzymatic hydrolysis at constant enzyme dosage (17 FPU/g); statistical analysis revealed that glucose yield was favored with solids pretreated at low initial solid loads and small particle sizes. Dynamic experiments showed that glucose yield did not increase after 48 h of enzymatic hydrolysis. Once pretreatment conditions were established, experiments were carried out at several initial solid loadings (4-24 %; w/v, DS) and enzyme dosages (5-50 FPU/g). Two straw sizes (0.41 and 50 mm) were used for verification purposes. The highest glucose yield (ca. 55 % of theoretical) was achieved at 4 % (w/v, DS), 0.41 mm and 50 FPU/g. Statistical analysis of experiments showed that at low enzyme dosage, particle size had a remarkable effect on glucose yield and initial solid load was the main factor for glucose yield.
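
    The RSM/FFD analysis described can be sketched as a second-order polynomial fit of yield in solid loading and particle size; the nine runs and yield values below are fabricated for illustration.

    ```python
    # Response-surface sketch: second-order model of glucose yield in initial
    # solid loading and particle size, as an FFD/RSM analysis would fit.
    import pandas as pd
    import statsmodels.formula.api as smf

    runs = pd.DataFrame({
        "solids": [4, 4, 24, 24, 14, 14, 14, 4, 24],          # % w/v DS
        "size":   [0.41, 50, 0.41, 50, 25, 25, 25, 25, 25],   # mm
        "yield_": [55, 41, 38, 22, 35, 36, 34, 48, 30],       # % of theoretical (made up)
    })
    model = smf.ols("yield_ ~ solids + size + I(solids**2) + I(size**2) + solids:size",
                    data=runs).fit()
    print(model.params.round(3))
    ```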

  14. Curbing-The Metallic Mode In-between: An empirical study qualifying and categorizing restrained sounds known as Curbing based on audio perception, laryngostroboscopic imaging, acoustics, LTAS, and EGG.

    PubMed

    Thuesen, Mathias Aaen; McGlashan, Julian; Sadolin, Cathrine

    2017-09-01

    This study characterizes Curbing, a categorization from the pedagogical method Complete Vocal Technique, as a reduced metallic mode compared with the full metallic modes Overdrive and Edge, by means of audio perception, laryngostroboscopic imaging, acoustics, long-term average spectrum (LTAS), and electroglottography (EGG). Twenty singers were recorded singing sustained vowels in a restrained character known as Curbing. Two studies were performed: (1) laryngostroboscopic examination using a videonasoendoscopic camera system and the Laryngostrobe program; and (2) simultaneous recording of EGG and acoustic signals using Speech Studio. Images were analyzed based on consensus agreement. Statistical analysis of acoustic, LTAS, and EGG parameters was undertaken using Student paired t tests. The reduced metallic singing mode Curbing has an identifiable laryngeal gesture. Curbing has a more open setting than Overdrive and Edge, with high visibility of the vocal folds, and the false folds giving a rectangular appearance. LTAS showed statistically significant differences between Curbing and the full metallic modes, with less energy across all spectra, yielding a high second and a low third harmonic. Statistically significant differences were identified on Max Qx, Average Qx, Shimmer+, Shimmer-, Shimmer dB, normalized noise energy, cepstral peak prominence, harmonics-to-noise ratio, and mean sound pressure level (P ≤ 0.05). Curbing as a voice production strategy is statistically significantly different from Overdrive and Edge, and can be categorized based on audio perception. This study demonstrates consistently different laryngeal gestures between Curbing and Overdrive and Edge, with high corresponding differences in LTAS, EGG and acoustic measures. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  15. Nonelastic nuclear reactions and accompanying gamma radiation

    NASA Technical Reports Server (NTRS)

    Snow, R.; Rosner, H. R.; George, M. C.; Hayes, J. D.

    1971-01-01

    Several aspects of nonelastic nuclear reactions which proceed through the formation of a compound nucleus are dealt with. The full statistical model and the partial statistical model are described and computer programs based on these models are presented along with operating instructions and input and output for sample problems. A theoretical development of the expression for the reaction cross section for the hybrid case which involves a combination of the continuum aspects of the full statistical model with the discrete level aspects of the partial statistical model is presented. Cross sections for level excitation and gamma production by neutron inelastic scattering from the nuclei Al-27, Fe-56, Si-28, and Pb-208 are calculated and compared with available experimental data.

  16. The significance of intrauterine growth restriction is different from prematurity for the outcome of infants with gastroschisis.

    PubMed

    Puligandla, Pramod S; Janvier, Annie; Flageole, Hélène; Bouchard, Sarah; Mok, Elise; Laberge, Jean-Martin

    2004-08-01

    Recent reviews of gastroschisis identify prematurity and low birth weight as predictors of morbidity and mortality. The authors compared the outcomes of intrauterine growth-restricted (IUGR) infants with gastroschisis to those without growth restriction, because IUGR is different from prematurity. A retrospective analysis was performed for infants born with gastroschisis between 1990 and 2000 at 2 pediatric hospitals. Patients were segregated into 3 groups based on birth weight percentile corrected for gestational age: group 1 (IUGR), group 2, and group 3 (above the 25th percentile). Patient demographics, method of closure, number of surgeries, presence of atresia, and time to full enteral feedings (FPO days) were assessed. Mortality rate, length of stay (LOS), and readmission rates were also compared. Analysis of variance (ANOVA), Student's t test, and Fisher's Exact test were used for statistical analysis (P <.05 significant). Regression analysis was also performed. One hundred thirteen patients were included (group 1 = 17; group 2 = 43; group 3 = 53). Overall, infants with IUGR had similar outcomes to non-IUGR infants, including FPO and total parenteral nutrition (TPN) days, LOS, readmission, and mortality rates. The method of closure did not affect outcome. Infants with atresia had significantly increased times to full feeding (95 v 34 days; P =.034), more surgeries (2.7 v 1.4; P =.002), and longer LOS (106 v 48 days; P =.011). Infants born at less than 37 weeks' gestation had significantly increased fasting (NPO) days (28 v 18 days; P =.005) and longer LOS (65 v 37 days; P =.006) when compared with infants born at greater than 37 weeks. Logistic regression analysis identified the presence of atresia as an independent risk factor for gastrointestinal dysfunction and the need for prolonged TPN. Prematurity also adversely affected these same parameters, although it did not reach statistical significance. Although infants with gastroschisis are generally small for gestational age, the outcomes of growth-restricted infants are similar to those of other infants. The type of closure does not affect outcome, regardless of birth weight. The presence of atresia or prematurity does lead to longer times to full feeding and LOS. Therefore, routine premature delivery of infants with gastroschisis should not be advocated, even in the context of IUGR.

  17. Quantifying Variation in Gait Features from Wearable Inertial Sensors Using Mixed Effects Models

    PubMed Central

    Cresswell, Kellen Garrison; Shin, Yongyun; Chen, Shanshan

    2017-01-01

    The emerging technology of wearable inertial sensors has shown its advantages in collecting continuous longitudinal gait data outside laboratories. This freedom also presents challenges in collecting high-fidelity gait data. In the free-living environment, without constant supervision from researchers, sensor-based gait features are susceptible to variation from confounding factors such as gait speed and mounting uncertainty, which are challenging to control or estimate. This paper is one of the first attempts in the field to tackle such challenges using statistical modeling. By accepting the uncertainties and variation associated with wearable sensor-based gait data, we shift our efforts from detecting and correcting those variations to modeling them statistically. From gait data collected on one healthy, non-elderly subject during 48 full-factorial trials, we identified four major sources of variation, and quantified their impact on one gait outcome—range per cycle—using a random effects model and a fixed effects model. The methodology developed in this paper lays the groundwork for a statistical framework to account for sources of variation in wearable gait data, thus facilitating informative statistical inference for free-living gait analysis. PMID:28245602
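
    A sketch of the kind of random-effects model described, using statsmodels MixedLM with trial-level random intercepts and fixed effects for two assumed sources of variation (gait speed and a remounting indicator); the data frame is simulated.

    ```python
    # Mixed-effects sketch for one gait outcome: trial-level random intercepts,
    # fixed effects for speed and sensor remounting. All values are simulated.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(8)
    n_trials, n_cycles = 48, 30
    df = pd.DataFrame({
        "trial": np.repeat(np.arange(n_trials), n_cycles),
        "speed": np.repeat(rng.uniform(0.8, 1.6, n_trials), n_cycles),
        "remount": np.repeat(rng.integers(0, 2, n_trials), n_cycles),
    })
    trial_effect = np.repeat(rng.normal(0, 0.05, n_trials), n_cycles)
    df["range_per_cycle"] = (0.4 + 0.2 * df["speed"] + 0.03 * df["remount"]
                             + trial_effect + rng.normal(0, 0.02, len(df)))

    m = smf.mixedlm("range_per_cycle ~ speed + remount", df, groups=df["trial"]).fit()
    print(m.summary())
    ```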

  18. Statistical Characterization of the Mechanical Parameters of Intact Rock Under Triaxial Compression: An Experimental Proof of the Jinping Marble

    NASA Astrophysics Data System (ADS)

    Jiang, Quan; Zhong, Shan; Cui, Jie; Feng, Xia-Ting; Song, Leibo

    2016-12-01

    We investigated the statistical characteristics and probability distribution of the mechanical parameters of natural rock using triaxial compression tests. Twenty cores of Jinping marble were tested at each of five levels of confining stress (5, 10, 20, 30, and 40 MPa). From these full stress-strain data, we summarized the numerical characteristics and determined the probability distribution form of several important mechanical parameters, including deformational parameters, characteristic strength, characteristic strains, and failure angle. The statistical proofs relating to the mechanical parameters of rock presented new information about the marble's probabilistic distribution characteristics. The normal and log-normal distributions were appropriate for describing random strengths of rock; the coefficients of variation of the peak strengths had no relationship to the confining stress; the only acceptable random distribution for both Young's elastic modulus and Poisson's ratio was the log-normal function; and the cohesive strength had a different probability distribution pattern than the frictional angle. The triaxial tests and statistical analysis also provided experimental evidence for deciding the minimum reliable number of experimental samples and for picking appropriate parameter distributions to use in reliability calculations for rock engineering.
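
    The distributional checks described can be sketched by fitting normal and log-normal models to peak strengths at one confining stress and comparing them with a Kolmogorov-Smirnov test; the twenty strengths below are invented.

    ```python
    # Fit normal and log-normal models to peak strengths at one confining
    # stress and compare by KS test. The twenty strengths are placeholders.
    import numpy as np
    from scipy import stats

    peak_mpa = np.array([112, 118, 121, 109, 125, 117, 114, 120, 123, 111,
                         116, 119, 122, 108, 126, 115, 113, 124, 110, 118], float)

    mu, sigma = stats.norm.fit(peak_mpa)
    shape, loc, scale = stats.lognorm.fit(peak_mpa, floc=0.0)

    print("normal    KS:", stats.kstest(peak_mpa, "norm", args=(mu, sigma)))
    print("lognormal KS:", stats.kstest(peak_mpa, "lognorm", args=(shape, loc, scale)))
    print(f"coefficient of variation: {peak_mpa.std(ddof=1) / peak_mpa.mean():.3f}")
    ```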

  19. Automated Assessment of Child Vocalization Development Using LENA.

    PubMed

    Richards, Jeffrey A; Xu, Dongxin; Gilkerson, Jill; Yapanel, Umit; Gray, Sharmistha; Paul, Terrance

    2017-07-12

    To produce a novel, efficient measure of children's expressive vocal development on the basis of automatic vocalization assessment (AVA), child vocalizations were automatically identified and extracted from audio recordings using Language Environment Analysis (LENA) System technology. Assessment was based on full-day audio recordings collected in a child's unrestricted, natural language environment. AVA estimates were derived using automatic speech recognition modeling techniques to categorize and quantify the sounds in child vocalizations (e.g., protophones and phonemes). These were expressed as phone and biphone frequencies, reduced to principal components, and inputted to age-based multiple linear regression models to predict independently collected criterion-expressive language scores. From these models, we generated vocal development AVA estimates as age-standardized scores and development age estimates. AVA estimates demonstrated strong statistical reliability and validity when compared with standard criterion expressive language assessments. Automated analysis of child vocalizations extracted from full-day recordings in natural settings offers a novel and efficient means to assess children's expressive vocal development. More research remains to identify specific mechanisms of operation.
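
    The statistical core of the pipeline (phone-frequency features, principal components, age-based linear regression) can be sketched as follows, with simulated features and criterion scores in place of LENA data.

    ```python
    # Sketch of the AVA statistical core: phone/biphone frequencies reduced by
    # PCA, then linear regression onto a criterion expressive-language score.
    # Features and scores are simulated, not LENA data.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(9)
    n_children, n_phone_feats = 120, 60
    phone_freqs = rng.dirichlet(np.ones(n_phone_feats), size=n_children)  # per-recording frequencies
    criterion = 100 + 40 * phone_freqs[:, :5].sum(axis=1) + rng.normal(0, 3, n_children)

    model = make_pipeline(PCA(n_components=10), LinearRegression())
    model.fit(phone_freqs, criterion)
    print(f"in-sample R^2: {model.score(phone_freqs, criterion):.2f}")
    ```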

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ade, P. A. R.; Aghanim, N.; Akrami, Y.

    In this paper, we test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The “Cold Spot” is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Finally, where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.
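
    Simple one-point Gaussianity checks of the kind listed (skewness, kurtosis, variance) can be sketched on a stand-in pixel vector; a real analysis would of course use component-separated maps and simulations.

    ```python
    # One-point Gaussianity checks on a stand-in "map": a plain Gaussian pixel
    # vector instead of a real component-separated CMB map.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(10)
    pixels = rng.normal(0.0, 100e-6, 200_000)     # toy map, K units

    print("skewness test:", stats.skewtest(pixels))
    print("kurtosis test:", stats.kurtosistest(pixels))
    print(f"map variance: {pixels.var():.3e} K^2")
    ```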

  1. Statistical principle and methodology in the NISAN system.

    PubMed Central

    Asano, C

    1979-01-01

    The NISAN system is a new interactive statistical analysis program package constructed by an organization of Japanese statisticians. The package supports both confirmatory and exploratory analysis, and is designed to help senior statisticians exercise statistical judgment and choose an optimal process of statistical analysis. PMID:540594

  2. Applied statistical training to strengthen analysis and health research capacity in Rwanda.

    PubMed

    Thomson, Dana R; Semakula, Muhammed; Hirschhorn, Lisa R; Murray, Megan; Ndahindwa, Vedaste; Manzi, Anatole; Mukabutera, Assumpta; Karema, Corine; Condo, Jeanine; Hedt-Gauthier, Bethany

    2016-09-29

    To guide efficient investment of limited health resources in sub-Saharan Africa, local researchers need to be involved in, and guide, health system and policy research. While extensive survey and census data are available to health researchers and program officers in resource-limited countries, local involvement and leadership in research is limited due to inadequate experience, lack of dedicated research time and weak interagency connections, among other challenges. Many research-strengthening initiatives host prolonged fellowships out-of-country, yet their approaches have not been evaluated for effectiveness in involvement and development of local leadership in research. We developed, implemented and evaluated a multi-month, deliverable-driven, survey analysis training based in Rwanda to strengthen skills of five local research leaders, 15 statisticians, and a PhD candidate. Research leaders applied with a specific research question relevant to country challenges and committed to leading an analysis to publication. Statisticians with prerequisite statistical training and experience with a statistical software applied to participate in class-based trainings and complete an assigned analysis. Both statisticians and research leaders were provided ongoing in-country mentoring for analysis and manuscript writing. Participants reported a high level of skill, knowledge and collaborator development from class-based trainings and out-of-class mentorship that were sustained 1 year later. Five of six manuscripts were authored by multi-institution teams and submitted to international peer-reviewed scientific journals, and three-quarters of the participants mentored others in survey data analysis or conducted an additional survey analysis in the year following the training. Our model was effective in utilizing existing survey data and strengthening skills among full-time working professionals without disrupting ongoing work commitments, while using few resources. Critical to our success were a transparent, robust application process and time-limited training supplemented by ongoing, in-country mentoring toward manuscript deliverables that were led by Rwanda's health research leaders.

  3. MNE software for processing MEG and EEG data

    PubMed Central

    Gramfort, A.; Luessi, M.; Larson, E.; Engemann, D.; Strohmeier, D.; Brodbeck, C.; Parkkonen, L.; Hämäläinen, M.

    2013-01-01

    Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals originating from neural currents in the brain. Using these signals to characterize and locate brain activity is a challenging task, as evidenced by several decades of methodological contributions. MNE, whose name stems from its capability to compute cortically-constrained minimum-norm current estimates from M/EEG data, is a software package that provides comprehensive analysis tools and workflows including preprocessing, source estimation, time–frequency analysis, statistical analysis, and several methods to estimate functional connectivity between distributed brain regions. The present paper gives detailed information about the MNE package and describes typical use cases while also warning about potential caveats in analysis. The MNE package is a collaborative effort of multiple institutes striving to implement and share best methods and to facilitate distribution of analysis pipelines to advance reproducibility of research. Full documentation is available at http://martinos.org/mne. PMID:24161808
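
    A minimal sketch of the preprocessing-to-evoked part of the workflow, following the conventions of MNE-Python's bundled sample dataset (the paths and the event code for the left-auditory condition are those of that tutorial dataset):

    ```python
    # Minimal MNE-Python sketch: load the bundled sample recording, band-pass
    # filter, epoch around left-auditory events, and average to an evoked
    # response. Assumes the standard sample-dataset layout and event codes.
    import mne

    data_path = mne.datasets.sample.data_path()
    raw = mne.io.read_raw_fif(data_path / "MEG" / "sample" / "sample_audvis_raw.fif",
                              preload=True)
    raw.filter(1.0, 40.0)                      # band-pass the continuous data

    events = mne.find_events(raw, stim_channel="STI 014")
    epochs = mne.Epochs(raw, events, event_id={"auditory/left": 1},
                        tmin=-0.2, tmax=0.5, baseline=(None, 0))
    evoked = epochs.average()                  # evoked response for source estimation
    print(evoked)
    ```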

  4. Computational statistics using the Bayesian Inference Engine

    NASA Astrophysics Data System (ADS)

    Weinberg, Martin D.

    2013-09-01

    This paper introduces the Bayesian Inference Engine (BIE), a general parallel, optimized software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organize and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasizes hybrid tempered Markov chain Monte Carlo schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialization system that stores the full byte-level image of the running inference and previously characterized posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible object-oriented and easily extended framework that implements every aspect of the Bayesian inference. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical details and download details are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU General Public License.

  5. Beyond Necrotizing Enterocolitis: Other Clinical Advantages of an Exclusive Human Milk Diet.

    PubMed

    Hair, Amy B; Rechtman, David J; Lee, Martin L; Niklas, Victoria

    2018-06-07

    Articles previously published by Sullivan et al. and Cristofalo et al. were reanalyzed using the proportion of cow milk-based nutrition received to determine whether it affected clinical outcomes during hospitalization for infants with birth weights of 500-1250 g. Abrams et al. showed in the same cohort that incidences of necrotizing enterocolitis (NEC), NEC requiring surgery, and sepsis increased in proportion to the amount of dietary cow milk. The data from the two studies, conducted under essentially the same protocol, were combined, yielding a cohort of 260 infants receiving a diet ranging from 0% to 100% cow milk. Data analysis utilized negative binomial regression, which mitigates differences between subjects in terms of their time on study by incorporating that number into the statistical model. The percent of cow milk-based nutrition was the only predictor investigated. For all outcomes, the larger the amount of cow milk in the diet, the greater the number of days of that intervention required. A trend toward statistical significance was seen for ventilator days; however, only parenteral nutrition (PN) days and days to full feeds achieved statistical significance. Incorporation of any cow milk-based nutrition into the diet of extremely premature infants correlates with more days on PN and a longer time to achieve full feeds. There was a non-statistically significant trend toward increased ventilator days. These represent additional clinical consequences of the use of any cow milk-based protein in feeding extremely premature infants.
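
    The reanalysis model can be sketched with statsmodels: a negative binomial regression of a count outcome (e.g. PN days) on percent cow-milk nutrition, with log time-on-study as an offset to absorb unequal follow-up; the cohort below is simulated.

    ```python
    # Negative binomial regression sketch: PN days as an overdispersed count,
    # percent cow-milk diet as predictor, log(days on study) as offset.
    # All data are simulated, not the studies' cohort.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(11)
    n = 260
    pct_cow_milk = rng.uniform(0, 100, n)
    days_on_study = rng.integers(30, 120, n)

    mu = np.exp(0.5 + 0.01 * pct_cow_milk) * days_on_study / 60
    pn_days = rng.poisson(mu * rng.gamma(5.0, 0.2, n))    # gamma-Poisson overdispersion

    X = sm.add_constant(pct_cow_milk)
    fit = sm.GLM(pn_days, X, family=sm.families.NegativeBinomial(),
                 offset=np.log(days_on_study)).fit()
    print(fit.summary())
    ```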

  6. An absolute chronology for early Egypt using radiocarbon dating and Bayesian statistical modelling

    PubMed Central

    Dee, Michael; Wengrow, David; Shortland, Andrew; Stevenson, Alice; Brock, Fiona; Girdland Flink, Linus; Bronk Ramsey, Christopher

    2013-01-01

    The Egyptian state was formed prior to the existence of verifiable historical records. Conventional dates for its formation are based on the relative ordering of artefacts. This approach is no longer considered sufficient for cogent historical analysis. Here, we produce an absolute chronology for Early Egypt by combining radiocarbon and archaeological evidence within a Bayesian paradigm. Our data cover the full trajectory of Egyptian state formation and indicate that the process occurred more rapidly than previously thought. We provide a timeline for the First Dynasty of Egypt of generational-scale resolution that concurs with prevailing archaeological analysis and produce a chronometric date for the foundation of Egypt that distinguishes between historical estimates. PMID:24204188

  7. Discrete approach to stochastic parametrization and dimension reduction in nonlinear dynamics.

    PubMed

    Chorin, Alexandre J; Lu, Fei

    2015-08-11

    Many physical systems are described by nonlinear differential equations that are too complicated to solve in full. A natural way to proceed is to divide the variables into those that are of direct interest and those that are not, formulate solvable approximate equations for the variables of greater interest, and use data and statistical methods to account for the impact of the other variables. In the present paper we consider time-dependent problems and introduce a fully discrete solution method, which simplifies both the analysis of the data and the numerical algorithms. The resulting time series are identified by a NARMAX (nonlinear autoregressive moving average with exogenous input) representation familiar from engineering practice. The connections with the Mori-Zwanzig formalism of statistical physics are discussed, as well as an application to the Lorenz 96 system.
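    As an illustration of the identification step, the sketch below fits the deterministic (NARX) part of a NARMAX-style model to a synthetic scalar series by least squares on polynomial functions of lagged values; a full NARMAX fit would also estimate moving-average noise terms. The lag order, basis functions, and generating equation are all invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic series standing in for a resolved variable of a larger system.
T = 2000
x = np.zeros(T)
for t in range(T - 1):
    x[t + 1] = 0.8 * x[t] - 0.3 * x[t] ** 2 / (1 + x[t] ** 2) + 0.1 * rng.normal()

# NARX-style design matrix: polynomial functions of the last p values.
p = 2
rows = []
for t in range(p, T):
    l1, l2 = x[t - 1], x[t - 2]
    rows.append([1.0, l1, l2, l1 ** 2, l2 ** 2, l1 * l2])
Phi = np.array(rows)
y = x[p:]

coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print("one-step RMSE:", np.sqrt(np.mean((y - Phi @ coef) ** 2)).round(4))
```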

  8. Statistical Analysis of Research Data | Center for Cancer Research

    Cancer.gov

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview of the general principles of statistical analysis of research data. The first day will feature univariate data analysis, including descriptive statistics, probability distributions, and one- and two-sample inferential statistics.

  9. Study designs, use of statistical tests, and statistical analysis software choice in 2015: Results from two Pakistani monthly Medline indexed journals.

    PubMed

    Shaikh, Masood Ali

    2017-09-01

    Assessing research articles in terms of the study designs used, the statistical tests applied, and the choice of statistical analysis software helps determine a country's research activity profile and trends. In this descriptive study, all original articles published in 2015 by the Journal of Pakistan Medical Association (JPMA) and the Journal of the College of Physicians and Surgeons Pakistan (JCPSP) were reviewed in these terms. JPMA and JCPSP published 192 and 128 original articles, respectively, in 2015. The results indicate that the cross-sectional design, bivariate inferential analysis comparing two variables/groups, and the software package SPSS were, respectively, the most common study design, inferential analysis, and statistical analysis software. These results echo a previously published assessment of the same two journals for the year 2014.

  10. Herschel Shines Light on the Episodic Evolutionary Sequence of Protostars

    NASA Astrophysics Data System (ADS)

    Green, Joel D.; DIGIT; FOOSH; COPS Teams

    2014-01-01

    New far-infrared and submillimeter spectroscopic capabilities, along with moderate spatial and spectral resolution, provide the opportunity to study the diversity of shocks, accretion processes, and compositions of the envelopes of developing protostellar objects in nearby molecular clouds. We present the "COPS" (CO in Protostars) sample: a statistical analysis of the full sample of 30 Class 0/I protostars from the "DIGIT" Key project using Herschel-PACS/SPIRE 50-700 micron spectroscopy. We consider the sample as a whole in characteristic spectral lines, using a standardized data reduction procedure for all targets, and analyze the differences in the continuum and gas over the full sample, presenting an overview of trends. We compare the sources in evolutionary state, envelope mass, and gas properties to more evolved sources from the "FOOSH" (FUor) sample.

  11. Simulated cosmic microwave background maps at 0.5 deg resolution: Basic results

    NASA Technical Reports Server (NTRS)

    Hinshaw, G.; Bennett, C. L.; Kogut, A.

    1995-01-01

    We have simulated full-sky maps of the cosmic microwave background (CMB) anisotropy expected from cold dark matter (CDM) models at 0.5 deg and 1.0 deg angular resolution. Statistical properties of the maps are presented as a function of sky coverage, angular resolution, and instrument noise, and the implications of these results for observability of the Doppler peak are discussed. The rms fluctuations in a map are not a particularly robust probe of the existence of a Doppler peak; however, a full correlation analysis can provide reasonable sensitivity. We find that sensitivity to the Doppler peak depends primarily on the fraction of sky covered, and only secondarily on the angular resolution and noise level. Color plates of the simulated maps are presented to illustrate the anisotropies.

  12. Influence of hydroxypropyl methylcellulose on drug release pattern of a gastroretentive floating drug delivery system using a 3² full factorial design.

    PubMed

    Swain, Kalpana; Pattnaik, Satyanarayan; Mallick, Subrata; Chowdary, Korla Appana

    2009-01-01

    In the present investigation, a controlled-release gastroretentive floating drug delivery system for theophylline was developed employing response surface methodology. A 3² randomized full factorial design was used to study the effects of formulation variables, namely the viscosity grade and content of hydroxypropyl methylcellulose (HPMC), and their interactions, on the response variables. The floating lag time for all nine experimental batches was less than 2 min, with a flotation time of more than 12 h. Theophylline release from the polymeric matrix system followed non-Fickian anomalous transport. Multiple regression analysis revealed that both the viscosity and the content of HPMC had a statistically significant influence on all dependent variables, but the effect of these variables was found to be nonlinear above certain threshold values.
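    For readers unfamiliar with the design, the sketch below builds the nine runs of a 3² factorial in coded levels and fits the standard second-order model by least squares. The response values are hypothetical and do not come from the study above.

```python
import itertools
import numpy as np

# Coded levels (-1, 0, +1) for HPMC viscosity grade (X1) and HPMC content (X2);
# itertools.product enumerates all nine runs of the 3^2 design.
runs = np.array(list(itertools.product([-1, 0, 1], repeat=2)))
X1, X2 = runs[:, 0], runs[:, 1]

# Hypothetical response, e.g. % drug released at 8 h, one value per run.
y = np.array([82, 74, 69, 75, 66, 60, 70, 61, 54], dtype=float)

# Second-order model: y = b0 + b1*X1 + b2*X2 + b12*X1*X2 + b11*X1^2 + b22*X2^2
D = np.column_stack([np.ones(9), X1, X2, X1 * X2, X1 ** 2, X2 ** 2])
coef, *_ = np.linalg.lstsq(D, y, rcond=None)
for name, b in zip(["b0", "b1", "b2", "b12", "b11", "b22"], coef):
    print(f"{name} = {b:+.2f}")
```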

  13. Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling.

    PubMed

    Nord, Camilla L; Valton, Vincent; Wood, John; Roiser, Jonathan P

    2017-08-23

    Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents, so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture: on average, studies are indeed underpowered, some very seriously so, but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research.
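    The mixture-modeling step can be illustrated with a toy version: fit Gaussian mixtures with an increasing number of components to a distribution of per-study power estimates and choose among them by BIC. The data below are simulated and are not the Button et al. sample; scikit-learn is assumed.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Synthetic stand-in for per-study power estimates on (0, 1):
power = np.concatenate([rng.beta(1.2, 8, 400),   # many low-powered studies
                        rng.beta(6, 2, 330)])    # a well-powered subgroup
X = power.reshape(-1, 1)

# Select the number of subcomponents by BIC, then inspect the best fit.
bic = {k: GaussianMixture(k, n_init=5, random_state=0).fit(X).bic(X)
       for k in range(1, 5)}
best_k = min(bic, key=bic.get)
gm = GaussianMixture(best_k, n_init=5, random_state=0).fit(X)
print("components:", best_k)
print("means:", gm.means_.ravel().round(2), "weights:", gm.weights_.round(2))
```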

  14. Outcomes assessment of a residency program in laboratory medicine.

    PubMed

    Morse, E E; Pisciotto, P T; Hopfer, S M; Makowski, G; Ryan, R W; Aslanzadeh, J

    1997-01-01

    During a downsizing of residency programs at a state university medical school, hospital-based residents' positions were eliminated. The aims were to determine the characteristics of the residents who graduated from the laboratory medicine program, to compare women graduates with men graduates, and to compare international medical graduates (IMGs) with United States graduates. An assessment of the 25-year program, which had graduated 100 residents, showed no statistically significant difference by chi-square analysis in positions (laboratory director or staff), in certification (American Board of Pathology [and subspecialties], American Board of Medical Microbiology, American Board of Clinical Chemistry), or in academic appointments (assistant professor to full professor) when male graduates were compared with female graduates, or when graduates of American medical schools were compared with graduates of foreign medical schools. There were statistically significant associations by chi-square analysis between directorship positions and board certification, and between academic appointments and board certification. Of the 100 graduates, 57 were directors, 52 were certified, and 41 held academic appointments. Twenty-two graduates (11 women and 11 men) attained all three.
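    A minimal example of the chi-square tests of association reported above; the 2×2 counts are hypothetical, chosen only to be consistent with the marginal totals quoted (57 directors and 52 certified among 100 graduates).

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: board certification vs. directorship position.
#                  director  non-director
table = np.array([[40,       12],    # certified
                  [17,       31]])   # not certified

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
print("expected counts under independence:\n", expected.round(1))
```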

  15. PHOXTRACK-a tool for interpreting comprehensive datasets of post-translational modifications of proteins.

    PubMed

    Weidner, Christopher; Fischer, Cornelius; Sauer, Sascha

    2014-12-01

    We introduce PHOXTRACK (PHOsphosite-X-TRacing Analysis of Causal Kinases), a user-friendly, freely available software tool for analyzing large datasets of post-translational modifications of proteins, such as phosphorylation, commonly obtained by mass spectrometry. In contrast to other currently applied data analysis approaches, PHOXTRACK uses full sets of quantitative proteomics data and applies non-parametric statistics to assess whether defined kinase-specific sets of phosphosite sequences show statistically significant concordant differences between biological conditions. PHOXTRACK is an efficient tool for extracting post-translational information from comprehensive proteomics datasets to decipher key regulatory proteins and infer biologically relevant molecular pathways. PHOXTRACK will be maintained over the coming years and is freely available as an online tool for non-commercial use at http://phoxtrack.molgen.mpg.de. Users will also find a tutorial at this Web site and can give feedback at https://groups.google.com/d/forum/phoxtrack-discuss.

  16. Argonne National Laboratory Li-alloy/FeS cell testing and R and D programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gay, E.C.

    1982-01-01

    Groups of 12 or more identical Li-alloy/FeS cells fabricated by Eagle-Picher Industries, Inc. and Gould Inc. were operated at Argonne National Laboratory (ANL) in the status cell test program to obtain data for statistical analysis of cell cycle life and failure modes. The cells were full-size electric vehicle battery cells (150 to 350 Ah capacity) and they were cycled at the 4-h discharge rate and 8-h charge rate. The end of life was defined as a 20% loss of capacity or a decrease in the coulombic efficiency to less than 95%. Seventy-four cells (six groups of identical cells) were cycle-life tested and the results were analyzed statistically. The ultimate goal of this analysis was to predict cell and battery reliability. Testing of groups of identical cells also provided a means of identifying common failure modes, which were eliminated by cell design changes. Mean time to failure (MTTF) for the cells, based on the Weibull distribution, is presented.
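    A short sketch of the Weibull step: fit a two-parameter Weibull distribution to cycle-life data and compute the mean time (here, mean cycles) to failure as scale × Γ(1 + 1/shape). The data are simulated; the shape and scale values are illustrative, not results from the ANL program.

```python
from scipy.special import gamma as gamma_fn
from scipy.stats import weibull_min

# Simulated cycles-to-failure for a group of 74 identical cells.
cycles = weibull_min.rvs(c=2.3, scale=420, size=74, random_state=5)

# Two-parameter Weibull fit (location fixed at zero, as usual for lifetimes).
shape, loc, scale = weibull_min.fit(cycles, floc=0)

# Mean time to failure of a Weibull distribution: MTTF = scale * Gamma(1 + 1/shape)
mttf = scale * gamma_fn(1 + 1 / shape)
print(f"shape = {shape:.2f}, scale = {scale:.0f} cycles, MTTF = {mttf:.0f} cycles")
```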

  17. Conversion of Plastic Surgery meeting abstract presentations to full manuscripts: a Brazilian perspective.

    PubMed

    Denadai, Rafael; Pinho, André Silveira; Samartine, Hugo; Denadai, Rodrigo; Raposo-Amaral, Cassio Eduardo

    2017-01-01

    To assess the rate of conversion of Plastic Surgery meeting abstract presentations into full manuscript publications and to examine factors associated with this conversion, we assessed the abstracts presented at the 47th and 48th Brazilian Congresses of Plastic Surgery by cross-referencing with multiple databases, and analyzed the abstracts' characteristics associated with full manuscript publication. Of the 200 abstracts presented, 50 were subsequently published in full, giving the conferences a conversion rate of 25%. The mean time to publication was 15.00±13.75 months. In total, there were 4.93±1.63 authors per abstract and 67.8±163 subjects per abstract; 43.5% of the abstracts reported retrospective studies; 69% covered the plastic surgery topics head and neck, and chest and trunk; and 88.5% had no statistical analysis. Overall, 80% of the manuscripts were published in plastic surgery journals, 76% in journals with no impact factor, and 52% had no citations. Bivariate and multivariate analyses revealed the presence of statistical analysis to be the most significant (p<0.05) predictive factor for conversion of abstracts into full manuscripts. The conversion rate found in this bibliometric study was somewhat lower than the conversion rates reported for international Plastic Surgery meetings, and statistical analysis was a determinant of conversion success.

  18. The use of imputed sibling genotypes in sibship-based association analysis: on modeling alternatives, power and model misspecification.

    PubMed

    Minică, Camelia C; Dolan, Conor V; Hottenga, Jouke-Jan; Willemsen, Gonneke; Vink, Jacqueline M; Boomsma, Dorret I

    2013-05-01

    When phenotypic, but not genotypic, data are available for relatives of participants in genetic association studies, previous research has shown that family-based imputed genotypes can boost statistical power when included in such studies. Here, using simulations, we compared the performance of two statistical approaches suitable for modeling imputed genotype data: the mixture approach, which involves the full distribution of the imputed genotypes, and the dosage approach, in which the mean of the conditional distribution serves as the imputed genotype. Simulations were run varying sibship size, the size of the phenotypic correlations among siblings, imputation accuracy, and the minor allele frequency of the causal SNP. Furthermore, as imputing sibling data and extending the model to sibships of size two or greater requires modeling the familial covariance matrix, we asked whether model misspecification affects power. Finally, the simulation results were empirically verified in two datasets, one with a continuous phenotype (height) and one with a dichotomous phenotype (smoking initiation). Across the settings considered, the mixture and dosage approaches are equally powerful and both produce unbiased parameter estimates. In addition, the likelihood-ratio test in the linear mixed model appears robust to the considered misspecification in the background covariance structure, given low to moderate phenotypic correlations among siblings. Empirical results show that including imputed sibling genotypes in association analysis does not always yield a larger test statistic: with small effect sizes, the test statistic may actually drop in value, because when the power benefit is small, the change in the distribution of the test statistic under the alternative is also small, and the probability of obtaining a smaller test statistic is appreciable. As genetic effects are typically hypothesized to be small, in practice the decision on whether family-based imputation should be used as a means to increase power should be informed by prior power calculations and by consideration of the background correlation.

  19. Duration on unemployment: geographic mobility and selectivity bias.

    PubMed

    Goss, E P; Paul, C; Wilhite, A

    1994-01-01

    Estimates of the factors affecting the duration of unemployment were found to be sensitive to the inclusion of migration factors. Traditional models that did not control for migration were found to underestimate movers' probability of finding an acceptable job. The empirical test of the theory, based on data on US household heads unemployed in 1982 and employed in 1982 and 1983, found that the cumulative probability of reemployment after 30 weeks of searching was .422 in the traditional model and .624 in the migration selectivity model. In addition, controlling for selectivity eliminated the significance of the relationship between race and job search duration. The relationship between search duration and the 1982 county unemployment rate became statistically significant, and the relationship between search duration and 1980 population per square mile in the 1982 county of residence became statistically insignificant. The finding that non-Whites have a longer duration of unemployment can be better understood in terms of non-Whites' lower geographic mobility and more limited job contacts. The statistically significant finding that a high unemployment rate in the home labor market reduces the probability of finding employment was more in keeping with expectations. The findings assumed that the duration of unemployment accurately reflected the length of job search. The sample was redrawn to exclude discouraged workers and the analysis repeated; the findings were similar to those for the full sample, with the coefficient for the migration variable remaining negative and statistically significant and the coefficient for alpha remaining positive and statistically significant. Race in the selectivity model remained statistically insignificant. The findings supported the Schwartz model, which hypothesizes that expanding the radius of the search reduces the duration of unemployment; excluding the migration factor misspecifies the equation for unemployment duration. Policy should be directed to the problems of geographic mobility, particularly among non-Whites.

  20. Galaxy-M: a Galaxy workflow for processing and analyzing direct infusion and liquid chromatography mass spectrometry-based metabolomics data.

    PubMed

    Davidson, Robert L; Weber, Ralf J M; Liu, Haoyu; Sharma-Oates, Archana; Viant, Mark R

    2016-01-01

    Metabolomics is increasingly recognized as an invaluable tool in the biological, medical and environmental sciences, yet it lags behind other omics fields in methodological maturity. To achieve its full potential, including the integration of multiple omics modalities, the accessibility, standardization and reproducibility of computational metabolomics tools must be improved significantly. Here we present our end-to-end mass spectrometry metabolomics workflow in the widely used platform, Galaxy. Named Galaxy-M, our workflow has been developed for both direct infusion mass spectrometry (DIMS) and liquid chromatography mass spectrometry (LC-MS) metabolomics. The tools presented span the full analysis, from processing of raw data, e.g. peak picking and alignment, through data cleansing, e.g. missing value imputation, to preparation for statistical analysis, e.g. normalization and scaling, and principal components analysis (PCA) with associated statistical evaluation. We demonstrate the ease of using these Galaxy workflows via the analysis of DIMS and LC-MS datasets, and provide PCA scores and associated statistics to help other users verify that they can accurately repeat the processing and analysis of these two datasets. Galaxy and data are all provided pre-installed in a virtual machine (VM) that can be downloaded from the GigaDB repository. Additionally, source code, executables and installation instructions are available from GitHub. The Galaxy platform has enabled us to produce an easily accessible and reproducible computational metabolomics workflow; more tools could be added by the community to expand its functionality. We recommend that Galaxy-M workflow files be included within the supplementary information of publications, enabling metabolomics studies to achieve greater reproducibility.
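    The final statistical stage (scaling followed by PCA) can be sketched in a few lines. This is a generic illustration, not Galaxy-M's actual code; the peak-intensity matrix is simulated and scikit-learn is assumed.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
# Hypothetical matrix: 40 samples x 200 metabolite features, already
# peak-picked, aligned and missing-value imputed upstream.
X = rng.lognormal(mean=2.0, sigma=0.6, size=(40, 200))

# Log-transform, then autoscale (mean-centre, unit variance) each feature.
Xs = StandardScaler().fit_transform(np.log(X))

pca = PCA(n_components=5)
scores = pca.fit_transform(Xs)
print("explained variance ratios:", pca.explained_variance_ratio_.round(3))
```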

  1. Assessing population exposure for landslide risk analysis using dasymetric cartography

    NASA Astrophysics Data System (ADS)

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.

    2015-04-01

    Exposed population is a major factor that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on counts of inhabitants or on population density over statistical or administrative terrain units, such as NUTS or parishes. However, this kind of approach may skew the results, underestimating the importance of population, mainly in territorial units dominated by rural land use. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the Census-based location of the exposed population within each territorial unit. These drawbacks are problematic when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that can enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. This work therefore aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and on dasymetric cartography (population per building). The study is developed in the Region North of Lisbon using 2011 population data and follows three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of the population distribution (absolute and density) for different administrative territorial units (parishes and BGRI, the basic statistical unit in the Portuguese Census); and iii) dasymetric population mapping based on building areal weighting. Preliminary results show that in sparsely populated administrative units, the estimated population density differs by more than a factor of two depending on whether the traditional approach or dasymetric cartography is applied. This work was supported by the FCT - Portuguese Foundation for Science and Technology.
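    Building areal weighting, the dasymetric step described above, reduces to a proportional redistribution once building footprints are known. A minimal sketch with invented numbers:

```python
import numpy as np

# Hypothetical census unit with 1,200 inhabitants and three residential
# buildings whose footprint areas (m^2) come from ancillary mapping data.
unit_population = 1200
building_areas = np.array([450.0, 900.0, 1650.0])

# Redistribute the unit total in proportion to building area.
building_population = unit_population * building_areas / building_areas.sum()
print(building_population)   # -> [180. 360. 660.]
```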

  2. Notes for Brazil sampling frame evaluation trip

    NASA Technical Reports Server (NTRS)

    Horvath, R. (Principal Investigator); Hicks, D. R. (Compiler)

    1981-01-01

    Field notes describing a trip conducted in Brazil are presented. The trip was conducted to evaluate a sample frame developed from LANDSAT full-frame images by the USDA Economics and Statistics Service, for the eventual purpose of cropland production estimation with LANDSAT by the Foreign Commodity Production Forecasting Project of the AgRISTARS program. Six areas were analyzed on the basis of land use, cropland in corn and soybeans, field size, and soil type. The analysis indicated generally successful use of LANDSAT images for remote, large-area land use stratification.

  3. LEP Events, TLE's, and Q-bursts observed from the Antarctic

    NASA Astrophysics Data System (ADS)

    Moore, R. C.; Kim, D.; Flint, Q. A.

    2017-12-01

    ELF/VLF measurements at Palmer Station, McMurdo Station, and South Pole Station, Antarctica are used to detect lightning-generated ELF/VLF radio atmospherics from around the globe and to remote sense ionospheric disturbances in the Southern hemisphere. The Antarctic ELF/VLF receivers complement a Northern hemisphere ELF/VLF monitoring array. In this paper, we present our latest observational results, including a full statistical analysis of conjugate observations of lightning-induced electron precipitation and radio atmospherics associated specifically with the transient luminous events known as gigantic jets and sprites.

  4. Prevalence of suicidal ideation and suicide attempts in the general population of China: A meta-Analysis

    PubMed Central

    CAO, XIAO-LAN; ZHONG, BAO-LIANG; XIANG, YU-TAO; UNGVARI, GABOR S.; LAI, KELLY Y. C.; CHIU, HELEN F. K.; CAINE, ERIC D.

    2015-01-01

    Objective The objective of this meta-analysis is to estimate the pooled prevalence of suicidal ideation and suicide attempts in the general population of Mainland China. Methods A systematic literature search was conducted via the following databases: PubMed, PsycINFO, MEDLINE, China Journals Full-Text Databases, Chongqing VIP database for Chinese Technical Periodicals and Wan Fang Data. Statistical analyses used the Comprehensive Meta-Analysis program. Results Eight studies met the inclusion criteria for the analysis; five reported the prevalence of suicidal ideation and seven that of suicide attempts. The estimated lifetime prevalences of suicidal ideation and suicide attempts were 3.9% (95% confidence interval [CI]: 2.5%-6.0%) and 0.8% (95% CI: 0.7%-0.9%), respectively. The estimated female-to-male ratios for the lifetime prevalence of suicidal ideation and suicide attempts were 1.7 and 2.2, respectively; only the gender difference in suicide attempts was statistically significant. Conclusion This was the first meta-analysis of the prevalence of suicidal ideation and suicide attempts in the general population of Mainland China. The pooled lifetime prevalences of both suicidal ideation and suicide attempts are relatively low; however, caution is required when assessing these self-report data. Women had a modestly higher prevalence of suicide attempts than men. The frequencies of suicidal ideation and suicide attempts in urban regions were similar to those in rural areas. PMID:26060259
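    For illustration, the sketch below pools study-level prevalences on the logit scale with a DerSimonian-Laird random-effects model, the generic approach behind programs of this kind. The study counts are invented and do not reproduce the figures above.

```python
import numpy as np

# Hypothetical per-study data: cases of suicidal ideation / sample size.
events = np.array([210, 95, 330, 150, 60])
n = np.array([5200, 2600, 8800, 4100, 1500])

p = events / n
logit = np.log(p / (1 - p))
var = 1 / events + 1 / (n - events)      # approximate variance of the logit

# DerSimonian-Laird between-study variance tau^2 from Cochran's Q.
w = 1 / var
q = np.sum(w * (logit - np.sum(w * logit) / w.sum()) ** 2)
c = w.sum() - np.sum(w ** 2) / w.sum()
tau2 = max(0.0, (q - (len(p) - 1)) / c)

# Random-effects pooled logit, back-transformed to a prevalence with 95% CI.
w_re = 1 / (var + tau2)
pooled = np.sum(w_re * logit) / w_re.sum()
se = np.sqrt(1 / w_re.sum())
expit = lambda x: 1 / (1 + np.exp(-x))
print(f"pooled prevalence {expit(pooled):.2%} "
      f"(95% CI {expit(pooled - 1.96 * se):.2%}-{expit(pooled + 1.96 * se):.2%})")
```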

  5. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    ERIC Educational Resources Information Center

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined the Statistical Package for the Social Sciences (SPSS) and the Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs: mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  6. The Effects of the Recession on Child Poverty: Poverty Statistics for 2008 and Growth in Need during 2009

    ERIC Educational Resources Information Center

    Isaacs, Julia B.

    2009-01-01

    Nearly one in five children under age 18 lived in poor families in 2008, according to poverty statistics released by the Census Bureau in September 2009. Though high, this statistic does not capture the full impact of the economic downturn, which is expected to drive poverty even higher in 2009. However, updated poverty statistics will not be…

  7. Statistical atlas based extrapolation of CT data

    NASA Astrophysics Data System (ADS)

    Chintalapani, Gouthami; Murphy, Ryan; Armiger, Robert S.; Lepisto, Jyri; Otake, Yoshito; Sugano, Nobuhiko; Taylor, Russell H.; Armand, Mehran

    2010-02-01

    We present a framework to estimate the missing anatomical details from a partial CT scan with the help of statistical shape models. The motivating application is periacetabular osteotomy (PAO), a technique for treating developmental hip dysplasia, an abnormal condition of the hip socket that, if untreated, may lead to osteoarthritis. The common goals of PAO are to reduce pain and joint subluxation and to improve the contact pressure distribution by increasing the coverage of the femoral head by the hip socket. While current diagnosis and planning are based on radiological measurements, because of significant structural variations in dysplastic hips, computer-assisted geometrical and biomechanical planning based on CT data is desirable to help the surgeon achieve optimal joint realignment. Most patients undergoing PAO are young females, so it is usually desirable to minimize the radiation dose by scanning only the joint portion of the hip anatomy. These partial scans, however, do not provide enough information for biomechanical analysis because the iliac region is missing. A statistical shape model of the full pelvis anatomy is constructed from a database of CT scans. The partial volume is first aligned with the statistical atlas using an iterative affine registration, followed by a deformable registration step, and the missing information is then inferred from the atlas. The atlas inferences are further enhanced by the use of X-ray images of the patient, which are very common in an osteotomy procedure. The proposed method is validated with a leave-one-out analysis. Osteotomy cuts are simulated and the effect of the atlas-predicted models on the actual procedure is evaluated.
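    The inference step (fit shape-model coefficients to the observed partial anatomy, then read the missing part off the completed model instance) can be sketched with a toy principal-component shape model. All dimensions, mode counts and data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy training set: 60 "shapes", each a vector of 30 landmark coordinates
# driven by 3 latent modes plus noise (a stand-in for the pelvis atlas).
n_train, d, k = 60, 30, 3
basis_true = rng.normal(size=(k, d))
shapes = rng.normal(size=(n_train, k)) @ basis_true \
         + 0.05 * rng.normal(size=(n_train, d))

# Statistical shape model: mean shape plus principal modes of variation.
mean = shapes.mean(axis=0)
_, _, Vt = np.linalg.svd(shapes - mean, full_matrices=False)
modes = Vt[:k]

# A new "partial scan": only the first 18 of 30 coordinates are observed.
full_truth = rng.normal(size=k) @ basis_true
obs = np.arange(18)

# Least-squares fit of mode coefficients to the observed part,
# then reconstruction of the full shape from the model.
coeffs, *_ = np.linalg.lstsq(modes[:, obs].T,
                             full_truth[obs] - mean[obs], rcond=None)
completed = mean + coeffs @ modes
print("mean abs. error on the unseen coordinates:",
      np.abs(completed[18:] - full_truth[18:]).mean().round(3))
```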

  8. Interpreting the gamma statistic in phylogenetic diversification rate studies: a rate decrease does not necessarily indicate an early burst.

    PubMed

    Fordyce, James A

    2010-07-23

    Phylogenetic hypotheses are increasingly being used to elucidate historical patterns of diversification-rate variation. Hypothesis testing is often conducted by comparing the observed vector of branching times to a null, pure-birth expectation. A popular method for inferring a decrease in speciation rate, which might suggest an early burst of diversification followed by a decline in diversification rate, is the gamma statistic. Using simulations under varying conditions, I examine the sensitivity of gamma to the distribution of the most recent branching times. Using tree deviation, an exploratory data analysis tool for lineage-through-time plots, I identified trees with a significant gamma statistic that do not appear to have the characteristic early accumulation of lineages consistent with an early, rapid rate of cladogenesis. I further investigated the sensitivity of the gamma statistic to recent diversification by examining the consequences of failing to simulate the full time interval following the most recent cladogenic event. The power of gamma to detect a rate decrease at varying times was assessed for simulated trees with an initially high rate of diversification followed by a relatively low rate. The gamma statistic is extraordinarily sensitive to recent diversification rates and does not necessarily detect early bursts of diversification. This was true for trees of various sizes and completeness of taxon sampling. The gamma statistic had greater power to detect recent decreases in diversification rate than early bursts of diversification. Caution should be exercised when interpreting the gamma statistic as an indication of early, rapid diversification.
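    The gamma statistic itself is straightforward to compute from the internode intervals of an ultrametric tree. A sketch following the Pybus-Harvey formula, with an invented interval vector mimicking an early burst:

```python
import numpy as np

def gamma_statistic(g):
    """Pybus & Harvey (2000) gamma statistic.

    g : internode intervals of an ultrametric tree; g[k-2] is the time
        during which exactly k lineages existed, for k = 2..n (n tips).
    Under a constant-rate pure-birth process gamma ~ N(0, 1);
    strongly negative values suggest a slowdown toward the present.
    """
    g = np.asarray(g, dtype=float)
    n = len(g) + 1                     # number of tips
    k = np.arange(2, n + 1)
    T = np.sum(k * g)                  # total branch length of the tree
    partial = np.cumsum(k * g)[:-1]    # partial sums T_i for i = 2..n-1
    return (partial.mean() - T / 2) / (T * np.sqrt(1.0 / (12 * (n - 2))))

# Short internodes early and long internodes late mimic an early burst
# followed by a slowdown, giving gamma < 0.
early_burst = [0.05, 0.06, 0.08, 0.12, 0.2, 0.35, 0.6, 1.0, 1.6]
print(round(gamma_statistic(early_burst), 2))
```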

  9. Association between retinal nerve fiber layer thickness and magnetic resonance imaging findings and intelligence in patients with multiple sclerosis.

    PubMed

    Ashtari, Fereshteh; Emami, Parisa; Akbari, Mojtaba

    2015-01-01

    Multiple sclerosis (MS) is a neurological disease in which demyelination and axonal loss lead to progressive disability, and cognitive impairment is among its most common complications. Axonal loss in the retina is a new marker for MS. The main goals of our study were to search for correlations between magnetic resonance imaging (MRI) findings, the retinal nerve fiber layer (RNFL) thickness at the macula and the optic nerve head, and Wechsler Adult Intelligence Scale-Revised (WAIS-R) scores, which assess multiple domains of intelligence, and to explore the relationship between changes in RNFL thickness and intellectual and cognitive dysfunction. A prospective cross-sectional study was conducted at the University Hospital of Kashani, Isfahan, Iran, from September to December 2013. All patients were assessed with the full-scale intelligence quotient (IQ) of the WAIS-R. An optical coherence tomography study and brain MRI were performed in the same week for all patients. Statistical analysis was conducted using bivariate correlation in SPSS 20.0, with P ≤ 0.05 as the threshold of statistical significance. Examination of 100 patients showed a significant correlation between the average RNFL thickness of the macula and both verbal IQ (P = 0.01) and full IQ (P = 0.01). There was also a significant correlation between brain atrophy and verbal IQ. RNFL loss was correlated with verbal IQ and full IQ.

  10. Performance evaluation of 388 full-scale waste stabilization pond systems with seven different configurations.

    PubMed

    Espinosa, Maria Fernanda; von Sperling, Marcos; Verbyla, Matthew E

    2017-02-01

    Waste stabilization ponds (WSPs) and their variants are one of the most widely used wastewater treatment systems in the world. However, the scarcity of systematic performance data from full-scale plants has led to challenges in their design. The objective of this research was to assess the performance of 388 full-scale WSP systems located in Brazil, Ecuador, Bolivia and the United States through statistical analysis of available monitoring data. Descriptive statistics were calculated for the influent and effluent concentrations and the removal efficiencies of 5-day biochemical oxygen demand (BOD5), total suspended solids (TSS), ammonia nitrogen (N-Ammonia), and either thermotolerant coliforms (TTC) or Escherichia coli for each WSP system, leading to a broad characterization of actual treatment performance. Compliance with different water quality and system performance goals was also evaluated. The treatment plants were subdivided into seven categories according to their units and flowsheet. The median influent concentrations of BOD5 and TSS were 431 mg/L and 397 mg/L; effluent concentrations varied from technology to technology, but median values were 50 mg/L and 47 mg/L, respectively. The median removal efficiencies were 85% for BOD5 and 75% for TSS. The overall removals of TTC and E. coli were 1.74 and 1.63 log10 units, respectively. Future research is needed to better understand the influence of design, operational and environmental factors on WSP system performance.

  11. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    PubMed

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

    To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, so that analysis results are available in real time. Descriptive statistics, time-series analysis and multivariate regression analysis were implemented online on top of the database software using SQL and visualization tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each part of the data, with interactive connections to the database; and generates export tables that can be loaded directly into R, SAS and SPSS. The information system for air pollution and health impact monitoring thus implements statistical analysis online and can provide real-time analysis results to its users.

  12. Predicting Slag Generation in Sub-Scale Test Motors Using a Neural Network

    NASA Technical Reports Server (NTRS)

    Wiesenberg, Brent

    1999-01-01

    Generation of slag (aluminum oxide) is an important issue for the Reusable Solid Rocket Motor (RSRM). Thiokol quantified the relationship between raw material variations and slag generation in solid propellants by testing sub-scale motors cast with propellant containing various combinations of aluminum fuel and ammonium perchlorate (AP) oxidizer particle sizes. The test data were analyzed using statistical methods and an artificial neural network. This paper primarily addresses the neural network results, with some comparisons to the statistical results. The neural network showed that the particle sizes of both the aluminum and the unground AP have a measurable effect on slag generation, with aluminum particle size the dominant driver, about 40% more influential than AP. The network predictions of the amount of slag produced during firing of sub-scale motors were 16% better than the predictions of a statistically derived empirical equation. Another neural network successfully characterized the slag generated during full-scale motor tests. This success is attributable to the ability of neural networks to characterize multiple complex factors, including interactions, that affect slag generation.

  13. High efficiency family shuffling based on multi-step PCR and in vivo DNA recombination in yeast: statistical and functional analysis of a combinatorial library between human cytochrome P450 1A1 and 1A2.

    PubMed

    Abécassis, V; Pompon, D; Truan, G

    2000-10-15

    The design of a family shuffling strategy (CLERY: Combinatorial Libraries Enhanced by Recombination in Yeast), combining PCR-based and in vivo recombination with expression in yeast, is described. The strategy was tested using human cytochrome P450 CYP1A1 and CYP1A2 as templates, which share 74% nucleotide sequence identity. Construction of highly shuffled libraries of mosaic structures and reduction of parental gene contamination were the two major goals. Library characterization involved multiprobe hybridization on DNA macro-arrays. Statistical analysis of randomly selected clones revealed a high proportion of chimeric genes (86%) and a homogeneous representation of the parental contributions among the sequences (55.8 +/- 2.5% for parental sequence 1A2). A microtiter plate screening system was designed to achieve colorimetric detection of polycyclic hydrocarbon hydroxylation by transformed yeast cells. The full sequences of five randomly picked and five functionally selected clones were analyzed. The results confirmed the shuffling efficiency and allowed calculation of the average length of sequence exchange and the mutation rate. The efficient and statistically representative generation of mosaic structures by this type of family shuffling in a yeast expression system constitutes a novel and promising tool for structure-function studies and for tuning the enzymatic activities of multicomponent eukaryotic complexes involving insoluble enzymes.

  14. Thirty Meter Telescope Site Testing V: Seeing and Isoplanatic Angle

    NASA Astrophysics Data System (ADS)

    Skidmore, Warren; Els, Sebastian; Travouillon, Tony; Riddle, Reed; Schöck, Matthias; Bustos, Edison; Seguel, Juan; Walker, David

    2009-10-01

    In this article we present an analysis of the statistical and temporal properties of seeing and isoplanatic angle measurements obtained with combined Differential Image Motion Monitor (DIMM) and Multi-Aperture Scintillation Sensor (MASS) units at the Thirty Meter Telescope (TMT) candidate sites. For each of the five candidate sites we obtained multiyear, high-cadence, high-quality seeing measurements. These data allow for a broad and detailed analysis, giving us a good understanding of the characteristics of each of the sites. The overall seeing statistics for the five candidate sites are presented, broken into total seeing (measured by the DIMM), free-atmosphere seeing and isoplanatic angle (measured by the MASS), and ground-layer seeing (difference between the total and free-atmosphere seeing). We examine the statistical distributions of seeing measurements and investigate annual and nightly behavior. The properties of the seeing measurements are discussed in terms of the geography and meteorological conditions at each site. The temporal variability of the seeing measurements over timescales of minutes to hours is derived for each site. We find that each of the TMT candidate sites has its own strengths and weaknesses when compared against the other candidate sites. The results presented in this article form part of the full set of results that are used for the TMT site-selection process. This is the fifth article in a series discussing the TMT site-testing project.

  15. Ontology-based content analysis of US patent applications from 2001-2010.

    PubMed

    Weber, Lutz; Böhme, Timo; Irmer, Matthias

    2013-01-01

    Ontology-based semantic text analysis methods allow knowledge relationships and data to be extracted automatically from text documents. In this review, we have applied these technologies to the systematic analysis of pharmaceutical patents. Hierarchical concepts from the knowledge domains of chemical compounds, diseases and proteins were used to annotate full-text US patent applications that deal with pharmacological activities of chemical compounds and were filed in the years 2001-2010. Compounds claimed in these applications were classified into their respective compound classes to review the distribution of scaffold types or general compound classes, such as natural products, in a time-dependent manner. Similarly, the target proteins and the claimed utility of the compounds were classified and the most relevant extracted. The method presented allows the discovery of the main areas of innovation as well as emerging fields of patenting activity, providing a broad statistical basis for competitor analysis and decision-making efforts.

  16. esATAC: An Easy-to-use Systematic pipeline for ATAC-seq data analysis.

    PubMed

    Wei, Zheng; Zhang, Wei; Fang, Huan; Li, Yanda; Wang, Xiaowo

    2018-03-07

    ATAC-seq is rapidly emerging as one of the major experimental approaches to probe chromatin accessibility genome-wide. Here, we present "esATAC", a highly integrated, easy-to-use R/Bioconductor package for systematic ATAC-seq data analysis. It covers all essential steps of the full analysis procedure, including raw data processing, quality control and downstream statistical analysis such as peak calling, enrichment analysis and transcription factor footprinting. esATAC supports one-command execution of preset pipelines and provides flexible interfaces for building customized pipelines. The esATAC package is open source under the GPL-3.0 license and is implemented in R and C++. Source code and binaries for Linux, Mac OS X and Windows are available through Bioconductor (https://www.bioconductor.org/packages/release/bioc/html/esATAC.html). Contact: xwwang@tsinghua.edu.cn. Supplementary data are available at Bioinformatics online.

  17. A retrospective study of a modified 1-minute formocresol pulpotomy technique part 1: clinical and radiographic findings.

    PubMed

    Kurji, Zahra A; Sigal, Michael J; Andrews, Paul; Titley, Keith

    2011-01-01

    The purpose of this study was to assess the clinical and radiographic outcomes of a 1-minute application of full-strength Buckley's formocresol, with concurrent hemostasis using the medicated cotton pledget, in human primary teeth. Using a retrospective chart review, clinical and radiographic data were available for 557 primary molars in 320 patients. Descriptive statistics and survival analysis were used to assess outcomes. Overall clinical success, radiographic success, and cumulative 5-year survival rates were approximately 99%, 90%, and 87%, respectively. Internal root resorption (∼5%) and pulp canal obliteration (∼2%) were the most frequently observed radiographic failures. Thirty-nine teeth were extracted due to clinical and/or radiographic failure. Mandibular molars were 6 times more prone to radiographic failure than maxillary molars. Success rates for the modified technique are comparable to those of techniques using the 5-minute diluted or full-strength solutions reported in the literature. This 1-minute full-strength formocresol technique is an acceptable alternative to published traditional techniques.

  18. Passive detection of vehicle loading

    NASA Astrophysics Data System (ADS)

    McKay, Troy R.; Salvaggio, Carl; Faulring, Jason W.; Salvaggio, Philip S.; McKeown, Donald M.; Garrett, Alfred J.; Coleman, David H.; Koffman, Larry D.

    2012-01-01

    The Digital Imaging and Remote Sensing Laboratory (DIRS) at the Rochester Institute of Technology, along with the Savannah River National Laboratory, is investigating passive methods to quantify vehicle loading. The research described in this paper investigates multiple vehicle indicators, including brake temperature, tire temperature, engine temperature, acceleration and deceleration rates, engine acoustics, suspension response, tire deformation and vibrational response. Our investigation into these variables includes building and implementing a sensing system for data collection as well as multiple full-scale vehicle tests. The sensing system includes infrared video cameras, triaxial accelerometers, microphones, video cameras and thermocouples. The full-scale testing includes both a medium-size dump truck and a tractor-trailer truck on closed courses, with loads spanning the full range of each vehicle's capacity. Statistical analysis of the collected data is used to determine the effectiveness of each of the indicators for characterizing the weight of a vehicle. The final sensing system will monitor multiple load indicators and combine the results to achieve a more accurate measurement than any of the indicators could provide alone.

  19. (FEDSTATS)

    EPA Science Inventory

    Federal Statistics (FedStats) offers the full range of official statistical information available to the public from the Federal Government. It uses the Internet's powerful linking and searching capabilities to track economic and population trends, education, health care costs, a...

  20. Neurotoxicological and statistical analyses of a mixture of five organophosphorus pesticides using a ray design.

    PubMed

    Moser, V C; Casey, M; Hamm, A; Carter, W H; Simmons, J E; Gennings, C

    2005-07-01

    Environmental exposures generally involve chemical mixtures rather than single chemicals. Statistical designs such as the fixed-ratio ray, wherein the mixing ratio (proportions) of the chemicals is fixed across increasing mixture doses, allow for the detection and characterization of interactions among the chemicals. In this study, we tested for interactions in a mixture of five organophosphorus (OP) pesticides (chlorpyrifos, diazinon, dimethoate, acephate, and malathion). The ratio of the five pesticides (full ray) reflected the relative dietary exposure estimates of the general population as projected by the US EPA Dietary Exposure Evaluation Model (DEEM). A second mixture was tested using the same dose levels of all pesticides but excluding malathion (reduced ray). The experimental approach first required characterization of dose-response curves for the individual OPs to build a dose-additivity model. A series of behavioral measures were evaluated in adult male Long-Evans rats at the time of peak effect following a single oral dose, and tissues were then collected for measurement of cholinesterase (ChE) activity. Neurochemical (blood and brain ChE activity) and behavioral (motor activity, gait score, tail-pinch response score) endpoints were evaluated statistically for evidence of additivity. The additivity model constructed from the single-chemical data was used to predict the effects of the pesticide mixture along the full ray (10-450 mg/kg) and the reduced ray (1.75-78.8 mg/kg). The experimental mixture data were also modeled and statistically compared to the additivity models. Analysis of the 5-OP mixture (the full ray) revealed significant deviation from additivity for all endpoints except tail-pinch response. Greater-than-additive responses (synergism) were observed at the lower doses of the 5-OP mixture, which contained non-effective dose levels of each of the components. The estimated effective doses (ED20, ED50) were about half those predicted by additivity, and for brain ChE and motor activity there was a threshold shift in the dose-response curves. For brain ChE and motor activity, there was no difference between the full (5-OP) and reduced (4-OP) rays, indicating that malathion did not influence the non-additivity. While the reduced ray for blood ChE showed greater deviation from additivity without malathion in the mixture, the non-additivity observed for the gait score was reversed when malathion was removed. Thus, greater-than-additive interactions were detected for both the full and reduced ray mixtures, and the role of malathion in the interactions varied depending on the endpoint. In all cases, the deviations from additivity occurred at the lower end of the dose-response curves.
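    The dose-additivity benchmark at the heart of the ray design can be sketched in a few lines: under Loewe additivity, the mixture dose D producing a given effect satisfies sum_i(f_i * D / ED_i) = 1, where f_i are the fixed mixing fractions. The ED50s and fractions below are illustrative, not the study's values.

```python
import numpy as np

# Hypothetical single-chemical ED50s (mg/kg) for a shared endpoint,
# one per OP pesticide (illustrative values only).
ed50 = np.array([60.0, 45.0, 150.0, 90.0, 500.0])

# Fixed mixing ratio along the ray (proportions of the total mixture dose).
fractions = np.array([0.10, 0.05, 0.15, 0.10, 0.60])
assert np.isclose(fractions.sum(), 1.0)

# Loewe additivity: sum_i f_i * D / ED50_i = 1  =>  D = 1 / sum_i (f_i / ED50_i)
d_mix = 1.0 / np.sum(fractions / ed50)
print(f"additivity-predicted mixture ED50: {d_mix:.1f} mg/kg")
# Synergism, as reported above, appears as an observed ED50 well below this.
```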

  1. Lungworm Infections in German Dairy Cattle Herds — Seroprevalence and GIS-Supported Risk Factor Analysis

    PubMed Central

    Schunn, Anne-Marie; Conraths, Franz J.; Staubach, Christoph; Fröhlich, Andreas; Forbes, Andrew; Strube, Christina

    2013-01-01

    In November 2008, a total of 19,910 bulk tank milk (BTM) samples were obtained from dairy farms all over Germany, corresponding to about 20% of all German dairy herds, and analysed for antibodies against the bovine lungworm Dictyocaulus viviparus by use of the recombinant MSP-ELISA. A total of 3,397 BTM samples (17.1%; n = 19,910) tested seropositive. The prevalence of seropositive herds in individual German federal states varied between 0.0% and 31.2%. A geospatial map was drawn to show the distribution of seropositive and seronegative herds per postal code area. ELISA results were further analysed for associations with land-use and climate data. Bivariate statistical analysis was used to identify potential spatial risk factors for dictyocaulosis. Statistically significant positive associations were found between lungworm-seropositive herds and the proportions of water bodies and grassed area per postal code area. Variables that showed a statistically significant association with a positive BTM test were included in a logistic regression model, which was further refined by controlled stepwise selection of variables. The low pseudo-R2 values (0.08 for the full model and 0.06 for the final model) and further evaluation of the model by ROC analysis indicate that additional, unrecorded factors (e.g. management factors) or random effects may substantially contribute to lungworm infections in dairy cows. Veterinarians should include lungworms in the differential diagnosis of respiratory disease in dairy cattle, particularly those at pasture. Monitoring of herds through BTM screening for antibodies can help farmers and veterinarians plan and implement appropriate control measures. PMID:24040243
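    A compact sketch of the modeling step described above: logistic regression of herd serostatus on land-use covariates, summarized by ROC analysis. The data and coefficients are simulated (deliberately weak, echoing the low pseudo-R² reported); only the structure mirrors the study, and scikit-learn is assumed.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(8)
n = 2000
water = rng.uniform(0, 10, n)    # % water bodies per postal code area (synthetic)
grass = rng.uniform(0, 80, n)    # % grassed area per postal code area (synthetic)

# Simulate a weak spatial signal in herd serostatus.
logit = -1.8 + 0.05 * water + 0.01 * grass
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([water, grass])
model = LogisticRegression().fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print("coefficients:", model.coef_.round(3), " AUC:", round(auc, 3))
```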

  2. Statistical analysis of hail characteristics in the hail-protected western part of Croatia using data from hail suppression stations

    NASA Astrophysics Data System (ADS)

    Počakal, Damir; Štalec, Janez

    In the continental part of Croatia, operational hail suppression has been conducted for more than 30 years. The currently protected area is 25,177 km² and contains about 492 hail suppression stations managed from eight weather radar centres. This paper presents a statistical analysis of parameters connected with hail occurrence at hail suppression stations in the western part of the protected area in the 1981-2000 period. The analysis compares data from two periods with different intensities of hail suppression activity and was made as part of a project for assessing hail suppression efficiency in Croatia. Because of the disruption of the hail suppression system during the independence war in Croatia (1991-1995), a lack of rockets and other circumstances, the system is considered not to have operated properly in the 1991-2000 period. The first period (1981-1990), characterized by full application of hail suppression technology, is therefore compared with the second period (1991-2000). The protected area is divided into quadrants (9×9 km) such that every quadrant has at least one hail suppression station and intercomparison is more precise. Discriminant analysis was performed on the yearly values for each quadrant. These values included the number of cases with solid precipitation, hail damage, heavy hail damage, the number of active hail suppression stations, the number of days with solid precipitation, solid precipitation damage, heavy solid precipitation damage, and the number and duration of air traffic control bans. The discriminant analysis shows a significant difference between the two periods: the average values on discriminant function 1 are -0.36 standard deviations of all observations for the first period (1981-1990) and +0.23 for the second period (1991-2000). The analysis of all eight variables shows statistically substantial differences in the number of hail suppression stations (which has a positive correlation) and in the number of cases with air traffic control bans, which, like all other variables, has a negative correlation. The results of the statistical analysis of the two periods indicate a positive influence of the hail suppression system. A discriminant analysis for three periods shows that those periods cannot be compared because of the short time spans, differences in hail suppression technology and working conditions, and possible differences in meteorological conditions. Therefore, neither the effectiveness nor the ineffectiveness of hail suppression operations, nor their efficiency, can be statistically proven. For an exact assessment of hail suppression effectiveness, it is necessary to develop a dedicated project, such as a hailpad polygon, which would take into consideration all the parameters used in previous such projects around the world.

  3. Statistical Compression for Climate Model Output

    NASA Astrophysics Data System (ADS)

    Hammerling, D.; Guinness, J.; Soh, Y. J.

    2017-12-01

    Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and thus it is important to develop methods for representing full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. We decompress the data by computing conditional expectations and conditional simulations from the model given the summary statistics. Conditional expectations represent our best estimate of the original data but are subject to oversmoothing in space and time. Conditional simulations introduce realistic small-scale noise so that the decompressed fields are neither too smooth nor too rough compared with the original data. Considerable attention is paid to accurately modeling the original dataset (one year of daily mean temperature data), particularly with regard to the inherent spatial nonstationarity in global fields, and to determining the statistics to be stored, so that the variation in the original data can be closely captured while allowing for fast decompression and conditional emulation on modest computers.
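    The compression/decompression cycle can be made concrete on a toy one-dimensional field: store block means as the summary statistics, then decompress via the conditional mean (the smooth best estimate) and a conditional simulation (which restores realistic roughness). The exponential covariance model below is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(9)

# "Original" field: 100 correlated values on a line (toy stand-in for a grid).
n = 100
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
cov = np.exp(-dist / 10.0)                       # assumed covariance model
field = np.linalg.cholesky(cov + 1e-8 * np.eye(n)) @ rng.normal(size=n)

# Compression: keep only 10 block means (the summary statistics).
A = np.kron(np.eye(10), np.ones((1, 10)) / 10)   # block-averaging operator
summary = A @ field

# Decompression: conditional distribution of the field given the summaries.
K = cov @ A.T @ np.linalg.inv(A @ cov @ A.T)     # kriging-style gain
cond_mean = K @ summary                          # oversmoothed best estimate
cond_cov = cov - K @ A @ cov

# Conditional simulation adds back small-scale variability.
L = np.linalg.cholesky(cond_cov + 1e-8 * np.eye(n))
emulation = cond_mean + L @ rng.normal(size=n)
print("block means preserved:", np.allclose(A @ cond_mean, summary))
```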

  4. A Handbook of Sound and Vibration Parameters

    DTIC Science & Technology

    1978-09-18

    fixed in space. (Reference 1.) no motion at a node Static Divergence: (See Divergence.) Statistical Energy Analysis (SEA): Statistical energy analysis is...parameters of the circuits come from statistics of the vibrational characteristics of the structure. Statistical energy analysis is uniquely successful

  5. General specifications for the development of a USL NASA PC R and D statistical analysis support package

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros

    1984-01-01

    The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.

  6. Statistical assessment of the learning curves of health technologies.

    PubMed

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) To identify systematically 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers, such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis; and generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another.
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
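
    The multilevel modelling approach highlighted above can be sketched concretely: a mixed model of operation time against case number with surgeon-specific random intercepts and slopes. The data frame and column names (op_time, case, surgeon) below are simulated stand-ins, not the study's data.

    ```python
    # Sketch of a multilevel learning-curve analysis: operation time
    # regressed on case number, with a random intercept and slope per
    # surgeon. All data are simulated for illustration.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    rows = []
    for surgeon in range(10):
        base = rng.normal(120, 15)        # surgeon-specific starting time
        rate = rng.normal(-0.5, 0.2)      # surgeon-specific learning rate
        for case in range(1, 51):
            rows.append({"surgeon": surgeon, "case": case,
                         "op_time": base + rate * case + rng.normal(0, 10)})
    df = pd.DataFrame(rows)

    # Random intercept and random slope on case number, grouped by surgeon.
    model = smf.mixedlm("op_time ~ case", df, groups=df["surgeon"],
                        re_formula="~case")
    result = model.fit()
    print(result.summary())
    ```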

  7. Statistical Tutorial | Center for Cancer Research

    Cancer.gov

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data.  ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018.  The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean differences, simple and multiple linear regression, ANOVA tests, and Chi-Squared distribution.

  8. Investigation of Particle Sampling Bias in the Shear Flow Field Downstream of a Backward Facing Step

    NASA Technical Reports Server (NTRS)

    Meyers, James F.; Kjelgaard, Scott O.; Hepner, Timothy E.

    1990-01-01

    The flow field about a backward facing step was investigated to determine the characteristics of particle sampling bias in the various flow phenomena. The investigation used the calculation of the velocity:data rate correlation coefficient as a measure of statistical dependence and thus the degree of velocity bias. While the investigation found negligible dependence within the free stream region, increased dependence was found within the boundary and shear layers. Full classic correction techniques over-compensated the data, since the dependence was weak even in the boundary layer and shear regions. The paper emphasizes the necessity of determining the degree of particle sampling bias for each measurement ensemble rather than using generalized assumptions to correct the data. Further, it recommends that the calculation of the velocity:data rate correlation coefficient become a standard statistical calculation in the analysis of all laser velocimeter data.
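
    The recommended check is a simple calculation. Below is a sketch of the velocity:data-rate correlation coefficient for one measurement ensemble, with a velocity-biased sample simulated by construction; the bias mechanism and numbers are illustrative assumptions.

    ```python
    # Sketch of the velocity:data-rate correlation check: estimate the
    # instantaneous data rate from particle interarrival times and
    # correlate it with the measured velocities. A nonzero correlation
    # indicates velocity-biased sampling. All data are simulated.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 5000
    velocity = rng.normal(50.0, 5.0, n)        # m/s, one ensemble

    # Simulate arrival times whose local rate increases with velocity,
    # i.e. a velocity-biased sample (an assumption for illustration).
    interarrival = rng.exponential(1.0 / (velocity / 50.0))
    rate = 1.0 / interarrival                  # instantaneous data rate

    r = np.corrcoef(velocity, rate)[0, 1]
    print(f"velocity:data-rate correlation coefficient = {r:.3f}")
    # |r| near 0 suggests negligible bias (free stream); larger |r|
    # suggests dependence, as found in the boundary and shear layers.
    ```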

  9. Coronal emission-line polarization from the statistical equilibrium of magnetic sublevels. II. Fe XIV 5303 A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    House, L.L.; Querfeld, C.W.; Rees, D.E.

    1982-04-15

    Coronal magnetic fields influence the intensity and linear polarization of light scattered by coronal Fe XIV ions. To interpret polarization measurements of Fe XIV 5303 A coronal emission requires a detailed understanding of the dependence of the emitted Stokes vector on coronal magnetic field direction, electron density, and temperature, and on height of origin. The required dependence is included in the solutions of statistical equilibrium for the ion, which are solved explicitly for 34 magnetic sublevels in both the ground and four excited terms. The full solutions are reduced to equivalent simple analytic forms which clearly show the required dependence on coronal conditions. The analytic forms of the reduced solutions are suitable for routine analysis of 5303 green line polarimetric data obtained at Pic du Midi and from the Solar Maximum Mission Coronagraph/Polarimeter.

  10. Index Blood Tests and National Early Warning Scores within 24 Hours of Emergency Admission Can Predict the Risk of In-Hospital Mortality: A Model Development and Validation Study

    PubMed Central

    Mohammed, Mohammed A.; Rudge, Gavin; Watson, Duncan; Wood, Gordon; Smith, Gary B.; Prytherch, David R.; Girling, Alan; Stevens, Andrew

    2013-01-01

    Background We explored the use of routine blood tests and national early warning scores (NEWS) reported within ±24 hours of admission to predict in-hospital mortality in emergency admissions, using empirical decision Tree models because they are intuitive and may ultimately be used to support clinical decision making. Methodology A retrospective analysis of adult emergency admissions to a large acute hospital during April 2009 to March 2010 in the West Midlands, England, with a full set of index blood tests results (albumin, creatinine, haemoglobin, potassium, sodium, urea, white cell count and an index NEWS undertaken within ±24 hours of admission). We developed a Tree model by randomly splitting the admissions into a training (50%) and validation dataset (50%) and assessed its accuracy using the concordance (c-) statistic. Emergency admissions (about 30%) did not have a full set of index blood tests and/or NEWS and so were not included in our analysis. Results There were 23248 emergency admissions with a full set of blood tests and NEWS with an in-hospital mortality of 5.69%. The Tree model identified age, NEWS, albumin, sodium, white cell count and urea as significant (p<0.001) predictors of death, which described 17 homogeneous subgroups of admissions with mortality ranging from 0.2% to 60%. The c-statistic for the training model was 0.864 (95%CI 0.852 to 0.87) and when applied to the testing data set this was 0.853 (95%CI 0.840 to 0.866). Conclusions An easy to interpret validated risk adjustment Tree model using blood test and NEWS taken within ±24 hours of admission provides good discrimination and offers a novel approach to risk adjustment which may potentially support clinical decision making. Given the nature of the clinical data, the results are likely to be generalisable but further research is required to investigate this promising approach. PMID:23734195
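
    The modelling workflow described above (50/50 split, classification tree, c-statistic) can be sketched as follows. The admissions data, predictor columns, and coefficients are all synthetic stand-ins, not the study's data.

    ```python
    # Sketch of the Tree-model workflow: a 50/50 train/validation split, a
    # classification tree on index blood tests plus NEWS, and the
    # c-statistic (area under the ROC curve) as the accuracy measure.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(4)
    n = 20000
    X = np.column_stack([
        rng.normal(65, 18, n),    # age
        rng.poisson(3, n),        # NEWS
        rng.normal(38, 6, n),     # albumin
        rng.normal(138, 4, n),    # sodium
        rng.normal(9, 4, n),      # white cell count
        rng.normal(7, 4, n),      # urea
    ])
    logit = -6 + 0.03 * X[:, 0] + 0.4 * X[:, 1] - 0.05 * X[:, 2]
    y = rng.random(n) < 1 / (1 + np.exp(-logit))   # in-hospital death

    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.5,
                                              random_state=0)
    tree = DecisionTreeClassifier(min_samples_leaf=200).fit(X_tr, y_tr)

    c_stat = roc_auc_score(y_va, tree.predict_proba(X_va)[:, 1])
    print(f"validation c-statistic = {c_stat:.3f}")
    ```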

  11. Unbiased metabolite profiling by liquid chromatography-quadrupole time-of-flight mass spectrometry and multivariate data analysis for herbal authentication: classification of seven Lonicera species flower buds.

    PubMed

    Gao, Wen; Yang, Hua; Qi, Lian-Wen; Liu, E-Hu; Ren, Mei-Ting; Yan, Yu-Ting; Chen, Jun; Li, Ping

    2012-07-06

    Plant-based medicines have become increasingly popular around the world. Authentication of herbal raw materials is important to ensure their safety and efficacy. Some herbs belonging to closely related species but differing in medicinal properties are difficult to identify because of similar morphological and microscopic characteristics. Chromatographic fingerprinting is an alternative method to distinguish them. Existing approaches do not allow a comprehensive analysis for herbal authentication. We have now developed a strategy consisting of (1) full metabolic profiling of herbal medicines by rapid resolution liquid chromatography (RRLC) combined with quadrupole time-of-flight mass spectrometry (QTOF MS), (2) global analysis of non-targeted compounds by a molecular feature extraction algorithm, (3) multivariate statistical analysis for classification and prediction, and (4) characterization of marker compounds. This approach has provided a fast and unbiased comparative multivariate analysis of the metabolite composition of 33 batches of samples covering seven Lonicera species. Individual metabolic profiles are obtained at the level of molecular fragments without prior structural assignment. In the entire set, the obtained classifier for the flower buds of the seven Lonicera species showed good prediction performance, and a total of 82 statistically different components were rapidly obtained by the strategy. The elemental compositions of discriminative metabolites were characterized by accurate mass measurement of the pseudomolecular ions, and their chemical types were assigned from the MS/MS spectra. The high-resolution, comprehensive and unbiased strategy for metabolite data analysis presented here is powerful and opens a new direction for authentication in herbal analysis. Copyright © 2012 Elsevier B.V. All rights reserved.
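
    Step (3), the multivariate classification of non-targeted features, can be sketched as below. The feature matrix, species labels, and the PCA-plus-discriminant pipeline are illustrative assumptions; the paper's own multivariate method may differ.

    ```python
    # Sketch of multivariate classification of LC-MS feature intensities:
    # scale the feature matrix, reduce it by PCA, classify with linear
    # discriminant analysis, and cross-validate. Data are simulated.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)
    n_batches, n_features, n_species = 33, 500, 7
    species = np.arange(n_batches) % n_species     # balanced toy labels
    X = rng.normal(size=(n_batches, n_features))
    X[:, :40] += species[:, None] * 0.8            # species-dependent shift

    clf = make_pipeline(StandardScaler(), PCA(n_components=10),
                        LinearDiscriminantAnalysis())
    acc = cross_val_score(clf, X, species, cv=3)
    print("cross-validated accuracy:", acc.mean().round(2))
    ```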

  12. More Results from the Opera Experiment at the Gran Sasso Underground Lab

    NASA Astrophysics Data System (ADS)

    Kamiscioglu, Mustafa

    The OPERA experiment reached its main goal by proving the appearance of ντ in the CNGS νμ beam. Five ντ candidates fulfilling the analysis criteria defined in the proposal were detected with an S/B ratio of about ten, allowing rejection of the null hypothesis at 5.1σ. The search has been extended by loosening the selection criteria in order to obtain a statistically enhanced, lower purity signal sample. One such interesting neutrino interaction with a double vertex topology, having a high probability of being a ντ interaction with charm production, is reported. Based on the enlarged data sample, the estimation of Δm²32 in appearance mode is presented. The search for νe interactions has been extended over the full data set, with a more than twofold increase in statistics with respect to published data. The analysis of the νμ → νe channel is updated, and the implications of the electron neutrino sample in the framework of the 3+1 neutrino model are discussed. An analysis of νμ → ντ interactions in the framework of the sterile neutrino model has also been performed. Finally, the results of the study of charged hadron multiplicity distributions are presented.

  13. Sulcal depth-based cortical shape analysis in normal healthy control and schizophrenia groups

    NASA Astrophysics Data System (ADS)

    Lyu, Ilwoo; Kang, Hakmook; Woodward, Neil D.; Landman, Bennett A.

    2018-03-01

    Sulcal depth is an important marker of brain anatomy in neuroscience and neurological function. Previously, sulcal depth has been explored at the region-of-interest (ROI) level to increase statistical sensitivity to group differences. In this paper, we present a fully automated method that enables inferences of ROI properties from a sulcal-region-focused perspective, consisting of two main components: 1) sulcal depth computation and 2) sulcal curve-based refined ROIs. In conventional statistical analysis, average sulcal depth measurements are employed in several ROIs of the cortical surface. However, taking the average sulcal depth over the full ROI blurs the overall sulcal depth measurements, which may reduce sensitivity to sulcal depth changes in neurological and psychiatric disorders. To overcome this blurring effect, we focus on sulcal fundic regions in each ROI by filtering out gyral regions. Consequently, the proposed method is more sensitive to group differences than a traditional ROI approach. In the experiment, we focused on a cortical morphological analysis of sulcal depth reduction in schizophrenia, with a comparison to a normal healthy control group. We show that the proposed method is more sensitive to abnormalities of sulcal depth in schizophrenia; sulcal depth is significantly smaller in most cortical lobes in schizophrenia compared to healthy controls (p < 0.05).

  14. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    NASA Astrophysics Data System (ADS)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5, based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of the research was to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on reducing benign biopsies. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on masses with lower levels of suspicion, rather than to increase the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cysts and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) was developed by Almen Laboratories and was used to achieve the results.

  15. Quantitative Methods for Analysing Joint Questionnaire Data: Exploring the Role of Joint in Force Design

    DTIC Science & Technology

    2015-08-01

    the nine questions. The Statistical Package for the Social Sciences (SPSS) [11] was used to conduct statistical analysis on the sample. Two types...constructs. SPSS was again used to conduct statistical analysis on the sample. This time factor analysis was conducted. Factor analysis attempts to...Business Research Methods and Statistics using SPSS. P432. 11 IBM SPSS Statistics. (2012) 12 Burns, R.B., Burns, R.A. (2008) 'Business Research

  16. Statistics and Informatics in Space Astrophysics

    NASA Astrophysics Data System (ADS)

    Feigelson, E.

    2017-12-01

    The interest in statistical and computational methodology has seen rapid growth in space-based astrophysics, parallel to the growth seen in Earth remote sensing. There is widespread agreement that scientific interpretation of the cosmic microwave background, discovery of exoplanets, and classification of multiwavelength surveys are too complex to be accomplished with traditional techniques. NASA operates several well-functioning Science Archive Research Centers providing 0.5 PBy datasets to the research community. These databases are integrated with full-text journal articles in the NASA Astrophysics Data System (200K pageviews/day). Data products use interoperable formats and protocols established by the International Virtual Observatory Alliance. NASA supercomputers also support complex astrophysical models of systems such as accretion disks and planet formation. Academic researcher interest in methodology has grown significantly in areas such as Bayesian inference and machine learning, and statistical research is underway to treat problems such as irregularly spaced time series and astrophysical model uncertainties. Several scholarly societies have created interest groups in astrostatistics and astroinformatics. Improvements are needed on several fronts. Community education in advanced methodology is not sufficiently rapid to meet the research needs. Statistical procedures within NASA science analysis software are sometimes not optimal, and pipeline development may not use modern software engineering techniques. NASA offers few grant opportunities supporting research in astroinformatics and astrostatistics.

  17. Spectral analysis of time series of categorical variables in earth sciences

    NASA Astrophysics Data System (ADS)

    Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.; Dorador, Javier

    2016-10-01

    Time series of categorical variables often appear in Earth Science disciplines and there is considerable interest in studying their cyclic behavior. This is true, for example, when the type of facies, petrofabric features, ichnofabrics, fossil assemblages or mineral compositions are measured continuously over a core or throughout a stratigraphic succession. Here we deal with the problem of applying spectral analysis to such sequences. A full indicator approach is proposed to complement the spectral envelope often used in other disciplines. Additionally, a stand-alone computer program is provided for calculating the spectral envelope, in this case implementing the permutation test to assess the statistical significance of the spectral peaks. We studied simulated sequences as well as real data in order to illustrate the methodology.
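
    The full indicator approach with a permutation test can be sketched as follows. The facies sequence is simulated, and testing the maximum periodogram ordinate is one simple choice of test statistic; the paper's stand-alone program may differ in detail.

    ```python
    # Sketch of the indicator approach: each category becomes a 0/1
    # indicator series, its periodogram is computed, and spectral peaks
    # are assessed against spectra of randomly permuted sequences.
    import numpy as np
    from scipy.signal import periodogram

    rng = np.random.default_rng(6)
    n = 512
    # Simulated facies sequence with a period-16 cyclic component.
    cycle = (np.sin(2 * np.pi * np.arange(n) / 16) > 0.3).astype(int)
    facies = np.where(rng.random(n) < 0.8, cycle, rng.integers(0, 3, n))

    indicator = (facies == 1).astype(float)    # indicator of one category
    freqs, pxx = periodogram(indicator - indicator.mean())

    # Permutation test: null spectra from shuffled sequences.
    n_perm = 999
    null_max = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(indicator)
        null_max[i] = periodogram(perm - perm.mean())[1].max()

    peak = pxx.max()
    p_value = (1 + np.sum(null_max >= peak)) / (n_perm + 1)
    print(f"dominant frequency = {freqs[pxx.argmax()]:.4f} cycles/sample, "
          f"p = {p_value:.3f}")
    ```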

  18. A Statistical Analysis of Corona Topography: New Insights into Corona Formation and Evolution

    NASA Technical Reports Server (NTRS)

    Stofan, E. R.; Glaze, L. S.; Smrekar, S. E.; Baloga, S. M.

    2003-01-01

    Extensive mapping of the surface of Venus and continued analysis of Magellan data have allowed a more comprehensive survey of coronae to be conducted. Our updated corona database contains 514 features, an increase from the 326 coronae of the previous survey. We include a new set of 106 Type 2 or stealth coronae, which have a topographic rather than a fracture annulus. The large increase in the number of coronae over the 1992 survey results from several factors, including the use of the full Magellan data set and the addition of features identified as part of the systematic geologic mapping of Venus. Parameters of the population that we have analyzed to date include size and topography.

  19. [The future of forensic DNA analysis for criminal justice].

    PubMed

    Laurent, François-Xavier; Vibrac, Geoffrey; Rubio, Aurélien; Thévenot, Marie-Thérèse; Pène, Laurent

    2017-11-01

    In the criminal justice framework, the analysis of approximately 20 DNA microsatellites enables the establishment of a genetic profile with a high statistical power of discrimination. This technique makes it possible to establish or exclude a match between a biological trace detected at a crime scene and a suspect whose DNA was collected via an oral swab. However, conventional techniques tend to complicate the interpretation of complex DNA samples, such as degraded DNA and DNA mixtures. The aim of this review is to highlight the power of new forensic DNA methods (including high-throughput sequencing and single-cell sequencing) to facilitate the interpretation of the expert, in full compliance with existing French legislation. © 2017 médecine/sciences – Inserm.

  20. Image-analysis library

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The MATHPAC image-analysis library is a collection of general-purpose mathematical and statistical routines and special-purpose data-analysis and pattern-recognition routines for image analysis. The MATHPAC library consists of Linear Algebra, Optimization, Statistical-Summary, Densities and Distribution, Regression, and Statistical-Test packages.

  1. Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.

    PubMed

    Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V

    2018-04-01

    A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.

  2. Effects of preparation relief and flow channels on seating full coverage castings during cementation.

    PubMed

    Webb, E L; Murray, H V; Holland, G A; Taylor, D F

    1983-06-01

    Machined steel dies were used to study the effects of three die modifications on seating full coverage castings during cementation. The die modifications consisted of occlusal channels, occlusal surface relief, and axial channels. Fourteen specimens having one or more forms of die modification were compared with two control specimens having no die modifications. Statistical analysis of the data revealed that the addition of four axial channels to the simulated preparation on the steel die produced a significant reduction in the mean marginal discrepancy during cementation. Occlusal modifications alone failed to produce significant reductions in marginal discrepancies when compared with the control specimens. Occlusal modifications in conjunction with axial channels failed to produce further significant reductions in marginal discrepancies when compared with those reductions observed in specimens having only axial channels.

  3. Complexity and dynamics of topological and community structure in complex networks

    NASA Astrophysics Data System (ADS)

    Berec, Vesna

    2017-07-01

    Complexity is highly susceptible to variations in network dynamics, reflected in its underlying architecture, where the topological organization of cohesive subsets into clusters, the system's modular structure and the resulting hierarchical patterns are cross-linked with the functional dynamics of the system. Here we study the connection between hierarchical topological scales of simplicial complexes and the organization of functional clusters (communities) in complex networks. The analysis reveals the full dynamics of different combinatorial structures of q-dimensional simplicial complexes and their Laplacian spectra, presenting spectral properties of the resulting symmetric and positive semidefinite matrices. The emergence of the system's collective behavior from an inhomogeneous statistical distribution is induced by the hierarchically ordered topological structure, which is mapped to a simplicial complex where local interactions between the nodes clustered into subcomplexes generate a flow of information that characterizes the complexity and dynamics of the full system.
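
    The spectrum-community connection can be illustrated in a deliberately simplified setting: the ordinary graph Laplacian rather than the higher-order simplicial Laplacians studied in the paper. The planted-partition graph and its parameters are illustrative assumptions.

    ```python
    # Simplified illustration: for a network with well-separated clusters,
    # the number of near-zero graph-Laplacian eigenvalues reflects the
    # number of communities (the paper works with simplicial, i.e.
    # higher-order, Laplacians; this sketch uses the graph case only).
    import numpy as np
    import networkx as nx

    # Planted-partition graph: 3 dense clusters, sparse links between them.
    G = nx.planted_partition_graph(3, 30, p_in=0.3, p_out=0.01, seed=7)

    L = nx.laplacian_matrix(G).toarray().astype(float)
    eigvals = np.sort(np.linalg.eigvalsh(L))

    print("smallest Laplacian eigenvalues:", np.round(eigvals[:5], 3))
    # One exact zero per connected component; additional near-zero
    # eigenvalues signal the (here, three) planted communities.
    ```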

  4. Decreased Surgical Site Infection Rate in Hysterectomy: Effect of a Gynecology-Specific Bundle.

    PubMed

    Andiman, Sarah E; Xu, Xiao; Boyce, John M; Ludwig, Elizabeth M; Rillstone, Heidi R W; Desai, Vrunda B; Fan, Linda L

    2018-06-01

    We implemented a hysterectomy-specific surgical site infection prevention bundle after a higher-than-expected surgical site infection rate was identified at our institution. We evaluate how this bundle affected the surgical site infection rate, length of hospital stay, and 30-day postoperative readmission rate. This is a quality improvement study featuring retrospective analysis of a prospectively implemented, multidisciplinary team-designed surgical site infection prevention bundle that consisted of chlorhexidine-impregnated preoperative wipes, standardized aseptic surgical preparation, standardized antibiotic dosing, perioperative normothermia, surgical dressing maintenance, and direct feedback to clinicians when the protocol was breached. There were 2,099 hysterectomies completed during the 33-month study period. There were 61 surgical site infections (4.51%) in the pre-full bundle implementation period and 14 (1.87%) in the post-full bundle implementation period; we found a sustained reduction in the proportion of patients experiencing surgical site infection during the last 8 months of the study period. After adjusting for clinical characteristics, patients who underwent surgery after full implementation were less likely to develop a surgical site infection (adjusted odds ratio [OR] 0.46, P=.01) than those undergoing surgery before full implementation. Multivariable regression analysis showed no statistically significant difference in postoperative days of hospital stay (adjusted mean ratio 0.95, P=.09) or rate of readmission for surgical site infection-specific indication (adjusted OR 2.65, P=.08) between the before and after full-bundle implementation periods. The multidisciplinary implementation of a gynecologic perioperative surgical site infection prevention bundle was associated with a significant reduction in surgical site infection rate in patients undergoing hysterectomy.

  5. Handling incomplete correlated continuous and binary outcomes in meta-analysis of individual participant data.

    PubMed

    Gomes, Manuel; Hatfield, Laura; Normand, Sharon-Lise

    2016-09-20

    Meta-analysis of individual participant data (IPD) is increasingly utilised to improve the estimation of treatment effects, particularly among different participant subgroups. An important concern in IPD meta-analysis relates to partially or completely missing outcomes for some studies, a problem exacerbated when interest is on multiple discrete and continuous outcomes. When leveraging information from incomplete correlated outcomes across studies, the fully observed outcomes may provide important information about the incompleteness of the other outcomes. In this paper, we compare two models for handling incomplete continuous and binary outcomes in IPD meta-analysis: a joint hierarchical model and a sequence of full conditional mixed models. We illustrate how these approaches incorporate the correlation across the multiple outcomes and the between-study heterogeneity when addressing the missing data. Simulations characterise the performance of the methods across a range of scenarios which differ according to the proportion and type of missingness, strength of correlation between outcomes and the number of studies. The joint model provided confidence interval coverage consistently closer to nominal levels and lower mean squared error compared with the fully conditional approach across the scenarios considered. Methods are illustrated in a meta-analysis of randomised controlled trials comparing the effectiveness of implantable cardioverter-defibrillator devices alone to implantable cardioverter-defibrillator combined with cardiac resynchronisation therapy for treating patients with chronic heart failure. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  6. The effect of antimicrobial agents on bond strength of orthodontic adhesives: a meta-analysis of in vitro studies.

    PubMed

    Altmann, A S P; Collares, F M; Leitune, V C B; Samuel, S M W

    2016-02-01

    Antimicrobial orthodontic adhesives aim to reduce the incidence of white spot lesions in orthodontic patients, but they should not jeopardize the adhesive's properties. A systematic review and meta-analysis were performed to answer the questions of whether the association of antimicrobial agents with orthodontic adhesives compromises their mechanical properties, and whether any antimicrobial agent is superior. PubMed and Scopus databases were searched. In vitro studies comparing the shear bond strength of conventional photo-activated orthodontic adhesives to antimicrobial photo-activated orthodontic adhesives were considered eligible. Search terms included the following: orthodontics, orthodontic, antimicrobial, antibacterial, bactericidal, adhesive, resin, resin composite, bonding agent, bonding system, and bond strength. The searches yielded 494 citations, reduced to 467 after duplicates were discarded. Titles and abstracts were read and 13 publications were selected for full-text reading. Twelve studies were included in the meta-analysis. The global analysis showed no statistically significant difference between control and experimental groups. In the subgroup analysis, only the chlorhexidine subgroup showed a statistically significant difference, with the control groups having higher bond strength than the experimental groups. Many studies on in vitro orthodontic bond strength fail to report test conditions that could affect their outcomes. The pooled in vitro data suggest that adding an antimicrobial agent to an orthodontic adhesive system does not influence bond strength to enamel. It is not possible to state which antimicrobial agent is best to incorporate. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Quantitative topographic differentiation of the neonatal EEG.

    PubMed

    Paul, Karel; Krajca, Vladimír; Roth, Zdenek; Melichar, Jan; Petránek, Svojmil

    2006-09-01

    To test the discriminatory topographic potential of a new method of automatic EEG analysis in neonates. A quantitative description of the neonatal EEG can contribute to the objective assessment of the functional state of the brain, and may improve the precision of diagnosing cerebral dysfunctions manifested by 'disorganization', 'dysrhythmia' or 'dysmaturity'. Twenty-one healthy, full-term newborns were examined polygraphically during sleep (EEG with 8 referential derivations, respiration, ECG, EOG, EMG). From each EEG record, two 5-min samples (one from the middle of quiet sleep, the other from the middle of active sleep) were subjected to automatic analysis and were described by 13 variables: spectral features and features describing the shape and variability of the signal. The data from individual infants were averaged and the number of variables was reduced by factor analysis. All factors identified by factor analysis were statistically significantly influenced by the location of derivation. A large number of statistically significant differences were also established when comparing the effects of individual derivations on each of the 13 measured variables. Both the spectral features and the features describing the shape and variability of the signal are largely accountable for the topographic differentiation of the neonatal EEG. The presented method of automatic EEG analysis is capable of assessing the topographic characteristics of the neonatal EEG; it is adequately sensitive and describes the neonatal electroencephalogram with sufficient precision. The discriminatory capability of the method represents a promise for its application in clinical practice.

  8. ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization

    NASA Astrophysics Data System (ADS)

    Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; Ganis, G.; Gheata, A.; Maline, D. Gonzalez; Goto, M.; Iwaszkiewicz, J.; Kreshuk, A.; Segura, D. Marcos; Maunder, R.; Moneta, L.; Naumann, A.; Offermann, E.; Onuchin, V.; Panacek, S.; Rademakers, F.; Russo, P.; Tadel, M.

    2009-12-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting, while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece in these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like Postscript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks (e.g. data mining in HEP) by using PROOF, which will take care of optimally distributing the work over the available resources in a transparent way.

    Program summary
    Program title: ROOT
    Catalogue identifier: AEFA_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFA_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: LGPL
    No. of lines in distributed program, including test data, etc.: 3 044 581
    No. of bytes in distributed program, including test data, etc.: 36 325 133
    Distribution format: tar.gz
    Programming language: C++
    Computer: Intel i386, Intel x86-64, Motorola PPC, Sun Sparc, HP PA-RISC
    Operating system: GNU/Linux, Windows XP/Vista, Mac OS X, FreeBSD, OpenBSD, Solaris, HP-UX, AIX
    Has the code been vectorized or parallelized?: Yes
    RAM: >55 Mbytes
    Classification: 4, 9, 11.9, 14
    Nature of problem: Storage, analysis and visualization of scientific data
    Solution method: Object store, wide range of analysis algorithms and visualization methods
    Additional comments: For an up-to-date author list see: http://root.cern.ch/drupal/content/root-development-team and http://root.cern.ch/drupal/content/former-root-developers
    Running time: Depending on the data size and complexity of the analysis algorithms
    References: http://root.cern.ch
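
    ROOT itself is C++; to keep all code examples here in one language, the sketch below uses PyROOT, ROOT's Python bindings. It shows the basic workflow the abstract describes: fill a TTree, project it into a histogram, fit, and save to a compressed ROOT file. File and object names are illustrative.

    ```python
    # Minimal PyROOT sketch of the TTree/histogram workflow.
    import ROOT
    from array import array

    f = ROOT.TFile("demo.root", "RECREATE")   # machine-independent binary file
    tree = ROOT.TTree("t", "demo tree")
    x = array("d", [0.0])
    tree.Branch("x", x, "x/D")

    rnd = ROOT.TRandom3(42)
    for _ in range(10000):
        x[0] = rnd.Gaus(0.0, 1.0)
        tree.Fill()

    h = ROOT.TH1F("h", "Gaussian sample;x;entries", 100, -5, 5)
    tree.Draw("x >> h", "", "goff")           # fill histogram from the TTree
    h.Fit("gaus", "Q")                        # built-in Gaussian fit
    print("fitted sigma:", h.GetFunction("gaus").GetParameter(2))

    f.Write()
    f.Close()
    ```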

  9. Point estimation following two-stage adaptive threshold enrichment clinical trials.

    PubMed

    Kimani, Peter K; Todd, Susan; Renfro, Lindsay A; Stallard, Nigel

    2018-05-31

    Recently, several study designs incorporating treatment effect assessment in biomarker-based subpopulations have been proposed. Most statistical methodologies for such designs focus on the control of type I error rate and power. In this paper, we have developed point estimators for clinical trials that use the two-stage adaptive enrichment threshold design. The design consists of two stages, where in stage 1, patients are recruited in the full population. Stage 1 outcome data are then used to perform interim analysis to decide whether the trial continues to stage 2 with the full population or a subpopulation. The subpopulation is defined based on one of the candidate threshold values of a numerical predictive biomarker. To estimate treatment effect in the selected subpopulation, we have derived unbiased estimators, shrinkage estimators, and estimators that estimate bias and subtract it from the naive estimate. We have recommended one of the unbiased estimators. However, since none of the estimators dominated in all simulation scenarios based on both bias and mean squared error, an alternative strategy would be to use a hybrid estimator where the estimator used depends on the subpopulation selected. This would require a simulation study of plausible scenarios before the trial. © 2018 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  10. A note on generalized Genome Scan Meta-Analysis statistics

    PubMed Central

    Koziol, James A; Feng, Anne C

    2005-01-01

    Background Wise et al. introduced a rank-based statistical technique for meta-analysis of genome scans, the Genome Scan Meta-Analysis (GSMA) method. Levinson et al. recently described two generalizations of the GSMA statistic: (i) a weighted version of the GSMA statistic, so that different studies could be ascribed different weights for analysis; and (ii) an order statistic approach, reflecting the fact that a GSMA statistic can be computed for each chromosomal region or bin width across the various genome scan studies. Results We provide an Edgeworth approximation to the null distribution of the weighted GSMA statistic, examine the limiting distribution of the GSMA statistics under the order statistic formulation, and quantify the relevance of the pairwise correlations of the GSMA statistics across different bins to this limiting distribution. We also remark on aggregate criteria and multiple testing for determining significance of GSMA results. Conclusion Theoretical considerations detailed herein can lead to clarification and simplification of testing criteria for generalizations of the GSMA statistic. PMID:15717930
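
    For orientation, the basic (unweighted) GSMA statistic can be sketched as below: within each genome scan, bins are ranked by their evidence for linkage, the per-bin rank sums are computed across studies, and significance is assessed by permutation. The evidence matrix is simulated, and a weighted version would simply scale each study's ranks by its weight.

    ```python
    # Simplified sketch of the unweighted GSMA rank-sum statistic with a
    # permutation null. Bin evidence values are simulated.
    import numpy as np

    rng = np.random.default_rng(8)
    n_studies, n_bins = 10, 120
    evidence = rng.normal(size=(n_studies, n_bins))
    evidence[:, 17] += 1.0                 # one truly linked bin (toy signal)

    # Rank bins within each study (1 = weakest, n_bins = strongest).
    ranks = evidence.argsort(axis=1).argsort(axis=1) + 1
    gsma = ranks.sum(axis=0)               # summed rank per bin

    # Permutation null: shuffle bin labels independently within each study.
    n_perm = 2000
    null = np.empty((n_perm, n_bins))
    for i in range(n_perm):
        null[i] = np.array([rng.permutation(r) for r in ranks]).sum(axis=0)

    p_bin = (null >= gsma).mean(axis=0)    # pointwise p-value per bin
    print("top bin:", gsma.argmax(), " p =", p_bin[gsma.argmax()].round(4))
    ```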

  11. Canonical Statistical Model for Maximum Expected Immission of Wire Conductor in an Aperture Enclosure

    NASA Technical Reports Server (NTRS)

    Bremner, Paul G.; Vazquez, Gabriel; Christiano, Daniel J.; Trout, Dawn H.

    2016-01-01

    Prediction of the maximum expected electromagnetic pick-up of conductors inside a realistic shielding enclosure is an important canonical problem for system-level EMC design of spacecraft, launch vehicles, aircraft and automobiles. This paper introduces a simple statistical power balance model for prediction of the maximum expected current in a wire conductor inside an aperture enclosure. It calculates both the statistical mean and variance of the immission from the physical design parameters of the problem. Familiar probability density functions can then be used to predict the maximum expected immission for design purposes. The statistical power balance model requires minimal EMC design information and solves orders of magnitude faster than existing numerical models, making it ultimately viable for scaled-up, full system-level modeling. Both experimental test results and full-wave simulation results are used to validate the foundational model.

  12. A new full-field interferometry approach for counting and differentiating aquatic biotic nanoparticles (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Boccara, A. Claude; Fedala, Yasmina; Voronkoff, Justine; Paffoni, Nina; Boccara, Martine

    2017-03-01

    Due to the huge abundance of viruses and membrane vesicles, and the major role they play in sea and river ecosystems, it is necessary to develop simple, sensitive, compact and reliable methods for their detection and characterization. Our approach is based on the measurement of the weak light level scattered by the biotic nanoparticles. We describe a new full-field, incoherently illuminated, shot-noise-limited, common-path interferometric detection method, coupled with the analysis of Brownian motion, to detect, quantify, and differentiate biotic nanoparticles. The latest developments take advantage of a new fast (700 Hz) camera with 2 Me⁻ full-well capacity that improves the signal-to-noise ratio and increases the precision of the Brownian motion characterization. We validated the method with calibrated nanoparticles and homogeneous DNA or RNA viruses. The smallest virus size that we characterized with a suitable signal-to-noise ratio was around 30 nm in diameter, with a target towards the numerous 20 nm diameter viruses. We show for the first time anisotropic trajectories for myoviruses, meaning that there is a memory of the initial direction of their Brownian motions. Significant improvements have been made in the handling of the sample as well as in the statistical analysis for differentiating the various families of vesicles and viruses. We further applied the method to vesicle detection and to the analysis of coastal and oligotrophic samples from the Tara Oceans circumnavigation, as well as samples from various rivers.
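
    The Brownian-motion sizing step can be sketched with a standard calculation: estimate the diffusion coefficient from the mean squared displacement (MSD) of a tracked particle, then convert to hydrodynamic diameter via the Stokes-Einstein relation. Water at 20 °C, the 700 Hz frame rate, and the 30 nm particle are assumptions for illustration.

    ```python
    # Sketch of Brownian-motion sizing: D from the lag-1 MSD of a simulated
    # 2-D trajectory, then diameter from Stokes-Einstein, d = kT/(3*pi*eta*D).
    import numpy as np

    kB, T, eta = 1.380649e-23, 293.15, 1.0e-3   # SI; eta of water at ~20 C
    dt = 1.0 / 700.0                            # frame interval (700 Hz)
    d_true = 30e-9                              # particle diameter, 30 nm
    D_true = kB * T / (3 * np.pi * eta * d_true)

    rng = np.random.default_rng(9)
    n = 5000
    steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(n, 2))
    traj = np.cumsum(steps, axis=0)             # 2-D Brownian trajectory

    # MSD at lag 1 frame; for 2-D diffusion, MSD(dt) = 4 * D * dt.
    msd = np.mean(np.sum(np.diff(traj, axis=0) ** 2, axis=1))
    D_est = msd / (4 * dt)
    d_est = kB * T / (3 * np.pi * eta * D_est)
    print(f"estimated diameter: {d_est * 1e9:.1f} nm")
    ```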

  13. Neutrino oscillations: what do we know about θ13

    NASA Astrophysics Data System (ADS)

    Ernst, David

    2008-10-01

    The phenomenon of neutrino oscillations is reviewed. A new analysis tool for the recent, more finely binned Super-K atmospheric data is outlined. This analysis incorporates the full three-neutrino oscillation probabilities, including the mixing angle θ13 to all orders, and a full three-neutrino treatment of the Earth's MSW effect. Combined with the K2K, MINOS, and CHOOZ data, the upper bound on θ13 is found to arise from the Super-K atmospheric data, while the lower bound arises from CHOOZ. This is caused by the terms linear in θ13, which are of particular importance in the region L/E > 10^4 m/MeV, where the sub-dominant expansion is not convergent. In addition, the enhancement of θ12 by the Earth's MSW effect is found to be important for this result. The best fit value of θ13 is found to be (statistically insignificantly) negative, θ13 = -0.07 +0.18/-0.11. In collaboration with Jesus Escamilla, Vanderbilt University, and David Latimer, University of Kentucky.

  14. Meta-analysis and The Cochrane Collaboration: 20 years of the Cochrane Statistical Methods Group

    PubMed Central

    2013-01-01

    The Statistical Methods Group has played a pivotal role in The Cochrane Collaboration over the past 20 years. The Statistical Methods Group has determined the direction of statistical methods used within Cochrane reviews, developed guidance for these methods, provided training, and continued to discuss and consider new and controversial issues in meta-analysis. The contribution of Statistical Methods Group members to the meta-analysis literature has been extensive and has helped to shape the wider meta-analysis landscape. In this paper, marking the 20th anniversary of The Cochrane Collaboration, we reflect on the history of the Statistical Methods Group, beginning in 1993 with the identification of aspects of statistical synthesis for which consensus was lacking about the best approach. We highlight some landmark methodological developments that Statistical Methods Group members have contributed to in the field of meta-analysis. We discuss how the Group implements and disseminates statistical methods within The Cochrane Collaboration. Finally, we consider the importance of robust statistical methodology for Cochrane systematic reviews, note research gaps, and reflect on the challenges that the Statistical Methods Group faces in its future direction. PMID:24280020

  15. Statistical Tutorial | Center for Cancer Research

    Cancer.gov

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data.  ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018.  The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean differences, simple and multiple linear regression, ANOVA tests, and Chi-Squared distribution.

  16. Statistical image quantification toward optimal scan fusion and change quantification

    NASA Astrophysics Data System (ADS)

    Potesil, Vaclav; Zhou, Xiang Sean

    2007-03-01

    Recent advances in imaging technology have brought new challenges and opportunities for automatic and quantitative analysis of medical images. With broader accessibility of more imaging modalities for more patients, fusion of modalities/scans from one time point and longitudinal analysis of changes across time points have become the two most critical differentiators to support more informed, more reliable and more reproducible diagnosis and therapy decisions. Unfortunately, scan fusion and longitudinal analysis are both inherently plagued by increased levels of statistical error. A lack of comprehensive analysis by imaging scientists and a lack of full awareness by physicians pose potential risks in clinical practice. In this paper, we discuss several key error factors affecting imaging quantification, study their interactions, and introduce a simulation strategy to establish general error bounds for change quantification across time. We quantitatively show that image resolution, voxel anisotropy, lesion size, eccentricity, and orientation are all contributing factors to quantification error, and that there is an intricate relationship between voxel anisotropy and lesion shape in affecting quantification error. Specifically, when two or more scans are to be fused at the feature level, optimal linear fusion analysis reveals that scans with voxel anisotropy aligned with lesion elongation should receive a higher weight than other scans. As a result of such optimal linear fusion, we achieve a lower variance than naïve averaging. Simulated experiments are used to validate the theoretical predictions. Future work based on the proposed simulation methods may lead to general guidelines and error lower bounds for quantitative image analysis and change detection.
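
    The optimal-linear-fusion claim is a textbook inverse-variance result, and it can be verified in a few lines. The variances below are illustrative stand-ins for scan-dependent quantification error, not values from the paper.

    ```python
    # Worked sketch: combining two noisy measurements of the same quantity
    # with inverse-variance weights gives lower variance than naive
    # averaging: var_opt = v1*v2/(v1+v2) <= (v1+v2)/4 = var_naive.
    import numpy as np

    rng = np.random.default_rng(10)
    truth = 10.0
    var1, var2 = 1.0, 4.0                       # scan 1 sharper than scan 2
    n = 100000
    m1 = truth + rng.normal(0, np.sqrt(var1), n)
    m2 = truth + rng.normal(0, np.sqrt(var2), n)

    w1 = (1 / var1) / (1 / var1 + 1 / var2)     # inverse-variance weight
    fused = w1 * m1 + (1 - w1) * m2
    naive = 0.5 * (m1 + m2)

    print("naive averaging variance:", naive.var().round(3))  # ~1.25
    print("optimal fusion variance: ", fused.var().round(3))  # ~0.80
    ```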

  17. A collaborative sequential meta-analysis of individual patient data from randomized trials of endovascular therapy and tPA vs. tPA alone for acute ischemic stroke: ThRombEctomy And tPA (TREAT) analysis: statistical analysis plan for a sequential meta-analysis performed within the VISTA-Endovascular collaboration.

    PubMed

    MacIsaac, Rachael L; Khatri, Pooja; Bendszus, Martin; Bracard, Serge; Broderick, Joseph; Campbell, Bruce; Ciccone, Alfonso; Dávalos, Antoni; Davis, Stephen M; Demchuk, Andrew; Diener, Hans-Christoph; Dippel, Diederik; Donnan, Geoffrey A; Fiehler, Jens; Fiorella, David; Goyal, Mayank; Hacke, Werner; Hill, Michael D; Jahan, Reza; Jauch, Edward; Jovin, Tudor; Kidwell, Chelsea S; Liebeskind, David; Majoie, Charles B; Martins, Sheila Cristina Ouriques; Mitchell, Peter; Mocco, J; Muir, Keith W; Nogueira, Raul; Saver, Jeffrey L; Schonewille, Wouter J; Siddiqui, Adnan H; Thomalla, Götz; Tomsick, Thomas A; Turk, Aquilla S; White, Philip; Zaidat, Osama; Lees, Kennedy R

    2015-10-01

    Endovascular treatment has been shown to restore blood flow effectively. Second-generation medical devices such as stent retrievers are now showing overwhelming efficacy in clinical trials, particularly in conjunction with intravenous recombinant tissue plasminogen activator. This statistical analysis plan, utilizing a novel, sequential approach, describes a prospective, individual patient data analysis of endovascular therapy in conjunction with intravenous recombinant tissue plasminogen activator, agreed upon by the Thrombectomy and Tissue Plasminogen Activator Collaborative Group. The protocol specifies the primary outcome for efficacy as 'favorable' outcome defined by the ordinal distribution of the modified Rankin Scale measured at three months poststroke, but with modified Rankin Scale scores 5 and 6 collapsed into a single category. The primary analysis will aim to answer the questions: 'what is the treatment effect of endovascular therapy with intravenous recombinant tissue plasminogen activator compared to intravenous tissue plasminogen activator alone on the full-scale modified Rankin Scale at 3 months?' and 'to what extent do key patient characteristics influence the treatment effect of endovascular therapy?'. Key secondary outcomes include the effect of endovascular therapy on death within 90 days; analyses of the modified Rankin Scale using dichotomized methods; and effects of endovascular therapy on symptomatic intracranial hemorrhage. Several secondary analyses will be considered, as well as expanding the patient cohorts to intravenous recombinant tissue plasminogen activator-ineligible patients, should data allow. This collaborative meta-analysis of individual participant data from randomized trials of endovascular therapy vs. control in conjunction with intravenous thrombolysis will demonstrate the efficacy and generalizability of endovascular therapy with intravenous thrombolysis as a concomitant medication. © 2015 World Stroke Organization.

  18. Depth-Dependent Glycosaminoglycan Concentration in Articular Cartilage by Quantitative Contrast-Enhanced Micro–Computed Tomography

    PubMed Central

    Mittelstaedt, Daniel

    2015-01-01

    Objective A quantitative contrast-enhanced micro–computed tomography (qCECT) method was developed to investigate the depth dependency and heterogeneity of the glycosaminoglycan (GAG) concentration of ex vivo cartilage equilibrated with an anionic radiographic contrast agent, Hexabrix. Design Full-thickness fresh native (n = 19 in 3 subgroups) and trypsin-degraded (n = 6) articular cartilage blocks were imaged using micro–computed tomography (μCT) at high resolution (13.4 μm3) before and after equilibration with various Hexabrix bathing concentrations. The GAG concentration was calculated depth-dependently based on Gibbs-Donnan equilibrium theory. Analysis of variance with Tukey’s post hoc test was used to test for statistical significance (P < 0.05) of the effect of Hexabrix bathing concentration, and for differences in bulk and zonal GAG concentrations, individually and compared between native and trypsin-degraded cartilage. Results The bulk GAG concentration was calculated to be 74.44 ± 6.09 and 11.99 ± 4.24 mg/mL for native and degraded cartilage, respectively. A statistical difference was demonstrated for bulk and zonal GAG between native and degraded cartilage (P < 0.032). A statistical difference was not demonstrated for bulk GAG when comparing Hexabrix bathing concentrations (P > 0.3214) for either native or degraded cartilage. Depth-dependent GAG analysis of native cartilage revealed a statistical difference only in the radial zone between 30% and 50% Hexabrix bathing concentrations. Conclusions This nondestructive qCECT methodology calculated the depth-dependent GAG concentration for both native and trypsin-degraded cartilage at high spatial resolution. qCECT allows for a more detailed understanding of the topography and depth dependency, which could help diagnose health, degradation, and repair of native and contrived cartilage. PMID:26425259

  19. Methodological approaches in analysing observational data: A practical example on how to address clustering and selection bias.

    PubMed

    Trutschel, Diana; Palm, Rebecca; Holle, Bernhard; Simon, Michael

    2017-11-01

    Because not every scientific question on effectiveness can be answered with randomised controlled trials, research methods that minimise bias in observational studies are required. Two major concerns influence the internal validity of effect estimates: selection bias and clustering. Hence, to reduce the bias of the effect estimates, more sophisticated statistical methods are needed. The aim is to introduce statistical approaches such as propensity score matching and mixed models into a representative real-world analysis; additionally, the implementation in the statistical software R is presented to allow the results to be reproduced. We perform a two-level analytic strategy to address the problems of bias and clustering: (i) generalised models with different abilities to adjust for dependencies are used to analyse binary data and (ii) the genetic matching and covariate adjustment methods are used to adjust for selection bias. Hence, we analyse the data from two population samples: the sample produced by the matching method and the full sample. The different analysis methods in this article produce different results but still point in the same direction. In our example, the estimate of the probability of receiving a case conference is higher in the treatment group than in the control group. Both strategies, genetic matching and covariate adjustment, have their limitations but complement each other to provide the whole picture. The statistical approaches were feasible for reducing bias but were nevertheless limited by the sample used. For each study and obtained sample, the pros and cons of the different methods have to be weighed. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
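
    The paper's own implementation is in R and uses genetic matching; as a loose Python analogue of the selection-bias step only, the sketch below fits a propensity model and performs greedy 1:1 nearest-neighbour matching on the estimated score. Covariates and treatment assignment are simulated.

    ```python
    # Sketch of propensity-score matching (a simpler stand-in for the
    # genetic matching used in the paper): fit a propensity model, then
    # match each treated unit to the nearest available control.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(11)
    n = 2000
    X = rng.normal(size=(n, 4))                   # baseline covariates
    p_treat = 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1])))
    treated = rng.random(n) < p_treat

    ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

    # Greedy 1:1 nearest-neighbour matching on the propensity score.
    available = set(np.flatnonzero(~treated))
    pairs = []
    for t in np.flatnonzero(treated):
        pool = np.array(sorted(available))
        if pool.size == 0:
            break
        c = pool[np.argmin(np.abs(ps[pool] - ps[t]))]
        pairs.append((t, c))
        available.remove(c)

    print(f"matched {len(pairs)} treated-control pairs")
    # Outcome analysis would then proceed on the matched sample, e.g. with
    # a mixed model to handle clustering, as in the two-level strategy.
    ```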

  20. A Comparative Evaluation of Accuracy of the Dies Affected by Tray Type, Material Viscosity, and Pouring Sequence of Dual and Single Arch Impressions - An In vitro Study

    PubMed Central

    Kulkarni, Rahul S.; Shah, Rupal J.; Chhajlani, Rahul; Saklecha, Bhuwan; Maru, Kavita

    2017-01-01

    Introduction The clinician’s skill, impression techniques, and materials play a very important role in recording fine details in an impression for the accuracy of a fixed partial denture prosthesis. Impressions of the prepared teeth and of the opposing arch can be recorded simultaneously with dual-arch trays, while full-arch metal trays are used for impressions of the prepared teeth in one arch. Aim To measure and compare the accuracy of working dies made from impressions with metal and plastic dual-arch trays and metal full-arch trays, for two viscosities of impression material, and with the sequence of pouring the working and non-working sides varied. Materials and Methods A balanced design with independent samples was used to study the three variables (tray type, impression material viscosity, and pouring sequence). Impressions made with dual-arch and single-arch trays were divided into three groups (Group A: plastic dual-arch tray; Group B: metal dual-arch tray; Group C: full-arch metal stock tray). Two of these groups (Groups A and B) were subdivided into four subgroups each, and one group (Group C) was subdivided into two subgroups. A sample size of 30 was used in each subgroup, yielding a total of 300 impressions across the three groups (ten subgroups). Impressions were made of a machined circular stainless steel die. All three dimensions (occlusogingival, mesiodistal, and buccolingual) of the working dies as well as the stainless steel standard die were measured three times, and the means were used as the three standard sample values to which all working die means were compared. Statistical analysis was a 3-factor analysis of variance with hypothesis testing at α = 0.05. Results With respect to impression material viscosity, statistically significant differences were found in the dies for the buccolingual and mesiodistal dimensions. Regarding tray selection, metal dual-arch trays were slightly more accurate than plastic trays in the mesiodistal dimension; regarding pouring sequence, no differences were observed in the occlusogingival dimension, but in the buccolingual and mesiodistal dimensions the non-working side was more accurate. Conclusion The gypsum dies produced from the dual-arch impressions were generally smaller in all three dimensions than the stainless steel standard die. Plastic dual-arch trays were more accurate with rigid impression material, with no statistically significant difference for the sequence of pouring. Metal dual-arch trays were more accurate with monophase impression material, and the working side was more accurate. Stock metal full-arch trays were more accurate for monophase impression material. PMID:28571280
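    A sketch of the 3-factor ANOVA layout described above (tray type x viscosity x pouring sequence) using statsmodels; the data frame, factor levels, and response values are simulated placeholders, not the study data:

```python
# Simulated 3-factor ANOVA layout mirroring the study design above.
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(2)
trays = ("plastic_dual", "metal_dual", "metal_full")
viscosities = ("rigid", "monophase")
pours = ("working_first", "nonworking_first")
rows = [(t, v, p) for t in trays for v in viscosities for p in pours
        for _ in range(10)]
df = pd.DataFrame(rows, columns=["tray", "viscosity", "pour"])
df["buccolingual"] = 10 + rng.normal(0, 0.05, len(df))   # die dimension, mm

model = ols("buccolingual ~ C(tray) * C(viscosity) * C(pour)", data=df).fit()
print(anova_lm(model, typ=2))            # main effects and interactions
```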

  1. Statistical indicators of collective behavior and functional clusters in gene networks of yeast

    NASA Astrophysics Data System (ADS)

    Živković, J.; Tadić, B.; Wick, N.; Thurner, S.

    2006-03-01

    We analyze gene expression time-series data of yeast (S. cerevisiae) measured along two full cell-cycles. We quantify these data by using q-exponentials, gene expression ranking and a temporal mean-variance analysis. We construct gene interaction networks based on correlation coefficients and study the formation of the corresponding giant components and minimum spanning trees. By coloring genes according to their cell function we find functional clusters in the correlation networks and functional branches in the associated trees. Our results suggest that a percolation point of functional clusters can be identified on these gene expression correlation networks.
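    A brief sketch of the network-construction step, under the assumption that edges are weighted by the correlation distance 1 - |r|; the expression matrix is synthetic and networkx is assumed available:

```python
# Sketch: genes as nodes, edges weighted by 1 - |correlation|, then a
# minimum spanning tree, as in the correlation-network analysis above.
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
n_genes, n_times = 50, 30
expr = rng.normal(size=(n_genes, n_times))
expr[:10] += np.sin(np.linspace(0, 4 * np.pi, n_times))  # one coherent "cluster"

corr = np.corrcoef(expr)
G = nx.Graph()
for i in range(n_genes):
    for j in range(i + 1, n_genes):
        G.add_edge(i, j, weight=1 - abs(corr[i, j]))

mst = nx.minimum_spanning_tree(G)   # functional branches live on this tree
print(mst.number_of_edges())        # n_genes - 1 edges
```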

  2. ENSO related variability in the Southern Hemisphere, 1948-2000

    NASA Astrophysics Data System (ADS)

    Ribera, Pedro; Mann, Michael E.

    2003-01-01

    The spatiotemporal evolution of Southern Hemisphere climate variability is diagnosed based on the use of the NCEP reanalysis (1948-2000) dataset. Using the MTM-SVD analysis method, significant narrowband variability is isolated from the multivariate dataset. It is found that the ENSO signal exhibits statistically significant behavior at quasiquadrennial (3-6 yr) timescales for the full time period. A significant quasibiennial (2-3 yr) timescale emerges only for the latter half of the period. Analyses of the spatial evolution of the two reconstructed signals shed additional light on linkages between low- and high-latitude Southern Hemisphere climate anomalies.

  3. Interfaces between statistical analysis packages and the ESRI geographic information system

    NASA Technical Reports Server (NTRS)

    Masuoka, E.

    1980-01-01

    Interfaces between ESRI's geographic information system (GIS) data files and real valued data files written to facilitate statistical analysis and display of spatially referenced multivariable data are described. An example of data analysis which utilized the GIS and the statistical analysis system is presented to illustrate the utility of combining the analytic capability of a statistical package with the data management and display features of the GIS.

  4. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis

    PubMed Central

    Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Background Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. Results There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. Conclusion The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent. PMID:26053876

  5. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis.

    PubMed

    Wu, Yazhou; Zhou, Liang; Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent.

  6. Common pitfalls in statistical analysis: Clinical versus statistical significance

    PubMed Central

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2015-01-01

    In clinical research, study results that are statistically significant are often interpreted as being clinically important. While statistical significance indicates the reliability of the study results, clinical significance reflects their impact on clinical practice. The third article in this series exploring pitfalls in statistical analysis clarifies the importance of differentiating between statistical significance and clinical significance. PMID:26229754

  7. Transit safety & security statistics & analysis 2003 annual report (formerly SAMIS)

    DOT National Transportation Integrated Search

    2005-12-01

    The Transit Safety & Security Statistics & Analysis 2003 Annual Report (formerly SAMIS) is a compilation and analysis of mass transit accident, casualty, and crime statistics reported under the Federal Transit Administration's (FTA's) National Tr...

  8. Transit safety & security statistics & analysis 2002 annual report (formerly SAMIS)

    DOT National Transportation Integrated Search

    2004-12-01

    The Transit Safety & Security Statistics & Analysis 2002 Annual Report (formerly SAMIS) is a compilation and analysis of mass transit accident, casualty, and crime statistics reported under the Federal Transit Administration's (FTA's) National Tr...

  9. Apically extruded debris with reciprocating single-file and full-sequence rotary instrumentation systems.

    PubMed

    Bürklein, Sebastian; Schäfer, Edgar

    2012-06-01

    The purpose of this in vitro study was to assess the amount of apically extruded debris using rotary and reciprocating nickel-titanium instrumentation systems. Eighty human mandibular central incisors were randomly assigned to 4 groups (n = 20 teeth per group). The root canals were instrumented according to the manufacturers' instructions using the 2 reciprocating single-file systems Reciproc (VDW, Munich, Germany) and WaveOne (Dentsply Maillefer, Ballaigues, Switzerland) and the 2 full-sequence rotary Mtwo (VDW, Munich, Germany) and ProTaper (Dentsply Maillefer, Ballaigues, Switzerland) instruments. Bidistilled water was used as irrigant. The apically extruded debris was collected in preweighed glass vials using the Myers and Montgomery method. After drying, the mean weight of debris was assessed with a microbalance and statistically analyzed using analysis of variance and the post hoc Student-Newman-Keuls test. The time required to prepare the canals with the different instruments was also recorded. The reciprocating files produced significantly more debris compared with both rotary systems (P < .05). Although no statistically significant difference was obtained between the 2 rotary instruments (P > .05), the reciprocating single-file system Reciproc produced significantly more debris compared with all other instruments (P < .05). Instrumentation was significantly faster using Reciproc than with all other instruments (P < .05). Under the conditions of this study, all systems caused apical debris extrusion. Full-sequence rotary instrumentation was associated with less debris extrusion compared with the use of reciprocating single-file systems. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  10. Influence of eye biometrics and corneal micro-structure on noncontact tonometry.

    PubMed

    Jesus, Danilo A; Majewska, Małgorzata; Krzyżanowska-Berkowska, Patrycja; Iskander, D Robert

    2017-01-01

    Tonometry is widely used as the main screening tool supporting glaucoma diagnosis. Still, its accuracy could be improved if full knowledge about the variation of the corneal biomechanical properties was available. In this study, Optical Coherence Tomography (OCT) speckle statistics are used to infer the organisation of the corneal micro-structure and hence, to analyse its influence on intraocular pressure (IOP) measurements. Fifty-six subjects were recruited for this prospective study. Macro and micro-structural corneal parameters as well as subject age were considered. Macro-structural analysis included the parameters that are associated with the ocular anatomy, such as central corneal thickness (CCT), corneal radius, axial length, anterior chamber depth and white-to-white corneal diameter. Micro-structural parameters which included OCT speckle statistics were related to the internal organisation of the corneal tissue and its physiological changes during lifetime. The corneal speckle obtained from OCT was modelled with the Generalised Gamma (GG) distribution that is characterised with a scale parameter and two shape parameters. In macro-structure analysis, only CCT showed a statistically significant correlation with IOP (R2 = 0.25, p<0.001). The scale parameter and the ratio of the shape parameters of GG distribution showed statistically significant correlation with IOP (R2 = 0.19, p<0.001 and R2 = 0.17, p<0.001, respectively). For the studied group, a weak, although significant correlation was found between age and IOP (R2 = 0.053, p = 0.04). Forward stepwise regression showed that CCT and the scale parameter of the Generalised Gamma distribution can be combined in a regression model (R2 = 0.39, p<0.001) to study the role of the corneal structure on IOP. We show, for the first time, that corneal micro-structure influences the IOP measurements obtained from noncontact tonometry. OCT speckle statistics can be employed to learn about the corneal micro-structure and hence, to further calibrate the IOP measurements.
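    As a sketch of the speckle-modelling step, SciPy's generalised gamma distribution (two shape parameters plus scale) can be fitted to an amplitude sample; the values below are synthetic and illustrative, not fitted corneal data:

```python
# Fit a Generalised Gamma distribution to a synthetic speckle-amplitude sample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
speckle = stats.gengamma.rvs(a=2.0, c=1.5, scale=10.0, size=5000, random_state=rng)

a, c, loc, scale = stats.gengamma.fit(speckle, floc=0)   # fix location at zero
print(f"shape a={a:.2f}, shape c={c:.2f}, scale={scale:.2f}")
print(f"shape ratio a/c = {a / c:.2f}")   # candidate micro-structure descriptor
```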

  11. Influence of eye biometrics and corneal micro-structure on noncontact tonometry

    PubMed Central

    Majewska, Małgorzata; Krzyżanowska-Berkowska, Patrycja; Iskander, D. Robert

    2017-01-01

    Purpose Tonometry is widely used as the main screening tool supporting glaucoma diagnosis. Still, its accuracy could be improved if full knowledge about the variation of the corneal biomechanical properties was available. In this study, Optical Coherence Tomography (OCT) speckle statistics are used to infer the organisation of the corneal micro-structure and hence, to analyse its influence on intraocular pressure (IOP) measurements. Methods Fifty-six subjects were recruited for this prospective study. Macro and micro-structural corneal parameters as well as subject age were considered. Macro-structural analysis included the parameters that are associated with the ocular anatomy, such as central corneal thickness (CCT), corneal radius, axial length, anterior chamber depth and white-to-white corneal diameter. Micro-structural parameters which included OCT speckle statistics were related to the internal organisation of the corneal tissue and its physiological changes during lifetime. The corneal speckle obtained from OCT was modelled with the Generalised Gamma (GG) distribution that is characterised with a scale parameter and two shape parameters. Results In macro-structure analysis, only CCT showed a statistically significant correlation with IOP (R2 = 0.25, p<0.001). The scale parameter and the ratio of the shape parameters of GG distribution showed statistically significant correlation with IOP (R2 = 0.19, p<0.001 and R2 = 0.17, p<0.001, respectively). For the studied group, a weak, although significant correlation was found between age and IOP (R2 = 0.053, p = 0.04). Forward stepwise regression showed that CCT and the scale parameter of the Generalised Gamma distribution can be combined in a regression model (R2 = 0.39, p<0.001) to study the role of the corneal structure on IOP. Conclusions We show, for the first time, that corneal micro-structure influences the IOP measurements obtained from noncontact tonometry. OCT speckle statistics can be employed to learn about the corneal micro-structure and hence, to further calibrate the IOP measurements. PMID:28472178

  12. A Multiphase Validation of Atlas-Based Automatic and Semiautomatic Segmentation Strategies for Prostate MRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, Spencer; Rodrigues, George, E-mail: george.rodrigues@lhsc.on.ca; Department of Epidemiology/Biostatistics, University of Western Ontario, London

    2013-01-01

    Purpose: To perform a rigorous technological assessment and statistical validation of a software technology for anatomic delineations of the prostate on MRI datasets. Methods and Materials: A 3-phase validation strategy was used. Phase I consisted of anatomic atlas building using 100 prostate cancer MRI data sets to provide training data sets for the segmentation algorithms. In phase II, 2 experts contoured 15 new MRI prostate cancer cases using 3 approaches (manual, N points, and region of interest). In phase III, 5 new physicians with variable MRI prostate contouring experience segmented the same 15 phase II datasets using 3 approaches: manual, N points with no editing, and full autosegmentation with user editing allowed. Statistical analyses for time and accuracy (using Dice similarity coefficient) endpoints used traditional descriptive statistics, analysis of variance, analysis of covariance, and pooled Student t test. Results: In phase I, average (SD) total and per slice contouring time for the 2 physicians was 228 (75), 17 (3.5), 209 (65), and 15 seconds (3.9), respectively. In phase II, statistically significant differences in physician contouring time were observed based on physician, type of contouring, and case sequence. The N points strategy resulted in superior segmentation accuracy when initial autosegmented contours were compared with final contours. In phase III, statistically significant differences in contouring time were observed based on physician, type of contouring, and case sequence again. The average relative timesaving for N points and autosegmentation was 49% and 27%, respectively, compared with manual contouring. The N points and autosegmentation strategies resulted in average Dice values of 0.89 and 0.88, respectively. Pre- and postedited autosegmented contours demonstrated a higher average Dice similarity coefficient of 0.94. Conclusion: The software provided robust contours with minimal editing required. Observed time savings were seen for all physicians irrespective of experience level and baseline manual contouring speed.
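    For reference, the accuracy endpoint above, the Dice similarity coefficient, reduces to a few lines of NumPy; the two masks here are synthetic stand-ins for segmentation contours:

```python
# Dice similarity coefficient between a reference and a test binary mask.
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice similarity coefficient: 2|A n B| / (|A| + |B|)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

rng = np.random.default_rng(5)
reference = rng.random((64, 64)) > 0.5
test = reference.copy()
test[:4] = ~test[:4]                      # perturb a few rows of the mask
print(f"Dice = {dice(reference, test):.3f}")
```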

  13. Structural damage detection based on stochastic subspace identification and statistical pattern recognition: I. Theory

    NASA Astrophysics Data System (ADS)

    Ren, W. X.; Lin, Y. Q.; Fang, S. E.

    2011-11-01

    One of the key issues in vibration-based structural health monitoring is to extract the damage-sensitive but environment-insensitive features from sampled dynamic response measurements and to carry out the statistical analysis of these features for structural damage detection. A new damage feature is proposed in this paper by using the system matrices of the forward innovation model based on the covariance-driven stochastic subspace identification of a vibrating system. To overcome the variations of the system matrices, a non-singularity transposition matrix is introduced so that the system matrices are normalized to their standard forms. For reducing the effects of modeling errors, noise and environmental variations on measured structural responses, a statistical pattern recognition paradigm is incorporated into the proposed method. The Mahalanobis and Euclidean distance decision functions of the damage feature vector are adopted by defining a statistics-based damage index. The proposed structural damage detection method is verified against one numerical signal and two numerical beams. It is demonstrated that the proposed statistics-based damage index is sensitive to damage and shows some robustness to the noise and false estimation of the system ranks. The method is capable of locating damage of the beam structures under different types of excitations. The robustness of the proposed damage detection method to the variations in environmental temperature is further validated in a companion paper by a reinforced concrete beam tested in the laboratory and a full-scale arch bridge tested in the field.
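    A minimal sketch of a statistics-based damage index of the kind described above, assuming a population of baseline (undamaged) feature vectors; all values are synthetic:

```python
# Mahalanobis-distance damage index relative to the baseline feature population.
import numpy as np

rng = np.random.default_rng(6)
baseline = rng.normal(size=(200, 4))               # healthy-state feature vectors
mu = baseline.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

def damage_index(feature: np.ndarray) -> float:
    d = feature - mu
    return float(np.sqrt(d @ cov_inv @ d))

print(f"{damage_index(mu):.2f}")                               # ~0: baseline-like
print(f"{damage_index(mu + np.array([3., 0., 0., 0.])):.2f}")  # large: flags damage
```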

  14. Viewpoint: observations on scaled average bioequivalence.

    PubMed

    Patterson, Scott D; Jones, Byron

    2012-01-01

    The two one-sided test procedure (TOST) has been used for average bioequivalence testing since 1992 and is required when marketing new formulations of an approved drug. TOST is known to require comparatively large numbers of subjects to demonstrate bioequivalence for highly variable drugs, defined as those drugs having intra-subject coefficients of variation greater than 30%. However, TOST has been shown to protect public health when multiple generic formulations enter the marketplace following patent expiration. Recently, scaled average bioequivalence (SABE) has been proposed as an alternative statistical analysis procedure for such products by multiple regulatory agencies. SABE testing requires that a three-period partial replicate cross-over or full replicate cross-over design be used. Following a brief summary of SABE analysis methods applied to existing data, we will consider three statistical ramifications of the proposed additional decision rules and the potential impact of implementation of scaled average bioequivalence in the marketplace using simulation. It is found that a constraint being applied is biased, that bias may also result from the common problem of missing data and that the SABE methods allow for much greater changes in exposure when generic-generic switching occurs in the marketplace. Copyright © 2011 John Wiley & Sons, Ltd.
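    A hedged sketch of the conventional (unscaled) average-bioequivalence decision that TOST implements, on simulated log-ratio data; the SABE scaling itself is not shown and the sample values are assumptions:

```python
# Unscaled average bioequivalence: the 90% CI of the geometric mean ratio
# must lie within [0.80, 1.25]. Simulated per-subject log-differences stand
# in for a crossover dataset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 24
log_diff = rng.normal(np.log(1.05), 0.25, n)   # log(test) - log(reference)

mean, se = log_diff.mean(), log_diff.std(ddof=1) / np.sqrt(n)
t_crit = stats.t.ppf(0.95, df=n - 1)           # two one-sided tests at alpha = 0.05
lo, hi = np.exp(mean - t_crit * se), np.exp(mean + t_crit * se)
print(f"90% CI for GMR: [{lo:.3f}, {hi:.3f}]")
print("bioequivalent" if lo >= 0.80 and hi <= 1.25 else "bioequivalence not shown")
```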

  15. Investigation of machinability characteristics on EN47 steel for cutting force and tool wear using optimization technique

    NASA Astrophysics Data System (ADS)

    M, Vasu; Shivananda Nayaka, H.

    2018-06-01

    In this experimental work, a dry turning process on EN47 spring steel with a coated tungsten carbide tool insert of 0.8 mm nose radius was optimized using statistical techniques. Experiments were conducted at three cutting speeds (625, 796 and 1250 rpm), three feed rates (0.046, 0.062 and 0.093 mm/rev) and three depths of cut (0.2, 0.3 and 0.4 mm), following a 3³ full factorial design (FFD) with three factors at three levels. Analysis of variance was used to identify the significant factors for each output response. The results reveal that feed rate is the most significant factor influencing cutting force, followed by depth of cut, with cutting speed having less significance. The optimum machining condition for cutting force was obtained from the statistical technique. Tool wear measurements were performed at the optimum condition of Vc = 796 rpm, ap = 0.2 mm, f = 0.046 mm/rev; the minimum tool wear observed was 0.086 mm after 5 min of machining. Tool wear was analysed with a confocal microscope, and it was observed that tool wear increases with increasing cutting time.
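    The 3³ design mentioned above can be enumerated directly; a small sketch with the factor levels taken from the abstract:

```python
# Enumerate the 3^3 full factorial design: 27 runs in total.
from itertools import product

speeds = (625, 796, 1250)       # cutting speed, rpm
feeds = (0.046, 0.062, 0.093)   # feed rate, mm/rev
depths = (0.2, 0.3, 0.4)        # depth of cut, mm

runs = list(product(speeds, feeds, depths))
assert len(runs) == 27
for i, (vc, f, ap) in enumerate(runs[:3], start=1):
    print(f"run {i}: Vc={vc} rpm, f={f} mm/rev, ap={ap} mm")
```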

  16. 20 CFR 416.2161 - Charges to States.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND... eligibility determinations. (3) The State must pay our full additional cost for statistical or other studies... for Medicaid purposes and for statistical or other studies and any other services. ...

  17. 20 CFR 416.2161 - Charges to States.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND... eligibility determinations. (3) The State must pay our full additional cost for statistical or other studies... for Medicaid purposes and for statistical or other studies and any other services. ...

  18. 20 CFR 416.2161 - Charges to States.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND... eligibility determinations. (3) The State must pay our full additional cost for statistical or other studies... for Medicaid purposes and for statistical or other studies and any other services. ...

  19. 20 CFR 416.2161 - Charges to States.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND... eligibility determinations. (3) The State must pay our full additional cost for statistical or other studies... for Medicaid purposes and for statistical or other studies and any other services. ...

  20. 20 CFR 416.2161 - Charges to States.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND... eligibility determinations. (3) The State must pay our full additional cost for statistical or other studies... for Medicaid purposes and for statistical or other studies and any other services. ...

  1. On Conceptual Analysis as the Primary Qualitative Approach to Statistics Education Research in Psychology

    ERIC Educational Resources Information Center

    Petocz, Agnes; Newbery, Glenn

    2010-01-01

    Statistics education in psychology often falls disappointingly short of its goals. The increasing use of qualitative approaches in statistics education research has extended and enriched our understanding of statistical cognition processes, and thus facilitated improvements in statistical education and practices. Yet conceptual analysis, a…

  2. Time Series Analysis Based on Running Mann Whitney Z Statistics

    USDA-ARS?s Scientific Manuscript database

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
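    A sketch of the running Mann-Whitney Z idea under one plausible reading, comparing the two half-windows around each time point and using the normal approximation for U (window size and layout are assumptions):

```python
# Running Mann-Whitney Z statistic over a moving window.
import numpy as np
from scipy import stats

def running_mw_z(x: np.ndarray, half: int) -> np.ndarray:
    z = np.full(len(x), np.nan)
    for t in range(half, len(x) - half):
        a, b = x[t - half:t], x[t:t + half]
        u = stats.mannwhitneyu(a, b, alternative="two-sided").statistic
        mu = half * half / 2.0                        # E[U] under H0
        sigma = np.sqrt(half * half * (2 * half + 1) / 12.0)
        z[t] = (u - mu) / sigma
    return z

rng = np.random.default_rng(8)
series = np.concatenate([rng.normal(0, 1, 100), rng.normal(1, 1, 100)])
z = running_mw_z(series, half=20)
print(f"max |Z| = {np.nanmax(np.abs(z)):.2f}")        # spikes near the shift at t=100
```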

  3. VOXEL-LEVEL MAPPING OF TRACER KINETICS IN PET STUDIES: A STATISTICAL APPROACH EMPHASIZING TISSUE LIFE TABLES.

    PubMed

    O'Sullivan, Finbarr; Muzi, Mark; Mankoff, David A; Eary, Janet F; Spence, Alexander M; Krohn, Kenneth A

    2014-06-01

    Most radiotracers used in dynamic positron emission tomography (PET) scanning act in a linear time-invariant fashion so that the measured time-course data are a convolution between the time course of the tracer in the arterial supply and the local tissue impulse response, known as the tissue residue function. In statistical terms the residue is a life table for the transit time of injected radiotracer atoms. The residue provides a description of the tracer kinetic information measurable by a dynamic PET scan. Decomposition of the residue function allows separation of rapid vascular kinetics from slower blood-tissue exchanges and tissue retention. For voxel-level analysis, we propose that residues be modeled by mixtures of nonparametrically derived basis residues obtained by segmentation of the full data volume. Spatial and temporal aspects of diagnostics associated with voxel-level model fitting are emphasized. Illustrative examples, some involving cancer imaging studies, are presented. Data from cerebral PET scanning with 18F-fluorodeoxyglucose (FDG) and 15O-water (H2O) in normal subjects are used to evaluate the approach. Cross-validation is used to make regional comparisons between residues estimated using adaptive mixture models and more conventional compartmental modeling techniques. Simulation studies are used to theoretically examine mean square error performance and to explore the benefit of voxel-level analysis when the primary interest is a statistical summary of regional kinetics. The work highlights the contribution that multivariate analysis tools and life-table concepts can make in the recovery of local metabolic information from dynamic PET studies, particularly ones in which the assumptions of compartmental-like models, with residues that are sums of exponentials, might not be certain.
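    A toy illustration of the linear time-invariant relation described above; the input shape and single-exponential residue are assumptions, not the paper's nonparametric mixture model:

```python
# Tissue time-activity curve = convolution of arterial input with the residue.
import numpy as np

dt = 1.0                                  # seconds per sample
t = np.arange(0, 600, dt)
aif = t * np.exp(-t / 40.0)               # idealized arterial input function
residue = 0.1 * np.exp(-t / 120.0)        # flow x tracer survival fraction

tissue = np.convolve(aif, residue)[: len(t)] * dt
print(f"peak tissue activity at t = {t[np.argmax(tissue)]:.0f} s")
```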

  4. Role strain among male RNs in the critical care setting: Perceptions of an unfriendly workplace.

    PubMed

    Carte, Nicholas S; Williams, Collette

    2017-12-01

    Traditionally, nursing has been a female-dominated profession. Men employed as registered nurses have been in the minority, and little is known about the experiences of this demographic. The purpose of this descriptive, quantitative study was to understand the relationship between demographic variables and causes of role strain among male nurses in critical care settings. The Sherrod Role Strain Scale assesses role strain within the context of role conflict, role overload, role ambiguity and role incongruity. Data analysis of the results included descriptive and inferential statistics. Inferential statistics involved the use of repeated-measures ANOVA testing for significant differences in the causes of role strain among male nurses employed in critical care settings and a post hoc comparison of specific demographic data using multivariate analyses of variance (MANOVAs). Data from 37 male nurses in critical care settings in the northeast of the United States were used to calculate descriptive statistics (mean and standard deviation) and the results of the repeated-measures ANOVA and the post hoc secondary MANOVA analysis. The descriptive data showed that all participants worked full-time. There was a nearly even split between participants who worked day shift (46%) and night shift (43%), and most of the participants indicated they had 15 years or more of experience as a registered nurse (54%). Significant findings of this study include two causes of role strain in male nurses employed in critical care settings: role ambiguity and role overload based on ethnicity. Consistent with previous research findings, the results of this study suggest that male registered nurses employed in critical care settings do experience role strain. The two main causes of role strain in male nurses are role ambiguity and role overload. Copyright © 2017. Published by Elsevier Ltd.

  5. Transdermal granisetron for the prevention of nausea and vomiting following moderately or highly emetogenic chemotherapy in Chinese patients: a randomized, double-blind, phase III study.

    PubMed

    Yang, Liu-Qing; Sun, Xin-Chen; Qin, Shu-Kui; Chen, Ying-Xia; Zhang, He-Long; Cheng, Ying; Chen, Zhen-Dong; Shi, Jian-Hua; Wu, Qiong; Bai, Yu-Xian; Han, Bao-Hui; Liu, Wei; Ouyang, Xue-Nong; Liu, Ji-Wei; Zhang, Zhi-Hui; Li, Yong-Qiang; Xu, Jian-Ming; Yu, Shi-Ying

    2016-12-01

    The granisetron transdermal delivery system (GTDS) has demonstrated effectiveness in the control of chemotherapy-induced nausea and vomiting (CINV) in previous studies. This is the first phase III study to evaluate the efficacy and tolerability of GTDS in patients receiving moderately emetogenic chemotherapy (MEC) or highly emetogenic chemotherapy (HEC) in China. A total of 313 patients were randomized into the GTDS group (one transdermal granisetron patch, 7 days) or the oral granisetron group (granisetron oral 2 mg/day, ≥2 days). The primary endpoint was the percentage of patients achieving complete control (CC) from chemotherapy initiation until 24 h after final administration (PEEP). The chi-square test and Fisher's exact test were used for statistical analysis. Two hundred eighty-one patients were included in the per protocol analysis. During PEEP, CC was achieved by 67 (47.52%) patients in the GTDS group and 83 (59.29%) patients in the oral granisetron group. There was no statistically significant difference between the groups (P = 0.0559). However, the difference in the CC percentage between the groups mainly occurred on the first day of chemotherapy. In the full analysis set, the CC rate was 70.13% on day 1 in the GTDS group, which was significantly lower than the 91.03% in the oral granisetron group. On the following days of chemotherapy, the CC rates were similar between the groups. For cisplatin-containing regimens and for female patients, the differences between the groups were statistically significant. Both treatments were well tolerated and safe. The most common adverse event was constipation. GTDS provided effective and well-tolerated control of CINV in Chinese patients, especially with non-cisplatin-containing regimens.
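    For illustration, the primary comparison can be reproduced approximately from the reported percentages; the counts are back-calculated (67/141 = 47.52%, 83/140 = 59.29%) and should be treated as assumptions:

```python
# Chi-square and Fisher tests on the back-calculated 2x2 table of CC rates.
from scipy import stats

# rows: GTDS, oral granisetron; columns: CC achieved, CC not achieved
table = [[67, 74], [83, 57]]
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi-square p = {p:.4f}")          # close to the reported P = 0.0559
odds, p_fisher = stats.fisher_exact(table)
print(f"Fisher exact p = {p_fisher:.4f}")
```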

  6. Effect of piracetam and nimodipine on full-thickness skin burns in rabbits.

    PubMed

    Sari, Elif; Dincel, Gungor C

    2016-08-01

    The potential of several drugs for full-thickness skin burns has been investigated, but the treatment of such burns remains a challenge in plastic surgery. The present study was designed to determine the effect of systemic and topical administration of piracetam and nimodipine on full-thickness skin burn wound healing. A total of 36 New Zealand male rabbits were divided into six groups. Full-thickness skin burns were produced in all the groups, except the control group. Piracetam was administered systemically (piracetam-IV) and topically (piracetam-C) for 14 days, and nimodipine was administered systemically (nimodipine-IV) and topically (nimodipine-C) over the burn wounds for 14 days. The sham group underwent burn injury but was not administered any drug. After 21 days, gross examination and histopathological analysis were performed and the results were compared statistically. Nimodipine-C and nimodipine-IV had no effect on burn wound healing. However, both piracetam-IV and piracetam-C significantly enhanced the healing of the full-thickness skin burn wounds, although the latter was more effective, useful and practical in burn wound healing. The histopathological features of the wounds in the piracetam-C group were closer to those of the control group than those of the other groups. Piracetam-C rather than piracetam-IV may promote full-thickness burn wound healing in rabbits. © 2015 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  7. Trends in Fetal Medicine: A 10-Year Bibliometric Analysis of Prenatal Diagnosis

    PubMed Central

    Dhombres, Ferdinand; Bodenreider, Olivier

    2018-01-01

    The objective is to automatically identify trends in Fetal Medicine over the past 10 years through a bibliometric analysis of articles published in Prenatal Diagnosis, using text mining techniques. We processed 2,423 full-text articles published in Prenatal Diagnosis between 2006 and 2015. We extracted salient terms, calculated their frequencies over time, and established evolution profiles for terms, from which we derived falling, stable, and rising trends. We identified 618 terms with a falling trend, 2,142 stable terms, and 839 terms with a rising trend. Terms with increasing frequencies include those related to statistics and medical study design. The most recent of these terms reflect the new opportunities of next-generation sequencing. Many terms related to cytogenetics exhibit a falling trend. A bibliometric analysis based on text mining effectively supports identification of trends over time. This scalable approach is complementary to analyses based on metadata or expert opinion. PMID:29295220

  8. Unmasking the masked Universe: the 2M++ catalogue through Bayesian eyes

    NASA Astrophysics Data System (ADS)

    Lavaux, Guilhem; Jasche, Jens

    2016-01-01

    This work describes a full Bayesian analysis of the Nearby Universe as traced by galaxies of the 2M++ survey. The analysis is run in two sequential steps. The first step self-consistently derives the luminosity-dependent galaxy biases, the power spectrum of matter fluctuations and matter density fields within a Gaussian statistics approximation. The second step makes a detailed analysis of the three-dimensional large-scale structures, assuming a fixed bias model and a fixed cosmology. This second step allows for the reconstruction of both the final density field and the initial conditions at z = 1000. From these, we derive fields that self-consistently extrapolate the observed large-scale structures. We give two examples of these extrapolations and their utility for the detection of structures: the visibility of the Sloan Great Wall, and the detection and characterization of the Local Void using DIVA, a Lagrangian-based technique to classify structures.

  9. IVHS Countermeasures for Rear-End Collisions, Task 1; Vol. II: Statistical Analysis

    DOT National Transportation Integrated Search

    1994-02-25

    This report is from the NHTSA sponsored program, "IVHS Countermeasures for Rear-End Collisions". This Volume, Volume II, Statistical Analysis, presents the statistical analysis of rear-end collision accident data that characterizes the accidents with...

  10. Full Wave Analysis of RF Signal Attenuation in a Lossy Rough Surface Cave using a High Order Time Domain Vector Finite Element Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pingenot, J; Rieben, R; White, D

    2005-10-31

    We present a computational study of signal propagation and attenuation of a 200 MHz planar loop antenna in a cave environment. The cave is modeled as a straight and lossy random rough wall. To simulate a broad frequency band, the full wave Maxwell equations are solved directly in the time domain via a high order vector finite element discretization using the massively parallel CEM code EMSolve. The numerical technique is first verified against theoretical results for a planar loop antenna in a smooth lossy cave. The simulation is then performed for a series of random rough surface meshes in order to generate statistical data for the propagation and attenuation properties of the antenna in a cave environment. Results for the mean and variance of the power spectral density of the electric field are presented and discussed.

  11. Effect of Stagger on the Vibroacoustic Loads from Clustered Rockets

    NASA Technical Reports Server (NTRS)

    Rojo, Raymundo; Tinney, Charles E.; Ruf, Joseph H.

    2016-01-01

    The effect of stagger startup on the vibro-acoustic loads that form during the end-effects regime of clustered rockets is studied using both full-scale (hot-gas) and laboratory-scale (cold-gas) data. Both configurations comprise three nozzles with thrust-optimized parabolic contours that undergo free shock separated flow and restricted shock separated flow as well as an end-effects regime prior to flowing full. Acoustic pressure waveforms recorded at the base of the nozzle clusters are analyzed using various statistical metrics as well as time-frequency analysis. The findings reveal a significant reduction in end-effects-regime loads when engine ignition is staggered. However, regardless of stagger, both the skewness and kurtosis of the acoustic pressure time derivative elevate to the same levels during the end-effects-regime event, thereby demonstrating the intermittence and impulsiveness of the acoustic waveforms that form during engine startup.
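    A short sketch of the waveform statistics used above, computed on a synthetic pressure record with one artificial impulsive event (the record and the sample rate are assumptions):

```python
# Skewness and kurtosis of the pressure time derivative flag impulsive events.
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
fs = 50_000.0                             # sample rate, Hz (assumed)
p_acoustic = rng.normal(size=100_000)     # background pressure fluctuations
p_acoustic[40_000:40_050] += 25.0         # one artificial impulsive event

dpdt = np.gradient(p_acoustic, 1.0 / fs)  # pressure time derivative
print(f"skewness = {stats.skew(dpdt):.2f}")
print(f"kurtosis = {stats.kurtosis(dpdt, fisher=False):.2f}")  # Gaussian -> 3
```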

  12. Statistical error propagation in ab initio no-core full configuration calculations of light nuclei

    DOE PAGES

    Navarro Pérez, R.; Amaro, J. E.; Ruiz Arriola, E.; ...

    2015-12-28

    We propagate the statistical uncertainty of experimental NN scattering data into the binding energies of 3H and 4He. Here, we also study the sensitivity of the magnetic moment and proton radius of 3H to changes in the NN interaction. The calculations are made with the no-core full configuration method in a sufficiently large harmonic oscillator basis. For those light nuclei we obtain ΔE_stat(3H) = 0.015 MeV and ΔE_stat(4He) = 0.055 MeV.

  13. Time series analysis for minority game simulations of financial markets

    NASA Astrophysics Data System (ADS)

    Ferreira, Fernando F.; Francisco, Gerson; Machado, Birajara S.; Muruganandam, Paulsamy

    2003-04-01

    The minority game (MG) model introduced recently provides promising insights into the understanding of the evolution of prices, indices and rates in the financial markets. In this paper we perform a time series analysis of the model employing tools from statistics, dynamical systems theory and stochastic processes. Using benchmark systems and a financial index for comparison, several conclusions are obtained about the generating mechanism for this kind of evolution. The motion is deterministic, driven by occasional random external perturbation. When the interval between two successive perturbations is sufficiently large, one can find low dimensional chaos in this regime. However, the full motion of the MG model is found to be similar to that of the first differences of the SP500 index: stochastic, nonlinear and (unit root) stationary.
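    A sketch of the unit-root check implied by "(unit root) stationary": an augmented Dickey-Fuller test on a simulated random-walk index and its first differences (statsmodels assumed available):

```python
# ADF test: the level series has a unit root; its first differences do not.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(11)
price = 1000 + np.cumsum(rng.normal(0, 1, 2000))   # random-walk "index"
returns = np.diff(price)                           # first differences

for name, series in [("level", price), ("first difference", returns)]:
    stat, pval = adfuller(series)[:2]
    print(f"{name}: ADF = {stat:.2f}, p = {pval:.3f}")
# level: unit root not rejected; first differences: stationary
```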

  14. Quantitative analysis of the correlations in the Boltzmann-Grad limit for hard spheres

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pulvirenti, M.

    2014-12-09

    In this contribution I consider the problem of the validity of the Boltzmann equation for a system of hard spheres in the Boltzmann-Grad limit. I briefly review the results available nowadays, with a particular emphasis on Lanford's celebrated validity theorem. Finally I present some recent results, obtained in collaboration with S. Simonella, concerning a quantitative analysis of the propagation of chaos. More precisely, we introduce a quantity (the correlation error) measuring how far a j-particle rescaled correlation function at time t (sufficiently small) is from full statistical independence. Roughly speaking, a correlation error of order k measures (in the context of the BBGKY hierarchy) the event in which k tagged particles form a recolliding group.

  15. The Heuristic Value of p in Inductive Statistical Inference

    PubMed Central

    Krueger, Joachim I.; Heck, Patrick R.

    2017-01-01

    Many statistical methods yield the probability of the observed data – or data more extreme – under the assumption that a particular hypothesis is true. This probability is commonly known as ‘the’ p-value. (Null Hypothesis) Significance Testing ([NH]ST) is the most prominent of these methods. The p-value has been subjected to much speculation, analysis, and criticism. We explore how well the p-value predicts what researchers presumably seek: the probability of the hypothesis being true given the evidence, and the probability of reproducing significant results. We also explore the effect of sample size on inferential accuracy, bias, and error. In a series of simulation experiments, we find that the p-value performs quite well as a heuristic cue in inductive inference, although there are identifiable limits to its usefulness. We conclude that despite its general usefulness, the p-value cannot bear the full burden of inductive inference; it is but one of several heuristic cues available to the data analyst. Depending on the inferential challenge at hand, investigators may supplement their reports with effect size estimates, Bayes factors, or other suitable statistics, to communicate what they think the data say. PMID:28649206
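    A minimal simulation in the spirit of the experiments described above, estimating how informative p < .05 is about the truth of the alternative under an assumed 50% base rate of real effects (sample size and effect size are also assumptions):

```python
# How often is H1 true among results with p < .05, under a 50% base rate?
import numpy as np
from scipy import stats

rng = np.random.default_rng(12)
n, n_sims, effect = 30, 20_000, 0.5
h1 = rng.random(n_sims) < 0.5         # which simulated studies have a real effect

pvals = np.empty(n_sims)
for i in range(n_sims):
    sample = rng.normal(effect if h1[i] else 0.0, 1.0, n)
    pvals[i] = stats.ttest_1samp(sample, 0.0).pvalue

sig = pvals < 0.05
print(f"P(H1 | p < .05)  = {h1[sig].mean():.2f}")   # well above the 0.5 base rate
print(f"P(H1 | p >= .05) = {h1[~sig].mean():.2f}")
```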

  16. The Heuristic Value of p in Inductive Statistical Inference.

    PubMed

    Krueger, Joachim I; Heck, Patrick R

    2017-01-01

    Many statistical methods yield the probability of the observed data - or data more extreme - under the assumption that a particular hypothesis is true. This probability is commonly known as 'the' p-value. (Null Hypothesis) Significance Testing ([NH]ST) is the most prominent of these methods. The p-value has been subjected to much speculation, analysis, and criticism. We explore how well the p-value predicts what researchers presumably seek: the probability of the hypothesis being true given the evidence, and the probability of reproducing significant results. We also explore the effect of sample size on inferential accuracy, bias, and error. In a series of simulation experiments, we find that the p-value performs quite well as a heuristic cue in inductive inference, although there are identifiable limits to its usefulness. We conclude that despite its general usefulness, the p-value cannot bear the full burden of inductive inference; it is but one of several heuristic cues available to the data analyst. Depending on the inferential challenge at hand, investigators may supplement their reports with effect size estimates, Bayes factors, or other suitable statistics, to communicate what they think the data say.

  17. Neuropsychological study of IQ scores in offspring of parents with bipolar I disorder.

    PubMed

    Sharma, Aditya; Camilleri, Nigel; Grunze, Heinz; Barron, Evelyn; Le Couteur, James; Close, Andrew; Rushton, Steven; Kelly, Thomas; Ferrier, Ian Nicol; Le Couteur, Ann

    2017-01-01

    Studies comparing IQ in Offspring of Bipolar Parents (OBP) with Offspring of Healthy Controls (OHC) have reported conflicting findings. These studies have included OBP with mental health/neurodevelopmental disorders and/or pharmacological treatment, which could affect results. This UK study aimed to assess IQ in OBP with no mental health/neurodevelopmental disorder and to assess the relationship of sociodemographic variables with IQ. IQ data using the Wechsler Abbreviated Scale of Intelligence (WASI) from 24 OBP and 34 OHC from the North East of England were analysed using mixed-effects modelling. All participants had IQ in the average range. OBP differed statistically significantly from OHC on Full Scale IQ (p = .001), Performance IQ (PIQ) (p = .003) and Verbal IQ (VIQ) (p = .001) but not on the PIQ-VIQ split. The OBP and OHC groups did not differ on socio-economic status (SES) or gender. SES made a statistically significant contribution to the variance of IQ scores (p = .001). Using a robust statistical model of analysis, the OBP with no current/past history of mental health/neurodevelopmental disorders had lower IQ scores than OHC. This finding should be borne in mind when assessing and recommending interventions for OBP.

  18. Policing in the United States: Has the Time Come for a Full-Time National Police Force

    DTIC Science & Technology

    2016-06-10


  19. [Analysis on composition and medication regularities of prescriptions treating hypochondriac pain based on traditional Chinese medicine inheritance support system inheritance support platform].

    PubMed

    Zhao, Yan-qing; Teng, Jing

    2015-03-01

    To analyze the composition and medication regularities of prescriptions treating hypochondriac pain in the Chinese journal full-text database (CNKI) based on the traditional Chinese medicine inheritance support system, in order to provide a reference for further research and development of new traditional Chinese medicines treating hypochondriac pain. The traditional Chinese medicine inheritance support platform software V2.0 was used to build a prescription database of Chinese medicines treating hypochondriac pain. The software's integrated data mining method was used to classify prescriptions according to "four odors", "five flavors" and "meridians" in the database and to perform frequency statistics, syndrome distribution, prescription regularity and new prescription analysis. An analysis was made of 192 prescriptions treating hypochondriac pain to determine the frequencies of medicines in prescriptions and commonly used medicine pairs and combinations, and to summarize 15 new prescriptions. This study indicated that the prescriptions treating hypochondriac pain in the Chinese journal full-text database are mostly those for soothing liver-qi stagnation, promoting qi and activating blood, clearing heat and removing dampness, and invigorating the spleen and removing phlegm; they are predominantly of cold property and bitter taste, and reflect the principle of "distinguishing deficiency and excess, and relieving pain by smoothing the meridians" in treating hypochondriac pain.

  20. Full Transcriptome Analysis of Early Dorsoventral Patterning in Zebrafish

    PubMed Central

    Horváth, Balázs; Molnár, János; Nagy, István; Tóth, Gábor; Wilson, Stephen W.; Varga, Máté

    2013-01-01

    Understanding the molecular interactions that lead to the establishment of the major body axes during embryogenesis is one of the main goals of developmental biology. Although the past two decades have revolutionized our knowledge about the genetic basis of these patterning processes, the list of genes involved in axis formation is unlikely to be complete. In order to identify new genes involved in the establishment of the dorsoventral (DV) axis during early stages of zebrafish embryonic development, we employed next generation sequencing for full transcriptome analysis of normal embryos and embryos lacking overt DV pattern. A combination of different statistical approaches yielded 41 differentially expressed candidate genes and we confirmed by in situ hybridization the early dorsal expression of 32 genes that are transcribed shortly after the onset of zygotic transcription. Although promoter analysis of the validated genes suggests no general enrichment for the binding sites of early acting transcription factors, most of these genes carry “bivalent” epigenetic histone modifications at the time when zygotic transcription is initiated, suggesting a “poised” transcriptional status. Our results reveal some new candidates of the dorsal gene regulatory network and suggest that a plurality of the earliest upregulated genes on the dorsal side have a role in the modulation of the canonical Wnt pathway. PMID:23922899

  1. Uncovering a latent multinomial: Analysis of mark-recapture data with misidentification

    USGS Publications Warehouse

    Link, W.A.; Yoshizaki, J.; Bailey, L.L.; Pollock, K.H.

    2010-01-01

    Natural tags based on DNA fingerprints or natural features of animals are now becoming very widely used in wildlife population biology. However, classic capture-recapture models do not allow for misidentification of animals which is a potentially very serious problem with natural tags. Statistical analysis of misidentification processes is extremely difficult using traditional likelihood methods but is easily handled using Bayesian methods. We present a general framework for Bayesian analysis of categorical data arising from a latent multinomial distribution. Although our work is motivated by a specific model for misidentification in closed population capture-recapture analyses, with crucial assumptions which may not always be appropriate, the methods we develop extend naturally to a variety of other models with similar structure. Suppose that observed frequencies f are a known linear transformation f = A′x of a latent multinomial variable x with cell probability vector π = π(θ). Given that full conditional distributions [θ | x] can be sampled, implementation of Gibbs sampling requires only that we can sample from the full conditional distribution [x | f, θ], which is made possible by knowledge of the null space of A′. We illustrate the approach using two data sets with individual misidentification, one simulated, the other summarizing recapture data for salamanders based on natural marks. © 2009, The International Biometric Society.
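    The null-space observation is the computational crux; a toy sketch (with a made-up A′ matrix, not the capture-recapture model itself) shows why moves along null-space directions leave the observed frequencies f unchanged:

```python
# Any move of the latent counts x along the null space of A' preserves f = A'x.
import numpy as np
from scipy.linalg import null_space

A_t = np.array([[1, 1, 0, 0],          # A' maps latent counts x to observed f
                [0, 0, 1, 1]], dtype=float)
x = np.array([3.0, 5.0, 2.0, 4.0])     # latent multinomial counts
f = A_t @ x                            # observed frequencies

basis = null_space(A_t)                # columns span {v : A'v = 0}
x_new = x + 2.0 * basis[:, 0]          # propose a move along a null direction
assert np.allclose(A_t @ x_new, f)     # f is unchanged, as required
```

    In the actual Gibbs sampler, x is integer-valued, so proposals are restricted to integer combinations of null-space vectors that keep every latent count non-negative.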

  2. Uncovering a Latent Multinomial: Analysis of Mark-Recapture Data with Misidentification

    USGS Publications Warehouse

    Link, W.A.; Yoshizaki, J.; Bailey, L.L.; Pollock, K.H.

    2009-01-01

    Natural tags based on DNA fingerprints or natural features of animals are now becoming very widely used in wildlife population biology. However, classic capture-recapture models do not allow for misidentification of animals which is a potentially very serious problem with natural tags. Statistical analysis of misidentification processes is extremely difficult using traditional likelihood methods but is easily handled using Bayesian methods. We present a general framework for Bayesian analysis of categorical data arising from a latent multinomial distribution. Although our work is motivated by a specific model for misidentification in closed population capture-recapture analyses, with crucial assumptions which may not always be appropriate, the methods we develop extend naturally to a variety of other models with similar structure. Suppose that observed frequencies f are a known linear transformation f = A′x of a latent multinomial variable x with cell probability vector π = π(θ). Given that full conditional distributions [θ | x] can be sampled, implementation of Gibbs sampling requires only that we can sample from the full conditional distribution [x | f, θ], which is made possible by knowledge of the null space of A′. We illustrate the approach using two data sets with individual misidentification, one simulated, the other summarizing recapture data for salamanders based on natural marks.

  3. Notes on numerical reliability of several statistical analysis programs

    USGS Publications Warehouse

    Landwehr, J.M.; Tasker, Gary D.

    1999-01-01

    This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.

  4. Computer Automated Ultrasonic Inspection System

    DTIC Science & Technology

    1985-02-06


  5. Subgroup analyses in randomised controlled trials: cohort study on trial protocols and journal publications.

    PubMed

    Kasenda, Benjamin; Schandelmaier, Stefan; Sun, Xin; von Elm, Erik; You, John; Blümle, Anette; Tomonaga, Yuki; Saccilotto, Ramon; Amstutz, Alain; Bengough, Theresa; Meerpohl, Joerg J; Stegert, Mihaela; Olu, Kelechi K; Tikkinen, Kari A O; Neumann, Ignacio; Carrasco-Labra, Alonso; Faulhaber, Markus; Mulla, Sohail M; Mertz, Dominik; Akl, Elie A; Bassler, Dirk; Busse, Jason W; Ferreira-González, Ignacio; Lamontagne, Francois; Nordmann, Alain; Gloy, Viktoria; Raatz, Heike; Moja, Lorenzo; Rosenthal, Rachel; Ebrahim, Shanil; Vandvik, Per O; Johnston, Bradley C; Walter, Martin A; Burnand, Bernard; Schwenkglenks, Matthias; Hemkens, Lars G; Bucher, Heiner C; Guyatt, Gordon H; Briel, Matthias

    2014-07-16

    To investigate the planning of subgroup analyses in protocols of randomised controlled trials and the agreement with corresponding full journal publications. Cohort of protocols of randomised controlled trials and subsequent full journal publications. Six research ethics committees in Switzerland, Germany, and Canada. 894 protocols of randomised controlled trials involving patients approved by participating research ethics committees between 2000 and 2003, and 515 subsequent full journal publications. Of 894 protocols of randomised controlled trials, 252 (28.2%) included one or more planned subgroup analyses. Of those, 17 (6.7%) provided a clear hypothesis for at least one subgroup analysis, 10 (4.0%) anticipated the direction of a subgroup effect, and 87 (34.5%) planned a statistical test for interaction. Industry sponsored trials more often planned subgroup analyses compared with investigator sponsored trials (195/551 (35.4%) v 57/343 (16.6%), P<0.001). Of 515 identified journal publications, 246 (47.8%) reported at least one subgroup analysis. In 81 (32.9%) of the 246 publications reporting subgroup analyses, authors stated that subgroup analyses were prespecified, but this was not supported by 28 (34.6%) of the corresponding protocols. In 86 publications, authors claimed a subgroup effect, but only 36 (41.9%) of the corresponding protocols reported a planned subgroup analysis. Subgroup analyses are insufficiently described in the protocols of randomised controlled trials submitted to research ethics committees, and investigators rarely specify the anticipated direction of subgroup effects. More than one third of statements in publications of randomised controlled trials about subgroup prespecification had no documentation in the corresponding protocols. Definitive judgments regarding the credibility of claimed subgroup effects are not possible without access to the protocols and analysis plans of randomised controlled trials. © The DISCO study group 2014.

  6. Application and Effects of Linguistic Functions on Information Retrieval in a German Language Full-Text Database: Comparison between Retrieval in Abstract and Full Text.

    ERIC Educational Resources Information Center

    Tauchert, Wolfgang; And Others

    1991-01-01

    Describes the PADOK-II project in Germany, which was designed to give information on the effects of linguistic algorithms on retrieval in a full-text database, the German Patent Information System (GPI). Relevance assessments are discussed, statistical evaluations are described, and searches are compared for the full-text section versus the…

  7. MEG and EEG data analysis with MNE-Python.

    PubMed

    Gramfort, Alexandre; Luessi, Martin; Larson, Eric; Engemann, Denis A; Strohmeier, Daniel; Brodbeck, Christian; Goj, Roman; Jas, Mainak; Brooks, Teon; Parkkonen, Lauri; Hämäläinen, Matti

    2013-12-26

    Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals generated by neuronal activity in the brain. Using these signals to characterize and locate neural activation in the brain is a challenge that requires expertise in physics, signal processing, statistics, and numerical methods. As part of the MNE software suite, MNE-Python is an open-source software package that addresses this challenge by providing state-of-the-art algorithms implemented in Python that cover multiple methods of data preprocessing, source localization, statistical analysis, and estimation of functional connectivity between distributed brain regions. All algorithms and utility functions are implemented in a consistent manner with well-documented interfaces, enabling users to create M/EEG data analysis pipelines by writing Python scripts. Moreover, MNE-Python is tightly integrated with the core Python libraries for scientific computation (NumPy, SciPy) and visualization (matplotlib and Mayavi), as well as the greater neuroimaging ecosystem in Python via the Nibabel package. The code is provided under the new BSD license allowing code reuse, even in commercial products. Although MNE-Python has only been under heavy development for a couple of years, it has rapidly evolved with expanded analysis capabilities and pedagogical tutorials because multiple labs have collaborated during code development to help share best practices. MNE-Python also gives easy access to preprocessed datasets, helping users to get started quickly and facilitating reproducibility of methods by other researchers. Full documentation, including dozens of examples, is available at http://martinos.org/mne.
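
    A minimal sketch of such a scripted pipeline, using MNE-Python's bundled sample dataset (path handling as in recent MNE versions; the first call downloads the data, so network access is required):

    ```python
    import mne

    # fetch the bundled audiovisual sample dataset and load the raw recording
    data_path = mne.datasets.sample.data_path()
    raw = mne.io.read_raw_fif(data_path / "MEG" / "sample" / "sample_audvis_raw.fif",
                              preload=True)

    raw.filter(l_freq=1.0, h_freq=40.0)                 # preprocessing: band-pass filter
    events = mne.find_events(raw, stim_channel="STI 014")

    # epoch around left-auditory stimuli and average to an evoked response
    epochs = mne.Epochs(raw, events, event_id={"auditory/left": 1},
                        tmin=-0.2, tmax=0.5, baseline=(None, 0))
    evoked = epochs.average()
    evoked.plot()
    ```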

  8. MEG and EEG data analysis with MNE-Python

    PubMed Central

    Gramfort, Alexandre; Luessi, Martin; Larson, Eric; Engemann, Denis A.; Strohmeier, Daniel; Brodbeck, Christian; Goj, Roman; Jas, Mainak; Brooks, Teon; Parkkonen, Lauri; Hämäläinen, Matti

    2013-01-01

    Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals generated by neuronal activity in the brain. Using these signals to characterize and locate neural activation in the brain is a challenge that requires expertise in physics, signal processing, statistics, and numerical methods. As part of the MNE software suite, MNE-Python is an open-source software package that addresses this challenge by providing state-of-the-art algorithms implemented in Python that cover multiple methods of data preprocessing, source localization, statistical analysis, and estimation of functional connectivity between distributed brain regions. All algorithms and utility functions are implemented in a consistent manner with well-documented interfaces, enabling users to create M/EEG data analysis pipelines by writing Python scripts. Moreover, MNE-Python is tightly integrated with the core Python libraries for scientific computation (NumPy, SciPy) and visualization (matplotlib and Mayavi), as well as the greater neuroimaging ecosystem in Python via the Nibabel package. The code is provided under the new BSD license allowing code reuse, even in commercial products. Although MNE-Python has only been under heavy development for a couple of years, it has rapidly evolved with expanded analysis capabilities and pedagogical tutorials because multiple labs have collaborated during code development to help share best practices. MNE-Python also gives easy access to preprocessed datasets, helping users to get started quickly and facilitating reproducibility of methods by other researchers. Full documentation, including dozens of examples, is available at http://martinos.org/mne. PMID:24431986

  9. A comparative study of inter-abutment distance of dies made from full arch dual-arch impression trays with those made from full arch stock trays: an in vitro study.

    PubMed

    Reddy, Jagan Mohan; Prashanti, E; Kumar, G Vinay; Suresh Sajjan, M C; Mathew, Xavier

    2009-01-01

    The dual-arch impression technique is convenient in that it makes the required maxillary and mandibular impressions, as well as the inter-occlusal record, in one procedure. The accuracy of the inter-abutment distance in dies fabricated from the dual-arch impression technique remains in question because little information is available in the literature. This study was conducted to compare the accuracy of the inter-abutment distance in dies obtained from full arch dual-arch trays with those obtained from full arch stock metal trays. The metal dual-arch trays showed the best accuracy, followed by the plastic dual-arch and stock dentulous trays, respectively, although the differences were not statistically significant. The pouring sequence had no statistically significant effect on the inter-abutment distance, though pouring the non-working side of the dual-arch impression first showed better accuracy.

  10. Multivariate random-parameters zero-inflated negative binomial regression model: an application to estimate crash frequencies at intersections.

    PubMed

    Dong, Chunjiao; Clarke, David B; Yan, Xuedong; Khattak, Asad; Huang, Baoshan

    2014-09-01

    Crash data are collected through police reports and integrated with road inventory data for further analysis. Integrated police reports and inventory data yield correlated multivariate data for roadway entities (e.g., segments or intersections). Analysis of such data reveals important relationships that can help focus on high-risk situations and develop safety countermeasures. To understand relationships between crash frequencies and associated variables, while taking full advantage of the available data, multivariate random-parameters models are appropriate since they can simultaneously consider the correlation among specific crash types and account for unobserved heterogeneity. However, a key issue that arises with correlated multivariate data is that the number of crash-free samples increases as crash counts are disaggregated into many categories. In this paper, we describe a multivariate random-parameters zero-inflated negative binomial (MRZINB) regression model for jointly modeling crash counts. The full Bayesian method is employed to estimate the model parameters. Crash frequencies at urban signalized intersections in Tennessee are analyzed. The paper investigates the performance of MZINB and MRZINB regression models in establishing the relationship between crash frequencies, pavement conditions, traffic factors, and geometric design features of roadway intersections. Compared to the MZINB model, the MRZINB model identifies additional statistically significant factors and provides better goodness of fit in developing the relationships. The empirical results show that the MRZINB model possesses most of the desirable statistical properties in terms of its ability to accommodate unobserved heterogeneity and excess zero counts in correlated data. Notably, in the random-parameters MZINB model, the estimated parameters vary significantly across intersections for different crash types. Copyright © 2014 Elsevier Ltd. All rights reserved.
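
    As a simplified, univariate stand-in for the MRZINB model (which requires full Bayesian estimation), a zero-inflated negative binomial regression can be sketched with statsmodels; the data and the exposure covariate below are simulated placeholders.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

    rng = np.random.default_rng(1)
    n = 500
    aadt = rng.normal(0, 1, n)               # hypothetical (standardized) traffic exposure
    X = sm.add_constant(aadt)

    lam = np.exp(0.5 + 0.8 * aadt)           # negative binomial mean
    is_zero = rng.random(n) < 0.3            # structural zeros (30% zero-inflation)
    # gamma-Poisson mixture generates NB counts with mean lam
    y = np.where(is_zero, 0, rng.poisson(rng.gamma(2.0, lam / 2.0)))

    model = ZeroInflatedNegativeBinomialP(y, X, exog_infl=X, p=2)  # NB2 variance
    res = model.fit(maxiter=200, disp=False)
    print(res.summary())
    ```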

  11. Detailed analysis of grid-based molecular docking: A case study of CDOCKER-A CHARMm-based MD docking algorithm.

    PubMed

    Wu, Guosheng; Robertson, Daniel H; Brooks, Charles L; Vieth, Michal

    2003-10-01

    The influence of various factors on the accuracy of protein-ligand docking is examined. The factors investigated include the role of a grid representation of protein-ligand interactions, the initial ligand conformation and orientation, the sampling rate of the energy hyper-surface, and the final minimization. A representative docking method is used to study these factors, namely, CDOCKER, a molecular dynamics (MD) simulated-annealing-based algorithm. A major emphasis in these studies is to compare the relative performance and accuracy of various grid-based approximations to explicit all-atom force field calculations. In these docking studies, the protein is kept rigid while the ligands are treated as fully flexible and a final minimization step is used to refine the docked poses. A docking success rate of 74% is observed when an explicit all-atom representation of the protein (full force field) is used, while a lower accuracy of 66-76% is observed for grid-based methods. All docking experiments considered a 41-member protein-ligand validation set. A significant improvement in accuracy (76 vs. 66%) for the grid-based docking is achieved if the explicit all-atom force field is used in a final minimization step to refine the docking poses. Statistical analysis shows that even lower-accuracy grid-based energy representations can be effectively used when followed with full force field minimization. The results of these grid-based protocols are statistically indistinguishable from the detailed atomic dockings and provide up to a sixfold reduction in computation time. For the test case examined here, improving the docking accuracy did not necessarily enhance the ability to estimate binding affinities using the docked structures. Copyright 2003 Wiley Periodicals, Inc.

  12. Computers as an Instrument for Data Analysis. Technical Report No. 11.

    ERIC Educational Resources Information Center

    Muller, Mervin E.

    A review of statistical data analysis involving computers as a multi-dimensional problem provides the perspective for consideration of the use of computers in statistical analysis and the problems associated with large data files. An overall description of STATJOB, a particular system for doing statistical data analysis on a digital computer,…

  13. Shock and Vibration Symposium (59th) Held in Albuquerque, New Mexico on 18-20 October 1988. Volume 1

    DTIC Science & Technology

    1988-10-01

    Partial contents: The Quest for Omega = √(K/M) -- Notes on the Development of Vibration Analysis; An Overview of Statistical Energy Analysis...; ...and in-plane vibration transmission in statistical energy analysis; Vibroacoustic response using the finite element method and statistical energy analysis; Helium...

  14. Parameter optimization in biased decoy-state quantum key distribution with both source errors and statistical fluctuations

    NASA Astrophysics Data System (ADS)

    Zhu, Jian-Rong; Li, Jian; Zhang, Chun-Mei; Wang, Qin

    2017-10-01

    The decoy-state method has been widely used in commercial quantum key distribution (QKD) systems. For practical decoy-state QKD with both source errors and statistical fluctuations, we propose a universal model of full parameter optimization in biased decoy-state QKD with phase-randomized sources. We then apply this model to simulations of two widely used sources: the weak coherent source (WCS) and the heralded single-photon source (HSPS). Results show that full parameter optimization can significantly improve not only the secure transmission distance but also the final key generation rate. Moreover, when source errors and statistical fluctuations are taken into account, the performance of decoy-state QKD using an HSPS suffers less than that using a WCS.
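
    To illustrate why such parameter optimization matters, the sketch below grid-searches the signal intensity in an idealized asymptotic GLLP-style BB84 rate model; it omits the paper's decoy estimation, biased basis choice, source errors, and finite-key fluctuations, and all parameter values are illustrative.

    ```python
    import numpy as np

    def h2(x):
        # binary entropy, clipped to avoid log(0)
        x = np.clip(x, 1e-12, 1 - 1e-12)
        return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

    eta = 10 ** (-0.2 * 50 / 10)   # channel transmittance, 50 km fiber at 0.2 dB/km
    y0, e_det, f_ec = 1e-5, 0.01, 1.16  # dark-count yield, misalignment, EC efficiency

    def key_rate(mu):
        q_mu = y0 + 1 - np.exp(-eta * mu)                       # signal gain
        e_mu = (0.5 * y0 + e_det * (1 - np.exp(-eta * mu))) / q_mu
        y1 = y0 + eta                                           # single-photon yield
        q1 = mu * np.exp(-mu) * y1                              # single-photon gain
        e1 = (0.5 * y0 + e_det * eta) / y1
        return 0.5 * (-q_mu * f_ec * h2(e_mu) + q1 * (1 - h2(e1)))

    mus = np.linspace(0.05, 1.0, 200)                           # grid search over mu
    best = mus[np.argmax([key_rate(m) for m in mus])]
    print(f"optimal signal intensity mu ~ {best:.2f}")
    ```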

  15. Angular photogrammetric analysis of the soft-tissue facial profile of Indian adults.

    PubMed

    Pandian, K Saravana; Krishnan, Sindhuja; Kumar, S Aravind

    2018-01-01

    Soft-tissue analysis has become an important component of orthodontic diagnosis and treatment planning. Photographic evaluation of an orthodontic patient is a very close representation of the appearance of the person. The previously established norms for soft-tissue analysis vary for different ethnic groups, so there is a need to develop soft-tissue facial profile norms pertaining to Indian ethnic groups. The aim of this study is to establish the angular photogrammetric standards of soft-tissue facial profile for Indian males and females and to compare the sexual dimorphism between them. The lateral profile photographs of 300 random participants (150 males and 150 females) between ages 18 and 25 years were taken and analyzed using FACAD tracing software. Inclusion criteria were Angle's Class I molar occlusion with acceptable crowding and proclination, normal growth and development with well-aligned dental arches, and a full complement of permanent teeth irrespective of third molar status. This study was conducted in an Indian population, and samples were taken from various cities across India. Descriptive statistical analysis was carried out, and sexual dimorphism was evaluated by Student's t-test between males and females. The results of the present study showed a statistically significant (P < 0.05) gender difference in 5 of 12 parameters in the Indian population. In the present study, soft-tissue facial measurements were established by means of photogrammetric analysis to facilitate orthodontists to carry out more quantitative evaluation and make disciplined decisions. The mean values obtained can be used for comparison with records of participants with the same characteristics by following this photogrammetric technique.

  16. Evaluation of breast reduction surgery effect on body posture and gait pattern using three-dimensional gait analysis.

    PubMed

    Sahin, Ismail; Iskender, Salim; Ozturk, Serdar; Balaban, Birol; Isik, Selcuk

    2013-06-01

    Breast hypertrophy is a significant health burden with symptoms of back and shoulder pain, intertrigo, and shoulder grooving from the bra straps. Women often rely on surgery to relieve these symptoms, and they are mostly satisfied with the results. The satisfaction from surgery usually is evaluated by subjective measures; objective evidence testing of the surgical outcomes is lacking. In this study, 10 women with breast hypertrophy underwent reduction mammaplasty. Their surgical outcomes were evaluated using three-dimensional gait analysis before surgery and 2 months afterward. A statistical difference was sought between the kinematic data of the spine, hip, knee, and ankle joints. The average maximum anterior pelvic tilt angles decreased 41%, and the average maximum spine anterior flexion angles decreased 30%. The difference between the pre- and postoperative values was statistically significant. The analysis of the kinematic data showed no significant difference in the hip, knee, or ankle joint angles postoperatively. The outcomes of breast reduction surgery have been evaluated mostly by subjective means until recently. As objective evidence of surgical gain in the current study, reduction mammaplasty resulted in the patients' improved body posture when walking. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors at www.springer.com/00266.

  17. The STAT7 Code for Statistical Propagation of Uncertainties In Steady-State Thermal Hydraulics Analysis of Plate-Fueled Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunn, Floyd E.; Hu, Lin-wen; Wilson, Erik

    The STAT code was written to automate many of the steady-state thermal hydraulic safety calculations for the MIT research reactor, both for conversion of the reactor from high-enrichment uranium fuel to low-enrichment uranium fuel and for future fuel reloads after the conversion. A Monte Carlo statistical propagation approach is used to treat uncertainties in important parameters in the analysis. These safety calculations are ultimately intended to protect against high fuel plate temperatures due to critical heat flux or departure from nucleate boiling or onset of flow instability; but additional margin is obtained by basing the limiting safety settings on avoiding onset of nucleate boiling (ONB). STAT7 can simultaneously analyze all of the axial nodes of all of the fuel plates and all of the coolant channels for one stripe of a fuel element. The stripes run the length of the fuel, from the bottom to the top. Power splits are calculated for each axial node of each plate to determine how much of the power goes out each face of the plate. By running STAT7 multiple times, full core analysis has been performed by analyzing the margin to ONB for each axial node of each stripe of each plate of each element in the core.
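
    The Monte Carlo propagation idea is generic and easy to sketch. The margin function and parameter distributions below are hypothetical placeholders, not the MIT reactor model: uncertain inputs are sampled, a thermal margin is computed per trial, and limits are read off the resulting distribution.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_trials = 100_000

    # sample uncertain inputs (illustrative means and standard deviations)
    power = rng.normal(1.00, 0.02, n_trials)    # relative channel power
    flow = rng.normal(1.00, 0.03, n_trials)     # relative coolant flow
    t_inlet = rng.normal(42.0, 1.0, n_trials)   # inlet temperature, deg C

    # hypothetical surrogate for clad temperature and margin to ONB, deg C
    t_onb = 125.0
    t_clad = t_inlet + 60.0 * power / flow      # toy heat-balance relation
    margin = t_onb - t_clad

    print("probability margin < 0:", np.mean(margin < 0))
    print("95th-percentile clad temperature:", np.percentile(t_clad, 95))
    ```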

  18. Eagle Plus Air Superiority into the 21st Century

    DTIC Science & Technology

    1996-04-01

    Partial contents: Data Collection Method; Statistical Trend Analysis; Statistical Readiness Analysis; Aging Aircraft... Data generated by Mr. Jeff Hill served as the foundation of our statistical analysis. Special thanks go to Mrs. Betsy Mullis, LFLL branch chief, and to...

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, S; Larsen, S; Wagoner, J

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization of full three-dimensional (3D) finite difference modeling, as well as statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project, in support of LLNL's national-security mission, benefits the U.S. military and intelligence community. Fiscal year (FY) 2003 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A three-seismic-array vehicle tracking testbed was installed on site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications. In FY03 specifically, a large and complex simulation experiment was conducted that tested the full modeling-based approach to geological characterization using E2D, the K-L statistical methodology, and matched field processing applied to tunnel detection with surface seismic sensors. The simulation validated the full methodology and the need for geological heterogeneity to be accounted for in the overall approach. The Lake Lynn site area was geologically modeled using the code Earthvision to produce a 32 million node 3D model grid for E3D. Model linking issues were resolved and a number of full 3D model runs were accomplished using shot locations that matched the data. E3D-generated wavefield movies showed the reflection signal would be too small to be observed in the data due to trapped and attenuated energy in the weathered layer.
An analysis of the few sensors coupled to bedrock did not improve the reflection signal strength sufficiently because the shots, though buried, were within the surface layer and hence attenuated. The ability to model a complex 3D geological structure and calculate synthetic seismograms that are in good agreement with actual data (especially for surface waves and below the complex weathered layer) was demonstrated. We conclude that E3D is a powerful tool for assessing the conditions under which a tunnel could be detected in a specific geological setting. Finally, the Lake Lynn tunnel explosion data were analyzed using standard array processing techniques. The results showed that single detonations could be detected and located but simultaneous detonations would require a strategic placement of arrays.

  20. The Content of Statistical Requirements for Authors in Biomedical Research Journals

    PubMed Central

    Liu, Tian-Yi; Cai, Si-Yu; Nie, Xiao-Lu; Lyu, Ya-Qi; Peng, Xiao-Xia; Feng, Guo-Shuang

    2016-01-01

    Background: Robust statistical design, sound statistical analysis, and standardized presentation are important to enhance the quality and transparency of biomedical research. This systematic review was conducted to summarize the statistical reporting requirements introduced by biomedical research journals with an impact factor of 10 or above, so that researchers give statistical issues serious consideration not only at the stage of data analysis but also at the stage of methodological design. Methods: Detailed statistical instructions for authors were downloaded from the homepage of each of the included journals or obtained directly from the editors via email. We then described the types and numbers of statistical guidelines introduced by different press groups. Items of statistical reporting guidelines, as well as particular requirements, were summarized by frequency and grouped into design, method of analysis, and presentation. Finally, updated statistical guidelines and particular requirements for improvement were summed up. Results: In total, 21 of 23 press groups introduced at least one statistical guideline. More than half of the press groups update their statistical instructions for authors as new statistical reporting guidelines are issued. In addition, 16 press groups, covering 44 journals, address particular statistical requirements. Most of the particular requirements focus on the performance of statistical analysis and transparency in statistical reporting, including "address issues relevant to research design, including participant flow diagram, eligibility criteria, and sample size estimation," and "statistical methods and the reasons." Conclusions: Statistical requirements for authors are becoming increasingly refined. They remind researchers to give sufficient consideration not only to statistical methods during research design but also to standardized statistical reporting, which would be beneficial in providing stronger evidence and making critical appraisal of evidence more accessible. PMID:27748343

  1. The Content of Statistical Requirements for Authors in Biomedical Research Journals.

    PubMed

    Liu, Tian-Yi; Cai, Si-Yu; Nie, Xiao-Lu; Lyu, Ya-Qi; Peng, Xiao-Xia; Feng, Guo-Shuang

    2016-10-20

    Robust statistical design, sound statistical analysis, and standardized presentation are important to enhance the quality and transparency of biomedical research. This systematic review was conducted to summarize the statistical reporting requirements introduced by biomedical research journals with an impact factor of 10 or above, so that researchers give statistical issues serious consideration not only at the stage of data analysis but also at the stage of methodological design. Detailed statistical instructions for authors were downloaded from the homepage of each of the included journals or obtained directly from the editors via email. We then described the types and numbers of statistical guidelines introduced by different press groups. Items of statistical reporting guidelines, as well as particular requirements, were summarized by frequency and grouped into design, method of analysis, and presentation. Finally, updated statistical guidelines and particular requirements for improvement were summed up. In total, 21 of 23 press groups introduced at least one statistical guideline. More than half of the press groups update their statistical instructions for authors as new statistical reporting guidelines are issued. In addition, 16 press groups, covering 44 journals, address particular statistical requirements. Most of the particular requirements focus on the performance of statistical analysis and transparency in statistical reporting, including "address issues relevant to research design, including participant flow diagram, eligibility criteria, and sample size estimation," and "statistical methods and the reasons." Statistical requirements for authors are becoming increasingly refined. They remind researchers to give sufficient consideration not only to statistical methods during research design but also to standardized statistical reporting, which would be beneficial in providing stronger evidence and making critical appraisal of evidence more accessible.

  2. Numerical Estimation of Sound Transmission Loss in Launch Vehicle Payload Fairing

    NASA Astrophysics Data System (ADS)

    Chandana, Pawan Kumar; Tiwari, Shashi Bhushan; Vukkadala, Kishore Nath

    2017-08-01

    Coupled acoustic-structural analysis of a typical launch vehicle composite payload fairing is carried out, and the results are validated with experimental data. Depending on the frequency range of interest, prediction of the vibro-acoustic behavior of a structure is usually done using the finite element method, the boundary element method, or statistical energy analysis. The present study focuses on the low-frequency dynamic behavior of a composite payload fairing structure using both coupled and uncoupled vibro-acoustic finite element models up to 710 Hz. A vibro-acoustic model, characterizing the interaction between the fairing structure, air cavity, and satellite, is developed. The external sound pressure levels specified for the payload fairing's acoustic test are considered as external loads for the analysis. The analysis methodology is validated by comparing the interior noise levels with those obtained from full-scale acoustic tests conducted in a reverberation chamber. The present approach has application in the design and optimization of acoustic control mechanisms at lower frequencies.

  3. Relating N2O emissions during biological nitrogen removal with operating conditions using multivariate statistical techniques.

    PubMed

    Vasilaki, V; Volcke, E I P; Nandi, A K; van Loosdrecht, M C M; Katsou, E

    2018-04-26

    Multivariate statistical analysis was applied to investigate the dependencies and underlying patterns between N2O emissions and online operational variables (dissolved oxygen and nitrogen component concentrations, temperature, and influent flow-rate) during biological nitrogen removal from wastewater. The system under study was a full-scale reactor, for which hourly sensor data were available. The 15-month monitoring campaign was divided into 10 sub-periods based on the profile of N2O emissions, using Binary Segmentation. The dependencies between operating variables and N2O emissions fluctuated according to Spearman's rank correlation. The correlation between N2O emissions and nitrite concentrations ranged between 0.51 and 0.78. Correlation >0.7 between N2O emissions and nitrate concentrations was observed in sub-periods with average temperature lower than 12 °C. Hierarchical k-means clustering and principal component analysis linked N2O emission peaks with precipitation events and ammonium concentrations higher than 2 mg/L, especially in sub-periods characterized by low N2O fluxes. Additionally, the highest ranges of measured N2O fluxes belonged to clusters corresponding with NO3-N concentrations less than 1 mg/L in the upstream plug-flow reactor (middle of oxic zone), indicating slow nitrification rates. The results showed that the range of N2O emissions partially depends on the prior behavior of the system. The principal component analysis validated the findings from the clustering analysis and showed that ammonium, nitrate, nitrite, and temperature explained a considerable percentage of the variance in the system for the majority of the sub-periods. The applied statistical methods linked the different ranges of emissions with the system variables, provided insights into the effect of operating conditions on N2O emissions in each sub-period, and can be integrated into N2O emissions data processing at wastewater treatment plants. Copyright © 2018. Published by Elsevier Ltd.
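
    The core of this workflow (rank correlation, clustering, PCA) can be sketched as follows; the sensor matrix and column meanings are illustrative, and the Binary Segmentation change-point step is omitted (a package such as ruptures could provide it).

    ```python
    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # hypothetical hourly sensor matrix: columns might be N2O flux, NH4, NO3,
    # NO2, DO, temperature, influent flow (random placeholders here)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 7))
    n2o = X[:, 0]

    rho, p = spearmanr(n2o, X[:, 3])          # e.g., N2O flux vs nitrite
    print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")

    Z = StandardScaler().fit_transform(X)     # standardize before clustering/PCA
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Z)

    pca = PCA(n_components=2).fit(Z)
    print("variance explained:", pca.explained_variance_ratio_)
    ```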

  4. Application of Statistics in Engineering Technology Programs

    ERIC Educational Resources Information Center

    Zhan, Wei; Fink, Rainer; Fang, Alex

    2010-01-01

    Statistics is a critical tool for robustness analysis, measurement system error analysis, test data analysis, probabilistic risk assessment, and many other fields in the engineering world. Traditionally, however, statistics is not extensively used in undergraduate engineering technology (ET) programs, resulting in a major disconnect from industry…

  5. Tri-Center Analysis: Determining Measures of Trichotomous Central Tendency for the Parametric Analysis of Tri-Squared Test Results

    ERIC Educational Resources Information Center

    Osler, James Edward

    2014-01-01

    This monograph provides an epistemological rationale for the design of a novel post hoc statistical measure called "Tri-Center Analysis". This new statistic is designed to analyze the post hoc outcomes of the Tri-Squared Test. In Tri-Center Analysis, trichotomous inferential parametric statistical measures are calculated from…

  6. Interventions for unilateral and bilateral refractive amblyopia.

    PubMed

    Taylor, Kate; Powell, Christine; Hatt, Sarah R; Stewart, Catherine

    2012-04-18

    Refractive amblyopia is a common cause of reduced visual acuity in childhood, but optimal treatment is not well defined. This review examined the treatment effect from spectacles and conventional occlusion. Evaluation of the evidence of the effectiveness of spectacles, occlusion or both in the treatment of unilateral and bilateral refractive amblyopia. We searched CENTRAL (which contains the Cochrane Eyes and Vision Group Trials Register) (The Cochrane Library 2012, Issue 1), MEDLINE (January 1950 to January 2012), EMBASE (January 1980 to January 2012), Latin American and Caribbean Health Sciences Literature Database (LILACS) (January 1982 to January 2012), the metaRegister of Controlled Trials (mRCT) (www.controlled-trials.com), ClinicalTrials.gov (www.clinicaltrials.gov) and the WHO International Clinical Trials Registry Platform (ICTRP) (www.who.int/ictrp/search/en). There were no date or language restrictions in the electronic searches for trials. We last searched the electronic databases on 24 January 2012. We manually searched relevant conference proceedings. Randomised controlled trials of treatment for unilateral and bilateral refractive amblyopia by spectacles, with or without occlusion, were eligible. We included studies with participants of any age. Two authors independently assessed abstracts identified by the searches. We obtained full-text copies and contacted study authors where necessary. Eleven trials were eligible for inclusion. We extracted data from eight. Insufficient data were present for the remaining three trials so data extraction was not possible. We identified no trials as containing participants with bilateral amblyopia. We performed no meta-analysis as there were insufficient trials for each outcome. For all studies mean acuity (standard deviation (SD)) in the amblyopic eye post-treatment was reported. All included trials reported treatment for unilateral refractive amblyopia.One study randomised participants to spectacles only compared to no treatment, spectacles plus occlusion compared to no treatment and spectacles plus occlusion versus spectacles only. For spectacles only versus no treatment, mean (SD) visual acuity was: spectacles group 0.31 (0.17); no treatment group 0.42 (0.19) and mean difference (MD) between groups was -0.11 (borderline statistical significance: 95% confidence interval (CI) -0.22 to 0.00). For spectacles plus occlusion versus no treatment, mean (SD) visual acuity was: full treatment 0.22 (0.13); no treatment 0.42 (0.19). Mean difference (MD) between the groups -0.20 (statistically significant: 95% CI -0.30 to -0.10). For spectacles plus occlusion versus spectacles only, MD was -0.09 (borderline statistical significance 95% CI -0.18 to 0.00). For two other trials that also looked at this comparison MD was -0.15 (not statistically significant 95% CI -0.32 to 0.02) for one trial and MD 0.01 (not statistically significant 95% CI -0.08 to 0.10) for the second trial.Three trials reviewed occlusion regimes.One trial looked at two hours versus six hours for moderate amblyopia: MD 0.01 (not statistically significant: 95% CI -0.06 to 0.08); a second trial 2003b reviewed six hours versus full-time for severe amblyopia: MD 0.03 (not statistically significant: 95% CI -0.08 to 0.14) and a third trial looked at six hours versus full-time occlusion: MD -0.12 (not statistically significant: 95% CI -0.27 to 0.03). 
One trial looked at occlusion supplemented with near or distance activities: MD -0.03 (not statistically significant: 95% CI -0.09 to 0.03). One trial looked at partial occlusion and glasses versus glasses only: MD -0.01 (not statistically significant: 95% CI -0.05 to 0.03). In some cases of unilateral refractive amblyopia it appears that there is a treatment benefit from refractive correction alone. Where amblyopia persists there is evidence that adding occlusion further improves vision. Despite advances in the understanding of the treatment of amblyopia it is currently still not possible to tailor individual treatment plans for amblyopia. The nature of any dose/response effect from occlusion still needs to be clarified. Partial occlusion appears to have the same treatment effect as glasses alone when started simultaneously for the treatment of unilateral refractive amblyopia. Treatment regimes for bilateral and unilateral refractive amblyopia need to be investigated further.
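
    For reference, the review's mean differences and 95% confidence intervals follow from the group summaries via MD = m1 - m2 and SE = sqrt(s1^2/n1 + s2^2/n2). The sketch below uses the spectacles-versus-no-treatment means and SDs quoted above with hypothetical group sizes of 22 per arm (the actual sizes are not given here), which happens to reproduce the published interval to two decimals.

    ```python
    import math

    def mean_difference(m1, s1, n1, m2, s2, n2):
        """Mean difference with a normal-approximation 95% CI."""
        md = m1 - m2
        se = math.sqrt(s1**2 / n1 + s2**2 / n2)
        return md, (md - 1.96 * se, md + 1.96 * se)

    # spectacles 0.31 (0.17) vs no treatment 0.42 (0.19); n = 22 is assumed
    md, ci = mean_difference(0.31, 0.17, 22, 0.42, 0.19, 22)
    print(f"MD = {md:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
    ```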

  7. Hematological parameters of human immunodeficiency virus positive pregnant women on antiretroviral therapy in Aminu Kano Teaching Hospital Kano, North Western Nigeria.

    PubMed

    Abdulqadir, Ibrahim; Ahmed, Sagir Gumel; Kuliya, Aisha Gwarzo; Tukur, Jamilu; Yusuf, Aminu Abba; Musa, Abubakar Umar

    2018-01-01

    The human immunodeficiency virus (HIV) scourge continues to affect young women within the reproductive age group, and pregnancy is a recognized indication for the use of antiretroviral (ARV) drugs among HIV-positive women. The aim was to determine the combined effect of pregnancy, HIV, and ARV drugs on the hematological parameters of pregnant women. This was a comparative cross-sectional study conducted among 70 each of HIV-positive and HIV-negative pregnant women. Bio-demographic and clinical data were extracted from the client folder, and 4 ml of blood was obtained from each participant. A full blood count was generated using a Swelab automatic hematology analyzer, while the reticulocyte count and erythrocyte sedimentation rate (ESR) were determined manually. Data analysis was performed using SPSS software version 16, and P < 0.05 was considered statistically significant. Pregnant women with HIV had significantly lower hematocrit and white blood cell (WBC) counts and higher ESR than pregnant women without HIV (P < 0.001). There was no statistically significant difference between the two groups in terms of platelet and reticulocyte counts (P > 0.05). However, among HIV-positive pregnant women, those with a CD4 count <350/μL had significantly lower WBC and lymphocyte counts than those with a CD4 count ≥350/μL (P < 0.05), whereas those on zidovudine (AZT)-containing treatment had significantly lower hematocrit and higher mean cell volume than those on non-AZT-containing treatment (P < 0.05); there was no statistically significant difference in any of the hematological parameters (P > 0.05) between women on first- and second-line ARV regimens. There is a significant difference in hematological parameters between HIV-positive and HIV-negative pregnant women in this environment.

  8. Teaching statistics in biology: using inquiry-based learning to strengthen understanding of statistical analysis in biology laboratory courses.

    PubMed

    Metz, Anneke M

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study.

  9. Meteorological Development Laboratory Student Career Experience Program

    NASA Astrophysics Data System (ADS)

    McCalla, C., Sr.

    2007-12-01

    The National Oceanic and Atmospheric Administration's (NOAA) National Weather Service (NWS) provides weather, hydrologic, and climate forecasts and warnings for the protection of life and property and the enhancement of the national economy. The NWS's Meteorological Development Laboratory (MDL) supports this mission by developing meteorological prediction methods. Given this mission, NOAA, NWS, and MDL all have a need to continually recruit talented scientists. One avenue for recruiting such talented scientists is the Student Career Experience Program (SCEP). Through SCEP, MDL offers undergraduate and graduate students majoring in meteorology, computer science, mathematics, oceanography, physics, and statistics the opportunity to alternate full-time paid employment with periods of full-time study. Using SCEP as a recruiting vehicle, MDL has employed students who possess some of the very latest technical skills and knowledge needed to make meaningful contributions to projects within the lab. MDL has recently expanded its use of SCEP and has increased the number of students (sometimes called co-ops) in its program. As a co-op, a student can expect to develop and implement computer-based scientific techniques, participate in the development of statistical algorithms, assist in the analysis of meteorological data, and verify forecasts. This presentation will focus on describing recruitment, projects, and the application process related to MDL's SCEP. In addition, this presentation will also briefly explore the career paths of students who successfully completed the program.

  10. Teaching Principles of Linkage and Gene Mapping with the Tomato.

    ERIC Educational Resources Information Center

    Hawk, James A.; And Others

    1980-01-01

    A three-point linkage system in tomatoes is used to explain concepts of gene mapping, linkage, and statistical analysis. The system is designed for teaching the effective use of statistics and the power of genetic analysis from statistical analysis of phenotypic ratios. (Author/SA)
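
    The statistical core of such an exercise is a goodness-of-fit test against independent assortment. The sketch below uses invented two-point testcross counts (not the article's tomato data): a significant chi-square against the expected 1:1:1:1 ratio indicates linkage, and the recombinant fraction gives an approximate map distance.

    ```python
    from scipy.stats import chisquare

    # testcross progeny: two parental classes, two recombinant classes (invented)
    counts = [310, 287, 58, 45]
    total = sum(counts)

    chi2, p = chisquare(counts, f_exp=[total / 4] * 4)
    print(f"chi2 = {chi2:.1f}, p = {p:.3g}")  # small p: reject independence -> linkage

    rf = (counts[2] + counts[3]) / total      # recombination frequency
    print(f"map distance ~ {100 * rf:.1f} cM")
    ```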

  11. APPLICATION OF STATISTICAL ENERGY ANALYSIS TO VIBRATIONS OF MULTI-PANEL STRUCTURES.

    DTIC Science & Technology

    cylindrical shell are compared with predictions obtained from statistical energy analysis. Generally good agreement is observed. The flow of mechanical...the coefficients of proportionality between power flow and average modal energy difference, which one must know in order to apply statistical energy analysis. No

  12. 76 FR 11195 - Request for Nominations of Members To Serve on the Census Scientific Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-01

    ..., econometrics, cognitive psychology, and computer science as they pertain to the full range of Census Bureau... technical expertise from the following disciplines: demography, economics, geography, psychology, statistics..., psychology, statistics, survey methodology, social and behavioral sciences, Information Technology, computing...

  13. The German Registry of immune tolerance treatment in hemophilia--1999 update.

    PubMed

    Lenk, H

    2000-10-01

    As of 1999, the German registry of immune tolerance treatment (ITT) in hemophilia has received reports on 146 patients who have undergone this therapy from 25 hemophilia centers. In 16 of the reported patients, treatment is ongoing. Therapy has been completed in 126 patients of all groups with hemophilia A; most of them are children. In 78.6% of hemophilia A patients full success was achieved, 8.7% finished with partial success, and in 12.7% ITT failed. Statistical analysis demonstrates that interruptions of therapy have a negative influence on success. The inhibitor titer has the highest predictive value for success or failure of therapy: a high maximum titer as well as a high titer at the start of treatment were related to a low success rate. Other variables such as exposure days and the time interval between inhibitor detection and start of ITT were not statistically significant. Four patients with hemophilia B have also completed therapy, only one of them with success.

  14. Diffraction based Hanbury Brown and Twiss interferometry at a hard x-ray free-electron laser

    DOE PAGES

    Gorobtsov, O. Yu.; Mukharamova, N.; Lazarev, S.; ...

    2018-02-02

    X-ray free-electron lasers (XFELs) provide extremely bright and highly spatially coherent x-ray radiation with femtosecond pulse duration. Currently, they are widely used in biology and material science. Knowledge of the XFEL statistical properties during an experiment may be vitally important for the accurate interpretation of the results. Here, for the first time, we demonstrate Hanbury Brown and Twiss (HBT) interferometry performed in diffraction mode at an XFEL source. It allowed us to determine the XFEL statistical properties directly from the Bragg peaks originating from colloidal crystals. This approach is different from the traditional one when HBT interferometry is performed in the direct beam without a sample. Our analysis has demonstrated nearly full (80%) global spatial coherence of the XFEL pulses and an average pulse duration on the order of ten femtoseconds for the monochromatized beam, which is significantly shorter than expected from the electron bunch measurements.
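
    The quantity behind HBT interferometry is the normalized second-order intensity correlation g2 = <I1 I2> / (<I1><I2>), which approaches 2 for chaotic light measured within one coherence area and falls toward 1 as coherence is lost. A toy simulation of the zero-delay, fully correlated case (illustrative only, not the experiment's diffraction-mode analysis):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # chaotic (thermal) light: complex Gaussian field -> exponential intensity
    field = (rng.normal(size=10_000) + 1j * rng.normal(size=10_000)) / np.sqrt(2)
    i1 = np.abs(field) ** 2       # intensity at detector 1
    i2 = i1.copy()                # detector 2 sees the same speckle (full correlation)

    g2 = np.mean(i1 * i2) / (np.mean(i1) * np.mean(i2))
    print(f"g2 = {g2:.2f}")       # ~2 for chaotic light at zero delay
    ```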

  15. Extracting semantic representations from word co-occurrence statistics: stop-lists, stemming, and SVD.

    PubMed

    Bullinaria, John A; Levy, Joseph P

    2012-09-01

    In a previous article, we presented a systematic computational study of the extraction of semantic representations from the word-word co-occurrence statistics of large text corpora. The conclusion was that semantic vectors of pointwise mutual information values from very small co-occurrence windows, together with a cosine distance measure, consistently resulted in the best representations across a range of psychologically relevant semantic tasks. This article extends that study by investigating the use of three further factors--namely, the application of stop-lists, word stemming, and dimensionality reduction using singular value decomposition (SVD)--that have been used to provide improved performance elsewhere. It also introduces an additional semantic task and explores the advantages of using a much larger corpus. This leads to the discovery and analysis of improved SVD-based methods for generating semantic representations (that provide new state-of-the-art performance on a standard TOEFL task) and the identification and discussion of problems and misleading results that can arise without a full systematic study.
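
    A compact sketch of the pipeline studied here: co-occurrence counts to positive pointwise mutual information (PPMI), truncated SVD, then cosine similarity. The 4x4 count matrix is a toy placeholder, not corpus data.

    ```python
    import numpy as np

    # toy word-word co-occurrence counts
    C = np.array([[10.0, 2, 0, 1],
                  [2, 8, 1, 0],
                  [0, 1, 6, 3],
                  [1, 0, 3, 7]])

    total = C.sum()
    p_ij = C / total
    p_i = C.sum(axis=1, keepdims=True) / total
    p_j = C.sum(axis=0, keepdims=True) / total
    with np.errstate(divide="ignore"):
        pmi = np.log(p_ij / (p_i * p_j))
    ppmi = np.maximum(pmi, 0)                  # keep only positive associations

    U, S, Vt = np.linalg.svd(ppmi)             # dimensionality reduction via SVD
    k = 2
    vectors = U[:, :k] * S[:k]                 # k-dimensional word vectors

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    print(cosine(vectors[0], vectors[1]))
    ```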

  16. pyblocxs: Bayesian Low-Counts X-ray Spectral Analysis in Sherpa

    NASA Astrophysics Data System (ADS)

    Siemiginowska, A.; Kashyap, V.; Refsdal, B.; van Dyk, D.; Connors, A.; Park, T.

    2011-07-01

    Typical X-ray spectra have low counts and should be modeled using the Poisson distribution. However, the χ2 statistic is often applied as an alternative, and the data are assumed to follow the Gaussian distribution. Various weightings of the statistic or binnings of the data are used to overcome the low-count issues. However, such modifications introduce biases and/or a loss of information. Standard modeling packages such as XSPEC and Sherpa provide the Poisson likelihood and allow computation of rudimentary MCMC chains, but so far do not allow setting up a full Bayesian model. We have implemented a sophisticated Bayesian MCMC-based algorithm to carry out spectral fitting of low-count sources in the Sherpa environment. The code is a Python extension to Sherpa and allows fitting a predefined Sherpa model to high-energy X-ray spectral data and other generic data. We present the algorithm and discuss several issues related to the implementation, including flexible definition of priors and allowing for variations in the calibration information.
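
    The essence of the approach, MCMC over a Poisson likelihood rather than χ2 fitting, can be sketched generically; this is not the pyblocxs code itself, and the power-law model and flat positivity prior are assumptions made for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    energies = np.linspace(0.5, 7.0, 60)              # keV, illustrative grid
    truth = 20.0 * energies ** -1.7
    counts = rng.poisson(truth)                       # simulated low-count spectrum

    def log_post(theta):
        norm, gamma = theta
        if norm <= 0:                                 # flat prior, positivity only
            return -np.inf
        model = norm * energies ** -gamma
        return np.sum(counts * np.log(model) - model) # Poisson log likelihood

    theta = np.array([10.0, 1.0])
    lp = log_post(theta)
    chain = []
    for _ in range(20_000):                           # random-walk Metropolis
        prop = theta + rng.normal(0, [0.5, 0.02])
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)

    print("posterior mean:", np.mean(chain[5000:], axis=0))  # discard burn-in
    ```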

  17. Diffraction based Hanbury Brown and Twiss interferometry at a hard x-ray free-electron laser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorobtsov, O. Yu.; Mukharamova, N.; Lazarev, S.

    X-ray free-electron lasers (XFELs) provide extremely bright and highly spatially coherent x-ray radiation with femtosecond pulse duration. Currently, they are widely used in biology and material science. Knowledge of the XFEL statistical properties during an experiment may be vitally important for the accurate interpretation of the results. Here, for the first time, we demonstrate Hanbury Brown and Twiss (HBT) interferometry performed in diffraction mode at an XFEL source. It allowed us to determine the XFEL statistical properties directly from the Bragg peaks originating from colloidal crystals. This approach is different from the traditional one when HBT interferometry is performed in the direct beam without a sample. Our analysis has demonstrated nearly full (80%) global spatial coherence of the XFEL pulses and an average pulse duration on the order of ten femtoseconds for the monochromatized beam, which is significantly shorter than expected from the electron bunch measurements.

  18. Outcry Consistency and Prosecutorial Decisions in Child Sexual Abuse Cases.

    PubMed

    Bracewell, Tammy E

    2018-05-18

    This study examines the correlation between the consistency of a child's sexual abuse outcry and the prosecutorial decision to accept or reject cases of child sexual abuse. Case-specific information was obtained from one Texas Children's Advocacy Center on all cases from 2010 to 2013. After the necessary deletions, the total number of cases included in the analysis was 309. An outcry was defined as a sexual abuse disclosure. Consistency was measured at both the forensic interview and the sexual assault exam. Logistic regression was used to evaluate whether a correlation existed between disclosure and prosecutorial decisions. Disclosure was statistically significant: partial disclosure (disclosure at one point in time and denial at another) versus full disclosure (disclosure at two points in time) had a statistically significant odds ratio of 4.801. Implications are discussed, specifically how the different disciplines involved in child protection should take advantage of the expertise of both forensic interviewers and forensic nurses to inform their decisions.

  19. Real-Time Noise Removal for Line-Scanning Hyperspectral Devices Using a Minimum Noise Fraction-Based Approach

    PubMed Central

    Bjorgan, Asgeir; Randeberg, Lise Lyngsnes

    2015-01-01

    Processing line-by-line and in real-time can be convenient for some applications of line-scanning hyperspectral imaging technology. Some types of processing, like inverse modeling and spectral analysis, can be sensitive to noise. The MNF (minimum noise fraction) transform provides suitable denoising performance, but requires full image availability for the estimation of image and noise statistics. In this work, a modified algorithm is proposed. Incrementally-updated statistics enables the algorithm to denoise the image line-by-line. The denoising performance has been compared to conventional MNF and found to be equal. With a satisfying denoising performance and real-time implementation, the developed algorithm can denoise line-scanned hyperspectral images in real-time. The elimination of waiting time before denoised data are available is an important step towards real-time visualization of processed hyperspectral data. The source code can be found at http://www.github.com/ntnu-bioopt/mnf. This includes an implementation of conventional MNF denoising. PMID:25654717
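
    For orientation, a batch MNF denoising step looks as follows (the article's contribution is replacing these batch statistics with incrementally updated, line-by-line estimates). The shift-difference noise estimate is a common heuristic, and the data here are random placeholders rather than a hyperspectral image.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    def mnf_denoise(X, n_keep):
        """Batch MNF denoising sketch. X: pixels x bands."""
        # estimate noise covariance from differences of neighboring pixels
        dX = np.diff(X, axis=0)
        sigma_noise = dX.T @ dX / (2 * (dX.shape[0] - 1))

        Xc = X - X.mean(axis=0)
        sigma = Xc.T @ Xc / (Xc.shape[0] - 1)

        # generalized eigenproblem sigma v = lambda sigma_noise v;
        # large lambda corresponds to high signal-to-noise components
        w, V = eigh(sigma, sigma_noise)
        V = V[:, np.argsort(w)[::-1]]

        T = Xc @ V                              # forward MNF transform
        T[:, n_keep:] = 0                       # discard noise-dominated components
        return T @ np.linalg.pinv(V) + X.mean(axis=0)

    X = np.random.default_rng(0).normal(size=(500, 40))
    denoised = mnf_denoise(X, n_keep=10)
    ```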

  20. Statistical analysis of lithium iron sulfide status cell cycle life and failure mode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gay, E.C.; Battles, J.E.; Miller, W.E.

    1983-08-01

    A statistical model was developed for cycle-life testing of electrochemical cells and verified experimentally. The Weibull distribution was selected to predict the end of life for a cell, based on a 20 percent loss of initial stabilized capacity or a decrease to less than 95 percent coulombic efficiency. Groups of 12 or more Li-alloy/FeS cells were cycled to determine the mean time to failure (MTTF) and also to identify the failure modes. The cells were all full-size electric vehicle cells with 150-350 A-hr capacity. The Weibull shape factors were determined and verified in prediction of the number of cell failures in two 10-cell modules. The short-circuit failures in the cells with BN-felt and MgO powder separators were found to be caused by the formation of Li-Al protrusions that penetrated the BN-felt separators, and the extrusion of active material at the edge of the electrodes.
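
    Given fitted Weibull parameters, the mean time (here, cycles) to failure follows as MTTF = scale × Γ(1 + 1/shape). A sketch with simulated failure data (not the report's cell data):

    ```python
    import numpy as np
    from scipy.stats import weibull_min
    from scipy.special import gamma

    rng = np.random.default_rng(7)
    # simulate 12 cells failing with Weibull-distributed cycle life
    cycles_to_failure = weibull_min.rvs(c=2.5, scale=400.0, size=12, random_state=rng)

    # fit with location fixed at zero (two-parameter Weibull)
    shape, loc, scale = weibull_min.fit(cycles_to_failure, floc=0)
    mttf = scale * gamma(1 + 1 / shape)
    print(f"shape = {shape:.2f}, scale = {scale:.0f} cycles, MTTF = {mttf:.0f} cycles")
    ```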

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaurov, Alexander A., E-mail: kaurov@uchicago.edu

    The methods for studying the epoch of cosmic reionization vary from full radiative transfer simulations to purely analytical models. While numerical approaches are computationally expensive and are not suitable for generating many mock catalogs, analytical methods are based on assumptions and approximations. We explore the interconnection between both methods. First, we ask how the analytical framework of excursion set formalism can be used for statistical analysis of numerical simulations and visual representation of the morphology of ionization fronts. Second, we explore the methods of training the analytical model on a given numerical simulation. We present a new code which emerged from this study. Its main application is to match the analytical model with a numerical simulation. Then, it allows one to generate mock reionization catalogs with volumes exceeding the original simulation quickly and computationally inexpensively, meanwhile reproducing large-scale statistical properties. These mock catalogs are particularly useful for cosmic microwave background polarization and 21 cm experiments, where large volumes are required to simulate the observed signal.

  2. Statics and Dynamics of Selfish Interactions in Distributed Service Systems

    PubMed Central

    Altarelli, Fabrizio; Braunstein, Alfredo; Dall’Asta, Luca

    2015-01-01

    We study a class of games which models the competition among agents to access some service provided by distributed service units and which exhibits congestion and frustration phenomena when service units have limited capacity. We propose a technique, based on the cavity method of statistical physics, to characterize the full spectrum of Nash equilibria of the game. The analysis reveals a large variety of equilibria, with very different statistical properties. Natural selfish dynamics, such as best-response, usually tend to large-utility equilibria, even though those of smaller utility are exponentially more numerous. Interestingly, the latter actually can be reached by selecting the initial conditions of the best-response dynamics close to the saturation limit of the service unit capacities. We also study a more realistic stochastic variant of the game by means of a simple and effective approximation of the average over the random parameters, showing that the properties of the average-case Nash equilibria are qualitatively similar to the deterministic ones. PMID:26177449
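
    A toy best-response dynamic for such a game can be sketched in a few lines; the linear congestion cost and all parameter values are illustrative, not the paper's capacity-constrained utility model. As the abstract emphasizes, the equilibrium reached depends on the initial assignment.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_agents, n_units = 60, 6
    choice = rng.integers(n_units, size=n_agents)   # initial conditions matter

    def loads(choice):
        return np.bincount(choice, minlength=n_units)

    changed = True
    while changed:                                  # iterate best responses to a fixed point
        changed = False
        for a in range(n_agents):
            l = loads(choice)
            l[choice[a]] -= 1                       # loads excluding agent a
            best = int(np.argmin(l))                # cheapest unit for agent a
            if l[best] < l[choice[a]]:              # strictly lower congestion cost
                choice[a] = best
                changed = True

    print("equilibrium loads:", loads(choice))      # a (pure) Nash equilibrium
    ```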

  3. A Frequency Domain Approach to Pretest Analysis Model Correlation and Model Updating for the Mid-Frequency Range

    DTIC Science & Technology

    2009-02-01

    The frequency band between the low-frequency range of modal analysis and the high-frequency region of statistical energy analysis is referred to as the mid-frequency range. ... The averaging process is consistent with the averaging done in statistical energy analysis for stochastic systems. The FEM will always ...

  4. The Shock and Vibration Bulletin. Part 2. Invited Papers, Structural Dynamics

    DTIC Science & Technology

    1974-08-01

    Contents include VIKING LANDER DYNAMICS (Joseph C. Pohlen, Martin Marietta Aerospace, Denver, Colorado) and, under Structural Dynamics, PERFORMANCE OF STATISTICAL ENERGY ANALYSIS. ... aerospace structures. Analytical prediction of these environments is beyond the current scope of classical modal techniques. Statistical energy analysis methods have been developed that circumvent the difficulties of high-frequency modal analysis. These statistical energy analysis methods are evaluated ...

  5. Influence of maxillary canine gingival margin asymmetries on the perception of smile esthetics among orthodontists and laypersons.

    PubMed

    Correa, Bruna Dieder; Vieira Bittencourt, Marcos Alan; Machado, Andre Wilson

    2014-01-01

    Our objective was to determine the perception of smile esthetics among orthodontists and laypeople with respect to asymmetries in the maxillary canines' gingival margins in full-face and close-up smile analyses. Full-face and close-up photographs of the frontal smiles of 4 subjects (2 women, 2 men) were used. The images were digitally altered to create a symmetrical image with the gingival margin levels of the maxillary canines matching the central incisors. From this new image, 5 stages of alterations were made in the gingival margin of the right canine in 0.5-mm increments. Final full-face and close-up images of the smiles were assessed by 50 orthodontists and 50 laypeople, who indicated the level of attractiveness of each smile on visual analog scales. The data collected were statistically analyzed by means of 1-way analysis of variance with the Tukey post-hoc test and the unpaired Student t test. In general, the most attractive smiles for the orthodontists were those without asymmetries and the one with a 0.5-mm asymmetry, whereas laypersons could not detect an asymmetry up to 1.5 mm (P <0.05). For both groups of raters, the lowest scores were assigned for the smiles with asymmetries of 2.0 and 2.5 mm (P <0.05). When opinions of orthodontists and laypersons were compared, in most situations a statistically significant difference was found, with orthodontists more sensitive in detecting deviations (P <0.001). Moreover, there was no significant difference (P >0.05) between the full-face and close-up assessments of the smiles. It can be concluded that the perceptions of unilateral asymmetries in the gingival margin levels of the maxillary canines were 1.0 mm for orthodontists and 1.5 to 2.0 mm for laypersons. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
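
    The statistical pipeline of this study (one-way ANOVA with a Tukey post-hoc test on visual analog scale ratings) can be sketched in Python with SciPy; the rating data below are simulated placeholders, not the study's measurements.

    ```python
    import numpy as np
    from scipy.stats import f_oneway, tukey_hsd

    rng = np.random.default_rng(1)

    # Hypothetical visual-analog-scale (0-100) attractiveness ratings for
    # three asymmetry levels (0, 1.0 and 2.0 mm), 50 raters each.
    vas_0mm = rng.normal(75, 10, 50)
    vas_1mm = rng.normal(70, 10, 50)
    vas_2mm = rng.normal(55, 10, 50)

    # One-way ANOVA across asymmetry levels.
    F, p = f_oneway(vas_0mm, vas_1mm, vas_2mm)
    print(f"ANOVA: F={F:.2f}, p={p:.4f}")

    # Tukey's post-hoc test identifies which pairs of levels differ.
    print(tukey_hsd(vas_0mm, vas_1mm, vas_2mm))
    ```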

  6. A Statistical Analysis of the Output Signals of an Acousto-Optic Spectrum Analyzer for CW (Continuous-Wave) Signals

    DTIC Science & Technology

    1988-10-01

    A statistical analysis of the output signals of an acousto-optic spectrum analyzer (AOSA) is performed for the case when the input signal is a continuous-wave (CW) signal. ... Keywords: ...processing, Electronic warfare, Radar countermeasures, Acousto-optic, Spectrum analyzer, Statistical analysis, Detection, Estimation, Canada, Modelling.

  7. Statistical Power in Meta-Analysis

    ERIC Educational Resources Information Center

    Liu, Jin

    2015-01-01

    Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation for the two-sample mean difference test under different situations: (1) the discrepancy between the analytical power and…

  8. Wafer-scale solution-derived molecular gate dielectrics for low-voltage graphene electronics

    NASA Astrophysics Data System (ADS)

    Sangwan, Vinod K.; Jariwala, Deep; Everaerts, Ken; McMorrow, Julian J.; He, Jianting; Grayson, Matthew; Lauhon, Lincoln J.; Marks, Tobin J.; Hersam, Mark C.

    2014-02-01

    Graphene field-effect transistors are integrated with solution-processed multilayer hybrid organic-inorganic self-assembled nanodielectrics (SANDs). The resulting devices exhibit low operating voltage (2 V), negligible hysteresis, current saturation with intrinsic gain >1.0 in vacuum (pressure < 2 × 10^-5 Torr), and overall improved performance compared to control devices on conventional SiO2 gate dielectrics. Statistical analysis of the field-effect mobility and residual carrier concentration demonstrates high spatial uniformity of the dielectric interfacial properties and graphene transistor characteristics over full 3 in. wafers. This work thus establishes SANDs as an effective platform for large-area, high-performance graphene electronics.

  9. Large signal design - Performance and simulation of a 3 W C-band GaAs power MMIC

    NASA Astrophysics Data System (ADS)

    White, Paul M.; Hendrickson, Mary A.; Chang, Wayne H.; Curtice, Walter R.

    1990-04-01

    This paper describes a C-band GaAs power MMIC amplifier that achieved a gain of 17 dB and 1 dB compressed CW power output of 34 dBm across a 4.5-6.25-GHz frequency range, without design iteration. The first-pass design success was achieved due to the application of a harmonic balance simulator to define the optimum output load, using a large-signal FET model determined statistically on a well controlled foundry-ready process line. The measured performance was close to that predicted by a full harmonic balance circuit analysis.

  10. Journal of Transportation and Statistics, Vol. 3, No. 2 : special issue on the statistical analysis and modeling of automotive emissions

    DOT National Transportation Integrated Search

    2000-09-01

    This special issue of the Journal of Transportation and Statistics is devoted to the statistical analysis and modeling of automotive emissions. It contains many of the papers presented in the mini-symposium last August and also includes one additiona...

  11. Prognostic value of stromal decorin expression in patients with breast cancer: a meta-analysis.

    PubMed

    Li, Shuang-Jiang; Chen, Da-Li; Zhang, Wen-Biao; Shen, Cheng; Che, Guo-Wei

    2015-11-01

    A number of studies have investigated the biological functions of decorin (DCN) in oncogenesis, tumor progression, angiogenesis and metastasis. Although many of them aim to highlight the prognostic value of stromal DCN expression in breast cancer, some controversial results still exist and a consensus has not been reached until now. Therefore, our meta-analysis aims to determine the prognostic significance of stromal DCN expression in breast cancer patients. PubMed, EMBASE, the Web of Science and China National Knowledge Infrastructure (CNKI) databases were searched for full-text literatures that met our inclusion criteria. We applied the hazard ratio (HR) with 95% confidence interval (CI) as the appropriate summarized statistic. The Q-test and I(2) statistic were employed to estimate the level of heterogeneity across the included studies. Sensitivity analysis was conducted to further identify the possible origins of heterogeneity. Publication bias was detected by Begg's test and Egger's test. Three English-language publications (involving 6 studies) were included in our meta-analysis. On the one hand, both the summarized outcomes based on univariate analysis (HR: 0.513; 95% CI: 0.406-0.648; P<0.001) and multivariate analysis (HR: 0.544; 95% CI: 0.388-0.763; P<0.001) indicated that stromal DCN expression could promise the high cancer-specific survival (CSS) of breast cancer patients. On the other hand, both the summarized outcomes based on univariate analysis (HR: 0.504; 95% CI: 0.389-0.651; P<0.001) and multivariate analysis (HR: 0.568; 95% CI: 0.400-0.806; P=0.002) also indicated that stromal DCN expression was positively associated with high disease-free survival (DFS) of breast cancer patients. No significant heterogeneity or publication bias was observed within this meta-analysis. The present evidence indicates that high stromal DCN expression significantly predicts a good prognosis in patients with breast cancer. The findings from our meta-analysis should be confirmed in an updated review pooling more relevant investigations in the future.
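
    The pooling arithmetic behind such a meta-analysis (inverse-variance weighting of log hazard ratios, with Cochran's Q and the I² heterogeneity statistic) can be sketched as follows; the per-study HRs and confidence intervals are illustrative, not the values from the included studies.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Hypothetical per-study hazard ratios and 95% CIs.
    hr = np.array([0.45, 0.55, 0.60])
    ci_low = np.array([0.30, 0.38, 0.41])
    ci_high = np.array([0.68, 0.80, 0.88])

    log_hr = np.log(hr)
    # Standard error recovered from CI width: SE = (ln(hi) - ln(lo)) / (2*1.96)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1.0 / se**2                                  # inverse-variance weights

    pooled = np.sum(w * log_hr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))

    # Cochran's Q and the I^2 heterogeneity statistic.
    Q = np.sum(w * (log_hr - pooled) ** 2)
    I2 = max(0.0, (Q - (len(hr) - 1)) / Q) * 100 if Q > 0 else 0.0

    z = pooled / pooled_se
    p = 2 * norm.sf(abs(z))
    print(f"pooled HR={np.exp(pooled):.3f} "
          f"(95% CI {np.exp(pooled - 1.96 * pooled_se):.3f}-"
          f"{np.exp(pooled + 1.96 * pooled_se):.3f}), p={p:.4g}, I^2={I2:.0f}%")
    ```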

  12. A statistical package for computing time and frequency domain analysis

    NASA Technical Reports Server (NTRS)

    Brownlow, J.

    1978-01-01

    The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.
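
    A rough modern equivalent of that workflow (trend removal, digital filtering, time-domain statistics, then spectral characterization) in Python with SciPy; the signal is synthetic and the filter settings are arbitrary illustrative choices.

    ```python
    import numpy as np
    from scipy import signal

    rng = np.random.default_rng(2)

    # Synthetic record: a 50 Hz tone plus a linear drift plus noise.
    fs = 1000.0
    t = np.arange(0, 10, 1 / fs)
    x = np.sin(2 * np.pi * 50 * t) + 0.5 * t + rng.standard_normal(t.size)

    # Preanalysis preparation in the spirit of SPA: linear trend removal,
    # followed by digital low-pass filtering.
    x = signal.detrend(x, type="linear")
    b, a = signal.butter(4, 100, btype="low", fs=fs)
    x = signal.filtfilt(b, a, x)

    # Time-domain statistical characterization.
    print(f"mean={x.mean():.3f}, std={x.std():.3f}, rms={np.sqrt(np.mean(x**2)):.3f}")

    # Frequency-domain characterization via Welch's power spectral density.
    f, pxx = signal.welch(x, fs=fs, nperseg=2048)
    print(f"dominant frequency: {f[np.argmax(pxx)]:.1f} Hz")
    ```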

  13. Mean template for tensor-based morphometry using deformation tensors.

    PubMed

    Leporé, Natasha; Brun, Caroline; Pennec, Xavier; Chou, Yi-Yu; Lopez, Oscar L; Aizenstein, Howard J; Becker, James T; Toga, Arthur W; Thompson, Paul M

    2007-01-01

    Tensor-based morphometry (TBM) studies anatomical differences between brain images statistically, to identify regions that differ between groups, over time, or correlate with cognitive or clinical measures. Using a nonlinear registration algorithm, all images are mapped to a common space, and statistics are most commonly performed on the Jacobian determinant (local expansion factor) of the deformation fields. In earlier work, it was shown that the detection sensitivity of the standard TBM approach could be increased by using the full deformation tensors in a multivariate statistical analysis. Here we set out to improve the common space itself, by choosing the shape that minimizes a natural metric on the deformation tensors from that space to the population of control subjects. This method avoids statistical bias and should ease nonlinear registration of new subjects' data to a template that is 'closest' to all subjects' anatomies. As deformation tensors are symmetric positive-definite matrices and do not form a vector space, all computations are performed in the log-Euclidean framework. The control brain B that is already the closest to 'average' is found. A gradient descent algorithm is then used to perform the minimization that iteratively deforms this template and obtains the mean shape. We apply our method to map the profile of anatomical differences in a dataset of 26 HIV/AIDS patients and 14 controls, via a log-Euclidean Hotelling's T2 test on the deformation tensors. These results are compared to the ones found using the 'best' control, B. Statistics on both shapes are evaluated using cumulative distribution functions of the p-values in maps of inter-group differences.
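
    The log-Euclidean computation at the core of this method can be sketched compactly: matrix logarithms are averaged in the (vector-space) log domain and mapped back with the matrix exponential. The tensors below are random SPD placeholders, not deformation tensors from the study.

    ```python
    import numpy as np
    from scipy.linalg import logm, expm

    def log_euclidean_mean(tensors):
        """Mean of symmetric positive-definite matrices in the log-Euclidean
        framework: average the matrix logarithms, then exponentiate back."""
        logs = [logm(T).real for T in tensors]   # real for SPD inputs
        return expm(np.mean(logs, axis=0))

    # Hypothetical 3x3 tensors from a few subjects (SPD by construction).
    rng = np.random.default_rng(3)

    def random_spd():
        A = rng.standard_normal((3, 3))
        return A @ A.T + 3 * np.eye(3)

    tensors = [random_spd() for _ in range(5)]
    mean_tensor = log_euclidean_mean(tensors)
    print(np.round(mean_tensor, 2))
    ```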

  14. Spatiotemporal correlation of optical coherence tomography in-vivo images of rabbit airway for the diagnosis of edema

    NASA Astrophysics Data System (ADS)

    Kang, DongYel; Wang, Alex; Volgger, Veronika; Chen, Zhongping; Wong, Brian J. F.

    2015-07-01

    Detection of an early stage of subglottic edema is vital for airway management and prevention of stenosis, a life-threatening condition in critically ill neonates. As an observer for the task of diagnosing edema in vivo, we investigated spatiotemporal correlation (STC) of full-range optical coherence tomography (OCT) images acquired in the rabbit airway with experimentally simulated edema. Operating the STC observer on OCT images generates STC coefficients as test statistics for the statistical decision task. From these test statistics, receiver operating characteristic (ROC) curves for the diagnosis of airway edema with full-range OCT in-vivo images were extracted, and the areas under the ROC curves were calculated. These statistically quantified results demonstrated the potential clinical feasibility of the STC method as a means to identify early airway edema.
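
    An empirical ROC curve of this kind can be built directly from two sets of test statistics. In the sketch below the STC coefficients are simulated under an assumed direction of effect (lower correlation for edematous tissue), which is an illustrative assumption rather than the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical STC coefficients acting as test statistics for the
    # two-class decision task (normal vs. edema).
    stc_normal = rng.normal(0.8, 0.10, 200)
    stc_edema = rng.normal(0.6, 0.15, 200)

    # Empirical ROC: sweep a decision threshold over all observed values;
    # "edema" is called when the coefficient falls below the threshold.
    thresholds = np.sort(np.concatenate([stc_normal, stc_edema]))
    tpr = [(stc_edema < th).mean() for th in thresholds]    # sensitivity
    fpr = [(stc_normal < th).mean() for th in thresholds]   # 1 - specificity

    # Area under the ROC curve via the trapezoidal rule.
    auc = np.trapz(tpr, fpr)
    print(f"AUC = {auc:.3f}")
    ```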

  15. Contrast enema as a guide for senna-based laxatives in managing overflow retentive stool incontinence in pediatrics.

    PubMed

    Radwan, Ahmed Bassiuony; El-Debeiky, Mohammed Soliman; Abdel-Hay, Sameh

    2015-08-01

    Overflow retentive stool incontinence (ORSI) is secondary to constipation and fecal loading. In our study, the dose and duration of senna-based laxative (SBL) treatment needed to achieve full defecatory control were examined for possible correlation with new parameters measured from the initial contrast enema. Initially, an observational study was conducted prospectively on a group of patients with ORSI to define the optimum dose of SBL to achieve full defecatory control, with measurement of six parameters in the initial contrast enema (level of colonic dilatation, recto-anal angle, ratio of maximal diameter of dilated colon to last lumbar spine, ratio of maximum diameter of dilated colon to normal descending colon, and immediate and 24-h post-evacuation residual contrast). The results were analyzed statistically to identify correlations between the radiological data and the prescribed dose. Over two and a half years, 72 patients were included in the study; their mean age was 6.3 ± 3.33 years. The mean effective starting dose of SBL was 57 ± 18.13 mg/day and the mean effective ending dose was 75 ± 31.68 mg/day. Time elapsed until full defecatory control ranged from 1 to 16 weeks. Statistical correlation revealed that the mean effective ending dose of SBL treatment significantly increased with higher levels of colonic dilatation. A weak positive correlation was found for both the mean effective starting and ending doses with the ratio of maximum colonic diameter to last lumbar spine and with the descending colonic diameter ratio. Senna-based laxatives are an effective treatment for overflow retentive stool incontinence, and their doses can be adjusted initially depending on the analysis of the radiological data.

  16. Analysis of conference abstract-to-publication rate in UK orthopaedic research.

    PubMed

    Collier, Thomas; Roadley-Battin, Michelle; Darlow, Chloe; Chant, Philip; Hing, Caroline B; Smith, Toby O

    2018-02-01

    Presentation of research at orthopaedic conferences is an important component of surgical evidence-based practice. However, there remains uncertainty as to how many conference abstracts proceed to achieve full-text publication (FTP) for wider dissemination. This study aimed to determine the abstract-to-publication rate (APR) of research presented at the largest hip and knee orthopaedic meetings in the UK, and to identify predictive factors which influence the APR. All published abstracts (n=744) from the 2006, 2008, 2009 and 2010 British Hip Society (BHS) and the 2007, 2009, 2010 and 2011 British Association for Surgery of the Knee (BASK) annual conference meetings were examined by four researchers independently. To determine whether abstracts had been published in full-text form, the Google Scholar, Medline and EMBASE evidence databases were used to verify FTP status. Variables including sample size, statistical significance, grade of the first author, research-affiliated institution and research design were extracted and analysed to identify whether these were associated with FTP. 176 out of 744 abstracts achieved FTP status (APR: 23.7%). Factors associated with FTP status included statistically significant results (P<0.01) and research design (P=0.02). Factors not associated included sample size, grade of the first author and research-affiliated institution (P>0.05). APRs of the assessed BHS and BASK annual conference presentations are low in comparison to other scientific meetings. Encouragement should be provided to clinicians and academics to submit their work for publication to address this shortfall, thereby enhancing the potential for full-text research publications to inform evidence-based orthopaedics. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
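
    The association tests reported here (for example, statistically significant results versus full-text publication status) are standard chi-square tests on contingency tables; a minimal sketch with hypothetical counts, not the paper's data:

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    # Illustrative 2x2 table: rows = abstract reported statistically
    # significant results (yes/no), columns = reached FTP (yes/no).
    table = np.array([[120, 280],
                      [ 56, 288]])

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
    ```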

  17. Using Optimal Test Assembly Methods for Shortening Patient-Reported Outcome Measures: Development and Validation of the Cochin Hand Function Scale-6: A Scleroderma Patient-Centered Intervention Network Cohort Study.

    PubMed

    Levis, Alexander W; Harel, Daphna; Kwakkenbos, Linda; Carrier, Marie-Eve; Mouthon, Luc; Poiraudeau, Serge; Bartlett, Susan J; Khanna, Dinesh; Malcarne, Vanessa L; Sauve, Maureen; van den Ende, Cornelia H M; Poole, Janet L; Schouffoer, Anne A; Welling, Joep; Thombs, Brett D

    2016-11-01

    To develop and validate a short form of the Cochin Hand Function Scale (CHFS), which measures hand disability, for use in systemic sclerosis, using objective criteria and reproducible techniques. Responses on the 18-item CHFS were obtained from English-speaking patients enrolled in the Scleroderma Patient-Centered Intervention Network Cohort. CHFS unidimensionality was verified using confirmatory factor analysis, and an item response theory model was fit to CHFS items. Optimal test assembly (OTA) methods identified a maximally precise short form for each possible form length between 1 and 17 items. The final short form selected was the form with the least number of items that maintained statistically equivalent convergent validity, compared to the full-length CHFS, with the Health Assessment Questionnaire (HAQ) disability index (DI) and the physical function domain of the 29-item Patient-Reported Outcomes Measurement Information System (PROMIS-29). There were 601 patients included. A 6-item short form of the CHFS (CHFS-6) was selected. The CHFS-6 had a Cronbach's alpha of 0.93. Correlations of the CHFS-6 summed score with HAQ DI (r = 0.79) and PROMIS-29 physical function (r = -0.54) were statistically equivalent to the CHFS (r = 0.81 and r = -0.56). The correlation with the full CHFS was high (r = 0.98). The OTA procedure generated a valid short form of the CHFS with minimal loss of information compared to the full-length form. The OTA method used was based on objective, prespecified criteria, but should be further studied for viability as a general procedure for shortening patient-reported outcome measures in health research. © 2016, American College of Rheumatology.

  18. Can use of adaptive statistical iterative reconstruction reduce radiation dose in unenhanced head CT? An analysis of qualitative and quantitative image quality

    PubMed Central

    Heggen, Kristin Livelten; Pedersen, Hans Kristian; Andersen, Hilde Kjernlie; Martinsen, Anne Catrine T

    2016-01-01

    Background Iterative reconstruction can reduce image noise and thereby facilitate dose reduction. Purpose To evaluate qualitative and quantitative image quality for full dose and dose reduced head computed tomography (CT) protocols reconstructed using filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR). Material and Methods Fourteen patients undergoing follow-up head CT were included. All patients underwent full dose (FD) exam and subsequent 15% dose reduced (DR) exam, reconstructed using FBP and 30% ASIR. Qualitative image quality was assessed using visual grading characteristics. Quantitative image quality was assessed using ROI measurements in cerebrospinal fluid (CSF), white matter, peripheral and central gray matter. Additionally, quantitative image quality was measured in Catphan and vendor’s water phantom. Results There was no significant difference in qualitative image quality between FD FBP and DR ASIR. Comparing same scan FBP versus ASIR, a noise reduction of 28.6% in CSF and between −3.7 and 3.5% in brain parenchyma was observed. Comparing FD FBP versus DR ASIR, a noise reduction of 25.7% in CSF, and −7.5 and 6.3% in brain parenchyma was observed. Image contrast increased in ASIR reconstructions. Contrast-to-noise ratio was improved in DR ASIR compared to FD FBP. In phantoms, noise reduction was in the range of 3 to 28% with image content. Conclusion There was no significant difference in qualitative image quality between full dose FBP and dose reduced ASIR. CNR improved in DR ASIR compared to FD FBP mostly due to increased contrast, not reduced noise. Therefore, we recommend using caution if reducing dose and applying ASIR to maintain image quality. PMID:27583169

  19. Can use of adaptive statistical iterative reconstruction reduce radiation dose in unenhanced head CT? An analysis of qualitative and quantitative image quality.

    PubMed

    Østerås, Bjørn Helge; Heggen, Kristin Livelten; Pedersen, Hans Kristian; Andersen, Hilde Kjernlie; Martinsen, Anne Catrine T

    2016-08-01

    Iterative reconstruction can reduce image noise and thereby facilitate dose reduction. To evaluate qualitative and quantitative image quality for full dose and dose reduced head computed tomography (CT) protocols reconstructed using filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR). Fourteen patients undergoing follow-up head CT were included. All patients underwent full dose (FD) exam and subsequent 15% dose reduced (DR) exam, reconstructed using FBP and 30% ASIR. Qualitative image quality was assessed using visual grading characteristics. Quantitative image quality was assessed using ROI measurements in cerebrospinal fluid (CSF), white matter, peripheral and central gray matter. Additionally, quantitative image quality was measured in Catphan and vendor's water phantom. There was no significant difference in qualitative image quality between FD FBP and DR ASIR. Comparing same scan FBP versus ASIR, a noise reduction of 28.6% in CSF and between -3.7 and 3.5% in brain parenchyma was observed. Comparing FD FBP versus DR ASIR, a noise reduction of 25.7% in CSF, and -7.5 and 6.3% in brain parenchyma was observed. Image contrast increased in ASIR reconstructions. Contrast-to-noise ratio was improved in DR ASIR compared to FD FBP. In phantoms, noise reduction was in the range of 3 to 28% with image content. There was no significant difference in qualitative image quality between full dose FBP and dose reduced ASIR. CNR improved in DR ASIR compared to FD FBP mostly due to increased contrast, not reduced noise. Therefore, we recommend using caution if reducing dose and applying ASIR to maintain image quality.

  20. 76 FR 16385 - Gulf of Mexico Fishery Management Council; Public Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ... INFORMATION CONTACT: Dr. Stephen Bortone, Executive Director, Gulf of Mexico Fishery Management Council...; Scientific & Statistical Committee Selection; and Reef Fish. 3:45 p.m.-4:15 p.m.--Other Business items will....--Closed Session--The Scientific & Statistical Committees Selection Committee/Full Council will meet to...

  1. Teaching Primary School Mathematics and Statistics: Evidence-Based Practice

    ERIC Educational Resources Information Center

    Averill, Robin; Harvey, Roger

    2010-01-01

    Here is the only reference book you will ever need for teaching primary school mathematics and statistics. It is full of exciting and engaging snapshots of excellent classroom practice relevant to "The New Zealand Curriculum" and national mathematics standards. There are many fascinating examples of investigative learning experiences,…

  2. Development of Composite Materials with High Passive Damping Properties

    DTIC Science & Technology

    2006-05-15

    ... frequency response function analysis. Sound transmission through sandwich panels was studied using statistical energy analysis (SEA). ... Finite element models are generally only efficient for problems at low and middle frequencies ...

  3. Role of microstructure on twin nucleation and growth in HCP titanium: A statistical study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arul Kumar, M.; Wroński, M.; McCabe, Rodney James

    In this study, a detailed statistical analysis is performed using Electron Back Scatter Diffraction (EBSD) to establish the effect of microstructure on twin nucleation and growth in deformed commercial purity hexagonal close packed (HCP) titanium. Rolled titanium samples are compressed along rolling, transverse and normal directions to establish statistical correlations for {10–12}, {11–21}, and {11–22} twins. A recently developed automated EBSD-twinning analysis software is employed for the statistical analysis. Finally, the analysis provides the following key findings: (I) grain size and strain dependence is different for twin nucleation and growth; (II) twinning statistics can be generalized for the HCP metals magnesium, zirconium and titanium; and (III) complex microstructure, where grain shape and size distribution is heterogeneous, requires multi-point statistical correlations.

  4. Role of microstructure on twin nucleation and growth in HCP titanium: A statistical study

    DOE PAGES

    Arul Kumar, M.; Wroński, M.; McCabe, Rodney James; ...

    2018-02-01

    In this study, a detailed statistical analysis is performed using Electron Back Scatter Diffraction (EBSD) to establish the effect of microstructure on twin nucleation and growth in deformed commercial purity hexagonal close packed (HCP) titanium. Rolled titanium samples are compressed along rolling, transverse and normal directions to establish statistical correlations for {10–12}, {11–21}, and {11–22} twins. A recently developed automated EBSD-twinning analysis software is employed for the statistical analysis. Finally, the analysis provides the following key findings: (I) grain size and strain dependence is different for twin nucleation and growth; (II) twinning statistics can be generalized for the HCP metals magnesium, zirconium and titanium; and (III) complex microstructure, where grain shape and size distribution is heterogeneous, requires multi-point statistical correlations.

  5. Proceedings of the NASTRAN (Tradename) Users’ Colloquium (15th) Held in Kansas City, Missouri on 4-8 May 1987

    DTIC Science & Technology

    1987-08-01

    ... HVAC duct hanger system over an extensive frequency range. The finite element, component mode synthesis, and statistical energy analysis methods are ... A high-frequency (800-5,000 Hz) analysis was conducted with Statistical Energy Analysis (SEA) coupled with a closed-form harmonic beam analysis program. ... resonances may be obtained by using a finer frequency increment. The basic assumption used in SEA is that within each band ...

  6. Full Moment Tensor Analysis Using First Motion Data at The Geysers Geothermal Field

    NASA Astrophysics Data System (ADS)

    Boyd, O.; Dreger, D. S.; Lai, V. H.; Gritto, R.

    2012-12-01

    Seismicity associated with geothermal energy production at The Geysers Geothermal Field in northern California has been increasing during the last forty years. We investigate source models of over fifty earthquakes with magnitudes ranging from Mw 3.5 up to Mw 4.5. We invert three-component, complete waveform data from broadband stations of the Berkeley Digital Seismic Network, the Northern California Seismic Network and the USArray deployment (2005-2007) for the complete, six-element moment tensor. Some solutions are double-couple while others have substantial non-double-couple components. To assess the stability and significance of non-double-couple components, we use a suite of diagnostic tools including the F-test, Jackknife test, bootstrap and network sensitivity solution (NSS). The full moment tensor solutions of the studied events tend to plot in the upper half of the Hudson source type diagram where the fundamental source types include +CLVD, +LVD, tensile-crack, DC and explosion. Using the F-test to compare the goodness-of-fit values between the full and deviatoric moment tensor solutions, most of the full moment tensor solutions do not show a statistically significant improvement in fit over the deviatoric solutions. Because a small isotropic component may not significantly improve the fit, we include first motion polarity data to better constrain the full moment tensor solutions.
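
    The F-test comparison of nested moment tensor models can be sketched as a variance-ratio test between the deviatoric (5 free parameters) and full (6 free parameters) solutions; the residual values and data count below are hypothetical, purely to show the arithmetic.

    ```python
    from scipy.stats import f as f_dist

    # Hypothetical misfits (residual sums of squares) of the nested models.
    rss_dev, rss_full = 0.42, 0.40
    p_dev, p_full = 5, 6              # free parameters in each model
    n = 300                           # number of waveform data points

    # F statistic for the improvement gained by the extra (isotropic) parameter.
    F = ((rss_dev - rss_full) / (p_full - p_dev)) / (rss_full / (n - p_full))
    p_value = f_dist.sf(F, p_full - p_dev, n - p_full)
    print(f"F={F:.2f}, p={p_value:.3f}")  # small p => full solution is warranted
    ```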

  7. [Diversity and frequency of scientific research design and statistical methods in the "Arquivos Brasileiros de Oftalmologia": a systematic review of the "Arquivos Brasileiros de Oftalmologia"--1993-2002].

    PubMed

    Crosta, Fernando; Nishiwaki-Dantas, Maria Cristina; Silvino, Wilmar; Dantas, Paulo Elias Correa

    2005-01-01

    To verify the frequency of study design, applied statistical analysis and approval by institutional review offices (Ethics Committees) of articles published in the "Arquivos Brasileiros de Oftalmologia" during a 10-year interval, with later comparative and critical analysis against some of the main international journals in the field of ophthalmology. A systematic review without meta-analysis was performed. Scientific papers published in the "Arquivos Brasileiros de Oftalmologia" between January 1993 and December 2002 were reviewed by two independent reviewers and classified according to the applied study design, statistical analysis and approval by the institutional review offices. To categorize those variables, a descriptive statistical analysis was used. After applying inclusion and exclusion criteria, 584 articles were reviewed for evaluation of statistical analysis and 725 articles for evaluation of study design. Contingency table (23.10%) was the most frequently applied statistical method, followed by non-parametric tests (18.19%), Student's t test (12.65%), central tendency measures (10.60%) and analysis of variance (9.81%). Of 584 reviewed articles, 291 (49.82%) presented no statistical analysis. Observational case series (26.48%) was the most frequently used type of study design, followed by interventional case series (18.48%), observational case description (13.37%), non-random clinical study (8.96%) and experimental study (8.55%). We found a higher frequency of observational clinical studies and a lack of statistical analysis in almost half of the published papers. An increase in studies with approval by an institutional review Ethics Committee was noted since this became mandatory in 1996.

  8. Experimental Quiet Sprocket Design and Noise Reduction in Tracked Vehicles

    DTIC Science & Technology

    1981-04-01

    Keywords: Track and Suspension Noise Reduction; Statistical Energy Analysis; Mechanical Impedance Measurement; Finite Element Modal Analysis; Noise Sources. ... shape and idler attachment are different. These differences were investigated using the concepts of statistical energy analysis for hull-generated noise ... calculated from statistical energy analysis. Such an approach will be valid within reasonable limits for frequencies of about 200 Hz and ...

  9. Compendium of Methods for Applying Measured Data to Vibration and Acoustic Problems

    DTIC Science & Technology

    1985-10-01

    Topics include statistical energy analysis, finite element models, and transfer functions. Contents include procedures for the modal analysis method and a summary of the procedures for the statistical energy analysis method.

  10. [Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].

    PubMed

    Golder, W

    1999-09-01

    To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors make increasing use of statistical experts' opinions and programs. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.

  11. Population-based mammography screening: comparison of screen-film and full-field digital mammography with soft-copy reading--Oslo I study.

    PubMed

    Skaane, Per; Young, Kari; Skjennald, Arnulf

    2003-12-01

    To compare screen-film and full-field digital mammography with soft-copy reading in a population-based screening program. Full-field digital and screen-film mammography were performed in 3,683 women aged 50-69 years. Two standard views of each breast were acquired with each modality. Images underwent independent double reading with use of a five-point rating scale for probability of cancer. Recall rates and positive predictive values were calculated. Cancer detection rates determined with both modalities were compared by using the McNemar test for paired proportions. Retrospective side-by-side analysis for conspicuity of cancers was performed by an external independent radiologist group with experience in both modalities. In 3,683 cases, 31 cancers were detected. Screen-film mammography depicted 28 (0.76%) malignancies, and full-field digital mammography depicted 23 (0.62%) malignancies. The difference between cancer detection rates was not significant (P =.23). The recall rate for full-field digital mammography (4.6%; 168 of 3,683 cases) was slightly higher than that for screen-film mammography (3.5%; 128 of 3,683 cases). The positive predictive value based on needle biopsy results was 46% for screen-film mammography and 39% for full-field digital mammography. Side-by-side image comparison for cancer conspicuity led to classification of 19 cancers as equal for probability of malignancy, six cancers as slightly better demonstrated at screen-film mammography, and six cancers as slightly better demonstrated at full-field digital mammography. There was no statistically significant difference in cancer detection rate between screen-film and full-field digital mammography. Cancer conspicuity was equal with both modalities. Full-field digital mammography with soft-copy reading is comparable to screen-film mammography in population-based screening.
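
    The McNemar test used here compares paired detection outcomes. The 2x2 split below is consistent with the abstract's margins (28 screen-film detections, 23 digital detections, 31 cancers in total) but the exact cross-classification is assumed for illustration.

    ```python
    from statsmodels.stats.contingency_tables import mcnemar

    # Assumed paired detection table: rows = screen-film (detected / missed),
    # columns = full-field digital (detected / missed). Margins match the
    # abstract: 20 + 8 = 28 SFM detections, 20 + 3 = 23 FFDM detections.
    table = [[20, 8],
             [ 3, 0]]

    result = mcnemar(table, exact=True)  # exact binomial test on discordant pairs
    print(f"statistic={result.statistic}, p={result.pvalue:.3f}")
    ```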

  12. Teaching Statistics in Biology: Using Inquiry-based Learning to Strengthen Understanding of Statistical Analysis in Biology Laboratory Courses

    PubMed Central

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study. PMID:18765754

  13. Falls among full-time wheelchair users with spinal cord injury and multiple sclerosis: a comparison of characteristics of fallers and circumstances of falls.

    PubMed

    Sung, JongHun; Trace, Yarden; Peterson, Elizabeth W; Sosnoff, Jacob J; Rice, Laura A

    2017-10-25

    The purpose of this study is to (1) explore and (2) compare circumstances of falls among full-time wheelchair users with spinal cord injury (SCI) and multiple sclerosis (MS). A mixed method approach was used to explore and compare the circumstances of falls of 41 full-time wheelchair users with SCI (n = 23) and MS (n = 18). In addition to participants' demographic information (age, gender, type of wheelchair used, duration of wheelchair use, and duration of disability), self-reported fall frequency in the past 6 months, self-reported restriction in activity due to fear of falling, and the Spinal Cord Injury-Fall Concerns Scale (SCI-FCS) were collected. Qualitative data in the form of participants' responses to an open-ended question yielding information regarding the circumstances of the most recent fall were also collected. To examine differences in survey outcomes and demographic characteristics between participants with SCI and MS, independent t-tests and Pearson's chi-square tests were used. Qualitative data were analyzed with a thematic analysis. Statistical analysis revealed that individuals with MS (mean = 3.3) had significantly higher average SCI-FCS than individuals with SCI (mean = 2.4). The analysis of the participants' descriptions of the circumstances of their most recent falls resulted in three main categories: (1) action-related fall contributors (e.g., transfer), (2) location of falls (e.g., bathroom), and (3) fall attributions (e.g., surface condition). The results from this study help in understanding fall circumstances among full-time wheelchair users with MS and SCI. Findings from this study can inform the development of evidence-based interventions to improve the effectiveness of clinically based treatment protocols. Implications for rehabilitation: Falls are a common health concern in full-time wheelchair users living with multiple sclerosis and spinal cord injury. The circumstances surrounding falls reported by full-time wheelchair users living with multiple sclerosis and spinal cord injuries were found to be multifactorial. The complex nature of falls must be taken into consideration in the development of fall prevention programs. Findings from this study can inform the development of comprehensive evidence-based, population-specific interventions to manage falls among full-time wheelchair users living with multiple sclerosis and spinal cord injury.

  14. Communications Link Characterization Experiment (CLCE) technical data report, volume 2

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The results are presented of the long-term rain rate statistical analysis and of the investigation into determining the worst-month statistics from the measured attenuation data caused by precipitation. The rain rate statistics cover a period of 11 months from July 1974 to May 1975 for measurements taken at the NASA Rosman station. The rain rate statistical analysis is a continuation of the analysis of the rain rate data accumulated for the ATS-6 Millimeter Wave Propagation Experiment. The statistical characteristics of the rain rate data through December 1974 are also presented for the above experiment.

  15. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or of making a new product or service legitimate, needs to be monitored and questioned for accuracy. (1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  16. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals.

    PubMed

    Parsons, Nick R; Price, Charlotte L; Hiskens, Richard; Achten, Juul; Costa, Matthew L

    2012-04-25

    The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject which is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval; 10-26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30-49%) of studies a different analysis should have been undertaken and in 17% (10-26%) a different analysis could have made a difference to the overall conclusions. It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.

  17. A survey of visual function in an Austrian population of school-age children with reading and writing difficulties.

    PubMed

    Dusek, Wolfgang; Pierscionek, Barbara K; McClelland, Julie F

    2010-05-25

    To describe and compare visual function measures of two groups of school-age children (6-14 years of age) attending a specialist eyecare practice in Austria: one group referred to the practice from educational assessment centres with diagnosed reading and writing difficulties, and the other a clinical age-matched control group. Retrospective clinical data from one group of subjects with reading difficulties (n = 825) and a clinical control group of subjects (n = 328) were examined. Statistical analysis was performed to determine whether any differences existed between visual function measures from each group (refractive error, visual acuity, binocular status, accommodative function, and reading speed and accuracy). Statistical analysis using one-way ANOVA demonstrated no differences between the two groups in terms of refractive error and the size or direction of heterophoria at distance (p > 0.05). Using predominantly one-way ANOVA and chi-square analyses, those subjects in the referred group were statistically more likely to have poorer distance visual acuity, an exophoric deviation at near, a lower amplitude of accommodation, reduced accommodative facility, reduced vergence facility, a reduced near point of convergence, a lower AC/A ratio and a slower reading speed than those in the clinical control group (p < 0.05). This study highlights the high proportion of visual function anomalies in a group of children with reading difficulties in an Austrian population. It confirms the importance of a full assessment of binocular visual status in order to detect and remedy these deficits and to prevent the visual problems continuing to impact upon educational development.

  18. Massive optimal data compression and density estimation for scalable, likelihood-free inference in cosmology

    NASA Astrophysics Data System (ADS)

    Alsing, Justin; Wandelt, Benjamin; Feeney, Stephen

    2018-07-01

    Many statistical models in cosmology can be simulated forwards but have intractable likelihood functions. Likelihood-free inference methods allow us to perform Bayesian inference from these models using only forward simulations, free from any likelihood assumptions or approximations. Likelihood-free inference generically involves simulating mock data and comparing to the observed data; this comparison in data space suffers from the curse of dimensionality and requires compression of the data to a small number of summary statistics to be tractable. In this paper, we first use massive asymptotically optimal data compression to reduce the dimensionality of the data space to just one number per parameter, providing a natural and optimal framework for summary statistic choice for likelihood-free inference. Second, we present the first cosmological application of Density Estimation Likelihood-Free Inference (DELFI), which learns a parametrized model for the joint distribution of data and parameters, yielding both the parameter posterior and the model evidence. This approach is conceptually simple, requires less tuning than traditional Approximate Bayesian Computation approaches to likelihood-free inference and can give high-fidelity posteriors from orders of magnitude fewer forward simulations. As an additional bonus, it enables parameter inference and Bayesian model comparison simultaneously. We demonstrate DELFI with massive data compression on an analysis of the joint light-curve analysis supernova data, as a simple validation case study. We show that high-fidelity posterior inference is possible for full-scale cosmological data analyses with as few as ~10^4 simulations, with substantial scope for further improvement, demonstrating the scalability of likelihood-free inference to large and complex cosmological data sets.
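
    For a Gaussian-mean model the optimal compression step has a closed form: the data vector is projected as t = dmu^T C^(-1) (d - mu), one summary per parameter. A toy numpy sketch under that assumption; the model, sizes and fiducial values are all invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Score compression for a Gaussian-mean model: t = dmu^T C^{-1} (d - mu),
    # reducing the data vector to one summary per parameter.
    ndata, theta_true, theta_fid = 50, 1.5, 1.0
    x = np.linspace(0.0, 1.0, ndata)
    C = 0.1 * np.eye(ndata)                  # data covariance (assumed known)

    mu = lambda th: th * x                   # toy mean model, linear in theta
    dmu = x                                  # d(mu)/d(theta)

    d = mu(theta_true) + rng.multivariate_normal(np.zeros(ndata), C)
    t = dmu @ np.linalg.solve(C, d - mu(theta_fid))   # compress at the fiducial
    print(f"compressed summary t = {t:.2f}")
    ```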

  19. Optimisation of warpage on plastic injection moulding part using response surface methodology (RSM)

    NASA Astrophysics Data System (ADS)

    Miza, A. T. N. A.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Rashidi, M. M.

    2017-09-01

    Warpage often occurs during the injection moulding of thin-shell parts, depending on the process conditions. Statistical design-of-experiment methods integrating finite element (FE) analysis, Moldflow analysis and response surface methodology (RSM) were used to minimise the warpage values in x, y and z on the thin-shell plastic parts investigated. A battery cover for a remote controller is one such thin-shell plastic part produced by injection moulding. The optimum process conditions were determined so as to minimise the warpage. Packing pressure, cooling time, melt temperature and mould temperature were the four parameters considered in this study. A two-level full factorial experimental design was conducted in the Design Expert software for the RSM analysis so as to combine all these parameters. Analysis of variance (ANOVA) of the FE results identified the process parameters that most influenced warpage. Using RSM, a predictive response surface model for the warpage data is presented.
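
    The design matrix for a two-level full factorial in the four stated parameters is easy to generate; the level values below are illustrative placeholders, not the study's actual settings.

    ```python
    from itertools import product

    # Two-level full factorial design for the four moulding parameters
    # (low/high levels are invented for illustration).
    factors = {
        "packing_pressure_MPa": (60, 90),
        "cooling_time_s":       (10, 20),
        "melt_temp_C":          (220, 260),
        "mould_temp_C":         (40, 80),
    }

    runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
    print(f"{len(runs)} runs")          # 2^4 = 16 runs
    for run in runs[:3]:                # show the first few treatment combinations
        print(run)
    ```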

  20. Application of Ontology Technology in Health Statistic Data Analysis.

    PubMed

    Guo, Minjiang; Hu, Hongpu; Lei, Xingyun

    2017-01-01

    Research Purpose: to establish a health management ontology for the analysis of health statistic data. Proposed Methods: this paper established a health management ontology based on the analysis of the concepts in the China Health Statistics Yearbook, and used Protégé to define the syntactic and semantic structure of health statistical data. Six classes of top-level ontology concepts and their subclasses were extracted, and the object properties and data properties were defined to establish the construction of these classes. By ontology instantiation, we can integrate multi-source heterogeneous data and enable administrators to have an overall understanding and analysis of the health statistic data. Ontology technology provides a comprehensive and unified information integration structure for the health management domain and lays a foundation for the efficient analysis of multi-source and heterogeneous health system management data and enhancement of management efficiency.

  1. [The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].

    PubMed

    Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel

    2017-01-01

    The statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. Inference consists of drawing conclusions from tests performed on data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test generally poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
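
    The decision rule described (check the distributional assumption, then choose a parametric or nonparametric test) can be sketched for the two-group case; the data are simulated, and the 0.05 normality cut-off is a common but not universal convention.

    ```python
    import numpy as np
    from scipy.stats import shapiro, ttest_ind, mannwhitneyu

    rng = np.random.default_rng(8)
    group_a = rng.normal(10, 2, 40)     # hypothetical measurements, two groups
    group_b = rng.normal(11, 2, 40)

    # Parametric tests require approximately normal data: check with
    # Shapiro-Wilk, else fall back to a nonparametric alternative.
    if shapiro(group_a).pvalue > 0.05 and shapiro(group_b).pvalue > 0.05:
        stat, p = ttest_ind(group_a, group_b)       # parametric: Student's t
        test = "Student's t test"
    else:
        stat, p = mannwhitneyu(group_a, group_b)    # nonparametric alternative
        test = "Mann-Whitney U test"
    print(f"{test}: statistic={stat:.2f}, p={p:.4f}")
    ```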

  2. Verify MesoNAM Performance

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III

    2010-01-01

    The AMU conducted an objective analysis of the MesoNAM forecasts compared to observed values from sensors at specified KSC/CCAFS wind towers by calculating the following statistics to verify the performance of the model: 1) Bias (mean difference), 2) Standard deviation of Bias, 3) Root Mean Square Error (RMSE), and 4) Hypothesis test for Bias = 0. The 45 WS LWOs use the MesoNAM to support launch weather operations. However, the actual performance of the model at KSC and CCAFS had not been measured objectively. The analysis compared the MesoNAM forecast winds, temperature and dew point to the observed values from the sensors on wind towers. The data were stratified by tower sensor, month and onshore/offshore wind direction based on the orientation of the coastline to each tower's location. The model's performance statistics were then calculated for each wind tower based on sensor height and model initialization time. The period of record for the data used in this task was based on the operational start of the current MesoNAM in mid-August 2006, and so the task began with the first full month of data, September 2006, through May 2010. The analysis of model performance indicated: a) the accuracy decreased as the forecast valid time from the model initialization increased, b) there was a diurnal signal in T with a cool bias during the late night and a warm bias during the afternoon, c) there was a diurnal signal in Td with a low bias during the afternoon and a high bias during the late night, and d) the model parameters at each vertical level most closely matched the observed parameters at heights closest to those vertical levels. The AMU developed a GUI that consists of a multi-level drop-down menu written in JavaScript embedded within the HTML code. This tool allows the LWO to easily and efficiently navigate among the charts and spreadsheet files containing the model performance statistics. The objective statistics give the LWOs knowledge of the model's strengths and weaknesses, and the GUI allows quick access to the data, which will result in improved forecasts for operations.
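
    The four verification statistics named are straightforward to compute from paired forecast-observation samples; the data below are simulated stand-ins for the tower observations and MesoNAM forecasts.

    ```python
    import numpy as np
    from scipy.stats import ttest_1samp

    rng = np.random.default_rng(5)

    # Hypothetical paired forecast/observation wind speeds at one tower sensor.
    obs = rng.normal(8.0, 2.0, 500)
    fcst = obs + rng.normal(0.3, 1.2, 500)      # forecasts with a slight high bias

    diff = fcst - obs
    bias = diff.mean()                          # 1) bias (mean difference)
    bias_std = diff.std(ddof=1)                 # 2) standard deviation of bias
    rmse = np.sqrt(np.mean(diff ** 2))          # 3) root mean square error
    t, p = ttest_1samp(diff, 0.0)               # 4) hypothesis test for bias = 0

    print(f"bias={bias:.2f}, std={bias_std:.2f}, RMSE={rmse:.2f}, p(bias=0)={p:.3g}")
    ```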

  3. Spatial Ensemble Postprocessing of Precipitation Forecasts Using High Resolution Analyses

    NASA Astrophysics Data System (ADS)

    Lang, Moritz N.; Schicker, Irene; Kann, Alexander; Wang, Yong

    2017-04-01

    Ensemble prediction systems are designed to account for errors or uncertainties in the initial and boundary conditions, imperfect parameterizations, etc. However, due to sampling errors and underestimation of the model errors, these ensemble forecasts tend to be underdispersive and to lack both reliability and sharpness. To overcome such limitations, statistical postprocessing methods are commonly applied to these forecasts. In this study, a full-distributional spatial post-processing method is applied to short-range precipitation forecasts over Austria using Standardized Anomaly Model Output Statistics (SAMOS). Following Stauffer et al. (2016), observation and forecast fields are transformed into standardized anomalies by subtracting a site-specific climatological mean and dividing by the climatological standard deviation. Due to the need of fitting only a single regression model for the whole domain, the SAMOS framework provides a computationally inexpensive method to create operationally calibrated probabilistic forecasts for any arbitrary location or for all grid points in the domain simultaneously. Taking advantage of the INCA system (Integrated Nowcasting through Comprehensive Analysis), high resolution analyses are used for the computation of the observed climatology and for model training. The INCA system operationally combines station measurements and remote sensing data into real-time objective analysis fields at 1 km horizontal resolution and 1 h temporal resolution. The precipitation forecast used in this study is obtained from a limited area model ensemble prediction system also operated by ZAMG. The so-called ALADIN-LAEF provides, by applying a multi-physics approach, a 17-member forecast at a horizontal resolution of 10.9 km and a temporal resolution of 1 hour. The SAMOS approach statistically combines the in-house developed high resolution analysis and ensemble prediction system. The station-based validation of 6-hour precipitation sums shows a mean improvement of more than 40% in CRPS when compared to bilinearly interpolated uncalibrated ensemble forecasts. The validation on randomly selected grid points, representing the true height distribution over Austria, still indicates a mean improvement of 35%. The applied statistical model is currently set up for 6-hourly and daily accumulation periods, but will be extended to a temporal resolution of 1-3 hours within a new probabilistic nowcasting system operated by ZAMG.
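
    A minimal sketch of the SAMOS-style anomaly transform: both observations and forecasts are standardized by a site-specific climatology so that a single regression can be fitted over the whole domain. The climatology, data, and plain least-squares fit below are simplifying assumptions; the operational method fits a full distributional regression.

    ```python
    import numpy as np

    # Standardized anomaly: subtract the climatological mean, divide by the
    # climatological standard deviation (site-specific in the real system).
    def standardize(x, clim_mean, clim_sd):
        return (x - clim_mean) / clim_sd

    rng = np.random.default_rng(6)
    clim_mean, clim_sd = 3.0, 2.5             # assumed climatology (mm / 6 h)
    obs = rng.gamma(2.0, 1.5, 200)            # hypothetical 6 h precipitation sums
    ens_mean = obs + rng.normal(0, 1.0, 200)  # hypothetical ensemble-mean forecast

    y = standardize(obs, clim_mean, clim_sd)
    x = standardize(ens_mean, clim_mean, clim_sd)

    # One least-squares fit y ~ a + b*x calibrates the whole anomaly space.
    b, a = np.polyfit(x, y, 1)
    print(f"calibration: anomaly = {a:.2f} + {b:.2f} * forecast anomaly")
    ```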

  4. Alar-columellar and lateral nostril changes following tongue-in-groove rhinoplasty.

    PubMed

    Shah, Ajul; Pfaff, Miles; Kinsman, Gianna; Steinbacher, Derek M

    2015-04-01

    Repositioning the medial crura cephalically onto the caudal septum (tongue-in-groove; TIG) allows alteration of the columella, ala, and nasal tip to address alar-columellar disproportion as seen from the lateral view. To date, quantitative analysis of nostril dimension, alar-columellar relationship, and nasal tip changes following the TIG rhinoplasty technique has not been described. The present study aims to evaluate post-operative lateral morphometric changes following TIG. Pre- and post-operative lateral views of a series of consecutive patients who underwent TIG rhinoplasty were produced from 3D images at multiple time points (≤2 weeks, 4-10 weeks, and >10 weeks post-operatively) for analysis. The 3D images were converted to 2D and set to scale. Exposed lateral nostril area, alar-columellar disproportion (divided into superior and inferior heights), nasolabial angle, nostril height, and nostril length were calculated and statistically analyzed using a pairwise t test. A P ≤ 0.05 was considered statistically significant. Ninety-four lateral views were analyzed from 20 patients (16 females; median age: 31.8 years). One patient had a history of current tobacco cigarette use. Lateral nostril area decreased at all post-operative time points in a statistically significant fashion. Alar-columellar disproportion was reduced following TIG at all time points. The nasolabial angle also increased at all post-operative time points (≤2 weeks, 4-10 weeks, and >10 weeks) in a statistically significant fashion. Nostril height and nostril length decreased at all post-operative time points. Morphometric analysis reveals reduction in alar-columellar disproportion and lateral nostril show following TIG rhinoplasty. Tip rotation, as a function of nasolabial angle, also increased. These results provide quantitative substantiation for qualitative descriptions attributed to the TIG technique. Future studies will focus on area and volumetric measurements, and assessment of long-term stability. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.

  5. Modified Distribution-Free Goodness-of-Fit Test Statistic.

    PubMed

    Chun, So Yeon; Browne, Michael W; Shapiro, Alexander

    2018-03-01

    Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.

  6. Statistical power in parallel group point exposure studies with time-to-event outcomes: an empirical comparison of the performance of randomized controlled trials and the inverse probability of treatment weighting (IPTW) approach.

    PubMed

    Austin, Peter C; Schuster, Tibor; Platt, Robert W

    2015-10-15

    Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly-structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power compared to an analysis of a similarly-structured RCT. The difference in statistical power increased as the strength of the treatment-selection process increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly-structured RCT.
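
    To make the IPTW mechanics concrete, the following is a hedged Python sketch of propensity-score estimation and stabilized weighting on simulated data; the variable names and the simulated treatment-selection model are illustrative and do not reproduce the authors' Monte Carlo design.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Simulated observational data: one confounder x drives treatment
        # assignment, mimicking non-random treatment selection.
        rng = np.random.default_rng(1)
        n = 5000
        x = rng.normal(size=n)
        p_treat = 1.0 / (1.0 + np.exp(-0.5 * x))   # treatment-selection model
        z = rng.binomial(1, p_treat)               # observed treatment indicator

        # Propensity scores from a logistic model of treatment on confounders.
        X = x.reshape(-1, 1)
        ps = LogisticRegression().fit(X, z).predict_proba(X)[:, 1]

        # Stabilized IPTW weights: marginal treatment prevalence divided by
        # the propensity (or its complement for untreated subjects).
        pz = z.mean()
        w = np.where(z == 1, pz / ps, (1 - pz) / (1 - ps))

    The weighted pseudo-population can then be analyzed with a survival model that accepts case weights (for example, a weighted Cox model) to estimate the marginal hazard ratio.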

  7. Network Data: Statistical Theory and New Models

    DTIC Science & Technology

    2016-02-17

    During this period of review, Bin Yu worked on many thrusts of high-dimensional statistical theory and methodologies. Her research covered a wide range of topics in statistics, including analysis and methods for spectral clustering for sparse and structured networks [2,7,8,21], sparse modeling (e.g., Lasso) [4,10,11,17,18,19], statistical guarantees for the EM algorithm [3], and statistical analysis of algorithm leveraging.

  8. Quantifying the impact of between-study heterogeneity in multivariate meta-analyses

    PubMed Central

    Jackson, Dan; White, Ian R; Riley, Richard D

    2012-01-01

    Measures that quantify the impact of heterogeneity in univariate meta-analysis, including the very popular I2 statistic, are now well established. Multivariate meta-analysis, where studies provide multiple outcomes that are pooled in a single analysis, is also becoming more commonly used. The question of how to quantify heterogeneity in the multivariate setting is therefore raised. It is the univariate R2 statistic, the ratio of the variance of the estimated treatment effect under the random and fixed effects models, that generalises most naturally, so this statistic provides our basis. This statistic is then used to derive a multivariate analogue of I2. We also provide a multivariate H2 statistic, the ratio of a generalisation of Cochran's heterogeneity statistic and its associated degrees of freedom, with an accompanying generalisation of the usual I2 statistic. Our proposed heterogeneity statistics can be used alongside all the usual estimates and inferential procedures used in multivariate meta-analysis. We apply our methods to some real datasets and show how our statistics are equally appropriate in the context of multivariate meta-regression, where study level covariate effects are included in the model. Our heterogeneity statistics may be used when applying any procedure for fitting the multivariate random effects model. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22763950
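
    For orientation, the univariate building blocks (Cochran's Q, H2, and I2) can be computed in a few lines; the multivariate statistics proposed in the paper replace these scalars with matrix-valued generalisations. The study estimates and variances below are invented for illustration.

        import numpy as np

        def univariate_heterogeneity(y, v):
            # Cochran's Q, H^2 and I^2 for k study estimates y with variances v.
            y, v = np.asarray(y, float), np.asarray(v, float)
            w = 1.0 / v                          # fixed-effect weights
            y_fe = np.sum(w * y) / np.sum(w)     # fixed-effect pooled estimate
            q = np.sum(w * (y - y_fe) ** 2)      # Cochran's heterogeneity statistic
            df = len(y) - 1
            h2 = q / df                          # Q over its degrees of freedom
            i2 = max(0.0, (q - df) / q) * 100.0  # percent impact of heterogeneity
            return q, h2, i2

        print(univariate_heterogeneity([0.30, 0.45, 0.12, 0.60],
                                       [0.01, 0.02, 0.015, 0.03]))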

  9. Shock and Vibration Symposium (59th) Held in Albuquerque, New Mexico on 18-20 October 1988. Volume 3

    DTIC Science & Technology

    1988-10-01

    N. F. Rieger; Statistical Energy Analysis: An Overview of Its Development and Engineering Applications, J. E. Manning; DATA BASES, DOE/DOD Environmental ... Vibroacoustic Response Using the Finite Element Method and Statistical Energy Analysis, F. L. Gloyna; Study of Helium Effect on Spacecraft Random Vibration ... Analysis, S. A. Wilkerson; DYNAMIC ANALYSIS: Modeling of Vibration Transmission in a Damped Beam Structure Using Statistical Energy Analysis, S. S

  10. Analysis and Report on SD2000: A Workshop to Determine Structural Dynamics Research for the Millenium

    DTIC Science & Technology

    2000-04-10

    interest. These include Statistical Energy Analysis (SEA), fuzzy structure theory, and approaches combining modal analysis and SEA. Non-determinism ... arising with increasing frequency. This has led to Statistical Energy Analysis, in which a system is modelled as a collection of coupled subsystems ... 22. IUTAM Symposium on Statistical Energy Analysis, 1999. Ed. F. J. Fahy and W. G. Price. Kluwer Academic Publishing. 23. R. S. Langley and P

  11. The Role of Anisotropy and Intermittency in Solar Wind/Magnetosphere Coupling

    NASA Astrophysics Data System (ADS)

    Jankovicova, D.; Voros, Z.

    2006-12-01

    Turbulent fluctuations are common in the solar wind as well as in the Earth's magnetosphere. The fluctuations of both magnetic field and plasma parameters exhibit non-Gaussian statistics. Neither the amplitude of these fluctuations nor their spectral characteristics can provide a full statistical description of multi-scale features in turbulence. This motivates a statistical approach that includes the estimation of experimentally accessible statistical moments. In this contribution, we directly estimate the third (skewness) and the fourth (kurtosis) statistical moments from the available time series of magnetic measurements in the solar wind (ACE and WIND spacecraft) and in the Earth's magnetosphere (SYM-H index). We then evaluate how the statistical moments change during strong and weak solar wind/magnetosphere coupling intervals.
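
    Estimating the third and fourth moments from a measured time series is a one-liner with standard tools; in this minimal sketch a heavy-tailed random sample stands in for the actual ACE/WIND or SYM-H data.

        import numpy as np
        from scipy import stats

        # Stand-in for a magnetic-field time series (e.g. one component of B).
        rng = np.random.default_rng(2)
        b = rng.standard_t(df=4, size=10_000)    # heavy-tailed, non-Gaussian

        skew = stats.skew(b)                     # third standardized moment
        kurt = stats.kurtosis(b, fisher=True)    # excess kurtosis (0 if Gaussian)
        print(f"skewness = {skew:.3f}, excess kurtosis = {kurt:.3f}")

    Computing these moments in sliding windows of the series is one way to track how they change between strong and weak coupling intervals.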

  12. Accuracy of Buccal Scan Procedures for the Registration of Habitual Intercuspation.

    PubMed

    Zimmermann, M; Ender, A; Attin, T; Mehl, A

    2018-04-09

    Accurate reproduction of the jaw relationship is important in many fields of dentistry. Maximum intercuspation can be registered with digital buccal scan procedures implemented in the workflow of many intraoral scanning systems. The aim of this study was to investigate the accuracy of buccal scan procedures with intraoral scanning devices for the registration of habitual intercuspation in vivo. The hypothesis was that there is no statistically significant difference for buccal scan procedures compared to registration methods with poured model casts. Ten individuals (full dentition, no dental rehabilitations) were subjects for five different habitual intercuspation registration methods: (CI) poured model casts, manual hand registration, buccal scan with inEOS X5; (BC) intraoral scan, buccal scan with CEREC Bluecam; (OC4.2) intraoral scan, buccal scan with CEREC Omnicam software version 4.2; (OC4.5β) intraoral scan, buccal scan with CEREC Omnicam version 4.5β; and (TR) intraoral scan, buccal scan with Trios 3. The buccal scan was repeated three times. Analysis of rotation (Rot) and translation (Trans) parameters was performed with difference analysis software (OraCheck). Statistical analysis was performed with one-way analysis of variance and the post hoc Scheffé test (p < 0.05). Statistical analysis showed no significant (p > 0.05) differences in terms of translation between groups CI_Trans (98.74±112.01 μm), BC_Trans (84.12±64.95 μm), OC4.2_Trans (60.70±35.08 μm), OC4.5β_Trans (68.36±36.67 μm), and TR_Trans (66.60±64.39 μm). For rotation, there were no significant differences (p > 0.05) for groups CI_Rot (0.23±0.25°), BC_Rot (0.73±0.52°), OC4.2_Rot (0.45±0.31°), OC4.5β_Rot (0.50±0.36°), and TR_Rot (0.47±0.65°). Intraoral scanning devices allow the reproduction of the static relationship of the maxillary and mandibular teeth with the same accuracy as registration methods with poured model casts.

  13. NIH funding in Radiation Oncology – A snapshot

    PubMed Central

    Steinberg, Michael; McBride, William H.; Vlashi, Erina; Pajonk, Frank

    2013-01-01

    Currently, pay lines for NIH grants are at a historical low. In this climate of fierce competition, knowledge about the funding situation in a small field like Radiation Oncology becomes very important for career planning and recruitment of faculty. Unfortunately, this information cannot be easily extracted from the NIH's database because it does not discriminate between Radiology and Radiation Oncology Departments. At the start of fiscal year 2013, we extracted records for 952 individual grants, which were active at the time of analysis, from the NIH database. Proposals originating from Radiation Oncology Departments were identified manually. Descriptive statistics were generated using the JMP statistical software package. Our analysis identified 197 grants in Radiation Oncology. These proposals came from 134 individual investigators in 43 academic institutions. The majority of the grants (118) were awarded to PIs at the Full Professor level, and 122 PIs held a PhD degree. In 79% of the grants the research topic fell into the field of Biology, and in 13% into the field of Medical Physics. Only 7.6% of the proposals were clinical investigations. Our data suggest that the field of Radiation Oncology is underfunded by the NIH, and that the current level of support does not match the relevance of Radiation Oncology for cancer patients or the potential of its academic work force. PMID:23523324

  14. Measuring determinants of career satisfaction of anesthesiologists: validation of a survey instrument.

    PubMed

    Afonso, Anoushka M; Diaz, James H; Scher, Corey S; Beyl, Robbie A; Nair, Singh R; Kaye, Alan David

    2013-06-01

    To measure determinants of job satisfaction among anesthesiologists. Survey instrument. Academic anesthesiology departments in the United States. 320 anesthesiologists who attended the annual meeting of the ASA in 2009 (95% response rate). The anonymous 50-item survey collected information on 26 independent demographic variables and 24 dependent ranked variables of career satisfaction among practicing anesthesiologists. Mean survey scores were calculated for each demographic variable and tested for statistically significant differences by analysis of variance. Questions that were internally consistent with each other within domains were identified by Cronbach's alpha ≥ 0.7. P-values ≤ 0.05 were considered statistically significant. Cronbach's alpha analysis showed strong internal consistency for 10 dependent outcome questions in the practice factor-related domain (α = 0.72), 6 dependent outcome questions in the peer factor-related domain (α = 0.71), and 8 dependent outcome questions in the personal factor-related domain (α = 0.81). Although age was not a significant factor, full-time status, early satisfaction within the first 5 years of practice, working with respected peers, and personal choice factors were all significantly associated with anesthesiologist job satisfaction. Improvements in factors related to job satisfaction among anesthesiologists may lead to higher early and current career satisfaction. Copyright © 2013 Elsevier Inc. All rights reserved.
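
    For reference, the Cronbach's alpha used for the domain-consistency checks can be computed directly from an item-score matrix; the toy scores below are invented for illustration.

        import numpy as np

        def cronbach_alpha(items):
            # Alpha for an (n_respondents, n_items) array of ranked scores.
            items = np.asarray(items, float)
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()  # sum of item variances
            total_var = items.sum(axis=1).var(ddof=1)   # variance of scale total
            return k / (k - 1) * (1.0 - item_var / total_var)

        # A toy domain: 4 ranked items answered by 6 respondents.
        domain = [[4, 5, 4, 4], [3, 4, 3, 3], [5, 5, 4, 5],
                  [2, 3, 2, 3], [4, 4, 4, 5], [3, 3, 3, 3]]
        print(f"alpha = {cronbach_alpha(domain):.2f}")  # >= 0.7 suggests consistency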

  15. Statistical Enrichment of Epigenetic States Around Triplet Repeats that Can Undergo Expansions

    PubMed Central

    Essebier, Alexandra; Vera Wolf, Patricia; Cao, Minh Duc; Carroll, Bernard J.; Balasubramanian, Sureshkumar; Bodén, Mikael

    2016-01-01

    More than 30 human genetic diseases are linked to tri-nucleotide repeat expansions. There is no known mechanism that explains repeat expansions in full, but changes in the epigenetic state of the associated locus have been implicated in the disease pathology for a growing number of examples. A comprehensive comparative analysis of the genomic features associated with diverse repeat expansions has been lacking. Here, in an effort to decipher the propensity of repeats to undergo expansion and result in a disease state, we determine the genomic coordinates of tri-nucleotide repeat tracts at base pair resolution and computationally establish epigenetic profiles around them. Using three complementary statistical tests, we reveal that several epigenetic states are enriched around repeats that are associated with disease, even in cells that do not harbor expansion, relative to a carefully stratified background. Analysis of over one hundred cell types reveals that epigenetic states generally tend to vary widely between genic regions and cell types. However, there is qualified consistency in the epigenetic signatures of repeats associated with disease, suggesting that changes to the chromatin and the DNA around an expanding repeat locus are likely to be similar. These epigenetic signatures may be exploited further to develop models that could explain the propensity of repeats to undergo expansions. PMID:27013954

  16. Angular power spectrum in publicly released ALICE events

    NASA Astrophysics Data System (ADS)

    Llanes-Estrada, Felipe J.; Muñoz Martinez, Jose L.

    2018-02-01

    We study the particles emitted in the fireball following a Relativistic Heavy Ion Collision with the traditional angular analysis employed in cosmology and earth sciences, producing Mollweide plots of the number and pt distribution of a few actual, publicly released ALICE-collaboration events and calculating their angular power spectrum. We also examine the angular spectrum of a simple two-particle correlation. While this may not be the optimal way of analyzing heavy ion data, our intention is to provide a one-to-one comparison to analysis in cosmology. With the limited statistics at hand, we do not find evidence for acoustic peaks, but rather a decrease of Cl reminiscent of viscous attenuation, subject to a strong effect from the rapidity acceptance which probably dominates (so we also subtract the m = 0 component). As an exercise, we still extract a characteristic Silk damping length (proportional to the square root of the viscosity over entropy density ratio) to illustrate the method. The absence of acoustic-like peaks is also compatible with a crossover from the QGP to the hadron gas (because a surface tension at domain boundaries would effect a restoring force that could have driven acoustic oscillations). Presently we do not understand a depression of the l = 6 multipole strength; perhaps ALICE could reexamine it with full statistics.
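
    The cosmology-style machinery can be sketched with standard HEALPix tooling: bin track directions onto the sphere, form the number-density fluctuation map, and compute Cl. The snippet below uses random stand-in angles, not actual ALICE tracks, and is an assumed workflow rather than the authors' analysis code.

        import numpy as np
        import healpy as hp

        rng = np.random.default_rng(3)
        nside = 16
        theta = np.arccos(rng.uniform(-1, 1, 2000))  # polar angle per track
        phi = rng.uniform(0, 2 * np.pi, 2000)        # azimuth per track

        # Histogram the tracks onto HEALPix pixels.
        counts = np.zeros(hp.nside2npix(nside))
        np.add.at(counts, hp.ang2pix(nside, theta, phi), 1.0)

        overdensity = counts / counts.mean() - 1.0   # number fluctuation map
        cl = hp.anafast(overdensity, lmax=32)        # angular power spectrum C_l
        # Suppressing the rapidity-acceptance effect by dropping m = 0 terms
        # would require manipulating the alm directly, e.g. via hp.map2alm.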

  17. An overview of the mathematical and statistical analysis component of RICIS

    NASA Technical Reports Server (NTRS)

    Hallum, Cecil R.

    1987-01-01

    Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.

  18. Statistical Energy Analysis (SEA) and Energy Finite Element Analysis (EFEA) Predictions for a Floor-Equipped Composite Cylinder

    NASA Technical Reports Server (NTRS)

    Grosveld, Ferdinand W.; Schiller, Noah H.; Cabell, Randolph H.

    2011-01-01

    Comet Enflow is a commercially available, high-frequency vibroacoustic analysis software package founded on Energy Finite Element Analysis (EFEA) and Energy Boundary Element Analysis (EBEA). EFEA was validated on a floor-equipped composite cylinder by comparing EFEA vibroacoustic response predictions with Statistical Energy Analysis (SEA) predictions and experimental results. The SEA predictions were made using the commercial software program VA One 2009 from ESI Group. The frequency region of interest for this study covers the one-third octave bands with center frequencies from 100 Hz to 4000 Hz.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P E; Harris, D; Myers, S

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design and utilization, in full 3D finite difference modeling, and in statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications.

  20. An Analysis of the Navy’s Voluntary Education Program

    DTIC Science & Technology

    2007-03-01

    Table-of-contents excerpt: NAVAL ANALYSIS VOLED STUDY (1. Data; 2. Statistical Models; 3. ...); B. EMPLOYER FINANCED GENERAL TRAINING (1. Data; 2. Statistical Model; ...); (1. Data; 2. Statistical Model; 3. Findings)

  1. The Shock and Vibration Digest. Volume 14, Number 12

    DTIC Science & Technology

    1982-12-01

    to evaluate the uses of statistical energy analysis for determining sound transmission performance. Coupling loss factors were measured and compared ... measurements for the artificial (Also see No. 2623) cracks in mild-steel test pieces. 82-2676 Improvement of the Method of Statistical Energy Analysis for ... eters, using a large number of free-response time histories simultaneously in one analysis ... the application of the statistical energy analysis theory

  2. Statistical Symbolic Execution with Informed Sampling

    NASA Technical Reports Server (NTRS)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
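
    The Bayesian estimation step can be illustrated with conjugate Beta-Bernoulli updating over sampled path outcomes; the counts below are invented, and this sketch omits the informed-sampling and exact-pruning machinery that the paper adds on top.

        from scipy import stats

        # Uniform Beta(1, 1) prior over the probability of reaching the target.
        alpha, beta_ = 1.0, 1.0
        hits, misses = 37, 963        # sampled paths reaching / missing the event

        alpha += hits
        beta_ += misses
        posterior = stats.beta(alpha, beta_)
        lo, hi = posterior.ppf([0.025, 0.975])   # 95% credible interval
        print(f"P(target) ~ {posterior.mean():.4f}  [{lo:.4f}, {hi:.4f}]")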

  3. The estimation of measurement results using statistical methods

    NASA Astrophysics Data System (ADS)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A number of international standards and guides describe various statistical methods that can be applied to the management, control, and improvement of processes, and to the analysis of technical measurement results. An analysis of international standards and guides on statistical methods for the estimation of measurement results, together with recommendations for their application in laboratories, is presented. To support this analysis of the standards and guides, cause-and-effect (Ishikawa) diagrams concerning the application of statistical methods to the estimation of measurement results are constructed.

  4. Effect of Graft Thickness on Visual Acuity After Descemet Stripping Endothelial Keratoplasty: A Systematic Review and Meta-Analysis.

    PubMed

    Wacker, Katrin; Bourne, William M; Patel, Sanjay V

    2016-03-01

    To assess the relationship between graft thickness and best-corrected visual acuity (BCVA) after Descemet stripping endothelial keratoplasty (DSEK). Systematic review and meta-analysis. PubMed, EMBASE, Web of Science, and conference abstracts were searched for studies published up to October 2015 with standard systematic review methodology. Eligibility criteria included studies evaluating graft thickness in primary DSEK and visual outcomes. There were no restrictions to study design, study population, or language. Correlation coefficients were pooled using random-effects models. Of 480 articles and conference abstracts, 31 met inclusion criteria (2214 eyes) after full-text review. Twenty-three studies assessed correlations between BCVA and graft thickness, and 8 studies used other statistical methods. All associations were reported as dimensionless quantities. Studies generally had small sample sizes and were heterogeneous, especially with respect to data and analysis quality (P = .02). Most studies did not measure BCVA in a standardized manner. The pooled correlation coefficient for graft thickness vs BCVA was 0.20 (95% CI, 0.14-0.26) for 17 studies without data concerns; this did not include 7 studies (815 eyes) that used different statistical methods and did not find significant associations. There is insufficient evidence that graft thickness is clinically important with respect to BCVA after DSEK, with meta-analysis suggesting a weak relationship. Although well-designed longitudinal studies with standardized measurements of visual acuity and graft thickness are necessary to better characterize this relationship, current evidence suggests that graft thickness is not important for surgical planning. Copyright © 2016 Elsevier Inc. All rights reserved.
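
    A standard recipe for pooling correlation coefficients under a random-effects model uses Fisher's z transform with DerSimonian-Laird estimation; the sketch below shows that generic recipe on made-up study values and is not the authors' exact procedure.

        import numpy as np

        def pool_correlations(r, n):
            # Random-effects pooling of correlations via Fisher's z.
            r, n = np.asarray(r, float), np.asarray(n, float)
            z = np.arctanh(r)                    # Fisher z transform
            v = 1.0 / (n - 3.0)                  # approximate within-study variance
            w = 1.0 / v
            z_fe = np.sum(w * z) / np.sum(w)
            q = np.sum(w * (z - z_fe) ** 2)      # heterogeneity statistic
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (len(r) - 1)) / c)  # between-study variance
            w_re = 1.0 / (v + tau2)              # random-effects weights
            z_re = np.sum(w_re * z) / np.sum(w_re)
            return np.tanh(z_re)                 # pooled r, back-transformed

        print(pool_correlations([0.15, 0.30, 0.10, 0.25], [60, 45, 80, 50]))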

  5. Limitations of Using Microsoft Excel Version 2016 (MS Excel 2016) for Statistical Analysis for Medical Research.

    PubMed

    Tanavalee, Chotetawan; Luksanapruksa, Panya; Singhatanadgige, Weerasak

    2016-06-01

    Microsoft Excel (MS Excel) is a commonly used program for data collection and statistical analysis in biomedical research. However, this program has many limitations, including fewer functions that can be used for analysis and a limited number of total cells compared with dedicated statistical programs. MS Excel cannot complete analyses with blank cells, and cells must be selected manually for analysis. In addition, it requires multiple steps of data transformation and formulas to plot survival analysis graphs, among others. The MegaStat add-in program, which will be supported by MS Excel 2016 soon, would eliminate some of the limitations of using statistical formulas within MS Excel.

  6. Database Creation and Statistical Analysis: Finding Connections Between Two or More Secondary Storage Devices

    DTIC Science & Technology

    2017-09-01

    NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. THESIS: Database Creation and Statistical Analysis: Finding Connections Between Two or More Secondary Storage Devices. Approved for public release; distribution is unlimited. Table-of-contents excerpt: 1.1 Problem and Motivation; 1.2 DOD Applicability; 1.3 Research

  7. Vibration Transmission through Rolling Element Bearings in Geared Rotor Systems

    DTIC Science & Technology

    1990-11-01

    Table-of-contents excerpt: 4.8 Concluding Remarks; V. Statistical Energy Analysis ... and dynamic finite element techniques are used to develop the discrete vibration models while the statistical energy analysis method is used for the broad ... bearing system studies, geared rotor system studies, and statistical energy analysis. Each chapter is self-sufficient since it is written in a

  8. Statistics Education Research in Malaysia and the Philippines: A Comparative Analysis

    ERIC Educational Resources Information Center

    Reston, Enriqueta; Krishnan, Saras; Idris, Noraini

    2014-01-01

    This paper presents a comparative analysis of statistics education research in Malaysia and the Philippines by modes of dissemination, research areas, and trends. An electronic search for published research papers in the area of statistics education from 2000-2012 yielded 20 for Malaysia and 19 for the Philippines. Analysis of these papers showed…

  9. ReSeqTools: an integrated toolkit for large-scale next-generation sequencing based resequencing analysis.

    PubMed

    He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z

    2013-12-04

    Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.

  10. Laparoscopic versus open pyloromyotomy for infantile hypertropic pyloric stenosis: an early experience.

    PubMed

    Saha, N; Saha, D K; Rahman, M A; Aziz, M A; Islam, M K

    2012-07-01

    This prospective comparative study was conducted, as an initial experience, in the Department of Pediatric Surgery, Dhaka Shishu (Children) Hospital during the period of December 2007 to January 2009, with infants of 2-12 weeks of age diagnosed with hypertrophic pyloric stenosis. Patient selection was done by a simple random technique (lottery). For open pyloromyotomy the conventional method was used, and for laparoscopic pyloromyotomy a three-trocar technique was applied. In this study, among 60 cases with infantile hypertrophic pyloric stenosis, 30 cases were finally selected for analysis in each of the laparoscopic (Group A) and open pyloromyotomy (Group B) groups. Patients were studied with respect to operative time, time to full feeds after operation, post-operative hospital stay, and both per-operative and post-operative complications. Regarding operative time, in Group A the mean±SD operating time (in minutes) was 61.59±51.73, whereas in Group B it was 28.33±8.40 (P = 0.001); the result was statistically significant. The mean±SD time (in hours) to full feeds (ad libitum) was 35.00±31.70 hours in Group A compared to 28.95±10.99 hours in Group B (P = 0.342), which was not statistically significant. For the total length (in days) of post-operative hospital stay, the mean±SD was 3.09±2.25 and 2.58±1.15 days in the laparoscopic and open pyloromyotomy groups, respectively (P = 0.355), which was statistically insignificant. Regarding complications, per-operatively 6 (19.5%) patients developed haemorrhage, 1 (3.33%) had a mucosal perforation, and 4 (13.36%) developed a duodenal serosal injury in the laparoscopic group, whereas only 1 (3.33%) patient in the open pyloromyotomy group had a simple haemorrhage; this difference (P = 0.051) was also statistically insignificant. With regard to post-operative complications, in Group A 2 (6.6%) patients developed a wound hematoma, 2 (6.6%) had a wound infection, and 1 (3.33%) each developed wound dehiscence and an incisional hernia, whereas in Group B no patient had any complication; this result was also statistically insignificant. Overall, the study results indicate that, for beginners, laparoscopic pyloromyotomy should not be considered superior to, or as safe as, traditional open pyloromyotomy.

  11. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    PubMed

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  12. Treatments of Missing Values in Large National Data Affect Conclusions: The Impact of Multiple Imputation on Arthroplasty Research.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Su, Edwin P; Grauer, Jonathan N

    2018-03-01

    Despite the advantages of large, national datasets, one continuing concern is missing data values. Complete case analysis, where only cases with complete data are analyzed, is commonly used rather than more statistically rigorous approaches such as multiple imputation. This study characterizes the potential selection bias introduced using complete case analysis and compares the results of common regressions using both techniques following unicompartmental knee arthroplasty. Patients undergoing unicompartmental knee arthroplasty were extracted from the 2005 to 2015 National Surgical Quality Improvement Program. As examples, the demographics of patients with and without missing preoperative albumin and hematocrit values were compared. Missing data were then treated with both complete case analysis and multiple imputation (an approach that reproduces the variation and associations that would have been present in a full dataset) and the conclusions of common regressions for adverse outcomes were compared. A total of 6117 patients were included, of which 56.7% were missing at least one value. Younger, female, and healthier patients were more likely to have missing preoperative albumin and hematocrit values. The use of complete case analysis removed 3467 patients from the study in comparison with multiple imputation which included all 6117 patients. The 2 methods of handling missing values led to differing associations of low preoperative laboratory values with commonly studied adverse outcomes. The use of complete case analysis can introduce selection bias and may lead to different conclusions in comparison with the statistically rigorous multiple imputation approach. Joint surgeons should consider the methods of handling missing values when interpreting arthroplasty research. Copyright © 2017 Elsevier Inc. All rights reserved.
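
    The contrast between complete case analysis and imputation is easy to demonstrate; below is a hedged Python sketch using scikit-learn's IterativeImputer as a stand-in for a full multiple-imputation procedure, with invented patient data.

        import numpy as np
        import pandas as pd
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer

        # Toy dataset with missing preoperative labs; complete case analysis
        # drops the rows with any missing value.
        df = pd.DataFrame({
            "age":        [62, 55, 71, 48, 66],
            "albumin":    [4.1, np.nan, 3.6, np.nan, 3.9],
            "hematocrit": [41.0, 38.5, np.nan, 42.3, 39.1],
        })
        complete_cases = df.dropna()   # loses 3 of 5 patients here

        # Several stochastic imputations (different seeds), mimicking
        # multiple imputation: analyze each and pool the estimates.
        imputed = [
            pd.DataFrame(
                IterativeImputer(random_state=s,
                                 sample_posterior=True).fit_transform(df),
                columns=df.columns)
            for s in range(5)
        ]

    In a genuine multiple-imputation analysis, the regression of interest is fitted to each imputed dataset and the estimates are combined with Rubin's rules.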

  13. Should palonosetron be a preferred 5-HT3 receptor antagonist for chemotherapy-induced nausea and vomiting? An updated systematic review and meta-analysis.

    PubMed

    Chow, Ronald; Warr, David G; Navari, Rudolph M; Tsao, May; Popovic, Marko; Chiu, Leonard; Milakovic, Milica; Lam, Henry; DeAngelis, Carlo

    2018-05-23

    Chemotherapy-induced nausea and vomiting (CINV) continues to be a common side effect of systemic anticancer therapy, decreasing quality of life and increasing resource utilization. The aim of this meta-analysis was to investigate the comparative efficacy and safety of palonosetron relative to other 5-HT3 receptor antagonists (RAs). A literature search was carried out in Ovid MEDLINE, Embase, and the Cochrane Central Register of Controlled Trials. Full-text references were then screened and included in this meta-analysis if they were an RCT and had adequate data regarding one of the five primary endpoints: complete response (CR), complete control (CC), no emesis, no nausea, or no rescue medications. A total of 24 RCTs were included in this review. Palonosetron was statistically superior to other 5-HT3 RAs for 10 of the 19 assessed endpoints. Only one endpoint, emesis in the overall phase, had noticeably more favorable data for palonosetron, to the point that it approached the 10% risk difference (RD) threshold specified by the MASCC/ESMO antiemetic panel; another two endpoints (CR in the overall phase and nausea in the delayed phase) approached the 10% threshold. Palonosetron seems to be more efficacious and safer than other 5-HT3 RAs, being statistically superior in 10 of 19 endpoints. It is, however, clinically significant in only one endpoint and approached a clinically significant difference in another two. Within the limits of this meta-analysis, our results indicate that palonosetron may not be as superior in efficacy and safety as reported in a previous meta-analysis, and they support the recent MASCC/ESMO, ASCO, and NCCN guidelines in not generally indicating palonosetron as the 5-HT3 RA of choice.

  14. Summary Statistics of Public TV Licensees, 1972.

    ERIC Educational Resources Information Center

    Lee, S. Young; Pedone, Ronald J.

    Statistics in the areas of finance, employment, broadcast, and production for public TV licensees in 1972 are given in this report. Tables in the area of finance are presented specifying total funds, income, direct operating costs, and capital expenditures. Employment is divided into all employment with subdivisions for full- and part-time employees…

  15. Canadian Statistical Review. Volume 53, Number 7, July 1978.

    ERIC Educational Resources Information Center

    von Zur-Muehlen, Max

    1978-01-01

    Information on Canadian social and economic trends is presented in this statistical review. Advance information on national income and expenditure accounts for the first quarter of 1978 is provided. Characteristics of full-time university teachers from 1956-57 to 1977-78 are detailed in tables that recount such developments as the nearly six-fold…

  16. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
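
    An interrupted time series with segmented regression reduces to an ordinary least-squares model with level-change and slope-change terms; the sketch below uses simulated monthly data with an intervention at month 24, not data from the decision-support evaluation.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(4)
        t = np.arange(48)
        post = (t >= 24).astype(int)                   # intervention indicator
        y = (50 + 0.1 * t                              # baseline level and trend
             - 4.0 * post                              # level change
             - 0.3 * post * (t - 24)                   # slope change
             + rng.normal(0, 1.5, 48))                 # noise

        df = pd.DataFrame({"y": y, "time": t, "post": post,
                           "time_after": post * (t - 24)})
        fit = smf.ols("y ~ time + post + time_after", data=df).fit()
        print(fit.params)   # estimated baseline trend, level and slope changes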

  17. A stochastic model of particle dispersion in turbulent reacting gaseous environments

    NASA Astrophysics Data System (ADS)

    Sun, Guangyuan; Lignell, David; Hewson, John

    2012-11-01

    We are performing fundamental studies of dispersive transport and time-temperature histories of Lagrangian particles in turbulent reacting flows. The particle-flow statistics, including the full particle temperature PDF, are of interest. A challenge in modeling particle motions is the accurate prediction of fine-scale aerosol-fluid interactions. A computationally affordable stochastic modeling approach, one-dimensional turbulence (ODT), is a proven method that captures the full range of length and time scales and provides detailed statistics of fine-scale turbulent-particle mixing and transport. Limited results of particle transport in ODT have been reported in non-reacting flow. Here, we extend ODT to particle transport in reacting flow. The results of particle transport in three flow configurations are presented: channel flow, homogeneous isotropic turbulence, and jet flames. We investigate the functional dependence of the statistics of particle-flow interactions, including (1) a parametric study with varying temperatures, Reynolds numbers, and particle Stokes numbers; (2) particle temperature histories and PDFs; and (3) time scales and the sensitivity to initial and boundary conditions. Flow statistics are compared to both experimental measurements and DNS data.

  18. General Framework for Meta-analysis of Rare Variants in Sequencing Association Studies

    PubMed Central

    Lee, Seunggeun; Teslovich, Tanya M.; Boehnke, Michael; Lin, Xihong

    2013-01-01

    We propose a general statistical framework for meta-analysis of gene- or region-based multimarker rare variant association tests in sequencing association studies. In genome-wide association studies, single-marker meta-analysis has been widely used to increase statistical power by combining results via regression coefficients and standard errors from different studies. In analysis of rare variants in sequencing studies, region-based multimarker tests are often used to increase power. We propose meta-analysis methods for commonly used gene- or region-based rare variants tests, such as burden tests and variance component tests. Because estimation of regression coefficients of individual rare variants is often unstable or not feasible, the proposed method avoids this difficulty by calculating score statistics instead that only require fitting the null model for each study and then aggregating these score statistics across studies. Our proposed meta-analysis rare variant association tests are conducted based on study-specific summary statistics, specifically score statistics for each variant and between-variant covariance-type (linkage disequilibrium) relationship statistics for each gene or region. The proposed methods are able to incorporate different levels of heterogeneity of genetic effects across studies and are applicable to meta-analysis of multiple ancestry groups. We show that the proposed methods are essentially as powerful as joint analysis by directly pooling individual level genotype data. We conduct extensive simulations to evaluate the performance of our methods by varying levels of heterogeneity across studies, and we apply the proposed methods to meta-analysis of rare variant effects in a multicohort study of the genetics of blood lipid levels. PMID:23768515
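
    The core idea, aggregating score statistics rather than coefficient estimates, can be sketched for a simple burden test under effect homogeneity. The function below is a schematic of that framework with invented summary statistics, not the authors' implementation.

        import numpy as np
        from scipy import stats

        def meta_burden_test(scores, covariances, weights):
            # Combine per-study score vectors S_k and covariance (LD-type)
            # matrices V_k into a single 1-df burden test.
            w = np.asarray(weights, float)
            u = sum(w @ np.asarray(s, float) for s in scores)
            v = sum(w @ np.asarray(V, float) @ w for V in covariances)
            chi2 = u ** 2 / v
            return chi2, stats.chi2.sf(chi2, df=1)

        # Two studies, three rare variants, equal variant weights.
        s1, s2 = [2.1, -0.4, 1.3], [1.5, 0.2, 0.9]
        v1, v2 = np.eye(3) + 0.10, np.eye(3) + 0.05
        print(meta_burden_test([s1, s2], [v1, v2], [1.0, 1.0, 1.0]))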

  19. Primary, Secondary, and Meta-Analysis of Research

    ERIC Educational Resources Information Center

    Glass, Gene V.

    1976-01-01

    Examines data analysis at three levels: primary analysis is the original analysis of data in a research study; secondary analysis is the re-analysis of data for the purpose of answering the original research question with better statistical techniques, or answering new questions with old data; and meta-analysis refers to the statistical analysis of many analysis results from individual studies for…

  20. Effluent concentration and removal efficiency of nine heavy metals in secondary treatment plants in Shanghai, China.

    PubMed

    Feng, Jingjing; Chen, Xiaolin; Jia, Lei; Liu, Qizhen; Chen, Xiaojia; Han, Deming; Cheng, Jinping

    2018-04-10

    Wastewater treatment plants (WWTPs) are the most common form of industrial and municipal wastewater control. To evaluate the performance of wastewater treatment and the potential risk of treated wastewater to aquatic life and human health, the influent and effluent concentrations of nine toxic metals were determined in 12 full-scale WWTPs in Shanghai, China. The performance was evaluated based on national standards for reclamation and aquatic criteria published by the US EPA, and by comparison with other full-scale WWTPs in different countries. Potential sources of heavy metals were recognized using partial correlation analysis, hierarchical clustering, and principal component analysis (PCA). Results indicated a significant treatment effect on As, Cd, Cr, Cu, Hg, Mn, Pb, and Zn. The removal efficiencies ranged from 92% (Cr) to 16.7% (Hg). The results indicated a potential acute and/or chronic effect of Cu, Ni, Pb, and Zn on aquatic life and a potentially harmful effect of As and Mn on human health through the consumption of water and/or organisms. The results of partial correlation analysis, hierarchical clustering based on cosine distance, and PCA, which were consistent with each other, suggested a common source of Cd, Cr, Cu, and Pb and a common source of As, Hg, Mn, Ni, and Zn. Hierarchical clustering based on Jaccard similarity suggested a common source of Cd, Hg, and Ni, which was statistically supported by Fisher's exact test.

  1. MALDI Orbitrap Mass Spectrometry Profiling of Dysregulated Sulfoglycosphingolipids in Renal Cell Carcinoma Tissues

    NASA Astrophysics Data System (ADS)

    Jirásko, Robert; Holčapek, Michal; Khalikova, Maria; Vrána, David; Študent, Vladimír; Prouzová, Zuzana; Melichar, Bohuslav

    2017-08-01

    Matrix-assisted laser desorption/ionization coupled with Orbitrap mass spectrometry (MALDI-Orbitrap-MS) is used for the clinical study of patients with renal cell carcinoma (RCC), the most common type of kidney cancer. Significant changes in sulfoglycosphingolipid abundances between tumor and autologous normal kidney tissues are observed. First, sulfoglycosphingolipid species in the studied RCC samples are identified using high mass accuracy full scan and tandem mass spectra. Subsequently, optimization, method validation, and statistical evaluation of MALDI-MS data for 158 tissues of 80 patients are discussed. More than 120 sulfoglycosphingolipids containing one to five hexosyl units are identified in human RCC samples based on a systematic study of their fragmentation behavior. Many of them are recorded here for the first time. Multivariate data analysis (MDA) methods, i.e., unsupervised principal component analysis (PCA) and supervised orthogonal partial least square discriminant analysis (OPLS-DA), are used for the visualization of differences between normal and tumor samples to reveal the most up- and downregulated lipids in tumor tissues. The obtained results correlate closely with MALDI mass spectrometry imaging (MSI) and histologic staining. Important steps of the present MALDI-Orbitrap-MS approach are also discussed, such as the selection of the best matrix, correct normalization, validation for the semiquantitative study, and problems with possible isobaric interferences at close masses in full scan mass spectra.

  2. [Statistical analysis using freely-available "EZR (Easy R)" software].

    PubMed

    Kanda, Yoshinobu

    2015-10-01

    Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical function of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named it "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows the application of statistical functions that are frequently used in clinical studies, such as survival analyses, including competing risk analyses and the use of time-dependent covariates, and so on, by point-and-click access. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and assure that the statistical process is overseen by a supervisor.

  3. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741

  4. Sensitivity of the Hydrogen Epoch of Reionization Array and its build-out stages to one-point statistics from redshifted 21 cm observations

    NASA Astrophysics Data System (ADS)

    Kittiwisit, Piyanat; Bowman, Judd D.; Jacobs, Daniel C.; Beardsley, Adam P.; Thyagarajan, Nithyanandan

    2018-03-01

    We present a baseline sensitivity analysis of the Hydrogen Epoch of Reionization Array (HERA) and its build-out stages to one-point statistics (variance, skewness, and kurtosis) of redshifted 21 cm intensity fluctuation from the Epoch of Reionization (EoR) based on realistic mock observations. By developing a full-sky 21 cm light-cone model, taking into account the proper field of view and frequency bandwidth, utilizing a realistic measurement scheme, and assuming perfect foreground removal, we show that HERA will be able to recover statistics of the sky model with high sensitivity by averaging over measurements from multiple fields. All build-out stages will be able to detect variance, while skewness and kurtosis should be detectable for HERA128 and larger. We identify sample variance as the limiting constraint of the measurements at the end of reionization. The sensitivity can also be further improved by performing frequency windowing. In addition, we find that strong sample variance fluctuation in the kurtosis measured from an individual field of observation indicates the presence of outlying cold or hot regions in the underlying fluctuations, a feature that can potentially be used as an EoR bubble indicator.

  5. Toward a Better Understanding of the Relationship between Belief in the Paranormal and Statistical Bias: The Potential Role of Schizotypy

    PubMed Central

    Dagnall, Neil; Denovan, Andrew; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2016-01-01

    The present paper examined relationships between schizotypy (measured by the Oxford-Liverpool Inventory of Feelings and Experience; O-LIFE scale brief), belief in the paranormal (assessed via the Revised Paranormal Belief Scale; RPBS) and proneness to statistical bias (i.e., perception of randomness and susceptibility to conjunction fallacy). Participants were 254 volunteers recruited via convenience sampling. Probabilistic reasoning problems appeared framed within both standard and paranormal contexts. Analysis revealed positive correlations between the Unusual Experience (UnExp) subscale of O-LIFE and paranormal belief measures [RPBS full scale, traditional paranormal beliefs (TPB) and new age philosophy]. Performance on standard problems correlated negatively with UnExp and belief in the paranormal (particularly the TPB dimension of the RPBS). Consideration of specific problem types revealed that perception of randomness associated more strongly with belief in the paranormal than conjunction; both problem types related similarly to UnExp. Structural equation modeling specified that belief in the paranormal mediated the indirect relationship between UnExp and statistical bias. For problems presented in a paranormal context a framing effect occurred. Whilst UnExp correlated positively with conjunction proneness (controlling for perception of randomness), there was no association between UnExp and perception of randomness (controlling for conjunction). PMID:27471481

  6. Relationship between self-reported upper limb disability and quantitative tests in hand-arm vibration syndrome.

    PubMed

    Poole, Kerry; Mason, Howard

    2007-03-15

    To establish the relationship between quantitative tests of hand function and upper limb disability, as measured by the Disability of the Arm, Shoulder and Hand (DASH) questionnaire, in hand-arm vibration syndrome (HAVS). A total of 228 individuals with HAVS were included in this study. Each had undergone a full HAVS assessment by an experienced physician, including quantitative tests of vibrotactile and thermal perception thresholds, maximal hand-grip strength (HG), and the Purdue pegboard (PP) test. Individuals were also asked to complete a DASH questionnaire. Of the quantitative tests, PP and HG gave the best, and statistically significant, individual correlations with the DASH disability score (r2 = 0.168 and 0.096, respectively). Stepwise linear regression analysis revealed that only the PP and HG measurements were statistically significant predictors of upper limb disability (r2 = 0.178). Overall, a combination of the PP and HG measurements, rather than each alone, gave slightly better (although not statistically significant) discrimination between normal and abnormal DASH scores, with a sensitivity of 73.1% and specificity of 64.3%. Measurements of manual dexterity and hand-grip strength using PP and HG may be useful in helping to confirm loss of upper limb function and 'perceived' disability in HAVS.

  7. Joint resonant CMB power spectrum and bispectrum estimation

    NASA Astrophysics Data System (ADS)

    Meerburg, P. Daniel; Münchmeyer, Moritz; Wandelt, Benjamin

    2016-02-01

    We develop the tools necessary to assess the statistical significance of resonant features in the CMB correlation functions, combining power spectrum and bispectrum measurements. This significance is typically addressed by running a large number of simulations to derive the probability density function (PDF) of the feature-amplitude in the Gaussian case. Although these simulations are tractable for the power spectrum, for the bispectrum they require significant computational resources. We show that, by assuming that the PDF is given by a multivariate Gaussian where the covariance is determined by the Fisher matrix of the sine and cosine terms, we can efficiently produce spectra that are statistically close to those derived from full simulations. By drawing a large number of spectra from this PDF, both for the power spectrum and the bispectrum, we can quickly determine the statistical significance of candidate signatures in the CMB, considering both single frequency and multifrequency estimators. We show that for resonance models, cosmology and foreground parameters have little influence on the estimated amplitude, which allows us to simplify the analysis considerably. A more precise likelihood treatment can then be applied to candidate signatures only. We also discuss a modal expansion approach for the power spectrum, aimed at quickly scanning through large families of oscillating models.
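
    Drawing feature amplitudes from the Fisher-matrix Gaussian instead of running full simulations can be sketched as follows; the diagonal toy Fisher matrix and the number of candidate frequencies are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(5)
        n_freq = 20                            # candidate oscillation frequencies
        fisher = np.eye(2 * n_freq) * 40.0     # toy Fisher matrix (sin & cos terms)
        cov = np.linalg.inv(fisher)            # covariance of amplitude estimates

        # Null (Gaussian) draws of the sine/cosine amplitude estimates.
        draws = rng.multivariate_normal(np.zeros(2 * n_freq), cov, size=100_000)
        amp = np.hypot(draws[:, ::2], draws[:, 1::2])   # amplitude per frequency

        # Null distribution of the maximum amplitude across frequencies, used
        # to assign look-elsewhere-corrected significance to a candidate peak.
        max_amp = amp.max(axis=1)
        threshold = np.quantile(max_amp, 0.997)
        print(f"~3-sigma feature-amplitude threshold: {threshold:.4f}")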

  8. Toward a Better Understanding of the Relationship between Belief in the Paranormal and Statistical Bias: The Potential Role of Schizotypy.

    PubMed

    Dagnall, Neil; Denovan, Andrew; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2016-01-01

    The present paper examined relationships between schizotypy (measured by the Oxford-Liverpool Inventory of Feelings and Experience; O-LIFE scale brief), belief in the paranormal (assessed via the Revised Paranormal Belief Scale; RPBS) and proneness to statistical bias (i.e., perception of randomness and susceptibility to conjunction fallacy). Participants were 254 volunteers recruited via convenience sampling. Probabilistic reasoning problems appeared framed within both standard and paranormal contexts. Analysis revealed positive correlations between the Unusual Experience (UnExp) subscale of O-LIFE and paranormal belief measures [RPBS full scale, traditional paranormal beliefs (TPB) and new age philosophy]. Performance on standard problems correlated negatively with UnExp and belief in the paranormal (particularly the TPB dimension of the RPBS). Consideration of specific problem types revealed that perception of randomness associated more strongly with belief in the paranormal than conjunction; both problem types related similarly to UnExp. Structural equation modeling specified that belief in the paranormal mediated the indirect relationship between UnExp and statistical bias. For problems presented in a paranormal context a framing effect occurred. Whilst UnExp correlated positively with conjunction proneness (controlling for perception of randomness), there was no association between UnExp and perception of randomness (controlling for conjunction).

  9. Body fat indices and biomarkers of inflammation: a cross-sectional study with implications for obesity and peri-implant oral health.

    PubMed

    Elangovan, Satheesh; Brogden, Kim A; Dawson, Deborah V; Blanchette, Derek; Pagan-Rivera, Keyla; Stanford, Clark M; Johnson, Georgia K; Recker, Erica; Bowers, Rob; Haynes, William G; Avila-Ortiz, Gustavo

    2014-01-01

    To examine the relationships between three measures of body fat-body mass index (BMI), waist circumference (WC), and total body fat percent-and markers of inflammation around dental implants in stable periodontal maintenance patients. Seventy-three subjects were enrolled in this cross-sectional assessment. The study visit consisted of a physical examination that included anthropometric measurements of body composition (BMI, WC, body fat %); intraoral assessments were performed (full-mouth plaque index, periodontal and peri-implant comprehensive examinations) and peri-implant sulcular fluid (PISF) was collected from the study implants. Levels of interleukin (IL)-1α, IL-1β, IL-6, IL-8, IL-10, IL-12, IL-17, tumor necrosis factor-α, C-reactive protein, osteoprotegerin, leptin, and adiponectin in the PISF were measured using multiplex proteomic immunoassays. Correlation analysis with body fat measures was then performed using appropriate statistical methods. After adjustments for covariates, regression analyses revealed a statistically significant correlation between IL-1β in PISF and WC (R = 0.33; P = .0047). In this study in stable periodontal maintenance patients, a modest but statistically significant positive correlation was observed between the levels of IL-1β, a major proinflammatory cytokine in PISF, and WC, a reliable measure of central obesity.

  10. A Self-Organizing Map-Based Approach to Generating Reduced-Size, Statistically Similar Climate Datasets

    NASA Astrophysics Data System (ADS)

    Cabell, R.; Delle Monache, L.; Alessandrini, S.; Rodriguez, L.

    2015-12-01

    Climate-based studies require large amounts of data in order to produce accurate and reliable results. Many of these studies have used 30-plus year data sets in order to produce stable and high-quality results, and as a result, many such data sets are available, generally in the form of global reanalyses. While the analysis of these data leads to high-fidelity results, processing them can be very computationally expensive. This computational burden prevents the utilization of these data sets for certain applications, e.g., when rapid response is needed in crisis management and disaster planning scenarios resulting from the release of toxic material in the atmosphere. We have developed a methodology to reduce large climate datasets to more manageable sizes while retaining statistically similar results when used to produce ensembles of possible outcomes. We do this by employing a Self-Organizing Map (SOM) algorithm to analyze general patterns of meteorological fields over a regional domain of interest to produce a small set of "typical days" with which to generate the model ensemble. The SOM algorithm takes as input a set of vectors and generates a 2D map of representative vectors deemed most similar to the input set and to each other. Input predictors are selected that are correlated with the model output, which in our case is an Atmospheric Transport and Dispersion (T&D) model that is highly dependent on surface winds and boundary layer depth. To choose a subset of "typical days," each input day is assigned to its closest SOM map node vector and then ranked by distance. Each node vector is treated as a distribution and days are sampled from it by percentile. Using a 30-node SOM, with sampling every 20th percentile, we have been able to reduce 30 years of the Climate Forecast System Reanalysis (CFSR) data for the month of October to 150 "typical days." To estimate the skill of this approach, the "Measure of Effectiveness" (MOE) metric is used to compare area and overlap of statistical exceedance between the reduced data set and the full 30-year CFSR dataset. Using the MOE, we find that our SOM-derived climate subset produces statistics that fall within 85-90% overlap with the full set while using only 15% of the total data length, and consequently, 15% of the computational time required to run the T&D model for the full period.
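
    A rough sketch of the reduction step follows, under the assumptions that the meteorological fields are already flattened to feature vectors and that a trained SOM codebook is available (random vectors stand in for both here). Each day is assigned to its nearest node, ranked by distance, and sampled at every 20th percentile of each node's distance distribution, mirroring the 30-node, 5-samples-per-node reduction described above.

    ```python
    # Illustrative reduction of many "days" to a few "typical days" given a
    # SOM codebook; the data and codebook below are random placeholders.
    import numpy as np

    rng = np.random.default_rng(1)
    days = rng.normal(size=(900, 64))     # ~30 Octobers x 30 days, 64 features
    codebook = rng.normal(size=(30, 64))  # stand-in for 30 trained SOM nodes

    # Assign each day to the closest node and record the distance.
    d = np.linalg.norm(days[:, None, :] - codebook[None, :, :], axis=2)
    node = d.argmin(axis=1)
    dist = d[np.arange(len(days)), node]

    typical_days = []
    for k in range(len(codebook)):
        members = np.where(node == k)[0]
        if members.size == 0:
            continue
        order = members[np.argsort(dist[members])]
        # take the days closest to the 10th, 30th, ..., 90th percentiles
        for q in (10, 30, 50, 70, 90):
            idx = int(round(q / 100 * (order.size - 1)))
            typical_days.append(order[idx])

    typical_days = sorted(set(typical_days))
    print(f"reduced {len(days)} days to {len(typical_days)} 'typical days'")
    ```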

  11. Variations on Bayesian Prediction and Inference

    DTIC Science & Technology

    2016-05-09

    There are a number of statistical inference problems that are not generally formulated via a full probability model. For the problem of inference about an unknown parameter, the Bayesian approach requires a full probability model/likelihood, which can be an obstacle.

  12. In vitro cavity and crown preparations and direct restorations: A comparison of performance at the start and end of the FD programme.

    PubMed

    Burke, F J T; Ravaghi, V; Mackenzie, L; Priest, N; Falcon, H C

    2017-04-21

    Aim: To assess the performance, and thereby the progress, of the FDs when they carried out a number of simulated clinical exercises at the start and at the end of their FD year. Methods: A standardised simulated clinical restorative dentistry training exercise was carried out by a group of 61 recently qualified dental graduates undertaking a 12-month foundation training programme in England, at both the start and end of the programme. Participants completed a Class II cavity preparation and amalgam restoration, a Class IV composite resin restoration and two preparations for a porcelain-metal full crown. The completed preparations and restorations were independently assessed by an experienced consultant in restorative dentistry, using a scoring system based on previously validated criteria. The data were subjected to statistical analysis. Results: There was wide variation in individual performance. Overall, there was a small but not statistically significant improvement in performance by the end of the programme. A statistically significant improvement was observed for the amalgam preparation and restoration and, overall, for one of the five geographical sub-groups in the study. Possible reasons for the variable performance and improvement are discussed. Conclusions: There was variability in the performance of the FDs. The operative performance of FDs at the commencement and end of their FD year indicated an overall moderately improved performance over the year and a statistically significant improvement in their performance with regard to amalgam restoration.

  13. Statistical Hypothesis Testing in Intraspecific Phylogeography: NCPA versus ABC

    PubMed Central

    Templeton, Alan R.

    2009-01-01

    Nested clade phylogeographic analysis (NCPA) and approximate Bayesian computation (ABC) have been used to test phylogeographic hypotheses. Multilocus NCPA tests null hypotheses, whereas ABC discriminates among a finite set of alternatives. The interpretive criteria of NCPA are explicit and allow complex models to be built from simple components. The interpretive criteria of ABC are ad hoc and require the specification of a complete phylogeographic model. The conclusions from ABC are often influenced by implicit assumptions arising from the many parameters needed to specify a complex model. These complex models confound many assumptions so that biological interpretations are difficult. Sampling error is accounted for in NCPA, but ABC ignores important sources of sampling error that create pseudo-statistical power. NCPA generates the full sampling distribution of its statistics, but ABC only yields local probabilities, which in turn make it impossible to distinguish between a good-fitting model, a non-informative model, and an over-determined model. Both NCPA and ABC use approximations, but the convergence of the approximations used in NCPA is well defined whereas that of ABC is not. NCPA can analyze a large number of locations, but ABC cannot. Finally, the dimensionality of the tested hypothesis is known in NCPA, but not for ABC. As a consequence, the “probabilities” generated by ABC are not true probabilities and are statistically non-interpretable. Accordingly, ABC should not be used for hypothesis testing, but simulation approaches are valuable when used in conjunction with NCPA or other methods that do not rely on highly parameterized models. PMID:19192182
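
    To make the object under discussion concrete, here is a toy ABC rejection sampler; it is purely illustrative and is not the phylogeographic setting of the paper. Parameters are accepted when simulated summary statistics fall within a tolerance of the observed ones, and "posterior probabilities" of competing models are read off from acceptance counts, which is exactly the kind of local probability the abstract criticizes.

    ```python
    # Toy ABC rejection sampling over two competing models (assumed setup:
    # normal vs. Laplace data, flat priors); all numbers are arbitrary.
    import numpy as np

    rng = np.random.default_rng(2)
    observed = rng.normal(loc=1.0, scale=1.0, size=100)
    s_obs = np.array([observed.mean(), observed.std()])

    def simulate(model, theta, n=100):
        # model 0: normal; model 1: heavier-tailed alternative (Laplace)
        x = rng.normal(theta, 1.0, n) if model == 0 else rng.laplace(theta, 1.0, n)
        return np.array([x.mean(), x.std()])

    accepted = {0: 0, 1: 0}
    for _ in range(20000):
        m = int(rng.integers(2))            # uniform model prior
        theta = rng.uniform(-5, 5)          # flat parameter prior
        if np.linalg.norm(simulate(m, theta) - s_obs) < 0.2:
            accepted[m] += 1

    total = accepted[0] + accepted[1]
    print({m: accepted[m] / total for m in accepted})
    ```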

  14. Numerically exact full counting statistics of the nonequilibrium Anderson impurity model

    NASA Astrophysics Data System (ADS)

    Ridley, Michael; Singh, Viveka N.; Gull, Emanuel; Cohen, Guy

    2018-03-01

    The time-dependent full counting statistics of charge transport through an interacting quantum junction is evaluated from its generating function, controllably computed with the inchworm Monte Carlo method. Exact noninteracting results are reproduced; then, we continue to explore the effect of electron-electron interactions on the time-dependent charge cumulants, first-passage time distributions, and n -electron transfer distributions. We observe a crossover in the noise from Coulomb blockade to Kondo-dominated physics as the temperature is decreased. In addition, we uncover long-tailed spin distributions in the Kondo regime and analyze queuing behavior caused by correlations between single-electron transfer events.
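
    The following minimal illustration (not the inchworm method itself) shows the relation between a counting distribution P(n), its generating function, and the charge cumulants; a Poisson distribution stands in for the number of transferred electrons.

    ```python
    # For chi(lambda) = sum_n P(n) exp(i*lambda*n), the cumulants are
    # c_k = (-i d/dlambda)^k ln chi at lambda = 0. For the low cumulants we
    # can equivalently compute central moments directly, as done here.
    import numpy as np
    from math import lgamma

    n = np.arange(60)
    m = 5.0                                 # mean number of transfers (toy value)
    logP = -m + n * np.log(m) - np.array([lgamma(k + 1) for k in n])
    P = np.exp(logP)                        # Poisson counting distribution

    mu = np.sum(n * P)                      # first cumulant (mean)
    c2 = np.sum((n - mu) ** 2 * P)          # second cumulant (noise)
    c3 = np.sum((n - mu) ** 3 * P)          # third cumulant (skewness)
    print(mu, c2, c3)                       # for a Poisson process all equal m
    ```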

  15. Full counting statistics of a charge pump in the Coulomb blockade regime

    NASA Astrophysics Data System (ADS)

    Andreev, A. V.; Mishchenko, E. G.

    2001-12-01

    We study full charge counting statistics (FCCS) of a charge pump based on a nearly open single electron transistor. The problem is mapped onto an exactly soluble problem of a nonequilibrium g=1/2 Luttinger liquid with an impurity. We obtain an analytic expression for the generating function of the transmitted charge for an arbitrary pumping strength. Although this model contains fractionally charged excitations only integer transmitted charges can be observed. In the weak pumping limit FCCS correspond to a Poissonian transmission of particles with charge e*=e/2 from which all events with odd numbers of transferred particles are excluded.
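
    The weak-pumping result quoted above can be illustrated numerically: a Poisson distribution over the number of charge-e/2 quasiparticles, with odd particle numbers excluded, yields only integer transmitted charges. The mean number below is an arbitrary illustrative value, not a result from the paper.

    ```python
    # Poissonian transfer of e* = e/2 particles with odd counts excluded:
    # the observable transmitted charge (in units of e) is always an integer.
    import numpy as np
    from math import lgamma

    lam = 0.3                                # illustrative mean particle number
    k = np.arange(0, 40)
    P = np.exp(-lam + k * np.log(lam) - np.array([lgamma(j + 1) for j in k]))
    P[1::2] = 0.0            # exclude odd numbers of e/2 quasiparticles
    P /= P.sum()             # renormalize

    charge = 0.5 * k         # each particle carries e* = e/2
    print("possible charges:", np.unique(charge[P > 0])[:5])  # 0, 1, 2, ...
    print("mean transmitted charge:", np.sum(charge * P))
    ```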

  16. Numerically exact full counting statistics of the nonequilibrium Anderson impurity model

    DOE PAGES

    Ridley, Michael; Singh, Viveka N.; Gull, Emanuel; ...

    2018-03-06

    The time-dependent full counting statistics of charge transport through an interacting quantum junction is evaluated from its generating function, controllably computed with the inchworm Monte Carlo method. Exact noninteracting results are reproduced; then, we continue to explore the effect of electron-electron interactions on the time-dependent charge cumulants, first-passage time distributions, and n-electron transfer distributions. We observe a crossover in the noise from Coulomb blockade to Kondo-dominated physics as the temperature is decreased. In addition, we uncover long-tailed spin distributions in the Kondo regime and analyze queuing behavior caused by correlations between single-electron transfer events.

  17. Numerically exact full counting statistics of the nonequilibrium Anderson impurity model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ridley, Michael; Singh, Viveka N.; Gull, Emanuel

    The time-dependent full counting statistics of charge transport through an interacting quantum junction is evaluated from its generating function, controllably computed with the inchworm Monte Carlo method. Exact noninteracting results are reproduced; then, we continue to explore the effect of electron-electron interactions on the time-dependent charge cumulants, first-passage time distributions, and n-electron transfer distributions. We observe a crossover in the noise from Coulomb blockade to Kondo-dominated physics as the temperature is decreased. In addition, we uncover long-tailed spin distributions in the Kondo regime and analyze queuing behavior caused by correlations between single-electron transfer events.

  18. Statistical methodology for the analysis of dye-switch microarray experiments

    PubMed Central

    Mary-Huard, Tristan; Aubert, Julie; Mansouri-Attia, Nadera; Sandra, Olivier; Daudin, Jean-Jacques

    2008-01-01

    Background In individually dye-balanced microarray designs, each biological sample is hybridized on two different slides, once with Cy3 and once with Cy5. While this strategy ensures an automatic correction of the gene-specific labelling bias, it also induces dependencies between log-ratio measurements that must be taken into account in the statistical analysis. Results We present two original statistical procedures for the statistical analysis of individually balanced designs. These procedures are compared with the usual ML and REML mixed model procedures proposed in most statistical toolboxes, on both simulated and real data. Conclusion The UP procedure we propose as an alternative to usual mixed model procedures is more efficient and significantly faster to compute. This result provides some useful guidelines for the analysis of complex designs. PMID:18271965
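
    As a point of reference for the mixed-model baseline the paper benchmarks against, here is a minimal per-gene REML analysis in which each biological sample contributes two log-ratio measurements (one per dye), so "sample" enters as a random effect. The column names and the toy data are hypothetical, and this is not the authors' UP procedure.

    ```python
    # Sketch of a standard REML mixed-model analysis for a dye-balanced
    # design (statsmodels MixedLM); toy data generated with a known effect.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n_samples = 12                                # 6 control, 6 treated
    sample = np.repeat(np.arange(n_samples), 2)   # each sample hybridized twice
    dye = np.tile(["Cy3", "Cy5"], n_samples)
    cond_per_sample = np.repeat(["control", "treated"], n_samples // 2)
    condition = cond_per_sample[sample]

    sample_effect = rng.normal(0, 0.3, n_samples)  # dependency between slides
    logratio = ((condition == "treated") * 0.8 + sample_effect[sample]
                + rng.normal(0, 0.2, 2 * n_samples))

    df = pd.DataFrame(dict(sample=sample, dye=dye,
                           condition=condition, logratio=logratio))
    fit = smf.mixedlm("logratio ~ condition + dye", df,
                      groups=df["sample"]).fit(reml=True)
    print(fit.params)
    ```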

  19. Cloning of the cDNA encoding adenosine 5'-monophosphate deaminase 1 and its mRNA expression in Japanese flounder Paralichthys olivaceus

    NASA Astrophysics Data System (ADS)

    Jiang, Keyong; Sun, Shujuan; Liu, Mei; Wang, Baojie; Meng, Xiaolin; Wang, Lei

    2013-01-01

    AMP deaminase catalyzes the conversion of AMP into IMP and ammonia. In the present study, a full-length cDNA of AMPD1 from skeletal muscle of Japanese flounder Paralichthys olivaceus was cloned and characterized. The 2 526 bp cDNA contains a 5'-UTR of 78 bp, a 3'-UTR of 237 bp and an open reading frame (ORF) of 2 211 bp, which encodes a protein of 736 amino acids. The predicted protein contains a highly conserved AMP deaminase motif (SLSTDDP) and an ATP-binding site sequence (EPLMEEYAIAAQVFK). Phylogenetic analysis showed that the AMPD1 and AMPD3 genes originate from the same branch, but are evolutionarily distant from the AMPD2 gene. RT-PCR showed that the flounder AMPD1 gene was expressed only in skeletal muscle. qRT-PCR analysis revealed a statistically significant 2.54-fold higher level of AMPD1 mRNA in adult muscle (750±40 g) compared with juvenile muscle (7.5±2 g) (P < 0.05). HPLC analysis showed that the IMP content in adult muscle (3.35±0.21 mg/g) was also statistically significantly higher than in juvenile muscle (1.08±0.04 mg/g) (P < 0.05). There is a direct relationship between the AMPD1 gene expression level and IMP content in the skeletal muscle of juvenile and adult flounders. These results may provide useful information for quality improvement and molecular breeding of aquatic animals.

  20. Moment-based metrics for global sensitivity analysis of hydrological systems

    NASA Astrophysics Data System (ADS)

    Dell'Oca, Aronne; Riva, Monica; Guadagnini, Alberto

    2017-12-01

    We propose new metrics to assist global sensitivity analysis, GSA, of hydrological and Earth systems. Our approach allows assessing the impact of uncertain parameters on main features of the probability density function, pdf, of a target model output, y. These include the expected value of y, the spread around the mean and the degree of symmetry and tailedness of the pdf of y. Since reliable assessment of higher-order statistical moments can be computationally demanding, we couple our GSA approach with a surrogate model, approximating the full model response at a reduced computational cost. Here, we consider the generalized polynomial chaos expansion (gPCE), other model reduction techniques being fully compatible with our theoretical framework. We demonstrate our approach through three test cases, including an analytical benchmark, a simplified scenario mimicking pumping in a coastal aquifer and a laboratory-scale conservative transport experiment. Our results allow ascertaining which parameters can impact some moments of the model output pdf while being uninfluential to others. We also investigate the error associated with the evaluation of our sensitivity metrics by replacing the original system model through a gPCE. Our results indicate that the construction of a surrogate model with increasing level of accuracy might be required depending on the statistical moment considered in the GSA. The approach is fully compatible with (and can assist the development of) analysis techniques employed in the context of reduction of model complexity, model calibration, design of experiment, uncertainty quantification and risk assessment.
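
    A rough numerical sketch of moment-based GSA follows; plain Monte Carlo with quantile binning stands in for the gPCE surrogate, and the index definitions are simplified stand-ins for the paper's metrics. For each parameter we measure how much conditioning on it shifts the mean and variance of the output, normalized by the unconditional values, on the standard Ishigami test function.

    ```python
    # Simplified moment-based sensitivity indices via conditional moments.
    import numpy as np

    rng = np.random.default_rng(4)
    N, bins = 200_000, 20
    x = rng.uniform(-np.pi, np.pi, size=(N, 3))
    y = (np.sin(x[:, 0]) + 7 * np.sin(x[:, 1]) ** 2
         + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))          # Ishigami function

    mean_y, var_y = y.mean(), y.var()
    for i in range(3):
        edges = np.quantile(x[:, i], np.linspace(0, 1, bins + 1))
        which = np.clip(np.digitize(x[:, i], edges) - 1, 0, bins - 1)
        cmean = np.array([y[which == b].mean() for b in range(bins)])
        cvar = np.array([y[which == b].var() for b in range(bins)])
        s_mean = np.mean(np.abs(mean_y - cmean)) / abs(mean_y)  # mean shift
        s_var = np.mean(np.abs(var_y - cvar)) / var_y           # spread shift
        print(f"x{i+1}: mean-index={s_mean:.3f}, variance-index={s_var:.3f}")
    ```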

  1. Risk factors for repetitive strain injuries among school teachers in Thailand.

    PubMed

    Chaiklieng, Sunisa; Suggaravetsiri, Pornnapa

    2012-01-01

    Prolonged posture, static work and repetition have previously been reported as causes of repetitive strain injuries (RSIs) among workers, including teachers. This cross-sectional analytic study aimed to investigate the prevalence and risk factors of RSIs among school teachers. Participants were 452 full-time school teachers in Thailand. Data were collected using structured questionnaires, illuminance measurements and physical fitness tests. Descriptive statistics and inferential statistics (chi-square tests and multiple logistic regression analysis) were used. Most teachers in this study were female (57.3%), and the mean work experience was 22.6 ± 10.4 years. The six-month prevalence of RSIs was 73.7%. The univariate analysis identified risk factors related to RSIs, which were chronic disease (OR=1.8; 95% CI = 1.16-2.73), history of trauma (OR=2.0; 95% CI = 1.02-4.01), a family member with RSIs (OR=2.0; 95% CI = 1.02-4.01), stretching to write on the board (OR=1.7; 95% CI = 1.06-1.70) and high-heeled shoes >2 inches (OR=1.6; 95% CI = 1.03-2.51). Multiple logistic regression analysis showed that chronic disease and high-heeled shoes >2 inches were significantly related to the development of RSIs. Poor grip strength and poor back muscle flexibility also significantly affected the RSIs of teachers. In conclusion, RSIs were highly prevalent among school teachers, who should be made aware of health promotion measures to prevent RSIs.
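
    As a generic sketch of how such adjusted odds ratios and 95% CIs are obtained from multiple logistic regression (synthetic data and hypothetical variable names, not the study's dataset):

    ```python
    # Odds ratios from a multiple logistic regression with statsmodels.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 452
    df = pd.DataFrame({
        "chronic_disease": rng.integers(0, 2, n),
        "high_heels_gt2in": rng.integers(0, 2, n),
    })
    # Simulate an outcome with known log-odds effects for illustration.
    logit_p = -0.5 + 0.6 * df["chronic_disease"] + 0.5 * df["high_heels_gt2in"]
    df["rsi"] = rng.random(n) < 1 / (1 + np.exp(-logit_p))

    X = sm.add_constant(df[["chronic_disease", "high_heels_gt2in"]].astype(float))
    res = sm.Logit(df["rsi"].astype(float), X).fit(disp=0)
    ors = np.exp(res.params)                 # exponentiated coefficients = ORs
    ci = np.exp(res.conf_int())              # exponentiated 95% CIs
    print(pd.concat([ors.rename("OR"),
                     ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
    ```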

  2. Multiple imputation methods for bivariate outcomes in cluster randomised trials.

    PubMed

    DiazOrdaz, K; Kenward, M G; Gomes, M; Grieve, R

    2016-09-10

    Missing observations are common in cluster randomised trials. The problem is exacerbated when modelling bivariate outcomes jointly, as the proportion of complete cases is often considerably smaller than the proportion having either of the outcomes fully observed. Approaches taken to handling such missing data include the following: complete case analysis, single-level multiple imputation that ignores the clustering, multiple imputation with a fixed effect for each cluster and multilevel multiple imputation. We contrasted the alternative approaches to handling missing data in a cost-effectiveness analysis that uses data from a cluster randomised trial to evaluate an exercise intervention for care home residents. We then conducted a simulation study to assess the performance of these approaches on bivariate continuous outcomes, in terms of confidence interval coverage and empirical bias in the estimated treatment effects. Missing-at-random clustered data scenarios were simulated following a full-factorial design. Across all the missing data mechanisms considered, the multiple imputation methods provided estimators with negligible bias, while complete case analysis resulted in biased treatment effect estimates in scenarios where the randomised treatment arm was associated with missingness. Confidence interval coverage was generally in excess of nominal levels (up to 99.8%) following fixed-effects multiple imputation and too low following single-level multiple imputation. Multilevel multiple imputation led to coverage levels of approximately 95% throughout. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
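
    A minimal single-level multiple-imputation sketch with Rubin's rules for pooling is shown below; it is illustrative only (the paper's point is precisely that this single-level approach can under-cover for clustered data), and the variable names and toy data are hypothetical.

    ```python
    # m imputations via chained equations, then Rubin's rules for pooling.
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(6)
    n, m = 300, 20
    treat = rng.integers(0, 2, n).astype(float)
    outcome = 1.0 + 0.5 * treat + rng.normal(0, 1, n)
    outcome[rng.random(n) < 0.3] = np.nan          # 30% missing at random

    X = np.column_stack([treat, outcome])
    estimates, variances = [], []
    for k in range(m):
        imp = IterativeImputer(sample_posterior=True, random_state=k)
        Xc = imp.fit_transform(X)
        t, y = Xc[:, 0], Xc[:, 1]
        diff = y[t == 1].mean() - y[t == 0].mean()  # treatment effect
        var = (y[t == 1].var(ddof=1) / (t == 1).sum()
               + y[t == 0].var(ddof=1) / (t == 0).sum())
        estimates.append(diff)
        variances.append(var)

    qbar = np.mean(estimates)                       # pooled estimate
    ubar, b = np.mean(variances), np.var(estimates, ddof=1)
    total_var = ubar + (1 + 1 / m) * b              # Rubin's rules
    print(qbar, np.sqrt(total_var))
    ```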

  3. 77 FR 1454 - Request for Nominations of Members To Serve on the Census Scientific Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-10

    Nominations are requested of individuals with technical expertise in such areas as demography, economics, geography, psychology, statistics, survey methodology, geospatial analysis, econometrics, cognitive psychology, and the social and behavioral sciences.

  4. Pathway analysis with next-generation sequencing data.

    PubMed

    Zhao, Jinying; Zhu, Yun; Boerwinkle, Eric; Xiong, Momiao

    2015-04-01

    Although pathway analysis methods have been developed and successfully applied to association studies of common variants, the statistical methods for pathway-based association analysis of rare variants have not been well developed. Many investigators have observed highly inflated false-positive rates and low power in pathway-based tests of association of rare variants. The inflated false-positive rates and low true-positive rates of the current methods are mainly due to their lack of ability to account for gametic phase disequilibrium. To overcome these serious limitations, we develop a novel statistic that is based on smoothed functional principal component analysis (SFPCA) for pathway association tests with next-generation sequencing data. The developed statistic has the ability to capture position-level variant information and account for gametic phase disequilibrium. By intensive simulations, we demonstrate that the SFPCA-based statistic for testing pathway association with either rare or common or both rare and common variants has the correct type 1 error rates. We also evaluate the power of the SFPCA-based statistic and of 22 existing statistics. We found that the SFPCA-based statistic has much higher power than the other existing statistics in all the scenarios considered. To further evaluate its performance, the SFPCA-based statistic is applied to pathway analysis of exome sequencing data in the early-onset myocardial infarction (EOMI) project. We identify three pathways significantly associated with EOMI after the Bonferroni correction. In addition, our preliminary results show that the SFPCA-based statistic yields much smaller P-values for identifying pathway associations than other existing methods.
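
    The following is only a loose sketch of the general idea (not the authors' SFPCA statistic): each subject's variant genotypes are treated as a function of genomic position, smoothed with a kernel, reduced by PCA, and the leading scores are tested for case-control association. All sizes, the bandwidth, and the test form are arbitrary choices made for illustration.

    ```python
    # Kernel-smoothed genotype functions + PCA + a simple k-dof score test.
    import numpy as np

    rng = np.random.default_rng(7)
    n, p = 1000, 200                               # subjects, variant positions
    pos = np.sort(rng.uniform(0, 1, p))
    G = (rng.random((n, p)) < 0.01).astype(float)  # rare-variant genotypes
    y = rng.integers(0, 2, n)                      # case/control labels

    # Gaussian-kernel smoothing across positions (functional representation).
    h = 0.02
    W = np.exp(-0.5 * ((pos[:, None] - pos[None, :]) / h) ** 2)
    W /= W.sum(axis=1, keepdims=True)
    F = G @ W.T

    # PCA on the smoothed functions; test the leading component scores.
    Fc = F - F.mean(axis=0)
    U, s, Vt = np.linalg.svd(Fc, full_matrices=False)
    k = 5
    scores = U[:, :k] * s[:k]
    t = scores[y == 1].mean(axis=0) - scores[y == 0].mean(axis=0)
    se = np.sqrt(scores.var(axis=0, ddof=1)
                 * (1 / (y == 1).sum() + 1 / (y == 0).sum()))
    chi2 = np.sum((t / se) ** 2)     # approx. chi-square, k dof, under H0
    print(chi2)
    ```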

  5. Childhood growth and development associated with need for full-time special education at school age.

    PubMed

    Mannerkoski, Minna; Aberg, Laura; Hoikkala, Marianne; Sarna, Seppo; Kaski, Markus; Autti, Taina; Heiskala, Hannu

    2009-01-01

    To explore how growth measurements and attainment of developmental milestones in early childhood reflect the need for full-time special education (SE). After stratification in this population-based study, 900 pupils in full-time SE groups (age range 7-16 years, mean 12 years 8 months) at three levels and 301 pupils in mainstream education (age range 7-16, mean 12 years 9 months) provided data on height and weight from birth to age 7 years and head circumference to age 1 year. Developmental screening was evaluated from age 1 month to 48 months. Statistical methods included a general linear model (growth measurements), binary logistic regression analysis (odds ratios for growth), and multinomial logistic regression analysis (odds ratios for developmental milestones). At 1 year, a 1 standard deviation score (SDS) decrease in height raised the probability of SE placement by 40%, and a 1 SDS decrease in head size by 28%. In developmental screening, during the first months of life the gross motor milestones, especially head support, differentiated the children at levels 0-3. Thereafter, the fine motor milestones and those related to speech and social skills became more important. Children whose growth is mildly impaired, though in the normal range, and who fail to attain certain developmental milestones have an increased probability of SE placement, and thus a need for special attention from toddler age onwards. Similar to the growth curves, these children seem to have consistent developmental curves (patterns).

  6. Which has a Greater Influence on Smile Esthetics Perception: Teeth or Lips?

    PubMed

    Farzanegan, Fahimeh; Jahanbin, Arezoo; Darvishpour, Hadi; Salari, Soheil

    2013-09-01

    The aim of this study was to evaluate the role of teeth and lips in the perception of smile esthetics. Thirty women, ranging between 20 and 30 years of age, all with Class I canine and molar relationships and no history of orthodontic treatment, were chosen. Five black and white photographs were taken of each participant in a natural head position while smiling. The most natural photo, demonstrating a social smile, was selected. Two other photographs were also taken from a dental frontal view of each subject using a retractor, as well as a lip-together smile. Three groups of judges including 20 orthodontists, 20 restorative specialists, and 20 laypersons were selected. The judges were then asked to rate the esthetics of each picture on a visual analogue scale. An analysis of variance (ANOVA) and the Pearson correlation test were used for statistical analysis. For the orthodontists group, the correlation between the scores given to the full smile and each of its components was significant (α=0.05), with equal correlation of each component with the full smile. In contrast to laypersons, the correlation between the scores given to the full smile and each of its components among restorative specialists was significant. For orthodontists and restorative specialists, esthetic details and the components of the smile (teeth and perioral soft tissues) were important in esthetics perception. In contrast, laypersons perceived no effect of esthetic details or smile components.

  7. Multiple Myeloma and Glyphosate Use: A Re-Analysis of US Agricultural Health Study (AHS) Data

    PubMed Central

    Sorahan, Tom

    2015-01-01

    A previous publication of 57,311 pesticide applicators enrolled in the US Agricultural Health Study (AHS) produced disparate findings in relation to multiple myeloma risks in the period 1993–2001 and ever-use of glyphosate (32 cases of multiple myeloma in the full dataset of 54,315 applicators without adjustment for other variables: rate ratio (RR) 1.1, 95% confidence interval (CI) 0.5 to 2.4; 22 cases of multiple myeloma in restricted dataset of 40,719 applicators with adjustment for other variables: RR 2.6, 95% CI 0.7 to 9.4). It seemed important to determine which result should be preferred. RRs for exposed and non-exposed subjects were calculated using Poisson regression; subjects with missing data were not excluded from the main analyses. Using the full dataset adjusted for age and gender, the analysis produced a RR of 1.12 (95% CI 0.50 to 2.49) for ever-use of glyphosate. Additional adjustment for lifestyle factors and use of ten other pesticides had little effect (RR 1.24, 95% CI 0.52 to 2.94). There were no statistically significant trends for multiple myeloma risks in relation to reported cumulative days (or intensity-weighted days) of glyphosate use. The doubling of risk reported previously arose from the use of an unrepresentative restricted dataset, and analyses of the full dataset provide no convincing evidence in the AHS for a link between multiple myeloma risk and glyphosate use. PMID:25635915
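
    Schematically, rate ratios of this kind come from a Poisson regression of case counts with log person-years as an offset. The sketch below uses invented counts purely for illustration, not the AHS data.

    ```python
    # Rate ratio for ever-use from a Poisson GLM with a person-years offset.
    import numpy as np
    import statsmodels.api as sm

    # rows: never-use, ever-use (toy numbers)
    cases = np.array([6, 26])
    person_years = np.array([80_000, 380_000])
    ever_use = np.array([0.0, 1.0])

    X = sm.add_constant(ever_use)
    fit = sm.GLM(cases, X, family=sm.families.Poisson(),
                 offset=np.log(person_years)).fit()
    rr = np.exp(fit.params[1])
    lo, hi = np.exp(fit.conf_int()[1])
    print(f"RR for ever-use: {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```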

  8. Which has a Greater Influence on Smile Esthetics Perception: Teeth or Lips?

    PubMed Central

    Farzanegan, Fahimeh; Jahanbin, Arezoo; Darvishpour, Hadi; Salari, Soheil

    2013-01-01

    Introduction: The aim of this study was to evaluate the role of teeth and lips in the perception of smile esthetics. Materials and Methods: Thirty women, ranging between 20 and 30 years of age, all with Class I canine and molar relationships and no history of orthodontic treatment, were chosen. Five black and white photographs were taken of each participant in a natural head position while smiling. The most natural photo, demonstrating a social smile, was selected. Two other photographs were also taken from a dental frontal view of each subject using a retractor, as well as a lip-together smile. Three groups of judges including 20 orthodontists, 20 restorative specialists, and 20 laypersons were selected. The judges were then asked to rate the esthetics of each picture on a visual analogue scale. An analysis of variance (ANOVA) and the Pearson correlation test were used for statistical analysis. Results: For the orthodontists group, the correlation between the scores given to the full smile and each of its components was significant (α=0.05), with equal correlation of each component with the full smile. In contrast to laypersons, the correlation between the scores given to the full smile and each of its components among restorative specialists was significant. Conclusion: For orthodontists and restorative specialists, esthetic details and the components of the smile (teeth and perioral soft tissues) were important in esthetics perception. In contrast, laypersons perceived no effect of esthetic details or smile components. PMID:24303447

  9. Corneal Aberrations in Former Preterm Infants: Results From The Wiesbaden Prematurity Study.

    PubMed

    Fieß, Achim; Schuster, Alexander K; Kölb-Keerl, Ruth; Knuf, Markus; Kirchhof, Bernd; Muether, Philipp S; Bauer, Jacqueline

    2017-12-01

    To compare corneal aberrations in former preterm infants to those of full-term infants. A prospective cross-sectional study was carried out measuring the corneal shape with Scheimpflug imaging in former preterm infants of gestational age (GA) ≤32 weeks and full-term infants with GA ≥37 weeks, now aged between 4 and 10 years. The main outcome measures were corneal aberrations including astigmatism (Zernike: Z2-2; Z22), coma (Z3-1; Z31), trefoil (Z3-3; Z33), spherical aberration (Z40) and the root-mean-square of higher-order aberrations (RMS HOA). Multivariable analysis was performed to assess independent associations of gestational age groups and of retinopathy of prematurity (ROP) occurrence with corneal aberrations, adjusting for sex and age at examination. A total of 259 former full-term and 226 preterm infants with a mean age of 7.2 ± 2.0 years were included in this study. Statistical analysis revealed an association of extreme prematurity (GA ≤28 weeks) with higher-order and lower-order aberrations of the total cornea. Vertical coma was higher in extreme prematurity (P < 0.001), due to the shape of the anterior corneal surface, while there was no association with trefoil and spherical aberration. ROP was not associated with higher-order aberrations when adjusted for gestational age group. This study demonstrated that specific corneal aberrations were associated with extreme prematurity rather than with ROP occurrence.

  10. Multiple myeloma and glyphosate use: a re-analysis of US Agricultural Health Study (AHS) data.

    PubMed

    Sorahan, Tom

    2015-01-28

    A previous publication of 57,311 pesticide applicators enrolled in the US Agricultural Health Study (AHS) produced disparate findings in relation to multiple myeloma risks in the period 1993-2001 and ever-use of glyphosate (32 cases of multiple myeloma in the full dataset of 54,315 applicators without adjustment for other variables: rate ratio (RR) 1.1, 95% confidence interval (CI) 0.5 to 2.4; 22 cases of multiple myeloma in restricted dataset of 40,719 applicators with adjustment for other variables: RR 2.6, 95% CI 0.7 to 9.4). It seemed important to determine which result should be preferred. RRs for exposed and non-exposed subjects were calculated using Poisson regression; subjects with missing data were not excluded from the main analyses. Using the full dataset adjusted for age and gender, the analysis produced a RR of 1.12 (95% CI 0.50 to 2.49) for ever-use of glyphosate. Additional adjustment for lifestyle factors and use of ten other pesticides had little effect (RR 1.24, 95% CI 0.52 to 2.94). There were no statistically significant trends for multiple myeloma risks in relation to reported cumulative days (or intensity-weighted days) of glyphosate use. The doubling of risk reported previously arose from the use of an unrepresentative restricted dataset, and analyses of the full dataset provide no convincing evidence in the AHS for a link between multiple myeloma risk and glyphosate use.

  11. Improved Statistics for Genome-Wide Interaction Analysis

    PubMed Central

    Ueki, Masao; Cordell, Heather J.

    2012-01-01

    Recently, Wu and colleagues [1] proposed two novel statistics for genome-wide interaction analysis using case/control or case-only data. In computer simulations, their proposed case/control statistic outperformed competing approaches, including the fast-epistasis option in PLINK and logistic regression analysis under the correct model; however, reasons for its superior performance were not fully explored. Here we investigate the theoretical properties and performance of Wu et al.'s proposed statistics and explain why, in some circumstances, they outperform competing approaches. Unfortunately, we find minor errors in the formulae for their statistics, resulting in tests that have higher than nominal type 1 error. We also find minor errors in PLINK's fast-epistasis and case-only statistics, although theory and simulations suggest that these errors have only negligible effect on type 1 error. We propose adjusted versions of all four statistics that, both theoretically and in computer simulations, maintain correct type 1 error rates under the null hypothesis. We also investigate statistics based on correlation coefficients that maintain similar control of type 1 error. Although designed to test specifically for interaction, we show that some of these previously-proposed statistics can, in fact, be sensitive to main effects at one or both loci, particularly in the presence of linkage disequilibrium. We propose two new “joint effects” statistics that, provided the disease is rare, are sensitive only to genuine interaction effects. In computer simulations we find, in most situations considered, that highest power is achieved by analysis under the correct genetic model. Such an analysis is unachievable in practice, as we do not know this model. However, generally high power over a wide range of scenarios is exhibited by our joint effects and adjusted Wu statistics. We recommend use of these alternative or adjusted statistics and urge caution when using Wu et al.'s originally-proposed statistics, on account of the inflated error rate that can result. PMID:22496670

  12. Statistics 101 for Radiologists.

    PubMed

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
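
    A small worked example of the diagnostic-test quantities reviewed above, computed from a hypothetical 2x2 confusion matrix:

    ```python
    # Sensitivity, specificity, accuracy, and likelihood ratios from counts.
    tp, fn, fp, tn = 90, 10, 30, 170   # hypothetical confusion matrix

    sensitivity = tp / (tp + fn)              # 0.90
    specificity = tn / (tn + fp)              # 0.85
    accuracy = (tp + tn) / (tp + fn + fp + tn)
    lr_pos = sensitivity / (1 - specificity)  # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity  # negative likelihood ratio
    print(sensitivity, specificity, accuracy, lr_pos, lr_neg)
    ```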

  13. Central nervous system antiretroviral efficacy in HIV infection: a qualitative and quantitative review and implications for future research.

    PubMed

    Cysique, Lucette A; Waters, Edward K; Brew, Bruce J

    2011-11-22

    There is conflicting information as to whether antiretroviral drugs with better central nervous system (CNS) penetration (neuroHAART) assist in improving neurocognitive function and suppressing cerebrospinal fluid (CSF) HIV RNA. The current review aims to better synthesise the existing literature by using an innovative two-phase review approach (qualitative and quantitative) to overcome methodological differences between studies. Sixteen studies, all observational, were identified using a standard citation search. They fulfilled the following inclusion criteria: conducted in the HAART era; sample size > 10; treatment effect involving more than one antiretroviral; and a non-retrospective design. The qualitative phase of the review consisted of (i) a blind assessment rating studies on features such as sample size, statistical methods and definitions of neuroHAART, and (ii) a non-blind assessment of the sensitivity of the neuropsychological methods to HIV-associated neurocognitive disorder (HAND). During the quantitative evaluation we assessed the statistical power of the studies that achieved a high rating in the qualitative analysis. The objective of the power analysis was to determine the studies' ability to assess their proposed research aims. After studies with at least three limitations were excluded in the qualitative phase, six studies remained. All six found a positive effect of neuroHAART on neurocognitive function or CSF HIV suppression. Of these six studies, only two had statistical power of at least 80%. Studies assessed as using more rigorous methods found that neuroHAART was effective in improving neurocognitive function and decreasing CSF viral load, but only two of those studies were adequately statistically powered. Because all of these studies were observational, they represent a less compelling evidence base than randomised controlled trials for assessing treatment effect. Therefore, large randomised trials are needed to determine the robustness of any neuroHAART effect. However, such trials must be longitudinal, include the full spectrum of HAND, ideally carefully control for co-morbidities, and be based on optimal neuropsychology methods.
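
    The kind of power check applied in the quantitative phase can be sketched as follows; the effect size and group sizes below are placeholders, not values taken from the reviewed studies.

    ```python
    # Power of a two-sample t-test via statsmodels' power solver.
    from statsmodels.stats.power import TTestIndPower

    power = TTestIndPower().power(effect_size=0.5, nobs1=40,
                                  ratio=1.0, alpha=0.05)
    print(f"power = {power:.2f}")   # below 0.80 fails the 80% threshold used above
    ```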

  14. Linkage analysis of systolic blood pressure: a score statistic and computer implementation

    PubMed Central

    Wang, Kai; Peng, Yingwei

    2003-01-01

    A genome-wide linkage analysis was conducted on systolic blood pressure using a score statistic. The randomly selected Replicate 34 of the simulated data was used. The score statistic was applied to the sibships derived from the general pedigrees. An add-on R program to GENEHUNTER was developed for this analysis and is freely available. PMID:14975145

  15. The Shock and Vibration Digest. Volume 16, Number 1

    DTIC Science & Technology

    1984-01-01

    Includes an investigation of the measurement of frequency-band average loss factors of structural components for use in the statistical energy analysis method. Key words: finite element technique; statistical energy analysis; experimental techniques; framed structures; computer programs. In order to further understand the practical application of statistical energy analysis, a two-section plate-like frame structure is…

  16. Do Knee Bracing and Delayed Weight Bearing Affect Mid-Term Functional Outcome after Anterior Cruciate Ligament Reconstruction?

    PubMed

    Di Miceli, Riccardo; Marambio, Carlotta Bustos; Zati, Alessandro; Monesi, Roberta; Benedetti, Maria Grazia

    2017-12-01

    Purpose: The aim of this study was to assess the effect of knee bracing and timing of full weight bearing after anterior cruciate ligament reconstruction (ACLR) on functional outcomes at mid-term follow-up. Methods: We performed a retrospective study on 41 patients with ACLR. Patients were divided into two groups: an ACLR group, who received isolated ACL reconstruction, and an ACLR-OI group, who received ACL reconstruction and adjunctive surgery. Information about age at surgery, bracing, and full or progressive weight bearing permission after surgery was collected for the two groups. The subjective IKDC score was obtained at follow-up. Statistical analysis was performed to compare the two groups for IKDC score. Subgroup analysis was performed to assess the effect of the postoperative regimen (knee bracing and weight bearing) on functional outcomes. Results: The mean age of patients was 30.8 ± 10.6 years. The mean IKDC score was 87.4 ± 13.9. The mean follow-up was 3.5 ± 1.8 years. Twenty-two (53.7%) patients underwent ACLR only, while 19 (46.3%) also received other interventions, such as meniscal repair and/or collateral ligament suture. Analysis of the overall data showed no differences between the groups for IKDC score. Patients in the ACLR group exhibited a significantly better IKDC score when no brace was prescribed and full weight bearing was allowed 4 weeks after surgery, in comparison with patients who wore a brace and had delayed full weight bearing. No differences were found with respect to the use of a brace and postoperative weight bearing regimen in the ACLR-OI group. Conclusion: Bracing and delayed weight bearing after ACLR have a negative influence on long-term functional outcomes. Further research is required to explore possible differences between patients who underwent isolated ACLR and those who had other interventions, with respect to the use of a brace and the timing of full weight bearing, in order to identify optimal recovery strategies. Level of Evidence: Level III, retrospective observational study.

  17. PYCHEM: a multivariate analysis package for python.

    PubMed

    Jarvis, Roger M; Broadhurst, David; Johnson, Helen; O'Boyle, Noel M; Goodacre, Royston

    2006-10-15

    We have implemented a multivariate statistical analysis toolbox, with an optional standalone graphical user interface (GUI), using the Python scripting language. This is a free and open source project that addresses the need for a multivariate analysis toolbox in Python. Although the functionality provided does not cover the full range of multivariate tools that are available, it has a broad complement of methods that are widely used in the biological sciences. In contrast to tools like MATLAB, PyChem 2.0.0 is easily accessible and free, allows for rapid extension using a range of Python modules and is part of the growing amount of complementary and interoperable scientific software in Python based upon SciPy. One of the attractions of PyChem is that it is an open source project and so there is an opportunity, through collaboration, to increase the scope of the software and to continually evolve a user-friendly platform that has applicability across a wide range of analytical and post-genomic disciplines. http://sourceforge.net/projects/pychem
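
    PyChem's own API is not shown in the record, so as a generic stand-in the sketch below performs one of the multivariate analyses such toolboxes provide (PCA, here via scikit-learn) on a random data matrix.

    ```python
    # Generic multivariate analysis example: PCA on a samples x channels matrix.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(8)
    X = rng.normal(size=(50, 200))      # 50 samples x 200 spectral channels
    pca = PCA(n_components=5)
    scores = pca.fit_transform(X)       # sample scores in the PC space
    print(pca.explained_variance_ratio_)
    print(scores.shape)                 # (50, 5)
    ```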

  18. A local structure model for network analysis

    DOE PAGES

    Casleton, Emily; Nordman, Daniel; Kaiser, Mark

    2017-04-01

    The statistical analysis of networks is a popular research topic with ever widening applications. Exponential random graph models (ERGMs), which specify a model through interpretable, global network features, are common for this purpose. In this study we introduce a new class of models for network analysis, called local structure graph models (LSGMs). In contrast to an ERGM, a LSGM specifies a network model through local features and allows for an interpretable and controllable local dependence structure. In particular, LSGMs are formulated by a set of full conditional distributions for each network edge, e.g., the probability of edge presence/absence, depending on neighborhoods of other edges. Additional model features are introduced to aid in specification and to help alleviate a common issue (occurring also with ERGMs) of model degeneracy. Finally, the proposed models are demonstrated on a network of tornadoes in Arkansas where a LSGM is shown to perform significantly better than a model without local dependence.
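
    An illustrative toy of the edge-level full-conditional idea (not the authors' specification): each edge of a small graph gets a conditional probability that depends on how many neighbouring edges (edges sharing a node) are present, and a Gibbs sweep samples the network from these local models. The baseline log-odds and neighbour effect below are invented.

    ```python
    # Gibbs sampling of a network from per-edge full conditionals.
    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(9)
    nodes = 8
    edges = list(combinations(range(nodes), 2))
    state = {e: 0 for e in edges}           # start with an empty graph

    def neighbours(e):
        # edges sharing a node with e define its local neighbourhood
        return [f for f in edges if f != e and set(f) & set(e)]

    alpha, beta = -1.5, 0.4   # baseline log-odds and neighbour effect (made up)
    for sweep in range(200):
        for e in edges:
            k = sum(state[f] for f in neighbours(e))
            p = 1 / (1 + np.exp(-(alpha + beta * k)))   # full conditional
            state[e] = int(rng.random() < p)

    print("edges present:", sum(state.values()), "of", len(edges))
    ```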

  19. A local structure model for network analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casleton, Emily; Nordman, Daniel; Kaiser, Mark

    The statistical analysis of networks is a popular research topic with ever widening applications. Exponential random graph models (ERGMs), which specify a model through interpretable, global network features, are common for this purpose. In this study we introduce a new class of models for network analysis, called local structure graph models (LSGMs). In contrast to an ERGM, a LSGM specifies a network model through local features and allows for an interpretable and controllable local dependence structure. In particular, LSGMs are formulated by a set of full conditional distributions for each network edge, e.g., the probability of edge presence/absence, depending on neighborhoods of other edges. Additional model features are introduced to aid in specification and to help alleviate a common issue (occurring also with ERGMs) of model degeneracy. Finally, the proposed models are demonstrated on a network of tornadoes in Arkansas where a LSGM is shown to perform significantly better than a model without local dependence.

  20. Interpolation of longitudinal shape and image data via optimal mass transport

    NASA Astrophysics Data System (ADS)

    Gao, Yi; Zhu, Liang-Jia; Bouix, Sylvain; Tannenbaum, Allen

    2014-03-01

    Longitudinal analysis of medical imaging data has become central to the study of many disorders. Unfortunately, various constraints (study design, patient availability, technological limitations) restrict the acquisition of data to only a few time points, limiting the study of continuous disease/treatment progression. Having the ability to produce a sensible time interpolation of the data can lead to improved analysis, such as intuitive visualizations of anatomical changes, or the creation of more samples to improve statistical analysis. In this work, we model interpolation of medical image data, in particular shape data, using the theory of optimal mass transport (OMT), which can construct a continuous transition from two time points while preserving "mass" (e.g., image intensity, shape volume) during the transition. The theory even allows a short extrapolation in time and may help predict short-term treatment impact or disease progression on anatomical structure. We apply the proposed method to the hippocampus-amygdala complex in schizophrenia, the heart in atrial fibrillation, and full head MR images in traumatic brain injury.
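
    A one-dimensional illustration of the mass-preserving interpolation follows: in 1D, the optimal-transport (displacement) interpolant between two densities is obtained by linearly interpolating their quantile functions. The densities here are toy Gaussians, not imaging data.

    ```python
    # McCann displacement interpolation in 1D via quantile functions.
    import numpy as np

    x = np.linspace(-6, 6, 2001)
    dx = x[1] - x[0]
    f0 = np.exp(-0.5 * (x + 2) ** 2);        f0 /= f0.sum() * dx  # mass at t=0
    f1 = np.exp(-0.5 * ((x - 2) / 1.5) ** 2); f1 /= f1.sum() * dx  # mass at t=1

    q = np.linspace(0.001, 0.999, 500)
    Q0 = np.interp(q, np.cumsum(f0) * dx, x)   # quantile function of f0
    Q1 = np.interp(q, np.cumsum(f1) * dx, x)   # quantile function of f1

    t = 0.5
    Qt = (1 - t) * Q0 + t * Q1   # OMT interpolant at intermediate time t
    print("interpolated median:", Qt[len(q) // 2])  # moves from -2 toward +2
    ```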
