Statistical methodology for the analysis of dye-switch microarray experiments
Mary-Huard, Tristan; Aubert, Julie; Mansouri-Attia, Nadera; Sandra, Olivier; Daudin, Jean-Jacques
2008-01-01
Background In individually dye-balanced microarray designs, each biological sample is hybridized on two different slides, once with Cy3 and once with Cy5. While this strategy ensures an automatic correction of the gene-specific labelling bias, it also induces dependencies between log-ratio measurements that must be taken into account in the statistical analysis. Results We present two original statistical procedures for the statistical analysis of individually balanced designs. These procedures are compared with the usual ML and REML mixed model procedures proposed in most statistical toolboxes, on both simulated and real data. Conclusion The UP procedure we propose as an alternative to usual mixed model procedures is more efficient and significantly faster to compute. This result provides some useful guidelines for the analysis of complex designs. PMID:18271965
Randomization Procedures Applied to Analysis of Ballistic Data
1991-06-01
Technical Report BRL-TR-3245 (AD-A238 389), "Randomization Procedures Applied to Analysis of Ballistic Data", by Malcolm S. Taylor and Barry A. Bodt, June 1991. Keywords: data analysis; computationally intensive statistics; randomization tests; permutation tests; nonparametric statistics. Snippet: "Any reasonable statistical procedure would fail to support the notion of improvement of dynamic over standard indexing based on this data."
Compendium of Methods for Applying Measured Data to Vibration and Acoustic Problems
1985-10-01
Keywords: statistical energy analysis; finite element models; transfer functions. Table-of-contents snippet: "Procedures for the Modal Analysis Method"; "Summary of the Procedures for the Statistical Energy Analysis Method".
Analysis of half diallel mating designs I: a practical analysis procedure for ANOVA approximation.
G.R. Johnson; J.N. King
1998-01-01
Procedures to analyze half-diallel mating designs using the SAS statistical package are presented. The procedure requires two runs of PROC VARCOMP and results in estimates of additive and non-additive genetic variation. The procedures described can be modified to work on most statistical software packages which can compute variance component estimates. The...
ERIC Educational Resources Information Center
Madhere, Serge
An analytic procedure, efficiency analysis, is proposed for improving the utility of quantitative program evaluation for decision making. The three features of the procedure are explained: (1) for statistical control, it adopts and extends the regression-discontinuity design; (2) for statistical inferences, it de-emphasizes hypothesis testing in…
Consequences of common data analysis inaccuracies in CNS trauma injury basic research.
Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K
2013-05-15
The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify and hopefully avoid these errors in future studies, the articles in seven journals with the highest number of basic science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures, then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated Measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared to the original results. Specifically, an increase of 15.5% occurred with Independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures can allow more-definitive conclusions, facilitate replicability of research results, and enable more accurate translation of those results to the clinic.
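One of the most common errors identified above is running uncorrected pairwise t-tests instead of an omnibus test followed by a corrected post hoc procedure. The sketch below illustrates that contrast on simulated data (not the reviewed studies' data); group names and effect sizes are invented, and Tukey's HSD stands in for whichever corrected post hoc test a given design would call for.

```python
# Minimal sketch: uncorrected pairwise t-tests vs. ANOVA + Tukey HSD.
# All data are simulated for illustration; they are not from the reviewed studies.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
groups = {"sham": rng.normal(10, 2, 12),
          "injury": rng.normal(8, 2, 12),
          "treated": rng.normal(9, 2, 12)}

# Inappropriate: repeated independent t-tests with no multiplicity correction.
for a in groups:
    for b in groups:
        if a < b:
            t, p = stats.ttest_ind(groups[a], groups[b])
            print(f"uncorrected t-test {a} vs {b}: p = {p:.3f}")

# More defensible: omnibus one-way ANOVA, then a corrected post hoc test.
F, p_anova = stats.f_oneway(*groups.values())
print(f"ANOVA: F = {F:.2f}, p = {p_anova:.3f}")
scores = np.concatenate(list(groups.values()))
labels = np.repeat(list(groups.keys()), [len(v) for v in groups.values()])
print(pairwise_tukeyhsd(scores, labels, alpha=0.05))
```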
A Statistical Analysis of Brain Morphology Using Wild Bootstrapping
Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.
2008-01-01
Methods for the analysis of brain morphology, including voxel-based morphology and surface-based morphometries, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two statistical procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correction for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose the use of new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method, called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
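As a rough, hypothetical sketch of the wild-bootstrap idea described above (one response per "voxel", a single covariate, Rademacher weights), the following resamples residuals under the null to obtain a reference distribution for a t-statistic; it is not the authors' heteroscedastic multi-voxel implementation and does not include family-wise error control.

```python
# Sketch of a wild bootstrap test for a single regression coefficient.
# Hypothetical univariate setting; the paper's voxel-wise heteroscedastic
# model and family-wise error control are not reproduced here.
import numpy as np

rng = np.random.default_rng(1)
n = 80
x = rng.normal(size=n)                                      # covariate of interest (e.g., age)
y = 0.3 * x + rng.normal(size=n) * (1 + 0.5 * np.abs(x))    # heteroscedastic errors

def t_stat(x, y):
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

t_obs = t_stat(x, y)

# Wild bootstrap under H0: beta_1 = 0. Residuals from the null (intercept-only)
# fit are multiplied by Rademacher weights (+1/-1), preserving heteroscedasticity.
resid0 = y - y.mean()
B = 2000
t_boot = np.empty(B)
for b in range(B):
    v = rng.choice([-1.0, 1.0], size=n)
    y_star = y.mean() + resid0 * v
    t_boot[b] = t_stat(x, y_star)

p_value = np.mean(np.abs(t_boot) >= np.abs(t_obs))
print(f"observed t = {t_obs:.2f}, wild-bootstrap p = {p_value:.4f}")
```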
Two Paradoxes in Linear Regression Analysis.
Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong
2016-12-25
Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214
Predicting juvenile recidivism: new method, old problems.
Benda, B B
1987-01-01
This prediction study compared three statistical procedures for accuracy using two assessment methods. The criterion is return to a juvenile prison after the first release, and the models tested are logit analysis, predictive attribute analysis, and a Burgess procedure. No significant differences are found between statistics in prediction.
A Primer on Multivariate Analysis of Variance (MANOVA) for Behavioral Scientists
ERIC Educational Resources Information Center
Warne, Russell T.
2014-01-01
Reviews of statistical procedures (e.g., Bangert & Baumberger, 2005; Kieffer, Reese, & Thompson, 2001; Warne, Lazo, Ramos, & Ritter, 2012) show that one of the most common multivariate statistical methods in psychological research is multivariate analysis of variance (MANOVA). However, MANOVA and its associated procedures are often not…
40 CFR 1065.12 - Approval of alternate procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... engine meets all applicable emission standards according to specified procedures. (iii) Use statistical.... (e) We may give you specific directions regarding methods for statistical analysis, or we may approve... statistical tests. Perform the tests as follows: (1) Repeat measurements for all applicable duty cycles at...
ERIC Educational Resources Information Center
Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.
In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…
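The CUSUM idea can be sketched as follows; the statistic, reference value, threshold, and item parameters here are illustrative stand-ins, not the six statistics proposed in the study. Under a Rasch-type model, per-item residuals (observed minus expected score) are accumulated, and a persistent drift beyond a threshold flags potential person misfit during the CAT.

```python
# Illustrative CUSUM person-fit sketch for a sequence of dichotomous CAT items.
# Item difficulties, reference value k, and threshold h are arbitrary choices.
import numpy as np

def rasch_prob(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))

rng = np.random.default_rng(2)
theta_hat = 0.5                         # provisional ability estimate
b_items = rng.normal(0.5, 0.8, 30)      # difficulties of administered items
responses = rng.binomial(1, rasch_prob(theta_hat, b_items))
responses[20:] = 0                      # simulate aberrant behaviour late in the test

k, h = 0.1, 2.5                         # reference value and decision threshold
c_plus, c_minus = 0.0, 0.0
for i, (x, b) in enumerate(zip(responses, b_items), start=1):
    resid = x - rasch_prob(theta_hat, b)          # observed minus expected score
    c_plus = max(0.0, c_plus + resid - k)         # drift toward unexpectedly correct
    c_minus = min(0.0, c_minus + resid + k)       # drift toward unexpectedly incorrect
    if c_plus > h or c_minus < -h:
        print(f"possible misfit flagged at item {i}")
        break
```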
Hoyle, R H
1991-02-01
Indirect measures of psychological constructs are vital to clinical research. On occasion, however, the meaning of indirect measures of psychological constructs is obfuscated by statistical procedures that do not account for the complex relations between items and latent variables and among latent variables. Covariance structure analysis (CSA) is a statistical procedure for testing hypotheses about the relations among items that indirectly measure a psychological construct and relations among psychological constructs. This article introduces clinical researchers to the strengths and limitations of CSA as a statistical procedure for conceiving and testing structural hypotheses that are not tested adequately with other statistical procedures. The article is organized around two empirical examples that illustrate the use of CSA for evaluating measurement models with correlated error terms, higher-order factors, and measured and latent variables.
Rossi, Pierre; Gillet, François; Rohrbach, Emmanuelle; Diaby, Nouhou; Holliger, Christof
2009-01-01
The variability of terminal restriction fragment length polymorphism analysis applied to complex microbial communities was assessed statistically. Recent technological improvements were implemented in the successive steps of the procedure, resulting in a standardized procedure which provided a high level of reproducibility. PMID:19749066
Generalized Appended Product Indicator Procedure for Nonlinear Structural Equation Analysis.
ERIC Educational Resources Information Center
Wall, Melanie M.; Amemiya, Yasuo
2001-01-01
Considers the estimation of polynomial structural models and shows a limitation of an existing method. Introduces a new procedure, the generalized appended product indicator procedure, for nonlinear structural equation analysis. Addresses statistical issues associated with the procedure through simulation. (SLD)
Watanabe, Hiroshi
2012-01-01
Procedures of statistical analysis are reviewed to provide an overview of applications of statistics for general use. Topics that are dealt with are inference on a population, comparison of two populations with respect to means and probabilities, and multiple comparisons. This study is the second part of a series in which we survey medical statistics. Arguments related to statistical associations and regressions will be made in subsequent papers.
Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P
2013-01-01
We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.
Statistical Reform in School Psychology Research: A Synthesis
ERIC Educational Resources Information Center
Swaminathan, Hariharan; Rogers, H. Jane
2007-01-01
Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.
Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette
2013-06-01
High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
Using SPSS to Analyze Book Collection Data.
ERIC Educational Resources Information Center
Townley, Charles T.
1981-01-01
Describes and illustrates Statistical Package for the Social Sciences (SPSS) procedures appropriate for book collection data analysis. Several different procedures for univariate, bivariate, and multivariate analysis are discussed, and applications of procedures for book collection studies are presented. Included are 24 tables illustrating output…
Statistical methods in personality assessment research.
Schinka, J A; LaLone, L; Broeckel, J A
1997-06-01
Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.
Statistical Signal Models and Algorithms for Image Analysis
1984-10-25
In this report, two-dimensional stochastic linear models are used in developing algorithms for image analysis such as classification, segmentation, and object detection in images characterized by textured backgrounds. These models generate two-dimensional random processes as outputs to which statistical inference procedures can naturally be applied. A common thread throughout our algorithms is the interpretation of the inference procedures in terms of linear prediction
Uncertainty Analysis for DAM Projects.
1987-09-01
overwhelming majority of articles published on the use of statistical methodology for geotechnical engineering focus on performance predictions and design ...Results of the present study do not support the adoption of more esoteric statistical procedures except on a special case basis or in research ...influence that recommended statistical procedures might have had on the Carters Project, had they been applied during planning and design phases
Biostatistical analysis of quantitative immunofluorescence microscopy images.
Giles, C; Albrecht, M A; Lam, V; Takechi, R; Mamo, J C
2016-12-01
Semiquantitative immunofluorescence microscopy has become a key methodology in biomedical research. Typical statistical workflows are considered in the context of avoiding pseudo-replication and marginalising experimental error. However, immunofluorescence microscopy naturally generates hierarchically structured data that can be leveraged to improve statistical power and enrich biological interpretation. Herein, we describe a robust distribution fitting procedure and compare several statistical tests, outlining their potential advantages/disadvantages in the context of biological interpretation. Further, we describe tractable procedures for power analysis that incorporates the underlying distribution, sample size and number of images captured per sample. The procedures outlined have significant potential for increasing understanding of biological processes and decreasing both ethical and financial burden through experimental optimization. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
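One way to see the point about hierarchically structured data is a small simulation: images nested within samples. Treating images as independent replicates (pseudo-replication) inflates the false-positive rate, whereas analysing sample-level means does not. The layout (5 samples per group, 10 images per sample) and the variance components below are invented for illustration and are not from the paper.

```python
# Simulation sketch: type I error when images nested in samples are treated
# as independent replicates vs. when sample means are analysed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_samples, n_images = 5, 10        # per group; hypothetical design
sd_sample, sd_image = 1.0, 0.5     # between-sample and within-sample variability

def simulate_group():
    sample_means = rng.normal(0, sd_sample, n_samples)           # no true effect
    return sample_means[:, None] + rng.normal(0, sd_image, (n_samples, n_images))

n_sim, alpha = 2000, 0.05
false_pos_images = false_pos_samples = 0
for _ in range(n_sim):
    g1, g2 = simulate_group(), simulate_group()
    _, p_img = stats.ttest_ind(g1.ravel(), g2.ravel())           # pseudo-replication
    _, p_smp = stats.ttest_ind(g1.mean(axis=1), g2.mean(axis=1)) # sample-level analysis
    false_pos_images += p_img < alpha
    false_pos_samples += p_smp < alpha

print(f"image-level analysis:  type I error ~ {false_pos_images / n_sim:.3f}")
print(f"sample-level analysis: type I error ~ {false_pos_samples / n_sim:.3f}")
```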
NASA Technical Reports Server (NTRS)
Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.
1994-01-01
Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
The use of analysis of variance procedures in biological studies
Williams, B.K.
1987-01-01
The analysis of variance (ANOVA) is widely used in biological studies, yet there remains considerable confusion among researchers about the interpretation of hypotheses being tested. Ambiguities arise when statistical designs are unbalanced, and in particular when not all combinations of design factors are represented in the data. This paper clarifies the relationship among hypothesis testing, statistical modelling and computing procedures in ANOVA for unbalanced data. A simple two-factor fixed effects design is used to illustrate three common parametrizations for ANOVA models, and some associations among these parametrizations are developed. Biologically meaningful hypotheses for main effects and interactions are given in terms of each parametrization, and procedures for testing the hypotheses are described. The standard statistical computing procedures in ANOVA are given along with their corresponding hypotheses. Throughout the development unbalanced designs are assumed and attention is given to problems that arise with missing cells.
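A small sketch of the computing side of this point: for an unbalanced two-factor layout, sequential (Type I) and marginal (Type III, with sum-to-zero contrasts) sums of squares test different hypotheses and generally give different results. The factor names, cell sizes, and data below are invented.

```python
# Sketch: Type I vs. Type III ANOVA for an unbalanced two-factor design.
# Factor names and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(4)
rows = []
cell_sizes = {("low", "ctl"): 8, ("low", "trt"): 3,   # unbalanced cell counts
              ("high", "ctl"): 4, ("high", "trt"): 9}
for (a, b), n in cell_sizes.items():
    effect = (a == "high") * 1.0 + (b == "trt") * 0.5
    for y in rng.normal(effect, 1.0, n):
        rows.append({"dose": a, "group": b, "y": y})
df = pd.DataFrame(rows)

fit = smf.ols("y ~ C(dose, Sum) * C(group, Sum)", data=df).fit()
print(anova_lm(fit, typ=1))   # sequential sums of squares (order-dependent)
print(anova_lm(fit, typ=3))   # marginal sums of squares with sum-to-zero coding
```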
Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleijnen, J.P.C.; Helton, J.C.
1999-04-01
The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
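The five kinds of pattern tests listed above can be sketched for a single input-output scatterplot as follows; the two-phase-flow model is replaced by a toy function, and the quantile binning scheme is an arbitrary choice made for illustration.

```python
# Sketch of scatterplot pattern tests: linear and monotonic association,
# trends in central tendency and variability across input bins, and a
# chi-square test for deviation from randomness on a coarse grid.
# The "model" below is a toy function standing in for the real simulator.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
x = rng.uniform(0, 1, 300)                       # sampled input variable
y = np.sin(3 * x) + rng.normal(0, 0.3, 300)      # model output (toy stand-in)

r, p_r = stats.pearsonr(x, y)                    # (1) linear relationship
rho, p_rho = stats.spearmanr(x, y)               # (2) monotonic relationship
print(f"Pearson r = {r:.2f} (p = {p_r:.3f}); Spearman rho = {rho:.2f} (p = {p_rho:.3f})")

bins = np.quantile(x, [0, 0.2, 0.4, 0.6, 0.8, 1.0])
idx = np.clip(np.digitize(x, bins) - 1, 0, 4)
groups = [y[idx == k] for k in range(5)]
h, p_kw = stats.kruskal(*groups)                 # (3) trend in central tendency
print(f"Kruskal-Wallis H = {h:.2f} (p = {p_kw:.3f})")
iqr = [np.subtract(*np.percentile(g, [75, 25])) for g in groups]
print("IQR by x-bin (trend in variability):", np.round(iqr, 3))   # (4)

# (5) deviation from randomness: cell counts on a 5 x 5 grid over the scatterplot.
y_bins = np.quantile(y, [0, 0.2, 0.4, 0.6, 0.8, 1.0])
y_idx = np.clip(np.digitize(y, y_bins) - 1, 0, 4)
table = np.zeros((5, 5))
np.add.at(table, (idx, y_idx), 1)
chi2, p_chi, dof, _ = stats.chi2_contingency(table)
print(f"grid chi-square = {chi2:.1f} (p = {p_chi:.3f})")
```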
Normality Tests for Statistical Analysis: A Guide for Non-Statisticians
Ghasemi, Asghar; Zahediasl, Saleh
2012-01-01
Statistical errors are common in scientific literature and about 50% of the published articles have at least one error. The assumption of normality needs to be checked for many statistical procedures, namely parametric tests, because their validity depends on it. The aim of this commentary is to overview checking for normality in statistical analysis using SPSS. PMID:23843808
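A minimal sketch of the kind of normality checking discussed above (in Python rather than SPSS): Shapiro-Wilk and D'Agostino-Pearson tests on a roughly normal and a skewed sample, with the usual caveat that large samples make trivial departures "significant".

```python
# Sketch: common normality checks before applying parametric procedures.
# Data are simulated; in SPSS one would typically use the Explore procedure.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
normal_sample = rng.normal(100, 15, 60)
skewed_sample = rng.lognormal(mean=4.6, sigma=0.4, size=60)

for name, sample in [("normal", normal_sample), ("skewed", skewed_sample)]:
    w, p_sw = stats.shapiro(sample)                 # Shapiro-Wilk
    k2, p_dag = stats.normaltest(sample)            # D'Agostino-Pearson K^2
    print(f"{name}: Shapiro-Wilk p = {p_sw:.3f}, D'Agostino p = {p_dag:.3f}")
```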
Quantifying the impact of between-study heterogeneity in multivariate meta-analyses
Jackson, Dan; White, Ian R; Riley, Richard D
2012-01-01
Measures that quantify the impact of heterogeneity in univariate meta-analysis, including the very popular I² statistic, are now well established. Multivariate meta-analysis, where studies provide multiple outcomes that are pooled in a single analysis, is also becoming more commonly used. The question of how to quantify heterogeneity in the multivariate setting is therefore raised. It is the univariate R² statistic, the ratio of the variance of the estimated treatment effect under the random and fixed effects models, that generalises most naturally, so this statistic provides our basis. This statistic is then used to derive a multivariate analogue of I², which we call . We also provide a multivariate H² statistic, the ratio of a generalisation of Cochran's heterogeneity statistic and its associated degrees of freedom, with an accompanying generalisation of the usual I² statistic, . Our proposed heterogeneity statistics can be used alongside all the usual estimates and inferential procedures used in multivariate meta-analysis. We apply our methods to some real datasets and show how our statistics are equally appropriate in the context of multivariate meta-regression, where study level covariate effects are included in the model. Our heterogeneity statistics may be used when applying any procedure for fitting the multivariate random effects model. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22763950
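For reference, the univariate quantities that the paper generalises can be computed directly from Cochran's Q; a minimal sketch with invented study estimates and variances:

```python
# Sketch: univariate Cochran's Q, H^2 = Q/(k-1) and I^2 = max(0, (Q-(k-1))/Q)
# from study-level estimates and variances (values invented for illustration).
import numpy as np

y = np.array([0.30, 0.10, 0.45, 0.25, 0.60])   # study effect estimates
v = np.array([0.02, 0.03, 0.05, 0.01, 0.04])   # within-study variances

w = 1.0 / v
mu_fixed = np.sum(w * y) / np.sum(w)           # fixed-effect pooled estimate
Q = np.sum(w * (y - mu_fixed) ** 2)            # Cochran's heterogeneity statistic
k = len(y)
H2 = Q / (k - 1)
I2 = max(0.0, (Q - (k - 1)) / Q)
print(f"Q = {Q:.2f}, H^2 = {H2:.2f}, I^2 = {100 * I2:.1f}%")
```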
DOT National Transportation Integrated Search
1981-10-01
Two statistical procedures have been developed to estimate hourly or daily aircraft counts. These counts can then be transformed into estimates of instantaneous air counts. The first procedure estimates the stable (deterministic) mean level of hourly...
ERIC Educational Resources Information Center
Nitko, Anthony J.; Hsu, Tse-chi
Item analysis procedures appropriate for domain-referenced classroom testing are described. A conceptual framework within which item statistics can be considered and promising statistics in light of this framework are presented. The sampling fluctuations of the more promising item statistics for sample sizes comparable to the typical classroom…
A scaling procedure for the response of an isolated system with high modal overlap factor
NASA Astrophysics Data System (ADS)
De Rosa, S.; Franco, F.
2008-10-01
The paper deals with a numerical approach that reduces some physical sizes of the solution domain to compute the dynamic response of an isolated system: it has been named Asymptotical Scaled Modal Analysis (ASMA). The proposed numerical procedure alters the input data needed to obtain the classic modal responses to increase the frequency band of validity of the discrete or continuous coordinates model through the definition of a proper scaling coefficient. It is demonstrated that the computational cost remains acceptable while the frequency range of analysis increases. Moreover, with reference to the flexural vibrations of a rectangular plate, the paper discusses the ASMA vs. the statistical energy analysis and the energy distribution approach. Some insights are also given about the limits of the scaling coefficient. Finally it is shown that the linear dynamic response, predicted with the scaling procedure, has the same quality and characteristics of the statistical energy analysis, but it can be useful when the system cannot be solved appropriately by the standard Statistical Energy Analysis (SEA).
Applications of statistics to medical science (1) Fundamental concepts.
Watanabe, Hiroshi
2011-01-01
The conceptual framework of statistical tests and statistical inferences are discussed, and the epidemiological background of statistics is briefly reviewed. This study is one of a series in which we survey the basics of statistics and practical methods used in medical statistics. Arguments related to actual statistical analysis procedures will be made in subsequent papers.
Santori, G; Andorno, E; Morelli, N; Casaccia, M; Bottino, G; Di Domenico, S; Valente, U
2009-05-01
In many Western countries a "minimum volume rule" policy has been adopted as a quality measure for complex surgical procedures. In Italy, the National Transplant Centre set the minimum number of orthotopic liver transplantation (OLT) procedures/y at 25/center. OLT procedures performed in a single center for a reasonably large period may be treated as a time series to evaluate trend, seasonal cycles, and nonsystematic fluctuations. Between January 1, 1987 and December 31, 2006, we performed 563 cadaveric donor OLTs to adult recipients. During 2007, there were another 28 procedures. The greatest numbers of OLTs/y were performed in 2001 (n = 51), 2005 (n = 50), and 2004 (n = 49). A time series analysis performed using R Statistical Software (Foundation for Statistical Computing, Vienna, Austria), a free software environment for statistical computing and graphics, showed an incremental trend after exponential smoothing as well as after seasonal decomposition. The predicted OLT/mo for 2007 calculated with the Holt-Winters exponential smoothing applied to the previous period 1987-2006 helped to identify the months where there was a major difference between predicted and performed procedures. The time series approach may be helpful to establish a minimum volume/y at a single-center level.
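The Holt-Winters step described above can be sketched as follows (in Python rather than R); the monthly counts are simulated rather than the centre's actual OLT series, and the additive trend/seasonality choice is an assumption made for the example.

```python
# Sketch: Holt-Winters exponential smoothing of a monthly procedure-count series
# and a 12-month forecast. The series is simulated, not the real OLT data.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(7)
months = pd.date_range("1987-01-01", periods=240, freq="MS")  # 1987-2006
trend = np.linspace(1.0, 4.0, 240)                            # slow increase in volume
season = 0.8 * np.sin(2 * np.pi * np.arange(240) / 12)        # seasonal cycle
counts = np.clip(np.round(trend + season + rng.normal(0, 0.8, 240)), 0, None)
series = pd.Series(counts, index=months)

fit = ExponentialSmoothing(series, trend="add", seasonal="add",
                           seasonal_periods=12).fit()
forecast_next_year = fit.forecast(12)       # predicted procedures per month
print(forecast_next_year.round(1))
```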
A close examination of double filtering with fold change and t test in microarray analysis
2009-01-01
Background Many researchers use the double filtering procedure with fold change and t test to identify differentially expressed genes, in the hope that the double filtering will provide extra confidence in the results. Due to its simplicity, the double filtering procedure has been popular with applied researchers despite the development of more sophisticated methods. Results This paper, for the first time to our knowledge, provides theoretical insight on the drawback of the double filtering procedure. We show that fold change assumes all genes to have a common variance while t statistic assumes gene-specific variances. The two statistics are based on contradicting assumptions. Under the assumption that gene variances arise from a mixture of a common variance and gene-specific variances, we develop the theoretically most powerful likelihood ratio test statistic. We further demonstrate that the posterior inference based on a Bayesian mixture model and the widely used significance analysis of microarrays (SAM) statistic are better approximations to the likelihood ratio test than the double filtering procedure. Conclusion We demonstrate through hypothesis testing theory, simulation studies and real data examples, that well constructed shrinkage testing methods, which can be united under the mixture gene variance assumption, can considerably outperform the double filtering procedure. PMID:19995439
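For concreteness, the double filtering procedure being critiqued looks roughly like this on simulated two-group expression data; the 2-fold and p < 0.05 cut-offs are conventional choices made for illustration, not values taken from the paper.

```python
# Sketch of the double filtering procedure: declare a gene differentially
# expressed only if |log2 fold change| > 1 AND the t-test p-value < 0.05.
# Data are simulated; cut-offs are conventional ones chosen for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n_genes, n_per_group = 2000, 5
log2_expr_a = rng.normal(0, 1, (n_genes, n_per_group))
log2_expr_b = rng.normal(0, 1, (n_genes, n_per_group))
log2_expr_b[:100] += 1.5                      # first 100 genes truly differential

fold_change = log2_expr_b.mean(axis=1) - log2_expr_a.mean(axis=1)  # log2 scale
t, p = stats.ttest_ind(log2_expr_b, log2_expr_a, axis=1)

selected = (np.abs(fold_change) > 1.0) & (p < 0.05)
print(f"genes passing the double filter: {selected.sum()}")
print(f"of which truly differential:     {selected[:100].sum()}")
```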
Willard, Melissa A Bodnar; McGuffin, Victoria L; Smith, Ruth Waddell
2012-01-01
Salvia divinorum is a hallucinogenic herb that is internationally regulated. In this study, salvinorin A, the active compound in S. divinorum, was extracted from S. divinorum plant leaves using a 5-min extraction with dichloromethane. Four additional Salvia species (Salvia officinalis, Salvia guaranitica, Salvia splendens, and Salvia nemorosa) were extracted using this procedure, and all extracts were analyzed by gas chromatography-mass spectrometry. Differentiation of S. divinorum from other Salvia species was successful based on visual assessment of the resulting chromatograms. To provide a more objective comparison, the total ion chromatograms (TICs) were subjected to principal components analysis (PCA). Prior to PCA, the TICs were subjected to a series of data pretreatment procedures to minimize non-chemical sources of variance in the data set. Successful discrimination of S. divinorum from the other four Salvia species was possible based on visual assessment of the PCA scores plot. To provide a numerical assessment of the discrimination, a series of statistical procedures such as Euclidean distance measurement, hierarchical cluster analysis, Student's t tests, Wilcoxon rank-sum tests, and Pearson product moment correlation were also applied to the PCA scores. The statistical procedures were then compared to determine the advantages and disadvantages for forensic applications.
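A schematic version of the chemometric workflow described above (normalisation, scaling, PCA, then Euclidean distances between class means in score space) is sketched below on random stand-in chromatograms; it omits the specific pretreatment steps and the full set of statistical comparisons used in the study.

```python
# Sketch: total-area normalisation, autoscaling, PCA, and Euclidean distances
# between species means in PCA score space. Chromatograms are random stand-ins.
import numpy as np
from scipy.spatial.distance import euclidean
from sklearn.decomposition import PCA

rng = np.random.default_rng(9)
species = ["divinorum", "officinalis", "splendens"]
tics = {s: np.abs(rng.normal(1.0, 0.2, (4, 500))) for s in species}  # 4 replicates x 500 scans
tics["divinorum"][:, 300:320] += 3.0        # pretend salvinorin A peak region

X = np.vstack([tics[s] for s in species])
labels = np.repeat(species, 4)
X = X / X.sum(axis=1, keepdims=True)                  # total-area normalisation
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)    # autoscaling

scores = PCA(n_components=2).fit_transform(X)
means = {s: scores[labels == s].mean(axis=0) for s in species}
for s in species[1:]:
    d = euclidean(means["divinorum"], means[s])
    print(f"Euclidean distance, divinorum vs {s}: {d:.2f}")
```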
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, Richard O.
The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedure techniques, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
Determining the Statistical Significance of Relative Weights
ERIC Educational Resources Information Center
Tonidandel, Scott; LeBreton, James M.; Johnson, Jeff W.
2009-01-01
Relative weight analysis is a procedure for estimating the relative importance of correlated predictors in a regression equation. Because the sampling distribution of relative weights is unknown, researchers using relative weight analysis are unable to make judgments regarding the statistical significance of the relative weights. J. W. Johnson…
The Statistical Power of Planned Comparisons.
ERIC Educational Resources Information Center
Benton, Roberta L.
Basic principles underlying statistical power are examined; and issues pertaining to effect size, sample size, error variance, and significance level are highlighted via the use of specific hypothetical examples. Analysis of variance (ANOVA) and related methods remain popular, although other procedures sometimes have more statistical power against…
The sumLINK statistic for genetic linkage analysis in the presence of heterogeneity.
Christensen, G B; Knight, S; Camp, N J
2009-11-01
We present the "sumLINK" statistic--the sum of multipoint LOD scores for the subset of pedigrees with nominally significant linkage evidence at a given locus--as an alternative to common methods to identify susceptibility loci in the presence of heterogeneity. We also suggest the "sumLOD" statistic (the sum of positive multipoint LOD scores) as a companion to the sumLINK. sumLINK analysis identifies genetic regions of extreme consistency across pedigrees without regard to negative evidence from unlinked or uninformative pedigrees. Significance is determined by an innovative permutation procedure based on genome shuffling that randomizes linkage information across pedigrees. This procedure for generating the empirical null distribution may be useful for other linkage-based statistics as well. Using 500 genome-wide analyses of simulated null data, we show that the genome shuffling procedure results in the correct type 1 error rates for both the sumLINK and sumLOD. The power of the statistics was tested using 100 sets of simulated genome-wide data from the alternative hypothesis from GAW13. Finally, we illustrate the statistics in an analysis of 190 aggressive prostate cancer pedigrees from the International Consortium for Prostate Cancer Genetics, where we identified a new susceptibility locus. We propose that the sumLINK and sumLOD are ideal for collaborative projects and meta-analyses, as they do not require any sharing of identifiable data between contributing institutions. Further, loci identified with the sumLINK have good potential for gene localization via statistical recombinant mapping, as, by definition, several linked pedigrees contribute to each peak.
This SOP describes the methods and procedures for two types of QA procedures: spot checks of hand entered data, and QA procedures for co-located and split samples. The spot checks were used to determine whether the error rate goal for the input of hand entered data was being att...
Statistics and Discoveries at the LHC (1/4)
Cowan, Glen
2018-02-09
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (3/4)
Cowan, Glen
2018-02-19
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (4/4)
Cowan, Glen
2018-05-22
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (2/4)
Cowan, Glen
2018-04-26
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.
Statistical analysis of the calibration procedure for personnel radiation measurement instruments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, W.J.; Bengston, S.J.; Kalbeitzer, F.L.
1980-11-01
Thermoluminescent analyzer (TLA) calibration procedures were used to estimate personnel radiation exposure levels at the Idaho National Engineering Laboratory (INEL). A statistical analysis is presented herein based on data collected over a six month period in 1979 on four TLA's located in the Department of Energy (DOE) Radiological and Environmental Sciences Laboratory at the INEL. The data were collected according to the day-to-day procedure in effect at that time. Both gamma and beta radiation models are developed. Observed TLA readings of thermoluminescent dosimeters are correlated with known radiation levels. This correlation is then used to predict unknown radiation doses from future analyzer readings of personnel thermoluminescent dosimeters. The statistical techniques applied in this analysis include weighted linear regression, estimation of systematic and random error variances, prediction interval estimation using Scheffe's theory of calibration, the estimation of the ratio of the means of two normal bivariate distributed random variables and their corresponding confidence limits according to Kendall and Stuart, tests of normality, experimental design, a comparison between instruments, and quality control.
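A simplified sketch of the calibration step (weighted linear regression of analyzer reading on known dose, then prediction intervals for new readings) is given below with invented data; it does not reproduce the report's Scheffé-type calibration intervals or the ratio-of-means estimation.

```python
# Sketch: weighted linear regression of TLD analyzer readings on known doses,
# with prediction intervals for new exposures. Data and weight model are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
dose = np.repeat([10.0, 50.0, 100.0, 200.0, 500.0], 6)      # known doses (arbitrary units)
reading = 1.05 * dose + rng.normal(0, 0.03 * dose + 1.0)    # readings; error grows with dose

X = sm.add_constant(dose)
weights = 1.0 / (0.03 * dose + 1.0) ** 2                    # inverse-variance weights (assumed)
fit = sm.WLS(reading, X, weights=weights).fit()
print(fit.params)                                           # intercept and slope

new_dose = sm.add_constant(np.array([75.0, 300.0]), has_constant="add")
pred = fit.get_prediction(new_dose).summary_frame(alpha=0.05)
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])       # prediction intervals for readings
```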
Pei, Yanbo; Tian, Guo-Liang; Tang, Man-Lai
2014-11-10
Stratified data analysis is an important research topic in many biomedical studies and clinical trials. In this article, we develop five test statistics for testing the homogeneity of proportion ratios for stratified correlated bilateral binary data based on an equal correlation model assumption. Bootstrap procedures based on these test statistics are also considered. To evaluate the performance of these statistics and procedures, we conduct Monte Carlo simulations to study their empirical sizes and powers under various scenarios. Our results suggest that the procedure based on score statistic performs well generally and is highly recommended. When the sample size is large, procedures based on the commonly used weighted least square estimate and logarithmic transformation with Mantel-Haenszel estimate are recommended as they do not involve any computation of maximum likelihood estimates requiring iterative algorithms. We also derive approximate sample size formulas based on the recommended test procedures. Finally, we apply the proposed methods to analyze a multi-center randomized clinical trial for scleroderma patients. Copyright © 2014 John Wiley & Sons, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. Ostrouchov; W.E.Doll; D.A.Wolf
2003-07-01
Unexploded ordnance (UXO) surveys encompass large areas, and the cost of surveying these areas can be high. Enactment of earlier protocols for sampling UXO sites has shown the shortcomings of these procedures and led to a call for development of scientifically defensible statistical procedures for survey design and analysis. This project is one of three funded by SERDP to address this need.
ERIC Educational Resources Information Center
Mun, Eun Young; von Eye, Alexander; Bates, Marsha E.; Vaschillo, Evgeny G.
2008-01-01
Model-based cluster analysis is a new clustering procedure to investigate population heterogeneity utilizing finite mixture multivariate normal densities. It is an inferentially based, statistically principled procedure that allows comparison of nonnested models using the Bayesian information criterion to compare multiple models and identify the…
Applications of Nonlinear Principal Components Analysis to Behavioral Data.
ERIC Educational Resources Information Center
Hicks, Marilyn Maginley
1981-01-01
An empirical investigation of the statistical procedure entitled nonlinear principal components analysis was conducted on a known equation and on measurement data in order to demonstrate the procedure and examine its potential usefulness. This method was suggested by R. Gnanadesikan and based on an early paper of Karl Pearson. (Author/AL)
1990-03-01
The equation of statistical energy analysis (SEA) is stated using the procedure indicated in equation (13) [8, 9]. Cited references include: a paper presented at the International Congress on Acoustics, July 24-31, 1986, Toronto, Canada (Paper D6-1); Cuschieri, J.M., "Power flow as a complement to statistical energy analysis"; "Random response of identical one-dimensional subsystems", Journal of Sound and Vibration, 1980, Vol. 70, pp. 343-353; and Lyon, R.H., Statistical Energy Analysis of...
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.
2015-03-26
Controlling common-method bias requires careful assessment of potential sources of bias and implementation of procedural and statistical control methods (Podsakoff).
GWAR: robust analysis and meta-analysis of genome-wide association studies.
Dimou, Niki L; Tsirigos, Konstantinos D; Elofsson, Arne; Bagos, Pantelis G
2017-05-15
In the context of genome-wide association studies (GWAS), there is a variety of statistical techniques in order to conduct the analysis, but, in most cases, the underlying genetic model is usually unknown. Under these circumstances, the classical Cochran-Armitage trend test (CATT) is suboptimal. Robust procedures that maximize the power and preserve the nominal type I error rate are preferable. Moreover, performing a meta-analysis using robust procedures is of great interest and has never been addressed in the past. The primary goal of this work is to implement several robust methods for analysis and meta-analysis in the statistical package Stata and subsequently to make the software available to the scientific community. The CATT under a recessive, additive and dominant model of inheritance as well as robust methods based on the Maximum Efficiency Robust Test statistic, the MAX statistic and the MIN2 were implemented in Stata. Concerning MAX and MIN2, we calculated their asymptotic null distributions relying on numerical integration resulting in a great gain in computational time without losing accuracy. All the aforementioned approaches were employed in a fixed or a random effects meta-analysis setting using summary data with weights equal to the reciprocal of the combined cases and controls. Overall, this is the first complete effort to implement procedures for analysis and meta-analysis in GWAS using Stata. A Stata program and a web-server are freely available for academic users at http://www.compgen.org/tools/GWAR. pbagos@compgen.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
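The model-specific trend tests and the MAX-type robust statistic mentioned above can be sketched directly from a 2 x 3 genotype table; the counts below are invented, and only the simple MAX3 construction is shown (not the MERT, MIN2, or the asymptotic null distributions computed in the package).

```python
# Sketch: Cochran-Armitage trend test under recessive/additive/dominant scores
# and a MAX3-type robust statistic. Genotype counts are invented.
import numpy as np
from scipy import stats

cases    = np.array([120, 230, 150])   # counts for genotypes aa, Aa, AA (hypothetical)
controls = np.array([180, 240, 100])

def catt_z(cases, controls, scores):
    r, n = cases, cases + controls
    N, R = n.sum(), r.sum()
    t = np.asarray(scores, dtype=float)
    num = np.sum(t * (r - n * R / N))                       # observed minus expected, weighted
    var = (R / N) * (1 - R / N) * (np.sum(t**2 * n) - np.sum(t * n) ** 2 / N)
    return num / np.sqrt(var)

z = {"recessive": catt_z(cases, controls, [0, 0, 1]),
     "additive":  catt_z(cases, controls, [0, 0.5, 1]),
     "dominant":  catt_z(cases, controls, [0, 1, 1])}
for model, zval in z.items():
    print(f"{model:9s}: Z = {zval:+.2f}, p = {2 * stats.norm.sf(abs(zval)):.4f}")

max3 = max(abs(v) for v in z.values())
print(f"MAX3 = {max3:.2f}  (its p-value requires a separate null distribution, e.g. permutation)")
```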
On Statistical Analysis of Neuroimages with Imperfect Registration
Kim, Won Hwa; Ravi, Sathya N.; Johnson, Sterling C.; Okonkwo, Ozioma C.; Singh, Vikas
2016-01-01
A variety of studies in neuroscience/neuroimaging seek to perform statistical inference on the acquired brain image scans for diagnosis as well as understanding the pathological manifestation of diseases. To do so, an important first step is to register (or co-register) all of the image data into a common coordinate system. This permits meaningful comparison of the intensities at each voxel across groups (e.g., diseased versus healthy) to evaluate the effects of the disease and/or use machine learning algorithms in a subsequent step. But errors in the underlying registration make this problematic: they either decrease the statistical power or make the follow-up inference tasks less effective/accurate. In this paper, we derive a novel algorithm which offers immunity to local errors in the underlying deformation field obtained from registration procedures. By deriving a deformation invariant representation of the image, the downstream analysis can be made more robust as if one had access to a (hypothetical) far superior registration procedure. Our algorithm is based on recent work on scattering transform. Using this as a starting point, we show how results from harmonic analysis (especially, non-Euclidean wavelets) yield strategies for designing deformation and additive noise invariant representations of large 3-D brain image volumes. We present a set of results on synthetic and real brain images where we achieve robust statistical analysis even in the presence of substantial deformation errors; here, standard analysis procedures significantly under-perform and fail to identify the true signal. PMID:27042168
NASA Technical Reports Server (NTRS)
Parrish, R. V.; Mckissick, B. T.; Steinmetz, G. G.
1979-01-01
A recent modification of the methodology of profile analysis is discussed; it allows testing for differences between two functions as a whole with a single test, rather than point by point with multiple tests. The modification is applied to the examination of the issue of motion/no motion conditions as shown by the lateral deviation curve as a function of engine cut speed of a piloted 737-100 simulator. The results of this application are presented along with those of more conventional statistical test procedures on the same simulator data.
NASA Technical Reports Server (NTRS)
Wong, K. W.
1974-01-01
In lunar phototriangulation, there is a complete lack of accurate ground control points. The accuracy analysis of the results of lunar phototriangulation must, therefore, be completely dependent on statistical procedure. It was the objective of this investigation to examine the validity of the commonly used statistical procedures, and to develop both mathematical techniques and computer software for evaluating (1) the accuracy of lunar phototriangulation; (2) the contribution of the different types of photo support data on the accuracy of lunar phototriangulation; (3) accuracy of absolute orientation as a function of the accuracy and distribution of both the ground and model points; and (4) the relative slope accuracy between any triangulated pass points.
Global aesthetic surgery statistics: a closer look.
Heidekrueger, Paul I; Juran, S; Ehrl, D; Aung, T; Tanna, N; Broer, P Niclas
2017-08-01
Obtaining quality global statistics about surgical procedures remains an important yet challenging task. The International Society of Aesthetic Plastic Surgery (ISAPS) reports the total number of surgical and non-surgical procedures performed worldwide on a yearly basis. While providing valuable insight, ISAPS' statistics leave two important factors unaccounted for: (1) the underlying base population, and (2) the number of surgeons performing the procedures. Statistics of the published ISAPS' 'International Survey on Aesthetic/Cosmetic Surgery' were analysed by country, taking into account the underlying national base population according to the official United Nations population estimates. Further, the number of surgeons per country was used to calculate the number of surgeries performed per surgeon. In 2014, based on ISAPS statistics, national surgical procedures ranked in the following order: 1st USA, 2nd Brazil, 3rd South Korea, 4th Mexico, 5th Japan, 6th Germany, 7th Colombia, and 8th France. When considering the size of the underlying national populations, the demand for surgical procedures per 100,000 people changes the overall ranking substantially. It was also found that the rate of surgical procedures per surgeon shows great variation between the responding countries. While the US and Brazil are often quoted as the countries with the highest demand for plastic surgery, according to the presented analysis, other countries surpass these countries in surgical procedures per capita. While data acquisition and quality should be improved in the future, valuable insight regarding the demand for surgical procedures can be gained by taking specific demographic and geographic factors into consideration.
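The adjustment described above, procedures per 100,000 inhabitants and procedures per surgeon, is simple arithmetic; the sketch below uses made-up numbers purely to show the calculation, not the actual ISAPS or UN figures.

```python
# Sketch of the two adjustments: procedures per 100,000 population and
# procedures per plastic surgeon. All numbers are hypothetical placeholders.
countries = {
    #            (surgical procedures, population,   surgeons)
    "Country A": (1_400_000,           320_000_000,  6_500),
    "Country B": (1_200_000,           205_000_000,  5_500),
    "Country C": (  450_000,            50_000_000,  2_300),
}

for name, (procedures, population, surgeons) in countries.items():
    per_100k = procedures / population * 100_000
    per_surgeon = procedures / surgeons
    print(f"{name}: {per_100k:7.1f} procedures per 100,000; "
          f"{per_surgeon:6.1f} per surgeon")
```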
ICAP - An Interactive Cluster Analysis Procedure for analyzing remotely sensed data
NASA Technical Reports Server (NTRS)
Wharton, S. W.; Turner, B. J.
1981-01-01
An Interactive Cluster Analysis Procedure (ICAP) was developed to derive classifier training statistics from remotely sensed data. ICAP differs from conventional clustering algorithms by allowing the analyst to optimize the cluster configuration by inspection, rather than by manipulating process parameters. Control of the clustering process alternates between the algorithm, which creates new centroids and forms clusters, and the analyst, who can evaluate and elect to modify the cluster structure. Clusters can be deleted, or lumped together pairwise, or new centroids can be added. A summary of the cluster statistics can be requested to facilitate cluster manipulation. The principal advantage of this approach is that it allows prior information (when available) to be used directly in the analysis, since the analyst interacts with ICAP in a straightforward manner, using basic terms with which he is more likely to be familiar. Results from testing ICAP showed that an informed use of ICAP can improve classification, as compared to an existing cluster analysis procedure.
Statistics for People Who (Think They) Hate Statistics. Third Edition
ERIC Educational Resources Information Center
Salkind, Neil J.
2007-01-01
This text teaches an often intimidating and difficult subject in a way that is informative, personable, and clear. The author takes students through various statistical procedures, beginning with correlation and graphical representation of data and ending with inferential techniques and analysis of variance. In addition, the text covers SPSS, and…
DOT National Transportation Integrated Search
1996-04-01
This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.
Statistical analysis and digital processing of the Mössbauer spectra
NASA Astrophysics Data System (ADS)
Prochazka, Roman; Tucek, Pavel; Tucek, Jiri; Marek, Jaroslav; Mashlan, Miroslav; Pechousek, Jiri
2010-02-01
This work is focused on using the statistical methods and development of the filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in the measured spectra are used in many scientific areas. The use of a pure statistical approach in accumulated Mössbauer spectra filtration is described. In Mössbauer spectroscopy, the noise can be considered as a Poisson statistical process with a Gaussian distribution for high numbers of observations. This noise is a superposition of the non-resonant photons counting with electronic noise (from γ-ray detection and discrimination units), and the velocity system quality that can be characterized by the velocity nonlinearities. The possibility of a noise-reducing process using a new design of statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of the given Mössbauer spectra. The filter procedure is based on a periodogram method that makes it possible to assign the statistically important components in the spectral domain. The significance level for these components is then feedback-controlled using the correlation coefficient test results. The estimation of the theoretical correlation coefficient level which corresponds to the spectrum resolution is performed. Correlation coefficient test is based on comparison of the theoretical and the experimental correlation coefficients given by the Spearman method. The correctness of this solution was analyzed by a series of statistical tests and confirmed by many spectra measured with increasing statistical quality for a given sample (absorber). The effect of this filter procedure depends on the signal-to-noise ratio and the applicability of this method has binding conditions.
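A schematic version of the periodogram-plus-correlation-feedback idea can be sketched as follows; the synthetic "spectrum", the noise model, the component-selection rule, and the Spearman target are all stand-ins for the procedure actually developed in the paper.

```python
# Sketch: frequency-domain filtering of a noisy transmission spectrum, keeping
# the strongest periodogram components and relaxing the cut-off until the
# Spearman correlation between filtered and raw spectra reaches a target value.
# The spectrum, noise model, and target are illustrative stand-ins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(12)
v = np.linspace(-10, 10, 512)                              # velocity axis (mm/s)
clean = 1.0 - 0.3 * np.exp(-((v - 1.0) / 0.3) ** 2) - 0.3 * np.exp(-((v + 1.0) / 0.3) ** 2)
raw = rng.poisson(clean * 1e4) / 1e4                       # counting (Poisson) noise

spec_fft = np.fft.rfft(raw - raw.mean())
power = np.abs(spec_fft) ** 2                              # periodogram of the spectrum

def filtered(keep_fraction):
    cutoff = np.quantile(power, 1.0 - keep_fraction)       # keep strongest components
    kept = np.where(power >= cutoff, spec_fft, 0.0)
    return raw.mean() + np.fft.irfft(kept, n=len(raw))

target_rho, keep = 0.9, 0.02
smooth = filtered(keep)
rho, _ = stats.spearmanr(smooth, raw)
while rho < target_rho and keep < 0.5:                     # feedback on the correlation test
    keep += 0.02
    smooth = filtered(keep)
    rho, _ = stats.spearmanr(smooth, raw)
print(f"kept {keep:.0%} of frequency components, Spearman rho = {rho:.3f}")
```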
Statistical Software and Artificial Intelligence: A Watershed in Applications Programming.
ERIC Educational Resources Information Center
Pickett, John C.
1984-01-01
AUTOBJ and AUTOBOX are revolutionary software programs which contain the first application of artificial intelligence to statistical procedures used in analysis of time series data. The artificial intelligence included in the programs and program features are discussed. (JN)
WINPEPI updated: computer programs for epidemiologists, and their teaching potential
2011-01-01
Background The WINPEPI computer programs for epidemiologists are designed for use in practice and research in the health field and as learning or teaching aids. The programs are free, and can be downloaded from the Internet. Numerous additions have been made in recent years. Implementation There are now seven WINPEPI programs: DESCRIBE, for use in descriptive epidemiology; COMPARE2, for use in comparisons of two independent groups or samples; PAIRSetc, for use in comparisons of paired and other matched observations; LOGISTIC, for logistic regression analysis; POISSON, for Poisson regression analysis; WHATIS, a "ready reckoner" utility program; and ETCETERA, for miscellaneous other procedures. The programs now contain 122 modules, each of which provides a number, sometimes a large number, of statistical procedures. The programs are accompanied by a Finder that indicates which modules are appropriate for different purposes. The manuals explain the uses, limitations and applicability of the procedures, and furnish formulae and references. Conclusions WINPEPI is a handy resource for a wide variety of statistical routines used by epidemiologists. Because of its ready availability, portability, ease of use, and versatility, WINPEPI has a considerable potential as a learning and teaching aid, both with respect to practical procedures in the planning and analysis of epidemiological studies, and with respect to important epidemiological concepts. It can also be used as an aid in the teaching of general basic statistics. PMID:21288353
NASA Astrophysics Data System (ADS)
Fan, Daidu; Tu, Junbiao; Cai, Guofu; Shang, Shuai
2015-06-01
Grain-size analysis is a basic routine in sedimentology and related fields, but diverse methods of sample collection, processing and statistical analysis often make direct comparisons and interpretations difficult or even impossible. In this paper, 586 published grain-size datasets from the Qiantang Estuary (East China Sea) sampled and analyzed by the same procedures were merged and their textural parameters calculated by a percentile and two moment methods. The aim was to explore which of the statistical procedures performed best in the discrimination of three distinct sedimentary units on the tidal flats of the middle Qiantang Estuary. A Gaussian curve-fitting method served to simulate mixtures of two normal populations having different modal sizes, sorting values and size distributions, enabling a better understanding of the impact of finer tail components on textural parameters, as well as the proposal of a unifying descriptive nomenclature. The results show that percentile and moment procedures yield almost identical results for mean grain size, and that sorting values are also highly correlated. However, more complex relationships exist between percentile and moment skewness (kurtosis), changing from positive to negative correlations when the proportions of the finer populations decrease below 35% (10%). This change results from the overweighting of tail components in moment statistics, which stands in sharp contrast to the underweighting or complete amputation of small tail components by the percentile procedure. Intercomparisons of bivariate plots suggest an advantage of the Friedman & Johnson moment procedure over the McManus moment method in terms of the description of grain-size distributions, and over the percentile method by virtue of a greater sensitivity to small variations in tail components. The textural parameter scalings of Folk & Ward were translated into their Friedman & Johnson moment counterparts by application of mathematical functions derived by regression analysis of measured and modeled grain-size data, or by determining the abscissa values of intersections between auxiliary lines running parallel to the x-axis and vertical lines corresponding to the descriptive percentile limits along the ordinate of representative bivariate plots. Twofold limits were extrapolated for the moment statistics in relation to single descriptive terms in the cases of skewness and kurtosis by considering both positive and negative correlations between percentile and moment statistics. The extrapolated descriptive scalings were further validated by examining entire size-frequency distributions simulated by mixing two normal populations of designated modal size and sorting values, but varying in mixing ratios. These were found to match well in most of the proposed scalings, although platykurtic and very platykurtic categories were questionable when the proportion of the finer population was below 5%. Irrespective of the statistical procedure, descriptive nomenclatures should therefore be cautiously used when tail components contribute less than 5% to grain-size distributions.
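For readers who want to reproduce the percentile-versus-moment comparison on their own data, the sketch below implements the standard Folk and Ward graphic measures and the logarithmic method-of-moments measures from a binned phi-scale distribution. The binning, the toy bimodal mixture, and the coarse-to-fine cumulative convention are assumptions for illustration; the Friedman and Johnson moment variant used in the paper differs in detail.

```python
import numpy as np

def percentile_phi(midpoints_phi, weight_pct, p):
    """Phi value at the p-th cumulative percentile (coarse-to-fine convention assumed)."""
    return np.interp(p, np.cumsum(weight_pct), midpoints_phi)

def folk_ward(midpoints_phi, weight_pct):
    q = {p: percentile_phi(midpoints_phi, weight_pct, p) for p in (5, 16, 25, 50, 75, 84, 95)}
    mean = (q[16] + q[50] + q[84]) / 3
    sorting = (q[84] - q[16]) / 4 + (q[95] - q[5]) / 6.6
    skew = ((q[16] + q[84] - 2 * q[50]) / (2 * (q[84] - q[16]))
            + (q[5] + q[95] - 2 * q[50]) / (2 * (q[95] - q[5])))
    kurt = (q[95] - q[5]) / (2.44 * (q[75] - q[25]))
    return mean, sorting, skew, kurt

def moments(midpoints_phi, weight_pct):
    w = np.asarray(weight_pct) / 100.0
    m = np.asarray(midpoints_phi)
    mean = np.sum(w * m)
    sd = np.sqrt(np.sum(w * (m - mean) ** 2))
    skew = np.sum(w * (m - mean) ** 3) / sd ** 3
    kurt = np.sum(w * (m - mean) ** 4) / sd ** 4
    return mean, sd, skew, kurt

# Toy bimodal distribution: a coarse main population plus a 20% finer tail population.
phi_mid = np.arange(0.25, 8.0, 0.5)
pct = 80 * np.exp(-0.5 * ((phi_mid - 2.0) / 0.6) ** 2) + 20 * np.exp(-0.5 * ((phi_mid - 5.5) / 0.8) ** 2)
pct = 100 * pct / pct.sum()
print(folk_ward(phi_mid, pct))
print(moments(phi_mid, pct))
```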
Quality assurance software inspections at NASA Ames: Metrics for feedback and modification
NASA Technical Reports Server (NTRS)
Wenneson, G.
1985-01-01
Software inspections, a set of formal technical review procedures held at selected key points during software development to find defects in software documents, are described in terms of history, participants, tools, procedures, statistics, and database analysis.
40 CFR 610.10 - Program purpose.
Code of Federal Regulations, 2013 CFR
2013-07-01
... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...
O'Connor, B P
2000-08-01
Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
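The logic of Horn-style parallel analysis is compact enough to sketch outside SPSS or SAS. The Python fragment below is a generic illustration (random normal data, 95th-percentile criterion, correlation-matrix eigenvalues), not O'Connor's published programs.

```python
import numpy as np

def parallel_analysis(X, n_iter=500, percentile=95, seed=0):
    """Count components whose observed eigenvalues exceed those of random data.

    X is an n-by-p data matrix; eigenvalues are taken from the correlation matrix,
    as in classical Horn-style parallel analysis.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    rand_eig = np.empty((n_iter, p))
    for i in range(n_iter):
        R = rng.normal(size=(n, p))
        rand_eig[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(R, rowvar=False)))[::-1]
    threshold = np.percentile(rand_eig, percentile, axis=0)
    return int(np.sum(obs_eig > threshold)), obs_eig, threshold

# Example: two correlated blocks of variables should yield two retained components.
rng = np.random.default_rng(1)
f = rng.normal(size=(300, 2))
X = np.hstack([f[:, [0]] + 0.5 * rng.normal(size=(300, 3)),
               f[:, [1]] + 0.5 * rng.normal(size=(300, 3))])
k, _, _ = parallel_analysis(X)
print("components to retain:", k)
```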
AGR-1 Thermocouple Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Einerson
2012-05-01
This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, AGR-1 test configuration and test procedure, overview of AGR-1 measured data, and overview of physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties arising from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure described for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.
Evaluation Using Sequential Trials Methods.
ERIC Educational Resources Information Center
Cohen, Mark E.; Ralls, Stephen A.
1986-01-01
Although dental school faculty as well as practitioners are interested in evaluating products and procedures used in clinical practice, research design and statistical analysis can sometimes pose problems. Sequential trials methods provide an analytical structure that is both easy to use and statistically valid. (Author/MLW)
ERIC Educational Resources Information Center
Mauriello, David
1984-01-01
Reviews an interactive statistical analysis package (designed to run on 8- and 16-bit machines that utilize CP/M 80 and MS-DOS operating systems), considering its features and uses, documentation, operation, and performance. The package consists of 40 general purpose statistical procedures derived from the classic textbook "Statistical…
Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beggs, W.J.
1981-02-01
This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.
An Introduction to Path Analysis
ERIC Educational Resources Information Center
Wolfe, Lee M.
1977-01-01
The analytical procedure of path analysis is described in terms of its use in nonexperimental settings in the social sciences. The description assumes a moderate statistical background on the part of the reader. (JKS)
Santori, G; Fontana, I; Bertocchi, M; Gasloli, G; Valente, U
2010-05-01
Following the example of many Western countries, where a "minimum volume rule" policy has been adopted as a quality parameter for complex surgical procedures, the Italian National Transplant Centre set the minimum number of kidney transplantation procedures/y at 30/center. The number of procedures performed in a single center over a large period may be treated as a time series to evaluate trends, seasonal cycles, and nonsystematic fluctuations. Between January 1, 1983, and December 31, 2007, we performed 1376 procedures in adult or pediatric recipients from living or cadaveric donors. The greatest numbers of cases/y were performed in 1998 (n = 86) followed by 2004 (n = 82), 1996 (n = 75), and 2003 (n = 73). A time series analysis performed using R Statistical Software (Foundation for Statistical Computing, Vienna, Austria), a free software environment for statistical computing and graphics, showed an overall upward trend after exponential smoothing as well as after seasonal decomposition. However, starting from 2005, we observed a decreasing trend in the series. Holt-Winters exponential smoothing applied to the period 1983 to 2007 suggested an expected 58 procedures for 2008, while 52 were actually performed in that year. The time series approach may be helpful to establish a minimum volume/y at a single-center level. Copyright (c) 2010 Elsevier Inc. All rights reserved.
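A minimal sketch of the forecasting step, assuming hypothetical yearly counts rather than the centre's actual series (the original analysis was done in R); with annual totals there is no seasonal term, so only an additive trend is smoothed.

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical yearly procedure counts for 1983-2007 (25 values, not the centre's data).
rng = np.random.default_rng(2)
counts = np.clip(np.round(20 + 2.5 * np.arange(25) + rng.normal(0, 8, 25)), 5, None)

# Additive-trend (Holt) exponential smoothing; annual totals, so no seasonal component.
fit = ExponentialSmoothing(counts, trend="add", seasonal=None).fit()
print("forecast for the next year:", fit.forecast(1)[0])
```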
ERIC Educational Resources Information Center
Kadhi, Tau; Holley, D.
2010-01-01
The following report gives the statistical findings of the July 2010 TMSL Bar results. Procedures: Data is pre-existing and was given to the Evaluator by email from the Registrar and Dean. Statistical analyses were run using SPSS 17 to address the following research questions: 1. What are the statistical descriptors of the July 2010 overall TMSL…
Toppi, J; Petti, M; Vecchiato, G; Cincotti, F; Salinari, S; Mattia, D; Babiloni, F; Astolfi, L
2013-01-01
Partial Directed Coherence (PDC) is a spectral multivariate estimator of effective connectivity, relying on the concept of Granger causality. Although its original definition derived directly from information theory, two modifications were introduced to provide better physiological interpretations of the estimated networks: i) normalization of the estimator according to rows, and ii) a squared transformation. In the present paper we investigated the effect of PDC normalization on the performance of the statistical validation process applied to the investigated connectivity patterns under different conditions of signal-to-noise ratio (SNR) and amount of data available for the analysis. Results of the statistical analysis revealed an effect of PDC normalization only on the percentages of type I and type II errors incurred when using the shuffling procedure for the assessment of connectivity patterns. The PDC formulation had no effect on the performance of the validation process executed instead by means of the asymptotic statistic approach. Moreover, the percentages of both false positives and false negatives committed by the asymptotic statistic approach are always lower than those of the shuffling procedure for each type of normalization.
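For orientation, squared PDC can be computed from fitted MVAR coefficients as sketched below. The column-normalized form is the classical estimator and the row-normalized form is the variant discussed above; the toy three-channel system, the fixed model order, and the omission of the shuffling and asymptotic validation steps are all simplifications, not the authors' code.

```python
import numpy as np
from statsmodels.tsa.api import VAR

def squared_pdc(coefs, freqs, normalize="column"):
    """Squared PDC from MVAR coefficients (coefs has shape (order, k, k)).

    normalize="column" gives the classical Baccala-Sameshima estimator;
    normalize="row" gives the row-normalized variant discussed in the abstract.
    """
    order, k, _ = coefs.shape
    pdc = np.empty((len(freqs), k, k))
    for fi, f in enumerate(freqs):
        A_bar = np.eye(k, dtype=complex)
        for r in range(order):
            A_bar -= coefs[r] * np.exp(-2j * np.pi * f * (r + 1))
        num = np.abs(A_bar) ** 2
        if normalize == "column":
            pdc[fi] = num / num.sum(axis=0, keepdims=True)
        else:
            pdc[fi] = num / num.sum(axis=1, keepdims=True)
    return pdc

# Toy 3-channel system in which channel 0 drives channel 1.
rng = np.random.default_rng(3)
n, k = 2000, 3
x = rng.normal(size=(n, k))
for t in range(1, n):
    x[t, 1] += 0.6 * x[t - 1, 0]
coefs = VAR(x).fit(maxlags=2).coefs          # shape (order, k, k)
freqs = np.linspace(0.01, 0.5, 50)           # normalized frequencies
pdc = squared_pdc(coefs, freqs, normalize="row")
print(pdc.mean(axis=0).round(2))             # entry (i, j): average influence of j on i
```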
NASA DOE POD NDE Capabilities Data Book
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection system, personnel, and protocol demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
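The 0.90 POD / 95% confidence target can be illustrated with the familiar binomial argument below; this is only the fixed-sample version of the criterion, not DOEPOD's Wald-type sequential procedure, which re-evaluates the decision as each observation arrives.

```python
# Smallest number of consecutive hits needed to demonstrate POD >= 0.90 with 95% confidence:
# if the true POD were only 0.90, the chance of n straight hits is 0.90**n, which must
# fall below 0.05 for the demonstration to be conclusive (the familiar "29 of 29" criterion).
n = 1
while 0.90 ** n > 0.05:
    n += 1
print(n)  # 29
```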
Dangers in Using Analysis of Covariance Procedures.
ERIC Educational Resources Information Center
Campbell, Kathleen T.
Problems associated with the use of analysis of covariance (ANCOVA) as a statistical control technique are explained. Three problems relate to the use of "OVA" methods (analysis of variance, analysis of covariance, multivariate analysis of variance, and multivariate analysis of covariance) in general. These are: (1) the wasting of information when…
LaBudde, Robert A; Harnly, James M
2012-01-01
A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
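A minimal sketch of the basic observed statistic, assuming hypothetical replicate counts and a Wilson interval (the harmonized guideline may prescribe a different interval).

```python
from statsmodels.stats.proportion import proportion_confint

# Hypothetical collaborative study: 12 replicates of the target botanical, 11 identified.
identified, replicates = 11, 12
poi = identified / replicates
low, high = proportion_confint(identified, replicates, alpha=0.05, method="wilson")
print(f"POI = {poi:.2f}, 95% CI ({low:.2f}, {high:.2f})")
```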
Analysis of Sensitivity Experiments - An Expanded Primer
2017-03-08
diehard practitioners. The difficulty associated with mastering statistical inference presents a true dilemma. Statistics is an extremely applied...lost, perhaps forever. In other words, when on this safari, you need a guide. This report is designed to be a guide, of sorts. It focuses on analytical...estimated accurately if our analysis is to have real meaning. For this reason, the sensitivity test procedure is designed to concentrate measurements
Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.
Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory
2017-01-01
Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes their use to examine the performance of given testing procedures or associations between investigated factors to be difficult. We turn our focus on the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g. partial EPVs; (2) developing optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures, their constructions and properties with an eye towards practical applications.
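The link between the EPV and the ROC curve can be checked numerically: for continuous statistics, the expected p-value under the alternative equals one minus the area under the ROC curve. The Monte Carlo sketch below assumes a simple normal shift alternative and only illustrates that identity; it is not the authors' optimal-combination methodology.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
m = 5000
t0 = rng.normal(0.0, 1.0, m)   # test statistic under H0
t1 = rng.normal(1.0, 1.0, m)   # test statistic under a shifted alternative

# Expected p-value: average of the null survival function at statistics drawn under H1.
epv = norm.sf(t1).mean()

# ROC connection: for continuous statistics, EPV = P(T0 >= T1) = 1 - AUC.
one_minus_auc = (t0[:, None] >= t1[None, :]).mean()
print(round(epv, 3), round(one_minus_auc, 3))   # both close to 0.24 for this shift
```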
Money Does Matter Somewhere: A Reply to Hanushek.
ERIC Educational Resources Information Center
Hedges, Larry V.; And Others
1994-01-01
Replies to E. A. Hanushek's questioning of the validity of meta-analysis as used by the authors in analyzing resource allocation and its effects on improving student academic performance. Statistical analysis procedures are examined. (GLR)
Hudson-Shore, Michelle
2016-12-01
The Annual Statistics of Scientific Procedures on Living Animals Great Britain 2015 indicate that the Home Office were correct in recommending that caution should be exercised when interpreting the 2014 data as an apparent decline in animal experiments. The 2015 report shows that, as the changes to the format of the annual statistics have become more familiar and less problematic, there has been a re-emergence of the upward trend in animal research and testing in Great Britain. The 2015 statistics report an increase in animal procedures (up to 4,142,631) and in the number of animals used (up to 4,069,349). This represents 1% more than the totals in 2013, and a 7% increase on the procedures reported in 2014. This paper details an analysis of these most recent statistics, providing information on overall animal use and highlighting specific issues associated with genetically-altered animals, dogs and primates. It also reflects on areas of the new format that have previously been highlighted as being problematic, and concludes with a discussion about the use of animals in regulatory research and testing, and how there are significant missed opportunities for replacing some of the animal-based tests in this area. 2016 FRAME.
Reproducibility-optimized test statistic for ranking genes in microarray studies.
Elo, Laura L; Filén, Sanna; Lahesmaa, Riitta; Aittokallio, Tero
2008-01-01
A principal goal of microarray studies is to identify the genes showing differential expression under distinct conditions. In such studies, the selection of an optimal test statistic is a crucial challenge, which depends on the type and amount of data under analysis. While previous studies on simulated or spike-in datasets do not provide practical guidance on how to choose the best method for a given real dataset, we introduce an enhanced reproducibility-optimization procedure, which enables the selection of a suitable gene-ranking statistic directly from the data. In comparison with existing ranking methods, the reproducibility-optimized statistic shows good performance consistently under various simulated conditions and on the Affymetrix spike-in dataset. Further, the feasibility of the novel statistic is confirmed in a practical research setting using data from an in-house cDNA microarray study of asthma-related gene expression changes. These results suggest that the procedure facilitates the selection of an appropriate test statistic for a given dataset without relying on a priori assumptions, which may bias the findings and their interpretation. Moreover, the general reproducibility-optimization procedure is not limited to detecting differential expression only but could be extended to a wide range of other applications as well.
New insights into old methods for identifying causal rare variants.
Wang, Haitian; Huang, Chien-Hsun; Lo, Shaw-Hwa; Zheng, Tian; Hu, Inchi
2011-11-29
The advance of high-throughput next-generation sequencing technology makes possible the analysis of rare variants. However, the investigation of rare variants in unrelated-individuals data sets faces the challenge of low power, and most methods circumvent the difficulty by using various collapsing procedures based on genes, pathways, or gene clusters. We suggest a new way to identify causal rare variants using the F-statistic and sliced inverse regression. The procedure is tested on the data set provided by the Genetic Analysis Workshop 17 (GAW17). After preliminary data reduction, we ranked markers according to their F-statistic values. Top-ranked markers were then subjected to sliced inverse regression, and those with higher absolute coefficients in the most significant sliced inverse regression direction were selected. The procedure yields good false discovery rates for the GAW17 data and thus is a promising method for future study on rare variants.
PROC IRT: A SAS Procedure for Item Response Theory
Matlock Cole, Ki; Paek, Insu
2017-01-01
This article reviews the item response theory procedure (PROC IRT) in SAS/STAT 14.1 for conducting item response theory (IRT) analyses of dichotomous and polytomous datasets that are unidimensional or multidimensional. The review provides an overview of available features, including models, estimation procedures, interfacing, input, and output files. A small-scale simulation study evaluates the IRT model parameter recovery of the PROC IRT procedure. The use of the IRT procedure in Statistical Analysis Software (SAS) may be useful for researchers who frequently utilize SAS for analyses, research, and teaching.
NASA Technical Reports Server (NTRS)
Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.
1983-01-01
The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical test results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zdarek, J.; Pecinka, L.
Leak-before-break (LBB) analysis of WWER type reactors in the Czech and Slovak Republics is summarized in this paper. Legislative bases, required procedures, and validation and verification of procedures are discussed. A list of significant issues identified during the application of LBB analysis is presented. The results of statistical evaluation of crack length characteristics are presented and compared for the WWER 440 Type 230 and 213 reactors and for the WWER 1000 Type 302, 320 and 338 reactors.
On Muthen's Maximum Likelihood for Two-Level Covariance Structure Models
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Hayashi, Kentaro
2005-01-01
Data in social and behavioral sciences are often hierarchically organized. Special statistical procedures that take into account the dependence of such observations have been developed. Among procedures for 2-level covariance structure analysis, Muthen's maximum likelihood (MUML) has the advantage of easier computation and faster convergence. When…
A Comparison of Approaches for Setting Proficiency Standards.
ERIC Educational Resources Information Center
Koffler, Stephen L.
This research compared the cut-off scores estimated from an empirical procedure (Contrasting group method) to those determined from a more theoretical process (Nedelsky method). A methodological and statistical framework was also provided for analysis of the data to obtain the most appropriate standard using the empirical procedure. Data were…
Traeger, Adrian C; Skinner, Ian W; Hübscher, Markus; Lee, Hopin; Moseley, G Lorimer; Nicholas, Michael K; Henschke, Nicholas; Refshauge, Kathryn M; Blyth, Fiona M; Main, Chris J; Hush, Julia M; Pearce, Garry; Lo, Serigne; McAuley, James H
Statistical analysis plans increase the transparency of decisions made in the analysis of clinical trial results. The purpose of this paper is to detail the planned analyses for the PREVENT trial, a randomized, placebo-controlled trial of patient education for acute low back pain. We report the pre-specified principles, methods, and procedures to be adhered to in the main analysis of the PREVENT trial data. The primary outcome analysis will be based on Mixed Models for Repeated Measures (MMRM), which can test treatment effects at specific time points, and the assumptions of this analysis are outlined. We also outline the treatment of secondary outcomes and planned sensitivity analyses. We provide decisions regarding the treatment of missing data, handling of descriptive and process measure data, and blinded review procedures. Making public the pre-specified statistical analysis plan for the PREVENT trial minimizes the potential for bias in the analysis of trial data, and in the interpretation and reporting of trial results. ACTRN12612001180808 (https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?ACTRN=12612001180808). Copyright © 2017 Associação Brasileira de Pesquisa e Pós-Graduação em Fisioterapia. Published by Elsevier Editora Ltda. All rights reserved.
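As a rough illustration of the modelling approach (not the trial's actual analysis code or data), a mixed model for repeated measures can be approximated in Python as below; a full MMRM would use an unstructured within-subject covariance rather than the random intercept assumed here, and the variable names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format trial data: a pain score per participant at three follow-up weeks.
rng = np.random.default_rng(5)
n = 60
df = pd.DataFrame({
    "id": np.repeat(np.arange(n), 3),
    "week": np.tile([1, 4, 12], n),
    "group": np.repeat(rng.integers(0, 2, n), 3),
})
df["pain"] = (5 - 0.8 * df["group"] - 0.05 * df["week"]
              + rng.normal(0, 1, len(df))
              + np.repeat(rng.normal(0, 0.7, n), 3))

# Random-intercept approximation to an MMRM: treatment-by-time fixed effects,
# subject as the grouping factor.
model = smf.mixedlm("pain ~ group * C(week)", df, groups=df["id"]).fit()
print(model.summary())
```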
School District Enrollment Projections: A Comparison of Three Methods.
ERIC Educational Resources Information Center
Pettibone, Timothy J.; Bushan, Latha
This study assesses three methods of forecasting school enrollments: the cohort-survival method (grade progression), the statistical forecasting procedure developed by the Statistical Analysis System (SAS) Institute, and a simple ratio computation. The three methods were used to forecast school enrollments for kindergarten through grade 12 in a…
Some Conceptual Deficiencies in "Developmental" Behavior Genetics.
ERIC Educational Resources Information Center
Gottlieb, Gilbert
1995-01-01
Criticizes the application of the statistical procedures of the population-genetic approach within evolutionary biology to the study of psychological development. Argues that the application of the statistical methods of population genetics--primarily the analysis of variance--to the causes of psychological development is bound to result in a…
Application of Transformations in Parametric Inference
ERIC Educational Resources Information Center
Brownstein, Naomi; Pensky, Marianna
2008-01-01
The objective of the present paper is to provide a simple approach to statistical inference using the method of transformations of variables. We demonstrate performance of this powerful tool on examples of constructions of various estimation procedures, hypothesis testing, Bayes analysis and statistical inference for the stress-strength systems.…
Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements
NASA Technical Reports Server (NTRS)
Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.
1988-01-01
The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show the PFEM is a very powerful tool in determining the second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties and crack length, orientation, and location.
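The second-moment idea can be illustrated with a mean-value first-order approximation of a fracture limit state, as sketched below. The limit-state form, the input means and standard deviations, and the finite-difference gradients are all assumptions for illustration; the paper's PFEM formulation is considerably more general.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical fracture limit state: g = K_IC - Y * S * sqrt(pi * a)
# (toughness minus applied stress-intensity factor); inputs treated as independent.
mu = {"K": 60.0, "S": 150.0, "a": 0.004}          # MPa*sqrt(m), MPa, m (assumed values)
sd = {"K": 6.0,  "S": 20.0,  "a": 0.0008}
Y = 1.12                                          # geometry factor, taken as deterministic

def g(K, S, a):
    return K - Y * S * np.sqrt(np.pi * a)

# First-order second-moment approximation at the mean point.
g0 = g(mu["K"], mu["S"], mu["a"])
eps = 1e-6
grads = {
    "K": (g(mu["K"] + eps, mu["S"], mu["a"]) - g0) / eps,
    "S": (g(mu["K"], mu["S"] + eps, mu["a"]) - g0) / eps,
    "a": (g(mu["K"], mu["S"], mu["a"] + eps) - g0) / eps,
}
var_g = sum((grads[v] * sd[v]) ** 2 for v in mu)
beta = g0 / np.sqrt(var_g)                        # second-moment reliability index
print("beta =", round(beta, 2), " P_f ~", norm.cdf(-beta))
```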
NASA Technical Reports Server (NTRS)
Colvin, E. L.; Emptage, M. R.
1992-01-01
The breaking load test provides quantitative stress corrosion cracking data by determining the residual strength of tension specimens that have been exposed to corrosive environments. Eight laboratories have participated in a cooperative test program under the auspices of ASTM Committee G-1 to evaluate the new test method. All eight laboratories were able to distinguish between three tempers of aluminum alloy 7075. The statistical analysis procedures that were used in the test program do not work well in all situations. An alternative procedure using Box-Cox transformations shows a great deal of promise. An ASTM standard method has been drafted which incorporates the Box-Cox procedure.
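A minimal sketch of the Box-Cox step on hypothetical residual-strength data; the drafted ASTM procedure wraps this transformation in additional analysis not shown here.

```python
import numpy as np
from scipy import stats

# Hypothetical residual-strength measurements (ksi) for one exposure condition.
rng = np.random.default_rng(6)
strength = rng.lognormal(mean=4.0, sigma=0.15, size=30)

# Box-Cox transformation: lambda chosen by maximum likelihood, after which standard
# normal-theory comparisons (t-tests, ANOVA) can be applied to the transformed values.
transformed, lam = stats.boxcox(strength)
print("estimated lambda:", round(lam, 2))
print("Shapiro-Wilk p-value after transform:", round(stats.shapiro(transformed).pvalue, 3))
```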
Harris, Alex; Reeder, Rachelle; Hyun, Jenny
2011-01-01
The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.
NASA Technical Reports Server (NTRS)
Holms, A. G.
1977-01-01
A statistical decision procedure called chain pooling had been developed for model selection in fitting the results of a two-level fixed-effects full or fractional factorial experiment not having replication. The basic strategy included the use of one nominal level of significance for a preliminary test and a second nominal level of significance for the final test. The subject has been reexamined from the point of view of using as many as three successive statistical model deletion procedures in fitting the results of a single experiment. The investigation consisted of random number studies intended to simulate the results of a proposed aircraft turbine-engine rotor-burst-protection experiment. As a conservative approach, population model coefficients were chosen to represent a saturated 2⁴ experiment with a distribution of parameter values unfavorable to the decision procedures. Three model selection strategies were developed.
Handhayanti, Ludwy; Rustina, Yeni; Budiati, Tri
Premature infants tend to lose heat quickly. This loss can be aggravated when they undergo an invasive procedure involving a venous puncture. This research used a crossover design, conducting 2 intervention tests to compare 2 different treatments on the same sample. The research involved 2 groups with 18 premature infants in each, and data were analyzed with an independent t test. Interventions conducted in an open incubator showed a p value of .001, statistically associated with heat loss in premature infants. In contrast, for the radiant warmer a p value of .001 indicated a different range of heat gain before and after the venous puncture. The radiant warmer saved the premature infant from hypothermia during the invasive procedure. However, it is inadvisable for routine care of newborn infants since it can increase insensible water loss.
Evaluation on the use of cerium in the NBL Titrimetric Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zebrowski, J.P.; Orlowicz, G.J.; Johnson, K.D.
An alternative to potassium dichromate as titrant in the New Brunswick Laboratory Titrimetric Method for uranium analysis was sought since chromium in the waste makes disposal difficult. Substitution of a ceric-based titrant was statistically evaluated. Analysis of the data indicated statistically equivalent precisions for the two methods, but a significant overall bias of +0.035% for the ceric titrant procedure. The cause of the bias was investigated, alterations to the procedure were made, and a second statistical study was performed. This second study revealed no statistically significant bias, nor any analyst-to-analyst variation in the ceric titration procedure. A statistically significant day-to-day variation was detected, but this was physically small (0.015%) and was only detected because of the within-day precision of the method. The added mean and standard deviation of the %RD for a single measurement was found to be 0.031%. A comparison with quality control blind dichromate titration data again indicated similar overall precision. The effects of ten elements (Co, Ti, Cu, Ni, Na, Mg, Gd, Zn, Cd, and Cr) on the ceric titration's performance were determined; in previous work at NBL, these impurities did not interfere with the potassium dichromate titrant. This study indicated similar results for the ceric titrant, with the exception of Ti. All the elements (excluding Ti and Cr) caused no statistically significant bias in uranium measurements at levels of 10 mg impurity per 20-40 mg uranium. The presence of Ti was found to cause a bias of -0.05%; this is attributed to the presence of sulfate ions, resulting in precipitation of titanium sulfate and occlusion of uranium. A negative bias of 0.012% was also statistically observed in the samples containing chromium impurities.
Colegrave, Nick
2017-01-01
A common approach to the analysis of experimental data across much of the biological sciences is test-qualified pooling. Here non-significant terms are dropped from a statistical model, effectively pooling the variation associated with each removed term with the error term used to test hypotheses (or estimate effect sizes). This pooling is only carried out if statistical testing on the basis of applying that data to a previous more complicated model provides motivation for this model simplification; hence the pooling is test-qualified. In pooling, the researcher increases the degrees of freedom of the error term with the aim of increasing statistical power to test their hypotheses of interest. Despite this approach being widely adopted and explicitly recommended by some of the most widely cited statistical textbooks aimed at biologists, here we argue that (except in highly specialized circumstances that we can identify) the hoped-for improvement in statistical power will be small or non-existent, and there is likely to be much reduced reliability of the statistical procedures through deviation of type I error rates from nominal levels. We thus call for greatly reduced use of test-qualified pooling across experimental biology, more careful justification of any use that continues, and a different philosophy for initial selection of statistical models in the light of this change in procedure. PMID:28330912
Statistical analysis of global horizontal solar irradiation GHI in Fez city, Morocco
NASA Astrophysics Data System (ADS)
Bounoua, Z.; Mechaqrane, A.
2018-05-01
Accurate knowledge of the solar energy reaching the ground is necessary for sizing and optimizing the performance of solar installations. This paper describes a statistical analysis of the global horizontal solar irradiation (GHI) at Fez city, Morocco. For better reliability, we first applied a set of quality-check procedures to the hourly GHI measurements and then eliminated erroneous values, which are generally due to measurement errors or the cosine effect. The statistical analysis shows that the annual mean daily GHI is approximately 5 kWh/m²/day. Monthly mean daily values and other parameters are also calculated.
ERIC Educational Resources Information Center
Ling, Guangming; Rijmen, Frank
2011-01-01
The factorial structure of the Time Management (TM) scale of the Student 360: Insight Program (S360) was evaluated based on a national sample. A general procedure with a variety of methods was introduced and implemented, including the computation of descriptive statistics, exploratory factor analysis (EFA), and confirmatory factor analysis (CFA).…
ERIC Educational Resources Information Center
Zwick, Rebecca
2012-01-01
Differential item functioning (DIF) analysis is a key component in the evaluation of the fairness and validity of educational tests. The goal of this project was to review the status of ETS DIF analysis procedures, focusing on three aspects: (a) the nature and stringency of the statistical rules used to flag items, (b) the minimum sample size…
ERIC Educational Resources Information Center
O'Connell, Ann Aileen
The relationships among types of errors observed during probability problem solving were studied. Subjects were 50 graduate students in an introductory probability and statistics course. Errors were classified as text comprehension, conceptual, procedural, and arithmetic. Canonical correlation analysis was conducted on the frequencies of specific…
Theory and analysis of statistical discriminant techniques as applied to remote sensing data
NASA Technical Reports Server (NTRS)
Odell, P. L.
1973-01-01
Classification of remote earth resources sensing data according to normed exponential density statistics is reported. The use of density models appropriate for several physical situations provides an exact solution for the probabilities of classifications associated with the Bayes discriminant procedure even when the covariance matrices are unequal.
Advanced Categorical Statistics: Issues and Applications in Communication Research.
ERIC Educational Resources Information Center
Denham, Bryan E.
2002-01-01
Discusses not only the procedures, assumptions, and applications of advanced categorical statistics, but also covers some common misapplications, from which a great deal can be learned. Addresses the use and limitations of cross-tabulation and chi-square analysis, as well as issues such as observation independence and artificial inflation of a…
ERIC Educational Resources Information Center
Briere, John; Elliott, Diana M.
1993-01-01
Responds to article in which Nash et al. reported on effects of controlling for family environment when studying sexual abuse sequelae. Considers findings in terms of theoretical and statistical constraints placed on analysis of covariance and other partializing procedures. Questions use of covariate techniques to test hypotheses about causal role…
ERIC Educational Resources Information Center
Lewis, Virginia Vimpeny
2011-01-01
Number Concepts; Measurement; Geometry; Probability; Statistics; and Patterns, Functions and Algebra. Procedural Errors were further categorized into the following content categories: Computation; Measurement; Statistics; and Patterns, Functions, and Algebra. The results of the analysis showed the main sources of error for 6th, 7th, and 8th…
Human Deception Detection from Whole Body Motion Analysis
2015-12-01
9.3.2. Prediction Probability The output reports from SPSS detail the stepwise procedures for each series of analyses using Wald statistic values for... statistical significance in determining replication, but instead used a combination of significance and direction of means to determine partial or...and the independents need not be unbound. All data were analyzed utilizing the Statistical Package for Social Sciences ( SPSS , v.19.0, Chicago, IL
1982-06-01
usefulness to the United States Antarctic mission as managed by the National Science Foundation. Various statistical measures were applied to the reported... statistical procedures that would evolve a general meteorological picture of each of these remote sites. Primary texts used as a basis for...processed by station for monthly, seasonal and annual statistics, as appropriate. The following outlines the evaluations completed for both
New robust statistical procedures for the polytomous logistic regression models.
Castilla, Elena; Ghosh, Abhik; Martin, Nirian; Pardo, Leandro
2018-05-17
This article derives a new family of estimators, namely the minimum density power divergence estimators, as a robust generalization of the maximum likelihood estimator for the polytomous logistic regression model. Based on these estimators, a family of Wald-type test statistics for linear hypotheses is introduced. Robustness properties of both the proposed estimators and the test statistics are theoretically studied through the classical influence function analysis. Appropriate real life examples are presented to justify the requirement of suitable robust statistical procedures in place of the likelihood-based inference for the polytomous logistic regression model. The validity of the theoretical results established in the article is further confirmed empirically through suitable simulation studies. Finally, an approach for the data-driven selection of the robustness tuning parameter is proposed with empirical justifications. © 2018, The International Biometric Society.
Gibson, W; Wagg, A
2016-07-01
To examine the trends in surgical treatment of stress urinary incontinence (SUI) in older women since the introduction of the mid-urethral sling. Analysis of data from Hospital Episode Statistics (HES) between 2000 and 2012. All surgical procedures for SUI in the National Health Service (NHS) in England. Retrospective cohort analysis of Hospital Episode Statistics for England from 2000 to 2012. Number of invasive, less invasive, and urethral bulking procedures performed in women in three age groups. There was a 90% fall in the number of invasive surgical treatments for SUI and a four-fold increase in the number of mid-urethral slings over this time. The total number of surgical procedures for SUI increased from 8458 to 13 219. However, the rise in the number of procedures in women aged over 75 was more modest (a three-fold increase from a low start of 187), and these women now make up a smaller proportion of all women receiving a mid-urethral sling (MUS). Despite the development and wide availability of a less invasive, safe and effective operation for stress urinary incontinence in older women, they do not appear to have benefitted. The reasons for this require prospective investigation. © 2015 Royal College of Obstetricians and Gynaecologists.
Hypothesis testing for band size detection of high-dimensional banded precision matrices.
An, Baiguo; Guo, Jianhua; Liu, Yufeng
2014-06-01
Many statistical analysis procedures require a good estimator for a high-dimensional covariance matrix or its inverse, the precision matrix. When the precision matrix is banded, the Cholesky-based method often yields a good estimator of the precision matrix. One important aspect of this method is determination of the band size of the precision matrix. In practice, crossvalidation is commonly used; however, we show that crossvalidation not only is computationally intensive but can be very unstable. In this paper, we propose a new hypothesis testing procedure to determine the band size in high dimensions. Our proposed test statistic is shown to be asymptotically normal under the null hypothesis, and its theoretical power is studied. Numerical examples demonstrate the effectiveness of our testing procedure.
Cai, Li
2006-02-01
A permutation test typically requires fewer assumptions than does a comparable parametric counterpart. The multi-response permutation procedure (MRPP) is a class of multivariate permutation tests of group difference useful for the analysis of experimental data. However, psychologists seldom make use of the MRPP in data analysis, in part because the MRPP is not implemented in popular statistical packages that psychologists use. A set of SPSS macros implementing the MRPP test is provided in this article. The use of the macros is illustrated by analyzing example data sets.
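A compact version of the MRPP logic is easy to express outside SPSS. The sketch below uses the common n_g/N group weighting and Euclidean distances; Mielke's formulation admits other weightings and distance exponents, and the published macros may differ in detail.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def mrpp(X, labels, n_perm=2000, seed=0):
    """Multi-response permutation procedure with group weights n_g / N.

    Returns the observed weighted within-group mean distance (delta) and a
    permutation p-value (proportion of permuted deltas <= the observed one).
    """
    rng = np.random.default_rng(seed)
    D = squareform(pdist(X))
    labels = np.asarray(labels)
    N = len(labels)

    def delta(lab):
        d = 0.0
        for g in np.unique(lab):
            idx = np.where(lab == g)[0]
            sub = D[np.ix_(idx, idx)]
            within = sub[np.triu_indices(len(idx), k=1)].mean()
            d += (len(idx) / N) * within
        return d

    observed = delta(labels)
    perm = np.array([delta(rng.permutation(labels)) for _ in range(n_perm)])
    return observed, float((perm <= observed).mean())

# Two bivariate groups with slightly different centres.
rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0.0, 1.0, (20, 2)), rng.normal(0.8, 1.0, (20, 2))])
labels = np.repeat([0, 1], 20)
print(mrpp(X, labels))
```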
Effect of non-normality on test statistics for one-way independent groups designs.
Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R
2012-02-01
The data obtained from one-way independent groups designs is typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
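The flavour of the parametric bootstrap approach can be conveyed with the generic sketch below, which resamples each group from a normal distribution with mean zero and the observed group variance. The variance-weighted statistic and the toy data are assumptions for illustration; this is not the Krishnamoorthy, Lu, and Mathew test verbatim, nor the trimmed-mean modification studied in the paper.

```python
import numpy as np

def parametric_bootstrap_anova(groups, n_boot=5000, seed=0):
    """Heteroscedasticity-tolerant test of equal means via a parametric bootstrap.

    The statistic is the variance-weighted between-group sum of squares; its null
    distribution is approximated by resampling each group from a normal distribution
    with mean zero and that group's sample variance.
    """
    rng = np.random.default_rng(seed)
    n = np.array([len(g) for g in groups])
    s2 = np.array([np.var(g, ddof=1) for g in groups])

    def stat(means, variances):
        w = n / variances
        grand = np.sum(w * means) / np.sum(w)
        return np.sum(w * (means - grand) ** 2)

    observed = stat(np.array([np.mean(g) for g in groups]), s2)
    boot = np.empty(n_boot)
    for b in range(n_boot):
        sims = [rng.normal(0.0, np.sqrt(v), size=m) for v, m in zip(s2, n)]
        boot[b] = stat(np.array([np.mean(g) for g in sims]),
                       np.array([np.var(g, ddof=1) for g in sims]))
    return observed, float((boot >= observed).mean())

rng = np.random.default_rng(8)
groups = [rng.normal(0, 1, 15), rng.normal(0.8, 3, 12), rng.normal(0, 5, 20)]
print(parametric_bootstrap_anova(groups))
```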
Carey, Mark S; Victory, Rahi; Stitt, Larry; Tsang, Nicole
2006-02-01
To compare the association between the Case Mix Group (CMG) code and length of stay (LOS) with the association between the type of procedure and LOS in patients admitted for gynaecology surgery. We examined the records of women admitted for surgery in CMG 579 (major uterine/adnexal procedure, no malignancy) or 577 (major surgery ovary/adnexa with malignancy) between April 1997 and March 1999. Factors thought to influence LOS included age, weight, American Society of Anesthesiologists (ASA) score, physician, day of the week on which surgery was performed, and procedure type. Procedures were divided into six categories, four for CMG 579 and two for CMG 577. Data were abstracted from the hospital information costing system (T2 system) and by retrospective chart review. Multivariable analysis was performed using linear regression with backwards elimination. There were 606 patients in CMG 579 and 101 patients in CMG 577, and the corresponding median LOS was four days (range 1-19) for CMG 579 and nine days (range 3-30) for CMG 577. Combined analysis of both CMGs 577 and 579 revealed the following factors as highly significant determinants of LOS: procedure, age, physician, and ASA score. Although confounded by procedure type, the CMG did not significantly account for differences in LOS in the model if procedure was considered. Pairwise comparisons of procedure categories were all found to be statistically significant, even when controlled for other important variables. The type of procedure better accounts for differences in LOS by describing six statistically distinct procedure groups rather than the traditional two CMGs. It is reasonable therefore to consider changing the current CMG codes for gynaecology to a classification based on the type of procedure.
NASA Technical Reports Server (NTRS)
Calkins, D. S.
1998-01-01
When the dependent (or response) variable in an experiment has direction and magnitude, one approach that has been used for statistical analysis involves splitting magnitude and direction and applying univariate statistical techniques to the components. However, such treatment of quantities with direction and magnitude is not justifiable mathematically and can lead to incorrect conclusions about relationships among variables and, as a result, to flawed interpretations. This note discusses a problem with that practice and recommends mathematically correct procedures to be used with dependent variables that have direction and magnitude for 1) computation of mean values, 2) statistical contrasts of and confidence intervals for means, and 3) correlation methods.
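The recommended treatment of means can be illustrated directly: average the vector components rather than the direction and magnitude separately. The angles and magnitudes below are hypothetical.

```python
import numpy as np

# Hypothetical responses: each trial has a direction (degrees) and a magnitude.
angles_deg = np.array([350.0, 10.0, 5.0, 355.0])
magnitudes = np.array([2.0, 2.2, 1.8, 2.1])

# Incorrect: averaging direction and magnitude separately.
naive_direction = angles_deg.mean()                 # 180 degrees, clearly wrong

# Correct: average the vectors componentwise, then convert back to polar form.
theta = np.deg2rad(angles_deg)
x, y = (magnitudes * np.cos(theta)).mean(), (magnitudes * np.sin(theta)).mean()
mean_magnitude = np.hypot(x, y)
mean_direction = np.rad2deg(np.arctan2(y, x)) % 360
print(naive_direction, round(mean_direction, 1), round(mean_magnitude, 2))
```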
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carrafiello, Gianpaolo, E-mail: gcarraf@tin.it; Lagana, Domenico; Lumia, Domenico
2007-09-15
The objective of this study was to analyze three ureteral stenting techniques in patients with malignant ureteral obstructions, considering the indications, techniques, procedural costs, and complications. In the period between June 2003 and June 2006, 45 patients with bilateral malignant ureteral obstructions were evaluated (24 males, 21 females; average age, 68.3; range, 42-87). All of the patients were treated with ureteral stenting: 30 (mild strictures) with direct stenting (insertion of the stent without predilation), 30 (moderate/severe strictures) with primary stenting (insertion of the stent after predilation in a one-stage procedure), and 30 (mild/moderate/severe strictures with infection) with secondary stenting (insertion of the stent after predilation and 2-3 days after nephrostomy). The incidence of complications and procedural costs were compared by a statistical analysis. The primary technical success rate was 98.89%. We did not observe any major complications. The minor complication rate was 11.1%. The incidence of complications did not differ significantly among the various techniques. The statistical analysis of costs demonstrated that the average cost of secondary stenting (Euro 637; SD, Euro 115) was significantly higher than that of procedures which involved direct or primary stenting (Euro 560; SD, Euro 108). We conclude that one-step stenting (direct or primary) is a valid alternative to secondary stenting in correctly selected patients, owing to the fact that when the procedure is performed by expert interventional radiologists there are high technical success rates, low complication rates, and a reduction in costs.
ERIC Educational Resources Information Center
Vivo, Juana-Maria; Franco, Manuel
2008-01-01
This article attempts to present a novel application of a method of measuring accuracy for academic success predictors that could be used as a standard. This procedure is known as the receiver operating characteristic (ROC) curve, which comes from statistical decision techniques. The statistical prediction techniques provide predictor models and…
Multiple linear regression analysis
NASA Technical Reports Server (NTRS)
Edwards, T. R.
1980-01-01
Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.
Risk Factors Analysis for Occurrence of Asymptomatic Bacteriuria After Endourological Procedures
Junuzovic, Dzelaludin; Hasanbegovic, Munira
2014-01-01
Introduction: Endourological procedures are performed according to the principles of aseptic technique, yet urinary tract infections may still occur in a certain number of patients. Considering the risk of urinary tract infection, there is no consensus about the prophylactic use of antibiotics in endourological procedures. Goal: The objective of this study was to determine the connection between endourological procedures and the occurrence of urinary infections, and to analyze the risk factors for urinary infection in patients hospitalized at the Urology Clinic of the Clinical Center University of Sarajevo (CCUS). Materials and Methods: The research was conducted as a prospective study on a sample of 208 patients of both genders who were hospitalized at the Urology Clinic of the CCUS and for whom an endourological procedure was indicated for diagnostic or therapeutic purposes. We analyzed data from patients' histories of illness, laboratory tests taken at admission and after the endourological procedures, and the surgical programs for endoscopic procedures. All patients were clinically examined prior to the endoscopic procedures, and after treatment attention was focused on symptoms of urinary tract infection. Results: Statistical analysis indicates no significant difference in the presence of postoperative compared with preoperative bacteriuria, which implies that endourological procedures are safe in terms of urinary tract infections. Preoperatively, the most commonly isolated bacterium was Escherichia coli (30.9%); postoperatively, it was Enterococcus faecalis (25%). Preoperative bacteriuria, the duration of postoperative catheterization, and the duration of hospitalization had a statistically significant effect on the occurrence of postoperative bacteriuria. Conclusion: In everyday urological practice, it is very important to identify and control risk factors for the development of urinary infection after endourological procedures, with the main objective of minimizing infectious complications. PMID:25568546
Round-off errors in cutting plane algorithms based on the revised simplex procedure
NASA Technical Reports Server (NTRS)
Moore, J. E.
1973-01-01
This report statistically analyzes computational round-off errors associated with the cutting plane approach to solving linear integer programming problems. Cutting plane methods require that the inverse of a sequence of matrices be computed. The problem basically reduces to one of minimizing round-off errors in the sequence of inverses. Two procedures for minimizing this problem are presented, and their influence on error accumulation is statistically analyzed. One procedure employs a very small tolerance factor to round computed values to zero. The other procedure is a numerical analysis technique for reinverting or improving the approximate inverse of a matrix. The results indicated that round-off accumulation can be effectively minimized by employing a tolerance factor which reflects the number of significant digits carried for each calculation and by applying the reinversion procedure once to each computed inverse. If 18 significant digits plus an exponent are carried for each variable during computations, then a tolerance value of 0.1 × 10⁻¹² is reasonable.
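The two ideas in this abstract (a tolerance factor that rounds tiny values to zero, and a single reinversion pass that improves an approximate inverse) can be sketched in a few lines of NumPy; the Newton-Schulz refinement step used here is a standard reinversion technique and is only assumed to be comparable to the one in the report:

    import numpy as np

    TOL = 1e-13  # tolerance factor reflecting the digits carried per calculation

    def chop(M, tol=TOL):
        """Round entries whose magnitude is below the tolerance to exactly zero."""
        M = M.copy()
        M[np.abs(M) < tol] = 0.0
        return M

    def reinvert(A, X):
        """One Newton-Schulz step: improve an approximate inverse X of A."""
        return X @ (2.0 * np.eye(A.shape[0]) - A @ X)

    rng = np.random.default_rng(1)
    A = rng.normal(size=(6, 6))
    X = np.linalg.inv(A) + 1e-6 * rng.normal(size=(6, 6))   # slightly corrupted inverse
    X_better = chop(reinvert(A, X))
    print(np.linalg.norm(A @ X - np.eye(6)), np.linalg.norm(A @ X_better - np.eye(6)))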
Davis, J.C.
2000-01-01
Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.
Validating Coherence Measurements Using Aligned and Unaligned Coherence Functions
NASA Technical Reports Server (NTRS)
Miles, Jeffrey Hilton
2006-01-01
This paper describes a novel approach, based on coherence functions and statistical theory, to sensor validation in a harsh environment. Using aligned and unaligned coherence functions together with statistical theory, one can test for sensor degradation, total sensor failure, or changes in the signal. This advanced diagnostic approach and the novel data processing methodology discussed provide a single number that conveys this information. This number, as calculated with standard statistical procedures for comparing the means of two distributions, is compared with results obtained using Yuen's robust statistical method to create confidence intervals. Examination of experimental data from Kulite pressure transducers mounted in a Pratt & Whitney PW4098 combustor, using spectrum analysis methods on aligned and unaligned time histories, has verified the effectiveness of the proposed method. All of the procedures produce good results, demonstrating the robustness of the technique.
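A rough sketch of the aligned/unaligned idea (illustrative signal model and shift, not the Kulite processing chain itself) using scipy.signal.coherence:

    import numpy as np
    from scipy.signal import coherence

    fs, nseg = 2048, 1024
    rng = np.random.default_rng(2)
    n = 8 * fs
    s = rng.normal(size=n)                      # broadband signal seen by both sensors
    x = s + 0.5 * rng.normal(size=n)            # sensor 1
    y = s + 0.5 * rng.normal(size=n)            # sensor 2, same event

    # Aligned coherence: both records start at the same instant.
    f, c_aligned = coherence(x, y, fs=fs, nperseg=nseg)

    # Unaligned coherence: one record shifted so only chance correlation remains.
    f, c_unaligned = coherence(x, np.roll(y, 3 * nseg), fs=fs, nperseg=nseg)

    # A healthy sensor pair shows aligned coherence well above the unaligned baseline.
    print(c_aligned.mean(), c_unaligned.mean())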
Zhang, Han; Wheeler, William; Hyland, Paula L; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai
2016-06-01
Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways, from which genes in the neighborhoods of 170 GWAS-established T2D loci were excluded, we detected 43 globally significant T2D pathways (with Bonferroni-corrected p-values < 0.05), which included the insulin signaling pathway and the T2D pathway defined by KEGG, as well as pathways defined according to specific gene expression patterns in pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed that 7 of the 43 pathways identified in European populations remained significant in eastern Asians at a false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs.
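For readers unfamiliar with the underlying combination rule, the adaptive rank truncated product idea can be sketched as follows; this is a generic Monte Carlo version that treats the SNP-level p-values as independent under the null (the actual sARTP package accounts for genotype correlation, which this sketch does not), and the truncation points, p-values, and replicate count are illustrative:

    import numpy as np

    def artp_pvalue(p_obs, ks=(1, 2, 5, 10), B=2000, seed=0):
        """Adaptive rank truncated product: combine the k smallest p-values,
        adapt over the truncation points in `ks`, and calibrate the minimum
        by Monte Carlo under the global null."""
        rng = np.random.default_rng(seed)
        p_obs = np.asarray(p_obs, float)
        m = p_obs.size
        ks = [k for k in ks if k <= m]

        def stats(p):                           # -sum(log p_(i)), i = 1..k, for each k
            q = np.sort(p)
            return np.array([-np.log(q[:k]).sum() for k in ks])

        obs = stats(p_obs)
        null = np.array([stats(rng.uniform(size=m)) for _ in range(B)])   # B x len(ks)

        p_k_obs = (null >= obs).mean(axis=0)                    # per-truncation p-values
        p_k_null = (null[:, None, :] >= null[None, :, :]).mean(axis=0)
        return (p_k_null.min(axis=1) <= p_k_obs.min()).mean()   # p-value of the minimum

    print(artp_pvalue([0.001, 0.04, 0.2, 0.5, 0.8, 0.9, 0.02, 0.6]))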
Sandhu, Gurkirat; Khinda, Paramjit Kaur; Gill, Amarjit Singh; Singh Khinda, Vineet Inder; Baghi, Kamal; Chahal, Gurparkash Singh
2017-01-01
Context: Periodontal surgical procedures produce varying degree of stress in all patients. Nitrous oxide-oxygen inhalation sedation is very effective for adult patients with mild-to-moderate anxiety due to dental procedures and needle phobia. Aim: The present study was designed to perform periodontal surgical procedures under nitrous oxide-oxygen inhalation sedation and assess whether this technique actually reduces stress physiologically, in comparison to local anesthesia alone (LA) during lengthy periodontal surgical procedures. Settings and Design: This was a randomized, split-mouth, cross-over study. Materials and Methods: A total of 16 patients were selected for this randomized, split-mouth, cross-over study. One surgical session (SS) was performed under local anesthesia aided by nitrous oxide-oxygen inhalation sedation, and the other SS was performed on the contralateral quadrant under LA. For each session, blood samples to measure and evaluate serum cortisol levels were obtained, and vital parameters including blood pressure, heart rate, respiratory rate, and arterial blood oxygen saturation were monitored before, during, and after periodontal surgical procedures. Statistical Analysis Used: Paired t-test and repeated measure ANOVA. Results: The findings of the present study revealed a statistically significant decrease in serum cortisol levels, blood pressure and pulse rate and a statistically significant increase in respiratory rate and arterial blood oxygen saturation during periodontal surgical procedures under nitrous oxide inhalation sedation. Conclusion: Nitrous oxide-oxygen inhalation sedation for periodontal surgical procedures is capable of reducing stress physiologically, in comparison to LA during lengthy periodontal surgical procedures. PMID:29386796
Pounds, Stan; Cao, Xueyuan; Cheng, Cheng; Yang, Jun; Campana, Dario; Evans, William E.; Pui, Ching-Hon; Relling, Mary V.
2010-01-01
Powerful methods for integrated analysis of multiple biological data sets are needed to maximize interpretation capacity and acquire meaningful knowledge. We recently developed Projection Onto the Most Interesting Statistical Evidence (PROMISE). PROMISE is a statistical procedure that incorporates prior knowledge about the biological relationships among endpoint variables into an integrated analysis of microarray gene expression data with multiple biological and clinical endpoints. Here, PROMISE is adapted to the integrated analysis of pharmacologic, clinical, and genome-wide genotype data, incorporating knowledge about the biological relationships among pharmacologic and clinical response data. An efficient permutation-testing algorithm is introduced so that statistical calculations are computationally feasible in this higher-dimension setting. The new method is applied to a pediatric leukemia data set. The results clearly indicate that PROMISE is a powerful statistical tool for identifying genomic features that exhibit a biologically meaningful pattern of association with multiple endpoint variables. PMID:21516175
Pounds, Stan; Cheng, Cheng; Cao, Xueyuan; Crews, Kristine R; Plunkett, William; Gandhi, Varsha; Rubnitz, Jeffrey; Ribeiro, Raul C; Downing, James R; Lamba, Jatinder
2009-08-15
In some applications, prior biological knowledge can be used to define a specific pattern of association of multiple endpoint variables with a genomic variable that is biologically most interesting. However, to our knowledge, there is no statistical procedure designed to detect specific patterns of association with multiple endpoint variables. Projection onto the most interesting statistical evidence (PROMISE) is proposed as a general procedure to identify genomic variables that exhibit a specific biologically interesting pattern of association with multiple endpoint variables. Biological knowledge of the endpoint variables is used to define a vector that represents the biologically most interesting values for statistics that characterize the associations of the endpoint variables with a genomic variable. A test statistic is defined as the dot-product of the vector of the observed association statistics and the vector of the most interesting values of the association statistics. By definition, this test statistic is proportional to the length of the projection of the observed vector of correlations onto the vector of most interesting associations. Statistical significance is determined via permutation. In simulation studies and an example application, PROMISE shows greater statistical power to identify genes with the interesting pattern of associations than classical multivariate procedures, individual endpoint analyses or listing genes that have the pattern of interest and are significant in more than one individual endpoint analysis. Documented R routines are freely available from www.stjuderesearch.org/depts/biostats and will soon be available as a Bioconductor package from www.bioconductor.org.
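The core of the PROMISE statistic is easy to state in code. The following sketch (simulated data; the pattern vector, correlation measure, and permutation scheme are simplifications of the published procedure) computes the dot product of observed association statistics with the "most interesting" pattern and calibrates it by permutation:

    import numpy as np

    def promise_like(x, endpoints, pattern, B=5000, seed=0):
        """x: genomic variable over n samples; endpoints: n x m endpoint matrix;
        pattern: length-m vector of biologically most interesting associations.
        Significance is assessed by permuting x across samples."""
        rng = np.random.default_rng(seed)
        x, E = np.asarray(x, float), np.asarray(endpoints, float)

        def stat(xv):
            r = np.array([np.corrcoef(xv, E[:, j])[0, 1] for j in range(E.shape[1])])
            return float(r @ np.asarray(pattern, float))

        t_obs = stat(x)
        t_null = np.array([stat(rng.permutation(x)) for _ in range(B)])
        return t_obs, float((np.abs(t_null) >= abs(t_obs)).mean())

    rng = np.random.default_rng(3)
    n = 60
    expr = rng.normal(size=n)                         # expression of one gene
    drug_resp = -0.5 * expr + rng.normal(size=n)      # lower value = better response
    relapse = 0.4 * expr + rng.normal(size=n)         # higher value = worse outcome
    print(promise_like(expr, np.column_stack([drug_resp, relapse]), pattern=[-1.0, 1.0]))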
Trillsch, F; Mahner, S; Vettorazzi, E; Woelber, L; Reuss, A; Baumann, K; Keyver-Paik, M-D; Canzler, U; Wollschlaeger, K; Forner, D; Pfisterer, J; Schroeder, W; Muenstedt, K; Richter, B; Fotopoulou, C; Schmalfeldt, B; Burges, A; Ewald-Riegler, N; de Gregorio, N; Hilpert, F; Fehm, T; Meier, W; Hillemanns, P; Hanker, L; Hasenburg, A; Strauss, H-G; Hellriegel, M; Wimberger, P; Kommoss, S; Kommoss, F; Hauptmann, S; du Bois, A
2015-01-01
Background: Incomplete surgical staging is a negative prognostic factor for patients with borderline ovarian tumours (BOT). However, little is known about the prognostic impact of each individual staging procedure. Methods: Clinical parameters of 950 patients with BOT (confirmed by central reference pathology) treated between 1998 and 2008 at 24 German AGO centres were analysed. In 559 patients with serous BOT and adequate ovarian surgery, further recommended staging procedures (omentectomy, peritoneal biopsies, cytology) were evaluated applying Cox regression models with respect to progression-free survival (PFS). Results: For patients with one missing staging procedure, the hazard ratio (HR) for recurrence was 1.25 (95%-CI 0.66–2.39; P=0.497). This risk increased with each additional procedure skipped, reaching statistical significance in the case of two (HR 1.95; 95%-CI 1.06–3.58; P=0.031) and three missing steps (HR 2.37; 95%-CI 1.22–4.64; P=0.011). The most crucial procedure was omentectomy, which retained a statistically significant impact on PFS in multiple analysis (HR 1.91; 95%-CI 1.15–3.19; P=0.013) adjusting for previously established prognostic factors such as FIGO stage, tumour residuals, and fertility preservation. Conclusion: Individual surgical staging procedures contribute to the prognosis for patients with serous BOT. In this analysis, recurrence risk increased with each skipped surgical step. This should be considered when re-staging procedures following incomplete primary surgery are discussed. PMID:25562434
Crop Identification Technology Assessment for Remote Sensing (CITARS)
NASA Technical Reports Server (NTRS)
Bauer, M. E.; Cary, T. K.; Davis, B. J.; Swain, P. H.
1975-01-01
The results of classifications and experiments performed for the Crop Identification Technology Assessment for Remote Sensing (CITARS) project are summarized. Fifteen data sets were classified using two analysis procedures. One procedure used class weights while the other assumed equal probabilities of occurrence for all classes. In addition, 20 data sets were classified using training statistics from another segment or date. The results of both the local and non-local classifications in terms of classification and proportion estimation are presented. Several additional experiments are described which were performed to provide additional understanding of the CITARS results. These experiments investigated alternative analysis procedures, training set selection and size, effects of multitemporal registration, the spectral discriminability of corn, soybeans, and other, and analysis of aircraft multispectral data.
ERIC Educational Resources Information Center
Bakir, Saad T.
2010-01-01
We propose a nonparametric (or distribution-free) procedure for testing the equality of several population variances (or scale parameters). The proposed test is a modification of Bakir's (1989, Commun. Statist., Simul-Comp., 18, 757-775) analysis of means by ranks (ANOMR) procedure for testing the equality of several population means. A proof is…
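The ANOMR procedure itself is not available in standard Python libraries, but a closely related rank-based test of equal scale (Fligner-Killeen, as implemented in SciPy) gives a feel for what such a nonparametric variance comparison looks like; the groups below are simulated:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    g1 = rng.normal(0, 1.0, size=30)
    g2 = rng.normal(0, 1.0, size=30)
    g3 = rng.normal(0, 2.5, size=30)          # one group with inflated scale

    # Fligner-Killeen: a rank-based test of equal scale, robust to non-normality.
    stat, p = stats.fligner(g1, g2, g3)
    print(round(stat, 2), round(p, 4))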
USDA-ARS's Scientific Manuscript database
Procedures for assessing model performance in agronomy are often arbitrary and not always helpful. An omnibus analysis statistic, concordance correlation, is widely known and used in many other sciences. An illustrative example is presented here. The analysis assumes the exact relationship “observat...
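Lin's concordance correlation coefficient is simple enough to compute directly; a minimal sketch (the observed/simulated values are made up for illustration):

    import numpy as np

    def concordance_correlation(obs, sim):
        """Concordance correlation between observed and simulated values;
        equals 1 only if the points fall exactly on the 1:1 line."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        s_xy = np.cov(obs, sim, bias=True)[0, 1]
        return 2 * s_xy / (obs.var() + sim.var() + (obs.mean() - sim.mean()) ** 2)

    observed = [3.1, 4.0, 5.2, 6.8, 7.1]
    simulated = [2.9, 4.4, 5.0, 6.5, 7.9]
    print(round(concordance_correlation(observed, simulated), 3))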
Metaprop: a Stata command to perform meta-analysis of binomial data.
Nyaga, Victoria N; Arbyn, Marc; Aerts, Marc
2014-01-01
Meta-analyses have become an essential tool in synthesizing evidence on clinical and epidemiological questions derived from a multitude of similar studies assessing the particular issue. Appropriate and accessible statistical software is needed to produce the summary statistic of interest. Metaprop is a statistical program implemented to perform meta-analyses of proportions in Stata. It builds further on the existing Stata procedure metan which is typically used to pool effects (risk ratios, odds ratios, differences of risks or means) but which is also used to pool proportions. Metaprop implements procedures which are specific to binomial data and allows computation of exact binomial and score test-based confidence intervals. It provides appropriate methods for dealing with proportions close to or at the margins where the normal approximation procedures often break down, by use of the binomial distribution to model the within-study variability or by allowing Freeman-Tukey double arcsine transformation to stabilize the variances. Metaprop was applied on two published meta-analyses: 1) prevalence of HPV-infection in women with a Pap smear showing ASC-US; 2) cure rate after treatment for cervical precancer using cold coagulation. The first meta-analysis showed a pooled HPV-prevalence of 43% (95% CI: 38%-48%). In the second meta-analysis, the pooled percentage of cured women was 94% (95% CI: 86%-97%). By using metaprop, no studies with 0% or 100% proportions were excluded from the meta-analysis. Furthermore, study specific and pooled confidence intervals always were within admissible values, contrary to the original publication, where metan was used.
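For readers who do not use Stata, the Freeman-Tukey double-arcsine pooling that metaprop offers can be sketched in Python; this is a simplified random-effects (DerSimonian-Laird) version with an approximate back-transform, not the metaprop implementation, and the example counts are invented:

    import numpy as np

    def pool_proportions_ft(events, totals):
        """Pool proportions on the Freeman-Tukey double-arcsine scale with
        DerSimonian-Laird random effects; approximate sin^2 back-transform."""
        x, n = np.asarray(events, float), np.asarray(totals, float)
        t = np.arcsin(np.sqrt(x / (n + 1))) + np.arcsin(np.sqrt((x + 1) / (n + 1)))
        v = 1.0 / (n + 0.5)                         # within-study variance on the FT scale
        w = 1.0 / v
        t_fix = np.sum(w * t) / np.sum(w)
        q = np.sum(w * (t - t_fix) ** 2)            # Cochran's Q
        tau2 = max(0.0, (q - (len(t) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
        w_re = 1.0 / (v + tau2)
        t_re = np.sum(w_re * t) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        back = lambda z: np.sin(z / 2.0) ** 2       # crude inverse of the FT transform
        return back(t_re), back(t_re - 1.96 * se), back(t_re + 1.96 * se)

    events, totals = [0, 5, 12, 30, 18], [20, 40, 55, 120, 60]
    print([round(v, 3) for v in pool_proportions_ft(events, totals)])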
Scaling up to address data science challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendelberger, Joanne R.
Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
Algorithm for Identifying Erroneous Rain-Gauge Readings
NASA Technical Reports Server (NTRS)
Rickman, Doug
2005-01-01
An algorithm analyzes rain-gauge data to identify statistical outliers that could be deemed to be erroneous readings. Heretofore, analyses of this type have been performed in burdensome manual procedures that have involved subjective judgements. Sometimes, the analyses have included computational assistance for detecting values falling outside of arbitrary limits. The analyses have been performed without statistically valid knowledge of the spatial and temporal variations of precipitation within rain events. In contrast, the present algorithm makes it possible to automate such an analysis, makes the analysis objective, takes account of the spatial distribution of rain gauges in conjunction with the statistical nature of spatial variations in rainfall readings, and minimizes the use of arbitrary criteria. The algorithm implements an iterative process that involves nonparametric statistics.
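A stripped-down version of such an outlier screen, using the nonparametric median/MAD of the other gauges for the same event and iterating until no further readings are flagged (the spatial weighting of neighbouring gauges that the algorithm uses is omitted here; readings and threshold are illustrative):

    import numpy as np

    def flag_gauge_outliers(readings, threshold=3.5, max_iter=10):
        """Iteratively flag readings whose robust z-score, computed from the
        median and MAD of the currently accepted gauges, exceeds the threshold."""
        r = np.asarray(readings, float)
        ok = np.ones(r.size, dtype=bool)
        for _ in range(max_iter):
            med = np.median(r[ok])
            mad = np.median(np.abs(r[ok] - med)) or 1e-9
            z = 0.6745 * np.abs(r - med) / mad      # 0.6745 puts MAD on a sigma scale
            new_ok = z <= threshold
            if np.array_equal(new_ok, ok):
                break
            ok = new_ok
        return ~ok                                  # True marks a suspect reading

    event_mm = [10.2, 11.0, 9.8, 10.5, 48.0, 10.1, 0.0]   # one rain event, mm per gauge
    print(flag_gauge_outliers(event_mm))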
Palazón, L; Navas, A
2017-06-01
Information on sediment contribution and transport dynamics from the contributing catchments is needed to develop management plans to tackle environmental problems related to the effects of fine sediment, such as reservoir siltation. In this respect, the fingerprinting technique is an indirect technique known to be valuable and effective for sediment source identification in river catchments. Large variability in sediment delivery was found in previous studies in the Barasona catchment (1509 km², Central Spanish Pyrenees). Simulation results with SWAT and fingerprinting approaches identified badlands and agricultural uses as the main contributors to sediment supply in the reservoir. In this study the <63 μm fraction of the surface reservoir sediments (2 cm) is investigated following the fingerprinting procedure to assess how the use of different statistical procedures affects the estimated source contributions. Three optimum composite fingerprints were selected to discriminate between source contributions based on land uses/land covers from the same dataset by the application of (1) discriminant function analysis, and its combination (as a second step) with (2) the Kruskal-Wallis H-test and (3) principal components analysis. Source contribution results differed between the assessed options, with the greatest differences observed for option (3), the two-step process of principal components analysis and discriminant function analysis. The characteristics of the solutions obtained by the applied mixing model and the conceptual understanding of the catchment showed that the most reliable solution was achieved using option (2), the two-step process of the Kruskal-Wallis H-test and discriminant function analysis. The assessment demonstrates the importance of the statistical procedure used to define the optimum composite fingerprint in sediment fingerprinting applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
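The two-step selection that worked best here (Kruskal-Wallis screening followed by discriminant function analysis) can be imitated with SciPy and scikit-learn; the tracer data below are simulated, and plain LDA stands in for the stepwise DFA used in fingerprinting studies:

    import numpy as np
    from scipy.stats import kruskal
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(6)
    n_per, n_tracers = 20, 8
    source = np.repeat(["badlands", "agricultural", "forest"], n_per)
    X = rng.normal(size=(3 * n_per, n_tracers))
    X[:n_per, 0] += 2.0            # tracer 0 separates badlands
    X[n_per:2 * n_per, 3] += 1.5   # tracer 3 separates agricultural soils

    # Step 1: Kruskal-Wallis H-test keeps tracers that differ between sources.
    keep = [j for j in range(n_tracers)
            if kruskal(*[X[source == g, j] for g in np.unique(source)]).pvalue < 0.05]

    # Step 2: discriminant function analysis on the retained composite fingerprint.
    lda = LinearDiscriminantAnalysis().fit(X[:, keep], source)
    print(keep, round(lda.score(X[:, keep], source), 2))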
Sordi, Marina de; Mourão, Lucia Figueiredo; Silva, Ariovaldo Armando da; Flosi, Luciana Claudia Leite
2009-01-01
Patients with dysphagia have impairments in many aspects, and an interdisciplinary approach is fundamental to define diagnosis and treatment. A joint approach to the clinical and videoendoscopic evaluation is paramount. The aim was to study the correlation between the clinical assessment (ACD) and the videoendoscopic (VED) assessment of swallowing by classifying the degree of severity and by qualitative/descriptive analyses of the procedures. The study was cross-sectional, descriptive and comparative, and was held from March to December 2006 at the Otolaryngology/Dysphagia ward of a hospital in the countryside of São Paulo. Thirty dysphagic patients with different disorders were assessed by ACD and VED. The data were classified by means of severity scales and qualitative/descriptive analysis. The correlation between the ACD and VED severity scales pointed to a low, though statistically significant, agreement (kappa = 0.4) (p = 0.006). The correlation between the qualitative/descriptive analyses pointed to an excellent and statistically significant agreement (kappa = 0.962) (p < 0.001) for the entire sample. The low agreement between the severity scales points to the need to perform both procedures, reinforcing VED as a feasible procedure. The descriptive qualitative analysis pointed to an excellent agreement, and such data reinforce the need to understand swallowing as a process.
Cost analysis of robotic versus laparoscopic general surgery procedures.
Higgins, Rana M; Frelich, Matthew J; Bosler, Matthew E; Gould, Jon C
2017-01-01
Robotic surgical systems have been used at a rapidly increasing rate in general surgery. Many of these procedures have been performed laparoscopically for years. In a surgical encounter, a significant portion of the total costs is associated with consumable supplies. Our hospital system has invested in a software program that can track the costs of consumable surgical supplies. We sought to determine the differences in cost of consumables for elective laparoscopic and robotic procedures in our health care organization. De-identified procedural cost and equipment utilization data were collected from the Surgical Profitability Compass Procedure Cost Manager System (The Advisory Board Company, Washington, DC) for our health care system for laparoscopic and robotic cholecystectomy, fundoplication, and inguinal hernia repair between the years 2013 and 2015. Outcomes were length of stay, case duration, and supply cost. Statistical analysis was performed using a t-test for continuous variables, and statistical significance was defined as p < 0.05. The total cost of consumable surgical supplies was significantly greater for all robotic procedures. Length of stay did not differ for fundoplication or cholecystectomy. Length of stay was greater for robotic inguinal hernia repair. Case duration was similar for cholecystectomy (84.3 min robotic and 75.5 min laparoscopic, p = 0.08), but significantly longer for robotic fundoplication (197.2 min robotic and 162.1 min laparoscopic, p = 0.01) and inguinal hernia repair (124.0 min robotic and 84.4 min laparoscopic, p < 0.01). We found a significantly increased cost of general surgery procedures for our health care system when cases commonly performed laparoscopically are instead performed robotically. Our analysis is limited by the fact that we only included costs associated with consumable surgical supplies. The initial acquisition cost (over $1 million for a robotic surgical system), depreciation, and service contracts for the robotic and laparoscopic systems were not included in this analysis.
Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun
2015-02-01
Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Patrick, Hannah; Sims, Andrew; Burn, Julie; Bousfield, Derek; Colechin, Elaine; Reay, Christopher; Alderson, Neil; Goode, Stephen; Cunningham, David; Campbell, Bruce
2013-03-01
New devices and procedures are often introduced into health services when the evidence base for their efficacy and safety is limited. The authors sought to assess the availability and accuracy of routinely collected Hospital Episodes Statistics (HES) data in the UK and their potential contribution to the monitoring of new procedures. Four years of HES data (April 2006-March 2010) were analysed to identify episodes of hospital care involving a sample of 12 new interventional procedures. HES data were cross checked against other relevant sources including national or local registers and manufacturers' information. HES records were available for all 12 procedures during the entire study period. Comparative data sources were available from national (5), local (2) and manufacturer (2) registers. Factors found to affect comparisons were miscoding, alternative coding and inconsistent use of subsidiary codes. The analysis of provider coverage showed that HES is sensitive at detecting centres which carry out procedures, but specificity is poor in some cases. Routinely collected HES data have the potential to support quality improvements and evidence-based commissioning of devices and procedures in health services but achievement of this potential depends upon the accurate coding of procedures.
Nonlinear estimation of parameters in biphasic Arrhenius plots.
Puterman, M L; Hrboticky, N; Innis, S M
1988-05-01
This paper presents a formal procedure for the statistical analysis of data on the thermotropic behavior of membrane-bound enzymes generated using the Arrhenius equation and compares the analysis to several alternatives. Data is modeled by a bent hyperbola. Nonlinear regression is used to obtain estimates and standard errors of the intersection of line segments, defined as the transition temperature, and slopes, defined as energies of activation of the enzyme reaction. The methodology allows formal tests of the adequacy of a biphasic model rather than either a single straight line or a curvilinear model. Examples on data concerning the thermotropic behavior of pig brain synaptosomal acetylcholinesterase are given. The data support the biphasic temperature dependence of this enzyme. The methodology represents a formal procedure for statistical validation of any biphasic data and allows for calculation of all line parameters with estimates of precision.
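A biphasic Arrhenius fit of this kind can be reproduced with a smooth "bent hyperbola" (Bacon-Watts type) model and nonlinear least squares; the sketch below uses simulated data on a 1000/T scale and is only meant to show how the transition point and the two slopes, with their standard errors, come out of the fit:

    import numpy as np
    from scipy.optimize import curve_fit

    def bent_hyperbola(x, b0, b1, b2, x0, gamma):
        """Two line segments with slopes b1 - b2 and b1 + b2 meeting near x0;
        gamma controls how sharp the bend is (gamma -> 0 gives a true kink)."""
        return b0 + b1 * (x - x0) + b2 * np.sqrt((x - x0) ** 2 + gamma ** 2)

    rng = np.random.default_rng(7)
    inv_T = np.linspace(3.2, 3.6, 40)                 # 1000/T in 1/K
    ln_rate = bent_hyperbola(inv_T, 2.0, -3.0, -2.0, 3.42, 0.001)
    ln_rate = ln_rate + 0.02 * rng.normal(size=inv_T.size)

    popt, pcov = curve_fit(bent_hyperbola, inv_T, ln_rate,
                           p0=[2.0, -3.0, -1.0, 3.4, 0.05])
    se = np.sqrt(np.diag(pcov))
    print("transition (1000/T):", round(popt[3], 3), "+/-", round(se[3], 3))
    print("segment slopes:", round(popt[1] - popt[2], 2), round(popt[1] + popt[2], 2))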
78 FR 43002 - Proposed Collection; Comment Request for Revenue Procedure 2004-29
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-18
... comments concerning statistical sampling in Sec. 274 Context. DATES: Written comments should be received on... INFORMATION: Title: Statistical Sampling in Sec. 274 Context. OMB Number: 1545-1847. Revenue Procedure Number: Revenue Procedure 2004-29. Abstract: Revenue Procedure 2004-29 prescribes the statistical sampling...
Feder, Paul I; Ma, Zhenxu J; Bull, Richard J; Teuschler, Linda K; Rice, Glenn
2009-01-01
In chemical mixtures risk assessment, the use of dose-response data developed for one mixture to estimate risk posed by a second mixture depends on whether the two mixtures are sufficiently similar. While evaluations of similarity may be made using qualitative judgments, this article uses nonparametric statistical methods based on the "bootstrap" resampling technique to address the question of similarity among mixtures of chemical disinfectant by-products (DBP) in drinking water. The bootstrap resampling technique is a general-purpose, computer-intensive approach to statistical inference that substitutes empirical sampling for theoretically based parametric mathematical modeling. Nonparametric, bootstrap-based inference involves fewer assumptions than parametric normal theory based inference. The bootstrap procedure is appropriate, at least in an asymptotic sense, whether or not the parametric, distributional assumptions hold, even approximately. The statistical analysis procedures in this article are initially illustrated with data from 5 water treatment plants (Schenck et al., 2009), and then extended using data developed from a study of 35 drinking-water utilities (U.S. EPA/AMWA, 1989), which permits inclusion of a greater number of water constituents and increased structure in the statistical models.
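The basic bootstrap building block behind such an analysis is short; the sketch below computes a percentile bootstrap confidence interval for the difference in mean response between two mixtures (the DBP values are invented, and the percentile interval is only one of several bootstrap interval constructions):

    import numpy as np

    def bootstrap_diff_ci(a, b, stat=np.mean, B=10_000, alpha=0.05, seed=8):
        """Percentile bootstrap CI for stat(a) - stat(b); if the interval covers 0,
        the two mixtures are not distinguishable on this summary."""
        rng = np.random.default_rng(seed)
        a, b = np.asarray(a, float), np.asarray(b, float)
        diffs = np.array([stat(rng.choice(a, a.size, replace=True))
                          - stat(rng.choice(b, b.size, replace=True))
                          for _ in range(B)])
        lo, hi = np.quantile(diffs, [alpha / 2, 1 - alpha / 2])
        return stat(a) - stat(b), (round(lo, 2), round(hi, 2))

    dbp_plant1 = [41.2, 37.8, 45.1, 39.9, 43.0, 40.5]   # e.g. a DBP concentration, ug/L
    dbp_plant2 = [44.0, 47.2, 42.8, 49.1, 45.6, 46.3]
    print(bootstrap_diff_ci(dbp_plant1, dbp_plant2))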
Applications of non-parametric statistics and analysis of variance on sample variances
NASA Technical Reports Server (NTRS)
Myers, R. H.
1981-01-01
Nonparametric methods that are available for NASA-type applications are discussed. An attempt is made here to survey what can be used, to make recommendations as to when each method would be applicable, and to compare the methods, when possible, with the usual normal-theory procedures that are available for the Gaussian analog. It is important here to point out the hypotheses that are being tested, the assumptions that are being made, and the limitations of the nonparametric procedures. The appropriateness of performing analysis of variance on sample variances is also discussed and studied. This procedure is followed in several NASA simulation projects. On the surface this would appear to be a reasonably sound procedure. However, the difficulties involved center around the normality problem and the basic homogeneous-variance assumption that is made in usual analysis of variance problems. These difficulties are discussed and guidelines are given for using the methods.
The Equivalence of Three Statistical Packages for Performing Hierarchical Cluster Analysis
ERIC Educational Resources Information Center
Blashfield, Roger
1977-01-01
Three different software programs which contain hierarchical agglomerative cluster analysis procedures were shown to generate different solutions on the same data set using apparently the same options. The basis for the differences in the solutions was the formulae used to calculate Euclidean distance. (Author/JKS)
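The point about the distance formula can be made concrete with SciPy: average-linkage clustering of the same data with unsquared versus squared Euclidean distance is allowed to produce different partitions, which is exactly the kind of silent discrepancy the article describes (the data here are simulated):

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(9)
    X = np.vstack([rng.normal(0, 1, (10, 4)), rng.normal(4, 1, (10, 4))])

    d_euclid = pdist(X, metric="euclidean")       # what one package may use
    d_squared = pdist(X, metric="sqeuclidean")    # what another may silently use

    part_1 = fcluster(linkage(d_euclid, method="average"), t=2, criterion="maxclust")
    part_2 = fcluster(linkage(d_squared, method="average"), t=2, criterion="maxclust")
    print((part_1 == part_2).all())   # agreement is not guaranteed for all data sets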
Determining Sample Sizes for Precise Contrast Analysis with Heterogeneous Variances
ERIC Educational Resources Information Center
Jan, Show-Li; Shieh, Gwowen
2014-01-01
The analysis of variance (ANOVA) is one of the most frequently used statistical analyses in practical applications. Accordingly, the single and multiple comparison procedures are frequently applied to assess the differences among mean effects. However, the underlying assumption of homogeneous variances may not always be tenable. This study…
RANDOMIZATION PROCEDURES FOR THE ANALYSIS OF EDUCATIONAL EXPERIMENTS.
ERIC Educational Resources Information Center
COLLIER, RAYMOND O.
Certain specific aspects of hypothesis tests used for analysis of results in randomized experiments were studied: (1) the development of the theoretical factor, that of providing information on statistical tests for certain experimental designs, and (2) the development of the applied element, that of supplying the experimenter with machinery for…
Multivariate meta-analysis: a robust approach based on the theory of U-statistic.
Ma, Yan; Mazumdar, Madhu
2011-10-30
Meta-analysis is the methodology for combining findings from similar research studies asking the same question. When the question of interest involves multiple outcomes, multivariate meta-analysis is used to synthesize the outcomes simultaneously, taking into account the correlation between the outcomes. Likelihood-based approaches, in particular the restricted maximum likelihood (REML) method, are commonly utilized in this context. REML assumes a multivariate normal distribution for the random-effects model. This assumption is difficult to verify, especially for meta-analyses with a small number of component studies. The use of REML also requires iterative estimation between parameters, needing moderately high computation time, especially when the dimension of the outcomes is large. A multivariate method of moments (MMM) is available and is shown to perform equally well to REML. However, there is a lack of information on the performance of these two methods when the true data distribution is far from normality. In this paper, we propose a new nonparametric and non-iterative method for multivariate meta-analysis on the basis of the theory of the U-statistic and compare the properties of these three procedures under both normal and skewed data through simulation studies. It is shown that the effect on estimates from REML because of non-normal data distribution is marginal and that the estimates from the MMM and U-statistic-based approaches are very similar. Therefore, we conclude that for performing multivariate meta-analysis, the U-statistic estimation procedure is a viable alternative to REML and MMM. Easy implementation of all three methods is illustrated by their application to data from two published meta-analyses from the fields of hip fracture and periodontal disease. We discuss ideas for future research based on the U-statistic for testing the significance of between-study heterogeneity and for extending the work to the meta-regression setting. Copyright © 2011 John Wiley & Sons, Ltd.
Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.
Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R
2012-08-01
Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time (AFT) model with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
2009-09-01
…instructional format. Using a mixed-method coding and analysis approach, the sample of POIs was categorized, coded, statistically analyzed, and a… …transition to a distributed (or blended) learning format. Procedure: A mixed-methods approach, combining qualitative coding procedures with basic…
Jahn, I; Foraita, R
2008-01-01
In Germany, gender-sensitive approaches are part of the guidelines for good epidemiological practice as well as of health reporting. They are increasingly required in order to implement the gender mainstreaming strategy in research funded by the federal government and the federal states. This paper focuses on methodological aspects of data analysis; the health report of Bremen, a population-based cross-sectional study, serves as an empirical data example. Health reporting requires analysis and reporting methods that are able to uncover sex/gender aspects of the questions at hand, on the one hand, and consider how results can be adequately communicated, on the other. The core question is: what consequences does a different inclusion of the category sex in different statistical analyses for the identification of potential target groups have on the results? As evaluation methods, logistic regression as well as a two-stage procedure were applied exploratively. The two-stage procedure combines graphical models with CHAID decision trees and allows complex results to be visualised. Both methods are analysed stratified by sex/gender as well as adjusted for sex/gender and compared with each other. As a result, only stratified analyses are able to detect differences between the sexes and within the sex/gender groups, as long as one cannot resort to prior knowledge. Adjusted analyses can detect sex/gender differences only if interaction terms are included in the model. The results are discussed from a statistical-epidemiological perspective as well as in the context of health reporting. In conclusion, the question of whether a statistical method is gender-sensitive can only be answered for a concrete research question and known conditions. Often, an appropriate statistical procedure can be chosen after conducting separate analyses for women and men. Future gender studies require innovative study designs as well as conceptual clarity with regard to the biological and sociocultural components of the category sex/gender.
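The statistical point (an adjusted model hides a sex-specific effect unless the interaction term is included, while stratified models recover it) is easy to demonstrate with statsmodels on simulated data; the variable names and effect sizes are illustrative:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(10)
    n = 800
    sex = rng.integers(0, 2, n)                     # 0 = male, 1 = female
    risk = rng.normal(size=n)
    logit_p = -1.0 + 1.2 * risk * sex               # risk factor matters for women only
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))
    df = pd.DataFrame({"y": y, "risk": risk, "sex": sex})

    m_adj = smf.logit("y ~ risk + sex", data=df).fit(disp=False)   # averages the effect away
    m_int = smf.logit("y ~ risk * sex", data=df).fit(disp=False)   # recovers the interaction
    m_w = smf.logit("y ~ risk", data=df[df.sex == 1]).fit(disp=False)
    m_m = smf.logit("y ~ risk", data=df[df.sex == 0]).fit(disp=False)
    print(round(m_adj.params["risk"], 2), round(m_int.params["risk:sex"], 2),
          round(m_w.params["risk"], 2), round(m_m.params["risk"], 2))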
A Field-Effect Transistor (FET) model for ASAP
NASA Technical Reports Server (NTRS)
Ming, L.
1965-01-01
The derivation of the circuitry of a field effect transistor (FET) model, the procedure for adapting the model to automated statistical analysis program (ASAP), and the results of applying ASAP on this model are described.
Railroad Classification Yard Technology Manual. Volume III. Freight Car Rollability
DOT National Transportation Integrated Search
1981-07-01
The report presents a survey of rolling resistance research, histograms of rolling resistance from five yards, a statistical regression analysis of causal factors affecting rolling resistance, procedures for constructing a rolling resistance histogra...
NASA Technical Reports Server (NTRS)
Wallace, G. R.; Weathers, G. D.; Graf, E. R.
1973-01-01
The statistics of filtered pseudorandom digital sequences called hybrid-sum sequences, formed from the modulo-two sum of several maximum-length sequences, are analyzed. The results indicate that a relation exists between the statistics of the filtered sequence and the characteristic polynomials of the component maximum length sequences. An analysis procedure is developed for identifying a large group of sequences with good statistical properties for applications requiring the generation of analog pseudorandom noise. By use of the analysis approach, the filtering process is approximated by the convolution of the sequence with a sum of unit step functions. A parameter reflecting the overall statistical properties of filtered pseudorandom sequences is derived. This parameter is called the statistical quality factor. A computer algorithm to calculate the statistical quality factor for the filtered sequences is presented, and the results for two examples of sequence combinations are included. The analysis reveals that the statistics of the signals generated with the hybrid-sum generator are potentially superior to the statistics of signals generated with maximum-length generators. Furthermore, fewer calculations are required to evaluate the statistics of a large group of hybrid-sum generators than are required to evaluate the statistics of the same size group of approximately equivalent maximum-length sequences.
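A toy version of the hybrid-sum construction (two LFSR-generated maximum-length sequences combined modulo two and then low-pass filtered) is shown below; the tap positions correspond to commonly listed primitive polynomials, the single-pole filter stands in for the filtering described in the paper, and the skewness/kurtosis printout is only a crude proxy for the statistical quality factor defined there:

    import numpy as np
    from scipy.signal import lfilter
    from scipy.stats import skew, kurtosis

    def m_sequence(taps, n_bits):
        """Bits from a Fibonacci LFSR; `taps` are 1-indexed feedback positions."""
        state = [1] * max(taps)
        out = []
        for _ in range(n_bits):
            out.append(state[-1])
            fb = 0
            for t in taps:
                fb ^= state[t - 1]
            state = [fb] + state[:-1]
        return np.array(out)

    n = 2 ** 12
    seq_a = m_sequence([7, 6], n)            # taps for x^7 + x^6 + 1
    seq_b = m_sequence([9, 5], n)            # taps for x^9 + x^5 + 1
    hybrid = np.bitwise_xor(seq_a, seq_b)    # modulo-two sum of the component sequences

    # Map bits to +/-1 and low-pass filter to obtain an analog-like noise signal.
    analog = lfilter([0.05], [1.0, -0.95], 2.0 * hybrid - 1.0)
    print(round(skew(analog), 3), round(kurtosis(analog), 3))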
Pounds, Stan; Cheng, Cheng; Cao, Xueyuan; Crews, Kristine R.; Plunkett, William; Gandhi, Varsha; Rubnitz, Jeffrey; Ribeiro, Raul C.; Downing, James R.; Lamba, Jatinder
2009-01-01
Motivation: In some applications, prior biological knowledge can be used to define a specific pattern of association of multiple endpoint variables with a genomic variable that is biologically most interesting. However, to our knowledge, there is no statistical procedure designed to detect specific patterns of association with multiple endpoint variables. Results: Projection onto the most interesting statistical evidence (PROMISE) is proposed as a general procedure to identify genomic variables that exhibit a specific biologically interesting pattern of association with multiple endpoint variables. Biological knowledge of the endpoint variables is used to define a vector that represents the biologically most interesting values for statistics that characterize the associations of the endpoint variables with a genomic variable. A test statistic is defined as the dot-product of the vector of the observed association statistics and the vector of the most interesting values of the association statistics. By definition, this test statistic is proportional to the length of the projection of the observed vector of correlations onto the vector of most interesting associations. Statistical significance is determined via permutation. In simulation studies and an example application, PROMISE shows greater statistical power to identify genes with the interesting pattern of associations than classical multivariate procedures, individual endpoint analyses or listing genes that have the pattern of interest and are significant in more than one individual endpoint analysis. Availability: Documented R routines are freely available from www.stjuderesearch.org/depts/biostats and will soon be available as a Bioconductor package from www.bioconductor.org. Contact: stanley.pounds@stjude.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19528086
Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek
2018-02-01
A novel methodology for grouping and ranking with the application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects, which are analytical procedures applied to furan determination in food samples. They are described by 10 variables relating to their analytical performance and to environmental and economic aspects. Multivariate statistical analysis makes it possible to limit the amount of input data for the ranking analysis. The assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. It is shown how the information obtained from the two tools complements each other. The applicability of combining grouping and ranking is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
Actuarial analysis of surgical results: rationale and method.
Grunkemeier, G L; Starr, A
1977-11-01
The use of time-related methods of statistical analysis is essential for valid evaluation of the long-term results of a surgical procedure. Accurate comparison of two procedures or two prosthetic devices is possible only when the length of follow-up is properly accounted for. The purpose of this report is to make the technical aspects of the actuarial, or life table, method easily accessible to the surgeon, with emphasis on the motivation for and the rationale behind it. This topic is illustrated in terms of heart valve prostheses, a field that is rapidly developing. Both the authors and readers of articles must be aware that controversies surrounding the relative merits of various prosthetic designs or operative procedures can be settled only if proper time-related methods of analysis are utilized.
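A bare-bones actuarial (life-table) calculation illustrates the method the authors advocate; the convention of counting patients censored within an interval as exposed for half of it is standard, while the follow-up data below are simulated and not clinically meaningful:

    import numpy as np

    def life_table(time, event, width=1.0, n_intervals=5):
        """Actuarial survival estimate over fixed intervals; censored patients
        contribute half an interval of exposure in the interval they leave."""
        t = np.asarray(time, float)
        d = np.asarray(event, bool)           # True = event (e.g. valve failure)
        surv, s = [], 1.0
        for k in range(n_intervals):
            lo, hi = k * width, (k + 1) * width
            at_risk = np.sum(t >= lo)
            events = np.sum(d & (t >= lo) & (t < hi))
            censored = np.sum(~d & (t >= lo) & (t < hi))
            exposed = at_risk - censored / 2.0
            if exposed > 0:
                s *= 1.0 - events / exposed
            surv.append(s)
        return np.round(surv, 3)

    rng = np.random.default_rng(11)
    follow_up = np.minimum(rng.exponential(6.0, 200), 10.0)   # years of follow-up
    failed = rng.random(200) < 0.6                            # illustrative event flags
    print(life_table(follow_up, failed))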
Kumar, Keshav; Mishra, Ashok Kumar
2015-07-01
The fluorescence characteristics of 8-anilinonaphthalene-1-sulfonic acid (ANS) in ethanol-water mixtures, in combination with partial least squares (PLS) analysis, were used to propose a simple and sensitive analytical procedure for monitoring the adulteration of ethanol by water. The proposed analytical procedure was found to be capable of detecting even small levels of adulteration of ethanol by water. The robustness of the procedure is evident from statistical parameters such as the square of the correlation coefficient (R(2)), the root mean square error of calibration (RMSEC) and the root mean square error of prediction (RMSEP), which were found to be well within acceptable limits.
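A schematic of the spectra-plus-PLS workflow using scikit-learn; the synthetic ANS emission model (intensity quenched and peak red-shifted as water is added) and the calibration/prediction split are assumptions made purely for illustration:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(12)
    wavelengths = np.linspace(400, 600, 120)
    water_frac = rng.uniform(0.0, 0.2, 40)                 # adulteration level (v/v)

    def spectrum(w):
        """Toy ANS emission: intensity drops and the peak red-shifts with water."""
        peak = 470 + 150 * w
        return (1.0 - 3.0 * w) * np.exp(-((wavelengths - peak) / 40.0) ** 2)

    X = np.array([spectrum(w) for w in water_frac]) + 0.01 * rng.normal(size=(40, 120))
    pls = PLSRegression(n_components=3).fit(X[:30], water_frac[:30])
    pred = pls.predict(X[30:]).ravel()
    rmsep = mean_squared_error(water_frac[30:], pred) ** 0.5
    print("RMSEP:", round(rmsep, 4))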
Code of Federal Regulations, 2011 CFR
2011-07-01
..., and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act D... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...
Code of Federal Regulations, 2010 CFR
2010-07-01
..., and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act D... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...
Statistical assessment of the learning curves of health technologies.
Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T
2001-01-01
(1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. 
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
Huang, Shou-Guo; Chen, Bo; Lv, Dong; Zhang, Yong; Nie, Feng-Feng; Li, Wei; Lv, Yao; Zhao, Huan-Li; Liu, Hong-Mei
2017-01-01
Purpose Using a network meta-analysis approach, our study aims to develop a ranking of the six surgical procedures, that is, Plate, titanium elastic nail (TEN), tension band wire (TBW), hook plate (HP), reconstruction plate (RP) and Knowles pin, by comparing the post-surgery constant shoulder scores in patients with clavicular fracture (CF). Methods A comprehensive search of electronic scientific literature databases was performed to retrieve publications investigating surgical procedures in CF, with stringent eligibility criteria, and clinical experimental studies of high quality and relevance to our area of interest were selected for network meta-analysis. Statistical analyses were conducted using Stata 12.0. Results A total of 19 studies that met our inclusion criteria were eventually enrolled into our network meta-analysis, representing 1164 patients who had undergone surgical procedures for CF (TEN group = 240; Plate group = 164; TBW group = 180; RP group = 168; HP group = 245; Knowles pin group = 167). The network meta-analysis results revealed that RP significantly improved the constant shoulder score in patients with CF when compared with TEN, while the post-operative constant shoulder scores in patients with CF after Plate, TBW, HP, Knowles pin and TEN were similar, with no statistically significant differences. The relative treatment ranking based on the predictive probabilities of constant shoulder scores in patients with CF after surgery revealed that the surface under the cumulative ranking curve (SUCRA) value was highest for RP. Conclusion The current network meta-analysis suggests that RP may be the optimum surgical treatment among the six interventions for patients with CF, and it can improve the shoulder score of patients with CF. Implications for Rehabilitation RP improves shoulder joint function after the surgical procedure. RP achieves stability with minimal complications after surgery. RP may be the optimum surgical treatment for rehabilitation of patients with CF.
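The SUCRA ranking mentioned above is computed from posterior rank probabilities. As a rough, hypothetical illustration (the simulated treatment effects, spread, and number of draws below are not taken from the study, and a real analysis would use posterior draws from the fitted network meta-analysis model), a minimal sketch in Python:

```python
import numpy as np

# Hypothetical posterior draws of the mean constant shoulder score for the six
# procedures (rows = draws, columns = treatments); placeholders only.
rng = np.random.default_rng(0)
treatments = ["Plate", "TEN", "TBW", "HP", "RP", "Knowles pin"]
draws = rng.normal(loc=[70, 68, 71, 69, 74, 70], scale=3, size=(4000, 6))

# Rank treatments within each draw (rank 1 = highest score).
ranks = (-draws).argsort(axis=1).argsort(axis=1) + 1
n_t = len(treatments)

# Probability that each treatment attains each rank.
rank_prob = np.stack([(ranks == r).mean(axis=0) for r in range(1, n_t + 1)])

# SUCRA = mean of the cumulative ranking probabilities over ranks 1..n_t-1.
sucra = rank_prob.cumsum(axis=0)[:-1].mean(axis=0)

for name, s in sorted(zip(treatments, sucra), key=lambda x: -x[1]):
    print(f"{name:12s} SUCRA = {s:.2f}")
```

A SUCRA of 1 would mean a treatment is certain to be best and 0 certain to be worst, which is why the ranking in the abstract is read off the largest SUCRA value.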
Jacob, Laurent; Combes, Florence; Burger, Thomas
2018-06-18
We propose a new hypothesis test for the differential abundance of proteins in mass-spectrometry-based relative quantification. An important feature of this type of high-throughput analysis is that it involves an enzymatic digestion of the sample proteins into peptides prior to identification and quantification. Due to numerous sequence homologies, different proteins can lead to peptides with identical amino acid chains, so that their parent protein is ambiguous. These so-called shared peptides make the protein-level statistical analysis a challenge and are often not accounted for. In this article, we use a linear model describing peptide-protein relationships to build a likelihood ratio test of differential abundance for proteins. We show that the likelihood ratio statistic can be computed in linear time with the number of peptides. We also provide the asymptotic null distribution of a regularized version of our statistic. Experiments on both real and simulated datasets show that our procedure outperforms state-of-the-art methods. The procedures are available via the pepa.test function of the DAPAR Bioconductor R package.
Stepaniak, Pieter S; Soliman Hamad, Mohamed A; Dekker, Lukas R C; Koolen, Jacques J
2014-01-01
In this study, we sought to analyze the stochastic behavior of Catheterization Laboratory (Cath Lab) procedures in our institution. Statistical models may help to improve estimated case durations to support management in the cost-effective use of expensive surgical resources. We retrospectively analyzed all the procedures performed in the Cath Labs in 2012. The duration of procedures is strictly positive (larger than zero) and mostly has a large minimum duration. Because of the strictly positive character of the Cath Lab procedures, a fit of a lognormal model may be desirable. Having a minimum duration requires an estimate of the threshold (shift) parameter of the lognormal model. Therefore, the 3-parameter lognormal model is interesting. To avoid heterogeneous groups of observations, we tested every group-cardiologist-procedure combination for the normal, 2- and 3-parameter lognormal distributions. The total number of elective and emergency procedures performed was 6,393 (8,186 h). The final analysis included 6,135 procedures (7,779 h). Electrophysiology (intervention) procedures fit the 3-parameter lognormal model in 86.1% (80.1%) of cases. Using Friedman test statistics, we conclude that the 3-parameter lognormal model is superior to the 2-parameter lognormal model. Furthermore, the 2-parameter lognormal model is superior to the normal model. Cath Lab procedures are well modelled by lognormal models. This information helps to improve and refine Cath Lab schedules and hence their efficient use.
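The model comparison described above can be illustrated with SciPy, where the lognormal distribution's location parameter plays the role of the threshold (shift). The sketch below is a minimal, hypothetical example for a single cardiologist-procedure group; the simulated durations and the 25-minute floor are assumptions, not Cath Lab data.

```python
import numpy as np
from scipy import stats

# Hypothetical procedure durations (minutes) with a strictly positive minimum.
rng = np.random.default_rng(1)
durations = 25 + rng.lognormal(mean=3.4, sigma=0.5, size=120)

# 2-parameter lognormal: threshold (loc) fixed at zero.
shape2, loc2, scale2 = stats.lognorm.fit(durations, floc=0)

# 3-parameter lognormal: threshold estimated from the data.
shape3, loc3, scale3 = stats.lognorm.fit(durations)

# Normal model for comparison.
mu, sigma = stats.norm.fit(durations)

# Screen the fits with a Kolmogorov-Smirnov statistic per model; the paper
# compares models across many groups with Friedman test statistics.
for name, dist, args in [("2p lognormal", "lognorm", (shape2, loc2, scale2)),
                         ("3p lognormal", "lognorm", (shape3, loc3, scale3)),
                         ("normal", "norm", (mu, sigma))]:
    print(name, stats.kstest(durations, dist, args=args).statistic)
```

In this setup a smaller Kolmogorov-Smirnov statistic indicates a closer fit, so the shifted (3-parameter) lognormal would typically win whenever a substantial minimum duration is present.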
NASA Technical Reports Server (NTRS)
Hyde, G.
1976-01-01
The 13/18 GHz COMSAT Propagation Experiment (CPE) was performed to measure attenuation caused by hydrometeors along slant paths from transmitting terminals on the ground to the ATS-6 satellite. The effectiveness of site diversity in overcoming this impairment was also studied. Problems encountered in assembling a valid data base of rain induced attenuation data for statistical analysis are considered. The procedures used to obtain the various statistics are then outlined. The graphs and tables of statistical data for the 15 dual frequency (13 and 18 GHz) site diversity locations are discussed. Cumulative rain rate statistics for the Fayetteville and Boston sites based on point rainfall data collected are presented along with extrapolations of the attenuation and point rainfall data.
Detection of crossover time scales in multifractal detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Ge, Erjia; Leung, Yee
2013-04-01
Fractal analysis is employed in this paper as a scale-based method for the identification of the scaling behavior of time series. Many spatial and temporal processes exhibiting complex multi(mono)-scaling behaviors are fractals. One of the important concepts in fractals is the crossover time scale(s) that separates distinct regimes having different fractal scaling behaviors. A common method is multifractal detrended fluctuation analysis (MF-DFA). The detection of crossover time scale(s) is, however, relatively subjective since it has been made without rigorous statistical procedures and has generally been determined by eyeballing or subjective observation. Crossover time scales determined in this way may be spurious and problematic, and may not reflect the genuine underlying scaling behavior of a time series. The purpose of this paper is to propose a statistical procedure to model complex fractal scaling behaviors and reliably identify the crossover time scales under MF-DFA. The scaling-identification regression model, grounded on a solid statistical foundation, is first proposed to describe multi-scaling behaviors of fractals. Through the regression analysis and statistical inference, we can (1) identify the crossover time scales that cannot be detected by eyeballing, (2) determine the number and locations of the genuine crossover time scales, (3) give confidence intervals for the crossover time scales, and (4) establish the statistically significant regression model depicting the underlying scaling behavior of a time series. To substantiate our argument, the regression model is applied to analyze the multi-scaling behaviors of avian-influenza outbreaks, water consumption, daily mean temperature, and rainfall of Hong Kong. Through the proposed model, we can have a deeper understanding of fractals in general and a statistical approach to identify multi-scaling behavior under MF-DFA in particular.
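As a rough illustration of regression-based crossover detection (a simplified stand-in for the authors' scaling-identification regression model, not their exact procedure), the sketch below fits a continuous two-segment regression to a synthetic log-log fluctuation function and picks the breakpoint with the smallest residual sum of squares; the fluctuation function, crossover location, and noise level are all assumed.

```python
import numpy as np

def two_segment_fit(log_s, log_f):
    """Fit a continuous two-segment line to log F(s) vs log s and return the
    breakpoint (candidate crossover scale) that minimises the SSE."""
    best = (np.inf, None, None)
    for k in range(2, len(log_s) - 2):                 # candidate breakpoints
        x0 = log_s[k]
        # Columns: intercept, slope, change of slope beyond the breakpoint.
        X = np.column_stack([np.ones_like(log_s), log_s,
                             np.clip(log_s - x0, 0, None)])
        beta, *_ = np.linalg.lstsq(X, log_f, rcond=None)
        sse = np.sum((log_f - X @ beta) ** 2)
        if sse < best[0]:
            best = (sse, x0, beta)
    return best

# Synthetic fluctuation function with a crossover near s = 100.
s = np.logspace(1, 3, 40)
true = np.where(s < 100, s ** 0.8, 100 ** 0.8 * (s / 100) ** 0.5)
rng = np.random.default_rng(2)
f = true * np.exp(rng.normal(0, 0.02, s.size))

sse, crossover, beta = two_segment_fit(np.log10(s), np.log10(f))
print("estimated crossover scale:", round(10 ** crossover, 1))
print("scaling exponents before/after:", beta[1], beta[1] + beta[2])
```

Confidence intervals for the breakpoint, as in the paper, could then be obtained by standard regression inference or bootstrapping around the selected model.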
ERIC Educational Resources Information Center
Marquardt, Lloyd D.; McCormick, Ernest J.
This study was concerned with the identification of the job dimension underlying the job elements of the Position Analysis Questionnaire (PAQ), Form B. The PAQ is a structured job analysis instrument consisting of 187 worker-oriented job elements which are divided into six a priori major divisions. The statistical procedure of principal components…
Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E
2015-03-01
Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect RS effects. Since missing data are often encountered in longitudinal HRQOL data, effective strategies to deal with missing data are important to consider. This study aims to compare different imputation methods for the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization (EM) imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical power of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9% and 31.3% had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted, but statistically significant changes were detected on the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for the physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. Our simulations revealed that relative importance test procedures were least powerful under the complete-case analysis method and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and the imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.
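To make the imputation comparison concrete, the sketch below contrasts mean imputation with a multiple-imputation-style approach in scikit-learn; the data matrix, missingness rate, and number of imputations are hypothetical, and a full analysis would re-estimate the relative importance weights in each completed data set and pool the results (Rubin's rules) rather than averaging the imputed matrices as done here for brevity.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import SimpleImputer, IterativeImputer

# Hypothetical HRQOL domain scores (rows = caregivers, columns = domains)
# with roughly 30% of values missing at random.
rng = np.random.default_rng(3)
X = rng.normal(50, 10, size=(409, 8))
X_missing = np.where(rng.random(X.shape) < 0.3, np.nan, X)

# Mean imputation: each missing value replaced by its column mean.
X_mean = SimpleImputer(strategy="mean").fit_transform(X_missing)

# Multiple stochastic imputations via chained equations (IterativeImputer with
# posterior sampling); averaged here only to keep the example short.
imputations = [IterativeImputer(sample_posterior=True,
                                random_state=m).fit_transform(X_missing)
               for m in range(5)]
X_mi = np.mean(imputations, axis=0)

print("column means (mean imputation):", X_mean.mean(axis=0).round(1))
print("column means (multiple imputation):", X_mi.mean(axis=0).round(1))
```

The downstream relative importance test would then be run on each imputed data set, which is where the power differences reported above arise.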
NASA Technical Reports Server (NTRS)
Batthauer, Byron E.
1987-01-01
This paper analyzes a NASA Convair 990 (CV-990) accident with emphasis on rejected-takeoff (RTO) decision making, training, procedures, and accident statistics. The NASA Aircraft Accident Investigation Board was somewhat perplexed that an aircraft could be destroyed as a result of blown tires during the takeoff roll. To provide a better understanding of tire-failure RTO's, the Board obtained accident reports, Federal Aviation Administration (FAA) studies, and other pertinent information related to the elements of this accident. This material enhanced the analysis process and convinced the Accident Board that high-speed RTO's in transport aircraft should be given more emphasis during pilot training. Pilots should be made aware of various RTO situations and statistics, with emphasis on failed-tire RTO's. This background information could enhance the split-second decision-making process that is required prior to initiating an RTO.
Effects of Instructional Design with Mental Model Analysis on Learning.
ERIC Educational Resources Information Center
Hong, Eunsook
This paper presents a model for systematic instructional design that includes mental model analysis together with the procedures used in developing computer-based instructional materials in the area of statistical hypothesis testing. The instructional design model is based on the premise that the objective for learning is to achieve expert-like…
Effect Size Measure and Analysis of Single Subject Designs
ERIC Educational Resources Information Center
Swaminathan, Hariharan; Horner, Robert H.; Rogers, H. Jane; Sugai, George
2012-01-01
This study is aimed at addressing the criticisms that have been leveled at the currently available statistical procedures for analyzing single subject designs (SSD). One of the vexing problems in the analysis of SSD is in the assessment of the effect of intervention. Serial dependence notwithstanding, the linear model approach that has been…
A Review of Classical Methods of Item Analysis.
ERIC Educational Resources Information Center
French, Christine L.
Item analysis is a very important consideration in the test development process. It is a statistical procedure to analyze test items that combines methods used to evaluate the important characteristics of test items, such as difficulty, discrimination, and distractibility of the items in a test. This paper reviews some of the classical methods for…
A PERT/CPM of the Computer Assisted Completion of The Ministry September Report. Research Report.
ERIC Educational Resources Information Center
Feeney, J. D.
Using two statistical analysis techniques (the Program Evaluation and Review Technique and the Critical Path Method), this study analyzed procedures for compiling the required yearly report of the Metropolitan Separate School Board (Catholic) of Toronto, Canada. The computer-assisted analysis organized the process of completing the report more…
The Analysis of Completely Randomized Factorial Experiments When Observations Are Lost at Random.
ERIC Educational Resources Information Center
Hummel, Thomas J.
An investigation was conducted of the characteristics of two estimation procedures and corresponding test statistics used in the analysis of completely randomized factorial experiments when observations are lost at random. For one estimator, contrast coefficients for cell means did not involve the cell frequencies. For the other, contrast…
ERIC Educational Resources Information Center
Goldstein, Harvey; Bonnet, Gerard; Rocher, Thierry
2007-01-01
The Programme for International Student Assessment comparative study of reading performance among 15-year-olds is reanalyzed using statistical procedures that allow the full complexity of the data structures to be explored. The article extends existing multilevel factor analysis and structural equation models and shows how this can extract richer…
Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data.
Carmichael, Owen; Sakhanenko, Lyudmila
2015-05-15
We develop statistical methodology for a popular brain imaging technique, HARDI, based on the high order tensor model by Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves, or fibers, which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense as it blends linear algebra concepts from high order tensors with asymptotic statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3] to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids all the typical limitations of the deterministic tractography methods and it delivers the same information as probabilistic tractography methods. Our method is computationally cheap and it provides a well-founded mathematical and statistical framework in which diverse functionals on fibers, directions and tensors can be studied in a systematic and rigorous way.
Krypotos, Angelos-Miltiadis; Klugkist, Irene; Engelhard, Iris M.
2017-01-01
Threat conditioning procedures have allowed the experimental investigation of the pathogenesis of Post-Traumatic Stress Disorder. The findings of these procedures have also provided stable foundations for the development of relevant intervention programs (e.g. exposure therapy). Statistical inference of threat conditioning procedures is commonly based on p-values and Null Hypothesis Significance Testing (NHST). Nowadays, however, there is a growing concern about this statistical approach, as many scientists point to the various limitations of p-values and NHST. As an alternative, the use of Bayes factors and Bayesian hypothesis testing has been suggested. In this article, we apply this statistical approach to threat conditioning data. In order to enable the easy computation of Bayes factors for threat conditioning data we present a new R package named condir, which can be used either via the R console or via a Shiny application. This article provides both a non-technical introduction to Bayesian analysis for researchers using the threat conditioning paradigm, and the necessary tools for computing Bayes factors easily. PMID:29038683
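For readers who want a feel for the p-value-to-Bayes-factor shift without R, the sketch below computes an approximate Bayes factor for a paired conditioned-versus-neutral comparison using the BIC approximation (a simpler route than the default Bayes factors implemented in condir); the simulated difference scores, effect size, and sample size are hypothetical.

```python
import numpy as np

def bf10_bic(diff):
    """Approximate Bayes factor (alternative vs null) for testing whether the
    mean of paired differences is zero, via BF10 ~ exp((BIC0 - BIC1) / 2)."""
    n = diff.size
    rss_alt = np.sum((diff - diff.mean()) ** 2)    # mean estimated freely
    rss_null = np.sum(diff ** 2)                   # mean fixed at zero
    bic_alt = n * np.log(rss_alt / n) + 2 * np.log(n)    # mean + variance
    bic_null = n * np.log(rss_null / n) + 1 * np.log(n)  # variance only
    return np.exp((bic_null - bic_alt) / 2)

# Hypothetical CS+ minus CS- skin conductance differences for 40 participants.
rng = np.random.default_rng(4)
diff = rng.normal(0.3, 1.0, size=40)
bf10 = bf10_bic(diff)
print(f"BF10 = {bf10:.2f} (evidence for a conditioning effect)"
      if bf10 > 1 else f"BF01 = {1 / bf10:.2f} (evidence for the null)")
```

Unlike a p-value, the Bayes factor can quantify evidence for the null as well as against it, which is the property the abstract emphasizes.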
Ahmad, Arif; Carleton, Jared D; Ahmad, Zoha F; Agarwala, Ashish
2016-09-01
The purpose of this study was to compare the operative and early perioperative outcomes of laparoscopic versus robotic-assisted Roux-en-Y gastric bypass procedures performed in a community hospital setting. The study was a chart review and analysis of the early perioperative outcomes of a total of 345 Roux-en-Y gastric bypass procedures performed by a single surgeon in a community hospital setting from January 2011 to October 2014. Of these, 173 procedures were performed laparoscopically and 172 were performed with robotic assistance utilizing the daVinci(®) surgical platform. Factors such as baseline patient characteristics, operative time, estimated blood loss (EBL), conversions to open procedure, complication rates, adverse events, length of stay (LOS), and return to the operating room for the two groups were retrospectively analyzed from a prospectively maintained database. Student's t test with unequal variances was used for statistical analysis, and a p value <0.05 was used for significance. There were no statistically significant differences in complication rates, EBL, or LOS between the two groups. There was a significant difference between the total operative times (135.30 ± 37.60 min for the laparoscopic procedure versus 154.84 ± 38.44 min for the robotic procedure, p < 0.05). There were no adverse intraoperative events, conversions to open procedures, leaks, strictures, returns to the operating room within 30 days, or mortalities in either group. Our study, which is the first of its kind to analyze the operative and early perioperative outcomes between laparoscopic and robotic-assisted Roux-en-Y gastric bypass procedures in the US community hospital setting, indicates that both are comparable in terms of safety, efficacy, and operative and early perioperative outcomes.
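A minimal sketch of the unequal-variance (Welch) t test used for the operative-time comparison; the simulated samples below merely mimic the reported group means and standard deviations and are not the study data.

```python
import numpy as np
from scipy import stats

# Hypothetical operative times (minutes) for the two approaches.
rng = np.random.default_rng(5)
laparoscopic = rng.normal(135.3, 37.6, size=173)
robotic = rng.normal(154.8, 38.4, size=172)

# Student's t test with unequal variances (Welch's t test).
t, p = stats.ttest_ind(laparoscopic, robotic, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}, significant at 0.05: {p < 0.05}")
```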
75 FR 38871 - Proposed Collection; Comment Request for Revenue Procedure 2004-29
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-06
... comments concerning Revenue Procedure 2004-29, Statistical Sampling in Sec. 274 Context. DATES: Written... Internet, at [email protected] . SUPPLEMENTARY INFORMATION: Title: Statistical Sampling in Sec...: Revenue Procedure 2004-29 prescribes the statistical sampling methodology by which taxpayers under...
Fan, Tao; Zhao, XinGang; Zhao, HaiJun; Liang, Cong; Wang, YinQian; Gai, QiFei; Zhang, Fangyi
2015-10-01
It is well established that syringomyelia can cause neurological symptoms and deficits through accumulation of fluid within syrinx cavities, which leads to internal compression within the spinal cord. When other interventions treating the underlying etiology fail to yield any improvement, the next option is a procedure to divert the fluid from the syrinx cavity, such as syringo-subarachnoid, syringo-peritoneal or syringo-pleural shunting. The indications and long-term efficacy of these direct shunting procedures are still questionable and controversial. The aim was to investigate the clinical indications, outcomes and complications of syringo-pleural shunt (SPS) as an alternative treatment for syringomyelia. We report a retrospective series of 26 cases of syringomyelia found to have an indication for a diversion procedure, in which SPS was offered. Patients' symptoms, mJOA scores, and MRI were collected to evaluate the change in the syringomyelia and the prognosis of the patients. All 26 patients underwent SPS. The mean follow-up time was 27.4 months, and a two-tailed Wilcoxon signed-rank test was used for the statistical analysis of the mJOA scores. The key surgical technique, outcomes and complications of SPS are reported in detail. No mortality or severe complications occurred. Postoperative MRIs revealed near-complete resolution of the syrinx in 14 patients, significant shrinkage in 10 patients, and no obvious reduction in the remaining 2 patients. Postoperatively, the symptoms improved in 24 cases (92.3%). Statistical analysis of the mJOA scores showed a statistically significant difference (P<0.001) between the preoperative and the 2-week postoperative scores, with no further significant improvement between 2 weeks and the final follow-up at 27 months. Collapse or remarkable shrinkage of the syrinx by SPS could ameliorate or at least stabilize the symptoms for the patient. We recommend a small laminectomy and a less than 3 mm myelotomy either at the PML or the DREZ. The SPS procedure can be an effective and relatively long-lived treatment for idiopathic syringomyelia and for cases that failed other options.
Browne, Richard W; Whitcomb, Brian W
2010-07-01
Problems in the analysis of laboratory data commonly arise in epidemiologic studies in which biomarkers subject to lower detection thresholds are used. Various thresholds exist including limit of detection (LOD), limit of quantification (LOQ), and limit of blank (LOB). Choosing appropriate strategies for dealing with data affected by such limits relies on proper understanding of the nature of the detection limit and its determination. In this paper, we demonstrate experimental and statistical procedures generally used for estimating different detection limits according to standard procedures in the context of analysis of fat-soluble vitamins and micronutrients in human serum. Fat-soluble vitamins and micronutrients were analyzed by high-performance liquid chromatography with diode array detection. A simulated serum matrix blank was repeatedly analyzed for determination of LOB parametrically by using the observed blank distribution as well as nonparametrically by using ranks. The LOD was determined by combining information regarding the LOB with data from repeated analysis of standard reference materials (SRMs), diluted to low levels; from LOB to 2-3 times LOB. The LOQ was determined experimentally by plotting the observed relative standard deviation (RSD) of SRM replicates compared with the concentration, where the LOQ is the concentration at an RSD of 20%. Experimental approaches and example statistical procedures are given for determination of LOB, LOD, and LOQ. These quantities are reported for each measured analyte. For many analyses, there is considerable information available below the LOQ. Epidemiologic studies must understand the nature of these detection limits and how they have been estimated for appropriate treatment of affected data.
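A minimal numeric sketch of the parametric LOB/LOD calculation and the RSD check described above; the replicate values, analyte, and the 1.645 multiplier (95% one-sided coverage under normality) are illustrative assumptions in the spirit of the standard approach, not the paper's measurements.

```python
import numpy as np

# Hypothetical replicate measurements of a simulated blank matrix and of a
# low-concentration standard reference material (arbitrary units).
rng = np.random.default_rng(6)
blanks = np.abs(rng.normal(0.2, 0.1, size=60))
low_srm = rng.normal(0.8, 0.15, size=60)

# Parametric LOB: upper 95th percentile of the blank distribution.
lob = blanks.mean() + 1.645 * blanks.std(ddof=1)

# Nonparametric alternative: rank-based 95th percentile of the blanks.
lob_nonpar = np.percentile(blanks, 95)

# LOD: lowest level reliably distinguished from blank, combining the LOB with
# the spread of low-level replicates.
lod = lob + 1.645 * low_srm.std(ddof=1)

# LOQ: the concentration at which the relative standard deviation hits 20%;
# with several low-level series one would interpolate the RSD-vs-concentration
# curve, so only a single-level RSD is shown here.
rsd = 100 * low_srm.std(ddof=1) / low_srm.mean()
print(f"LOB={lob:.3f} (nonparametric {lob_nonpar:.3f}), LOD={lod:.3f}, "
      f"RSD at low level={rsd:.1f}%")
```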
Suggestions for presenting the results of data analyses
Anderson, David R.; Link, William A.; Johnson, Douglas H.; Burnham, Kenneth P.
2001-01-01
We give suggestions for the presentation of research results from frequentist, information-theoretic, and Bayesian analysis paradigms, followed by several general suggestions. The information-theoretic and Bayesian methods offer alternative approaches to data analysis and inference compared to traditionally used methods. Guidance is lacking on the presentation of results under these alternative procedures and on nontesting aspects of classical frequentist methods of statistical analysis. Null hypothesis testing has come under intense criticism. We recommend less reporting of the results of statistical tests of null hypotheses in cases where the null is surely false anyway, or where the null hypothesis is of little interest to science or management.
Accuracy of Buccal Scan Procedures for the Registration of Habitual Intercuspation.
Zimmermann, M; Ender, A; Attin, T; Mehl, A
2018-04-09
Accurate reproduction of the jaw relationship is important in many fields of dentistry. Maximum intercuspation can be registered with digital buccal scan procedures implemented in the workflow of many intraoral scanning systems. The aim of this study was to investigate the accuracy of buccal scan procedures with intraoral scanning devices for the registration of habitual intercuspation in vivo. The hypothesis was that there is no statistically significant difference for buccal scan procedures compared to registration methods with poured model casts. Ten individuals (full dentition, no dental rehabilitations) were subjects for five different habitual intercuspation registration methods: (CI) poured model casts, manual hand registration, buccal scan with inEOS X5; (BC) intraoral scan, buccal scan with CEREC Bluecam; (OC4.2) intraoral scan, buccal scan with CEREC Omnicam software version 4.2; (OC4.5β) intraoral scan, buccal scan with CEREC Omnicam version 4.5β; and (TR) intraoral scan, buccal scan with Trios 3. Buccal scan was repeated three times. Analysis of rotation (Rot) and translation (Trans) parameters was performed with difference analysis software (OraCheck). Statistical analysis was performed with one-way analysis of variance and the post hoc Scheffé test ( p<0.05). Statistical analysis showed no significant ( p>0.05) differences in terms of translation between groups CI_Trans (98.74±112.01 μm), BC_Trans (84.12±64.95 μm), OC4.2_Trans (60.70±35.08 μm), OC4.5β_Trans (68.36±36.67 μm), and TR_Trans (66.60±64.39 μm). For rotation, there were no significant differences ( p>0.05) for groups CI_Rot (0.23±0.25°), BC_Rot (0.73±0.52°), OC4.2_Rot (0.45±0.31°), OC4.5β_Rot (0.50±0.36°), and TR_Rot (0.47±0.65°). Intraoral scanning devices allow the reproduction of the static relationship of the maxillary and mandibular teeth with the same accuracy as registration methods with poured model casts.
Core, Cynthia; Brown, Janean W; Larsen, Michael D; Mahshie, James
2014-01-01
The objectives of this research were to determine whether an adapted version of a Hybrid Visual Habituation procedure could be used to assess speech perception of phonetic and prosodic features of speech (vowel height, lexical stress, and intonation) in individual pre-school-age children who use cochlear implants. Nine children ranging in age from 3;4 to 5;5 participated in this study. Children were prelingually deaf and used cochlear implants and had no other known disabilities. Children received two speech feature tests using an adaptation of a Hybrid Visual Habituation procedure. Seven of the nine children demonstrated perception of at least one speech feature using this procedure, based on results from a Bayesian linear regression analysis. At least one child demonstrated perception of each speech feature using this assessment procedure. An adapted version of the Hybrid Visual Habituation Procedure with an appropriate statistical analysis provides a way to assess phonetic and prosodic aspects of speech in pre-school-age children who use cochlear implants.
Taylor, Sandra L; Ruhaak, L Renee; Weiss, Robert H; Kelly, Karen; Kim, Kyoungmi
2017-01-01
High-throughput mass spectrometry (MS) is now being used to profile small molecular compounds across multiple biological sample types from the same subjects with the goal of leveraging information across biospecimens. Multivariate statistical methods that combine information from all biospecimens could be more powerful than the usual univariate analyses. However, missing values are common in MS data and imputation can impact between-biospecimen correlation and multivariate analysis results. We propose two multivariate two-part statistics that accommodate missing values and combine data from all biospecimens to identify differentially regulated compounds. Statistical significance is determined using a multivariate permutation null distribution. Relative to univariate tests, the multivariate procedures detected more significant compounds in three biological datasets. In a simulation study, we showed that multi-biospecimen testing procedures were more powerful than single-biospecimen methods when compounds are differentially regulated in multiple biospecimens, but univariate methods can be more powerful if compounds are differentially regulated in only one biospecimen. We provide R functions to implement and illustrate our method as supplementary information.
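As a generic sketch of permutation-based multivariate testing (a max-t statistic across compounds within one biospecimen, not the authors' two-part statistic), the code below builds the permutation null by shuffling group labels; the group sizes, effect size, and number of permutations are arbitrary assumptions.

```python
import numpy as np

def permutation_max_t(x, y, n_perm=2000, seed=0):
    """Two-group test over many compounds: per-compound Welch t statistics,
    combined by their maximum, with family-wise adjusted p-values from a
    label-permutation null distribution."""
    rng = np.random.default_rng(seed)
    data = np.vstack([x, y])
    labels = np.r_[np.zeros(len(x), bool), np.ones(len(y), bool)]

    def abs_t(lab):
        a, b = data[~lab], data[lab]
        se = np.sqrt(a.var(0, ddof=1) / len(a) + b.var(0, ddof=1) / len(b))
        return np.abs((a.mean(0) - b.mean(0)) / se)

    observed = abs_t(labels)
    null_max = np.array([abs_t(rng.permutation(labels)).max()
                         for _ in range(n_perm)])
    return (null_max[:, None] >= observed[None, :]).mean(axis=0)

# Hypothetical intensities: 20 vs 20 subjects, 50 compounds, a mean shift of 0.8.
rng = np.random.default_rng(7)
p_adj = permutation_max_t(rng.normal(0, 1, (20, 50)),
                          rng.normal(0.8, 1, (20, 50)))
print((p_adj < 0.05).sum(), "compounds flagged at the 0.05 level")
```

The same permutation logic extends to statistics that pool evidence across biospecimens, which is where the power gains reported above come from.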
75 FR 53738 - Proposed Collection; Comment Request for Rev. Proc. 2007-35
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-01
... Revenue Procedure Revenue Procedure 2007-35, Statistical Sampling for purposes of Section 199. DATES... through the Internet, at [email protected] . SUPPLEMENTARY INFORMATION: Title: Statistical Sampling...: This revenue procedure provides for determining when statistical sampling may be used in purposes of...
Dai, Hongying; Wu, Guodong; Wu, Michael; Zhi, Degui
2016-01-01
Next-generation sequencing data pose a severe curse of dimensionality, complicating traditional "single marker-single trait" analysis. We propose a two-stage combined p-value method for pathway analysis. The first stage is at the gene level, where we integrate effects within a gene using the Sequence Kernel Association Test (SKAT). The second stage is at the pathway level, where we perform a correlated Lancaster procedure to detect joint effects from multiple genes within a pathway. We show that the Lancaster procedure is optimal in Bahadur efficiency among all combined p-value methods. The Bahadur efficiency compares sample sizes among different statistical tests when signals become sparse in sequencing data, i.e., ε → 0. The optimal Bahadur efficiency ensures that the Lancaster procedure asymptotically requires a minimal sample size to detect sparse signals. The Lancaster procedure can also be applied to meta-analysis. Extensive empirical assessments of exome sequencing data show that the proposed method outperforms Gene Set Enrichment Analysis (GSEA). We applied the competitive Lancaster procedure to meta-analysis data generated by the Global Lipids Genetics Consortium to identify pathways significantly associated with high-density lipoprotein cholesterol, low-density lipoprotein cholesterol, triglycerides, and total cholesterol.
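A minimal sketch of the (independence-case) Lancaster combination that underlies the second stage: each gene-level p-value is transformed with a chi-square quantile whose degrees of freedom equal its weight, the transformed values are summed, and the total is referred to a chi-square with the summed degrees of freedom. The gene p-values and weights below are hypothetical, and the correlation adjustment used in the paper is not included.

```python
import numpy as np
from scipy import stats

def lancaster_combine(pvals, weights):
    """Lancaster's weighted generalisation of Fisher's method for combining
    independent p-values into a single pathway-level p-value."""
    pvals = np.asarray(pvals, dtype=float)
    weights = np.asarray(weights, dtype=float)
    t = np.sum(stats.chi2.ppf(1.0 - pvals, df=weights))
    return stats.chi2.sf(t, df=weights.sum())

# Hypothetical SKAT gene-level p-values in one pathway, weighted by gene size.
gene_p = [0.01, 0.20, 0.03, 0.50]
gene_weight = [8, 4, 6, 2]      # e.g., number of variants per gene
print("pathway p-value:", round(lancaster_combine(gene_p, gene_weight), 4))
```

Setting all weights to 2 recovers Fisher's classical method, which is one way to see Lancaster's procedure as its weighted generalisation.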
NASA Astrophysics Data System (ADS)
Ishida, Shigeki; Mori, Atsuo; Shinji, Masato
The main method of reducing the blasting noise that occurs in a tunnel under construction is to install a sound insulation door in the tunnel. However, a numerical analysis technique to accurately predict the effect of the transmission loss of the sound insulation door has not been established. In this study, we measured the blasting noise and the vibration of the sound insulation door in a tunnel during blasting, performed an analysis, and modified the acoustic features accordingly. In addition, we reproduced the noise reduction effect of the sound insulation door using the statistical energy analysis method and confirmed that numerical simulation is possible with this procedure.
Preservation of Mercury in Polyethylene Containers.
ERIC Educational Resources Information Center
Piccolino, Samuel Paul
1983-01-01
Reports results of experiments favoring use of 0.5 percent nitric acid with an oxidant (potassium dichromate or potassium permanganate) to preserve samples in polyethylene containers for mercury analysis. Includes procedures used and statistical data obtained from the experiments. (JN)
14 CFR 23.621 - Casting factors.
Code of Federal Regulations, 2013 CFR
2013-01-01
... either magnetic particle, penetrant or other approved equivalent non-destructive inspection method; or... percent approved non-destructive inspection. When an approved quality control procedure is established and an acceptable statistical analysis supports reduction, non-destructive inspection may be reduced from...
14 CFR 23.621 - Casting factors.
Code of Federal Regulations, 2012 CFR
2012-01-01
... either magnetic particle, penetrant or other approved equivalent non-destructive inspection method; or... percent approved non-destructive inspection. When an approved quality control procedure is established and an acceptable statistical analysis supports reduction, non-destructive inspection may be reduced from...
14 CFR 23.621 - Casting factors.
Code of Federal Regulations, 2014 CFR
2014-01-01
... either magnetic particle, penetrant or other approved equivalent non-destructive inspection method; or... percent approved non-destructive inspection. When an approved quality control procedure is established and an acceptable statistical analysis supports reduction, non-destructive inspection may be reduced from...
14 CFR 23.621 - Casting factors.
Code of Federal Regulations, 2010 CFR
2010-01-01
... either magnetic particle, penetrant or other approved equivalent non-destructive inspection method; or... percent approved non-destructive inspection. When an approved quality control procedure is established and an acceptable statistical analysis supports reduction, non-destructive inspection may be reduced from...
Sarode, D; Bari, D A; Cain, A C; Syed, M I; Williams, A T
2017-04-01
To critically evaluate the evidence comparing success rates of endonasal dacryocystorhinostomy (EN-DCR) with and without silicone tubing and thus to determine whether silicone intubation is beneficial in primary EN-DCR. Systematic review and meta-analysis. A literature search was performed on AMED, EMBASE, HMIC, MEDLINE, PsycINFO, BNI, CINAHL, HEALTH BUSINESS ELITE, CENTRAL and the Cochrane Ear, Nose and Throat Disorders Group trials register using a combination of various MeSH terms. The date of last search was January 2016. This review was limited to randomised controlled trials (RCTs) in the English language. Risk of bias was assessed using the Cochrane Collaboration's risk of bias tool. Chi-square and I² statistics were calculated to determine the presence and extent of statistical heterogeneity. Study selection, data extraction and risk of bias scoring were performed independently by two authors in concordance with the PRISMA statement. Five RCTs (447 primary EN-DCR procedures in 426 patients) were included for analysis. Moderate interstudy statistical heterogeneity was demonstrated (Chi² = 6.18; d.f. = 4; I² = 35%). Bicanalicular silicone stents were used in 229 and not used in 218 procedures. The overall success rate of EN-DCR was 92.8% (415/447). The success rate of EN-DCR was 93.4% (214/229) with silicone tubing and 92.2% (201/218) without silicone tubing. Meta-analysis using a random-effects model showed no statistically significant difference in outcomes between the two groups (P = 0.63; RR = 0.79; 95% CI = 0.3-2.06). Our review and meta-analysis did not demonstrate an additional advantage of silicone stenting. A high-quality, well-powered prospective multicentre RCT is needed to further clarify the benefit of silicone stents.
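As a generic illustration of the heterogeneity and random-effects quantities reported above (Chi², I², pooled risk ratio), the sketch below applies the DerSimonian-Laird estimator to made-up per-trial counts; the event counts are placeholders, not the five included RCTs.

```python
import numpy as np
from scipy import stats

# Hypothetical successes / procedures per trial, with and without tubes.
e1, n1 = np.array([42, 45, 40, 44, 43]), np.array([46, 48, 44, 46, 45])
e0, n0 = np.array([40, 41, 39, 42, 39]), np.array([44, 44, 43, 44, 43])

# Log risk ratios and their approximate variances.
log_rr = np.log((e1 / n1) / (e0 / n0))
var = 1 / e1 - 1 / n1 + 1 / e0 - 1 / n0

# Cochran's Q and I-squared from fixed-effect weights.
w = 1 / var
q = np.sum(w * (log_rr - np.sum(w * log_rr) / np.sum(w)) ** 2)
df = len(log_rr) - 1
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

# DerSimonian-Laird between-study variance and random-effects pooling.
tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
w_re = 1 / (var + tau2)
pooled = np.sum(w_re * log_rr) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
lo, hi = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
print(f"Q={q:.2f} (p={stats.chi2.sf(q, df):.2f}), I2={i2:.0f}%, "
      f"RR={np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```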
Feature Screening for Ultrahigh Dimensional Categorical Data with Applications.
Huang, Danyang; Li, Runze; Wang, Hansheng
2014-01-01
Ultrahigh dimensional data with both categorical responses and categorical covariates are frequently encountered in the analysis of big data, for which feature screening has become an indispensable statistical tool. We propose a Pearson chi-square based feature screening procedure for categorical response with ultrahigh dimensional categorical covariates. The proposed procedure can be directly applied for detection of important interaction effects. We further show that the proposed procedure possesses screening consistency property in the terminology of Fan and Lv (2008). We investigate the finite sample performance of the proposed procedure by Monte Carlo simulation studies, and illustrate the proposed method by two empirical datasets.
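A minimal sketch of Pearson chi-square marginal screening for categorical covariates; the data dimensions, binary coding, and top-k cut-off are illustrative assumptions (the paper's theory concerns how such a cut-off achieves sure screening).

```python
import numpy as np
from scipy.stats import chi2_contingency

def chi2_screen(X, y, top_k=10):
    """Rank categorical covariates by the Pearson chi-square statistic of
    their contingency table with a categorical response; keep the top_k."""
    scores = []
    for j in range(X.shape[1]):
        table = np.array([[np.sum((X[:, j] == a) & (y == b))
                           for b in np.unique(y)] for a in np.unique(X[:, j])])
        stat, _, _, _ = chi2_contingency(table)
        scores.append(stat)
    return np.argsort(scores)[::-1][:top_k]

# Hypothetical binary covariates (p = 2000) with a binary response driven by
# covariates 3 and 7.
rng = np.random.default_rng(8)
X = rng.integers(0, 2, size=(200, 2000))
y = (X[:, 3] + X[:, 7] + rng.integers(0, 2, 200) > 1).astype(int)
print("top-ranked covariates:", chi2_screen(X, y))
```

Interaction screening, as mentioned above, works the same way after coding each candidate pair of covariates as a single categorical variable.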
NASA Astrophysics Data System (ADS)
Pisano, Luca; Vessia, Giovanna; Vennari, Carmela; Parise, Mario
2015-04-01
Empirical rainfall thresholds are a well-established method to draw information about the Duration (D) and Cumulated (E) values of the rainfalls that are likely to initiate shallow landslides. To this end, rain-gauge records of rainfall heights are commonly used. Several procedures can be applied to address the calculation of the Duration-Cumulated height and, eventually, the Intensity values related to the rainfall events responsible for shallow landslide onset. A large number of procedures are drawn from particular geological settings and climate conditions based on an expert identification of the rainfall event. A few researchers recently devised automated procedures to reconstruct the rainfall events responsible for landslide onset. In this study, 300 D, E pairs, related to shallow landslides that occurred in the ten-year span 2002-2012 on the Italian territory, have been drawn by means of two procedures: the expert method (Brunetti et al., 2010) and the automated method (Vessia et al., 2014). The two procedures start from the same sources of information on shallow landslides that occurred during or soon after a rainfall. Although they have in common the method to select the date (up to the hour of the landslide occurrence), the site of the landslide and the choice of the rain gauge representative of the rainfall, they differ in calculating the Duration and Cumulated height of the rainfall event. Moreover, the expert procedure identifies only one D, E pair for each landslide, whereas the automated procedure draws 6 possible D, E pairs for the same landslide event. Each one of the 300 D, E pairs calculated by the automated procedure reproduces about 80% of the E values and about 60% of the D values calculated by the expert procedure. Unfortunately, no standard methods are available for checking the forecasting ability of either the expert or the automated reconstruction of the true D, E pairs that result in shallow landslides. Nonetheless, a statistical analysis of the marginal distributions of the seven samples of 300 D and E values is performed in this study. The main objective of this statistical analysis is to highlight similarities and differences in the two sets of samples of Duration and Cumulated values collected by the two procedures. At first, the sample distributions have been investigated: the seven E samples are lognormally distributed, whereas the D samples all follow Weibull-like distributions. For the E samples, due to their lognormal distribution, statistical tests can be applied to check two null hypotheses: equal mean values through the Student test and equal standard deviations through the Fisher test. These two hypotheses are accepted for the seven E samples, meaning that they come from the same population, at a confidence level of 95%. Conversely, the preceding tests cannot be applied to the seven D samples, which are Weibull distributed with shape parameters k ranging between 0.9 and 1.2. Nonetheless, the two procedures calculate the rainfall event through the selection of the E values, after which D is drawn. Thus, the results of this statistical analysis preliminarily confirm the similarity of the two sets of D, E values drawn from the two different procedures. References Brunetti, M.T., Peruccacci, S., Rossi, M., Luciani, S., Valigi, D., and Guzzetti, F.: Rainfall thresholds for the possible occurrence of landslides in Italy, Nat. Hazards Earth Syst. Sci., 10, 447-458, doi:10.5194/nhess-10-447-2010, 2010.
Vessia G., Parise M., Brunetti M.T., Peruccacci S., Rossi M., Vennari C., and Guzzetti F.: Automated reconstruction of rainfall events responsible for shallow landslides, Nat. Hazards Earth Syst. Sci., 14, 2399-2408, doi: 10.5194/nhess-14-2399-2014, 2014.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoon Sohn; Charles Farrar; Norman Hunter
2001-01-01
This report summarizes the analysis of fiber-optic strain gauge data obtained from a surface-effect fast patrol boat being studied by the staff at the Norwegian Defense Research Establishment (NDRE) in Norway and the Naval Research Laboratory (NRL) in Washington D.C. Data from two different structural conditions were provided to the staff at Los Alamos National Laboratory. The problem was then approached from a statistical pattern recognition paradigm. This paradigm can be described as a four-part process: (1) operational evaluation, (2) data acquisition & cleansing, (3) feature extraction and data reduction, and (4) statistical model development for feature discrimination. Given that the first two portions of this paradigm were mostly completed by the NDRE and NRL staff, this study focused on data normalization, feature extraction, and statistical modeling for feature discrimination. The feature extraction process began by looking at relatively simple statistics of the signals and progressed to using the residual errors from auto-regressive (AR) models fit to the measured data as the damage-sensitive features. Data normalization proved to be the most challenging portion of this investigation. A novel approach to data normalization, where the residual errors in the AR model are considered to be an unmeasured input and an auto-regressive model with exogenous inputs (ARX) is then fit to portions of the data exhibiting similar waveforms, was successfully applied to this problem. With this normalization procedure, a clear distinction between the two different structural conditions was obtained. A false-positive study was also run, and the procedure developed herein did not yield any false-positive indications of damage. Finally, the results must be qualified by the fact that this procedure has only been applied to very limited data samples. A more complete analysis of additional data taken under various operational and environmental conditions as well as other structural conditions is necessary before one can definitively state that the procedure is robust enough to be used in practice.
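A minimal sketch of the AR-residual feature idea described in the report: fit an AR model to a reference-condition signal and use the residual standard deviation of new signals under that model as a damage-sensitive feature. The simulated 'baseline' and 'altered' signals, the AR order, and the feature choice are illustrative assumptions, and the ARX-based data-normalization step is not reproduced here.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

def one_step_residuals(x, const, coefs):
    """One-step-ahead residuals of series x under fixed AR coefficients."""
    p = len(coefs)
    pred = np.full(x.size - p, const)
    for i, c in enumerate(coefs, start=1):
        pred += c * x[p - i: x.size - i]
    return x[p:] - pred

# Hypothetical strain-gauge records for a reference and a changed condition.
rng = np.random.default_rng(9)
t = np.arange(4000)
baseline = np.sin(0.05 * t) + rng.normal(0, 0.1, t.size)
altered = np.sin(0.05 * t) + 0.4 * np.sin(0.31 * t) + rng.normal(0, 0.1, t.size)

# Fit an AR(p) model to the reference signal only.
order = 10
fit = AutoReg(baseline, lags=order).fit()
const, coefs = fit.params[0], fit.params[1:]

# Residual standard deviation as a simple damage-sensitive feature.
for name, sig in [("baseline", baseline), ("altered", altered)]:
    resid = one_step_residuals(sig, const, coefs)
    print(f"{name}: residual SD = {resid.std(ddof=1):.3f}")
```

A larger residual spread for the altered signal flags a change in structural condition, which is the pattern-recognition step the report builds on.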
van der Krieke, Lian; Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith Gm; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter
2015-08-07
Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher's tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis. Analysis of additional datasets is needed in order to validate and refine the application for general use.
Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith GM; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter
2015-01-01
Background Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. Objective This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. Methods We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher's tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). Results An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Conclusions Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis. Analysis of additional datasets is needed in order to validate and refine the application for general use. PMID:26254160
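To give a flavour of the automated VAR workflow described above, the sketch below uses statsmodels to select a lag order by information criterion and run a Granger causality test on a simulated two-variable diary; the variable names, lag range, and data-generating process are hypothetical, and this is far simpler than AutoVAR's full model search.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Hypothetical EMA diary: 90 daily ratings where activity precedes mood.
rng = np.random.default_rng(10)
n = 90
activity, mood = np.zeros(n), np.zeros(n)
for t in range(1, n):
    activity[t] = 0.5 * activity[t - 1] + rng.normal()
    mood[t] = 0.3 * mood[t - 1] + 0.4 * activity[t - 1] + rng.normal()
data = pd.DataFrame({"activity": activity, "mood": mood})

# Candidate VAR models compared by information criteria (AIC/BIC), mirroring
# in a very reduced form the automated model selection.
model = VAR(data)
print(model.select_order(maxlags=5).summary())
fit = model.fit(maxlags=5, ic="aic")

# Granger causality: does activity help predict mood?
print(fit.test_causality("mood", ["activity"], kind="f").summary())
```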
Precipitate statistics in an Al-Mg-Si-Cu alloy from scanning precession electron diffraction data
NASA Astrophysics Data System (ADS)
Sunde, J. K.; Paulsen, Ø.; Wenner, S.; Holmestad, R.
2017-09-01
The key microstructural feature providing strength to age-hardenable Al alloys is nanoscale precipitates. Alloy development requires a reliable statistical assessment of these precipitates, in order to link the microstructure with material properties. Here, it is demonstrated that scanning precession electron diffraction combined with computational analysis enable the semi-automated extraction of precipitate statistics in an Al-Mg-Si-Cu alloy. Among the main findings is the precipitate number density, which agrees well with a conventional method based on manual counting and measurements. By virtue of its data analysis objectivity, our methodology is therefore seen as an advantageous alternative to existing routines, offering reproducibility and efficiency in alloy statistics. Additional results include improved qualitative information on phase distributions. The developed procedure is generic and applicable to any material containing nanoscale precipitates.
A Class of Population Covariance Matrices in the Bootstrap Approach to Covariance Structure Analysis
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Hayashi, Kentaro; Yanagihara, Hirokazu
2007-01-01
Model evaluation in covariance structure analysis is critical before the results can be trusted. Due to finite sample sizes and unknown distributions of real data, existing conclusions regarding a particular statistic may not be applicable in practice. The bootstrap procedure automatically takes care of the unknown distribution and, for a given…
ERIC Educational Resources Information Center
Choi, Jinnie
2017-01-01
This article reviews PROC IRT, which was added to Statistical Analysis Software in 2014. We provide an introductory overview of a free version of SAS, describe what PROC IRT offers for item response theory (IRT) analysis and how one can use PROC IRT, and discuss how other SAS macros and procedures may compensate the IRT functionalities of PROC IRT.
Sample Size Calculations for Precise Interval Estimation of the Eta-Squared Effect Size
ERIC Educational Resources Information Center
Shieh, Gwowen
2015-01-01
Analysis of variance is one of the most frequently used statistical analyses in the behavioral, educational, and social sciences, and special attention has been paid to the selection and use of an appropriate effect size measure of association in analysis of variance. This article presents the sample size procedures for precise interval estimation…
The NBS Energy Model Assessment project: Summary and overview
NASA Astrophysics Data System (ADS)
Gass, S. I.; Hoffman, K. L.; Jackson, R. H. F.; Joel, L. S.; Saunders, P. B.
1980-09-01
The activities and technical reports for the project are summarized. The reports cover: assessment of the documentation of Midterm Oil and Gas Supply Modeling System; analysis of the model methodology characteristics of the input and other supporting data; statistical procedures undergirding construction of the model and sensitivity of the outputs to variations in input, as well as guidelines and recommendations for the role of these in model building and developing procedures for their evaluation.
Automated Box-Cox Transformations for Improved Visual Encoding.
Maciejewski, Ross; Pattath, Avin; Ko, Sungahn; Hafen, Ryan; Cleveland, William S; Ebert, David S
2013-01-01
The concept of preconditioning data (utilizing a power transformation as an initial step) for analysis and visualization is well established within the statistical community and is employed as part of statistical modeling and analysis. Such transformations condition the data to various inherent assumptions of statistical inference procedures, as well as making the data more symmetric and easier to visualize and interpret. In this paper, we explore the use of the Box-Cox family of power transformations to semiautomatically adjust visual parameters. We focus on time-series scaling, axis transformations, and color binning for choropleth maps. We illustrate the usage of this transformation through various examples, and discuss the value and some issues in semiautomatically using these transformations for more effective data visualization.
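A minimal sketch of the core preconditioning step, the maximum-likelihood Box-Cox transform in SciPy; the simulated skewed values stand in for a real time series or choropleth variable.

```python
import numpy as np
from scipy import stats

# Hypothetical right-skewed, strictly positive values.
rng = np.random.default_rng(11)
values = rng.lognormal(mean=2.0, sigma=0.8, size=500)

# Box-Cox transform with lambda chosen by maximum likelihood
# (lambda = 0 corresponds to a log transform).
transformed, lam = stats.boxcox(values)
print(f"estimated lambda = {lam:.2f}")

# The transformed values are more symmetric, which eases axis scaling and
# class-break (color bin) selection.
print("skewness before/after:",
      round(stats.skew(values), 2), round(stats.skew(transformed), 2))
```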
Aksamija, Goran; Mulabdic, Adi; Rasic, Ismar; Muhovic, Samir; Gavric, Igor
2011-01-01
Polytrauma is defined as an injury in which at least two different organ systems or body regions are affected, with at least one life-threatening injury. Given the multilevel model of care for polytrauma patients within KCUS, weaknesses in the management of this category of patients are inevitable. The aims were to determine the dynamics of the existing procedures in the treatment of polytrauma patients on admission to KCUS and, based on statistical analysis of the applied variables, to determine and define the factors that influence the final outcome of treatment and their mutual relationships, which may help eliminate flaws in the approach to the problem. The study was based on 263 polytrauma patients. Parametric and non-parametric statistical methods were used. Basic statistics were calculated and, based on the calculated parameters, multicorrelation analysis, image analysis, discriminant analysis and multifactorial analysis were used to achieve the research objectives. From the universe of variables for this study we selected a sample of n = 25 variables, of which the first two were modular, while the others belong to the common measurement space (n = 23) and are defined in this paper as a system of variables describing the methods, procedures and assessments applied to polytrauma patients. After the multicorrelation analysis, since the image analysis gave reliable measurement results, we proceeded to the analysis of eigenvalues, that is, to defining the factors that provide information about the existing model and its correlation with treatment outcome. The study singled out the essential factors that determine the current organizational model of care, which may affect the treatment and improve the outcome of polytrauma patients. This analysis showed the maximum correlative relationships between these practices and contributed to the development of guidelines defined by the isolated factors.
Damron, T A; McBeath, A A
1995-04-01
With the increasing duration of follow up on total knee arthroplasties, more revision arthroplasties are being performed. When revision is not advisable, a salvage procedure such as arthrodesis or resection arthroplasty is indicated. This article provides a comprehensive review of the literature regarding arthrodesis following failed total knee arthroplasty. In addition, a statistical meta-analysis of five studies using modern arthrodesis techniques is presented. A statistically significant greater fusion rate with intramedullary nail arthrodesis compared to external fixation is documented. Gram negative and mixed infections are found to be significant risk factors for failure of arthrodesis.
Khanna, Rajesh; Handa, Aashish; Virk, Rupam Kaur; Ghai, Deepika; Handa, Rajni Sharma; Goel, Asim
2017-01-01
Background: Cleaning and shaping the canal is not an easy goal to achieve, as canal curvature plays a significant role during instrumentation of curved canals. Aim: The present in vivo study was conducted to evaluate procedural errors during the preparation of curved root canals using hand Nitiflex and rotary K3XF instruments. Materials and Methods: Procedural errors such as ledge formation, instrument separation, and perforation (apical, furcal, strip) were determined in sixty patients, divided into two groups. In Group I, thirty teeth in thirty patients were prepared using the hand Nitiflex system, and in Group II, thirty teeth in thirty patients were prepared using the rotary K3XF system. The evaluation was done clinically as well as radiographically. The results recorded from both groups were compiled and subjected to statistical analysis. Statistical Analysis: The chi-square test was used to compare the procedural errors (instrument separation, ledge formation, and perforation). Results: Both hand Nitiflex and rotary K3XF showed ledge formation and instrument separation, although ledge formation and instrument separation with the rotary K3XF file system were less frequent than with hand Nitiflex. No perforation was seen in either instrument group. Conclusion: Canal curvature plays a significant role during the instrumentation of curved canals. Procedural errors such as ledge formation and instrument separation were less frequent with the rotary K3XF file system than with hand Nitiflex. PMID:29042727
Stulberg, Jonah J; Pavey, Emily S; Cohen, Mark E; Ko, Clifford Y; Hoyt, David B; Bilimoria, Karl Y
2017-02-01
Changes to resident duty hour policies in the Flexibility in Duty Hour Requirements for Surgical Trainees (FIRST) trial could impact hospitalized patients' length of stay (LOS) by altering care coordination. Length of stay can also serve as a reflection of all complications, particularly those not captured in the FIRST trial (eg, pneumothorax from central line). Programs were randomized to either maintaining current ACGME duty hour policies (Standard arm) or more flexible policies waiving rules on maximum shift lengths and time off between shifts (Flexible arm). Our objective was to determine whether flexibility in resident duty hours affected LOS in patients undergoing high-risk surgical operations. Patients were identified who underwent hepatectomy, pancreatectomy, laparoscopic colectomy, open colectomy, or ventral hernia repair (2014-2015 academic year) at 154 hospitals participating in the FIRST trial. Two procedure-stratified evaluations of LOS were undertaken: multivariable negative binomial regression analysis on LOS and a multivariable logistic regression analysis on the likelihood of a prolonged LOS (>75th percentile). Before any adjustments, there was no statistically significant difference in overall mean LOS between study arms (Flexible Policy: mean [SD] LOS 6.03 [5.78] days vs Standard Policy: mean LOS 6.21 [5.82] days; p = 0.74). In adjusted analyses, there was no statistically significant difference in LOS between study arms overall (incidence rate ratio for Flexible vs Standard: 0.982; 95% CI, 0.939-1.026; p = 0.41) or for any individual procedures. In addition, there was no statistically significant difference in the proportion of patients with prolonged LOS between study arms overall (Flexible vs Standard: odds ratio = 1.028; 95% CI, 0.871-1.212) or for any individual procedures. Duty hour flexibility had no statistically significant effect on LOS in patients undergoing complex intra-abdominal operations. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
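For readers unfamiliar with the modelling approach named above, the following is a hedged sketch of a negative binomial regression of LOS on a study-arm indicator; the data frame, column names and covariates are hypothetical and not from the FIRST trial.

```python
# Hedged sketch of a negative binomial regression of length of stay (LOS) on a
# study-arm indicator; all data and variable names here are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "los_days": rng.negative_binomial(n=3, p=0.35, size=300),
    "flexible_arm": rng.integers(0, 2, size=300),
    "age": rng.normal(60, 12, size=300),
})

model = smf.glm("los_days ~ flexible_arm + age", data=df,
                family=sm.families.NegativeBinomial()).fit()
# Exponentiated coefficient: incidence rate ratio for Flexible vs Standard
print(f"IRR = {np.exp(model.params['flexible_arm']):.3f}")
```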
Portillo, M C; Gonzalez, J M
2008-08-01
Molecular fingerprints of microbial communities are a common method for the analysis and comparison of environmental samples. The significance of differences between microbial community fingerprints was analyzed considering the presence of different phylotypes and their relative abundance. A method is proposed by simulating coverage of the analyzed communities as a function of sampling size applying a Cramér-von Mises statistic. Comparisons were performed by a Monte Carlo testing procedure. As an example, this procedure was used to compare several sediment samples from freshwater ponds using a relative quantitative PCR-DGGE profiling technique. The method was able to discriminate among different samples based on their molecular fingerprints, and confirmed the lack of differences between aliquots from a single sample.
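A hedged sketch of the general idea (not the authors' code): a two-sample Cramér-von Mises-type statistic is compared against its Monte Carlo permutation distribution, using synthetic band-intensity data.

```python
# Hedged sketch of a Monte Carlo permutation test with a Cramér-von Mises-type
# statistic; the "band intensity" data below are synthetic.
import numpy as np

def cvm_stat(x, y):
    """Two-sample Cramér-von Mises statistic computed from empirical CDFs."""
    pooled = np.sort(np.concatenate([x, y]))
    fx = np.searchsorted(np.sort(x), pooled, side="right") / len(x)
    fy = np.searchsorted(np.sort(y), pooled, side="right") / len(y)
    n, m = len(x), len(y)
    return n * m / (n + m) ** 2 * np.sum((fx - fy) ** 2)

rng = np.random.default_rng(2)
a = rng.gamma(2.0, 1.0, size=40)          # relative band intensities, sample A
b = rng.gamma(2.5, 1.0, size=35)          # sample B

observed = cvm_stat(a, b)
pooled = np.concatenate([a, b])
perm = np.empty(9999)
for i in range(perm.size):
    rng.shuffle(pooled)                   # relabel under the null hypothesis
    perm[i] = cvm_stat(pooled[:len(a)], pooled[len(a):])
p_value = (np.sum(perm >= observed) + 1) / (perm.size + 1)
print(f"Monte Carlo p-value = {p_value:.4f}")
```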
TRAN-STAT: statistics for environmental transuranic studies, July 1978, Number 5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This issue is concerned with nonparametric procedures for (1) estimating the central tendency of a population, (2) describing data sets through estimating percentiles, (3) estimating confidence limits for the median and other percentiles, (4) estimating tolerance limits and associated numbers of samples, and (5) tests of significance and associated procedures for a variety of testing situations (counterparts to t-tests and analysis of variance). Some characteristics of several nonparametric tests are illustrated using the NAEG 241Am aliquot data presented and discussed in the April issue of TRAN-STAT. Some of the statistical terms used here are defined in a glossary. The reference list also includes short descriptions of nonparametric books. 31 references, 3 figures, 1 table.
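As a hedged illustration of one of the procedures listed above, the following sketch computes a distribution-free confidence interval for the median from order statistics and the binomial distribution; the aliquot values are synthetic.

```python
# Hedged sketch: distribution-free confidence interval for the median based on
# order statistics and the binomial(n, 0.5) distribution. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = np.sort(rng.lognormal(0.5, 0.8, size=30))        # e.g. 30 aliquot measurements
n = len(x)

j = int(stats.binom.ppf(0.025, n, 0.5))              # 1-based rank of the lower limit
ci_low, ci_high = x[j - 1], x[n - j]                 # j-th and (n-j+1)-th order statistics
coverage = 1 - 2 * stats.binom.cdf(j - 1, n, 0.5)    # exact (conservative) coverage
print(f"median = {np.median(x):.2f}, "
      f"CI = ({ci_low:.2f}, {ci_high:.2f}), coverage = {coverage:.3f}")
```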
Morelli, Luca; Guadagni, Simone; Lorenzoni, Valentina; Di Franco, Gregorio; Cobuccio, Luigi; Palmeri, Matteo; Caprili, Giovanni; D'Isidoro, Cristiano; Moglia, Andrea; Ferrari, Vincenzo; Di Candio, Giulio; Mosca, Franco; Turchetti, Giuseppe
2016-09-01
The aim of this study is to compare surgical parameters and the costs of robotic surgery with those of laparoscopic approach in rectal cancer based on a single surgeon's early robotic experience. Data from 25 laparoscopic (LapTME) and the first 50 robotic (RobTME) rectal resections performed at our institution by an experienced laparoscopic surgeon (>100 procedures) between 2009 and 2014 were retrospectively analyzed and compared. Patient demographic, procedure, and outcome data were gathered. Costs of the two procedures were collected, differentiated into fixed and variable costs, and analyzed against the robotic learning curve according to the cumulative sum (CUSUM) method. Based on CUSUM analysis, RobTME group was divided into three phases (Rob1: 1-19; Rob2: 20-40; Rob3: 41-50). Overall median operative time (OT) was significantly lower in LapTME than in RobTME (270 vs 312.5 min, p = 0.006). A statistically significant change in OT by phase of robotic experience was detected in the RobTME group (p = 0.010). Overall mean costs associated with LapTME procedures were significantly lower than with RobTME (p < 0.001). Statistically significant reductions in variable and overall costs were found between robotic phases (p < 0.009 for both). With fixed costs excluded, the difference between laparoscopic and Rob3 was no longer statistically significant. Our results suggest a significant optimization of robotic rectal surgery's costs with experience. Efforts to reduce the dominant fixed cost are recommended to maintain the sustainability of the system and benefit from the technical advantages offered by the robot.
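A hedged sketch of a CUSUM learning-curve analysis of operative time in the spirit of the method described above; the case times and the use of the overall mean as the reference value are assumptions for illustration only.

```python
# Hedged sketch of a CUSUM learning-curve analysis of operative time (OT).
import numpy as np

rng = np.random.default_rng(4)
# Simulated OT (min) for 50 consecutive robotic cases, improving with experience
ot = np.concatenate([rng.normal(340, 30, 19),
                     rng.normal(310, 30, 21),
                     rng.normal(280, 25, 10)])

target = ot.mean()                        # reference value (here simply the overall mean)
cusum = np.cumsum(ot - target)            # rises while OT exceeds the target, then falls

peak_case = int(np.argmax(cusum)) + 1
print(f"CUSUM peaks at case {peak_case}; changes in slope around such points "
      "mark transitions between learning-curve phases")
```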
Completely automated modal analysis procedure based on the combination of different OMA methods
NASA Astrophysics Data System (ADS)
Ripamonti, Francesco; Bussini, Alberto; Resta, Ferruccio
2018-03-01
In this work a completely automated output-only Modal Analysis procedure is presented and all its benefits are listed. Based on the merging of different Operational Modal Analysis methods and a statistical approach, the identification process has been improved becoming more robust and giving as results only the real natural frequencies, damping ratios and mode shapes of the system. The effect of the temperature can be taken into account as well, leading to the creation of a better tool for automated Structural Health Monitoring. The algorithm has been developed and tested on a numerical model of a scaled three-story steel building present in the laboratories of Politecnico di Milano.
Engineering Students Designing a Statistical Procedure for Quantifying Variability
ERIC Educational Resources Information Center
Hjalmarson, Margret A.
2007-01-01
The study examined first-year engineering students' responses to a statistics task that asked them to generate a procedure for quantifying variability in a data set from an engineering context. Teams used technological tools to perform computations, and their final product was a ranking procedure. The students could use any statistical measures,…
Kholeif, S A
2001-06-01
A new method that belongs to the differential category for determining the end points from potentiometric titration curves is presented. It uses a preprocess to find first derivative values by fitting four data points in and around the region of inflection to a non-linear function, and then locates the end point, usually as a maximum or minimum, using an inverse parabolic interpolation procedure that has an analytical solution. The behavior and accuracy of the sigmoid and cumulative non-linear functions used are investigated against three factors. A statistical evaluation of the new method using linear least-squares method validation and multifactor data analysis is covered. The new method is generally applied to symmetrical and unsymmetrical potentiometric titration curves, and the end point is calculated using numerical procedures only. It outperforms the "parent" regular differential method at almost all factor levels and gives accurate results comparable to the true or estimated true end points. Calculated end points from selected experimental titration curves compatible with the equivalence point category of methods, such as Gran or Fortuin, are also compared with the new method.
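The inverse parabolic interpolation step has a simple closed form; the following hedged sketch (not the author's code) locates the extremum of a parabola through three first-derivative points, with made-up titration numbers.

```python
# Hedged sketch: closed-form extremum of a parabola through three (x, y) points,
# used to refine an end-point estimate from first-derivative data.
import numpy as np

def parabola_vertex(x, y):
    """x, y: three points bracketing a maximum/minimum of the derivative curve."""
    x1, x2, x3 = x
    y1, y2, y3 = y
    num = (x2 - x1) ** 2 * (y2 - y3) - (x2 - x3) ** 2 * (y2 - y1)
    den = (x2 - x1) * (y2 - y3) - (x2 - x3) * (y2 - y1)
    return x2 - 0.5 * num / den           # abscissa of the fitted parabola's extremum

# Example: first-derivative values dE/dV peaking near V = 10.2 mL (synthetic)
v = np.array([10.0, 10.2, 10.4])
dEdV = np.array([48.0, 55.0, 46.0])
print(f"estimated end point ~ {parabola_vertex(v, dEdV):.3f} mL")
```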
Nevada Applied Ecology Group procedures handbook for environmental transuranics
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, M.G.; Dunaway, P.B.
The activities of the Nevada Applied Ecology Group (NAEG) integrated research studies of environmental plutonium and other transuranics at the Nevada Test Site have required many standardized field and laboratory procedures. These include sampling techniques, collection and preparation, radiochemical and wet chemistry analysis, data bank storage and reporting, and statistical considerations for environmental samples of soil, vegetation, resuspended particles, animals, and others. This document, printed in two volumes, includes most of the Nevada Applied Ecology Group standard procedures, with explanations as to the specific applications involved in the environmental studies. Where there is more than one document concerning a procedure, it has been included to indicate special studies or applications perhaps more complex than the routine standard sampling procedures utilized.
NASA Technical Reports Server (NTRS)
Van Dongen, H. P.; Olofsen, E.; VanHartevelt, J. H.; Kruyt, E. W.; Dinges, D. F. (Principal Investigator)
1999-01-01
Periodogram analysis of unequally spaced time-series, as part of many biological rhythm investigations, is complicated. The mathematical framework is scattered over the literature, and the interpretation of results is often debatable. In this paper, we show that the Lomb-Scargle method is the appropriate tool for periodogram analysis of unequally spaced data. A unique procedure of multiple period searching is derived, facilitating the assessment of the various rhythms that may be present in a time-series. All relevant mathematical and statistical aspects are considered in detail, and much attention is given to the correct interpretation of results. The use of the procedure is illustrated by examples, and problems that may be encountered are discussed. It is argued that, when following the procedure of multiple period searching, we can even benefit from the unequal spacing of a time-series in biological rhythm research.
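A hedged sketch of a Lomb-Scargle periodogram for unequally spaced data using scipy; the sampling times and the 24-hour rhythm below are synthetic.

```python
# Hedged sketch: Lomb-Scargle periodogram of an unequally spaced time-series.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(5)
t = np.sort(rng.uniform(0, 240, size=180))            # irregular sampling times (hours)
y = 2.0 * np.sin(2 * np.pi * t / 24.0) + rng.normal(0, 1, t.size)

periods = np.linspace(4.0, 48.0, 2000)                # candidate periods (hours)
power = lombscargle(t, y - y.mean(), 2 * np.pi / periods)   # angular frequencies
print(f"strongest periodicity at ~{periods[np.argmax(power)]:.1f} h")
```

For multiple period searching as described above, one would remove the fitted rhythm and recompute the periodogram on the residuals, repeating until no significant peak remains.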
Extracting chemical information from high-resolution Kβ X-ray emission spectroscopy
NASA Astrophysics Data System (ADS)
Limandri, S.; Robledo, J.; Tirao, G.
2018-06-01
High-resolution X-ray emission spectroscopy allows studying the chemical environment of a wide variety of materials. Chemical information can be obtained by fitting the X-ray spectra and observing the behavior of some spectral features. Spectral changes can also be quantified by means of statistical parameters calculated by considering the spectrum as a probability distribution. Another possibility is to perform statistical multivariate analysis, such as principal component analysis. In this work the performance of these procedures for extracting chemical information from X-ray emission spectra of mixtures of Mn2+ and Mn4+ oxides is studied. A detailed analysis of the parameters obtained, as well as of the associated uncertainties, is shown. The methodologies are also applied to Mn oxidation state characterization of the double perovskite oxides Ba1+xLa1-xMnSbO6 (with 0 ≤ x ≤ 0.7). The results show that statistical parameters and multivariate analysis are the most suitable for the analysis of this kind of spectra.
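As a hedged illustration of the multivariate option discussed above, the sketch below applies principal component analysis to synthetic spectra built as linear mixtures of two reference line shapes standing in for the Mn2+ and Mn4+ end members; the energy grid and line positions are invented.

```python
# Hedged sketch: PCA applied to a matrix of synthetic two-component spectra.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
energy = np.linspace(6470, 6500, 300)                        # eV, hypothetical grid
ref1 = np.exp(-0.5 * ((energy - 6485.0) / 2.0) ** 2)          # end member 1 line shape
ref2 = np.exp(-0.5 * ((energy - 6487.5) / 2.0) ** 2)          # end member 2 line shape

frac = rng.uniform(0, 1, size=40)                            # mixing fractions
spectra = np.outer(frac, ref1) + np.outer(1 - frac, ref2)
spectra += rng.normal(0, 0.01, spectra.shape)                # counting noise

scores = PCA(n_components=2).fit_transform(spectra)
# |r| close to 1: the first score tracks the oxidation-state mixture almost linearly
print(abs(np.corrcoef(frac, scores[:, 0])[0, 1]))
```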
Plastic Surgery Statistics in the US: Evidence and Implications.
Heidekrueger, Paul I; Juran, Sabrina; Patel, Anup; Tanna, Neil; Broer, P Niclas
2016-04-01
The American Society of Plastic Surgeons publishes yearly procedural statistics, collected through questionnaires and online via tracking operations and outcomes for plastic surgeons (TOPS). The statistics, disaggregated by U.S. region, leave two important factors unaccounted for: (1) the underlying base population and (2) the number of surgeons performing the procedures. The presented analysis puts the regional distribution of surgeries into perspective and contributes to fulfilling the TOPS legislation objectives. ASPS statistics from 2005 to 2013 were analyzed by geographic region in the U.S. Using population estimates from the 2010 U.S. Census Bureau, procedures were calculated per 100,000 population. Then, based on the ASPS member roster, the rate of surgeries per surgeon by region was calculated and the interaction of these two variables was related to each other. In 2013, 1,668,420 esthetic surgeries were performed in the U.S., resulting in the following ASPS ranking: 1st Mountain/Pacific (Region 5; 502,094 procedures, 30 % share), 2nd New England/Middle Atlantic (Region 1; 319,515, 19 %), 3rd South Atlantic (Region 3; 310,441, 19 %), 4th East/West South Central (Region 4; 274,282, 16 %), and 5th East/West North Central (Region 2; 262,088, 16 %). However, considering underlying populations, distribution and ranking appear to be different, displaying a smaller variance in surgical demand. Further, the number of surgeons and rate of procedures show great regional variation. Demand for plastic surgery is influenced by patients' geographic background and varies among U.S. regions. While ASPS data provide important information, additional insight regarding the demand for surgical procedures can be gained by taking certain demographic factors into consideration. This journal requires that the authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.
ERIC Educational Resources Information Center
Bernstein, Michael I.
1982-01-01
Steps a school board can take to minimize the risk of age discrimination suits include reviewing all written policies, forms, files, and collective bargaining agreements for age discriminatory items; preparing a detailed statistical analysis of the age of personnel; and reviewing reduction-in-force procedures. (Author/MLF)
NASA Astrophysics Data System (ADS)
Hu, Chongqing; Li, Aihua; Zhao, Xingyang
2011-02-01
This paper proposes a multivariate statistical analysis approach to processing the instantaneous engine speed signal for the purpose of locating multiple misfire events in internal combustion engines. The state of each cylinder is described with a characteristic vector extracted from the instantaneous engine speed signal following a three-step procedure. These characteristic vectors are treated as the values of various procedure parameters of an engine cycle. Therefore, determination of the occurrence of misfire events and identification of misfiring cylinders can be accomplished by a principal component analysis (PCA) based pattern recognition methodology. The proposed algorithm can be implemented easily in practice because the threshold can be defined adaptively without information about the operating conditions. In addition, the effect of torsional vibration on the engine speed waveform is interpreted as the presence of a super-powerful cylinder, which is also isolated by the algorithm. The misfiring cylinder and the super-powerful cylinder are often adjacent in the firing sequence, so missed detections and false alarms can be avoided effectively by checking the relationship between the cylinders.
Clinical Evaluation of Dental Restorative Materials
1989-01-01
use of an Actuarial Life Table Survival Analysis procedure. The median survival time for anterior composites was 13.5 years, as compared to 12.1 years...dental materials. For the first time in clinical biomaterials research, we used a statistical approach of Survival Analysis which utilized the... analysis has been established to assure uniformity in usage. This scale is now in use by clinical investigators throughout the country. Its use at the
Kepler AutoRegressive Planet Search
NASA Astrophysics Data System (ADS)
Feigelson, Eric
NASA's Kepler mission is the source of more exoplanets than any other instrument, but the discovery depends on complex statistical analysis procedures embedded in the Kepler pipeline. A particular challenge is mitigating irregular stellar variability without loss of sensitivity to faint periodic planetary transits. This proposal presents a two-stage alternative analysis procedure. First, parametric autoregressive ARFIMA models, commonly used in econometrics, remove most of the stellar variations. Second, a novel matched filter is used to create a periodogram from which transit-like periodicities are identified. This analysis procedure, the Kepler AutoRegressive Planet Search (KARPS), is confirming most of the Kepler Objects of Interest and is expected to identify additional planetary candidates. The proposed research will complete application of the KARPS methodology to the prime Kepler mission light curves of 200,000 stars, and compare the results with Kepler Objects of Interest obtained with the Kepler pipeline. We will then conduct a variety of astronomical studies based on the KARPS results. Important subsamples will be extracted including Habitable Zone planets, hot super-Earths, grazing-transit hot Jupiters, and multi-planet systems. Groundbased spectroscopy of poorly studied candidates will be performed to better characterize the host stars. Studies of stellar variability will then be pursued based on KARPS analysis. The autocorrelation function and nonstationarity measures will be used to identify spotted stars at different stages of autoregressive modeling. Periodic variables with folded light curves inconsistent with planetary transits will be identified; they may be eclipsing or mutually-illuminating binary star systems. Classification of stellar variables with KARPS-derived statistical properties will be attempted. KARPS procedures will then be applied to archived K2 data to identify planetary transits and characterize stellar variability.
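A hedged, highly simplified sketch of the two-stage idea: an ARIMA model (used here only as a stand-in for the ARFIMA models in KARPS) whitens a synthetic light curve, and the residuals are phase-folded to find the strongest transit-like dip. None of this is the actual KARPS code.

```python
# Hedged sketch only: ARIMA stands in for ARFIMA, and the "matched filter" is a
# crude phase-folding search. The light curve below is synthetic.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
n = 2000
flux = 1.0 + np.cumsum(rng.normal(0, 1e-4, n))   # slow stellar variability (random walk)
flux[::200] -= 5e-4                              # shallow transits every 200 samples

# Stage 1: remove most of the stellar variation, keep one-step prediction errors
resid = np.asarray(ARIMA(flux, order=(2, 1, 1)).fit().resid)[10:]  # drop start-up points
m = resid.size

# Stage 2: fold the residuals at trial periods and score the deepest phase bin
def dip_strength(period):
    phase = np.arange(m) % period
    means = (np.bincount(phase, weights=resid, minlength=period)
             / np.bincount(phase, minlength=period))
    return -means.min()

periods = np.arange(50, 400)
best = periods[np.argmax([dip_strength(p) for p in periods])]
print(f"strongest transit-like signal at a period of ~{best} samples")
```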
Boriani, Filippo; Villani, Riccardo; Morselli, Paolo Giovanni
2014-10-01
Obesity is increasingly frequent in our society and is associated closely with metabolic disorders. As some studies have suggested, removal of fat tissue through liposuction and dermolipectomies may be of some benefit in the improvement of metabolic indices. This article aimed to review the published literature on this topic and to evaluate metabolic variations meta-analytically after liposuction, dermolipectomy, or both. Through a literature search with the PubMed/Medline database, 14 studies were identified. All articles were analyzed, and several metabolic variables were chosen in the attempt to meta-analyze the effect of adipose tissue removal through the various studies. All statistical calculations were performed with Review Manager (RevMan), version 5.0. Several cardiovascular and metabolic variables are described as prone to variations after body-contouring procedures when a significant amount of adipose tissue has been excised. Four of the studies included in the analysis reported improvements in all the parameters examined. Seven articles showed improvement in some variables and no improvement in others, whereas three studies showed no beneficial variation in any of the considered indicators after body-contouring procedures. Fasting plasma insulin was identified as the only variable for which a meta-analysis of five included studies was possible. The meta-analysis showed a statistically significant reduction in fasting plasma insulin resulting from large-volume liposuction in obese healthy women. Many beneficial metabolic effects resulting from dermolipectomy and liposuction procedures are described in the literature. In particular, fasting plasma insulin and thus insulin sensitivity seem to be positively influenced. Further research, including prospective clinical studies, is necessary for better exploration of the effects that body-contouring plastic surgery procedures have on metabolic parameters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wild, M.; Rouhani, S.
1995-02-01
A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. The use of geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and determine complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs, long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
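As a hedged illustration of the basic geostatistical building block implied above, the following sketch computes an empirical semivariogram of synthetic contaminant concentrations with numpy; the coordinates, concentrations and lag bins are all invented.

```python
# Hedged sketch: empirical semivariogram of synthetic contaminant concentrations.
import numpy as np

rng = np.random.default_rng(8)
xy = rng.uniform(0, 100, size=(120, 2))                  # sample locations (m)
conc = 0.05 * xy[:, 0] + rng.normal(0, 1.0, 120)         # spatial trend + noise

# Pairwise distances and half squared differences (the semivariance cloud)
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
g = 0.5 * (conc[:, None] - conc[None, :]) ** 2
iu = np.triu_indices(len(conc), k=1)                     # each pair counted once

bins = np.arange(0, 60, 10)
for lo, hi in zip(bins[:-1], bins[1:]):
    sel = (d[iu] >= lo) & (d[iu] < hi)
    print(f"lag {lo:>2}-{hi:<2} m: semivariance = {g[iu][sel].mean():.2f}")
```

The semivariance increasing with lag reflects the spatial structure that kriging would exploit to estimate values at unsampled locations.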
Reflectance of vegetation, soil, and water
NASA Technical Reports Server (NTRS)
Wiegand, C. L. (Principal Investigator)
1973-01-01
There are no author-identified significant results in this report. This report deals with the selection of the best channels from the 24-channel aircraft data to represent crop and soil conditions. A three-step procedure has been developed that involves using univariate statistics and an F-ratio test to indicate the best 14 channels. From the 14, the 10 best channels are selected by a multivariate stochastic process. The third step involves the pattern recognition procedures developed in the data analysis plan. Indications are that the procedures in use are satisfactory and will extract the desired information from the data.
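A hedged sketch of an F-ratio screening step of the kind described above, ranking spectral channels by their one-way ANOVA F statistic across classes; the 24-channel data and class labels are synthetic.

```python
# Hedged sketch: rank spectral channels by a one-way ANOVA F-ratio across classes.
import numpy as np
from sklearn.feature_selection import f_classif

rng = np.random.default_rng(9)
n_pixels, n_channels = 600, 24
X = rng.normal(0, 1, (n_pixels, n_channels))
labels = rng.integers(0, 3, n_pixels)                 # e.g. crop / soil / water
X[:, 5] += labels * 1.5                               # make a few channels informative
X[:, 11] += labels * 0.8

F, p = f_classif(X, labels)
best14 = np.argsort(F)[::-1][:14]                     # keep the 14 best channels
print("top channels by F-ratio:", best14)
```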
Content analysis to detect high stress in oral interviews and text documents
NASA Technical Reports Server (NTRS)
Thirumalainambi, Rajkumar (Inventor); Jorgensen, Charles C. (Inventor)
2012-01-01
A system of interrogation to estimate whether a subject of interrogation is likely experiencing high stress, emotional volatility and/or internal conflict in the subject's responses to an interviewer's questions. The system applies one or more of four procedures, a first statistical analysis, a second statistical analysis, a third analysis and a heat map analysis, to identify one or more documents containing the subject's responses for which further examination is recommended. Words in the documents are characterized in terms of dimensions representing different classes of emotions and states of mind, in which the subject's responses that manifest high stress, emotional volatility and/or internal conflict are identified. A heat map visually displays the dimensions manifested by the subject's responses in different colors, textures, geometric shapes or other visually distinguishable indicia.
Nonlinear filtering properties of detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Kiyono, Ken; Tsujimoto, Yutaka
2016-11-01
Detrended fluctuation analysis (DFA) has been widely used for quantifying long-range correlation and fractal scaling behavior. In DFA, to avoid spurious detection of scaling behavior caused by a nonstationary trend embedded in the analyzed time series, a detrending procedure using piecewise least-squares fitting has been applied. However, it has been pointed out that the nonlinear filtering properties involved with detrending may induce instabilities in the scaling exponent estimation. To understand this issue, we investigate the adverse effects of the DFA detrending procedure on the statistical estimation. We show that the detrending procedure using piecewise least-squares fitting results in the nonuniformly weighted estimation of the root-mean-square deviation and that this property could induce an increase in the estimation error. In addition, for comparison purposes, we investigate the performance of a centered detrending moving average analysis with a linear detrending filter and sliding window DFA and show that these methods have better performance than the standard DFA.
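For reference, here is a hedged numpy sketch of the standard DFA estimator discussed above: integrate the series, detrend each window with a least-squares polynomial fit, and read the scaling exponent from a log-log fit of the fluctuation function; the test signal is synthetic white noise.

```python
# Hedged sketch of standard DFA (first-order detrending) on synthetic white noise.
import numpy as np

def dfa(x, scales, order=1):
    y = np.cumsum(x - np.mean(x))                   # integrated (profile) series
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        ms = []
        for seg in segs:
            coef = np.polyfit(t, seg, order)        # piecewise least-squares trend
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))              # fluctuation function F(s)
    return np.array(F)

rng = np.random.default_rng(10)
wn = rng.normal(size=10000)                         # white noise: expect alpha near 0.5
scales = np.unique(np.logspace(1, 3, 20).astype(int))
F = dfa(wn, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"estimated scaling exponent alpha ~ {alpha:.2f}")
```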
Majstorović, Branislava M; Simić, Snezana; Milaković, Branko D; Vucović, Dragan S; Aleksić, Valentina V
2010-01-01
In anaesthesiology, economic aspects have been insufficiently studied. The aim of this paper was to assess the rational choice of anaesthesiological services based on an analysis of their scope, distribution, trend and cost. The costs of anaesthesiological services were calculated on the basis of "unit" prices from the Republic Health Insurance Fund. Data were analysed by methods of descriptive statistics, and statistical significance was tested by Student's t-test and the chi2-test. The number of general anaesthesias was higher and the average duration of general anaesthesia was shorter during 2006 compared to the previous year, without statistical significance (t-test, p = 0.436). Local anaesthesia was used significantly more often (chi2-test, p = 0.001) in emergency surgery than for planned operations. The analysis of total anaesthesiological procedures revealed that the number of procedures increased significantly in ENT and MFH surgery and in ophthalmology, while some reduction was observed in general surgery, orthopaedics and trauma surgery, and cardiovascular surgery (chi2-test, p = 0.000). The number of analgesia procedures was higher than that of other procedures (chi2-test, p = 0.000). The structure of the costs was 24% in neurosurgery, 16% in digestive (general) surgery, 14% in gynaecology and obstetrics, 13% in cardiovascular surgery and 9% in the emergency room. Anaesthesiological service costs were highest in neurosurgery, due to the length of anaesthesia, and in digestive surgery, due to the total number of general anaesthesias performed. It is important to implement pharmacoeconomic studies in all departments, and to separate the anaesthesia services for emergency and planned operations. The disproportion between the number of anaesthesias, the number of surgical interventions and the number of patients in surgical departments gives reason to design a relational database.
Yung, Emmanuel; Wong, Michael; Williams, Haddie; Mache, Kyle
2014-08-01
Randomized clinical trial. Objectives To compare the blood pressure (BP) and heart rate (HR) response of healthy volunteers to posteriorly directed (anterior-to-posterior [AP]) pressure applied to the cervical spine versus placebo. Manual therapists employ cervical spine AP mobilizations for various cervical-shoulder pain conditions. However, there is a paucity of literature describing the procedure, cardiovascular response, and safety profile. Thirty-nine (25 female) healthy participants (mean ± SD age, 24.7 ± 1.9 years) were randomly assigned to 1 of 2 groups. Group 1 received a placebo, consisting of light touch applied to the right C6 costal process. Group 2 received AP pressure at the same location. Blood pressure and HR were measured prior to, during, and after the application of AP pressure. One-way analysis of variance and paired-difference statistics were used for data analysis. There was no statistically significant difference between groups for mean systolic BP, mean diastolic BP, and mean HR (P >.05) for all time points. Within-group comparisons indicated statistically significant differences between baseline and post-AP pressure HR (-2.8 bpm; 95% confidence interval: -4.6, -1.1) and between baseline and post-AP pressure systolic BP (-2.4 mmHg; 95% confidence interval: -3.7, -1.0) in the AP group, and between baseline and postplacebo systolic BP (-2.6 mmHg; 95% confidence interval: -4.2, -1.0) in the placebo group. No participants reported any adverse reactions or side effects within 24 hours of testing. AP pressure caused a statistically significant physiologic response that resulted in a minor drop in HR (without causing asystole or vasodepression) after the procedure, whereas this cardiovascular change did not occur for those in the placebo group. Within both groups, there was a small but statistically significant reduction in systolic BP following the procedure.
Behavior analytic approaches to problem behavior in intellectual disabilities.
Hagopian, Louis P; Gregory, Meagan K
2016-03-01
The purpose of the current review is to summarize recent behavior analytic research on problem behavior in individuals with intellectual disabilities. We have focused our review on studies published from 2013 to 2015, but also included earlier studies that were relevant. Behavior analytic research on problem behavior continues to focus on the use and refinement of functional behavioral assessment procedures and function-based interventions. During the review period, a number of studies reported on procedures aimed at making functional analysis procedures more time efficient. Behavioral interventions continue to evolve, and there were several larger scale clinical studies reporting on multiple individuals. There was increased attention on the part of behavioral researchers to develop statistical methods for analysis of within subject data and continued efforts to aggregate findings across studies through evaluative reviews and meta-analyses. Findings support continued utility of functional analysis for guiding individualized interventions and for classifying problem behavior. Modifications designed to make functional analysis more efficient relative to the standard method of functional analysis were reported; however, these require further validation. Larger scale studies on behavioral assessment and treatment procedures provided additional empirical support for effectiveness of these approaches and their sustainability outside controlled clinical settings.
Which statistics should tropical biologists learn?
Loaiza Velásquez, Natalia; González Lutz, María Isabel; Monge-Nájera, Julián
2011-09-01
Tropical biologists study the richest and most endangered biodiversity on the planet, and in these times of climate change and mega-extinctions, the need for efficient, good quality research is more pressing than in the past. However, the statistical component in research published by tropical authors sometimes suffers from poor-quality data collection, mediocre or bad experimental design, and a rigid and outdated view of data analysis. To suggest improvements in their statistical education, we listed all the statistical tests and other quantitative analyses used in two leading tropical journals, the Revista de Biología Tropical and Biotropica, during a year. The 12 most frequent tests in the articles were: Analysis of Variance (ANOVA), Chi-Square Test, Student's T Test, Linear Regression, Pearson's Correlation Coefficient, Mann-Whitney U Test, Kruskal-Wallis Test, Shannon's Diversity Index, Tukey's Test, Cluster Analysis, Spearman's Rank Correlation Test and Principal Component Analysis. We conclude that statistical education for tropical biologists must abandon the old syllabus based on the mathematical side of statistics and concentrate on the correct selection of these and other procedures and tests, on their biological interpretation and on the use of reliable and friendly freeware. We think that their time will be better spent understanding and protecting tropical ecosystems than trying to learn the mathematical foundations of statistics: in most cases, a well designed one-semester course should be enough for their basic requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
7 CFR 52.38c: Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes (Agriculture; Regulations Governing Inspection and Certification; Sampling).
Code of Federal Regulations, 2011 CFR
2011-01-01
7 CFR 52.38b: Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables (Agriculture; Regulations Governing Inspection and Certification; Sampling).
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-20
... is calculated from tumor data of the cancer bioassays using a statistical extrapolation procedure... carcinogenic concern currently set forth in Sec. 500.84 utilizes a statistical extrapolation procedure that... procedures did not rely on a statistical extrapolation of the data to a 1 in 1 million risk of cancer to test...
Code of Federal Regulations, 2010 CFR
2010-01-01
7 CFR 52.38b: Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables (Agriculture; Regulations Governing Inspection and Certification; Sampling).
Code of Federal Regulations, 2010 CFR
2010-01-01
7 CFR 52.38c: Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes (Agriculture; Regulations Governing Inspection and Certification; Sampling).
ERIC Educational Resources Information Center
Gorman, Dennis M.; Huber, J. Charles, Jr.
2009-01-01
This study explores the possibility that any drug prevention program might be considered "evidence-based" given the use of data analysis procedures that optimize the chance of producing statistically significant results by reanalyzing data from a Drug Abuse Resistance Education (DARE) program evaluation. The analysis produced a number of…
Carter, Laura; Wilson, Stephen; Tumer, Erwin G
2010-01-01
The purpose of this retrospective chart review was to document sedation and analgesic medications administered preoperatively, intraoperatively, and during postanesthesia care for children undergoing dental rehabilitation using general anesthesia (GA). Patient gender, age, procedure type performed, and ASA status were recorded from the medical charts of children undergoing GA for dental rehabilitation. The sedative and analgesic drugs administered pre-, intra-, and postoperatively were recorded. Statistical analysis included descriptive statistics and cross-tabulation. A sample of 115 patients with a mean age of 64 (+/-30) months was studied; 47% were females, and 71% were healthy. Over 80% of the patients were administered medications primarily during pre- and intraoperative phases, with fewer than 25% receiving medications postoperatively. Morphine and fentanyl were the most frequently administered agents intraoperatively. The procedure type, gender, and health status were not statistically associated with the number of agents administered. Younger patients, however, were statistically more likely to receive additional analgesic medications. Our study suggests that a minority of patients have postoperative discomfort in the postanesthesia care unit; mild to moderate analgesics were administered during intraoperative phases of dental rehabilitation.
NASA Technical Reports Server (NTRS)
Bauer, M. E.; Cary, T. K.; Davis, B. J.; Swain, P. H.
1975-01-01
The results of classifications and experiments for the crop identification technology assessment for remote sensing are summarized. Using two analysis procedures, 15 data sets were classified. One procedure used class weights while the other assumed equal probabilities of occurrence for all classes. Additionally, 20 data sets were classified using training statistics from another segment or date. The classification and proportion estimation results of the local and nonlocal classifications are reported. Data also describe several other experiments to provide additional understanding of the results of the crop identification technology assessment for remote sensing. These experiments investigated alternative analysis procedures, training set selection and size, effects of multitemporal registration, spectral discriminability of corn, soybeans, and other, and analyses of aircraft multispectral data.
Akolekar, R; Beta, J; Picciarelli, G; Ogilvie, C; D'Antonio, F
2015-01-01
To estimate procedure-related risks of miscarriage following amniocentesis and chorionic villus sampling (CVS) based on a systematic review of the literature and a meta-analysis. A search of MEDLINE, EMBASE, CINHAL and The Cochrane Library (2000-2014) was performed to review relevant citations reporting procedure-related complications of amniocentesis and CVS. Only studies reporting data on more than 1000 procedures were included in this review to minimize the effect of bias from smaller studies. Heterogeneity between studies was estimated using Cochran's Q, the I(2) statistic and Egger bias. Meta-analysis of proportions was used to derive weighted pooled estimates for the risk of miscarriage before 24 weeks' gestation. Incidence-rate difference meta-analysis was used to estimate pooled procedure-related risks. The weighted pooled risks of miscarriage following invasive procedures were estimated from analysis of controlled studies including 324 losses in 42 716 women who underwent amniocentesis and 207 losses in 8899 women who underwent CVS. The risk of miscarriage prior to 24 weeks in women who underwent amniocentesis and CVS was 0.81% (95% CI, 0.58-1.08%) and 2.18% (95% CI, 1.61-2.82%), respectively. The background rates of miscarriage in women from the control group that did not undergo any procedures were 0.67% (95% CI, 0.46-0.91%) for amniocentesis and 1.79% (95% CI, 0.61-3.58%) for CVS. The weighted pooled procedure-related risks of miscarriage for amniocentesis and CVS were 0.11% (95% CI, -0.04 to 0.26%) and 0.22% (95% CI, -0.71 to 1.16%), respectively. The procedure-related risks of miscarriage following amniocentesis and CVS are much lower than are currently quoted. Copyright © 2014 ISUOG. Published by John Wiley & Sons Ltd.
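As a hedged illustration of the pooling step (simplified here to a fixed-effect, inverse-variance risk difference rather than the incidence-rate difference method actually used), with purely hypothetical study-level counts:

```python
# Hedged sketch: fixed-effect inverse-variance meta-analysis of a risk difference
# (procedure minus control miscarriage risk). Counts below are hypothetical.
import numpy as np

# events, total for procedure arm and control arm in each (hypothetical) study
proc = np.array([[30, 4000], [55, 9000], [12, 1500]])
ctrl = np.array([[25, 4100], [48, 8800], [10, 1600]])

p1 = proc[:, 0] / proc[:, 1]
p0 = ctrl[:, 0] / ctrl[:, 1]
rd = p1 - p0                                               # per-study risk difference
var = p1 * (1 - p1) / proc[:, 1] + p0 * (1 - p0) / ctrl[:, 1]

w = 1 / var                                                # inverse-variance weights
pooled = np.sum(w * rd) / np.sum(w)
se = np.sqrt(1 / np.sum(w))
print(f"pooled risk difference = {100 * pooled:.2f}% "
      f"(95% CI {100 * (pooled - 1.96 * se):.2f}% to {100 * (pooled + 1.96 * se):.2f}%)")
```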
The study was concerned with the identification of the job dimensions underlying the job elements of the Position Analysis Questionnaire (PAQ), Form B...The PAQ is a structured job analysis instrument consisting of 187 worker-oriented job elements which are divided into six a priori major divisions...The statistical procedure of principal components analysis was used to identify the job dimensions of the PAQ. Forty-five job dimensions were
DOT National Transportation Integrated Search
1985-09-01
This report examines the groove wear variability among tires subjected to the Uniform Tire Quality Grading (UTQG) test procedure for determining tire tread wear. The effects of heteroscedasticity (variable variance) on a previously reported sta...
METHODS OF DEALING WITH VALUES BELOW THE LIMIT OF DETECTION USING SAS
Due to limitations of chemical analysis procedures, small concentrations cannot be precisely measured. These concentrations are said to be below the limit of detection (LOD). In statistical analyses, these values are often censored and substituted with a constant value, such ...
Wrestling with Philosophy: Improving Scholarship in Higher Education
ERIC Educational Resources Information Center
Kezar, Adrianna
2004-01-01
Method is usually viewed as completely separate from philosophy or theory, focusing instead on techniques and procedures of interviewing, focus groups, observation, or statistical analysis. Several texts on methodology published recently have added significant sections on philosophy, such as Creswell's (1998) Qualitative inquiry and research…
Sun, Jianguo; Feng, Yanqin; Zhao, Hui
2015-01-01
Interval-censored failure time data occur in many fields including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). For most of these procedures, however, one drawback is that they involve estimation of both regression parameters and baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not need estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study is conducted and indicates that they work well for practical situations.
"Hyperstat": an educational and working tool in epidemiology.
Nicolosi, A
1995-01-01
The work of a researcher in epidemiology is based on studying literature, planning studies, gathering data, analyzing data and writing results. He therefore needs to perform more or less simple calculations, to consult or quote literature, to consult textbooks about certain issues or procedures, and to look up specific formulas. There are no programs conceived as a workstation to assist the different aspects of a researcher's work in an integrated fashion. A hypertextual system was developed which supports different stages of the epidemiologist's work. It combines database management, statistical analysis or planning, and literature searches. The software was developed on Apple Macintosh by using Hypercard 2.1 as a database and HyperTalk as a programming language. The program is structured in 7 "stacks" or files: Procedures; Statistical Tables; Graphs; References; Text; Formulas; Help. Each stack has its own management system with an automated Table of Contents. Stacks contain "cards" which make up the databases and carry executable programs. The programs are of four kinds: association; statistical procedure; formatting (input/output); database management. The system performs general statistical procedures, procedures applicable to epidemiological studies only (follow-up and case-control), and procedures for clinical trials. All commands are given by clicking the mouse on self-explanatory "buttons". In order to perform calculations, the user only needs to enter the data into the appropriate cells and then click on the selected procedure's button. The system has a hypertextual structure. The user can go from a procedure to other cards following the preferred order of succession and according to built-in associations. The user can access different levels of knowledge or information from any stack he is consulting or operating. From every card, the user can go to a selected procedure to perform statistical calculations, to the reference database management system, to the textbook in which all procedures and issues are discussed in detail, to the database of statistical formulas with automated table of contents, to statistical tables with automated table of contents, or to the help module. The program has a very user-friendly interface and leaves the user free to use the same format he would use on paper. The interface does not require special skills. It reflects the Macintosh philosophy of using windows, buttons and mouse. This allows the user to perform complicated calculations without losing the "feel" of the data, to weigh alternatives, and to run simulations. This program shares many features in common with hypertexts. It has an underlying network database where the nodes consist of text, graphics, executable procedures, and combinations of these; the nodes in the database correspond to windows on the screen; the links between the nodes in the database are visible as "active" text or icons in the windows; the text is read by following links and opening new windows. The program is especially useful as an educational tool, directed to medical and epidemiology students. The combination of computing capabilities with a textbook and databases of formulas and literature references makes the program versatile and attractive as a learning tool.
The program is also helpful in the work done at the desk, where the researcher examines results, consults literature, explores different analytic approaches, plans new studies, or writes grant proposals or scientific articles.
Bowling, Mark R; Kohan, Matthew W; Walker, Paul; Efird, Jimmy; Ben Or, Sharon
2015-01-01
Navigational bronchoscopy is utilized to guide biopsies of peripheral lung nodules and place fiducial markers for treatment of limited stage lung cancer with stereotactic body radiotherapy. The type of sedation used for this procedure remains controversial. We performed a retrospective chart review to evaluate the differences of diagnostic yield and overall success of the procedure based on anesthesia type. Electromagnetic navigational bronchoscopy was performed using the superDimension software system. Once the targeted lesion was within reach, multiple tissue samples were obtained. Statistical analysis was used to correlate the yield with the type of sedation among other factors. A successful procedure was defined if a diagnosis was made or a fiducial marker was adequately placed. Navigational bronchoscopy was performed on a total of 120 targeted lesions. The overall complication rate of the procedure was 4.1%. The diagnostic yield and success of the procedure was 74% and 87%, respectively. Duration of the procedure was the only significant difference between the general anesthesia and IV sedation groups (mean, 58 vs. 43 min, P=0.0005). A larger tumor size was associated with a higher diagnostic yield (P=0.032). All other variables in terms of effect on diagnostic yield and an unsuccessful procedure did not meet statistical significance. Navigational bronchoscopy is a safe and effective pulmonary diagnostic tool with relatively low complication rate. The diagnostic yield and overall success of the procedure does not seem to be affected by the type of sedation used.
Statistics in the pharmacy literature.
Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R
2004-09-01
Research in statistical methods is essential for maintenance of high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals. We obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi(2) (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified to facilitate the appropriate appraisal and consequential utilization of the information available in research articles.
Quantitative trait Loci analysis using the false discovery rate.
Benjamini, Yoav; Yekutieli, Daniel
2005-10-01
False discovery rate control has become an essential tool in any study that has a very large multiplicity problem. False discovery rate-controlling procedures have also been found to be very effective in QTL analysis, ensuring reproducible results with few falsely discovered linkages and offering increased power to discover QTL, although their acceptance has been slower than in microarray analysis, for example. The reason is partly because the methodological aspects of applying the false discovery rate to QTL mapping are not well developed. Our aim in this work is to lay a solid foundation for the use of the false discovery rate in QTL mapping. We review the false discovery rate criterion, the appropriate interpretation of the FDR, and alternative formulations of the FDR that appeared in the statistical and genetics literature. We discuss important features of the FDR approach, some stemming from new developments in FDR theory and methodology, which make it especially useful in linkage analysis. We review false discovery rate-controlling procedures--the BH, the resampling procedure, and the adaptive two-stage procedure--and discuss the validity of these procedures in single- and multiple-trait QTL mapping. Finally we argue that the control of the false discovery rate has an important role in suggesting, indicating the significance of, and confirming QTL and present guidelines for its use.
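A hedged sketch of the BH step-up procedure reviewed above, applied to a vector of synthetic genome-wide p-values (this is a generic implementation, not code from the paper):

```python
# Hedged sketch: Benjamini-Hochberg (BH) step-up procedure on synthetic p-values.
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    p = np.asarray(pvals)
    order = np.argsort(p)
    m = len(p)
    thresh = q * np.arange(1, m + 1) / m            # BH step-up boundaries q*i/m
    below = p[order] <= thresh
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    rejected = np.zeros(m, dtype=bool)
    rejected[order[:k]] = True                      # reject the k smallest p-values
    return rejected

rng = np.random.default_rng(11)
p_null = rng.uniform(size=950)                      # markers with no linkage
p_alt = rng.beta(0.5, 30, size=50)                  # markers near hypothetical QTL
p = np.concatenate([p_null, p_alt])

rej = benjamini_hochberg(p, q=0.05)
print(f"{rej.sum()} linkages declared at FDR 5%")
```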
Methods for collection and analysis of aquatic biological and microbiological samples
Greeson, Phillip E.; Ehlke, T.A.; Irwin, G.A.; Lium, B.W.; Slack, K.V.
1977-01-01
Chapter A4 contains methods used by the U.S. Geological Survey to collect, preserve, and analyze waters to determine their biological and microbiological properties. Part 1 discusses biological sampling and sampling statistics. The statistical procedures are accompanied by examples. Part 2 consists of detailed descriptions of more than 45 individual methods, including those for bacteria, phytoplankton, zooplankton, seston, periphyton, macrophytes, benthic invertebrates, fish and other vertebrates, cellular contents, productivity, and bioassays. Each method is summarized, and the application, interferences, apparatus, reagents, collection, analysis, calculations, reporting of results, precision and references are given. Part 3 consists of a glossary. Part 4 is a list of taxonomic references.
Janssen, Dirk P
2012-03-01
Psychologists, psycholinguists, and other researchers using language stimuli have been struggling for more than 30 years with the problem of how to analyze experimental data that contain two crossed random effects (items and participants). The classical analysis of variance does not apply; alternatives have been proposed but have failed to catch on, and a statistically unsatisfactory procedure of using two approximations (known as F(1) and F(2)) has become the standard. A simple and elegant solution using mixed model analysis has been available for 15 years, and recent improvements in statistical software have made mixed model analysis widely available. The aim of this article is to increase the use of mixed models by giving a concise practical introduction and by giving clear directions for undertaking the analysis in the most popular statistical packages. The article also introduces DJMIXED, an add-on package for SPSS, which makes entering the models and reporting their results as straightforward as possible.
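A hedged sketch of a mixed model with crossed random effects for participants and items; expressing the crossed effects through variance components under a single grouping variable is an assumed statsmodels idiom (the article itself works with SPSS), and the reaction-time data and column names are synthetic.

```python
# Hedged sketch: crossed random effects for subjects and items via variance
# components in statsmodels; all data below are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(12)
n_subj, n_item = 20, 20
df = pd.DataFrame([(s, i) for s in range(n_subj) for i in range(n_item)],
                  columns=["subject", "item"])
df["condition"] = rng.integers(0, 2, len(df))
subj_re = rng.normal(0, 30, n_subj)                 # by-participant random intercepts
item_re = rng.normal(0, 20, n_item)                 # by-item random intercepts
df["rt"] = (600 + 25 * df["condition"]
            + subj_re[df["subject"]] + item_re[df["item"]]
            + rng.normal(0, 50, len(df)))

df["all"] = 1   # a single group, so subject and item effects are crossed, not nested
vc = {"subject": "0 + C(subject)", "item": "0 + C(item)"}
fit = smf.mixedlm("rt ~ condition", data=df, groups="all",
                  vc_formula=vc, re_formula="0").fit()
print(fit.summary())
```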
Censored data treatment using additional information in intelligent medical systems
NASA Astrophysics Data System (ADS)
Zenkova, Z. N.
2015-11-01
Statistical procedures are a very important and significant part of modern intelligent medical systems. They are used for processing, mining and analysis of different types of data about patients and their diseases, and they help to make various decisions regarding diagnosis, treatment, medication, surgery, etc. In many cases the data can be censored or incomplete. It is a well-known fact that censoring considerably reduces the efficiency of statistical procedures. In this paper the author makes a brief review of the approaches which allow improvement of the procedures using additional information, and describes a modified estimation of an unknown cumulative distribution function involving additional information about a quantile which is known exactly. The additional information is used by applying a projection of a classical estimator to a set of estimators with certain properties. The Kaplan-Meier estimator is considered as the estimator of the unknown cumulative distribution function; the properties of the modified estimator are investigated for the case of single right censoring by means of simulations.
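As a hedged illustration of the estimator mentioned above, here is a plain numpy sketch of the Kaplan-Meier product-limit estimate under right censoring; the survival times and censoring pattern are synthetic.

```python
# Hedged sketch: Kaplan-Meier (product-limit) estimate for right-censored data.
import numpy as np

rng = np.random.default_rng(13)
true_t = rng.exponential(10.0, size=200)          # true event times
cens_t = rng.exponential(15.0, size=200)          # independent censoring times
time = np.minimum(true_t, cens_t)
event = (true_t <= cens_t).astype(int)            # 1 = event observed, 0 = censored

order = np.argsort(time)
time, event = time[order], event[order]

at_risk = np.arange(len(time), 0, -1)             # subjects still at risk at each time
surv = np.cumprod(1.0 - event / at_risk)          # Kaplan-Meier survival estimate

median_idx = np.searchsorted(-surv, -0.5)         # first time S(t) drops to <= 0.5
print(f"KM median survival ~ {time[median_idx]:.2f}")
```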
ERIC Educational Resources Information Center
Earl, Lorna L.
This series of manuals describing and illustrating the Statistical Package for the Social Sciences (SPSS) was planned as a self-teaching instrument, beginning with the basics and progressing to an advanced level. Information on what the searcher must know to define the data and write a program for preliminary analysis is contained in manual 1,…
Load research manual. Volume 2: Fundamentals of implementing load research procedures
NASA Astrophysics Data System (ADS)
1980-11-01
This manual will assist electric utilities and state regulatory authorities in investigating customer electricity demand as part of cost-of-service studies, rate design, marketing research, system design, load forecasting, rate reform analysis, and load management research. Load research procedures are described in detail. Research programs at three utilities are compared: Carolina Power and Light Company, Long Island Lighting Company, and Southern California Edison Company. A load research bibliography and glossaries of load research and statistical terms are also included.
Gupta, Alisha; Agarwala, Sandeep; Sreenivas, Vishnubhatla; Srinivas, Madhur; Bhatnagar, Veereshwar
2017-01-01
Females with Krickenbeck low-type anorectal malformations - vestibular fistula (VF) and perineal fistula (PF) - are managed either by a primary definitive or a conventional three-staged approach. Ultimate outcome in these children may be affected by wound dehiscence leading to healing by fibrosis. Most of the literature favors one approach over the other based on retrospective analysis of outcomes. Whether a statistically significant difference in wound dehiscence rates exists between these approaches remained to be determined. A randomized controlled trial of girls <14 years with VF or PF was conducted. Random number tables were used to randomize 33 children to Group I (primary procedure) and 31 to Group II (three-staged procedure). Statistical analysis tested the significance of differences (P < 0.05) in the primary outcome (wound dehiscence) and secondary outcomes (immediate and early postoperative complications). Of the 64 children randomized, 54 (84%) had VF. Both groups were comparable in demography, clinical profile, and age at surgery. The incidence of wound dehiscence (39.4% vs. 18.2%; P = 0.04), immediate postoperative complications (51.5% vs. 12.9%; P = 0.001), and early postoperative complications (42.4% vs. 12.9%; P = 0.01) was significantly higher in Group I than in Group II. Six of 13 children (46.2%) with dehiscence in Group I required a diverting colostomy. Females with VF or PF undergoing a primary definitive procedure have a significantly higher incidence of wound dehiscence (P = 0.04) and immediate (P = 0.001) and early (P = 0.01) postoperative complications.
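A hedged sketch of the kind of two-group comparison of dehiscence proportions reported above, written in Python with scipy; the 2x2 counts are illustrative placeholders, not the trial's data, and the trial's own analysis plan may have differed.

```python
from scipy.stats import chi2_contingency, fisher_exact

# Illustrative 2x2 table: rows = treatment groups, columns = [dehiscence, no dehiscence]
table = [[12, 18],   # hypothetical Group I counts
         [ 5, 25]]   # hypothetical Group II counts

chi2, p_chi2, dof, expected = chi2_contingency(table)   # chi-square test with Yates correction
odds_ratio, p_fisher = fisher_exact(table)              # exact alternative for small counts

print(f"chi-square p = {p_chi2:.3f}, Fisher exact p = {p_fisher:.3f}, OR = {odds_ratio:.2f}")
```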
Chou, C P; Bentler, P M; Satorra, A
1991-11-01
Research studying robustness of maximum likelihood (ML) statistics in covariance structure analysis has concluded that test statistics and standard errors are biased under severe non-normality. An estimation procedure known as asymptotic distribution free (ADF), making no distributional assumption, has been suggested to avoid these biases. Corrections to the normal theory statistics to yield more adequate performance have also been proposed. This study compares the performance of a scaled test statistic and robust standard errors for two models under several non-normal conditions and also compares these with the results from ML and ADF methods. Both ML and ADF test statistics performed rather well in one model and considerably worse in the other. In general, the scaled test statistic seemed to behave better than the ML test statistic and the ADF statistic performed the worst. The robust and ADF standard errors yielded more appropriate estimates of sampling variability than the ML standard errors, which were usually downward biased, in both models under most of the non-normal conditions. ML test statistics and standard errors were found to be quite robust to the violation of the normality assumption when data had either symmetric and platykurtic distributions, or non-symmetric and zero kurtotic distributions.
Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M; Grün, Sonja
2017-01-01
Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis.
Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira
2014-08-01
This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the traces that were detected (blood, instruments, and clothes), and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed wide variation in how the issue was addressed in the documents. We examined a quantitative approach involving an empirical equation and used multivariate procedures to validate the quantitative methodology proposed for this empirical equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
ERIC Educational Resources Information Center
Bloom, Howard S.
2002-01-01
Introduces a new approach for measuring the impact of whole-school reforms. The approach, based on "short" interrupted time-series analysis, is explained, its statistical procedures are outlined, and its use in the evaluation of a major whole-school reform, Accelerated Schools, is described (H. Bloom and others, 2001). (SLD)
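A minimal Python sketch of a segmented (interrupted) time-series regression of the general kind referred to above, assuming hypothetical yearly scores and a known intervention year; it illustrates the basic level-and-slope-change model only, not the specific "short" interrupted time-series procedure used in the evaluation.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical yearly outcome scores, with a reform introduced after year 5
years = np.arange(1, 11)
score = np.array([62, 63, 65, 64, 66, 70, 72, 73, 75, 76], dtype=float)

df = pd.DataFrame({
    "time": years,
    "post": (years > 5).astype(int),             # indicator for post-reform years
    "time_since": np.clip(years - 5, 0, None),   # years elapsed since the reform
    "score": score,
})

# "post" captures the level change and "time_since" the slope change
# relative to the pre-reform trend.
fit = smf.ols("score ~ time + post + time_since", data=df).fit()
print(fit.params)
```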
Weighting by Inverse Variance or by Sample Size in Random-Effects Meta-Analysis
ERIC Educational Resources Information Center
Marin-Martinez, Fulgencio; Sanchez-Meca, Julio
2010-01-01
Most of the statistical procedures in meta-analysis are based on the estimation of average effect sizes from a set of primary studies. The optimal weight for averaging a set of independent effect sizes is the inverse variance of each effect size, but in practice these weights have to be estimated, being affected by sampling error. When assuming a…
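The inverse-variance weighting discussed above can be illustrated with a short Python sketch of DerSimonian-Laird random-effects pooling; the effect sizes and variances are hypothetical, and weighting by sample size would simply replace these weights with the study sizes.

```python
import numpy as np

def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooling with inverse-variance weights."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                    # fixed-effect (inverse-variance) weights
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)             # heterogeneity statistic
    k = y.size
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)                      # random-effects weights
    y_random = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return y_random, se, tau2

# Hypothetical standardized mean differences and their sampling variances
d = [0.30, 0.45, 0.10, 0.60, 0.25]
v = [0.04, 0.09, 0.02, 0.12, 0.05]
print(random_effects_meta(d, v))
```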
A clinical research analytics toolkit for cohort study.
Yu, Yiqin; Zhu, Yu; Sun, Xingzhi; Tao, Ying; Zhang, Shuo; Xu, Linhao; Pan, Yue
2012-01-01
This paper presents a clinical informatics toolkit that can assist physicians to conduct cohort studies effectively and efficiently. The toolkit has three key features: 1) support of procedures defined in epidemiology, 2) recommendation of statistical methods in data analysis, and 3) automatic generation of research reports. On one hand, our system can help physicians control research quality by leveraging the integrated knowledge of epidemiology and medical statistics; on the other hand, it can improve productivity by reducing the complexities for physicians during their cohort studies.
Statistical Learning Analysis in Neuroscience: Aiming for Transparency
Hanke, Michael; Halchenko, Yaroslav O.; Haxby, James V.; Pollmann, Stefan
2009-01-01
Encouraged by a rise of reciprocal interest between the machine learning and neuroscience communities, several recent studies have demonstrated the explanatory power of statistical learning techniques for the analysis of neural data. In order to facilitate a wider adoption of these methods, neuroscientific research needs to ensure a maximum of transparency to allow for comprehensive evaluation of the employed procedures. We argue that such transparency requires “neuroscience-aware” technology for the performance of multivariate pattern analyses of neural data that can be documented in a comprehensive, yet comprehensible way. Recently, we introduced PyMVPA, a specialized Python framework for machine learning based data analysis that addresses this demand. Here, we review its features and applicability to various neural data modalities. PMID:20582270
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 19 2011-07-01 2011-07-01 false Statistical Outlier Identification... (CONTINUED) Pt. 86, App. XVIII Appendix XVIII to Part 86—Statistical Outlier Identification Procedure for..., but suffer theoretical deficiencies if statistical significance tests are required. Consequently, the...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Statistical Outlier Identification... (CONTINUED) Pt. 86, App. XVIII Appendix XVIII to Part 86—Statistical Outlier Identification Procedure for..., but suffer theoretical deficiencies if statistical significance tests are required. Consequently, the...
NASA Astrophysics Data System (ADS)
Aminah, Agustin Siti; Pawitan, Gandhi; Tantular, Bertho
2017-03-01
So far, most of the data published by Statistics Indonesia (BPS), the provider of national statistics, are still limited to the district level. Sample sizes at smaller area levels are insufficient, so direct estimation of poverty indicators produces high standard errors and the analyses based on it are unreliable. To solve this problem, an estimation method that provides better accuracy by combining survey data with other auxiliary data is required. One method often used for this estimation is Small Area Estimation (SAE). Among the many SAE methods is the Empirical Best Linear Unbiased Prediction (EBLUP). The EBLUP method with maximum likelihood (ML) procedures does not account for the loss of degrees of freedom due to estimating β with β̂. This drawback motivates the use of the restricted maximum likelihood (REML) procedure. This paper proposes EBLUP with the REML procedure for estimating poverty indicators by modeling the average household expenditure per capita, and implements a bootstrap procedure to calculate the MSE (Mean Square Error) in order to compare the accuracy of the EBLUP method with the direct estimation method. Results show that the EBLUP method reduced the MSE in small area estimation.
A statistically robust EEG re-referencing procedure to mitigate reference effect
Lepage, Kyle Q.; Kramer, Mark A.; Chu, Catherine J.
2014-01-01
Background The electroencephalogram (EEG) remains the primary tool for diagnosis of abnormal brain activity in clinical neurology and for in vivo recordings of human neurophysiology in neuroscience research. In EEG data acquisition, voltage is measured at positions on the scalp with respect to a reference electrode. When this reference electrode responds to electrical activity or artifact all electrodes are affected. Successful analysis of EEG data often involves re-referencing procedures that modify the recorded traces and seek to minimize the impact of reference electrode activity upon functions of the original EEG recordings. New method We provide a novel, statistically robust procedure that adapts a robust maximum-likelihood type estimator to the problem of reference estimation, reduces the influence of neural activity from the re-referencing operation, and maintains good performance in a wide variety of empirical scenarios. Results The performance of the proposed and existing re-referencing procedures are validated in simulation and with examples of EEG recordings. To facilitate this comparison, channel-to-channel correlations are investigated theoretically and in simulation. Comparison with existing methods The proposed procedure avoids using data contaminated by neural signal and remains unbiased in recording scenarios where physical references, the common average reference (CAR) and the reference estimation standardization technique (REST) are not optimal. Conclusion The proposed procedure is simple, fast, and avoids the potential for substantial bias when analyzing low-density EEG data. PMID:24975291
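The authors' robust M-estimator is not reproduced here, but the re-referencing operation itself can be illustrated with a minimal Python sketch that contrasts the common average reference with a simple median-based (robust) reference on a hypothetical channels-by-samples array.

```python
import numpy as np

rng = np.random.default_rng(1)
eeg = rng.normal(0.0, 10.0, size=(32, 1000))         # hypothetical 32-channel recording
eeg[7] += 80.0 * np.sin(np.linspace(0, 40, 1000))    # one channel with large activity

car_ref    = eeg.mean(axis=0)        # common average reference (pulled by channel 7)
robust_ref = np.median(eeg, axis=0)  # a simple robust reference, illustrative only

eeg_car    = eeg - car_ref
eeg_robust = eeg - robust_ref

# The large signal on channel 7 leaks into every CAR-referenced channel,
# but far less so with the median-based reference.
print(np.corrcoef(car_ref, eeg[7])[0, 1], np.corrcoef(robust_ref, eeg[7])[0, 1])
```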
Evaluation of noise pollution level in the operating rooms of hospitals: A study in Iran.
Giv, Masoumeh Dorri; Sani, Karim Ghazikhanlou; Alizadeh, Majid; Valinejadi, Ali; Majdabadi, Hesamedin Askari
2017-06-01
Noise pollution in the operating rooms is one of the remaining challenges. Both patients and physicians are exposed to different sound levels during operative cases, many of which can last for hours. This study aims to evaluate noise pollution in the operating rooms during different surgical procedures. In this cross-sectional study, the sound level in the operating rooms of Hamadan University-affiliated hospitals (10 in total) in Iran during different surgical procedures was measured using a B&K sound meter. The gathered data were compared with national and international standards. Statistical analysis was performed using descriptive statistics, one-way ANOVA, the t-test, and Pearson's correlation test. The noise pollution level during the majority of surgical procedures is higher than documented national and international standards. The highest level of noise pollution is related to orthopedic procedures, and the lowest to laparoscopic and heart surgery procedures. The highest and lowest sound levels registered during operations were 93 and 55 dB, respectively. Sound generated by equipment (69 ± 4.1 dB), trolley movement (66 ± 2.3 dB), and personnel conversations (64 ± 3.9 dB) are the main sources of noise. The noise pollution of operating rooms is higher than available standards, and this needs to be corrected to achieve proper conditions.
DIFAS: Differential Item Functioning Analysis System. Computer Program Exchange
ERIC Educational Resources Information Center
Penfield, Randall D.
2005-01-01
Differential item functioning (DIF) is an important consideration in assessing the validity of test scores (Camilli & Shepard, 1994). A variety of statistical procedures have been developed to assess DIF in tests of dichotomous (Hills, 1989; Millsap & Everson, 1993) and polytomous (Penfield & Lam, 2000; Potenza & Dorans, 1995) items. Some of these…
Silt fences: An economical technique for measuring hillslope soil erosion
Peter R. Robichaud; Robert E. Brown
2002-01-01
Measuring hillslope erosion has historically been a costly, time-consuming practice. An easy to install low-cost technique using silt fences (geotextile fabric) and tipping bucket rain gauges to measure onsite hillslope erosion was developed and tested. Equipment requirements, installation procedures, statistical design, and analysis methods for measuring hillslope...
The contribution of molecular relaxation in nitrogen to the absorption of sound in the atmosphere
NASA Technical Reports Server (NTRS)
Zuckerwar, A. J.; Meredith, R. W.
1980-01-01
Results and statistical analysis are presented for sound absorption in N2-H2O binary mixtures at room temperature. Experimental procedure, temperature effects, and preliminary results are presented for sound absorption in N2-H2O binary mixtures at elevated temperatures.
METHODS OF DEALING WITH VALUES BELOW THE LIMIT OF DETECTION USING SAS
Due to limitations of chemical analysis procedures, small values cannot be precisely measured. These values are said to be below the limit of detection (LOD). In statistical analyses, these values are often censored and substituted with a constant value, such as half the LOD,...
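A hedged Python sketch (the abstract itself refers to SAS) of the simple substitution approach mentioned above, replacing values below a hypothetical limit of detection with LOD/2 before computing summary statistics; substitution is shown only as the baseline practice that censored-data methods aim to improve upon.

```python
import numpy as np

lod = 0.5                                                    # hypothetical limit of detection
raw = np.array([0.8, 1.2, np.nan, 2.4, np.nan, 0.9, 3.1])    # NaN marks "below LOD"

substituted = np.where(np.isnan(raw), lod / 2.0, raw)        # classic LOD/2 substitution
print(substituted.mean(), substituted.std(ddof=1))
```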
Averaging Models: Parameters Estimation with the R-Average Procedure
ERIC Educational Resources Information Center
Vidotto, G.; Massidda, D.; Noventa, S.
2010-01-01
The Functional Measurement approach, proposed within the theoretical framework of Information Integration Theory (Anderson, 1981, 1982), can be a useful multi-attribute analysis tool. Compared to the majority of statistical models, the averaging model can account for interaction effects without adding complexity. The R-Average method (Vidotto &…
ERIC Educational Resources Information Center
Ferrari, Pier Alda; Barbiero, Alessandro
2012-01-01
The increasing use of ordinal variables in different fields has led to the introduction of new statistical methods for their analysis. The performance of these methods needs to be investigated under a number of experimental conditions. Procedures to simulate from ordinal variables are then required. In this article, we deal with simulation from…
Development of the Research Competencies Scale
ERIC Educational Resources Information Center
Swank, Jacqueline M.; Lambie, Glenn W.
2016-01-01
The authors present the development of the Research Competencies Scale (RCS). The purpose of this article is threefold: (a) present a rationale for the RCS, (b) review statistical analysis procedures used in developing the RCS, and (c) offer implications for counselor education, the enhancement of scholar-researchers, and future research.
Assessing timber availability in upland Hardwood Forests
Dennis M. May; Chris B. LeDoux
1992-01-01
Reported forest inventory statistics gathered by the USDA Forest Service, Southern Forest Experiment Station, Forest Inventory and Analysis (SOFIA) have been criticized because not all of the inventory volume reported is truly available for harvest. In response to this criticism, a procedure has been developed for assessing timber availability from reported inventory...
40 CFR 86.1341-90 - Test cycle validation criteria.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 19 2011-07-01 2011-07-01 false Test cycle validation criteria. 86... Procedures § 86.1341-90 Test cycle validation criteria. (a) To minimize the biasing effect of the time lag... brake horsepower-hour. (c) Regression line analysis to calculate validation statistics. (1) Linear...
40 CFR 86.1341-90 - Test cycle validation criteria.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 20 2013-07-01 2013-07-01 false Test cycle validation criteria. 86... Procedures § 86.1341-90 Test cycle validation criteria. (a) To minimize the biasing effect of the time lag... brake horsepower-hour. (c) Regression line analysis to calculate validation statistics. (1) Linear...
40 CFR 86.1341-90 - Test cycle validation criteria.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 20 2012-07-01 2012-07-01 false Test cycle validation criteria. 86... Procedures § 86.1341-90 Test cycle validation criteria. (a) To minimize the biasing effect of the time lag... brake horsepower-hour. (c) Regression line analysis to calculate validation statistics. (1) Linear...
Pinto, Gustavo Da Col dos Santos; Dias, Kleber Campioni; Cruvinel, Diogo Rodrigues; Garcia, Lucas da Fonseca Roberti; Consani, Simonides; Pires-De-Souza, Fernanda de Carvalho Panzeri
2013-01-01
To assess the influence of the finishing/polishing procedure on color stability (ΔE) and surface roughness (R(a)) of composites (Heliomolar and Tetric - color A2) submitted to accelerated artificial aging (AAA). Sixty test specimens were made of each composite (12 mm × 2 mm) and separated into six groups (n = 10), according to the type of finishing/polishing to which they were submitted: C, control; F, tip 3195 F; FF, tip 3195 FF; FP, tip 3195 F + diamond paste; FFP, tip 3195 FF + diamond paste; SF, Sof-Lex discs. After polishing, controlled by an electromechanical system, initial color (spectrophotometer PCB 6807 BYK GARDNER) and R(a) (roughness meter Surfcorder SE 1700, cut-off 0.25 mm) readings were taken. Next, the test specimens were submitted to the AAA procedure (C-UV Comexim) for 384 hours, and at the end of this period new color and R(a) readings were taken. Statistical analysis [2-way analysis of variance (ANOVA), Bonferroni, P < 0.05] showed that all composites demonstrated ΔE alteration above the clinically acceptable limits, with the exception of the Heliomolar composite in FP. The greatest ΔE alteration occurred for the Tetric composite in SF (13.38 ± 2.10), statistically different from F and FF (P < 0.05). For R(a), Group F showed rougher samples than FF, with a statistically significant difference (P < 0.05). In spite of the surface differences, the different finishing/polishing procedures were not capable of providing color stability within the clinically acceptable limits.
Yoshida, Hiroyuki; Shibata, Hiroko; Izutsu, Ken-Ichi; Goda, Yukihiro
2017-01-01
The current Japanese Ministry of Health Labour and Welfare (MHLW)'s Guideline for Bioequivalence Studies of Generic Products uses averaged dissolution rates for the assessment of dissolution similarity between test and reference formulations. This study clarifies how the application of model-independent multivariate confidence region procedure (Method B), described in the European Medical Agency and U.S. Food and Drug Administration guidelines, affects similarity outcomes obtained empirically from dissolution profiles with large variations in individual dissolution rates. Sixty-one datasets of dissolution profiles for immediate release, oral generic, and corresponding innovator products that showed large variation in individual dissolution rates in generic products were assessed on their similarity by using the f2 statistics defined in the MHLW guidelines (MHLW f2 method) and two different Method B procedures, including a bootstrap method applied with f2 statistics (BS method) and a multivariate analysis method using the Mahalanobis distance (MV method). The MHLW f2 and BS methods provided similar dissolution similarities between reference and generic products. Although a small difference in the similarity assessment may be due to the decrease in the lower confidence interval for expected f2 values derived from the large variation in individual dissolution rates, the MV method provided results different from those obtained through the MHLW f2 and BS methods. Analysis of actual dissolution data for products with large individual variations would provide valuable information towards an enhanced understanding of these methods and their possible incorporation in the MHLW guidelines.
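For reference, the f2 similarity factor and a simple percentile bootstrap over individual dissolution profiles, in the spirit of the BS method, can be sketched in Python as follows; the dissolution data are simulated and the implementation is illustrative, not the guideline procedure.

```python
import numpy as np

def f2(ref_mean, test_mean):
    """f2 similarity factor computed from mean dissolution profiles (% dissolved)."""
    d2 = np.mean((np.asarray(ref_mean) - np.asarray(test_mean)) ** 2)
    return 50.0 * np.log10(100.0 / np.sqrt(1.0 + d2))

rng = np.random.default_rng(0)
# Hypothetical individual profiles: 12 vessels per product, 4 sampling time points
ref  = np.clip(rng.normal([35, 60, 80, 92], 5.0,  size=(12, 4)), 0, 100)
test = np.clip(rng.normal([30, 55, 78, 90], 12.0, size=(12, 4)), 0, 100)

print("f2 from mean profiles:", round(f2(ref.mean(0), test.mean(0)), 1))

# Percentile bootstrap of f2 over individual profiles (BS-method style sketch)
boots = []
for _ in range(2000):
    r = ref[rng.integers(0, 12, 12)].mean(axis=0)
    t = test[rng.integers(0, 12, 12)].mean(axis=0)
    boots.append(f2(r, t))
print("bootstrap 5th percentile of f2:", round(np.percentile(boots, 5), 1))
```

In practice the lower bound of the bootstrap confidence interval, rather than the point estimate, is compared against the similarity threshold, which is why large individual variation can change the conclusion.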
Graft survival of diabetic versus nondiabetic donor tissue after initial keratoplasty.
Vislisel, Jesse M; Liaboe, Chase A; Wagoner, Michael D; Goins, Kenneth M; Sutphin, John E; Schmidt, Gregory A; Zimmerman, M Bridget; Greiner, Mark A
2015-04-01
To compare corneal graft survival using tissue from diabetic and nondiabetic donors in patients undergoing initial Descemet stripping automated endothelial keratoplasty (DSAEK) or penetrating keratoplasty (PKP). A retrospective chart review of pseudophakic eyes that underwent DSAEK or PKP was performed. The primary outcome measure was graft failure. Cox proportional hazard regression and Kaplan-Meier survival analyses were used to compare diabetic versus nondiabetic donor tissue for all keratoplasty cases. A total of 183 eyes (136 DSAEK, 47 PKP) were included in the statistical analysis. Among 24 procedures performed using diabetic donor tissue, there were 4 cases (16.7%) of graft failure (3 DSAEK, 1 PKP), and among 159 procedures performed using nondiabetic donor tissue, there were 18 cases (11.3%) of graft failure (12 DSAEK, 6 PKP). Cox proportional hazard ratio of graft failure for all cases comparing diabetic with nondiabetic donor tissue was 1.69, but this difference was not statistically significant (95% confidence interval, 0.56-5.06; P = 0.348). There were no significant differences in Kaplan-Meier curves comparing diabetic with nondiabetic donor tissue for all cases (P = 0.380). Statistical analysis of graft failure by donor diabetes status within each procedure type was not possible because of the small number of graft failure events involving diabetic tissue. We found similar rates of graft failure in all keratoplasty cases when comparing tissue from diabetic and nondiabetic donors, but further investigation is needed to determine whether diabetic donor tissue results in different graft failure rates after DSAEK compared with PKP.
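A minimal Python sketch of the type of survival comparison described above, using the lifelines package for a Cox proportional hazards fit and a log-rank test; the follow-up data, column names, and sample size are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

# Hypothetical data: months to graft failure or censoring, failure indicator,
# and donor diabetes status (1 = diabetic donor)
df = pd.DataFrame({
    "months":   [6, 14, 25, 30, 36, 40, 44, 48, 52, 55, 58, 60],
    "failed":   [1,  0,  1,  0,  0,  1,  0,  0,  1,  0,  0,  0],
    "diabetic": [1,  1,  1,  0,  0,  0,  1,  0,  0,  1,  0,  0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="failed")
print(cph.summary[["exp(coef)", "p"]])          # hazard ratio for the diabetic-donor covariate

d, n = df[df.diabetic == 1], df[df.diabetic == 0]
print(logrank_test(d.months, n.months, d.failed, n.failed).p_value)
```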
A spin column-free approach to sodium hydroxide-based glycan permethylation.
Hu, Yueming; Borges, Chad R
2017-07-24
Glycan permethylation was introduced as a tool to facilitate the study of glycans in 1903. Since that time, permethylation procedures have been continually modified to improve permethylation efficiency and qualitative applicability. Typically, however, either laborious preparation steps or cumbersome and uneconomical spin columns have been needed to obtain decent permethylation yields on small glycan samples. Here we describe a spin column-free (SCF) glycan permethylation procedure that is applicable to both O- and N-linked glycans and can be employed upstream to intact glycan analysis by MALDI-MS, ESI-MS, or glycan linkage analysis by GC-MS. The SCF procedure involves neutralization of NaOH beads by acidified phosphate buffer, which eliminates the risk of glycan oxidative degradation and avoids the use of spin columns. Optimization of the new permethylation procedure provided high permethylation efficiency for both hexose (>98%) and HexNAc (>99%) residues, yields comparable to (or better than) those of some widely-used spin column-based procedures. A light vs. heavy labelling approach was employed to compare intact glycan yields from a popular spin-column based approach to the SCF approach. Recovery of intact N-glycans was significantly better with the SCF procedure (p < 0.05), but overall yield of O-glycans was similar or slightly diminished (p < 0.05 for tetrasaccharides or smaller). When the SCF procedure was employed upstream to hydrolysis, reduction and acetylation for glycan linkage analysis of pooled glycans from unfractionated blood plasma, analytical reproducibility was on par with that from previous spin column-based "glycan node" analysis results. When applied to blood plasma samples from stage III-IV breast cancer patients (n = 20) and age-matched controls (n = 20), the SCF procedure facilitated identification of three glycan nodes with significantly different distributions between the cases and controls (ROC c-statistics > 0.75; p < 0.01). In summary, the SCF permethylation procedure expedites and economizes both intact glycan analysis and linkage analysis of glycans from whole biospecimens.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willse, Alan R.; Belcher, Ann; Preti, George
2005-04-15
Gas chromatography (GC), combined with mass spectrometry (MS) detection, is a powerful analytical technique that can be used to separate, quantify, and identify volatile compounds in complex mixtures. This paper examines the application of GC-MS in a comparative experiment to identify volatiles that differ in concentration between two groups. A complex mixture might comprise several hundred or even thousands of volatile compounds. Because their number and location in a chromatogram generally are unknown, and because components overlap in populous chromatograms, the statistical problems offer significant challenges beyond traditional two-group screening procedures. We describe a statistical procedure to compare two-dimensional GC-MS profiles between groups, which entails (1) signal processing: baseline correction and peak detection in single ion chromatograms; (2) aligning chromatograms in time; (3) normalizing differences in overall signal intensities; and (4) detecting chromatographic regions that differ between groups. Compared to existing approaches, the proposed method is robust to errors made at earlier stages of analysis, such as missed peaks or slightly misaligned chromatograms. To illustrate the method, we identify differences in GC-MS chromatograms of ether-extracted urine collected from two nearly identical inbred groups of mice, to investigate the relationship between odor and genetics of the major histocompatibility complex.
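The full four-step pipeline is beyond a short example, but the peak-detection and total-intensity-normalization steps can be sketched in Python as follows; the simulated single ion chromatogram, the rolling-minimum baseline correction, and the scipy-based peak detector are illustrative stand-ins for the authors' signal-processing routines.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(2)
t = np.linspace(0, 30, 3000)                       # retention time in minutes, hypothetical
baseline = 50 + 2 * t                              # slowly drifting baseline
true_peaks = (400 * np.exp(-((t - 8) ** 2) / 0.02)
              + 250 * np.exp(-((t - 17) ** 2) / 0.03))
sic = baseline + true_peaks + rng.normal(0, 5, t.size)   # single ion chromatogram

# Crude baseline correction: subtract a rolling minimum (illustrative only)
window = 200
rolling_min = np.array([sic[max(0, i - window):i + window].min() for i in range(sic.size)])
corrected = sic - rolling_min

# Peak detection, then normalization by total signal intensity
idx, props = find_peaks(corrected, height=50, prominence=30)
intensity = props["prominences"]                   # stand-in for integrated peak areas
normalized = intensity / corrected.sum()

for rt, a in zip(t[idx], normalized):
    print(f"peak at {rt:5.2f} min, normalized intensity {a:.4g}")
```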
Linear retrieval and global measurements of wind speed from the Seasat SMMR
NASA Technical Reports Server (NTRS)
Pandey, P. C.
1983-01-01
Retrievals of wind speed (WS) from the Seasat Scanning Multichannel Microwave Radiometer (SMMR) were performed using a two-step statistical technique. Nine subsets of two to five SMMR channels were examined for wind speed retrieval. These subsets were derived by applying a leaps-and-bounds procedure, with the coefficient of determination as the selection criterion, to a statistical database of brightness temperatures and geophysical parameters. Analysis of Monsoon Experiment and ocean station PAPA data showed a strong correlation between sea surface temperature and water vapor. This relation was used in generating the statistical database. Global maps of WS were produced for one and three month periods.
From fields to objects: A review of geographic boundary analysis
NASA Astrophysics Data System (ADS)
Jacquez, G. M.; Maruca, S.; Fortin, M.-J.
Geographic boundary analysis is a relatively new approach unfamiliar to many spatial analysts. It is best viewed as a technique for defining objects - geographic boundaries - on spatial fields, and for evaluating the statistical significance of characteristics of those boundary objects. This is accomplished using null spatial models representative of the spatial processes expected in the absence of boundary-generating phenomena. Close ties to the object-field dialectic eminently suit boundary analysis to GIS data. The majority of existing spatial methods are field-based in that they describe, estimate, or predict how attributes (variables defining the field) vary through geographic space. Such methods are appropriate for field representations but not object representations. As the object-field paradigm gains currency in geographic information science, appropriate techniques for the statistical analysis of objects are required. The methods reviewed in this paper are a promising foundation. Geographic boundary analysis is clearly a valuable addition to the spatial statistical toolbox. This paper presents the philosophy of, and motivations for geographic boundary analysis. It defines commonly used statistics for quantifying boundaries and their characteristics, as well as simulation procedures for evaluating their significance. We review applications of these techniques, with the objective of making this promising approach accessible to the GIS-spatial analysis community. We also describe the implementation of these methods within geographic boundary analysis software: GEM.
Using a Five-Step Procedure for Inferential Statistical Analyses
ERIC Educational Resources Information Center
Kamin, Lawrence F.
2010-01-01
Many statistics texts pose inferential statistical problems in a disjointed way. By using a simple five-step procedure as a template for statistical inference problems, the student can solve problems in an organized fashion. The problem and its solution will thus be a stand-by-itself organic whole and a single unit of thought and effort. The…
Bayesian Sensitivity Analysis of Statistical Models with Missing Data
ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG
2013-01-01
Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718
Online and offline tools for head movement compensation in MEG.
Stolk, Arjen; Todorovic, Ana; Schoffelen, Jan-Mathijs; Oostenveld, Robert
2013-03-01
Magnetoencephalography (MEG) is measured above the head, which makes it sensitive to variations of the head position with respect to the sensors. Head movements blur the topography of the neuronal sources of the MEG signal, increase localization errors, and reduce statistical sensitivity. Here we describe two novel and readily applicable methods that compensate for the detrimental effects of head motion on the statistical sensitivity of MEG experiments. First, we introduce an online procedure that continuously monitors head position. Second, we describe an offline analysis method that takes into account the head position time-series. We quantify the performance of these methods in the context of three different experimental settings, involving somatosensory, visual and auditory stimuli, assessing both individual and group-level statistics. The online head localization procedure allowed for optimal repositioning of the subjects over multiple sessions, resulting in a 28% reduction of the variance in dipole position and an improvement of up to 15% in statistical sensitivity. Offline incorporation of the head position time-series into the general linear model resulted in improvements of group-level statistical sensitivity between 15% and 29%. These tools can substantially reduce the influence of head movement within and between sessions, increasing the sensitivity of many cognitive neuroscience experiments. Copyright © 2012 Elsevier Inc. All rights reserved.
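A hedged Python sketch of the offline idea described above: including the head-position time series as nuisance regressors in a general linear model of a sensor signal. The simulated data and ordinary least-squares formulation illustrate the modelling principle only, not the authors' MEG pipeline.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 500
stim = (np.arange(n) % 50 < 5).astype(float)                     # hypothetical stimulus regressor
head_pos = np.cumsum(rng.normal(0, 0.05, size=(n, 3)), axis=0)   # drifting x, y, z head position

# Sensor signal: stimulus response plus head-position-related signal plus noise
signal = 2.0 * stim + head_pos @ np.array([1.5, -0.8, 0.5]) + rng.normal(0, 1, n)

X_naive = sm.add_constant(stim)
X_comp  = sm.add_constant(np.column_stack([stim, head_pos]))     # add head-position regressors

b_naive = sm.OLS(signal, X_naive).fit()
b_comp  = sm.OLS(signal, X_comp).fit()
print("stimulus t-value without / with compensation:",
      round(b_naive.tvalues[1], 1), round(b_comp.tvalues[1], 1))
```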
Eksborg, Staffan
2013-01-01
Pharmacokinetic studies are important for optimizing drug dosing, but require proper validation of the pharmacokinetic procedures used. However, simple and reliable statistical methods suitable for evaluation of the predictive performance of pharmacokinetic analysis are essentially lacking. The aim of the present study was to construct and evaluate a graphic procedure for quantification of the predictive performance of individual and population pharmacokinetic compartment analysis. Original data from previously published pharmacokinetic compartment analyses after intravenous, oral, and epidural administration, and digitized data obtained from published scatter plots of observed vs predicted drug concentrations from population pharmacokinetic studies using the NPEM algorithm and NONMEM computer program and Bayesian forecasting procedures, were used for estimating the predictive performance according to the proposed graphical method and by the method of Sheiner and Beal. The graphical plot proposed in the present paper proved to be a useful tool for evaluation of the predictive performance of both individual and population compartment pharmacokinetic analysis. The proposed method is simple to use and gives valuable information concerning time- and concentration-dependent inaccuracies that might occur in individual and population pharmacokinetic compartment analysis. Predictive performance can be quantified by the fraction of concentration ratios within arbitrarily specified ranges, e.g. within the range 0.8-1.2.
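A minimal Python sketch of the quantification step described above: computing observed/predicted concentration ratios and the fraction falling within a specified range such as 0.8-1.2; the concentrations are hypothetical and the graphical plot itself is omitted.

```python
import numpy as np

observed  = np.array([1.2, 0.9, 2.4, 3.1, 0.45, 1.8, 2.2])   # hypothetical measured concentrations
predicted = np.array([1.0, 1.1, 2.0, 3.5, 0.50, 1.7, 2.6])   # model-predicted concentrations

ratios = observed / predicted
within = np.mean((ratios >= 0.8) & (ratios <= 1.2))          # fraction of ratios inside 0.8-1.2
print(f"fraction of ratios within 0.8-1.2: {within:.2f}")
```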
Laboratory animal science: a resource to improve the quality of science.
Forni, M
2007-08-01
The contribution of animal experimentation to biomedical research is of undoubted value; nevertheless, the real usefulness of animal models is still being hotly debated. Laboratory Animal Science is a multidisciplinary approach to humane animal experimentation that allows the choice of the correct animal model and the collection of unbiased data. Refinement, Reduction and Replacement, the "3Rs rule", are now widely accepted and have a major influence on animal experimentation procedures. Refinement, namely any decrease in the incidence or severity of inhumane procedures applied to animals, has today been extended to the entire lives of the experimental animals. Reduction of the number of animals used to obtain statistically significant data may be achieved by improving experimental design and statistical analysis of data. Replacement refers to the development of validated alternative methods. A Laboratory Animal Science training program in biomedical degrees can promote the 3Rs and improve the welfare of laboratory animals as well as the quality of science, with ethical, scientific and economic advantages, complying with the European requirement that "persons who carry out, take part in, or supervise procedures on animals, or take care of animals used in procedures, shall have had appropriate education and training".
Forecasting volatility with neural regression: a contribution to model adequacy.
Refenes, A N; Holt, W T
2001-01-01
Neural nets' usefulness for forecasting is limited by problems of overfitting and the lack of rigorous procedures for model identification, selection and adequacy testing. This paper describes a methodology for neural model misspecification testing. We introduce a generalization of the Durbin-Watson statistic for neural regression and discuss the general issues of misspecification testing using residual analysis. We derive a generalized influence matrix for neural estimators which enables us to evaluate the distribution of the statistic. We deploy Monte Carlo simulation to compare the power of the test for neural and linear regressors. While residual testing is not a sufficient condition for model adequacy, it is nevertheless a necessary condition to demonstrate that the model is a good approximation to the data generating process, particularly as neural-network estimation procedures are susceptible to partial convergence. The work is also an important step toward developing rigorous procedures for neural model identification, selection and adequacy testing which have started to appear in the literature. We demonstrate its applicability in the nontrivial problem of forecasting implied volatility innovations using high-frequency stock index options. Each step of the model building process is validated using statistical tests to verify variable significance and model adequacy with the results confirming the presence of nonlinear relationships in implied volatility innovations.
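As a reminder of the classical statistic being generalized, the following Python sketch computes the ordinary Durbin-Watson statistic on the residuals of a linear fit with autocorrelated errors; the neural-regression extension and influence-matrix derivation of the paper are not reproduced.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(3)
x = rng.normal(size=200)
e = np.zeros(200)
for i in range(1, 200):                 # AR(1) errors to induce positive autocorrelation
    e[i] = 0.6 * e[i - 1] + rng.normal()
y = 1.0 + 2.0 * x + e

resid = sm.OLS(y, sm.add_constant(x)).fit().resid
# Values near 2 indicate no first-order autocorrelation; values well below 2 flag it.
print("Durbin-Watson:", round(durbin_watson(resid), 2))
```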
Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon
2015-11-03
Data independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for 14-3-3β dynamic interaction network and prostate cancer glycoproteome. The software was written in C++ language and the source code is available for free through SourceForge website http://sourceforge.net/projects/mapdia/.This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
Uncertainties in obtaining high reliability from stress-strength models
NASA Technical Reports Server (NTRS)
Neal, Donald M.; Matthews, William T.; Vangel, Mark G.
1992-01-01
There has been a recent interest in determining high statistical reliability in risk assessment of aircraft components. The potential consequences of incorrectly assuming a particular statistical distribution for the stress or strength data used in obtaining high reliability values are identified. The computation of the reliability is defined as the probability of the strength being greater than the stress over the range of stress values. This method is often referred to as the stress-strength model. A sensitivity analysis was performed involving a comparison of reliability results in order to evaluate the effects of assuming specific statistical distributions. Both known population distributions, and those that differed slightly from the known, were considered. Results showed substantial differences in reliability estimates even for almost nondetectable differences in the assumed distributions. These differences represent a potential problem in using the stress-strength model for high reliability computations, since in practice it is impossible to ever know the exact (population) distribution. An alternative reliability computation procedure is examined involving determination of a lower bound on the reliability values using extreme value distributions. This procedure reduces the possibility of obtaining nonconservative reliability estimates. Results indicated the method can provide conservative bounds when computing high reliability.
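The sensitivity described above can be explored with a small Monte Carlo sketch in Python: reliability is computed as P(strength > stress) under an assumed normal strength model and again under a lognormal strength model matched to the same mean and standard deviation; all distribution parameters are hypothetical.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
n = 1_000_000

# Hypothetical stress and strength models (same units)
mu_s, sd_s = 50.0, 5.0      # stress
mu_S, sd_S = 80.0, 6.0      # strength

# Normal-normal case: closed-form R = P(strength > stress)
R_normal = norm.cdf((mu_S - mu_s) / np.sqrt(sd_S**2 + sd_s**2))

# Monte Carlo with a lognormal strength matched to the same mean and sd
sigma2 = np.log(1.0 + (sd_S / mu_S) ** 2)
mu_log = np.log(mu_S) - sigma2 / 2.0
stress = rng.normal(mu_s, sd_s, n)
strength_ln = rng.lognormal(mu_log, np.sqrt(sigma2), n)
R_lognormal = np.mean(strength_ln > stress)

print(f"normal-strength R = {R_normal:.6f}, lognormal-strength R = {R_lognormal:.6f}")
```

Even though the two strength models share the same first two moments, the estimated reliabilities differ in the tail, which is exactly the sensitivity the report warns about.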
Statistics of Data Fitting: Flaws and Fixes of Polynomial Analysis of Channeled Spectra
NASA Astrophysics Data System (ADS)
Karstens, William; Smith, David
2013-03-01
Starting from general statistical principles, we have critically examined Baumeister's procedure* for determining the refractive index of thin films from channeled spectra. Briefly, the method assumes that the index and interference fringe order may be approximated by polynomials quadratic and cubic in photon energy, respectively. The coefficients of the polynomials are related by differentiation, which is equivalent to comparing energy differences between fringes. However, we find that when the fringe order is calculated from the published IR index for silicon* and then analyzed with Baumeister's procedure, the results do not reproduce the original index. This problem has been traced to (1) use of unphysical powers in the polynomials (e.g., time-reversal invariance requires that the index is an even function of photon energy), and (2) use of insufficient terms of the correct parity. Exclusion of unphysical terms and addition of quartic and quintic terms to the index and order polynomials yields significantly better fits with fewer parameters. This represents a specific example of using statistics to determine if the assumed fitting model adequately captures the physics contained in experimental data. The use of analysis of variance (ANOVA) and the Durbin-Watson statistic to test criteria for the validity of least-squares fitting will be discussed. *D.F. Edwards and E. Ochoa, Appl. Opt. 19, 4130 (1980). Supported in part by the US Department of Energy, Office of Nuclear Physics under contract DE-AC02-06CH11357.
Exploratory Analysis of Survey Data for Understanding Adoption of Novel Aerospace Systems
NASA Astrophysics Data System (ADS)
Reddy, Lauren M.
In order to meet the increasing demand for manned and unmanned flight, the air transportation system must constantly evolve. As new technologies or operational procedures are conceived, we must determine their effect on humans in the system. In this research, we introduce a strategy to assess how individuals or organizations would respond to a novel aerospace system. We employ the most appropriate and sophisticated exploratory analysis techniques on the survey data to generate insight and identify significant variables. We employ three different methods for eliciting views from individuals or organizations who are affected by a system: an opinion survey, a stated preference survey, and structured interviews. We conduct an opinion survey of both the general public and stakeholders in the unmanned aircraft industry to assess their knowledge, attitude, and practices regarding unmanned aircraft. We complete a statistical analysis of the multiple-choice questions using multinomial logit and multivariate probit models and conduct qualitative analysis on free-text questions. We next present a stated preference survey of the general public on the use of an unmanned aircraft package delivery service. We complete a statistical analysis of the questions using multinomial logit, ordered probit, linear regression, and negative binomial models. Finally, we discuss structured interviews conducted on stakeholders from ANSPs and airlines operating in the North Atlantic. We describe how these groups may choose to adopt a new technology (space-based ADS-B) or operational procedure (in-trail procedures). We discuss similarities and differences between the stakeholders groups, the benefits and costs of in-trail procedures and space-based ADS-B as reported by the stakeholders, and interdependencies between the groups interviewed. To demonstrate the value of the data we generated, we explore how the findings from the surveys can be used to better characterize uncertainty in the cost-benefit analysis of aerospace systems. We demonstrate how the findings from the opinion and stated preference surveys can be infused into the cost-benefit analysis of an unmanned aircraft delivery system. We also demonstrate how to apply the findings from the interviews to characterize uncertainty in the estimation of the benefits of space-based ADS-B.
Harrysson, Iliana J; Cook, Jonathan; Sirimanna, Pramudith; Feldman, Liane S; Darzi, Ara; Aggarwal, Rajesh
2014-07-01
To determine how minimally invasive surgical learning curves are assessed and define an ideal framework for this assessment. Learning curves have implications for training and adoption of new procedures and devices. In 2000, Ramsay et al reviewed the learning curve literature and called for improved reporting and statistical evaluation of learning curves. Since then, a body of literature on learning curves has emerged, but the presentation and analysis vary. A systematic search was performed of MEDLINE, EMBASE, ISI Web of Science, ERIC, and the Cochrane Library from 1985 to August 2012. The inclusion criteria were studies of minimally invasive abdominal surgery that formally analyzed the learning curve and were published in English. 592 (11.1%) of the identified studies met the selection criteria. Time is the most commonly used proxy for the learning curve (508, 86%). Intraoperative outcomes were used in 316 (53%) of the articles, postoperative outcomes in 306 (52%), technical skills in 102 (17%), and patient-oriented outcomes in 38 (6%) articles. Over time, there was evidence of an increase in the relative proportion of laparoscopic and robotic studies (P < 0.001) without statistical evidence of a change in the complexity of analysis (P = 0.121). Assessment of learning curves is needed to inform surgical training and evaluate new clinical procedures. An ideal analysis would account for the degree of complexity of individual cases and the inherent differences between surgeons. There is no single proxy that best represents the success of surgery, and hence multiple outcomes should be collected.
Irvine, Kathryn M.; Manlove, Kezia; Hollimon, Cynthia
2012-01-01
An important consideration for long term monitoring programs is determining the required sampling effort to detect trends in specific ecological indicators of interest. To enhance the Greater Yellowstone Inventory and Monitoring Network’s water resources protocol(s) (O’Ney 2006 and O’Ney et al. 2009 [under review]), we developed a set of tools to: (1) determine the statistical power for detecting trends of varying magnitude in a specified water quality parameter over different lengths of sampling (years) and different within-year collection frequencies (monthly or seasonal sampling) at particular locations using historical data, and (2) perform periodic trend analyses for water quality parameters while addressing seasonality and flow weighting. A power analysis for trend detection is a statistical procedure used to estimate the probability of rejecting the hypothesis of no trend when in fact there is a trend, within a specific modeling framework. In this report, we base our power estimates on using the seasonal Kendall test (Helsel and Hirsch 2002) for detecting trend in water quality parameters measured at fixed locations over multiple years. We also present procedures (R-scripts) for conducting a periodic trend analysis using the seasonal Kendall test with and without flow adjustment. This report provides the R-scripts developed for power and trend analysis, tutorials, and the associated tables and graphs. The purpose of this report is to provide practical information for monitoring network staff on how to use these statistical tools for water quality monitoring data sets.
DHLAS: A web-based information system for statistical genetic analysis of HLA population data.
Thriskos, P; Zintzaras, E; Germenis, A
2007-03-01
DHLAS (database HLA system) is a user-friendly, web-based information system for the analysis of human leukocyte antigen (HLA) data from population studies. DHLAS has been developed using JAVA and the R system; it runs on a Java Virtual Machine, and its user interface is web-based, powered by the servlet engine TOMCAT. It utilizes STRUTS, a Model-View-Controller framework, and uses several GNU packages to perform several of its tasks. The database engine it relies upon for fast access is MySQL, but others can be used as well. The system estimates metrics, performs statistical testing, and produces graphs required for HLA population studies: (i) Hardy-Weinberg equilibrium (calculated using both asymptotic and exact tests), (ii) genetic distances (Euclidean or Nei), (iii) phylogenetic trees using the unweighted pair group method with averages and the neighbor-joining method, (iv) linkage disequilibrium (pairwise and overall, including variance estimations), (v) haplotype frequencies (estimated using the expectation-maximization algorithm), and (vi) discriminant analysis. The main merit of DHLAS is the incorporation of a database; thus, the data can be stored and manipulated along with integrated genetic data analysis procedures. In addition, it has an open architecture allowing the inclusion of other functions and procedures.
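One of the listed computations, the asymptotic Hardy-Weinberg equilibrium test for a biallelic locus, can be sketched in Python as follows; the genotype counts are hypothetical and the exact test also offered by DHLAS is not reproduced.

```python
import numpy as np
from scipy.stats import chi2

def hwe_chi2(n_AA, n_Aa, n_aa):
    """Asymptotic chi-square test of Hardy-Weinberg equilibrium for one biallelic locus."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)                       # allele frequency of A
    expected = np.array([p**2, 2 * p * (1 - p), (1 - p)**2]) * n
    observed = np.array([n_AA, n_Aa, n_aa], dtype=float)
    stat = np.sum((observed - expected) ** 2 / expected)
    return stat, chi2.sf(stat, df=1)                      # 1 df: 3 classes - 1 - 1 estimated parameter

print(hwe_chi2(298, 489, 213))                            # hypothetical genotype counts
```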
FGWAS: Functional genome wide association analysis.
Huang, Chao; Thompson, Paul; Wang, Yalin; Yu, Yang; Zhang, Jingwen; Kong, Dehan; Colen, Rivka R; Knickmeyer, Rebecca C; Zhu, Hongtu
2017-10-01
Functional phenotypes (e.g., subcortical surface representation), which commonly arise in imaging genetic studies, have been used to detect putative genes for complexly inherited neuropsychiatric and neurodegenerative disorders. However, existing statistical methods largely ignore the functional features (e.g., functional smoothness and correlation). The aim of this paper is to develop a functional genome-wide association analysis (FGWAS) framework to efficiently carry out whole-genome analyses of functional phenotypes. FGWAS consists of three components: a multivariate varying coefficient model, a global sure independence screening procedure, and a test procedure. Compared with the standard multivariate regression model, the multivariate varying coefficient model explicitly models the functional features of functional phenotypes through the integration of smooth coefficient functions and functional principal component analysis. Statistically, compared with existing methods for genome-wide association studies (GWAS), FGWAS can substantially boost the detection power for discovering important genetic variants influencing brain structure and function. Simulation studies show that FGWAS outperforms existing GWAS methods for searching sparse signals in an extremely large search space, while controlling for the family-wise error rate. We have successfully applied FGWAS to large-scale analysis of data from the Alzheimer's Disease Neuroimaging Initiative for 708 subjects, 30,000 vertices on the left and right hippocampal surfaces, and 501,584 SNPs. Copyright © 2017 Elsevier Inc. All rights reserved.
Managing heteroscedasticity in general linear models.
Rosopa, Patrick J; Schaffer, Meline M; Schroeder, Amber N
2013-09-01
Heteroscedasticity refers to a phenomenon where data violate a statistical assumption. This assumption is known as homoscedasticity. When the homoscedasticity assumption is violated, this can lead to increased Type I error rates or decreased statistical power. Because this can adversely affect substantive conclusions, the failure to detect and manage heteroscedasticity could have serious implications for theory, research, and practice. In addition, heteroscedasticity is not uncommon in the behavioral and social sciences. Thus, in the current article, we synthesize extant literature in applied psychology, econometrics, quantitative psychology, and statistics, and we offer recommendations for researchers and practitioners regarding available procedures for detecting heteroscedasticity and mitigating its effects. In addition to discussing the strengths and weaknesses of various procedures and comparing them in terms of existing simulation results, we describe a 3-step data-analytic process for detecting and managing heteroscedasticity: (a) fitting a model based on theory and saving residuals, (b) the analysis of residuals, and (c) statistical inferences (e.g., hypothesis tests and confidence intervals) involving parameter estimates. We also demonstrate this data-analytic process using an illustrative example. Overall, detecting violations of the homoscedasticity assumption and mitigating its biasing effects can strengthen the validity of inferences from behavioral and social science data.
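A hedged illustration of the described 3-step process, assuming a simple simulated regression with variance increasing in the predictor; it uses the Breusch-Pagan test for step (b) and HC3 robust standard errors for step (c), which are among the procedures the article compares.

```python
# (a) fit a theory-based model and save residuals, (b) test the residuals for
# heteroscedasticity, (c) base inference on heteroscedasticity-consistent
# (HC3) standard errors if the assumption appears violated.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 1.0 + 0.5 * x + rng.normal(0, 0.2 + 0.15 * x)     # error variance grows with x
X = sm.add_constant(x)

ols = sm.OLS(y, X).fit()                              # step (a)
lm_stat, lm_p, _, _ = het_breuschpagan(ols.resid, X)  # step (b)
print(f"Breusch-Pagan p-value: {lm_p:.4f}")

robust = sm.OLS(y, X).fit(cov_type="HC3")             # step (c)
print(robust.summary().tables[1])
```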
Grbovic, Vesna; Jurisic-Skevin, Aleksandra; Djukic, Svetlana; Stefanović, Srdjan; Nurkovic, Jasmin
2016-01-01
[Purpose] Painful diabetic polyneuropathy occurs as a complication in 16% of all patients with diabetes mellitus. [Subjects and Methods] A clinical, prospective, open-label randomized intervention study was conducted in 60 adult patients with diabetes mellitus type 2 and distal sensorimotor diabetic neuropathy, divided into two groups of 30 patients. Patients in group A were treated with combined physical procedures, and patients in group B were treated with alpha lipoic acid. [Results] There were statistically significant improvements in terminal latency and in the amplitude of the action potential in group A patients, while group B patients showed statistically significant improvements in conduction velocity and terminal latency of n. peroneus. Group A patients showed statistically significant improvements in conduction velocity and terminal latency, as did group B patients. This was reflected in significant improvements in the electrophysiological parameters (conduction velocity, amplitude, and latency) of the motor and sensory nerves (n. peroneus, n. suralis). [Conclusion] These results provide further evidence justifying the use of physical agents in the treatment of diabetic sensorimotor polyneuropathy. PMID:27065527
Piotrowski, T; Rodrigues, G; Bajon, T; Yartsev, S
2014-03-01
Multi-institutional collaborations allow for more information to be analyzed but the data from different sources may vary in the subgroup sizes and/or conditions of measuring. Rigorous statistical analysis is required for pooling the data in a larger set. Careful comparison of all the components of the data acquisition is indispensable: identical conditions allow for enlargement of the database with improved statistical analysis, clearly defined differences provide opportunity for establishing a better practice. The optimal sequence of required normality, asymptotic normality, and independence tests is proposed. An example of analysis of six subgroups of position corrections in three directions obtained during image guidance procedures for 216 prostate cancer patients from two institutions is presented. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
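A minimal sketch of the kind of pre-pooling checks described above, assuming two institutions' position-correction samples; the authors' exact sequence of normality, asymptotic normality, and independence tests is not reproduced, and the data and thresholds below are illustrative.

```python
# Pre-pooling checks: per-subgroup normality (Shapiro-Wilk), equality of
# variances (Levene), then a pooled parametric or nonparametric comparison.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
site_a = rng.normal(0.0, 2.0, 120)    # hypothetical corrections (mm), site A
site_b = rng.normal(0.3, 2.5, 96)     # hypothetical corrections (mm), site B

normal_a = stats.shapiro(site_a).pvalue > 0.05
normal_b = stats.shapiro(site_b).pvalue > 0.05
equal_var = stats.levene(site_a, site_b).pvalue > 0.05

if normal_a and normal_b:
    res = stats.ttest_ind(site_a, site_b, equal_var=equal_var)
else:
    res = stats.mannwhitneyu(site_a, site_b)
print(res)
```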
DOE Office of Scientific and Technical Information (OSTI.GOV)
Das, Raj, E-mail: rajdas@nhs.net, E-mail: raj.das@stgeorges.nhs.uk; Lucatelli, Pierleone, E-mail: pierleone.lucatelli@gmail.com; Wang, Haofan, E-mail: wwhhff123@gmail.com
Aim: A clear understanding of operator experience is important in improving technical success whilst minimising risk to patients undergoing endovascular procedures, and there is a need to ensure that trainees have the appropriate skills as primary operators. The aim of the study is to retrospectively analyse uterine artery embolisation (UAE) procedures performed by interventional radiology (IR) trainees at an IR training unit, using fluoroscopy times and radiation dose as surrogate markers of technical skill. Methods: Ten IR fellows were primary operator in 200 UAE procedures over a 5-year period. We compared fluoroscopy times, radiation dose and complications after categorising them into three groups: Group 1, initial five; Group 2, >5 procedures; and Group 3, penultimate five UAE procedures. We documented factors that may affect screening time (number of vials employed and use of microcatheters). Results: Mean fluoroscopy time was 18.4 (±8.1), 17.3 (±9.0), and 16.3 (±8.4) min in Groups 1, 2 and 3, respectively. There was no statistically significant difference between these groups (p > 0.05) with respect to fluoroscopy time or radiation dose. Analysis after correction for confounding factors showed no statistical significance (p > 0.05). All procedures were technically successful, and the total complication rate was 4 %. Conclusion: UAE was chosen as a highly standardised procedure followed by IR practitioners. Although there is a non-significant trend towards shorter screening times with experience, technical success and safety were not compromised with appropriate Consultant supervision, which illustrates a safe construct for IR training. This is important and reassuring information for patients undergoing a procedure in a training unit.
Soils element activities for the period October 1973--September 1974
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fowler, E.B.; Essington, E.H.; White, M.G.
Soils Element activities were conducted on behalf of the U. S. Atomic Energy Commission's Nevada Applied Ecology Group (NAEG) program to provide source term information for the other program elements and maintain continuous cognizance of program requirements for sampling, sample preparation, and analysis. Activities included presentation of papers; participation in workshops; analysis of soil, vegetation, and animal tissue samples for 238Pu, 239-240Pu, 241Am, 137Cs, 60Co, and gamma scan for routine and laboratory quality control purposes; preparation and analysis of animal tissue samples for NAEG laboratory certification; studies on a number of analytical, sample preparation, and sample collection procedures; and contributions to the evaluation of procedures for calculation of specialized counting statistics.
Kathman, Steven J; Potts, Ryan J; Ayres, Paul H; Harp, Paul R; Wilson, Cody L; Garner, Charles D
2010-10-01
The mouse dermal assay has long been used to assess the dermal tumorigenicity of cigarette smoke condensate (CSC). This mouse skin model has been developed for use in carcinogenicity testing utilizing the SENCAR mouse as the standard strain. Though the model has limitations, it remains as the most relevant method available to study the dermal tumor promoting potential of mainstream cigarette smoke. In the typical SENCAR mouse CSC bioassay, CSC is applied for 29 weeks following the application of a tumor initiator such as 7,12-dimethylbenz[a]anthracene (DMBA). Several endpoints are considered for analysis including: the percentage of animals with at least one mass, latency, and number of masses per animal. In this paper, a relatively straightforward analytic model and procedure is presented for analyzing the time course of the incidence of masses. The procedure considered here takes advantage of Bayesian statistical techniques, which provide powerful methods for model fitting and simulation. Two datasets are analyzed to illustrate how the model fits the data, how well the model may perform in predicting data from such trials, and how the model may be used as a decision tool when comparing the dermal tumorigenicity of cigarette smoke condensate from multiple cigarette types. The analysis presented here was developed as a statistical decision tool for differentiating between two or more prototype products based on the dermal tumorigenicity. Copyright (c) 2010 Elsevier Inc. All rights reserved.
Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes
NASA Astrophysics Data System (ADS)
Morozov, Yu. V.; Spektor, A. A.
2017-11-01
A method for classifying moving objects having a seismic effect on the ground surface is proposed which is based on statistical analysis of the envelopes of received signals. The values of the components of the amplitude spectrum of the envelopes obtained applying Hilbert and Fourier transforms are used as classification criteria. Examples illustrating the statistical properties of spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.
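A small sketch of the feature construction described above: the envelope is taken as the magnitude of the analytic signal (Hilbert transform) and its Fourier amplitude spectrum supplies the classification features; the sampling rate, feature count, and synthetic signal are assumptions, and the classifier itself is not reproduced.

```python
# Envelope amplitude-spectrum features for a seismic-like signal.
import numpy as np
from scipy.signal import hilbert

def envelope_spectrum(x, fs, n_features=32):
    envelope = np.abs(hilbert(x))              # instantaneous amplitude
    envelope -= envelope.mean()                # remove DC before the FFT
    spec = np.abs(np.fft.rfft(envelope)) / len(envelope)
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
    return freqs[:n_features], spec[:n_features]

fs = 500.0                                     # hypothetical sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
sig = np.sin(2 * np.pi * 2 * t) * np.random.default_rng(0).normal(size=t.size)
print(envelope_spectrum(sig, fs)[1][:5])
```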
NASA Technical Reports Server (NTRS)
Stephens, J. B.; Sloan, J. C.
1976-01-01
A method is described for developing a statistical air quality assessment for the launch of an aerospace vehicle from the Kennedy Space Center in terms of existing climatological data sets. The procedure can be refined as developing meteorological conditions are identified for use with the NASA-Marshall Space Flight Center Rocket Exhaust Effluent Diffusion (REED) description. Classical climatological regimes for the long range analysis can be narrowed as the synoptic and mesoscale structure is identified. Only broad synoptic regimes are identified at this stage of analysis. As the statistical data matrix is developed, synoptic regimes will be refined in terms of the resulting eigenvectors as applicable to aerospace air quality predictions.
Gorman, Dennis M; Huber, J Charles
2009-08-01
This study explores the possibility that any drug prevention program might be considered "evidence-based" given the use of data analysis procedures that optimize the chance of producing statistically significant results by reanalyzing data from a Drug Abuse Resistance Education (DARE) program evaluation. The analysis produced a number of statistically significant differences between the DARE and control conditions on alcohol and marijuana use measures. Many of these differences occurred at cutoff points on the assessment scales for which post hoc meaningful labels were created. Our results are compared to those from evaluations of programs that appear on evidence-based drug prevention lists.
Macyszyn, Luke; Attiah, Mark; Ma, Tracy S; Ali, Zarina; Faught, Ryan; Hossain, Alisha; Man, Karen; Patel, Hiren; Sobota, Rosanna; Zager, Eric L; Stein, Sherman C
2017-05-01
OBJECTIVE Moyamoya disease (MMD) is a chronic cerebrovascular disease that can lead to devastating neurological outcomes. Surgical intervention is the definitive treatment, with direct, indirect, and combined revascularization procedures currently employed by surgeons. The optimal surgical approach, however, remains unclear. In this decision analysis, the authors compared the effectiveness of revascularization procedures in both adult and pediatric patients with MMD. METHODS A comprehensive literature search was performed for studies of MMD. Using complication and success rates from the literature, the authors constructed a decision analysis model for treatment using a direct and indirect revascularization technique. Utility values for the various outcomes and complications were extracted from the literature examining preferences in similar clinical conditions. Sensitivity analysis was performed. RESULTS A structured literature search yielded 33 studies involving 4197 cases. Cases were divided into adult and pediatric populations. These were further subdivided into 3 different treatment groups: indirect, direct, and combined revascularization procedures. In the pediatric population at 5- and 10-year follow-up, there was no significant difference between indirect and combination procedures, but both were superior to direct revascularization. In adults at 4-year follow-up, indirect was superior to direct revascularization. CONCLUSIONS In the absence of factors that dictate a specific approach, the present decision analysis suggests that direct revascularization procedures are inferior in terms of quality-adjusted life years in both adults at 4 years and children at 5 and 10 years postoperatively, respectively. These findings were statistically significant (p < 0.001 in all cases), suggesting that indirect and combination procedures may offer optimal results at long-term follow-up.
An ANOVA approach for statistical comparisons of brain networks.
Fraiman, Daniel; Fraiman, Ricardo
2018-03-16
The study of brain networks has developed extensively over the last couple of decades. By contrast, techniques for the statistical analysis of these networks are less developed. In this paper, we focus on the statistical comparison of brain networks in a nonparametric framework and discuss the associated detection and identification problems. We tested network differences between groups with an analysis of variance (ANOVA) test we developed specifically for networks. We also propose and analyse the behaviour of a new statistical procedure designed to identify different subnetworks. As an example, we show the application of this tool in resting-state fMRI data obtained from the Human Connectome Project. We identify, among other variables, that the amount of sleep the days before the scan is a relevant variable that must be controlled. Finally, we discuss the potential bias in neuroimaging findings that is generated by some behavioural and brain structure variables. Our method can also be applied to other kind of networks such as protein interaction networks, gene networks or social networks.
Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.
Saccenti, Edoardo; Timmerman, Marieke E
2017-03-01
Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
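For comparison with the discussion above, a minimal Monte Carlo version of parallel analysis for covariance-based principal components is sketched below; the Tracy-Widom test itself is not implemented, and the number of simulations, percentile, and synthetic two-component data are illustrative.

```python
# Parallel analysis: compare observed covariance eigenvalues with the 95th
# percentile of eigenvalues from unstructured noise with matching column scales.
import numpy as np

def parallel_analysis(data, n_sim=500, quantile=95, seed=0):
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.cov(data, rowvar=False)))[::-1]
    col_sd = data.std(axis=0, ddof=1)
    sim = np.empty((n_sim, p))
    for i in range(n_sim):
        noise = rng.normal(scale=col_sd, size=(n, p))   # same scales, no structure
        sim[i] = np.sort(np.linalg.eigvalsh(np.cov(noise, rowvar=False)))[::-1]
    threshold = np.percentile(sim, quantile, axis=0)
    return int(np.sum(obs > threshold))                 # components above the noise bound

rng = np.random.default_rng(1)
scores = rng.normal(size=(300, 2)) @ rng.normal(size=(2, 10))   # two real components
X = scores + rng.normal(scale=0.5, size=(300, 10))
print(parallel_analysis(X))
```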
Vibroacoustic optimization using a statistical energy analysis model
NASA Astrophysics Data System (ADS)
Culla, Antonio; D`Ambrogio, Walter; Fregolent, Annalisa; Milana, Silvia
2016-08-01
In this paper, an optimization technique for medium-high frequency dynamic problems based on Statistical Energy Analysis (SEA) method is presented. Using a SEA model, the subsystem energies are controlled by internal loss factors (ILF) and coupling loss factors (CLF), which in turn depend on the physical parameters of the subsystems. A preliminary sensitivity analysis of subsystem energy to CLF's is performed to select CLF's that are most effective on subsystem energies. Since the injected power depends not only on the external loads but on the physical parameters of the subsystems as well, it must be taken into account under certain conditions. This is accomplished in the optimization procedure, where approximate relationships between CLF's, injected power and physical parameters are derived. The approach is applied on a typical aeronautical structure: the cabin of a helicopter.
Robustness of S1 statistic with Hodges-Lehmann for skewed distributions
NASA Astrophysics Data System (ADS)
Ahad, Nor Aishah; Yahaya, Sharipah Soaad Syed; Yin, Lee Ping
2016-10-01
Analysis of variance (ANOVA) is a commonly used parametric method to test differences in means for more than two groups when the populations are normally distributed. ANOVA is highly inefficient under non-normal and heteroscedastic settings. When the assumptions are violated, researchers look for alternatives such as the nonparametric Kruskal-Wallis test or robust methods. This study focused on a flexible method, the S1 statistic, for comparing groups using the median as the location estimator. The S1 statistic was modified by substituting the median with the Hodges-Lehmann estimator and the default scale estimator with the variance of Hodges-Lehmann and MADn, producing two different test statistics for comparing groups. The bootstrap method was used for testing the hypotheses, since the sampling distributions of these modified S1 statistics are unknown. The performance of the proposed statistics in terms of Type I error was measured and compared against the original S1 statistic, ANOVA and Kruskal-Wallis. The proposed procedures show improvement over the original statistic, especially under extremely skewed distributions.
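The sketch below illustrates the Hodges-Lehmann location estimator (median of Walsh averages) and a simple pooled-bootstrap comparison of two skewed groups; it does not reproduce the authors' modified S1 statistics or their scale estimators, and all data are simulated.

```python
# One-sample Hodges-Lehmann estimate plus a percentile-bootstrap two-group test.
import numpy as np

def hodges_lehmann(x):
    x = np.asarray(x)
    i, j = np.triu_indices(len(x))             # i <= j pairs (Walsh averages)
    return np.median((x[i] + x[j]) / 2.0)

def bootstrap_hl_diff(a, b, n_boot=2000, seed=0):
    rng = np.random.default_rng(seed)
    observed = hodges_lehmann(a) - hodges_lehmann(b)
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_boot):
        ra = rng.choice(pooled, size=len(a), replace=True)
        rb = rng.choice(pooled, size=len(b), replace=True)
        if abs(hodges_lehmann(ra) - hodges_lehmann(rb)) >= abs(observed):
            count += 1
    return observed, (count + 1) / (n_boot + 1)

rng = np.random.default_rng(3)
g1, g2 = rng.exponential(1.0, 30), rng.exponential(1.5, 30)   # skewed groups
print(bootstrap_hl_diff(g1, g2))
```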
Test data analysis for concentrating photovoltaic arrays
NASA Astrophysics Data System (ADS)
Maish, A. B.; Cannon, J. E.
A test data analysis approach for use with steady-state efficiency measurements taken on concentrating photovoltaic arrays is presented. The analysis procedures can be used to identify biased and erroneous data. The steps involved in analyzing the test data are screening the data, developing coefficients for the performance equation, analyzing statistics to ensure adequacy of the regression fit to the data, and plotting the data. In addition, the sources and magnitudes of the precision and bias errors that affect measurement accuracy are analyzed.
El Batawi, H Y
2015-02-01
To investigate the possible effect of intraoperative analgesia, namely diclofenac sodium compared to acetaminophen, on post-recovery pain perception in children undergoing painful dental procedures under general anaesthesia. A double-blind randomised clinical trial. A sample of 180 consecutive cases of children undergoing full dental rehabilitation under general anaesthesia in a private hospital in Saudi Arabia during 2013 was divided into three groups (60 children each) according to the analgesic used prior to extubation. Group A children received a diclofenac sodium suppository, Group B children received an acetaminophen suppository, and Group C was the control group. Using an authenticated Arabic version of the Wong and Baker FACES Pain Assessment Scale, patients were asked to choose the face that best matched the pain they were experiencing. Data were collected and recorded for statistical analysis. Student's t test was used for comparison of sample means. A preliminary F test to compare sample variances was carried out to determine the appropriate t test variant to be used. A p value less than 0.05 was considered significant. More than 93% of children had post-operative pain in varying degrees. A highly statistically significant difference was observed between children in groups A and B compared to control group C, with the latter scoring higher pain perception. Diclofenac showed higher potency for multiple painful procedures, while the difference was not statistically significant in children with three or fewer painful dental procedures. Diclofenac sodium is more potent than acetaminophen, especially for multiple pain-provoking or traumatic procedures. A timely use of NSAID analgesia just before extubation helps provide adequate coverage during recovery. Peri-operative analgesia is to be recommended as an essential treatment adjunct for child dental rehabilitation under general anaesthesia.
Monitoring Items in Real Time to Enhance CAT Security
ERIC Educational Resources Information Center
Zhang, Jinming; Li, Jie
2016-01-01
An IRT-based sequential procedure is developed to monitor items for enhancing test security. The procedure uses a series of statistical hypothesis tests to examine whether the statistical characteristics of each item under inspection have changed significantly during CAT administration. This procedure is compared with a previously developed…
Aggregative Learning Method and Its Application for Communication Quality Evaluation
NASA Astrophysics Data System (ADS)
Akhmetov, Dauren F.; Kotaki, Minoru
2007-12-01
In this paper, the so-called Aggregative Learning Method (ALM) is proposed to improve and simplify the learning and classification abilities of different data processing systems. It provides a universal basis for the design and analysis of mathematical models of a wide class. A procedure was elaborated for time series model reconstruction and analysis in both linear and nonlinear cases. Data approximation accuracy (during the learning phase) and data classification quality (during the recall phase) are estimated from the introduced statistical parameters. The validity and efficiency of the proposed approach have been demonstrated through its application to monitoring of wireless communication quality, namely for a Fixed Wireless Access (FWA) system. Low memory and computation resources were shown to be needed for the procedure's realization, especially for the data classification (recall) stage. Characterized by high computational efficiency and a simple decision-making procedure, the derived approaches can be useful for simple and reliable real-time surveillance and control system design.
Piepho, H P
1994-11-01
Multilocation trials are often used to analyse the adaptability of genotypes in different environments and to find, for each environment, the genotype that is best adapted, i.e., highest yielding in that environment. For this purpose, it is of interest to obtain a reliable estimate of the mean yield of a cultivar in a given environment. This article compares two different statistical estimation procedures for this task: Additive Main Effects and Multiplicative Interaction (AMMI) analysis and Best Linear Unbiased Prediction (BLUP). A modification of a cross-validation procedure commonly used with AMMI is suggested for trials that are laid out as a randomized complete block design. The use of these procedures is exemplified using five faba bean data sets from German registration trials. BLUP was found to outperform AMMI in four of the five faba bean data sets.
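A compact sketch of the AMMI idea for a genotype-by-environment mean table: fit additive main effects, apply a truncated SVD to the interaction residuals, and rebuild cell estimates from the leading multiplicative terms; BLUP and the suggested cross-validation scheme are not reproduced, and the yield table is simulated.

```python
# AMMI-style fit: additive main effects plus a truncated SVD of the interaction.
import numpy as np

def ammi_fit(table, n_terms=1):
    grand = table.mean()
    g_eff = table.mean(axis=1, keepdims=True) - grand      # genotype effects
    e_eff = table.mean(axis=0, keepdims=True) - grand      # environment effects
    additive = grand + g_eff + e_eff
    u, s, vt = np.linalg.svd(table - additive, full_matrices=False)
    interaction = (u[:, :n_terms] * s[:n_terms]) @ vt[:n_terms, :]
    return additive + interaction

rng = np.random.default_rng(0)
yields = rng.normal(5.0, 0.8, size=(10, 6))   # hypothetical 10 genotypes x 6 environments
print(ammi_fit(yields, n_terms=2).round(2))
```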
Using Statistical Process Control to Make Data-Based Clinical Decisions.
ERIC Educational Resources Information Center
Pfadt, Al; Wheeler, Donald J.
1995-01-01
Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
Statistical Cost Estimation in Higher Education: Some Alternatives.
ERIC Educational Resources Information Center
Brinkman, Paul T.; Niwa, Shelley
Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs are also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…
Development of the Ethical and Legal Issues in Counseling Self-Efficacy Scale
ERIC Educational Resources Information Center
Mullen, Patrick R.; Lambie, Glenn W.; Conley, Abigail H.
2014-01-01
The authors present the development of the Ethical and Legal Issues in Counseling Self-Efficacy Scale (ELICSES). The purpose of this article is threefold: (a) present a rationale for the ELICSES, (b) review statistical analysis procedures used to develop the ELICSES, and (c) offer implications for future research and counselor education.
Estimated Effects of Retirement Revision on Retention of Navy Tactical Pilots.
1986-12-01
A detailed explanation of the procedure and proofs can be found in Hanushek and Jackson, Statistical Methods for Social Scientists [Ref. 44]; see also Introduction to Econometrics, pp. 242-243, Prentice-Hall, 1978.
ERIC Educational Resources Information Center
Posey-Goodwin, Patricia Ann
2013-01-01
The purpose of this study was to explore differences in perceptions of mentoring activities from four generations of registered nurses in Florida, using the Alleman Mentoring Activities Questionnaire ® (AMAQ ®). Statistical procedures of analysis of variance (ANOVA) were employed to explore differences among 65 registered nurses in Florida from…
Return to Our Roots: Raising Radishes to Teach Experimental Design. Methods and Techniques.
ERIC Educational Resources Information Center
Stallings, William M.
1993-01-01
Reviews research in teaching applied statistics. Concludes that students should analyze data from studies they have designed and conducted. Describes an activity in which students study germination and growth of radish seeds. Includes a table providing student instructions for both the experimental procedure and data analysis. (CFR)
Forest statistics for Northwest Florida, 1987
Mark J. Brown
1987-01-01
The Forest Inventory and Analysis (Forest Survey) Research Work Unit at the Southeastern Forest Experiment Station recently conducted a review of its data processing procedures. During this process, a computer error was discovered which led to inflated estimates of annual removals, net annual growth, and annual mortality for the 1970-1980 remeasurement period in...
The Length of a Pestle: A Class Exercise in Measurement and Statistical Analysis.
ERIC Educational Resources Information Center
O'Reilly, James E.
1986-01-01
Outlines the simple exercise of measuring the length of an object as a concrete paradigm of the entire process of making chemical measurements and treating the resulting data. Discusses the procedure, significant figures, measurement error, spurious data, rejection of results, precision and accuracy, and student responses. (TW)
More Powerful Tests of Simple Interaction Contrasts in the Two-Way Factorial Design
ERIC Educational Resources Information Center
Hancock, Gregory R.; McNeish, Daniel M.
2017-01-01
For the two-way factorial design in analysis of variance, the current article explicates and compares three methods for controlling the Type I error rate for all possible simple interaction contrasts following a statistically significant interaction, including a proposed modification to the Bonferroni procedure that increases the power of…
T.M. Barrett
2004-01-01
During the 1990s, forest inventories for California, Oregon, and Washington were conducted by different agencies using different methods. The Pacific Northwest Research Station Forest Inventory and Analysis program recently integrated these inventories into a single database. This document briefly describes potential statistical methods for estimating population totals...
Punched Card System Needn't Be Complex to Give Complete Control.
ERIC Educational Resources Information Center
Bemis, Hazel T.
At Worcester Junior College, Massachusetts, use of a manually operated punched card system has resulted in (1) simplified registration procedures, (2) quick analysis of conflicts and problems in class scheduling, (3) ready access to statistical information, (4) directory information in a wide range of classifications, (5) easy verification of…
ERIC Educational Resources Information Center
Wang, Lijuan; Ha, Amy Sau-ching; Wen, Xu
2014-01-01
This research primarily aimed to examine the compatibility of teaching perspectives of teachers with the Physical Education (PE) curriculum in China. The Teaching Perspective Inventory (Pratt, 1998) was used to collect data from 272 PE teachers. Descriptive statistics, MANOVAs, and correlational procedures were used for quantitative data analysis.…
Investigating the Stability of Four Methods for Estimating Item Bias.
ERIC Educational Resources Information Center
Perlman, Carole L.; And Others
The reliability of item bias estimates was studied for four methods: (1) the transformed delta method; (2) Shepard's modified delta method; (3) Rasch's one-parameter residual analysis; and (4) the Mantel-Haenszel procedure. Bias statistics were computed for each sample using all methods. Data were from administration of multiple-choice items from…
Welding of AM350 and AM355 steel
NASA Technical Reports Server (NTRS)
Davis, R. J.; Wroth, R. S.
1967-01-01
A series of tests was conducted to establish optimum procedures for TIG welding and heat treating of AM350 and AM355 steel sheet in thicknesses ranging from 0.010 inch to 0.125 inch. Statistical analysis of the test data was performed to determine the anticipated minimum strength of the welded joints.
Emery, R J
1997-03-01
Institutional radiation safety programs routinely use wipe test sampling and liquid scintillation counting analysis to indicate the presence of removable radioactive contamination. Significant volumes of liquid waste can be generated by such surveillance activities, and the subsequent disposal of these materials can sometimes be difficult and costly. In settings where large numbers of negative results are regularly obtained, the limited grouping of samples for analysis based on expected value statistical techniques is possible. To demonstrate the plausibility of the approach, single wipe samples exposed to varying amounts of contamination were analyzed concurrently with nine non-contaminated samples. Although the sample grouping inevitably leads to increased quenching with liquid scintillation counting systems, the effect did not impact the ability to detect removable contamination in amounts well below recommended action levels. Opportunities to further improve this cost effective semi-quantitative screening procedure are described, including improvements in sample collection procedures, enhancing sample-counting media contact through mixing and extending elution periods, increasing sample counting times, and adjusting institutional action levels.
Statistical analysis of radioimmunoassay. In comparison with bioassay (in Japanese)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakano, R.
1973-01-01
Using radioimmunoassay (RIA) data, statistical procedures for dealing with two problems, linearization of the dose-response curve and calculation of relative potency, are described. There are three methods for linearization of the RIA dose-response curve; in each, the following variables are plotted on the horizontal and vertical axes: dose x versus (B/T)^-1; c/(x + c) versus B/T (where c is the dose at which B/T equals 50%); and log x versus logit B/T. Among them, the last method seems to be the most practical. The statistical procedures of bioassay were employed to calculate the relative potency of unknown samples compared to standard samples from the dose-response curves of standard and unknown samples, using the regression coefficient. It is desirable that relative potency be calculated by plotting more than 5 points on the standard curve and more than 2 points for unknown samples. To examine the statistical limits of precision of measurement, LH activity of gonadotropin in urine was measured, and the relative potency, precision coefficient, and upper and lower limits of relative potency at the 95% confidence limit were calculated. Bioassay (by the ovarian ascorbic acid reduction method and the anterior lobe of prostate weighing method) was also performed on the same samples, and its precision was compared with that of RIA. In these examinations, the upper and lower limits of the relative potency at the 95% confidence limit were close to each other for RIA, while in bioassay a considerable difference was observed between the upper and lower limits. The necessity of standardization and systematization of the statistical procedures for increasing the precision of RIA is pointed out.
Probabilistic micromechanics for metal matrix composites
NASA Astrophysics Data System (ADS)
Engelstad, S. P.; Reddy, J. N.; Hopkins, Dale A.
A probabilistic micromechanics-based nonlinear analysis procedure is developed to predict and quantify the variability in the properties of high temperature metal matrix composites. Monte Carlo simulation is used to model the probabilistic distributions of the constituent level properties including fiber, matrix, and interphase properties, volume and void ratios, strengths, fiber misalignment, and nonlinear empirical parameters. The procedure predicts the resultant ply properties and quantifies their statistical scatter. Graphite/copper and silicon carbide/titanium aluminide (SCS-6/Ti-15) unidirectional plies are considered to demonstrate the predictive capabilities. The procedure is believed to have a high potential for use in material characterization and selection to precede and assist in experimental studies of new high temperature metal matrix composites.
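A heavily simplified sketch of the Monte Carlo idea: constituent properties are sampled from assumed distributions and propagated through a rule-of-mixtures relation to quantify scatter in a ply property; the actual procedure uses a full nonlinear micromechanics model, and all distributions below are hypothetical.

```python
# Propagate assumed constituent-property scatter to a ply-level property.
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
E_f = rng.normal(390e9, 15e9, n)      # fiber modulus (Pa), assumed distribution
E_m = rng.normal(110e9, 8e9, n)       # matrix modulus (Pa), assumed distribution
V_f = np.clip(rng.normal(0.35, 0.03, n), 0.2, 0.5)    # fiber volume fraction

E_1 = V_f * E_f + (1.0 - V_f) * E_m   # rule of mixtures, longitudinal modulus
print(f"E1 mean = {E_1.mean()/1e9:.1f} GPa, std = {E_1.std()/1e9:.1f} GPa")
```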
NASA Astrophysics Data System (ADS)
Smid, Marek; Costa, Ana; Pebesma, Edzer; Granell, Carlos; Bhattacharya, Devanjan
2016-04-01
Humankind is now predominantly urban based, and the majority of continuing population growth will take place in urban agglomerations. Urban systems are not only major drivers of climate change, but also impact hot spots. Furthermore, climate change impacts are commonly managed at city scale. Therefore, assessing climate change impacts on urban systems is a highly relevant subject of research. Climate and its impacts at all levels (local, meso and global scale), as well as the inter-scale dependencies of those processes, should be subject to detailed analysis. While global and regional projections of future climate are currently available, local-scale information is lacking. Hence, statistical downscaling methodologies represent a potentially efficient way to help close this gap. In general, methodological reviews cover downscaling procedures according to their application (e.g. downscaling for hydrological modelling). Some of the most recent and comprehensive studies, such as the ESSEM COST Action ES1102 (VALUE), use the concepts of Perfect Prog and MOS. Other classification schemes of downscaling techniques consider three main categories: linear methods, weather classifications and weather generators. Downscaling and climate modelling represent a multidisciplinary field, where researchers from various backgrounds intersect their efforts, resulting in specific terminology that may be somewhat confusing. For instance, Polynomial Regression (also called Surface Trend Analysis) is a statistical technique, yet in the context of spatial interpolation procedures it is commonly classified as deterministic, while kriging approaches are classified as stochastic. Furthermore, the terms "statistical" and "stochastic" (frequently used as names of sub-classes in downscaling methodological reviews) are not always considered synonymous, even though both could be seen as identical since they refer to methods handling input modelling factors as variables with certain probability distributions. In addition, recent development is moving towards multi-step methodologies containing deterministic and stochastic components. This evolution leads to the introduction of new terms like hybrid or semi-stochastic approaches, which makes the effort to systematically classify downscaling methods into the previously defined categories even more challenging. This work presents a review of statistical downscaling procedures that classifies the methods in two steps. In the first step, we describe several techniques that produce a single climatic surface based on observations; the methods are classified into two categories using an approximation to the broadest consensual statistical terms: linear and non-linear methods. The second step covers techniques that use simulations to generate alternative surfaces corresponding to different realizations of the same processes. Those simulations are essential because the amount of real observational data is limited, and such procedures are crucial for modelling extremes. This work emphasises the link between statistical downscaling methods and research on climate change impacts at city scale.
Statistical analysis of multivariate atmospheric variables. [cloud cover
NASA Technical Reports Server (NTRS)
Tubbs, J. D.
1979-01-01
Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate to near-normal; (5) test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) test of fit for continuous distributions based upon the generalized minimum chi-square; (7) effect of correlated observations on confidence sets based upon chi-square statistics; and (8) generation of random variates from specified distributions.
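For item (3), the conditional distribution of one component of a bivariate normal given the other is itself normal; a minimal sketch of those conditional parameters follows (all numeric inputs are illustrative).

```python
# Conditional mean and standard deviation of Y given X = x in a bivariate normal.
import math

def conditional_normal(mu_x, mu_y, sigma_x, sigma_y, rho, x):
    mean = mu_y + rho * (sigma_y / sigma_x) * (x - mu_x)
    var = sigma_y**2 * (1.0 - rho**2)
    return mean, math.sqrt(var)

# e.g. two cloud-cover-related variables with correlation 0.6, conditioning on x = 1.5
print(conditional_normal(mu_x=0.0, mu_y=0.0, sigma_x=1.0, sigma_y=2.0, rho=0.6, x=1.5))
```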
CRISM Hyperspectral Data Filtering with Application to MSL Landing Site Selection
NASA Astrophysics Data System (ADS)
Seelos, F. P.; Parente, M.; Clark, T.; Morgan, F.; Barnouin-Jha, O. S.; McGovern, A.; Murchie, S. L.; Taylor, H.
2009-12-01
We report on the development and implementation of a custom filtering procedure for Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) IR hyperspectral data that is suitable for incorporation into the CRISM Reduced Data Record (RDR) calibration pipeline. Over the course of the Mars Reconnaissance Orbiter (MRO) Primary Science Phase (PSP) and the ongoing Extended Science Phase (ESP) CRISM has operated with an IR detector temperature between ~107 K and ~127 K. This ~20 K range in operational temperature has resulted in variable data quality, with observations acquired at higher detector temperatures exhibiting a marked increase in both systematic and stochastic noise. The CRISM filtering procedure consists of two main data processing capabilities. The primary systematic noise component in CRISM IR data appears as along track or column oriented striping. This is addressed by the robust derivation and application of an inter-column ratio correction frame. The correction frame is developed through the serial evaluation of band specific column ratio statistics and so does not compromise the spectral fidelity of the image cube. The dominant CRISM IR stochastic noise components appear as isolated data spikes or column oriented segments of variable length with erroneous data values. The non-systematic noise is identified and corrected through the application of an iterative-recursive kernel modeling procedure which employs a formal statistical outlier test as the iteration control and recursion termination criterion. This allows the filtering procedure to make a statistically supported determination between high frequency (spatial/spectral) signal and high frequency noise based on the information content of a given multidimensional data kernel. The governing statistical test also allows the kernel filtering procedure to be self regulating and adaptive to the intrinsic noise level in the data. The CRISM IR filtering procedure is scheduled to be incorporated into the next augmentation of the CRISM IR calibration (version 3). The filtering algorithm will be applied to the I/F data (IF) delivered to the Planetary Data System (PDS), but the radiance on sensor data (RA) will remain unfiltered. The development of CRISM hyperspectral analysis products in support of the Mars Science Laboratory (MSL) landing site selection process has motivated the advance of CRISM-specific data processing techniques. The quantitative results of the CRISM IR filtering procedure as applied to CRISM observations acquired in support of MSL landing site selection will be presented.
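A highly simplified single-band sketch of the two filtering steps described above, a column-ratio destriping correction followed by a robust sigma-clipping spike replacement; the flight pipeline's band-specific statistics and iterative-recursive kernel logic are not reproduced, and the synthetic frame and outlier threshold are assumptions.

```python
# Destripe (column-ratio correction) then despike (robust outlier replacement)
# one spectral band: rows = along track, columns = detector columns.
import numpy as np
from scipy.ndimage import median_filter

def destripe_despike(band, k=4.0):
    col_med = np.median(band, axis=0)
    correction = col_med / np.median(col_med)           # inter-column ratio frame
    flat = band / correction
    local = median_filter(flat, size=(5, 5))
    resid = flat - local
    mad = np.median(np.abs(resid - np.median(resid)))   # robust scale estimate
    spikes = np.abs(resid) > k * 1.4826 * mad           # approximate outlier test
    flat[spikes] = local[spikes]
    return flat

rng = np.random.default_rng(0)
band = rng.normal(1.0, 0.01, (200, 64)) * rng.normal(1.0, 0.05, 64)   # striped frame
band[rng.integers(0, 200, 30), rng.integers(0, 64, 30)] += 0.5        # isolated spikes
print(destripe_despike(band).std())
```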
Rodríguez-Arias, Miquel Angel; Rodó, Xavier
2004-03-01
Here we describe a practical, step-by-step primer to scale-dependent correlation (SDC) analysis. The analysis of transitory processes is an important but often neglected topic in ecological studies because only a few statistical techniques appear to detect temporary features accurately enough. We introduce here the SDC analysis, a statistical and graphical method to study transitory processes at any temporal or spatial scale. SDC analysis, thanks to the combination of conventional procedures and simple well-known statistical techniques, becomes an improved time-domain analogue of wavelet analysis. We use several simple synthetic series to describe the method, a more complex example, full of transitory features, to compare SDC and wavelet analysis, and finally we analyze some selected ecological series to illustrate the methodology. The SDC analysis of time series of copepod abundances in the North Sea indicates that ENSO primarily is the main climatic driver of short-term changes in population dynamics. SDC also uncovers some long-term, unexpected features in the population. Similarly, the SDC analysis of Nicholson's blowflies data locates where the proposed models fail and provides new insights about the mechanism that drives the apparent vanishing of the population cycle during the second half of the series.
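A rough sketch of the core SDC computation as described: Pearson correlations within sliding windows of a chosen scale, with a naive permutation threshold used to flag windows of unusually strong transitory coupling; the full lagged, two-way SDC procedure is not reproduced, and the series below are synthetic.

```python
# Scale-dependent (windowed) correlation with a naive permutation threshold.
import numpy as np

def sdc_fragments(x, y, s=12, n_perm=500, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    n = len(x)
    corrs = np.array([np.corrcoef(x[i:i + s], y[i:i + s])[0, 1]
                      for i in range(n - s + 1)])
    null = np.array([np.corrcoef(rng.normal(size=s), rng.normal(size=s))[0, 1]
                     for _ in range(n_perm)])
    thresh = np.quantile(np.abs(null), 1 - alpha)
    return [(i, c) for i, c in enumerate(corrs) if abs(c) > thresh]

rng = np.random.default_rng(1)
t = np.arange(240)
x = np.sin(2 * np.pi * t / 48) + rng.normal(0, 0.5, t.size)
y = x + rng.normal(0, 0.5, t.size)                     # transiently coupled series
print(len(sdc_fragments(x, y, s=12)))                  # number of flagged windows
```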
Systems Analysis Directorate Activities Summary - September 1977. Volume 1
1977-10-01
Report documentation keywords: chemical agent; censoring criteria; purity of the agent; statistical samples. Contents include a procedure for determining the serviceability category of chemical agent lots and a user's guide to the associated computer program. Volume II (CONF) contains an analysis of the operational capability of the 105mm M101A1 and M102 howitzers.
ERIC Educational Resources Information Center
Gilpatrick, Eleanor
This document is volume 3 of a four-volume report which describes the components of the Health Services Mobility Study (HSMS) method of task analysis, job ladder design, and curriculum development. Divided into four chapters, volume 3 is a manual for using HSMS computer based statistical procedures to design job structures and job ladders. Chapter…
An Analysis of the Navy’s Fiscal Year 2017 Shipbuilding Plan
2017-02-01
…the Navy would build a larger fleet of about 350 ships (see Table 5). Those three alternatives were chosen for illustrative purposes because variations… $3.2 billion. For more on procedures for estimating and applying learning curves, see Matthew S. Goldberg and Anduin E. Touw, Statistical Methods… guidance from Matthew Goldberg (formerly of CBO) and David Mosher. Raymond Hall of CBO's Budget Analysis Division produced the cost estimates with…
Analysis of cost regression and post-accident absence
NASA Astrophysics Data System (ADS)
Wojciech, Drozd
2017-07-01
The article presents issues related to the costs of work safety. It argues the thesis that economic aspects cannot be overlooked in the effective management of occupational health and safety, and that adequate expenditure on safety can bring tangible benefits to the company. A reliable analysis of this problem is essential for describing the problem of work safety. The article attempts to carry out such an analysis using the procedures of mathematical statistics [1, 2, 3].
The statistical analysis of circadian phase and amplitude in constant-routine core-temperature data
NASA Technical Reports Server (NTRS)
Brown, E. N.; Czeisler, C. A.
1992-01-01
Accurate estimation of the phases and amplitude of the endogenous circadian pacemaker from constant-routine core-temperature series is crucial for making inferences about the properties of the human biological clock from data collected under this protocol. This paper presents a set of statistical methods based on a harmonic-regression-plus-correlated-noise model for estimating the phases and the amplitude of the endogenous circadian pacemaker from constant-routine core-temperature data. The methods include a Bayesian Monte Carlo procedure for computing the uncertainty in these circadian functions. We illustrate the techniques with a detailed study of a single subject's core-temperature series and describe their relationship to other statistical methods for circadian data analysis. In our laboratory, these methods have been successfully used to analyze more than 300 constant routines and provide a highly reliable means of extracting phase and amplitude information from core-temperature data.
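A minimal sketch of the harmonic-regression component of the model: mean, cosine, and sine terms at an assumed 24-hour period are fit by least squares, and amplitude and phase are recovered; the correlated-noise term and the Bayesian Monte Carlo uncertainty computation are not reproduced, and the temperature series is synthetic.

```python
# Harmonic regression: temp ~ mu + a*cos(wt) + b*sin(wt), then amplitude/phase.
import numpy as np

def fit_harmonic(t_hours, temp, period=24.0):
    w = 2 * np.pi / period
    X = np.column_stack([np.ones_like(t_hours), np.cos(w * t_hours), np.sin(w * t_hours)])
    (mu, a, b), *_ = np.linalg.lstsq(X, temp, rcond=None)
    amplitude = np.hypot(a, b)
    phase_hours = (np.arctan2(b, a) / w) % period   # time of the fitted maximum
    return mu, amplitude, phase_hours

rng = np.random.default_rng(0)
t = np.arange(0, 40, 1 / 6)                         # 40 h sampled every 10 min
temp = 37.0 + 0.4 * np.cos(2 * np.pi * (t - 17) / 24) + rng.normal(0, 0.05, t.size)
print(fit_harmonic(t, temp))                        # mean near 37, amplitude near 0.4, phase near 17 h
```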
Application of microarray analysis on computer cluster and cloud platforms.
Bernau, C; Boulesteix, A-L; Knaus, J
2013-01-01
Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
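As an illustration of the embarrassingly parallel workloads discussed, the sketch below splits the iterations of a two-group permutation test across local worker processes; on a cluster or cloud instance the same batches could instead be dispatched by a scheduler, and the data and batch sizes are illustrative.

```python
# Permutation-test replicates distributed across local worker processes.
import numpy as np
from multiprocessing import Pool

def perm_batch(args):
    a, b, n_perm, seed = args
    rng = np.random.default_rng(seed)
    pooled, n_a = np.concatenate([a, b]), len(a)
    obs = a.mean() - b.mean()
    hits = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        hits += abs(perm[:n_a].mean() - perm[n_a:].mean()) >= abs(obs)
    return hits

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a, b = rng.normal(0.0, 1, 50), rng.normal(0.4, 1, 50)
    n_workers, per_worker = 4, 2500
    with Pool(n_workers) as pool:
        hits = sum(pool.map(perm_batch, [(a, b, per_worker, s) for s in range(n_workers)]))
    print("permutation p-value:", (hits + 1) / (n_workers * per_worker + 1))
```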
Effect of in-office bleaching agents on physical properties of dental composite resins.
Mourouzis, Petros; Koulaouzidou, Elisabeth A; Helvatjoglu-Antoniades, Maria
2013-04-01
The physical properties of dental restorative materials have a crucial effect on the longevity of restorations and on meeting the esthetic demands of patients, but they may be compromised by bleaching treatments. The purpose of this study was to evaluate the effects of in-office bleaching agents on the physical properties of three composite resin restorative materials. The bleaching agents used were hydrogen peroxide and carbamide peroxide at high concentrations. Specimens of each material were prepared, cured, and polished. Measurements of color difference, microhardness, and surface roughness were recorded before and after bleaching, and data were examined statistically by analysis of variance (ANOVA) and the Tukey HSD post-hoc test at P < .05. The measurements showed that the hue and chroma of the silorane-based composite resin altered after the bleaching procedure (P < .05). No statistically significant differences were found in the microhardness and surface roughness of the composite resins tested (P > .05). The silorane-based composite resin showed some color alteration after bleaching procedures. The bleaching procedure did not alter the microhardness or surface roughness of any of the composite resins tested.
Pristipino, Christian; Roncella, Adriana; Trani, Carlo; Nazzaro, Marco S; Berni, Andrea; Di Sciascio, Germano; Sciahbasi, Alessandro; Musarò, Salvatore Donato; Mazzarotto, Pietro; Gioffrè, Gaetano; Speciale, Giulio
2010-06-01
To assess: the reasons behind an operator choosing to perform radial artery catheterisation (RAC) as against femoral arterial catheterisation, and to explore why RAC may fail in the real world. A pre-determined analysis of PREVAIL study database was performed. Relevant data were collected in a prospective, observational survey of 1,052 consecutive patients undergoing invasive cardiovascular procedures at nine Italian hospitals over a one month observation period. By multivariate analysis, the independent predictors of RAC choice were having the procedure performed: (1) at a high procedural volume centre; and (2) by an operator who performs a high volume of radial procedures; clinical variables played no statistically significant role. RAC failure was predicted independently by (1) a lower operator propensity to use RAC; and (2) the presence of obstructive peripheral artery disease. A 10-fold lower rate of RAC failure was observed among operators who perform RAC for > 85% of their personal caseload than among those who use RAC < 25% of the time (3.8% vs. 33.0%, respectively); by receiver operator characteristic (ROC) analysis, no threshold value for operator RAC volume predicted RAC failure. A routine RAC in all-comers is superior to a selective strategy in terms of feasibility and success rate.
FACTORS ASSOCIATED WITH ODONTOGENIC BACTERAEMIA IN ORTHODONTIC PATIENTS.
Umeh, O D; Sanu, O O; Utomi, I L; Nwaokorie, F O
2016-01-01
Various studies have investigated factors associated with the prevalence and intensity of bacteraemia following oral procedures, including orthodontic procedures. The aim of this study was to determine the effect of age, gender, and plaque and gingival indices on the occurrence of odontogenic bacteraemia following orthodontic treatment procedures. Orthodontic Clinic, Lagos University Teaching Hospital (LUTH), Lagos, Nigeria. Using a consecutive, convenience sampling method, a total of 100 subjects who met the inclusion criteria were recruited for the study, and peripheral blood was collected before and again within 2 minutes of completion of orthodontic procedures for microbiologic analysis using the BACTEC automated blood culture system and the lysis filtration method of blood culturing. The subjects were randomly placed in one of four orthodontic procedure groups: alginate impression making (Group I), separator placement (Group II), band cementation (Group III) and arch wire change (Group IV). Plaque and gingival indices were assessed before blood collection using the plaque component of the Simplified Oral Hygiene Index (OHI-S) (Greene & Vermillion) and the Modified Gingival Index (Lobene), respectively. Spearman and point-biserial correlations and logistic regression statistics were used for statistical evaluation at the p < 0.05 level. Overall baseline prevalences of bacteraemia of 3% and 17% were observed using the BACTEC and lysis filtration methods, respectively. Similarly, overall prevalences of bacteraemia following orthodontic treatment procedures of 16% and 28% were observed using the BACTEC and lysis filtration methods, respectively. A statistically significant increase in the prevalence of bacteraemia was observed following separator placement (p=0.016). Increases in age, plaque index scores and modified gingival index scores were found to be associated with an increase in the prevalence of bacteraemia following orthodontic treatment procedures, with plaque index score showing the strongest correlation. Separator placement was found to induce the highest level of bacteraemia. Meticulous oral hygiene practice and the use of 0.2% chlorhexidine mouth rinse prior to separator placement may be considered effective measures for reducing the oral bacterial load and consequently the occurrence of bacteraemia following orthodontic treatment procedures.
A New Way for Antihelixplasty in Prominent Ear Surgery: Modified Postauricular Fascial Flap.
Taş, Süleyman; Benlier, Erol
2016-06-01
Otoplasty procedures aim to reduce the concha-mastoid angle and recreate the antihelical fold. Here, we describe the modified postauricular fascial flap, a new way of recreating the antihelical fold, and report the results of patients in whom this flap was used. The technique was used in 24 patients (10 females and 14 males; age, 6-27 years; mean, 16.7 years) between June 2009 and July 2012, for a total of 48 procedures (bilateral). Follow-up ranged from 1 to 3 years (mean, 1.5 years). Preoperatively and postoperatively (1 and 12 months after surgery), all patients had their upper and middle helix-head distances measured and were photographed. The records were analyzed statistically using the t test and analysis of variance. The procedure resulted in ears that were natural in appearance without any significant visible evidence of surgery. There were no complications except in 1 patient, who developed a small skin ulcer on the left ear because of band pressure. Comparison of the preoperative and postoperative upper and middle helix-head distances showed a highly statistically significant difference. The modified postauricular fascial flap is a simple and safe procedure for recreating the antihelical fold. It offers several benefits, including a natural-appearing antihelical fold, prevention of suture extrusion and granuloma, and a minimized risk of recurrence owing to neochondrogenesis. This method may be used as a standard procedure for the surgical treatment of prominent ears.
A statistical method for the conservative adjustment of false discovery rate (q-value).
Lai, Yinglei
2017-03-14
q-value is a widely used statistical method for estimating false discovery rate (FDR), which is a conventional significance measure in the analysis of genome-wide expression data. q-value is a random variable and it may underestimate FDR in practice. An underestimated FDR can lead to unexpected false discoveries in the follow-up validation experiments. This issue has not been well addressed in literature, especially in the situation when the permutation procedure is necessary for p-value calculation. We proposed a statistical method for the conservative adjustment of q-value. In practice, it is usually necessary to calculate p-value by a permutation procedure. This was also considered in our adjustment method. We used simulation data as well as experimental microarray or sequencing data to illustrate the usefulness of our method. The conservativeness of our approach has been mathematically confirmed in this study. We have demonstrated the importance of conservative adjustment of q-value, particularly in the situation that the proportion of differentially expressed genes is small or the overall differential expression signal is weak.
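For context, the sketch below computes standard Benjamini-Hochberg style q-values, the quantity whose conservative adjustment the article proposes; the adjustment itself and the permutation-based p-value calculation are not reproduced, and the p-values are illustrative.

```python
# Benjamini-Hochberg q-values: p * m / rank, made monotone from the largest p down.
import numpy as np

def bh_qvalues(pvals):
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order] * m / np.arange(1, m + 1)            # p * m / rank
    q_sorted = np.minimum.accumulate(ranked[::-1])[::-1]   # enforce monotonicity
    q = np.empty_like(q_sorted)
    q[order] = np.clip(q_sorted, 0, 1)                     # back to original order
    return q

print(bh_qvalues([0.001, 0.008, 0.039, 0.041, 0.20, 0.74]))
```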
Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization
ERIC Educational Resources Information Center
Lock, Robin H.; Lock, Patti Frazer
2008-01-01
Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation-based procedures in an integrated curriculum…
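A minimal sketch of the two simulation-based procedures mentioned (a bootstrap confidence interval and a randomization test for a difference in means) is shown below; the measurements are made up and the example is not tied to any particular curriculum.

```python
import numpy as np

rng = np.random.default_rng(42)
group_a = rng.normal(10.0, 2.0, size=30)   # hypothetical measurements
group_b = rng.normal(11.2, 2.0, size=30)

# Bootstrap 95% confidence interval for the mean of group A
boot_means = [rng.choice(group_a, size=group_a.size, replace=True).mean()
              for _ in range(10_000)]
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])

# Randomization (permutation) test for the difference in group means
observed = group_b.mean() - group_a.mean()
pooled = np.concatenate([group_a, group_b])
count = 0
for _ in range(10_000):
    perm = rng.permutation(pooled)
    diff = perm[group_a.size:].mean() - perm[:group_a.size].mean()
    if abs(diff) >= abs(observed):
        count += 1
p_value = (count + 1) / (10_000 + 1)

print(f"bootstrap 95% CI for mean(A): ({ci_low:.2f}, {ci_high:.2f})")
print(f"permutation test p-value: {p_value:.4f}")
```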
Estimating the probability of rare events: addressing zero failure data.
Quigley, John; Revie, Matthew
2011-07-01
Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where no events are realized. A comparison is made with the MLE and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated by 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
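A small numerical illustration of the closing approximation: for zero observed events in n trials, the proposed estimate is roughly 1/(2.5n), whereas the MLE is zero. The snippet below simply tabulates that stated formula for a few sample sizes; it is a sketch of the approximation, not of the full minimax derivation.

```python
# Approximate probability estimate when zero events are observed in n trials.
def zero_event_estimate(n: int) -> float:
    return 1.0 / (2.5 * n)

for n in (10, 50, 100, 1000):
    print(f"n = {n:5d}  MLE = 0  approx. minimax estimate = {zero_event_estimate(n):.5f}")
```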
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yip, Doris; Vanasco, Matthew; Funaki, Brian
2004-01-15
To compare complication rates and tube performance of percutaneous mushroom gastrostomy, balloon gastrostomy, and gastrojejunostomy. Between September 9, 1999 and April 23, 2001, 203 patients underwent 250 radiologically guided percutaneous gastrostomy and gastrojejunostomy procedures. Follow-up was conducted through chart reviews and review of our interventional radiology database. Procedural and catheter-related complications were recorded. Chi-square statistical analysis was performed. In patients receiving mushroom-retained gastrostomy catheters (n = 114), the major complication rate was 0.88% (n = 1), the minor complication rate was 5.3% (n = 6), and the tube complication rate was 4.4% (n = 5). In patients receiving balloon-retained gastrostomy tubes (n = 67), the major complication rate was 0, the minor complication rate was 4.5% (n = 3), and the tube complication rate was 34.3% (n = 23). In patients receiving gastrojejunostomy catheters (n = 69), the major complication rate was 1.4% (n = 1), the minor complication rate was 2.9% (n = 2), and the tube complication rate was 34.8% (n = 24). No statistically significant differences were found between procedural or peri-procedural complications among the different types of tubes. Mushroom-retained catheters had significantly fewer tube complications (p < 0.01). Percutaneous gastrostomy and gastrojejunostomy have similar procedural and peri-procedural complication rates. Mushroom gastrostomy catheters have fewer tube-related complications compared with balloon gastrostomy and gastrojejunostomy catheters. In addition, mushroom-retained catheters exhibit the best overall long-term tube patency and are therefore the gastrostomy catheter of choice.
Precision of guided scanning procedures for full-arch digital impressions in vivo.
Zimmermann, Moritz; Koller, Christina; Rumetsch, Moritz; Ender, Andreas; Mehl, Albert
2017-11-01
System-specific scanning strategies have been shown to influence the accuracy of full-arch digital impressions. Special guided scanning procedures have been implemented for specific intraoral scanning systems with special regard to the digital orthodontic workflow. The aim of this study was to evaluate the precision of guided scanning procedures compared to conventional impression techniques in vivo. Two intraoral scanning systems with implemented full-arch guided scanning procedures (Cerec Omnicam Ortho; Ormco Lythos) were included along with one conventional impression technique with irreversible hydrocolloid material (alginate). Full-arch impressions were taken three times each from 5 participants (n = 15). Impressions were then compared within the test groups using a point-to-surface distance method after best-fit model matching (OraCheck). Precision was calculated using the (90-10%)/2 quantile and statistical analysis with one-way repeated measures ANOVA and post hoc Bonferroni test was performed. The conventional impression technique with alginate showed the lowest precision for full-arch impressions with 162.2 ± 71.3 µm. Both guided scanning procedures performed statistically significantly better than the conventional impression technique (p < 0.05). Mean values for group Cerec Omnicam Ortho were 74.5 ± 39.2 µm and for group Ormco Lythos 91.4 ± 48.8 µm. The in vivo precision of guided scanning procedures exceeds conventional impression techniques with the irreversible hydrocolloid material alginate. Guided scanning procedures may be highly promising for clinical applications, especially for digital orthodontic workflows.
The Statistical Consulting Center for Astronomy (SCCA)
NASA Technical Reports Server (NTRS)
Akritas, Michael
2001-01-01
The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi(sup 2) minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed an increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools, and matched these tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail, and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/ mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks in meetings including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and published papers of astrostatistical content.
Clinical skills temporal degradation assessment in undergraduate medical education.
Fisher, Joseph; Viscusi, Rebecca; Ratesic, Adam; Johnstone, Cameron; Kelley, Ross; Tegethoff, Angela M; Bates, Jessica; Situ-Lacasse, Elaine H; Adamas-Rappaport, William J; Amini, Richard
2018-01-01
Medical students' ability to learn clinical procedures and competently apply these skills is an essential component of medical education. Complex skills with limited opportunity for practice have been shown to degrade without continued refresher training. To our knowledge there is no evidence that objectively evaluates temporal degradation of clinical skills in undergraduate medical education. The purpose of this study was to evaluate temporal retention of clinical skills among third year medical students. This was a cross-sectional study conducted at four separate time intervals in the cadaver laboratory at a public medical school. Forty-five novice third year medical students were evaluated for retention of skills in the following three procedures: pigtail thoracostomy, femoral line placement, and endotracheal intubation. Prior to the start of third-year medical clerkships, medical students participated in a two-hour didactic session designed to teach clinically relevant materials including the procedures. Prior to the start of their respective surgery clerkships, students were asked to perform the same three procedures and were evaluated by trained emergency medicine and surgery faculty for retention rates, using three validated checklists. Students were then reassessed at six week intervals in four separate groups based on the start date of their respective surgical clerkships. We compared the evaluation results between students tested one week after training and those tested at three later dates for statistically significant differences in score distribution using a one-tailed Wilcoxon Mann-Whitney U-test for non-parametric rank-sum analysis. Retention rates were shown to have a statistically significant decline between six and 12 weeks for all three procedural skills. In the instruction of medical students, skill degradation should be considered when teaching complex technical skills. Based on the statistically significant decline in procedural skills noted in our investigation, instructors should consider administering a refresher course between six and twelve weeks from initial training.
Automating approximate Bayesian computation by local linear regression.
Thornton, Kevin R
2009-07-07
In several biological contexts, parameter inference often relies on computationally intensive techniques. "Approximate Bayesian Computation" (ABC) methods based on summary statistics have become increasingly popular. A particular flavor of ABC based on using a linear regression to approximate the posterior distribution of the parameters, conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone, and fully documented. 2. The program will automatically process multiple data sets, and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data, or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and therefore is a general tool for processing the results from any simulation. 6. The code is open-source, and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local linear regression.
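The regression-adjustment step that ABCreg automates can be sketched in a few lines. The following is a generic illustration of ABC with a local linear-regression adjustment on a toy normal-mean problem, not ABCreg's actual code; the prior, tolerance and summary choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Observed data and its summary statistics (toy example: unknown normal mean)
observed = rng.normal(2.0, 1.0, size=50)
s_obs = np.array([observed.mean(), observed.std()])

# 1. Simulate parameters from the prior and data sets from the model
n_sims = 50_000
theta = rng.uniform(-5, 5, size=n_sims)                      # prior on the mean
sims = rng.normal(theta[:, None], 1.0, size=(n_sims, 50))
summaries = np.column_stack([sims.mean(axis=1), sims.std(axis=1)])

# 2. Rejection step: keep the simulations whose summaries are closest to s_obs
dist = np.linalg.norm(summaries - s_obs, axis=1)
keep = dist <= np.quantile(dist, 0.01)                       # 1% tolerance
theta_acc, s_acc = theta[keep], summaries[keep]

# 3. Local linear regression of theta on (s - s_obs), then adjust
X = np.column_stack([np.ones(keep.sum()), s_acc - s_obs])
coef, *_ = np.linalg.lstsq(X, theta_acc, rcond=None)
theta_adj = theta_acc - (s_acc - s_obs) @ coef[1:]

print(f"posterior mean (rejection only): {theta_acc.mean():.3f}")
print(f"posterior mean (regression-adjusted): {theta_adj.mean():.3f}")
```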
Taylor-Brown, F E; Cardy, T J A; Liebel, F X; Garosi, L; Kenny, P J; Volk, H A; De Decker, S
2015-12-01
Early post-operative neurological deterioration is a well-known complication following dorsal cervical laminectomies and hemilaminectomies in dogs. This study aimed to evaluate potential risk factors for early post-operative neurological deterioration following these surgical procedures. Medical records of 100 dogs that had undergone a cervical dorsal laminectomy or hemilaminectomy between 2002 and 2014 were assessed retrospectively. Assessed variables included signalment, bodyweight, duration of clinical signs, neurological status before surgery, diagnosis, surgical site, type and extent of surgery and duration of procedure. Outcome measures were neurological status immediately following surgery and duration of hospitalisation. Univariate statistical analysis was performed to identify variables to be included in a multivariate model. Diagnoses included osseous associated cervical spondylomyelopathy (OACSM; n = 41), acute intervertebral disk extrusion (IVDE; 31), meningioma (11), spinal arachnoid diverticulum (10) and vertebral arch anomalies (7). Overall 54% (95% CI 45.25-64.75) of dogs were neurologically worse 48 h post-operatively. Multivariate statistical analysis identified four factors significantly related to early post-operative neurological outcome. Diagnoses of OACSM or meningioma were considered the strongest variables to predict early post-operative neurological deterioration, followed by higher (more severely affected) neurological grade before surgery and longer surgery time. This information can aid in the management of expectations of clinical staff and owners with dogs undergoing these surgical procedures. Copyright © 2015 Elsevier Ltd. All rights reserved.
Analysis of Statistical Methods and Errors in the Articles Published in the Korean Journal of Pain
Yim, Kyoung Hoon; Han, Kyoung Ah; Park, Soo Young
2010-01-01
Background Statistical analysis is essential for obtaining objective reliability in medical research. However, medical researchers often do not have enough statistical knowledge to properly analyze their study data. To help understand and potentially alleviate this problem, we have analyzed the statistical methods and errors of articles published in the Korean Journal of Pain (KJP), with the intention of improving the statistical quality of the journal. Methods All the articles, except case reports and editorials, published from 2004 to 2008 in the KJP were reviewed. The types of applied statistical methods and errors in the articles were evaluated. Results One hundred and thirty-nine original articles were reviewed. Inferential statistics and descriptive statistics were used in 119 papers and 20 papers, respectively. Only 20.9% of the papers were free from statistical errors. The most commonly adopted statistical method was the t-test (21.0%) followed by the chi-square test (15.9%). Errors of omission were encountered 101 times in 70 papers. Among the errors of omission, "no statistics used even though statistical methods were required" was the most common (40.6%). The errors of commission were encountered 165 times in 86 papers, among which "parametric inference for nonparametric data" was the most common (33.9%). Conclusions We found various types of statistical errors in the articles published in the KJP. This suggests that meticulous attention should be given not only to applying statistical procedures but also to the reviewing process, to improve the value of the articles. PMID:20552071
Maintaining Atmospheric Mass and Water Balance Within Reanalysis
NASA Technical Reports Server (NTRS)
Takacs, Lawrence L.; Suarez, Max; Todling, Ricardo
2015-01-01
This report describes the modifications implemented into the Goddard Earth Observing System Version-5 (GEOS-5) Atmospheric Data Assimilation System (ADAS) to maintain global conservation of dry atmospheric mass as well as to preserve the model balance of globally integrated precipitation and surface evaporation during reanalysis. Section 1 begins with a review of these global quantities from four current reanalysis efforts. Section 2 introduces the modifications necessary to preserve these constraints within the atmospheric general circulation model (AGCM), the Gridpoint Statistical Interpolation (GSI) analysis procedure, and the Incremental Analysis Update (IAU) algorithm. Section 3 presents experiments quantifying the impact of the new procedure. Section 4 shows preliminary results from its use within the GMAO MERRA-2 Reanalysis project. Section 5 concludes with a summary.
Principal component regression analysis with SPSS.
Liu, R X; Kuang, J; Gong, Q; Hou, X L
2003-06-01
The paper introduces the indices used for multicollinearity diagnosis, the basic principle of principal component regression, and the method for determining the 'best' equation. A worked example describes how to perform principal component regression analysis with SPSS 10.0, including all calculation steps of the principal component regression and the linear regression, factor analysis, descriptives, compute variable, and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance caused by multicollinearity. A simplified, faster, and accurate statistical analysis is achieved through principal component regression with SPSS.
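A minimal numpy sketch of the principal component regression idea described here (standardize the predictors, extract components, regress the response on the leading components, back-transform the coefficients); it mirrors the general procedure rather than the SPSS 10.0 dialogs, and the simulated collinear data are an assumption.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated collinear predictors and a response
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # nearly collinear with x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
y = 2 * x1 - 1 * x3 + rng.normal(scale=0.5, size=n)

# 1. Standardize predictors
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# 2. Principal components via SVD of the standardized matrix
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)
k = 2                                       # keep the leading components
scores = Z @ Vt[:k].T

# 3. Regress the response on the retained component scores
A = np.column_stack([np.ones(n), scores])
gamma, *_ = np.linalg.lstsq(A, y, rcond=None)

# 4. Back-transform to coefficients on the standardized predictors
beta_std = Vt[:k].T @ gamma[1:]
print("variance explained by components:", np.round(explained, 3))
print("coefficients on standardized predictors:", np.round(beta_std, 3))
```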
Yılmaz, Koray; Özyürek, Taha
2017-04-01
The aim of this study was to compare the amount of debris extruded from the apex during retreatment procedures with ProTaper Next (PTN; Dentsply Maillefer, Ballaigues, Switzerland), Reciproc (RCP; VDW, Munich, Germany), and Twisted File Adaptive (TFA; SybronEndo, Orange, CA) files and the duration of these retreatment procedures. Ninety upper central incisor teeth were prepared and filled with gutta-percha and AH Plus sealer (Dentsply DeTrey, Konstanz, Germany) using the vertical compaction technique. The teeth were randomly divided into 3 groups of 30 for removal of the root filling material with PTN, RCP, and TFA files. The apically extruded debris was collected in preweighed Eppendorf tubes. The time for gutta-percha removal was recorded. Data were statistically analyzed using Kruskal-Wallis and 1-way analysis of variance tests. The amount of debris extruded was RCP > TFA > PTN. Compared with the PTN group, the amount of debris extruded in the RCP group was statistically significantly higher (P < .001). There was no statistically significant difference among the RCP, TFA, and PTN groups regarding the time for retreatment (P > .05). Within the limitations of this in vitro study, all groups were associated with debris extrusion from the apex. The RCP file system led to higher levels of apical extrusion than the PTN file system. In addition, there was no significant difference among groups in the duration of the retreatment procedures. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Choi, Yeon-Ju; Son, Wonsoo; Park, Ki-Su
2016-01-01
Objective This study used the intradural procedural time to assess the overall technical difficulty involved in surgically clipping an unruptured middle cerebral artery (MCA) aneurysm via a pterional or superciliary approach. The clinical and radiological variables affecting the intradural procedural time were investigated, and the intradural procedural time compared between a superciliary keyhole approach and a pterional approach. Methods During a 5.5-year period, patients with a single MCA aneurysm were enrolled in this retrospective study. The selection criteria for a superciliary keyhole approach included : 1) maximum diameter of the unruptured MCA aneurysm <15 mm, 2) neck diameter of the MCA aneurysm <10 mm, and 3) aneurysm location involving the sphenoidal or horizontal segment of MCA (M1) segment and MCA bifurcation, excluding aneurysms distal to the MCA genu. Meanwhile, the control comparison group included patients with the same selection criteria as for a superciliary approach, yet who preferred a pterional approach to avoid a postoperative facial wound or due to preoperative skin trouble in the supraorbital area. To determine the variables affecting the intradural procedural time, a multiple regression analysis was performed using such data as the patient age and gender, maximum aneurysm diameter, aneurysm neck diameter, and length of the pre-aneurysm M1 segment. In addition, the intradural procedural times were compared between the superciliary and pterional patient groups, along with the other variables. Results A total of 160 patients underwent a superciliary (n=124) or pterional (n=36) approach for an unruptured MCA aneurysm. In the multiple regression analysis, an increase in the diameter of the aneurysm neck (p<0.001) was identified as a statistically significant factor increasing the intradural procedural time. A Pearson correlation analysis also showed a positive correlation (r=0.340) between the neck diameter and the intradural procedural time. When comparing the superciliary and pterional groups, no statistically significant between-group difference was found in terms of the intradural procedural time reflecting the technical difficulty (mean±standard deviation : 29.8±13.0 min versus 27.7±9.6 min). Conclusion A superciliary keyhole approach can be a useful alternative to a pterional approach for an unruptured MCA aneurysm with a maximum diameter <15 mm and neck diameter <10 mm, representing no more of a technical challenge. For both surgical approaches, the technical difficulty increases along with the neck diameter of the MCA aneurysm. PMID:27847568
NASA Astrophysics Data System (ADS)
Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.
2017-11-01
The frequency of occurrence and magnitude of precipitation and temperature extreme events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrences, and mitigate their effects. For this purpose, we augmented the web-GIS "CLIMATE" to include a dedicated statistical package developed in the R language. The web-GIS "CLIMATE" is a software platform for cloud storage, processing and visualization of distributed archives of spatial datasets. It is based on a combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing the spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes new, powerful methods for time-dependent statistics of extremes, quantile regression and the copula approach for the detailed analysis of various climate extreme events. Specifically, the very promising copula approach allows the structural connections between the extremes and various environmental characteristics to be obtained. The new statistical methods integrated into the web-GIS "CLIMATE" can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.
Using statistical process control to make data-based clinical decisions.
Pfadt, A; Wheeler, D J
1995-01-01
Applied behavior analysis is based on an investigation of variability due to interrelationships among antecedents, behavior, and consequences. This permits testable hypotheses about the causes of behavior and allows the course of treatment to be evaluated empirically. Such information provides corrective feedback for making data-based clinical decisions. This paper considers how a different approach to the analysis of variability, based on the writings of Walter Shewhart and W. Edwards Deming in the area of industrial quality control, helps to achieve similar objectives. Statistical process control (SPC) was developed to implement a process of continual product improvement while achieving compliance with production standards and other requirements for promoting customer satisfaction. SPC involves the use of simple statistical tools, such as histograms and control charts, as well as problem-solving techniques, such as flow charts, cause-and-effect diagrams, and Pareto charts, to implement Deming's management philosophy. These data-analytic procedures can be incorporated into a human service organization to help achieve its stated objectives in a manner that leads to continuous improvement in the functioning of the clients who are its customers. Examples are provided to illustrate how SPC procedures can be used to analyze behavioral data. Issues related to the application of these tools for making data-based clinical decisions and for creating an organizational climate that promotes their routine use in applied settings are also considered.
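A minimal sketch of one of the simple SPC tools mentioned, an individuals (XmR) control chart with the conventional 2.66 x mean-moving-range limits, is given below. The behavioral measurements are simulated, and flagging rules beyond the basic out-of-limits check are omitted.

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical daily counts of a target behavior
data = rng.normal(20, 3, size=30)
data[25] = 35                         # an unusually high value

center = data.mean()
moving_range = np.abs(np.diff(data))
mr_bar = moving_range.mean()
ucl = center + 2.66 * mr_bar          # upper control limit
lcl = center - 2.66 * mr_bar          # lower control limit

out_of_control = np.where((data > ucl) | (data < lcl))[0]
print(f"center = {center:.1f}, limits = ({lcl:.1f}, {ucl:.1f})")
print("out-of-control observations at indices:", out_of_control.tolist())
```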
Konukoglu, Ender; Coutu, Jean-Philippe; Salat, David H; Fischl, Bruce
2016-07-01
Diffusion magnetic resonance imaging (dMRI) is a unique technology that allows the noninvasive quantification of microstructural tissue properties of the human brain in healthy subjects as well as the probing of disease-induced variations. Population studies of dMRI data have been essential in identifying pathological structural changes in various conditions, such as Alzheimer's and Huntington's diseases (Salat et al., 2010; Rosas et al., 2006). The most common form of dMRI involves fitting a tensor to the underlying imaging data (known as diffusion tensor imaging, or DTI), then deriving parametric maps, each quantifying a different aspect of the underlying microstructure, e.g. fractional anisotropy and mean diffusivity. To date, the statistical methods utilized in most DTI population studies either analyzed only one such map or analyzed several of them, each in isolation. However, it is most likely that variations in the microstructure due to pathology or normal variability would affect several parameters simultaneously, with differing variations modulating the various parameters to differing degrees. Therefore, joint analysis of the available diffusion maps can be more powerful in characterizing histopathology and distinguishing between conditions than the widely used univariate analysis. In this article, we propose a multivariate approach for statistical analysis of diffusion parameters that uses partial least squares correlation (PLSC) analysis and permutation testing as building blocks in a voxel-wise fashion. Stemming from the common formulation, we present three different multivariate procedures for group analysis, regressing-out nuisance parameters and comparing effects of different conditions. We used the proposed procedures to study the effects of non-demented aging, Alzheimer's disease and mild cognitive impairment on the white matter. Here, we present results demonstrating that the proposed PLSC-based approach can differentiate between effects of different conditions in the same region as well as uncover spatial variations of effects across the white matter. The proposed procedures were able to answer questions on structural variations such as: "are there regions in the white matter where Alzheimer's disease has a different effect than aging or similar effect as aging?" and "are there regions in the white matter that are affected by both mild cognitive impairment and Alzheimer's disease but with differing multivariate effects?" Copyright © 2016 Elsevier Inc. All rights reserved.
Konukoglu, Ender; Coutu, Jean-Philippe; Salat, David H.; Fischl, Bruce
2016-01-01
Diffusion magnetic resonance imaging (dMRI) is a unique technology that allows the noninvasive quantification of microstructural tissue properties of the human brain in healthy subjects as well as the probing of disease-induced variations. Population studies of dMRI data have been essential in identifying pathological structural changes in various conditions, such as Alzheimer’s and Huntington’s diseases1,2. The most common form of dMRI involves fitting a tensor to the underlying imaging data (known as Diffusion Tensor Imaging, or DTI), then deriving parametric maps, each quantifying a different aspect of the underlying microstructure, e.g. fractional anisotropy and mean diffusivity. To date, the statistical methods utilized in most DTI population studies either analyzed only one such map or analyzed several of them, each in isolation. However, it is most likely that variations in the microstructure due to pathology or normal variability would affect several parameters simultaneously, with differing variations modulating the various parameters to differing degrees. Therefore, joint analysis of the available diffusion maps can be more powerful in characterizing histopathology and distinguishing between conditions than the widely used univariate analysis. In this article, we propose a multivariate approach for statistical analysis of diffusion parameters that uses partial least squares correlation (PLSC) analysis and permutation testing as building blocks in a voxel-wise fashion. Stemming from the common formulation, we present three different multivariate procedures for group analysis, regressing-out nuisance parameters and comparing effects of different conditions. We used the proposed procedures to study the effects of non-demented aging, Alzheimer’s disease and mild cognitive impairment on the white matter. Here, we present results demonstrating that the proposed PLSC-based approach can differentiate between effects of different conditions in the same region as well as uncover spatial variations of effects across the white matter. The proposed procedures were able to answer questions on structural variations such as: “are there regions in the white matter where Alzheimer’s disease has a different effect than aging or similar effect as aging?” and “are there regions in the white matter that are affected by both mild cognitive impairment and Alzheimer’s disease but with differing multivariate effects?” PMID:27103138
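A compact sketch of the PLSC-plus-permutation building block described above, for a single "voxel": singular value decomposition of the cross-correlation between diffusion parameters and design variables, with a permutation test on the first singular value. The data, dimensions and number of permutations are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical data: 80 subjects, 4 diffusion parameters, 2 design variables (age, group)
n = 80
Y = rng.normal(size=(n, 4))                    # e.g. FA, MD, RD, AD at one voxel
X = np.column_stack([rng.uniform(50, 90, n),   # age
                     rng.integers(0, 2, n)])   # diagnosis group

def first_singular_value(X, Y):
    Xz = (X - X.mean(0)) / X.std(0)
    Yz = (Y - Y.mean(0)) / Y.std(0)
    R = Xz.T @ Yz / (len(X) - 1)               # cross-correlation matrix
    return np.linalg.svd(R, compute_uv=False)[0]

observed = first_singular_value(X, Y)

# Permutation test: break the X-Y correspondence by shuffling rows of Y
n_perm = 2000
null = np.array([first_singular_value(X, Y[rng.permutation(n)])
                 for _ in range(n_perm)])
p_value = (np.sum(null >= observed) + 1) / (n_perm + 1)
print(f"first singular value = {observed:.3f}, permutation p = {p_value:.4f}")
```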
OPATs: Omnibus P-value association tests.
Chen, Chia-Wei; Yang, Hsin-Chou
2017-07-10
Combining statistical significances (P-values) from a set of single-locus association tests in genome-wide association studies is a proof-of-principle method for identifying disease-associated genomic segments, functional genes and biological pathways. We review P-value combinations for genome-wide association studies and introduce an integrated analysis tool, Omnibus P-value Association Tests (OPATs), which provides popular P-value combination methods. The software OPATs, programmed in R with an R graphical user interface, features a user-friendly interface. In addition to analysis modules for data quality control and single-locus association tests, OPATs provides three types of set-based association test: window-, gene- and biopathway-based association tests. P-value combinations with or without threshold and rank truncation are provided. The significance of a set-based association test is evaluated by using resampling procedures. Performance of the set-based association tests in OPATs has been evaluated by simulation studies and real data analyses. These set-based association tests help boost the statistical power, alleviate the multiple-testing problem, reduce the impact of genetic heterogeneity, increase the replication efficiency of association tests and facilitate the interpretation of association signals by streamlining the testing procedures and integrating the genetic effects of multiple variants in genomic regions of biological relevance. In summary, P-value combinations facilitate the identification of marker sets associated with disease susceptibility and uncover missing heritability in association studies, thereby establishing a foundation for the genetic dissection of complex diseases and traits. OPATs provides an easy-to-use and statistically powerful analysis tool for P-value combinations. OPATs, examples, and the user guide can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/genetics/association/OPATs.htm. © The Author 2017. Published by Oxford University Press.
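One of the classic P-value combinations underlying set-based tests of this kind is Fisher's method; a minimal sketch is shown below using scipy. This is a generic illustration, not OPATs' implementation, and it ignores the truncation and resampling refinements the tool provides; the p-values are assumed for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical single-locus p-values for the SNPs in one gene or window
pvals = np.array([0.04, 0.20, 0.008, 0.55, 0.11])

# Fisher's method: -2 * sum(log p) ~ chi-square with 2k degrees of freedom
statistic = -2 * np.sum(np.log(pvals))
p_combined = stats.chi2.sf(statistic, df=2 * len(pvals))
print(f"Fisher statistic = {statistic:.2f}, combined p = {p_combined:.4f}")

# scipy offers the same combination directly
print(stats.combine_pvalues(pvals, method="fisher"))
```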
Statistical atlas based extrapolation of CT data
NASA Astrophysics Data System (ADS)
Chintalapani, Gouthami; Murphy, Ryan; Armiger, Robert S.; Lepisto, Jyri; Otake, Yoshito; Sugano, Nobuhiko; Taylor, Russell H.; Armand, Mehran
2010-02-01
We present a framework to estimate the missing anatomical details from a partial CT scan with the help of statistical shape models. The motivating application is periacetabular osteotomy (PAO), a technique for treating developmental hip dysplasia, an abnormal condition of the hip socket that, if untreated, may lead to osteoarthritis. The common goals of PAO are to reduce pain, joint subluxation and improve contact pressure distribution by increasing the coverage of the femoral head by the hip socket. While current diagnosis and planning is based on radiological measurements, because of significant structural variations in dysplastic hips, a computer-assisted geometrical and biomechanical planning based on CT data is desirable to help the surgeon achieve optimal joint realignments. Most of the patients undergoing PAO are young females, hence it is usually desirable to minimize the radiation dose by scanning only the joint portion of the hip anatomy. These partial scans, however, do not provide enough information for biomechanical analysis due to missing iliac region. A statistical shape model of full pelvis anatomy is constructed from a database of CT scans. The partial volume is first aligned with the statistical atlas using an iterative affine registration, followed by a deformable registration step and the missing information is inferred from the atlas. The atlas inferences are further enhanced by the use of X-ray images of the patient, which are very common in an osteotomy procedure. The proposed method is validated with a leave-one-out analysis method. Osteotomy cuts are simulated and the effect of atlas predicted models on the actual procedure is evaluated.
Mura, Maria Chiara; De Felice, Marco; Morlino, Roberta; Fuselli, Sergio
2010-01-01
In step with the need to develop statistical procedures to manage small-size environmental samples, in this work we have used concentration values of benzene (C6H6), concurrently detected by seven outdoor and indoor monitoring stations over 12 000 minutes, in order to assess the representativeness of collected data and the impact of the pollutant on the indoor environment. Clearly, the former issue is strictly connected to sampling-site geometry, which proves critical to correctly retrieving information from analysis of pollutants of sanitary interest. Therefore, according to current criteria for network-planning, single stations have been interpreted as nodes of a set of adjoining triangles; then, a) node pairs have been taken into account in order to estimate pollutant stationarity on triangle sides, as well as b) node triplets, to statistically associate data from air-monitoring with the corresponding territory area, and c) node sextuplets, to assess the impact probability of the outdoor pollutant on the indoor environment for each area. Distributions from the various node combinations are all non-Gaussian; consequently, Kruskal-Wallis (KW) non-parametric statistics have been exploited to test variability on the continuous density function from each pair, triplet and sextuplet. Results from the above-mentioned statistical analysis have shown randomness of site selection, which has not allowed a reliable generalization of monitoring data to the entire selected territory, except for a single "forced" case (70%); most importantly, they suggest a possible procedure to optimize network design.
Camara, Jorge G; Ruszkowski, Joseph M; Worak, Sandra R
2008-06-25
Music and surgery. To determine the effect of live classical piano music on vital signs of patients undergoing ophthalmic surgery. Retrospective case series. 203 patients who underwent various ophthalmologic procedures in a period during which a piano was present in the operating room of St. Francis Medical Center. [Note: St. Francis Medical Center has recently been renamed Hawaii Medical Center East.] Demographic data, surgical procedures, and the vital signs of 203 patients who underwent ophthalmic procedures were obtained from patient records. Blood pressure, heart rate, and respiratory rate measured in the preoperative holding area were compared with the same parameters taken in the operating room, with and without exposure to live piano music. A paired t-test was used for statistical analysis. Mean arterial pressure, heart rate, and respiratory rate. 115 patients who were exposed to live piano music showed a statistically significant decrease in mean arterial blood pressure, heart rate, and respiratory rate in the operating room compared with their vital signs measured in the preoperative holding area (P < .0001). The control group of 88 patients not exposed to live piano music showed a statistically significant increase in mean arterial blood pressure (P < .0002) and heart rate and respiratory rate (P < .0001). Live classical piano music lowered the blood pressure, heart rate, and respiratory rate in patients undergoing ophthalmic surgery.
Lee, L.; Helsel, D.
2005-01-01
Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
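A simplified sketch of the regression-on-order-statistics idea for a single detection limit is shown below (the cited tools handle multiple limits and are the appropriate choice in practice): detected values are regressed on normal quantiles of their plotting positions, and censored observations are imputed from that fit before computing summary statistics. The example concentrations and detection limit are made up.

```python
import numpy as np
from scipy import stats

# Hypothetical concentrations; values below the detection limit are censored
detection_limit = 1.0
detected = np.array([1.2, 1.5, 2.3, 3.1, 4.8, 6.0, 9.5])
n_censored = 5                                  # "<1.0" observations
n = len(detected) + n_censored

# Plotting positions for all ranks; censored values occupy the lowest ranks
pp = (np.arange(1, n + 1) - 0.5) / n
z = stats.norm.ppf(pp)

# Fit a lognormal model to the detected (highest-ranked) values
slope, intercept, *_ = stats.linregress(z[n_censored:], np.log(np.sort(detected)))

# Impute the censored observations from the fitted line
imputed = np.exp(intercept + slope * z[:n_censored])
full = np.concatenate([imputed, np.sort(detected)])

print(f"estimated mean = {full.mean():.2f}, median = {np.median(full):.2f}")
```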
Statistical design of quantitative mass spectrometry-based proteomic experiments.
Oberg, Ann L; Vitek, Olga
2009-05-01
We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
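As a small illustration of the sample-size point, the snippet below computes the number of biological replicates per group needed to detect an assumed standardized effect with a two-sample t-test. The effect size, power and alpha are assumptions, and a real proteomics design would normally also account for multiple testing across proteins.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=1.0,   # assumed standardized difference
                                   alpha=0.05,
                                   power=0.80,
                                   ratio=1.0)
print(f"replicates needed per group: {n_per_group:.1f}")
```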
Statistical errors in molecular dynamics averages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiferl, S.K.; Wallace, D.C.
1985-11-15
A molecular dynamics calculation produces a time-dependent fluctuating signal whose average is a thermodynamic quantity of interest. The average of the kinetic energy, for example, is proportional to the temperature. A procedure is described for determining when the molecular dynamics system is in equilibrium with respect to a given variable, according to the condition that the mean and the bandwidth of the signal should be sensibly constant in time. Confidence limits for the mean are obtained from an analysis of a finite length of the equilibrium signal. The role of serial correlation in this analysis is discussed. The occurrence of unstable behavior in molecular dynamics data is noted, and a statistical test for a level shift is described.
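A minimal sketch of the serial-correlation issue raised here: for a correlated time series, the naive standard error of the mean understates the uncertainty, and an effective sample size based on the integrated autocorrelation time gives a more honest confidence limit. The AR(1) signal and the simple autocorrelation cutoff below are illustrative assumptions, not the report's procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Correlated "kinetic energy" signal: an AR(1) process around a mean of 1.5
n, phi = 20_000, 0.9
x = np.empty(n)
x[0] = 1.5
for t in range(1, n):
    x[t] = 1.5 + phi * (x[t - 1] - 1.5) + rng.normal(scale=0.05)

mean = x.mean()
var = x.var(ddof=1)

# Integrated autocorrelation time (sum normalized autocovariances until first negative)
xc = x - mean
tau = 1.0
for lag in range(1, n // 100):
    rho = np.dot(xc[:-lag], xc[lag:]) / ((n - lag) * var)
    if rho <= 0:
        break
    tau += 2 * rho

n_eff = n / tau
sem_naive = np.sqrt(var / n)
sem_corr = np.sqrt(var / n_eff)
print(f"mean = {mean:.4f}")
print(f"naive 95% CI half-width:     {1.96 * sem_naive:.5f}")
print(f"corrected 95% CI half-width: {1.96 * sem_corr:.5f} (n_eff ~ {n_eff:.0f})")
```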
Super-delta: a new differential gene expression analysis procedure with robust data normalization.
Liu, Yuhang; Zhang, Jinfeng; Qiu, Xing
2017-12-21
Normalization is an important data preparation step in gene expression analyses, designed to remove various sources of systematic noise. Sample variance is greatly reduced after normalization, hence the power of subsequent statistical analyses is likely to increase. On the other hand, variance reduction is made possible by borrowing information across all genes, including differentially expressed genes (DEGs) and outliers, which will inevitably introduce some bias. This bias typically inflates type I error and can reduce statistical power in certain situations. In this study we propose a new differential expression analysis pipeline, dubbed super-delta, that consists of a multivariate extension of the global normalization and a modified t-test. A robust procedure is designed to minimize the bias introduced by DEGs in the normalization step. The modified t-test is derived based on asymptotic theory for hypothesis testing that suitably pairs with the proposed robust normalization. We first compared super-delta with four commonly used normalization methods: global, median-IQR, quantile, and cyclic loess normalization in simulation studies. Super-delta was shown to have better statistical power with tighter control of type I error rate than its competitors. In many cases, the performance of super-delta is close to that of an oracle test in which datasets without technical noise were used. We then applied all methods to a collection of gene expression datasets on breast cancer patients who received neoadjuvant chemotherapy. While there is a substantial overlap of the DEGs identified by all of them, super-delta was able to identify comparatively more DEGs than its competitors. Downstream gene set enrichment analysis confirmed that all these methods selected largely consistent pathways. Detailed investigations of the relatively small differences showed that pathways identified by super-delta have better connections to breast cancer than those identified by other methods. As a new pipeline, super-delta provides new insights into the area of differential gene expression analysis. A solid theoretical foundation supports its asymptotic unbiasedness and technical noise-free properties. Implementation on real and simulated datasets demonstrates its decent performance compared with state-of-the-art procedures. It also has the potential to be extended to other data types and/or more general between-group comparison problems.
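For orientation, a sketch of the conventional pipeline that super-delta modifies is given below: global (mean-centering) normalization of each sample followed by gene-wise t-tests with Benjamini-Hochberg correction. This is the standard baseline, not the super-delta procedure itself, and the expression matrix and effect sizes are simulated assumptions.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(9)

# Simulated log-expression: 1000 genes x (10 control + 10 treated) samples
genes, n_per_group = 1000, 10
data = rng.normal(8, 1, size=(genes, 2 * n_per_group))
data[:50, n_per_group:] += 1.5                     # 50 truly differential genes
data += rng.normal(0, 0.3, size=2 * n_per_group)   # sample-level technical shifts

# Global normalization: remove each sample's mean across genes
normalized = data - data.mean(axis=0, keepdims=True)

# Gene-wise two-sample t-tests, then Benjamini-Hochberg adjustment
t, p = stats.ttest_ind(normalized[:, :n_per_group],
                       normalized[:, n_per_group:], axis=1)
reject, p_adj, *_ = multipletests(p, alpha=0.05, method="fdr_bh")
print("genes called differential:", int(reject.sum()))
```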
Paillet, Frederick L.; Crowder, R.E.
1996-01-01
Quantitative analysis of geophysical logs in ground-water studies often involves at least as broad a range of applications and variation in lithology as is typically encountered in petroleum exploration, making such logs difficult to calibrate and complicating inversion problem formulation. At the same time, data inversion and analysis depend on inversion model formulation and refinement, so that log interpretation cannot be deferred to a geophysical log specialist unless active involvement with interpretation can be maintained by such an expert over the lifetime of the project. We propose a generalized log-interpretation procedure designed to guide hydrogeologists in the interpretation of geophysical logs, and in the integration of log data into ground-water models that may be systematically refined and improved in an iterative way. The procedure is designed to maximize the effective use of three primary contributions from geophysical logs: (1) The continuous depth scale of the measurements along the well bore; (2) The in situ measurement of lithologic properties and the correlation with hydraulic properties of the formations over a finite sample volume; and (3) Multiple independent measurements that can potentially be inverted for multiple physical or hydraulic properties of interest. The approach is formulated in the context of geophysical inversion theory, and is designed to be interfaced with surface geophysical soundings and conventional hydraulic testing. The step-by-step procedures given in our generalized interpretation and inversion technique are based on both qualitative analysis designed to assist formulation of the interpretation model, and quantitative analysis used to assign numerical values to model parameters. The approach bases a decision as to whether quantitative inversion is statistically warranted by formulating an over-determined inversion. If no such inversion is consistent with the inversion model, quantitative inversion is judged not possible with the given data set. Additional statistical criteria such as the statistical significance of regressions are used to guide the subsequent calibration of geophysical data in terms of hydraulic variables in those situations where quantitative data inversion is considered appropriate.
ERIC Educational Resources Information Center
Strang, Kenneth David
2009-01-01
This paper discusses how a seldom-used statistical procedure, recursive regression (RR), can numerically and graphically illustrate data-driven nonlinear relationships and interactions of variables. This routine falls into the family of exploratory techniques, yet a few interesting features make it a valuable complement to factor analysis and…
On Improving the Experiment Methodology in Pedagogical Research
ERIC Educational Resources Information Center
Horakova, Tereza; Houska, Milan
2014-01-01
The paper shows how the methodology for a pedagogical experiment can be improved through including the pre-research stage. If the experiment has the form of a test procedure, an improvement of methodology can be achieved using for example the methods of statistical and didactic analysis of tests which are traditionally used in other areas, i.e.…
ERIC Educational Resources Information Center
Toutkoushian, Robert K.
This paper proposes a five-step process by which to analyze whether the salary ratio between junior and senior college faculty exhibits salary compression, a term used to describe an unusually small differential between faculty with different levels of experience. The procedure utilizes commonly used statistical techniques (multiple regression…
Finding and Developing Moderators and Directional Keys by Regression Analysis.
ERIC Educational Resources Information Center
Kokosh, John
A procedure for rapid screening of variables as potential moderators is presented and discussed. A moderator is defined as any variable which can be used to identify differentially predictable persons; or defined statistically by stating that if a predictor and a moderator are each divided into three or more categories and used as independent…
Assessing tree and stand biomass: a review with examples and critical comparisons
Bernard R. Parresol
1999-01-01
There is considerable interest today in estimating the biomass of trees and forests for both practical forestry issues and scientific purposes. New techniques and procedures are brought together along with the more traditional approaches to estimating woody biomass. General model forms and weighted analysis are reviewed, along with statistics for evaluating and...
Data-Mining Techniques in Detecting Factors Linked to Academic Achievement
ERIC Educational Resources Information Center
Martínez Abad, Fernando; Chaparro Caso López, Alicia A.
2017-01-01
In light of the emergence of statistical analysis techniques based on data mining in education sciences, and the potential they offer to detect non-trivial information in large databases, this paper presents a procedure used to detect factors linked to academic achievement in large-scale assessments. The study is based on a non-experimental,…
Assessing the feasibility and profitability of cable logging in southern upland hardwood forests
Chris B. LeDoux; Dennis M. May; Tony Johnson; Richard H. Widmann
1995-01-01
Procedures developed to assess available timber supplies from upland hardwood forest statistics reported by the USDA Forest Services' Forest Inventory and Analysis unit were modified to assess the feasibility and profitability of cable logging in southern upland hardwood forests. Depending on the harvest system and yarding distance used, cable logging can be...
Dennis M. May; Chris B. LeDoux; John B. Tansey; Richard Widmann
1994-01-01
Procedures developed to assess available timber supplies from upland hardwood forest statistics reported by the U.S. Department of Agriculture, Forest Service, Forest Inventory and Analysis (FIA) units, were modified to demonstrate the impact of three in-woods product-merchandizing options on profitable logging opportunities in upland hardwood forests in 14 Southern...
Agarwal, Chitra; Deora, Savita; Abraham, Dennis; Gaba, Rohini; Kumar, Baron Tarun; Kudva, Praveen
2015-01-01
Context: Nowadays, esthetics plays an important role in dentistry, along with the function of the prosthesis. Various soft tissue augmentation procedures are available to correct ridge defects in the anterior region. A newer technique, the vascularized interpositional periosteal connective tissue (VIP-CT) flap, has been introduced; it has the potential to augment a predictable amount of tissue and has many benefits when compared to other techniques. Aim: The study was designed to determine the efficacy of the VIP-CT flap in augmenting the ridge defect. Materials and Methods: Ten patients with Class III (Seibert's) ridge defects were treated with the VIP-CT flap technique before fabrication of a fixed partial denture. Height and width of the ridge defects were measured before and after the procedure. Subsequent follow-up was done every 3 months for 1 year. Statistical Analysis Used: A paired t-test was performed to detect the significance of the procedure. Results: The surgical site healed uneventfully. A predictable amount of soft tissue augmentation was achieved with the procedure. The increase in height and width of the ridge was statistically highly significant. Conclusion: The VIP-CT flap technique was effective in augmenting the soft tissue in the esthetic area, and the result remained stable over a long period. PMID:25810597
Adjustment of geochemical background by robust multivariate statistics
Zhou, D.
1985-01-01
Conventional analyses of exploration geochemical data assume that the background is a constant or slowly changing value, equivalent to a plane or a smoothly curved surface. However, it is better to regard the geochemical background as a rugged surface, varying with changes in geology and environment. This rugged surface can be estimated from observed geological, geochemical and environmental properties by using multivariate statistics. A method of background adjustment was developed and applied to groundwater and stream sediment reconnaissance data collected from the Hot Springs Quadrangle, South Dakota, as part of the National Uranium Resource Evaluation (NURE) program. Source-rock lithology appears to be a dominant factor controlling the chemical composition of groundwater or stream sediments. The most efficacious adjustment procedure is to regress uranium concentration on selected geochemical and environmental variables for each lithologic unit, and then to delineate anomalies by a common threshold set as a multiple of the standard deviation of the combined residuals. Robust versions of regression and RQ-mode principal components analysis techniques were used rather than ordinary techniques to guard against distortion caused by outliers. Anomalies delineated by this background adjustment procedure correspond with uranium prospects much better than do anomalies delineated by conventional procedures. The procedure should be applicable to geochemical exploration at different scales for other metals. © 1985.
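A sketch of the adjustment idea described above: within each lithologic unit, regress the element of interest on explanatory variables with a robust (Huber) estimator, pool the residuals, and flag samples whose residuals exceed a multiple of the residual standard deviation. The data, grouping, covariates and the 2.5-sigma threshold below are illustrative assumptions, not the study's values.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)

# Hypothetical stream-sediment data: uranium vs. two covariates, two lithologic units
records = []
for unit in ("sandstone", "granite"):
    n = 150
    x = rng.normal(size=(n, 2))                       # e.g. Th content, organic carbon
    u = 3.0 + (1.0 if unit == "granite" else 0.2) + x @ np.array([0.8, -0.3]) \
        + rng.normal(scale=0.4, size=n)
    u[:3] += 3.0                                      # a few genuine anomalies
    records.append((unit, x, u))

residuals, labels = [], []
for unit, x, u in records:
    fit = sm.RLM(u, sm.add_constant(x), M=sm.robust.norms.HuberT()).fit()
    residuals.append(fit.resid)
    labels.extend([unit] * len(u))
residuals = np.concatenate(residuals)

# Common threshold: a multiple of the standard deviation of the combined residuals
threshold = 2.5 * residuals.std(ddof=1)
anomalies = np.where(residuals > threshold)[0]
print(f"threshold = {threshold:.2f}; {len(anomalies)} anomalous samples flagged")
```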
Pontis, Alessandro; Sedda, Federica; Mereu, Liliana; Podda, Mauro; Melis, Gian Benedetto; Pisanu, Adolfo; Angioni, Stefano
2016-09-01
To critically appraise published randomized controlled trials (RCTs) comparing laparo-endoscopic single-site (LESS) and multi-port laparoscopic (MPL) approaches in gynecologic operative surgery; the aim was to assess the feasibility, safety, and potential benefits of LESS in comparison to MPL. A systematic review and meta-analysis of eleven RCTs. Women undergoing operative LESS and MPL gynecologic procedures (hysterectomy, cystectomy, salpingectomy, salpingo-oophorectomy, myomectomy). Outcomes evaluated were as follows: postoperative overall morbidity, postoperative pain evaluation at 6, 12, 24 and 48 h, cosmetic patient satisfaction, conversion rate, body mass index (BMI), operative time, blood loss, hemoglobin drop, postoperative hospital stay. Eleven RCTs comprising 956 women with gynecologic surgical disease randomized to either LESS (477) or MPL procedures (479) were analyzed systematically. The LESS approach showed longer operative times and better cosmetic results than MPL, but the differences were not statistically significant. Operative outcomes, postoperative recovery, postoperative morbidity and patient satisfaction are similar in LESS and MPL. LESS may be considered an alternative to MPL with comparable feasibility and safety in gynecologic operative procedures. However, it does not offer the expected advantages in terms of postoperative pain and cosmetic satisfaction.
De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos
2014-06-01
Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR by use of dilutions of an integration standard and on samples of 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need of a standard dilution curve. Implementation of the CI estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
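The core of the Poisson-based analysis can be sketched as follows: with k negative reactions out of n replicates, the mean number of detectable integration events per reaction is estimated as -ln(k/n), and a confidence interval follows from the binomial uncertainty on k/n. This is a generic single-dilution illustration with assumed counts, not the authors' full workflow or calibration against the integration standard.

```python
import numpy as np
from scipy import stats

n_replicates = 42
n_negative = 12           # assumed number of negative Alu-HIV PCR reactions

# Poisson estimate of integration events per reaction
fraction_negative = n_negative / n_replicates
lam = -np.log(fraction_negative)

# 95% CI on the negative fraction (Clopper-Pearson), propagated to lambda
lo = stats.beta.ppf(0.025, n_negative, n_replicates - n_negative + 1)
hi = stats.beta.ppf(0.975, n_negative + 1, n_replicates - n_negative)
lam_hi, lam_lo = -np.log(lo), -np.log(hi)

print(f"estimated events/reaction: {lam:.2f} (95% CI {lam_lo:.2f}-{lam_hi:.2f})")
```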
Automatically inserted technical details improve radiology report accuracy.
Abujudeh, Hani H; Govindan, Siddharth; Narin, Ozden; Johnson, Jamlik Omari; Thrall, James H; Rosenthal, Daniel I
2011-09-01
To assess the effect of automatically inserted technical details on the concordance of a radiology report header with the actual procedure performed. The study was IRB approved and informed consent was waived. We obtained radiology report audit data from the hospital's compliance office from the period of January 2005 through December 2009 spanning a total of 20 financial quarters. A "discordance percentage" was defined as the percentage of total studies in which a procedure code change was made during auditing. Using Chi-square analysis we compared discordance percentages between reports with manually inserted technical details (MITD) and automatically inserted technical details (AITD). The second quarter data of 2007 was not included in the analysis as the switch from MITD to AITD occurred during this quarter. The hospital's compliance office audited 9,110 studies from 2005-2009. Excluding the 564 studies in the second quarter of 2007, we analyzed a total of 8,546 studies, 3,948 with MITD and 4,598 with AITD. The discordance percentage in the MITD group was 3.95% (156/3,948, range per quarter, 1.5- 6.1%). The AITD discordance percentage was 1.37% (63/4,598, range per quarter, 0.0-2.6%). A Chi-square analysis determined a statistically significant difference between the 2 groups (P < 0.001). There was a statistically significant improvement in the concordance of a radiology report header with the performed procedure using automatically inserted technical details compared to manually inserted details. Copyright © 2011 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Qiu, Xing; Hu, Rui; Wu, Zhixin
2014-01-01
Normalization procedures are widely used in high-throughput genomic data analyses to remove various technological noise and variations. They are known to have a profound impact on the subsequent gene differential expression analysis. Although there has been some research in evaluating different normalization procedures, few attempts have been made to systematically evaluate the gene detection performances of normalization procedures from the bias-variance trade-off point of view, especially with strong gene differentiation effects and large sample sizes. In this paper, we conduct a thorough study to evaluate the effects of normalization procedures combined with several commonly used statistical tests and multiple testing procedures (MTPs) under different configurations of effect size and sample size. We conduct theoretical evaluation based on a random effect model, as well as simulation and biological data analyses to verify the results. Based on our findings, we provide some practical guidance for selecting a suitable normalization procedure under different scenarios. PMID:24941114
NASA Astrophysics Data System (ADS)
Lototzis, M.; Papadopoulos, G. K.; Droulia, F.; Tseliou, A.; Tsiros, I. X.
2018-04-01
There are several cases where a circular variable is associated with a linear one. A typical example is wind direction, which is often associated with linear quantities such as air temperature and air humidity. A statistical relationship of this kind can be tested using parametric and non-parametric methods, each of which has its own advantages and drawbacks. This work deals with correlation analysis using both the parametric and the non-parametric procedure on a small set of meteorological data of air temperature and wind direction during a summer period in a Mediterranean climate. Correlations were examined between hourly, daily and maximum-prevailing values, under typical and non-typical meteorological conditions. Both tests indicated a strong correlation between mean hourly wind direction and mean hourly air temperature, whereas mean daily wind direction and mean daily air temperature do not seem to be correlated. In some cases, however, the two procedures were found to give quite dissimilar levels of significance for the rejection or not of the null hypothesis of no correlation. The simple statistical analysis presented in this study, appropriately extended to large sets of meteorological data, may be a useful tool for estimating the effects of wind in local climate studies.
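For illustration, one common parametric option is Mardia's circular-linear correlation coefficient, computed from the correlations of the linear variable with the sine and cosine of the angle; a minimal Python sketch with hypothetical wind-direction and temperature data follows (a rank-based analogue would give the non-parametric counterpart).

    import numpy as np

    def circular_linear_corr(theta_deg, x):
        """Mardia's circular-linear correlation between an angle (degrees) and a linear variable."""
        theta = np.radians(theta_deg)
        rxc = np.corrcoef(x, np.cos(theta))[0, 1]
        rxs = np.corrcoef(x, np.sin(theta))[0, 1]
        rcs = np.corrcoef(np.cos(theta), np.sin(theta))[0, 1]
        return np.sqrt((rxc**2 + rxs**2 - 2 * rxc * rxs * rcs) / (1 - rcs**2))

    # Hypothetical hourly wind directions (degrees) and air temperatures (deg C).
    rng = np.random.default_rng(1)
    wd = rng.uniform(0, 360, 240)
    temp = 25 + 3 * np.cos(np.radians(wd - 180)) + rng.normal(0, 1, 240)

    r = circular_linear_corr(wd, temp)
    # Under independence, n * r^2 is approximately chi-squared with 2 df.
    print(r, len(wd) * r**2)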
NASA Technical Reports Server (NTRS)
Seasholtz, R. G.
1977-01-01
A laser Doppler velocimeter (LDV) built for use in the Lewis Research Center's turbine stator cascade facilities is described. The signal processing and self-contained data processing are based on a computing counter. A procedure is given for mode matching the laser to the probe volume. An analysis is presented of biasing errors that were observed in turbulent flow when the mean flow was not normal to the fringes.
1992-04-01
contractor's existing data collection, analysis and corrective action system shall be utilized, with modification only as necessary to meet the...either from test or from analysis of field data. The procedures of MIL-STD-756B assume that the reliability of a...to generate sufficient data to report a statistically valid reliability figure for a class of software. Casual data gathering accumulates data more
Data Treatment for LC-MS Untargeted Analysis.
Riccadonna, Samantha; Franceschi, Pietro
2018-01-01
Liquid chromatography-mass spectrometry (LC-MS) untargeted experiments require complex chemometrics strategies to extract information from the experimental data. Here we discuss "data preprocessing", the set of procedures performed on the raw data to produce a data matrix which will be the starting point for the subsequent statistical analysis. Data preprocessing is a crucial step on the path to knowledge extraction, which should be carefully controlled and optimized in order to maximize the output of any untargeted metabolomics investigation.
Zhu, Yuerong; Zhu, Yuelin; Xu, Wei
2008-01-01
Background Though microarray experiments are very popular in life science research, managing and analyzing microarray data are still challenging tasks for many biologists. Most microarray programs require users to have sophisticated knowledge of mathematics, statistics and computer skills for usage. With accumulating microarray data deposited in public databases, easy-to-use programs to re-analyze previously published microarray data are in high demand. Results EzArray is a web-based Affymetrix expression array data management and analysis system for researchers who need to organize microarray data efficiently and get data analyzed instantly. EzArray organizes microarray data into projects that can be analyzed online with predefined or custom procedures. EzArray performs data preprocessing and detection of differentially expressed genes with statistical methods. All analysis procedures are optimized and highly automated so that even novice users with limited pre-knowledge of microarray data analysis can complete initial analysis quickly. Since all input files, analysis parameters, and executed scripts can be downloaded, EzArray provides maximum reproducibility for each analysis. In addition, EzArray integrates with Gene Expression Omnibus (GEO) and allows instantaneous re-analysis of published array data. Conclusion EzArray is a novel Affymetrix expression array data analysis and sharing system. EzArray provides easy-to-use tools for re-analyzing published microarray data and will help both novice and experienced users perform initial analysis of their microarray data from the location of data storage. We believe EzArray will be a useful system for facilities with microarray services and laboratories with multiple members involved in microarray data analysis. EzArray is freely available from . PMID:18218103
Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I
2015-11-03
We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for analysis using continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Determination of differential expression of each protein is based on a standardized Z-statistic derived from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework and computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method, originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
Planetarium instructional efficacy: A research synthesis
NASA Astrophysics Data System (ADS)
Brazell, Bruce D.
The purpose of the current study was to explore the instructional effectiveness of the planetarium in astronomy education using meta-analysis. A review of the literature revealed 46 studies related to planetarium efficacy. However, only 19 of the studies satisfied selection criteria for inclusion in the meta-analysis. Selected studies were then subjected to coding procedures, which extracted information such as subject characteristics, experimental design, and outcome measures. From these data, 24 effect sizes were calculated in the area of student achievement and five effect sizes were determined in the area of student attitudes using reported statistical information. Mean effect sizes were calculated for both the achievement and the attitude distributions. Additionally, each effect size distribution was subjected to homogeneity analysis. The attitude distribution was found to be homogeneous with a mean effect size of -0.09, which was not significant, p = .2535. The achievement distribution was found to be heterogeneous with a statistically significant mean effect size of +0.28, p < .05. Since the achievement distribution was heterogeneous, the analog to the ANOVA procedure was employed to explore variability in this distribution in terms of the coded variables. The analog to the ANOVA procedure revealed that the variability introduced by the coded variables did not fully explain the variability in the achievement distribution beyond subject-level sampling error under a fixed effects model. Therefore, a random effects model analysis was performed which resulted in a mean effect size of +0.18, which was not significant, p = .2363. However, a large random effect variance component was determined indicating that the differences between studies were systematic and yet to be revealed. The findings of this meta-analysis showed that the planetarium has been an effective instructional tool in astronomy education in terms of student achievement. However, the meta-analysis revealed that the planetarium has not been a very effective tool for improving student attitudes towards astronomy.
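The sketch below (Python, hypothetical effect sizes and sampling variances) mirrors the workflow described in this synthesis: inverse-variance pooling of effect sizes, a Q homogeneity test, and a DerSimonian-Laird random-effects re-estimate when the distribution is heterogeneous. The numbers are illustrative only and are not taken from the meta-analysis.

    import numpy as np
    from scipy.stats import chi2

    # Hypothetical study effect sizes (standardized mean differences) and their variances.
    d = np.array([0.41, 0.10, 0.55, -0.05, 0.32, 0.27])
    v = np.array([0.04, 0.02, 0.06, 0.03, 0.05, 0.02])

    w = 1 / v                                    # fixed-effect (inverse-variance) weights
    d_fixed = np.sum(w * d) / np.sum(w)
    Q = np.sum(w * (d - d_fixed) ** 2)           # homogeneity statistic
    df = len(d) - 1
    p_Q = chi2.sf(Q, df)

    # DerSimonian-Laird between-study variance and random-effects mean.
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (v + tau2)
    d_random = np.sum(w_re * d) / np.sum(w_re)

    print(f"Q = {Q:.2f} (p = {p_Q:.3f}), tau^2 = {tau2:.3f}")
    print(f"fixed-effect mean = {d_fixed:.2f}, random-effects mean = {d_random:.2f}")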
Ogawa, Yasushi; Fawaz, Farah; Reyes, Candice; Lai, Julie; Pungor, Erno
2007-01-01
Parameter settings of a parallel line analysis procedure were defined by applying statistical analysis procedures to the absorbance data from a cell-based potency bioassay for a recombinant adenovirus, Adenovirus 5 Fibroblast Growth Factor-4 (Ad5FGF-4). The parallel line analysis was performed with commercially available software, PLA 1.2. The software performs the Dixon outlier test on replicates of the absorbance data, performs linear regression analysis to define the linear region of the absorbance data, and tests parallelism between the linear regions of standard and sample. The width of the fiducial limit, expressed as a percent of the measured potency, was developed as a criterion for rejection of assay data and to significantly improve the reliability of the assay results. With the linear range-finding criteria of the software set to a minimum of 5 consecutive dilutions and best statistical outcome, and in combination with the fiducial limit width acceptance criterion of <135%, 13% of the assay results were rejected. With these criteria applied, the assay was found to be linear over the range of 0.25 to 4 relative potency units, defined as the potency of the sample normalized to the potency of the Ad5FGF-4 standard containing 6 × 10^6 adenovirus particles/mL. The overall precision of the assay was estimated to be 52%. Without the application of the fiducial limit width criterion, the assay results were not linear over the range, and an overall precision of 76% was calculated from the data. An absolute unit of potency for the assay was defined by using the parallel line analysis procedure as the amount of Ad5FGF-4 that results in an absorbance value that is 121% of the average absorbance readings of the wells containing cells not infected with the adenovirus.
Adaptive graph-based multiple testing procedures
Klinglmueller, Florian; Posch, Martin; Koenig, Franz
2016-01-01
Multiple testing procedures defined by directed, weighted graphs have recently been proposed as an intuitive visual tool for constructing multiple testing strategies that reflect the often complex contextual relations between hypotheses in clinical trials. Many well-known sequentially rejective tests, such as (parallel) gatekeeping tests or hierarchical testing procedures, are special cases of the graph-based tests. We generalize these graph-based multiple testing procedures to adaptive trial designs with an interim analysis. These designs permit mid-trial design modifications based on unblinded interim data as well as external information, while providing strong familywise error rate control. To maintain the familywise error rate, it is not required to prespecify the adaptation rule in detail. Because the adaptive test does not require knowledge of the multivariate distribution of test statistics, it is applicable in a wide range of scenarios including trials with multiple treatment comparisons, endpoints or subgroups, or combinations thereof. Examples of adaptations are dropping of treatment arms, selection of subpopulations, and sample size reassessment. If, in the interim analysis, it is decided to continue the trial as planned, the adaptive test reduces to the originally planned multiple testing procedure. An adjusted test needs to be applied only if adaptations are actually implemented. The procedure is illustrated with a case study and its operating characteristics are investigated by simulations. PMID:25319733
Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science.
Veldkamp, Coosje L S; Nuijten, Michèle B; Dominguez-Alvarez, Linda; van Assen, Marcel A L M; Wicherts, Jelte M
2014-01-01
Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this 'co-piloting' currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors.
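The automated error detection described here rests on a simple recomputation: given a reported test statistic and its degrees of freedom, the p-value can be recalculated and compared with the reported one. A minimal Python sketch of that idea follows; the rounding rule and example values are hypothetical, and published tools such as statcheck implement more careful reporting conventions.

    from scipy.stats import t as t_dist

    def check_t_report(t_value, df, reported_p, alpha=0.05):
        """Recompute the two-sided p-value from a reported t statistic and df,
        flagging inconsistencies and those that flip the significance decision."""
        recomputed = 2 * t_dist.sf(abs(t_value), df)
        inconsistent = round(recomputed, 3) != round(reported_p, 3)   # crude rounding rule
        decision_error = inconsistent and ((recomputed < alpha) != (reported_p < alpha))
        return recomputed, inconsistent, decision_error

    # Hypothetical reported result: t(28) = 2.20, p = .04
    print(check_t_report(2.20, 28, 0.04))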
Austin, Peter C
2010-04-22
Multilevel logistic regression models are increasingly being used to analyze clustered data in medical, public health, epidemiological, and educational research. Procedures for estimating the parameters of such models are available in many statistical software packages. There is currently little evidence on the minimum number of clusters necessary to reliably fit multilevel regression models. We conducted a Monte Carlo study to compare the performance of different statistical software procedures for estimating multilevel logistic regression models when the number of clusters was low. We examined procedures available in BUGS, HLM, R, SAS, and Stata. We found that there were qualitative differences in the performance of different software procedures for estimating multilevel logistic models when the number of clusters was low. Among the likelihood-based procedures, estimation methods based on adaptive Gauss-Hermite approximations to the likelihood (glmer in R and xtlogit in Stata) or adaptive Gaussian quadrature (Proc NLMIXED in SAS) tended to have superior performance for estimating variance components when the number of clusters was small, compared to software procedures based on penalized quasi-likelihood. However, only Bayesian estimation with BUGS allowed for accurate estimation of variance components when there were fewer than 10 clusters. For all statistical software procedures, estimation of variance components tended to be poor when there were only five subjects per cluster, regardless of the number of clusters.
Applying a statistical PTB detection procedure to complement the gold standard.
Noor, Norliza Mohd; Yunus, Ashari; Bakar, S A R Abu; Hussin, Amran; Rijal, Omar Mohd
2011-04-01
This paper investigates a novel statistical discrimination procedure to detect pulmonary tuberculosis (PTB) when the gold standard requirement is taken into consideration. Archived data were used to establish two groups of patients, a control group and a test group. The control group was used to develop the statistical discrimination procedure, using four vectors of wavelet coefficients as feature vectors, for the detection of PTB, lung cancer (LC), and normal lung (NL). This discrimination procedure was investigated using the test group, where the number of sputum-positive and sputum-negative cases that were correctly classified as PTB cases was noted. The proposed statistical discrimination method is able to detect PTB and LC patients with a high true-positive fraction. The method is also able to detect PTB patients that are sputum negative and may therefore be used as a complement to the gold standard. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Pizzini, Edward L.; Treagust, David F.; Cody, John
The purpose of this study was to determine whether or not formative evaluation could facilitate goal attainment in a biochemistry course and produce desired learning outcomes consistently by altering course materials and/or instruction. Formative evaluation procedures included the administration of the Inorganic-Organic-Biological Chemistry Test Form 1974 and the Methods and Procedures of Science test to course participants over three consecutive years. A one group pretest-post-test design was used. The statistical analysis involved the use of the Wilcoxon matched-pairs signed-ranks test. The study involved 64 participants. The findings indicate that the use of formative evaluation can be effective in producing desired learning outcomes to facilitate goal attainment.
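For readers who want to reproduce this kind of pretest-posttest comparison, a minimal sketch with SciPy's Wilcoxon matched-pairs signed-ranks test is shown below; the scores are hypothetical and not from the study.

    import numpy as np
    from scipy.stats import wilcoxon

    # Hypothetical pretest and post-test scores for the same participants.
    pre = np.array([12, 15, 9, 20, 14, 11, 16, 13, 18, 10])
    post = np.array([15, 18, 10, 22, 17, 12, 19, 13, 21, 14])

    # Matched-pairs signed-ranks test (pairs with zero difference are dropped by default).
    stat, p = wilcoxon(pre, post)
    print(f"W = {stat}, p = {p:.4f}")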
The role of simulation in the design of a neural network chip
NASA Technical Reports Server (NTRS)
Desai, Utpal; Roppel, Thaddeus A.; Padgett, Mary L.
1993-01-01
An iterative, simulation-based design procedure for a neural network chip is introduced. For this design procedure, the goal is to produce a chip layout for a neural network in which the weights are determined by transistor gate width-to-length ratios. In a given iteration, the current layout is simulated using the circuit simulator SPICE, and layout adjustments are made based on conventional gradient-descent methods. After the iteration converges, the chip is fabricated. Monte Carlo analysis is used to predict the effect of statistical fabrication process variations on the overall performance of the neural network chip.
Random forests for classification in ecology
Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.
2007-01-01
Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. © 2007 by the Ecological Society of America.
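A minimal scikit-learn sketch of the kind of analysis described (cross-validated classification accuracy plus the RF variable-importance measure) is given below, using synthetic presence/absence data in place of the ecological datasets.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for a species presence/absence dataset with environmental predictors.
    X, y = make_classification(n_samples=500, n_features=8, n_informative=4, random_state=0)

    rf = RandomForestClassifier(n_estimators=500, random_state=0)
    acc = cross_val_score(rf, X, y, cv=10)        # 10-fold cross-validated accuracy
    rf.fit(X, y)
    importance = rf.feature_importances_          # RF variable-importance measure

    print(f"mean CV accuracy = {acc.mean():.3f}")
    print("variable importances:", np.round(importance, 3))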
Multiple Phenotype Association Tests Using Summary Statistics in Genome-Wide Association Studies
Liu, Zhonghua; Lin, Xihong
2017-01-01
We study in this paper jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. PMID:28653391
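The authors' procedure jointly tests a common mean and a variance component; as a simpler illustration of working from summary statistics alone, the sketch below computes a standard Wald-type omnibus chi-squared test that combines per-phenotype z-scores while accounting for an estimated between-phenotype correlation matrix. This is not the authors' exact test, and all numbers are hypothetical.

    import numpy as np
    from scipy.stats import chi2

    def omnibus_test(z, R):
        """Wald-type joint test of K phenotype associations from summary z-scores,
        accounting for between-phenotype correlation R (estimated from null SNPs)."""
        z = np.asarray(z, dtype=float)
        stat = z @ np.linalg.solve(R, z)          # z' R^{-1} z ~ chi2_K under H0
        return stat, chi2.sf(stat, len(z))

    # Hypothetical z-scores for three lipid traits at one variant and their correlation matrix.
    z = [2.1, -1.8, 2.6]
    R = np.array([[1.0, 0.4, 0.3],
                  [0.4, 1.0, 0.5],
                  [0.3, 0.5, 1.0]])
    print(omnibus_test(z, R))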
NASA Astrophysics Data System (ADS)
Clerici, Aldo; Perego, Susanna; Tellini, Claudio; Vescovi, Paolo
2006-08-01
Among the many GIS-based multivariate statistical methods for landslide susceptibility zonation, the so-called “Conditional Analysis method” holds a special place for its conceptual simplicity. In fact, in this method landslide susceptibility is simply expressed as landslide density in correspondence with different combinations of instability-factor classes. To overcome the operational complexity connected to the long, tedious and error-prone sequence of commands required by the procedure, a shell script mainly based on the GRASS GIS was created. The script, starting from a landslide inventory map and a number of factor maps, automatically carries out the whole procedure, resulting in the construction of a map with five landslide susceptibility classes. A validation procedure allows the reliability of the resulting model to be assessed, while the simple mean deviation of the density values across the factor-class combinations helps to evaluate the goodness of the landslide density distribution. The procedure was applied to a relatively small basin (167 km2) in the Italian Northern Apennines considering three landslide types, namely rotational slides, flows and complex landslides, for a total of 1,137 landslides, and five factors, namely lithology, slope angle and aspect, elevation and slope/bedding relations. The analysis of the resulting 31 different models obtained by combining the five factors confirms the role of lithology, slope angle and slope/bedding relations in influencing slope stability.
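Outside GRASS, the core of the Conditional Analysis method can be expressed in a few lines of pandas: compute landslide density for every combination of instability-factor classes and bin the densities into five susceptibility classes. The sketch below uses synthetic cell data and hypothetical column names.

    import numpy as np
    import pandas as pd

    # Hypothetical raster cells exported to a table: one row per cell, with
    # instability-factor classes and a flag marking cells inside mapped landslides.
    rng = np.random.default_rng(0)
    n_cells = 10000
    cells = pd.DataFrame({
        "lithology":    rng.integers(1, 6, n_cells),
        "slope_class":  rng.integers(1, 5, n_cells),
        "aspect_class": rng.integers(1, 9, n_cells),
        "landslide":    rng.random(n_cells) < 0.08,
    })

    # Conditional analysis: landslide density for every factor-class combination.
    density = (cells.groupby(["lithology", "slope_class", "aspect_class"])["landslide"]
                    .mean()
                    .rename("density"))

    # Five susceptibility classes from the ranked densities (quintiles of the combinations).
    susceptibility = pd.qcut(density.rank(method="first"), 5, labels=[1, 2, 3, 4, 5])
    print(density.describe())
    print(susceptibility.value_counts().sort_index())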
The multiple imputation method: a case study involving secondary data analysis.
Walani, Salimah R; Cleland, Charles M
2015-05-01
To illustrate with the example of a secondary data analysis study the use of the multiple imputation method to replace missing data. Most large public datasets have missing data, which need to be handled by researchers conducting secondary data analysis studies. Multiple imputation is a technique widely used to replace missing values while preserving the sample size and sampling variability of the data. The 2004 National Sample Survey of Registered Nurses. The authors created a model to impute missing values using the chained equation method. They used imputation diagnostics procedures and conducted regression analysis of imputed data to determine the differences between the log hourly wages of internationally educated and US-educated registered nurses. The authors used multiple imputation procedures to replace missing values in a large dataset with 29,059 observations. Five multiple imputed datasets were created. Imputation diagnostics using time series and density plots showed that imputation was successful. The authors also present an example of the use of multiple imputed datasets to conduct regression analysis to answer a substantive research question. Multiple imputation is a powerful technique for imputing missing values in large datasets while preserving the sample size and variance of the data. Even though the chained equation method involves complex statistical computations, recent innovations in software and computation have made it possible for researchers to conduct this technique on large datasets. The authors recommend nurse researchers use multiple imputation methods for handling missing data to improve the statistical power and external validity of their studies.
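A minimal sketch of chained-equation multiple imputation followed by pooled regression, using statsmodels' MICE implementation rather than the survey-specific setup described in the article; the dataset, variable names, and missingness pattern are hypothetical.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.imputation import mice

    # Hypothetical wage data with missing values in one predictor (names are illustrative).
    rng = np.random.default_rng(0)
    n = 1000
    df = pd.DataFrame({
        "log_wage": rng.normal(3.2, 0.3, n),
        "intl_educated": rng.integers(0, 2, n).astype(float),
        "years_experience": rng.normal(12.0, 6.0, n),
    })
    df.loc[rng.random(n) < 0.15, "years_experience"] = np.nan   # inject missingness

    imp = mice.MICEData(df)                                      # chained-equation imputation engine
    model = mice.MICE("log_wage ~ intl_educated + years_experience", sm.OLS, imp)
    results = model.fit(n_burnin=10, n_imputations=5)            # 5 imputed datasets, pooled by Rubin's rules
    print(results.summary())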
Validation of a heteroscedastic hazards regression model.
Wu, Hong-Dar Isaac; Hsieh, Fushing; Chen, Chen-Hsin
2002-03-01
A Cox-type regression model accommodating heteroscedasticity, with a power factor of the baseline cumulative hazard, is investigated for analyzing data with crossing hazards behavior. Since the approach of partial likelihood cannot eliminate the baseline hazard, an overidentified estimating equation (OEE) approach is introduced in the estimation procedure. Its by-product, a model-checking statistic, is presented to test for the overall adequacy of the heteroscedastic model. Further, under the heteroscedastic model setting, we propose two statistics to test the proportional hazards assumption. Implementation of this model is illustrated in a data analysis of a cancer clinical trial.
Toward a perceptual image quality assessment of color quantized images
NASA Astrophysics Data System (ADS)
Frackiewicz, Mariusz; Palus, Henryk
2018-04-01
Color image quantization is an important operation in the field of color image processing. In this paper, we consider new perceptual image quality metrics for the assessment of quantized images. These types of metrics, e.g., DSCSI, MDSIs, MDSIm and HPSI, achieve the highest correlation coefficients with mean opinion scores (MOS) in tests on six publicly available image databases. Research was limited to images distorted by two types of compression: JPG and JPG2K. Statistical analysis of the correlation coefficients based on the Friedman test and post-hoc procedures showed that the differences between the four new perceptual metrics are not statistically significant.
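A small Python sketch of the statistical comparison described (a Friedman test across metrics evaluated on several databases, followed by a post-hoc step) is shown below. The pairwise Wilcoxon tests with Holm correction are one common post-hoc choice, not necessarily the exact procedure used in the paper, and the correlation values are hypothetical.

    from itertools import combinations
    from scipy.stats import friedmanchisquare, wilcoxon
    from statsmodels.stats.multitest import multipletests

    # Hypothetical correlation-with-MOS scores of four metrics on six image databases.
    scores = {
        "DSCSI": [0.91, 0.88, 0.93, 0.90, 0.87, 0.92],
        "MDSIs": [0.90, 0.89, 0.92, 0.91, 0.86, 0.93],
        "MDSIm": [0.89, 0.87, 0.91, 0.90, 0.85, 0.91],
        "HPSI":  [0.92, 0.90, 0.94, 0.92, 0.88, 0.94],
    }

    stat, p = friedmanchisquare(*scores.values())
    print(f"Friedman chi2 = {stat:.2f}, p = {p:.3f}")

    # Post-hoc: pairwise Wilcoxon signed-rank tests with Holm correction.
    pairs = list(combinations(scores, 2))
    raw_p = [wilcoxon(scores[a], scores[b]).pvalue for a, b in pairs]
    reject, adj_p, _, _ = multipletests(raw_p, method="holm")
    for (a, b), pr, rej in zip(pairs, adj_p, reject):
        print(f"{a} vs {b}: adjusted p = {pr:.3f}, reject H0 = {rej}")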
Analysis of repeated measurement data in the clinical trials
Singh, Vineeta; Rana, Rakesh Kumar; Singhal, Richa
2013-01-01
Statistics is an integral part of clinical trials. Elements of statistics span clinical trial design, data monitoring, analyses and reporting. A solid understanding of statistical concepts by clinicians improves the comprehension and the resulting quality of clinical trials. In biomedical research, researchers frequently use the t-test and ANOVA to compare means between the groups of interest irrespective of the nature of the data. In clinical trials, data are recorded on the patients at more than two time points. In such a situation, using standard ANOVA procedures is not appropriate, as they do not consider dependencies between observations within subjects in the analysis. To deal with such types of study data, repeated-measures ANOVA should be used. In this article, the application of one-way repeated-measures ANOVA has been demonstrated using the software SPSS (Statistical Package for Social Sciences) Version 15.0 on data collected at four time points (0 day, 15th day, 30th day, and 45th day) of a multicentre clinical trial conducted on Pandu Roga (~Iron Deficiency Anemia) with an Ayurvedic formulation, Dhatrilauha. PMID:23930038
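The same one-way repeated-measures ANOVA can be run outside SPSS; the sketch below uses statsmodels' AnovaRM on hypothetical long-format data with a within-subject factor of four visits.

    import numpy as np
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    # Hypothetical long-format data: hemoglobin for 20 patients at four visits.
    rng = np.random.default_rng(0)
    records = []
    for pid in range(20):
        base = rng.normal(9.5, 0.8)
        for j, day in enumerate([0, 15, 30, 45]):
            records.append({"patient": pid, "day": day,
                            "hb": base + 0.4 * j + rng.normal(0, 0.3)})
    df = pd.DataFrame(records)

    # One-way repeated-measures ANOVA with within-subject factor 'day'.
    res = AnovaRM(df, depvar="hb", subject="patient", within=["day"]).fit()
    print(res)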
Development of QC Procedures for Ocean Data Obtained by National Research Projects of Korea
NASA Astrophysics Data System (ADS)
Kim, S. D.; Park, H. M.
2017-12-01
To establish a data management system for ocean data obtained by national research projects of the Ministry of Oceans and Fisheries of Korea, KIOST conducted standardization and development of QC procedures. After reviewing and analyzing the existing international and domestic ocean-data standards and QC procedures, draft versions of the standards and QC procedures were prepared. The proposed standards and QC procedures were reviewed and revised several times by experts in the field of oceanography and academic societies. A technical report was prepared on the standards for 25 data items and 12 QC procedures covering physical, chemical, biological and geological data items. The QC procedure for temperature and salinity data was set up by referring to the manuals published by GTSPP, ARGO and IOOS QARTOD. It consists of 16 QC tests applicable to vertical profile data and time series data obtained in real-time mode and delayed mode. Three regional range tests to inspect annual, seasonal and monthly variations were included in the procedure. Three programs were developed to calculate and provide upper and lower limits of temperature and salinity at depths from 0 to 1550 m. TS data from the World Ocean Database, ARGO, GTSPP and in-house data of KIOST were analysed statistically to calculate regional limits for the Northwest Pacific area. Based on statistical analysis, the programs calculate regional ranges using the mean and standard deviation on three kinds of grid systems (3° grid, 1° grid and 0.5° grid) and provide recommendations. The QC procedures for 12 data items were set up during the 1st phase of the national program for data management (2012-2015) and are being applied to national research projects at the 2nd phase (2016-2019). The QC procedures will be revised by reviewing the results of QC application when the 2nd phase of the data management program is completed.
NASA Astrophysics Data System (ADS)
Chakraborthy, Parthasarathi; Chattopadhyay, Surajit
2013-02-01
The endeavor of the present paper is to investigate the statistical properties of the total ozone concentration time series over Arosa, Switzerland (9.68°E, 46.78°N). For this purpose, different statistical data analysis procedures have been employed for analyzing the mean monthly total ozone concentration data collected over a period of 40 years (1932-1971) at the above location. Based on the computations on the available data set, the study reports different degrees of variation in different months. The month of July is reported as the month of lowest variability. April and May are found to be the most correlated months with respect to total ozone concentration.
Osland, Emma; Yunus, Rossita Mohamad; Khan, Shahjahan; Alodat, Tareq; Memon, Breda; Memon, Muhammed Ashraf
2016-10-01
Laparoscopic Roux-en-Y gastric bypass (LRYGB) and laparoscopic vertical sleeve gastrectomy (LVSG) have been proposed as cost-effective strategies to manage obesity-related chronic disease. The aim of this meta-analysis and systematic review was to compare the early postoperative complication rate (i.e., within 30 days) reported from randomized controlled trials (RCTs) comparing these two procedures. RCTs comparing the early complication rates following LVSG and LRYGB between 2000 and 2015 were selected from PubMed, Medline, Embase, Science Citation Index, Current Contents, and the Cochrane database. The outcome variables analyzed included 30-day mortality, major and minor complications and interventions required for their management, length of hospital stay, readmission rates, operating time, and conversions from laparoscopic to open procedures. Six RCTs involving a total of 695 patients (LVSG n = 347, LRYGB n = 348) reported on early major complications. A statistically significant reduction in the relative odds of early major complications favoring the LVSG procedure was noted (p = 0.05). Five RCTs representing 633 patients (LVSG n = 317, LRYGB n = 316) reported early minor complications. A non-statistically significant reduction of 29% in the relative odds favoring the LVSG procedure was observed for early minor complications (p = 0.4). However, other outcomes directly related to complications, which included reoperation rates, readmission rate, and 30-day mortality rate, showed comparable effect sizes for both surgical procedures. This meta-analysis and systematic review of RCTs suggests that fewer early major and minor complications are associated with LVSG compared with the LRYGB procedure. However, this does not translate into a higher readmission rate, reoperation rate, or 30-day mortality for either procedure.
Statistical analysis of regulatory ecotoxicity tests.
Isnard, P; Flammarion, P; Roman, G; Babut, M; Bastien, P; Bintein, S; Esserméant, L; Férard, J F; Gallotti-Schmitt, S; Saouter, E; Saroli, M; Thiébaud, H; Tomassone, R; Vindimian, E
2001-11-01
ANOVA-type data analysis, i.e., determination of lowest-observed-effect concentrations (LOECs) and no-observed-effect concentrations (NOECs), has been widely used for statistical analysis of chronic ecotoxicity data. However, it is more and more criticised for several reasons, among which the most important is probably the fact that the NOEC depends on the choice of test concentrations and number of replications and rewards poor experiments, i.e., high variability, with high NOEC values. Thus, a recent OECD workshop concluded that the use of the NOEC should be phased out and that a regression-based estimation procedure should be used. Following this workshop, a working group was established at the French level between government, academia and industry representatives. Twenty-seven sets of chronic data (algae, daphnia, fish) were collected and analysed by ANOVA and regression procedures. Several regression models were compared and relations between NOECs and ECx, for different values of x, were established in order to find an alternative summary parameter to the NOEC. Biological arguments are scarce to help in defining a negligible level of effect x for the ECx. With regard to their use in risk assessment procedures, a convenient methodology would be to choose x so that ECx values are on average similar to the present NOEC. This would lead to no major change in the risk assessment procedure. However, experimental data show that the ECx depends on the regression model and that its accuracy decreases in the low-effect zone. This disadvantage could probably be reduced by adapting existing experimental protocols, but it could mean more experimental effort and higher cost. ECx values (derived with existing test guidelines, e.g., regarding the number of replicates) whose lower confidence bounds are on average similar to the present NOEC would improve this approach by a priori encouraging more precise experiments. However, narrow confidence intervals are not only linked to good experimental practices, but also depend on the distance between the best model fit and the experimental data. In any case, these approaches still use the NOEC as a reference, although this reference is statistically not correct. In contrast, the EC50 is the most precise value to estimate on a concentration-response curve, but it is clearly different from the NOEC and its use would require a modification of existing assessment factors.
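The regression-based alternative discussed here amounts to fitting a concentration-response model and reading off ECx values. A sketch with a three-parameter log-logistic model fitted by least squares is shown below; the concentrations and responses are hypothetical, and a dedicated dose-response package would normally also provide confidence intervals.

    import numpy as np
    from scipy.optimize import curve_fit

    def log_logistic(c, top, ec50, slope):
        """Three-parameter log-logistic model (response falls from 'top' towards 0)."""
        return top / (1.0 + (c / ec50) ** slope)

    # Hypothetical growth-rate responses at a control plus five test concentrations (mg/L).
    conc = np.array([0.0, 0.3, 1.0, 3.0, 10.0, 30.0])
    resp = np.array([1.02, 0.98, 0.90, 0.71, 0.35, 0.12])

    # Replace the control 0 by a tiny value so the model is defined; fit by least squares.
    c_fit = np.where(conc == 0, 1e-6, conc)
    params, _ = curve_fit(log_logistic, c_fit, resp, p0=[1.0, 3.0, 1.5])
    top, ec50, slope = params

    def ecx(x):
        """Concentration giving an x% reduction from the fitted top response."""
        return ec50 * (x / (100.0 - x)) ** (1.0 / slope)

    print(f"EC50 = {ec50:.2f} mg/L, EC10 = {ecx(10):.2f} mg/L, EC20 = {ecx(20):.2f} mg/L")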
Duc, Anh Nguyen; Wolbers, Marcel
2017-02-10
Composite endpoints are widely used as primary endpoints of randomized controlled trials across clinical disciplines. A common critique of the conventional analysis of composite endpoints is that all disease events are weighted equally, whereas their clinical relevance may differ substantially. We address this by introducing a framework for the weighted analysis of composite endpoints and interpretable test statistics, which are applicable to both binary and time-to-event data. To cope with the difficulty of selecting an exact set of weights, we propose a method for constructing simultaneous confidence intervals and tests that asymptotically preserve the family-wise type I error in the strong sense across families of weights satisfying flexible inequality or order constraints, based on the theory of chi-bar-squared distributions. We show that the method achieves the nominal simultaneous coverage rate with substantial efficiency gains over Scheffé's procedure in a simulation study and apply it to trials in cardiovascular disease and enteric fever. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
Statistical analysis on experimental calibration data for flowmeters in pressure pipes
NASA Astrophysics Data System (ADS)
Lazzarin, Alessandro; Orsi, Enrico; Sanfilippo, Umberto
2017-08-01
This paper presents a statistical analysis of experimental calibration data for flowmeters (i.e., electromagnetic, ultrasonic, turbine flowmeters) in pressure pipes. The experimental calibration data set consists of the whole archive of the calibration tests carried out on 246 flowmeters from January 2001 to October 2015 at Settore Portate of Laboratorio di Idraulica “G. Fantoli” of Politecnico di Milano, which is accredited as LAT 104 for a flow range between 3 l/s and 80 l/s, with a certified Calibration and Measurement Capability (CMC) - formerly known as Best Measurement Capability (BMC) - equal to 0.2%. The data set is split into three subsets, consisting of 94 electromagnetic, 83 ultrasonic and 69 turbine flowmeters, respectively; each subset is analysed separately from the others, and a final comparison is then carried out. In particular, the main focus of the statistical analysis is the correction C, that is, the difference between the flow rate Q measured by the calibration facility (through the accredited procedures and the certified reference specimen) and the flow rate QM contemporaneously recorded by the flowmeter under calibration, expressed as a percentage of the same QM.
Defining the ecological hydrology of Taiwan Rivers using multivariate statistical methods
NASA Astrophysics Data System (ADS)
Chang, Fi-John; Wu, Tzu-Ching; Tsai, Wen-Ping; Herricks, Edwin E.
2009-09-01
The identification and verification of ecohydrologic flow indicators has found new support as the importance of ecological flow regimes is recognized in modern water resources management, particularly in river restoration and reservoir management. An ecohydrologic indicator system reflecting the unique characteristics of Taiwan's water resources and hydrology has been developed, the Taiwan ecohydrological indicator system (TEIS). A major challenge for the water resources community is using the TEIS to provide environmental flow rules that improve existing water resources management. This paper examines data from the extensive network of flow monitoring stations in Taiwan using TEIS statistics to define and refine environmental flow options in Taiwan. Multivariate statistical methods were used to examine TEIS statistics for 102 stations representing the geographic and land use diversity of Taiwan. Pearson correlation coefficients showed high multicollinearity between the TEIS statistics. Watersheds were separated into upper- and lower-watershed locations. An analysis of variance indicated significant differences between upstream, more natural, and downstream, more developed, locations in the same basin, with hydrologic indicator redundancy in flow change and magnitude statistics. Issues of multicollinearity were examined using a principal component analysis (PCA), with the first three components related to general flow and high/low flow statistics, frequency and time statistics, and quantity statistics. These principal components explain about 85% of the total variation. A major conclusion is that managers must be aware of differences among basins, as well as differences within basins, that will require careful selection of management procedures to achieve needed flow regimes.
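A compact sketch of the PCA step on standardized indicator statistics (scikit-learn, with a synthetic station-by-indicator matrix standing in for the TEIS data) follows; it simply reports how much of the total variation the first three components capture.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Hypothetical matrix: 102 gauging stations x 12 correlated hydrologic indicator statistics.
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(102, 3))               # three underlying flow dimensions
    loadings = rng.normal(size=(3, 12))
    X = latent @ loadings + rng.normal(scale=0.4, size=(102, 12))

    Z = StandardScaler().fit_transform(X)            # PCA on standardized indicators
    pca = PCA().fit(Z)
    explained = pca.explained_variance_ratio_
    print(f"first three components explain {explained[:3].sum():.1%} of the total variation")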
A comparison of vowel normalization procedures for language variation research
NASA Astrophysics Data System (ADS)
Adank, Patti; Smits, Roel; van Hout, Roeland
2004-11-01
An evaluation of vowel normalization procedures for the purpose of studying language variation is presented. The procedures were compared on how effectively they (a) preserve phonemic information, (b) preserve information about the talker's regional background (or sociolinguistic information), and (c) minimize anatomical/physiological variation in acoustic representations of vowels. Recordings were made for 80 female talkers and 80 male talkers of Dutch. These talkers were stratified according to their gender and regional background. The normalization procedures were applied to measurements of the fundamental frequency and the first three formant frequencies for a large set of vowel tokens. The normalization procedures were evaluated through statistical pattern analysis. The results show that normalization procedures that use information across multiple vowels ("vowel-extrinsic" information) to normalize a single vowel token performed better than those that include only information contained in the vowel token itself ("vowel-intrinsic" information). Furthermore, the results show that normalization procedures that operate on individual formants performed better than those that use information across multiple formants (e.g., "formant-extrinsic" F2-F1).
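As a concrete example of a vowel-extrinsic, formant-intrinsic procedure of the kind that performed well here, the sketch below applies Lobanov-style z-score normalization (standardizing each formant within talker); the tokens and column names are hypothetical.

    import pandas as pd

    # Hypothetical formant measurements: one row per vowel token.
    tokens = pd.DataFrame({
        "talker": ["f01", "f01", "f01", "m01", "m01", "m01"],
        "vowel":  ["i",   "a",   "u",   "i",   "a",   "u"],
        "F1":     [310.0, 850.0, 360.0, 270.0, 730.0, 300.0],
        "F2":     [2790.0, 1220.0, 940.0, 2290.0, 1090.0, 870.0],
    })

    # Lobanov normalization: z-score each formant within talker (vowel-extrinsic,
    # formant-intrinsic), which removes much anatomical/physiological variation.
    for f in ["F1", "F2"]:
        grouped = tokens.groupby("talker")[f]
        tokens[f + "_z"] = (tokens[f] - grouped.transform("mean")) / grouped.transform("std")

    print(tokens)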
Silvestri, Enzo; Barile, Antonio; Albano, Domenico; Messina, Carmelo; Orlandi, Davide; Corazza, Angelo; Zugaro, Luigi; Masciocchi, Carlo; Sconfienza, Luca Maria
2018-04-01
To perform an online survey among all members of the Italian College of Musculoskeletal Radiology to understand how therapeutic musculoskeletal procedures are performed in daily practice in Italy. We administered an online survey to all 2405 members about the use of therapeutic musculoskeletal procedures in their institutions, asking 16 different questions. Subgroup analysis between general and orthopaedic hospitals was performed with Mann-Whitney U and χ² statistics. A total of 129/2405 answers (5.4% of members) were included in our analysis. A median of 142.5 (25th-75th percentiles: 50-535.5; range 10-5000) therapeutic musculoskeletal procedures per single institution was performed in 2016. Arthropathic pain was the main indication. The most common procedures were joint injection, bursal/tendon injection, and irrigation of calcific tendinopathy. Ultrasound-guided procedures were mainly performed in ultrasonography rooms (77.4%) rather than in dedicated interventional rooms (22.6%). Conversely, fluoroscopic procedures were performed almost with the same frequency in interventional radiology suites (52.4%) and in general radiology rooms (47.6%). In most institutions (72%), autologous blood or blood components were not used. The median number of therapeutic musculoskeletal procedures performed in orthopaedic hospitals was significantly higher than in general hospitals (P = 0.002), as was the use of autologous preparations (P = 0.004). Joint injection, bursal/tendon injection, and irrigation of calcific tendinopathy were the most common therapeutic musculoskeletal procedures, with arthropathic pain being the main indication. The percentage of procedures and the use of autologous preparations were significantly higher in orthopaedic hospitals than in general hospitals.
Lynch, Thomas Sean; Kosanovic, Radomir; Gibbs, Daniel Bradley; Park, Caroline; Bedi, Asheesh; Larson, Christopher M.; Ahmad, Christopher S.
2017-01-01
Objectives: Athletic pubalgia is a condition in which there is an injury to the core musculature that precipitates groin and lower abdominal pain, particularly in cutting and pivoting sports. These are common injury patterns in the National Football League (NFL); however, the effect of surgery on performance for these players has not been described. Methods: Athletes in the NFL who underwent a surgical procedure for athletic pubalgia / core muscle injury (CMI) were identified through team injury reports and archives on public record since 2004. Outcome data were collected for athletes who met inclusion criteria, including total games played after the season of injury/surgery, number of Pro Bowls voted to, yearly total yards and touchdowns for offensive players, and yearly total tackles, sacks and interceptions for defensive players. Previously validated performance scores were calculated from these data for each player one season before and after their procedure for a CMI. Athletes were then matched to control professional football players without a diagnosis of athletic pubalgia by age, position, year and round drafted. Statistical analysis was used to compare pre-injury and post-injury performance measures for players treated with operative management to their case controls. Results: The study group was composed of 32 NFL athletes who underwent operative management for athletic pubalgia and met inclusion criteria during this study period, including 18 offensive players and 16 defensive players. The average age of athletes undergoing this surgery was 27 years. Analysis of pre- and post-injury athletic performance revealed no statistically significant changes after return to sport following surgical intervention; however, there was a statistically significant difference in the number of Pro Bowls that affected athletes participated in before surgery (8) compared to the season after surgery (3). Analysis of durability, as measured by total number of games played before and after surgery, revealed no statistically significant difference. Conclusion: National Football League players who undergo operative care for athletic pubalgia have a high return to play with no decrease in performance scores when compared to case-matched controls. However, the indications for operative intervention and the type of procedure performed are heterogeneous. Further research is warranted to better understand how these injuries occur, what can be done to prevent their occurrence, and the long-term career ramifications of this disorder.
Yadlapati, Ajay; Grogan, Tristan; Elashoff, David; Kelly, Robert B.
2013-01-01
Using a novel noninvasive, visible-light optical diffusion oximeter (T-Stat VLS Tissue Oximeter; Spectros Corporation, Portola Valley, CA) to measure the tissue oxygen saturation (StO2) of the buccal mucosa, the correlation between StO2 and central venous oxygen saturation (ScvO2) was examined in children with congenital cyanotic heart disease undergoing a cardiac surgical procedure. Paired StO2 and serum ScvO2 measurements were obtained postoperatively and statistically analyzed for agreement and association. Thirteen children (nine male) participated in the study (age range, 4 days to 18 months). Surgeries included Glenn shunt procedures, Norwood procedures, unifocalization procedures with Blalock-Taussig shunt placement, a Kawashima/Glenn shunt procedure, a Blalock-Taussig shunt placement, and a modified Norwood procedure. A total of 45 paired StO2-ScvO2 measurements was obtained. Linear regression demonstrated a Pearson’s correlation of .58 (95% confidence interval [CI], .35–.75; p < .0001). The regression slope coefficient estimate was .95 (95% CI, .54–1.36) with an intraclass correlation coefficient of .48 (95% CI, .22–.68). Below a clinically relevant average ScvO2 value, a receiver operating characteristic analysis yielded an area under the curve of .78. Statistical methods to control for repeatedly measuring the same subjects produced similar results. This study shows a moderate relationship and agreement between StO2 and ScvO2 measurements in pediatric patients with a history of congenital cyanotic heart disease undergoing a cardiac surgical procedure. This real-time monitoring device can act as a valuable adjunct to standard noninvasive monitoring in which serum ScvO2 sampling currently assists in the diagnosis of low cardiac output after pediatric cardiac surgery. PMID:23691783
Efficacy of vibration on venipuncture pain scores in a pediatric emergency department.
Secil, Aydinoz; Fatih, Celikel; Gokhan, Aydemir; Alpaslan, Genc Fatih; Gonul, Sezer Rabia
2014-10-01
Venipuncture is a frequent source of painful procedures for infants. It has been well documented that infants react to pain with a combination of physiologic and behavioral responses. Infants are unable to describe pain and are at particularly high risk for inadequate pain management. The Vibration Anesthesia Device is a specifically designed device for the management of pain from minor procedures. It has been shown to reduce venipuncture pain in older children but has not been studied in infants. The mechanism of its effect has been described by the gate control theory, which states that vibration stimulates the dorsal horn neurons where the pain signal is being modulated. The objective of this study was to investigate the efficacy of this device on pain during and after venipuncture procedures in infants. Study participants were 60 healthy infants undergoing a venipuncture procedure for routine laboratory tests. Infants were divided into 2 groups as follows: in group 1 (n = 30), the vibration anesthesia device was placed 5 to 10 cm proximal to the site of venipuncture, and group 2 (n = 30) underwent venipuncture only. A single observer rated pain responses using the Face, Legs, Activity, Cry, and Consolability scale before, during, and after the procedure. The χ² test and Student's t test were used for statistical analysis. The groups did not differ by sex. The mean age of group 2 was significantly lower than that of group 1 (P = 0.026). There were no differences between the pain scores of the groups assessed by the Face, Legs, Activity, Cry, and Consolability scale before, during, and after the venipuncture procedure (P = 0.359, P = 0.907, and P = 0.400, respectively). We assessed the efficacy of a vibration anesthesia device, and our results suggest that this device did not reduce pain scores in infants during and after the venipuncture procedure.
Psychological profiling of offender characteristics from crime behaviors in serial rape offences.
Kocsis, Richard N; Cooksey, Ray W; Irwin, Harvey J
2002-04-01
Criminal psychological profiling has progressively been incorporated into police procedures despite a dearth of empirical research. Indeed, in the study of serial violent crimes for the purpose of psychological profiling, very few original, quantitative, academically reviewed studies actually exist. This article reports on the analysis of 62 incidents of serial sexual assault. The statistical procedure of multidimensional scaling was employed in the analysis of these data, which in turn produced a five-cluster model of serial rapist behavior. First, a central cluster of behaviors was identified that represents behaviors common to all patterns of serial rape. Second, four distinct outlying patterns were identified as demonstrating distinct offence styles, these being assigned the following descriptive labels: brutality, intercourse, chaotic, and ritual. Furthermore, analysis of these patterns also identified distinct offender characteristics that allow for the use of empirically robust offender profiles in future serial rape investigations.
Statistical PERT: An Improved Subnetwork Analysis Procedure
1975-11-01
ERIC Educational Resources Information Center
Ojerinde, Dibu; Popoola, Omokunmi; Onyeneho, Patrick; Egberongbe, Aminat
2016-01-01
The statistical procedure used to adjust for differences in difficulty across test forms is known as "equating". Equating makes it possible for various test forms to be used interchangeably. In terms of where the equating method fits in the assessment cycle, there are pre-equating and post-equating methods. The major benefits of pre-equating, when…
Grammar and Lexicon in Individuals with Autism: A Quantitative Analysis of a Large Italian Corpus
ERIC Educational Resources Information Center
Tuzzi, Arjuna
2009-01-01
Statistical and linguistic procedures were implemented to analyze a large corpus of texts written by 37 individuals with autism and 92 facilitators (without disabilities), producing written conversations by means of PCs. Such texts were compared and contrasted to identify the specific traits of the lexis of the group of individuals with autism and…
Exact and Monte carlo resampling procedures for the Wilcoxon-Mann-Whitney and Kruskal-Wallis tests.
Berry, K J; Mielke, P W
2000-12-01
Exact and Monte Carlo resampling FORTRAN programs are described for the Wilcoxon-Mann-Whitney rank sum test and the Kruskal-Wallis one-way analysis of variance for ranks test. The program algorithms compensate for tied values and do not depend on asymptotic approximations for probability values, unlike most algorithms contained in PC-based statistical software packages.
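The Monte Carlo resampling logic summarized above can be sketched compactly. The following Python sketch is an illustrative analogue of the published FORTRAN programs, not a port of them: it computes the Wilcoxon-Mann-Whitney rank sum on midranks (which compensates for tied values) and estimates a two-sided p-value from random permutations rather than a normal approximation. The sample data and the number of resamples are arbitrary choices.

```python
import numpy as np
from scipy.stats import rankdata

def mc_rank_sum_test(x, y, n_resamples=10_000, seed=0):
    """Monte Carlo permutation version of the Wilcoxon-Mann-Whitney test.

    Ties are handled with midranks; no asymptotic approximation is used.
    """
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    n_x = len(x)
    ranks = rankdata(pooled)              # midranks compensate for ties
    observed = ranks[:n_x].sum()          # rank sum of the first sample
    expected = ranks.mean() * n_x         # rank sum expected under H0

    count = 0
    for _ in range(n_resamples):
        perm = rng.permutation(len(pooled))
        stat = ranks[perm[:n_x]].sum()
        # two-sided: permuted statistic at least as far from expectation as observed
        if abs(stat - expected) >= abs(observed - expected):
            count += 1
    return observed, (count + 1) / (n_resamples + 1)

# toy example
x = np.array([1.1, 2.3, 2.3, 3.7, 4.0])
y = np.array([2.3, 4.5, 5.1, 5.1, 6.0, 7.2])
stat, p = mc_rank_sum_test(x, y)
print(f"rank sum = {stat:.1f}, Monte Carlo p = {p:.4f}")
```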
ERIC Educational Resources Information Center
Horne, Lela M.; Rachal, John R.; Shelley, Kyna
2012-01-01
A mixed methods framework utilized quantitative and qualitative data to determine whether statistically significant differences existed between high school and GED[R] student perceptions of credential value. An exploratory factor analysis (n=326) extracted four factors and then a MANOVA procedure was performed with a stratified quota sample…
Westfall, Jacob; Kenny, David A; Judd, Charles M
2014-10-01
Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
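The central claim here, that power plateaus as participants are added while the stimulus sample stays fixed, can be checked with a small simulation. The sketch below is not the authors' power application or their mixed-model procedure; it simulates a crossed participants-by-stimuli design and, as a rough stand-in for a proper analysis, declares an effect only when both a by-participant and a by-stimulus test are significant. All variance components and the effect size are arbitrary illustrative values.

```python
import numpy as np
from scipy.stats import ttest_rel, ttest_ind

def power_crossed(n_subj, n_stim, effect=0.4, sd_subj=0.5, sd_stim=0.5,
                  sd_resid=1.0, n_sims=1000, alpha=0.05, seed=1):
    """Monte Carlo power for a crossed participants-by-stimuli design in which
    half of the stimuli belong to each condition. An effect is declared only
    when both the by-participant and the by-stimulus tests reject."""
    rng = np.random.default_rng(seed)
    half = n_stim // 2
    cond = np.r_[np.full(half, 0.5), np.full(half, -0.5)]   # stimulus-level condition codes
    hits = 0
    for _ in range(n_sims):
        subj = rng.normal(0, sd_subj, (n_subj, 1))           # participant random effects
        stim = rng.normal(0, sd_stim, (1, n_stim))            # stimulus random effects
        y = effect * cond + subj + stim + rng.normal(0, sd_resid, (n_subj, n_stim))
        # by-participant analysis: average over stimuli within each condition
        p1 = ttest_rel(y[:, :half].mean(axis=1), y[:, half:].mean(axis=1)).pvalue
        # by-stimulus analysis: average over participants
        stim_means = y.mean(axis=0)
        p2 = ttest_ind(stim_means[:half], stim_means[half:]).pvalue
        hits += (p1 < alpha) and (p2 < alpha)
    return hits / n_sims

for n_subj in (20, 50, 200, 1000):
    print(n_subj, power_crossed(n_subj, n_stim=16))
```

In this toy configuration, increasing the number of participants from 20 to 1000 with only 16 stimuli leaves the estimated power well below 1, illustrating the plateau described above.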
General aviation air traffic pattern safety analysis
NASA Technical Reports Server (NTRS)
Parker, L. C.
1973-01-01
A concept is described for evaluating the general aviation mid-air collision hazard in uncontrolled terminal airspace. Three-dimensional traffic pattern measurements were conducted at uncontrolled and controlled airports. Computer programs for data reduction, storage, retrieval, and statistical analysis have been developed. Initial general aviation air traffic pattern characteristics are presented. These preliminary results indicate that patterns are highly divergent from the expected standard pattern and that the pattern procedures observed can affect the ability of pilots to see and avoid each other.
Applications of satellite image processing to the analysis of Amazonian cultural ecology
NASA Technical Reports Server (NTRS)
Behrens, Clifford A.
1991-01-01
This paper examines the application of satellite image processing towards identifying and comparing resource exploitation among indigenous Amazonian peoples. The use of statistical and heuristic procedures for developing land cover/land use classifications from Thematic Mapper satellite imagery will be discussed along with actual results from studies of relatively small (100 - 200 people) settlements. Preliminary research indicates that analysis of satellite imagery holds great potential for measuring agricultural intensification, comparing rates of tropical deforestation, and detecting changes in resource utilization patterns over time.
Efficacy of micronized acellular dermal graft for use in interproximal papillae regeneration.
Geurs, Nico C; Romanos, Alain H; Vassilopoulos, Philip J; Reddy, Michael S
2012-02-01
The aim of this study was to evaluate interdental papillary reconstruction based on a micronized acellular dermal matrix allograft technique. Thirty-eight papillae in 12 patients with esthetic complaints of insufficient papillae were evaluated. Decreased gingival recession values were found postoperatively (P < .001). Chi-square analysis showed significantly higher postoperative Papilla Index values (chi-square = 43, P < .001), further supported by positive symmetry statistical analysis values (positive kappa and weighted kappa values). This procedure shows promise as a method for papillary reconstruction.
NASA Technical Reports Server (NTRS)
Wharton, S. W.
1980-01-01
An Interactive Cluster Analysis Procedure (ICAP) was developed to derive classifier training statistics from remotely sensed data. The algorithm interfaces the rapid numerical processing capacity of a computer with the human ability to integrate qualitative information. Control of the clustering process alternates between the algorithm, which creates new centroids and forms clusters, and the analyst, who evaluates the results and may elect to modify the cluster structure. Clusters can be deleted or lumped pairwise, or new centroids can be added. A summary of the cluster statistics can be requested to facilitate cluster manipulation. The ICAP was implemented in APL (A Programming Language), an interactive computer language. The flexibility of the algorithm was evaluated using data from different LANDSAT scenes to simulate two situations: one in which the analyst is assumed to have no prior knowledge about the data and wishes to have the clusters formed more or less automatically; and the other in which the analyst is assumed to have some knowledge about the data structure and wishes to use that information to closely supervise the clustering process. For comparison, an existing clustering method was also applied to the two data sets.
Viewpoint: observations on scaled average bioequivalence.
Patterson, Scott D; Jones, Byron
2012-01-01
The two one-sided test procedure (TOST) has been used for average bioequivalence testing since 1992 and is required when marketing new formulations of an approved drug. TOST is known to require comparatively large numbers of subjects to demonstrate bioequivalence for highly variable drugs, defined as those drugs having intra-subject coefficients of variation greater than 30%. However, TOST has been shown to protect public health when multiple generic formulations enter the marketplace following patent expiration. Recently, scaled average bioequivalence (SABE) has been proposed by multiple regulatory agencies as an alternative statistical analysis procedure for such products. SABE testing requires that a three-period partial replicate cross-over or full replicate cross-over design be used. Following a brief summary of SABE analysis methods applied to existing data, we consider three statistical ramifications of the proposed additional decision rules and, using simulation, the potential impact of implementing scaled average bioequivalence in the marketplace. It is found that one of the constraints being applied introduces bias, that bias may also result from the common problem of missing data, and that the SABE methods allow for much greater changes in exposure when generic-generic switching occurs in the marketplace.
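For readers unfamiliar with the TOST decision rule that SABE modifies, a minimal sketch on the log scale is given below. It is a simplified paired-data illustration that ignores period and sequence effects (a real crossover analysis would fit a linear model for the full design), and the simulated values are arbitrary. SABE differs in that the equivalence margin is widened in proportion to the reference formulation's within-subject variability.

```python
import numpy as np
from scipy import stats

def tost_abe(log_test, log_ref, theta=np.log(1.25), alpha=0.05):
    """Two one-sided tests (TOST) for average bioequivalence on the log scale.

    Paired-data sketch: d holds within-subject log ratios of test vs reference.
    Bioequivalence is concluded if both one-sided nulls are rejected, i.e. if
    max(p_lower, p_upper) < alpha, equivalently if the 90% CI for the geometric
    mean ratio lies within (0.80, 1.25).
    """
    d = log_test - log_ref
    n = len(d)
    se = d.std(ddof=1) / np.sqrt(n)
    t_lower = (d.mean() + theta) / se            # H0: mean <= -theta
    t_upper = (d.mean() - theta) / se            # H0: mean >= +theta
    p_lower = 1 - stats.t.cdf(t_lower, df=n - 1)
    p_upper = stats.t.cdf(t_upper, df=n - 1)
    ci = d.mean() + np.array([-1, 1]) * stats.t.ppf(1 - alpha, n - 1) * se
    return max(p_lower, p_upper), np.exp(ci)

rng = np.random.default_rng(7)
log_ref = rng.normal(np.log(100), 0.25, 24)           # simulated log AUCs, 24 subjects
log_test = log_ref + rng.normal(0.02, 0.20, 24)       # roughly 20% intra-subject CV
p, ci = tost_abe(log_test, log_ref)
print(f"TOST p = {p:.4f}, 90% CI for the ratio = {ci[0]:.3f}-{ci[1]:.3f}")
```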
O' Sullivan, Katie E; Bracken-Clarke, Darragh; Segurado, Ricardo; Barry, Mitchel; Sugrue, Declan; Flood, Georgina; Hurley, John
2014-09-01
Retrograde transcatheter aortic valve implantation (TAVI) can be performed under local anesthesia (LA) or general anesthesia (GA); however, a wide variation in practice exists. PubMed was searched between 2009 and 2013. Data were extracted from eligible studies. Random-effects meta-analysis was performed using the DerSimonian-Laird estimator of between-study variance. There was no statistically significant difference identified between groups based on age or EuroSCORE. There was no statistically significant difference seen in all-cause mortality or complication rates between groups. Mean procedural duration was 36 minutes shorter in the LA group (p = 0.001). There was increased vasopressor use in the GA group (odds ratio 3.92; p = 0.017). Mean hospital stay was 3.41 days shorter in the LA group (p = 0.018). Results suggest that the use of LA for retrograde TAVI is feasible. There are several potential associated benefits: shorter procedural duration, shorter hospital stay, and lower vasopressor requirements. Further studies and randomized trials are mandatory to confirm the presented findings and to identify those patients for whom LA would be appropriate.
Robustness of fit indices to outliers and leverage observations in structural equation modeling.
Yuan, Ke-Hai; Zhong, Xiaoling
2013-06-01
Normal-distribution-based maximum likelihood (NML) is the most widely used method in structural equation modeling (SEM), although practical data tend to be nonnormally distributed. The effect of nonnormally distributed data or data contamination on the normal-distribution-based likelihood ratio (LR) statistic is well understood due to many analytical and empirical studies. In SEM, fit indices are used as widely as the LR statistic. In addition to NML, robust procedures have been developed for more efficient and less biased parameter estimates with practical data. This article studies the effect of outliers and leverage observations on fit indices following NML and two robust methods. Analysis and empirical results indicate that good leverage observations following NML and one of the robust methods lead most fit indices to give more support to the substantive model. While outliers tend to make a good model superficially bad according to many fit indices following NML, they have little effect on those following the two robust procedures. Implications of the results to data analysis are discussed, and recommendations are provided regarding the use of estimation methods and interpretation of fit indices. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
Chow, Jeffrey T. Y.; Hutnik, Cindy M. L.; Solo, Karla; Malvankar-Mehta, Monali S.
2017-01-01
The purpose of this systematic review and meta-analysis was to examine the availability of evidence for one of the earliest available minimally invasive glaucoma surgery (MIGS) procedures, the Trabectome. Various databases were searched up to December 20, 2016, for any published studies assessing the use of the Trabectome as a solo procedure in patients with primary open-angle glaucoma (POAG). The standardized mean differences (SMD) were calculated for the change in intraocular pressure (IOP) and number of glaucoma mediations used at 1-month, 6-month, and 12-month follow-up. After screening, three studies and one abstract with analyzable data were included. The meta-analysis showed statistically significant reductions in IOP and number of glaucoma medications used at all time points. Though the Trabectome as a solo procedure appears to lower IOP and reduces the number of glaucoma medications, more high-quality studies are required to make definitive conclusions. The difficulty of obtaining evidence may be one of the many obstacles that limit a full understanding of the potential safety and/or efficacy benefits compared to standard treatments. The time has come for a thoughtful and integrated approach with stakeholders to determine optimal access to care strategies for our patients. PMID:28740733
Norris, David C; Wilson, Andrew
2016-01-01
In a 2014 report on adolescent mental health outcomes in the Moving to Opportunity for Fair Housing Demonstration (MTO), Kessler et al. reported that, at 10- to 15-year follow-up, boys from households randomized to an experimental housing voucher intervention experienced 12-month prevalence of post-traumatic stress disorder (PTSD) at several times the rate of boys from control households. We reanalyze this finding here, bringing to light a PTSD outcome imputation procedure used in the original analysis, but not described in the study report. By bootstrapping with repeated draws from the frequentist sampling distribution of the imputation model used by Kessler et al., and by varying two pseudorandom number generator seeds that fed their analysis, we account for several purely statistical components of the uncertainty inherent in their imputation procedure. We also discuss other sources of uncertainty in this procedure that were not accessible to a formal reanalysis.
Protocol for monitoring metals in Ozark National Scenic Riverways, Missouri: Version 1.0
Schmitt, Christopher J.; Brumbaugh, William G.; Besser, John M.; Hinck, Jo Ellen; Bowles, David E.; Morrison, Lloyd W.; Williams, Michael H.
2008-01-01
The National Park Service is developing a monitoring plan for the Ozark National Scenic Riverways in southeastern Missouri. Because of concerns about the release of lead, zinc, and other metals from lead-zinc mining to streams, the monitoring plan will include mining-related metals. After considering a variety of alternatives, the plan will consist of measuring the concentrations of cadmium, cobalt, lead, nickel, and zinc in composite samples of crayfish (Orconectes luteus or alternate species) and Asian clam (Corbicula fluminea) collected periodically from selected sites. This document, which comprises a protocol narrative and supporting standard operating procedures, describes the methods to be employed prior to, during, and after collection of the organisms, along with procedures for their chemical analysis and quality assurance; statistical analysis, interpretation, and reporting of the data; and for modifying the protocol narrative and supporting standard operating procedures. A list of supplies and equipment, data forms, and sample labels are also included. An example based on data from a pilot study is presented.
Bydon, Mohamad; Abt, Nicholas B; De la Garza-Ramos, Rafael; Macki, Mohamed; Witham, Timothy F; Gokaslan, Ziya L; Bydon, Ali; Huang, Judy
2015-04-01
The authors sought to determine the impact of resident participation on overall 30-day morbidity and mortality following neurosurgical procedures. The American College of Surgeons National Surgical Quality Improvement Program database was queried for all patients who had undergone neurosurgical procedures between 2006 and 2012. The operating surgeon(s), whether an attending only or attending plus resident, was assessed for his or her influence on morbidity and mortality. Multivariate logistic regression was used to estimate odds ratios for 30-day postoperative morbidity and mortality outcomes for the attending-only compared with the attending plus resident cohorts (attending group and attending+resident group, respectively). The study population consisted of 16,098 patients who had undergone elective or emergent neurosurgical procedures. The mean patient age was 56.8 ± 15.0 years, and 49.8% of patients were women. Overall, 15.8% of all patients had at least one postoperative complication. The attending+resident group demonstrated a complication rate of 20.12%, while patients with an attending-only surgeon had a statistically significantly lower complication rate at 11.70% (p < 0.001). In the total population, 263 patients (1.63%) died within 30 days of surgery. Stratified by operating surgeon status, 162 patients (2.07%) in the attending+resident group died versus 101 (1.22%) in the attending group, which was statistically significant (p < 0.001). Regression analyses compared patients who had resident participation to those with only attending surgeons, the referent group. Following adjustment for preoperative patient characteristics and comorbidities, multivariate regression analysis demonstrated that patients with resident participation in their surgery had the same odds of 30-day morbidity (OR = 1.05, 95% CI 0.94-1.17) and mortality (OR = 0.92, 95% CI 0.66-1.28) as their attending-only counterparts. Cases with resident participation had higher rates of mortality and morbidity; however, these cases also involved patients with more comorbidities initially. On multivariate analysis, resident participation was not an independent risk factor for postoperative 30-day morbidity or mortality following elective or emergent neurosurgical procedures.
NASA Astrophysics Data System (ADS)
Lowe, David J.; Pearce, Nicholas J. G.; Jorgensen, Murray A.; Kuehn, Stephen C.; Tryon, Christian A.; Hayward, Chris L.
2017-11-01
We define tephras and cryptotephras and their components (mainly ash-sized particles of glass ± crystals in distal deposits) and summarize the basis of tephrochronology as a chronostratigraphic correlational and dating tool for palaeoenvironmental, geological, and archaeological research. We then document and appraise recent advances in analytical methods used to determine the major, minor, and trace elements of individual glass shards from tephra or cryptotephra deposits to aid their correlation and application. Protocols developed recently for the electron probe microanalysis of major elements in individual glass shards help to improve data quality and standardize reporting procedures. A narrow electron beam (diameter ∼3-5 μm) can now be used to analyze smaller glass shards than previously attainable. Reliable analyses of 'microshards' (defined here as glass shards <32 μm in diameter) using narrow beams are useful for fine-grained samples from distal or ultra-distal geographic locations, and for vesicular or microlite-rich glass shards or small melt inclusions. Caveats apply, however, in the microprobe analysis of very small microshards (≤∼5 μm in diameter), where particle geometry becomes important, and of microlite-rich glass shards where the potential problem of secondary fluorescence across phase boundaries needs to be recognised. Trace element analyses of individual glass shards using laser ablation inductively coupled plasma-mass spectrometry (LA-ICP-MS), with crater diameters of 20 μm and 10 μm, are now effectively routine, giving detection limits well below 1 ppm. Smaller ablation craters (<10 μm) can be subject to significant element fractionation during analysis, but the systematic relationship of such fractionation with glass composition suggests that analyses for some elements at these resolutions may be quantifiable. In undertaking analyses, either by microprobe or LA-ICP-MS, reference material data acquired using the same procedure, and preferably from the same analytical session, should be presented alongside new analytical data. In part 2 of the review, we describe, critically assess, and recommend ways in which tephras or cryptotephras can be correlated (in conjunction with other information) using numerical or statistical analyses of compositional data. Statistical methods provide a less subjective means of dealing with analytical data pertaining to tephra components (usually glass or crystals/phenocrysts) than heuristic alternatives. They enable a better understanding of relationships among the data from multiple viewpoints to be developed and help quantify the degree of uncertainty in establishing correlations. In common with other scientific hypothesis testing, it is easier to infer using such analysis that two or more tephras are different rather than the same. Adding stratigraphic, chronological, spatial, or palaeoenvironmental data (i.e. multiple criteria) is usually necessary and allows for more robust correlations to be made. A two-stage approach is useful, the first focussed on differences in the mean composition of samples, or their range, which can be visualised graphically via scatterplot matrices or bivariate plots coupled with the use of statistical tools such as distance measures, similarity coefficients, hierarchical cluster analysis (informed by distance measures or similarity or cophenetic coefficients), and principal components analysis (PCA). 
Some statistical methods (cluster analysis, discriminant analysis) are referred to as 'machine learning' in the computing literature. The second stage examines sample variance and the degree of compositional similarity so that sample equivalence or otherwise can be established on a statistical basis. This stage may involve discriminant function analysis (DFA), support vector machines (SVMs), canonical variates analysis (CVA), and ANOVA or MANOVA (or its two-sample special case, the Hotelling two-sample T2 test). Randomization tests can be used where distributional assumptions such as multivariate normality underlying parametric tests are doubtful. Compositional data may be transformed and scaled before being subjected to multivariate statistical procedures including calculation of distance matrices, hierarchical cluster analysis, and PCA. Such transformations may make the assumption of multivariate normality more appropriate. A sequential procedure using Mahalanobis distance and the Hotelling two-sample T2 test is illustrated using glass major element data from trachytic to phonolitic Kenyan tephras. All these methods require a broad range of high-quality compositional data which can be used to compare 'unknowns' with reference (training) sets that are sufficiently complete to account for all possible correlatives, including tephras with heterogeneous glasses that contain multiple compositional groups. Currently, incomplete databases are tending to limit correlation efficacy. The development of an open, online global database to facilitate progress towards integrated, high-quality tephrostratigraphic frameworks for different regions is encouraged.
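The sequential Mahalanobis distance / Hotelling two-sample T² step mentioned above can be illustrated with a short sketch. The data below are simulated stand-ins for shard major-element analyses, not the Kenyan glass data, and the usual assumptions (multivariate normality and a common covariance, possibly after transformation of the compositional data) are taken for granted.

```python
import numpy as np
from scipy import stats

def hotelling_t2(X, Y):
    """Two-sample Hotelling T2 test for equality of mean compositions.

    X, Y: (n_shards, n_elements) arrays for two tephra samples. Returns the
    T2 statistic, the equivalent F statistic, and its p-value, using a pooled
    covariance matrix.
    """
    n1, p = X.shape
    n2, _ = Y.shape
    d = X.mean(axis=0) - Y.mean(axis=0)
    S_pooled = ((n1 - 1) * np.cov(X, rowvar=False) +
                (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
    # squared Mahalanobis distance between the two sample mean vectors
    D2 = d @ np.linalg.solve(S_pooled, d)
    T2 = (n1 * n2) / (n1 + n2) * D2
    F = T2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))
    p_value = stats.f.sf(F, p, n1 + n2 - p - 1)
    return T2, F, p_value

# toy illustration with simulated analyses of two shard populations
rng = np.random.default_rng(0)
A = rng.normal([72.0, 13.0, 3.5], 0.4, size=(30, 3))   # e.g. SiO2, Al2O3, FeO wt%
B = rng.normal([72.3, 12.8, 3.6], 0.4, size=(25, 3))
print(hotelling_t2(A, B))
```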
Quantized correlation coefficient for measuring reproducibility of ChIP-chip data.
Peng, Shouyong; Kuroda, Mitzi I; Park, Peter J
2010-07-27
Chromatin immunoprecipitation followed by microarray hybridization (ChIP-chip) is used to study protein-DNA interactions and histone modifications on a genome-scale. To ensure data quality, these experiments are usually performed in replicates, and a correlation coefficient between replicates is often used to assess reproducibility. However, the correlation coefficient can be misleading because it is affected not only by the reproducibility of the signal but also by the amount of binding signal present in the data. We develop the Quantized correlation coefficient (QCC) that is much less dependent on the amount of signal. This involves discretization of data into a set of quantiles (quantization), a merging procedure to group the background probes, and recalculation of the Pearson correlation coefficient. This procedure reduces the influence of the background noise on the statistic, which then properly focuses more on the reproducibility of the signal. The performance of this procedure is tested in both simulated and real ChIP-chip data. For replicates with different levels of enrichment over background and coverage, we find that QCC reflects reproducibility more accurately and is more robust than the standard Pearson or Spearman correlation coefficients. The quantization and the merging procedure can also suggest a proper quantile threshold for separating signal from background for further analysis. To measure reproducibility of ChIP-chip data correctly, a correlation coefficient that is robust to the amount of signal present should be used. QCC is one such measure. The QCC statistic can also be applied in a variety of other contexts for measuring reproducibility, including analysis of array CGH data for DNA copy number and gene expression data.
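A rough sketch of the quantize-then-merge idea, as described in the abstract, is given below. The number of quantile bins and the background cut-off are arbitrary illustrative choices, not tuning values from the published algorithm, and the data are simulated.

```python
import numpy as np
from scipy.stats import rankdata, pearsonr

def qcc(x, y, n_quantiles=20, background_quantiles=15):
    """Quantized correlation coefficient, in the spirit of Peng et al.:
    map each replicate to quantile bins, merge the lowest (background) bins
    into a single category, then compute Pearson's r on the quantized values.
    """
    def quantize(v):
        q = np.ceil(rankdata(v) / len(v) * n_quantiles)   # quantile index 1..n_quantiles
        q[q <= background_quantiles] = 0                  # merge background probes
        return q
    r, _ = pearsonr(quantize(x), quantize(y))
    return r

# simulated replicates: mostly background probes plus a small enriched fraction
rng = np.random.default_rng(3)
background = rng.normal(0, 1, 9000)
signal = rng.normal(3, 1, 1000)
truth = np.r_[background, signal]
rep1 = truth + rng.normal(0, 0.5, 10000)
rep2 = truth + rng.normal(0, 0.5, 10000)
print("Pearson:", pearsonr(rep1, rep2)[0], " QCC:", qcc(rep1, rep2))
```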
Raatikainen, M J Pekka; Arnar, David O; Zeppenfeld, Katja; Merino, Jose Luis; Levya, Francisco; Hindriks, Gerhardt; Kuck, Karl-Heinz
2015-01-01
There have been large variations in the use of invasive electrophysiological therapies in the member countries of the European Society of Cardiology (ESC). The aim of this analysis was to provide comprehensive information on cardiac implantable electronic device (CIED) and catheter ablation therapy trends in the ESC countries over the last five years. The European Heart Rhythm Association (EHRA) has collected data on CIED and catheter ablation therapy since 2008. Last year 49 of the 56 ESC member countries provided data for the EHRA White Book. This analysis is based on the current and previous editions of the EHRA White Book. Data on procedure rates, together with information on economic aspects, local reimbursement systems, and training activities, are presented for each ESC country and the five geographical ESC regions. In 2013, the electrophysiological procedure rates per million population were highest in Western Europe, followed by the Southern and Northern European countries. The CIED implantation rate and the catheter ablation rate were lowest in the Eastern European and non-European ESC countries, respectively. However, in some Eastern European countries with relatively low gross domestic product, procedure rates exceeded those of some wealthier Western countries, suggesting that economic resources are not the only driver for utilization of arrhythmia therapies. These statistics indicate that despite significant improvements, there is still considerable heterogeneity in the availability of arrhythmia therapies across the ESC area. Hopefully, these data will help identify areas for improvement and guide future activities in cardiac arrhythmia management.
UNCERTAINTY ON RADIATION DOSES ESTIMATED BY BIOLOGICAL AND RETROSPECTIVE PHYSICAL METHODS.
Ainsbury, Elizabeth A; Samaga, Daniel; Della Monaca, Sara; Marrale, Maurizio; Bassinet, Celine; Burbidge, Christopher I; Correcher, Virgilio; Discher, Michael; Eakins, Jon; Fattibene, Paola; Güçlü, Inci; Higueras, Manuel; Lund, Eva; Maltar-Strmecki, Nadica; McKeever, Stephen; Rääf, Christopher L; Sholom, Sergey; Veronese, Ivan; Wieser, Albrecht; Woda, Clemens; Trompier, Francois
2018-03-01
Biological and physical retrospective dosimetry are recognised as key techniques to provide individual estimates of dose following unplanned exposures to ionising radiation. Whilst there has been a relatively large amount of recent development in the biological and physical procedures, development of statistical analysis techniques has failed to keep pace. The aim of this paper is to review the current state of the art in uncertainty analysis techniques across the 'EURADOS Working Group 10-Retrospective dosimetry' members, to give concrete examples of implementation of the techniques recommended in the international standards, and to further promote the use of Monte Carlo techniques to support characterisation of uncertainties. It is concluded that sufficient techniques are available and in use by most laboratories for acute, whole body exposures to highly penetrating radiation, but further work will be required to ensure that statistical analysis is always wholly sufficient for the more complex exposure scenarios.
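As an illustration of the Monte Carlo approach promoted here, the sketch below propagates uncertainty from Poisson counting statistics and from calibration-curve coefficients onto a dose estimated from a dicentric assay with a linear-quadratic calibration curve. The coefficient values are purely illustrative assumptions, not recommended calibration data.

```python
import numpy as np

def mc_dose_uncertainty(dicentrics, cells, alpha=(0.018, 0.005),
                        beta=(0.060, 0.010), c=(0.001, 0.0005),
                        n_draws=100_000, seed=42):
    """Monte Carlo propagation of uncertainty onto a dose estimate.

    Calibration curve: yield Y = c + alpha*D + beta*D**2, with each coefficient
    given as (mean, standard error). Counting uncertainty is modeled as Poisson.
    Returns the median dose and a 95% uncertainty interval (in Gy-like units).
    """
    rng = np.random.default_rng(seed)
    y = rng.poisson(dicentrics, n_draws) / cells        # resampled yields
    a = rng.normal(*alpha, n_draws)
    b = rng.normal(*beta, n_draws)
    cc = rng.normal(*c, n_draws)
    # solve b*D^2 + a*D + (cc - y) = 0 for the positive root
    disc = np.clip(a**2 - 4 * b * (cc - y), 0, None)
    dose = (-a + np.sqrt(disc)) / (2 * b)
    dose = dose[(b > 0) & (dose > 0)]                   # keep physically sensible draws
    lo, med, hi = np.percentile(dose, [2.5, 50, 97.5])
    return med, (lo, hi)

print(mc_dose_uncertainty(dicentrics=25, cells=500))
```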
NASA Astrophysics Data System (ADS)
Simoni, Daniele; Lengani, Davide; Guida, Roberto
2016-09-01
The transition process of the boundary layer growing over a flat plate, with a pressure gradient simulating the suction side of a low-pressure turbine blade and an elevated free-stream turbulence intensity level, has been analyzed by means of PIV and hot-wire measurements. A detailed view of the instantaneous flow field in the wall-normal plane highlights the physics characterizing the complex process leading to the formation of large-scale coherent structures during the breakdown of the ordered motion of the flow, thus generating randomized oscillations (i.e., turbulent spots). This analysis gives the basis for the development of a new procedure aimed at determining the intermittency function describing (statistically) the transition process. To this end, a wavelet-based method has been employed for the identification of the large-scale structures created during the transition process. Subsequently, a probability density function of these events has been defined so that an intermittency function can be deduced. The latter closely corresponds to the intermittency function of the transitional flow computed through a classic procedure based on hot-wire data. The agreement between the two procedures in the intermittency shape and spot production rate proves the capability of the method to provide a statistical representation of the transition process. The main advantages of the proposed procedure are its applicability to PIV data; that it does not require a threshold level on the first- and/or second-order time derivatives of hot-wire time traces (which makes the method independent of the operator); and that it provides clear evidence of the connection between the flow physics and the statistical representation of transition based on the theory of turbulent spot propagation.
Local Linear Regression for Data with AR Errors.
Li, Runze; Li, Yan
2009-07-01
In many statistical applications, data are collected over time, and they are likely correlated. In this paper, we investigate how to incorporate the correlation information into the local linear regression. Under the assumption that the error process is an auto-regressive process, a new estimation procedure is proposed for the nonparametric regression by using local linear regression method and the profile least squares techniques. We further propose the SCAD penalized profile least squares method to determine the order of auto-regressive process. Extensive Monte Carlo simulation studies are conducted to examine the finite sample performance of the proposed procedure, and to compare the performance of the proposed procedures with the existing one. From our empirical studies, the newly proposed procedures can dramatically improve the accuracy of naive local linear regression with working-independent error structure. We illustrate the proposed methodology by an analysis of real data set.
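The working-independence baseline that the paper improves upon, plain local linear regression with a kernel weight, can be sketched as follows. The profile least squares step for the AR error structure and the SCAD-penalized order selection are not reproduced here, and the bandwidth and toy data are arbitrary.

```python
import numpy as np

def local_linear(x, y, x_grid, bandwidth=0.3):
    """Local linear regression with a Gaussian kernel (working-independence
    version; the paper's proposal additionally profiles out an AR error
    structure and selects its order with a SCAD penalty)."""
    fitted = np.empty_like(x_grid, dtype=float)
    for i, x0 in enumerate(x_grid):
        w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)    # kernel weights
        X = np.column_stack([np.ones_like(x), x - x0])    # local intercept + slope
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX, WX.T @ y)        # weighted least squares
        fitted[i] = beta[0]                               # intercept estimates m(x0)
    return fitted

# toy data with AR(1) errors
rng = np.random.default_rng(5)
n = 300
x = np.sort(rng.uniform(0, 1, n))
e = np.zeros(n)
for t in range(1, n):
    e[t] = 0.6 * e[t - 1] + rng.normal(0, 0.2)
y = np.sin(2 * np.pi * x) + e
grid = np.linspace(0, 1, 50)
print(local_linear(x, y, grid)[:5])
```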
Magnetic navigation system for percutaneous coronary intervention: A meta-analysis.
Qi, Zhiyong; Wu, Bangwei; Luo, Xinping; Zhu, Jun; Shi, Haiming; Jin, Bo
2016-07-01
The magnetic navigation system (MNS) allows calculation of the vessel coordinates in real space within the patient's chest for percutaneous coronary intervention (PCI). However, its impact on procedural parameters and clinical outcomes is still a matter of debate. To derive a more precise estimation of the relationship, a meta-analysis was performed. Studies exploring the advantages of MNS were identified in English-language articles by search of Medline, Web of Science, and Cochrane Library Databases (inception to October 2015). A standardized protocol was used to extract details on study design, region origin, demographic data, lesion type, and clinical outcomes. The main outcome measures were contrast consumption, procedural success rate, contrast used for wire crossing, procedure time to cross the lesions, and fluoroscopy time. A total of 12 clinical trials involving 2174 patients were included for analysis (902 patients in the magnetic PCI group and 1272 in the conventional PCI group). Overall, contrast consumption was decreased by 40.45 mL (95% confidence interval [CI] -70.98 to -9.92, P = 0.009) in the magnetic PCI group compared with the control group. In addition, magnetic PCI was associated with a significant decrease in procedural time of 2.17 minutes (95% CI -3.91 to -0.44, P = 0.01), and the total fluoroscopy time was significantly decreased by 1.43 minutes (95% CI -2.29 to -0.57, P = 0.001) in the magnetic PCI group. However, for procedural success rate, contrast used for wire crossing, procedure time to cross the lesions, and the fluoroscopy time to cross the lesions, no statistically significant difference was observed between the 2 groups. The present meta-analysis indicated an improvement in overall contrast consumption, total procedural time, and fluoroscopy time in the magnetic PCI group. However, no significant advantage was observed for procedural success rate.
Lee, O-Sung; Ahn, Soyeon; Ahn, Jin Hwan; Teo, Seow Hui; Lee, Yong Seuk
2018-02-01
The purpose of this systematic review and meta-analysis was to evaluate the efficacy of concurrent cartilage procedures during high tibial osteotomy (HTO) for medial compartment osteoarthritis (OA) by comparing the outcomes of studies that directly compared the use of HTO plus concurrent cartilage procedures versus HTO alone. Results that could be compared across more than two articles were presented as forest plots. A 95% confidence interval was calculated for each effect size, and we calculated the I² statistic, which gives the percentage of total variation attributable to heterogeneity among studies. The random effects model was used to calculate the effect size. Seven articles were included in the final analysis. Case groups were composed of HTO without concurrent procedures, and control groups were composed of HTO with concurrent procedures such as marrow stimulation, mesenchymal stem cell transplantation, and injection. The case group showed a higher Hospital for Special Surgery score, with a mean difference of 4.10 (I² 80.8%, 95% confidence interval [CI] -9.02 to 4.82). The mean difference of the mechanical femorotibial angle in five studies was 0.08° (I² 0%, 95% CI -0.26 to 0.43). However, improved arthroscopic, histologic, and MRI results were reported in the control group. Our analysis supports the conclusion that concurrent procedures during HTO for medial compartment OA have little beneficial effect on clinical and radiological outcomes. However, they might have some beneficial effects in terms of arthroscopic, histologic, and MRI findings, even though the quality of the healed cartilage is not as good as that of the original cartilage. Therefore, until now, concurrent procedures for medial compartment OA have been considered optional. Nevertheless, no conclusions can be drawn for younger patients with focal cartilage defects and concomitant varus deformity. This question needs to be addressed separately.
Docking studies on NSAID/COX-2 isozyme complexes using Contact Statistics analysis
NASA Astrophysics Data System (ADS)
Ermondi, Giuseppe; Caron, Giulia; Lawrence, Raelene; Longo, Dario
2004-11-01
The selective inhibition of COX-2 isozymes should lead to a new generation of NSAIDs with significantly reduced side effects; e.g. celecoxib (Celebrex®) and rofecoxib (Vioxx®). To obtain inhibitors with higher selectivity it has become essential to gain additional insight into the details of the interactions between COX isozymes and NSAIDs. Although X-ray structures of COX-2 complexed with a small number of ligands are available, experimental data are missing for two well-known selective COX-2 inhibitors (rofecoxib and nimesulide) and docking results reported are controversial. We use a combination of a traditional docking procedure with a new computational tool (Contact Statistics analysis) that identifies the best orientation among a number of solutions to shed some light on this topic.
Evidence-based orthodontics. Current statistical trends in published articles in one journal.
Law, Scott V; Chudasama, Dipak N; Rinchuse, Donald J
2010-09-01
To ascertain the number, type, and overall usage of statistics in American Journal of Orthodontics and Dentofacial Orthopedics (AJODO) articles for 2008. These data were then compared to data from three previous years: 1975, 1985, and 2003. The frequency and distribution of statistics used in the AJODO original articles for 2008 were dichotomized into those using statistics and those not using statistics. Statistical procedures were then broadly divided into descriptive statistics (mean, standard deviation, range, percentage) and inferential statistics (t-test, analysis of variance). Descriptive statistics were used to make comparisons. In 1975, 1985, 2003, and 2008, AJODO published 72, 87, 134, and 141 original articles, respectively. The percentage of original articles using statistics was 43.1% in 1975, 75.9% in 1985, 94.0% in 2003, and 92.9% in 2008; the percentage of original articles using statistics stayed essentially the same from 2003 to 2008, with only a small 1.1% decrease. The percentage of articles using inferential statistical analyses was 23.7% in 1975, 74.2% in 1985, 92.9% in 2003, and 84.4% in 2008. Comparing AJODO publications in 2003 and 2008, there was an 8.5% increase in the use of descriptive articles (from 7.1% to 15.6%), and there was an 8.5% decrease in articles using inferential statistics (from 92.9% to 84.4%).
Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Cui, Jonathan J; Basques, Bryce A; Albert, Todd J; Grauer, Jonathan N
2018-04-09
The presence of missing data is a limitation of large datasets, including the National Surgical Quality Improvement Program (NSQIP). In addressing this issue, most studies use complete case analysis, which excludes cases with missing data, thus potentially introducing selection bias. Multiple imputation, a statistically rigorous approach that approximates missing data and preserves sample size, may be an improvement over complete case analysis. The present study aims to evaluate the impact of using multiple imputation in comparison with complete case analysis for assessing the associations between preoperative laboratory values and adverse outcomes following anterior cervical discectomy and fusion (ACDF) procedures. This is a retrospective review of prospectively collected data. Patients undergoing one-level ACDF were identified in NSQIP 2012-2015. Perioperative adverse outcome variables assessed included the occurrence of any adverse event, severe adverse events, and hospital readmission. Missing preoperative albumin and hematocrit values were handled using complete case analysis and multiple imputation. These preoperative laboratory levels were then tested for associations with 30-day postoperative outcomes using logistic regression. A total of 11,999 patients were included. Of this cohort, 63.5% of patients had missing preoperative albumin and 9.9% had missing preoperative hematocrit. When using complete case analysis, only 4,311 patients were studied. The removed patients were significantly younger, healthier, of a common body mass index, and male. Logistic regression analysis failed to identify either preoperative hypoalbuminemia or preoperative anemia as significantly associated with adverse outcomes. When employing multiple imputation, all 11,999 patients were included. Preoperative hypoalbuminemia was significantly associated with the occurrence of any adverse event and severe adverse events. Preoperative anemia was significantly associated with the occurrence of any adverse event, severe adverse events, and hospital readmission. Multiple imputation is a rigorous statistical procedure that is being increasingly used to address missing values in large datasets. Using this technique for ACDF avoided the loss of cases that may have affected the representativeness and power of the study and led to different results than complete case analysis. Multiple imputation should be considered for future spine studies. Copyright © 2018 Elsevier Inc. All rights reserved.
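The contrast between complete case analysis and multiple imputation can be illustrated with a small sketch. This is a simplified stochastic-regression imputation with Rubin's rules for pooling, not the full multiple imputation machinery used in the study, and it ignores uncertainty in the imputation model's own parameters. Variable names and the missingness rate are loosely modeled on the ACDF example and are illustrative only; the statsmodels package is assumed to be available.

```python
import numpy as np
import statsmodels.api as sm

def rubin_pool(estimates, within_variances):
    """Pool point estimates and variances from m imputed datasets (Rubin's rules)."""
    estimates = np.asarray(estimates)
    within = np.asarray(within_variances).mean(axis=0)
    m = len(estimates)
    qbar = estimates.mean(axis=0)
    between = estimates.var(axis=0, ddof=1)
    total = within + (1 + 1 / m) * between
    return qbar, np.sqrt(total)

# toy data: outcome depends on albumin, which is often missing
rng = np.random.default_rng(11)
n = 2000
age = rng.normal(55, 15, n)
albumin = 4.0 - 0.01 * (age - 55) + rng.normal(0, 0.4, n)
p_event = 1 / (1 + np.exp(-(-2.0 - 1.0 * (albumin - 4.0))))
event = rng.binomial(1, p_event)
albumin_obs = albumin.copy()
albumin_obs[rng.random(n) < 0.6] = np.nan            # roughly 60% missing

# m stochastic regression imputations, analysis of each, and pooling
miss = np.isnan(albumin_obs)
imp_model = sm.OLS(albumin_obs[~miss], sm.add_constant(age[~miss])).fit()
sigma = np.sqrt(imp_model.scale)
params, var_within = [], []
for _ in range(20):
    filled = albumin_obs.copy()
    pred = imp_model.predict(sm.add_constant(age[miss]))
    filled[miss] = pred + rng.normal(0, sigma, miss.sum())   # draw, do not plug in the mean
    X = sm.add_constant(np.column_stack([filled, age]))
    fit = sm.Logit(event, X).fit(disp=0)
    params.append(fit.params)
    var_within.append(fit.bse ** 2)
coef, se = rubin_pool(params, var_within)
print("pooled log-odds (const, albumin, age):", np.round(coef, 3), "SE:", np.round(se, 3))
```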
Analyzing longitudinal data with the linear mixed models procedure in SPSS.
West, Brady T
2009-09-01
Many applied researchers analyzing longitudinal data share a common misconception: that specialized statistical software is necessary to fit hierarchical linear models (also known as linear mixed models [LMMs], or multilevel models) to longitudinal data sets. Although several specialized statistical software programs of high quality are available that allow researchers to fit these models to longitudinal data sets (e.g., HLM), rapid advances in general purpose statistical software packages have recently enabled analysts to fit these same models when using preferred packages that also enable other more common analyses. One of these general purpose statistical packages is SPSS, which includes a very flexible and powerful procedure for fitting LMMs to longitudinal data sets with continuous outcomes. This article aims to present readers with a practical discussion of how to analyze longitudinal data using the LMMs procedure in the SPSS statistical software package.
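For analysts working outside SPSS, the same class of model can be fitted in other general purpose environments. The sketch below is a rough Python analogue using the statsmodels package (random intercept and slope, with REML estimation as the statsmodels default) on simulated longitudinal data; it is not the SPSS MIXED procedure itself.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# simulated longitudinal data: 100 subjects, 5 occasions each
rng = np.random.default_rng(2)
n_subj, n_occ = 100, 5
subj = np.repeat(np.arange(n_subj), n_occ)
time = np.tile(np.arange(n_occ), n_subj)
u0 = rng.normal(0, 1.0, n_subj)          # random intercepts
u1 = rng.normal(0, 0.3, n_subj)          # random slopes
y = 10 + 0.5 * time + u0[subj] + u1[subj] * time + rng.normal(0, 1, n_subj * n_occ)
data = pd.DataFrame({"y": y, "time": time, "subject": subj})

# linear mixed model with a random intercept and slope for time
model = smf.mixedlm("y ~ time", data, groups=data["subject"], re_formula="~time")
result = model.fit()
print(result.summary())
```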
Salazar, Gloria; Yeddula, Kalpana; Wicky, Stephan; Oklu, Ramhi; Ganguli, Suvranu; Waltman, Arthur C; Walker, Thomas G; Kalva, Sanjeeva P
2013-01-01
To compare complication rates in patients who have port-a-catheters inserted and left accessed for immediate use and those who have ports inserted but not accessed. In this retrospective, IRB-approved study, medical records of patients who received a port catheter between 9/2009 and 2/2010 were reviewed. The data collected included patient demographics, diagnosis, procedure and complications. The patients were categorized into two groups: accessed (patients in whom the port was accessed with a Huber needle for immediate intravenous use and the patient left the procedure area with needle indwelling) and control (patients in whom the ports were not accessed). Complications were classified according to Society of Interventional Radiology guidelines. Results are given as mean ±SD. Statistical analysis was performed with student t test and statistical significance was considered at P<.05. A total of 467 ports were placed in 465 patients (Men: 206); 10.7% in the accessed group (n=50, age: 60±13.9) and 89.3% in the control group (n=417, age: 59±13.5). There were no statistically significant differences in patient demographics between the groups. The overall complication rate was 0.6% (n=3). Two complications (hematoma causing skin necrosis and thrombosis of the port) occurred in the control group and one (infection) in the accessed group. Infection rates after procedures were 2% (1/50) in the accessed group and 0% (0/417) in the control group. There was no statistically significant difference in overall complication (P=.1) and infection (P=.1) rates among the groups. Leaving the port accessed immediately after placement does not increase the risk of infection or other complications.
NASA Astrophysics Data System (ADS)
Vallianatos, Filippos; Kouli, Maria
2013-08-01
The Digital Elevation Model (DEM) of the island of Crete, with a resolution of approximately 20 meters, was used to delineate watersheds by computing the flow direction and using it in the Watershed function. The Watershed function uses a raster of flow direction to determine contributing area. A routine Geographic Information Systems procedure was applied, and the watersheds as well as the stream network (using a threshold of 2000 cells, i.e. the minimum number of cells that constitute a stream) were extracted from the hydrologically corrected (free of sinks) DEM. A few thousand watersheds were delineated and their areal extents calculated. From these, 300 watersheds were finally selected for further analysis, as watersheds of extremely small area were excluded to avoid possible artifacts. Our analysis approach is based on the basic principles of complexity theory and the Tsallis entropy introduced in the frame of non-extensive statistical physics. This concept has been successfully used for the analysis of a variety of complex dynamic systems, including natural hazards, where fractality and long-range interactions are important. The analysis indicates that the statistical distribution of watershed areas can be successfully described with the theoretical estimations of non-extensive statistical physics, reflecting the complexity that characterizes their occurrence.
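A minimal sketch of fitting a Tsallis q-exponential to the complementary cumulative distribution of watershed areas is shown below. The areas are simulated from a heavy-tailed toy distribution rather than taken from the Crete DEM, and the starting values and bounds are arbitrary assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def q_exponential(a, q, a0):
    """Tsallis q-exponential survival function, e_q(-a/a0) = [1 + (q-1) a/a0]^(1/(1-q))."""
    return np.power(1 + (q - 1) * a / a0, 1 / (1 - q))

# illustrative watershed areas (km^2); a real analysis would use the ~300
# watersheds extracted from the DEM
rng = np.random.default_rng(9)
areas = np.sort(rng.pareto(1.5, 300) * 5 + 1)
ccdf = 1 - np.arange(1, len(areas) + 1) / (len(areas) + 1)   # empirical P(A > a)

popt, _ = curve_fit(q_exponential, areas, ccdf, p0=(1.3, 20.0),
                    bounds=([1.01, 0.1], [3.0, 1000.0]))
print(f"fitted q = {popt[0]:.2f}, characteristic area a0 = {popt[1]:.1f} km^2")
```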
Osland, Emma; Yunus, Rossita M; Khan, Shahjahan; Memon, Breda; Memon, Muhammed A
2016-06-01
Laparoscopic Roux-en-Y gastric bypass (LRYGB) and laparoscopic vertical sleeve gastrectomy (LVSG) have been proposed as cost-effective strategies to manage obesity-related chronic disease. The objectives of this meta-analysis and systematic review were to analyze the "late postoperative complication rate (>30 days)" for these 2 procedures. Randomized controlled trials (RCTs) published between 2000 and 2015 comparing the late complication rates, that is, >30 days following LVSG and LRYGB in an adult population (ie, 16 y and above), were selected from PubMed, Medline, Embase, Science Citation Index, Current Contents, and the Cochrane database. The outcome variables analyzed included mortality rate, major and minor complications, interventions required for their management, and readmission rates. A random effects model was used to calculate the effect size of both binary and continuous data. Heterogeneity among the outcome variables of these trials was determined by the Cochran Q statistic and the I² index. The meta-analysis was prepared in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Six RCTs involving a total of 685 patients (LVSG, n=345; LRYGB, n=340) reported late major complications. A statistically nonsignificant reduction in relative odds favoring the LVSG procedure was observed [odds ratio (OR), 0.64; 95% confidence interval (CI), 0.21-1.97; P=0.4]. Four RCTs representing 408 patients (LVSG, n=208; LRYGB, n=200) reported late minor complications. A statistically nonsignificant reduction of 36% in relative odds favoring the LVSG procedure was observed (OR, 0.64; 95% CI, 0.28-1.47; P=0.3). A 37% relative reduction in odds in favor of the LVSG was observed for the need for additional interventions to manage late postoperative complications, but this did not reach statistical significance (OR, 0.63; 95% CI, 0.19-2.05; P=0.4). No study specifically reported readmissions required for the management of late complications. This meta-analysis and systematic review of RCTs shows that the development of late (major and minor) complications is similar between the LVSG and LRYGB procedures, 6 months to 3 years postoperatively, and that neither procedure leads to a higher readmission or reoperation rate. However, longer-term surveillance is required to accurately describe the patterns of late complications in these patients.
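The pooling machinery referred to here (random effects model, Cochran Q, I²) follows the standard DerSimonian-Laird construction, which can be sketched briefly. The six log odds ratios and variances below are illustrative numbers, not data extracted from the review.

```python
import numpy as np
from scipy import stats

def dersimonian_laird(yi, vi):
    """Random-effects pooling of study effects (e.g., log odds ratios) with the
    DerSimonian-Laird estimate of between-study variance, plus Cochran's Q and I^2."""
    yi, vi = np.asarray(yi, float), np.asarray(vi, float)
    w = 1 / vi
    fixed = np.sum(w * yi) / np.sum(w)
    Q = np.sum(w * (yi - fixed) ** 2)
    df = len(yi) - 1
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    i2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0
    w_star = 1 / (vi + tau2)
    pooled = np.sum(w_star * yi) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    ci = pooled + np.array([-1.96, 1.96]) * se
    p = 2 * stats.norm.sf(abs(pooled / se))
    return pooled, ci, p, Q, i2, tau2

log_or = np.log([0.55, 0.80, 0.40, 1.10, 0.62, 0.70])   # illustrative only
var = np.array([0.30, 0.25, 0.45, 0.35, 0.28, 0.40])
pooled, ci, p, Q, i2, tau2 = dersimonian_laird(log_or, var)
print(f"OR = {np.exp(pooled):.2f} (95% CI {np.exp(ci[0]):.2f}-{np.exp(ci[1]):.2f}), "
      f"p = {p:.2f}, Q = {Q:.2f}, I2 = {i2:.0f}%")
```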
Cosmetic surgery volume and its correlation with the major US stock market indices.
Gordon, Chad R; Pryor, Landon; Afifi, Ahmed M; Benedetto, Paul X; Langevin, C J; Papay, Francis; Yetman, Randall; Zins, James E
2010-01-01
As a consumer-driven industry, cosmetic plastic surgery is subject to ebbs and flows as the economy changes. There have been many predictions about the short, intermediate, and long-term impact on cosmetic plastic surgery as a result of difficulties in the current economic climate, but no studies published in the literature have quantified a direct correlation. The authors investigate a possible correlation between cosmetic surgery volume and the economic trends of the three major US stock market indices. A volume analysis for the time period from January 1992 to October 2008 was performed (n = 7360 patients, n = 8205 procedures). Four cosmetic procedures-forehead lift (FL), rhytidectomy (Rh), breast augmentation (BA), and liposuction (Li)-were chosen; breast reduction (BRd), breast reconstruction (BRc), and carpal tunnel release (CTR) were selected for comparison. Case volumes for each procedure and fiscal quarter were compared to the trends of the S&P 500, Dow Jones (DOW), and NASDAQ (NASD) indices. Pearson correlation statistics were used to evaluate a relationship between the market index trends and surgical volume. P values <.05 were considered statistically significant. Three of the four cosmetic surgery procedures investigated (Rh, n = 1540; Li, n = 1291; BA, n = 1959) demonstrated a direct (ie, positive) statistical correlation to all three major market indices. FL (n =312) only correlated to the NASD (P = .021) and did not reach significance with the S&P 500 (P = .077) or DOW (P = .14). BRd and BRc demonstrated a direct correlation to two of the three stock market indices, whereas CTR showed an inverse (ie, negative) correlation to two of the three indices. This study, to our knowledge, is the first to suggest a direct correlation of four cosmetic and two reconstructive plastic surgery procedures to the three major US stock market indices and further emphasizes the importance of a broad-based plastic surgery practice in times of economic recession.
Vaidya, Sharad; Parkash, Hari; Bhargava, Akshay; Gupta, Sharad
2014-01-01
Abundant resources and techniques have been used for complete coverage crown fabrication. Conventional investing and casting procedures for phosphate-bonded investments require a 2- to 4-h procedure before completion. Accelerated casting techniques have been used, but may not result in castings with matching marginal accuracy. The study measured the marginal gap and determined the clinical acceptability of single cast copings invested in a phosphate-bonded investment with the use of conventional and accelerated methods. One hundred and twenty cast coping samples were fabricated using conventional and accelerated methods, with three finish lines: chamfer, shoulder, and shoulder with bevel. Sixty copings were prepared with each technique. Each coping was examined with a stereomicroscope at four predetermined sites, and measurements of marginal gaps were documented for each. A master chart was prepared for all the data, which were analyzed using the Statistical Package for the Social Sciences (SPSS). Marginal gaps were then evaluated with the t-test; analysis of variance and post-hoc analysis were used to compare the two groups as well as the three subgroups. The measurements recorded showed no statistically significant difference between the conventional and accelerated groups. Among the three marginal designs studied, shoulder with bevel showed the best marginal fit with both conventional and accelerated casting techniques. The accelerated casting technique could be a viable alternative to the time-consuming conventional casting technique. The marginal fit between the two casting techniques showed no statistical difference.
Sakallioğlu, Oner; Düzer, Sertaç; Kapusuz, Zeliha
2014-01-01
The aim of our study was to investigate the efficacy of a suturation technique, performed after completing the tonsillectomy procedure, for posttonsillectomy pain control in adult patients. Between August 2010 and February 2011, 44 adult patients aged 16 to 41 years who underwent tonsillectomy at the Elaziğ Training and Research Hospital Otorhinolaryngology Clinic were included in the study. After the tonsillectomy procedure, the anterior and posterior tonsillar arches were sutured to each other, thereby increasing the area of the tonsillectomy lodges covered with mucosa. Twenty-two patients who received posttonsillectomy suturation served as the study group, and the remaining 22 patients who did not receive posttonsillectomy suturation served as the control group. The visual analogue scale (VAS) was used to evaluate the degree of postoperative pain (0, no pain; 10, worst pain). An ANOVA (two-way classification with repeated measures) was used for statistical analysis of the VAS values. P < 0.05 was accepted as statistically significant. The effect of time (each postoperative day) on VAS values was significant. The differences in mean VAS values between the study and control groups on postoperative days 1, 3, 7, and 10 were statistically significant (P < 0.05). The severity of posttonsillectomy pain was lower in the study group patients than in the control group patients. Suturation of the anterior and posterior tonsillar arches after the tonsillectomy procedure was found to be effective in alleviating posttonsillectomy pain in adult patients.
Camara, Jorge G.; Ruszkowski, Joseph M.; Worak, Sandra R.
2008-01-01
Context: Music and surgery. Objective: To determine the effect of live classical piano music on vital signs of patients undergoing ophthalmic surgery. Design: Retrospective case series. Setting and Patients: 203 patients who underwent various ophthalmologic procedures in a period during which a piano was present in the operating room of St. Francis Medical Center. [Note: St. Francis Medical Center has recently been renamed Hawaii Medical Center East.] Intervention: Demographic data, surgical procedures, and the vital signs of 203 patients who underwent ophthalmic procedures were obtained from patient records. Blood pressure, heart rate, and respiratory rate measured in the preoperative holding area were compared with the same parameters taken in the operating room, with and without exposure to live piano music. A paired t-test was used for statistical analysis. Main outcome measure: Mean arterial pressure, heart rate, and respiratory rate. Results: 115 patients who were exposed to live piano music showed a statistically significant decrease in mean arterial blood pressure, heart rate, and respiratory rate in the operating room compared with their vital signs measured in the preoperative holding area (P < .0001). The control group of 88 patients not exposed to live piano music showed a statistically significant increase in mean arterial blood pressure (P < .0002) and heart rate and respiratory rate (P < .0001). Conclusion: Live classical piano music lowered the blood pressure, heart rate, and respiratory rate in patients undergoing ophthalmic surgery. PMID:18679538
Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.
Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N
2016-01-01
Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome: underestimation in LIFEREG and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
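The product-of-coefficients logic under a Weibull AFT data-generating model can be sketched as follows. In this censoring-free toy example, ordinary least squares on log survival time stands in for a parametric AFT fit such as LIFEREG (with censored data a proper AFT routine would be required), the standard error uses the simple Sobel formula, and all parameter values are arbitrary.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 1000
x = rng.binomial(1, 0.5, n)                       # treatment indicator
a = 0.5
m = a * x + rng.normal(0, 1, n)                   # mediator model: M = a*X + error
b, c_prime, sigma = 0.4, 0.1, 0.8
# Weibull AFT: log T = 2 + c'*X + b*M + sigma*W, with W a standard (minimum)
# extreme value variate; log of an Exp(1) draw has that distribution
w = np.log(rng.exponential(1, n))
t = np.exp(2 + c_prime * x + b * m + sigma * w)

# a path: regress M on X;  b path: regress log T on X and M (no censoring here,
# so OLS is a stand-in for a parametric AFT fit)
fit_a = sm.OLS(m, sm.add_constant(x)).fit()
fit_b = sm.OLS(np.log(t), sm.add_constant(np.column_stack([x, m]))).fit()
a_hat, b_hat = fit_a.params[1], fit_b.params[2]
indirect = a_hat * b_hat                           # product-of-coefficients mediated effect
se = np.sqrt(a_hat**2 * fit_b.bse[2]**2 + b_hat**2 * fit_a.bse[1]**2)  # Sobel SE
print(f"indirect effect on the log-time scale: {indirect:.3f} (SE {se:.3f}); true a*b = {a*b:.3f}")
```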
BTS statistical standards manual
DOT National Transportation Integrated Search
2005-10-01
The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...
Patel, Nitesh V; Sundararajan, Sri; Keller, Irwin; Danish, Shabbar
2018-01-01
Objective: Magnetic resonance (MR)-guided stereotactic laser amygdalohippocampectomy is a minimally invasive procedure for the treatment of refractory epilepsy in patients with mesial temporal sclerosis. Limited data exist on post-ablation volumetric trends associated with the procedure. Methods: 10 patients with mesial temporal sclerosis underwent MR-guided stereotactic laser amygdalohippocampectomy. Three independent raters computed ablation volumes at the following time points: pre-ablation (PreA), immediate post-ablation (IPA), 24 hours post-ablation (24PA), first follow-up post-ablation (FPA), and greater than three months follow-up post-ablation (>3MPA), using OsiriX DICOM Viewer (Pixmeo, Bernex, Switzerland). Statistical trends in post-ablation volumes were determined for the time points. Results: MR-guided stereotactic laser amygdalohippocampectomy produces a rapid rise and distinct peak in post-ablation volume immediately following the procedure. IPA volumes are significantly higher than all other time points. Comparing individual time points within each rater's dataset (intra-rater), a significant difference was seen between the IPA time point and all others. There was no statistical difference between the 24PA, FPA, and >3MPA time points. A correlation analysis demonstrated the strongest correlations at the 24PA (r=0.97), FPA (r=0.95), and >3MPA time points (r=0.99), with a weaker correlation at IPA (r=0.92). Conclusion: MR-guided stereotactic laser amygdalohippocampectomy produces a maximal increase in post-ablation volume immediately following the procedure, which decreases and stabilizes at 24 hours post-procedure and beyond three months follow-up. Based on the correlation analysis, the lower inter-rater reliability at the IPA time point suggests it may be less accurate to assess volume at this time point. We recommend post-ablation volume assessments be made at least 24 hours post-selective ablation of the amygdalohippocampal complex (SLAH).
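The inter-rater correlation check described above could be sketched as below, assuming each rater's ablation volumes at one time point are stored in an array; all values here are hypothetical, not taken from the study.

    import numpy as np
    from itertools import combinations
    from scipy import stats

    # Hypothetical ablation volumes (cm^3) measured by three raters at the IPA time point.
    volumes_ipa = {
        "rater_1": np.array([8.1, 6.4, 9.2, 7.5, 5.9]),
        "rater_2": np.array([7.6, 6.9, 8.8, 7.9, 6.3]),
        "rater_3": np.array([8.4, 6.1, 9.5, 7.2, 5.7]),
    }

    # Pairwise Pearson correlations between raters at this time point.
    for a, b in combinations(volumes_ipa, 2):
        r, p = stats.pearsonr(volumes_ipa[a], volumes_ipa[b])
        print(f"{a} vs {b}: r = {r:.2f} (p = {p:.3f})")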
Economic analysis of the future growth of cosmetic surgery procedures.
Liu, Tom S; Miller, Timothy A
2008-06-01
The economic growth of cosmetic surgical and nonsurgical procedures has been tremendous. Between 1992 and 2005, annual U.S. cosmetic surgery volume increased by 725 percent, with over $10 billion spent in 2005. It is unknown whether this growth will continue for the next decade and, if so, what impact it will have on the plastic surgeon workforce. The authors analyzed annual U.S. cosmetic surgery procedure volume reported by the American Society of Plastic Surgeons (ASPS) National Clearinghouse of Plastic Surgery Statistics between 1992 and 2005. Reconstructive plastic surgery volume was not included in the analysis. The authors analyzed the ability of economic and noneconomic variables to predict annual cosmetic surgery volume. The authors also used growth rate analyses to construct models with which to predict the future growth of cosmetic surgery. None of the economic and noneconomic variables was a significant predictor of annual cosmetic surgery volume. Instead, based on current compound annual growth rates, the authors predict that total cosmetic surgery volume (surgical and nonsurgical) will exceed 55 million annual procedures by 2015. ASPS members are projected to perform 299 surgical and 2165 nonsurgical annual procedures. Non-ASPS members are projected to perform 39 surgical and 1448 nonsurgical annual procedures. If current growth rates continue into the next decade, the future demand in cosmetic surgery will be driven largely by nonsurgical procedures. The growth of surgical procedures will be met by ASPS members. However, meeting the projected growth in nonsurgical procedures could be a potential challenge and a potential area for increased competition.
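A compound-annual-growth-rate (CAGR) projection of the kind described above can be sketched as follows; the 1992 and 2005 volumes and the projection horizon are placeholders, and the ASPS figures would be substituted in practice.

    # CAGR from two observed years, then projection forward under constant growth.
    v_1992, v_2005 = 1.0e6, 8.25e6          # hypothetical annual procedure counts
    years_observed = 2005 - 1992

    cagr = (v_2005 / v_1992) ** (1.0 / years_observed) - 1.0
    v_2015 = v_2005 * (1.0 + cagr) ** (2015 - 2005)

    print(f"CAGR 1992-2005: {cagr:.1%}")
    print(f"Projected 2015 volume: {v_2015:,.0f}")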
Failure Mode Identification Through Clustering Analysis
NASA Technical Reports Server (NTRS)
Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)
2002-01-01
Research has shown that nearly 80% of the costs and problems are created in product development and that cost and quality are essentially designed into products in the conceptual stage. Currently, failure identification procedures (such as FMEA (Failure Modes and Effects Analysis), FMECA (Failure Modes, Effects and Criticality Analysis) and FTA (Fault Tree Analysis)) and design of experiments are being used for quality control and for the detection of potential failure modes during the detail design stage or post-product launch. Though all of these methods have their own advantages, they do not indicate which predominant failures a designer should focus on while designing a product. This work uses a functional approach to identify failure modes, which hypothesizes that similarities exist between different failure modes based on the functionality of the product/component. In this paper, a statistical clustering procedure is proposed to retrieve information on the set of predominant failures that a function experiences. The various stages of the methodology are illustrated using a hypothetical design example.
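A generic sketch of the kind of clustering step described above is shown below: failure modes are represented by the functions under which they occurred and similar failure modes are grouped with k-means. The incidence matrix, failure-mode names, and function names are invented for illustration and this is not the authors' exact procedure.

    import numpy as np
    from sklearn.cluster import KMeans

    failure_modes = ["fatigue", "corrosion", "wear", "fracture", "seal leak"]
    functions = ["transfer torque", "contain fluid", "guide motion"]

    # Rows: failure modes; columns: counts of occurrences under each function.
    incidence = np.array([
        [5, 0, 2],
        [0, 4, 1],
        [3, 0, 4],
        [6, 1, 1],
        [0, 5, 0],
    ])

    # Group failure modes with similar functional signatures into two clusters.
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(incidence)
    for mode, label in zip(failure_modes, labels):
        print(f"{mode}: cluster {label}")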
Use of power analysis to develop detectable significance criteria for sea urchin toxicity tests
Carr, R.S.; Biedenbach, J.M.
1999-01-01
When sufficient data are available, the statistical power of a test can be determined using power analysis procedures. The term “detectable significance” has been coined to refer to this criterion based on power analysis and past performance of a test. This power analysis procedure has been performed with sea urchin (Arbacia punctulata) fertilization and embryological development data from sediment porewater toxicity tests. Data from 3100 and 2295 tests for the fertilization and embryological development tests, respectively, were used to calculate the criteria and regression equations describing the power curves. Using Dunnett's test, minimum significant differences (MSDs) (β = 0.05) of 15.5% and 19% for the fertilization test, and 16.4% and 20.6% for the embryological development test, were determined for α ≤ 0.05 and α ≤ 0.01, respectively. The use of this second criterion reduces type I (false positive) errors and helps to establish a critical level of difference based on the past performance of the test.
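In the spirit of the "detectable significance" criterion above, the sketch below solves for the smallest standardized effect a test can reliably detect given the number of replicates, alpha, and desired power. It uses a two-sample t-test approximation from statsmodels rather than the authors' Dunnett-based MSD calculation, and the replicate count and standard deviation are assumed values.

    from statsmodels.stats.power import TTestIndPower

    # Solve for detectable effect size at alpha = 0.05 and power = 0.95 (beta = 0.05),
    # assuming 5 replicates per treatment (a hypothetical design).
    analysis = TTestIndPower()
    detectable_d = analysis.solve_power(effect_size=None, nobs1=5, alpha=0.05, power=0.95)

    # Convert the standardized effect back to a percentage difference, assuming a
    # hypothetical among-replicate standard deviation of 10 percentage points.
    sd_percent = 10.0
    print(f"Detectable difference: {detectable_d * sd_percent:.1f} percentage points")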
77 FR 53889 - Statement of Organization, Functions, and Delegations of Authority
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-04
..., methods, and statistical procedures for assessing and monitoring the health of communities and measuring... methods and the Community Guide, and coordinates division responses to requests for technical assistance...-federal partners in developing indicators, methods, and statistical procedures for measuring and reporting...
10 CFR Appendix II to Part 504 - Fuel Price Computation
Code of Federal Regulations, 2010 CFR
2010-01-01
... 504—Fuel Price Computation (a) Introduction. This appendix provides the equations and parameters... inflation indices must follow standard statistical procedures and must be fully documented within the... the weighted average fuel price must follow standard statistical procedures and be fully documented...
NASA Technical Reports Server (NTRS)
Nguyen, Truong X.; Ely, Jay J.; Koppen, Sandra V.
2001-01-01
This paper describes the implementation of the mode-stirred method for susceptibility testing according to the current DO-160D standard. Test results on an Engine Data Processor using the implemented procedure and the comparisons with the standard anechoic test results are presented. The comparison experimentally shows that the susceptibility thresholds found with the mode-stirred method are consistently higher than the anechoic results. This is consistent with the recent statistical analysis finding by NIST that the current calibration procedure overstates field strength by a fixed amount. Once the test results are adjusted for this value, the comparisons with the anechoic results are excellent. The results also show that the test method has excellent chamber-to-chamber repeatability. Several areas for improvements to the current procedure are also identified and implemented.
Kosins, Aaron M; Scholz, Thomas; Cetinkaya, Mine; Evans, Gregory R D
2013-08-01
The purpose of this study was to determine the evidence-based value of prophylactic drainage of subcutaneous wounds in surgery. An electronic search was performed. Articles comparing subcutaneous prophylactic drainage with no drainage were identified and classified by level of evidence. If sufficient randomized controlled trials were included, a meta-analysis was performed using the random-effects model. Fifty-two randomized controlled trials were included in the meta-analysis, and subgroups were determined by specific surgical procedures or characteristics (cesarean delivery, abdominal wound, breast reduction, breast biopsy, femoral wound, axillary lymph node dissection, hip and knee arthroplasty, obesity, and clean-contaminated wound). Studies were compared for the following endpoints: hematoma, wound healing issues, seroma, abscess, and infection. Fifty-two studies with a total of 6930 operations were identified as suitable for this analysis. There were 3495 operations in the drain group and 3435 in the no-drain group. Prophylactic subcutaneous drainage offered a statistically significant advantage only for (1) prevention of hematomas in breast biopsy procedures and (2) prevention of seromas in axillary node dissections. In all other procedures studied, drainage did not offer an advantage. Many surgical operations can be performed safely without prophylactic drainage. Surgeons can consider omitting drains after cesarean section, breast reduction, abdominal wounds, femoral wounds, and hip and knee joint replacement. Furthermore, surgeons should consider not placing drains prophylactically in obese patients. However, drain placement following a surgical procedure is the surgeon's choice and can be based on multiple factors beyond the type of procedure being performed or the patient's body habitus. Therapeutic, II.
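A minimal DerSimonian-Laird random-effects pooling, the kind of model named above, is sketched below. The effect sizes and variances are invented; in practice they would be, for example, log odds ratios for seroma with their sampling variances from each trial.

    import numpy as np

    y = np.array([-0.40, -0.10, -0.55, 0.05, -0.30])   # per-study effect estimates
    v = np.array([0.04, 0.09, 0.06, 0.12, 0.05])        # per-study sampling variances

    w = 1.0 / v                                          # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)                      # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)              # between-study variance

    w_re = 1.0 / (v + tau2)                              # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))

    print(f"tau^2 = {tau2:.3f}, pooled effect = {y_re:.3f} (SE {se_re:.3f})")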
Management of sigmoid volvulus: options and prognosis.
Maddah, Ghodratollah; Kazemzadeh, Gholam Hossein; Abdollahi, Abbas; Bahar, Mostafa Mehrabi; Tavassoli, Alireza; Shabahang, Hossein
2014-01-01
To describe the management of sigmoid volvulus with reference to the type of surgical procedures performed and to determine the prognosis of sigmoid volvulus. A case series. Ghaem Hospital of Mashhad, University of Medical Sciences, Mashhad, Iran, from 1996 to 2008. A total of 944 cases of colon obstruction were reviewed. Demographic, laboratory and treatment results, mortality and complications were recorded. The data was analyzed using descriptive statistics such as frequency and percentage for the qualitative variables and mean and standard deviation values for the quantitative variables. Also, chi-square and Fisher's exact tests were used for the association between the qualitative variables. SPSS statistical software (version 18) was used for the data analysis. In all patients except those with symptoms or signs of gangrenous bowel, a long rectal tube was inserted via the rectosigmoidoscope, which was successful in 80 (36.87%) cases. Rectosigmoidoscopic detorsion was unsuccessful in 137 (63.13%) patients, who underwent an emergent laparotomy. The surgical procedures performed in these cases were resection and primary anastomosis in 40 (29.1%), Mikulicz procedure in 9 (6.6%), laparotomy detorsion in 37 (27.01%), Hartmann procedure in 47 (34.3%), mesosigmoidoplasty in 3 (2.19%) patients and total colectomy in one (0.73%) case. The overall mortality was 9.8% (22 patients). In sigmoid volvulus, the most important determinant of patient outcome is bowel viability. The initial treatment of sigmoid colon volvulus is sigmoidoscopy with rectal tube placement.
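The chi-square and Fisher's exact comparisons mentioned above could be run as in the sketch below, for example comparing mortality between two surgical procedures; the 2x2 counts are hypothetical.

    import numpy as np
    from scipy import stats

    # Rows: procedure A vs. procedure B; columns: died vs. survived (invented counts).
    table = np.array([[6, 34],
                      [4, 43]])

    chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
    odds_ratio, p_fisher = stats.fisher_exact(table)

    print(f"chi-square p = {p_chi2:.3f}")
    print(f"Fisher exact: OR = {odds_ratio:.2f}, p = {p_fisher:.3f}")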
Hudson, Michelle; Bhogal, Nirmala
2004-11-01
The statistics for animal procedures performed in 2003 were recently released by the Home Office. They indicate that, for the second year running, there was a significant increase in the number of laboratory animal procedures undertaken in Great Britain. The species and genera used, the numbers of toxicology and non-toxicology procedures, and the overall trends, are described. The implications of these latest statistics are discussed with reference to key areas of interest and to the impact of existing regulations and pending legislative reforms.
Model-Based Linkage Analysis of a Quantitative Trait.
Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H
2017-01-01
Linkage analysis is a family-based method for examining whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis, the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.
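A toy single-marker LOD computation is sketched below for a phase-known, fully informative family: with R recombinants observed in N scored meioses, LOD(theta) compares the likelihood at theta against the likelihood under no linkage (theta = 0.5). This is a textbook illustration under simplifying assumptions, not the S.A.G.E. LODLINK algorithm itself.

    import numpy as np

    def lod_score(n_meioses: int, n_recomb: int, theta: float) -> float:
        # Likelihood of the observed recombinant count at recombination fraction theta,
        # compared with the null value theta = 0.5 (free recombination).
        l_theta = theta ** n_recomb * (1.0 - theta) ** (n_meioses - n_recomb)
        l_null = 0.5 ** n_meioses
        return np.log10(l_theta / l_null)

    # Example: 2 recombinants out of 20 meioses, evaluated at theta = 0.1.
    print(f"LOD(0.1) = {lod_score(20, 2, 0.1):.2f}")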
Zackay, Arie; Steinhoff, Christine
2010-12-15
Background Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously there is an arising need for tools to process and analyse the data together with statistical investigation and visualisation. Findings MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as is typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them and performs quality control comparison. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, as well as a comparison to existing DNA methylation analysis tools and its workflow based on two datasets, is presented in this paper. Conclusions The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data. Therefore, any data analysis within R that allows the integration of various data types as provided from different technological platforms is convenient. It is the first and so far the only specific package for DNA methylation analysis, in particular for bisulfite sequenced data, available in the R/Bioconductor environment. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org. PMID:21159174
Hoffmann, Jürgen; Wallwiener, Diethelm
2009-04-08
One of the basic prerequisites for generating evidence-based data is the availability of classification systems. Attempts to date to classify breast cancer operations have focussed on specific problems, e.g. the avoidance of secondary corrective surgery for surgical defects, rather than taking a generic approach. Starting from an existing, simpler empirical scheme based on the complexity of breast surgical procedures, which was used in-house primarily in operative report-writing, a novel classification of ablative and breast-conserving procedures initially needed to be developed and elaborated systematically. To obtain proof of principle, a prospectively planned analysis of patient records for all major breast cancer-related operations performed at our breast centre in 2005 and 2006 was conducted using the new classification. Data were analysed using basic descriptive statistics such as frequency tables. A novel two-type, six-tier classification system comprising 12 main categories, 13 subcategories and 39 sub-subcategories of oncological, oncoplastic and reconstructive breast cancer-related surgery was successfully developed. Our system permitted unequivocal classification, without exception, of all 1225 procedures performed in 1166 breast cancer patients in 2005 and 2006. Breast cancer-related surgical procedures can be generically classified according to their surgical complexity. Analysis of all major procedures performed at our breast centre during the study period provides proof of principle for this novel classification system. We envisage various applications for this classification, including uses in randomised clinical trials, guideline development, specialist surgical training, continuing professional development as well as quality of care and public health research.
Subcellular object quantification with Squassh3C and SquasshAnalyst.
Rizk, Aurélien; Mansouri, Maysam; Ballmer-Hofer, Kurt; Berger, Philipp
2015-11-01
Quantitative image analysis plays an important role in contemporary biomedical research. Squassh is a method for automatic detection, segmentation, and quantification of subcellular structures and analysis of their colocalization. Here we present the applications Squassh3C and SquasshAnalyst. Squassh3C extends the functionality of Squassh to three fluorescence channels and live-cell movie analysis. SquasshAnalyst is an interactive web interface for the analysis of Squassh3C object data. It provides segmentation image overview and data exploration, figure generation, object and image filtering, and a statistical significance test in an easy-to-use interface. The overall procedure combines the Squassh3C plug-in for the free biological image processing program ImageJ and a web application working in conjunction with the free statistical environment R, and it is compatible with Linux, MacOS X, or Microsoft Windows. Squassh3C and SquasshAnalyst are available for download at www.psi.ch/lbr/SquasshAnalystEN/SquasshAnalyst.zip.
Takeyoshi, Masahiro; Sawaki, Masakuni; Yamasaki, Kanji; Kimber, Ian
2003-09-30
The murine local lymph node assay (LLNA) is used for the identification of chemicals that have the potential to cause skin sensitization. However, it requires specific facility and handling procedures to accommodate a radioisotopic (RI) endpoint. We have developed a non-radioisotopic (non-RI) endpoint for the LLNA based on BrdU incorporation to avoid the use of RI. Although this alternative method appears viable in principle, it is somewhat less sensitive than the standard assay. In this study, we report investigations into the use of statistical analysis to improve the sensitivity of a non-RI LLNA procedure, using alpha-hexylcinnamic aldehyde (HCA) in two separate experiments. The alternative non-RI method required HCA concentrations of greater than 25% to elicit a positive response based on the criterion for classification as a skin sensitizer in the standard LLNA. Nevertheless, dose responses to HCA in the alternative method were consistent in both experiments, and we examined whether the use of an endpoint based upon the statistical significance of induced changes in LNC turnover, rather than an SI of 3 or greater, might provide additional sensitivity. The results reported here demonstrate that, with HCA at least, significant responses were recorded in each of two experiments following exposure of mice to 25% HCA. These data suggest that this approach may be more satisfactory, at least when BrdU incorporation is measured. However, this modification of the LLNA is rather less sensitive than the standard method when employing a statistical endpoint. Taken together, the data reported here suggest that a modified LLNA in which BrdU is used in place of radioisotope incorporation shows some promise, but that in its present form, even with the use of a statistical endpoint, it lacks some of the sensitivity of the standard method. The challenge is to develop strategies for further refinement of this approach.
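The two readouts contrasted above can be sketched side by side: the stimulation index (SI >= 3 relative to the vehicle control) versus a statistical comparison of treated and control groups. The BrdU incorporation values and group sizes below are invented for illustration.

    import numpy as np
    from scipy import stats

    # Hypothetical BrdU incorporation values per animal (arbitrary units).
    vehicle = np.array([1.0, 1.2, 0.9, 1.1, 1.0])
    hca_25pct = np.array([2.4, 2.9, 2.2, 2.7, 2.5])   # assumed 25% HCA group

    # Stimulation index: treated group mean relative to the vehicle control mean.
    si = hca_25pct.mean() / vehicle.mean()

    # Statistical endpoint: two-sample t-test of treated vs. control.
    t_stat, p_value = stats.ttest_ind(hca_25pct, vehicle)

    print(f"SI = {si:.2f} (positive if >= 3), t-test p = {p_value:.4f}")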
NASA Astrophysics Data System (ADS)
Olugboji, T. M.; Lekic, V.; McDonough, W.
2017-07-01
We present a new approach for evaluating existing crustal models using ambient noise data sets and their associated uncertainties. We use a transdimensional hierarchical Bayesian inversion approach to invert ambient noise surface wave phase dispersion maps for Love and Rayleigh waves using measurements obtained from Ekström (2014). Spatiospectral analysis shows that our results are comparable to those of a linear least squares inverse approach (except at higher harmonic degrees), but the procedure has additional advantages: (1) it yields an autoadaptive parameterization that follows Earth structure without making restricting assumptions on model resolution (regularization or damping) and data errors; (2) it can recover non-Gaussian phase velocity probability distributions while quantifying the sources of uncertainties in the data measurements and modeling procedure; and (3) it enables statistical assessments of different crustal models (e.g., CRUST1.0, LITHO1.0, and NACr14) using variable resolution residual and standard deviation maps estimated from the ensemble. These assessments show that in the stable old crust of the Archean, the misfits are statistically negligible, requiring no significant update to crustal models from the ambient noise data set. In other regions of the U.S., significant updates to regionalization and crustal structure are expected, especially in the shallow sedimentary basins and the tectonically active regions, where the differences between model predictions and data are statistically significant.
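A much-simplified version of the model-assessment idea above is sketched below: standardize the misfit between a crustal model's predicted phase velocities and the ensemble-derived observations using the ensemble standard deviation, and flag cells where the difference is statistically significant. All arrays here are synthetic placeholders.

    import numpy as np

    rng = np.random.default_rng(1)
    observed = 3.6 + 0.1 * rng.standard_normal((10, 10))   # km/s, per map cell
    sigma = np.full((10, 10), 0.05)                        # ensemble standard deviation
    predicted = observed + rng.normal(0.0, 0.03, (10, 10)) # e.g., a crustal model's predictions

    z = (predicted - observed) / sigma                     # standardized residual map
    significant = np.abs(z) > 1.96                         # ~5% two-sided threshold

    print(f"{significant.mean():.0%} of cells show statistically significant misfit")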
NASA Astrophysics Data System (ADS)
Hoell, Simon; Omenzetter, Piotr
2017-04-01
The increasing demand for carbon neutral energy in a challenging economic environment is a driving factor for erecting ever larger wind turbines in harsh environments using novel wind turbine blade (WTB) designs characterized by high flexibility and lower buckling capacity. To counteract the resulting increase in operation and maintenance costs, efficient structural health monitoring systems can be employed to prevent dramatic failures and to schedule maintenance actions according to the true structural state. This paper presents a novel methodology for classifying structural damage using vibrational responses from a single sensor. The method is based on statistical classification using Bayes' theorem and an advanced statistic that allows the performance to be controlled by varying the number of samples representing the current state. This is done for multivariate damage sensitive features (DSFs) defined as partial autocorrelation coefficients (PACCs) estimated from vibrational responses, and principal component analysis scores derived from the PACCs. Additionally, optimal DSFs are composed not only for damage classification but also for damage detection based on binary statistical hypothesis testing, where feature selections are found with a fast forward procedure. The method is applied to laboratory experiments with a small-scale WTB with wind-like excitation and non-destructive damage scenarios. The obtained results demonstrate the advantages of the proposed procedure and are promising for future applications of vibration-based structural health monitoring in WTBs.
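A compressed sketch of the feature pipeline described above follows: partial autocorrelation coefficients (PACCs) from single-sensor responses, principal component scores, and a simple Bayes classifier. The AR-process "measurements" and the two damage states are synthetic stand-ins for the laboratory WTB data, and the naive Bayes step is a simplification of the authors' classification statistic.

    import numpy as np
    from statsmodels.tsa.stattools import pacf
    from sklearn.decomposition import PCA
    from sklearn.naive_bayes import GaussianNB

    rng = np.random.default_rng(2)

    def simulate_response(ar_coeff: float, n: int = 1024) -> np.ndarray:
        # AR(1) signal standing in for a single-sensor vibration response.
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = ar_coeff * x[t - 1] + rng.standard_normal()
        return x

    # 40 records per state; the AR coefficient shifts slightly with "damage".
    states = {0: 0.60, 1: 0.70}
    features, labels = [], []
    for label, coeff in states.items():
        for _ in range(40):
            signal = simulate_response(coeff)
            features.append(pacf(signal, nlags=10)[1:])   # drop the lag-0 term
            labels.append(label)

    # Principal component scores of the PACC features, then a Bayes classifier.
    X = PCA(n_components=4).fit_transform(np.array(features))
    y = np.array(labels)
    clf = GaussianNB().fit(X[::2], y[::2])                # train on half the records
    accuracy = clf.score(X[1::2], y[1::2])                # test on the other half
    print(f"held-out classification accuracy: {accuracy:.2f}")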
Price, Charlotte; Stallard, Nigel; Creton, Stuart; Indans, Ian; Guest, Robert; Griffiths, David; Edwards, Philippa
2010-01-01
Acute inhalation toxicity of chemicals has conventionally been assessed by the median lethal concentration (LC50) test (Organisation for Economic Co-operation and Development (OECD) TG 403). Two new methods, the recently adopted acute toxic class method (ATC; OECD TG 436) and a proposed fixed concentration procedure (FCP), have been considered, but statistical evaluations of these methods did not investigate the influence of differential sensitivity between male and female rats on the outcomes. This paper presents an analysis of data from the assessment of acute inhalation toxicity for 56 substances. Statistically significant differences between the LC50 for males and females were found for 16 substances, with greater than 10-fold differences in the LC50 for two substances. The paper also reports a statistical evaluation of the three test methods in the presence of unanticipated gender differences. With TG 403, a gender difference leads to a slightly greater chance of under-classification. This is also the case for the ATC method, but more pronounced than for TG 403, with misclassification of nearly all substances from Globally Harmonised System (GHS) class 3 into class 4. As the FCP uses females only, if females are more sensitive, the classification is unchanged. If males are more sensitive, the procedure may lead to under-classification. Additional research on modification of the FCP is thus proposed. PMID:20488841
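A generic probit dose-response sketch for estimating an LC50 of the sort compared above is given below, fitted separately by sex to expose a sensitivity difference. The concentrations and mortality counts are invented, and the sketch assumes a recent statsmodels release that provides the Probit link class.

    import numpy as np
    import statsmodels.api as sm

    conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])          # mg/L, hypothetical
    deaths_f = np.array([0, 2, 5, 9, 10])                # females, n = 10 per group
    deaths_m = np.array([0, 1, 3, 6, 9])                 # males, n = 10 per group
    n_per_group = 10

    def lc50(deaths: np.ndarray) -> float:
        # Probit regression of mortality on log10 concentration; the LC50 is the
        # concentration at which the fitted probability of death equals 0.5.
        y = np.column_stack([deaths, n_per_group - deaths])   # successes, failures
        X = sm.add_constant(np.log10(conc))
        family = sm.families.Binomial(link=sm.families.links.Probit())
        b0, b1 = sm.GLM(y, X, family=family).fit().params
        return 10 ** (-b0 / b1)

    print(f"LC50 females: {lc50(deaths_f):.2f} mg/L, males: {lc50(deaths_m):.2f} mg/L")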
Analytical procedure validation and the quality by design paradigm.
Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno
2015-01-01
Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While the development and optimization of analytical procedures following QbD principles have been widely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, a control strategy can be set.
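A small sketch of the kind of probabilistic statement mentioned above follows: given the bias and intermediate precision estimated during validation, approximate the probability that a future reported result falls within acceptance limits (here +/- 5% of the true value). The bias, precision, and acceptance values are placeholders, and the normal-error Monte Carlo is a simplification of formal tolerance-interval methods.

    import numpy as np

    rng = np.random.default_rng(3)
    bias = 1.2            # % relative bias estimated from validation runs (assumed)
    precision_sd = 2.0    # % relative standard deviation, intermediate precision (assumed)
    acceptance = 5.0      # % acceptance limit on total error

    # Simulate total error of future results as bias plus random measurement error.
    future_errors = bias + rng.normal(0.0, precision_sd, 100_000)
    p_within = np.mean(np.abs(future_errors) < acceptance)
    print(f"P(|total error| < {acceptance}%) is approximately {p_within:.3f}")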