Code of Federal Regulations, 2011 CFR
2011-07-01
... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act ... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act ... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...
Hoyle, R H
1991-02-01
Indirect measures of psychological constructs are vital to clinical research. On occasion, however, the meaning of indirect measures of psychological constructs is obfuscated by statistical procedures that do not account for the complex relations between items and latent variables and among latent variables. Covariance structure analysis (CSA) is a statistical procedure for testing hypotheses about the relations among items that indirectly measure a psychological construct and relations among psychological constructs. This article introduces clinical researchers to the strengths and limitations of CSA as a statistical procedure for conceiving and testing structural hypotheses that are not tested adequately with other statistical procedures. The article is organized around two empirical examples that illustrate the use of CSA for evaluating measurement models with correlated error terms, higher-order factors, and measured and latent variables.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-20
... is calculated from tumor data of the cancer bioassays using a statistical extrapolation procedure... carcinogenic concern currently set forth in Sec. 500.84 utilizes a statistical extrapolation procedure that... procedures did not rely on a statistical extrapolation of the data to a 1 in 1 million risk of cancer to test...
Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P
2013-01-01
We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.
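For readers who want to experiment, here is a minimal sketch of the relative-power-change (ERD/ERS) computation the abstract describes, assuming single-channel SEEG epochs in a NumPy array; the filter order, band, and baseline window are illustrative choices, not the authors' exact pipeline:

```python
import numpy as np
from scipy import signal

def relative_power_change(epochs, fs, band, baseline=(0.0, 0.5)):
    """ERD/ERS as percentage power change relative to a pre-stimulus baseline.

    epochs   : array (n_trials, n_samples), single-channel SEEG epochs
    fs       : sampling rate in Hz
    band     : (low, high) frequency band of interest in Hz
    baseline : baseline window in seconds from epoch start (assumed pre-stimulus)
    """
    # band-pass filter, then estimate instantaneous power via the Hilbert envelope
    sos = signal.butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = signal.sosfiltfilt(sos, epochs, axis=1)
    power = np.abs(signal.hilbert(filtered, axis=1)) ** 2

    # average across trials, then express as % change from the baseline window
    mean_power = power.mean(axis=0)
    b0, b1 = (int(t * fs) for t in baseline)
    base = mean_power[b0:b1].mean()
    return 100.0 * (mean_power - base) / base  # < 0: ERD, > 0: ERS
```

Statistical comparison between stimulation types can then be run per time-frequency bin on these relative power values, as the abstract outlines.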
Statistical Cost Estimation in Higher Education: Some Alternatives.
ERIC Educational Resources Information Center
Brinkman, Paul T.; Niwa, Shelley
Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs is also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…
Watanabe, Hiroshi
2012-01-01
Procedures of statistical analysis are reviewed to provide an overview of applications of statistics for general use. Topics dealt with are inference on a population, comparison of two populations with respect to means and probabilities, and multiple comparisons. This study is the second part of a series in which we survey medical statistics. Arguments related to statistical associations and regressions will be made in subsequent papers.
Determining the Statistical Significance of Relative Weights
ERIC Educational Resources Information Center
Tonidandel, Scott; LeBreton, James M.; Johnson, Jeff W.
2009-01-01
Relative weight analysis is a procedure for estimating the relative importance of correlated predictors in a regression equation. Because the sampling distribution of relative weights is unknown, researchers using relative weight analysis are unable to make judgments regarding the statistical significance of the relative weights. J. W. Johnson…
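For context, the point estimates whose significance this article addresses can be computed with a short NumPy sketch of Johnson's (2000) relative weights; the function and variable names are illustrative, not from the article:

```python
import numpy as np

def relative_weights(X, y):
    """Johnson's relative weights: decompose R^2 among correlated predictors."""
    n = len(y)
    Xs = (X - X.mean(0)) / X.std(0, ddof=1)   # standardized predictors
    ys = (y - y.mean()) / y.std(ddof=1)       # standardized criterion
    U, d, Vt = np.linalg.svd(Xs, full_matrices=False)
    Z = U @ Vt                    # orthogonal counterpart of X (Z'Z = I)
    Lam = (Vt.T * d) @ Vt         # loadings with X = Z @ Lam
    beta = Z.T @ ys               # OLS coefficients of y on the orthogonal Z
    eps = (Lam ** 2) @ (beta ** 2) / (n - 1) ** 2
    return eps                    # eps.sum() equals the model R^2
```

Because the sampling distribution of these weights is unknown, as the abstract notes, significance judgments typically fall back on resampling schemes built around estimates like these.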
A Statistical Test for Comparing Nonnested Covariance Structure Models.
ERIC Educational Resources Information Center
Levy, Roy; Hancock, Gregory R.
While statistical procedures are well known for comparing hierarchically related (nested) covariance structure models, statistical tests for comparing nonhierarchically related (nonnested) models have proven more elusive. Although isolated attempts have been made, none exists within the commonly used maximum likelihood estimation framework, thereby…
Applications of statistics to medical science (1) Fundamental concepts.
Watanabe, Hiroshi
2011-01-01
The conceptual framework of statistical tests and statistical inferences is discussed, and the epidemiological background of statistics is briefly reviewed. This study is one of a series in which we survey the basics of statistics and practical methods used in medical statistics. Arguments related to actual statistical analysis procedures will be made in subsequent papers.
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2013 CFR
2013-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2012 CFR
2012-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2014 CFR
2014-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2011 CFR
2011-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
The Statistical Power of Planned Comparisons.
ERIC Educational Resources Information Center
Benton, Roberta L.
Basic principles underlying statistical power are examined; and issues pertaining to effect size, sample size, error variance, and significance level are highlighted via the use of specific hypothetical examples. Analysis of variance (ANOVA) and related methods remain popular, although other procedures sometimes have more statistical power against…
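As a concrete illustration of how effect size, sample size, and significance level interact, here is a hedged sketch using statsmodels' power routines; the effect size and targets are hypothetical values of the kind used in such examples:

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# sample size per group for a medium effect (d = 0.5) at alpha = .05, power = .80
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(round(n_per_group))   # ~64 per group

# power actually achieved with only n = 30 per group under the same effect size
print(analysis.solve_power(effect_size=0.5, alpha=0.05, nobs1=30))  # well below .80
```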
Køppe, Simo; Dammeyer, Jesper
2014-09-01
The evolution of developmental psychology has been characterized by the use of different quantitative and qualitative methods and procedures. But how does the use of methods and procedures change over time? This study explores the change and development of statistical methods used in articles published in Child Development from 1930 to 2010. The methods used in every article in the first issue of every volume were categorized into four categories. Until 1980, relatively simple statistical methods were used. During the last 30 years there has been an explosive increase in the use of more advanced statistical methods. The absence of statistical methods, or the use of only simple ones, has been all but eliminated.
ERIC Educational Resources Information Center
Hoover, H. D.; Plake, Barbara
The relative power of the Mann-Whitney statistic, the t-statistic, the median test, a test based on exceedances (A,B), and two special cases of (A,B), the Tukey quick test and the revised Tukey quick test, was investigated via a Monte Carlo experiment. These procedures were compared across four population probability models: uniform, beta, normal,…
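A minimal sketch of this kind of Monte Carlo power comparison for two of the procedures studied (the t-statistic and the Mann-Whitney statistic) under a uniform population model; the location shift, sample size, and replication count are illustrative, not the study's settings:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, shift, reps, alpha = 20, 0.25, 5000, 0.05
rejections = {"t": 0, "mann-whitney": 0}

for _ in range(reps):
    x = rng.uniform(size=n)          # uniform population model
    y = rng.uniform(size=n) + shift  # location-shifted alternative
    if stats.ttest_ind(x, y).pvalue < alpha:
        rejections["t"] += 1
    if stats.mannwhitneyu(x, y, alternative="two-sided").pvalue < alpha:
        rejections["mann-whitney"] += 1

for test, count in rejections.items():
    print(f"{test}: empirical power = {count / reps:.3f}")
```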
Statistical analysis of radioimmunoassay. In comparison with bioassay (in Japanese)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakano, R.
1973-01-01
Using the data of RIA (radioimmunoassay), statistical procedures for dealing with two problems, the linearization of the dose-response curve and the calculation of relative potency, were described. There were three methods for linearization of the dose-response curve of RIA. In each method, the following parameters were shown on the horizontal and vertical axes: dose x, (B/T)^(-1); c/(x + c), B/T (c: the dose which makes B/T 50%); log x, logit B/T. Among them, the last method seems to be most practical. The statistical procedures for bioassay were employed for calculating the relative potency of unknown samples compared to the standard samples from dose-response curves of standard and unknown samples using the regression coefficient. It is desirable that relative potency be calculated by plotting more than 5 points on the standard curve and more than 2 points for the unknown samples. For examining the statistical limit of precision of measurement, LH activity of gonadotropin in urine was measured, and the relative potency, precision coefficient, and upper and lower limits of relative potency at the 95% confidence limit were calculated. On the other hand, bioassay (by the ovarian ascorbic acid reduction method and the anterior lobe of prostate weighing method) was done on the same samples, and the precision was compared with that of RIA. In these examinations, the upper and lower limits of the relative potency at the 95% confidence limit were near each other, while in bioassay a considerable difference was observed between the upper and lower limits. The necessity of standardization and systematization of the statistical procedures for increasing the precision of RIA was pointed out. (JA)
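A sketch of the third (logit) linearization and a parallel-line relative-potency calculation of the kind described above; the dose-response numbers are invented for illustration, and the parallel-slope assumption is simply imposed rather than tested:

```python
import numpy as np

def logit(p):
    return np.log(p / (1 - p))

# illustrative dose-response data: B/T falls with log dose (values are made up)
dose_std = np.array([1, 2, 4, 8, 16], dtype=float)
bt_std   = np.array([0.82, 0.68, 0.50, 0.32, 0.18])
dose_unk = np.array([3, 12], dtype=float)
bt_unk   = np.array([0.60, 0.27])

# linearization: logit(B/T) versus log dose, fitted by least squares
slope, intercept = np.polyfit(np.log(dose_std), logit(bt_std), 1)

# parallel-line estimate: horizontal offset between unknown and standard lines
offset = np.mean(logit(bt_unk) - (slope * np.log(dose_unk) + intercept))
relative_potency = np.exp(offset / slope)
print(f"relative potency ~ {relative_potency:.2f}")
```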
From creation and annihilation operators to statistics
NASA Astrophysics Data System (ADS)
Hoyuelos, M.
2018-01-01
A procedure to derive the partition function of non-interacting particles with exotic or intermediate statistics is presented. The partition function is directly related to the associated creation and annihilation operators that obey some specific commutation or anti-commutation relations. The cases of Gentile statistics, quons, Polychronakos statistics, and ewkons are considered. Ewkons statistics was recently derived from the assumption of free diffusion in energy space (Hoyuelos and Sisterna, 2016); an ideal gas of ewkons has negative pressure, a feature that makes them suitable for the description of dark energy.
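As background for the approach described, the single-state partition function for Gentile statistics (maximum occupation p per single-particle state) follows from summing Boltzmann factors over the allowed occupation numbers; this is the standard textbook form, not a formula quoted from the paper:

```latex
z_k = \sum_{n=0}^{p} e^{-\beta n (\varepsilon_k - \mu)}
    = \frac{1 - e^{-\beta (p+1)(\varepsilon_k - \mu)}}{1 - e^{-\beta (\varepsilon_k - \mu)}},
\qquad
\ln Z = \sum_k \ln z_k .
```

Setting p = 1 recovers Fermi-Dirac statistics, and p → ∞ recovers Bose-Einstein statistics.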
NASA Technical Reports Server (NTRS)
Wong, K. W.
1974-01-01
In lunar phototriangulation, there is a complete lack of accurate ground control points. The accuracy analysis of the results of lunar phototriangulation must, therefore, be completely dependent on statistical procedure. It was the objective of this investigation to examine the validity of the commonly used statistical procedures, and to develop both mathematical techniques and computer software for evaluating (1) the accuracy of lunar phototriangulation; (2) the contribution of the different types of photo support data on the accuracy of lunar phototriangulation; (3) accuracy of absolute orientation as a function of the accuracy and distribution of both the ground and model points; and (4) the relative slope accuracy between any triangulated pass points.
Impaired Statistical Learning in Developmental Dyslexia
Thiessen, Erik D.; Holt, Lori L.
2015-01-01
Purpose Developmental dyslexia (DD) is commonly thought to arise from phonological impairments. However, an emerging perspective is that a more general procedural learning deficit, not specific to phonological processing, may underlie DD. The current study examined if individuals with DD are capable of extracting statistical regularities across sequences of passively experienced speech and nonspeech sounds. Such statistical learning is believed to be domain-general, to draw upon procedural learning systems, and to relate to language outcomes. Method DD and control groups were familiarized with a continuous stream of syllables or sine-wave tones, the ordering of which was defined by high or low transitional probabilities across adjacent stimulus pairs. Participants subsequently judged two 3-stimulus test items with either high or low statistical coherence as being the most similar to the sounds heard during familiarization. Results As with control participants, the DD group was sensitive to the transitional probability structure of the familiarization materials as evidenced by above-chance performance. However, the performance of participants with DD was significantly poorer than controls across linguistic and nonlinguistic stimuli. In addition, reading-related measures were significantly correlated with statistical learning performance of both speech and nonspeech material. Conclusion Results are discussed in light of procedural learning impairments among participants with DD. PMID:25860795
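The transitional probabilities that drive this kind of statistical-learning task are simple to compute; a self-contained sketch with a toy syllable stream (the stimuli here are invented, not the study's materials):

```python
from collections import Counter

def transitional_probabilities(stream):
    """P(B | A) for each adjacent pair A -> B in a syllable/tone stream."""
    pairs = Counter(zip(stream, stream[1:]))
    firsts = Counter(stream[:-1])
    return {(a, b): c / firsts[a] for (a, b), c in pairs.items()}

# toy familiarization stream: 'words' bi-du-po and ta-go-la concatenated
stream = "bi du po ta go la bi du po bi du po ta go la".split()
for pair, p in sorted(transitional_probabilities(stream).items()):
    print(pair, round(p, 2))   # within-word pairs ~1.0, across-word pairs lower
```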
Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleijnen, J.P.C.; Helton, J.C.
1999-04-01
The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
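A hedged sketch of the five increasingly complex pattern detectors listed above, applied to one input-output scatterplot with SciPy; the quantile binning into four classes is an illustrative choice, not the paper's exact analysis:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = rng.uniform(size=200)                      # sampled input variable
y = np.sin(3 * x) + rng.normal(0, 0.3, 200)    # model output with a trend

print("linear:    ", stats.pearsonr(x, y))     # (1) correlation coefficient
print("monotonic: ", stats.spearmanr(x, y))    # (2) rank correlation

# split y into four equal-size classes of increasing x
order = np.argsort(x)
groups = np.array_split(y[order], 4)

# (3) trend in central tendency: Kruskal-Wallis across the classes
print("location:  ", stats.kruskal(*groups))

# (4) trend in variability: interquartile range per class
print("spread:    ", [np.subtract(*np.percentile(g, [75, 25])) for g in groups])

# (5) deviation from randomness: chi-square on a coarse grid of (x, y) counts
grid, _, _ = np.histogram2d(x, y, bins=4)
print("randomness:", stats.chi2_contingency(grid)[1])   # p-value
```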
Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E
2015-03-01
Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect RS effects. Since missing data are often encountered in longitudinal HRQOL data, effective strategies to deal with missing data are important to consider. This study aims to compare different imputation methods on the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization (EM) imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical power of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9 and 31.3% had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted, but statistically significant changes were detected on the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for the physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. Our simulations revealed that relative importance test procedures were least powerful under the complete-case analysis method and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and to the imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.
Portillo, M C; Gonzalez, J M
2008-08-01
Molecular fingerprints of microbial communities are a common method for the analysis and comparison of environmental samples. The significance of differences between microbial community fingerprints was analyzed considering the presence of different phylotypes and their relative abundance. A method is proposed by simulating coverage of the analyzed communities as a function of sampling size applying a Cramér-von Mises statistic. Comparisons were performed by a Monte Carlo testing procedure. As an example, this procedure was used to compare several sediment samples from freshwater ponds using a relative quantitative PCR-DGGE profiling technique. The method was able to discriminate among different samples based on their molecular fingerprints, and confirmed the lack of differences between aliquots from a single sample.
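A minimal sketch of this style of comparison, pairing the two-sample Cramér-von Mises statistic with a Monte Carlo permutation test (SciPy ≥ 1.7 assumed for cramervonmises_2samp; the profile data are invented, not DGGE measurements):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def permutation_cvm(a, b, n_perm=999):
    """Monte Carlo p-value for the two-sample Cramér-von Mises statistic."""
    observed = stats.cramervonmises_2samp(a, b).statistic
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # relabel observations at random
        s = stats.cramervonmises_2samp(pooled[:len(a)], pooled[len(a):]).statistic
        count += s >= observed
    return (count + 1) / (n_perm + 1)

# toy relative-abundance profiles for two samples (phylotype intensities)
sample1 = rng.lognormal(0.0, 1.0, 40)
sample2 = rng.lognormal(0.5, 1.0, 40)
print("p =", permutation_cvm(sample1, sample2))
```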
Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.
Yago, Martín; Alcover, Silvia
2016-07-01
According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of its probability of rejecting an analytical run that contains critical size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)], has been proposed as an alternative QC performance measure because it is more related to the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) with the capability of the analytical process that allow for QC planning based on the risk of harm to a patient due to the report of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected by their high PEDC value are also characterized by a low value for Max E(NUF). The PEDC value can be used for estimating the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
ADEQUACY OF VISUALLY CLASSIFIED PARTICLE COUNT STATISTICS FROM REGIONAL STREAM HABITAT SURVEYS
Streamlined sampling procedures must be used to achieve a sufficient sample size with limited resources in studies undertaken to evaluate habitat status and potential management-related habitat degradation at a regional scale. At the same time, these sampling procedures must achi...
Code of Federal Regulations, 2012 CFR
2012-07-01
... carrying out a census survey or related activities. (7) Records may be disclosed for statistical research... research or reporting records; the records will only be transferred in a form that is not individually.... 3056. (iv) Used only to generate aggregate statistical data or for other similarly evaluative or...
Code of Federal Regulations, 2011 CFR
2011-07-01
... carrying out a census survey or related activities. (7) Records may be disclosed for statistical research... research or reporting records; the records will only be transferred in a form that is not individually.... 3056. (iv) Used only to generate aggregate statistical data or for other similarly evaluative or...
Code of Federal Regulations, 2010 CFR
2010-07-01
... carrying out a census survey or related activities. (7) Records may be disclosed for statistical research... research or reporting records; the records will only be transferred in a form that is not individually.... 3056. (iv) Used only to generate aggregate statistical data or for other similarly evaluative or...
Reproducibility-optimized test statistic for ranking genes in microarray studies.
Elo, Laura L; Filén, Sanna; Lahesmaa, Riitta; Aittokallio, Tero
2008-01-01
A principal goal of microarray studies is to identify the genes showing differential expression under distinct conditions. In such studies, the selection of an optimal test statistic is a crucial challenge, which depends on the type and amount of data under analysis. While previous studies on simulated or spike-in datasets do not provide practical guidance on how to choose the best method for a given real dataset, we introduce an enhanced reproducibility-optimization procedure, which enables the selection of a suitable gene-ranking statistic directly from the data. In comparison with existing ranking methods, the reproducibility-optimized statistic shows good performance consistently under various simulated conditions and on the Affymetrix spike-in dataset. Further, the feasibility of the novel statistic is confirmed in a practical research setting using data from an in-house cDNA microarray study of asthma-related gene expression changes. These results suggest that the procedure facilitates the selection of an appropriate test statistic for a given dataset without relying on a priori assumptions, which may bias the findings and their interpretation. Moreover, the general reproducibility-optimization procedure is not limited to detecting differential expression only but could be extended to a wide range of other applications as well.
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2014-01-01
Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways and therefore, the very existence of the statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further enhanced due to the wide range of statistical methodologies used for determining the POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' Rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
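Of the methodologies named, logistic regression is the one most commonly fit to Hit/Miss data; a hedged sketch of a POD curve fit with statsmodels, extracting a90 (the flaw size detected with 90% probability). The data and the log-size link are illustrative assumptions, not the paper's analysis:

```python
import numpy as np
import statsmodels.api as sm

# illustrative hit/miss data: detection becomes more likely as flaw size grows
size = np.array([1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 8, 8], dtype=float)
hit  = np.array([0, 0, 0, 1, 0, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1])

X = sm.add_constant(np.log(size))      # log flaw size is a common model choice
fit = sm.Logit(hit, X).fit(disp=0)
b0, b1 = fit.params

# a90: flaw size with 90% probability of detection under the fitted model
a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)
print(f"a90 ~ {a90:.2f} (same units as flaw size)")
```

Validation of such estimates (confidence bounds, sample-size adequacy) is exactly the requirement the abstract stresses; the point estimate alone is not a demonstrated capability.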
Computer Assisted Problem Solving in an Introductory Statistics Course. Technical Report No. 56.
ERIC Educational Resources Information Center
Anderson, Thomas H.; And Others
The computer assisted problem solving system (CAPS) described in this booklet administered "homework" problem sets designed to develop students' computational, estimation, and procedural skills. These skills were related to important concepts in an introductory statistics course. CAPS generated unique data, judged student performance,…
Handhayanti, Ludwy; Rustina, Yeni; Budiati, Tri
Premature infants tend to lose heat quickly. This loss can be aggravated when they have received an invasive procedure involving a venous puncture. This research used a crossover design, conducting 2 intervention tests to compare 2 different treatments on the same sample. It involved 2 groups with 18 premature infants in each, and data were analyzed with an independent t test. Interventions conducted in an open incubator showed a p value of .001, statistically related to heat loss in premature infants. In contrast, for the radiant warmer the p value of .001 referred to a different range of heat gain before and after the venous puncture. The radiant warmer protected the premature infants from hypothermia during the invasive procedure. However, it is inadvisable for routine care of newborn infants since it can increase insensible water loss.
de Sá, Joceline Cássia Ferezini; Marini, Gabriela; Gelaleti, Rafael Bottaro; da Silva, João Batista; de Azevedo, George Gantas; Rudge, Marilza Vieira Cunha
2013-11-01
To evaluate the evolution of the methodological and statistical design of publications in the Brazilian Journal of Gynecology and Obstetrics (RBGO) since resolution 196/96. A review of 133 articles published in 1999 (65) and 2009 (68) was performed by two independent reviewers with training in clinical epidemiology and the methodology of scientific research. We included all original clinical articles, case and series reports, and excluded editorials, letters to the editor, systematic reviews, experimental studies, opinion articles, and abstracts of theses and dissertations. Characteristics related to the methodological quality of the studies were analyzed in each article using a checklist that evaluated two criteria: methodological aspects and statistical procedures. We used descriptive statistics and the χ2 test for comparison of the two years. There was a difference between 1999 and 2009 regarding study and statistical design, with more accuracy in the procedures and the use of more robust tests in 2009. In RBGO, we observed an evolution in the methods of published articles and a more in-depth use of statistical analyses, with more sophisticated tests such as regression and multilevel analyses, which are essential techniques for the knowledge and planning of health interventions, leading to fewer interpretation errors.
Code of Federal Regulations, 2014 CFR
2014-07-01
... of the NEPA process and policies of the agencies can be obtained from: Policy and Management Planning... funded efforts; training programs, court improvement projects, research, and gathering statistical data. (2) Minor renovation projects or remodeling. (c) Actions which normally require environmental...
Code of Federal Regulations, 2013 CFR
2013-07-01
... of the NEPA process and policies of the agencies can be obtained from: Policy and Management Planning... funded efforts; training programs, court improvement projects, research, and gathering statistical data. (2) Minor renovation projects or remodeling. (c) Actions which normally require environmental...
Code of Federal Regulations, 2012 CFR
2012-07-01
... of the NEPA process and policies of the agencies can be obtained from: Policy and Management Planning... funded efforts; training programs, court improvement projects, research, and gathering statistical data. (2) Minor renovation projects or remodeling. (c) Actions which normally require environmental...
Physics in Perspective Volume II, Part C, Statistical Data.
ERIC Educational Resources Information Center
National Academy of Sciences - National Research Council, Washington, DC. Physics Survey Committee.
Statistical data relating to the sociology and economics of the physics enterprise are presented and explained. The data are divided into three sections: manpower data, data on funding and costs, and data on the literature of physics. Each section includes numerous studies, with notes on the sources and types of data, gathering procedures, and…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... is calculated from tumor data of the cancer bioassays using a statistical extrapolation procedure... regulated as a carcinogen, FDA will analyze the data submitted using either a statistical extrapolation... million. * * * * * 0 3. In Sec. 500.84, revise paragraph (c) introductory text to read as follows: Sec...
NASA Astrophysics Data System (ADS)
Fan, Daidu; Tu, Junbiao; Cai, Guofu; Shang, Shuai
2015-06-01
Grain-size analysis is a basic routine in sedimentology and related fields, but diverse methods of sample collection, processing and statistical analysis often make direct comparisons and interpretations difficult or even impossible. In this paper, 586 published grain-size datasets from the Qiantang Estuary (East China Sea) sampled and analyzed by the same procedures were merged and their textural parameters calculated by a percentile and two moment methods. The aim was to explore which of the statistical procedures performed best in the discrimination of three distinct sedimentary units on the tidal flats of the middle Qiantang Estuary. A Gaussian curve-fitting method served to simulate mixtures of two normal populations having different modal sizes, sorting values and size distributions, enabling a better understanding of the impact of finer tail components on textural parameters, as well as the proposal of a unifying descriptive nomenclature. The results show that percentile and moment procedures yield almost identical results for mean grain size, and that sorting values are also highly correlated. However, more complex relationships exist between percentile and moment skewness (kurtosis), changing from positive to negative correlations when the proportions of the finer populations decrease below 35% (10%). This change results from the overweighting of tail components in moment statistics, which stands in sharp contrast to the underweighting or complete amputation of small tail components by the percentile procedure. Intercomparisons of bivariate plots suggest an advantage of the Friedman & Johnson moment procedure over the McManus moment method in terms of the description of grain-size distributions, and over the percentile method by virtue of a greater sensitivity to small variations in tail components. The textural parameter scalings of Folk & Ward were translated into their Friedman & Johnson moment counterparts by application of mathematical functions derived by regression analysis of measured and modeled grain-size data, or by determining the abscissa values of intersections between auxiliary lines running parallel to the x-axis and vertical lines corresponding to the descriptive percentile limits along the ordinate of representative bivariate plots. Twofold limits were extrapolated for the moment statistics in relation to single descriptive terms in the cases of skewness and kurtosis by considering both positive and negative correlations between percentile and moment statistics. The extrapolated descriptive scalings were further validated by examining entire size-frequency distributions simulated by mixing two normal populations of designated modal size and sorting values, but varying in mixing ratios. These were found to match well in most of the proposed scalings, although platykurtic and very platykurtic categories were questionable when the proportion of the finer population was below 5%. Irrespective of the statistical procedure, descriptive nomenclatures should therefore be cautiously used when tail components contribute less than 5% to grain-size distributions.
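The two families of statistics compared in this study are straightforward to compute side by side; a sketch using the standard Folk & Ward (1957) percentile formulas and weight-based moment statistics, assuming grain sizes in phi units with associated weight fractions (function names are illustrative):

```python
import numpy as np

def folk_ward(phi, weight):
    """Graphic (percentile) statistics of Folk & Ward from a phi distribution."""
    order = np.argsort(phi)
    phi_s = phi[order]
    cum = np.cumsum(weight[order]) / weight.sum()
    p = lambda q: np.interp(q, cum, phi_s)
    p5, p16, p25, p50, p75, p84, p95 = map(p, [.05, .16, .25, .50, .75, .84, .95])
    mean = (p16 + p50 + p84) / 3
    sorting = (p84 - p16) / 4 + (p95 - p5) / 6.6
    skew = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))
            + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))
    kurt = (p95 - p5) / (2.44 * (p75 - p25))
    return mean, sorting, skew, kurt

def moments(phi, weight):
    """Weight-based moment statistics: mean, sorting, skewness, kurtosis."""
    w = weight / weight.sum()
    m = np.sum(w * phi)
    sd = np.sqrt(np.sum(w * (phi - m) ** 2))
    return (m, sd,
            np.sum(w * (phi - m) ** 3) / sd ** 3,
            np.sum(w * (phi - m) ** 4) / sd ** 4)
```

The paper's central contrast is visible in the formulas themselves: the moment skewness and kurtosis raise tail deviations to the third and fourth powers, while the percentile versions ignore everything beyond φ5 and φ95.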
Effect of non-normality on test statistics for one-way independent groups designs.
Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R
2012-02-01
The data obtained from one-way independent groups designs is typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
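For the previously recommended procedures, SciPy exposes the Welch test and Yuen's trimmed-means variant directly (the trim argument requires SciPy ≥ 1.7); a sketch under skewed, heteroscedastic data of the kind the abstract describes:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# skewed groups with unequal spread: lognormal with different sigmas
g1 = rng.lognormal(mean=0.0, sigma=0.5, size=40)
g2 = rng.lognormal(mean=0.0, sigma=1.2, size=25)

# classical Student t (pooled variance): its assumptions are violated here
print(stats.ttest_ind(g1, g2))

# Welch test: drops the pooled-variance assumption
print(stats.ttest_ind(g1, g2, equal_var=False))

# Yuen's test: Welch-type statistic on 20% trimmed means / Winsorized variances
print(stats.ttest_ind(g1, g2, equal_var=False, trim=0.2))
```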
How Near is a Near-Optimal Solution: Confidence Limits for the Global Optimum.
1980-05-01
Approximate or near-optimal solutions are the only practical solutions available. This paper identifies and compares some procedures which use independent near... The objective of this paper is to indicate some relatively new statistical procedures for obtaining an upper confidence limit on G. Each of these…
ERIC Educational Resources Information Center
South Carolina Commission on Higher Education, Columbia.
This manual outlines the policies and procedures related to the submission and review of facilities projects at South Carolina's public colleges and universities. It provides an overview of the South Carolina Commission on Higher Education's role and responsibilities and its general policy regarding permanent improvements to facilities. The report…
Multiple linear regression analysis
NASA Technical Reports Server (NTRS)
Edwards, T. R.
1980-01-01
Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.
Statistical relative gain calculation for Landsat 8
NASA Astrophysics Data System (ADS)
Anderson, Cody; Helder, Dennis L.; Jeno, Drake
2017-09-01
The Landsat 8 Operational Land Imager (OLI) is an optical multispectral push-broom sensor with a focal plane consisting of over 7000 detectors per spectral band. Each of the individual imaging detectors contributes one column of pixels to an image. Any difference in the response between neighboring detectors may result in a visible stripe or band in the imagery. An accurate estimate of each detector's relative gain is needed to account for any differences between detector responses. This paper describes a procedure for estimating relative gains which uses normally acquired Earth viewing statistics.
Statistical relative gain calculation for Landsat 8
Anderson (CTR), Cody; Helder, Dennis; Jeno (CTR), Drake
2017-01-01
The Landsat 8 Operational Land Imager (OLI) is an optical multispectral push-broom sensor with a focal plane consisting of over 7000 detectors per spectral band. Each of the individual imaging detectors contributes one column of pixels to an image. Any difference in the response between neighboring detectors may result in a visible stripe or band in the imagery. An accurate estimate of each detector’s relative gain is needed to account for any differences between detector responses. This paper describes a procedure for estimating relative gains which uses normally acquired Earth viewing statistics.
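A simplified sketch of the statistical idea: over enough Earth scenes, every detector in a push-broom band sees, on average, the same radiance distribution as its neighbors, so its long-run mean response ratioed to the band mean estimates its relative gain. This compresses the paper's procedure considerably and omits its outlier handling:

```python
import numpy as np

def relative_gains(scenes):
    """Estimate per-detector relative gains from Earth-view statistics.

    scenes : iterable of 2-D arrays (rows x detectors) from one spectral band.
    """
    detector_sums, n_rows = None, 0
    for img in scenes:                      # accumulate column sums over scenes
        s = img.sum(axis=0)
        detector_sums = s if detector_sums is None else detector_sums + s
        n_rows += img.shape[0]
    detector_means = detector_sums / n_rows
    return detector_means / detector_means.mean()

# applying the correction: divide each image column by its relative gain
# corrected = raw / relative_gains(scene_archive)
```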
Evaluation of noise pollution level in the operating rooms of hospitals: A study in Iran.
Giv, Masoumeh Dorri; Sani, Karim Ghazikhanlou; Alizadeh, Majid; Valinejadi, Ali; Majdabadi, Hesamedin Askari
2017-06-01
Noise pollution in the operating rooms is one of the remaining challenges. Both patients and physicians are exposed to different sound levels during the operative cases, many of which can last for hours. This study aims to evaluate the noise pollution in the operating rooms during different surgical procedures. In this cross-sectional study, sound level in the operating rooms of Hamadan University-affiliated hospitals (totally 10) in Iran during different surgical procedures was measured using B&K sound meter. The gathered data were compared with national and international standards. Statistical analysis was performed using descriptive statistics and one-way ANOVA, t -test, and Pearson's correlation test. Noise pollution level at majority of surgical procedures is higher than national and international documented standards. The highest level of noise pollution is related to orthopedic procedures, and the lowest one related to laparoscopic and heart surgery procedures. The highest and lowest registered sound level during the operation was 93 and 55 dB, respectively. Sound level generated by equipments (69 ± 4.1 dB), trolley movement (66 ± 2.3 dB), and personnel conversations (64 ± 3.9 dB) are the main sources of noise. The noise pollution of operating rooms are higher than available standards. The procedure needs to be corrected for achieving the proper conditions.
78 FR 43002 - Proposed Collection; Comment Request for Revenue Procedure 2004-29
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-18
... comments concerning statistical sampling in Sec. 274 Context. DATES: Written comments should be received on... INFORMATION: Title: Statistical Sampling in Sec. 274 Contest. OMB Number: 1545-1847. Revenue Procedure Number: Revenue Procedure 2004-29. Abstract: Revenue Procedure 2004-29 prescribes the statistical sampling...
Statistical Properties of a Two-Stage Procedure for Creating Sky Flats
NASA Astrophysics Data System (ADS)
Crawford, R. W.; Trueblood, M.
2004-05-01
Accurate flat fielding is an essential factor in image calibration and good photometry, yet no single method for creating flat fields is both practical and effective in all cases. At Winer Observatory, robotic telescope operation and the research program of Near Earth Object follow-up astrometry favor the use of sky flats formed from the many images that are acquired during a night. This paper reviews the statistical properties of the median-combine process used to create sky flats and discusses a computationally efficient procedure for two-stage combining of many images to form sky flats with relatively high signal-to-noise ratio (SNR). This procedure is in use at Winer for the flat field calibration of unfiltered images taken for NEO follow-up astrometry.
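A minimal sketch of a two-stage median combine of the kind described, assuming the night's frames are already dark-subtracted NumPy arrays; the group size and normalization are illustrative choices, not Winer's exact pipeline:

```python
import numpy as np

def two_stage_sky_flat(images, group_size=9):
    """Two-stage median combine of normalized night-sky images.

    Stage 1: median-combine each group of frames (rejects stars and cosmic rays).
    Stage 2: average the group medians to build SNR without holding every
    frame in memory at once.
    """
    groups = [images[i:i + group_size] for i in range(0, len(images), group_size)]
    stage1 = [np.median(np.stack([im / np.median(im) for im in g]), axis=0)
              for g in groups]
    flat = np.mean(np.stack(stage1), axis=0)
    return flat / np.median(flat)           # normalize to unit median
```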
Willard, Melissa A Bodnar; McGuffin, Victoria L; Smith, Ruth Waddell
2012-01-01
Salvia divinorum is a hallucinogenic herb that is internationally regulated. In this study, salvinorin A, the active compound in S. divinorum, was extracted from S. divinorum plant leaves using a 5-min extraction with dichloromethane. Four additional Salvia species (Salvia officinalis, Salvia guaranitica, Salvia splendens, and Salvia nemorosa) were extracted using this procedure, and all extracts were analyzed by gas chromatography-mass spectrometry. Differentiation of S. divinorum from other Salvia species was successful based on visual assessment of the resulting chromatograms. To provide a more objective comparison, the total ion chromatograms (TICs) were subjected to principal components analysis (PCA). Prior to PCA, the TICs were subjected to a series of data pretreatment procedures to minimize non-chemical sources of variance in the data set. Successful discrimination of S. divinorum from the other four Salvia species was possible based on visual assessment of the PCA scores plot. To provide a numerical assessment of the discrimination, a series of statistical procedures such as Euclidean distance measurement, hierarchical cluster analysis, Student's t tests, Wilcoxon rank-sum tests, and Pearson product moment correlation were also applied to the PCA scores. The statistical procedures were then compared to determine the advantages and disadvantages for forensic applications.
Beta, Jaroslaw; Lesmes-Heredia, Cristina; Bedetti, Chiara; Akolekar, Ranjit
2018-04-01
The aim of this paper was to estimate the risk of miscarriage after amniocentesis or chorionic villus sampling (CVS) based on a systematic review of the literature. A search of Medline, Embase, and The Cochrane Library (2000-2017) was carried out to identify studies reporting complications following CVS or amniocentesis. The inclusion criteria for the systematic review were studies reporting results from large controlled studies (N ≥ 1000 invasive procedures) and those reporting data for pregnancy loss prior to 24 weeks' gestation. Data for cases that had an invasive procedure and controls were inputted in contingency tables, and the risk of miscarriage was estimated for each study. Summary statistics were calculated after taking into account the weighting for each study included in the systematic review. Procedure-related risk of miscarriage was estimated as a weighted risk difference from the summary statistics for cases and controls. The electronic search from the databases yielded 2465 potential citations, of which 2431 were excluded, leaving 34 studies for full-text review. The final review included 10 studies for amniocentesis and 6 studies for CVS, which were used to estimate the risk of miscarriage in pregnancies that had an invasive procedure and in the control pregnancies that did not. The procedure-related risk of miscarriage following amniocentesis was 0.35% (95% confidence interval [CI]: 0.07 to 0.63) and that following CVS was 0.35% (95% CI: -0.31 to 1.00). The procedure-related risks of miscarriage following amniocentesis and CVS are lower than currently quoted to women.
Why McNemar's Procedure Needs to Be Included in the Business Statistics Curriculum
ERIC Educational Resources Information Center
Berenson, Mark L.; Koppel, Nicole B.
2005-01-01
In business research situations it is often of interest to examine the differences in the responses in repeated measurements of the same subjects or from among matched or paired subjects. A simple and useful procedure for comparing differences between proportions in two related samples was devised by McNemar (1947) nearly 60 years ago. Although…
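A short sketch of McNemar's procedure on a paired 2x2 table using statsmodels; the counts are invented for illustration:

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# paired responses, e.g., approve/disapprove before and after an intervention
#                 after: yes   after: no
table = np.array([[30,          12],    # before: yes
                  [ 4,          54]])   # before: no

# exact binomial version; only the 12-vs-4 discordant pairs carry information
result = mcnemar(table, exact=True)
print(result.statistic, result.pvalue)
```

Only the discordant cells enter the test, which is precisely why the paired design differs from an ordinary two-proportion comparison.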
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, Richard O.
The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedure techniques, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
Statistical and procedural issues in the use of heated taxidermic mounts
Bakken, G.S.; Kenow, K.P.; Korschgen, C.E.; Boysen, A.F.
2000-01-01
Studies using mounts have an inherently nested error structure; calibration and standardization should use the appropriate procedures and statistics. One example is that individual mount differences are nested within morphological factors related to species, age, or gender; without replication, mount differences may be confused with differences due to morphology. Also, the sensitivity of mounts to orientation to wind or sun is nested within mount; without replication, inadvertent variation in mount positioning may be confused with differences among mounts. Data on heat loss from a 1-day-old mallard duckling mount are used to illustrate orientation sensitivity. (C) 2000 Elsevier Science Ltd. All rights reserved.
Hood of the truck statistics for food animal practitioners.
Slenning, Barrett D
2006-03-01
This article offers some tips on working with statistics and develops four relatively simple procedures to deal with most kinds of data with which veterinarians work. The criterion for a procedure to be a "Hood of the Truck Statistics" (HOT Stats) technique is that it must be simple enough to be done with pencil, paper, and a calculator. The goal of HOT Stats is to have the tools available to run quick analyses in only a few minutes so that decisions can be made in a timely fashion. The discipline allows us to move away from the all-too-common guesswork about effects and differences we perceive following a change in treatment or management. The techniques allow us to move toward making more defensible, credible, and more quantifiably "risk-aware" real-time recommendations to our clients.
75 FR 38871 - Proposed Collection; Comment Request for Revenue Procedure 2004-29
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-06
... comments concerning Revenue Procedure 2004-29, Statistical Sampling in Sec. 274 Context. DATES: Written... Internet, at [email protected] . SUPPLEMENTARY INFORMATION: Title: Statistical Sampling in Sec...: Revenue Procedure 2004-29 prescribes the statistical sampling methodology by which taxpayers under...
Hazardous substances, the environment and public health: a statistical overview.
Hunter, W G; Crowley, J J
1979-01-01
The purpose of this paper is to provide an overview of the statistical problems that exist and procedures that are available when attempts are made to assess the possible harm which has been or might be caused by substances in the environment. These issues bear directly on important decisions of public policy such as those related to the establishment and enforcement of regulations. PMID:540596
Sequential Tests of Multiple Hypotheses Controlling Type I and II Familywise Error Rates
Bartroff, Jay; Song, Jinlin
2014-01-01
This paper addresses the following general scenario: A scientist wishes to perform a battery of experiments, each generating a sequential stream of data, to investigate some phenomenon. The scientist would like to control the overall error rate in order to draw statistically-valid conclusions from each experiment, while being as efficient as possible. The between-stream data may differ in distribution and dimension but also may be highly correlated, even duplicated exactly in some cases. Treating each experiment as a hypothesis test and adopting the familywise error rate (FWER) metric, we give a procedure that sequentially tests each hypothesis while controlling both the type I and II FWERs regardless of the between-stream correlation, and only requires arbitrary sequential test statistics that control the error rates for a given stream in isolation. The proposed procedure, which we call the sequential Holm procedure because of its inspiration from Holm’s (1979) seminal fixed-sample procedure, shows simultaneous savings in expected sample size and less conservative error control relative to fixed sample, sequential Bonferroni, and other recently proposed sequential procedures in a simulation study. PMID:25092948
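For reference, the fixed-sample Holm (1979) step-down procedure that inspires the sequential version is only a few lines; a minimal sketch:

```python
def holm_reject(pvalues, alpha=0.05):
    """Holm (1979) step-down procedure: returns a reject flag per hypothesis."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        if pvalues[i] <= alpha / (m - rank):   # thresholds alpha/m, alpha/(m-1), ...
            reject[i] = True
        else:
            break                              # step-down stops at first failure
    return reject

print(holm_reject([0.001, 0.04, 0.03, 0.005]))  # [True, False, False, True]
```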
NASA Astrophysics Data System (ADS)
Giovanna, Vessia; Luca, Pisano; Carmela, Vennari; Mauro, Rossi; Mario, Parise
2016-01-01
This paper proposes an automated method for the selection of the rainfall data (duration, D, and cumulated rainfall, E) responsible for shallow landslide initiation. The method mimics an expert identifying D and E from rainfall records through a manual procedure whose rules are applied according to his or her judgement. The comparison between the two methods is based on 300 D-E pairs drawn from temporal rainfall data series recorded in a 30-day time lag before landslide occurrence. Statistical tests, employed on D and E samples considered both as paired and as independent values to verify whether they belong to the same population, show that the automated procedure is able to replicate the pairs drawn by expert judgment. Furthermore, a criterion based on cumulative distribution functions (CDFs) is proposed to select, among the 6 pairs drawn by the coded procedure, the D-E pair most closely related to the expert one for tracing the empirical rainfall threshold line.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yip, Doris; Vanasco, Matthew; Funaki, Brian
2004-01-15
To compare complication rates and tube performance of percutaneous mushroom gastrostomy, balloon gastrostomy, and gastrojejunostomy. Between September 9, 1999 and April 23, 2001, 203 patients underwent 250 radiologically guided percutaneous gastrostomy and gastrojejunostomy procedures. Follow-up was conducted through chart reviews and review of our interventional radiology database. Procedural and catheter-related complications were recorded. Chi-square statistical analysis was performed. In patients receiving mushroom-retained gastrostomy catheters (n = 114), the major complication rate was 0.88% (n = 1), the minor complication rate was 5.3% (n = 6), and the tube complication rate was 4.4% (n = 5). In patients receiving balloon-retained gastrostomy tubes (n = 67), the major complication rate was 0, the minor complication rate was 4.5% (n = 3), and the tube complication rate was 34.3% (n = 23). In patients receiving gastrojejunostomy catheters (n = 69), the major complication rate was 1.4% (n = 1), the minor complication rate was 2.9% (n = 2), and the tube complication rate was 34.8% (n = 24). No statistically significant differences were found between procedural or peri-procedural complications among the different types of tubes. Mushroom-retained catheters had significantly fewer tube complications (p < 0.01). Percutaneous gastrostomy and gastrojejunostomy have similar procedural and peri-procedural complication rates. Mushroom gastrostomy catheters have fewer tube-related complications compared with balloon gastrostomy and gastrojejunostomy catheters. In addition, mushroom-retained catheters exhibit the best overall long-term tube patency and are therefore the gastrostomy catheter of choice.
Jacob, Laurent; Combes, Florence; Burger, Thomas
2018-06-18
We propose a new hypothesis test for the differential abundance of proteins in mass-spectrometry-based relative quantification. An important feature of this type of high-throughput analysis is that it involves an enzymatic digestion of the sample proteins into peptides prior to identification and quantification. Due to numerous homology sequences, different proteins can lead to peptides with identical amino acid chains, so that their parent protein is ambiguous. These so-called shared peptides make the protein-level statistical analysis a challenge and are often not accounted for. In this article, we use a linear model describing peptide-protein relationships to build a likelihood ratio test of differential abundance for proteins. We show that the likelihood ratio statistic can be computed in linear time with the number of peptides. We also provide the asymptotic null distribution of a regularized version of our statistic. Experiments on both real and simulated datasets show that our procedure outperforms state-of-the-art methods. The procedure is available via the pepa.test function of the DAPAR Bioconductor R package.
75 FR 53738 - Proposed Collection; Comment Request for Rev. Proc. 2007-35
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-01
... Revenue Procedure 2007-35, Statistical Sampling for purposes of Section 199. DATES... through the Internet, at [email protected] . SUPPLEMENTARY INFORMATION: Title: Statistical Sampling...: This revenue procedure provides for determining when statistical sampling may be used for purposes of...
Actuarial analysis of surgical results: rationale and method.
Grunkemeier, G L; Starr, A
1977-11-01
The use of time-related methods of statistical analysis is essential for valid evaluation of the long-term results of a surgical procedure. Accurate comparison of two procedures or two prosthetic devices is possible only when the length of follow-up is properly accounted for. The purpose of this report is to make the technical aspects of the actuarial, or life table, method easily accessible to the surgeon, with emphasis on the motivation for and the rationale behind it. This topic is illustrated in terms of heart valve prostheses, a field that is rapidly developing. Both the authors and readers of articles must be aware that controversies surrounding the relative merits of various prosthetic designs or operative procedures can be settled only if proper time-related methods of analysis are utilized.
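A minimal sketch of the actuarial (life-table) computation the authors motivate, using the standard half-interval convention for withdrawals; the follow-up counts are invented for illustration:

```python
def life_table(deaths, withdrawn, at_risk_start):
    """Actuarial estimate of survival at the end of each follow-up interval.

    deaths[i], withdrawn[i] : events and censored cases in interval i;
    withdrawals are assumed exposed for half the interval on average.
    """
    surv, n, cumulative = [], at_risk_start, 1.0
    for d, w in zip(deaths, withdrawn):
        effective = n - w / 2.0            # standard actuarial correction
        cumulative *= 1.0 - d / effective  # survive this interval, too
        surv.append(cumulative)
        n -= d + w
    return surv

# e.g., yearly intervals after valve replacement (numbers are invented)
print(life_table(deaths=[3, 2, 2, 1], withdrawn=[5, 4, 6, 8], at_risk_start=100))
```

The key point of the method is visible in the loop: patients censored part-way through follow-up still contribute partial exposure instead of being discarded, which is what makes comparisons at unequal follow-up lengths valid.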
The epistemology of mathematical and statistical modeling: a quiet methodological revolution.
Rodgers, Joseph Lee
2010-01-01
A quiet methodological revolution, a modeling revolution, has occurred over the past several decades, almost without discussion. In contrast, the 20th century ended with contentious argument over the utility of null hypothesis significance testing (NHST). The NHST controversy may have been at least partially irrelevant, because in certain ways the modeling revolution obviated the NHST argument. I begin with a history of NHST and modeling and their relation to one another. Next, I define and illustrate principles involved in developing and evaluating mathematical models. Following, I discuss the difference between using statistical procedures within a rule-based framework and building mathematical models from a scientific epistemology. Only the former is treated carefully in most psychology graduate training. The pedagogical implications of this imbalance and the revised pedagogy required to account for the modeling revolution are described. To conclude, I discuss how attention to modeling implies shifting statistical practice in certain progressive ways. The epistemological basis of statistics has moved away from being a set of procedures, applied mechanistically, and moved toward building and evaluating statistical and scientific models. Copyright 2009 APA, all rights reserved.
Statistical methodology for the analysis of dye-switch microarray experiments
Mary-Huard, Tristan; Aubert, Julie; Mansouri-Attia, Nadera; Sandra, Olivier; Daudin, Jean-Jacques
2008-01-01
Background In individually dye-balanced microarray designs, each biological sample is hybridized on two different slides, once with Cy3 and once with Cy5. While this strategy ensures an automatic correction of the gene-specific labelling bias, it also induces dependencies between log-ratio measurements that must be taken into account in the statistical analysis. Results We present two original procedures for the statistical analysis of individually balanced designs. These procedures are compared with the usual ML and REML mixed model procedures proposed in most statistical toolboxes, on both simulated and real data. Conclusion The UP procedure we propose as an alternative to the usual mixed model procedures is more efficient and significantly faster to compute. This result provides some useful guidelines for the analysis of complex designs. PMID:18271965
Linking the Smarter Balanced Assessments to NWEA MAP Assessments
ERIC Educational Resources Information Center
Northwest Evaluation Association, 2015
2015-01-01
Concordance tables have been used for decades to relate scores on different tests measuring similar but distinct constructs. These tables, typically derived from statistical linking procedures, provide a direct link between scores on different tests and serve various purposes. Aside from describing how a score on one test relates to performance on…
Primary Factors Related to Multiple Placements for Children in Out-of-Home Care
ERIC Educational Resources Information Center
Eggertsen, Lars
2008-01-01
Using an ecological framework, this study identified which factors related to out-of-home placements significantly influenced multiple placements for children in Utah during 2000, 2001, and 2002. Multinomial logistic regression statistical procedures and a geographical information system (GIS) were used to analyze the data. The final model…
NASA Astrophysics Data System (ADS)
Pisano, Luca; Vessia, Giovanna; Vennari, Carmela; Parise, Mario
2015-04-01
Empirical rainfall thresholds are a well-established method for characterizing the Duration (D) and Cumulated height (E) of rainfall events that are likely to initiate shallow landslides. To this end, rain-gauge records of rainfall heights are commonly used. Several procedures can be applied to calculate the Duration, the Cumulated height and, eventually, the Intensity of the rainfall events responsible for shallow landslide onset. Many of these procedures are tailored to particular geological settings and climate conditions and rely on an expert identification of the rainfall event; a few researchers have recently devised automated procedures to reconstruct the rainfall events responsible for landslide onset. In this study, 300 D,E pairs, related to shallow landslides that occurred over the ten-year span 2002-2012 on the Italian territory, have been derived by means of two procedures: the expert method (Brunetti et al., 2010) and the automated method (Vessia et al., 2014). The two procedures start from the same sources of information on shallow landslides that occurred during or soon after a rainfall. Although they share the method for selecting the date (up to the hour of the landslide occurrence), the site of the landslide, and the rain gauge representative of the rainfall, they differ in how the Duration and Cumulated height of the rainfall event are calculated. Moreover, the expert procedure identifies only one D,E pair for each landslide, whereas the automated procedure produces 6 candidate D,E pairs for the same landslide event. Each of the automated samples reproduces about 80% of the E values and about 60% of the D values calculated by the expert procedure. Unfortunately, no standard methods are available for checking the forecasting ability of either the expert or the automated reconstruction of the true D,E pairs that result in shallow landslides. Nonetheless, a statistical analysis of the marginal distributions of the seven samples of 300 D and E values is performed in this study. The main objective of this statistical analysis is to highlight similarities and differences between the sets of Duration and Cumulated values collected by the two procedures. First, the sample distributions were investigated: the seven E samples are lognormally distributed, whereas the D samples all follow Weibull-like distributions. Because the E samples are lognormal, statistical tests can be applied to check two null hypotheses: equal mean values through the Student t-test, and equal standard deviations through the Fisher F-test. Both hypotheses are accepted for the seven E samples at a confidence level of 95%, meaning that they come from the same population. Conversely, these tests cannot be applied to the seven D samples, which are Weibull distributed with shape parameters k ranging between 0.9 and 1.2. Nonetheless, both procedures identify the rainfall event through the selection of the E values, after which D is derived. Thus, the results of this statistical analysis preliminarily confirm the similarity of the two sets of D,E pairs drawn from the two different procedures. References Brunetti, M.T., Peruccacci, S., Rossi, M., Luciani, S., Valigi, D., and Guzzetti, F.: Rainfall thresholds for the possible occurrence of landslides in Italy, Nat. Hazards Earth Syst. Sci., 10, 447-458, doi:10.5194/nhess-10-447-2010, 2010.
Vessia G., Parise M., Brunetti M.T., Peruccacci S., Rossi M., Vennari C., and Guzzetti F.: Automated reconstruction of rainfall events responsible for shallow landslides, Nat. Hazards Earth Syst. Sci., 14, 2399-2408, doi: 10.5194/nhess-14-2399-2014, 2014.
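The distributional checks described above can be reproduced in outline with standard tools. The sketch below uses synthetic samples, not the study's 300 D,E pairs: a Student t-test and a Fisher F-test on log-transformed cumulated rainfall E, plus a Weibull fit for duration D.

```python
# Distributional checks on synthetic rainfall samples (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
log_e_expert = rng.normal(4.0, 0.8, 300)    # log E, expert-style sample
log_e_auto = rng.normal(4.05, 0.85, 300)    # log E, automated-style sample

t_stat, t_p = stats.ttest_ind(log_e_expert, log_e_auto)   # equal means?
f_stat = log_e_expert.var(ddof=1) / log_e_auto.var(ddof=1)  # equal variances?
f_p = 2 * min(stats.f.sf(f_stat, 299, 299), stats.f.cdf(f_stat, 299, 299))
print(f"Student t: p = {t_p:.3f}; Fisher F: p = {f_p:.3f}")

# Weibull fit for durations (hours); shape k near the 0.9-1.2 range above.
d_hours = stats.weibull_min.rvs(c=1.1, scale=20.0, size=300, random_state=2)
shape, loc, scale = stats.weibull_min.fit(d_hours, floc=0)
print(f"Weibull shape k = {shape:.2f}")
```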
Engineering Students Designing a Statistical Procedure for Quantifying Variability
ERIC Educational Resources Information Center
Hjalmarson, Margret A.
2007-01-01
The study examined first-year engineering students' responses to a statistics task that asked them to generate a procedure for quantifying variability in a data set from an engineering context. Teams used technological tools to perform computations, and their final product was a ranking procedure. The students could use any statistical measures,…
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes. 52.38c Section 52.38c Agriculture Regulations of the... Regulations Governing Inspection and Certification Sampling § 52.38c Statistical sampling procedures for lot...
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables. 52.38b Section 52.38b Agriculture Regulations of... Regulations Governing Inspection and Certification Sampling § 52.38b Statistical sampling procedures for on...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables. 52.38b Section 52.38b Agriculture Regulations of... Regulations Governing Inspection and Certification Sampling § 52.38b Statistical sampling procedures for on...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes. 52.38c Section 52.38c Agriculture Regulations of the... Regulations Governing Inspection and Certification Sampling § 52.38c Statistical sampling procedures for lot...
49 CFR 199.117 - Recordkeeping.
Code of Federal Regulations, 2011 CFR
2011-10-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing... provided by DOT Procedures. Statistical data related to drug testing and rehabilitation that is not name... employee drug test that indicate a verified positive result, records that demonstrate compliance with the...
49 CFR 199.117 - Recordkeeping.
Code of Federal Regulations, 2013 CFR
2013-10-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing... provided by DOT Procedures. Statistical data related to drug testing and rehabilitation that is not name... employee drug test that indicate a verified positive result, records that demonstrate compliance with the...
49 CFR 199.117 - Recordkeeping.
Code of Federal Regulations, 2012 CFR
2012-10-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing... provided by DOT Procedures. Statistical data related to drug testing and rehabilitation that is not name... employee drug test that indicate a verified positive result, records that demonstrate compliance with the...
49 CFR 199.117 - Recordkeeping.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing... provided by DOT Procedures. Statistical data related to drug testing and rehabilitation that is not name... employee drug test that indicate a verified positive result, records that demonstrate compliance with the...
49 CFR 199.117 - Recordkeeping.
Code of Federal Regulations, 2014 CFR
2014-10-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY DRUG AND ALCOHOL TESTING Drug Testing... provided by DOT Procedures. Statistical data related to drug testing and rehabilitation that is not name... employee drug test that indicate a verified positive result, records that demonstrate compliance with the...
A simple test of association for contingency tables with multiple column responses.
Decady, Y J; Thomas, D R
2000-09-01
Loughin and Scherer (1998, Biometrics 54, 630-637) investigated tests of association in two-way tables when one of the categorical variables allows for multiple-category responses from individual respondents. Standard chi-squared tests are invalid in this case, and they developed a bootstrap test procedure that provides good control of test levels under the null hypothesis. This procedure and some others that have been proposed are computationally involved and are based on techniques that are relatively unfamiliar to many practitioners. In this paper, the methods introduced by Rao and Scott (1981, Journal of the American Statistical Association 76, 221-230) for analyzing complex survey data are used to develop a simple test based on a corrected chi-squared statistic.
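For a rough sense of what a corrected chi-squared statistic looks like in practice, here is a hedged Python sketch of a first-order Rao-Scott-style adjustment: the Pearson statistic is deflated by an estimated design effect before being referred to the chi-squared distribution. The table and the design effect are hypothetical, and the paper's actual correction for multiple-response tables is more specific than this.

```python
# First-order Rao-Scott-style correction sketch (hypothetical counts/deff).
import numpy as np
from scipy import stats

observed = np.array([[30, 45, 25],
                     [20, 55, 25]])          # hypothetical 2x3 table
chi2, p_naive, df, expected = stats.chi2_contingency(observed)

deff = 1.6                                   # assumed average design effect
chi2_adj = chi2 / deff                       # deflate the Pearson statistic
p_adj = stats.chi2.sf(chi2_adj, df)
print(f"naive p = {p_naive:.3f}, corrected p = {p_adj:.3f}")
```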
Plastic Surgery Statistics in the US: Evidence and Implications.
Heidekrueger, Paul I; Juran, Sabrina; Patel, Anup; Tanna, Neil; Broer, P Niclas
2016-04-01
The American Society of Plastic Surgeons publishes yearly procedural statistics, collected through questionnaires and online via tracking operations and outcomes for plastic surgeons (TOPS). The statistics, disaggregated by U.S. region, leave two important factors unaccounted for: (1) the underlying base population and (2) the number of surgeons performing the procedures. The presented analysis puts the regional distribution of surgeries into perspective and contributes to fulfilling the TOPS legislation objectives. ASPS statistics from 2005 to 2013 were analyzed by geographic region in the U.S. Using population estimates from the 2010 U.S. Census Bureau, procedures were calculated per 100,000 population. Then, based on the ASPS member roster, the rate of surgeries per surgeon by region was calculated, and the two variables were related to each other. In 2013, 1,668,420 esthetic surgeries were performed in the U.S., resulting in the following ASPS ranking: 1st Mountain/Pacific (Region 5; 502,094 procedures, 30 % share), 2nd New England/Middle Atlantic (Region 1; 319,515, 19 %), 3rd South Atlantic (Region 3; 310,441, 19 %), 4th East/West South Central (Region 4; 274,282, 16 %), and 5th East/West North Central (Region 2; 262,088, 16 %). However, considering underlying populations, the distribution and ranking appear different, displaying a smaller variance in surgical demand. Further, the number of surgeons and rate of procedures show great regional variation. Demand for plastic surgery is influenced by patients' geographic background and varies among U.S. regions. While ASPS data provide important information, additional insight regarding the demand for surgical procedures can be gained by taking certain demographic factors into consideration. This journal requires that the authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.
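The normalization at the heart of this analysis is simple enough to show in a few lines. The sketch below uses round, hypothetical numbers (not ASPS data) to illustrate how per-100,000 rates can reorder regions that raw volumes rank differently.

```python
# Per-capita rate normalization with hypothetical regional figures.
regions = {
    # region: (procedures, population)  -- hypothetical values
    "Region A": (500_000, 70_000_000),
    "Region B": (320_000, 40_000_000),
}
for name, (procs, pop) in regions.items():
    per_100k = procs / pop * 100_000     # procedures per 100,000 population
    print(f"{name}: {procs:,} procedures, {per_100k:,.0f} per 100,000")
# Region A leads on raw volume, but Region B leads per capita (800 vs. 714).
```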
Akolekar, R; Beta, J; Picciarelli, G; Ogilvie, C; D'Antonio, F
2015-01-01
To estimate procedure-related risks of miscarriage following amniocentesis and chorionic villus sampling (CVS) based on a systematic review of the literature and a meta-analysis. A search of MEDLINE, EMBASE, CINAHL and The Cochrane Library (2000-2014) was performed to review relevant citations reporting procedure-related complications of amniocentesis and CVS. Only studies reporting data on more than 1000 procedures were included in this review to minimize the effect of bias from smaller studies. Heterogeneity between studies was estimated using Cochran's Q, the I² statistic and Egger bias. Meta-analysis of proportions was used to derive weighted pooled estimates for the risk of miscarriage before 24 weeks' gestation. Incidence-rate difference meta-analysis was used to estimate pooled procedure-related risks. The weighted pooled risks of miscarriage following invasive procedures were estimated from analysis of controlled studies including 324 losses in 42,716 women who underwent amniocentesis and 207 losses in 8,899 women who underwent CVS. The risk of miscarriage prior to 24 weeks in women who underwent amniocentesis and CVS was 0.81% (95% CI, 0.58-1.08%) and 2.18% (95% CI, 1.61-2.82%), respectively. The background rates of miscarriage in women from the control group that did not undergo any procedures were 0.67% (95% CI, 0.46-0.91%) for amniocentesis and 1.79% (95% CI, 0.61-3.58%) for CVS. The weighted pooled procedure-related risks of miscarriage for amniocentesis and CVS were 0.11% (95% CI, -0.04 to 0.26%) and 0.22% (95% CI, -0.71 to 1.16%), respectively. The procedure-related risks of miscarriage following amniocentesis and CVS are much lower than are currently quoted. Copyright © 2014 ISUOG. Published by John Wiley & Sons Ltd.
A Statistical Analysis of Brain Morphology Using Wild Bootstrapping
Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.
2008-01-01
Methods for the analysis of brain morphology, including voxel-based morphometry and surface-based morphometry, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two statistical procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correction for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose the use of new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method, called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
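The resampling idea is easy to demonstrate on a single "voxel". Below is a minimal wild bootstrap for a regression slope under the null, using Rademacher weights to preserve heteroscedasticity; the data are synthetic, and the paper's surface-based, family-wise-error machinery is not reproduced.

```python
# Minimal wild bootstrap for one regression slope (synthetic data).
import numpy as np

rng = np.random.default_rng(3)
n = 60
x = rng.normal(size=n)                               # covariate (e.g., age)
y = 0.4 * x + rng.normal(size=n) * (1 + np.abs(x))   # heteroscedastic noise

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]          # observed fit

# Null model (slope = 0): intercept-only fit; bootstrap its residuals.
null_fit = np.full(n, y.mean())
null_resid = y - null_fit
boot_slopes = np.empty(2000)
for b in range(boot_slopes.size):
    w = rng.choice([-1.0, 1.0], size=n)              # Rademacher multipliers
    y_star = null_fit + w * null_resid               # keeps heteroscedasticity
    boot_slopes[b] = np.linalg.lstsq(X, y_star, rcond=None)[0][1]

p_value = np.mean(np.abs(boot_slopes) >= np.abs(beta[1]))
print(f"slope = {beta[1]:.3f}, wild-bootstrap p = {p_value:.4f}")
```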
Statistics in the pharmacy literature.
Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R
2004-09-01
Research in statistical methods is essential for maintenance of high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals. We obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi-squared (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified to facilitate the appropriate appraisal and consequential utilization of the information available in research articles.
Kopp-Schneider, Annette; Prieto, Pilar; Kinsner-Ovaskainen, Agnieszka; Stanzel, Sven
2013-06-01
In the framework of toxicology, a testing strategy can be viewed as a series of steps which are taken to come to a final prediction about a characteristic of a compound under study. The testing strategy is performed as a single-step procedure, usually called a test battery, using simultaneously all information collected on different endpoints, or as tiered approach in which a decision tree is followed. Design of a testing strategy involves statistical considerations, such as the development of a statistical prediction model. During the EU FP6 ACuteTox project, several prediction models were proposed on the basis of statistical classification algorithms which we illustrate here. The final choice of testing strategies was not based on statistical considerations alone. However, without thorough statistical evaluations a testing strategy cannot be identified. We present here a number of observations made from the statistical viewpoint which relate to the development of testing strategies. The points we make were derived from problems we had to deal with during the evaluation of this large research project. A central issue during the development of a prediction model is the danger of overfitting. Procedures are presented to deal with this challenge. Copyright © 2012 Elsevier Ltd. All rights reserved.
Arismendi, Ivan; Johnson, Sherri L.; Dunham, Jason B.
2015-01-01
Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical moments (skewness and kurtosis) to examine potential changes of empirical distributions at decadal extents. Second, we adapt a statistical procedure combining a non-metric multidimensional scaling technique and higher density region plots to detect potentially anomalous years. We illustrate the use of these approaches by examining long-term stream temperature data from minimally and highly human-influenced streams. In particular, we contrast predictions about thermal regime responses to changing climates and human-related water uses. Using these methods, we effectively diagnose years with unusual thermal variability and patterns in variability through time, as well as spatial variability linked to regional and local factors that influence stream temperature. Our findings highlight the complexity of responses of thermal regimes of streams and reveal their differential vulnerability to climate warming and human-related water uses. The two approaches presented here can be applied with a variety of other continuous phenomena to address historical changes, extreme events, and their associated ecological responses.
Reveal Listeria 2.0 test for detection of Listeria spp. in foods and environmental samples.
Alles, Susan; Curry, Stephanie; Almy, David; Jagadeesan, Balamurugan; Rice, Jennifer; Mozola, Mark
2012-01-01
A Performance Tested Method validation study was conducted for a new lateral flow immunoassay (Reveal Listeria 2.0) for detection of Listeria spp. in foods and environmental samples. Results of inclusivity testing showed that the test detects all species of Listeria, with the exception of L. grayi. In exclusivity testing conducted under nonselective growth conditions, all non-listeriae tested produced negative Reveal assay results, except for three strains of Lactobacillus spp. However, these lactobacilli are inhibited by the selective Listeria Enrichment Single Step broth enrichment medium used with the Reveal method. Six foods were tested in parallel by the Reveal method and the U.S. Food and Drug Administration/Bacteriological Analytical Manual (FDA/BAM) reference culture procedure. Considering data from both internal and independent laboratory trials, overall sensitivity of the Reveal method relative to that of the FDA/BAM procedure was 101%. Four foods were tested in parallel by the Reveal method and the U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS) reference culture procedure. Overall sensitivity of the Reveal method relative to that of the USDA-FSIS procedure was 98.2%. There were no statistically significant differences in the number of positives obtained by the Reveal and reference culture procedures in any food trials. In testing of swab or sponge samples from four types of environmental surfaces, sensitivity of Reveal relative to that of the USDA-FSIS reference culture procedure was 127%. For two surface types, differences in the number of positives obtained by the Reveal and reference methods were statistically significant, with more positives by the Reveal method in both cases. Specificity of the Reveal assay was 100%, as there were no unconfirmed positive results obtained in any phase of the testing. Results of ruggedness experiments showed that the Reveal assay is tolerant of modest deviations in test sample volume and device incubation time.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 19 2011-07-01 2011-07-01 false Statistical Outlier Identification... (CONTINUED) Pt. 86, App. XVIII Appendix XVIII to Part 86—Statistical Outlier Identification Procedure for..., but suffer theoretical deficiencies if statistical significance tests are required. Consequently, the...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Statistical Outlier Identification... (CONTINUED) Pt. 86, App. XVIII Appendix XVIII to Part 86—Statistical Outlier Identification Procedure for..., but suffer theoretical deficiencies if statistical significance tests are required. Consequently, the...
Bowling, Mark R; Kohan, Matthew W; Walker, Paul; Efird, Jimmy; Ben Or, Sharon
2015-01-01
Navigational bronchoscopy is utilized to guide biopsies of peripheral lung nodules and to place fiducial markers for treatment of limited-stage lung cancer with stereotactic body radiotherapy. The type of sedation used for this procedure remains controversial. We performed a retrospective chart review to evaluate differences in diagnostic yield and overall success of the procedure based on anesthesia type. Electromagnetic navigational bronchoscopy was performed using the superDimension software system. Once the targeted lesion was within reach, multiple tissue samples were obtained. Statistical analysis was used to correlate the yield with the type of sedation, among other factors. A procedure was defined as successful if a diagnosis was made or a fiducial marker was adequately placed. Navigational bronchoscopy was performed on a total of 120 targeted lesions. The overall complication rate of the procedure was 4.1%. The diagnostic yield and success rate of the procedure were 74% and 87%, respectively. Duration of the procedure was the only significant difference between the general anesthesia and IV sedation groups (mean, 58 vs. 43 min, P=0.0005). A larger tumor size was associated with a higher diagnostic yield (P=0.032). No other variable had a statistically significant effect on diagnostic yield or on procedure failure. Navigational bronchoscopy is a safe and effective pulmonary diagnostic tool with a relatively low complication rate. The diagnostic yield and overall success of the procedure do not seem to be affected by the type of sedation used.
34 CFR 361.23 - Requirements related to the statewide workforce investment system.
Code of Federal Regulations, 2010 CFR
2010-07-01
... technology for individuals with disabilities; (ii) The use of information and financial management systems... statistics, job vacancies, career planning, and workforce investment activities; (iii) The use of customer service features such as common intake and referral procedures, customer databases, resource information...
Nazi Medical Research in Neuroscience: Medical Procedures, Victims, and Perpetrators.
Loewenau, Aleksandra; Weindling, Paul J
Issues relating to the euthanasia killings of the mentally ill, the medical research conducted on collected body parts, and the clinical investigations on living victims under National Socialism are among the best-known abuses in medical history. But to date, there have been no statistics compiled regarding the extent and number of the victims and perpetrators, or regarding their identities in terms of age, nationality, and gender. "Victims of Unethical Human Experiments and Coerced Research under National Socialism," a research project based at Oxford Brookes University, has established an evidence-based documentation of the overall numbers of victims and perpetrators through specific record linkages of the evidence from the period of National Socialism, as well as from post-WWII trials and other records. This article examines the level and extent of these unethical medical procedures as they relate to the field of neuroscience. It presents statistical information regarding the victims, as well as detailing the involvement of the perpetrators and Nazi physicians with respect to their post-war activities and subsequent court trials.
Randomization Procedures Applied to Analysis of Ballistic Data
1991-06-01
Keywords: data analysis; computationally intensive statistics; randomization tests; permutation tests; nonparametric statistics. From the report: "Any reasonable statistical procedure would fail to support the notion of improvement of dynamic over standard indexing based on this data." (Technical Report BRL-TR-3245, AD-A238 389, by Malcolm S. Taylor and Barry A. Bodt, June 1991.)
Richardson, Janet; Smith, Joanna E; McCall, Gillian; Pilkington, Karen
2006-01-01
The aim of this study was to systematically review and critically appraise the evidence on the effectiveness of hypnosis for procedure-related pain and distress in pediatric cancer patients. A comprehensive search of major biomedical and specialist complementary and alternative medicine databases was conducted. Citations were included from the databases' inception to March 2005. Efforts were made to identify unpublished and ongoing research. Controlled trials were appraised using predefined criteria. Clinical commentaries were obtained for each study. Seven randomized controlled clinical trials and one controlled clinical trial were found. Studies report positive results, including statistically significant reductions in pain and anxiety/distress, but a number of methodological limitations were identified. Systematic searching and appraisal has demonstrated that hypnosis has potential as a clinically valuable intervention for procedure-related pain and distress in pediatric cancer patients. Further research into the effectiveness and acceptability of hypnosis for pediatric cancer patients is recommended.
Value assignment and uncertainty evaluation for single-element reference solutions
NASA Astrophysics Data System (ADS)
Possolo, Antonio; Bodnar, Olha; Butler, Therese A.; Molloy, John L.; Winchester, Michael R.
2018-06-01
A Bayesian statistical procedure is proposed for value assignment and uncertainty evaluation for the mass fraction of the elemental analytes in single-element solutions distributed as NIST standard reference materials. The principal novelty that we describe is the use of information about relative differences observed historically between the measured values obtained via gravimetry and via high-performance inductively coupled plasma optical emission spectrometry, to quantify the uncertainty component attributable to between-method differences. This information is encapsulated in a prior probability distribution for the between-method uncertainty component, and it is then used, together with the information provided by current measurement data, to produce a probability distribution for the value of the measurand from which an estimate and evaluation of uncertainty are extracted using established statistical procedures.
Using a Five-Step Procedure for Inferential Statistical Analyses
ERIC Educational Resources Information Center
Kamin, Lawrence F.
2010-01-01
Many statistics texts pose inferential statistical problems in a disjointed way. By using a simple five-step procedure as a template for statistical inference problems, the student can solve problems in an organized fashion. The problem and its solution will thus be a stand-by-itself organic whole and a single unit of thought and effort. The…
Dominance, Information, and Hierarchical Scaling of Variance Space.
ERIC Educational Resources Information Center
Ceurvorst, Robert W.; Krus, David J.
1979-01-01
A method for computation of dominance relations and for construction of their corresponding hierarchical structures is presented. The link between dominance and variance allows integration of the mathematical theory of information with least squares statistical procedures without recourse to logarithmic transformations of the data. (Author/CTM)
The 1996 Food Quality Protection Act (FQPA) and the 1996 Safe Drinking Water Act Amendments (SDWAA) reaffirm previous Acts that mandate the EPA to evaluate risks posed by environmental chemical mixtures. The current report develops biological concepts and statistical procedures f...
Medical statistics and hospital medicine: the case of the smallpox vaccination.
Rusnock, Andrea
2007-01-01
Between 1799 and 1806, trials of vaccination to determine its safety and efficacy were undertaken in hospitals in London, Paris, Vienna, and Boston. These trials were among the first instances of formal hospital evaluations of a medical procedure and signal a growing acceptance of a relatively new approach to medical practice. These early evaluations of smallpox vaccination also relied on descriptive and quantitative accounts, as well as probabilistic analyses, and thus occupy a significant, yet hitherto unexamined, place in the history of medical statistics.
Using Relative Statistics and Approximate Disease Prevalence to Compare Screening Tests.
Samuelson, Frank; Abbey, Craig
2016-11-01
Schatzkin et al. and other authors demonstrated that the ratios of some conditional statistics, such as the true-positive fraction, are equal to the ratios of unconditional statistics, such as disease detection rates; therefore, we can calculate these ratios between two screening tests on the same population even if patients with negative tests are not followed up with a reference procedure and the true- and false-negative rates are unknown. We demonstrate that this same property applies to an expected-utility metric. We also demonstrate that while simple estimates of relative specificities and relative areas under ROC curves (AUC) do depend on the unknown negative rates, we can write these ratios in terms of disease prevalence, and the dependence of these ratios on a posited prevalence is often weak, particularly if that prevalence is small or the performance of the two screening tests is similar. Therefore we can estimate relative specificity or relative AUC with little loss of accuracy if we use an approximate value of disease prevalence.
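The prevalence argument can be made concrete with a few lines of arithmetic. In the hedged sketch below (all rates hypothetical), relative sensitivity is a simple ratio of detection rates, while specificity is recovered from the overall test-positive rate under a posited prevalence; varying that prevalence shows how weakly the specificity ratio depends on it.

```python
# Relative screening statistics under a posited prevalence (hypothetical rates).
def relative_sensitivity(detection_rate_1, detection_rate_2):
    # Ratio of detection rates on the same screened population.
    return detection_rate_1 / detection_rate_2

def specificity(test_pos_rate, tpf, prevalence):
    # P(test+) = tpf*prev + fpf*(1-prev)  =>  fpf = (P(test+) - tpf*prev)/(1-prev)
    fpf = (test_pos_rate - tpf * prevalence) / (1 - prevalence)
    return 1 - fpf

for prev in (0.005, 0.01, 0.02):       # posited disease prevalences
    spec_a = specificity(0.10, tpf=0.85, prevalence=prev)
    spec_b = specificity(0.12, tpf=0.90, prevalence=prev)
    print(f"prev = {prev:.3f}: relative specificity = {spec_a / spec_b:.4f}")
# The ratio barely moves as the posited prevalence doubles and doubles again.
```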
40 CFR 1065.12 - Approval of alternate procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... engine meets all applicable emission standards according to specified procedures. (iii) Use statistical.... (e) We may give you specific directions regarding methods for statistical analysis, or we may approve... statistical tests. Perform the tests as follows: (1) Repeat measurements for all applicable duty cycles at...
Bayesian estimation of the transmissivity spatial structure from pumping test data
NASA Astrophysics Data System (ADS)
Demir, Mehmet Taner; Copty, Nadim K.; Trinchero, Paolo; Sanchez-Vila, Xavier
2017-06-01
Estimating the statistical parameters (mean, variance, and integral scale) that define the spatial structure of the transmissivity or hydraulic conductivity fields is a fundamental step for the accurate prediction of subsurface flow and contaminant transport. In practice, the determination of the spatial structure is a challenge because of spatial heterogeneity and data scarcity. In this paper, we describe a novel approach that uses time-drawdown data from multiple pumping tests to determine the transmissivity statistical spatial structure. The method builds on the pumping test interpretation procedure of Copty et al. (2011) (Continuous Derivation method, CD), which uses the time-drawdown data and its time derivative to estimate apparent transmissivity values as a function of radial distance from the pumping well. A Bayesian approach is then used to infer the statistical parameters of the transmissivity field by combining prior information about the parameters and the likelihood function expressed in terms of radially dependent apparent transmissivities determined from pumping tests. A major advantage of the proposed Bayesian approach is that the likelihood function is readily determined from multiple randomly generated realizations of the transmissivity field, without the need to solve the groundwater flow equation. Applying the method to synthetically generated pumping test data, we demonstrate that, through a relatively simple procedure, information on the spatial structure of the transmissivity may be inferred from pumping test data. It is also shown that the prior parameter distribution has a significant influence on the estimation procedure, given the non-uniqueness of the estimation procedure. Results also indicate that the reliability of the estimated transmissivity statistical parameters increases with the number of available pumping tests.
Analysis of half diallel mating designs I: a practical analysis procedure for ANOVA approximation.
G.R. Johnson; J.N. King
1998-01-01
Procedures to analyze half-diallel mating designs using the SAS statistical package are presented. The procedure requires two runs of PROC VARCOMP and results in estimates of additive and non-additive genetic variation. The procedures described can be modified to work with most statistical software packages that can compute variance component estimates. The...
Subotin, Michael; Davis, Anthony R
2016-09-01
Natural language processing methods for medical auto-coding, or automatic generation of medical billing codes from electronic health records, generally assign each code independently of the others. They may thus assign codes for closely related procedures or diagnoses to the same document, even when they do not tend to occur together in practice, simply because the right choice can be difficult to infer from the clinical narrative. We propose a method that injects awareness of the propensities for code co-occurrence into this process. First, a model is trained to estimate the conditional probability that one code is assigned by a human coder, given that another code is known to have been assigned to the same document. Then, at runtime, an iterative algorithm is used to apply this model to the output of an existing statistical auto-coder to modify the confidence scores of the codes. We tested this method in combination with a primary auto-coder for International Statistical Classification of Diseases-10 procedure codes, achieving a 12% relative improvement in F-score over the primary auto-coder baseline. The proposed method can be used, with appropriate features, in combination with any auto-coder that generates codes with different levels of confidence. The promising results obtained for International Statistical Classification of Diseases-10 procedure codes suggest that the proposed method may have wider applications in auto-coding. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
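One plausible shape for such an iterative rescoring step is sketched below; this is not the authors' exact algorithm, and all names, codes, and probabilities are hypothetical. Each code's primary confidence is blended with the average conditional probability given the codes currently believed assigned, and the process repeats.

```python
# Hedged sketch of iterative confidence rescoring with co-occurrence priors.
def rescore(primary, cooc, iterations=5, alpha=0.5, threshold=0.5):
    """primary: {code: confidence from the base auto-coder};
    cooc[a][b]: estimated P(code a assigned | code b assigned)."""
    scores = dict(primary)
    for _ in range(iterations):
        assigned = {c for c, v in scores.items() if v >= threshold}
        updated = {}
        for code, base in primary.items():
            evidence = [cooc.get(code, {}).get(other)
                        for other in assigned if other != code]
            evidence = [e for e in evidence if e is not None]
            if evidence:
                context = sum(evidence) / len(evidence)
                updated[code] = (1 - alpha) * base + alpha * context
            else:
                updated[code] = base       # no co-occurrence signal available
        scores = updated
    return scores

# Hypothetical codes: the low-confidence code gains support from a
# co-assigned code with which it frequently co-occurs in training data.
print(rescore({"codeA": 0.45, "codeB": 0.80},
              {"codeA": {"codeB": 0.7}, "codeB": {"codeA": 0.3}}))
```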
Dangers in Using Analysis of Covariance Procedures.
ERIC Educational Resources Information Center
Campbell, Kathleen T.
Problems associated with the use of analysis of covariance (ANCOVA) as a statistical control technique are explained. Three problems relate to the use of "OVA" methods (analysis of variance, analysis of covariance, multivariate analysis of variance, and multivariate analysis of covariance) in general. These are: (1) the wasting of information when…
Standard Operating Procedures for Collecting Data from Local Education Agencies.
ERIC Educational Resources Information Center
McElreath, Nancy R., Ed.
A systematic approach to planning and presenting the data collection activities of a State Department of Education is described. The Information Communication System, a model communication system used by the state of New Jersey, conveys narrative and statistical information relating to a school district's students, teachers, finances, facilities…
Applications of Longitudinal Research to Advance Organizational Theory: A Primer
ERIC Educational Resources Information Center
Newman, David; Newman, Isadore; Hitchcock, John H.
2016-01-01
The purpose of this article is to inform researchers about and encourage the use of longitudinal designs to further understanding of human resource development and organizational theory. This article presents information about a variety of longitudinal research designs, related statistical procedures, and an overview of general data collecting…
Observed-Score Equating with a Heterogeneous Target Population
ERIC Educational Resources Information Center
Duong, Minh Q.; von Davier, Alina A.
2012-01-01
Test equating is a statistical procedure for adjusting for test form differences in difficulty in a standardized assessment. Equating results are supposed to hold for a specified target population (Kolen & Brennan, 2004; von Davier, Holland, & Thayer, 2004) and to be (relatively) independent of the subpopulations from the target population (see…
Linear retrieval and global measurements of wind speed from the Seasat SMMR
NASA Technical Reports Server (NTRS)
Pandey, P. C.
1983-01-01
Retrievals of wind speed (WS) from the Seasat Scanning Multichannel Microwave Radiometer (SMMR) were performed using a two-step statistical technique. Nine subsets of two to five SMMR channels were examined for wind speed retrieval. These subsets were derived by applying a leaps-and-bounds procedure, based on the coefficient-of-determination selection criterion, to a statistical database of brightness temperatures and geophysical parameters. Analysis of Monsoon Experiment and ocean station PAPA data showed a strong correlation between sea surface temperature and water vapor. This relation was used in generating the statistical database. Global maps of WS were produced for one- and three-month periods.
Trends in Percutaneous Coronary Intervention and Coronary Artery Bypass Surgery in Korea.
Lee, Heeyoung; Lee, Kun Sei; Sim, Sung Bo; Jeong, Hyo Seon; Ahn, Hye Mi; Chee, Hyun Keun
2016-12-01
Coronary angioplasty has been replacing coronary artery bypass grafting (CABG) because of the relative advantage in terms of recovery time and the noninvasiveness of the procedure. Compared to other Organization for Economic Cooperation and Development (OECD) countries, Korea has experienced a rapid increase in coronary angioplasty volumes. We analyzed changes in procedure volumes of CABG and of percutaneous coronary intervention (PCI) from three sources: the OECD Health Data, the National Health Insurance Service (NHIS) surgery statistics, and the National Health Insurance claims data. We found the ratio of procedure volume of PCI to that of CABG per 100,000 population was 19.12 in 2014, which was more than triple the OECD average of 5.92 for the same year. According to data from NHIS statistics, this ratio increased from 11.4 to 19.3 between 2006 and 2013. We found that Korea has a higher ratio of total procedure volumes of PCI with respect to CABG and also a more rapid increase in volumes of PCI than other countries. Prospective studies are required to determine whether this increase in absolute volumes of PCI is a natural response to a real medical need or representative of medical overuse.
Monitoring Items in Real Time to Enhance CAT Security
ERIC Educational Resources Information Center
Zhang, Jinming; Li, Jie
2016-01-01
An IRT-based sequential procedure is developed to monitor items for enhancing test security. The procedure uses a series of statistical hypothesis tests to examine whether the statistical characteristics of each item under inspection have changed significantly during CAT administration. This procedure is compared with a previously developed…
Statistical approaches used to assess and redesign surface water-quality-monitoring networks.
Khalil, B; Ouarda, T B M J
2009-11-01
An up-to-date review of the statistical approaches utilized for the assessment and redesign of surface water quality monitoring (WQM) networks is presented. The main technical aspects of network design are covered in four sections, addressing monitoring objectives, water quality variables, sampling frequency and spatial distribution of sampling locations. This paper discusses various monitoring objectives and related procedures used for the assessment and redesign of long-term surface WQM networks. The appropriateness of each approach for the design, contraction or expansion of monitoring networks is also discussed. For each statistical approach, its advantages and disadvantages are examined from a network design perspective. Possible methods to overcome disadvantages and deficiencies in the statistical approaches that are currently in use are recommended.
McAlinden, Colm; Khadka, Jyoti; Pesudovs, Konrad
2011-07-01
The ever-expanding choice of ocular metrology and imaging equipment has driven research into the validity of their measurements. Consequently, studies of the agreement between two instruments or clinical tests have proliferated in the ophthalmic literature. It is important that researchers apply the appropriate statistical tests in agreement studies. Correlation coefficients are hazardous and should be avoided. The 'limits of agreement' method originally proposed by Altman and Bland in 1983 is the statistical procedure of choice. Its step-by-step use and practical considerations in relation to optometry and ophthalmology are detailed in addition to sample size considerations and statistical approaches to precision (repeatability or reproducibility) estimates. Ophthalmic & Physiological Optics © 2011 The College of Optometrists.
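For reference, the limits-of-agreement computation itself is short: the bias is the mean of the paired differences, and the 95% limits are the bias plus or minus 1.96 times their standard deviation. The measurements below are hypothetical, not drawn from any study.

```python
# Bland-Altman limits of agreement for two instruments (hypothetical data).
import numpy as np

instrument_a = np.array([43.1, 44.0, 42.5, 45.2, 43.8, 44.6])
instrument_b = np.array([43.4, 43.7, 42.9, 45.0, 44.3, 44.4])

diff = instrument_a - instrument_b
bias = diff.mean()                    # mean difference (systematic offset)
loa = 1.96 * diff.std(ddof=1)         # half-width of the 95% limits
print(f"bias = {bias:.2f}; 95% limits of agreement = "
      f"({bias - loa:.2f}, {bias + loa:.2f})")
```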
Using Statistical Process Control to Make Data-Based Clinical Decisions.
ERIC Educational Resources Information Center
Pfadt, Al; Wheeler, Donald J.
1995-01-01
Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
Enhancement of event related potentials by iterative restoration algorithms
NASA Astrophysics Data System (ADS)
Pomalaza-Raez, Carlos A.; McGillem, Clare D.
1986-12-01
An iterative procedure for the restoration of event related potentials (ERP) is proposed and implemented. The method makes use of assumed or measured statistical information about latency variations in the individual ERP components. The signal model used for the restoration algorithm consists of a time-varying linear distortion and a positivity/negativity constraint. Additional preprocessing in the form of low-pass filtering is needed in order to mitigate the effects of additive noise. Numerical results obtained with real data clearly show the presence of enhanced and regenerated components in the restored ERPs. The procedure is easy to implement, which makes it convenient compared to other proposed techniques for the restoration of ERP signals.
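The flavor of such an iterative constrained restoration can be conveyed with a generic projected-Landweber sketch: a gradient step on the data misfit followed by re-imposing a positivity constraint. This simplification assumes a fixed, known distortion operator, whereas the paper's model is time-varying; the signal and operator below are synthetic.

```python
# Projected-Landweber restoration sketch with a positivity constraint.
import numpy as np

rng = np.random.default_rng(8)
n = 128
true = np.zeros(n); true[40:50] = 1.0             # toy positive component
H = np.eye(n) * 0.6 + np.eye(n, k=1) * 0.3        # assumed smearing operator
y = H @ true + rng.normal(0, 0.02, n)             # distorted, noisy observation

x = np.zeros(n)
step = 0.5                                        # step size (below 2/||H||^2)
for _ in range(200):
    x = x + step * H.T @ (y - H @ x)              # gradient step on misfit
    x = np.clip(x, 0.0, None)                     # re-impose positivity
print(f"max residual after restoration = {np.abs(y - H @ x).max():.3f}")
```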
Topical anaesthesia for needle-related pain in newborn infants.
Foster, Jann P; Taylor, Christine; Spence, Kaye
2017-02-04
Hospitalised newborn neonates frequently undergo painful invasive procedures that involve penetration of the skin and other tissues by a needle. One intervention that can be used prior to a needle insertion procedure is application of a topical local anaesthetic. To evaluate the efficacy and safety of topical anaesthetics such as amethocaine and EMLA in newborn term or preterm infants requiring an invasive procedure involving puncture of skin and other tissues with a needle. We searched the Cochrane Central Register of Controlled Trials (CENTRAL), PubMed, Embase and CINAHL up to 15 May 2016; previous reviews including cross-references, abstracts, and conference proceedings. We contacted expert informants. We contacted authors directly to obtain additional data. We imposed no language restrictions. Randomised, quasi-randomised controlled trials, and cluster and cross-over randomised trials that compared the topical anaesthetics amethocaine and eutectic mixture of local anaesthetics (EMLA) in terms of anaesthetic efficacy and safety in newborn term or preterm infants requiring an invasive procedure involving puncture of skin and other tissues with a needle. DATA COLLECTION AND ANALYSIS: From the reports of the clinical trials we extracted data regarding clinical outcomes including pain, number of infants with a methaemoglobin level of 5% and above, number of needle prick attempts prior to a successful needle-related procedure, crying, time taken to complete the procedure, episodes of apnoea, episodes of bradycardia, episodes of oxygen desaturation, neurodevelopmental disability and other adverse events. Eight small randomised controlled trials met the inclusion criteria (n = 506). These studies compared either EMLA and placebo or amethocaine and placebo. No studies compared EMLA and amethocaine. We were unable to meta-analyse the outcome of pain due to differing outcome measures and methods of reporting. For EMLA, two individual studies reported a statistically significant reduction in pain compared to placebo during lumbar puncture and venepuncture. Three studies found no statistical difference between the groups during heel lancing. For amethocaine, three studies reported a statistically significant reduction in pain compared to placebo during venepuncture and one study reported a statistically significant reduction in pain compared to placebo during cannulation. One study reported no statistical difference between the two groups during intramuscular injection. One study reported no statistical difference between EMLA and the placebo group for successful venepuncture at first attempt. One study similarly reported no statistically significant difference between amethocaine and the placebo group for successful cannulation at first attempt. Risk for local redness, swelling or blanching was significantly higher with EMLA (typical risk ratio (RR) 1.65, 95% confidence interval (CI) 1.24 to 2.19; typical risk difference (RD) 0.17, 95% CI 0.09 to 0.26; n = 272; number needed to treat for an additional harmful outcome (NNTH) 6, 95% CI 4 to 11; I² = 92%, indicating considerable heterogeneity), although not for amethocaine (typical RR 2.11, 95% CI 0.72 to 6.16; typical RD 0.05, 95% CI -0.02 to 0.11; n = 221). These local skin reactions for EMLA and amethocaine were reported as short-lasting. Two studies reported no methaemoglobinaemia with single application of EMLA. The quality of the evidence on outcomes assessed according to GRADE was low to moderate.
Overall, all the trials were small, and the effects were of uncertain clinical significance. The evidence regarding the effectiveness or safety of the interventions studied is inadequate to support clinical recommendations. There has been no evaluation of any long-term effects of topical anaesthetics in newborn infants. High-quality studies evaluating the efficacy and safety of topical anaesthetics such as amethocaine and EMLA for needle-related pain in newborn term or preterm infants are required. These studies should aim to determine the efficacy of these topical anaesthetics in groups of infants homogeneous for gestational age. While there was no methaemoglobinaemia in the studies that reported methaemoglobin, the efficacy and safety of EMLA, especially in very preterm infants and with repeated application, need to be further evaluated in future studies.
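The relationship among the effect measures quoted above is simple arithmetic; the sketch below computes RR, RD, and NNTH from a hypothetical 2×2 table (not the review's data), with NNTH as the reciprocal of the risk difference.

```python
# RR, RD, and NNTH from hypothetical 2x2 counts for a harmful outcome.
events_tx, n_tx = 45, 140        # e.g., local skin reaction with anaesthetic
events_ctl, n_ctl = 22, 132      # with placebo

risk_tx = events_tx / n_tx
risk_ctl = events_ctl / n_ctl
rr = risk_tx / risk_ctl          # risk ratio
rd = risk_tx - risk_ctl          # risk difference
nnth = 1 / rd                    # patients treated per one extra harm
print(f"RR = {rr:.2f}, RD = {rd:.2f}, NNTH = {nnth:.1f}")
```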
Vanaelst, Jolien; Spruyt, Adriaan; De Houwer, Jan
2016-01-01
We demonstrate that feature-specific attention allocation influences the way in which repeated exposure modulates implicit and explicit evaluations toward fear-related stimuli. During an exposure procedure, participants were encouraged to assign selective attention either to the evaluative meaning (i.e., Evaluative Condition) or a non-evaluative, semantic feature (i.e., Semantic Condition) of fear-related stimuli. The influence of the exposure procedure was captured by means of a measure of implicit evaluation, explicit evaluative ratings, and a measure of automatic approach/avoidance tendencies. As predicted, the implicit measure of evaluation revealed a reduced expression of evaluations in the Semantic Condition as compared to the Evaluative Condition. Moreover, this effect generalized toward novel objects that were never presented during the exposure procedure. The explicit measure of evaluation mimicked this effect, although it failed to reach conventional levels of statistical significance. No effects were found in terms of automatic approach/avoidance tendencies. Potential implications for the treatment of anxiety disorders are discussed. PMID:27242626
Zhang, Q-X; Xie, J-F; Zhou, J-D; Xiao, S-S; Liu, A-Z; Hu, G-Q; Chen, Y; Wang, C-Y
2017-11-01
This study's purpose was to investigate attitudes toward organ donation among renal transplantation patients and their caregivers, and to explore the factors that affect their attitudes toward deceased organ donation. A self-administered questionnaire was used, which consisted of two parts: 1) demographic data, and 2) transplantation- and donation-related data. This study was conducted in three transplantation follow-up centers in three hospitals using a cross-sectional approach. SPSS 17.0 software was used for descriptive and inferential statistical analysis of the data. The responses were analyzed using descriptive statistics and logistic regression analysis. We received 426 effective questionnaires. The renal transplantation patients' mean age was 40.84 years. Among these patients, 67.8% were willing to accept organ transplantation surgery for their relatives, 67.4% were willing to donate a living kidney to a close relative, 62.7% were willing to donate organs after death, 53.5% were willing to register in the national organ donation system, and 51.4% were willing to sign the organ donation consent when facing their relatives becoming a potential organ donor. Age, marriage status, education level, understanding of transplantation procedures, and understanding of donation procedures were significantly associated with differences in attitude toward donating organs after death (P < .05). Renal transplantation patients in our study were more willing to donate organs after death than their caregivers, but neither group's attitude toward deceased donation was very positive. There is a significant relationship between participants' willingness and knowledge of organ donation; patients with more understanding of the transplantation and donation procedures were more willing to donate organs after death. Affected by traditional values such as Confucianism, many people still cannot accept registering in the national organ donation system or signing the organ donation consent when facing their relatives becoming potential organ donors. There is a need to give adequate training regarding donation to increase donation rates. The government must provide education from the perspective of scientific knowledge to change the traditional views of the public, which may then increase the donation rate in China. Copyright © 2017 Elsevier Inc. All rights reserved.
Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization
ERIC Educational Resources Information Center
Lock, Robin H.; Lock, Patti Frazer
2008-01-01
Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation based procedures in an integrated curriculum…
ERIC Educational Resources Information Center
Madhere, Serge
An analytic procedure, efficiency analysis, is proposed for improving the utility of quantitative program evaluation for decision making. The three features of the procedure are explained: (1) for statistical control, it adopts and extends the regression-discontinuity design; (2) for statistical inferences, it de-emphasizes hypothesis testing in…
Alles, Susan; Peng, Linda X; Mozola, Mark A
2009-01-01
A modification to Performance-Tested Method 010403, GeneQuence Listeria Test (DNAH method), is described. The modified method uses a new media formulation, LESS enrichment broth, in single-step enrichment protocols for both foods and environmental sponge and swab samples. Food samples are enriched for 27-30 h at 30 degrees C, and environmental samples for 24-48 h at 30 degrees C. Implementation of these abbreviated enrichment procedures allows test results to be obtained on a next-day basis. In testing of 14 food types in internal comparative studies with inoculated samples, there were statistically significant differences in method performance between the DNAH method and reference culture procedures for only 2 foods (pasteurized crab meat and lettuce) at the 27 h enrichment time point and for only a single food (pasteurized crab meat) in one trial at the 30 h enrichment time point. Independent laboratory testing with 3 foods showed statistical equivalence between the methods for all foods, and results support the findings of the internal trials. Overall, considering both internal and independent laboratory trials, sensitivity of the DNAH method relative to the reference culture procedures was 90.5%. Results of testing 5 environmental surfaces inoculated with various strains of Listeria spp. showed that the DNAH method was more productive than the reference U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS) culture procedure for 3 surfaces (stainless steel, plastic, and cast iron), whereas results were statistically equivalent to the reference method for the other 2 surfaces (ceramic tile and sealed concrete). An independent laboratory trial with ceramic tile inoculated with L. monocytogenes confirmed the effectiveness of the DNAH method at the 24 h time point. Overall, sensitivity of the DNAH method at 24 h relative to that of the USDA-FSIS method was 152%. The DNAH method exhibited extremely high specificity, with only 1% false-positive reactions overall.
Roshan, N M; Sakeenabi, B
2012-01-01
To evaluate anxiety in children during occlusal atraumatic restorative treatment (ART) in primary molars, and to compare anxiety for the ART procedure performed in a school environment and in a hospital dental setup. A randomized controlled trial in which one dentist placed 120 ART restorations in 60 five- to seven-year-olds who had bilateral matched pairs of occlusal carious primary molars. A split-mouth design was used to place restorations in school and in the hospital dental setup, which were assigned randomly to contralateral sides. Anxiety was evaluated by the modified Venham score and the heart rate of the children at five fixed moments during dental treatment. At the entrance of the children into the treatment room, a statistically significant difference between treatment in the school environment and treatment in the hospital dental setup was found for the Venham score and heart rate (P = 0.023 and P = 0.037, respectively). At the start of the treatment procedure, higher Venham scores and heart rates were observed in children treated in the hospital dental setup compared with children treated in the school environment; this finding was statistically significant (P = 0.011 and P = 0.029, respectively). At all other three points of treatment, the Venham scores of the children treated in school were lower than those of the children treated in the hospital dental setup, but not statistically significantly so (P > 0.05). A positive correlation between Venham scores and heart rate was established. No statistically significant difference could be established between boys and girls. Overall anxiety in children during ART treatment was found to be low, and the procedure was well accepted irrespective of the environment where treatment was performed. The hospital dental setup by itself made children anxious during entrance and at the start of treatment when compared to children treated in the school environment.
Consequences of common data analysis inaccuracies in CNS trauma injury basic research.
Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K
2013-05-15
The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify and hopefully avoid these errors in future studies, the articles in seven journals with the highest number of basic science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures, then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated Measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared to the original results. Specifically, an increase of 15.5% occurred with Independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures can allow more-definitive conclusions, facilitate replicability of research results, and enable more accurate translation of those results to the clinic.
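The incorrect pattern flagged here, many unadjusted post hoc t-tests, can be avoided with an omnibus ANOVA followed by a multiplicity-controlling comparison. A minimal Python sketch with simulated group scores (all group names and means hypothetical):

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
# Hypothetical outcome scores for three treatment groups of 12 animals each.
scores = np.concatenate([rng.normal(m, 2.0, 12) for m in (10, 10, 13)])
groups = np.repeat(["sham", "vehicle", "treated"], 12)

# One-way ANOVA first; only if the omnibus test is significant do we compare
# group pairs, using a procedure that controls the familywise error rate.
f_stat, p_omnibus = stats.f_oneway(*(scores[groups == g] for g in np.unique(groups)))
print(f"ANOVA: F = {f_stat:.2f}, p = {p_omnibus:.4f}")
if p_omnibus < 0.05:
    print(pairwise_tukeyhsd(scores, groups))
```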
Factors Affecting Smoking Tendency and Smoking Intensity
ERIC Educational Resources Information Center
David, Nissim Ben; Zion, Uri Ben
2009-01-01
Purpose: The purpose of this paper is to measure the relative effect of relevant explanatory variable on smoking tendency and smoking intensity. Design/methodology/approach: Using survey data collected by the Israeli Bureau of Statistics in 2003-2004, a probit procedure is estimated for analyzing factors that affect the probability of being a…
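A probit model of the kind named here can be sketched with statsmodels; the covariates below are hypothetical stand-ins for the survey variables, not the authors' specification:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
# Hypothetical covariates standing in for the survey variables.
age = rng.uniform(18, 70, n)
income = rng.normal(0, 1, n)
latent = -1.0 + 0.02 * age - 0.3 * income + rng.normal(0, 1, n)
smokes = (latent > 0).astype(int)  # 1 if the latent propensity crosses zero

X = sm.add_constant(np.column_stack([age, income]))
fit = sm.Probit(smokes, X).fit(disp=0)
print(fit.params)
# Marginal effects express each variable's effect on the smoking probability.
print(fit.get_margeff().summary())
```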
Outliers: A Potential Data Problem.
ERIC Educational Resources Information Center
Douzenis, Cordelia; Rakow, Ernest A.
Outliers, extreme data values relative to others in a sample, may distort statistics that assume interval levels of measurement and normal distribution. The outlier may be a valid value or an error. Several procedures are available for identifying outliers, and each may be applied to errors of prediction from the regression lines for utility in a…
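A common screening approach consistent with this description uses externally studentized residuals from an ordinary least-squares fit (synthetic data, assumed cutoff):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import OLSInfluence

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 40)
y = 2.0 + 1.5 * x + rng.normal(0, 1, 40)
y[5] += 12.0  # plant one artificial outlier

fit = sm.OLS(y, sm.add_constant(x)).fit()
# Externally studentized residuals: each point's prediction error scaled
# by an error estimate that excludes the point itself.
resid = OLSInfluence(fit).resid_studentized_external
suspects = np.where(np.abs(resid) > 3.0)[0]  # common rule-of-thumb cutoff
print("flagged observations:", suspects)
```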
Principles of Quantile Regression and an Application
ERIC Educational Resources Information Center
Chen, Fang; Chalhoub-Deville, Micheline
2014-01-01
Newer statistical procedures are typically introduced to help address the limitations of those already in practice or to deal with emerging research needs. Quantile regression (QR) is introduced in this paper as a relatively new methodology, which is intended to overcome some of the limitations of least squares mean regression (LMR). QR is more…
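A minimal illustration of QR beside LMR, using statsmodels on heteroscedastic synthetic data (all parameters hypothetical):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 300
x = rng.uniform(0, 10, n)
# Heteroscedastic data: the spread grows with x, which a single mean
# regression line summarizes poorly.
y = 1.0 + 0.8 * x + rng.normal(0, 0.3 + 0.2 * x, n)
df = pd.DataFrame({"x": x, "y": y})

ols_fit = smf.ols("y ~ x", df).fit()
for q in (0.1, 0.5, 0.9):
    qr_fit = smf.quantreg("y ~ x", df).fit(q=q)
    print(f"quantile {q}: slope = {qr_fit.params['x']:.3f}")
print(f"OLS (mean) slope = {ols_fit.params['x']:.3f}")
```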
Endodontic interappointment flare-ups: a prospective study of incidence and related factors.
Walton, R; Fouad, A
1992-04-01
Severe pain and/or swelling following a root canal treatment appointment are serious sequelae. Information varies or is incomplete as to the incidence of these conditions and related factors. In this study, data were collected at root canal treatment appointments on demographics, pulp/periapical diagnoses, presenting symptoms, treatment procedures, and number of appointments. For patients who then experienced a flare-up (a severe problem requiring an unscheduled visit and treatment), the correlated factors were examined. Statistical determinations were made by chi-square analysis with significance at 0.05 or less. Nine hundred forty-six visits resulted in a flare-up incidence of 3.17%. Flare-ups were positively correlated with more severe presenting symptoms, pulp necrosis with painful apical pathosis, and patients on analgesics. Fewer flare-ups occurred in undergraduate patients and following obturation procedures. There was no correlation with patient demographics or systemic conditions, number of appointments, treatment procedures, or taking antibiotics.
Osland, Emma; Yunus, Rossita Mohamad; Khan, Shahjahan; Alodat, Tareq; Memon, Breda; Memon, Muhammed Ashraf
2016-10-01
Laparoscopic Roux-en-Y gastric bypass (LRYGB) and laparoscopic vertical sleeve gastrectomy (LVSG) have been proposed as cost-effective strategies to manage obesity-related chronic disease. The aim of this meta-analysis and systematic review was to compare the early postoperative complication rate (i.e., within 30 days) reported from randomized control trials (RCTs) comparing these two procedures. RCTs comparing the early complication rates following LVSG and LRYGB between 2000 and 2015 were selected from PubMed, Medline, Embase, Science Citation Index, Current Contents, and the Cochrane database. The outcome variables analyzed included 30-day mortality, major and minor complications and interventions required for their management, length of hospital stay, readmission rates, operating time, and conversions from laparoscopic to open procedures. Six RCTs involving a total of 695 patients (LVSG n = 347, LRYGB n = 348) reported on early major complications. A statistically significant reduction in relative odds of early major complications favoring the LVSG procedure was noted (p = 0.05). Five RCTs representing 633 patients (LVSG n = 317, LRYGB n = 316) reported early minor complications. A non-statistically significant reduction of 29% in relative odds favoring the LVSG procedure was observed for early minor complications (p = 0.4). However, other outcomes directly related to complications, which included reoperation rates, readmission rate, and 30-day mortality rate, showed comparable effect sizes for both surgical procedures. This meta-analysis and systematic review of RCTs suggests that fewer early major and minor complications are associated with LVSG compared with the LRYGB procedure. However, this does not translate into a higher readmission rate, reoperation rate, or 30-day mortality for either procedure.
Global aesthetic surgery statistics: a closer look.
Heidekrueger, Paul I; Juran, S; Ehrl, D; Aung, T; Tanna, N; Broer, P Niclas
2017-08-01
Obtaining quality global statistics about surgical procedures remains an important yet challenging task. The International Society of Aesthetic Plastic Surgery (ISAPS) reports the total number of surgical and non-surgical procedures performed worldwide on a yearly basis. While providing valuable insight, ISAPS' statistics leave two important factors unaccounted for: (1) the underlying base population, and (2) the number of surgeons performing the procedures. Statistics of the published ISAPS' 'International Survey on Aesthetic/Cosmetic Surgery' were analysed by country, taking into account the underlying national base population according to the official United Nations population estimates. Further, the number of surgeons per country was used to calculate the number of surgeries performed per surgeon. In 2014, based on ISAPS statistics, national surgical procedures ranked in the following order: 1st USA, 2nd Brazil, 3rd South Korea, 4th Mexico, 5th Japan, 6th Germany, 7th Colombia, and 8th France. When considering the size of the underlying national populations, the demand for surgical procedures per 100,000 people changes the overall ranking substantially. It was also found that the rate of surgical procedures per surgeon shows great variation between the responding countries. While the US and Brazil are often quoted as the countries with the highest demand for plastic surgery, according to the presented analysis, other countries surpass these countries in surgical procedures per capita. While data acquisition and quality should be improved in the future, valuable insight regarding the demand for surgical procedures can be gained by taking specific demographic and geographic factors into consideration.
Statistical analysis and digital processing of the Mössbauer spectra
NASA Astrophysics Data System (ADS)
Prochazka, Roman; Tucek, Pavel; Tucek, Jiri; Marek, Jaroslav; Mashlan, Miroslav; Pechousek, Jiri
2010-02-01
This work is focused on using statistical methods and developing filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in measured spectra are used in many scientific areas. The use of a pure statistical approach to the filtration of accumulated Mössbauer spectra is described. In Mössbauer spectroscopy, the noise can be considered a Poisson statistical process with a Gaussian distribution for high numbers of observations. This noise is a superposition of non-resonant photon counting and electronic noise (from γ-ray detection and discrimination units), together with the quality of the velocity system, which can be characterized by velocity nonlinearities. The possibility of a noise-reducing process using a new design of statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of the given Mössbauer spectra. The filter procedure is based on a periodogram method that makes it possible to identify the statistically important components in the spectral domain. The significance level for these components is then feedback-controlled using the correlation coefficient test results. The theoretical correlation coefficient level corresponding to the spectrum resolution is estimated. The correlation coefficient test is based on a comparison of the theoretical and experimental correlation coefficients given by the Spearman method. The correctness of this solution was analyzed by a series of statistical tests and confirmed by many spectra measured with increasing statistical quality for a given sample (absorber). The effect of this filter procedure depends on the signal-to-noise ratio, and the applicability of the method is subject to binding conditions.
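A much-simplified sketch of periodogram-style filtering (a fixed significance threshold instead of the authors' Spearman correlation feedback; all parameters hypothetical):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1024
channel = np.arange(n)
# Hypothetical Mössbauer-like spectrum: a flat baseline with a Lorentzian
# absorption dip, corrupted by Poisson counting noise.
baseline = 1e5
signal = baseline * (1 - 0.05 / (1 + ((channel - 512) / 20.0) ** 2))
counts = rng.poisson(signal).astype(float)

# Periodogram-style filter: keep only Fourier components whose power clearly
# exceeds the white-noise level expected from Poisson counting statistics
# (for Poisson data the noise variance is approximately the mean count).
spec = np.fft.rfft(counts - counts.mean())
power = np.abs(spec) ** 2 / n
noise_level = counts.mean()
keep = power > 5.0 * noise_level      # assumed significance factor
filtered = np.fft.irfft(np.where(keep, spec, 0), n) + counts.mean()
print(f"kept {keep.sum()} of {keep.size} spectral components")
```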
Counihan, T.D.; Miller, Allen I.; Parsley, M.J.
1999-01-01
The development of recruitment monitoring programs for age-0 white sturgeons Acipenser transmontanus is complicated by the statistical properties of catch-per-unit-effort (CPUE) data. We found that age-0 CPUE distributions from bottom trawl surveys violated assumptions of statistical procedures based on normal probability theory. Further, no single data transformation uniformly satisfied these assumptions because CPUE distribution properties varied with the sample mean CPUE. Given these analytic problems, we propose that an additional index of age-0 white sturgeon relative abundance, the proportion of positive tows (Ep), be used to estimate sample sizes before conducting age-0 recruitment surveys and to evaluate statistical hypothesis tests comparing the relative abundance of age-0 white sturgeons among years. Monte Carlo simulations indicated that Ep was consistently more precise than the mean CPUE, and because Ep is binomially rather than normally distributed, surveys can be planned and analyzed without violating the assumptions of procedures based on normal probability theory. However, we show that Ep may underestimate changes in relative abundance at high levels and confound our ability to quantify responses to management actions if relative abundance is consistently high. If data suggest that most samples will contain age-0 white sturgeons, estimators of relative abundance other than Ep should be considered. Because Ep may also obscure correlations to climatic and hydrologic variables if high abundance levels are present in time series data, we recommend the mean CPUE be used to describe relations to environmental variables. The use of both Ep and the mean CPUE will facilitate the evaluation of hypothesis tests comparing relative abundance levels and correlations to variables affecting age-0 recruitment. Estimated sample sizes for surveys should therefore be based on detecting predetermined differences in Ep, but data necessary to calculate the mean CPUE should also be collected.
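A toy Monte Carlo in the spirit of the comparison described here, assuming a hypothetical negative binomial catch distribution (all parameters invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
n_tows, n_reps = 50, 2000
# Hypothetical overdispersed catches, as is typical of trawl CPUE data.
mean_catch, dispersion = 0.8, 0.5
p = dispersion / (dispersion + mean_catch)

ep, mean_cpue = [], []
for _ in range(n_reps):
    catch = rng.negative_binomial(dispersion, p, size=n_tows)
    ep.append(np.mean(catch > 0))   # proportion of positive tows
    mean_cpue.append(catch.mean())  # mean CPUE

# Compare the relative precision (coefficient of variation) of the two indices.
for name, est in (("Ep", ep), ("mean CPUE", mean_cpue)):
    est = np.array(est)
    print(f"{name}: CV = {est.std() / est.mean():.3f}")
```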
Ballesteros-Peña, Sendoa; Vallejo-De la Hoz, Gorka; Fernández-Aedo, Irrintzi
2017-12-23
To analyse vein catheterisation and blood gas test-related pain among adult patients in the emergency department and to explore pain score-related factors. An observational, multicentre study was performed. Patients undergoing vein catheterisation or arterial puncture for a gas test were included consecutively. After each procedure, patients scored the pain experienced using the NRS-11. 780 vein catheterisations and 101 blood gas tests were analysed. Venipuncture was scored with an average of 2.8 (95% CI: 2.6-3), and arterial puncture with 3.6 (95% CI: 3.1-4). Iatrogenic pain scores were associated with moderate-to-high difficulty procedures (P<.001) and with the choice of the humeral rather than the radial artery (P=.02) in the gas test, and were correlated with baseline pain in venipunctures (P<.001). Pain scores related to other variables such as sex, place of origin or needle gauge did not present statistically significant differences. Vein catheterisation and blood gas testing can be considered mildly-to-moderately and moderately painful procedures, respectively. The pain score is associated with certain variables such as the difficulty of the procedure, the anatomic area of the puncture and baseline pain. A better understanding of painful effects related to emergency nursing procedures and the factors associated with pain self-perception could help to determine when and how to act to mitigate this undesired effect. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.
Two Paradoxes in Linear Regression Analysis.
Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong
2016-12-25
Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.
Computing Inter-Rater Reliability for Observational Data: An Overview and Tutorial
Hallgren, Kevin A.
2012-01-01
Many research designs require the assessment of inter-rater reliability (IRR) to demonstrate consistency among observational ratings provided by multiple coders. However, many studies use incorrect statistical procedures, fail to fully report the information necessary to interpret their results, or do not address how IRR affects the power of their subsequent analyses for hypothesis testing. This paper provides an overview of methodological issues related to the assessment of IRR with a focus on study design, selection of appropriate statistics, and the computation, interpretation, and reporting of some commonly-used IRR statistics. Computational examples include SPSS and R syntax for computing Cohen’s kappa and intra-class correlations to assess IRR. PMID:22833776
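The tutorial provides SPSS and R syntax; as a rough Python analogue, Cohen's kappa for two raters over the same nominal codes can be computed as follows (hypothetical ratings):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical categorical codes assigned by two raters to 12 observations.
rater1 = ["a", "a", "b", "b", "c", "a", "b", "c", "c", "a", "b", "b"]
rater2 = ["a", "b", "b", "b", "c", "a", "b", "c", "b", "a", "b", "c"]

# Kappa corrects raw percent agreement for agreement expected by chance.
kappa = cohen_kappa_score(rater1, rater2)
print(f"Cohen's kappa = {kappa:.3f}")
```

For continuous ratings, an intra-class correlation (as the paper also covers) would be the analogous statistic.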
Exposure of the surgeon's hands to radiation during hand surgery procedures.
Żyluk, Andrzej; Puchalski, Piotr; Szlosser, Zbigniew; Dec, Paweł; Chrąchol, Joanna
2014-01-01
The objective of the study was to assess the time of exposure of the surgeon's hands to radiation and to calculate the equivalent dose absorbed during surgery of hand and wrist fractures with C-arm fluoroscope guidance. The necessary data specified by the objective of the study were acquired from operations on 287 patients with fractures of fingers, metacarpals, wrist bones and distal radius. 218 operations (78%) were percutaneous procedures and 60 (22%) were performed by the open method. Data on the time of exposure and dose of radiation were acquired from the display of the fluoroscope, where they were automatically generated. These data were assigned to the individual patient, type of fracture, method of surgery and the operating surgeon. Fixations of distal radial fractures required longer times of radiation exposure (mean 61 sec.) than fractures of the wrist/metacarpals and fingers (38 and 32 sec., respectively), which was associated with absorption of significantly higher equivalent doses. Fixations of distal radial fractures by the open method were associated with statistically significantly higher equivalent doses (0.41 mSv) than percutaneous procedures (0.3 mSv). Fixations of wrist and metacarpal bone fractures by the open method were associated with lower equivalent doses (0.34 mSv) than percutaneous procedures (0.37 mSv), but the difference was not significant. Fixations of finger fractures by the open method were associated with lower equivalent doses (0.13 mSv) than percutaneous procedures (0.24 mSv), the difference being statistically non-significant. Statistically significant differences in exposure time and equivalent doses were noted between the 4 surgeons participating in the study, but no definitive relationship was found between these parameters and the surgeons' employment time. 1. Hand surgery procedures under fluoroscopic guidance are associated with mild exposure of the surgeons' hands to radiation. 2. The equivalent dose was related to the type of fracture, operative technique and, to some degree, the time of employment of the surgeon.
Majstorović, Branislava M; Simić, Snezana; Milaković, Branko D; Vucović, Dragan S; Aleksić, Valentina V
2010-01-01
In anaesthesiology, economic aspects have been insufficiently studied. The aim of this paper was to assess the rational choice of anaesthesiological services based on an analysis of their scope, distribution, trend and cost. The costs of anaesthesiological services were calculated based on "unit" prices from the Republic Health Insurance Fund. Data were analysed by methods of descriptive statistics, and statistical significance was tested by Student's t-test and the chi2-test. The number of general anaesthesias was higher and the average duration of general anaesthesia was shorter, without statistical significance (t-test, p = 0.436), during 2006 compared to the previous year. Local anaesthesia was used significantly more often (chi2-test, p = 0.001) in emergency surgery than in planned operations. The analysis of total anaesthesiological procedures revealed that the number of procedures significantly increased in ENT and MFH surgery and ophthalmology, while some reduction was observed in general surgery, orthopaedics and trauma surgery, and cardiovascular surgery (chi2-test, p = 0.000). The number of analgesia procedures was higher than that of other procedures (chi2-test, p = 0.000). The structure of the cost was 24% in neurosurgery, 16% in digestive (general) surgery, 14% in gynaecology and obstetrics, 13% in cardiovascular surgery and 9% in the emergency room. Anaesthesiological service costs were highest in neurosurgery, due to the length of anaesthesia, and in digestive surgery, due to the total number of general anaesthesias performed. It is important to implement pharmacoeconomic studies in all departments, and to separate the anaesthesia services for emergency and planned operations. The disproportion between the number of anaesthesias, surgical interventions and the number of patients in surgical departments gives reason to design a relational database.
Yago, Martín
2017-05-01
QC planning based on risk management concepts can reduce the probability of harming patients due to an undetected out-of-control error condition. It does this by selecting appropriate QC procedures to decrease the number of erroneous results reported. The selection can be easily made by using published nomograms for simple QC rules when the out-of-control condition results in increased systematic error. However, increases in random error also occur frequently and are difficult to detect, which can result in erroneously reported patient results. A statistical model was used to construct charts for the 1ks and X̄/χ² rules. The charts relate the increase in the number of unacceptable patient results reported due to an increase in random error with the capability of the measurement procedure. They thus allow for QC planning based on the risk of patient harm due to the reporting of erroneous results. 1ks rules are simple, all-around rules. Their ability to deal with increases in within-run imprecision is minimally affected by the possible presence of significant, stable, between-run imprecision. X̄/χ² rules perform better when the number of controls analyzed during each QC event is increased to improve QC performance. Using nomograms simplifies the selection of statistical QC procedures to limit the number of erroneous patient results reported due to an increase in analytical random error. The selection largely depends on the presence or absence of stable between-run imprecision. © 2017 American Association for Clinical Chemistry.
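A minimal simulation of the scenario these charts address, assuming a single control per QC event and a 1ks rule with k = 3 (all parameters hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)
k = 3.0           # 1ks rule: flag a run if one control falls outside +/- k SD
n_runs = 100_000

# Inflate the analytical SD to mimic an out-of-control increase in random
# error, and estimate how often the rule rejects the run.
for sd_inflation in (1.0, 1.5, 2.0, 3.0):
    controls = rng.normal(0.0, sd_inflation, n_runs)
    p_reject = np.mean(np.abs(controls) > k)
    print(f"SD x{sd_inflation}: P(run rejected) = {p_reject:.4f}")
```

At baseline (no inflation) the rejection probability is the false-rejection rate; the slow growth with moderate SD inflation illustrates why increases in random error are hard to detect.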
NASA Technical Reports Server (NTRS)
Batthauer, Byron E.
1987-01-01
This paper analyzes a NASA Convair 990 (CV-990) accident with emphasis on rejected-takeoff (RTO) decision making, training, procedures, and accident statistics. The NASA Aircraft Accident Investigation Board was somewhat perplexed that an aircraft could be destroyed as a result of blown tires during the takeoff roll. To provide a better understanding of tire failure RTO's, The Board obtained accident reports, Federal Aviation Administration (FAA) studies, and other pertinent information related to the elements of this accident. This material enhanced the analysis process and convinced the Accident Board that high-speed RTO's in transport aircraft should be given more emphasis during pilot training. Pilots should be made aware of various RTO situations and statistics with emphasis on failed-tire RTO's. This background information could enhance the split-second decision-making process that is required prior to initiating an RTO.
Austin, Peter C
2010-04-22
Multilevel logistic regression models are increasingly being used to analyze clustered data in medical, public health, epidemiological, and educational research. Procedures for estimating the parameters of such models are available in many statistical software packages. There is currently little evidence on the minimum number of clusters necessary to reliably fit multilevel regression models. We conducted a Monte Carlo study to compare the performance of different statistical software procedures for estimating multilevel logistic regression models when the number of clusters was low. We examined procedures available in BUGS, HLM, R, SAS, and Stata. We found that there were qualitative differences in the performance of different software procedures for estimating multilevel logistic models when the number of clusters was low. Among the likelihood-based procedures, estimation methods based on adaptive Gauss-Hermite approximations to the likelihood (glmer in R and xtlogit in Stata) or adaptive Gaussian quadrature (Proc NLMIXED in SAS) tended to have superior performance for estimating variance components when the number of clusters was small, compared to software procedures based on penalized quasi-likelihood. However, only Bayesian estimation with BUGS allowed for accurate estimation of variance components when there were fewer than 10 clusters. For all statistical software procedures, estimation of variance components tended to be poor when there were only five subjects per cluster, regardless of the number of clusters.
Applying a statistical PTB detection procedure to complement the gold standard.
Noor, Norliza Mohd; Yunus, Ashari; Bakar, S A R Abu; Hussin, Amran; Rijal, Omar Mohd
2011-04-01
This paper investigates a novel statistical discrimination procedure to detect PTB when the gold standard requirement is taken into consideration. Archived data were used to establish two groups of patients, the control group and the test group. The control group was used to develop the statistical discrimination procedure, using four vectors of wavelet coefficients as feature vectors for the detection of pulmonary tuberculosis (PTB), lung cancer (LC), and normal lung (NL). This discrimination procedure was investigated using the test group, where the numbers of sputum-positive and sputum-negative cases that were correctly classified as PTB cases were noted. The proposed statistical discrimination method is able to detect PTB and LC patients with a high true-positive fraction. The method is also able to detect PTB patients who are sputum negative and may therefore be used as a complement to the gold standard. Copyright © 2010 Elsevier Ltd. All rights reserved.
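A sketch of wavelet-coefficient feature extraction in the spirit described here (a synthetic 1-D profile; the authors' four specific feature vectors are not reproduced):

```python
import numpy as np
import pywt

rng = np.random.default_rng(8)
# Hypothetical 1-D intensity profile extracted from a chest radiograph.
profile = np.cumsum(rng.normal(0, 1, 256))

# Multilevel discrete wavelet decomposition; summary statistics of the
# coefficients at each level serve as a compact feature vector.
coeffs = pywt.wavedec(profile, "db4", level=4)
features = np.array([np.sqrt(np.mean(c ** 2)) for c in coeffs])  # per-level RMS
print("feature vector:", np.round(features, 3))
```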
NASA Astrophysics Data System (ADS)
Kaleva Oikarinen, Juho; Järvelä, Sanna; Kaasila, Raimo
2014-04-01
This design-based research project focuses on documenting statistical learning among 16-17-year-old Finnish upper secondary school students (N = 78) in a computer-supported collaborative learning (CSCL) environment. One novel value of this study is in reporting the shift from teacher-led mathematical teaching to autonomous small-group learning in statistics. The main aim of this study is to examine how student collaboration occurs in learning statistics in a CSCL environment. The data include material from videotaped classroom observations and the researcher's notes. In this paper, the inter-subjective phenomena of students' interactions in a CSCL environment are analysed by using a contact summary sheet (CSS). The development of the multi-dimensional coding procedure of the CSS instrument is presented. Aptly selected video episodes were transcribed and coded in terms of conversational acts, which were divided into non-task-related and task-related categories to depict students' levels of collaboration. The results show that collaborative learning (CL) can facilitate cohesion and responsibility and reduce students' feelings of detachment in our classless, periodic school system. The interactive .pdf material and collaboration in small groups enable statistical learning. It is concluded that CSCL is one possible method of promoting statistical teaching. CL using interactive materials seems to foster and facilitate statistical learning processes.
Delwel, E J; de Jong, D A; Avezaat, C J J
2005-10-01
It is difficult to predict which patients with symptoms and radiological signs of normal pressure hydrocephalus (NPH) will benefit from a shunting procedure and which patients will not. The risk of this procedure is also higher in patients with NPH than in the overall population of hydrocephalic patients. The aim of this study was to investigate which clinical characteristics, CT parameters and parameters of cerebrospinal fluid dynamics could predict improvement after shunting. Eighty-three consecutive patients with symptoms and radiological signs of NPH were included in a prospective study. Parameters of cerebrospinal fluid dynamics were measured by calculation from computerised data obtained by a constant-flow lumbar infusion test. Sixty-six patients considered candidates for surgery were treated with a medium-pressure Spitz-Holter valve; in seventeen patients a shunting procedure was not considered indicated. Clinical and radiological follow-up was performed for at least one year postoperatively. The odds ratios, sensitivities and specificities, and positive and negative predictive values of individual parameters and of combinations of measured parameters did not show a statistically significant relation to clinical improvement after shunting. We conclude that neither individual parameters nor combinations of measured parameters show any statistically significant relation to clinical improvement following shunting procedures in patients suspected of NPH. We suggest restricting the term normal pressure hydrocephalus to cases that improve after shunting, and using the term normal pressure hydrocephalus syndrome for patients suspected of NPH and for patients not improving after implantation of a proven well-functioning shunt.
Withington, John; Hirji, Sadaf; Sahai, Arun
2014-08-01
To quantify changes in surgical practice in the treatment of stress urinary incontinence (SUI), urge urinary incontinence (UUI) and post-prostatectomy stress incontinence (PPI) in England, using the Hospital Episode Statistics (HES) database. We used public domain information from the HES database, an administrative dataset recording all hospital admissions and procedures in England, to find evidence of change in the use of various surgical procedures for urinary incontinence from 2000 to 2012. For the treatment of SUI, a general increase in the use of synthetic mid-urethral tapes, such as tension-free vaginal tape (TVT) and transobturator tape (TOT), was observed, while there was a significant decrease in colposuspension procedures over the same period. The number of procedures to remove TVT and TOT has also increased in recent years. In the treatment of overactive bladder and UUI, there has been a significant increase in the use of botulinum toxin A and neuromodulation in recent years. This coincided with a steady decline in the recorded use of clam ileocystoplasty. A steady increase was observed in the insertion of artificial urinary sphincter (AUS) devices in men, related to PPI. Mid-urethral synthetic tapes now represent the mainstream treatment of SUI in women, but tape-related complications have led to an increase in procedures to remove these devices. The uptake of botulinum toxin A and sacral neuromodulation has led to fewer clam ileocystoplasty procedures being performed. The steady increase in insertions of AUSs in men is unsurprising and reflects the widespread uptake of radical prostatectomy in recent years. There are limitations to results sourced from the HES database, with potential inaccuracy of coding; however, these data support the trends observed by experts in this field. © 2014 The Authors. BJU International published by John Wiley & Sons Ltd on behalf of BJU International.
Uncertainty Analysis for DAM Projects.
1987-09-01
...overwhelming majority of articles published on the use of statistical methodology for geotechnical engineering focus on performance predictions and design... Results of the present study do not support the adoption of more esoteric statistical procedures except on a special-case basis or in research... influence that recommended statistical procedures might have had on the Carters Project, had they been applied during planning and design phases...
NASA Technical Reports Server (NTRS)
Thomas, R. E.; Gaines, G. B.
1978-01-01
Recommended design procedures to reduce the complete factorial design by retaining information on anticipated important interaction effects, and by generally giving up information on unconditional main effects are discussed. A hypothetical photovoltaic module used in the test design is presented. Judgments were made of the relative importance of various environmental stresses such as UV radiation, abrasion, chemical attack, temperature, mechanical stress, relative humidity and voltage. Consideration is given to a complete factorial design and its graphical representation, elimination of selected test conditions, examination and improvement of an engineering design, and parametric study. The resulting design consists of a mix of conditional main effects and conditional interactions and represents a compromise between engineering and statistical requirements.
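The pruning idea can be illustrated with a toy full factorial (hypothetical factors and levels; real fractional factorial designs are chosen via aliasing structure, which this sketch ignores):

```python
from itertools import product

# Hypothetical environmental stress factors, each at two levels.
factors = {
    "uv":       ["low", "high"],
    "humidity": ["low", "high"],
    "temp":     ["low", "high"],
    "voltage":  ["low", "high"],
}

full = list(product(*factors.values()))  # 2^4 = 16 test conditions
# Keep only the conditions needed to estimate an anticipated UV x humidity
# interaction, holding the remaining factors at a nominal low level; the
# retained effects are thus conditional, as the paper describes.
reduced = [c for c in full if c[2] == "low" and c[3] == "low"]
print(f"full factorial: {len(full)} runs; reduced design: {len(reduced)} runs")
```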
Computer Administering of the Psychological Investigations: Set-Relational Representation
NASA Astrophysics Data System (ADS)
Yordzhev, Krasimir
Computer administering of a psychological investigation is the computer representation of the entire procedure of psychological assessment - test construction, test implementation, results evaluation, storage and maintenance of the developed database, its statistical processing, analysis and interpretation. A mathematical description of psychological assessment with the aid of personality tests is discussed in this article. Set theory and relational algebra are used in this description. A relational model of the data needed to design a computer system for the automation of certain psychological assessments is given. Some finite sets, and relations on them, which are necessary for creating a personality psychological test, are described. The described model could be used to develop real software for the computer administering of any psychological test, with full automation of the whole process: test construction, test implementation, result evaluation, storage of the developed database, statistical processing, analysis and interpretation.
Randomized study of surgical prophylaxis in immunocompromised hosts.
Lopes, D R; Peres, M P S M; Levin, A S
2011-02-01
Although prophylaxis is current practice, there are no randomized controlled studies evaluating preoperative antimicrobial prophylaxis in dental procedures in patients immunocompromised by chemotherapy or organ transplants. To evaluate prophylaxis in dental-invasive procedures in patients with cancer or solid organ transplants, 414 patients were randomized to receive one oral 500-mg dose 2 hours before the procedure (1-dose group) or a 500-mg dose 2 hours before the procedure and an additional dose 8 hours later (2-dose group). Procedures were exodontia or periodontal scaling/root planing. Follow-up was 4 weeks. No deaths or surgical site infections occurred. Six patients (1.4%) presented with use of pain medication > 3 days or hospitalization during follow-up: 4 of 207 (2%) in the 1-dose group and 2 of 207 (1%) in the 2-dose group (relative risk, 2.02; 95% confidence interval, 0.37-11.15). In conclusion, no statistically significant difference occurred in outcome using 1 or 2 doses of prophylactic amoxicillin for invasive dental procedures in immunocompromised patients.
Taylor, Sandra L; Ruhaak, L Renee; Weiss, Robert H; Kelly, Karen; Kim, Kyoungmi
2017-01-01
High-throughput mass spectrometry (MS) is now being used to profile small molecular compounds across multiple biological sample types from the same subjects, with the goal of leveraging information across biospecimens. Multivariate statistical methods that combine information from all biospecimens could be more powerful than the usual univariate analyses. However, missing values are common in MS data, and imputation can impact between-biospecimen correlation and multivariate analysis results. We propose two multivariate two-part statistics that accommodate missing values and combine data from all biospecimens to identify differentially regulated compounds. Statistical significance is determined using a multivariate permutation null distribution. Relative to univariate tests, the multivariate procedures detected more significant compounds in three biological datasets. In a simulation study, we showed that multi-biospecimen testing procedures were more powerful than single-biospecimen methods when compounds are differentially regulated in multiple biospecimens, but univariate methods can be more powerful if compounds are differentially regulated in only one biospecimen. We provide R functions to implement and illustrate our method as supplementary information. Contact: sltaylor@ucdavis.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
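A single-biospecimen caricature of the permutation approach described here (the paper's statistics are multivariate across biospecimens; everything below is a simplified assumption, not the authors' R implementation):

```python
import numpy as np

rng = np.random.default_rng(9)

def two_part_stat(x, y):
    # Part 1: difference in detection rates; Part 2: t-like comparison of
    # the observed (non-missing) values; combined chi-square-style.
    dx, dy = ~np.isnan(x), ~np.isnan(y)
    prop = dx.mean() - dy.mean()
    xs, ys = x[dx], y[dy]
    if xs.size < 2 or ys.size < 2:
        return prop ** 2
    denom = np.sqrt(xs.var(ddof=1) / xs.size + ys.var(ddof=1) / ys.size)
    tpart = 0.0 if denom == 0 else (xs.mean() - ys.mean()) / denom
    return prop ** 2 + tpart ** 2

# Hypothetical compound abundances with non-detects (NaN) in two groups.
x = rng.lognormal(1.0, 0.5, 40); x[rng.random(40) < 0.30] = np.nan
y = rng.lognormal(1.3, 0.5, 40); y[rng.random(40) < 0.15] = np.nan

obs = two_part_stat(x, y)
pooled = np.concatenate([x, y])
null = np.array([two_part_stat(*np.split(rng.permutation(pooled), 2))
                 for _ in range(5000)])
p = np.mean(null >= obs)
print(f"stat = {obs:.3f}, permutation p = {p:.4f}")
```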
Reliability Sampling Plans: A Review and Some New Results
ERIC Educational Resources Information Center
Isaic-Maniu, Alexandru; Voda, Viorel Gh.
2009-01-01
In this work we present a large area of aspects related to the problem of sampling inspection in the case of reliability. First we discuss the actual status of this domain, mentioning the newest approaches (from a technical viewpoint) such as HALT and HASS, and the statistical perspective. After a brief description of the general procedure in…
Pei, Yanbo; Tian, Guo-Liang; Tang, Man-Lai
2014-11-10
Stratified data analysis is an important research topic in many biomedical studies and clinical trials. In this article, we develop five test statistics for testing the homogeneity of proportion ratios for stratified correlated bilateral binary data based on an equal correlation model assumption. Bootstrap procedures based on these test statistics are also considered. To evaluate the performance of these statistics and procedures, we conduct Monte Carlo simulations to study their empirical sizes and powers under various scenarios. Our results suggest that the procedure based on score statistic performs well generally and is highly recommended. When the sample size is large, procedures based on the commonly used weighted least square estimate and logarithmic transformation with Mantel-Haenszel estimate are recommended as they do not involve any computation of maximum likelihood estimates requiring iterative algorithms. We also derive approximate sample size formulas based on the recommended test procedures. Finally, we apply the proposed methods to analyze a multi-center randomized clinical trial for scleroderma patients. Copyright © 2014 John Wiley & Sons, Ltd.
Statistical Reform in School Psychology Research: A Synthesis
ERIC Educational Resources Information Center
Swaminathan, Hariharan; Rogers, H. Jane
2007-01-01
Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.
NASA Astrophysics Data System (ADS)
Lange, J.; O'Shaughnessy, R.; Boyle, M.; Calderón Bustillo, J.; Campanelli, M.; Chu, T.; Clark, J. A.; Demos, N.; Fong, H.; Healy, J.; Hemberger, D. A.; Hinder, I.; Jani, K.; Khamesra, B.; Kidder, L. E.; Kumar, P.; Laguna, P.; Lousto, C. O.; Lovelace, G.; Ossokine, S.; Pfeiffer, H.; Scheel, M. A.; Shoemaker, D. M.; Szilagyi, B.; Teukolsky, S.; Zlochower, Y.
2017-11-01
We present and assess a Bayesian method to interpret gravitational wave signals from binary black holes. Our method directly compares gravitational wave data to numerical relativity (NR) simulations. In this study, we present a detailed investigation of the systematic and statistical parameter estimation errors of this method. This procedure bypasses approximations used in semianalytical models for compact binary coalescence. In this work, we use the full posterior parameter distribution for only generic nonprecessing binaries, drawing inferences away from the set of NR simulations used, via interpolation of a single scalar quantity (the marginalized log likelihood, ln L) evaluated by comparing data to nonprecessing binary black hole simulations. We also compare the data to generic simulations, and discuss the effectiveness of this procedure for generic sources. We specifically assess the impact of higher order modes, repeating our interpretation with both ℓ ≤ 2 as well as ℓ ≤ 3 harmonic modes. Using the ℓ ≤ 3 higher modes, we gain more information from the signal and can better constrain the parameters of the gravitational wave signal. We assess and quantify several sources of systematic error that our procedure could introduce, including simulation resolution and duration; most are negligible. We show through examples that our method can recover the parameters for equal mass, zero spin, GW150914-like, and unequal mass, precessing spin sources. Our study of this new parameter estimation method demonstrates that we can quantify and understand the systematic and statistical error. This method allows us to use higher order modes from numerical relativity simulations to better constrain the black hole binary parameters.
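As a one-dimensional caricature of the interpolation step (all grid values hypothetical, and the real method works over the full intrinsic parameter space):

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical marginalized log likelihoods at five NR simulations spanning
# one intrinsic parameter (labeled here as a total mass, in solar masses).
mass_grid = np.array([55.0, 60.0, 65.0, 70.0, 75.0])
lnL_grid = np.array([-12.0, -4.0, -1.0, -3.5, -10.0])

lnL = CubicSpline(mass_grid, lnL_grid)

# Evaluate on a fine grid and convert to a normalized posterior under a
# flat prior over the plotted range (subtract the max for stability).
m = np.linspace(mass_grid[0], mass_grid[-1], 400)
post = np.exp(lnL(m) - lnL(m).max())
post /= post.sum() * (m[1] - m[0])
print(f"posterior peak near M = {m[np.argmax(post)]:.1f}")
```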
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoon Sohn; Charles Farrar; Norman Hunter
2001-01-01
This report summarizes the analysis of fiber-optic strain gauge data obtained from a surface-effect fast patrol boat being studied by the staff at the Norwegian Defense Research Establishment (NDRE) in Norway and the Naval Research Laboratory (NRL) in Washington D.C. Data from two different structural conditions were provided to the staff at Los Alamos National Laboratory. The problem was then approached from a statistical pattern recognition paradigm. This paradigm can be described as a four-part process: (1) operational evaluation, (2) data acquisition & cleansing, (3) feature extraction and data reduction, and (4) statistical model development for feature discrimination. Given that the first two portions of this paradigm were mostly completed by the NDRE and NRL staff, this study focused on data normalization, feature extraction, and statistical modeling for feature discrimination. The feature extraction process began by looking at relatively simple statistics of the signals and progressed to using the residual errors from auto-regressive (AR) models fit to the measured data as the damage-sensitive features. Data normalization proved to be the most challenging portion of this investigation. A novel approach to data normalization, where the residual errors in the AR model are considered to be an unmeasured input and an auto-regressive model with exogenous inputs (ARX) is then fit to portions of the data exhibiting similar waveforms, was successfully applied to this problem. With this normalization procedure, a clear distinction between the two different structural conditions was obtained. A false-positive study was also run, and the procedure developed herein did not yield any false-positive indications of damage. Finally, the results must be qualified by the fact that this procedure has only been applied to very limited data samples. A more complete analysis of additional data taken under various operational and environmental conditions as well as other structural conditions is necessary before one can definitively state that the procedure is robust enough to be used in practice.
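A minimal sketch of the AR-residual feature idea described here (synthetic signals, hypothetical model order; the ARX normalization step is not reproduced):

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(10)
# Hypothetical strain-gauge records from a baseline and a changed condition.
t = np.arange(2000)
baseline = np.sin(0.05 * t) + 0.3 * rng.normal(size=t.size)
changed = np.sin(0.05 * t) + 0.3 * rng.normal(size=t.size) + 0.2 * np.sin(0.21 * t)

# Fit an AR model to the baseline; the spread of its one-step prediction
# errors on new data serves as the damage-sensitive feature.
order = 10
coef = np.asarray(AutoReg(baseline, lags=order).fit().params)  # [const, phi_1..phi_p]

def ar_residual_std(series, coef, order):
    # One-step-ahead AR prediction errors using the baseline coefficients.
    preds = [coef[0] + coef[1:] @ series[i - order:i][::-1]
             for i in range(order, series.size)]
    return np.std(series[order:] - np.array(preds))

print(f"baseline residual SD: {ar_residual_std(baseline, coef, order):.3f}")
print(f"changed  residual SD: {ar_residual_std(changed, coef, order):.3f}")
```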
Compendium of Methods for Applying Measured Data to Vibration and Acoustic Problems
1985-10-01
...statistical energy analysis, finite element models, transfer function... Procedures for the Modal Analysis Method... Summary of the Procedures for the Statistical Energy Analysis Method...
ERIC Educational Resources Information Center
Mackey, William F.; And Others
The first of a two-volume study of the relative accessibility of French vocabulary in French-speaking Canada presents statistical data concerning the frequency, distribution, valence, and accessibility of vocabulary related to 16 fundamental centers of interest found in normal conversation. The scope, procedures, and results of the study are…
The cancellous bone multiscale morphology-elasticity relationship.
Agić, Ante; Nikolić, Vasilije; Mijović, Budimir
2006-06-01
The relations governing the effective properties of cancellous bone are analysed on multiple scales across two aspects: the properties of a representative volume element at the microscale, and a statistical measure of trabecular trajectory orientation at the mesoscale. The anisotropy of the microstructure is described by a fabric tensor measure, with the trajectory orientation tensor as the bridging-scale connection. The scattered measured data (elastic modulus, trajectory orientation, apparent density) from compression tests are fitted by a stochastic interpolation procedure. The engineering constants of the elasticity tensor are estimated by a least-squares fitting procedure in multidimensional space using the Nelder-Mead simplex. The multiaxial failure surface in strain space is constructed and interpolated by a modified super-ellipsoid.
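A sketch of a least-squares fit by Nelder-Mead simplex in the spirit described here, assuming a hypothetical power-law modulus-density relation (a form commonly used for cancellous bone, but not necessarily the authors' model):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(11)
# Synthetic apparent density vs elastic modulus data with multiplicative noise.
density = rng.uniform(0.1, 1.0, 50)
modulus = 3.0 * density ** 2.1 * np.exp(rng.normal(0, 0.1, 50))

def sse(params):
    # Least-squares objective for the assumed relation E = a * rho^b.
    a, b = params
    return np.sum((modulus - a * density ** b) ** 2)

result = minimize(sse, x0=[1.0, 1.0], method="Nelder-Mead")
print(f"fitted: E = {result.x[0]:.2f} * rho^{result.x[1]:.2f}")
```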
Analyzing longitudinal data with the linear mixed models procedure in SPSS.
West, Brady T
2009-09-01
Many applied researchers analyzing longitudinal data share a common misconception: that specialized statistical software is necessary to fit hierarchical linear models (also known as linear mixed models [LMMs], or multilevel models) to longitudinal data sets. Although several specialized statistical software programs of high quality are available that allow researchers to fit these models to longitudinal data sets (e.g., HLM), rapid advances in general purpose statistical software packages have recently enabled analysts to fit these same models when using preferred packages that also enable other more common analyses. One of these general purpose statistical packages is SPSS, which includes a very flexible and powerful procedure for fitting LMMs to longitudinal data sets with continuous outcomes. This article aims to present readers with a practical discussion of how to analyze longitudinal data using the LMMs procedure in the SPSS statistical software package.
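A minimal Python sketch of the same kind of random-intercept LMM for longitudinal data, using statsmodels rather than SPSS (simulated data, hypothetical effect sizes):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(12)
n_subj, n_time = 40, 5
subj = np.repeat(np.arange(n_subj), n_time)
time = np.tile(np.arange(n_time), n_subj)
u = rng.normal(0, 2.0, n_subj)  # subject-specific random intercepts
y = 10 + 1.5 * time + u[subj] + rng.normal(0, 1.0, subj.size)
df = pd.DataFrame({"y": y, "time": time, "subject": subj})

# Random-intercept linear mixed model for repeated measures, analogous to
# what the SPSS MIXED procedure fits for a continuous longitudinal outcome.
fit = smf.mixedlm("y ~ time", df, groups=df["subject"]).fit()
print(fit.summary())
```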
Robust model selection and the statistical classification of languages
NASA Astrophysics Data System (ADS)
García, J. E.; González-López, V. A.; Viola, M. L. L.
2012-10-01
In this paper we address the problem of model selection for the set of finite memory stochastic processes with finite alphabet, when the data is contaminated. We consider m independent samples, with more than half of them being realizations of the same stochastic process with law Q, which is the one we want to retrieve. We devise a model selection procedure such that, for a sample size large enough, the selected process is the one with law Q. Our model selection strategy is based on estimating relative entropies to select a subset of samples that are realizations of the same law. Although the procedure is valid for any family of finite order Markov models, we will focus on the family of variable length Markov chain models, which includes the fixed order Markov chain model family. We define the asymptotic breakdown point (ABDP) for a model selection procedure, and we show the ABDP for our procedure. This means that if the proportion of contaminated samples is smaller than the ABDP, then, as the sample size grows, our procedure selects a model for the process with law Q. We also use our procedure in a setting where we have one sample formed by the concatenation of sub-samples of two or more stochastic processes, with most of the sub-samples having law Q. We conducted a simulation study. In the application section we address the question of the statistical classification of languages according to their rhythmic features using speech samples. This is an important open problem in phonology. A persistent difficulty with this problem is that the speech samples comprise several sentences produced by diverse speakers, corresponding to a mixture of distributions. The usual procedure to deal with this problem has been to choose a subset of the original sample which seems to best represent each language. The selection is made by listening to the samples. In our application we use the full dataset without any preselection of samples. We apply our robust methodology, estimating a model which represents the main law for each language. Our findings agree with the linguistic conjecture related to the rhythm of the languages included in our dataset.
NASA Astrophysics Data System (ADS)
Clerici, Aldo; Perego, Susanna; Tellini, Claudio; Vescovi, Paolo
2006-08-01
Among the many GIS-based multivariate statistical methods for landslide susceptibility zonation, the so-called "Conditional Analysis method" holds a special place for its conceptual simplicity. In fact, in this method landslide susceptibility is simply expressed as the landslide density in correspondence with different combinations of instability-factor classes. To overcome the operational complexity connected to the long, tedious and error-prone sequence of commands required by the procedure, a shell script mainly based on the GRASS GIS was created. The script, starting from a landslide inventory map and a number of factor maps, automatically carries out the whole procedure, resulting in the construction of a map with five landslide susceptibility classes. A validation procedure allows the reliability of the resulting model to be assessed, while the simple mean deviation of the density values in the factor-class combinations helps to evaluate the goodness of the landslide density distribution. The procedure was applied to a relatively small basin (167 km2) in the Italian Northern Apennines, considering three landslide types, namely rotational slides, flows and complex landslides, for a total of 1,137 landslides, and five factors, namely lithology, slope angle and aspect, elevation, and slope/bedding relations. The analysis of the 31 different models obtained by combining the five factors confirms the role of lithology, slope angle and slope/bedding relations in influencing slope stability.
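A minimal sketch of the density computation at the heart of the method (synthetic cells, hypothetical factor classes; the GRASS shell script itself is not reproduced):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(14)
n = 5000
# Hypothetical raster cells: each row is one cell with its factor classes.
cells = pd.DataFrame({
    "lithology": rng.choice(["clay", "sand", "marl"], n),
    "slope":     rng.choice(["gentle", "steep"], n),
})
# Hypothetical inventory: landslides are more frequent on steep clay.
p = 0.02 + 0.10 * ((cells.lithology == "clay") & (cells.slope == "steep"))
cells["landslide"] = rng.random(n) < p

# Conditional analysis: the susceptibility of each factor-class combination
# is simply the landslide density (proportion of affected cells) within it.
density = cells.groupby(["lithology", "slope"])["landslide"].mean()
print(density.sort_values(ascending=False))
```

Binning these densities into five classes would yield the susceptibility map classes the abstract describes.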
Brodbelt, D C; Pfeiffer, D U; Young, L E; Wood, J L N
2007-11-01
Cats are commonly anaesthetized in veterinary practice, but recent figures describing the frequency of or risk factors for anaesthetic-related death are not available. The aims of this study were to address these deficiencies. A nested case-control study was undertaken in 117 UK veterinary centres. All anaesthetic and sedation procedures and anaesthetic and sedation-related deaths (i.e. 'cases') occurring within 48 h were recorded. Details of patient, procedure, and perioperative management were recorded for all cases and randomly selected non-deaths (controls). A detailed statistical model of factors associated with anaesthetic and sedation-related death was constructed. Between June 2002 and June 2004, 175 deaths were classified as anaesthetic and sedation-related and 14 additional deaths (with insufficient information to be excluded) were included for the estimation of risk. During the study, 79 178 anaesthetic and sedation procedures were recorded and the overall risk of anaesthetic and sedation-related death was 0.24% (95% CI 0.20-0.27). Factors associated with increased odds of anaesthetic-related death were poor health status (ASA physical status classification), increasing age, extremes of weight, increasing procedural urgency and complexity, endotracheal intubation, and fluid therapy. Pulse monitoring and pulse oximetry were associated with reduced odds. The risk of anaesthetic-related death in cats appears to have decreased since the last published study in the UK. The results should aid the preoperative identification of cats at greatest risk. Greater care with endotracheal intubation and fluid administration are recommended, and pulse and pulse oximetry monitoring should be routinely implemented in cats.
BTS statistical standards manual
DOT National Transportation Integrated Search
2005-10-01
The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...
ERIC Educational Resources Information Center
Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.
In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…
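As a rough sketch of the underlying idea, a tabular CUSUM over standardized person-fit residuals might look like the following (all parameters hypothetical; the paper's six statistics are not reproduced):

```python
import numpy as np

def cusum(residuals, k=0.5, h=4.0):
    # Tabular CUSUM on standardized residuals; returns the index of the
    # first alarm (upper or lower sum crossing the threshold h), or -1.
    hi = lo = 0.0
    for i, z in enumerate(residuals):
        hi = max(0.0, hi + z - k)  # upper cumulative sum
        lo = min(0.0, lo + z + k)  # lower cumulative sum
        if hi > h or lo < -h:
            return i
    return -1

rng = np.random.default_rng(13)
# Hypothetical standardized item-score residuals; misfit begins at item 25.
z = np.concatenate([rng.normal(0, 1, 25), rng.normal(1.2, 1, 25)])
print("first alarm at item:", cusum(z))
```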
77 FR 53889 - Statement of Organization, Functions, and Delegations of Authority
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-04
..., methods, and statistical procedures for assessing and monitoring the health of communities and measuring... methods and the Community Guide, and coordinates division responses to requests for technical assistance...-federal partners in developing indicators, methods, and statistical procedures for measuring and reporting...
10 CFR Appendix II to Part 504 - Fuel Price Computation
Code of Federal Regulations, 2010 CFR
2010-01-01
... 504—Fuel Price Computation (a) Introduction. This appendix provides the equations and parameters... inflation indices must follow standard statistical procedures and must be fully documented within the... the weighted average fuel price must follow standard statistical procedures and be fully documented...
Hudson, Michelle; Bhogal, Nirmala
2004-11-01
The statistics for animal procedures performed in 2003 were recently released by the Home Office. They indicate that, for the second year running, there was a significant increase in the number of laboratory animal procedures undertaken in Great Britain. The species and genera used, the numbers of toxicology and non-toxicology procedures, and the overall trends, are described. The implications of these latest statistics are discussed with reference to key areas of interest and to the impact of existing regulations and pending legislative reforms.
NASA Technical Reports Server (NTRS)
Amling, G. E.; Holms, A. G.
1973-01-01
A computer program is described that performs a statistical multiple-decision procedure called chain pooling. The number of mean squares assigned to error variance is conditioned on the relative magnitudes of the mean squares. Model selection is done according to user-specified levels of type I or type II error probabilities.
Evidence-based orthodontics. Current statistical trends in published articles in one journal.
Law, Scott V; Chudasama, Dipak N; Rinchuse, Donald J
2010-09-01
To ascertain the number, type, and overall usage of statistics in American Journal of Orthodontics and Dentofacial Orthopedics (AJODO) articles for 2008. These data were then compared to data from three previous years: 1975, 1985, and 2003. The frequency and distribution of statistics used in the AJODO original articles for 2008 were dichotomized into those using statistics and those not using statistics. Statistical procedures were then broadly divided into descriptive statistics (mean, standard deviation, range, percentage) and inferential statistics (t-test, analysis of variance). Descriptive statistics were used to make comparisons. In 1975, 1985, 2003, and 2008, AJODO published 72, 87, 134, and 141 original articles, respectively. The percentage of original articles using statistics was 43.1% in 1975, 75.9% in 1985, 94.0% in 2003, and 92.9% in 2008; the proportion of original articles using statistics stayed relatively stable from 2003 to 2008, with only a small 1.1 percentage-point decrease. The percentage of articles using inferential statistical analyses was 23.7% in 1975, 74.2% in 1985, 92.9% in 2003, and 84.4% in 2008. Comparing AJODO publications in 2003 and 2008, there was an 8.5 percentage-point increase in descriptive-only articles (from 7.1% to 15.6%) and an 8.5 percentage-point decrease in articles using inferential statistics (from 92.9% to 84.4%).
Sun, Hokeun; Wang, Shuang
2014-08-15
Existing association methods for rare variants from sequencing data have focused on aggregating variants in a gene or a genetic region, because analysing individual rare variants is underpowered. However, these existing rare variant detection methods are not able to identify which of the rare variants in a gene or a genetic region are associated with the complex diseases or traits. Once phenotypic associations of a gene or a genetic region are identified, the natural next step in an association study with sequencing data is to locate the susceptible rare variants within the gene or the genetic region. In this article, we propose a power set-based statistical selection procedure that is able to identify the locations of the potentially susceptible rare variants within a disease-related gene or a genetic region. The selection performance of the proposed procedure was evaluated through simulation studies, where we demonstrated its feasibility and superior power over several comparable existing methods. In particular, the proposed method is able to handle the mixed effects when both risk and protective variants are present in a gene or a genetic region. The proposed selection procedure was also applied to the sequence data on the ANGPTL gene family from the Dallas Heart Study to identify potentially susceptible rare variants within the trait-related genes. An R package 'rvsel' can be downloaded from http://www.columbia.edu/∼sw2206/ and http://statsun.pusan.ac.kr. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
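The "power set" idea can be illustrated generically, though this sketch is not the rvsel algorithm: enumerate subsets of the variant columns, score each subset with a simple burden-style association statistic, and return the best-scoring subset. Exhaustive enumeration is exponential in the number of variants and this naive scan ignores multiplicity, so it is only a toy under those stated simplifications.

```python
import numpy as np
from itertools import chain, combinations
from scipy import stats

def best_variant_subset(genotypes, phenotype):
    """Exhaustive power-set search for the variant subset whose burden
    score (minor-allele count per subject) is most associated with phenotype.

    genotypes : (n_subjects, n_variants) minor-allele counts
    phenotype : quantitative trait, length n_subjects
    """
    n_var = genotypes.shape[1]
    subsets = chain.from_iterable(
        combinations(range(n_var), k) for k in range(1, n_var + 1))
    best = (None, 1.0)
    for s in subsets:
        burden = genotypes[:, s].sum(axis=1)
        if burden.std() == 0:
            continue                      # no carriers for this subset
        r, p = stats.pearsonr(burden, phenotype)
        if p < best[1]:
            best = (s, p)
    return best  # (indices of selected variants, p-value)

rng = np.random.default_rng(1)
G = rng.binomial(2, 0.02, (500, 8))                    # 8 rare variants
y = G[:, [1, 4]].sum(axis=1) * 0.8 + rng.normal(size=500)
print(best_variant_subset(G, y))                       # should recover variants 1 and 4
```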
Fairchild, James F; Allert, Ann; Sappington, Linda S; Nelson, Karen J; Valle, Janet
2008-03-01
We conducted 96-h static acute toxicity studies to evaluate the relative sensitivity of juveniles of the threatened bull trout (Salvelinus confluentus) and the standard cold-water surrogate rainbow trout (Oncorhynchus mykiss) to three rangeland herbicides commonly used for controlling invasive weeds in the northwestern United States. Relative species sensitivity was compared using three procedures: standard acute toxicity testing, fractional estimates of lethal concentrations, and accelerated life testing chronic estimation procedures. The acutely lethal concentrations (ALC) resulting in 50% mortality at 96 h (96-h ALC50s) were determined using linear regression and indicated that the three herbicides were toxic in the order picloram acid > 2,4-D acid > clopyralid acid. The 96-h ALC50 values for rainbow trout were as follows: picloram, 41 mg/L; 2,4-D, 707 mg/L; and clopyralid, 700 mg/L. The 96-h ALC50 values for bull trout were as follows: picloram, 24 mg/L; 2,4-D, 398 mg/L; and clopyralid, 802 mg/L. Fractional estimates of safe concentrations, based on 5% of the 96-h ALC50, were conservative, overestimating toxicity relative to regression-derived 96-h ALC5 values by an order of magnitude. Accelerated life testing procedures were used to estimate chronic lethal concentrations (CLC) resulting in 1% mortality at 30 d (30-d CLC1) for the three herbicides: picloram (1 mg/L rainbow trout, 5 mg/L bull trout), 2,4-D (56 mg/L rainbow trout, 84 mg/L bull trout), and clopyralid (477 mg/L rainbow trout; 552 mg/L bull trout). Collectively, the results indicated that the standard surrogate rainbow trout is similar in sensitivity to bull trout. Accelerated life testing procedures provided cost-effective, statistically defensible methods for estimating safe chronic concentrations (30-d CLC1s) of herbicides from acute toxicity data because they use statistical models based on the entire mortality:concentration:time data matrix.
Predicting juvenile recidivism: new method, old problems.
Benda, B B
1987-01-01
This prediction study compared the accuracy of three statistical procedures using two assessment methods. The criterion is return to a juvenile prison after the first release, and the models tested are logit analysis, predictive attribute analysis, and a Burgess procedure. No significant differences in predictive accuracy are found among the three procedures.
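The Burgess procedure referenced here is the classic unit-weighting scheme: each binary risk factor present scores one point and the points are summed. A hedged sketch contrasting it with a logit model, using hypothetical binary predictors rather than Benda's variables (predictive attribute analysis is omitted for brevity):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
X = rng.integers(0, 2, (1000, 6)).astype(float)        # six binary risk factors
logit_true = -2.0 + X @ np.array([1.2, 0.8, 0.5, 0.3, 0.0, 0.0])
y = rng.random(1000) < 1 / (1 + np.exp(-logit_true))   # return to prison (simulated)

burgess = X.sum(axis=1)                  # Burgess: one point per factor, unit weights
logit = LogisticRegression().fit(X, y)   # logit: estimated weights
logit_score = logit.predict_proba(X)[:, 1]

print("Burgess AUC:", roc_auc_score(y, burgess))
print("Logit AUC:  ", roc_auc_score(y, logit_score))
```

With strong factors and weak ones mixed together, the fitted weights usually edge out unit weights, though often by less than one might expect, which is consistent with the "no significant differences" finding.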
Analytical procedure validation and the quality by design paradigm.
Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno
2015-01-01
Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims to show that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it a control strategy can be set.
Reduction to Outside the Atmosphere and Statistical Tests Used in Geneva Photometry
NASA Technical Reports Server (NTRS)
Rufener, F.
1984-01-01
Conditions for creating a precise photometric system are investigated. The analytical and discriminatory potential of a photometric system obviously results from the localization of the passbands in the spectrum; it does, however, also depend critically on the precision attained. This precision is the result of two different types of precautions. Two procedures which contribute efficiently to achieving greater precision are examined; these are known as hardware-related precision and software-related precision.
Kim, Youn Hwan; Kim, Sang Wha; Kim, Jeong Tae; Kim, Chang Yeon
2013-06-01
Tensor fascia lata (TFL) musculocutaneous flaps often require a donor site graft when harvesting a large flap. However, a major drawback is that it also sacrifices the muscle. To overcome this disadvantage, we designed a TFL perforator-based island flap that was harvested from a site near the defect and involved transposition within 90 degrees without full isolation of the pedicles. We performed procedures on 17 musculocutaneous flaps and 23 perforator-based island flaps, and compared the outcomes of these surgeries. The overall complication rate was 27.5% (11 regions). There were 7 complications related to the musculocutaneous flaps and 4 complications related to the perforator flaps. Although there were no statistical differences between those groups, lower complication rates were associated with procedures involving perforator flaps. The TFL perforator procedure is a simple and fast operation that avoids sacrificing muscle. This decreases complication rates compared to true perforator flap techniques that require dissection around the perforator or pedicle.
Alles, Susan; Peng, Linda X; Mozola, Mark A
2009-01-01
A modification to Performance-Tested Method (PTM) 070601, Reveal Listeria Test (Reveal), is described. The modified method uses a new media formulation, LESS enrichment broth, in single-step enrichment protocols for both foods and environmental sponge and swab samples. Food samples are enriched for 27-30 h at 30 degrees C and environmental samples for 24-48 h at 30 degrees C. Implementation of these abbreviated enrichment procedures allows test results to be obtained on a next-day basis. In testing of 14 food types in internal comparative studies with inoculated samples, there was a statistically significant difference in performance between the Reveal and reference culture [U.S. Food and Drug Administration's Bacteriological Analytical Manual (FDA/BAM) or U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS)] methods for only a single food in one trial (pasteurized crab meat) at the 27 h enrichment time point, with more positive results obtained with the FDA/BAM reference method. No foods showed statistically significant differences in method performance at the 30 h time point. Independent laboratory testing of 3 foods again produced a statistically significant difference in results for crab meat at the 27 h time point; otherwise results of the Reveal and reference methods were statistically equivalent. Overall, considering both internal and independent laboratory trials, sensitivity of the Reveal method relative to the reference culture procedures in testing of foods was 85.9% at 27 h and 97.1% at 30 h. Results from 5 environmental surfaces inoculated with various strains of Listeria spp. showed that the Reveal method was more productive than the reference USDA-FSIS culture procedure for 3 surfaces (stainless steel, plastic, and cast iron), whereas results were statistically equivalent to the reference method for the other 2 surfaces (ceramic tile and sealed concrete). An independent laboratory trial with ceramic tile inoculated with L. monocytogenes confirmed the effectiveness of the Reveal method at the 24 h time point. Overall, sensitivity of the Reveal method at 24 h relative to that of the USDA-FSIS method was 153%. The Reveal method exhibited extremely high specificity, with only a single false-positive result in all trials combined for overall specificity of 99.5%.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wild, M.; Rouhani, S.
1995-02-01
A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. The use of geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and determine complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs and long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
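The spatial-structure idea in the passage, that nearby samples tend to be more alike than distant ones, is usually quantified with an empirical semivariogram before any kriging is attempted. A minimal sketch with made-up sample locations, not tied to any particular EPA procedure:

```python
import numpy as np

def empirical_semivariogram(coords, values, bins):
    """gamma(h) = mean of 0.5*(z_i - z_j)^2 over pairs whose separation falls in each bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)           # count each pair once
    d, sq = d[iu], sq[iu]
    return np.array([sq[(d >= lo) & (d < hi)].mean()
                     for lo, hi in zip(bins[:-1], bins[1:])])

rng = np.random.default_rng(3)
coords = rng.random((200, 2)) * 100.0                # sample locations (m)
trend = np.sin(coords[:, 0] / 15.0)                  # spatially structured signal
values = trend + 0.2 * rng.normal(size=200)          # e.g. log-contaminant level
print(empirical_semivariogram(coords, values, np.linspace(0, 50, 6)))
```

Low semivariance at short lags rising to a sill at longer lags is the signature of the spatial correlation that geostatistical sampling designs exploit.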
A Primer on Multivariate Analysis of Variance (MANOVA) for Behavioral Scientists
ERIC Educational Resources Information Center
Warne, Russell T.
2014-01-01
Reviews of statistical procedures (e.g., Bangert & Baumberger, 2005; Kieffer, Reese, & Thompson, 2001; Warne, Lazo, Ramos, & Ritter, 2012) show that one of the most common multivariate statistical methods in psychological research is multivariate analysis of variance (MANOVA). However, MANOVA and its associated procedures are often not…
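For readers wanting a concrete starting point, MANOVA is readily available in standard software. A minimal sketch with statsmodels and made-up data (two outcome variables, one grouping factor; the variable names are illustrative only):

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(4)
group = np.repeat(["control", "treated"], 50)
df = pd.DataFrame({
    "group": group,
    "y1": rng.normal(loc=(group == "treated") * 0.5, size=100),
    "y2": rng.normal(loc=(group == "treated") * 0.3, size=100),
})

# Wilks' lambda, Pillai's trace, etc., for the multivariate group effect
fit = MANOVA.from_formula("y1 + y2 ~ group", data=df)
print(fit.mv_test())
```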
Knowledge dimensions in hypothesis test problems
NASA Astrophysics Data System (ADS)
Krishnan, Saras; Idris, Noraini
2012-05-01
The reform of statistics education over the past two decades has predominantly shifted the focus of statistical teaching and learning from procedural understanding to conceptual understanding. The emphasis of procedural understanding is on formulas and calculation procedures, whereas conceptual understanding emphasizes students knowing why they are using a particular formula or executing a specific procedure. In addition, the Revised Bloom's Taxonomy offers a two-dimensional framework for describing learning objectives, comprising the six revised cognition levels of the original Bloom's taxonomy and four knowledge dimensions. Depending on the level of complexity, the four knowledge dimensions essentially distinguish basic understanding from more connected understanding. This study identifies the factual, procedural and conceptual knowledge dimensions in hypothesis test problems. Hypothesis testing, an important tool for making inferences about a population from sample information, is taught in many introductory statistics courses. However, researchers find that students in these courses still have difficulty understanding the underlying concepts of hypothesis tests. Past studies also show that even though students can perform the hypothesis testing procedure, they may not understand the rationale for executing these steps or know how to apply them in novel contexts. Besides knowing the procedural steps in conducting a hypothesis test, students must have fundamental statistical knowledge and a deep understanding of the underlying inferential concepts such as the sampling distribution and the central limit theorem. By identifying the knowledge dimensions of hypothesis test problems, suitable instructional and assessment strategies can be developed in future to enhance students' learning of hypothesis testing as a valuable inferential tool.
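To make the procedural/conceptual distinction concrete: the purely procedural layer of a hypothesis test is the kind of recipe below (a generic one-sample t test in scipy, not drawn from the study), while conceptual understanding concerns why the sampling distribution justifies each step.

```python
import numpy as np
from scipy import stats

sample = np.array([5.1, 4.8, 5.6, 5.3, 4.9, 5.4, 5.2, 5.0])
mu0 = 5.0                                    # H0: population mean equals 5.0

t_stat, p_value = stats.ttest_1samp(sample, popmean=mu0)
alpha = 0.05
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
print("reject H0" if p_value < alpha else "fail to reject H0")
```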
Different Manhattan project: automatic statistical model generation
NASA Astrophysics Data System (ADS)
Yap, Chee Keng; Biermann, Henning; Hertzmann, Aaron; Li, Chen; Meyer, Jon; Pao, Hsing-Kuo; Paxia, Salvatore
2002-03-01
We address the automatic generation of large geometric models. This is important in visualization for several reasons. First, many applications need access to large but interesting data models. Second, we often need such data sets with particular characteristics (e.g., urban models, park and recreation landscape). Thus we need the ability to generate models with different parameters. We propose a new approach for generating such models. It is based on a top-down propagation of statistical parameters. We illustrate the method in the generation of a statistical model of Manhattan. But the method is generally applicable in the generation of models of large geographical regions. Our work is related to the literature on generating complex natural scenes (smoke, forests, etc) based on procedural descriptions. The difference in our approach stems from three characteristics: modeling with statistical parameters, integration of ground truth (actual map data), and a library-based approach for texture mapping.
Osland, Emma; Yunus, Rossita M; Khan, Shahjahan; Memon, Breda; Memon, Muhammed A
2016-06-01
Laparoscopic Roux-en-Y gastric bypass (LRYGB) and laparoscopic vertical sleeve gastrectomy (LVSG) have been proposed as cost-effective strategies to manage obesity-related chronic disease. The objective of this meta-analysis and systematic review was to analyze the late (>30 days) postoperative complication rate for these 2 procedures. Randomized controlled trials (RCTs) published between 2000 and 2015 comparing late complication rates, that is, >30 days following LVSG and LRYGB in an adult population (i.e., 16 y and above), were selected from PubMed, Medline, Embase, Science Citation Index, Current Contents, and the Cochrane database. The outcome variables analyzed included mortality rate, major and minor complications, interventions required for their management, and readmission rates. A random effects model was used to calculate the effect size of both binary and continuous data. Heterogeneity among the outcome variables of these trials was determined by the Cochran Q statistic and the I² index. The meta-analysis was prepared in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Six RCTs involving a total of 685 patients (LVSG, n=345; LRYGB, n=340) reported late major complications. A statistically nonsignificant reduction in relative odds favoring the LVSG procedure was observed [odds ratio (OR), 0.64; 95% confidence interval (CI), 0.21-1.97; P=0.4]. Four RCTs representing 408 patients (LVSG, n=208; LRYGB, n=200) reported late minor complications. A statistically nonsignificant reduction of 36% in relative odds favoring the LVSG procedure was observed (OR, 0.64; 95% CI, 0.28-1.47; P=0.3). A 37% relative reduction in odds favoring LVSG was observed for the need for additional interventions to manage late postoperative complications, which also did not reach statistical significance (OR, 0.63; 95% CI, 0.19-2.05; P=0.4). No study specifically reported readmissions required for the management of late complications. This meta-analysis and systematic review of RCTs shows that the development of late (major and minor) complications is similar between the LVSG and LRYGB procedures, 6 months to 3 years postoperatively, and that they do not lead to a higher readmission or reoperation rate for either procedure. However, longer-term surveillance is required to accurately describe the patterns of late complications in these patients.
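The random-effects pooling used in meta-analyses of this kind can be sketched in a few lines. The following is a generic DerSimonian-Laird computation on made-up study-level log odds ratios, not the authors' data:

```python
import numpy as np

def dersimonian_laird(log_or, var):
    """Random-effects pooled OR with DerSimonian-Laird between-study variance."""
    w = 1.0 / var
    fixed = np.sum(w * log_or) / np.sum(w)
    q = np.sum(w * (log_or - fixed) ** 2)            # Cochran's Q
    df = len(log_or) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance
    w_star = 1.0 / (var + tau2)
    pooled = np.sum(w_star * log_or) / np.sum(w_star)
    se = 1.0 / np.sqrt(np.sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se), i2

# Six hypothetical RCTs: log odds ratios and their variances
log_or = np.log([0.64, 0.80, 0.55, 1.10, 0.70, 0.60])
var = np.array([0.30, 0.25, 0.40, 0.35, 0.20, 0.45])
print(dersimonian_laird(log_or, var))  # pooled OR, 95% CI, I^2 (%)
```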
Statistical methods in personality assessment research.
Schinka, J A; LaLone, L; Broeckel, J A
1997-06-01
Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.
Iosif, Christina; Di Maria, Federico; Sourour, Nader; Degos, Vincent; Bonneville, Fabrice; Biondi, Alessandra; Jean, Betty; Colonne, Chantal; Nouet, Aurelien; Chiras, Jacques; Clarençon, Frédéric
2014-05-01
Coiling of ruptured intracranial aneurysms in elderly patients remains debatable in terms of technical feasibility and clinical outcome. In this observational cohort study we aimed to assess the technical feasibility, complication profile and clinical outcomes of elderly patients with subarachnoid hemorrhage (SAH) treated with endovascular therapy. The study included 59 consecutive patients (47 women) aged ≥70 years (mean age 76 years, range 71-84) admitted to our institution with SAH from January 2002 to July 2011. The patients were treated for 66 aneurysms (regular coiling: n=62 (94%), balloon-assisted technique: n=2 (3%), stent and coil technique: n=2 (3%)). World Federation of Neurosurgical Societies (WFNS) grade at admission was 1 in 13 patients, 2 in 23 patients, 3 in 8 patients, 4 in 11 patients and 5 in 4 patients. We analysed the data by univariate and multivariate statistical analyses with an emphasis on the initial clinical situation, complications and clinical outcome. The technical success rate was 98%, with a procedure-related deficit rate of 10% and a procedure-related death rate of 5%. The Glasgow Outcome Scale score at 6 months was 1 in 15 patients (25.4%), 2 in 8 patients (13.6%), 3 in 14 patients (23.7%), 4 in 11 patients (18.6%) and 5 in 11 patients (18.6%). Patients admitted with a high initial WFNS grade did not differ statistically in terms of clinical outcome. The final clinical outcome was not significantly correlated with age, initial Fisher score or procedure-related complications. Endovascular treatment of elderly patients with ruptured cerebral aneurysms is feasible, safe and beneficial regardless of the presenting WFNS score.
Techniques for recognizing identity of several response functions from the data of visual inspection
NASA Astrophysics Data System (ADS)
Nechval, Nicholas A.
1996-08-01
The purpose of this paper is to present some efficient techniques for recognizing from the observed data whether several response functions are identical to each other. For example, in an industrial setting the problem may be to determine whether the production coefficients established in a small-scale pilot study apply to each of several large-scale production facilities. The techniques proposed here combine sensor information from automated visual inspection of manufactured products, which is carried out by means of pixel-by-pixel comparison of the sensed image of the product to be inspected with some reference pattern (or image). Let (a1, …, am) be p-dimensional parameters associated with m response models of the same type. This study is concerned with the simultaneous comparison of a1, …, am. A generalized maximum likelihood ratio (GMLR) test is derived for testing equality of these parameters, where each of the parameters represents a corresponding vector of regression coefficients. The GMLR test reduces to an equivalent test based on a statistic that has an F distribution. The main advantage of the test lies in its relative simplicity and the ease with which it can be applied. Another interesting test for the same problem is an application of Fisher's method of combining independent test statistics, which can be considered as a parallel procedure to the GMLR test. The combination of independent test statistics does not appear to have been used very much in applied statistics. There does, however, seem to be potential data-analytic value in techniques for combining distributional assessments in relation to statistically independent samples which are of joint experimental relevance. In addition, a new iterated test for the problem defined above is presented. A rejection of the null hypothesis by this test provides some reason why all the parameters are not equal. A numerical example is discussed in the context of the proposed procedures for hypothesis testing.
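The F statistic mentioned in the abstract can be illustrated by the standard construction of such a test: compare the residual sum of squares from one pooled regression against the sum from m separate fits (a Chow-type test). This sketch is a generic version of that construction, not Nechval's GMLR derivation:

```python
import numpy as np
from scipy import stats

def equality_of_regressions_F(groups):
    """F test of H0: identical coefficient vectors across m regressions.

    groups : list of (X, y); each X is (n_i, p) with the same p columns.
    """
    m = len(groups)
    p = groups[0][0].shape[1]
    def rss(X, y):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        return r @ r
    rss_sep = sum(rss(X, y) for X, y in groups)       # separate fits
    X_all = np.vstack([X for X, _ in groups])
    y_all = np.concatenate([y for _, y in groups])
    rss_pool = rss(X_all, y_all)                      # one pooled fit
    n = len(y_all)
    df1, df2 = (m - 1) * p, n - m * p
    F = ((rss_pool - rss_sep) / df1) / (rss_sep / df2)
    return F, stats.f.sf(F, df1, df2)

rng = np.random.default_rng(5)
def make_group(beta, n=60):
    X = np.c_[np.ones(n), rng.normal(size=(n, 2))]
    return X, X @ beta + rng.normal(size=n)

groups = [make_group(np.array([1.0, 2.0, -1.0])) for _ in range(3)]
print(equality_of_regressions_F(groups))   # H0 true here: large p-value expected
```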
Fukuda, Haruhisa; Kuroki, Manabu
2016-03-01
To develop and internally validate a surgical site infection (SSI) prediction model for Japan. Retrospective observational cohort study. We analyzed surveillance data submitted to the Japan Nosocomial Infections Surveillance system for patients who had undergone target surgical procedures from January 1, 2010, through December 31, 2012. Logistic regression analyses were used to develop statistical models for predicting SSIs. An SSI prediction model was constructed for each of the procedure categories by statistically selecting the appropriate risk factors from among the collected surveillance data and determining their optimal categorization. Standard bootstrapping techniques were applied to assess potential overfitting. The C-index was used to compare the predictive performances of the new statistical models with those of models based on conventional risk index variables. The study sample comprised 349,987 cases from 428 participant hospitals throughout Japan, and the overall SSI incidence was 7.0%. The C-indices of the new statistical models were significantly higher than those of the conventional risk index models in 21 (67.7%) of the 31 procedure categories (P<.05). No significant overfitting was detected. Japan-specific SSI prediction models were shown to generally have higher accuracy than conventional risk index models. These new models may have applications in assessing hospital performance and identifying high-risk patients in specific procedure categories.
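For a binary SSI outcome, the C-index is the area under the ROC curve. A minimal sketch of fitting a logistic model and comparing C-indices against a cruder risk index; the predictors here are hypothetical stand-ins, not the surveillance variables, and the bootstrap optimism correction is omitted for brevity:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
n = 5000
duration = rng.normal(size=n)                # standardized operative time
asa = rng.integers(1, 5, n)                  # ASA physical status
wound = rng.integers(0, 2, n)                # wound-class indicator
lp = -3.0 + 0.6 * duration + 0.4 * asa + 0.5 * wound
ssi = rng.random(n) < 1 / (1 + np.exp(-lp))  # simulated SSI outcome

X_full = np.c_[duration, asa, wound]
X_index = np.c_[asa >= 3, wound]             # crude conventional-style index
full = LogisticRegression().fit(X_full, ssi)
index = LogisticRegression().fit(X_index, ssi)

print("full model C-index:", roc_auc_score(ssi, full.predict_proba(X_full)[:, 1]))
print("risk index C-index:", roc_auc_score(ssi, index.predict_proba(X_index)[:, 1]))
```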
NASA Technical Reports Server (NTRS)
Wallace, G. R.; Weathers, G. D.; Graf, E. R.
1973-01-01
The statistics of filtered pseudorandom digital sequences called hybrid-sum sequences, formed from the modulo-two sum of several maximum-length sequences, are analyzed. The results indicate that a relation exists between the statistics of the filtered sequence and the characteristic polynomials of the component maximum length sequences. An analysis procedure is developed for identifying a large group of sequences with good statistical properties for applications requiring the generation of analog pseudorandom noise. By use of the analysis approach, the filtering process is approximated by the convolution of the sequence with a sum of unit step functions. A parameter reflecting the overall statistical properties of filtered pseudorandom sequences is derived. This parameter is called the statistical quality factor. A computer algorithm to calculate the statistical quality factor for the filtered sequences is presented, and the results for two examples of sequence combinations are included. The analysis reveals that the statistics of the signals generated with the hybrid-sum generator are potentially superior to the statistics of signals generated with maximum-length generators. Furthermore, fewer calculations are required to evaluate the statistics of a large group of hybrid-sum generators than are required to evaluate the statistics of the same size group of approximately equivalent maximum-length sequences.
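A hybrid-sum sequence of the kind analyzed here can be generated by XOR-ing the outputs of several linear feedback shift registers. The sketch below uses generic maximal-length taps and a simple moving average as a stand-in for the analog filter; it illustrates the construction only and does not implement the paper's statistical quality factor.

```python
import numpy as np

def lfsr(taps, state, n):
    """Maximum-length sequence from a Fibonacci LFSR with the given feedback taps."""
    out = np.empty(n, dtype=int)
    for i in range(n):
        out[i] = state[-1]                  # output the oldest bit
        fb = 0
        for t in taps:
            fb ^= state[t - 1]              # feedback = XOR of tapped bits
        state = [fb] + state[:-1]           # shift register
    return out

n = 4096
a = lfsr([7, 6], [1] * 7, n)    # degree-7 primitive taps (period 127)
b = lfsr([9, 5], [1] * 9, n)    # degree-9 primitive taps (period 511)
hybrid = a ^ b                  # modulo-two (XOR) sum of the component sequences

analog = 2.0 * hybrid - 1.0     # map {0,1} -> {-1,+1}
kernel = np.ones(16) / 16.0     # crude low-pass stand-in for the filter
filtered = np.convolve(analog, kernel, mode="valid")
print(filtered.mean(), filtered.std())      # first two moments of the analog noise
```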
Hoffmann, Jürgen; Wallwiener, Diethelm
2009-04-08
One of the basic prerequisites for generating evidence-based data is the availability of classification systems. Attempts to date to classify breast cancer operations have focussed on specific problems, e.g. the avoidance of secondary corrective surgery for surgical defects, rather than taking a generic approach. Starting from an existing, simpler empirical scheme based on the complexity of breast surgical procedures, which was used in-house primarily in operative report-writing, a novel classification of ablative and breast-conserving procedures initially needed to be developed and elaborated systematically. To obtain proof of principle, a prospectively planned analysis of patient records for all major breast cancer-related operations performed at our breast centre in 2005 and 2006 was conducted using the new classification. Data were analysed using basic descriptive statistics such as frequency tables. A novel two-type, six-tier classification system comprising 12 main categories, 13 subcategories and 39 sub-subcategories of oncological, oncoplastic and reconstructive breast cancer-related surgery was successfully developed. Our system permitted unequivocal classification, without exception, of all 1225 procedures performed in 1166 breast cancer patients in 2005 and 2006. Breast cancer-related surgical procedures can be generically classified according to their surgical complexity. Analysis of all major procedures performed at our breast centre during the study period provides proof of principle for this novel classification system. We envisage various applications for this classification, including uses in randomised clinical trials, guideline development, specialist surgical training, continuing professional development as well as quality of care and public health research.
Effect of the image resolution on the statistical descriptors of heterogeneous media.
Ledesma-Alonso, René; Barbosa, Romeli; Ortegón, Jaime
2018-02-01
The characterization and reconstruction of heterogeneous materials, such as porous media and electrode materials, involve the application of image processing methods to data acquired by scanning electron microscopy or other microscopy techniques. Among them, binarization and decimation are critical in order to compute the correlation functions that characterize the microstructure of the above-mentioned materials. In this study, we present a theoretical analysis of the effects of the image-size reduction due to progressive and sequential decimation of the original image. Three different decimation procedures (random, bilinear, and bicubic) were implemented and their consequences on the discrete correlation functions (two-point, line-path, and pore-size distribution) and the coarseness (derived from the local volume fraction) are reported and analyzed. The chosen statistical descriptors (correlation functions and coarseness) are typically employed to characterize and reconstruct heterogeneous materials. A normalization for each of the correlation functions has been performed. When the loss of statistical information has not been significant for a decimated image, its normalized correlation function is predicted by the trend of the original image (reference function). In contrast, when the decimated image does not hold statistical evidence of the original one, the normalized correlation function deviates from the reference function. Moreover, the equally weighted sum of the average of the squared difference between the discrete correlation functions of the decimated images and the reference functions leads to a definition of an overall error. During the first stages of the gradual decimation, the error remains relatively small and independent of the decimation procedure. Above a threshold defined by the correlation length of the reference function, the error becomes a function of the number of decimation steps. At this stage, some statistical information is lost and the error becomes dependent on the decimation procedure. These results may help restrict the amount of information that one can afford to lose during a decimation process, in order to reduce the computational and memory cost when one aims to diminish the time consumed by a characterization or reconstruction technique, yet maintain the statistical quality of the digitized sample.
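The two-point correlation function S2 for a binary (two-phase) image can be estimated via FFT-based autocorrelation. A minimal sketch under a periodic-boundary assumption, not the authors' implementation:

```python
import numpy as np

def two_point_correlation(img):
    """S2(dx, dy): probability that two points separated by (dx, dy)
    both fall in the phase marked 1, assuming periodic boundaries."""
    f = np.fft.fft2(img)
    s2 = np.fft.ifft2(f * np.conj(f)).real / img.size
    return np.fft.fftshift(s2)

def radial_average(s2):
    """Collapse S2(dx, dy) to a function of the separation distance r."""
    ny, nx = s2.shape
    y, x = np.indices(s2.shape)
    r = np.hypot(x - nx // 2, y - ny // 2).astype(int)
    return np.bincount(r.ravel(), weights=s2.ravel()) / np.bincount(r.ravel())

rng = np.random.default_rng(7)
img = (rng.random((256, 256)) < 0.4).astype(float)   # uncorrelated test medium
s2_r = radial_average(two_point_correlation(img))
print(s2_r[0], s2_r[1:5])   # s2_r[0] ~ volume fraction 0.4; tail ~ 0.4**2
# img[::2, ::2] simulates one decimation step; re-running shows its effect on S2
```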
Dexter, Franklin; Ledolter, Johannes; Hindman, Bradley J
2016-01-01
In this Statistical Grand Rounds, we review methods for the analysis of the diversity of procedures among hospitals, the activities among anesthesia providers, etc. We apply multiple methods and consider their relative reliability and usefulness for perioperative applications, including calculations of SEs. We also review methods for comparing the similarity of procedures among hospitals, activities among anesthesia providers, etc. We again apply multiple methods and consider their relative reliability and usefulness for perioperative applications. The applications include strategic analyses (e.g., hospital marketing) and human resource analytics (e.g., comparisons among providers). Measures of diversity of procedures and activities (e.g., Herfindahl and Gini-Simpson index) are used for quantification of each facility (hospital) or anesthesia provider, one at a time. Diversity can be thought of as a summary measure. Thus, if the diversity of procedures for 48 hospitals is studied, the diversity (and its SE) is being calculated for each hospital. Likewise, the effective numbers of common procedures at each hospital can be calculated (e.g., by using the exponential of the Shannon index). Measures of similarity are pairwise assessments. Thus, if quantifying the similarity of procedures among cases with a break or handoff versus cases without a break or handoff, a similarity index represents a correlation coefficient. There are several different measures of similarity, and we compare their features and applicability for perioperative data. We rely extensively on sensitivity analyses to interpret observed values of the similarity index.
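The diversity measures named here are all simple functions of the procedure proportions. A short sketch with hypothetical case counts for one hospital:

```python
import numpy as np

def diversity_measures(case_counts):
    """Herfindahl, Gini-Simpson, and effective number of common procedures."""
    p = np.asarray(case_counts, dtype=float)
    p = p[p > 0] / p.sum()
    herfindahl = np.sum(p ** 2)          # concentration; its reciprocal is also used
    gini_simpson = 1.0 - herfindahl      # chance two random cases differ in procedure
    shannon = -np.sum(p * np.log(p))
    effective_n = np.exp(shannon)        # exponential of the Shannon index
    return herfindahl, gini_simpson, effective_n

# Hypothetical hospital: counts of cases by procedure code
counts = [120, 80, 40, 20, 10, 5, 5]
print(diversity_measures(counts))
```

Similarity, by contrast, is a pairwise quantity; one simple version is the correlation between two facilities' procedure-proportion vectors over a common code list.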
7 CFR 800.86 - Inspection of shiplot, unit train, and lash barge grain in single lots.
Code of Federal Regulations, 2010 CFR
2010-01-01
... prescribed in the instructions. (b) Application procedure. Applications for the official inspection of... statistical acceptance sampling and inspection plan according to the provisions of this section and procedures... inspection as part of a single lot and accepted by a statistical acceptance sampling and inspection plan...
Rossi, Pierre; Gillet, François; Rohrbach, Emmanuelle; Diaby, Nouhou; Holliger, Christof
2009-01-01
The variability of terminal restriction fragment polymorphism analysis applied to complex microbial communities was assessed statistically. Recent technological improvements were implemented in the successive steps of the procedure, resulting in a standardized procedure which provided a high level of reproducibility. PMID:19749066
DOT National Transportation Integrated Search
1981-10-01
Two statistical procedures have been developed to estimate hourly or daily aircraft counts. These counts can then be transformed into estimates of instantaneous air counts. The first procedure estimates the stable (deterministic) mean level of hourly...
The use of analysis of variance procedures in biological studies
Williams, B.K.
1987-01-01
The analysis of variance (ANOVA) is widely used in biological studies, yet there remains considerable confusion among researchers about the interpretation of hypotheses being tested. Ambiguities arise when statistical designs are unbalanced, and in particular when not all combinations of design factors are represented in the data. This paper clarifies the relationship among hypothesis testing, statistical modelling and computing procedures in ANOVA for unbalanced data. A simple two-factor fixed effects design is used to illustrate three common parametrizations for ANOVA models, and some associations among these parametrizations are developed. Biologically meaningful hypotheses for main effects and interactions are given in terms of each parametrization, and procedures for testing the hypotheses are described. The standard statistical computing procedures in ANOVA are given along with their corresponding hypotheses. Throughout the development unbalanced designs are assumed and attention is given to problems that arise with missing cells.
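The ambiguity the paper describes can be made concrete with standard software: on unbalanced data, sequential (Type I) sums of squares depend on the order in which factors enter the model, while marginal (Type III) tests require a sum-to-zero parametrization. A sketch with statsmodels and deliberately unequal cell sizes (illustrative variable names, not the paper's notation):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
rows = []
for a in ("a1", "a2"):
    for b in ("b1", "b2"):
        n = rng.integers(3, 12)                      # unequal cell sizes
        effect = (a == "a2") * 1.0 + (b == "b2") * 0.5
        for y in effect + rng.normal(size=n):
            rows.append({"A": a, "B": b, "y": y})
df = pd.DataFrame(rows)

# Type I SS depend on factor order when the design is unbalanced
m1 = smf.ols("y ~ C(A) * C(B)", data=df).fit()
print(sm.stats.anova_lm(m1, typ=1))

# Type III SS test marginal hypotheses; use sum-to-zero contrasts
m3 = smf.ols("y ~ C(A, Sum) * C(B, Sum)", data=df).fit()
print(sm.stats.anova_lm(m3, typ=3))
```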
Santori, G; Andorno, E; Morelli, N; Casaccia, M; Bottino, G; Di Domenico, S; Valente, U
2009-05-01
In many Western countries a "minimum volume rule" policy has been adopted as a quality measure for complex surgical procedures. In Italy, the National Transplant Centre set the minimum number of orthotopic liver transplantation (OLT) procedures/y at 25/center. OLT procedures performed in a single center over a reasonably long period may be treated as a time series to evaluate trend, seasonal cycles, and nonsystematic fluctuations. Between January 1, 1987 and December 31, 2006, we performed 563 cadaveric donor OLTs in adult recipients. During 2007, there were another 28 procedures. The greatest numbers of OLTs/y were performed in 2001 (n = 51), 2005 (n = 50), and 2004 (n = 49). A time series analysis performed using R (R Foundation for Statistical Computing, Vienna, Austria), a free software environment for statistical computing and graphics, showed an incremental trend after exponential smoothing as well as after seasonal decomposition. The predicted OLT/mo for 2007, calculated with Holt-Winters exponential smoothing applied to the period 1987-2006, helped to identify the months with the largest differences between predicted and performed procedures. The time series approach may be helpful for establishing a minimum volume/y at the single-center level.
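The Holt-Winters computation described is reproducible in a few lines. The sketch below uses Python's statsmodels rather than R, and synthetic monthly counts, since the original data are not given:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(9)
months = pd.date_range("1987-01", "2006-12", freq="MS")
trend = np.linspace(1.0, 4.0, len(months))            # slow growth in OLT/month
season = 0.5 * np.sin(2 * np.pi * months.month / 12)  # mild seasonal cycle
counts = np.clip(trend + season + rng.normal(0, 0.8, len(months)), 0, None)
olt = pd.Series(np.round(counts), index=months)

fit = ExponentialSmoothing(olt, trend="add", seasonal="add",
                           seasonal_periods=12).fit()
pred_2007 = fit.forecast(12)        # predicted OLT/month for the next year
print(pred_2007.round(1))
```

Comparing the forecast against the months actually observed in 2007 would flag the months with the largest predicted-versus-performed gaps, as in the study.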
A close examination of double filtering with fold change and t test in microarray analysis
2009-01-01
Background Many researchers use the double filtering procedure with fold change and t test to identify differentially expressed genes, in the hope that the double filtering will provide extra confidence in the results. Due to its simplicity, the double filtering procedure has been popular with applied researchers despite the development of more sophisticated methods. Results This paper, for the first time to our knowledge, provides theoretical insight into the drawback of the double filtering procedure. We show that fold change assumes all genes to have a common variance, while the t statistic assumes gene-specific variances. The two statistics are based on contradicting assumptions. Under the assumption that gene variances arise from a mixture of a common variance and gene-specific variances, we develop the theoretically most powerful likelihood ratio test statistic. We further demonstrate that the posterior inference based on a Bayesian mixture model and the widely used significance analysis of microarrays (SAM) statistic are better approximations to the likelihood ratio test than the double filtering procedure. Conclusion We demonstrate, through hypothesis testing theory, simulation studies and real data examples, that well-constructed shrinkage testing methods, which can be united under the mixture gene variance assumption, can considerably outperform the double filtering procedure. PMID:19995439
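The contrast the authors point to is easy to exhibit numerically. The sketch below implements the double filter and a simple shrinkage (moderated) t statistic of the general kind they favor; it is a generic SAM-style illustration, not their likelihood ratio test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
genes, n = 2000, 5                                   # 2000 genes, 5 arrays per group
a = rng.normal(size=(genes, n))
b = rng.normal(size=(genes, n))
b[:100] += 1.0                                       # 100 truly changed genes

diff = b.mean(1) - a.mean(1)                         # log fold change
s2 = (a.var(1, ddof=1) + b.var(1, ddof=1)) / 2       # gene-specific pooled variance
t = diff / np.sqrt(2 * s2 / n)                       # ordinary t statistic

# Double filter: |fold change| and t-test p-value must both pass
p = 2 * stats.t.sf(np.abs(t), df=2 * n - 2)
double = (np.abs(diff) > 1.0) & (p < 0.05)

# Shrinkage t: stabilize small gene variances with a fudge constant (SAM-like)
s0 = np.median(np.sqrt(s2))
t_shrunk = diff / (np.sqrt(2 * s2 / n) + s0 * np.sqrt(2 / n))
shrunk = np.abs(t_shrunk) > np.quantile(np.abs(t_shrunk), 1 - double.mean())

print("double filter hits in true set:", double[:100].sum())
print("shrinkage t hits in true set:  ", shrunk[:100].sum())
```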
Detection of crossover time scales in multifractal detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Ge, Erjia; Leung, Yee
2013-04-01
Fractal analysis is employed in this paper as a scale-based method for identifying the scaling behavior of time series. Many spatial and temporal processes exhibiting complex multi(mono)-scaling behaviors are fractals. One of the important concepts in fractals is the crossover time scale(s) that separates distinct regimes having different fractal scaling behaviors. A common method is multifractal detrended fluctuation analysis (MF-DFA). The detection of crossover time scale(s) is, however, relatively subjective, since it has been made without rigorous statistical procedures and has generally been determined by eyeballing or subjective observation. Crossover time scales so determined may be spurious and problematic, and may not reflect the genuine underlying scaling behavior of a time series. The purpose of this paper is to propose a statistical procedure to model complex fractal scaling behaviors and reliably identify the crossover time scales under MF-DFA. The scaling-identification regression model, grounded on a solid statistical foundation, is first proposed to describe the multi-scaling behaviors of fractals. Through regression analysis and statistical inference, we can (1) identify crossover time scales that cannot be detected by eyeballing, (2) determine the number and locations of the genuine crossover time scales, (3) give confidence intervals for the crossover time scales, and (4) establish a statistically significant regression model depicting the underlying scaling behavior of a time series. To substantiate our argument, the regression model is applied to analyze the multi-scaling behaviors of avian-influenza outbreaks, water consumption, daily mean temperature, and rainfall in Hong Kong. Through the proposed model, we can gain a deeper understanding of fractals in general and a statistical approach for identifying multi-scaling behavior under MF-DFA in particular.
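The idea of a scaling-identification regression can be illustrated generically: fit a two-segment linear model to log F(s) versus log s and pick the breakpoint (crossover scale) that minimizes the residual sum of squares. This sketch is a simplified stand-in for the authors' model, without their inference machinery:

```python
import numpy as np

def find_crossover(log_s, log_f):
    """Grid-search a single breakpoint in a two-segment linear fit."""
    best = (np.inf, None)
    for i in range(3, len(log_s) - 3):               # keep 3+ points per segment
        sse = 0.0
        for seg in (slice(None, i), slice(i, None)):
            x, y = log_s[seg], log_f[seg]
            coef = np.polyfit(x, y, 1)
            sse += np.sum((y - np.polyval(coef, x)) ** 2)
        if sse < best[0]:
            best = (sse, i)
    return log_s[best[1]]                            # log of the crossover scale

# Synthetic fluctuation function: slope 0.8 below the crossover, 0.5 above
rng = np.random.default_rng(11)
log_s = np.linspace(1, 4, 40)
cross = 2.5
log_f = np.where(log_s < cross, 0.8 * log_s,
                 0.5 * (log_s - cross) + 0.8 * cross)
log_f += rng.normal(0, 0.02, 40)
print(find_crossover(log_s, log_f))                  # ~2.5
```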
Application of real rock pore-throat statistics to a regular pore network model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rakibul, M.; Sarker, H.; McIntyre, D.
2011-01-01
This work reports the application of real rock statistical data to a previously developed regular pore network model in an attempt to produce an accurate simulation tool with low computational overhead. A core plug from the St. Peter Sandstone formation in Indiana was scanned with a high-resolution micro CT scanner. The pore-throat statistics of the three-dimensional reconstructed rock were extracted and the distribution of the pore-throat sizes was applied to the regular pore network model. In order to keep the equivalent model regular, only the throat area or the throat radius was varied. Ten realizations of randomly distributed throat sizes were generated to simulate the drainage process, and relative permeability was calculated and compared with the experimentally determined values of the original rock sample. The numerical and experimental procedures are explained in detail and the performance of the model in relation to the experimental data is discussed and analyzed. Petrophysical properties such as relative permeability are important in many applied fields such as production of petroleum fluids, enhanced oil recovery, carbon dioxide sequestration, and groundwater flow. Relative permeability data are used for a wide range of conventional reservoir engineering calculations and in numerical reservoir simulation. Two-phase oil-water relative permeability data were generated on the same core plug by both the pore network model and the experimental procedure. The shape and size of the relative permeability curves were compared and analyzed; a good match was observed for the wetting-phase relative permeability, but for the non-wetting phase the simulation results deviated from the experimental ones. Efforts to determine petrophysical properties of rocks using numerical techniques aim to eliminate the need for routine core analysis, which can be time consuming and expensive; a numerical technique is therefore expected to be fast and to produce reliable results. In applied engineering, a quick result with reasonable accuracy is sometimes preferable to more time-consuming analyses. The present work checks the accuracy and validity of a previously developed pore network model for obtaining important petrophysical properties of rocks from cutting-sized sample data.
Kasasbeh, Ehab S; Parvez, Babar; Huang, Robert L; Hasselblad, Michele Marie; Glazer, Mark D; Salloum, Joseph G; Cleator, John H; Zhao, David X
2012-11-01
To determine whether radial artery access is associated with a reduction in fluoroscopy time, procedure time, and other procedural variables over a 27-month period during which the radial artery approach was incorporated in a single academic medical center. Although previous studies have demonstrated a relationship between increased volume and decreased procedural time, no studies have looked at the integration of radial access over time. Data were collected from consecutive patients who presented to the Vanderbilt University Medical Center cardiac catheterization laboratory from January 1, 2009 to April 1, 2011. Patients who underwent radial access diagnostic catheterization with and without percutaneous coronary intervention were included in this study. A total of 1112 diagnostic cardiac catheterizations through the radial access site were analyzed. High-volume, intermediate-volume, and low-volume operators were grouped based on the percentage of procedures performed through a radial approach. From 2009 to 2011, there was a significant decrease in fluoroscopy time in all operator groups for diagnostic catheterization (P=.035). The high-volume operator group had 1.88 and 3.66 minute reductions in fluoroscopy time compared to the intermediate- and low-volume operator groups, respectively (both P<.001). Likewise, the intermediate-volume operator group had a 1.77 minute improvement compared to the low-volume operator group, but this did not reach statistical significance (P=.102). The improvement in fluoroscopy time and other procedure-related parameters was seen after approximately 25 cases, with further improvement after 75 cases. The incorporation of the radial access approach in the cardiac catheterization laboratory led to a decrease in fluoroscopy time for each operator and operator group over the 3-year period. Our data demonstrated that higher-volume radial operators have better procedure, room, and fluoroscopy times when compared to intermediate- and low-volume operators; however, lower-volume operators also show a reduction in procedure-related parameters as their radial caseload increases. The number of procedures needed to become proficient was also demonstrated in the current study.
Hudson-Shore, Michelle
2016-12-01
The Annual Statistics of Scientific Procedures on Living Animals Great Britain 2015 indicate that the Home Office were correct in recommending that caution should be exercised when interpreting the 2014 data as an apparent decline in animal experiments. The 2015 report shows that, as the changes to the format of the annual statistics have become more familiar and less problematic, there has been a re-emergence of the upward trend in animal research and testing in Great Britain. The 2015 statistics report an increase in animal procedures (up to 4,142,631) and in the number of animals used (up to 4,069,349). This represents 1% more than the totals in 2013, and a 7% increase on the procedures reported in 2014. This paper details an analysis of these most recent statistics, providing information on overall animal use and highlighting specific issues associated with genetically-altered animals, dogs and primates. It also reflects on areas of the new format that have previously been highlighted as being problematic, and concludes with a discussion about the use of animals in regulatory research and testing, and how there are significant missed opportunities for replacing some of the animal-based tests in this area. 2016 FRAME.
Hoppe, Ian C; Pastor, Craig J; Paik, Angie M
2012-10-01
In plastic surgery, 2 predominant practice environments exist, namely, the academic setting and private practice. These 2 groups cater their practice toward the needs and demands of 2 very different patient populations. The goal of this paper is to examine well-established economic indicators and delineate their relationship, if any, with the volume of different plastic surgical procedures performed in the United States. Information from the American Society of Plastic Surgeons' annual reports on plastic surgery statistics was collected from the year 2000 through 2010 and compared to readily available and established economic indicators. There was a significant positive relationship with total cosmetic procedures and gross domestic product (GDP), GDP per capita, personal income, consumer price index (CPI) (all), and CPI (medical). There was a significant positive relationship between cosmetic surgical procedures and the issuance of new home permits and the average prime rate charged by banks. There was a significant positive relationship with cosmetic minimally invasive procedures and GDP, GDP per capita, personal income, CPI (all), and CPI (medical). There was a significant negative relationship between reconstructive procedures and GDP, GDP per capita, personal income, CPI (all), and CPI (medical). Cosmetic minimally invasive procedures seem to be decided on relatively quickly during good economic times. Cosmetic surgical procedures seem to be more planned and less related to the economic environment. The plastic surgeon may use this relationship to tailor the focus of his or her practice to be best situated for economic fluctuations.
An evaluation of periodontal assessment procedures among Indiana dental hygienists.
Stephan, Christine A
2014-01-01
Using a descriptive correlational design, this study surveyed periodontal assessment procedures currently performed by Indiana dental hygienists in general dentistry practices to determine whether deficiencies in assessment exist. Members (n = 354) of the Indiana Dental Hygienists' Association (IDHA) were invited to participate in the survey. A 22-question multiple-choice survey, using Likert scales for responses, was open to participants for three weeks. Descriptive and non-parametric inferential statistics were used to analyze questions related to demographics and assessment procedures practiced. In addition, awareness of the periodontal assessment procedures recommended by the American Academy of Periodontology (AAP) was evaluated. Of the 354 Indiana dental hygienists surveyed, a 31.9% response rate was achieved. Participants were asked to identify the recommended AAP periodontal assessment procedures they perform. The majority of respondents indicated either frequently or always performing the listed assessment procedures. Additionally, significant relationships were found between demographic factors and participants' awareness and performance of recommended AAP assessment procedures. While the information gathered from this study is a valuable addition to the literature on periodontal disease assessment, continued research with larger surveys should be conducted to obtain a more accurate national representation of what is practiced by dental hygienists.
Strum, David P; May, Jerrold H; Sampson, Allan R; Vargas, Luis G; Spangler, William E
2003-01-01
Variability inherent in the duration of surgical procedures complicates surgical scheduling. Modeling the duration and variability of surgeries might improve time estimates. Accurate time estimates are important operationally to improve utilization, reduce costs, and identify surgeries that might be considered outliers. Surgeries with multiple procedures are difficult to model because they are difficult to segment into homogeneous groups and because they are performed less frequently than single-procedure surgeries. The authors studied, retrospectively, 10,740 surgeries each with exactly two CPTs and 46,322 surgical cases with only one CPT from a large teaching hospital to determine whether the distribution of dual-procedure surgery times fits more closely a lognormal or a normal model. The authors tested model goodness of fit to their data using Shapiro-Wilk tests, studied factors affecting the variability of time estimates, and examined the impact of coding permutations (ordered combinations) on modeling. The Shapiro-Wilk tests indicated that the lognormal model is statistically superior to the normal model for modeling dual-procedure surgeries. Permutations of component codes did not appear to differ significantly with respect to total procedure time and surgical time. To improve individual models for infrequent dual-procedure surgeries, permutations may be reduced and estimates may be based on the longest component procedure and type of anesthesia. The authors recommend use of the lognormal model for estimating surgical times for surgeries with two component procedures. Their results help legitimize the use of log transforms to normalize surgical procedure times prior to hypothesis testing using linear statistical models. Multiple-procedure surgeries may be modeled using the longest (statistically most important) component procedure and type of anesthesia.
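A minimal sketch of the kind of comparison the authors describe: Shapiro-Wilk tests applied to raw and log-transformed surgical durations, with synthetic durations standing in for the study data (the parameters below are invented for illustration).

```python
# Compare normal vs lognormal fit for surgical durations via Shapiro-Wilk.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical durations (minutes) for one dual-procedure surgery group.
durations = rng.lognormal(mean=np.log(120), sigma=0.45, size=200)

w_raw, p_raw = stats.shapiro(durations)          # H0: times are normal
w_log, p_log = stats.shapiro(np.log(durations))  # H0: log-times are normal

print(f"normal model:    W={w_raw:.3f}, p={p_raw:.4f}")
print(f"lognormal model: W={w_log:.3f}, p={p_log:.4f}")
# A higher Shapiro-Wilk p-value on the log scale favours the lognormal model.
```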
Yan, Chao; Liang, Wei-Tao; Wang, Zhong-Gao; Hu, Zhi-Wei; Wu, Ji-Min; Zhang, Chao; Chen, Mei-Ping
2015-12-07
To compare the outcomes between the Stretta procedure and laparoscopic toupet fundoplication (LTF) in patients with gastroesophageal reflux disease (GERD)-related extra-esophageal symptoms. From January 2011 to February 2012, a total of 98 patients diagnosed with GERD-related extra-esophageal symptoms who met the inclusion criteria were enrolled in this study. All patients who underwent either the Stretta procedure or LTF treatment have now completed the 3-year follow-up. Primary outcome measures, including frequency and severity of extra-esophageal symptoms, proton pump inhibitor (PPI) use, satisfaction, and postoperative complications, were assessed. The results of the Stretta procedure and LTF therapy were analyzed and compared. There were 47 patients in the Stretta group and 51 patients in the LTF group. Ninety patients were available at the 3-year follow-up. The combined frequency and severity scores improved for every symptom in both groups (P < 0.05). Improvement in symptom scores for cough, sputum, and wheezing did not achieve statistical significance between the two groups (P > 0.05). However, the score for globus hystericus differed between the Stretta group and the LTF group (4.9 ± 2.24 vs 3.2 ± 2.63, P < 0.05). After the Stretta procedure and LTF treatment, 29 and 33 patients in each group, respectively, achieved PPI therapy independence (61.7% vs 64.7%, P = 0.835). The patients in the LTF group were more satisfied with their quality of life than those in the Stretta procedure group (P < 0.05). Most complications resolved without intervention within two weeks; however, two patients in the LTF group still suffered from severe dysphagia 2 weeks after the operation, which improved after bougie dilation treatment in both patients. The Stretta procedure and LTF were both safe and effective for the control of GERD-related extra-esophageal symptoms and the reduction of PPI use.
An Empirical Investigation of Methods for Assessing Item Fit for Mixed Format Tests
ERIC Educational Resources Information Center
Chon, Kyong Hee; Lee, Won-Chan; Ansley, Timothy N.
2013-01-01
Empirical information regarding performance of model-fit procedures has been a persistent need in measurement practice. Statistical procedures for evaluating item fit were applied to real test examples that consist of both dichotomously and polytomously scored items. The item fit statistics used in this study included the PARSCALE's G²,…
An automated approach to the design of decision tree classifiers
NASA Technical Reports Server (NTRS)
Argentiero, P.; Chin, R.; Beaudet, P.
1982-01-01
An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
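The abstract's by-product algorithm, computing the global probability of correct classification under independence of the node decision rules, reduces to multiplying per-node accuracies along each class's root-to-leaf path and weighting by the a priori class statistics. A hedged numeric sketch follows; class names, priors, and node accuracies are invented, not taken from the Landsat example.

```python
# Global P(correct) for a decision tree, assuming statistically independent
# node decision rules (illustrative numbers only).
priors = {"water": 0.2, "forest": 0.5, "urban": 0.3}   # a priori class stats
# Per-class node-level correct-decision probabilities on the path from the
# root to that class's leaf (hypothetical error-matrix values).
path_acc = {
    "water":  [0.98, 0.95],
    "forest": [0.98, 0.92, 0.90],
    "urban":  [0.98, 0.92, 0.88],
}

p_correct = 0.0
for cls, prior in priors.items():
    p_path = 1.0
    for p_node in path_acc[cls]:
        p_path *= p_node          # independence: probabilities multiply
    p_correct += prior * p_path

print(f"global P(correct) = {p_correct:.3f}")
```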
Biostatistical analysis of quantitative immunofluorescence microscopy images.
Giles, C; Albrecht, M A; Lam, V; Takechi, R; Mamo, J C
2016-12-01
Semiquantitative immunofluorescence microscopy has become a key methodology in biomedical research. Typical statistical workflows are considered in the context of avoiding pseudo-replication and marginalising experimental error. However, immunofluorescence microscopy naturally generates hierarchically structured data that can be leveraged to improve statistical power and enrich biological interpretation. Herein, we describe a robust distribution fitting procedure and compare several statistical tests, outlining their potential advantages/disadvantages in the context of biological interpretation. Further, we describe tractable procedures for power analysis that incorporates the underlying distribution, sample size and number of images captured per sample. The procedures outlined have significant potential for increasing understanding of biological processes and decreasing both ethical and financial burden through experimental optimization. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
Fan, Tao; Zhao, XinGang; Zhao, HaiJun; Liang, Cong; Wang, YinQian; Gai, QiFei; Zhang, Fangyi
2015-10-01
It is well established that syringomyelia can cause neurological symptoms and deficits through the accumulation of fluid within syrinx cavities, which leads to internal compression within the spinal cord. When other interventions treating the underlying etiology fail to yield any improvement, the next option is a procedure to divert the fluid from the syrinx cavity, such as syringo-subarachnoid, syringo-peritoneal, or syringo-pleural shunting. The indications and long-term efficacy of these direct shunting procedures are still questionable and controversial. To investigate the clinical indications, outcomes, and complications of the syringo-pleural shunt (SPS) as an alternative treatment for syringomyelia, we report a retrospective series of 26 cases of syringomyelia with an indication for a diversion procedure; SPS was offered in all cases. Patients' symptoms, mJOA scores, and MRI findings were collected to evaluate the change in the syringomyelia and the prognosis of the patients. All 26 patients underwent SPS, with a mean follow-up time of 27.4 months. A two-tailed Wilcoxon signed-rank test was used for the statistical analysis of the mJOA scores. The key surgical technique, outcomes, and complications of SPS are reported in detail. No mortality or severe complications occurred. Postoperative MRIs revealed near-complete resolution of the syrinx in 14 patients, significant shrinkage of the syrinx in 10 patients, and no obvious reduction in the remaining 2 patients. Postoperatively, symptoms improved in 24 cases (92.3%). Statistical analysis of the mJOA scores showed a statistically significant difference (P<0.001) between the preoperative group and the 2-week postoperative group, with no further significant improvement between 2 weeks and the final follow-up at 27 months. Collapse or remarkable shrinkage of the syrinx by SPS could ameliorate or at least stabilize the symptoms. We recommend a small laminectomy and a less-than-3-mm myelotomy at either the PML or the DREZ. The SPS procedure can be an effective and relatively long-lived treatment for idiopathic syringomyelia and for cases in which other options have failed. Copyright © 2015 Elsevier B.V. All rights reserved.
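The analysis step named in the abstract, a two-tailed Wilcoxon signed-rank test on paired pre- and post-operative mJOA scores, is compact enough to sketch directly; the scores below are hypothetical stand-ins, not the study data.

```python
# Two-tailed Wilcoxon signed-rank test on paired pre/post mJOA scores.
from scipy.stats import wilcoxon

pre_mjoa  = [9, 10, 8, 11, 12, 9, 10, 7, 11, 10]
post_mjoa = [12, 13, 10, 13, 14, 11, 13, 9, 13, 12]

stat, p = wilcoxon(pre_mjoa, post_mjoa, alternative="two-sided")
print(f"Wilcoxon W={stat:.1f}, p={p:.4f}")  # small p: scores changed after SPS
```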
Longitudinal Data Analyses Using Linear Mixed Models in SPSS: Concepts, Procedures and Illustrations
Shek, Daniel T. L.; Ma, Cecilia M. S.
2011-01-01
Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analyses package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented. PMID:21218263
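The paper demonstrates LMM procedures in SPSS; as a hedged cross-check, the same kind of random-intercept growth model over six waves can be fitted in Python with statsmodels. All variable names and data below are synthetic, not the Project P.A.T.H.S. data.

```python
# Random-intercept linear mixed model for six waves of longitudinal data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n, waves = 100, 6
df = pd.DataFrame({
    "id":   np.repeat(np.arange(n), waves),
    "time": np.tile(np.arange(waves), n),
})
u = rng.normal(0, 1.0, n)                      # subject-specific intercepts
df["y"] = 10 + 0.5 * df["time"] + u[df["id"]] + rng.normal(0, 1, len(df))

# Fixed effect for time, random intercept per subject; the dependence of
# repeated observations within a subject is modelled, unlike in GLM.
model = smf.mixedlm("y ~ time", df, groups=df["id"])
result = model.fit()
print(result.summary())
```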
Effects of aerodynamic heating and TPS thermal performance uncertainties on the Shuttle Orbiter
NASA Technical Reports Server (NTRS)
Goodrich, W. D.; Derry, S. M.; Maraia, R. J.
1980-01-01
A procedure for estimating uncertainties in the aerodynamic-heating and thermal protection system (TPS) thermal-performance methodologies developed for the Shuttle Orbiter is presented. This procedure is used in predicting uncertainty bands around expected or nominal TPS thermal responses for the Orbiter during entry. Individual flowfield and TPS parameters that make major contributions to these uncertainty bands are identified and, by statistical considerations, combined in a manner suitable for making engineering estimates of the TPS thermal confidence intervals and temperature margins relative to design limits. Thus, for a fixed TPS design, entry trajectories for future Orbiter missions can be shaped subject to both the thermal-margin and confidence-interval requirements. This procedure is illustrated by assessing the thermal margins offered by selected areas of the existing Orbiter TPS design for an entry trajectory typifying early flight test missions.
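The paper's exact combination rule is not reproduced in the abstract; as a hedged illustration only, one common statistical way to combine independent parameter contributions into an overall band is root-sum-square (RSS). The contribution names and numbers below are invented.

```python
# RSS combination of independent 1-sigma temperature contributions
# (illustrative assumption, not the paper's documented methodology).
import math

contributions = {                      # hypothetical deg-F contributions
    "heating rate":      45.0,
    "transition time":   30.0,
    "surface catalysis": 25.0,
    "TPS conductivity":  20.0,
}

rss = math.sqrt(sum(v**2 for v in contributions.values()))
nominal = 2100.0                       # hypothetical nominal peak temperature
print(f"nominal {nominal:.0f} F, +/- {rss:.0f} F (RSS of contributions)")
```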
Rasmussen, M; Espelund, U S; Juul, N; Yoo, A J; Sørensen, L H; Sørensen, K E; Johnsen, S P; Andersen, G; Simonsen, C Z
2018-06-01
Observational studies have suggested that low blood pressure and blood pressure variability may partially explain adverse neurological outcome after endovascular therapy with general anaesthesia (GA) for acute ischaemic stroke. The aim of this study was to further examine whether blood pressure related parameters during endovascular therapy are associated with neurological outcome. The GOLIATH trial randomised 128 patients to either GA or conscious sedation for endovascular therapy in acute ischaemic stroke. The primary outcome was 90 day modified Rankin Score. The haemodynamic protocol aimed at keeping the systolic blood pressure >140 mm Hg and mean blood pressure >70 mm Hg during the procedure. Blood pressure related parameters of interest included 20% reduction in mean blood pressure; mean blood pressure <70 mm Hg, <80 mm Hg, and <90 mm Hg, respectively; time with systolic blood pressure <140 mm Hg; procedural minimum and maximum mean and systolic blood pressure; mean blood pressure at the time of groin puncture; postreperfusion mean blood pressure; blood pressure variability; and use of vasopressors. Sensitivity analyses were performed in the subgroup of reperfused patients. Procedural average mean and systolic blood pressures were higher in the conscious sedation group (P<0.001). The number of patients with mean blood pressure <70-90 mm Hg and systolic blood pressure <140 mm Hg, blood pressure variability, and use of vasopressors were all higher in the GA group (P<0.001). There was no statistically significant association between any of the examined blood pressure related parameters and the modified Rankin Score in the overall patient population, and in the subgroup of patients with full reperfusion. We found no statistically significant association between blood pressure related parameters during endovascular therapy and neurological outcome. NCT 02317237. Copyright © 2018 British Journal of Anaesthesia. Published by Elsevier Ltd. All rights reserved.
Generalized Appended Product Indicator Procedure for Nonlinear Structural Equation Analysis.
ERIC Educational Resources Information Center
Wall, Melanie M.; Amemiya, Yasuo
2001-01-01
Considers the estimation of polynomial structural models and shows a limitation of an existing method. Introduces a new procedure, the generalized appended product indicator procedure, for nonlinear structural equation analysis. Addresses statistical issues associated with the procedure through simulation. (SLD)
A Multidisciplinary Approach for Teaching Statistics and Probability
ERIC Educational Resources Information Center
Rao, C. Radhakrishna
1971-01-01
The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)
Barros, Raquel R M; Novaes, Arthur B Júnior; Grisi, Márcio F M; Souza, Sérgio L S; Taba, Mário Júnior; Palioto, Daniela B
2004-10-01
The acellular dermal matrix graft (ADMG) has become widely used in periodontal surgeries as a substitute for the subepithelial connective tissue graft (SCTG). These grafts exhibit different healing processes due to their distinct cellular and vascular structures. Therefore, the surgical technique primarily developed for the autograft may not be adequate for the allograft. This study compared the clinical results of two surgical techniques--the "conventional" and a modified procedure--for the treatment of localized gingival recessions with the ADMG. A total of 32 bilateral Miller Class I or II gingival recessions were selected and randomly assigned to test and control groups. The control group received the SCTG and the test group the modified surgical technique. Probing depth (PD), relative clinical attachment level (RCAL), gingival recession (GR), and width of keratinized tissue (KT) were measured 2 weeks prior to surgery and 6 months post-surgery. Both procedures improved all the evaluated parameters after 6 months. Comparisons between the groups by Mann-Whitney rank sum test revealed no statistically significant differences in terms of CAL gain, PD reduction, and increase in KT from baseline to the 6-month evaluation. However, there was a statistically significant greater reduction of GR favoring the modified technique (P = 0.002). The percentage of root coverage was 79% for the test group and 63.9% for the control group. We conclude that the modified technique is more suitable for root coverage procedures with the ADMG, since it produced statistically significantly better clinical results than the traditional technique.
Davis, Joe M
2011-10-28
General equations are derived for the distribution of minimum resolution between two chromatographic peaks, when peak heights in a multi-component chromatogram follow a continuous statistical distribution. The derivation draws on published theory by relating the area under the distribution of minimum resolution to the area under the distribution of the ratio of peak heights, which in turn is derived from the peak-height distribution. Two procedures are proposed for the equations' numerical solution. The procedures are applied to the log-normal distribution, which recently was reported to describe the distribution of component concentrations in three complex natural mixtures. For published statistical parameters of these mixtures, the distribution of minimum resolution is similar to that for the commonly assumed exponential distribution of peak heights used in statistical-overlap theory. However, these two distributions of minimum resolution can differ markedly, depending on the scale parameter of the log-normal distribution. Theory for the computation of the distribution of minimum resolution is extended to other cases of interest. With the log-normal distribution of peak heights as an example, the distribution of minimum resolution is computed when small peaks are lost due to noise or detection limits, and when the height of at least one peak is less than an upper limit. The distribution of minimum resolution shifts slightly to lower resolution values in the first case and to markedly larger resolution values in the second one. The theory and numerical procedure are confirmed by Monte Carlo simulation. Copyright © 2011 Elsevier B.V. All rights reserved.
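A Monte Carlo sketch in the spirit of the paper: draw peak heights from a log-normal distribution and examine the distribution of the ratio of paired peak heights, from which the paper derives the distribution of minimum resolution. The scale parameters are illustrative, and the final mapping from height ratio to minimum resolution is not reproduced here.

```python
# Monte Carlo of the peak-height-ratio distribution for log-normal heights.
import numpy as np

rng = np.random.default_rng(0)
n_pairs = 100_000
h1 = rng.lognormal(mean=0.0, sigma=1.0, size=n_pairs)
h2 = rng.lognormal(mean=0.0, sigma=1.0, size=n_pairs)

ratio = np.minimum(h1, h2) / np.maximum(h1, h2)   # smaller/larger, in (0, 1]
print("median height ratio:", np.median(ratio))
print("5th/95th percentiles:", np.percentile(ratio, [5, 95]))
# The paper relates the area under this ratio distribution to the area under
# the distribution of minimum resolution; that step is theory-specific.
```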
Hayton, Anna; Wallace, Anthony; Johnston, Peter
2015-12-01
The radiation dose to the Australian paediatric population as a result of medical imaging is of growing concern, in particular the dose from CT. Estimates of the Australian population dose have largely relied on Medicare Australia statistics, which capture only a fraction of the imaging procedures actually performed. The fraction not captured has been estimated using a value obtained from a survey of the adult population in the mid-1990s. To better quantify the fraction of procedures that are not captured by Medicare Australia, procedure frequency and funding data for adult and paediatric patients were obtained from a metropolitan tertiary teaching and research hospital. Five calendar years of data were obtained, with a financial class specified for each individual procedure. The financial classes were grouped to give the percentage of Medicare Australia billable procedures for both adult and paediatric patients. The data were also grouped to align with the Medicare Australia age cohorts. The percentage of CT procedures billable to Medicare Australia increased from 16% to 28% between 2008 and 2012. In 2012, the percentage billable for adult and paediatric patients was 28% and 33%, respectively; however, many adult CT procedures are performed at stand-alone clinics, which bulk bill. Using Medicare Australia statistics alone, the frequency of CT procedures performed on the Australian paediatric population will be grossly underestimated. A correction factor of 4.5 is suggested for paediatric procedures and 1.5 for adult procedures. The fraction of actual procedures performed that are captured by Medicare Australia will vary with time. © 2015 The Royal Australian and New Zealand College of Radiologists.
Establishing the traceability of a uranyl nitrate solution to a standard reference material
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackson, C.H.; Clark, J.P.
1978-01-01
A uranyl nitrate solution for use as a Working Calibration and Test Material (WCTM) was characterized using a statistically designed procedure to document traceability to a National Bureau of Standards reference material (SRM-960). A Reference Calibration and Test Material (RCTM) was prepared from SRM-960 uranium metal to approximate the acid and uranium concentration of the WCTM, and this solution was used in the characterization procedure. Details of preparing, handling, and packaging these solutions are covered. Two outside laboratories, each having measurement expertise using a different analytical method, were selected to measure both solutions according to the procedure for characterizing the WCTM. Two different methods were also used for the in-house characterization work. All analytical results were tested for statistical agreement before the WCTM concentration and limit-of-error values were calculated. A concentration value was determined with a relative limit of error (RLE) of approximately 0.03%, which was better than the target RLE of 0.08%. The use of this working material eliminates the expense of using SRMs to fulfill traceability requirements for uranium measurements on this type of material. Several years' supply of uranyl nitrate solution with NBS traceability was produced. The cost of this material was less than 10% of an equal quantity of SRM-960 uranium metal.
NASA Astrophysics Data System (ADS)
Sarkar, Arnab; Karki, Vijay; Aggarwal, Suresh K.; Maurya, Gulab S.; Kumar, Rohit; Rai, Awadhesh K.; Mao, Xianglei; Russo, Richard E.
2015-06-01
Laser induced breakdown spectroscopy (LIBS) was applied for elemental characterization of high alloy steel using partial least squares regression (PLSR), with the objective of evaluating the analytical performance of this multivariate approach. The optimization of the number of principal components for minimizing error in the PLSR algorithm was investigated. The effect of different pre-treatment procedures on the raw spectral data before PLSR analysis was evaluated based on several statistical parameters (standard error of prediction, percentage relative error of prediction, etc.). The pre-treatment with the "NORM" parameter gave the optimum statistical results. The analytical performance of the PLSR model improved with an increasing number of laser pulses accumulated per spectrum, as well as by truncating the spectrum to an appropriate wavelength region. It was found that the statistical benefit of truncating the spectrum can also be accomplished by increasing the number of laser pulses per accumulation without spectral truncation. The constituents (Co and Mo) present at hundreds of ppm were determined with a relative precision of 4-9% (2σ), whereas the major constituents Cr and Ni (present at a few percent levels) were determined with a relative precision of ~2% (2σ).
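A hedged sketch of the multivariate calibration step: PLSR on row-normalized spectra with the number of latent components chosen by cross-validation. The data are synthetic stand-ins for LIBS spectra, and the row-norm pre-treatment is an assumption standing in for the paper's "NORM" option.

```python
# PLSR with cross-validated choice of the number of components.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))            # 60 spectra x 500 channels
true_w = rng.normal(size=500)
y = X @ true_w * 0.01 + rng.normal(scale=0.1, size=60)   # e.g. wt% of Cr

X_norm = X / np.linalg.norm(X, axis=1, keepdims=True)    # "NORM"-style scaling

best = None
for k in range(1, 11):
    mse = -cross_val_score(PLSRegression(n_components=k), X_norm, y,
                           cv=5, scoring="neg_mean_squared_error").mean()
    if best is None or mse < best[1]:
        best = (k, mse)
print(f"optimal components: {best[0]} (CV-MSE {best[1]:.4f})")
```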
Factors Influencing the Fatigue Strength of Materials
NASA Technical Reports Server (NTRS)
Bollenrath, F
1941-01-01
A number of factors are considered which influence the static and fatigue strength of materials under practical operating conditions as contrasted with the relations obtaining under conditions of the usual testing procedure. Such factors are interruptions in operation, periodically fluctuating stress limits and mean stresses with periodic succession of several groups and stress states, statistical changes and succession of stress limits and mean stresses, frictional corrosion at junctures, and notch effects.
Biau, D J; Meziane, M; Bhumbra, R S; Dumaine, V; Babinet, A; Anract, P
2011-09-01
The purpose of this study was to define immediate post-operative 'quality' in total hip replacements and to study prospectively the occurrence of failure based on these definitions of quality. The evaluation and assessment of failure were based on ten radiological and clinical criteria. The cumulative summation (CUSUM) test was used to study 200 procedures over a one-year period. Technical criteria defined failure in 17 cases (8.5%), those related to the femoral component in nine (4.5%), the acetabular component in 32 (16%), and those relating to discharge from hospital in five (2.5%). Overall, the procedure was considered to have failed in 57 of the 200 total hip replacements (28.5%). The use of a new design of acetabular component was associated with more failures. For the CUSUM test, the level of adequate performance was set at a failure rate of 20% and the level of inadequate performance at a failure rate of 40%; no alarm was raised by the test, indicating that there was no evidence of inadequate performance. A continuous statistical monitoring method is useful for ensuring that the quality of total hip replacement is maintained, especially as newer implants are introduced.
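A minimal sketch of a Bernoulli CUSUM of the kind described, using the paper's acceptable (20%) and inadequate (40%) failure rates; the decision limit h and the outcome sequence are illustrative assumptions, not the study's values.

```python
# Bernoulli CUSUM for monitoring a procedure failure rate.
import math

p0, p1 = 0.20, 0.40                     # acceptable / inadequate failure rates
w_fail = math.log(p1 / p0)              # score added for a failed procedure
w_ok   = math.log((1 - p1) / (1 - p0))  # (negative) score for a success
h = 4.5                                 # hypothetical decision limit

outcomes = [0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 0, 0]  # 1 = failure
s = 0.0
for i, fail in enumerate(outcomes, start=1):
    s = max(0.0, s + (w_fail if fail else w_ok))  # reset-at-zero CUSUM
    if s >= h:
        print(f"alarm at procedure {i}: evidence of inadequate performance")
        break
else:
    print(f"no alarm after {len(outcomes)} procedures (CUSUM = {s:.2f})")
```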
Pasquotti, Giulio; Faccinetto, Alex; Marchioro, Umberto; Todisco, Matteo; Baldo, Vincenzo; Cocchio, Silvia; De Conti, Giorgio
2016-08-01
To monitor the results of ultrasound (US)-guided percutaneous treatment of calcific tendinopathy of the shoulder at 12 months (T12) after treatment (T0), and to verify possible relations between pre- and post-procedural variables and the clinical outcome at T12. Forty-seven patients (26 female and 21 male) were enrolled in the study. Patients' approval and written informed consent were obtained. Symptoms were assessed by the Constant Shoulder Score (CSS) at T0 and T12. Thirty of these patients also underwent a CSS control at 3 months (T3). The treatment efficacy was statistically tested for relations with the location and type of calcification, characteristics of the tendon and subdeltoid bursa, impingement, and rehabilitation treatments. There was a significant increase in the average CSS value between T0 and T12 (40.7 vs. 75.3). The variables analysed did not show a statistically significant effect on the outcome at T12. A link was noticed only between patients' increasing age and score improvement, particularly among female subjects. US-guided treatment of calcific tendonitis is a viable therapeutic option. No pre- or intra-procedural parameters emerged which might help in predicting the outcome, apart from patients' needs in everyday life. • US-guided treatment of shoulder calcific tendinopathy is an excellent therapeutic option • Long-term results seem greatly affected by patients' features and needs in everyday life • No proven pre- or intra-procedural parameters emerged that might predict the outcome.
Silver, Matt; Montana, Giovanni
2012-01-01
Where causal SNPs (single nucleotide polymorphisms) tend to accumulate within biological pathways, the incorporation of prior pathways information into a statistical model is expected to increase the power to detect true associations in a genetic association study. Most existing pathways-based methods rely on marginal SNP statistics and do not fully exploit the dependence patterns among SNPs within pathways. We use a sparse regression model, with SNPs grouped into pathways, to identify causal pathways associated with a quantitative trait. Notable features of our “pathways group lasso with adaptive weights” (P-GLAW) algorithm include the incorporation of all pathways in a single regression model, an adaptive pathway weighting procedure that accounts for factors biasing pathway selection, and the use of a bootstrap sampling procedure for the ranking of important pathways. P-GLAW takes account of the presence of overlapping pathways and uses a novel combination of techniques to optimise model estimation, making it fast to run, even on whole genome datasets. In a comparison study with an alternative pathways method based on univariate SNP statistics, our method demonstrates high sensitivity and specificity for the detection of important pathways, showing the greatest relative gains in performance where marginal SNP effect sizes are small. PMID:22499682
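The core update behind a group-penalized model like P-GLAW can be sketched generically: proximal gradient descent with block (group-wise) soft thresholding, which shrinks each pathway's coefficient block toward zero as a unit. This is a generic sketch of the group lasso, not the authors' implementation; the groups, penalty, and step size are illustrative.

```python
# Group lasso via proximal gradient descent with block soft-thresholding.
import numpy as np

def group_soft_threshold(beta, groups, lam):
    """Shrink each group's coefficient block toward zero as a unit."""
    out = beta.copy()
    for g in groups:                       # g: index array for one pathway
        norm = np.linalg.norm(beta[g])
        out[g] = 0.0 if norm == 0 else beta[g] * max(0.0, 1 - lam / norm)
    return out

rng = np.random.default_rng(0)
n, p = 200, 30
X = rng.normal(size=(n, p))
beta_true = np.zeros(p); beta_true[:5] = 1.0   # only the first group matters
y = X @ beta_true + rng.normal(scale=0.5, size=n)

groups = [np.arange(0, 5), np.arange(5, 15), np.arange(15, 30)]
beta, step, lam = np.zeros(p), 1e-3, 40.0
for _ in range(2000):
    grad = X.T @ (X @ beta - y)                     # squared-error gradient
    beta = group_soft_threshold(beta - step * grad, groups, step * lam)

print("selected groups:", [i for i, g in enumerate(groups)
                           if np.linalg.norm(beta[g]) > 1e-8])
```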
Tosri, Nisanuch; Rojanasthien, Noppamas; Srichairatanakool, Somdet; Sangdee, Chaichan
2013-01-01
The objective of this study was to determine the pharmacokinetics of caffeine after single administration of a coffee enema versus coffee consumed orally in healthy male subjects. The study design was an open-label, randomized, two-phase crossover study. Eleven healthy subjects were randomly assigned either to receive 500 mL of coffee enema for 10 minutes or to consume 180 mL of a ready-to-drink coffee beverage. After a washout period of at least 10 days, all subjects were switched to the alternate coffee procedure. Blood samples were collected immediately before and at specific time points until 12 hours after coffee administration in each phase. The mean caffeine content of the coffee solution prepared for the coffee enema and of the ready-to-drink coffee beverage was not statistically different. The Cmax and AUC of caffeine obtained from the coffee enema were significantly lower, by a factor of about 3.5, than those of the coffee consumed orally, despite a slightly but significantly faster Tmax. The t1/2 of caffeine did not differ statistically between the two coffee procedures. In summary, the relative bioavailability of caffeine obtained from the coffee enema was about 3.5 times lower than that of the coffee consumed orally. PMID:23533801
The Mantel-Haenszel procedure revisited: models and generalizations.
Fidler, Vaclav; Nagelkerke, Nico
2013-01-01
Several statistical methods have been developed for adjusting the Odds Ratio of the relation between two dichotomous variables X and Y for some confounders Z. With the exception of the Mantel-Haenszel method, commonly used methods, notably binary logistic regression, are not symmetrical in X and Y. The classical Mantel-Haenszel method however only works for confounders with a limited number of discrete strata, which limits its utility, and appears to have no basis in statistical models. Here we revisit the Mantel-Haenszel method and propose an extension to continuous and vector valued Z. The idea is to replace the observed cell entries in strata of the Mantel-Haenszel procedure by subject specific classification probabilities for the four possible values of (X,Y) predicted by a suitable statistical model. For situations where X and Y can be treated symmetrically we propose and explore the multinomial logistic model. Under the homogeneity hypothesis, which states that the odds ratio does not depend on Z, the logarithm of the odds ratio estimator can be expressed as a simple linear combination of three parameters of this model. Methods for testing the homogeneity hypothesis are proposed. The relationship between this method and binary logistic regression is explored. A numerical example using survey data is presented.
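The classical estimator the paper generalizes is compact enough to sketch directly: the Mantel-Haenszel pooled odds ratio over stratified 2x2 tables. The tables below are invented for illustration.

```python
# Classical Mantel-Haenszel pooled odds ratio over strata of Z.
import numpy as np

# Each stratum: 2x2 table [[a, b], [c, d]] of (X, Y) counts.
strata = [
    np.array([[20, 10], [15, 25]]),
    np.array([[30, 20], [10, 18]]),
    np.array([[12,  8], [ 9, 14]]),
]

num = sum(t[0, 0] * t[1, 1] / t.sum() for t in strata)  # sum a_i d_i / n_i
den = sum(t[0, 1] * t[1, 0] / t.sum() for t in strata)  # sum b_i c_i / n_i
print(f"Mantel-Haenszel OR = {num / den:.3f}")
# Note the symmetry in X and Y that the paper highlights: transposing every
# table leaves the pooled odds ratio unchanged.
```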
NASA Astrophysics Data System (ADS)
Cheong, Youjin; Kim, Young Jin; Kang, Heeyoon; Choi, Samjin; Lee, Hee Joo
2017-08-01
Although many methodologies have been developed to identify unknown bacteria, bacterial identification in clinical microbiology remains a complex and time-consuming procedure. To address this problem, we developed a label-free method for rapidly identifying clinically relevant multilocus sequence typing-verified quinolone-resistant Klebsiella pneumoniae strains. We also applied the method to identify three strains from colony samples, ATCC70063 (control), ST11 and ST15; these are the prevalent quinolone-resistant K. pneumoniae strains in East Asia. The colonies were identified using a drop-coating deposition surface-enhanced Raman scattering (DCD-SERS) procedure coupled with a multivariate statistical method. Our workflow exhibited an enhancement factor of 11.3 × 10⁶ to Raman intensities, high reproducibility (relative standard deviation of 7.4%), and a sensitive limit of detection (100 pM rhodamine 6G), with a correlation coefficient of 0.98. All quinolone-resistant K. pneumoniae strains showed similar spectral Raman shifts (high correlations) regardless of bacterial type, as well as different Raman vibrational modes compared to Escherichia coli strains. Our proposed DCD-SERS procedure coupled with the multivariate statistics-based identification method achieved excellent performance in discriminating similar microbes from one another and also in subtyping of K. pneumoniae strains. Therefore, our label-free DCD-SERS procedure coupled with the computational decision supporting method is a potentially useful method for the rapid identification of clinically relevant K. pneumoniae strains.
Rüter, Anders; Vikstrom, Tore
2009-01-01
Good staff procedure skills in a management group during incidents and disasters are believed to be a prerequisite for good management of the situation; however, this has not been demonstrated scientifically. Templates for evaluating results from performance indicators during simulation exercises have previously been tested. The aim of this study was to demonstrate that these indicators can be used as a tool for studying the relationship between good management skills and good staff procedure skills, the hypothesis being that good and structured work (staff procedure skills) in a hospital management group during simulation exercises in disaster medicine is related to good and timely decisions (good management skills). Results from 29 consecutive simulation exercises in which staff procedure skills and management skills were evaluated using quantitative measurements were included. The statistical analysis method used was simple linear regression, with staff procedure skills as the response variable and management skills as the predictor variable. An overall significant relationship was identified between staff procedure skills and management skills (p < 0.05). This study suggests that there is a relationship between staff procedure skills and management skills in the educational setting used. Future studies are needed to determine whether this can also be observed during actual incidents.
Certification of highly complex safety-related systems.
Reinert, D; Schaefer, M
1999-01-01
The BIA now has 15 years of experience with the certification of complex electronic systems for safety-related applications in the machinery sector. Using the example of machining centres, this presentation shows the systematic procedure for verifying and validating control systems that use Application Specific Integrated Circuits (ASICs) and microcomputers for safety functions. One section describes the control structure of machining centres with control systems using "integrated safety": a diverse redundant architecture combined with cross-monitoring and forced dynamization is explained. In the main section, the steps of the systematic certification procedure are explained, with some results from the certification of drilling machines. Specification reviews, design reviews with test case specification, statistical analysis, and walk-throughs are the analytical measures in the testing process. Systematic tests based on the test case specification, electromagnetic interference (EMI) and environmental testing, and site acceptance tests on the machines are the testing measures for validation. A complex software-driven system is always undergoing modification. Most of the changes are not safety-relevant, but this has to be proven. A systematic procedure for certifying software modifications is presented in the last section of the paper.
DFLOWZ: A free program to evaluate the area potentially inundated by a debris flow
NASA Astrophysics Data System (ADS)
Berti, M.; Simoni, A.
2014-06-01
The transport and deposition mechanisms of debris flows are still poorly understood due to the complexity of the interactions governing the behavior of water-sediment mixtures. Empirical-statistical methods can therefore be used, instead of more sophisticated numerical methods, to predict the depositional behavior of these highly dangerous gravitational movements. We use widely accepted semi-empirical scaling relations and propose an automated procedure (DFLOWZ) to estimate the area potentially inundated by a debris flow event. Besides a digital elevation model (DEM), the procedure has only two input requirements: the debris flow volume and the possible flow path. The procedure is implemented in Matlab, and a graphical user interface helps to visualize initial conditions, flow propagation, and final results. Different hypotheses about the depositional behavior of an event can be tested, together with the possible effect of simple remedial measures. Uncertainties associated with the scaling relations can be treated and their impact on results evaluated. Our freeware application aims to facilitate and speed up the process of susceptibility mapping. We discuss the limits and advantages of the method in order to inform inexperienced users.
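The semi-empirical scaling relations behind this kind of procedure take the form of power laws in volume: cross-sectional area A and planimetric area B both scale with V^(2/3). The coefficients below are illustrative assumptions, not the calibrated DFLOWZ values.

```python
# Power-law scaling of inundated areas with debris flow volume (sketch).
def inundated_areas(volume_m3, k_a=0.08, k_b=20.0):
    """Return (cross-sectional area m^2, planimetric area m^2)."""
    a = k_a * volume_m3 ** (2.0 / 3.0)   # channel cross-section filled
    b = k_b * volume_m3 ** (2.0 / 3.0)   # planimetric deposit footprint
    return a, b

for v in (1e3, 1e4, 1e5):                # candidate design volumes, m^3
    a, b = inundated_areas(v)
    print(f"V={v:8.0f} m^3  ->  A={a:7.1f} m^2, B={b:9.0f} m^2")
```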
A Web Terminology Server Using UMLS for the Description of Medical Procedures
Burgun, Anita; Denier, Patrick; Bodenreider, Olivier; Botti, Geneviève; Delamarre, Denis; Pouliquen, Bruno; Oberlin, Philippe; Lévéque, Jean M.; Lukacs, Bertrand; Kohler, François; Fieschi, Marius; Le Beux, Pierre
1997-01-01
The Model for Assistance in the Orientation of a User within Coding Systems (MAOUSSC) project has been designed to provide a representation for medical and surgical procedures that allows several applications to be developed from several viewpoints. It is based on a conceptual model, a controlled set of terms, and Web server development. The design includes the UMLS knowledge sources associated with additional knowledge about medico-surgical procedures. The model was implemented using a relational database. The authors developed a complete interface for the Web presentation, with the intermediary layer written in Perl. The server has been used for the representation of medico-surgical procedures that occur in the discharge summaries of the national survey of hospital activities that is performed by the French Health Statistics Agency in order to produce inpatient profiles. The authors describe the current status of the MAOUSSC server and discuss their interest in using such a server to assist in the coordination of terminology tasks and in the sharing of controlled terminologies. PMID:9292841
Avoiding overstating the strength of forensic evidence: Shrunk likelihood ratios/Bayes factors.
Morrison, Geoffrey Stewart; Poh, Norman
2018-05-01
When strength of forensic evidence is quantified using sample data and statistical models, a concern may be raised as to whether the output of a model overestimates the strength of evidence. This is particularly the case when the amount of sample data is small, and hence sampling variability is high. This concern is related to concern about precision. This paper describes, explores, and tests three procedures which shrink the value of the likelihood ratio or Bayes factor toward the neutral value of one. The procedures are: (1) a Bayesian procedure with uninformative priors, (2) use of empirical lower and upper bounds (ELUB), and (3) a novel form of regularized logistic regression. As a benchmark, they are compared with linear discriminant analysis, and in some instances with non-regularized logistic regression. The behaviours of the procedures are explored using Monte Carlo simulated data, and tested on real data from comparisons of voice recordings, face images, and glass fragments. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
Regression methods for spatially correlated data: an example using beetle attacks in a seed orchard
Preisler Haiganoush; Nancy G. Rappaport; David L. Wood
1997-01-01
We present a statistical procedure for studying the simultaneous effects of observed covariates and unmeasured spatial variables on responses of interest. The procedure uses regression type analyses that can be used with existing statistical software packages. An example using the rate of twig beetle attacks on Douglas-fir trees in a seed orchard illustrates the...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 20 2012-07-01 2012-07-01 false Statistical Outlier Identification Procedure for Light-Duty Vehicles and Light Light-Duty Trucks Certifying to the Provisions of Part 86... (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 20 2013-07-01 2013-07-01 false Statistical Outlier Identification Procedure for Light-Duty Vehicles and Light Light-Duty Trucks Certifying to the Provisions of Part 86... (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES...
Information flow and quantum cryptography using statistical fluctuations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Home, D.; Whitaker, M.A.B.
2003-02-01
A procedure is formulated, using the quantum teleportation arrangement, that communicates knowledge of an apparatus setting between the wings of the experiment, using statistical fluctuations in a sequence of measurement results. It requires an entangled state, and transmission of classical information totally unrelated to the apparatus setting actually communicated. Our procedure has conceptual interest, and has applications to quantum cryptography.
Weighting Composite Endpoints in Clinical Trials: Essential Evidence for the Heart Team
Tong, Betty C.; Huber, Joel C.; Ascheim, Deborah D.; Puskas, John D.; Ferguson, T. Bruce; Blackstone, Eugene H.; Smith, Peter K.
2013-01-01
Background Coronary revascularization trials often use a composite endpoint of major adverse cardiac and cerebrovascular events (MACCE). The usual practice in analyzing data with a composite endpoint is to assign equal weights to each of the individual MACCE elements. Non-inferiority margins are used to offset effects of presumably less important components, but their magnitudes are subject to bias. This study describes the relative importance of MACCE elements from a patient perspective. Methods A discrete choice experiment was conducted. Survey respondents were presented with a scenario that would make them eligible for the SYNTAX 3-Vessel Disease cohort. Respondents chose among pairs of procedures that differed on the 3-year probability of MACCE, potential for increased longevity, and procedure/recovery time. Conjoint analysis derived relative weights for these attributes. Results In all, 224 respondents completed the survey. The attributes did not have equal weight. Risk of death was most important (relative weight 0.23), followed by stroke (0.18), potential increased longevity and recovery time (each 0.17), MI (0.14), and risk of repeat revascularization (0.11). Applying these weights to the SYNTAX 3-year endpoints resulted in a persistent, but decreased, margin of difference in MACCE favoring CABG compared to PCI. When labeled only as "Procedure A" and "B," 87% of respondents chose CABG over PCI. When procedures were labeled as "Coronary Stent" and "Coronary Bypass Surgery," only 73% chose CABG. Procedural preference varied with demographics, gender, and familiarity with the procedures. Conclusions MACCE elements do not carry equal weight in a composite endpoint, from a patient perspective. Using a weighted composite endpoint increases the validity of statistical analyses and trial conclusions. Patients are subject to bias by labels when considering coronary revascularization. PMID:22795064
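A hedged sketch of recomputing a composite endpoint with the patient-derived relative weights reported above, alongside the usual equal weighting. The component event rates for the two arms are placeholders, not the SYNTAX results.

```python
# Weighted vs unweighted composite endpoint from component event rates.
weights = {"death": 0.23, "stroke": 0.18, "mi": 0.14, "repeat_revasc": 0.11}

# Hypothetical 3-year component event rates for two treatment arms.
cabg = {"death": 0.06, "stroke": 0.03, "mi": 0.04, "repeat_revasc": 0.10}
pci  = {"death": 0.07, "stroke": 0.02, "mi": 0.07, "repeat_revasc": 0.20}

def composite(rates, w=None):
    if w is None:                        # equal weights: the usual MACCE sum
        return sum(rates.values())
    return sum(w[k] * rates[k] for k in rates)

for name, arm in (("CABG", cabg), ("PCI", pci)):
    print(f"{name}: unweighted={composite(arm):.3f}, "
          f"weighted={composite(arm, weights):.3f}")
```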
Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette
2013-06-01
High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
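A hedged sketch of the second procedure's core step: sweep existing researcher Excel files, check each against an agreed standard format, and pool the concentration-response data for automated analysis. The directory layout and column names are assumptions for illustration, not the ACuteTox specification.

```python
# Pool standardised concentration-response Excel files for automated analysis.
from pathlib import Path
import pandas as pd

REQUIRED = ["compound", "concentration", "response"]

frames = []
for path in sorted(Path("data").glob("*.xlsx")):
    df = pd.read_excel(path)                  # requires openpyxl installed
    missing = [c for c in REQUIRED if c not in df.columns]
    if missing:                               # enforce the standard format
        print(f"{path.name}: skipped, missing columns {missing}")
        continue
    df["source_file"] = path.name             # keep provenance for auditing
    frames.append(df[REQUIRED + ["source_file"]])

pooled = pd.concat(frames, ignore_index=True)
print(pooled.groupby("compound").size())      # experiments per compound
```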
Uehleke, Bernhard; Hopfenmueller, Werner; Stange, Rainer; Saller, Reinhard
2012-01-01
Ancient and medieval herbal books are often believed to describe the same claims still in use today. Medieval herbal books, however, provide long lists of claims for each herb, most of which are not approved today, while the herb's modern use is often missing. So the hypothesis arises that a medieval author could have randomly hit on 'correct' claims among his many 'wrong' ones. We developed a statistical procedure based on a simple probability model. We applied our procedure to the herbal books of Hildegard von Bingen (1098-1179) as an example of its usefulness. Claim attributions for a certain herb were classified as 'correct' if approximately the same as indicated in current monographs. The number of 'correct' claim attributions was significantly higher than could have occurred by pure chance, even though the vast majority of Hildegard von Bingen's claims were not 'correct'. The hypothesis that Hildegard would have achieved her 'correct' claims purely by chance can be clearly rejected. The finding that medical claims provided by a medieval author are significantly related to modern herbal use supports the importance of traditional medicinal systems as an empirical source. However, since many traditional claims are not in accordance with modern applications, they should be used carefully and analyzed in a systematic, statistics-based manner. Our statistical approach can be used for further systematic comparison of herbal claims in traditional sources as well as in the fields of ethnobotany and ethnopharmacology. Copyright © 2012 S. Karger AG, Basel.
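A minimal sketch of this style of test: could k 'correct' claim attributions among n claims have arisen by chance, if a random attribution is 'correct' with probability p? All counts below are invented, not the paper's data.

```python
# One-sided binomial test against the pure-chance hypothesis.
from scipy.stats import binomtest

n = 400          # total claim attributions examined (hypothetical)
k = 45           # attributions matching modern monographs (hypothetical)
p_chance = 0.05  # assumed chance probability of a 'correct' attribution

result = binomtest(k, n, p_chance, alternative="greater")
print(f"P(X >= {k} | chance) = {result.pvalue:.2e}")
# A very small p-value rejects the pure-chance hypothesis, as in the paper.
```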
Statistics of Data Fitting: Flaws and Fixes of Polynomial Analysis of Channeled Spectra
NASA Astrophysics Data System (ADS)
Karstens, William; Smith, David
2013-03-01
Starting from general statistical principles, we have critically examined Baumeister's procedure* for determining the refractive index of thin films from channeled spectra. Briefly, the method assumes that the index and the interference fringe order may be approximated by polynomials quadratic and cubic in photon energy, respectively. The coefficients of the polynomials are related by differentiation, which is equivalent to comparing energy differences between fringes. However, we find that when the fringe order is calculated from the published IR index for silicon* and then analyzed with Baumeister's procedure, the results do not reproduce the original index. This problem has been traced to (1) the use of unphysical powers in the polynomials (e.g., time-reversal invariance requires that the index be an even function of photon energy), and (2) the use of insufficient terms of the correct parity. Exclusion of unphysical terms and addition of quartic and quintic terms to the index and order polynomials yields significantly better fits with fewer parameters. This is a specific example of using statistics to determine whether an assumed fitting model adequately captures the physics contained in experimental data. The use of analysis of variance (ANOVA) and the Durbin-Watson statistic to test criteria for the validity of least-squares fitting will be discussed. *D.F. Edwards and E. Ochoa, Appl. Opt. 19, 4130 (1980). Supported in part by the US Department of Energy, Office of Nuclear Physics under contract DE-AC02-06CH11357.
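A sketch of the proposed fix: fit the refractive index with even powers of photon energy only (as time-reversal invariance requires), rather than a generic quadratic. Synthetic data stand in for a measured index.

```python
# Least-squares fit restricted to even-parity powers of photon energy.
import numpy as np

rng = np.random.default_rng(0)
E = np.linspace(0.5, 3.0, 40)                       # photon energy (eV)
n_true = 3.42 + 0.05 * E**2 + 0.002 * E**4          # even in E by construction
n_meas = n_true + rng.normal(scale=1e-3, size=E.size)

# Even-parity design matrix: 1, E^2, E^4 (no odd powers).
A = np.column_stack([np.ones_like(E), E**2, E**4])
coef, *_ = np.linalg.lstsq(A, n_meas, rcond=None)
resid = n_meas - A @ coef
print("even-parity coefficients:", np.round(coef, 5))
print("RMS residual:", np.sqrt(np.mean(resid**2)))
```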
Statistics and Discoveries at the LHC (1/4)
Cowan, Glen
2018-02-09
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (3/4)
Cowan, Glen
2018-02-19
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (4/4)
Cowan, Glen
2018-05-22
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (2/4)
Cowan, Glen
2018-04-26
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and treatment of systematic uncertainties.
Wisaijohn, Thunthita; Pimkhaokham, Atiphan; Lapying, Phenkhae; Itthichaisri, Chumpot; Pannarunothai, Supasit; Igarashi, Isao; Kawabuchi, Koichi
2010-01-01
This study aimed to develop a new casemix classification system as an alternative method for the budget allocation of oral healthcare service (OHCS). Initially, the International Statistical Classification of Diseases and Related Health Problems, 10th Revision, Thai Modification (ICD-10-TM) related to OHCS was used for developing the software “Grouper”. This model was designed to allow the translation of dental procedures into eight-digit codes. Multiple regression analysis was used to analyze the relationship between the factors used for developing the model and the resource consumption. Furthermore, the coefficient of variation, reduction in variance, and relative weight (RW) were applied to test the validity. The results demonstrated that 1,624 OHCS classifications, according to the diagnoses and the procedures performed, showed high homogeneity within groups and heterogeneity between groups. Moreover, the RW of the OHCS could be used to predict and control the production costs. In conclusion, this new OHCS casemix classification has potential use in global decision making. PMID:20936134
50 CFR 600.130 - Protection of confidentiality of statistics.
Code of Federal Regulations, 2011 CFR
2011-10-01
... statistics. 600.130 Section 600.130 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL... Fishery Management Councils § 600.130 Protection of confidentiality of statistics. Each Council must establish appropriate procedures for ensuring the confidentiality of the statistics that may be submitted to...
50 CFR 600.130 - Protection of confidentiality of statistics.
Code of Federal Regulations, 2010 CFR
2010-10-01
... statistics. 600.130 Section 600.130 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL... Fishery Management Councils § 600.130 Protection of confidentiality of statistics. Each Council must establish appropriate procedures for ensuring the confidentiality of the statistics that may be submitted to...
LaBudde, Robert A; Harnly, James M
2012-01-01
A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development of validation studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those used for the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single-collaborator and multicollaborative study examples are given.
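A hedged sketch of the basic POI computation follows: the observed statistic is simply the proportion of replicates identified, reported here with a Wilson confidence interval. The replicate counts are hypothetical, and the full AOAC validation protocol specifies considerably more than this.

```python
# POI = identified / replicates, with a Wilson 95% confidence interval.
from statsmodels.stats.proportion import proportion_confint

identified, replicates = 27, 30          # hypothetical replicate outcomes
poi = identified / replicates
lo, hi = proportion_confint(identified, replicates, alpha=0.05, method="wilson")
print(f"POI = {poi:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```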
Sorrell, Jeanne M
2013-03-01
The fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) is targeted for publication in May 2013. Older adults and their families should be aware of the potential impact that changes in this important document may have on diagnosis and treatment of mental health concerns. Two specific changes related to a new category of Neurocognitive Disorders and a new interpretation of criteria for depression after bereavement are discussed in this article. Nurses can help older adults and their families understand the new DSM-5 terminology and encourage them to discuss risks, benefits, and likely outcomes of diagnoses, procedures, and treatments that may seem unfamiliar. Copyright 2013, SLACK Incorporated.
Ungers, L J; Moskowitz, P D; Owens, T W; Harmon, A D; Briggs, T M
1982-02-01
Determining occupational health and safety risks posed by emerging technologies is difficult because of limited statistics. Nevertheless, estimates of such risks must be constructed to permit comparison of various technologies to identify the most attractive processes. One way to estimate risks is to use statistics on related industries. Based on process labor requirements and associated occupational health data, risks to workers and to society posed by an emerging technology can be calculated. Using data from the California semiconductor industry, this study applies a five-step occupational risk assessment procedure to four processes for the fabrication of photovoltaic cells. The validity of the occupational risk assessment method is discussed.
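As a toy illustration of the surrogate-statistics idea described above, the snippet below scales an incidence rate borrowed from a related industry by the labor requirement of the new process. All numbers are hypothetical and are not taken from the California semiconductor data.

```python
# Expected incidents = surrogate incidence rate x process labor requirement.
injury_rate = 1.2e-2      # injuries per person-year, surrogate industry (invented)
labor_req = 350.0         # person-years of labor per unit of output (invented)

expected_injuries = injury_rate * labor_req
print(f"{expected_injuries:.2f} expected injuries per unit of output")
```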
Illustrating the practice of statistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamada, Christina A; Hamada, Michael S
2009-01-01
The practice of statistics involves analyzing data and planning data collection schemes to answer scientific questions. Issues often arise with the data that must be dealt with and can lead to new procedures. In analyzing data, these issues can sometimes be addressed through the statistical models that are developed. Simulation can also be helpful in evaluating a new procedure. Moreover, simulation coupled with optimization can be used to plan a data collection scheme. The practice of statistics as just described is much more than just using a statistical package. In analyzing the data, it involves understanding the scientific problem and incorporating the scientist's knowledge. In modeling the data, it involves understanding how the data were collected and accounting for limitations of the data where possible. Moreover, the modeling is likely to be iterative by considering a series of models and evaluating the fit of these models. Designing a data collection scheme involves understanding the scientist's goal and staying within his/her budget in terms of time and the available resources. Consequently, a practicing statistician is faced with such tasks and requires skills and tools to do them quickly. We have written this article for students to provide a glimpse of the practice of statistics. To illustrate the practice of statistics, we consider a problem motivated by some precipitation data that our relative, Masaru Hamada, collected some years ago. We describe his rain gauge observational study in Section 2. We describe modeling and an initial analysis of the precipitation data in Section 3. In Section 4, we consider alternative analyses that address potential issues with the precipitation data. In Section 5, we consider the impact of incorporating additional information. We design a data collection scheme to illustrate the use of simulation and optimization in Section 6. We conclude this article in Section 7 with a discussion.
Optimum runway orientation relative to crosswinds
NASA Technical Reports Server (NTRS)
Falls, L. W.; Brown, S. C.
1972-01-01
Specific magnitudes of crosswinds may exist that could be constraints to the success of an aircraft mission such as the landing of the proposed space shuttle. A method is required to determine the orientation or azimuth of the proposed runway which will minimize the probability of certain critical crosswinds. Two procedures for obtaining the optimum runway orientation relative to minimizing a specified crosswind speed are described and illustrated with examples. The empirical procedure requires only hand calculations on an ordinary wind rose. The theoretical method utilizes wind statistics computed after the bivariate normal elliptical distribution is applied to a data sample of component winds. This method requires only the assumption that the wind components are bivariate normally distributed. This assumption seems to be reasonable. Studies are currently in progress for testing wind components for bivariate normality for various stations. The close agreement between the theoretical and empirical results for the example chosen substantiates the bivariate normal assumption.
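To show how the theoretical method can be exercised, here is a sketch under the report's bivariate-normal assumption: any crosswind component is a linear combination of the wind components and hence univariate normal, so the exceedance probability can be minimized over runway azimuth by a simple grid search. The wind statistics and crosswind limit are invented, not taken from the report.

```python
# Crosswind exceedance probability as a function of runway azimuth, assuming
# bivariate-normal wind components (u east, v north); minimize over azimuth.
import numpy as np
from scipy.stats import norm

mu_u, mu_v = 2.0, -1.0            # mean wind components (m/s), hypothetical
s_u, s_v, rho = 4.0, 3.5, 0.3     # standard deviations and correlation
limit = 7.7                       # critical crosswind speed (m/s), hypothetical

def p_exceed(theta):
    """P(|crosswind| > limit) for runway azimuth theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    mu = mu_u * c - mu_v * s                  # crosswind: component normal to runway
    sd = np.sqrt((s_u * c)**2 + (s_v * s)**2 - 2 * rho * s_u * s_v * c * s)
    return norm.sf(limit, mu, sd) + norm.cdf(-limit, mu, sd)

thetas = np.linspace(0, np.pi, 181)           # a runway is a line, so 180 deg suffice
best = thetas[np.argmin([p_exceed(t) for t in thetas])]
print(f"optimum azimuth ~ {np.degrees(best):.0f} deg")
```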
Mason, John O; Patel, Shyam A
2015-01-01
To study the efficacy of epiretinal membrane (ERM) peeling in eyes with dry age-related macular degeneration (AMD). We retrospectively analyzed patient charts on 17 eyes (16 patients) that underwent ERM peeling with a concurrent diagnosis of dry AMD. Eyes with concurrent dry AMD and with a good preoperative best-corrected visual acuity (BCVA) (better than or equal to 20/50) had a statistically significant mean BCVA improvement at 6 months after ERM peeling. There was a statistically significant increase in mean BCVA from 20/95 to 20/56 in dry AMD eyes, and no eyes showed worsening in BCVA at 6 months or at most recent follow-up. Five of 17 (29.4%) eyes had cataract formation or progression. There were no other complications, reoperations, or recurrences. ERM peeling in eyes with dry AMD may yield significant improvement, especially in eyes with good preoperative BCVA. The procedure is relatively safe, with few complications and recurrences.
Novick, Steven; Shen, Yan; Yang, Harry; Peterson, John; LeBlond, Dave; Altan, Stan
2015-01-01
Dissolution (or in vitro release) studies constitute an important aspect of pharmaceutical drug development. One important use of such studies is for justifying a biowaiver for post-approval changes which requires establishing equivalence between the new and old product. We propose a statistically rigorous modeling approach for this purpose based on the estimation of what we refer to as the F2 parameter, an extension of the commonly used f2 statistic. A Bayesian test procedure is proposed in relation to a set of composite hypotheses that capture the similarity requirement on the absolute mean differences between test and reference dissolution profiles. Several examples are provided to illustrate the application. Results of our simulation study comparing the performance of f2 and the proposed method show that our Bayesian approach is comparable to or in many cases superior to the f2 statistic as a decision rule. Further useful extensions of the method, such as the use of continuous-time dissolution modeling, are considered.
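For context, here is a sketch of the conventional f2 similarity statistic that the proposed Bayesian F2 parameter extends; this is the standard formula, not the authors' Bayesian procedure, and the dissolution profiles are invented.

```python
# Conventional similarity factor: f2 = 50*log10(100 / sqrt(1 + MSD)),
# where MSD is the mean squared difference between the two profiles.
import numpy as np

ref  = np.array([18, 39, 57, 72, 84, 91], dtype=float)   # % dissolved, reference
test = np.array([15, 35, 54, 70, 83, 90], dtype=float)   # % dissolved, test

msd = np.mean((ref - test) ** 2)
f2 = 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))
print(f"f2 = {f2:.1f}  (f2 >= 50 is the usual similarity criterion)")
```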
Standard Errors and Confidence Intervals of Norm Statistics for Educational and Psychological Tests.
Oosterhuis, Hannah E M; van der Ark, L Andries; Sijtsma, Klaas
2016-11-14
Norm statistics allow for the interpretation of scores on psychological and educational tests, by relating the test score of an individual test taker to the test scores of individuals belonging to the same gender, age, or education groups, et cetera. Given the uncertainty due to sampling error, one would expect researchers to report standard errors for norm statistics. In practice, standard errors are seldom reported; they are either unavailable or derived under strong distributional assumptions that may not be realistic for test scores. We derived standard errors for four norm statistics (standard deviation, percentile ranks, stanine boundaries and Z-scores) under the mild assumption that the test scores are multinomially distributed. A simulation study showed that the standard errors were unbiased and that corresponding Wald-based confidence intervals had good coverage. Finally, we discuss the possibilities for applying the standard errors in practical test use in education and psychology. The procedure is provided via the R function check.norms, which is available in the mokken package.
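The following is not the authors' analytic derivation (which is implemented in the R function check.norms), but a quick multinomial simulation in the same spirit: treat the observed score frequencies as multinomial, resample, and estimate the standard error of a percentile rank. The score distribution is invented.

```python
# Simulation-based standard error for a percentile rank under a
# multinomial model of the test-score distribution.
import numpy as np

rng = np.random.default_rng(1)
freq = np.array([2, 5, 9, 14, 18, 20, 15, 9, 5, 2, 1])   # counts for scores 0..10
n = freq.sum()

def percentile_rank(f, score):
    # conventional PR: % below the score plus half the % exactly at it
    return 100.0 * (f[:score].sum() + 0.5 * f[score]) / f.sum()

draws = rng.multinomial(n, freq / n, size=5000)
prs = np.array([percentile_rank(d, 6) for d in draws])
print(f"PR(6) = {percentile_rank(freq, 6):.1f}, simulated SE = {prs.std():.2f}")
```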
Implementation of false discovery rate for exploring novel paradigms and trait dimensions with ERPs.
Crowley, Michael J; Wu, Jia; McCreary, Scott; Miller, Kelly; Mayes, Linda C
2012-01-01
False discovery rate (FDR) is a multiple comparison procedure that targets the expected proportion of false discoveries among the discoveries. Employing FDR methods in event-related potential (ERP) research provides an approach to explore new ERP paradigms and ERP-psychological trait/behavior relations. In Study 1, we examined neural responses to escape behavior from an aversive noise. In Study 2, we correlated a relatively unexplored trait dimension, ostracism, with neural response. In both situations we focused on the frontal cortical region, applying channel-by-time plots to display statistically significant uncorrected data and FDR-corrected data, controlling for multiple comparisons.
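A minimal sketch of the Benjamini-Hochberg step-up procedure on which such FDR corrections commonly rest, applied to a hypothetical vector of channel/time p-values (the abstract does not specify which FDR variant was used):

```python
# Benjamini-Hochberg: find the largest k with p_(k) <= q*k/m and declare
# the k smallest p-values discoveries.
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean mask of discoveries at FDR level q."""
    p = np.asarray(pvals)
    m = p.size
    order = np.argsort(p)
    passed = p[order] <= q * (np.arange(1, m + 1) / m)
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return mask

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.94]
print(benjamini_hochberg(pvals))   # first two survive at q = 0.05
```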
48 CFR 215.404-76 - Reporting profit and fee statistics.
Code of Federal Regulations, 2011 CFR
2011-10-01
... statistics. 215.404-76 Section 215.404-76 Federal Acquisition Regulations System DEFENSE ACQUISITION... Contract Pricing 215.404-76 Reporting profit and fee statistics. Follow the procedures at PGI 215.404-76 for reporting profit and fee statistics. [71 FR 69494, Dec. 1, 2006] ...
40 CFR 1065.602 - Statistics.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Statistics. 1065.602 Section 1065.602... PROCEDURES Calculations and Data Requirements § 1065.602 Statistics. (a) Overview. This section contains equations and example calculations for statistics that are specified in this part. In this section we use...
48 CFR 215.404-76 - Reporting profit and fee statistics.
Code of Federal Regulations, 2010 CFR
2010-10-01
... statistics. 215.404-76 Section 215.404-76 Federal Acquisition Regulations System DEFENSE ACQUISITION... Contract Pricing 215.404-76 Reporting profit and fee statistics. Follow the procedures at PGI 215.404-76 for reporting profit and fee statistics. [71 FR 69494, Dec. 1, 2006] ...
Visscher, Arjan P; Lam, Tze J; Meurs-Szojda, Maria M; Felt-Bersma, Richelle J F
2017-08-01
Controlled delivery of radiofrequency energy has been suggested as treatment for fecal incontinence. The aim of this study was to determine whether the clinical response to the radiofrequency energy procedure is superior to sham in patients with fecal incontinence. This was a randomized sham-controlled clinical trial from 2008 to 2015. This study was conducted in an outpatient clinic. Forty patients with fecal incontinence in whom maximal conservative management had failed were randomly assigned to receiving either radiofrequency energy or sham procedure. Fecal incontinence was measured using the Vaizey incontinence score (range, 0-24). The impact of fecal incontinence on quality of life was measured by using the fecal incontinence quality-of-life score (range, 1-4). Measurements were performed at baseline and at 6 months. Anorectal function was evaluated using anal manometry and anorectal endosonography at baseline and at 3 months. At baseline, Vaizey incontinence score was 16.8 (SD 2.9). At t = 6 months, the radiofrequency energy group improved by 2.5 points on the Vaizey incontinence score compared with the sham group (13.2 (SD 3.1), 15.6 (SD 3.3), p = 0.02). The fecal incontinence quality-of-life score at t = 6 months was not statistically different. Anorectal function did not show any alteration. Patients with severe fecal incontinence were included in the study, thus making it difficult to generalize the results. Both radiofrequency energy and sham procedure improved the fecal incontinence score, the radiofrequency energy procedure more than sham. Although statistically significant, the clinical impact for most of the patients was negligible. Therefore, the radiofrequency energy procedure should not be recommended for patients with fecal incontinence until patient-related factors associated with treatment success are known. See Video Abstract at http://links.lww.com/DCR/A373.
Sandhu, Gurkirat; Khinda, Paramjit Kaur; Gill, Amarjit Singh; Singh Khinda, Vineet Inder; Baghi, Kamal; Chahal, Gurparkash Singh
2017-01-01
Context: Periodontal surgical procedures produce varying degrees of stress in all patients. Nitrous oxide-oxygen inhalation sedation is very effective for adult patients with mild-to-moderate anxiety due to dental procedures and needle phobia. Aim: The present study was designed to perform periodontal surgical procedures under nitrous oxide-oxygen inhalation sedation and assess whether this technique actually reduces stress physiologically, in comparison to local anesthesia alone (LA) during lengthy periodontal surgical procedures. Settings and Design: This was a randomized, split-mouth, cross-over study. Materials and Methods: A total of 16 patients were selected for this randomized, split-mouth, cross-over study. One surgical session (SS) was performed under local anesthesia aided by nitrous oxide-oxygen inhalation sedation, and the other SS was performed on the contralateral quadrant under LA. For each session, blood samples to measure and evaluate serum cortisol levels were obtained, and vital parameters including blood pressure, heart rate, respiratory rate, and arterial blood oxygen saturation were monitored before, during, and after periodontal surgical procedures. Statistical Analysis Used: Paired t-test and repeated measures ANOVA. Results: The findings of the present study revealed a statistically significant decrease in serum cortisol levels, blood pressure, and pulse rate, and a statistically significant increase in respiratory rate and arterial blood oxygen saturation during periodontal surgical procedures under nitrous oxide inhalation sedation. Conclusion: Nitrous oxide-oxygen inhalation sedation for periodontal surgical procedures is capable of reducing stress physiologically, in comparison to LA during lengthy periodontal surgical procedures. PMID:29386796
Bottle, Alex; Darzi, Ara W; Athanasiou, Thanos; Vale, Justin A
2010-01-01
Objectives To investigate the relation between volume and mortality after adjustment for case mix for radical cystectomy in the English healthcare setting using improved statistical methodology, taking into account the institutional and surgeon volume effects and institutional structural and process of care factors. Design Retrospective analysis of hospital episode statistics using multilevel modelling. Setting English hospitals carrying out radical cystectomy in the seven financial years 2000/1 to 2006/7. Participants Patients with a primary diagnosis of cancer undergoing an inpatient elective cystectomy. Main outcome measure Mortality within 30 days of cystectomy. Results Compared with low volume institutions, medium volume ones had significantly higher odds of in-hospital and total mortality: odds ratio 1.72 (95% confidence interval 1.00 to 2.98, P=0.05) and 1.82 (1.08 to 3.06, P=0.02). This was only seen in the final model, which included adjustment for structural and process of care factors. The surgeon volume-mortality relation showed weak evidence of reduced odds of in-hospital mortality (by 35%) for the high volume surgeons, although this did not reach statistical significance at the 5% level. Conclusions The relation between case volume and mortality after radical cystectomy for bladder cancer became evident only after adjustment for structural and process of care factors, including staffing levels of nurses and junior doctors, in addition to case mix. At least for this relatively uncommon procedure, adjusting for these confounders when examining the volume-outcome relation is critical before considering centralisation of care to a few specialist institutions. Outcomes other than mortality, such as functional morbidity and disease recurrence, may ultimately influence decisions about centralising care. PMID:20305302
NASA Astrophysics Data System (ADS)
Uhlemann, C.; Feix, M.; Codis, S.; Pichon, C.; Bernardeau, F.; L'Huillier, B.; Kim, J.; Hong, S. E.; Laigle, C.; Park, C.; Shin, J.; Pogosyan, D.
2018-02-01
Starting from a very accurate model for density-in-cells statistics of dark matter based on large deviation theory, a bias model for the tracer density in spheres is formulated. It adopts a mean bias relation based on a quadratic bias model to relate the log-densities of dark matter to those of mass-weighted dark haloes in real and redshift space. The validity of the parametrized bias model is established using a parametrization-independent extraction of the bias function. This average bias model is then combined with the dark matter PDF, neglecting any scatter around it: it nevertheless yields an excellent model for densities-in-cells statistics of mass tracers that is parametrized in terms of the underlying dark matter variance and three bias parameters. The procedure is validated on measurements of both the one- and two-point statistics of subhalo densities in the state-of-the-art Horizon Run 4 simulation showing excellent agreement for measured dark matter variance and bias parameters. Finally, it is demonstrated that this formalism allows for a joint estimation of the non-linear dark matter variance and the bias parameters using solely the statistics of subhaloes. Having verified that galaxy counts in hydrodynamical simulations sampled on a scale of 10 Mpc h-1 closely resemble those of subhaloes, this work provides important steps towards making theoretical predictions for density-in-cells statistics applicable to upcoming galaxy surveys like Euclid or WFIRST.
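A schematic of the quadratic mean-bias relation that the abstract describes, mapping dark-matter log-densities to tracer log-densities; this is a toy fit on mock data, not the paper's pipeline, and the coefficients and scatter are invented.

```python
# Quadratic bias sketch: log(1+delta_t) = b0 + b1*x + b2*x**2,
# with x = log(1+delta_m); recover (b0, b1, b2) by least squares.
import numpy as np

rng = np.random.default_rng(2)
log_dm = rng.normal(0.0, 0.5, 10_000)                  # mock DM log-densities
log_tr = (0.05 + 0.9 * log_dm + 0.08 * log_dm**2
          + rng.normal(0, 0.05, log_dm.size))          # mock tracer relation

b2, b1, b0 = np.polyfit(log_dm, log_tr, deg=2)         # highest degree first
print(f"b0={b0:.3f}, b1={b1:.3f}, b2={b2:.3f}")
```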
Statistical Signal Models and Algorithms for Image Analysis
1984-10-25
In this report, two-dimensional stochastic linear models are used in developing algorithms for image analysis such as classification, segmentation, and object detection in images characterized by textured backgrounds. These models generate two-dimensional random processes as outputs to which statistical inference procedures can naturally be applied. A common thread throughout our algorithms is the interpretation of the inference procedures in terms of linear prediction.
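A small sketch of the general idea, under assumptions of my own (the report's actual models and coefficients are not given here): a causal 2-D autoregressive model generates a textured field, and the same model's one-step linear-prediction residuals can drive segmentation or object detection.

```python
# Causal 2-D AR texture synthesis and linear-prediction residuals.
import numpy as np

rng = np.random.default_rng(3)
a10, a01, a11 = 0.6, 0.6, -0.36          # AR coefficients (invented, stable)
N = 128
x = np.zeros((N, N))
e = rng.normal(0, 1, (N, N))
for i in range(1, N):
    for j in range(1, N):
        x[i, j] = a10*x[i-1, j] + a01*x[i, j-1] + a11*x[i-1, j-1] + e[i, j]

# One-step linear prediction; large residuals flag pixels the texture
# model does not explain (candidate object/boundary pixels).
pred = a10*x[:-1, 1:] + a01*x[1:, :-1] + a11*x[:-1, :-1]
resid = x[1:, 1:] - pred
print(resid.std())
```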
Analysis of cost regression and post-accident absence
NASA Astrophysics Data System (ADS)
Wojciech, Drozd
2017-07-01
The article presents issues related to the costs of work safety. It argues that economic aspects cannot be overlooked in effective management of occupational health and safety and that adequate expenditures on safety can bring tangible benefits to the company. Reliable analysis of this problem is essential for describing the problem of work safety. The article attempts to carry out such an analysis using the procedures of mathematical statistics [1, 2, 3].
Momen, Awad A; Zachariadis, George A; Anthemidis, Aristidis N; Stratis, John A
2007-01-15
Two digestion procedures have been tested on nut samples for application in the determination of essential (Cr, Cu, Fe, Mg, Mn, Zn) and non-essential (Al, Ba, Cd, Pb) elements by inductively coupled plasma-optical emission spectrometry (ICP-OES). These included wet digestions with HNO3/H2SO4 and HNO3/H2SO4/H2O2. The latter is recommended for better analyte recoveries (relative error < 11%). Two calibration procedures (aqueous standard and standard addition) were studied, and standard addition proved preferable for all analytes. Experimental designs for seven factors (HNO3, H2SO4, and H2O2 volumes, digestion time, pre-digestion time, temperature of the hot plate, and sample weight) were used for optimization of the sample digestion procedures. For this purpose a Plackett-Burman fractional factorial design, which involves eight experiments, was adopted. The factors HNO3 and H2O2 volume, and the digestion time, were found to be the most important parameters. The instrumental conditions were also optimized (using a peanut matrix rather than aqueous standard solutions) considering radio-frequency (rf) incident power, nebulizer argon gas flow rate, and sample uptake flow rate. The analytical performance, such as limits of detection (LOD < 0.74 μg g⁻¹), precision of the overall procedures (relative standard deviation between 2.0 and 8.2%), and accuracy (relative errors between 0.4 and 11%), was assessed statistically to evaluate the developed analytical procedures. The good agreement between measured and certified values for all analytes (relative error < 11%) with respect to IAEA-331 (spinach leaves) and IAEA-359 (cabbage) indicates that the developed analytical method is well suited for further studies on the fate of major elements in nuts and possibly similar matrices.
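A sketch of the 8-run Plackett-Burman screening design for seven factors follows. The factor labels mirror those in the abstract, but the design generator is the standard textbook one and the recovery responses are invented, so this only illustrates the arithmetic of estimating main effects.

```python
# 8-run Plackett-Burman design: cyclic shifts of the standard generator
# plus an all-minus run; main effect = contrast / (N/2).
import numpy as np

gen = np.array([1, 1, 1, -1, 1, -1, -1])             # standard N=8 generator
X = np.array([np.roll(gen, k) for k in range(7)] + [-np.ones(7, int)])

y = np.array([92.1, 95.4, 90.8, 93.9, 96.2, 91.5, 94.0, 89.7])  # % recovery, invented

effects = X.T @ y / (len(y) / 2)
for name, eff in zip(["HNO3", "H2SO4", "H2O2", "t_dig", "t_pre", "T", "mass"],
                     effects):
    print(f"{name:6s} effect = {eff:+.2f}")
```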
Elshal, Ahmed M; Mekkawy, Ramy; Laymon, Mahmoud; Barakat, Tamer S; Elsaadany, Mohamed M; El-Assmy, Ahmed; El-Nahas, Ahmed R
2016-03-01
To assess the functional outcome and cumulative health-resource-related cost of holmium laser enucleation of the prostate (HoLEP) in comparison with transvesical open prostatectomy (TVOP) in a developing country. Matching of 92 HoLEP and 91 TVOP procedures was performed using resected prostate tissue weight as the sole matching criterion. Safety, efficacy, and accordingly health-related cost-efficiency of both procedures were statistically compared. Preoperative criteria and mean prostate size (166.7 ± 49.7 and 161.4 ± 35.7 ml) were similar in HoLEP and TVOP, respectively; however, HoLEP treated more comorbid patients. Blood transfusion rates were 2.1 and 26.1 % after HoLEP and TVOP, respectively (P = 0.001). Median time to catheter removal and hospital stay was 2 days after HoLEP and 5 and 9 days, respectively, after TVOP (P < 0.001). On the modified Clavien scale, grade for grade, there was no statistically significant difference between the two groups apart from local wound complications in the TVOP group. High-grade complications (≥ grade 3) were reported in 3.2 and 6.5 % in HoLEP and TVOP, respectively (P = 0.49). Resected prostate tissue weight was independently associated with high-grade periprocedure complications (OR [95 % CI] 1.22 [1.02:1.49], P = 0.03). Last follow-up symptom score, peak urine flow rate, residual urine, % PSA reduction, and need for reoperation were comparable between the two groups. In the first 3 months, HoLEP cost the hospital 4111.8 EP (575 US$) versus 4305.4 EP (602 US$) for TVOP (P = 0.09). In a high-volume hospital, the HoLEP procedure seems to be equally safe and effective as TVOP, with the advantages of minimally invasive procedures. Two years after adopting the technique, HoLEP cost the hospital the same as TVOP. Significant hospital cost savings are anticipated in subsequent cases.
Abercrombie, M L; Jewell, J S
1986-01-01
Results of EMIT, Abuscreen RIA, and GC/MS tests for THC metabolites in a high volume random urinalysis program are compared. Samples were field tested by non-laboratory personnel with an EMIT system using a 100 ng/mL cutoff. Samples were then sent to the Army Forensic Toxicology Drug Testing Laboratory (WRAMC) at Fort Meade, Maryland, where they were tested by RIA (Abuscreen) using a statistical 100 ng/mL cutoff. Confirmations of all RIA positives were accomplished using a GC/MS procedure. EMIT and RIA results agreed for 91% of samples. Data indicated a 4% false positive rate and a 10% false negative rate for EMIT field testing. In a related study, results for samples which tested positive by RIA for THC metabolites using a statistical 100 ng/mL cutoff were compared with results by GC/MS utilizing a 20 ng/mL cutoff for the THCA metabolite. Presence of THCA metabolite was detected in 99.7% of RIA positive samples. No relationship between quantitations determined by the two tests was found.
Communication skills in individuals with spastic diplegia.
Lamônica, Dionísia Aparecida Cusin; Paiva, Cora Sofia Takaya; Abramides, Dagma Venturini Marques; Biazon, Jamile Lozano
2015-01-01
To assess communication skills in children with spastic diplegia. The study included 20 subjects: 10 preschool children with spastic diplegia and 10 typically developing children matched according to gender, mental age, and socioeconomic status. Assessment procedures were the following: interviews with parents, the Stanford-Binet test, the Gross Motor Function Classification System, Observation of Communicative Behavior, the Peabody Picture Vocabulary Test, the Denver Developmental Screening Test II, and the MacArthur Communicative Development Inventory. Statistical analysis was performed using descriptive statistics (mean, median, minimum, and maximum) and Student's t-test, the Mann-Whitney test, and the paired t-test. Individuals with spastic diplegia, when compared to their peers of the same mental age, presented no significant difference in receptive and expressive vocabulary or in fine motor, adaptive, personal-social, and language skills. Gross motor skills were the most affected area in individuals with spastic cerebral palsy. Participation in intervention procedures and the pairing of participants according to mental age may have brought the groups' performance closer together. There was no statistically significant difference in the comparison between groups, showing appropriate communication skills, although the experimental group did not behave homogeneously.
Correlation of Thermally Induced Pores with Microstructural Features Using High Energy X-rays
NASA Astrophysics Data System (ADS)
Menasche, David B.; Shade, Paul A.; Lind, Jonathan; Li, Shiu Fai; Bernier, Joel V.; Kenesei, Peter; Schuren, Jay C.; Suter, Robert M.
2016-11-01
Combined application of a near-field High Energy Diffraction Microscopy measurement of crystal lattice orientation fields and a tomographic measurement of pore distributions in a sintered nickel-based superalloy sample allows pore locations to be correlated with microstructural features. Measurements were carried out at the Advanced Photon Source beamline 1-ID using an X-ray energy of 65 keV for each of the measurement modes. The nickel superalloy sample was prepared in such a way as to generate significant thermally induced porosity. A three-dimensionally resolved orientation map is directly overlaid with the tomographically determined pore map through a careful registration procedure. The data are shown to reliably reproduce the expected correlations between specific microstructural features (triple lines and quadruple nodes) and pore positions. With the statistics afforded by the 3D data set, we conclude that within statistical limits, pore formation does not depend on the relative orientations of the grains. The experimental procedures and analysis tools illustrated are being applied to a variety of materials problems in which local heterogeneities can affect materials properties.
Permutation tests for goodness-of-fit testing of mathematical models to experimental data.
Fişek, M Hamit; Barlas, Zeynep
2013-03-01
This paper presents statistical procedures for improving the goodness-of-fit testing of theoretical models to data obtained from laboratory experiments. We use an experimental study in the expectation states research tradition, which has been carried out in the "standardized experimental situation" associated with the program, to illustrate the application of our procedures. We briefly review the expectation states research program and the fundamentals of resampling statistics as we develop our procedures in the resampling context. The first procedure we develop is a modification of the chi-square test, which has been the primary statistical tool for assessing goodness of fit in the EST research program but has problems associated with its use. We discuss these problems and suggest a procedure to overcome them. The second procedure we present, the "Average Absolute Deviation" test, is a new test and is proposed as an alternative to the chi-square test, as being simpler and more informative. The third and fourth procedures are permutation versions of Jonckheere's test for ordered alternatives, and Kendall's tau-b, a rank-order correlation coefficient. The fifth procedure is a new rank-order goodness-of-fit test, which we call the "Deviation from Ideal Ranking" index, which we believe may be more useful than other rank-order tests for assessing goodness-of-fit of models to experimental data. The application of these procedures to the sample data is illustrated in detail. We then present another laboratory study from an experimental paradigm different from the expectation states paradigm - the "network exchange" paradigm - and describe how our procedures may be applied to this data set. Copyright © 2012 Elsevier Inc. All rights reserved.
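A hedged sketch of an average-absolute-deviation style goodness-of-fit check follows. It uses a Monte Carlo (parametric resampling) null distribution rather than the authors' exact permutation scheme, and the condition counts and model predictions are hypothetical.

```python
# AAD goodness-of-fit: compare observed proportions with model predictions,
# and calibrate the statistic by simulating datasets under the fitted model.
import numpy as np

rng = np.random.default_rng(4)
n = np.array([40, 40, 40, 40])                 # trials per condition (invented)
k = np.array([29, 24, 18, 11])                 # observed successes (invented)
p_model = np.array([0.70, 0.62, 0.45, 0.30])   # model-predicted probabilities

def aad(k, n, p):
    return np.mean(np.abs(k / n - p))

obs = aad(k, n, p_model)
sims = np.array([aad(rng.binomial(n, p_model), n, p_model)
                 for _ in range(10_000)])
p_value = np.mean(sims >= obs)                 # large AAD means poor fit
print(f"AAD = {obs:.4f}, p = {p_value:.3f}")
```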
45 CFR 160.536 - Statistical sampling.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 1 2010-10-01 2010-10-01 false Statistical sampling. 160.536 Section 160.536... REQUIREMENTS GENERAL ADMINISTRATIVE REQUIREMENTS Procedures for Hearings § 160.536 Statistical sampling. (a) In... statistical sampling study as evidence of the number of violations under § 160.406 of this part, or the...
45 CFR 160.536 - Statistical sampling.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Statistical sampling. 160.536 Section 160.536... REQUIREMENTS GENERAL ADMINISTRATIVE REQUIREMENTS Procedures for Hearings § 160.536 Statistical sampling. (a) In... statistical sampling study as evidence of the number of violations under § 160.406 of this part, or the...
Pounds, Stan; Cheng, Cheng; Cao, Xueyuan; Crews, Kristine R; Plunkett, William; Gandhi, Varsha; Rubnitz, Jeffrey; Ribeiro, Raul C; Downing, James R; Lamba, Jatinder
2009-08-15
In some applications, prior biological knowledge can be used to define a specific pattern of association of multiple endpoint variables with a genomic variable that is biologically most interesting. However, to our knowledge, there is no statistical procedure designed to detect specific patterns of association with multiple endpoint variables. Projection onto the most interesting statistical evidence (PROMISE) is proposed as a general procedure to identify genomic variables that exhibit a specific biologically interesting pattern of association with multiple endpoint variables. Biological knowledge of the endpoint variables is used to define a vector that represents the biologically most interesting values for statistics that characterize the associations of the endpoint variables with a genomic variable. A test statistic is defined as the dot-product of the vector of the observed association statistics and the vector of the most interesting values of the association statistics. By definition, this test statistic is proportional to the length of the projection of the observed vector of correlations onto the vector of most interesting associations. Statistical significance is determined via permutation. In simulation studies and an example application, PROMISE shows greater statistical power to identify genes with the interesting pattern of associations than classical multivariate procedures, individual endpoint analyses or listing genes that have the pattern of interest and are significant in more than one individual endpoint analysis. Documented R routines are freely available from www.stjuderesearch.org/depts/biostats and will soon be available as a Bioconductor package from www.bioconductor.org.
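A compact sketch of the PROMISE idea as the abstract describes it: correlate one genomic variable with several endpoints, take the dot product of the association statistics with the hypothesized pattern vector, and assess significance by permutation. The data, sample size, and choice of Spearman correlations are all illustrative assumptions, not the authors' exact implementation.

```python
# PROMISE-style statistic: projection of observed association statistics
# onto the biologically "most interesting" pattern, with permutation p-value.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(5)
n = 80
x = rng.normal(size=n)                                   # genomic variable
endpoints = np.column_stack([ 0.4*x + rng.normal(size=n),    # higher is "good"
                             -0.3*x + rng.normal(size=n),    # lower is "good"
                              0.2*x + rng.normal(size=n)])
pattern = np.array([1.0, -1.0, 1.0])                     # expected signs

def promise_stat(x, Y, pattern):
    r = np.array([spearmanr(x, Y[:, j]).correlation for j in range(Y.shape[1])])
    return r @ pattern

obs = promise_stat(x, endpoints, pattern)
perm = np.array([promise_stat(rng.permutation(x), endpoints, pattern)
                 for _ in range(2000)])
p = np.mean(np.abs(perm) >= abs(obs))
print(f"statistic = {obs:.3f}, permutation p = {p:.4f}")
```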
"Hyperstat": an educational and working tool in epidemiology.
Nicolosi, A
1995-01-01
The work of a researcher in epidemiology is based on studying literature, planning studies, gathering data, analyzing data, and writing results. He therefore needs to perform more or less simple calculations, to consult or quote literature, to consult textbooks about certain issues or procedures, and to look up specific formulas. There are no programs conceived as a workstation to assist the different aspects of a researcher's work in an integrated fashion. A hypertextual system was developed which supports different stages of the epidemiologist's work. It combines database management, statistical analysis or planning, and literature searches. The software was developed on Apple Macintosh by using Hypercard 2.1 as a database and HyperTalk as a programming language. The program is structured in 7 "stacks" or files: Procedures; Statistical Tables; Graphs; References; Text; Formulas; Help. Each stack has its own management system with an automated Table of Contents. Stacks contain "cards" which make up the databases and carry executable programs. The programs are of four kinds: association; statistical procedure; formatting (input/output); database management. The system performs general statistical procedures, procedures applicable to epidemiological studies only (follow-up and case-control), and procedures for clinical trials. All commands are given by clicking the mouse on self-explanatory "buttons". In order to perform calculations, the user only needs to enter the data into the appropriate cells and then click on the selected procedure's button. The system has a hypertextual structure. The user can go from a procedure to other cards following the preferred order of succession and according to built-in associations. The user can access different levels of knowledge or information from any stack he is consulting or operating. From every card, the user can go to a selected procedure to perform statistical calculations, to the reference database management system, to the textbook in which all procedures and issues are discussed in detail, to the database of statistical formulas with automated table of contents, to statistical tables with automated table of contents, or to the help module. The program has a very user-friendly interface and leaves the user free to use the same format he would use on paper. The interface does not require special skills. It reflects the Macintosh philosophy of using windows, buttons, and mouse. This allows the user to perform complicated calculations without losing the "feel" of the data, to weigh alternatives, and to run simulations. This program shares many features in common with hypertexts. It has an underlying network database where the nodes consist of text, graphics, executable procedures, and combinations of these; the nodes in the database correspond to windows on the screen; the links between the nodes in the database are visible as "active" text or icons in the windows; the text is read by following links and opening new windows. The program is especially useful as an educational tool, directed to medical and epidemiology students. The combination of computing capabilities with a textbook and databases of formulas and literature references makes the program versatile and attractive as a learning tool.
The program is also helpful in the work done at the desk, where the researcher examines results, consults literature, explores different analytic approaches, plans new studies, or writes grant proposals or scientific articles.
Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W.; Xia, Yinglin; Tu, Xin M.
2011-01-01
The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. PMID:21671252
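One of the modeling families such comparisons typically include is the marginal (GEE) approach; here is a minimal sketch on simulated clustered binary data. This illustrates only one of the several procedures the report contrasts, and the data-generating values are invented.

```python
# Marginal model for correlated binary responses via GEE with an
# exchangeable working correlation (statsmodels).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n_subj, n_rep = 100, 4
subj = np.repeat(np.arange(n_subj), n_rep)
x = rng.normal(size=n_subj * n_rep)
u = np.repeat(rng.normal(0, 1, n_subj), n_rep)        # subject random effect
p = 1 / (1 + np.exp(-(-0.5 + 0.8 * x + u)))
df = pd.DataFrame({"y": rng.binomial(1, p), "x": x, "subj": subj})

model = smf.gee("y ~ x", groups="subj", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```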
Patient use of social media to evaluate cosmetic treatments and procedures.
Schlichte, Megan J; Karimkhani, Chante; Jones, Trevor; Trikha, Ritika; Dellavalle, Robert P
2015-04-16
With a growing sphere of influence in the modern world, online social media serves as a readily accessible interface for communication of information. Aesthetic medicine is one of many industries increasingly influenced by social media, as evidenced by the popular website, "RealSelf," an online community founded in 2006 that compiles ratings, reviews, photographs, and expert physician commentary for nearly 300 cosmetic treatments. To investigate the current preferences of patients regarding cosmetic non-surgical, surgical, and dental treatments on RealSelf and in the documented medical literature. On a single day of data collection, all cosmetic treatments or procedures reviewed on the RealSelf website were tabulated, including name, percent "worth it" rating, total number of reviews, and average cost. Patient satisfaction rates documented in the current medical literature for each cosmetic treatment or procedure were also recorded. Statistical t-testing comparing RealSelf ratings and satisfaction rates in the literature was performed for each category (non-surgical, surgical, and dental). The top ten most-commonly reviewed non-surgical treatments, top ten most-commonly reviewed surgical procedures, and top five most-commonly reviewed dental treatments, along with documented satisfaction rates in the medical literature for each treatment or procedure, were recorded in table format and ranked by RealSelf "worth it" rating. Paired t-testing revealed that satisfaction rates documented in the literature were significantly higher than RealSelf "worth it" ratings for both non-surgical cosmetic treatments (p=0.00076) and surgical cosmetic procedures (p=0.00056), with no statistically significant difference for dental treatments. For prospective patients interested in cosmetic treatments or procedures, social media sites such as RealSelf may offer information helpful to decision-making as well as enable cosmetic treatment providers to build reputations and expand practices. "Worth it" ratings on RealSelf may, in fact, represent a more transparent view of cosmetic treatment or procedural outcomes relative to the high satisfaction rates documented in medical literature. Massive online communication of patient experiences made possible through social media will continue to influence the practice of medicine, both aesthetic and otherwise.
Chopko, Bohdan; Caraway, David L
2010-01-01
Neurogenic claudication due to lumbar spinal stenosis is a common problem that can be caused by many factors including hypertrophic ligamentum flavum, facet hypertrophy, and disc protrusion. When standard medical therapies such as pain medication, epidural steroid injections, and physical therapy fail, or when the patient is unwilling, unable, or not severe enough to advance to more invasive surgical procedures, both physicians and patients are often left with a treatment dilemma. Patients in this study were treated with mild, an ultra-minimally invasive lumbar decompression procedure using a dorsal approach. The mild procedure is performed under fluoroscopic imaging to resect bone adjacent to, and achieve partial resection of, the hypertrophic ligamentum flavum with minimal disruption of surrounding muscular and skeletal structure. To assess the clinical application and patient safety and functional outcomes of the mild lumbar decompression procedure in the treatment of symptomatic central canal spinal stenosis. Multi-center, non-blinded, prospective clinical study. Fourteen US spine specialist practices. Between July 2008 and January 2010, 78 patients were enrolled in the MiDAS I Study and treated with the mild procedure for lumbar decompression. Of these patients, 6-week follow-up was available for 75 patients. Visual Analog Score (VAS), Oswestry Disability Index (ODI), Zurich Claudication Questionnaire (ZCQ), and SF-12v2 Health Survey. Outcomes were assessed at baseline and 6 weeks post-treatment. There were no major device or procedure-related complications reported in this patient cohort. At 6 weeks, the MiDAS I Study showed statistically and clinically significant reduction of pain as measured by VAS, ZCQ, and SF-12v2. In addition, improvement in physical function and mobility as measured by ODI, ZCQ, and SF-12v2 was statistically and clinically significant in this study. This is a preliminary report encompassing 6-week follow-up. There was no control group. In this 75-patient series, and in keeping with a previously published 90-patient safety cohort, the mild procedure proved to be safe. Further, based on near-term follow-up, the mild procedure demonstrated efficacy in improving mobility and reducing pain associated with lumbar spinal canal stenosis.
Palazón, L; Navas, A
2017-06-01
Information on sediment contribution and transport dynamics from the contributing catchments is needed to develop management plans to tackle environmental problems related to the effects of fine sediment, such as reservoir siltation. In this respect, the fingerprinting technique is an indirect technique known to be valuable and effective for sediment source identification in river catchments. Large variability in sediment delivery was found in previous studies in the Barasona catchment (1509 km², Central Spanish Pyrenees). Simulation results with SWAT and fingerprinting approaches identified badlands and agricultural uses as the main contributors to sediment supply in the reservoir. In this study the <63 μm fraction of the surface reservoir sediments (2 cm) is investigated following the fingerprinting procedure to assess how the use of different statistical procedures affects the estimated source contributions. Three optimum composite fingerprints were selected to discriminate between source contributions based on land uses/land covers from the same dataset by the application of (1) discriminant function analysis, and its combination (as a second step) with (2) the Kruskal-Wallis H-test and (3) principal components analysis. Source contribution results differed between the assessed options, with the greatest differences observed for option (3), the two-step process of principal components analysis and discriminant function analysis. The characteristics of the solutions from the applied mixing model and the conceptual understanding of the catchment showed that the most reliable solution was achieved using (2), the two-step process of the Kruskal-Wallis H-test and discriminant function analysis. The assessment showed the importance of the statistical procedure used to define the optimum composite fingerprint for sediment fingerprinting applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
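A schematic of the two-step tracer selection the study found most reliable (Kruskal-Wallis screening followed by discriminant analysis), on simulated tracer data rather than the Barasona measurements; the group structure, tracer count, and alpha level are assumptions for illustration.

```python
# Step 1: Kruskal-Wallis H-test screens tracers that differ across sources.
# Step 2: discriminant analysis (LDA) on the surviving tracers.
import numpy as np
from scipy.stats import kruskal
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(7)
groups = np.repeat([0, 1, 2], 30)                # three land-use source groups
n, n_tracers = groups.size, 12
X = rng.normal(size=(n, n_tracers))
X[:, 0] += groups * 1.5                          # two genuinely discriminating
X[:, 1] -= groups * 1.0                          # tracers; the rest are noise

keep = [j for j in range(n_tracers)
        if kruskal(*(X[groups == g, j] for g in (0, 1, 2))).pvalue < 0.05]

lda = LinearDiscriminantAnalysis().fit(X[:, keep], groups)
print("tracers kept:", keep,
      "reclassification accuracy:", lda.score(X[:, keep], groups))
```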
Statistical inference methods for two crossing survival curves: a comparison of methods.
Li, Huimin; Han, Dong; Hou, Yawen; Chen, Huilin; Chen, Zheng
2015-01-01
A common problem that is encountered in medical applications is the overall homogeneity of survival distributions when two survival curves cross each other. A survey demonstrated that under this condition, which was an obvious violation of the assumption of proportional hazard rates, the log-rank test was still used in 70% of studies. Several statistical methods have been proposed to solve this problem. However, in many applications, it is difficult to specify the types of survival differences and choose an appropriate method prior to analysis. Thus, we conducted an extensive series of Monte Carlo simulations to investigate the power and type I error rate of these procedures under various patterns of crossing survival curves with different censoring rates and distribution parameters. Our objective was to evaluate the strengths and weaknesses of tests in different situations and for various censoring rates and to recommend an appropriate test that will not fail for a wide range of applications. Simulation studies demonstrated that adaptive Neyman's smooth tests and the two-stage procedure offer higher power and greater stability than other methods when the survival distributions cross at early, middle or late times. Even for proportional hazards, both methods maintain acceptable power compared with the log-rank test. In terms of the type I error rate, Renyi and Cramér-von Mises tests are relatively conservative, whereas the statistics of the Lin-Xu test exhibit apparent inflation as the censoring rate increases. Other tests produce results close to the nominal 0.05 level. In conclusion, adaptive Neyman's smooth tests and the two-stage procedure are found to be the most stable and feasible approaches for a variety of situations and censoring rates. Therefore, they are applicable to a wider spectrum of alternatives compared with other tests.
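To see why crossing curves are troublesome, here is a tiny simulation (not from the paper's study; it assumes the lifelines package is available): two Weibull groups whose survival curves cross, where early and late differences cancel and the log-rank test can lose power.

```python
# Crossing survival curves: an early-failure group (Weibull shape < 1)
# versus a late-failure group (shape > 1), compared with the log-rank test.
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(8)
a = rng.weibull(0.6, 200) * 1.0      # early-failure group
b = rng.weibull(2.5, 200) * 1.2      # late-failure group; curves cross
res = logrank_test(a, b,
                   event_observed_A=np.ones_like(a),
                   event_observed_B=np.ones_like(b))
print(f"log-rank p = {res.p_value:.4f}")
```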
Galeotti, Angela; Garret Bernardin, Annelyse; D'Antò, Vincenzo; Ferrazzano, Gianmaria Fabrizio; Gentile, Tina; Viarani, Valeria; Cassabgi, Giorgio; Cantile, Tiziana
2016-01-01
Aim. To evaluate the effectiveness and the tolerability of nitrous oxide sedation for dental treatment on a large pediatric sample constituting precooperative, fearful, and disabled patients. Methods. 472 noncooperating patients (aged 4 to 17) were treated under conscious sedation. The following data were calculated: average age; gender distribution; success/failure; adverse effects; number of treatments; kind of dental procedure undertaken; number of dental procedures for each working session; number of working sessions for each patient; differences between males and females and between healthy and disabled patients in relation to success; success in relation to age; and level of cooperation using the Venham score. Results. 688 conscious sedations were carried out. The success rate was 86.3%. Adverse effects occurred in 2.5%. 1317 dental procedures were performed. In relation to success, there was a statistically significant difference between healthy and disabled patients. Sex and age were not significant factors for success. The Venham score was higher at the first contact with the dentist than during treatment. Conclusions. Inhalation conscious sedation represented an effective and safe method to obtain cooperation, even in very young patients, and it could reduce the number of pediatric patients referred to hospitals for general anesthesia.
A method for determining the weak statistical stationarity of a random process
NASA Technical Reports Server (NTRS)
Sadeh, W. Z.; Koper, C. A., Jr.
1978-01-01
A method for determining the weak statistical stationarity of a random process is presented. The core of this testing procedure consists of generating an equivalent ensemble which approximates a true ensemble. Formation of an equivalent ensemble is accomplished through segmenting a sufficiently long time history of a random process into equal, finite, and statistically independent sample records. The weak statistical stationarity is ascertained based on the time invariance of the equivalent-ensemble averages. Comparison of these averages with their corresponding time averages over a single sample record leads to a heuristic estimate of the ergodicity of a random process. Specific variance tests are introduced for evaluating the statistical independence of the sample records, the time invariance of the equivalent-ensemble autocorrelations, and the ergodicity. Examination and substantiation of these procedures were conducted utilizing turbulent velocity signals.
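A hedged sketch of the equivalent-ensemble idea on a synthetic signal: segment one long record into equal, finite sample records, then check that the segment averages show no trend in time. The segment sizes, block grouping, and ANOVA check are illustrative choices, not the paper's exact variance tests.

```python
# Segment a long record into sample records and test time invariance of
# the equivalent-ensemble means across early/middle/late blocks.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(9)
x = rng.normal(0, 1, 60_000)             # long synthetic "turbulence" record
segments = x.reshape(60, 1000)           # 60 equal, finite sample records

means = segments.mean(axis=1)            # equivalent-ensemble averages

blocks = np.split(np.arange(60), 3)      # early, middle, late thirds
F, p = f_oneway(*(means[b] for b in blocks))
print(f"segment-mean ANOVA across time blocks: F = {F:.2f}, p = {p:.3f}")
```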
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. Ostrouchov; W.E.Doll; D.A.Wolf
2003-07-01
Unexploded ordnance (UXO) surveys encompass large areas, and the cost of surveying these areas can be high. Experience with earlier protocols for sampling UXO sites has shown the shortcomings of these procedures and led to a call for the development of scientifically defensible statistical procedures for survey design and analysis. This project is one of three funded by SERDP to address this need.
Statistical evaluation of rainfall-simulator and erosion testing procedure : final report.
DOT National Transportation Integrated Search
1977-01-01
The specific aims of this study were (1) to supply documentation of statistical repeatability and precision of the rainfall-simulator and to document the statistical repeatability of the soil-loss data when using the previously recommended tentative l...
Eng, John; Wilson, Renee F; Subramaniam, Rathan M; Zhang, Allen; Suarez-Cuervo, Catalina; Turban, Sharon; Choi, Michael J; Sherrod, Cheryl; Hutfless, Susan; Iyoha, Emmanuel E; Bass, Eric B
2016-03-15
Iodine contrast media are essential components of many imaging procedures. An important potential side effect is contrast-induced nephropathy (CIN). To compare CIN risk for contrast media within and between osmolality classes in patients receiving diagnostic or therapeutic imaging procedures. PubMed, EMBASE, Cochrane Library, ClinicalTrials.gov, and Scopus through June 2015. Randomized, controlled trials that reported CIN-related outcomes in patients receiving low-osmolar contrast media (LOCM) or iso-osmolar contrast media for imaging. Independent study selection and quality assessment by 2 reviewers and dual extraction of study characteristics and results. None of the 5 studies that compared types of LOCM reported a statistically significant or clinically important difference among study groups, but the strength of evidence was low. Twenty-five randomized, controlled trials found a slight reduction in CIN risk with the iso-osmolar contrast media agent iodixanol compared with a diverse group of LOCM that just reached statistical significance in a meta-analysis (pooled relative risk, 0.80 [95% CI, 0.65 to 0.99]; P = 0.045). This comparison's strength of evidence was moderate. In a meta-regression of randomized, controlled trials of iodixanol, no relationship was found between route of administration and comparative CIN risk. Few studies compared LOCM. Procedural details about contrast administration were not uniformly reported. Few studies specified clinical indications or severity of baseline renal impairment. No differences were found in CIN risk among types of LOCM. Iodixanol had a slightly lower risk for CIN than LOCM, but the lower risk did not exceed a criterion for clinical importance. Agency for Healthcare Research and Quality.
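For readers unfamiliar with the pooling step reported above, a fixed-effect inverse-variance meta-analysis of relative risks can be sketched as follows. The trial counts are hypothetical, not data from the review, and only the standard large-sample formulas for the log relative risk are assumed; the review's own pooling model may differ.

```python
import numpy as np

def pooled_relative_risk(events_t, n_t, events_c, n_c, z=1.96):
    """Fixed-effect inverse-variance pooling of log relative risks."""
    events_t, n_t = np.asarray(events_t, float), np.asarray(n_t, float)
    events_c, n_c = np.asarray(events_c, float), np.asarray(n_c, float)
    log_rr = np.log((events_t / n_t) / (events_c / n_c))
    var = 1/events_t - 1/n_t + 1/events_c - 1/n_c   # large-sample variance of log RR
    w = 1.0 / var
    pooled = np.sum(w * log_rr) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return np.exp(pooled), np.exp(pooled - z*se), np.exp(pooled + z*se)

# Hypothetical (events, N) per trial, for illustration only
rr, lo, hi = pooled_relative_risk([12, 8, 20], [400, 300, 600],
                                  [15, 11, 24], [410, 290, 590])
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```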
Using SPSS to Analyze Book Collection Data.
ERIC Educational Resources Information Center
Townley, Charles T.
1981-01-01
Describes and illustrates Statistical Package for the Social Sciences (SPSS) procedures appropriate for book collection data analysis. Several different procedures for univariate, bivariate, and multivariate analysis are discussed, and applications of procedures for book collection studies are presented. Included are 24 tables illustrating output…
Geographic variations in the cost of spine surgery.
Goz, Vadim; Rane, Ajinkya; Abtahi, Amir M; Lawrence, Brandon D; Brodke, Darrel S; Spiker, William Ryan
2015-09-01
Retrospective review. To define the geographic variation in costs of anterior cervical discectomy and fusion (ACDF) and posterolateral fusion (PLF). ACDF and lumbar PLF are common procedures that are used in the treatment of spinal pathologies. To optimize value, both the benefits and costs of an intervention must be quantified. Data on costs are scarce in comparison with data on total charges. This study aims to define the costs of ACDF and PLF and to describe the geographic variation within the United States. Medicare Provider Utilization and Payment data were used to investigate the costs associated with ACDF, PLF, and total knee arthroplasty (TKA). Average total costs of the procedures were compared by state and geographic region. Combined professional and facility costs for a single-level ACDF had a national mean of $13,899. Total costs for a single-level PLF had a mean of $25,858. Total costs for a primary TKA had a national mean of $13,039. The cost increased to an average of $22,138 for TKA with major comorbidities. Analysis of geographic trends showed statistically significant differences in total costs of PLF, TKA, and TKA with major complications or comorbidities between geographic regions (P < 0.01 for all). Three of the 4 procedures (PLF, TKA, and TKA with major complications or comorbidities) showed statistically significant variation in cost between geographic regions. The Midwest provided the lowest cost for all procedures. Similar geographic trends in the cost of spinal fusions and TKAs suggest that these trends may not be limited to spine-related procedures. Surgical costs were found to correlate with cost of living but were not associated with the population of the state. These data shed light on the actual cost of common surgical procedures throughout the United States and will allow further progress toward the development of cost-effective, value-driven care. Level of evidence: 3.
Hindoyan, Kevork; Tilan, Justin; Buser, Zorica; Cohen, Jeremiah R; Brodke, Darrel S; Youssef, Jim A; Park, Jong-Beom; Yoon, S Tim; Meisel, Hans-Joerg; Wang, Jeffrey C
2017-04-01
Retrospective review. The aim of our study was to quantify the frequency of complications associated with recombinant human bone morphogenetic protein 2 (rhBMP-2) use in anterior lumbar interbody fusion (ALIF). The orthopedic subset of the Medicare database (PearlDiver) was queried for this retrospective cohort study using International Statistical Classification of Diseases 9 (ICD-9) and Current Procedure Terminology (CPT) codes for ALIF procedures with and without rhBMP-2 between 2005 and 2010. Frequencies of complications and reoperations were then identified within 1 year from the index procedure. Complications included reoperations, pulmonary embolus, deep vein thrombosis, myocardial infarction, nerve-related complications, incision and drainage procedures, wound, sepsis, pneumonia, urinary tract infections, respiratory, heterotopic ossification, retrograde ejaculation, radiculopathy, and other medical complications. Odds ratios (ORs) and 95% confidence intervals (CIs) were used to assess statistical significance. We identified a total of 41 865 patients who had an ALIF procedure. A total of 14 384 patients received rhBMP-2 while 27 481 did not. Overall, 6016 (41.8%) complications within 1 year from surgery were noted within the group who received rhBMP-2 and 12 950 (47.1%) complications within 1 year from surgery were recorded in those who did not receive rhBMP-2 (OR = 0.81, CI = 0.77-0.84). Overall, exposure to rhBMP-2 was associated with significantly decreased odds of complications with the exception of reoperation rates (0.9% rhBMP-2 vs 1.0% no rhBMP-2; OR = 0.88, CI = 0.71-1.09) and radiculopathy (4.4% rhBMP-2 vs 4.3% no rhBMP-2; OR = 1.02, CI = 0.93-1.13). The use of rhBMP-2 in patients undergoing ALIF procedure was associated with a significantly decreased rate of complications. Further studies are needed to elucidate the true incidence of complications.
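The reported odds ratio and confidence interval can be reproduced from the abstract's own counts with the standard Woolf (log-scale) method. The sketch below is illustrative rather than the authors' code, but it recovers OR = 0.81 (0.77 to 0.84):

```python
import math

def odds_ratio_ci(a, n1, c, n2, z=1.96):
    """Odds ratio with a Woolf (log-scale) confidence interval
    for events a/n1 (exposed) versus c/n2 (unexposed)."""
    b, d = n1 - a, n2 - c
    or_ = (a / b) / (c / d)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Counts as reported above: 6016/14384 complications with rhBMP-2,
# 12950/27481 without
print(odds_ratio_ci(6016, 14384, 12950, 27481))
# -> approximately (0.81, 0.77, 0.84), matching the published OR and CI
```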
Ozel, Bora; Sezgin, Billur; Guney, Kirdar; Latifoglu, Osman; Celebi, Cemallettin
2015-02-01
Although aesthetic procedures are known to have a higher impact on women, men have become more inclined toward such procedures over the last decade. To determine the reasons behind the increased demand for male aesthetic procedures and to learn about expectations and concerns related to body contouring surgery, a prospective questionnaire study was conducted on 200 Turkish males from January 1, 2011, to May 31, 2012. Demographic information, previous aesthetic procedures, and thoughts on body contouring procedures, with reasons given, were questioned. Of all participants, 53% considered undergoing body contouring surgery, giving the reason that they believed their current body structure required it. Of those who did not consider contouring operations, 92.5% said they felt that they did not need such a procedure. Statistical analysis showed that BMI was a significant factor in the decision-making process for wanting to undergo body contouring procedures. Men's consideration of aesthetic operations depended mainly on perceived necessity, and the most considered region for contouring was the abdominal zone. We conclude that men are becoming more interested in body contouring operations and that different surgical procedures should therefore be refined and redefined according to the expectations of this new patient group.
Browne, Richard W; Whitcomb, Brian W
2010-07-01
Problems in the analysis of laboratory data commonly arise in epidemiologic studies in which biomarkers subject to lower detection thresholds are used. Various thresholds exist including limit of detection (LOD), limit of quantification (LOQ), and limit of blank (LOB). Choosing appropriate strategies for dealing with data affected by such limits relies on proper understanding of the nature of the detection limit and its determination. In this paper, we demonstrate experimental and statistical procedures generally used for estimating different detection limits according to standard procedures in the context of analysis of fat-soluble vitamins and micronutrients in human serum. Fat-soluble vitamins and micronutrients were analyzed by high-performance liquid chromatography with diode array detection. A simulated serum matrix blank was repeatedly analyzed for determination of LOB parametrically by using the observed blank distribution as well as nonparametrically by using ranks. The LOD was determined by combining information regarding the LOB with data from repeated analysis of standard reference materials (SRMs), diluted to low levels; from LOB to 2-3 times LOB. The LOQ was determined experimentally by plotting the observed relative standard deviation (RSD) of SRM replicates compared with the concentration, where the LOQ is the concentration at an RSD of 20%. Experimental approaches and example statistical procedures are given for determination of LOB, LOD, and LOQ. These quantities are reported for each measured analyte. For many analyses, there is considerable information available below the LOQ. Epidemiologic studies must understand the nature of these detection limits and how they have been estimated for appropriate treatment of affected data.
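A minimal sketch of the parametric calculations described above, assuming the conventional 1.645 multipliers (95th percentiles under normality) and a 20% RSD target for the LOQ; the function names, inputs, and interpolation step are illustrative, not the authors' protocol:

```python
import numpy as np

def detection_limits(blanks, low_level_reps):
    """Parametric LOB and LOD in the spirit of standard procedures:
    LOB from the blank distribution, LOD from low-level replicates."""
    blanks = np.asarray(blanks, float)
    lob = blanks.mean() + 1.645 * blanks.std(ddof=1)   # parametric LOB
    lob_nonpar = np.percentile(blanks, 95)             # rank-based alternative
    lod = lob + 1.645 * np.std(low_level_reps, ddof=1)
    return lob, lob_nonpar, lod

def loq_from_profile(conc, rsd, target=0.20):
    """Interpolate the concentration at which the precision profile
    (replicate RSD versus concentration) crosses the target RSD."""
    conc, rsd = np.asarray(conc, float), np.asarray(rsd, float)
    return float(np.interp(target, rsd[::-1], conc[::-1]))  # rsd assumed decreasing

print(detection_limits([0.10, 0.00, 0.20, 0.10, 0.15], [0.40, 0.55, 0.35, 0.50]))
print(loq_from_profile([0.5, 1, 2, 5, 10], [0.45, 0.30, 0.22, 0.12, 0.06]))
```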
Taylor-Brown, F E; Cardy, T J A; Liebel, F X; Garosi, L; Kenny, P J; Volk, H A; De Decker, S
2015-12-01
Early post-operative neurological deterioration is a well-known complication following dorsal cervical laminectomies and hemilaminectomies in dogs. This study aimed to evaluate potential risk factors for early post-operative neurological deterioration following these surgical procedures. Medical records of 100 dogs that had undergone a cervical dorsal laminectomy or hemilaminectomy between 2002 and 2014 were assessed retrospectively. Assessed variables included signalment, bodyweight, duration of clinical signs, neurological status before surgery, diagnosis, surgical site, type and extent of surgery and duration of procedure. Outcome measures were neurological status immediately following surgery and duration of hospitalisation. Univariate statistical analysis was performed to identify variables to be included in a multivariate model. Diagnoses included osseous-associated cervical spondylomyelopathy (OACSM; n = 41), acute intervertebral disk extrusion (IVDE; 31), meningioma (11), spinal arachnoid diverticulum (10) and vertebral arch anomalies (7). Overall, 54% (95% CI 45.25-64.75) of dogs were neurologically worse 48 h post-operatively. Multivariate statistical analysis identified four factors significantly related to early post-operative neurological outcome. Diagnoses of OACSM or meningioma were considered the strongest variables to predict early post-operative neurological deterioration, followed by higher (more severely affected) neurological grade before surgery and longer surgery time. This information can aid in the management of expectations of clinical staff and owners of dogs undergoing these surgical procedures.
Seccia, Veronica; Dallan, Iacopo; Massimetti, Gabriele; Segnini, Giovanni; Navari, Elena; Fortunato, Susanna; Bajraktari, Arisa; Lenzi, Riccardo; Muscatello, Luca; Sellari-Franceschini, Stefano
2014-07-01
The objective was to explore the role of specific patient-related and operator-related factors in pain perception during flexible laryngoscopy, which is one of the most common ENT procedures. Monocentric, randomized, individual prospective study. A total of 532 patients (145 men and 387 women), without any relevant ENT diseases, underwent laryngoscopy performed by otolaryngologists with various degrees of experience. Patient discomfort was reported using visual analog scores, and willingness to repeat the experience was also recorded. Statistical analysis showed that greater pain was significantly associated with female patients and female otolaryngologists, whereas the pain was less severe in the cases of experienced laryngologists and older patients. Pain plays an important role in determining the willingness to repeat the examination; in fact, patients who experienced lower levels of pain during laryngoscopy were more prone to repeat the experience. This article explores the importance of the extrinsic factors that are related to the patient and the otolaryngologist in determining the level of pain associated with laryngoscopy. Our study indicated that laryngoscopy is generally a well-tolerated procedure, causing little overall discomfort, but that a subgroup of patients may experience more pain than others, which may affect the patient's perspective toward undergoing a similar future experience. Our analysis may be helpful for clinicians in understanding pain perception during a routine procedure, enabling them to focus more on that subgroup of patients who are more prone to pain. Level of evidence: 1b.
Adams, Dean C
2014-09-01
Phylogenetic signal is the tendency for closely related species to display similar trait values due to their common ancestry. Several methods have been developed for quantifying phylogenetic signal in univariate traits and for sets of traits treated simultaneously, and the statistical properties of these approaches have been extensively studied. However, methods for assessing phylogenetic signal in high-dimensional multivariate traits like shape are less well developed, and their statistical performance is not well characterized. In this article, I describe a generalization of the K statistic of Blomberg et al. that is useful for quantifying and evaluating phylogenetic signal in high-dimensional multivariate data. The method (K(mult)) is found from the equivalency between statistical methods based on covariance matrices and those based on distance matrices. Using computer simulations based on Brownian motion, I demonstrate that the expected value of K(mult) remains at 1.0 as trait variation among species is increased or decreased, and as the number of trait dimensions is increased. By contrast, estimates of phylogenetic signal found with a squared-change parsimony procedure for multivariate data change with increasing trait variation among species and with increasing numbers of trait dimensions, confounding biological interpretations. I also evaluate the statistical performance of hypothesis testing procedures based on K(mult) and find that the method displays appropriate Type I error and high statistical power for detecting phylogenetic signal in high-dimensional data. Statistical properties of K(mult) were consistent for simulations using bifurcating and random phylogenies, for simulations using different numbers of species, for simulations that varied the number of trait dimensions, and for different underlying models of trait covariance structure. Overall these findings demonstrate that K(mult) provides a useful means of evaluating phylogenetic signal in high-dimensional multivariate traits. Finally, I illustrate the utility of the new approach by evaluating the strength of phylogenetic signal for head shape in a lineage of Plethodon salamanders.
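For orientation, the univariate K statistic that K(mult) generalizes can be sketched as follows. This is a minimal implementation of the usual form of Blomberg et al.'s estimator under Brownian-motion assumptions, not the distance-based K(mult) itself:

```python
import numpy as np

def blomberg_K(x, C):
    """Blomberg et al.'s K for one trait, given the Brownian-motion
    covariance matrix C implied by the phylogeny."""
    x, C = np.asarray(x, float), np.asarray(C, float)
    n = len(x)
    Cinv = np.linalg.inv(C)
    one = np.ones(n)
    a = (one @ Cinv @ x) / (one @ Cinv @ one)     # phylogenetic mean
    d = x - a
    mse0 = d @ d / (n - 1)                        # ordinary mean square
    mse = d @ Cinv @ d / (n - 1)                  # phylogenetically corrected
    expected = (np.trace(C) - n / (one @ Cinv @ one)) / (n - 1)
    return (mse0 / mse) / expected

n = 8
x = np.random.default_rng(5).normal(size=n)
print(blomberg_K(x, np.eye(n)))   # exactly 1 for a star phylogeny (C = I)
```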
NASA Astrophysics Data System (ADS)
Hofer, Marlis; Mölg, Thomas; Marzeion, Ben; Kaser, Georg
2010-06-01
Recently initiated observation networks in the Cordillera Blanca (Peru) provide temporally high-resolution, yet short-term, atmospheric data. The aim of this study is to extend the existing time series into the past. We present an empirical-statistical downscaling (ESD) model that links 6-hourly National Centers for Environmental Prediction (NCEP)/National Center for Atmospheric Research (NCAR) reanalysis data to air temperature and specific humidity, measured at the tropical glacier Artesonraju (northern Cordillera Blanca). The ESD modeling procedure includes combined empirical orthogonal function and multiple regression analyses and a double cross-validation scheme for model evaluation. Apart from the selection of predictor fields, the modeling procedure is automated and does not include subjective choices. We assess the ESD model sensitivity to the predictor choice using both single-field and mixed-field predictors. Statistical transfer functions are derived individually for different months and times of day. The forecast skill largely depends on month and time of day, ranging from 0 to 0.8. The mixed-field predictors perform better than the single-field predictors. The ESD model shows added value, at all time scales, against simpler reference models (e.g., the direct use of reanalysis grid point values). The ESD model forecast for 1960-2008 clearly reflects interannual variability related to the El Niño/Southern Oscillation but is sensitive to the chosen predictor type.
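A much-simplified sketch of the EOF-plus-regression core of such an ESD model follows, using synthetic data, a single cross-validation loop rather than the authors' double scheme, and arbitrary component counts; all names are illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
mode = rng.normal(size=150)                      # one coherent spatial pattern
amp = rng.normal(size=2000)                      # its time-varying amplitude
field = np.outer(amp, mode) + rng.normal(size=(2000, 150))  # toy reanalysis field
station_t = 0.8 * amp + 0.3 * rng.normal(size=2000)         # toy station predictand

pcs = PCA(n_components=10).fit_transform(field)  # EOF analysis -> leading PCs
skill = cross_val_score(LinearRegression(), pcs, station_t, cv=5, scoring="r2")
print("cross-validated R^2 per fold:", np.round(skill, 2))
```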
Using statistical process control to make data-based clinical decisions.
Pfadt, A; Wheeler, D J
1995-01-01
Applied behavior analysis is based on an investigation of variability due to interrelationships among antecedents, behavior, and consequences. This permits testable hypotheses about the causes of behavior, as well as the course of treatment, to be evaluated empirically. Such information provides corrective feedback for making data-based clinical decisions. This paper considers how a different approach to the analysis of variability, based on the writings of Walter Shewhart and W. Edwards Deming in the area of industrial quality control, helps to achieve similar objectives. Statistical process control (SPC) was developed to implement a process of continual product improvement while achieving compliance with production standards and other requirements for promoting customer satisfaction. SPC involves the use of simple statistical tools, such as histograms and control charts, as well as problem-solving techniques, such as flow charts, cause-and-effect diagrams, and Pareto charts, to implement Deming's management philosophy. These data-analytic procedures can be incorporated into a human service organization to help achieve its stated objectives in a manner that leads to continuous improvement in the functioning of the clients who are its customers. Examples are provided to illustrate how SPC procedures can be used to analyze behavioral data. Issues related to the application of these tools for making data-based clinical decisions and for creating an organizational climate that promotes their routine use in applied settings are also considered.
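As an illustration of the control-chart tool mentioned above, an individuals and moving-range (XmR) chart in Wheeler's style can be computed in a few lines. The session counts are hypothetical; the 2.66 constant is the standard XmR scaling factor:

```python
import numpy as np

def xmr_limits(x):
    """Individuals and moving-range (XmR) chart limits in Wheeler's style:
    natural process limits at the mean +/- 2.66 * (average moving range)."""
    x = np.asarray(x, float)
    mr_bar = np.abs(np.diff(x)).mean()
    center = x.mean()
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical weekly counts of a target behavior
sessions = [7, 5, 9, 6, 8, 6, 7, 15, 6, 5]
lcl, cl, ucl = xmr_limits(sessions)
print(f"LCL={lcl:.1f}  CL={cl:.1f}  UCL={ucl:.1f}")
# A point falling outside these limits signals special-cause variation
# worth investigating before changing the treatment plan.
```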
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merritt, Z; Dave, J; Eschelman, D
Purpose: To investigate the effects of image receptor technology and dose reduction software on radiation dose estimates for the most frequently performed fluoroscopically guided interventional (FGI) procedures at a tertiary health care center. Methods: IRB approval was obtained for retrospective analysis of FGI procedures performed in the interventional radiology suites between January 2011 and December 2015. This included procedures performed using image-intensifier (II) based systems, which were subsequently replaced; flat-panel-detector (FPD) based systems, which were later upgraded with ClarityIQ dose reduction software (Philips Healthcare); and a relatively new FPD system already equipped with ClarityIQ. Post procedure, technologists entered the system-reported cumulative air kerma (CAK) and kerma-area product (KAP; only KAP for II based systems) in the RIS; these values were analyzed. Data pre-processing included correcting typographical errors and cross-verifying CAK and KAP. The most frequent high- and low-dose FGI procedures were identified and the corresponding CAK and KAP values were compared. Results: Out of 27,251 procedures within this time period, the most frequent high- and low-dose procedures were chemo/immuno-embolization (n=1967) and abscess drainage (n=1821). Mean KAP for embolization and abscess drainage procedures were 260,657, 310,304 and 94,908 mGy·cm², and 14,497, 15,040 and 6307 mGy·cm² using II-, FPD- and FPD with ClarityIQ-based systems, respectively. Statistically significant differences were observed in KAP values for embolization procedures with respect to the different systems, but for abscess drainage procedures significant differences were only noted between the FPD and the FPD with ClarityIQ systems (p<0.05). Mean CAK was reduced significantly from 823 to 308 mGy and from 43 to 21 mGy for embolization and abscess drainage procedures, respectively, in the transition to FPD systems with ClarityIQ (p<0.05). Conclusion: While transitioning from II- to FPD-based systems was not associated with dose reduction for the most frequently performed FGI procedures, substantial dose reduction was noted with relatively newer systems and dose reduction software.
This SOP describes the methods and procedures for two types of QA procedures: spot checks of hand entered data, and QA procedures for co-located and split samples. The spot checks were used to determine whether the error rate goal for the input of hand entered data was being att...
Funkenbusch, Paul D; Rotella, Mario; Ercoli, Carlo
2015-04-01
Laboratory studies of tooth preparation are often performed under a limited range of conditions involving single values for all variables other than the one being tested. In contrast, in clinical settings not all variables can be tightly controlled. For example, a new dental rotary cutting instrument may be tested in the laboratory by making a specific cut with a fixed force, but in clinical practice, the instrument must make different cuts with individual dentists applying a range of different forces. Therefore, the broad applicability of laboratory results to diverse clinical conditions is uncertain and the comparison of effects across studies is difficult. The purpose of this study was to examine the effect of 9 process variables on dental cutting in a single experiment, allowing each variable to be robustly tested over a range of values for the other 8 and permitting a direct comparison of the relative importance of each on the cutting process. The effects of 9 key process variables on the efficiency of a simulated dental cutting operation were measured. A fractional factorial experiment was conducted by using a computer-controlled, dedicated testing apparatus to simulate dental cutting procedures and Macor blocks as the cutting substrate. Analysis of variance (ANOVA) was used to judge statistical significance (α=.05). Five variables consistently produced large, statistically significant effects (target applied load, cut length, starting rpm, diamond grit size, and cut type), while 4 variables produced relatively small, statistically insignificant effects (number of cooling ports, rotary cutting instrument diameter, disposability, and water flow rate). The control exerted by the dentist, simulated in this study by targeting a specific level of applied force, was the single most important factor affecting cutting efficiency. Cutting efficiency was also significantly affected by factors simulating patient/clinical circumstances as well as hardware choices. These results highlight the importance of local clinical conditions (procedure, dentist) in understanding dental cutting procedures and in designing adequate experimental methodologies for future studies.
24 CFR 180.650 - Public document items.
Code of Federal Regulations, 2010 CFR
2010-04-01
... AND BUSINESS OPPORTUNITY CONSOLIDATED HUD HEARING PROCEDURES FOR CIVIL RIGHTS MATTERS Procedures at..., opinion, or published scientific or economic statistical data issued by any of the executive departments...
Optimizing Positioning for In-Office Otology Procedures.
Govil, Nandini; DeMayo, William M; Hirsch, Barry E; McCall, Andrew A
2017-01-01
Objective Surgeons often report musculoskeletal discomfort in relation to their practice, but few understand optimal ergonomic positioning. This study aims to determine which patient position (sitting versus supine) is ergonomically optimal for performing otologic procedures. Study Design Observational study. Setting Outpatient otolaryngology clinic setting in a tertiary care facility. Subjects and Methods We observed 3 neurotologists performing a standardized simulated cerumen debridement procedure on volunteers in 2 positions: sitting and supine. The Rapid Upper Limb Assessment (RULA), a validated tool that calculates stress placed on the upper limb during a task, was used to evaluate ergonomic positioning. Scores on this instrument range from 1 to 7, with a score of 1 to 2 indicating negligible risk of developing posture-related injury. The risk of musculoskeletal disorders increases as the RULA score increases. Results In nearly every trial, RULA scores were lower when the simulated patient was placed in the supine position. When examined as a group, the median RULA scores were 5 with the patient sitting and 3 with the patient in the supine position (P < .0001). When the RULA scores of the 3 neurotologists were examined individually, each had a statistically significant decrease in score with the patient in the supine position. Conclusion This study indicates that patient position may contribute to ergonomic stress placed on the otolaryngologist's upper limb during in-office otologic procedures. Otolaryngologists should consider performing otologic procedures with the patient in the supine position to decrease their own risk of developing upper-limb musculoskeletal disorders.
Estimating Selected Streamflow Statistics Representative of 1930-2002 in West Virginia
Wiley, Jeffrey B.
2008-01-01
Regional equations and procedures were developed for estimating 1-, 3-, 7-, 14-, and 30-day 2-year; 1-, 3-, 7-, 14-, and 30-day 5-year; and 1-, 3-, 7-, 14-, and 30-day 10-year hydrologically based low-flow frequency values for unregulated streams in West Virginia. Regional equations and procedures also were developed for estimating the 1-day, 3-year and 4-day, 3-year biologically based low-flow frequency values; the U.S. Environmental Protection Agency harmonic-mean flows; and the 10-, 25-, 50-, 75-, and 90-percent flow-duration values. Regional equations were developed using ordinary least-squares regression using statistics from 117 U.S. Geological Survey continuous streamflow-gaging stations as dependent variables and basin characteristics as independent variables. Equations for three regions in West Virginia - North, South-Central, and Eastern Panhandle - were determined. Drainage area, precipitation, and longitude of the basin centroid are significant independent variables in one or more of the equations. Estimating procedures are presented for determining statistics at a gaging station, a partial-record station, and an ungaged location. Examples of some estimating procedures are presented.
Pounds, Stan; Cheng, Cheng; Cao, Xueyuan; Crews, Kristine R.; Plunkett, William; Gandhi, Varsha; Rubnitz, Jeffrey; Ribeiro, Raul C.; Downing, James R.; Lamba, Jatinder
2009-01-01
Motivation: In some applications, prior biological knowledge can be used to define a specific pattern of association of multiple endpoint variables with a genomic variable that is biologically most interesting. However, to our knowledge, there is no statistical procedure designed to detect specific patterns of association with multiple endpoint variables. Results: Projection onto the most interesting statistical evidence (PROMISE) is proposed as a general procedure to identify genomic variables that exhibit a specific biologically interesting pattern of association with multiple endpoint variables. Biological knowledge of the endpoint variables is used to define a vector that represents the biologically most interesting values for statistics that characterize the associations of the endpoint variables with a genomic variable. A test statistic is defined as the dot-product of the vector of the observed association statistics and the vector of the most interesting values of the association statistics. By definition, this test statistic is proportional to the length of the projection of the observed vector of correlations onto the vector of most interesting associations. Statistical significance is determined via permutation. In simulation studies and an example application, PROMISE shows greater statistical power to identify genes with the interesting pattern of associations than classical multivariate procedures, individual endpoint analyses or listing genes that have the pattern of interest and are significant in more than one individual endpoint analysis. Availability: Documented R routines are freely available from www.stjuderesearch.org/depts/biostats and will soon be available as a Bioconductor package from www.bioconductor.org. Contact: stanley.pounds@stjude.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19528086
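The PROMISE test statistic and permutation step described above can be sketched compactly. Pearson correlation is assumed here as the per-endpoint association statistic, and all names are illustrative; the authors' documented R routines are the reference implementation:

```python
import numpy as np

def promise_test(gene, endpoints, pattern, n_perm=10_000, seed=0):
    """PROMISE-style statistic: project per-endpoint association statistics
    (Pearson correlations here) onto the hypothesized pattern vector;
    significance by permuting the genomic variable across subjects."""
    rng = np.random.default_rng(seed)
    pattern = np.asarray(pattern, float)
    def stat(g):
        r = np.array([np.corrcoef(g, e)[0, 1] for e in endpoints])
        return r @ pattern
    obs = stat(np.asarray(gene, float))
    perms = np.array([stat(rng.permutation(gene)) for _ in range(n_perm)])
    p = (np.sum(np.abs(perms) >= abs(obs)) + 1) / (n_perm + 1)
    return obs, p

rng = np.random.default_rng(1)
g = rng.normal(size=100)                              # toy genomic variable
endpoints = [g + rng.normal(size=100),                # endpoint rising with g
             -g + rng.normal(size=100)]               # endpoint falling with g
print(promise_test(g, endpoints, pattern=[1.0, -1.0], n_perm=2000))
```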
Generalized energy measurements and modified transient quantum fluctuation theorems
NASA Astrophysics Data System (ADS)
Watanabe, Gentaro; Venkatesh, B. Prasanna; Talkner, Peter
2014-05-01
Determining the work which is supplied to a system by an external agent provides a crucial step in any experimental realization of transient fluctuation relations. This, however, poses a problem for quantum systems, where the standard procedure requires the projective measurement of energy at the beginning and the end of the protocol. Unfortunately, projective measurements, which are preferable from the point of view of theory, seem to be difficult to implement experimentally. We demonstrate that, when using a particular type of generalized energy measurements, the resulting work statistics is simply related to that of projective measurements. This relation between the two work statistics entails the existence of modified transient fluctuation relations. The modifications are exclusively determined by the errors incurred in the generalized energy measurements. They are universal in the sense that they do not depend on the force protocol. Particularly simple expressions for the modified Crooks relation and Jarzynski equality are found for Gaussian energy measurements. These can be obtained by a sequence of sufficiently many generalized measurements which need not be Gaussian. In accordance with the central limit theorem, this leads to an effective error reduction in the individual measurements and even yields a projective measurement in the limit of infinite repetitions.
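For reference, the unmodified transient fluctuation relations that the generalized-measurement corrections attach to are the standard Crooks and Jarzynski results; only these baseline forms are stated here, since the modified versions depend on the measurement errors and are given in the paper:

```latex
\frac{P_F(W)}{P_B(-W)} = e^{\beta (W - \Delta F)} \quad \text{(Crooks)},
\qquad
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F} \quad \text{(Jarzynski)}.
```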
NASA Technical Reports Server (NTRS)
Holms, A. G.
1977-01-01
A statistical decision procedure called chain pooling has been developed for model selection in fitting the results of a two-level fixed-effects full or fractional factorial experiment without replication. The basic strategy included the use of one nominal level of significance for a preliminary test and a second nominal level of significance for the final test. The subject has been reexamined from the point of view of using as many as three successive statistical model deletion procedures in fitting the results of a single experiment. The investigation consisted of random number studies intended to simulate the results of a proposed aircraft turbine-engine rotor-burst-protection experiment. As a conservative approach, population model coefficients were chosen to represent a saturated 2^4 experiment with a distribution of parameter values unfavorable to the decision procedures. Three model selection strategies were developed.
Approach for Input Uncertainty Propagation and Robust Design in CFD Using Sensitivity Derivatives
NASA Technical Reports Server (NTRS)
Putko, Michele M.; Taylor, Arthur C., III; Newman, Perry A.; Green, Lawrence L.
2002-01-01
An implementation of the approximate statistical moment method for uncertainty propagation and robust optimization for a quasi 3-D Euler CFD code is presented. Given uncertainties in statistically independent, random, normally distributed input variables, first- and second-order statistical moment procedures are performed to approximate the uncertainty in the CFD output. Efficient calculation of both first- and second-order sensitivity derivatives is required. In order to assess the validity of the approximations, these moments are compared with statistical moments generated through Monte Carlo simulations. The uncertainties in the CFD input variables are also incorporated into a robust optimization procedure. For this optimization, statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. Second-order sensitivity derivatives are used in a gradient-based search to successfully execute a robust optimization. The approximate methods used throughout the analyses are found to be valid when considering robustness about input parameter mean values.
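The first-order moment approximation referred to above, with a Monte Carlo check in the spirit of the paper's validation, can be sketched as follows; the output function, input means, and standard deviations are stand-ins, not the CFD quantities:

```python
import numpy as np

def first_order_var(f, mu, sigma, h=1e-6):
    """First-order second-moment propagation for independent normal inputs:
    var(f) ~ sum_i (df/dx_i)^2 * sigma_i^2, derivatives by central differences."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    eye = np.eye(len(mu))
    grad = np.array([(f(mu + h * eye[i]) - f(mu - h * eye[i])) / (2 * h)
                     for i in range(len(mu))])
    return np.sum((grad * sigma) ** 2)

f = lambda x: x[0] ** 2 * np.sin(x[1])            # stand-in for a CFD output
approx = first_order_var(f, [1.0, 0.5], [0.05, 0.02])

# Monte Carlo check, mirroring the paper's validation step
rng = np.random.default_rng(2)
s = rng.normal([1.0, 0.5], [0.05, 0.02], size=(100_000, 2))
print(round(approx, 6), round((s[:, 0] ** 2 * np.sin(s[:, 1])).var(), 6))
```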
Statistical Ensemble of Large Eddy Simulations
NASA Technical Reports Server (NTRS)
Carati, Daniele; Rogers, Michael M.; Wray, Alan A.; Mansour, Nagi N. (Technical Monitor)
2001-01-01
A statistical ensemble of large eddy simulations (LES) is run simultaneously for the same flow. The information provided by the different large scale velocity fields is used to propose an ensemble averaged version of the dynamic model. This produces local model parameters that only depend on the statistical properties of the flow. An important property of the ensemble averaged dynamic procedure is that it does not require any spatial averaging and can thus be used in fully inhomogeneous flows. Also, the ensemble of LESs provides statistics of the large scale velocity that can be used for building new models for the subgrid-scale stress tensor. The ensemble averaged dynamic procedure has been implemented with various models for three flows: decaying isotropic turbulence, forced isotropic turbulence, and the time-developing plane wake. It is found that the results are almost independent of the number of LESs in the statistical ensemble provided that the ensemble contains at least 16 realizations.
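In the notation of the standard dynamic procedure, the ensemble-averaged coefficient replaces spatial averaging with an average over the R simultaneous realizations; this is a schematic form consistent with the description above, not a quotation from the paper:

```latex
C(\mathbf{x},t) = \frac{\langle L_{ij} M_{ij} \rangle_e}{\langle M_{kl} M_{kl} \rangle_e},
\qquad
\langle \cdot \rangle_e = \frac{1}{R} \sum_{r=1}^{R} (\cdot)^{(r)},
```

where L and M are the usual Germano-identity tensors computed from each resolved field.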
NASA Technical Reports Server (NTRS)
Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.
1994-01-01
Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W; Xia, Yinglin; Zhu, Liang; Tu, Xin M
2011-09-10
The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice.
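The model at issue can be made concrete by simulating from a random-intercept logistic GLMM; it is the fitting of such data (penalized quasi-likelihood, adaptive quadrature, and so on) that differs across packages. A minimal data-generating sketch with arbitrary parameter values:

```python
import numpy as np

rng = np.random.default_rng(3)
n_subjects, n_visits = 200, 5
beta0, beta1, sigma_u = -1.0, 0.8, 1.2        # fixed effects; random-intercept SD

u = rng.normal(0.0, sigma_u, n_subjects)       # subject-level random intercepts
x = rng.normal(size=(n_subjects, n_visits))    # time-varying covariate
eta = beta0 + beta1 * x + u[:, None]           # linear predictor given u
y = rng.random((n_subjects, n_visits)) < 1 / (1 + np.exp(-eta))

# The shared intercept induces within-subject correlation of the binary
# responses; recovering beta and sigma_u from y is the step the packages
# tackle with different approximations.
print("overall response rate:", round(y.mean(), 3))
```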
Monitoring Statistics Which Have Increased Power over a Reduced Time Range.
ERIC Educational Resources Information Center
Tang, S. M.; MacNeill, I. B.
1992-01-01
The problem of monitoring trends for changes at unknown times is considered. Statistics that permit one to focus high power on a segment of the monitored period are studied. Numerical procedures are developed to compute the null distribution of these statistics. (Author)
Improving the Crossing-SIBTEST Statistic for Detecting Non-uniform DIF.
Chalmers, R Philip
2018-06-01
This paper demonstrates that, after applying a simple modification to Li and Stout's (Psychometrika 61(4):647-677, 1996) CSIBTEST statistic, an improved variant of the statistic could be realized. It is shown that this modified version of CSIBTEST has a more direct association with the SIBTEST statistic presented by Shealy and Stout (Psychometrika 58(2):159-194, 1993). In particular, the asymptotic sampling distributions and general interpretation of the effect size estimates are the same for SIBTEST and the new CSIBTEST. Given the more natural connection to SIBTEST, it is shown that Li and Stout's hypothesis testing approach is insufficient for CSIBTEST; thus, an improved hypothesis testing procedure is required. Based on the presented arguments, a new chi-squared-based hypothesis testing approach is proposed for the modified CSIBTEST statistic. Positive results from a modest Monte Carlo simulation study strongly suggest the original CSIBTEST procedure and randomization hypothesis testing approach should be replaced by the modified statistic and hypothesis testing method.
Mainela-Arnold, Elina; Evans, Julia L.
2014-01-01
This study tested the predictions of the procedural deficit hypothesis by investigating the relationship between sequential statistical learning and two aspects of lexical ability, lexical-phonological and lexical-semantic, in children with and without specific language impairment (SLI). Participants included 40 children (ages 8;5–12;3), 20 children with SLI and 20 with typical development. Children completed Saffran’s statistical word segmentation task, a lexical-phonological access task (gating task), and a word definition task. Poor statistical learners were also poor at managing lexical-phonological competition during the gating task. However, statistical learning was not a significant predictor of semantic richness in word definitions. The ability to track statistical sequential regularities may be important for learning the inherently sequential structure of lexical-phonology, but not as important for learning lexical-semantic knowledge. Consistent with the procedural/declarative memory distinction, the brain networks associated with the two types of lexical learning are likely to have different learning properties. PMID:23425593
ERIC Educational Resources Information Center
Cook, Samuel A.; Fukawa-Connelly, Timothy
2016-01-01
Studies have shown that at the end of an introductory statistics course, students struggle with building block concepts, such as mean and standard deviation, and rely on procedural understandings of the concepts. This study aims to investigate the understandings entering freshman of a department of mathematics and statistics (including mathematics…
Adaptive graph-based multiple testing procedures
Klinglmueller, Florian; Posch, Martin; Koenig, Franz
2016-01-01
Multiple testing procedures defined by directed, weighted graphs have recently been proposed as an intuitive visual tool for constructing multiple testing strategies that reflect the often complex contextual relations between hypotheses in clinical trials. Many well-known sequentially rejective tests, such as (parallel) gatekeeping tests or hierarchical testing procedures, are special cases of the graph-based tests. We generalize these graph-based multiple testing procedures to adaptive trial designs with an interim analysis. These designs permit mid-trial design modifications based on unblinded interim data as well as external information, while providing strong familywise error rate control. To maintain the familywise error rate, it is not required to prespecify the adaptation rule in detail. Because the adaptive test does not require knowledge of the multivariate distribution of test statistics, it is applicable in a wide range of scenarios including trials with multiple treatment comparisons, endpoints or subgroups, or combinations thereof. Examples of adaptations are dropping of treatment arms, selection of subpopulations, and sample size reassessment. If, in the interim analysis, it is decided to continue the trial as planned, the adaptive test reduces to the originally planned multiple testing procedure. Only if adaptations are actually implemented does an adjusted test need to be applied. The procedure is illustrated with a case study and its operating characteristics are investigated by simulations. PMID:25319733
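A compact sketch of the sequentially rejective graph algorithm that underlies such procedures, in the style of Bretz et al.'s weighted-graph tests; the weights, transition matrix, and p-values below are hypothetical:

```python
import numpy as np

def graph_mtp(p, w, G, alpha=0.025):
    """Sequentially rejective graph-based multiple test: reject H_i when
    p_i <= w_i * alpha, then redistribute its weight along the graph edges
    and update the transition matrix (Bretz et al.-style updating rules)."""
    p, w, G = np.asarray(p, float), np.asarray(w, float), np.asarray(G, float)
    active, rejected = list(range(len(p))), []
    while True:
        cand = [i for i in active if p[i] <= w[i] * alpha]
        if not cand:
            return rejected
        i = cand[0]                      # any rejectable hypothesis may be taken
        active.remove(i)
        rejected.append(i)
        for j in active:                 # pass the freed weight along the edges
            w[j] += w[i] * G[i, j]
        Gn = G.copy()
        for j in active:                 # standard transition-matrix update
            for k in active:
                if j == k:
                    continue
                denom = 1.0 - G[j, i] * G[i, j]
                Gn[j, k] = (G[j, k] + G[j, i] * G[i, k]) / denom if denom > 0 else 0.0
        G = Gn

# Two hypotheses passing their weight to each other (hypothetical p-values)
print(graph_mtp(p=[0.01, 0.04], w=[0.5, 0.5], G=[[0, 1], [1, 0]]))  # -> [0]
```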
Bilateral effects of hospital patient-safety procedures on nurses' job satisfaction.
Inoue, T; Karima, R; Harada, K
2017-09-01
The aim of this study was to examine how hospital patient-safety procedures affect the job satisfaction of hospital nurses. Additionally, we investigated the association between perceived autonomy and hospital patient-safety procedures and job satisfaction. Recently, measures for patient safety have been recognized as an essential requirement in hospitals. Hospital patient-safety procedures may enhance the job satisfaction of nurses by improving the quality of their work. However, such procedures may also decrease their job satisfaction by imposing excessive stress on nurses because they cannot make mistakes. The participants included 537 nurses at 10 private hospitals in Japan (surveys were collected from March to July 2012). Factors related to hospital patient-safety procedures were identified using factor analysis, and the associations between these factors and nurses' self-perceived autonomy and job satisfaction were examined using structural equation modelling. Five factors regarding hospital patient-safety procedures were extracted. Additionally, structural equation modelling revealed statistically significant associations between these factors and the nurses' self-perceived autonomy and job satisfaction. The findings showed that nurses' perceived autonomy of the workplace enhanced their job satisfaction and that their perceptions of hospital patient-safety procedures promoted their job satisfaction. However, some styles of chief nurses' leadership regarding patient safety restrict nurses' independent and autonomous decision-making and actions, resulting in a lowering of job satisfaction. This study demonstrated that hospital patient-safety procedures have ambiguous effects on nurses' job satisfaction. In particular, chief nurses' leadership relating to patient safety can have a positive or negative effect on nurses' job satisfaction. The findings indicated that hospital managers should demonstrate positive attitudes toward improving patient safety to support nurses' job satisfaction. In addition, policymakers in the hospitals should consider that chief nurses' leadership styles may reduce autonomy and suppress nurses' job satisfaction.
Deep Venous Procedures Performed in the National Health Service in England between 2005 and 2015.
Lim, C S; Shalhoub, J; Davies, A H
2017-10-01
Recent advances in imaging technology and endovenous interventions have revolutionised the management of specific groups of patients with deep venous pathology. This study aimed to examine data published by Hospital Episode Statistics (HES) to assess trends in the number of endovascular and open surgical deep venous procedures performed in National Health Service (NHS) hospitals in England between 2005 and 2015. The main diagnosis of deep venous thrombosis (DVT), and total number of primary open and percutaneous procedures for deep venous pathology for patients admitted to the NHS hospitals in England from 2005 to 2015 were retrieved from the HES database and analysed. An overall declining trend in the annual number of admissions for a primary diagnosis of DVT was observed (linear regression r² = 0.9, p < .0001). The number of open surgical procedures for removal of thrombus remained largely unchanged (range 26-70); the frequency of percutaneous procedures increased steadily over the study period (range 0-311). The number of open surgical procedures relating to the vena cava fell between 2005 and 2009, and remained around 50 per year thereafter. Annual numbers of cases of deep venous bypass (range 17-33) and venous valve surgery (range 8-47) remained similar in trend over this period. The number of vena cava stent (range 0-405), other venous stent (range 0-316), and percutaneous venoplasty (range 0-972) procedures increased over the first 5 years of the study period. There is an increasing trend in relation to endovenous procedures but not open surgery, being carried out for deep venous pathology in the last decade in NHS hospitals in England. Despite a number of limitations with HES, the increase in the number of endovenous procedures shown is likely to have significant implications for the provision of care and healthcare resources for patients with deep venous pathology.
Estimating procedure times for surgeries by determining location parameters for the lognormal model.
Spangler, William E; Strum, David P; Vargas, Luis G; May, Jerrold H
2004-05-01
We present an empirical study of methods for estimating the location parameter of the lognormal distribution. Our results identify the best order statistic to use, and indicate that using the best order statistic instead of the median may lead to less frequent incorrect rejection of the lognormal model, more accurate critical value estimates, and higher goodness-of-fit. Using simulation data, we constructed and compared two models for identifying the best order statistic, one based on conventional nonlinear regression and the other using a data mining/machine learning technique. Better surgical procedure time estimates may lead to improved surgical operations.
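As context for the location-parameter problem, a brute-force profile-likelihood sketch is shown below. It is illustrative only: the likelihood is unbounded in the limit as the shift approaches the sample minimum, which is precisely why order-statistic estimators of the kind studied here are attractive; the grid therefore stops just short of that limit.

```python
import numpy as np

def profile_location(x, n_grid=200):
    """Grid profile log-likelihood for the three-parameter lognormal shift:
    for each candidate shift theta < min(x), mu and sigma have closed-form
    MLEs, so the profile reduces to -n*log(sigma) - sum(log(x - theta)) - n/2."""
    x = np.sort(np.asarray(x, float))
    # Stop just short of min(x); the likelihood diverges in that limit.
    thetas = np.linspace(x[0] - (x[-1] - x[0]), x[0] - 1e-9, n_grid)
    best_ll, best_theta = -np.inf, None
    for th in thetas:
        z = np.log(x - th)
        sig = z.std()                     # MLE of sigma given theta
        if sig > 0:
            ll = -len(x) * np.log(sig) - z.sum() - len(x) / 2
            if ll > best_ll:
                best_ll, best_theta = ll, th
    return best_theta

rng = np.random.default_rng(6)
x = 5.0 + rng.lognormal(mean=1.0, sigma=0.5, size=200)   # true shift = 5
print(round(profile_location(x), 2))   # near 5, up to grid resolution
```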
Additivity of nonlinear biomass equations
Bernard R. Parresol
2001-01-01
Two procedures that guarantee the property of additivity among the components of tree biomass and total tree biomass utilizing nonlinear functions are developed. Procedure 1 is a simple combination approach, and procedure 2 is based on nonlinear joint-generalized regression (nonlinear seemingly unrelated regressions) with parameter restrictions. Statistical theory is...
50 CFR 600.135 - Meeting procedures.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 8 2010-10-01 2010-10-01 false Meeting procedures. 600.135 Section 600....135 Meeting procedures. Link to an amendment published at 75 FR 59150, Sept. 27, 2010. (a) Public notice of regular meetings of the Council, scientific statistical committee or advisory panels, including...
Firanescu, Cristina E; de Vries, Jolanda; Lodder, Paul; Venmans, Alexander; Schoemaker, Marinus C; Smeet, Albert J; Donga, Esther; Juttmann, Job R; Klazen, Caroline A H; Elgersma, Otto E H; Jansen, Frits H; Tielbeek, Alexander V; Boukrab, Issam; Schonenberg, Karen; van Rooij, Willem Jan J; Hirsch, Joshua A; Lohle, Paul N M
2018-05-09
To assess whether percutaneous vertebroplasty results in more pain relief than a sham procedure in patients with acute osteoporotic compression fractures of the vertebral body. Randomised, double-blind, sham-controlled clinical trial. Four community hospitals in the Netherlands, 2011-15. 180 participants requiring treatment for acute osteoporotic vertebral compression fractures were randomised to either vertebroplasty (n=91) or a sham procedure (n=89). Participants received local subcutaneous lidocaine (lignocaine) and bupivacaine at each pedicle. The vertebroplasty group also received cementation, which was simulated in the sham procedure group. The main outcome measure was the mean reduction in visual analogue scale (VAS) scores at one day, one week, and one, three, six, and 12 months. Clinically significant pain relief was defined as a decrease of 1.5 points in VAS scores from baseline. Secondary outcome measures were the differences between groups for changes in the quality of life for osteoporosis and Roland-Morris disability questionnaire scores during 12 months' follow-up. The mean reduction in VAS score was statistically significant in the vertebroplasty and sham procedure groups at all follow-up points after the procedure compared with baseline. The mean difference in VAS scores between groups was 0.20 (95% confidence interval -0.53 to 0.94) at baseline, -0.43 (-1.17 to 0.31) at one day, -0.11 (-0.85 to 0.63) at one week, 0.41 (-0.33 to 1.15) at one month, 0.21 (-0.54 to 0.96) at three months, 0.39 (-0.37 to 1.15) at six months, and 0.45 (-0.37 to 1.24) at 12 months. These changes in VAS scores did not, however, differ statistically significantly between the groups during 12 months' follow-up. The results for secondary outcomes were not statistically significant. Use of analgesics (non-opioids, weak opioids, strong opioids) decreased statistically significantly in both groups at all time points, with no statistically significant differences between groups. Two adverse events occurred in the vertebroplasty group: one respiratory insufficiency and one vasovagal reaction. Percutaneous vertebroplasty did not result in statistically significantly greater pain relief than a sham procedure during 12 months' follow-up among patients with acute osteoporotic vertebral compression fractures. ClinicalTrials.gov NCT01200277.
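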
Irradiation-hyperthermia in canine hemangiopericytomas: large-animal model for therapeutic response.
Richardson, R C; Anderson, V L; Voorhees, W D; Blevins, W E; Inskeep, T K; Janas, W; Shupe, R E; Babbs, C F
1984-11-01
Results of irradiation-hyperthermia treatment in 11 dogs with naturally occurring hemangiopericytoma were reported. Similarities of canine and human hemangiopericytomas were described. Orthovoltage X-irradiation followed by microwave-induced hyperthermia resulted in a 91% objective response rate. A statistical procedure was given to evaluate quantitatively the clinical behavior of locally invasive, nonmetastatic tumors in dogs that were undergoing therapy for control of local disease. The procedure used a small sample size and demonstrated distribution of the data on a scaled response as well as transformation of the data through classical parametric and nonparametric statistical methods. These statistical methods set confidence limits on the population mean and placed tolerance limits on a population percentage. Application of the statistical methods to human and animal clinical trials was apparent.
NASA Technical Reports Server (NTRS)
Howell, L. W.
2001-01-01
A simple power law model consisting of a single spectral index alpha-1 is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10(exp 13) eV. Two procedures for estimating alpha-1, the method of moments and maximum likelihood (ML), are developed and their statistical performance compared. It is concluded that the ML procedure attains the most desirable statistical properties and is hence the recommended statistical estimation procedure for estimating alpha-1. The ML procedure is then generalized for application to a set of real cosmic-ray data and thereby makes this approach applicable to existing cosmic-ray data sets. Several other important results, such as the relationship between collecting power and detector energy resolution, as well as inclusion of a non-Gaussian detector response function, are presented. These results have many practical benefits in the design phase of a cosmic-ray detector as they permit instrument developers to make important trade studies in design parameters as a function of one of the science objectives. This is particularly important for space-based detectors where physical parameters, such as dimension and weight, impose rigorous practical limits to the design envelope.
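The ML estimator for a pure power-law index has a simple closed form, sketched below with an inverse-CDF simulation check. This is the standard continuous-data estimator, not the paper's generalized version that accounts for detector response:

```python
import numpy as np

def ml_index(E, E_min):
    """Maximum-likelihood index for a pure power law dN/dE ~ E^(-alpha)
    above E_min (continuous-data estimator with its large-sample error)."""
    E = np.asarray(E, float)
    E = E[E >= E_min]
    alpha = 1.0 + len(E) / np.sum(np.log(E / E_min))
    return alpha, (alpha - 1.0) / np.sqrt(len(E))

rng = np.random.default_rng(4)
u = rng.random(100_000)
E = 1.0 * (1.0 - u) ** (-1.0 / (2.7 - 1.0))   # inverse-CDF sampling, true alpha = 2.7
alpha, se = ml_index(E, 1.0)
print(f"alpha = {alpha:.3f} +/- {se:.3f}")     # ~ 2.7 +/- 0.005
```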
The transfer of analytical procedures.
Ermer, J; Limberger, M; Lis, K; Wätzig, H
2013-11-01
Analytical method transfers are certainly among the most discussed topics in the GMP regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in their draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. In order to facilitate this communication, procedures, flow charts, and checklists for responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and tables with acceptance limits are provided here, together with a comprehensive glossary. Potential pitfalls are described such that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared in detail, with examples. Significance tests should be avoided. The success criterion is not statistical significance, but rather analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and a direct control of the statistical risks. However, for lower risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient.
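A minimal sketch of the recommended equivalence-testing approach, using two one-sided t-tests (TOST) against a practically relevant acceptance limit; the assay values and the ±2.0 limit are hypothetical:

```python
import numpy as np
from scipy import stats

def tost(x_sending, x_receiving, delta):
    """Two one-sided t-tests: the units are declared equivalent if the
    difference in means is shown to lie within +/- delta."""
    n1, n2 = len(x_sending), len(x_receiving)
    m1, m2 = np.mean(x_sending), np.mean(x_receiving)
    s = np.sqrt(((n1 - 1) * np.var(x_sending, ddof=1)
                 + (n2 - 1) * np.var(x_receiving, ddof=1)) / (n1 + n2 - 2))
    se = s * np.sqrt(1 / n1 + 1 / n2)
    df = n1 + n2 - 2
    t_lower = ((m1 - m2) + delta) / se    # H0: diff <= -delta
    t_upper = ((m1 - m2) - delta) / se    # H0: diff >= +delta
    return max(1 - stats.t.cdf(t_lower, df), stats.t.cdf(t_upper, df))

# Hypothetical assay results (% label claim) from sending and receiving labs
p = tost([99.1, 100.2, 99.8, 100.5, 99.6],
         [99.4, 100.9, 100.1, 99.9, 100.4], delta=2.0)
print(f"TOST p = {p:.4f}")   # equivalence declared if p < 0.05
```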
Schmitter, Marc; Kress, Bodo; Leckel, Michael; Henschel, Volkmar; Ohlmann, Brigitte; Rammelsberg, Peter
2008-06-01
This hypothesis-generating study was performed to determine which items in the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) and additional diagnostic tests have the best predictive accuracy for joint-related diagnoses. One hundred forty-nine TMD patients and 43 symptom-free subjects were examined in clinical examinations and with magnetic resonance imaging (MRI). The importance of each variable of the clinical examination for correct joint-related diagnosis was assessed by using MRI diagnoses. For this purpose, "random forest" statistical software (based on classification trees) was used. Maximum unassisted jaw opening, maximum assisted jaw opening, history of locked jaw, joint sound with and without compression, joint pain, facial pain, pain on palpation of the lateral pterygoid area, and overjet proved suitable for distinguishing between subtypes of joint-related TMD. Measurement of excursion, protrusion, and midline deviation were less important. The validity of clinical TMD examination procedures can be enhanced by using the 16 variables of greatest importance identified in this study. In addition to other variables, maximum unassisted and assisted opening and a history of locked jaw were important when assessing the status of the TMJ.
Intermediate/Advanced Research Design and Statistics
NASA Technical Reports Server (NTRS)
Ploutz-Snyder, Robert
2009-01-01
The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and the intermediate/advanced statistical procedures consistent with such designs.
Analyzing Faculty Salaries When Statistics Fail.
ERIC Educational Resources Information Center
Simpson, William A.
The role played by nonstatistical procedures, in contrast to multivariate statistical approaches, in analyzing faculty salaries is discussed. Multivariate statistical methods are usually used to establish or defend against prima facie cases of gender and ethnic discrimination with respect to faculty salaries. These techniques are not applicable,…
Comparing Assessment Methods in Undergraduate Statistics Courses
ERIC Educational Resources Information Center
Baxter, Sarah E.
2017-01-01
The purpose of this study was to compare undergraduate students' academic performance and attitudes about statistics in the context of two different types of assessment structures for an introductory statistics course. One assessment structure used in-class quizzes that emphasized computation and procedural fluency as well as vocabulary…
5 CFR 532.215 - Establishments included in regular appropriated fund surveys.
Code of Federal Regulations, 2010 CFR
2010-01-01
... in surveys shall be selected under standard probability sample selection procedures. In areas with... establishment list drawn under statistical sampling procedures. [55 FR 46142, Nov. 1, 1990] ...
Random forests for classification in ecology
Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.
2007-01-01
Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. © 2007 by the Ecological Society of America.
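A minimal sketch of the workflow described here, using scikit-learn's RandomForestClassifier on synthetic presence/absence data; the predictors and their relationship to presence are invented for illustration, not taken from the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic stand-in for plot-level predictors (e.g., elevation, slope, cover)
X = rng.normal(size=(500, 5))
# Presence depends nonlinearly on two predictors; the rest are noise
y = ((X[:, 0] * X[:, 1] > 0) & (X[:, 2] > -0.5)).astype(int)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
print("CV accuracy:", cross_val_score(rf, X, y, cv=5).mean().round(3))

# Variable importance: noise predictors should rank near the bottom
rf.fit(X, y)
for name, imp in zip(["x0", "x1", "x2", "x3", "x4"], rf.feature_importances_):
    print(f"{name}: {imp:.3f}")
```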
Statistical analogues of thermodynamic extremum principles
NASA Astrophysics Data System (ADS)
Ramshaw, John D.
2018-05-01
As shown by Jaynes, the canonical and grand canonical probability distributions of equilibrium statistical mechanics can be simply derived from the principle of maximum entropy, in which the statistical entropy S = -k_B Σ_i p_i log p_i is maximised subject to constraints on the mean values of the energy E and/or number of particles N in a system of fixed volume V. The Lagrange multipliers associated with those constraints are then found to be simply related to the temperature T and chemical potential μ. Here we show that the constrained maximisation of S is equivalent to, and can therefore be replaced by, the essentially unconstrained minimisation of the obvious statistical analogues of the Helmholtz free energy F = E - TS and the grand potential J = F - μN. Those minimisations are more easily performed than the maximisation of S because they formally eliminate the constraints on the mean values of E and N and their associated Lagrange multipliers. This procedure significantly simplifies the derivation of the canonical and grand canonical probability distributions, and shows that the well known extremum principles for the various thermodynamic potentials possess natural statistical analogues which are equivalent to the constrained maximisation of S.
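For concreteness, the constrained maximisation referred to here proceeds by the standard Lagrange multiplier calculation (textbook steps, reproduced as a sketch for the canonical case):

```latex
\begin{align*}
  &\text{maximise } S = -k_{\mathrm B}\sum_i p_i \ln p_i
   \quad\text{subject to}\quad \sum_i p_i = 1,\qquad \sum_i p_i E_i = E.\\
  &\text{Stationarity of } S - \lambda_0 \textstyle\sum_i p_i - \lambda_1 \sum_i p_i E_i
   \;\Rightarrow\; -k_{\mathrm B}(\ln p_i + 1) - \lambda_0 - \lambda_1 E_i = 0,\\
  &\text{so } p_i = \frac{e^{-\beta E_i}}{Z},\qquad
   Z = \sum_i e^{-\beta E_i},\qquad
   \beta = \frac{\lambda_1}{k_{\mathrm B}} = \frac{1}{k_{\mathrm B}T}.
\end{align*}
```

The paper's point is that minimising the statistical analogue of F = E - TS reproduces this same distribution without carrying the two multipliers explicitly.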
The kappa statistic in rehabilitation research: an examination.
Tooth, Leigh R; Ottenbacher, Kenneth J
2004-08-01
The number and sophistication of statistical procedures reported in medical rehabilitation research is increasing. Application of the principles and methods associated with evidence-based practice has contributed to the need for rehabilitation practitioners to understand quantitative methods in published articles. Outcomes measurement and determination of reliability are areas that have experienced rapid change during the past decade. In this study, distinctions between reliability and agreement are examined. Information is presented on analytical approaches for addressing reliability and agreement with the focus on the application of the kappa statistic. The following assumptions are discussed: (1) kappa should be used with data measured on a categorical scale, (2) the patients or objects categorized should be independent, and (3) the observers or raters must make their measurement decisions and judgments independently. Several issues related to using kappa in measurement studies are described, including use of weighted kappa, methods of reporting kappa, the effect of bias and prevalence on kappa, and sample size and power requirements for kappa. The kappa statistic is useful for assessing agreement among raters, and it is being used more frequently in rehabilitation research. Correct interpretation of the kappa statistic depends on meeting the required assumptions and accurate reporting.
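For reference, unweighted kappa is computed from the observed agreement p_o and the chance agreement p_e expected from the raters' marginal category frequencies, kappa = (p_o - p_e) / (1 - p_e). A small sketch with hypothetical ratings; the data and category labels are invented.

```python
import numpy as np

def cohens_kappa(rater1, rater2):
    """Unweighted Cohen's kappa for two raters on a categorical scale."""
    r1, r2 = np.asarray(rater1), np.asarray(rater2)
    cats = np.union1d(r1, r2)
    # Observed agreement: proportion of identical judgments
    p_o = np.mean(r1 == r2)
    # Chance agreement from each rater's marginal category frequencies
    p_e = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical ordinal category ratings by two independent therapists
a = [3, 2, 3, 4, 2, 3, 1, 4, 3, 2]
b = [3, 2, 4, 4, 2, 3, 2, 4, 3, 2]
print(f"kappa = {cohens_kappa(a, b):.3f}")
```

Note that the three assumptions listed in the abstract (categorical scale, independent subjects, independent raters) are preconditions for this calculation to be interpretable, not properties the formula enforces.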
Accurate mass measurement: terminology and treatment of data.
Brenton, A Gareth; Godfrey, A Ruth
2010-11-01
High-resolution mass spectrometry has become ever more accessible with improvements in instrumentation, such as modern FT-ICR and Orbitrap mass spectrometers. This has resulted in an increase in the number of articles submitted for publication quoting accurate mass data. There is a plethora of terms related to accurate mass analysis that are in current usage, many employed incorrectly or inconsistently. This article is based on a set of notes prepared by the authors for research students and staff in our laboratories as a guide to the correct terminology and basic statistical procedures to apply in relation to mass measurement, particularly for accurate mass measurement. It elaborates on the editorial by Gross in 1994 regarding the use of accurate masses for structure confirmation. We have presented and defined the main terms in use with reference to the International Union of Pure and Applied Chemistry (IUPAC) recommendations for nomenclature and symbolism for mass spectrometry. The correct use of statistics and treatment of data is illustrated as a guide to new and existing mass spectrometry users with a series of examples as well as statistical methods to compare different experimental methods and datasets. Copyright © 2010. Published by Elsevier Inc.
34 CFR 668.46 - Institutional security policies and crime statistics.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... (11) A statement of policy regarding the institution's campus sexual assault programs to prevent sex offenses, and procedures to follow when a sex offense occurs. The statement must include— (i) A description... nonforcible sex offenses; (ii) Procedures students should follow if a sex offense occurs, including procedures...
Cost comparison between uterine-sparing fibroid treatments one year following treatment
2014-01-01
Background To compare one-year all-cause and uterine fibroid (UF)-related direct costs in patients treated with one of the following three uterine-sparing procedures: magnetic resonance-guided focused ultrasound (MRgFUS), uterine artery embolization (UAE) and myomectomy. Methods This retrospective observational cohort study used healthcare claims for several million individuals with healthcare coverage from employers in the MarketScan Database for the period 2003–2010. UF patients aged 25–54 on their first UF procedure (index) date with 366-day baseline experience, 366-day follow-up period, continuous health plan enrollment during baseline and follow-up, and absence of any baseline UF procedures were included in the final sample. Cost outcomes were measured by allowed charges (sum of insurer-paid and patient-paid amounts). UF-related cost was defined as difference in mean cost between study cohorts and propensity-score-matched control cohorts without UF. Multivariate adjustment of cost outcomes was conducted using generalized linear models. Results The study sample comprised 14,426 patients (MRgFUS = 14; UAE = 4,092; myomectomy = 10,320) with a higher percent of older patients in MRgFUS cohort (71% vs. 50% vs. 12% in age-group 45–54, P < 0.001). Adjusted all-cause mean cost was lowest for MRgFUS ($19,763; 95% CI: $10,425-$38,694) followed by myomectomy ($20,407; 95% CI: $19,483-$21,381) and UAE ($25,019; 95% CI: $23,738-$26,376) but without statistical significance. Adjusted UF-related costs were also not significantly different between the three procedures. Conclusions Adjusted all-cause and UF-related costs at one year were not significantly different between patients undergoing MRgFUS, myomectomy and UAE. PMID:25512868
15-Year-Experience of a Knee Arthroscopist
Tatari, Mehmet Hasan; Bektaş, Yunus Emre; Demirkıran, Demirhan; Ellidokuz, Hülya
2014-01-01
Objectives: Arthroscopic knee surgery is an experience-demanding procedure throughout its diagnostic and reconstructive parts. Although the literature suggests that there should be no need for diagnostic arthroscopy today, most arthroscopic surgeons have gained experience and developed their skills with the help of diagnostic arthroscopy and some basic procedures such as debridement and lavage. The purpose of this study was to observe what happened over the 15-year experience of an orthopaedic surgeon who deals with knee arthroscopy. The hypothesis was that the mean age of the patients who had undergone arthroscopic procedures would decrease, the percentage of diagnostic and debridement applications would diminish, and reconstructive procedures would increase. Methods: For this purpose, 959 patients who had undergone knee arthroscopy over 15 years were evaluated retrospectively. The gender, age, operation year and the procedure applied were recorded in an Excel file. The chi-square test was used for statistical evaluation. The patients were divided into three groups according to the year they were operated on: Period 1 included the patients operated on between 1999-2003, Period 2 between 2004-2008 and Period 3 between 2009-2013. According to their ages, the patients were evaluated in three groups: Group 1 included patients ≤ 25 years old, Group 2 those between 26-40 and Group 3 those ≥ 41. Arthroscopic procedures were evaluated in three groups. Group X: meniscectomy, chondral debridement, lavage, synoviectomy, loose body removal. Group Y: ACL and PCL reconstruction, meniscal repair. Group Z: microfracture, lateral release, meniscal normalization, second-look arthroscopy, diagnostic arthroscopy before osteotomy. Results: Among all patients, 60% were male, and Group 3 (45.4%) was the largest group in population. The procedures in Group X were used in most of the operations (59.2%). The number of patients increased gradually throughout the years: 24% in Period 1, 36.6% in Period 2 and 39.4% in Period 3. While the population of Group 3 was higher than the others in the first two periods, Group 2 was the leader in the last period (p < 0.001). While the male/female ratio was statistically insignificant in Periods 1 and 2, the number of males in Period 3 was statistically higher than that of females (p < 0.001). The procedures in Group Y were used significantly more for males in Periods 2 and 3 (p < 0.001). The procedures in Group X were used significantly more for females (p < 0.001), while the ones in Group Y were applied more for males (p < 0.001). Among all arthroscopic procedures, Group X was the leader in Period 1 (85%), but this frequency decreased throughout the years, and the procedures in Group Y increased gradually, more than doubling and constituting more than half of the procedures in Period 3 (p < 0.001). Conclusion: Throughout the years, the age of the patients for whom arthroscopic procedures were done and the percentage of debridement and diagnostic procedures have decreased, while the number of patients and the number of reconstructive procedures, especially for males, have increased. The results were statistically significant. In our opinion, this statistical conclusion must reflect the usual academic development of an orthopaedic surgeon who deals mostly with knee arthroscopy in his daily practice. This may serve as a guide for young arthroscopists.
Quantifying the impact of between-study heterogeneity in multivariate meta-analyses
Jackson, Dan; White, Ian R; Riley, Richard D
2012-01-01
Measures that quantify the impact of heterogeneity in univariate meta-analysis, including the very popular I2 statistic, are now well established. Multivariate meta-analysis, where studies provide multiple outcomes that are pooled in a single analysis, is also becoming more commonly used. The question of how to quantify heterogeneity in the multivariate setting is therefore raised. It is the univariate R2 statistic, the ratio of the variance of the estimated treatment effect under the random and fixed effects models, that generalises most naturally, so this statistic provides our basis. This statistic is then used to derive a multivariate analogue of I2. We also provide a multivariate H2 statistic, the ratio of a generalisation of Cochran's heterogeneity statistic and its associated degrees of freedom, with an accompanying generalisation of the usual I2 statistic. Our proposed heterogeneity statistics can be used alongside all the usual estimates and inferential procedures used in multivariate meta-analysis. We apply our methods to some real datasets and show how our statistics are equally appropriate in the context of multivariate meta-regression, where study level covariate effects are included in the model. Our heterogeneity statistics may be used when applying any procedure for fitting the multivariate random effects model. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22763950
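The univariate quantities being generalised here can be stated compactly: Cochran's Q, H² = Q/df, and I² = (Q - df)/Q truncated at zero. A sketch with hypothetical study data (effects and variances are invented):

```python
import numpy as np

def heterogeneity_stats(effects, variances):
    """Cochran's Q, H^2, and I^2 for a univariate meta-analysis."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v                      # inverse-variance weights
    mu = np.sum(w * y) / np.sum(w)   # fixed-effect pooled estimate
    q = np.sum(w * (y - mu) ** 2)    # Cochran's heterogeneity statistic
    df = len(y) - 1
    h2 = q / df                      # ratio of Q to its degrees of freedom
    i2 = max(0.0, (q - df) / q)      # share of variation due to heterogeneity
    return q, h2, i2

# Hypothetical study effects (log odds ratios) and their variances
q, h2, i2 = heterogeneity_stats([0.2, 0.5, 0.1, 0.6, 0.4],
                                [0.04, 0.06, 0.05, 0.08, 0.03])
print(f"Q = {q:.2f}, H^2 = {h2:.2f}, I^2 = {100 * i2:.1f}%")
```

The paper's contribution is the multivariate generalisation of these ratios, in which the scalar variances are replaced by covariance matrices of the multiple pooled outcomes.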
Applications of satellite image processing to the analysis of Amazonian cultural ecology
NASA Technical Reports Server (NTRS)
Behrens, Clifford A.
1991-01-01
This paper examines the application of satellite image processing towards identifying and comparing resource exploitation among indigenous Amazonian peoples. The use of statistical and heuristic procedures for developing land cover/land use classifications from Thematic Mapper satellite imagery will be discussed along with actual results from studies of relatively small (100 - 200 people) settlements. Preliminary research indicates that analysis of satellite imagery holds great potential for measuring agricultural intensification, comparing rates of tropical deforestation, and detecting changes in resource utilization patterns over time.
Non-statistical effects in bond fission reactions of 1,2-difluoroethane
NASA Astrophysics Data System (ADS)
Schranz, Harold W.; Raff, Lionel M.; Thompson, Donald L.
1991-08-01
A microcanonical, classical variational transition-state theory based on the use of the efficient microcanonical sampling (EMS) procedure is applied to simple bond fission in 1,2-difluoroethane. Comparison is made with results of trajectory calculations performed on the same global potential-energy surface. Agreement between the statistical theory and trajectory results for CC, CF, and CH bond fissions is poor, with differences as large as a factor of 125. Most importantly, at the lower energy studied, 6.0 eV, the statistical calculations predict considerably slower rates than those computed from trajectories. We conclude from these results that the statistical assumptions inherent in the transition-state theory method are not valid for 1,2-difluoroethane, in spite of the fact that the total intramolecular energy transfer rate out of CH and CC normal and local modes is large relative to the bond fission rates. The IVR rate is not globally rapid and the trajectories do not access all of the energetically available phase space uniformly on the timescale of the reactions.
Statistics, Adjusted Statistics, and Maladjusted Statistics.
Kaufman, Jay S
2017-05-01
Statistical adjustment is a ubiquitous practice in all quantitative fields that is meant to correct for improprieties or limitations in observed data, to remove the influence of nuisance variables or to turn observed correlations into causal inferences. These adjustments proceed by reporting not what was observed in the real world, but instead modeling what would have been observed in an imaginary world in which specific nuisances and improprieties are absent. These techniques are powerful and useful inferential tools, but their application can be hazardous or deleterious if consumers of the adjusted results mistake the imaginary world of models for the real world of data. Adjustments require decisions about which factors are of primary interest and which are imagined away, and yet many adjusted results are presented without any explanation or justification for these decisions. Adjustments can be harmful if poorly motivated, and are frequently misinterpreted in the media's reporting of scientific studies. Adjustment procedures have become so routinized that many scientists and readers lose the habit of relating the reported findings back to the real world in which we live.
Normality Tests for Statistical Analysis: A Guide for Non-Statisticians
Ghasemi, Asghar; Zahediasl, Saleh
2012-01-01
Statistical errors are common in scientific literature and about 50% of the published articles have at least one error. The assumption of normality needs to be checked for many statistical procedures, namely parametric tests, because their validity depends on it. The aim of this commentary is to overview checking for normality in statistical analysis using SPSS. PMID:23843808
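As a rough non-SPSS counterpart to the workflow the commentary describes, the Shapiro-Wilk test (with skewness and kurtosis as descriptive backup) can be run in a few lines with scipy; the sample below is simulated, not clinical data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
sample = rng.normal(loc=5.0, scale=1.2, size=80)  # hypothetical lab values

# Shapiro-Wilk: H0 is that the data come from a normal distribution
w, p = stats.shapiro(sample)
verdict = "no evidence against normality" if p > 0.05 else "non-normal"
print(f"W = {w:.3f}, p = {p:.3f} -> {verdict}")

# Skewness and excess kurtosis as supplementary descriptive checks
print(f"skew = {stats.skew(sample):.2f}, "
      f"excess kurtosis = {stats.kurtosis(sample):.2f}")
```

As the commentary notes, formal tests should be paired with visual checks (histograms, Q-Q plots), especially at small sample sizes where the tests have little power.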
Satyarthee, Guru Dutta; Chandra, P. Sarat; Sharma, Bhawani S.; Mehta, V. S.
2017-01-01
Introduction: Computed tomography (CT)-guided stereotactic biopsy (STB) is considered the method of choice for biopsy of intracranial mass lesions. However, its disadvantages are frame fixation, the time required for transportation between the CT scan suite and the operation theater, and much higher equipment cost, particularly in relatively resource-scarce developing countries. Ultrasound-guided biopsy (USGB) is a relatively simpler, more economical, less time-consuming, real-time procedure. Clinical Materials and Methods: Thirty-seven consecutively admitted patients with supratentorial brain tumors, who underwent biopsy of the lesion using a CT-compatible stereotactic or an ultrasound-guided (USGB) procedure, formed the cohort of the study. Based on the location and size of the lesions, the cases were divided into two groups, superficial and deep. Twenty-two patients underwent ultrasound-guided biopsy and 15 underwent STB. Results: The diagnostic yield was 93% for STB and 91% for ultrasound-guided biopsy. The mean operation time was 149 min for the STB group and 94 min for the USGB group, a statistically significant difference. Two cases in each group developed hematoma; however, one case in the USGB group needed surgical evacuation. Real-time monitoring detected two hematomas intraoperatively, which were also confirmed on postoperative CT scans of the head. Conclusions: The ultrasound-guided biopsy (USGB) procedure was simple, less time-consuming, equally efficacious, and used more economical equipment, and it can act as a safer alternative to the CT STB process for biopsy of intracranial mass lesions. Furthermore, USGB also provided intraoperative real-time monitoring, which gave a cue for close monitoring in the postoperative period after completion of the biopsy to look for fresh hematoma development, not only at the biopsy site but also along the biopsy track and adjoining area. Perhaps a longer period of ultrasonic monitoring following the procedure would be of greater help to detect hematoma formation, which is one of the most common complications of the biopsy procedure. PMID:29114280
Patrick, Hannah; Sims, Andrew; Burn, Julie; Bousfield, Derek; Colechin, Elaine; Reay, Christopher; Alderson, Neil; Goode, Stephen; Cunningham, David; Campbell, Bruce
2013-03-01
New devices and procedures are often introduced into health services when the evidence base for their efficacy and safety is limited. The authors sought to assess the availability and accuracy of routinely collected Hospital Episodes Statistics (HES) data in the UK and their potential contribution to the monitoring of new procedures. Four years of HES data (April 2006-March 2010) were analysed to identify episodes of hospital care involving a sample of 12 new interventional procedures. HES data were cross checked against other relevant sources including national or local registers and manufacturers' information. HES records were available for all 12 procedures during the entire study period. Comparative data sources were available from national (5), local (2) and manufacturer (2) registers. Factors found to affect comparisons were miscoding, alternative coding and inconsistent use of subsidiary codes. The analysis of provider coverage showed that HES is sensitive at detecting centres which carry out procedures, but specificity is poor in some cases. Routinely collected HES data have the potential to support quality improvements and evidence-based commissioning of devices and procedures in health services but achievement of this potential depends upon the accurate coding of procedures.
Defining the ecological hydrology of Taiwan Rivers using multivariate statistical methods
NASA Astrophysics Data System (ADS)
Chang, Fi-John; Wu, Tzu-Ching; Tsai, Wen-Ping; Herricks, Edwin E.
2009-09-01
The identification and verification of ecohydrologic flow indicators has found new support as the importance of ecological flow regimes is recognized in modern water resources management, particularly in river restoration and reservoir management. An ecohydrologic indicator system reflecting the unique characteristics of Taiwan's water resources and hydrology has been developed, the Taiwan ecohydrological indicator system (TEIS). A major challenge for the water resources community is using the TEIS to provide environmental flow rules that improve existing water resources management. This paper examines data from the extensive network of flow monitoring stations in Taiwan using TEIS statistics to define and refine environmental flow options in Taiwan. Multivariate statistical methods were used to examine TEIS statistics for 102 stations representing the geographic and land use diversity of Taiwan. The Pearson correlation coefficient showed high multicollinearity between the TEIS statistics. Watersheds were separated into upper- and lower-watershed locations. An analysis of variance indicated significant differences between upstream, more natural, and downstream, more developed, locations in the same basin, with hydrologic indicator redundancy in flow change and magnitude statistics. Issues of multicollinearity were examined using a Principal Component Analysis (PCA), with the first three components related to general flow and high/low flow statistics, frequency and time statistics, and quantity statistics. These principal components explained about 85% of the total variation. A major conclusion is that managers must be aware of differences among basins, as well as differences within basins, that will require careful selection of management procedures to achieve needed flow regimes.
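The PCA step can be reproduced in outline as follows; the indicator matrix here is synthetic and merely mimics the 102-station layout and the strong intercorrelation the authors report.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)

# Synthetic stand-in: 102 stations x 8 ecohydrologic indicator statistics
# (e.g., high/low flow magnitudes, frequencies, timings), built to be correlated
base = rng.normal(size=(102, 3))
indicators = np.hstack([base,
                        base @ rng.normal(size=(3, 5))
                        + 0.3 * rng.normal(size=(102, 5))])

# Standardize the indicators, then extract principal components
pca = PCA().fit(StandardScaler().fit_transform(indicators))
cum = np.cumsum(pca.explained_variance_ratio_)
print("cumulative variance explained by first 3 PCs:", cum[:3].round(3))
```

Standardizing first matters: the TEIS statistics mix units (flow magnitudes, counts, dates), and PCA on raw values would be dominated by whichever indicator has the largest numeric scale.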
78 FR 63568 - Proposed Collection; Comment Request for Rev. Proc. 2007-35
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-24
... Revenue Procedure 2007-35, Statistical Sampling for purposes of Section 199. DATES: Written comments... . SUPPLEMENTARY INFORMATION: Title: Statistical Sampling for purposes of Section 199. OMB Number: 1545-2072... statistical sampling may be used in purposes of section 199, which provides a deduction for income...
2015-03-26
to my reader, Lieutenant Colonel Robert Overstreet, for helping solidify my research, coaching me through the statistical analysis, and positive... Descriptive Statistics... common-method bias requires careful assessment of potential sources of bias and implementing procedural and statistical control methods. Podsakoff
Explorations in Statistics: Permutation Methods
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2012-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eighth installment of "Explorations in Statistics" explores permutation methods, empiric procedures we can use to assess an experimental result--to test a null hypothesis--when we are reluctant to trust statistical…
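A permutation test in its simplest form: pool the observations, repeatedly reshuffle the group labels, and ask how often a relabelled difference is at least as extreme as the observed one. A sketch with invented data, not an example from the article:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical experimental result: two small treatment groups
control  = np.array([4.2, 3.9, 5.1, 4.8, 4.0, 4.5])
treated  = np.array([5.0, 5.4, 4.9, 5.8, 5.2, 5.6])
observed = treated.mean() - control.mean()

# Build the null distribution by randomly reassigning group labels
pooled = np.concatenate([control, treated])
n_ctrl, n_perm, count = len(control), 10_000, 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)
    diff = perm[n_ctrl:].mean() - perm[:n_ctrl].mean()
    if abs(diff) >= abs(observed):  # two-sided comparison
        count += 1

print(f"observed difference = {observed:.2f}, "
      f"permutation p = {count / n_perm:.4f}")
```

The appeal, as the installment emphasizes, is that no distributional model is assumed: the reference distribution is generated empirically from the data themselves.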
Teaching Classical Statistical Mechanics: A Simulation Approach.
ERIC Educational Resources Information Center
Sauer, G.
1981-01-01
Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)
Illustrating Sampling Distribution of a Statistic: Minitab Revisited
ERIC Educational Resources Information Center
Johnson, H. Dean; Evans, Marc A.
2008-01-01
Understanding the concept of the sampling distribution of a statistic is essential for the understanding of inferential procedures. Unfortunately, this topic proves to be a stumbling block for students in introductory statistics classes. In efforts to aid students in their understanding of this concept, alternatives to a lecture-based mode of…
Statistical baseline assessment in cardiotocography.
Agostinelli, Angela; Braccili, Eleonora; Marchegiani, Enrico; Rosati, Riccardo; Sbrollini, Agnese; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura
2017-07-01
Cardiotocography (CTG) is the most common non-invasive diagnostic technique to evaluate fetal well-being. It consists in the recording of fetal heart rate (FHR; bpm) and maternal uterine contractions. Among the main parameters characterizing FHR, the baseline (BL) is fundamental to determine fetal hypoxia and distress. In computerized applications, BL is typically computed as mean FHR ± ΔFHR, with ΔFHR = 8 bpm or ΔFHR = 10 bpm, both values being experimentally fixed. In this context, the present work aims: to propose a statistical procedure for ΔFHR assessment; to quantitatively determine the ΔFHR value by applying such a procedure to clinical data; and to compare the statistically determined ΔFHR value against the experimentally determined ΔFHR values. To these aims, the 552 recordings of the "CTU-UHB intrapartum CTG database" from Physionet were submitted to an automatic procedure, which consisted in an FHR preprocessing phase and a statistical BL assessment. During preprocessing, FHR time series were divided into 20-min sliding windows, in which missing data were removed by linear interpolation. Only windows with a correction rate lower than 10% were further processed for BL assessment, according to which ΔFHR was computed as the FHR standard deviation. The total number of accepted windows was 1192 (38.5%) over 383 recordings (69.4%) with at least one accepted window. The statistically determined ΔFHR value was 9.7 bpm. This value was statistically different from 8 bpm (P < 10^-19) but not from 10 bpm (P = 0.16). Thus, ΔFHR = 10 bpm is preferable over 8 bpm because it is both experimentally and statistically validated.
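A sketch of the windowing-and-SD step as described in the abstract (20-min windows, linear interpolation of gaps, rejection above a 10% correction rate). The 4 Hz sampling rate and the synthetic trace are our assumptions for illustration, not details from the paper.

```python
import numpy as np

def window_delta_fhr(fhr, fs=4.0, win_min=20, max_missing=0.10):
    """Per-window baseline stats: mean FHR and Delta-FHR as the window SD.

    Windows needing more than `max_missing` interpolated samples are
    rejected, mirroring the 10% correction-rate criterion in the abstract.
    """
    win = int(win_min * 60 * fs)
    results = []
    for start in range(0, len(fhr) - win + 1, win):
        seg = fhr[start:start + win].astype(float)
        missing = np.isnan(seg)
        if missing.mean() > max_missing:
            continue  # too much correction needed; skip this window
        if missing.any():  # linear interpolation over the gaps
            idx = np.arange(win)
            seg[missing] = np.interp(idx[missing], idx[~missing], seg[~missing])
        results.append((seg.mean(), seg.std(ddof=1)))  # (BL centre, Delta-FHR)
    return results

# Synthetic FHR trace (bpm) at 4 Hz with scattered signal loss
rng = np.random.default_rng(5)
fhr = 140 + 8 * rng.standard_normal(4 * 60 * 60)  # one hour of signal
fhr[rng.random(fhr.size) < 0.02] = np.nan          # ~2% dropouts
for bl, dfhr in window_delta_fhr(fhr):
    print(f"BL = {bl:.1f} +/- {dfhr:.1f} bpm")
```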
The sumLINK statistic for genetic linkage analysis in the presence of heterogeneity.
Christensen, G B; Knight, S; Camp, N J
2009-11-01
We present the "sumLINK" statistic--the sum of multipoint LOD scores for the subset of pedigrees with nominally significant linkage evidence at a given locus--as an alternative to common methods to identify susceptibility loci in the presence of heterogeneity. We also suggest the "sumLOD" statistic (the sum of positive multipoint LOD scores) as a companion to the sumLINK. sumLINK analysis identifies genetic regions of extreme consistency across pedigrees without regard to negative evidence from unlinked or uninformative pedigrees. Significance is determined by an innovative permutation procedure based on genome shuffling that randomizes linkage information across pedigrees. This procedure for generating the empirical null distribution may be useful for other linkage-based statistics as well. Using 500 genome-wide analyses of simulated null data, we show that the genome shuffling procedure results in the correct type 1 error rates for both the sumLINK and sumLOD. The power of the statistics was tested using 100 sets of simulated genome-wide data from the alternative hypothesis from GAW13. Finally, we illustrate the statistics in an analysis of 190 aggressive prostate cancer pedigrees from the International Consortium for Prostate Cancer Genetics, where we identified a new susceptibility locus. We propose that the sumLINK and sumLOD are ideal for collaborative projects and meta-analyses, as they do not require any sharing of identifiable data between contributing institutions. Further, loci identified with the sumLINK have good potential for gene localization via statistical recombinant mapping, as, by definition, several linked pedigrees contribute to each peak.
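In outline, the two statistics reduce to thresholded sums over per-pedigree multipoint LOD scores. In the sketch below, the nominal-significance cutoff of LOD ≈ 0.588 (pointwise p ≈ 0.05 for a one-sided LOD test) is our assumption for illustration, not a value quoted from the paper.

```python
import numpy as np

def sumlink_sumlod(lods, nominal_lod=0.588):
    """sumLINK: sum of LODs over pedigrees with nominally significant
    linkage at the locus. sumLOD: sum of all positive LODs.
    The default cutoff ~0.588 corresponds to chi2 = 2*ln(10)*LOD = 2.71,
    i.e. pointwise p = 0.05 for a one-sided test (an assumption here)."""
    lods = np.asarray(lods, float)
    sumlink = lods[lods >= nominal_lod].sum()
    sumlod = lods[lods > 0].sum()
    return sumlink, sumlod

# Hypothetical multipoint LOD scores for ten pedigrees at one locus
lods = [1.2, -0.4, 0.7, 0.1, 2.3, -1.1, 0.9, -0.2, 0.05, 1.6]
print(sumlink_sumlod(lods))
```

The design goal the authors highlight is visible in the code: unlinked pedigrees with negative LODs simply drop out of both sums instead of cancelling the signal from linked families, and only per-pedigree LOD curves (not identifiable genotype data) need to be shared.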
Chaudhry-Waterman, Nadia; Coombs, Sandra; Porras, Diego; Holzer, Ralf; Bergersen, Lisa
2014-01-01
The broad range of relatively rare procedures performed in pediatric cardiac catheterization laboratories has made the standardization of care and risk assessment in the field statistically quite problematic. However, with the growing number of patients who undergo cardiac catheterization, it has become imperative that the cardiology community overcomes these challenges to study patient outcomes. The Congenital Cardiac Catheterization Project on Outcomes was able to develop benchmarks, tools for measurement, and risk adjustment methods while exploring procedural efficacy. Based on the success of these efforts, the collaborative is pursuing a follow-up project, the Congenital Cardiac Catheterization Project on Outcomes-Quality Improvement, aimed at improving the outcomes for all patients undergoing catheterization for congenital heart disease by reducing radiation exposure.
International labour migration statistics in Asia: an appraisal.
Athukorala, P C; Wickramasekara, P
1996-01-01
"The present paper attempts a critical review of the data systems of seven major labour-exporting countries--Bangladesh, India, Indonesia, Pakistan, Philippines, Sri Lanka and Thailand--which account for over 90 per cent of labour outflows from Asia....Data...are discussed under separate sections focusing on limitations as well as potential for further exploitation.... For all countries reviewed here, these data significantly understate total labour outflows, and the magnitude of the error seems to vary between countries and reflect both differences relating to the coverage and efficiency of the approval and monitoring procedure. This throws serious doubts on the appropriateness of official outmigration series for cross country comparison. Frequent changes in reporting procedures also make for discrete changes and spurious shifts in data which render trend analysis quite hazardous." (SUMMARY IN FRE AND SPA) excerpt
Dent, Andrew W; Weiland, Tracey J; Paltridge, Debbie
2008-06-01
To report the preferences of Fellows of the Australasian College for Emergency Medicine for topics they would desire for their continuing professional development (CPD). A mailed survey of Fellows of the Australasian College for Emergency Medicine asked for Likert-type responses on the desirability of CPD on 15 procedural skills, 13 management skills, 11 clinical emergency topics, 9 topics related to teaching, 7 related to diagnostics and 5 evidence-based practice topics. CPD in the procedural skills of advanced and surgical airways, ED ultrasound, ventilation skills, plastic procedures and regional anaesthesia was nominated as desirable by 85% of emergency physicians (EPs). More than 90% desired CPD in ophthalmological, otorhinolaryngeal, neonatal and paediatric emergencies. Of diagnostic skills, more than 80% considered CPD on computerized tomography, electrocardiography and plain X-ray interpretation desirable, as well as CPD about teaching in general, simulation and preparing candidates for fellowship exams. Of the 12 management skills, 11 were seen as desirable topics by more than 70%, with counter-disaster planning, giving feedback and dealing with complaints the most popular. All evidence-based practice skills, including interpreting statistics and undertaking literature searches, were seen as desirable topics by more than 80% of EPs. This information may assist in the planning of future educational interventions for emergency physicians. EPs seek CPD on management, educational and other non-clinical skills, as well as topics relating directly to patient care.
NASA Technical Reports Server (NTRS)
Colvin, E. L.; Emptage, M. R.
1992-01-01
The breaking load test provides quantitative stress corrosion cracking data by determining the residual strength of tension specimens that have been exposed to corrosive environments. Eight laboratories have participated in a cooperative test program under the auspices of ASTM Committee G-1 to evaluate the new test method. All eight laboratories were able to distinguish between three tempers of aluminum alloy 7075. The statistical analysis procedures that were used in the test program do not work well in all situations. An alternative procedure using Box-Cox transformations shows a great deal of promise. An ASTM standard method has been drafted which incorporates the Box-Cox procedure.
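The Box-Cox step can be illustrated briefly: the transformation picks the power λ by maximum likelihood so that the transformed data are closer to normal, after which standard parametric comparisons between tempers are defensible. The data below are simulated, not results from the ASTM program.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Hypothetical right-skewed residual-strength data (ksi) for one temper
strengths = rng.lognormal(mean=4.0, sigma=0.4, size=40)

# Box-Cox chooses lambda by maximum likelihood; the transformed data
# are closer to normal, so confidence limits on means behave well
transformed, lam = stats.boxcox(strengths)
print(f"lambda = {lam:.3f}")
print(f"Shapiro-Wilk p before: {stats.shapiro(strengths).pvalue:.3f}, "
      f"after: {stats.shapiro(transformed).pvalue:.3f}")
```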
NASA Astrophysics Data System (ADS)
Mehrotra, Rajeshwar; Sharma, Ashish
2012-12-01
The quality of the absolute estimates of general circulation models (GCMs) calls into question the direct use of GCM outputs for climate change impact assessment studies, particularly at regional scales. Statistical correction of GCM output is often necessary when significant systematic biases occur between the modeled output and observations. A common procedure is to correct the GCM output by removing the systematic biases in low-order moments relative to observations or to reanalysis data at daily, monthly, or seasonal timescales. In this paper, we present an extension of a recently published nested bias correction (NBC) technique to correct for the low- as well as higher-order moment biases in the GCM-derived variables across selected multiple timescales. The proposed recursive nested bias correction (RNBC) approach offers an improved basis for applying bias correction at multiple timescales over the original NBC procedure. The method ensures that the bias-corrected series exhibits improvements that are consistently spread over all of the timescales considered. Different variations of the approach, starting from the standard NBC to the more complex recursive alternatives, are tested to assess their impacts on a range of GCM-simulated atmospheric variables of interest in downscaling applications related to hydrology and water resources. Results of the study suggest that three- to five-iteration RNBCs are the most effective in removing distributional and persistence-related biases across the timescales considered.
A conceptual weather-type classification procedure for the Philadelphia, Pennsylvania, area
McCabe, Gregory J.
1990-01-01
A simple method of weather-type classification, based on a conceptual model of pressure systems that pass through the Philadelphia, Pennsylvania, area, has been developed. The only inputs required for the procedure are daily mean wind direction and cloud cover, which are used to index the relative position of pressure systems and fronts with respect to Philadelphia. Daily mean wind-direction and cloud-cover data recorded at Philadelphia, Pennsylvania, from January 1954 through August 1988 were used to categorize daily weather conditions. The conceptual weather types reflect changes in daily air and dew-point temperatures, and changes in monthly mean temperature and monthly and annual precipitation. The weather-type classification produced by using the conceptual model was similar to a classification produced by using a multivariate statistical classification procedure. Even though the conceptual weather types are derived from a small amount of data, they appear to account for the variability of daily weather patterns sufficiently to describe distinct weather conditions for use in environmental analyses of weather-sensitive processes.
Protocol for monitoring metals in Ozark National Scenic Riverways, Missouri: Version 1.0
Schmitt, Christopher J.; Brumbaugh, William G.; Besser, John M.; Hinck, Jo Ellen; Bowles, David E.; Morrison, Lloyd W.; Williams, Michael H.
2008-01-01
The National Park Service is developing a monitoring plan for the Ozark National Scenic Riverways in southeastern Missouri. Because of concerns about the release of lead, zinc, and other metals from lead-zinc mining to streams, the monitoring plan will include mining-related metals. After considering a variety of alternatives, the plan will consist of measuring the concentrations of cadmium, cobalt, lead, nickel, and zinc in composite samples of crayfish (Orconectes luteus or alternate species) and Asian clam (Corbicula fluminea) collected periodically from selected sites. This document, which comprises a protocol narrative and supporting standard operating procedures, describes the methods to be employed prior to, during, and after collection of the organisms, along with procedures for their chemical analysis and quality assurance; statistical analysis, interpretation, and reporting of the data; and for modifying the protocol narrative and supporting standard operating procedures. A list of supplies and equipment, data forms, and sample labels are also included. An example based on data from a pilot study is presented.
A method to estimate statistical errors of properties derived from charge-density modelling
Lecomte, Claude
2018-01-01
Estimating uncertainties of property values derived from a charge-density model is not straightforward. A methodology, based on calculation of sample standard deviations (SSD) of properties using randomly deviating charge-density models, is proposed with the MoPro software. The parameter shifts applied in the deviating models are generated in order to respect the variance–covariance matrix issued from the least-squares refinement. This ‘SSD methodology’ procedure can be applied to estimate uncertainties of any property related to a charge-density model obtained by least-squares fitting. This includes topological properties such as critical point coordinates, electron density, Laplacian and ellipticity at critical points and charges integrated over atomic basins. Errors on electrostatic potentials and interaction energies are also available now through this procedure. The method is exemplified with the charge density of compound (E)-5-phenylpent-1-enylboronic acid, refined at 0.45 Å resolution. The procedure is implemented in the freely available MoPro program dedicated to charge-density refinement and modelling. PMID:29724964
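The core of the SSD idea is straightforward to sketch: draw random parameter vectors consistent with the refinement's variance-covariance matrix, propagate each draw through the property calculation, and take the sample standard deviation of the results. The toy "property" below is an invented stand-in for a topological or electrostatic quantity, not a MoPro calculation.

```python
import numpy as np

rng = np.random.default_rng(7)

def property_ssd(p_hat, cov, prop_fn, n_draws=1000):
    """Sample standard deviation of a derived property.

    Draws parameter vectors consistent with the least-squares
    variance-covariance matrix and propagates each through prop_fn.
    """
    draws = rng.multivariate_normal(p_hat, cov, size=n_draws)
    values = np.apply_along_axis(prop_fn, 1, draws)
    return values.mean(), values.std(ddof=1)

# Toy stand-in: a "property" depending nonlinearly on two refined parameters
p_hat = np.array([1.20, 0.35])
cov = np.array([[4e-4, 1e-4],
                [1e-4, 9e-4]])
mean, ssd = property_ssd(p_hat, cov, lambda p: p[0] * np.exp(-p[1]))
print(f"property = {mean:.4f} +/- {ssd:.4f}")
```

The advantage over analytic error propagation is visible here: prop_fn can be arbitrarily nonlinear (critical-point searches, basin integrations) and the off-diagonal covariance terms are carried automatically.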
Futyma, Konrad; Nowakowski, Łukasz; Gałczyński, Krzysztof; Miotła, Paweł; Rechberger, Tomasz
2016-12-01
Patients who fail to achieve continence after a procedure aimed at correcting it require a special approach and precise management, due to the sophisticated anatomical and functional field of interest. The purpose of the present study was to assess long-term clinical efficacy and to evaluate the frequency and severity of any complications related to treating recurrent stress urinary incontinence with periurethral injections of a non-absorbable bulking agent. Between February 2012 and September 2013, 66 patients with recurrent stress urinary incontinence were treated with Urolastic in a tertiary referral gynecologic department. The efficacy of the procedure was assessed objectively at each follow-up visit, scheduled at two and six weeks and 3, 6, 12 and 24 months after the primary procedure. Material was injected under local anesthesia according to the manufacturer's instructions, at the 10, 2, 4 and 8 o'clock positions, with 0.5-1.25 ccm per spot. Statistical analyses were performed with the Statistica package version 8.0 (StatSoft Inc., Tulsa, OK, USA). A p value <0.05 was considered statistically significant. Objective success at 24 months was found in 32.7% of patients, including 22.4% who were completely dry. The efficacy of Urolastic, when considering the intention to treat, is 24.2% and 16.7%, respectively. In 4.5% of patients, an oval-shaped deposit of material was found inside the bladder. Overall, complications were observed in 17 (25.8%) patients. Although only 30% of patients will benefit from Urolastic injection on a long-term basis, it seems to be a safe procedure in the treatment of recurrent stress urinary incontinence. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Raatikainen, M J Pekka; Arnar, David O; Zeppenfeld, Katja; Merino, Jose Luis; Levya, Francisco; Hindriks, Gerhardt; Kuck, Karl-Heinz
2015-01-01
There have been large variations in the use of invasive electrophysiological therapies in the member countries of the European Society of Cardiology (ESC). The aim of this analysis was to provide comprehensive information on cardiac implantable electronic device (CIED) and catheter ablation therapy trends in the ESC countries over the last five years. The European Heart Rhythm Association (EHRA) has collected data on CIED and catheter ablation therapy since 2008. Last year 49 of the 56 ESC member countries provided data for the EHRA White Book. This analysis is based on the current and previous editions of the EHRA White Book. Data on procedure rates together with information on economic aspects, local reimbursement systems and training activities are presented for each ESC country and the five geographical ESC regions. In 2013, the electrophysiological procedure rates per million population were highest in Western Europe, followed by the Southern and Northern European countries. The CIED implantation and catheter ablation rates were lowest in the Eastern European and the non-European ESC countries, respectively. However, in some Eastern European countries with relatively low gross domestic product, procedure rates exceeded those of some wealthier Western countries, suggesting that economic resources are not the only driver of the utilization of arrhythmia therapies. These statistics indicate that despite significant improvements, there is still considerable heterogeneity in the availability of arrhythmia therapies across the ESC area. Hopefully, these data will help identify areas for improvement and guide future activities in cardiac arrhythmia management. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.
Economic Statistics and Information Concerning the Japanese Auto Industry
DOT National Transportation Integrated Search
1980-12-01
The report examines the following aspects of the Japanese automobile Industry: Identification of Japanese agencies that receive statistical data on the automobile industry; Determination of research and development and capital investment procedures; ...
New robust statistical procedures for the polytomous logistic regression models.
Castilla, Elena; Ghosh, Abhik; Martin, Nirian; Pardo, Leandro
2018-05-17
This article derives a new family of estimators, namely the minimum density power divergence estimators, as a robust generalization of the maximum likelihood estimator for the polytomous logistic regression model. Based on these estimators, a family of Wald-type test statistics for linear hypotheses is introduced. Robustness properties of both the proposed estimators and the test statistics are theoretically studied through the classical influence function analysis. Appropriate real-life examples are presented to justify the requirement of suitable robust statistical procedures in place of the likelihood-based inference for the polytomous logistic regression model. The validity of the theoretical results established in the article is further confirmed empirically through suitable simulation studies. Finally, an approach for the data-driven selection of the robustness tuning parameter is proposed with empirical justifications. © 2018, The International Biometric Society.
Improved Statistics for Determining the Patterson Symmetry fromUnmerged Diffraction Intensities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sauter, Nicholas K.; Grosse-Kunstleve, Ralf W.; Adams, Paul D.
We examine procedures for detecting the point-group symmetry of macromolecular datasets and propose enhancements. To validate a point-group, it is sufficient to compare pairs of Bragg reflections that are related by each of the group's component symmetry operators. Correlation is commonly expressed in the form of a single statistical quantity (such as Rmerge) that incorporates information from all of the observed reflections. However, the usual practice of weighting all pairs of symmetry-related intensities equally can obscure the fact that the various symmetry operators of the point-group contribute differing fractions of the total set. In some cases where particular symmetry elements are significantly under-represented, statistics calculated globally over all observations do not permit conclusions about the point-group and Patterson symmetry. The problem can be avoided by repartitioning the data in a way that explicitly takes note of individual operators. The new analysis methods, incorporated into the program LABELIT (cci.lbl.gov/labelit), can be performed early enough during data acquisition, and are quick enough, that it is feasible to pause to optimize the data collection strategy.
Type I and type II residual stress in iron meteorites determined by neutron diffraction measurements
NASA Astrophysics Data System (ADS)
Caporali, Stefano; Pratesi, Giovanni; Kabra, Saurabh; Grazzi, Francesco
2018-04-01
In this work we present a preliminary investigation by means of neutron diffraction experiments to determine the residual stress state in three different iron meteorites (Chinga, Sikhote Alin and Nantan). Because of the very peculiar microstructural characteristics of this class of samples, all the systematic effects related to the measuring procedure - such as crystallite size and composition - were taken into account, and a clear differentiation in the statistical distribution of residual stress between coarse- and fine-grained meteorites was highlighted. Moreover, the residual stress state was statistically analysed in three orthogonal directions, finding evidence of the existence of both type I and type II residual stress components. Finally, the application of the von Mises approach allowed us to determine the distribution of the type II stress.
Patidar, Gopal Kumar; Sharma, Ratti Ram; Marwaha, Neelam
2013-10-01
Although automated cell separators have undergone a lot of technical refinements, attention has been focused more on the quality of platelet concentrates than on donor safety. We planned this prospective study to look into the donor safety aspect by studying adverse events in normal healthy plateletpheresis donors. The study included 500 healthy, first-time (n=301) and repeat (n=199) plateletpheresis donors after informed consent. The plateletpheresis procedures were performed on Trima Accel (5.1 version, GAMBRO BCT) and Amicus (3.2 version, FENWAL) cell separators. The adverse events during the procedure were recorded and classified according to their nature. The pre- and post-procedure hematological and biochemical profiles of these donors were also assessed with the help of an automated cell counter and analyser, respectively. Adverse events were recorded in 18% (n=90) of the 500 plateletpheresis donors, of which 9% were hypocalcemia, followed by hematoma (7.4%), vasovagal reaction (0.8%) and kit-related adverse events (0.8%). There was a significant post-procedure drop in the Hb, Hct and platelet count of the donors (p<0.0001), whereas the WBC count showed a statistically significant rise (p<0.0001). Divalent cations (iCa(+), TCa(+), TMg(+)) also showed a statistically significant decline after donation (p<0.0001). However, there was no statistically significant difference in adverse events between the Trima Accel (5.1 version, GAMBRO BCT) and Amicus (3.2 version, FENWAL) cell separators. Donor reactions can adversely affect voluntary donor recruitment strategies to increase public awareness regarding the constant need for blood and blood products. Commonly observed adverse events in plateletpheresis donors were hypocalcemia, hematoma formation and vasovagal reactions, which can be prevented by pre-donation education of the donors and change of machine configuration. Nevertheless, more prospective studies on this aspect are required in order to establish guidelines for donor safety in apheresis and also to help in assessing donor suitability, especially given the present trend of double-product apheresis collections. Copyright © 2013 Elsevier Ltd. All rights reserved.
A Procedure To Detect Test Bias Present Simultaneously in Several Items.
ERIC Educational Resources Information Center
Shealy, Robin; Stout, William
A statistical procedure is presented that is designed to test for unidirectional test bias existing simultaneously in several items of an ability test, based on the assumption that test bias is incipient within the two groups' ability differences. The proposed procedure--Simultaneous Item Bias (SIB)--is based on a multidimensional item response…
The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the NHEXAS data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by t...
Easterday, Thomas S; Moore, Joshua W; Redden, Meredith H; Feliciano, David V; Henderson, Vernon J; Humphries, Timothy; Kohler, Katherine E; Ramsay, Philip T; Spence, Stanston D; Walker, Mark; Wyrzykowski, Amy D
2017-07-01
Percutaneous tracheostomy is a safe and effective bedside procedure. Some advocate the use of bronchoscopy during the procedure to reduce the rate of complications. We evaluated our complication rate in trauma patients undergoing percutaneous tracheostomy with and without bronchoscopic guidance to ascertain if there was a difference in the rate of complications. A retrospective review of all tracheostomies performed in critically ill trauma patients was performed using the trauma registry from an urban, Level I Trauma Center. Bronchoscopy assistance was used based on surgeon preference. Standard statistical methodology was used to determine if there was a difference in complication rates for procedures performed with and without the bronchoscope. From January 2007 to April 2016, 649 patients underwent modified percutaneous tracheostomy; 289 with the aid of a bronchoscope and 360 without. There were no statistically significant differences in any type of complication regardless of utilization of a bronchoscope. The addition of bronchoscopy provides several theoretical benefits when performing percutaneous tracheostomy. Our findings, however, do not demonstrate a statistically significant difference in complications between procedures performed with and without a bronchoscope. Use of the bronchoscope should, therefore, be left to the discretion of the performing physician.
Stepaniak, Pieter S; Soliman Hamad, Mohamed A; Dekker, Lukas R C; Koolen, Jacques J
2014-01-01
In this study, we sought to analyze the stochastic behavior of catheterization laboratory (Cath Lab) procedures in our institution. Statistical models may help to improve estimated case durations to support management in the cost-effective use of expensive surgical resources. We retrospectively analyzed all the procedures performed in the Cath Labs in 2012. The duration of procedures is strictly positive (larger than zero) and mostly has a large minimum duration. Because of the strictly positive character of Cath Lab procedures, a fit of a lognormal model may be desirable. Having a minimum duration requires an estimate of the threshold (shift) parameter of the lognormal model; therefore, the 3-parameter lognormal model is of interest. To avoid heterogeneous groups of observations, we tested every group-cardiologist-procedure combination against the normal, 2-parameter and 3-parameter lognormal distributions. The total number of elective and emergency procedures performed was 6,393 (8,186 h). The final analysis included 6,135 procedures (7,779 h). Electrophysiology (intervention) procedures fit the 3-parameter lognormal model in 86.1% (80.1%) of cases. Using Friedman test statistics, we conclude that the 3-parameter lognormal model is superior to the 2-parameter lognormal model. Furthermore, the 2-parameter lognormal model is superior to the normal model. Cath Lab procedures are well modelled by lognormal models. This information helps to improve and refine Cath Lab schedules and hence their efficient use.
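In scipy's parameterization the threshold (shift) of the lognormal is the loc parameter, so the 2- and 3-parameter fits differ only in whether loc is fixed at zero. A sketch on synthetic durations with a built-in 25-minute floor (an invented value, not from the study):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# Synthetic procedure durations (minutes) with a 25-minute minimum setup time
durations = 25.0 + rng.lognormal(mean=3.5, sigma=0.5, size=300)

# 3-parameter lognormal: shape s, threshold loc, scale exp(mu)
s3, loc3, scale3 = stats.lognorm.fit(durations)
# 2-parameter lognormal: threshold fixed at zero
s2, loc2, scale2 = stats.lognorm.fit(durations, floc=0)

# Compare the fits via log-likelihood (higher is better)
ll3 = stats.lognorm.logpdf(durations, s3, loc3, scale3).sum()
ll2 = stats.lognorm.logpdf(durations, s2, loc2, scale2).sum()
print(f"threshold estimate: {loc3:.1f} min, "
      f"logL 3p = {ll3:.1f}, logL 2p = {ll2:.1f}")
```

One caveat worth noting: maximum-likelihood estimation of the threshold parameter can be numerically delicate when the smallest observation sits close to the true shift, which is one reason the paper fits and tests each homogeneous group separately.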
Round-off errors in cutting plane algorithms based on the revised simplex procedure
NASA Technical Reports Server (NTRS)
Moore, J. E.
1973-01-01
This report statistically analyzes computational round-off errors associated with the cutting plane approach to solving linear integer programming problems. Cutting plane methods require that the inverse of a sequence of matrices be computed. The problem basically reduces to one of minimizing round-off errors in the sequence of inverses. Two procedures for minimizing this problem are presented, and their influence on error accumulation is statistically analyzed. One procedure employs a very small tolerance factor to round computed values to zero. The other procedure is a numerical analysis technique for reinverting or improving the approximate inverse of a matrix. The results indicated that round-off accumulation can be effectively minimized by employing a tolerance factor which reflects the number of significant digits carried for each calculation and by applying the reinversion procedure once to each computed inverse. If 18 significant digits plus an exponent are carried for each variable during computations, then a tolerance value of 0.1 × 10^-12 is reasonable.
Evaluation of procedures for quality assurance specifications
DOT National Transportation Integrated Search
2004-10-01
The objective of this project was to develop a comprehensive quality assurance (QA) manual, supported by scientific evidence and statistical theory, which provides step-by-step procedures and instructions for developing effective and efficient QA spe...
Feder, Paul I; Ma, Zhenxu J; Bull, Richard J; Teuschler, Linda K; Rice, Glenn
2009-01-01
In chemical mixtures risk assessment, the use of dose-response data developed for one mixture to estimate risk posed by a second mixture depends on whether the two mixtures are sufficiently similar. While evaluations of similarity may be made using qualitative judgments, this article uses nonparametric statistical methods based on the "bootstrap" resampling technique to address the question of similarity among mixtures of chemical disinfectant by-products (DBP) in drinking water. The bootstrap resampling technique is a general-purpose, computer-intensive approach to statistical inference that substitutes empirical sampling for theoretically based parametric mathematical modeling. Nonparametric, bootstrap-based inference involves fewer assumptions than parametric normal theory based inference. The bootstrap procedure is appropriate, at least in an asymptotic sense, whether or not the parametric, distributional assumptions hold, even approximately. The statistical analysis procedures in this article are initially illustrated with data from 5 water treatment plants (Schenck et al., 2009), and then extended using data developed from a study of 35 drinking-water utilities (U.S. EPA/AMWA, 1989), which permits inclusion of a greater number of water constituents and increased structure in the statistical models.
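For readers unfamiliar with the technique, a minimal nonparametric bootstrap in R looks like the following; the concentrations are hypothetical, and the authors' resampling scheme for comparing mixtures is considerably more elaborate.

```r
# Percentile bootstrap confidence interval for a mean, with no
# parametric distributional assumption.
set.seed(1)
conc <- c(12.1, 9.8, 15.3, 11.0, 13.7, 8.9, 14.2, 10.5)  # hypothetical ug/L
boot_means <- replicate(5000, mean(sample(conc, replace = TRUE)))
quantile(boot_means, c(0.025, 0.975))  # 95% percentile interval
```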
Impact of Uncertainty on the Porous Media Description in the Subsurface Transport Analysis
NASA Astrophysics Data System (ADS)
Darvini, G.; Salandin, P.
2008-12-01
In the modelling of flow and transport phenomena in naturally heterogeneous media, the spatial variability of hydraulic properties, typically the hydraulic conductivity, is generally described by a variogram of constant sill and spatial correlation. While some analyses reported in the literature discuss spatial inhomogeneity related to a trend in the mean hydraulic conductivity, the effect on flow and transport of an inexact definition of the spatial statistical properties of the media has, to our knowledge, never been taken into account. The relevance of this topic is manifest: it is related to the uncertainty in the definition of spatial moments of hydraulic log-conductivity from a (usually) small number of data, as well as to the modelling of flow and transport processes by the Monte Carlo technique, whose numerical fields have poor ergodic properties and are not strictly statistically homogeneous. In this work we investigate the effects of mean log-conductivity (logK) field behaviours that differ from the constant one due to different sources of inhomogeneity: i) a deterministic trend; ii) a deterministic sinusoidal pattern; iii) a random behaviour deriving from the hierarchical sedimentary architecture of porous formations; and iv) a conditioning procedure on available measurements of the hydraulic conductivity. These mean log-conductivity behaviours are superimposed on a correlated, weakly fluctuating logK field. The time evolution of the spatial moments of the plume driven by a statistically inhomogeneous steady-state random velocity field is analyzed in a 2-D finite domain, taking into account different sizes of injection area. The problem is approached both by a classical Monte Carlo procedure and by the SFEM (stochastic finite element method). With the latter, the moments are obtained by space-time integration of the velocity field covariance structure derived according to a first-order Taylor series expansion. Two goals are foreseen: 1) from the results it will be possible to distinguish, in all the cases considered, the contribution to plume dispersion of the uncertainty in the statistics of the medium's hydraulic properties, and 2) we will try to highlight the loss of performance that seems to affect first-order approaches for transport phenomena taking place in hierarchical architectures of porous formations.
Mittal, Manish; Harrison, Donald L; Thompson, David M; Miller, Michael J; Farmer, Kevin C; Ng, Yu-Tze
2016-01-01
While the choice of analytical approach affects study results and their interpretation, there is no consensus to guide the choice of statistical approaches for evaluating public health policy change. This study compared and contrasted three statistical estimation procedures in the assessment of the effect of a U.S. Food and Drug Administration (FDA) suicidality warning, communicated in January 2008 and implemented in May 2009, on antiepileptic drug (AED) prescription claims. Longitudinal designs were utilized to evaluate Oklahoma (U.S. State) Medicaid claim data from January 2006 through December 2009. The study included 9289 continuously eligible individuals with prevalent diagnoses of epilepsy and/or psychiatric disorder. Segmented regression models using three estimation procedures [i.e., generalized linear models (GLM), generalized estimating equations (GEE), and generalized linear mixed models (GLMM)] were used to estimate trends of AED prescription claims across three time periods: before (January 2006-January 2008); during (February 2008-May 2009); and after (June 2009-December 2009) the FDA warning. All three statistical procedures estimated an increasing trend (P < 0.0001) in AED prescription claims before the FDA warning period. None of the procedures detected a significant change in trend during (GLM: -30.0%, 99% CI: -60.0% to 10.0%; GEE: -20.0%, 99% CI: -70.0% to 30.0%; GLMM: -23.5%, 99% CI: -58.8% to 1.2%) or after (GLM: 50.0%, 99% CI: -70.0% to 160.0%; GEE: 80.0%, 99% CI: -20.0% to 200.0%; GLMM: 47.1%, 99% CI: -41.2% to 135.3%) the FDA warning when compared with the pre-warning period. Although the three procedures provided consistent inferences, the GEE and GLMM approaches accounted appropriately for correlation. Further, marginal models estimated using GEE produced more robust and valid population-level estimations. Copyright © 2016 Elsevier Inc. All rights reserved.
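A hedged sketch of the GEE variant of such a segmented regression is shown below in R (geepack); the variable names, simulated counts, and working correlation are our assumptions, not the authors' specification.

```r
library(geepack)

# Simulated monthly AED claim counts for 50 patients over 48 months,
# with indicator variables marking the during- and post-warning periods.
set.seed(2)
df <- data.frame(id = rep(1:50, each = 48), time = rep(1:48, times = 50))
df$during <- as.integer(df$time >= 26 & df$time <= 41)
df$after  <- as.integer(df$time >= 42)
df$claims <- rpois(nrow(df), exp(0.2 + 0.01 * df$time))

# Segmented model: baseline trend plus level/slope changes per period,
# with an exchangeable working correlation within patient.
fit <- geeglm(claims ~ time + during + during:time + after + after:time,
              id = id, data = df, family = poisson, corstr = "exchangeable")
summary(fit)
```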
Huang, Shou-Guo; Chen, Bo; Lv, Dong; Zhang, Yong; Nie, Feng-Feng; Li, Wei; Lv, Yao; Zhao, Huan-Li; Liu, Hong-Mei
2017-01-01
Purpose Using a network meta-analysis approach, our study aims to develop a ranking of six surgical procedures, that is, Plate, titanium elastic nail (TEN), tension band wire (TBW), hook plate (HP), reconstruction plate (RP) and Knowles pin, by comparing post-surgery Constant shoulder scores in patients with clavicular fracture (CF). Methods A comprehensive search of electronic scientific literature databases was performed to retrieve publications investigating surgical procedures in CF. With stringent eligibility criteria, clinical experimental studies of high quality and relevance to our area of interest were selected for network meta-analysis. Statistical analyses were conducted using Stata 12.0. Results A total of 19 studies that met our inclusion criteria were eventually enrolled in our network meta-analysis, representing 1164 patients who had undergone surgical procedures for CF (TEN group = 240; Plate group = 164; TBW group = 180; RP group = 168; HP group = 245; Knowles pin group = 167). The network meta-analysis results revealed that RP significantly improved the Constant shoulder score in patients with CF when compared with TEN, and the post-operative Constant shoulder scores after Plate, TBW, HP, Knowles pin and TEN were similar, with no statistically significant differences. The relative ranking of the treatments by predictive probabilities of Constant shoulder scores after surgery revealed that the surface under the cumulative ranking curve (SUCRA) value was highest for RP. Conclusion The current network meta-analysis suggests that RP may be the optimal surgical treatment among the six interventions for patients with CF, as it can improve the shoulder score of patients with CF. Implications for Rehabilitation RP improves shoulder joint function after the surgical procedure. RP achieves stability with minimal complications after surgery. RP may be the optimal surgical treatment for rehabilitation of patients with CF.
Comparison of revision surgeries for one- to two-level cervical TDR and ACDF from 2002 to 2011.
Nandyala, Sreeharsha V; Marquez-Lara, Alejandro; Fineberg, Steven J; Singh, Kern
2014-12-01
Cervical total disc replacement (TDR) and anterior cervical discectomy and fusion (ACDF) provide comparable outcomes for degenerative cervical pathology. However, revisions of these procedures are not well characterized. The purpose of this study is to examine the rates, epidemiology, perioperative complications, and costs between the revision procedures and to compare these outcomes with those of primary cases. This study is a retrospective database analysis. A total of 3,792 revision and 183,430 primary cases from the Nationwide Inpatient Sample (NIS) database from 2002 to 2011 were included. Outcome measures included the incidence of revision cases, patient demographics, length of stay (LOS), in-hospital costs, mortality, and perioperative complications. Patients who underwent revision for either one- to two-level cervical TDR or ACDF were identified. SPSS v.20 was used for statistical analysis, with the χ² test for categorical data and the independent-sample t test for continuous data. The relative risk for perioperative complications with revisions was calculated in comparison with primary cases using a 95% confidence interval. An alpha level of less than 0.05 denoted statistical significance. There were 3,536 revision one- to two-level ACDFs and 256 revision cervical TDRs recorded in the NIS database from 2002 to 2011. The revision cervical TDR cohort demonstrated a significantly greater LOS (3.18 vs. 2.25, p<.001), cost ($16,998 vs. $15,222, p=.03), and incidence of perioperative wound infections (13.6 vs. 5.3 per 1,000, p<.001) compared with the ACDF revision cohort (p<.001). There were no differences in mortality between the revision surgical cohorts. Compared with primary cases, both revision cohorts demonstrated a significantly greater LOS and cost. Furthermore, patients who underwent revision demonstrated a greater incidence and risk for perioperative wound infections, hematomas, dysphagia, and neurologic complications relative to the primary procedures. This study demonstrated a significantly greater incidence of perioperative wound infection, LOS, and costs associated with a TDR revision compared with a revision ACDF. We propose that these differences are by virtue of the inherently more invasive nature of revising TDRs. In addition, compared with primary cases, revision procedures are associated with greater costs, LOS, and complications including wound infections, dysphagia, hematomas, and neurologic events. These additional risks must be considered before opting for a revision procedure. Copyright © 2014 Elsevier Inc. All rights reserved.
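The relative-risk calculation mentioned above can be reproduced in a few lines of R; the event counts below are hypothetical stand-ins, not the study's data.

```r
# Relative risk with a normal-approximation CI on the log scale.
rr_ci <- function(a, n1, b, n2, level = 0.95) {
  rr <- (a / n1) / (b / n2)
  se <- sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # SE of log(RR)
  z  <- qnorm(1 - (1 - level) / 2)
  c(RR = rr, lower = rr * exp(-z * se), upper = rr * exp(z * se))
}
rr_ci(a = 12, n1 = 256, b = 19, n2 = 3536)  # hypothetical counts
```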
Uttley, M; Crawford, M H
1994-02-01
In 1980 and 1981 Mennonite descendants of a group of Russian immigrants participated in a multidisciplinary study of biological aging. The Mennonites live in Goessel, Kansas, and Henderson, Nebraska. In 1991 the survival status of the participants was documented by each church secretary. Data are available for 1009 individuals, 177 of whom are now deceased. They ranged from 20 to 95 years in age when the data were collected. Biological ages were computed using a stepwise multiple regression procedure based on 38 variables previously identified as being related to survival, with chronological age as the dependent variable. Standardized residuals place participants in either a predicted-younger or a predicted-older group. The independence of the variables biological age and survival status is tested with the chi-square statistic. The significance of biological age differences between surviving and deceased Mennonites is determined by t test values. The two statistics provide consistent results. Predicted age group classification and survival status are related. The group of deceased participants is generally predicted to be older than the group of surviving participants, although neither statistic is significant for all subgroups of Mennonites. In most cases, however, individuals in the predicted-older groups are at a relatively higher risk of dying compared with those in the predicted-younger groups, although the increased risk is not always significant.
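The core of the biological-age construction can be sketched in R as follows, with two simulated markers standing in for the 38 survival-related variables; classification by the sign of the standardized residual mirrors the predicted-younger/predicted-older split described above.

```r
# Regress chronological age on (hypothetical) biomarkers; the fitted
# value plays the role of biological age.
set.seed(7)
n <- 200
age <- runif(n, 20, 95)
marker1 <- 0.5 * age + rnorm(n, sd = 8)
marker2 <- -0.2 * age + rnorm(n, sd = 5)
fit <- lm(age ~ marker1 + marker2)

# Negative standardized residual: predicted (biological) age exceeds
# chronological age, i.e., the "predicted-older" group.
group <- ifelse(rstandard(fit) < 0, "predicted-older", "predicted-younger")
table(group)
```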
Bayesian analyses of seasonal runoff forecasts
NASA Astrophysics Data System (ADS)
Krzysztofowicz, R.; Reese, S.
1991-12-01
Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971-1988.
A Bayesian test for Hardy–Weinberg equilibrium of biallelic X-chromosomal markers
Puig, X; Ginebra, J; Graffelman, J
2017-01-01
The X chromosome is a relatively large chromosome, harboring a lot of genetic information. Much of the statistical analysis of X-chromosomal information is complicated by the fact that males only have one copy. Recently, frequentist statistical tests for Hardy–Weinberg equilibrium have been proposed specifically for dealing with markers on the X chromosome. Bayesian test procedures for Hardy–Weinberg equilibrium for the autosomes have been described, but Bayesian work on the X chromosome in this context is lacking. This paper gives the first Bayesian approach for testing Hardy–Weinberg equilibrium with biallelic markers at the X chromosome. Marginal and joint posterior distributions for the inbreeding coefficient in females and the male to female allele frequency ratio are computed, and used for statistical inference. The paper gives a detailed account of the proposed Bayesian test, and illustrates it with data from the 1000 Genomes project. In that implementation, a novel approach to tackle multiple testing from a Bayesian perspective through posterior predictive checks is used. PMID:28900292
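A conjugate sketch of the quantities involved is given below in R; the counts and the flat Dirichlet/Beta priors are our assumptions for illustration, and the paper's actual prior and test construction may differ.

```r
# Posterior draws for the female inbreeding coefficient and the
# male/female allele-frequency ratio at a biallelic X-linked marker.
set.seed(3)
rdirich <- function(n, a) {           # Dirichlet draws via gamma variates
  x <- matrix(rgamma(n * length(a), a), nrow = n, byrow = TRUE)
  x / rowSums(x)
}
fem <- c(AA = 40, AB = 45, BB = 15)   # hypothetical female genotype counts
mal <- c(A = 70, B = 30)              # hypothetical male allele counts

g  <- rdirich(10000, fem + 1)         # posterior of female genotype freqs
pf <- g[, 1] + g[, 2] / 2             # female A-allele frequency
f  <- 1 - g[, 2] / (2 * pf * (1 - pf))       # inbreeding coefficient
pm <- rbeta(10000, mal[1] + 1, mal[2] + 1)   # male A-allele frequency

quantile(f, c(0.025, 0.5, 0.975))            # equilibrium: f near 0
quantile(pm / pf, c(0.025, 0.5, 0.975))      # equilibrium: ratio near 1
```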
Turrentine, Florence E; Wang, Hongkun; Young, Jeffrey S; Calland, James Forrest
2010-08-01
Ever-increasing numbers of in-house acute care surgeons and competition for operating room time during normal daytime business hours have led to an increased frequency of nonemergent general and vascular surgery procedures occurring at night, when there are fewer residents, consultants, nurses, and support staff available for assistance. This investigation tests the hypothesis that patients undergoing such procedures after hours are at increased risk for postoperative morbidity and mortality. Clinical data for 10,426 operative procedures performed over a 5-year period at a single academic tertiary care hospital were obtained from the American College of Surgeons National Surgical Quality Improvement Program Database. The prevalence of preoperative comorbid conditions, postoperative length of stay, morbidity, and mortality was compared between two cohorts of patients: one that underwent nonemergent operative procedures at night and another that underwent similar procedures during the day. Subsequent statistical comparisons utilized chi-square tests for categorical variables and F-tests for continuous variables. Patients undergoing procedures at night had a greater prevalence of serious preoperative comorbid conditions. Procedure complexity as measured by relative value unit did not differ between groups, but length of stay was longer after night procedures (7.8 days vs. 4.3 days, p < 0.0001). Patients undergoing nonemergent general and vascular surgery procedures at night in an academic medical center do not seem to be at increased risk for postoperative morbidity or mortality. Performing nonemergent procedures at night seems to be a safe solution for daytime overcrowding of operating rooms.
Kathman, Steven J; Potts, Ryan J; Ayres, Paul H; Harp, Paul R; Wilson, Cody L; Garner, Charles D
2010-10-01
The mouse dermal assay has long been used to assess the dermal tumorigenicity of cigarette smoke condensate (CSC). This mouse skin model has been developed for use in carcinogenicity testing utilizing the SENCAR mouse as the standard strain. Though the model has limitations, it remains the most relevant method available to study the dermal tumor-promoting potential of mainstream cigarette smoke. In the typical SENCAR mouse CSC bioassay, CSC is applied for 29 weeks following the application of a tumor initiator such as 7,12-dimethylbenz[a]anthracene (DMBA). Several endpoints are considered for analysis, including the percentage of animals with at least one mass, latency, and the number of masses per animal. In this paper, a relatively straightforward analytic model and procedure are presented for analyzing the time course of the incidence of masses. The procedure considered here takes advantage of Bayesian statistical techniques, which provide powerful methods for model fitting and simulation. Two datasets are analyzed to illustrate how the model fits the data, how well the model may perform in predicting data from such trials, and how the model may be used as a decision tool when comparing the dermal tumorigenicity of cigarette smoke condensate from multiple cigarette types. The analysis presented here was developed as a statistical decision tool for differentiating between two or more prototype products based on dermal tumorigenicity. Copyright (c) 2010 Elsevier Inc. All rights reserved.
Santori, G; Fontana, I; Bertocchi, M; Gasloli, G; Valente, U
2010-05-01
Following the example of many Western countries, where a "minimum volume rule" policy has been adopted as a quality parameter for complex surgical procedures, the Italian National Transplant Centre set the minimum number of kidney transplantation procedures/y at 30/center. The number of procedures performed in a single center over a long period may be treated as a time series to evaluate trends, seasonal cycles, and nonsystematic fluctuations. Between January 1, 1983, and December 31, 2007, we performed 1376 procedures in adult or pediatric recipients from living or cadaveric donors. The greatest numbers of cases/y were performed in 1998 (n = 86), followed by 2004 (n = 82), 1996 (n = 75), and 2003 (n = 73). A time series analysis performed using R Statistical Software (Foundation for Statistical Computing, Vienna, Austria), a free software environment for statistical computing and graphics, showed an overall incremental trend after exponential smoothing as well as after seasonal decomposition. However, starting from 2005, we observed a decreasing trend in the series. The number of kidney transplants expected to be performed in 2008, obtained by applying Holt-Winters exponential smoothing to the period 1983 to 2007, was 58, while in that year there were 52. The time series approach may be helpful to establish a minimum volume/y at a single-center level. Copyright (c) 2010 Elsevier Inc. All rights reserved.
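The forecasting step is nearly a one-liner in base R; the annual counts below are hypothetical stand-ins for the center's series, not the published data.

```r
# Holt-Winters exponential smoothing of annual procedure counts
# (gamma = FALSE disables the seasonal component for annual data).
counts <- ts(c(75, 70, 86, 65, 60, 58, 62, 73, 82, 70, 60, 55),
             start = 1996)            # hypothetical counts, 1996-2007
fit <- HoltWinters(counts, gamma = FALSE)
predict(fit, n.ahead = 1)             # expected procedures next year
```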
Statistical Research of Investment Development of Russian Regions
ERIC Educational Resources Information Center
Burtseva, Tatiana A.; Aleshnikova, Vera I.; Dubovik, Mayya V.; Naidenkova, Ksenya V.; Kovalchuk, Nadezda B.; Repetskaya, Natalia V.; Kuzmina, Oksana G.; Surkov, Anton A.; Bershadskaya, Olga I.; Smirennikova, Anna V.
2016-01-01
This article is concerned with substantiating procedures that ensure the implementation of statistical research and monitoring of investment development of the Russian regions, which would be pertinent for the modern development of state statistics. The aim of the study is to develop the methodological framework in order to estimate…
Decision Support Systems: Applications in Statistics and Hypothesis Testing.
ERIC Educational Resources Information Center
Olsen, Christopher R.; Bozeman, William C.
1988-01-01
Discussion of the selection of appropriate statistical procedures by educators highlights a study conducted to investigate the effectiveness of decision aids in facilitating the use of appropriate statistics. Experimental groups and a control group using a printed flow chart, a computer-based decision aid, and a standard text are described. (11…
Use of Management Statistics in ARL Libraries. SPEC Kit #153.
ERIC Educational Resources Information Center
Vasi, John
A Systems and Procedures Exchange Center (SPEC) survey conducted in 1986 investigated the collection and use of management statistics in Association of Research Libraries (ARL) member libraries, and SPEC Kit #134 (May 1987) summarized the kinds of statistics collected and the reasons given by the 91 respondents for collecting them. This more…
Strategies Used by Students to Compare Two Data Sets
ERIC Educational Resources Information Center
Reaburn, Robyn
2012-01-01
One of the common tasks of inferential statistics is to compare two data sets. Long before formal statistical procedures, however, students can be encouraged to make comparisons between data sets and therefore build up intuitive statistical reasoning. Such tasks also give meaning to the data collection students may do. This study describes the…
Statistics for People Who (Think They) Hate Statistics. Third Edition
ERIC Educational Resources Information Center
Salkind, Neil J.
2007-01-01
This text teaches an often intimidating and difficult subject in a way that is informative, personable, and clear. The author takes students through various statistical procedures, beginning with correlation and graphical representation of data and ending with inferential techniques and analysis of variance. In addition, the text covers SPSS, and…
ERIC Educational Resources Information Center
Nitko, Anthony J.; Hsu, Tse-chi
Item analysis procedures appropriate for domain-referenced classroom testing are described. A conceptual framework within which item statistics can be considered and promising statistics in light of this framework are presented. The sampling fluctuations of the more promising item statistics for sample sizes comparable to the typical classroom…
Foley, J
2008-03-01
To develop baseline data in relation to paediatric minor oral surgical procedures undertaken with both general anaesthesia and nitrous oxide inhalation sedation within a Hospital Dental Service. Data were collected prospectively over a three-year period from May 2003 to June 2006 for patients attending the Departments of Paediatric Dentistry, Dundee Dental Hospital and Ninewells Hospital, NHS Tayside, Great Britain, for all surgical procedures undertaken with either inhalation sedation or general anaesthesia. Both operator status and the procedure being undertaken were noted. In addition, the operating time was recorded. Data for 166 patients (F: 102; M: 64) with a median age of 12.50 (inter-quartile range 10.00, 14.20) years showed that 195 surgical procedures were undertaken. Of these, 160 were performed with general anaesthesia and 35 with sedation. The surgical removal of impacted, carious and supernumerary unit(s) accounted for 53.8% of all procedures, whilst the exposure of impacted teeth and soft tissue surgery represented 34.9% and 11.3% of procedures respectively. The median surgical time for techniques undertaken with sedation was 30.00 (inter-quartile range 25.00, 43.50) minutes, whilst that for general anaesthesia was similar at 30.00 (inter-quartile range 15.25, 40.00) minutes; the difference was not statistically significant (Mann-Whitney U, W = 3081.5, P = 0.331). The majority of paediatric minor oral surgical procedures entail surgical exposure or removal of impacted teeth. The median treatment time for most procedures undertaken with either general anaesthesia or nitrous oxide sedation was 30 minutes.
Approach for Uncertainty Propagation and Robust Design in CFD Using Sensitivity Derivatives
NASA Technical Reports Server (NTRS)
Putko, Michele M.; Newman, Perry A.; Taylor, Arthur C., III; Green, Lawrence L.
2001-01-01
This paper presents an implementation of the approximate statistical moment method for uncertainty propagation and robust optimization for a quasi 1-D Euler CFD (computational fluid dynamics) code. Given uncertainties in statistically independent, random, normally distributed input variables, a first- and second-order statistical moment matching procedure is performed to approximate the uncertainty in the CFD output. Efficient calculation of both first- and second-order sensitivity derivatives is required. In order to assess the validity of the approximations, the moments are compared with statistical moments generated through Monte Carlo simulations. The uncertainties in the CFD input variables are also incorporated into a robust optimization procedure. For this optimization, statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. Second-order sensitivity derivatives are used in a gradient-based search to successfully execute a robust optimization. The approximate methods used throughout the analyses are found to be valid when considering robustness about input parameter mean values.
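For independent normal inputs, the first-order moment matching described above reduces to a gradient calculation; a minimal R sketch follows, using a stand-in output function and finite-difference sensitivities (the paper uses analytically computed derivatives, and the CFD code itself obviously cannot be reproduced here).

```r
# First-order moment propagation: mean ~ f(mu), variance ~ sum of
# squared sensitivities times input variances (independent inputs).
f <- function(x) x[1]^2 * sin(x[2])        # stand-in for the CFD output
mu <- c(2, 0.5); sigma <- c(0.1, 0.05)     # input means and std devs

grad <- function(f, x, h = 1e-6)           # central finite differences
  sapply(seq_along(x), function(i) {
    e <- replace(numeric(length(x)), i, h)
    (f(x + e) - f(x - e)) / (2 * h)
  })

g <- grad(f, mu)
c(mean = f(mu), var = sum(g^2 * sigma^2))  # first-order approximations
```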
Wilson, Sally; Bremner, Alexandra P; Mathews, Judy; Pearson, Diane
2013-12-01
The aim of this study was to evaluate the effectiveness of oral sucrose in decreasing pain during minor procedures in infants of 1-6 months corrected age. In a blinded randomized controlled trial, infants aged 4-26 weeks who underwent venipuncture, heel lance or intravenous cannulation were stratified by corrected age into >4-12 weeks and >12-26 weeks groups. They received 2 mL of either 25% sucrose or sterile water orally 2 minutes before the painful procedure. Nonnutritional sucking and parental comfort, provided in adherence to hospital guidelines, were recorded. Pain behavior was recorded using a validated 10-point scale at baseline, during, and following the procedure. Data collectors were blinded to the intervention. A total of 21 and 20 infants received sucrose and water, respectively, in the >4-12-week age group, and 21 and 22, respectively, in the >12-26-week age group. No statistical differences were found in pain scores between treatment and control groups at any data collection point in either age group. Infants aged >4-12 weeks who engaged in nonnutritional sucking showed statistically significantly lower median pain scores at 1, 2, and 3 minutes after the procedure than those who did not suck. Infants aged >4-26 weeks exhibited pain behavior scores that indicated moderate to large pain during painful procedures; however, there was insufficient evidence to show that 2 mL of 25% sucrose had a statistically significant effect in decreasing pain. Infants should be offered nonnutritional sucking in compliance with the Baby Friendly Health Initiative during painful procedures. Crown Copyright © 2013. Published by Elsevier Inc. All rights reserved.
Systematic and fully automated identification of protein sequence patterns.
Hart, R K; Royyuru, A K; Stolovitzky, G; Califano, A
2000-01-01
We present an efficient algorithm to systematically and automatically identify patterns in protein sequence families. The procedure is based on the Splash deterministic pattern discovery algorithm and on a framework to assess the statistical significance of patterns. We demonstrate its application to the fully automated discovery of patterns in 974 PROSITE families (the complete subset of PROSITE families which are defined by patterns and contain DR records). Splash generates patterns with better specificity and undiminished sensitivity, or vice versa, in 28% of the families; identical statistics were obtained in 48% of the families, worse statistics in 15%, and mixed behavior in the remaining 9%. In about 75% of the cases, Splash patterns identify sequence sites that overlap more than 50% with the corresponding PROSITE pattern. The procedure is sufficiently rapid to enable its use for daily curation of existing motif and profile databases. Our results also show that the statistical significance of discovered patterns correlates well with their biological significance. The trypsin subfamily of serine proteases is used to illustrate this method's ability to exhaustively discover all motifs in a family that are statistically and biologically significant. Finally, we discuss applications of sequence patterns to multiple sequence alignment and the training of more sensitive score-based motif models, akin to the procedure used by PSI-BLAST. All results are available at http://www.research.ibm.com/spat/.
Harris, Alex H S; Reeder, Rachelle; Hyun, Jenny K
2009-10-01
Journal editors and statistical reviewers are often in the difficult position of catching serious problems in submitted manuscripts after the research is conducted and data have been analyzed. We sought to learn from editors and reviewers of major psychiatry journals what common statistical and design problems they most often find in submitted manuscripts and what they wished to communicate to authors regarding these issues. Our primary goal was to facilitate communication between journal editors/reviewers and researchers/authors and thereby improve the scientific and statistical quality of research and submitted manuscripts. Editors and statistical reviewers of 54 high-impact psychiatry journals were surveyed to learn what statistical or design problems they encounter most often in submitted manuscripts. Respondents completed the survey online. The authors analyzed survey text responses using content analysis procedures to identify major themes related to commonly encountered statistical or research design problems. Editors and reviewers (n=15) who handle manuscripts from 39 different high-impact psychiatry journals responded to the survey. The most commonly cited problems regarded failure to map statistical models onto research questions, improper handling of missing data, not controlling for multiple comparisons, not understanding the difference between equivalence and difference trials, and poor controls in quasi-experimental designs. The scientific quality of psychiatry research and submitted reports could be greatly improved if researchers became sensitive to, or sought consultation on frequently encountered methodological and analytic issues.
PROC IRT: A SAS Procedure for Item Response Theory
Matlock Cole, Ki; Paek, Insu
2017-01-01
This article reviews the item response theory procedure (PROC IRT) in SAS/STAT 14.1 for conducting item response theory (IRT) analyses of dichotomous and polytomous datasets that are unidimensional or multidimensional. The review provides an overview of available features, including models, estimation procedures, interfacing, input, and output files. A small-scale simulation study evaluates the IRT model parameter recovery of the PROC IRT procedure. The IRT procedure in Statistical Analysis Software (SAS) may be useful for researchers who frequently utilize SAS for analyses, research, and teaching.
Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.
Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory
2017-01-01
Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes it difficult to use them to examine the performance of given testing procedures or associations between investigated factors. We turn our focus to the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. In the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g., partial EPVs; (2) development of optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures and their constructions and properties with an eye towards practical applications.
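The EPV itself is easy to approximate by simulation, as in this hedged R sketch of a one-sided z-test under a fixed alternative; the effect sizes are arbitrary.

```r
# Expected p-value of a one-sided z-test: average, over data generated
# under the alternative, of the p-value computed under the null.
set.seed(4)
epv <- function(delta, n, B = 20000) {
  z <- replicate(B, sqrt(n) * mean(rnorm(n, mean = delta)))
  mean(pnorm(z, lower.tail = FALSE))
}
epv(delta = 0.3, n = 25)
epv(delta = 0.6, n = 25)  # stronger alternative, smaller (better) EPV
```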
Poudel, Sashi; Weir, Lori; Dowling, Dawn; Medich, David C
2016-08-01
A statistical pilot study was retrospectively performed to analyze potential changes in occupational radiation exposures to Interventional Radiology (IR) staff at Lawrence General Hospital after implementation of the i2 Active Radiation Dosimetry System (Unfors RaySafe Inc, 6045 Cochran Road, Cleveland, OH 44139-3302). In this study, the monthly OSL dosimetry records obtained during the eight-month period prior to i2 implementation were normalized to the number of procedures performed during each month and statistically compared to the normalized dosimetry records obtained for the eight-month period after i2 implementation. The resulting statistics included calculation of the mean and standard deviation of the dose equivalences per procedure and included appropriate hypothesis tests to assess for statistically valid differences between the pre- and post-i2 study periods. Hypothesis testing was performed on three groups of staff present during an IR procedure: the first group included all members of the IR staff, the second group consisted of the IR radiologists, and the third group consisted of the IR technician staff. After implementing the i2 active dosimetry system, participating members of the Lawrence General IR staff had a reduction in the average dose equivalence per procedure of 43.1% ± 16.7% (p = 0.04). Similarly, Lawrence General IR radiologists had a 65.8% ± 33.6% (p = 0.01) reduction, while the technologists had a 45.0% ± 14.4% (p = 0.03) reduction.
Estimation of descriptive statistics for multiply censored water quality data
Helsel, Dennis R.; Cohn, Timothy A.
1988-01-01
This paper extends the work of Gilliom and Helsel (1986) on procedures for estimating descriptive statistics of water quality data that contain “less than” observations. Previously, procedures were evaluated when only one detection limit was present. Here we investigate the performance of estimators for data that have multiple detection limits. Probability plotting and maximum likelihood methods perform substantially better than simple substitution procedures now commonly in use. Therefore simple substitution procedures (e.g., substitution of the detection limit) should be avoided. Probability plotting methods are more robust than maximum likelihood methods to misspecification of the parent distribution and their use should be encouraged in the typical situation where the parent distribution is unknown. When utilized correctly, less than values frequently contain nearly as much information for estimating population moments and quantiles as would the same observations had the detection limit been below them.
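The probability-plotting approach favored above is implemented, following Helsel's later work, in the CRAN package NADA as "regression on order statistics"; a sketch with hypothetical censored data follows (the package API is cited from memory, so treat the details as assumptions).

```r
# Regression on order statistics (ROS) for multiply censored data:
# obs holds the measured value, or the detection limit where cen is TRUE.
library(NADA)
obs <- c(0.5, 1.2, 0.7, 3.1, 0.5, 2.4, 1.0, 0.8)
cen <- c(TRUE, FALSE, FALSE, FALSE, TRUE, FALSE, TRUE, FALSE)
fit <- ros(obs, cen)   # lognormal ROS fit
mean(fit); sd(fit)     # moment estimates honoring the censoring
```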
[Bronchoscopy in Germany. Cross-sectional inquiry with 681 institutions].
Markus, A; Häussinger, K; Kohlhäufl, M; Hauck, R W
2000-11-01
Bronchoscopy represents an integral part of the diagnostic tools in pulmonary medicine. Recently, it has also gained considerable attention for its therapeutic properties. To elucidate the equipment, indications, and procedural techniques of bronchoscopy units, a retrospective survey of 1232 hospitals and practices was conducted. 687 questionnaires were returned (response rate 56%), 681 of which were statistically evaluated. Two thirds of the physicians in charge were internists, one third were pulmonary care specialists. A total of 200,596 endoscopic procedures was included. The majority of procedures was performed with an average of 3 bronchoscopists, and over 57% (388) of the institutions performed an average of 100 or fewer procedures per year. The five main indications were tumor, hemoptysis, infection or pneumonia, drainage of secretions, and suspected interstitial disease. The overall complication rate amounted to 2.7%, with an incidence of 4.6% minor and 0.7% major complications and a bronchoscopy-related mortality of 0.02%. The patterns seen in premedication, intra- and post-procedural monitoring, disinfection practices, and documentation were quite heterogeneous. It is suggested that revised and updated standards for bronchoscopy be established, taking the data collected into particular account. Those standards should provide the basis for a high level of bronchological care throughout Germany.
Deriving Color-Color Transformations for VRI Photometry
NASA Astrophysics Data System (ADS)
Taylor, B. J.; Joner, M. D.
2006-12-01
In this paper, transformations between Cousins R-I and other indices are considered. New transformations to Cousins V-R and Johnson V-K are derived, a published transformation involving T1-T2 on the Washington system is rederived, and the basis for a transformation involving b-y is considered. In addition, a statistically rigorous procedure for deriving such transformations is presented and discussed in detail. Highlights of the discussion include (1) the need for statistical analysis when least-squares relations are determined and interpreted, (2) the permitted forms and best forms for such relations, (3) the essential role played by accidental errors, (4) the decision process for selecting terms to appear in the relations, (5) the use of plots of residuals, (6) detection of influential data, (7) a protocol for assessing systematic effects from absorption features and other sources, (8) the reasons for avoiding extrapolation of the relations, (9) a protocol for ensuring uniformity in data used to determine the relations, and (10) the derivation and testing of the accidental errors of those data. To put the last of these subjects in perspective, it is shown that rms errors for VRI photometry have been as small as 6 mmag for more than three decades and that standard errors for quantities derived from such photometry can be as small as 1 mmag or less.
Rangel, Uesliz Vianna; Gomes, Saint Clair dos Santos; Costa, Ana Maria Aranha Magalhães; Moreira, Maria Elisabeth Lopes
2014-01-01
OBJECTIVE: to relate the variables from a surveillance form for intravenous devices in high-risk newborn infants with peripherally inserted central catheter-related infection. METHODOLOGY: approximately 15 variables were studied for association with peripherally inserted central catheter-related infection, defined by blood culture results. The variables analyzed were obtained from the surveillance forms used with intravenous devices, attached to the medical records of newborn infants weighing between 500 and 1,499 g. Statistical association was assessed using the Chi-squared and Student t tests. The study was approved by the Research Ethics Committee of the Instituto Fernandes Figueira under process N. 140.703/12. RESULTS: 63 medical records were analyzed. The infection rate observed was 25.4%. Of the variables analyzed, only three had a statistically significant relationship with the blood culture result - the use of drugs capable of inhibiting acid secretion, post-natal steroid use, and undergoing more than one invasive procedure (p-values of 0.0141, 0.0472 and 0.0277, respectively). CONCLUSION: the absence of significance of the remaining variables on the form may be related to the quality of the records and to the absence of standardization. It is recommended that the teams be encouraged to adhere to the protocol and fill out the form. PMID:25493681
The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the study data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by th...
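As a minimal illustration of what such sampling weights do (values hypothetical, not from the study), a design-weighted mean in R:

```r
# Design-weighted mean: each observation counts in proportion to the
# number of population units it represents.
x <- c(2.3, 4.1, 3.7, 5.0)   # hypothetical measurements
w <- c(120, 80, 200, 100)    # hypothetical sampling weights
sum(w * x) / sum(w)          # weighted estimate of the population mean
```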
ERIC Educational Resources Information Center
Tay, Louis; Vermunt, Jeroen K.; Wang, Chun
2013-01-01
We evaluate the item response theory with covariates (IRT-C) procedure for assessing differential item functioning (DIF) without preknowledge of anchor items (Tay, Newman, & Vermunt, 2011). This procedure begins with a fully constrained baseline model, and candidate items are tested for uniform and/or nonuniform DIF using the Wald statistic.…
Evaluation on the use of cerium in the NBL Titrimetric Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zebrowski, J.P.; Orlowicz, G.J.; Johnson, K.D.
An alternative to potassium dichromate as titrant in the New Brunswick Laboratory Titrimetric Method for uranium analysis was sought, since chromium in the waste makes disposal difficult. Substitution of a ceric-based titrant was statistically evaluated. Analysis of the data indicated statistically equivalent precisions for the two methods, but a significant overall bias of +0.035% for the ceric titrant procedure. The cause of the bias was investigated, alterations to the procedure were made, and a second statistical study was performed. This second study revealed no statistically significant bias, nor any analyst-to-analyst variation in the ceric titration procedure. A statistically significant day-to-day variation was detected, but this was physically small (0.015%) and was only detected because of the within-day precision of the method. The standard deviation of the %RD for a single measurement was found to be 0.031%. A comparison with quality control blind dichromate titration data again indicated similar overall precision. The effects of ten elements on the ceric titration's performance were determined: Co, Ti, Cu, Ni, Na, Mg, Gd, Zn, Cd, and Cr; in previous work at NBL these impurities did not interfere with the potassium dichromate titrant. This study indicated similar results for the ceric titrant, with the exception of Ti. All the elements (excluding Ti and Cr) caused no statistically significant bias in uranium measurements at levels of 10 mg impurity per 20-40 mg uranium. The presence of Ti was found to cause a bias of -0.05%; this is attributed to the presence of sulfate ions, resulting in precipitation of titanium sulfate and occlusion of uranium. A negative bias of 0.012% was also statistically observed in the samples containing chromium impurities.
Relevance of the c-statistic when evaluating risk-adjustment models in surgery.
Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Dimick, Justin B; Wang, Edward; Chow, Warren B; Ko, Clifford Y; Bilimoria, Karl Y
2012-05-01
The measurement of hospital quality based on outcomes requires risk adjustment. The c-statistic is a popular tool used to judge model performance, but can be limited, particularly when evaluating specific operations in focused populations. Our objectives were to examine the interpretation and relevance of the c-statistic when used in models with increasingly similar case mix and to consider an alternative perspective on model calibration based on a graphical depiction of model fit. From the American College of Surgeons National Surgical Quality Improvement Program (2008-2009), patients were identified who underwent a general surgery procedure, and procedure groups were increasingly restricted: colorectal-all, colorectal-elective cases only, and colorectal-elective cancer cases only. Mortality and serious morbidity outcomes were evaluated using logistic regression-based risk adjustment, and model c-statistics and calibration curves were used to compare model performance. During the study period, 323,427 general, 47,605 colorectal-all, 39,860 colorectal-elective, and 21,680 colorectal cancer patients were studied. Mortality ranged from 1.0% in general surgery to 4.1% in the colorectal-all group, and serious morbidity ranged from 3.9% in general surgery to 12.4% in the colorectal-all procedural group. As case mix was restricted, c-statistics progressively declined from the general to the colorectal cancer surgery cohorts for both mortality and serious morbidity (mortality: 0.949 to 0.866; serious morbidity: 0.861 to 0.668). Calibration was evaluated graphically by examining the predicted vs. observed number of events over risk deciles. For both mortality and serious morbidity, there was no qualitative difference in calibration identified between the procedure groups. In the present study, we demonstrate how the c-statistic can become less informative and, in certain circumstances, can lead to incorrect model-based conclusions as case mix is restricted and patients become more homogeneous. Although it remains an important tool, caution is advised when the c-statistic is advanced as the sole measure of model performance. Copyright © 2012 American College of Surgeons. All rights reserved.
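The rank interpretation of the c-statistic that drives this behavior is easy to see in code; the following R sketch uses simulated risks, not the NSQIP data.

```r
# c-statistic: probability that a random event case gets a higher
# predicted risk than a random non-case (ties count one half).
cstat <- function(pred, y) {
  pos <- pred[y == 1]; neg <- pred[y == 0]
  mean(outer(pos, neg, ">") + 0.5 * outer(pos, neg, "=="))
}
set.seed(5)
y <- rbinom(400, 1, 0.1)                 # simulated outcomes
pred <- plogis(-2 + 2 * y + rnorm(400))  # hypothetical predicted risks
cstat(pred, y)
```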
A Simple Illustration for the Need of Multiple Comparison Procedures
ERIC Educational Resources Information Center
Carter, Rickey E.
2010-01-01
Statistical adjustments to accommodate multiple comparisons are routinely covered in introductory statistical courses. The fundamental rationale for such adjustments, however, may not be readily understood. This article presents a simple illustration to help remedy this.
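One such illustration takes two lines of R: with m independent tests at alpha = 0.05, the familywise error rate is 1 - 0.95^m, which a Bonferroni correction (alpha/m per test) pulls back toward 0.05.

```r
m <- c(1, 5, 10, 20)
round(1 - 0.95^m, 3)             # 0.050 0.226 0.401 0.642: error inflates
round(1 - (1 - 0.05 / m)^m, 3)   # with Bonferroni: stays just below 0.05
```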
Validating Coherence Measurements Using Aligned and Unaligned Coherence Functions
NASA Technical Reports Server (NTRS)
Miles, Jeffrey Hilton
2006-01-01
This paper describes a novel approach based on the use of coherence functions and statistical theory for sensor validation in a harsh environment. By the use of aligned and unaligned coherence functions and statistical theory, one can test for sensor degradation, total sensor failure, or changes in the signal. This advanced diagnostic approach and the novel data processing methodology discussed provide a single number that conveys this information. This number, as calculated with standard statistical procedures for comparing the means of two distributions, is compared with results obtained using Yuen's robust statistical method to create confidence intervals. Examination of experimental data from Kulite pressure transducers mounted in a Pratt & Whitney PW4098 combustor, using spectrum analysis methods on aligned and unaligned time histories, has verified the effectiveness of the proposed method. All the procedures produce good results, which demonstrates the robustness of the technique.
Hypothesis testing for band size detection of high-dimensional banded precision matrices.
An, Baiguo; Guo, Jianhua; Liu, Yufeng
2014-06-01
Many statistical analysis procedures require a good estimator for a high-dimensional covariance matrix or its inverse, the precision matrix. When the precision matrix is banded, the Cholesky-based method often yields a good estimator of the precision matrix. One important aspect of this method is determination of the band size of the precision matrix. In practice, crossvalidation is commonly used; however, we show that crossvalidation not only is computationally intensive but can be very unstable. In this paper, we propose a new hypothesis testing procedure to determine the band size in high dimensions. Our proposed test statistic is shown to be asymptotically normal under the null hypothesis, and its theoretical power is studied. Numerical examples demonstrate the effectiveness of our testing procedure.
Trends in Coronary Revascularization and Ischemic Heart Disease-Related Mortality in Israel.
Blumenfeld, Orit; Na'amnih, Wasef; Shapira-Daniels, Ayelet; Lotan, Chaim; Shohat, Tamy; Shapira, Oz M
2017-02-17
We investigated national trends in volume and outcomes of percutaneous coronary angioplasty (PCI), coronary artery bypass grafting (CABG), and ischemic heart disease-related mortality in Israel. Using International Classification of Diseases 9th and 10th revision codes, we linked 5 Israeli national databases, including the Israel Center for Disease Control National PCI and CABG Registries, the Ministry of Health Hospitalization Report, the Center of Bureau of Statistics, and the Ministry of Interior Mortality Report, to assess the annual PCI and CABG volume, procedural mortality, comorbidities, and ischemic heart disease-related mortality between 2002 and 2014. Trends over time were analyzed using linear regression, assuming a Poisson distribution. A total of 298 390 revascularization procedures (PCI: 255 724, CABG: 42 666) were performed during the study period. PCI volume increased by 9% from 2002 to 2008 (387.4/100 000 to 423.2/100 000), then steadily decreased by 10.5% to 378.5/100 000 in 2014 (P = 0.70 for the trend). CABG volume decreased by 59% (109.0/100 000 to 45.2/100 000) from 2002 to 2013, leveling at 46.4/100 000 (P < 0.0001). The PCI/CABG ratio increased from 3.6 in 2002 to 8.5 in 2013, slightly decreasing to 8.2 by 2014 (P < 0.0001). In-hospital procedural mortality remained stable (PCI: 1.2-1.6%, P = 0.34; CABG: 3.7-4.4%, P = 0.29) despite a significant change in patient clinical profile. During the course of the study, ischemic heart disease-related mortality decreased by 46% (84.6 to 46/100 000, P < 0.001). We observed a dramatic change in coronary revascularization procedure type and volume, and a marked decrease in ischemic heart disease-related mortality in Israel. The reasons for the observed changes remain unclear and need to be further investigated. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
ERIC Educational Resources Information Center
Kadhi, Tau; Holley, D.
2010-01-01
The following report gives the statistical findings of the July 2010 TMSL Bar results. Procedures: Data are pre-existing and were given to the Evaluator by email from the Registrar and Dean. Statistical analyses were run using SPSS 17 to address the following research questions: 1. What are the statistical descriptors of the July 2010 overall TMSL…
Compound estimation procedures in reliability
NASA Technical Reports Server (NTRS)
Barnes, Ron
1990-01-01
At NASA, components and subsystems of components in the Space Shuttle and Space Station generally go through a number of redesign stages. While data on failures for various design stages are sometimes available, the classical procedures for evaluating reliability only utilize the failure data on the present design stage of the component or subsystem. Often, few or no failures have been recorded on the present design stage. Previously, Bayesian estimators for the reliability of a single component, conditioned on the failure data for the present design, were developed. These new estimators permit NASA to evaluate the reliability, even when few or no failures have been recorded. Point estimates for the latter evaluation were not possible with the classical procedures. Since different design stages of a component (or subsystem) generally have a good deal in common, the development of new statistical procedures for evaluating the reliability, which consider the entire failure record for all design stages, has great intuitive appeal. A typical subsystem consists of a number of different components and each component has evolved through a number of redesign stages. The present investigations considered compound estimation procedures and related models. Such models permit the statistical consideration of all design stages of each component and thus incorporate all the available failure data to obtain estimates for the reliability of the present version of the component (or subsystem). A number of models were considered to estimate the reliability of a component conditioned on its total failure history from two design stages. It was determined that reliability estimators for the present design stage, conditioned on the complete failure history for two design stages, have lower risk than the corresponding estimators conditioned only on the most recent design failure data. Several models were explored and preliminary models involving the bivariate Poisson distribution and the Consael Process (a bivariate Poisson process) were developed. Possible shortcomings of the models are noted. An example is given to illustrate the procedures. These investigations are ongoing with the aim of developing estimators that extend to components (and subsystems) with three or more design stages.
Luthra, Suvitesh; Ramady, Omar; Monge, Mary; Fitzsimons, Michael G; Kaleta, Terry R; Sundt, Thoralf M
2015-06-01
Markers of operating room (OR) efficiency in cardiac surgery are focused on "knife to skin" and "start time tardiness." These do not evaluate the middle and later parts of the cardiac surgical pathway. The purpose of this analysis was to evaluate knife to skin time as an efficiency marker in cardiac surgery. We looked at knife to skin time, procedure time, and transfer times in the cardiac operational pathway for their correlation with predefined indices of operational efficiency (Index of Operation Efficiency - InOE; Surgical Index of Operational Efficiency - sInOE). A regression analysis was performed to test the goodness of fit of the regression curves estimated for InOE relative to the times on the operational pathway. The mean knife to skin time was 90.6 ± 13 minutes (23% of total OR time). The mean procedure time was 282 ± 123 minutes (71% of total OR time). Utilization efficiencies were highest for aortic valve replacement and coronary artery bypass grafting and lowest for complex aortic procedures. There were no significant procedure-specific or team-specific differences for standard procedures. Procedure times correlated the strongest with InOE (r = -0.98, p < 0.01). Compared to procedure times, knife to skin time is not as strong an indicator of efficiency. A statistically significant linear dependence on InOE was observed with procedure times only. Procedure times are a better marker of OR efficiency than knife to skin time in cardiac cases. Strategies to increase OR utilization and efficiency should address procedure times in addition to knife to skin times. © 2015 Wiley Periodicals, Inc.
Pilot study of proposed revisions to specifications for hydraulic cement concrete.
DOT National Transportation Integrated Search
1985-01-01
This report summarizes the results of a pilot study of the statistical acceptance procedures proposed for adoption by the Virginia Department of Highways and Transportation. The proposed procedures were recommended in the report titled "Improved Spec...
Behavior analytic approaches to problem behavior in intellectual disabilities.
Hagopian, Louis P; Gregory, Meagan K
2016-03-01
The purpose of the current review is to summarize recent behavior analytic research on problem behavior in individuals with intellectual disabilities. We have focused our review on studies published from 2013 to 2015, but also included earlier studies that were relevant. Behavior analytic research on problem behavior continues to focus on the use and refinement of functional behavioral assessment procedures and function-based interventions. During the review period, a number of studies reported on procedures aimed at making functional analysis procedures more time efficient. Behavioral interventions continue to evolve, and there were several larger scale clinical studies reporting on multiple individuals. There was increased attention on the part of behavioral researchers to develop statistical methods for analysis of within subject data and continued efforts to aggregate findings across studies through evaluative reviews and meta-analyses. Findings support continued utility of functional analysis for guiding individualized interventions and for classifying problem behavior. Modifications designed to make functional analysis more efficient relative to the standard method of functional analysis were reported; however, these require further validation. Larger scale studies on behavioral assessment and treatment procedures provided additional empirical support for effectiveness of these approaches and their sustainability outside controlled clinical settings.
A perspective of percutaneous transluminal angioplasty.
Stanson, A W
1983-01-01
PTA is a relatively new procedure which is still evolving. More technical improvements are needed. Stiffer balloon plastics and devices to measure arterial wall compliance during balloon inflation are predicted to lead to better long-term success rates. Increasing case numbers provide greater expertise and subsequent refinements in performance and case selection. These factors will lead to improved statistics. Other features of overall patient care must also be considered. The procedure is easy for patients to tolerate, and they can return to activities and work in three or four days. The overall cost is much lower than that of surgery, even at a conservative success rate of 65 percent. There is minimal risk and morbidity, and virtually no mortality. PTA can be repeated if the lesion recurs. Severe complications are rare and almost always surgically treatable. If PTA fails to achieve success, a traditional surgical procedure can be performed. Percutaneous transluminal angioplasty is an important therapeutic alternative to traditional medical and surgical treatment for occlusive arterial disease. It can save legs, veins, time, and money. We need to refine and accurately record the use of this procedure. Total cooperation among clinicians, surgeons, and radiologists is essential for proper utilization of PTA.
ERIC Educational Resources Information Center
Zheng, Yinggan; Gierl, Mark J.; Cui, Ying
2010-01-01
This study combined the kernel smoothing procedure and a nonparametric differential item functioning statistic--Cochran's Z--to statistically test the difference between the kernel-smoothed item response functions for reference and focal groups. Simulation studies were conducted to investigate the Type I error and power of the proposed…
NASA Technical Reports Server (NTRS)
Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.
1983-01-01
The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.
Kepler AutoRegressive Planet Search: Motivation & Methodology
NASA Astrophysics Data System (ADS)
Caceres, Gabriel; Feigelson, Eric; Jogesh Babu, G.; Bahamonde, Natalia; Bertin, Karine; Christen, Alejandra; Curé, Michel; Meza, Cristian
2015-08-01
The Kepler AutoRegressive Planet Search (KARPS) project uses statistical methodology associated with autoregressive (AR) processes to model Kepler lightcurves in order to improve exoplanet transit detection in systems with high stellar variability. We also introduce a planet-search algorithm to detect transits in time-series residuals after application of the AR models. One of the main obstacles in detecting faint planetary transits is the intrinsic stellar variability of the host star. The variability displayed by many stars may have autoregressive properties, wherein later flux values are correlated with previous ones in some manner. Auto-Regressive Moving-Average (ARMA) models, Generalized Auto-Regressive Conditional Heteroskedasticity (GARCH), and related models are flexible, phenomenological methods used with great success to model stochastic temporal behaviors in many fields of study, particularly econometrics. Powerful statistical methods are implemented in the public statistical software environment R and its many packages. Modeling involves maximum likelihood fitting, model selection, and residual analysis. These techniques provide a useful framework to model stellar variability and are used in KARPS with the objective of reducing stellar noise to enhance opportunities to find as-yet-undiscovered planets. Our analysis procedure consists of three steps: pre-processing of the data to remove discontinuities, gaps and outliers; ARMA-type model selection and fitting; and transit signal search of the residuals using a new Transit Comb Filter (TCF) that replaces traditional box-finding algorithms. We apply the procedures to simulated Kepler-like time series with known stellar and planetary signals to evaluate the effectiveness of the KARPS procedures. The ARMA-type modeling is effective at reducing stellar noise, but also reduces and transforms the transit signal into ingress/egress spikes. A periodogram based on the TCF is constructed to concentrate the signal of these periodic spikes. When a periodic transit is found, the model is displayed on a standard period-folded averaged light curve. We also illustrate the efficient coding in R.
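Since KARPS builds on standard R time-series tools, the ARMA step can be sketched in a few lines; the simulated series and chosen model orders below are assumptions for illustration, not the KARPS pipeline itself:

```r
# Fit an ARMA model to a stand-in "light curve" and keep the residuals,
# which is where a transit search (e.g., the TCF periodogram) would operate.
set.seed(42)
flux <- arima.sim(model = list(ar = c(0.7, -0.2)), n = 2000)  # simulated stellar variability
fit  <- arima(flux, order = c(2, 0, 1))                       # ARMA(2,1) by maximum likelihood
res  <- residuals(fit)                                        # transit search operates here
AIC(fit)                                                      # model selection criterion
acf(res, main = "Residual autocorrelation")                   # should be near white noise
```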
Uncertainties in Estimates of Fleet Average Fuel Economy : A Statistical Evaluation
DOT National Transportation Integrated Search
1977-01-01
Research was performed to assess the current Federal procedure for estimating the average fuel economy of each automobile manufacturer's new car fleet. Test vehicle selection and fuel economy estimation methods were characterized statistically and so...
40 CFR 91.512 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis for... will be made available to the public during Agency business hours. ...
1977-05-20
the knowledge of first- and second-line Navy civilian supervisors concerning the dynamics of sexism and racism, with implications for behavior...Identification of the problem (sexism and racism)--define what these things are, how they came to be, and what can be done about them. 2. Identification of...managers become reflected in the organization’s EEO statistics; and an overview of the steps and procedures...
Impact of gastrectomy procedural complexity on surgical outcomes and hospital comparisons.
Mohanty, Sanjay; Paruch, Jennifer; Bilimoria, Karl Y; Cohen, Mark; Strong, Vivian E; Weber, Sharon M
2015-08-01
Most risk adjustment approaches adjust for patient comorbidities and the primary procedure. However, procedures done at the same time as the index case may increase operative risk and merit inclusion in adjustment models for fair hospital comparisons. Our objectives were to evaluate the impact of surgical complexity on postoperative outcomes and hospital comparisons in gastric cancer surgery. Patients who underwent gastric resection for cancer were identified from a large clinical dataset. Procedure complexity was characterized using secondary procedure CPT codes and work relative value units (RVUs). Regression models were developed to evaluate the association between complexity variables and outcomes. The impact of complexity adjustment on model performance and hospital comparisons was examined. Among 3,467 patients who underwent gastrectomy for adenocarcinoma, 2,171 operations were distal gastrectomies and 1,296 were total gastrectomies. A secondary procedure was reported for 33% of distal gastrectomies and 59% of total gastrectomies. Six of 10 secondary procedures were associated with adverse outcomes. For example, patients who underwent a synchronous bowel resection had a higher risk of mortality (odds ratio [OR], 2.14; 95% CI, 1.07-4.29) and reoperation (OR, 2.09; 95% CI, 1.26-3.47). Model performance was slightly better for nearly all outcomes with complexity adjustment (mortality c-statistics: standard model, 0.853; secondary procedure model, 0.858; RVU model, 0.855). Hospital ranking did not change substantially after complexity adjustment. Surgical complexity variables are associated with adverse outcomes in gastrectomy, but complexity adjustment does not affect hospital rankings appreciably. Copyright © 2015 Elsevier Inc. All rights reserved.
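A hedged sketch of this kind of risk model in R: a logistic regression with a secondary-procedure flag, plus a hand-computed c-statistic. The data are simulated and the coefficient is loosely anchored to the reported OR of 2.14; nothing here reproduces the NSQIP models themselves.

```r
set.seed(7)
n <- 500
synch_bowel <- rbinom(n, 1, 0.2)                  # hypothetical secondary-procedure flag
age         <- rnorm(n, 65, 10)
death       <- rbinom(n, 1, plogis(-4 + 0.03 * age + log(2.14) * synch_bowel))

fit <- glm(death ~ age + synch_bowel, family = binomial)
p   <- fitted(fit)
# c-statistic: probability a death gets a higher predicted risk than a survivor
cstat <- mean(outer(p[death == 1], p[death == 0], ">") +
              0.5 * outer(p[death == 1], p[death == 0], "=="))
cstat
```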
Taxonomy and clustering in collaborative systems: The case of the on-line encyclopedia Wikipedia
NASA Astrophysics Data System (ADS)
Capocci, A.; Rao, F.; Caldarelli, G.
2008-01-01
In this paper we investigate the nature and structure of the relation between imposed classifications and real clustering in a particular case of a scale-free network given by the on-line encyclopedia Wikipedia. We find a statistical similarity in the distributions of community sizes both by using the top-down approach of the categories division present in the archive and in the bottom-up procedure of community detection given by an algorithm based on the spectral properties of the graph. Despite this statistically similar behaviour, the two methods provide a rather different division of the articles, thereby signaling that the nature and presence of power laws is a general feature for these systems and cannot be used as a benchmark to evaluate the suitability of a clustering method.
Determination of precipitation profiles from airborne passive microwave radiometric measurements
NASA Technical Reports Server (NTRS)
Kummerow, Christian; Hakkarinen, Ida M.; Pierce, Harold F.; Weinman, James A.
1991-01-01
This study presents the first quantitative retrievals of vertical profiles of precipitation derived from multispectral passive microwave radiometry. Measurements of microwave brightness temperature (Tb) obtained by a NASA high-altitude research aircraft are related to profiles of rainfall rate through a multichannel piecewise-linear statistical regression procedure. Statistics for Tb are obtained from a set of cloud radiative models representing a wide variety of convective, stratiform, and anvil structures. The retrieval scheme itself determines which cloud model best fits the observed meteorological conditions. Retrieved rainfall rate profiles are converted to equivalent radar reflectivity for comparison with observed reflectivities from a ground-based research radar. Results for two case studies, a stratiform rain situation and an intense convective thunderstorm, show that the radiometrically derived profiles capture the major features of the observed vertical structure of hydrometeor density.
NASA Astrophysics Data System (ADS)
Kneringer, Philipp; Dietz, Sebastian; Mayr, Georg J.; Zeileis, Achim
2017-04-01
Low-visibility conditions have a large impact on aviation safety and economic efficiency of airports and airlines. To support decision makers, we develop a statistical probabilistic nowcasting tool for the occurrence of capacity-reducing operations related to low visibility. The probabilities of four different low visibility classes are predicted with an ordered logistic regression model based on time series of meteorological point measurements. Potential predictor variables for the statistical models are visibility, humidity, temperature and wind measurements at several measurement sites. A stepwise variable selection method indicates that visibility and humidity measurements are the most important model inputs. The forecasts are tested at 30-minute intervals up to two hours ahead, which is a sufficient time span for tactical planning at Vienna Airport. The ordered logistic regression models outperform persistence and are competitive with human forecasters.
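A minimal sketch of an ordered logistic (proportional-odds) model of visibility classes in R, using MASS::polr on simulated predictors; the class definitions and coefficients are assumptions, not the Vienna Airport model:

```r
library(MASS)
set.seed(3)
n   <- 1000
vis <- runif(n, 0, 10)                        # current visibility (km), assumed
rh  <- runif(n, 40, 100)                      # relative humidity (%), assumed
z   <- -0.8 * vis + 0.05 * rh + rnorm(n)
cls <- cut(z, breaks = quantile(z, c(0, .55, .75, .9, 1)),
           labels = c("none", "light", "moderate", "severe"),
           include.lowest = TRUE, ordered_result = TRUE)

fit <- polr(cls ~ vis + rh)                   # proportional-odds model
head(predict(fit, type = "probs"))            # predicted class probabilities
```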
A Geometrical-Statistical Approach to Outlier Removal for TDOA Measurements
NASA Astrophysics Data System (ADS)
Compagnoni, Marco; Pini, Alessia; Canclini, Antonio; Bestagini, Paolo; Antonacci, Fabio; Tubaro, Stefano; Sarti, Augusto
2017-08-01
The curse of outlier measurements in estimation problems is a well-known issue in a variety of fields. Therefore, outlier removal procedures, which enable the identification of spurious measurements within a set, have been developed for many different scenarios and applications. In this paper, we propose a statistically motivated outlier removal algorithm for time differences of arrival (TDOAs), or equivalently range differences (RD), acquired at sensor arrays. The method exploits the TDOA-space formalism and works knowing only the relative sensor positions. As the proposed method is completely independent of the application for which measurements are used, it can be reliably used to identify outliers within a set of TDOA/RD measurements in different fields (e.g. acoustic source localization, sensor synchronization, radar, remote sensing, etc.). The proposed outlier removal algorithm is validated by means of synthetic simulations and real experiments.
Commodity Movements on the Texas Highway System: Data Collection and Survey Results
DOT National Transportation Integrated Search
1991-11-01
This report presents the survey procedures used and data collected in the : development of commodity flow statistics for movements over Texas Highways. : Response rates, sampling procedures, questionnaire design and the types of data : provided by th...
Code of Federal Regulations, 2011 CFR
2011-01-01
... Metal Halide Lamp Ballasts and Fixtures Energy Conservation Standards § 431.329 Enforcement. Process for Metal Halide Lamp Ballasts. This section sets forth procedures DOE will follow in pursuing alleged... with the following statistical sampling procedures for metal halide lamp ballasts, with the methods...
DOT National Transportation Integrated Search
1996-04-01
This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.
Understanding administrative abdominal aortic aneurysm mortality data.
Hussey, K; Siddiqui, T; Burton, P; Welch, G H; Stuart, W P
2015-03-01
Administrative data in the form of Hospital Episode Statistics (HES) and the Scottish Morbidity Record (SMR) have been used to describe surgical activity. These data have also been used to compare outcomes from different hospitals and regions, and to corroborate data submitted to national audits and registries. The aim of this observational study was to examine the completeness and accuracy of administrative data relating to abdominal aortic aneurysm (AAA) repair. Administrative data (SMR-01 returns) from a single health board relating to AAA repair were requested (September 2007 to August 2012). A complete list of validated procedures, termed the reference data set, was compiled from all available sources (clinical and administrative). For each patient episode electronic health records were scrutinised to confirm urgency of admission, diagnosis, and operative repair. The 30-day mortality was recorded. The reference data set was used to systematically validate the SMR-01 returns. The reference data set contained 608 verified procedures. SMR-01 returns identified 2433 episodes of care (1724 patients) in which a discharge diagnosis included AAA. This included 574 operative repairs. There were 34 missing cases (5.6%) from SMR-01 returns; nine of these patients died within 30 days of the index procedure. Omission of these cases produced a statistically significant improvement in perceived 30-day mortality (p < .05, chi-square test). If inconsistent SMR-01 data (in terms of ICD-10 and OPCS-4 codes) were excluded, only 81.9% of operative repairs were correctly identified and only 30.9% of deaths were captured. The SMR-01 returns contain multiple errors. There also appears to be a systematic bias that reduces apparent 30-day mortality. Using these data alone to describe or compare activity or outcomes must be done with caution. Copyright © 2014 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
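The mortality comparison reported above is a 2 x 2 chi-square test; a sketch in R with illustrative counts (only the 608/574 procedure counts and the 9 missed deaths come from the abstract; the overall death total is assumed):

```r
# rows: all verified procedures vs. the subset captured by SMR-01
tab <- rbind(all_verified = c(deaths = 40, survivors = 568),  # 608 procedures; 40 deaths assumed
             smr01_only   = c(deaths = 31, survivors = 543))  # 574 procedures; 9 deaths missed
chisq.test(tab)
```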
Ezabadi, Zahra; Mollaahmadi, Fahimeh; Mohammadi, Maryam; Omani Samani, Reza; Vesali, Samira
2017-01-01
Background In order to empower infertile individuals and provide high quality patient-centered infertility care, it is necessary to recognize and meet infertile individuals’ educational needs. This study aims to examine infertility patients’ knowledge and subsequently their education needs given their attitudinal approach to infertility education in terms of patients who undergo assisted reproduction treatment. Materials and Methods This descriptive study enrolled 150 subjects by convenience sampling of all patients who received their first assisted reproductive treatment between July and September 2015 at a referral fertility clinic, Royan Institute, Tehran, Iran. We used a questionnaire that measured fertility and infertility information (8 questions) as well as attitude toward education on the causes and treatment of infertility (5 questions). Chi-square, independent sample t test, and one way ANOVA analyses were conducted to examine differences by sex. P<0.05 was considered statistically significant. Results Total mean knowledge was 3.08 ± 0.99. Clients’ responses indicated that the highest mean knowledge scores related to knowledge of factors that affected pregnancy (3.97 ± 1.11) and infertility treatment (3.97 ± 1.16). The lowest mean knowledge scores related to knowledge of the natural reproductive cycle (2.96 ± 1.12) and anatomy of the genital organs (2.94 ± 1.16). Most females (92.1%) and males (83.3%) were of the opinion that infertility education programs should include causes of infertility and types of treatment associated with diagnostic and laboratory procedures. No statistically significant difference existed between male and female participants (P=0.245). Conclusion Most participants in this study expressed awareness of factors that affect pregnancy and infertility treatment. It is imperative to educate and empower infertile individuals who seek reproduction treatment in terms of infertility causes and types of treatment, as well as diagnostic and laboratory procedures to enable them to make informed decisions about their assisted reproductive procedures. PMID:28367301
Sebastião, E; Gobbi, S; Chodzko-Zajko, W; Schwingel, A; Papini, C B; Nakamura, P M; Netto, A V; Kokubun, E
2012-11-01
To explore issues associated with measuring physical activity using the International Physical Activity Questionnaire (IPAQ)-long form in adults living in a mid-sized Brazilian city. A stratified random sampling procedure was used to select a representative sample of adults living in Rio Claro. This yielded 1572 participants who were interviewed using the IPAQ-long form. The data were analysed using standard statistical procedures. Overall, 83% of men and 89% of women reported at least 150 min of combined moderate and/or vigorous physical activity per week. Reliable values of leisure and transportation-related physical activity were observed for both males and females. With regard to the household and work-related physical activity domains, both males and females reported unusually high levels of participation. The IPAQ-long form appears to overestimate levels of physical activity for both males and females, suggesting that the instrument has problems in measuring levels of physical activity in Brazilian adults. Accordingly, caution is warranted before using IPAQ data to support public policy decisions related to physical activity. Copyright © 2012 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
Eng, Kevin H; Schiller, Emily; Morrell, Kayla
2015-11-03
Researchers developing biomarkers for cancer prognosis from quantitative gene expression data are often faced with an odd methodological discrepancy: while Cox's proportional hazards model, the appropriate and popular technique, produces a continuous and relative risk score, it is hard to cast the estimate in clear clinical terms like median months of survival and percent of patients affected. To produce a familiar Kaplan-Meier plot, researchers commonly make the decision to dichotomize a continuous (often unimodal and symmetric) score. It is well known in the statistical literature that this procedure induces significant bias. We illustrate the liabilities of common techniques for categorizing a risk score and discuss alternative approaches. We promote the use of the restricted mean survival (RMS) and the corresponding RMS curve that may be thought of as an analog to the best fit line from simple linear regression. Continuous biomarker workflows should be modified to include the more rigorous statistical techniques and descriptive plots described in this article. All statistics discussed can be computed via standard functions in the Survival package of the R statistical programming language. Example R language code for the RMS curve is presented in the appendix.
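The RMS workflow the authors recommend is available through R's survival package, which the abstract names; a minimal sketch on the package's bundled lung data:

```r
library(survival)
fit <- survfit(Surv(time, status) ~ 1, data = lung)  # Kaplan-Meier fit
print(fit, print.rmean = TRUE)    # restricted mean survival, default horizon
print(fit, rmean = 365)           # restricted to one year of follow-up
```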
Statistics of Optical Coherence Tomography Data From Human Retina
de Juan, Joaquín; Ferrone, Claudia; Giannini, Daniela; Huang, David; Koch, Giorgio; Russo, Valentina; Tan, Ou; Bruni, Carlo
2010-01-01
Optical coherence tomography (OCT) has recently become one of the primary methods for noninvasive probing of the human retina. The pseudoimage formed by OCT (the so-called B-scan) varies probabilistically across pixels due to complexities in the measurement technique. Hence, sensitive automatic procedures of diagnosis using OCT may exploit statistical analysis of the spatial distribution of reflectance. In this paper, we perform a statistical study of retinal OCT data. We find that the stretched exponential probability density function can model well the distribution of intensities in OCT pseudoimages. Moreover, we show a small, but significant correlation between neighbor pixels when measuring OCT intensities with pixels of about 5 µm. We then develop a simple joint probability model for the OCT data consistent with known retinal features. This model fits well the stretched exponential distribution of intensities and their spatial correlation. In normal retinas, fit parameters of this model are relatively constant along retinal layers, but vary across layers. However, in retinas with diabetic retinopathy, large spikes of parameter modulation interrupt the constancy within layers, exactly where pathologies are visible. We argue that these results give hope for improvement in statistical pathology-detection methods even when the disease is in its early stages. PMID:20304733
Allen, Peter J; Roberts, Lynne D; Baughman, Frank D; Loxton, Natalie J; Van Rooy, Dirk; Rock, Adam J; Finlay, James
2016-01-01
Although essential to professional competence in psychology, quantitative research methods are a known area of weakness for many undergraduate psychology students. Students find selecting appropriate statistical tests and procedures for different types of research questions, hypotheses and data types particularly challenging, and these skills are not often practiced in class. Decision trees (a type of graphic organizer) are known to facilitate this decision making process, but extant trees have a number of limitations. Furthermore, emerging research suggests that mobile technologies offer many possibilities for facilitating learning. It is within this context that we have developed StatHand, a free cross-platform application designed to support students' statistical decision making. Developed with the support of the Australian Government Office for Learning and Teaching, StatHand guides users through a series of simple, annotated questions to help them identify a statistical test or procedure appropriate to their circumstances. It further offers the guidance necessary to run these tests and procedures, then interpret and report their results. In this Technology Report we will overview the rationale behind StatHand, before describing the feature set of the application. We will then provide guidelines for integrating StatHand into the research methods curriculum, before concluding by outlining our road map for the ongoing development and evaluation of StatHand.
Stey, Anne M; Ko, Clifford Y; Hall, Bruce Lee; Louie, Rachel; Lawson, Elise H; Gibbons, Melinda M; Zingmond, David S; Russell, Marcia M
2014-08-01
Identifying iatrogenic injuries using existing data sources is important for improved transparency in the occurrence of intraoperative events. There is evidence that procedure codes are reliably recorded in claims data. The objective of this study was to assess whether concurrent splenic procedure codes in patients undergoing colectomy procedures are reliably coded in claims data as compared with clinical registry data. Patients who underwent colectomy procedures in the absence of neoplastic diagnosis codes were identified from American College of Surgeons (ACS) NSQIP data linked with Medicare inpatient claims data file (2005 to 2008). A κ statistic was used to assess coding concordance between ACS NSQIP and Medicare inpatient claims, with ACS NSQIP serving as the reference standard. A total of 11,367 colectomy patients were identified from 212 hospitals. There were 114 patients (1%) who had a concurrent splenic procedure code recorded in either ACS NSQIP or Medicare inpatient claims. There were 7 patients who had a splenic injury diagnosis code recorded in either data source. Agreement of splenic procedure codes between the data sources was substantial (κ statistic 0.72; 95% CI, 0.64-0.79). Medicare inpatient claims identified 81% of the splenic procedure codes recorded in ACS NSQIP, and 99% of the patients without a splenic procedure code. It is feasible to use Medicare claims data to identify splenic injuries occurring during colectomy procedures, as claims data have moderate sensitivity and excellent specificity for capturing concurrent splenic procedure codes compared with ACS NSQIP. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
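A hand-rolled Cohen's kappa for a 2 x 2 agreement table, mirroring the claims-versus-registry concordance check above; the counts are illustrative assumptions chosen to roughly match the reported 81% sensitivity, not the study's actual cross-tabulation:

```r
tab <- matrix(c(85, 9, 20, 11253), 2, 2,
              dimnames = list(NSQIP    = c("splenic", "none"),
                              Medicare = c("splenic", "none")))
po    <- sum(diag(tab)) / sum(tab)                        # observed agreement
pe    <- sum(rowSums(tab) * colSums(tab)) / sum(tab)^2    # agreement expected by chance
kappa <- (po - pe) / (1 - pe)
kappa
```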
Malempati, Harsha; Wadey, Veronica M R; Paquette, Scott; Kreder, Hans J; Massicotte, Eric M; Rampersaud, Raja; Fisher, Charles; Dvorak, Marcel F; Fehlings, Michael G; Backstein, David; Yee, Albert
2013-01-01
A cross-sectional survey of spine surgery fellowship educators and trainees. To determine educator and trainee perspectives on the relative importance of core cognitive and procedural competencies in fellowship training. To determine perceptions of confidence in competencies by trainees near the end of their fellowship. Finally, to determine potential differences comparing surgeons by background specialty training (neurosurgical or orthopedic) of their views on competencies. Spine surgery is a growing subspecialty with increasing collaboration among specialists of varied specialty backgrounds involved in education. With the recent implementation of competency-based curricula during specialty training, opportunities may exist in enhancing fellowship education. A questionnaire on cognitive and procedural competencies was administered (online and paper) to fellowship educators and trainees across Canada. A follow-up questionnaire was administered to nonresponders 3 months later. Survey results were summarized using qualitative and descriptive statistics with comparative analyses performed. Of the identified respondents, the response rate was 91%, (15/17 fellow trainees; 47/51 educators). Twelve of the 13 core cognitive skill categories were rated as being important to acquire by the end of fellowship. Trainees were not comfortable performing, and requested additional training in 8 of the 29 less common and technically demanding procedural skills. There were different perceptions on the relative importance of competencies comparing trainees by specialty background as well as different perceptions on the types of competencies where additional training was desired to achieve competency (P < 0.05). Fellowship educators and trainees possessed similar perceptions on the relative importance of core cognitive and procedural competencies required for successful training. Background specialty influenced the perceptions of both fellowship educators and trainees. This study identified potential gaps or perceived deficiencies in the competency of current fellows. Improvements in spine fellowship education should target these areas through developing evidence-based curriculum changes.
Statistical Software and Artificial Intelligence: A Watershed in Applications Programming.
ERIC Educational Resources Information Center
Pickett, John C.
1984-01-01
AUTOBJ and AUTOBOX are revolutionary software programs which contain the first application of artificial intelligence to statistical procedures used in analysis of time series data. The artificial intelligence included in the programs and program features are discussed. (JN)
Statistical analyses of commercial vehicle accident factors. Volume 1 Part 1
DOT National Transportation Integrated Search
1978-02-01
Procedures for conducting statistical analyses of commercial vehicle accidents have been established and initially applied. A file of some 3,000 California Highway Patrol accident reports from two areas of California during a period of about one year...
Statistical methods for the quality control of steam cured concrete : final report.
DOT National Transportation Integrated Search
1971-01-01
Concrete strength test results from three prestressing plants utilizing steam curing were evaluated statistically in terms of the concrete as received and the effectiveness of the plants' steaming procedures. Control charts were prepared to show tren...
40 CFR 90.712 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sampling plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis... Clerk and will be made available to the public during Agency business hours. ...
NASA Technical Reports Server (NTRS)
Trosset, Michael W.
1999-01-01
Comprehensive computational experiments to assess the performance of algorithms for numerical optimization require (among other things) a practical procedure for generating pseudorandom nonlinear objective functions. We propose a procedure that is based on the convenient fiction that objective functions are realizations of stochastic processes. This report details the calculations necessary to implement our procedure for the case of certain stationary Gaussian processes and presents a specific implementation in the statistical programming language S-PLUS.
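The report's central idea can be sketched in a few lines of R (a close relative of the S-PLUS named above): draw one realization of a stationary Gaussian process with a squared-exponential covariance and treat it as a pseudorandom objective function. The grid, correlation length, and jitter term are assumptions:

```r
set.seed(11)
x   <- seq(0, 10, length.out = 200)              # evaluation grid
ell <- 1.5                                       # assumed correlation length
K   <- exp(-outer(x, x, "-")^2 / (2 * ell^2))    # squared-exponential covariance
L   <- chol(K + 1e-8 * diag(length(x)))          # small jitter for numerical stability
f   <- drop(t(L) %*% rnorm(length(x)))           # one realization = one objective function
plot(x, f, type = "l", ylab = "objective value")
```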
Multivariate assessment of event-related potentials with the t-CWT method.
Bostanov, Vladimir
2015-11-05
Event-related brain potentials (ERPs) are usually assessed with univariate statistical tests although they are essentially multivariate objects. Brain-computer interface applications are a notable exception to this practice, because they are based on multivariate classification of single-trial ERPs. Multivariate ERP assessment can be facilitated by feature extraction methods. One such method is t-CWT, a mathematical-statistical algorithm based on the continuous wavelet transform (CWT) and Student's t-test. This article begins with a geometric primer on some basic concepts of multivariate statistics as applied to ERP assessment in general and to the t-CWT method in particular. Further, it presents for the first time a detailed, step-by-step, formal mathematical description of the t-CWT algorithm. A new multivariate outlier rejection procedure based on principal component analysis in the frequency domain is presented as an important pre-processing step. The MATLAB and GNU Octave implementation of t-CWT is also made publicly available for the first time as free and open source code. The method is demonstrated on some example ERP data obtained in a passive oddball paradigm. Finally, some conceptually novel applications of the multivariate approach in general and of the t-CWT method in particular are suggested and discussed. Hopefully, the publication of both the t-CWT source code and its underlying mathematical algorithm along with a didactic geometric introduction to some basic concepts of multivariate statistics would make t-CWT more accessible to both users and developers in the field of neuroscience research.
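One ingredient of t-CWT, the pointwise Student's t contrast between two sets of single trials, can be illustrated briefly in R; in the full method the t-statistic is computed on continuous-wavelet-transform coefficients rather than on raw samples, so this is only a simplified sketch with simulated trials:

```r
set.seed(19)
t_points <- 256
trials_a <- matrix(rnorm(40 * t_points), nrow = 40)      # condition A, 40 trials
trials_b <- matrix(rnorm(40 * t_points), nrow = 40)      # condition B, 40 trials
trials_b[, 100:130] <- trials_b[, 100:130] + 0.8         # injected "P300-like" effect

tvals <- sapply(seq_len(t_points), function(j)
  t.test(trials_a[, j], trials_b[, j])$statistic)        # t-statistic per time point
which.max(abs(tvals))                                    # location of strongest contrast
```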
Liu, Zhiyi; Zhang, Luyao; Liu, Yanling; Gu, Yang; Sun, Tieliang
2017-11-01
We aimed to evaluate the efficiency and safety of a one-step procedure combining endoscopic retrograde cholangiopancreatography (ERCP) and laparoscopic cholecystectomy (LC) for treatment of patients with cholecysto-choledocholithiasis. A prospective randomized study was performed on 63 consecutive cholecysto-choledocholithiasis patients between 2008 and 2011. The efficiency and safety of the one-step procedure were assessed by comparison with the two-step procedure of LC with ERCP + endoscopic sphincterotomy (EST). Outcomes including intraoperative features and postoperative features (length of stay and postoperative complications) were evaluated. The one- or two-step procedure of LC with ERCP + EST was successfully performed in all patients, and common bile duct stones were completely removed. Statistical analyses showed that length of stay and pulmonary infection rate were significantly lower in the test group than in the control group (P < 0.05), whereas no statistical difference in other outcomes was found between the two groups (all P > 0.05). The one-step procedure of LC with ERCP + EST is therefore superior to the two-step procedure for treatment of patients with cholecysto-choledocholithiasis, with regard to reduced hospital stay and a lower incidence of pulmonary infections.
The Effects of Twitter Sentiment on Stock Price Returns.
Ranco, Gabriele; Aleksovski, Darko; Caldarelli, Guido; Grčar, Miha; Mozetič, Igor
2015-01-01
Social media are increasingly reflecting and influencing behavior of other complex systems. In this paper we investigate the relations between a well-known micro-blogging platform Twitter and financial markets. In particular, we consider, in a period of 15 months, the Twitter volume and sentiment about the 30 stock companies that form the Dow Jones Industrial Average (DJIA) index. We find a relatively low Pearson correlation and Granger causality between the corresponding time series over the entire time period. However, we find a significant dependence between the Twitter sentiment and abnormal returns during the peaks of Twitter volume. This is valid not only for the expected Twitter volume peaks (e.g., quarterly announcements), but also for peaks corresponding to less obvious events. We formalize the procedure by adapting the well-known "event study" from economics and finance to the analysis of Twitter data. The procedure automatically identifies events as Twitter volume peaks, computes the prevailing sentiment (positive or negative) expressed in tweets at these peaks, and finally applies the "event study" methodology to relate them to stock returns. We show that sentiment polarity of Twitter peaks implies the direction of cumulative abnormal returns. The amount of cumulative abnormal returns is relatively low (about 1-2%), but the dependence is statistically significant for several days after the events.
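A compressed sketch of the adapted event study in R: flag volume peaks, then cumulate abnormal returns in a short post-event window. The series, the peak rule (median plus a MAD multiple), and the window length are all assumptions, not the paper's exact choices:

```r
set.seed(5)
n       <- 300
volume  <- rpois(n, 50) + c(rep(0, 149), 400, rep(0, 150))  # one injected volume peak
returns <- rnorm(n, 0, 0.01)
abn     <- returns - mean(returns)                          # crude abnormal-return proxy

events <- which(volume > median(volume) + 5 * mad(volume))  # peak detection rule
car <- sapply(events, function(t0) {
  win <- t0:min(t0 + 5, n)                                  # 5-day post-event window
  sum(abn[win])                                             # cumulative abnormal return
})
data.frame(event_day = events, CAR = car)
```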
Gao, Y Nina
2018-04-06
The Resource-Based Relative Value Scale Update Committee (RUC) submits recommended reimbursement values for physician work (wRVUs) under Medicare Part B. The RUC includes rotating representatives from medical specialties. To identify changes in physician reimbursements associated with RUC rotating seat representation. Relative Value Scale Update Committee members 1994-2013; Medicare Part B Relative Value Scale 1994-2013; Physician/Supplier Procedure Summary Master File 2007; Part B National Summary Data File 2000-2011. I match service and procedure codes to specialties using 2007 Medicare billing data. Subsequently, I model wRVUs as a function of RUC rotating committee representation and level of code specialization. An annual RUC rotating seat membership is associated with a statistically significant 3-5 percent increase in Medicare expenditures for codes billed to that specialty. For codes that are performed by a small number of physicians, the association between reimbursement and rotating subspecialty representation is positive, 0.177 (SE = 0.024). For codes that are performed by a large number of physicians, the association is negative, -0.183 (SE = 0.026). Rotating representation on the RUC is correlated with overall reimbursement rates. The resulting differential changes may exacerbate existing reimbursement discrepancies between generalist and specialist practitioners. © Health Research and Educational Trust.
Marsch, L A
1998-04-01
To provide empirically based evaluation data regarding the efficacy of psychopharmacological interventions in opiate substance abuse, the present study employed meta-analytic statistical procedures to determine the effectiveness of methadone hydrochloride as a pharmacotherapeutic agent. Empirical research findings from 11 studies investigating the effect of methadone maintenance treatment (MMT) on illicit opiate use, and eight and 24 studies investigating the effect of MMT on HIV risk behaviors and criminal activities, respectively, by individuals in such treatment were addressed. Results demonstrate a consistent, statistically significant relationship between MMT and the reduction of illicit opiate use, HIV risk behaviors and drug and property-related criminal behaviors. The effectiveness of MMT is most apparent in its ability to reduce drug-related criminal behaviors. MMT had a moderate effect in reducing illicit opiate use and drug and property-related criminal behaviors, and a small to moderate effect in reducing HIV risk behaviors. Results clarify discrepancies in the literature and are useful in predicting the outcomes of individuals in treatment. The treatment's effectiveness is evident among opiate-dependent individuals across a variety of contexts, cultural and ethnic groups, and study designs.
Isupov, Inga; McInnes, Matthew D F; Hamstra, Stan J; Doherty, Geoffrey; Gupta, Ashish; Peddle, Susan; Jibri, Zaid; Rakhra, Kawan; Hibbert, Rebecca M
2017-04-01
The purpose of this study is to develop a tool to assess the procedural competence of radiology trainees, with sources of evidence gathered from five categories to support the construct validity of the tool: content, response process, internal structure, relations to other variables, and consequences. A pilot form for assessing procedural competence among radiology residents, known as the RAD-Score tool, was developed by evaluating published literature and using a modified Delphi procedure involving a group of local content experts. The pilot version of the tool was tested by seven radiology department faculty members who evaluated procedures performed by 25 residents at one institution between October 2014 and June 2015. Residents were evaluated while performing multiple procedures in both clinical and simulation settings. The main outcome measure was the percentage of residents who were considered ready to perform procedures independently, with testing conducted to determine differences between levels of training. A total of 105 forms (for 52 procedures performed in a clinical setting and 53 procedures performed in a simulation setting) were collected for a variety of procedures (eight vascular or interventional, 42 body, 12 musculoskeletal, 23 chest, and 20 breast procedures). A statistically significant difference was noted in the percentage of trainees who were rated as being ready to perform a procedure independently (in postgraduate year [PGY] 2, 12% of residents; in PGY3, 61%; in PGY4, 85%; and in PGY5, 88%; p < 0.05); this difference persisted in the clinical and simulation settings. User feedback and psychometric analysis were used to create a final version of the form. This prospective study describes the successful development of a tool for assessing the procedural competence of radiology trainees with high levels of construct validity in multiple domains. Implementation of the tool in the radiology residency curriculum is planned and can play an instrumental role in the transition to competency-based radiology training.
Faulkner, Austin R; Bourgeois, Austin C; Bradley, Yong C; Hudson, Kathleen B; Heidel, R Eric; Pasciak, Alexander S
2015-05-01
Fluoroscopically guided lumbar puncture (FGLP) is a commonly performed procedure with increased success rates relative to the bedside technique. However, FGLP also exposes both patient and staff to ionizing radiation. The purpose of this study was to determine if the use of a simulation-based FGLP training program using an original, inexpensive lumbar spine phantom could improve operator confidence and efficiency, while also reducing patient dose. A didactic and simulation-based FGLP curriculum was designed, including a 1-hour lecture and hands-on training with a lumbar spine phantom prototype developed at our institution. Six incoming post-graduate year 2 (PGY-2) radiology residents completed a short survey before taking the course, and each resident practiced 20 simulated FGLPs using the phantom before their first clinical procedure. Data from the 114 lumbar punctures (LPs) performed by the six trained residents (prospective cohort) were compared to data from 514 LPs performed by 17 residents who did not receive simulation-based training (retrospective cohort). Fluoroscopy time (FT), FGLP success rate, and indication were compared. There was a statistically significant reduction in average FT for the 114 procedures performed by the prospective study cohort compared to the 514 procedures performed by the retrospective cohort. This held true for all procedures in aggregate, LPs for myelography, and all procedures performed for a diagnostic indication. Aggregate FT for the prospective group (0.87 ± 0.68 minutes) was significantly lower compared to the retrospective group (1.09 ± 0.65 minutes) and resulted in a 25% reduction in average FT (P = .002). There was no statistically significant difference in the number of failed FGLPs between the two groups. Our simulation-based FGLP curriculum resulted in improved operator confidence and reduced FT. These changes suggest that resident procedure efficiency improved while patient dose was reduced. The FGLP training program was implemented by radiology residents and required a minimal investment of time and resources. The LP spine phantom used during training was inexpensive, durable, and effective. In addition, the phantom is compatible with multiple modalities including fluoroscopy, computed tomography, and ultrasound and could be easily adapted to other applications such as facet injections or joint arthrograms. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
Brightness temperature and attenuation statistics at 20.6 and 31.65 GHz
NASA Technical Reports Server (NTRS)
Westwater, Edgeworth R.; Falls, M. J.
1991-01-01
Attenuation and brightness temperature statistics at 20.6 and 31.65 GHz are analyzed for a year's worth of data. The data were collected in 1988 at Denver and Platteville, Colorado. The locations are separated by 49 km. Single-station statistics are derived for the entire year. Quality control procedures are discussed and examples of their application are given.
Data-driven inference for the spatial scan statistic.
Almeida, Alexandre C L; Duarte, Anderson R; Duczmal, Luiz H; Oliveira, Fernando L P; Takahashi, Ricardo H C
2011-08-02
Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is done, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, regarding the correctness of the decision based in this inference. A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
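A toy sketch of the modified inference in R: compare the observed scan statistic only against null replicates whose most likely cluster has the same size k. The line-scan score below is a simplified stand-in for Kulldorff's likelihood ratio over spatial zones:

```r
set.seed(9)
scan_stat <- function(cases, kmax = 10) {
  n <- length(cases); lambda <- mean(cases)
  best <- c(stat = -Inf, size = NA)
  for (k in 1:kmax) {                                    # candidate cluster sizes
    s <- sapply(1:(n - k + 1), function(i) sum(cases[i:(i + k - 1)]))
    score <- max((s - k * lambda) / sqrt(k * lambda))    # standardized excess count
    if (score > best["stat"]) best <- c(stat = score, size = k)
  }
  best
}
obs    <- scan_stat(c(rpois(20, 2), rpois(5, 6), rpois(25, 2)))  # planted cluster
null   <- t(replicate(999, scan_stat(rpois(50, 2))))             # null replicates
same_k <- null[, "size"] == obs["size"]
p_cond <- (1 + sum(null[same_k, "stat"] >= obs["stat"])) / (1 + sum(same_k))
p_cond                                                   # size-conditional p-value
```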
Effect of Audioanalgesia in 6- to 12-year-old Children during Dental Treatment Procedure.
Ramar, Kavitha; Hariharavel, V P; Sinnaduri, Gayathri; Sambath, Gayathri; Zohni, Fathima; Alagu, Palani J
2016-12-01
To evaluate the effect of audioanalgesia in 6- to 12-year-old children during dental treatment procedures. A total of 40 children were selected and divided into two groups: a study group with audioanalgesia and a control group without audioanalgesia. Pain was evaluated using Venham's pain rating scale. Data were compared using a one-sample t-test in the Statistical Package for the Social Sciences (SPSS Inc., Chicago, IL, USA), version 17.0. The difference between the control group and the study group was statistically significant (p < 0.05). The method of distraction using audioanalgesia instills a more positive dental attitude in children and decreases their pain perception. Playing or hearing music during dental procedures significantly alters the perception of pain in 6- to 12-year-old children.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kane, V.E.
1979-10-01
The standard maximum likelihood and moment estimation procedures are shown to have some undesirable characteristics for estimating the parameters in a three-parameter lognormal distribution. A class of goodness-of-fit estimators is found which provides a useful alternative to the standard methods. The class of goodness-of-fit tests considered includes the Shapiro-Wilk and Shapiro-Francia tests, which reduce to a weighted linear combination of the order statistics that can be maximized in estimation problems. The weighted-order statistic estimators are compared to the standard procedures in Monte Carlo simulations. Bias and robustness of the procedures are examined and example data sets analyzed, including geochemical data from the National Uranium Resource Evaluation Program.
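A sketch of the goodness-of-fit estimation idea in R: pick the threshold gamma of a three-parameter lognormal by maximizing the Shapiro-Wilk W statistic of log(x - gamma), then read off the remaining parameters. The data and search bounds are simulated assumptions:

```r
set.seed(13)
x <- 5 + rlnorm(200, meanlog = 1, sdlog = 0.5)    # true threshold gamma = 5
W <- function(gamma) shapiro.test(log(x - gamma))$statistic
opt   <- optimize(W, lower = 0, upper = min(x) - 1e-6, maximum = TRUE)
gamma <- opt$maximum
c(gamma = gamma,
  mu    = mean(log(x - gamma)),                   # lognormal location given gamma
  sigma = sd(log(x - gamma)))                     # lognormal scale given gamma
```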
Quality assurance software inspections at NASA Ames: Metrics for feedback and modification
NASA Technical Reports Server (NTRS)
Wenneson, G.
1985-01-01
Software inspections--a set of formal technical review procedures held at selected key points during software development in order to find defects in software documents--are described in terms of history, participants, tools, procedures, statistics, and database analysis.
42 CFR 493.1256 - Standard: Control procedures.
Code of Federal Regulations, 2010 CFR
2010-10-01
... for having control procedures that monitor the accuracy and precision of the complete analytic process..., include two control materials, including one that is capable of detecting errors in the extraction process... control materials having previously determined statistical parameters. (e) For reagent, media, and supply...
ERIC Educational Resources Information Center
Hester, Yvette
Least squares methods are sophisticated mathematical curve fitting procedures used in all classical parametric methods. The linear least squares approximation is most often associated with finding the "line of best fit" or the regression line. Since all statistical analyses are correlational and all classical parametric methods are least…
40 CFR 610.10 - Program purpose.
Code of Federal Regulations, 2013 CFR
2013-07-01
... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...
40 CFR 610.10 - Program purpose.
Code of Federal Regulations, 2014 CFR
2014-07-01
... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...
40 CFR 610.10 - Program purpose.
Code of Federal Regulations, 2011 CFR
2011-07-01
... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...
40 CFR 610.10 - Program purpose.
Code of Federal Regulations, 2012 CFR
2012-07-01
... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...
Xiao, Jian; Cao, Hongyuan; Chen, Jun
2017-09-15
Next generation sequencing technologies have enabled the study of the human microbiome through direct sequencing of microbial DNA, resulting in an enormous amount of microbiome sequencing data. One unique characteristic of microbiome data is the phylogenetic tree that relates all the bacterial species. Closely related bacterial species have a tendency to exhibit a similar relationship with the environment or disease. Thus, incorporating the phylogenetic tree information can potentially improve the detection power for microbiome-wide association studies, where hundreds or thousands of tests are conducted simultaneously to identify bacterial species associated with a phenotype of interest. Despite much progress in multiple testing procedures such as false discovery rate (FDR) control, methods that take into account the phylogenetic tree are largely limited. We propose a new FDR control procedure that incorporates the prior structure information and apply it to microbiome data. The proposed procedure is based on a hierarchical model, where a structure-based prior distribution is designed to utilize the phylogenetic tree. By borrowing information from neighboring bacterial species, we are able to improve the statistical power of detecting associated bacterial species while controlling the FDR at desired levels. When the phylogenetic tree is mis-specified or non-informative, our procedure achieves a similar power as traditional procedures that do not take into account the tree structure. We demonstrate the performance of our method through extensive simulations and real microbiome datasets. We identified far more alcohol-drinking associated bacterial species than traditional methods. R package StructFDR is available from CRAN. chen.jun2@mayo.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
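For contrast with the tree-aware procedure, the standard tree-agnostic baseline is Benjamini-Hochberg via p.adjust; the StructFDR package on CRAN implements the structure-based method itself, whose interface is not reproduced here. The p-value mixture below is simulated:

```r
set.seed(17)
p <- c(runif(900), rbeta(100, 0.5, 10))   # 900 null taxa, 100 associated taxa
q <- p.adjust(p, method = "BH")           # Benjamini-Hochberg adjusted p-values
sum(q < 0.05)                             # discoveries at a 5% FDR target
```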
Sandhu, Gurkirat; Khinda, Paramjit Kaur; Gill, Amarjit Singh; Singh Khinda, Vineet Inder; Baghi, Kamal; Chahal, Gurparkash Singh
2017-01-01
Periodontal surgical procedures produce varying degrees of stress in all patients. Nitrous oxide-oxygen inhalation sedation is very effective for adult patients with mild-to-moderate anxiety due to dental procedures and needle phobia. The present study was designed to perform periodontal surgical procedures under nitrous oxide-oxygen inhalation sedation and assess whether this technique actually reduces stress physiologically, in comparison to local anesthesia alone (LA) during lengthy periodontal surgical procedures. A total of 16 patients were selected for this randomized, split-mouth, cross-over study. One surgical session (SS) was performed under local anesthesia aided by nitrous oxide-oxygen inhalation sedation, and the other SS was performed on the contralateral quadrant under LA. For each session, blood samples to measure and evaluate serum cortisol levels were obtained, and vital parameters including blood pressure, heart rate, respiratory rate, and arterial blood oxygen saturation were monitored before, during, and after periodontal surgical procedures. Paired t-tests and repeated-measures ANOVA were used for statistical analysis. The findings of the present study revealed a statistically significant decrease in serum cortisol levels, blood pressure and pulse rate and a statistically significant increase in respiratory rate and arterial blood oxygen saturation during periodontal surgical procedures under nitrous oxide inhalation sedation. Nitrous oxide-oxygen inhalation sedation for periodontal surgical procedures is capable of reducing stress physiologically, in comparison to LA during lengthy periodontal surgical procedures.
Toppi, J; Petti, M; Vecchiato, G; Cincotti, F; Salinari, S; Mattia, D; Babiloni, F; Astolfi, L
2013-01-01
Partial Directed Coherence (PDC) is a spectral multivariate estimator for effective connectivity, relying on the concept of Granger causality. Although its original definition derived directly from information theory, two modifications were introduced in order to provide better physiological interpretations of the estimated networks: i) normalization of the estimator according to rows, ii) squared transformation. In the present paper we investigated the effect of PDC normalization on the performances achieved by applying the statistical validation process on investigated connectivity patterns under different conditions of Signal to Noise ratio (SNR) and amount of data available for the analysis. Results of the statistical analysis revealed an effect of PDC normalization only on the percentages of type I and type II errors committed when using the shuffling procedure for the assessment of connectivity patterns. No effect of the PDC formulation was found on the performances achieved when the validation process was executed by means of the Asymptotic Statistic approach. Moreover, the percentages of both false positives and false negatives committed by the Asymptotic Statistic approach are always lower than those of the shuffling procedure for each type of normalization.
Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data.
Carmichael, Owen; Sakhanenko, Lyudmila
2015-05-15
We develop statistical methodology for a popular brain imaging technique HARDI based on the high order tensor model by Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves or fibers which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense as it blends linear algebra concepts from high order tensors with asymptotical statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3], to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids all the typical limitations of the deterministic tractography methods and it delivers the same information as probabilistic tractography methods. Our method is computationally cheap and it provides well-founded mathematical and statistical framework where diverse functionals on fibers, directions and tensors can be studied in a systematic and rigorous way.
Krypotos, Angelos-Miltiadis; Klugkist, Irene; Engelhard, Iris M.
2017-01-01
Threat conditioning procedures have allowed the experimental investigation of the pathogenesis of Post-Traumatic Stress Disorder. The findings of these procedures have also provided stable foundations for the development of relevant intervention programs (e.g. exposure therapy). Statistical inference of threat conditioning procedures is commonly based on p-values and Null Hypothesis Significance Testing (NHST). Nowadays, however, there is a growing concern about this statistical approach, as many scientists point to the various limitations of p-values and NHST. As an alternative, the use of Bayes factors and Bayesian hypothesis testing has been suggested. In this article, we apply this statistical approach to threat conditioning data. In order to enable the easy computation of Bayes factors for threat conditioning data, we present a new R package named condir, which can be used either via the R console or via a Shiny application. This article provides both a non-technical introduction to Bayesian analysis for researchers using the threat conditioning paradigm, and the necessary tools for computing Bayes factors easily. PMID:29038683
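condir itself is an R package; as a language-neutral illustration of Bayes-factor reasoning on two-group conditioning data, here is a hedged Python sketch using the BIC approximation of Wagenmakers (2007). It is a generic approximation, not condir's own computation.

```python
import numpy as np

def bf10_bic(x, y):
    """Approximate Bayes factor BF10 for a two-group mean difference via
    BF10 ~ exp((BIC_null - BIC_alt) / 2) (BIC approximation)."""
    data = np.concatenate([x, y])
    n = data.size
    rss0 = np.sum((data - data.mean()) ** 2)                          # null: common mean
    rss1 = np.sum((x - x.mean()) ** 2) + np.sum((y - y.mean()) ** 2)  # alt: two means
    bic0 = n * np.log(rss0 / n) + 1 * np.log(n)
    bic1 = n * np.log(rss1 / n) + 2 * np.log(n)
    return float(np.exp((bic0 - bic1) / 2.0))

# By convention, BF10 > 3 is read as moderate evidence for a group difference.
```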
A scaling procedure for the response of an isolated system with high modal overlap factor
NASA Astrophysics Data System (ADS)
De Rosa, S.; Franco, F.
2008-10-01
The paper deals with a numerical approach that reduces some physical dimensions of the solution domain to compute the dynamic response of an isolated system; it has been named Asymptotical Scaled Modal Analysis (ASMA). The proposed numerical procedure alters the input data needed to obtain the classic modal responses so as to increase the frequency band of validity of the discrete or continuous coordinates model, through the definition of a proper scaling coefficient. It is demonstrated that the computational cost remains acceptable while the frequency range of analysis increases. Moreover, with reference to the flexural vibrations of a rectangular plate, the paper compares ASMA with statistical energy analysis and the energy distribution approach. Some insights are also given about the limits of the scaling coefficient. Finally, it is shown that the linear dynamic response predicted with the scaling procedure has the same quality and characteristics as the statistical energy analysis, but it can be useful when the system cannot be solved appropriately by standard Statistical Energy Analysis (SEA).
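For orientation, the quantity that ASMA rescales is the classic modal summation of a frequency response; the sketch below is that textbook summation only, with illustrative modal inputs, and the ASMA scaling coefficient itself is not reproduced here.

```python
import numpy as np

def receptance(omega, omega_n, phi_drive, phi_resp, modal_mass, eta=0.01):
    """Classic modal summation of a point-to-point FRF:
    H(w) = sum_m phi_m(x_r) * phi_m(x_d) / (M_m * (w_m^2 - w^2 + i*eta*w_m^2)).
    omega_n, phi_drive, phi_resp, modal_mass are per-mode arrays; eta is a
    hysteretic damping loss factor (placeholder value)."""
    den = modal_mass * (omega_n ** 2 - omega ** 2 + 1j * eta * omega_n ** 2)
    return np.sum(phi_resp * phi_drive / den)
```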
Statistical analysis of the calibration procedure for personnel radiation measurement instruments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, W.J.; Bengston, S.J.; Kalbeitzer, F.L.
1980-11-01
Thermoluminescent analyzer (TLA) calibration procedures were used to estimate personnel radiation exposure levels at the Idaho National Engineering Laboratory (INEL). A statistical analysis is presented herein based on data collected over a six-month period in 1979 on four TLAs located in the Department of Energy (DOE) Radiological and Environmental Sciences Laboratory at the INEL. The data were collected according to the day-to-day procedure in effect at that time. Both gamma and beta radiation models are developed. Observed TLA readings of thermoluminescent dosimeters are correlated with known radiation levels. This correlation is then used to predict unknown radiation doses from future analyzer readings of personnel thermoluminescent dosimeters. The statistical techniques applied in this analysis include weighted linear regression, estimation of systematic and random error variances, prediction interval estimation using Scheffé's theory of calibration, estimation of the ratio of the means of two bivariate normally distributed random variables and their corresponding confidence limits according to Kendall and Stuart, tests of normality, experimental design, a comparison between instruments, and quality control.
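A minimal sketch of the calibration core, assuming a straight-line response and known weights: fit reading on dose by weighted least squares, then invert the fitted line to predict dose from a new reading. The Scheffé-style prediction intervals from the report are not reproduced.

```python
import numpy as np

def wls_line(dose, reading, weights):
    """Weighted least squares fit of reading = a + b * dose."""
    w = np.asarray(weights, dtype=float)
    x_mat = np.column_stack([np.ones_like(dose), dose])
    lhs = x_mat.T @ (w[:, None] * x_mat)
    rhs = x_mat.T @ (w * reading)
    return np.linalg.solve(lhs, rhs)          # (a, b)

def dose_from_reading(coeffs, new_reading):
    """Inverse prediction: estimate dose for a future dosimeter reading."""
    a, b = coeffs
    return (new_reading - a) / b
```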
NASA DOE POD NDE Capabilities Data Book
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection systems, personnel, and protocols, demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
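To make the sequential idea concrete, here is a hedged sketch of Wald's sequential probability ratio test on a hit/miss inspection sequence; the hypothesized POD values and error rates are illustrative placeholders, and DOEPOD's actual decision logic is richer than this.

```python
import numpy as np

def sprt_binary(outcomes, p0=0.90, p1=0.98, alpha=0.05, beta=0.05):
    """Wald SPRT for H0: POD = p0 vs H1: POD = p1 on 0/1 detections.
    Returns the decision and the observation at which it was reached."""
    upper = np.log((1 - beta) / alpha)     # cross above: accept H1
    lower = np.log(beta / (1 - alpha))     # cross below: accept H0
    llr = 0.0
    for n, hit in enumerate(outcomes, start=1):
        llr += np.log(p1 / p0) if hit else np.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue sampling", len(outcomes)
```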
Blotière, Pierre-Olivier; Hoen, Bruno; Lesclous, Philippe; Millot, Sarah; Rudant, Jérémie; Weill, Alain; Coste, Joel; Alla, François; Duval, Xavier
2017-01-01
Objective: To assess the relation between invasive dental procedures and infective endocarditis associated with oral streptococci among people with prosthetic heart valves. Design: Nationwide population based cohort and a case crossover study. Setting: French national health insurance administrative data linked with the national hospital discharge database. Participants: All adults aged more than 18 years, living in France, with medical procedure codes for positioning or replacement of prosthetic heart valves between July 2008 and July 2014. Main outcome measures: Oral streptococcal infective endocarditis was identified using primary discharge diagnosis codes. In the cohort study, Poisson regression models were performed to estimate the rate of oral streptococcal infective endocarditis during the three month period after invasive dental procedures compared with non-exposure periods. In the case crossover study, conditional logistic regression models calculated the odds ratio and 95% confidence intervals comparing exposure to invasive dental procedures during the three month period preceding oral streptococcal infective endocarditis (case period) with three earlier control periods. Results: The cohort included 138 876 adults with prosthetic heart valves (285 034 person years); 69 303 (49.9%) underwent at least one dental procedure. Among the 396 615 dental procedures performed, 103 463 (26.0%) were invasive and therefore presented an indication for antibiotic prophylaxis, which was performed in 52 280 (50.1%). With a median follow-up of 1.7 years, 267 people developed infective endocarditis associated with oral streptococci (incidence rate 93.7 per 100 000 person years, 95% confidence interval 82.4 to 104.9). Compared with non-exposure periods, no statistically significant increased rate of oral streptococcal infective endocarditis was observed during the three months after an invasive dental procedure (relative rate 1.25, 95% confidence interval 0.82 to 1.82; P=0.26) or after an invasive dental procedure without antibiotic prophylaxis (1.57, 0.90 to 2.53; P=0.08). In the case crossover analysis, exposure to invasive dental procedures was more frequent during case periods than during matched control periods (5.1% v 3.2%; odds ratio 1.66, 95% confidence interval 1.05 to 2.63; P=0.03). Conclusion: Invasive dental procedures may contribute to the development of infective endocarditis in adults with prosthetic heart valves. PMID:28882817
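To make the case-crossover contrast concrete, the hedged sketch below computes a Mantel-Haenszel odds ratio for 1:3 matched binary exposure data; the study itself used conditional logistic regression, of which this matched-set estimator is a simple analogue, and all names are illustrative.

```python
import numpy as np

def mh_odds_ratio(case_exposed, n_control_exposed, m_controls=3):
    """Mantel-Haenszel OR over matched sets of 1 case period + M control
    periods: sum(a*d/n) / sum(b*c/n) per stratum.

    case_exposed      : 0/1 per set, exposure during the case period.
    n_control_exposed : count of exposed control periods per set (0..M).
    """
    a = np.asarray(case_exposed, dtype=float)
    c = np.asarray(n_control_exposed, dtype=float)
    n = m_controls + 1.0
    numerator = np.sum(a * (m_controls - c) / n)   # exposed case, unexposed controls
    denominator = np.sum((1 - a) * c / n)          # unexposed case, exposed controls
    return numerator / denominator
```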
A procedure for classifying textural facies in gravel‐bed rivers
Buffington, John M.; Montgomery, David R.
1999-01-01
Textural patches (i.e., grain‐size facies) are commonly observed in gravel‐bed channels and are of significance for both physical and biological processes at subreach scales. We present a general framework for classifying textural patches that allows modification for particular study goals, while maintaining a basic degree of standardization. Textures are classified using a two‐tier system of ternary diagrams that identifies the relative abundance of major size classes and subcategories of the dominant size. An iterative procedure of visual identification and quantitative grain‐size measurement is used. A field test of our classification indicates that it affords reasonable statistical discrimination of median grain size and variance of bed‐surface textures. We also explore the compromise between classification simplicity and accuracy. We find that statistically meaningful textural discrimination requires use of both tiers of our classification. Furthermore, we find that simplified variants of the two‐tier scheme are less accurate but may be more practical for field studies which do not require a high level of textural discrimination or detailed description of grain‐size distributions. Facies maps provide a natural template for stratifying other physical and biological measurements and produce a retrievable and versatile database that can be used as a component of channel monitoring efforts.
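A toy sketch of the two-tier idea: label a patch by its dominant size class, qualified by the subordinate class when it is sufficiently abundant. The class names and the 30% threshold are hypothetical placeholders; the published scheme draws its boundaries on ternary diagrams.

```python
def classify_texture(fractions):
    """Assign a two-tier facies label from size-class fractions summing to 1,
    e.g. {"gravel": 0.55, "sand": 0.35, "fines": 0.10} -> "sand-rich gravel"."""
    ranked = sorted(fractions, key=fractions.get, reverse=True)
    dominant, secondary = ranked[0], ranked[1]
    if fractions[secondary] >= 0.30:        # illustrative qualifying threshold
        return f"{secondary}-rich {dominant}"
    return dominant
```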
Supine or prone position for mini-PNL procedure: does it matter.
Tokatlı, Zafer; Gokce, Mehmet Ilker; Süer, Evren; Sağlam, Remzi
2015-06-01
This study aimed to compare the success and complication rates of the mini-PNL (MPNL) procedure in the supine and prone positions. In this retrospective study, data of 180 patients treated with MPNL in either the supine (n = 54) or prone (n = 126) position between May 2009 and August 2014 were investigated. Success was defined as no visible stones >2 mm. Perioperative complications were classified using the modified Clavien system. Groups were compared with the chi-square test or Student's t test, and p < 0.05 was accepted as statistically significant. Mean age of the population was 42.5 ± 8.2 years and mean stone size was 23.9 ± 4.1 mm. The two groups were similar with regard to demographic and stone-related characteristics except for ASA status. Success rates of the supine and prone groups were 85.1 and 87.3%, respectively (p = 0.701). No statistically significant differences in complications were observed. Mean operative time was the only parameter that differed between the two groups (55 vs 82 min, p = 0.001). The supine position for PNL seems promising, with complication and success rates similar to those of the prone position with the MPNL technique. The only significant benefit of this technique is shorter operative time.
Boomer, Laura A; Watkins, Daniel J; O'Donovan, Julie; Kenney, Brian D; Yates, Andrew R; Besner, Gail E
2015-03-01
Penetrating thoracic trauma is relatively rare in the pediatric population. Embolization of foreign bodies from penetrating trauma is very uncommon. We present a case of a 6-year-old boy with a penetrating foreign body from a projectile dislodged from a lawn mower. Imaging demonstrated a foreign body that embolized to the left pulmonary artery, which was successfully treated non-operatively. We reviewed the penetrating thoracic trauma patients in the trauma registry at our institution between 1/1/03 and 12/31/12. Data collected included demographic data, procedures performed, complications and outcome. Sixty-five patients were identified with a diagnosis of penetrating thoracic trauma. Fourteen of the patients had low velocity penetrating trauma and 51 had high velocity injuries. Patients with high velocity injuries were more likely to be older and less likely to be Caucasian. There were no statistically significant differences between patients with low vs. high velocity injuries regarding severity scores or length of stay. There were no statistically significant differences in procedures required between patients with low and high velocity injuries. Penetrating thoracic trauma is rare in children. The case presented here represents the only report of cardiac foreign body embolus we could identify in a pediatric patient.
Carey, Mark S; Victory, Rahi; Stitt, Larry; Tsang, Nicole
2006-02-01
To compare the association between the Case Mix Group (CMG) code and length of stay (LOS) with the association between the type of procedure and LOS in patients admitted for gynaecology surgery. We examined the records of women admitted for surgery in CMG 579 (major uterine/adnexal procedure, no malignancy) or 577 (major surgery ovary/adnexa with malignancy) between April 1997 and March 1999. Factors thought to influence LOS included age, weight, American Society of Anesthesiologists (ASA) score, physician, day of the week on which surgery was performed, and procedure type. Procedures were divided into six categories, four for CMG 579 and two for CMG 577. Data were abstracted from the hospital information costing system (T2 system) and by retrospective chart review. Multivariable analysis was performed using linear regression with backwards elimination. There were 606 patients in CMG 579 and 101 patients in CMG 577, and the corresponding median LOS was four days (range 1-19) for CMG 579 and nine days (range 3-30) for CMG 577. Combined analysis of both CMGs 577 and 579 revealed the following factors as highly significant determinants of LOS: procedure, age, physician, and ASA score. Although confounded by procedure type, the CMG did not significantly account for differences in LOS in the model if procedure was considered. Pairwise comparisons of procedure categories were all found to be statistically significant, even when controlled for other important variables. The type of procedure better accounts for differences in LOS by describing six statistically distinct procedure groups rather than the traditional two CMGs. It is reasonable therefore to consider changing the current CMG codes for gynaecology to a classification based on the type of procedure.
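As a generic stand-in for the multivariable modelling described above, the hedged sketch below runs p-value-based backwards elimination with statsmodels; the data frame interface and the 0.05 threshold are assumptions, not the study's actual model.

```python
import statsmodels.api as sm

def backward_eliminate(X, y, threshold=0.05):
    """Drop the least significant predictor until all remaining predictors
    have p <= threshold. X is a pandas DataFrame of predictors."""
    cols = list(X.columns)
    while cols:
        model = sm.OLS(y, sm.add_constant(X[cols])).fit()
        pvals = model.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] <= threshold:
            return model, cols              # all survivors significant
        cols.remove(worst)                  # eliminate and refit
    return None, []
```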
Evaluation Using Sequential Trials Methods.
ERIC Educational Resources Information Center
Cohen, Mark E.; Ralls, Stephen A.
1986-01-01
Although dental school faculty as well as practitioners are interested in evaluating products and procedures used in clinical practice, research design and statistical analysis can sometimes pose problems. Sequential trials methods provide an analytical structure that is both easy to use and statistically valid. (Author/MLW)
ERIC Educational Resources Information Center
Mauriello, David
1984-01-01
Reviews an interactive statistical analysis package (designed to run on 8- and 16-bit machines that utilize CP/M 80 and MS-DOS operating systems), considering its features and uses, documentation, operation, and performance. The package consists of 40 general purpose statistical procedures derived from the classic textbook "Statistical…
Development and application of a statistical quality assessment method for dense-graded mixes.
DOT National Transportation Integrated Search
2004-08-01
This report describes the development of the statistical quality assessment method and the procedure for mapping the measures obtained from the quality assessment method to a composite pay factor. The application to dense-graded mixes is demonstrated...
Colegrave, Nick
2017-01-01
A common approach to the analysis of experimental data across much of the biological sciences is test-qualified pooling. Here non-significant terms are dropped from a statistical model, effectively pooling the variation associated with each removed term with the error term used to test hypotheses (or estimate effect sizes). This pooling is carried out only if statistical testing of the term in a more complicated model previously fitted to the data provides motivation for this model simplification; hence the pooling is test-qualified. In pooling, the researcher increases the degrees of freedom of the error term with the aim of increasing statistical power to test their hypotheses of interest. Despite this approach being widely adopted and explicitly recommended by some of the most widely cited statistical textbooks aimed at biologists, here we argue that (except in highly specialized circumstances that we can identify) the hoped-for improvement in statistical power will be small or non-existent, and there is likely to be much reduced reliability of the statistical procedures through deviation of type I error rates from nominal levels. We thus call for greatly reduced use of test-qualified pooling across experimental biology, more careful justification of any use that continues, and a different philosophy for initial selection of statistical models in the light of this change in procedure. PMID:28330912
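The concern is easy to check by simulation. The hedged sketch below estimates the realized type I error for a main effect in a balanced 2x2 layout under the global null, when the interaction is pooled into error whenever it is non-significant; the pooling threshold of 0.25 and all sizes are arbitrary choices for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def anova_2x2(y):
    """Sums of squares for a balanced 2x2 design; y has shape (2, 2, r)."""
    r = y.shape[2]
    g = y.mean()
    ss_a = 2 * r * np.sum((y.mean(axis=(1, 2)) - g) ** 2)
    ss_b = 2 * r * np.sum((y.mean(axis=(0, 2)) - g) ** 2)
    cells = y.mean(axis=2)
    ss_ab = r * np.sum((cells - g) ** 2) - ss_a - ss_b
    ss_e = np.sum((y - cells[:, :, None]) ** 2)
    return ss_a, ss_ab, ss_e, 4 * (r - 1)

def type1_with_pooling(n_sims=5000, r=3, pool_alpha=0.25):
    hits = 0
    for _ in range(n_sims):
        y = rng.normal(size=(2, 2, r))          # all effects truly null
        ss_a, ss_ab, ss_e, df_e = anova_2x2(y)
        p_ab = stats.f.sf(ss_ab / (ss_e / df_e), 1, df_e)
        if p_ab > pool_alpha:                    # test-qualified pooling step
            ms_err, df = (ss_ab + ss_e) / (1 + df_e), 1 + df_e
        else:
            ms_err, df = ss_e / df_e, df_e
        hits += stats.f.sf(ss_a / ms_err, 1, df) < 0.05
    return hits / n_sims                         # compare against nominal 0.05
```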
Deciphering Sources of Variability in Clinical Pathology.
Tripathi, Niraj K; Everds, Nancy E; Schultze, A Eric; Irizarry, Armando R; Hall, Robert L; Provencher, Anne; Aulbach, Adam
2017-01-01
The objectives of this session were to explore causes of variability in clinical pathology data due to preanalytical and analytical variables as well as study design and other procedures that occur in toxicity testing studies. The presenters highlighted challenges associated with such variability in differentiating test article-related effects from the effects of experimental procedures and its impact on overall data interpretation. These presentations focused on preanalytical and analytical variables and study design-related factors and their influence on clinical pathology data, and the importance of various factors that influence data interpretation, including statistical analysis and reference intervals. Overall, these presentations touched upon the potential effects of many variables on clinical pathology parameters, including animal physiology, sample collection process, specimen handling and analysis, and study design, and offered some discussion points on how to manage those variables to ensure accurate interpretation of clinical pathology data in toxicity studies. This article is a brief synopsis of presentations given in a session entitled "Deciphering Sources of Variability in Clinical Pathology-It's Not Just about the Numbers" that occurred at the 35th Annual Symposium of the Society of Toxicologic Pathology in San Diego, California.
NASA Technical Reports Server (NTRS)
Colwell, R. N. (Principal Investigator); Hay, C. M.; Thomas, R. W.; Benson, A. S.
1977-01-01
Progress in the evaluation of the static stratification procedure and the development of alternative photointerpretive techniques to the present LACIE procedure for the identification of training fields is reported. Statistically significant signature controlling variables were defined for use in refining the stratification procedure. A subset of the 1973-74 Kansas LACIE segments for wheat was analyzed.
Advances in Significance Testing for Cluster Detection
NASA Astrophysics Data System (ADS)
Coleman, Deidra Andrea
Over the past two decades, much attention has been given to data-driven project goals such as the Human Genome Project and the development of syndromic surveillance systems. A major component of these types of projects is analyzing the abundance of data. Detecting clusters within the data can be beneficial as it can lead to the identification of specified sequences of DNA nucleotides that are related to important biological functions or the locations of epidemics such as disease outbreaks or bioterrorism attacks. Cluster detection techniques require efficient and accurate hypothesis testing procedures. In this dissertation, we improve upon the hypothesis testing procedures for cluster detection by enhancing distributional theory and providing an alternative method for spatial cluster detection using syndromic surveillance data. In Chapter 2, we provide an efficient method to compute the exact distribution of the number and coverage of h-clumps of a collection of words. This method involves defining a Markov chain using a minimal deterministic automaton to reduce the number of states needed for computation. We allow words of the collection to contain other words of the collection, making the method more general. We use our method to compute the distributions of the number and coverage of h-clumps in the Chi motif of H. influenzae. In Chapter 3, we provide an efficient algorithm to compute the exact distribution of multiple window discrete scan statistics for higher-order, multi-state Markovian sequences. This algorithm involves defining a Markov chain to efficiently keep track of probabilities needed to compute p-values of the statistic. We use our algorithm to identify cases where the available approximation does not perform well. We also use our algorithm to detect unusual clusters of made free throw shots by National Basketball Association players during the 2009-2010 regular season. In Chapter 4, we give a procedure to detect outbreaks using syndromic surveillance data while controlling the Bayesian False Discovery Rate (BFDR). The procedure entails choosing an appropriate Bayesian model that captures the spatial dependency inherent in epidemiological data and considers all days of interest, selecting a test statistic based on a chosen measure that provides the magnitude of the maximal spatial cluster for each day, and identifying a cutoff value that controls the BFDR for rejecting the collective null hypothesis of no outbreak over a collection of days for a specified region. We use our procedure to analyze botulism-like syndrome data collected by the North Carolina Disease Event Tracking and Epidemiologic Collection Tool (NC DETECT).
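For the scan-statistic component, here is a hedged simulation stand-in for the exact Markov-chain computation: estimate the p-value of the largest windowed count in a Bernoulli sequence by Monte Carlo. The window length and success probability are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def scan_stat(x, w):
    """Largest count of ones inside any window of length w."""
    return int(np.convolve(x, np.ones(w, dtype=int), mode="valid").max())

def scan_pvalue(x, w, p, n_sims=10_000):
    """Monte Carlo p-value of the observed scan statistic under an i.i.d.
    Bernoulli(p) null (the dissertation computes this distribution exactly,
    and for Markovian rather than i.i.d. sequences)."""
    observed = scan_stat(x, w)
    null = [scan_stat((rng.random(len(x)) < p).astype(int), w)
            for _ in range(n_sims)]
    return float(np.mean(np.asarray(null) >= observed))
```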
Branzetti, Jeremy B; Adedipe, Adeyinka A; Gittinger, Matthew J; Rosenman, Elizabeth D; Brolliar, Sarah; Chipman, Anne K; Grand, James A; Fernandez, Rosemarie
2017-11-01
A subset of high-risk procedures presents significant safety threats due to their (1) infrequent occurrence, (2) execution under time constraints, and (3) immediate necessity for patient survival. A Just-in-Time (JIT) intervention could provide real-time bedside guidance to improve high-risk procedural performance and address procedural deficits associated with skill decay. To evaluate the impact of a novel JIT intervention on transvenous pacemaker (TVP) placement during a simulated patient event. This was a prospective, randomised controlled study to determine the effect of a JIT intervention on performance of TVP placement. Subjects included board-certified emergency medicine physicians from two hospitals. The JIT intervention consisted of a portable, bedside computer-based procedural adjunct. The primary outcome was performance during a simulated patient encounter requiring TVP placement, as assessed by trained raters using a technical skills checklist. Secondary outcomes included global performance ratings, time to TVP placement, number of critical omissions and System Usability Scale scores (intervention only). Groups were similar at baseline across all outcomes. Compared with the control group, the intervention group demonstrated statistically significant improvement in the technical checklist score (11.45 vs 23.44, p<0.001, Cohen's d effect size 4.64), the global rating scale (2.27 vs 4.54, p<0.001, Cohen's d effect size 3.76), and a statistically significant reduction in critical omissions (2.23 vs 0.68, p<0.001, Cohen's d effect size -1.86). The difference in time to procedural completion was not statistically significant between conditions (11.15 min vs 12.80 min, p=0.12, Cohen's d effect size 0.65). System Usability Scale scores demonstrated excellent usability. A JIT intervention improved procedural performance, suggesting a role for JIT interventions in rarely performed procedures.
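The effect sizes quoted above follow the usual pooled-standard-deviation definition; a minimal sketch, assuming two independent groups of scores:

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d = (mean(a) - mean(b)) / pooled SD."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1)
                  + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)
```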
Differences in the bioenergetic potential of athletes participating in team sports.
Malacko, Julijan; Doder, Dragan; Djurdjević, Slavisa; Savić, Biljana; Doder, Radoslava
2013-07-01
In modern training technology, assessment of the aerobic bioenergetic potential of athletes is commonly performed by standard laboratory procedures to determine basic or specific functional abilities for a given sport activity or discipline. The aim of the study was to assess the aerobic bioenergetic potential of athletes participating in basketball, football and handball. The study included 87 athletes (29 basketball players, 29 football players, and 29 handball players) aged 21-24. Evaluation of the aerobic bioenergetic potential of the athletes was performed using both univariate (ANOVA) and multivariate (MANOVA) statistical methods to determine differences among the athletes in relative (VO2 mL/kg/min) and absolute oxygen consumption (VO2 L/min). Statistically significant differences in absolute and relative oxygen consumption were found among basketball players (Mb), football players (Mf), and handball players (Mh) (MANOVA, p = 0.00). ANOVA also revealed significant differences in relative oxygen consumption (VO2 mL/kg/min) (p = 0.00). The football players (55.32 mL/kg/min) had the highest relative oxygen consumption, followed by the handball players (51.84 mL/kg/min) and basketball players (47.00 mL/kg/min). The highest absolute oxygen consumption was recorded in the basketball players (4.47 L/min), followed by the handball players (4.40 L/min) and football players (4.16 L/min). Statistically significant differences in aerobic bioenergetic potential, expressed by relative oxygen consumption, were found among athletes participating in different team sports. It can be assumed that players from sports in which a greater total distance is covered during a match have a greater need for aerobic capacity.
Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses
Stephen, Emily P.; Lepage, Kyle Q.; Eden, Uri T.; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S.; Guenther, Frank H.; Kramer, Mark A.
2014-01-01
The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty—both in the functional network edges and the corresponding aggregate measures of network topology—are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here—appropriate for static and dynamic network inference and different statistical measures of coupling—permits the evaluation of confidence in network measures in a variety of settings common to neuroscience. PMID:24678295
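A minimal sketch of the trial-resampling strategy, assuming recordings of shape (trials, sensors, samples) and zero-lag correlation as the coupling statistic; the paper's framework admits other coupling measures and network summaries equally.

```python
import numpy as np

rng = np.random.default_rng(3)

def edge_confidence(trials, n_boot=1000, alpha=0.05):
    """Bootstrap confidence intervals for functional-network edges by
    resampling trials with replacement."""
    def network(t):
        return np.corrcoef(t.mean(axis=0))     # sensors x sensors coupling
    n = trials.shape[0]
    boots = np.stack([network(trials[rng.integers(0, n, size=n)])
                      for _ in range(n_boot)])
    lower = np.quantile(boots, alpha / 2, axis=0)
    upper = np.quantile(boots, 1 - alpha / 2, axis=0)
    return network(trials), lower, upper       # point estimate + CI per edge
```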
Harrysson, Iliana J; Cook, Jonathan; Sirimanna, Pramudith; Feldman, Liane S; Darzi, Ara; Aggarwal, Rajesh
2014-07-01
To determine how minimally invasive surgical learning curves are assessed and define an ideal framework for this assessment. Learning curves have implications for training and adoption of new procedures and devices. In 2000, Ramsay et al reviewed the learning curve literature and called for improved reporting and statistical evaluation of learning curves. Since then, a body of literature has emerged on learning curves, but the presentation and analysis vary. A systematic search was performed of MEDLINE, EMBASE, ISI Web of Science, ERIC, and the Cochrane Library from 1985 to August 2012. The inclusion criteria were minimally invasive abdominal surgery with formal analysis of the learning curve, and English language. Of the identified studies, 592 (11.1%) met the selection criteria. Time is the most commonly used proxy for the learning curve (508, 86%). Intraoperative outcomes were used in 316 (53%) of the articles, postoperative outcomes in 306 (52%), technical skills in 102 (17%), and patient-oriented outcomes in 38 (6%) articles. Over time, there was evidence of an increase in the relative amount of laparoscopic and robotic studies (P < 0.001) without statistical evidence of a change in the complexity of analysis (P = 0.121). Assessment of learning curves is needed to inform surgical training and evaluate new clinical procedures. An ideal analysis would account for the degree of complexity of individual cases and the inherent differences between surgeons. There is no single proxy that best represents the success of surgery, and hence multiple outcomes should be collected.
Comparison of Sample Size by Bootstrap and by Formulas Based on Normal Distribution Assumption.
Wang, Zuozhen
2018-01-01
Bootstrapping is distribution-independent, which provides an indirect way to estimate the sample size for a clinical trial from a relatively small sample. In this paper, sample size estimation to compare two parallel-design arms for continuous data by a bootstrap procedure is presented for various test types (inequality, non-inferiority, superiority, and equivalence). Meanwhile, sample size calculations by mathematical formulas (under the normal distribution assumption) for the identical data are also carried out. Consequently, the power difference between the two calculation methods is acceptably small for all test types, showing that the bootstrap procedure is a credible technique for sample size estimation. After that, we compared the powers determined using the two methods on data that violate the normal distribution assumption. To accommodate this feature of the data, the nonparametric Wilcoxon test was applied to compare the two groups during bootstrap power estimation. As a result, the power estimated by the normal distribution-based formula is far larger than that estimated by bootstrap for each specific sample size per group. Hence, for this type of data, it is preferable that the bootstrap method be applied for sample size calculation from the beginning, and that the same statistical method as used in the subsequent statistical analysis be employed for each bootstrap sample during bootstrap sample size estimation, provided historical data are available that are representative of the population to which the proposed trial plans to extrapolate.
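A hedged sketch of the core loop, assuming pilot data from each arm: resample at a candidate per-arm size, test with the Wilcoxon rank-sum statistic (as the paper does for non-normal data), and read off the rejection fraction as power; the size is then increased until the target power is met.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def bootstrap_power(pilot_a, pilot_b, n_per_arm, n_boot=2000, alpha=0.05):
    """Fraction of bootstrap replicates in which the Wilcoxon rank-sum
    (Mann-Whitney) test rejects at level alpha."""
    hits = 0
    for _ in range(n_boot):
        a = rng.choice(pilot_a, size=n_per_arm, replace=True)
        b = rng.choice(pilot_b, size=n_per_arm, replace=True)
        hits += stats.mannwhitneyu(a, b, alternative="two-sided").pvalue < alpha
    return hits / n_boot
```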
Reddy, Vivek Y; Holmes, David; Doshi, Shephal K; Neuzil, Petr; Kar, Saibal
2011-02-01
The Watchman Left Atrial Appendage System for Embolic Protection in Patients With AF (PROTECT AF) randomized trial compared left atrial appendage closure against warfarin in atrial fibrillation (AF) patients with CHADS₂ ≥1. Although the study met the primary efficacy end point of being noninferior to warfarin therapy for the prevention of stroke/systemic embolism/cardiovascular death, there was a significantly higher risk of complications, predominantly pericardial effusion and procedural stroke related to air embolism. Here, we report the influence of experience on the safety of percutaneous left atrial appendage closure. The study cohort for this analysis included patients in the PROTECT AF trial who underwent attempted device left atrial appendage closure (n=542 patients) and those from a subsequent nonrandomized registry of patients undergoing Watchman implantation (Continued Access Protocol [CAP] Registry; n=460 patients). The safety end point included bleeding- and procedure-related events (pericardial effusion, stroke, device embolization). There was a significant decline in the rate of procedure- or device-related safety events within 7 days of the procedure across the 2 studies, with 7.7% and 3.7% of patients, respectively, experiencing events (P=0.007), and between the first and second halves of PROTECT AF and CAP, with 10.0%, 5.5%, and 3.7% of patients, respectively, experiencing events (P=0.006). The rate of serious pericardial effusion within 7 days of implantation, which had made up >50% of the safety events in PROTECT AF, was lower in the CAP Registry (5.0% versus 2.2%, respectively; P=0.019). There was a similar experience-related improvement in procedure-related stroke (0.9% versus 0%, respectively; P=0.039). Finally, the functional impact of these safety events, as defined by significant disability or death, was statistically superior in the Watchman group compared with the warfarin group in PROTECT AF. This remained true whether significance was defined as a change in the modified Rankin score of ≥1, ≥2, or ≥3 (1.8 versus 4.3 events per 100 patient-years; relative risk, 0.43; 95% confidence interval, 0.24 to 0.82; 1.5 versus 3.7 events per 100 patient-years; relative risk, 0.41; 95% confidence interval, 0.22 to 0.82; and 1.4 versus 3.3 events per 100 patient-years; relative risk, 0.43; 95% confidence interval, 0.22 to 0.88, respectively). As with all interventional procedures, there is a significant improvement in the safety of Watchman left atrial appendage closure with increased operator experience. Clinical Trial Registration- URL: http://clinicaltrials.gov. Unique identifier: NCT00129545.
Space Shuttle Main Engine performance analysis
NASA Technical Reports Server (NTRS)
Santi, L. Michael
1993-01-01
For a number of years, NASA has relied primarily upon periodically updated versions of Rocketdyne's power balance model (PBM) to provide space shuttle main engine (SSME) steady-state performance prediction. A recent computational study indicated that PBM predictions do not satisfy fundamental energy conservation principles. More recently, SSME test results provided by the Technology Test Bed (TTB) program have indicated significant discrepancies between PBM flow and temperature predictions and TTB observations. Results of these investigations have diminished confidence in the predictions provided by PBM, and motivated the development of new computational tools for supporting SSME performance analysis. A multivariate least squares regression algorithm was developed and implemented during this effort in order to efficiently characterize TTB data. This procedure, called the 'gains model,' was used to approximate the variation of SSME performance parameters such as flow rate, pressure, temperature, speed, and assorted hardware characteristics in terms of six assumed independent influences. These six influences were engine power level, mixture ratio, fuel inlet pressure and temperature, and oxidizer inlet pressure and temperature. A BFGS optimization algorithm provided the base procedure for determining regression coefficients for both linear and full quadratic approximations of parameter variation. Statistical information relative to data deviation from regression-derived relations was also computed. A new strategy for integrating test data with theoretical performance prediction was also investigated. The current integration procedure employed by PBM treats test data as pristine and adjusts hardware characteristics in a heuristic manner to achieve engine balance. Within PBM, this integration procedure is called 'data reduction.' By contrast, the new data integration procedure, termed 'reconciliation,' uses mathematical optimization techniques, and requires both measurement and balance uncertainty estimates. The reconciler attempts to select operational parameters that minimize the difference between theoretical prediction and observation. Selected values are further constrained to fall within measurement uncertainty limits and to satisfy fundamental physical relations (mass conservation, energy conservation, pressure drop relations, etc.) within uncertainty estimates for all SSME subsystems. The parameter selection problem described above is a traditional nonlinear programming problem. The reconciler employs a mixed penalty method to determine optimum values of SSME operating parameters associated with this problem formulation.
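In spirit, reconciliation is a weighted fit to the measurements subject to physical balances; the hedged sketch below uses a plain quadratic penalty in place of the mixed penalty method described, and the toy mass balance is an assumption for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def reconcile(measured, sigma, balance_residual, penalty=1e3):
    """Choose parameters close to the measurements (weighted by measurement
    uncertainty) while driving physical balance residuals toward zero."""
    def objective(x):
        mismatch = np.sum(((x - measured) / sigma) ** 2)
        return mismatch + penalty * np.sum(balance_residual(x) ** 2)
    return minimize(objective, x0=measured, method="BFGS").x

# Toy balance: branch flows must sum to the total flow (hypothetical example).
residual = lambda x: np.array([x[0] + x[1] - x[2]])
x_hat = reconcile(np.array([10.2, 5.1, 14.9]),
                  np.array([0.2, 0.1, 0.3]), residual)
```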
Ortiz, Gabriela S; OʼConnor, Todd; Carey, Jessa; Vella, Adam; Paul, Audrey; Rode, Diane; Weinberg, Alan
2017-02-21
Child life specialists and music therapists have a unique and integral role in providing psychosocial care to pediatric patients and families. These professionals are trained to provide clinical interventions that support coping and adjustment and reduce the risk of psychological trauma related to hospital visits and health care encounters. The researchers devised a multimodal approach using a combined child life and music therapy intervention to address procedure-related distress in patients receiving intravenous (IV) placement in the pediatric emergency department. The aim of this study was to investigate the efficacy of this collaborative intervention by evaluating parental perception of their child's distress. This study was a prospective analysis investigating the impact of a child life and music therapy intervention on children aged 4 to 11 years receiving an IV placement in the pediatric emergency department. Efficacy was evaluated by comparing scores between a 4-question pretest and a subsequent 4-question posttest that asked the child's parent to evaluate how they anticipated their child would respond to the procedure, and then to evaluate how they perceived their child to have responded after the procedure. Qualitative data were collected in the form of open-ended comments, which were accommodated at the end of the posttest. Data were analyzed by the Cochran-Mantel-Haenszel method for testing repeated ordinal responses and the PROC GENMOD procedure in the SAS system software. A total of 41 participants were enrolled in this study. Results of the statistical analysis revealed significant differences between all pre- and posttest scores (P < 0.05), and a significant likelihood that the patient would improve on the 4 questions as a result of the child life and music therapy intervention. Improvement was demonstrated across all 4 questions, suggesting that the child life and music therapy intervention supported healthy, adaptive coping and helped to minimize distress experienced by patients during IV placement. These results underscore the importance and potential clinical impact of child life psychological preparation and psychotherapy-based music therapy interventions in reducing distress in pediatric patients during common medical procedures.
Babiloni, Claudio; Pennica, Alfredo; Del Percio, Claudio; Noce, Giuseppe; Cordone, Susanna; Muratori, Chiara; Ferracuti, Stefano; Donato, Nicole; Di Campli, Francesco; Gianserra, Laura; Teti, Elisabetta; Aceti, Antonio; Soricelli, Andrea; Viscione, Magdalena; Limatola, Cristina; Andreoni, Massimo; Onorati, Paolo
2016-03-01
This study tested a simple statistical procedure to recognize single treatment-naïve HIV individuals having abnormal cortical sources of resting state delta (<4 Hz) and alpha (8-13 Hz) electroencephalographic (EEG) rhythms with reference to a control group of sex-, age-, and education-matched healthy individuals. Compared to the HIV individuals with a statistically normal EEG marker, those with abnormal values were expected to show worse cognitive status. Resting state eyes-closed EEG data were recorded in 82 treatment-naïve HIV (39.8 ys.±1.2 standard error mean, SE) and 59 age-matched cognitively healthy subjects (39 ys.±2.2 SE). Low-resolution brain electromagnetic tomography (LORETA) estimated delta and alpha sources in frontal, central, temporal, parietal, and occipital cortical regions. The ratio of the activity of parietal delta and high-frequency alpha sources (the EEG marker) showed the maximum difference between the healthy and the treatment-naïve HIV groups. The z-score of the EEG marker was statistically abnormal in 47.6% of treatment-naïve HIV individuals with reference to the healthy group (p<0.05). Compared to the HIV individuals with a statistically normal EEG marker, those with abnormal values exhibited lower Mini-Mental State Examination (MMSE) scores, higher CD4 counts, and lower viral loads (p<0.05). This statistical procedure permitted, for the first time, the identification of single treatment-naïve HIV individuals having abnormal EEG activity. This procedure might enrich the detection and monitoring of the effects of HIV on brain function in single treatment-naïve HIV individuals.
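The single-subject logic reduces to a z-score against the control distribution; a minimal sketch, assuming the marker values have already been computed per individual and that larger values are the abnormal direction:

```python
import numpy as np

def flag_abnormal(marker_patients, marker_controls, z_crit=1.645):
    """One-sided z-score screen at p < 0.05 against the control group."""
    mu = marker_controls.mean()
    sd = marker_controls.std(ddof=1)
    z = (marker_patients - mu) / sd
    return z, z > z_crit        # z-scores and abnormality flags
```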
Fuzzy Classification of Ocean Color Satellite Data for Bio-optical Algorithm Constituent Retrievals
NASA Technical Reports Server (NTRS)
Campbell, Janet W.
1998-01-01
The ocean has traditionally been viewed as a two-class system. Morel and Prieur (1977) classified ocean water according to the dominant absorbent particle suspended in the water column. Case 1 is described as having a high concentration of phytoplankton (and detritus) relative to other particles. Conversely, case 2 is described as having inorganic particles, such as suspended sediments, in high concentrations. Little work has gone into the problem of mixing bio-optical models for these different water types. An approach is put forth here to blend bio-optical algorithms based on a fuzzy classification scheme. This scheme involves two procedures. First, a clustering procedure identifies classes and builds class statistics from in-situ optical measurements. Next, a classification procedure assigns satellite pixels partial memberships to these classes based on their ocean color reflectance signature. These membership assignments can be used as the basis for weighting retrievals from class-specific bio-optical algorithms. This technique is demonstrated with in-situ optical measurements and an image from the SeaWiFS ocean color satellite.
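A hedged sketch of the two procedures, assuming per-class means and covariances from the in-situ clustering step: memberships come from the chi-square tail of each pixel's Mahalanobis distance to a class, and retrievals from class-specific algorithms are blended with those weights. Interfaces and names are illustrative.

```python
import numpy as np
from scipy.stats import chi2

def memberships(x, class_means, class_covs):
    """Graded memberships of reflectance vector x in each optical class;
    they need not sum to 1 (a pixel may fit no class well)."""
    f = []
    for mean, cov in zip(class_means, class_covs):
        d2 = (x - mean) @ np.linalg.solve(cov, x - mean)   # squared Mahalanobis
        f.append(chi2.sf(d2, df=x.size))
    return np.asarray(f)

def blended_retrieval(x, class_means, class_covs, algorithms):
    """Membership-weighted blend of class-specific bio-optical retrievals;
    assumes at least one class has non-zero membership."""
    w = memberships(x, class_means, class_covs)
    w = w / w.sum()
    return sum(wi * alg(x) for wi, alg in zip(w, algorithms))
```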
The United States Environmental Protection Agency's (EPA) Office of Ground Water and Drinking Water (OGWDW) has developed a single-laboratory quantitation procedure: the lowest concentration minimum reporting level (LCMRL). The LCMRL is the lowest true concentration for which fu...
DOT National Transportation Integrated Search
2008-10-01
This research effort sought to develop statistically justifiable means for developing a schedule of liquidated damages (LD) rates to be adopted by the Alabama Department of Transportation (ALDOT). The procedure outlined is to be used to review and...
14 CFR 21.303 - Replacement and modification parts.
Code of Federal Regulations, 2010 CFR
2010-01-01
... determination can be made. Statistical quality control procedures may be employed where it is shown that a... AIRCRAFT CERTIFICATION PROCEDURES FOR PRODUCTS AND PARTS Approval of Materials, Parts, Processes, and... the configuration of the part; and (ii) Information on dimensions, materials, and processes necessary...