Sample records for statistical analysis determined

  1. Recent statistical methods for orientation data

    NASA Technical Reports Server (NTRS)

    Batschelet, E.

    1972-01-01

    The application of statistical methods in the areas of animal orientation and navigation is discussed. The method employed is limited to the two-dimensional case. Various tests for determining the validity of the statistical analysis are presented. Mathematical models are included to support the theoretical considerations, and tables of data are developed to show the value of information obtained by statistical analysis.
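
    A minimal sketch (not from the report) of the kind of two-dimensional orientation test covered by these methods: a Rayleigh test for non-uniformity of circular heading data, using the standard large-sample p-value approximation. The headings below are simulated.

      import numpy as np

      def rayleigh_test(angles_rad):
          """Rayleigh test for a unimodal departure from uniformity in
          circular (orientation) data; angles are given in radians."""
          n = len(angles_rad)
          C = np.sum(np.cos(angles_rad))
          S = np.sum(np.sin(angles_rad))
          R = np.sqrt(C**2 + S**2) / n      # mean resultant length
          Z = n * R**2                      # Rayleigh statistic
          # Standard approximation to the p-value (adequate for moderate n)
          p = np.exp(-Z) * (1 + (2*Z - Z**2) / (4*n)
                            - (24*Z - 132*Z**2 + 76*Z**3 - 9*Z**4) / (288*n**2))
          return R, Z, float(np.clip(p, 0.0, 1.0))

      # Example: 40 simulated animal headings clustered around 30 degrees
      rng = np.random.default_rng(0)
      headings = np.deg2rad(rng.normal(30, 25, size=40))
      R, Z, p = rayleigh_test(headings)
      print(f"mean resultant length R = {R:.3f}, Z = {Z:.2f}, p ~ {p:.4f}")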

  2. Analysis and Report on SD2000: A Workshop to Determine Structural Dynamics Research for the Millenium

    DTIC Science & Technology

    2000-04-10

    interest. These include Statistical Energy Analysis (SEA), fuzzy structure theory, and approaches combining modal analysis and SEA. Non-determinism... arising with increasing frequency. This has led to Statistical Energy Analysis, in which a system is modelled as a collection of coupled subsystems... IUTAM Symposium on Statistical Energy Analysis, 1999, Ed. F.J. Fahy and W.G. Price, Kluwer Academic Publishing... R.S. Langley and P

  3. Tri-Center Analysis: Determining Measures of Trichotomous Central Tendency for the Parametric Analysis of Tri-Squared Test Results

    ERIC Educational Resources Information Center

    Osler, James Edward

    2014-01-01

    This monograph provides an epistemological rationale for the design of a novel post hoc statistical measure called "Tri-Center Analysis". This new statistic is designed to analyze the post hoc outcomes of the Tri-Squared Test. In Tri-Center Analysis, trichotomous parametric inferential statistical measures are calculated from…

  4. Statistical model to perform error analysis of curve fits of wind tunnel test data using the techniques of analysis of variance and regression analysis

    NASA Technical Reports Server (NTRS)

    Alston, D. W.

    1981-01-01

    The objective of this research was to design a statistical model that could perform an error analysis of curve fits of wind tunnel test data using analysis of variance and regression analysis techniques. Four related subproblems were defined, and by solving each of these a solution to the general research problem was obtained. The capabilities of the resulting statistical model are considered. The least squares fit is used to determine the nature of the force, moment, and pressure data. The order of the curve fit is increased in order to delete the quadratic effect in the residuals. The analysis of variance is used to determine the magnitude and effect of the error factor associated with the experimental data.
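
    A rough illustration (not the report's model or data) of combining a least-squares curve fit with an analysis-of-variance style test: a nested-model F-test that asks whether adding a quadratic term to a wind-tunnel-style force curve is statistically justified.

      import numpy as np
      from scipy import stats

      def extra_sum_of_squares_F(x, y, low_order, high_order):
          """F-test comparing nested polynomial least-squares fits."""
          n = len(y)
          rss = []
          for order in (low_order, high_order):
              coeffs = np.polyfit(x, y, order)
              rss.append(np.sum((y - np.polyval(coeffs, x)) ** 2))
          df1 = high_order - low_order          # extra parameters in the larger model
          df2 = n - (high_order + 1)            # residual dof of the larger model
          F = ((rss[0] - rss[1]) / df1) / (rss[1] / df2)
          return F, stats.f.sf(F, df1, df2)

      # Illustrative data: force coefficient vs. angle of attack with a mild quadratic term
      rng = np.random.default_rng(1)
      alpha = np.linspace(-4, 12, 30)
      force = 0.08 * alpha + 0.004 * alpha**2 + rng.normal(0, 0.05, alpha.size)
      F, p = extra_sum_of_squares_F(alpha, force, low_order=1, high_order=2)
      print(f"F = {F:.2f}, p = {p:.4f}  (small p favors keeping the quadratic term)")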

  5. Statistical Analysis For Nucleus/Nucleus Collisions

    NASA Technical Reports Server (NTRS)

    Mcguire, Stephen C.

    1989-01-01

    Report describes use of several statistical techniques to characterize angular distributions of secondary particles emitted in collisions of atomic nuclei in energy range of 24 to 61 GeV per nucleon. Purpose of statistical analysis is to determine correlations between intensities of emitted particles and angles confirming existence of quark/gluon plasma.

  6. The Shock and Vibration Digest. Volume 14, Number 12

    DTIC Science & Technology

    1982-12-01

    to evaluate the uses of statistical energy analysis for determining sound transmission performance. Coupling loss factors were measured and compared... measurements for the artificial (Also see No. 2623) cracks in mild-steel test pieces. 82-2676 Improvement of the Method of Statistical Energy Analysis for... eters, using a large number of free-response time histories. In the application of the statistical energy analysis theory simultaneously in one analysis

  7. Linearised and non-linearised isotherm models optimization analysis by error functions and statistical means

    PubMed Central

    2014-01-01

    In adsorption studies, describing the sorption process and identifying the best-fitting isotherm model are key steps in testing the theoretical hypothesis. Hence, numerous statistical analyses have been extensively used to compare the experimental equilibrium adsorption values with the predicted equilibrium values. Several statistical error analyses were carried out. In the present study, the following statistical analyses were used to evaluate the fitness of the adsorption isotherm models: the Pearson correlation, the coefficient of determination and the Chi-square test. An ANOVA test was carried out to evaluate the significance of the various error functions, and the coefficient of dispersion was also evaluated for the linearised and non-linearised models. The adsorption of phenol onto natural soil (local name Kalathur soil) was carried out in batch mode at 30 ± 2 °C. To obtain a holistic view of the analysis, the linear and non-linear isotherm models were compared when estimating the isotherm parameters. The above-mentioned error functions and statistical measures were used to determine the best-fitting isotherm. PMID:25018878
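
    A small illustration of the kind of non-linear isotherm fitting and error-function evaluation described above, assuming a Langmuir isotherm and invented phenol-adsorption values (the study's models and data are not reproduced here).

      import numpy as np
      from scipy.optimize import curve_fit

      def langmuir(Ce, qmax, KL):
          """Langmuir isotherm: qe = qmax*KL*Ce / (1 + KL*Ce)."""
          return qmax * KL * Ce / (1.0 + KL * Ce)

      # Illustrative equilibrium data (Ce in mg/L, qe in mg/g) -- invented values
      Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
      qe = np.array([1.8, 3.1, 4.9, 6.6, 7.9, 8.6])

      popt, _ = curve_fit(langmuir, Ce, qe, p0=[10.0, 0.05])
      qe_pred = langmuir(Ce, *popt)

      ss_res = np.sum((qe - qe_pred) ** 2)
      ss_tot = np.sum((qe - qe.mean()) ** 2)
      r2 = 1.0 - ss_res / ss_tot                        # coefficient of determination
      chi_sq = np.sum((qe - qe_pred) ** 2 / qe_pred)    # non-linear chi-square error function

      print(f"qmax = {popt[0]:.2f} mg/g, KL = {popt[1]:.3f} L/mg, "
            f"R^2 = {r2:.3f}, chi^2 = {chi_sq:.3f}")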

  8. Study designs, use of statistical tests, and statistical analysis software choice in 2015: Results from two Pakistani monthly Medline indexed journals.

    PubMed

    Shaikh, Masood Ali

    2017-09-01

    Assessment of research articles in terms of study designs used, statistical tests applied and the use of statistical analysis programmes helps determine the research activity profile and trends in the country. In this descriptive study, all original articles published by Journal of Pakistan Medical Association (JPMA) and Journal of the College of Physicians and Surgeons Pakistan (JCPSP) in the year 2015 were reviewed in terms of study designs used, application of statistical tests, and the use of statistical analysis programmes. JPMA and JCPSP published 192 and 128 original articles, respectively, in the year 2015. Results of this study indicate that the cross-sectional study design, bivariate inferential statistical analysis entailing comparison between two variables/groups, and the statistical software programme SPSS were the most common study design, inferential statistical analysis, and statistical analysis software programme, respectively. These results echo previously published assessment of these two journals for the year 2014.

  9. Statistics 101 for Radiologists.

    PubMed

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
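
    As a quick companion to the diagnostic-test concepts listed above, a hedged sketch (with invented counts) of computing sensitivity, specificity, accuracy, and likelihood ratios from a 2x2 table:

      # Illustrative 2x2 table; the counts are invented, not from the review
      tp, fp, fn, tn = 90, 30, 10, 170

      sensitivity = tp / (tp + fn)                    # P(test positive | disease)
      specificity = tn / (tn + fp)                    # P(test negative | no disease)
      accuracy = (tp + tn) / (tp + fp + fn + tn)
      lr_pos = sensitivity / (1 - specificity)        # positive likelihood ratio
      lr_neg = (1 - sensitivity) / specificity        # negative likelihood ratio

      print(f"Sens {sensitivity:.2f}, Spec {specificity:.2f}, Acc {accuracy:.2f}, "
            f"LR+ {lr_pos:.2f}, LR- {lr_neg:.2f}")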

  10. Statistical analysis of 59 inspected SSME HPFTP turbine blades (uncracked and cracked)

    NASA Technical Reports Server (NTRS)

    Wheeler, John T.

    1987-01-01

    The numerical results of a statistical analysis of test data from Space Shuttle Main Engine high-pressure fuel turbopump second-stage turbine blades, including some with cracks, are presented. Several statistical methods are applied to the test data to assess differences in frequency variations between the uncracked and cracked blades.

  11. Why campaigns for local transportation funding initiatives succeed or fail : an analysis of four communities and national data

    DOT National Transportation Integrated Search

    2000-06-01

    This report uses statistical analysis of community-level characteristics and qualitatively focused case studies to explore what determines the success of local transportation-related tax measures. The report contains both a statistical analysis of lo...

  12. Why Campaigns for Local Transportation Funding Initiatives Succeed or Fail: An Analysis of Four Communities and National Data (PDF file)

    DOT National Transportation Integrated Search

    2000-06-01

    This report uses statistical analysis of community-level characteristics and qualitatively focused case studies to explore what determines the success of local transportation-related tax measures. The report contains both a statistical analysis of lo...

  13. Communications Link Characterization Experiment (CLCE) technical data report, volume 2

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The results are presented of the long-term rain rate statistical analysis and of the investigation to determine the worst-month statistics from the measured attenuation data caused by precipitation. The rain rate statistics cover a period of 11 months, from July of 1974 to May of 1975, for measurements taken at the NASA Rosman station. The rain rate statistical analysis is a continuation of the analysis of the rain rate data accumulated for the ATS-6 Millimeter Wave Propagation Experiment. The statistical characteristics of the rain rate data through December of 1974 are also presented for the above experiment.

  14. Systems and methods for detection of blowout precursors in combustors

    DOEpatents

    Lieuwen, Tim C.; Nair, Suraj

    2006-08-15

    The present invention comprises systems and methods for detecting flame blowout precursors in combustors. The blowout precursor detection system comprises a combustor, a pressure measuring device, and blowout precursor detection unit. A combustion controller may also be used to control combustor parameters. The methods of the present invention comprise receiving pressure data measured by an acoustic pressure measuring device, performing one or a combination of spectral analysis, statistical analysis, and wavelet analysis on received pressure data, and determining the existence of a blowout precursor based on such analyses. The spectral analysis, statistical analysis, and wavelet analysis further comprise their respective sub-methods to determine the existence of blowout precursors.

  15. A note on generalized Genome Scan Meta-Analysis statistics

    PubMed Central

    Koziol, James A; Feng, Anne C

    2005-01-01

    Background Wise et al. introduced a rank-based statistical technique for meta-analysis of genome scans, the Genome Scan Meta-Analysis (GSMA) method. Levinson et al. recently described two generalizations of the GSMA statistic: (i) a weighted version of the GSMA statistic, so that different studies could be ascribed different weights for analysis; and (ii) an order statistic approach, reflecting the fact that a GSMA statistic can be computed for each chromosomal region or bin width across the various genome scan studies. Results We provide an Edgeworth approximation to the null distribution of the weighted GSMA statistic, we examine the limiting distribution of the GSMA statistics under the order statistic formulation, and we quantify the effect of the pairwise correlations of the GSMA statistics across different bins on this limiting distribution. We also remark on aggregate criteria and multiple testing for determining significance of GSMA results. Conclusion Theoretical considerations detailed herein can lead to clarification and simplification of testing criteria for generalizations of the GSMA statistic. PMID:15717930

  16. A Didactic Experience of Statistical Analysis for the Determination of Glycine in a Nonaqueous Medium Using ANOVA and a Computer Program

    ERIC Educational Resources Information Center

    Santos-Delgado, M. J.; Larrea-Tarruella, L.

    2004-01-01

    The back-titration methods are compared statistically to determine glycine in a nonaqueous medium of acetic acid. Important variations in the mean values of glycine are observed due to the interaction effects between the analysis of variance (ANOVA) technique and a statistical study through computer software.

  17. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    PubMed

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
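
    The paired t-test comparison the abstract describes can be reproduced in outline as follows; the γ-oryzanol values are invented placeholders, not the study's measurements.

      import numpy as np
      from scipy import stats

      # Same samples quantified by both methods (mg/g) -- invented values
      densitometric = np.array([2.31, 2.45, 2.12, 2.50, 2.38, 2.27])
      image_analysis = np.array([2.35, 2.41, 2.15, 2.47, 2.42, 2.30])

      t_stat, p_value = stats.ttest_rel(densitometric, image_analysis)
      print(f"paired t = {t_stat:.3f}, p = {p_value:.3f}")
      # A large p-value is consistent with no systematic difference between methods.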

  18. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282

  19. Tunneling Statistics for Analysis of Spin-Readout Fidelity

    NASA Astrophysics Data System (ADS)

    Gorman, S. K.; He, Y.; House, M. G.; Keizer, J. G.; Keith, D.; Fricke, L.; Hile, S. J.; Broome, M. A.; Simmons, M. Y.

    2017-09-01

    We investigate spin and charge dynamics of a quantum dot of phosphorus atoms coupled to a radio-frequency single-electron transistor (SET) using full counting statistics. We show how the magnetic field plays a role in determining the bunching or antibunching tunneling statistics of the donor dot and SET system. Using the counting statistics, we show how to determine the lowest magnetic field where spin readout is possible. We then show how such a measurement can be used to investigate and optimize single-electron spin-readout fidelity.

  20. Economic and statistical analysis of time limitations for spotting fluids and fishing operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keller, P.S.; Brinkmann, P.E.; Taneja, P.K.

    1984-05-01

    This paper reviews the statistics of "Spotting Fluids" to free stuck drill pipe as well as the economics and statistics of drill string fishing operations. Data were taken from Mobil Oil Exploration and Producing Southeast Inc.'s (MOEPSI) records from 1970-1981. Only those events which occur after a drill string becomes stuck are discussed. The data collected were categorized as Directional Wells and Straight Wells. Bar diagrams are presented to show the Success Ratio vs. Soaking Time for each of the two categories. An analysis was made to identify the elapsed time limit to place the spotting fluid for maximum probability of success. Also determined was the statistical minimum soaking time and the maximum soaking time. For determining the time limit for fishing operations, the following criteria were used: 1. The Risked "Economic Breakeven Analysis" concept was developed based on the work of Harrison. 2. Statistical Probability of Success based on MOEPSI's records from 1970-1981.

  1. The Shock and Vibration Digest. Volume 15, Number 7

    DTIC Science & Technology

    1983-07-01

    ...an important analytical tool, the statistical energy analysis method, has been the subject... "Experimental Determination of Vibration Parameters Required in the Statistical Energy Analysis Method"... 31. Dubowsky, S. and Morris, T.L., "An... "Coupling Loss Factors for Statistical Energy Analysis of Sound Transmission"... 55. Upton, R., "Sound Intensity - A Powerful New Measurement Tool," S/V, Sound

  2. Analysis of Longitudinal Outcome Data with Missing Values in Total Knee Arthroplasty.

    PubMed

    Kang, Yeon Gwi; Lee, Jang Taek; Kang, Jong Yeal; Kim, Ga Hye; Kim, Tae Kyun

    2016-01-01

    We sought to determine the influence of missing data on the statistical results, and to determine which statistical method is most appropriate for the analysis of longitudinal outcome data of TKA with missing values among repeated measures ANOVA, generalized estimating equation (GEE) and mixed effects model repeated measures (MMRM). Data sets with missing values were generated with different proportion of missing data, sample size and missing-data generation mechanism. Each data set was analyzed with three statistical methods. The influence of missing data was greater with higher proportion of missing data and smaller sample size. MMRM tended to show least changes in the statistics. When missing values were generated by 'missing not at random' mechanism, no statistical methods could fully avoid deviations in the results. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. A study of the feasibility of statistical analysis of airport performance simulation

    NASA Technical Reports Server (NTRS)

    Myers, R. H.

    1982-01-01

    The feasibility of conducting a statistical analysis of simulation experiments to study airport capacity is investigated. First, the form of the distribution of airport capacity is studied. Since the distribution is non-Gaussian, it is important to determine the effect of this distribution on standard analysis of variance techniques and power calculations. Next, power computations are made in order to determine how economical simulation experiments would be if they were designed to detect capacity changes from condition to condition. Many of the conclusions drawn are results of Monte Carlo techniques.

  4. 10 CFR 431.445 - Determination of small electric motor efficiency.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... statistical analysis, computer simulation or modeling, or other analytic evaluation of performance data. (3... statistical analysis, computer simulation or modeling, and other analytic evaluation of performance data on.... (ii) If requested by the Department, the manufacturer shall conduct simulations to predict the...

  5. Human Deception Detection from Whole Body Motion Analysis

    DTIC Science & Technology

    2015-12-01

    9.3.2. Prediction Probability The output reports from SPSS detail the stepwise procedures for each series of analyses using Wald statistic values for... statistical significance in determining replication, but instead used a combination of significance and direction of means to determine partial or... and the independents need not be unbound. All data were analyzed utilizing the Statistical Package for Social Sciences (SPSS, v.19.0, Chicago, IL

  6. On the blind use of statistical tools in the analysis of globular cluster stars

    NASA Astrophysics Data System (ADS)

    D'Antona, Francesca; Caloi, Vittoria; Tailo, Marco

    2018-04-01

    As with most data analysis methods, the Bayesian method must be handled with care. We show that its application to determine stellar evolution parameters within globular clusters can lead to paradoxical results if used without the necessary precautions. This is a cautionary tale on the use of statistical tools for big data analysis.

  7. 17 CFR 240.17g-2 - Records to be made and retained by nationally recognized statistical rating organizations.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... rating. (9) Internal documents that contain information, analysis, or statistics that were used to... account record for each subscriber to the credit ratings and/or credit analysis reports of the nationally... consideration the internal credit analysis of another person; or (iv) Determining credit ratings or private...

  8. 17 CFR 240.17g-2 - Records to be made and retained by nationally recognized statistical rating organizations.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... rating. (9) Internal documents that contain information, analysis, or statistics that were used to... account record for each subscriber to the credit ratings and/or credit analysis reports of the nationally... consideration the internal credit analysis of another person; or (iv) Determining credit ratings or private...

  9. 17 CFR 240.17g-2 - Records to be made and retained by nationally recognized statistical rating organizations.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... rating. (9) Internal documents that contain information, analysis, or statistics that were used to... account record for each subscriber to the credit ratings and/or credit analysis reports of the nationally... consideration the internal credit analysis of another person; or (iv) Determining credit ratings or private...

  10. 17 CFR 240.17g-2 - Records to be made and retained by nationally recognized statistical rating organizations.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... rating. (9) Internal documents that contain information, analysis, or statistics that were used to... account record for each subscriber to the credit ratings and/or credit analysis reports of the nationally... consideration the internal credit analysis of another person; or (iv) Determining credit ratings or private...

  11. Determining the Statistical Significance of Relative Weights

    ERIC Educational Resources Information Center

    Tonidandel, Scott; LeBreton, James M.; Johnson, Jeff W.

    2009-01-01

    Relative weight analysis is a procedure for estimating the relative importance of correlated predictors in a regression equation. Because the sampling distribution of relative weights is unknown, researchers using relative weight analysis are unable to make judgments regarding the statistical significance of the relative weights. J. W. Johnson…

  12. Meta-analysis and The Cochrane Collaboration: 20 years of the Cochrane Statistical Methods Group

    PubMed Central

    2013-01-01

    The Statistical Methods Group has played a pivotal role in The Cochrane Collaboration over the past 20 years. The Statistical Methods Group has determined the direction of statistical methods used within Cochrane reviews, developed guidance for these methods, provided training, and continued to discuss and consider new and controversial issues in meta-analysis. The contribution of Statistical Methods Group members to the meta-analysis literature has been extensive and has helped to shape the wider meta-analysis landscape. In this paper, marking the 20th anniversary of The Cochrane Collaboration, we reflect on the history of the Statistical Methods Group, beginning in 1993 with the identification of aspects of statistical synthesis for which consensus was lacking about the best approach. We highlight some landmark methodological developments that Statistical Methods Group members have contributed to in the field of meta-analysis. We discuss how the Group implements and disseminates statistical methods within The Cochrane Collaboration. Finally, we consider the importance of robust statistical methodology for Cochrane systematic reviews, note research gaps, and reflect on the challenges that the Statistical Methods Group faces in its future direction. PMID:24280020
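
    As a generic, hedged illustration of the inverse-variance pooling that underlies much of this methodology (a textbook DerSimonian-Laird random-effects model, not Cochrane software, with invented effect sizes):

      import numpy as np
      from scipy import stats

      def dersimonian_laird(effects, variances):
          """Inverse-variance random-effects pooling with DerSimonian-Laird tau^2."""
          effects, variances = np.asarray(effects), np.asarray(variances)
          w = 1.0 / variances
          fixed = np.sum(w * effects) / np.sum(w)
          Q = np.sum(w * (effects - fixed) ** 2)        # Cochran's heterogeneity statistic
          df = len(effects) - 1
          C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (Q - df) / C)                 # between-study variance
          w_star = 1.0 / (variances + tau2)
          pooled = np.sum(w_star * effects) / np.sum(w_star)
          se = np.sqrt(1.0 / np.sum(w_star))
          I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0
          return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), tau2, I2, stats.chi2.sf(Q, df)

      # Illustrative log odds ratios and their variances from five hypothetical trials
      pooled, ci, tau2, I2, p_het = dersimonian_laird(
          [-0.35, -0.10, -0.45, 0.05, -0.25], [0.04, 0.06, 0.05, 0.09, 0.03])
      print(f"pooled = {pooled:.3f}, 95% CI ({ci[0]:.3f}, {ci[1]:.3f}), "
            f"tau^2 = {tau2:.3f}, I^2 = {I2:.1f}%, heterogeneity p = {p_het:.3f}")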

  13. Can Money Buy Happiness? A Statistical Analysis of Predictors for User Satisfaction

    ERIC Educational Resources Information Center

    Hunter, Ben; Perret, Robert

    2011-01-01

    2007 data from LibQUAL+[TM] and the ACRL Library Trends and Statistics database were analyzed to determine if there is a statistically significant correlation between library expenditures and usage statistics and library patron satisfaction across 73 universities. The results show that users of larger, better funded libraries have higher…

  14. On the Utility of Content Analysis in Author Attribution: "The Federalist."

    ERIC Educational Resources Information Center

    Martindale, Colin; McKenzie, Dean

    1995-01-01

    Compares the success of lexical statistics, content analysis, and function words in determining the true author of "The Federalist." The function word approach proved most successful in attributing the papers to James Madison. Lexical statistics contributed nothing, while content analytic measures resulted in some success. (MJP)

  15. Measuring the Gas Constant "R": Propagation of Uncertainty and Statistics

    ERIC Educational Resources Information Center

    Olsen, Robert J.; Sattar, Simeen

    2013-01-01

    Determining the gas constant "R" by measuring the properties of hydrogen gas collected in a gas buret is well suited for comparing two approaches to uncertainty analysis using a single data set. The brevity of the experiment permits multiple determinations, allowing for statistical evaluation of the standard uncertainty u[subscript…
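
    A brief sketch of the propagation-of-uncertainty side of this experiment, assuming R is computed as PV/(nT) from a single hydrogen collection; the measured values and standard uncertainties below are invented for illustration.

      import numpy as np

      # One illustrative determination of R = PV/(nT)
      P, u_P = 101.1e3, 0.3e3      # pressure (Pa) and its standard uncertainty
      V, u_V = 41.2e-6, 0.3e-6     # volume (m^3)
      n, u_n = 1.68e-3, 0.02e-3    # amount of H2 (mol)
      T, u_T = 296.2, 0.5          # temperature (K)

      R = P * V / (n * T)
      # First-order propagation: for a product/quotient, relative standard
      # uncertainties add in quadrature
      u_R = R * np.sqrt((u_P / P)**2 + (u_V / V)**2 + (u_n / n)**2 + (u_T / T)**2)
      print(f"R = {R:.2f} +/- {u_R:.2f} J mol^-1 K^-1")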

  16. 41 CFR 60-2.35 - Compliance status.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... status will be judged alone by whether it reaches its goals. The composition of the contractor's... obligations will be determined by analysis of statistical data and other non-statistical information which...

  17. Statistical significance of task related deep brain EEG dynamic changes in the time-frequency domain.

    PubMed

    Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P

    2013-01-01

    We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.

  18. 41 CFR 60-2.35 - Compliance status.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... workforce (i.e., the employment of minorities or women at a percentage rate below, or above, the goal level... obligations will be determined by analysis of statistical data and other non-statistical information which...

  19. Common pitfalls in statistical analysis: Understanding the properties of diagnostic tests - Part 1.

    PubMed

    Ranganathan, Priya; Aggarwal, Rakesh

    2018-01-01

    In this article in our series on common pitfalls in statistical analysis, we look at some of the attributes of diagnostic tests (i.e., tests which are used to determine whether an individual does or does not have disease). The next article in this series will focus on further issues related to diagnostic tests.

  20. STATISTICAL ANALYSIS OF SNAP 10A THERMOELECTRIC CONVERTER ELEMENT PROCESS DEVELOPMENT VARIABLES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitch, S.H.; Morris, J.W.

    1962-12-15

    Statistical analysis, primarily analysis of variance, was applied to evaluate several factors involved in the development of suitable fabrication and processing techniques for the production of lead telluride thermoelectric elements for the SNAP 10A energy conversion system. The analysis methods are described as to their application for determining the effects of various processing steps, establishing the value of individual operations, and evaluating the significance of test results. The elimination of unnecessary or detrimental processing steps was accomplished and the number of required tests was substantially reduced by application of these statistical methods to the SNAP 10A production development effort. (auth)

  1. Statistical analysis of effective singular values in matrix rank determination

    NASA Technical Reports Server (NTRS)

    Konstantinides, Konstantinos; Yao, Kung

    1988-01-01

    A major problem in using SVD (singular-value decomposition) as a tool in determining the effective rank of a perturbed matrix is that of distinguishing between significantly small and significantly large singular values. To this end, confidence regions are derived for the perturbed singular values of matrices with noisy observation data. The analysis is based on the theory of perturbations of singular values and on statistical significance testing. Threshold bounds for perturbation due to finite-precision and i.i.d. random models are evaluated. In random models, the threshold bounds depend on the dimension of the matrix, the noise variance, and a predefined statistical level of significance. Results applied to the problem of determining the effective order of a linear autoregressive system from the approximate rank of a sample autocorrelation matrix are considered. Various numerical examples illustrating the usefulness of these bounds and comparisons to other previously known approaches are given.
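
    A simplified numerical sketch of the underlying idea, thresholding singular values to estimate effective rank; the threshold used here is a generic spectral-norm bound for Gaussian noise with a safety factor, not the paper's confidence-region bounds, and the matrix is synthetic.

      import numpy as np

      def effective_rank(A, noise_std, safety=1.5):
          """Count singular values exceeding a crude bound on the spectral
          norm of an i.i.d. Gaussian noise matrix of the same shape."""
          m, n = A.shape
          s = np.linalg.svd(A, compute_uv=False)
          tau = safety * noise_std * (np.sqrt(m) + np.sqrt(n))
          return int(np.sum(s > tau)), s, tau

      rng = np.random.default_rng(2)
      true_rank = 3
      A_clean = rng.normal(size=(40, true_rank)) @ rng.normal(size=(true_rank, 12))
      sigma = 0.05
      A_noisy = A_clean + rng.normal(scale=sigma, size=A_clean.shape)

      r_eff, s, tau = effective_rank(A_noisy, sigma)
      print(f"estimated effective rank = {r_eff} (threshold {tau:.2f}); "
            f"leading singular values: {np.round(s[:5], 3)}")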

  2. CADDIS Volume 4. Data Analysis: Basic Analyses

    EPA Pesticide Factsheets

    Use of statistical tests to determine if an observation is outside the normal range of expected values. Details of CART, regression analysis, use of quantile regression analysis, CART in causal analysis, simplifying or pruning resulting trees.

  3. Technical Note: The Initial Stages of Statistical Data Analysis

    PubMed Central

    Tandy, Richard D.

    1998-01-01

    Objective: To provide an overview of several important data-related considerations in the design stage of a research project and to review the levels of measurement and their relationship to the statistical technique chosen for the data analysis. Background: When planning a study, the researcher must clearly define the research problem and narrow it down to specific, testable questions. The next steps are to identify the variables in the study, decide how to group and treat subjects, and determine how to measure, and the underlying level of measurement of, the dependent variables. Then the appropriate statistical technique can be selected for data analysis. Description: The four levels of measurement in increasing complexity are nominal, ordinal, interval, and ratio. Nominal data are categorical or “count” data, and the numbers are treated as labels. Ordinal data can be ranked in a meaningful order by magnitude. Interval data possess the characteristics of ordinal data and also have equal distances between levels. Ratio data have a natural zero point. Nominal and ordinal data are analyzed with nonparametric statistical techniques and interval and ratio data with parametric statistical techniques. Advantages: Understanding the four levels of measurement and when it is appropriate to use each is important in determining which statistical technique to use when analyzing data. PMID:16558489

  4. The Importance of Teaching Power in Statistical Hypothesis Testing

    ERIC Educational Resources Information Center

    Olinsky, Alan; Schumacher, Phyllis; Quinn, John

    2012-01-01

    In this paper, we discuss the importance of teaching power considerations in statistical hypothesis testing. Statistical power analysis determines the ability of a study to detect a meaningful effect size, where the effect size is the difference between the hypothesized value of the population parameter under the null hypothesis and the true value…
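
    A hedged example of the power calculation being advocated, using statsmodels (a tooling assumption, not part of the article): solve for the per-group sample size of a two-sample t-test at a given effect size, alpha, and power, then check the power achieved at a fixed sample size.

      from statsmodels.stats.power import TTestIndPower

      analysis = TTestIndPower()

      # Sample size per group needed to detect a medium effect (d = 0.5)
      # with alpha = 0.05 and 80% power
      n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80,
                                         alternative='two-sided')
      print(f"required sample size per group: {n_per_group:.1f}")

      # Conversely, the power achieved with 30 subjects per group
      achieved = analysis.power(effect_size=0.5, nobs1=30, alpha=0.05, ratio=1.0)
      print(f"power with n = 30 per group: {achieved:.2f}")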

  5. Writing to Learn Statistics in an Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Northrup, Christian Glenn

    2012-01-01

    This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…

  6. Statistical Estimation of Rollover Risk

    DOT National Transportation Integrated Search

    1989-08-01

    This report describes the results of a statistical analysis to determine the probability of a rollover in a single vehicle accident. Over 39,000 accidents, which included 4910 rollovers, in the states of Texas, Maryland, and Washington were exam...

  7. Approaching Career Criminals With An Intelligence Cycle

    DTIC Science & Technology

    2015-12-01

    including arrest statistics and "arrest statistics have been used as the main barometer of juvenile delinquent activity, (but) many juvenile... Statistical Briefing Book"... guided by theories about the causes of delinquent behavior, but there was no determination if those efforts achieved the... children." However, the most evidence-based comparison of juvenile delinquency reduction programs is the statistical meta-analysis (a systematic

  8. CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach

    EPA Pesticide Factsheets

    An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.

  9. Analysis of statistical and standard algorithms for detecting muscle onset with surface electromyography.

    PubMed

    Tenan, Matthew S; Tweedell, Andrew J; Haynes, Courtney A

    2017-01-01

    The timing of muscle activity is a commonly applied analytic method to understand how the nervous system controls movement. This study systematically evaluates six classes of standard and statistical algorithms to determine muscle onset in both experimental surface electromyography (EMG) and simulated EMG with a known onset time. Eighteen participants had EMG collected from the biceps brachii and vastus lateralis while performing a biceps curl or knee extension, respectively. Three established methods and three statistical methods for EMG onset were evaluated. Linear envelope, Teager-Kaiser energy operator + linear envelope and sample entropy were the established methods evaluated while general time series mean/variance, sequential and batch processing of parametric and nonparametric tools, and Bayesian changepoint analysis were the statistical techniques used. Visual EMG onset (experimental data) and objective EMG onset (simulated data) were compared with algorithmic EMG onset via root mean square error and linear regression models for stepwise elimination of inferior algorithms. The top algorithms for both data types were analyzed for their mean agreement with the gold standard onset and evaluation of 95% confidence intervals. The top algorithms were all Bayesian changepoint analysis iterations where the parameter of the prior (p0) was zero. The best performing Bayesian algorithms were p0 = 0 and a posterior probability for onset determination at 60-90%. While existing algorithms performed reasonably, the Bayesian changepoint analysis methodology provides greater reliability and accuracy when determining the singular onset of EMG activity in a time series. Further research is needed to determine if this class of algorithms perform equally well when the time series has multiple bursts of muscle activity.
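
    A heavily simplified sketch of the changepoint idea behind the best-performing class of algorithms: a posterior over a single mean-shift location in a one-dimensional envelope, using a flat location prior and a Gaussian profile likelihood with a known noise SD. This is not the authors' implementation (their Bayesian changepoint analysis involves a prior parameter p0 and a posterior-probability decision rule), and the signal below is simulated.

      import numpy as np

      def changepoint_posterior(x, sigma):
          """Posterior over a single mean-shift changepoint, assuming Gaussian
          noise with known SD `sigma` and a flat prior over the location."""
          n = len(x)
          log_like = np.full(n, -np.inf)
          for k in range(2, n - 2):               # require a few samples per segment
              r1 = x[:k] - x[:k].mean()
              r2 = x[k:] - x[k:].mean()
              log_like[k] = -(np.sum(r1**2) + np.sum(r2**2)) / (2 * sigma**2)
          log_like -= log_like.max()
          post = np.exp(log_like)
          return post / post.sum()

      # Rectified-EMG-like envelope: quiet baseline, then activity starting at sample 300
      rng = np.random.default_rng(3)
      signal = np.concatenate([rng.normal(0.05, 0.02, 300), rng.normal(0.30, 0.05, 200)])
      post = changepoint_posterior(signal, sigma=0.04)
      onset = int(np.argmax(post))
      print(f"most probable onset sample: {onset}, posterior mass there: {post[onset]:.2f}")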

  10. Oregon ground-water quality and its relation to hydrogeological factors; a statistical approach

    USGS Publications Warehouse

    Miller, T.L.; Gonthier, J.B.

    1984-01-01

    An appraisal of Oregon ground-water quality was made using existing data accessible through the U.S. Geological Survey computer system. The data available for about 1,000 sites were separated by aquifer units and hydrologic units. Selected statistical moments were described for 19 constituents including major ions. About 96 percent of all sites in the data base were sampled only once. The sample data were classified by aquifer unit and hydrologic unit and analysis of variance was run to determine if significant differences exist between the units within each of these two classifications for the same 19 constituents on which statistical moments were determined. Results of the analysis of variance indicated both classification variables performed about the same, but aquifer unit did provide more separation for some constituents. Samples from the Rogue River basin were classified by location within the flow system and type of flow system. The samples were then analyzed using analysis of variance on 14 constituents to determine if there were significant differences between subsets classified by flow path. Results of this analysis were not definitive, but classification as to the type of flow system did indicate potential for segregating water-quality data into distinct subsets. (USGS)

  11. Evaluation of standardized and applied variables in predicting treatment outcomes of polytrauma patients.

    PubMed

    Aksamija, Goran; Mulabdic, Adi; Rasic, Ismar; Muhovic, Samir; Gavric, Igor

    2011-01-01

    Polytrauma is defined as an injury affecting at least two different organ systems or body regions, with at least one injury being life-threatening. Given the multilevel model of care for polytrauma patients within KCUS, weaknesses in the management of this category of patients are inevitable. The aim was to determine the dynamics of existing procedures in the treatment of polytrauma patients on admission to KCUS and, based on statistical analysis of the applied variables, to determine and define the factors that influence the final outcome of treatment and their mutual relationships, which may help eliminate the flaws in the approach to the problem. The study was based on 263 polytrauma patients. Parametric and non-parametric statistical methods were used. Basic statistics were calculated and, based on the calculated parameters, multicorrelation analysis, image analysis, discriminant analysis and multifactorial analysis were used to achieve the research objectives. From the universe of variables for this study we selected a sample of n = 25 variables, of which the first two are modular; the others belong to the common measurement space (n = 23) and are defined in this paper as the system of variables for the methods, procedures and assessments of polytrauma patients. After the multicorrelation analysis, and since the image analysis gave reliable measurement results, we proceeded to the analysis of eigenvalues, that is, to defining the factors that provide information about the existing model and its correlation with treatment outcome. The study singled out the essential factors that determine the current organizational model of care, which may affect the treatment and better outcome of polytrauma patients. This analysis has shown the maximum correlative relationships between these practices and contributed to the development of guidelines defined by the isolated factors.

  12. Multivariate analysis of heavy metal contamination using river sediment cores of Nankan River, northern Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, An-Sheng; Lu, Wei-Li; Huang, Jyh-Jaan; Chang, Queenie; Wei, Kuo-Yen; Lin, Chin-Jung; Liou, Sofia Ya Hsuan

    2016-04-01

    Owing to the geological and climatic characteristics of Taiwan, rivers generally carry large amounts of suspended particles. After these particles settle, they become sediments, which are good sorbents for heavy metals in river systems. Consequently, sediments record the contamination footprint in low-flow-energy regions, such as estuaries. Seven sediment cores were collected along Nankan River, northern Taiwan, which is seriously contaminated by industrial, household and agricultural inputs. Physico-chemical properties of these cores were derived from the Itrax-XRF Core Scanner and grain size analysis. In order to interpret these complex data matrices, multivariate statistical techniques (cluster analysis, factor analysis and discriminant analysis) were introduced to this study. The statistical determination indicates four types of sediment. One of them represents a contamination event, showing high concentrations of Cu, Zn, Pb, Ni and Fe, and low concentrations of Si and Zr. Furthermore, three possible contamination sources of this type of sediment were revealed by Factor Analysis. The combination of sediment analysis and multivariate statistical techniques used provides new insights into the contamination depositional history of Nankan River and could be similarly applied to other river systems to determine the scale of anthropogenic contamination.

  13. IUTAM Symposium on Statistical Energy Analysis, 8-11 July 1997, Programme

    DTIC Science & Technology

    1997-01-01

    Distribution is unlimited. This was the first international scientific gathering devoted... Subject terms: energy flow, continuum dynamics, vibrational energy, statistical energy analysis (SEA)... determination of the correlation... When harmonic motion and time-average are considered, the following

  14. Evaluation of data generated by statistically oriented end-result specifications : final report.

    DOT National Transportation Integrated Search

    1979-01-01

    This report is concerned with the review of data generated on projects let under the statistically oriented end-result specifications (ERS) on asphaltic concrete and Portland cement concrete. The review covered analysis of data for determination of p...

  15. Analysis of statistical and standard algorithms for detecting muscle onset with surface electromyography

    PubMed Central

    Tweedell, Andrew J.; Haynes, Courtney A.

    2017-01-01

    The timing of muscle activity is a commonly applied analytic method to understand how the nervous system controls movement. This study systematically evaluates six classes of standard and statistical algorithms to determine muscle onset in both experimental surface electromyography (EMG) and simulated EMG with a known onset time. Eighteen participants had EMG collected from the biceps brachii and vastus lateralis while performing a biceps curl or knee extension, respectively. Three established methods and three statistical methods for EMG onset were evaluated. Linear envelope, Teager-Kaiser energy operator + linear envelope and sample entropy were the established methods evaluated while general time series mean/variance, sequential and batch processing of parametric and nonparametric tools, and Bayesian changepoint analysis were the statistical techniques used. Visual EMG onset (experimental data) and objective EMG onset (simulated data) were compared with algorithmic EMG onset via root mean square error and linear regression models for stepwise elimination of inferior algorithms. The top algorithms for both data types were analyzed for their mean agreement with the gold standard onset and evaluation of 95% confidence intervals. The top algorithms were all Bayesian changepoint analysis iterations where the parameter of the prior (p0) was zero. The best performing Bayesian algorithms were p0 = 0 and a posterior probability for onset determination at 60–90%. While existing algorithms performed reasonably, the Bayesian changepoint analysis methodology provides greater reliability and accuracy when determining the singular onset of EMG activity in a time series. Further research is needed to determine if this class of algorithms perform equally well when the time series has multiple bursts of muscle activity. PMID:28489897

  16. Detection of crossover time scales in multifractal detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Ge, Erjia; Leung, Yee

    2013-04-01

    Fractal analysis is employed in this paper as a scale-based method for identifying the scaling behavior of time series. Many spatial and temporal processes exhibiting complex multi(mono)-scaling behaviors are fractals. One of the important concepts in fractals is crossover time scale(s) that separates distinct regimes having different fractal scaling behaviors. A common method is multifractal detrended fluctuation analysis (MF-DFA). The detection of crossover time scale(s) is, however, relatively subjective since it has been made without rigorous statistical procedures and has generally been determined by eyeballing or subjective observation. Crossover time scales so determined may be spurious and problematic. They may not reflect the genuine underlying scaling behavior of a time series. The purpose of this paper is to propose a statistical procedure to model complex fractal scaling behaviors and reliably identify the crossover time scales under MF-DFA. The scaling-identification regression model, grounded on a solid statistical foundation, is first proposed to describe multi-scaling behaviors of fractals. Through the regression analysis and statistical inference, we can (1) identify the crossover time scales that cannot be detected by eye-balling observation, (2) determine the number and locations of the genuine crossover time scales, (3) give confidence intervals for the crossover time scales, and (4) establish the statistically significant regression model depicting the underlying scaling behavior of a time series. To substantiate our argument, the regression model is applied to analyze the multi-scaling behaviors of avian-influenza outbreaks, water consumption, daily mean temperature, and rainfall of Hong Kong. Through the proposed model, we can have a deeper understanding of fractals in general and a statistical approach to identify multi-scaling behavior under MF-DFA in particular.
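
    A toy sketch of crossover detection by piecewise log-log regression, in the spirit of (but much simpler than) the scaling-identification regression model proposed here: the breakpoint minimizing the combined residual error of two straight-line fits is taken as the crossover scale. The fluctuation function below is synthetic, and no confidence intervals are computed.

      import numpy as np

      def find_crossover(log_s, log_F, min_pts=4):
          """Fit two straight lines to (log s, log F) with one breakpoint and
          return the breakpoint minimizing the combined residual sum of squares."""
          best = (np.inf, None, None, None)
          for i in range(min_pts, len(log_s) - min_pts):
              c1 = np.polyfit(log_s[:i], log_F[:i], 1)
              c2 = np.polyfit(log_s[i:], log_F[i:], 1)
              rss = (np.sum((log_F[:i] - np.polyval(c1, log_s[:i]))**2) +
                     np.sum((log_F[i:] - np.polyval(c2, log_s[i:]))**2))
              if rss < best[0]:
                  best = (rss, i, c1[0], c2[0])
          _, i, h_small, h_large = best
          return log_s[i], h_small, h_large

      # Synthetic fluctuation function with a crossover at s = 100
      # (scaling exponent 0.9 below the crossover, 0.5 above), plus mild noise
      rng = np.random.default_rng(4)
      s = np.logspace(1, 3, 30)
      F = np.where(s < 100, s**0.9, 100**0.4 * s**0.5) * np.exp(rng.normal(0, 0.02, s.size))
      cross, h1, h2 = find_crossover(np.log10(s), np.log10(F))
      print(f"crossover near s = {10**cross:.0f}; slopes {h1:.2f} (small s), {h2:.2f} (large s)")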

  17. Identifiability of PBPK Models with Applications to ...

    EPA Pesticide Factsheets

    Any statistical model should be identifiable in order for estimates and tests using it to be meaningful. We consider statistical analysis of physiologically-based pharmacokinetic (PBPK) models in which parameters cannot be estimated precisely from available data, and discuss different types of identifiability that occur in PBPK models and give reasons why they occur. We particularly focus on how the mathematical structure of a PBPK model and lack of appropriate data can lead to statistical models in which it is impossible to estimate at least some parameters precisely. Methods are reviewed which can determine whether a purely linear PBPK model is globally identifiable. We propose a theorem which determines when identifiability at a set of finite and specific values of the mathematical PBPK model (global discrete identifiability) implies identifiability of the statistical model. However, we are unable to establish conditions that imply global discrete identifiability, and conclude that the only safe approach to analysis of PBPK models involves Bayesian analysis with truncated priors. Finally, computational issues regarding posterior simulations of PBPK models are discussed. The methodology is very general and can be applied to numerous PBPK models which can be expressed as linear time-invariant systems. A real data set of a PBPK model for exposure to dimethyl arsinic acid (DMA(V)) is presented to illustrate the proposed methodology. We consider statistical analy

  18. SU-F-T-386: Analysis of Three QA Methods for Predicting Dose Deviation Pass Percentage for Lung SBRT VMAT Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, M; To, D; Giaddui, T

    2016-06-15

    Purpose: To investigate the significance of using pinpoint ionization chambers (IC) and RadCalc (RC) in determining the quality of lung SBRT VMAT plans with low dose deviation pass percentage (DDPP) as reported by ScandiDos Delta4 (D4). To quantify the relationship between DDPP and point dose deviations determined by IC (ICDD), RadCalc (RCDD), and median dose deviation reported by D4 (D4DD). Methods: Point dose deviations and D4 DDPP were compiled for 45 SBRT VMAT plans. Eighteen patients were treated on Varian Truebeam linear accelerators (linacs); the remaining 27 were treated on Elekta Synergy linacs with Agility collimators. A one-way analysis of variance (ANOVA) was performed to determine if there were any statistically significant differences between D4DD, ICDD, and RCDD. Tukey's test was used to determine which pair of means was statistically different from each other. Multiple regression analysis was performed to determine if D4DD, ICDD, or RCDD are statistically significant predictors of DDPP. Results: Median DDPP, D4DD, ICDD, and RCDD were 80.5% (47.6%–99.2%), −0.3% (−2.0%–1.6%), 0.2% (−7.5%–6.3%), and 2.9% (−4.0%–19.7%), respectively. The ANOVA showed a statistically significant difference between D4DD, ICDD, and RCDD for a 95% confidence interval (p < 0.001). Tukey's test revealed a statistically significant difference between two pairs of groups, RCDD-D4DD and RCDD-ICDD (p < 0.001), but no difference between ICDD-D4DD (p = 0.485). Multiple regression analysis revealed that ICDD (p = 0.04) and D4DD (p = 0.03) are statistically significant predictors of DDPP with an adjusted r² of 0.115. Conclusion: This study shows ICDD predicts trends in D4 DDPP; however this trend is highly variable as shown by our low r². This work suggests that ICDD can be used as a method to verify DDPP in delivery of lung SBRT VMAT plans. RCDD may not validate low DDPP discovered in D4 QA for small field SBRT treatments.

  19. A Meta-Analysis: The Relationship between Father Involvement and Student Academic Achievement

    ERIC Educational Resources Information Center

    Jeynes, William H.

    2015-01-01

    A meta-analysis was undertaken, including 66 studies, to determine the relationship between father involvement and the educational outcomes of urban school children. Statistical analyses were done to determine the overall impact and specific components of father involvement. The possible differing effects of paternal involvement by race were also…

  20. A comparison of performance of automatic cloud coverage assessment algorithm for Formosat-2 image using clustering-based and spatial thresholding methods

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Hsien

    2012-11-01

    Formosat-2 imagery is a kind of high-spatial-resolution (2 meters GSD) remote sensing satellite data, which includes one panchromatic band and four multispectral bands (Blue, Green, Red, near-infrared). An essential step in the daily processing of received Formosat-2 images is to estimate the cloud statistics of each image using the Automatic Cloud Coverage Assessment (ACCA) algorithm. The cloud statistics are subsequently recorded as important metadata in the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. For pre-processing analysis, unsupervised K-means classification, Sobel's method, a thresholding method, non-cloudy pixel reexamination, and a cross-band filter method are implemented in sequence to determine the cloud statistics. For post-processing analysis, the Box-Counting fractal method is implemented. In other words, the cloud statistics are first determined via pre-processing analysis, and their correctness across the different spectral bands is then cross-examined qualitatively and quantitatively via post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, in this work, we first conduct a series of experiments comparing the performance of clustering-based and spatial thresholding methods, including Otsu's, Local Entropy (LE), Joint Entropy (JE), Global Entropy (GE), and Global Relative Entropy (GRE) methods. The results show that Otsu's and GE methods both perform better than the others for Formosat-2 images. Additionally, our proposed ACCA method, with Otsu's method selected as the thresholding method, successfully extracts the cloudy pixels of Formosat-2 images for accurate cloud statistics estimation.
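
    For reference, a compact stand-alone implementation of Otsu's thresholding (the best-performing method reported above), applied to a synthetic grey-level array; this is the generic textbook algorithm, not the authors' ACCA pipeline.

      import numpy as np

      def otsu_threshold(image, nbins=256):
          """Otsu's method: pick the grey level maximizing between-class variance."""
          hist, edges = np.histogram(image.ravel(), bins=nbins)
          p = hist.astype(float) / hist.sum()
          centers = (edges[:-1] + edges[1:]) / 2
          w0 = np.cumsum(p)                       # class-0 (below threshold) probability
          w1 = 1.0 - w0
          m0 = np.cumsum(p * centers)             # unnormalized class-0 mean
          mT = m0[-1]
          with np.errstate(divide='ignore', invalid='ignore'):
              between = w0 * w1 * (m0 / w0 - (mT - m0) / w1) ** 2
          between[~np.isfinite(between)] = 0.0
          return centers[np.argmax(between)]

      # Synthetic "image": dark ground pixels plus a bright cloudy strip
      rng = np.random.default_rng(5)
      ground = rng.normal(60, 10, size=(180, 200))
      cloud = rng.normal(200, 15, size=(20, 200))
      img = np.clip(np.vstack([ground, cloud]), 0, 255)

      t = otsu_threshold(img)
      cloud_fraction = np.mean(img > t)
      print(f"Otsu threshold = {t:.1f}; estimated cloud coverage = {cloud_fraction:.1%}")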

  1. Determining Differences in Efficacy of Two Disinfectants Using t-Tests.

    ERIC Educational Resources Information Center

    Brehm, Michael A.; And Others

    1996-01-01

    Presents an experiment to compare the effectiveness of 95% ethanol to 20% bleach as disinfectants using t-tests for the statistical analysis of the data. Reports that bleach is a better disinfectant. Discusses the statistical and practical significance of the results. (JRH)

  2. Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.

    PubMed

    Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and casual relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of the complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.

  3. OASIS 2: online application for survival analysis 2 with features for the analysis of maximal lifespan and healthspan in aging research.

    PubMed

    Han, Seong Kyu; Lee, Dongyeop; Lee, Heetak; Kim, Donghyo; Son, Heehwa G; Yang, Jae-Seong; Lee, Seung-Jae V; Kim, Sanguk

    2016-08-30

    Online application for survival analysis (OASIS) has served as a popular and convenient platform for the statistical analysis of various survival data, particularly in the field of aging research. With the recent advances in the fields of aging research that deal with complex survival data, we noticed a need for updates to the current version of OASIS. Here, we report OASIS 2 (http://sbi.postech.ac.kr/oasis2), which provides extended statistical tools for survival data and an enhanced user interface. In particular, OASIS 2 enables the statistical comparison of maximal lifespans, which is potentially useful for determining key factors that limit the lifespan of a population. Furthermore, OASIS 2 provides statistical and graphical tools that compare values in different conditions and times. That feature is useful for comparing age-associated changes in physiological activities, which can be used as indicators of "healthspan." We believe that OASIS 2 will serve as a standard platform for survival analysis with advanced and user-friendly statistical tools for experimental biologists in the field of aging research.
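
    OASIS 2 itself is a web application, but an offline sketch of the basic survival comparison it performs can be assembled with the lifelines package (a tooling assumption; the lifespans are invented, and this does not reproduce OASIS 2's maximal-lifespan test):

      import numpy as np
      from lifelines import KaplanMeierFitter
      from lifelines.statistics import logrank_test

      # Illustrative lifespans (days) for control vs. treated animals -- invented values
      rng = np.random.default_rng(6)
      control = rng.normal(18, 3, 60).clip(5)
      treated = rng.normal(21, 3, 60).clip(5)
      observed = np.ones(60, dtype=bool)          # no censoring in this toy example

      kmf = KaplanMeierFitter()
      kmf.fit(control, event_observed=observed, label="control")
      median_control = kmf.median_survival_time_
      kmf.fit(treated, event_observed=observed, label="treated")
      median_treated = kmf.median_survival_time_

      res = logrank_test(control, treated,
                         event_observed_A=observed, event_observed_B=observed)
      print(f"median lifespan: control {median_control:.1f} d, treated {median_treated:.1f} d; "
            f"log-rank p = {res.p_value:.4f}")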

  4. STATISTICAL ANALYSIS OF SPECTROPHOTOMETRIC DETERMINATIONS OF BORON; Estudo Estatistico de Determinacoes Espectrofotometricas de Boro

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lima, F.W.; Pagano, C.; Schneiderman, B.

    1959-07-01

    Boron can be determined quantitatively by absorption spectrophotometry of solutions of the red compound formed by the reaction of boric acid with curcumin. This reaction is affected by various factors, some of which are easy to detect during data interpretation, while others are more difficult. The application of modern statistical methods to the study of the influence of these factors on the quantitative determination of boron is presented. These methods provide objective ways of establishing the significant effects of the factors involved. (auth)

  5. Investigation of Weibull statistics in fracture analysis of cast aluminum

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1989-01-01

    The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
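
    A minimal sketch of a two-parameter Weibull fit of the kind described above, using invented coupon strengths; scipy's weibull_min with the location fixed at zero stands in for the paper's analysis, and the design stress is hypothetical.

      import numpy as np
      from scipy import stats

      # Hypothetical fracture strengths (MPa) for one batch of cast-aluminum coupons.
      strengths = np.array([265.0, 271.0, 280.0, 255.0, 290.0, 275.0, 268.0, 283.0, 260.0, 278.0])

      # Two-parameter Weibull: fix the location at zero, estimate shape (modulus) and scale.
      shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)
      print(f"Weibull modulus = {shape:.1f}, characteristic strength = {scale:.1f} MPa")

      # Survival probability at a hypothetical design stress under the fitted distribution.
      design_stress = 240.0
      print(f"P(strength > {design_stress} MPa) = "
            f"{stats.weibull_min.sf(design_stress, shape, loc=loc, scale=scale):.3f}")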

  6. U.S. Marine Corps Study of Establishing Time Criteria for Logistics Tasks

    DTIC Science & Technology

    2004-09-30

    STATISTICS FOR REQUESTS PER DAY FOR TWO BATTALIONS ... SUMMARY STATISTICS IN HOURS FOR RESOURCE REQUIREMENTS PER DAY FOR TWO BATTALIONS ... SUMMARY STATISTICS FOR INDIVIDUALS FOR RESOURCE REQUIREMENTS PER DAY FOR TWO BATTALIONS ... Study of Establishing Time Criteria for Logistics ... developed and run to provide statistical information for analysis. In Task Four, the study team used Task Three findings to determine data requirements

  7. Visualizing statistical significance of disease clusters using cartograms.

    PubMed

    Kronenfeld, Barry J; Wong, David W S

    2017-05-15

    Health officials and epidemiological researchers often use maps of disease rates to identify potential disease clusters. Because these maps exaggerate the prominence of low-density districts and hide potential clusters in urban (high-density) areas, many researchers have used density-equalizing maps (cartograms) as a basis for epidemiological mapping. However, there are no existing guidelines for visual assessment of statistical uncertainty. To address this shortcoming, we develop techniques for visual determination of statistical significance of clusters spanning one or more districts on a cartogram. We developed the techniques within a geovisual analytics framework that does not rely on automated significance testing, and can therefore facilitate visual analysis to detect clusters that automated techniques might miss. On a cartogram of the at-risk population, the statistical significance of a disease cluster is determined by the rate, area and shape of the cluster under standard hypothesis testing scenarios. We develop formulae to determine, for a given rate, the area required for statistical significance of a priori and a posteriori designated regions under certain test assumptions. Uniquely, our approach enables dynamic inference of aggregate regions formed by combining individual districts. The method is implemented in interactive tools that provide choropleth mapping, automated legend construction and dynamic search tools to facilitate cluster detection and assessment of the validity of tested assumptions. A case study of leukemia incidence analysis in California demonstrates the ability to visually distinguish between statistically significant and insignificant regions. The proposed geovisual analytics approach enables intuitive visual assessment of statistical significance of arbitrarily defined regions on a cartogram. Our research prompts a broader discussion of the role of geovisual exploratory analyses in disease mapping and the appropriate framework for visually assessing the statistical significance of spatial clusters.
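
    The paper derives its own area-based significance formulae, which are not reproduced here; the sketch below only illustrates the standard a priori cluster test idea that underlies such assessments, assuming Poisson-distributed case counts and invented population, baseline rate, and case numbers.

      from scipy import stats

      # Hypothetical a priori designated region: on a population cartogram its area is
      # proportional to the number of persons at risk that it covers.
      population_at_risk = 120_000      # persons covered by the aggregated districts
      baseline_rate = 4.2e-5            # assumed background incidence per person-year
      observed_cases = 9

      expected_cases = baseline_rate * population_at_risk
      # One-sided test: probability of at least the observed count under the null rate.
      p_value = stats.poisson.sf(observed_cases - 1, expected_cases)
      print(f"expected = {expected_cases:.2f}, p = {p_value:.4f}")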

  8. FT. Sam 91 Whiskey Combat Medic Medical Simulation Training Quantitative Integration Enhancement Program

    DTIC Science & Technology

    2011-07-01

    joined the project team in the statistical and research coordination role. Dr. Collin is an employee at the University of Pittsburgh. A successful...3. Submit to Ft. Detrick Completed Milestone: Statistical analysis planning 1. Review planned data metrics and data gathering tools...approach to performance assessment for continuous quality improvement.  Analyzing data with modern statistical techniques to determine the

  9. Computer program uses Monte Carlo techniques for statistical system performance analysis

    NASA Technical Reports Server (NTRS)

    Wohl, D. P.

    1967-01-01

    Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
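
    A minimal sketch of this Monte Carlo idea, with invented component tolerance distributions and an arbitrary system performance function standing in for the program's component models.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000                              # number of simulated units

      # Hypothetical component statistics: each draw represents one manufactured unit.
      gain = rng.normal(10.0, 0.2, n)          # amplifier gain with manufacturing spread
      misalignment = rng.normal(0.0, 0.5, n)   # alignment error, degrees
      bias = rng.uniform(-0.1, 0.1, n)         # sensor bias

      # System-level metric as an arbitrary function of the component disturbances.
      performance = gain * np.cos(np.radians(misalignment)) + bias

      print(f"mean = {performance.mean():.3f}, std = {performance.std():.3f}")
      print(f"fraction meeting spec (> 9.5): {(performance > 9.5).mean():.2%}")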

  10. Assessment of trace elements levels in patients with Type 2 diabetes using multivariate statistical analysis.

    PubMed

    Badran, M; Morsy, R; Soliman, H; Elnimr, T

    2016-01-01

    The metabolism of trace elements has been reported to play specific roles in the pathogenesis and progression of diabetes mellitus. Given the continuing increase in the population of patients with Type 2 diabetes (T2D), this study aims to assess the levels and inter-relationships of fasting blood glucose (FBG) and serum trace elements in Type 2 diabetic patients. The study was conducted on 40 Egyptian Type 2 diabetic patients and 36 healthy volunteers (Hospital of Tanta University, Tanta, Egypt). The blood serum was digested and then used to determine the levels of 24 trace elements by inductively coupled plasma mass spectrometry (ICP-MS). Multivariate statistical analysis based on correlation coefficients, cluster analysis (CA) and principal component analysis (PCA) was used to analyze the data. The results showed significant changes in FBG and in the levels of eight trace elements (Zn, Cu, Se, Fe, Mn, Cr, Mg, and As) in the blood serum of Type 2 diabetic patients relative to healthy controls. The multivariate statistical analyses were effective in reducing the experimental variables and grouped the trace elements in patients into three clusters. The application of PCA revealed a distinct difference in the associations of trace elements and their clustering patterns between the control and patient groups, in particular for Mg, Fe, Cu, and Zn, which appeared to be the most crucial factors related to Type 2 diabetes. On the basis of this study, the contribution of trace element content in Type 2 diabetic patients can therefore be determined and specified using correlation relationships and multivariate statistical analysis, which confirms that alteration of some essential trace metals may play a role in the development of diabetes mellitus. Copyright © 2015 Elsevier GmbH. All rights reserved.

  11. The Relationship between Parental Involvement and Urban Secondary School Student Academic Achievement: A Meta-Analysis

    ERIC Educational Resources Information Center

    Jeynes, William H.

    2007-01-01

    A meta-analysis is undertaken, including 52 studies, to determine the influence of parental involvement on the educational outcomes of urban secondary school children. Statistical analyses are done to determine the overall impact of parental involvement as well as specific components of parental involvement. Four different measures of educational…

  12. Statistical and Economic Techniques for Site-specific Nematode Management.

    PubMed

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develop a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.

  13. Participant Interaction in Asynchronous Learning Environments: Evaluating Interaction Analysis Methods

    ERIC Educational Resources Information Center

    Blanchette, Judith

    2012-01-01

    The purpose of this empirical study was to determine the extent to which three different objective analytical methods--sequence analysis, surface cohesion analysis, and lexical cohesion analysis--can most accurately identify specific characteristics of online interaction. Statistically significant differences were found in all points of…

  14. Learning and understanding the Kruskal-Wallis one-way analysis-of-variance-by-ranks test for differences among three or more independent groups.

    PubMed

    Chan, Y; Walmsley, R P

    1997-12-01

    When several treatment methods are available for the same problem, many clinicians are faced with the task of deciding which treatment to use. Many clinicians may have conducted informal "mini-experiments" on their own to determine which treatment is best suited for the problem. These results are usually not documented or reported in a formal manner because many clinicians feel that they are "statistically challenged." Another reason may be because clinicians do not feel they have controlled enough test conditions to warrant analysis. In this update, a statistic is described that does not involve complicated statistical assumptions, making it a simple and easy-to-use statistical method. This update examines the use of two statistics and does not deal with other issues that could affect clinical research such as issues affecting credibility. For readers who want a more in-depth examination of this topic, references have been provided. The Kruskal-Wallis one-way analysis-of-variance-by-ranks test (or H test) is used to determine whether three or more independent groups are the same or different on some variable of interest when an ordinal level of data or an interval or ratio level of data is available. A hypothetical example will be presented to explain when and how to use this statistic, how to interpret results using the statistic, the advantages and disadvantages of the statistic, and what to look for in a written report. This hypothetical example will involve the use of ratio data to demonstrate how to choose between using the nonparametric H test and the more powerful parametric F test.
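
    To accompany the hypothetical example, here is a minimal sketch of the H test on three invented independent groups; the outcome values are illustrative only.

      from scipy import stats

      # Hypothetical outcome scores for three independent treatment groups; ordinal or
      # non-normal ratio data is where the H test is preferred over the parametric F test.
      treatment_a = [42, 45, 50, 38, 47]
      treatment_b = [55, 60, 52, 58, 61]
      treatment_c = [48, 44, 51, 46, 49]

      h_stat, p_value = stats.kruskal(treatment_a, treatment_b, treatment_c)
      print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
      # A small p-value says at least one group differs; locating which pairs differ
      # requires follow-up pairwise comparisons.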

  15. Secondary Analysis of National Longitudinal Transition Study 2 Data

    ERIC Educational Resources Information Center

    Hicks, Tyler A.; Knollman, Greg A.

    2015-01-01

    This review examines published secondary analyses of National Longitudinal Transition Study 2 (NLTS2) data, with a primary focus upon statistical objectives, paradigms, inferences, and methods. Its primary purpose was to determine which statistical techniques have been common in secondary analyses of NLTS2 data. The review begins with an…

  16. STATISTICAL TECHNIQUES FOR DETERMINATION AND PREDICTION OF FUNDAMENTAL FISH ASSEMBLAGES OF THE MID-ATLANTIC HIGHLANDS

    EPA Science Inventory

    A statistical software tool, Stream Fish Community Predictor (SFCP), based on EMAP stream sampling in the mid-Atlantic Highlands, was developed to predict stream fish communities using stream and watershed characteristics. Step one in the tool development was a cluster analysis t...

  17. Determination of apparent coupling factors for adhesive bonded acrylic plates using SEAL approach

    NASA Astrophysics Data System (ADS)

    Pankaj, Achuthan. C.; Shivaprasad, M. V.; Murigendrappa, S. M.

    2018-04-01

    Apparent coupling loss factors (CLF) and velocity responses have been computed for two lap-joined, adhesive-bonded plates using finite element and experimental statistical energy analysis (SEA)-like approaches. A finite element model of the plates was created using ANSYS software. The statistical energy parameters were computed using the velocity responses obtained from a harmonic forced-excitation analysis. Experiments were carried out for two different cases of adhesive-bonded joints, and the results were compared with the apparent coupling factors and velocity responses obtained from the finite element analysis. The results of these studies signify the importance of modeling the adhesive-bonded joints when computing the apparent coupling factors, and of their further use in computing energies and velocity responses with an SEA-like approach.

  18. Significant Association of Urinary Toxic Metals and Autism-Related Symptoms—A Nonlinear Statistical Analysis with Cross Validation

    PubMed Central

    Adams, James; Kruger, Uwe; Geis, Elizabeth; Gehn, Eva; Fimbres, Valeria; Pollard, Elena; Mitchell, Jessica; Ingram, Julie; Hellmers, Robert; Quig, David; Hahn, Juergen

    2017-01-01

    Introduction: A number of previous studies examined a possible association of toxic metals and autism, and over half of those studies suggest that toxic metal levels are different in individuals with Autism Spectrum Disorders (ASD). Additionally, several studies found that those levels correlate with the severity of ASD. Methods: In order to further investigate these points, this paper performs the most detailed statistical analysis to date of a data set in this field. First morning urine samples were collected from 67 children and adults with ASD and 50 neurotypical controls of similar age and gender. The samples were analyzed to determine the levels of 10 urinary toxic metals (UTM). Autism-related symptoms were assessed with eleven behavioral measures. Statistical analysis was used to distinguish participants on the ASD spectrum and neurotypical participants based upon the UTM data alone. The analysis also included examining the association of autism severity with toxic metal excretion data using linear and nonlinear analysis. “Leave-one-out” cross-validation was used to ensure statistical independence of results. Results and Discussion: Average excretion levels of several toxic metals (lead, tin, thallium, antimony) were significantly higher in the ASD group. However, ASD classification using univariate statistics proved difficult due to large variability, but nonlinear multivariate statistical analysis significantly improved ASD classification with Type I/II errors of 15% and 18%, respectively. These results clearly indicate that the urinary toxic metal excretion profiles of participants in the ASD group were significantly different from those of the neurotypical participants. Similarly, nonlinear methods determined a significantly stronger association between the behavioral measures and toxic metal excretion. The association was strongest for the Aberrant Behavior Checklist (including subscales on Irritability, Stereotypy, Hyperactivity, and Inappropriate Speech), but significant associations were found for UTM with all eleven autism-related assessments with cross-validation R2 values ranging from 0.12–0.48. PMID:28068407

  19. Analysis of a Rocket Based Combined Cycle Engine during Rocket Only Operation

    NASA Technical Reports Server (NTRS)

    Smith, T. D.; Steffen, C. J., Jr.; Yungster, S.; Keller, D. J.

    1998-01-01

    The all rocket mode of operation is a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. However, outside of performing experiments or a full three dimensional analysis, there are no first order parametric models to estimate performance. As a result, an axisymmetric RBCC engine was used to analytically determine specific impulse efficiency values based upon both full flow and gas generator configurations. Design of experiments methodology was used to construct a test matrix and statistical regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, percent of injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inject diameter ratio. A perfect gas computational fluid dynamics analysis was performed to obtain values of vacuum specific impulse. Statistical regression analysis was performed based on both full flow and gas generator engine cycles. Results were also found to be dependent upon the entire cycle assumptions. The statistical regression analysis determined that there were five significant linear effects, six interactions, and one second-order effect. Two parametric models were created to provide performance assessments of an RBCC engine in the all rocket mode of operation.
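
    A minimal sketch of the kind of regression model described above, fitted by least squares to invented coded DOE data; the factor names, coefficient values, and the particular interaction and second-order terms are assumptions for illustration, not the study's fitted parametric models.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 30

      # Hypothetical DOE matrix in coded units (-1..+1): chamber pressure, area ratio,
      # percent secondary flow. The response below is a made-up specific-impulse model.
      X = rng.uniform(-1, 1, size=(n, 3))
      isp = (340 + 12 * X[:, 0] + 6 * X[:, 1] - 4 * X[:, 2]
             + 3 * X[:, 0] * X[:, 1] - 2 * X[:, 2] ** 2 + rng.normal(0, 0.5, n))

      # Design matrix with linear effects, one interaction, and one second-order term.
      A = np.column_stack([np.ones(n), X, X[:, 0] * X[:, 1], X[:, 2] ** 2])
      coef, *_ = np.linalg.lstsq(A, isp, rcond=None)
      for name, c in zip(["intercept", "Pc", "eps", "sec_flow", "Pc*eps", "sec_flow^2"], coef):
          print(f"{name:>10s}: {c:+.2f}")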

  20. Recovering incomplete data using Statistical Multiple Imputations (SMI): a case study in environmental chemistry.

    PubMed

    Mercer, Theresa G; Frostick, Lynne E; Walmsley, Anthony D

    2011-10-15

    This paper presents a statistical technique that can be applied to environmental chemistry data where missing values and limit of detection levels prevent the application of statistics. A working example is taken from an environmental leaching study that was set up to determine if there were significant differences in levels of leached arsenic (As), chromium (Cr) and copper (Cu) between lysimeters containing preservative treated wood waste and those containing untreated wood. Fourteen lysimeters were set up and left in natural conditions for 21 weeks. The resultant leachate was analysed by ICP-OES to determine the As, Cr and Cu concentrations. However, due to the variation inherent in each lysimeter combined with the limits of detection offered by ICP-OES, the collected quantitative data were somewhat incomplete. Initial data analysis was hampered by the number of 'missing values' in the data. To recover the dataset, the statistical tool of Statistical Multiple Imputation (SMI) was applied, and the data were re-analysed successfully. It was demonstrated that using SMI did not affect the variance in the data, but facilitated analysis of the complete dataset. Copyright © 2011 Elsevier B.V. All rights reserved.
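
    The exact SMI procedure used in the paper is not reproduced here; as a loose stand-in, the sketch below draws several stochastic imputations with scikit-learn's IterativeImputer and pools the column means, using invented leachate values.

      import numpy as np
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer

      # Hypothetical leachate concentrations (mg/L); NaN marks missing / below-detection values.
      data = np.array([
          [0.12, 0.45, np.nan],
          [0.10, np.nan, 0.33],
          [np.nan, 0.50, 0.31],
          [0.15, 0.48, 0.36],
          [0.11, 0.44, np.nan],
      ])

      # Draw several stochastic imputations and pool the column means across them.
      m = 5
      means = []
      for i in range(m):
          imputer = IterativeImputer(sample_posterior=True, random_state=i)
          means.append(imputer.fit_transform(data).mean(axis=0))
      print("pooled column means:", np.round(np.mean(means, axis=0), 3))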

  1. Portfolio optimization problem with nonidentical variances of asset returns using statistical mechanical informatics.

    PubMed

    Shinzato, Takashi

    2016-12-01

    The portfolio optimization problem in which the variances of the return rates of assets are not identical is analyzed in this paper using the methodology of statistical mechanical informatics, specifically, replica analysis. We defined two characteristic quantities of an optimal portfolio, namely, minimal investment risk and investment concentration, in order to solve the portfolio optimization problem and analytically determined their asymptotical behaviors using replica analysis. Numerical experiments were also performed, and a comparison between the results of our simulation and those obtained via replica analysis validated our proposed method.

  2. Portfolio optimization problem with nonidentical variances of asset returns using statistical mechanical informatics

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2016-12-01

    The portfolio optimization problem in which the variances of the return rates of assets are not identical is analyzed in this paper using the methodology of statistical mechanical informatics, specifically, replica analysis. We defined two characteristic quantities of an optimal portfolio, namely, minimal investment risk and investment concentration, in order to solve the portfolio optimization problem and analytically determined their asymptotical behaviors using replica analysis. Numerical experiments were also performed, and a comparison between the results of our simulation and those obtained via replica analysis validated our proposed method.

  3. Stationary statistical theory of two-surface multipactor regarding all impacts for efficient threshold analysis

    NASA Astrophysics Data System (ADS)

    Lin, Shu; Wang, Rui; Xia, Ning; Li, Yongdong; Liu, Chunliang

    2018-01-01

    Statistical multipactor theories are critical prediction approaches for multipactor breakdown determination. However, these approaches still require a trade-off between calculation efficiency and accuracy. This paper presents an improved stationary statistical theory for efficient threshold analysis of two-surface multipactor. A general integral equation over the distribution function of the electron emission phase is formulated, with both single-sided and double-sided impacts considered. The modeling results indicate that the improved stationary statistical theory not only matches the accuracy of multipactor threshold calculation obtained with the nonstationary statistical theory, but also achieves high calculation efficiency. By using this improved stationary statistical theory, the total time consumed in calculating the full multipactor susceptibility zones of parallel plates can be decreased by as much as a factor of four relative to the nonstationary statistical theory. It is also shown that the effect of single-sided impacts is indispensable for accurate multipactor prediction of coaxial lines and is even more significant for high-order multipactor. Finally, the influence of secondary emission yield (SEY) properties on the multipactor threshold is further investigated. It is observed that the first cross energy and the energy range between the first cross and the SEY maximum both play a significant role in determining the multipactor threshold, which agrees with numerical simulation results in the literature.

  4. Noise induced hearing loss of forest workers in Turkey.

    PubMed

    Tunay, M; Melemez, K

    2008-09-01

    In this study, a total of 114 workers, divided into 3 groups by age and type of work, underwent audiometric analysis. Analysis of variance was applied to the data obtained from the evaluation in order to determine whether there were statistically significant differences between the hearing loss levels of the workers included in the study. Correlation and regression analyses were applied in order to determine the relationships between hearing loss and the workers' age and time of work. The variance analysis showed statistically significant differences at the 500, 2000 and 4000 Hz frequencies; the most distinct difference was observed among chainsaw operators at 4000 Hz. The correlation analysis showed significant relationships between time of work and hearing loss at the 0.01 significance level and between age and hearing loss at the 0.05 significance level. Forest workers using chainsaws should be informed of the risk, should wear protective equipment, less noisy chainsaws should be used where possible, and workers should undergo audiometric tests when they start work and once a year thereafter.

  5. Analysis of the sleep quality of elderly people using biomedical signals.

    PubMed

    Moreno-Alsasua, L; Garcia-Zapirain, B; Mendez-Zorrilla, A

    2015-01-01

    This paper presents a technical solution that analyses sleep signals captured by biomedical sensors to find possible disorders during rest. Specifically, the method evaluates electrooculogram (EOG) signals, skin conductance (GSR), air flow (AS), and body temperature. Next, a quantitative sleep quality analysis determines significant changes in the biological signals, and any similarities between them in a given time period. Filtering techniques such as the Fourier transform and IIR filters process the signals and identify significant variations. Once these changes have been identified, all significant data are compared and a quantitative and statistical analysis is carried out to determine a person's level of rest. To evaluate the correlations and significant differences, a statistical analysis was carried out, showing correlations between the EOG and AS signals (p=0.005), the EOG and GSR signals (p=0.037) and, finally, the EOG and body temperature (p=0.04). Doctors could use this information to monitor changes within a patient.
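
    A minimal sketch of the filtering and change-detection idea on a synthetic EOG-like trace; the sampling rate, band edges, window length, and variance rule are all illustrative assumptions rather than the paper's settings.

      import numpy as np
      from scipy import signal

      fs = 100.0                                   # hypothetical sampling rate, Hz
      t = np.arange(0, 60, 1 / fs)
      rng = np.random.default_rng(0)
      eog = np.sin(2 * np.pi * 0.3 * t) + 0.2 * rng.normal(size=t.size)   # toy EOG trace

      # 4th-order Butterworth IIR band-pass keeping a slow eye-movement band.
      b, a = signal.butter(4, [0.1, 5.0], btype="bandpass", fs=fs)
      filtered = signal.filtfilt(b, a, eog)

      # Simple change detection: flag 5-second windows whose variance departs from the median.
      win = int(5 * fs)
      variances = np.array([filtered[i:i + win].var() for i in range(0, filtered.size - win, win)])
      flags = np.abs(variances - np.median(variances)) > 3 * variances.std()
      print("flagged windows:", np.where(flags)[0].tolist())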

  6. Multivariate statistical analysis of the polyphenolic constituents in kiwifruit juices to trace fruit varieties and geographical origins.

    PubMed

    Guo, Jing; Yuan, Yahong; Dou, Pei; Yue, Tianli

    2017-10-01

    Fifty-one kiwifruit juice samples of seven kiwifruit varieties from five regions in China were analyzed to determine their polyphenol contents and to trace fruit varieties and geographical origins by multivariate statistical analysis. Twenty-one polyphenols belonging to four compound classes were determined by ultra-high-performance liquid chromatography coupled with ultra-high-resolution TOF mass spectrometry. (-)-Epicatechin, (+)-catechin, procyanidin B1 and caffeic acid derivatives were the predominant phenolic compounds in the juices. Principal component analysis (PCA) allowed a clear separation of the juices according to kiwifruit varieties. Stepwise linear discriminant analysis (SLDA) yielded satisfactory categorization of samples, providing a 100% success rate according to kiwifruit varieties and a 92.2% success rate according to geographical origins. The results showed that the polyphenolic profiles of kiwifruit juices contain enough information to trace fruit varieties and geographical origins. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Analysis of Pre-Service Science Teachers' Views about the Methods Which Develop Reflective Thinking

    ERIC Educational Resources Information Center

    Töman, Ufuk; Odabasi Çimer, Sabiha; Çimer, Atilla

    2014-01-01

    In this study, we investigate science and technology pre-service teachers' opinions about the methods that develop reflective thinking, and we determine their level of reflective thinking. This is a descriptive study. Open-ended questions were used to determine the views of the pre-service teachers. Questions used in the statistical analysis of…

  8. Wavelet analysis of polarization maps of polycrystalline biological fluids networks

    NASA Astrophysics Data System (ADS)

    Ushenko, Y. A.

    2011-12-01

    An optical model of human joint synovial fluid is proposed. The statistical (statistical moments), correlation (autocorrelation function) and self-similar (log-log dependencies of the power spectrum) structure of two-dimensional polarization distributions (polarization maps) of synovial fluid has been analyzed. It is shown that differentiating the polarization maps of joint synovial fluid samples in different physiological states calls for scale-discriminative analysis. To isolate the small-scale domain structure of the synovial fluid polarization maps, wavelet analysis has been used. The set of parameters that characterizes the statistical, correlation and self-similar structure of the wavelet coefficient distributions at different scales of the polarization domains has been determined for the diagnosis and differentiation of polycrystalline network transformations connected with pathological processes.

  9. Statistical analysis of RHIC beam position monitors performance

    NASA Astrophysics Data System (ADS)

    Calaga, R.; Tomás, R.

    2004-04-01

    A detailed statistical analysis of beam position monitors (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.
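
    A minimal sketch of how SVD can flag a noise-dominated BPM in a synthetic turn-by-turn data matrix; the tune, noise levels, number of modes kept, and the 0.9 residual-fraction cutoff are illustrative assumptions, not the RHIC analysis settings.

      import numpy as np

      rng = np.random.default_rng(1)
      turns, n_bpm = 1024, 60
      t = np.arange(turns)
      phase = rng.uniform(0, 2 * np.pi, n_bpm)

      # Synthetic turn-by-turn data: a common betatron oscillation seen by every BPM plus
      # small noise; BPM 7 is replaced by pure noise to mimic a malfunctioning monitor.
      data = np.sin(2 * np.pi * 0.31 * t[:, None] + phase[None, :])
      data += 0.05 * rng.normal(size=(turns, n_bpm))
      data[:, 7] = rng.normal(scale=2.0, size=turns)

      centered = data - data.mean(axis=0)
      U, s, Vt = np.linalg.svd(centered, full_matrices=False)

      # Reconstruct from the two dominant (betatron) modes: a healthy BPM is well explained,
      # while a faulty one leaves most of its variance in the residual.
      recon = (U[:, :2] * s[:2]) @ Vt[:2]
      resid_fraction = ((centered - recon) ** 2).mean(axis=0) / (centered ** 2).mean(axis=0)
      print("suspect BPM indices:", np.where(resid_fraction > 0.9)[0].tolist())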

  10. Inhibition of Orthopaedic Implant Infections by Immunomodulatory Effects of Host Defense Peptides

    DTIC Science & Technology

    2014-12-01

    significance was determined by t-tests or by one-way analysis of variance (ANOVA) followed by Bonferroni post hoc tests in experiments with multiple...groups. Non-parametric Mann-Whitney tests, Kruskal-Wallis ANOVA followed by Newman-Keuls post hoc tests, or van Elteren's two-way tests were applied to...in D, and black symbols in A), statistical analysis was by one-way ANOVA followed by Bonferroni versus control post hoc tests. Otherwise, statistical

  11. Statistical analysis of oil percolation through pressboard measured by optical recording

    NASA Astrophysics Data System (ADS)

    Rogalski, Przemysław; Kozak, Czesław

    2017-08-01

    The paper presents a measuring station used to measure the percolation of transformer oil through electrotechnical pressboard. The percolation rate of Nytro Taurus insulating oil, manufactured by Nynas, through pressboard made by Pucaro was investigated. Approximately 60 samples of Pucaro pressboard, widely used for the insulation of power transformers, were measured. A statistical analysis of the oil percolation times was performed. The measurements made it possible to determine the distribution of capillary diameters occurring in the pressboard.

  12. A Matlab user interface for the statistically assisted fluid registration algorithm and tensor-based morphometry

    NASA Astrophysics Data System (ADS)

    Yepes-Calderon, Fernando; Brun, Caroline; Sant, Nishita; Thompson, Paul; Lepore, Natasha

    2015-01-01

    Tensor-Based Morphometry (TBM) is an increasingly popular method for group analysis of brain MRI data. The main steps in the analysis consist of a nonlinear registration to align each individual scan to a common space, and a subsequent statistical analysis to determine morphometric differences, or differences in fiber structure, between groups. Recently, we implemented the Statistically-Assisted Fluid Registration Algorithm, or SAFIRA [1], which is designed for tracking morphometric differences among populations. To this end, SAFIRA allows the inclusion of statistical priors extracted from the populations being studied as regularizers in the registration. This flexibility and degree of sophistication limit the tool to expert use, even more so considering that SAFIRA was initially implemented in command line mode. Here, we introduce a new, intuitive, easy to use, Matlab-based graphical user interface for SAFIRA's multivariate TBM. The interface also generates different choices for the TBM statistics, including both the traditional univariate statistics on the Jacobian matrix and comparison of the full deformation tensors [2]. This software will be freely disseminated to the neuroimaging research community.

  13. Statistics used in current nursing research.

    PubMed

    Zellner, Kathleen; Boerst, Connie J; Tabb, Wil

    2007-02-01

    Undergraduate nursing research courses should emphasize the statistics most commonly used in the nursing literature to strengthen students' and beginning researchers' understanding of them. To determine the most commonly used statistics, we reviewed all quantitative research articles published in 13 nursing journals in 2000. The findings supported Beitz's categorization of kinds of statistics. Ten primary statistics used in 80% of nursing research published in 2000 were identified. We recommend that the appropriate use of those top 10 statistics be emphasized in undergraduate nursing education and that the nursing profession continue to advocate for the use of methods (e.g., power analysis, odds ratio) that may contribute to the advancement of nursing research.

  14. Statistical issues in the design, conduct and analysis of two large safety studies.

    PubMed

    Gaffney, Michael

    2016-10-01

    The emergence, post approval, of serious medical events, which may be associated with the use of a particular drug or class of drugs, is an important public health and regulatory issue. The best method to address this issue is through a large, rigorously designed safety study. Therefore, it is important to elucidate the statistical issues involved in these large safety studies. Two such studies are PRECISION and EAGLES. PRECISION is the primary focus of this article. PRECISION is a non-inferiority design with a clinically relevant non-inferiority margin. Statistical issues in the design, conduct and analysis of PRECISION are discussed. Quantitative and clinical aspects of the selection of the composite primary endpoint, the determination and role of the non-inferiority margin in a large safety study and the intent-to-treat and modified intent-to-treat analyses in a non-inferiority safety study are shown. Protocol changes that were necessary during the conduct of PRECISION are discussed from a statistical perspective. Issues regarding the complex analysis and interpretation of the results of PRECISION are outlined. EAGLES is presented as a large, rigorously designed safety study when a non-inferiority margin was not able to be determined by a strong clinical/scientific method. In general, when a non-inferiority margin is not able to be determined, the width of the 95% confidence interval is a way to size the study and to assess the cost-benefit of relative trial size. A non-inferiority margin, when able to be determined by a strong scientific method, should be included in a large safety study. Although these studies could not be called "pragmatic," they are examples of best real-world designs to address safety and regulatory concerns. © The Author(s) 2016.

  15. Landing Site Dispersion Analysis and Statistical Assessment for the Mars Phoenix Lander

    NASA Technical Reports Server (NTRS)

    Bonfiglio, Eugene P.; Adams, Douglas; Craig, Lynn; Spencer, David A.; Strauss, William; Seelos, Frank P.; Seelos, Kimberly D.; Arvidson, Ray; Heet, Tabatha

    2008-01-01

    The Mars Phoenix Lander launched on August 4, 2007 and successfully landed on Mars 10 months later on May 25, 2008. Landing ellipse predictions and hazard maps were key in selecting safe surface targets for Phoenix. Hazard maps were based on terrain slopes, geomorphology maps and automated rock counts from MRO's High Resolution Imaging Science Experiment (HiRISE) images. The expected landing dispersion which led to the selection of Phoenix's surface target is discussed, as well as the actual landing dispersion predictions determined during operations in the weeks, days, and hours before landing. A statistical assessment of these dispersions is performed, comparing the actual landing-safety probabilities to criteria levied by the project. Also discussed are applications of this statistical analysis which were used by the Phoenix project. These include using the statistical analysis to verify the effectiveness of a pre-planned maneuver menu and to calculate the probability of future maneuvers.

  16. Knowledge-Sharing Intention among Information Professionals in Nigeria: A Statistical Analysis

    ERIC Educational Resources Information Center

    Tella, Adeyinka

    2016-01-01

    In this study, the researcher administered a survey and developed and tested a statistical model to examine the factors that determine the intention of information professionals in Nigeria to share knowledge with their colleagues. The result revealed correlations between the overall score for intending to share knowledge and other…

  17. Quantity of dates trumps quality of dates for dense Bayesian radiocarbon sediment chronologies - Gas ion source 14C dating instructed by simultaneous Bayesian accumulation rate modeling

    NASA Astrophysics Data System (ADS)

    Rosenheim, B. E.; Firesinger, D.; Roberts, M. L.; Burton, J. R.; Khan, N.; Moyer, R. P.

    2016-12-01

    Radiocarbon (14C) sediment core chronologies benefit from a high density of dates, even when precision of individual dates is sacrificed. This is demonstrated by a combined approach of rapid 14C analysis of CO2 gas generated from carbonates and organic material coupled with Bayesian statistical modeling. Analysis of 14C is facilitated by the gas ion source on the Continuous Flow Accelerator Mass Spectrometry (CFAMS) system at the Woods Hole Oceanographic Institution's National Ocean Sciences Accelerator Mass Spectrometry facility. This instrument is capable of producing a 14C determination of +/- 100 14C y precision every 4-5 minutes, with limited sample handling (dissolution of carbonates and/or combustion of organic carbon in evacuated containers). Rapid analysis allows over-preparation of samples to include replicates at each depth and/or comparison of different sample types at particular depths in a sediment or peat core. Analysis priority is given to depths that have the least chronologic precision as determined by Bayesian modeling of the chronology of calibrated ages. Use of such a statistical approach to determine the order in which samples are run ensures that the chronology constantly improves so long as material is available for the analysis of chronologic weak points. Ultimately, accuracy of the chronology is determined by the material that is actually being dated, and our combined approach allows testing of different constituents of the organic carbon pool and the carbonate minerals within a core. We will present preliminary results from a deep-sea sediment core abundant in deep-sea foraminifera as well as coastal wetland peat cores to demonstrate statistical improvements in sediment- and peat-core chronologies obtained by increasing the quantity and decreasing the quality of individual dates.

  18. Criteria for a State-of-the-Art Vision Test System

    DTIC Science & Technology

    1985-05-01

    tests are enumerated for possible inclusion in a battery of candidate vision tests to be statistically examined for validity as predictors of aircrew...derived subset thereof) of vision tests may be given to a series of individuals, and statistical tests may be used to determine which visual functions...no target. Statistical analysis of the responses would set a threshold level, which would define the smallest size - (most distant target) or least

  19. Application of multivariate statistical techniques for differentiation of ripe banana flour based on the composition of elements.

    PubMed

    Alkarkhi, Abbas F M; Ramli, Saifullah Bin; Easa, Azhar Mat

    2009-01-01

    Major (sodium, potassium, calcium, magnesium) and minor elements (iron, copper, zinc, manganese) and one heavy metal (lead) of Cavendish banana flour and Dream banana flour were determined, and data were analyzed using multivariate statistical techniques of factor analysis and discriminant analysis. Factor analysis yielded four factors explaining more than 81% of the total variance: the first factor explained 28.73%, comprising magnesium, sodium, and iron; the second factor explained 21.47%, comprising only manganese and copper; the third factor explained 15.66%, comprising zinc and lead; while the fourth factor explained 15.50%, comprising potassium. Discriminant analysis showed that magnesium and sodium exhibited a strong contribution in discriminating the two types of banana flour, affording 100% correct assignation. This study presents the usefulness of multivariate statistical techniques for analysis and interpretation of complex mineral content data from banana flour of different varieties.

  20. Statistical sensitivity analysis of a simple nuclear waste repository model

    NASA Astrophysics Data System (ADS)

    Ronen, Y.; Lucius, J. L.; Blow, E. M.

    1980-06-01

    A preliminary step in a comprehensive sensitivity analysis of the modeling of a nuclear waste repository is presented. The purpose of the complete analysis is to determine which modeling parameters and physical data are most important in determining key design performance criteria, and then to obtain the uncertainty in the design for safety considerations. The theory for a statistical screening design methodology is developed for later use in the overall program. The theory was applied to a test case: determining the relative importance of the sensitivity of the near-field temperature distribution in a single-level salt repository to modeling parameters. The exact values of the sensitivities to these physical and modeling parameters were then obtained using direct methods of recalculation. The sensitivity coefficients found to be important for the sample problem were the thermal loading, the distance between the spent fuel canisters, and the canister radius. Other important parameters were those related to salt properties at a point of interest in the repository.

  1. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    As compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable estimate of copy number with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements in real-time PCR based transgene copy number determination. Three experimental designs and four data quality control integrated statistical models are presented. For the first method, external calibration curves are established for the transgene based on serially-diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two group T-test procedures were combined to model the data from this design. For the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, but rather, is based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data based on two different approaches of amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods allow real-time PCR-based transgene copy number estimation to be more reliable and precise with a proper statistical estimation. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four different statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
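
    A minimal sketch of the first design described above (external calibration curve, control event versus putative event); the Ct values and dilution series are invented, and this simple point estimate omits the confidence-interval treatment the paper argues is necessary.

      import numpy as np

      # Hypothetical calibration: Ct values measured on serially diluted standards.
      copies = np.array([1.0, 2.0, 4.0, 8.0, 16.0])   # relative copy number of the standard
      ct = np.array([28.1, 27.0, 26.1, 25.0, 24.1])   # observed threshold cycles

      # Regress Ct on log2(copy number); a slope near -1 corresponds to ~100% efficiency.
      slope, intercept = np.polyfit(np.log2(copies), ct, 1)
      print(f"slope = {slope:.2f}, intercept = {intercept:.2f}")

      # Compare a putative transgenic event with a known single-copy control event.
      ct_control, ct_event = 27.0, 26.0
      est_control = 2 ** ((ct_control - intercept) / slope)
      est_event = 2 ** ((ct_event - intercept) / slope)
      print(f"copy number relative to control: {est_event / est_control:.2f}")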

  2. Point pattern analysis of FIA data

    Treesearch

    Chris Woodall

    2002-01-01

    Point pattern analysis is a branch of spatial statistics that quantifies the spatial distribution of points in two-dimensional space. Point pattern analysis was conducted on stand stem-maps from FIA fixed-radius plots to explore point pattern analysis techniques and to determine the ability of pattern descriptions to describe stand attributes. Results indicate that the...

  3. An improved method for determining force balance calibration accuracy

    NASA Technical Reports Server (NTRS)

    Ferris, Alice T.

    1993-01-01

    The results of an improved statistical method used at Langley Research Center for determining and stating the accuracy of a force balance calibration are presented. The application of the method for initial loads, initial load determination, auxiliary loads, primary loads, and proof loads is described. The data analysis is briefly addressed.

  4. A new strategy for statistical analysis-based fingerprint establishment: Application to quality assessment of Semen sojae praeparatum.

    PubMed

    Guo, Hui; Zhang, Zhen; Yao, Yuan; Liu, Jialin; Chang, Ruirui; Liu, Zhao; Hao, Hongyuan; Huang, Taohong; Wen, Jun; Zhou, Tingting

    2018-08-30

    Semen sojae praeparatum, used as both a medicine and a food, is a famous traditional Chinese medicine. A simple and effective quality fingerprint analysis, coupled with chemometric methods, was developed for quality assessment of Semen sojae praeparatum. First, similarity analysis (SA) and hierarchical clustering analysis (HCA) were applied to select the qualitative markers that most strongly influence the quality of Semen sojae praeparatum. 21 chemicals were selected and characterized by liquid chromatography with high-resolution ion trap/time-of-flight mass spectrometry (LC-IT-TOF-MS). Subsequently, principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were conducted to select the quantitative markers of Semen sojae praeparatum samples from different origins. Moreover, 11 compounds with statistical significance were determined quantitatively, providing accurate and informative data for quality evaluation. This study proposes a new strategy for "statistical analysis-based fingerprint establishment", which would be a valuable reference for further study. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. 10 CFR 431.17 - Determination of efficiency.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... method or methods used; the mathematical model, the engineering or statistical analysis, computer... accordance with § 431.16 of this subpart, or by application of an alternative efficiency determination method... must be: (i) Derived from a mathematical model that represents the mechanical and electrical...

  6. USAF (United States Air Force) Stability and Control DATCOM (Data Compendium)

    DTIC Science & Technology

    1978-04-01

    regression analysis involves the study of a group of variables to determine their effect on a given parameter. Because of the empirical nature of this...regression analysis of mathematical statistics. In general, a regression analysis involves the study of a group of variables to determine their effect on a...Excperiment, OSR TN 58-114, MIT Fluid Dynamics Research Group Rapt. 57-5, 1957. (U) 90. Kennet, H., and Ashley, H.: Review of Unsteady Aerodynamic Studies in

  7. Psychometric Analysis of Role Conflict and Ambiguity Scales in Academia

    ERIC Educational Resources Information Center

    Khan, Anwar; Yusoff, Rosman Bin Md.; Khan, Muhammad Muddassar; Yasir, Muhammad; Khan, Faisal

    2014-01-01

    A comprehensive psychometric analysis of Rizzo et al.'s (1970) Role Conflict & Ambiguity (RCA) scales was performed after their distribution among 600 academic staff working in six universities in Pakistan. The reliability analysis included calculation of Cronbach's alpha coefficients and inter-item statistics, whereas validity was determined by…

  8. Analysis of S-box in Image Encryption Using Root Mean Square Error Method

    NASA Astrophysics Data System (ADS)

    Hussain, Iqtadar; Shah, Tariq; Gondal, Muhammad Asif; Mahmood, Hasan

    2012-07-01

    The use of substitution boxes (S-boxes) in encryption applications has proven to be an effective nonlinear component in creating confusion and randomness. The S-box is evolving and many variants appear in the literature, including the advanced encryption standard (AES) S-box, affine power affine (APA) S-box, Skipjack S-box, Gray S-box, Lui J S-box, residue prime number S-box, Xyi S-box, and S8 S-box. These S-boxes have algebraic and statistical properties which distinguish them from each other in terms of encryption strength. In some circumstances, the parameters from algebraic and statistical analysis yield results which do not provide clear evidence for distinguishing an S-box for application to a particular set of data. In image encryption applications, the use of S-boxes needs special care because the visual analysis and perception of a viewer can sometimes identify artifacts embedded in the image. In addition to the existing algebraic and statistical analyses already used for image encryption applications, we propose an application of the root mean square error technique, which further elaborates the results and enables the analyst to vividly distinguish between the performances of various S-boxes. While the use of root mean square error analysis in statistics has proven effective in determining the difference between original data and processed data, its use in image encryption has shown promising results in estimating the strength of the encryption method. In this paper, we show the application of root mean square error analysis to S-box image encryption. The parameters from this analysis are used in determining the strength of S-boxes.
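
    A minimal sketch of the root mean square error measure, comparing a plain image with two stand-in "encrypted" images; the arrays are random toy data, not outputs of any of the S-boxes named above.

      import numpy as np

      def rmse(a, b):
          # Root mean square error between two equally sized 8-bit images.
          return np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2))

      rng = np.random.default_rng(0)
      plain = rng.integers(0, 256, size=(256, 256))
      cipher_strong = rng.integers(0, 256, size=(256, 256))   # stand-in for a strong substitution
      cipher_weak = (plain + 3) % 256                          # weak substitution, close to the plain image

      print(f"strong S-box stand-in: RMSE = {rmse(plain, cipher_strong):.1f}")
      print(f"weak S-box stand-in:   RMSE = {rmse(plain, cipher_weak):.1f}")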

  9. Quality control analysis : part II : soil and aggregate base course.

    DOT National Transportation Integrated Search

    1966-07-01

    This is the second of the three reports on the quality control analysis of highway construction materials. : It deals with the statistical evaluation of results from several construction projects to determine the basic pattern of variability with res...

  10. Analysis of traffic accident data in Kentucky (1994-1998)

    DOT National Transportation Integrated Search

    1999-09-01

    This report includes an analysis of traffic accident data in Kentucky for the years of 1994 through 1998. A primary objective of this study was to determine average accident statistics for Kentucky highways. Average and critical numbers and rates of ...

  11. Analysis of traffic crash data in Kentucky : 2002-2006.

    DOT National Transportation Integrated Search

    2007-09-01

    This report includes an analysis of traffic accident data in Kentucky for the years of 2002 through 2006. A primary objective of this study was to determine average accident statistics for Kentucky highways. Average and critical numbers and rates of ...

  12. Analysis of traffic crash data in Kentucky : 2003-2007.

    DOT National Transportation Integrated Search

    2008-08-01

    This report includes an analysis of traffic accident data in Kentucky for the years of 2003 through 2007. A primary objective of this study was to determine average accident statistics for Kentucky highways. Average and critical numbers and rates of ...

  13. Quality control analysis : part III : concrete and concrete aggregates.

    DOT National Transportation Integrated Search

    1966-11-01

    This is the third and last report on the Quality Control Analysis of highway construction materials. : It deals with the statistical evaluation of data from several construction projects to determine the basic pattern of variability with respect to s...

  14. Physical and genetic-interaction density reveals functional organization and informs significance cutoffs in genome-wide screens

    PubMed Central

    Dittmar, John C.; Pierce, Steven; Rothstein, Rodney; Reid, Robert J. D.

    2013-01-01

    Genome-wide experiments often measure quantitative differences between treated and untreated cells to identify affected strains. For these studies, statistical models are typically used to determine significance cutoffs. We developed a method termed “CLIK” (Cutoff Linked to Interaction Knowledge) that overlays biological knowledge from the interactome on screen results to derive a cutoff. The method takes advantage of the fact that groups of functionally related interacting genes often respond similarly to experimental conditions and, thus, cluster in a ranked list of screen results. We applied CLIK analysis to five screens of the yeast gene disruption library and found that it defined a significance cutoff that differed from traditional statistics. Importantly, verification experiments revealed that the CLIK cutoff correlated with the position in the rank order where the rate of true positives drops off significantly. In addition, the gene sets defined by CLIK analysis often provide further biological perspectives. For example, applying CLIK analysis retrospectively to a screen for cisplatin sensitivity allowed us to identify the importance of the Hrq1 helicase in DNA crosslink repair. Furthermore, we demonstrate the utility of CLIK to determine optimal treatment conditions by analyzing genome-wide screens at multiple rapamycin concentrations. We show that CLIK is an extremely useful tool for evaluating screen quality, determining screen cutoffs, and comparing results between screens. Furthermore, because CLIK uses previously annotated interaction data to determine biologically informed cutoffs, it provides additional insights into screen results, which supplement traditional statistical approaches. PMID:23589890

  15. Adaptation of Lorke's method to determine and compare ED50 values: the cases of two anticonvulsants drugs.

    PubMed

    Garrido-Acosta, Osvaldo; Meza-Toledo, Sergio Enrique; Anguiano-Robledo, Liliana; Valencia-Hernández, Ignacio; Chamorro-Cevallos, Germán

    2014-01-01

    We determined the median effective dose (ED50) values for the anticonvulsants phenobarbital and sodium valproate using a modification of Lorke's method. This modification allowed appropriate statistical analysis and the use of a smaller number of mice per compound tested. The anticonvulsant activities of phenobarbital and sodium valproate were evaluated in male CD1 mice by maximal electroshock (MES) and intraperitoneal administration of pentylenetetrazole (PTZ). The anticonvulsant ED50 values were obtained through modifications of Lorke's method that involved changes in the selection of the three first doses in the initial test and the fourth dose in the second test. Furthermore, a test was added to evaluate the ED50 calculated by the modified Lorke's method, allowing statistical analysis of the data and determination of the confidence limits for ED50. The ED50 for phenobarbital against MES- and PTZ-induced seizures was 16.3mg/kg and 12.7mg/kg, respectively. The sodium valproate values were 261.2mg/kg and 159.7mg/kg, respectively. These results are similar to those found using the traditional methods of finding ED50, suggesting that the modifications made to Lorke's method generate equal results using fewer mice while increasing confidence in the statistical analysis. This adaptation of Lorke's method can be used to determine the median lethal dose (LD50) or ED50 for compounds with other pharmacological activities. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Quantitative Methods in Library and Information Science Literature: Descriptive vs. Inferential Statistics.

    ERIC Educational Resources Information Center

    Brattin, Barbara C.

    Content analysis was performed on the top six core journals for 1990 in library and information science to determine the extent of research in the field. Articles (n=186) were examined for descriptive or inferential statistics and separately for the presence of mathematical models. Results show a marked (14%) increase in research for 1990,…

  17. Determining the Number of Component Clusters in the Standard Multivariate Normal Mixture Model Using Model-Selection Criteria.

    DTIC Science & Technology

    1983-06-16

    has been advocated by Gnanadesikan and Wilk (1969), and others in the literature. This suggests that, if we use the formal significance test type...American Statistical Asso., 62, 1159-1178. Gnanadesikan, R., and Wilk, M.B. (1969). Data Analytic Methods in Multivariate Statistical Analysis. In

  18. Appropriate Statistical Analysis for Two Independent Groups of Likert-Type Data

    ERIC Educational Resources Information Center

    Warachan, Boonyasit

    2011-01-01

    The objective of this research was to determine the robustness and statistical power of three different methods for testing the hypothesis that ordinal samples of five and seven Likert categories come from equal populations. The three methods are the two sample t-test with equal variances, the Mann-Whitney test, and the Kolmogorov-Smirnov test. In…

  19. Population Analysis of Disabled Children by Departments in France

    NASA Astrophysics Data System (ADS)

    Meidatuzzahra, Diah; Kuswanto, Heri; Pech, Nicolas; Etchegaray, Amélie

    2017-06-01

    In this study, a statistical analysis is performed by modelling the variation of the disabled population aged 0-19 years among French departments. The aim is to classify the departments according to their profile determinants (socioeconomic and behavioural profiles). The analysis focuses on two methods, principal component analysis (PCA) and multiple correspondence analysis (MCA), to determine which better captures the correlations among the determinants of disability (the independent variables). PCA proves to be the better method for this purpose: it reduces the 14 determinants of disability to 4 axes, keeps 80% of the total information, and classifies the departments into 7 classes. The MCA reduces the determinants to 3 axes, retains only 30% of the information, and classifies them into 4 classes.

  20. Representation of Probability Density Functions from Orbit Determination using the Particle Filter

    NASA Technical Reports Server (NTRS)

    Mashiku, Alinda K.; Garrison, James; Carpenter, J. Russell

    2012-01-01

    Statistical orbit determination enables us to obtain estimates of the state and the statistical information of its region of uncertainty. In order to obtain an accurate representation of the probability density function (PDF) that incorporates higher order statistical information, we propose the use of nonlinear estimation methods such as the Particle Filter. The Particle Filter (PF) is capable of providing a PDF representation of the state estimates whose accuracy is dependent on the number of particles or samples used. For this method to be applicable to real case scenarios, we need a way of accurately representing the PDF in a compressed manner with little information loss. Hence we propose using the Independent Component Analysis (ICA) as a non-Gaussian dimensional reduction method that is capable of maintaining higher order statistical information obtained using the PF. Methods such as the Principal Component Analysis (PCA) are based on utilizing up to second order statistics, hence will not suffice in maintaining maximum information content. Both the PCA and the ICA are applied to two scenarios that involve a highly eccentric orbit with a lower a priori uncertainty covariance and a less eccentric orbit with a higher a priori uncertainty covariance, to illustrate the capability of the ICA in relation to the PCA.
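
    As an illustration of the contrast drawn above between second-order (PCA) and higher-order (ICA) compression of a particle ensemble, a minimal sketch follows. It uses scikit-learn on a synthetic non-Gaussian particle cloud; the dimensions, sample sizes and distributions are illustrative assumptions, not the orbit scenarios of the record.

```python
# Sketch (not the authors' implementation): compare PCA and FastICA as dimensionality-
# reduction transforms for a non-Gaussian particle cloud such as the state samples
# produced by a particle filter. All data here are synthetic.
import numpy as np
from scipy.stats import skew
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
n_particles = 5000
# Hypothetical 6-D "particle" ensemble with one strongly skewed (non-Gaussian) axis.
particles = np.hstack([
    rng.normal(size=(n_particles, 5)),
    rng.exponential(scale=1.0, size=(n_particles, 1)),
])

z_pca = PCA(n_components=3).fit_transform(particles)                      # keeps second-order structure only
z_ica = FastICA(n_components=3, random_state=0).fit_transform(particles)  # seeks non-Gaussian, independent axes

# Compare how much higher-order structure (here, skewness) survives the compression.
print("skewness of PCA components:", np.round(skew(z_pca), 2))
print("skewness of ICA components:", np.round(skew(z_ica), 2))
```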

  1. A Framework for Establishing Standard Reference Scale of Texture by Multivariate Statistical Analysis Based on Instrumental Measurement and Sensory Evaluation.

    PubMed

    Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye

    2016-01-13

    A framework for establishing a standard reference scale (texture) is proposed using multivariate statistical analysis based on instrumental measurement and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with characteristics of universality, representativeness, stability, substitutability, and traceability. The reasonableness of the framework method is verified by establishing a standard reference scale for the texture attribute (hardness) with well-known Chinese foods. More than 100 food products in 16 categories were tested using instrumental measurement (TPA test), and the result was analyzed with clustering analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of foods were determined to construct the hardness standard reference scale. The results indicate that the regression coefficient between the estimated sensory value and the instrumentally measured value is significant (R² = 0.9765), which fits well with Stevens's theory. The research provides a reliable theoretical basis and practical guide for quantitative standard reference scale establishment on food texture characteristics.

  2. Method and system of Jones-matrix mapping of blood plasma films with "fuzzy" analysis in differentiation of breast pathology changes

    NASA Astrophysics Data System (ADS)

    Zabolotna, Natalia I.; Radchenko, Kostiantyn O.; Karas, Oleksandr V.

    2018-01-01

    Diagnosis of breast fibroadenoma using statistical analysis (determination and analysis of statistical moments of the 1st-4th order) of polarization images of the imaginary elements of the Jones matrix of optically thin (attenuation coefficient τ <= 0.1) blood plasma films, with further intelligent differentiation based on the method of "fuzzy" logic and discriminant analysis, is proposed. The accuracy of differentiating blood plasma samples into the "norm" and breast "fibroadenoma" groups was 82.7% with the method of linear discriminant analysis and 95.3% with the "fuzzy" logic method. The obtained results confirm the potentially high reliability of the method of differentiation by "fuzzy" analysis.

  3. A Guideline to Univariate Statistical Analysis for LC/MS-Based Untargeted Metabolomics-Derived Data

    PubMed Central

    Vinaixa, Maria; Samino, Sara; Saez, Isabel; Duran, Jordi; Guinovart, Joan J.; Yanes, Oscar

    2012-01-01

    Several metabolomic software programs provide methods for peak picking, retention time alignment and quantification of metabolite features in LC/MS-based metabolomics. Statistical analysis, however, is needed in order to discover those features significantly altered between samples. By comparing the retention time and MS/MS data of a model compound to that from the altered feature of interest in the research sample, metabolites can then be unequivocally identified. This paper provides a comprehensive overview of a workflow for statistical analysis to rank relevant metabolite features that will be selected for further MS/MS experiments. We focus on univariate data analysis applied in parallel on all detected features. Characteristics and challenges of this analysis are discussed and illustrated using four different real LC/MS untargeted metabolomic datasets. We demonstrate the influence of considering or violating mathematical assumptions on which univariate statistical tests rely, using high-dimensional LC/MS datasets. Issues in data analysis such as determination of sample size, analytical variation, assumption of normality and homoscedasticity, or correction for multiple testing are discussed and illustrated in the context of our four untargeted LC/MS working examples. PMID:24957762

  4. A Guideline to Univariate Statistical Analysis for LC/MS-Based Untargeted Metabolomics-Derived Data.

    PubMed

    Vinaixa, Maria; Samino, Sara; Saez, Isabel; Duran, Jordi; Guinovart, Joan J; Yanes, Oscar

    2012-10-18

    Several metabolomic software programs provide methods for peak picking, retention time alignment and quantification of metabolite features in LC/MS-based metabolomics. Statistical analysis, however, is needed in order to discover those features significantly altered between samples. By comparing the retention time and MS/MS data of a model compound to that from the altered feature of interest in the research sample, metabolites can then be unequivocally identified. This paper provides a comprehensive overview of a workflow for statistical analysis to rank relevant metabolite features that will be selected for further MS/MS experiments. We focus on univariate data analysis applied in parallel on all detected features. Characteristics and challenges of this analysis are discussed and illustrated using four different real LC/MS untargeted metabolomic datasets. We demonstrate the influence of considering or violating mathematical assumptions on which univariate statistical tests rely, using high-dimensional LC/MS datasets. Issues in data analysis such as determination of sample size, analytical variation, assumption of normality and homoscedasticity, or correction for multiple testing are discussed and illustrated in the context of our four untargeted LC/MS working examples.
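
    A minimal sketch of the per-feature univariate workflow described above is given below, assuming an intensity matrix of samples by features and two sample groups. The normality check, test choice and Benjamini-Hochberg correction mirror the issues discussed in the record; the data and variable names are synthetic placeholders, not taken from any specific software package.

```python
# Minimal sketch of a per-feature univariate screen for an untargeted LC/MS matrix.
# Assumes `intensities` is (n_samples, n_features) and `group` labels two classes;
# all numbers are synthetic.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)
intensities = rng.lognormal(mean=5.0, sigma=0.5, size=(20, 300))
group = np.array([0] * 10 + [1] * 10)

case, ctrl = intensities[group == 1], intensities[group == 0]

pvals = np.empty(intensities.shape[1])
for j in range(intensities.shape[1]):
    # Shapiro-Wilk as a rough normality check; fall back to Mann-Whitney if violated.
    normal = (stats.shapiro(case[:, j]).pvalue > 0.05 and
              stats.shapiro(ctrl[:, j]).pvalue > 0.05)
    if normal:
        pvals[j] = stats.ttest_ind(case[:, j], ctrl[:, j], equal_var=False).pvalue
    else:
        pvals[j] = stats.mannwhitneyu(case[:, j], ctrl[:, j]).pvalue

# Benjamini-Hochberg correction for multiple testing across all features.
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print("features passing FDR 5%:", int(reject.sum()))
```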

  5. Analysis of traffic crash data in Kentucky : 1997-2001.

    DOT National Transportation Integrated Search

    2002-08-01

    This report documents an analysis of traffic crash data in Kentucky for the years of 1997 through 2001. A primary objective of this study was to determine average crash statistics for Kentucky highways. Average and critical numbers and rates of crash...

  6. Analysis of traffic crash data in Kentucky (2011-2015).

    DOT National Transportation Integrated Search

    2016-09-01

    This report documents an analysis of traffic crash data in Kentucky for the years of 2011 through 2015. A primary objective of this study was to determine average crash statistics for Kentucky highways. Rates were calculated for various types of high...

  7. Analysis of traffic crash data in Kentucky (1997-2001)

    DOT National Transportation Integrated Search

    2002-08-01

    This report documents an analysis of traffic crash data in Kentucky for the years of 1997 through 2001. A primary objective of this study was to determine average crash statistics for Kentucky highways. Average and critical numbers and rates of crash...

  8. Analysis of traffic accident data in Kentucky (1986-1990)

    DOT National Transportation Integrated Search

    1991-09-01

    This report includes an analysis of traffic accident data in Kentucky for the years of 1986-1990. A primary objective of this study was to determine average statistics for Kentucky highways. Average and critical number and rates of accidents were calc...

  9. Analysis of Traffic Crash Data in Kentucky (2012-2016).

    DOT National Transportation Integrated Search

    2017-09-01

    This report documents an analysis of traffic crash data in Kentucky for the years of 2012 through 2016. A primary objective of this study was to determine average crash statistics for Kentucky highways. Rates were calculated for various types of high...

  10. Analysis of traffic crash data in Kentucky (2009-2013).

    DOT National Transportation Integrated Search

    2014-09-01

    This report documents an analysis of traffic crash data in Kentucky for the years of 2009 through 2013. A primary objective of this study was to determine average crash statistics for Kentucky highways. Rates were calculated for various types of high...

  11. Analysis of traffic crash data in Kentucky (1996-2000)

    DOT National Transportation Integrated Search

    2001-09-01

    This report documents an analysis of traffic crash data in Kentucky for the years of 1996 through 2000. A primary objective of this study was to determine average crash statistics for Kentucky highways. Average and critical numbers and rates of crash...

  12. Analysis of traffic crash data in Kentucky : 1996-2000.

    DOT National Transportation Integrated Search

    2001-09-01

    This report documents an analysis of traffic crash data in Kentucky for the years of 1996 through 2000. A primary objective of this study was to determine average crash statistics for Kentucky highways. Average and critical numbers and rates of crash...

  13. Analysis of traffic crash data in Kentucky : 2000-2004.

    DOT National Transportation Integrated Search

    2005-08-01

    This report includes an analysis of traffic accident data in Kentucky for the years of 2000 through 2004. A primary objective of this study was to determine average crash statistics for Kentucky highways. Average and critical numbers and rates of cra...

  14. Analysis of traffic crash data in Kentucky : 1998-2002.

    DOT National Transportation Integrated Search

    2003-09-01

    This report documents an analysis of traffic accident data in Kentucky for the years of 1998 through 2002. A primary objective of this study was to determine average crash statistics for Kentucky highways. Average and critical numbers and rates of cr...

  15. Analysis of traffic crash data in Kentucky : 2001-2005.

    DOT National Transportation Integrated Search

    2006-08-01

    This report includes an analysis of traffic accident data in Kentucky for the years of 2001 through 2005. A primary objective of this study was to determine average crash statistics for Kentucky highways. Average and critical rates of crashes were ca...

  16. Analysis of traffic crash data in Kentucky : 2005-2009.

    DOT National Transportation Integrated Search

    2010-08-01

    This report documents an analysis of traffic crash data in Kentucky for the years of 2005 through 2009. A primary objective of this study was to determine average crash statistics for Kentucky highways. Average and critical numbers and rates of cras...

  17. Analysis of traffic crash data in Kentucky : 2004-2008.

    DOT National Transportation Integrated Search

    2009-09-01

    The report documents an analysis of traffic crash data in Kentucky for the years of 2004 through 2008. A primary objective of this study was to determine average crash statistics for Kentucky highways. Average and critical numbers and rates of crashe...

  18. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that the PFEM is a very powerful tool in determining the second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties and crack length, orientation, and location.
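
    The probability-of-fracture idea mentioned above can be illustrated with a crude Monte Carlo sketch (not the PFEM of the record) that treats load, crack length and toughness as random and applies the through-crack relation K = sigma * sqrt(pi * a); all distributions and numbers below are made up for illustration.

```python
# Illustrative Monte Carlo estimate of fracture probability (not the PFEM of the paper):
# failure when the stress intensity K = sigma * sqrt(pi * a) exceeds the toughness K_Ic.
# Every distribution and numerical value here is a made-up placeholder.
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
sigma = rng.normal(200.0, 30.0, n)          # applied stress, MPa
a = rng.lognormal(np.log(0.02), 0.25, n)    # crack length, m
k_ic = rng.normal(60.0, 6.0, n)             # fracture toughness, MPa*sqrt(m)

k = sigma * np.sqrt(np.pi * a)              # stress-intensity factor for a through crack
p_fail = np.mean(k > k_ic)

print(f"estimated probability of fracture: {p_fail:.4f}")
print(f"second-moment statistics of K: mean={k.mean():.1f}, std={k.std():.1f}")
```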

  19. Inverse statistics and information content

    NASA Astrophysics Data System (ADS)

    Ebadi, H.; Bolgorian, Meysam; Jafari, G. R.

    2010-12-01

    Inverse statistics analysis studies the distribution of investment horizons needed to achieve a predefined level of return. The maximum of this distribution determines the most likely horizon for gaining a specific return. There is a significant difference between the inverse statistics of financial market data and those of a fractional Brownian motion (fBm), taken as an uncorrelated time series, which provides a suitable criterion for measuring the information content of financial data. In this paper we perform this analysis for the DJIA and S&P500 as two developed markets and the Tehran price index (TEPIX) as an emerging market. We also compare these probability distributions with the fBm probability distribution, to detect when the behavior of the stocks is the same as that of fBm.
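
    A sketch of the inverse-statistics computation described above follows: for each starting day, the waiting time until the log-return first reaches a target level rho, whose most probable value is the optimal investment horizon. It is applied here to a synthetic random walk rather than the DJIA, S&P500 or TEPIX series used in the record.

```python
# Sketch of "inverse statistics": for each starting day, the waiting time until the
# log-return first reaches a target level rho. Applied to a synthetic random walk.
import numpy as np

def inverse_statistics(log_price, rho):
    """Waiting times tau_i = min{t : log_price[i+t] - log_price[i] >= rho}."""
    waits = []
    n = len(log_price)
    for i in range(n - 1):
        gains = log_price[i + 1:] - log_price[i]
        hits = np.nonzero(gains >= rho)[0]
        if hits.size:
            waits.append(hits[0] + 1)
    return np.array(waits)

rng = np.random.default_rng(7)
log_price = np.cumsum(rng.normal(0.0005, 0.01, 5000))  # synthetic daily log-prices

waits = inverse_statistics(log_price, rho=0.05)
# The mode of this distribution is the "most likely horizon" discussed above.
values, counts = np.unique(waits, return_counts=True)
print("most likely horizon (days):", values[np.argmax(counts)])
```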

  20. A statistical approach to deriving subsystem specifications. [for spacecraft shock and vibrational environment tests

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1974-01-01

    In order to produce cost-effective environmental test programs, test specifications must be realistic and, to be useful, available early in the life of a program. This paper describes a method for achieving such specifications for subsystems by utilizing the results of a statistical analysis of data acquired at subsystem mounting locations during system-level environmental tests. The paper describes the details of this statistical analysis. The resultant recommended levels are a function of the subsystem's mounting location in the spacecraft. Methods of determining this mounting 'zone' are described. Recommendations are then made as to which of the various problem areas encountered should be pursued further.

  1. Differentiation of commercial vermiculite based on statistical analysis of bulk chemical data: Fingerprinting vermiculite from Libby, Montana U.S.A

    USGS Publications Warehouse

    Gunter, M.E.; Singleton, E.; Bandli, B.R.; Lowers, H.A.; Meeker, G.P.

    2005-01-01

    Major-, minor-, and trace-element compositions, as determined by X-ray fluorescence (XRF) analysis, were obtained on 34 samples of vermiculite to ascertain whether chemical differences exist to the extent of determining the source of commercial products. The sample set included ores from four deposits, seven commercially available garden products, and insulation from four attics. The trace-element distributions of Ba, Cr, and V can be used to distinguish the Libby vermiculite samples from the garden products. In general, the overall composition of the Libby and South Carolina deposits appeared similar, but differed from the South Africa and China deposits based on simple statistical methods. Cluster analysis provided a good distinction of the four ore types, grouped the four attic samples with the Libby ore, and, with less certainty, grouped the garden samples with the South Africa ore.

  2. Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity

    NASA Astrophysics Data System (ADS)

    Tanaka, Hiroki; Aizawa, Yoji

    2017-02-01

    The interoccurrence time statistics of seismicity is studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes in different magnitude levels. It is known so far that the interoccurrence time statistics is well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from the analysis of the conditional probability. Firstly, we propose the Embedding Equation Theory (EET), where the conditional probability is described by two kinds of correlation coefficients; one is the magnitude correlation and the other is the inter-event time correlation. Furthermore, the scaling law of each correlation coefficient is clearly determined from the numerical data analysis carried out with the Preliminary Determination of Epicenter (PDE) Catalog and the Japan Meteorological Agency (JMA) Catalog. Secondly, the EET is examined to derive the magnitude dependence of the interoccurrence time statistics and the multi-fractal relation is successfully formulated. Theoretically we cannot prove the universality of the multi-fractal relation in seismic activity; nevertheless, the theoretical results well reproduce all numerical data in our analysis, where several common features or invariant aspects are clearly observed. Especially in the case of stationary ensembles the multi-fractal relation seems to obey an invariant curve; furthermore, in the case of non-stationary (moving time) ensembles for the aftershock regime the multi-fractal relation seems to satisfy a certain invariant curve at any moving time. It is emphasized that the multi-fractal relation plays an important role in unifying the statistical laws of seismicity: actually the Gutenberg-Richter law and the Weibull distribution are unified in the multi-fractal relation, and some universality conjectures regarding the seismicity are briefly discussed.

  3. Response of SiC{sub f}/Si{sub 3}N{sub 4} composites under static and cyclic loading -- An experimental and statistical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahfuz, H.; Maniruzzaman, M.; Vaidya, U.

    1997-04-01

    Monotonic tensile and fatigue response of continuous silicon carbide fiber reinforced silicon nitride (SiC_f/Si_3N_4) composites has been investigated. The monotonic tensile tests have been performed at room and elevated temperatures. Fatigue tests have been conducted at room temperature (RT), at a stress ratio R = 0.1 and a frequency of 5 Hz. It is observed during the monotonic tests that the composite retains only 30% of its room temperature strength at 1,600 C, suggesting a substantial chemical degradation of the matrix at that temperature. The softening of the matrix at elevated temperature also causes a reduction in tensile modulus, and the total reduction in modulus is around 45%. Fatigue data have been generated at three load levels and the fatigue strength of the composite has been found to be considerably high, about 75% of its ultimate room temperature strength. Extensive statistical analysis has been performed to understand the degree of scatter in the fatigue as well as in the static test data. Weibull shape factors and characteristic values have been determined for each set of tests and their relationship with the response of the composites has been discussed. A statistical fatigue life prediction method developed from the Weibull distribution is also presented. A Maximum Likelihood Estimator with censoring techniques and data pooling schemes has been employed to determine the distribution parameters for the statistical analysis. These parameters have been used to generate the S-N diagram with the desired level of reliability. Details of the statistical analysis and the discussion of the static and fatigue behavior of the composites are presented in this paper.

  4. Phosphorylated neurofilament heavy: A potential blood biomarker to evaluate the severity of acute spinal cord injuries in adults

    PubMed Central

    Singh, Ajai; Kumar, Vineet; Ali, Sabir; Mahdi, Abbas Ali; Srivastava, Rajeshwer Nath

    2017-01-01

    Aims: The aim of this study is to analyze the serial estimation of phosphorylated neurofilament heavy (pNF-H) in blood plasma that would act as a potential biomarker for early prediction of the neurological severity of acute spinal cord injuries (SCI) in adults. Settings and Design: Pilot study/observational study. Subjects and Methods: A total of 40 patients (28 cases and 12 controls) of spine injury were included in this study. In the enrolled cases, plasma level of pNF-H was evaluated in blood samples and neurological evaluation was performed by the American Spinal Injury Association Injury Scale at specified period. Serial plasma neurofilament heavy values were then correlated with the neurological status of these patients during follow-up visits and were analyzed statistically. Statistical Analysis Used: Statistical analysis was performed using GraphPad InStat software (version 3.05 for Windows, San Diego, CA, USA). The correlation analysis between the clinical progression and pNF-H expression was done using Spearman's correlation. Results: The mean baseline level of pNF-H in cases was 6.40 ± 2.49 ng/ml, whereas in controls it was 0.54 ± 0.27 ng/ml. On analyzing the association between the two by Mann–Whitney U–test, the difference in levels was found to be statistically significant. The association between the neurological progression and pNF-H expression was determined using correlation analysis (Spearman's correlation). At 95% confidence interval, the correlation coefficient was found to be 0.64, and the correlation was statistically significant. Conclusions: Plasma pNF-H levels were elevated in accordance with the severity of SCI. Therefore, pNF-H may be considered as a potential biomarker to determine early the severity of SCI in adult patients. PMID:29291173

  5. Coordinate based random effect size meta-analysis of neuroimaging studies.

    PubMed

    Tench, C R; Tanasescu, Radu; Constantinescu, C S; Auer, D P; Cottam, W J

    2017-06-01

    Low power in neuroimaging studies can make them difficult to interpret, and Coordinate based meta-analysis (CBMA) may go some way to mitigating this issue. CBMA has been used in many analyses to detect where published functional MRI or voxel-based morphometry studies testing similar hypotheses report significant summary results (coordinates) consistently. Only the reported coordinates and possibly t statistics are analysed, and statistical significance of clusters is determined by coordinate density. Here a method of performing coordinate based random effect size meta-analysis and meta-regression is introduced. The algorithm (ClusterZ) analyses both coordinates and reported t statistic or Z score, standardised by the number of subjects. Statistical significance is determined not by coordinate density, but by a random effects meta-analyses of reported effects performed cluster-wise using standard statistical methods and taking account of censoring inherent in the published summary results. Type 1 error control is achieved using the false cluster discovery rate (FCDR), which is based on the false discovery rate. This controls both the family wise error rate under the null hypothesis that coordinates are randomly drawn from a standard stereotaxic space, and the proportion of significant clusters that are expected under the null. Such control is necessary to avoid propagating and even amplifying the very issues motivating the meta-analysis in the first place. ClusterZ is demonstrated on both numerically simulated data and on real data from reports of grey matter loss in multiple sclerosis (MS) and syndromes suggestive of MS, and of painful stimulus in healthy controls. The software implementation is available to download and use freely. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.

    Precise analysis of (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occurs during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open source "avconv" utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video including the frame quality, intra-texture and predicted texture bits, and forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on the statistic(s) for each data type.
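
    A rough sketch of the log-mining step described above is given below. The key:value layout of the avconv first-pass rate-control log (field names such as itex, ptex and mv) is an assumption that may differ between libav/ffmpeg versions, and the encode command in the comment is only a hypothetical example; adapt both to the files your encoder actually produces.

```python
# Rough sketch: parse per-frame statistics from a libav/avconv two-pass rate-control log
# and flag frames whose intra-texture bits jump sharply (a crude "event" detector).
# The "key:value" layout below is an assumption about the pass-1 log format.
import re
import numpy as np

def parse_passlog(path):
    """Return one dict of per-frame statistics per log line."""
    frames = []
    pattern = re.compile(r"([A-Za-z-]+):([-0-9.]+)")
    with open(path) as fh:
        for line in fh:
            stats = {k: float(v) for k, v in pattern.findall(line)}
            if stats:
                frames.append(stats)
    return frames

def flag_events(frames, key="itex", z_thresh=3.0):
    """Indices of frames where the chosen statistic changes abruptly."""
    x = np.array([f.get(key, 0.0) for f in frames])
    dx = np.diff(x)
    z = (dx - dx.mean()) / (dx.std() + 1e-12)
    return np.nonzero(np.abs(z) > z_thresh)[0] + 1

# Hypothetical usage, after a first-pass encode such as:
#   avconv -i frames_%04d.png -c:v mpeg2video -pass 1 -passlogfile stats -f null /dev/null
# events = flag_events(parse_passlog("stats-0.log"))
```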

  7. Statistical analysis of polarization-inhomogeneous Fourier spectra of laser radiation scattered by human skin in the tasks of differentiation of benign and malignant formations

    NASA Astrophysics Data System (ADS)

    Ushenko, Alexander G.; Dubolazov, Alexander V.; Ushenko, Vladimir A.; Novakovskaya, Olga Y.

    2016-07-01

    The optical model of formation of polarization structure of laser radiation scattered by polycrystalline networks of human skin in Fourier plane was elaborated. The results of investigation of the values of statistical (statistical moments of the 1st to 4th order) parameters of polarization-inhomogeneous images of skin surface in Fourier plane were presented. The diagnostic criteria of pathological process in human skin and its severity degree differentiation were determined.

  8. Statistical analysis of field data for aircraft warranties

    NASA Astrophysics Data System (ADS)

    Lakey, Mary J.

    Air Force and Navy maintenance data collection systems were researched to determine their scientific applicability to the warranty process. New and unique algorithms were developed to extract failure distributions which were then used to characterize how selected families of equipment typically fail. Families of similar equipment were identified in terms of function, technology and failure patterns. Statistical analyses and applications such as goodness-of-fit tests, maximum likelihood estimation and derivation of confidence intervals for the probability density function parameters were applied to characterize the distributions and their failure patterns. Statistical and reliability theory, with relevance to equipment design and operational failures, were also determining factors in characterizing the failure patterns of the equipment families. Inferences about the families with relevance to warranty needs were then made.
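
    The distribution-fitting step named above (maximum-likelihood estimation with a goodness-of-fit check) can be sketched as follows for a Weibull failure model; the failure times, shape and scale values are illustrative assumptions, not fleet data.

```python
# Sketch: characterise a family's failure pattern by fitting a Weibull distribution to
# times-to-failure via maximum likelihood, then check the fit. Synthetic data only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
failure_hours = stats.weibull_min.rvs(c=1.8, scale=1200.0, size=400, random_state=rng)

# MLE fit; location fixed at zero as is usual for lifetime data.
shape, loc, scale = stats.weibull_min.fit(failure_hours, floc=0)

# Goodness of fit (Kolmogorov-Smirnov against the fitted distribution).
ks = stats.kstest(failure_hours, "weibull_min", args=(shape, loc, scale))

print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f} h, KS p-value = {ks.pvalue:.2f}")
# shape > 1 suggests wear-out behaviour, shape < 1 infant mortality, ~1 a constant hazard.
```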

  9. Mars Pathfinder Near-Field Rock Distribution Re-Evaluation

    NASA Technical Reports Server (NTRS)

    Haldemann, A. F. C.; Golombek, M. P.

    2003-01-01

    We have completed analysis of a new near-field rock count at the Mars Pathfinder landing site and determined that the previously published rock count suggesting 16% cumulative fractional area (CFA) covered by rocks is incorrect. The earlier value is not so much wrong (our new CFA is 20%), as right for the wrong reason: both the old and the new CFA's are consistent with remote sensing data, however the earlier determination incorrectly calculated rock coverage using apparent width rather than average diameter. Here we present details of the new rock database and the new statistics, as well as the importance of using rock average diameter for rock population statistics. The changes to the near-field data do not affect the far-field rock statistics.

  10. Agriculture, population growth, and statistical analysis of the radiocarbon record.

    PubMed

    Zahid, H Jabran; Robinson, Erick; Kelly, Robert L

    2016-01-26

    The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide.

  11. The Ontology of Biological and Clinical Statistics (OBCS) for standardized and reproducible statistical analysis.

    PubMed

    Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun

    2016-09-14

    Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS, including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. Currently, OBCS comprises 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS-specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high-throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs . The Ontology of Biological and Clinical Statistics (OBCS) is a community-based open source ontology in the domain of biological and clinical statistics. OBCS is a timely ontology that represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.

  12. Effectiveness of Quantitative Real Time PCR in Long-Term Follow-up of Chronic Myeloid Leukemia Patients.

    PubMed

    Savasoglu, Kaan; Payzin, Kadriye Bahriye; Ozdemirkiran, Fusun; Berber, Belgin

    2015-08-01

    To determine the usefulness of the Quantitative Real Time PCR (RQ-PCR) assay in the follow-up of Chronic Myeloid Leukemia (CML) patients. Cross-sectional observational study. Izmir Ataturk Education and Research Hospital, Izmir, Turkey, from 2009 to 2013. Cytogenetic, FISH and RQ-PCR test results from the materials of 177 CML patients, selected between 2009 and 2013, were set up for comparison analysis. Statistical analysis was performed to compare the FISH, karyotype and RQ-PCR results of the patients. Karyotyping and FISH specificity and sensitivity rates were determined by ROC analysis against the RQ-PCR results. The chi-square test was used to compare test failure rates. Sensitivity and specificity values for karyotyping were 17.6% and 98% (p=0.118, p > 0.05), and for FISH 22.5% and 96% (p=0.064, p > 0.05), respectively. FISH sensitivity was slightly higher than that of karyotyping, and a strong correlation was calculated between them (p < 0.001). The RQ-PCR test failure rate did not correlate with those of the other two tests (p > 0.05); however, the correlation between the karyotyping and FISH test failure rates was statistically significant (p < 0.001). Apart from situations in which karyotype analysis is needed, the RQ-PCR assay can be used alone in the follow-up of CML disease.

  13. SOME CHARACTERISTICS OF THE ORAL CAVITY AND TEETH OF COSMONAUTS ON MISSIONS TO THE INTERNATIONAL SPACE STATION.

    PubMed

    Ilyin, V K; Shumilina, G A; Solovieva, Z O; Nosovsky, A M; Kaminskaya, E V

    Earlier studies were furthered by examination of the parodontium anaerobic microbiota and investigation of gingival liquid immunological factors in space flight. Immunoglobulins were measured using the enzyme immunoassay. The qualitative content of key parodontium pathogens is determined with state-of-the-art molecular biology technologies such as the polymerase chain reaction. Statistical data processing was performed using principal component analysis and ensuing standard statistical analysis. Thereupon, recommendations on cosmonauts' oral and dental hygiene during space missions were developed.

  14. Climatological Study to Determine the Impact of Icing on the Low Level Windshear Alert System. Volume I. Analysis.

    DOT National Transportation Integrated Search

    1989-09-01

    The climatological study was performed to determine the impact of icing on the performance of Low Level Windshear Alert System (LLWAS). : This report presents the icing statistical profile in the form of data tables and histograms of 106 LLWAS sites....

  15. 10 CFR 431.197 - Manufacturer's determination of efficiency for distribution transformers.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... methods used; the mathematical model, the engineering or statistical analysis, computer simulation or... (b)(3) of this section, or by application of an alternative efficiency determination method (AEDM... section only if: (i) The AEDM has been derived from a mathematical model that represents the electrical...

  16. Determinants of Student Wastage in Higher Education.

    ERIC Educational Resources Information Center

    Johnes, Jill

    1990-01-01

    Statistical analysis of a sample of the 1979 entry cohort to Lancaster University indicates that the likelihood of non-completion is determined by various characteristics including the student's academic ability, gender, marital status, work experience prior to university, school background, and location of home in relation to university.…

  17. Use of statistical analysis to validate ecogenotoxicology findings arising from various comet assay components.

    PubMed

    Hussain, Bilal; Sultana, Tayyaba; Sultana, Salma; Al-Ghanim, Khalid Abdullah; Masoud, Muhammad Shahreef; Mahboob, Shahid

    2018-04-01

    Cirrhinus mrigala, Labeo rohita, and Catla catla are economically important fish for human consumption in Pakistan, but industrial and sewage pollution has drastically reduced their population in the River Chenab. Statistics are an important tool to analyze and interpret comet assay results. The specific aims of the study were to determine the DNA damage in Cirrhinus mrigala, Labeo rohita, and Catla catla due to chemical pollution and to assess the validity of statistical analyses to determine the viability of the comet assay for a possible use with these freshwater fish species as a good indicator of pollution load and habitat degradation. Comet assay results indicated a significant (P < 0.05) degree of DNA fragmentation in Cirrhinus mrigala followed by Labeo rohita and Catla catla in respect to comet head diameter, comet tail length, and % DNA damage. Regression analysis and correlation matrices conducted among the parameters of the comet assay affirmed the precision and the legitimacy of the results. The present study, therefore, strongly recommends that genotoxicological studies conduct appropriate analysis of the various components of comet assays to offer better interpretation of the assay data.

  18. Graph theory applied to noise and vibration control in statistical energy analysis models.

    PubMed

    Guasch, Oriol; Cortés, Lluís

    2009-06-01

    A fundamental aspect of noise and vibration control in statistical energy analysis (SEA) models consists in first identifying and then reducing the energy flow paths between subsystems. In this work, it is proposed to make use of some results from graph theory to address both issues. On the one hand, linear and path algebras applied to adjacency matrices of SEA graphs are used to determine the existence of paths of any order between subsystems, counting and labeling them, finding extremal paths, or determining the power flow contributions from groups of paths. On the other hand, a strategy is presented that makes use of graph cut algorithms to reduce the energy flow from a source subsystem to a receiver one, modifying as few internal and coupling loss factors as possible.
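
    The path-existence and path-counting idea described above has a compact linear-algebra form: entry (i, j) of the k-th power of the adjacency matrix counts the walks of order k from subsystem i to subsystem j. A minimal sketch on a hypothetical five-subsystem SEA graph follows.

```python
# Sketch of the path-counting idea: entry (i, j) of A**k, for an adjacency matrix A,
# counts the walks of length k from subsystem i to subsystem j in the SEA graph.
# The 5-subsystem graph below is hypothetical.
import numpy as np

A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 0, 1],
    [0, 1, 0, 0, 1],
    [0, 0, 1, 1, 0],
])

source, receiver = 0, 4
for k in range(1, 5):
    n_walks = np.linalg.matrix_power(A, k)[source, receiver]
    print(f"walks of order {k} from subsystem {source} to {receiver}: {n_walks}")
```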

  19. An operational definition of a statistically meaningful trend.

    PubMed

    Bryhn, Andreas C; Dimberg, Peter H

    2011-04-28

    Linear trend analysis of time series is standard procedure in many scientific disciplines. If the number of data is large, a trend may be statistically significant even if data are scattered far from the trend line. This study introduces and tests a quality criterion for time trends referred to as statistical meaningfulness, which is a stricter quality criterion for trends than high statistical significance. The time series is divided into intervals and interval mean values are calculated. Thereafter, r(2) and p values are calculated from regressions concerning time and interval mean values. If r(2) ≥ 0.65 at p ≤ 0.05 in any of these regressions, then the trend is regarded as statistically meaningful. Out of ten investigated time series from different scientific disciplines, five displayed statistically meaningful trends. A Microsoft Excel application (add-in) was developed which can perform statistical meaningfulness tests and which may increase the operationality of the test. The presented method for distinguishing statistically meaningful trends should be reasonably uncomplicated for researchers with basic statistics skills and may thus be useful for determining which trends are worth analysing further, for instance with respect to causal factors. The method can also be used for determining which segments of a time trend may be particularly worthwhile to focus on.
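
    The criterion described above is straightforward to implement: split the series into intervals, average within each, regress the interval means on time, and require r² ≥ 0.65 at p ≤ 0.05. A minimal sketch (not the authors' Excel add-in) follows, using synthetic data.

```python
# Sketch of the "statistically meaningful trend" criterion: regress interval means on
# interval mid-times and require r^2 >= 0.65 with p <= 0.05. Not the authors' add-in.
import numpy as np
from scipy import stats

def statistically_meaningful(t, y, n_intervals):
    edges = np.linspace(t.min(), t.max(), n_intervals + 1)
    mids, means = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (t >= lo) & (t <= hi)
        if mask.any():
            mids.append(t[mask].mean())
            means.append(y[mask].mean())
    res = stats.linregress(mids, means)
    return res.rvalue ** 2 >= 0.65 and res.pvalue <= 0.05, res

rng = np.random.default_rng(5)
t = np.arange(240, dtype=float)               # e.g. monthly observations
y = 0.02 * t + rng.normal(0, 1.5, t.size)     # weak trend buried in scatter

ok, res = statistically_meaningful(t, y, n_intervals=8)
print(f"r^2 = {res.rvalue**2:.2f}, p = {res.pvalue:.3g}, meaningful: {ok}")
```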

  20. SMAR Sessions

    NASA Technical Reports Server (NTRS)

    Ploutz-Snyder, Robert

    2011-01-01

    This slide presentation comprises a series of educational presentations on the statistical technique of analysis of variance (ANOVA). Analysis of variance examines variability between groups, relative to variability within groups, to determine whether there is evidence that the groups are not from the same population. One other presentation reviews hypothesis testing.

  1. Statistical process control methods allow the analysis and improvement of anesthesia care.

    PubMed

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
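
    The p-chart analysis described above amounts to comparing each period's adverse-event proportion with control limits of p-bar +/- 3*sqrt(p-bar*(1-p-bar)/n). A minimal sketch with made-up monthly counts follows; the numbers are not the hospital's data.

```python
# Sketch of a p-chart for a monthly adverse-event rate: the centre line is the pooled
# proportion and the limits are p_bar +/- 3*sqrt(p_bar*(1-p_bar)/n_i). Data are made up.
import numpy as np

events = np.array([31, 28, 40, 25, 33, 58, 30, 27, 29, 34, 26, 30])            # events per month
cases  = np.array([900, 870, 950, 820, 910, 940, 880, 860, 905, 930, 845, 890]) # anesthetics per month

p = events / cases
p_bar = events.sum() / cases.sum()
sigma = np.sqrt(p_bar * (1 - p_bar) / cases)
ucl, lcl = p_bar + 3 * sigma, np.clip(p_bar - 3 * sigma, 0, None)

for month, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
    flag = "OUT OF CONTROL" if (pi > hi or pi < lo) else ""
    print(f"month {month:2d}: p = {pi:.3f}  limits = [{lo:.3f}, {hi:.3f}] {flag}")
```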

  2. Accuracy of metric sex analysis of skeletal remains using Fordisc based on a recent skull collection.

    PubMed

    Ramsthaler, F; Kreutz, K; Verhoff, M A

    2007-11-01

    It has been generally accepted in skeletal sex determination that the use of metric methods is limited due to the population dependence of the multivariate algorithms. The aim of the study was to verify the applicability of software-based sex estimations outside the reference population group for which discriminant equations have been developed. We examined 98 skulls from recent forensic cases of known age, sex, and Caucasian ancestry from cranium collections in Frankfurt and Mainz (Germany) to determine the accuracy of sex determination using the statistical software solution Fordisc, which derives its database and functions from the US American Forensic Database. In a comparison between metric analysis using Fordisc and morphological determination of sex, average accuracy for both sexes was 86% vs. 94%, respectively, and males were identified more accurately than females. The ratio of the true test result rate to the false test result rate was not statistically different for the two methodological approaches at a significance level of 0.05 but was statistically different at a level of 0.10 (p=0.06). Possible explanations for this difference comprise different ancestry, age distribution, and socio-economic status compared to the Fordisc reference sample. It is likely that a discriminant function analysis on the basis of more similar European reference samples will lead to more valid and reliable sexing results. The use of Fordisc as a single method for the estimation of sex of recent skeletal remains in Europe cannot be recommended without additional morphological assessment and without a built-in software update based on modern European reference samples.

  3. The application of artificial intelligence to microarray data: identification of a novel gene signature to identify bladder cancer progression.

    PubMed

    Catto, James W F; Abbod, Maysam F; Wild, Peter J; Linkens, Derek A; Pilarsky, Christian; Rehman, Ishtiaq; Rosario, Derek J; Denzinger, Stefan; Burger, Maximilian; Stoehr, Robert; Knuechel, Ruth; Hartmann, Arndt; Hamdy, Freddie C

    2010-03-01

    New methods for identifying bladder cancer (BCa) progression are required. Gene expression microarrays can reveal insights into disease biology and identify novel biomarkers. However, these experiments produce large datasets that are difficult to interpret. To develop a novel method of microarray analysis combining two forms of artificial intelligence (AI): neurofuzzy modelling (NFM) and artificial neural networks (ANN) and validate it in a BCa cohort. We used AI and statistical analyses to identify progression-related genes in a microarray dataset (n=66 tumours, n=2800 genes). The AI-selected genes were then investigated in a second cohort (n=262 tumours) using immunohistochemistry. We compared the accuracy of AI and statistical approaches to identify tumour progression. AI identified 11 progression-associated genes (odds ratio [OR]: 0.70; 95% confidence interval [CI], 0.56-0.87; p=0.0004), and these were more discriminate than genes chosen using statistical analyses (OR: 1.24; 95% CI, 0.96-1.60; p=0.09). The expression of six AI-selected genes (LIG3, FAS, KRT18, ICAM1, DSG2, and BRCA2) was determined using commercial antibodies and successfully identified tumour progression (concordance index: 0.66; log-rank test: p=0.01). AI-selected genes were more discriminate than pathologic criteria at determining progression (Cox multivariate analysis: p=0.01). Limitations include the use of statistical correlation to identify 200 genes for AI analysis and that we did not compare regression identified genes with immunohistochemistry. AI and statistical analyses use different techniques of inference to determine gene-phenotype associations and identify distinct prognostic gene signatures that are equally valid. We have identified a prognostic gene signature whose members reflect a variety of carcinogenic pathways that could identify progression in non-muscle-invasive BCa. 2009 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  4. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis samples results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).

  5. STATISTICAL ANALYSIS OF TANK 19F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis samples results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by a UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
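
    The uncertainty quantification described in these tank reports reduces to a one-sided upper 95% confidence limit on the mean analyte concentration, computed from the number of samples, their average and their standard deviation. A minimal sketch follows; the concentrations are hypothetical, not Tank 18F or 19F results.

```python
# Minimal sketch of the UCL95% calculation described above: a one-sided upper 95%
# confidence limit on the mean analyte concentration from n sample results.
# The concentrations below are hypothetical placeholders.
import numpy as np
from scipy import stats

def ucl95(x):
    x = np.asarray(x, dtype=float)
    n = x.size
    t = stats.t.ppf(0.95, df=n - 1)               # one-sided 95% Student-t quantile
    return x.mean() + t * x.std(ddof=1) / np.sqrt(n)

scrape_results = [4.1, 3.8, 4.6, 4.0, 4.3, 3.9]   # e.g. mg analyte per g residue
print(f"mean = {np.mean(scrape_results):.2f}, UCL95% = {ucl95(scrape_results):.2f}")
```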

  6. Determination of the pion-nucleon coupling constant and scattering lengths

    NASA Astrophysics Data System (ADS)

    Ericson, T. E.; Loiseau, B.; Thomas, A. W.

    2002-07-01

    We critically evaluate the isovector Goldberger-Miyazawa-Oehme (GMO) sum rule for forward πN scattering using the recent precision measurements of π⁻p and π⁻d scattering lengths from pionic atoms. We deduce the charged-pion-nucleon coupling constant, with careful attention to systematic and statistical uncertainties. This determination gives, directly from data, g_c^2(GMO)/4π = 14.11 ± 0.05 (statistical) ± 0.19 (systematic), or f_c^2/4π = 0.0783(11). This value is intermediate between that of indirect methods and the direct determination from backward np differential scattering cross sections. We also use the pionic atom data to deduce the coherent symmetric and antisymmetric sums of the pion-proton and pion-neutron scattering lengths with high precision, namely (a_{π⁻p} + a_{π⁻n})/2 = [-12 ± 2 (statistical) ± 8 (systematic)] × 10^-4 m_π^-1 and (a_{π⁻p} - a_{π⁻n})/2 = [895 ± 3 (statistical) ± 13 (systematic)] × 10^-4 m_π^-1. For the needs of the present analysis, we improve the theoretical description of the pion-deuteron scattering length.

  7. A perceptual space of local image statistics.

    PubMed

    Victor, Jonathan D; Thengone, Daniel J; Rizvi, Syed M; Conte, Mary M

    2015-12-01

    Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice - a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space, and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. A perceptual space of local image statistics

    PubMed Central

    Victor, Jonathan D.; Thengone, Daniel J.; Rizvi, Syed M.; Conte, Mary M.

    2015-01-01

    Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice - a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space, and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. PMID:26130606

  9. Statistical analysis of polarization interference images of biological fluids polycrystalline films in the tasks of optical anisotropy weak changes differentiation

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu. O.; Dubolazov, O. V.; Ushenko, V. O.; Zhytaryuk, V. G.; Prydiy, O. G.; Pavlyukovich, N.; Pavlyukovich, O.

    2018-01-01

    In this paper, we present the results of a statistical analysis of polarization-interference images of optically thin histological sections of biological tissues and polycrystalline films of biological fluids of human organs. A new analytical parameter is introduced: the local contrast of the interference pattern in the plane of a polarization-inhomogeneous microscopic image of a biological preparation. The coordinate distributions of the given parameter and the sets of statistical moments of the first to fourth order that characterize these distributions are determined. On this basis, the differentiation of degenerative-dystrophic changes in the myocardium and of the polycrystalline structure of the synovial fluid of the human knee with different pathologies is achieved.
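
    The differentiation described above rests on the first- to fourth-order statistical moments of the coordinate distribution of the local-contrast parameter. A small sketch of how such moments are commonly computed from a 2-D map, using random placeholder data rather than the study's microscopic images:

    import numpy as np
    from scipy import stats

    def first_four_moments(values):
        """Mean, variance, skewness, and kurtosis of a distribution of values
        (e.g. a flattened map of local interference contrast)."""
        v = np.asarray(values, dtype=float).ravel()
        return {
            "M1_mean": v.mean(),
            "M2_variance": v.var(ddof=0),
            "M3_skewness": stats.skew(v),
            "M4_kurtosis": stats.kurtosis(v),  # excess kurtosis (0 for a normal)
        }

    rng = np.random.default_rng(0)
    contrast_map = rng.gamma(shape=2.0, scale=0.1, size=(256, 256))  # placeholder map
    print(first_four_moments(contrast_map))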

  10. Statistical trends of episiotomy around the world: Comparative systematic review of changing practices.

    PubMed

    Clesse, Christophe; Lighezzolo-Alnot, Joëlle; De Lavergne, Sylvie; Hamlin, Sandrine; Scheffler, Michèle

    2018-06-01

    The authors' purpose for this article is to identify, review and interpret all publications about episiotomy rates worldwide. Based on the criteria from the PRISMA guidelines, twenty databases were scrutinized. All studies that include national statistics related to episiotomy were selected, as well as studies presenting estimated data. Sixty-one papers were selected with publication dates between 1995 and 2016. A static and dynamic analysis of all the results was carried out. The assumption of a decline in the number of episiotomies is discussed and confirmed, recalling that high rates of episiotomy nowadays remain in less industrialized countries and East Asia. Finally, our analysis aims to investigate the potential determinants that influence the apparent statistical disparities.

  11. Statistical modeling of space shuttle environmental data

    NASA Technical Reports Server (NTRS)

    Tubbs, J. D.; Brewer, D. W.

    1983-01-01

    Statistical models that use a class of bivariate gamma distributions are examined. Topics discussed include: (1) the ratio of positively correlated gamma variates; (2) a method to determine if unequal shape parameters are necessary in the bivariate gamma distribution; (3) differential equations for the modal location of a family of bivariate gamma distributions; and (4) analysis of some wind gust data using the analytical results developed for modeling applications.

  12. GUIDANCE FOR STATISTICAL DETERMINATION OF APPROPRIATE PERCENT MINORITY AND PERCENT POVERTY DISTRIBUTIONAL CUTOFF VALUES USING CENSUS DATA FOR AN EPA REGION II ENVIRONMENTAL JUSTICE PROJECT

    EPA Science Inventory

    The purpose of this report is to assist Region II by providing a statistical analysis identifying the areas with minority and below-poverty populations known as "Community of Concern" (COC). The aim was to find a cutoff value as a threshold to identify a COC using demographic data...

  13. The benefit of silicone stents in primary endonasal dacryocystorhinostomy: a systematic review and meta-analysis.

    PubMed

    Sarode, D; Bari, D A; Cain, A C; Syed, M I; Williams, A T

    2017-04-01

    To critically evaluate the evidence comparing success rates of endonasal dacryocystorhinostomy (EN-DCR) with and without silicone tubing and to thus determine whether silicone intubation is beneficial in primary EN-DCR. Systematic review and meta-analysis. A literature search was performed on AMED, EMBASE, HMIC, MEDLINE, PsycINFO, BNI, CINAHL, HEALTH BUSINESS ELITE, CENTRAL and the Cochrane Ear, Nose and Throat Disorders Group trials register using a combination of various MeSH terms. The date of last search was January 2016. This review was limited to randomised controlled trials (RCTs) in the English language. Risk of bias was assessed using the Cochrane Collaboration's risk of bias tool. Chi-square and I² statistics were calculated to determine the presence and extent of statistical heterogeneity. Study selection, data extraction and risk of bias scoring were performed independently by two authors in concordance with the PRISMA statement. Five RCTs (447 primary EN-DCR procedures in 426 patients) were included for analysis. Moderate interstudy statistical heterogeneity was demonstrated (Chi² = 6.18; d.f. = 4; I² = 35%). Bicanalicular silicone stents were used in 229 and not used in 218 procedures. The overall success rate of EN-DCR was 92.8% (415/447). The success rate of EN-DCR was 93.4% (214/229) with silicone tubing and 92.2% (201/218) without silicone tubing. Meta-analysis using a random-effects model showed no statistically significant difference in outcomes between the two groups (P = 0.63; RR = 0.79; 95% CI = 0.3-2.06). Our review and meta-analysis did not demonstrate an additional advantage of silicone stenting. A high-quality well-powered prospective multicentre RCT is needed to further clarify the benefit of silicone stents. © 2016 John Wiley & Sons Ltd.
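
    For reference, the heterogeneity figure quoted above follows directly from Cochran's Q via the standard Higgins formula I² = max(0, (Q - d.f.)/Q) x 100, which can be checked in a couple of lines:

    def i_squared(q, df):
        """Higgins I^2 statistic (%) from Cochran's Q and its degrees of freedom."""
        return max(0.0, (q - df) / q) * 100.0

    # Values reported in the abstract: Chi^2 = 6.18 with d.f. = 4
    print(f"I^2 = {i_squared(6.18, 4):.0f}%")   # about 35%, matching the reported value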

  14. An Interinstitutional Analysis of Faculty Teaching Load.

    ERIC Educational Resources Information Center

    Ahrens, Stephen W.

    A two-year interinstitutional study among 15 cooperating universities was conducted to determine whether significant differences exist in teaching loads among the selected universities as measured by student credit hours produced by full-time equivalent faculty. The statistical model was a multivariate analysis of variance with fixed effects and…

  15. BOOTSTRAPPING AND MONTE CARLO METHODS OF POWER ANALYSIS USED TO ESTABLISH CONDITION CATEGORIES FOR BIOTIC INDICES

    EPA Science Inventory

    Biotic indices have been used to assess biological condition by dividing index scores into condition categories. Historically the number of categories has been based on professional judgement. Alternatively, statistical methods such as power analysis can be used to determine the ...

  16. Determining Sample Sizes for Precise Contrast Analysis with Heterogeneous Variances

    ERIC Educational Resources Information Center

    Jan, Show-Li; Shieh, Gwowen

    2014-01-01

    The analysis of variance (ANOVA) is one of the most frequently used statistical analyses in practical applications. Accordingly, the single and multiple comparison procedures are frequently applied to assess the differences among mean effects. However, the underlying assumption of homogeneous variances may not always be tenable. This study…

  17. Alaska national hydrography dataset positional accuracy assessment study

    USGS Publications Warehouse

    Arundel, Samantha; Yamamoto, Kristina H.; Constance, Eric; Mantey, Kim; Vinyard-Houx, Jeremy

    2013-01-01

    Initial visual assessments showed a wide range in the quality of fit between features in the NHD and these new image sources. No statistical analysis has been performed to actually quantify accuracy. Determining absolute accuracy is cost prohibitive (independent, well-defined test points must be collected), but quantitative analysis of relative positional error is feasible.

  18. Detecting subtle hydrochemical anomalies with multivariate statistics: an example from homogeneous groundwaters in the Great Artesian Basin, Australia

    NASA Astrophysics Data System (ADS)

    O'Shea, Bethany; Jankowski, Jerzy

    2006-12-01

    The major ion composition of Great Artesian Basin groundwater in the lower Namoi River valley is relatively homogeneous. Traditional graphical techniques have been combined with multivariate statistical methods to determine whether subtle differences in the chemical composition of these waters can be delineated. Hierarchical cluster analysis and principal components analysis were successful in delineating minor variations within the groundwaters of the study area that were not visually identified in the graphical techniques applied. Hydrochemical interpretation allowed geochemical processes to be identified in each statistically defined water type and illustrated how these groundwaters differ from one another. Three main geochemical processes were identified in the groundwaters: ion exchange, precipitation, and mixing between waters from different sources. Both statistical methods delineated an anomalous sample suspected of being influenced by magmatic CO2 input. The use of statistical methods to complement traditional graphical techniques for waters appearing homogeneous is emphasized for all investigations of this type.
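
    For readers who want to reproduce this style of analysis, a minimal sketch of hierarchical cluster analysis and principal components analysis applied to a table of major-ion concentrations follows; the data are synthetic placeholders, and the original study's samples, variables, and pre-processing choices are not reproduced:

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.stats import zscore

    rng = np.random.default_rng(1)
    # Placeholder matrix: 30 groundwater samples x 6 major ions (mg/L)
    ions = rng.lognormal(mean=3.0, sigma=0.3, size=(30, 6))

    X = zscore(ions, axis=0)                  # standardise so no single ion dominates

    # Hierarchical cluster analysis (Ward linkage), cut into 3 water types
    Z = linkage(X, method="ward")
    water_type = fcluster(Z, t=3, criterion="maxclust")

    # Principal components analysis via SVD of the centred, standardised matrix
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    scores = Xc @ Vt.T                        # sample scores on the principal components

    print("cluster sizes:", np.bincount(water_type)[1:])
    print("variance explained by PC1, PC2:", explained[:2].round(2))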

  19. Information-dependent enrichment analysis reveals time-dependent transcriptional regulation of the estrogen pathway of toxicity.

    PubMed

    Pendse, Salil N; Maertens, Alexandra; Rosenberg, Michael; Roy, Dipanwita; Fasani, Rick A; Vantangoli, Marguerite M; Madnick, Samantha J; Boekelheide, Kim; Fornace, Albert J; Odwin, Shelly-Ann; Yager, James D; Hartung, Thomas; Andersen, Melvin E; McMullen, Patrick D

    2017-04-01

    The twenty-first century vision for toxicology involves a transition away from high-dose animal studies to in vitro and computational models (NRC in Toxicity testing in the 21st century: a vision and a strategy, The National Academies Press, Washington, DC, 2007). This transition requires mapping pathways of toxicity by understanding how in vitro systems respond to chemical perturbation. Uncovering transcription factors/signaling networks responsible for gene expression patterns is essential for defining pathways of toxicity, and ultimately, for determining the chemical modes of action through which a toxicant acts. Traditionally, transcription factor identification is achieved via chromatin immunoprecipitation studies and summarized by calculating which transcription factors are statistically associated with up- and downregulated genes. These lists are commonly determined via statistical or fold-change cutoffs, a procedure that is sensitive to statistical power and may not be as useful for determining transcription factor associations. To move away from an arbitrary statistical or fold-change-based cutoff, we developed, in the context of the Mapping the Human Toxome project, an enrichment paradigm called information-dependent enrichment analysis (IDEA) to guide identification of the transcription factor network. We used a test case of activation in MCF-7 cells by 17β estradiol (E2). Using this new approach, we established a time course for transcriptional and functional responses to E2. ERα and ERβ were associated with short-term transcriptional changes in response to E2. Sustained exposure led to recruitment of additional transcription factors and alteration of cell cycle machinery. TFAP2C and SOX2 were the transcription factors most highly correlated with dose. E2F7, E2F1, and Foxm1, which are involved in cell proliferation, were enriched only at 24 h. IDEA should be useful for identifying candidate pathways of toxicity. IDEA outperforms gene set enrichment analysis (GSEA) and provides similar results to weighted gene correlation network analysis, a platform that helps to identify genes not annotated to pathways.

  20. A statistical physics perspective on alignment-independent protein sequence comparison.

    PubMed

    Chattopadhyay, Amit K; Nasiev, Diar; Flower, Darren R

    2015-08-01

    Within bioinformatics, the textual alignment of amino acid sequences has long dominated the determination of similarity between proteins, with all that implies for shared structure, function and evolutionary descent. Despite the relative success of modern-day sequence alignment algorithms, so-called alignment-free approaches offer a complementary means of determining and expressing similarity, with potential benefits in certain key applications, such as regression analysis of protein structure-function studies, where alignment-based similarity has performed poorly. Here, we offer a fresh, statistical physics-based perspective focusing on the question of alignment-free comparison, in the process adapting results from 'first passage probability distribution' to summarize statistics of ensemble averaged amino acid propensity values. In this article, we introduce and elaborate this approach. © The Author 2015. Published by Oxford University Press.

  1. Descriptive statistics: the specification of statistical measures and their presentation in tables and graphs. Part 7 of a series on evaluation of scientific publications.

    PubMed

    Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria

    2009-09-01

    Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.

  2. 10 CFR 431.445 - Determination of small electric motor efficiency.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... determined either by testing in accordance with § 431.444 of this subpart, or by application of an... method. An AEDM applied to a basic model must be: (i) Derived from a mathematical model that represents... statistical analysis, computer simulation or modeling, or other analytic evaluation of performance data. (3...

  3. Application of factor analysis of infrared spectra for quantitative determination of beta-tricalcium phosphate in calcium hydroxylapatite.

    PubMed

    Arsenyev, P A; Trezvov, V V; Saratovskaya, N V

    1997-01-01

    This work presents a method that allows the phase composition of calcium hydroxylapatite to be determined from its infrared spectrum. The method uses factor analysis of the spectral data of a calibration set of samples to determine the minimal number of factors required to reproduce the spectra within experimental error. Multiple linear regression is applied to establish a correlation between the factor scores of the calibration standards and their properties. The regression equations can then be used to predict the property value of an unknown sample. The regression model was built for determination of the beta-tricalcium phosphate content in hydroxylapatite. A statistical assessment of the quality of the model was carried out. Applying factor analysis to the spectral data increases the accuracy of beta-tricalcium phosphate determination and extends the range of determination towards lower concentrations. Reproducibility of results is retained.
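
    The workflow described (factor decomposition of calibration spectra followed by multiple linear regression of the factor scores against the known property) is essentially principal component regression. A compact sketch under the assumption of synthetic calibration spectra, not the authors' data or instrument settings:

    import numpy as np

    rng = np.random.default_rng(2)
    # Synthetic calibration set: 20 spectra x 200 wavenumbers with known beta-TCP %
    true_content = rng.uniform(0, 15, size=20)             # % beta-TCP
    basis = rng.normal(size=(2, 200))                       # two spectral factors
    spectra = np.outer(true_content, basis[0]) + np.outer(rng.normal(size=20), basis[1])
    spectra += 0.01 * rng.normal(size=spectra.shape)        # experimental noise

    # Factor analysis step: keep the minimal number of principal factors
    mean_spec = spectra.mean(axis=0)
    U, s, Vt = np.linalg.svd(spectra - mean_spec, full_matrices=False)
    n_factors = 2
    scores = U[:, :n_factors] * s[:n_factors]               # factor scores

    # Multiple linear regression of the property on the factor scores
    A = np.column_stack([np.ones(len(scores)), scores])
    coef, *_ = np.linalg.lstsq(A, true_content, rcond=None)

    def predict(unknown_spectrum):
        sc = (unknown_spectrum - mean_spec) @ Vt[:n_factors].T
        return coef[0] + sc @ coef[1:]

    print("predicted vs true:", predict(spectra[0]).round(2), true_content[0].round(2))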

  4. An application of cluster analysis for determining homogeneous subregions: The agroclimatological point of view. [Rio Grande do Sul, Brazil

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Cappelletti, C. A.

    1982-01-01

    A stratification oriented to crop area and yield estimation problems was performed using a clustering algorithm. The variables used were a set of agroclimatological characteristics measured in each of the 232 municipalities of the State of Rio Grande do Sul, Brazil. A nonhierarchical cluster analysis was used, and the pseudo F-statistic criterion was implemented for determining the "cut point" in the number of strata.
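
    The pseudo F-statistic referred to here is the Calinski-Harabasz criterion, the ratio of between-cluster to within-cluster dispersion, evaluated over candidate numbers of clusters to choose the "cut point". A sketch on placeholder data (the original municipality variables are not reproduced), assuming scikit-learn is available:

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import calinski_harabasz_score

    rng = np.random.default_rng(3)
    # Placeholder: 232 municipalities x 5 standardised agroclimatic variables
    X = rng.normal(size=(232, 5))

    best_k, best_f = None, -np.inf
    for k in range(2, 11):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        f = calinski_harabasz_score(X, labels)     # the pseudo F-statistic
        if f > best_f:
            best_k, best_f = k, f
        print(f"k = {k:2d}  pseudo-F = {f:8.1f}")

    print("suggested number of strata (max pseudo-F):", best_k)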

  5. Mathematical problem solving ability of sport students in the statistical study

    NASA Astrophysics Data System (ADS)

    Sari, E. F. P.; Zulkardi; Putri, R. I. I.

    2017-12-01

    This study aims to determine the problem-solving ability of fifth-semester sport students of PGRI Palembang in the statistics course. The subjects were 31 fifth-semester sport students of PGRI Palembang. The research method used is a quasi-experiment of the one-shot case study type. Data were collected with a test, and the analysis used quantitative descriptive statistics. The study concludes that the mathematical problem-solving ability of fifth-semester PGRI Palembang sport students in the statistics course is categorized as good, with an average final test score of 80.3.

  6. Procedures for determination of detection limits: application to high-performance liquid chromatography analysis of fat-soluble vitamins in human serum.

    PubMed

    Browne, Richard W; Whitcomb, Brian W

    2010-07-01

    Problems in the analysis of laboratory data commonly arise in epidemiologic studies in which biomarkers subject to lower detection thresholds are used. Various thresholds exist, including limit of detection (LOD), limit of quantification (LOQ), and limit of blank (LOB). Choosing appropriate strategies for dealing with data affected by such limits relies on proper understanding of the nature of the detection limit and its determination. In this paper, we demonstrate experimental and statistical procedures generally used for estimating different detection limits according to standard procedures in the context of analysis of fat-soluble vitamins and micronutrients in human serum. Fat-soluble vitamins and micronutrients were analyzed by high-performance liquid chromatography with diode array detection. A simulated serum matrix blank was repeatedly analyzed for determination of the LOB parametrically, by using the observed blank distribution, as well as nonparametrically, by using ranks. The LOD was determined by combining information regarding the LOB with data from repeated analysis of standard reference materials (SRMs) diluted to low levels (from the LOB to 2-3 times the LOB). The LOQ was determined experimentally by plotting the observed relative standard deviation (RSD) of SRM replicates against concentration, where the LOQ is the concentration at which the RSD is 20%. Experimental approaches and example statistical procedures are given for determination of LOB, LOD, and LOQ. These quantities are reported for each measured analyte. For many analyses, there is considerable information available below the LOQ. Epidemiologic studies must understand the nature of these detection limits and how they have been estimated for appropriate treatment of affected data.
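
    A compact sketch of the standard calculations the abstract describes (a parametric LOB, an LOD that adds low-level sample variation, and an RSD-based LOQ), using made-up replicate values rather than the study's chromatographic measurements:

    import numpy as np

    def lob_parametric(blank_values, z=1.645):
        """Limit of blank: mean of blank replicates + 1.645 * SD (95th percentile)."""
        b = np.asarray(blank_values, float)
        return b.mean() + z * b.std(ddof=1)

    def lod(blank_values, low_level_values, z=1.645):
        """Limit of detection: LOB + 1.645 * SD of a low-concentration sample."""
        s_low = np.asarray(low_level_values, float).std(ddof=1)
        return lob_parametric(blank_values, z) + z * s_low

    def loq_from_rsd(concentrations, rsd_percent, target_rsd=20.0):
        """Limit of quantification: lowest tested concentration whose observed
        relative standard deviation is at or below the target (20% here)."""
        for c, rsd in sorted(zip(concentrations, rsd_percent)):
            if rsd <= target_rsd:
                return c
        return None

    blanks = [0.02, 0.01, 0.03, 0.02, 0.015, 0.025]          # placeholder replicates
    low = [0.09, 0.12, 0.10, 0.08, 0.11]
    print("LOB =", round(lob_parametric(blanks), 3))
    print("LOD =", round(lod(blanks, low), 3))
    print("LOQ =", loq_from_rsd([0.05, 0.1, 0.2, 0.5], [45, 28, 18, 9]))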

  7. Analysis of video-recorded images to determine linear and angular dimensions in the growing horse.

    PubMed

    Hunt, W F; Thomas, V G; Stiefel, W

    1999-09-01

    Studies of growth and conformation require statistical methods that are not applicable to subjective conformation standards used by breeders and trainers. A new system was developed to provide an objective approach for both science and industry, based on analysis of video images to measure aspects of conformation that were represented by angles or lengths. A studio crush was developed in which video images of horses of different sizes were taken after bone protuberances, located by palpation, were marked with white paper stickers. Screen pixel coordinates of calibration marks, bone markers and points on horse outlines were digitised from captured images and corrected for aspect ratio and 'fish-eye' lens effects. Calculations from the corrected coordinates produced linear dimensions and angular dimensions useful for comparison of horses for conformation and experimental purposes. The precision achieved by the method in determining linear and angular dimensions was examined through systematically determining variance for isolated steps of the procedure. Angles of the front limbs viewed from in front were determined with a standard deviation of 2-5 degrees and effects of viewing angle were detectable statistically. The height of the rump and wither were determined with precision closely related to the limitations encountered in locating a point on a screen, which was greater for markers applied to the skin than for points at the edge of the image. Parameters determined from markers applied to the skin were, however, more variable (because their relation to bone position was affected by movement), but still provided a means by which a number of aspects of size and conformation can be determined objectively for many horses during growth. Sufficient precision was achieved to detect statistically relatively small effects on calculated parameters of camera height position.
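
    Once marker coordinates have been digitised and corrected, a limb or joint angle reduces to simple vector geometry. A small sketch of the angle-at-a-joint calculation from three 2-D points; the coordinates are hypothetical and assume lens and aspect-ratio corrections have already been applied:

    import numpy as np

    def joint_angle(p_proximal, p_joint, p_distal):
        """Angle (degrees) at p_joint formed by the two limb segments."""
        a = np.asarray(p_proximal, float) - np.asarray(p_joint, float)
        b = np.asarray(p_distal, float) - np.asarray(p_joint, float)
        cos_t = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

    # Hypothetical corrected marker positions (cm) along a front limb
    print(round(joint_angle((10.0, 95.0), (12.0, 60.0), (11.0, 20.0)), 1))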

  8. Reporting quality of statistical methods in surgical observational studies: protocol for systematic review.

    PubMed

    Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume

    2014-06-28

    Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be lower in surgical journals when compared with medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting. A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.

  9. A benchmark for statistical microarray data analysis that preserves actual biological and technical variance.

    PubMed

    De Hertogh, Benoît; De Meulder, Bertrand; Berger, Fabrice; Pierre, Michael; Bareke, Eric; Gaigneaux, Anthoula; Depiereux, Eric

    2010-01-11

    Recent reanalysis of spike-in datasets underscored the need for new and more accurate benchmark datasets for statistical microarray analysis. We present here a fresh method using biologically-relevant data to evaluate the performance of statistical methods. Our novel method ranks the probesets from a dataset composed of publicly-available biological microarray data and extracts subset matrices with precise information/noise ratios. Our method can be used to determine the capability of different methods to better estimate variance for a given number of replicates. The mean-variance and mean-fold change relationships of the matrices revealed a closer approximation of biological reality. Performance analysis refined the results from benchmarks published previously. We show that the Shrinkage t test (close to Limma) was the best of the methods tested, except when two replicates were examined, where the Regularized t test and the Window t test performed slightly better. The R scripts used for the analysis are available at http://urbm-cluster.urbm.fundp.ac.be/~bdemeulder/.

  10. Gain optimization with non-linear controls

    NASA Technical Reports Server (NTRS)

    Slater, G. L.; Kandadai, R. D.

    1984-01-01

    An algorithm has been developed for the analysis and design of controls for non-linear systems. The technical approach is to use statistical linearization to model the non-linear dynamics of a system by a quasi-Gaussian model. A covariance analysis is performed to determine the behavior of the dynamical system and a quadratic cost function. Expressions for the cost function and its derivatives are determined so that numerical optimization techniques can be applied to determine optimal feedback laws. The primary application for this paper is centered on the design of controls for nominally linear systems but where the controls are saturated or limited by fixed constraints. The analysis is general, however, and numerical computation requires only that the specific non-linearity be considered in the analysis.

  11. Characterizing chaotic melodies in automatic music composition

    NASA Astrophysics Data System (ADS)

    Coca, Andrés E.; Tost, Gerard O.; Zhao, Liang

    2010-09-01

    In this paper, we initially present an algorithm for automatic composition of melodies using chaotic dynamical systems. Afterward, we characterize chaotic music in a comprehensive way as comprising three perspectives: musical discrimination, dynamical influence on musical features, and musical perception. With respect to the first perspective, the coherence between generated chaotic melodies (continuous as well as discrete chaotic melodies) and a set of classical reference melodies is characterized by statistical descriptors and melodic measures. The significant differences among the three types of melodies are determined by discriminant analysis. Regarding the second perspective, the influence of dynamical features of chaotic attractors, e.g., Lyapunov exponent, Hurst coefficient, and correlation dimension, on melodic features is determined by canonical correlation analysis. The last perspective is related to perception of originality, complexity, and degree of melodiousness (Euler's gradus suavitatis) of chaotic and classical melodies by nonparametric statistical tests.

  12. Development of a Probabilistic Component Mode Synthesis Method for the Analysis of Non-Deterministic Substructures

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1995-01-01

    Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine probabilities of a desired response characteristic, such as natural frequency, without using expensive Monte Carlo simulations. One of the problems in these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. This paper presents a method where the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented that illustrates the theory.

  13. Sexual Harassment Retaliation Climate DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    exploratory factor analysis, and bivariate correlations (sample 1); 2) To determine the factor structure of the remaining (final) questions via...statistics, reliability analysis, exploratory factor analysis, and bivariate correlations of the prospective Sexual Harassment Retaliation Climate...reported by the survey requester). For information regarding the composition of the sample, refer to Table 1 (Sample 1 Demographics) in the source report.

  14. Linear and Nonlinear Analysis of Brain Dynamics in Children with Cerebral Palsy

    ERIC Educational Resources Information Center

    Sajedi, Firoozeh; Ahmadlou, Mehran; Vameghi, Roshanak; Gharib, Masoud; Hemmati, Sahel

    2013-01-01

    This study was carried out to determine linear and nonlinear changes of brain dynamics and their relationships with the motor dysfunctions in CP children. For this purpose power of EEG frequency bands (as a linear analysis) and EEG fractality (as a nonlinear analysis) were computed in eyes-closed resting state and statistically compared between 26…

  15. Toward Reflective Judgment in Exploratory Factor Analysis Decisions: Determining the Extraction Method and Number of Factors To Retain.

    ERIC Educational Resources Information Center

    Knight, Jennifer L.

    This paper considers some decisions that must be made by the researcher conducting an exploratory factor analysis. The primary purpose is to aid the researcher in making informed decisions during the factor analysis instead of relying on defaults in statistical programs or traditions of previous researchers. Three decision areas are addressed.…

  16. Statistics and quality assurance for the Northern Research Station Forest Inventory and Analysis Program, 2016

    Treesearch

    Dale D. Gormanson; Scott A. Pugh; Charles J. Barnett; Patrick D. Miles; Randall S. Morin; Paul A. Sowers; Jim Westfall

    2017-01-01

    The U.S. Forest Service Forest Inventory and Analysis (FIA) program collects sample plot data on all forest ownerships across the United States. FIA's primary objective is to determine the extent, condition, volume, growth, and use of trees on the Nation's forest land through a comprehensive inventory and analysis of the Nation's forest resources. The...

  17. Comparison between Growth Patterns and Pharyngeal Widths in Different Skeletal Malocclusions in South Indian Population.

    PubMed

    Lakshmi, K Bhagya; Yelchuru, Sri Harsha; Chandrika, V; Lakshmikar, O G; Sagar, V Lakshmi; Reddy, G Vivek

    2018-01-01

    The main aim was to determine whether growth pattern has an effect on the upper airway, by comparing pharyngeal widths across different craniofacial patterns, and to assess its importance during clinical examination. Sixty lateral cephalograms of patients aged between 16 and 24 years with no pharyngeal pathology or nasal obstruction were selected for the study. These were divided into skeletal Class I (n = 30) and skeletal Class II (n = 30) using the ANB angle and subdivided into normodivergent, hyperdivergent, and hypodivergent facial patterns based on the SN-GoGn angle. McNamara's airway analysis was used to determine the upper- and lower-airway dimensions. One-way ANOVA was used for the intergroup comparisons, with Tukey's test as the secondary statistical analysis. A statistically significant difference exists in the upper-airway dimensions between the hyperdivergent growth pattern and the other growth patterns in both skeletal malocclusions. In both skeletal malocclusions, vertical growers showed a significantly smaller airway size than the horizontal and normal growers. There was no statistically significant association between the lower airway and craniofacial growth pattern.
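
    A minimal sketch of the analysis pipeline named in the abstract (one-way ANOVA followed by Tukey's test), using placeholder airway measurements rather than the study's cephalometric data, and assuming scipy and statsmodels are available:

    import numpy as np
    from scipy import stats
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    rng = np.random.default_rng(4)
    # Placeholder upper-airway widths (mm) for three growth-pattern groups
    normo = rng.normal(17.0, 2.0, 20)
    hyper = rng.normal(14.5, 2.0, 20)
    hypo = rng.normal(17.5, 2.0, 20)

    f_stat, p_value = stats.f_oneway(normo, hyper, hypo)
    print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

    values = np.concatenate([normo, hyper, hypo])
    groups = ["normo"] * 20 + ["hyper"] * 20 + ["hypo"] * 20
    print(pairwise_tukeyhsd(values, groups, alpha=0.05))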

  18. [Influence of demographic and socioeconomic characteristics on the quality of life].

    PubMed

    Grbić, Gordana; Djokić, Dragoljub; Kocić, Sanja; Mitrašinović, Dejan; Rakić, Ljiljana; Prelević, Rade; Krivokapić, Žarko; Miljković, Snežana

    2011-01-01

    The quality of life is a multidimensional concept that is best expressed by subjective well-being. Evaluation of the quality of life is the basis for measuring well-being, and the determination of the factors that determine the quality of life is the basis for its improvement. The aim was to evaluate the perceived quality of life and to assess its determinants among groups distinguished by demographic and socioeconomic characteristics. This was a cross-sectional study of a representative sample of the population in Serbia aged over 20 years (9479 examinees). The quality of life was expressed by the perception of well-being (pleasure in life). Data on the examinees (demographic and socioeconomic characteristics) were collected using a questionnaire for the adults of each household. To process, analyze and present the data, we used the methods of parametric descriptive statistics (mean value, standard deviation, coefficient of variation), variance analysis and factor analysis. Although men evaluated the quality of life with a slightly higher grading, there was no statistically significant difference in the evaluation of the quality of life in relation to the examinee's gender (p > 0.005). Among the examinees there was a highly statistically significant difference in grading the quality of life depending on age, level of education, marital status and type of job (p < 0.001). In relation to the number of children, there was no statistically significant difference in the grading of the quality of life (p > 0.005). The quality of life is influenced by numerous factors that characterize each person (the demographic and socioeconomic characteristics of the individual). The determining factors of the quality of life are numerous and diverse, and the manner and the strength of their influence are variable.

  19. Statistical inference of dynamic resting-state functional connectivity using hierarchical observation modeling.

    PubMed

    Sojoudi, Alireza; Goodyear, Bradley G

    2016-12-01

    Spontaneous fluctuations of blood-oxygenation level-dependent functional magnetic resonance imaging (BOLD fMRI) signals are highly synchronous between brain regions that serve similar functions. This provides a means to investigate functional networks; however, most analysis techniques assume functional connections are constant over time. This may be problematic in the case of neurological disease, where functional connections may be highly variable. Recently, several methods have been proposed to determine moment-to-moment changes in the strength of functional connections over an imaging session (so-called dynamic connectivity). Here, a novel analysis framework based on a hierarchical observation modeling approach was proposed to permit statistical inference of the presence of dynamic connectivity. A two-level linear model composed of overlapping sliding windows of fMRI signals, incorporating the fact that overlapping windows are not independent, was described. To test this approach, datasets were synthesized whereby functional connectivity was either constant (significant or insignificant) or modulated by an external input. The method successfully determines the statistical significance of a functional connection in phase with the modulation, and it exhibits greater sensitivity and specificity in detecting regions with variable connectivity, when compared with sliding-window correlation analysis. For real data, this technique possesses greater reproducibility and provides a more discriminative estimate of dynamic connectivity than sliding-window correlation analysis. Hum Brain Mapp 37:4566-4580, 2016. © 2016 Wiley Periodicals, Inc.
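
    For comparison, the sliding-window correlation baseline that the hierarchical approach is benchmarked against can be sketched in a few lines; the signals and window length below are synthetic and arbitrary:

    import numpy as np

    def sliding_window_correlation(x, y, window, step=1):
        """Pearson correlation between two BOLD time series in overlapping windows."""
        r = []
        for start in range(0, len(x) - window + 1, step):
            xw, yw = x[start:start + window], y[start:start + window]
            r.append(np.corrcoef(xw, yw)[0, 1])
        return np.array(r)

    rng = np.random.default_rng(5)
    t = np.arange(300)                               # 300 volumes
    modulation = (np.sin(2 * np.pi * t / 150) > 0)   # connectivity switches on/off
    x = rng.normal(size=300)
    y = 0.8 * x * modulation + rng.normal(size=300)  # coupled only half the time

    r_t = sliding_window_correlation(x, y, window=30, step=1)
    print("windows:", len(r_t), " r range:", r_t.min().round(2), "to", r_t.max().round(2))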

  20. Statistical process control: separating signal from noise in emergency department operations.

    PubMed

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
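
    A minimal sketch of the individuals/moving-range (XmR) control-chart calculation that underlies the chart type recommended above, applied to a made-up series of daily ED metrics:

    import numpy as np

    def xmr_limits(values):
        """Control limits for an individuals (X) chart using the average moving range.
        The 2.66 constant converts the mean moving range into 3-sigma limits."""
        x = np.asarray(values, float)
        moving_range = np.abs(np.diff(x))
        mr_bar = moving_range.mean()
        center = x.mean()
        return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

    daily_metric = [42, 38, 45, 40, 41, 39, 44, 43, 37, 58, 41, 40]  # placeholder data
    lcl, center, ucl = xmr_limits(daily_metric)
    signals = [v for v in daily_metric if v > ucl or v < lcl]  # special-cause points
    print(f"LCL = {lcl:.1f}, center = {center:.1f}, UCL = {ucl:.1f}, signals = {signals}")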

  1. Color quality of pigments in cochineals (Dactylopius coccus Costa). Geographical origin characterization using multivariate statistical analysis.

    PubMed

    Méndez, Jesús; González, Mónica; Lobo, M Gloria; Carnero, Aurelio

    2004-03-10

    The commercial value of a cochineal (Dactylopius coccus Costa) sample is associated with its color quality. Because the cochineal is a legal food colorant, its color quality is generally understood as its pigment content. Simply put, the higher this content, the more valuable the sample is to the market. In an effort to devise a way to measure the color quality of a cochineal, the present study evaluates different parameters of color measurement such as chromatic attributes (L* and a*), percentage of carminic acid, tint determination, and the chromatographic profile of pigments. Tint determination did not achieve this objective because this parameter does not correlate with carminic acid content. On the other hand, carminic acid showed a highly significant correlation (r = -0.922, p = 0.000) with L* values determined from powdered cochineal samples. The combination of the information from the spectrophotometric determination of carminic acid with that of the pigment profile acquired by liquid chromatography (LC) and the composition of the red and yellow pigment groups, also acquired by LC, enables greater accuracy in judging the quality of the final sample. As a result of this study, it was possible to achieve the separation of cochineal samples according to geographical origin using two statistical techniques: cluster analysis and principal component analysis.

  2. [Character of refractive errors in population study performed by the Area Military Medical Commission in Lodz].

    PubMed

    Nowak, Michał S; Goś, Roman; Smigielski, Janusz

    2008-01-01

    To determine the prevalence of refractive errors in the population. A retrospective review of medical examinations for entry to the military service from The Area Military Medical Commission in Lodz. Ophthalmic examinations were performed. We used statistical analysis to review the results. Statistical analysis revealed that refractive errors occurred in 21.68% of the population. The most common refractive error was myopia. 1) The most common ocular diseases are refractive errors, especially myopia (21.68% in total). 2) Refractive surgery and contact lenses should be allowed as the possible correction of refractive errors for military service.

  3. An Analysis Methodology for the Gamma-ray Large Area Space Telescope

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.; Cohen-Tanugi, Johann

    2004-01-01

    The Large Area Telescope (LAT) instrument on the Gamma Ray Large Area Space Telescope (GLAST) has been designed to detect high-energy gamma rays and determine their direction of incidence and energy. We propose a reconstruction algorithm based on recent advances in statistical methodology. This method, an alternative to the standard event analysis inherited from high-energy collider physics experiments, incorporates more accurately the physical processes occurring in the detector, and makes full use of the statistical information available. It could thus provide a better estimate of the direction and energy of the primary photon.

  4. Statistical analysis of experimental data for mathematical modeling of physical processes in the atmosphere

    NASA Astrophysics Data System (ADS)

    Karpushin, P. A.; Popov, Yu B.; Popova, A. I.; Popova, K. Yu; Krasnenko, N. P.; Lavrinenko, A. V.

    2017-11-01

    In this paper, the probabilities of faultless operation of aerologic stations are analyzed, the hypothesis of normality of the empirical data required for using the Kalman filter algorithms is tested, and the spatial correlation functions of distributions of meteorological parameters are determined. The results of a statistical analysis of two-term (0, 12 GMT) radiosonde observations of the temperature and wind velocity components at some preset altitude ranges in the troposphere in 2001-2016 are presented. These data can be used in mathematical modeling of physical processes in the atmosphere.

  5. Statistical evaluation of the metallurgical test data in the ORR-PSF-PVS irradiation experiment. [PWR; BWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stallmann, F.W.

    1984-08-01

    A statistical analysis of Charpy test results of the two-year Pressure Vessel Simulation metallurgical irradiation experiment was performed. Transition temperatures and upper-shelf energies derived from computer fits compare well with eyeball fits. Uncertainties for all results can be obtained with computer fits. The results were compared with predictions in Regulatory Guide 1.99 and other irradiation damage models.

  6. Adhesive properties and adhesive joints strength of graphite/epoxy composites

    NASA Astrophysics Data System (ADS)

    Rudawska, Anna; Stančeková, Dana; Cubonova, Nadezda; Vitenko, Tetiana; Müller, Miroslav; Valášek, Petr

    2017-05-01

    The article presents the results of experimental research on the adhesive joint strength of graphite/epoxy composites and on the surface free energy of the composite surfaces. Two types of graphite/epoxy composites of different thickness, used in aircraft structures, were tested. Single-lap adhesive joints of the epoxy composites were considered. Adhesive properties were described by the surface free energy, and the Owens-Wendt method was used to determine it. A two-component epoxy adhesive was used to prepare the adhesive joints. A Zwick/Roell 100 testing machine was used to determine the shear strength of the adhesive joints of the epoxy composites. The strength test results showed that the highest value was obtained for adhesive joints of the graphite/epoxy composite of smaller material thickness (0.48 mm). Statistical analysis of the results showed statistically significant differences between the strength values at the 0.95 confidence level. The statistical analysis also showed that there are no statistically significant differences in the average values of surface free energy (0.95 confidence level). It was noted that in each of the results the dispersion component of the surface free energy was much greater than the polar component.

  7. Analysis of the Einstein sample of early-type galaxies

    NASA Technical Reports Server (NTRS)

    Eskridge, Paul B.; Fabbiano, Giuseppina

    1993-01-01

    The EINSTEIN galaxy catalog contains x-ray data for 148 early-type (E and SO) galaxies. A detailed analysis of the global properties of this sample is presented. By comparing the x-ray properties with other tracers of the ISM, as well as with observables related to the stellar dynamics and populations of the sample, we expect to determine more clearly the physical relationships that determine the evolution of early-type galaxies. Previous studies with smaller samples have explored the relationships between x-ray luminosity (L(sub X)) and luminosities in other bands. Using our larger sample and the statistical techniques of survival analysis, a number of these earlier analyses were repeated. For our full sample, a strong statistical correlation is found between L(sub X) and L(sub B) (the probability that the null hypothesis is upheld is P less than 10(exp -4) from a variety of rank correlation tests). Regressions with several algorithms yield consistent results.

  8. Multielement analysis of Canadian wines by inductively coupled plasma mass spectrometry (ICP-MS) and multivariate statistics.

    PubMed

    Taylor, Vivien F; Longerich, Henry P; Greenough, John D

    2003-02-12

    Trace element fingerprints were deciphered for wines from Canada's two major wine-producing regions, the Okanagan Valley and the Niagara Peninsula, for the purpose of examining differences in wine element composition with region of origin and identifying elements important to determining provenance. Analysis by ICP-MS allowed simultaneous determination of 34 trace elements in wine (Li, Be, Mg, Al, P, Cl, Ca, Ti, V, Mn, Fe, Co, Ni, Cu, Zn, As, Se, Br, Rb, Sr, Mo, Ag, Cd, Sb, I, Cs, Ba, La, Ce, Tl, Pb, Bi, Th, and U) at low levels of detection, and patterns in trace element concentrations were deciphered by multivariate statistical analysis. The two regions were discriminated with 100% accuracy using 10 of these elements. Differences in soil chemistry between the Niagara and Okanagan vineyards were evident, without a good correlation between soil and wine composition. The element Sr was found to be a good indicator of provenance and has been reported in fingerprinting studies of other regions.

  9. Fisher statistics for analysis of diffusion tensor directional information.

    PubMed

    Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P

    2012-04-30

    A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and associated p-value giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups, however in the hilus, significant (p<0.0005) differences were found that robustly confirmed observations that were suggested by visual inspection of directionally encoded color DTI maps. The Fisher approach is a potentially useful analysis tool that may extend the current capabilities of DTI investigation by providing a means of statistical comparison of tissue structural orientation. Copyright © 2012 Elsevier B.V. All rights reserved.
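
    A sketch of the core Fisher-statistics quantities (the mean direction, resultant length R, and the resultant-based concentration estimate kappa) for a set of unit direction vectors, such as principal eigenvectors pooled over an ROI; the vectors below are random placeholders, not the study's DTI data:

    import numpy as np

    def fisher_mean_direction(unit_vectors):
        """Mean direction, resultant length R, and concentration kappa
        for N unit vectors on the sphere (Fisher distribution estimates)."""
        V = np.asarray(unit_vectors, float)
        resultant = V.sum(axis=0)
        R = np.linalg.norm(resultant)
        N = len(V)
        mean_dir = resultant / R
        kappa = (N - 1) / (N - R)       # standard large-concentration approximation
        return mean_dir, R, kappa

    rng = np.random.default_rng(6)
    # Placeholder: 25 directions scattered around the z-axis
    dirs = np.array([0.0, 0.0, 1.0]) + 0.15 * rng.normal(size=(25, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

    mean_dir, R, kappa = fisher_mean_direction(dirs)
    print("mean direction:", mean_dir.round(3), " R =", round(R, 2), " kappa =", round(kappa, 1))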

  10. Heritability of and mortality prediction with a longevity phenotype: the healthy aging index.

    PubMed

    Sanders, Jason L; Minster, Ryan L; Barmada, M Michael; Matteini, Amy M; Boudreau, Robert M; Christensen, Kaare; Mayeux, Richard; Borecki, Ingrid B; Zhang, Qunyuan; Perls, Thomas; Newman, Anne B

    2014-04-01

    Longevity-associated genes may modulate risk for age-related diseases and survival. The Healthy Aging Index (HAI) may be a subphenotype of longevity, which can be constructed in many studies for genetic analysis. We investigated the HAI's association with survival in the Cardiovascular Health Study and heritability in the Long Life Family Study. The HAI includes systolic blood pressure, pulmonary vital capacity, creatinine, fasting glucose, and Modified Mini-Mental Status Examination score, each scored 0, 1, or 2 using approximate tertiles and summed from 0 (healthy) to 10 (unhealthy). In Cardiovascular Health Study, the association with mortality and accuracy predicting death were determined with Cox proportional hazards analysis and c-statistics, respectively. In Long Life Family Study, heritability was determined with a variance component-based family analysis using a polygenic model. Cardiovascular Health Study participants with unhealthier index scores (7-10) had 2.62-fold (95% confidence interval: 2.22, 3.10) greater mortality than participants with healthier scores (0-2). The HAI alone predicted death moderately well (c-statistic = 0.643, 95% confidence interval: 0.626, 0.661, p < .0001) and slightly worse than age alone (c-statistic = 0.700, 95% confidence interval: 0.684, 0.717, p < .0001; p < .0001 for comparison of c-statistics). Prediction increased significantly with adjustment for demographics, health behaviors, and clinical comorbidities (c-statistic = 0.780, 95% confidence interval: 0.765, 0.794, p < .0001). In Long Life Family Study, the heritability of the HAI was 0.295 (p < .0001) overall, 0.387 (p < .0001) in probands, and 0.238 (p = .0004) in offspring. The HAI should be investigated further as a candidate phenotype for uncovering longevity-associated genes in humans.

  11. Heritability of and Mortality Prediction With a Longevity Phenotype: The Healthy Aging Index

    PubMed Central

    2014-01-01

    Background. Longevity-associated genes may modulate risk for age-related diseases and survival. The Healthy Aging Index (HAI) may be a subphenotype of longevity, which can be constructed in many studies for genetic analysis. We investigated the HAI’s association with survival in the Cardiovascular Health Study and heritability in the Long Life Family Study. Methods. The HAI includes systolic blood pressure, pulmonary vital capacity, creatinine, fasting glucose, and Modified Mini-Mental Status Examination score, each scored 0, 1, or 2 using approximate tertiles and summed from 0 (healthy) to 10 (unhealthy). In Cardiovascular Health Study, the association with mortality and accuracy predicting death were determined with Cox proportional hazards analysis and c-statistics, respectively. In Long Life Family Study, heritability was determined with a variance component–based family analysis using a polygenic model. Results. Cardiovascular Health Study participants with unhealthier index scores (7–10) had 2.62-fold (95% confidence interval: 2.22, 3.10) greater mortality than participants with healthier scores (0–2). The HAI alone predicted death moderately well (c-statistic = 0.643, 95% confidence interval: 0.626, 0.661, p < .0001) and slightly worse than age alone (c-statistic = 0.700, 95% confidence interval: 0.684, 0.717, p < .0001; p < .0001 for comparison of c-statistics). Prediction increased significantly with adjustment for demographics, health behaviors, and clinical comorbidities (c-statistic = 0.780, 95% confidence interval: 0.765, 0.794, p < .0001). In Long Life Family Study, the heritability of the HAI was 0.295 (p < .0001) overall, 0.387 (p < .0001) in probands, and 0.238 (p = .0004) in offspring. Conclusion. The HAI should be investigated further as a candidate phenotype for uncovering longevity-associated genes in humans. PMID:23913930

  12. An Analysis of Mathematics Course Sequences for Low Achieving Students at a Comprehensive Technical High School

    ERIC Educational Resources Information Center

    Edge, D. Michael

    2011-01-01

    This non-experimental study attempted to determine how the different prescribed mathematic tracks offered at a comprehensive technical high school influenced the mathematics performance of low-achieving students on standardized assessments of mathematics achievement. The goal was to provide an analysis of any statistically significant differences…

  13. Determinants of Linear Judgment: A Meta-Analysis of Lens Model Studies

    ERIC Educational Resources Information Center

    Karelaia, Natalia; Hogarth, Robin M.

    2008-01-01

    The mathematical representation of E. Brunswik's (1952) lens model has been used extensively to study human judgment and provides a unique opportunity to conduct a meta-analysis of studies that covers roughly 5 decades. Specifically, the authors analyzed statistics of the "lens model equation" (L. R. Tucker, 1964) associated with 249 different…

  14. Is Quality/Effectiveness An Empirically Demonstrable School Attribute? Statistical Aids for Determining Appropriate Levels of Analysis.

    ERIC Educational Resources Information Center

    Griffith, James

    2002-01-01

    Describes and demonstrates analytical techniques used in organizational psychology and contemporary multilevel analysis. Using these analytic techniques, examines the relationship between educational outcomes and the school environment. Finds that at least some indicators might be represented as school-level phenomena. Results imply that the…

  15. Teaching for Art Criticism: Incorporating Feldman's Critical Analysis Learning Model in Students' Studio Practice

    ERIC Educational Resources Information Center

    Subramaniam, Maithreyi; Hanafi, Jaffri; Putih, Abu Talib

    2016-01-01

    This study used the artwork of 30 first-year graphic design students, with critical analysis based on Feldman's model of art criticism. Data were analyzed quantitatively; descriptive statistical techniques were employed. The scores were viewed in the form of mean scores and frequencies to determine students' performance in critical ability.…

  16. Pedagogy and the PC: Trends in the AIS Curriculum

    ERIC Educational Resources Information Center

    Badua, Frank

    2008-01-01

    The author investigated the array of course topics in accounting information systems (AIS), as embodied in course syllabi. The author (a) used exploratory data analysis to determine the topics that AIS courses most frequently offered and (b) used descriptive statistics and econometric analysis to trace the diversity of course topics through time,…

  17. Estimating annual bole biomass production using uncertainty analysis

    Treesearch

    Travis J. Woolley; Mark E. Harmon; Kari B. O' Connell

    2007-01-01

    Two common sampling methodologies coupled with a simple statistical model were evaluated to determine the accuracy and precision of annual bole biomass production (BBP) and inter-annual variability estimates using this type of approach. We performed an uncertainty analysis using Monte Carlo methods in conjunction with radial growth core data from trees in three Douglas...

  18. Statistical analysis of subjective preferences for video enhancement

    NASA Astrophysics Data System (ADS)

    Woods, Russell L.; Satgunam, PremNandhini; Bronstad, P. Matthew; Peli, Eli

    2010-02-01

    Measuring preferences for moving video quality is harder than for static images due to the fleeting and variable nature of moving video. Subjective preferences for image quality can be tested by observers indicating their preference for one image over another. Such pairwise comparisons can be analyzed using Thurstone scaling (Farrell, 1999). Thurstone (1927) scaling is widely used in applied psychology, marketing, food tasting and advertising research. Thurstone analysis constructs an arbitrary perceptual scale for the items that are compared (e.g. enhancement levels). However, Thurstone scaling does not determine the statistical significance of the differences between items on that perceptual scale. Recent papers have provided inferential statistical methods that produce an outcome similar to Thurstone scaling (Lipovetsky and Conklin, 2004). Here, we demonstrate that binary logistic regression can analyze preferences for enhanced video.
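    A hedged sketch of the general idea of analysing pairwise preference data with binary logistic regression (not the authors' code); the data layout and the use of the enhancement-level difference as the predictor are illustrative assumptions:

```python
# Hedged sketch of analysing pairwise preference choices with binary logistic
# regression. Data are simulated; variable names are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
# Each trial: an observer saw enhancement level A vs level B and chose one.
level_a = rng.integers(0, 4, 300)
level_b = rng.integers(0, 4, 300)
# Simulate choices: higher enhancement level tends to be preferred.
p_choose_a = 1 / (1 + np.exp(-(level_a - level_b)))
chose_a = rng.random(300) < p_choose_a

# Predictor: difference in enhancement level between the two alternatives.
X = sm.add_constant((level_a - level_b).astype(float))
fit = sm.Logit(chose_a.astype(float), X).fit(disp=0)
print(fit.summary())   # a significant slope implies a preference gradient
```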

  19. Determinants of health care expenditures and the contribution of associated factors: 16 cities and provinces in Korea, 2003-2010.

    PubMed

    Han, Kimyoung; Cho, Minho; Chun, Kihong

    2013-11-01

    The purpose of this study was to classify determinants of cost increases into two categories, negotiable factors and non-negotiable factors, in order to identify the determinants of health care expenditure increases and to clarify the contribution of associated factors selected based on a literature review. The data in this analysis were from the statistical yearbooks of the National Health Insurance Service, the Economic Index from Statistics Korea and regional statistical yearbooks. The unit of analysis was the annual growth rate of variables of 16 cities and provinces from 2003 to 2010. First, multiple regression was used to identify the determinants of health care expenditures. We then used hierarchical multiple regression to calculate the contribution of associated factors. The changes in the coefficients of determination (R(2)) as predictors were entered into the analysis step by step, based on the empirical evidence of the investigator, explain the contribution of each predictor to increased medical costs. Health spending was mainly associated with the proportion of the elderly population, but the Medicare Economic Index (MEI) showed an inverse association. The contribution of predictors was as follows: the proportion of elderly in the population (22.4%), gross domestic product (GDP) per capita (4.5%), MEI (-12%), and other predictors (less than 1%). As Baby Boomers enter retirement, the proportion of the population aged 65 and over and the GDP will continue to increase, thus accelerating the inflation of health care expenditures and precipitating a crisis in the health insurance system. Policy makers should consider providing comprehensive health services by an accountable care organization to achieve cost savings while ensuring high-quality care.
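    The hierarchical (blockwise) regression idea of attributing contribution via the change in R(2) can be illustrated with a small sketch; the variable names and simulated data are assumptions, not the study's dataset:

```python
# Hedged sketch of hierarchical (stepwise block) regression: enter predictors
# one block at a time and attribute the change in R^2 to each block.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 128  # e.g. 16 regions x 8 annual growth rates, for illustration only
elderly = rng.normal(0, 1, n)
gdp = rng.normal(0, 1, n)
mei = rng.normal(0, 1, n)
spending = 0.5 * elderly + 0.2 * gdp - 0.3 * mei + rng.normal(0, 1, n)

blocks = [("elderly share", elderly), ("GDP per capita", gdp), ("MEI", mei)]
X = np.ones((n, 1))            # start with an intercept-only model
prev_r2 = 0.0
for name, x in blocks:
    X = np.column_stack([X, x])
    r2 = sm.OLS(spending, X).fit().rsquared
    print(f"{name}: delta R^2 = {r2 - prev_r2:.3f}")
    prev_r2 = r2
```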

  20. Statistical analysis of the photodegradation of imazethapyr on the surface of extracted soybean (Glycine max) and corn (Zea mays) epicuticular waxes.

    PubMed

    Anderson, Scott C; Christiansen, Amy; Peterson, Alexa; Beukelman, Logan; Nienow, Amanda M

    2016-10-12

    The photodegradation rate of the herbicide imazethapyr on epicuticular waxes of soybean and corn plants was investigated. Plant age, relative humidity, temperature, and number of light banks were varied during plant growth, analyzed statistically, and examined to determine whether these factors had an effect on the photodegradation of imazethapyr. Through ultraviolet/visible (UV-Vis) and fluorescence spectroscopy, epicuticular wax characteristics of soybean and corn plants were explored and used to confirm observations determined statistically and to explain correlations between the rate constants and the composition of the epicuticular waxes. Plant age, the interaction between plant age and light, and the quadratic dependence on temperature were all determined to have a significant impact on the photodegradation rate of imazethapyr on the epicuticular waxes of soybean plants. For the photodegradation rate on the epicuticular waxes of corn plants, the number of light banks used during growing and temperature were the significant factors.

  1. Vaccine stability study design and analysis to support product licensure.

    PubMed

    Schofield, Timothy L

    2009-11-01

    Stability evaluation supporting vaccine licensure includes studies of bulk intermediates as well as final container product. Long-term and accelerated studies are performed to support shelf life and to determine release limits for the vaccine. Vaccine shelf life is best determined utilizing a formal statistical evaluation outlined in the ICH guidelines, while minimum release is calculated to help assure adequate potency through handling and storage of the vaccine. In addition to supporting release potency determination, accelerated stability studies may be used to support a strategy to recalculate product expiry after an unintended temperature excursion such as a cold storage unit failure or mishandling during transport. Appropriate statistical evaluation of vaccine stability data promotes strategic stability study design, in order to reduce the uncertainty associated with the determination of the degradation rate, and the associated risk to the customer.

  2. Quality control analysis : part I : asphaltic concrete.

    DOT National Transportation Integrated Search

    1964-11-01

    This report deals with the statistical evaluation of results from several hot mix plants to determine the pattern of variability with respect to bituminous hot mix characteristics. : Individual tests results when subjected to frequency distribution i...

  3. A fast algorithm for determining bounds and accurate approximate p-values of the rank product statistic for replicate experiments.

    PubMed

    Heskes, Tom; Eisinga, Rob; Breitling, Rainer

    2014-11-21

    The rank product method is a powerful statistical technique for identifying differentially expressed molecules in replicated experiments. A critical issue in molecule selection is accurate calculation of the p-value of the rank product statistic to adequately address multiple testing. Exact calculation as well as permutation and gamma approximations have been proposed to determine molecule-level significance. These current approaches have serious drawbacks as they are either computationally burdensome or provide inaccurate estimates in the tail of the p-value distribution. We derive strict lower and upper bounds to the exact p-value along with an accurate approximation that can be used to assess the significance of the rank product statistic in a computationally fast manner. The bounds and the proposed approximation are shown to provide far better accuracy than existing approximate methods in determining tail probabilities, with the slightly conservative upper bound protecting against false positives. We illustrate the proposed method in the context of a recently published analysis on transcriptomic profiling performed in blood. We provide a method to determine upper bounds and accurate approximate p-values of the rank product statistic. The proposed algorithm provides an order of magnitude increase in throughput as compared with current approaches and offers the opportunity to explore new application domains with even larger multiple testing issues. The R code is published in one of the Additional files and is available at http://www.ru.nl/publish/pages/726696/rankprodbounds.zip.
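    For orientation, a toy sketch of the rank product statistic itself and a permutation-based p-value, i.e. the quantity whose exact value the paper bounds; this is not the authors' bounding algorithm, and the data are simulated:

```python
# Hedged toy sketch: rank product (geometric mean of per-replicate ranks) and
# a permutation-based p-value for one molecule. Data are simulated null data.
import numpy as np

rng = np.random.default_rng(3)
n_genes, k = 1000, 4

# Observed ranks: one independent ranking of the genes per replicate.
ranks = np.stack([rng.permutation(n_genes) + 1 for _ in range(k)], axis=1)
rp = np.exp(np.mean(np.log(ranks), axis=1))      # rank product per gene

# Permutation null: rank products from repeatedly re-randomised rankings.
n_perm = 200
null = np.concatenate([
    np.exp(np.mean(np.log(
        np.stack([rng.permutation(n_genes) + 1 for _ in range(k)], axis=1)
    ), axis=1))
    for _ in range(n_perm)
])
p_gene0 = (np.sum(null <= rp[0]) + 1) / (null.size + 1)
print(f"rank product of gene 0 = {rp[0]:.1f}, permutation p = {p_gene0:.4f}")
```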

  4. A Review of the Study Designs and Statistical Methods Used in the Determination of Predictors of All-Cause Mortality in HIV-Infected Cohorts: 2002–2011

    PubMed Central

    Otwombe, Kennedy N.; Petzold, Max; Martinson, Neil; Chirwa, Tobias

    2014-01-01

    Background Research on the predictors of all-cause mortality in HIV-infected people has been widely reported in the literature. Making an informed decision requires understanding the methods used. Objectives We present a review on study designs, statistical methods and their appropriateness in original articles reporting on predictors of all-cause mortality in HIV-infected people between January 2002 and December 2011. Statistical methods were compared between 2002–2006 and 2007–2011. Time-to-event analysis techniques were considered appropriate. Data Sources Pubmed/Medline. Study Eligibility Criteria Original English-language articles were abstracted. Letters to the editor, editorials, reviews, systematic reviews, meta-analyses, case reports and any other ineligible articles were excluded. Results A total of 189 studies were identified (n = 91 in 2002–2006 and n = 98 in 2007–2011) out of which 130 (69%) were prospective and 56 (30%) were retrospective. One hundred and eighty-two (96%) studies described their sample using descriptive statistics while 32 (17%) made comparisons using t-tests. Kaplan-Meier methods for time-to-event analysis were commonly used in the earlier period (n = 69, 76% vs. n = 53, 54%, p = 0.002). Predictors of mortality in the two periods were commonly determined using Cox regression analysis (n = 67, 75% vs. n = 63, 64%, p = 0.12). Only 7 (4%) used advanced survival analysis methods of Cox regression analysis with frailty in which 6 (3%) were used in the later period. Thirty-two (17%) used logistic regression while 8 (4%) used other methods. There were significantly more articles from the first period using appropriate methods compared to the second (n = 80, 88% vs. n = 69, 70%, p-value = 0.003). Conclusion Descriptive statistics and survival analysis techniques remain the most common methods of analysis in publications on predictors of all-cause mortality in HIV-infected cohorts while prospective research designs are favoured. Sophisticated techniques of time-dependent Cox regression and Cox regression with frailty are scarce. This motivates more training in the use of advanced time-to-event methods. PMID:24498313

  5. Determination of quality parameters from statistical analysis of routine TLD dosimetry data.

    PubMed

    German, U; Weinstein, M; Pelled, O

    2006-01-01

    Following the as low as reasonably achievable (ALARA) practice, there is a need to measure very low doses, of the same order of magnitude as the natural background, and the limits of detection of the dosimetry systems. The different contributions of the background signals to the total zero dose reading of thermoluminescence dosemeter (TLD) cards were analysed by using the common basic definitions of statistical indicators: the critical level (L(C)), the detection limit (L(D)) and the determination limit (L(Q)). These key statistical parameters for the system operated at NRC-Negev were quantified, based on the history of readings of the calibration cards in use. The electronic noise seems to play a minor role, but the reading of the Teflon coating (without the presence of a TLD crystal) gave a significant contribution.
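    The standard Currie-type definitions of these quantities can be computed from repeated zero-dose readings as in the hedged sketch below; the exact convention (coverage factors, paired blank subtraction) used at a given facility may differ:

```python
# Hedged sketch of Currie-type critical level L_C, detection limit L_D and
# determination limit L_Q computed from repeated zero-dose (background)
# readings. Readings and conventions below are illustrative assumptions.
import numpy as np

zero_dose_readings = np.array([2.1, 1.8, 2.4, 2.0, 1.9, 2.2, 2.3, 1.7])  # example
sigma_b = zero_dose_readings.std(ddof=1)

L_C = 1.645 * np.sqrt(2) * sigma_b   # critical level (~5% false positive, blank-subtracted)
L_D = 2 * L_C                        # detection limit (approx., ~5% false negative)
L_Q = 10 * sigma_b                   # determination limit (~10% relative std dev)
print(f"L_C = {L_C:.2f}, L_D = {L_D:.2f}, L_Q = {L_Q:.2f}")
```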

  6. Proficiency Testing for Determination of Water Content in Toluene of Chemical Reagents by iteration robust statistic technique

    NASA Astrophysics Data System (ADS)

    Wang, Hao; Wang, Qunwei; He, Ming

    2018-05-01

    In order to investigate and improve the level of detection technology for water content in liquid chemical reagents in domestic laboratories, proficiency testing provider PT0031 (CNAS) organized a proficiency testing program for water content in toluene, in which 48 laboratories from 18 provinces/cities/municipalities took part. This paper introduces the implementation process of the proficiency testing for determination of water content in toluene, including sample preparation, homogeneity and stability testing, and the statistical results obtained with an iterative robust statistics technique. The results from the different test standards widely used in the participating laboratories are summarized and analyzed, and technological suggestions are put forward for improving the quality of water content testing. Satisfactory results were obtained by 43 laboratories, amounting to 89.6% of the total participating laboratories.

  7. Measurements of experimental precision for trials with cowpea (Vigna unguiculata L. Walp.) genotypes.

    PubMed

    Teodoro, P E; Torres, F E; Santos, A D; Corrêa, A M; Nascimento, M; Barroso, L M A; Ceccon, G

    2016-05-09

    The aim of this study was to evaluate the suitability of statistics as experimental precision degree measures for trials with cowpea (Vigna unguiculata L. Walp.) genotypes. Cowpea genotype yields were evaluated in 29 trials conducted in Brazil between 2005 and 2012. The genotypes were evaluated with a randomized block design with four replications. Ten statistics that were estimated for each trial were compared using descriptive statistics, Pearson correlations, and path analysis. According to the class limits established, selective accuracy and F-test values for genotype, heritability, and the coefficient of determination adequately estimated the degree of experimental precision. Using these statistics, 86.21% of the trials had adequate experimental precision. Selective accuracy and the F-test values for genotype, heritability, and the coefficient of determination were directly related to each other, and were more suitable than the coefficient of variation and the least significant difference (by the Tukey test) to evaluate experimental precision in trials with cowpea genotypes.
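    As an illustration of how selective accuracy relates to the genotype F-test, a small sketch under the relationships commonly used in trial-precision work, h2 = 1 - 1/F and selective accuracy = sqrt(h2); the F value below is invented and the formulas are an assumption about the statistics the abstract names:

```python
# Hedged sketch: deriving heritability of genotype means and selective
# accuracy from an ANOVA F value for genotypes (illustrative value).
import numpy as np

f_genotype = 4.2               # example F value for the genotype effect
h2 = 1 - 1 / f_genotype        # heritability of genotype means (assumed relation)
selective_accuracy = np.sqrt(h2)
print(f"h2 = {h2:.3f}, selective accuracy = {selective_accuracy:.3f}")
```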

  8. A Simple Test of Class-Level Genetic Association Can Reveal Novel Cardiometabolic Trait Loci.

    PubMed

    Qian, Jing; Nunez, Sara; Reed, Eric; Reilly, Muredach P; Foulkes, Andrea S

    2016-01-01

    Characterizing the genetic determinants of complex diseases can be further augmented by incorporating knowledge of underlying structure or classifications of the genome, such as newly developed mappings of protein-coding genes, epigenetic marks, enhancer elements and non-coding RNAs. We apply a simple class-level testing framework, termed Genetic Class Association Testing (GenCAT), to identify protein-coding gene association with 14 cardiometabolic (CMD) related traits across 6 publicly available genome wide association (GWA) meta-analysis data resources. GenCAT uses SNP-level meta-analysis test statistics across all SNPs within a class of elements, as well as the size of the class and its unique correlation structure, to determine if the class is statistically meaningful. The novelty of findings is evaluated through investigation of regional signals. A subset of findings are validated using recently updated, larger meta-analysis resources. A simulation study is presented to characterize overall performance with respect to power, control of family-wise error and computational efficiency. All analysis is performed using the GenCAT package, R version 3.2.1. We demonstrate that class-level testing complements the common first stage minP approach that involves individual SNP-level testing followed by post-hoc ascribing of statistically significant SNPs to genes and loci. GenCAT suggests 54 protein-coding genes at 41 distinct loci for the 13 CMD traits investigated in the discovery analysis, that are beyond the discoveries of minP alone. An additional application to biological pathways demonstrates flexibility in defining genetic classes. We conclude that it would be prudent to include class-level testing as standard practice in GWA analysis. GenCAT, for example, can be used as a simple, complementary and efficient strategy for class-level testing that leverages existing data resources, requires only summary level data in the form of test statistics, and adds significant value with respect to its potential for identifying multiple novel and clinically relevant trait associations.

  9. Statistical analysis of aerosol species, trace gasses, and meteorology in Chicago.

    PubMed

    Binaku, Katrina; O'Brien, Timothy; Schmeling, Martina; Fosco, Tinamarie

    2013-09-01

    Both canonical correlation analysis (CCA) and principal component analysis (PCA) were applied to atmospheric aerosol and trace gas concentrations and meteorological data collected in Chicago during the summer months of 2002, 2003, and 2004. Concentrations of ammonium, calcium, nitrate, sulfate, and oxalate particulate matter, as well as the meteorological parameters temperature, wind speed, wind direction, and humidity, were subjected to CCA and PCA. Ozone and nitrogen oxide mixing ratios were also included in the data set. The purpose of statistical analysis was to determine the extent of existing linear relationship(s), or lack thereof, between meteorological parameters and pollutant concentrations in addition to reducing dimensionality of the original data to determine sources of pollutants. In CCA, the first three canonical variate pairs derived were statistically significant at the 0.05 level. Canonical correlation between the first canonical variate pair was 0.821, while correlations of the second and third canonical variate pairs were 0.562 and 0.461, respectively. The first canonical variate pair indicated that increasing temperatures resulted in high ozone mixing ratios, while the second canonical variate pair showed wind speed and humidity's influence on local ammonium concentrations. No new information was uncovered in the third variate pair. Canonical loadings were also interpreted for information regarding relationships between data sets. Four principal components (PCs), expressing 77.0% of original data variance, were derived in PCA. Interpretation of PCs suggested significant production and/or transport of secondary aerosols in the region (PC1). Furthermore, photochemical production of ozone and wind speed's influence on pollutants were expressed (PC2) along with an overall measure of local meteorology (PC3). In summary, CCA and PCA results combined were successful in uncovering linear relationships between meteorology and air pollutants in Chicago and aided in determining possible pollutant sources.
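    A hedged sketch of applying CCA and PCA to a meteorology block and a pollutant block, in the same spirit as the analysis above; the variables and data are simulated assumptions:

```python
# Hedged sketch: canonical correlation analysis between a meteorology block
# and a pollutant block, plus PCA of the combined standardized data.
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 200
met = rng.normal(size=(n, 4))                       # temperature, wind speed/direction, humidity
poll = 0.6 * met[:, :1] + rng.normal(size=(n, 6))   # pollutant concentrations (simulated)

met_s = StandardScaler().fit_transform(met)
poll_s = StandardScaler().fit_transform(poll)

cca = CCA(n_components=3).fit(met_s, poll_s)
U, V = cca.transform(met_s, poll_s)
for i in range(3):
    r = np.corrcoef(U[:, i], V[:, i])[0, 1]
    print(f"canonical correlation {i + 1}: {r:.3f}")

pca = PCA(n_components=4).fit(np.hstack([met_s, poll_s]))
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
```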

  10. Dynamic association rules for gene expression data analysis.

    PubMed

    Chen, Shu-Chuan; Tsai, Tsung-Hsien; Chung, Cheng-Han; Li, Wen-Hsiung

    2015-10-14

    The purpose of gene expression analysis is to look for the association between regulation of gene expression levels and phenotypic variations. This association based on gene expression profile has been used to determine whether the induction/repression of genes corresponds to phenotypic variations including cell regulations, clinical diagnoses and drug development. Statistical analyses on microarray data have been developed to resolve the gene selection issue. However, these methods do not inform us of causality between genes and phenotypes. In this paper, we propose the dynamic association rule algorithm (DAR algorithm) which helps one efficiently select a subset of significant genes for subsequent analysis. The DAR algorithm is based on association rules from market basket analysis in marketing. We first propose a statistical way, based on constructing a one-sided confidence interval and hypothesis testing, to determine if an association rule is meaningful. Based on the proposed statistical method, we then developed the DAR algorithm for gene expression data analysis. The method was applied to analyze four microarray datasets and one Next Generation Sequencing (NGS) dataset: the Mice Apo A1 dataset, the whole genome expression dataset of mouse embryonic stem cells, expression profiling of the bone marrow of leukemia patients, the Microarray Quality Control (MAQC) data set and the RNA-seq dataset of a mouse genomic imprinting study. A comparison of the proposed method with the t-test on the expression profiling of the bone marrow of leukemia patients was conducted. We developed a statistical way, based on the concept of confidence interval, to determine the minimum support and minimum confidence for mining association relationships among items. With the minimum support and minimum confidence, one can find significant rules in one single step. The DAR algorithm was then developed for gene expression data analysis. The gene expression datasets showed that the proposed DAR algorithm not only was able to identify a set of differentially expressed genes that largely agreed with that of other methods, but also provided an efficient and accurate way to find influential genes of a disease. In the paper, the well-established association rule mining technique from marketing has been successfully modified to determine the minimum support and minimum confidence based on the concept of confidence interval and hypothesis testing. It can be applied to gene expression data to mine significant association rules between gene regulation and phenotype. The proposed DAR algorithm provides an efficient way to find influential genes that underlie the phenotypic variance.
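    To make the support/confidence idea concrete, a toy sketch of the basic association-rule quantities and a one-sided lower confidence bound on confidence; the thresholds and data are illustrative assumptions, and this is not the DAR algorithm itself:

```python
# Hedged toy sketch: support and confidence of a rule "gene up => phenotype",
# plus a one-sided 95% lower bound on confidence via a normal approximation,
# echoing the idea of deciding statistically whether a rule is meaningful.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 500
gene_up = rng.random(n) < 0.3
phenotype = (gene_up & (rng.random(n) < 0.7)) | (~gene_up & (rng.random(n) < 0.2))

both = np.sum(gene_up & phenotype)
support = both / n
confidence = both / np.sum(gene_up)

se = np.sqrt(confidence * (1 - confidence) / np.sum(gene_up))
lower = confidence - stats.norm.ppf(0.95) * se
print(f"support = {support:.3f}, confidence = {confidence:.3f}, lower95 = {lower:.3f}")
print("rule retained" if lower > 0.5 else "rule discarded")  # 0.5 = assumed min confidence
```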

  11. Differentiation of chocolates according to the cocoa's geographical origin using chemometrics.

    PubMed

    Cambrai, Amandine; Marcic, Christophe; Morville, Stéphane; Sae Houer, Pierre; Bindler, Françoise; Marchioni, Eric

    2010-02-10

    The determination of the geographical origin of cocoa used to produce chocolate has been assessed through the analysis of the volatile compounds of chocolate samples. The analysis of the volatile content and its statistical processing by multivariate analyses tended to form independent groups for both Africa and Madagascar, even if some of the chocolate samples analyzed appeared in a mixed zone together with those from America. This analysis also allowed a clear separation between Caribbean chocolates and those from other origins. Eight compounds (such as linalool or (E,E)-2,4-decadienal) characteristic of chocolate's different geographical origins were also identified. The method described in this work (hydrodistillation, GC analysis, and statistical treatment) may improve the control of the geographical origin of chocolate during its long production process.

  12. Statistical deprojection of galaxy pairs

    NASA Astrophysics Data System (ADS)

    Nottale, Laurent; Chamaraux, Pierre

    2018-06-01

    Aims: The purpose of the present paper is to provide methods of statistical analysis of the physical properties of galaxy pairs. We perform this study to apply it later to catalogs of isolated pairs of galaxies, especially two new catalogs we recently constructed that contain ≈1000 and ≈13 000 pairs, respectively. We are particularly interested in the dynamics of those pairs, including the determination of their masses. Methods: We could not compute the dynamical parameters directly since the necessary data are incomplete. Indeed, we only have at our disposal one component of the intervelocity between the members, namely along the line of sight, and two components of their interdistance, i.e., the projection on the sky-plane. Moreover, we know only one point of each galaxy orbit. Hence we need statistical methods to find the probability distribution of 3D interdistances and 3D intervelocities from their projections; we designed those methods under the term deprojection. Results: We proceed in two steps to determine and use the deprojection methods. First we derive the probability distributions expected for the various relevant projected quantities, namely intervelocity v_z, interdistance r_p, their ratio, and the product r_p v_z^2, which is involved in mass determination. In a second step, we propose various methods of deprojection of those parameters based on the previous analysis. We start from a histogram of the projected data and we apply inversion formulae to obtain the deprojected distributions; lastly, we test the methods by numerical simulations, which also allow us to determine the uncertainties involved.

  13. Network Meta-Analysis Using R: A Review of Currently Available Automated Packages

    PubMed Central

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687

  14. Network meta-analysis using R: a review of currently available automated packages.

    PubMed

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA)--a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously--has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA.

  15. Iterative Assessment of Statistically-Oriented and Standard Algorithms for Determining Muscle Onset with Intramuscular Electromyography.

    PubMed

    Tenan, Matthew S; Tweedell, Andrew J; Haynes, Courtney A

    2017-12-01

    The onset of muscle activity, as measured by electromyography (EMG), is a commonly applied metric in biomechanics. Intramuscular EMG is often used to examine deep musculature and there are currently no studies examining the effectiveness of algorithms for intramuscular EMG onset. The present study examines standard surface EMG onset algorithms (linear envelope, Teager-Kaiser Energy Operator, and sample entropy) and novel algorithms (time series mean-variance analysis, sequential/batch processing with parametric and nonparametric methods, and Bayesian changepoint analysis). Thirteen male and five female subjects had intramuscular EMG collected during isolated biceps brachii and vastus lateralis contractions, resulting in 103 trials. EMG onset was visually determined twice by three blinded reviewers. Since the reliability of visual onset was high (ICC(1,1): 0.92), the mean of the six visual assessments was contrasted with the algorithmic approaches. Poorly performing algorithms were stepwise eliminated via (1) root mean square error analysis, (2) algorithm failure to identify onset/premature onset, (3) linear regression analysis, and (4) Bland-Altman plots. The top performing algorithms were all based on Bayesian changepoint analysis of rectified EMG and were statistically indistinguishable from visual analysis. Bayesian changepoint analysis has the potential to produce more reliable, accurate, and objective intramuscular EMG onset results than standard methodologies.
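    As a rough illustration of the changepoint idea (not the study's algorithm), a toy single-changepoint posterior over rectified EMG with plug-in segment parameters and a uniform prior; the sampling rate and signal are simulated assumptions:

```python
# Hedged toy sketch of changepoint detection on rectified EMG: a posterior
# over a single changepoint location under Gaussian segment models with
# plug-in parameters and a uniform prior over location. This is a strong
# simplification of full Bayesian changepoint analysis, for illustration only.
import numpy as np

rng = np.random.default_rng(6)
fs = 1000                                           # assumed sampling rate, Hz
baseline = rng.normal(0, 0.05, 600)
burst = rng.normal(0, 0.40, 400)
emg = np.abs(np.concatenate([baseline, burst]))     # rectified signal

n = emg.size
log_post = np.full(n, -np.inf)
for t in range(20, n - 20):                         # keep both segments non-trivial
    a, b = emg[:t], emg[t:]
    # Profile Gaussian log-likelihood with segment-wise variance plugged in.
    ll = (-0.5 * a.size * np.log(a.var() + 1e-12)
          - 0.5 * b.size * np.log(b.var() + 1e-12))
    log_post[t] = ll                                # uniform prior over t

onset = int(np.argmax(log_post))
print(f"estimated onset: sample {onset} ({onset / fs * 1000:.0f} ms)")
```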

  16. Surface inspection of flat products by means of texture analysis: on-line implementation using neural networks

    NASA Astrophysics Data System (ADS)

    Fernandez, Carlos; Platero, Carlos; Campoy, Pascual; Aracil, Rafael

    1994-11-01

    This paper describes some texture-based techniques that can be applied to quality assessment of continuously produced flat products (metal strips, wooden surfaces, cork, textile products, ...). Since the most difficult task is that of inspecting for product appearance, human-like inspection ability is required. A common feature of all these products is the presence of non-deterministic texture on their surfaces. Two main subjects are discussed: statistical techniques for both surface finishing determination and surface defect analysis, as well as real-time implementation for on-line inspection in high-speed applications. For surface finishing determination, a Gray Level Difference technique is presented that operates on low-resolution, that is, non-zoomed, images. Defect analysis is performed by means of statistical texture analysis over defective portions of the surface. On-line implementation is accomplished by means of neural networks. When a defect arises, textural analysis is applied, which results in a data vector acting as the input of a neural net previously trained in a supervised way. This approach aims to reach on-line performance in automated visual inspection applications when texture is present on flat product surfaces.

  17. Regression Analysis as a Cost Estimation Model for Unexploded Ordnance Cleanup at Former Military Installations

    DTIC Science & Technology

    2002-06-01

    fits our actual data. To determine the goodness of fit, statisticians typically use the following four measures: R2 Statistic. The R2 statistic... A mathematical model is developed to better estimate cleanup costs using historical cost data that could be used by the Defense Department prior to placing

  18. Spectral analysis of groove spacing on Ganymede

    NASA Technical Reports Server (NTRS)

    Grimm, R. E.

    1984-01-01

    The technique used to analyze groove spacing on Ganymede is presented. Data from Voyager images are used to determine the surface topography and position of the grooves. Power spectral estimates are statistically analyzed and sample data are included.

  19. Response surface methodology as an approach to determine optimal activities of lipase entrapped in sol-gel matrix using different vegetable oils.

    PubMed

    Pinheiro, Rubiane C; Soares, Cleide M F; de Castro, Heizir F; Moraes, Flavio F; Zanin, Gisella M

    2008-03-01

    The conditions for maximization of the enzymatic activity of lipase entrapped in a sol-gel matrix were determined for different vegetable oils using an experimental design. The effects of pH, temperature, and biocatalyst loading on lipase activity were verified using a central composite experimental design leading to a set of 13 assays and to surface response analysis. For canola oil and entrapped lipase, statistical analyses showed significant effects for pH and temperature, and also for the interactions between pH and temperature and between temperature and biocatalyst loading. For olive oil and entrapped lipase, it was verified that pH was the only statistically significant variable. This study demonstrated that response surface analysis is an appropriate methodology for the maximization of the percentage of hydrolysis as a function of pH, temperature, and lipase loading.

  20. Mueller-matrix of laser-induced autofluorescence of polycrystalline films of dried peritoneal fluid in diagnostics of endometriosis

    NASA Astrophysics Data System (ADS)

    Ushenko, Yuriy A.; Koval, Galina D.; Ushenko, Alexander G.; Dubolazov, Olexander V.; Ushenko, Vladimir A.; Novakovskaia, Olga Yu.

    2016-07-01

    This research presents the results of an investigation of the diagnostic efficiency of an azimuthally stable Mueller-matrix method of analysis of laser autofluorescence of polycrystalline films of dried uterine cavity peritoneal fluid. A model of the generalized optical anisotropy of films of dried peritoneal fluid is proposed in order to define the processes of laser autofluorescence. The influence of complex mechanisms of both phase (linear and circular birefringence) and amplitude (linear and circular dichroism) anisotropies is taken into consideration. The interconnections between the azimuthally stable Mueller-matrix elements characterizing laser autofluorescence and different mechanisms of optical anisotropy are determined. The statistical analysis of coordinate distributions of such Mueller-matrix rotation invariants is proposed. Thereupon the quantitative criteria (statistical moments of the first to fourth orders) of differentiation of polycrystalline films of dried peritoneal fluid, group 1 (healthy donors) and group 2 (uterus endometriosis patients), are determined.

  1. Statistical approaches in published ophthalmic clinical science papers: a comparison to statistical practice two decades ago.

    PubMed

    Zhang, Harrison G; Ying, Gui-Shuang

    2018-02-09

    The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical analysis approaches for analysing the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at ocular level, one study (1%) analysed the overall summary of ocular finding per individual and three (3%) studies used the paired comparison. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, as compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at an ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, as compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis did not improve in the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data.

  2. Tips and Tricks for Successful Application of Statistical Methods to Biological Data.

    PubMed

    Schlenker, Evelyn

    2016-01-01

    This chapter discusses experimental design and use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, and normally distributed, more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize the data, allowing use of parametric tests. Alternatively, with skewed data nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance the risk of committing type 1 errors (false positives) against type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (randomized clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premise and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis.

  3. [Study on correlation between ITS sequence of Arctium lappa and quality of Fructus Arctii].

    PubMed

    Xu, Liang; Dou, Deqiang; Wang, Bing; Yang, Yanyun; Kang, Tingguo

    2011-07-01

    To study the correlation between the ITS sequence of Arctium lappa and the quality of Fructus Arctii of different origins. Samples of Fructus Arctii material were collected from 26 different producing areas. Their ITS sequences were determined after polymerase chain reaction (PCR), and quality was evaluated through determination of arctiin content by HPLC. Genetic diversity, genotype and correlation were analyzed with ClustalX (1.81), MEGA 4.0 and SPSS 13.0 statistical software. The ITS sequence of A. lappa was obtained from the 26 samples and registered in GenBank. The corresponding arctiin content of Fructus Arctii and 1000-grain weight were determined. The A. lappa genotype correlated with Fructus Arctii quality by statistical analysis. The research provides a foundation for revealing the molecular mechanism of Fructus Arctii geoherbs.

  4. Meta-analysis of neutropenia or leukopenia as a prognostic factor in patients with malignant disease undergoing chemotherapy.

    PubMed

    Shitara, Kohei; Matsuo, Keitaro; Oze, Isao; Mizota, Ayako; Kondo, Chihiro; Nomura, Motoo; Yokota, Tomoya; Takahari, Daisuke; Ura, Takashi; Muro, Kei

    2011-08-01

    We performed a systematic review and meta-analysis to determine the impact of neutropenia or leukopenia experienced during chemotherapy on survival. Eligible studies included prospective or retrospective analyses that evaluated neutropenia or leukopenia as a prognostic factor for overall survival or disease-free survival. Statistical analyses were conducted to calculate a summary hazard ratio and 95% confidence interval (CI) using random-effects or fixed-effects models based on the heterogeneity of the included studies. Thirteen trials were selected for the meta-analysis, with a total of 9,528 patients. The hazard ratio of death was 0.69 (95% CI, 0.64-0.75) for patients with higher-grade neutropenia or leukopenia compared to patients with lower-grade or lack of cytopenia. Our analysis was also stratified by statistical method (any statistical method to decrease lead-time bias; time-varying analysis or landmark analysis), but no differences were observed. Our results indicate that neutropenia or leukopenia experienced during chemotherapy is associated with improved survival in patients with advanced cancer or hematological malignancies undergoing chemotherapy. Future prospective analyses designed to investigate the potential impact of chemotherapy dose adjustment coupled with monitoring of neutropenia or leukopenia on survival are warranted.

  5. An Analysis of Construction Contractor Performance Evaluation System

    DTIC Science & Technology

    2009-03-01

    Summary of Determinant and KMO Values for Finalized... principal component analysis output is the KMO and Bartlett's Test. KMO, or the Kaiser-Meyer-Olkin measure of sampling adequacy, is used to identify if a... set of variables, when factored together, yields distinct and reliable factors (Field, 2005). KMO statistics vary between 0 and 1. Kaiser

  6. Using Multi-Group Confirmatory Factor Analysis to Evaluate Cross-Cultural Research: Identifying and Understanding Non-Invariance

    ERIC Educational Resources Information Center

    Brown, Gavin T. L.; Harris, Lois R.; O'Quin, Chrissie; Lane, Kenneth E.

    2017-01-01

    Multi-group confirmatory factor analysis (MGCFA) allows researchers to determine whether a research inventory elicits similar response patterns across samples. If statistical equivalence in responding is found, then scale score comparisons become possible and samples can be said to be from the same population. This paper illustrates the use of…

  7. A Content Analysis of Dissertations in the Field of Educational Technology: The Case of Turkey

    ERIC Educational Resources Information Center

    Durak, Gurhan; Cankaya, Serkan; Yunkul, Eyup; Misirli, Zeynel Abidin

    2018-01-01

    The present study aimed at conducting content analysis on dissertations carried out so far in the field of Educational Technology in Turkey. A total of 137 dissertations were examined to determine the key words, academic discipline, research areas, theoretical frameworks, research designs and models, statistical analyses, data collection tools,…

  8. A Confirmatory Factor Analysis of the Structure of Statistics Anxiety Measure: An examination of four alternative models

    PubMed Central

    Vahedi, Shahram; Farrokhi, Farahman

    2011-01-01

    Objective The aim of this study is to explore the confirmatory factor analysis results of the Persian adaptation of Statistics Anxiety Measure (SAM), proposed by Earp. Method The validity and reliability assessments of the scale were performed on 298 college students chosen randomly from Tabriz University in Iran. Confirmatory factor analysis (CFA) was carried out to determine the factor structures of the Persian adaptation of SAM. Results As expected, the second order model provided a better fit to the data than the three alternative models. Conclusions Hence, SAM provides an equally valid measure for use among college students. The study both expands and adds support to the existing body of math anxiety literature. PMID:22952530

  9. Comparative analysis on the selection of number of clusters in community detection

    NASA Astrophysics Data System (ADS)

    Kawamoto, Tatsuro; Kabashima, Yoshiyuki

    2018-02-01

    We conduct a comparative analysis on various estimates of the number of clusters in community detection. An exhaustive comparison requires testing of all possible combinations of frameworks, algorithms, and assessment criteria. In this paper we focus on the framework based on a stochastic block model, and investigate the performance of greedy algorithms, statistical inference, and spectral methods. For the assessment criteria, we consider modularity, map equation, Bethe free energy, prediction errors, and isolated eigenvalues. From the analysis, the tendencies of the assessment criteria and algorithms to overfit and underfit become apparent. In addition, we propose that the alluvial diagram is a suitable tool to visualize statistical inference results and can be useful to determine the number of clusters.

  10. Statistical Analysis of Protein Ensembles

    NASA Astrophysics Data System (ADS)

    Máté, Gabriell; Heermann, Dieter

    2014-04-01

    As 3D protein-configuration data pile up, there is an ever-increasing need for well-defined, mathematically rigorous analysis approaches, especially since the vast majority of the currently available methods rely heavily on heuristics. We propose an analysis framework which stems from topology, the field of mathematics which studies properties preserved under continuous deformations. First, we calculate a barcode representation of the molecules employing computational topology algorithms. Bars in this barcode represent different topological features. Molecules are compared through their barcodes by statistically determining the difference in the set of their topological features. As a proof-of-principle application, we analyze a dataset compiled of ensembles of different proteins, obtained from the Ensemble Protein Database. We demonstrate that our approach correctly detects the different protein groupings.

  11. Methods for collection and analysis of aquatic biological and microbiological samples

    USGS Publications Warehouse

    Greeson, Phillip E.; Ehlke, T.A.; Irwin, G.A.; Lium, B.W.; Slack, K.V.

    1977-01-01

    Chapter A4 contains methods used by the U.S. Geological Survey to collect, preserve, and analyze waters to determine their biological and microbiological properties. Part 1 discusses biological sampling and sampling statistics. The statistical procedures are accompanied by examples. Part 2 consists of detailed descriptions of more than 45 individual methods, including those for bacteria, phytoplankton, zooplankton, seston, periphyton, macrophytes, benthic invertebrates, fish and other vertebrates, cellular contents, productivity, and bioassays. Each method is summarized, and the application, interferences, apparatus, reagents, collection, analysis, calculations, reporting of results, precision and references are given. Part 3 consists of a glossary. Part 4 is a list of taxonomic references.

  12. The Determinants of Academic Performance of under Graduate Students: In the Case of Arba Minch University Chamo Campus

    ERIC Educational Resources Information Center

    Yigermal, Moges Endalamaw

    2017-01-01

    The main objective of the paper is to investigate the determinant factors affecting the academic performance of regular undergraduate students of Arba Minch university (AMU) chamo campus students. To meet the objective, the Pearson product moment correlation statistical tool and econometrics data analysis (OLS regression) method were used with the…

  13. Determination of the refractive index of dehydrated cells by means of digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Belashov, A. V.; Zhikhoreva, A. A.; Bespalov, V. G.; Vasyutinskii, O. S.; Zhilinskaya, N. T.; Novik, V. I.; Semenova, I. V.

    2017-10-01

    Spatial distributions of the integral refractive index in dehydrated cells of human oral cavity epithelium are obtained by means of digital holographic microscopy, and mean refractive index of the cells is determined. The statistical analysis of the data obtained is carried out, and absolute errors of the method are estimated for different experimental conditions.

  14. Rating locomotive crew diesel emission exposure profiles using statistics and Bayesian Decision Analysis.

    PubMed

    Hewett, Paul; Bullock, William H

    2014-01-01

    For more than 20 years CSX Transportation (CSXT) has collected exposure measurements from locomotive engineers and conductors who are potentially exposed to diesel emissions. The database included measurements for elemental and total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, carbon monoxide, and nitrogen dioxide. This database was statistically analyzed and summarized, and the resulting statistics and exposure profiles were compared to relevant occupational exposure limits (OELs) using both parametric and non-parametric descriptive and compliance statistics. Exposure ratings, using the American Industrial Hygiene Association (AIHA) exposure categorization scheme, were determined using both the compliance statistics and Bayesian Decision Analysis (BDA). The statistical analysis of the elemental carbon data (a marker for diesel particulate) strongly suggests that the majority of levels in the cabs of the lead locomotives (n = 156) were less than the California guideline of 0.020 mg/m(3). The sample 95th percentile was roughly half the guideline, resulting in an AIHA exposure rating of category 2/3 (determined using BDA). The elemental carbon (EC) levels in the trailing locomotives tended to be greater than those in the lead locomotive; however, locomotive crews rarely ride in the trailing locomotive. Lead locomotive EC levels were similar to those reported by other investigators studying locomotive crew exposures and to levels measured in urban areas. Lastly, both the EC sample mean and 95% UCL were less than the Environmental Protection Agency (EPA) reference concentration of 0.005 mg/m(3). With the exception of nitrogen dioxide, the overwhelming majority of the measurements for total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, and combustion gases in the cabs of CSXT locomotives were either non-detects or considerably less than the working OELs for the years represented in the database. When compared to the previous American Conference of Governmental Industrial Hygienists (ACGIH) threshold limit value (TLV) of 3 ppm, the nitrogen dioxide exposure profile merits an exposure rating of AIHA exposure category 1. However, using the newly adopted TLV of 0.2 ppm, the exposure profile receives an exposure rating of category 4. Further evaluation is recommended to determine the current status of nitrogen dioxide exposures.
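    The compliance-statistics side of such a rating can be sketched as below, assuming a lognormal exposure distribution; the simulated measurements and the 0.020 mg/m^3 guideline used as the comparison limit are illustrative:

```python
# Hedged sketch: fit a lognormal to exposure measurements, estimate the 95th
# percentile and the exceedance fraction against an assumed exposure limit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
ec = rng.lognormal(mean=np.log(0.008), sigma=0.6, size=156)  # mg/m^3, simulated
oel = 0.020                                                  # assumed guideline

mu, sigma = np.log(ec).mean(), np.log(ec).std(ddof=1)
p95 = np.exp(mu + stats.norm.ppf(0.95) * sigma)
exceedance = 1 - stats.norm.cdf((np.log(oel) - mu) / sigma)
print(f"95th percentile = {p95:.4f} mg/m^3, exceedance fraction = {exceedance:.3f}")
```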

  15. Mediation analysis in nursing research: a methodological review.

    PubMed

    Liu, Jianghong; Ulrich, Connie

    2016-12-01

    Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask - and answer - more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science.

  16. Numerical 3D flow simulation of attached cavitation structures at ultrasonic horn tips and statistical evaluation of flow aggressiveness via load collectives

    NASA Astrophysics Data System (ADS)

    Mottyll, S.; Skoda, R.

    2015-12-01

    A compressible inviscid flow solver with a barotropic cavitation model is applied to two different ultrasonic horn set-ups and compared to hydrophone, shadowgraphy, and erosion test data. The statistical analysis of single collapse events in wall-adjacent flow regions allows the determination of the flow aggressiveness via load collectives (cumulative event rate vs collapse pressure), which show an exponential decrease in agreement with studies on hydrodynamic cavitation [1]. A post-processing projection of event rate and collapse pressure on a reference grid reduces the grid dependency significantly. In order to evaluate the erosion-sensitive areas, a statistical analysis of transient wall loads is utilised. Predicted erosion-sensitive areas as well as temporal pressure and vapour volume evolution are in good agreement with the experimental data.

  17. Institutional authorisation and accreditation of Transfusion Services and Blood Donation Sites: results of a national survey

    PubMed Central

    Liumbruno, Giancarlo Maria; Panetta, Valentina; Bonini, Rosaria; Chianese, Rosa; Fiorin, Francesco; Lupi, Maria Antonietta; Tomasini, Ivana; Grazzini, Giuliano

    2011-01-01

    Introduction The aim of the survey described in this article was to determine decisional and strategic factors useful for redefining minimum structural, technological and organisational requisites for transfusion structures, as well as for the production of guidelines for accreditation of transfusion structures by the National Blood Centre. Materials and methods A structured questionnaire containing 65 questions was sent to all Transfusion Services in Italy. The questions covered: management of the quality system, accreditation, conformity with professional standards, structural and technological requisites, as well as potential to supply transfusion medicine-related health care services. All the questionnaires returned underwent statistical analysis. Results Replies were received from 64.7% of the Transfusion Services. Thirty-nine percent of these had an ISO 9001 certificate, with marked differences according to geographical location; location-related differences were also present for responses to other questions and were confirmed by multivariate statistical analysis. Over half of the Transfusion Services (53.6%) had blood donation sites run by donor associations. The statistical analysis revealed only one statistically significant difference between these donation sites: those connected to certified Transfusion Services were more likely themselves to have ISO 9001 certification than those connected to services who did not have such certification. Conclusions The data collected in this survey are representative of the Italian national transfusion system. A re-definition of the authorisation and accreditation requisites for transfusion activities must take into account European and national legislation when determining these requisites in order to facilitate their effective applicability, promote their efficient fulfilment and enhance the development of homogeneous and transparent quality systems. PMID:21839026

  18. Statistical Inference for Porous Materials using Persistent Homology.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moon, Chul; Heath, Jason E.; Mitchell, Scott A.

    2017-12-01

    We propose a porous materials analysis pipeline using persistent homology. We first compute persistent homology of binarized 3D images of sampled material subvolumes. For each image we compute sets of homology intervals, which are represented as summary graphics called persistence diagrams. We convert persistence diagrams into image vectors in order to analyze the similarity of the homology of the material images using the mature tools for image analysis. Each image is treated as a vector and we compute its principal components to extract features. We fit a statistical model using the loadings of principal components to estimate material porosity, permeability, anisotropy, and tortuosity. We also propose an adaptive version of the structural similarity index (SSIM), a similarity metric for images, as a measure to determine the statistical representative elementary volumes (sREV) for persistence homology. Thus we provide a capability for making a statistical inference of the fluid flow and transport properties of porous materials based on their geometry and connectivity.

  19. 12 & 15 passenger vans tire pressure study : preliminary results

    DOT National Transportation Integrated Search

    2005-05-01

    A study was conducted by the National Highway Traffic Safety Administration's (NHTSA's) National Center for Statistics and Analysis (NCSA) to determine the extent of underinflation and observe the tire condition in 12- and 15-passenger vans. This Res...

  20. Automation method to identify the geological structure of seabed using spatial statistic analysis of echo sounding data

    NASA Astrophysics Data System (ADS)

    Kwon, O.; Kim, W.; Kim, J.

    2017-12-01

    Construction of subsea tunnels has recently increased worldwide. For the safe construction of a subsea tunnel, identifying the geological structure, including faults, at the design and construction stages is critically important. Unlike tunnels on land, however, data on geological structure are very difficult to obtain because of the limits of geological surveys at sea. This study addresses these difficulties by developing a technology to identify the geological structure of the seabed automatically from echo sounding data. When investigating a potential site for a deep subsea tunnel, boreholes and geophysical investigations face technical and economic limits; echo sounding data, by contrast, are easily obtainable and offer higher information reliability than those approaches. The study aims to develop an algorithm that identifies large-scale geological structure of the seabed using a geostatistical approach, based on the structural-geology principle that topographic features indicate geological structure. The basic concept of the algorithm is as follows: (1) convert the seabed topography to grid data using echo sounding data, (2) apply a moving window of optimal size to the grid data, (3) estimate the spatial statistics of the grid data within the window, (4) set a percentile standard for the spatial statistics, (5) display the values satisfying the standard on the map, and (6) visualize the geological structure on the map. The key elements of the study are the optimal size of the moving window, the choice of spatial statistics and the determination of the optimal percentile standard; numerous simulations were run to determine them. A user program based on R was then developed around the resulting analysis algorithm. The program is designed to show how the results vary with different spatial statistics, making it easy to explore the geological structure by specifying the type of spatial statistic and the percentile standard. This research was supported by the Korea Agency for Infrastructure Technology Advancement under the Ministry of Land, Infrastructure and Transport of the Korean government. (Project Number: 13 Construction Research T01)
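
    The core of the algorithm described above, steps (2) through (5), is a moving-window spatial statistic followed by percentile thresholding. The study's user program was written in R; the sketch below is an illustrative Python equivalent, with the window size, the choice of local standard deviation as the spatial statistic, and the 95th-percentile standard all treated as placeholder assumptions.

      import numpy as np

      def local_stat_map(grid, window=5, stat=np.std):
          """Evaluate a spatial statistic over a moving window on a bathymetry grid."""
          half = window // 2
          padded = np.pad(grid, half, mode="edge")
          out = np.empty(grid.shape, dtype=float)
          for i in range(grid.shape[0]):
              for j in range(grid.shape[1]):
                  out[i, j] = stat(padded[i:i + window, j:j + window])
          return out

      def flag_structures(grid, window=5, percentile=95.0):
          """Flag cells whose local statistic exceeds the chosen percentile standard."""
          stats = local_stat_map(grid, window)
          threshold = np.percentile(stats, percentile)
          return stats >= threshold    # boolean map of candidate structural lineaments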

  1. Analysis of Emergency Diesel Generators Failure Incidents in Nuclear Power Plants

    NASA Astrophysics Data System (ADS)

    Hunt, Ronderio LaDavis

    In their early years of operation, emergency diesel generators had a minimal rate of demand failures. Emergency diesel generators (EDGs) are designed to operate as a backup when the main source of electricity is disrupted. More recently, EDGs have been failing at nuclear power plants (NPPs) around the United States, causing either station blackouts or loss of onsite and offsite power. These failures were of a specific type called demand failures. This thesis evaluated the problem that raised concern in the nuclear industry: an average of one EDG demand failure per year in 1997 grew to an excessive event of four EDG demand failures in a single year in 2011. To predict the next occurrence of such an extreme event and its possible causes, two analyses were conducted: a statistical analysis and a root cause analysis. In the statistical analysis, an extreme event probability approach was applied to determine the year of the next excessive event as well as the probability of that event occurring. In the root cause analysis, potential causes of the excessive event were evaluated in terms of the EDG manufacturers, aging, policy changes/maintenance practices and failure components, and the correlation between the demand failure data and historical data was investigated. The statistical analysis produced expectations of an excessive event occurring within a fixed range of probability, with a wider range of probability from the extreme event probability approach. The root cause analysis of the demand failure data followed historical statistics for the EDG manufacturer, aging and policy changes/maintenance practices, but pointed to the failure components as a possible cause of the excessive event. The conclusions showed that predicting the next excessive demand failure year, its probability and the next occurrence of such failures with an acceptable confidence level was difficult, but that this type of failure is unlikely to be a 100-year event. Notably, the majority of the EDG demand failures occurred within the main components as of 2005. The overall analysis, based on these percentages, supports the statement that the excessive event was caused by the overall age (wear and tear) of the emergency diesel generators in nuclear power plants. Future work will be to better determine the return period of the excessive event, once it has occurred a second time, by applying the extreme event probability approach.
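
    The thesis's exact extreme-event formulation is not reproduced in the abstract, so the sketch below only illustrates the generic empirical calculation behind statements such as "not a 100-year event": an annual exceedance probability, its return period, and the chance of seeing another excessive year within a given horizon. The yearly failure counts are placeholders, not data from the study.

      import numpy as np

      def return_period(annual_counts, threshold):
          """Empirical annual exceedance probability and return period for years
          with at least `threshold` demand failures."""
          counts = np.asarray(annual_counts)
          p_exceed = np.mean(counts >= threshold)
          period = 1.0 / p_exceed if p_exceed > 0 else np.inf
          return p_exceed, period

      def prob_within_horizon(p_exceed, years):
          """Probability of at least one exceedance within the given number of years."""
          return 1.0 - (1.0 - p_exceed) ** years

      # Placeholder yearly EDG demand-failure counts (illustrative only, not study data)
      counts = [1, 0, 2, 1, 1, 0, 1, 2, 1, 0, 1, 1, 0, 4]
      p, T = return_period(counts, threshold=4)
      print(f"exceedance probability {p:.3f}/year, return period {T:.0f} years")
      print(f"chance of another such year within 20 years: {prob_within_horizon(p, 20):.2f}")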

  2. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    PubMed

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
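
    As a small illustration of the basic observed statistic described above, the sketch below computes the probability of identification (POI) as the proportion of replicates identified, together with a Wilson score confidence interval. The Wilson interval is a common choice for binomial proportions and is used here only as an example; it is not necessarily the interval prescribed by the validation protocol, and the replicate numbers are illustrative.

      import numpy as np

      def poi_with_wilson_ci(identified, replicates, z=1.96):
          """Probability of identification and an approximate 95% Wilson interval."""
          p = identified / replicates
          denom = 1.0 + z ** 2 / replicates
          centre = (p + z ** 2 / (2 * replicates)) / denom
          half = z * np.sqrt(p * (1 - p) / replicates
                             + z ** 2 / (4 * replicates ** 2)) / denom
          return p, max(0.0, centre - half), min(1.0, centre + half)

      # e.g. 18 of 20 replicates identified at one test concentration (illustrative)
      print(poi_with_wilson_ci(18, 20))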

  3. Relationship of saving habit determinants among undergraduate students: A case study of UiTM Negeri Sembilan, Kampus Seremban

    NASA Astrophysics Data System (ADS)

    Syahrom, N. S.; Nasrudin, N. S.; Mohamad Yasin, N.; Azlan, N.; Manap, N.

    2017-08-01

    It has been reported that students are already accumulating substantial debt from higher education fees and that their spending habits are at a critical stage. These situations put young people at risk of bankruptcy if they cannot improve their saving habits. Many studies have acknowledged that the determinants of saving habit include financial management, parental socialization, peer influence, and self-control. However, the relationships between these determinants still need to be investigated in order to help young people avoid bankruptcy. The objectives of this study are to investigate the relationships between saving habit determinants and to generate a statistical model based on those determinants. Data were collected by questionnaire from undergraduate students of UiTM Negeri Sembilan, Kampus Seremban. Samples were chosen using a stratified probability sampling technique, with a cross-sectional research design. The results were analysed using descriptive analysis, a reliability test, Pearson correlation, and multiple linear regression analysis. The analysis showed that saving habit and self-control have no significant relationship; it suggests that students with less self-control tend to save less. A statistical model incorporating this relationship was generated. The study is significant in helping students save more by exercising greater self-control.
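
    A minimal sketch of the reported analysis chain (Pearson correlations of each determinant with saving habit, followed by multiple linear regression) is shown below. The DataFrame column names are hypothetical stand-ins for the questionnaire scales, and the use of SciPy and statsmodels is illustrative, not the study's own software.

      from scipy.stats import pearsonr
      import statsmodels.api as sm

      # df is assumed to be a pandas DataFrame with one row per respondent
      def saving_habit_model(df):
          predictors = ["financial_management", "parental_socialization",
                        "peer_influence", "self_control"]   # hypothetical column names
          for col in predictors:
              r, p = pearsonr(df[col], df["saving_habit"])
              print(f"{col}: r = {r:.2f}, p = {p:.3f}")
          X = sm.add_constant(df[predictors])
          return sm.OLS(df["saving_habit"], X).fit()        # multiple linear regression

      # model = saving_habit_model(df); print(model.summary())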

  4. Population data of five genetic markers in the Turkish population: comparison with four American population groups.

    PubMed

    Kurtuluş-Ulküer, M; Ulküer, U; Kesici, T; Menevşe, S

    2002-09-01

    In this study, the phenotype and allele frequencies of five enzyme systems were determined in a total of 611 unrelated Turkish individuals and analyzed by using the exact test and the chi-squared test. The following five red cell enzymes were identified by cellulose acetate electrophoresis: phosphoglucomutase (PGM), adenosine deaminase (ADA), phosphoglucose isomerase (PGI), adenylate kinase (AK), and 6-phosphogluconate dehydrogenase (6-PGD). The ADA, PGM and AK enzymes were found to be polymorphic in the Turkish population. The results of the statistical analysis showed that the phenotype frequencies of the five enzymes under study are in Hardy-Weinberg equilibrium. Statistical analysis was performed in order to examine whether there are significant differences in the phenotype frequencies between the Turkish population and four American population groups. This analysis showed that there are some statistically significant differences between the Turkish and the other groups. Moreover, the observed phenotype and allele frequencies were compared with those obtained in other population groups of Turkey.

  5. Inverse Statistics and Asset Allocation Efficiency

    NASA Astrophysics Data System (ADS)

    Bolgorian, Meysam

    In this paper using inverse statistics analysis, the effect of investment horizon on the efficiency of portfolio selection is examined. Inverse statistics analysis is a general tool, also known as the probability distribution of exit time, that is used for detecting the distribution of the time in which a stochastic process exits from a zone. This analysis was used in Refs. 1 and 2 for studying financial returns time series. The distribution provides an optimal investment horizon, which determines the most likely horizon for gaining a specific return. Using samples of stocks from the Tehran Stock Exchange (TSE) as an emerging market and the S&P 500 as a developed market, the effect of the optimal investment horizon on asset allocation is assessed. It is found that taking the optimal investment horizon into account in the TSE leads to more efficiency for large portfolios, while for stocks selected from the S&P 500, regardless of portfolio size, not only does this strategy not produce more efficient portfolios, but longer investment horizons provide more efficiency.
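
    The sketch below illustrates the inverse-statistics (exit-time) computation described above: for each starting day, the waiting time until the cumulative log-return first reaches a target level rho, with the mode of the resulting distribution taken as the optimal investment horizon. The return level and the price series are assumptions supplied by the user; this is not the author's code.

      import numpy as np

      def exit_times(prices, rho=0.05):
          """Waiting times until the cumulative log-return first reaches the level rho."""
          log_p = np.log(np.asarray(prices, dtype=float))
          times = []
          for start in range(len(log_p) - 1):
              future = log_p[start + 1:] - log_p[start]
              hit = np.nonzero(future >= rho)[0]
              if hit.size:
                  times.append(hit[0] + 1)     # days until the target return is reached
          return np.array(times)

      def optimal_horizon(prices, rho=0.05):
          """Most probable exit time: the peak of the exit-time distribution."""
          t = exit_times(prices, rho)
          values, counts = np.unique(t, return_counts=True)
          return values[np.argmax(counts)]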

  6. The mediating effect of calling on the relationship between medical school students' academic burnout and empathy.

    PubMed

    Chae, Su Jin; Jeong, So Mi; Chung, Yoon-Sok

    2017-09-01

    This study is aimed at identifying the relationships between medical school students' academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. A mixed method study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students' empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For qualitative approach, eight medical students participated in a focus group interview. The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. This result demonstrates that calling is a key variable that mediates the relationship between medical students' academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students' empathy skills.

  7. Statistical analysis of the determinations of the Sun's Galactocentric distance

    NASA Astrophysics Data System (ADS)

    Malkin, Zinovy

    2013-02-01

    Based on several tens of R0 measurements made during the past two decades, several studies have been performed to derive the best estimate of R0. Some used just simple averaging to derive a result, whereas others provided comprehensive analyses of possible errors in published results. In either case, detailed statistical analyses of data used were not performed. However, a computation of the best estimates of the Galactic rotation constants is not only an astronomical but also a metrological task. Here we perform an analysis of 53 R0 measurements (published in the past 20 years) to assess the consistency of the data. Our analysis shows that they are internally consistent. It is also shown that any trend in the R0 estimates from the last 20 years is statistically negligible, which renders the presence of a bandwagon effect doubtful. On the other hand, the formal errors in the published R0 estimates improve significantly with time.

  8. Comparison of corneal endothelial image analysis by Konan SP8000 noncontact and Bio-Optics Bambi systems.

    PubMed

    Benetz, B A; Diaconu, E; Bowlin, S J; Oak, S S; Laing, R A; Lass, J H

    1999-01-01

    To compare corneal endothelial image analysis by the Konan SP8000 and Bio-Optics Bambi image-analysis systems. Corneal endothelial images from 98 individuals (191 eyes), ranging in age from 4 to 87 years, with a normal slit-lamp examination and no history of ocular trauma, intraocular surgery, or intraocular inflammation were obtained by the Konan SP8000 noncontact specular microscope. One observer analyzed these images by using the Konan system and a second observer by using the Bio-Optics Bambi system. Three methods of analyses were used: a fixed-frame method to obtain cell density (for both Konan and Bio-Optics Bambi) and a "dot" (Konan) or "corners" (Bio-Optics Bambi) method to determine morphometric parameters. The cell density determined by the Konan fixed-frame method was significantly higher (157 cells/mm2) than the Bio-Optics Bambi fixed-frame method determination (p<0.0001). However, the difference in cell density, although still statistically significant, was smaller and reversed when comparing the Konan fixed-frame method with both the Konan dot and Bio-Optics Bambi corners methods (-74 cells/mm2, p<0.0001; -55 cells/mm2, p<0.0001, respectively). Small but statistically significant morphometric analysis differences between Konan and Bio-Optics Bambi were seen: cell density, +19 cells/mm2 (p = 0.03); cell area, -3.0 microm2 (p = 0.008); and coefficient of variation, +1.0 (p = 0.003). There was no statistically significant difference between these two methods in the percentage of six-sided cells detected (p = 0.55). Cell densities measured by the Konan fixed-frame method were comparable with Konan and Bio-Optics Bambi's morphometric analysis, but not with the Bio-Optics Bambi fixed-frame method. The two morphometric analyses were comparable with minimal or no differences for the parameters that were studied. The Konan SP8000 endothelial image-analysis system may be useful for large-scale clinical trials determining cell loss; its noncontact system has many clinical benefits (including patient comfort, safety, ease of use, and short procedure time) and provides reliable cell-density calculations.

  9. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review

    PubMed Central

    Lamb, Karen E.; Thornton, Lukar E.; Cerin, Ester; Ball, Kylie

    2015-01-01

    Background: Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods: Searches were conducted for articles published from 2000–2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results: Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions: With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results. PMID:29546115
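
    As a concrete example of one of the common approaches identified by the review, the sketch below fits a Poisson regression of neighbourhood outlet counts on a disadvantage measure using statsmodels. The variable names are hypothetical, and, as the review emphasises, residual spatial autocorrelation would still need to be checked before trusting the inference.

      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      # df: one row per neighbourhood, with an outlet count and a disadvantage measure
      def outlet_access_model(df):
          # Poisson regression of supermarket counts on neighbourhood disadvantage;
          # a negative binomial family would be substituted if counts are overdispersed.
          return smf.glm("supermarket_count ~ deprivation_index",
                         data=df, family=sm.families.Poisson()).fit()

      # Residuals should still be examined for spatial autocorrelation (e.g. Moran's I)
      # before the standard errors are trusted.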

  10. The CTS 11.7 GHz angle of arrival experiment

    NASA Technical Reports Server (NTRS)

    Kwan, B. W.; Hodge, D. B.

    1981-01-01

    The objective of the experiment was to determine the statistical behavior of attenuation and angle of arrival on an Earth-space propagation path using the CTS 11.7 GHz beacon. Measurements performed from 1976 to 1978 form the data base for analysis. The statistics of the signal attenuation and phase variations due to atmospheric disturbances are presented. Rainfall rate distributions are also included to provide a link between the above effects on wave propagation and meteorological conditions.

  11. A Comparison of Atmospheric Quantities Determined from Advanced WVR and Weather Analysis Data

    NASA Astrophysics Data System (ADS)

    Morabito, D.; Wu, L.; Slobin, S.

    2017-05-01

    Lower frequency bands used for deep space communications (e.g., 2.3 GHz and 8.4 GHz) are oversubscribed. Thus, NASA has become interested in using higher frequency bands (e.g., 26 GHz and 32 GHz) for telemetry, making use of the available wider bandwidth. However, these bands are more susceptible to atmospheric degradation. Currently, flight projects tend to be conservative in preparing their communications links by using worst-case or conservative assumptions, which result in nonoptimum data return. We previously explored the use of weather forecasting over different weather condition scenarios to determine more optimal values of atmospheric attenuation and atmospheric noise temperature for use in telecommunications link design. In this article, we present the results of a comparison of meteorological parameters (columnar water vapor and liquid water content) estimated from multifrequency Advanced Water Vapor Radiometer (AWVR) data with those estimated from weather analysis tools (FNL). We find that for the Deep Space Network's Goldstone and Madrid tracking sites, the statistics are in reasonable agreement between the two methods. We can then use the statistics of these quantities based on FNL runs to estimate statistics of atmospheric signal degradation for tracking sites that do not have the benefit of possessing multiyear WVR data sets, such as those of the NASA Near-Earth Network (NEN). The resulting statistics of atmospheric attenuation and atmospheric noise temperature increase can then be used in link budget calculations.

  12. US Geological Survey nutrient preservation experiment : experimental design, statistical analysis, and interpretation of analytical results

    USGS Publications Warehouse

    Patton, Charles J.; Gilroy, Edward J.

    1999-01-01

    Data on which this report is based, including nutrient concentrations in synthetic reference samples determined concurrently with those in real samples, are extensive (greater than 20,000 determinations) and have been published separately. In addition to confirming the well-documented instability of nitrite in acidified samples, this study also demonstrates that when biota are removed from samples at collection sites by 0.45-micrometer membrane filtration, subsequent preservation with sulfuric acid or mercury (II) provides no statistically significant improvement in nutrient concentration stability during storage at 4 degrees Celsius for 30 days. Biocide preservation had no statistically significant effect on the 30-day stability of phosphorus concentrations in whole-water splits from any of the 15 stations, but did stabilize Kjeldahl nitrogen concentrations in whole-water splits from three data-collection stations where ammonium accounted for at least half of the measured Kjeldahl nitrogen.

  13. A Monte Carlo analysis of the Viking lander dynamics at touchdown. [soft landing simulation

    NASA Technical Reports Server (NTRS)

    Muraca, R. J.; Campbell, J. W.; King, C. A.

    1975-01-01

    The performance of the Viking lander has been evaluated by using a Monte Carlo simulation, and all results are presented in statistical form. The primary objectives of this analysis were as follows: (1) to determine the three sigma design values of maximum rigid body accelerations and the minimum clearance of the lander body during landing; (2) to determine the probability of an unstable landing; and (3) to determine the probability of the lander body striking a rock. Two configurations were analyzed with the only difference being in the ability of the primary landing gear struts to carry tension loads.

  14. MORTICIA, a statistical analysis software package for determining optical surveillance system effectiveness.

    NASA Astrophysics Data System (ADS)

    Ramkilowan, A.; Griffith, D. J.

    2017-10-01

    Surveillance modelling in terms of the standard Detect, Recognise and Identify (DRI) thresholds remains a key requirement for determining the effectiveness of surveillance sensors. With readily available computational resources it has become feasible to perform statistically representative evaluations of the effectiveness of these sensors. A new capability for performing this Monte-Carlo type analysis is demonstrated in the MORTICIA (Monte-Carlo Optical Rendering for Theatre Investigations of Capability under the Influence of the Atmosphere) software package developed at the Council for Scientific and Industrial Research (CSIR). This first-generation, python-based, open-source integrated software package, currently in the alpha stage of development, aims to provide all the functionality required to perform statistical investigations of the effectiveness of optical surveillance systems in specific or generic deployment theatres. This includes modelling of the mathematical and physical processes that govern, among other components of a surveillance system, a sensor's detector and optical components, a target and its background, as well as the intervening atmospheric influences. In this paper we discuss integral aspects of the bespoke framework that are critical to the longevity of all subsequent modelling efforts. Additionally, some preliminary results are presented.

  15. Evaluation of the prediction precision capability of partial least squares regression approach for analysis of high alloy steel by laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Sarkar, Arnab; Karki, Vijay; Aggarwal, Suresh K.; Maurya, Gulab S.; Kumar, Rohit; Rai, Awadhesh K.; Mao, Xianglei; Russo, Richard E.

    2015-06-01

    Laser induced breakdown spectroscopy (LIBS) was applied for elemental characterization of high alloy steel using partial least squares regression (PLSR) with an objective to evaluate the analytical performance of this multivariate approach. The optimization of the number of principal components for minimizing error in the PLSR algorithm was investigated. The effect of different pre-treatment procedures on the raw spectral data before PLSR analysis was evaluated based on several statistical parameters (standard error of prediction, percentage relative error of prediction, etc.). The pre-treatment with the "NORM" parameter gave the optimum statistical results. The analytical performance of the PLSR model improved by increasing the number of laser pulses accumulated per spectrum as well as by truncating the spectrum to an appropriate wavelength region. It was found that the statistical benefit of truncating the spectrum can also be accomplished by increasing the number of laser pulses per accumulation without spectral truncation. The constituents (Co and Mo) present in hundreds of ppm were determined with a relative precision of 4-9% (2σ), whereas the major constituents Cr and Ni (present at a few percent levels) were determined with a relative precision of ~2% (2σ).
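
    A minimal sketch of the PLSR workflow described above, selecting the number of components by cross-validation after a normalization pre-treatment, is given below. scikit-learn's Normalizer is used only as a stand-in for the paper's "NORM" pre-treatment, and the component range and scoring metric are illustrative choices, not the authors' settings.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import Normalizer

      def fit_plsr(spectra, concentrations, max_components=15):
          """Pick the number of PLS components by cross-validation after normalization."""
          best_n, best_score = 1, -np.inf
          for n in range(1, max_components + 1):
              model = make_pipeline(Normalizer(norm="l2"), PLSRegression(n_components=n))
              score = cross_val_score(model, spectra, concentrations, cv=5,
                                      scoring="neg_root_mean_squared_error").mean()
              if score > best_score:
                  best_n, best_score = n, score
          return make_pipeline(Normalizer(norm="l2"),
                               PLSRegression(n_components=best_n)).fit(spectra, concentrations)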

  16. Influence of science and technology magnet middle schools on students' motivation and achievement in science

    NASA Astrophysics Data System (ADS)

    Allen, David

    Some informal discussions among educators regarding motivation of students and academic performance have included the topic of magnet schools. The premise is that a focused theme, such as an aspect of science, positively affects student motivation and academic achievement. However, there is limited research involving magnet schools and their influence on student motivation and academic performance. This study provides empirical data for the discussion about magnet schools influence on motivation and academic ability. This study utilized path analysis in a structural equation modeling framework to simultaneously investigate the relationships between demographic exogenous independent variables, the independent variable of attending a science or technology magnet middle school, and the dependent variables of motivation to learn science and academic achievement in science. Due to the categorical nature of the variables, Bayesian statistical analysis was used to calculate the path coefficients and the standardized effects for each relationship in the model. The coefficients of determination were calculated to determine the amount of variance each path explained. Only five of 21 paths had statistical significance. Only one of the five statistically significant paths (Attended Magnet School to Motivation to Learn Science) explained a noteworthy amount (45.8%) of the variance.

  17. Grid Resolution Study over Operability Space for a Mach 1.7 Low Boom External Compression Inlet

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.

    2014-01-01

    This paper presents a statistical methodology whereby the probability limits associated with CFD grid resolution of inlet flow analysis can be determined, providing quantitative information on the distribution of that error over the specified operability range. The objective of this investigation is to quantify the effects of both random (accuracy) and systemic (biasing) errors associated with grid resolution in the analysis of the Lockheed Martin Company (LMCO) N+2 Low Boom external compression supersonic inlet. The study covers the entire operability space as defined previously by the High Speed Civil Transport (HSCT) High Speed Research (HSR) program goals. The probability limits in terms of a 95.0% confidence interval on the analysis data were evaluated for four ARP1420 inlet metrics, namely (1) total pressure recovery (PFAIP), (2) radial hub distortion (DPH/P), (3) radial tip distortion (DPT/P), and (4) circumferential distortion (DPC/P). In general, the resulting +/-0.95 delta Y interval was unacceptably large in comparison to the stated goals of the HSCT program. Therefore, the conclusion was reached that the "standard grid" size was insufficient for this type of analysis. However, in examining the statistical data, it was determined that the CFD analysis results at the outer fringes of the operability space were the determining factor in the measure of statistical uncertainty. Adequate grids are grids that are free of biasing (systemic) errors and exhibit low random (precision) errors in comparison to their operability goals. In order to be 100% certain that the operability goals have indeed been achieved for each of the inlet metrics, the Y +/- 0.95 delta Y limit must fall inside the stated operability goals. For example, if the operability goal for DPC/P circumferential distortion is ≤ 0.06, then the forecast Y for DPC/P plus the 95% confidence interval on DPC/P, i.e. +/-0.95 delta Y, must all be less than or equal to 0.06.

  18. GIS and statistical analysis for landslide susceptibility mapping in the Daunia area, Italy

    NASA Astrophysics Data System (ADS)

    Mancini, F.; Ceppi, C.; Ritrovato, G.

    2010-09-01

    This study focuses on landslide susceptibility mapping in the Daunia area (Apulian Apennines, Italy) and achieves this by using a multivariate statistical method and data processing in a Geographical Information System (GIS). The Logistic Regression (hereafter LR) method was chosen to produce a susceptibility map over an area of 130 000 ha where small settlements are historically threatened by landslide phenomena. By means of LR analysis, the tendency to landslide occurrences was, therefore, assessed by relating a landslide inventory (dependent variable) to a series of causal factors (independent variables) which were managed in the GIS, while the statistical analyses were performed by means of the SPSS (Statistical Package for the Social Sciences) software. The LR analysis produced a reliable susceptibility map of the investigated area and the probability level of landslide occurrence was ranked in four classes. The overall performance achieved by the LR analysis was assessed by local comparison between the expected susceptibility and an independent dataset extrapolated from the landslide inventory. Of the samples classified as susceptible to landslide occurrences, 85% correspond to areas where landslide phenomena have actually occurred. In addition, the consideration of the regression coefficients provided by the analysis demonstrated that a major role is played by the "land cover" and "lithology" causal factors in determining the occurrence and distribution of landslide phenomena in the Apulian Apennines.
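
    A minimal sketch of the LR susceptibility-mapping idea, fitting a logistic regression of landslide occurrence on causal factors and ranking the predicted probabilities into four classes, is shown below. The inputs and the class cut-points are assumptions for illustration; the study itself performed the analysis in SPSS rather than Python.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # X, y: causal factors and landslide inventory (1 = landslide) for training cells;
      # X_all: the same factors for every terrain unit to be mapped (illustrative inputs)
      def susceptibility_map(X, y, X_all):
          model = LogisticRegression(max_iter=1000).fit(X, y)
          prob = model.predict_proba(X_all)[:, 1]          # probability of occurrence
          classes = np.digitize(prob, [0.25, 0.5, 0.75])   # four susceptibility classes
          return prob, classes, model.coef_                # coefficients show factor roles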

  19. QPROT: Statistical method for testing differential expression using protein-level intensity data in label-free quantitative proteomics.

    PubMed

    Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I

    2015-11-03

    We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for analysis using continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Determination of differential expression of each protein is based on a standardized Z-statistic computed from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known Empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework with an accompanying computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Orbit Determination (OD) Error Analysis Results for the Triana Sun-Earth L1 Libration Point Mission and for the Fourier Kelvin Stellar Interferometer (FKSI) Sun-Earth L2 Libration Point Mission Concept

    NASA Technical Reports Server (NTRS)

    Marr, Greg C.

    2003-01-01

    The Triana spacecraft was designed to be launched by the Space Shuttle. The nominal Triana mission orbit will be a Sun-Earth L1 libration point orbit. Using the NASA Goddard Space Flight Center's Orbit Determination Error Analysis System (ODEAS), orbit determination (OD) error analysis results are presented for all phases of the Triana mission from the first correction maneuver through approximately launch plus 6 months. Results are also presented for the science data collection phase of the Fourier Kelvin Stellar Interferometer Sun-Earth L2 libration point mission concept with momentum unloading thrust perturbations during the tracking arc. The Triana analysis includes extensive analysis of an initial short arc orbit determination solution and results using both Deep Space Network (DSN) and commercial Universal Space Network (USN) statistics. These results could be utilized in support of future Sun-Earth libration point missions.

  1. Analysis of determinations of the distance between the sun and the galactic center

    NASA Astrophysics Data System (ADS)

    Malkin, Z. M.

    2013-02-01

    The paper investigates the question of whether or not determinations of the distance between the Sun and the Galactic center R0 are affected by the so-called "bandwagon effect", leading to selection effects in published data that tend to be close to expected values, as was suggested by some authors. It is difficult to estimate numerically a systematic uncertainty in R0 due to the bandwagon effect; however, it is highly probable that, even if widely accepted values differ appreciably from the true value, the published results should eventually approach the true value despite the bandwagon effect. This should be manifest as a trend in the published R0 data: if this trend is statistically significant, the presence of the bandwagon effect can be suspected in the data. Fifty-two determinations of R0 published over the last 20 years were analyzed. These data reveal no statistically significant trend, suggesting they are unlikely to involve any systematic uncertainty due to the bandwagon effect. At the same time, the published data show a gradual and statistically significant decrease in the uncertainties in the R0 determinations with time.

  2. Quantifying Safety Margin Using the Risk-Informed Safety Margin Characterization (RISMC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia

    2015-04-26

    The Risk-Informed Safety Margin Characterization (RISMC), developed by Idaho National Laboratory as part of the Light-Water Reactor Sustainability Project, utilizes a probabilistic safety margin comparison between a load and capacity distribution, rather than a deterministic comparison between two values, as is usually done in best-estimate plus uncertainty analyses. The goal is to determine the failure probability, or in other words, the probability of the system load equaling or exceeding the system capacity. While this method has been used in pilot studies, there has been little work conducted investigating the statistical significance of the resulting failure probability. In particular, it is difficult to determine how many simulations are necessary to properly characterize the failure probability. This work uses classical (frequentist) statistics and confidence intervals to examine the impact in statistical accuracy when the number of simulations is varied. Two methods are proposed to establish confidence intervals related to the failure probability established using a RISMC analysis. The confidence interval provides information about the statistical accuracy of the method utilized to explore the uncertainty space, and offers a quantitative method to gauge the increase in statistical accuracy due to performing additional simulations.
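
    The probabilistic comparison described above reduces to estimating P(load >= capacity) from paired simulations and attaching a classical confidence interval to that proportion. The sketch below uses an exact (Clopper-Pearson) binomial interval as one concrete choice; it is an illustrative stand-in, not necessarily one of the two interval methods proposed in the report.

      import numpy as np
      from scipy import stats

      def failure_probability_ci(load_samples, capacity_samples, confidence=0.95):
          """P(load >= capacity) from paired simulations, with an exact
          Clopper-Pearson binomial confidence interval."""
          load = np.asarray(load_samples)
          capacity = np.asarray(capacity_samples)
          n = load.size
          k = int(np.sum(load >= capacity))       # number of simulated failures
          p_hat = k / n
          alpha = 1.0 - confidence
          lower = stats.beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
          upper = stats.beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
          return p_hat, lower, upper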

  3. Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions

    NASA Technical Reports Server (NTRS)

    Mcguire, Stephen C.

    1987-01-01

    The use is described of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, Fe approaches 0 at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.

  4. PCR evaluation : considering transition from manual to semi-automated pavement distress collection and analysis.

    DOT National Transportation Integrated Search

    2013-07-01

    This study is designed to assist the Ohio Department of Transportation (ODOT) in determining : whether transitioning from manual to state-of the-practice semi-automated pavement distress : data collection is feasible and recommended. Statistical and ...

  5. Thin layer asphaltic concrete density measuring using nuclear gages.

    DOT National Transportation Integrated Search

    1989-03-01

    A Troxler 4640 thin layer nuclear gage was evaluated under field conditions to determine if it would provide improved accuracy of density measurements on asphalt overlays of 1-3/4 and 2 inches in thickness. Statistical analysis shows slightly improve...

  6. Statistical analysis plan for the Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART). A randomized controlled trial

    PubMed Central

    Damiani, Lucas Petri; Berwanger, Otavio; Paisani, Denise; Laranjeira, Ligia Nasi; Suzumura, Erica Aranha; Amato, Marcelo Britto Passos; Carvalho, Carlos Roberto Ribeiro; Cavalcanti, Alexandre Biasi

    2017-01-01

    Background: The Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART) is an international multicenter randomized pragmatic controlled trial with allocation concealment involving 120 intensive care units in Brazil, Argentina, Colombia, Italy, Poland, Portugal, Malaysia, Spain, and Uruguay. The primary objective of ART is to determine whether maximum stepwise alveolar recruitment associated with PEEP titration, adjusted according to the static compliance of the respiratory system (ART strategy), is able to increase 28-day survival in patients with acute respiratory distress syndrome compared to conventional treatment (ARDSNet strategy). Objective: To describe the data management process and statistical analysis plan. Methods: The statistical analysis plan was designed by the trial executive committee and reviewed and approved by the trial steering committee. We provide an overview of the trial design with a special focus on describing the primary (28-day survival) and secondary outcomes. We describe our data management process, data monitoring committee, interim analyses, and sample size calculation. We describe our planned statistical analyses for primary and secondary outcomes as well as pre-specified subgroup analyses. We also provide details for presenting results, including mock tables for baseline characteristics, adherence to the protocol and effect on clinical outcomes. Conclusion: According to best trial practice, we report our statistical analysis plan and data management plan prior to locking the database and beginning analyses. We anticipate that this document will prevent analysis bias and enhance the utility of the reported results. Trial registration: ClinicalTrials.gov number, NCT01374022. PMID:28977255

  7. Improved Diagnostic Accuracy of SPECT Through Statistical Analysis and the Detection of Hot Spots at the Primary Sensorimotor Area for the Diagnosis of Alzheimer Disease in a Community-Based Study: "The Osaki-Tajiri Project".

    PubMed

    Kaneta, Tomohiro; Nakatsuka, Masahiro; Nakamura, Kei; Seki, Takashi; Yamaguchi, Satoshi; Tsuboi, Masahiro; Meguro, Kenichi

    2016-01-01

    SPECT is an important diagnostic tool for dementia. Recently, statistical analysis of SPECT has been commonly used for dementia research. In this study, we evaluated the accuracy of visual SPECT evaluation and/or statistical analysis for the diagnosis (Dx) of Alzheimer disease (AD) and other forms of dementia in our community-based study "The Osaki-Tajiri Project." Eighty-nine consecutive outpatients with dementia were enrolled and underwent brain perfusion SPECT with 99mTc-ECD. Diagnostic accuracy of SPECT was tested using 3 methods: visual inspection (SPECT Dx), automated diagnostic tool using statistical analysis with easy Z-score imaging system (eZIS Dx), and visual inspection plus eZIS (integrated Dx). Integrated Dx showed the highest sensitivity, specificity, and accuracy, whereas eZIS was the second most accurate method. We also observed that a higher than expected rate of SPECT images indicated false-negative cases of AD. Among these, 50% showed hypofrontality and were diagnosed as frontotemporal lobar degeneration. These cases typically showed regional "hot spots" in the primary sensorimotor cortex (ie, a sensorimotor hot spot sign), which we determined were associated with AD rather than frontotemporal lobar degeneration. We concluded that the diagnostic abilities were improved by the integrated use of visual assessment and statistical analysis. In addition, the detection of a sensorimotor hot spot sign was useful to detect AD when hypofrontality is present and improved the ability to properly diagnose AD.

  8. Citation Patterns of Engineering, Statistics, and Computer Science Researchers: An Internal and External Citation Analysis across Multiple Engineering Subfields

    ERIC Educational Resources Information Center

    Kelly, Madeline

    2015-01-01

    This study takes a multidimensional approach to citation analysis, examining citations in multiple subfields of engineering, from both scholarly journals and doctoral dissertations. The three major goals of the study are to determine whether there are differences between citations drawn from dissertations and those drawn from journal articles; to…

  9. How Will DSM-5 Affect Autism Diagnosis? A Systematic Literature Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Kulage, Kristine M.; Smaldone, Arlene M.; Cohn, Elizabeth G.

    2014-01-01

    We conducted a systematic review and meta-analysis to determine the effect of changes to the Diagnostic and Statistical Manual (DSM)-5 on autism spectrum disorder (ASD) and explore policy implications. We identified 418 studies; 14 met inclusion criteria. Studies consistently reported decreases in ASD diagnosis (range 7.3-68.4%) using DSM-5…

  10. Predicting Risk Sensitivity in Humans and Lower Animals: Risk as Variance or Coefficient of Variation

    ERIC Educational Resources Information Center

    Weber, Elke U.; Shafir, Sharoni; Blais, Ann-Renee

    2004-01-01

    This article examines the statistical determinants of risk preference. In a meta-analysis of animal risk preference (foraging birds and insects), the coefficient of variation (CV), a measure of risk per unit of return, predicts choices far better than outcome variance, the risk measure of normative models. In a meta-analysis of human risk…

  11. Landsat real-time processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, E.L.

    A novel method for performing real-time acquisition and processing Landsat/EROS data covers all aspects including radiometric and geometric corrections of multispectral scanner or return-beam vidicon inputs, image enhancement, statistical analysis, feature extraction, and classification. Radiometric transformations include bias/gain adjustment, noise suppression, calibration, scan angle compensation, and illumination compensation, including topography and atmospheric effects. Correction or compensation for geometric distortion includes sensor-related distortions, such as centering, skew, size, scan nonlinearity, radial symmetry, and tangential symmetry. Also included are object image-related distortions such as aspect angle (altitude), scale distortion (altitude), terrain relief, and earth curvature. Ephemeral corrections are also applied to compensate for satellite forward movement, earth rotation, altitude variations, satellite vibration, and mirror scan velocity. Image enhancement includes high-pass, low-pass, and Laplacian mask filtering and data restoration for intermittent losses. Resource classification is provided by statistical analysis including histograms, correlational analysis, matrix manipulations, and determination of spectral responses. Feature extraction includes spatial frequency analysis, which is used in parallel discriminant functions in each array processor for rapid determination. The technique uses integrated parallel array processors that decimate the tasks concurrently under supervision of a control processor. The operator-machine interface is optimized for programming ease and graphics image windowing.

  12. A standards-based method for compositional analysis by energy dispersive X-ray spectrometry using multivariate statistical analysis: application to multicomponent alloys.

    PubMed

    Rathi, Monika; Ahrenkiel, S P; Carapella, J J; Wanlass, M W

    2013-02-01

    Given an unknown multicomponent alloy, and a set of standard compounds or alloys of known composition, can one improve upon popular standards-based methods for energy dispersive X-ray (EDX) spectrometry to quantify the elemental composition of the unknown specimen? A method is presented here for determining elemental composition of alloys using transmission electron microscopy-based EDX with appropriate standards. The method begins with a discrete set of related reference standards of known composition, applies multivariate statistical analysis to those spectra, and evaluates the compositions with a linear matrix algebra method to relate the spectra to elemental composition. By using associated standards, only limited assumptions about the physical origins of the EDX spectra are needed. Spectral absorption corrections can be performed by providing an estimate of the foil thickness of one or more reference standards. The technique was applied to III-V multicomponent alloy thin films: composition and foil thickness were determined for various III-V alloys. The results were then validated by comparing with X-ray diffraction and photoluminescence analysis, demonstrating accuracy of approximately 1% in atomic fraction.

  13. A factor analysis of the SSQ (Speech, Spatial, and Qualities of Hearing Scale).

    PubMed

    Akeroyd, Michael A; Guy, Fiona H; Harrison, Dawn L; Suller, Sharon L

    2014-02-01

    The speech, spatial, and qualities of hearing questionnaire (SSQ) is a self-report test of auditory disability. The 49 items ask how well a listener would do in many complex listening situations illustrative of real life. The scores on the items are often combined into the three main sections or into 10 pragmatic subscales. We report here a factor analysis of the SSQ that we conducted to further investigate its statistical properties and to determine its structure. Statistical factor analysis of questionnaire data, using parallel analysis to determine the number of factors to retain, oblique rotation of factors, and a bootstrap method to estimate the confidence intervals. 1220 people who have attended MRC IHR over the last decade. We found three clear factors, essentially corresponding to the three main sections of the SSQ. They are termed "speech understanding", "spatial perception", and "clarity, separation, and identification". Thirty-five of the SSQ questions were included in the three factors. There was partial evidence for a fourth factor, "effort and concentration", representing two more questions. These results aid in the interpretation and application of the SSQ and indicate potential methods for generating average scores.

  14. Kinetic analysis of single molecule FRET transitions without trajectories

    NASA Astrophysics Data System (ADS)

    Schrangl, Lukas; Göhring, Janett; Schütz, Gerhard J.

    2018-03-01

    Single molecule Förster resonance energy transfer (smFRET) is a popular tool to study biological systems that undergo topological transitions on the nanometer scale. smFRET experiments typically require recording of long smFRET trajectories and subsequent statistical analysis to extract parameters such as the states' lifetimes. Alternatively, analysis of probability distributions exploits the shapes of smFRET distributions at well chosen exposure times and hence works without the acquisition of time traces. Here, we describe a variant that utilizes statistical tests to compare experimental datasets with Monte Carlo simulations. For a given model, parameters are varied to cover the full realistic parameter space. As output, the method yields p-values which quantify the likelihood for each parameter setting to be consistent with the experimental data. The method provides suitable results even if the actual lifetimes differ by an order of magnitude. We also demonstrated the robustness of the method to inaccurately determined input parameters. As proof of concept, the new method was applied to the determination of transition rate constants for Holliday junctions.
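
    A minimal sketch of the compare-to-simulation idea is given below: for each candidate transition rate, a Monte Carlo sample is generated and compared with the experimental dataset by a two-sample Kolmogorov-Smirnov test, yielding a p-value per parameter setting. The KS test is used here only as a generic example of the statistical tests mentioned above, and `simulate` is a hypothetical user-supplied function, not part of the published method.

      from scipy.stats import ks_2samp

      def parameter_scan(experimental, simulate, rate_grid):
          """p-value of a two-sample KS test for each candidate transition rate.

          `simulate(rate)` is a hypothetical user-supplied function returning a
          simulated sample (e.g. FRET proximity ratios) for that rate; settings
          whose p-value is not small are consistent with the experimental data."""
          return {rate: ks_2samp(experimental, simulate(rate)).pvalue
                  for rate in rate_grid}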

  15. Statistical issues in quality control of proteomic analyses: good experimental design and planning.

    PubMed

    Cairns, David A

    2011-03-01

    Quality control is becoming increasingly important in proteomic investigations as experiments become more multivariate and quantitative. Quality control applies to all stages of an investigation and statistics can play a key role. In this review, the role of statistical ideas in the design and planning of an investigation is described. This involves the design of unbiased experiments using key concepts from statistical experimental design, the understanding of the biological and analytical variation in a system using variance components analysis and the determination of a required sample size to perform a statistically powerful investigation. These concepts are described through simple examples and an example data set from a 2-D DIGE pilot experiment. Each of these concepts can prove useful in producing better and more reproducible data. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. The human chromosomal fragile sites more often involved in constitutional deletions and duplications - A genetic and statistical assessment

    NASA Astrophysics Data System (ADS)

    Gomes, Dora Prata; Sequeira, Inês J.; Figueiredo, Carlos; Rueff, José; Brás, Aldina

    2016-12-01

    Human chromosomal fragile sites (CFSs) are heritable loci or regions of the human chromosomes prone to exhibit gaps, breaks and rearrangements. Determining the frequency of deletions and duplications in CFSs may contribute to explain the occurrence of human disease due to those rearrangements. In this study we analyzed the frequency of deletions and duplications in each human CFS. Statistical methods, namely data display, descriptive statistics and linear regression analysis were applied to analyze this dataset. We found that FRA15C, FRA16A and FRAXB are the most frequently involved CFSs in deletions and duplications occurring in the human genome.

  17. Linear retrieval and global measurements of wind speed from the Seasat SMMR

    NASA Technical Reports Server (NTRS)

    Pandey, P. C.

    1983-01-01

    Retrievals of wind speed (WS) from Seasat Scanning Multichannel Microwave Radiometer (SMMR) were performed using a two-step statistical technique. Nine subsets of two to five SMMR channels were examined for wind speed retrieval. These subsets were derived by using a leaps and bound procedure based on the coefficient of determination selection criteria to a statistical data base of brightness temperatures and geophysical parameters. Analysis of Monsoon Experiment and ocean station PAPA data showed a strong correlation between sea surface temperature and water vapor. This relation was used in generating the statistical data base. Global maps of WS were produced for one and three month periods.

  18. Statistical crystallography of surface micelle spacing

    NASA Technical Reports Server (NTRS)

    Noever, David A.

    1992-01-01

    The aggregation of the recently reported surface micelles of block polyelectrolytes is analyzed using techniques of statistical crystallography. A polygonal lattice (Voronoi mosaic) connects center-to-center points, yielding statistical agreement with crystallographic predictions; Aboav-Weaire's law and Lewis's law are verified. This protocol supplements the standard analysis of surface micelles leading to aggregation number determination and, when compared to numerical simulations, allows further insight into the random partitioning of surface films. In particular, agreement with Lewis's law has been linked to the geometric packing requirements of filling two-dimensional space which compete with (or balance) physical forces such as interfacial tension, electrostatic repulsion, and van der Waals attraction.
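
    The sketch below illustrates the crystallographic protocol in its simplest form: build the Voronoi mosaic from micelle centre points and tabulate mean cell area against the number of sides, the linear relation predicted by Lewis's law. The Aboav-Weaire statistics and the treatment of boundary cells in the original analysis are not reproduced; unbounded cells are simply discarded in this sketch.

      import numpy as np
      from scipy.spatial import Voronoi

      def lewis_law_table(centers):
          """Voronoi mosaic of micelle centres: mean cell area by number of sides.

          Lewis's law predicts roughly linear growth of mean area with side number.
          Cells touching infinity (boundary cells) are discarded."""
          vor = Voronoi(centers)
          sides, areas = [], []
          for region_idx in vor.point_region:
              region = vor.regions[region_idx]
              if not region or -1 in region:
                  continue
              poly = vor.vertices[region]
              x, y = poly[:, 0], poly[:, 1]
              # shoelace formula for the polygon area
              area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
              sides.append(len(region))
              areas.append(area)
          sides, areas = np.array(sides), np.array(areas)
          return {n: areas[sides == n].mean() for n in np.unique(sides)}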

  19. A bootstrap based Neyman-Pearson test for identifying variable importance.

    PubMed

    Ditzler, Gregory; Polikar, Robi; Rosen, Gail

    2015-04-01

    Selection of the most informative features, leading to a small loss on future data, is arguably one of the most important steps in classification, data analysis and model selection. Several feature selection (FS) algorithms are available; however, due to noise present in any data set, FS algorithms are typically accompanied by an appropriate cross-validation scheme. In this brief, we propose a statistical hypothesis test derived from the Neyman-Pearson lemma for determining if a feature is statistically relevant. The proposed approach can be applied as a wrapper to any FS algorithm, regardless of the FS criteria used by that algorithm, to determine whether a feature belongs in the relevant set. Perhaps more importantly, this procedure efficiently determines the number of relevant features given an initial starting point. We provide freely available software implementations of the proposed methodology.

  20. The sumLINK statistic for genetic linkage analysis in the presence of heterogeneity.

    PubMed

    Christensen, G B; Knight, S; Camp, N J

    2009-11-01

    We present the "sumLINK" statistic--the sum of multipoint LOD scores for the subset of pedigrees with nominally significant linkage evidence at a given locus--as an alternative to common methods to identify susceptibility loci in the presence of heterogeneity. We also suggest the "sumLOD" statistic (the sum of positive multipoint LOD scores) as a companion to the sumLINK. sumLINK analysis identifies genetic regions of extreme consistency across pedigrees without regard to negative evidence from unlinked or uninformative pedigrees. Significance is determined by an innovative permutation procedure based on genome shuffling that randomizes linkage information across pedigrees. This procedure for generating the empirical null distribution may be useful for other linkage-based statistics as well. Using 500 genome-wide analyses of simulated null data, we show that the genome shuffling procedure results in the correct type 1 error rates for both the sumLINK and sumLOD. The power of the statistics was tested using 100 sets of simulated genome-wide data from the alternative hypothesis from GAW13. Finally, we illustrate the statistics in an analysis of 190 aggressive prostate cancer pedigrees from the International Consortium for Prostate Cancer Genetics, where we identified a new susceptibility locus. We propose that the sumLINK and sumLOD are ideal for collaborative projects and meta-analyses, as they do not require any sharing of identifiable data between contributing institutions. Further, loci identified with the sumLINK have good potential for gene localization via statistical recombinant mapping, as, by definition, several linked pedigrees contribute to each peak.

  1. Maintenance therapy with sucralfate in duodenal ulcer: genuine prevention or accelerated healing of ulcer recurrence?

    PubMed

    Bynum, T E; Koch, G G

    1991-08-08

    We sought to compare the efficacy of sucralfate to placebo for the prevention of duodenal ulcer recurrence and to determine whether the efficacy of sucralfate was due to a true reduction in ulcer prevalence and not to secondary effects such as analgesic activity or accelerated healing. This was a double-blind, randomized, placebo-controlled, parallel-groups, multicenter clinical study with 254 patients. All patients had a past history of at least two duodenal ulcers, with at least one ulcer diagnosed by endoscopic examination 3 months or less before the start of the study. Complete ulcer healing without erosions was required to enter the study. Sucralfate or placebo was dosed as a 1-g tablet twice a day for 4 months, or until ulcer recurrence. Endoscopic examinations once a month and when symptoms developed determined the presence or absence of duodenal ulcers. If a patient developed an ulcer between monthly scheduled visits, the patient was dosed with a 1-g sucralfate tablet twice a day until the next scheduled visit. Statistical analyses of the results determined the efficacy of sucralfate compared with placebo for preventing duodenal ulcer recurrence. Comparisons of therapeutic agents for preventing duodenal ulcers have usually been made by testing for statistical differences in the cumulative rates for all ulcers developed during a follow-up period, regardless of the time of detection. Statistical experts at the United States Food and Drug Administration (FDA) and on the FDA Advisory Panel expressed doubts about clinical study results based on this type of analysis. They suggested three possible mechanisms for reducing the number of observed ulcers: (a) analgesic effects, (b) accelerated healing, and (c) true ulcer prevention. Traditional ulcer analysis could miss recurring ulcers due to an analgesic effect or accelerated healing. Point-prevalence analysis could miss recurring ulcers due to accelerated healing between endoscopic examinations. Maximum ulcer analysis, a novel statistical method, eliminated analgesic effects through regularly scheduled endoscopies and accounted for accelerated healing of recurring ulcers through frequent endoscopies and an open-label phase. Maximum ulcer analysis therefore reflects true ulcer recurrence and prevention. Sucralfate was significantly superior to placebo in reducing ulcer prevalence by all analyses. Significance (p less than 0.05) was found at months 3 and 4 for all analyses. All months were significant in the traditional analysis, months 2-4 in the point-prevalence analysis, and months 3-4 in the maximum ulcer prevalence analysis. Sucralfate was shown to be effective for the prevention of duodenal ulcer recurrence through a true reduction in new ulcer development.

  2. PSSMSearch: a server for modeling, visualization, proteome-wide discovery and annotation of protein motif specificity determinants.

    PubMed

    Krystkowiak, Izabella; Manguy, Jean; Davey, Norman E

    2018-06-05

    There is a pressing need for in silico tools that can aid in the identification of the complete repertoire of protein binding (SLiMs, MoRFs, miniMotifs) and modification (moiety attachment/removal, isomerization, cleavage) motifs. We have created PSSMSearch, an interactive web-based tool for rapid statistical modeling, visualization and annotation of protein motif specificity determinants, and for the discovery of novel motifs in a proteome-wide manner. PSSMSearch analyses proteomes for regions with significant similarity to a motif specificity determinant model built from a set of aligned motif-containing peptides. Multiple scoring methods are available to build a position-specific scoring matrix (PSSM) describing the motif specificity determinant model. This model can then be modified by a user to add prior knowledge of specificity determinants through an interactive PSSM heatmap. PSSMSearch includes a statistical framework to calculate the significance of specificity determinant model matches against a proteome of interest. PSSMSearch also includes the SLiMSearch framework's annotation, motif functional analysis and filtering tools to highlight relevant discriminatory information. Additional tools to annotate statistically significant shared keywords and GO terms, or experimental evidence of interaction with a motif-recognizing protein, have been added. Finally, PSSM-based conservation metrics have been created for taxonomic range analyses. The PSSMSearch web server is available at http://slim.ucd.ie/pssmsearch/.
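
    To make the core PSSM idea concrete, the sketch below builds a log-odds matrix from a handful of hypothetical aligned peptides (with a +1 pseudocount and a uniform amino-acid background, both simplifying assumptions) and scans a toy sequence for high-scoring windows. This is a generic illustration, not the scoring or significance machinery used by PSSMSearch.

        import numpy as np

        AA = "ACDEFGHIKLMNPQRSTVWY"
        peptides = ["RRASLP", "RRASVA", "KRASLP", "RRPSLP"]   # hypothetical aligned motif instances
        L = len(peptides[0])

        counts = np.ones((L, len(AA)))                        # +1 pseudocount at every position
        for pep in peptides:
            for i, aa in enumerate(pep):
                counts[i, AA.index(aa)] += 1
        freqs = counts / counts.sum(axis=1, keepdims=True)
        background = np.full(len(AA), 1.0 / len(AA))          # uniform background (simplification)
        pssm = np.log2(freqs / background)                    # log-odds scoring matrix

        def scan(sequence, pssm):
            # score every window of length L and return the best three
            hits = []
            for start in range(len(sequence) - L + 1):
                window = sequence[start:start + L]
                score = sum(pssm[i, AA.index(aa)] for i, aa in enumerate(window))
                hits.append((score, start, window))
            return sorted(hits, reverse=True)[:3]

        print(scan("MSRRASLPGQKRPSVAERRASVAK", pssm))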

  3. Urological research in sub-Saharan Africa: a retrospective cohort study of abstracts presented at the Nigerian Association of Urological Surgeons conferences.

    PubMed

    Bello, Jibril Oyekunle

    2013-11-14

    Nigeria is one of the top three countries in Africa in terms of science research output, and Nigerian urologists' biomedical research output contributes to this. Each year, urologists in Nigeria gather to present their recent research at the conference of the Nigerian Association of Urological Surgeons (NAUS). These abstracts are not as thoroughly vetted as full-length manuscripts published in peer-reviewed journals, but the information they disseminate may affect the clinical practice of attendees. This study aims to describe the characteristics of abstracts presented at the annual conferences of NAUS, the quality of the abstracts as determined by the subsequent publication of full-length manuscripts in peer-reviewed, indexed journals, and the factors that influence such successful publication. Abstracts presented at the 2007 to 2010 NAUS conferences were identified through conference abstract books. Using a strict search protocol, publication in peer-reviewed journals was determined. The abstracts' characteristics were analyzed and their quality judged by subsequent successful publishing of full-length manuscripts. Statistical analysis was performed using SPSS 16.0 software to determine factors predictive of successful publication. Only 75 abstracts were presented at the NAUS 2007 to 2010 conferences; a quarter (24%) of the presented abstracts were subsequently published as full-length manuscripts. Median time to publication was 15 months (range 2-40 months). Manuscripts whose result data were analyzed with 'beyond basic' statistics of frequencies and averages were more likely to be published than those with basic or no statistics. Quality of the abstracts, and thus subsequent publication success, is influenced by the use of 'beyond basic' statistics in analysis of the result data presented. There is a need for improvement in the quality of urological research from Nigeria.

  4. A Critical Analysis of U.S. Army Accessions through Socioeconomic Consideration between 1970 and 1984.

    DTIC Science & Technology

    1985-06-01

    ...determine the socioeconomic representativeness of the Army's enlistees in that particular year. In addition, the socioeconomic overview of Republic of...accomplished with the use of the Statistical Analysis System (SAS), an integrated computer system for data analysis.

  5. Statistics and quality assurance for the Northern Research Station Forest Inventory and Analysis Program

    Treesearch

    Dale D. Gormanson; Scott A. Pugh; Charles J. Barnett; Patrick D. Miles; Randall S. Morin; Paul A. Sowers; James A. Westfall

    2018-01-01

    The U.S. Forest Service Forest Inventory and Analysis (FIA) program collects sample plot data on all forest ownerships across the United States. FIA’s primary objective is to determine the extent, condition, volume, growth, and use of trees on the Nation’s forest land through a comprehensive inventory and analysis of the Nation’s forest resources. The FIA program...

  6. Neuromathematical Trichotomous Mixed Methods Analysis: Using the Neuroscientific Tri-Squared Test Statistical Metric as a Post Hoc Analytic to Determine North Carolina School of Science and Mathematics Leadership Efficacy

    ERIC Educational Resources Information Center

    Osler, James Edward, II; Mason, Letita R.

    2016-01-01

    This study examines the leadership efficacy amongst graduates of The North Carolina School of Science and Mathematics (NCSSM) for the classes of 2000 through 2007 from a neuroscientific and neuromathematic perspective. NCSSM alumni (as the primary unit of analysis) were examined using a novel neuromathematic post hoc method of analysis. This study…

  7. Analysis of counting data: Development of the SATLAS Python package

    NASA Astrophysics Data System (ADS)

    Gins, W.; de Groote, R. P.; Bissell, M. L.; Granados Buitrago, C.; Ferrer, R.; Lynch, K. M.; Neyens, G.; Sels, S.

    2018-01-01

    For the analysis of low-statistics counting experiments, a traditional nonlinear least squares minimization routine may not always provide correct parameter and uncertainty estimates due to the assumptions inherent in the algorithm(s). In response to this, a user-friendly Python package (SATLAS) was written to provide an easy interface between the data and a variety of minimization algorithms which are suited for analyzing low-, as well as high-, statistics data. The advantage of this package is that it allows the user to define their own model function and then compare different minimization routines to determine the optimal parameter values and their respective (correlated) errors. Experimental validation of the different approaches in the package is done through analysis of hyperfine structure data of 203Fr gathered by the CRIS experiment at ISOLDE, CERN.
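
    The sketch below shows why the choice of cost function matters for low-count spectra: the same synthetic Gaussian-plus-background data are fitted once by ordinary least squares (which implicitly assumes Gaussian errors) and once by a Poisson likelihood. It uses plain scipy and an invented model; it does not reproduce the SATLAS interface.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(3)
        x = np.linspace(-10, 10, 80)

        def model(x, amp, centre, width, bkg):
            return bkg + amp * np.exp(-0.5 * ((x - centre) / width) ** 2)

        true = (5.0, 1.0, 2.0, 0.5)
        counts = rng.poisson(model(x, *true))                 # low-statistics counting data

        def chi2(p):                                          # least squares: assumes Gaussian errors
            return np.sum((counts - model(x, *p)) ** 2)

        def neg_loglike(p):                                   # Poisson likelihood: appropriate for counts
            mu = np.clip(model(x, *p), 1e-12, None)
            return np.sum(mu - counts * np.log(mu))

        p0 = (3.0, 0.0, 1.0, 1.0)
        print("chi-square fit:", minimize(chi2, p0, method="Nelder-Mead").x)
        print("Poisson ML fit:", minimize(neg_loglike, p0, method="Nelder-Mead").x)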

  8. Evaluation of the Kinetic Property of Single-Molecule Junctions by Tunneling Current Measurements.

    PubMed

    Harashima, Takanori; Hasegawa, Yusuke; Kiguchi, Manabu; Nishino, Tomoaki

    2018-01-01

    We investigated the formation and breaking of single-molecule junctions of two kinds of dithiol molecules by time-resolved tunneling current measurements in a metal nanogap. The resulting current trajectory was statistically analyzed to determine the single-molecule conductance and, more importantly, to reveal the kinetic property of the single-molecular junction. These results suggested that combining a measurement of the single-molecule conductance and statistical analysis is a promising method to uncover the kinetic properties of the single-molecule junction.

  9. Are conventional statistical techniques exhaustive for defining metal background concentrations in harbour sediments? A case study: The Coastal Area of Bari (Southeast Italy).

    PubMed

    Mali, Matilda; Dell'Anna, Maria Michela; Mastrorilli, Piero; Damiani, Leonardo; Ungaro, Nicola; Belviso, Claudia; Fiore, Saverio

    2015-11-01

    Sediment contamination by metals poses significant risks to coastal ecosystems and is considered to be problematic for dredging operations. The determination of the background values of metal and metalloid distribution based on site-specific variability is fundamental in assessing pollution levels in harbour sediments. The novelty of the present work consists of addressing the scope and limitation of analysing port sediments through the use of conventional statistical techniques (such as: linear regression analysis, construction of cumulative frequency curves and the iterative 2σ technique), that are commonly employed for assessing Regional Geochemical Background (RGB) values in coastal sediments. This study ascertained that although the tout court use of such techniques in determining the RGB values in harbour sediments seems appropriate (the chemical-physical parameters of port sediments fit well with statistical equations), it should nevertheless be avoided because it may be misleading and can mask key aspects of the study area that can only be revealed by further investigations, such as mineralogical and multivariate statistical analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
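
    Of the conventional techniques listed, the iterative 2σ technique is the easiest to sketch: values outside mean ± 2·sd are trimmed repeatedly until the data set stabilises, and the surviving range is taken as the background. The concentrations below are simulated (a clean background plus a contaminated tail), purely for illustration.

        import numpy as np

        def iterative_two_sigma(values, k=2.0, max_iter=100):
            # repeatedly trim values outside mean +/- k*sd until nothing more is removed
            data = np.asarray(values, dtype=float)
            for _ in range(max_iter):
                mean, sd = data.mean(), data.std(ddof=1)
                kept = data[(data >= mean - k * sd) & (data <= mean + k * sd)]
                if len(kept) == len(data):        # converged
                    break
                data = kept
            return data.mean(), data.std(ddof=1)

        rng = np.random.default_rng(4)
        # hypothetical metal concentrations: natural background plus a contaminated tail
        concentrations = np.concatenate([rng.normal(30, 5, 180), rng.normal(120, 30, 20)])
        mean, sd = iterative_two_sigma(concentrations)
        print(f"estimated background range: {mean:.1f} +/- {2 * sd:.1f} (mean +/- 2 sd)")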

  10. Mediation analysis in nursing research: a methodological review

    PubMed Central

    Liu, Jianghong; Ulrich, Connie

    2017-01-01

    Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask – and answer – more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science. PMID:26176804
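
    As a concrete illustration of the technique the tutorial introduces, the sketch below runs a single-mediator analysis with the product-of-coefficients approach on simulated data using statsmodels; the variable names (intervention, mediator, outcome) and effect sizes are illustrative only, and a real analysis would also bootstrap a confidence interval for the indirect effect.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(5)
        n = 500
        x = rng.normal(size=n)                        # predictor (e.g., an intervention)
        m = 0.5 * x + rng.normal(size=n)              # mediator (e.g., self-efficacy)
        y = 0.4 * m + 0.1 * x + rng.normal(size=n)    # outcome

        a = sm.OLS(m, sm.add_constant(x)).fit()                          # path a: X -> M
        b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit()    # paths b (M -> Y) and c' (X -> Y)

        indirect = a.params[1] * b.params[1]          # a*b, the mediated (indirect) effect
        print(f"indirect effect a*b = {indirect:.3f}, direct effect c' = {b.params[2]:.3f}")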

  11. Baseline estimation in flame's spectra by using neural networks and robust statistics

    NASA Astrophysics Data System (ADS)

    Garces, Hugo; Arias, Luis; Rojas, Alejandro

    2014-09-01

    This work presents a baseline estimation method for flame spectra based on an artificial intelligence structure, a neural network, combining robust statistics with multivariate analysis to automatically discriminate the measured wavelengths belonging to the continuous (baseline) feature for model adaptation, thereby removing the restriction of having to measure the target baseline for training. The main contributions of this paper are: to analyze a flame spectra database by computing Jolliffe statistics from Principal Components Analysis, detecting wavelengths not correlated with most of the measured data and therefore corresponding to baseline; to systematically determine the optimal number of neurons in the hidden layers based on Akaike's Final Prediction Error; to estimate the baseline over the full wavelength range of the sampled spectra; and to train a neural network which generalizes the relation between measured and baseline spectra. The main application of our research is to compute total radiation with baseline information, allowing diagnosis of the combustion process state for optimization in early stages.

  12. Application of Digital Image Analysis to Determine Pancreatic Islet Mass and Purity in Clinical Islet Isolation and Transplantation

    PubMed Central

    Wang, Ling-jia; Kissler, Hermann J; Wang, Xiaojun; Cochet, Olivia; Krzystyniak, Adam; Misawa, Ryosuke; Golab, Karolina; Tibudan, Martin; Grzanka, Jakub; Savari, Omid; Grose, Randall; Kaufman, Dixon B; Millis, Michael; Witkowski, Piotr

    2015-01-01

    Pancreatic islet mass, represented by islet equivalent (IEQ), is the most important parameter in decision making for clinical islet transplantation. To obtain IEQ, the sample of islets is routinely counted manually under a microscope and discarded thereafter. Islet purity, another parameter in islet processing, is routinely acquired by estimation only. In this study, we validated our digital image analysis (DIA) system, developed using the software Image Pro Plus, for islet mass and purity assessment. Application of the DIA allows better compliance with current good manufacturing practice (cGMP) standards. Human islet samples were captured as calibrated digital images for the permanent record. Five trained technicians participated in determination of IEQ and purity by the manual counting method and DIA. IEQ count showed statistically significant correlations between the manual method and DIA in all sample comparisons (r > 0.819 and p < 0.0001). A statistically significant difference in IEQ between the two methods was found only in the high-purity 100 μL sample group (p = 0.029). As for purity determination, statistically significant differences between manual assessment and DIA measurement were found in the high- and low-purity 100 μL samples (p < 0.005). In addition, islet particle number (IPN) and the IEQ/IPN ratio did not differ statistically between the manual counting method and DIA. In conclusion, the DIA used in this study is a reliable technique for determination of IEQ and purity. Islet samples preserved as digital images and results produced by DIA can be permanently stored for verification, technical training and islet information exchange between different islet centers. Therefore, DIA complies better with cGMP requirements than the manual counting method. We propose DIA as a quality control tool to supplement the established standard manual method for islet counting and purity estimation. PMID:24806436

  13. STATISTICAL METHOD FOR DETECTION OF A TREND IN ATMOSPHERIC SULFATE

    EPA Science Inventory

    Daily atmospheric concentrations of sulfate collected in northeastern Pennsylvania are regressed against meteorological factors, ozone, and time in order to determine if a significant trend in sulfate can be detected. The data used in this analysis were collected during the Sulfat...

  14. Transportation safety data and analysis : Volume 1, Analyzing the effectiveness of safety measures using Bayesian methods.

    DOT National Transportation Integrated Search

    2010-12-01

    Recent research suggests that traditional safety evaluation methods may be inadequate in accurately determining the effectiveness of roadway safety measures. In recent years, advanced statistical methods are being utilized in traffic safety studies t...

  15. Job Stress among Hispanic Professionals

    ERIC Educational Resources Information Center

    Rodriguez-Calcagno, Maria; Brewer, Ernest W.

    2005-01-01

    This study explores job stress among a random sample of 219 Hispanic professionals. Participants complete the Job Stress Survey by Spielberger and Vagg and a demographic questionnaire. Responses are analyzed using descriptive statistics, a factorial analysis of variance, and coefficients of determination. Results indicate that Hispanic…

  16. Non Kolmogorov Probability Models Outside Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi

    2009-03-01

    This paper is devoted to the analysis of the main conceptual problems in the interpretation of QM: reality, locality, determinism, physical state, the Heisenberg principle, "deterministic" and "exact" theories, laws of chance, the notion of event, statistical invariants, adaptive realism, EPR correlations and, finally, the EPR-chameleon experiment.

  17. 40 CFR 610.10 - Program purpose.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...

  18. 40 CFR 610.10 - Program purpose.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...

  19. 40 CFR 610.10 - Program purpose.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...

  20. 40 CFR 610.10 - Program purpose.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...

  1. Social determinants, their relationship with leprosy risk and temporal trends in a tri-border region in Latin America

    PubMed Central

    Arcoverde, Marcos Augusto Moraes; Ramos, Antônio Carlos Viera; Alves, Luana Seles; Berra, Thais Zamboni; Arroyo, Luiz Henrique; de Queiroz, Ana Angélica Rêgo; dos Santos, Danielle Talita; Belchior, Aylana de Souza; Alves, Josilene Dália; Pieri, Flávia Meneguetti; Silva-Sobrinho, Reinaldo Antônio; Pinto, Ione Carvalho; Tavares, Clodis Maria; Yamamura, Mellina; Frade, Marco Andrey Cipriani; Palha, Pedro Fredemir; Chiaravalloti-Neto, Francisco; Arcêncio, Ricardo Alexandre

    2018-01-01

    Background Brazil is the only country in Latin America that has adopted a national health system. This causes differences in access to health care among Latin American countries and induces noticeable migration to Brazilian regions to seek healthcare. This phenomenon has led to difficulties in the control and elimination of diseases related to poverty, such as leprosy. The aim of this study was to evaluate social determinants and their relationship with the risk of leprosy, as well as to examine the temporal trend of its occurrence in a Brazilian municipality located on the tri-border area between Brazil, Paraguay and Argentina. Methods This ecological study investigated newly-diagnosed cases of leprosy between 2003 and 2015. Exploratory analysis of the data was performed through descriptive statistics. For spatial analysis, geocoding of the data was performed using spatial scan statistic techniques to obtain the Relative Risk (RR) for each census tract, with their respective 95% confidence intervals calculated. The Bivariate Moran I test, Ordinary Least Squares (OLS) and Geographically Weighted Regression (GWR) models were applied to analyze the spatial relationships of social determinants and leprosy risk. The temporal trend of the annual coefficient of new cases was obtained through the Prais-Winsten regression. A significance level of 5% (p < 0.05) was adopted. Results Of the 840 new cases identified in the study, there was a predominance of females (n = 427, 50.8%), of white race/color (n = 685, 81.6%), age range 15 to 59 years (n = 624, 74.3%), and incomplete elementary education (n = 504, 60.0%). The results obtained from multivariate analysis revealed that the proportion of households with monthly nominal household income per capita greater than 1 minimum wage (β = 0.025, p = 0.036) and people of brown race (β = -0.101, p = 0.024) were statistically significantly associated with the risk of illness due to leprosy. These results also confirmed that the relationships between social determinants and leprosy risk were significantly spatially non-stationary. Regarding the temporal trend, a decrease of 4% (95% CI [-0.053, -0.033], p = 0.000) per year was observed in the rate of detection of new cases of leprosy. Conclusion The social determinants income and race/color were associated with the risk of leprosy. The study’s highlighting of these social determinants can contribute to the development of public policies directed toward the elimination of leprosy in the border region. PMID:29624595

  2. Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Patiño, David; Collazo, Joaquín

    2010-01-01

    Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition. PMID:20559506

  3. Simultaneous grouping and ranking with combination of SOM and TOPSIS for selection of preferable analytical procedure for furan determination in food.

    PubMed

    Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek

    2018-02-01

    A novel methodology for grouping and ranking with application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects that are analytical procedures applied to furan determination in food samples. They are described by 10 variables referring to their analytical performance and environmental and economic aspects. Multivariate statistical analysis allows the amount of input data for the ranking analysis to be limited. Assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. It is shown how the information obtained from the two tools complements each other. The applicability of the combination of grouping and ranking is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
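
    The ranking step can be illustrated with a generic TOPSIS implementation (the standard algorithm, not the authors' code); the decision matrix below describes three hypothetical procedures by four invented criteria, with assumed weights and benefit/cost directions.

        import numpy as np

        # rows = candidate procedures, columns = criteria (e.g., recovery %, LOD, solvent use, cost)
        X = np.array([[95., 0.2, 5., 10.],
                      [88., 0.1, 2., 25.],
                      [92., 0.5, 1.,  8.]])
        benefit = np.array([True, False, False, False])    # maximise recovery, minimise the rest
        weights = np.array([0.4, 0.3, 0.2, 0.1])            # assumed criterion weights

        R = X / np.sqrt((X ** 2).sum(axis=0))               # vector normalisation
        V = R * weights
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_plus  = np.sqrt(((V - ideal) ** 2).sum(axis=1))
        d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
        closeness = d_minus / (d_plus + d_minus)            # higher = closer to the ideal procedure
        print("ranking (best first):", np.argsort(-closeness))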

  4. Statistical methods for astronomical data with upper limits. II - Correlation and regression

    NASA Technical Reports Server (NTRS)

    Isobe, T.; Feigelson, E. D.; Nelson, P. I.

    1986-01-01

    Statistical methods for calculating correlations and regressions in bivariate censored data, where the dependent variable can have upper or lower limits, are presented. Cox's regression and the generalization of Kendall's rank correlation coefficient provide significance levels for correlations, while the EM algorithm, under the assumption of normally distributed errors, and its nonparametric analog using the Kaplan-Meier estimator give estimates for the slope of a regression line. Monte Carlo simulations demonstrate that survival analysis is reliable in determining correlations between luminosities at different bands. Survival analysis is applied to CO emission in infrared galaxies, X-ray emission in radio galaxies, H-alpha emission in cooling cluster cores, and radio emission in Seyfert galaxies.
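
    The sketch below gives the flavour of the survival-analysis treatment: a minimal Kaplan-Meier product-limit estimator applied to a toy set of luminosities in which non-detections enter as upper limits. The data, and the convention of negating the variable so that upper limits become right-censored points, are illustrative; published analyses use the full machinery described in the paper.

        import numpy as np

        luminosity = np.array([2.1, 3.4, 1.2, 5.0, 0.8, 2.9, 4.2, 1.5])              # arbitrary units
        detected   = np.array([True, True, False, True, False, True, True, False])    # False = upper limit

        def kaplan_meier(times, observed):
            # product-limit estimate S(t) = P(X > t) for right-censored data (ties ignored)
            order = np.argsort(times)
            times, observed = times[order], observed[order]
            n, surv, steps = len(times), 1.0, []
            for i, (t, obs) in enumerate(zip(times, observed)):
                if obs:                                   # an uncensored (detected) value
                    surv *= 1.0 - 1.0 / (n - i)           # n - i points still at risk
                steps.append((t, surv))
            return steps

        # Upper limits mean L < L_lim (left-censoring); negating L turns them into right-censoring.
        for t, s in kaplan_meier(-luminosity, detected):
            print(f"P(L < {-t:.1f}) = {s:.2f}")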

  5. Rescaled earthquake recurrence time statistics: application to microrepeaters

    NASA Astrophysics Data System (ADS)

    Goltz, Christian; Turcotte, Donald L.; Abaimov, Sergey G.; Nadeau, Robert M.; Uchida, Naoki; Matsuzawa, Toru

    2009-01-01

    Slip on major faults primarily occurs during `characteristic' earthquakes. The recurrence statistics of characteristic earthquakes play an important role in seismic hazard assessment. A major problem in determining applicable statistics is the short sequences of characteristic earthquakes that are available worldwide. In this paper, we introduce a rescaling technique in which sequences can be superimposed to establish larger numbers of data points. We consider the Weibull and log-normal distributions; in both cases we rescale the data using means and standard deviations. We test our approach utilizing sequences of microrepeaters, micro-earthquakes which recur in the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Microrepeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. We present results for the analysis of recurrence times for several microrepeater sequences from Parkfield, CA as well as NE Japan. We find that, once the respective sequence can be considered to be of sufficient stationarity, the statistics can be well fitted by either a Weibull or a log-normal distribution. We clearly demonstrate this fact by our technique of rescaled combination. We conclude that the recurrence statistics of the microrepeater sequences we consider are similar to the recurrence statistics of characteristic earthquakes on major faults.
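
    A minimal sketch of the pooling idea is given below: several short synthetic recurrence-time sequences are each normalised by their own mean (a simplified version of the paper's rescaling by means and standard deviations), pooled, and fitted with a Weibull distribution; the sequences and parameters are invented for illustration.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        # three short, heterogeneous recurrence-time sequences (stand-ins for microrepeater catalogues)
        sequences = [stats.weibull_min.rvs(1.8, scale=s, size=n, random_state=rng)
                     for s, n in [(2.0, 15), (0.5, 20), (5.0, 12)]]

        # simplified rescaling: normalise each sequence by its own mean before pooling
        pooled = np.concatenate([seq / seq.mean() for seq in sequences])

        shape, loc, scale = stats.weibull_min.fit(pooled, floc=0)
        ks = stats.kstest(pooled, "weibull_min", args=(shape, loc, scale))
        print(f"pooled n = {pooled.size}, Weibull shape = {shape:.2f}, KS p-value = {ks.pvalue:.2f}")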

  6. A New Paradigm to Analyze Data Completeness of Patient Data.

    PubMed

    Nasir, Ayan; Gurupur, Varadraj; Liu, Xinliang

    2016-08-03

    There is a need to develop a tool that will measure data completeness of patient records using sophisticated statistical metrics. Patient data integrity is important in providing timely and appropriate care. Completeness is an important step, with an emphasis on understanding the complex relationships between data fields and their relative importance in delivering care. This tool will not only help understand where data problems are but also help uncover the underlying issues behind them. Develop a tool that can be used alongside a variety of health care database software packages to determine the completeness of individual patient records as well as aggregate patient records across health care centers and subpopulations. The methodology of this project is encapsulated within the Data Completeness Analysis Package (DCAP) tool, with the major components including concept mapping, CSV parsing, and statistical analysis. The results from testing DCAP with Healthcare Cost and Utilization Project (HCUP) State Inpatient Database (SID) data show that this tool is successful in identifying relative data completeness at the patient, subpopulation, and database levels. These results also solidify a need for further analysis and call for hypothesis driven research to find underlying causes for data incompleteness. DCAP examines patient records and generates statistics that can be used to determine the completeness of individual patient data as well as the general thoroughness of record keeping in a medical database. DCAP uses a component that is customized to the settings of the software package used for storing patient data as well as a Comma Separated Values (CSV) file parser to determine the appropriate measurements. DCAP itself is assessed through a proof of concept exercise using hypothetical data as well as available HCUP SID patient data.
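
    A minimal sketch of the kind of completeness measurement described is shown below (generic pandas code, not the DCAP implementation): field-level and record-level completeness for a tiny inline patient table, plus a weighted variant in which the field weights are assumed values standing in for relative clinical importance.

        import io
        import pandas as pd

        csv_text = """patient_id,age,sex,diagnosis,discharge_status
        1,54,F,I10,home
        2,,M,E11,
        3,67,,I25,home
        4,41,F,,transferred
        """
        df = pd.read_csv(io.StringIO(csv_text), skipinitialspace=True)

        field_completeness = df.notna().mean()                     # fraction of non-missing values per column
        record_completeness = df.notna().mean(axis=1)              # fraction of non-missing values per record
        weights = {"patient_id": 0.4, "age": 0.2, "sex": 0.1, "diagnosis": 0.2, "discharge_status": 0.1}
        weighted = (df.notna() * pd.Series(weights)).sum(axis=1)   # fields weighted by assumed importance

        print(field_completeness)
        print(pd.DataFrame({"record": df["patient_id"], "simple": record_completeness, "weighted": weighted}))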

  7. A New Paradigm to Analyze Data Completeness of Patient Data

    PubMed Central

    Nasir, Ayan; Liu, Xinliang

    2016-01-01

    Summary Background There is a need to develop a tool that will measure data completeness of patient records using sophisticated statistical metrics. Patient data integrity is important in providing timely and appropriate care. Completeness is an important step, with an emphasis on understanding the complex relationships between data fields and their relative importance in delivering care. This tool will not only help understand where data problems are but also help uncover the underlying issues behind them. Objectives Develop a tool that can be used alongside a variety of health care database software packages to determine the completeness of individual patient records as well as aggregate patient records across health care centers and subpopulations. Methods The methodology of this project is encapsulated within the Data Completeness Analysis Package (DCAP) tool, with the major components including concept mapping, CSV parsing, and statistical analysis. Results The results from testing DCAP with Healthcare Cost and Utilization Project (HCUP) State Inpatient Database (SID) data show that this tool is successful in identifying relative data completeness at the patient, subpopulation, and database levels. These results also solidify a need for further analysis and call for hypothesis driven research to find underlying causes for data incompleteness. Conclusion DCAP examines patient records and generates statistics that can be used to determine the completeness of individual patient data as well as the general thoroughness of record keeping in a medical database. DCAP uses a component that is customized to the settings of the software package used for storing patient data as well as a Comma Separated Values (CSV) file parser to determine the appropriate measurements. DCAP itself is assessed through a proof of concept exercise using hypothetical data as well as available HCUP SID patient data. PMID:27484918

  8. Statistical Analysis of 30 Years Rainfall Data: A Case Study

    NASA Astrophysics Data System (ADS)

    Arvind, G.; Ashok Kumar, P.; Girish Karthi, S.; Suribabu, C. R.

    2017-07-01

    Rainfall is a prime input for various engineering designs such as hydraulic structures, bridges and culverts, canals, storm water sewers and road drainage systems. A detailed statistical analysis of each region is essential to estimate the relevant input values for the design and analysis of engineering structures and also for crop planning. A rain gauge station located in Trichy district, where agriculture is the prime occupation, is selected for statistical analysis. The daily rainfall data for a period of 30 years are used to understand the normal rainfall, deficit rainfall, excess rainfall and seasonal rainfall of the selected circle headquarters. Further, the various plotting position formulae available are used to evaluate the return period of monthly, seasonal and annual rainfall. This analysis will provide useful information for water resources planners, farmers and urban engineers to assess the availability of water and create storage accordingly. The mean, standard deviation and coefficient of variation of monthly and annual rainfall were calculated to check the rainfall variability. From the calculated results, the rainfall pattern is found to be erratic. The best-fit probability distribution was identified based on the minimum deviation between actual and estimated values. The analysis paved the way to determine the proper onset and withdrawal of the monsoon, results which are useful for land preparation and sowing.
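
    As an illustration of the return-period step, the sketch below ranks a set of invented annual rainfall totals and applies the Weibull plotting-position formula m/(N+1) to estimate exceedance probabilities and return periods; the figures are not the station data analysed in the paper.

        import numpy as np

        annual_rain_mm = np.array([812, 640, 955, 720, 1030, 860, 590, 780, 905, 670,
                                   740, 990, 615, 845, 700, 880, 760, 930, 655, 810])

        sorted_rain = np.sort(annual_rain_mm)[::-1]          # descending: rank 1 = wettest year
        n = len(sorted_rain)
        rank = np.arange(1, n + 1)
        exceedance_prob = rank / (n + 1)                     # Weibull plotting position m/(N+1)
        return_period = 1.0 / exceedance_prob

        for r, p, t in zip(sorted_rain[:5], exceedance_prob[:5], return_period[:5]):
            print(f"{r} mm: exceedance probability {p:.2f}, return period {t:.1f} years")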

  9. Analysis of uncertainties and convergence of the statistical quantities in turbulent wall-bounded flows by means of a physically based criterion

    NASA Astrophysics Data System (ADS)

    Andrade, João Rodrigo; Martins, Ramon Silva; Thompson, Roney Leon; Mompean, Gilmar; da Silveira Neto, Aristeu

    2018-04-01

    The present paper provides an analysis of the statistical uncertainties associated with direct numerical simulation (DNS) results and experimental data for turbulent channel and pipe flows, showing a new physically based quantification of these errors, to improve the determination of the statistical deviations between DNSs and experiments. The analysis is carried out using a recently proposed criterion by Thompson et al. ["A methodology to evaluate statistical errors in DNS data of plane channel flows," Comput. Fluids 130, 1-7 (2016)] for fully turbulent plane channel flows, where the mean velocity error is estimated by considering the Reynolds stress tensor, and using the balance of the mean force equation. It also presents how the residual error evolves in time for a DNS of a plane channel flow, and the influence of the Reynolds number on its convergence rate. The root mean square of the residual error is shown in order to capture a single quantitative value of the error associated with the dimensionless averaging time. The evolution in time of the error norm is compared with the final error provided by DNS data of similar Reynolds numbers available in the literature. A direct consequence of this approach is that it was possible to compare different numerical results and experimental data, providing an improved understanding of the convergence of the statistical quantities in turbulent wall-bounded flows.

  10. Random forests for classification in ecology

    USGS Publications Warehouse

    Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.

    2007-01-01

    Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. ?? 2007 by the Ecological Society of America.
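
    The sketch below shows the kind of workflow the paper describes, using scikit-learn on a synthetic presence/absence data set: a random forest is scored by cross-validation and its variable-importance ranking is extracted. The data, forest size and scoring choices are illustrative, not those of the cited case studies.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        # synthetic stand-in for species presence/absence with environmental predictors
        X, y = make_classification(n_samples=400, n_features=8, n_informative=3, random_state=0)
        rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)

        acc = cross_val_score(rf, X, y, cv=5).mean()          # cross-validated classification accuracy
        rf.fit(X, y)
        ranking = np.argsort(rf.feature_importances_)[::-1]   # variable importance, most important first
        print(f"CV accuracy = {acc:.2f}, OOB score = {rf.oob_score_:.2f}")
        print("predictors ranked by importance:", ranking)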

  11. Deformable image registration as a tool to improve survival prediction after neoadjuvant chemotherapy for breast cancer: results from the ACRIN 6657/I-SPY-1 trial

    NASA Astrophysics Data System (ADS)

    Jahani, Nariman; Cohen, Eric; Hsieh, Meng-Kang; Weinstein, Susan P.; Pantalone, Lauren; Davatzikos, Christos; Kontos, Despina

    2018-02-01

    We examined the ability of DCE-MRI longitudinal features to give early prediction of recurrence-free survival (RFS) in women undergoing neoadjuvant chemotherapy for breast cancer, in a retrospective analysis of 106 women from the ISPY 1 cohort. These features were based on the voxel-wise changes seen in registered images taken before treatment and after the first round of chemotherapy. We computed the transformation field using a robust deformable image registration technique to match breast images from these two visits. Using the deformation field, parametric response maps (PRM) — a voxel-based feature analysis of longitudinal changes in images between visits — was computed for maps of four kinetic features (signal enhancement ratio, peak enhancement, and wash-in/wash-out slopes). A two-level discrete wavelet transform was applied to these PRMs to extract heterogeneity information about tumor change between visits. To estimate survival, a Cox proportional hazard model was applied with the C statistic as the measure of success in predicting RFS. The best PRM feature (as determined by C statistic in univariable analysis) was determined for each of the four kinetic features. The baseline model, incorporating functional tumor volume, age, race, and hormone response status, had a C statistic of 0.70 in predicting RFS. The model augmented with the four PRM features had a C statistic of 0.76. Thus, our results suggest that adding information on the texture of voxel-level changes in tumor kinetic response between registered images of first and second visits could improve early RFS prediction in breast cancer after neoadjuvant chemotherapy.

  12. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory authorities such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of the normal probability distribution, statistical stability, and capability of the production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles for the achievement of six sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new to the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable process that is liable for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
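
    The capability and sigma-level calculations at the heart of such a study can be sketched as follows; the tablet-weight data and specification limits are invented, and the 1.5-sigma shift allowance is an assumption of the common six sigma convention rather than anything taken from the paper.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(7)
        tablet_weight = rng.normal(500.0, 4.0, size=200)     # mg; stand-in for measured batches
        LSL, USL = 485.0, 515.0                              # assumed specification limits

        mu, sigma = tablet_weight.mean(), tablet_weight.std(ddof=1)
        cp  = (USL - LSL) / (6 * sigma)                      # potential capability
        cpk = min(USL - mu, mu - LSL) / (3 * sigma)          # capability allowing for off-centre mean

        # expected fraction out of specification and the corresponding sigma level
        p_out = norm.cdf(LSL, mu, sigma) + norm.sf(USL, mu, sigma)
        sigma_level = norm.isf(p_out) + 1.5                  # conventional 1.5-sigma shift allowance
        print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, sigma level ~ {sigma_level:.1f}")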

  13. Training in metabolomics research. II. Processing and statistical analysis of metabolomics data, metabolite identification, pathway analysis, applications of metabolomics and its future

    PubMed Central

    Barnes, Stephen; Benton, H. Paul; Casazza, Krista; Cooper, Sara; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H.; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K.; Renfrow, Matthew B.; Tiwari, Hemant K.

    2017-01-01

    Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites, and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. PMID:28239968

  14. Statistical analysis of the horizontal divergent flow in emerging solar active regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toriumi, Shin; Hayashi, Keiji; Yokoyama, Takaaki, E-mail: shin.toriumi@nao.ac.jp

    Solar active regions (ARs) are thought to be formed by magnetic fields from the convection zone. Our flux emergence simulations revealed that a strong horizontal divergent flow (HDF) of unmagnetized plasma appears at the photosphere before the flux begins to emerge. In our earlier study, we analyzed HMI data for a single AR and confirmed the presence of this precursor plasma flow in the actual Sun. In this paper, as an extension of our earlier study, we conducted a statistical analysis of the HDFs to further investigate their characteristics and better determine their properties. From SDO/HMI data, we picked up 23 flux emergence events over a period of 14 months, the total flux of which ranges from 10^20 to 10^22 Mx. Out of the 23 selected events, 6 clear HDFs were detected by the method we developed in our earlier study, and 7 HDFs detected by visual inspection were added to this statistical analysis. We found that the duration of the HDF is on average 61 minutes and the maximum HDF speed is on average 3.1 km s^-1. We also estimated the rising speed of the subsurface magnetic flux to be 0.6-1.4 km s^-1. These values are highly consistent with our previous one-event analysis as well as our simulation results. The observation results lead us to the conclusion that the HDF is a rather common feature in the earliest phase of AR emergence. Moreover, our HDF analysis has the capability of determining the subsurface properties of emerging fields that cannot be directly measured.

  15. The effect of ion-exchange purification on the determination of plutonium at the New Brunswick Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, W.G.; Spaletto, M.I.; Lewis, K.

    The method of plutonium (Pu) determination at the New Brunswick Laboratory (NBL) consists of a combination of ion-exchange purification followed by controlled-potential coulometric analysis (IE/CPC). The present report's purpose is to quantify any detectable Pu loss occurring in the ion-exchange (IE) purification step which would cause a negative bias in the NBL method for Pu analysis. The magnitude of any such loss would be contained within the reproducibility (0.05%) of the IE/CPC method, which utilizes a state-of-the-art autocoulometer developed at NBL. When the NBL IE/CPC method is used for Pu analysis, any loss in ion-exchange purification (<0.05%) is confounded with the repeatability of the ion exchange and the precision of the CPC analysis technique (<0.05%). Consequently, to detect a bias in the IE/CPC method due to the IE alone using the IE/CPC method itself requires that many randomized analyses on a single material be performed over time and that statistical analysis of the data be performed. The initial approach described in this report to quantify any IE loss was an independent method, Isotope Dilution Mass Spectrometry; however, the number of analyses performed was insufficient to assign a statistically significant value to the IE loss (<0.02% of 10 mg samples of Pu). The second method used for quantifying any IE loss of Pu was multiple ion exchanges of the same Pu aliquant; the small number of analyses possible per individual IE together with the column-to-column variability over multiple ion exchanges prevented statistical detection of any loss of <0.05%. 12 refs.

  16. Regression: The Apple Does Not Fall Far From the Tree.

    PubMed

    Vetter, Thomas R; Schober, Patrick

    2018-05-15

    Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques, which, when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause-and-effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.
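
    Two of the regression types discussed, simple linear regression and logistic regression, are sketched below with statsmodels on simulated data; the variables (age, blood pressure, a binary complication) and effect sizes are invented purely to show the mechanics.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(9)
        n = 200
        age = rng.uniform(20, 80, n)
        X = sm.add_constant(age)

        # simple linear regression: blood pressure modelled as a function of age
        bp = 100 + 0.5 * age + rng.normal(0, 8, n)
        print("linear regression coefficients:", sm.OLS(bp, X).fit().params)

        # logistic regression: probability of a complication modelled as a function of age
        p = 1 / (1 + np.exp(-(-5 + 0.06 * age)))
        complication = rng.binomial(1, p)
        print("logistic regression coefficients:", sm.Logit(complication, X).fit(disp=0).params)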

  17. Longitudinal and Immediate Effect of Kundalini Yoga on Salivary Levels of Cortisol and Activity of Alpha-Amylase and Its Effect on Perceived Stress

    PubMed Central

    García-Sesnich, Jocelyn N; Flores, Mauricio Garrido; Ríos, Marcela Hernández; Aravena, Jorge Gamonal

    2017-01-01

    Context: Stress is defined as an alteration of an organism's balance in response to a demand perceived from the environment. Diverse methods exist to evaluate the physiological response. A noninvasive method is the salivary measurement of cortisol and alpha-amylase. A growing body of evidence suggests that the regular practice of Yoga would be an effective treatment for stress. Aims: To determine the effect of Kundalini Yoga (KY), immediately and after 3 months of regular practice, on the perception of psychological stress and on the salivary levels of cortisol and alpha-amylase activity. Settings and Design: Perceived psychological stress, salivary cortisol levels, and alpha-amylase activity were determined and compared between participants attending KY classes for 3 months and a group that does not practice any type of yoga. Subjects and Methods: The total sample consisted of 26 people between 18 and 45 years old; 13 taking part in KY classes given at the Faculty of Dentistry, University of Chile, and 13 controls. Salivary samples were collected, an enzyme-linked immunosorbent assay was performed to quantify cortisol, and a kinetic reaction test was made to determine alpha-amylase activity. The Perceived Stress Scale was applied at the beginning and at the end of the intervention. Statistical Analysis Used: Statistical analysis was applied using Stata v11.1 software. The Shapiro–Wilk test was used to determine the data distribution. The paired analysis was performed with the t-test or Wilcoxon signed-rank test. The t-test or Mann–Whitney test was applied to compare longitudinal data. Statistical significance was considered when P < 0.05. Results: KY practice had an immediate effect on salivary cortisol. The activity of alpha-amylase did not show significant changes. A significant decrease of perceived stress in the study group was found. Conclusions: KY practice shows an immediate effect on salivary cortisol levels and on perceived stress after 3 months of practice. PMID:28546677

  18. Factors determining access to oral health services among children aged less than 12 years in Peru.

    PubMed

    Azañedo, Diego; Hernández-Vásquez, Akram; Casas-Bendezú, Mixsi; Gutiérrez, César; Agudelo-Suárez, Andrés A; Cortés, Sandra

    2017-01-01

    Background: Understanding problems of access to oral health services requires knowledge of factors that determine access. This study aimed to evaluate factors that determine access to oral health services among children aged <12 years in Peru between 2014 and 2015. Methods: We performed a secondary data analysis of 71,614 Peruvian children aged <12 years and their caregivers. Data were obtained from the Survey on Demography and Family Health 2014-2015 (Encuesta Demográfica y de Salud Familiar - ENDES). Children's access to oral health services within the previous 6 months was used as the dependent variable (i.e. Yes/No), and the Andersen and colleagues model was used to select independent variables. Predisposing (e.g., language spoken by the tutor or guardian, wealth level, caregivers' educational level, area of residence, natural region of residence, age, and sex) and enabling factors (e.g. type of health insurance) were considered. Descriptive statistics were calculated, and multivariate analysis was performed using generalized linear models (Poisson family). Results: Of all the children, 51% were males, 56% were aged <5 years, and 62.6% lived in urban areas. The most common type of health insurance was Integral Health Insurance (57.8%), and most respondents were in the first quintile of wealth (31.6%). Regarding caregivers, the most common educational level was high school (43.02%) and the most frequently spoken language was Spanish (88.4%). Univariate analysis revealed that all variables, except sex and primary educational level, were statistically significant. After adjustment, sex, area of residence, and language were not significant, whereas the remaining variables were statistically significant. Conclusions: Wealth index, caregivers' education level, natural region of residence, age, and type of health insurance are factors that determine access to oral health services among children aged <12 years in Peru. These factors should be considered when devising strategies to mitigate inequities in access to oral health services.

  19. Factors determining access to oral health services among children aged less than 12 years in Peru

    PubMed Central

    Azañedo, Diego; Hernández-Vásquez, Akram; Casas-Bendezú, Mixsi; Gutiérrez, César; Agudelo-Suárez, Andrés A.; Cortés, Sandra

    2017-01-01

    Background: Understanding problems of access to oral health services requires knowledge of factors that determine access. This study aimed to evaluate factors that determine access to oral health services among children aged <12 years in Peru between 2014 and 2015. Methods: We performed a secondary data analysis of 71,614 Peruvian children aged <12 years and their caregivers. Data were obtained from the Survey on Demography and Family Health 2014-2015 (Encuesta Demográfica y de Salud Familiar - ENDES). Children’s access to oral health services within the previous 6 months was used as the dependent variable (i.e. Yes/No), and the Andersen and colleagues model was used to select independent variables. Predisposing (e.g., language spoken by the tutor or guardian, wealth level, caregivers’ educational level, area of residence, natural region of residence, age, and sex) and enabling factors (e.g. type of health insurance) were considered. Descriptive statistics were calculated, and multivariate analysis was performed using generalized linear models (Poisson family). Results: Of all the children, 51% were males, 56% were aged <5 years, and 62.6% lived in urban areas. The most common type of health insurance was Integral Health Insurance (57.8%), and most respondents were in the first quintile of wealth (31.6%). Regarding caregivers, the most common educational level was high school (43.02%) and the most frequently spoken language was Spanish (88.4%). Univariate analysis revealed that all variables, except sex and primary educational level, were statistically significant. After adjustment, sex, area of residence, and language were not significant, whereas the remaining variables were statistically significant. Conclusions: Wealth index, caregivers’ education level, natural region of residence, age, and type of health insurance are factors that determine access to oral health services among children aged <12 years in Peru. These factors should be considered when devising strategies to mitigate inequities in access to oral health services. PMID:29527289

  20. Power analysis and trend detection for water quality monitoring data. An application for the Greater Yellowstone Inventory and Monitoring Network

    USGS Publications Warehouse

    Irvine, Kathryn M.; Manlove, Kezia; Hollimon, Cynthia

    2012-01-01

    An important consideration for long term monitoring programs is determining the required sampling effort to detect trends in specific ecological indicators of interest. To enhance the Greater Yellowstone Inventory and Monitoring Network’s water resources protocol(s) (O’Ney 2006 and O’Ney et al. 2009 [under review]), we developed a set of tools to: (1) determine the statistical power for detecting trends of varying magnitude in a specified water quality parameter over different lengths of sampling (years) and different within-year collection frequencies (monthly or seasonal sampling) at particular locations using historical data, and (2) perform periodic trend analyses for water quality parameters while addressing seasonality and flow weighting. A power analysis for trend detection is a statistical procedure used to estimate the probability of rejecting the hypothesis of no trend when in fact there is a trend, within a specific modeling framework. In this report, we base our power estimates on using the seasonal Kendall test (Helsel and Hirsch 2002) for detecting trend in water quality parameters measured at fixed locations over multiple years. We also present procedures (R-scripts) for conducting a periodic trend analysis using the seasonal Kendall test with and without flow adjustment. This report provides the R-scripts developed for power and trend analysis, tutorials, and the associated tables and graphs. The purpose of this report is to provide practical information for monitoring network staff on how to use these statistical tools for water quality monitoring data sets.
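
    The R-scripts described in the report are not reproduced here; the sketch below is a rough Python analogue of the same idea, estimating power by simulating a linear trend with noise and testing it with Kendall's tau (a non-seasonal simplification of the seasonal Kendall test). All parameter names and values are illustrative assumptions, not values from the report.

        import numpy as np
        from scipy.stats import kendalltau

        def trend_power(slope, sd, years=10, per_year=12, alpha=0.05, n_sim=2000, seed=1):
            """Estimate power to detect a linear trend with a Kendall's-tau test."""
            rng = np.random.default_rng(seed)
            t = np.arange(years * per_year) / per_year      # sampling times in years
            rejections = 0
            for _ in range(n_sim):
                y = slope * t + rng.normal(0.0, sd, size=t.size)   # trend plus noise
                _, p = kendalltau(t, y)
                rejections += p < alpha
            return rejections / n_sim

        # Example: power to detect a 0.05 units/year trend with noise SD of 0.5
        print(trend_power(slope=0.05, sd=0.5, years=10))

    Increasing the number of years or the within-year sampling frequency in such a simulation shows directly how sampling effort trades off against detectable trend magnitude.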

  1. Residual stress in glass: indentation crack and fractography approaches.

    PubMed

    Anunmana, Chuchai; Anusavice, Kenneth J; Mecholsky, John J

    2009-11-01

    To test the hypothesis that the indentation crack technique can determine surface residual stresses that are not statistically significantly different from those determined from the analytical procedure using surface cracks, the four-point flexure test, and fracture surface analysis. Soda-lime-silica glass bar specimens (4 mm x 2.3 mm x 28 mm) were prepared and annealed at 650 degrees C for 30 min before testing. The fracture toughness values of the glass bars were determined from 12 specimens based on induced surface cracks, four-point flexure, and fractographic analysis. To determine the residual stress from the indentation technique, 18 specimens were indented under 19.6N load using a Vickers microhardness indenter. Crack lengths were measured within 1 min and 24h after indentation, and the measured crack lengths were compared with the mean crack lengths of annealed specimens. Residual stress was calculated from an equation developed for the indentation technique. All specimens were fractured in a four-point flexure fixture and the residual stress was calculated from the strength and measured crack sizes on the fracture surfaces. The results show that there was no significant difference between the residual stresses calculated from the two techniques. However, the differences in mean residual stresses calculated within 1 min compared with those calculated after 24h were statistically significant (p=0.003). This study compared the indentation technique with the fractographic analysis method for determining the residual stress in the surface of soda-lime-silica glass. The indentation method may be useful for estimating residual stress in glass.

  2. Residual stress in glass: indentation crack and fractography approaches

    PubMed Central

    Anunmana, Chuchai; Anusavice, Kenneth J.; Mecholsky, John J.

    2009-01-01

    Objective To test the hypothesis that the indentation crack technique can determine surface residual stresses that are not statistically significantly different from those determined from the analytical procedure using surface cracks, the four-point flexure test, and fracture surface analysis. Methods Soda-lime-silica glass bar specimens (4 mm × 2.3 mm × 28 mm) were prepared and annealed at 650 °C for 30 min before testing. The fracture toughness values of the glass bars were determined from 12 specimens based on induced surface cracks, four-point flexure, and fractographic analysis. To determine the residual stress from the indentation technique, 18 specimens were indented under 19.6 N load using a Vickers microhardness indenter. Crack lengths were measured within 1 min and 24 h after indentation, and the measured crack lengths were compared with the mean crack lengths of annealed specimens. Residual stress was calculated from an equation developed for the indentation technique. All specimens were fractured in a four-point flexure fixture and the residual stress was calculated from the strength and measured crack sizes on the fracture surfaces. Results The results show that there was no significant difference between the residual stresses calculated from the two techniques. However, the differences in mean residual stresses calculated within 1 min compared with those calculated after 24 h were statistically significant (p=0.003). Significance This study compared the indentation technique with the fractographic analysis method for determining the residual stress in the surface of soda-lime silica glass. The indentation method may be useful for estimating residual stress in glass. PMID:19671475

  3. Objective determination of image end-members in spectral mixture analysis of AVIRIS data

    NASA Technical Reports Server (NTRS)

    Tompkins, Stefanie; Mustard, John F.; Pieters, Carle M.; Forsyth, Donald W.

    1993-01-01

    Spectral mixture analysis has been shown to be a powerful, multifaceted tool for analysis of multi- and hyper-spectral data. Applications of AVIRIS data have ranged from mapping soils and bedrock to ecosystem studies. During the first phase of the approach, a set of end-members is selected from an image cube (image end-members) that best accounts for its spectral variance within a constrained, linear least squares mixing model. These image end-members are usually selected using a priori knowledge and successive trial-and-error solutions to refine the total number and physical location of the end-members. However, in many situations a more objective method of determining these essential components is desired. We approach the problem of image end-member determination objectively by using the inherent variance of the data. Unlike purely statistical methods such as factor analysis, this approach derives solutions that conform to a physically realistic model.

  4. STATISTICAL VALIDATION OF SULFATE QUANTIFICATION METHODS USED FOR ANALYSIS OF ACID MINE DRAINAGE

    EPA Science Inventory

    Turbidimetric method (TM), ion chromatography (IC) and inductively coupled plasma atomic emission spectrometry (ICP-AES) with and without acid digestion have been compared and validated for the determination of sulfate in mining wastewater. Analytical methods were chosen to compa...

  5. An analysis of tire tread wear groove patterns and the effect of heteroscedasticity on tire tread wear statistics

    DOT National Transportation Integrated Search

    1985-09-01

    This report examines the groove wear variability among tires subjected to the Uniform Tire Quality Grading (UTQG) test procedure for determining tire tread wear. The effects of heteroscedasticity (variable variance) on a previously reported sta...

  6. Risk Driven Outcome-Based Command and Control (C2) Assessment

    DTIC Science & Technology

    2000-01-01

    shaping the risk ranking scores into more interpretable and statistically sound risk measures. Regression analysis was applied to determine what...Architecture Framework Implementation, AFCEA Coursebook 503J, February 8-11, 2000, San Diego, California. [Morgan and Henrion, 1990] M. Granger Morgan and

  7. Graduate Programs in Education: Impact on Teachers' Careers

    ERIC Educational Resources Information Center

    Tucker, Janice; Fushell, Marian

    2013-01-01

    This paper examined teachers' decisions to pursue graduate programs and their career choices following completion of their studies. Based on document analysis and statistical examination of teacher questionnaire responses, this study determined that teachers choose graduate studies for different reasons, their program choice influences future…

  8. Analysis of chemical warfare agents. II. Use of thiols and statistical experimental design for the trace level determination of vesicant compounds in air samples.

    PubMed

    Muir, Bob; Quick, Suzanne; Slater, Ben J; Cooper, David B; Moran, Mary C; Timperley, Christopher M; Carrick, Wendy A; Burnell, Christopher K

    2005-03-18

    Thermal desorption with gas chromatography-mass spectrometry (TD-GC-MS) remains the technique of choice for analysis of trace concentrations of analytes in air samples. This paper describes the development and application of a method for analysing the vesicant compounds sulfur mustard and Lewisites I-III. 3,4-Dimercaptotoluene and butanethiol were used to spike sorbent tubes and vesicant vapours sampled; Lewisite I and II reacted with the thiols while sulfur mustard and Lewisite III did not. Statistical experimental design was used to optimise thermal desorption parameters and the optimum method used to determine vesicant compounds in headspace samples taken from a decontamination trial. 3,4-Dimercaptotoluene reacted with Lewisites I and II to give a common derivative with a limit of detection (LOD) of 260 microg m(-3), while the butanethiol gave distinct derivatives with limits of detection around 30 microg m(-3).

  9. Optimization of the omega-3 extraction as a functional food from flaxseed.

    PubMed

    Hassan-Zadeh, A; Sahari, M A; Barzegar, M

    2008-09-01

    The fatty acid content, total lipid, refractive index, peroxide, iodine, acid and saponification values of Iranian linseed oil (Linum usitatissimum) were studied. For optimization of the extraction conditions, the oil was extracted by solvents (petroleum benzene and methanol-water-petroleum benzene) in 1:2, 1:3 and 1:4 ratios for 2, 5 and 8 h. Then its fatty acid content, omega-3 content and extraction yield were determined. According to the statistical analysis, petroleum benzene in a ratio of 1:3 for 5 h was chosen for its higher fatty acid content, extraction yield, and economic feasibility. For preservation of the omega-3 ingredients, oil with the specified characteristics containing 46.8% omega-3 was kept under a nitrogen atmosphere at -30 degrees C for 0, 7, 30, 60 and 90 days and its peroxide value was determined. Statistical analysis showed a significant difference in the average peroxide value only in the first 7 days of storage, and its increase (8.30%) conformed to the international standard.

  10. Phase Space Dissimilarity Measures for Structural Health Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bubacz, Jacob A; Chmielewski, Hana T; Pape, Alexander E

    A novel method for structural health monitoring (SHM), known as the Phase Space Dissimilarity Measures (PSDM) approach, is proposed and developed. The patented PSDM approach has already been developed and demonstrated for a variety of equipment and biomedical applications. Here, we investigate SHM of bridges via analysis of time serial accelerometer measurements. This work has four aspects. The first is algorithm scalability, which was found to scale linearly from one processing core to four cores. Second, the same data are analyzed to determine how the use of the PSDM approach affects sensor placement. We found that a relatively low-density placement sufficiently captures the dynamics of the structure. Third, the same data are analyzed by unique combinations of accelerometer axes (vertical, longitudinal, and lateral with respect to the bridge) to determine how the choice of axes affects the analysis. The vertical axis is found to provide satisfactory SHM data. Fourth, statistical methods were investigated to validate the PSDM approach for this application, yielding statistically significant results.

  11. Nurses' autonomy level in teaching hospitals and its relationship with the underlying factors.

    PubMed

    Amini, Kourosh; Negarandeh, Reza; Ramezani-Badr, Farhad; Moosaeifard, Mahdi; Fallah, Ramezan

    2015-02-01

    This study aimed to determine the autonomy level of nurses in hospitals affiliated with Zanjan University of Medical Sciences, Iran. In this descriptive cross-sectional study, 252 subjects were recruited using a systematic random sampling method. The data were collected using a questionnaire that included the Dempster Practice Behavior Scale. Descriptive statistics were used for data analysis, and t-tests and analysis of variance were used to compare the overall score and its subscales according to the demographic variables. The nurses in this study had medium professional autonomy. Statistical tests showed significant differences in the research sample according to age, gender, work experience, working position and place of work. The results of this study revealed that most of the nurses who participated in the study have lower professional autonomy than reported in Western societies. More studies are needed to determine the factors related to this difference and how Iranian nurses' autonomy can be promoted. © 2013 Wiley Publishing Asia Pty Ltd.

  12. Comparison of the flexural strength of six reinforced restorative materials.

    PubMed

    Cohen, B I; Volovich, Y; Musikant, B L; Deutsch, A S

    2001-01-01

    This study calculated the flexural strength for six reinforced restorative materials and demonstrated that flexural strength values can be determined simply by using physical parameters (diametral tensile strength and Young's modulus values) that are easily determined experimentally. A one-way ANOVA analysis demonstrated a statistically significant difference between the two reinforced glass ionomers and the four composite resin materials, with the composite resin being stronger than the glass ionomers.
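
    A minimal sketch of the kind of one-way ANOVA comparison described above, using hypothetical flexural-strength values for three illustrative material groups (not the study's data); SciPy is assumed.

        from scipy import stats

        # Hypothetical flexural-strength measurements (MPa) for three restorative materials
        glass_ionomer_a = [18.2, 20.1, 19.5, 17.8, 21.0]
        composite_a     = [95.4, 102.3, 88.7, 110.2, 99.1]
        composite_b     = [105.0, 98.6, 112.4, 101.7, 108.9]

        # One-way ANOVA: is there a significant difference among group means?
        f_stat, p_value = stats.f_oneway(glass_ionomer_a, composite_a, composite_b)
        print(f"F = {f_stat:.2f}, p = {p_value:.4g}")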

  13. Probabilistic Component Mode Synthesis of Nondeterministic Substructures

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1996-01-01

    Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine probabilities of a desired response characteristic, such as natural frequency, without using expensive Monte Carlo simulations. One of the problems in these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. We present a method in which the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented that illustrates the theory.

  14. Hypothesis testing for band size detection of high-dimensional banded precision matrices.

    PubMed

    An, Baiguo; Guo, Jianhua; Liu, Yufeng

    2014-06-01

    Many statistical analysis procedures require a good estimator for a high-dimensional covariance matrix or its inverse, the precision matrix. When the precision matrix is banded, the Cholesky-based method often yields a good estimator of the precision matrix. One important aspect of this method is determination of the band size of the precision matrix. In practice, crossvalidation is commonly used; however, we show that crossvalidation not only is computationally intensive but can be very unstable. In this paper, we propose a new hypothesis testing procedure to determine the band size in high dimensions. Our proposed test statistic is shown to be asymptotically normal under the null hypothesis, and its theoretical power is studied. Numerical examples demonstrate the effectiveness of our testing procedure.

  15. Coherent spectroscopic methods for monitoring pathogens, genetically modified products and nanostructured materials in colloidal solution

    NASA Astrophysics Data System (ADS)

    Moguilnaya, T.; Suminov, Y.; Botikov, A.; Ignatov, S.; Kononenko, A.; Agibalov, A.

    2017-01-01

    We developed a new automatic method that combines forced luminescence and stimulated Brillouin scattering. This method is used for monitoring pathogens, genetically modified products and nanostructured materials in colloidal solution. We carried out a statistical spectral analysis of pathogens, genetically modified soy and silver nanoparticles in water from different regions in order to determine the statistical errors of the method. We studied the spectral characteristics of these objects in water to perform the initial identification with 95% probability. These results were used to create a model of a device for monitoring pathogenic organisms and a working model of a device for detecting genetically modified soy in meat.

  16. Research on the raw data processing method of the hydropower construction project

    NASA Astrophysics Data System (ADS)

    Tian, Zhichao

    2018-01-01

    Based on the characteristics of fixed (quota) data, this paper compares various mathematical-statistical analysis methods and chooses an improved Grubbs criterion to analyze the data, using it to screen out data that are not suitable for processing. It is shown that this method can be applied to the processing of fixed raw data. The paper provides a reference for reasonably determining effective quota analysis data.

  17. Incidence, prevalence and genetic determinants of neonatal diabetes mellitus: a systematic review and meta-analysis protocol.

    PubMed

    Nansseu, Jobert Richie N; Ngo-Um, Suzanne S; Balti, Eric V

    2016-11-10

    In the absence of existing data, the present review intends to determine the incidence, prevalence and/or genetic determinants of neonatal diabetes mellitus (NDM), with an expected contribution to disease characterization. We will include cross-sectional, cohort or case-control studies which have reported the incidence, prevalence and/or genetic determinants of NDM between January 01, 2000 and May 31, 2016, published in English or French and without any geographical limitation. PubMed and EMBASE will be extensively screened to identify potentially eligible studies, complemented by manual search. Two authors will independently screen and select studies, extract data, and assess the risk of bias; disagreements will be resolved by consensus. Clinical heterogeneity will be investigated by examining the design and setting (including geographic region), the procedure used for genetic testing, the calculation of incidence or prevalence, and the outcomes in each study. Studies found to be clinically homogeneous will be pooled together through a random-effects meta-analysis. Statistical heterogeneity will be assessed using the chi-square test of homogeneity and quantified using the I² statistic. In case of substantial heterogeneity, subgroup analyses will be undertaken. Publication bias will be assessed with funnel plots, complemented with the use of Egger's test of bias. This systematic review and meta-analysis is expected to draw a clear picture of phenotypic and genotypic presentations of NDM in order to better understand the condition and adequately address challenges with respect to its management. PROSPERO CRD42016039765.
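
    A minimal sketch of the heterogeneity assessment the protocol describes, computing Cochran's Q, its chi-square p-value, and the I² statistic from inverse-variance weights; the effect estimates and variances below are placeholders, not review data.

        import numpy as np
        from scipy.stats import chi2

        def heterogeneity(effects, variances):
            """Cochran's Q and I^2 for a set of study effect estimates."""
            effects, variances = np.asarray(effects), np.asarray(variances)
            w = 1.0 / variances                          # inverse-variance weights
            pooled = np.sum(w * effects) / np.sum(w)     # fixed-effect pooled estimate
            q = np.sum(w * (effects - pooled) ** 2)      # Cochran's Q
            df = effects.size - 1
            i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
            p = chi2.sf(q, df)                           # chi-square test of homogeneity
            return q, p, i2

        q, p, i2 = heterogeneity([0.12, 0.30, 0.25, 0.05], [0.010, 0.020, 0.015, 0.030])
        print(f"Q = {q:.2f}, p = {p:.3f}, I^2 = {i2:.1f}%")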

  18. Influence of Structural Features and Fracture Processes on Surface Roughness: A Case Study from the Krosno Sandstones of the Górka-Mucharz Quarry (Little Beskids, Southern Poland)

    NASA Astrophysics Data System (ADS)

    Pieczara, Łukasz

    2015-09-01

    The paper presents the results of analysis of surface roughness parameters in the Krosno Sandstones of Mucharz, southern Poland. It was aimed at determining whether these parameters are influenced by structural features (mainly the laminar distribution of mineral components and directional distribution of non-isometric grains) and fracture processes. The tests applied in the analysis enabled us to determine and describe the primary statistical parameters used in the quantitative description of surface roughness, as well as specify the usefulness of contact profilometry as a method of visualizing spatial differentiation of fracture processes in rocks. These aims were achieved by selecting a model material (Krosno Sandstones from the Górka-Mucharz Quarry) and an appropriate research methodology. The schedule of laboratory analyses included: identification analyses connected with non-destructive ultrasonic tests, aimed at the preliminary determination of rock anisotropy, strength point load tests (cleaved surfaces were obtained due to destruction of rock samples), microscopic analysis (observation of thin sections in order to determine the mechanism of inducing fracture processes) and a test method of measuring surface roughness (two- and three-dimensional diagrams, topographic and contour maps, and statistical parameters of surface roughness). The highest values of roughness indicators were achieved for surfaces formed under the influence of intragranular fracture processes (cracks propagating directly through grains). This is related to the structural features of the Krosno Sandstones (distribution of lamination and bedding).

  19. Determination of reference ranges for elements in human scalp hair.

    PubMed

    Druyan, M E; Bass, D; Puchyr, R; Urek, K; Quig, D; Harmon, E; Marquardt, W

    1998-06-01

    Expected values, reference ranges, or reference limits are necessary to enable clinicians to apply analytical chemical data in the delivery of health care. Determination of reference ranges is not straightforward in terms of either selecting a reference population or performing statistical analysis. In light of logistical, scientific, and economic obstacles, it is understandable that clinical laboratories often combine approaches in developing health-associated reference values. A laboratory may choose to: 1. Validate either reference ranges of other laboratories or published data from clinical research, or both, through comparison with patients' test data. 2. Base the laboratory's reference values on statistical analysis of results from specimens assayed by the clinical reference laboratory itself. 3. Adopt standards or recommendations of regulatory agencies and governmental bodies. 4. Initiate population studies to validate transferred reference ranges or to determine them anew. Effects of external contamination and anecdotal information from clinicians may be considered. The clinical utility of hair analysis is well accepted for some elements. For others, it remains in the realm of clinical investigation. This article elucidates an approach for establishment of reference ranges for elements in human scalp hair. Observed levels of analytes from hair specimens from both our laboratory's total patient population and from a physician-defined healthy American population have been evaluated. Examination of levels of elements often associated with toxicity serves to exemplify the process of determining reference ranges in hair. In addition, the approach serves as a model for setting reference ranges for analytes in a variety of matrices.

  20. Estimating short-run and long-run interaction mechanisms in interictal state.

    PubMed

    Ozkaya, Ata; Korürek, Mehmet

    2010-04-01

    We address the issue of analyzing electroencephalogram (EEG) from seizure patients in order to test, model and determine the statistical properties that distinguish between EEG states (interictal, pre-ictal, ictal) by introducing a new class of time series analysis methods. In the present study: firstly, we employ statistical methods to determine the non-stationary behavior of focal interictal epileptiform series within very short time intervals; secondly, for such intervals that are deemed non-stationary we suggest the concept of Autoregressive Integrated Moving Average (ARIMA) process modelling, well known in time series analysis. We finally address the queries of causal relationships between epileptic states and between brain areas during epileptiform activity. We estimate the interaction between different EEG series (channels) in short time intervals by performing Granger-causality analysis and also estimate such interaction in long time intervals by employing Cointegration analysis, both analysis methods are well-known in econometrics. Here we find: first, that the causal relationship between neuronal assemblies can be identified according to the duration and the direction of their possible mutual influences; second, that although the estimated bidirectional causality in short time intervals yields that the neuronal ensembles positively affect each other, in long time intervals neither of them is affected (increasing amplitudes) from this relationship. Moreover, Cointegration analysis of the EEG series enables us to identify whether there is a causal link from the interictal state to ictal state.
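
    A minimal sketch of the two interaction analyses mentioned above, Granger causality for short-run influence and the Engle-Granger cointegration test for long-run relationships, applied to two synthetic signals standing in for EEG channels; statsmodels is assumed here rather than the authors' own tooling, and all data are simulated.

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests, coint

        rng = np.random.default_rng(0)
        n = 500
        x = np.cumsum(rng.normal(size=n))          # channel 1: nonstationary random walk
        y = np.zeros(n)
        y[1:] = 0.6 * x[:-1] + rng.normal(scale=0.5, size=n - 1)   # channel 2 follows channel 1 with a one-step lag

        # Short-run interaction: Granger causality on the differenced (stationary) series.
        # Column order is [effect, cause]: does x Granger-cause y?
        dxy = np.column_stack([np.diff(y), np.diff(x)])
        res = grangercausalitytests(dxy, maxlag=3, verbose=False)
        print("Granger p-value at lag 1:", round(res[1][0]["ssr_ftest"][1], 4))

        # Long-run interaction: Engle-Granger cointegration test on the levels.
        t_stat, p_value, _ = coint(y, x)
        print("Cointegration p-value:", round(p_value, 4))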

  1. The use of multicomponent statistical analysis in hydrogeological environmental research.

    PubMed

    Lambrakis, Nicolaos; Antonakos, Andreas; Panagopoulos, George

    2004-04-01

    The present article examines the possibilities of investigating NO(3)(-) spread in aquifers by applying multicomponent statistical methods (factor, cluster and discriminant analysis) to hydrogeological, hydrochemical, and environmental parameters. A 4-R-Mode factor model determined from the analysis showed its usefulness in investigating hydrogeological parameters affecting NO(3)(-) concentration, such as its dilution by upcoming groundwater of the recharge areas. The relationship between NO(3)(-) concentration and agricultural activities can be determined sufficiently by the first factor, which relies on NO(3)(-) and SO(4)(2-) of the same origin-that of agricultural fertilizers. The other three factors of the R-Mode analysis are not connected directly to the NO(3)(-) problem. They do, however, by extracting the role of the unsaturated zone, show an interesting relationship between organic matter content, thickness and saturated hydraulic conductivity. The application of Hierarchical Cluster Analysis, based on all possible combinations of classification method, showed two main groups of samples. The first group comprises samples from the edges and the second from the central part of the study area. The application of Discriminant Analysis showed that the NO(3)(-) and SO(4)(2-) ions are the most significant variables in the discriminant function. Therefore, the first group is considered to comprise all samples from areas not influenced by fertilizers, lying on the edges of contaminating activities such as crop cultivation, while the second comprises all the other samples.

  2. Evidence of nonextensive statistical physics behavior in the watershed distribution in active tectonic areas: examples from Greece

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos; Kouli, Maria

    2013-08-01

    The Digital Elevation Model (DEM) of the island of Crete, with a resolution of approximately 20 meters, was used to delineate watersheds by computing the flow direction and using it in the Watershed function. The Watershed function uses a raster of flow direction to determine contributing area. A routine Geographic Information Systems procedure was applied, and the watersheds as well as the stream network (using a threshold of 2000 cells, i.e. the minimum number of cells that constitute a stream) were extracted from the hydrologically corrected (free of sinks) DEM. A few thousand watersheds were delineated, and their areal extent was calculated. From these watersheds, 300 were finally selected for further analysis, as watersheds of extremely small area were excluded in order to avoid possible artifacts. Our analysis approach is based on the basic principles of complexity theory and the Tsallis entropy introduced in the frame of non-extensive statistical physics. This concept has been successfully used for the analysis of a variety of complex dynamic systems, including natural hazards, where fractality and long-range interactions are important. The analysis indicates that the statistical distribution of watersheds can be successfully described with the theoretical estimations of non-extensive statistical physics, implying the complexity that characterizes their occurrence.

  3. Is the spatial distribution of brain lesions associated with closed-head injury predictive of subsequent development of attention-deficit/hyperactivity disorder? Analysis with brain-image database

    NASA Technical Reports Server (NTRS)

    Herskovits, E. H.; Megalooikonomou, V.; Davatzikos, C.; Chen, A.; Bryan, R. N.; Gerring, J. P.

    1999-01-01

    PURPOSE: To determine whether there is an association between the spatial distribution of lesions detected at magnetic resonance (MR) imaging of the brain in children after closed-head injury and the development of secondary attention-deficit/hyperactivity disorder (ADHD). MATERIALS AND METHODS: Data obtained from 76 children without prior history of ADHD were analyzed. MR images were obtained 3 months after closed-head injury. After manual delineation of lesions, images were registered to the Talairach coordinate system. For each subject, registered images and secondary ADHD status were integrated into a brain-image database, which contains depiction (visualization) and statistical analysis software. Using this database, we assessed visually the spatial distributions of lesions and performed statistical analysis of image and clinical variables. RESULTS: Of the 76 children, 15 developed secondary ADHD. Depiction of the data suggested that children who developed secondary ADHD had more lesions in the right putamen than children who did not develop secondary ADHD; this impression was confirmed statistically. After Bonferroni correction, we could not demonstrate significant differences between secondary ADHD status and lesion burdens for the right caudate nucleus or the right globus pallidus. CONCLUSION: Closed-head injury-induced lesions in the right putamen in children are associated with subsequent development of secondary ADHD. Depiction software is useful in guiding statistical analysis of image data.

  4. Statistics of Data Fitting: Flaws and Fixes of Polynomial Analysis of Channeled Spectra

    NASA Astrophysics Data System (ADS)

    Karstens, William; Smith, David

    2013-03-01

    Starting from general statistical principles, we have critically examined Baumeister's procedure* for determining the refractive index of thin films from channeled spectra. Briefly, the method assumes that the index and interference fringe order may be approximated by polynomials quadratic and cubic in photon energy, respectively. The coefficients of the polynomials are related by differentiation, which is equivalent to comparing energy differences between fringes. However, we find that when the fringe order is calculated from the published IR index for silicon* and then analyzed with Baumeister's procedure, the results do not reproduce the original index. This problem has been traced to 1. Use of unphysical powers in the polynomials (e.g., time-reversal invariance requires that the index is an even function of photon energy), and 2. Use of insufficient terms of the correct parity. Exclusion of unphysical terms and addition of quartic and quintic terms to the index and order polynomials yields significantly better fits with fewer parameters. This represents a specific example of using statistics to determine if the assumed fitting model adequately captures the physics contained in experimental data. The use of analysis of variance (ANOVA) and the Durbin-Watson statistic to test criteria for the validity of least-squares fitting will be discussed. *D.F. Edwards and E. Ochoa, Appl. Opt. 19, 4130 (1980). Supported in part by the US Department of Energy, Office of Nuclear Physics under contract DE-AC02-06CH11357.
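
    A minimal sketch of the parity-constrained fit advocated above: the refractive index is modeled with even powers of photon energy only, and residual autocorrelation is checked with the Durbin-Watson statistic. The "true" index coefficients and noise level are hypothetical, and statsmodels is assumed for the Durbin-Watson computation.

        import numpy as np
        from statsmodels.stats.stattools import durbin_watson

        rng = np.random.default_rng(3)
        E = np.linspace(0.5, 1.5, 60)                        # photon energy (arbitrary units)
        n_true = 3.42 + 0.05 * E**2 + 0.01 * E**4            # hypothetical even-parity index
        n_meas = n_true + rng.normal(0.0, 0.002, E.size)     # add measurement noise

        # Design matrix with even powers only (time-reversal invariance argument)
        X = np.column_stack([np.ones_like(E), E**2, E**4])
        coef, *_ = np.linalg.lstsq(X, n_meas, rcond=None)
        resid = n_meas - X @ coef

        print("coefficients:", np.round(coef, 4))
        print("Durbin-Watson:", round(durbin_watson(resid), 2))   # near 2 => little residual autocorrelation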

  5. Kolmogorov-Smirnov statistical test for analysis of ZAP-70 expression in B-CLL, compared with quantitative PCR and IgV(H) mutation status.

    PubMed

    Van Bockstaele, Femke; Janssens, Ann; Piette, Anne; Callewaert, Filip; Pede, Valerie; Offner, Fritz; Verhasselt, Bruno; Philippé, Jan

    2006-07-15

    ZAP-70 has been proposed as a surrogate marker for immunoglobulin heavy-chain variable region (IgV(H)) mutation status, which is known as a prognostic marker in B-cell chronic lymphocytic leukemia (CLL). The flow cytometric analysis of ZAP-70 suffers from difficulties in standardization and interpretation. We applied the Kolmogorov-Smirnov (KS) statistical test to make analysis more straightforward. We examined ZAP-70 expression by flow cytometry in 53 patients with CLL. Analysis was performed as initially described by Crespo et al. (New England J Med 2003; 348:1764-1775) and alternatively by application of the KS statistical test comparing T cells with B cells. Receiver-operating-characteristics (ROC)-curve analyses were performed to determine the optimal cut-off values for ZAP-70 measured by the two approaches. ZAP-70 protein expression was compared with ZAP-70 mRNA expression measured by a quantitative PCR (qPCR) and with the IgV(H) mutation status. Both flow cytometric analyses correlated well with the molecular technique and proved to be of equal value in predicting the IgV(H) mutation status. Applying the KS test is reproducible, simple, straightforward, and overcomes a number of difficulties encountered in the Crespo-method. The KS statistical test is an essential part of the software delivered with modern routine analytical flow cytometers and is well suited for analysis of ZAP-70 expression in CLL. (c) 2006 International Society for Analytical Cytology.
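
    A minimal sketch of the Kolmogorov-Smirnov comparison described above, comparing a B-cell fluorescence-intensity distribution against the T-cell reference within the same sample; the intensity arrays are simulated placeholders rather than cytometry data, and SciPy is assumed.

        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(42)
        t_cell_intensity = rng.lognormal(mean=3.0, sigma=0.4, size=5000)   # internal reference population
        b_cell_intensity = rng.lognormal(mean=2.6, sigma=0.4, size=5000)   # patient B cells

        # Two-sample KS test: how far apart are the two intensity distributions?
        d_stat, p_value = ks_2samp(b_cell_intensity, t_cell_intensity)
        print(f"KS D = {d_stat:.3f}, p = {p_value:.3g}")
        # A large D indicates the B-cell distribution differs markedly from the T-cell reference.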

  6. Local image statistics: maximum-entropy constructions and perceptual salience

    PubMed Central

    Victor, Jonathan D.; Conte, Mary M.

    2012-01-01

    The space of visual signals is high-dimensional and natural visual images have a highly complex statistical structure. While many studies suggest that only a limited number of image statistics are used for perceptual judgments, a full understanding of visual function requires analysis not only of the impact of individual image statistics, but also, how they interact. In natural images, these statistical elements (luminance distributions, correlations of low and high order, edges, occlusions, etc.) are intermixed, and their effects are difficult to disentangle. Thus, there is a need for construction of stimuli in which one or more statistical elements are introduced in a controlled fashion, so that their individual and joint contributions can be analyzed. With this as motivation, we present algorithms to construct synthetic images in which local image statistics—including luminance distributions, pair-wise correlations, and higher-order correlations—are explicitly specified and all other statistics are determined implicitly by maximum-entropy. We then apply this approach to measure the sensitivity of the human visual system to local image statistics and to sample their interactions. PMID:22751397

  7. Probabilistic finite elements for fatigue and fracture analysis

    NASA Astrophysics Data System (ADS)

    Belytschko, Ted; Liu, Wing Kam

    Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variances and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.

  8. Probabilistic finite elements for fatigue and fracture analysis

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Liu, Wing Kam

    1992-01-01

    Attenuation is focused on the development of Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and its application to linear, nonlinear structural mechanics problems and fracture mechanics problems. The computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of a curvilinear fatigue crack growth. The existing PFEM's have been applied to solve for two types of problems: (1) determination of the response uncertainty in terms of the means, variance and correlation coefficients; and (2) determination the probability of failure associated with prescribed limit states.

  9. A statistical approach to evaluate the performance of cardiac biomarkers in predicting death due to acute myocardial infarction: time-dependent ROC curve

    PubMed

    Karaismailoğlu, Eda; Dikmen, Zeliha Günnur; Akbıyık, Filiz; Karaağaoğlu, Ahmet Ergun

    2018-04-30

    Background/aim: Myoglobin, cardiac troponin T, B-type natriuretic peptide (BNP), and creatine kinase isoenzyme MB (CK-MB) are frequently used biomarkers for evaluating the risk of patients admitted to an emergency department with chest pain. Recently, time-dependent receiver operating characteristic (ROC) analysis has been used to evaluate the predictive power of biomarkers where disease status can change over time. We aimed to determine the best set of biomarkers for estimating cardiac death during follow-up. We also obtained optimal cut-off values of these biomarkers, which differentiate between patients with and without risk of death. A web tool was developed to estimate time intervals in risk. Materials and methods: A total of 410 patients admitted to the emergency department with chest pain and shortness of breath were included. Cox regression analysis was used to determine an optimal set of biomarkers for estimating cardiac death and to combine the significant biomarkers. Time-dependent ROC analysis was performed to evaluate the performance of the significant biomarkers and of a combined biomarker over 240 h. The bootstrap method was used to compare statistical significance, and the Youden index was used to determine optimal cut-off values. Results: Myoglobin and BNP were significant by multivariate Cox regression analysis. Areas under the time-dependent ROC curves of myoglobin and BNP were about 0.80 during 240 h, and that of the combined biomarker (myoglobin + BNP) increased to 0.90 during the first 180 h. Conclusion: Although myoglobin is not clinically specific to a cardiac event, in our study both myoglobin and BNP were found to be statistically significant for estimating cardiac death. Using this combined biomarker may increase the power of prediction. Our web tool can be useful for evaluating the risk status of new patients and helping clinicians make decisions.
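
    A minimal sketch of one step in this kind of analysis, choosing an optimal biomarker cut-off with the Youden index from an ordinary ROC curve; a full time-dependent ROC analysis requires survival-specific tooling, so this only illustrates the cut-off logic on simulated marker values (not the study's data), with scikit-learn assumed.

        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        rng = np.random.default_rng(7)
        # Simulated biomarker values: higher on average in patients who died (label 1)
        y_true = np.r_[np.zeros(300, dtype=int), np.ones(100, dtype=int)]
        marker = np.r_[rng.normal(50, 15, 300), rng.normal(80, 20, 100)]

        fpr, tpr, thresholds = roc_curve(y_true, marker)
        youden_j = tpr - fpr                      # Youden index at each threshold
        best = np.argmax(youden_j)
        print(f"AUC = {roc_auc_score(y_true, marker):.3f}")
        print(f"Optimal cut-off = {thresholds[best]:.1f} (J = {youden_j[best]:.2f})")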

  10. Critical discussion of evaluation parameters for inter-observer variability in target definition for radiation therapy.

    PubMed

    Fotina, I; Lütgendorf-Caucig, C; Stock, M; Pötter, R; Georg, D

    2012-02-01

    Inter-observer studies represent a valid method for the evaluation of target definition uncertainties and contouring guidelines. However, data from the literature do not yet give clear guidelines for reporting contouring variability. Thus, the purpose of this work was to compare and discuss various methods to determine variability on the basis of clinical cases and a literature review. In this study, 7 prostate and 8 lung cases were contoured on CT images by 8 experienced observers. Analysis of variability included descriptive statistics, calculation of overlap measures, and statistical measures of agreement. Cross tables with ratios and correlations were established for overlap parameters. It was shown that the minimal set of parameters to be reported should include at least one of three volume overlap measures (i.e., generalized conformity index, Jaccard coefficient, or conformation number). High correlation between these parameters and scatter of the results was observed. A combination of descriptive statistics, overlap measure, and statistical measure of agreement or reliability analysis is required to fully report the interrater variability in delineation.

  11. Assessing landslide susceptibility by statistical data analysis and GIS: the case of Daunia (Apulian Apennines, Italy)

    NASA Astrophysics Data System (ADS)

    Ceppi, C.; Mancini, F.; Ritrovato, G.

    2009-04-01

    This study aims at landslide susceptibility mapping within an area of Daunia (Apulian Apennines, Italy) by a multivariate statistical method and data manipulation in a Geographical Information System (GIS) environment. Among the variety of existing statistical data analysis techniques, logistic regression was chosen to produce a susceptibility map over an area where small settlements are historically threatened by landslide phenomena. With logistic regression, a best fit between the presence or absence of landslides (dependent variable) and the set of independent variables is performed on the basis of a maximum likelihood criterion, leading to the estimation of regression coefficients. The reliability of such an analysis is therefore due to the ability to quantify the proneness to landslide occurrence through the probability level produced by the analysis. The inventory of dependent and independent variables was managed in a GIS, where geometric properties and attributes were translated into raster cells in order to proceed with the logistic regression by means of the SPSS (Statistical Package for the Social Sciences) package. A landslide inventory was used to produce the binary dependent variable, whereas the independent variables concerned slope, aspect, elevation, curvature, drained area, lithology and land use, after their reduction to dummy variables. The effect of the independent parameters on landslide occurrence was assessed by the corresponding coefficient in the logistic regression function, highlighting a major role played by the land use variable in determining the occurrence and distribution of phenomena. Once the outcomes of the logistic regression were determined, data were re-introduced in the GIS to produce a map reporting the proneness to landslide as a predicted level of probability. As validation of the results and the regression model, a cell-by-cell comparison between the susceptibility map and the initial inventory of landslide events was performed and an agreement at the 75% level was achieved.
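
    A minimal sketch of the logistic-regression step described above: landslide presence/absence is regressed on raster-derived predictors and the fitted probability is read as a per-cell susceptibility score. The predictor names and data are synthetic stand-ins (not the Daunia dataset), and scikit-learn is used here in place of the authors' SPSS workflow.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(11)
        n_cells = 2000
        slope = rng.uniform(0, 35, n_cells)                # degrees
        elevation = rng.uniform(200, 900, n_cells)         # metres
        landuse_arable = rng.integers(0, 2, n_cells)       # dummy variable for one land-use class

        # Synthetic "truth": steeper, arable cells are more landslide-prone
        logit = -4 + 0.12 * slope + 1.0 * landuse_arable
        presence = rng.random(n_cells) < 1 / (1 + np.exp(-logit))

        X = np.column_stack([slope, elevation, landuse_arable])
        model = LogisticRegression(max_iter=1000).fit(X, presence)
        susceptibility = model.predict_proba(X)[:, 1]      # per-cell probability of landslide
        print("coefficients:", np.round(model.coef_[0], 3))
        print("mean susceptibility:", round(susceptibility.mean(), 3))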

  12. Standardised method of determining vibratory perception thresholds for diagnosis and screening in neurological investigation.

    PubMed Central

    Goldberg, J M; Lindblom, U

    1979-01-01

    Vibration threshold determinations were made by means of an electromagnetic vibrator at three sites (carpal, tibial, and tarsal), which were primarily selected for examining patients with polyneuropathy. Because of the vast variation demonstrated for both vibrator output and tissue damping, the thresholds were expressed in terms of the amplitude of stimulator movement measured by means of an accelerometer, instead of the applied voltage which is commonly used. Statistical analysis revealed a higher power of discrimination for amplitude measurements at all three stimulus sites. Digital read-out gave the best statistical result and was also most practical. Reference values obtained from 110 healthy males, 10 to 74 years of age, were highly correlated with age for both upper and lower extremities. The variance of the vibration perception threshold was less than that of the disappearance threshold, and determination of the perception threshold alone may be sufficient in most cases. PMID:501379

  13. Quantification and statistical significance analysis of group separation in NMR-based metabonomics studies

    PubMed Central

    Goodpaster, Aaron M.; Kennedy, Michael A.

    2015-01-01

    Currently, no standard metrics are used to quantify cluster separation in PCA or PLS-DA scores plots for metabonomics studies or to determine if cluster separation is statistically significant. Lack of such measures makes it virtually impossible to compare independent or inter-laboratory studies and can lead to confusion in the metabonomics literature when authors putatively identify metabolites distinguishing classes of samples based on visual and qualitative inspection of scores plots that exhibit marginal separation. While previous papers have addressed quantification of cluster separation in PCA scores plots, none have advocated routine use of a quantitative measure of separation that is supported by a standard and rigorous assessment of whether or not the cluster separation is statistically significant. Here quantification and statistical significance of separation of group centroids in PCA and PLS-DA scores plots are considered. The Mahalanobis distance is used to quantify the distance between group centroids, and the two-sample Hotelling's T2 test is computed for the data, related to an F-statistic, and then an F-test is applied to determine if the cluster separation is statistically significant. We demonstrate the value of this approach using four datasets containing various degrees of separation, ranging from groups that had no apparent visual cluster separation to groups that had no visual cluster overlap. Widespread adoption of such concrete metrics to quantify and evaluate the statistical significance of PCA and PLS-DA cluster separation would help standardize reporting of metabonomics data. PMID:26246647
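
    A minimal sketch of the separation metric described above: the Mahalanobis distance between two group centroids in a two-dimensional scores space and the corresponding two-sample Hotelling's T², converted to an F statistic for a significance test. The scores are simulated, not metabonomics data, and the standard textbook formulas are assumed.

        import numpy as np
        from scipy.stats import f as f_dist

        def hotelling_t2(a, b):
            """Two-sample Hotelling's T^2 and its F-test p-value for group separation."""
            n1, n2, p = len(a), len(b), a.shape[1]
            diff = a.mean(axis=0) - b.mean(axis=0)
            s_pooled = ((n1 - 1) * np.cov(a, rowvar=False) +
                        (n2 - 1) * np.cov(b, rowvar=False)) / (n1 + n2 - 2)   # pooled covariance
            d2 = diff @ np.linalg.inv(s_pooled) @ diff        # squared Mahalanobis distance
            t2 = (n1 * n2) / (n1 + n2) * d2
            f_stat = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
            p_val = f_dist.sf(f_stat, p, n1 + n2 - p - 1)
            return np.sqrt(d2), t2, p_val

        rng = np.random.default_rng(5)
        group1 = rng.normal([0, 0], 1.0, size=(20, 2))        # e.g., PC1/PC2 scores, control group
        group2 = rng.normal([2, 1], 1.0, size=(20, 2))        # treated group
        d, t2, p = hotelling_t2(group1, group2)
        print(f"Mahalanobis distance = {d:.2f}, T^2 = {t2:.1f}, p = {p:.3g}")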

  14. Luster measurements of lips treated with lipstick formulations.

    PubMed

    Yadav, Santosh; Issa, Nevine; Streuli, David; McMullen, Roger; Fares, Hani

    2011-01-01

    In this study, digital photography in combination with image analysis was used to measure the luster of several lipstick formulations containing varying amounts and types of polymers. A weighed amount of lipstick was applied to a mannequin's lips and the mannequin was illuminated by a uniform beam of a white light source. Digital images of the mannequin were captured with a high-resolution camera and the images were analyzed using image analysis software. Luster analysis was performed using the Stamm (L(Stamm)) and Reich-Robbins (L(R-R)) luster parameters. Statistical analysis was performed on each luster parameter (L(Stamm) and L(R-R)), peak height, and peak width. Peak heights for lipstick formulations containing 11% and 5% VP/eicosene copolymer were statistically different from those of the control. The L(Stamm) and L(R-R) parameters for the treatment containing 11% VP/eicosene copolymer were statistically different from those of the control. Based on the results obtained in this study, we are able to determine whether a polymer is a good pigment dispersant and contributes to the visually detected shine of a lipstick upon application. The methodology presented in this paper could serve as a tool for investigators to screen their ingredients for shine in lipstick formulations.

  15. The influence of control group reproduction on the statistical ...

    EPA Pesticide Factsheets

    Because of various Congressional mandates to protect the environment from endocrine disrupting chemicals (EDCs), the United States Environmental Protection Agency (USEPA) initiated the Endocrine Disruptor Screening Program. In the context of this framework, the Office of Research and Development within the USEPA developed the Medaka Extended One Generation Reproduction Test (MEOGRT) to characterize the endocrine action of a suspected EDC. One important endpoint of the MEOGRT is fecundity of breeding pairs of medaka. Power analyses were conducted to determine the number of replicates needed in proposed test designs and to determine the effects that varying reproductive parameters (e.g. mean fecundity, variance, and days with no egg production) will have on the statistical power of the test. A software tool, the MEOGRT Reproduction Power Analysis Tool, was developed to expedite these power analyses by both calculating estimates of the needed reproductive parameters (e.g. population mean and variance) and performing the power analysis under user specified scenarios. The manuscript illustrates how the reproductive performance of the control medaka that are used in a MEOGRT influence statistical power, and therefore the successful implementation of the protocol. Example scenarios, based upon medaka reproduction data collected at MED, are discussed that bolster the recommendation that facilities planning to implement the MEOGRT should have a culture of medaka with hi
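
    The MEOGRT Reproduction Power Analysis Tool itself is not reproduced here; the sketch below only illustrates the general simulation idea, drawing fecundity counts for control and exposed breeding pairs (including days with no egg production) and estimating the probability of detecting a specified reduction. The Poisson fecundity model, the Mann-Whitney comparison, and every parameter value are illustrative assumptions rather than features of the actual tool.

        import numpy as np
        from scipy.stats import mannwhitneyu

        def fecundity_power(n_pairs, mean_ctrl, effect, p_zero_day=0.1,
                            days=21, alpha=0.05, n_sim=1000, seed=2):
            """Estimate power to detect a relative reduction in mean eggs/pair/day."""
            rng = np.random.default_rng(seed)
            hits = 0
            for _ in range(n_sim):
                def simulate(mean_eggs):
                    eggs = rng.poisson(mean_eggs, size=(n_pairs, days))
                    eggs[rng.random((n_pairs, days)) < p_zero_day] = 0   # days with no egg production
                    return eggs.mean(axis=1)                             # per-pair mean fecundity
                ctrl = simulate(mean_ctrl)
                trt = simulate(mean_ctrl * (1 - effect))
                _, p = mannwhitneyu(ctrl, trt, alternative="greater")
                hits += p < alpha
            return hits / n_sim

        # Power to detect a 25% reduction in fecundity with 12 breeding pairs per group
        print(fecundity_power(n_pairs=12, mean_ctrl=20, effect=0.25))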

  16. Prognostic value of GLUT-1 expression in ovarian surface epithelial tumors: a morphometric study.

    PubMed

    Ozcan, Ayhan; Deveci, Mehmet Salih; Oztas, Emin; Dede, Murat; Yenen, Mufit Cemal; Korgun, Emin Turkay; Gunhan, Omer

    2005-08-01

    To investigate the reported increase in the expression of the glucose transporter GLUT-1 in borderline and malignant ovarian epithelial tumors and its relationship to prognosis. In this study, areas in which immunohistochemical membranous staining with GLUT-1 were most evident were selected, and the proportions of GLUT-1 expression in 46 benign, 11 borderline and 42 malignant cases of ovarian epithelial tumors were determined quantitatively with a computer and Zeiss Vision KS 400 3.0 (Göttingen, Germany) for Windows (Microsoft, Redmond, Washington, U.S.A.) image analysis. GLUT-1 expression was determined in all borderline tumors (11 of 11) and in 97.6% of malignant tumors (41 of 42). No GLUT-1 expression was observed in benign tumors. The intensity of GLUT-1 staining was lower in borderline tumors than in malignant cases. This was statistically significant (p = 0.005). As differentiation in malignant tumors increased, proportions of GLUT-1 expression showed a relative increase, but this difference was not statistically significant (p = 0.68). When GLUT-1 expression in borderline and malignant ovarian epithelial tumors was analyzed against prognosis, no statistically significant difference was identified. Assessment of GLUT-1 expression using the image analysis program was more reliable, with higher reproducibility than in previous studies.

  17. Determination of variables in the prediction of strontium distribution coefficients for selected sediments

    USGS Publications Warehouse

    Pace, M.N.; Rosentreter, J.J.; Bartholomay, R.C.

    2001-01-01

    Idaho State University and the US Geological Survey, in cooperation with the US Department of Energy, conducted a study to determine and evaluate strontium distribution coefficients (Kds) of subsurface materials at the Idaho National Engineering and Environmental Laboratory (INEEL). The Kds were determined to aid in assessing the variability of strontium Kds and their effects on chemical transport of strontium-90 in the Snake River Plain aquifer system. Data from batch experiments done to determine strontium Kds of five sediment-infill samples and six standard reference material samples were analyzed by using multiple linear regression analysis and the stepwise variable-selection method in the statistical program, Statistical Product and Service Solutions, to derive an equation of variables that can be used to predict strontium Kds of sediment-infill samples. The sediment-infill samples were from basalt vesicles and fractures from a selected core at the INEEL; strontium Kds ranged from about 201 to 356 ml g-1. The standard material samples consisted of clay minerals and calcite. The statistical analyses of the batch-experiment results showed that the amount of strontium in the initial solution, the amount of manganese oxide in the sample material, and the amount of potassium in the initial solution are the most important variables in predicting strontium Kds of sediment-infill samples.

  18. Markov Logic Networks in the Analysis of Genetic Data

    PubMed Central

    Sakhanenko, Nikita A.

    2010-01-01

    Abstract Complex, non-additive genetic interactions are common and can be critical in determining phenotypes. Genome-wide association studies (GWAS) and similar statistical studies of linkage data, however, assume additive models of gene interactions in looking for genotype-phenotype associations. These statistical methods view the compound effects of multiple genes on a phenotype as a sum of influences of each gene and often miss a substantial part of the heritable effect. Such methods do not use any biological knowledge about underlying mechanisms. Modeling approaches from the artificial intelligence (AI) field that incorporate deterministic knowledge into models to perform statistical analysis can be applied to include prior knowledge in genetic analysis. We chose to use the most general such approach, Markov Logic Networks (MLNs), for combining deterministic knowledge with statistical analysis. Using simple, logistic regression-type MLNs we can replicate the results of traditional statistical methods, but we also show that we are able to go beyond finding independent markers linked to a phenotype by using joint inference without an independence assumption. The method is applied to genetic data on yeast sporulation, a complex phenotype with gene interactions. In addition to detecting all of the previously identified loci associated with sporulation, our method identifies four loci with smaller effects. Since their effect on sporulation is small, these four loci were not detected with methods that do not account for dependence between markers due to gene interactions. We show how gene interactions can be detected using more complex models, which can be used as a general framework for incorporating systems biology with genetics. PMID:20958249

  19. The mediating effect of calling on the relationship between medical school students’ academic burnout and empathy

    PubMed Central

    2017-01-01

    Purpose This study is aimed at identifying the relationships between medical school students’ academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. Methods A mixed method study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students’ empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For qualitative approach, eight medical students participated in a focus group interview. Results The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. Conclusion This result demonstrates that calling is a key variable that mediates the relationship between medical students’ academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students’ empathy skills. PMID:28870019

  20. Quantifying the evolution of flow boiling bubbles by statistical testing and image analysis: toward a general model.

    PubMed

    Xiao, Qingtai; Xu, Jianxin; Wang, Hua

    2016-08-16

    A new index, the estimate of the error variance, which can be used to quantify the evolution of the flow patterns when multiphase components or tracers are difficult to distinguish, was proposed. The homogeneity degree of the luminance space distribution behind the viewing windows in the direct contact boiling heat transfer process was explored. Using image analysis and a linear statistical model, the F-test of the statistical analysis was used to test whether the light was uniform, and a non-linear method was used to determine the direction and position of a fixed source light. The experimental results showed that the inflection point of the new index was approximately equal to the mixing time. The new index has been extended and applied to a multiphase macro-mixing process by top blowing in a stirred tank. Moreover, a general quantifying model was introduced for demonstrating the relationship between the flow patterns of the bubble swarms and heat transfer. The results can be applied to investigate other mixing processes in which the target is very difficult to recognize.
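
    A minimal sketch of the uniformity check described above: pixel luminance is regressed on image coordinates with a linear model, the regression F-test judges whether a significant spatial gradient exists (non-uniform light), and the residual mean square gives an error-variance estimate. The exact model form used by the authors is not specified in the abstract, so the regression on x and y coordinates below is an assumed form, the pixel grid is simulated, and statsmodels is assumed.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(9)
        ny, nx = 60, 80
        ys, xs = np.mgrid[0:ny, 0:nx]
        # Simulated luminance: a weak left-to-right gradient plus noise
        luminance = 120 + 0.05 * xs + rng.normal(0, 3, size=(ny, nx))

        X = sm.add_constant(np.column_stack([xs.ravel(), ys.ravel()]))
        fit = sm.OLS(luminance.ravel(), X).fit()
        print(f"F = {fit.fvalue:.1f}, p = {fit.f_pvalue:.3g}")      # significant => non-uniform illumination
        print("residual (error) variance estimate:", round(fit.mse_resid, 2))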

  1. The problem of pseudoreplication in neuroscientific studies: is it affecting your analysis?

    PubMed Central

    2010-01-01

Background Pseudoreplication occurs when observations are not statistically independent, but treated as if they are. This can occur when there are multiple observations on the same subjects, when samples are nested or hierarchically organised, or when measurements are correlated in time or space. Analysis of such data without taking these dependencies into account can lead to meaningless results, and examples can easily be found in the neuroscience literature. Results A single issue of Nature Neuroscience provided a number of examples and is used as a case study to highlight how pseudoreplication arises in neuroscientific studies, why the analyses in these papers are incorrect, and which analytical methods would have been appropriate. 12% of papers had pseudoreplication and a further 36% were suspected of having pseudoreplication, but it was not possible to determine for certain because insufficient information was provided. Conclusions Pseudoreplication can undermine the conclusions of a statistical analysis, and it would be easier to detect if the sample size, degrees of freedom, the test statistic, and precise p-values were reported. This information should be a requirement for all publications. PMID:20074371

  2. Quantifying the evolution of flow boiling bubbles by statistical testing and image analysis: toward a general model

    PubMed Central

    Xiao, Qingtai; Xu, Jianxin; Wang, Hua

    2016-01-01

A new index, the estimate of the error variance, which can be used to quantify the evolution of the flow patterns when multiphase components or tracers are difficult to distinguish, was proposed. The homogeneity degree of the luminance space distribution behind the viewing windows in the direct contact boiling heat transfer process was explored. With image analysis and a linear statistical model, the F-test of the statistical analysis was used to test whether the light was uniform, and a non-linear method was used to determine the direction and position of a fixed source light. The experimental results showed that the inflection point of the new index was approximately equal to the mixing time. The new index has been generalized and applied to a multiphase macro mixing process by top blowing in a stirred tank. Moreover, a general quantifying model was introduced for demonstrating the relationship between the flow patterns of the bubble swarms and heat transfer. The results can be applied to investigate other mixing processes in which the target is very difficult to recognize. PMID:27527065

  3. [Organizational climate and burnout syndrome].

    PubMed

    Lubrańska, Anna

    2011-01-01

The paper addresses the issue of organizational climate and burnout syndrome. It has been assumed that burnout syndrome is dependent on work climate (organizational climate); therefore, two concepts were analyzed: that of D. Kolb (organizational climate) and that of Ch. Maslach (burnout syndrome). The research involved 239 persons (122 women, 117 men), aged 21-66. In the study the Maslach Burnout Inventory (MBI) and the Inventory of Organizational Climate were used. The results of statistical methods (correlation analysis, one-variable analysis of variance and regression analysis) evidenced a strong relationship between organizational climate and the burnout dimensions. As depicted by the results, there are important differences in the level of burnout between the study participants who work in different types of organizational climate. The results of the statistical analyses indicate that the organizational climate determines burnout syndrome. Therefore, creating supportive conditions at the workplace might reduce the risk of burnout.

  4. Statistical comparison of pooled nitrogen washout data of various altitude decompression response groups

    NASA Technical Reports Server (NTRS)

    Edwards, B. F.; Waligora, J. M.; Horrigan, D. J., Jr.

    1985-01-01

This analysis was done to determine whether various decompression response groups could be characterized by the pooled nitrogen (N2) washout profiles of the group members. Pooling individual washout profiles provided a smooth, time-dependent function of means representative of the decompression response group. No statistically significant differences were detected. The statistical comparisons of the profiles were performed by means of univariate weighted t-tests at each 5-minute profile point, with levels of significance of 5 and 10 percent. The estimated powers of the tests (i.e., probabilities) to detect the observed differences in the pooled profiles were of the order of 8 to 30 percent.

  5. Ion induced electron emission statistics under Agm- cluster bombardment of Ag

    NASA Astrophysics Data System (ADS)

    Breuers, A.; Penning, R.; Wucher, A.

    2018-05-01

The electron emission from a polycrystalline silver surface under bombardment with Agm- cluster ions (m = 1, 2, 3) is investigated in terms of ion induced kinetic excitation. The electron yield γ is determined directly by a current measurement method on the one hand and implicitly by the analysis of the electron emission statistics on the other hand. Successful measurements of the electron emission spectra provide a deeper understanding of the ion induced kinetic electron emission process, with particular emphasis on the effect of the projectile cluster size on the yield as well as on the emission statistics. The results allow a quantitative comparison to computer simulations performed for silver atoms and clusters impinging onto a silver surface.

  6. Statistical Analysis of the Polarimetric Cloud Analysis and Seeding Test (POLCAST) Field Projects

    NASA Astrophysics Data System (ADS)

    Ekness, Jamie Lynn

The North Dakota farming industry brings in more than $4.1 billion annually in cash receipts. Unfortunately, agriculture sales vary significantly from year to year, which is due in large part to weather events such as hail storms and droughts. One method to mitigate drought is to use hygroscopic seeding to increase the precipitation efficiency of clouds. The North Dakota Atmospheric Research Board (NDARB) sponsored the Polarimetric Cloud Analysis and Seeding Test (POLCAST) research project to determine the effectiveness of hygroscopic seeding in North Dakota. The POLCAST field projects obtained airborne and radar observations, while conducting randomized cloud seeding. The Thunderstorm Identification Tracking and Nowcasting (TITAN) program is used to analyze radar data (33 usable cases) in determining differences in the duration of the storm, rain rate and total rain amount between seeded and non-seeded clouds. The single ratio of seeded to non-seeded cases is 1.56 (0.28 mm/0.18 mm), or a 56% increase, for the average hourly rainfall during the first 60 minutes after target selection. A seeding effect is indicated, with the lifetime of the storms increasing by 41% between seeded and non-seeded clouds for the first 60 minutes past the seeding decision. A double ratio statistic, a comparison of the radar-derived rain amount of the last 40 minutes of a case (seed/non-seed) to the first 20 minutes (seed/non-seed), is used to account for the natural variability of the cloud system and gives a double ratio of 1.85. The Mann-Whitney test on the double ratio of seeded to non-seeded cases (33 cases) gives a significance (p-value) of 0.063. Bootstrapping analysis of the POLCAST data set indicates that 50 cases would provide statistically significant results based on the Mann-Whitney test of the double ratio. All the statistical analyses conducted on the POLCAST data set show that hygroscopic seeding in North Dakota does increase precipitation. While an additional POLCAST field project would be necessary to obtain the standardly accepted statistically significant result (p < 0.05) for the double ratio of precipitation amount, the obtained p-value of 0.063 is close, and considering the positive results from other hygroscopic seeding experiments, the North Dakota Cloud Modification Project should consider implementation of hygroscopic seeding.
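
    As a rough illustration of the double-ratio comparison and the bootstrap sample-size argument in this record, the sketch below applies a Mann-Whitney test to per-case ratios and resamples them to estimate how often significance would be reached at a larger case count. The per-case ratio values, arm sizes, and resampling scheme are invented stand-ins, not the POLCAST data.

```python
# Illustrative sketch (not the POLCAST analysis code).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# hypothetical per-case rain ratios (last 40 min / first 20 min) for seeded and control storms
seeded = rng.lognormal(mean=0.4, sigma=0.6, size=17)
control = rng.lognormal(mean=0.0, sigma=0.6, size=16)

u_stat, p_value = stats.mannwhitneyu(seeded, control, alternative="greater")
double_ratio = seeded.mean() / control.mean()
print(f"double ratio = {double_ratio:.2f}, Mann-Whitney p = {p_value:.3f}")

def bootstrap_power(a, b, n_per_arm, n_boot=2000, alpha=0.05):
    """Fraction of resampled experiments of the given size that reach p < alpha."""
    hits = 0
    for _ in range(n_boot):
        ra = rng.choice(a, size=n_per_arm, replace=True)
        rb = rng.choice(b, size=n_per_arm, replace=True)
        if stats.mannwhitneyu(ra, rb, alternative="greater").pvalue < alpha:
            hits += 1
    return hits / n_boot

print("estimated power at 25 cases per arm:", bootstrap_power(seeded, control, 25))
```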

  7. 10 CFR 431.17 - Determination of efficiency.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... characteristics of that basic model, and (ii) Based on engineering or statistical analysis, computer simulation or... simulation or modeling, and other analytic evaluation of performance data on which the AEDM is based... applied. (iii) If requested by the Department, the manufacturer shall conduct simulations to predict the...

  8. 10 CFR 431.17 - Determination of efficiency.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... characteristics of that basic model, and (ii) Based on engineering or statistical analysis, computer simulation or... simulation or modeling, and other analytic evaluation of performance data on which the AEDM is based... applied. (iii) If requested by the Department, the manufacturer shall conduct simulations to predict the...

  9. Thinking big: linking rivers to landscapes

    Treesearch

    Joan O’Callaghan; Ashley E. Steel; Kelly M. Burnett

    2012-01-01

Exploring relationships between landscape characteristics and rivers is an emerging field, enabled by the proliferation of satellite data, advances in statistical analysis, and increased emphasis on large-scale monitoring. Landscape features such as road networks, underlying geology, and human developments determine the characteristics of the rivers flowing through...

  10. Detection of semi-volatile organic compounds in permeable ...

    EPA Pesticide Factsheets

Abstract The Edison Environmental Center (EEC) has a research and demonstration permeable parking lot comprising three different permeable systems: permeable asphalt, porous concrete and interlocking concrete permeable pavers. Water quality and quantity analysis has been ongoing since January 2010. This paper describes a subset of the water quality analysis, the analysis of semivolatile organic compounds (SVOCs), to determine whether hydrocarbons were present in water infiltrated through the permeable surfaces. SVOCs were analyzed in samples collected on 11 dates over a 3-year period, from 2/8/2010 to 4/1/2013. Results are broadly divided into three categories: 42 chemicals were never detected; 12 chemicals (11 chemical tests) were detected at a rate of 10% or less; and 22 chemicals were detected at a frequency of 10% or greater (ranging from 10% to 66.5% detections). Fundamental and exploratory statistical analyses were performed on these latter results, grouped by surface type. The statistical analyses were limited due to the low frequency of detections and dilutions of samples, which impacted detection limits. The infiltrate data through the three permeable surfaces were analyzed as non-parametric data by the Kaplan-Meier estimation method for fundamental statistics; there were some statistically observable differences in concentration between pavement types when using the Tarone-Ware comparison hypothesis test. Additionally Spearman Rank order non-parame

  11. Orthotopic bladder substitution in men revisited: identification of continence predictors.

    PubMed

    Koraitim, M M; Atta, M A; Foda, M K

    2006-11-01

    We determined the impact of the functional characteristics of the neobladder and urethral sphincter on continence results, and determined the most significant predictors of continence. A total of 88 male patients 29 to 70 years old underwent orthotopic bladder substitution with tubularized ileocecal segment (40) and detubularized sigmoid (25) or ileum (23). Uroflowmetry, cystometry and urethral pressure profilometry were performed at 13 to 36 months (mean 19) postoperatively. The correlation between urinary continence and 28 urodynamic variables was assessed. Parameters that correlated significantly with continence were entered into a multivariate analysis using a logistic regression model to determine the most significant predictors of continence. Maximum urethral closure pressure was the only parameter that showed a statistically significant correlation with diurnal continence. Nocturnal continence had not only a statistically significant positive correlation with maximum urethral closure pressure, but also statistically significant negative correlations with maximum contraction amplitude, and baseline pressure at mid and maximum capacity. Three of these 4 parameters, including maximum urethral closure pressure, maximum contraction amplitude and baseline pressure at mid capacity, proved to be significant predictors of continence on multivariate analysis. While daytime continence is determined by maximum urethral closure pressure, during the night it is the net result of 2 forces that have about equal influence but in opposite directions, that is maximum urethral closure pressure vs maximum contraction amplitude plus baseline pressure at mid capacity. Two equations were derived from the logistic regression model to predict the probability of continence after orthotopic bladder substitution, including Z1 (diurnal) = 0.605 + 0.0085 maximum urethral closure pressure and Z2 (nocturnal) = 0.841 + 0.01 [maximum urethral closure pressure - (maximum contraction amplitude + baseline pressure at mid capacity)].
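
    The two equations quoted at the end of this record can be turned into probabilities directly. The small sketch below assumes the standard logistic link (probability = 1 / (1 + exp(-Z))), since the abstract reports the equations as coming from a logistic regression model; the pressure values plugged in are illustrative, not patient data.

```python
# Worked example using the Z1/Z2 equations from the abstract; logistic link assumed.
import math

def diurnal_continence_probability(mucp):
    """mucp: maximum urethral closure pressure (cm H2O)."""
    z1 = 0.605 + 0.0085 * mucp
    return 1.0 / (1.0 + math.exp(-z1))

def nocturnal_continence_probability(mucp, max_contraction, baseline_mid):
    """max_contraction: maximum contraction amplitude; baseline_mid: baseline pressure at mid capacity."""
    z2 = 0.841 + 0.01 * (mucp - (max_contraction + baseline_mid))
    return 1.0 / (1.0 + math.exp(-z2))

print(diurnal_continence_probability(mucp=60))                                        # ~0.75
print(nocturnal_continence_probability(mucp=60, max_contraction=25, baseline_mid=20)) # ~0.73
```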

  12. SU-E-T-247: Determinations of the Optimal Phase for Respiratory Gated Radiotherapy From Statistical Analysis Using a Visible Guidance System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oh, S; Yea, J; Kang, M

Purpose: Respiratory gated radiation therapy (RGRT) is used to minimize the radiation dose to normal tissue in lung cancer patients. Determination of the optimal point in the respiratory phase of a patient is important in RGRT but it is not easy. The goal of the present study was to see if a visible guidance system is helpful in determining the optimal phase in respiratory gated therapy. Methods: The breathing signals of 23 lung cancer patients were recorded with a Real-time Position Management (RPM) respiratory gating system (Varian, USA). The patients underwent breathing training with our visible guidance system, after which their breathing signals were recorded during 5 min of free breathing and 5 min of guided breathing. The breathing signals recorded between 3 and 5 min before and after training were compared. We performed statistical analysis of the breathing signals to find the optimal duty cycle in guided breathing for RGRT. Results: The breathing signals aided by the visible guidance system had more regular cycles over time and smaller variations in the positions of the marker block than the free breathing signals. Of the 23 lung cancer patients, 19 showed statistically significant differences by time when the values obtained before and after breathing were compared (p < 0.05); 30% and 40% of the duty cycle, respectively, was determined to be the most effective, and the corresponding phases were 30-60% (duty cycle, 30%; p < 0.05) and 30-70% (duty cycle, 40%; p < 0.05). Conclusion: Respiratory regularity was significantly improved with the use of the RPM with our visible guiding system; therefore, it would help improve the accuracy and efficiency of RGRT.

  13. A novel complete-case analysis to determine statistical significance between treatments in an intention-to-treat population of randomized clinical trials involving missing data.

    PubMed

    Liu, Wei; Ding, Jinhui

    2018-04-01

    The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.

  14. Practicality of Elementary Statistics Module Based on CTL Completed by Instructions on Using Software R

    NASA Astrophysics Data System (ADS)

    Delyana, H.; Rismen, S.; Handayani, S.

    2018-04-01

This research is development research using the 4-D design model (define, design, develop, and disseminate). The results of the define stage are analyzed for the following needs: syllabus analysis, textbook analysis, student characteristics analysis, and literature analysis. The textbook analysis showed that students still had difficulty understanding the two textbooks they are required to own, that the form of presentation did not help students independently discover concepts, and that the textbooks were not equipped with data-processing guidance using software R. The developed module was considered valid by the experts. Field trials were then conducted to determine its practicality and effectiveness. The trial was conducted with four randomly selected students of the Mathematics Education Study Program of STKIP PGRI who had not yet taken the Basic Statistics course. The practicality aspects considered were ease of use, time efficiency, ease of interpretation, and equivalence. The practicality scores for these aspects were 3.7, 3.79, 3.7, and 3.78, respectively. Based on the trial results, the students considered the module very practical for use in learning. This means that the developed module can be used by students in Elementary Statistics learning.

  15. Experimental design of an interlaboratory study for trace metal analysis of liquid fluids. [for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Greenbauer-Seng, L. A.

    1983-01-01

The accurate determination of trace metals in fuels is an important requirement in much of the research into and development of alternative fuels for aerospace applications. Recognizing the detrimental effects of certain metals on fuel performance and fuel systems at the part per million, and in some cases part per billion, levels requires improved accuracy in determining these low-concentration elements. Accurate analyses are also required to ensure interchangeability of analysis results between vendor, researcher, and end user for purposes of quality control. Previous interlaboratory studies have demonstrated the inability of different laboratories to agree on the results of metal analysis, particularly at low concentration levels, yet typically good precision is reported within a laboratory. An interlaboratory study was designed to gain statistical information about the sources of variation in the reported concentrations. Five participant laboratories were used on a fee basis and were not informed of the purpose of the analyses. The effects of laboratory, analytical technique, concentration level, and ashing additive were studied in four fuel types for 20 elements of interest. The prescribed sample preparation schemes (variations of dry ashing) were used by all of the laboratories. The analytical data were statistically evaluated using a computer program for the analysis of variance technique.

  16. A factor analysis of the SSQ (Speech, Spatial, and Qualities of Hearing Scale)

    PubMed Central

    2014-01-01

    Objective The speech, spatial, and qualities of hearing questionnaire (SSQ) is a self-report test of auditory disability. The 49 items ask how well a listener would do in many complex listening situations illustrative of real life. The scores on the items are often combined into the three main sections or into 10 pragmatic subscales. We report here a factor analysis of the SSQ that we conducted to further investigate its statistical properties and to determine its structure. Design Statistical factor analysis of questionnaire data, using parallel analysis to determine the number of factors to retain, oblique rotation of factors, and a bootstrap method to estimate the confidence intervals. Study sample 1220 people who have attended MRC IHR over the last decade. Results We found three clear factors, essentially corresponding to the three main sections of the SSQ. They are termed “speech understanding”, “spatial perception”, and “clarity, separation, and identification”. Thirty-five of the SSQ questions were included in the three factors. There was partial evidence for a fourth factor, “effort and concentration”, representing two more questions. Conclusions These results aid in the interpretation and application of the SSQ and indicate potential methods for generating average scores. PMID:24417459
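
    Parallel analysis, the retention criterion named in this record, is straightforward to sketch: components are kept only while their eigenvalues exceed those obtained from random data of the same shape. The 1220 x 49 data matrix below is simulated with three correlated item blocks purely for illustration; it is not the SSQ data set.

```python
# Minimal sketch of Horn's parallel analysis for deciding how many factors to retain.
import numpy as np

def parallel_analysis(data, n_iter=500, quantile=95, seed=0):
    """Number of components whose eigenvalue exceeds the chosen quantile from random data."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    observed = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    random_eigs = np.empty((n_iter, p))
    for i in range(n_iter):
        noise = rng.standard_normal((n, p))
        random_eigs[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False)))[::-1]
    threshold = np.percentile(random_eigs, quantile, axis=0)
    return int(np.sum(observed > threshold))

# Simulated stand-in data: 1220 respondents, 49 items, three correlated blocks of items.
rng = np.random.default_rng(1)
latent = rng.standard_normal((1220, 3))
loadings = np.zeros((3, 49))
loadings[0, :17] = loadings[1, 17:31] = loadings[2, 31:] = 0.8
items = latent @ loadings + rng.standard_normal((1220, 49))
print(parallel_analysis(items))   # expected to suggest about three factors
```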

  17. Multivariate frequency domain analysis of protein dynamics

    NASA Astrophysics Data System (ADS)

    Matsunaga, Yasuhiro; Fuchigami, Sotaro; Kidera, Akinori

    2009-03-01

    Multivariate frequency domain analysis (MFDA) is proposed to characterize collective vibrational dynamics of protein obtained by a molecular dynamics (MD) simulation. MFDA performs principal component analysis (PCA) for a bandpass filtered multivariate time series using the multitaper method of spectral estimation. By applying MFDA to MD trajectories of bovine pancreatic trypsin inhibitor, we determined the collective vibrational modes in the frequency domain, which were identified by their vibrational frequencies and eigenvectors. At near zero temperature, the vibrational modes determined by MFDA agreed well with those calculated by normal mode analysis. At 300 K, the vibrational modes exhibited characteristic features that were considerably different from the principal modes of the static distribution given by the standard PCA. The influences of aqueous environments were discussed based on two different sets of vibrational modes, one derived from a MD simulation in water and the other from a simulation in vacuum. Using the varimax rotation, an algorithm of the multivariate statistical analysis, the representative orthogonal set of eigenmodes was determined at each vibrational frequency.
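
    The core of the procedure described here, band-pass filtering a multivariate trajectory around a target frequency and then diagonalizing the covariance of the filtered series, can be sketched compactly. A Butterworth filter stands in for the multitaper spectral estimation, and the toy three-coordinate trajectory is synthetic, so this approximates the idea rather than the authors' MFDA implementation.

```python
# Approximate sketch: band-pass filtering followed by PCA on the filtered trajectory.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpassed_modes(trajectory, f_low, f_high, fs, n_modes=3):
    """trajectory: (n_frames, n_coordinates) array; fs and frequencies in consistent units."""
    b, a = butter(4, [f_low, f_high], btype="band", fs=fs)
    filtered = filtfilt(b, a, trajectory - trajectory.mean(axis=0), axis=0)
    cov = np.cov(filtered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_modes]
    return eigvals[order], eigvecs[:, order]      # mode variances and eigenvectors

# Toy trajectory: three coordinates oscillating at 2 cycles per unit time, sampled at fs = 100.
fs = 100.0
t = np.arange(0, 50, 1 / fs)
rng = np.random.default_rng(0)
traj = np.column_stack([np.sin(2 * np.pi * 2.0 * t + phase) for phase in (0.0, 0.5, 1.0)])
traj += 0.1 * rng.standard_normal(traj.shape)
variances, modes = bandpassed_modes(traj, f_low=1.5, f_high=2.5, fs=fs, n_modes=2)
print(variances)
```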

  18. Developing a suitable model for supplier selection based on supply chain risks: an empirical study from Iranian pharmaceutical companies.

    PubMed

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

The supply chain represents the critical link between the development of new products and the market in the pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies of scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to basically rethink the way they produce and distribute products, and also to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate the basic factors involved in risk analysis of the pharmaceutical industry, and also to determine the effective factors involved in supplier selection and their priorities. This paper is based on the results of a literature review, experts' opinion acquisition, statistical analysis and also using MADM models on data gathered from distributed questionnaires. The model consists of the following steps and components: first, the factors involved in supply chain risks are determined; based on them, a framework is constructed; and according to the results of the statistical analysis and the MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that delivery risk can make an important contribution to mitigating the risk of the pharmaceutical industry.

  19. Developing a Suitable Model for Supplier Selection Based on Supply Chain Risks: An Empirical Study from Iranian Pharmaceutical Companies

    PubMed Central

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

The supply chain represents the critical link between the development of new products and the market in the pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies of scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to basically rethink the way they produce and distribute products, and also to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate the basic factors involved in risk analysis of the pharmaceutical industry, and also to determine the effective factors involved in supplier selection and their priorities. This paper is based on the results of a literature review, experts' opinion acquisition, statistical analysis and also using MADM models on data gathered from distributed questionnaires. The model consists of the following steps and components: first, the factors involved in supply chain risks are determined; based on them, a framework is constructed; and according to the results of the statistical analysis and the MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that delivery risk can make an important contribution to mitigating the risk of the pharmaceutical industry. PMID:24250442

  20. Association between Hypertension and Epistaxis: Systematic Review and Meta-analysis.

    PubMed

    Min, Hyun Jin; Kang, Hyun; Choi, Geun Joo; Kim, Kyung Soo

    2017-12-01

Objective Whether there is an association or a cause-and-effect relationship between epistaxis and hypertension is a subject of longstanding controversy. The objective of this systematic review and meta-analysis was to determine the association between epistaxis and hypertension and to verify whether hypertension is an independent risk factor for epistaxis. Data Sources A comprehensive search was performed using the MEDLINE, EMBASE, and Cochrane Library databases. Review Methods The review was performed according to the Meta-analysis of Observational Studies in Epidemiology guidelines and reported using the Preferred Reporting Items for Systematic Reviews and Meta-analysis guidelines. Results We screened 2768 unique studies and selected 10 for this meta-analysis. Overall, the risk of epistaxis was significantly increased for patients with hypertension (odds ratio, 1.532 [95% confidence interval (CI), 1.181-1.986]; number needed to treat, 14.9 [95% CI, 12.3-19.0]). Results of the Q test and I² statistics suggested considerable heterogeneity (P = 0.038, I² = 49.3%). The sensitivity analysis was performed by excluding 1 study at a time, and it revealed no change in statistical significance. Conclusion Although this meta-analysis had some limitations, our study demonstrated that hypertension was significantly associated with the risk of epistaxis. However, since this association does not support a causal relationship between hypertension and epistaxis, further clinical trials with large patient populations will be required to determine the impact of hypertension on epistaxis.
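
    To make the pooled odds ratio, confidence interval, and I² figures quoted above concrete, the sketch below pools a handful of invented study-level odds ratios with a DerSimonian-Laird random-effects model. The study values are fabricated stand-ins; only the form of the calculation mirrors what a meta-analysis like this one reports.

```python
# Minimal DerSimonian-Laird pooling: pooled OR, 95% CI, I-squared, and Q-test p-value.
import numpy as np
from scipy import stats

log_or = np.log(np.array([1.2, 1.8, 1.4, 2.1, 1.1, 1.6]))    # hypothetical studies
se = np.array([0.20, 0.25, 0.18, 0.30, 0.22, 0.26])          # their standard errors

w_fixed = 1 / se**2
fixed_mean = np.sum(w_fixed * log_or) / np.sum(w_fixed)
q = float(np.sum(w_fixed * (log_or - fixed_mean) ** 2))
df = len(log_or) - 1
i_squared = max(0.0, (q - df) / q) * 100
tau2 = max(0.0, (q - df) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))

w_random = 1 / (se**2 + tau2)
pooled = np.sum(w_random * log_or) / np.sum(w_random)
pooled_se = np.sqrt(1 / np.sum(w_random))
ci = np.exp(pooled + np.array([-1.0, 1.0]) * 1.96 * pooled_se)

print(f"pooled OR = {np.exp(pooled):.2f}, 95% CI = {ci.round(2)}, "
      f"I^2 = {i_squared:.1f}%, Q-test p = {stats.chi2.sf(q, df):.3f}")
```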

  1. Design, analysis, and interpretation of field quality-control data for water-sampling projects

    USGS Publications Warehouse

    Mueller, David K.; Schertz, Terry L.; Martin, Jeffrey D.; Sandstrom, Mark W.

    2015-01-01

    The report provides extensive information about statistical methods used to analyze quality-control data in order to estimate potential bias and variability in environmental data. These methods include construction of confidence intervals on various statistical measures, such as the mean, percentiles and percentages, and standard deviation. The methods are used to compare quality-control results with the larger set of environmental data in order to determine whether the effects of bias and variability might interfere with interpretation of these data. Examples from published reports are presented to illustrate how the methods are applied, how bias and variability are reported, and how the interpretation of environmental data can be qualified based on the quality-control analysis.

  2. Statistical errors in molecular dynamics averages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiferl, S.K.; Wallace, D.C.

    1985-11-15

A molecular dynamics calculation produces a time-dependent fluctuating signal whose average is a thermodynamic quantity of interest. The average of the kinetic energy, for example, is proportional to the temperature. A procedure is described for determining when the molecular dynamics system is in equilibrium with respect to a given variable, according to the condition that the mean and the bandwidth of the signal should be sensibly constant in time. Confidence limits for the mean are obtained from an analysis of a finite length of the equilibrium signal. The role of serial correlation in this analysis is discussed. The occurrence of unstable behavior in molecular dynamics data is noted, and a statistical test for a level shift is described.
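
    The role of serial correlation mentioned in this record is often handled in practice by block averaging: the naive standard error of the mean is too small when successive samples are correlated, and grouping the signal into progressively larger blocks until the error estimate plateaus gives a more honest confidence limit. The sketch below demonstrates this on a synthetic AR(1) "temperature" trace; it illustrates the general technique, not the authors' procedure.

```python
# Block averaging of a serially correlated molecular-dynamics-like signal.
import numpy as np

def block_standard_error(signal, block_size):
    n_blocks = len(signal) // block_size
    blocks = signal[:n_blocks * block_size].reshape(n_blocks, block_size).mean(axis=1)
    return blocks.std(ddof=1) / np.sqrt(n_blocks)

rng = np.random.default_rng(0)
x = np.empty(200_000)
x[0] = 0.0
for i in range(1, len(x)):                 # AR(1) fluctuations: strongly correlated in time
    x[i] = 0.95 * x[i - 1] + rng.normal()
signal = 300.0 + x                         # a synthetic instantaneous "temperature"

print("naive SEM (ignores serial correlation):", signal.std(ddof=1) / np.sqrt(len(signal)))
for block in (10, 100, 1000, 5000):        # the estimate plateaus once blocks are decorrelated
    print(f"block size {block:>5}: SEM = {block_standard_error(signal, block):.4f}")
```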

  3. Wavelet analysis of polarization azimuths maps for laser images of myocardial tissue for the purpose of diagnosing acute coronary insufficiency

    NASA Astrophysics Data System (ADS)

    Wanchuliak, O. Ya.; Peresunko, A. P.; Bakko, Bouzan Adel; Kushnerick, L. Ya.

    2011-09-01

This paper presents the foundations of a large-scale, localized wavelet polarization analysis of inhomogeneous laser images of histological sections of myocardial tissue. Opportunities for defining relations between the structures of wavelet coefficients and the causes of death were identified. The optical model of polycrystalline networks of myocardium protein fibrils is presented. The technique of determining the coordinate distribution of the polarization azimuth at the points of laser images of myocardium histological sections is suggested. The results of investigating the interrelation between the values of statistical parameters (statistical moments of the 1st-4th orders) that characterize the distributions of wavelet coefficients of polarization maps of myocardium layers and the causes of death are presented.

  4. Quinolizidine alkaloids from Lupinus lanatus

    NASA Astrophysics Data System (ADS)

    Neto, Alexandre T.; Oliveira, Carolina Q.; Ilha, Vinicius; Pedroso, Marcelo; Burrow, Robert A.; Dalcol, Ionara I.; Morel, Ademir F.

    2011-10-01

    In this study, one new quinolizidine alkaloid, lanatine A ( 1), together with three other known alkaloids, 13-α- trans-cinnamoyloxylupanine ( 2), 13-α-hydroxylupanine ( 3), and (-)-multiflorine ( 4) were isolated from the aerial parts of Lupinus lanatus (Fabaceae). The structures of alkaloids 1- 4 were elucidated by spectroscopic data analysis. The stereochemistry of 1 was determined by single crystal X-ray analysis. Bayesian statistical analysis of the Bijvoet differences suggests the absolute stereochemistry of 1. In addition, the antimicrobial potential of alkaloids 1- 4 is also reported.

  5. Special Flood Hazard Evaluation Report, Maumee River, Defiance and Paulding Counties, Ohio

    DTIC Science & Technology

    1988-01-01

into the Flood Flow Frequency Analysis (FFFA) computer program (Reference 3) to determine the discharge-frequency relationship for the Maumee River...although the flood may occur in any year. It is based on statistical analysis of streamflow records available for the watershed and analysis of rainfall... [remainder of the record is unreadable scan residue from a map of Maumee River cross-section locations, Defiance and Paulding Counties]

  6. Predictive data modeling of human type II diabetes related statistics

    NASA Astrophysics Data System (ADS)

    Jaenisch, Kristina L.; Jaenisch, Holger M.; Handley, James W.; Albritton, Nathaniel G.

    2009-04-01

During the course of routine Type II treatment of one of the authors, it was decided to derive predictive analytical Data Models of the daily sampled vital statistics: namely weight, blood pressure, and blood sugar, to determine if the covariance among the observed variables could yield a descriptive equation based model, or better still, a predictive analytical model that could forecast the expected future trend of the variables and possibly reduce the number of finger stickings required to monitor blood sugar levels. The personal history and analysis with resulting models are presented.

  7. Statistical Measurement and Analysis of Claimant and Demographic Variables Affecting Processing and Adjudication Duration in The United States Army Physical Disability Evaluation System.

    DTIC Science & Technology

    1997-02-06

This retrospective study analyzes relationships of variables to adjudication and processing duration in the Army...Package for Social Scientists (SPSS), Standard Version 6.1, June 1994, to determine relationships among the dependent and independent variables... consanguinity between variables. Content and criterion validity is employed to determine the measure of scientific validity. Reliability is also

  8. Segment and fit thresholding: a new method for image analysis applied to microarray and immunofluorescence data.

    PubMed

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E; Allen, Peter J; Sempere, Lorenzo F; Haab, Brian B

    2015-10-06

    Experiments involving the high-throughput quantification of image data require algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multicolor, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu's method for selected images. SFT promises to advance the goal of full automation in image analysis.
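
    The segment-statistics idea behind SFT can be sketched in a simplified form: tile the image, compute per-tile statistics, treat the low-intensity tiles as background, and derive a global signal threshold from them. The tile size, background fraction, and "mean + 3 sigma" rule below are assumptions for illustration; the published method fits trends between the segment statistics rather than applying a fixed rule.

```python
# Simplified illustration of thresholding driven by per-segment statistics (not the SFT code).
import numpy as np

def segment_stats(image, tile=32):
    h, w = image.shape
    out = []
    for i in range(0, h - tile + 1, tile):
        for j in range(0, w - tile + 1, tile):
            patch = image[i:i + tile, j:j + tile]
            out.append((patch.mean(), patch.std()))
    return np.array(out)

def background_threshold(image, tile=32, background_fraction=0.5, k=3.0):
    s = segment_stats(image, tile)
    background = s[np.argsort(s[:, 0])[: int(len(s) * background_fraction)]]
    return background[:, 0].mean() + k * background[:, 1].mean()

rng = np.random.default_rng(0)
img = rng.normal(100, 5, size=(512, 512))
img[200:260, 300:360] += 60                     # a bright synthetic "spot"
thr = background_threshold(img)
print(f"threshold = {thr:.1f}, signal pixels = {(img > thr).sum()}")
```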

  9. Bayesian statistics as a new tool for spectral analysis - I. Application for the determination of basic parameters of massive stars

    NASA Astrophysics Data System (ADS)

    Mugnes, J.-M.; Robert, C.

    2015-11-01

    Spectral analysis is a powerful tool to investigate stellar properties and it has been widely used for decades now. However, the methods considered to perform this kind of analysis are mostly based on iteration among a few diagnostic lines to determine the stellar parameters. While these methods are often simple and fast, they can lead to errors and large uncertainties due to the required assumptions. Here, we present a method based on Bayesian statistics to find simultaneously the best combination of effective temperature, surface gravity, projected rotational velocity, and microturbulence velocity, using all the available spectral lines. Different tests are discussed to demonstrate the strength of our method, which we apply to 54 mid-resolution spectra of field and cluster B stars obtained at the Observatoire du Mont-Mégantic. We compare our results with those found in the literature. Differences are seen which are well explained by the different methods used. We conclude that the B-star microturbulence velocities are often underestimated. We also confirm the trend that B stars in clusters are on average faster rotators than field B stars.
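
    A toy version of the grid-based Bayesian comparison described here: evaluate a chi-square misfit between an observed spectrum and model spectra over a (Teff, log g) grid, convert it into a likelihood, and read off the joint posterior. The "model spectrum" below is a fabricated Gaussian-line generator with flat priors assumed, standing in for a real stellar-atmosphere grid, so this only illustrates the statistical machinery.

```python
# Toy grid posterior over (Teff, log g) from a chi-square spectral misfit; flat priors assumed.
import numpy as np

def model_spectrum(wave, teff, logg):
    # hypothetical line whose depth grows with Teff and whose width grows with log g
    depth = 0.3 + 0.5 * (teff - 15000.0) / 15000.0
    width = 0.5 + 0.2 * logg
    return 1.0 - depth * np.exp(-0.5 * ((wave - 4471.0) / width) ** 2)

wave = np.linspace(4465.0, 4477.0, 200)
rng = np.random.default_rng(0)
observed = model_spectrum(wave, 21000.0, 3.9) + rng.normal(0.0, 0.01, wave.size)

teff_grid = np.linspace(15000.0, 30000.0, 61)
logg_grid = np.linspace(3.0, 4.5, 31)
log_post = np.empty((teff_grid.size, logg_grid.size))
for i, teff in enumerate(teff_grid):
    for j, logg in enumerate(logg_grid):
        chi2 = np.sum((observed - model_spectrum(wave, teff, logg)) ** 2 / 0.01 ** 2)
        log_post[i, j] = -0.5 * chi2

post = np.exp(log_post - log_post.max())
post /= post.sum()
i_best, j_best = np.unravel_index(post.argmax(), post.shape)
print("MAP estimate:", teff_grid[i_best], logg_grid[j_best])    # should recover ~21000, ~3.9
```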

  10. Segment and Fit Thresholding: A New Method for Image Analysis Applied to Microarray and Immunofluorescence Data

    PubMed Central

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M.; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E.; Allen, Peter J.; Sempere, Lorenzo F.; Haab, Brian B.

    2016-01-01

    Certain experiments involve the high-throughput quantification of image data, thus requiring algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multi-color, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu’s method for selected images. SFT promises to advance the goal of full automation in image analysis. PMID:26339978

  11. Précis of statistical significance: rationale, validity, and utility.

    PubMed

    Chow, S L

    1998-04-01

    The null-hypothesis significance-test procedure (NHSTP) is defended in the context of the theory-corroboration experiment, as well as the following contrasts: (a) substantive hypotheses versus statistical hypotheses, (b) theory corroboration versus statistical hypothesis testing, (c) theoretical inference versus statistical decision, (d) experiments versus nonexperimental studies, and (e) theory corroboration versus treatment assessment. The null hypothesis can be true because it is the hypothesis that errors are randomly distributed in data. Moreover, the null hypothesis is never used as a categorical proposition. Statistical significance means only that chance influences can be excluded as an explanation of data; it does not identify the nonchance factor responsible. The experimental conclusion is drawn with the inductive principle underlying the experimental design. A chain of deductive arguments gives rise to the theoretical conclusion via the experimental conclusion. The anomalous relationship between statistical significance and the effect size often used to criticize NHSTP is more apparent than real. The absolute size of the effect is not an index of evidential support for the substantive hypothesis. Nor is the effect size, by itself, informative as to the practical importance of the research result. Being a conditional probability, statistical power cannot be the a priori probability of statistical significance. The validity of statistical power is debatable because statistical significance is determined with a single sampling distribution of the test statistic based on H0, whereas it takes two distributions to represent statistical power or effect size. Sample size should not be determined in the mechanical manner envisaged in power analysis. It is inappropriate to criticize NHSTP for nonstatistical reasons. At the same time, neither effect size, nor confidence interval estimate, nor posterior probability can be used to exclude chance as an explanation of data. Neither can any of them fulfill the nonstatistical functions expected of them by critics.

  12. Factors Influencing Stress, Burnout, and Retention of Secondary Teachers

    ERIC Educational Resources Information Center

    Fisher, Molly H.

    2011-01-01

    This study examines the stress, burnout, satisfaction, and preventive coping skills of nearly 400 secondary teachers to determine variables contributing to these major factors influencing teachers. Analysis of Variance (ANOVA) statistics were conducted that found the burnout levels between new and experienced teachers are significantly different,…

  13. 7 CFR 601.1 - Functions assigned.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) National Resources Inventory (NRI) that is a statistically-based survey designed and implemented using... both the provisions of the Food Security Act and Section 404 of the Clean Water Act. (ii) Soil surveys.... Soil surveys are based on scientific analysis and classification of the soils and are used to determine...

  14. 7 CFR 601.1 - Functions assigned.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) National Resources Inventory (NRI) that is a statistically-based survey designed and implemented using... both the provisions of the Food Security Act and Section 404 of the Clean Water Act. (ii) Soil surveys.... Soil surveys are based on scientific analysis and classification of the soils and are used to determine...

  15. 7 CFR 601.1 - Functions assigned.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) National Resources Inventory (NRI) that is a statistically-based survey designed and implemented using... both the provisions of the Food Security Act and Section 404 of the Clean Water Act. (ii) Soil surveys.... Soil surveys are based on scientific analysis and classification of the soils and are used to determine...

  16. 7 CFR 601.1 - Functions assigned.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) National Resources Inventory (NRI) that is a statistically-based survey designed and implemented using... both the provisions of the Food Security Act and Section 404 of the Clean Water Act. (ii) Soil surveys.... Soil surveys are based on scientific analysis and classification of the soils and are used to determine...

  17. 7 CFR 601.1 - Functions assigned.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) National Resources Inventory (NRI) that is a statistically-based survey designed and implemented using... both the provisions of the Food Security Act and Section 404 of the Clean Water Act. (ii) Soil surveys.... Soil surveys are based on scientific analysis and classification of the soils and are used to determine...

  18. A Comparison of Approaches for Setting Proficiency Standards.

    ERIC Educational Resources Information Center

    Koffler, Stephen L.

    This research compared the cut-off scores estimated from an empirical procedure (Contrasting group method) to those determined from a more theoretical process (Nedelsky method). A methodological and statistical framework was also provided for analysis of the data to obtain the most appropriate standard using the empirical procedure. Data were…

  19. The problem of extreme events in paired-watershed studies

    Treesearch

    James W. Hornbeck

    1973-01-01

    In paired-watershed studies, the occurrence of an extreme event during the after-treatment period presents a problem: the effects of treatment must be determined by using greatly extrapolated regression statistics. Several steps are presented to help insure careful handling of extreme events during analysis and reporting of research results.

  20. The Effectiveness of Edgenuity When Used for Credit Recovery

    ERIC Educational Resources Information Center

    Eddy, Carri

    2013-01-01

    This quantitative study used descriptive statistics, logistic regression, and chi-square analysis to determine the impact of using Edgenuity (formerly Education 2020 Virtual Classroom) to assist students in the recovery of lost credits. The sample included a North Texas school district. The Skyward student management system provided archived…

  1. How does new evidence change our estimates of probabilities? Carnap's formula revisited

    NASA Technical Reports Server (NTRS)

    Kreinovich, Vladik; Quintana, Chris

    1992-01-01

    The formula originally proposed by R. Carnap in his analysis of induction is reviewed and its natural generalization is presented. A situation is considered where the probability of a certain event is determined without using standard statistical methods due to the lack of observation.

  2. A Comparative Analysis of Results Using Three Leadership Style Measurement Instruments.

    DTIC Science & Technology

    The objective of this research was to determine if there was conceptual similarity among leadership styles as measured by the results of the Ohio...hybrid statistic was developed here to measure the degree of association among the leadership styles recorded by each individual respondent. The

  3. EFFECTS OF HEAVY METALS IN SEDIMENTS OF THE MACROINVERTEBRATE COMMUNITY IN THE SHORT CREEK/EMPIRE LAKE AQUATIC SYSTEM, CHEROKEE COUNTY, KANSAS: A RECOMMENDATION FOR SITE-SPECIFIC CRITERIA.

    EPA Science Inventory

    The study uses statistical analysis techniques to determine the effects of four heavy metals (cadmium, lead, manganese, and zinc) on the macroinvertebrate community using the data collected in the fall 1987.

  4. Analyzing Large Gene Expression and Methylation Data Profiles Using StatBicRM: Statistical Biclustering-Based Rule Mining

    PubMed Central

    Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra

    2015-01-01

Microarray and beadchip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining) to identify special types of rules and potential biomarkers using integrated approaches of statistical and binary inclusion-maximal biclustering techniques from the biological datasets. At first, a novel statistical strategy has been utilized to eliminate the insignificant/low-significant/redundant genes in such a way that the significance level must satisfy the data distribution property (viz., either normal distribution or non-normal distribution). The data is then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. Corresponding special types of rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than the other rule mining algorithms as it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets. Thus, it saves elapsed time, and can work on big datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the DAVID database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to know how much the evolved rules are able to describe accurately the remaining test (unknown) data. Subsequently, we also compare the average classification accuracy, and other related factors with other rule-based classifiers. Statistical significance tests are also performed for verifying the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers also starts with the same post-discretized data-matrix. Finally, we have also included the integrated analysis of gene expression and methylation for determining the epigenetic effect (viz., effect of methylation) on gene expression level. PMID:25830807

  5. Analyzing large gene expression and methylation data profiles using StatBicRM: statistical biclustering-based rule mining.

    PubMed

    Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra

    2015-01-01

Microarray and beadchip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining) to identify special types of rules and potential biomarkers using integrated approaches of statistical and binary inclusion-maximal biclustering techniques from the biological datasets. At first, a novel statistical strategy has been utilized to eliminate the insignificant/low-significant/redundant genes in such a way that the significance level must satisfy the data distribution property (viz., either normal distribution or non-normal distribution). The data is then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. Corresponding special types of rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than the other rule mining algorithms as it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets. Thus, it saves elapsed time, and can work on big datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the DAVID database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to know how much the evolved rules are able to describe accurately the remaining test (unknown) data. Subsequently, we also compare the average classification accuracy, and other related factors with other rule-based classifiers. Statistical significance tests are also performed for verifying the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers also starts with the same post-discretized data-matrix. Finally, we have also included the integrated analysis of gene expression and methylation for determining the epigenetic effect (viz., effect of methylation) on gene expression level.

  6. Measurements and analysis in imaging for biomedical applications

    NASA Astrophysics Data System (ADS)

    Hoeller, Timothy L.

    2009-02-01

A Total Quality Management (TQM) approach can be used to analyze data from biomedical optical and imaging platforms of tissues. A shift from individuals to teams, partnerships, and total participation is necessary from health care groups for improved prognostics using measurement analysis. Proprietary measurement analysis software is available for calibrated, pixel-to-pixel measurements of angles and distances in digital images. Feature size, count, and color are determinable on an absolute and comparative basis. Although changes in histomic images arise from numerous, complex factors, correlations between the changes measured by imaging analysis and the time, extent, and progression of illness can be derived. Statistical methods are preferred. Applications of the proprietary measurement software are available for any imaging platform. Quantification of results provides improved categorization of illness towards better health. As health care practitioners try to use quantified measurement data for patient diagnosis, the techniques reported can be used to track and isolate causes better. Comparisons, norms, and trends are available from processing of measurement data which is obtained easily and quickly from Scientific Software and methods. Example results for the class actions of Preventative and Corrective Care in Ophthalmology and Dermatology, respectively, are provided. Improved and quantified diagnosis can lead to better health and lower costs associated with health care. Systems support improvements towards Lean and Six Sigma affecting all branches of biology and medicine. As an example for use of statistics, the major types of variation involving a study of Bone Mineral Density (BMD) are examined. Typically, special causes in medicine relate to illness and activities; whereas, common causes are known to be associated with gender, race, size, and genetic make-up. Such a strategy of Continuous Process Improvement (CPI) involves comparison of patient results to baseline data using F-statistics. Self-pairings over time are also useful. Special and common causes are identified apart from aging in applying the statistical methods. In the future, implementation of imaging measurement methods by research staff, doctors, and concerned patient partners will result in improved health diagnosis, reporting, and cause determination. The long-term prospects for quantified measurements are better quality in imaging analysis with applications of higher utility for health care providers.

  7. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage Analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.

  8. An Automated Method of Scanning Probe Microscopy (SPM) Data Analysis and Reactive Site Tracking for Mineral-Water Interface Reactions Observed at the Nanometer Scale

    NASA Astrophysics Data System (ADS)

    Campbell, B. D.; Higgins, S. R.

    2008-12-01

    Developing a method for bridging the gap between macroscopic and microscopic measurements of reaction kinetics at the mineral-water interface has important implications in geological and chemical fields. Investigating these reactions on the nanometer scale with SPM is often limited by image analysis and data extraction due to the large quantity of data usually obtained in SPM experiments. Here we present a computer algorithm for automated analysis of mineral-water interface reactions. This algorithm automates the analysis of sequential SPM images by identifying the kinetically active surface sites (i.e., step edges), and by tracking the displacement of these sites from image to image. The step edge positions in each image are readily identified and tracked through time by a standard edge detection algorithm followed by statistical analysis on the Hough Transform of the edge-mapped image. By quantifying this displacement as a function of time, the rate of step edge displacement is determined. Furthermore, the total edge length, also determined from analysis of the Hough Transform, combined with the computed step speed, yields the surface area normalized rate of the reaction. The algorithm was applied to a study of the spiral growth of the calcite(104) surface from supersaturated solutions, yielding results almost 20 times faster than performing this analysis by hand, with results being statistically similar for both analysis methods. This advance in analysis of kinetic data from SPM images will facilitate the building of experimental databases on the microscopic kinetics of mineral-water interface reactions.
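
    A hedged sketch of the image-processing chain this record describes: detect step edges in each SPM frame, summarize their positions with a probabilistic Hough transform, and track the mean edge position from frame to frame to obtain a step speed. The OpenCV calls and parameter values stand in for the authors' algorithm (which applies statistics to the full Hough transform), and the synthetic single-step frame is only a smoke test.

```python
# Illustrative edge detection and line extraction for step-edge tracking (not the authors' code).
import cv2
import numpy as np

def dominant_step_lines(frame_8bit, min_length=50):
    edges = cv2.Canny(frame_8bit, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                            minLineLength=min_length, maxLineGap=10)
    return np.empty((0, 4)) if lines is None else lines[:, 0, :]   # rows of (x1, y1, x2, y2)

def mean_step_position(lines):
    """Mean horizontal position of detected edges (assumes roughly vertical steps)."""
    return float(np.mean([(x1 + x2) / 2.0 for x1, y1, x2, y2 in lines]))

# Synthetic frame with a single vertical step at x ~ 128.
frame = np.zeros((256, 256), np.uint8)
frame[:, 128:] = 200
print(mean_step_position(dominant_step_lines(frame)))
# Given two consecutive frames, the step speed would be the change in mean position
# times the nm-per-pixel calibration divided by the frame interval.
```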

  9. In vivo evaluation of the effect of stimulus distribution on FIR statistical efficiency in event-related fMRI

    PubMed Central

    Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L

    2013-01-01

    Technical developments in MRI have improved signal to noise, allowing the use of analysis methods such as finite impulse response (FIR) analysis of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while more design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence-based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, with the level being a function of multicollinearity. Experiment protocols varied by up to 55.4% in standard deviation. Results confirm that the quality of fMRI in an FIR analysis can significantly and substantially vary with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. PMID:23473798
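
    For readers unfamiliar with FIR design efficiency, the sketch below shows one common efficiency metric, 1 / trace((X'X)^-1), computed for an FIR design matrix X built from stimulus onsets. The onset sequences, window length, and scan counts are illustrative assumptions, not the study's protocols.

```python
import numpy as np

def fir_design_matrix(onsets, n_scans, n_lags):
    """Build an FIR design matrix: one column per post-stimulus lag (in scans)."""
    X = np.zeros((n_scans, n_lags))
    for t in onsets:
        for lag in range(n_lags):
            if t + lag < n_scans:
                X[t + lag, lag] = 1.0
    return X

def fir_efficiency(onsets, n_scans, n_lags):
    """One common efficiency metric: 1 / trace((X'X)^-1); larger is better.
    Multicollinear designs make X'X near-singular and the efficiency small."""
    X = fir_design_matrix(onsets, n_scans, n_lags)
    return 1.0 / np.trace(np.linalg.inv(X.T @ X))

# Example: compare a regularly spaced design with a jittered (random) one
rng = np.random.default_rng(0)
regular = np.arange(0, 200, 10)                               # one stimulus every 10 scans
jittered = np.sort(rng.choice(200, size=20, replace=False))   # random onsets
print(fir_efficiency(regular, 220, 8), fir_efficiency(jittered, 220, 8))
```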

  10. Regional Regression Equations to Estimate Flow-Duration Statistics at Ungaged Stream Sites in Connecticut

    USGS Publications Warehouse

    Ahearn, Elizabeth A.

    2010-01-01

    Multiple linear regression equations for determining flow-duration statistics were developed to estimate select flow exceedances ranging from 25- to 99-percent for six 'bioperiods'-Salmonid Spawning (November), Overwinter (December-February), Habitat Forming (March-April), Clupeid Spawning (May), Resident Spawning (June), and Rearing and Growth (July-October)-in Connecticut. Regression equations also were developed to estimate the 25- and 99-percent flow exceedances without reference to a bioperiod. In total, 32 equations were developed. The predictive equations were based on regression analyses relating flow statistics from streamgages to GIS-determined basin and climatic characteristics for the drainage areas of those streamgages. Thirty-nine streamgages (and an additional 6 short-term streamgages and 28 partial-record sites for the non-bioperiod 99-percent exceedance) in Connecticut and adjacent areas of neighboring States were used in the regression analysis. Weighted least squares regression analysis was used to determine the predictive equations; weights were assigned based on record length. The basin characteristics-drainage area, percentage of area with coarse-grained stratified deposits, percentage of area with wetlands, mean monthly precipitation (November), mean seasonal precipitation (December, January, and February), and mean basin elevation-are used as explanatory variables in the equations. Standard errors of estimate of the 32 equations ranged from 10.7 to 156 percent with medians of 19.2 and 55.4 percent to predict the 25- and 99-percent exceedances, respectively. Regression equations to estimate high and median flows (25- to 75-percent exceedances) are better predictors (smaller variability of the residual values around the regression line) than the equations to estimate low flows (less than 75-percent exceedance). The Habitat Forming (March-April) bioperiod had the smallest standard errors of estimate, ranging from 10.7 to 20.9 percent. In contrast, the Rearing and Growth (July-October) bioperiod had the largest standard errors, ranging from 30.9 to 156 percent. The adjusted coefficient of determination of the equations ranged from 77.5 to 99.4 percent with medians of 98.5 and 90.6 percent to predict the 25- and 99-percent exceedances, respectively. Descriptive information on the streamgages used in the regression, measured basin and climatic characteristics, and estimated flow-duration statistics are provided in this report. Flow-duration statistics and the 32 regression equations for estimating flow-duration statistics in Connecticut are stored on the U.S. Geological Survey World Wide Web application "StreamStats" (http://water.usgs.gov/osw/streamstats/index.html). The regression equations developed in this report can be used to produce unbiased estimates of select flow exceedances statewide.
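
    A hedged sketch of the weighted least squares step the report describes (weights based on record length, flows regressed on basin characteristics) is given below using statsmodels. The numbers, variable names, and log transformation are invented for illustration only; they are not the Connecticut data or the published equations.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical streamgage table: one flow statistic plus basin characteristics.
gages = pd.DataFrame({
    "q25":        [310., 120., 455., 88., 230.],    # 25-percent exceedance flow, ft^3/s
    "drainage":   [52.1, 18.4, 77.9, 12.3, 40.6],   # drainage area, mi^2
    "stratified": [21.0, 9.5, 30.2, 4.1, 15.8],     # % coarse-grained stratified deposits
    "wetlands":   [6.2, 3.8, 8.9, 2.5, 5.4],        # % wetlands
    "years":      [45, 12, 60, 20, 33],             # record length, used as weights
})

# Fit in log space (an assumption, typical for flow regressions), weighting by record length.
y = np.log10(gages["q25"])
X = sm.add_constant(np.log10(gages[["drainage", "stratified", "wetlands"]]))
fit = sm.WLS(y, X, weights=gages["years"]).fit()
print(fit.params)        # regression coefficients
print(fit.rsquared_adj)  # adjusted coefficient of determination
```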

  11. Spatio-temporal analysis of annual rainfall in Crete, Greece

    NASA Astrophysics Data System (ADS)

    Varouchakis, Emmanouil A.; Corzo, Gerald A.; Karatzas, George P.; Kotsopoulou, Anastasia

    2018-03-01

    Analysis of rainfall data from the island of Crete, Greece was performed to identify key hydrological years and return periods as well as to analyze the inter-annual behavior of the rainfall variability during the period 1981-2014. The rainfall spatial distribution was also examined in detail to identify vulnerable areas of the island. Data analysis using statistical tools and spectral analysis was applied to investigate and interpret the temporal course of the available rainfall data set. In addition, spatial analysis techniques were applied and compared to determine the rainfall spatial distribution on the island of Crete. The analysis showed that, in contrast to Regional Climate Model estimations, rainfall rates have not decreased, while return periods vary depending on seasonality and geographic location. A small but statistically significant increasing trend was detected in the inter-annual rainfall variations, as well as a significant rainfall cycle of almost every 8 years. In addition, a statistically significant correlation of the island's rainfall variability with the North Atlantic Oscillation is identified for the examined period. Furthermore, the regression kriging method, using surface elevation as secondary information, improved the estimation of the annual rainfall spatial variability on the island of Crete by 70% compared to ordinary kriging. The rainfall spatial and temporal trends on the island of Crete have variable characteristics that depend on the geographical area and on the hydrological period.

  12. Effects of a finite outer scale on the measurement of atmospheric-turbulence statistics with a Hartmann wave-front sensor.

    PubMed

    Feng, Shen; Wenhan, Jiang

    2002-06-10

    Phase structure functions and aperture-averaged slope correlation functions with a finite outer scale are derived based on the Taylor hypothesis and a generalized spectrum, such as the von Kármán model. The effects of the finite outer scale on measuring and determining the character of atmospheric-turbulence statistics are shown, especially for an approximately 4-m class telescope and subaperture. The phase structure function and atmospheric coherence length based on the Kolmogorov model are approximations of the formalism we have derived. The analysis shows that it cannot be determined whether a deviation from the power-law parameter of Kolmogorov turbulence is caused by real variations of the spectrum or by the effect of the finite outer scale.
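
    As a point of reference, the infinite-outer-scale Kolmogorov limit that the abstract treats as an approximation of the derived formalism can be written down directly. The short sketch below uses the standard form D_phi(r) = 6.88 (r/r0)^(5/3) and deliberately omits any outer-scale correction, which is exactly the effect the paper analyzes; the numbers are illustrative.

```python
import numpy as np

def kolmogorov_phase_structure(r, r0):
    """Kolmogorov phase structure function D_phi(r) = 6.88 (r/r0)^(5/3).
    Valid only when the separation r is far below the outer scale L0; a finite
    outer scale (e.g. a von Karman spectrum) makes D_phi saturate at large r."""
    return 6.88 * (np.asarray(r) / r0) ** (5.0 / 3.0)

# Example: structure function across a 4 m aperture for r0 = 10 cm seeing
r = np.linspace(0.01, 4.0, 50)       # separations in metres
print(kolmogorov_phase_structure(r, r0=0.10)[:5])
```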

  13. [Evaluation of using statistical methods in selected national medical journals].

    PubMed

    Sych, Z

    1996-01-01

    The paper evaluates the frequency with which statistical methods were applied in works published in six selected national medical journals in the years 1988-1992. The following journals were chosen for analysis: Klinika Oczna, Medycyna Pracy, Pediatria Polska, Polski Tygodnik Lekarski, Roczniki Państwowego Zakładu Higieny, and Zdrowie Publiczne. A number of works corresponding to the average of the remaining medical journals was randomly selected from the respective volumes of Pol. Tyg. Lek. Works in which no statistical analysis was used were excluded, for both national and international publications; this exclusion also covered review papers, case reports, reviews of books, handbooks, monographs, reports from scientific congresses, and papers on historical topics. The number of works was determined for each volume. Next, the mode of sample selection in the respective studies was established, differentiating two categories: random and targeted selection. Attention was also paid to the presence of a control sample in the individual works, and to the reporting of sample characteristics, for which three categories were used: complete, partial, and lacking. In evaluating the analyzed works, an effort was made to present the results of the studies in tables and figures (Tab. 1, 3). The rate at which statistical methods were employed in the analyzed works in the relevant volumes of the six selected national medical journals for the years 1988-1992 was determined, together with the number of works in which no statistical methods were used. Concurrently, the frequency of applying the individual statistical methods was analyzed in the scrutinized works. Prominence was given to fundamental methods of descriptive statistics (measures of position, measures of dispersion) as well as to the most important methods of mathematical statistics, such as parametric tests of significance, analysis of variance (in single and dual classifications), non-parametric tests of significance, and correlation and regression. Works that used multiple correlation, multiple regression, or more complex methods for studying relationships among two or more variables were counted among the works whose statistical methods comprised correlation and regression, as well as other methods, e.g. statistical methods used in epidemiology (incidence and morbidity coefficients, standardization of coefficients, survival tables), factor analysis by the Jacobi-Hotelling method, taxonomic methods, and others. On the basis of the performed studies it was established that the frequency of employing statistical methods in the six selected national medical journals in the years 1988-1992 was 61.1-66.0% of the analyzed works (Tab. 3), which is broadly similar to the frequency reported for English-language medical journals. On the whole, no significant differences were found in the frequency of applied statistical methods (Tab. 4) or in the frequency of random sampling (Tab. 3) across the analyzed works appearing in the medical journals in the respective years 1988-1992.
The statistical methods most frequently used in the analyzed works for 1988-1992 were measures of position (44.2-55.6%) and measures of dispersion (32.5-38.5%), as well as parametric tests of significance (26.3-33.1% of the works analyzed) (Tab. 4). To increase the frequency and reliability of the statistical methods used, the teaching of biostatistics should be expanded in medical studies and in postgraduate training for physicians and academic research staff.

  14. Rock Statistics at the Mars Pathfinder Landing Site, Roughness and Roving on Mars

    NASA Technical Reports Server (NTRS)

    Haldemann, A. F. C.; Bridges, N. T.; Anderson, R. C.; Golombek, M. P.

    1999-01-01

    Several rock counts have been carried out at the Mars Pathfinder landing site producing consistent statistics of rock coverage and size-frequency distributions. These rock statistics provide a primary element of "ground truth" for anchoring remote sensing information used to pick the Pathfinder, and future, landing sites. The observed rock population statistics should also be consistent with the emplacement and alteration processes postulated to govern the landing site landscape. The rock population databases can however be used in ways that go beyond the calculation of cumulative number and cumulative area distributions versus rock diameter and height. Since the spatial parameters measured to characterize each rock are determined with stereo image pairs, the rock database serves as a subset of the full landing site digital terrain model (DTM). Insofar as a rock count can be carried out in a speedier, albeit coarser, manner than the full DTM analysis, rock counting offers several operational and scientific products in the near term. Quantitative rock mapping adds further information to the geomorphic study of the landing site, and can also be used for rover traverse planning. Statistical analysis of the surface roughness using the rock count proxy DTM is sufficiently accurate when compared to the full DTM to compare with radar remote sensing roughness measures, and with rover traverse profiles.

  15. CTLA-4 gene polymorphisms and their influence on predisposition to autoimmune thyroid diseases (Graves’ disease and Hashimoto's thyroiditis)

    PubMed Central

    Pastuszak-Lewandoska, Dorota; Sewerynek, Ewa; Domańska, Daria; Gładyś, Aleksandra; Skrzypczak, Renata

    2012-01-01

    Introduction: Autoimmune thyroid disease (AITD) is associated with both genetic and environmental factors which lead to overactivity of the immune system. Cytotoxic T-Lymphocyte Antigen 4 (CTLA-4) gene polymorphisms are among the main genetic factors determining the susceptibility to AITD (Hashimoto's thyroiditis, HT and Graves' disease, GD) development. The aim of the study was to evaluate the relationship between CTLA-4 polymorphisms (A49G, 1822 C/T and CT60 A/G) and HT and/or GD in Polish patients. Material and methods: Molecular analysis involved the AITD group, consisting of HT (n=28) and GD (n=14) patients, and a control group of healthy persons (n=20). Genomic DNA was isolated from peripheral blood and CTLA-4 polymorphisms were assessed by the polymerase chain reaction-restriction fragment length polymorphism method, using three restriction enzymes: Fnu4HI (A49G), BsmAI (1822 C/T) and BsaAI (CT60 A/G). Results: Statistical analysis (χ2 test) confirmed significant differences between the studied groups concerning CTLA-4 A49G genotypes. The CTLA-4 A/G genotype was significantly more frequent in the AITD group, and OR analysis suggested that it might increase the susceptibility to HT. In GD patients, OR analysis revealed a statistically significant relationship with the presence of the G allele. In controls, the CTLA-4 A/A genotype frequency was significantly increased, suggesting a protective effect. There were no statistically significant differences regarding the frequencies of other genotypes and polymorphic alleles of the CTLA-4 gene (1822 C/T and CT60 A/G) between the studied groups. Conclusions: The CTLA-4 A49G polymorphism seems to be an important genetic determinant of the risk of HT and GD in Polish patients. PMID:22851994

  16. CTLA-4 gene polymorphisms and their influence on predisposition to autoimmune thyroid diseases (Graves' disease and Hashimoto's thyroiditis).

    PubMed

    Pastuszak-Lewandoska, Dorota; Sewerynek, Ewa; Domańska, Daria; Gładyś, Aleksandra; Skrzypczak, Renata; Brzeziańska, Ewa

    2012-07-04

    Autoimmune thyroid disease (AITD) is associated with both genetic and environmental factors which lead to overactivity of the immune system. Cytotoxic T-Lymphocyte Antigen 4 (CTLA-4) gene polymorphisms are among the main genetic factors determining the susceptibility to AITD (Hashimoto's thyroiditis, HT and Graves' disease, GD) development. The aim of the study was to evaluate the relationship between CTLA-4 polymorphisms (A49G, 1822 C/T and CT60 A/G) and HT and/or GD in Polish patients. Molecular analysis involved the AITD group, consisting of HT (n=28) and GD (n=14) patients, and a control group of healthy persons (n=20). Genomic DNA was isolated from peripheral blood and CTLA-4 polymorphisms were assessed by the polymerase chain reaction-restriction fragment length polymorphism method, using three restriction enzymes: Fnu4HI (A49G), BsmAI (1822 C/T) and BsaAI (CT60 A/G). Statistical analysis (χ(2) test) confirmed significant differences between the studied groups concerning CTLA-4 A49G genotypes. The CTLA-4 A/G genotype was significantly more frequent in the AITD group, and OR analysis suggested that it might increase the susceptibility to HT. In GD patients, OR analysis revealed a statistically significant relationship with the presence of the G allele. In controls, the CTLA-4 A/A genotype frequency was significantly increased, suggesting a protective effect. There were no statistically significant differences regarding the frequencies of other genotypes and polymorphic alleles of the CTLA-4 gene (1822 C/T and CT60 A/G) between the studied groups. The CTLA-4 A49G polymorphism seems to be an important genetic determinant of the risk of HT and GD in Polish patients.

  17. Statistical Inference for Data Adaptive Target Parameters.

    PubMed

    Hubbard, Alan E; Kherad-Pajouh, Sara; van der Laan, Mark J

    2016-05-01

    Consider one observes n i.i.d. copies of a random variable with a probability distribution that is known to be an element of a particular statistical model. In order to define our statistical target we partition the sample in V equal size sub-samples, and use this partitioning to define V splits in an estimation sample (one of the V subsamples) and corresponding complementary parameter-generating sample. For each of the V parameter-generating samples, we apply an algorithm that maps the sample to a statistical target parameter. We define our sample-split data adaptive statistical target parameter as the average of these V-sample specific target parameters. We present an estimator (and corresponding central limit theorem) of this type of data adaptive target parameter. This general methodology for generating data adaptive target parameters is demonstrated with a number of practical examples that highlight new opportunities for statistical learning from data. This new framework provides a rigorous statistical methodology for both exploratory and confirmatory analysis within the same data. Given that more research is becoming "data-driven", the theory developed within this paper provides a new impetus for a greater involvement of statistical inference into problems that are being increasingly addressed by clever, yet ad hoc pattern finding methods. To suggest such potential, and to verify the predictions of the theory, extensive simulation studies, along with a data analysis based on adaptively determined intervention rules are shown and give insight into how to structure such an approach. The results show that the data adaptive target parameter approach provides a general framework and resulting methodology for data-driven science.
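
    The sample-splitting scheme can be illustrated with a toy example: each parameter-generating sample adaptively defines the target (here, hypothetically, the feature most correlated with the outcome), the complementary estimation fold estimates it, and the V fold-specific estimates are averaged. The simulated data and the choice of target below are illustrative assumptions, not the authors' applications.

```python
import numpy as np
from sklearn.model_selection import KFold

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))
y = 0.8 * X[:, 3] + rng.normal(size=500)      # toy data: feature 3 matters

V = 5
estimates = []
for gen_idx, est_idx in KFold(n_splits=V, shuffle=True, random_state=0).split(X):
    # Parameter-generating sample: data-adaptively choose the target
    # (here: the single feature most correlated with the outcome).
    corrs = [abs(np.corrcoef(X[gen_idx, j], y[gen_idx])[0, 1]) for j in range(X.shape[1])]
    j_star = int(np.argmax(corrs))
    # Estimation sample: estimate that split-specific target parameter
    # (here: the simple regression slope of y on the chosen feature).
    slope = np.polyfit(X[est_idx, j_star], y[est_idx], deg=1)[0]
    estimates.append(slope)

# The data adaptive target parameter is the average of the V split-specific estimates.
print(np.mean(estimates))
```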

  18. Principal component regression analysis with SPSS.

    PubMed

    Liu, R X; Kuang, J; Gong, Q; Hou, X L

    2003-06-01

    The paper introduces the indices used in multicollinearity diagnosis, the basic principle of principal component regression, and the method for determining the 'best' equation. An example is used to describe how to perform principal component regression analysis with SPSS 10.0, including all calculation steps of the principal component regression and all operations of the linear regression, factor analysis, descriptives, compute variable, and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance caused by multicollinearity. Performing the principal component regression analysis with SPSS makes the procedure simpler, faster, and accurate.
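
    The same idea can be reproduced outside SPSS; the sketch below is a hedged Python analogue (standardize, project onto the leading principal components, then regress on the component scores), shown only to make the mechanics concrete. It is not the SPSS procedure itself, and the simulated, deliberately collinear data are arbitrary.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.05, size=200)      # nearly collinear with x1
x3 = rng.normal(size=200)
X = np.column_stack([x1, x2, x3])
y = 2 * x1 + 0.5 * x3 + rng.normal(size=200)

# Principal component regression: standardize, keep the leading components,
# then regress y on the component scores instead of the collinear predictors.
pcr = make_pipeline(StandardScaler(), PCA(n_components=2), LinearRegression())
pcr.fit(X, y)
print(pcr.score(X, y))   # R^2 of the principal component regression
```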

  19. Asymptotic approximation method of force reconstruction: Application and analysis of stationary random forces

    NASA Astrophysics Data System (ADS)

    Sanchez, J.

    2018-06-01

    This paper presents the application and analysis of the asymptotic approximation method of force reconstruction for a single degree-of-freedom system. The original concepts are summarized, and the necessary probabilistic concepts are developed and applied to single degree-of-freedom systems. These concepts are then united, and the theoretical and computational models are developed. To determine the viability of the proposed method in a probabilistic context, numerical experiments are conducted, consisting of a frequency analysis, an analysis of the effects of measurement noise, and a statistical analysis. In addition, two examples are presented and discussed.

  20. A statistical analysis of the elastic distortion and dislocation density fields in deformed crystals

    DOE PAGES

    Mohamed, Mamdouh S.; Larson, Bennett C.; Tischler, Jonathan Z.; ...

    2015-05-18

    The statistical properties of the elastic distortion fields of dislocations in deforming crystals are investigated using the method of discrete dislocation dynamics to simulate dislocation structures and dislocation density evolution under tensile loading. Probability distribution functions (PDF) and pair correlation functions (PCF) of the simulated internal elastic strains and lattice rotations are generated for tensile strain levels up to 0.85%. The PDFs of simulated lattice rotation are compared with sub-micrometer resolution three-dimensional X-ray microscopy measurements of rotation magnitudes and deformation length scales in 1.0% and 2.3% compression strained Cu single crystals to explore the linkage between experiment and the theoretical analysis. The statistical properties of the deformation simulations are analyzed through determinations of the Nye and Kröner dislocation density tensors. The significance of the magnitudes and the length scales of the elastic strain and the rotation parts of dislocation density tensors are demonstrated, and their relevance to understanding the fundamental aspects of deformation is discussed.

  1. Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin

    We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of LiNi0.5Mn0.3Co0.2O2 cathode, we employ the instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with the tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes that feed the parametric input for the mechanics models.

  2. Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries

    DOE PAGES

    Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin; ...

    2016-03-09

    We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of LiNi0.5Mn0.3Co0.2O2 cathode, we employ the instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with the tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes that feed the parametric input for the mechanics models.

  3. Model Fit and Item Factor Analysis: Overfactoring, Underfactoring, and a Program to Guide Interpretation.

    PubMed

    Clark, D Angus; Bowles, Ryan P

    2018-04-23

    In exploratory item factor analysis (IFA), researchers may use model fit statistics and commonly invoked fit thresholds to help determine the dimensionality of an assessment. However, these indices and thresholds may mislead as they were developed in a confirmatory framework for models with continuous, not categorical, indicators. The present study used Monte Carlo simulation methods to investigate the ability of popular model fit statistics (chi-square, root mean square error of approximation, the comparative fit index, and the Tucker-Lewis index) and their standard cutoff values to detect the optimal number of latent dimensions underlying sets of dichotomous items. Models were fit to data generated from three-factor population structures that varied in factor loading magnitude, factor intercorrelation magnitude, number of indicators, and whether cross loadings or minor factors were included. The effectiveness of the thresholds varied across fit statistics, and was conditional on many features of the underlying model. Together, results suggest that conventional fit thresholds offer questionable utility in the context of IFA.

  4. Synchronization from Second Order Network Connectivity Statistics

    PubMed Central

    Zhao, Liqiong; Beverlin, Bryce; Netoff, Theoden; Nykamp, Duane Q.

    2011-01-01

    We investigate how network structure can influence the tendency for a neuronal network to synchronize, or its synchronizability, independent of the dynamical model for each neuron. The synchrony analysis takes advantage of the framework of second order networks, which defines four second order connectivity statistics based on the relative frequency of two-connection network motifs. The analysis identifies two of these statistics, convergent connections, and chain connections, as highly influencing the synchrony. Simulations verify that synchrony decreases with the frequency of convergent connections and increases with the frequency of chain connections. These trends persist with simulations of multiple models for the neuron dynamics and for different types of networks. Surprisingly, divergent connections, which determine the fraction of shared inputs, do not strongly influence the synchrony. The critical role of chains, rather than divergent connections, in influencing synchrony can be explained by their increasing the effective coupling strength. The decrease of synchrony with convergent connections is primarily due to the resulting heterogeneity in firing rates. PMID:21779239
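
    The four second order connectivity statistics are simple functions of the network's adjacency matrix; a small sketch of how such motif counts could be tallied is below. The counting conventions (ordered pairs, exclusion of self-connections) are assumptions for illustration rather than the exact normalization used in the paper.

```python
import numpy as np

def second_order_motif_counts(A):
    """Count two-connection motifs in a directed adjacency matrix
    (A[i, j] = 1 means a connection from neuron i to neuron j, no self-connections)."""
    AAt = A @ A.T          # (i, j): number of common targets of i and j
    AtA = A.T @ A          # (i, j): number of common sources of i and j
    A2  = A @ A            # (i, k): number of two-step chains i -> j -> k
    return {
        "convergent": AAt.sum() - np.trace(AAt),   # i -> k <- j, with i != j
        "divergent":  AtA.sum() - np.trace(AtA),   # i <- k -> j, with i != j
        "chain":      A2.sum() - np.trace(A2),     # i -> j -> k, with i != k
        "reciprocal": (A * A.T).sum() / 2,         # i <-> j pairs
    }

# Example: a sparse random directed network without self-connections
rng = np.random.default_rng(3)
A = (rng.random((100, 100)) < 0.1).astype(float)
np.fill_diagonal(A, 0)
print(second_order_motif_counts(A))
```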

  5. Antiviral activity of Quercus persica L.: High efficacy and low toxicity

    PubMed Central

    Karimi, Ali; Moradi, Mohammad-Taghi; Saeedi, Mojtaba; Asgari, Sedigheh; Rafieian-kopaei, Mahmoud

    2013-01-01

    Background: The emergence of drug-resistant strains of Herpes simplex virus type 1 (HSV-1) has increased interest in the use of natural substances. Aims: This study aimed to determine the minimum inhibitory concentration of a hydroalcoholic extract of a traditionally used herbal plant, Quercus persica L., on HSV-1 replication in baby hamster kidney (BHK) cells. Setting: The study was conducted at Shahrekord University of Medical Sciences, Iran. Design: This was an experimental study. Materials and Methods: BHK cells were grown in monolayer culture with Dulbecco's modified Eagle's medium (DMEM) supplemented with 5% fetal calf serum and plated onto 48-well culture plates. The fifty percent cytotoxic concentration (CC50%) of Q. persica L. on BHK cells was determined. Subsequently, the 50% inhibitory concentration (IC50%) of the extract on replication of HSV-1 in both intracellular and extracellular cases was assessed. Statistical Analysis: The Probit statistical model was used for statistical analysis. The dose-dependent antiviral activity of the extracts was determined by linear regression. Results: Q. persica L. had no cytotoxic effect on this cell line. There was a significant relationship between the concentration of the extract and cell death (P<0.01). The IC50s of Q. persica L. on HSV-1, before and after attachment to BHK cells, were 1.02 and 0.257 μg/mL, respectively. There was a significant relationship between the concentration of this extract and inhibition of the cytopathic effect (CPE) (P<0.05). The antioxidant capacity of the extract was 67.5%. Conclusions: The hydroalcoholic extract of Q. persica L. is potentially an appropriate and promising antiherpetic herbal medicine. PMID:24516836

  6. Statistical Analysis of Bending Rigidity Coefficient Determined Using Fluorescence-Based Flicker-Noise Spectroscopy.

    PubMed

    Doskocz, Joanna; Drabik, Dominik; Chodaczek, Grzegorz; Przybyło, Magdalena; Langner, Marek

    2018-06-01

    The bending rigidity coefficient describes the propensity of a lipid bilayer to deform. In order to measure the parameter experimentally using flicker-noise spectroscopy, microscopic imaging is required, which necessitates the application of the giant unilamellar vesicle (GUV) lipid bilayer model. The major difficulty associated with the application of the model is the statistical character of the GUV population with respect to size and the homogeneity of the lipid bilayer composition, if a mixture of lipids is used. In this paper, the bending rigidity coefficient was measured using fluorescence-enhanced flicker-noise spectroscopy and determined for large populations of 1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine and 1,2-dioleoyl-sn-glycero-3-phosphocholine vesicles. The quantity of experimental data obtained allows statistical analysis aimed at identifying the distribution that is most appropriate for calculating the value of the membrane bending rigidity coefficient. It has been demonstrated that the bending rigidity coefficient is characterized by an asymmetrical distribution, which is well approximated with the gamma distribution. Since there are no biophysical reasons for this, we propose using the difference between the normal and gamma fits as a measure of the homogeneity of the vesicle population. In addition, the effect of a fluorescent label and of the type of instrumental setup on the determined values was tested. The results show that the value of the bending rigidity coefficient depends neither on the type of fluorescent label nor on the type of microscope used.
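
    The distribution comparison the authors propose can be prototyped directly with SciPy: fit both a normal and a gamma distribution to a population of per-vesicle bending rigidity values and compare the fits. The synthetic values and the use of log-likelihood and Kolmogorov-Smirnov statistics below are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
from scipy import stats

# Hypothetical bending rigidity values (in units of k_B*T), one per analysed GUV.
rng = np.random.default_rng(4)
kappa = rng.gamma(shape=20.0, scale=1.2, size=300)

# Fit both candidate distributions by maximum likelihood.
mu, sigma = stats.norm.fit(kappa)
a, loc, scale = stats.gamma.fit(kappa, floc=0.0)   # fix the location at zero

# Compare the fits; the gap between them can serve as a rough homogeneity measure.
ll_norm  = stats.norm.logpdf(kappa, mu, sigma).sum()
ll_gamma = stats.gamma.logpdf(kappa, a, loc, scale).sum()
ks_norm  = stats.kstest(kappa, "norm", args=(mu, sigma)).statistic
ks_gamma = stats.kstest(kappa, "gamma", args=(a, loc, scale)).statistic
print(ll_norm, ll_gamma, ks_norm, ks_gamma)
```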

  7. Application of statistical experimental design to study the formulation variables influencing the coating process of lidocaine liposomes.

    PubMed

    González-Rodríguez, M L; Barros, L B; Palma, J; González-Rodríguez, P L; Rabasco, A M

    2007-06-07

    In this paper, we have used statistical experimental design to investigate the effect of several factors in coating process of lidocaine hydrochloride (LID) liposomes by a biodegradable polymer (chitosan, CH). These variables were the concentration of CH coating solution, the dripping rate of this solution on the liposome colloidal dispersion, the stirring rate, the time since the liposome production to the liposome coating and finally the amount of drug entrapped into liposomes. The selected response variables were drug encapsulation efficiency (EE, %), coating efficiency (CE, %) and zeta potential. Liposomes were obtained by thin-layer evaporation method. They were subsequently coated with CH according the experimental plan provided by a fractional factorial (2(5-1)) screening matrix. We have used spectroscopic methods to determine the zeta potential values. The EE (%) assay was carried out in dialysis bags and the brilliant red probe was used to determine CE (%) due to its property of forming molecular complexes with CH. The graphic analysis of the effects allowed the identification of the main formulation and technological factors by the analysis of the selected responses and permitted the determination of the proper level of these factors for the response improvement. Moreover, fractional design allowed quantifying the interactions between the factors, which will consider in next experiments. The results obtained pointed out that LID amount was the predominant factor that increased the drug entrapment capacity (EE). The CE (%) response was mainly affected by the concentration of the CH solution and the stirring rate, although all the interactions between the main factors have statistical significance.
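
    For readers unfamiliar with fractional factorial screening designs, the 2^(5-1) layout used here can be generated from a defining relation such as E = ABCD, as in the short sketch below. The factor labels are taken from the abstract for illustration, and the aliasing choice is an assumption, not necessarily the authors' generator.

```python
from itertools import product

# Half-fraction 2^(5-1) design: four base factors take all +/-1 combinations,
# and the fifth factor is aliased with their four-way interaction (E = A*B*C*D).
factors = ["CH concentration", "dripping rate", "stirring rate",
           "time to coating", "drug amount"]          # illustrative labels

runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c * d                 # defining relation I = ABCDE
    runs.append((a, b, c, d, e))

print(len(runs))                      # 16 runs instead of the 32 of a full 2^5 factorial
for run in runs:
    print(dict(zip(factors, run)))
```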

  8. Introducing 3D U-statistic method for separating anomaly from background in exploration geochemical data with associated software development

    NASA Astrophysics Data System (ADS)

    Ghannadpour, Seyyed Saeed; Hezarkhani, Ardeshir

    2016-03-01

    The U-statistic method is one of the most important structural methods for separating anomaly from background. It considers the location of samples, carries out the statistical analysis of the data without judging from a geochemical point of view, and tries to separate subpopulations and determine anomalous areas. In the present study, in order to use the U-statistic method in a three-dimensional (3D) setting, the U-statistic is applied to the grades of two ideal test examples, taking the sample Z values (elevation) into account. This is the first time that the method has been applied in a 3D setting. To evaluate the performance of the 3D U-statistic method and to compare it with a non-structural method, the method of threshold assessment based on the median and standard deviation (MSD method) is applied to the two test examples. Results show that the samples classified as anomalous by the U-statistic method are more regular and less dispersed than those indicated by the MSD method, so that, according to the locations of the anomalous samples, their denser clusters can be identified as promising zones. Moreover, results show that at a threshold of U = 0, the total misclassification error of the U-statistic method is much smaller than that of the x̄ + n × s criterion. Finally, a 3D model of the two test examples, separating anomaly from background using the 3D U-statistic method, is provided. The source code for a software program, which was developed in the MATLAB programming language in order to perform the calculations of the 3D U-spatial statistic method, is additionally provided. This software is compatible with all geochemical varieties and can be used in similar exploration projects.
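
    The non-structural comparison criterion (mean plus n standard deviations) is easy to reproduce; a minimal sketch is shown below with invented grade values. It is intended only to clarify the baseline against which the U-statistic classification is compared, not to reproduce the authors' MATLAB software.

```python
import numpy as np

def msd_anomalies(values, n=2.0):
    """Flag samples whose grade exceeds mean + n*std (the x-bar + n*s criterion),
    a non-structural baseline to compare against a location-aware classification."""
    values = np.asarray(values, dtype=float)
    threshold = values.mean() + n * values.std()
    return values > threshold, threshold

grades = np.array([0.8, 1.1, 0.9, 1.0, 4.6, 1.2, 5.1, 0.7, 1.0, 0.9])
mask, thr = msd_anomalies(grades, n=2.0)
print(thr, grades[mask])   # threshold and the samples classified as anomalous
```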

  9. Post-operative diffusion weighted imaging as a predictor of posterior fossa syndrome permanence in paediatric medulloblastoma.

    PubMed

    Chua, Felicia H Z; Thien, Ady; Ng, Lee Ping; Seow, Wan Tew; Low, David C Y; Chang, Kenneth T E; Lian, Derrick W Q; Loh, Eva; Low, Sharon Y Y

    2017-03-01

    Posterior fossa syndrome (PFS) is a serious complication faced by neurosurgeons and their patients, especially in paediatric medulloblastoma patients. The uncertain aetiology of PFS, the myriad of cited risk factors and the therapeutic challenges make this phenomenon an elusive entity. The primary objective of this study was to identify associative factors related to the development of PFS in medulloblastoma patients after tumour resection. This is a retrospective study based at a single institution. Patient data and all related information were collected from the hospital records, in accordance with a list of possible risk factors associated with PFS. These included pre-operative tumour volume, hydrocephalus, age, gender, extent of resection, metastasis, ventriculoperitoneal shunt insertion, post-operative meningitis and radiological changes in MRI. Additional variables included the molecular and histological subtypes of each patient's medulloblastoma tumour. Statistical analysis was employed to determine the significance of each variable for PFS permanence. A total of 19 patients with appropriately complete data were identified. Initial univariate analysis did not show any statistical significance. However, multivariate analysis of MRI-specific changes showed that bilateral DWI restricted-diffusion changes, involving both the right and left sides of the surgical cavity, were statistically significant for PFS permanence. The authors performed a clinical study that evaluated possible risk factors for permanent PFS in paediatric medulloblastoma patients. Analysis of the collated results found that post-operative DWI restriction in bilateral regions within the surgical cavity was a statistically significant predictor of PFS permanence, a novel finding in the current literature.

  10. Impact of Damping Uncertainty on SEA Model Response Variance

    NASA Technical Reports Server (NTRS)

    Schiller, Noah; Cabell, Randolph; Grosveld, Ferdinand

    2010-01-01

    Statistical Energy Analysis (SEA) is commonly used to predict high-frequency vibroacoustic levels. This statistical approach provides the mean response over an ensemble of random subsystems that share the same gross system properties such as density, size, and damping. Recently, techniques have been developed to predict the ensemble variance as well as the mean response. However these techniques do not account for uncertainties in the system properties. In the present paper uncertainty in the damping loss factor is propagated through SEA to obtain more realistic prediction bounds that account for both ensemble and damping variance. The analysis is performed on a floor-equipped cylindrical test article that resembles an aircraft fuselage. Realistic bounds on the damping loss factor are determined from measurements acquired on the sidewall of the test article. The analysis demonstrates that uncertainties in damping have the potential to significantly impact the mean and variance of the predicted response.

  11. Multi-element fingerprinting as a tool in origin authentication of four east China marine species.

    PubMed

    Guo, Lipan; Gong, Like; Yu, Yanlei; Zhang, Hong

    2013-12-01

    The contents of 25 elements in 4 types of commercial marine species from the East China Sea were determined by inductively coupled plasma mass spectrometry and atomic absorption spectrometry. The elemental composition was used to differentiate marine species according to geographical origin by multivariate statistical analysis. The results showed that principal component analysis could distinguish samples from different areas and reveal the elements which played the most important role in origin diversity. The models established by partial least squares discriminant analysis (PLS-DA) and by probabilistic neural network (PNN) can both precisely predict the origin of the marine species. Further study indicated that PLS-DA and PNN were efficacious in regional discrimination. The models from these 2 statistical methods, with accuracies of 97.92% and 100%, respectively, could both distinguish samples from different areas without the need for species differentiation. © 2013 Institute of Food Technologists®

  12. Tract-Based Spatial Statistics in Preterm-Born Neonates Predicts Cognitive and Motor Outcomes at 18 Months.

    PubMed

    Duerden, E G; Foong, J; Chau, V; Branson, H; Poskitt, K J; Grunau, R E; Synnes, A; Zwicker, J G; Miller, S P

    2015-08-01

    Adverse neurodevelopmental outcome is common in children born preterm. Early sensitive predictors of neurodevelopmental outcome such as MR imaging are needed. Tract-based spatial statistics, a diffusion MR imaging analysis method, performed at term-equivalent age (40 weeks) is a promising predictor of neurodevelopmental outcomes in children born very preterm. We sought to determine the association of tract-based spatial statistics findings before term-equivalent age with neurodevelopmental outcome at 18-months corrected age. Of 180 neonates (born at 24-32-weeks' gestation) enrolled, 153 had DTI acquired early at 32 weeks' postmenstrual age and 105 had DTI acquired later at 39.6 weeks' postmenstrual age. Voxelwise statistics were calculated by performing tract-based spatial statistics on DTI that was aligned to age-appropriate templates. At 18-month corrected age, 166 neonates underwent neurodevelopmental assessment by using the Bayley Scales of Infant Development, 3rd ed, and the Peabody Developmental Motor Scales, 2nd ed. Tract-based spatial statistics analysis applied to early-acquired scans (postmenstrual age of 30-33 weeks) indicated a limited significant positive association between motor skills and axial diffusivity and radial diffusivity values in the corpus callosum, internal and external/extreme capsules, and midbrain (P < .05, corrected). In contrast, for term scans (postmenstrual age of 37-41 weeks), tract-based spatial statistics analysis showed a significant relationship between both motor and cognitive scores with fractional anisotropy in the corpus callosum and corticospinal tracts (P < .05, corrected). Tract-based spatial statistics in a limited subset of neonates (n = 22) scanned at <30 weeks did not significantly predict neurodevelopmental outcomes. The strength of the association between fractional anisotropy values and neurodevelopmental outcome scores increased from early-to-late-acquired scans in preterm-born neonates, consistent with brain dysmaturation in this population. © 2015 by American Journal of Neuroradiology.

  13. SPSS and SAS programs for determining the number of components using parallel analysis and velicer's MAP test.

    PubMed

    O'Connor, B P

    2000-08-01

    Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
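
    Outside SPSS and SAS, the same parallel analysis logic can be expressed compactly; the sketch below implements Horn's procedure (retain components while the observed eigenvalue exceeds a percentile of eigenvalues from random data of the same shape) on simulated items. It is a generic illustration, not the programs described in the paper.

```python
import numpy as np

def parallel_analysis(data, n_iter=200, percentile=95, seed=0):
    """Horn's parallel analysis: keep components as long as the observed eigenvalue
    exceeds the chosen percentile of eigenvalues obtained from random normal data."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]   # descending
    rand = np.empty((n_iter, p))
    for i in range(n_iter):
        r = rng.normal(size=(n, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    cutoff = np.percentile(rand, percentile, axis=0)
    k = 0
    while k < p and obs[k] > cutoff[k]:
        k += 1
    return k, obs, cutoff

# Toy example: two correlated blocks of three items each -> expect 2 components
rng = np.random.default_rng(1)
f = rng.normal(size=(300, 2))
noise = 0.5 * rng.normal(size=(300, 6))
data = np.hstack([f[:, [0]] * np.ones((1, 3)), f[:, [1]] * np.ones((1, 3))]) + noise
print(parallel_analysis(data)[0])
```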

  14. Significance analysis of the regional differences on icing time of water onto fire protective clothing

    NASA Astrophysics Data System (ADS)

    Zhao, L. Z.; Jing, L. S.; Zhang, X. Z.; Xia, J. J.; Chen, Y.; Chen, T.; Hu, C.; Bao, Z. M.; Fu, X. C.; Wang, R. J.; Wang, Y.; Wang, Y. J.

    2017-09-01

    The objective of this work was to determine the icing temperature for the icing experiment. First, a questionnaire investigation was carried out on 38 fire detachments in different regions. The statistical percentage results were divided into a northeastern group and a northwestern group. Second, a significance analysis between these two sets of results was made using the Mann-Whitney U test, and the icing temperature was then determined for the different regions. Third, the icing experiment was conducted in a -20°C environment in the Daxing’an Mountains. The anti-icing effect of the new fire protective clothing was verified in this icing experiment.
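
    The Mann-Whitney U comparison between the two regional groups is a one-line call in SciPy; the sketch below uses invented percentage values purely to illustrate the significance test reported in the abstract.

```python
from scipy.stats import mannwhitneyu

# Hypothetical questionnaire percentages from the two regional groups of fire
# detachments (values are illustrative, not the study's data).
northeast = [55, 60, 48, 52, 63, 58, 50, 61]
northwest = [40, 45, 38, 47, 42, 50, 44, 39]

stat, p_value = mannwhitneyu(northeast, northwest, alternative="two-sided")
print(stat, p_value)   # a small p-value indicates a regional difference
```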

  15. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android statistical data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various basic statistics topics along with parametric statistical data analysis. The output of the system is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed using the Java programming language. The server side uses PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology used is the Waterfall methodology, with the stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lectures and to make it easier for students to understand statistical analysis on mobile devices.

  16. Determining the Statistical Power of the Kolmogorov-Smirnov and Anderson-Darling Goodness-of-Fit Tests via Monte Carlo Simulation

    DTIC Science & Technology

    2016-12-01

    KS and AD statistical power via Monte Carlo simulation. Statistical power is the probability of correctly rejecting the null hypothesis when the ... Determining the statistical power ... real-world data to test the accuracy of the simulation. Statistical comparison of these metrics can be necessary when making such a determination.
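
    The underlying idea, estimating power by repeatedly simulating data from an alternative distribution and counting rejections, can be sketched briefly. The choice of a Student-t alternative against a standard-normal null below is an illustrative assumption, not one of the scenarios studied in the report.

```python
import numpy as np
from scipy import stats

def ks_power_vs_t(sample_size, df=3, n_sim=2000, alpha=0.05, seed=0):
    """Monte Carlo estimate of the one-sample Kolmogorov-Smirnov test's power to
    reject a standard-normal null when the data actually come from a Student-t
    distribution with `df` degrees of freedom."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sim):
        x = rng.standard_t(df, size=sample_size)        # data from the alternative
        if stats.kstest(x, "norm").pvalue < alpha:      # test against N(0, 1)
            rejections += 1
    return rejections / n_sim

# Power grows with sample size for a fixed alternative
print(ks_power_vs_t(30), ks_power_vs_t(200))
```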

  17. Systems Analysis Directorate Activities Summary - September 1977. Volume 1

    DTIC Science & Technology

    1977-10-01

    Keywords: chemical agent, censor criteria, purity of the agent, statistical samples. Abstract (fragment): ... chemical agent lots. Volume II (CONF) contains an analysis of the operational capability of the 105mm M101A1 and M102 Howitzers. Contents (fragment): Procedure for Determining the Serviceability Category of Chemical Agent Lots; User's Guide to the Computer ...

  18. Quantitative Analysis of Repertoire Scale Immunoglobulin properties in Vaccine Induced B cell Responses

    DTIC Science & Technology

    Immunosequencing now readily generates 10^3-10^5 sequences per sample; however, statistical analysis of these repertoires is challenging because of the high genetic ... diversity of BCRs and the elaborate clonal relationships among them. To date, most immunosequencing analyses have focused on reporting qualitative ... repertoire differences, (2) identifying how two repertoires differ, and (3) determining appropriate confidence intervals for assessing the size of the differences and their potential biological relevance.

  19. Determining the Value of Automation in Commercial and USAF Supplier Evaluation Systems

    DTIC Science & Technology

    2003-03-01

    "Component Analysis," The Journal of Supply Chain Management, 36: 63-69 (Spring 2000). Piercy, Nigel F. and David W. Cravens. "Examining the Role of ... includes content analysis of the interview responses of eight senior managers from commercial companies in the aerospace or air transportation industries ... interviewing all ten personnel, the responses were transcribed and transferred into StatPac. The statistical information from StatPac provided the

  20. Correlation of RNA secondary structure statistics with thermodynamic stability and applications to folding.

    PubMed

    Wu, Johnny C; Gardner, David P; Ozer, Stuart; Gutell, Robin R; Ren, Pengyu

    2009-08-28

    The accurate prediction of the secondary and tertiary structure of an RNA with different folding algorithms is dependent on several factors, including the energy functions. However, an RNA higher-order structure cannot be predicted accurately from its sequence based on a limited set of energy parameters. The inter- and intramolecular forces between this RNA and other small molecules and macromolecules, in addition to other factors in the cell such as pH, ionic strength, and temperature, influence the complex dynamics associated with transition of a single-stranded RNA to its secondary and tertiary structure. Since all of the factors that affect the formation of an RNA's 3D structure cannot be determined experimentally, statistically derived potential energy has been used in the prediction of protein structure. In the current work, we evaluate the statistical free energy of various secondary structure motifs, including base-pair stacks, hairpin loops, and internal loops, using their statistical frequency obtained from the comparative analysis of more than 50,000 RNA sequences stored in the RNA Comparative Analysis Database (rCAD) at the Comparative RNA Web (CRW) Site. Statistical energy was computed from the structural statistics for several datasets. While the statistical energy for a base-pair stack correlates with experimentally derived free energy values, suggesting a Boltzmann-like distribution, variation is observed between different molecules and their location on the phylogenetic tree of life. Our statistical energy values calculated for several structural elements were utilized in the Mfold RNA-folding algorithm. The combined statistical energy values for base-pair stacks, hairpins and internal loop flanks result in a significant improvement in the accuracy of secondary structure prediction; the hairpin flanks contribute the most.
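
    The Boltzmann-like conversion from motif frequencies to statistical energies can be illustrated with a short sketch; the E = -RT ln(f/f_ref) form, the reference state (the mean frequency), and the counts below are assumptions for demonstration, not the values derived from rCAD.

```python
import numpy as np

def statistical_energy(counts, rt=0.616):
    """Convert motif frequencies into Boltzmann-like pseudo free energies,
    E_i = -RT * ln(f_i / <f>), so frequent motifs map to low (favourable) energies.
    RT is about 0.616 kcal/mol at 37 C; counts are observations per motif type."""
    counts = np.asarray(counts, dtype=float)
    freqs = counts / counts.sum()
    return -rt * np.log(freqs / freqs.mean())

# Hypothetical base-pair-stack counts from a comparative alignment database
stack_counts = {"GC/GC": 42000, "AU/GC": 18500, "AU/AU": 7200, "GU/AU": 950}
energies = statistical_energy(list(stack_counts.values()))
print(dict(zip(stack_counts, np.round(energies, 2))))
```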

  1. Best PhD thesis Prize: Statistical analysis of ALFALFA galaxies: insights in galaxy

    NASA Astrophysics Data System (ADS)

    Papastergis, E.

    2013-09-01

    We use the rich dataset of local universe galaxies detected by the ALFALFA 21cm survey to study the statistical properties of gas-bearing galaxies. In particular, we measure the number density of galaxies as a function of their baryonic mass ("baryonic mass function") and rotational velocity ("velocity width function"), and we characterize their clustering properties ("two-point correlation function"). These statistical distributions are determined by both the properties of dark matter on small scales, as well as by the complex baryonic processes through which galaxies form over cosmic time. We interpret the ALFALFA measurements with the aid of publicly available cosmological N-body simulations and we present some key results related to galaxy formation and small-scale cosmology.

  2. The construction and assessment of a statistical model for the prediction of protein assay data.

    PubMed

    Pittman, J; Sacks, J; Young, S Stanley

    2002-01-01

    The focus of this work is the development of a statistical model for a bioinformatics database whose distinctive structure makes model assessment an interesting and challenging problem. The key components of the statistical methodology, including a fast approximation to the singular value decomposition and the use of adaptive spline modeling and tree-based methods, are described, and preliminary results are presented. These results are shown to compare favorably to selected results achieved using comparative methods. An attempt to determine the predictive ability of the model through the use of cross-validation experiments is discussed. In conclusion, a synopsis of the results of these experiments and their implications for the analysis of bioinformatic databases in general is presented.

  3. Statistical competencies for medical research learners: What is fundamental?

    PubMed

    Enders, Felicity T; Lindsell, Christopher J; Welty, Leah J; Benn, Emma K T; Perkins, Susan M; Mayo, Matthew S; Rahbar, Mohammad H; Kidwell, Kelley M; Thurston, Sally W; Spratt, Heidi; Grambow, Steven C; Larson, Joseph; Carter, Rickey E; Pollock, Brad H; Oster, Robert A

    2017-06-01

    It is increasingly essential for medical researchers to be literate in statistics, but the requisite degree of literacy is not the same for every statistical competency in translational research. Statistical competency can range from 'fundamental' (necessary for all) to 'specialized' (necessary for only some). In this study, we determine the degree to which each competency is fundamental or specialized. We surveyed members of 4 professional organizations, targeting doctorally trained biostatisticians and epidemiologists who taught statistics to medical research learners in the past 5 years. Respondents rated 24 educational competencies on a 5-point Likert scale anchored by 'fundamental' and 'specialized.' There were 112 responses. Nineteen of 24 competencies were fundamental. The competencies considered most fundamental were assessing sources of bias and variation (95%), recognizing one's own limits with regard to statistics (93%), identifying the strengths, and limitations of study designs (93%). The least endorsed items were meta-analysis (34%) and stopping rules (18%). We have identified the statistical competencies needed by all medical researchers. These competencies should be considered when designing statistical curricula for medical researchers and should inform which topics are taught in graduate programs and evidence-based medicine courses where learners need to read and understand the medical research literature.

  4. Visual functions in amblyopia as determinants of response to treatment.

    PubMed

    Singh, Vinita; Agrawal, Siddharth

    2013-01-01

    To describe the visual functions in amblyopia as determinants of response to treatment. Sixty-nine patients with unilateral and bilateral amblyopia (114 amblyopic eyes) 3 to 15 years old (mean age: 8.80 ± 2.9 years), 40 males (58%) and 29 females (42%), were included in this study. All patients were treated by conventional occlusion 6 hours per day for mild to moderate amblyopia (visual acuity 0.70 or better) and full-time for 4 weeks followed by 6 hours per day for severe amblyopia (visual acuity 0.8 or worse). During occlusion, near activities requiring hand-eye coordination were advised. The follow-up examination was done at 3 and 6 months. Improvement in visual acuity was evaluated on the logMAR chart and correlated with the visual functions. Statistical analysis was done using Wilcoxon rank sum test (Mann-Whitney U test) and Kruskal-Wallis analysis. There was a statistically significant association of poor contrast sensitivity with the grade of amblyopia (P < .001). The grade of amblyopia (P < .01), accommodation (P < .01), stereopsis (P = .01), and mesopic visual acuity (P < .03) were found to have a correlation with response to amblyopia therapy. The grade of amblyopia (initial visual acuity) and accommodation are strong determinants of response to amblyopia therapy, whereas stereopsis and mesopic visual acuity have some value as determinants. Copyright 2013, SLACK Incorporated.

  5. Multi-criteria decision analysis and spatial statistic: an approach to determining human vulnerability to vector transmission of Trypanosoma cruzi.

    PubMed

    Montenegro, Diego; Cunha, Ana Paula da; Ladeia-Andrade, Simone; Vera, Mauricio; Pedroso, Marcel; Junqueira, Angela

    2017-10-01

    Chagas disease (CD), caused by the protozoan Trypanosoma cruzi, is a neglected human disease. It is endemic to the Americas and is estimated to have an economic impact, including lost productivity and disability, of 7 billion dollars per year on average. To assess vulnerability to vector-borne transmission of T. cruzi in domiciliary environments within an area undergoing domiciliary vector interruption of T. cruzi in Colombia. Multi-criteria decision analysis [preference ranking method for enrichment evaluation (PROMETHEE) and geometrical analysis for interactive assistance (GAIA) methods] and spatial statistics were performed on data from a socio-environmental questionnaire and an entomological survey. In the construction of multi-criteria descriptors, decision-making processes and indicators of five determinants of the CD vector pathway were summarily defined, including: (1) house indicator (HI); (2) triatominae indicator (TI); (3) host/reservoir indicator (Ho/RoI); (4) ecotope indicator (EI); and (5) socio-cultural indicator (S-CI). Determination of vulnerability to CD is mostly influenced by TI, with 44.96% of the total weight in the model, while the lowest contribution was from S-CI, with 7.15%. The five indicators comprise 17 indices, and include 78 of the original 104 priority criteria and variables. The PROMETHEE and GAIA methods proved very efficient for prioritisation and quantitative categorisation of socio-environmental determinants and for better determining which criteria should be considered for interrupting the man-T. cruzi-vector relationship in endemic areas of the Americas. Through the analysis of spatial autocorrelation it is clear that there is a spatial dependence in establishing categories of vulnerability, therefore, the effect of neighbors' setting (border areas) on local values should be incorporated into disease management for establishing programs of surveillance and control of CD via vector. The study model proposed here is flexible and can be adapted to various eco-epidemiological profiles and is suitable for focusing anti-T. cruzi serological surveillance programs in vulnerable human populations.

  6. A comparison of the technique of the football quarterback pass between high school and university athletes.

    PubMed

    Toffan, Adam; Alexander, Marion J L; Peeler, Jason

    2017-07-28

    The purpose of the study was to compare the most effective joint movements, segment velocities and body positions to perform the fastest and most accurate pass of high school and university football quarterbacks. Secondary purposes were to develop a quarterback throwing test to assess skill level, to determine which kinematic variables were different between high school and university athletes, as well as to determine which variables were significant predictors of quarterback throwing test performance. Ten high school and ten university athletes were filmed for the study, performing nine passes at a target and two passes for maximum distance. Thirty variables were measured using the Dartfish Team Pro 4.5.2 video analysis system, and Microsoft Excel was used for statistical analysis. University athletes scored slightly higher than the high school athletes on the throwing test; however, this result was not statistically significant. Correlation analysis and forward stepwise multiple regression analysis were performed on both the high school players and the university players in order to determine which variables were significant predictors of throwing test score. Ball velocity was determined to have the strongest predictive effect on throwing test score (r = 0.900) for the high school athletes; however, the position of the back foot at release was also determined to be important (r = 0.661) for the university group. Several significant differences in throwing technique between groups were noted during the pass; however, body position at release showed the greatest differences between the two groups. High school players could benefit from more complete weight transfer and decreased throw time to increase throwing test score. University athletes could benefit from increased throw time and greater range of motion in external shoulder rotation and trunk rotation to increase their throwing test score. Coaches and practitioners will be able to use the findings of this research to help improve these and related throwing variables in their high school and university quarterbacks.
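
    The predictor hunt described above combines correlation screening with forward stepwise multiple regression. A hedged sketch of forward selection with scikit-learn on synthetic kinematic variables (not the thirty measured variables) might look like this:

```python
# Sketch of forward stepwise predictor selection for a throwing-test score.
# Feature values and the "true" predictors are illustrative placeholders.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_athletes, n_vars = 20, 8
X = rng.normal(size=(n_athletes, n_vars))          # kinematic variables
# Hypothetical ground truth: score driven mainly by variables 0 and 3.
y = 3.0 * X[:, 0] + 1.5 * X[:, 3] + rng.normal(0.0, 0.5, size=n_athletes)

selector = SequentialFeatureSelector(LinearRegression(),
                                     n_features_to_select=3,
                                     direction="forward", cv=5)
selector.fit(X, y)
picked = selector.get_support()
print("selected variable indices:", np.flatnonzero(picked))

model = LinearRegression().fit(X[:, picked], y)
print("R^2 on selected variables:", round(model.score(X[:, picked], y), 3))
```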

  7. Hepatitis C infection among intravenous drug users attending therapy programs in Cyprus.

    PubMed

    Demetriou, Victoria L; van de Vijver, David A M C; Hezka, Johana; Kostrikis, Leondios G

    2010-02-01

    The highest-risk population for HCV transmission worldwide today is intravenous drug users. HCV genotypes in the general population in Cyprus demonstrate a polyphyletic infection and include subtypes associated with intravenous drug users. The prevalence of HCV, HBV, and HIV infection, HCV genotypes and risk factors among intravenous drug users in Cyprus were investigated here for the first time. Blood samples and interviews were obtained from 40 consenting users in treatment centers, and were tested for HCV, HBV, and HIV antibodies. On the HCV-positive samples, viral RNA extraction, RT-PCR and sequencing were performed. Phylogenetic analysis determined subtypes and relationships with database sequences, and statistical analysis determined any correlation of risk factors with HCV infection. The prevalence of HCV infection was 50%, but no HBV or HIV infections were found. Of the PCR-positive samples, eight (57%) were genotype 3a, and six (43%) were 1b. No other subtypes, recombinant strains or mixed infections were observed. The phylogenetic analysis of the injecting drug users' strains against database sequences revealed no clustering, which does not allow determination of the transmission route, possibly due to a limitation of sequences in the database. However, three clusters were discovered among the drug users' sequences, revealing small groups who possibly share injecting equipment. Statistical analysis showed that the risk factor associated with HCV infection is duration of drug use. Overall, the polyphyletic nature of HCV infection in Cyprus is confirmed, but the transmission route remains unknown. These findings highlight the need for harm-reduction strategies to reduce HCV transmission. (c) 2009 Wiley-Liss, Inc.

  8. Training in metabolomics research. II. Processing and statistical analysis of metabolomics data, metabolite identification, pathway analysis, applications of metabolomics and its future.

    PubMed

    Barnes, Stephen; Benton, H Paul; Casazza, Krista; Cooper, Sara J; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K; Renfrow, Matthew B; Tiwari, Hemant K

    2016-08-01

    Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. Copyright © 2016 John Wiley & Sons, Ltd.
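
    One step listed above, statistical methods to determine metabolites undergoing significant change, commonly amounts to a per-metabolite test with multiple-testing correction. A minimal sketch on a synthetic intensity matrix, omitting the normalisation and missing-value handling a real pipeline would need, is:

```python
# Per-metabolite t-tests with Benjamini-Hochberg FDR correction.
# The intensity matrices are synthetic stand-ins for real metabolomics data.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(7)
n_metabolites, n_per_group = 500, 12
control = rng.lognormal(mean=5.0, sigma=0.4, size=(n_metabolites, n_per_group))
treated = rng.lognormal(mean=5.0, sigma=0.4, size=(n_metabolites, n_per_group))
treated[:25] *= 1.8          # spike 25 "changed" metabolites

# Test on log intensities, one row (metabolite) at a time.
t_stat, p_values = stats.ttest_ind(np.log(treated), np.log(control), axis=1)
reject, q_values, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")

print("metabolites passing 5% FDR:", int(reject.sum()))
```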

  9. Site Suitability Analysis for Beekeeping via Analytical Hierarchy Process, Konya Example

    NASA Astrophysics Data System (ADS)

    Sarı, F.; Ceylan, D. A.

    2017-11-01

    Over the past decade, the importance of beekeeping activities has been emphasized in the fields of biodiversity, ecosystems, agriculture and human health. Efficient management and correct choice of beekeeping activities therefore seem essential to maintaining and improving productivity and efficiency. Given this importance, and considering the economic contribution to rural areas, the need for a suitability analysis arises. At this point, integrating Multi Criteria Decision Analysis (MCDA) with Geographical Information Systems (GIS) provides efficient solutions to the complex decision-making process for beekeeping activities. In this study, a site suitability analysis via the Analytical Hierarchy Process (AHP) was carried out for Konya city in Turkey. Slope, elevation, aspect, distance to water resources, roads and settlements, precipitation and flora criteria were included to determine suitability. The requirements, expectations and limitations of beekeeping activities were specified with the participation of experts and stakeholders. The final suitability map was validated against 117 existing beekeeping locations and Turkish Statistical Institute 2016 beekeeping statistics for Konya province.
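
    The AHP step referenced above derives criterion weights from a pairwise comparison matrix and checks them with a consistency ratio before any GIS overlay. A small sketch with an invented comparison matrix over four of the listed criteria, not the judgements elicited from the study's experts and stakeholders:

```python
# AHP criterion weights from a pairwise comparison matrix (invented values,
# not the study's elicited judgements), plus the consistency ratio.
import numpy as np

criteria = ["slope", "distance to water", "precipitation", "flora"]
# A[i, j] = how much more important criterion i is than criterion j (Saaty 1-9 scale).
A = np.array([
    [1.0, 1/3, 1/5, 1/4],
    [3.0, 1.0, 1/2, 1/2],
    [5.0, 2.0, 1.0, 2.0],
    [4.0, 2.0, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # principal eigenvector, normalised

n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)               # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # Saaty's random index
cr = ci / ri                                  # commonly accepted if below ~0.10

for name, w in zip(criteria, weights):
    print(f"{name:>18s}: {w:.3f}")
print("consistency ratio:", round(cr, 3))
```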

  10. The application of feature selection to the development of Gaussian process models for percutaneous absorption.

    PubMed

    Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P

    2010-06-01

    The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset and to employ these methods, particularly feature selection, to determine the key physicochemical descriptors which exert the most significant influence on percutaneous absorption, and to compare such models with established existing models. Gaussian processes, including automatic relevance detection (GPRARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MatLab software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study. A range of statistical measures to determine model quality were used. The inherently nonlinear nature of the skin data set was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (where the rank order was: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis also found that great synergy existed between certain parameters. It suggested that a number of the descriptors employed were effectively interchangeable, thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity, compared with SLN or QSPR models. Feature selection methods were able to provide important mechanistic information. However, it was also shown that significant synergy existed between certain parameters, and as such it was possible to interchange certain descriptors (i.e. molecular weight and melting point) without incurring a loss of model quality. Such synergy suggested that a model constructed from discrete terms in an equation may not be the most appropriate way of representing mechanistic understandings of skin absorption.
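
    The feature selection above comes from a Gaussian process whose kernel assigns each descriptor its own length scale (the automatic relevance idea), so that large learned length scales flag descriptors with little influence. A hedged scikit-learn sketch on synthetic descriptors, not the skin permeability data set:

```python
# Gaussian process regression with per-feature (ARD) length scales.
# Descriptor names and data are illustrative, not the permeability data set.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
descriptors = ["logP", "MW", "melting point", "H-bond donors"]
X = rng.normal(size=(80, 4))
# Hypothetical ground truth: the response depends on logP and H-bond donors only.
y = 1.5 * X[:, 0] - 0.8 * X[:, 3] + rng.normal(0.0, 0.1, size=80)

kernel = RBF(length_scale=np.ones(4)) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Small learned length scale -> the descriptor strongly influences the output.
for name, ls in zip(descriptors, gpr.kernel_.k1.length_scale):
    print(f"{name:>14s}: length scale = {ls:.2f}")
```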

  11. A Monte Carlo Analysis of the Thrust Imbalance for the RSRMV Booster During Both the Ignition Transient and Steady State Operation

    NASA Technical Reports Server (NTRS)

    Foster, Winfred A., Jr.; Crowder, Winston; Steadman, Todd E.

    2014-01-01

    This paper presents the results of statistical analyses performed to predict the thrust imbalance between two solid rocket motor boosters to be used on the Space Launch System (SLS) vehicle. Two legacy internal ballistics codes developed for the Space Shuttle program were coupled with a Monte Carlo analysis code to determine a thrust imbalance envelope for the SLS vehicle based on the performance of 1000 motor pairs. Thirty three variables which could impact the performance of the motors during the ignition transient and thirty eight variables which could impact the performance of the motors during steady state operation of the motor were identified and treated as statistical variables for the analyses. The effects of motor to motor variation as well as variations between motors of a single pair were included in the analyses. The statistical variations of the variables were defined based on data provided by NASA's Marshall Space Flight Center for the upgraded five segment booster and from the Space Shuttle booster when appropriate. The results obtained for the statistical envelope are compared with the design specification thrust imbalance limits for the SLS launch vehicle.
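
    The workflow above couples internal ballistics models with Monte Carlo sampling of motor-to-motor variations to build a thrust-imbalance envelope. The sketch below keeps only the Monte Carlo scaffolding; the thrust model and its three variables are deliberately toy stand-ins for the legacy ballistics codes and the 33/38 program variables:

```python
# Monte Carlo thrust-imbalance envelope with a toy thrust model standing in
# for the legacy internal ballistics codes; all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n_pairs = 1000
t = np.linspace(0.0, 120.0, 600)          # burn time, s

def thrust(burn_rate, throat_area, prop_mass):
    """Toy thrust trace (uses the module-level time array t); not a real model."""
    base = 16e6 * burn_rate * throat_area                     # quasi-steady level, N
    tail = np.clip((prop_mass * 110.0 - t) / 10.0, 0.0, 1.0)  # tail-off ramp
    return base * tail

def sample_motor():
    # Motor-to-motor variation of three stand-in variables (nominal = 1.0).
    return (rng.normal(1.0, 0.01),    # burn rate factor
            rng.normal(1.0, 0.005),   # throat area factor
            rng.normal(1.0, 0.004))   # propellant mass factor

imbalance = np.empty((n_pairs, t.size))
for i in range(n_pairs):
    imbalance[i] = thrust(*sample_motor()) - thrust(*sample_motor())

# Envelope: 0.135 / 99.865 percentiles (~3-sigma) of imbalance at each time.
lower, upper = np.percentile(imbalance, [0.135, 99.865], axis=0)
print("peak |imbalance| in envelope: %.2e N" % np.abs([lower, upper]).max())
```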

  12. A Monte Carlo Analysis of the Thrust Imbalance for the Space Launch System Booster During Both the Ignition Transient and Steady State Operation

    NASA Technical Reports Server (NTRS)

    Foster, Winfred A., Jr.; Crowder, Winston; Steadman, Todd E.

    2014-01-01

    This paper presents the results of statistical analyses performed to predict the thrust imbalance between two solid rocket motor boosters to be used on the Space Launch System (SLS) vehicle. Two legacy internal ballistics codes developed for the Space Shuttle program were coupled with a Monte Carlo analysis code to determine a thrust imbalance envelope for the SLS vehicle based on the performance of 1000 motor pairs. Thirty three variables which could impact the performance of the motors during the ignition transient and thirty eight variables which could impact the performance of the motors during steady state operation of the motor were identified and treated as statistical variables for the analyses. The effects of motor to motor variation as well as variations between motors of a single pair were included in the analyses. The statistical variations of the variables were defined based on data provided by NASA's Marshall Space Flight Center for the upgraded five segment booster and from the Space Shuttle booster when appropriate. The results obtained for the statistical envelope are compared with the design specification thrust imbalance limits for the SLS launch vehicle.

  13. Scaling Laws in Canopy Flows: A Wind-Tunnel Analysis

    NASA Astrophysics Data System (ADS)

    Segalini, Antonio; Fransson, Jens H. M.; Alfredsson, P. Henrik

    2013-08-01

    An analysis of velocity statistics and spectra measured above a wind-tunnel forest model is reported. Several measurement stations downstream of the forest edge have been investigated and it is observed that, while the mean velocity profile adjusts quickly to the new canopy boundary condition, the turbulence lags behind and shows a continuous penetration towards the free stream along the canopy model. The statistical profiles illustrate this growth and do not collapse when plotted as a function of the vertical coordinate. However, when the statistics are plotted as a function of the local mean velocity (normalized with a characteristic velocity scale), they do collapse, independently of the streamwise position and freestream velocity. A new scaling for the spectra of all three velocity components is proposed based on the velocity variance and integral time scale. This normalization improves the collapse of the spectra compared to existing scalings adopted in atmospheric measurements, and allows the determination of a universal function that provides the velocity spectrum. Furthermore, a comparison of the proposed scaling laws for two different canopy densities is shown, demonstrating that the vertical velocity variance is the statistical quantity most sensitive to the characteristics of the canopy roughness.

  14. Integrated GIS and multivariate statistical analysis for regional scale assessment of heavy metal soil contamination: A critical review.

    PubMed

    Hou, Deyi; O'Connor, David; Nathanail, Paul; Tian, Li; Ma, Yan

    2017-12-01

    Heavy metal soil contamination is associated with potential toxicity to humans or ecotoxicity. Scholars have increasingly used a combination of geographical information science (GIS) with geostatistical and multivariate statistical analysis techniques to examine the spatial distribution of heavy metals in soils at a regional scale. A review of such studies showed that most soil sampling programs were based on grid patterns and composite sampling methodologies. Many programs intended to characterize various soil types and land use types. The most often used sampling depth intervals were 0-0.10 m, or 0-0.20 m, below surface; and the sampling densities used ranged from 0.0004 to 6.1 samples per km², with a median of 0.4 samples per km². The most widely used spatial interpolators were inverse distance weighted interpolation and ordinary kriging; and the most often used multivariate statistical analysis techniques were principal component analysis and cluster analysis. The review also identified several determining and correlating factors in heavy metal distribution in soils, including soil type, soil pH, soil organic matter, land use type, Fe, Al, and heavy metal concentrations. The major natural and anthropogenic sources of heavy metals were found to derive from lithogenic origin, roadway and transportation, atmospheric deposition, wastewater and runoff from industrial and mining facilities, fertilizer application, livestock manure, and sewage sludge. This review argues that the full potential of integrated GIS and multivariate statistical analysis for assessing heavy metal distribution in soils on a regional scale has not yet been fully realized. It is proposed that future research be conducted to map multivariate results in GIS to pinpoint specific anthropogenic sources, to analyze temporal trends in addition to spatial patterns, to optimize modeling parameters, and to expand the use of different multivariate analysis tools beyond principal component analysis (PCA) and cluster analysis (CA). Copyright © 2017 Elsevier Ltd. All rights reserved.
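
    Inverse distance weighted interpolation, the most common spatial interpolator identified by the review, is simple enough to sketch directly. Sample coordinates, concentrations and the power parameter below are illustrative placeholders:

```python
# Inverse distance weighted (IDW) interpolation of a soil metal concentration
# at unsampled locations; coordinates and values are made up for illustration.
import numpy as np

def idw(xy_samples, values, xy_targets, power=2.0, eps=1e-12):
    """Estimate values at xy_targets as distance-weighted means of the samples."""
    d = np.linalg.norm(xy_targets[:, None, :] - xy_samples[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power          # eps avoids division by zero
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Sampled locations (km) and, say, Pb concentrations (mg/kg).
xy = np.array([[0.0, 0.0], [1.0, 0.2], [0.4, 1.1], [1.3, 1.4], [0.8, 0.6]])
pb = np.array([22.0, 35.0, 18.0, 60.0, 41.0])

grid = np.array([[0.5, 0.5], [1.0, 1.0], [0.1, 1.3]])
print(np.round(idw(xy, pb, grid), 1))
```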

  15. Frontiers of Two-Dimensional Correlation Spectroscopy. Part 1. New concepts and noteworthy developments

    NASA Astrophysics Data System (ADS)

    Noda, Isao

    2014-07-01

    A comprehensive survey review of new and noteworthy developments, which are advancing forward the frontiers in the field of 2D correlation spectroscopy during the last four years, is compiled. This review covers books, proceedings, and review articles published on 2D correlation spectroscopy, a number of significant conceptual developments in the field, data pretreatment methods and other pertinent topics, as well as patent and publication trends and citation activities. Developments discussed include projection 2D correlation analysis, concatenated 2D correlation, and correlation under multiple perturbation effects, as well as orthogonal sample design, predicting 2D correlation spectra, manipulating and comparing 2D spectra, correlation strategy based on segmented data blocks, such as moving-window analysis, features like determination of sequential order and enhanced spectral resolution, statistical 2D spectroscopy using covariance and other statistical metrics, hetero-correlation analysis, and sample-sample correlation technique. Data pretreatment operations prior to 2D correlation analysis are discussed, including the correction for physical effects, background and baseline subtraction, selection of reference spectrum, normalization and scaling of data, derivatives spectra and deconvolution technique, and smoothing and noise reduction. Other pertinent topics include chemometrics and statistical considerations, peak position shift phenomena, variable sampling increments, computation and software, display schemes, such as color coded format, slice and power spectra, tabulation, and other schemes.
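
    At the core of the techniques surveyed above are the synchronous and asynchronous correlation spectra computed from a set of perturbation-dependent spectra via the Hilbert-Noda transformation. The sketch below uses the standard discrete formulas on synthetic spectra, without any of the pretreatment steps the review discusses:

```python
# Synchronous and asynchronous 2D correlation spectra (generalized 2D-COS)
# for a synthetic set of perturbation-dependent spectra.
import numpy as np

rng = np.random.default_rng(5)
m, n = 12, 200                       # m perturbation steps, n wavenumber points
wn = np.linspace(1000, 1800, n)
# Two bands whose intensities evolve differently with the perturbation.
steps = np.linspace(0, 1, m)[:, None]
spectra = (steps * np.exp(-((wn - 1250) / 30) ** 2)
           + (1 - steps) ** 2 * np.exp(-((wn - 1650) / 40) ** 2)
           + 0.01 * rng.normal(size=(m, n)))

dyn = spectra - spectra.mean(axis=0)          # dynamic spectra (mean-centred)

# Hilbert-Noda transformation matrix: N[j, k] = 0 if j == k else 1 / (pi * (k - j)).
j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
with np.errstate(divide="ignore"):
    N = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j)))

sync = dyn.T @ dyn / (m - 1)                  # synchronous spectrum
async_ = dyn.T @ (N @ dyn) / (m - 1)          # asynchronous spectrum
print(sync.shape, async_.shape)               # both (n, n)
```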

  16. Multivariate statistical analysis of heavy metal concentration in soils of Yelagiri Hills, Tamilnadu, India--spectroscopical approach.

    PubMed

    Chandrasekaran, A; Ravisankar, R; Harikrishnan, N; Satapathy, K K; Prasad, M V R; Kanagasabapathy, K V

    2015-02-25

    Anthropogenic activities increase the accumulation of heavy metals in the soil environment. Soil pollution significantly reduces environmental quality and affects human health. In the present study soil samples were collected at different locations of Yelagiri Hills, Tamilnadu, India for heavy metal analysis. The samples were analyzed for twelve selected heavy metals (Mg, Al, K, Ca, Ti, Fe, V, Cr, Mn, Co, Ni and Zn) using energy dispersive X-ray fluorescence (EDXRF) spectroscopy. Heavy metal concentrations in soil were investigated using the enrichment factor (EF), geo-accumulation index (Igeo), contamination factor (CF) and pollution load index (PLI) to determine metal accumulation, distribution and pollution status. Heavy metal toxicity risk was assessed using soil quality guidelines (SQGs) given by target and intervention values of the Dutch soil standards. The concentrations of Ni, Co, Zn, Cr, Mn, Fe, Ti, K, Al and Mg were mainly controlled by natural sources. Multivariate statistical methods such as the correlation matrix, principal component analysis and cluster analysis were applied for the identification of heavy metal sources (anthropogenic/natural origin). Geostatistical methods such as kriging identified hot spots of metal contamination in road areas influenced mainly by the presence of natural rocks. Copyright © 2014 Elsevier B.V. All rights reserved.
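
    The pollution indices used in this record have standard closed forms: CF = Cn/Bn, Igeo = log2(Cn/(1.5 Bn)), PLI is the geometric mean of the contamination factors, and EF normalises each metal to a reference element in sample and background. A small sketch with invented concentrations and background values:

```python
# Enrichment factor, geo-accumulation index, contamination factor and
# pollution load index for one soil sample; all numbers are invented.
import numpy as np

metals = ["Cr", "Mn", "Ni", "Zn"]
sample = np.array([95.0, 820.0, 48.0, 120.0])     # mg/kg in the sample
background = np.array([90.0, 950.0, 68.0, 70.0])  # background/crustal values
fe_sample, fe_background = 41000.0, 47200.0       # Fe as the reference element

ef = (sample / fe_sample) / (background / fe_background)   # enrichment factor
igeo = np.log2(sample / (1.5 * background))                # geo-accumulation index
cf = sample / background                                   # contamination factor
pli = cf.prod() ** (1.0 / cf.size)                         # pollution load index

for m, e, i, c in zip(metals, ef, igeo, cf):
    print(f"{m}: EF={e:.2f}  Igeo={i:.2f}  CF={c:.2f}")
print("PLI =", round(pli, 2))
```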

  17. Regional Exploratory Analysis Between Atmospheric Aerosols and Precipitable Water in the Lower Troposphere via Inferential Statistics

    NASA Astrophysics Data System (ADS)

    Martinez, B. S.; Ye, H.; Levy, R. C.; Fetzer, E. J.; Remer, L.

    2017-12-01

    Atmospheric aerosols introduce high levels of uncertainty into estimates of Earth's changing atmospheric energy budget. Continued exploration and analysis are necessary to obtain a more complete understanding of how, and to what degree, aerosols contribute to climate feedbacks and global climate change. With the advent of global satellite retrievals, along with specific aerosol optical depth (AOD) Dark Target and Deep Blue algorithms, aerosols can now be better measured and analyzed. The aerosol effect on climate depends primarily on altitude, the reflectance (albedo) of the underlying surface, and the presence and dynamics of clouds. As currently known, the majority of aerosol distribution and mixing occurs in the lower troposphere, from the surface upwards to around 2 km. Additionally, as a primary contributor to the greenhouse effect, water vapor is significant to climate feedbacks and Earth's radiation budget. Feedbacks are generally reported at the top of atmosphere (TOA); therefore, little is known of the relationship between water vapor and aerosols, specifically in regions of the globe known for aerosol loading such as anthropogenic biomass burning in South America and naturally occurring dust blowing off the deserts of the African and Arabian peninsulas. Statistical regression and time-series analysis are used to identify statistically significant increasing and decreasing trends in both regional precipitable water (PW) and AOD over the 13-year period 2003-2015. Regions with statistically significant positive or negative trends in AOD and PW are then analyzed to determine correlations, or the lack thereof. This initial examination helps to better understand how aerosols contribute to the radiation budget and supports the assessment of climate change.

  18. Large-angle correlations in the cosmic microwave background

    NASA Astrophysics Data System (ADS)

    Efstathiou, George; Ma, Yin-Zhe; Hanson, Duncan

    2010-10-01

    It has been argued recently by Copi et al. 2009 that the lack of large angular correlations of the CMB temperature field provides strong evidence against the standard, statistically isotropic, inflationary Lambda cold dark matter (ΛCDM) cosmology. We compare various estimators of the temperature correlation function showing how they depend on assumptions of statistical isotropy and how they perform on the Wilkinson Microwave Anisotropy Probe (WMAP) 5-yr Internal Linear Combination (ILC) maps with and without a sky cut. We show that the low multipole harmonics that determine the large-scale features of the temperature correlation function can be reconstructed accurately from the data that lie outside the sky cuts. The reconstructions are only weakly dependent on the assumed statistical properties of the temperature field. The temperature correlation functions computed from these reconstructions are in good agreement with those computed from the ILC map over the whole sky. We conclude that the large-scale angular correlation function for our realization of the sky is well determined. A Bayesian analysis of the large-scale correlations is presented, which shows that the data cannot exclude the standard ΛCDM model. We discuss the differences between our results and those of Copi et al. Either there exists a violation of statistical isotropy as claimed by Copi et al., or these authors have overestimated the significance of the discrepancy because of a posteriori choices of estimator, statistic and sky cut.

  19. Promoting College Students' Problem Understanding Using Schema-Emphasizing Worked Examples

    ERIC Educational Resources Information Center

    Yan, Jie; Lavigne, Nancy C.

    2014-01-01

    Statistics learners often bypass the critical step of understanding a problem before executing solutions. Worked-out examples that identify problem information (e.g., data type, number of groups, purpose of analysis) key to determining a solution (e.g., "t" test, chi-square, correlation) can address this concern. The authors examined the…

  20. 40 CFR 430.03 - Best management practices (BMPs) for spent pulping liquor, soap, and turpentine management, spill...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... triggers investigative or corrective action. Mills determine action levels by a statistical analysis of six... exchanger, recovery furnace or boiler, pipeline, valve, fitting, or other device that contains, processes... gases from the cooking of softwoods by the kraft pulping process. Sometimes referred to as sulfate...

  1. Students' Initial Knowledge State and Test Design: Towards a Valid and Reliable Test Instrument

    ERIC Educational Resources Information Center

    CoPo, Antonio Roland I.

    2015-01-01

    Designing a good test instrument involves specifications, test construction, validation, try-out, analysis and revision. The initial knowledge state of forty (40) tertiary students enrolled in Business Statistics course was determined and the same test instrument undergoes validation. The designed test instrument did not only reveal the baseline…

  2. Assessing the Effectiveness of Statistical Classification Techniques in Predicting Future Employment of Participants in the Temporary Assistance for Needy Families Program

    ERIC Educational Resources Information Center

    Montoya, Isaac D.

    2008-01-01

    Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…

  3. Socioeconomic Determinants of Urban Poverty Area Workers' Labor Force Participation and Income.

    ERIC Educational Resources Information Center

    Pinkerton, James R.

    This study examined how the socioeconomic characteristics of male workers from poverty areas in Saint Louis, Missouri, San Antonio, Texas, and Chicago, Illinois, affect their incomes, hours of employment, unemployment, and labor force participation. The research was based on statistical analysis, using an interaction model, of data from the 1970…

  4. The Influence of Ability Grouping on Math Achievement in a Rural Middle School

    ERIC Educational Resources Information Center

    Pritchard, Robert R.

    2012-01-01

    The researcher examined the academic performance of low-tracked students (n = 156) using standardized math test scores to determine whether there is a statistically significant difference in achievement depending on academic environment, tracked or nontracked. An analysis of variance (ANOVA) was calculated, using a paired samples t-test for a…

  5. Assessment of Caudal Fin Clips as a Non-lethal Technique for Predicting Muscle Tissue Mercury Concentrations in Largemouth Bass

    EPA Science Inventory

    The statistical relationship between total mercury (Hg) concentration in clips from the caudal fin and muscle tissue of largemouth bass (Micropterus salmoides) from 26 freshwater sites in Rhode Island, USA was developed and evaluated to determine the utility of fin clip analysis ...

  6. Swimming Pool Water Treatment Chemicals and/or Processes. Standard No. 22.

    ERIC Educational Resources Information Center

    National Sanitation Foundation, Ann Arbor, MI.

    Chemicals or processes used or intended for use, in the treatment of swimming pool water are covered. Minimum public health limits or acceptability in regard to toxicity, biocidal effectiveness, and chemical behavior and analysis are presented. The appendices give guidelines to the scientific and statistically sound evaluations to determine the…

  7. Welding of AM350 and AM355 steel

    NASA Technical Reports Server (NTRS)

    Davis, R. J.; Wroth, R. S.

    1967-01-01

    A series of tests was conducted to establish optimum procedures for TIG welding and heat treating of AM350 and AM355 steel sheet in thicknesses ranging from 0.010 inch to 0.125 inch. Statistical analysis of the test data was performed to determine the anticipated minimum strength of the welded joints.

  8. A RESEARCH DATABASE FOR IMPROVED DATA MANAGEMENT AND ANALYSIS IN LONGITUDINAL STUDIES

    PubMed Central

    BIELEFELD, ROGER A.; YAMASHITA, TOYOKO S.; KEREKES, EDWARD F.; ERCANLI, EHAT; SINGER, LYNN T.

    2014-01-01

    We developed a research database for a five-year prospective investigation of the medical, social, and developmental correlates of chronic lung disease during the first three years of life. We used the Ingres database management system and the Statit statistical software package. The database includes records containing 1300 variables each, the results of 35 psychological tests, each repeated five times (providing longitudinal data on the child, the parents, and behavioral interactions), both raw and calculated variables, and both missing and deferred values. The four-layer menu-driven user interface incorporates automatic activation of complex functions to handle data verification, missing and deferred values, static and dynamic backup, determination of calculated values, display of database status, reports, bulk data extraction, and statistical analysis. PMID:7596250

  9. Use of model calibration to achieve high accuracy in analysis of computer networks

    DOEpatents

    Frogner, Bjorn; Guarro, Sergio; Scharf, Guy

    2004-05-11

    A system and method are provided for creating a network performance prediction model, and calibrating the prediction model, through application of network load statistical analyses. The method includes characterizing the measured load on the network, which may include background load data obtained over time, and may further include directed load data representative of a transaction-level event. Probabilistic representations of load data are derived to characterize the statistical persistence of the network performance variability and to determine delays throughout the network. The probabilistic representations are applied to the network performance prediction model to adapt the model for accurate prediction of network performance. Certain embodiments of the method and system may be used for analysis of the performance of a distributed application characterized as data packet streams.

  10. Statistical analysis of the ambiguities in the asteroid period determinations

    NASA Astrophysics Data System (ADS)

    Butkiewicz-Bąk, M.; Kwiatkowski, T.; Bartczak, P.; Dudziński, G.; Marciniak, A.

    2017-09-01

    Among asteroids there exist ambiguities in their rotation period determinations. These are due to incomplete coverage of the rotation, noise and/or aliases resulting from gaps between separate lightcurves. To help remove such uncertainties, basic characteristics of the lightcurves, resulting from constraints imposed by the asteroid shapes and the geometries of observation, should be identified. We simulated light variations of asteroids whose shapes were modelled as Gaussian random spheres, with random orientations of spin vectors and phase angles changed every 5° from 0° to 65°. This produced 1.4 million lightcurves. For each simulated lightcurve, a Fourier analysis was made and the harmonic of the highest amplitude was recorded. From the statistical point of view, all lightcurves observed at phase angles α < 30°, with peak-to-peak amplitudes A > 0.2 mag, are bimodal. The second most frequently dominating harmonic is the first, with the third harmonic following right after. For 1 per cent of lightcurves with amplitudes A < 0.1 mag and phase angles α < 40°, the fourth harmonic dominates.
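
    The dominant-harmonic bookkeeping described above reduces to fitting a truncated Fourier series to each lightcurve and recording which harmonic carries the largest amplitude. A minimal least-squares sketch on one synthetic, mostly bimodal lightcurve (not one of the 1.4 million simulated curves):

```python
# Fit a truncated Fourier series to a lightcurve and report the dominant harmonic.
# The lightcurve is synthetic and the rotation period is assumed known here.
import numpy as np

rng = np.random.default_rng(11)
period = 6.0                                   # hours, assumed known
t = np.sort(rng.uniform(0.0, 18.0, size=120))  # observation times, hours
phase = 2 * np.pi * t / period
# Mostly second-harmonic (bimodal) variation plus noise, in magnitudes.
mag = 0.05 * np.sin(phase) + 0.18 * np.cos(2 * phase) + 0.01 * rng.normal(size=t.size)

n_harm = 4
cols = [np.ones_like(t)]
for k in range(1, n_harm + 1):
    cols += [np.sin(k * phase), np.cos(k * phase)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, mag, rcond=None)

amps = np.hypot(coef[1::2], coef[2::2])        # amplitude of each harmonic
dominant = int(np.argmax(amps)) + 1
print("harmonic amplitudes:", np.round(amps, 3))
print("dominant harmonic:", dominant)
```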

  11. Statistical Analysis of the AIAA Drag Prediction Workshop CFD Solutions

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.; Hemsch, Michael J.

    2007-01-01

    The first AIAA Drag Prediction Workshop (DPW), held in June 2001, evaluated the results from an extensive N-version test of a collection of Reynolds-Averaged Navier-Stokes CFD codes. The code-to-code scatter was more than an order of magnitude larger than desired for design and experimental validation of cruise conditions for a subsonic transport configuration. The second AIAA Drag Prediction Workshop, held in June 2003, emphasized the determination of installed pylon-nacelle drag increments and grid refinement studies. The code-to-code scatter was significantly reduced compared to the first DPW, but still larger than desired. However, grid refinement studies showed no significant improvement in code-to-code scatter with increasing grid refinement. The third AIAA Drag Prediction Workshop, held in June 2006, focused on the determination of installed side-of-body fairing drag increments and grid refinement studies for clean attached flow on wing alone configurations and for separated flow on the DLR-F6 subsonic transport model. This report compares the transonic cruise prediction results of the second and third workshops using statistical analysis.

  12. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
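
    The propagation step described above, pushing individual measurement uncertainties through the defining functional expression, can be sketched numerically with finite-difference sensitivities and a root-sum-square combination for uncorrelated inputs. The derived quantity and numbers below are placeholders, not the calibration models treated in the paper:

```python
# Propagate measurement uncertainties through a functional expression using
# numerical partial derivatives; the function and values are placeholders.
import numpy as np

def dynamic_pressure(rho, v):
    """Example derived quantity: q = 0.5 * rho * v**2."""
    return 0.5 * rho * v ** 2

x = np.array([1.18, 45.0])        # measured rho (kg/m^3) and v (m/s)
u = np.array([0.02, 0.5])         # 95%-confidence uncertainties of each input

# Finite-difference sensitivities dq/dx_i.
sens = np.empty_like(x)
for i in range(x.size):
    step = 1e-6 * max(abs(x[i]), 1.0)
    xp, xm = x.copy(), x.copy()
    xp[i] += step
    xm[i] -= step
    sens[i] = (dynamic_pressure(*xp) - dynamic_pressure(*xm)) / (2 * step)

u_q = np.sqrt(np.sum((sens * u) ** 2))   # RSS combination (uncorrelated inputs)
print(f"q = {dynamic_pressure(*x):.1f} Pa +/- {u_q:.1f} Pa")
```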

  13. Rapid determination of trace copper in animal feed based on micro-plate colorimetric reaction and statistical partitioning correction.

    PubMed

    Niu, Yiming; Wang, Jiayi; Zhang, Chi; Chen, Yiqiang

    2017-04-15

    The objective of this study was to develop a micro-plate based colorimetric assay for rapid and high-throughput detection of copper in animal feed. Copper ion in animal feed was extracted by trichloroacetic acid solution and reduced to cuprous ion by hydroxylamine. The cuprous ion can chelate with 2,2'-bicinchoninic acid to form a Cu-BCA complex which was detected with high sensitivity by a micro-plate reader at 354 nm. The whole assay procedure can be completed within 20 min. To eliminate matrix interference, a statistical partitioning correction approach was proposed, which makes the detection of copper in complex samples possible. The limit of detection was 0.035 μg/mL and the detection range was 0.1-10 μg/mL of copper in buffer solution. Actual sample analysis indicated that this colorimetric assay produced results consistent with atomic absorption spectrometry analysis. These results demonstrated that the developed assay can be used for rapid determination of copper in animal feed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. The influence of economic business cycles on United States suicide rates.

    PubMed

    Wasserman, I M

    1984-01-01

    A number of social science investigators have shown that a downturn in the economy leads to an increase in the suicide rate. However, the previous works on the subject are flawed by the fact that they employ years as their temporal unit of analysis. This time period is so large that it makes it difficult for investigators to precisely determine the length of the lag effect, while at the same time removing the autocorrelation effects. Also, although most works on suicide and the business cycle employ unemployment as a measure of a downturn in the business cycle, the average duration of unemployment represents a better measure for determining the social impact of an economic downturn. From 1947 to 1977 the average monthly duration of unemployment is statistically related to the suicide rate using multivariate time-series analysis. From 1910 to 1939 the Ayres business index, a surrogate measure for movement in the business cycle, is statistically related to the monthly suicide rate. An examination of the findings confirms that in most cases a downturn in the economy causes an increase in the suicide rate.

  15. A Method for Characterizing Phenotypic Changes in Highly Variable Cell Populations and its Application to High Content Screening of Arabidopsis thaliana Protoplasts

    PubMed Central

    Johnson, Gregory R.; Kangas, Joshua D.; Dovzhenko, Alexander; Trojok, Rüdiger; Voigt, Karsten; Majarian, Timothy D.; Palme, Klaus; Murphy, Robert F.

    2017-01-01

    Quantitative image analysis procedures are necessary for the automated discovery of effects of drug treatment in large collections of fluorescent micrographs. When compared to their mammalian counterparts, the effects of drug conditions on protein localization in plant species are poorly understood and underexplored. To investigate this relationship, we generated a large collection of images of single plant cells after various drug treatments. For this, protoplasts were isolated from six transgenic lines of A. thaliana expressing fluorescently tagged proteins. Nine drugs at three concentrations were applied to protoplast cultures followed by automated image acquisition. For image analysis, we developed a cell segmentation protocol for detecting drug effects using a Hough-transform based region of interest detector and a novel cross-channel texture feature descriptor. In order to determine treatment effects, we summarized differences between treated and untreated experiments with an L1 Cramér-von Mises statistic. The distribution of these statistics across all pairs of treated and untreated replicates was compared to the variation within control replicates to determine the statistical significance of observed effects. Using this pipeline, we report the dose dependent drug effects in the first high-content Arabidopsis thaliana drug screen of its kind. These results can function as a baseline for comparison to other protein organization modeling approaches in plant cells. PMID:28245335
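
    The effect-detection step above compares treated and untreated feature distributions with a Cramér-von Mises-type statistic and judges significance against the variation seen among control replicates. SciPy provides the classical (L2) two-sample Cramér-von Mises test rather than the L1 variant used in the paper; with that substitution noted, a sketch on synthetic per-cell feature values is:

```python
# Two-sample Cramér-von Mises comparison of a per-cell texture feature,
# with significance judged against control-vs-control variability.
# Synthetic data; SciPy's classical (L2) statistic stands in for the
# paper's L1 variant.
import numpy as np
from scipy.stats import cramervonmises_2samp

rng = np.random.default_rng(2)
controls = [rng.normal(0.0, 1.0, size=300) for _ in range(6)]   # untreated replicates
treated = rng.normal(0.4, 1.1, size=300)                        # one treated replicate

# Null distribution: statistic between pairs of control replicates.
null_stats = [cramervonmises_2samp(controls[i], controls[j]).statistic
              for i in range(6) for j in range(i + 1, 6)]

obs = np.mean([cramervonmises_2samp(treated, c).statistic for c in controls])
p_emp = np.mean([s >= obs for s in null_stats])   # empirical exceedance fraction
print(f"observed statistic = {obs:.3f}, empirical p ~ {p_emp:.2f}")
```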

  16. The determinants of bond angle variability in protein/peptide backbones: A comprehensive statistical/quantum mechanics analysis.

    PubMed

    Improta, Roberto; Vitagliano, Luigi; Esposito, Luciana

    2015-11-01

    The elucidation of the mutual influence between peptide bond geometry and local conformation has important implications for protein structure refinement, validation, and prediction. To gain insights into the structural determinants and the energetic contributions associated with protein/peptide backbone plasticity, we here report an extensive analysis of the variability of the peptide bond angles by combining statistical analyses of protein structures and quantum mechanics calculations on small model peptide systems. Our analyses demonstrate that all the backbone bond angles strongly depend on the peptide conformation and unveil the existence of regular trends as a function of ψ and/or φ. The excellent agreement of the quantum mechanics calculations with the statistical surveys of protein structures validates the computational scheme here employed and demonstrates that the valence geometry of the protein/peptide backbone is primarily dictated by local interactions. Notably, for the first time we show that the position of the H(α) hydrogen atom, which is an important parameter in NMR structural studies, is also dependent on the local conformation. Most of the trends observed may be satisfactorily explained by invoking steric repulsive interactions; in some specific cases the valence bond variability is also influenced by hydrogen-bond-like interactions. Moreover, we can provide a reliable estimate of the energies involved in the interplay between geometry and conformations. © 2015 Wiley Periodicals, Inc.

  17. The GEOS Ozone Data Assimilation System: Specification of Error Statistics

    NASA Technical Reports Server (NTRS)

    Stajner, Ivanka; Riishojgaard, Lars Peter; Rood, Richard B.

    2000-01-01

    A global three-dimensional ozone data assimilation system has been developed at the Data Assimilation Office of the NASA/Goddard Space Flight Center. The Total Ozone Mapping Spectrometer (TOMS) total ozone and the Solar Backscatter Ultraviolet (SBUV or SBUV/2) partial ozone profile observations are assimilated. The assimilation, into an off-line ozone transport model, is done using the global Physical-space Statistical Analysis Scheme (PSAS). This system became operational in December 1999. A detailed description of the statistical analysis scheme, and in particular the forecast and observation error covariance models, is given. A new global anisotropic horizontal forecast error correlation model accounts for a varying distribution of observations with latitude. Correlations are largest in the zonal direction in the tropics where data are sparse. The forecast error variance model is proportional to the ozone field. The forecast error covariance parameters were determined by maximum likelihood estimation. The error covariance models are validated using χ² statistics. The analyzed ozone fields in the winter of 1992 are validated against independent observations from ozone sondes and HALOE. There is better than 10% agreement between mean Halogen Occultation Experiment (HALOE) and analysis fields between 70 and 0.2 hPa. The global root-mean-square (RMS) difference between TOMS observed and forecast values is less than 4%. The global RMS difference between SBUV observed and analyzed ozone between 50 and 3 hPa is less than 15%.

  18. Descriptive data analysis.

    PubMed

    Thompson, Cheryl Bagley

    2009-01-01

    This 13th article of the Basics of Research series is the first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.

  19. Statistical analysis of the electric energy production from photovoltaic conversion using mobile and fixed constructions

    NASA Astrophysics Data System (ADS)

    Bugała, Artur; Bednarek, Karol; Kasprzyk, Leszek; Tomczewski, Andrzej

    2017-10-01

    The paper presents the most representative characteristics, drawn from a three-year measurement period, of daily and monthly electricity production from photovoltaic conversion using modules installed in fixed and 2-axis tracking constructions. Results are presented for selected summer, autumn, spring and winter days. The analyzed measuring stand is located on the roof of the Faculty of Electrical Engineering building of Poznan University of Technology. Basic statistical parameters such as the mean, standard deviation, skewness, kurtosis, median, range and coefficient of variation were used. It was found that the asymmetry (skewness) factor can be useful in the analysis of the daily electricity production from photovoltaic conversion. To determine the repeatability of monthly electricity production between summer months, and between summer and winter months, a non-parametric Mann-Whitney U test was used. To analyze the repeatability of daily peak hours, i.e. the hours with the largest hourly electricity production, a non-parametric Kruskal-Wallis test was applied as an extension of the Mann-Whitney U test. Based on the analysis of the electric energy distributions from the monitoring system, it was found that traditional methods of forecasting electricity production from photovoltaic conversion, such as multiple regression models, should not be the preferred methods of analysis.
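
    The descriptive measures listed in this record (mean, standard deviation, skewness, kurtosis, median, range, coefficient of variation) and the Mann-Whitney U month comparison can be computed directly; the daily energy values below are invented stand-ins for the monitored data:

```python
# Descriptive statistics and a Mann-Whitney U month comparison for daily
# PV energy production; values are invented stand-ins for the monitoring data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
june = rng.gamma(shape=6.0, scale=1.1, size=30)      # daily energy, kWh
december = rng.gamma(shape=2.0, scale=0.8, size=31)

def describe(x):
    return {
        "mean": np.mean(x), "std": np.std(x, ddof=1),
        "skewness": stats.skew(x), "kurtosis": stats.kurtosis(x),
        "median": np.median(x), "range": np.ptp(x),
        "coeff. of variation": np.std(x, ddof=1) / np.mean(x),
    }

for month, x in [("June", june), ("December", december)]:
    summary = ", ".join(f"{k}={v:.2f}" for k, v in describe(x).items())
    print(f"{month}: {summary}")

u, p = stats.mannwhitneyu(june, december, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p:.4f}")
```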

  20. Predictive analysis effectiveness in determining the epidemic disease infected area

    NASA Astrophysics Data System (ADS)

    Ibrahim, Najihah; Akhir, Nur Shazwani Md.; Hassan, Fadratul Hafinaz

    2017-10-01

    Epidemic disease outbreaks have led present-day communities to pay close attention to methods for controlling, preventing and handling infectious disease in order to reduce the rate of dissemination and the size of the infected area. The backpropagation method was used for countermeasure and prediction analysis of epidemic disease. Predictive analysis based on backpropagation can be carried out via a machine learning process that applies artificial intelligence to pattern recognition, statistics and feature selection. This computational learning process is integrated with data mining by using the output score as a classifier for a given set of input features through a classification technique. The classification operates on features selected from the disease dissemination factors, which are likely to be strongly interconnected in causing infectious disease outbreaks. This preliminary study introduces predictive analysis of epidemic disease for determining the infected area, using the backpropagation method and drawing on observations of other researchers' findings. The study classifies the epidemic disease dissemination factors as the features used for weight adjustment in predicting epidemic disease outbreaks. Through this preliminary study, predictive analysis is shown to be an effective method for determining the epidemic disease infected area by minimizing the error value through feature classification.
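
    The setup above amounts to supervised classification of dissemination factors by a backpropagation-trained network. As a minimal sketch under that reading, with invented dissemination features rather than the study's data, a small multilayer perceptron classifier can be used:

```python
# Backpropagation-trained classifier predicting whether an area becomes
# infected from dissemination factors; features and labels are invented.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n_areas = 400
# Hypothetical dissemination factors: population density, mobility index,
# sanitation score, distance to the initial outbreak site.
X = rng.normal(size=(n_areas, 4))
signal = 1.6 * X[:, 0] + 1.0 * X[:, 1] - 1.2 * X[:, 2] - 0.8 * X[:, 3]
y = (signal + rng.normal(0.0, 0.5, size=n_areas) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)
scaler = StandardScaler().fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X_train), y_train)
print("test accuracy:", round(clf.score(scaler.transform(X_test), y_test), 3))
```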
