Sample records for multiple testing procedures

  1. Bon-EV: an improved multiple testing procedure for controlling false discovery rates.

    PubMed

    Li, Dongmei; Xie, Zidian; Zand, Martin; Fogg, Thomas; Dye, Timothy

    2017-01-03

    Stability of multiple testing procedures, defined as the standard deviation of the total number of discoveries, can be used as an indicator of the variability of multiple testing procedures. Improving the stability of multiple testing procedures can help increase the consistency of findings from replicated experiments. The Benjamini-Hochberg and Storey's q-value procedures are two commonly used multiple testing procedures for controlling false discoveries in genomic studies. Storey's q-value procedure has higher power but lower stability than the Benjamini-Hochberg procedure. To improve upon the stability of Storey's q-value procedure while maintaining its high power in genomic data analysis, we propose a new multiple testing procedure, named Bon-EV, to control the false discovery rate (FDR) based on Bonferroni's approach. Simulation studies show that our proposed Bon-EV procedure can maintain the high power of Storey's q-value procedure and also achieve better FDR control and higher stability than Storey's q-value procedure for samples of large size (30 in each group) and medium size (15 in each group), whether the test statistics are independent, somewhat correlated, or highly correlated. When the sample size is small (5 in each group), the proposed Bon-EV procedure performs between the Benjamini-Hochberg procedure and Storey's q-value procedure. Examples using RNA-Seq data show that the Bon-EV procedure has higher stability than Storey's q-value procedure while maintaining equivalent power, and higher power than the Benjamini-Hochberg procedure. For medium or large sample sizes, the Bon-EV procedure has improved FDR control and stability compared with Storey's q-value procedure and improved power compared with the Benjamini-Hochberg procedure. The Bon-EV multiple testing procedure is available as the BonEV package in R at https://CRAN.R-project.org/package=BonEV.
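
    The Bon-EV method itself ships as the BonEV R package linked above and is not re-implemented here. As a rough point of reference, the sketch below computes the two baselines the abstract compares against, Benjamini-Hochberg adjusted p-values and Storey-style q-values, in Python; the simulated p-values and the tuning parameter λ = 0.5 are assumptions for illustration, not values from the paper.

    ```python
    import numpy as np

    def bh_adjust(pvals):
        """Benjamini-Hochberg adjusted p-values (step-up FDR procedure)."""
        p = np.asarray(pvals, dtype=float)
        m = p.size
        order = np.argsort(p)
        ranked = p[order] * m / np.arange(1, m + 1)
        # enforce monotonicity from the largest p-value downwards
        adj = np.minimum.accumulate(ranked[::-1])[::-1]
        out = np.empty(m)
        out[order] = np.clip(adj, 0, 1)
        return out

    def storey_qvalues(pvals, lam=0.5):
        """Storey-style q-values: BH adjustment scaled by an estimate of pi0."""
        p = np.asarray(pvals, dtype=float)
        pi0 = min(1.0, np.mean(p > lam) / (1.0 - lam))  # estimated proportion of true nulls
        return np.clip(pi0 * bh_adjust(p), 0, 1)

    rng = np.random.default_rng(1)
    # 900 null p-values plus 100 signals concentrated near zero
    p = np.concatenate([rng.uniform(size=900), rng.beta(0.2, 5.0, size=100)])
    print("BH discoveries at FDR 0.05:     ", int(np.sum(bh_adjust(p) <= 0.05)))
    print("q-value discoveries at FDR 0.05:", int(np.sum(storey_qvalues(p) <= 0.05)))
    ```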

  2. A Rejection Principle for Sequential Tests of Multiple Hypotheses Controlling Familywise Error Rates

    PubMed Central

    BARTROFF, JAY; SONG, JINLIN

    2015-01-01

    We present a unifying approach to multiple testing procedures for sequential (or streaming) data by giving sufficient conditions for a sequential multiple testing procedure to control the familywise error rate (FWER). Together, these conditions form what we call a “rejection principle for sequential tests,” which we then apply to some existing sequential multiple testing procedures to give a simplified understanding of their FWER control. Next, the principle is applied to derive two new sequential multiple testing procedures with provable FWER control, one for testing hypotheses in order and another for closed testing. Examples of these new procedures are given by applying them to a chromosome aberration data set and to finding the maximum safe dose of a treatment. PMID:26985125
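
    The rejection principle above concerns fully sequential (streaming) tests, which are beyond a short snippet. As a minimal sketch of one ingredient the abstract mentions, testing hypotheses in a prespecified order, the Python fragment below implements the classical fixed-sample fixed-sequence procedure, which controls the FWER by testing each hypothesis at the full level α and stopping at the first non-rejection; the ordering and p-values are invented.

    ```python
    from typing import Sequence

    def fixed_sequence_test(pvals: Sequence[float], alpha: float = 0.05) -> list:
        """Test hypotheses in a prespecified order, each at the full level alpha.

        Stop at the first hypothesis that cannot be rejected; this strategy
        controls the familywise error rate at alpha in the strong sense.
        """
        rejections = []
        for p in pvals:
            if p <= alpha:
                rejections.append(True)
            else:
                # first failure: this and all later hypotheses are retained
                rejections.extend([False] * (len(pvals) - len(rejections)))
                break
        return rejections

    # hypothetical ordered p-values (primary endpoint first, then secondaries)
    print(fixed_sequence_test([0.003, 0.020, 0.180, 0.010]))
    # -> [True, True, False, False]
    ```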

  3. Adaptive graph-based multiple testing procedures

    PubMed Central

    Klinglmueller, Florian; Posch, Martin; Koenig, Franz

    2016-01-01

    Multiple testing procedures defined by directed, weighted graphs have recently been proposed as an intuitive visual tool for constructing multiple testing strategies that reflect the often complex contextual relations between hypotheses in clinical trials. Many well-known sequentially rejective tests, such as (parallel) gatekeeping tests or hierarchical testing procedures, are special cases of these graph-based tests. We generalize these graph-based multiple testing procedures to adaptive trial designs with an interim analysis. These designs permit mid-trial design modifications based on unblinded interim data as well as external information, while providing strong familywise error rate control. To maintain the familywise error rate, it is not required to prespecify the adaptation rule in detail. Because the adaptive test does not require knowledge of the multivariate distribution of test statistics, it is applicable in a wide range of scenarios, including trials with multiple treatment comparisons, endpoints, or subgroups, or combinations thereof. Examples of adaptations are dropping of treatment arms, selection of subpopulations, and sample size reassessment. If, in the interim analysis, it is decided to continue the trial as planned, the adaptive test reduces to the originally planned multiple testing procedure. Only if adaptations are actually implemented does an adjusted test need to be applied. The procedure is illustrated with a case study, and its operating characteristics are investigated by simulations. PMID:25319733
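
    The base (non-adaptive) graphical procedures generalized in this paper follow the sequentially rejective algorithm of Bretz and colleagues: each hypothesis carries a weight that defines its local level, and on rejection its level is propagated along weighted edges. The sketch below is a minimal implementation of that standard algorithm as I understand it, not of the adaptive extension; the three-hypothesis graph and p-values are made up, and this particular graph reduces to the Holm procedure.

    ```python
    import numpy as np

    def graphical_procedure(pvals, weights, G, alpha=0.05):
        """Sequentially rejective graphical multiple test (Bretz et al. style).

        pvals   : unadjusted p-values, one per hypothesis
        weights : initial weights, summing to at most 1 (local levels are w * alpha)
        G       : transition matrix; row j gives the fractions of H_j's level
                  passed on to the other hypotheses when H_j is rejected
        """
        p = np.asarray(pvals, float)
        w = np.asarray(weights, float).copy()
        G = np.asarray(G, float).copy()
        active = set(range(p.size))
        rejected = set()
        while True:
            # any active hypothesis rejectable at its local level?
            cand = [j for j in active if p[j] <= w[j] * alpha]
            if not cand:
                return rejected
            j = cand[0]
            active.discard(j)
            rejected.add(j)
            # propagate the freed level and update the transition weights
            for l in active:
                w[l] = w[l] + w[j] * G[j, l]
            G_new = G.copy()
            for l in active:
                for k in active:
                    if l == k:
                        continue
                    denom = 1.0 - G[l, j] * G[j, l]
                    G_new[l, k] = (G[l, k] + G[l, j] * G[j, k]) / denom if denom > 0 else 0.0
            G = G_new
            w[j] = 0.0

    # Bonferroni-Holm for three hypotheses expressed as a fully connected graph
    w0 = [1/3, 1/3, 1/3]
    G0 = [[0, 1/2, 1/2],
          [1/2, 0, 1/2],
          [1/2, 1/2, 0]]
    print(graphical_procedure([0.01, 0.04, 0.03], w0, G0, alpha=0.05))
    # -> {0}: only the first hypothesis is rejected, matching Holm here
    ```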

  4. Multiple hypotheses testing based on ordered p values--a historical survey with applications to medical research.

    PubMed

    Hommel, Gerhard; Bretz, Frank; Maurer, Willi

    2011-07-01

    Global tests and multiple test procedures are often based on ordered p values. Such procedures are available for arbitrary dependence structures as well as for specific dependence assumptions of the test statistics. Most of these procedures have been considered as global tests. Multiple test procedures can be obtained by applying the closure principle in order to control the familywise error rate, or by using the false discovery rate as a criterion for type I error rate control. We provide an overview and present examples showing the importance of these procedures in medical research. Finally, we discuss modifications when different weights for the hypotheses of interest are chosen.
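
    As a concrete illustration of multiple test procedures based on ordered p-values, the sketch below implements two classical step procedures, Holm's step-down and Hochberg's step-up tests (the example p-values are arbitrary). Holm controls the familywise error rate under arbitrary dependence; Hochberg requires additional assumptions on the joint distribution of the test statistics.

    ```python
    import numpy as np

    def holm(pvals, alpha=0.05):
        """Holm step-down: compare p_(i) with alpha/(m - i + 1), stop at the first failure."""
        p = np.asarray(pvals, float)
        m = p.size
        order = np.argsort(p)
        reject = np.zeros(m, dtype=bool)
        for step, idx in enumerate(order):
            if p[idx] <= alpha / (m - step):
                reject[idx] = True
            else:
                break
        return reject

    def hochberg(pvals, alpha=0.05):
        """Hochberg step-up: find the largest i with p_(i) <= alpha/(m - i + 1)."""
        p = np.asarray(pvals, float)
        m = p.size
        order = np.argsort(p)
        reject = np.zeros(m, dtype=bool)
        for step in range(m - 1, -1, -1):          # scan from the largest p-value
            if p[order[step]] <= alpha / (m - step):
                reject[order[: step + 1]] = True   # reject this and all smaller p-values
                break
        return reject

    p = [0.001, 0.012, 0.021, 0.30]
    print(holm(p), hochberg(p))   # both reject the three smallest p-values here
    ```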

  5. A two-step hierarchical hypothesis set testing framework, with applications to gene expression data on ordered categories

    PubMed Central

    2014-01-01

    Background: In complex large-scale experiments, in addition to simultaneously considering a large number of features, multiple hypotheses are often being tested for each feature. This leads to a problem of multi-dimensional multiple testing. For example, in gene expression studies over ordered categories (such as time-course or dose-response experiments), interest is often in testing differential expression across several categories for each gene. In this paper, we consider a framework for testing multiple sets of hypotheses, which can be applied to a wide range of problems. Results: We adopt the concept of the overall false discovery rate (OFDR) for controlling false discoveries at the hypothesis-set level. Based on an existing procedure for identifying differentially expressed gene sets, we discuss a general two-step hierarchical hypothesis set testing procedure, which controls the overall false discovery rate under independence across hypothesis sets. In addition, we discuss the concept of the mixed-directional false discovery rate (mdFDR), and extend the general procedure to enable directional decisions for two-sided alternatives. We apply the framework to the case of microarray time-course/dose-response experiments, and propose three procedures for testing differential expression and making multiple directional decisions for each gene. Simulation studies confirm the control of the OFDR and mdFDR by the proposed procedures under independence and positive correlations across genes. Simulation results also show that two of our new procedures achieve higher power than previous methods. Finally, the proposed methodology is applied to a microarray dose-response study, to identify 17β-estradiol-sensitive genes in breast cancer cells that are induced at low concentrations. Conclusions: The framework we discuss provides a platform for multiple testing procedures covering situations involving two (or potentially more) sources of multiplicity. The framework is easy to use and adaptable to various practical settings that frequently occur in large-scale experiments. Procedures generated from the framework are shown to maintain control of the OFDR and mdFDR, quantities that are especially relevant in the case of multiple hypothesis set testing. The procedures work well in both simulations and real datasets, and are shown to have better power than existing methods. PMID:24731138

  6. 40 CFR 1066.410 - Dynamometer test procedure.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... drive mode. (For purposes of this paragraph (g), the term four-wheel drive includes other multiple drive... Dynamometer test procedure. (a) Dynamometer testing may consist of multiple drive cycles with both cold-start...-setting part identifies the driving schedules and the associated sample intervals, soak periods, engine...

  7. Examining Measurement Invariance and Differential Item Functioning with Discrete Latent Construct Indicators: A Note on a Multiple Testing Procedure

    ERIC Educational Resources Information Center

    Raykov, Tenko; Dimitrov, Dimiter M.; Marcoulides, George A.; Li, Tatyana; Menold, Natalja

    2018-01-01

    A latent variable modeling method for studying measurement invariance when evaluating latent constructs with multiple binary or binary scored items with no guessing is outlined. The approach extends the continuous indicator procedure described by Raykov and colleagues, utilizes similarly the false discovery rate approach to multiple testing, and…

  8. POWER-ENHANCED MULTIPLE DECISION FUNCTIONS CONTROLLING FAMILY-WISE ERROR AND FALSE DISCOVERY RATES.

    PubMed

    Peña, Edsel A; Habiger, Joshua D; Wu, Wensong

    2011-02-01

    Improved procedures, in terms of smaller missed discovery rates (MDR), for performing multiple hypotheses testing with weak and strong control of the family-wise error rate (FWER) or the false discovery rate (FDR) are developed and studied. The improvement over existing procedures, such as the Šidák procedure for FWER control and the Benjamini-Hochberg (BH) procedure for FDR control, is achieved by exploiting possible differences in the powers of the individual tests. Results signal the need to take into account the powers of the individual tests and to have multiple hypotheses decision functions that are not limited to simply using the individual p-values, as is the case, for example, with the Šidák, Bonferroni, or BH procedures. They also enhance understanding of the role of the powers of individual tests, or more precisely the receiver operating characteristic (ROC) functions of decision processes, in the search for better multiple hypotheses testing procedures. A decision-theoretic framework is utilized, and through auxiliary randomizers the procedures can be used with discrete or mixed-type data or with rank-based nonparametric tests. This is in contrast to existing p-value based procedures whose theoretical validity is contingent on each of these p-value statistics being stochastically equal to or greater than a standard uniform variable under the null hypothesis. The proposed procedures are relevant in the analysis of high-dimensional "large M, small n" data sets arising in the natural, physical, medical, economic and social sciences, whose generation is accelerated by advances in high-throughput technology, notably, but not limited to, microarray technology.

  9. Unscaled Bayes factors for multiple hypothesis testing in microarray experiments.

    PubMed

    Bertolino, Francesco; Cabras, Stefano; Castellanos, Maria Eugenia; Racugno, Walter

    2015-12-01

    Multiple hypothesis testing collects a series of techniques usually based on p-values as a summary of the available evidence from many statistical tests. In hypothesis testing, under a Bayesian perspective, the evidence for a specified hypothesis against an alternative, conditionally on data, is given by the Bayes factor. In this study, we approach multiple hypothesis testing based on both Bayes factors and p-values, regarding multiple hypothesis testing as a multiple model selection problem. To obtain the Bayes factors we assume default priors that are typically improper. In this case, the Bayes factor is usually undetermined due to the ratio of prior pseudo-constants. We show that ignoring prior pseudo-constants leads to unscaled Bayes factors, which do not invalidate the inferential procedure in multiple hypothesis testing, because they are used within a comparative scheme. In fact, using partial information from the p-values, we are able to approximate the sampling null distribution of the unscaled Bayes factor and use it within Efron's multiple testing procedure. The simulation study suggests that, under a normal sampling model and even with small sample sizes, our approach provides false positive and false negative proportions that are lower than those of other common multiple hypothesis testing approaches based only on p-values. The proposed procedure is illustrated in two simulation studies, and the advantages of its use are shown in the analysis of two microarray experiments. © The Author(s) 2011.

  10. Statistical Power in Evaluations That Investigate Effects on Multiple Outcomes: A Guide for Researchers

    ERIC Educational Resources Information Center

    Porter, Kristin E.

    2018-01-01

    Researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple testing procedures (MTPs) are statistical…

  11. A SAS® macro implementation of a multiple comparison post hoc test for a Kruskal-Wallis analysis.

    PubMed

    Elliott, Alan C; Hynan, Linda S

    2011-04-01

    The Kruskal-Wallis (KW) nonparametric analysis of variance is often used instead of a standard one-way ANOVA when data are from a suspected non-normal population. The KW omnibus procedure tests for some differences between groups, but provides no specific post hoc pairwise comparisons. This paper provides a SAS® macro implementation of a multiple comparison test based on significant Kruskal-Wallis results from the SAS NPAR1WAY procedure. The implementation is designed for up to 20 groups at a user-specified alpha significance level. A Monte Carlo simulation compared this nonparametric procedure to commonly used parametric multiple comparison tests. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
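
    The SAS macro itself is not reproduced here. As a rough Python analogue (an assumed workflow, not the macro's exact method), the sketch below runs the omnibus Kruskal-Wallis test with SciPy and, only if it is significant, performs pairwise Mann-Whitney comparisons with a Holm adjustment; the three simulated groups are invented.

    ```python
    from itertools import combinations
    import numpy as np
    from scipy.stats import kruskal, mannwhitneyu
    from statsmodels.stats.multitest import multipletests

    rng = np.random.default_rng(0)
    groups = {
        "A": rng.normal(0.0, 1.0, 25),
        "B": rng.normal(0.8, 1.0, 25),
        "C": rng.normal(0.0, 1.0, 25),
    }

    stat, p_omnibus = kruskal(*groups.values())
    print(f"Kruskal-Wallis H = {stat:.2f}, p = {p_omnibus:.4f}")

    if p_omnibus < 0.05:
        pairs = list(combinations(groups, 2))
        raw = [mannwhitneyu(groups[a], groups[b], alternative="two-sided").pvalue
               for a, b in pairs]
        reject, adj, _, _ = multipletests(raw, alpha=0.05, method="holm")
        for (a, b), p_adj, r in zip(pairs, adj, reject):
            print(f"{a} vs {b}: adjusted p = {p_adj:.4f}, reject = {r}")
    ```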

  12. False Discovery Control in Large-Scale Spatial Multiple Testing

    PubMed Central

    Sun, Wenguang; Reich, Brian J.; Cai, T. Tony; Guindani, Michele; Schwartzman, Armin

    2014-01-01

    This article develops a unified theoretical and computational framework for false discovery control in multiple testing of spatial signals. We consider both point-wise and cluster-wise spatial analyses, and derive oracle procedures which optimally control the false discovery rate, false discovery exceedance and false cluster rate, respectively. A data-driven finite approximation strategy is developed to mimic the oracle procedures on a continuous spatial domain. Our multiple testing procedures are asymptotically valid and can be effectively implemented using Bayesian computational algorithms for analysis of large spatial data sets. Numerical results show that the proposed procedures lead to more accurate error control and better power performance than conventional methods. We demonstrate our methods by analyzing the time trends in tropospheric ozone in the eastern US. PMID:25642138

  13. Multiple Testing of Gene Sets from Gene Ontology: Possibilities and Pitfalls.

    PubMed

    Meijer, Rosa J; Goeman, Jelle J

    2016-09-01

    The use of multiple testing procedures in the context of gene-set testing is an important but relatively underexposed topic. If a multiple testing method is used, this is usually a standard familywise error rate (FWER) or false discovery rate (FDR) controlling procedure in which the logical relationships that exist between the different (self-contained) hypotheses are not taken into account. Taking those relationships into account, however, can lead to more powerful variants of existing multiple testing procedures and can make summarizing and interpreting the final results easier. We will show that, from the perspective of interpretation as well as from the perspective of power improvement, FWER controlling methods are more suitable than FDR controlling methods. As an example of a possible power improvement, we suggest a modified version of the popular method by Holm, which we also implemented in the R package cherry. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  14. A unified framework for weighted parametric multiple test procedures.

    PubMed

    Xi, Dong; Glimm, Ekkehard; Maurer, Willi; Bretz, Frank

    2017-09-01

    We describe a general framework for weighted parametric multiple test procedures based on the closure principle. We utilize general weighting strategies that can reflect complex study objectives and include many procedures in the literature as special cases. The proposed weighted parametric tests bridge the gap between rejection rules using either adjusted significance levels or adjusted p-values. This connection is made by allowing intersection hypotheses of the underlying closed test procedure to be tested at a level smaller than α. This may also be necessary to take certain study situations into account. For such cases we introduce a subclass of exact α-level parametric tests that satisfy the consonance property. When the correlation is known only for certain subsets of the test statistics, a new procedure is proposed to fully utilize this knowledge within each subset. We illustrate the proposed weighted parametric tests using a clinical trial example and conduct a simulation study to investigate their operating characteristics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
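
    The weighted parametric tests described here exploit the known correlation of the test statistics, which is beyond a short snippet. The hedged sketch below shows only the closure-principle skeleton such procedures build on, using a weighted Bonferroni test for every intersection hypothesis (equal weights by default, an assumption): an elementary hypothesis is rejected only if every intersection containing it is rejected.

    ```python
    from itertools import combinations

    def closed_weighted_bonferroni(pvals, alpha=0.05, weights=None):
        """Closure principle with a weighted Bonferroni test for each intersection.

        H_i is rejected iff every intersection hypothesis containing i is rejected,
        where an intersection J is rejected if p_j <= alpha * w_j / sum_{k in J} w_k
        for at least one j in J (weights renormalized over J).
        """
        m = len(pvals)
        w = list(weights) if weights is not None else [1.0 / m] * m
        rejected = []
        for i in range(m):
            ok = True
            for size in range(1, m + 1):
                for J in combinations(range(m), size):
                    if i not in J:
                        continue
                    tot = sum(w[j] for j in J)
                    # weighted Bonferroni test of the intersection hypothesis H_J
                    if not any(pvals[j] <= alpha * w[j] / tot for j in J):
                        ok = False
                        break
                if not ok:
                    break
            if ok:
                rejected.append(i)
        return rejected

    print(closed_weighted_bonferroni([0.010, 0.015, 0.060]))
    # -> [0, 1]  (with equal weights this reduces to the Holm procedure)
    ```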

  15. Nonparametric relevance-shifted multiple testing procedures for the analysis of high-dimensional multivariate data with small sample sizes.

    PubMed

    Frömke, Cornelia; Hothorn, Ludwig A; Kropf, Siegfried

    2008-01-27

    In many research areas it is necessary to find differences between treatment groups with several variables. For example, studies of microarray data seek to find a significant deviation of the difference in location parameters from zero (or of their ratio from one) for each variable. However, in some studies a significant deviation of the difference in locations from zero (or one in terms of the ratio) is biologically meaningless. A relevant difference or ratio is sought in such cases. This article addresses the use of relevance-shifted tests on ratios for a multivariate parallel two-sample group design. Two empirical procedures are proposed which embed the relevance-shifted test on ratios. As both procedures test a hypothesis for each variable, the resulting multiple testing problem has to be considered. Hence, the procedures include a multiplicity correction. Both procedures are extensions of available procedures for point null hypotheses achieving exact control of the familywise error rate. Whereas the shift of the null hypothesis alone would give straightforward solutions, the problems that motivate the empirical considerations discussed here arise from the fact that the shift is considered in both directions and the whole parameter space in between these two limits has to be accepted as the null hypothesis. The first algorithm to be discussed uses a permutation algorithm, and is appropriate for designs with a moderately large number of observations. However, many experiments have limited sample sizes. Then the second procedure might be more appropriate, where multiplicity is corrected according to a concept of data-driven order of hypotheses.

  16. 75 FR 16957 - Energy Conservation Program: Test Procedures for Battery Chargers and External Power Supplies

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-02

    ... Procedures C. Review of Single-Voltage External Power Supply Test Procedure D. Multiple-Voltage External...) Deletions of Existing Definitions (b) Revisions to Existing Definitions (c) Additions of New Definitions 4. Test Apparatus and General Instructions (a) Confidence Intervals (b) Temperature (c) AC Input Voltage...

  17. Memory and other properties of multiple test procedures generated by entangled graphs.

    PubMed

    Maurer, Willi; Bretz, Frank

    2013-05-10

    Methods for addressing multiplicity in clinical trials have attracted much attention during the past 20 years. They include the investigation of new classes of multiple test procedures, such as fixed sequence, fallback and gatekeeping procedures. More recently, sequentially rejective graphical test procedures have been introduced to construct and visualize complex multiple test strategies. These methods propagate the local significance level of a rejected null hypothesis to not-yet rejected hypotheses. In the graph defining the test procedure, hypotheses together with their local significance levels are represented by weighted vertices and the propagation rule by weighted directed edges. An algorithm provides the rules for updating the local significance levels and the transition weights after rejecting an individual hypothesis. These graphical procedures have no memory in the sense that the origin of the propagated significance level is ignored in subsequent iterations. However, in some clinical trial applications, memory is desirable to reflect the underlying dependence structure of the study objectives. In such cases, it would allow the further propagation of significance levels to be dependent on their origin and thus reflect the grouped parent-descendant structures of the hypotheses. We will give examples of such situations and show how to induce memory and other properties by convex combination of several individual graphs. The resulting entangled graphs provide an intuitive way to represent the underlying relative importance relationships between the hypotheses, are as easy to perform as the original individual graphs, remain sequentially rejective and control the familywise error rate in the strong sense. Copyright © 2012 John Wiley & Sons, Ltd.

  18. Balancing Flexible Constraints and Measurement Precision in Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Moyer, Eric L.; Galindo, Jennifer L.; Dodd, Barbara G.

    2012-01-01

    Managing test specifications--both multiple nonstatistical constraints and flexibly defined constraints--has become an important part of designing item selection procedures for computerized adaptive tests (CATs) in achievement testing. This study compared the effectiveness of three procedures: constrained CAT, flexible modified constrained CAT,…

  19. General solutions to multiple testing problems. Translation of "Sonnemann, E. (1982). Allgemeine Lösungen multipler Testprobleme. EDV in Medizin und Biologie 13(4), 120-128".

    PubMed

    Sonnemann, Eckart

    2008-10-01

    The introduction of sequentially rejective multiple test procedures (Einot and Gabriel, 1975; Naik, 1975; Holm, 1977; Holm, 1979) has caused considerable progress in the theory of multiple comparisons. Emphasizing the closure of multiple tests we give a survey of the general theory and its recent results in applications. Some new applications are given including a discussion of the connection with the theory of confidence regions.

  20. 40 CFR 51.357 - Test procedures and standards.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... invalid test condition, unsafe conditions, fast pass/fail algorithms, or, in the case of the on-board... using approved fast pass or fast fail algorithms and multiple pass/fail algorithms may be used during the test cycle to eliminate false failures. The transient test procedure, including algorithms and...

  1. 40 CFR 51.357 - Test procedures and standards.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... invalid test condition, unsafe conditions, fast pass/fail algorithms, or, in the case of the on-board... using approved fast pass or fast fail algorithms and multiple pass/fail algorithms may be used during the test cycle to eliminate false failures. The transient test procedure, including algorithms and...

  2. 40 CFR 51.357 - Test procedures and standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... invalid test condition, unsafe conditions, fast pass/fail algorithms, or, in the case of the on-board... using approved fast pass or fast fail algorithms and multiple pass/fail algorithms may be used during the test cycle to eliminate false failures. The transient test procedure, including algorithms and...

  3. 40 CFR 51.357 - Test procedures and standards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... invalid test condition, unsafe conditions, fast pass/fail algorithms, or, in the case of the on-board... using approved fast pass or fast fail algorithms and multiple pass/fail algorithms may be used during the test cycle to eliminate false failures. The transient test procedure, including algorithms and...

  4. 40 CFR Appendix B to Subpart S of... - Test Procedures

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... percent or the vehicle's engine stalls at any time during the test sequence. (4) Multiple exhaust pipes. Exhaust gas concentrations from vehicle engines equipped with multiple exhaust pipes shall be sampled... pipes. Exhaust gas concentrations from vehicle engines equipped with multiple exhaust pipes shall be...

  5. A Normalized Direct Approach for Estimating the Parameters of the Normal Ogive Three-Parameter Model for Ability Tests.

    ERIC Educational Resources Information Center

    Gugel, John F.

    A new method for estimating the parameters of the normal ogive three-parameter model for multiple-choice test items--the normalized direct (NDIR) procedure--is examined. The procedure is compared to a more commonly used estimation procedure, Lord's LOGIST, using computer simulations. The NDIR procedure uses the normalized (mid-percentile)…

  6. Multiple testing with discrete data: Proportion of true null hypotheses and two adaptive FDR procedures.

    PubMed

    Chen, Xiongzhi; Doerge, Rebecca W; Heyse, Joseph F

    2018-05-11

    We consider multiple testing with false discovery rate (FDR) control when p-values have discrete and heterogeneous null distributions. We propose a new estimator of the proportion of true null hypotheses and demonstrate that it is less upwardly biased than Storey's estimator and two other estimators. The new estimator induces two adaptive procedures, an adaptive Benjamini-Hochberg (BH) procedure and an adaptive Benjamini-Hochberg-Heyse (BHH) procedure. We prove that the adaptive BH (aBH) procedure is conservative nonasymptotically. Through simulation studies, we show that these procedures are usually more powerful than their nonadaptive counterparts, and that the adaptive BHH procedure is usually more powerful than the aBH procedure and a procedure based on randomized p-values. The adaptive procedures are applied to a study of HIV vaccine efficacy, where they identify more differentially polymorphic positions than the BH procedure at the same FDR level. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
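
    The paper's estimator is tailored to discrete, heterogeneous null distributions and is not reproduced here. The sketch below shows the generic adaptive BH template such an estimator plugs into: estimate the proportion of true nulls π0 (a simple Storey-type estimator at λ = 0.5 is used as a stand-in) and then run BH at level α/π0; the simulated p-values are invented.

    ```python
    import numpy as np
    from statsmodels.stats.multitest import multipletests

    def adaptive_bh(pvals, alpha=0.05, lam=0.5):
        """Adaptive BH: run BH at level alpha / pi0_hat.

        pi0 is estimated with a simple (slightly conservative) Storey-type
        estimator; the paper's discrete-data estimator would replace this step.
        """
        p = np.asarray(pvals, float)
        pi0 = min(1.0, (np.sum(p > lam) + 1) / (p.size * (1.0 - lam)))
        reject, _, _, _ = multipletests(p, alpha=alpha / pi0, method="fdr_bh")
        return reject, pi0

    rng = np.random.default_rng(7)
    p = np.concatenate([rng.uniform(size=800), rng.beta(0.3, 8.0, size=200)])
    rej_bh, _, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
    rej_abh, pi0 = adaptive_bh(p, alpha=0.05)
    print(f"pi0 estimate: {pi0:.2f};  BH: {rej_bh.sum()}  adaptive BH: {rej_abh.sum()}")
    ```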

  7. Testing for Factorial Invariance in the Context of Construct Validation

    ERIC Educational Resources Information Center

    Dimitrov, Dimiter M.

    2010-01-01

    This article describes the logic and procedures behind testing for factorial invariance across groups in the context of construct validation. The procedures include testing for configural, measurement, and structural invariance in the framework of multiple-group confirmatory factor analysis (CFA). The "forward" (sequential constraint imposition)…

  8. Multiple Testing, Cumulative Radiation Dose, and Clinical Indications in Patients Undergoing Myocardial Perfusion Imaging

    PubMed Central

    Einstein, Andrew J.; Weiner, Shepard D.; Bernheim, Adam; Kulon, Michal; Bokhari, Sabahat; Johnson, Lynne L.; Moses, Jeffrey W.; Balter, Stephen

    2013-01-01

    Context: Myocardial perfusion imaging (MPI) is the single medical test with the highest radiation burden to the US population. While many patients undergoing MPI receive repeat MPI testing or additional procedures involving ionizing radiation, no data are available characterizing their total longitudinal radiation burden and relating that burden to the reasons for testing. Objective: To characterize procedure counts, cumulative estimated effective doses of radiation, and clinical indications for patients undergoing MPI. Design, Setting, Patients: Retrospective cohort study evaluating, for 1097 consecutive patients undergoing index MPI during the first 100 days of 2006 at Columbia University Medical Center, all preceding medical imaging procedures involving ionizing radiation undergone beginning October 1988, and all subsequent procedures through June 2008, at that center. Main Outcome Measures: Cumulative estimated effective dose of radiation, number of procedures involving radiation, and indications for testing. Results: Patients underwent a median (interquartile range, mean) of 15 (6–32, 23.9) procedures involving radiation exposure; 4 (2–8, 6.5) were high-dose (≥3 mSv, i.e., one year's background radiation), including 1 (1–2, 1.8) MPI studies per patient. 31% of patients received a cumulative estimated effective dose from all medical sources of >100 mSv. Multiple MPIs were performed in 39% of patients, for whom the cumulative estimated effective dose was 121 (81–189, 149) mSv. Men and whites had higher cumulative estimated effective doses, and there was a trend towards men being more likely to undergo multiple MPIs than women (40.8% vs. 36.6%; odds ratio 1.29, 95% confidence interval 0.98–1.69). Over 80% of initial and 90% of repeat MPI exams were performed in patients with known cardiac disease or symptoms consistent with it. Conclusion: In this institution, multiple testing with MPI was very common and, in many patients, associated with very high cumulative estimated doses of radiation. PMID:21078807

  9. Multiple Testing with Modified Bonferroni Methods.

    ERIC Educational Resources Information Center

    Li, Jianmin; And Others

    This paper discusses the issue of multiple testing and overall Type I error rates in contexts other than multiple comparisons of means. It demonstrates, using a 5 x 5 correlation matrix, the application of 5 recently developed modified Bonferroni procedures developed by the following authors: (1) Y. Hochberg (1988); (2) B. S. Holland and M. D.…

  10. Testing Multiple Outcomes in Repeated Measures Designs

    ERIC Educational Resources Information Center

    Lix, Lisa M.; Sajobi, Tolulope

    2010-01-01

    This study investigates procedures for controlling the familywise error rate (FWR) when testing hypotheses about multiple, correlated outcome variables in repeated measures (RM) designs. A content analysis of RM research articles published in 4 psychology journals revealed that 3 quarters of studies tested hypotheses about 2 or more outcome…

  11. The Effects of Item by Item Feedback Given during an Ability Test.

    ERIC Educational Resources Information Center

    Whetton, C.; Childs, R.

    1981-01-01

    Answer-until-correct (AUC) is a procedure for providing feedback during a multiple-choice test, giving an increased range of scores. The performance of secondary students on a verbal ability test using AUC procedures was compared with a group using conventional instructions. AUC scores considerably enhanced reliability but not validity.…

  12. Proceedings of the Conference on the Design of Experiments in Army Research Development and Testing (26th) Held at New Mexico State University, Las Cruces, New Mexico on 22-24 October 1980.

    DTIC Science & Technology

    1981-06-01

    normality and several types of nonnormality. Overall the rank transformation procedure seems to be the best. The Fisher's LSD multiple comparisons procedure...the rank transformation procedure appears to maintain power better than Fisher's LSD or the randomization procedures. The conclusion of this study...best. The Fisher's LSD multiple comparisons procedure in the one-way and two-way layouts is compared with a randomization procedure and with the same

  13. Piloting a Polychotomous Partial-Credit Scoring Procedure in a Multiple-Choice Test

    ERIC Educational Resources Information Center

    Tsopanoglou, Antonios; Ypsilandis, George S.; Mouti, Anna

    2014-01-01

    Multiple-choice (MC) tests are frequently used to measure language competence because they are quick, economical and straightforward to score. While degrees of correctness have been investigated for partially correct responses in combined-response MC tests, degrees of incorrectness in distractors and the role they play in determining the…

  14. A simple test of association for contingency tables with multiple column responses.

    PubMed

    Decady, Y J; Thomas, D R

    2000-09-01

    Loughin and Scherer (1998, Biometrics 54, 630-637) investigated tests of association in two-way tables when one of the categorical variables allows for multiple-category responses from individual respondents. Standard chi-squared tests are invalid in this case, and they developed a bootstrap test procedure that provides good control of test levels under the null hypothesis. This procedure and some others that have been proposed are computationally involved and are based on techniques that are relatively unfamiliar to many practitioners. In this paper, the methods introduced by Rao and Scott (1981, Journal of the American Statistical Association 76, 221-230) for analyzing complex survey data are used to develop a simple test based on a corrected chi-squared statistic.

  15. 40 CFR 258.53 - Ground-water sampling and analysis requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... include consistent sampling and analysis procedures that are designed to ensure monitoring results that... testing period. If a multiple comparisons procedure is used, the Type I experiment wise error rate for...

  16. Estimating times of surgeries with two component procedures: comparison of the lognormal and normal models.

    PubMed

    Strum, David P; May, Jerrold H; Sampson, Allan R; Vargas, Luis G; Spangler, William E

    2003-01-01

    Variability inherent in the duration of surgical procedures complicates surgical scheduling. Modeling the duration and variability of surgeries might improve time estimates. Accurate time estimates are important operationally to improve utilization, reduce costs, and identify surgeries that might be considered outliers. Surgeries with multiple procedures are difficult to model because they are difficult to segment into homogeneous groups and because they are performed less frequently than single-procedure surgeries. The authors retrospectively studied 10,740 surgeries, each with exactly two CPT codes, and 46,322 surgical cases with only one CPT code, from a large teaching hospital, to determine whether the distribution of dual-procedure surgery times more closely fits a lognormal or a normal model. The authors tested model goodness of fit to their data using Shapiro-Wilk tests, studied factors affecting the variability of time estimates, and examined the impact of coding permutations (ordered combinations) on modeling. The Shapiro-Wilk tests indicated that the lognormal model is statistically superior to the normal model for modeling dual-procedure surgeries. Permutations of component codes did not appear to differ significantly with respect to total procedure time and surgical time. To improve individual models for infrequent dual-procedure surgeries, permutations may be reduced and estimates may be based on the longest component procedure and type of anesthesia. The authors recommend use of the lognormal model for estimating surgical times for surgeries with two component procedures. Their results help legitimize the use of log transforms to normalize surgical procedure times prior to hypothesis testing using linear statistical models. Multiple-procedure surgeries may be modeled using the longest (statistically most important) component procedure and type of anesthesia.
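
    As a toy illustration of comparing normal and lognormal models for procedure times, the sketch below applies the Shapiro-Wilk test to simulated dual-procedure durations on the raw scale and after a log transform. The durations are simulated, and the study's CPT-based segmentation and regression modeling are not reproduced.

    ```python
    import numpy as np
    from scipy.stats import shapiro

    rng = np.random.default_rng(42)
    # simulated surgery durations in minutes, roughly lognormal around 120 min
    durations = rng.lognormal(mean=np.log(120), sigma=0.45, size=200)

    w_norm, p_norm = shapiro(durations)          # normal model on the raw scale
    w_logn, p_logn = shapiro(np.log(durations))  # lognormal model via log transform

    print(f"normal fit:    W = {w_norm:.3f}, p = {p_norm:.4f}")
    print(f"lognormal fit: W = {w_logn:.3f}, p = {p_logn:.4f}")
    # a small p-value indicates departure from the corresponding model
    ```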

  17. Wrong Answers on Multiple-Choice Achievement Tests: Blind Guesses or Systematic Choices?.

    ERIC Educational Resources Information Center

    Powell, J. C.

    A multi-faceted model for the selection of answers for multiple-choice tests was developed from the findings of a series of exploratory studies. This model implies that answer selection should be curvilinear. A series of models were tested for fit using the chi square procedure. Data were collected from 359 elementary school students ages 9-12.…

  18. Examining the Missing Completely at Random Mechanism in Incomplete Data Sets: A Multiple Testing Approach

    ERIC Educational Resources Information Center

    Raykov, Tenko; Lichtenberg, Peter A.; Paulson, Daniel

    2012-01-01

    A multiple testing procedure for examining implications of the missing completely at random (MCAR) mechanism in incomplete data sets is discussed. The approach uses the false discovery rate concept and is concerned with testing group differences on a set of variables. The method can be used for ascertaining violations of MCAR and disproving this…

  19. A Standard Platform for Testing and Comparison of MDAO Architectures

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Moore, Kenneth T.; Hearn, Tristan A.; Naylor, Bret A.

    2012-01-01

    The Multidisciplinary Design Analysis and Optimization (MDAO) community has developed a multitude of algorithms and techniques, called architectures, for performing optimizations on complex engineering systems which involve coupling between multiple discipline analyses. These architectures seek to efficiently handle optimizations with computationally expensive analyses including multiple disciplines. We propose a new testing procedure that can provide a quantitative and qualitative means of comparison among architectures. The proposed test procedure is implemented within the open source framework, OpenMDAO, and comparative results are presented for five well-known architectures: MDF, IDF, CO, BLISS, and BLISS-2000. We also demonstrate how using open source software development methods can allow the MDAO community to submit new problems and architectures to keep the test suite relevant.

  20. Best (but oft-forgotten) practices: the multiple problems of multiplicity-whether and how to correct for many statistical tests.

    PubMed

    Streiner, David L

    2015-10-01

    Testing many null hypotheses in a single study results in an increased probability of detecting a significant finding just by chance (the problem of multiplicity). Debates have raged over many years with regard to whether to correct for multiplicity and, if so, how it should be done. This article first discusses how multiple tests lead to an inflation of the α level, then explores the following different contexts in which multiplicity arises: testing for baseline differences in various types of studies, having >1 outcome variable, conducting statistical tests that produce >1 P value, taking multiple "peeks" at the data, and unplanned, post hoc analyses (i.e., "data dredging," "fishing expeditions," or "P-hacking"). It then discusses some of the methods that have been proposed for correcting for multiplicity, including single-step procedures (e.g., Bonferroni); multistep procedures, such as those of Holm, Hochberg, and Šidák; false discovery rate control; and resampling approaches. Note that these various approaches describe different aspects and are not necessarily mutually exclusive. For example, resampling methods could be used to control the false discovery rate or the family-wise error rate (as defined later in this article). However, the use of one of these approaches presupposes that we should correct for multiplicity, which is not universally accepted, and the article presents the arguments for and against such "correction." The final section brings together these threads and presents suggestions with regard to when it makes sense to apply the corrections and how to do so. © 2015 American Society for Nutrition.
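
    As a quick illustration of the single-step, multistep, and false-discovery-rate approaches surveyed here, the sketch below applies several standard corrections to one invented set of p-values using statsmodels.

    ```python
    from statsmodels.stats.multitest import multipletests

    pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205]

    for method in ["bonferroni", "sidak", "holm", "simes-hochberg", "fdr_bh"]:
        reject, adjusted, _, _ = multipletests(pvals, alpha=0.05, method=method)
        print(f"{method:14s} rejections: {int(reject.sum())}")
    ```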

  1. The optimal power puzzle: scrutiny of the monotone likelihood ratio assumption in multiple testing.

    PubMed

    Cao, Hongyuan; Sun, Wenguang; Kosorok, Michael R

    2013-01-01

    In single hypothesis testing, power is a non-decreasing function of type I error rate; hence it is desirable to test at the nominal level exactly to achieve optimal power. The puzzle lies in the fact that for multiple testing, under the false discovery rate paradigm, such a monotonic relationship may not hold. In particular, exact false discovery rate control may lead to a less powerful testing procedure if a test statistic fails to fulfil the monotone likelihood ratio condition. In this article, we identify different scenarios wherein the condition fails and give caveats for conducting multiple testing in practical settings.

  2. Measurement Invariance for Latent Constructs in Multiple Populations: A Critical View and Refocus

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.; Li, Cheng-Hsien

    2012-01-01

    Popular measurement invariance testing procedures for latent constructs evaluated by multiple indicators in distinct populations are revisited and discussed. A frequently used test of factor loading invariance is shown to possess serious limitations that in general preclude it from accomplishing its goal of ascertaining this invariance. A process…

  3. A Tutorial on Multiple Testing: False Discovery Control

    NASA Astrophysics Data System (ADS)

    Chatelain, F.

    2016-09-01

    This paper presents an overview of criteria and methods in multiple testing, with an emphasis on the false discovery rate control. The popular Benjamini and Hochberg procedure is described. The rationale for this approach is explained through a simple Bayesian interpretation. Some state-of-the-art variations and extensions are also presented.
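
    For reference, the Benjamini-Hochberg step-up rule described in the tutorial can be written compactly as follows (standard formulation; the notation is chosen here, not quoted from the paper).

    ```latex
    % Benjamini-Hochberg step-up procedure at target FDR level q,
    % with ordered p-values p_{(1)} \le \dots \le p_{(m)}
    k = \max\Bigl\{ i : p_{(i)} \le \frac{i}{m}\, q \Bigr\}, \qquad
    \text{reject } H_{(1)},\dots,H_{(k)} \ (\text{no rejections if the set is empty}).
    ```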

  4. Improvement of structural models using covariance analysis and nonlinear generalized least squares

    NASA Technical Reports Server (NTRS)

    Glaser, R. J.; Kuo, C. P.; Wada, B. K.

    1992-01-01

    The next generation of large, flexible space structures will be too light to support their own weight, requiring a system of structural supports for ground testing. The authors have proposed multiple boundary-condition testing (MBCT), using more than one support condition to reduce uncertainties associated with the supports. MBCT would revise the mass and stiffness matrix, analytically qualifying the structure for operation in space. The same procedure is applicable to other common test conditions, such as empty/loaded tanks and subsystem/system level tests. This paper examines three techniques for constructing the covariance matrix required by nonlinear generalized least squares (NGLS) to update structural models based on modal test data. The methods range from a complicated approach used to generate the simulation data (i.e., the correct answer) to a diagonal matrix based on only two constants. The results show that NGLS is very insensitive to assumptions about the covariance matrix, suggesting that a workable NGLS procedure is possible. The examples also indicate that the multiple boundary condition procedure more accurately reduces errors than individual boundary condition tests alone.

  5. An extended sequential goodness-of-fit multiple testing method for discrete data.

    PubMed

    Castro-Conde, Irene; Döhler, Sebastian; de Uña-Álvarez, Jacobo

    2017-10-01

    The sequential goodness-of-fit (SGoF) multiple testing method has recently been proposed as an alternative to the familywise error rate- and the false discovery rate-controlling procedures in high-dimensional problems. For discrete data, the SGoF method may be very conservative. In this paper, we introduce an alternative SGoF-type procedure that takes into account the discreteness of the test statistics. Like the original SGoF, our new method provides weak control of the false discovery rate/familywise error rate but attains false discovery rate levels closer to the desired nominal level, and thus it is more powerful. We study the performance of this method in a simulation study and illustrate its application to a real pharmacovigilance data set.

  6. 34 CFR 668.145 - Test approval procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...

  7. 34 CFR 668.145 - Test approval procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...

  8. 34 CFR 668.145 - Test approval procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...

  9. 34 CFR 668.145 - Test approval procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...

  10. Item Selection for the Development of Parallel Forms from an IRT-Based Seed Test Using a Sampling and Classification Approach

    ERIC Educational Resources Information Center

    Chen, Pei-Hua; Chang, Hua-Hua; Wu, Haiyan

    2012-01-01

    Two sampling-and-classification-based procedures were developed for automated test assembly: the Cell Only and the Cell and Cube methods. A simulation study based on a 540-item bank was conducted to compare the performance of the procedures with the performance of a mixed-integer programming (MIP) method for assembling multiple parallel test…

  11. Identification of differentially expressed genes and false discovery rate in microarray studies.

    PubMed

    Gusnanto, Arief; Calza, Stefano; Pawitan, Yudi

    2007-04-01

    To highlight the development in microarray data analysis for the identification of differentially expressed genes, particularly via control of false discovery rate. The emergence of high-throughput technology such as microarrays raises two fundamental statistical issues: multiplicity and sensitivity. We focus on the biological problem of identifying differentially expressed genes. First, multiplicity arises due to testing tens of thousands of hypotheses, rendering the standard P value meaningless. Second, known optimal single-test procedures such as the t-test perform poorly in the context of highly multiple tests. The standard approach of dealing with multiplicity is too conservative in the microarray context. The false discovery rate concept is fast becoming the key statistical assessment tool replacing the P value. We review the false discovery rate approach and argue that it is more sensible for microarray data. We also discuss some methods to take into account additional information from the microarrays to improve the false discovery rate. There is growing consensus on how to analyse microarray data using the false discovery rate framework in place of the classical P value. Further research is needed on the preprocessing of the raw data, such as the normalization step and filtering, and on finding the most sensitive test procedure.

  12. Kernel Machine SNP-set Testing under Multiple Candidate Kernels

    PubMed Central

    Wu, Michael C.; Maity, Arnab; Lee, Seunggeun; Simmons, Elizabeth M.; Harmon, Quaker E.; Lin, Xinyi; Engel, Stephanie M.; Molldrem, Jeffrey J.; Armistead, Paul M.

    2013-01-01

    Joint testing for the cumulative effect of multiple single nucleotide polymorphisms grouped on the basis of prior biological knowledge has become a popular and powerful strategy for the analysis of large-scale genetic association studies. The kernel machine (KM) testing framework is a useful approach that has been proposed for testing associations between multiple genetic variants and many different types of complex traits by comparing pairwise similarity in phenotype between subjects to pairwise similarity in genotype, with similarity in genotype defined via a kernel function. An advantage of the KM framework is its flexibility: choosing different kernel functions allows for different assumptions concerning the underlying model and can allow for improved power. In practice, it is difficult to know which kernel to use a priori, since this depends on the unknown underlying trait architecture, and selecting the kernel which gives the lowest p-value can lead to inflated type I error. Therefore, we propose practical strategies for KM testing when multiple candidate kernels are present, based on constructing composite kernels and on efficient perturbation procedures. We demonstrate through simulations and real data applications that the procedures protect the type I error rate and can lead to substantially improved power over poor choices of kernels, with only modest differences in power versus using the best candidate kernel. PMID:23471868
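
    A full kernel machine association test involves a variance-component score statistic and its null distribution, which is beyond a short snippet. The hedged sketch below illustrates only the composite-kernel idea the abstract describes, averaging trace-normalized candidate kernel matrices computed from a genotype matrix; the linear and Gaussian candidates, their equal weights, and the simulated genotypes are assumptions for illustration.

    ```python
    import numpy as np

    def linear_kernel(G):
        return G @ G.T

    def rbf_kernel(G, gamma=None):
        sq = np.sum(G**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2 * G @ G.T
        gamma = gamma if gamma is not None else 1.0 / G.shape[1]
        return np.exp(-gamma * np.maximum(d2, 0.0))

    def composite_kernel(kernels, weights=None):
        """Weighted average of trace-normalized candidate kernel matrices."""
        weights = weights if weights is not None else [1.0 / len(kernels)] * len(kernels)
        n = kernels[0].shape[0]
        out = np.zeros((n, n))
        for K, w in zip(kernels, weights):
            out += w * K * n / np.trace(K)   # normalize so each kernel has trace n
        return out

    rng = np.random.default_rng(3)
    G = rng.binomial(2, 0.3, size=(50, 20)).astype(float)   # 50 subjects, 20 SNPs
    K = composite_kernel([linear_kernel(G), rbf_kernel(G)])
    print(K.shape, round(np.trace(K), 2))   # (50, 50), trace ~ 50
    ```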

  13. 49 CFR 40.162 - What must MROs do with multiple verified results for the same testing event?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers... verified the specimen as being positive for marijuana and cocaine and as being a refusal to test because...

  14. 49 CFR 40.162 - What must MROs do with multiple verified results for the same testing event?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers... verified the specimen as being positive for marijuana and cocaine and as being a refusal to test because...

  15. 49 CFR 40.162 - What must MROs do with multiple verified results for the same testing event?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers... verified the specimen as being positive for marijuana and cocaine and as being a refusal to test because...

  16. 49 CFR 40.162 - What must MROs do with multiple verified results for the same testing event?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers... verified the specimen as being positive for marijuana and cocaine and as being a refusal to test because...

  17. 49 CFR 40.162 - What must MROs do with multiple verified results for the same testing event?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers... verified the specimen as being positive for marijuana and cocaine and as being a refusal to test because...

  18. Multiple IMU system test plan, volume 4. [subroutines for space shuttle requirements

    NASA Technical Reports Server (NTRS)

    Landey, M.; Vincent, K. T., Jr.; Whittredge, R. S.

    1974-01-01

    Operating procedures for this redundant system are described. A test plan is developed with two objectives. First, performance of the hardware and software delivered is demonstrated. Second, applicability of multiple IMU systems to the space shuttle mission is shown through detailed experiments with FDI algorithms and other multiple IMU software: gyrocompassing, calibration, and navigation. Gimbal flip is examined in light of its possible detrimental effects on FDI and navigation. For Vol. 3, see N74-10296.

  19. The Predictive Validity of Using Admissions Testing and Multiple Mini-Interviews in Undergraduate University Admissions

    ERIC Educational Resources Information Center

    Makransky, Guido; Havmose, Philip; Vang, Maria Louison; Andersen, Tonny Elmose; Nielsen, Tine

    2017-01-01

    The aim of this study was to evaluate the predictive validity of a two-step admissions procedure that included a cognitive ability test followed by multiple mini-interviews (MMIs) used to assess non-cognitive skills, compared to grade-based admissions relative to subsequent drop-out rates and academic achievement after one and two years of study.…

  20. Specific arithmetic calculation deficits in children with Turner syndrome.

    PubMed

    Rovet, J; Szekely, C; Hockenberry, M N

    1994-12-01

    Study 1 compared arithmetic processing skills on the WRAT-R in 45 girls with Turner syndrome (TS) and 92 age-matched female controls. Results revealed significant underachievement by subjects with TS, which reflected their poorer performance on problems requiring the retrieval of addition and multiplication facts and procedural knowledge for addition and division operations. TS subjects did not differ qualitatively from controls in type of procedural error committed. Study 2, which compared the performance of 10 subjects with TS and 31 controls on the Keymath Diagnostic Arithmetic Test, showed that the TS group had less adequate knowledge of arithmetic, subtraction, and multiplication procedures but did not differ from controls on Fact items. Error analyses revealed that TS subjects were more likely to confuse component steps or fail to separate intermediate steps or to complete problems. TS subjects relied to a greater degree on verbal than visual-spatial abilities in arithmetic processing while their visual-spatial abilities were associated with retrieval of simple multidigit addition facts and knowledge of subtraction, multiplication, and division procedures. Differences between the TS and control groups increased with age for Keymath, but not WRAT-R, procedures. Discrepant findings are related to the different task constraints (timed vs. untimed, single vs. alternate versions, size of item pool) and the use of different strategies (counting vs. fact retrieval). It is concluded that arithmetic difficulties in females with TS are due to less adequate procedural skills, combined with poorer fact retrieval in timed testing situations, rather than to inadequate visual-spatial abilities.

  1. A general equation to obtain multiple cut-off scores on a test from multinomial logistic regression.

    PubMed

    Bersabé, Rosa; Rivas, Teresa

    2010-05-01

    The authors derive a general equation to compute multiple cut-offs on a total test score in order to classify individuals into more than two ordinal categories. The equation is derived from the multinomial logistic regression (MLR) model, which is an extension of the binary logistic regression (BLR) model to accommodate polytomous outcome variables. From this analytical procedure, cut-off scores are established at the test score (the predictor variable) at which an individual is as likely to be in category j as in category j+1 of an ordinal outcome variable. The application of the complete procedure is illustrated by an example with data from an actual study on eating disorders. In this example, two cut-off scores on the Eating Attitudes Test (EAT-26) are obtained in order to classify individuals into three ordinal categories: asymptomatic, symptomatic and eating disorder. Diagnoses were made from the responses to a self-report (Q-EDD) that operationalises DSM-IV criteria for eating disorders. Alternatives to the MLR model for setting multiple cut-off scores are discussed.
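
    In the simplest case of a single predictor (the total test score x), the equal-likelihood condition between adjacent categories under a multinomial logit model yields a closed-form cut-off. The display below is that simplified version, written in notation chosen here, not the paper's general equation.

    ```latex
    % MLR with predictor x: P(Y = j \mid x) \propto \exp(\beta_{0j} + \beta_{1j} x)
    % cut-off x_c between adjacent categories j and j+1: P(Y=j \mid x_c) = P(Y=j+1 \mid x_c)
    \beta_{0j} + \beta_{1j} x_c = \beta_{0,j+1} + \beta_{1,j+1} x_c
    \;\Longrightarrow\;
    x_c = \frac{\beta_{0j} - \beta_{0,j+1}}{\beta_{1,j+1} - \beta_{1j}},
    \qquad \beta_{1,j+1} \neq \beta_{1j}.
    ```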

  2. Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.

    PubMed

    Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory

    2017-01-01

    Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes it difficult to use them to examine the performance of given testing procedures or the associations between investigated factors. We turn our focus to the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g., partial EPVs; (2) developing optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures, their constructions, and their properties with an eye towards practical applications.
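
    As a toy illustration of the EPV idea only (not the ROC-based framework developed in the paper), the sketch below estimates the expected p-value of a one-sided one-sample t-test by Monte Carlo under an assumed effect size; in this sense, the test with the smaller EPV is the better-performing one.

```python
# Hedged sketch: Monte Carlo estimate of the expected p-value (EPV) of a
# one-sided one-sample t-test under a specific alternative. Sample size and
# effect size are made up for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, effect, reps = 25, 0.5, 20_000

pvals = np.empty(reps)
for r in range(reps):
    x = rng.normal(loc=effect, scale=1.0, size=n)        # data under H1
    t, p_two = stats.ttest_1samp(x, popmean=0.0)
    pvals[r] = p_two / 2 if t > 0 else 1 - p_two / 2      # one-sided p-value

print("estimated EPV:", pvals.mean())   # smaller EPV = better power profile
```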

  3. A mixture gatekeeping procedure based on the Hommel test for clinical trial applications.

    PubMed

    Brechenmacher, Thomas; Xu, Jane; Dmitrienko, Alex; Tamhane, Ajit C

    2011-07-01

    When conducting clinical trials with hierarchically ordered objectives, it is essential to use multiplicity adjustment methods that control the familywise error rate in the strong sense while taking into account the logical relations among the null hypotheses. This paper proposes a gatekeeping procedure based on the Hommel (1988) test, which offers power advantages compared to other p value-based tests proposed in the literature. A general description of the procedure is given and details are presented on how it can be applied to complex clinical trial designs. Two clinical trial examples are given to illustrate the methodology developed in the paper.
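
    For reference, a minimal sketch of the plain Hommel (1988) adjustment on hypothetical p-values via statsmodels; the mixture gatekeeping layer that the paper builds around the Hommel test is not reproduced here.

```python
# Hedged sketch: Hommel (1988) multiplicity adjustment on toy p-values.
from statsmodels.stats.multitest import multipletests

pvals = [0.011, 0.019, 0.045, 0.21]          # hypothetical raw p-values
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method='hommel')
print(list(zip(pvals, p_adj.round(3), reject)))
```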

  4. Multiple Hollow Cathode Wear Testing for the Space Station Plasma Contactor

    NASA Technical Reports Server (NTRS)

    Soulas, George C.

    1994-01-01

    A wear test of four hollow cathodes was conducted to resolve issues associated with the Space Station plasma contactor. The objectives of this test were to evaluate unit-to-unit dispersions, verify the transportability of contamination control protocols developed by the project, and to evaluate cathode contamination control and activation procedures to enable simplification of the gas feed system and heater power processor. These objectives were achieved by wear testing four cathodes concurrently to 2000 hours. Test results showed maximum unit-to-unit deviations for discharge voltages and cathode tip temperatures to be +/-3 percent and +/-2 percent, respectively, of the nominal values. Cathodes utilizing contamination control procedures known to increase cathode lifetime showed no trends in their monitored parameters that would indicate a possible failure, demonstrating that contamination control procedures had been successfully transferred. Comparisons of cathodes utilizing and not utilizing a purifier or simplified activation procedure showed similar behavior during wear testing and pre- and post-test performance characterizations. This behavior indicates that use of simplified cathode systems and procedures is consistent with long cathode lifetimes.

  5. Testing and analysis of flat and curved panels with multiple cracks

    NASA Technical Reports Server (NTRS)

    Broek, David; Jeong, David Y.; Thomson, Douglas

    1994-01-01

    An experimental and analytical investigation of multiple cracking in various types of test specimens is described in this paper. The testing phase comprised a flat unstiffened panel series and curved stiffened and unstiffened panel series. The test specimens contained various configurations for initial damage. Static loading was applied to these specimens until ultimate failure, while loads and crack propagation were recorded. These data provide the basis for developing and validating methodologies for predicting linkup of multiple cracks, progression to failure, and overall residual strength. The results from twelve flat coupon and ten full scale curved panel tests are presented. In addition, an engineering analysis procedure was developed to predict multiple crack linkup. Reasonable agreement was found between predictions and actual test results for linkup and residual strength for both flat and curved panels. The results indicate that an engineering analysis approach has the potential to quantitatively assess the effect of multiple cracks on the arrest capability of an aircraft fuselage structure.

  6. [In situ suture repair procedure of knee dislocation with multiple-ligament injury at acute stage].

    PubMed

    Ye, Jingbing; Luo, Dahui; Fu, Weili; He, Xin; Li, Jian

    2009-09-01

    To investigate the technique and short-term clinical effectiveness of in situ suture repair of knee dislocation with multiple-ligament injury at the acute stage. From February 2006 to November 2007, 9 patients with closed dislocation of a single knee and multiple-ligament injury underwent open in situ suture repair with non-absorbable suture, with simultaneous management of other combined injuries. The 9 patients included 6 males and 3 females, aged 34-52 years. The injured knee was the left side in 4 cases and the right side in 5 cases. Injuries were caused by traffic accidents in 8 cases and heavy-weight crushing in 1 case. MRI and arthroscopic examination showed avulsion injuries of the anterior cruciate ligament and posterior cruciate ligament in all patients. The time from injury to operation was 4 to 7 days (mean, 5.1 days). No bacterial arthritis occurred after operation. Subcutaneous fat liquefaction occurred in 2 cases and resolved after symptomatic treatment; the other incisions healed by first intention. All patients were followed up for 12 months. At 12 months postoperatively, knee flexion was reduced by 10 degrees compared with the uninjured knee in 2 patients, and the range of motion was 0 to 125 degrees. Lysholm knee scores were 83-92 (mean, 86.3); results were excellent in 3 cases and good in 6 cases. The posterior drawer test and anterior drawer test were each grade I positive in 3 cases; the Lachman test was grade I positive in 5 cases; lateral stress tests were negative in all cases. In situ suture repair of knee dislocation with multiple-ligament injury at the acute stage offers reliable fixation, allows simultaneous management of other combined injuries, and gives satisfactory short-term results.

  7. Resampling-Based Empirical Bayes Multiple Testing Procedures for Controlling Generalized Tail Probability and Expected Value Error Rates: Focus on the False Discovery Rate and Simulation Study

    PubMed Central

    Dudoit, Sandrine; Gilbert, Houston N.; van der Laan, Mark J.

    2014-01-01

    Summary This article proposes resampling-based empirical Bayes multiple testing procedures for controlling a broad class of Type I error rates, defined as generalized tail probability (gTP) error rates, gTP(q, g) = Pr(g(Vn, Sn) > q), and generalized expected value (gEV) error rates, gEV(g) = E[g(Vn, Sn)], for arbitrary functions g(Vn, Sn) of the numbers of false positives Vn and true positives Sn. Of particular interest are error rates based on the proportion g(Vn, Sn) = Vn/(Vn + Sn) of Type I errors among the rejected hypotheses, such as the false discovery rate (FDR), FDR = E[Vn/(Vn + Sn)]. The proposed procedures offer several advantages over existing methods. They provide Type I error control for general data generating distributions, with arbitrary dependence structures among variables. Gains in power are achieved by deriving rejection regions based on guessed sets of true null hypotheses and null test statistics randomly sampled from joint distributions that account for the dependence structure of the data. The Type I error and power properties of an FDR-controlling version of the resampling-based empirical Bayes approach are investigated and compared to those of widely-used FDR-controlling linear step-up procedures in a simulation study. The Type I error and power trade-off achieved by the empirical Bayes procedures under a variety of testing scenarios allows this approach to be competitive with or outperform the Storey and Tibshirani (2003) linear step-up procedure, as an alternative to the classical Benjamini and Hochberg (1995) procedure. PMID:18932138
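
    The sketch below illustrates, on simulated counts, how a gTP and a gEV error rate are computed for the particular choice g(Vn, Sn) = Vn/(Vn + Sn); the gEV of this g is the FDR. The counts are invented for illustration, and none of the resampling-based empirical Bayes machinery of the paper is involved.

```python
# Hedged sketch: gTP and gEV error rates from simulated counts of false
# positives V and true positives S, for g(V, S) = V / (V + S) (the FDP).
import numpy as np

rng = np.random.default_rng(1)
V = rng.binomial(n=10, p=0.1, size=5_000)   # hypothetical false positives per run
S = rng.binomial(n=40, p=0.6, size=5_000)   # hypothetical true positives per run

R = V + S
g = np.where(R > 0, V / np.maximum(R, 1), 0.0)   # FDP, taken as 0 when R = 0

print("gEV (FDR):        ", g.mean())            # expected value of g
print("gTP(q=0.1, g=FDP):", (g > 0.1).mean())    # Pr(g > q) for q = 0.1
```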

  8. Evaluating the Psychometric Characteristics of Generated Multiple-Choice Test Items

    ERIC Educational Resources Information Center

    Gierl, Mark J.; Lai, Hollis; Pugh, Debra; Touchie, Claire; Boulais, André-Philippe; De Champlain, André

    2016-01-01

    Item development is a time- and resource-intensive process. Automatic item generation integrates cognitive modeling with computer technology to systematically generate test items. To date, however, items generated using cognitive modeling procedures have received limited use in operational testing situations. As a result, the psychometric…

  9. Item Analysis in Introductory Economics Testing.

    ERIC Educational Resources Information Center

    Tinari, Frank D.

    1979-01-01

    Computerized analysis of multiple choice test items is explained. Examples of item analysis applications in the introductory economics course are discussed with respect to three objectives: to evaluate learning; to improve test items; and to help improve classroom instruction. Problems, costs and benefits of the procedures are identified. (JMD)

  10. Testing Different Model Building Procedures Using Multiple Regression.

    ERIC Educational Resources Information Center

    Thayer, Jerome D.

    The stepwise regression method of selecting predictors for computer assisted multiple regression analysis was compared with forward, backward, and best subsets regression, using 16 data sets. The results indicated the stepwise method was preferred because of its practical nature, when the models chosen by different selection methods were similar…

  11. A comparison of the effects of two methods of acclimation of aerobic biodegradability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, H.M.

    1993-11-01

    The acclimation or adaptation of microorganisms to organic chemicals is an important factor influencing both the rate and the extent of biodegradation. In this study two acclimation procedures were evaluated in terms of their effectiveness in enhancing biodegradation, their relative ease of use in the laboratory, and the implications for biodegradability testing. In the single-flask procedure, microorganisms were acclimated for 2 to 7 d in a single acclimation flask at constant or increasing concentrations of the test chemical without transfer of microorganisms. In the second procedure, the enrichment procedure, microorganisms were acclimated in a series of flasks over a 21-d period by making adaptive transfers to increasing concentrations of the test chemical. Acclimated microorganisms from each procedure were used as the source of inoculum for subsequent biodegradation tests in which carbon dioxide evolution was measured. Six chemicals were tested: quinoline, p-nitrophenol, N-methylaniline, N,N-dimethylaniline, acrylonitrile, and 2,2,4-trimethyl-1,3-pentanediol monoisobutyrate. Microorganisms acclimated in the single-flask procedure were much more effective than those acclimated in the enrichment procedure in degrading the test chemicals. The single-flask procedure is more convenient to use, and it permits monitoring of the time needed for acclimation. The results from these studies have implications for the methodology used in biodegradation test systems and suggest caution before adopting a multiple-flask, enrichment acclimation procedure before the performance of standardized tests for aerobic biodegradability.

  12. Identifying reprioritization response shift in a stroke caregiver population: a comparison of missing data methods.

    PubMed

    Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E

    2015-03-01

    Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect RS effects. Since missing data are often encountered in longitudinal HRQOL data, effective strategies to deal with missing data are important to consider. This study aims to compare different imputation methods for the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical powers of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9% and 31.3% of them had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted. However, statistically significant changes were detected in the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. Our simulations revealed that relative importance test procedures were least powerful under the complete-case analysis method and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.

  13. 40 CFR 1033.520 - Alternative ramped modal cycles.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) Following the completion of the third test phase of the applicable ramped modal cycle, conduct the post... POLLUTION CONTROLS CONTROL OF EMISSIONS FROM LOCOMOTIVES Test Procedures § 1033.520 Alternative ramped modal... locomotive notch settings. Ramped modal cycles combine multiple test modes of a discrete-mode steady-state...

  14. Note on simultaneous inferences about non-inferiority and superiority for a primary and a secondary endpoint.

    PubMed

    Guilbaud, Olivier

    2011-11-01

    In their review of challenges to multiple testing in clinical trials, Hung and Wang (2010) considered the situation where a treatment is to be compared with an active comparator and the aim is to show non-inferiority and (if possible) superiority with respect to a primary and a secondary endpoint. This note extends their discussion of this particular situation, taking the sequentially rejective procedure they used for illustration as a starting point. Some alternative multiple testing procedures (MTPs) are considered, and corresponding simultaneous confidence regions are discussed that provide additional information "for free". The choice may then be based on the properties of these MTPs and corresponding confidence regions. 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Testing jumps via false discovery rate control.

    PubMed

    Yen, Yu-Min

    2013-01-01

    Many recently developed nonparametric jump tests can be viewed as multiple hypothesis testing problems. For such multiple hypothesis tests, it is well known that controlling type I error often makes a large proportion of erroneous rejections, and such situation becomes even worse when the jump occurrence is a rare event. To obtain more reliable results, we aim to control the false discovery rate (FDR), an efficient compound error measure for erroneous rejections in multiple testing problems. We perform the test via the Barndorff-Nielsen and Shephard (BNS) test statistic, and control the FDR with the Benjamini and Hochberg (BH) procedure. We provide asymptotic results for the FDR control. From simulations, we examine relevant theoretical results and demonstrate the advantages of controlling the FDR. The hybrid approach is then applied to empirical analysis on two benchmark stock indices with high frequency data.
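
    A minimal sketch of the second stage of the hybrid approach, applying the Benjamini-Hochberg step-up procedure to a vector of per-day p-values; the p-values here are simulated placeholders rather than BNS statistics computed from high-frequency data.

```python
# Hedged sketch: BH FDR control over a vector of per-day jump-test p-values.
import numpy as np
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(2)
# 250 "trading days": most with no jump (uniform p-values), a few with jumps
p_null = rng.uniform(size=240)
p_jump = rng.uniform(high=0.002, size=10)
pvals = np.concatenate([p_null, p_jump])

reject, p_adj, _, _ = multipletests(pvals, alpha=0.10, method='fdr_bh')
print("days flagged as jump days:", int(reject.sum()))
```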

  16. A hybrid Dantzig-Wolfe, Benders decomposition and column generation procedure for multiple diet production planning under uncertainties

    NASA Astrophysics Data System (ADS)

    Udomsungworagul, A.; Charnsethikul, P.

    2018-03-01

    This article introduces a methodology for solving large-scale two-phase linear programming, applied to a multiple-time-period animal diet problem under uncertainty in both the nutrient content of raw materials and finished-product demand. The formulation allows multiple product formulas to be manufactured in the same time period and allows raw-material and finished-product inventory to be carried between periods. Dantzig-Wolfe decomposition, Benders decomposition, and column generation techniques are combined and applied to solve the problem. The proposed procedure was programmed using VBA and the Solver tool in Microsoft Excel. A case study was used to test the procedure in terms of efficiency and effectiveness trade-offs.
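
    For orientation only, a deterministic single-period toy of the underlying diet LP (minimize ingredient cost subject to nutrient requirements), solved with scipy; the two-phase stochastic formulation and the Dantzig-Wolfe/Benders/column-generation machinery of the article are not reproduced, and all ingredient data are invented.

```python
# Hedged sketch: a toy single-period diet LP with made-up ingredient data.
from scipy.optimize import linprog

cost = [0.30, 0.45, 0.80]            # cost per kg of three hypothetical ingredients
# nutrient content per kg (rows: protein, energy), written as >= constraints
A_ub = [[-0.10, -0.20, -0.40],       # protein fraction
        [-2.0,  -3.0,  -1.5 ]]       # energy units
b_ub = [-0.18, -2.4]                 # at least 18% protein and 2.4 energy units per kg of diet
A_eq = [[1.0, 1.0, 1.0]]             # ingredient fractions sum to one
b_eq = [1.0]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * 3, method="highs")
print("optimal mix:", res.x.round(3), "cost per kg:", round(res.fun, 3))
```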

  17. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM-2.5

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  18. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM-2.5

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  19. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM−2.5.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  20. Planned Hypothesis Tests Are Not Necessarily Exempt from Multiplicity Adjustment

    ERIC Educational Resources Information Center

    Frane, Andrew V.

    2015-01-01

    Scientific research often involves testing more than one hypothesis at a time, which can inflate the probability that a Type I error (false discovery) will occur. To prevent this Type I error inflation, adjustments can be made to the testing procedure that compensate for the number of tests. Yet many researchers believe that such adjustments are…

  1. Automatic Scoring of Paper-and-Pencil Figural Responses. Research Report.

    ERIC Educational Resources Information Center

    Martinez, Michael E.; And Others

    Large-scale testing is dominated by the multiple-choice question format. Widespread use of the format is due, in part, to the ease with which multiple-choice items can be scored automatically. This paper examines automatic scoring procedures for an alternative item type: figural response. Figural response items call for the completion or…

  2. Operator Priming and Generalization of Practice in Adults' Simple Arithmetic

    ERIC Educational Resources Information Center

    Chen, Yalin; Campbell, Jamie I. D.

    2016-01-01

    There is a renewed debate about whether educated adults solve simple addition problems (e.g., 2 + 3) by direct fact retrieval or by fast, automatic counting-based procedures. Recent research testing adults' simple addition and multiplication showed that a 150-ms preview of the operator (+ or ×) facilitated addition, but not multiplication,…

  3. Testing a multiple mediation model of Asian American college students' willingness to see a counselor.

    PubMed

    Kim, Paul Youngbin; Park, Irene J K

    2009-07-01

    Adapting the theory of reasoned action, the present study examined help-seeking beliefs, attitudes, and intent among Asian American college students (N = 110). A multiple mediation model was tested to see if the relation between Asian values and willingness to see a counselor was mediated by attitudes toward seeking professional psychological help and subjective norm. A bootstrapping procedure was used to test the multiple mediation model. Results indicated that subjective norm was the sole significant mediator of the effect of Asian values on willingness to see a counselor. The findings highlight the importance of social influences on help-seeking intent among Asian American college students.
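
    A minimal sketch of the percentile-bootstrap logic for an indirect effect, reduced to a single mediator and simulated data; the study itself tested a multiple mediation model with two mediators, which is not reproduced here.

```python
# Hedged sketch: percentile bootstrap of a single indirect effect (a*b) using
# ordinary least squares on simulated stand-in data.
import numpy as np

rng = np.random.default_rng(3)
n = 110
x = rng.normal(size=n)                       # e.g. adherence to Asian values
m = 0.5 * x + rng.normal(size=n)             # e.g. subjective norm (mediator)
y = 0.6 * m + 0.1 * x + rng.normal(size=n)   # e.g. willingness to see a counselor

def indirect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                              # X -> M slope
    X = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]             # M -> Y slope given X
    return a * b

boot = np.empty(5_000)
for i in range(boot.size):
    idx = rng.integers(0, n, size=n)                        # resample cases
    boot[i] = indirect(x[idx], m[idx], y[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect: {indirect(x, m, y):.3f}, 95% bootstrap CI: ({lo:.3f}, {hi:.3f})")
```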

  4. The Slope Test: Applications in Formative Evaluation.

    ERIC Educational Resources Information Center

    Baggaley, Jon; Brauer, Aaron-Henry

    1989-01-01

    Discusses problems with formative evaluation of educational materials and examines the slope test when used in a pretest/posttest multiple group (PPMG) design to adjust posttest scores in treatment interaction studies. An example is given of the utility of the slope test and analysis of covariance procedure using an educational film about AIDS. (five…

  5. Use of "t"-Test and ANOVA in Career-Technical Education Research

    ERIC Educational Resources Information Center

    Rojewski, Jay W.; Lee, In Heok; Gemici, Sinan

    2012-01-01

    Use of t-tests and analysis of variance (ANOVA) procedures in published research from three scholarly journals in career and technical education (CTE) during a recent 5-year period was examined. Information on post hoc analyses, reporting of effect size, alpha adjustments to account for multiple tests, power, and examination of assumptions…

  6. Integrated Testlets: A New Form of Expert-Student Collaborative Testing

    ERIC Educational Resources Information Center

    Shiell, Ralph C.; Slepkov, Aaron D.

    2015-01-01

    Integrated testlets are a new assessment tool that encompass the procedural benefits of multiple-choice testing, the pedagogical advantages of free-response-based tests, and the collaborative aspects of a viva voce or defence examination format. The result is a robust assessment tool that provides a significant formative aspect for students.…

  7. Examining the Impact of Covariates on Anchor Tests to Ascertain Quality over Time in a College Admissions Test

    ERIC Educational Resources Information Center

    Wiberg, Marie; von Davier, Alina A.

    2017-01-01

    We propose a comprehensive procedure for the implementation of a quality control process of anchor tests for a college admissions test with multiple consecutive administrations. We propose to examine the anchor tests and their items in connection with covariates to investigate if there was any unusual behavior in the anchor test results over time…

  8. Development of a multiple immunoaffinity column for simultaneous determination of multiple mycotoxins in feeds using UPLC-MS/MS.

    PubMed

    Hu, Xiaofeng; Hu, Rui; Zhang, Zhaowei; Li, Peiwu; Zhang, Qi; Wang, Min

    2016-09-01

    A sensitive and specific immunoaffinity column to clean up and isolate multiple mycotoxins was developed along with a rapid one-step sample preparation procedure for ultra-performance liquid chromatography-tandem mass spectrometry analysis. Monoclonal antibodies against aflatoxin B1, aflatoxin B2, aflatoxin G1, aflatoxin G2, zearalenone, ochratoxin A, sterigmatocystin, and T-2 toxin were coupled to microbeads for mycotoxin purification. We optimized a homogenization and extraction procedure as well as column loading and elution conditions to maximize recoveries from complex feed matrices. This method allowed rapid, simple, and simultaneous determination of mycotoxins in feeds with a single chromatographic run. Detection limits for these toxins ranged from 0.006 to 0.12 ng mL(-1), and quantitation limits ranged from 0.06 to 0.75 ng mL(-1). Concentration curves were linear from 0.12 to 40 μg kg(-1) with correlation coefficients of R (2) > 0.99. Intra-assay and inter-assay comparisons indicated excellent repeatability and reproducibility of the multiple immunoaffinity columns. As a proof of principle, 80 feed samples were tested and several contained multiple mycotoxins. This method is sensitive, rapid, and durable enough for multiple mycotoxin determinations that fulfill European Union and Chinese testing criteria.

  9. Structural-Vibration-Response Data Analysis

    NASA Technical Reports Server (NTRS)

    Smith, W. R.; Hechenlaible, R. N.; Perez, R. C.

    1983-01-01

    Computer program developed as structural-vibration-response data analysis tool for use in dynamic testing of Space Shuttle. Program provides fast and efficient time-domain least-squares curve-fitting procedure for reducing transient response data to obtain structural model frequencies and dampings from free-decay records. Procedure simultaneously identifies frequencies, damping values, and participation factors for noisy multiple-response records.

  10. Multiple Phenotype Association Tests Using Summary Statistics in Genome-Wide Association Studies

    PubMed Central

    Liu, Zhonghua; Lin, Xihong

    2017-01-01

    Summary We study in this paper jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. PMID:28653391
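
    For comparison only, a sketch of the simplest summary-statistic joint test: with z-scores for K correlated phenotypes and an estimated between-phenotype correlation matrix R, the quadratic form z' R^{-1} z is referred to a chi-square distribution with K degrees of freedom. This is not the proposed mixed-model test of a common mean plus a variance component, and all numbers are hypothetical.

```python
# Hedged sketch: basic chi-square omnibus test of one variant against K
# correlated phenotypes using only summary z-scores and an estimate of the
# between-phenotype correlation matrix R (e.g. from genome-wide null variants).
import numpy as np
from scipy import stats

z = np.array([2.1, 1.8, -0.4])                 # hypothetical z-scores for 3 phenotypes
R = np.array([[1.0, 0.6, 0.2],
              [0.6, 1.0, 0.3],
              [0.2, 0.3, 1.0]])                # estimated between-phenotype correlation

q = z @ np.linalg.solve(R, z)                  # z' R^{-1} z ~ chi-square(K) under H0
p = stats.chi2.sf(q, df=len(z))
print(f"joint statistic = {q:.2f}, p = {p:.4f}")
```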

  11. Multiple phenotype association tests using summary statistics in genome-wide association studies.

    PubMed

    Liu, Zhonghua; Lin, Xihong

    2018-03-01

    We study in this article jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. © 2017, The International Biometric Society.

  12. Bayesian estimation of the transmissivity spatial structure from pumping test data

    NASA Astrophysics Data System (ADS)

    Demir, Mehmet Taner; Copty, Nadim K.; Trinchero, Paolo; Sanchez-Vila, Xavier

    2017-06-01

    Estimating the statistical parameters (mean, variance, and integral scale) that define the spatial structure of the transmissivity or hydraulic conductivity fields is a fundamental step for the accurate prediction of subsurface flow and contaminant transport. In practice, the determination of the spatial structure is a challenge because of spatial heterogeneity and data scarcity. In this paper, we describe a novel approach that uses time drawdown data from multiple pumping tests to determine the transmissivity statistical spatial structure. The method builds on the pumping test interpretation procedure of Copty et al. (2011) (Continuous Derivation method, CD), which uses the time-drawdown data and its time derivative to estimate apparent transmissivity values as a function of radial distance from the pumping well. A Bayesian approach is then used to infer the statistical parameters of the transmissivity field by combining prior information about the parameters and the likelihood function expressed in terms of radially-dependent apparent transmissivities determined from pumping tests. A major advantage of the proposed Bayesian approach is that the likelihood function is readily determined from randomly generated multiple realizations of the transmissivity field, without the need to solve the groundwater flow equation. Applying the method to synthetically-generated pumping test data, we demonstrate that, through a relatively simple procedure, information on the spatial structure of the transmissivity may be inferred from pumping tests data. It is also shown that the prior parameter distribution has a significant influence on the estimation procedure, given the non-uniqueness of the estimation procedure. Results also indicate that the reliability of the estimated transmissivity statistical parameters increases with the number of available pumping tests.

  13. Influence of temporal context on value in the multiple-chains and successive-encounters procedures.

    PubMed

    O'Daly, Matthew; Angulo, Samuel; Gipson, Cassandra; Fantino, Edmund

    2006-05-01

    This set of studies explored the influence of temporal context across multiple-chain and multiple-successive-encounters procedures. Following training with different temporal contexts, the value of stimuli sharing similar reinforcement schedules was assessed by presenting these stimuli in concurrent probes. The results for the multiple-chain schedule indicate that temporal context does impact the value of a conditioned reinforcer consistent with delay-reduction theory, such that a stimulus signaling a greater reduction in delay until reinforcement has greater value. Further, nonreinforced stimuli that are concurrently presented with the preferred terminal link also have greater value, consistent with value transfer. The effects of context on value for conditions with the multiple-successive-encounters procedure, however, appear to depend on whether the search schedule or alternate handling schedule was manipulated, as well as on whether the tested stimuli were the rich or lean schedules in their components. Overall, the results help delineate the conditions under which temporal context affects conditioned-reinforcement value (acting as a learning variable) and the conditions under which it does not (acting as a performance variable), an issue of relevance to theories of choice.

  14. Developmental dissociation in the neural responses to simple multiplication and subtraction problems

    PubMed Central

    Prado, Jérôme; Mutreja, Rachna; Booth, James R.

    2014-01-01

    Mastering single-digit arithmetic during school years is commonly thought to depend upon an increasing reliance on verbally memorized facts. An alternative model, however, posits that fluency in single-digit arithmetic might also be achieved via the increasing use of efficient calculation procedures. To test between these hypotheses, we used a cross-sectional design to measure the neural activity associated with single-digit subtraction and multiplication in 34 children from 2nd to 7th grade. The neural correlates of language and numerical processing were also identified in each child via localizer scans. Although multiplication and subtraction were undistinguishable in terms of behavior, we found a striking developmental dissociation in their neural correlates. First, we observed grade-related increases of activity for multiplication, but not for subtraction, in a language-related region of the left temporal cortex. Second, we found grade-related increases of activity for subtraction, but not for multiplication, in a region of the right parietal cortex involved in the procedural manipulation of numerical quantities. The present results suggest that fluency in simple arithmetic in children may be achieved by both increasing reliance on verbal retrieval and by greater use of efficient quantity-based procedures, depending on the operation. PMID:25089323

  15. Application of modified profile analysis to function testing of the motion/no-motion issue in an aircraft ground-handling simulation. [statistical analysis procedure for man machine systems flight simulation

    NASA Technical Reports Server (NTRS)

    Parrish, R. V.; Mckissick, B. T.; Steinmetz, G. G.

    1979-01-01

    A recent modification of the methodology of profile analysis, which allows testing for differences between two functions as a whole with a single test rather than point by point with multiple tests, is discussed. The modification is applied to the examination of the issue of motion/no-motion conditions as shown by the lateral deviation curve as a function of engine cut speed of a piloted 737-100 simulator. The results of this application are presented along with those of more conventional statistical test procedures on the same simulator data.

  16. Skylab checkout operations. [from multiple docking adapter contractor viewpoint

    NASA Technical Reports Server (NTRS)

    Timmons, K. P.

    1973-01-01

    The Skylab Program at Kennedy Space Center presented many opportunities for interesting and profound test and checkout experience. It also offered a compilation of challenges and promises for the Center and for the contractors responsible for the various modules making up Skylab. It is very probable that the various contractors had common experiences during the module and combined systems tests, but this paper discusses those experiences from the viewpoint of the Multiple Docking Adapter contractor. The discussion covers personnel, procedures, and hardware.

  17. Estimating False Discovery Proportion Under Arbitrary Covariance Dependence*

    PubMed Central

    Fan, Jianqing; Han, Xu; Gu, Weijie

    2012-01-01

    Multiple hypothesis testing is a fundamental problem in high dimensional inference, with wide applications in many scientific fields. In genome-wide association studies, tens of thousands of tests are performed simultaneously to find if any SNPs are associated with some traits and those tests are correlated. When test statistics are correlated, false discovery control becomes very challenging under arbitrary dependence. In the current paper, we propose a novel method based on principal factor approximation, which successfully subtracts the common dependence and weakens significantly the correlation structure, to deal with an arbitrary dependence structure. We derive an approximate expression for false discovery proportion (FDP) in large scale multiple testing when a common threshold is used and provide a consistent estimate of realized FDP. This result has important applications in controlling FDR and FDP. Our estimate of realized FDP compares favorably with Efron (2007)’s approach, as demonstrated in the simulated examples. Our approach is further illustrated by some real data applications. We also propose a dependence-adjusted procedure, which is more powerful than the fixed threshold procedure. PMID:24729644

  18. Development and Application of a Two-Tier Multiple-Choice Diagnostic Test for High School Students' Understanding of Cell Division and Reproduction

    ERIC Educational Resources Information Center

    Sesli, Ertugrul; Kara, Yilmaz

    2012-01-01

    This study involved the development and application of a two-tier diagnostic test for measuring students' understanding of cell division and reproduction. The instrument development procedure had three general steps: defining the content boundaries of the test, collecting information on students' misconceptions, and instrument development.…

  19. Effect of implementing instructional videos in a physical examination course: an alternative paradigm for chiropractic physical examination teaching.

    PubMed

    Zhang, Niu; Chawla, Sudeep

    2012-01-01

    This study examined the effect of implementing instructional video in ophthalmic physical examination teaching on chiropractic students' laboratory physical examination skills and written test results. Instructional video clips of ophthalmic physical examination, consisting of both standard procedures and common mistakes, were created and used for laboratory teaching. The video clips were also available for student review after class. Students' laboratory skills and written test results were analyzed and compared using one-way analysis of variance (ANOVA) and post hoc multiple comparison tests among three study cohorts: the comparison cohort, who did not utilize the instructional videos; the standard video cohort, who viewed only the standard-procedure video clips; and the mistake-referenced video cohort, who viewed video clips containing both the standard procedure and common mistakes. One-way ANOVA suggested a significant difference in lab results among the three cohorts. Post hoc multiple comparisons further revealed that the mean scores of both video cohorts were significantly higher than that of the comparison cohort (p < .001). There was, however, no significant difference in mean scores between the two video cohorts (p > .05). Nevertheless, the percentage of students achieving a perfect score was highest in the mistake-referenced video cohort. There was no significant difference in written test scores among the three cohorts (p > .05). The instructional video of the standard procedure improves chiropractic students' ophthalmic physical examination skills, which may be further enhanced by implementing a mistake-referenced instructional video.

  20. Sequential Tests of Multiple Hypotheses Controlling Type I and II Familywise Error Rates

    PubMed Central

    Bartroff, Jay; Song, Jinlin

    2014-01-01

    This paper addresses the following general scenario: A scientist wishes to perform a battery of experiments, each generating a sequential stream of data, to investigate some phenomenon. The scientist would like to control the overall error rate in order to draw statistically-valid conclusions from each experiment, while being as efficient as possible. The between-stream data may differ in distribution and dimension but also may be highly correlated, even duplicated exactly in some cases. Treating each experiment as a hypothesis test and adopting the familywise error rate (FWER) metric, we give a procedure that sequentially tests each hypothesis while controlling both the type I and II FWERs regardless of the between-stream correlation, and only requires arbitrary sequential test statistics that control the error rates for a given stream in isolation. The proposed procedure, which we call the sequential Holm procedure because of its inspiration from Holm’s (1979) seminal fixed-sample procedure, shows simultaneous savings in expected sample size and less conservative error control relative to fixed sample, sequential Bonferroni, and other recently proposed sequential procedures in a simulation study. PMID:25092948
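
    A minimal sketch of Holm's (1979) fixed-sample step-down procedure, the starting point that the sequential Holm procedure generalizes to streaming data; the p-values are placeholders, and the sequential version's additional type II FWER control is not captured here.

```python
# Hedged sketch: Holm's fixed-sample step-down procedure on toy p-values.
import numpy as np

def holm(pvals, alpha=0.05):
    p = np.asarray(pvals)
    order = np.argsort(p)
    m = len(p)
    reject = np.zeros(m, dtype=bool)
    for rank, idx in enumerate(order):
        if p[idx] <= alpha / (m - rank):   # k-th smallest p compared to alpha/(m-k+1)
            reject[idx] = True
        else:
            break                          # step-down stops at the first non-rejection
    return reject

print(holm([0.001, 0.02, 0.03, 0.40]))
```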

  1. 40 CFR 86.132-96 - Vehicle preconditioning.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... outdoors awaiting testing, to prevent unusual loading of the canisters. During this time care must be taken... idle again for 1 minute. (H) After the vehicle is turned off the last time, it may be tested for... preconditioned according to the following procedure. For vehicles with multiple canisters in a series...

  2. Stress Reconstruction Analysis of Wheel Saw Cut Tests and Evaluation of Reconstruction Procedure

    DOT National Transportation Integrated Search

    1993-09-01

    The report is the fourth in a series of engineering studies on railroad vehicle wheel performance. The results of saw cut tests performed on one new and one used wheel designed for a fleet of multiple unit (MU) power cars are summarized and analyzed....

  3. Simultaneous Inference Procedures for Means.

    ERIC Educational Resources Information Center

    Krishnaiah, P. R.

    Some aspects of simultaneous tests for means are reviewed. Specifically, the comparison of univariate or multivariate normal populations based on the values of the means or mean vectors when the variances or covariance matrices are equal is discussed. Tukey's and Dunnett's tests for multiple comparisons of means, Scheffe's method of examining…

  4. A Generalized Logistic Regression Procedure to Detect Differential Item Functioning among Multiple Groups

    ERIC Educational Resources Information Center

    Magis, David; Raiche, Gilles; Beland, Sebastien; Gerard, Paul

    2011-01-01

    We present an extension of the logistic regression procedure to identify dichotomous differential item functioning (DIF) in the presence of more than two groups of respondents. Starting from the usual framework of a single focal group, we propose a general approach to estimate the item response functions in each group and to test for the presence…

  5. 40 CFR Appendix E to Part 63 - Monitoring Procedure for Nonthoroughly Mixed Open Biological Treatment Systems at Kraft Pulp...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... for each data set that is collected during the initial performance test. A single composite value of... Multiple Zone Concentrations Calculations Procedure based on inlet and outlet concentrations (Column A of... composite value of Ks discussed in section III.C of this appendix. This value of Ks is calculated during the...

  6. Multiple statistical tests: Lessons from a d20.

    PubMed

    Madan, Christopher R

    2016-01-01

    Statistical analyses are often conducted with α = .05. When multiple statistical tests are conducted, this procedure needs to be adjusted to compensate for the otherwise inflated Type I error. In tabletop gaming, it is sometimes desired to roll a 20-sided die (or 'd20') twice and take the greater outcome. Here I draw from probability theory and the case of a d20, where the probability of obtaining any specific outcome is 1/20, to determine the probability of obtaining a specific outcome (Type I error) at least once across repeated, independent statistical tests.
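
    A one-line worked version of the probability in question, assuming independent attempts: the chance of hitting a specific outcome at least once in k tries is 1 - (1 - p)^k.

```python
# Hedged sketch: probability of at least one "hit" (a Type I error, or a
# particular face of a d20) across k independent attempts.
def at_least_once(p_single, k):
    return 1 - (1 - p_single) ** k

print(at_least_once(1 / 20, 2))    # two d20 rolls: 1 - (19/20)**2 = 0.0975
print(at_least_once(0.05, 10))     # 10 independent tests at alpha = .05: about 0.40
```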

  7. Analysis of 30 Genes (355 SNPS) Related to Energy Homeostasis for Association with Adiposity in European-American and Yup'ik Eskimo Populations

    PubMed Central

    Chung, Wendy K.; Patki, Amit; Matsuoka, Naoki; Boyer, Bert B.; Liu, Nianjun; Musani, Solomon K.; Goropashnaya, Anna V.; Tan, Perciliz L.; Katsanis, Nicholas; Johnson, Stephen B.; Gregersen, Peter K.; Allison, David B.; Leibel, Rudolph L.; Tiwari, Hemant K.

    2009-01-01

    Objective Human adiposity is highly heritable, but few of the genes that predispose to obesity in most humans are known. We tested candidate genes in pathways related to food intake and energy expenditure for association with measures of adiposity. Methods We studied 355 genetic variants in 30 candidate genes in 7 molecular pathways related to obesity in two groups of adult subjects: 1,982 unrelated European Americans living in the New York metropolitan area drawn from the extremes of their body mass index (BMI) distribution and 593 related Yup'ik Eskimos living in rural Alaska characterized for BMI, body composition, waist circumference, and skin fold thicknesses. Data were analyzed by using a mixed model in conjunction with a false discovery rate (FDR) procedure to correct for multiple testing. Results After correcting for multiple testing, two single nucleotide polymorphisms (SNPs) in Ghrelin (GHRL) (rs35682 and rs35683) were associated with BMI in the New York European Americans. This association was not replicated in the Yup'ik participants. There was no evidence for gene × gene interactions among genes within the same molecular pathway after adjusting for multiple testing via FDR control procedure. Conclusion Genetic variation in GHRL may have a modest impact on BMI in European Americans. PMID:19077438

  8. Analysis of 30 genes (355 SNPS) related to energy homeostasis for association with adiposity in European-American and Yup'ik Eskimo populations.

    PubMed

    Chung, Wendy K; Patki, Amit; Matsuoka, Naoki; Boyer, Bert B; Liu, Nianjun; Musani, Solomon K; Goropashnaya, Anna V; Tan, Perciliz L; Katsanis, Nicholas; Johnson, Stephen B; Gregersen, Peter K; Allison, David B; Leibel, Rudolph L; Tiwari, Hemant K

    2009-01-01

    Human adiposity is highly heritable, but few of the genes that predispose to obesity in most humans are known. We tested candidate genes in pathways related to food intake and energy expenditure for association with measures of adiposity. We studied 355 genetic variants in 30 candidate genes in 7 molecular pathways related to obesity in two groups of adult subjects: 1,982 unrelated European Americans living in the New York metropolitan area drawn from the extremes of their body mass index (BMI) distribution and 593 related Yup'ik Eskimos living in rural Alaska characterized for BMI, body composition, waist circumference, and skin fold thicknesses. Data were analyzed by using a mixed model in conjunction with a false discovery rate (FDR) procedure to correct for multiple testing. After correcting for multiple testing, two single nucleotide polymorphisms (SNPs) in Ghrelin (GHRL) (rs35682 and rs35683) were associated with BMI in the New York European Americans. This association was not replicated in the Yup'ik participants. There was no evidence for gene x gene interactions among genes within the same molecular pathway after adjusting for multiple testing via FDR control procedure. Genetic variation in GHRL may have a modest impact on BMI in European Americans.

  9. Direct and conceptual replications of the taxometric analysis of type a behavior.

    PubMed

    Wilmot, Michael P; Haslam, Nick; Tian, Jingyuan; Ones, Deniz S

    2018-05-17

    We present direct and conceptual replications of the influential taxometric analysis of Type A Behavior (TAB; Strube, 1989), which reported evidence for the latent typology of the construct. Study 1, the direct replication (N = 2,373), duplicated sampling and methodological procedures of the original study, but results showed that the item indicators used in the original study lacked sufficient validity to unambiguously determine latent structure. Using improved factorial subscale indicators to further test the question, multiple taxometric procedures, in combination with parallel analyses of simulated data, failed to replicate the original typological finding. Study 2, the conceptual replication, tested the latent structure of the wider construct of TAB using the sample from the Caerphilly Prospective Study (N = 2,254), which contains responses to the three most widely used self-report measures of TAB: the Jenkins Activity Survey, Bortner scale, and Framingham scale. Factorial subscale indicators were derived from the measures and submitted to multiple taxometric procedures. Results of Study 2 converged with those of Study 1, providing clear evidence of latent dimensional structure. Overall, results suggest there is no evidence for the type in TAB. Findings imply that theoretical models of TAB, assessment practices, and data analytic procedures that assume a typology should be replaced by dimensional models, factorial subscale measures, and corresponding statistical approaches. Specific subscale measures that tap multiple Big Five trait domains, and show evidence of predictive utility, are also recommended. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. Organizational Justice and Physiological Coronary Heart Disease Risk Factors in Japanese Employees: a Cross-Sectional Study.

    PubMed

    Inoue, Akiomi; Kawakami, Norito; Eguchi, Hisashi; Miyaki, Koichi; Tsutsumi, Akizumi

    2015-12-01

    Growing evidence has shown that lack of organizational justice (i.e., procedural justice and interactional justice) is associated with coronary heart disease (CHD) while biological mechanisms underlying this association have not yet been fully clarified. The purpose of the present study was to investigate the cross-sectional association of organizational justice with physiological CHD risk factors (i.e., blood pressure, high-density lipoprotein [HDL] cholesterol, low-density lipoprotein [LDL] cholesterol, and triglyceride) in Japanese employees. Overall, 3598 male and 901 female employees from two manufacturing companies in Japan completed self-administered questionnaires measuring organizational justice, demographic characteristics, and lifestyle factors. They completed health checkup, which included blood pressure and serum lipid measurements. Multiple logistic regression analyses and trend tests were conducted. Among male employees, multiple logistic regression analyses and trend tests showed significant associations of low procedural justice and low interactional justice with high triglyceride (defined as 150 mg/dL or greater) after adjusting for demographic characteristics and lifestyle factors. Among female employees, trend tests showed significant dose-response relationship between low interactional justice and high LDL cholesterol (defined as 140 mg/dL or greater) while multiple logistic regression analysis showed only marginally significant or insignificant odds ratio of high LDL cholesterol among the low interactional justice group. Neither procedural justice nor interactional justice was associated with blood pressure or HDL cholesterol. Organizational justice may be an important psychosocial factor associated with increased triglyceride at least among Japanese male employees.

  11. Development of a multiple-parameter nonlinear perturbation procedure for transonic turbomachinery flows: Preliminary application to design/optimization problems

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Elliott, J. P.; Spreiter, J. R.

    1983-01-01

    An investigation was conducted to continue the development of perturbation procedures and associated computational codes for rapidly determining approximations to nonlinear flow solutions, with the purpose of establishing a method for minimizing computational requirements associated with parametric design studies of transonic flows in turbomachines. The results reported here concern the extension of the previously developed successful method for single parameter perturbations to simultaneous multiple-parameter perturbations, and the preliminary application of the multiple-parameter procedure in combination with an optimization method to blade design/optimization problem. In order to provide as severe a test as possible of the method, attention is focused in particular on transonic flows which are highly supercritical. Flows past both isolated blades and compressor cascades, involving simultaneous changes in both flow and geometric parameters, are considered. Comparisons with the corresponding exact nonlinear solutions display remarkable accuracy and range of validity, in direct correspondence with previous results for single-parameter perturbations.

  12. The Effect of SSM Grading on Reliability When Residual Items Have No Discriminating Power.

    ERIC Educational Resources Information Center

    Kane, Michael T.; Moloney, James M.

    Gilman and Ferry have shown that when the student's score on a multiple choice test is the total number of responses necessary to get all items correct, substantial increases in reliability can occur. In contrast, similar procedures giving partial credit on multiple choice items have resulted in relatively small gains in reliability. The analysis…

  13. Item Response Theory with Covariates (IRT-C): Assessing Item Recovery and Differential Item Functioning for the Three-Parameter Logistic Model

    ERIC Educational Resources Information Center

    Tay, Louis; Huang, Qiming; Vermunt, Jeroen K.

    2016-01-01

    In large-scale testing, the use of multigroup approaches is limited for assessing differential item functioning (DIF) across multiple variables as DIF is examined for each variable separately. In contrast, the item response theory with covariate (IRT-C) procedure can be used to examine DIF across multiple variables (covariates) simultaneously. To…

  14. The Effects of Multiple Exemplar Training on a Working Memory Task Involving Sequential Responding in Children with Autism

    ERIC Educational Resources Information Center

    Baltruschat, Lisa; Hasselhorn, Marcus; Tarbox, Jonathan; Dixon, Dennis R.; Najdowski, Adel; Mullins, Ryan David; Gould, Evelyn

    2012-01-01

    This study is part of a programmatic line of research into the use of basic positive reinforcement procedures for improving working memory in children with autism spectrum disorders. The authors evaluated the effects of multiple exemplar training, utilizing positive reinforcement, on performance of a "digit span backwards" task--a test of working…

  15. A step-up test procedure to find the minimum effective dose.

    PubMed

    Wang, Weizhen; Peng, Jianan

    2015-01-01

    It is of great interest to find the minimum effective dose (MED) in dose-response studies. A sequence of decreasing null hypotheses to find the MED is formulated under the assumption of nondecreasing dose-response means. A step-up multiple test procedure that controls the familywise error rate (FWER) is constructed based on the maximum likelihood estimators for the monotone normal means. When the MED is equal to one, the proposed test is uniformly more powerful than Hsu and Berger's (1999) test. Also, a simulation study shows a substantial power improvement for the proposed test over four competitors. Three R codes are provided in the Supplemental Materials for this article. Go to the publisher's online edition of the Journal of Biopharmaceutical Statistics to view the files.
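
    For context only, a sketch of a simpler fixed-sequence baseline for the MED (test doses from the highest down, each at level alpha, and stop at the first non-rejection), which also controls the FWER under monotone means; this is not the paper's step-up procedure based on order-restricted maximum likelihood estimators, and the data are simulated.

```python
# Hedged sketch: fixed-sequence baseline for the minimum effective dose (MED).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
control = rng.normal(0.0, 1.0, size=20)
true_shift = {1: 0.0, 2: 0.1, 3: 0.8, 4: 1.0}             # hypothetical dose effects
groups = {d: rng.normal(mu, 1.0, size=20) for d, mu in true_shift.items()}

alpha, med = 0.05, None
for d in sorted(groups, reverse=True):                     # start at the highest dose
    t, p_two = stats.ttest_ind(groups[d], control)
    p_one = p_two / 2 if t > 0 else 1 - p_two / 2          # one-sided: dose > control
    if p_one <= alpha:
        med = d                                            # still effective, keep stepping down
    else:
        break                                              # first non-rejection stops the sequence
print("estimated MED:", med)
```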

  16. A new statistical method for transfer coefficient calculations in the framework of the general multiple-compartment model of transport for radionuclides in biological systems.

    PubMed

    Garcia, F; Arruda-Neto, J D; Manso, M V; Helene, O M; Vanin, V R; Rodriguez, O; Mesa, J; Likhachev, V P; Filho, J W; Deppman, A; Perez, G; Guzman, F; de Camargo, S P

    1999-10-01

    A new and simple statistical procedure (STATFLUX) for the calculation of transfer coefficients of radionuclide transport to animals and plants is proposed. The method is based on the general multiple-compartment model, which uses a system of linear equations involving geometrical volume considerations. By using experimentally available curves of radionuclide concentrations versus time, for each animal compartment (organs), flow parameters were estimated by employing a least-squares procedure, whose consistency is tested. Some numerical results are presented in order to compare the STATFLUX transfer coefficients with those from other works and experimental data.
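
    A minimal sketch of the kind of calculation described here: a linear two-compartment transport model whose transfer coefficients are recovered from concentration-versus-time curves by least squares. The model structure, rate constants, and simulated data are illustrative assumptions, not the STATFLUX formulation or its consistency test.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def two_compartment(t, c, k12, k20):
    """Linear two-compartment model: source organ -> target organ -> excretion."""
    c1, c2 = c
    return [-k12 * c1, k12 * c1 - k20 * c2]

def model_curves(k, t_obs, c0):
    sol = solve_ivp(two_compartment, (t_obs[0], t_obs[-1]), c0,
                    t_eval=t_obs, args=tuple(k), rtol=1e-8)
    return sol.y

# Synthetic "measured" concentration-versus-time curves (illustrative only).
rng = np.random.default_rng(0)
t_obs = np.linspace(0.0, 10.0, 15)
true_k = (0.8, 0.3)
c0 = [1.0, 0.0]
observed = model_curves(true_k, t_obs, c0) + rng.normal(0.0, 0.01, (2, t_obs.size))

# Least-squares estimation of the transfer coefficients.
def residuals(k):
    return (model_curves(k, t_obs, c0) - observed).ravel()

fit = least_squares(residuals, x0=[0.5, 0.5], bounds=(0.0, np.inf))
print("estimated transfer coefficients:", np.round(fit.x, 3))
```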

  17. Single-Word Intelligibility in Speakers with Repaired Cleft Palate

    ERIC Educational Resources Information Center

    Whitehill, Tara; Chau, Cynthia

    2004-01-01

    Many speakers with repaired cleft palate have reduced intelligibility, but there are limitations with current procedures for assessing intelligibility. The aim of this study was to construct a single-word intelligibility test for speakers with cleft palate. The test used a multiple-choice identification format, and was based on phonetic contrasts…

  18. Development of RAD-Score: A Tool to Assess the Procedural Competence of Diagnostic Radiology Residents.

    PubMed

    Isupov, Inga; McInnes, Matthew D F; Hamstra, Stan J; Doherty, Geoffrey; Gupta, Ashish; Peddle, Susan; Jibri, Zaid; Rakhra, Kawan; Hibbert, Rebecca M

    2017-04-01

    The purpose of this study is to develop a tool to assess the procedural competence of radiology trainees, with sources of evidence gathered from five categories to support the construct validity of the tool: content, response process, internal structure, relations to other variables, and consequences. A pilot form for assessing procedural competence among radiology residents, known as the RAD-Score tool, was developed by evaluating published literature and using a modified Delphi procedure involving a group of local content experts. The pilot version of the tool was tested by seven radiology department faculty members who evaluated procedures performed by 25 residents at one institution between October 2014 and June 2015. Residents were evaluated while performing multiple procedures in both clinical and simulation settings. The main outcome measure was the percentage of residents who were considered ready to perform procedures independently, with testing conducted to determine differences between levels of training. A total of 105 forms (for 52 procedures performed in a clinical setting and 53 procedures performed in a simulation setting) were collected for a variety of procedures (eight vascular or interventional, 42 body, 12 musculoskeletal, 23 chest, and 20 breast procedures). A statistically significant difference was noted in the percentage of trainees who were rated as being ready to perform a procedure independently (in postgraduate year [PGY] 2, 12% of residents; in PGY3, 61%; in PGY4, 85%; and in PGY5, 88%; p < 0.05); this difference persisted in the clinical and simulation settings. User feedback and psychometric analysis were used to create a final version of the form. This prospective study describes the successful development of a tool for assessing the procedural competence of radiology trainees with high levels of construct validity in multiple domains. Implementation of the tool in the radiology residency curriculum is planned and can play an instrumental role in the transition to competency-based radiology training.

  19. Stochastic DG Placement for Conservation Voltage Reduction Based on Multiple Replications Procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhaoyu; Chen, Bokan; Wang, Jianhui

    2015-06-01

    Conservation voltage reduction (CVR) and distributed-generation (DG) integration are popular strategies implemented by utilities to improve energy efficiency. This paper investigates the interactions between CVR and DG placement to minimize load consumption in distribution networks, while keeping the lowest voltage level within the predefined range. The optimal placement of DG units is formulated as a stochastic optimization problem considering the uncertainty of DG outputs and load consumptions. A sample average approximation algorithm-based technique is developed to solve the formulated problem effectively. A multiple replications procedure is developed to test the stability of the solution and calculate the confidence interval of the gap between the candidate solution and optimal solution. The proposed method has been applied to the IEEE 37-bus distribution test system with different scenarios. The numerical results indicate that the implementations of CVR and DG, if combined, can achieve significant energy savings.

  20. Development and Pilot Testing of 24-Hour Multiple-Pass Recall to Assess Dietary Intake of Toddlers of Somali- and Iraqi-Born Mothers Living in Norway

    PubMed Central

    Grewal, Navnit Kaur; Mosdøl, Annhild; Aunan, Marte Bergsund; Monsen, Carina; Torheim, Liv Elin

    2014-01-01

    The aim of this study was to develop, test, and evaluate a 24-h recall procedure to assess the dietary intake of toddlers of Somali- and Iraqi-born mothers living in Norway. A protocol for a 24-h multiple-pass recall procedure, registration forms, and visual tools (a picture library for food identification and portion size estimation) was developed and tested in 12 mothers from Somalia and Iraq with children aged 10–21 months. Five female field workers were recruited and trained to conduct the interviews. Evaluation data for the 24-h recall procedure were collected from both the mothers and the field workers. Nutrient intake was calculated using a Norwegian dietary calculation system. Each child’s estimated energy intake was compared with its estimated energy requirement. Both the mothers and the field workers found the method feasible and the visual tools useful. The estimated energy intake corresponded well with the estimated energy requirement for most of the children (within mean ± 2 SD, except for three). The pilot study identified the need for additional foods in the picture library and some crucial aspects in training and supervising the field workers to reduce sources of error in the data collection. PMID:24949548

  1. Combined use of Kappa Free Light Chain Index and Isoelectrofocusing of Cerebro-Spinal Fluid in Diagnosing Multiple Sclerosis: Performances and Costs.

    PubMed

    Crespi, Ilaria; Sulas, Maria Giovanna; Mora, Riccardo; Naldi, Paola; Vecchio, Domizia; Comi, Cristoforo; Cantello, Roberto; Bellomo, Giorgio

    2017-03-01

    Isoelectrofocusing (IEF) to detect oligoclonal bands (OCBs) in cerebrospinal fluid (CSF) is the gold standard approach for evaluating intrathecal immunoglobulin synthesis in multiple sclerosis (MS), but the kappa free light chain index (KFLCi) is emerging as an alternative marker, and the combined or sequential use of IEF and KFLCi has never been formally evaluated. CSF and serum albumin, IgG, kFLC and lFLC were measured by nephelometry; albumin, IgG and kFLC quotients as well as the Link and kFLC indexes were calculated; OCBs were evaluated by immunofixation. A total of 150 consecutive patients were investigated: 48 with MS, 32 with other neurological inflammatory diseases (NID), 62 with neurological non-inflammatory diseases (NNID), and 8 without any detectable neurological disease (NND). Both IEF and KFLCi showed similar accuracy as diagnostic tests for multiple sclerosis. The high sensitivity and specificity, together with the lower cost of KFLCi, suggested using this test first, followed by IEF as a confirmatory procedure. The sequential use of KFLCi and IEF showed high diagnostic efficiency, with cost reductions of 43% and 21% compared with the simultaneous use of both tests and with the use of IEF alone in all patients, respectively. "Sequential testing" using KFLCi followed by IEF in MS represents an optimal procedure with accurate performance and lower costs.
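
    The cost argument for sequential testing can be made concrete with a small expected-cost calculation: screen everyone with the cheaper test and confirm only screen-positive patients. All prevalences, accuracies, and costs below are hypothetical placeholders, not the figures reported in the study.

```python
def expected_cost_per_patient(prevalence, sens_first, spec_first,
                              cost_first, cost_confirm):
    """Expected cost of a sequential strategy: run the cheap screening test on
    everyone, run the confirmatory test only on screen-positive patients."""
    p_screen_positive = (prevalence * sens_first
                         + (1.0 - prevalence) * (1.0 - spec_first))
    return cost_first + p_screen_positive * cost_confirm

# All numbers below are hypothetical placeholders, not the costs or accuracies
# reported in the study.
seq  = expected_cost_per_patient(prevalence=0.3, sens_first=0.95,
                                 spec_first=0.90, cost_first=20.0,
                                 cost_confirm=80.0)
both = 20.0 + 80.0        # running both tests on every patient
print(f"sequential: {seq:.1f} per patient vs {both:.1f} for both tests on everyone")
```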

  2. An analytical approach to reduce between-plate variation in multiplex assays that measure antibodies to Plasmodium falciparum antigens.

    PubMed

    Fang, Rui; Wey, Andrew; Bobbili, Naveen K; Leke, Rose F G; Taylor, Diane Wallace; Chen, John J

    2017-07-17

    Antibodies play an important role in immunity to malaria. Recent studies show that antibodies to multiple antigens, as well as, the overall breadth of the response are associated with protection from malaria. Yet, the variability and reliability of antibody measurements against a combination of malarial antigens using multiplex assays have not been well characterized. A normalization procedure for reducing between-plate variation using replicates of pooled positive and negative controls was investigated. Sixty test samples (30 from malaria-positive and 30 malaria-negative individuals), together with five pooled positive-controls and two pooled negative-controls, were screened for antibody levels to 9 malarial antigens, including merozoite antigens (AMA1, EBA175, MSP1, MSP2, MSP3, MSP11, Pf41), sporozoite CSP, and pregnancy-associated VAR2CSA. The antibody levels were measured in triplicate on each of 3 plates, and the experiments were replicated on two different days by the same technician. The performance of the proposed normalization procedure was evaluated with the pooled controls for the test samples on both the linear and natural-log scales. Compared with data on the linear scale, the natural-log transformed data were less skewed and reduced the mean-variance relationship. The proposed normalization procedure using pooled controls on the natural-log scale significantly reduced between-plate variation. For malaria-related research that measure antibodies to multiple antigens with multiplex assays, the natural-log transformation is recommended for data analysis and use of the normalization procedure with multiple pooled controls can improve the precision of antibody measurements.
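
    A minimal sketch of a pooled-control, log-scale plate normalization of the kind described here: each plate's offset is estimated from its pooled controls and subtracted from the test samples on the natural-log scale. The data layout, column names, and values are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
import pandas as pd

# Illustrative data layout: one row per well, with plate ID, sample type and
# a measured antibody level (e.g. median fluorescence intensity) for one antigen.
df = pd.DataFrame({
    "plate": ["P1"] * 4 + ["P2"] * 4,
    "type":  ["pooled_control", "pooled_control", "test", "test"] * 2,
    "value": [1200.0, 1150.0, 800.0, 2400.0, 1500.0, 1480.0, 950.0, 2900.0],
})

# Work on the natural-log scale to reduce skewness and the mean-variance relation.
df["log_value"] = np.log(df["value"])

# Per-plate offset: how far this plate's pooled controls sit from the overall
# pooled-control mean across plates.
ctrl = df[df["type"] == "pooled_control"]
overall_ctrl_mean = ctrl["log_value"].mean()
plate_offset = ctrl.groupby("plate")["log_value"].mean() - overall_ctrl_mean

# Normalized test-sample values: subtract the plate offset on the log scale.
df["normalized"] = df["log_value"] - df["plate"].map(plate_offset)
print(df[df["type"] == "test"][["plate", "value", "normalized"]])
```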

  3. Identifying and exploiting trait-relevant tissues with multiple functional annotations in genome-wide association studies

    PubMed Central

    Zhang, Shujun

    2018-01-01

    Genome-wide association studies (GWASs) have identified many disease-associated loci, the majority of which have unknown biological functions. Understanding the mechanism underlying trait associations requires identifying trait-relevant tissues and investigating associations in a trait-specific fashion. Here, we extend the widely used linear mixed model to incorporate multiple SNP functional annotations from omics studies with GWAS summary statistics to facilitate the identification of trait-relevant tissues, with which to further construct powerful association tests. Specifically, we rely on a generalized estimating equation based algorithm for parameter inference, a mixture modeling framework for trait-tissue relevance classification, and a weighted sequence kernel association test constructed based on the identified trait-relevant tissues for powerful association analysis. We refer to our analytic procedure as the Scalable Multiple Annotation integration for trait-Relevant Tissue identification and usage (SMART). With extensive simulations, we show how our method can make use of multiple complementary annotations to improve the accuracy for identifying trait-relevant tissues. In addition, our procedure allows us to make use of the inferred trait-relevant tissues, for the first time, to construct more powerful SNP set tests. We apply our method to an in-depth analysis of 43 traits from 28 GWASs using tissue-specific annotations in 105 tissues derived from ENCODE and Roadmap. Our results reveal new trait-tissue relevance, pinpoint important annotations that are informative of the trait-tissue relationship, and illustrate how we can use the inferred trait-relevant tissues to construct more powerful association tests in the Wellcome Trust Case Control Consortium study. PMID:29377896

  4. A Statistical Analysis of Brain Morphology Using Wild Bootstrapping

    PubMed Central

    Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.

    2008-01-01

    Methods for the analysis of brain morphology, including voxel-based morphometry and surface-based morphometry, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two statistical procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correction for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose the use of new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method, called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
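
    A minimal sketch of the wild bootstrap idea for a single association test under heteroscedastic errors: residuals from the null fit are multiplied by random Rademacher signs and the test statistic is recomputed. It does not reproduce the authors' surface-based model or their family-wise correction across voxels, and the simulated data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def t_statistic(y, X):
    """OLS t statistic for the last column of the design matrix X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = X.shape[0] - X.shape[1]
    sigma2 = resid @ resid / dof
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[-1] / np.sqrt(cov[-1, -1])

# Illustrative data: one morphometric measure regressed on an intercept and a
# covariate of interest, with heteroscedastic noise.
n = 60
covariate = rng.normal(size=n)
X = np.column_stack([np.ones(n), covariate])
y = 0.3 * covariate + rng.normal(scale=0.5 + 0.5 * np.abs(covariate), size=n)

t_obs = t_statistic(y, X)

# Wild bootstrap under the null (covariate coefficient = 0): refit the
# null model, then perturb its residuals with Rademacher weights.
X_null = X[:, :1]
beta_null, *_ = np.linalg.lstsq(X_null, y, rcond=None)
fitted_null = X_null @ beta_null
resid_null = y - fitted_null

B = 2000
t_boot = np.empty(B)
for b in range(B):
    w = rng.choice([-1.0, 1.0], size=n)          # Rademacher multipliers
    y_star = fitted_null + resid_null * w
    t_boot[b] = t_statistic(y_star, X)

p_value = np.mean(np.abs(t_boot) >= np.abs(t_obs))
print(f"observed t = {t_obs:.2f}, wild-bootstrap p = {p_value:.3f}")
```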

  5. Site conditions related to erosion on logging roads

    Treesearch

    R. M. Rice; J. D. McCashion

    1985-01-01

    Synopsis - Data collected from 299 road segments in northwestern California were used to develop and test a procedure for estimating and managing road-related erosion. Site conditions and the design of each segment were described by 30 variables. Equations developed using 149 of the road segments were tested on the other 150. The best multiple regression equation...

  6. Cultural-Linguistic Test Adaptations: Guidelines for Selection, Alteration, Use, and Review

    ERIC Educational Resources Information Center

    Krach, S. Kathleen; McCreery, Michael P.; Guerard, Jessika

    2017-01-01

    In 1991, Bracken and Barona wrote an article for "School Psychology International" focusing on state of the art procedures for translating and using tests across multiple languages. Considerable progress has been achieved in this area over the 25 years between that publication and today. This article seeks to provide a more current set…

  7. In Situ Training for Increasing Head Start After-Care Teachers' Use of Praise

    ERIC Educational Resources Information Center

    LaBrot, Zachary C.; Pasqua, Jamie L.; Dufrene, Brad A.; Brewer, Elizabeth Ann; Goff, Brian

    2016-01-01

    This study tested the effects of the direct behavioral consultation in situ training procedure for increasing Head Start teachers' praise during an after-school program. Participants included four Head Start teachers in one Head Start center. A multiple baseline design across teachers was employed to test the effects of in situ training on…

  8. PROMISE: a tool to identify genomic features with a specific biologically interesting pattern of associations with multiple endpoint variables.

    PubMed

    Pounds, Stan; Cheng, Cheng; Cao, Xueyuan; Crews, Kristine R; Plunkett, William; Gandhi, Varsha; Rubnitz, Jeffrey; Ribeiro, Raul C; Downing, James R; Lamba, Jatinder

    2009-08-15

    In some applications, prior biological knowledge can be used to define a specific pattern of association of multiple endpoint variables with a genomic variable that is biologically most interesting. However, to our knowledge, there is no statistical procedure designed to detect specific patterns of association with multiple endpoint variables. Projection onto the most interesting statistical evidence (PROMISE) is proposed as a general procedure to identify genomic variables that exhibit a specific biologically interesting pattern of association with multiple endpoint variables. Biological knowledge of the endpoint variables is used to define a vector that represents the biologically most interesting values for statistics that characterize the associations of the endpoint variables with a genomic variable. A test statistic is defined as the dot-product of the vector of the observed association statistics and the vector of the most interesting values of the association statistics. By definition, this test statistic is proportional to the length of the projection of the observed vector of correlations onto the vector of most interesting associations. Statistical significance is determined via permutation. In simulation studies and an example application, PROMISE shows greater statistical power to identify genes with the interesting pattern of associations than classical multivariate procedures, individual endpoint analyses or listing genes that have the pattern of interest and are significant in more than one individual endpoint analysis. Documented R routines are freely available from www.stjuderesearch.org/depts/biostats and will soon be available as a Bioconductor package from www.bioconductor.org.
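
    The projection idea can be sketched compactly: per-endpoint association statistics are collected into a vector, projected onto a prespecified "most interesting" pattern via a dot product, and calibrated by permutation. The data, the use of Pearson correlations, and the interest vector below are illustrative assumptions, not the published tool's choices for different endpoint types.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative data: one genomic variable and three endpoint variables for
# 40 subjects.  The "interesting" pattern says the genomic variable should be
# positively associated with endpoints 1 and 2 and negatively with endpoint 3.
n = 40
genomic = rng.normal(size=n)
endpoints = np.column_stack([
    0.5 * genomic + rng.normal(size=n),
    0.4 * genomic + rng.normal(size=n),
   -0.3 * genomic + rng.normal(size=n),
])
interest = np.array([1.0, 1.0, -1.0])   # biologically most interesting pattern

def promise_stat(g, E, v):
    """Dot product of per-endpoint correlations with the interest vector."""
    corr = np.array([np.corrcoef(g, E[:, j])[0, 1] for j in range(E.shape[1])])
    return corr @ v

t_obs = promise_stat(genomic, endpoints, interest)

# Permutation null: shuffle the genomic variable across subjects, which breaks
# its association with every endpoint while preserving the endpoint structure.
B = 5000
t_perm = np.array([promise_stat(rng.permutation(genomic), endpoints, interest)
                   for _ in range(B)])
p_value = (1 + np.sum(t_perm >= t_obs)) / (1 + B)
print(f"PROMISE-style statistic = {t_obs:.2f}, permutation p = {p_value:.4f}")
```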

  9. Rapid diagnosis of tinea incognito using handheld reflectance confocal microscopy: a paradigm shift in dermatology?

    PubMed

    Navarrete-Dechent, Cristián; Bajaj, Shirin; Marghoob, Ashfaq A; Marchetti, Michael A

    2015-06-01

    Dermatophytoses are common skin infections. Traditional diagnostic tests such as skin scrapings for light microscopy examination, fungal cultures and biopsies remain imperfect due to false-negative test results, cost, time required to perform the procedure, time delays in test results and/or a requirement for an invasive procedure. Herein, we present a case of an 80-year-old female whose tinea incognito was non-invasively diagnosed within seconds using handheld reflectance confocal microscopy (RCM). As non-invasive skin imaging continues to improve, we expect light-based office microscopy to be replaced with technologies such as RCM, which has multiple and continually expanding diagnostic applications. © 2015 Blackwell Verlag GmbH.

  10. Multiple objective optimization in reliability demonstration test

    DOE PAGES

    Lu, Lu; Anderson-Cook, Christine Michaela; Li, Mingyang

    2016-10-01

    Reliability demonstration tests are usually performed in product design or validation processes to demonstrate whether a product meets specified requirements on reliability. For binomial demonstration tests, the zero-failure test has been most commonly used due to its simplicity and use of minimum sample size to achieve an acceptable consumer’s risk level. However, this test can often result in unacceptably high risk for producers as well as a low probability of passing the test even when the product has good reliability. This paper explicitly explores the interrelationship between multiple objectives that are commonly of interest when planning a demonstration test and proposes structured decision-making procedures using a Pareto front approach for selecting an optimal test plan based on simultaneously balancing multiple criteria. Different strategies are suggested for scenarios with different user priorities and graphical tools are developed to help quantify the trade-offs between choices and to facilitate informed decision making. As a result, potential impacts of some subjective user inputs on the final decision are studied to offer insights and useful guidance for general applications.
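
    The trade-off among sample size, consumer's risk, and producer's risk for binomial demonstration plans can be sketched directly: enumerate candidate (n, c) plans, compute both risks from the binomial distribution, and keep the non-dominated plans. The reliability levels and plan grid below are illustrative assumptions, not the paper's criteria set or decision strategies.

```python
import numpy as np
from itertools import product
from scipy.stats import binom

# Candidate binomial demonstration test plans: test n units, pass if the number
# of observed failures is at most c.  Reliability levels below are illustrative.
p_bad  = 0.10   # unacceptable failure probability (consumer's concern)
p_good = 0.02   # failure probability of a genuinely good product (producer's concern)

plans = []
for n, c in product(range(10, 201, 10), range(0, 4)):
    consumer_risk = binom.cdf(c, n, p_bad)        # P(pass | product is bad)
    producer_risk = 1.0 - binom.cdf(c, n, p_good) # P(fail | product is good)
    plans.append((n, c, consumer_risk, producer_risk))

def dominated(a, b):
    """Plan a is dominated by b if b is no worse on all criteria and better on one."""
    ka, kb = (a[0], a[2], a[3]), (b[0], b[2], b[3])   # minimize n and both risks
    return all(x >= y for x, y in zip(ka, kb)) and any(x > y for x, y in zip(ka, kb))

pareto = [a for a in plans if not any(dominated(a, b) for b in plans)]
for n, c, cr, pr in sorted(pareto)[:10]:
    print(f"n={n:3d}  c={c}  consumer risk={cr:.3f}  producer risk={pr:.3f}")
```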

  11. 40 CFR Appendix A to Subpart Dddd... - Alternative Procedure To Determine Capture Efficiency From Enclosures Around Hot Presses in the...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... .2 Response Time Test. Conduct this test once prior to each test series. Introduce zero gas into the ... analysis. 3.0 Definitions. 3.1 Capture efficiency (CE). The weight per unit time of SF6 entering the control device divided by the weight per unit time of SF6 released through manifolds at multiple locations within...

  12. 40 CFR Appendix A to Subpart Dddd... - Alternative Procedure To Determine Capture Efficiency From Enclosures Around Hot Presses in the...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... .2 Response Time Test. Conduct this test once prior to each test series. Introduce zero gas into the ... analysis. 3.0 Definitions. 3.1 Capture efficiency (CE). The weight per unit time of SF6 entering the control device divided by the weight per unit time of SF6 released through manifolds at multiple locations within...

  13. 40 CFR Appendix A to Subpart Dddd... - Alternative Procedure To Determine Capture Efficiency From Enclosures Around Hot Presses in the...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... .2 Response Time Test. Conduct this test once prior to each test series. Introduce zero gas into the ... analysis. 3.0 Definitions. 3.1 Capture efficiency (CE). The weight per unit time of SF6 entering the control device divided by the weight per unit time of SF6 released through manifolds at multiple locations within...

  14. 40 CFR Appendix A to Subpart Dddd... - Alternative Procedure To Determine Capture Efficiency From Enclosures Around Hot Presses in the...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... .2 Response Time Test. Conduct this test once prior to each test series. Introduce zero gas into the ... analysis. 3.0 Definitions. 3.1 Capture efficiency (CE). The weight per unit time of SF6 entering the control device divided by the weight per unit time of SF6 released through manifolds at multiple locations within...

  15. The impact of justice climate and justice orientation on work outcomes: a cross-level multifoci framework.

    PubMed

    Liao, Hui; Rupp, Deborah E

    2005-03-01

    In this article, which takes a person-situation approach, the authors propose and test a cross-level multifoci model of workplace justice. They crossed 3 types of justice (procedural, informational, and interpersonal) with 2 foci (organization and supervisor) and aggregated to the group level to create 6 distinct justice climate variables. They then tested for the effects of these variables on either organization-directed or supervisor-directed commitment, satisfaction, and citizenship behavior. The authors also tested justice orientation as a moderator of these relationships. The results, based on 231 employees constituting 44 work groups representing multiple organizations and occupations, revealed that 4 forms of justice climate (organization-focused procedural and informational justice climate and supervisor-focused procedural and interpersonal justice climate) were significantly related to various work outcomes after controlling for corresponding individual-level justice perceptions. In addition, some moderation effects were found. Implications for organizations and future research are discussed.

  16. False discovery rate control incorporating phylogenetic tree increases detection power in microbiome-wide multiple testing.

    PubMed

    Xiao, Jian; Cao, Hongyuan; Chen, Jun

    2017-09-15

    Next generation sequencing technologies have enabled the study of the human microbiome through direct sequencing of microbial DNA, resulting in an enormous amount of microbiome sequencing data. One unique characteristic of microbiome data is the phylogenetic tree that relates all the bacterial species. Closely related bacterial species have a tendency to exhibit a similar relationship with the environment or disease. Thus, incorporating the phylogenetic tree information can potentially improve the detection power for microbiome-wide association studies, where hundreds or thousands of tests are conducted simultaneously to identify bacterial species associated with a phenotype of interest. Despite much progress in multiple testing procedures such as false discovery rate (FDR) control, methods that take into account the phylogenetic tree are largely limited. We propose a new FDR control procedure that incorporates the prior structure information and apply it to microbiome data. The proposed procedure is based on a hierarchical model, where a structure-based prior distribution is designed to utilize the phylogenetic tree. By borrowing information from neighboring bacterial species, we are able to improve the statistical power of detecting associated bacterial species while controlling the FDR at desired levels. When the phylogenetic tree is mis-specified or non-informative, our procedure achieves a similar power as traditional procedures that do not take into account the tree structure. We demonstrate the performance of our method through extensive simulations and real microbiome datasets. We identified far more alcohol-drinking associated bacterial species than traditional methods. R package StructFDR is available from CRAN. chen.jun2@mayo.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  17. Effect of intraoperative analgesia on children's pain perception during recovery after painful dental procedures performed under general anaesthesia.

    PubMed

    El Batawi, H Y

    2015-02-01

    To investigate the possible effect of intraoperative analgesia, namely diclofenac sodium compared with acetaminophen, on post-recovery pain perception in children undergoing painful dental procedures under general anaesthesia. A double-blind randomised clinical trial. A sample of 180 consecutive cases of children undergoing full dental rehabilitation under general anaesthesia in a private hospital in Saudi Arabia during 2013 was divided into three groups (60 children each) according to the analgesic used prior to extubation. In Group A, children received a diclofenac sodium suppository; in Group B, children received an acetaminophen suppository; and Group C was the control group. Using an authenticated Arabic version of the Wong-Baker FACES Pain Rating Scale, patients were asked to choose the face that best matched the pain they were suffering. Data were collected and recorded for statistical analysis. Student's t test was used for comparison of sample means. A preliminary F test to compare sample variances was carried out to determine the appropriate t test variant to be used. A p value less than 0.05 was considered significant. More than 93% of children had post-operative pain in varying degrees. A highly significant difference was observed between children in groups A and B and those in control group C, with the latter scoring higher pain perception. Diclofenac showed higher potency for multiple painful procedures, while the difference was not statistically significant in children with three or fewer painful dental procedures. Diclofenac sodium is more potent than acetaminophen, especially for multiple pain-provoking or traumatic procedures. Timely use of NSAID analgesia just before extubation helps provide adequate coverage during recovery. Peri-operative analgesia is recommended as an essential treatment adjunct for child dental rehabilitation under general anaesthesia.

  18. Salmonella testing of pooled pre-enrichment broth cultures for screening multiple food samples.

    PubMed

    Price, W R; Olsen, R A; Hunter, J E

    1972-04-01

    A method has been described for testing multiple food samples for Salmonella without loss in sensitivity. The method pools multiple pre-enrichment broth cultures into single enrichment broths. The subsequent stages of the Salmonella analysis are not altered. The method was found applicable to several dry food materials including nonfat dry milk, dried egg albumin, cocoa, cottonseed flour, wheat flour, and shredded coconut. As many as 25 pre-enrichment broth cultures were pooled without apparent loss in the sensitivity of Salmonella detection as compared to individual sample analysis. The procedure offers a simple, yet effective, way to increase sample capacity in the Salmonella testing of foods, particularly where a large proportion of samples ordinarily is negative. It also permits small portions of pre-enrichment broth cultures to be retained for subsequent individual analysis if positive tests are found. Salmonella testing of pooled pre-enrichment broths provides increased consumer protection for a given amount of analytical effort as compared to individual sample analysis.
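
    The saving from pooling mostly-negative samples follows from a standard two-stage (Dorfman) group-testing calculation, sketched below under the assumption that a positive pool triggers individual re-analysis of its members and that samples are independent; the pool sizes and prevalences are illustrative, not values from the study.

```python
def expected_tests_per_sample(pool_size: int, prevalence: float) -> float:
    """Expected number of analyses per food sample under two-stage pooling.

    One pooled enrichment is run per pool; if it is positive, every sample in
    the pool is analysed individually (classic Dorfman group testing).
    """
    p_pool_positive = 1.0 - (1.0 - prevalence) ** pool_size
    return 1.0 / pool_size + p_pool_positive

# Illustrative comparison for a mostly-negative sample stream.
for k in (5, 10, 25):
    for prev in (0.001, 0.01, 0.05):
        e = expected_tests_per_sample(k, prev)
        print(f"pool size {k:2d}, prevalence {prev:.3f}: "
              f"{e:.3f} tests/sample ({100 * (1 - e):.0f}% fewer than individual testing)")
```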

  19. Load rating and retrofit testing of bridge timber piles subjected to eccentric loading.

    DOT National Transportation Integrated Search

    2012-11-01

    This report first evaluated the load rating procedure currently in use by the Illinois Department of Transportation (IDOT) for rating timber : piles supporting multiple-span, simply supported bridges. For simplicity, these piles are often rated under...

  20. Markov chains for testing redundant software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1988-01-01

    A preliminary design for a validation experiment has been developed that addresses several problems unique to assuring the extremely high quality of multiple-version programs in process-control software. The procedure uses Markov chains to model the error states of the multiple version programs. The programs are observed during simulated process-control testing, and estimates are obtained for the transition probabilities between the states of the Markov chain. The experimental Markov chain model is then expanded into a reliability model that takes into account the inertia of the system being controlled. The reliability of the multiple version software is computed from this reliability model at a given confidence level using confidence intervals obtained for the transition probabilities during the experiment. An example demonstrating the method is provided.
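
    Estimating transition probabilities from an observed error-state sequence is the central counting step described here; a minimal sketch is given below with a three-state chain, maximum likelihood (relative-frequency) estimates, and normal-approximation confidence half-widths. The states, transition matrix, and confidence method are illustrative assumptions, not the experiment's reliability model.

```python
import numpy as np

def estimate_transition_matrix(states, n_states, z=1.96):
    """MLE of transition probabilities from one observed state sequence,
    with normal-approximation confidence half-widths for each entry."""
    counts = np.zeros((n_states, n_states))
    for s, t in zip(states[:-1], states[1:]):
        counts[s, t] += 1
    row_totals = counts.sum(axis=1, keepdims=True)
    p_hat = np.divide(counts, row_totals, out=np.zeros_like(counts),
                      where=row_totals > 0)
    half_width = z * np.sqrt(np.divide(p_hat * (1 - p_hat), row_totals,
                                       out=np.zeros_like(counts),
                                       where=row_totals > 0))
    return p_hat, half_width

# Illustrative error-state sequence from simulated process-control testing
# (0 = all versions agree, 1 = one version in error, 2 = disagreement/failure).
rng = np.random.default_rng(3)
true_P = np.array([[0.95, 0.04, 0.01],
                   [0.70, 0.25, 0.05],
                   [0.50, 0.10, 0.40]])
seq = [0]
for _ in range(5000):
    seq.append(rng.choice(3, p=true_P[seq[-1]]))

p_hat, hw = estimate_transition_matrix(seq, n_states=3)
print("estimated transition matrix:\n", np.round(p_hat, 3))
print("95% CI half-widths:\n", np.round(hw, 3))
```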

  1. Recommendations for Evaluating Multiple Filters in Ballast Water Management Systems for US Type Approval

    DTIC Science & Technology

    2016-01-01

    is extremely unlikely to be practicable. A second approach is to conduct a full suite of TA testing on a BWMS with a “base filter” configuration ... that of full TA testing. Here, three land-based tests would be conducted, and O&M and component testing would also occur. If time or practicality ...

  2. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM 2.5 and PM −2.5.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... section. All reference method samplers shall be of single-filter design (not multi-filter, sequential sample design). Each candidate method shall be set up and operated in accordance with its associated... precision specified in table C-4 of this subpart. (g) Test for additive and multiplicative bias (comparative...

  3. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM 2.5 and PM −2.5.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... section. All reference method samplers shall be of single-filter design (not multi-filter, sequential sample design). Each candidate method shall be set up and operated in accordance with its associated... precision specified in table C-4 of this subpart. (g) Test for additive and multiplicative bias (comparative...

  4. Imputation of Test Scores in the National Education Longitudinal Study of 1988 (NELS:88). Working Paper Series.

    ERIC Educational Resources Information Center

    Bokossa, Maxime C.; Huang, Gary G.

    This report describes the imputation procedures used to deal with missing data in the National Education Longitudinal Study of 1988 (NELS:88), the only current National Center for Education Statistics (NCES) dataset that contains scores from cognitive tests given the same set of students at multiple time points. As is inevitable, cognitive test…

  5. Testing accelerometer rectification error caused by multidimensional composite inputs with double turntable centrifuge.

    PubMed

    Guan, W; Meng, X F; Dong, X M

    2014-12-01

    Rectification error is a critical characteristic of inertial accelerometers. Accelerometers in operational situations are stimulated by composite inputs, including constant acceleration and vibration, from multiple directions. However, traditional methods for evaluating rectification error use only one-dimensional vibration. In this paper, a double turntable centrifuge (DTC) was utilized to produce constant acceleration and vibration simultaneously, and we tested the rectification error due to the composite accelerations. First, we derived an expression for the rectification error from the output of the DTC and a static model of the single-axis pendulous accelerometer under test. Theoretical investigation and analysis were carried out in accordance with the rectification error model. A detailed experimental procedure and test results are then described. We measured the rectification error for various constant accelerations at different frequencies and amplitudes of vibration. The experimental results showed the distinctive characteristics of the rectification error caused by the composite accelerations, and the linear relation between the constant acceleration and the rectification error was confirmed. The experimental procedure and results presented here can be used as a reference for investigating the characteristics of accelerometers with multiple inputs.

  6. Multiplicity: discussion points from the Statisticians in the Pharmaceutical Industry multiplicity expert group.

    PubMed

    Phillips, Alan; Fletcher, Chrissie; Atkinson, Gary; Channon, Eddie; Douiri, Abdel; Jaki, Thomas; Maca, Jeff; Morgan, David; Roger, James Henry; Terrill, Paul

    2013-01-01

    In May 2012, the Committee of Health and Medicinal Products issued a concept paper on the need to review the points to consider document on multiplicity issues in clinical trials. In preparation for the release of the updated guidance document, Statisticians in the Pharmaceutical Industry held a one-day expert group meeting in January 2013. Topics debated included multiplicity and the drug development process, the usefulness and limitations of newly developed strategies to deal with multiplicity, multiplicity issues arising from interim decisions and multiregional development, and the need for simultaneous confidence intervals (CIs) corresponding to multiple test procedures. A clear message from the meeting was that multiplicity adjustments need to be considered when the intention is to make a formal statement about efficacy or safety based on hypothesis tests. Statisticians have a key role when designing studies to assess what adjustment really means in the context of the research being conducted. More thought during the planning phase needs to be given to multiplicity adjustments for secondary endpoints given these are increasing in importance in differentiating products in the market place. No consensus was reached on the role of simultaneous CIs in the context of superiority trials. It was argued that unadjusted intervals should be employed as the primary purpose of the intervals is estimation, while the purpose of hypothesis testing is to formally establish an effect. The opposing view was that CIs should correspond to the test decision whenever possible. Copyright © 2013 John Wiley & Sons, Ltd.

  7. FDR doesn't Tell the Whole Story: Joint Influence of Effect Size and Covariance Structure on the Distribution of the False Discovery Proportions

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Ploutz-Snyder, Robert; Fiedler, James

    2011-01-01

    As part of a 2009 Annals of Statistics paper, Gavrilov, Benjamini, and Sarkar report results of simulations that estimated the false discovery rate (FDR) for equally correlated test statistics using a well-known multiple-test procedure. In our study we estimate the distribution of the false discovery proportion (FDP) for the same procedure under a variety of correlation structures among multiple dependent variables in a MANOVA context. Specifically, we study the mean (the FDR), skewness, kurtosis, and percentiles of the FDP distribution in the case of multiple comparisons that give rise to correlated non-central t-statistics when results at several time periods are being compared to baseline. Even if the FDR achieves its nominal value, other aspects of the distribution of the FDP depend on the interaction between signed effect sizes and correlations among variables, proportion of true nulls, and number of dependent variables. We show examples where the mean FDP (the FDR) is 10% as designed, yet there is a surprising probability of having 30% or more false discoveries. Thus, in a real experiment, the proportion of false discoveries could be quite different from the stipulated FDR.
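
    The phenomenon described here is easy to reproduce in a small simulation: generate equicorrelated test statistics with a block of true signals, apply an FDR step-up procedure, and look at the whole distribution of the false discovery proportion rather than just its mean. The sketch below uses the Benjamini-Hochberg step-up as a stand-in for the procedure studied in the paper, and all simulation settings are illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)

def benjamini_hochberg(pvals, q):
    """Indices rejected by the Benjamini-Hochberg step-up procedure at level q."""
    m = pvals.size
    order = np.argsort(pvals)
    below = pvals[order] <= q * np.arange(1, m + 1) / m
    if not below.any():
        return np.array([], dtype=int)
    k = np.nonzero(below)[0].max()
    return order[:k + 1]

m, m_true_null, rho, effect, q = 100, 80, 0.5, 3.0, 0.10
n_reps = 2000
fdp = np.empty(n_reps)

for r in range(n_reps):
    # Equicorrelated standard normals: common factor plus independent noise.
    common = rng.normal()
    z = np.sqrt(rho) * common + np.sqrt(1 - rho) * rng.normal(size=m)
    z[m_true_null:] += effect                 # non-null coordinates get a shift
    pvals = 2 * norm.sf(np.abs(z))
    rejected = benjamini_hochberg(pvals, q)
    false_rej = np.sum(rejected < m_true_null)
    fdp[r] = false_rej / max(len(rejected), 1)

print(f"mean FDP (empirical FDR): {fdp.mean():.3f}")
print(f"P(FDP >= 0.30):           {np.mean(fdp >= 0.30):.3f}")
```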

  8. Study of the mapping of Navier-Stokes algorithms onto multiple-instruction/multiple-data-stream computers

    NASA Technical Reports Server (NTRS)

    Eberhardt, D. S.; Baganoff, D.; Stevens, K.

    1984-01-01

    Implicit approximate-factored algorithms have certain properties that are suitable for parallel processing. A particular computational fluid dynamics (CFD) code, using this algorithm, is mapped onto a multiple-instruction/multiple-data-stream (MIMD) computer architecture. An explanation of this mapping procedure is presented, as well as some of the difficulties encountered when trying to run the code concurrently. Timing results are given for runs on the Ames Research Center's MIMD test facility which consists of two VAX 11/780's with a common MA780 multi-ported memory. Speedups exceeding 1.9 for characteristic CFD runs were indicated by the timing results.

  9. A chimera grid scheme. [multiple overset body-conforming mesh system for finite difference adaptation to complex aircraft configurations

    NASA Technical Reports Server (NTRS)

    Steger, J. L.; Dougherty, F. C.; Benek, J. A.

    1983-01-01

    A mesh system composed of multiple overset body-conforming grids is described for adapting finite-difference procedures to complex aircraft configurations. In this so-called 'chimera mesh,' a major grid is generated about a main component of the configuration and overset minor grids are used to resolve all other features. Methods for connecting overset multiple grids and modifications of flow-simulation algorithms are discussed. Computational tests in two dimensions indicate that the use of multiple overset grids can simplify the task of grid generation without an adverse effect on flow-field algorithms and computer code complexity.

  10. A two-stage design for multiple testing in large-scale association studies.

    PubMed

    Wen, Shu-Hui; Tzeng, Jung-Ying; Kao, Jau-Tsuen; Hsiao, Chuhsing Kate

    2006-01-01

    Modern association studies often involve a large number of markers and hence may encounter the problem of testing multiple hypotheses. Traditional procedures are usually over-conservative and with low power to detect mild genetic effects. From the design perspective, we propose a two-stage selection procedure to address this concern. Our main principle is to reduce the total number of tests by removing clearly unassociated markers in the first-stage test. Next, conditional on the findings of the first stage, which uses a less stringent nominal level, a more conservative test is conducted in the second stage using the augmented data and the data from the first stage. Previous studies have suggested using independent samples to avoid inflated errors. However, we found that, after accounting for the dependence between these two samples, the true discovery rate increases substantially. In addition, the cost of genotyping can be greatly reduced via this approach. Results from a study of hypertriglyceridemia and simulations suggest the two-stage method has a higher overall true positive rate (TPR) with a controlled overall false positive rate (FPR) when compared with single-stage approaches. We also report the analytical form of its overall FPR, which may be useful in guiding study design to achieve a high TPR while retaining the desired FPR.
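
    A stylized sketch of a two-stage screen-then-confirm design: stage 1 removes clearly unassociated markers at a lenient level, and stage 2 combines the evidence from both stages for the surviving markers under a stricter correction. The effect sizes, sample sizes, and thresholds are hypothetical, and the sketch does not reproduce the paper's error-rate calculations, which account for the dependence between the two stages.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

m, m_assoc = 500, 20          # markers in total / truly associated markers
n1, n2 = 200, 200             # stage-1 and stage-2 sample sizes
effect = 0.25                 # standardised per-marker effect size
alpha1, alpha2 = 0.10, 0.05   # lenient screening level, final level

# z statistics for each stage; associated markers get a mean shift of
# effect * sqrt(n).  This is a stylised stand-in for marker-trait tests.
mu = np.zeros(m)
mu[:m_assoc] = effect
z1 = rng.normal(mu * np.sqrt(n1), 1.0)
z2 = rng.normal(mu * np.sqrt(n2), 1.0)

# Stage 1: keep markers passing a lenient per-marker test.
keep = np.abs(z1) > norm.ppf(1 - alpha1 / 2)

# Stage 2: combine the two stages' evidence for the kept markers (weighted by
# sample size) and apply a Bonferroni correction over the kept markers only.
z_comb = (np.sqrt(n1) * z1 + np.sqrt(n2) * z2) / np.sqrt(n1 + n2)
crit = norm.ppf(1 - alpha2 / (2 * max(keep.sum(), 1)))
declared = keep & (np.abs(z_comb) > crit)

tpr = declared[:m_assoc].mean()
fpr = declared[m_assoc:].mean()
print(f"kept after stage 1: {keep.sum()}, TPR: {tpr:.2f}, FPR: {fpr:.4f}")
```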

  11. Non-parametric combination and related permutation tests for neuroimaging.

    PubMed

    Winkler, Anderson M; Webster, Matthew A; Brooks, Jonathan C; Tracey, Irene; Smith, Stephen M; Nichols, Thomas E

    2016-04-01

    In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well-known definition of union-intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume-based representations of the brain, including non-imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non-parametric combination (NPC) methodology, such that instead of a two-phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one-way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
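
    The synchronized-permutation and combination logic can be sketched for two modalities measured on the same subjects: the same relabelling is applied to every modality, partial p-values are read off the joint permutation distribution, and a Fisher combination of them is calibrated against that same distribution. The data, group sizes, and choice of combining function below are illustrative assumptions, not the NPC modification proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Illustrative data: two "modalities" measured on the same 30 subjects,
# 15 per group; both carry a modest group effect.
n_per_group = 15
labels = np.array([0] * n_per_group + [1] * n_per_group)
mod1 = rng.normal(size=2 * n_per_group) + 0.8 * labels
mod2 = rng.normal(size=2 * n_per_group) + 0.5 * labels

def group_diff(y, lab):
    return y[lab == 1].mean() - y[lab == 0].mean()

# Synchronized permutations: the SAME relabelling is applied to both modalities.
B = 2000
stats = np.empty((B + 1, 2))
stats[0] = [group_diff(mod1, labels), group_diff(mod2, labels)]
for b in range(1, B + 1):
    perm = rng.permutation(labels)
    stats[b] = [group_diff(mod1, perm), group_diff(mod2, perm)]

# Partial (one-sided) p-values for every permutation, including the observed
# one, then Fisher combining; the observed combined value is referred to the
# same permutation distribution, so no distributional assumptions are needed.
ranks = (stats[None, :, :] >= stats[:, None, :]).mean(axis=1)
combined = -2 * np.log(ranks).sum(axis=1)          # Fisher combination
p_joint = np.mean(combined >= combined[0])
print(f"joint NPC-style p-value: {p_joint:.4f}")
```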

  12. Simple and inexpensive microfluidic devices for the generation of monodisperse multiple emulsions

    NASA Astrophysics Data System (ADS)

    Li, Er Qiang; Zhang, Jia Ming; Thoroddsen, Sigurdur T.

    2014-01-01

    Droplet-based microfluidic devices have become a preferred versatile platform for various fields in physics, chemistry and biology. Polydimethylsiloxane soft lithography, the mainstay for fabricating microfluidic devices, usually requires the usage of expensive apparatus and a complex manufacturing procedure. Here, we report the design and fabrication of simple and inexpensive microfluidic devices based on microscope glass slides and pulled glass capillaries, for generating monodisperse multiple emulsions. The advantages of our method lie in a simple manufacturing procedure, inexpensive processing equipment and flexibility in the surface modification of the designed microfluidic devices. Different types of devices have been designed and tested and the experimental results demonstrated their robustness for preparing monodisperse single, double, triple and multi-component emulsions.

  13. PROMISE: a tool to identify genomic features with a specific biologically interesting pattern of associations with multiple endpoint variables

    PubMed Central

    Pounds, Stan; Cheng, Cheng; Cao, Xueyuan; Crews, Kristine R.; Plunkett, William; Gandhi, Varsha; Rubnitz, Jeffrey; Ribeiro, Raul C.; Downing, James R.; Lamba, Jatinder

    2009-01-01

    Motivation: In some applications, prior biological knowledge can be used to define a specific pattern of association of multiple endpoint variables with a genomic variable that is biologically most interesting. However, to our knowledge, there is no statistical procedure designed to detect specific patterns of association with multiple endpoint variables. Results: Projection onto the most interesting statistical evidence (PROMISE) is proposed as a general procedure to identify genomic variables that exhibit a specific biologically interesting pattern of association with multiple endpoint variables. Biological knowledge of the endpoint variables is used to define a vector that represents the biologically most interesting values for statistics that characterize the associations of the endpoint variables with a genomic variable. A test statistic is defined as the dot-product of the vector of the observed association statistics and the vector of the most interesting values of the association statistics. By definition, this test statistic is proportional to the length of the projection of the observed vector of correlations onto the vector of most interesting associations. Statistical significance is determined via permutation. In simulation studies and an example application, PROMISE shows greater statistical power to identify genes with the interesting pattern of associations than classical multivariate procedures, individual endpoint analyses or listing genes that have the pattern of interest and are significant in more than one individual endpoint analysis. Availability: Documented R routines are freely available from www.stjuderesearch.org/depts/biostats and will soon be available as a Bioconductor package from www.bioconductor.org. Contact: stanley.pounds@stjude.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19528086

  14. A Study of Platelet Inhibition, Using a 'Point of Care' Platelet Function Test, following Primary Percutaneous Coronary Intervention for ST-Elevation Myocardial Infarction [PINPOINT-PPCI].

    PubMed

    Johnson, Thomas W; Mumford, Andrew D; Scott, Lauren J; Mundell, Stuart; Butler, Mark; Strange, Julian W; Rogers, Chris A; Reeves, Barnaby C; Baumbach, Andreas

    2015-01-01

    Rapid coronary recanalization following ST-elevation myocardial infarction (STEMI) requires effective anti-platelet and anti-thrombotic therapies. This study tested the impact of door to end of procedure ('door-to-end') time and baseline platelet activity on platelet inhibition within 24 hours post-STEMI. 108 patients, treated with prasugrel and procedural bivalirudin, underwent Multiplate® platelet function testing at baseline and at 0, 1, 2, and 24 hours post-procedure. Major adverse cardiac events (MACE), bleeding and stent thrombosis (ST) were recorded. Baseline ADP activity was high (88.3 U [71.8-109.0]); procedural time and consequently bivalirudin infusion duration were short (median door-to-end time 55 minutes [40-70] and infusion duration 30 minutes [20-42]). Baseline ADP was observed to influence all subsequent measurements of ADP activity, whereas door-to-end time only influenced ADP immediately post-procedure. High residual platelet reactivity (HRPR, ADP > 46.8 U) was observed in 75% of patients immediately post-procedure and persisted in 24% of patients at 2 hours. Five patients suffered in-hospital MACE (4.6%). Acute ST occurred in 4 patients, all were <120 minutes post-procedure and had HRPR. No significant bleeding was observed. In a post-hoc analysis, pre-procedural morphine use was associated with significantly higher ADP activity following intervention. Baseline platelet function, time to STEMI treatment and opiate use all significantly influence immediate post-procedural platelet activity.

  15. Identifying Gifted Students: A Practical Guide

    ERIC Educational Resources Information Center

    Johnsen, S., Ed.

    2004-01-01

    This user-friendly guide offers advice and insight on developing defensible identification procedures and services for gifted and talented students. Special attention is given to the use of multiple methods including qualitative and quantitative assessments such as standardized measures (e.g. intelligence, aptitude, and achievement tests),…

  16. Visual Habituation Paradigm with Adults with Profound Intellectual and Multiple Disabilities: A New Way for Cognitive Assessment?

    ERIC Educational Resources Information Center

    Chard, Melissa; Roulin, Jean-Luc; Bouvard, Martine

    2014-01-01

    Background: The use of common psychological assessment tools is invalidated with persons with PIMD. The aim of this study was to test the feasibility of using a visual habituation procedure with a group of adults with PIMD, to develop a new theoretical and practical framework for the assessment of cognitive abilities. Methods: To test the…

  17. Effects of Multiple Contexts and Context Similarity on the Renewal of Extinguished Conditioned Behaviour in an ABA Design with Humans

    ERIC Educational Resources Information Center

    Balooch, Siavash Bandarian; Neumann, David L.

    2011-01-01

    The ABA renewal procedure involves pairing a conditional stimulus (CS) and an unconditional stimulus (US) in one context (A), presenting extinction trials of the CS alone in a second context (B), and nonreinforced test trials of the CS in the acquisition context (A). The renewal of extinguished conditioned behaviour is observed during test. The…

  18. A Technical Description of the Procedures Used in Calculating School-Level Scaled Scores for the "Survey of Basic Skills: Grade 6."

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Bock, R. Darrell

    New legislation in 1972 shifted the emphasis of the California Assessment Program (CAP) from traditional every pupil achievement testing to a more efficient multiple-matrix testing design, under which a broad spectrum of skills could be surveyed without undue expenditure of educational resources. Scale score reporting was introduced to the grade 6…

  19. [Mechanical properties of nickel-titanium files following multiple heat sterilizations].

    PubMed

    Testarelli, L; Gallottini, L; Gambarini, G

    2003-04-01

    The effect of repeated sterilization procedures on nickel-titanium (NiTi) endodontic instruments is a serious concern for practitioners. There is no agreement in the literature on whether these procedures could adversely affect the mechanical properties of endodontic files and, consequently, increase the risk of intracanal failure. The purpose of this study was to evaluate the mechanical resistance of Hero (MicroMega, Besancon, France) instruments before and after sterilization procedures. Thirty new Hero size 30 instruments with 02, 04, and 06 tapers were chosen and divided into 3 groups. Group A (control) was tested according to ANSI/ADA Specification No. 28 for torsional resistance, angle of torque, and angle at breakage (45°). Group B files were first sterilized in a chemiclave for 10 cycles of 20 minutes at 124 °C and then tested as described above. Group C files were first sterilized with glass beads for 10 cycles of 20 seconds at 250 °C and then tested as described above. Data were collected and statistically analyzed (paired t test). Differences among the 3 groups were not statistically significant for both tests. All data were well within the Specification No. 28 standard values. From the results of the present study, we may conclude that repeated sterilization procedures do not adversely affect the mechanical resistance of Hero files.

  20. Analysis of simulated angiographic procedures. Part 2: extracting efficiency data from audio and video recordings.

    PubMed

    Duncan, James R; Kline, Benjamin; Glaiberman, Craig B

    2007-04-01

    To create and test methods of extracting efficiency data from recordings of simulated renal stent procedures. Task analysis was performed and used to design a standardized testing protocol. Five experienced angiographers then performed 16 renal stent simulations using the Simbionix AngioMentor angiographic simulator. Audio and video recordings of these simulations were captured from multiple vantage points. The recordings were synchronized and compiled. A series of efficiency metrics (procedure time, contrast volume, and tool use) were then extracted from the recordings. The intraobserver and interobserver variability of these individual metrics was also assessed. The metrics were converted to costs and aggregated to determine the fixed and variable costs of a procedure segment or the entire procedure. Task analysis and pilot testing led to a standardized testing protocol suitable for performance assessment. Task analysis also identified seven checkpoints that divided the renal stent simulations into six segments. Efficiency metrics for these different segments were extracted from the recordings and showed excellent intra- and interobserver correlations. Analysis of the individual and aggregated efficiency metrics demonstrated large differences between segments as well as between different angiographers. These differences persisted when efficiency was expressed as either total or variable costs. Task analysis facilitated both protocol development and data analysis. Efficiency metrics were readily extracted from recordings of simulated procedures. Aggregating the metrics and dividing the procedure into segments revealed potential insights that could be easily overlooked because the simulator currently does not attempt to aggregate the metrics and only provides data derived from the entire procedure. The data indicate that analysis of simulated angiographic procedures will be a powerful method of assessing performance in interventional radiology.

  1. Experimental Applications of Automatic Test Markup Language (ATML)

    NASA Technical Reports Server (NTRS)

    Lansdowne, Chatwin A.; McCartney, Patrick; Gorringe, Chris

    2012-01-01

    The authors describe challenging use-cases for Automatic Test Markup Language (ATML), and evaluate solutions. The first case uses ATML Test Results to deliver active features to support test procedure development and test flow, and bridging mixed software development environments. The second case examines adding attributes to Systems Modelling Language (SysML) to create a linkage for deriving information from a model to fill in an ATML document set. Both cases are outside the original concept of operations for ATML but are typical when integrating large heterogeneous systems with modular contributions from multiple disciplines.

  2. Research Strategies in Higher Education.

    ERIC Educational Resources Information Center

    Semb, George

    The present paper outlines two alternative strategies for evaluating teaching effectiveness. These are: (1) within-subject reversal designs, and (2) multiple baseline testing procedures. Each design is discussed in terms of its application to research problems in higher education. In reversal designs, the student is exposed to different teaching…

  3. Investigations of the pushability behavior of cardiovascular angiographic catheters.

    PubMed

    Bloss, Peter; Rothe, Wolfgang; Wünsche, Peter; Werner, Christian; Rothe, Alexander; Kneissl, Georg Dieter; Burger, Wolfram; Rehberg, Elisabeth

    2003-01-01

    The placement of angiographic catheters into the vascular system is a routine procedure in modern clinical practice. Objective evaluation protocols based on measurable physical quantities correlated with empirical clinical findings are not yet available, but their definition is of utmost importance for catheter manufacturers for in-house product screening and optimization. In this context, we present an assessment of multiple mechanical and surface catheter properties, such as static and kinetic friction, bending stiffness, microscopic surface topology, surface roughness, and surface free energy, and of their interrelations. The theoretical framework, a description of the experimental methods, and extensive data measured on several different catheters are provided, and a testing procedure is defined in conclusion. Although this procedure is based on the measurement of several physical quantities, it can easily be implemented by commercial laboratories testing catheters, as it relies on relatively low-cost standard methods.

  4. Evaluation of multiple comparison correction procedures in drug assessment studies using LORETA maps.

    PubMed

    Alonso, Joan Francesc; Romero, Sergio; Mañanas, Miguel Ángel; Rojas, Mónica; Riba, Jordi; Barbanoj, Manel José

    2015-10-01

    The identification of the brain regions involved in a drug's neuropharmacological action is a potentially valuable procedure for drug development. These regions are commonly determined from the voxels showing statistically significant differences when placebo-induced effects are compared with drug-elicited effects. LORETA is an electroencephalography (EEG) source imaging technique frequently used to identify brain structures affected by a drug. The aim of the present study was to evaluate different methods for the correction of multiple comparisons in LORETA maps. These methods, which have been commonly used in neuroimaging and in simulation studies, were applied to a real pharmaco-EEG study in which the effects of increasing benzodiazepine doses on the central nervous system, as measured by LORETA, were investigated. Data consisted of EEG recordings obtained from nine volunteers who received single oral doses of alprazolam 0.25, 0.5, and 1 mg, and placebo, in a randomized crossover double-blind design. The identification of active regions was highly dependent on the selected multiple test correction procedure. The combined-criteria approach known as cluster mass was useful in revealing that increasing drug doses led to higher intensity and spread of the pharmacologically induced changes in intracerebral current density.

  5. The Validity of the earth and space science learning materials with orientation on multiple intelligences and character education

    NASA Astrophysics Data System (ADS)

    Liliawati, W.; Utama, J. A.; Ramalis, T. R.; Rochman, A. A.

    2018-03-01

    Validation of the Earth and Space Science learning material, specifically the chapter on the Earth's Protector, based on expert judgment (media and content experts and practitioners) and junior high school students' responses is presented. The data came from the development phase of the 4D method (Define, Design, Develop, Disseminate), which consists of two steps: expert appraisal and developmental testing. The instruments employed were a rubric assessing the suitability of the book contents with respect to multiple intelligences activities, character education, and a standard of book assessment, a questionnaire, and a cloze procedure. The appropriateness of the book contents with respect to multiple intelligences, character education, and the standard of book assessment is in the good category. Meanwhile, students who used the book in their learning process gave a highly positive response; the book was easy to understand. In general, the results of the cloze procedure indicate high readability of the book. We conclude that the book chapter on the Earth's Protector can be used as a learning material that accommodates students' multiple intelligences and character internalization.

  6. Two-dimensional imaging via a narrowband MIMO radar system with two perpendicular linear arrays.

    PubMed

    Wang, Dang-wei; Ma, Xiao-yan; Su, Yi

    2010-05-01

    This paper presents a system model and method for 2-D imaging via a narrowband multiple-input multiple-output (MIMO) radar system with two perpendicular linear arrays. The imaging formulation for our method is developed through Fourier integral processing, and the antenna array parameters, including the cross-range resolution, required size, and sampling interval, are also examined. Unlike the spatial sequential procedure in inverse synthetic aperture radar (ISAR) imaging, which samples the scattered echoes during multiple snapshot illuminations, the proposed method utilizes a spatial parallel procedure to sample the scattered echoes during a single snapshot illumination. Consequently, the complex motion compensation required in ISAR imaging can be avoided. Moreover, in our array configuration, multiple narrowband spectrum-shared waveforms coded with orthogonal polyphase sequences are employed. The mainlobes of the compressed echoes from the different filter bands can be located in the same range bin, and thus the range alignment of classical ISAR imaging is not necessary. Numerical simulations based on synthetic data are provided to test the proposed method.

  7. [Selection of medical students : Measurement of cognitive abilities and psychosocial competencies].

    PubMed

    Schwibbe, Anja; Lackamp, Janina; Knorr, Mirjana; Hissbach, Johanna; Kadmon, Martina; Hampe, Wolfgang

    2018-02-01

    The German Constitutional Court is currently reviewing whether the current medical school admission process is compatible with the constitutional right of freedom of profession, since applicants without an excellent GPA usually have to wait for seven years. If the admission system is changed, politicians would like to increase the influence of psychosocial criteria on selection, as specified by the Masterplan Medizinstudium 2020. What experience has been gained with the current selection procedures? How could situational judgement tests contribute to the validity of future selection procedures at German medical schools? High school GPA is the best predictor of study performance, but it is increasingly under discussion because of the lack of comparability between states and schools and the growing number of applicants with top grades. Aptitude and knowledge tests, especially in the natural sciences, show incremental validity in predicting study performance. The measurement of psychosocial competencies with traditional interviews shows rather low reliability and validity. The more reliable multiple mini-interviews are superior in predicting practical study performance. Situational judgement tests (SJTs) used abroad are regarded as reliable and valid; the correlation of a German SJT piloted in Hamburg with the multiple mini-interview is cautiously encouraging. A model proposed by the Medizinischer Fakultätentag and the Bundesvertretung der Medizinstudierenden takes these results into account. Student selection is proposed to be based on a combination of high school GPA (40%) and a cognitive test (40%), as well as an SJT (10%) and job experience (10%). Furthermore, the faculties still have the option to carry out specific selection procedures.

  8. A survey of variable selection methods in two Chinese epidemiology journals

    PubMed Central

    2010-01-01

    Background Although much has been written on developing better procedures for variable selection, there is little research on how it is practiced in actual studies. This review surveys the variable selection methods reported in two high-ranking Chinese epidemiology journals. Methods Articles published in 2004, 2006, and 2008 in the Chinese Journal of Epidemiology and the Chinese Journal of Preventive Medicine were reviewed. Five categories of methods were identified whereby variables were selected using: A - bivariate analyses; B - multivariable analysis; e.g. stepwise or individual significance testing of model coefficients; C - first bivariate analyses, followed by multivariable analysis; D - bivariate analyses or multivariable analysis; and E - other criteria like prior knowledge or personal judgment. Results Among the 287 articles that reported using variable selection methods, 6%, 26%, 30%, 21%, and 17% were in categories A through E, respectively. One hundred sixty-three studies selected variables using bivariate analyses, 80% (130/163) via multiple significance testing at the 5% alpha-level. Of the 219 multivariable analyses, 97 (44%) used stepwise procedures, 89 (41%) tested individual regression coefficients, but 33 (15%) did not mention how variables were selected. Sixty percent (58/97) of the stepwise routines also did not specify the algorithm and/or significance levels. Conclusions The variable selection methods reported in the two journals were limited in variety, and details were often missing. Many studies still relied on problematic techniques like stepwise procedures and/or multiple testing of bivariate associations at the 0.05 alpha-level. These deficiencies should be rectified to safeguard the scientific validity of articles published in Chinese epidemiology journals. PMID:20920252

  9. Non‐parametric combination and related permutation tests for neuroimaging

    PubMed Central

    Webster, Matthew A.; Brooks, Jonathan C.; Tracey, Irene; Smith, Stephen M.; Nichols, Thomas E.

    2016-01-01

    In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well-known definition of union-intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume-based representations of the brain, including non-imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non-parametric combination (NPC) methodology, such that instead of a two-phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one-way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction. Hum Brain Mapp 37:1486-1511, 2016. © 2016 Wiley Periodicals, Inc. PMID:26848101
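
    As a point of reference for the combining functions evaluated above, the sketch below shows Fisher's and Tippett's combining functions for a set of partial p-values. It is a minimal illustration only: the closed-form null distributions used here assume independent partial tests, whereas the paper relies on synchronized permutations precisely so that no such independence assumption is needed.

```python
import numpy as np
from scipy import stats

def tippett_combine(pvals):
    """Tippett's combining function: based on the smallest partial p-value.
    Under k independent uniform p-values, P(min p <= x) = 1 - (1 - x)**k."""
    pvals = np.asarray(pvals, dtype=float)
    return 1.0 - (1.0 - pvals.min()) ** pvals.size

def fisher_combine(pvals):
    """Fisher's combining function: -2 * sum(log p) ~ chi-square(2k)
    under independence."""
    pvals = np.asarray(pvals, dtype=float)
    stat = -2.0 * np.log(pvals).sum()
    return stats.chi2.sf(stat, df=2 * pvals.size)

# Example: three partial tests (e.g., three modalities) of one joint hypothesis.
partial_p = [0.04, 0.20, 0.008]
print(tippett_combine(partial_p), fisher_combine(partial_p))
```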

  10. Ultrasensitive surveillance of sensors and processes

    DOEpatents

    Wegerich, Stephan W.; Jarman, Kristin K.; Gross, Kenneth C.

    2001-01-01

    A method and apparatus for monitoring a source of data for determining an operating state of a working system. The method includes determining a sensor (or data source) arrangement associated with monitoring the source of data for a system; activating a first method for performing a sequential probability ratio test if the data source includes a single data (sensor) source; activating a second method for performing a regression sequential probability ratio testing procedure if the arrangement includes a pair of sensors (data sources) with signals that are linearly or non-linearly related; activating a third method for performing a bounded angle ratio test procedure if the sensor arrangement includes multiple sensors; and utilizing at least one of the first, second, and third methods to accumulate sensor signals and determine the operating state of the system.

  11. Ultrasensitive surveillance of sensors and processes

    DOEpatents

    Wegerich, Stephan W.; Jarman, Kristin K.; Gross, Kenneth C.

    1999-01-01

    A method and apparatus for monitoring a source of data for determining an operating state of a working system. The method includes determining a sensor (or data source) arrangement associated with monitoring the source of data for a system; activating a first method for performing a sequential probability ratio test if the data source includes a single data (sensor) source; activating a second method for performing a regression sequential probability ratio testing procedure if the arrangement includes a pair of sensors (data sources) with signals that are linearly or non-linearly related; activating a third method for performing a bounded angle ratio test procedure if the sensor arrangement includes multiple sensors; and utilizing at least one of the first, second, and third methods to accumulate sensor signals and determine the operating state of the system.
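
    For context on the first of the three activated methods, the sketch below implements a basic Wald sequential probability ratio test for a drift in a single Gaussian sensor signal. It is an illustrative simplification with made-up parameter values, not the patented surveillance procedure, which additionally selects between regression SPRT and bounded angle ratio tests depending on the sensor arrangement.

```python
import numpy as np

def sprt_gaussian_mean(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald SPRT for H0: mean = mu0 vs H1: mean = mu1 with known sigma.
    Returns a decision string and the number of samples consumed."""
    upper = np.log((1 - beta) / alpha)   # cross -> decide H1 (degraded signal)
    lower = np.log(beta / (1 - alpha))   # cross -> decide H0 (normal operation)
    llr = 0.0
    for i, x in enumerate(samples, start=1):
        # Log-likelihood ratio increment for a Gaussian with known variance.
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2.0) / sigma**2
        if llr >= upper:
            return "decide H1 (alarm)", i
        if llr <= lower:
            return "decide H0 (normal)", i
    return "continue sampling", len(samples)

# Example on a synthetic, slightly drifted signal.
rng = np.random.default_rng(1)
print(sprt_gaussian_mean(rng.normal(loc=0.6, scale=1.0, size=200)))
```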

  12. Context-dependent logo matching and recognition.

    PubMed

    Sahbi, Hichem; Ballan, Lamberto; Serra, Giuseppe; Del Bimbo, Alberto

    2013-03-01

    We contribute, through this paper, to the design of a novel variational framework able to match and recognize multiple instances of multiple reference logos in image archives. Reference logos and test images are seen as constellations of local features (interest points, regions, etc.) and matched by minimizing an energy function mixing: 1) a fidelity term that measures the quality of feature matching, 2) a neighborhood criterion that captures feature co-occurrence/geometry, and 3) a regularization term that controls the smoothness of the matching solution. We also introduce a detection/recognition procedure and study its theoretical consistency. Finally, we show the validity of our method through extensive experiments on the challenging MICC-Logos dataset. Our method outperforms baseline as well as state-of-the-art matching/recognition procedures by 20%.

  13. Estimating Creativity with a Multiple-Measurement Approach within Scientific and Artistic Domains

    ERIC Educational Resources Information Center

    Agnoli, Sergio; Corazza, Giovanni E.; Runco, Mark A.

    2016-01-01

    This article presents the structure and the composition of a newly developed multifaceted test battery for the measurement of creativity within scientific and artistic domains. By integrating existing procedures for the evaluation of creativity, the new battery promises to become a comprehensive assessment of creativity, encompassing both…

  14. Should We Stop Developing Heuristics and Only Rely on Mixed Integer Programming Solvers in Automated Test Assembly? A Rejoinder to van der Linden and Li (2016).

    PubMed

    Chen, Pei-Hua

    2017-05-01

    This rejoinder responds to the commentary by van der Linden and Li entitled "Comment on Three-Element Item Selection Procedures for Multiple Forms Assembly: An Item Matching Approach" on the article "Three-Element Item Selection Procedures for Multiple Forms Assembly: An Item Matching Approach" by Chen. Van der Linden and Li made a strong statement calling for the cessation of test assembly heuristics development, and instead encouraged embracing mixed integer programming (MIP). This article points out the nondeterministic polynomial (NP)-hard nature of MIP problems and how solutions found using heuristics could be useful in an MIP context. Although van der Linden and Li provided several practical examples of test assembly supporting their view, the examples ignore the cases in which a slight change of constraints or item pool data might mean it would not be possible to obtain solutions as quickly as before. The article illustrates the use of heuristic solutions to improve both the performance of MIP solvers and the quality of solutions. Additional responses to the commentary by van der Linden and Li are included.

  15. Viewpoint: observations on scaled average bioequivalence.

    PubMed

    Patterson, Scott D; Jones, Byron

    2012-01-01

    The two one-sided test procedure (TOST) has been used for average bioequivalence testing since 1992 and is required when marketing new formulations of an approved drug. TOST is known to require comparatively large numbers of subjects to demonstrate bioequivalence for highly variable drugs, defined as those drugs having intra-subject coefficients of variation greater than 30%. However, TOST has been shown to protect public health when multiple generic formulations enter the marketplace following patent expiration. Recently, scaled average bioequivalence (SABE) has been proposed as an alternative statistical analysis procedure for such products by multiple regulatory agencies. SABE testing requires that a three-period partial replicate cross-over or full replicate cross-over design be used. Following a brief summary of SABE analysis methods applied to existing data, we will consider three statistical ramifications of the proposed additional decision rules and the potential impact of implementation of scaled average bioequivalence in the marketplace using simulation. It is found that a constraint being applied is biased, that bias may also result from the common problem of missing data and that the SABE methods allow for much greater changes in exposure when generic-generic switching occurs in the marketplace. Copyright © 2011 John Wiley & Sons, Ltd.

  16. Design optimization studies using COSMIC NASTRAN

    NASA Technical Reports Server (NTRS)

    Pitrof, Stephen M.; Bharatram, G.; Venkayya, Vipperla B.

    1993-01-01

    The purpose of this study is to create, test and document a procedure to integrate mathematical optimization algorithms with COSMIC NASTRAN. This procedure is very important to structural design engineers who wish to capitalize on optimization methods to ensure that their design is optimized for its intended application. The OPTNAST computer program was created to link NASTRAN and design optimization codes into one package. This implementation was tested using two truss structure models and optimizing their designs for minimum weight, subject to multiple loading conditions and displacement and stress constraints. However, the process is generalized so that an engineer could design other types of elements by adding to or modifying some parts of the code.

  17. European type-approval test procedure for evaporative emissions from passenger cars against real-world mobility data from two Italian provinces.

    PubMed

    Martini, Giorgio; Paffumi, Elena; De Gennaro, Michele; Mellios, Giorgos

    2014-07-15

    This paper presents an evaluation of the European type-approval test procedure for evaporative emissions from passenger cars based on real-world mobility data. The study relies on two large databases of driving patterns from conventional fuel vehicles collected by means of on-board GPS systems in the Italian provinces of Modena and Firenze. Approximately 28,000 vehicles were monitored, corresponding to approximately 36 million kilometres over a period of one month. The driving pattern of each vehicle was processed to derive the relation between trip length and parking duration, and the rate of occurrence of parking events against multiple evaporative cycles, defined on the basis of the type-approval test procedure as 12-hour diurnal time windows. These results are used as input for an emission simulation model, which calculates the total evaporative emissions given the characteristics of the evaporative emission control system of the vehicle and the ambient temperature conditions. The results suggest that the evaporative emission control system, fitted to the vehicles from Euro 3 step and optimised for the current type-approval test procedure, could not efficiently work under real-world conditions, resulting in evaporative emissions well above the type-approval limit, especially for small size vehicles and warm climate conditions. This calls for a revision of the type-approval test procedure in order to address real-world evaporative emissions. Copyright © 2014. Published by Elsevier B.V.

  18. In vivo bioluminescence imaging of cell differentiation in biomaterials: a platform for scaffold development.

    PubMed

    Bagó, Juli R; Aguilar, Elisabeth; Alieva, Maria; Soler-Botija, Carolina; Vila, Olaia F; Claros, Silvia; Andrades, José A; Becerra, José; Rubio, Nuria; Blanco, Jerónimo

    2013-03-01

    In vivo testing is a mandatory last step in scaffold development. Agile longitudinal noninvasive real-time monitoring of stem cell behavior in biomaterials implanted in live animals should facilitate the development of scaffolds for tissue engineering. We report on a noninvasive bioluminescence imaging (BLI) procedure for simultaneous monitoring of changes in the expression of multiple genes to evaluate scaffold performance in vivo. Adipose tissue-derived stromal mesenchymal cells were dually labeled with Renilla red fluorescent protein and firefly green fluorescent protein chimeric reporters regulated by cytomegalovirus and tissue-specific promoters, respectively. Labeled cells were induced to differentiate in vitro and in vivo, by seeding in demineralized bone matrices (DBMs), and monitored by BLI. Imaging results were validated by RT-polymerase chain reaction and histological procedures. The proposed approach improves molecular imaging and measurement of changes in gene expression of cells implanted in live animals. This procedure, applicable to the simultaneous analysis of multiple genes from cells seeded in DBMs, should facilitate engineering of scaffolds for tissue repair.

  19. In Vivo Bioluminescence Imaging of Cell Differentiation in Biomaterials: A Platform for Scaffold Development

    PubMed Central

    Bagó, Juli R.; Aguilar, Elisabeth; Alieva, Maria; Soler-Botija, Carolina; Vila, Olaia F.; Claros, Silvia; Andrades, José A.; Becerra, José; Rubio, Nuria

    2013-01-01

    In vivo testing is a mandatory last step in scaffold development. Agile longitudinal noninvasive real-time monitoring of stem cell behavior in biomaterials implanted in live animals should facilitate the development of scaffolds for tissue engineering. We report on a noninvasive bioluminescence imaging (BLI) procedure for simultaneous monitoring of changes in the expression of multiple genes to evaluate scaffold performance in vivo. Adipose tissue-derived stromal mesenchymal cells were dually labeled with Renilla red fluorescent protein and firefly green fluorescent protein chimeric reporters regulated by cytomegalovirus and tissue-specific promoters, respectively. Labeled cells were induced to differentiate in vitro and in vivo, by seeding in demineralized bone matrices (DBMs), and monitored by BLI. Imaging results were validated by RT-polymerase chain reaction and histological procedures. The proposed approach improves molecular imaging and measurement of changes in gene expression of cells implanted in live animals. This procedure, applicable to the simultaneous analysis of multiple genes from cells seeded in DBMs, should facilitate engineering of scaffolds for tissue repair. PMID:23013334

  20. Learning the facts in medical school is not enough: which factors predict successful application of procedural knowledge in a laboratory setting?

    PubMed

    Schmidmaier, Ralf; Eiber, Stephan; Ebersbach, Rene; Schiller, Miriam; Hege, Inga; Holzer, Matthias; Fischer, Martin R

    2013-02-22

    Medical knowledge encompasses both conceptual (facts or "what" information) and procedural knowledge ("how" and "why" information). Conceptual knowledge is known to be an essential prerequisite for clinical problem solving. Primarily, medical students learn from textbooks and often struggle with the process of applying their conceptual knowledge to clinical problems. Recent studies address the question of how to foster the acquisition of procedural knowledge and its application in medical education. However, little is known about the factors which predict performance in procedural knowledge tasks. Which additional factors of the learner predict performance in procedural knowledge? Domain specific conceptual knowledge (facts) in clinical nephrology was provided to 80 medical students (3rd to 5th year) using electronic flashcards in a laboratory setting. Learner characteristics were obtained by questionnaires. Procedural knowledge in clinical nephrology was assessed by key feature problems (KFP) and problem solving tasks (PST) reflecting strategic and conditional knowledge, respectively. Results in procedural knowledge tests (KFP and PST) correlated significantly with each other. In univariate analysis, performance in procedural knowledge (sum of KFP+PST) was significantly correlated with the results in (1) the conceptual knowledge test (CKT), (2) the intended future career as hospital based doctor, (3) the duration of clinical clerkships, and (4) the results in the written German National Medical Examination Part I on preclinical subjects (NME-I). After multiple regression analysis only clinical clerkship experience and NME-I performance remained independent influencing factors. Performance in procedural knowledge tests seems independent from the degree of domain specific conceptual knowledge above a certain level. Procedural knowledge may be fostered by clinical experience. More attention should be paid to the interplay of individual clinical clerkship experiences and structured teaching of procedural knowledge and its assessment in medical education curricula.

  1. A differential color flicker test for detecting acquired color vision impairment in multiple sclerosis and diabetic retinopathy.

    PubMed

    Gregori, Bruno; Papazachariadis, Odysseas; Farruggia, Alfonsa; Accornero, Neri

    2011-01-15

    Optic neuritis related to multiple sclerosis and diabetic retinopathy are relatively selective post-retinal and retinal vision disorders. Vision impairment in both conditions is reliably measured by testing critical fusion frequency (CFF). To examine color vision, we measured the CFF in response to red and blue stimuli, and tested CFF values in patients without evident vision impairment. To ensure that differences in CFF values in a given subject depended only on color perception we displayed red and blue flickering stimuli at equal luminance. CFF to red or blue stimuli were compared in patients with medical history of optic neuritis related to multiple sclerosis (post-retinal vision impairment), patients with diabetic retinopathy (retinal vision impairment) and healthy subjects. The test procedure disclosed altered CFF values for red and blue stimuli in both groups of patients studied. The comparison between the two groups disclosed a prevalent CFF impairment for red stimuli in patients with optic neuritis related to multiple sclerosis and for blue stimuli in patients with diabetic retinopathy. The differential color flicker test appears highly accurate in detecting color vision impairment. Comparison of the two color CFFs differentiates retinal from post-retinal visual disorders. Copyright © 2010 Elsevier B.V. All rights reserved.

  2. Approach to a case of multiple irregular red cell antibodies in a liver transplant recipient: Need for developing competence

    PubMed Central

    Dara, Ravi C.; Tiwari, Aseem K.; Pandey, Prashant; Arora, Dinesh

    2015-01-01

    The liver transplant procedure poses a challenge for transfusion services in terms of specialized blood components, serologic problems, and immunologic effects of transfusion. Red cell alloimmunization in patients awaiting a liver transplant complicates the process through undue delay or unavailability of compatible red blood cell units. Compatible blood units can be provided by a well-equipped immunohematology laboratory with expertise in resolving these serological problems. This report illustrates the resolution of a case with multiple alloantibodies using standard techniques, particularly rare antisera. Our case re-emphasizes the need for universal antibody screening in all patients as part of pretransfusion testing, which helps to identify atypical antibodies and plan appropriate transfusion support well in time. We recommend that centers, especially the ones that perform complex procedures like solid organ transplants and hematological transplants, should have the necessary immunohematological reagents, including rare antisera, to resolve complex cases of multiple antibodies as illustrated in this case. PMID:25722585

  3. Multivariate two-part statistics for analysis of correlated mass spectrometry data from multiple biological specimens.

    PubMed

    Taylor, Sandra L; Ruhaak, L Renee; Weiss, Robert H; Kelly, Karen; Kim, Kyoungmi

    2017-01-01

    High-throughput mass spectrometry (MS) is now being used to profile small molecular compounds across multiple biological sample types from the same subjects with the goal of leveraging information across biospecimens. Multivariate statistical methods that combine information from all biospecimens could be more powerful than the usual univariate analyses. However, missing values are common in MS data, and imputation can impact between-biospecimen correlation and multivariate analysis results. We propose two multivariate two-part statistics that accommodate missing values and combine data from all biospecimens to identify differentially regulated compounds. Statistical significance is determined using a multivariate permutation null distribution. Relative to univariate tests, the multivariate procedures detected more significant compounds in three biological datasets. In a simulation study, we showed that multi-biospecimen testing procedures were more powerful than single-biospecimen methods when compounds are differentially regulated in multiple biospecimens, but univariate methods can be more powerful if compounds are differentially regulated in only one biospecimen. We provide R functions to implement and illustrate our method as supplementary information. Contact: sltaylor@ucdavis.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
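
    To make the idea of a two-part statistic concrete, the sketch below shows a univariate Lachenbruch-style version for one compound in one biospecimen, coding non-detected values as NaN: a proportion test on detection is combined with a t-test on the observed abundances. This is an assumed simplification for illustration; the paper's statistics are multivariate across biospecimens and use a permutation null rather than the chi-square reference shown here.

```python
import numpy as np
from scipy import stats

def two_part_statistic(x, y):
    """Univariate two-part test for detection-limited MS intensities.
    x, y: 1-D arrays for two groups, with NaN marking non-detected values."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    det_x, det_y = ~np.isnan(x), ~np.isnan(y)
    # Part 1: z-test comparing detection proportions between groups.
    p_pool = (det_x.sum() + det_y.sum()) / (x.size + y.size)
    se = np.sqrt(p_pool * (1 - p_pool) * (1 / x.size + 1 / y.size))
    z = (det_x.mean() - det_y.mean()) / se if se > 0 else 0.0
    # Part 2: t-test on the detected (observed) abundances only.
    t, _ = stats.ttest_ind(x[det_x], y[det_y])
    chi2 = z**2 + t**2            # combine the two parts
    return chi2, stats.chi2.sf(chi2, df=2)
```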

  4. Extending the Implicit Association Test (IAT): Assessing Consumer Attitudes Based on Multi-Dimensional Implicit Associations

    PubMed Central

    Gattol, Valentin; Sääksjärvi, Maria; Carbon, Claus-Christian

    2011-01-01

    Background The authors present a procedural extension of the popular Implicit Association Test (IAT; [1]) that allows for indirect measurement of attitudes on multiple dimensions (e.g., safe–unsafe; young–old; innovative–conventional, etc.) rather than on a single evaluative dimension only (e.g., good–bad). Methodology/Principal Findings In two within-subjects studies, attitudes toward three automobile brands were measured on six attribute dimensions. Emphasis was placed on evaluating the methodological appropriateness of the new procedure, providing strong evidence for its reliability, validity, and sensitivity. Conclusions/Significance This new procedure yields detailed information on the multifaceted nature of brand associations that can add up to a more abstract overall attitude. Just as the IAT, its multi-dimensional extension/application (dubbed md-IAT) is suited for reliably measuring attitudes consumers may not be consciously aware of, able to express, or willing to share with the researcher [2], [3]. PMID:21246037

  5. Synthesis and quality control of fluorodeoxyglucose and performance assessment of Siemens MicroFocus 220 small animal PET scanner

    NASA Astrophysics Data System (ADS)

    Phaterpekar, Siddhesh Nitin

    The scope of this article is to cover the synthesis and quality control procedures involved in the production of fludeoxyglucose (18F-FDG). The article also describes the cyclotron production of the 18F radioisotope and gives a brief overview of the operation of a fixed-energy medical cyclotron. The quality control procedures for FDG involve radiochemical and radionuclidic purity tests, pH tests, chemical purity tests, sterility tests, and endotoxin tests. Each of these procedures was carried out for multiple batches of FDG, with a passing rate of 95% among 20 batches. The article also covers the quality assurance steps for the Siemens MicroPET Focus 220 scanner using a Jaszczak phantom. We carried out spatial resolution tests on the scanner, obtaining an average transaxial resolution of 1.775 mm at a 2-3 mm offset. Tests involved detector efficiency, blank scan sinograms, and transmission sinograms. A series of radioactivity distribution tests was also carried out on a uniform phantom, assessing the variations in radioactivity and uniformity using cylindrical ROIs in the transverse region of the final image. The purpose of these quality control tests is to make sure the manufactured FDG is biocompatible with the human body. Quality assurance tests are carried out on PET scanners to ensure efficient performance and to make sure the quality of the acquired images reflects the radioactivity distribution in the subject of interest.

  6. Development of multiple choice pictorial test for measuring the dimensions of knowledge

    NASA Astrophysics Data System (ADS)

    Nahadi, Siswaningsih, Wiwi; Erna

    2017-05-01

    This study aims to develop a multiple-choice pictorial test as a tool to measure the dimensions of knowledge in the chemical equilibrium subject. The method used was Research and Development with validation, conducted through preliminary studies and model development. The product is a multiple-choice pictorial test. The test consisted of 22 items and was administered to 64 grade XII high school students. The quality of the test was determined by its validity, reliability, difficulty index, discrimination power, and distractor effectiveness. The validity of the test was determined by CVR calculation using 8 validators (4 university teachers and 4 high school teachers), with an average CVR value of 0.89. The reliability of the test is in the very high category, with a value of 0.87. In terms of discrimination power, 32% of the items fall in the very good category, 59% in the good category, and 20% in the sufficient category. The test has varying levels of difficulty: 23% of the items are in the difficult category, 50% in the medium category, and 27% in the easy category. For distractor effectiveness, 1% of the items are in the very poor category, 1% in the poor category, 4% in the medium category, 39% in the good category, and 55% in the very good category. The dimensions of knowledge measured consist of factual knowledge, conceptual knowledge, and procedural knowledge. Based on the questionnaire, students responded quite well to the developed test, and most students preferred this kind of multiple-choice pictorial test, which includes pictures as an evaluation tool, over narration tests dominated by text.
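
    For readers unfamiliar with the item statistics mentioned above, the sketch below computes a difficulty index (proportion correct) and a simple upper-minus-lower discrimination index from dichotomously scored responses. It is a generic illustration of these classical indices, not the authors' analysis, and it assumes a 0/1 response matrix plus the common 27% upper/lower grouping.

```python
import numpy as np

def item_analysis(responses):
    """responses: array of shape (n_students, n_items), 1 = correct, 0 = wrong.
    Returns per-item difficulty and discrimination indices."""
    responses = np.asarray(responses, dtype=float)
    totals = responses.sum(axis=1)                 # total score per student
    order = np.argsort(totals)
    k = max(1, int(round(0.27 * responses.shape[0])))
    lower, upper = responses[order[:k]], responses[order[-k:]]
    difficulty = responses.mean(axis=0)            # proportion answering correctly
    discrimination = upper.mean(axis=0) - lower.mean(axis=0)
    return difficulty, discrimination
```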

  7. The emergence of autoclitic frames in atypically and typically developing children as a function of multiple exemplar instruction.

    PubMed

    Luke, Nicole; Greer, R Douglas; Singer-Dudek, Jessica; Keohane, Dolleen-Day

    2011-01-01

    In two experiments, we tested the effect of multiple exemplar instruction (MEI) for training sets on the emergence of autoclitic frames for spatial relations for novel tacts and mands. In Experiment 1, we used a replicated pre- and post-intervention probe design with four students with significant learning disabilities to test for acquisition of four autoclitic frames with novel tacts and mands before and after MEI. The untaught topographies emerged for all participants. In Experiment 2, we used a multiple probe design to test the effects of the MEI procedures on the same responses in four typically developing, bilingual students. The novel usage emerged for all participants. In the latter experiment, the children demonstrated untaught usage of mand or tact frames regardless of whether they were taught to respond in either listener or speaker functions alone or across listener and speaker functions. The findings are discussed in terms of the role of MEI in the formation of abstractions.

  8. OPATs: Omnibus P-value association tests.

    PubMed

    Chen, Chia-Wei; Yang, Hsin-Chou

    2017-07-10

    Combining statistical significances (P-values) from a set of single-locus association tests in genome-wide association studies is a proof-of-principle method for identifying disease-associated genomic segments, functional genes and biological pathways. We review P-value combinations for genome-wide association studies and introduce an integrated analysis tool, Omnibus P-value Association Tests (OPATs), which provides popular analysis methods of P-value combinations. The software OPATs programmed in R and R graphical user interface features a user-friendly interface. In addition to analysis modules for data quality control and single-locus association tests, OPATs provides three types of set-based association test: window-, gene- and biopathway-based association tests. P-value combinations with or without threshold and rank truncation are provided. The significance of a set-based association test is evaluated by using resampling procedures. Performance of the set-based association tests in OPATs has been evaluated by simulation studies and real data analyses. These set-based association tests help boost the statistical power, alleviate the multiple-testing problem, reduce the impact of genetic heterogeneity, increase the replication efficiency of association tests and facilitate the interpretation of association signals by streamlining the testing procedures and integrating the genetic effects of multiple variants in genomic regions of biological relevance. In summary, P-value combinations facilitate the identification of marker sets associated with disease susceptibility and uncover missing heritability in association studies, thereby establishing a foundation for the genetic dissection of complex diseases and traits. OPATs provides an easy-to-use and statistically powerful analysis tool for P-value combinations. OPATs, examples, and user guide can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/genetics/association/OPATs.htm. © The Author 2017. Published by Oxford University Press.
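
    As a rough illustration of a threshold-truncated p-value combination with a resampling null, of the general kind OPATs provides, consider the sketch below. It is not OPATs code: the function names are hypothetical, and the null is built from independent uniform p-values, whereas genome-wide tools typically permute phenotypes so that correlation among markers is preserved.

```python
import numpy as np

rng = np.random.default_rng(0)

def truncated_fisher_stat(pvals, tau=0.05):
    """Fisher-type statistic restricted to p-values at or below tau."""
    pvals = np.asarray(pvals, dtype=float)
    sel = pvals[pvals <= tau]
    return -2.0 * np.log(sel).sum() if sel.size else 0.0

def set_based_pvalue(pvals, tau=0.05, n_resample=10000):
    """Resampling p-value for a set of single-locus p-values (e.g., one gene)."""
    pvals = np.asarray(pvals, dtype=float)
    obs = truncated_fisher_stat(pvals, tau)
    null = np.array([truncated_fisher_stat(rng.uniform(size=pvals.size), tau)
                     for _ in range(n_resample)])
    return (1 + np.sum(null >= obs)) / (n_resample + 1)
```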

  9. Developmental Dissociation in the Neural Responses to Simple Multiplication and Subtraction Problems

    ERIC Educational Resources Information Center

    Prado, Jérôme; Mutreja, Rachna; Booth, James R.

    2014-01-01

    Mastering single-digit arithmetic during school years is commonly thought to depend upon an increasing reliance on verbally memorized facts. An alternative model, however, posits that fluency in single-digit arithmetic might also be achieved via the increasing use of efficient calculation procedures. To test between these hypotheses, we used a…

  10. Role of Mitochondrial Inheritance on Prostate Cancer Outcome in African-American Men

    DTIC Science & Technology

    2014-10-01

    prostate cancer cell line cybrids was not effective and we have instead decided to use the Rhodamine -6-G procedure. Thus far PNT1A cybrid cell lines...the original protocol. To overcome these difficulties, we tested multiple alternative approaches including rhodamine -6G (R6G) mediated short-term

  11. Role of Mitochondrial Inheritance on Prostate Cancer Outcome in African American Men

    DTIC Science & Technology

    2015-12-01

    for generating prostate cancer cell line cybrids was not effective and we have instead used a Rhodamine -6-G procedure. PNT1A cybrid cell lines have...difficulties, we tested multiple alternative approaches including rhodamine -6G (R6G) mediated short-term mitochondrial dysfunction in generating rho zero cells

  12. A Case for Faculty Involvement in EAP Placement Testing

    ERIC Educational Resources Information Center

    James, Cindy; Templeman, Elizabeth

    2009-01-01

    The EAP placement procedure at Thompson Rivers University (TRU) involves multiple measures to assess the language skills of incoming students, some of which are facilitated and all of which are assessed by ESL faculty. In order to determine the effectiveness of this comprehensive EAP placement process and the effect of the faculty factor, a…

  13. Using Behavioral Skills Training and Video Rehearsal to Teach Blackjack Skills

    ERIC Educational Resources Information Center

    Speelman, Ryan C.; Whiting, Seth W.; Dixon, Mark R.

    2015-01-01

    A behavioral skills training procedure that consisted of video instructions, video rehearsal, and video testing was used to teach 4 recreational gamblers a specific skill in playing blackjack (sometimes called "card counting"). A multiple baseline design was used to evaluate intervention effects on card-counting accuracy and chips won or…

  14. Progress Monitoring in Grade 5 Science for Low Achievers

    ERIC Educational Resources Information Center

    Vannest, Kimberly J.; Parker, Richard; Dyer, Nicole

    2011-01-01

    This article presents procedures and results from a 2-year project developing science key vocabulary (KV) short tests suitable for progress monitoring Grade 5 science in Texas public schools using computer-generated, -administered, and -scored assessments. KV items included KV definitions and important usages in a multiple-choice cloze format. A…

  15. Multiple comparisons permutation test for image based data mining in radiotherapy.

    PubMed

    Chen, Chun; Witte, Marnix; Heemsbergen, Wilma; van Herk, Marcel

    2013-12-23

    Comparing incidental dose distributions (i.e. images) of patients with different outcomes is a straightforward way to explore dose-response hypotheses in radiotherapy. In this paper, we introduced a permutation test that compares images, such as dose distributions from radiotherapy, while tackling the multiple comparisons problem. A test statistic Tmax was proposed that summarizes the differences between the images into a single value and a permutation procedure was employed to compute the adjusted p-value. We demonstrated the method in two retrospective studies: a prostate study that relates 3D dose distributions to failure, and an esophagus study that relates 2D surface dose distributions of the esophagus to acute esophagus toxicity. As a result, we were able to identify suspicious regions that are significantly associated with failure (prostate study) or toxicity (esophagus study). Permutation testing allows direct comparison of images from different patient categories and is a useful tool for data mining in radiotherapy.
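
    The sketch below gives a minimal max-statistic permutation adjustment in the spirit of the Tmax approach described above: voxel-wise two-sample t-tests whose adjusted p-values are referenced to the permutation distribution of the maximum absolute t. The inputs dose_a and dose_b are hypothetical patient-by-voxel dose matrices; the authors' actual statistic and implementation may differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def tmax_permutation_test(dose_a, dose_b, n_perm=2000):
    """dose_a, dose_b: arrays of shape (n_patients, n_voxels) for two outcome
    groups. Returns the observed t-map and max-T adjusted p-values per voxel."""
    data = np.vstack([dose_a, dose_b])
    labels = np.array([0] * len(dose_a) + [1] * len(dose_b))

    def tmap(lab):
        t, _ = stats.ttest_ind(data[lab == 0], data[lab == 1], axis=0)
        return t

    t_obs = tmap(labels)
    tmax_null = np.empty(n_perm)
    for b in range(n_perm):
        tmax_null[b] = np.max(np.abs(tmap(rng.permutation(labels))))
    # Adjusted p-value: how often the permutation maximum exceeds each |t|.
    exceed = (tmax_null[None, :] >= np.abs(t_obs)[:, None]).sum(axis=1)
    p_adjusted = (1 + exceed) / (n_perm + 1)
    return t_obs, p_adjusted
```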

  16. Design of lightning protection for a full-authority digital engine control

    NASA Technical Reports Server (NTRS)

    Dargi, M.; Rupke, E.; Wiles, K.

    1991-01-01

    The steps and procedures are described which are necessary to achieve a successful lightning-protection design for a state-of-the-art Full-Authority Digital Engine Control (FADEC) system. The engine and control systems used as examples are fictional, but the design and verification methods are real. Topics discussed include: applicable airworthiness regulation, selection of equipment transient design and control levels for the engine/airframe and intra-engine segments of the system, the use of cable shields, terminal-protection devices and filter circuits in hardware protection design, and software approaches to minimize upset potential. Shield terminations, grounding, and bonding are also discussed, as are the important elements of certification and test plans, and the role of tests and analyses. Also included are examples of multiple-stroke and multiple-burst testing. A review of design pitfalls and challenges, and status of applicable test standards such as RTCA DO-160, Section 22, are presented.

  17. A generalized Grubbs-Beck test statistic for detecting multiple potentially influential low outliers in flood series

    USGS Publications Warehouse

    Cohn, T.A.; England, J.F.; Berenbrock, C.E.; Mason, R.R.; Stedinger, J.R.; Lamontagne, J.R.

    2013-01-01

    The Grubbs-Beck test is recommended by the federal guidelines for detection of low outliers in flood flow frequency computation in the United States. This paper presents a generalization of the Grubbs-Beck test for normal data (similar to the Rosner (1983) test; see also Spencer and McCuen (1996)) that can provide a consistent standard for identifying multiple potentially influential low flows. In cases where low outliers have been identified, they can be represented as “less-than” values, and a frequency distribution can be developed using censored-data statistical techniques, such as the Expected Moments Algorithm. This approach can improve the fit of the right-hand tail of a frequency distribution and provide protection from lack-of-fit due to unimportant but potentially influential low flows (PILFs) in a flood series, thus making the flood frequency analysis procedure more robust.

  18. A generalized Grubbs-Beck test statistic for detecting multiple potentially influential low outliers in flood series

    NASA Astrophysics Data System (ADS)

    Cohn, T. A.; England, J. F.; Berenbrock, C. E.; Mason, R. R.; Stedinger, J. R.; Lamontagne, J. R.

    2013-08-01

    The Grubbs-Beck test is recommended by the federal guidelines for detection of low outliers in flood flow frequency computation in the United States. This paper presents a generalization of the Grubbs-Beck test for normal data (similar to the Rosner (1983) test; see also Spencer and McCuen (1996)) that can provide a consistent standard for identifying multiple potentially influential low flows. In cases where low outliers have been identified, they can be represented as "less-than" values, and a frequency distribution can be developed using censored-data statistical techniques, such as the Expected Moments Algorithm. This approach can improve the fit of the right-hand tail of a frequency distribution and provide protection from lack-of-fit due to unimportant but potentially influential low flows (PILFs) in a flood series, thus making the flood frequency analysis procedure more robust.
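
    For orientation, the sketch below applies the classical single Grubbs-Beck screen on log10 flows, using the commonly cited Bulletin 17B approximation of the 10-percent critical value K_N (treat that formula as an assumption to verify against the published tables). The generalized test described above goes further by sweeping candidate thresholds so that multiple potentially influential low flows can be flagged at once.

```python
import numpy as np

def grubbs_beck_low_outliers(flows):
    """Classical (single-threshold) Grubbs-Beck low-outlier screen.
    flows: annual peak flows. Returns (flagged_flows, threshold)."""
    x = np.log10(np.asarray(flows, dtype=float))
    n = x.size
    # Approximate one-sided 10-percent K_N (Bulletin 17B style approximation).
    k_n = -0.9043 + 3.345 * np.sqrt(np.log10(n)) - 0.4046 * np.log10(n)
    threshold = 10 ** (x.mean() - k_n * x.std(ddof=1))
    flagged = [q for q in flows if q < threshold]
    return flagged, threshold
```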

  19. A new surgical technique for medial collateral ligament balancing: multiple needle puncturing.

    PubMed

    Bellemans, Johan; Vandenneucker, Hilde; Van Lauwe, Johan; Victor, Jan

    2010-10-01

    In this article, we present our experience with a new technique for medial soft tissue balancing, where we make multiple punctures in the medial collateral ligament (MCL) using a 19-gauge needle, to progressively stretch the MCL until a correct ligament balance is achieved. Ligament status was evaluated both before and after the procedure using computer navigation and mediolateral stress testing. The procedure was considered successful when 2 to 4-mm mediolateral joint line opening was obtained in extension and 2 to 6 mm in flexion. In 34 of 35 cases, a progressive correction of medial tightness was achieved according to the above described criteria. One case was considered overreleased in extension. Needle puncturing is a new, effective, and safe technique for progressive correction of MCL tightness in the varus knee. Copyright © 2010 Elsevier Inc. All rights reserved.

  20. Power calculation for comparing diagnostic accuracies in a multi-reader, multi-test design.

    PubMed

    Kim, Eunhee; Zhang, Zheng; Wang, Youdan; Zeng, Donglin

    2014-12-01

    Receiver operating characteristic (ROC) analysis is widely used to evaluate the performance of diagnostic tests with continuous or ordinal responses. A popular study design for assessing the accuracy of diagnostic tests involves multiple readers interpreting multiple diagnostic test results, called the multi-reader, multi-test design. Although several different approaches to analyzing data from this design exist, few methods have discussed the sample size and power issues. In this article, we develop a power formula to compare the correlated areas under the ROC curves (AUC) in a multi-reader, multi-test design. We present a nonparametric approach to estimate and compare the correlated AUCs by extending DeLong et al.'s (1988, Biometrics 44, 837-845) approach. A power formula is derived based on the asymptotic distribution of the nonparametric AUCs. Simulation studies are conducted to demonstrate the performance of the proposed power formula and an example is provided to illustrate the proposed procedure. © 2014, The International Biometric Society.
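
    The building block behind this design is DeLong et al.'s nonparametric comparison of two correlated AUCs; the sketch below implements that comparison for a single reader with paired case/control scores from two tests. It is only an illustration of the variance machinery the power formula builds on, not the multi-reader power calculation itself; the function and argument names are hypothetical.

```python
import numpy as np
from scipy import stats

def _placements(cases, controls):
    """psi(x, y): 1 if x > y, 0.5 if tied, 0 otherwise, for all case/control pairs."""
    psi = (cases[:, None] > controls[None, :]).astype(float)
    psi += 0.5 * (cases[:, None] == controls[None, :])
    return psi

def delong_compare(cases1, controls1, cases2, controls2):
    """Compare correlated AUCs of two diagnostic tests scored on the same
    subjects (row i of cases1 and cases2 is the same diseased subject)."""
    psi1 = _placements(np.asarray(cases1, float), np.asarray(controls1, float))
    psi2 = _placements(np.asarray(cases2, float), np.asarray(controls2, float))
    auc1, auc2 = psi1.mean(), psi2.mean()
    v10 = np.vstack([psi1.mean(axis=1), psi2.mean(axis=1)])  # per-case components
    v01 = np.vstack([psi1.mean(axis=0), psi2.mean(axis=0)])  # per-control components
    s10, s01 = np.cov(v10), np.cov(v01)
    n_cases, n_controls = psi1.shape
    var_diff = (s10[0, 0] + s10[1, 1] - 2 * s10[0, 1]) / n_cases \
             + (s01[0, 0] + s01[1, 1] - 2 * s01[0, 1]) / n_controls
    z = (auc1 - auc2) / np.sqrt(var_diff)
    return auc1, auc2, 2 * stats.norm.sf(abs(z))   # two-sided p-value
```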

  1. Efficient reanalysis of structures by a direct modification method. [local stiffness modifications of large structures

    NASA Technical Reports Server (NTRS)

    Raibstein, A. I.; Kalev, I.; Pipano, A.

    1976-01-01

    A procedure for the local stiffness modifications of large structures is described. It enables structural modifications without an a priori definition of the changes in the original structure and without loss of efficiency due to multiple loading conditions. The solution procedure, implemented in NASTRAN, involved the decomposed stiffness matrix and the displacement vectors of the original structure. It solves the modified structure exactly, irrespective of the magnitude of the stiffness changes. In order to investigate the efficiency of the present procedure and to test its applicability within a design environment, several real and large structures were solved. The results of the efficiency studies indicate that the break-even point of the procedure varies between 8% and 60% stiffness modifications, depending upon the structure's characteristics and the options employed.

  2. Framework for adaptive multiscale analysis of nonhomogeneous point processes.

    PubMed

    Helgason, Hannes; Bartroff, Jay; Abry, Patrice

    2011-01-01

    We develop the methodology for hypothesis testing and model selection in nonhomogeneous Poisson processes, with an eye toward the application of modeling and variability detection in heart beat data. Modeling the process's non-constant rate function using templates of simple basis functions, we develop the generalized likelihood ratio statistic for a given template and a multiple testing scheme to model-select from a family of templates. A dynamic programming algorithm inspired by network flows is used to compute the maximum likelihood template in a multiscale manner. In a numerical example, the proposed procedure is nearly as powerful as the super-optimal procedures that know the true template size and true partition, respectively. Extensions to general history-dependent point processes are discussed.
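
    As a concrete instance of the single-template step, the sketch below computes the generalized likelihood ratio of a piecewise-constant rate (given fixed breakpoints) against a constant-rate Poisson process, with an asymptotic chi-square reference. It is an assumed simplification: the paper's procedure additionally model-selects across a family of templates via multiple testing and finds the maximum-likelihood template with a dynamic programming algorithm.

```python
import numpy as np
from scipy import stats

def poisson_glr_piecewise(event_times, breakpoints, t_end):
    """GLR of a piecewise-constant rate (segments set by breakpoints on [0, t_end])
    versus a constant rate, for an observed Poisson process."""
    edges = np.concatenate(([0.0], np.sort(np.asarray(breakpoints, float)), [t_end]))
    counts = np.histogram(event_times, bins=edges)[0].astype(float)
    lengths = np.diff(edges)
    rate0 = counts.sum() / t_end              # MLE under the constant-rate null
    seg_rates = counts / lengths              # segment-wise rate MLEs
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(counts > 0, counts * np.log(seg_rates / rate0), 0.0)
    glr = 2.0 * np.nansum(terms)
    pval = stats.chi2.sf(glr, df=len(counts) - 1)   # asymptotic reference
    return glr, pval
```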

  3. Double-blind photo lineups using actual eyewitnesses: an experimental test of a sequential versus simultaneous lineup procedure.

    PubMed

    Wells, Gary L; Steblay, Nancy K; Dysart, Jennifer E

    2015-02-01

    Eyewitnesses (494) to actual crimes in 4 police jurisdictions were randomly assigned to view simultaneous or sequential photo lineups using laptop computers and double-blind administration. The sequential procedure used in the field experiment mimicked how it is conducted in actual practice (e.g., using a continuation rule, witness does not know how many photos are to be viewed, witnesses resolve any multiple identifications), which is not how most lab experiments have tested the sequential lineup. No significant differences emerged in rates of identifying lineup suspects (25% overall) but the sequential procedure produced a significantly lower rate (11%) of identifying known-innocent lineup fillers than did the simultaneous procedure (18%). The simultaneous/sequential pattern did not significantly interact with estimator variables and no lineup-position effects were observed for either the simultaneous or sequential procedures. Rates of nonidentification were not significantly different for simultaneous and sequential but nonidentifiers from the sequential procedure were more likely to use the "not sure" response option than were nonidentifiers from the simultaneous procedure. Among witnesses who made an identification, 36% (41% of simultaneous and 32% of sequential) identified a known-innocent filler rather than a suspect, indicating that eyewitness performance overall was very poor. The results suggest that the sequential procedure that is used in the field reduces the identification of known-innocent fillers, but the differences are relatively small.

  4. Interdisciplinary Development of an Improved Emergency Department Procedural Work Surface Through Iterative Design and Use Testing in Simulated and Clinical Environments.

    PubMed

    Zhang, Xiao C; Bermudez, Ana M; Reddy, Pranav M; Sarpatwari, Ravi R; Chheng, Darin B; Mezoian, Taylor J; Schwartz, Victoria R; Simmons, Quinneil J; Jay, Gregory D; Kobayashi, Leo

    2017-03-01

    A stable and readily accessible work surface for bedside medical procedures represents a valuable tool for acute care providers. In emergency department (ED) settings, the design and implementation of traditional Mayo stands and related surface devices often limit their availability, portability, and usability, which can lead to suboptimal clinical practice conditions that may affect the safe and effective performance of medical procedures and delivery of patient care. We designed and built a novel, open-source, portable, bedside procedural surface through an iterative development process with use testing in simulated and live clinical environments. The procedural surface development project was conducted between October 2014 and June 2016 at an academic referral hospital and its affiliated simulation facility. An interdisciplinary team of emergency physicians, mechanical engineers, medical students, and design students sought to construct a prototype bedside procedural surface out of off-the-shelf hardware during a collaborative university course on health care design. After determination of end-user needs and core design requirements, multiple prototypes were fabricated and iteratively modified, with early variants featuring undermattress stabilizing supports or ratcheting clamp mechanisms. Versions 1 through 4 underwent 2 hands-on usability-testing simulation sessions; version 5 was presented at a design critique held jointly by a panel of clinical and industrial design faculty for expert feedback. Responding to select feedback elements over several surface versions, investigators arrived at a near-final prototype design for fabrication and use testing in a live clinical setting. This experimental procedural surface (version 8) was constructed and then deployed for controlled usability testing against the standard Mayo stands in use at the study site ED. Clinical providers working in the ED who opted to participate in the study were provided with the prototype surface and just-in-time training on its use when performing bedside procedures. Subjects completed the validated 10-point System Usability Scale postshift for the surface that they had used. The study protocol was approved by the institutional review board. Multiple prototypes and recursive design revisions resulted in a fully functional, portable, and durable bedside procedural surface that featured a stainless steel tray and intuitive hook-and-lock mechanisms for attachment to ED stretcher bed rails. Forty-two control and 40 experimental group subjects participated and completed questionnaires. The median System Usability Scale score (out of 100; higher scores associated with better usability) was 72.5 (interquartile range [IQR] 51.3 to 86.3) for the Mayo stand; the experimental surface was scored at 93.8 (IQR 84.4 to 97.5) for a difference in medians of 17.5 (95% confidence interval 10 to 27.5). Subjects reported several usability challenges with the Mayo stand; the experimental surface was reviewed as easy to use, simple, and functional. In accordance with experimental live environment deployment, questionnaire responses, and end-user suggestions, the project team finalized the design specification for the experimental procedural surface for open dissemination. An iterative, interdisciplinary approach was used to generate, evaluate, revise, and finalize the design specification for a new procedural surface that met all core end-user requirements.
The final surface design was evaluated favorably on a validated usability tool against Mayo stands when use tested in simulated and live clinical settings. Copyright © 2016 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
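
    Since both arms were scored with the System Usability Scale, the standard scoring rule is worth spelling out; the sketch below applies it to one respondent's ten 1-5 ratings. This is the published SUS formula, not study-specific code.

```python
def sus_score(item_ratings):
    """System Usability Scale score from ten 1-5 Likert ratings.
    Odd-numbered items contribute (rating - 1); even-numbered items contribute
    (5 - rating); the sum is multiplied by 2.5 to give a 0-100 score."""
    if len(item_ratings) != 10:
        raise ValueError("SUS requires exactly 10 item ratings")
    odd = sum(r - 1 for r in item_ratings[0::2])    # items 1, 3, 5, 7, 9
    even = sum(5 - r for r in item_ratings[1::2])   # items 2, 4, 6, 8, 10
    return 2.5 * (odd + even)

# Example: a fairly positive response pattern scores 87.5.
print(sus_score([5, 2, 4, 1, 5, 2, 4, 2, 5, 1]))
```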

  5. Comparing preference assessments: selection- versus duration-based preference assessment procedures.

    PubMed

    Kodak, Tiffany; Fisher, Wayne W; Kelley, Michael E; Kisamore, April

    2009-01-01

    In the current investigation, the results of a selection- and a duration-based preference assessment procedure were compared. A Multiple Stimulus With Replacement (MSW) preference assessment [Windsor, J., Piché, L. M., & Locke, P. A. (1994). Preference testing: A comparison of two presentation methods. Research in Developmental Disabilities, 15, 439-455] and a variation of a Free-Operant (FO) preference assessment procedure [Roane, H. S., Vollmer, T. R., Ringdahl, J. E., & Marcus, B. A. (1998). Evaluation of a brief stimulus preference assessment. Journal of Applied Behavior Analysis, 31, 605-620] were conducted with four participants. A reinforcer assessment was conducted to determine which preference assessment procedure identified the item that produced the highest rates of responding. The items identified as most highly preferred were different across preference assessment procedures for all participants. Results of the reinforcer assessment showed that the MSW identified the item that functioned as the most effective reinforcer for two participants.

  6. 29 CFR 1926.761 - Training.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... following activities. (1) Multiple lift rigging procedure. The employer shall ensure that each employee who performs multiple lift rigging has been provided training in the following areas: (i) The nature of the hazards associated with multiple lifts; and (ii) The proper procedures and equipment to perform multiple...

  7. 29 CFR 1926.761 - Training.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... following activities. (1) Multiple lift rigging procedure. The employer shall ensure that each employee who performs multiple lift rigging has been provided training in the following areas: (i) The nature of the hazards associated with multiple lifts; and (ii) The proper procedures and equipment to perform multiple...

  8. Automating approximate Bayesian computation by local linear regression.

    PubMed

    Thornton, Kevin R

    2009-07-07

    In several biological contexts, parameter inference often relies on computationally intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC based on using a linear regression to approximate the posterior distribution of the parameters, conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone and fully documented. 2. The program will automatically process multiple data sets, and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data, or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and therefore is a general tool for processing the results from any simulation. 6. The code is open-source and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local-linear regression.
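
    To make the regression step concrete, the sketch below implements a generic Beaumont-style local linear-regression adjustment of the kind that ABCreg automates; it is not the package's own code, and the array names (prior_draws, sim_stats, obs_stats) and the Epanechnikov weighting are illustrative assumptions.

    ```python
    import numpy as np

    def abc_local_linear(obs_stats, prior_draws, sim_stats, tol=0.01):
        """Rejection ABC with a local linear-regression adjustment (sketch).

        prior_draws : (n, p) parameter values drawn from the prior
        sim_stats   : (n, s) summary statistics simulated under each draw
        obs_stats   : (s,)   summary statistics of the observed data
        tol         : fraction of simulations accepted
        """
        # Scale summaries, then rank simulations by distance to the observed summaries
        scale = sim_stats.std(axis=0)
        d = np.sqrt((((sim_stats - obs_stats) / scale) ** 2).sum(axis=1))
        n_keep = max(int(tol * len(d)), 2)
        keep = np.argsort(d)[:n_keep]

        # Weighted local linear regression of parameters on summaries (Epanechnikov kernel)
        delta = d[keep].max()
        w = 1.0 - (d[keep] / delta) ** 2
        X = np.column_stack([np.ones(n_keep), (sim_stats[keep] - obs_stats) / scale])
        sw = np.sqrt(w)[:, None]
        beta, *_ = np.linalg.lstsq(sw * X, sw * prior_draws[keep], rcond=None)

        # Shift accepted draws toward what they would be if their summaries matched obs_stats
        adjusted = prior_draws[keep] - X[:, 1:] @ beta[1:]
        return adjusted, w
    ```

    The adjusted draws approximate the posterior more tightly than plain rejection sampling because each accepted parameter is corrected for how far its simulated summaries fell from the observed ones.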

  9. Statistical inference for Hardy-Weinberg proportions in the presence of missing genotype information.

    PubMed

    Graffelman, Jan; Sánchez, Milagros; Cook, Samantha; Moreno, Victor

    2013-01-01

    In genetic association studies, tests for Hardy-Weinberg proportions are often employed as a quality-control procedure. Missing genotypes are typically discarded prior to testing. In this paper we show that inference for Hardy-Weinberg proportions can be biased when missing values are discarded. We propose to use multiple imputation of missing values in order to improve inference for Hardy-Weinberg proportions. For imputation we employ a multinomial logit model that uses information from allele intensities and/or neighbouring markers. Analysis of an empirical data set of single nucleotide polymorphisms possibly related to colon cancer reveals that missing genotypes are not missing completely at random. Deviation from Hardy-Weinberg proportions is mostly due to a lack of heterozygotes. Inbreeding coefficients estimated by multiple imputation of the missing values are typically lowered with respect to inbreeding coefficients estimated by discarding the missing values. Accounting for missing values by multiple imputation qualitatively changed the results of 10 to 17% of the statistical tests performed. Estimates of inbreeding coefficients obtained by multiple imputation showed high correlation with estimates obtained by single imputation using an external reference panel. Our conclusion is that imputation of missing data leads to improved statistical inference for Hardy-Weinberg proportions.
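
    For reference, the basic quality-control check discussed here is the one-degree-of-freedom chi-square test for Hardy-Weinberg proportions at a biallelic marker; the sketch below shows it together with a simple inbreeding-coefficient estimate. The genotype counts are made up for illustration, and the sketch does not implement the paper's multiple-imputation step.

    ```python
    import numpy as np
    from scipy.stats import chi2

    def hwe_chi2(n_AA, n_Aa, n_aa):
        """Chi-square test for Hardy-Weinberg proportions at a biallelic marker,
        plus the inbreeding coefficient f = 1 - observed/expected heterozygosity."""
        n = n_AA + n_Aa + n_aa
        p = (2 * n_AA + n_Aa) / (2 * n)                      # allele frequency of A
        expected = np.array([p ** 2, 2 * p * (1 - p), (1 - p) ** 2]) * n
        observed = np.array([n_AA, n_Aa, n_aa])
        stat = ((observed - expected) ** 2 / expected).sum()
        pval = chi2.sf(stat, df=1)
        f = 1.0 - n_Aa / expected[1]
        return stat, pval, f

    # Example: a lack of heterozygotes inflates f and lowers the p-value
    print(hwe_chi2(360, 380, 160))
    ```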

  10. Cross-Cultural Validation of the Patient Perception of Integrated Care Survey.

    PubMed

    Tietschert, Maike V; Angeli, Federica; van Raak, Arno J A; Ruwaard, Dirk; Singer, Sara J

    2017-07-20

    To test the cross-cultural validity of the U.S. Patient Perception of Integrated Care (PPIC) Survey in a Dutch sample using a standardized procedure. Primary data collected from patients of five primary care centers in the south of the Netherlands, through survey research from 2014 to 2015. Cross-sectional data collected from patients who saw multiple health care providers during 6 months preceding data collection. The PPIC survey includes 59 questions that measure patient-perceived care integration across providers, settings, and time. Data analysis followed a standardized procedure covering data preparation and psychometric analysis, and included invariance testing against the U.S. dataset. Latent scale structures of the Dutch and U.S. surveys were highly comparable. The factor "Integration with specialist" had lower reliability scores and showed noninvariance. For the remaining factors, internal consistency and invariance estimates were strong. The standardized cross-cultural validation procedure produced strong support for comparable psychometric characteristics of the Dutch and U.S. surveys. Future research should examine the usability of the proposed procedure for contexts with greater cultural differences. © Health Research and Educational Trust.

  11. Double-label immunofluorescence method for simultaneous detection of adenovirus and herpes simplex virus from the eye.

    PubMed

    Walpita, P; Darougar, S

    1989-07-01

    The development and application of a double-label immunofluorescence method which has the potential to screen for single or dual infections from any site, in single shell vial cultures, is described. In this study, a total of 1,141 ocular specimens were inoculated in shell vials, centrifuged at 15,000 X g for 1 h, incubated at 37 degrees C for 48 h, and fixed in methanol at room temperature for 15 min. The virus inclusions were detected by staining with a double-label indirect immunofluorescence procedure using mixtures of appropriate first antibodies, followed by fluorescein- and rhodamine-conjugated second antibodies. Each specimen was also inoculated in parallel by the conventional virus isolation method. The sensitivity and specificity of the double-label shell vial procedure were comparable to those with the conventional method, and the former test took only 48 h to complete. The test offers a rapid and simple single-vial procedure which allows for individual or simultaneous detection of multiple pathogens. It results in savings in time and cost over the conventional virus isolation method and other shell vial procedures.

  12. Tracking people and cars using 3D modeling and CCTV.

    PubMed

    Edelman, Gerda; Bijhold, Jurrien

    2010-10-10

    The aim of this study was to find a method for the reconstruction of movements of people and cars using CCTV footage and a 3D model of the environment. A procedure is proposed, in which video streams are synchronized and displayed in a 3D model, by using virtual cameras. People and cars are represented by cylinders and boxes, which are moved in the 3D model, according to their movements as shown in the video streams. The procedure was developed and tested in an experimental setup with test persons who logged their GPS coordinates as a recording of the ground truth. Results showed that it is possible to implement this procedure and to reconstruct movements of people and cars from video recordings. The procedure was also applied to a forensic case. In this work we experienced that more situational awareness was created by the 3D model, which made it easier to track people on multiple video streams. Based on all experiences from the experimental set up and the case, recommendations are formulated for use in practice. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  13. The Language Skill Change Project (LSCP): Background, Procedures, and Preliminary Findings

    DTIC Science & Technology

    1987-12-01

    word in the dictionary so I can understand what I am reading. 49. I use flashcards (with the new word or phrase on one side and the definition or...mostly on a term paper rather than multiple choice tests. 6. I would rather watch a heated debate on a controversial topic than a popular music program

  14. A Critical Analysis of the Body of Work Method for Setting Cut-Scores

    ERIC Educational Resources Information Center

    Radwan, Nizam; Rogers, W. Todd

    2006-01-01

    The recent increase in the use of constructed-response items in educational assessment and the dissatisfaction with the nature of the decision that the judges must make using traditional standard-setting methods created a need to develop new and effective standard-setting procedures for tests that include both multiple-choice and…

  15. Two Readiness Measures As Predictors Of First- And Third-Grade Reading Achievement

    ERIC Educational Resources Information Center

    Randel, Mildred A.; And Others

    1977-01-01

    Multiple-regression procedures were used to assess effectiveness of the ABC Inventory and the Metropolitan Readiness Test (MRT) in predicting first- and third-grade reading achievement. MRT performance accounted for 11 percent of the variance in first-grade SRA reading scores. In predicting third-grade reading, the MRT accounted for 26 percent of…

  16. 40 CFR 65.158 - Performance test procedures for control devices.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... simultaneously from multiple loading arms, each run shall represent at least one complete tank truck or tank car... the combustion air or as a secondary fuel into a boiler or process heater with a design capacity less... corrected to 3 percent oxygen if a combustion device is the control device. (A) The emission rate correction...

  17. An approach to analyzing a single subject's scores obtained in a standardized test with application to the Aachen Aphasia Test (AAT).

    PubMed

    Willmes, K

    1985-08-01

    Methods for the analysis of a single subject's test profile(s) proposed by Huber (1973) are applied to the Aachen Aphasia Test (AAT). The procedures are based on the classical test theory model (Lord & Novick, 1968) and are suited for any (achievement) test with standard norms from a large standardization sample and satisfactory reliability estimates. Two test profiles of a Wernicke's aphasic, obtained before and after a 3-month period of speech therapy, are analyzed using inferential comparisons between (groups of) subtest scores on one test application and between two test administrations for single (groups of) subtests. For each of these comparisons, the two aspects of (i) significant (reliable) differences in performance beyond measurement error and (ii) the diagnostic validity of that difference in the reference population of aphasic patients are assessed. Significant differences between standardized subtest scores and a remarkably better preserved reading and writing ability could be found for both test administrations using the multiple test procedure of Holm (1979). Comparison of both profiles revealed an overall increase in performance for each subtest as well as changes in level of performance relations between pairs of subtests.
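
    The multiple test procedure of Holm (1979) referred to above is a simple step-down adjustment of the raw p-values; a minimal sketch with illustrative p-values is given below.

    ```python
    def holm_adjust(pvals):
        """Holm (1979) step-down adjustment: rejecting all hypotheses with
        adjusted p <= alpha controls the familywise error rate at alpha."""
        m = len(pvals)
        order = sorted(range(m), key=lambda i: pvals[i])
        adjusted = [0.0] * m
        running_max = 0.0
        for rank, i in enumerate(order):
            running_max = max(running_max, (m - rank) * pvals[i])
            adjusted[i] = min(1.0, running_max)
        return adjusted

    # Example: several subtest-score comparisons at familywise alpha = 0.05
    print(holm_adjust([0.001, 0.01, 0.03, 0.04, 0.20]))
    ```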

  18. Statistical analysis of particle trajectories in living cells

    NASA Astrophysics Data System (ADS)

    Briane, Vincent; Kervrann, Charles; Vimond, Myriam

    2018-06-01

    Recent advances in molecular biology and fluorescence microscopy imaging have made possible the inference of the dynamics of molecules in living cells. Such inference allows us to understand and determine the organization and function of the cell. The trajectories of particles (e.g., biomolecules) in living cells, computed with the help of object tracking methods, can be modeled with diffusion processes. Three types of diffusion are considered: (i) free diffusion, (ii) subdiffusion, and (iii) superdiffusion. The mean-square displacement (MSD) is generally used to discriminate the three types of particle dynamics. We propose here a nonparametric three-decision test as an alternative to the MSD method. The rejection of the null hypothesis, i.e., free diffusion, is accompanied by claims of the direction of the alternative (subdiffusion or superdiffusion). We study the asymptotic behavior of the test statistic under the null hypothesis and under parametric alternatives which are currently considered in the biophysics literature. In addition, we adapt the multiple-testing procedure of Benjamini and Hochberg to fit with the three-decision-test setting, in order to apply the test procedure to a collection of independent trajectories. The performance of our procedure is much better than the MSD method as confirmed by Monte Carlo experiments. The method is demonstrated on real data sets corresponding to protein dynamics observed in fluorescence microscopy.
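
    For context, the classical Benjamini-Hochberg step-up procedure that the authors adapt to the three-decision setting can be sketched as follows; the p-values in the example are illustrative, and the sketch implements only the standard single-direction version.

    ```python
    import numpy as np

    def benjamini_hochberg(pvals, q=0.05):
        """Benjamini-Hochberg step-up procedure: returns a boolean mask of the
        p-values rejected while controlling the FDR at level q (independence assumed)."""
        p = np.asarray(pvals, dtype=float)
        m = len(p)
        order = np.argsort(p)
        thresholds = q * (np.arange(1, m + 1) / m)
        below = p[order] <= thresholds
        reject = np.zeros(m, dtype=bool)
        if below.any():
            k = np.max(np.nonzero(below)[0])   # largest rank passing its threshold
            reject[order[: k + 1]] = True
        return reject

    # Example: one decision per trajectory, e.g. p-values from the three-decision test
    print(benjamini_hochberg([0.0002, 0.004, 0.03, 0.21, 0.7], q=0.05))
    ```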

  19. Global Search Capabilities of Indirect Methods for Impulsive Transfers

    NASA Astrophysics Data System (ADS)

    Shen, Hong-Xin; Casalino, Lorenzo; Luo, Ya-Zhong

    2015-09-01

    An optimization method which combines an indirect method with homotopic approach is proposed and applied to impulsive trajectories. Minimum-fuel, multiple-impulse solutions, with either fixed or open time are obtained. The homotopic approach at hand is relatively straightforward to implement and does not require an initial guess of adjoints, unlike previous adjoints estimation methods. A multiple-revolution Lambert solver is used to find multiple starting solutions for the homotopic procedure; this approach can guarantee to obtain multiple local solutions without relying on the user's intuition, thus efficiently exploring the solution space to find the global optimum. The indirect/homotopic approach proves to be quite effective and efficient in finding optimal solutions, and outperforms the joint use of evolutionary algorithms and deterministic methods in the test cases.

  20. Effectiveness of percutaneous vertebroplasty in patients with multiple myeloma having vertebral pain

    PubMed Central

    Nas, Ömer Fatih; İnecikli, Mehmet Fatih; Hacıkurt, Kadir; Büyükkaya, Ramazan; Özkaya, Güven; Özkalemkaş, Fahir; Ali, Rıdvan; Erdoğan, Cüneyt; Hakyemez, Bahattin

    2016-01-01

    PURPOSE We aimed to assess the effectiveness, benefits, and reliability of percutaneous vertebroplasty (PV) in patients with vertebral involvement of multiple myeloma. METHODS PV procedures performed on 166 vertebrae of 41 patients with multiple myeloma were retrospectively evaluated. Most of our patients were using level 3 (moderate to severe pain) analgesics. Magnetic resonance imaging was performed before the procedure to assess vertebral involvement of multiple myeloma. The following variables were evaluated: affected vertebral levels, loss of vertebral body height, polymethylmethacrylate (PMMA) cement amount applied to the vertebral body during PV, PMMA cement leakages, and pain before and after PV as assessed by a visual analogue scale (VAS). RESULTS Median VAS scores of patients decreased from 9 one day before PV, to 6 one day after the procedure, to 3 one week after the procedure, and eventually to 1 three months after the procedure (P < 0.001). During the PV procedure, cement leakage was observed at 68 vertebral levels (41%). The median value of PMMA applied to the vertebral body was 6 mL. CONCLUSION Being a minimally invasive and easily performed procedure with low complication rates, PV should be preferred for serious back pain of multiple myeloma patients. PMID:26912107

  1. Teaching physical activities to students with significant disabilities using video modeling.

    PubMed

    Cannella-Malone, Helen I; Mizrachi, Sharona V; Sabielny, Linsey M; Jimenez, Eliseo D

    2013-06-01

    The objective of this study was to examine the effectiveness of video modeling on teaching physical activities to three adolescents with significant disabilities. The study implemented a multiple baseline across six physical activities (three per student): jumping rope, scooter board with cones, ladder drill (i.e., feet going in and out), ladder design (i.e., multiple steps), shuttle run, and disc ride. Additional prompt procedures (i.e., verbal, gestural, visual cues, and modeling) were implemented within the study. After the students mastered the physical activities, we tested to see if they would link the skills together (i.e., complete an obstacle course). All three students made progress learning the physical activities, but only one learned them with video modeling alone (i.e., without error correction). Video modeling can be an effective tool for teaching students with significant disabilities various physical activities, though additional prompting procedures may be needed.

  2. Learning the facts in medical school is not enough: which factors predict successful application of procedural knowledge in a laboratory setting?

    PubMed Central

    2013-01-01

    Background Medical knowledge encompasses both conceptual (facts or “what” information) and procedural knowledge (“how” and “why” information). Conceptual knowledge is known to be an essential prerequisite for clinical problem solving. Primarily, medical students learn from textbooks and often struggle with the process of applying their conceptual knowledge to clinical problems. Recent studies address the question of how to foster the acquisition of procedural knowledge and its application in medical education. However, little is known about the factors which predict performance in procedural knowledge tasks. Which additional factors of the learner predict performance in procedural knowledge? Methods Domain specific conceptual knowledge (facts) in clinical nephrology was provided to 80 medical students (3rd to 5th year) using electronic flashcards in a laboratory setting. Learner characteristics were obtained by questionnaires. Procedural knowledge in clinical nephrology was assessed by key feature problems (KFP) and problem solving tasks (PST) reflecting strategic and conditional knowledge, respectively. Results Results in procedural knowledge tests (KFP and PST) correlated significantly with each other. In univariate analysis, performance in procedural knowledge (sum of KFP+PST) was significantly correlated with the results in (1) the conceptual knowledge test (CKT), (2) the intended future career as hospital based doctor, (3) the duration of clinical clerkships, and (4) the results in the written German National Medical Examination Part I on preclinical subjects (NME-I). After multiple regression analysis only clinical clerkship experience and NME-I performance remained independent influencing factors. Conclusions Performance in procedural knowledge tests seems independent from the degree of domain specific conceptual knowledge above a certain level. Procedural knowledge may be fostered by clinical experience. More attention should be paid to the interplay of individual clinical clerkship experiences and structured teaching of procedural knowledge and its assessment in medical education curricula. PMID:23433202

  3. Association between Exposure of Young Children to Procedures Requiring General Anesthesia and Learning and Behavioral Outcomes in a Population-based Birth Cohort.

    PubMed

    Hu, Danqing; Flick, Randall P; Zaccariello, Michael J; Colligan, Robert C; Katusic, Slavica K; Schroeder, Darrell R; Hanson, Andrew C; Buenvenida, Shonie L; Gleich, Stephen J; Wilder, Robert T; Sprung, Juraj; Warner, David O

    2017-08-01

    Exposure of young animals to general anesthesia causes neurodegeneration and lasting behavioral abnormalities; whether these findings translate to children remains unclear. This study used a population-based birth cohort to test the hypothesis that multiple, but not single, exposures to procedures requiring general anesthesia before age 3 yr are associated with adverse neurodevelopmental outcomes. A retrospective study cohort was assembled from children born in Olmsted County, Minnesota, from 1996 to 2000 (inclusive). Propensity matching selected children exposed and not exposed to general anesthesia before age 3 yr. Outcomes ascertained via medical and school records included learning disabilities, attention-deficit/hyperactivity disorder, and group-administered ability and achievement tests. Analysis methods included proportional hazard regression models and mixed linear models. For the 116 multiply exposed, 457 singly exposed, and 463 unexposed children analyzed, multiple, but not single, exposures were associated with an increased frequency of both learning disabilities and attention-deficit/hyperactivity disorder (hazard ratio for learning disabilities = 2.17 [95% CI, 1.32 to 3.59], unexposed as reference). Multiple exposures were associated with decreases in both cognitive ability and academic achievement. Single exposures were associated with modest decreases in reading and language achievement but not cognitive ability. These findings in children anesthetized with modern techniques largely confirm those found in an older birth cohort and provide additional evidence that children with multiple exposures are more likely to develop adverse outcomes related to learning and attention. Although a robust association was observed, these data do not determine whether anesthesia per se is causal.

  4. [Treatment manual for psychotherapy of acute and posttraumatic stress disorders after multiple ICD shocks].

    PubMed

    Jordan, J; Titscher, G; Kirsch, H

    2011-09-01

    In view of the increasing number of implanted defibrillators in all industrial nations, the number of people who have suffered so-called multiple shocks (electrical storm, ES) also increases. Common complaints are severe and continuously recurrent massive anxiety, panic attacks, fear of death, helplessness and hopelessness, depression, nervousness and irritability as well as reclusive and uncontrollable avoidance behaviour, intrusions, nightmares, flashbacks, sleeplessness and the inability to show feelings and limitation of future perspectives. Because people with an ICD are often physically (very) ill and after multiple ICD shocks are additionally very insecure, it seems logical for inpatient treatment to be carried out in an institution which has close connections and is also spatially close to a cardiology department. The basis of the diagnostics is the clinical anamnesis and a systematic exploration of the trauma situation and the resulting complaints. As an additional diagnostic element psychological test procedures should be implemented to determine the core symptoms (anxiety, depression, trauma symptoms). Psychological test procedures should be included in the diagnostics so that at the end of treatment it is obvious even to the patient which alterations have occurred. The core element of inpatient treatment is daily intensive psychotherapy and includes psychodynamically based psychotherapy and behaviorally oriented anxiety therapy as well as cognitive restructuring and elements of eye movement desensitization and reprocessing (EMDR). A follow-up examination within 4 months of the multiple shocks episode is recommended because symptoms of posttraumatic stress disorder often occur after a long latency period.

  5. Specialty Payment Model Opportunities and Assessment

    PubMed Central

    Mulcahy, Andrew W.; Chan, Chris; Hirshman, Samuel; Huckfeldt, Peter J.; Kofner, Aaron; Liu, Jodi L.; Lovejoy, Susan L.; Popescu, Ioana; Timbie, Justin W.; Hussey, Peter S.

    2015-01-01

    Abstract Gastroenterology and cardiology services are common and costly among Medicare beneficiaries. Episode-based payment, which aims to create incentives for high-quality, low-cost care, has been identified as a promising alternative payment model. This article describes research related to the design of episode-based payment models for ambulatory gastroenterology and cardiology services for possible testing by the Center for Medicare and Medicaid Innovation at the Centers for Medicare and Medicaid Services (CMS). The authors analyzed Medicare claims data to describe the frequency and characteristics of gastroenterology and cardiology index procedures, the practices that delivered index procedures, and the patients that received index procedures. The results of these analyses can help inform CMS decisions about the definition of episodes in an episode-based payment model; payment adjustments for service setting, multiple procedures, or other factors; and eligibility for the payment model. PMID:28083363

  6. 34 CFR 75.224 - What are the procedures for using a multiple tier review process to evaluate applications?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false What are the procedures for using a multiple tier... applications received. (d) The Secretary may, in any tier— (1) Use more than one group of experts to gain... procedures for using a multiple tier review process to evaluate applications? (a) The Secretary may use a...

  7. Communications processor for C3 analysis and wargaming

    NASA Astrophysics Data System (ADS)

    Clark, L. N.; Pless, L. D.; Rapp, R. L.

    1982-03-01

    This thesis developed the software capability to allow the investigation of C3 problems, procedures and methodologies. The resultant communications model, while independent of a specific wargame, is currently implemented in conjunction with the McClintic Theater Model. It provides a computerized message handling system (C3 Model) which allows simulation of communication links (circuits) with user-definable delays; garble and loss rates; and multiple circuit types, addresses, and levels of command. It is designed to be used for test and evaluation of command and control problems in the areas of organizational relationships, communication networks and procedures, and combat doctrine or tactics.

  8. Proportion of general factor variance in a hierarchical multiple-component measuring instrument: a note on a confidence interval estimation procedure.

    PubMed

    Raykov, Tenko; Zinbarg, Richard E

    2011-05-01

    A confidence interval construction procedure for the proportion of explained variance by a hierarchical, general factor in a multi-component measuring instrument is outlined. The method provides point and interval estimates for the proportion of total scale score variance that is accounted for by the general factor, which could be viewed as common to all components. The approach may also be used for testing composite (one-tailed) or simple hypotheses about this proportion, and is illustrated with a pair of examples. ©2010 The British Psychological Society.

  9. An Experiment in Scientific Program Understanding

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.; Owen, Karl (Technical Monitor)

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. Results are shown for three intensively studied codes and seven blind test cases; all test cases are state of the art scientific codes. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.

  10. Invariance levels across language versions of the PISA 2009 reading comprehension tests in Spain.

    PubMed

    Elosua Oliden, Paula; Mujika Lizaso, Josu

    2013-01-01

    The PISA project provides the basis for studying curriculum design and for comparing factors associated with school effectiveness. These studies are only valid if the different language versions are equivalent to each other. In Spain, the application of PISA in autonomous regions with their own languages means that equivalency must also be extended to the Spanish, Galician, Catalan and Basque versions of the test. The aim of this work was to analyse the equivalence among the four language versions of the Reading Comprehension Test (PISA 2009). After defining the testlet as the unit of analysis, equivalence among the language versions was analysed using two invariance testing procedures: multiple-group mean and covariance structure analyses for ordinal data and ordinal logistic regression. The procedures yielded concordant results supporting metric equivalence across all four language versions: Spanish, Basque, Galician and Catalan. The equivalence supports the estimated reading literacy score comparability among the language versions used in Spain.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iguchi, Toshihiro, E-mail: iguchi@ba2.so-net.ne.jp; Hiraki, Takao, E-mail: takaoh@tc4.so-net.ne.jp; Gobara, Hideo, E-mail: gobara@cc.okayama-u.ac.jp

    Purpose: The aim of the study was to retrospectively evaluate simultaneous multiple hook wire placement outcomes before video-assisted thoracoscopic surgery (VATS). Materials and Methods: Thirty-eight procedures were performed on 35 patients (13 men and 22 women; mean age, 59.9 years) with 80 lung lesions (mean diameter 7.9 mm) who underwent simultaneous multiple hook wire placements for preoperative localization. The primary endpoints were technical success, complications, procedure duration, and VATS outcome; secondary endpoints were comparisons of technical success rates, complication rates, and procedure durations with the 238 single-placement procedures performed. Complications were also evaluated. Results: In 35 procedures including 74 lesions, multiple hook wire placements were technically successful; in the remaining three procedures, the second target placement was aborted because of massive pneumothorax after the first placement. Although complications occurred in 34 procedures, no grade 3 or above adverse event was observed. The mean procedure duration was 36.4 ± 11.8 min. Three hook wires dislodged during patient transport to the surgical suite. Seventy-four successfully marked lesions were resected. Six lesions without hook wires were successfully resected after detection by palpation with an additional mini-thoracotomy or using subtle pleural changes as a guide. The complication rates and procedure durations of multiple-placement procedures were significantly higher (P = 0.04) and longer (P < 0.001) than those in the single-placement group, respectively, while the technical success rate was not significantly different (P = 0.051). Conclusions: Simultaneous multiple hook wire placements before VATS were clinically feasible, but increased the complication rate and lengthened the procedure time.

  12. Influence of Temporal Context on Value in the Multiple-Chains and Successive-Encounters Procedures

    ERIC Educational Resources Information Center

    O'Daly, Matthew; Angulo, Samuel; Gipson, Cassandra; Fantino, Edmund

    2006-01-01

    This set of studies explored the influence of temporal context across multiple-chain and multiple-successive-encounters procedures. Following training with different temporal contexts, the value of stimuli sharing similar reinforcement schedules was assessed by presenting these stimuli in concurrent probes. The results for the multiple-chain…

  13. Considering interactive effects in the identification of influential regions with extremely rare variants via fixed bin approach

    PubMed Central

    2014-01-01

    In this study, we analyze the Genetic Analysis Workshop 18 (GAW18) data to identify regions of single-nucleotide polymorphisms (SNPs), which significantly influence hypertension status among individuals. We have studied the marginal impact of these regions on disease status in the past, but we extend the method to deal with environmental factors present in data collected over several exam periods. We consider the respective interactions between such traits as smoking status and age with the genetic information and hope to augment those genetic regions deemed influential marginally with those that contribute via an interactive effect. In particular, we focus only on rare variants and apply a procedure to combine signal among rare variants in a number of "fixed bins" along the chromosome. We extend the procedure in Agne et al [1] to incorporate environmental factors by dichotomizing subjects via traits such as smoking status and age, running the marginal procedure among each respective category (i.e., smokers or nonsmokers), and then combining their scores into a score for interaction. To avoid overlap of subjects, we examine each exam period individually. Out of a possible 629 fixed-bin regions in chromosome 3, we observe that 11 show up in multiple exam periods for gene-smoking score. Fifteen regions exhibit significance for multiple exam periods for gene-age score, with 4 regions deemed significant for all 3 exam periods. The procedure pinpoints SNPs in 8 "answer" genes, with 5 of these showing up as significant in multiple testing schemes (Gene-Smoking, Gene-Age for Exams 1, 2, and 3). PMID:25519400

  14. The presence-absence coliform test for monitoring drinking water quality.

    PubMed Central

    Rice, E W; Geldreich, E E; Read, E J

    1989-01-01

    The concern for improved monitoring of the sanitary quality of drinking water has prompted interest in alternative methods for the detection of total coliform bacteria. A simplified qualitative presence-absence test has been proposed as an alternate procedure for detecting coliform bacteria in potable water. In this paper data from four comparative studies were analyzed to compare the recovery of total coliform bacteria from drinking water using the presence-absence test, the multiple fermentation tube procedure, and the membrane filter technique. The four studies were of water samples taken from four different geographic areas of the United States: Hawaii, New England (Vermont and New Hampshire), Oregon, and Pennsylvania. The results of these studies were compared based upon the number of positive samples detected by each method. Combined recoveries showed the presence-absence test detected significantly higher numbers of samples with coliforms than either the fermentation tube or membrane filter methods, P less than 0.01. The fermentation tube procedure detected significantly more positive samples than the membrane filter technique, P less than 0.01. Based upon the analysis of the combined database, it is clear that the presence-absence test is as sensitive as the current coliform methods for the examination of potable water. The presence-absence test offers a viable alternative to water utility companies that elect to use the frequency-of-occurrence approach for compliance monitoring. PMID:2493663

  15. Testing with feedback improves recall of information in informed consent: A proof of concept study.

    PubMed

    Roberts, Katherine J; Revenson, Tracey A; Urken, Mark L; Fleszar, Sara; Cipollina, Rebecca; Rowe, Meghan E; Reis, Laura L Dos; Lepore, Stephen J

    2016-08-01

    This study investigates whether applying educational testing approaches to an informed consent video for a medical procedure can lead to greater recall of the information presented. Undergraduate students (n=120) were randomly assigned to watch a 20-min video on informed consent under one of three conditions: 1) tested using multiple-choice knowledge questions and provided with feedback on their answers after each 5-min segment; 2) tested with multiple choice knowledge questions but not provided feedback after each segment; or 3) watched the video without knowledge testing. Participants who were tested and provided feedback had significantly greater information recall compared to those who were tested but not provided feedback and to those not tested. The effect of condition was stronger for moderately difficult questions versus easy questions. Inserting knowledge tests and providing feedback about the responses at timed intervals in videos can be effective in improving recall of information. Providing informed consent information through a video not only standardizes the material, but using testing with feedback inserted within the video has the potential to increase recall and retention of this material. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  16. Detection of multiple perturbations in multi-omics biological networks.

    PubMed

    Griffin, Paula J; Zhang, Yuqing; Johnson, William Evan; Kolaczyk, Eric D

    2018-05-17

    Cellular mechanism-of-action is of fundamental concern in many biological studies. It is of particular interest for identifying the cause of disease and learning the way in which treatments act against disease. However, pinpointing such mechanisms is difficult, due to the fact that small perturbations to the cell can have wide-ranging downstream effects. Given a snapshot of cellular activity, it can be challenging to tell where a disturbance originated. The presence of an ever-greater variety of high-throughput biological data offers an opportunity to examine cellular behavior from multiple angles, but also presents the statistical challenge of how to effectively analyze data from multiple sources. In this setting, we propose a method for mechanism-of-action inference by extending network filtering to multi-attribute data. We first estimate a joint Gaussian graphical model across multiple data types using penalized regression and filter for network effects. We then apply a set of likelihood ratio tests to identify the most likely site of the original perturbation. In addition, we propose a conditional testing procedure to allow for detection of multiple perturbations. We demonstrate this methodology on paired gene expression and methylation data from The Cancer Genome Atlas (TCGA). © 2018, The International Biometric Society.
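
    As an illustration of the first step only (joint network estimation via penalized regression), the sketch below fits an L1-penalized Gaussian graphical model to a stacked multi-attribute matrix with scikit-learn. The data are simulated placeholders, and the likelihood-ratio and conditional perturbation tests described in the paper are not reproduced here.

    ```python
    import numpy as np
    from sklearn.covariance import GraphicalLassoCV

    # Hypothetical multi-attribute data: rows are samples, columns stack expression
    # and methylation features for the same genes (the column layout is an assumption).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 40))

    # L1-penalized estimate of the joint precision matrix; nonzero off-diagonal
    # entries define the edges of the estimated Gaussian graphical model.
    model = GraphicalLassoCV().fit(X)
    precision = model.precision_
    edges = np.argwhere(np.triu(np.abs(precision) > 1e-6, k=1))
    print(f"{len(edges)} edges estimated among {X.shape[1]} features")
    ```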

  17. [Feasibility of device closure for multiple atrial septal defects using 3D printing and ultrasound-guided intervention technique].

    PubMed

    Qiu, X; Lü, B; Xu, N; Yan, C W; Ouyang, W B; Liu, Y; Zhang, F W; Yue, Z Q; Pang, K J; Pan, X B

    2017-04-25

    Objective: To investigate the feasibility of trans-catheter closure of multiple atrial septal defects (ASD) monitored by trans-thoracic echocardiography (TTE) under the guidance of a 3D-printed heart model. Methods: Between April and August 2016, a total of 21 patients (8 male and 13 female) with multiple ASD in Fuwai Hospital of Chinese Academy of Medical Sciences underwent CT scanning and 3-dimensional echocardiography to produce a heart disease model with a 3D printing technique. The best occlusion program was determined through simulation testing on the model. Percutaneous device closure of multiple ASD was performed following the predetermined program under TTE guidance. Clinical follow-up including electrocardiogram and TTE was arranged at 1 month after the procedure. Results: The trans-catheter procedure was successful in all 21 patients using a single atrial septal occluder. Mild residual shunt was found in 5 patients in the immediate postoperative period, 3 of which had resolved during postoperative follow-up. There were no deaths, vascular damage, arrhythmia, device migration, thromboembolism, or valvular dysfunction during the follow-up period. Conclusion: The use of a 3D-printed heart model provides a useful reference for transcatheter device closure of multiple ASD achieved through an ultrasound-guided intervention technique, which appears to be safe and feasible with good short-term follow-up outcomes.

  18. An Investigation of the Accuracy of Alternative Methods of True Score Estimation in High-Stakes Mixed-Format Examinations.

    ERIC Educational Resources Information Center

    Klinger, Don A.; Rogers, W. Todd

    2003-01-01

    The estimation accuracy of procedures based on classical test score theory and item response theory (generalized partial credit model) were compared for examinations consisting of multiple-choice and extended-response items. Analysis of British Columbia Scholarship Examination results found an error rate of about 10 percent for both methods, with…

  19. Speech Perception by 6- to 8-Month-Olds in the Presence of Distracting Sounds

    ERIC Educational Resources Information Center

    Polka, Linda; Rvachew, Susan; Molnar, Monika

    2008-01-01

    The role of selective attention in infant phonetic perception was examined using a distraction masker paradigm. We compared perception of /bu/ versus /gu/ in 6- to 8-month-olds using a visual fixation procedure. Infants were habituated to multiple natural productions of 1 syllable type and then presented 4 test trials (old-new-old-new). Perception…

  20. Markov Chains For Testing Redundant Software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1990-01-01

    Preliminary design developed for validation experiment that addresses problems unique to assuring extremely high quality of multiple-version programs in process-control software. Approach takes into account inertia of controlled system in sense it takes more than one failure of control program to cause controlled system to fail. Verification procedure consists of two steps: experimentation (numerical simulation) and computation, with Markov model for each step.

  1. Maximizing the Information and Validity of a Linear Composite in the Factor Analysis Model for Continuous Item Responses

    ERIC Educational Resources Information Center

    Ferrando, Pere J.

    2008-01-01

    This paper develops results and procedures for obtaining linear composites of factor scores that maximize: (a) test information, and (b) validity with respect to external variables in the multiple factor analysis (FA) model. I treat FA as a multidimensional item response theory model, and use Ackerman's multidimensional information approach based…

  2. Delayed Feedback Disrupts the Procedural-Learning System but Not the Hypothesis-Testing System in Perceptual Category Learning

    ERIC Educational Resources Information Center

    Maddox, W. Todd; Ing, A. David

    2005-01-01

    W. T. Maddox, F. G. Ashby, and C. J. Bohil (2003) found that delayed feedback adversely affects information-integration but not rule-based category learning in support of a multiple-systems approach to category learning. However, differences in the number of stimulus dimensions relevant to solving the task and perceptual similarity failed to rule…

  3. Multiple comparisons permutation test for image based data mining in radiotherapy

    PubMed Central

    2013-01-01

    Comparing incidental dose distributions (i.e. images) of patients with different outcomes is a straightforward way to explore dose-response hypotheses in radiotherapy. In this paper, we introduced a permutation test that compares images, such as dose distributions from radiotherapy, while tackling the multiple comparisons problem. A test statistic Tmax was proposed that summarizes the differences between the images into a single value and a permutation procedure was employed to compute the adjusted p-value. We demonstrated the method in two retrospective studies: a prostate study that relates 3D dose distributions to failure, and an esophagus study that relates 2D surface dose distributions of the esophagus to acute esophagus toxicity. As a result, we were able to identify suspicious regions that are significantly associated with failure (prostate study) or toxicity (esophagus study). Permutation testing allows direct comparison of images from different patient categories and is a useful tool for data mining in radiotherapy. PMID:24365155
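
    The single-step permutation adjustment sketched below conveys the idea of the Tmax approach: relabel patients between outcome groups, recompute the voxel-wise difference image, and compare each observed voxel statistic to the permutation distribution of the maximum. The group-mean difference statistic and the array layout are simplifying assumptions rather than the paper's exact implementation.

    ```python
    import numpy as np

    def permutation_tmax(images_a, images_b, n_perm=2000, seed=0):
        """Permutation test on voxel-wise mean differences between two patient groups.
        Tmax is the largest absolute voxel difference; comparing each voxel's statistic
        to the permutation distribution of Tmax gives p-values adjusted for the
        multiple comparisons across voxels."""
        rng = np.random.default_rng(seed)
        data = np.vstack([images_a, images_b])          # (n_a + n_b, n_voxels)
        n_a = len(images_a)
        observed = np.abs(data[:n_a].mean(0) - data[n_a:].mean(0))
        tmax_null = np.empty(n_perm)
        for b in range(n_perm):
            perm = rng.permutation(len(data))
            diff = data[perm[:n_a]].mean(0) - data[perm[n_a:]].mean(0)
            tmax_null[b] = np.abs(diff).max()
        # Adjusted p-value per voxel: fraction of permutations whose Tmax reaches it
        p_adj = (tmax_null[:, None] >= observed[None, :]).mean(0)
        return observed, p_adj
    ```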

  4. Cutting costs of multiple mini-interviews – changes in reliability and efficiency of the Hamburg medical school admission test between two applications

    PubMed Central

    2014-01-01

    Background Multiple mini-interviews (MMIs) are a valuable tool in medical school selection due to their broad acceptance and promising psychometric properties. Given the high expenses associated with this procedure, the discussion about its feasibility should be extended to cost-effectiveness issues. Methods Following a pilot test of MMIs for medical school admission at Hamburg University in 2009 (HAM-Int), we took several actions to improve reliability and to reduce costs of the subsequent procedure in 2010. For both years, we assessed overall and inter-rater reliabilities based on multilevel analyses. Moreover, we provide a detailed specification of costs, as well as an extrapolation of the interrelation of costs, reliability, and the setup of the procedure. Results The overall reliability of the initial 2009 HAM-Int procedure with twelve stations and an average of 2.33 raters per station was ICC=0.75. Following the improvement actions, in 2010 the ICC remained stable at 0.76, despite the reduction of the process to nine stations and 2.17 raters per station. Moreover, costs were cut down from $915 to $495 per candidate. With the 2010 modalities, we could have reached an ICC of 0.80 with 16 single-rater stations ($570 per candidate). Conclusions With respect to reliability and cost-efficiency, it is generally worthwhile to invest in scoring, rater training and scenario development. Moreover, it is more beneficial to increase the number of stations rather than the number of raters within stations. However, pushing reliability beyond 80% buys only a minor improvement at skyrocketing costs. PMID:24645665

  5. Plasma variations in stress markers: Clinical trial of two anesthetics used in regional block in the extraction of impacted inferior third molars

    PubMed Central

    Arteagoitia, Iciar; Zumarraga, Mercedes; Dávila, Ricardo; Barbier, Luis; Santamaría, Gorka

    2014-01-01

    Objectives: To evaluate the effect of different regional anesthetics (articaine with epinephrine versus prilocaine with felypressin) on stress in the extraction of impacted lower third molars in healthy subjects. Study Design: A prospective single-blind, split-mouth cross-over randomized study was designed, with a control group. The experimental group consisted of 24 otherwise healthy male volunteers, with two impacted lower third molars which were surgically extracted after inferior alveolar nerve block (regional anesthesia), with a fortnight’s interval: the right using 4% articaine with 1:100,000 epinephrine, and the left 3% prilocaine with 1:1,850,000 felypressin. Patients were randomized for the first surgical procedure. To analyze the variation in four stress markers, homovanillic acid, 3-methoxy-4-hydroxyphenylglycol, prolactin and cortisol, 10-mL blood samples were obtained at t = 0, 5, 60, and 120 minutes. The control group consisted of 12 healthy volunteers, who did not undergo either extractions or anesthetic procedures but from whom blood samples were collected and analyzed in the same way. Results: Plasma cortisol increased in the experimental group (multiple range test, P<0.05), the levels being significantly higher in the group receiving 3% prilocaine with 1:1,850,000 felypressin (signed rank test, p<0.0007). There was a significant reduction in homovanillic acid over time in both groups (multiple range test, P<0.05). No significant differences were observed in homovanillic acid, 3-methoxy-4-hydroxyphenylglycol or prolactin concentrations between the experimental and control groups. Conclusions: The effect of regional anesthesia on stress is lower when 4% articaine with 1:100,000 epinephrine is used in this surgical procedure. Key words: Stress markers, epinephrine versus felypressin. PMID:24316704

  6. Dynamic vehicle routing with time windows in theory and practice.

    PubMed

    Yang, Zhiwei; van Osta, Jan-Paul; van Veen, Barry; van Krevelen, Rick; van Klaveren, Richard; Stam, Andries; Kok, Joost; Bäck, Thomas; Emmerich, Michael

    2017-01-01

    The vehicle routing problem is a classical combinatorial optimization problem. This work addresses a variant of the vehicle routing problem with dynamically changing orders and time windows. In real-world applications, demands often change during operation: new orders occur and others are canceled, so new schedules need to be generated on-the-fly. Online optimization algorithms for dynamic vehicle routing address this problem but so far they do not consider time windows. Moreover, to match the scenarios found in real-world problems, adaptations of benchmarks are required. In this paper, a practical problem is modeled based on the daily routing procedure of a delivery company. New orders by customers are introduced dynamically during the working day and need to be integrated into the schedule. A multiple ant colony algorithm combined with powerful local search procedures is proposed to solve the dynamic vehicle routing problem with time windows. The performance is tested on a new benchmark based on simulations of a working day. The problems are taken from Solomon's benchmarks but a certain percentage of the orders are only revealed to the algorithm during operation time. Different versions of the MACS algorithm are tested and a high-performing variant is identified. Finally, the algorithm is tested in situ: in a field study, the algorithm schedules a fleet of cars for a surveillance company. We compare the performance of the algorithm to that of the procedure used by the company and we summarize insights gained from the implementation of the real-world study. The results show that the multiple ant colony algorithm obtains much better solutions on the academic benchmark problems and can also be integrated into a real-world environment.

  7. Wear Resistance of Aluminum Matrix Composites Reinforced with Al2O3 Particles After Multiple Remelting

    NASA Astrophysics Data System (ADS)

    Klasik, Adam; Pietrzak, Krystyna; Makowska, Katarzyna; Sobczak, Jerzy; Rudnik, Dariusz; Wojciechowski, Andrzej

    2016-08-01

    Based on previous results, the commercial composites of A359 (AlSi9Mg) alloy reinforced with 22 vol.% Al2O3 particles were subjected to multiple remelting by means of gravity casting and squeeze-casting procedures. The studies focused on tribological tests, x-ray phase analyses, and microstructural examinations. More promising results were obtained with the squeeze-casting method, mainly because of the reduction of negative microstructural effects such as shrinkage porosity and other microstructural defects and discontinuities. The results showed that direct remelting may be treated as an economically well-founded alternative to other recycling processes. It was underlined that the multiple remelting method must be analyzed separately for each material.

  8. Resampling probability values for weighted kappa with multiple raters.

    PubMed

    Mielke, Paul W; Berry, Kenneth J; Johnston, Janis E

    2008-04-01

    A new procedure to compute weighted kappa with multiple raters is described. A resampling procedure to compute approximate probability values for weighted kappa with multiple raters is presented. Applications of weighted kappa are illustrated with an example analysis of classifications by three independent raters.

  9. An Optimal Bahadur-Efficient Method in Detection of Sparse Signals with Applications to Pathway Analysis in Sequencing Association Studies.

    PubMed

    Dai, Hongying; Wu, Guodong; Wu, Michael; Zhi, Degui

    2016-01-01

    Next-generation sequencing data pose a severe curse of dimensionality, complicating traditional "single marker-single trait" analysis. We propose a two-stage combined p-value method for pathway analysis. The first stage is at the gene level, where we integrate effects within a gene using the Sequence Kernel Association Test (SKAT). The second stage is at the pathway level, where we perform a correlated Lancaster procedure to detect joint effects from multiple genes within a pathway. We show that the Lancaster procedure is optimal in Bahadur efficiency among all combined p-value methods. The Bahadur efficiency, [Formula: see text], compares sample sizes among different statistical tests when signals become sparse in sequencing data, i.e., ε → 0. The optimal Bahadur efficiency ensures that the Lancaster procedure asymptotically requires a minimal sample size to detect sparse signals ([Formula: see text]). The Lancaster procedure can also be applied to meta-analysis. Extensive empirical assessments of exome sequencing data show that the proposed method outperforms Gene Set Enrichment Analysis (GSEA). We applied the competitive Lancaster procedure to meta-analysis data generated by the Global Lipids Genetics Consortium to identify pathways significantly associated with high-density lipoprotein cholesterol, low-density lipoprotein cholesterol, triglycerides, and total cholesterol.
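
    A minimal sketch of the uncorrelated Lancaster combination is given below; each p-value is weighted through its own chi-square degrees of freedom, and Fisher's method is the special case in which every weight equals 2. The gene-level p-values and weights are illustrative, and the correlation adjustment used in the paper is not included.

    ```python
    from scipy.stats import chi2

    def lancaster_combination(pvals, dofs):
        """Lancaster's generalization of Fisher's method: each p-value is transformed
        with the chi-square quantile at its own degrees of freedom (its weight), the
        transformed values are summed, and the sum is referred to a chi-square
        distribution with the total degrees of freedom (independence assumed)."""
        stat = sum(chi2.isf(p, df) for p, df in zip(pvals, dofs))
        return chi2.sf(stat, sum(dofs))

    # Example: gene-level SKAT p-values weighted by (hypothetical) gene size
    gene_pvals = [0.002, 0.08, 0.4, 0.6]
    gene_weights = [4, 2, 2, 2]          # all weights equal to 2 recovers Fisher's method
    print(lancaster_combination(gene_pvals, gene_weights))
    ```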

  10. Effect of pH Test-Strip Characteristics on Accuracy of Readings.

    PubMed

    Metheny, Norma A; Gunn, Emily M; Rubbelke, Cynthia S; Quillen, Terrilynn Fox; Ezekiel, Uthayashanker R; Meert, Kathleen L

    2017-06-01

    Little is known about characteristics of colorimetric pH test strips that are most likely to be associated with accurate interpretations in clinical situations. To compare the accuracy of 4 pH test strips with varying characteristics (ie, multiple vs single colorimetric squares per calibration, and differing calibration units [1.0 vs 0.5]). A convenience sample of 100 upper-level nursing students with normal color vision was recruited to evaluate the accuracy of the test strips. Six buffer solutions (pH range, 3.0 to 6.0) were used during the testing procedure. Each of the 100 participants performed 20 pH tests in random order, providing a total of 2000 readings. The sensitivity and specificity of each test strip was computed. In addition, the degree to which the test strips under- or overestimated the pH values was analyzed using descriptive statistics. Our criterion for correct readings was an exact match with the pH buffer solution being evaluated. Although none of the test strips evaluated in our study was 100% accurate at all of the measured pH values, those with multiple squares per pH calibration were clearly superior overall to those with a single test square. Test strips with multiple squares per calibration were associated with greater overall accuracy than test strips with a single square per calibration. However, because variable degrees of error were observed in all of the test strips, use of a pH meter is recommended when precise readings are crucial. ©2017 American Association of Critical-Care Nurses.

  11. Uncertainty Analysis of Inertial Model Attitude Sensor Calibration and Application with a Recommended New Calibration Method

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.

  12. Hierarchical control of procedural and declarative category-learning systems

    PubMed Central

    Turner, Benjamin O.; Crossley, Matthew J.; Ashby, F. Gregory

    2017-01-01

    Substantial evidence suggests that human category learning is governed by the interaction of multiple qualitatively distinct neural systems. In this view, procedural memory is used to learn stimulus-response associations, and declarative memory is used to apply explicit rules and test hypotheses about category membership. However, much less is known about the interaction between these systems: how is control passed between systems as they interact to influence motor resources? Here, we used fMRI to elucidate the neural correlates of switching between procedural and declarative categorization systems. We identified a key region of the cerebellum (left Crus I) whose activity was bidirectionally modulated depending on switch direction. We also identified regions of the default mode network (DMN) that were selectively connected to left Crus I during switching. We propose that the cerebellum—in coordination with the DMN—serves a critical role in passing control between procedural and declarative memory systems. PMID:28213114

  13. Specialty Payment Model Opportunities and Assessment: Gastroenterology and Cardiology Model Design Report.

    PubMed

    Mulcahy, Andrew W; Chan, Chris; Hirshman, Samuel; Huckfeldt, Peter J; Kofner, Aaron; Liu, Jodi L; Lovejoy, Susan L; Popescu, Ioana; Timbie, Justin W; Hussey, Peter S

    2015-07-15

    Gastroenterology and cardiology services are common and costly among Medicare beneficiaries. Episode-based payment, which aims to create incentives for high-quality, low-cost care, has been identified as a promising alternative payment model. This article describes research related to the design of episode-based payment models for ambulatory gastroenterology and cardiology services for possible testing by the Center for Medicare and Medicaid Innovation at the Centers for Medicare and Medicaid Services (CMS). The authors analyzed Medicare claims data to describe the frequency and characteristics of gastroenterology and cardiology index procedures, the practices that delivered index procedures, and the patients that received index procedures. The results of these analyses can help inform CMS decisions about the definition of episodes in an episode-based payment model; payment adjustments for service setting, multiple procedures, or other factors; and eligibility for the payment model.

  14. Comparison of design strategies for a three-arm clinical trial with time-to-event endpoint: Power, time-to-analysis, and operational aspects.

    PubMed

    Asikanius, Elina; Rufibach, Kaspar; Bahlo, Jasmin; Bieska, Gabriele; Burger, Hans Ulrich

    2016-11-01

    To optimize resources, randomized clinical trials with multiple arms can be an attractive option to simultaneously test various treatment regimens in pharmaceutical drug development. The motivation for this work was the successful conduct and positive final outcome of a three-arm randomized clinical trial primarily assessing whether obinutuzumab plus chlorambucil in patients with chronic lymphocytic leukemia and coexisting conditions is superior to chlorambucil alone based on a time-to-event endpoint. The inference strategy of this trial was based on a closed testing procedure. We compare this strategy to three potential alternatives to run a three-arm clinical trial with a time-to-event endpoint. The primary goal is to quantify the differences between these strategies in terms of the time it takes until the first analysis and thus potential approval of a new drug, number of required events, and power. Operational aspects of implementing the various strategies are discussed. In conclusion, using a closed testing procedure results in the shortest time to the first analysis with a minimal loss in power. Therefore, closed testing procedures should be part of the statistician's standard clinical trials toolbox when planning multiarm clinical trials. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
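    For readers unfamiliar with closed testing, the sketch below shows the generic principle for a small hypothesis family, using Bonferroni local tests (which makes the closure equivalent to the Holm procedure); the trial's actual strategy and local tests may differ.

        from itertools import combinations

        def closed_test(p_values, alpha=0.05):
            """Closed testing with Bonferroni local tests (equivalent to Holm).

            An elementary hypothesis is rejected only if every intersection
            hypothesis containing it is rejected by its level-alpha local test.
            """
            m = len(p_values)
            rejected = []
            for i in range(m):
                reject_i = True
                for k in range(1, m + 1):
                    for subset in combinations(range(m), k):
                        if i in subset:
                            local_p = min(p_values[j] for j in subset) * len(subset)
                            if local_p > alpha:
                                reject_i = False
                rejected.append(reject_i)
            return rejected

        # Example: two treatment-versus-control comparisons in a three-arm trial
        print(closed_test([0.012, 0.030], alpha=0.05))  # -> [True, True]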

  15. Rapid insulin sensitivity test (RIST).

    PubMed

    Lautt, W W; Wang, X; Sadri, P; Legare, D J; Macedo, M P

    1998-12-01

    A rapid insulin sensitivity test (RIST) was recently introduced to assess insulin action in vivo (H. Xie, L. Zhu, Y.L. Zhang, D.J. Legare, and W.W. Lautt. J. Pharmacol. Toxicol. Methods, 35: 77-82. 1996). This technical report describes the current recommended standard operating procedure for the use of the RIST in rats based upon additional experience with approximately 100 tests. We describe the manufacture and use of an arterial-venous shunt that allows rapid multiple arterial samples and intravenous administration of drugs. The RIST procedure involves determination of a stable arterial glucose baseline to define the ideal euglycemic level to be maintained following a 5-min infusion of insulin, with the RIST index being the amount of glucose required to be infused to maintain euglycemia over the test period. Insulin administration by a 5-min infusion is preferable to a 30-s bolus administration. No significant difference was determined between the use of Toronto pork-beef or human insulin. Four consecutive RISTs were carried out in the same animal over 4-5 h with no tendency for change with time. The RIST index is sufficiently sensitive and reproducible to permit establishment of insulin dose-response curves and interference of insulin action by elimination of hepatic parasympathetic nerves, using atropine. This technical report provides the current recommended standard operating procedure for the RIST.

  16. An improved procedure for the validation of satellite-based precipitation estimates

    NASA Astrophysics Data System (ADS)

    Tang, Ling; Tian, Yudong; Yan, Fang; Habib, Emad

    2015-09-01

    The objective of this study is to propose and test a new procedure to improve the validation of remote-sensing, high-resolution precipitation estimates. Our recent studies show that many conventional validation measures do not accurately capture the unique error characteristics in precipitation estimates and therefore do not adequately inform either data producers or users. The proposed new validation procedure has two steps: 1) an error decomposition approach to separate the total retrieval error into three independent components: hit error, false precipitation, and missed precipitation; and 2) further analysis of the hit error based on a multiplicative error model. In the multiplicative error model, the error features are captured by three model parameters. In this way, the multiplicative error model separates systematic and random errors, leading to more accurate quantification of the uncertainties. The proposed procedure is used to quantitatively evaluate the two recent versions (Versions 6 and 7) of TRMM's Multi-satellite Precipitation Analysis (TMPA) research and real-time product suite (3B42 and 3B42RT) for seven years (2005-2011) over the continental United States (CONUS). The gauge-based National Centers for Environmental Prediction (NCEP) Climate Prediction Center (CPC) near-real-time daily precipitation analysis is used as the reference. In addition, the radar-based NCEP Stage IV precipitation data are also model-fitted to verify the effectiveness of the multiplicative error model. The results show that the winter total bias is dominated by the missed precipitation over the west coastal areas and the Rocky Mountains, and by the false precipitation over large areas in the Midwest. The summer total bias comes largely from the hit bias in the central US. Meanwhile, the new version (V7) tends to produce more rainfall at the higher rain rates, which moderates the significant underestimation exhibited in the previous V6 products. Moreover, the error analysis from the multiplicative error model provides a clear and concise picture of the systematic and random errors, with both versions of 3B42RT having higher errors, to varying degrees, than their research (post-real-time) counterparts. The new V7 algorithm shows obvious improvements in reducing random errors in both winter and summer seasons, compared to its predecessor V6. Stage IV, as expected, surpasses the satellite-based datasets in all the metrics over CONUS. Based on these results, we recommend that the new procedure be adopted for routine validation of satellite-based precipitation datasets, and we expect the procedure to work effectively for the higher resolution data to be produced in the Global Precipitation Measurement (GPM) era.
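    The two-step procedure can be illustrated on synthetic data: classify each reference/estimate pair as a hit, missed precipitation, or false precipitation, then fit the three-parameter multiplicative model est = a * ref**b * exp(eps) to the hits by ordinary least squares in log space. All numbers below are synthetic and serve only to show the mechanics, not TMPA error statistics.

        import numpy as np

        rng = np.random.default_rng(1)
        ref = rng.gamma(2.0, 4.0, 500)                             # reference rain rates
        ref[rng.random(500) < 0.3] = 0.0                           # dry reference days
        sat = 0.9 * ref**1.1 * np.exp(rng.normal(0, 0.3, 500))     # hit-error structure
        sat[rng.random(500) < 0.05] = 0.0                          # missed precipitation
        sat[(ref == 0) & (rng.random(500) < 0.2)] = 0.3            # false precipitation

        thresh = 0.1                                               # rain / no-rain cutoff
        hits = (ref > thresh) & (sat > thresh)
        missed = (ref > thresh) & (sat <= thresh)
        false = (ref <= thresh) & (sat > thresh)

        # Multiplicative error model for the hits: sat = a * ref**b * exp(eps)
        x, y = np.log(ref[hits]), np.log(sat[hits])
        b, log_a = np.polyfit(x, y, 1)                             # slope, intercept
        sigma = np.std(y - (log_a + b * x))                        # random-error spread
        print(f"a={np.exp(log_a):.2f}, b={b:.2f}, sigma={sigma:.2f}")
        print(f"hit/missed/false fractions: {hits.mean():.2f} {missed.mean():.2f} {false.mean():.2f}")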

  17. Digital sun sensor multi-spot operation.

    PubMed

    Rufino, Giancarlo; Grassi, Michele

    2012-11-28

    The operation and testing of a multi-spot digital sun sensor for precise sun-line determination are described. The image-forming system consists of an opaque mask with multiple pinhole apertures producing multiple, simultaneous, spot-like images of the sun on the focal plane. The sun-line precision can be improved by averaging multiple simultaneous measurements. Nevertheless, operating the sensor over a wide field of view requires acquiring and processing images in which the number of sun spots and the related intensity level vary widely. To this end, a reliable and robust image acquisition procedure based on a variable shutter time has been adopted, together with a calibration function that also exploits knowledge of the sun-spot array size. The main focus of the present paper is the experimental validation of the wide-field-of-view operation of the sensor using a sensor prototype and a laboratory test facility. Results demonstrate that high measurement precision can be maintained even at large off-boresight angles.

  18. Integrated Analysis of Pharmacologic, Clinical, and SNP Microarray Data using Projection onto the Most Interesting Statistical Evidence with Adaptive Permutation Testing

    PubMed Central

    Pounds, Stan; Cao, Xueyuan; Cheng, Cheng; Yang, Jun; Campana, Dario; Evans, William E.; Pui, Ching-Hon; Relling, Mary V.

    2010-01-01

    Powerful methods for integrated analysis of multiple biological data sets are needed to maximize interpretation capacity and acquire meaningful knowledge. We recently developed Projection Onto the Most Interesting Statistical Evidence (PROMISE). PROMISE is a statistical procedure that incorporates prior knowledge about the biological relationships among endpoint variables into an integrated analysis of microarray gene expression data with multiple biological and clinical endpoints. Here, PROMISE is adapted to the integrated analysis of pharmacologic, clinical, and genome-wide genotype data, incorporating knowledge about the biological relationships among pharmacologic and clinical response data. An efficient permutation-testing algorithm is introduced so that statistical calculations are computationally feasible in this higher-dimension setting. The new method is applied to a pediatric leukemia data set. The results clearly indicate that PROMISE is a powerful statistical tool for identifying genomic features that exhibit a biologically meaningful pattern of association with multiple endpoint variables. PMID:21516175

  19. Intercorrelation of P and Pn Recordings for the North Korean Nuclear Tests

    NASA Astrophysics Data System (ADS)

    Lay, T.; Voytan, D.; Ohman, J.

    2017-12-01

    The relative waveform analysis procedure called Intercorrelation is applied to Pn and P waveforms at regional and teleseismic distances, respectively, for the 5 underground nuclear tests at the North Korean nuclear test site. Intercorrelation is a waveform equalization procedure that parameterizes the effective source function for a given explosion, including the reduced velocity potential convolved with a simplified Green's function that accounts for the free surface reflections (pPn and pP), and possibly additional arrivals such as spall. The source function for one event is convolved with the signal at a given station for a second event, and the recording at the same station for the first event is convolved with the source function for the second event. This procedure eliminates the need to predict the complex receiver function effects at the station, which are typically not well-known for short-period response. The parameters of the source function representation are yield and burial depth, and an explosion source model is required. Here we use the Mueller-Murphy representation of the explosion reduced velocity potential, which explicitly depends on yield and burial depth. We then search over yield and burial depth ranges for both events, constrained by a priori information about reasonable ranges of parameters, to optimize the simultaneous match of multiple station signals for the two events. This procedure, applied to the apparently overburied North Korean nuclear tests (no indications of spall complexity), assuming simple free surface interactions (elastic reflection from a flat surface), provides excellent waveform equalization for all combinations of 5 nuclear tests.
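    The path-cancellation idea behind intercorrelation can be shown with a toy example (not the Mueller-Murphy source model): because convolution commutes, convolving a trial source function for event B with the recording of event A should reproduce the trial source function for event A convolved with the recording of event B, whatever the unknown path and receiver response, provided the source parameterizations are correct.

        import numpy as np

        rng = np.random.default_rng(3)
        dt, n = 0.01, 400
        t = np.arange(n) * dt

        def source(scale, echo_delay):
            """Toy effective source: damped pulse plus a delayed free-surface echo (pP)."""
            pulse = scale * np.exp(-t / 0.15) * np.sin(2 * np.pi * 4 * t)
            echo = np.zeros(n)
            lag = int(echo_delay / dt)
            echo[lag:] = -0.8 * pulse[:n - lag]
            return pulse + echo

        # Shared path/receiver response (unknown in practice)
        path = rng.standard_normal(n) * np.exp(-t / 0.5)

        src_a, src_b = source(1.0, 0.30), source(2.2, 0.42)
        rec_a = np.convolve(src_a, path)[:n]
        rec_b = np.convolve(src_b, path)[:n]

        # Cross-convolution swaps the sources, so the path response cancels
        left = np.convolve(src_b, rec_a)[:n]
        right = np.convolve(src_a, rec_b)[:n]
        print("normalized misfit:", np.sum((left - right) ** 2) / np.sum(right ** 2))

    In practice the source parameters (yield and burial depth) are searched over until this kind of misfit is minimized simultaneously across stations.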

  20. Testing coordinate measuring arms with a geometric feature-based gauge: in situ field trials

    NASA Astrophysics Data System (ADS)

    Cuesta, E.; Alvarez, B. J.; Patiño, H.; Telenti, A.; Barreiro, J.

    2016-05-01

    This work describes in detail the definition of a procedure for calibrating and evaluating coordinate measuring arms (AACMMs or CMAs). CMAs are portable coordinate measuring machines that have been widely accepted in industry despite their sensitivity to the skill and experience of the operator in charge of the inspection task. The procedure proposed here is based on the use of a dimensional gauge that incorporates multiple geometric features, specifically designed for evaluating the measuring technique when CMAs are used, at company facilities (workshops or laboratories) and by the usual operators who handle these devices in their daily work. After establishing the procedure and manufacturing the feature-based gauge, the research project was complemented with diverse in situ field tests performed with the collaboration of companies that use these devices in their inspection tasks. Some of the results are presented here, not only comparing different operators but also comparing different companies. The knowledge extracted from these experiments has allowed the procedure to be validated, the defects of the methodologies currently used for in situ inspections to be detected, and substantial improvements for increasing the reliability of these portable instruments to be proposed.

  1. I. Excluded volume effects in Ising cluster distributions and nuclear multifragmentation. II. Multiple-chance effects in alpha-particle evaporation

    NASA Astrophysics Data System (ADS)

    Breus, Dimitry Eugene

    In Part I, geometric clusters of the Ising model are studied as possible model clusters for nuclear multifragmentation. These clusters may not be treated as non-interacting (an ideal gas) because of the excluded volume effect, which is predominantly an artifact of the clusters' finite size. Interaction significantly complicates the use of clusters in the analysis of thermodynamic systems. Stillinger's theory is used as a basis for the analysis, which within the RFL (Reiss, Frisch, Lebowitz) fluid-of-spheres approximation produces a prediction for cluster concentrations well obeyed by geometric clusters of the Ising model. If the thermodynamic condition of phase coexistence is met, these concentrations can be incorporated into a differential equation procedure of moderate complexity to elucidate the liquid-vapor phase diagram of the system with cluster interaction included. The drawback of increased complexity is outweighed by the reward of greater accuracy of the phase diagram, as demonstrated for the Ising model. A novel nuclear-cluster analysis procedure is developed by modifying Fisher's model to include cluster interaction and employing the differential equation procedure to obtain thermodynamic variables. With this procedure applied to geometric clusters, guidelines are developed for identifying the excluded volume effect in nuclear multifragmentation. In Part II, an explanation is offered for the recently observed oscillations in the energy spectra of alpha-particles emitted from hot compound nuclei. Contrary to what was previously expected, the oscillations are attributed to the multiple-chance nature of alpha-evaporation. In a semi-empirical fashion, this assumption is successfully confirmed by a technique of two-spectra decomposition, which treats experimental alpha-spectra as having contributions from at least two independent emitters. Building upon the success of the multiple-chance explanation of the oscillations, Moretto's single-chance evaporation theory is augmented to include multiple-chance emission and tested on experimental data, yielding positive results.

  2. Type-II generalized family-wise error rate formulas with application to sample size determination.

    PubMed

    Delorme, Phillipe; de Micheaux, Pierre Lafaye; Liquet, Benoit; Riou, Jérémie

    2016-07-20

    Multiple endpoints are increasingly used in clinical trials. The significance of some of these clinical trials is established if at least r null hypotheses are rejected among m that are simultaneously tested. The usual approach in multiple hypothesis testing is to control the family-wise error rate, which is defined as the probability that at least one type-I error is made. More recently, the q-generalized family-wise error rate has been introduced to control the probability of making at least q false rejections. For procedures controlling this global type-I error rate, we define a type-II r-generalized family-wise error rate, which is directly related to the r-power, defined as the probability of rejecting at least r false null hypotheses. We obtain very general power formulas that can be used to compute the sample size for single-step and step-wise procedures. These are implemented in our R package rPowerSampleSize, available on CRAN, making them directly available to end users. Complexities of the formulas are presented to gain insight into computation time issues. A comparison with a Monte Carlo strategy is also presented. We compute sample sizes for two clinical trials involving multiple endpoints: one designed to investigate the effectiveness of a drug against acute heart failure and the other for the immunogenicity of a vaccine strategy against pneumococcus. Copyright © 2016 John Wiley & Sons, Ltd.
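    The r-power and the q-generalized family-wise error rate can also be approximated by simulation, which is the Monte Carlo strategy the formulas are compared against. The sketch below assumes independent endpoints, two-sample z-tests, and a single-step Bonferroni procedure; it is not the paper's analytic formulas nor the rPowerSampleSize implementation.

        import numpy as np
        from scipy import stats

        def mc_power(n, effects, m_null, r=2, q=1, alpha=0.05, nsim=20000, seed=0):
            """Monte Carlo r-power and q-gFWER for a single-step Bonferroni test.

            `effects` are standardized mean differences for the false nulls and
            `m_null` further endpoints are true nulls; n is the per-group size.
            """
            rng = np.random.default_rng(seed)
            m = len(effects) + m_null
            crit = stats.norm.isf(alpha / (2 * m))          # two-sided Bonferroni cutoff
            shift = np.concatenate([np.sqrt(n / 2) * np.asarray(effects),
                                    np.zeros(m_null)])
            z = rng.normal(shift, 1.0, size=(nsim, m))
            reject = np.abs(z) > crit
            r_power = (reject[:, :len(effects)].sum(axis=1) >= r).mean()
            q_gfwer = (reject[:, len(effects):].sum(axis=1) >= q).mean()
            return r_power, q_gfwer

        print(mc_power(n=80, effects=[0.5, 0.5, 0.4], m_null=2))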

  3. Characterization of Friction Joints Subjected to High Levels of Random Vibration

    NASA Technical Reports Server (NTRS)

    deSantos, Omar; MacNeal, Paul

    2012-01-01

    This paper describes the test program in detail including test sample description, test procedures, and vibration test results of multiple test samples. The material pairs used in the experiment were Aluminum-Aluminum, Aluminum- Dicronite coated Aluminum, and Aluminum-Plasmadize coated Aluminum. Levels of vibration for each set of twelve samples of each material pairing were gradually increased until all samples experienced substantial displacement. Data was collected on 1) acceleration in all three axes, 2) relative static displacement between vibration runs utilizing photogrammetry techniques, and 3) surface galling and contaminant generation. This data was used to estimate the values of static friction during random vibratory motion when "stick-slip" occurs and compare these to static friction coefficients measured before and after vibration testing.

  4. Study of solution procedures for nonlinear structural equations

    NASA Technical Reports Server (NTRS)

    Young, C. T., II; Jones, R. F., Jr.

    1980-01-01

    A method for the reduction of the cost of solution of large nonlinear structural equations was developed. Verification was made using the MARC-STRUC structure finite element program with test cases involving single and multiple degrees of freedom for static geometric nonlinearities. The method developed was designed to exist within the envelope of accuracy and convergence characteristic of the particular finite element methodology used.

  5. Designing Adaptive Instructional Environments: Insights from Empirical Evidence

    DTIC Science & Technology

    2011-10-01

    theorems. Cohen's f effect size for pretest to posttest gain, averaged across different problems = 0.46. 7 Basis for Adaptation Ability of...problems and took a posttest. Measures of Learning: 26-item multiple-choice pretest and posttest. Effect size on posttest scores as measured by...solving algebraic equations. Measures of Learning: Pretest and posttest using rapid diagnostic testing procedure: Student had to provide their

  6. On testing an unspecified function through a linear mixed effects model with multiple variance components

    PubMed Central

    Wang, Yuanjia; Chen, Huaihou

    2012-01-01

    We examine a generalized F-test of a nonparametric function through penalized splines and a linear mixed effects model representation. With a mixed effects model representation of penalized splines, we imbed the test of an unspecified function into a test of some fixed effects and a variance component in a linear mixed effects model with nuisance variance components under the null. The procedure can be used to test a nonparametric function or varying-coefficient with clustered data, compare two spline functions, test the significance of an unspecified function in an additive model with multiple components, and test a row or a column effect in a two-way analysis of variance model. Through a spectral decomposition of the residual sum of squares, we provide a fast algorithm for computing the null distribution of the test, which significantly improves the computational efficiency over bootstrap. The spectral representation reveals a connection between the likelihood ratio test (LRT) in a multiple variance components model and a single component model. We examine our methods through simulations, where we show that the power of the generalized F-test may be higher than the LRT, depending on the hypothesis of interest and the true model under the alternative. We apply these methods to compute the genome-wide critical value and p-value of a genetic association test in a genome-wide association study (GWAS), where the usual bootstrap is computationally intensive (up to 10^8 simulations) and asymptotic approximation may be unreliable and conservative. PMID:23020801

  7. On testing an unspecified function through a linear mixed effects model with multiple variance components.

    PubMed

    Wang, Yuanjia; Chen, Huaihou

    2012-12-01

    We examine a generalized F-test of a nonparametric function through penalized splines and a linear mixed effects model representation. With a mixed effects model representation of penalized splines, we imbed the test of an unspecified function into a test of some fixed effects and a variance component in a linear mixed effects model with nuisance variance components under the null. The procedure can be used to test a nonparametric function or varying-coefficient with clustered data, compare two spline functions, test the significance of an unspecified function in an additive model with multiple components, and test a row or a column effect in a two-way analysis of variance model. Through a spectral decomposition of the residual sum of squares, we provide a fast algorithm for computing the null distribution of the test, which significantly improves the computational efficiency over bootstrap. The spectral representation reveals a connection between the likelihood ratio test (LRT) in a multiple variance components model and a single component model. We examine our methods through simulations, where we show that the power of the generalized F-test may be higher than the LRT, depending on the hypothesis of interest and the true model under the alternative. We apply these methods to compute the genome-wide critical value and p-value of a genetic association test in a genome-wide association study (GWAS), where the usual bootstrap is computationally intensive (up to 10(8) simulations) and asymptotic approximation may be unreliable and conservative. © 2012, The International Biometric Society.

  8. Program CONTRAST--A general program for the analysis of several survival or recovery rate estimates

    USGS Publications Warehouse

    Hines, J.E.; Sauer, J.R.

    1989-01-01

    This manual describes the use of program CONTRAST, which implements a generalized procedure for the comparison of several rate estimates. This method can be used to test both simple and composite hypotheses about rate estimates, and we discuss its application to multiple comparisons of survival rate estimates. Several examples of the use of program CONTRAST are presented. Program CONTRAST will run on IBM-compatible computers, and requires estimates of the rates to be tested, along with associated variance and covariance estimates.
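    The kind of comparison CONTRAST implements can be written generically as a Wald-type chi-square test of linear contrasts among the rate estimates, using their variance-covariance matrix. The sketch below is not the CONTRAST source code, and the survival estimates are illustrative.

        import numpy as np
        from scipy import stats

        def contrast_test(theta, cov, C):
            """Chi-square test of H0: C @ theta = 0 for several rate estimates."""
            theta = np.asarray(theta, float)
            cov = np.asarray(cov, float)
            C = np.atleast_2d(np.asarray(C, float))
            d = C @ theta
            stat = float(d @ np.linalg.solve(C @ cov @ C.T, d))
            df = np.linalg.matrix_rank(C)
            return stat, df, stats.chi2.sf(stat, df)

        # Are three annual survival estimates equal?  (independent estimates assumed)
        theta = [0.55, 0.60, 0.48]
        cov = np.diag([0.0009, 0.0012, 0.0010])
        C = [[1, -1, 0], [0, 1, -1]]                        # pairwise differences
        print(contrast_test(theta, cov, C))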

  9. Endurance of Multiplication Fact Fluency for Students with Attention Deficit Hyperactivity Disorder

    ERIC Educational Resources Information Center

    Brady, Kelly K.; Kubina, Richard M., Jr.

    2010-01-01

    This study examines the relationship between a critical learning outcome of behavioral fluency and endurance, by comparing the effects of two practice procedures on multiplication facts two through nine. The first procedure, called whole time practice trial, consisted of an uninterrupted 1 minute practice time. The second procedure, endurance…

  10. Spatial scan statistics for detection of multiple clusters with arbitrary shapes.

    PubMed

    Lin, Pei-Sheng; Kung, Yi-Hung; Clayton, Murray

    2016-12-01

    In applying scan statistics for public health research, it would be valuable to develop a detection method for multiple clusters that accommodates spatial correlation and covariate effects in an integrated model. In this article, we connect the concepts of the likelihood ratio (LR) scan statistic and the quasi-likelihood (QL) scan statistic to provide a series of detection procedures sufficiently flexible to apply to clusters of arbitrary shape. First, we use an independent scan model for detection of clusters and then a variogram tool to examine the existence of spatial correlation and regional variation based on residuals of the independent scan model. When the estimate of regional variation is significantly different from zero, a mixed QL estimating equation is developed to estimate coefficients of geographic clusters and covariates. We use the Benjamini-Hochberg procedure (1995) to find a threshold for p-values to address the multiple testing problem. A quasi-deviance criterion is used to regroup the estimated clusters to find geographic clusters with arbitrary shapes. We conduct simulations to compare the performance of the proposed method with other scan statistics. For illustration, the method is applied to enterovirus data from Taiwan. © 2016, The International Biometric Society.
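    For reference, the Benjamini-Hochberg step-up rule used to choose the p-value threshold fits in a few lines; the cluster p-values below are made up.

        import numpy as np

        def bh_threshold(p_values, fdr=0.05):
            """Largest Benjamini-Hochberg (1995) cutoff; reject all p <= cutoff."""
            p = np.sort(np.asarray(p_values, float))
            m = p.size
            below = p <= fdr * np.arange(1, m + 1) / m
            return p[below].max() if below.any() else 0.0

        p_clusters = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
        cutoff = bh_threshold(p_clusters, fdr=0.05)
        print(cutoff, [p <= cutoff for p in p_clusters])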

  11. Missing data treatments matter: an analysis of multiple imputation for anterior cervical discectomy and fusion procedures.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Cui, Jonathan J; Basques, Bryce A; Albert, Todd J; Grauer, Jonathan N

    2018-04-09

    The presence of missing data is a limitation of large datasets, including the National Surgical Quality Improvement Program (NSQIP). In addressing this issue, most studies use complete case analysis, which excludes cases with missing data, thus potentially introducing selection bias. Multiple imputation, a statistically rigorous approach that approximates missing data and preserves sample size, may be an improvement over complete case analysis. The present study aims to evaluate the impact of using multiple imputation in comparison with complete case analysis for assessing the associations between preoperative laboratory values and adverse outcomes following anterior cervical discectomy and fusion (ACDF) procedures. This is a retrospective review of prospectively collected data. Patients undergoing one-level ACDF were identified in NSQIP 2012-2015. Perioperative adverse outcome variables assessed included the occurrence of any adverse event, severe adverse events, and hospital readmission. Missing preoperative albumin and hematocrit values were handled using complete case analysis and multiple imputation. These preoperative laboratory levels were then tested for associations with 30-day postoperative outcomes using logistic regression. A total of 11,999 patients were included. Of this cohort, 63.5% of patients had missing preoperative albumin and 9.9% had missing preoperative hematocrit. When using complete case analysis, only 4,311 patients were studied. The removed patients were significantly younger, healthier, of a common body mass index, and male. Logistic regression analysis failed to identify either preoperative hypoalbuminemia or preoperative anemia as significantly associated with adverse outcomes. When employing multiple imputation, all 11,999 patients were included. Preoperative hypoalbuminemia was significantly associated with the occurrence of any adverse event and severe adverse events. Preoperative anemia was significantly associated with the occurrence of any adverse event, severe adverse events, and hospital readmission. Multiple imputation is a rigorous statistical procedure that is being increasingly used to address missing values in large datasets. Using this technique for ACDF avoided the loss of cases that may have affected the representativeness and power of the study and led to different results than complete case analysis. Multiple imputation should be considered for future spine studies. Copyright © 2018 Elsevier Inc. All rights reserved.
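    A minimal sketch of the impute-then-pool workflow (not the authors' exact model) using scikit-learn's chained-equation imputer and Rubin's rules is shown below. It assumes a pandas DataFrame whose binary outcome column (for example any_adverse_event) is fully observed and whose laboratory predictors (for example albumin and hematocrit) contain missing values; the column names are hypothetical.

        import numpy as np
        import pandas as pd
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer
        import statsmodels.api as sm

        def pooled_logit(df, outcome, predictors, m=5):
            """Multiple imputation by chained equations, pooled with Rubin's rules."""
            coefs, variances = [], []
            for i in range(m):
                imp = IterativeImputer(sample_posterior=True, random_state=i)
                completed = pd.DataFrame(imp.fit_transform(df), columns=df.columns)
                X = sm.add_constant(completed[predictors])
                fit = sm.Logit(completed[outcome], X).fit(disp=0)
                coefs.append(fit.params.values)
                variances.append(fit.bse.values ** 2)
            coefs, variances = np.array(coefs), np.array(variances)
            pooled = coefs.mean(axis=0)
            within, between = variances.mean(axis=0), coefs.var(axis=0, ddof=1)
            se = np.sqrt(within + (1 + 1 / m) * between)
            return pd.DataFrame({"coef": pooled, "se": se}, index=["const"] + predictors)

        # Hypothetical usage:
        # pooled_logit(df, "any_adverse_event", ["albumin", "hematocrit", "age"])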

  12. A Browser-Server-Based Tele-audiology System That Supports Multiple Hearing Test Modalities

    PubMed Central

    Yao, Daoyuan; Givens, Gregg

    2015-01-01

    Introduction: Millions of global citizens suffering from hearing disorders have limited or no access to much needed hearing healthcare. Although tele-audiology presents a solution to alleviate this problem, existing remote hearing diagnosis systems support only pure-tone tests, leaving speech and other test procedures unsolved, due to the lack of software and hardware to enable communication required between audiologists and their remote patients. This article presents a comprehensive remote hearing test system that integrates the two most needed hearing test procedures: a pure-tone audiogram and a speech test. Materials and Methods: This enhanced system is composed of a Web application server, an embedded smart Internet-Bluetooth® (Bluetooth SIG, Kirkland, WA) gateway (or console device), and a Bluetooth-enabled audiometer. Several graphical user interfaces and a relational database are hosted on the application server. The console device has been designed to support the tests and auxiliary communication between the local site and the remote site. Results: The study was conducted at an audiology laboratory. Pure-tone audiogram and speech test results from volunteers tested with this tele-audiology system are comparable with results from the traditional face-to-face approach. Conclusions: This browser-server-based comprehensive tele-audiology offers a flexible platform to expand hearing services to traditionally underserved groups. PMID:25919376

  13. A Browser-Server-Based Tele-audiology System That Supports Multiple Hearing Test Modalities.

    PubMed

    Yao, Jianchu Jason; Yao, Daoyuan; Givens, Gregg

    2015-09-01

    Millions of global citizens suffering from hearing disorders have limited or no access to much needed hearing healthcare. Although tele-audiology presents a solution to alleviate this problem, existing remote hearing diagnosis systems support only pure-tone tests, leaving speech and other test procedures unsolved, due to the lack of software and hardware to enable communication required between audiologists and their remote patients. This article presents a comprehensive remote hearing test system that integrates the two most needed hearing test procedures: a pure-tone audiogram and a speech test. This enhanced system is composed of a Web application server, an embedded smart Internet-Bluetooth(®) (Bluetooth SIG, Kirkland, WA) gateway (or console device), and a Bluetooth-enabled audiometer. Several graphical user interfaces and a relational database are hosted on the application server. The console device has been designed to support the tests and auxiliary communication between the local site and the remote site. The study was conducted at an audiology laboratory. Pure-tone audiogram and speech test results from volunteers tested with this tele-audiology system are comparable with results from the traditional face-to-face approach. This browser-server-based comprehensive tele-audiology offers a flexible platform to expand hearing services to traditionally underserved groups.

  14. 29 CFR 1926.753 - Hoisting and rigging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... lift rigging procedure. (1) A multiple lift shall only be performed if the following criteria are met: (i) A multiple lift rigging assembly is used; (ii) A maximum of five members are hoisted per lift... multiple lift have been trained in these procedures in accordance with § 1926.761(c)(1). (v) No crane is...

  15. 13 CFR 121.407 - What are the size procedures for multiple item procurements?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Requirements for Government Procurement § 121.407 What are the size procedures for multiple item procurements? If a procurement calls for two or more specific end items or types of services with different size... multiple item procurements? 121.407 Section 121.407 Business Credit and Assistance SMALL BUSINESS...

  16. Leaching of CCA-treated wood: implications for waste disposal.

    PubMed

    Townsend, Timothy; Tolaymat, Thabet; Solo-Gabriele, Helena; Dubey, Brajesh; Stook, Kristin; Wadanambi, Lakmini

    2004-10-18

    Leaching of arsenic, chromium, and copper from chromated copper arsenate (CCA)-treated wood poses possible environmental risk when the wood is disposed of. Samples of un-weathered CCA-treated wood were tested using a variety of US regulatory leaching procedures, including the toxicity characteristic leaching procedure (TCLP), synthetic precipitation leaching procedure (SPLP), extraction procedure toxicity method (EPTOX), waste extraction test (WET), multiple extraction procedure (MEP), and modifications of these procedures which utilized actual MSW landfill leachates, a construction and demolition (C and D) debris leachate, and a concrete enhanced leachate. Additional experiments were conducted to assess factors affecting leaching, such as particle size, pH, and leaching contact time. The regulatory leaching tests provided similar results with the exception of the WET, which extracted greater quantities of metals. Experiments conducted using actual MSW leachate, C and D debris leachate, and concrete enhanced leachate provided results that were within the same order of magnitude as results obtained from TCLP, SPLP, and EPTOX. Eleven of 13 samples of CCA-treated dimensional lumber exceeded the US EPA's toxicity characteristic (TC) threshold for arsenic (5 mg/L). If un-weathered arsenic-treated wood were not otherwise excluded from the definition of hazardous waste, it frequently would require management as such. When extracted with simulated rainwater (SPLP), 9 of the 13 samples leached arsenic at concentrations above 5 mg/L. Metal leachability tended to increase with decreasing particle size and at pH extremes. All three metals leached above the drinking water standards, thus posing a potential risk to groundwater. Arsenic is a major concern from a disposal point of view with respect to groundwater quality.

  17. Demonstration of disinfection procedure for the development of accurate blood glucose meters in accordance with ISO 15197:2013

    PubMed Central

    Lin, Wen-Ye; Chang, Jung-Tzu; Chu, Chun-Feng

    2017-01-01

    Despite measures to reduce disease transmission, a risk can occur when blood glucose meters (BGMs) are used on multiple individuals or by caregivers assisting a patient. The laboratory and in-clinic performance of a BGM system before and after disinfection should be demonstrated to guarantee accurate readings and reliable control of blood glucose (BG) for patients. In this study, an effective disinfection procedure, consisting of wiping 10 times to ensure a one-minute contact time of the disinfectant on the contaminated surface, was first demonstrated using test samples of the meter housing materials, including acrylonitrile butadiene styrene (ABS), polymethyl methacrylate (PMMA), and polycarbonate (PC), in accordance with ISO 15197:2013. After bench studies comprising 10,000 disinfection cycles, the elemental compositions of the disinfected ABS, PMMA, and PC samples were almost the same as in the original samples, as indicated by electron spectroscopy for chemical analysis. Subsequently, the validated disinfection procedure was then directly applied to disinfect 5 commercial BGM systems composed of ABS, PMMA, or PC to observe the effect of the validated disinfection procedure on meter accuracy. HBsAg values measured after treatment with HBV sera and disinfectant wipes were below the limit of detection (LoD) of 0.020 IU/mL for each material. Before and after the multiple disinfection cycles, 900 of 900 samples (100%) were within the system accuracy requirements of ISO 15197:2013. All of the systems showed high performance before and after the series of disinfection cycles and met the ISO 15197:2013 requirements. In addition, our results demonstrated that the multiple cleaning and disinfection cycles represented normal use over a meter lifetime of 3–5 years. Our validated cleaning and disinfection procedure can be directly applied to other registered disinfectants for cleaning commercial BGM products in the future. PMID:28683148

  18. Demonstration of disinfection procedure for the development of accurate blood glucose meters in accordance with ISO 15197:2013.

    PubMed

    Lin, Shu-Ping; Lin, Wen-Ye; Chang, Jung-Tzu; Chu, Chun-Feng

    2017-01-01

    Despite measures to reduce disease transmission, a risk can occur when blood glucose meters (BGMs) are used on multiple individuals or by caregivers assisting a patient. The laboratory and in-clinic performance of a BGM system before and after disinfection should be demonstrated to guarantee accurate readings and reliable control of blood glucose (BG) for patients. In this study, an effective disinfection procedure, consisting of wiping 10 times to ensure a one-minute contact time of the disinfectant on the contaminated surface, was first demonstrated using test samples of the meter housing materials, including acrylonitrile butadiene styrene (ABS), polymethyl methacrylate (PMMA), and polycarbonate (PC), in accordance with ISO 15197:2013. After bench studies comprising 10,000 disinfection cycles, the elemental compositions of the disinfected ABS, PMMA, and PC samples were almost the same as in the original samples, as indicated by electron spectroscopy for chemical analysis. Subsequently, the validated disinfection procedure was then directly applied to disinfect 5 commercial BGM systems composed of ABS, PMMA, or PC to observe the effect of the validated disinfection procedure on meter accuracy. HBsAg values measured after treatment with HBV sera and disinfectant wipes were below the limit of detection (LoD) of 0.020 IU/mL for each material. Before and after the multiple disinfection cycles, 900 of 900 samples (100%) were within the system accuracy requirements of ISO 15197:2013. All of the systems showed high performance before and after the series of disinfection cycles and met the ISO 15197:2013 requirements. In addition, our results demonstrated that the multiple cleaning and disinfection cycles represented normal use over a meter lifetime of 3-5 years. Our validated cleaning and disinfection procedure can be directly applied to other registered disinfectants for cleaning commercial BGM products in the future.

  19. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    PubMed

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.

  20. Use of surveillance data to identify target populations for Staphylococcus aureus vaccines and prevent surgical site infections: A pilot study

    PubMed Central

    Gustin, Marie-Paule; Giard, Marine; Bénet, Thomas; Vanhems, Philippe

    2015-01-01

    The development of anti-staphylococcal vaccines is nowadays a priority to prevent surgical site infections (SSI). The objective of the present study was to identify a potential target population by assessing surveillance data on surgery patients for possible anti-staphylococcal vaccine administration. Individuals at high risk of SSI by Staphylococcus aureus (SA) were targeted by the French SSI Surveillance Network in south-eastern France between 2008 and 2011. Among 238,470 patients, those undergoing primary total hip replacement appeared to be an interesting and healthy enough population for anti-staphylococcal vaccine testing. These male patients, subjected to multiple procedures and with American Society of Anesthesiologists score >2, had a probability of SA SSI about 21 times higher than females with no severe systemic disease and no multiple procedures. Our study indicates that surveillance data on SSI might be an interesting epidemiological source for planning vaccine trials to prevent nosocomial infections. PMID:25668663

  1. Multiple Small Diameter Drillings Increase Femoral Neck Stability Compared with Single Large Diameter Femoral Head Core Decompression Technique for Avascular Necrosis of the Femoral Head.

    PubMed

    Brown, Philip J; Mannava, Sandeep; Seyler, Thorsten M; Plate, Johannes F; Van Sikes, Charles; Stitzel, Joel D; Lang, Jason E

    2016-10-26

    Femoral head core decompression is an efficacious joint-preserving procedure for treatment of early stage avascular necrosis. However, postoperative fractures have been described which may be related to the decompression technique used. Femoral head decompressions were performed on 12 matched human cadaveric femora comparing large 8 mm single bore versus multiple 3 mm small drilling techniques. Ultimate failure strength of the femora was tested using a servo-hydraulic material testing system. Ultimate load to failure was compared between the different decompression techniques using two paired ANCOVA linear regression models. Prior to biomechanical testing and after the intervention, volumetric bone mineral density was determined using quantitative computed tomography to account for variation between cadaveric samples and to assess the amount of bone disruption by the core decompression. Core decompression, using the small diameter bore and multiple drilling technique, withstood significantly greater load prior to failure compared with the single large bore technique after adjustment for bone mineral density (p < 0.05). The 8 mm single bore technique removed a significantly larger volume of bone compared to the 3 mm multiple drilling technique (p < 0.001). However, total fracture energy was similar between the two core decompression techniques. When considering core decompression for the treatment of early stage avascular necrosis, the multiple small bore technique removed less bone volume, thereby potentially leading to higher load to failure.

  2. Procedural aspects of the organization of the comprehensive European Board of Ophthalmology Diploma examination

    PubMed Central

    Sunaric-Mégevand, Gordana; Aclimandos, Wagih

    2016-01-01

    The comprehensive European Board of Ophthalmology Diploma (EBOD) examination is one of 38 European medical specialty examinations. This review aims at disclosing the specific procedures and content of the EBOD examination. It is a descriptive study summarizing the present organization of the EBOD examination. It is the 3rd largest European postgraduate medical assessment after anaesthesiology and cardiology. The master language is English for the Part 1 written test (a knowledge test with 52 modified type X multiple-choice questions); in the past the written test was also available in French and German. Ophthalmology training of a minimum of 4 years in a full or associated European Union of Medical Specialists (UEMS) member state is a prerequisite. Problem-solving skills are tested in the Part 2 oral assessment, which is a viva of 4 subjects conducted in English with support for the candidate's native language whenever feasible. The comprehensive EBOD examination is one of the leading examinations organized by UEMS European Boards or Specialist Sections in terms of the number of examinees, item banking, and item content. PMID:27464640

  3. Statistical grand rounds: a review of analysis and sample size calculation considerations for Wilcoxon tests.

    PubMed

    Divine, George; Norton, H James; Hunt, Ronald; Dienemann, Jacqueline

    2013-09-01

    When a study uses an ordinal outcome measure with unknown differences in the anchors and a small range such as 4 or 7, use of the Wilcoxon rank sum test or the Wilcoxon signed rank test may be most appropriate. However, because nonparametric methods are at best indirect functions of standard measures of location such as means or medians, the choice of the most appropriate summary measure can be difficult. The issues underlying use of these tests are discussed. The Wilcoxon-Mann-Whitney odds directly reflects the quantity that the rank sum procedure actually tests, and thus it can be a superior summary measure. Unlike the means and medians, its value will have a one-to-one correspondence with the Wilcoxon rank sum test result. The companion article appearing in this issue of Anesthesia & Analgesia ("Aromatherapy as Treatment for Postoperative Nausea: A Randomized Trial") illustrates these issues and provides an example of a situation for which the medians imply no difference between 2 groups, even though the groups are, in fact, quite different. The trial cited also provides an example of a single sample that has a median of zero, yet there is a substantial shift for much of the nonzero data, and the Wilcoxon signed rank test is quite significant. These examples highlight the potential discordance between medians and Wilcoxon test results. Along with the issues surrounding the choice of a summary measure, there are considerations for the computation of sample size and power, confidence intervals, and multiple comparison adjustment. In addition, despite the increased robustness of the Wilcoxon procedures relative to parametric tests, some circumstances in which the Wilcoxon tests may perform poorly are noted, along with alternative versions of the procedures that correct for such limitations. 
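    As a concrete illustration of the summary measure discussed above, the sketch below computes the Wilcoxon-Mann-Whitney odds from two samples, using SciPy's U statistic (which splits ties evenly) to estimate P(X > Y) + 0.5*P(X = Y); the ordinal scores are made up.

        import numpy as np
        from scipy.stats import mannwhitneyu

        def wmw_odds(x, y):
            """Wilcoxon-Mann-Whitney odds: P(X > Y) / P(X < Y), ties split evenly."""
            u, p_value = mannwhitneyu(x, y, alternative="two-sided")
            p_win = u / (len(x) * len(y))          # P(X > Y) + 0.5 * P(X = Y)
            return p_win / (1 - p_win), p_value

        # Ordinal nausea scores (0-3) in two groups
        treated = [0, 0, 0, 1, 1, 1, 2, 2]
        control = [0, 1, 1, 2, 2, 2, 3, 3]
        print(wmw_odds(treated, control))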

  4. The Effects of Daily Intensive Tact Instruction on Preschool Students' Emission of Pure Tacts and Mands in Non-Instructional Setting

    ERIC Educational Resources Information Center

    Pistoljevic, Nirvana; Greer, R. Douglas

    2006-01-01

    We tested the effects of an intensive tact instruction procedure on numbers of tacts emitted in non-instructional settings (NIS) using a multiple probe design across 3 participants (3- and 4-year old boys with autism). The dependent variable was tacts emitted in NIS before/after the mastery of sets of 5 different stimuli. The non-instructional…

  5. 40 CFR 1065.308 - Continuous gas analyzer system-response and updating-recording verification-for gas analyzers not...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications § 1065.308 Continuous... adjusted to account for the dilution from ambient air drawn into the probe. We recommend you use the final... gases diluted in air. You may use a multi-gas span gas, such as NO-CO-CO2-C3H8-CH4, to verify multiple...

  6. Increasing Efficiency of Fecal Coliform Testing Through EPA-Approved Alternate Method Colilert*-18

    NASA Technical Reports Server (NTRS)

    Cornwell, Brian

    2017-01-01

    The 21 SM 9221 E multiple-tube fermentation method for fecal coliform analysis requires a large time and reagent investment for the performing laboratory. In late 2010, the EPA approved an alternative procedure for the determination of fecal coliforms designated as Colilert*-18. However, as of late 2016, only two VELAP-certified laboratories in the Commonwealth of Virginia have been certified in this method.

  7. Measures of Agreement Between Many Raters for Ordinal Classifications

    PubMed Central

    Nelson, Kerrie P.; Edwards, Don

    2015-01-01

    Screening and diagnostic procedures often require a physician's subjective interpretation of a patient's test result using an ordered categorical scale to define the patient's disease severity. Due to wide variability observed between physicians’ ratings, many large-scale studies have been conducted to quantify agreement between multiple experts’ ordinal classifications in common diagnostic procedures such as mammography. However, very few statistical approaches are available to assess agreement in these large-scale settings. Existing summary measures of agreement rely on extensions of Cohen's kappa [1 - 5]. These are prone to prevalence and marginal distribution issues, become increasingly complex for more than three experts or are not easily implemented. Here we propose a model-based approach to assess agreement in large-scale studies based upon a framework of ordinal generalized linear mixed models. A summary measure of agreement is proposed for multiple experts assessing the same sample of patients’ test results according to an ordered categorical scale. This measure avoids some of the key flaws associated with Cohen's kappa and its extensions. Simulation studies are conducted to demonstrate the validity of the approach with comparison to commonly used agreement measures. The proposed methods are easily implemented using the software package R and are applied to two large-scale cancer agreement studies. PMID:26095449

  8. A simplified calculation procedure for mass isotopomer distribution analysis (MIDA) based on multiple linear regression.

    PubMed

    Fernández-Fernández, Mario; Rodríguez-González, Pablo; García Alonso, J Ignacio

    2016-10-01

    We have developed a novel, rapid and easy calculation procedure for Mass Isotopomer Distribution Analysis based on multiple linear regression, which allows the simultaneous calculation of the precursor pool enrichment and the fraction of newly synthesized labelled proteins (fractional synthesis) using linear algebra. To test this approach, we used the peptide RGGGLK as a model tryptic peptide containing three glycine subunits. We selected glycine labelled with two ¹³C atoms (¹³C₂-glycine) as the labelled amino acid to demonstrate that spectral overlap is not a problem in the proposed methodology. The methodology was first tested in vitro by changing the precursor pool enrichment from 10 to 40% ¹³C₂-glycine. Secondly, a simulated in vivo synthesis of proteins was designed by combining the natural abundance RGGGLK peptide and 10 or 20% ¹³C₂-glycine at 1:1, 1:3 and 3:1 ratios. Precursor pool enrichments and fractional synthesis values were calculated with satisfactory precision and accuracy using a simple spreadsheet. This novel approach can provide a relatively rapid and easy means to measure protein turnover based on stable isotope tracers. Copyright © 2016 John Wiley & Sons, Ltd.
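    A simplified numeric sketch of the regression step follows; the isotope patterns are hypothetical placeholders rather than the real RGGGLK patterns. Ordinary least squares separates the overlapping patterns of peptides carrying 0-3 labelled glycines, and the precursor enrichment and fractional synthesis then follow from the binomial structure of the fitted weights.

        import numpy as np

        # Hypothetical isotope patterns (columns: peptide with 0, 1, 2, 3 labelled
        # glycines; rows: mass isotopomers M+0 .. M+7).  Real patterns would come
        # from the peptide's elemental composition.
        A = np.array([
            [0.62, 0.00, 0.00, 0.00],
            [0.25, 0.00, 0.00, 0.00],
            [0.09, 0.62, 0.00, 0.00],
            [0.03, 0.25, 0.00, 0.00],
            [0.01, 0.09, 0.62, 0.00],
            [0.00, 0.03, 0.25, 0.00],
            [0.00, 0.01, 0.09, 0.62],
            [0.00, 0.00, 0.04, 0.38],
        ])

        # Simulated measurement: 30% newly synthesized protein, 20% precursor enrichment
        f_true, p_true = 0.30, 0.20
        binom = np.array([(1 - p_true) ** 3, 3 * p_true * (1 - p_true) ** 2,
                          3 * p_true ** 2 * (1 - p_true), p_true ** 3])
        w_true = np.concatenate([[1 - f_true + f_true * binom[0]], f_true * binom[1:]])
        y = A @ w_true

        # Multiple linear regression recovers the weights despite spectral overlap
        w, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Precursor enrichment and fractional synthesis from the binomial structure
        ratio = w[2] / w[1]                 # equals p / (1 - p) for three glycines
        p_hat = ratio / (1 + ratio)
        f_hat = w[1] / (3 * p_hat * (1 - p_hat) ** 2)
        print(p_hat, f_hat)                 # ~0.20, ~0.30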

  9. Step by Step: Biology Undergraduates' Problem-Solving Procedures during Multiple-Choice Assessment

    ERIC Educational Resources Information Center

    Prevost, Luanna B.; Lemons, Paula P.

    2016-01-01

    This study uses the theoretical framework of domain-specific problem solving to explore the procedures students use to solve multiple-choice problems about biology concepts. We designed several multiple-choice problems and administered them on four exams. We trained students to produce written descriptions of how they solved the problem, and this…

  10. A universal procedure for evaluation and application of surge-protective devices

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The source, nature, and frequency of occurrence of transients must be identified and a representative standard test wave chosen for proof testing. The performance of candidate suppressor devices then can be evaluated against the withstand goals set for the equipment. The various suppressors divide into two classes of generic behavior. The key to a universal procedure for evaluating both classes lies in representing transients as quasi-current sources of defined current impulse duration. The available surge current is established by the Thevenin equivalent transient voltage and source impedance. A load line drawn on the V-I characteristic graph of the suppressor quickly determines the clamping voltage and peak current. These values then can be compared to the requirement. The deposited energy and average power dissipation for multiple transients also can be calculated. The method is illustrated with a design example for motor vehicle alternator load dump suppression.
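    The load-line method reads directly as a small numeric example; all component values below are hypothetical (a load-dump-like surge and a varistor-style clamping characteristic) and serve only to show how the clamping point, deposited energy, and average power fall out of the graphical construction.

        import numpy as np
        from scipy.optimize import brentq

        # Hypothetical transient: Thevenin open-circuit voltage, source impedance,
        # rectangular impulse duration, and repetition rate
        v_oc, z_src = 120.0, 2.0            # volts, ohms
        duration, rate = 0.2, 0.1           # seconds per impulse, impulses per second

        # Hypothetical varistor-style clamping characteristic V = k * I**alpha
        k, alpha = 28.0, 0.03
        def clamp_v(i):
            return k * i ** alpha

        # Intersection of the load line V = v_oc - i * z_src with the V-I characteristic
        i_peak = brentq(lambda i: v_oc - i * z_src - clamp_v(i), 1e-6, v_oc / z_src)
        v_clamp = clamp_v(i_peak)

        energy = v_clamp * i_peak * duration        # joules deposited per impulse
        avg_power = energy * rate                   # watts for repetitive transients
        print(f"clamps at {v_clamp:.1f} V, {i_peak:.0f} A; "
              f"{energy:.0f} J per surge, {avg_power:.1f} W average")

    Comparing the clamping voltage and peak current against the equipment withstand goal, and the deposited energy against the suppressor rating, is the evaluation step described above.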

  11. Second-Order Analysis of Semiparametric Recurrent Event Processes

    PubMed Central

    Guan, Yongtao

    2011-01-01

    Summary A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a followup period. Such data have become increasingly available in medical and epidemiological studies. In this paper, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on Meningococcal disease cases in Merseyside, UK to illustrate their practical value. PMID:21361885

  12. Magnetoencephalogram blind source separation and component selection procedure to improve the diagnosis of Alzheimer's disease patients.

    PubMed

    Escudero, Javier; Hornero, Roberto; Abásolo, Daniel; Fernández, Alberto; Poza, Jesús

    2007-01-01

    The aim of this study was to improve the diagnosis of Alzheimer's disease (AD) patients applying a blind source separation (BSS) and component selection procedure to their magnetoencephalogram (MEG) recordings. MEGs from 18 AD patients and 18 control subjects were decomposed with the algorithm for multiple unknown signals extraction. MEG channels and components were characterized by their mean frequency, spectral entropy, approximate entropy, and Lempel-Ziv complexity. Using Student's t-test, the components which accounted for the most significant differences between groups were selected. Then, these relevant components were used to partially reconstruct the MEG channels. By means of a linear discriminant analysis, we found that the BSS-preprocessed MEGs classified the subjects with an accuracy of 80.6%, whereas 72.2% accuracy was obtained without the BSS and component selection procedure.
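    Two of the signal features mentioned above are easy to sketch. The spectral entropy below is the Shannon entropy of the normalized power spectrum, and the Lempel-Ziv complexity uses a simple median-binarization, phrase-counting variant; both may differ in detail from the implementations used in the study.

        import numpy as np

        def spectral_entropy(x):
            """Shannon entropy of the normalized power spectrum, scaled to [0, 1]."""
            psd = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
            psd = psd / psd.sum()
            nz = psd[psd > 0]
            return float(-(nz * np.log(nz)).sum() / np.log(psd.size))

        def lempel_ziv(x):
            """Normalized Lempel-Ziv complexity of the signal binarized at its median."""
            med = np.median(x)
            s = "".join("1" if v > med else "0" for v in x)
            phrases, i, n = set(), 0, len(s)
            while i < n:
                j = i + 1
                while s[i:j] in phrases and j <= n:
                    j += 1
                phrases.add(s[i:j])
                i = j
            return len(phrases) * np.log2(n) / n

        # A rhythmic signal should score lower than white noise on both features
        rng = np.random.default_rng(2)
        t = np.arange(0, 8, 1 / 250)
        rhythmic = np.sin(2 * np.pi * 6 * t) + 0.1 * rng.standard_normal(t.size)
        noise = rng.standard_normal(t.size)
        print(spectral_entropy(rhythmic), spectral_entropy(noise))
        print(lempel_ziv(rhythmic), lempel_ziv(noise))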

  13. Does the Type of Metal Instrumentation Affect the Risk of Surgical Site Infection in Pediatric Scoliosis Surgery?

    PubMed

    Wright, Margaret L; Skaggs, David L; Matsumoto, Hiroko; Woon, Regina P; Trocle, Ashley; Flynn, John M; Vitale, Michael G

    2016-05-01

    Retrospective cohort study. To determine the association of implant metal composition with the risk of surgical site infection (SSI) following pediatric spine surgery. SSI is a well-described complication following pediatric spine surgery. Many risk factors have been identified in the literature, but controversy remains regarding metal composition as a risk factor. This was a retrospective study of patients who underwent posterior spinal instrumentation procedures between January 1, 2006, and December 31, 2008, at three large children's hospitals for any etiology of scoliosis and had at least 1 year of postoperative follow-up. Procedures included posterior spinal fusion, growth-friendly instrumentation, and revision of spinal instrumentation. The Centers for Disease Control and Prevention definition of SSI was used. A chi-squared test was performed to determine the relationship between type of metal instrumentation and development of an SSI. The study included 874 patients who underwent 1,156 total procedures. Overall, 752 (65%) procedures used stainless steel instrumentation, 238 (21%) procedures used titanium instrumentation, and the remaining 166 (14%) procedures used cobalt chrome and titanium hybrid instrumentation. The overall risk of infection was 6.1% (70/1,156) per procedure, with 5.9% (44/752) for stainless steel, 6.7% (12/238) for titanium, and 6.0% (10/166) for cobalt chrome. The multiple regression analysis found no significant differences in the metal type used between patients with and without infection (p = .886) adjusting for etiology, instrumentation to pelvis, and type of procedures. When stratified based on etiology, the multiple regression analyses also found no significant difference in SSI between two metal type groups. This study found no difference in risk of infection with stainless steel, titanium, or cobalt chrome/titanium instrumentation and is adequately powered to detect a true difference in risk of SSI. Level II, prognostic. Copyright © 2016 Scoliosis Research Society. Published by Elsevier Inc. All rights reserved.
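
    The unadjusted chi-squared comparison can be reproduced directly from the per-procedure counts quoted in the abstract; the sketch below does only that and does not attempt the study's multiple regression adjustment.

```python
# Chi-squared test of SSI occurrence by implant metal, using the per-procedure
# counts quoted in the abstract (44/752 stainless steel, 12/238 titanium,
# 10/166 cobalt chrome/titanium). This reproduces only the unadjusted
# comparison, not the adjusted multiple regression described in the study.
import numpy as np
from scipy.stats import chi2_contingency

infected     = np.array([44, 12, 10])
not_infected = np.array([752 - 44, 238 - 12, 166 - 10])
table = np.vstack([infected, not_infected])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, dof = {dof}, p = {p:.3f}")
```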

  14. Toward a standard line for use in multibeam echo sounder calibration

    NASA Astrophysics Data System (ADS)

    Weber, Thomas C.; Rice, Glen; Smith, Michael

    2018-06-01

    A procedure is suggested in which a relative calibration for the intensity output of a multibeam echo sounder (MBES) can be performed. This procedure identifies a common survey line (i.e., a standard line), over which acoustic backscatter from the seafloor is collected with multiple MBES systems or by the same system multiple times. A location on the standard line which exhibits temporal stability in its seafloor backscatter response is used to bring the intensity output of the multiple MBES systems to a common reference. This relative calibration procedure has utility for MBES users wishing to generate an aggregate seafloor backscatter mosaic using multiple systems, revisiting an area to detect changes in substrate type, and comparing substrate types in the same general area but with different systems or different system settings. The calibration procedure is demonstrated using three different MBES systems over 3 different years in New Castle, NH, USA.
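
    One way to implement the common-reference step described above is to estimate a per-system offset over the temporally stable patch of the standard line and remove it. The sketch below does this in decibels with synthetic data and a hypothetical choice of reference system; it is an illustration of the idea, not the authors' full procedure.

```python
# Illustrative relative-calibration step: estimate a per-system dB offset over
# a temporally stable patch of the standard line and remove it so that all
# systems share a common reference level. Data and system names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical backscatter samples (dB) from the stable patch for three systems
patch_samples = {
    "MBES_A": rng.normal(-22.0, 1.0, 500),
    "MBES_B": rng.normal(-19.5, 1.2, 500),
    "MBES_C": rng.normal(-24.1, 0.9, 500),
}

reference_level = np.mean(patch_samples["MBES_A"])   # one system chosen as reference

offsets = {name: reference_level - np.mean(vals) for name, vals in patch_samples.items()}

def calibrate(name, mosaic_db):
    """Apply the per-system offset to a survey's backscatter values (dB)."""
    return mosaic_db + offsets[name]

print({name: round(off, 2) for name, off in offsets.items()})
```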

  15. An Exemplar-Based Multi-View Domain Generalization Framework for Visual Recognition.

    PubMed

    Niu, Li; Li, Wen; Xu, Dong; Cai, Jianfei

    2018-02-01

    In this paper, we propose a new exemplar-based multi-view domain generalization (EMVDG) framework for visual recognition by learning robust classifiers that are able to generalize well to an arbitrary target domain based on the training samples with multiple types of features (i.e., multi-view features). In this framework, we aim to address two issues simultaneously. First, the distribution of training samples (i.e., the source domain) is often considerably different from that of testing samples (i.e., the target domain), so the performance of the classifiers learnt on the source domain may drop significantly on the target domain. Moreover, the testing data are often unseen during the training procedure. Second, when the training data are associated with multi-view features, the recognition performance can be further improved by exploiting the relation among multiple types of features. To address the first issue, considering that it has been shown that fusing multiple SVM classifiers can enhance the domain generalization ability, we build our EMVDG framework upon exemplar SVMs (ESVMs), in which a set of ESVM classifiers are learnt, each one trained on one positive training sample and all the negative training samples. When the source domain contains multiple latent domains, the learnt ESVM classifiers are expected to be grouped into multiple clusters. To address the second issue, we propose two approaches under the EMVDG framework based on the consensus principle and the complementary principle, respectively. Specifically, we propose an EMVDG_CO method by adding a co-regularizer to enforce the cluster structures of ESVM classifiers on different views to be consistent based on the consensus principle. Inspired by multiple kernel learning, we also propose another EMVDG_MK method by fusing the ESVM classifiers from different views based on the complementary principle. In addition, we further extend our EMVDG framework to an exemplar-based multi-view domain adaptation (EMVDA) framework when the unlabeled target domain data are available during the training procedure. The effectiveness of our EMVDG and EMVDA frameworks for visual recognition is clearly demonstrated by comprehensive experiments on three benchmark data sets.
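
    The exemplar-SVM building block referred to above can be sketched with a standard linear SVM implementation: one classifier per positive training sample, trained against all negatives. The sketch below (synthetic data, scikit-learn assumed available) shows only that building block and a simple max-score fusion, not the co-regularized multi-view training of EMVDG.

```python
# Minimal exemplar-SVM sketch: one linear SVM per positive training sample,
# trained against all negatives, with scores fused by a simple max rule.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
X_pos = rng.normal(1.0, 1.0, size=(20, 16))    # hypothetical positive samples
X_neg = rng.normal(-1.0, 1.0, size=(200, 16))  # hypothetical negative samples

esvms = []
for x in X_pos:                                 # one classifier per exemplar
    X = np.vstack([x[None, :], X_neg])
    y = np.r_[1, np.zeros(len(X_neg), dtype=int)]
    clf = LinearSVC(C=1.0, class_weight={1: 50.0, 0: 1.0}, max_iter=5000)
    esvms.append(clf.fit(X, y))

def score(x_test):
    """Fuse exemplar scores with a max rule (a simple calibration-free choice)."""
    return max(clf.decision_function(x_test[None, :])[0] for clf in esvms)

print(score(rng.normal(1.0, 1.0, 16)), score(rng.normal(-1.0, 1.0, 16)))
```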

  16. Use of Data Libraries for IAEA Nuclear Security Assessment Methodologies (NUSAM) [section 5.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shull, D.; Lane, M.

    2015-06-23

    Data libraries are essential for the characterization of the facility and provide the documented input which enables the facility assessment results and subsequent conclusions. Data libraries are historical, verifiable, quantified, and applicable collections of testing data on different types of barriers, sensors, cameras, procedures, and/or personnel. Data libraries are developed and maintained as part of any assessment program or process. Data is collected during the initial stages of facility characterization to aid in the model and/or simulation development process. Data library values may also be developed through the use of state testing centers and/or site resources by testing different types of barriers, sensors, cameras, procedures, and/or personnel. If no data exists, subject matter expert opinion and manufacturer's specifications/testing values can be the basis for initially assigning values, but are generally less reliable and lack appropriate confidence measures. The use of existing data libraries that have been developed by a state testing organization reduces the assessment costs by establishing standard delay, detection and assessment values for use by multiple sites or facilities where common barriers and alarm systems exist.

  17. Retrospective Evaluation of Safety, Efficacy and Risk Factors for Pneumothorax in Simultaneous Localizations of Multiple Pulmonary Nodules Using Hook Wire System.

    PubMed

    Zhong, Yan; Xu, Xiao-Quan; Pan, Xiang-Long; Zhang, Wei; Xu, Hai; Yuan, Mei; Kong, Ling-Yan; Pu, Xue-Hui; Chen, Liang; Yu, Tong-Fu

    2017-09-01

    To evaluate the safety and efficacy of the hook wire system in the simultaneous localizations for multiple pulmonary nodules (PNs) before video-assisted thoracoscopic surgery (VATS), and to clarify the risk factors for pneumothorax associated with the localization procedure. Between January 2010 and February 2016, 67 patients (147 nodules, Group A) underwent simultaneous localizations for multiple PNs using a hook wire system. The demographic, localization procedure-related information and the occurrence rate of pneumothorax were assessed and compared with a control group (349 patients, 349 nodules, Group B). Multivariate logistic regression analyses were used to determine the risk factors for pneumothorax during the localization procedure. All 147 nodules were successfully localized. Four (2.7%) hook wires dislodged before the VATS procedure, but all four of these lesions were successfully resected according to the insertion route of the hook wire. Pathological diagnoses were acquired for all 147 nodules. Compared with Group B, Group A demonstrated a significantly longer procedure time (p < 0.001) and a higher occurrence rate of pneumothorax (p = 0.019). Multivariate logistic regression analysis indicated that position change during the localization procedure (OR 2.675, p = 0.021) and nodules located in the ipsilateral lung (OR 9.404, p < 0.001) were independent risk factors for pneumothorax. Simultaneous localizations for multiple PNs using a hook wire system before the VATS procedure were safe and effective. Compared with localization of a single PN, simultaneous localization of multiple PNs was more prone to pneumothorax. Position change during the localization procedure and nodules located in the ipsilateral lung were independent risk factors for pneumothorax.
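
    A multivariate logistic regression of the kind described, with position change and ipsilateral nodules as candidate risk factors, can be sketched as follows; the data, variable names, and coefficients are hypothetical, not the study's.

```python
# Sketch of a multivariate logistic regression for pneumothorax risk, in the
# spirit of the analysis described (position change and ipsilateral nodules as
# predictors). Data and variable names here are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({
    "position_change":   rng.integers(0, 2, n),   # 1 = patient repositioned
    "ipsilateral_nodes": rng.integers(0, 2, n),   # 1 = nodules in ipsilateral lung
    "procedure_time":    rng.normal(30, 8, n),    # minutes
})
logit = -2.0 + 1.0 * df.position_change + 2.2 * df.ipsilateral_nodes
df["pneumothorax"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["position_change", "ipsilateral_nodes", "procedure_time"]])
fit = sm.Logit(df["pneumothorax"], X).fit(disp=0)
print(np.exp(fit.params))        # odds ratios
print(fit.pvalues)
```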

  18. SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems, due to the complex nature of the problems, the need for complex assessments, and complicated ...

  19. Improved pupil dilation with medication-soaked pledget sponges.

    PubMed

    Weddle, Celeste; Thomas, Nikki; Dienemann, Jacqueline

    2013-08-01

    Use of multiple preoperative drops for pupil dilation has been shown to be inexact, to delay surgery, and to cause dissatisfaction among perioperative personnel. This article reports on an evidence-based, quality improvement project to locate and appraise research on improved effectiveness and efficiency of mydriasis (ie, pupillary dilation), and the subsequent implementation of a pledget-sponge procedure for pupil dilation at one ambulatory surgery center. Project leaders used an evidence-based practice model to assess the problem, research options for improvement, define goals, and implement a pilot project to test the new dilation technique. Outcomes from the pilot project showed a reduced number of delays caused by poor pupil dilation and a decrease in procedure turnover time. The project team solicited informal feedback from preoperative nurses, which reflected increased satisfaction in preparing patients for cataract procedures. After facility administrators and surgeons accepted the procedure change, it was adopted for preoperative use for all patients undergoing cataract surgery at the ambulatory surgery center. Copyright © 2013 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  20. Characterizing the Joint Effect of Diverse Test-Statistic Correlation Structures and Effect Size on False Discovery Rates in a Multiple-Comparison Study of Many Outcome Measures

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Ploutz-Snyder, Robert; Fiedler, James

    2011-01-01

    In their 2009 Annals of Statistics paper, Gavrilov, Benjamini, and Sarkar report the results of a simulation assessing the robustness of their adaptive step-down procedure (GBS) for controlling the false discovery rate (FDR) when normally distributed test statistics are serially correlated. In this study we extend the investigation to the case of multiple comparisons involving correlated non-central t-statistics, in particular when several treatments or time periods are being compared to a control in a repeated-measures design with many dependent outcome measures. In addition, we consider several dependence structures other than serial correlation and illustrate how the FDR depends on the interaction between effect size and the type of correlation structure as indexed by Foerstner's distance metric from an identity. The relationship between the correlation matrix R of the original dependent variables and R*, the correlation matrix of the associated t-statistics, is also studied. In general, R* depends not only on R, but also on sample size and the signed effect sizes for the multiple comparisons.
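
    A simplified version of such a simulation is sketched below: correlated test statistics under an exchangeable correlation structure, with the Benjamini-Hochberg step-up used as a readily available stand-in for the GBS adaptive step-down procedure, and the false discovery proportion averaged over replications.

```python
# Simulation sketch: correlated test statistics with a fraction of true effects,
# exchangeable correlation structure, and Benjamini-Hochberg FDR control used
# here as a simple stand-in for the adaptive step-down procedure studied.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
m, m_true, rho, effect, n_rep = 200, 40, 0.5, 3.0, 500
cov = (1 - rho) * np.eye(m) + rho * np.ones((m, m))   # exchangeable correlation
L = np.linalg.cholesky(cov)
means = np.r_[np.full(m_true, effect), np.zeros(m - m_true)]  # first m_true are true effects

fdp = []
for _ in range(n_rep):
    z = means + L @ rng.standard_normal(m)            # correlated test statistics
    p = 2 * stats.norm.sf(np.abs(z))
    # Benjamini-Hochberg step-up at level q = 0.05
    order = np.argsort(p)
    thresh = 0.05 * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    rejected = order[:k]
    false_rej = np.sum(rejected >= m_true)            # rejected nulls
    fdp.append(false_rej / max(k, 1))

print("empirical FDR:", np.mean(fdp))
```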

  1. Modal analysis using a Fourier analyzer, curve-fitting, and modal tuning

    NASA Technical Reports Server (NTRS)

    Craig, R. R., Jr.; Chung, Y. T.

    1981-01-01

    The proposed modal test program differs from single-input methods in that preliminary data may be acquired using multiple inputs, and modal tuning procedures may be employed to define closely spaced frequency modes more accurately or to make use of frequency response functions (FRFs) which are based on several input locations. In some respects the proposed modal test program resembles earlier sine-sweep and sine-dwell testing in that broadband FRFs are acquired using several input locations, and tuning is employed to refine the modal parameter estimates. The major tasks performed in the proposed modal test program are outlined. Data acquisition and FFT processing, curve fitting, and modal tuning phases are described and examples are given to illustrate and evaluate them.

  2. Step by Step: Biology Undergraduates’ Problem-Solving Procedures during Multiple-Choice Assessment

    PubMed Central

    Prevost, Luanna B.; Lemons, Paula P.

    2016-01-01

    This study uses the theoretical framework of domain-specific problem solving to explore the procedures students use to solve multiple-choice problems about biology concepts. We designed several multiple-choice problems and administered them on four exams. We trained students to produce written descriptions of how they solved the problem, and this allowed us to systematically investigate their problem-solving procedures. We identified a range of procedures and organized them as domain general, domain specific, or hybrid. We also identified domain-general and domain-specific errors made by students during problem solving. We found that students use domain-general and hybrid procedures more frequently when solving lower-order problems than higher-order problems, while they use domain-specific procedures more frequently when solving higher-order problems. Additionally, the more domain-specific procedures students used, the higher the likelihood that they would answer the problem correctly, up to five procedures. However, if students used just one domain-general procedure, they were as likely to answer the problem correctly as if they had used two to five domain-general procedures. Our findings provide a categorization scheme and framework for additional research on biology problem solving and suggest several important implications for researchers and instructors. PMID:27909021

  3. Cosmetic surgery procedures as luxury goods: measuring price and demand in facial plastic surgery.

    PubMed

    Alsarraf, Ramsey; Alsarraf, Nicole W; Larrabee, Wayne F; Johnson, Calvin M

    2002-01-01

    To evaluate the relationship between cosmetic facial plastic surgery procedure price and demand, and to test the hypothesis that these procedures function as luxury goods in the marketplace, with an upward-sloping demand curve. Data were derived from a survey that was sent to every (N = 1727) active fellow, member, or associate of the American Academy of Facial Plastic and Reconstructive Surgery, assessing the costs and frequency of 4 common cosmetic facial plastic surgery procedures (face-lift, brow-lift, blepharoplasty, and rhinoplasty) for 1999 and 1989. An economic analysis was performed to assess the relationship of price and demand for these procedures. A significant association was found between increasing surgeons' fees and total charges for cosmetic facial plastic surgery procedures and increasing demand for these procedures, as measured by their annual frequency (P

  4. The Use of Triadic Dialogue in the Science Classroom: a Teacher Negotiating Conceptual Learning with Teaching to the Test

    NASA Astrophysics Data System (ADS)

    Salloum, Sara; BouJaoude, Saouma

    2017-08-01

    The purpose of this research is to better understand the uses and potential of triadic dialogue (initiation-response-feedback) as a dominant discourse pattern in test-driven environments. We used a Bakhtinian dialogic perspective to analyze interactions among high-stakes tests and triadic dialogue. Specifically, the study investigated (a) the global influence of high-stakes tests on knowledge types and cognitive processes presented and elicited by the science teacher in triadic dialogue and (b) the teacher's meaning making of her discourse patterns. The classroom talk occurred in a classroom where the teacher tried to balance conceptual learning with helping low-income public school students pass the national tests. Videos and transcripts of 20 grade 8 and 9 physical science sessions were analyzed qualitatively. Teacher utterances were categorized in terms of science knowledge types and cognitive processes. Explicitness and directionality of shifts among different knowledge types were analyzed. It was found that shifts between factual/conceptual/procedural-algorithmic and procedural inquiry were mostly dialectical and implicit, and dominated the body of concept development lessons. These shifts called for medium-level cognitive processes. Shifts between the different knowledge types and procedural-testing were more explicit and occurred mostly at the end of lessons. Moreover, the science teacher's focus on success and high expectations, her explicitness in dealing with high-stakes tests, and the relaxed atmosphere she created built a constructive partnership with the students toward a common goal of cracking the test. We discuss findings from a Bakhtinian dialogic perspective and the potential of triadic dialogue for teachers negotiating multiple goals and commitments.

  5. Measurement of Three-dimensional Density Distributions by Holographic Interferometry and Computer Tomography

    NASA Technical Reports Server (NTRS)

    Vest, C. M.

    1982-01-01

    The use of holographic interferometry to measure two- and three-dimensional flows and the interpretation of multiple-view interferograms with computer tomography are discussed. Computational techniques developed for tomography are reviewed. Current research topics are outlined, including the development of an automated fringe readout system, optimum reconstruction procedures for cases in which an opaque test model is present in the field, and interferometry and tomography with strongly refracting fields and shocks.

  6. Survey of Current Doctrine, Training, and Special Considerations for Military Operations on Urbanized Terrain (MOUT).

    DTIC Science & Technology

    1983-11-01

    Securing and fortifying: (a) doors, (b) hallways, (c) stairs, (d) windows, (e) floors, (f) ceilings, (g) unoccupied rooms, (h) basements, (i) upper floors ... observed, the instructors were interviewed, and training was assessed through administration of a multiple-choice test and a Perception of Training ... instructing clearing procedures. It would provide the opportunity to both critique and practice, using one structure.

  7. A harmonic pulse testing method for leakage detection in deep subsurface storage formations

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Lu, Jiemin; Hovorka, Susan

    2015-06-01

    Detection of leakage in deep geologic storage formations (e.g., carbon sequestration sites) is a challenging problem. This study investigates an easy-to-implement frequency domain leakage detection technology based on harmonic pulse testing (HPT). Unlike conventional constant-rate pressure interference tests, HPT stimulates a reservoir using periodic injection rates. The fundamental principle underlying HPT-based leakage detection is that leakage modifies a storage system's frequency response function, thus providing clues of system malfunction. During operations, routine HPTs can be conducted at multiple pulsing frequencies to obtain experimental frequency response functions, which are then examined for possible time-lapse changes. In this work, a set of analytical frequency response solutions is derived for predicting system responses with and without leaks for single-phase flow systems. Sensitivity studies show that HPT can effectively reveal the presence of leaks. A search procedure is then prescribed for locating the actual leaks using amplitude and phase information obtained from HPT, and the resulting optimization problem is solved using a genetic algorithm. For multiphase flows, the applicability of the HPT-based leakage detection procedure is exemplified numerically using a carbon sequestration problem. Results show that the detection procedure is applicable if the average reservoir conditions in the testing zone stay relatively constant during the tests, which is a working assumption under many other interpretation methods for pressure interference tests. HPT is a cost-effective tool that only requires periodic modification of the nominal injection rate. Thus it can be incorporated into existing monitoring plans with little additional investment.
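
    The core interpretation step, extracting the amplitude and phase of the pressure response at the pulsing frequency, can be sketched as follows; the injection-rate and pressure records are synthetic and the demodulation shown is a generic estimator, not the paper's analytical solutions.

```python
# Minimal sketch of harmonic pulse testing interpretation: extract amplitude and
# phase of the pressure response at the pulsing frequency from the injection-rate
# and pressure time series. Time-lapse changes in this frequency response would
# flag a possible leak. Data here are synthetic.
import numpy as np

fs, f0, T = 10.0, 0.01, 3600.0                # sample rate (Hz), pulse freq (Hz), duration (s)
t = np.arange(0, T, 1 / fs)
rng = np.random.default_rng(4)

q = 5.0 + 2.0 * np.sin(2 * np.pi * f0 * t)                    # injection rate (kg/s)
p = 100.0 + 0.8 * np.sin(2 * np.pi * f0 * t - 0.6) \
    + 0.05 * rng.standard_normal(t.size)                      # downhole pressure (bar)

def harmonic_component(x, f, t):
    """Complex amplitude of x at frequency f by direct demodulation."""
    return 2.0 * np.mean(x * np.exp(-2j * np.pi * f * t))

H = harmonic_component(p - p.mean(), f0, t) / harmonic_component(q - q.mean(), f0, t)
print(f"|H| = {abs(H):.3f} bar per kg/s, phase = {np.angle(H):.3f} rad")
```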

  8. The Effect of Multiple Shot Peening on the Corrosion Behavior of Duplex Stainless Steel

    NASA Astrophysics Data System (ADS)

    Feng, Qiang; She, Jia; Wu, Xueyan; Wang, Chengxi; Jiang, Chuanhai

    2018-03-01

    Various types of shot peening treatments were applied to duplex stainless steel. The effects of shot peening intensity and working procedures on the microstructure were investigated. The domain size and microstrain evolution in the surface layer were characterized utilizing the Rietveld method. As the shot peening intensity increased, the surface roughness increased in the surface layer; however, it decreased after multiple (dual and triple) shot peening. The mole fraction of strain-induced martensite as a function of the intensity of shot peening was evaluated by XRD measurements. Both potentiodynamic polarization curves and salt spray tests of shot-peened samples in NaCl solution were investigated. The results indicate that traditional shot peening has negative effects on corrosion resistance with increasing shot peening intensity; however, the corrosion rate can be reduced by means of multiple shot peening.

  9. Infrared Thermography for Temperature Measurement and Non-Destructive Testing

    PubMed Central

    Usamentiaga, Rubèn; Venegas, Pablo; Guerediaga, Jon; Vega, Laura; Molleda, Julio; Bulnes, Francisco G.

    2014-01-01

    The intensity of the infrared radiation emitted by objects is mainly a function of their temperature. In infrared thermography, this feature is used for multiple purposes: as a health indicator in medical applications, as a sign of malfunction in mechanical and electrical maintenance or as an indicator of heat loss in buildings. This paper presents a review of infrared thermography especially focused on two applications: temperature measurement and non-destructive testing, two of the main fields where infrared thermography-based sensors are used. A general introduction to infrared thermography and the common procedures for temperature measurement and non-destructive testing are presented. Furthermore, developments in these fields and recent advances are reviewed. PMID:25014096

  10. Functional analysis screening for multiple topographies of problem behavior.

    PubMed

    Bell, Marlesha C; Fahmie, Tara A

    2018-04-23

    The current study evaluated a screening procedure for multiple topographies of problem behavior in the context of an ongoing functional analysis. Experimenters analyzed the function of a topography of primary concern while collecting data on topographies of secondary concern. We used visual analysis to predict the function of secondary topographies and a subsequent functional analysis to test those predictions. Results showed that a general function was accurately predicted for five of six (83%) secondary topographies. A specific function was predicted and supported for a subset of these topographies. The experimenters discuss the implication of these results for clinicians who have limited time for functional assessment. © 2018 Society for the Experimental Analysis of Behavior.

  11. CR-Calculus and adaptive array theory applied to MIMO random vibration control tests

    NASA Astrophysics Data System (ADS)

    Musella, U.; Manzato, S.; Peeters, B.; Guillaume, P.

    2016-09-01

    Performing Multiple-Input Multiple-Output (MIMO) tests to reproduce the vibration environment at a user-defined number of control points of a unit under test is necessary in applications where a realistic environment replication has to be achieved. MIMO tests require vibration control strategies to calculate the required drive signal vector that gives an acceptable replication of the target. This target is a (complex) vector with magnitude and phase information at the control points for MIMO Sine Control tests, while in MIMO Random Control tests, in the most general case, the target is a complete spectral density matrix. The idea behind this work is to tailor a MIMO random vibration control approach that can be generalized to other MIMO tests, e.g. MIMO Sine and MIMO Time Waveform Replication. In this work the approach is to use gradient-based procedures over the complex space, applying the so-called CR-Calculus and adaptive array theory. With this approach it is possible to better control the process performance, allowing a step-by-step Jacobian matrix update. The theoretical bases behind the work are followed by an application of the developed method to a two-exciter two-axis system and by performance comparisons with standard methods.
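
    A generic frequency-domain drive-signal correction of the kind that CR-Calculus formalizes is sketched below for a single spectral line of a 2x2 system; it is a plain steepest-descent update with an assumed step size, not the paper's algorithm.

```python
# Sketch of a frequency-domain drive-signal correction for MIMO control: at one
# spectral line, update the drive vector d so that the response H @ d approaches
# the target reference r. This is a generic complex-gradient (steepest-descent)
# update, not the paper's exact CR-Calculus-based algorithm.
import numpy as np

rng = np.random.default_rng(5)
H = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))  # FRF matrix at one line
r = np.array([1.0 + 0.5j, 0.2 - 1.0j])                              # target response
d = np.zeros(2, dtype=complex)                                       # drive vector
mu = 0.05                                                            # assumed step size

for _ in range(500):
    e = r - H @ d                 # control error at this spectral line
    d = d + mu * H.conj().T @ e   # complex-gradient step toward the target

print("residual error:", np.linalg.norm(r - H @ d))
```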

  12. A SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation and analysis of multiple objectives are very important in designing environmentally benign processes. They require a systematic procedure for solving multi-objective decision-making problems due to the complex nature of the problems and the need for complex assessment....

  13. Second-order analysis of semiparametric recurrent event processes.

    PubMed

    Guan, Yongtao

    2011-09-01

    A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value. © 2011, The International Biometric Society.

  14. A simple Lagrangian forecast system with aviation forecast potential

    NASA Technical Reports Server (NTRS)

    Petersen, R. A.; Homan, J. H.

    1983-01-01

    A trajectory forecast procedure is developed which uses geopotential tendency fields obtained from a simple, multiple layer, potential vorticity conservative isentropic model. This model can objectively account for short-term advective changes in the mass field when combined with fine-scale initial analyses. This procedure for producing short-term, upper-tropospheric trajectory forecasts employs a combination of a detailed objective analysis technique, an efficient mass advection model, and a diagnostically proven trajectory algorithm, none of which require extensive computer resources. Results of initial tests are presented, which indicate an exceptionally good agreement for trajectory paths entering the jet stream and passing through an intensifying trough. It is concluded that this technique not only has potential for aiding in route determination, fuel use estimation, and clear air turbulence detection, but also provides an example of the types of short range forecasting procedures which can be applied at local forecast centers using simple algorithms and a minimum of computer resources.

  15. Early detection of disease program: Evaluation of the cellular immune response

    NASA Technical Reports Server (NTRS)

    Criswell, B. S.; Knight, V.; Martin, R. R.; Kasel, J. A.

    1974-01-01

    The early cellular responses of specific components of the leukocyte and epithelial cell populations to foreign challenges of both an infectious and noninfectious character were evaluated. Procedures for screening potential flight crews were developed, documented, and tested on a control population. Methods for preparing suitable populations of lymphocytes, polymorphonuclear leukocytes, macrophages, and epithelial cells were first established and evaluated. Epithelial cells from viral infected individuals were screened with a number of anti-viral antisera. This procedure showed the earliest indication of disease as well as providing a specific diagnosis to the physicians. Both macrophages and polymorphonuclear leukocytes were studied from normal individuals, smokers, and patients with viral infections. Newer techniques enabling better definition of lymphocyte subpopulations were then developed, namely the E and EAC rosette procedures for recognition of T (thymus-derived) and B (bone-marrow-derived) lymphocyte subpopulations. Lymphocyte and lymphocyte subpopulation response to multiple mitogens have been evaluated.

  16. Intraoperative language localization in multilingual patients with gliomas.

    PubMed

    Bello, Lorenzo; Acerbi, Francesco; Giussani, Carlo; Baratta, Pietro; Taccone, Paolo; Songa, Valeria; Fava, Marica; Stocchetti, Nino; Papagno, Costanza; Gaini, Sergio M

    2006-07-01

    Intraoperative localization of speech is problematic in patients who are fluent in different languages. Previous studies have generated various results depending on the series of patients studied, the type of language, and the sensitivity of the tasks applied. It is not clear whether languages are mediated by multiple and separate cortical areas or shared by common areas. Globally considered, previous studies recommended performing a multiple intraoperative mapping for all the languages in which the patient is fluent. The aim of this work was to study the feasibility of performing an intraoperative multiple language mapping in a group of multilingual patients with a glioma undergoing awake craniotomy for tumor removal and to describe the intraoperative cortical and subcortical findings in the area of craniotomy, with the final goal of maximally preserving patients' functional language. Seven late, highly proficient multilingual patients with a left frontal glioma were submitted preoperatively to a battery of tests to evaluate oral language production, comprehension, and repetition. Each language was tested serially starting from the first acquired language. Items that were correctly named during these tests were used to build personalized blocks to be used intraoperatively. Language mapping was undertaken during awake craniotomies by the use of an Ojemann cortical stimulator during counting and oral naming tasks. Subcortical stimulation, using the same current threshold and the same tests, was applied during tumor resection in a back-and-forth fashion. Cortical sites essential for oral naming were found in 87.5% of patients, those for the first acquired language in one to four sites, those for the other languages in one to three sites. Sites for each language were distinct and separate. Number and location of sites were not predictable, being randomly and widely distributed in the cortex around or less frequently over the tumor area. Subcortical stimulations found tracts for the first acquired language in four patients and for the other languages in three patients. Three of these patients decreased their fluency immediately after surgery, affecting the first acquired language, which fully recovered in two patients and partially in one. The procedure was agile and well tolerated by the patients. These findings show that multiple cortical and subcortical language mapping during awake craniotomy for tumor removal is a feasible procedure. They support the concept that intraoperative mapping should be performed for all the languages in which the patient is fluent, in order to preserve functional integrity.

  17. Task-switching cost and repetition priming: two overlooked confounds in the first-set procedure of the Sternberg paradigm and how they affect memory set-size effects.

    PubMed

    Jou, Jerwen

    2014-10-01

    Subjects performed Sternberg-type memory recognition tasks (Sternberg paradigm) in four experiments. Category-instance names were used as learning and testing materials. Sternberg's original experiments demonstrated a linear relation between reaction time (RT) and memory-set size (MSS). A few later studies found no relation, and other studies found a nonlinear relation (logarithmic) between the two variables. These deviations were used as evidence undermining Sternberg's serial scan theory. This study identified two confounding variables in the fixed-set procedure of the paradigm (where multiple probes are presented at test for a learned memory set) that could generate a MSS RT function that was either flat or logarithmic rather than linearly increasing. These two confounding variables were task-switching cost and repetition priming. The former factor worked against smaller memory sets and in favour of larger sets whereas the latter factor worked in the opposite way. Results demonstrated that a null or a logarithmic RT-to-MSS relation could be the artefact of the combined effects of these two variables. The Sternberg paradigm has been used widely in memory research, and a thorough understanding of the subtle methodological pitfalls is crucial. It is suggested that a varied-set procedure (where only one probe is presented at test for a learned memory set) is a more contamination-free procedure for measuring the MSS effects, and that if a fixed-set procedure is used, it is worthwhile examining the RT function of the very first trials across the MSSs, which are presumably relatively free of contamination by the subsequent trials.

  18. Identification of treatment responders based on multiple longitudinal outcomes with applications to multiple sclerosis patients.

    PubMed

    Kondo, Yumi; Zhao, Yinshan; Petkau, John

    2017-05-30

    Identification of treatment responders is a challenge in comparative studies where treatment efficacy is measured by multiple longitudinally collected continuous and count outcomes. Existing procedures often identify responders on the basis of only a single outcome. We propose a novel multiple longitudinal outcome mixture model that assumes that, conditionally on a cluster label, each longitudinal outcome is from a generalized linear mixed effect model. We utilize a Monte Carlo expectation-maximization algorithm to obtain the maximum likelihood estimates of our high-dimensional model and classify patients according to their estimated posterior probability of being a responder. We demonstrate the flexibility of our novel procedure on two multiple sclerosis clinical trial datasets with distinct data structures. Our simulation study shows that incorporating multiple outcomes improves the responder identification performance; this can occur even if some of the outcomes are ineffective. Our general procedure facilitates the identification of responders who are comprehensively defined by multiple outcomes from various distributions. Copyright © 2017 John Wiley & Sons, Ltd.

  19. Pre-treatment red blood cell distribution width provides prognostic information in multiple myeloma.

    PubMed

    Zhou, Di; Xu, Peipei; Peng, Miaoxin; Shao, Xiaoyan; Wang, Miao; Ouyang, Jian; Chen, Bing

    2018-06-01

    The red blood cell distribution width (RDW), a credible marker for abnormal erythropoiesis, has recently been studied as a prognostic factor in oncology, but its role in multiple myeloma (MM) has not been thoroughly investigated. We performed a retrospective study in 162 patients with multiple myeloma. Categorical parameters were analyzed using the Pearson chi-squared test. The Mann-Whitney and Wilcoxon tests were used for group comparisons. Comparisons of repeated-measures data were analyzed with the general linear model repeated-measures procedure. The Kaplan-Meier product-limit method was used to determine OS and PFS, and the differences were assessed by the log-rank test. A high RDW baseline was significantly associated with indexes including haemoglobin, bone marrow plasma cell infiltration, and cytogenetics risk stratification. After chemotherapy, the overall response rate (ORR) decreased as the RDW baseline increased. In 24 patients with a high RDW baseline, the RDW value decreased when patients achieved complete remission (CR) but increased when the disease progressed. The normal-RDW baseline group showed both longer overall survival (OS) and longer progression-free survival (PFS) than the high-RDW baseline group. Our study suggests that the pre-treatment RDW level is a prognostic factor in MM and should be regarded as an important parameter for assessment of therapeutic efficiency. Copyright © 2018. Published by Elsevier B.V.
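
    The survival comparison named above (Kaplan-Meier estimation and a log-rank test between the normal-RDW and high-RDW groups) can be sketched as follows, assuming the lifelines package is available; all survival times below are synthetic.

```python
# Sketch of the survival comparison described: Kaplan-Meier estimates and a
# log-rank test for overall survival in normal-RDW vs high-RDW groups.
# Uses the lifelines package (assumed available); data are synthetic.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(6)
t_normal = rng.exponential(60, 90)            # months, hypothetical OS times
t_high   = rng.exponential(35, 72)
e_normal = rng.binomial(1, 0.7, 90)           # 1 = death observed, 0 = censored
e_high   = rng.binomial(1, 0.7, 72)

km_n = KaplanMeierFitter().fit(t_normal, event_observed=e_normal, label="normal RDW")
km_h = KaplanMeierFitter().fit(t_high, event_observed=e_high, label="high RDW")
print("median OS:", km_n.median_survival_time_, "vs", km_h.median_survival_time_)

result = logrank_test(t_normal, t_high, event_observed_A=e_normal, event_observed_B=e_high)
print("log-rank p-value:", result.p_value)
```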

  20. Analysis of Cosmetic Topics on the Plastic Surgery In-Service Training Exam.

    PubMed

    Silvestre, Jason; Taglienti, Anthony J; Serletti, Joseph M; Chang, Benjamin

    2015-08-01

    The Plastic Surgery In-Service Training Exam (PSITE) is a multiple-choice examination taken by plastic surgery trainees to provide an assessment of plastic surgery knowledge. The purpose of this study was to evaluate cosmetic questions and determine overlap with national procedural data. Digital syllabi of six consecutive PSITE administrations (2008-2013) were analyzed for cosmetic surgery topics. Questions were classified by taxonomy, focus, anatomy, and procedure. Answer references were tabulated by source. Relationships between tested material and national procedural volume were assessed via Pearson correlation. A total of 301 questions addressed cosmetic topics (26% of all questions) and 20 required image interpretations (7%). Question-stem taxonomy favored decision-making (40%) and recall (37%) skills over interpretation (23%, P < .001). Answers focused on treatments/outcomes (67%) over pathology/anatomy (20%) and diagnoses (13%, P < .001). Tested procedures were largely surgical (85%) and focused on the breast (25%), body (18%), nose (13%), and eye (10%). The most common surgeries were breast augmentation (12%), rhinoplasty (11%), blepharoplasty (10%), and body contouring (6%). Minimally invasive procedures were lasers (5%), neuromodulators (4%), and fillers (3%). Plastic and Reconstructive Surgery (58%), Clinics in Plastic Surgery (7%), and Aesthetic Surgery Journal (6%) were the most cited journals, with a median 5-year publication lag. There was poor correlation between PSITE content and procedural volume data (r² = 0.138, P = .539). Plastic surgeons receive routine evaluation of cosmetic surgery knowledge. These data may help optimize clinical and didactic experiences for training in cosmetic surgery. © 2015 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com.

  1. Tensile bond strength of an aged resin composite repaired with different protocols.

    PubMed

    Celik, Esra Uzer; Ergücü, Zeynep; Türkün, L Sebnem; Ercan, Utku Kürșat

    2011-08-01

    To evaluate the effect of different surface treatments and bonding procedures on the tensile bond strength (TBS) of resin composites repaired 6 months after polymerization. Resin composite sticks were aged in distilled water at 37°C for 6 months. They were divided into 12 groups (n = 10) according to the combination of surface treatment/bonding procedures [none, only bur treatment, XP Bond (XPB/Dentsply/DeTrey) with/without bur, AdheSE (A-SE/Ivoclar/Vivadent) with/without bur, Composite Primer (CP/GC) with/without bur, CP after bur and acid-etching, XPB after acid etching and CP with bur, A-SE after bur and CP]. The ultimate tensile bond strength (UTS) of the resin composites was tested in intact but aged specimens. Tensile bond strengths were tested with a universal testing machine (Shimadzu). Data were analyzed using one-way ANOVA and Duncan multiple comparison tests (p < 0.05). All repaired groups showed significantly higher TBS than the group without any surface treatment (p < 0.05). Four groups resulted in TBS similar to those of intact resin composite UTS: A-SE, A-SE with bur, A-SE after CP with bur, and XPB after acid etching + CP with bur. Bur treatment, silane primer or etch-and-rinse adhesive application alone were not successful in the repair process of aged resin composite, whereas self-etching adhesive alone showed similar performance to the intact specimens. Combined procedures generally showed better performance: A-SE with bur, A-SE after CP with bur, and XPB after acid etching + CP with bur showed TBS similar to those of the intact specimens. It was concluded that bur roughening of the surfaces and rebonding procedures were essential for repairing aged resin composites.
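
    The statistical comparison described, one-way ANOVA across repair protocols followed by post hoc multiple comparisons, can be sketched as below; Tukey's HSD is used as a widely available stand-in for the Duncan test, and all bond-strength values are hypothetical.

```python
# Sketch of the comparison described: one-way ANOVA across repair protocols
# followed by a post hoc multiple-comparison test. Tukey's HSD stands in for
# the Duncan test; group names and TBS values (MPa) are hypothetical.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(7)
groups = {
    "no_treatment": rng.normal(18, 4, 10),
    "bur_only":     rng.normal(27, 4, 10),
    "A-SE":         rng.normal(35, 4, 10),
    "A-SE_bur":     rng.normal(37, 4, 10),
}

F, p = f_oneway(*groups.values())
print(f"one-way ANOVA: F = {F:.2f}, p = {p:.4f}")

values = np.concatenate(list(groups.values()))
labels = np.repeat(list(groups.keys()), [len(v) for v in groups.values()])
print(pairwise_tukeyhsd(values, labels, alpha=0.05))
```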

  2. Robust estimation of the proportion of treatment effect explained by surrogate marker information.

    PubMed

    Parast, Layla; McDermott, Mary M; Tian, Lu

    2016-05-10

    In randomized treatment studies where the primary outcome requires long follow-up of patients and/or expensive or invasive procedures to obtain, the availability of a surrogate marker that could be used to estimate the treatment effect and could potentially be observed earlier than the primary outcome would allow researchers to draw conclusions regarding the treatment effect with less required follow-up time and resources. The Prentice criterion for a valid surrogate marker requires that a test for treatment effect on the surrogate marker also be a valid test for treatment effect on the primary outcome of interest. Based on this criterion, methods have been developed to define and estimate the proportion of treatment effect on the primary outcome that is explained by the treatment effect on the surrogate marker. These methods aim to identify useful statistical surrogates that capture a large proportion of the treatment effect. However, current methods to estimate this proportion usually require restrictive model assumptions that may not hold in practice and thus may lead to biased estimates of this quantity. In this paper, we propose a nonparametric procedure to estimate the proportion of treatment effect on the primary outcome that is explained by the treatment effect on a potential surrogate marker and extend this procedure to a setting with multiple surrogate markers. We compare our approach with previously proposed model-based approaches and propose a variance estimation procedure based on a perturbation-resampling method. Simulation studies demonstrate that the procedure performs well in finite samples and outperforms model-based procedures when the specified models are not correct. We illustrate our proposed procedure using a data set from a randomized study investigating a group-mediated cognitive behavioral intervention for peripheral artery disease participants. Copyright © 2015 John Wiley & Sons, Ltd.
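
    For orientation, the simplest model-based version of the "proportion of treatment effect explained" (a Freedman-style regression contrast, not the robust nonparametric estimator proposed in the paper) can be sketched as follows with synthetic data.

```python
# Simple model-based illustration of the "proportion of treatment effect
# explained" by a surrogate (Freedman-style): PTE = 1 - beta_adj / beta_unadj,
# where beta_adj is the treatment coefficient after adjusting for the surrogate.
# The paper's robust nonparametric estimator is not reproduced here.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 1000
treat = rng.integers(0, 2, n)
surrogate = 1.5 * treat + rng.standard_normal(n)                  # surrogate marker
outcome = 2.0 * surrogate + 0.5 * treat + rng.standard_normal(n)  # primary outcome

beta_unadj = sm.OLS(outcome, sm.add_constant(treat)).fit().params[1]
X_adj = sm.add_constant(np.column_stack([treat, surrogate]))
beta_adj = sm.OLS(outcome, X_adj).fit().params[1]

pte = 1 - beta_adj / beta_unadj
print(f"unadjusted effect {beta_unadj:.2f}, adjusted {beta_adj:.2f}, PTE = {pte:.2f}")
```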

  3. Comparison of Deep Brain Stimulation Lead Targeting Accuracy and Procedure Duration between 1.5- and 3-Tesla Interventional Magnetic Resonance Imaging Systems: An Initial 12-Month Experience.

    PubMed

    Southwell, Derek G; Narvid, Jared A; Martin, Alastair J; Qasim, Salman E; Starr, Philip A; Larson, Paul S

    2016-01-01

    Interventional magnetic resonance imaging (iMRI) allows deep brain stimulator lead placement under general anesthesia. While the accuracy of lead targeting has been described for iMRI systems utilizing 1.5-tesla magnets, a similar assessment of 3-tesla iMRI procedures has not been performed. To compare targeting accuracy, the number of lead targeting attempts, and surgical duration between procedures performed on 1.5- and 3-tesla iMRI systems. Radial targeting error, the number of targeting attempts, and procedure duration were compared between surgeries performed on 1.5- and 3-tesla iMRI systems (SmartFrame and ClearPoint systems). During the first year of operation of each system, 26 consecutive leads were implanted using the 1.5-tesla system, and 23 consecutive leads were implanted using the 3-tesla system. There was no significant difference in radial error (Mann-Whitney test, p = 0.26), number of lead placements that required multiple targeting attempts (Fisher's exact test, p = 0.59), or bilateral procedure durations between surgeries performed with the two systems (p = 0.15). Accurate DBS lead targeting can be achieved with iMRI systems utilizing either 1.5- or 3-tesla magnets. The use of a 3-tesla magnet, however, offers improved visualization of the target structures and allows comparable accuracy and efficiency of placement at the selected targets. © 2016 S. Karger AG, Basel.
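
    The two univariate comparisons named above can be sketched with standard SciPy routines; the error values and attempt counts below are hypothetical, not the study's data.

```python
# Sketch of the two comparisons named in the abstract: Mann-Whitney U test for
# radial targeting error and Fisher's exact test for the number of lead
# placements needing multiple targeting attempts. Data here are hypothetical.
import numpy as np
from scipy.stats import mannwhitneyu, fisher_exact

rng = np.random.default_rng(9)
err_15T = rng.gamma(2.0, 0.4, 26)      # mm, radial error, 1.5-T system (26 leads)
err_3T  = rng.gamma(2.0, 0.35, 23)     # mm, radial error, 3-T system (23 leads)

u, p_err = mannwhitneyu(err_15T, err_3T, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p_err:.3f}")

# rows: system; columns: [multiple attempts, single attempt] (hypothetical counts)
table = [[4, 22],
         [3, 20]]
odds, p_attempts = fisher_exact(table)
print(f"Fisher's exact: OR = {odds:.2f}, p = {p_attempts:.3f}")
```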

  4. Objective assessment of operator performance during ultrasound-guided procedures.

    PubMed

    Tabriz, David M; Street, Mandie; Pilgram, Thomas K; Duncan, James R

    2011-09-01

    Simulation permits objective assessment of operator performance in a controlled and safe environment. Image-guided procedures often require accurate needle placement, and we designed a system to monitor how ultrasound guidance is used to monitor needle advancement toward a target. The results were correlated with other estimates of operator skill. The simulator consisted of a tissue phantom, ultrasound unit, and electromagnetic tracking system. Operators were asked to guide a needle toward a visible point target. Performance was video-recorded and synchronized with the electromagnetic tracking data. A series of algorithms based on motor control theory and human information processing were used to convert raw tracking data into different performance indices. Scoring algorithms converted the tracking data into efficiency, quality, task difficulty, and targeting scores that were aggregated to create performance indices. After initial feasibility testing, a standardized assessment was developed. Operators (N = 12) with a broad spectrum of skill and experience were enrolled and tested. Overall scores were based on performance during ten simulated procedures. Prior clinical experience was used to independently estimate operator skill. When summed, the performance indices correlated well with estimated skill. Operators with minimal or no prior experience scored markedly lower than experienced operators. The overall score tended to increase according to operator's clinical experience. Operator experience was linked to decreased variation in multiple aspects of performance. The aggregated results of multiple trials provided the best correlation between estimated skill and performance. A metric for the operator's ability to maintain the needle aimed at the target discriminated between operators with different levels of experience. This study used a highly focused task model, standardized assessment, and objective data analysis to assess performance during simulated ultrasound-guided needle placement. The performance indices were closely related to operator experience.

  5. Quality specifications for articles of botanical origin from the United States Pharmacopeia.

    PubMed

    Ma, Cuiying; Oketch-Rabah, Hellen; Kim, Nam-Cheol; Monagas, Maria; Bzhelyansky, Anton; Sarma, Nandakumara; Giancaspro, Gabriel

    2018-06-01

    In order to define appropriate quality of botanical dietary supplements, botanical drugs, and herbal medicines, the United States Pharmacopeia (USP) and the Herbal Medicines Compendium (HMC) contain science-based quality standards that include multiple interrelated tests to provide a full quality characterization for each article in terms of its identity, purity, and content. To provide a comprehensive description of the pharmacopeial tests and requirements for articles of botanical origin in the aforementioned compendia. Selective chromatographic procedures, such as high-performance liquid chromatography (HPLC) and high-performance thin-layer chromatography (HPTLC), are used as Identification tests in pharmacopeial monographs to detect species substitution or other confounders. HPLC quantitative tests are typically used to determine the content of key constituents, i.e., the total or individual amount of plant secondary metabolites that are considered bioactive constituents or analytical marker compounds. Purity specifications are typically set to limit the content of contaminants such as toxic elements, pesticides, and fungal toxins. Additional requirements highlight the importance of naming, definition, use of reference materials, and packaging/storage conditions. Technical requirements for each section of the monographs were illustrated with specific examples. Tests were performed on authentic samples using pharmacopeial reference standards. The chromatographic analytical procedures were validated to provide characteristic profiles for the identity and/or accurate determination of the content of quality markers. The multiple tests included in each monograph complement each other to provide an appropriate pharmacopeial quality characterization for the botanicals used as herbal medicines and dietary supplements. The monographs provide detailed specifications for identity, content of bioactive constituents or quality markers, and limits of contaminants, adulterants, and potentially toxic substances. Additional requirements such as labeling and packaging further contribute to preserve the quality of these products. Compliance with pharmacopeial specifications should be required to ensure the reliability of botanical articles used for health care purposes. Copyright © 2018. Published by Elsevier GmbH.

  6. Computationally efficient stochastic optimization using multiple realizations

    NASA Astrophysics Data System (ADS)

    Bayer, P.; Bürger, C. M.; Finkel, M.

    2008-02-01

    The presented study is concerned with computationally efficient methods for solving stochastic optimization problems involving multiple equally probable realizations of uncertain parameters. A new and straightforward technique is introduced that is based on dynamically ordering the stack of realizations during the search procedure. The rationale is that a small number of critical realizations govern the output of a reliability-based objective function. By utilizing a problem that is typical of water supply well field design, several variants of this "stack ordering" approach are tested. The results are statistically assessed in terms of optimality and nominal reliability. This study demonstrates that the simple ordering of a set of 500 realizations while applying an evolutionary search algorithm can save about half of the model runs without compromising the optimization procedure. More advanced variants of stack ordering can, if properly configured, save more than 97% of the computational effort that would be required if the entire number of realizations were considered. The findings herein are promising for similar problems of water management and reliability-based design in general, and particularly for non-convex problems that require heuristic search techniques.
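
    The basic "stack ordering" idea, evaluating the most critical realizations first so that unreliable candidate designs are discarded after few model runs, can be sketched as below; the failure model, reliability target, and reordering rule are hypothetical and simpler than the variants tested in the study.

```python
# Minimal sketch of the "stack ordering" idea: evaluate each candidate design
# against the realizations in an order that puts previously critical (failing)
# realizations first, so unreliable candidates are rejected after few model
# runs. The failure model and reliability target below are hypothetical.
import numpy as np

rng = np.random.default_rng(10)
n_real = 500
realizations = rng.lognormal(mean=0.0, sigma=0.5, size=n_real)  # uncertain parameter factors
failure_counts = np.zeros(n_real)                                # how often each realization was critical

def fails(design, realization):
    """Hypothetical failure model: design 'capacity' must exceed demand."""
    return design * realization < 1.5

def reliable(design, target=0.95):
    """Check reliability with early rejection, trying critical realizations first."""
    order = np.argsort(-failure_counts)          # most critical realizations first
    allowed_failures = int((1 - target) * n_real)
    n_fail = 0
    for runs, idx in enumerate(order, start=1):
        if fails(design, realizations[idx]):
            n_fail += 1
            failure_counts[idx] += 1             # dynamically reorder the stack
            if n_fail > allowed_failures:
                return False, runs               # rejected early: model runs saved
    return True, n_real

for design in (1.0, 2.0, 4.0, 6.0):
    ok, runs = reliable(design)
    print(f"design {design}: reliable={ok}, model runs used={runs}")
```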

  7. Joint Forward Area Air Defense Test Program Definition.

    DTIC Science & Technology

    1984-03-30

    ... kill ratios between single- and multiple-pass aircraft. A "multivariate analysis" will be performed to determine if there is any significant ... killed will be compared for each set of identification procedures. A "multivariate analysis" will be performed on the number of hostile and friendly ...

  8. Operator priming and generalization of practice in adults' simple arithmetic.

    PubMed

    Chen, Yalin; Campbell, Jamie I D

    2016-04-01

    There is a renewed debate about whether educated adults solve simple addition problems (e.g., 2 + 3) by direct fact retrieval or by fast, automatic counting-based procedures. Recent research testing adults' simple addition and multiplication showed that a 150-ms preview of the operator (+ or ×) facilitated addition, but not multiplication, suggesting that a general addition procedure was primed by the + sign. In Experiment 1 (n = 36), we applied this operator-priming paradigm to rule-based problems (0 + N = N, 1 × N = N, 0 × N = 0) and 1 + N problems with N ranging from 0 to 9. For the rule-based problems, we found both operator-preview facilitation and generalization of practice (e.g., practicing 0 + 3 sped up unpracticed 0 + 8), the latter being a signature of procedure use; however, we also found operator-preview facilitation for 1 + N in the absence of generalization, which implies the 1 + N problems were solved by fact retrieval but nonetheless were facilitated by an operator preview. Thus, the operator preview effect does not discriminate procedure use from fact retrieval. Experiment 2 (n = 36) investigated whether a population with advanced mathematical training-engineering and computer science students-would show generalization of practice for nonrule-based simple addition problems (e.g., 1 + 4, 4 + 7). The 0 + N problems again presented generalization, whereas no nonzero problem type did; but all nonzero problems sped up when the identical problems were retested, as predicted by item-specific fact retrieval. The results pose a strong challenge to the generality of the proposal that skilled adults' simple addition is based on fast procedural algorithms, and instead support a fact-retrieval model of fast addition performance. (c) 2016 APA, all rights reserved).

  9. Application of stroboscopic and pulsed-laser electronic speckle pattern interferometry (ESPI) to modal analysis problems

    NASA Astrophysics Data System (ADS)

    Van der Auweraer, H.; Steinbichler, H.; Vanlanduit, S.; Haberstok, C.; Freymann, R.; Storer, D.; Linet, V.

    2002-04-01

    Accurate structural models are key to the optimization of the vibro-acoustic behaviour of panel-like structures. However, at the frequencies of relevance to the acoustic problem, the structural modes are very complex, requiring high-spatial-resolution measurements. The present paper discusses a vibration testing system based on pulsed-laser holographic electronic speckle pattern interferometry (ESPI) measurements. It is a characteristic of the method that time-triggered (and not time-averaged) vibration images are obtained. Its integration into a practicable modal testing and analysis procedure is reviewed. The accumulation of results at multiple excitation frequencies allows one to build up frequency response functions. A novel parameter extraction approach using spline-based data reduction and maximum-likelihood parameter estimation was developed. Specific extensions have been added in view of the industrial application of the approach. These include the integration of geometry and response information, the integration of multiple views into one single model, the integration with finite-element model data and the prior identification of the critical panels and critical modes. A global procedure was hence established. The approach has been applied to several industrial case studies, including car panels, the firewall of a monovolume car, a full vehicle, panels of a light truck and a household product. The research was conducted in the context of the EUREKA project HOLOMODAL and the Brite-Euram project SALOME.

  10. What are the appropriate methods for analyzing patient-reported outcomes in randomized trials when data are missing?

    PubMed

    Hamel, J F; Sebille, V; Le Neel, T; Kubis, G; Boyer, F C; Hardouin, J B

    2017-12-01

    Subjective health measurements using Patient Reported Outcomes (PRO) are increasingly used in randomized trials, particularly for comparisons between patient groups. Two main types of analytical strategies can be used for such data: Classical Test Theory (CTT) and Item Response Theory (IRT) models. These two strategies display very similar characteristics when data are complete, but in the common case when data are missing, whether IRT or CTT would be the most appropriate remains unknown and was investigated using simulations. We simulated PRO data such as quality of life data. Missing item responses were simulated as missing completely at random, missing depending on an observable covariate, or missing depending on an unobserved latent trait. The CTT-based methods considered allowed comparing scores using complete-case analysis, personal mean imputation, or multiple imputation based on a two-way procedure. The IRT-based method was the Wald test on a Rasch model including a group covariate. The IRT-based method and the multiple-imputation-based CTT method displayed the highest observed power and were the only unbiased methods whatever the kind of missing data. Online software and Stata® modules compatible with the built-in mi impute suite are provided for performing such analyses. Traditional procedures (listwise deletion and personal mean imputation) should be avoided, due to inevitable problems of bias and lack of power.
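
    To make the contrast concrete, the following is a minimal numpy sketch (not the authors' software or simulation design) comparing two of the CTT-based strategies mentioned above, complete-case analysis and personal mean imputation, on simulated item-level PRO data with responses missing completely at random; all variable names and settings are illustrative.

      # Minimal sketch: complete-case analysis vs. personal mean imputation on
      # simulated item-level PRO data (missing completely at random).
      import numpy as np

      rng = np.random.default_rng(0)
      n, k = 200, 10                                    # patients, items
      group = np.repeat([0, 1], n // 2)                 # two trial arms
      latent = rng.normal(0.3 * group, 1.0)             # arm 1 has a higher trait level
      items = rng.normal(latent[:, None], 1.0, (n, k))  # item responses
      items[rng.random((n, k)) < 0.2] = np.nan          # 20% of responses missing

      # strategy 1: complete-case analysis (drop patients with any missing item)
      complete = ~np.isnan(items).any(axis=1)
      score_cc = items[complete].mean(axis=1)
      diff_cc = score_cc[group[complete] == 1].mean() - score_cc[group[complete] == 0].mean()

      # strategy 2: personal (person-mean) imputation
      row_mean = np.nanmean(items, axis=1, keepdims=True)
      score_pm = np.where(np.isnan(items), row_mean, items).mean(axis=1)
      diff_pm = score_pm[group == 1].mean() - score_pm[group == 0].mean()

      print(f"complete-case group difference: {diff_cc:.3f}")
      print(f"person-mean imputed difference: {diff_pm:.3f}")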

  11. Mapping wildfire vulnerability in Mediterranean Europe. Testing a stepwise approach for operational purposes.

    PubMed

    Oliveira, Sandra; Félix, Fernando; Nunes, Adélia; Lourenço, Luciano; Laneve, Giovanni; Sebastián-López, Ana

    2018-01-15

    Vulnerability assessment is a vital component of wildfire management. This research focused on the development of a framework to measure and map vulnerability levels in several areas within Mediterranean Europe, where wildfires are a major concern. The framework followed a stepwise approach to evaluate its main components, expressed by exposure, sensitivity and coping capacity. Data on population density, fuel types, protected areas location, roads infrastructure and surveillance activities, among others, were integrated to create composite indices, representing each component and articulated in multiple dimensions. Maps were created for several test areas, in northwest Portugal, southwest Sardinia in Italy and northeast Corsica in France, with the contribution of local participants from civil protection institutions and forest services. Results showed the influence of fuel sensitivity levels, population distribution and protected areas coverage for the overall vulnerability classes. Reasonable levels of accuracy were found on the maps provided through the validation procedure, with an overall match above 72% for the several sites. The systematic and flexible approach applied allowed for adjustments to local circumstances with regards to data availability and fire management procedures, without compromising its consistency and with substantial operational capabilities. The results obtained and the positive feedback of end-users encourage its further application, as a means to improve wildfire management strategies at multiple levels with the latest scientific outputs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. The Italian validation of the minimal assessment of cognitive function in multiple sclerosis (MACFIMS) and the application of the Cognitive Impairment Index scoring procedure in MS patients.

    PubMed

    Argento, Ornella; Incerti, Chiara C; Quartuccio, Maria E; Magistrale, Giuseppe; Francia, Ada; Caltagirone, Carlo; Pisani, Valerio; Nocentini, Ugo

    2018-04-27

    Cognitive dysfunction occurs in approximately 50-60% of patients with multiple sclerosis (MS), even in early stages of the disease, and affects different aspects of patients' lives. The aims of the present study were (1) to introduce and validate an Italian version of the minimal assessment of cognitive functions in MS (MACFIMS) battery and (2) to propose the use of the Cognitive Impairment Index (CII) as a scoring procedure to define the degree of impairment in relapsing-remitting (RRMS) and secondary-progressive (SPMS) patients. A total of 240 healthy controls (HC) and 123 MS patients performed the Italian version of the MACFIMS, composed of the same tests as the original except for the Paced Auditory Serial Addition Test. The CII was derived for each score of the 11 scales for participants of both groups. The results of the study show that cognitive impairment affects around 50% of our sample of MS patients. In the RRMS group, only 15.7% of patients showed severe impairment, while in the SPMS group, 51.4% of patients fell into the "severely impaired" category. Results are in line with previously reported percentages of impairment in MS patients, showing that the calculation of the CII applied to the Italian version of the MACFIMS is sensitive and reliable in detecting different degrees of impairment in MS patients.

  13. An Experiment in Scientific Code Semantic Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.

    1998-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, distributed expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. The parsers will automatically recognize and document some static, semantic concepts and locate some program semantic errors. Results are shown for a subroutine test case and a collection of combustion code routines. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.

  14. Crop status evaluations and yield predictions

    NASA Technical Reports Server (NTRS)

    Haun, J. R.

    1975-01-01

    A model was developed for predicting the day on which 50 percent of the wheat crop is planted in North Dakota. This model incorporates location as an independent variable. The Julian date when 50 percent of the crop was planted for the nine divisions of North Dakota over seven years was regressed on 49 candidate variables through a step-down multiple regression procedure. This procedure begins with all of the independent variables and sequentially removes variables that are below a predetermined level of significance after each step. The prediction equation was tested on daily data. The accuracy of the model is considered satisfactory for finding the historic dates on which to initiate the yield prediction model. Growth prediction models were also developed for spring wheat.
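
    The step-down (backward elimination) idea described above can be sketched in a few lines; the example below is illustrative only (it is not the original 49-variable planting-date model), and the data, significance level, and variable names are assumptions.

      # Minimal sketch of step-down (backward elimination) multiple regression.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      X = rng.normal(size=(60, 6))                              # candidate predictors
      y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(size=60)   # only two truly matter

      def step_down(X, y, alpha=0.05):
          """Drop the least significant predictor until all remaining p-values < alpha."""
          cols = list(range(X.shape[1]))
          while cols:
              fit = sm.OLS(y, sm.add_constant(X[:, cols])).fit()
              pvals = fit.pvalues[1:]                           # skip the intercept
              worst = int(np.argmax(pvals))
              if pvals[worst] < alpha:
                  return cols, fit
              cols.pop(worst)                                   # remove the weakest variable
          return cols, None

      kept, fit = step_down(X, y)
      print("retained predictors:", kept)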

  15. Teaching Manual Signs to Adults With Mental Retardation Using Matching-to-Sample Procedures and Stimulus Equivalence

    PubMed Central

    Elias, Nassim Chamel; Goyos, Celso; Saunders, Muriel; Saunders, Richard

    2008-01-01

    The objective of this study was to teach manual signs through an automated matching-to-sample procedure and to test for the emergence of new conditional relations and imitative behaviors. Seven adults with mild to severe mental retardation participated. Four were also hearing impaired. Relations between manual signs (set A) and pictures (set B) were initially taught, followed by the training of corresponding printed words (set C) and pictures (set B). Further presentations of conditional discriminations tested for the emergence of AC, followed by tests for the emergence of imitative signing behavior (D) in the presence of either pictures (B) or printed words (C). Each stimulus set was comprised of 9 elements. The stimuli were still pictures, printed words, and dynamic presentations of manual signs. A pretest was conducted to determine which signs the participants could make pre-experimentally. Teaching was arranged in a multiple baseline design across 3 groups of 3 words each. The purpose of the present study was to determine whether participants would emit manual signs in expressive signs tests as a result of observation (video modeling) during match-to-sample training in the absence of explicit training. Five of the 7 subjects passed tests of emergence and emitted at least 50% of the signs. Two were hearing impaired with signing experience, and 3 were not hearing impaired and had no signing experience. Thus, observation of video recorded manual signs in a matching-to-sample training procedure was effective at establishing some signs by adults with mental retardation. PMID:22477400

  16. An examination of predictive variables toward graduation of minority students in science at a selected urban university

    NASA Astrophysics Data System (ADS)

    Hunter, Evelyn M. Irving

    1998-12-01

    The purpose of this study was to examine the relationship and predictive power of the variables gender, high school GPA, class rank, SAT scores, ACT scores, and socioeconomic status on the graduation rates of minority college students majoring in the sciences at a selected urban university. Data were examined on these variables as they related to minority students majoring in science. The population consisted of 101 minority college students who had majored in the sciences from 1986 to 1996 at an urban university in the southwestern region of Texas. A non-probability sampling procedure, specifically an incidental sampling technique, was used in this study. A profile sheet was developed to record the information regarding the variables. The composite scores from SAT and ACT testing were used in the study. The dichotomous variables gender and socioeconomic status were dummy coded for analysis. For the gender variable, zero (0) indicated male, and one (1) indicated female. Additionally, zero (0) indicated high SES, and one (1) indicated low SES. Two parametric procedures were used to analyze the data in this investigation: multiple correlation and multiple regression. Multiple correlation is a statistical technique that indicates the relationship between one variable and a combination of two or more other variables. The variables socioeconomic status and GPA were found to contribute significantly to the graduation rates of minority students majoring in all sciences and in chemistry (Hypotheses Two and Four). These variables accounted for 7% and 15% of the respective variance in the graduation rates of minority students in the sciences and in chemistry. For Hypotheses One and Three, the predictor variables gender, high school GPA, SAT total scores, class rank, and socioeconomic status did not contribute significantly to the graduation rates of minority students in biology and pharmacy.
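
    As a small illustration of the dummy coding and multiple correlation/regression procedures described above, the snippet below uses hypothetical data (not the study's records); all values and variable names are invented.

      # Hypothetical illustration of dummy coding and multiple correlation.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 101
      gender = rng.integers(0, 2, n)      # 0 = male, 1 = female (dummy coded)
      ses = rng.integers(0, 2, n)         # 0 = high SES, 1 = low SES (dummy coded)
      gpa = rng.normal(3.0, 0.5, n)
      outcome = 0.4 * gpa - 0.3 * ses + rng.normal(0, 0.5, n)   # stand-in criterion

      fit = sm.OLS(outcome, sm.add_constant(np.column_stack([gender, ses, gpa]))).fit()
      print(f"multiple correlation R = {np.sqrt(fit.rsquared):.3f} (R^2 = {fit.rsquared:.3f})")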

  17. Statistical detection of EEG synchrony using empirical bayesian inference.

    PubMed

    Singh, Archana K; Asoh, Hideki; Takeda, Yuji; Phillips, Steven

    2015-01-01

    There is growing interest in understanding how the brain utilizes synchronized oscillatory activity to integrate information across functionally connected regions. Computing phase-locking values (PLV) between EEG signals is a popular method for quantifying such synchronizations and elucidating their role in cognitive tasks. However, high-dimensionality in PLV data incurs a serious multiple testing problem. Standard multiple testing methods in neuroimaging research (e.g., false discovery rate, FDR) suffer severe loss of power, because they fail to exploit the complex dependence structure between hypotheses that vary across spectral, temporal, and spatial dimensions. Previously, we showed that a hierarchical FDR and optimal discovery procedures could be effectively applied for PLV analysis to provide better power than FDR. In this article, we revisit the multiple comparison problem from a new Empirical Bayes perspective and propose the application of the local FDR method (locFDR; Efron, 2001) for PLV synchrony analysis to compute FDR as a posterior probability that an observed statistic belongs to a null hypothesis. We demonstrate the application of Efron's Empirical Bayes approach for PLV synchrony analysis for the first time. We use simulations to validate the specificity and sensitivity of locFDR and a real EEG dataset from a visual search study for experimental validation. We also compare locFDR with hierarchical FDR and optimal discovery procedures in both simulation and experimental analyses. Our simulation results showed that locFDR can effectively control false positives without compromising the power of PLV synchrony inference. Applying locFDR to the experimental data detected more significant discoveries than our previously proposed methods, whereas the standard FDR method failed to detect any significant discoveries.
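
    For readers unfamiliar with the quantities involved, the sketch below computes PLVs via the Hilbert transform and applies a standard Benjamini-Hochberg FDR correction to surrogate-based p-values; it does not implement the locFDR method discussed above, and the signal model, surrogate scheme, and settings are assumptions.

      # Sketch: PLV per channel pair, surrogate p-values, standard BH-FDR correction.
      import numpy as np
      from scipy.signal import hilbert
      from statsmodels.stats.multitest import multipletests

      rng = np.random.default_rng(3)
      n_pairs, n_samples, n_surr = 60, 512, 199

      def plv(x, y):
          """Phase-locking value between two signals via the Hilbert transform."""
          dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
          return np.abs(np.mean(np.exp(1j * dphi)))

      pvals = []
      for pair in range(n_pairs):
          x = rng.normal(size=n_samples)
          noise = rng.normal(size=n_samples)
          y = 0.7 * x + 0.7 * noise if pair < 8 else noise   # first 8 pairs are coupled
          observed = plv(x, y)
          # surrogate distribution: circularly shift y to destroy phase relations
          surr = [plv(x, np.roll(y, rng.integers(1, n_samples))) for _ in range(n_surr)]
          pvals.append((1 + np.sum(np.array(surr) >= observed)) / (n_surr + 1))

      reject, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
      print("pairs declared synchronous:", np.flatnonzero(reject))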

  18. Protein alignment algorithms with an efficient backtracking routine on multiple GPUs.

    PubMed

    Blazewicz, Jacek; Frohmberg, Wojciech; Kierzynka, Michal; Pesch, Erwin; Wojciechowski, Pawel

    2011-05-20

    Pairwise sequence alignment methods are widely used in biological research. The increasing number of sequences is perceived as one of the upcoming challenges for sequence alignment methods in the near future. To overcome this challenge, several GPU (Graphics Processing Unit) computing approaches have been proposed recently. These solutions show the great potential of the GPU platform, but in most cases they address sequence database scanning and compute only the alignment score, whereas the alignment itself is omitted. Thus, the need arose to implement the global and semiglobal Needleman-Wunsch and Smith-Waterman algorithms with the backtracking procedure needed to construct the alignment. In this paper we present a solution that performs the alignment of every given sequence pair, which is a required step for progressive multiple sequence alignment methods, as well as for DNA recognition at the DNA assembly stage. Performed tests show that the implementation, with performance up to 6.3 GCUPS on a single GPU for affine gap penalties, is very efficient in comparison to other CPU and GPU-based solutions. Moreover, multiple-GPU support with load balancing makes the application very scalable. The article shows that the backtracking procedure of the sequence alignment algorithms may be designed to fit in with the GPU architecture. Therefore, our algorithm, apart from scores, is able to compute pairwise alignments. This opens a wide range of new possibilities, allowing other methods from the area of molecular biology to take advantage of the new computational architecture. Performed tests show that the efficiency of the implementation is excellent. Moreover, the speed of our GPU-based algorithms can be almost linearly increased when using more than one graphics card.
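
    For orientation, the global Needleman-Wunsch recurrence with a backtracking routine can be written compactly on the CPU; the sketch below (linear gap penalty, illustrative scoring values) is not the GPU implementation described above.

      # CPU reference sketch of Needleman-Wunsch global alignment with backtracking.
      import numpy as np

      def needleman_wunsch(a, b, match=2, mismatch=-1, gap=-2):
          n, m = len(a), len(b)
          score = np.zeros((n + 1, m + 1))
          score[:, 0] = gap * np.arange(n + 1)
          score[0, :] = gap * np.arange(m + 1)
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  diag = score[i - 1, j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                  score[i, j] = max(diag, score[i - 1, j] + gap, score[i, j - 1] + gap)

          # backtracking: reconstruct one optimal alignment from the score matrix
          top, bottom, i, j = [], [], n, m
          while i > 0 or j > 0:
              step = match if i > 0 and j > 0 and a[i - 1] == b[j - 1] else mismatch
              if i > 0 and j > 0 and score[i, j] == score[i - 1, j - 1] + step:
                  top.append(a[i - 1]); bottom.append(b[j - 1]); i -= 1; j -= 1
              elif i > 0 and score[i, j] == score[i - 1, j] + gap:
                  top.append(a[i - 1]); bottom.append("-"); i -= 1
              else:
                  top.append("-"); bottom.append(b[j - 1]); j -= 1
          return score[n, m], "".join(reversed(top)), "".join(reversed(bottom))

      s, row1, row2 = needleman_wunsch("GATTACA", "GCATGCU")
      print(s); print(row1); print(row2)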

  19. Obtaining eigensolutions for multiple frequency ranges in a single NASTRAN execution

    NASA Technical Reports Server (NTRS)

    Pamidi, P. R.; Brown, W. K.

    1990-01-01

    A novel and general procedure for obtaining eigenvalues and eigenvectors for multiple frequency ranges in a single NASTRAN execution is presented. The scheme is applicable to normal modes analyses employing the FEER and Inverse Power methods of eigenvalue extraction. The procedure is illustrated by examples.

  20. Accelerated Enveloping Distribution Sampling: Enabling Sampling of Multiple End States while Preserving Local Energy Minima.

    PubMed

    Perthold, Jan Walther; Oostenbrink, Chris

    2018-05-17

    Enveloping distribution sampling (EDS) is an efficient approach to calculate multiple free-energy differences from a single molecular dynamics (MD) simulation. However, the construction of an appropriate reference-state Hamiltonian that samples all states efficiently is not straightforward. We propose a novel approach for the construction of the EDS reference-state Hamiltonian, related to a previously described procedure to smoothen energy landscapes. In contrast to previously suggested EDS approaches, our reference-state Hamiltonian preserves local energy minima of the combined end-states. Moreover, we propose an intuitive, robust and efficient parameter optimization scheme to tune EDS Hamiltonian parameters. We demonstrate the proposed method with established and novel test systems and conclude that our approach allows for the automated calculation of multiple free-energy differences from a single simulation. Accelerated EDS promises to be a robust and user-friendly method to compute free-energy differences based on solid statistical mechanics.
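
    For context, the conventional EDS reference-state Hamiltonian is commonly written as below, with N end-state Hamiltonians H_i, smoothness parameter s, energy offsets E_i^R, and beta = 1/(k_B T); the accelerated variant proposed in this work modifies this construction, and its exact form should be taken from the paper.

      H_R(\mathbf{r}) = -\frac{1}{\beta s}\,\ln \sum_{i=1}^{N} \exp\!\left[-\beta s \left(H_i(\mathbf{r}) - E_i^{R}\right)\right]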

  1. Statistical Significance for Hierarchical Clustering

    PubMed Central

    Kimes, Patrick K.; Liu, Yufeng; Hayes, D. Neil; Marron, J. S.

    2017-01-01

    Cluster analysis has proved to be an invaluable tool for the exploratory and unsupervised analysis of high dimensional datasets. Among methods for clustering, hierarchical approaches have enjoyed substantial popularity in genomics and other fields for their ability to simultaneously uncover multiple layers of clustering structure. A critical and challenging question in cluster analysis is whether the identified clusters represent important underlying structure or are artifacts of natural sampling variation. Few approaches have been proposed for addressing this problem in the context of hierarchical clustering, for which the problem is further complicated by the natural tree structure of the partition, and the multiplicity of tests required to parse the layers of nested clusters. In this paper, we propose a Monte Carlo based approach for testing statistical significance in hierarchical clustering which addresses these issues. The approach is implemented as a sequential testing procedure guaranteeing control of the family-wise error rate. Theoretical justification is provided for our approach, and its power to detect true clustering structure is illustrated through several simulation studies and applications to two cancer gene expression datasets. PMID:28099990
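
    A simplified Monte Carlo check of a single split can be sketched as below: compare a two-cluster index from hierarchical clustering against the same index computed on Gaussian null data. This is not the authors' sequential procedure, which additionally controls the family-wise error rate across the nested splits; the data, statistic, and null model here are assumptions.

      # Simplified Monte Carlo test of one hierarchical split against a Gaussian null.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(4)

      def two_cluster_index(X):
          """Within-cluster sum of squares of the top 2-way split over the total."""
          labels = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")
          total = ((X - X.mean(axis=0)) ** 2).sum()
          within = sum(((X[labels == k] - X[labels == k].mean(axis=0)) ** 2).sum()
                       for k in (1, 2))
          return within / total            # smaller = stronger clustering

      # data with two genuine clusters
      X = np.vstack([rng.normal(0, 1, (40, 5)), rng.normal(3, 1, (40, 5))])
      observed = two_cluster_index(X)

      # null: multivariate Gaussian with the observed mean and covariance
      mu, cov = X.mean(axis=0), np.cov(X, rowvar=False)
      null = [two_cluster_index(rng.multivariate_normal(mu, cov, size=len(X)))
              for _ in range(200)]
      pval = (1 + np.sum(np.array(null) <= observed)) / (1 + len(null))
      print(f"cluster index = {observed:.3f}, Monte Carlo p = {pval:.3f}")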

  2. Detection of Salmonella spp. in veterinary samples by combining selective enrichment and real-time PCR.

    PubMed

    Goodman, Laura B; McDonough, Patrick L; Anderson, Renee R; Franklin-Guild, Rebecca J; Ryan, James R; Perkins, Gillian A; Thachil, Anil J; Glaser, Amy L; Thompson, Belinda S

    2017-11-01

    Rapid screening for enteric bacterial pathogens in clinical environments is essential for biosecurity. Salmonella found in veterinary hospitals, particularly Salmonella enterica serovar Dublin, can pose unique challenges for culture and testing because of its poor growth. Multiple Salmonella serovars including Dublin are emerging threats to public health given increasing prevalence and antimicrobial resistance. We adapted an automated food testing method to veterinary samples and evaluated the performance of the method in a variety of matrices including environmental samples ( n = 81), tissues ( n = 52), feces ( n = 148), and feed ( n = 29). A commercial kit was chosen as the basis for this approach in view of extensive performance characterizations published by multiple independent organizations. A workflow was established for efficiently and accurately testing veterinary matrices and environmental samples by use of real-time PCR after selective enrichment in Rappaport-Vassiliadis soya (RVS) medium. Using this method, the detection limit for S. Dublin improved by 100-fold over subculture on selective agars (eosin-methylene blue, brilliant green, and xylose-lysine-deoxycholate). Overall, the procedure was effective in detecting Salmonella spp. and provided next-day results.

  3. Mixture-based gatekeeping procedures in adaptive clinical trials.

    PubMed

    Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji

    2018-01-01

    Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.
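
    As a point of reference, the combination function approach mentioned above is often instantiated with an inverse-normal combination of stage-wise p-values; the snippet below is a minimal sketch of that single ingredient, with assumed equal weights, and is not the paper's mixture-based gatekeeping procedure.

      # Minimal sketch of an inverse-normal combination of two stage-wise p-values.
      import numpy as np
      from scipy.stats import norm

      def inverse_normal_combination(p1, p2, w1=0.5, w2=0.5):
          """Combine first- and second-stage p-values with prespecified weights
          (w1 + w2 = 1); returns the combined p-value."""
          z = (np.sqrt(w1) * norm.isf(p1) + np.sqrt(w2) * norm.isf(p2)) / np.sqrt(w1 + w2)
          return norm.sf(z)

      print(f"combined p = {inverse_normal_combination(0.08, 0.03):.4f}")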

  4. Invasive prenatal diagnosis of fetal thalassemia.

    PubMed

    Li, Dong-Zhi; Yang, Yan-Dong

    2017-02-01

    Thalassemia is the most common monogenic inherited disease worldwide, affecting individuals originating from many countries to various extents. As the disease requires long-term care, the homozygous state presents a substantial global disease burden, and its prevention is therefore a priority. Comprehensive prevention programs involve carrier detection, molecular diagnostics, genetic counseling, and prenatal diagnosis. Invasive prenatal diagnosis refers to obtaining fetal material by chorionic villus sampling (CVS) in the first trimester, and by amniocentesis or cordocentesis in the second trimester. Molecular diagnosis, which includes multiple techniques aimed at the detection of mutations in the α- or β-globin genes, facilitates prenatal diagnosis and definitive diagnosis of the fetus. These are valuable procedures for couples at risk, so that they can be offered options to have healthy offspring. According to local practices and legislation, genetic counseling should accompany the invasive diagnostic procedures, DNA testing, and disclosure of the results. The most critical issue in any type of prenatal molecular testing is maternal cell contamination (MCC), especially when a fetus is found to inherit a particular mutation from the mother. The best practice is to perform MCC studies on all prenatal samples. Recent successful studies of fetal DNA in maternal plasma may allow future prenatal testing that is non-invasive for the fetus and result in a significant reduction in invasive diagnostic procedures. Copyright © 2016. Published by Elsevier Ltd.

  5. Risk factors for indications of intraoperative blood transfusion among patients undergoing surgical treatment for colorectal adenocarcinoma.

    PubMed

    Gonçalves, Iara; Linhares, Marcelo; Bordin, Jose; Matos, Delcio

    2009-01-01

    Identification of risk factors for requiring transfusions during surgery for colorectal cancer may lead to preventive actions or alternative measures aimed at decreasing the use of blood components in these procedures and at rationalizing resource use in hemotherapy services. This was a retrospective case-control study using data from 383 patients who were treated surgically for colorectal adenocarcinoma at 'Fundação Pio XII', in Barretos-SP, Brazil, between 1999 and 2003. The objective was to identify significant risk factors for requiring intraoperative blood transfusion in colorectal cancer surgical procedures. Univariate analyses were performed using Fisher's exact test or the chi-squared test for dichotomous variables and Student's t test for continuous variables, followed by multivariate analysis using multiple logistic regression. In the univariate analyses, height (P = 0.06), glycemia (P = 0.05), previous abdominal or pelvic surgery (P = 0.031), abdominoperineal surgery (P<0.001), extended surgery (P<0.001) and intervention with radical intent (P<0.001) were considered significant. In the multivariate analysis using logistic regression, intervention with radical intent (OR = 10.249, P<0.001, 95% CI = 3.071-34.212) and abdominoperineal amputation (OR = 3.096, P = 0.04, 95% CI = 1.445-6.623) were considered to be independently significant. This investigation allows the conclusion that radical intervention and the abdominoperineal procedure in the surgical treatment of colorectal adenocarcinoma are risk factors for requiring intraoperative blood transfusion.

  6. Statistics in biomedical laboratory and clinical science: applications, issues and pitfalls.

    PubMed

    Ludbrook, John

    2008-01-01

    This review is directed at biomedical scientists who want to gain a better understanding of statistics: what tests to use, when, and why. In my view, even during the planning stage of a study it is very important to seek the advice of a qualified biostatistician. When designing and analyzing a study, it is important to construct and test global hypotheses, rather than to make multiple tests on the data. If the latter cannot be avoided, it is essential to control the risk of making false-positive inferences by applying multiple comparison procedures. For comparing two means or two proportions, it is best to use exact permutation tests rather than the better-known classical ones. For comparing many means, analysis of variance, often of a complex type, is the most powerful approach. The correlation coefficient should never be used to compare the performances of two methods of measurement, or two measures, because it does not detect bias. Instead the Altman-Bland method of differences or least-products linear regression analysis should be preferred. Finally, the educational value to investigators of interaction with a biostatistician, before, during and after a study, cannot be overemphasized. (c) 2007 S. Karger AG, Basel.
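
    The exact permutation test recommended above for comparing two means can be approximated by Monte Carlo resampling of group labels; the sketch below uses invented data and a fixed number of random permutations.

      # Monte Carlo approximation to a two-sample permutation test on means.
      import numpy as np

      rng = np.random.default_rng(5)
      a = np.array([4.1, 5.2, 3.8, 6.0, 5.5])
      b = np.array([6.3, 7.1, 5.9, 6.8, 7.4])
      observed = a.mean() - b.mean()

      pooled = np.concatenate([a, b])
      n_perm, count = 100_000, 0
      for _ in range(n_perm):
          perm = rng.permutation(pooled)              # random reassignment of labels
          diff = perm[: len(a)].mean() - perm[len(a):].mean()
          if abs(diff) >= abs(observed):
              count += 1
      pval = (count + 1) / (n_perm + 1)               # two-sided Monte Carlo p-value
      print(f"permutation p-value: {pval:.4f}")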

  7. Outcomes of reintervention after failed urethroplasty.

    PubMed

    Ekerhult, Teresa Olsen; Lindqvist, Klas; Peeker, Ralph; Grenabo, Lars

    2017-02-01

    Urethroplasty is a procedure that has a high success rate. However, there exists a small subgroup of patients who require multiple procedures to achieve an acceptable result. This study analyses the outcomes of a series of patients with failed urethroplasty. This is a retrospective review of 82 failures out of 407 patients who underwent urethroplasty due to urethral stricture during the period 1999-2013. Failure was defined as the need for an additional surgical procedure. Of the failures, 26 patients had penile strictures and 56 had bulbar strictures. Meatal strictures were not included. The redo procedures included one or multiple direct vision internal urethrotomies, dilatations or new urethroplasties, all with a long follow-up time. The patients underwent one to seven redo surgeries (mean 2.4 procedures per patient). In the present series of patients, endourological procedures cured 34% (28/82) of the patients. Ten patients underwent multiple redo urethroplasties until a satisfactory outcome was achieved; the penile strictures were the most difficult to cure. In patients with bulbar strictures, excision with anastomosis and substitution urethroplasty were equally successful. Nevertheless, 18 patients were defined as treatment failures. Of these patients, nine ended up with clean intermittent self-dilatation as a final solution, five had perineal urethrostomy and four are awaiting a new reintervention. Complicated cases need centralized professional care. Despite the possibility of needing multiple reinterventions, the majority of patients undergoing urethroplasty have a good chance of successful treatment.

  8. Microbiological methods for the water recovery systems test, revision 1.1

    NASA Technical Reports Server (NTRS)

    Rhoads, Tim; Kilgore, M. V., Jr.; Mikell, A. T., Jr.

    1990-01-01

    Current microbiological parameters specified to verify the microbiological quality of Space Station Freedom water include the enumeration of total bacteria, anaerobes, aerobes, yeasts and molds, enteric bacteria, gram positives, gram negatives, and E. coli. In addition, other parameters have been identified as necessary to support the Water Recovery Test activities to be conducted at the NASA/MSFC later this year. These other parameters include aerotolerant eutrophic mesophiles, legionellae, and an additional method for heterotrophic bacteria. If inter-laboratory data are to be compared to evaluate quality, analytical methods must be eliminated as a variable. Therefore, each participating laboratory must utilize the same analytical methods and procedures. Without this standardization, data can be neither compared nor validated between laboratories. Multiple laboratory participation represents a conservative approach to ensure quality and completeness of data. Invariably, sample loss will occur in transport and analyses. Natural variance is a reality in any test of this magnitude and is further enhanced because biological entities, capable of growth and death, are specific parameters of interest. The large variation due to the participation of human test subjects has been noted with previous testing. The resultant data might be dismissed as 'out of control' unless intra-laboratory control is included as part of the method or if participating laboratories are not available for verification. The purpose of this document is to provide standardized laboratory procedures for the enumeration of certain microorganisms in water and wastewater specific to the water recovery systems test. The document consists of ten separate cultural methods and one direct count procedure. It is not intended nor is it implied to be a complete microbiological methods manual.

  9. The Production Data Approach for Full Lifecycle Management

    NASA Astrophysics Data System (ADS)

    Schopf, J.

    2012-04-01

    The amount of data generated by scientists is growing exponentially, and studies have shown [Koe04] that un-archived data sets have a resource half-life that is only a fraction of those resources that are electronically archived. Most groups still lack standard approaches and procedures for data management. Arguably, however, scientists know something about building software. A recent article in Nature [Mer10] stated that 45% of research scientists spend more time now developing software than they did 5 years ago, and 38% spent at least 1/5th of their time developing software. Fox argues [Fox10] that a simple release of data is not the correct approach to data curation. In addition, just as software is used in a wide variety of ways never initially envisioned by its developers, we're seeing this even to a greater extent with data sets. In order to address the need for better data preservation and access, we propose that data sets should be managed in a similar fashion to building production quality software. These production data sets are not simply published once, but go through a cyclical process, including phases such as design, development, verification, deployment, support, analysis, and then development again, thereby supporting the full lifecycle of a data set. The process involved in academically-produced software changes over time with respect to issues such as how much it is used outside the development group, but factors in aspects such as knowing who is using the code, enabling multiple developers to contribute to code development with common procedures, formal testing and release processes, developing documentation, and licensing. When we work with data, either as a collection source, as someone tagging data, or someone re-using it, many of the lessons learned in building production software are applicable. Table 1 shows a comparison of production software elements to production data elements.

    Table 1: Comparison of production software and production data.
      Production Software                                    | Production Data
      End-user considerations                                | End-user considerations
      Multiple Coders: Repository with check-in procedures   | Multiple producers/collectors: Local archive with check-in procedure
      Coding standards                                       | Metadata Standards
      Formal testing                                         | Formal testing
      Bug tracking and fixes                                 | Bug tracking and fixes, QA/QC
      Documentation                                          | Documentation
      Formal Release Process                                 | Formal release process to external archive
      License                                                | Citation/usage statement

    The full presentation of this abstract will include a detailed discussion of these issues so that researchers can produce usable and accessible data sets as a first step toward reproducible science. By creating production-quality data sets, we extend the potential of our data, both in terms of usability and usefulness to ourselves and other researchers. The more we treat data with formal processes and release cycles, the more relevant and useful it can be to the scientific community.

  10. Two self-test methods applied to an inertial system problem. [estimating gyroscope and accelerometer bias

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.; Deyst, J. J.; Crawford, B. S.

    1975-01-01

    The paper describes two self-test procedures applied to the problem of estimating the biases in accelerometers and gyroscopes on an inertial platform. The first technique is the weighted sum-squared residual (WSSR) test, with which accelerometer bias jumps are easily isolated, but gyro bias jumps are difficult to isolate. The WSSR method does not take full advantage of the knowledge of system dynamics. The other technique is a multiple hypothesis method developed by Buxbaum and Haddad (1969). It has the advantage of directly providing jump isolation information, but suffers from computational problems. It might be possible to use the WSSR to detect state jumps and then switch to the Buxbaum-Haddad (BH) method for jump isolation and estimate compensation.
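
    The WSSR idea can be illustrated schematically: accumulate squared, variance-weighted residuals over a sliding window and flag a bias jump when the statistic exceeds a chi-square threshold. The sketch below is only an illustration of that idea under an assumed white-noise residual model, not the paper's inertial-platform implementation.

      # Schematic weighted sum-squared residual (WSSR) jump detection.
      import numpy as np
      from scipy.stats import chi2

      rng = np.random.default_rng(6)
      n, window, sigma = 400, 20, 0.5
      residuals = rng.normal(0.0, sigma, n)
      residuals[250:] += 1.0                      # a bias jump at sample 250

      threshold = chi2.ppf(0.999, df=window)      # false-alarm level per window
      for k in range(window, n):
          wssr = np.sum((residuals[k - window:k] / sigma) ** 2)
          if wssr > threshold:
              print(f"jump flagged at sample {k}, WSSR = {wssr:.1f}")
              break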

  11. A Powerful Test for Comparing Multiple Regression Functions.

    PubMed

    Maity, Arnab

    2012-09-01

    In this article, we address the important problem of comparison of two or more population regression functions. Recently, Pardo-Fernández, Van Keilegom and González-Manteiga (2007) developed test statistics for simple nonparametric regression models: Y(ij) = θ(j)(Z(ij)) + σ(j)(Z(ij))∊(ij), based on empirical distributions of the errors in each population j = 1, … , J. In this paper, we propose a test for equality of the θ(j)(·) based on the concept of generalized likelihood ratio type statistics. We also generalize our test for other nonparametric regression setups, e.g, nonparametric logistic regression, where the loglikelihood for population j is any general smooth function [Formula: see text]. We describe a resampling procedure to obtain the critical values of the test. In addition, we present a simulation study to evaluate the performance of the proposed test and compare our results to those in Pardo-Fernández et al. (2007).

  12. Latent Heating Retrieval from TRMM Observations Using a Simplified Thermodynamic Model

    NASA Technical Reports Server (NTRS)

    Grecu, Mircea; Olson, William S.

    2003-01-01

    A procedure for the retrieval of hydrometeor latent heating from TRMM active and passive observations is presented. The procedure is based on current methods for estimating multiple-species hydrometeor profiles from TRMM observations. The species include: cloud water, cloud ice, rain, and graupel (or snow). A three-dimensional wind field is prescribed based on the retrieved hydrometeor profiles, and, assuming a steady state, the sources and sinks in the hydrometeor conservation equations are determined. Then, the momentum and thermodynamic equations, in which the heating and cooling are derived from the hydrometeor sources and sinks, are integrated one step forward in time. The hydrometeor sources and sinks are reevaluated based on the new wind field, and the momentum and thermodynamic equations are integrated one more step. The reevaluation-integration process is repeated until a steady state is reached. The procedure is tested using cloud model simulations. Cloud-model derived fields are used to synthesize TRMM observations, from which hydrometeor profiles are derived. The procedure is applied to the retrieved hydrometeor profiles, and the latent heating estimates are compared to the actual latent heating produced by the cloud model. Examples of the procedure's application to real TRMM data are also provided.

  13. An Innovative Concept for Testing Rutting Susceptibility of Asphalt Mixture

    NASA Astrophysics Data System (ADS)

    Mohseni, Alaeddin; Azari, Haleh

    Currently, flow number (FN) is being used for measuring permanent deformation resistance of asphalt mixtures. The provisional AASHTO TP 79-10 test method specifies the requirements of the FN test; however, there are undefined levels of test variables, such as temperature, axial stress, and confinement. Therefore, agreeable FN criteria that can reliably discriminate between various mixtures have not been established yet. As the asphalt industry continues to develop more sophisticated mixtures (Warm Mix, RAP and RAS), the FN value has failed to capture the true complexity of the asphalt mixtures. These shortcomings and the unpredictable testing time of the FN test have affected its usefulness for evaluating high temperature performance of asphalt mixtures. A new test procedure for evaluation of rutting susceptibility of asphalt mixtures is being proposed. The new procedure is conducted at one temperature and multiple stresses on the same replicate in three increments of 500 cycles, which only takes 33 minutes to complete. The property of the test is the permanent strain due to the last cycle of each test increment (Minimum Strain Rate, or MSR). A master curve is developed by plotting the MSR values versus parameter TP, which is a product of Temperature and Pressure. The MSR master curve represents the unit rutting damage (rut per axle) of asphalt mixtures at any stress and temperature and can be used in laboratory for material characterization, mix design verification, ranking of the mixtures, or for pavement design applications to predict rut depth for project climate and design traffic.

  14. [Occupational exposure to blood in multiple trauma care].

    PubMed

    Wicker, S; Wutzler, S; Schachtrupp, A; Zacharowski, K; Scheller, B

    2015-01-01

    Trauma care personnel are at risk of occupational exposure to blood-borne pathogens. Little is known regarding compliance with standard precautions or occupational exposure to blood and body fluids among multiple trauma care personnel in Germany. Compliance rates of multiple trauma care personnel in applying standard precautions, knowledge about transmission risks of blood-borne pathogens, perceived risks of acquiring hepatitis B, hepatitis C and human immunodeficiency virus (HIV) and the personal attitude towards testing of the index patient for blood-borne pathogens after a needlestick injury were evaluated. In the context of an advanced multiple trauma training an anonymous questionnaire was administered to the participants. Almost half of the interviewees had sustained a needlestick injury within the last 12 months. Approximately three quarters of the participants were concerned about the risk of HIV and hepatitis. Trauma care personnel had insufficient knowledge of the risk of blood-borne pathogens, overestimated the risk of hepatitis C infection and underused standard precautionary measures. Although there was excellent compliance for using gloves, there was poor compliance in using double gloves (26.4 %), eye protectors (19.7 %) and face masks (15.8 %). The overwhelming majority of multiple trauma care personnel believed it is appropriate to test an index patient for blood-borne pathogens following a needlestick injury. The process of treatment in prehospital settings is less predictable than in other settings in which invasive procedures are performed. Periodic training and awareness programs for trauma care personnel are required to increase the knowledge of occupational infections and the compliance with standard precautions. The legal and ethical aspects of testing an index patient for blood-borne pathogens after a needlestick injury of a healthcare worker have to be clarified in Germany.

  15. An improved standardization procedure to remove systematic low frequency variability biases in GCM simulations

    NASA Astrophysics Data System (ADS)

    Mehrotra, Rajeshwar; Sharma, Ashish

    2012-12-01

    The quality of the absolute estimates of general circulation models (GCMs) calls into question the direct use of GCM outputs for climate change impact assessment studies, particularly at regional scales. Statistical correction of GCM output is often necessary when significant systematic biases occur between the modeled output and observations. A common procedure is to correct the GCM output by removing the systematic biases in low-order moments relative to observations or to reanalysis data at daily, monthly, or seasonal timescales. In this paper, we present an extension of a recently published nested bias correction (NBC) technique to correct for low- as well as higher-order moment biases in the GCM-derived variables across multiple selected timescales. The proposed recursive nested bias correction (RNBC) approach offers an improved basis for applying bias correction at multiple timescales over the original NBC procedure. The method ensures that the bias-corrected series exhibits improvements that are consistently spread over all of the timescales considered. Different variations of the approach, starting from the standard NBC to the more complex recursive alternatives, are tested to assess their impacts on a range of GCM-simulated atmospheric variables of interest in downscaling applications related to hydrology and water resources. Results of the study suggest that RNBCs with three to five iterations are the most effective in removing distributional and persistence-related biases across the timescales considered.
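
    As a much simplified illustration of the nesting idea (not the published NBC/RNBC algorithm, which also corrects lag-1 autocorrelations and iterates the correction), the sketch below rescales a synthetic "GCM" series to observed means and standard deviations at a daily and then a monthly timescale; all series and settings are invented.

      # Simplified two-timescale (daily, then monthly) mean/std bias correction.
      import numpy as np

      rng = np.random.default_rng(7)
      days_per_month, n_months = 30, 120
      obs = rng.gamma(2.0, 2.0, days_per_month * n_months)          # "observed" series
      gcm = 1.3 * rng.gamma(2.0, 2.5, days_per_month * n_months)    # biased "GCM" series

      def standardize_to(target, series):
          """Rescale `series` to the mean and standard deviation of `target`."""
          return (series - series.mean()) / series.std() * target.std() + target.mean()

      # step 1: daily-scale correction
      daily_corrected = standardize_to(obs, gcm)

      # step 2: monthly-scale correction, applied multiplicatively to daily values
      monthly = daily_corrected.reshape(n_months, days_per_month).mean(axis=1)
      monthly_target = obs.reshape(n_months, days_per_month).mean(axis=1)
      monthly_corrected = standardize_to(monthly_target, monthly)
      nested = daily_corrected.reshape(n_months, days_per_month) * \
               (monthly_corrected / monthly)[:, None]

      print("obs / raw / corrected means:",
            obs.mean().round(2), gcm.mean().round(2), nested.mean().round(2))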

  16. Optimization and validation of moving average quality control procedures using bias detection curves and moving average validation charts.

    PubMed

    van Rossum, Huub H; Kemperman, Hans

    2017-02-01

    To date, no practical tools are available to obtain optimal settings for moving average (MA) as a continuous analytical quality control instrument. Also, the true bias detection properties of applied MA procedures are not known. We describe the use of bias detection curves for MA optimization and MA validation charts for validation of MA. MA optimization was performed on a data set of previously obtained consecutive assay results. Bias introduction and MA bias detection were simulated for multiple MA procedures (combinations of truncation limits, calculation algorithms, and control limits) and performed for various biases. Bias detection curves were generated by plotting the median number of test results needed for bias detection against the simulated introduced bias. In MA validation charts the minimum, median, and maximum numbers of assay results required for MA bias detection are shown for various biases. Their use was demonstrated for sodium, potassium, and albumin. Bias detection curves allowed optimization of MA settings by graphical comparison of the bias detection properties of multiple MA procedures. The optimal MA was selected based on the bias detection characteristics obtained. MA validation charts were generated for the selected optimal MA procedures and provided insight into the range of results required for MA bias detection. Bias detection curves and MA validation charts are useful tools for optimization and validation of MA procedures.
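
    The idea behind a bias detection curve can be sketched as follows: for a given introduced bias, simulate how many results an MA procedure needs before its control limits are exceeded, and repeat over many runs. The settings below (analyte mean, SD, window, limits) are purely illustrative and are not the paper's optimized procedures.

      # Sketch: median number of results until an MA procedure detects a given bias.
      import numpy as np

      rng = np.random.default_rng(8)

      def results_to_detection(bias, mean=140.0, sd=3.0, window=20,
                               lower=138.5, upper=141.5, max_results=5000):
          """Return the number of post-bias results until the moving average of the
          last `window` results leaves the control limits (np.inf if never)."""
          history = list(rng.normal(mean, sd, window))      # in-control run-in
          for i in range(1, max_results + 1):
              history.append(rng.normal(mean + bias, sd))   # biased result
              ma = np.mean(history[-window:])
              if ma < lower or ma > upper:
                  return i
          return np.inf

      for bias in (1.0, 2.0, 4.0):
          runs = [results_to_detection(bias) for _ in range(200)]
          print(f"bias {bias:+.1f}: median results to detection = {np.median(runs):.0f}")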

  17. Using the Multiple Choice Procedure to Measure College Student Gambling

    ERIC Educational Resources Information Center

    Butler, Leon Harvey

    2010-01-01

    Research suggests that gambling is similar to addictive behaviors such as substance use. In the current study, gambling was investigated from a behavioral economics perspective. The Multiple Choice Procedure (MCP) with gambling as the target behavior was used to assess for relative reinforcing value, the effect of alternative reinforcers, and…

  18. A New Zero-Inflated Negative Binomial Methodology for Latent Category Identification

    ERIC Educational Resources Information Center

    Blanchard, Simon J.; DeSarbo, Wayne S.

    2013-01-01

    We introduce a new statistical procedure for the identification of unobserved categories that vary between individuals and in which objects may span multiple categories. This procedure can be used to analyze data from a proposed sorting task in which individuals may simultaneously assign objects to multiple piles. The results of a synthetic…

  19. Effects of Estimation Bias on Multiple-Category Classification with an IRT-Based Adaptive Classification Procedure

    ERIC Educational Resources Information Center

    Yang, Xiangdong; Poggio, John C.; Glasnapp, Douglas R.

    2006-01-01

    The effects of five ability estimators, that is, maximum likelihood estimator, weighted likelihood estimator, maximum a posteriori, expected a posteriori, and Owen's sequential estimator, on the performances of the item response theory-based adaptive classification procedure on multiple categories were studied via simulations. The following…

  20. Residual Strength Prediction of Fuselage Structures with Multiple Site Damage

    NASA Technical Reports Server (NTRS)

    Chen, Chuin-Shan; Wawrzynek, Paul A.; Ingraffea, Anthony R.

    1999-01-01

    This paper summarizes recent results on simulating full-scale pressure tests of wide body, lap-jointed fuselage panels with multiple site damage (MSD). The crack tip opening angle (CTOA) fracture criterion and the FRANC3D/STAGS software program were used to analyze stable crack growth under conditions of general yielding. The link-up of multiple cracks and residual strength of damaged structures were predicted. Elastic-plastic finite element analysis based on the von Mises yield criterion and incremental flow theory with small strain assumption was used. A global-local modeling procedure was employed in the numerical analyses. Stress distributions from the numerical simulations are compared with strain gage measurements. Analysis results show that accurate representation of the load transfer through the rivets is crucial for the model to predict the stress distribution accurately. Predicted crack growth and residual strength are compared with test data. Observed and predicted results both indicate that the occurrence of small MSD cracks substantially reduces the residual strength. Modeling fatigue closure is essential to capture the fracture behavior during the early stable crack growth. Breakage of a tear strap can have a major influence on residual strength prediction.

  1. Facilitating relational framing in children and individuals with developmental delay using the relational completion procedure.

    PubMed

    Walsh, Sinead; Horgan, Jennifer; May, Richard J; Dymond, Simon; Whelan, Robert

    2014-01-01

    The Relational Completion Procedure is effective for establishing same, opposite and comparative derived relations in verbally able adults, but to date it has not been used to establish relational frames in young children or those with developmental delay. In Experiment 1, the Relational Completion Procedure was used with the goal of establishing two 3-member sameness networks in nine individuals with Autism Spectrum Disorder (eight with language delay). A multiple exemplar intervention was employed to facilitate derived relational responding when required. Seven of nine participants in Experiment 1 passed tests for derived relations. In Experiment 2, eight participants (all of whom, except one, had a verbal repertoire) were given training with the aim of establishing two 4-member sameness networks. Three of these participants were typically developing young children aged between 5 and 6 years old, all of whom demonstrated derived relations, as did four of the five participants with developmental delay. These data demonstrate that it is possible to reliably establish derived relations in young children and those with developmental delay using an automated procedure. © Society for the Experimental Analysis of Behavior.

  2. The Launch Processing System for Space Shuttle.

    NASA Technical Reports Server (NTRS)

    Springer, D. A.

    1973-01-01

    In order to reduce costs and accelerate vehicle turnaround, a single automated system will be developed to support shuttle launch site operations, replacing a multiplicity of systems used in previous programs. The Launch Processing System will provide real-time control, data analysis, and information display for the checkout, servicing, launch, landing, and refurbishment of the launch vehicles, payloads, and all ground support systems. It will also provide real-time and historical data retrieval for management and sustaining engineering (test records and procedures, logistics, configuration control, scheduling, etc.).

  3. Ultra-low mass drift chambers

    NASA Astrophysics Data System (ADS)

    Assiro, R.; Cappelli, L.; Cascella, M.; De Lorenzis, L.; Grancagnolo, F.; Ignatov, F.; L'Erario, A.; Maffezzoli, A.; Miccoli, A.; Onorato, G.; Perillo, M.; Piacentino, G.; Rella, S.; Rossetti, F.; Spedicato, M.; Tassielli, G.; Zavarise, G.

    2013-08-01

    We present a novel low mass drift chamber concept, developed in order to fulfill the stringent requirements imposed by experiments on extremely rare processes, which require high momentum resolution (of the order of 100-200 keV/c) for particle momenta in a range (50-100 MeV/c) totally dominated by the multiple scattering contribution. We describe a geometry optimization procedure and a new wiring strategy with a feed-through-less wire anchoring system developed and tested on a drift chamber prototype under completion at INFN-Lecce.

  4. Analytical techniques and instrumentation, a compilation

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Procedures for conducting materials tests and structural analyses of aerospace components are presented as a part of the NASA technology utilization program. Some of the subjects discussed are as follows: (1) failures in cryogenic tank insulation, (2) friction characteristics of graphite and graphite-metal combinations, (3) evaluation of polymeric products in thermal-vacuum environment, (4) erosion of metals by multiple impacts with water, (5) mass loading effects on vibrated ring and shell structures, (6) nonlinear damping in structures, and (7) method for estimating reliability of randomly excited structures.

  5. Ambiguity resolution in systems using Omega for position location

    NASA Technical Reports Server (NTRS)

    Frenkel, G.; Gan, D. G.

    1974-01-01

    The lane ambiguity problem prevents the utilization of the Omega system for many applications such as locating buoys and balloons. The method of multiple lines of position introduced herein uses signals from four or more Omega stations for ambiguity resolution. The coordinates of the candidate points are determined first through the use of the Newton iterative procedure. Subsequently, a likelihood function is generated for each point, and the ambiguity is resolved by selecting the most likely point. The method was tested through simulation.

  6. Multiple-Objective Stepwise Calibration Using Luca

    USGS Publications Warehouse

    Hay, Lauren E.; Umemoto, Makiko

    2007-01-01

    This report documents Luca (Let us calibrate), a multiple-objective, stepwise, automated procedure for hydrologic model calibration and the associated graphical user interface (GUI). Luca is a wizard-style user-friendly GUI that provides an easy systematic way of building and executing a calibration procedure. The calibration procedure uses the Shuffled Complex Evolution global search algorithm to calibrate any model compiled with the U.S. Geological Survey's Modular Modeling System. This process assures that intermediate and final states of the model are simulated consistently with measured values.

  7. A procedure of multiple period searching in unequally spaced time-series with the Lomb-Scargle method

    NASA Technical Reports Server (NTRS)

    Van Dongen, H. P.; Olofsen, E.; VanHartevelt, J. H.; Kruyt, E. W.; Dinges, D. F. (Principal Investigator)

    1999-01-01

    Periodogram analysis of unequally spaced time-series, as part of many biological rhythm investigations, is complicated. The mathematical framework is scattered over the literature, and the interpretation of results is often debatable. In this paper, we show that the Lomb-Scargle method is the appropriate tool for periodogram analysis of unequally spaced data. A unique procedure of multiple period searching is derived, facilitating the assessment of the various rhythms that may be present in a time-series. All relevant mathematical and statistical aspects are considered in detail, and much attention is given to the correct interpretation of results. The use of the procedure is illustrated by examples, and problems that may be encountered are discussed. It is argued that, when following the procedure of multiple period searching, we can even benefit from the unequal spacing of a time-series in biological rhythm research.
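
    The single-period periodogram step can be sketched with scipy; the multiple period searching procedure described above iterates this after fitting and removing each detected rhythm, which is not shown here, and the simulated sampling scheme below is only illustrative.

      # Lomb-Scargle periodogram of an unequally spaced series with a 24-h rhythm.
      import numpy as np
      from scipy.signal import lombscargle

      rng = np.random.default_rng(9)
      t = np.sort(rng.uniform(0, 10 * 24, 300))             # irregular sampling times (hours)
      y = 2.0 * np.sin(2 * np.pi * t / 24.0) + rng.normal(0, 1.0, t.size)

      periods = np.linspace(6, 48, 2000)                    # candidate periods (hours)
      omega = 2 * np.pi / periods                           # angular frequencies
      power = lombscargle(t, y - y.mean(), omega)
      print(f"best period: {periods[np.argmax(power)]:.2f} h")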

  8. Systematic procedure for designing processes with multiple environmental objectives.

    PubMed

    Kim, Ki-Joo; Smith, Raymond L

    2005-04-01

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems due to the complex nature of the problems, the need for complex assessments, and the complicated analysis of multidimensional results. In this paper, a novel systematic procedure is presented for designing processes with multiple environmental objectives. This procedure has four steps: initialization, screening, evaluation, and visualization. The first two steps are used for systematic problem formulation based on mass and energy estimation and order of magnitude analysis. In the third step, an efficient parallel multiobjective steady-state genetic algorithm is applied to design environmentally benign and economically viable processes and to provide more accurate and uniform Pareto optimal solutions. In the last step a new visualization technique for illustrating multiple objectives and their design parameters on the same diagram is developed. Through these integrated steps the decision-maker can easily determine design alternatives with respect to his or her preferences. Most importantly, this technique is independent of the number of objectives and design parameters. As a case study, acetic acid recovery from aqueous waste mixtures is investigated by minimizing eight potential environmental impacts and maximizing total profit. After applying the systematic procedure, the most preferred design alternatives and their design parameters are easily identified.
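
    The Pareto-optimality concept used in the evaluation step can be illustrated with a small dominance filter (here every objective is minimized); the parallel multiobjective genetic algorithm itself is not reproduced, and the random "designs" below are placeholders.

      # Dominance filter returning the Pareto-optimal rows of an objective matrix.
      import numpy as np

      def pareto_front(costs):
          """Boolean mask of non-dominated rows in an (n_designs, n_objectives) array."""
          n = costs.shape[0]
          keep = np.ones(n, dtype=bool)
          for i in range(n):
              if keep[i]:
                  # rows that are worse or equal everywhere and strictly worse somewhere
                  dominated = np.all(costs >= costs[i], axis=1) & np.any(costs > costs[i], axis=1)
                  keep &= ~dominated
          return keep

      rng = np.random.default_rng(10)
      designs = rng.random((200, 9))          # e.g. 8 impact categories plus negative profit
      print("non-dominated designs:", np.flatnonzero(pareto_front(designs)))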

  9. Multiple window spatial registration error of a gamma camera: 133Ba point source as a replacement of the NEMA procedure.

    PubMed

    Bergmann, Helmar; Minear, Gregory; Raith, Maria; Schaffarich, Peter M

    2008-12-09

    The accuracy of multiple window spatial registration characterises the performance of a gamma camera for dual isotope imaging. In the present study we investigate an alternative method to the standard NEMA procedure for measuring this performance parameter. A long-lived 133Ba point source with gamma energies close to those of 67Ga and a single bore lead collimator were used to measure the multiple window spatial registration error. Calculation of the positions of the point source in the images used the NEMA algorithm. The results were validated against the values obtained by the standard NEMA procedure, which uses a liquid 67Ga source with collimation. Of the source-collimator configurations under investigation, an optimum collimator geometry, consisting of a 5 mm thick lead disk with a diameter of 46 mm and a 5 mm central bore, was selected. The multiple window spatial registration errors obtained by the 133Ba method showed excellent reproducibility (standard deviation < 0.07 mm). The values were compared with the results from the NEMA procedure obtained at the same locations and showed small differences, with a correlation coefficient of 0.51 (p < 0.05). In addition, the 133Ba point source method proved to be much easier to use. A Bland-Altman analysis showed that the 133Ba and the 67Ga methods can be used interchangeably. The 133Ba point source method measures the multiple window spatial registration error with essentially the same accuracy as the NEMA-recommended procedure, but is easier and safer to use and has the potential to replace the current standard procedure.
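
    The Bland-Altman agreement analysis mentioned above reduces to computing the mean difference and 1.96 SD limits of agreement between paired measurements; the numbers below are invented stand-ins for the 133Ba and 67Ga registration errors, not the study's data.

      # Bland-Altman limits of agreement for two paired measurement methods.
      import numpy as np

      rng = np.random.default_rng(11)
      err_ba133 = rng.normal(0.8, 0.3, 30)                 # hypothetical errors (mm)
      err_ga67 = err_ba133 + rng.normal(0.05, 0.15, 30)    # second method, small offset

      diff = err_ba133 - err_ga67
      bias = diff.mean()
      loa = 1.96 * diff.std(ddof=1)                        # 95% limits of agreement
      print(f"mean difference {bias:.2f} mm, limits of agreement "
            f"[{bias - loa:.2f}, {bias + loa:.2f}] mm")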

  10. Parameter estimation and forecasting for multiplicative log-normal cascades.

    PubMed

    Leövey, Andrés E; Lux, Thomas

    2012-04-01

    We study the well-known multiplicative log-normal cascade process in which the multiplication of Gaussian and log-normally distributed random variables yields time series with intermittent bursts of activity. Due to the nonstationarity of this process and the combinatorial nature of such a formalism, its parameters have been estimated mostly by fitting the numerical approximation of the associated non-Gaussian probability density function to empirical data, cf. Castaing et al. [Physica D 46, 177 (1990)]. More recently, alternative estimators based upon various moments have been proposed by Beck [Physica D 193, 195 (2004)] and Kiyono et al. [Phys. Rev. E 76, 041113 (2007)]. In this paper, we pursue this moment-based approach further and develop a more rigorous generalized method of moments (GMM) estimation procedure to cope with the documented difficulties of previous methodologies. We show that even under uncertainty about the actual number of cascade steps, our methodology yields very reliable results for the estimated intermittency parameter. Employing the Levinson-Durbin algorithm for best linear forecasts, we also show that estimated parameters can be used for forecasting the evolution of the turbulent flow. We compare forecasting results from the GMM and Kiyono et al.'s procedure via Monte Carlo simulations. We finally test the applicability of our approach by estimating the intermittency parameter and forecasting volatility for a sample of financial data from stock and foreign exchange markets.
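
    The following is a minimal sketch of the forecasting step only: feeding sample autocovariances into a Levinson-type Toeplitz solver (scipy.linalg.solve_toeplitz) to obtain a best linear one-step forecast. The synthetic AR-like series and the predictor order p are assumptions; the paper's GMM estimation of the intermittency parameter is not reproduced here.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def best_linear_forecast(x, p):
    """One-step-ahead forecast of x from the best linear predictor of
    order p, built from sample autocovariances.  solve_toeplitz uses
    Levinson recursion internally."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    n = xc.size
    # Biased sample autocovariances gamma(0..p)
    gamma = np.array([np.dot(xc[:n - k], xc[k:]) / n for k in range(p + 1)])
    # Yule-Walker (normal) equations: Toeplitz(gamma[0..p-1]) a = gamma[1..p]
    a = solve_toeplitz(gamma[:p], gamma[1:p + 1])
    # Forecast: weighted sum of the p most recent (demeaned) observations
    return x.mean() + np.dot(a, xc[::-1][:p])

# Illustrative use on a synthetic AR(1)-like series
rng = np.random.default_rng(1)
z = np.zeros(500)
for t in range(1, 500):
    z[t] = 0.7 * z[t - 1] + rng.normal()
print(best_linear_forecast(z, p=5))
```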

  11. Heavy Ion and Proton-Induced Single Event Upset Characteristics of a 3D NAND Flash Memory

    NASA Technical Reports Server (NTRS)

    Chen, Dakai; Wilcox, Edward; Ladbury, Raymond; Seidleck, Christina; Kim, Hak; Phan, Anthony; Label, Kenneth

    2017-01-01

    We evaluated the effects of heavy ion and proton irradiation on a 3D NAND flash. The 3D NAND showed similar single-event upset (SEU) sensitivity to a planar NAND of identical density in the multi-level-cell (MLC) storage mode. The 3D NAND showed significantly reduced SEU susceptibility in single-level-cell (SLC) storage mode. Additionally, the 3D NAND showed less multiple-bit upset susceptibility than the planar NAND, with fewer upset bits per byte and smaller cross sections overall. However, the 3D architecture exhibited angular sensitivities for both base and face angles, reflecting the anisotropic nature of the SEU vulnerability in space. Furthermore, the SEU cross section decreased with increasing fluence for both the 3D NAND and the Micron 16 nm planar NAND, which suggests that typical heavy ion test fluences will underestimate the upset rate during a space mission. These unique characteristics introduce complexity to traditional ground irradiation test procedures.

  12. 40 CFR 600.111-80 - Test procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Test procedures. 600.111-80 Section... Model Year Automobiles-Test Procedures § 600.111-80 Test procedures. (a) The test procedures to be...-78 of this chapter, as applicable. (The evaporative and refueling loss portions of the test procedure...

  13. 40 CFR 600.111-93 - Test procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Test procedures. 600.111-93 Section... Model Year Automobiles-Test Procedures § 600.111-93 Test procedures. (a) The test procedures to be... loss portion of the test procedure may be omitted unless specifically required by the Administrator...

  14. A vertical-energy-thresholding procedure for data reduction with multiple complex curves.

    PubMed

    Jung, Uk; Jeong, Myong K; Lu, Jye-Chyi

    2006-10-01

    Due to the development of sensing and computer technology, measurements of many process variables are available in current manufacturing processes. It is very challenging, however, to process a large amount of information in a limited time in order to make decisions about the health of the processes and products. This paper develops a "preprocessing" procedure for multiple sets of complicated functional data in order to reduce the data size for supporting timely decision analyses. The data type studied has been used for fault detection, root-cause analysis, and quality improvement in such engineering applications as automobile and semiconductor manufacturing and nanomachining processes. The proposed vertical-energy-thresholding (VET) procedure balances the reconstruction error against data-reduction efficiency so that it is effective in capturing key patterns in the multiple data signals. The selected wavelet coefficients are treated as the "reduced-size" data in subsequent analyses for decision making. This enhances the ability of the existing statistical and machine-learning procedures to handle high-dimensional functional data. A few real-life examples demonstrate the effectiveness of our proposed procedure compared to several ad hoc techniques extended from single-curve-based data modeling and denoising procedures.
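
    A simplified sketch of the general idea, not the authors' exact VET algorithm: decompose a curve with a discrete wavelet transform (PyWavelets) and keep only the largest-energy coefficients until a chosen fraction of total energy is retained, so that the kept coefficients act as the reduced-size data. The wavelet family, energy fraction, and test curve are assumptions.

```python
import numpy as np
import pywt  # PyWavelets

def energy_threshold(signal, wavelet="db4", keep_energy=0.99):
    """Keep the largest-energy wavelet coefficients until the retained
    fraction of total coefficient energy reaches keep_energy; zero the
    rest.  Returns the reduced coefficients and the reconstruction."""
    coeffs = pywt.wavedec(signal, wavelet)
    flat = np.concatenate(coeffs)
    order = np.argsort(flat ** 2)[::-1]                 # largest energy first
    cum = np.cumsum(flat[order] ** 2) / np.sum(flat ** 2)
    n_keep = int(np.searchsorted(cum, keep_energy) + 1)
    mask = np.zeros(flat.size, dtype=bool)
    mask[order[:n_keep]] = True
    flat_thresholded = np.where(mask, flat, 0.0)

    # Rebuild the per-level coefficient structure for reconstruction
    out, i = [], 0
    for c in coeffs:
        out.append(flat_thresholded[i:i + c.size])
        i += c.size
    return out, pywt.waverec(out, wavelet)

# Illustrative curve: a smooth pattern plus noise
t = np.linspace(0, 1, 1024)
curve = np.sin(6 * np.pi * t) + 0.1 * np.random.default_rng(2).normal(size=t.size)
reduced, approx = energy_threshold(curve)
print("kept", sum(int(np.count_nonzero(c)) for c in reduced), "of", t.size, "coefficients")
```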

  15. 40 CFR 600.111-08 - Test procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 30 2014-07-01 2014-07-01 false Test procedures. 600.111-08 Section... Emission Test Procedures § 600.111-08 Test procedures. This section describes test procedures for the FTP, highway fuel economy test (HFET), US06, SC03, and the cold temperature FTP tests. Perform testing...

  16. Interaction Models for Functional Regression.

    PubMed

    Usset, Joseph; Staicu, Ana-Maria; Maity, Arnab

    2016-02-01

    A functional regression model with a scalar response and multiple functional predictors is proposed that accommodates two-way interactions in addition to their main effects. The proposed estimation procedure models the main effects using penalized regression splines, and the interaction effect by a tensor product basis. Extensions to generalized linear models and data observed on sparse grids or with measurement error are presented. A hypothesis testing procedure for the functional interaction effect is described. The proposed method can be easily implemented through existing software. Numerical studies show that fitting an additive model in the presence of interaction leads to both poor estimation performance and lost prediction power, while fitting an interaction model where there is in fact no interaction leads to negligible losses. The methodology is illustrated on the AneuRisk65 study data.

  17. A function-based approach to cockpit procedure aids

    NASA Technical Reports Server (NTRS)

    Phatak, Anil V.; Jain, Parveen; Palmer, Everett

    1990-01-01

    The objective of this research is to develop and test a cockpit procedural aid that can compose and present procedures that are appropriate for the given flight situation. The procedure would indicate the status of the aircraft engineering systems, and the environmental conditions. Prescribed procedures already exist for normal as well as for a number of non-normal and emergency situations, and can be presented to the crew using an interactive cockpit display. However, no procedures are prescribed or recommended for a host of plausible flight situations involving multiple malfunctions compounded by adverse environmental conditions. Under these circumstances, the cockpit procedural aid must review the prescribed procedures for the individual malfunction (when available), evaluate the alternatives or options, and present one or more composite procedures (prioritized or unprioritized) in response to the given situation. A top-down function-based conceptual approach towards composing and presenting cockpit procedures is being investigated. This approach is based upon the thought process that an operating crew must go through while attempting to meet the flight objectives given the current flight situation. In order to accomplish the flight objectives, certain critical functions must be maintained during each phase of the flight, using the appropriate procedures or success paths. The viability of these procedures depends upon the availability of required resources. If resources available are not sufficient to meet the requirements, alternative procedures (success paths) using the available resources must be constructed to maintain the critical functions and the corresponding objectives. If no success path exists that can satisfy the critical functions/objectives, then the next level of critical functions/objectives must be selected and the process repeated. Information is given in viewgraph form.

  18. Project START: Using a Multiple Intelligences Model in Identifying and Promoting Talent in High-Risk Students. Research Monograph 95136.

    ERIC Educational Resources Information Center

    Callahan, Carolyn M.; Tomlinson, Carol A.; Moon, Tonya R.; Tomchin, Ellen M.; Plucker, Jonathan A.

    This monograph describes Project START (Support To Affirm Rising Talent), a three-year collaborative research effort to develop and apply gifted identification procedures based on Howard Gardner's (1983) theory of multiple intelligences. Specifically, the study attempted to: (1) develop identification procedures; (2) identify high-potential…

  19. Constructing Standards: A Study of Nurses Negotiating with Multiple Modes of Knowledge

    ERIC Educational Resources Information Center

    Nes, Sturle; Moen, Anne

    2010-01-01

    Purpose: The aim of the paper is to explore how multiple modes of knowledge play out in the consolidation of nursing procedures in construction of "local universality". The paper seeks to explore processes where nurses negotiate universal procedures that are to become local standards in a hospital. Design/methodology/approach: The paper…

  20. On Two-Stage Multiple Comparison Procedures When There Are Unequal Sample Sizes in the First Stage.

    ERIC Educational Resources Information Center

    Wilcox, Rand R.

    1984-01-01

    Two stage multiple-comparison procedures give an exact solution to problems of power and Type I errors, but require equal sample sizes in the first stage. This paper suggests a method of evaluating the experimentwise Type I error probability when the first stage has unequal sample sizes. (Author/BW)

  1. Pairwise Multiple Comparisons in Single Group Repeated Measures Analysis.

    ERIC Educational Resources Information Center

    Barcikowski, Robert S.; Elliott, Ronald S.

    Research was conducted to provide educational researchers with a choice of pairwise multiple comparison procedures (P-MCPs) to use with single group repeated measures designs. The following were studied through two Monte Carlo (MC) simulations: (1) The T procedure of J. W. Tukey (1953); (2) a modification of Tukey's T (G. Keppel, 1973); (3) the…

  2. The Effect of a Multiple Treatment Program and Maintenance Procedures on Smoking Cessation.

    ERIC Educational Resources Information Center

    Powell, Don R.

    The efficacy of a multiple treatment smoking cessation program and three maintenance strategies was evaluated. Phases I and II of the study involved 51 subjects who participated in a five-day smoking cessation project consisting of lectures, demonstrations, practice exercises, negative smoking, and the teaching of self-control procedures. At the…

  3. The Inverse Relation between Multiplication and Division: Concepts, Procedures, and a Cognitive Framework

    ERIC Educational Resources Information Center

    Robinson, Katherine M.; LeFevre, Jo-Anne

    2012-01-01

    Researchers have speculated that children find it more difficult to acquire conceptual understanding of the inverse relation between multiplication and division than that between addition and subtraction. We reviewed research on children and adults' use of shortcut procedures that make use of the inverse relation on two kinds of problems:…

  4. 40 CFR 1033.315 - Test procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Test procedures. 1033.315 Section 1033... Programs § 1033.315 Test procedures. (a) Test procedures. Use the test procedures described in subpart F of this part, except as specified in this section. (1) You may ask to use other test procedures. We will...

  5. 40 CFR 1033.315 - Test procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Test procedures. 1033.315 Section 1033... Programs § 1033.315 Test procedures. (a) Test procedures. Use the test procedures described in subpart F of this part, except as specified in this section. (1) You may ask to use other test procedures. We will...

  6. 40 CFR 1033.315 - Test procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Test procedures. 1033.315 Section 1033... Programs § 1033.315 Test procedures. (a) Test procedures. Use the test procedures described in subpart F of this part, except as specified in this section. (1) You may ask to use other test procedures. We will...

  7. Higher certainty of the laser-induced damage threshold test with a redistributing data treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, Lars; Mrohs, Marius; Gyamfi, Mark

    2015-10-15

    As a consequence of its statistical nature, the measurement of the laser-induced damage threshold always carries the risk of over- or underestimating the real threshold value. For S-on-1 (and 1-on-1) tests, one of the established measurement procedures outlined in the corresponding ISO standard 21254, the results depend on the number of data points and their distribution over the fluence scale. Given the limited space on a test sample as well as the requirements on test site separation and beam sizes, the amount of data from one test is restricted. This paper reports on a way to treat damage test data in order to reduce the statistical error and therefore the measurement uncertainty. Three simple assumptions allow one data point to be assigned to multiple data bins, thereby virtually increasing the available database.

  8. A default Bayesian hypothesis test for mediation.

    PubMed

    Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan

    2015-03-01

    In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).
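
    The paper's default Bayesian test is distributed as the BayesMed R package; for contrast, the sketch below implements the classical frequentist product-of-coefficients (Sobel) test for mediation that such Bayesian procedures are meant to complement. The simulated instruction/knowledge/consumption data and path coefficients are purely illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 200
x = rng.normal(size=n)              # independent variable (e.g., instruction)
m = 0.5 * x + rng.normal(size=n)    # mediator (e.g., knowledge of a healthy diet)
y = 0.4 * m + rng.normal(size=n)    # outcome (e.g., consumption of fruits/vegetables)

# Path a: regress M on X
Xa = np.column_stack([np.ones(n), x])
a_hat, *_ = np.linalg.lstsq(Xa, m, rcond=None)
res_a = m - Xa @ a_hat
se_a = np.sqrt(np.sum(res_a**2) / (n - 2) * np.linalg.inv(Xa.T @ Xa)[1, 1])

# Path b: regress Y on M, controlling for X
Xb = np.column_stack([np.ones(n), x, m])
b_hat, *_ = np.linalg.lstsq(Xb, y, rcond=None)
res_b = y - Xb @ b_hat
se_b = np.sqrt(np.sum(res_b**2) / (n - 3) * np.linalg.inv(Xb.T @ Xb)[2, 2])

# Sobel z for the indirect (mediated) effect a*b
a, b = a_hat[1], b_hat[2]
sobel_z = (a * b) / np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
p_value = 2 * stats.norm.sf(abs(sobel_z))
print(f"Sobel z = {sobel_z:.2f}, p = {p_value:.4f}")
```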

  9. Significance tests for functional data with complex dependence structure.

    PubMed

    Staicu, Ana-Maria; Lahiri, Soumen N; Carroll, Raymond J

    2015-01-01

    We propose an L2-norm-based global testing procedure for the null hypothesis that multiple group mean functions are equal, for functional data with complex dependence structure. Specifically, we consider the setting of functional data with a multilevel structure of the form groups-clusters or subjects-units, where the unit-level profiles are spatially correlated within the cluster, and the cluster-level data are independent. Orthogonal series expansions are used to approximate the group mean functions and the test statistic is estimated using the basis coefficients. The asymptotic null distribution of the test statistic is derived under mild regularity conditions. To our knowledge, this is the first work that studies hypothesis testing when data have such complex multilevel functional and spatial structure. Two small-sample alternatives, including a novel block bootstrap for functional data, are proposed, and their performance is examined in simulation studies. The paper concludes with an illustration of a motivating experiment.

  10. Clinical Outcome Metrics for Optimization of Robust Training

    NASA Technical Reports Server (NTRS)

    Ebert, D.; Byrne, V. E.; McGuire, K. M.; Hurst, V. W., IV; Kerstman, E. L.; Cole, R. W.; Sargsyan, A. E.; Garcia, K. M.; Reyes, D.; Young, M.

    2016-01-01

    Introduction: The emphasis of this research is on the Human Research Program (HRP) Exploration Medical Capability's (ExMC) "Risk of Unacceptable Health and Mission Outcomes Due to Limitations of In-Flight Medical Capabilities." Specifically, this project aims to contribute to the closure of gap ExMC 2.02: We do not know how the inclusion of a physician crew medical officer quantitatively impacts clinical outcomes during exploration missions. The experiments are specifically designed to address clinical outcome differences between physician and non-physician cohorts in both near-term and longer-term (mission impacting) outcomes. Methods: Medical simulations will systematically compare success of individual diagnostic and therapeutic procedure simulations performed by physician and non-physician crew medical officer (CMO) analogs using clearly defined short-term (individual procedure) outcome metrics. In the subsequent step of the project, the procedure simulation outcomes will be used as input to a modified version of the NASA Integrated Medical Model (IMM) to analyze the effect of the outcome (degree of success) of individual procedures (including successful, imperfectly performed, and failed procedures) on overall long-term clinical outcomes and the consequent mission impacts. The procedures to be simulated are endotracheal intubation, fundoscopic examination, kidney/urinary ultrasound, ultrasound-guided intravenous catheter insertion, and a differential diagnosis exercise. Multiple assessment techniques will be used, centered on medical procedure simulation studies occurring at 3, 6, and 12 months after initial training (as depicted in the following flow diagram of the experiment design). Discussion: Analysis of procedure outcomes in the physician and non-physician groups and their subsets (tested at different elapsed times post training) will allow the team to 1) define differences between physician and non-physician CMOs in terms of both procedure performance (pre-IMM analysis) and overall mitigation of the mission medical impact (IMM analysis); 2) refine the procedure outcome and clinical outcome metrics themselves; 3) refine or develop innovative medical training products and solutions to maximize CMO performance; and 4) validate the methods and products of this experiment for operational use in the planning, execution, and quality assurance of the CMO training process. The team has finalized training protocols and developed a software training/testing tool in collaboration with Butler Graphics (Detroit, MI). In addition to the "hands on" medical procedure modules, the software includes a differential diagnosis exercise (limited clinical decision support tool) to evaluate the diagnostic skills of participants. Human subject testing will occur over the next year.

  11. Patient-centered outcomes comparing digital and conventional implant impression procedures: a randomized crossover trial.

    PubMed

    Joda, Tim; Brägger, Urs

    2016-12-01

    The aim of this randomized controlled trial was to compare patient-centered outcomes during digital and conventional implant impressions. In a crossover study design, intraoral scanning (IOS) [test] as well as classical polyether impressions [control] were both performed on 20 patients for single-tooth replacement with implant-supported crowns. Whether the test or the control procedure was performed first was randomly assigned. Patients' perception of and satisfaction with convenience-related factors were assessed with visual analogue scale (VAS) questionnaires. In addition, clinical work time was separately recorded for test and control procedures. Statistical analyses were performed with Wilcoxon signed-rank tests and corrected for multiple testing by the method of Holm. On VAS ranging from 0 to 100, patients scored a mean convenience level of 78.6 (SD ± 14.0) in favor of IOS compared to conventional impressions with 53.6 (SD ± 15.4) [P = 0.0001]. All included patients would prefer the digital workflow if in the future they could choose between the two techniques. As a secondary outcome, IOS was significantly faster with 14.8 min (SD ± 2.2) compared to the conventional approach with 17.9 min (SD ± 1.1) [P = 0.0001]. Based on the findings of this investigation, both impression protocols successfully captured the 3D implant positions for all study participants. However, the digital technique emerged as the preferred option according to patient-centered outcomes and was more time-efficient than conventional impressions. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
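
    The Holm adjustment used in the trial can be reproduced generically with statsmodels; the p-values below are illustrative placeholders, not the study's results.

```python
from statsmodels.stats.multitest import multipletests

# Illustrative raw p-values from several patient-reported outcomes
raw_p = [0.0001, 0.012, 0.034, 0.20]

reject, p_adj, _, _ = multipletests(raw_p, alpha=0.05, method="holm")
for p, pa, r in zip(raw_p, p_adj, reject):
    print(f"raw p = {p:.4f}  Holm-adjusted p = {pa:.4f}  reject H0: {r}")
```

    Holm's step-down method controls the familywise error rate while being uniformly more powerful than a plain Bonferroni correction.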

  12. 40 CFR 600.111-08 - Test procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Test procedures. 600.111-08 Section... Emission Test Procedures § 600.111-08 Test procedures. This section provides test procedures for the FTP, highway, US06, SC03, and the cold temperature FTP tests. Testing shall be performed according to test...

  13. Step by Step: Biology Undergraduates' Problem-Solving Procedures during Multiple-Choice Assessment.

    PubMed

    Prevost, Luanna B; Lemons, Paula P

    2016-01-01

    This study uses the theoretical framework of domain-specific problem solving to explore the procedures students use to solve multiple-choice problems about biology concepts. We designed several multiple-choice problems and administered them on four exams. We trained students to produce written descriptions of how they solved the problem, and this allowed us to systematically investigate their problem-solving procedures. We identified a range of procedures and organized them as domain general, domain specific, or hybrid. We also identified domain-general and domain-specific errors made by students during problem solving. We found that students use domain-general and hybrid procedures more frequently when solving lower-order problems than higher-order problems, while they use domain-specific procedures more frequently when solving higher-order problems. Additionally, the more domain-specific procedures students used, the higher the likelihood that they would answer the problem correctly, up to five procedures. However, if students used just one domain-general procedure, they were as likely to answer the problem correctly as if they had used two to five domain-general procedures. Our findings provide a categorization scheme and framework for additional research on biology problem solving and suggest several important implications for researchers and instructors. © 2016 L. B. Prevost and P. P. Lemons. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  14. Development and testing of a new disposable sterile device for labelling white blood cells.

    PubMed

    Signore, A; Glaudemans, A W J M; Malviya, G; Lazzeri, E; Prandini, N; Viglietti, A L; De Vries, E F J; Dierckx, R A J O

    2012-08-01

    White blood cell (WBC) labelling requires isolation of cells from the patient's blood under sterile conditions using sterile materials, buffers and disposables under good manufacturing practice (GMP) conditions. Until now, this has limited the use of white blood cell scintigraphy (WBC-S) to well-equipped laboratories with trained personnel. We invented, developed and tested a disposable, sterile, closed device for blood manipulation, WBC purification and radionuclide labelling without exposing the patient's blood or the operator to contamination risks. This device prototype and a final industrialized device (Leukokit®) were tested for WBC labelling and compared to the standard procedure. The Leukokit® was also tested in an international multi-centre study for ease of WBC purification and labelling. With the device prototype, using blood samples from 7 volunteers, we tested the labelling procedure in parallel against the standard procedure of the International Society of Radiolabeled Blood Elements (ISORBE) consensus protocol with respect to cell recovery, labelling efficiency (LE), cell viability (Trypan Blue test) and sterility (haemoculture). On the final Leukokit® we tested the biocompatibility of all components, and again the LE, erythro-sedimentation rate, cell viability, sterility and apyrogenicity. ACD-A, HES and PBS provided by the Leukokit® were also compared to heparin, dextran and autologous plasma, respectively. In 4 samples, we tested the chemotactic activity of purified WBC against 1 mg/ml of lipopolysaccharide (LPS), and the chemotaxis of 99mTc-HMPAO-labelled WBC (925 MBq) was compared to that of unlabelled cells. For the multi-centre study, 70 labellings were performed with the Leukokit® by 9 expert operators and 3 beginners from five centers using blood from both patients and volunteers. Finally, Media-Fill tests were performed by 3 operators on two different days (11 procedures) by replacing blood and kit reagents with bacterial culture media (Tryptic Soy Broth) and testing the sterility of aliquots of the medium at the end of the procedure. Tests performed with the prototype showed no significant differences from the standard procedure, but a faster and safer workflow. Tests performed with the final Leukokit® confirmed full biocompatibility, sterility and apyrogenicity of all reagents and plastic ware. Average WBC recovery with the Leukokit® was comparable to that of the ISORBE protocol (117×10^6 ± 24×10^6 vs. 132×10^6 ± 29×10^6 cells, P = not significant). No differences in red blood cell and platelet content were observed. LE was 82±3% for the Leukokit® and 65±5% for the control (P=0.0003), with PBS versus autologous plasma being the main reason for this difference. Cell viability was always >99.9% in both conditions. Chemotactic tests showed no differences between Leukokit® samples and controls. Haemocultures and Media-Fill tests were always sterile. The procedure was well accepted by expert operators and beginners, with a very fast learning curve (confidence after 2±2 labellings). The invented device offers a high level of protection to operators and patients. The derived Leukokit® is safe and easy to use, and gives a high WBC LE without affecting cell viability and function. As a registered closed, sterile medical device, it may allow easier and faster WBC labelling that is no longer limited to well-equipped laboratories. Simultaneous labelling of samples from multiple patients is also possible.

  15. Building a computer program to support children, parents, and distraction during healthcare procedures.

    PubMed

    Hanrahan, Kirsten; McCarthy, Ann Marie; Kleiber, Charmaine; Ataman, Kaan; Street, W Nick; Zimmerman, M Bridget; Ersig, Anne L

    2012-10-01

    This secondary data analysis used data mining methods to develop predictive models of child risk for distress during a healthcare procedure. Data used came from a study that predicted factors associated with children's responses to an intravenous catheter insertion while parents provided distraction coaching. From the 255 items used in the primary study, 44 predictive items were identified through automatic feature selection and used to build support vector machine regression models. Models were validated using multiple cross-validation tests and by comparing variables identified as explanatory in the traditional versus support vector machine regression. Rule-based approaches were applied to the model outputs to identify overall risk for distress. A decision tree was then applied to evidence-based instructions for tailoring distraction to characteristics and preferences of the parent and child. The resulting decision support computer application, titled Children, Parents and Distraction, is being used in research. Future use will support practitioners in deciding the level and type of distraction intervention needed by a child undergoing a healthcare procedure.
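
    The study's actual items and models are not available here; the sketch below only illustrates the generic pipeline the abstract describes - automatic feature selection followed by support vector machine regression, checked by cross-validation - using scikit-learn on synthetic stand-in data (255 candidate items, 44 informative).

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic stand-in: 255 candidate items, 44 of them carrying signal
rng = np.random.default_rng(4)
X = rng.normal(size=(300, 255))
distress = X[:, :44] @ rng.normal(size=44) + rng.normal(scale=2.0, size=300)

model = make_pipeline(
    StandardScaler(),
    SelectKBest(score_func=f_regression, k=44),  # automatic feature selection
    SVR(kernel="rbf", C=1.0),                    # support vector regression
)
scores = cross_val_score(model, X, distress, cv=5, scoring="r2")
print("cross-validated R^2:", scores.round(2))
```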

  16. 40 CFR 600.111-08 - Test procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Test procedures. 600.111-08 Section... Model Year Automobiles-Test Procedures § 600.111-08 Test procedures. This section provides test procedures for the FTP, highway, US06, SC03, and the cold temperature FTP tests. Testing shall be performed...

  17. Defining unnecessary disinfection procedures for single-dose and multiple-dose vials.

    PubMed

    Buckley, T; Dudley, S M; Donowitz, L G

    1994-11-01

    Recommendations in the literature conflict on the necessity of disinfecting single-use vials prior to aspiration of fluid. Interventions to disinfect the stopper surface on multiple-dose vials vary considerably. To determine the necessity of alcohol disinfection of the stopper on single-dose vials and to compare povidone-iodine and alcohol versus alcohol-only disinfection of the stopper prior to each needle penetration on multiple-dose vials. The rubber stopper surfaces of 100 single-dose vials were cultured for the presence of bacteria. To determine the efficacy of two procedures for disinfection of multiple-dose vials, 87 stopper surfaces routinely disinfected with both povidone-iodine and alcohol were cultured for bacteria. After a change in practice, 100 multiple-dose vials routinely disinfected with alcohol only were cultured for the presence of bacteria. Of the cultures done on single-dose vial stoppers, 99% were sterile. A comparison of the two disinfection techniques for multiple-dose vials revealed that 83 (95%) of the 87 vials prepped with both povidone-iodine and alcohol were sterile, compared with all stoppers disinfected with alcohol only. This study shows the lack of necessity of any disinfection procedure on the rubber stopper of single-dose vials and the efficacy of alcohol only for disinfecting the stopper of multiple-dose vials.

  18. Data-driven inference for the spatial scan statistic.

    PubMed

    Almeida, Alexandre C L; Duarte, Anderson R; Duczmal, Luiz H; Oliveira, Fernando L P; Takahashi, Ricardo H C

    2011-08-02

    Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is proposed, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, regarding the correctness of the decision based on this inference. A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
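
    A schematic sketch of the modified inference question, under the assumption that the scan statistic and the size of the most likely cluster have already been computed for the observed map and for each Monte Carlo replicate generated under the null hypothesis; the numbers fed in below are invented for illustration, and the cluster-detection step itself is not reproduced.

```python
import numpy as np

def conditional_scan_pvalue(observed_stat, observed_size, null_stats, null_sizes):
    """Monte Carlo p-value for the spatial scan statistic conditional on the
    size k of the most likely cluster: the observed statistic is compared
    only against null replicates whose most likely cluster also has size k."""
    null_stats = np.asarray(null_stats)
    null_sizes = np.asarray(null_sizes)
    same_k = null_sizes == observed_size
    m = np.count_nonzero(same_k)
    if m == 0:
        return np.nan  # no replicate of that size; cannot condition
    exceed = np.count_nonzero(null_stats[same_k] >= observed_stat)
    return (exceed + 1) / (m + 1)

# Illustrative values (not from real case data)
rng = np.random.default_rng(5)
p = conditional_scan_pvalue(observed_stat=9.2, observed_size=4,
                            null_stats=rng.gumbel(6, 1, 999),
                            null_sizes=rng.integers(1, 10, 999))
print(p)
```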

  19. Implementing a routine outcome assessment procedure to evaluate the quality of assistive technology service delivery for children with physical or multiple disabilities: Perceived effectiveness, social cost, and user satisfaction.

    PubMed

    Desideri, Lorenzo; Bizzarri, Martina; Bitelli, Claudio; Roentgen, Uta; Gelderblom, Gert-Jan; de Witte, Luc

    2016-01-01

    There is a lack of evidence on the effects and quality of assistive technology service delivery (ATSD). This study presents a quasi-experimental 3-month follow-up using a pre-test/post-test design aimed at evaluating outcomes of assistive technology (AT) interventions targeting children with physical and multiple disabilities. A secondary aim was to evaluate the feasibility of the follow-up assessment adopted in this study with a view to implementing the procedure in routine clinical practice. Forty-five children aged 3-17 years were included. Parents were asked to complete the Individual Prioritised Problem Assessment (IPPA) for AT effectiveness; the KWAZO (Kwaliteit van Zorg [Quality of Care]) and Quebec User Evaluation of Satisfaction with Assistive Technology (QUEST) 2.0 for satisfaction with ATSD; and the Siva Cost Analysis Instrument (SCAI) for estimating the social cost of AT interventions. At follow-up, 25 children used the recommended AT. IPPA effect sizes ranged from 1.4 to 0.7, showing a large effect of the AT interventions. Overall, parents were satisfied with ATSD, but Maintenance, Professional Services, and AT Delivery were rated as unsatisfactory. The SCAI showed more resources spent on the AT intervention than on human assistance without technological support. AT may be an effective intervention for children with disabilities. Issues concerning the responsiveness and feasibility of the IPPA and SCAI instruments are discussed with a view to informing routine clinical practice.

  20. Choosing the best image processing method for masticatory performance assessment when using two-coloured specimens.

    PubMed

    Vaccaro, G; Pelaez, J I; Gil, J A

    2016-07-01

    Objective masticatory performance assessment using two-coloured specimens relies on image processing techniques; however, just a few approaches have been tested and no comparative studies are reported. The aim of this study was to present a procedure for selecting the optimal image analysis method for masticatory performance assessment with a given two-coloured chewing gum. Dentate participants (n = 250; 25 ± 6·3 years) chewed red-white chewing gums for 3, 6, 9, 12, 15, 18, 21 and 25 cycles (2000 samples). Digitised images of retrieved specimens were analysed using 122 image processing methods (IPMs) based on feature extraction algorithms (pixel values and histogram analysis). All IPMs were tested against the following criteria: normality of measurements (Kolmogorov-Smirnov), ability to detect differences among mixing states (ANOVA corrected with post hoc Bonferroni) and moderate-to-high correlation with the number of cycles (Spearman's Rho). The optimal IPM was chosen using multiple criteria decision analysis (MCDA). Measurements provided by all IPMs proved to be normally distributed (P < 0·05), 116 proved sensitive to mixing states (P < 0·05), and 35 showed moderate-to-high correlation with the number of cycles (|ρ| > 0·5; P < 0·05). The variance of the histogram of the Hue showed the highest correlation with the number of cycles (ρ = 0·792; P < 0·0001) and the highest MCDA score (optimal). The proposed procedure proved to be reliable and able to select the optimal approach among multiple IPMs. This experiment may be reproduced to identify the optimal approach for each case of locally available test foods. © 2016 John Wiley & Sons Ltd.

  1. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.

    PubMed

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-03-01

    A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as a part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area in China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By the ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established by use of the latest methodology. Reference intervals of 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas others can be used as assay system-specific reference intervals in China.

  2. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China

    PubMed Central

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li’an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-01-01

    Abstract A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as a part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area in China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box–Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By the ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established by use of the latest methodology. Reference intervals of 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas others can be used as assay system-specific reference intervals in China. PMID:26945390

  3. A study on single lane-change manoeuvres for determining rearward amplification of multi-trailer articulated heavy vehicles with active trailer steering systems

    NASA Astrophysics Data System (ADS)

    Wang, Qiushi; He, Yuping

    2016-01-01

    The Society of Automotive Engineers issued a test procedure, SAE-J2179, to determine the rearward amplification (RA) of multi-trailer articulated heavy vehicles (MTAHVs). Building upon that procedure, the International Organization for Standardization released the test manoeuvres, ISO-14791, for evaluating directional performance of MTAHVs. For the RA measures, ISO-14791 recommends two single lane-change manoeuvres: (1) an open-loop procedure with a single sine-wave steering input; and (2) a closed-loop manoeuvre with a single sine-wave lateral acceleration input. For an articulated vehicle with active trailer steering (ATS), the RA measure in lateral acceleration under the open-loop manoeuvre was not in good agreement with that under the closed-loop manoeuvre. This observation motivates research on the applicability of the two manoeuvres for the RA measures of MTAHVs with ATS. It is reported that transient response under the open-loop manoeuvre often leads to an asymmetric curve of tractor lateral acceleration [Winkler CB, Fancher PS, Bareket Z, Bogard S, Johnson G, Karamihas S, Mink C. Heavy vehicle size and weight - test procedures for minimum safety performance standards. Final technical report, NHTSA, US DOT, contract DTNH22-87-D-17174, University of Michigan Transportation Research Institute, Report No. UMTRI-92-13; 1992]. To explore the effect of the transient response, a multiple cycle sine-wave steering input (MCSSI) manoeuvre is proposed. Simulation demonstrates that the steady-state RA measures of an MTAHV with and without ATS under the MCSSI manoeuvre are in excellent agreement with those under the closed-loop manoeuvre. This indicates that, of the two ISO-14791 manoeuvres, the closed-loop manoeuvre is more suitable for determining the RA measures of MTAHVs with ATS.

  4. Behavioral mechanisms of context fear generalization in mice

    PubMed Central

    Huckleberry, Kylie A.; Ferguson, Laura B.

    2016-01-01

    There is growing interest in generalization of learned contextual fear, driven in part by the hypothesis that mood and anxiety disorders stem from impaired hippocampal mechanisms of fear generalization and discrimination. However, there has been relatively little investigation of the behavioral and procedural mechanisms that might control generalization of contextual fear. We assessed the relative contribution of different contextual features to context fear generalization and characterized how two common conditioning protocols—foreground (uncued) and background (cued) contextual fear conditioning—affected context fear generalization. In one experiment, mice were fear conditioned in context A, and then tested for contextual fear both in A and in an alternate context created by changing a subset of A's elements. The results suggest that floor configuration and odor are more salient features than chamber shape. A second experiment compared context fear generalization in background and foreground context conditioning. Although foreground conditioning produced more context fear than background conditioning, the two procedures produced equal amounts of generalized fear. Finally, results indicated that the order of context tests (original first versus alternate first) significantly modulates context fear generalization, perhaps because the original and alternate contexts are differentially sensitive to extinction. Overall, results demonstrate that context fear generalization is sensitive to procedural variations and likely reflects the operation of multiple interacting psychological and neural mechanisms. PMID:27918275

  5. Evaluation of Second-Level Inference in fMRI Analysis

    PubMed Central

    Roels, Sanne P.; Loeys, Tom; Moerkerke, Beatrijs

    2016-01-01

    We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional magnetic resonance imaging on (1) the balance between false positives and false negatives and on (2) the data-analytical stability, both proxies for the reproducibility of results. Second-level analysis based on a mass univariate approach typically consists of 3 phases. First, one proceeds via a general linear model for a test image that consists of pooled information from different subjects. We evaluate models that take into account first-level (within-subjects) variability and models that do not take into account this variability. Second, one proceeds via inference based on parametric assumptions or via permutation-based inference. Third, we evaluate 3 commonly used procedures to address the multiple testing problem: familywise error rate correction, False Discovery Rate (FDR) correction, and a two-step procedure with minimal cluster size. Based on a simulation study and real data, we find that the two-step procedure with minimal cluster size yields the most stable results, followed by the familywise error rate correction. The FDR correction yields the most variable results, for both permutation-based and parametric inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference. PMID:26819578
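
    For reference, a minimal numpy sketch of two of the corrections compared above - Bonferroni familywise error control and the Benjamini-Hochberg step-up FDR procedure - applied to an illustrative vector of p-values (the cluster-size step is not reproduced here).

```python
import numpy as np

def bonferroni(pvals, alpha=0.05):
    """Familywise error control: reject only p-values below alpha / m."""
    pvals = np.asarray(pvals)
    return pvals <= alpha / pvals.size

def benjamini_hochberg(pvals, alpha=0.05):
    """Step-up BH procedure: reject the k smallest p-values, where k is the
    largest index i with p_(i) <= (i/m) * alpha."""
    pvals = np.asarray(pvals)
    m = pvals.size
    order = np.argsort(pvals)
    thresholds = alpha * (np.arange(1, m + 1) / m)
    below = pvals[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])   # largest index still below its threshold
        reject[order[:k + 1]] = True
    return reject

p = np.array([0.0002, 0.004, 0.019, 0.03, 0.08, 0.4])
print("Bonferroni rejections:", bonferroni(p).sum())
print("BH (FDR) rejections:  ", benjamini_hochberg(p).sum())
```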

  6. An adaptive two-stage dose-response design method for establishing proof of concept.

    PubMed

    Franchetti, Yoko; Anderson, Stewart J; Sampson, Allan R

    2013-01-01

    We propose an adaptive two-stage dose-response design where a prespecified adaptation rule is used to add and/or drop treatment arms between the stages. We extend the multiple comparison procedures-modeling (MCP-Mod) approach into a two-stage design. In each stage, we use the same set of candidate dose-response models and test for a dose-response relationship or proof of concept (PoC) via model-associated statistics. The stage-wise test results are then combined to establish "global" PoC using a conditional error function. Our simulation studies showed good and more robust power in our design method compared to conventional and fixed designs.
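
    The authors combine stage-wise results through a conditional error function; as a related, simpler illustration of combining evidence across two stages, the sketch below uses a weighted inverse-normal combination of independent stage-wise p-values. The equal weights and the example p-values are assumptions for the sketch, not the paper's adaptation rule.

```python
import numpy as np
from scipy import stats

def inverse_normal_combination(p1, p2, w1=np.sqrt(0.5), w2=np.sqrt(0.5)):
    """Combine independent stage-wise p-values with prespecified weights
    (w1**2 + w2**2 = 1).  A small combined p-value indicates global PoC."""
    z = w1 * stats.norm.isf(p1) + w2 * stats.norm.isf(p2)
    return stats.norm.sf(z)

# Illustrative stage-wise PoC p-values from the MCP step in each stage
print(inverse_normal_combination(p1=0.04, p2=0.03))
```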

  7. Diode laser-based air mass flux sensor for subsonic aeropropulsion inlets

    NASA Astrophysics Data System (ADS)

    Miller, Michael F.; Kessler, William J.; Allen, Mark G.

    1996-08-01

    An optical air mass flux sensor based on a compact, room-temperature diode laser in a fiber-coupled delivery system has been tested on a full-scale gas turbine engine. The sensor is based on simultaneous measurements of O2 density and Doppler-shifted velocity along a line of sight across the inlet duct. Extensive tests spanning engine power levels from idle to full afterburner demonstrate accuracy and precision of the order of 1-2% of full scale in density, velocity, and mass flux. The precision-limited velocity at atmospheric pressure was as low as 40 cm/s. Multiple data-reduction procedures are quantitatively compared to suggest optimal strategies for flight sensor packages.

  8. Damage monitoring of aircraft structures made of composite materials using wavelet transforms

    NASA Astrophysics Data System (ADS)

    Molchanov, D.; Safin, A.; Luhyna, N.

    2016-10-01

    The present article is dedicated to the study of the acoustic properties of composite materials and the application of non-destructive testing methods to aircraft components. A mathematical model of a wavelet-transformed signal is presented. The main acoustic (vibration) properties of different composite material structures were investigated, and multiple dependencies of vibration parameters on the noise reduction factor were derived. The main steps of the research procedure and the algorithm of the new method are presented. The data obtained were compared with data from a three-dimensional laser Doppler scanning vibrometer to validate the results. The new technique was tested in the laboratory and on a civil aircraft at a training airfield.

  9. Heavy Ion Irradiation Fluence Dependence for Single-Event Upsets in a NAND Flash Memory

    NASA Technical Reports Server (NTRS)

    Chen, Dakai; Wilcox, Edward; Ladbury, Raymond L.; Kim, Hak; Phan, Anthony; Seidleck, Christina; Label, Kenneth

    2016-01-01

    We investigated the single-event effect (SEE) susceptibility of the Micron 16 nm NAND flash, and found that the single-event upset (SEU) cross section varied inversely with cumulative fluence. We attribute the effect to the variable upset sensitivities of the memory cells. Furthermore, the effect impacts only single cell upsets in general. The rate of multiple-bit upsets remained relatively constant with fluence. The current test standards and procedures assume that SEU follow a Poisson process and do not take into account the variability in the error rate with fluence. Therefore, traditional SEE testing techniques may underestimate the on-orbit event rate for a device with variable upset sensitivity.

  10. Representation of the Physiological Factors Contributing to Postflight Changes in Functional Performance Using Motion Analysis Software

    NASA Technical Reports Server (NTRS)

    Parks, Kelsey

    2010-01-01

    Astronauts experience changes in multiple physiological systems due to exposure to the microgravity conditions of space flight. To understand how changes in physiological function influence functional performance, a testing procedure has been developed that evaluates both astronaut postflight functional performance and related physiological changes. Astronauts complete seven functional and physiological tests. The objective of this project is to use motion tracking and digitizing software to visually display the postflight decrement in the functional performance of the astronauts. The motion analysis software will be used to digitize astronaut data videos into stick figure videos to represent the astronauts as they perform the Functional Tasks Tests. This project will benefit NASA by allowing NASA scientists to present data of their neurological studies without revealing the identities of the astronauts.

  11. DIALIGN P: fast pair-wise and multiple sequence alignment using parallel processors.

    PubMed

    Schmollinger, Martin; Nieselt, Kay; Kaufmann, Michael; Morgenstern, Burkhard

    2004-09-09

    Parallel computing is frequently used to speed up computationally expensive tasks in Bioinformatics. Herein, a parallel version of the multi-alignment program DIALIGN is introduced. We propose two ways of dividing the program into independent sub-routines that can be run on different processors: (a) pair-wise sequence alignments that are used as a first step to multiple alignment account for most of the CPU time in DIALIGN. Since alignments of different sequence pairs are completely independent of each other, they can be distributed to multiple processors without any effect on the resulting output alignments. (b) For alignments of large genomic sequences, we use a heuristic that splits sequences into sub-sequences based on a previously introduced anchored alignment procedure. For our test sequences, this combined approach reduces the program running time of DIALIGN by up to 97%. By distributing sub-routines to multiple processors, the running time of DIALIGN can be crucially improved. With these improvements, it is possible to apply the program in large-scale genomics and proteomics projects that were previously beyond its scope.
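
    DIALIGN's fragment-based alignment is not reproduced here; the sketch below only illustrates parallelization idea (a) - pairwise jobs are mutually independent, so they can be distributed across worker processes without changing the final result - with a toy scoring function standing in for the real pairwise alignment.

```python
from itertools import combinations
from multiprocessing import Pool

def align_pair(pair):
    """Placeholder for a pairwise alignment; here just a toy score counting
    matching positions.  In DIALIGN P this step is the expensive
    fragment-based pairwise alignment."""
    (name_a, seq_a), (name_b, seq_b) = pair
    score = sum(x == y for x, y in zip(seq_a, seq_b))
    return name_a, name_b, score

if __name__ == "__main__":
    sequences = {"s1": "ACGTACGT", "s2": "ACGAACGT", "s3": "TCGTACGA"}
    pairs = list(combinations(sequences.items(), 2))
    with Pool(processes=4) as pool:
        # Each pairwise job is independent, so the results are identical
        # to a serial run regardless of how the jobs are scheduled.
        for a, b, s in pool.map(align_pair, pairs):
            print(a, b, s)
```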

  12. Evaluating the Effectiveness of the Stimulus Pairing Observation Procedure and Multiple Exemplar Instruction on Tact and Listener Responses in Children with Autism

    ERIC Educational Resources Information Center

    Byrne, Brittany L.; Rehfeldt, Ruth Anne; Aguirre, Angelica A.

    2014-01-01

    The stimulus pairing observation procedure (SPOP) combined with multiple exemplar instruction (MEI) has been shown to be effective with typically developing preschoolers in establishing the joint stimulus control required for the development of naming. The purpose of the current investigation was to evaluate the effectiveness and efficiency of the…

  13. An automated device for appetitive conditioning in zebrafish (Danio rerio).

    PubMed

    Manabe, Kazuchika; Dooling, R J; Takaku, Shinichi

    2013-12-01

    An automated device and procedure for the operant conditioning of individual zebrafish were developed. The key feature of this procedure was the construction of a simple, inexpensive feeder that can deliver extremely small amounts of food, thus preventing rapid satiation. This allows the experimenter to run multiple trials in a single test session and multiple sessions in one day. In addition, small response keys made from acrylic rods and fiber sensors were developed that were sufficiently sensitive to detect fish contact. To illustrate the efficiency and utility of the device for traditional learning paradigms, we trained zebrafish on a fixed ratio schedule in which subjects were reinforced with food after 10 responses. Zebrafish reliably responded on the response key for sessions that lasted as long as 80 reinforcements. They also showed the traditional "break and run" response pattern that has been found in many species. These results show that this system will be valuable for behavioral studies with zebrafish, especially for experiments that require many repeated trials with a food reinforcer in a session. The present system can be used for sensory and learning investigations, as well as for applications in behavioral pharmacology, behavioral genetics, and toxicology, where the zebrafish is becoming the vertebrate model of choice.

  14. MultipleColposcopyJCO

    Cancer.gov

    Performing multiple biopsies during a procedure known as colposcopy (visual inspection of the cervix) is more effective than performing only a single biopsy of the worst-appearing area for detecting cervical cancer precursors. This multiple-biopsy approach improves the detection of precancerous cervical lesions compared with single-biopsy practice.

  15. Age- and sex-specific reference values of a test of neck muscle endurance.

    PubMed

    Peolsson, Anneli; Almkvist, Cecilia; Dahlberg, Camilla; Lindqvist, Sara; Pettersson, Susanne

    2007-01-01

    This study evaluates age- and sex-specific reference values for neck muscle endurance (NME). In this cross-sectional study, 116 randomly selected, healthy volunteers (ages 25-64 years), stratified according to age and gender, participated. Dorsal and ventral NME were measured in seconds until exhaustion in a lying position. A weight of 4 kg for men or 2 kg for women was used in the dorsal procedure. The ventral procedure was performed without external load. Background and physical activity data were obtained and used in the analysis of NME performance. Mean values for dorsal and ventral NME were about 7 and 2.5 minutes for men and 8.5 and 0.5 minutes for women, respectively. The cutoff values for subnormal dorsal and ventral NME were 157 and 56 seconds for men and 173 and 23 seconds for women, respectively. Women's NME was 122% of men's NME in the dorsal (P = .17) and 24% of men's NME in the ventral (P < .0001) procedure. There were no significant differences among age groups. In multiple regression analysis, physical activity explained 4% of the variability in performance of the dorsal NME, and sex explained 37% of the variability in performance of the ventral NME. The reference values and cutoff points obtained could be used in clinical practice to identify patients with subnormal NME. Sex is an important consideration when using both the test procedure and the reference values.

  16. Comparing Universal Lynch Syndrome Tumor Screening Programs to Evaluate Associations Between Implementation Strategies and Patient Follow-through

    PubMed Central

    Cragun, Deborah; DeBate, Rita D.; Vadaparampil, Susan T.; Baldwin, Julie; Hampel, Heather; Pal, Tuya

    2014-01-01

    Purpose Universal tumor screening (UTS) of all colorectal cancer (CRC) patients can improve the identification of Lynch syndrome, the most common cause of hereditary CRC. This multiple-case study explored how variability in UTS procedures influences patient follow-through (PF) with germline testing after a screen-positive result. Methods Data were obtained through web-based surveys and telephone interviews with institutional informants. Institutions were categorized as Low-PF (≤10% underwent germline testing), Medium-PF (11–40%), or High-PF (>40%). To identify implementation procedures (i.e., conditions) unique to High-PF institutions, qualitative comparative analysis was performed. Results Twenty-one informants from fifteen institutions completed surveys and/or interviews. Conditions present among all five High-PF institutions included: 1) disclosure of screen-positive results to patients by genetic counselors (GCs); and 2) GCs either facilitated physician referrals to genetics or eliminated the need for referrals. Although both of these High-PF conditions were present at two Medium-PF institutions, automatic reflex testing was lacking and difficulty contacting screen-positive patients was a barrier. The three remaining Medium-PF and five Low-PF institutions lacked the High-PF conditions. Conclusion Methods for streamlining UTS procedures, incorporating a high level of involvement of GCs in results tracking and communication, and reducing barriers to patient contact are reviewed within a broader discussion on maximizing the effectiveness and public health impact of UTS. PMID:24651603

  17. Generating Models of Surgical Procedures using UMLS Concepts and Multiple Sequence Alignment

    PubMed Central

    Meng, Frank; D’Avolio, Leonard W.; Chen, Andrew A.; Taira, Ricky K.; Kangarloo, Hooshang

    2005-01-01

    Surgical procedures can be viewed as a process composed of a sequence of steps performed on, by, or with the patient’s anatomy. This sequence is typically the pattern followed by surgeons when generating surgical report narratives for documenting surgical procedures. This paper describes a methodology for semi-automatically deriving a model of conducted surgeries, utilizing a sequence of derived Unified Medical Language System (UMLS) concepts for representing surgical procedures. A multiple sequence alignment was computed from a collection of such sequences and was used for generating the model. These models have the potential of being useful in a variety of informatics applications such as information retrieval and automatic document generation. PMID:16779094
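
    The core alignment step described above can be illustrated with a small sketch that aligns two hypothetical sequences of surgical-step concepts using Python's standard-library difflib. This is only an illustration of pairwise alignment with invented step labels, not the authors' UMLS-based multiple sequence alignment, which generalizes this step (e.g., via progressive alignment) across many reports.

    ```python
    # Sketch: pairwise alignment of two surgical-report concept sequences.
    # The step labels below are hypothetical placeholders for UMLS concepts.
    from difflib import SequenceMatcher

    seq_a = ["incision", "dissection", "clamp_vessel", "excision", "closure"]
    seq_b = ["incision", "dissection", "excision", "irrigation", "closure"]

    matcher = SequenceMatcher(a=seq_a, b=seq_b)
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        # 'equal' blocks are shared steps; 'insert'/'delete'/'replace' mark
        # steps present in only one report, i.e. optional steps in the model.
        print(tag, seq_a[i1:i2], seq_b[j1:j2])
    ```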

  18. The development of large-aperture test system of infrared camera and visible CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Infrared camera and visible CCD camera dual-band imaging systems are widely used in many types of equipment and applications. If such a system is tested using the traditional infrared camera test system and a separate visible CCD test system, installation and alignment must be performed twice during the test procedure. The large-aperture test system for infrared and visible CCD cameras uses a common large-aperture reflective collimator, target wheel, frame grabber, and computer, which reduces both the cost and the time spent on installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator's focal position as the environmental temperature changes, which also improves the image quality of the wide-field collimator and the test accuracy. Its performance matches that of comparable foreign systems at a much lower cost, making it commercially attractive.
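
    The multiple-frame averaging step mentioned above is straightforward to sketch. The NumPy snippet below is an illustration, not the system's actual software: averaging N frames of zero-mean random noise reduces the noise standard deviation by roughly a factor of sqrt(N).

    ```python
    # Sketch: suppress random noise by averaging a stack of captured frames.
    import numpy as np

    def average_frames(frames):
        """frames: iterable of 2-D arrays of equal shape (successive grabs)."""
        stack = np.stack(list(frames), axis=0).astype(np.float64)
        return stack.mean(axis=0)   # noise std drops roughly as 1/sqrt(N)

    # Hypothetical usage with simulated noisy frames of a flat target:
    rng = np.random.default_rng(0)
    truth = np.full((480, 640), 100.0)
    frames = [truth + rng.normal(0, 5, truth.shape) for _ in range(16)]
    clean = average_frames(frames)
    ```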

  19. Competence in the musculoskeletal system: assessing the progression of knowledge through an undergraduate medical course.

    PubMed

    Basu, Subhashis; Roberts, Chris; Newble, David I; Snaith, Michael

    2004-12-01

    Professional bodies have expressed concerns that medical students lack appropriate knowledge in musculoskeletal medicine despite its high prevalence of use within the community. Changes in curriculum and teaching strategies may be contributing factors. There is little evidence to evaluate the degree to which these concerns are justified. The aim was to design and evaluate an assessment procedure that tests the progress of medical students in achieving a core level of knowledge in musculoskeletal medicine during the course. Participants were a stratified sample of 136 volunteer students from all 5 years of the medical course at Sheffield University. The progress test concept was adapted to provide a cross-sectional view of student knowledge gain during each year of the course. A test was devised to assess competence at the standard required of a newly qualified doctor in understanding basic and clinical sciences relevant to musculoskeletal medicine. The test was blueprinted against internal and external guidelines. It comprised 40 multiple-choice and extended matching questions administered by computer. Six musculoskeletal practitioners set the standard using a modified Angoff procedure. Test reliability was 0.6 (Cronbach's alpha). Mean scores of students increased from 41% in Year 1 to 84% by the final year. The data suggest that, from a baseline score in Year 1, there is a disparate experience of learning in Year 2 that evens out in Year 3, with knowledge progression becoming more consistent thereafter. All final-year participants scored above the standard predicted by the Angoff procedure. This short computer-based test was a feasible method of estimating student knowledge acquisition in musculoskeletal medicine across the undergraduate curriculum. Tested students appear to have acquired a satisfactory knowledge base by the end of the course. Knowledge gain seemed relatively independent of specialty-specific clinical training. Proposals from specialty bodies to include long periods of disciplinary teaching may be unnecessary.
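
    For readers unfamiliar with Angoff-style standard setting, the sketch below shows the generic calculation: each judge estimates, per item, the probability that a minimally competent candidate answers correctly, and the cut score is the sum of the item-wise means. The ratings are invented, and this is an illustration of the general method rather than the modified procedure used in the study.

    ```python
    # Sketch: Angoff-style cut score from judges' item ratings (values invented).
    # ratings[j][i] = judge j's estimated probability that a borderline
    # candidate answers item i correctly.
    ratings = [
        [0.6, 0.4, 0.8, 0.5],   # judge 1
        [0.7, 0.5, 0.7, 0.4],   # judge 2
        [0.5, 0.4, 0.9, 0.6],   # judge 3
    ]

    n_items = len(ratings[0])
    item_means = [sum(j[i] for j in ratings) / len(ratings) for i in range(n_items)]
    cut_score = sum(item_means)            # expected raw score of a borderline candidate
    cut_percent = 100 * cut_score / n_items
    print(f"cut score: {cut_score:.2f}/{n_items} ({cut_percent:.0f}%)")
    ```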

  20. Medical school admission test: advantages for students whose parents are medical doctors?

    PubMed

    Simmenroth-Nayda, Anne; Görlich, Yvonne

    2015-04-23

    Admission candidates, especially in medicine, do not represent the socio-demographic proportions of the average population: children of parents with an academic background are highly overrepresented, and those with parents who are medical doctors represent quite a large and special group. At Göttingen University Medicine, a new admission procedure was established with the intention of broadening the base of applicants towards candidates with previous medical training or lower final school grades. With a view to family background, we wished to know whether candidates differ in the test scores of our admission procedure. In February 2014 we asked all admission candidates at Göttingen University Medicine by questionnaire (nine closed, four open questions) about the academic background in their families, specifically the medical background, school exam grades, and previous medical training, as well as about how they prepared for the admission test. We also analysed data from the admission scores of this group (semi-structured interview and four multiple mini-interviews). In addition to descriptive statistics, we used Pearson correlations, mean comparisons (t-tests, ANOVA), and Scheffé post hoc tests. In February 2014 nearly half of the applicants (44%) at Göttingen University Medicine had a medical background; most frequently, their parents were physicians. This rate is much higher than reported in the literature. Other socio-demographic baseline data did not differ from the percentages given in the literature. Of all applicants, 20% had previous medical training. The group of applicants with parents who were medical doctors did not show any advantage in either test scoring (MMI and interview), their individual preparation for the admission test, or in receiving or accepting a place at medical school. Candidates with parents who were medical doctors scored slightly lower in school exam grades. Our results suggest that there is a self-selection bias as well as a pre-selection for this particular group of applicants. This effect should be monitored during future admission procedures.

  1. Lateral interactions and speed of information processing in highly functioning multiple sclerosis patients.

    PubMed

    Nagy, Helga; Bencsik, Krisztina; Rajda, Cecília; Benedek, Krisztina; Janáky, Márta; Beniczky, Sándor; Kéri, Szabolcs; Vécsei, László

    2007-06-01

    Visual impairment is a common feature of multiple sclerosis. The aim of this study was to investigate lateral interactions in the visual cortex of highly functioning patients with multiple sclerosis and to compare these with basic visual and neuropsychologic functions. Twenty-two young, visually unimpaired multiple sclerosis patients with minimal symptoms (Expanded Disability Status Scale <2) and 30 healthy control subjects participated in the study. Lateral interactions were investigated with the flanker task, during which participants were asked to detect the orientation of a low-contrast Gabor patch (vertical or horizontal) flanked by 2 collinear or orthogonal Gabor patches. Stimulus exposure time was 40, 60, 80, and 100 ms. Digit span forward/backward, digit symbol, verbal fluency, and California Verbal Learning Test procedures were used for background neuropsychologic assessment. Results revealed that patients with multiple sclerosis showed intact visual contrast sensitivity and neuropsychologic functions, whereas orientation detection in the orthogonal condition was significantly impaired. At 40-ms exposure time, collinear flankers facilitated the orientation detection performance of the patients, resulting in normal performance. In conclusion, the detection of briefly presented, low-contrast visual stimuli was selectively impaired in multiple sclerosis. Lateral interactions between target and flankers robustly facilitated target detection in the patient group.

  2. 10 CFR 431.154 - Test procedures.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Test procedures. 431.154 Section 431.154 Energy DEPARTMENT... EQUIPMENT Commercial Clothes Washers Test Procedures § 431.154 Test procedures. The test procedures for residential clothes washers in Appendix J1 to subpart B of part 430 of this title shall be used to test...

  3. Using audio script fading and multiple-exemplar training to increase vocal interactions in children with autism.

    PubMed

    Garcia-Albea, Elena; Reeve, Sharon A; Brothers, Kevin J; Reeve, Kenneth F

    2014-01-01

    Script-fading procedures have been shown to be effective for teaching children with autism to initiate and participate in social interactions without vocal prompts from adults. In previous script and script-fading research, however, there has been no demonstration of a generalized repertoire of vocal interactions under the control of naturally occurring relevant stimuli. In this study, 4 boys with autism were taught to initiate a conversation in the presence of toys through the use of a script and script-fading procedure. Training with multiple categories and exemplars of toys was used to increase the likelihood of generalization of vocal interactions across novel toys. A multiple-probe design across participants was used to assess the effects of these procedures. The intervention successfully brought interactions by children with autism under the control of relevant stimuli in the environment. Future research pertaining to the specific implementation of these procedures (e.g., fading, script placement, participant characteristics) is discussed. © Society for the Experimental Analysis of Behavior.

  4. Assessing group differences in biodiversity by simultaneously testing a user-defined selection of diversity indices.

    PubMed

    Pallmann, Philip; Schaarschmidt, Frank; Hothorn, Ludwig A; Fischer, Christiane; Nacke, Heiko; Priesnitz, Kai U; Schork, Nicholas J

    2012-11-01

    Comparing diversities between groups is a task biologists are frequently faced with, for example in ecological field trials or when dealing with metagenomics data. However, researchers often waver about which measure of diversity to choose, as there is a multitude of approaches available. As Jost (2008, Molecular Ecology, 17, 4015) has pointed out, widely used measures such as the Shannon or Simpson index have undesirable properties which make them hard to compare and interpret. Many of the problems associated with the use of these 'raw' indices can be corrected by transforming them into 'true' diversity measures. We introduce a technique that allows the comparison of two or more groups of observations while simultaneously testing a user-defined selection of 'true' diversity measures. This procedure yields multiplicity-adjusted P-values according to the method of Westfall and Young (1993, Resampling-Based Multiple Testing: Examples and Methods for p-Value Adjustment, 49, 941), which ensures that the rate of false positives (type I error) does not rise as the number of groups and/or diversity indices increases. Software is available in the R package 'simboot'. © 2012 Blackwell Publishing Ltd.
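
    To illustrate the 'raw' versus 'true' diversity distinction referenced above, the sketch below converts Shannon entropy and the Simpson index into Hill numbers (effective species counts) for an invented abundance vector. It is a generic illustration, not the simboot implementation or its resampling procedure.

    ```python
    # Sketch: convert raw Shannon/Simpson indices into 'true' diversities
    # (Hill numbers of order 1 and 2). The species counts are invented.
    import numpy as np

    counts = np.array([50, 30, 10, 5, 5])        # hypothetical species counts
    p = counts / counts.sum()

    shannon_H = -np.sum(p * np.log(p))           # raw Shannon entropy
    simpson_D = np.sum(p ** 2)                   # raw Simpson concentration

    true_div_q1 = np.exp(shannon_H)              # Hill number, order q = 1
    true_div_q2 = 1.0 / simpson_D                # Hill number, q = 2 (inverse Simpson)
    print(true_div_q1, true_div_q2)
    ```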

  5. Rate of revisions or conversion after bariatric surgery over 10 years in the state of New York.

    PubMed

    Altieri, Maria S; Yang, Jie; Nie, Lizhou; Blackstone, Robin; Spaniolas, Konstantinos; Pryor, Aurora

    2018-04-01

    A primary measure of the success of a procedure is whether or not additional surgery becomes necessary. Multi-institutional studies regarding the need for reoperation after bariatric surgery are scarce. The purpose of this study is to evaluate the rate of revisions/conversions (RC) after 3 common bariatric procedures over 10 years in the state of New York. University Hospital, involving a large database in New York State. The Statewide Planning and Research Cooperative System database was used to identify all patients undergoing laparoscopic adjustable gastric banding (LAGB), sleeve gastrectomy (SG), and Roux-en-Y gastric bypass (RYGB) between 2004 and 2010. Patients were followed for RC to other bariatric procedures for at least 4 years (up to 2014). Multivariable Cox proportional hazards regression analysis was performed to identify risk factors for additional surgery after each common bariatric procedure. Multivariable logistic regression was used to examine the factors associated with having ≥2 follow-up procedures. There were 40,994 bariatric procedures: 16,444 LAGB, 22,769 RYGB, and 1781 SG. The rate of RC was 26.0% for LAGB, 9.8% for SG, and 4.9% for RYGB. Multiple RCs (≥2) were more common for LAGB (5.7% for LAGB, .5% for RYGB, and .2% for SG). Band revisions/replacements required further procedures more often than conversions to RYGB/SG (939 compared with 48 procedures). The majority of RCs were not performed at the initial institution (68.2% of LAGB patients, 75.9% for RYGB, 63.7% of SG). Risk factors for multiple procedures included surgery type, as LAGB was more likely to have multiple RCs. Reoperation was common for LAGB but less common for RYGB (4.9%) and SG (9.8%). RC rates after SG were almost twice those after RYGB. LAGB had the highest rate (5.7%) of multiple reoperations. Conversion was the procedure of choice after a failed LAGB. Copyright © 2018 American Society for Bariatric Surgery. Published by Elsevier Inc. All rights reserved.
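
    For orientation, a multivariable Cox proportional hazards model of time to revision/conversion can be fitted with the lifelines package as sketched below. The tiny DataFrame, column names, and values are invented for illustration; this is not the SPARCS analysis itself.

    ```python
    # Sketch: Cox proportional hazards model for time to revision/conversion.
    # The DataFrame below is a small invented example, not SPARCS data.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "years_to_rc": [2.1, 4.0, 1.3, 6.2, 3.5, 5.0, 2.8, 4.4],
        "rc_observed": [1, 0, 1, 0, 1, 1, 0, 0],   # 1 = revision/conversion occurred
        "lagb":        [1, 0, 1, 0, 0, 1, 1, 0],   # index-procedure indicator
        "age":         [42, 55, 38, 61, 47, 50, 44, 58],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="years_to_rc", event_col="rc_observed")
    cph.print_summary()   # hazard ratios for LAGB vs. other procedures, age, ...
    ```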

  6. Combining abdominal and cosmetic breast surgery does not increase short-term complication rates: a comparison of each individual procedure and pretreatment risk stratification tool.

    PubMed

    Khavanin, Nima; Jordan, Sumanas W; Vieira, Brittany L; Hume, Keith M; Mlodinow, Alexei S; Simmons, Christopher J; Murphy, Robert X; Gutowski, Karol A; Kim, John Y S

    2015-11-01

    Combined abdominal and breast surgery presents a convenient and relatively cost-effective approach for accomplishing both procedures. This study is the largest to date assessing the safety of combined procedures, and it aims to develop a simple pretreatment risk stratification method for patients who desire a combined procedure. All women undergoing abdominoplasty, panniculectomy, augmentation mammaplasty, and/or mastopexy in the TOPS database were identified. Demographics and outcomes for combined procedures were compared to individual procedures using χ(2) and Student's t-tests. Multiple logistic regression provided adjusted odds ratios for the effect of a combined procedure on 30-day complications. Among combined procedures, a logistic regression model determined point values for pretreatment risk factors including diabetes (1 point), age over 53 (1), obesity (2), and ASA status of 3 or higher (3), creating a 7-point pretreatment risk stratification tool. A total of 58,756 cases met the inclusion criteria. Complication rates among combined procedures (9.40%) were greater than those of aesthetic breast surgery (2.66%; P < .001) but did not significantly differ from abdominal procedures (9.75%; P = .530). Nearly 77% of combined cases were classified as low risk (0 points total), with a 9.78% complication rate. Medium-risk patients (1 to 3 points) had a 16.63% complication rate, and high-risk patients (4 to 7 points) 38.46%. Combining abdominal and breast procedures is safe in the majority of patients and does not increase 30-day complication rates. The risk stratification tool can help ensure favorable outcomes for patients who may desire a combined surgery. Level of Evidence: 4 (Risk). © 2015 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com.
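
    The 7-point pretreatment score described above maps directly onto a small helper function. The sketch below simply encodes the published point values (diabetes 1, age over 53 1, obesity 2, ASA 3+ 3) and the reported low/medium/high bands; the function and argument names are our own choices, not from the study.

    ```python
    # Sketch: pretreatment risk score for combined abdominal/breast surgery,
    # using the point values and risk bands reported in the abstract.
    def pretreatment_risk(diabetes: bool, age: int, obese: bool, asa_class: int):
        score = ((1 if diabetes else 0)
                 + (1 if age > 53 else 0)
                 + (2 if obese else 0)
                 + (3 if asa_class >= 3 else 0))
        if score == 0:
            band = "low"        # ~9.78% 30-day complication rate
        elif score <= 3:
            band = "medium"     # ~16.63%
        else:
            band = "high"       # ~38.46%
        return score, band

    print(pretreatment_risk(diabetes=False, age=60, obese=True, asa_class=2))  # (3, 'medium')
    ```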

  7. Experimental Assessment and Enhancement of Planar Laser-Induced Fluorescence Measurements of Nitric Oxide in an Inverse Diffusion Flame

    NASA Technical Reports Server (NTRS)

    Partridge, William P.; Laurendeau, Normand M.

    1997-01-01

    We have experimentally assessed the quantitative nature of planar laser-induced fluorescence (PLIF) measurements of NO concentration in a unique atmospheric pressure, laminar, axial inverse diffusion flame (IDF). The PLIF measurements were assessed relative to a two-dimensional array of separate laser saturated fluorescence (LSF) measurements. We demonstrated and evaluated several experimentally-based procedures for enhancing the quantitative nature of PLIF concentration images. Because these experimentally-based PLIF correction schemes require only the ability to make PLIF and LSF measurements, they produce a more broadly applicable PLIF diagnostic compared to numerically-based correction schemes. We experimentally assessed the influence of interferences on both narrow-band and broad-band fluorescence measurements at atmospheric and high pressures. Optimum excitation and detection schemes were determined for the LSF and PLIF measurements. Single-input and multiple-input, experimentally-based PLIF enhancement procedures were developed for application in test environments with both negligible and significant quench-dependent error gradients. Each experimentally-based procedure provides an enhancement of approximately 50% in the quantitative nature of the PLIF measurements, and results in concentration images nominally as quantitative as LSF point measurements. These correction procedures can be applied to other species, including radicals, for which no experimental data are available from which to implement numerically-based PLIF enhancement procedures.

  8. An efficient genome-wide association test for multivariate phenotypes based on the Fisher combination function.

    PubMed

    Yang, James J; Li, Jia; Williams, L Keoki; Buu, Anne

    2016-01-05

    In genome-wide association studies (GWAS) for complex diseases, the association between a SNP and each phenotype is usually weak. Combining multiple related phenotypic traits can increase the power of gene search and thus is a practically important area that requires methodology work. This study provides a comprehensive review of existing methods for conducting GWAS on complex diseases with multiple phenotypes, including the multivariate analysis of variance (MANOVA), principal component analysis (PCA), generalized estimating equations (GEE), the trait-based association test involving the extended Simes procedure (TATES), and the classical Fisher combination test. We propose a new method that relaxes the unrealistic independence assumption of the classical Fisher combination test and is computationally efficient. To demonstrate applications of the proposed method, we also present the results of statistical analysis on the Study of Addiction: Genetics and Environment (SAGE) data. Our simulation study shows that the proposed method has higher power than existing methods while controlling the type I error rate. The GEE and the classical Fisher combination test, on the other hand, do not control the type I error rate and thus are not recommended. In general, the power of the competing methods decreases as the correlation between phenotypes increases. All the methods tend to have lower power when the multivariate phenotypes come from long-tailed distributions. The real data analysis also demonstrates that the proposed method allows us to compare the marginal results with the multivariate results and specify which SNPs are specific to a particular phenotype or contribute to the common construct. The proposed method outperforms existing methods in most settings and also has great applications in GWAS on complex diseases with multiple phenotypes such as substance abuse disorders.
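
    As a reference point, the classical Fisher combination test discussed above can be written in a few lines with SciPy. Note that this is the plain version, which assumes independent p-values; it is not the authors' proposed modification that relaxes the independence assumption.

    ```python
    # Sketch: classical Fisher combination of per-phenotype p-values for one SNP.
    # Under independence, -2 * sum(log p_i) ~ chi-square with 2k degrees of freedom.
    import numpy as np
    from scipy import stats

    pvals = np.array([0.04, 0.20, 0.008])          # invented per-phenotype p-values

    stat = -2 * np.sum(np.log(pvals))
    p_combined = stats.chi2.sf(stat, df=2 * len(pvals))

    # Equivalent one-liner provided by SciPy:
    stat2, p2 = stats.combine_pvalues(pvals, method="fisher")
    print(p_combined, p2)
    ```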

  9. MGAS: a powerful tool for multivariate gene-based genome-wide association analysis.

    PubMed

    Van der Sluis, Sophie; Dolan, Conor V; Li, Jiang; Song, Youqiang; Sham, Pak; Posthuma, Danielle; Li, Miao-Xin

    2015-04-01

    Standard genome-wide association studies, testing the association between one phenotype and a large number of single nucleotide polymorphisms (SNPs), are limited in two ways: (i) traits are often multivariate, and analysis of composite scores entails loss in statistical power; and (ii) gene-based analyses may be preferred, e.g. to decrease the multiple testing problem. Here we present a new method, the multivariate gene-based association test by extended Simes procedure (MGAS), that allows gene-based testing of multivariate phenotypes in unrelated individuals. Through extensive simulation, we show that under most trait-generating genotype-phenotype models MGAS has superior statistical power to detect associated genes compared with gene-based analyses of univariate phenotypic composite scores (i.e. GATES, multiple regression) and multivariate analysis of variance (MANOVA). Re-analysis of metabolic data revealed 32 false discovery rate-controlled genome-wide significant genes, and 12 regions harboring multiple genes; of these 44 regions, 30 were not reported in the original analysis. MGAS allows researchers to conduct their multivariate gene-based analyses efficiently, and without the loss of power that is often associated with an incorrectly specified genotype-phenotype model. MGAS is freely available in KGG v3.0 (http://statgenpro.psychiatry.hku.hk/limx/kgg/download.php). Access to the metabolic dataset can be requested at dbGaP (https://dbgap.ncbi.nlm.nih.gov/). The R-simulation code is available from http://ctglab.nl/people/sophie_van_der_sluis. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
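
    For orientation, the plain Simes combination that GATES and MGAS extend can be sketched as below. This illustration pools SNP-level p-values within one gene and omits the extension (effective numbers of tests accounting for LD between SNPs) that the extended Simes procedures add; it is not the MGAS/KGG implementation.

    ```python
    # Sketch: plain Simes combination of SNP p-values within one gene.
    # (GATES/MGAS replace the raw count m with effective numbers of tests
    #  that account for LD; that refinement is omitted here.)
    import numpy as np

    def simes_gene_p(snp_pvals):
        p = np.sort(np.asarray(snp_pvals))
        m = len(p)
        ranks = np.arange(1, m + 1)
        return np.min(m * p / ranks)

    print(simes_gene_p([0.30, 0.01, 0.04, 0.22]))   # invented p-values -> 0.04
    ```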

  10. 48 CFR 232.1004 - Procedure.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... OF DEFENSE GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 232.1004 Procedure. (c) Instructions for multiple appropriations. If the contract contains foreign military sales...

  11. 48 CFR 232.1004 - Procedure.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... OF DEFENSE GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Performance-Based Payments 232.1004 Procedure. (c) Instructions for multiple appropriations. If the contract contains foreign military sales...

  12. High-resolution imaging using a wideband MIMO radar system with two distributed arrays.

    PubMed

    Wang, Dang-wei; Ma, Xiao-yan; Chen, A-Lei; Su, Yi

    2010-05-01

    Imaging a fast maneuvering target has been an active research area in past decades. Usually, an array antenna with multiple elements is employed to avoid the motion compensation involved in inverse synthetic aperture radar (ISAR) imaging. Nevertheless, this comes at a price: the hardware complexity is much higher than that of a single-antenna ISAR imaging system, which instead relies on complex algorithms. In this paper, a wideband multiple-input multiple-output (MIMO) radar system with two distributed arrays is proposed to reduce the hardware complexity of the system. Furthermore, the system model, the equivalent array production method, and the imaging procedure are presented. Compared with the classical real aperture radar (RAR) imaging system, an important contribution of our method is that the hardware complexity of the imaging system is reduced, since many additional virtual array elements can be obtained. Numerical simulations are provided to test our system and imaging method.
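
    The virtual-array idea invoked above can be illustrated in a few lines. In the standard MIMO formulation, each transmit-receive pair behaves like a monostatic element located at the sum of the two element positions; the 1-D geometry and spacings below are invented and do not reproduce the paper's two distributed arrays.

    ```python
    # Sketch: virtual array positions for a MIMO radar (generic 1-D geometry).
    import numpy as np

    tx = np.array([0.0, 0.5, 1.0])            # transmit element positions (m), invented
    rx = np.array([0.0, 2.0, 4.0, 6.0])       # receive element positions (m), invented

    # Each Tx/Rx pair acts like a monostatic element located at (tx + rx):
    virtual = (tx[:, None] + rx[None, :]).ravel()
    print(np.unique(virtual))                 # 12 virtual elements from 3 + 4 physical ones
    ```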

  13. Sharply curved turn around duct flow predictions using spectral partitioning of the turbulent kinetic energy and a pressure modified wall law

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1986-01-01

    Computational predictions of turbulent flow in sharply curved 180 degree turn around ducts are presented. The CNS2D computer code is used to solve the equations of motion for two-dimensional incompressible flows transformed to a nonorthogonal body-fitted coordinate system. This procedure incorporates the pressure velocity correction algorithm SIMPLE-C to iteratively solve a discretized form of the transformed equations. A multiple scale turbulence model based on simplified spectral partitioning is employed to obtain closure. Flow field predictions utilizing the multiple scale model are compared to features predicted by the traditional single scale k-epsilon model. Tuning parameter sensitivities of the multiple scale model applied to turn around duct flows are also determined. In addition, a wall function approach based on a wall law suitable for incompressible turbulent boundary layers under strong adverse pressure gradients is tested. Turn around duct flow characteristics utilizing this modified wall law are presented and compared to results based on a standard wall treatment.

  14. 40 CFR 63.694 - Testing methods and procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 11 2014-07-01 2014-07-01 false Testing methods and procedures. 63.694....694 Testing methods and procedures. (a) This section specifies the testing methods and procedures... this subpart, the testing methods and procedures are specified in paragraph (b) of this section. (2) To...

  15. 40 CFR 86.425-78 - Test procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 19 2013-07-01 2013-07-01 false Test procedures. 86.425-78 Section 86... Later New Motorcycles, General Provisions § 86.425-78 Test procedures. (a) Motorcycle emission test procedures are found in subpart F. (b) The Administrator may prescribe emission test procedures for any...

  16. 40 CFR 86.425-78 - Test procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 19 2014-07-01 2014-07-01 false Test procedures. 86.425-78 Section 86... Later New Motorcycles, General Provisions § 86.425-78 Test procedures. (a) Motorcycle emission test procedures are found in subpart F. (b) The Administrator may prescribe emission test procedures for any...

  17. 40 CFR 86.425-78 - Test procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 19 2012-07-01 2012-07-01 false Test procedures. 86.425-78 Section 86... Later New Motorcycles, General Provisions § 86.425-78 Test procedures. (a) Motorcycle emission test procedures are found in subpart F. (b) The Administrator may prescribe emission test procedures for any...

  18. 40 CFR 600.111-93 - Test procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Test procedures. 600.111-93 Section... Emission Regulations for 1978 and Later Model Year Automobiles-Test Procedures § 600.111-93 Test procedures. (a) The test procedures to be followed for generation of the city fuel economy data are those...

  19. 40 CFR 90.508 - Test procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Test procedures. 90.508 Section 90.508....508 Test procedures. (a) For nonroad engines subject to the provisions of this subpart, the prescribed test procedures are the appropriate small SI engine test procedures as described in subpart E of this...

  20. 40 CFR 89.508 - Test procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Test procedures. 89.508 Section 89.508... Test procedures. (a)(1) For nonroad engines subject to the provisions of this subpart, the prescribed test procedures are the nonroad engine 8-mode test procedure as described in subpart E of this part...

  1. 40 CFR 600.111-80 - Test procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Test procedures. 600.111-80 Section... Emission Regulations for 1978 and Later Model Year Automobiles-Test Procedures § 600.111-80 Test procedures. (a) The test procedures to be followed for generation of the city fuel economy data are those...

  2. 21 CFR 355.70 - Testing procedures for fluoride dentifrice drug products.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 5 2010-04-01 2010-04-01 false Testing procedures for fluoride dentifrice drug... Procedures § 355.70 Testing procedures for fluoride dentifrice drug products. (a) A fluoride dentifrice drug... tests: Enamel solubility reduction or fluoride enamel uptake. The testing procedures for these...

  3. 21 CFR 355.70 - Testing procedures for fluoride dentifrice drug products.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 5 2011-04-01 2011-04-01 false Testing procedures for fluoride dentifrice drug... Procedures § 355.70 Testing procedures for fluoride dentifrice drug products. (a) A fluoride dentifrice drug... tests: Enamel solubility reduction or fluoride enamel uptake. The testing procedures for these...

  4. Endodontic flare-ups: comparison of incidence between single and multiple visit procedures in patients attending a Nigerian teaching hospital.

    PubMed

    Oginni, Adeleke O; Udoye, Christopher I

    2004-11-26

    BACKGROUND: Until recently, the most widely accepted technique for root canal treatment stressed a multiple-visit procedure. Most schools also concentrated on teaching the multi-visit concept. However, it has now been reported that single-visit treatment is advocated by at least 70% of schools in all geographical areas. The aims of the present study were therefore to determine the incidence of post-obturation flare-ups following single- and multiple-visit endodontic treatment procedures, and to establish the relationship between pre-operative and post-obturation pain in patients referred for endodontic therapy in a Nigerian teaching hospital. METHODS: Data collected included pulp vitality status and the presence or absence of pre-operative, inter-appointment, and post-obturation pain. Pain was recorded as none, slight, or moderate/severe. Flare-ups were defined as either the patient's report of pain not controlled with over-the-counter medication or as increasing swelling. The patients were recalled at three specific post-obturation periods: the 1st, 7th, and 30th day. The presence or absence of pain, or the appropriate degree of pain, was recorded for each recall visit and the interval between visits. The compiled data were analysed using chi-square where applicable. P level

  5. Hybrid transfer-matrix FDTD method for layered periodic structures.

    PubMed

    Deinega, Alexei; Belousov, Sergei; Valuev, Ilya

    2009-03-15

    A hybrid transfer-matrix finite-difference time-domain (FDTD) method is proposed for modeling the optical properties of finite-width planar periodic structures. This method can also be applied for calculation of the photonic bands in infinite photonic crystals. We describe the procedure of evaluating the transfer-matrix elements by a special numerical FDTD simulation. The accuracy of the new method is tested by comparing computed transmission spectra of a 32-layered photonic crystal composed of spherical or ellipsoidal scatterers with the results of direct FDTD and layer-multiple-scattering calculations.
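
    For background on the transfer-matrix side of the hybrid method, the sketch below implements the textbook characteristic-matrix calculation of transmittance through a layer stack at normal incidence. It is a generic illustration with invented refractive indices and thicknesses, not the paper's procedure of extracting transfer-matrix elements from FDTD simulations.

    ```python
    # Sketch: textbook transfer-matrix transmittance of a layered stack at
    # normal incidence (lossless, non-magnetic media; values are invented).
    import numpy as np

    def layer_matrix(n, d, wavelength):
        delta = 2 * np.pi * n * d / wavelength           # phase thickness
        eta = n                                          # admittance (normal incidence)
        return np.array([[np.cos(delta), 1j * np.sin(delta) / eta],
                         [1j * eta * np.sin(delta), np.cos(delta)]])

    def transmittance(layers, wavelength, n_in=1.0, n_out=1.0):
        M = np.eye(2, dtype=complex)
        for n, d in layers:                              # multiply layer matrices in order
            M = M @ layer_matrix(n, d, wavelength)
        B, C = M @ np.array([1.0, n_out])
        t = 2 * n_in / (n_in * B + C)
        return (n_out / n_in) * abs(t) ** 2

    stack = [(2.0, 100e-9), (1.5, 150e-9)] * 8           # hypothetical 16-layer stack
    print(transmittance(stack, wavelength=600e-9))
    ```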

  6. Mobile Offshore Drilling Unit (MODU) Ocean Ranger, O.N. 615641, Capsizing and Sinking in the Atlantic Ocean, on 15 February 1982 with Multiple Loss of Life.

    DTIC Science & Technology

    1983-05-20

    features of off-load and on-load release gears. Model tests in a wave tank have shown this system to reliably provide automatic release of the boat. It...similar to the lifeboats. The approved release hook system automatically releases the raft when the hook is set during lowering and the raft becomes...the severe storm; the lack of written casualty control procedures; the inadequate ballast system pump and piping design and arrangement for dewatering

  7. A Preliminary Clinical Comparison of the Use of Fascia Lata Allograft and Autogenous Connective Tissue Graft in Multiple Gingival Recession Coverage Based on the Tunnel Technique.

    PubMed

    Bednarz, Wojciech; Żurek, Jacek; Gedrange, Thomas; Dominiak, Marzena

    2016-01-01

    The most effective method for treating gingival recessions (GR) is with an autogenous connective tissue graft (CTG) via flap surgery. Often, however, the amount of CTG that can be grafted is insufficient to cover all of a patient's gingival recessions at one time. The objective of this study was to provide a 6-month comparative assessment of the results of covering multiple Miller Class I and II gingival recessions with a Fascia Lata Allograft (FL) and a CTG harvested from palatal mucosa. The study comprised a total of 30 people who underwent multiple gingival recession (GR) procedures using a modified, coronally advanced tunnel technique (MCAT). The patients were divided into two groups of 15 according to the type of materials used for gingival augmentation purposes: FL for the test group and CTG for the control group. A clinical assessment was made at baseline, as well as 3 and 6 months following surgery. The following factors were assessed: recession depth, recession width, probing depth, clinical attachment level, height of keratinized tissue (HKT), distance between the cemento-enamel junction and the muco-gingival junction (CEJ-MGJ), API, SBI. The following values were calculated: average root coverage (ARC), complete root coverage (CRC). No statistically significant differences were observed between the groups in terms of clinical parameters assessed after 6 months, apart from CRC, which was 94.87 ± 0.14 mm in the control group and 94.24 ± 0.20 mm in the study group (p = 0.034). The average HKT in the control group after 6 months amounted to 2.86 ± 1.60 mm, and in the test group to 3.09 ± 0.95 mm, which translates into an increase in comparison to the baseline values of 0.73 mm (p < 0.001) and 0.48 mm (p = 0.017), respectively. FL Allografts may serve as an alternative to autogenous CTG in multiple gingival recession coverage procedures based on the tunnel technique.

  8. Correction of the significance level when attempting multiple transformations of an explanatory variable in generalized linear models

    PubMed Central

    2013-01-01

    Background In statistical modeling, finding the most favorable coding for an explanatory quantitative variable involves many tests. This process raises a multiple testing problem and requires correction of the significance level. Methods For each coding, a test of the nullity of the coefficient associated with the newly coded variable is computed. The selected coding corresponds to the one associated with the largest test statistic (or, equivalently, the smallest p-value). In the context of the Generalized Linear Model, Liquet and Commenges (Stat Probability Lett, 71:33–38, 2005) proposed an asymptotic correction of the significance level. This procedure, based on the score test, has been developed for dichotomous and Box-Cox transformations. In this paper, we suggest the use of resampling methods to estimate the significance level for categorical transformations with more than two levels and, by definition, those that involve more than one parameter in the model. The categorical transformation is a more flexible way to explore the unknown shape of the effect between an explanatory and a dependent variable. Results The simulations we ran in this study showed good performance of the proposed methods. These methods were illustrated using data from a study of the relationship between cholesterol and dementia. Conclusion The algorithms were implemented in R, and the associated CPMCGLM R package is available on the CRAN. PMID:23758852
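
    The resampling correction described above can be sketched generically as a min-p (max-test) permutation adjustment: the smallest p-value over all candidate codings is recomputed on permuted outcomes to obtain its null distribution. The code below is an illustration with simulated data and simple dichotomizations, not the CPMCGLM implementation.

    ```python
    # Sketch: permutation-based correction when the 'best' of several codings
    # of an explanatory variable is selected (min-p / max-test adjustment).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    x = rng.normal(size=200)                      # quantitative explanatory variable
    y = rng.normal(size=200)                      # outcome generated under the null

    # Candidate dichotomizations at the quartiles of x:
    codings = [x > c for c in np.quantile(x, [0.25, 0.5, 0.75])]

    def min_p(y_vec):
        # smallest p-value across codings (two-sample t-test per coding)
        return min(stats.ttest_ind(y_vec[c], y_vec[~c]).pvalue for c in codings)

    observed = min_p(y)
    perm = [min_p(rng.permutation(y)) for _ in range(999)]
    p_adjusted = (1 + sum(p <= observed for p in perm)) / (1 + len(perm))
    print(observed, p_adjusted)
    ```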

  9. Methods for detecting long-term CNS dysfunction after prenatal exposure to neurotoxins.

    PubMed

    Vorhees, C V

    1997-11-01

    Current U.S. Environmental Protection Agency regulatory guidelines for developmental neurotoxicity emphasize functional categories such as motor activity, auditory startle, and learning and memory. A single test of some simple form of learning and memory is accepted to meet the latter category. The rationale for this emphasis has been that sensitive and reliable methods for assessing complex learning and memory are either not available or are too burdensome, and that insufficient data exist to endorse one approach over another. There has been little discussion of the fact that learning and memory is not a single identifiable functional category and no single test can assess all types of learning and memory. Three methods for assessing complex learning and memory are presented that assess two different types of learning and memory, are relatively efficient to conduct, and are sensitive to several known neurobehavioral teratogens. The tests are a 9-unit multiple-T swimming maze, and the Morris and Barnes mazes. The first of these assesses sequential learning, while the latter two assess spatial learning. A description of each test is provided, along with procedures for their use, and data exemplifying effects obtained using developmental exposure to phenytoin, methamphetamine, and MDMA. It is argued that multiple tests of learning and memory are required to ascertain cognitive deficits; something no single method can accomplish. Methods for acoustic startle are also presented.

  10. Multiplicative Thinking: Much More than Knowing Multiplication Facts and Procedures

    ERIC Educational Resources Information Center

    Hurst, Chris; Hurrell, Derek

    2016-01-01

    Multiplicative thinking is accepted as a "big idea" of mathematics that underpins important mathematical concepts such as fraction understanding, proportional reasoning, and algebraic thinking. It is characterised by understandings such as the multiplicative relationship between places in the number system, basic and extended number…

  11. Optimal decision making on the basis of evidence represented in spike trains.

    PubMed

    Zhang, Jiaxiang; Bogacz, Rafal

    2010-05-01

    Experimental data indicate that perceptual decision making involves integration of sensory evidence in certain cortical areas. Theoretical studies have proposed that the computation in neural decision circuits approximates statistically optimal decision procedures (e.g., sequential probability ratio test) that maximize the reward rate in sequential choice tasks. However, these previous studies assumed that the sensory evidence was represented by continuous values from gaussian distributions with the same variance across alternatives. In this article, we make a more realistic assumption that sensory evidence is represented in spike trains described by the Poisson processes, which naturally satisfy the mean-variance relationship observed in sensory neurons. We show that for such a representation, the neural circuits involving cortical integrators and basal ganglia can approximate the optimal decision procedures for two and multiple alternative choice tasks.
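
    The sequential probability ratio test mentioned above takes a particularly compact form when evidence arrives as Poisson spike counts. The sketch below is a generic two-alternative illustration with invented firing rates and thresholds, not the paper's cortico-basal-ganglia circuit model.

    ```python
    # Sketch: SPRT for two alternatives when evidence arrives as Poisson spike
    # counts with rate r1 under H1 and r0 under H0 (rates and bounds invented).
    import numpy as np

    def sprt_poisson(counts, r1=12.0, r0=8.0, a_upper=+3.0, a_lower=-3.0):
        """Accumulate the log-likelihood ratio over per-bin spike counts."""
        llr = 0.0
        for t, n in enumerate(counts, start=1):
            llr += n * np.log(r1 / r0) - (r1 - r0)   # Poisson log-likelihood ratio per bin
            if llr >= a_upper:
                return "choose H1", t
            if llr <= a_lower:
                return "choose H0", t
        return "undecided", len(counts)

    rng = np.random.default_rng(0)
    spikes = rng.poisson(12.0, size=50)              # evidence actually generated under H1
    print(sprt_poisson(spikes))
    ```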

  12. Investigating the failure of repeated standard cleaning and disinfection of a Pseudomonas aeruginosa-infected pancreatic and biliary endoscope.

    PubMed

    Qiu, Lijun; Zhou, Zhihui; Liu, Qifang; Ni, Yuhua; Zhao, Feng; Cheng, Hao

    2015-08-01

    Digestive endoscopy is an important technique for the diagnosis and treatment of digestive system disease. To assure medical safety, a digestive endoscope must be cleaned and disinfected before its use in an operation on the next patient. The most common treatment procedure for a digestive endoscope is high-level disinfection. The potential risk associated with digestive endoscopes is always a focus of endoscopic management in clinical practice. In this study, a contaminated pancreatic and biliary endoscope was cleaned and disinfected multiple times after surgery with the standard procedure but still tested culture-positive for Pseudomonas aeruginosa, which is very rare and has not been reported in China or abroad. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  13. Plasma-sprayed self-lubricating coatings

    NASA Technical Reports Server (NTRS)

    Nakamura, H. H.; Logan, W. R.; Harada, Y.

    1982-01-01

    One of the most important criteria for acceptable commercial application of a multiple-phase composition is uniformity and reproducibility. This means that the performance characteristics of the coating - e.g., its lubricating properties, bond strength to the substrate, and thermal properties - can be readily predicted to give a desired performance. The improvement of uniformity and reproducibility of the coatings, the oxidation behavior in three temperature ranges, the effect of the bond coat, and the effect of preheat treatment as measured by adhesive strength tests, coating examination procedures, and physical property measurements were studied. The following modifications improved the uniformity and reproducibility: (1) changes and closer control in the particle size range of the raw materials used, (2) increasing the binder content from 3.2% to 4.1% (dried weight), and (3) analytical processing procedures using step-by-step checking to assure consistency.

  14. Toddler test or procedure preparation

    MedlinePlus

    Preparing toddler for test/procedure; Test/procedure preparation - toddler; Preparing for a medical test or procedure - toddler ... Before the test, know that your child will probably cry. Even if you prepare, your child may feel some discomfort or ...

  15. Developing Procedural Flexibility: Are Novices Prepared to Learn from Comparing Procedures?

    ERIC Educational Resources Information Center

    Rittle-Johnson, Bethany; Star, Jon R.; Durkin, Kelley

    2012-01-01

    Background: A key learning outcome in problem-solving domains is the development of procedural flexibility, where learners know multiple procedures and use them appropriately to solve a range of problems (e.g., Verschaffel, Luwel, Torbeyns, & Van Dooren, 2009). However, students often fail to become flexible problem solvers in mathematics. To…

  16. Elastic-net regularization approaches for genome-wide association studies of rheumatoid arthritis.

    PubMed

    Cho, Seoae; Kim, Haseong; Oh, Sohee; Kim, Kyunga; Park, Taesung

    2009-12-15

    The current trend in genome-wide association studies is to identify regions where the true disease-causing genes may lie by evaluating thousands of single-nucleotide polymorphisms (SNPs) across the whole genome. However, many challenges exist in detecting disease-causing genes among the thousands of SNPs. Examples include multicollinearity and multiple testing issues, especially when a large number of correlated SNPs are simultaneously tested. Multicollinearity can often occur when predictor variables in a multiple regression model are highly correlated, and can cause imprecise estimation of association. In this study, we propose a simple stepwise procedure that identifies disease-causing SNPs simultaneously by employing elastic-net regularization, a variable selection method that allows one to address multicollinearity. At Step 1, the single-marker association analysis was conducted to screen SNPs. At Step 2, the multiple-marker association was scanned based on the elastic-net regularization. The proposed approach was applied to the rheumatoid arthritis (RA) case-control data set of Genetic Analysis Workshop 16. While the selected SNPs at the screening step are located mostly on chromosome 6, the elastic-net approach identified putative RA-related SNPs on other chromosomes in an increased proportion. For some of those putative RA-related SNPs, we identified the interactions with sex, a well known factor affecting RA susceptibility.
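
    The two-step idea above (single-marker screening followed by a multi-marker elastic-net fit) can be sketched generically with scikit-learn. The simulated genotypes, screening threshold, and penalty settings below are invented for illustration; the original GAW16 rheumatoid arthritis analysis was not performed with this code.

    ```python
    # Sketch: univariate screening followed by elastic-net logistic regression
    # on the retained SNPs (all data simulated; not the GAW16 RA analysis).
    import numpy as np
    from scipy import stats
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n, m = 300, 500
    X = rng.binomial(2, 0.3, size=(n, m)).astype(float)            # SNP genotypes 0/1/2
    y = rng.binomial(1, 1 / (1 + np.exp(-(X[:, 0] - X[:, 1]))))    # two causal SNPs

    # Step 1: single-marker screening (simple correlation-based test here).
    pvals = np.array([stats.pearsonr(X[:, j], y)[1] for j in range(m)])
    keep = np.where(pvals < 0.01)[0]

    # Step 2: multi-marker elastic-net model on the screened SNPs.
    enet = LogisticRegression(penalty="elasticnet", solver="saga",
                              l1_ratio=0.5, C=1.0, max_iter=5000)
    enet.fit(X[:, keep], y)
    selected = keep[np.abs(enet.coef_[0]) > 1e-6]
    print(len(keep), "screened ->", len(selected), "selected SNPs")
    ```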

  17. Immunological strain specificity within type 1 poliovirus*

    PubMed Central

    Gard, Sven

    1960-01-01

    The demonstration of immunological differences between poliovirus strains of any one type is a valuable procedure in epidemiological research as it may allow a virus strain to be identified as derived from or unrelated to a given possible source of infection. It is obviously of particular importance in connexion with live poliovirus vaccination campaigns. Both kinetic tests and conventional neutralization and complement-fixation techniques have been used to this end, the former involving a more complicated test procedure and the latter demanding greater nicety in the pre-standardization of reagents. The present paper reports on attempts to establish a simplified technique. Neutralization titres of sera obtained by immunization of guinea-pigs with three strains of type 1 poliovirus (including one isolated from a patient in the 1958-59 epidemic in Léopoldville described in the two preceding papers) indicated a degree of strain specificity sufficient to permit the design of a simple screening method for the purpose of a rough immunological classification. Preliminary observations on isolates from persons fed attenuated virus indicate that antigenic changes may occur in the course of multiplication of the virus in the human intestinal tract. PMID:13826481

  18. Quantification of the effects of environmental leaching factors on emissions from bottom ash in road construction.

    PubMed

    Ecke, Holger; Aberg, Annika

    2006-06-01

    The re-use of bottom ash in road construction necessitates a tool to predict the impact of trace metals on the surroundings over the lifetime of the road. The aim of this work was to quantify the effect of environmental factors that are thought to influence leaching, so as to suggest guidelines for developing a leaching procedure for testing incineration residues re-used in road construction. The effects of pH, L/S (liquid-to-solid ratio), leaching time, and leaching atmosphere on the leachate concentrations of Cd, Cr, Cu, Ni, Pb, and Zn were studied using a two-level full factorial design. The most significant factor for all responses was pH, followed by L/S, though the importance of pH and L/S is often ignored in leaching tests. Multiple linear regression models describing the variation in the leaching data had R² values ranging from 61% to 97%. A two-step pH-stat leaching procedure that considers pH as well as L/S and leaching time was suggested.
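
    A two-level full factorial design like the one above is easy to enumerate and analyse programmatically. The sketch below builds the 2^4 design matrix for the four factors named in the abstract and fits a main-effects regression to an invented response; it illustrates the design only, not the study's data or fitted models.

    ```python
    # Sketch: 2^4 full factorial design for the four leaching factors and a
    # main-effects linear regression (the response values are invented).
    import itertools
    import numpy as np

    factors = ["pH", "L/S", "time", "atmosphere"]
    design = np.array(list(itertools.product([-1, +1], repeat=len(factors))), dtype=float)

    rng = np.random.default_rng(0)
    # Invented response: strong pH effect, moderate L/S effect, plus noise.
    y = 5.0 - 2.0 * design[:, 0] + 0.8 * design[:, 1] + rng.normal(0, 0.3, len(design))

    X = np.column_stack([np.ones(len(design)), design])       # intercept + main effects
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    for name, b in zip(["intercept"] + factors, coef):
        print(f"{name:10s} {b:+.2f}")
    ```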

  19. MultiGeMS: detection of SNVs from multiple samples using model selection on high-throughput sequencing data.

    PubMed

    Murillo, Gabriel H; You, Na; Su, Xiaoquan; Cui, Wei; Reilly, Muredach P; Li, Mingyao; Ning, Kang; Cui, Xinping

    2016-05-15

    Single nucleotide variant (SNV) detection procedures are being utilized as never before to analyze the recent abundance of high-throughput DNA sequencing data, both on single and multiple sample datasets. Building on previously published work with the single sample SNV caller genotype model selection (GeMS), a multiple sample version of GeMS (MultiGeMS) is introduced. Unlike other popular multiple sample SNV callers, the MultiGeMS statistical model accounts for enzymatic substitution sequencing errors. It also addresses the multiple testing problem endemic to multiple sample SNV calling and utilizes high performance computing (HPC) techniques. A simulation study demonstrates that MultiGeMS ranks highest in precision among a selection of popular multiple sample SNV callers, while showing exceptional recall in calling common SNVs. Further, both simulation studies and real data analyses indicate that MultiGeMS is robust to low-quality data. We also demonstrate that accounting for enzymatic substitution sequencing errors not only improves SNV call precision at low mapping quality regions, but also improves recall at reference allele-dominated sites with high mapping quality. The MultiGeMS package can be downloaded from https://github.com/cui-lab/multigems. Contact: xinping.cui@ucr.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  20. What Is Evidence-Based Behavior Analysis?

    PubMed Central

    Smith, Tristram

    2013-01-01

    Although applied behavior analysts often say they engage in evidence-based practice, they express differing views on what constitutes “evidence” and “practice.” This article describes a practice as a service offered by a provider to help solve a problem presented by a consumer. Solving most problems (e.g., increasing or decreasing a behavior and maintaining this change) requires multiple intervention procedures (i.e., a package). Single-subject studies are invaluable in investigating individual procedures, but researchers still need to integrate the procedures into a package. The package must be standardized enough for independent providers to replicate yet flexible enough to allow individualization; intervention manuals are the primary technology for achieving this balance. To test whether the package is effective in solving consumers' problems, researchers must evaluate outcomes of the package as a whole, usually in group studies such as randomized controlled trials. From this perspective, establishing an evidence-based practice involves more than analyzing the effects of discrete intervention procedures on behavior; it requires synthesizing information so as to offer thorough solutions to problems. Recognizing the need for synthesis offers behavior analysts many promising opportunities to build on their existing research to increase the quality and quantity of evidence-based practices. PMID:25729130

  1. Abatement of waste gases and water during the processes of semiconductor fabrication.

    PubMed

    Wen, Rui-mei; Liang, Jun-wu

    2002-10-01

    The purpose of this article is to examine the methods and equipment for abating waste gases and water produced during the manufacture of semiconductor materials and devices. Three separation methods and corresponding equipment are used to control three different groups of electronic wastes. The first group includes arsine and phosphine emitted during the manufacture of semiconductor materials. The abatement procedure for this group of pollutants consists of adding iodates and cupric and manganese salts in a multiple shower tower (MST) structure. The second group includes pollutants containing arsenic, phosphorus, HF, HCl, NO2, and SO3 emitted during the manufacture of semiconductor materials and devices. The abatement procedure involves mixing oxidants and bases in an oval column with a separator in the middle. The third group consists of the ions of As, P, and heavy metals contained in the waste water. The abatement procedure includes adding CaCO3 and ferric salts in a compact flocculation-sedimentation device. Test results showed that all waste gases and water treated by the abatement procedures presented in this article met the discharge standards set by the State Environmental Protection Administration of China.

  2. Procedural performance following sleep deprivation remains impaired despite extended practice and an afternoon nap

    PubMed Central

    Kurniawan, Irma Triasih; Cousins, James Nicholas; Chong, Pearlynne L. H.; Chee, Michael W. L.

    2016-01-01

    The negative impact of sleep loss on procedural memory is well established, yet it remains unclear how extended practice opportunities or daytime naps can modulate the effect of a night of sleep deprivation. Here, participants underwent three training and test conditions on a sequential finger tapping task (SFTT) separated by at least one week. In the first condition they were trained in the evening followed by a night of sleep. Two further conditions took place where evening training was followed by a night of total sleep deprivation (TSD). One of the TSD conditions included a one-hour nap opportunity (15:00). Compared to the condition in which sleep was permitted, a night of TSD resulted in poorer performance across 4 practices the following day (10:00–19:00). The deleterious effect of a single night of TSD on procedural performance, was neither clearly alleviated by an afternoon nap nor by multiple practice opportunities. Interestingly, significant gains in performance were observed in all conditions after a one-week delay. Recovery sleep on subsequent nights thus appeared to nullify the effect of a single night of sleep deprivation, underscoring the importance of offline consolidation on the acquisition of procedural skill. PMID:27782172

  3. Survival and re-operation rates after neurosurgical procedures in Scotland: implications for targeted surveillance of sub-clinical variant Creutzfeldt-Jakob disease.

    PubMed

    Bird, Sheila M; Merrall, Elizabeth L C; Ward, Hester J T; Will, Robert G

    2009-01-01

    To assess the feasibility of post-mortem surveillance for subclinical variant Creutzfeldt-Jakob disease (vCJD) at least 5 years after neurosurgical procedures. Using Scottish record linkage, we estimated 5-year survival and re-operation rates after 4 neurosurgical procedures performed during 1993-2001 and identified as high or medium risk for transmitting vCJD: [B] drainage of extra- or subdural haematoma, [E] primary or revisional decompression operations and [H] creation of other ventricular shunts were classified as high risk; [C] operations on cerebral aneurysm (clipping) were classified as medium risk. Fatality rate at 1 year depended strongly on procedure, weakly or not at all on sex and era, and increased with age. Procedure rates differed by sex. The rate of subsequent neurosurgical operations was highest for procedure [H] (sole: 21%; multiple: 28%). Each year, the UK has a new cohort of some 5,000 5-year survivors of a high- or medium-risk neurosurgical procedure, whose subsequent annual mortality is at least 3%. Even if half of the 5-year survivors of neurosurgery since 1996 gave consent-in-life for vCJD-informative testing at post-mortem, there would be too few relevant post-mortems in 2008-2010 (around 1,600) for 'nil detections' to exclude a 1 in 1,000 subclinical vCJD rate. Autopsy surveillance beyond 2010, or among 5-year survivors of non-neurosurgical at-risk operations, would be needed. (c) 2009 S. Karger AG, Basel.

  4. Parameter estimation and forecasting for multiplicative log-normal cascades

    NASA Astrophysics Data System (ADS)

    Leövey, Andrés E.; Lux, Thomas

    2012-04-01

    We study the well-known multiplicative log-normal cascade process in which the multiplication of Gaussian and log-normally distributed random variables yields time series with intermittent bursts of activity. Due to the nonstationarity of this process and the combinatorial nature of such a formalism, its parameters have mostly been estimated by fitting the numerical approximation of the associated non-Gaussian probability density function to empirical data, cf. Castaing [Physica D 46, 177 (1990), doi:10.1016/0167-2789(90)90035-N]. More recently, alternative estimators based upon various moments have been proposed by Beck [Physica D 193, 195 (2004), doi:10.1016/j.physd.2004.01.020] and Kiyono [Phys. Rev. E 76, 041113 (2007), doi:10.1103/PhysRevE.76.041113]. In this paper, we pursue this moment-based approach further and develop a more rigorous generalized method of moments (GMM) estimation procedure to cope with the documented difficulties of previous methodologies. We show that even under uncertainty about the actual number of cascade steps, our methodology yields very reliable results for the estimated intermittency parameter. Employing the Levinson-Durbin algorithm for best linear forecasts, we also show that the estimated parameters can be used for forecasting the evolution of the turbulent flow. We compare forecasting results from the GMM and Kiyono's procedures via Monte Carlo simulations. We finally test the applicability of our approach by estimating the intermittency parameter and forecasting volatility for a sample of financial data from stock and foreign exchange markets.
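
    As a point of reference for the cascade construction itself (not the paper's GMM estimator), the following minimal Python sketch generates a discrete multiplicative log-normal cascade and recovers the per-step log-variance with a naive moment estimate; the parameterisation and the simple estimator are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(42)

      def lognormal_cascade(n_steps: int, sigma: float) -> np.ndarray:
          """Discrete multiplicative cascade: each of the 2**n_steps cells carries the
          product of n_steps lognormal multipliers inherited from its ancestors.
          Multipliers are exp(N(-sigma**2 / 2, sigma**2)) so that E[multiplier] = 1."""
          weights = np.ones(1)
          for _ in range(n_steps):
              multipliers = rng.lognormal(mean=-0.5 * sigma**2, sigma=sigma, size=2 * weights.size)
              weights = np.repeat(weights, 2) * multipliers
          return weights

      def naive_intermittency_estimate(weights: np.ndarray, n_steps: int) -> float:
          """Rough moment estimate of sigma**2 per cascade step from the variance of the
          log-weights.  Cells share ancestor multipliers, so this estimator is noisy and
          somewhat biased -- one motivation for a more careful GMM-style procedure."""
          return np.var(np.log(weights)) / n_steps

      k, sigma = 12, 0.3
      w = lognormal_cascade(k, sigma)
      print("true sigma^2         :", sigma**2)
      print("naive moment estimate:", naive_intermittency_estimate(w, k))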

  5. Modelling CO2 flow in naturally fractured geological media using MINC and multiple subregion upscaling procedure

    NASA Astrophysics Data System (ADS)

    Tatomir, Alexandru Bogdan A. C.; Flemisch, Bernd; Class, Holger; Helmig, Rainer; Sauter, Martin

    2017-04-01

    Geological storage of CO2 represents one viable solution to reduce greenhouse gas emissions to the atmosphere. Potential leakage of stored CO2 can occur through networks of interconnected fractures. The geometrical complexity of these networks is often very high, involving fractures that occur at various scales and form hierarchical structures. Such multiphase flow systems are usually hard to solve with a discrete fracture modelling (DFM) approach. Therefore, continuum fracture models assuming averaged properties are usually preferred. The multiple interacting continua (MINC) model is an extension of the classic double-porosity model (Warren and Root, 1963) which accounts for the non-linear behaviour of the matrix-fracture interactions. For CO2 storage applications the transient representation of the inter-porosity two-phase flow plays an important role. This study tests the accuracy and computational efficiency of the MINC method, complemented with the multiple sub-region (MSR) upscaling procedure, against the DFM. The two-phase flow MINC simulator is implemented in the free, open-source numerical toolbox DuMux (www.dumux.org). The MSR procedure (Gong et al., 2009) determines the inter-porosity terms by solving simplified local single-phase flow problems. The DFM is taken as the reference solution. The numerical examples consider a quasi-1D reservoir with a quadratic fracture system, a five-spot radially symmetric reservoir, and a completely randomly generated fracture system. Keywords: MINC, upscaling, two-phase flow, fractured porous media, discrete fracture model, continuum fracture model
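
    To make the inter-porosity idea concrete, here is a minimal single-phase, zero-dimensional sketch of the Warren-Root-style fracture-matrix exchange that MINC refines into nested matrix shells. It is not DuMux code and not the MSR procedure; all parameter values (porosity, permeability, viscosity, compressibility, shape factor) are illustrative assumptions.

      import numpy as np

      # Zero-dimensional fracture/matrix pair: the fracture is held at a constant
      # injection pressure, and the matrix exchanges fluid with the fracture through
      # a Warren-Root shape-factor transfer term.  MINC refines the single matrix
      # continuum into nested shells so that the transient inside the matrix is resolved.

      phi_m = 0.20          # matrix porosity
      k_m = 1e-18           # matrix permeability [m^2]
      mu = 1e-3             # fluid viscosity [Pa s] (brine-like, for illustration)
      c_t = 1e-9            # total compressibility [1/Pa]
      sigma = 0.1           # shape factor [1/m^2]
      p_f = 15e6            # fracture (injection) pressure [Pa]
      p_m = 10e6            # initial matrix pressure [Pa]

      lam = sigma * k_m / (mu * phi_m * c_t)   # inverse relaxation time of the matrix [1/s]
      dt, t_end = 1e4, 1e7                     # explicit time step and horizon [s]

      for _ in range(int(t_end / dt)):
          p_m += dt * lam * (p_f - p_m)        # explicit update of the inter-porosity exchange

      analytic = p_f - (p_f - 10e6) * np.exp(-lam * t_end)
      print(f"matrix pressure after {t_end:.0e} s: {p_m/1e6:.3f} MPa (analytic {analytic/1e6:.3f} MPa)")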

  6. 40 CFR 86.884-5 - Test procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 20 2012-07-01 2012-07-01 false Test procedures. 86.884-5 Section 86... New Diesel Heavy-Duty Engines; Smoke Exhaust Test Procedure § 86.884-5 Test procedures. The procedures described in this and subsequent sections will be the test program to determine the conformity of engines...

  7. 40 CFR 86.608-98 - Test procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 19 2014-07-01 2014-07-01 false Test procedures. 86.608-98 Section 86... New Light-Duty Vehicles, Light-Duty Trucks, and Heavy-Duty Vehicles § 86.608-98 Test procedures. (a) The prescribed test procedures are the Federal Test Procedure, as described in subpart B of this part...

  8. 40 CFR 86.608-98 - Test procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 20 2012-07-01 2012-07-01 false Test procedures. 86.608-98 Section 86... Auditing of New Light-Duty Vehicles, Light-Duty Trucks, and Heavy-Duty Vehicles § 86.608-98 Test procedures. (a) The prescribed test procedures are the Federal Test Procedure, as described in subpart B and/or...

  9. 14 CFR Appendix F to Part 23 - Test Procedure

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Test Procedure F Appendix F to Part 23...—Test Procedure Part I—Acceptable Test Procedure for Self-Extinguishing Materials for Showing Compliance With §§ 23.853, 23.855, and 23.1359 Acceptable test procedure for self-extinguishing materials for...

  10. 40 CFR 600.111-08 - Test procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Test procedures. 600.111-08 Section... Emission Regulations for 1978 and Later Model Year Automobiles-Test Procedures § 600.111-08 Test procedures. This section provides test procedures for the FTP, highway, US06, SC03, and the cold temperature FTP...

  11. 14 CFR Appendix F to Part 23 - Test Procedure

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Test Procedure F Appendix F to Part 23...—Test Procedure Part I—Acceptable Test Procedure for Self-Extinguishing Materials for Showing Compliance With §§ 23.853, 23.855, and 23.1359 Acceptable test procedure for self-extinguishing materials for...

  12. 40 CFR 86.884-5 - Test procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 19 2014-07-01 2014-07-01 false Test procedures. 86.884-5 Section 86... Heavy-Duty Engines; Smoke Exhaust Test Procedure § 86.884-5 Test procedures. The procedures described in this and subsequent sections will be the test program to determine the conformity of engines with the...

  13. 40 CFR 51.357 - Test procedures and standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Test procedures and standards. 51.357... Requirements § 51.357 Test procedures and standards. Written test procedures and pass/fail standards shall be established and followed for each model year and vehicle type included in the program. (a) Test procedure...

  14. Keeping It in Three Dimensions: Measuring the Development of Mental Rotation in Children with the Rotated Colour Cube Test (RCCT).

    PubMed

    Lütke, Nikolay; Lange-Küttner, Christiane

    2015-08-03

    This study introduces the new Rotated Colour Cube Test (RCCT) as a measure of object identification and mental rotation using single 3D colour cube images in a matching-to-sample procedure. One hundred 7- to 11-year-old children were tested with aligned or rotated cube models, distracters and targets. While different orientations of distracters made the RCCT more difficult, different colours of distracters had the opposite effect and made the RCCT easier because colour facilitated clearer discrimination between target and distracters. Ten-year-olds performed significantly better than 7- to 8-year-olds. The RCCT significantly correlated with children's performance on the Raven's Coloured Progressive Matrices Test (RCPM) presumably due to the shared multiple-choice format, but the RCCT was easier, as it did not require sequencing. Children from families with a high socio-economic status performed best on both tests, with boys outperforming girls on the more difficult RCCT test sections.

  15. Recent Improvements in Semi-Span Testing at the National Transonic Facility (Invited)

    NASA Technical Reports Server (NTRS)

    Gatlin, G. M.; Tomek, W. G.; Payne, F. M.; Griffiths, R. C.

    2006-01-01

    Three wind tunnel investigations of a commercial transport, high-lift, semi-span configuration have recently been conducted in the National Transonic Facility at the NASA Langley Research Center. Throughout the course of these investigations multiple improvements have been developed in the facility semi-span test capability. The primary purpose of the investigations was to assess Reynolds number scale effects on a modern commercial transport configuration up to full-scale flight test conditions (Reynolds numbers on the order of 27 million). The tests included longitudinal aerodynamic studies at subsonic takeoff and landing conditions across a range of Reynolds numbers from that available in conventional wind tunnels up to flight conditions. The purpose of this paper is to discuss lessons learned and improvements incorporated into the semi-span testing process. Topics addressed include enhanced thermal stabilization and moisture reduction procedures, assessments and improvements in model sealing techniques, compensation of model reference dimensions due to test temperature, significantly improved semi-span model access capability, and assessments of data repeatability.

  16. A Bayesian test for Hardy–Weinberg equilibrium of biallelic X-chromosomal markers

    PubMed Central

    Puig, X; Ginebra, J; Graffelman, J

    2017-01-01

    The X chromosome is a relatively large chromosome that harbors a substantial amount of genetic information. Much of the statistical analysis of X-chromosomal data is complicated by the fact that males carry only one copy. Recently, frequentist statistical tests for Hardy–Weinberg equilibrium have been proposed specifically for dealing with markers on the X chromosome. Bayesian test procedures for Hardy–Weinberg equilibrium for the autosomes have been described, but Bayesian work on the X chromosome in this context is lacking. This paper gives the first Bayesian approach for testing Hardy–Weinberg equilibrium with biallelic markers on the X chromosome. Marginal and joint posterior distributions for the inbreeding coefficient in females and the male-to-female allele frequency ratio are computed and used for statistical inference. The paper gives a detailed account of the proposed Bayesian test and illustrates it with data from the 1000 Genomes project. In that implementation, a novel approach that tackles multiple testing from a Bayesian perspective through posterior predictive checks is used. PMID:28900292
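
    A minimal conjugate Monte Carlo sketch of the quantities named above (not the authors' exact model or priors): female genotype counts receive a Dirichlet posterior, male hemizygote counts a Beta posterior, and posterior draws yield the female inbreeding coefficient and the male-to-female allele-frequency ratio. The counts used here are hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)

      def xchrom_hwe_posterior(n_AA, n_AB, n_BB, n_A_males, n_B_males, draws=20000):
          """Monte Carlo posterior for (f, male/female allele-frequency ratio) with flat
          conjugate priors: Dirichlet for female genotypes, Beta for male alleles."""
          geno = rng.dirichlet([1 + n_AA, 1 + n_AB, 1 + n_BB], size=draws)
          p_f = geno[:, 0] + 0.5 * geno[:, 1]                  # female A-allele frequency
          f = 1.0 - geno[:, 1] / (2.0 * p_f * (1.0 - p_f))     # inbreeding coefficient
          p_m = rng.beta(1 + n_A_males, 1 + n_B_males, size=draws)
          return f, p_m / p_f                                  # f and male-to-female ratio

      # Hypothetical counts: 120/160/70 female genotypes, 210/140 male hemizygotes.
      f, ratio = xchrom_hwe_posterior(120, 160, 70, 210, 140)
      for name, x in [("inbreeding coefficient f", f), ("male/female frequency ratio", ratio)]:
          lo, hi = np.percentile(x, [2.5, 97.5])
          print(f"{name}: median {np.median(x):.3f}, 95% interval ({lo:.3f}, {hi:.3f})")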

  17. Development of dynamic Bayesian models for web application test management

    NASA Astrophysics Data System (ADS)

    Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.

    2018-03-01

    The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool for modelling complex stochastic dynamic processes. According to the results of the research, the mathematical models and methods of dynamic Bayesian networks give high coverage of the stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connections between individual test assets across multiple time slices. This approach makes it possible to present testing as a discrete process with defined structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine, in one management area, individual units and testing components that have different functionalities and directly influence each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, the methods used to treat situational error signs, and the methods used to produce analytical conclusions based on test results.
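
    The abstract does not publish the network itself, so the sketch below is only a generic two-slice dynamic Bayesian model (a hidden-Markov-style forward filter) for a single test asset: a hidden module-health state updated by noisy pass/fail outcomes. The state space, transition table and emission table are illustrative assumptions, not values from the paper.

      import numpy as np

      # Hidden state = {OK, BUGGY}, observation = {pass, fail}.
      transition = np.array([[0.95, 0.05],   # P(state_t | state_{t-1} = OK)
                             [0.10, 0.90]])  # P(state_t | state_{t-1} = BUGGY)
      emission = np.array([[0.98, 0.02],     # P(obs | state = OK):    pass, fail
                           [0.30, 0.70]])    # P(obs | state = BUGGY): pass, fail

      def forward_filter(observations, prior=(0.9, 0.1)):
          """Return P(hidden state | observations so far) after each test run."""
          belief = np.asarray(prior, dtype=float)
          history = []
          for obs in observations:               # obs: 0 = pass, 1 = fail
              belief = transition.T @ belief     # predict one time slice ahead
              belief = belief * emission[:, obs] # condition on the new test outcome
              belief /= belief.sum()             # renormalise
              history.append(belief.copy())
          return history

      runs = [0, 0, 1, 1, 0, 1]                  # observed pass/fail sequence
      for t, b in enumerate(forward_filter(runs), 1):
          print(f"after run {t}: P(BUGGY) = {b[1]:.3f}")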

  18. Evaluation of efficacy and indications of surgical fixation for multiple rib fractures: a propensity-score matched analysis.

    PubMed

    Uchida, K; Nishimura, T; Takesada, H; Morioka, T; Hagawa, N; Yamamoto, T; Kaga, S; Terada, T; Shinyama, N; Yamamoto, H; Mizobata, Y

    2017-08-01

    The purpose of this study was to assess the effects of recent surgical rib fixation and to establish its indications not only for flail chest but also for multiple rib fractures. Between 2007 and 2015, 187 patients were diagnosed with multiple rib fractures at our institution. After propensity score matching, ten patients who had undergone surgical rib fixation and ten patients who had been treated with non-operative management were included. Categorical variables were analyzed with Fisher's exact test and non-parametric numerical data were compared using the Mann-Whitney U test. The Wilcoxon signed-rank test was used for comparison of pre- and postoperative variables. All statistical data are presented as median (25-75% interquartile range [IQR]) or number. The surgically treated patients were extubated significantly earlier than the non-operatively managed patients (5.5 [1-8] vs 9 [7-12] days: p = 0.019). The duration of continuous intravenous narcotic infusion (4.5 [3-6] vs 12 [9-14] days: p = 0.002) and the duration of intensive care unit stay (6.5 [3-9] vs 12 [8-14] days: p = 0.008) were also significantly shorter in the surgically treated patients. Under the same ventilating conditions, the postoperative values of tidal volume and respiratory rate improved significantly compared with the values measured just before surgery. The incidence of pneumonia as a complication was significantly higher in the non-operative management group (p = 0.05). From the viewpoint of early respiratory stabilization and intensive care unit disposition without complications, surgical rib fixation is an acceptable procedure not only for flail chest but also for repair of severe multiple rib fractures.
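
    For readers unfamiliar with the matching step, here is a minimal sketch of 1:1 nearest-neighbour propensity-score matching followed by a non-parametric outcome comparison. It is not the study's specification: the covariates, simulated data, and matching rule are all illustrative assumptions.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from scipy.stats import mannwhitneyu

      rng = np.random.default_rng(1)

      # Hypothetical data: covariates (age, injury severity score, number of fractured
      # ribs), a treatment flag (surgical fixation) and an outcome (ventilator days).
      n = 187
      X = np.column_stack([rng.normal(55, 15, n),        # age
                           rng.normal(20, 8, n),         # injury severity score
                           rng.integers(3, 12, n)])      # number of fractured ribs
      treated = rng.binomial(1, 1 / (1 + np.exp(-(0.1 * X[:, 2] - 1.5))), n).astype(bool)
      outcome = rng.gamma(shape=3, scale=3, size=n) + 2 * (~treated)

      # 1. Propensity scores from a logistic model of treatment on the covariates.
      ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

      # 2. Greedy 1:1 nearest-neighbour matching on the propensity score, no replacement.
      available = set(np.flatnonzero(~treated))
      pairs = []
      for i in np.flatnonzero(treated):
          if not available:
              break
          candidates = np.array(sorted(available))
          j = candidates[np.argmin(np.abs(ps[candidates] - ps[i]))]
          pairs.append((i, j))
          available.remove(j)

      # 3. Compare outcomes within the matched sample (non-parametric, as in the study).
      t_idx = [i for i, _ in pairs]
      c_idx = [j for _, j in pairs]
      stat, p = mannwhitneyu(outcome[t_idx], outcome[c_idx], alternative="two-sided")
      print(f"{len(pairs)} matched pairs, Mann-Whitney U p-value = {p:.3f}")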

  19. Using Simultaneous Prompting Procedure to Promote Recall of Multiplication Facts by Middle School Students with Cognitive Impairment

    ERIC Educational Resources Information Center

    Rao, Shaila; Mallow, Lynette

    2009-01-01

    This study examined the effectiveness of a simultaneous prompting system in teaching students with cognitive impairment to automate recall of multiplication facts. A multiple-probe design, with multiple sets of math facts and replication across multiple subjects, was used to assess the effectiveness of simultaneous prompting on recall of basic multiplication…

  20. Effects of Mathematics Computer Games on Special Education Students' Multiplicative Reasoning Ability

    ERIC Educational Resources Information Center

    Bakker, Marjoke; van den Heuvel-Panhuizen, Marja; Robitzsch, Alexander

    2016-01-01

    This study examined the effects of a teacher-delivered intervention with online mathematics mini-games on special education students' multiplicative reasoning ability (multiplication and division). The games involved declarative, procedural, and conceptual knowledge of multiplicative relations, and were accompanied by teacher-led lessons…

  1. SWI 1.10 Testing Process

    NASA Technical Reports Server (NTRS)

    Stokes, LeBarian

    2009-01-01

    This procedure establishes a system for performing testing in the Six-Degree-Of-Freedom Dynamic Test System (SDTS). Testing includes development and verification testing of customer supplied Test Articles (TAs) and other testing requirements, as requested. This procedure applies to all SDTS testing operations and equipment. The procedure provides an overview of testing performed in the SDTS including test identification requirements, test planning and procedure development, test and performance inspection, test data analysis, and test report generation.

  2. 75 FR 47817 - Notice of Availability: Test Tools and Test Procedures Approved for the Office of the National...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-09

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Notice of Availability: Test Tools and Test Procedures.... SUMMARY: This notice announces the availability of test tools and test procedures approved by the National... certification program. The approved test tools and test procedures are identified on the ONC Web site at: http...

  3. 77 FR 64343 - Notice of Availability: Test Tools and Test Procedures Approved for the Office of the National...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-19

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Notice of Availability: Test Tools and Test Procedures.... SUMMARY: This notice announces the availability of test tools and test procedures approved by the National... test tools and test procedures are identified on the ONC Web site at: http://www.healthit.gov/policy...

  4. The Effects of Direct Instruction Flashcard and Math Racetrack Procedures on Mastery of Basic Multiplication Facts by Three Elementary School Students

    ERIC Educational Resources Information Center

    Skarr, Adam; Zielinski, Katie; Ruwe, Kellen; Sharp, Hannah; Williams, Randy L.; McLaughlin, T. F.

    2014-01-01

    The purpose of this study was to determine if a typical third-grade boy and fifth-grade girl and a boy with learning disabilities could benefit from the combined use of Direct Instruction (DI) flashcard and math racetrack procedures in an after-school program. The dependent variable was accuracy and fluency of saying basic multiplication facts. A…

  5. Preschooler test or procedure preparation

    MedlinePlus

    Preparing preschoolers for test/procedure; Test/procedure preparation - preschooler ... Preparing children for medical tests can reduce their anxiety. It can also make them less likely to cry and resist the procedure. Research shows that ...

  6. 77 FR 4203 - Energy Conservation Program: Test Procedures for General Service Fluorescent Lamps, General...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-27

    ...On September 14, 2011, the U.S. Department of Energy (DOE) issued a notice of proposed rulemaking (NOPR) to amend the test procedures for general service fluorescent lamps (GSFLs), general service incandescent lamps (GSILs), and incandescent reflector lamps (IRLs). That proposed rulemaking serves as the basis for today's action. DOE is amending its test procedures for GSFLs and GSILs established under the Energy Policy and Conservation Act (EPCA). DOE is not amending in this final rule the existing test procedure for IRLs established under EPCA. For GSFLs and GSILs, DOE is updating several references to the industry standards referenced in DOE's test procedures. DOE is also establishing a lamp lifetime test procedure for GSILs. These test procedures also provide the protocols upon which the Federal Trade Commission bases its energy guide label for these products. DOE's review of the GSFL, GSIL, and IRL test procedures fulfills the EPCA requirement that DOE review test procedures for all covered products at least once every seven years.

  7. Test Operations Procedure (TOP) 03-2-827 Test Procedures for Video Target Scoring Using Calibration Lights

    DTIC Science & Technology

    2016-04-04

    This Test Operations Procedure (TOP) describes typical equipment and procedures used to set up and operate a Video Target Scoring System (VTSS) for video target scoring using calibration lights. Subject terms: Video Target Scoring System, VTSS, witness screens, camera, target screen, light pole.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miraglia, Roberto, E-mail: rmiraglia@ismett.edu; Maruzzelli, Luigi; Tuzzolino, Fabio

    Purpose: The aim of this study was to estimate radiation exposure in pediatric liver transplant recipients who underwent biliary interventional procedures and to compare radiation exposure levels between biliary interventional procedures performed using an image intensifier-based angiographic system (IIDS) and a flat panel detector-based interventional system (FPDS). Materials and Methods: We enrolled 34 consecutive pediatric liver transplant recipients with biliary strictures between January 2008 and March 2013, with a total of 170 image-guided procedures. The dose-area product (DAP) and fluoroscopy time were recorded for each procedure. The mean age was 61 months (range 4-192), and the mean weight was 17 kg (range 4-41). The procedures were classified into three categories: percutaneous transhepatic cholangiography and biliary catheter placement (n = 40); cholangiography and balloon dilatation (n = 55); and cholangiography and biliary catheter change or removal (n = 75). Ninety-two procedures were performed using an IIDS. Seventy-eight procedures performed after July 2010 used an FPDS. The difference in DAP between the two angiographic systems was compared using the Wilcoxon rank-sum test and a multiple linear regression model. Results: Mean DAP in all three categories was significantly greater for procedures performed using the IIDS than for those performed using the FPDS. Statistical analysis showed p = 0.001 for the PTBD group, p = 0.0002 for the cholangiography and balloon dilatation group, and p = 0.00001 for the cholangiography and biliary catheter change or removal group. Conclusion: In our selected cohort of patients, the use of an FPDS decreased radiation exposure.
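
    A minimal sketch of the kind of rank-sum comparison described above, using simulated dose-area products rather than the study's data; the distributions and group sizes are illustrative assumptions.

      import numpy as np
      from scipy.stats import ranksums

      rng = np.random.default_rng(7)

      # Simulated dose-area products (Gy*cm^2), with the flat-panel system (FPDS)
      # given a lower typical dose than the image intensifier (IIDS).
      dap_iids = rng.lognormal(mean=2.0, sigma=0.6, size=92)
      dap_fpds = rng.lognormal(mean=1.4, sigma=0.6, size=78)

      stat, p = ranksums(dap_iids, dap_fpds)
      print(f"median IIDS = {np.median(dap_iids):.1f}, median FPDS = {np.median(dap_fpds):.1f}, "
            f"Wilcoxon rank-sum p = {p:.2g}")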

  9. Integrated data analysis for genome-wide research.

    PubMed

    Steinfath, Matthias; Repsilber, Dirk; Scholz, Matthias; Walther, Dirk; Selbig, Joachim

    2007-01-01

    Integrated data analysis is introduced as the intermediate level of a systems biology approach to analysing different 'omics' datasets, i.e., genome-wide measurements of transcripts, protein levels or protein-protein interactions, and metabolite levels, with the aim of generating a coherent understanding of biological function. In this chapter we focus on different methods of correlation analysis, ranging from simple pairwise correlation to kernel canonical correlation, which were recently applied in molecular biology. Several examples are presented to illustrate their application. The input data for this kind of analysis frequently originate from different experimental platforms. Therefore, preprocessing steps such as data normalisation and missing value estimation are inherent to this approach. The corresponding procedures, potential pitfalls and biases, and available software solutions are reviewed. The multiplicity of observations obtained in omics-profiling experiments necessitates the application of multiple testing correction techniques.
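
    To illustrate the combination of pairwise correlation with a multiple testing correction, the sketch below screens simulated transcript-metabolite pairs with Pearson correlation and applies the Benjamini-Hochberg FDR procedure; the data, dimensions, and choice of correction are illustrative assumptions rather than the chapter's specific pipeline.

      import numpy as np
      from scipy.stats import pearsonr

      rng = np.random.default_rng(3)

      # Toy integrated data set: 200 transcripts and 30 metabolites measured on the
      # same 40 samples; a handful of transcripts truly drive metabolite 0.
      n_samples, n_genes, n_metab = 40, 200, 30
      genes = rng.normal(size=(n_samples, n_genes))
      metab = rng.normal(size=(n_samples, n_metab))
      metab[:, 0] += genes[:, :5].sum(axis=1)          # planted signal

      # Pairwise correlation tests between every transcript and every metabolite.
      pvals = np.array([[pearsonr(genes[:, g], metab[:, m])[1]
                         for m in range(n_metab)] for g in range(n_genes)]).ravel()

      def benjamini_hochberg(p, alpha=0.05):
          """Indices of hypotheses rejected by the Benjamini-Hochberg FDR procedure."""
          p = np.asarray(p)
          order = np.argsort(p)
          thresholds = alpha * np.arange(1, p.size + 1) / p.size
          below = np.nonzero(p[order] <= thresholds)[0]
          if below.size == 0:
              return np.array([], dtype=int)
          return order[: below[-1] + 1]

      rejected = benjamini_hochberg(pvals)
      print(f"{rejected.size} of {pvals.size} transcript-metabolite pairs significant at FDR 0.05")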

  10. Including persistency of impairment in mild cognitive impairment classification enhances prediction of 5-year decline.

    PubMed

    Vandermorris, Susan; Hultsch, David F; Hunter, Michael A; MacDonald, Stuart W S; Strauss, Esther

    2011-02-01

    Although older adults with Mild Cognitive Impairment (MCI) show elevated rates of conversion to dementia as a group, heterogeneity of outcomes is common at the individual level. Using data from a prospective 5-year longitudinal investigation of cognitive change in healthy older adults (N = 262, aged 64-92 years), this study addressed limitations in contemporary MCI identification procedures that rely on single-occasion assessment ("Single-Assessment [SA] MCI") by evaluating an alternative operational definition of MCI requiring evidence of persistent cognitive impairment over multiple testing sessions ("Multiple-Assessment [MA] MCI"). As hypothesized, the prevalence of SA-MCI exceeded that of MA-MCI. Further, the MA-MCI groups showed lower baseline cognitive and functional performance and steeper cognitive decline compared with the Control and SA-MCI groups. Results are discussed with reference to retest effects and clinical implications.

  11. Hypnosis as sole anaesthesia for skin tumour removal in a patient with multiple chemical sensitivity.

    PubMed

    Facco, E; Pasquali, S; Zanette, G; Casiglia, E

    2013-09-01

    A female patient with multiple chemical sensitivity and previous anaphylactoid reactions to local anaesthetics was admitted for removal of a thigh skin tumour under hypnosis as sole anaesthesia. The hypnotic protocol included hypnotic focused analgesia and a pre-operative pain threshold test. After inducing hypnosis, a wide excision was performed, preserving the deep fascia, and the tumour was removed; the patient's heart rate and blood pressure did not increase during the procedure. When the patient was de-hypnotised, she reported no pain and was discharged immediately. Our case confirms the efficacy of hypnosis and demonstrates that it may be valuable as a sole anaesthetic method in selected cases. Hypnosis can prevent pain perception and surgical stress as a whole, comparing well with anaesthetic drugs. © 2013 The Association of Anaesthetists of Great Britain and Ireland.

  12. Using a fuzzy comprehensive evaluation method to determine product usability: A test case

    PubMed Central

    Zhou, Ronggang; Chan, Alan H. S.

    2016-01-01

    BACKGROUND: In order to take into account the inherent uncertainties during product usability evaluation, Zhou and Chan [1] proposed a comprehensive method of usability evaluation for products by combining the analytic hierarchy process (AHP) and fuzzy evaluation methods for synthesizing performance data and subjective response data. This method was designed to provide an integrated framework combining the inevitable vague judgments from the multiple stages of the product evaluation process. OBJECTIVE AND METHODS: In order to illustrate the effectiveness of the model, this study used a summative usability test case to assess the application and strength of the general fuzzy usability framework. To test the proposed fuzzy usability evaluation framework [1], a standard summative usability test was conducted to benchmark the overall usability of a specific network management software package. Based on the test data, the fuzzy method was applied to incorporate both the usability scores and the uncertainties involved in the multiple components of the evaluation. Then, with Monte Carlo simulation procedures, confidence intervals were used to compare the reliabilities of the fuzzy approach and two typical conventional methods that combine metrics based on percentages. RESULTS AND CONCLUSIONS: This case study showed that the fuzzy evaluation technique can be applied successfully to combine summative usability testing data into an overall usability quality measure for the network software evaluated. The greater differences in confidence interval widths between the equally weighted percentage averaging method and the weighted evaluation methods, including the method of weighted percentage averages, verified the strength of the fuzzy method. PMID:28035942

  13. Using a fuzzy comprehensive evaluation method to determine product usability: A test case.

    PubMed

    Zhou, Ronggang; Chan, Alan H S

    2017-01-01

    In order to take into account the inherent uncertainties during product usability evaluation, Zhou and Chan [1] proposed a comprehensive method of usability evaluation for products by combining the analytic hierarchy process (AHP) and fuzzy evaluation methods for synthesizing performance data and subjective response data. This method was designed to provide an integrated framework combining the inevitable vague judgments from the multiple stages of the product evaluation process. In order to illustrate the effectiveness of the model, this study used a summative usability test case to assess the application and strength of the general fuzzy usability framework. To test the proposed fuzzy usability evaluation framework [1], a standard summative usability test was conducted to benchmark the overall usability of a specific network management software package. Based on the test data, the fuzzy method was applied to incorporate both the usability scores and the uncertainties involved in the multiple components of the evaluation. Then, with Monte Carlo simulation procedures, confidence intervals were used to compare the reliabilities of the fuzzy approach and two typical conventional methods that combine metrics based on percentages. This case study showed that the fuzzy evaluation technique can be applied successfully to combine summative usability testing data into an overall usability quality measure for the network software evaluated. The greater differences in confidence interval widths between the equally weighted percentage averaging method and the weighted evaluation methods, including the method of weighted percentage averages, verified the strength of the fuzzy method.
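
    As a rough illustration of the weighted fuzzy aggregation being described (not the specific weights, criteria, or membership functions of the study), the sketch below combines AHP-style criterion weights with a membership-grade matrix and defuzzifies the result into a single usability score; all numbers are hypothetical.

      import numpy as np

      # Three usability criteria are weighted (AHP-style) and each criterion's
      # membership grades over the rating set {poor, fair, good, excellent} are
      # aggregated with the weighted-average operator B = W . R.
      weights = np.array([0.5, 0.3, 0.2])          # effectiveness, efficiency, satisfaction
      ratings = ["poor", "fair", "good", "excellent"]
      R = np.array([[0.05, 0.15, 0.50, 0.30],      # membership grades for effectiveness
                    [0.10, 0.30, 0.40, 0.20],      # efficiency
                    [0.00, 0.20, 0.45, 0.35]])     # satisfaction

      B = weights @ R                              # fuzzy evaluation vector
      B /= B.sum()                                 # normalise (rows already sum to 1 here)

      scores = np.array([25, 50, 75, 100])         # defuzzification scores per rating
      overall = float(B @ scores)

      print(dict(zip(ratings, np.round(B, 3))))
      print(f"overall usability score: {overall:.1f} / 100")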

  14. Implementing and testing theoretical fission fragment yields in a Hauser-Feshbach statistical decay framework

    NASA Astrophysics Data System (ADS)

    Jaffke, Patrick; Möller, Peter; Stetcu, Ionel; Talou, Patrick; Schmitt, Christelle

    2018-03-01

    We implement fission fragment yields, calculated using Brownian shape-motion on a macroscopic-microscopic potential energy surface in six dimensions, into the Hauser-Feshbach statistical decay code CGMF. This combination allows us to test the impact of utilizing theoretically calculated fission fragment yields on the subsequent prompt neutron and γ-ray emission. We draw connections between the fragment yields and the total kinetic energy TKE of the fission fragments and demonstrate that the use of calculated yields can introduce a difference in the 〈TKE〉 and, thus, the prompt neutron multiplicity ν, as compared with experimental fragment yields. We deduce the uncertainty on the 〈TKE〉 and ν from this procedure and identify possible applications.

  15. Micropropagation, antinociceptive and antioxidant activities of extracts of Verbena litoralis Kunth (Verbenaceae).

    PubMed

    Braga, Virgínia F; Mendes, Giselle C; Oliveira, Raphael T R; Soares, Carla Q G; Resende, Cristiano F; Pinto, Leandro C; Santana, Reinaldo de; Viccini, Lyderson F; Raposo, Nádia R B; Peixoto, Paulo H P

    2012-03-01

    This work describes an efficient micropropagation protocol for Verbena litoralis and a study of the antinociceptive and antioxidant activities of extracts of this species. For the in vitro establishment, surface-sterilization procedures and PVPP were highly efficient in controlling fungal-bacterial contamination and phenol oxidation. Cultivation of nodal segments in MS medium supplemented with 6-benzyladenine (7.5 µM)/α-naphthaleneacetic acid (NAA; 0.005 µM) induced multiple shoots. Elongated shoots were rooted with IAA (0.2 µM). Acclimatization rates were high and the plants showed the typical features of this species. The hexanic fraction (HF) of powdered leaves presented radical scavenging activity with IC(50) = 169.3 µg mL(-1). HF showed non-dose-dependent analgesic activity in the writhing test; its antinociceptive activity in the hot plate test was restricted to 500 mg kg(-1), the highest dose tested. The results of this study show the potential of tissue culture for conservation and large-scale multiplication and confirm the traditional folk-medicine use of V. litoralis.

  16. The ‘Pokemon’ (ZBTB7) Gene: No Evidence of Association with Sporadic Breast Cancer

    PubMed Central

    Salas, Antonio; Vega, Ana; Milne, Roger L.; García-Magariños, Manuel; Ruibal, Álvaro; Benítez, Javier; Carracedo, Ángel

    2008-01-01

    It has been proposed that the excess familial risk associated with breast cancer could be explained by the cumulative effect of multiple weakly predisposing alleles. The transcriptional repressor FBI1, also known as Pokemon, has recently been identified as a critical factor in oncogenesis. This protein is encoded by the ZBTB7 gene. Here we aimed to determine whether polymorphisms in ZBTB7 are associated with breast cancer risk in a sample of cases and controls collected in hospitals from patients in northern and central Spain. We genotyped 15 SNPs in ZBTB7, including the flanking regions, with an average coverage of 1 SNP/2.4 kb, in 360 sporadic breast cancer cases and 402 controls. Comparison of allele, genotype and haplotype frequencies between cases and controls did not reveal associations using Pearson's chi-square test and a permutation procedure to correct for multiple testing. In this, the first study of the ZBTB7 gene in relation to sporadic breast cancer, we found no evidence of an association. PMID:21892298
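
    A minimal sketch of a permutation-based correction for testing several SNPs at once, in the spirit of the procedure mentioned above: case-control labels are shuffled, the smallest chi-square p-value is recorded for each shuffle, and the observed minimum is compared against that null distribution. The genotype data, allele frequency, and number of permutations here are simulated assumptions, not the ZBTB7 data.

      import numpy as np
      from scipy.stats import chi2_contingency

      rng = np.random.default_rng(11)

      # Simulated case-control genotypes for 15 SNPs (0/1/2 copies of the minor allele)
      # under the null of no association -- stand-ins for the genotyped markers.
      n_cases, n_controls, n_snps = 360, 402, 15
      genotypes = rng.binomial(2, 0.3, size=(n_cases + n_controls, n_snps))
      status = np.array([1] * n_cases + [0] * n_controls)

      def min_pvalue(status, genotypes):
          """Smallest chi-square p-value over all SNPs for the given case/control labels."""
          pvals = []
          for s in range(genotypes.shape[1]):
              table = np.array([[np.sum((status == g) & (genotypes[:, s] == c))
                                 for c in (0, 1, 2)] for g in (0, 1)])
              table = table[:, table.sum(axis=0) > 0]      # drop empty genotype columns
              pvals.append(chi2_contingency(table)[1])
          return min(pvals)

      observed = min_pvalue(status, genotypes)

      # Min-p (max-statistic) permutation correction across the 15 SNPs.
      n_perm = 500
      null_min_p = np.array([min_pvalue(rng.permutation(status), genotypes)
                             for _ in range(n_perm)])
      adjusted = (1 + np.sum(null_min_p <= observed)) / (1 + n_perm)
      print(f"smallest raw p = {observed:.4f}, permutation-adjusted p = {adjusted:.3f}")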

  17. Stereo vision tracking of multiple objects in complex indoor environments.

    PubMed

    Marrón-Romera, Marta; García, Juan C; Sotelo, Miguel A; Pizarro, Daniel; Mazo, Manuel; Cañas, José M; Losada, Cristina; Marcos, Alvaro

    2010-01-01

    This paper presents a novel system capable of solving the problem of tracking multiple targets in a crowded, complex and dynamic indoor environment, like those typical of mobile robot applications. The proposed solution is based on a stereo vision set-up in the acquisition step and a probabilistic algorithm in the obstacle position estimation process. The system obtains 3D position and speed information for each object in the robot's environment; it then classifies building elements (ceiling, walls, columns and so on) separately from the remaining items in the robot's surroundings. All objects in the robot's surroundings, both dynamic and static, are considered obstacles, except for the structure of the environment itself. A combination of a Bayesian algorithm and a deterministic clustering process is used in order to obtain a multimodal representation of the speed and position of detected obstacles. The performance of the final system has been tested against state-of-the-art proposals; the test results validate the authors' proposal. The designed algorithms and procedures provide a solution for applications where similar multimodal data structures are found.

  18. Significant Linkage for Tourette Syndrome in a Large French Canadian Family

    PubMed Central

    Mérette, Chantal; Brassard, Andrée; Potvin, Anne; Bouvier, Hélène; Rousseau, François; Émond, Claudia; Bissonnette, Luc; Roy, Marc-André; Maziade, Michel; Ott, Jurg; Caron, Chantal

    2000-01-01

    Family and twin studies provide strong evidence that genetic factors are involved in the transmission of Gilles de la Tourette syndrome (TS) and related psychiatric disorders. To detect the underlying susceptibility gene(s) for TS, we performed linkage analysis in one large French Canadian family (127 members) from the Charlevoix region, in which 20 family members were definitely affected by TS and 20 others showed related tic disorders. Using model-based linkage analysis, we observed a LOD score of 3.24 on chromosome 11 (11q23). This result was obtained in a multipoint approach involving marker D11S1377, the marker for which significant linkage disequilibrium with TS has recently been detected in an Afrikaner population. Altogether, 25 markers were studied and, to set the level of significance, we derived a criterion that took into account the multiple testing arising from the use of three phenotype definitions and three modes of inheritance, a procedure that yielded a LOD score criterion of 3.18. Hence, even after adjustment for multiple testing, the present study shows statistically significant evidence for genetic linkage with TS. PMID:10986045
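
    For orientation, the LOD score compares the likelihood of the data at a recombination fraction theta with the likelihood under free recombination (theta = 0.5). The sketch below is a basic two-point, phase-known calculation with hypothetical recombinant counts; it is not the multipoint, model-based analysis used in the study.

      import numpy as np

      def lod_score(recombinants: int, nonrecombinants: int, theta: float) -> float:
          """Two-point LOD score for phase-known meioses at recombination fraction theta."""
          n = recombinants + nonrecombinants
          log_l_theta = recombinants * np.log10(theta) + nonrecombinants * np.log10(1 - theta)
          log_l_null = n * np.log10(0.5)
          return log_l_theta - log_l_null

      # Hypothetical counts: 2 recombinants out of 20 informative meioses.
      thetas = np.linspace(0.01, 0.4, 40)
      scores = [lod_score(2, 18, t) for t in thetas]
      best = int(np.argmax(scores))
      print(f"max LOD = {scores[best]:.2f} at theta = {thetas[best]:.2f}")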

  19. 40 CFR 92.506 - Test procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Test procedures. 92.506 Section 92.506... and Audit Programs § 92.506 Test procedures. (a)(1) For locomotives and locomotive engines subject to the provisions of this subpart, the prescribed test procedures are those procedures described in...

  20. 3D Printing Provides a Precise Approach in the Treatment of Tetralogy of Fallot, Pulmonary Atresia with Major Aortopulmonary Collateral Arteries.

    PubMed

    Anwar, Shafkat; Rockefeller, Toby; Raptis, Demetrios A; Woodard, Pamela K; Eghtesady, Pirooz

    2018-02-03

    Patients with tetralogy of Fallot, pulmonary atresia, and multiple aortopulmonary collateral arteries (Tet PA MAPCAs) have a wide spectrum of anatomy and disease severity. Management of these patients can be challenging and often requires multiple high-risk surgical and interventional catheterization procedures. These interventions are made challenging by complex anatomy that requires the proceduralist to mentally reconstruct three-dimensional anatomic relationships from two-dimensional images. Three-dimensional (3D) printing is an emerging medical technology that provides added benefits in the management of patients with Tet PA MAPCAs. When used in combination with current diagnostic modalities and procedures, 3D printing provides a precise approach to the management of these challenging, high-risk patients. Specifically, 3D printing enables detailed surgical and interventional planning prior to the procedure, which may improve procedural outcomes, decrease complications, and reduce procedure-related radiation dose and contrast load.
