Sample records for multiple testing method

  1. The Testing Methods and Gender Differences in Multiple-Choice Assessment

    NASA Astrophysics Data System (ADS)

    Ng, Annie W. Y.; Chan, Alan H. S.

    2009-10-01

This paper provides a comprehensive review of multiple-choice assessment over the past two decades, aimed at helping practitioners conduct effective testing in various subject areas. The review shows that a variety of multiple-choice test methods, viz. conventional multiple-choice, liberal multiple-choice, elimination testing, confidence marking, probability testing, and the order-of-preference scheme, are available for assessing subjects' knowledge and decision ability. However, the best multiple-choice test method has not yet been identified. The review also indicates that reported gender differences in multiple-choice task performance may depend on the test area, instruction/scoring condition, and item difficulty.

  2. Multiple testing and power calculations in genetic association studies.

    PubMed

    So, Hon-Cheong; Sham, Pak C

    2011-01-01

Modern genetic association studies typically involve multiple single-nucleotide polymorphisms (SNPs) and/or multiple genes. With the development of high-throughput genotyping technologies and the reduction in genotyping cost, investigators can now assay up to a million SNPs for direct or indirect association with disease phenotypes. In addition, some studies involve multiple disease or related phenotypes and use multiple methods of statistical analysis. The combination of multiple genetic loci, multiple phenotypes, and multiple methods of evaluating associations between genotype and phenotype means that modern genetic studies often involve the testing of an enormous number of hypotheses. When multiple hypothesis tests are performed in a study, there is a risk of inflation of the type I error rate (i.e., the chance of falsely claiming an association when there is none). Several methods for multiple-testing correction are in popular use, and they all have strengths and weaknesses. Because no single method is universally adopted or always appropriate, it is important to understand the principles, strengths, and weaknesses of the methods so that they can be applied appropriately in practice. In this article, we review the three principal methods for multiple-testing correction and provide guidance for calculating statistical power.
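
As a minimal illustration of the kind of corrections this abstract surveys (my own sketch, not the authors' code), Bonferroni and Benjamini-Hochberg adjustment of a list of p-values can be written in a few lines of Python:

```python
def bonferroni(pvals):
    """Bonferroni adjustment: multiply each p-value by the number of tests."""
    m = len(pvals)
    return [min(1.0, p * m) for p in pvals]

def benjamini_hochberg(pvals):
    """Benjamini-Hochberg step-up adjustment controlling the FDR."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adj = [0.0] * m
    prev = 1.0
    # Walk from the largest p-value down, enforcing monotonicity of the
    # adjusted values.
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        val = min(prev, pvals[i] * m / rank)
        adj[i] = val
        prev = val
    return adj

pvals = [0.001, 0.01, 0.02, 0.04, 0.2]
print(bonferroni(pvals))          # ~[0.005, 0.05, 0.1, 0.2, 1.0] up to float rounding
print(benjamini_hochberg(pvals))  # ~[0.005, 0.025, 0.033, 0.05, 0.2]
```

Bonferroni controls the familywise error rate and is the most conservative of the two; Benjamini-Hochberg controls the false discovery rate and typically retains more power.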

  3. Common pitfalls in statistical analysis: The perils of multiple testing

    PubMed Central

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2016-01-01

    Multiple testing refers to situations where a dataset is subjected to statistical testing multiple times - either at multiple time-points or through multiple subgroups or for multiple end-points. This amplifies the probability of a false-positive finding. In this article, we look at the consequences of multiple testing and explore various methods to deal with this issue. PMID:27141478
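
The inflation described here is easy to quantify: with m independent tests each run at level alpha, the probability of at least one false positive is 1 - (1 - alpha)^m. A quick illustration (my own sketch, not from the article):

```python
alpha = 0.05
for m in (1, 5, 10, 20, 100):
    # Probability that at least one of m independent tests at level alpha
    # rejects a true null hypothesis.
    fwer = 1 - (1 - alpha) ** m
    print(f"{m:3d} tests -> P(at least one false positive) = {fwer:.3f}")
```

Even 20 tests at the conventional 0.05 level give roughly a 64% chance of at least one false-positive finding, which is why the corrections discussed in this collection matter.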

  4. Overview of multi-input frequency domain modal testing methods with an emphasis on sine testing

    NASA Technical Reports Server (NTRS)

    Rost, Robert W.; Brown, David L.

    1988-01-01

An overview of the current state of the art in multiple-input, multiple-output modal testing technology is presented. A very brief review of current time domain methods is given, followed by a detailed review of frequency and spatial domain methods with an emphasis on sine testing.

  5. Comparison of paragraph comprehension test scores with reading versus listening-reading and multiple-choice versus nominal recall administration techniques: justification for the bypass approach.

    PubMed

    Weinberg, W A; McLean, A; Snider, R L; Rintelmann, J W; Brumback, R A

    1989-12-01

    Eight groups of learning disabled children (N = 100), categorized by the clinical Lexical Paradigm as good readers or poor readers, were individually administered the Gilmore Oral Reading Test, Form D, by one of four input/retrieval methods: (1) the standardized method of administration in which the child reads each paragraph aloud and then answers five questions relating to the paragraph [read/recall method]; (2) the child reads each paragraph aloud and then for each question selects the correct answer from among three choices read by the examiner [read/choice method]; (3) the examiner reads each paragraph aloud and reads each of the five questions to the child to answer [listen/recall method]; and (4) the examiner reads each paragraph aloud and then for each question reads three multiple-choice answers from which the child selects the correct answer [listen/choice method]. The major difference in scores was between the groups tested by the recall versus the orally read multiple-choice methods. This study indicated that poor readers who listened to the material and were tested by orally read multiple-choice format could perform as well as good readers. The performance of good readers was not affected by listening or by the method of testing. The multiple-choice testing improved the performance of poor readers independent of the input method. This supports the arguments made previously that a "bypass approach" to education of poor readers in which testing is accomplished using an orally read multiple-choice format can enhance the child's school performance on reading-related tasks. Using a listening while reading input method may further enhance performance.

  6. Association analysis of multiple traits by an approach of combining P values.

    PubMed

    Chen, Lili; Wang, Yong; Zhou, Yajing

    2018-03-01

Increasing evidence shows that one variant can affect multiple traits, a widespread phenomenon in complex diseases. Joint analysis of multiple traits can increase the statistical power of association analysis and uncover the underlying genetic mechanism. Although there are many statistical methods to analyse multiple traits, most are suitable mainly for detecting common variants associated with multiple traits. However, because of the low minor allele frequencies of rare variants, these methods are not optimal for rare variant association analysis. In this paper, we extend an adaptive combination of P values method (termed ADA) for a single trait to test association between multiple traits and rare variants in a given region. For a given region, we use a reverse regression model to test each rare variant for association with multiple traits and obtain the P value of the single-variant test. We then take the weighted combination of these P values as the test statistic. Extensive simulation studies show that our approach is more powerful than several comparison methods in most cases and is robust to the inclusion of a high proportion of neutral variants and to differing directions of effects of causal variants.
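
A simplified sketch of the general idea of weighted p-value combination across variants (the truncation threshold and weights below are illustrative, not the paper's exact ADA choices; in practice significance of the statistic is assessed by permutation):

```python
import math

def combined_stat(pvals, weights=None, truncation=0.1):
    """Weighted combination of per-variant p-values: sum of w_i * (-ln p_i)
    over variants whose p-value falls below a truncation threshold, so that
    clearly neutral variants contribute nothing."""
    if weights is None:
        weights = [1.0] * len(pvals)
    return sum(w * -math.log(p)
               for p, w in zip(pvals, weights) if p <= truncation)

# Larger statistic = stronger combined evidence across the region.
print(combined_stat([0.002, 0.4, 0.03, 0.9]))  # only 0.002 and 0.03 contribute
```

Truncating at a threshold is what makes such statistics robust to a high proportion of neutral variants, as the abstract notes.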

  7. Methods for the Joint Meta-Analysis of Multiple Tests

    ERIC Educational Resources Information Center

    Trikalinos, Thomas A.; Hoaglin, David C.; Small, Kevin M.; Terrin, Norma; Schmid, Christopher H.

    2014-01-01

    Existing methods for meta-analysis of diagnostic test accuracy focus primarily on a single index test. We propose models for the joint meta-analysis of studies comparing multiple index tests on the same participants in paired designs. These models respect the grouping of data by studies, account for the within-study correlation between the tests'…

  8. Magic Finger Teaching Method in Learning Multiplication Facts among Deaf Students

    ERIC Educational Resources Information Center

    Thai, Liong; Yasin, Mohd. Hanafi Mohd

    2016-01-01

    Deaf students face problems in mastering multiplication facts. This study aims to identify the effectiveness of Magic Finger Teaching Method (MFTM) and students' perception towards MFTM. The research employs a quasi experimental with non-equivalent pre-test and post-test control group design. Pre-test, post-test and questionnaires were used. As…

  9. Non-destructive testing method and apparatus utilizing phase multiplication holography

    DOEpatents

    Collins, H. Dale; Prince, James M.; Davis, Thomas J.

    1984-01-01

An apparatus and method for imaging structural characteristics in test objects using radiation amenable to coherent signal processing methods. Frequency and phase multiplication of received flaw signals is used to simulate a test wavelength at least one to two orders of magnitude smaller than the actual wavelength. The apparent reduction in wavelength between the illumination and recording radiation produces a frequency-translation hologram. The hologram constructed with a high synthetic frequency and flaw phase multiplication is similar to a conventional acoustic hologram constructed at the high frequency.

  10. Rapid and Accurate Multiple Testing Correction and Power Estimation for Millions of Correlated Markers

    PubMed Central

    Han, Buhm; Kang, Hyun Min; Eskin, Eleazar

    2009-01-01

    With the development of high-throughput sequencing and genotyping technologies, the number of markers collected in genetic association studies is growing rapidly, increasing the importance of methods for correcting for multiple hypothesis testing. The permutation test is widely considered the gold standard for accurate multiple testing correction, but it is often computationally impractical for these large datasets. Recently, several studies proposed efficient alternative approaches to the permutation test based on the multivariate normal distribution (MVN). However, they cannot accurately correct for multiple testing in genome-wide association studies for two reasons. First, these methods require partitioning of the genome into many disjoint blocks and ignore all correlations between markers from different blocks. Second, the true null distribution of the test statistic often fails to follow the asymptotic distribution at the tails of the distribution. We propose an accurate and efficient method for multiple testing correction in genome-wide association studies—SLIDE. Our method accounts for all correlation within a sliding window and corrects for the departure of the true null distribution of the statistic from the asymptotic distribution. In simulations using the Wellcome Trust Case Control Consortium data, the error rate of SLIDE's corrected p-values is more than 20 times smaller than the error rate of the previous MVN-based methods' corrected p-values, while SLIDE is orders of magnitude faster than the permutation test and other competing methods. We also extend the MVN framework to the problem of estimating the statistical power of an association study with correlated markers and propose an efficient and accurate power estimation method SLIP. SLIP and SLIDE are available at http://slide.cs.ucla.edu. PMID:19381255

  11. Multiple Testing of Gene Sets from Gene Ontology: Possibilities and Pitfalls.

    PubMed

    Meijer, Rosa J; Goeman, Jelle J

    2016-09-01

    The use of multiple testing procedures in the context of gene-set testing is an important but relatively underexposed topic. If a multiple testing method is used, this is usually a standard familywise error rate (FWER) or false discovery rate (FDR) controlling procedure in which the logical relationships that exist between the different (self-contained) hypotheses are not taken into account. Taking those relationships into account, however, can lead to more powerful variants of existing multiple testing procedures and can make summarizing and interpreting the final results easier. We will show that, from the perspective of interpretation as well as from the perspective of power improvement, FWER controlling methods are more suitable than FDR controlling methods. As an example of a possible power improvement, we suggest a modified version of the popular method by Holm, which we also implemented in the R package cherry. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
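
The paper builds on Holm's procedure; for reference, the standard (unmodified) Holm step-down adjustment can be sketched as follows (generic Holm, not the cherry implementation):

```python
def holm(pvals):
    """Holm's step-down procedure: adjusted p-values controlling the FWER."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adj = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):          # rank 0 = smallest p-value
        val = min(1.0, pvals[i] * (m - rank))
        running_max = max(running_max, val)   # enforce monotonicity
        adj[i] = running_max
    return adj

print(holm([0.01, 0.04, 0.03, 0.005]))  # ~[0.03, 0.06, 0.06, 0.02]
```

Holm is uniformly more powerful than plain Bonferroni while still controlling the familywise error rate, which is why it is a natural starting point for the logical-relationship-aware variants the paper proposes.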

  12. Non-parametric combination and related permutation tests for neuroimaging.

    PubMed

    Winkler, Anderson M; Webster, Matthew A; Brooks, Jonathan C; Tracey, Irene; Smith, Stephen M; Nichols, Thomas E

    2016-04-01

    In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well-known definition of union-intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume-based representations of the brain, including non-imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non-parametric combination (NPC) methodology, such that instead of a two-phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one-way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
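
Tippett's combining method mentioned here takes the minimum of the partial-test p-values as the combined statistic, with significance judged against the same statistic computed over synchronized permutations. A toy sketch (illustrative data, not the NPC implementation):

```python
def tippett_combine(perm_pvals):
    """Non-parametric combination with Tippett's method.
    perm_pvals: one row per permutation; row 0 holds the observed
    partial-test p-values, later rows hold p-values from synchronized
    permutations. The combined statistic is the minimum p-value in a row."""
    observed = min(perm_pvals[0])
    stats = [min(row) for row in perm_pvals]
    # Combined p-value: fraction of permutations at least as extreme as
    # the observed row (the observed row is included, as is conventional).
    return sum(s <= observed for s in stats) / len(stats)

# Toy example: 1 observed row + 4 permutation rows over 2 partial tests.
rows = [[0.01, 0.20],
        [0.30, 0.50],
        [0.05, 0.40],
        [0.60, 0.10],
        [0.25, 0.90]]
print(tippett_combine(rows))  # 0.2: only the observed row reaches min <= 0.01
```

Because the same permutations are applied to every partial test, the dependence between modalities is preserved, which is what allows this construction to double as a multiplicity correction.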

  13. An Illustration to Assist in Comparing and Remembering Several Multiplicity Adjustment Methods

    ERIC Educational Resources Information Center

    Hasler, Mario

    2017-01-01

    There are many well-known or new methods to adjust statistical tests for multiplicity. This article provides an illustration helping lecturers or consultants to remember the differences of three important multiplicity adjustment methods and to explain them to non-statisticians.

  14. A Residual Mass Ballistic Testing Method to Compare Armor Materials or Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benjamin Langhorst; Thomas M Lillo; Henry S Chu

    2014-05-01

A statistics based ballistic test method is presented for use when comparing multiple groups of test articles of unknown relative ballistic perforation resistance. The method is intended to be more efficient than many traditional methods for research and development testing. To establish the validity of the method, it is employed in this study to compare test groups of known relative ballistic performance. Multiple groups of test articles were perforated using consistent projectiles and impact conditions. Test groups were made of rolled homogeneous armor (RHA) plates and differed in thickness. After perforation, each residual projectile was captured behind the target and its mass was measured. The residual masses measured for each test group were analyzed to provide ballistic performance rankings with associated confidence levels. When compared to traditional V50 methods, the residual mass (RM) method was found to require fewer test events and be more tolerant of variations in impact conditions.
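
The statistical comparison of residual masses between test groups can be sketched with a simple two-sample permutation test (the residual-mass values below are hypothetical, not the authors' data or analysis):

```python
import random

def perm_test_mean_diff(group_a, group_b, n_perm=10000, seed=0):
    """Two-sided permutation test for a difference in mean residual mass."""
    rng = random.Random(seed)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    observed = abs(sum(group_a) / n_a - sum(group_b) / len(group_b))
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:n_a], pooled[n_a:]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            hits += 1
    return hits / n_perm

# Hypothetical residual masses (g) behind a thinner vs. a thicker RHA plate.
thin = [9.1, 8.7, 9.4, 8.9, 9.0]
thick = [6.2, 6.8, 5.9, 6.5, 6.1]
print(perm_test_mean_diff(thin, thick))  # small p-value: groups clearly differ
```

A ranking with confidence levels, as in the paper, would repeat such pairwise comparisons across all test groups.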

  15. Restricted random search method based on taboo search in the multiple minima problem

    NASA Astrophysics Data System (ADS)

    Hong, Seung Do; Jhon, Mu Shik

    1997-03-01

The restricted random search method is proposed as a simple Monte Carlo sampling method for rapidly locating minima in the multiple minima problem. The method is based on taboo search, which has recently been applied to continuous test functions. The concept of a taboo region, rather than a taboo list, is used, so sampling of the region near an old configuration is restricted. The method is applied to 2-dimensional test functions and to argon clusters, and is found to be a practical and efficient way to locate near-global configurations of both.
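
A crude sketch of the idea, assuming taboo regions are balls around previously sampled configurations that new candidates must avoid (the taboo radius, iteration count, and test function below are illustrative, not taken from the paper):

```python
import random

def restricted_random_search(f, bounds, n_iter=2000, taboo_radius=0.3, seed=1):
    """Random search that rejects candidates falling inside taboo regions
    (balls of the given radius around previously accepted points), forcing
    exploration of new regions of the search space."""
    rng = random.Random(seed)
    visited = []
    best_x, best_val = None, float("inf")
    for _ in range(n_iter):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        # Skip candidates too close to an already-sampled configuration.
        if any(sum((a - b) ** 2 for a, b in zip(x, v)) < taboo_radius ** 2
               for v in visited):
            continue
        visited.append(x)
        val = f(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Toy multiple-minima-style test function (global minimum 0 at the origin).
f = lambda x: x[0] ** 2 + x[1] ** 2 + 0.3 * abs(x[0] * x[1])
x, val = restricted_random_search(f, [(-5, 5), (-5, 5)])
print(x, val)
```

The taboo restriction spreads the accepted samples out, which is what lets plain random sampling reach near-global configurations without getting stuck resampling one basin.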

  17. Testing Group Mean Differences of Latent Variables in Multilevel Data Using Multiple-Group Multilevel CFA and Multilevel MIMIC Modeling.

    PubMed

    Kim, Eun Sook; Cao, Chunhua

    2015-01-01

Considering that group comparisons are common in social science, we examined two latent group mean testing methods when groups of interest were either at the between or within level of multilevel data: multiple-group multilevel confirmatory factor analysis (MG ML CFA) and multilevel multiple-indicators multiple-causes modeling (ML MIMIC). The performance of these methods was investigated through three Monte Carlo studies. In Studies 1 and 2, either factor variances or residual variances were manipulated to be heterogeneous between groups. In Study 3, which focused on within-level multiple-group analysis, six different model specifications were considered depending on how to model the intra-class group correlation (i.e., correlation between random effect factors for groups within cluster). The results of simulations generally supported the adequacy of MG ML CFA and ML MIMIC for multiple-group analysis with multilevel data. The two methods did not show any notable difference in the latent group mean testing across the three studies. Finally, a demonstration with real data and guidelines for selecting an appropriate approach to multilevel multiple-group analysis are provided.

  18. Network meta-analysis of diagnostic test accuracy studies identifies and ranks the optimal diagnostic tests and thresholds for health care policy and decision-making.

    PubMed

    Owen, Rhiannon K; Cooper, Nicola J; Quinn, Terence J; Lees, Rosalind; Sutton, Alex J

    2018-07-01

Network meta-analyses (NMA) have been used extensively to compare the effectiveness of multiple interventions for health care policy and decision-making. However, methods for evaluating the performance of multiple diagnostic tests are less established. In a decision-making context, we are often interested in comparing and ranking the performance of multiple diagnostic tests, at varying levels of test thresholds, in one simultaneous analysis. Motivated by an example of cognitive impairment diagnosis following stroke, we synthesized data from 13 studies assessing the efficiency of two diagnostic tests: Mini-Mental State Examination (MMSE) and Montreal Cognitive Assessment (MoCA), at two test thresholds each: MMSE <25/30 and <27/30, and MoCA <22/30 and <26/30. Using Markov chain Monte Carlo (MCMC) methods, we fitted a bivariate network meta-analysis model incorporating constraints on increasing test threshold, and accounting for the correlations between multiple test accuracy measures from the same study. We developed and successfully fitted a model comparing multiple test/threshold combinations while imposing threshold constraints. Using this model, we found that MoCA at threshold <26/30 appeared to have the best true positive rate, whereas MMSE at threshold <25/30 appeared to have the best true negative rate. The combined analysis of multiple tests at multiple thresholds allowed for more rigorous comparisons between competing diagnostic tests for decision making. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  19. Meta‐analysis of test accuracy studies using imputation for partial reporting of multiple thresholds

    PubMed Central

    Deeks, J.J.; Martin, E.C.; Riley, R.D.

    2017-01-01

Introduction: For tests reporting continuous results, primary studies usually provide test performance at multiple but often different thresholds. This creates missing data when performing a meta-analysis at each threshold. A standard meta-analysis (no imputation [NI]) ignores such missing data. A single imputation (SI) approach was recently proposed to recover missing threshold results. Here, we propose a new method that performs multiple imputation of the missing threshold results using discrete combinations (MIDC). Methods: The new MIDC method imputes missing threshold results by randomly selecting from the set of all possible discrete combinations which lie between the results for 2 known bounding thresholds. Imputed and observed results are then synthesised at each threshold. This is repeated multiple times, and the multiple pooled results at each threshold are combined using Rubin's rules to give final estimates. We compared the NI, SI, and MIDC approaches via simulation. Results: Both imputation methods outperform the NI method in simulations. There was generally little difference between the SI and MIDC methods, but the latter was noticeably better at estimating the between-study variances and generally gave better coverage, due to slightly larger standard errors of pooled estimates. Given selective reporting of thresholds, the imputation methods also reduced bias in the summary receiver operating characteristic curve. Simulations demonstrate that the imputation methods rely on an equal threshold spacing assumption. A real example is presented. Conclusions: The SI and, in particular, MIDC methods can be used to examine the impact of missing threshold results in meta-analysis of test accuracy studies. PMID:29052347

  20. Differential Item Functioning Detection Using the Multiple Indicators, Multiple Causes Method with a Pure Short Anchor

    ERIC Educational Resources Information Center

    Shih, Ching-Lin; Wang, Wen-Chung

    2009-01-01

    The multiple indicators, multiple causes (MIMIC) method with a pure short anchor was proposed to detect differential item functioning (DIF). A simulation study showed that the MIMIC method with an anchor of 1, 2, 4, or 10 DIF-free items yielded a well-controlled Type I error rate even when such tests contained as many as 40% DIF items. In general,…

  1. Multiple Phenotype Association Tests Using Summary Statistics in Genome-Wide Association Studies

    PubMed Central

    Liu, Zhonghua; Lin, Xihong

    2017-01-01

We study in this paper jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. PMID:28653391

  3. An Adaptive Association Test for Multiple Phenotypes with GWAS Summary Statistics.

    PubMed

    Kim, Junghi; Bai, Yun; Pan, Wei

    2015-12-01

    We study the problem of testing for single marker-multiple phenotype associations based on genome-wide association study (GWAS) summary statistics without access to individual-level genotype and phenotype data. For most published GWASs, because obtaining summary data is substantially easier than accessing individual-level phenotype and genotype data, while often multiple correlated traits have been collected, the problem studied here has become increasingly important. We propose a powerful adaptive test and compare its performance with some existing tests. We illustrate its applications to analyses of a meta-analyzed GWAS dataset with three blood lipid traits and another with sex-stratified anthropometric traits, and further demonstrate its potential power gain over some existing methods through realistic simulation studies. We start from the situation with only one set of (possibly meta-analyzed) genome-wide summary statistics, then extend the method to meta-analysis of multiple sets of genome-wide summary statistics, each from one GWAS. We expect the proposed test to be useful in practice as more powerful than or complementary to existing methods. © 2015 WILEY PERIODICALS, INC.

  4. Testing for association with multiple traits in generalized estimation equations, with application to neuroimaging data.

    PubMed

    Zhang, Yiwei; Xu, Zhiyuan; Shen, Xiaotong; Pan, Wei

    2014-08-01

There is an increasing need to develop and apply powerful statistical tests to detect associations between multiple traits and a single locus, as arise in neuroimaging genetics and other studies. For example, in the Alzheimer's Disease Neuroimaging Initiative (ADNI), in addition to genome-wide single nucleotide polymorphisms (SNPs), thousands of neuroimaging and neuropsychological phenotypes have been collected as intermediate phenotypes for Alzheimer's disease. Although some classic methods like MANOVA and some newly proposed methods may be applied, they have their own limitations. For example, MANOVA cannot be applied to binary and other discrete traits. In addition, the relationships among these methods are not well understood. Importantly, since these tests are not data adaptive, depending on the unknown association patterns among multiple traits and between multiple traits and a locus, these tests may or may not be powerful. In this paper we propose a class of data-adaptive weights and the corresponding weighted tests in the general framework of generalized estimating equations (GEE). A highly adaptive test is proposed to select the most powerful one from this class of weighted tests so that it can maintain high power across a wide range of situations. Our proposed tests are applicable to various types of traits with or without covariates. Importantly, we also analytically show relationships among some existing and our proposed tests, indicating that many existing tests are special cases of our proposed tests. Extensive simulation studies were conducted to compare and contrast the power properties of various existing and new methods. Finally, we applied the methods to an ADNI dataset to illustrate the performance of the methods. We conclude with the recommendation of the GEE-based score test and our proposed adaptive test for their high and complementary performance. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Normalized Rotational Multiple Yield Surface Framework (NRMYSF) stress-strain curve prediction method based on small strain triaxial test data on undisturbed Auckland residual clay soils

    NASA Astrophysics Data System (ADS)

    Noor, M. J. Md; Ibrahim, A.; Rahman, A. S. A.

    2018-04-01

Small strain triaxial test measurement is considered significantly more accurate than external strain measurement using the conventional method, owing to the systematic errors normally associated with that test. Three submersible miniature linear variable differential transducers (LVDTs) were mounted on yokes clamped directly onto the soil sample, spaced equally at 120° from one another. The device setup, using a 0.4 N resolution load cell and a 16 bit AD converter, was capable of consistently resolving displacements of less than 1 µm and measuring axial strains ranging from less than 0.001% to 2.5%. Further analysis of the small strain local measurement data was performed using the new Normalized Rotational Multiple Yield Surface Framework (NRMYSF) method and compared with the existing Rotational Multiple Yield Surface Framework (RMYSF) prediction method. The prediction of shear strength based on a combined intrinsic curvilinear shear strength envelope using small strain triaxial test data confirmed the significant improvement and reliability of the measurement and analysis methods. Moreover, the NRMYSF method shows excellent data prediction and a significant improvement toward more reliable prediction of soil strength, which can reduce the cost and time of laboratory testing.

  6. Combined adaptive multiple subtraction based on optimized event tracing and extended wiener filtering

    NASA Astrophysics Data System (ADS)

    Tan, Jun; Song, Peng; Li, Jinshan; Wang, Lei; Zhong, Mengxuan; Zhang, Xiaobo

    2017-06-01

    The surface-related multiple elimination (SRME) method is based on the feedback formulation and has become one of the most widely used multiple suppression methods. However, there are differences between the predicted multiples and those in the source seismic records, so conventional adaptive multiple subtraction methods may be barely able to suppress multiples effectively in actual production. This paper introduces a combined adaptive multiple attenuation method based on an optimized event tracing technique and extended Wiener filtering. The method first uses multiple records predicted by SRME to generate a multiple velocity spectrum, then separates the original record into an approximate primary record and an approximate multiple record by applying the optimized event tracing method and short-time-window FK filtering. After applying the extended Wiener filtering method, residual multiples in the approximate primary record can be eliminated and the damaged primary can be restored from the approximate multiple record. This method combines the advantages of multiple elimination based on optimized event tracing with those of the extended Wiener filtering technique. It is well suited to suppressing typical hyperbolic and other types of multiples, with the advantage of minimizing damage to the primary. Synthetic and field data tests show that this method produces better multiple elimination results than the traditional multi-channel Wiener filter method and is more suitable for multiple elimination in complicated geological areas.

  7. Multiple Testing with Modified Bonferroni Methods.

    ERIC Educational Resources Information Center

    Li, Jianmin; And Others

    This paper discusses the issue of multiple testing and overall Type I error rates in contexts other than multiple comparisons of means. It demonstrates, using a 5 x 5 correlation matrix, the application of five modified Bonferroni procedures recently developed by the following authors: (1) Y. Hochberg (1988); (2) B. S. Holland and M. D.…

  8. Effects of Multiple Intelligences Activities on Writing Skill Development in an EFL Context

    ERIC Educational Resources Information Center

    Gündüz, Zennure Elgün; Ünal, Ismail Dogan

    2016-01-01

    This study aims at exploring the effects of multiple intelligences activities versus traditional method on English writing development of the sixth grade students in Turkey. A quasi-experimental research method with a pre-test post-test design was applied. The participants were 50 sixth grade students at a state school in Ardahan in Turkey. The…

  9. Utilizing the Zero-One Linear Programming Constraints to Draw Multiple Sets of Matched Samples from a Non-Treatment Population as Control Groups for the Quasi-Experimental Design

    ERIC Educational Resources Information Center

    Li, Yuan H.; Yang, Yu N.; Tompkins, Leroy J.; Modarresi, Shahpar

    2005-01-01

    The statistical technique, "Zero-One Linear Programming," that has successfully been used to create multiple tests with similar characteristics (e.g., item difficulties, test information and test specifications) in the area of educational measurement, was deemed to be a suitable method for creating multiple sets of matched samples to be…

  10. A spreadsheet template compatible with Microsoft Excel and iWork Numbers that returns the simultaneous confidence intervals for all pairwise differences between multiple sample means.

    PubMed

    Brown, Angus M

    2010-04-01

    The objective of the method described in this paper is to develop a spreadsheet template for the purpose of comparing multiple sample means. An initial analysis of variance (ANOVA) on the data returns the test statistic, F. If F is larger than the critical F value drawn from the F distribution at the appropriate degrees of freedom, convention dictates rejection of the null hypothesis and allows subsequent multiple comparison testing to determine where the inequalities between the sample means lie. A variety of multiple comparison methods are described that return the 95% confidence intervals for differences between means using an inclusive pairwise comparison of the sample means. 2009 Elsevier Ireland Ltd. All rights reserved.
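    The workflow this record describes (an omnibus ANOVA followed by simultaneous confidence intervals for all pairwise mean differences) can be sketched outside a spreadsheet as well. The following is a minimal Python illustration using SciPy; it uses Bonferroni-adjusted t intervals rather than whichever specific procedures the template implements, and the sample data are invented.

    ```python
    from itertools import combinations
    from math import sqrt

    from scipy import stats

    # Three invented samples (the template compares multiple sample means).
    groups = {
        "A": [10, 12, 11, 13, 12],
        "B": [10, 11, 12, 11, 12],
        "C": [18, 19, 20, 19, 21],
    }

    # Omnibus one-way ANOVA returns the test statistic F and its p-value.
    f_stat, p_value = stats.f_oneway(*groups.values())

    # Pooled error variance (MSE) and its degrees of freedom.
    n_total = sum(len(g) for g in groups.values())
    df_error = n_total - len(groups)
    sse = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups.values())
    mse = sse / df_error

    # Bonferroni-adjusted simultaneous 95% CIs for all pairwise differences.
    pairs = list(combinations(groups, 2))
    alpha = 0.05
    t_crit = stats.t.ppf(1 - alpha / (2 * len(pairs)), df_error)

    intervals = {}
    for a, b in pairs:
        ga, gb = groups[a], groups[b]
        diff = sum(ga) / len(ga) - sum(gb) / len(gb)
        se = sqrt(mse * (1 / len(ga) + 1 / len(gb)))
        intervals[(a, b)] = (diff - t_crit * se, diff + t_crit * se)

    # A pair whose interval excludes zero differs significantly.
    for pair, (lo, hi) in intervals.items():
        print(pair, "significant" if lo > 0 or hi < 0 else "not significant")
    ```

    With these data, the interval for (A, C) excludes zero while the interval for (A, B) contains it, mirroring the post hoc logic the abstract describes.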

  11. How Should Colleges Treat Multiple Admissions Test Scores? ACT Working Paper 2017-4

    ERIC Educational Resources Information Center

    Mattern, Krista; Radunzel, Justine; Bertling, Maria; Ho, Andrew

    2017-01-01

    The percentage of students retaking college admissions tests is rising (Harmston & Crouse, 2016). Researchers and college admissions offices currently use a variety of methods for summarizing these multiple scores. Testing companies, interested in validity evidence like correlations with college first-year grade-point averages (FYGPA), often…

  12. Multiple testing corrections in quantitative proteomics: A useful but blunt tool.

    PubMed

    Pascovici, Dana; Handler, David C L; Wu, Jemma X; Haynes, Paul A

    2016-09-01

    Multiple testing corrections are a useful tool for restricting the FDR, but can be blunt in the context of low power, as we demonstrate by a series of simple simulations. Unfortunately, low power is common in proteomics experiments, driven by proteomics-specific issues such as small effects due to ratio compression and few replicates due to high reagent costs, limited instrument time, and other constraints; in such situations, most multiple testing correction methods, if used with conventional thresholds, will fail to detect any true positives even when many exist. In this low-power, medium-scale situation, other methods such as effect size considerations or peptide-level calculations may be a more effective option, even if they do not offer the same theoretical guarantee of a low FDR. Thus, we aim to highlight in this article that proteomics presents some specific challenges to the standard multiple testing correction methods, which should be employed as a useful tool but not regarded as a required rubber stamp. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Salmonella testing of pooled pre-enrichment broth cultures for screening multiple food samples.

    PubMed

    Price, W R; Olsen, R A; Hunter, J E

    1972-04-01

    A method has been described for testing multiple food samples for Salmonella without loss in sensitivity. The method pools multiple pre-enrichment broth cultures into single enrichment broths. The subsequent stages of the Salmonella analysis are not altered. The method was found applicable to several dry food materials including nonfat dry milk, dried egg albumin, cocoa, cottonseed flour, wheat flour, and shredded coconut. As many as 25 pre-enrichment broth cultures were pooled without apparent loss in the sensitivity of Salmonella detection as compared to individual sample analysis. The procedure offers a simple, yet effective, way to increase sample capacity in the Salmonella testing of foods, particularly where a large proportion of samples ordinarily is negative. It also permits small portions of pre-enrichment broth cultures to be retained for subsequent individual analysis if positive tests are found. Salmonella testing of pooled pre-enrichment broths provides increased consumer protection for a given amount of analytical effort as compared to individual sample analysis.

  14. Examining Measurement Invariance and Differential Item Functioning with Discrete Latent Construct Indicators: A Note on a Multiple Testing Procedure

    ERIC Educational Resources Information Center

    Raykov, Tenko; Dimitrov, Dimiter M.; Marcoulides, George A.; Li, Tatyana; Menold, Natalja

    2018-01-01

    A latent variable modeling method for studying measurement invariance when evaluating latent constructs with multiple binary or binary scored items with no guessing is outlined. The approach extends the continuous indicator procedure described by Raykov and colleagues, utilizes similarly the false discovery rate approach to multiple testing, and…

  15. A new method to address verification bias in studies of clinical screening tests: cervical cancer screening assays as an example.

    PubMed

    Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D

    2014-03-01

    Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (e.g., sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy-to-apply and accurate method for addressing verification bias in studies of multiple screening methods. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Using multiple travel paths to estimate daily travel distance in arboreal, group-living primates.

    PubMed

    Steel, Ruth Irene

    2015-01-01

    Primate field studies often estimate daily travel distance (DTD) in order to estimate energy expenditure and/or test foraging hypotheses. In group-living species, the center of mass (CM) method is traditionally used to measure DTD; a point is marked at the group's perceived center of mass at a set time interval or upon each move, and the distance between consecutive points is measured and summed. However, for groups using multiple travel paths, the CM method potentially creates a central path that is shorter than the individual paths and/or traverses unused areas. These problems may compromise tests of foraging hypotheses, since distance and energy expenditure could be underestimated. To better understand the magnitude of these potential biases, I designed and tested the multiple travel paths (MTP) method, in which DTD was calculated by recording all travel paths taken by the group's members, weighting each path's distance based on its proportional use by the group, and summing the weighted distances. To compare the MTP and CM methods, DTD was calculated using both methods in three groups of Udzungwa red colobus monkeys (Procolobus gordonorum; group size 30-43) for a random sample of 30 days between May 2009 and March 2010. Compared to the CM method, the MTP method provided significantly longer estimates of DTD that were more representative of the actual distance traveled and the areas used by a group. The MTP method is more time-intensive and requires multiple observers compared to the CM method. However, it provides greater accuracy for testing ecological and foraging models.
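    The weighting scheme this record describes (each travel path's distance weighted by the proportion of the group that used it, then summed) reduces to a short computation. Below is a minimal Python sketch with invented path distances and group counts, not field data.

    ```python
    def mtp_distance(paths):
        """Daily travel distance under the multiple travel paths (MTP) method.

        paths: list of (distance_m, n_individuals) pairs, one per travel path.
        Each path's distance is weighted by the proportion of the group that
        used it, and the weighted distances are summed.
        """
        group_size = sum(n for _, n in paths)
        return sum(d * n / group_size for d, n in paths)

    # Invented example: 20 of 30 monkeys take a 1200 m path, 10 take a 900 m path.
    dtd = mtp_distance([(1200, 20), (900, 10)])
    print(dtd)  # 1100.0
    ```

    A single center-of-mass path drawn between these two routes could be shorter than either and could cross unused areas, which is the bias the MTP method is designed to avoid.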

  17. Assessing Multiple Choice Question (MCQ) Tests--A Mathematical Perspective

    ERIC Educational Resources Information Center

    Scharf, Eric M.; Baldwin, Lynne P.

    2007-01-01

    The reasoning behind popular methods for analysing the raw data generated by multiple choice question (MCQ) tests is not always appreciated, occasionally with disastrous results. This article discusses and analyses three options for processing the raw data produced by MCQ tests. The article shows that one extreme option is not to penalize a…

  18. Feasibility and Reliability of Two Different Walking Tests in People with Severe Intellectual and Sensory Disabilities

    ERIC Educational Resources Information Center

    Waninge, A.; Evenhuis, I. J.; van Wijck, R.; van der Schans, C. P.

    2011-01-01

    Background: The purpose of this study is to describe feasibility and test-retest reliability of the six-minute walking distance test (6MWD) and an adapted shuttle run test (aSRT) in persons with severe intellectual and sensory (multiple) disabilities. Materials and Methods: Forty-seven persons with severe multiple disabilities, with Gross Motor…

  19. Testing Different Model Building Procedures Using Multiple Regression.

    ERIC Educational Resources Information Center

    Thayer, Jerome D.

    The stepwise regression method of selecting predictors for computer assisted multiple regression analysis was compared with forward, backward, and best subsets regression, using 16 data sets. The results indicated the stepwise method was preferred because of its practical nature, when the models chosen by different selection methods were similar…

  20. An extended sequential goodness-of-fit multiple testing method for discrete data.

    PubMed

    Castro-Conde, Irene; Döhler, Sebastian; de Uña-Álvarez, Jacobo

    2017-10-01

    The sequential goodness-of-fit (SGoF) multiple testing method has recently been proposed as an alternative to the familywise error rate- and the false discovery rate-controlling procedures in high-dimensional problems. For discrete data, the SGoF method may be very conservative. In this paper, we introduce an alternative SGoF-type procedure that takes into account the discreteness of the test statistics. Like the original SGoF, our new method provides weak control of the false discovery rate/familywise error rate but attains false discovery rate levels closer to the desired nominal level, and thus it is more powerful. We study the performance of this method in a simulation study and illustrate its application to a real pharmacovigilance data set.
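    The core SGoF idea, a metatest on whether the number of p-values falling below a fixed threshold exceeds what a binomial null would produce, can be sketched as follows. This is a simplified illustration of the principle only, not the published SGoF or discrete-SGoF algorithm; the exact rejection rule and the data are assumptions for demonstration.

    ```python
    from scipy.stats import binom

    def sgof_sketch(pvalues, gamma=0.05, alpha=0.05):
        """Simplified SGoF-style metatest (illustration only).

        Counts how many p-values fall at or below gamma and compares that
        count with the binomial critical value expected under the complete
        null; the excess is taken as the number of smallest p-values to
        declare significant.
        """
        n = len(pvalues)
        observed = sum(p <= gamma for p in pvalues)
        # Largest count still unsurprising under Binomial(n, gamma) at level alpha.
        critical = int(binom.ppf(1 - alpha, n, gamma))
        return max(0, observed - critical)

    # Invented data: 30 genuinely small p-values among 100 tests.
    pvals = [0.001] * 30 + [0.5] * 70
    print(sgof_sketch(pvals))
    ```

    Because the decision depends only on the count of small p-values, weak control of the error rate holds under the complete null, which is the property the abstract refers to.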

  1. Atmospheric turbulence profiling with SLODAR using multiple adaptive optics wavefront sensors.

    PubMed

    Wang, Lianqi; Schöck, Matthias; Chanan, Gary

    2008-04-10

    The slope detection and ranging (SLODAR) method recovers atmospheric turbulence profiles from time averaged spatial cross correlations of wavefront slopes measured by Shack-Hartmann wavefront sensors. The Palomar multiple guide star unit (MGSU) was set up to test tomographic multiple guide star adaptive optics and provided an ideal test bed for SLODAR turbulence altitude profiling. We present the data reduction methods and SLODAR results from MGSU observations made in 2006. Wind profiling is also performed using delayed wavefront cross correlations along with SLODAR analysis. The wind profiling analysis is shown to improve the height resolution of the SLODAR method and in addition gives the wind velocities of the turbulent layers.

  2. A study of compositional verification based IMA integration method

    NASA Astrophysics Data System (ADS)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the adoption of integrated modular avionics (IMA). While IMA improves avionics system integration, it also increases the complexity of system testing, so the IMA system test method needs to be simplified. An IMA system provides a module platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, an IMA system makes failures difficult to isolate. IMA system verification therefore faces a critical problem: how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily cover the whole system; for a complex, highly integrated avionics system, complete testing is hard to achieve. This paper therefore proposes applying compositional verification theory to IMA system testing, reducing the test process and improving efficiency, and consequently lowering the cost of IMA system integration.

  3. Variance Difference between Maximum Likelihood Estimation Method and Expected A Posteriori Estimation Method Viewed from Number of Test Items

    ERIC Educational Resources Information Center

    Mahmud, Jumailiyah; Sutikno, Muzayanah; Naga, Dali S.

    2016-01-01

    The aim of this study is to determine variance difference between maximum likelihood and expected A posteriori estimation methods viewed from number of test items of aptitude test. The variance presents an accuracy generated by both maximum likelihood and Bayes estimation methods. The test consists of three subtests, each with 40 multiple-choice…

  4. Adhesive Defect Monitoring of Glass Fiber Epoxy Plate Using an Impedance-Based Non-Destructive Testing Method for Multiple Structures

    PubMed Central

    Na, Wongi S.; Baek, Jongdae

    2017-01-01

    The emergence of composite materials has revolutionized the approach to building engineering structures. With the number of applications for composites increasing every day, maintaining structural integrity is of utmost importance. For composites, adhesive bonding is usually the preferred choice over mechanical fastening, and monitoring for delamination is an essential factor in the field of composite materials. In this study, a non-destructive method known as the electromechanical impedance method is used with an approach that monitors multiple areas by assigning specific frequency ranges to specific test specimens. Experiments are conducted using various numbers of stacks, created by attaching glass fiber epoxy composite plates onto one another, and two different debonding damage types are introduced to evaluate the performance of the multiple-monitoring electromechanical impedance method. PMID:28629194

  5. Best (but oft-forgotten) practices: the multiple problems of multiplicity-whether and how to correct for many statistical tests.

    PubMed

    Streiner, David L

    2015-10-01

    Testing many null hypotheses in a single study results in an increased probability of detecting a significant finding just by chance (the problem of multiplicity). Debates have raged over many years with regard to whether to correct for multiplicity and, if so, how it should be done. This article first discusses how multiple tests lead to an inflation of the α level, then explores the following different contexts in which multiplicity arises: testing for baseline differences in various types of studies, having >1 outcome variable, conducting statistical tests that produce >1 P value, taking multiple "peeks" at the data, and unplanned, post hoc analyses (i.e., "data dredging," "fishing expeditions," or "P-hacking"). It then discusses some of the methods that have been proposed for correcting for multiplicity, including single-step procedures (e.g., Bonferroni); multistep procedures, such as those of Holm, Hochberg, and Šidák; false discovery rate control; and resampling approaches. Note that these various approaches describe different aspects and are not necessarily mutually exclusive. For example, resampling methods could be used to control the false discovery rate or the family-wise error rate (as defined later in this article). However, the use of one of these approaches presupposes that we should correct for multiplicity, which is not universally accepted, and the article presents the arguments for and against such "correction." The final section brings together these threads and presents suggestions with regard to when it makes sense to apply the corrections and how to do so. © 2015 American Society for Nutrition.
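    The single-step and multistep procedures this record surveys differ only in how the per-test threshold moves across the ordered p-values. A compact, self-contained Python sketch of three of them (Bonferroni, Holm, and Benjamini-Hochberg FDR), applied to invented p-values:

    ```python
    def bonferroni(pvals, alpha=0.05):
        """Single-step: every p-value faces alpha / m."""
        m = len(pvals)
        return sum(p <= alpha / m for p in pvals)

    def holm(pvals, alpha=0.05):
        """Step-down: the i-th smallest p-value faces alpha / (m - i)."""
        m = len(pvals)
        rejected = 0
        for i, p in enumerate(sorted(pvals)):
            if p <= alpha / (m - i):
                rejected += 1
            else:
                break
        return rejected

    def benjamini_hochberg(pvals, alpha=0.05):
        """Step-up FDR control: reject the k smallest p-values, for the
        largest k with p_(k) <= k * alpha / m."""
        m = len(pvals)
        k = 0
        for i, p in enumerate(sorted(pvals), start=1):
            if p <= i * alpha / m:
                k = i
        return k

    pvals = [0.001, 0.01, 0.02, 0.04, 0.2]
    print(bonferroni(pvals), holm(pvals), benjamini_hochberg(pvals))
    # prints: 2 2 4
    ```

    The output illustrates the article's point that FDR control typically rejects more hypotheses than family-wise error rate control on the same p-values.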

  6. Conservatism implications of shock test tailoring for multiple design environments

    NASA Technical Reports Server (NTRS)

    Baca, Thomas J.; Bell, R. Glenn; Robbins, Susan A.

    1987-01-01

    A method for analyzing shock conservation in test specifications that have been tailored to qualify a structure for multiple design environments is discussed. Shock test conservation is qualified for shock response spectra, shock intensity spectra and ranked peak acceleration data in terms of an Index of Conservation (IOC) and an Overtest Factor (OTF). The multi-environment conservation analysis addresses the issue of both absolute and average conservation. The method is demonstrated in a case where four laboratory tests have been specified to qualify a component which must survive seven different field environments. Final judgment of the tailored test specification is shown to require an understanding of the predominant failure modes of the test item.

  7. Multiple Hypothesis Testing for Experimental Gingivitis Based on Wilcoxon Signed Rank Statistics

    PubMed Central

    Preisser, John S.; Sen, Pranab K.; Offenbacher, Steven

    2011-01-01

    Dental research often involves repeated multivariate outcomes on a small number of subjects for which there is interest in identifying outcomes that exhibit change in their levels over time as well as to characterize the nature of that change. In particular, periodontal research often involves the analysis of molecular mediators of inflammation for which multivariate parametric methods are highly sensitive to outliers and deviations from Gaussian assumptions. In such settings, nonparametric methods may be favored over parametric ones. Additionally, there is a need for statistical methods that control an overall error rate for multiple hypothesis testing. We review univariate and multivariate nonparametric hypothesis tests and apply them to longitudinal data to assess changes over time in 31 biomarkers measured from the gingival crevicular fluid in 22 subjects whereby gingivitis was induced by temporarily withholding tooth brushing. To identify biomarkers that can be induced to change, multivariate Wilcoxon signed rank tests for a set of four summary measures based upon area under the curve are applied for each biomarker and compared to their univariate counterparts. Multiple hypothesis testing methods with choice of control of the false discovery rate or strong control of the family-wise error rate are examined. PMID:21984957
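    The univariate per-biomarker analysis described here, a Wilcoxon signed rank test on paired measurements for each marker, can be sketched with SciPy. The biomarker names and values below are invented for two hypothetical markers in 12 subjects; the resulting unadjusted p-values would then be fed into whichever multiple testing procedure (FDR or family-wise) the study chooses.

    ```python
    from scipy.stats import wilcoxon

    # Invented paired data: baseline vs. induced-gingivitis levels for two
    # hypothetical biomarkers in 12 subjects.
    baseline = list(range(1, 13))
    biomarkers = {
        "IL-1beta": [b + 1 + 0.5 * i for i, b in enumerate(baseline)],          # clear increase
        "IL-8": [b + (1.1 + 0.1 * i) * (-1) ** i for i, b in enumerate(baseline)],  # no shift
    }

    # One nonparametric paired test per biomarker.
    pvalues = {}
    for name, induced in biomarkers.items():
        stat, p = wilcoxon(baseline, induced)
        pvalues[name] = p

    for name, p in pvalues.items():
        print(f"{name}: p = {p:.4g}")
    ```

    Only the shifted marker yields a small p-value, which is the kind of screening the abstract's area-under-the-curve summary measures then refine.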

  8. Evaluation of MIMIC-Model Methods for DIF Testing with Comparison to Two-Group Analysis

    ERIC Educational Resources Information Center

    Woods, Carol M.

    2009-01-01

    Differential item functioning (DIF) occurs when an item on a test or questionnaire has different measurement properties for 1 group of people versus another, irrespective of mean differences on the construct. This study focuses on the use of multiple-indicator multiple-cause (MIMIC) structural equation models for DIF testing, parameterized as item…

  9. Sample Size Calculation for Estimating or Testing a Nonzero Squared Multiple Correlation Coefficient

    ERIC Educational Resources Information Center

    Krishnamoorthy, K.; Xia, Yanping

    2008-01-01

    The problems of hypothesis testing and interval estimation of the squared multiple correlation coefficient of a multivariate normal distribution are considered. It is shown that available one-sided tests are uniformly most powerful, and the one-sided confidence intervals are uniformly most accurate. An exact method of calculating sample size to…

  10. Meta-analysis of quantitative pleiotropic traits for next-generation sequencing with multivariate functional linear models

    PubMed Central

    Chiu, Chi-yang; Jung, Jeesun; Chen, Wei; Weeks, Daniel E; Ren, Haobo; Boehnke, Michael; Amos, Christopher I; Liu, Aiyi; Mills, James L; Ting Lee, Mei-ling; Xiong, Momiao; Fan, Ruzong

    2017-01-01

    To analyze next-generation sequencing data, multivariate functional linear models are developed for a meta-analysis of multiple studies to connect genetic variant data to multiple quantitative traits adjusting for covariates. The goal is to take advantage of both meta-analysis and pleiotropic analysis in order to improve power and to carry out a unified association analysis of multiple studies and multiple traits of complex disorders. Three types of approximate F-distributions based on Pillai–Bartlett trace, Hotelling–Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants. Simulation analysis is performed to evaluate false-positive rates and power of the proposed tests. The proposed methods are applied to analyze lipid traits in eight European cohorts. It is shown that it is more advantageous to perform multivariate analysis than univariate analysis in general, and it is more advantageous to perform meta-analysis of multiple studies instead of analyzing the individual studies separately. The proposed models require individual observations. The value of the current paper can be seen at least for two reasons: (a) the proposed methods can be applied to studies that have individual genotype data; (b) the proposed methods can be used as a criterion for future work that uses summary statistics to build test statistics to meta-analyze the data. PMID:28000696

  11. Meta-analysis of quantitative pleiotropic traits for next-generation sequencing with multivariate functional linear models.

    PubMed

    Chiu, Chi-Yang; Jung, Jeesun; Chen, Wei; Weeks, Daniel E; Ren, Haobo; Boehnke, Michael; Amos, Christopher I; Liu, Aiyi; Mills, James L; Ting Lee, Mei-Ling; Xiong, Momiao; Fan, Ruzong

    2017-02-01

    To analyze next-generation sequencing data, multivariate functional linear models are developed for a meta-analysis of multiple studies to connect genetic variant data to multiple quantitative traits adjusting for covariates. The goal is to take advantage of both meta-analysis and pleiotropic analysis in order to improve power and to carry out a unified association analysis of multiple studies and multiple traits of complex disorders. Three types of approximate F-distributions based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants. Simulation analysis is performed to evaluate false-positive rates and power of the proposed tests. The proposed methods are applied to analyze lipid traits in eight European cohorts. It is shown that it is more advantageous to perform multivariate analysis than univariate analysis in general, and it is more advantageous to perform meta-analysis of multiple studies instead of analyzing the individual studies separately. The proposed models require individual observations. The value of the current paper can be seen at least for two reasons: (a) the proposed methods can be applied to studies that have individual genotype data; (b) the proposed methods can be used as a criterion for future work that uses summary statistics to build test statistics to meta-analyze the data.

  12. gsSKAT: Rapid gene set analysis and multiple testing correction for rare-variant association studies using weighted linear kernels.

    PubMed

    Larson, Nicholas B; McDonnell, Shannon; Cannon Albright, Lisa; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan E; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham G; MacInnis, Robert J; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catalona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J

    2017-05-01

    Next-generation sequencing technologies have afforded unprecedented characterization of low-frequency and rare genetic variation. Due to low power for single-variant testing, aggregative methods are commonly used to combine observed rare variation within a single gene. Causal variation may also aggregate across multiple genes within relevant biomolecular pathways. Kernel-machine regression and adaptive testing methods for aggregative rare-variant association testing have been demonstrated to be powerful approaches for pathway-level analysis, although these methods tend to be computationally intensive at high-variant dimensionality and require access to complete data. An additional analytical issue in scans of large pathway definition sets is multiple testing correction. Gene set definitions may exhibit substantial genic overlap, and the impact of the resultant correlation in test statistics on Type I error rate control for large agnostic gene set scans has not been fully explored. Herein, we first outline a statistical strategy for aggregative rare-variant analysis using component gene-level linear kernel score test summary statistics as well as derive simple estimators of the effective number of tests for family-wise error rate control. We then conduct extensive simulation studies to characterize the behavior of our approach relative to direct application of kernel and adaptive methods under a variety of conditions. We also apply our method to two case-control studies, respectively, evaluating rare variation in hereditary prostate cancer and schizophrenia. Finally, we provide open-source R code for public use to facilitate easy application of our methods to existing rare-variant analysis results. © 2017 WILEY PERIODICALS, INC.

  13. Multicollinearity is a red herring in the search for moderator variables: A guide to interpreting moderated multiple regression models and a critique of Iacobucci, Schneider, Popovich, and Bakamitsos (2016).

    PubMed

    McClelland, Gary H; Irwin, Julie R; Disatnik, David; Sivan, Liron

    2017-02-01

    Multicollinearity is irrelevant to the search for moderator variables, contrary to the implications of Iacobucci, Schneider, Popovich, and Bakamitsos (Behavior Research Methods, 2016, this issue). Multicollinearity is like the red herring in a mystery novel that distracts the statistical detective from the pursuit of a true moderator relationship. We show multicollinearity is completely irrelevant for tests of moderator variables. Furthermore, readers of Iacobucci et al. might be confused by a number of their errors. We note those errors, but more positively, we describe a variety of methods researchers might use to test and interpret their moderated multiple regression models, including two-stage testing, mean-centering, spotlighting, orthogonalizing, and floodlighting without regard to putative issues of multicollinearity. We cite a number of recent studies in the psychological literature in which the researchers used these methods appropriately to test, to interpret, and to report their moderated multiple regression models. We conclude with a set of recommendations for the analysis and reporting of moderated multiple regression that should help researchers better understand their models and facilitate generalizations across studies.
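    The article's central claim, that mean-centering is a matter of interpretive convenience and does not change the interaction test, is easy to verify numerically. A minimal NumPy sketch using invented noise-free data generated with a known interaction coefficient:

    ```python
    import numpy as np

    # Invented data: y = 1 + 2*x + 3*z + 0.5*x*z, no noise, so OLS recovers
    # the coefficients exactly.
    x = np.array([0., 1., 2., 3., 4., 5., 6., 7.])
    z = np.array([1., 3., 2., 5., 4., 7., 6., 9.])
    y = 1 + 2 * x + 3 * z + 0.5 * x * z

    def interaction_coef(x, z, y):
        """OLS fit of y ~ 1 + x + z + x*z; returns the interaction coefficient."""
        X = np.column_stack([np.ones_like(x), x, z, x * z])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta[3]

    b_raw = interaction_coef(x, z, y)
    xc, zc = x - x.mean(), z - z.mean()
    b_centered = interaction_coef(xc, zc, y)

    # The product term is highly correlated with x and z in the raw model,
    # yet the interaction coefficient is identical either way.
    print(round(b_raw, 10), round(b_centered, 10))  # prints: 0.5 0.5
    ```

    Algebraically, x*z = (xc + mean_x)(zc + mean_z) differs from xc*zc only by terms linear in xc and zc, so centering reshuffles the lower-order coefficients while leaving the interaction coefficient, and its test, untouched.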

  14. A Comparison of Methods for Transforming Sentences into Test Questions for Instructional Materials. Technical Report #1.

    ERIC Educational Resources Information Center

    Roid, Gale; And Others

    Several measurement theorists have convincingly argued that methods of writing test questions, particularly for criterion-referenced tests, should be based on operationally defined rules. This study was designed to examine and further refine a method for objectively generating multiple-choice questions for prose instructional materials. Important…

  15. A multiple scales approach to maximal superintegrability

    NASA Astrophysics Data System (ADS)

    Gubbiotti, G.; Latini, D.

    2018-07-01

    In this paper we present a simple, algorithmic test to establish if a Hamiltonian system is maximally superintegrable or not. This test is based on a very simple corollary of a theorem due to Nekhoroshev and on a perturbative technique called the multiple scales method. If the outcome is positive, this test can be used to suggest maximal superintegrability, whereas when the outcome is negative it can be used to disprove it. This method can be regarded as a finite dimensional analog of the multiple scales method as a way to produce soliton equations. We use this technique to show that the real counterpart of a mechanical system found by Jules Drach in 1935 is, in general, not maximally superintegrable. We give some hints on how this approach could be applied to classify maximally superintegrable systems by presenting a direct proof of the well-known Bertrand’s theorem.

  16. Equating in Small-Scale Language Testing Programs

    ERIC Educational Resources Information Center

    LaFlair, Geoffrey T.; Isbell, Daniel; May, L. D. Nicolas; Gutierrez Arvizu, Maria Nelly; Jamieson, Joan

    2017-01-01

    Language programs need multiple test forms for secure administrations and effective placement decisions, but can they have confidence that scores on alternate test forms have the same meaning? In large-scale testing programs, various equating methods are available to ensure the comparability of forms. The choice of equating method is informed by…

  17. Resampling-based Methods in Single and Multiple Testing for Equality of Covariance/Correlation Matrices

    PubMed Central

    Yang, Yang; DeGruttola, Victor

    2016-01-01

    Traditional resampling-based tests for homogeneity in covariance matrices across multiple groups resample residuals, that is, data centered by group means. These residuals do not share the same second moments when the null hypothesis is false, which makes them difficult to use in the setting of multiple testing. An alternative approach is to resample standardized residuals, data centered by group sample means and standardized by group sample covariance matrices. This approach, however, has been observed to inflate type I error when sample size is small or data are generated from heavy-tailed distributions. We propose to improve this approach by using robust estimation for the first and second moments. We discuss two statistics: the Bartlett statistic and a statistic based on eigen-decomposition of sample covariance matrices. Both statistics can be expressed in terms of standardized errors under the null hypothesis. These methods are extended to test homogeneity in correlation matrices. Using simulation studies, we demonstrate that the robust resampling approach provides comparable or superior performance, relative to traditional approaches, for single testing and reasonable performance for multiple testing. The proposed methods are applied to data collected in an HIV vaccine trial to investigate possible determinants, including vaccine status, vaccine-induced immune response level and viral genotype, of unusual correlation pattern between HIV viral load and CD4 count in newly infected patients. PMID:22740584

  18. Resampling-based methods in single and multiple testing for equality of covariance/correlation matrices.

    PubMed

    Yang, Yang; DeGruttola, Victor

    2012-06-22

    Traditional resampling-based tests for homogeneity in covariance matrices across multiple groups resample residuals, that is, data centered by group means. These residuals do not share the same second moments when the null hypothesis is false, which makes them difficult to use in the setting of multiple testing. An alternative approach is to resample standardized residuals, data centered by group sample means and standardized by group sample covariance matrices. This approach, however, has been observed to inflate type I error when sample size is small or data are generated from heavy-tailed distributions. We propose to improve this approach by using robust estimation for the first and second moments. We discuss two statistics: the Bartlett statistic and a statistic based on eigen-decomposition of sample covariance matrices. Both statistics can be expressed in terms of standardized errors under the null hypothesis. These methods are extended to test homogeneity in correlation matrices. Using simulation studies, we demonstrate that the robust resampling approach provides comparable or superior performance, relative to traditional approaches, for single testing and reasonable performance for multiple testing. The proposed methods are applied to data collected in an HIV vaccine trial to investigate possible determinants, including vaccine status, vaccine-induced immune response level and viral genotype, of unusual correlation pattern between HIV viral load and CD4 count in newly infected patients.

  19. Standard setting: comparison of two methods.

    PubMed

    George, Sanju; Haque, M Sayeed; Oyebode, Femi

    2006-09-14

    The outcome of assessments is determined by the standard-setting method used. There is a wide range of standard-setting methods and the two used most extensively in undergraduate medical education in the UK are the norm-reference and the criterion-reference methods. The aims of the study were to compare these two standard-setting methods for a multiple-choice question examination and to estimate the test-retest and inter-rater reliability of the modified Angoff method. The norm-reference method of standard-setting (mean minus 1 SD) was applied to the 'raw' scores of 78 4th-year medical students on a multiple-choice examination (MCQ). Two panels of raters also set the standard using the modified Angoff method for the same multiple-choice question paper on two occasions (6 months apart). We compared the pass/fail rates derived from the norm reference and the Angoff methods and also assessed the test-retest and inter-rater reliability of the modified Angoff method. The pass rate with the norm-reference method was 85% (66/78) and that by the Angoff method was 100% (78 out of 78). The percentage agreement between Angoff method and norm-reference was 78% (95% CI 69% - 87%). The modified Angoff method had an inter-rater reliability of 0.81-0.82 and a test-retest reliability of 0.59-0.74. There were significant differences in the outcomes of these two standard-setting methods, as shown by the difference in the proportion of candidates that passed and failed the assessment. The modified Angoff method was found to have good inter-rater reliability and moderate test-retest reliability.
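The norm-reference rule used in this study (pass mark = cohort mean minus one standard deviation) is simple enough to state in a few lines. A minimal sketch; the scores below are hypothetical, not the study's data:

```python
from statistics import mean, stdev

def norm_reference_cutoff(scores):
    """Pass mark set one (sample) standard deviation below the cohort mean."""
    return mean(scores) - stdev(scores)

# Hypothetical MCQ scores for a small cohort:
scores = [52, 60, 61, 63, 65, 66, 68, 70, 72, 80]
cutoff = norm_reference_cutoff(scores)
pass_rate = sum(s >= cutoff for s in scores) / len(scores)
```

Because the cutoff moves with each cohort's score distribution, the pass rate under this rule depends on the cohort rather than on an absolute standard, which is the contrast the study draws with the (criterion-referenced) Angoff method.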

  20. An omnibus test for family-based association studies with multiple SNPs and multiple phenotypes.

    PubMed

    Lasky-Su, Jessica; Murphy, Amy; McQueen, Matthew B; Weiss, Scott; Lange, Christoph

    2010-06-01

    We propose an omnibus family-based association test (MFBAT) that can be applied to multiple markers and multiple phenotypes and that has only one degree of freedom. The proposed test statistic extends current FBAT methodology to incorporate multiple markers as well as multiple phenotypes. Using simulation studies, power estimates for the proposed methodology are compared with the standard methodologies. On the basis of these simulations, we find that MFBAT substantially outperforms other methods, including haplotypic approaches and doing multiple tests with single single-nucleotide polymorphisms (SNPs) and single phenotypes. The practical relevance of the approach is illustrated by an application to asthma in which SNP/phenotype combinations are identified and reach overall significance that would not have been identified using other approaches. This methodology is directly applicable to cases in which there are multiple SNPs, such as candidate gene studies, cases in which there are multiple phenotypes, such as expression data, and cases in which there are multiple phenotypes and genotypes, such as genome-wide association studies that incorporate expression profiles as phenotypes. This program is available in the PBAT analysis package.

  1. Theory of chromatic noise masking applied to testing linearity of S-cone detection mechanisms.

    PubMed

    Giulianini, Franco; Eskew, Rhea T

    2007-09-01

    A method for testing the linearity of cone combination of chromatic detection mechanisms is applied to S-cone detection. This approach uses the concept of mechanism noise, the noise as seen by a postreceptoral neural mechanism, to represent the effects of superposing chromatic noise components in elevating thresholds and leads to a parameter-free prediction for a linear mechanism. The method also provides a test for the presence of multiple linear detectors and off-axis looking. No evidence for multiple linear mechanisms was found when using either S-cone increment or decrement tests. The results for both S-cone test polarities demonstrate that these mechanisms combine their cone inputs nonlinearly.

  2. Examining the Missing Completely at Random Mechanism in Incomplete Data Sets: A Multiple Testing Approach

    ERIC Educational Resources Information Center

    Raykov, Tenko; Lichtenberg, Peter A.; Paulson, Daniel

    2012-01-01

    A multiple testing procedure for examining implications of the missing completely at random (MCAR) mechanism in incomplete data sets is discussed. The approach uses the false discovery rate concept and is concerned with testing group differences on a set of variables. The method can be used for ascertaining violations of MCAR and disproving this…

  3. Teaching Composition Skills with Weekly Multiple Choice Tests in Lieu of Theme Writing. Final Report.

    ERIC Educational Resources Information Center

    Scannell, Dale P.; Haugh, Oscar M.

    The purpose of the study was to compare the effectiveness with which composition skills could be taught by the traditional theme-assignment approach and by an experimental method using weekly multiple-choice composition tests in lieu of theme writing. The weekly tests were based on original but typical first-draft compositions and covered problems…

  4. An Empirical Comparison of DDF Detection Methods for Understanding the Causes of DIF in Multiple-Choice Items

    ERIC Educational Resources Information Center

    Suh, Youngsuk; Talley, Anna E.

    2015-01-01

    This study compared and illustrated four differential distractor functioning (DDF) detection methods for analyzing multiple-choice items. The log-linear approach, two item response theory-model-based approaches with likelihood ratio tests, and the odds ratio approach were compared to examine the congruence among the four DDF detection methods.…

  5. A Case Study on Multiple-Choice Testing in Anatomical Sciences

    ERIC Educational Resources Information Center

    Golda, Stephanie DuPont

    2011-01-01

    Objective testing techniques, such as multiple-choice examinations, are a widely accepted method of assessment in gross anatomy. In order to deter cheating on these types of examinations, instructors often design several versions of an examination to distribute. These versions usually involve the rearrangement of questions and their corresponding…

  6. Will the "Real" Proficiency Standard Please Stand Up?

    ERIC Educational Resources Information Center

    Baron, Joan Boykoff; And Others

    Connecticut's experience with four different standard-setting methods regarding multiple choice proficiency tests is described. The methods include Angoff, Nedelsky, Borderline Group, and Contrasting Groups Methods. All Connecticut ninth graders were administered proficiency tests in reading, language arts, and mathematics. As soon as final test…

  7. False Discovery Control in Large-Scale Spatial Multiple Testing

    PubMed Central

    Sun, Wenguang; Reich, Brian J.; Cai, T. Tony; Guindani, Michele; Schwartzman, Armin

    2014-01-01

    Summary This article develops a unified theoretical and computational framework for false discovery control in multiple testing of spatial signals. We consider both point-wise and cluster-wise spatial analyses, and derive oracle procedures which optimally control the false discovery rate, false discovery exceedance and false cluster rate, respectively. A data-driven finite approximation strategy is developed to mimic the oracle procedures on a continuous spatial domain. Our multiple testing procedures are asymptotically valid and can be effectively implemented using Bayesian computational algorithms for analysis of large spatial data sets. Numerical results show that the proposed procedures lead to more accurate error control and better power performance than conventional methods. We demonstrate our methods for analyzing the time trends in tropospheric ozone in eastern US. PMID:25642138

  8. Creep compliance and percent recovery of Oklahoma certified binder using the multiple stress recovery (MSCR) method.

    DOT National Transportation Integrated Search

    2015-04-01

    A laboratory study was conducted to develop guidelines for the Multiple Stress Creep Recovery (MSCR) test method for local conditions prevailing in Oklahoma. The study consisted of commonly used binders in Oklahoma, namely PG 64-22, PG 70-28, and...

  9. Teaching Electric Circuits with Multiple Batteries: A Qualitative Approach

    ERIC Educational Resources Information Center

    Smith, David P.; van Kampen, Paul

    2011-01-01

    We have investigated preservice science teachers' qualitative understanding of circuits consisting of multiple batteries in single and multiple loops using a pretest and post-test method and classroom observations. We found that most students were unable to explain the effects of adding batteries in single and multiple loops, as they tended to use…

  10. From Addition to Multiplication ... and Back: The Development of Students' Additive and Multiplicative Reasoning Skills

    ERIC Educational Resources Information Center

    Van Dooren, Wim; De Bock, Dirk; Verschaffel, Lieven

    2010-01-01

    This study builds on two lines of research that have so far developed largely separately: the use of additive methods to solve proportional word problems and the use of proportional methods to solve additive word problems. We investigated the development with age of both kinds of erroneous solution methods. We gave a test containing missing-value…

  11. An efficient genome-wide association test for multivariate phenotypes based on the Fisher combination function.

    PubMed

    Yang, James J; Li, Jia; Williams, L Keoki; Buu, Anne

    2016-01-05

    In genome-wide association studies (GWAS) for complex diseases, the association between a SNP and each phenotype is usually weak. Combining multiple related phenotypic traits can increase the power of gene search and thus is a practically important area that requires methodology work. This study provides a comprehensive review of existing methods for conducting GWAS on complex diseases with multiple phenotypes, including the multivariate analysis of variance (MANOVA), the principal component analysis (PCA), generalized estimating equations (GEE), the trait-based association test involving the extended Simes procedure (TATES), and the classical Fisher combination test. We propose a new method that relaxes the unrealistic independence assumption of the classical Fisher combination test and is computationally efficient. To demonstrate applications of the proposed method, we also present the results of statistical analysis on the Study of Addiction: Genetics and Environment (SAGE) data. Our simulation study shows that the proposed method has higher power than existing methods while controlling for the type I error rate. The GEE and the classical Fisher combination test, on the other hand, do not control the type I error rate and thus are not recommended. In general, the power of the competing methods decreases as the correlation between phenotypes increases. All the methods tend to have lower power when the multivariate phenotypes come from long-tailed distributions. The real data analysis also demonstrates that the proposed method allows us to compare the marginal results with the multivariate results and specify which SNPs are specific to a particular phenotype or contribute to the common construct. The proposed method outperforms existing methods in most settings and also has great applications in GWAS on complex diseases with multiple phenotypes such as substance abuse disorders.
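The classical Fisher combination test that this paper takes as its starting point sums the log p-values across the k phenotype-specific tests: X = -2 Σ ln p_i, which follows a chi-square distribution with 2k degrees of freedom when the tests are independent (the assumption the proposed method relaxes). A stdlib-only sketch, using the closed-form chi-square survival function available for even degrees of freedom; the p-values are hypothetical:

```python
import math

def fisher_combination(pvalues):
    """Classical Fisher combination: X = -2 * sum(ln p_i), referred to a
    chi-square distribution with 2k df (assumes independent tests)."""
    k = len(pvalues)
    stat = -2.0 * sum(math.log(p) for p in pvalues)
    half = stat / 2.0
    # Chi-square survival function has a closed form for even df = 2k:
    # P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    p_comb = math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(k))
    return stat, p_comb

# Hypothetical per-phenotype p-values for one SNP:
stat, p_combined = fisher_combination([0.01, 0.20, 0.50])
```

Note how one strong and two weak signals combine into an overall p-value of about 0.03; under correlated phenotypes this reference distribution is wrong, which motivates the paper's correction.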

  12. Correcting for multiple-testing in multi-arm trials: is it necessary and is it done?

    PubMed

    Wason, James M S; Stecher, Lynne; Mander, Adrian P

    2014-09-17

    Multi-arm trials enable the evaluation of multiple treatments within a single trial. They provide a way of substantially increasing the efficiency of the clinical development process. However, since multi-arm trials test multiple hypotheses, some regulators require that a statistical correction be made to control the chance of making a type-1 error (false-positive). Several conflicting viewpoints are expressed in the literature regarding the circumstances in which a multiple-testing correction should be used. In this article we discuss these conflicting viewpoints and review the frequency with which correction methods are currently used in practice. We identified all multi-arm clinical trials published in 2012 by four major medical journals. Summary data on several aspects of the trial design were extracted, including whether the trial was exploratory or confirmatory, whether a multiple-testing correction was applied and, if one was used, what type it was. We found that almost half (49%) of published multi-arm trials report using a multiple-testing correction. The percentage that corrected was higher for trials in which the experimental arms included multiple doses or regimens of the same treatments (67%). The percentage that corrected was higher in exploratory than confirmatory trials, although this is explained by a greater proportion of exploratory trials testing multiple doses and regimens of the same treatment. A sizeable proportion of published multi-arm trials do not correct for multiple-testing. Clearer guidance about whether multiple-testing correction is needed for multi-arm trials that test separate treatments against a common control group is required.
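The simplest correction counted in this review is Bonferroni: with m comparisons against the control, each hypothesis is tested at alpha/m, which bounds the family-wise type-1 error rate at alpha. A minimal sketch with hypothetical p-values from a three-arm trial (not data from the review):

```python
def bonferroni(pvalues, alpha=0.05):
    """Test each of the m hypotheses at alpha / m, which bounds the
    family-wise error rate (chance of any false positive) at alpha."""
    threshold = alpha / len(pvalues)
    return [p <= threshold for p in pvalues]

# Hypothetical trial: three experimental arms vs a shared control.
decisions = bonferroni([0.012, 0.030, 0.200])
```

With three comparisons the per-test threshold drops to 0.0167, so the arm with p = 0.030, nominally significant, is no longer declared positive, which is precisely the cost that drives the debate the paper reviews.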

  13. Testing Mediation Using Multiple Regression and Structural Equation Modeling Analyses in Secondary Data

    ERIC Educational Resources Information Center

    Li, Spencer D.

    2011-01-01

    Mediation analysis in child and adolescent development research is possible using large secondary data sets. This article provides an overview of two statistical methods commonly used to test mediated effects in secondary analysis: multiple regression and structural equation modeling (SEM). Two empirical studies are presented to illustrate the…

  14. A Tutorial on Multiple Testing: False Discovery Control

    NASA Astrophysics Data System (ADS)

    Chatelain, F.

    2016-09-01

    This paper presents an overview of criteria and methods in multiple testing, with an emphasis on the false discovery rate control. The popular Benjamini and Hochberg procedure is described. The rationale for this approach is explained through a simple Bayesian interpretation. Some state-of-the-art variations and extensions are also presented.
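The Benjamini and Hochberg step-up procedure described in this tutorial sorts the m p-values and rejects the k smallest, where k is the largest rank i with p_(i) <= (i/m)q. A minimal sketch; the example p-values are illustrative:

```python
def benjamini_hochberg(pvalues, q=0.05):
    """BH step-up: reject the k smallest p-values, where k is the
    largest rank i with p_(i) <= (i / m) * q."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= rank / m * q:
            k = rank
    rejected = set(order[:k])
    return [i in rejected for i in range(m)]

flags = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.30], q=0.05)
```

The "step-up" matters: p = 0.039 fails its own threshold (3/5 * 0.05 = 0.03), and since no larger rank succeeds either, only the first two hypotheses are rejected, yet the procedure is still less conservative than a Bonferroni bound at 0.01.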

  15. Psychological traits underlying different killing methods among Malaysian male murderers.

    PubMed

    Kamaluddin, Mohammad Rahim; Shariff, Nadiah Syariani; Nurfarliza, Siti; Othman, Azizah; Ismail, Khaidzir H; Mat Saat, Geshina Ayu

    2014-04-01

    Murder is the most notorious crime that violates religious, social and cultural norms. Examining the types and number of killing methods used is pivotal in a murder case. However, the psychological traits underlying specific and multiple killing methods are still understudied. The present study attempts to fill this gap in knowledge by identifying the underlying psychological traits of different killing methods among Malaysian murderers. The study adopted an observational cross-sectional methodology using a guided self-administered questionnaire for data collection. The sampling frame consisted of 71 Malaysian male murderers from 11 Malaysian prisons who were selected using a purposive sampling method. The participants were also asked to provide the types and number of different killing methods used to kill their respective victims. An independent-sample t-test was performed to establish the mean score difference in psychological traits between murderers who used single and multiple types of killing methods. Kruskal-Wallis tests were carried out to ascertain psychological trait differences between specific types of killing methods. The results suggest that specific psychological traits underlie the type and number of killing methods used during murder. The majority (88.7%) of murderers used a single method of killing. Multiple methods of killing were evident in 'premeditated' murder compared to 'passion' murder, and revenge was a common motive. Examples of multiple methods are combinations of stabbing and strangulation or slashing and physical force. An exception was premeditated murder committed by shooting, which was usually a single method, attributed to the high lethality of firearms. Shooting was also notable when the motive was financial gain or related to drug dealing. Murderers who used multiple killing methods were more aggressive and sadistic than those who used a single killing method. Those who used multiple methods or slashing also displayed a higher level of minimisation traits. Despite its limitations, this study has shed some light on the underlying psychological traits of different killing methods, which is useful in the field of criminology.

  16. Multisignal detecting system of pile integrity testing

    NASA Astrophysics Data System (ADS)

    Liu, Zuting; Luo, Ying; Yu, Shihai

    2002-05-01

    The low strain reflection wave method plays a principal role in the integrity testing of foundation piles. However, there are some deficiencies with this method: there is a blind area of detection at the top of the tested pile; it is difficult to recognize defects in deep-seated parts of the pile; and the planar, single-point measurement cannot resolve three-dimensional effects. These problems are very difficult to solve with a single-transducer pile integrity testing system. A new multi-signal pile integrity testing system is proposed in this paper, which is able to excite and collect signals at multiple points on top of the pile. By using a multiple-superposition data processing method, the detecting system can effectively suppress interference and improve the precision and SNR of pile integrity testing. The system can also be applied to the evaluation of engineering structure health.

  17. Normal uniform mixture differential gene expression detection for cDNA microarrays

    PubMed Central

    Dean, Nema; Raftery, Adrian E

    2005-01-01

    Background One of the primary tasks in analysing gene expression data is finding genes that are differentially expressed in different samples. Multiple testing issues due to the thousands of tests run make some of the more popular methods for doing this problematic. Results We propose a simple method, Normal Uniform Differential Gene Expression (NUDGE) detection for finding differentially expressed genes in cDNA microarrays. The method uses a simple univariate normal-uniform mixture model, in combination with new normalization methods for spread as well as mean that extend the lowess normalization of Dudoit, Yang, Callow and Speed (2002) [1]. It takes account of multiple testing, and gives probabilities of differential expression as part of its output. It can be applied to either single-slide or replicated experiments, and it is very fast. Three datasets are analyzed using NUDGE, and the results are compared to those given by other popular methods: unadjusted and Bonferroni-adjusted t tests, Significance Analysis of Microarrays (SAM), and Empirical Bayes for microarrays (EBarrays) with both Gamma-Gamma and Lognormal-Normal models. Conclusion The method gives a high probability of differential expression to genes known/suspected a priori to be differentially expressed and a low probability to the others. In terms of known false positives and false negatives, the method outperforms all multiple-replicate methods except for the Gamma-Gamma EBarrays method to which it offers comparable results with the added advantages of greater simplicity, speed, fewer assumptions and applicability to the single replicate case. An R package called nudge to implement the methods in this paper will be made available soon at . PMID:16011807

  18. Correlation between skin-prick testing, individual specific IgE tests, and a multiallergen IgE assay for allergy detection in patients with chronic rhinitis.

    PubMed

    Cho, Jae Hoon; Suh, Jeffrey D; Kim, Jin Kook; Hong, Seok-Chan; Park, Il-Ho; Lee, Heung-Man

    2014-01-01

    Allergy test results can differ based on the method used. The most common tests include skin-prick testing (SPT) and in vitro tests to detect allergen-specific IgE. This study was designed to assess allergy test results using SPT, individual specific IgE tests, and a multiallergen IgE assay (multiple allergen simultaneous test) in patients with chronic rhinitis and controls. One hundred forty total patients were prospectively enrolled in the study, including 100 patients with chronic rhinitis and 40 control patients without atopy. All eligible patients underwent SPT, serum analysis using individual specific IgE test, and multiple allergen simultaneous test against 10 common allergens. Allergy test results were then compared to identify correlation and interest agreement. There was an 81-97% agreement between SPT and individual specific IgE test in allergen detection and an 80-98% agreement between SPT and multiple allergen simultaneous test. Individual specific IgE test and multiple allergen simultaneous test allergy detection prevalence was generally similar to SPT in patients with chronic rhinitis. All control patients had negative SPT (0/40), but low positive results were found with both individual specific IgE test (5-12.5%) and multiple allergen simultaneous test (2.5-7.5%) to some allergens, especially cockroach, Dermatophagoides farina, and ragweed. Agreement and correlation between individual specific IgE test and multiple allergen simultaneous test were good to excellent for a majority of tested allergens. This study shows good agreement and correlation between SPT with individual specific IgE test and multiple allergen simultaneous test on a majority of the tested allergens for patients with chronic rhinitis. Comparing the two in vitro tests, individual specific IgE test agrees with SPT better than multiple allergen simultaneous test.

  19. Quasi-Maximum Likelihood Estimation of Structural Equation Models with Multiple Interaction and Quadratic Effects

    ERIC Educational Resources Information Center

    Klein, Andreas G.; Muthen, Bengt O.

    2007-01-01

    In this article, a nonlinear structural equation model is introduced and a quasi-maximum likelihood method for simultaneous estimation and testing of multiple nonlinear effects is developed. The focus of the new methodology lies on efficiency, robustness, and computational practicability. Monte-Carlo studies indicate that the method is highly…

  20. Monte Carlo based statistical power analysis for mediation models: methods and software.

    PubMed

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
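The proposal can be sketched end to end without the bmem package: simulate data from a simple mediation model (X -> M -> Y), test the indirect effect a*b with a percentile bootstrap on each simulated dataset, and take power as the fraction of datasets whose 95% CI excludes zero. The stdlib-only Python below is a rough analogue under normal data with small replication counts for speed; all function names and parameter values are illustrative assumptions, not the article's:

```python
import random
import statistics

def slope(y, x):
    """OLS slope of y on a single predictor x."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
        sum((a - mx) ** 2 for a in x)

def ols_2pred(y, x1, x2):
    """Slopes for y ~ x1 + x2 (with intercept) via 2x2 normal equations."""
    mx1, mx2, my = statistics.fmean(x1), statistics.fmean(x2), statistics.fmean(y)
    s11 = sum((a - mx1) ** 2 for a in x1)
    s22 = sum((a - mx2) ** 2 for a in x2)
    s12 = sum((a - mx1) * (b - mx2) for a, b in zip(x1, x2))
    s1y = sum((a - mx1) * (b - my) for a, b in zip(x1, y))
    s2y = sum((a - mx2) * (b - my) for a, b in zip(x2, y))
    det = s11 * s22 - s12 ** 2
    return (s22 * s1y - s12 * s2y) / det, (s11 * s2y - s12 * s1y) / det

def indirect_effect(x, m, y):
    """Estimate a*b: a from M ~ X, b from Y ~ M + X."""
    a = slope(m, x)
    b, _ = ols_2pred(y, m, x)
    return a * b

def bootstrap_power(a=0.4, b=0.4, n=100, n_sims=40, n_boot=200, seed=1):
    """Fraction of simulated datasets whose 95% percentile bootstrap CI
    for the indirect effect excludes zero."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        x = [rng.gauss(0, 1) for _ in range(n)]
        m = [a * xi + rng.gauss(0, 1) for xi in x]
        y = [b * mi + rng.gauss(0, 1) for mi in m]
        ests = []
        for _ in range(n_boot):
            s = [rng.randrange(n) for _ in range(n)]
            ests.append(indirect_effect([x[i] for i in s],
                                        [m[i] for i in s],
                                        [y[i] for i in s]))
        ests.sort()
        lo, hi = ests[int(0.025 * n_boot)], ests[int(0.975 * n_boot) - 1]
        if lo > 0 or hi < 0:
            hits += 1
    return hits / n_sims

power = bootstrap_power()
```

To accommodate nonnormal data as the article does, only the three data-generating lines need to change (e.g. drawing errors from a skewed distribution); the bootstrap test itself stays the same, which is the practical appeal of the Monte Carlo approach.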

  1. Global Search Capabilities of Indirect Methods for Impulsive Transfers

    NASA Astrophysics Data System (ADS)

    Shen, Hong-Xin; Casalino, Lorenzo; Luo, Ya-Zhong

    2015-09-01

    An optimization method which combines an indirect method with homotopic approach is proposed and applied to impulsive trajectories. Minimum-fuel, multiple-impulse solutions, with either fixed or open time are obtained. The homotopic approach at hand is relatively straightforward to implement and does not require an initial guess of adjoints, unlike previous adjoints estimation methods. A multiple-revolution Lambert solver is used to find multiple starting solutions for the homotopic procedure; this approach can guarantee to obtain multiple local solutions without relying on the user's intuition, thus efficiently exploring the solution space to find the global optimum. The indirect/homotopic approach proves to be quite effective and efficient in finding optimal solutions, and outperforms the joint use of evolutionary algorithms and deterministic methods in the test cases.

  2. Reliability based design including future tests and multiagent approaches

    NASA Astrophysics Data System (ADS)

    Villanueva, Diane

    The initial stages of reliability-based design optimization involve formulating objective functions and constraints and building a model to estimate the reliability of the design with quantified uncertainties. However, even experienced hands often overlook important objective functions and constraints that affect the design. In addition, uncertainty reduction measures, such as tests and redesign, are often not considered in reliability calculations during the initial stages. This research considers two areas that concern the design of engineering systems: 1) the trade-off of the effect of a test and post-test redesign on reliability and cost and 2) the search for multiple candidate designs as insurance against unforeseen faults in some designs. In this research, a methodology was developed to estimate the effect of a single future test and post-test redesign on reliability and cost. The methodology uses assumed distributions of computational and experimental errors with re-design rules to simulate alternative future test and redesign outcomes to form a probabilistic estimate of the reliability and cost for a given design. Further, it was explored how modeling a future test and redesign provides a company an opportunity to balance development costs versus performance by simultaneously choosing the design and the post-test redesign rules during the initial design stage. The second area of this research considers the use of dynamic local surrogates, or surrogate-based agents, to locate multiple candidate designs. Surrogate-based global optimization algorithms often require search in multiple candidate regions of design space, expending most of the computation needed to define multiple alternate designs. Thus, focusing solely on locating the best design may be wasteful. We extended adaptive sampling surrogate techniques to locate multiple optima by building local surrogates in sub-regions of the design space to identify optima.
The efficiency of this method was studied, and the method was compared to other surrogate-based optimization methods that aim to locate the global optimum using two two-dimensional test functions, a six-dimensional test function, and a five-dimensional engineering example.

  3. A new compound control method for sine-on-random mixed vibration test

    NASA Astrophysics Data System (ADS)

    Zhang, Buyun; Wang, Ruochen; Zeng, Falin

    2017-09-01

    Vibration environmental testing (VET) is an important and effective method for supporting the strength design and the reliability and durability testing of mechanical products. A new separation control strategy was proposed for multiple-input multiple-output (MIMO) sine-on-random (SOR) mixed-mode vibration testing, an advanced and demanding type of VET. As the key element of the strategy, a correlation integral method was applied to separate the mixed signals into their random and sinusoidal components. The feedback control formula of a MIMO linear random vibration system was systematically derived in the frequency domain, and a Jacobi control algorithm was proposed based on the elements of the power spectral density (PSD) matrix, such as auto-spectra, coherence, and phase. Because excitation tends to be over-corrected in sine vibration testing, a compression factor was introduced to reduce the excitation correction and avoid damage to the vibration table or other devices. The two methods were combined and applied in a MIMO SOR vibration test system. Finally, a verification test system using the vibration of a cantilever beam as the control object was established to verify the reliability and effectiveness of the proposed methods. The test results show that the exceedance values can be accurately controlled within the tolerance range of the references, and the method can provide theoretical and practical support for mechanical engineering.
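    The correlation-integral separation at the heart of the strategy can be illustrated in a simplified, single-channel form: correlating the mixed signal with quadrature references at the known tone frequency averages the random part out over many periods. The frequencies, amplitudes, and noise level below are made-up demo values, not parameters from the paper:

```python
import numpy as np

def separate_sine(signal, freq, fs):
    """Split a sine-on-random signal into sinusoidal and random parts by
    correlating with sine/cosine references at the known tone frequency
    (a simplified, single-channel correlation-integral sketch)."""
    n = len(signal)
    t = np.arange(n) / fs
    ref_sin = np.sin(2 * np.pi * freq * t)
    ref_cos = np.cos(2 * np.pi * freq * t)
    # Correlation integrals: over an integer number of periods the random
    # component contributes ~0, leaving the quadrature sine amplitudes
    a = 2.0 / n * (signal @ ref_sin)
    b = 2.0 / n * (signal @ ref_cos)
    sine_part = a * ref_sin + b * ref_cos
    return sine_part, signal - sine_part, float(np.hypot(a, b))

# Demo: amplitude-2.0 tone at 32 Hz buried in Gaussian random vibration
fs, f0 = 1024.0, 32.0
t = np.arange(8192) / fs
rng = np.random.default_rng(1)
mixed = 2.0 * np.sin(2 * np.pi * f0 * t + 0.5) + rng.normal(0.0, 0.5, t.size)
sine_est, random_est, amplitude = separate_sine(mixed, f0, fs)
```

    In the MIMO test itself this separation is done per channel so that the sinusoidal and random portions can be controlled by their respective algorithms.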

  4. Multiple Imputation of Item Scores in Test and Questionnaire Data, and Influence on Psychometric Results

    ERIC Educational Resources Information Center

    van Ginkel, Joost R.; van der Ark, L. Andries; Sijtsma, Klaas

    2007-01-01

    The performance of five simple multiple imputation methods for dealing with missing data was compared. In addition, random imputation and multivariate normal imputation were used as lower and upper benchmarks, respectively. Test data were simulated and item scores were deleted such that they were either missing completely at random, missing at…

  5. Cooperative system and method using mobile robots for testing a cooperative search controller

    DOEpatents

    Byrne, Raymond H.; Harrington, John J.; Eskridge, Steven E.; Hurtado, John E.

    2002-01-01

    A test system for testing a controller provides a way to use large numbers of miniature mobile robots to test a cooperative search controller in a test area, where each mobile robot has a sensor, a communication device, a processor, and a memory. A method of using a test system provides a way for testing a cooperative search controller using multiple robots sharing information and communicating over a communication network.

  6. The History of a Decision: A Standard Vibration Test Method for Qualification

    DOE PAGES

    Rizzo, Davinia; Blackburn, Mark

    2017-01-01

    As Mil-Std-810G and subsequent versions have included multiple-degree-of-freedom vibration test methodologies, it is important to understand the history and factors that drove the original decision in Mil-Std-810 to focus on single-degree-of-freedom (SDOF) vibration testing. Assessing the factors and thought processes behind early Mil-Std-810 vibration test methods enables one to better consider the use of multiple-degree-of-freedom testing now that it is feasible with today’s technology and documented in Mil-Std-810. This paper delves into the details of the decision made in the 1960s for the SDOF vibration testing standards in Mil-Std-810, beyond the limitations of technology at the time. We also consider the implications for effective test planning today, considering the advances in test capabilities and improvements in understanding of the operational environment.

  7. The History of a Decision: A Standard Vibration Test Method for Qualification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rizzo, Davinia; Blackburn, Mark

    As Mil-Std-810G and subsequent versions have included multiple-degree-of-freedom vibration test methodologies, it is important to understand the history and factors that drove the original decision in Mil-Std-810 to focus on single-degree-of-freedom (SDOF) vibration testing. Assessing the factors and thought processes behind early Mil-Std-810 vibration test methods enables one to better consider the use of multiple-degree-of-freedom testing now that it is feasible with today’s technology and documented in Mil-Std-810. This paper delves into the details of the decision made in the 1960s for the SDOF vibration testing standards in Mil-Std-810, beyond the limitations of technology at the time. We also consider the implications for effective test planning today, considering the advances in test capabilities and improvements in understanding of the operational environment.

  8. A Multiple Choice Version of the Sentence Completion Method

    ERIC Educational Resources Information Center

    Shouval, Ron; And Others

    1975-01-01

    It was concluded that a multiple-choice form corresponding to a sentence completion measure, testing clearly defined personality areas (such as autonomy), could be a reasonable alternative for many purposes. (Author/DEP)

  9. Integrative set enrichment testing for multiple omics platforms

    PubMed Central

    2011-01-01

    Background Enrichment testing assesses the overall evidence of differential expression behavior of the elements within a defined set. When we have measured many molecular aspects, e.g. gene expression, metabolites, proteins, it is desirable to assess their differential tendencies jointly across platforms using an integrated set enrichment test. In this work we explore the properties of several methods for performing a combined enrichment test using gene expression and metabolomics as the motivating platforms. Results Using two simulation models we explored the properties of several enrichment methods including two novel methods: the logistic regression 2-degree of freedom Wald test and the 2-dimensional permutation p-value for the sum-of-squared statistics test. In relation to their univariate counterparts we find that the joint tests can improve our ability to detect results that are marginal univariately. We also find that joint tests improve the ranking of associated pathways compared to their univariate counterparts. However, there is a risk of Type I error inflation with some methods and self-contained methods lose specificity when the sets are not representative of underlying association. Conclusions In this work we show that consideration of data from multiple platforms, in conjunction with summarization via a priori pathway information, leads to increased power in detection of genomic associations with phenotypes. PMID:22118224
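    The flavor of the joint tests can be seen in a minimal sketch of the sum-of-squared-statistics idea (not the authors' exact implementation): squaring and summing the per-platform z-scores gives a chi-square statistic with 2 degrees of freedom, whose survival function has the closed form exp(-x/2).

```python
import math

def joint_set_p(z_expr, z_metab):
    """Joint enrichment p-value from per-set z-scores on two platforms via a
    sum-of-squared-statistics test: z1^2 + z2^2 ~ chi-square(2) under H0,
    and the chi-square(2) survival function is exp(-x/2)."""
    return math.exp(-(z_expr**2 + z_metab**2) / 2.0)

def two_sided_p(z):
    """Two-sided normal p-value for a single z-score."""
    return math.erfc(abs(z) / math.sqrt(2.0))

# A set that is only marginal on each platform alone (z = 1.8, p ~ 0.072)
# becomes significant when the two platforms are combined:
p_single = two_sided_p(1.8)
p_joint = joint_set_p(1.8, 1.8)
```

    This simple combination assumes the two platform statistics are independent; the 2-dimensional permutation version described in the abstract is what accounts for cross-platform correlation.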

  10. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM-2.5

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  11. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM-2.5

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  12. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM−2.5.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  13. Alternative Multiple Imputation Inference for Mean and Covariance Structure Modeling

    ERIC Educational Resources Information Center

    Lee, Taehun; Cai, Li

    2012-01-01

    Model-based multiple imputation has become an indispensable method in the educational and behavioral sciences. Mean and covariance structure models are often fitted to multiply imputed data sets. However, the presence of multiple random imputations complicates model fit testing, which is an important aspect of mean and covariance structure…

  14. Developing multiple-choices test items as tools for measuring the scientific-generic skills on solar system

    NASA Astrophysics Data System (ADS)

    Bhakti, Satria Seto; Samsudin, Achmad; Chandra, Didi Teguh; Siahaan, Parsaoran

    2017-05-01

    The aim of this research is to develop multiple-choice test items as tools for measuring scientific generic skills on the solar system. To achieve this aim, the researchers used the ADDIE model, consisting of Analysis, Design, Development, Implementation, and Evaluation, as the research method. The scientific generic skills were limited to five indicators: (1) indirect observation, (2) awareness of scale, (3) logical inference, (4) causal relations, and (5) mathematical modeling. The participants were 32 students at a junior high school in Bandung. The results show that the constructed multiple-choice test items were declared valid by the expert validator, and subsequent testing showed that the developed items are able to measure scientific generic skills on the solar system.

  15. Multiple well-shutdown tests and site-scale flow simulation in fractured rocks

    USGS Publications Warehouse

    Tiedeman, Claire; Lacombe, Pierre J.; Goode, Daniel J.

    2010-01-01

    A new method was developed for conducting aquifer tests in fractured-rock flow systems that have a pump-and-treat (P&T) operation for containing and removing groundwater contaminants. The method involves temporary shutdown of individual pumps in wells of the P&T system. Conducting aquifer tests in this manner has several advantages, including (1) no additional contaminated water is withdrawn, and (2) hydraulic containment of contaminants remains largely intact because pumping continues at most wells. The well-shutdown test method was applied at the former Naval Air Warfare Center (NAWC), West Trenton, New Jersey, where a P&T operation is designed to contain and remove trichloroethene and its daughter products in the dipping fractured sedimentary rocks underlying the site. The detailed site-scale subsurface geologic stratigraphy, a three-dimensional MODFLOW model, and inverse methods in UCODE_2005 were used to analyze the shutdown tests. In the model, a deterministic method was used for representing the highly heterogeneous hydraulic conductivity distribution and simulations were conducted using an equivalent porous media method. This approach was very successful for simulating the shutdown tests, contrary to a common perception that flow in fractured rocks must be simulated using a stochastic or discrete fracture representation of heterogeneity. Use of inverse methods to simultaneously calibrate the model to the multiple shutdown tests was integral to the effectiveness of the approach.

  16. Evaluating Composite Sampling Methods of Bacillus Spores at Low Concentrations

    PubMed Central

    Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.

    2016-01-01

    Restoring all facility operations after the 2001 Amerithrax attacks took years to complete, highlighting the need to reduce remediation time. Some of the most time-intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite (SM-SPC): a single cellulose sponge samples multiple coupons with a single pass across each coupon; 2) single medium multi-pass composite (SM-MPC): a single cellulose sponge samples multiple coupons with multiple passes across each coupon; and 3) multi-medium post-sample composite (MM-MPC): a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted dry wallboard) and three grime-coated (dirty) materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p < 0.0001) and coupon material (p = 0.0006). Recovery efficiency (RE) was higher overall using the MM-MPC method compared to the SM-SPC and SM-MPC methods. RE with the MM-MPC method for the concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, dry wallboard, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but lower for ceramic tile.
These results suggest post-sample compositing can be used to reduce sample analysis time when responding to a Bacillus anthracis contamination event of clean or dirty surfaces. PMID:27736999

  17. Evaluating Composite Sampling Methods of Bacillus Spores at Low Concentrations.

    PubMed

    Hess, Becky M; Amidan, Brett G; Anderson, Kevin K; Hutchison, Janine R

    2016-01-01

    Restoring all facility operations after the 2001 Amerithrax attacks took years to complete, highlighting the need to reduce remediation time. Some of the most time-intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite (SM-SPC): a single cellulose sponge samples multiple coupons with a single pass across each coupon; 2) single medium multi-pass composite (SM-MPC): a single cellulose sponge samples multiple coupons with multiple passes across each coupon; and 3) multi-medium post-sample composite (MM-MPC): a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted dry wallboard) and three grime-coated (dirty) materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p < 0.0001) and coupon material (p = 0.0006). Recovery efficiency (RE) was higher overall using the MM-MPC method compared to the SM-SPC and SM-MPC methods. RE with the MM-MPC method for the concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, dry wallboard, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but lower for ceramic tile.
These results suggest post-sample compositing can be used to reduce sample analysis time when responding to a Bacillus anthracis contamination event of clean or dirty surfaces.

  18. Visual Search as a Tool for a Quick and Reliable Assessment of Cognitive Functions in Patients with Multiple Sclerosis

    PubMed Central

    Utz, Kathrin S.; Hankeln, Thomas M. A.; Jung, Lena; Lämmer, Alexandra; Waschbisch, Anne; Lee, De-Hyung; Linker, Ralf A.; Schenk, Thomas

    2013-01-01

    Background Despite the high frequency of cognitive impairment in multiple sclerosis, its assessment has not yet gained entrance into clinical routine, due to a lack of time-saving tests suitable for patients with multiple sclerosis. Objective The aim of the study was to compare the paradigm of visual search with neuropsychological standard tests, in order to identify the test that discriminates best between patients with multiple sclerosis and healthy individuals with respect to cognitive functions, without being susceptible to practice effects. Methods Patients with relapsing-remitting multiple sclerosis (n = 38) and age- and gender-matched healthy individuals (n = 40) were tested with common neuropsychological tests and a computer-based visual search task, in which a target stimulus has to be detected amongst distracting stimuli on a touch screen. Twenty-eight of the healthy individuals were re-tested in order to determine potential practice effects. Results Mean reaction time, reflecting visual attention, and movement time, indicating motor execution, in the visual search task discriminated best between healthy individuals and patients with multiple sclerosis, without practice effects. Conclusions Visual search is a promising instrument for the assessment of cognitive functions, and potentially of cognitive changes, in patients with multiple sclerosis, thanks to its good discriminatory power and insusceptibility to practice effects. PMID:24282604

  19. Gene- and pathway-based association tests for multiple traits with GWAS summary statistics.

    PubMed

    Kwak, Il-Youp; Pan, Wei

    2017-01-01

    To identify novel genetic variants associated with complex traits and to shed new insights on underlying biology, in addition to the most popular single-SNP, single-trait association analysis, it would be useful to explore multiple correlated (intermediate) traits at the gene or pathway level by mining existing single GWAS or meta-analyzed GWAS data. For this purpose, we present an adaptive gene-based test and a pathway-based test for association analysis of multiple traits with GWAS summary statistics. The proposed tests are adaptive at both the SNP and trait levels; that is, they account for possibly varying association patterns (e.g. signal sparsity levels) across SNPs and traits, thus maintaining high power across a wide range of situations. Furthermore, the proposed methods are general: they can be applied to mixed types of traits, and to Z-statistics or P-values as summary statistics obtained from either a single GWAS or a meta-analysis of multiple GWAS. Our numerical studies with simulated and real data demonstrated the promising performance of the proposed methods. The methods are implemented in the R package aSPU, freely and publicly available at https://cran.r-project.org/web/packages/aSPU/.

  20. EVALUATION OF IMMUNOASSAY METHODS FOR DETERMINATION OF 3,5,6-TRICHLORO-2-PYRIDINOL IN MULTIPLE SAMPLE MEDIA

    EPA Science Inventory

    Two enzyme-linked immunosorbent assay (ELISA) methods were evaluated for the determination of 3,5,6-trichloro-2-pyridinol (3,5,6-TCP) in multiple sample media (dust, soil, food, and urine). The dust and soil samples were analyzed by the RaPID (TM) commercial immunoassay testing ...

  1. Wavefront reconstruction for multi-lateral shearing interferometry using difference Zernike polynomials fitting

    NASA Astrophysics Data System (ADS)

    Liu, Ke; Wang, Jiannian; Wang, Hai; Li, Yanqiu

    2018-07-01

    For the multi-lateral shearing interferometers (multi-LSIs), the measurement accuracy can be enhanced by estimating the wavefront under test with the multidirectional phase information encoded in the shearing interferogram. Usually the multi-LSIs reconstruct the test wavefront from the phase derivatives in multiple directions using the discrete Fourier transforms (DFT) method, which is only suitable to small shear ratios and relatively sensitive to noise. To improve the accuracy of multi-LSIs, wavefront reconstruction from the multidirectional phase differences using the difference Zernike polynomials fitting (DZPF) method is proposed in this paper. For the DZPF method applied in the quadriwave LSI, difference Zernike polynomials in only two orthogonal shear directions are required to represent the phase differences in multiple shear directions. In this way, the test wavefront can be reconstructed from the phase differences in multiple shear directions using a noise-variance weighted least-squares method with almost no extra computational burden, compared with the usual recovery from the phase differences in two orthogonal directions. Numerical simulation results show that the DZPF method can maintain high reconstruction accuracy in a wider range of shear ratios and has much better anti-noise performance than the DFT method. A null test experiment of the quadriwave LSI has been conducted and the experimental results show that the measurement accuracy of the quadriwave LSI can be improved from 0.0054 λ rms to 0.0029 λ rms (λ = 632.8 nm) by substituting the DFT method with the proposed DZPF method in the wavefront reconstruction process.
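    The difference-polynomial fitting idea can be sketched with a toy basis standing in for true Zernike polynomials (piston is omitted, since it cannot be recovered from differences); the shear, grid, and coefficients below are arbitrary demo values:

```python
import numpy as np

def fit_from_differences(dpx, dpy, xx, yy, shear, basis):
    """Least-squares wavefront fit directly to the x/y phase differences using
    the analytically sheared basis functions (difference-polynomial fitting)."""
    cols_x = [f(xx + shear, yy) - f(xx, yy) for f in basis]
    cols_y = [f(xx, yy + shear) - f(xx, yy) for f in basis]
    A = np.vstack([np.column_stack([c.ravel() for c in cols_x]),
                   np.column_stack([c.ravel() for c in cols_y])])
    b = np.concatenate([dpx.ravel(), dpy.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

# Toy low-order basis: tilt x, tilt y, defocus, astigmatism
basis = [lambda x, y: x,
         lambda x, y: y,
         lambda x, y: 2 * (x**2 + y**2) - 1,
         lambda x, y: x**2 - y**2]
x = np.linspace(-1.0, 1.0, 64)
xx, yy = np.meshgrid(x, x)
true_c = np.array([0.3, -0.2, 0.15, 0.05])

def wavefront(x_, y_):
    return sum(c * f(x_, y_) for c, f in zip(true_c, basis))

shear = 0.05
dpx = wavefront(xx + shear, yy) - wavefront(xx, yy)   # x-sheared differences
dpy = wavefront(xx, yy + shear) - wavefront(xx, yy)   # y-sheared differences
est_c = fit_from_differences(dpx, dpy, xx, yy, shear, basis)
```

    With noisy measurements the same system can be solved with weights proportional to the inverse noise variance of each difference map, which is the noise-variance-weighted least-squares step the paper describes.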

  2. Similitude assessment method for comparing PMHS response data from impact loading across multiple test devices.

    PubMed

    Dooley, Christopher J; Tenore, Francesco V; Gayzik, F Scott; Merkle, Andrew C

    2018-04-27

    Biological tissue testing is inherently susceptible to wide specimen-to-specimen variability. A primary resource for encapsulating this range of variability is the biofidelity response corridor (BRC). In the field of injury biomechanics, BRCs are often used for the development and validation of both physical models, such as anthropomorphic test devices, and computational models. For the purpose of generating corridors, post-mortem human surrogates were tested across a range of loading conditions relevant to under-body blast events. To sufficiently cover the wide range of input conditions, a relatively small number of tests were performed across a large spread of conditions. The high volume of required testing called for leveraging the capabilities of multiple impact test facilities, all with slight variations in test devices. A method for assessing the similitude of responses between test devices was created as a metric for inclusion of a response in the resulting BRC. The goal of this method was to supply a statistically sound, objective way to assess the similitude of an individual response against a set of responses, ensuring that the BRC created from the set was affected primarily by biological variability, not by anomalies or differences stemming from the test devices.

  3. Multiple imputation methods for nonparametric inference on cumulative incidence with missing cause of failure

    PubMed Central

    Lee, Minjung; Dignam, James J.; Han, Junhee

    2014-01-01

    We propose a nonparametric approach for cumulative incidence estimation when causes of failure are unknown or missing for some subjects. Under the missing at random assumption, we estimate the cumulative incidence function using multiple imputation methods. We develop asymptotic theory for the cumulative incidence estimators obtained from multiple imputation methods. We also discuss how to construct confidence intervals for the cumulative incidence function and perform a test for comparing the cumulative incidence functions in two samples with missing cause of failure. Through simulation studies, we show that the proposed methods perform well. The methods are illustrated with data from a randomized clinical trial in early stage breast cancer. PMID:25043107
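    A minimal version of the imputation scheme, assuming no censoring and a cause distribution that does not depend on covariates (a stronger simplification than the paper's missing-at-random model), looks like this; causes are coded 1 and 2, with 0 meaning missing:

```python
import numpy as np

rng = np.random.default_rng(2)

def cif(times, causes, t0, cause=1):
    """Empirical cumulative incidence of one cause at t0 (no censoring)."""
    return np.mean((times <= t0) & (causes == cause))

def mi_cif(times, causes, t0, m=20):
    """Multiple-imputation estimate: draw each missing cause (coded 0) from
    the observed cause distribution, estimate the CIF on each completed
    dataset, then average across imputations (Rubin's point estimate)."""
    observed = causes[causes != 0]
    p1 = np.mean(observed == 1)
    estimates = []
    for _ in range(m):
        completed = causes.copy()
        missing = completed == 0
        completed[missing] = np.where(rng.random(missing.sum()) < p1, 1, 2)
        estimates.append(cif(times, completed, t0))
    return float(np.mean(estimates))

# Demo: exponential failure times, cause 1 with probability 0.6, and 20%
# of the causes masked completely at random
n = 2000
times = rng.exponential(1.0, n)
causes = np.where(rng.random(n) < 0.6, 1, 2)
causes[rng.random(n) < 0.2] = 0
est = mi_cif(times, causes, t0=1.0)
# True CIF_1(1.0) here is 0.6 * (1 - exp(-1)), about 0.379
```

    The between-imputation spread of `estimates` is what feeds the variance part of Rubin's rules when building confidence intervals.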

  4. Using a fuzzy comprehensive evaluation method to determine product usability: A test case

    PubMed Central

    Zhou, Ronggang; Chan, Alan H. S.

    2016-01-01

    BACKGROUND: In order to take into account the inherent uncertainties during product usability evaluation, Zhou and Chan [1] proposed a comprehensive method of usability evaluation for products by combining the analytic hierarchy process (AHP) and fuzzy evaluation methods for synthesizing performance data and subjective response data. This method was designed to provide an integrated framework combining the inevitable vague judgments from the multiple stages of the product evaluation process. OBJECTIVE AND METHODS: In order to illustrate the effectiveness of the model, this study used a summative usability test case to assess the application and strength of the general fuzzy usability framework. To test the proposed fuzzy usability evaluation framework [1], a standard summative usability test was conducted to benchmark the overall usability of a specific network management software. Based on the test data, the fuzzy method was applied to incorporate both the usability scores and uncertainties involved in the multiple components of the evaluation. Then, with Monte Carlo simulation procedures, confidence intervals were used to compare the reliabilities among the fuzzy approach and two typical conventional methods combining metrics based on percentages. RESULTS AND CONCLUSIONS: This case study showed that the fuzzy evaluation technique can be applied successfully for combining summative usability testing data to achieve an overall usability quality for the network software evaluated. Greater differences of confidence interval widths between the method of averaging equally percentage and weighted evaluation method, including the method of weighted percentage averages, verified the strength of the fuzzy method. PMID:28035942
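    The synthesis step of such a fuzzy comprehensive evaluation can be sketched as a weighted-average composition; the criteria, weights, membership matrix, and level scores below are invented for illustration, not taken from the study:

```python
import numpy as np

# AHP-style weights for three usability criteria (illustrative values),
# e.g. effectiveness, efficiency, satisfaction
weights = np.array([0.5, 0.3, 0.2])

# Fuzzy membership matrix R: each row spreads one criterion's judgments over
# the rating levels (poor, fair, good, excellent); each row sums to 1
R = np.array([[0.1, 0.2, 0.5, 0.2],
              [0.0, 0.3, 0.4, 0.3],
              [0.2, 0.2, 0.4, 0.2]])

# Weighted-average fuzzy synthesis: membership of the product in each level
B = weights @ R

# Defuzzify with a score per level to obtain a single usability index
level_scores = np.array([25.0, 50.0, 75.0, 100.0])
usability_index = float(B @ level_scores)
```

    In the full framework the weights come from pairwise AHP comparisons and the memberships from the multiple stages of test data, but the composition step is the same.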

  5. Rapid evaluation of reverse-osmosis membranes

    NASA Technical Reports Server (NTRS)

    Hollahan, J. R.; Wydeven, T.

    1972-01-01

    Simultaneous reverse-osmosis tests conducted with centrifuges having multiple compartment heads are discussed. Equipment for retaining reverse-osmosis membrane is illustrated. Method of conducting tests is described.

  6. Benefits of Using Planned Comparisons Rather Than Post Hoc Tests: A Brief Review with Examples.

    ERIC Educational Resources Information Center

    DuRapau, Theresa M.

    The rationale behind analysis of variance (including analysis of covariance and multiple analyses of variance and covariance) methods is reviewed, and unplanned and planned methods of evaluating differences between means are briefly described. Two advantages of using planned or a priori tests over unplanned or post hoc tests are presented. In…

  7. Multiple-Choice Testing Using Immediate Feedback--Assessment Technique (IF AT®) Forms: Second-Chance Guessing vs. Second-Chance Learning?

    ERIC Educational Resources Information Center

    Merrel, Jeremy D.; Cirillo, Pier F.; Schwartz, Pauline M.; Webb, Jeffrey A.

    2015-01-01

    Multiple choice testing is a common but often ineffective method for evaluating learning. A newer approach, however, using Immediate Feedback Assessment Technique (IF AT®, Epstein Educational Enterprise, Inc.) forms, offers several advantages. In particular, a student learns immediately if his or her answer is correct and, in the case of an…

  8. Sample size determination for equivalence assessment with multiple endpoints.

    PubMed

    Sun, Anna; Dong, Xiaoyu; Tsong, Yi

    2014-01-01

    Equivalence assessment between a reference and a test treatment is often conducted by two one-sided tests (TOST). The corresponding power function and sample size determination can be derived from the joint distribution of the sample mean and sample variance. When an equivalence trial is designed with multiple endpoints, it often involves several sets of two one-sided tests. A naive approach to sample size determination in this case would select the largest sample size required across the endpoints. However, such a method ignores the correlation among endpoints. With the objective of rejecting all endpoints, and when the endpoints are uncorrelated, the power function is the product of the power functions for the individual endpoints. With correlated endpoints, the sample size and power should be adjusted for the correlation. In this article, we propose the exact power function for the equivalence test with multiple endpoints adjusted for correlation under both crossover and parallel designs. We further discuss the differences in sample size between the naive method and the correlation-adjusted methods and illustrate them with an in vivo bioequivalence crossover study with area under the curve (AUC) and maximum concentration (Cmax) as the two endpoints.
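    The effect of endpoint correlation on the "pass both TOSTs" power can be checked with a quick Monte Carlo sketch. This uses a large-sample normal approximation with an invented sample size, SD, and the usual ln(1.25) margin; it is not the paper's exact power function:

```python
import numpy as np

rng = np.random.default_rng(3)

def tost_pass(diff_hat, se, margin=0.223, alpha_z=1.6449):
    """Simplified TOST on one endpoint: both one-sided normal tests at 5%."""
    return (diff_hat - alpha_z * se > -margin) and (diff_hat + alpha_z * se < margin)

def power_both(n, rho, sd=0.3, margin=0.223, n_sim=20_000):
    """Monte Carlo power to pass TOST on BOTH endpoints when the endpoint
    estimates are bivariate normal with correlation rho (true difference 0)."""
    se = sd / np.sqrt(n)
    cov = se**2 * np.array([[1.0, rho], [rho, 1.0]])
    diffs = rng.multivariate_normal([0.0, 0.0], cov, size=n_sim)
    hits = [tost_pass(d0, se, margin) and tost_pass(d1, se, margin)
            for d0, d1 in diffs]
    return float(np.mean(hits))

p_uncorr = power_both(n=24, rho=0.0)
p_corr = power_both(n=24, rho=0.8)   # correlated endpoints tend to pass together
```

    The naive per-endpoint sample size ignores rho; as the comparison shows, positive correlation raises the joint power, so the correlation-adjusted sample size can be smaller.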

  9. Weighted least-square approach for simultaneous measurement of multiple reflective surfaces

    NASA Astrophysics Data System (ADS)

    Tang, Shouhong; Bills, Richard E.; Freischlad, Klaus

    2007-09-01

    Phase shifting interferometry (PSI) is a highly accurate method for measuring the nanometer-scale relative surface height of a semi-reflective test surface. PSI is used effectively in conjunction with Fizeau interferometers for optical testing, hard disk inspection, and semiconductor wafer flatness measurement. However, commonly used PSI algorithms are unable to produce an accurate phase measurement if more than one reflective surface is present in the Fizeau interferometer test cavity. Examples of test parts that fall into this category include lithography mask blanks and their protective pellicles, and plane-parallel optical beam splitters. The plane-parallel surfaces of these parts generate multiple interferograms that are superimposed in the recording plane of the Fizeau interferometer. When wavelength shifting is used in PSI, the phase shifting speed of each interferogram is proportional to the optical path difference (OPD) between its two reflective surfaces. The proposed method is able to differentiate each underlying interferogram from the others in an optimal manner. In this paper, we present a method for simultaneously measuring the multiple test surfaces of all underlying interferograms from the superimposed interferograms through the use of a weighted least-squares fitting technique. The theoretical analysis of the weighted least-squares technique and the measurement results are described.

  10. More Than Just Accuracy: A Novel Method to Incorporate Multiple Test Attributes in Evaluating Diagnostic Tests Including Point of Care Tests.

    PubMed

    Thompson, Matthew; Weigl, Bernhard; Fitzpatrick, Annette; Ide, Nicole

    2016-01-01

    Current frameworks for evaluating diagnostic tests are constrained by a focus on diagnostic accuracy, and assume that all aspects of the testing process and test attributes are discrete and equally important. Determining the balance between the benefits and harms associated with new or existing tests has been overlooked. Yet, this is critically important information for stakeholders involved in developing, testing, and implementing tests. This is particularly important for point of care tests (POCTs) where tradeoffs exist between numerous aspects of the testing process and test attributes. We developed a new model that multiple stakeholders (e.g., clinicians, patients, researchers, test developers, industry, regulators, and health care funders) can use to visualize the multiple attributes of tests, the interactions that occur between these attributes, and their impacts on health outcomes. We use multiple examples to illustrate interactions between test attributes (test availability, test experience, and test results) and outcomes, including several POCTs. The model could be used to prioritize research and development efforts, and inform regulatory submissions for new diagnostics. It could potentially provide a way to incorporate the relative weights that various subgroups or clinical settings might place on different test attributes. Our model provides a novel way that multiple stakeholders can use to visualize test attributes, their interactions, and impacts on individual and population outcomes. We anticipate that this will facilitate more informed decision making around diagnostic tests.

  11. An MLE method for finding LKB NTCP model parameters using Monte Carlo uncertainty estimates

    NASA Astrophysics Data System (ADS)

    Carolan, Martin; Oborn, Brad; Foo, Kerwyn; Haworth, Annette; Gulliford, Sarah; Ebert, Martin

    2014-03-01

    The aims of this work were to establish a program to fit NTCP models to clinical data with multiple toxicity endpoints, to test the method using a realistic test dataset, to compare three methods for estimating confidence intervals for the fitted parameters and to characterise the speed and performance of the program.

  12. Using a fuzzy comprehensive evaluation method to determine product usability: A test case.

    PubMed

    Zhou, Ronggang; Chan, Alan H S

    2017-01-01

    In order to take into account the inherent uncertainties during product usability evaluation, Zhou and Chan [1] proposed a comprehensive method of usability evaluation for products by combining the analytic hierarchy process (AHP) and fuzzy evaluation methods for synthesizing performance data and subjective response data. This method was designed to provide an integrated framework combining the inevitably vague judgments from the multiple stages of the product evaluation process. In order to illustrate the effectiveness of the model, this study used a summative usability test case to assess the application and strength of the general fuzzy usability framework. To test the proposed fuzzy usability evaluation framework [1], a standard summative usability test was conducted to benchmark the overall usability of a specific network management software package. Based on the test data, the fuzzy method was applied to incorporate both the usability scores and the uncertainties involved in the multiple components of the evaluation. Then, with Monte Carlo simulation procedures, confidence intervals were used to compare the reliability of the fuzzy approach with that of two typical conventional methods that combine percentage-based metrics. This case study showed that the fuzzy evaluation technique can be applied successfully for combining summative usability testing data to achieve an overall usability quality for the network software evaluated. The wider confidence intervals obtained with the two conventional methods, equally weighted percentage averaging and weighted percentage averaging, verified the strength of the fuzzy method.

  13. A Multiple Deficit Model of Reading Disability and Attention-Deficit/Hyperactivity Disorder: Searching for Shared Cognitive Deficits

    ERIC Educational Resources Information Center

    McGrath, Lauren M.; Pennington, Bruce F.; Shanahan, Michelle A.; Santerre-Lemmon, Laura E.; Barnard, Holly D.; Willcutt, Erik G.; DeFries, John C.; Olson, Richard K.

    2011-01-01

    Background: This study tests a multiple cognitive deficit model of reading disability (RD), attention-deficit/hyperactivity disorder (ADHD), and their comorbidity. Methods: A structural equation model (SEM) of multiple cognitive risk factors and symptom outcome variables was constructed. The model included phonological awareness as a unique…

  14. COMPARISON OF IMMUNOASSAY AND GAS CHROMATOGRAPHY/MASS SPECTROMETRY METHODS FOR MEASURING 3,5,6-TRICHLORO-2PYRIDINOL IN MULTIPLE SAMPLE MEDIA

    EPA Science Inventory

    Two enzyme-linked immunosorbent assay (ELISA) methods were evaluated for the determination of 3,5,6-trichloro-2-pyridinol (3,5,6-TCP) in multiple sample media (dust, soil, food, and urine). The dust and soil samples were analyzed by a commercial RaPID immunoassay testing kit. ...

  15. Fuzzy neural network technique for system state forecasting.

    PubMed

    Li, Dezhi; Wang, Wilson; Ismail, Fathy

    2013-10-01

    In many system state forecasting applications, the prediction is performed based on multiple datasets, each corresponding to a distinct system condition. The traditional methods for dealing with multiple datasets (e.g., vector autoregressive moving average models and neural networks) have some shortcomings, such as limited modeling capability and opaque reasoning operations. To tackle these problems, a novel fuzzy neural network (FNN) is proposed in this paper to effectively extract information from multiple datasets, so as to improve forecasting accuracy. The proposed predictor consists of both autoregressive (AR) nodes and nonlinear nodes; the AR models/nodes capture the linear correlation of the datasets, and the nonlinear correlation of the datasets is modeled with nonlinear neuron nodes. A novel particle swarm technique [i.e., the Laplace particle swarm (LPS) method] is proposed to facilitate parameter estimation for the predictor and improve modeling accuracy. The effectiveness of the developed FNN predictor and the associated LPS method is verified by a series of tests related to Mackey-Glass data forecasting, exchange rate data prediction, and gear system prognosis. Test results show that the developed FNN predictor and the LPS method can capture the dynamics of multiple datasets effectively and track system characteristics accurately.

  16. Use of ultrasonic array method for positioning multiple partial discharge sources in transformer oil.

    PubMed

    Xie, Qing; Tao, Junhan; Wang, Yongqiang; Geng, Jianghai; Cheng, Shuyi; Lü, Fangcheng

    2014-08-01

    Fast and accurate positioning of partial discharge (PD) sources in transformer oil is very important for the safe, stable operation of power systems because it allows timely elimination of insulation faults. There is usually more than one PD source once an insulation fault occurs in the transformer oil. This study, which has both theoretical and practical significance, proposes a method of identifying multiple PD sources in transformer oil. The method combines the two-sided correlation transformation algorithm for broadband signal focusing with the modified Gerschgorin disk estimator. The multiple signal classification method is used to determine the directions of arrival of signals from multiple PD sources. The ultrasonic array positioning method is based on multi-platform direction finding and global optimization searching. Both a 4 × 4 square planar ultrasonic sensor array and an ultrasonic array detection platform were built to test the method of identifying and positioning multiple PD sources. The obtained results verify the validity and the engineering practicability of this method.

  17. Improvements in cognition, quality of life, and physical performance with clinical Pilates in multiple sclerosis: a randomized controlled trial

    PubMed Central

    Küçük, Fadime; Kara, Bilge; Poyraz, Esra Çoşkuner; İdiman, Egemen

    2016-01-01

    [Purpose] The aim of this study was to determine the effects of clinical Pilates in multiple sclerosis patients. [Subjects and Methods] Twenty multiple sclerosis patients were enrolled in this study. The participants were divided into two groups: a clinical Pilates group and a control group. Cognition (Multiple Sclerosis Functional Composite), balance (Berg Balance Scale), physical performance (timed performance tests, Timed Up and Go test), tiredness (Modified Fatigue Impact Scale), depression (Beck Depression Inventory), and quality of life (Multiple Sclerosis International Quality of Life Questionnaire) were measured before and after treatment in all participants. [Results] In the clinical Pilates group, there were statistically significant differences between pre- and post-treatment scores for balance, timed performance, tiredness, and the Multiple Sclerosis Functional Composite. In the control group, we also found significant pre- to post-treatment differences in the timed performance tests, the Timed Up and Go test, and the Multiple Sclerosis Functional Composite. According to the difference analyses, there were significant differences in Multiple Sclerosis Functional Composite and Multiple Sclerosis International Quality of Life Questionnaire scores between the two groups in favor of the clinical Pilates group, and the between-group comparisons showed statistically significant clinical differences in favor of the clinical Pilates group. Clinical Pilates improved cognitive functions and quality of life compared with traditional exercise. [Conclusion] In multiple sclerosis treatment, clinical Pilates should be used as a holistic approach by physical therapists. PMID:27134355

  18. Improvements in cognition, quality of life, and physical performance with clinical Pilates in multiple sclerosis: a randomized controlled trial.

    PubMed

    Küçük, Fadime; Kara, Bilge; Poyraz, Esra Çoşkuner; İdiman, Egemen

    2016-03-01

    [Purpose] The aim of this study was to determine the effects of clinical Pilates in multiple sclerosis patients. [Subjects and Methods] Twenty multiple sclerosis patients were enrolled in this study. The participants were divided into two groups: a clinical Pilates group and a control group. Cognition (Multiple Sclerosis Functional Composite), balance (Berg Balance Scale), physical performance (timed performance tests, Timed Up and Go test), tiredness (Modified Fatigue Impact Scale), depression (Beck Depression Inventory), and quality of life (Multiple Sclerosis International Quality of Life Questionnaire) were measured before and after treatment in all participants. [Results] In the clinical Pilates group, there were statistically significant differences between pre- and post-treatment scores for balance, timed performance, tiredness, and the Multiple Sclerosis Functional Composite. In the control group, we also found significant pre- to post-treatment differences in the timed performance tests, the Timed Up and Go test, and the Multiple Sclerosis Functional Composite. According to the difference analyses, there were significant differences in Multiple Sclerosis Functional Composite and Multiple Sclerosis International Quality of Life Questionnaire scores between the two groups in favor of the clinical Pilates group, and the between-group comparisons showed statistically significant clinical differences in favor of the clinical Pilates group. Clinical Pilates improved cognitive functions and quality of life compared with traditional exercise. [Conclusion] In multiple sclerosis treatment, clinical Pilates should be used as a holistic approach by physical therapists.

  19. A Model-Based Method for Content Validation of Automatically Generated Test Items

    ERIC Educational Resources Information Center

    Zhang, Xinxin; Gierl, Mark

    2016-01-01

    The purpose of this study is to describe a methodology to recover the item model used to generate multiple-choice test items with a novel graph theory approach. Beginning with the generated test items and working backward to recover the original item model provides a model-based method for validating the content used to automatically generate test…

  20. Development of a Methodology for Successful Multigeneration Life-Cycle Testing of the Estuarine Sheepshead Minnow, Cyprinodon variegatus.

    EPA Science Inventory

    Sustainability of wildlife populations exposed to endocrine disrupting chemicals in natural water bodies has sparked sufficient concern that the U.S. EPA is developing methods for multiple generation exposures of fishes. Established testing methods and the short life-cycle of the ...

  1. A Comparative Study of Online Item Calibration Methods in Multidimensional Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Chen, Ping

    2017-01-01

    Calibration of new items online has been an important topic in item replenishment for multidimensional computerized adaptive testing (MCAT). Several online calibration methods have been proposed for MCAT, such as multidimensional "one expectation-maximization (EM) cycle" (M-OEM) and multidimensional "multiple EM cycles"…

  2. Inflatable straddle packers and associated equipment for hydraulic fracturing and hydrologic testing

    USGS Publications Warehouse

    Shuter, Eugene; Pemberton, Robert R.

    1978-01-01

    Independent aquifer testing is the only way to fully understand the hydrology encountered in boreholes intersecting multiple aquifers. The most feasible method to accomplish the testing of multiple-aquifer wells is through the use of inflatable packers. The straddle packers and associated equipment described herein are valuable tools for making isolated aquifer tests as well as for conducting hydraulic fracturing experiments. By design, the system permits multiple tests in a borehole without the necessity of tripping in and out of the hole to redress the packers prior to testing each zone. Electronic pressure transducers, the output of which was fed into strip-chart recorders, were used to monitor the zone being tested, as well as the zones above and below the packers. This was necessary to ensure that no leakage had occurred around the packers, which would cause hydraulic continuity between the isolated zones.

  3. Improved Statistical Methods Enable Greater Sensitivity in Rhythm Detection for Genome-Wide Data

    PubMed Central

    Hutchison, Alan L.; Maienschein-Cline, Mark; Chiang, Andrew H.; Tabei, S. M. Ali; Gudjonson, Herman; Bahroos, Neil; Allada, Ravi; Dinner, Aaron R.

    2015-01-01

    Robust methods for identifying patterns of expression in genome-wide data are important for generating hypotheses regarding gene function. To this end, several analytic methods have been developed for detecting periodic patterns. We improve one such method, JTK_CYCLE, by explicitly calculating the null distribution such that it accounts for multiple hypothesis testing and by including non-sinusoidal reference waveforms. We term this method empirical JTK_CYCLE with asymmetry search, and we compare its performance to JTK_CYCLE with Bonferroni and Benjamini-Hochberg multiple hypothesis testing correction, as well as to five other methods: cyclohedron test, address reduction, stable persistence, ANOVA, and F24. We find that ANOVA, F24, and JTK_CYCLE consistently outperform the other three methods when data are limited and noisy; empirical JTK_CYCLE with asymmetry search gives the greatest sensitivity while controlling for the false discovery rate. Our analysis also provides insight into experimental design and we find that, for a fixed number of samples, better sensitivity and specificity are achieved with higher numbers of replicates than with higher sampling density. Application of the methods to detecting circadian rhythms in a metadataset of microarrays that quantify time-dependent gene expression in whole heads of Drosophila melanogaster reveals annotations that are enriched among genes with highly asymmetric waveforms. These include a wide range of oxidation reduction and metabolic genes, as well as genes with transcripts that have multiple splice forms. PMID:25793520
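    The Benjamini-Hochberg correction compared above is simple to implement; a minimal Python sketch of the step-up procedure (illustrative only, not the JTK_CYCLE code) is:

```python
def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure: return a boolean list marking
    which hypotheses are rejected while controlling the FDR at level q."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    # Find the largest rank k with p_(k) <= (k / m) * q; reject everything
    # at or below that rank in the sorted order.
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        if pvals[idx] <= rank / m * q:
            k_max = rank
    rejected = [False] * m
    for rank, idx in enumerate(order, start=1):
        if rank <= k_max:
            rejected[idx] = True
    return rejected

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205]
print(benjamini_hochberg(pvals))   # rejects the first two hypotheses
# Bonferroni at the same level would reject only p <= 0.05 / 8 = 0.00625.
```

    The contrast with Bonferroni in the last comment shows why BH is less conservative: it compares each ordered p-value to an increasing threshold rather than a single fixed one.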

  4. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM 2.5 and PM −2.5.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... section. All reference method samplers shall be of single-filter design (not multi-filter, sequential sample design). Each candidate method shall be setup and operated in accordance with its associated... precision specified in table C-4 of this subpart. (g) Test for additive and multiplicative bias (comparative...

  5. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM 2.5 and PM −2.5.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... section. All reference method samplers shall be of single-filter design (not multi-filter, sequential sample design). Each candidate method shall be setup and operated in accordance with its associated... precision specified in table C-4 of this subpart. (g) Test for additive and multiplicative bias (comparative...

  6. Electrical characterization of a Mapham inverter using pulse testing techniques

    NASA Technical Reports Server (NTRS)

    Baumann, E. D.; Myers, I. T.; Hammoud, A. N.

    1990-01-01

    The use of a multiple-pulse testing technique to determine the electrical characteristics of large megawatt-level power systems for aerospace missions is proposed. An innovative test method based on the multiple-pulse technique is demonstrated on a 2-kW Mapham inverter, showing that large power systems can be characterized at electrical equilibrium and rated power without large, costly power supplies. The heat generation that occurs when systems are tested in a continuous mode is eliminated. The results indicate good agreement between this testing technique and steady-state testing.

  7. Accuracy of p53 Codon 72 Polymorphism Status Determined by Multiple Laboratory Methods: A Latent Class Model Analysis

    PubMed Central

    Walter, Stephen D.; Riddell, Corinne A.; Rabachini, Tatiana; Villa, Luisa L.; Franco, Eduardo L.

    2013-01-01

    Introduction Studies on the association of a polymorphism in codon 72 of the p53 tumour suppressor gene (rs1042522) with cervical neoplasia have inconsistent results. While several methods for genotyping p53 exist, they vary in accuracy and are often discrepant. Methods We used latent class models (LCM) to examine the accuracy of six methods for p53 determination, all conducted by the same laboratory. We also examined the association of p53 with cytological cervical abnormalities, recognising potential test inaccuracy. Results Pairwise disagreement between laboratory methods occurred approximately 10% of the time. Given the estimated true p53 status of each woman, we found that each laboratory method is most likely to classify a woman to her correct status. Arg/Arg women had the highest risk of squamous intraepithelial lesions (SIL). Test accuracy was independent of cytology. There was no strong evidence for correlations of test errors. Discussion Empirical analyses ignore possible laboratory errors, and so are inherently biased, but test accuracy estimated by the LCM approach is unbiased when model assumptions are met. LCM analysis avoids ambiguities arising from empirical test discrepancies, obviating the need to regard any of the methods as a “gold” standard measurement. The methods we presented here to analyse the p53 data can be applied in many other situations where multiple tests exist, but where none of them is a gold standard. PMID:23441193

  8. Identifying reprioritization response shift in a stroke caregiver population: a comparison of missing data methods.

    PubMed

    Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E

    2015-03-01

    Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect RS effects. Since missing data are often encountered in longitudinal HRQOL data, effective strategies for dealing with missing data are important to consider. This study aims to compare different imputation methods for the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization (EM) imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical power of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9% and 31.3% had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted, but statistically significant changes were detected on the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for the physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. Our simulations revealed that relative importance test procedures were least powerful under the complete-case analysis method and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and to the imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.
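    As a toy illustration of the simplest strategy compared above, mean imputation (the EM and multiple-imputation procedures are considerably more involved), with made-up numbers:

```python
import numpy as np

def mean_impute(X):
    """Replace NaNs in each column with that column's observed mean."""
    X = X.astype(float).copy()
    col_means = np.nanmean(X, axis=0)          # means over observed values only
    nan_rows, nan_cols = np.where(np.isnan(X))
    X[nan_rows, nan_cols] = col_means[nan_cols]
    return X

# Four respondents, two questionnaire domains, with two missing entries.
scores = np.array([
    [70.0,   55.0],
    [np.nan, 60.0],
    [80.0,   np.nan],
    [90.0,   65.0],
])
filled = mean_impute(scores)
print(filled[1, 0], filled[2, 1])   # 80.0 60.0 (the column means)
```

    Mean imputation preserves the sample size that complete-case analysis would discard, which is one reason the test procedures above were more powerful with imputation, though it understates variability relative to multiple imputation.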

  9. Design and analysis of multiple diseases genome-wide association studies without controls.

    PubMed

    Chen, Zhongxue; Huang, Hanwen; Ng, Hon Keung Tony

    2012-11-15

    In genome-wide association studies (GWAS), multiple diseases with shared controls is one common case-control study design. If data obtained from these studies are appropriately analyzed, this design can have several advantages, such as improving statistical power in detecting associations and reducing the time and cost of the data collection process. In this paper, we propose a study design for GWAS that involves multiple diseases but no controls, together with a corresponding statistical data analysis strategy. Through a simulation study, we show that the statistical association test with the proposed study design is more powerful than the test with a single disease sharing common controls, and it has power comparable to that of the overall test based on the whole dataset including the controls. We also apply the proposed method to a real GWAS dataset to illustrate the methodology and the advantages of the proposed design. Some possible limitations of this study design and testing method, and their solutions, are also discussed. Our findings indicate that the proposed study design and statistical analysis strategy can be more efficient than the usual case-control GWAS as well as those with shared controls. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Benefits of Multiple Methods for Evaluating HIV Counseling and Testing Sites in Pennsylvania.

    ERIC Educational Resources Information Center

    Encandela, John A.; Gehl, Mary Beth; Silvestre, Anthony; Schelzel, George

    1999-01-01

    Examines results from two methods used to evaluate publicly funded human immunodeficiency virus (HIV) counseling and testing in Pennsylvania. Results of written mail surveys of all sites and interviews from a random sample of 30 sites were similar in terms of questions posed and complementary in other ways. (SLD)

  11. Accurate and fast multiple-testing correction in eQTL studies.

    PubMed

    Sul, Jae Hoon; Raj, Towfique; de Jong, Simone; de Bakker, Paul I W; Raychaudhuri, Soumya; Ophoff, Roel A; Stranger, Barbara E; Eskin, Eleazar; Han, Buhm

    2015-06-04

    In studies of expression quantitative trait loci (eQTLs), it is of increasing interest to identify eGenes, the genes whose expression levels are associated with variation at a particular genetic variant. Detecting eGenes is important for follow-up analyses and prioritization because genes are the main entities in biological processes. To detect eGenes, one typically focuses on the genetic variant with the minimum p value among all variants in cis with a gene and corrects for multiple testing to obtain a gene-level p value. For performing multiple-testing correction, a permutation test is widely used. Because of the growing sample sizes of eQTL studies, however, the permutation test has become a computational bottleneck. In this paper, we propose an efficient approach for correcting for multiple testing and assessing eGene p values by utilizing a multivariate normal distribution. Our approach properly takes into account the linkage-disequilibrium structure among variants, and its time complexity is independent of sample size. By applying our small-sample correction techniques, our method achieves high accuracy in both small and large studies. We have shown that our method consistently produces extremely accurate p values (accuracy > 98%) for three human eQTL datasets with different sample sizes and SNP densities: the Genotype-Tissue Expression pilot dataset, the multi-region brain dataset, and the HapMap 3 dataset. Copyright © 2015 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
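    The permutation baseline that this multivariate-normal approach replaces can be sketched as follows (an illustrative toy, not the paper's implementation; the variant and sample counts are made up):

```python
import numpy as np

def gene_level_pvalue(genotypes, expression, n_perm=1000, seed=0):
    """Gene-level p-value by permutation: the statistic is the largest
    absolute correlation between expression and any cis variant (equivalent
    to taking the minimum per-variant p-value), and the null distribution
    is obtained by permuting the expression vector."""
    rng = np.random.default_rng(seed)
    g = (genotypes - genotypes.mean(0)) / genotypes.std(0)  # standardize variants

    def max_abs_corr(expr):
        e = (expr - expr.mean()) / expr.std()
        return np.max(np.abs(g.T @ e)) / len(e)            # max |Pearson r|

    observed = max_abs_corr(expression)
    exceed = sum(
        max_abs_corr(rng.permutation(expression)) >= observed
        for _ in range(n_perm)
    )
    return (exceed + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
genotypes = rng.integers(0, 3, size=(100, 20)).astype(float)  # 20 cis variants
expression = genotypes[:, 0] + rng.normal(size=100)           # variant 0 is causal
p = gene_level_pvalue(genotypes, expression)
print(p)   # small: the causal variant survives the gene-level correction
```

    The cost scales with both the permutation count and the sample size, which is exactly the bottleneck the analytic multivariate-normal correction avoids.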

  12. DendroBLAST: approximate phylogenetic trees in the absence of multiple sequence alignments.

    PubMed

    Kelly, Steven; Maini, Philip K

    2013-01-01

    The rapidly growing availability of genome information has created considerable demand for both fast and accurate phylogenetic inference algorithms. We present a novel method called DendroBLAST for reconstructing phylogenetic dendrograms/trees from protein sequences using BLAST. This method differs from other methods by incorporating a simple model of sequence evolution to test the effect of introducing sequence changes on the reliability of the bipartitions in the inferred tree. Using realistic simulated sequence data, we demonstrate that this method produces phylogenetic trees that are more accurate than other commonly used distance-based methods, though not as accurate as maximum likelihood methods applied to good-quality multiple sequence alignments. In addition to tests on simulated data, we use DendroBLAST to generate input trees for a supertree reconstruction of the phylogeny of the Archaea. This independent analysis produces an approximate phylogeny of the Archaea that has both high precision and high recall when compared to a previously published analysis of the same dataset using conventional methods. Taken together, these results demonstrate that approximate phylogenetic trees can be produced in the absence of multiple sequence alignments, and we propose that these trees will provide a platform for improving and informing downstream bioinformatic analysis. A web implementation of the DendroBLAST method is freely available for use at http://www.dendroblast.com/.

  13. Modifications to the Patient Rule-Induction Method that utilize non-additive combinations of genetic and environmental effects to define partitions that predict ischemic heart disease.

    PubMed

    Dyson, Greg; Frikke-Schmidt, Ruth; Nordestgaard, Børge G; Tybjaerg-Hansen, Anne; Sing, Charles F

    2009-05-01

    This article extends the Patient Rule-Induction Method (PRIM) for modeling cumulative incidence of disease developed by Dyson et al. (Genet Epidemiol 31:515-527) to include the simultaneous consideration of non-additive combinations of predictor variables, a significance test of each combination, an adjustment for multiple testing and a confidence interval for the estimate of the cumulative incidence of disease in each partition. We employ the partitioning algorithm component of the Combinatorial Partitioning Method to construct combinations of predictors, permutation testing to assess the significance of each combination, theoretical arguments for incorporating a multiple testing adjustment and bootstrap resampling to produce the confidence intervals. An illustration of this revised PRIM utilizing a sample of 2,258 European male participants from the Copenhagen City Heart Study is presented that assesses the utility of genetic variants in predicting the presence of ischemic heart disease beyond the established risk factors.

  14. Modifications to the Patient Rule-Induction Method that utilize non-additive combinations of genetic and environmental effects to define partitions that predict ischemic heart disease

    PubMed Central

    Dyson, Greg; Frikke-Schmidt, Ruth; Nordestgaard, Børge G.; Tybjærg-Hansen, Anne; Sing, Charles F.

    2009-01-01

    This paper extends the Patient Rule-Induction Method (PRIM) for modeling cumulative incidence of disease developed by Dyson et al. (2007) to include the simultaneous consideration of non-additive combinations of predictor variables, a significance test of each combination, an adjustment for multiple testing and a confidence interval for the estimate of the cumulative incidence of disease in each partition. We employ the partitioning algorithm component of the Combinatorial Partitioning Method (CPM) to construct combinations of predictors, permutation testing to assess the significance of each combination, theoretical arguments for incorporating a multiple testing adjustment and bootstrap resampling to produce the confidence intervals. An illustration of this revised PRIM utilizing a sample of 2258 European male participants from the Copenhagen City Heart Study is presented that assesses the utility of genetic variants in predicting the presence of ischemic heart disease beyond the established risk factors. PMID:19025787

  15. A Statistical Approach for Testing Cross-Phenotype Effects of Rare Variants

    PubMed Central

    Broadaway, K. Alaine; Cutler, David J.; Duncan, Richard; Moore, Jacob L.; Ware, Erin B.; Jhun, Min A.; Bielak, Lawrence F.; Zhao, Wei; Smith, Jennifer A.; Peyser, Patricia A.; Kardia, Sharon L.R.; Ghosh, Debashis; Epstein, Michael P.

    2016-01-01

    Increasing empirical evidence suggests that many genetic variants influence multiple distinct phenotypes. When cross-phenotype effects exist, multivariate association methods that consider pleiotropy are often more powerful than univariate methods that model each phenotype separately. Although several statistical approaches exist for testing cross-phenotype effects for common variants, there is a lack of similar tests for gene-based analysis of rare variants. In order to fill this important gap, we introduce a statistical method for cross-phenotype analysis of rare variants using a nonparametric distance-covariance approach that compares similarity in multivariate phenotypes to similarity in rare-variant genotypes across a gene. The approach can accommodate both binary and continuous phenotypes and further can adjust for covariates. Our approach yields a closed-form test whose significance can be evaluated analytically, thereby improving computational efficiency and permitting application on a genome-wide scale. We use simulated data to demonstrate that our method, which we refer to as the Gene Association with Multiple Traits (GAMuT) test, provides increased power over competing approaches. We also illustrate our approach using exome-chip data from the Genetic Epidemiology Network of Arteriopathy. PMID:26942286

  16. Detecting a Weak Association by Testing its Multiple Perturbations: a Data Mining Approach

    NASA Astrophysics Data System (ADS)

    Lo, Min-Tzu; Lee, Wen-Chung

    2014-05-01

    Many risk factors/interventions in epidemiologic/biomedical studies have minuscule effects. To detect such weak associations, one needs a study with a very large sample size (the number of subjects, n). The n of a study can be increased, but unfortunately only to an extent. Here, we propose a novel method which hinges on increasing sample size in a different direction: the total number of variables (p). We construct a p-based "multiple perturbation test", and conduct power calculations and computer simulations to show that it can achieve very high power to detect weak associations when p can be made very large. As a demonstration, we apply the method to analyze a genome-wide association study on age-related macular degeneration and identify two novel genetic variants that are significantly associated with the disease. The p-based method may set the stage for a new paradigm of statistical tests.

  17. Suppressing multiples using an adaptive multichannel filter based on L1-norm

    NASA Astrophysics Data System (ADS)

    Shi, Ying; Jing, Hongliang; Zhang, Wenwu; Ning, Dezhi

    2017-08-01

    Adaptive subtraction is an important step in removing surface-related multiples in wave equation-based methods. In this paper, we propose an adaptive multichannel subtraction method based on the L1-norm. We achieve enhanced compensation for the mismatch between the input seismogram and the predicted multiples in terms of amplitude, phase, frequency band, and travel time. Unlike the conventional L2-norm, the proposed method does not rely on the assumption that the primary and the multiples are orthogonal, and it also takes advantage of the fact that the L1-norm is more robust when dealing with outliers. In addition, we propose a frequency band extension via modulation to reconstruct the high frequencies and compensate for the frequency misalignment. We present a parallel computing scheme that accelerates the subtraction algorithm on graphics processing units (GPUs), which significantly reduces the computational cost. Synthetic and field seismic data tests show that the proposed method effectively suppresses the multiples.
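The role of the L1-norm can be seen in a single-channel matching-filter toy: estimate a short filter that shapes the predicted multiples to the data while minimizing the L1 misfit, so that spiky primaries are treated as outliers and left in the residual. The sketch below uses iteratively reweighted least squares (IRLS) as a stand-in solver; the paper's multichannel scheme, frequency-band extension, and GPU implementation are omitted.

```python
import numpy as np

def convolution_matrix(m, nf):
    # columns hold delayed copies of the predicted multiples m,
    # so that M @ f convolves m with a length-nf filter f
    n = len(m)
    M = np.zeros((n, nf))
    for j in range(nf):
        M[j:, j] = m[:n - j]
    return M

def l1_adaptive_subtraction(d, m, nf=11, n_iter=25, eps=1e-6):
    # estimate f minimizing ||d - M f||_1 by IRLS; return (primary, f)
    M = convolution_matrix(m, nf)
    f = np.linalg.lstsq(M, d, rcond=None)[0]  # L2 starting point
    for _ in range(n_iter):
        r = d - M @ f
        w = 1.0 / np.maximum(np.abs(r), eps)  # IRLS weights approximate L1
        A = M.T @ (M * w[:, None]) + 1e-8 * np.eye(nf)
        f = np.linalg.solve(A, (M.T * w) @ d)
    return d - M @ f, f
```

Because a spiky primary contributes little to the L1 objective, the recovered filter stays close to the true multiple-shaping filter instead of leaking primary energy, which is the failure mode the orthogonality assumption of L2 subtraction can produce.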

  18. Method of forming a multiple layer dielectric and a hot film sensor therewith

    NASA Technical Reports Server (NTRS)

    Hopson, Purnell, Jr. (Inventor); Tran, Sang Q. (Inventor)

    1990-01-01

    The invention is a method of forming a multiple layer dielectric for use in a hot-film laminar separation sensor. The multiple layer dielectric substrate is formed by depositing a first layer of a thermoplastic polymer on an electrically conductive substrate, such as the metal surface of a model to be tested under cryogenic conditions and high Reynolds numbers. Next, a second dielectric layer of fused silica is formed on the first dielectric layer of thermoplastic polymer. A resistive metal film is deposited on selected areas of the multiple layer dielectric substrate to form one or more hot-film sensor elements, to which aluminum electrical circuits deposited upon the multiple layer dielectric substrate are connected.

  19. Evaluation of multiple tracer methods to estimate low groundwater flow velocities.

    PubMed

    Reimus, Paul W; Arnold, Bill W

    2017-04-01

    Four different tracer methods were used to estimate groundwater flow velocity at a multiple-well site in the saturated alluvium south of Yucca Mountain, Nevada: (1) two single-well tracer tests with different rest or "shut-in" periods, (2) a cross-hole tracer test with an extended flow interruption, (3) a comparison of two tracer decay curves in an injection borehole with and without pumping of a downgradient well, and (4) a natural-gradient tracer test. Such tracer methods are potentially very useful for estimating groundwater velocities when hydraulic gradients are flat (and hence uncertain) and also when water level and hydraulic conductivity data are sparse, both of which were the case at this test location. The purpose of the study was to evaluate the first three methods for their ability to provide reasonable estimates of relatively low groundwater flow velocities in such low-hydraulic-gradient environments. The natural-gradient method is generally considered to be the most robust and direct method, so it was used to provide a "ground truth" velocity estimate. However, this method usually requires several wells, so it is often not practical in systems with large depths to groundwater and correspondingly high well installation costs. The fact that a successful natural gradient test was conducted at the test location offered a unique opportunity to compare the flow velocity estimates obtained by the more easily deployed and lower risk methods with the ground-truth natural-gradient method. The groundwater flow velocity estimates from the four methods agreed very well with each other, suggesting that the first three methods all provided reasonably good estimates of groundwater flow velocity at the site. The advantages and disadvantages of the different methods, as well as some of the uncertainties associated with them, are discussed. Published by Elsevier B.V.

  20. A two-step hierarchical hypothesis set testing framework, with applications to gene expression data on ordered categories

    PubMed Central

    2014-01-01

    Background In complex large-scale experiments, in addition to simultaneously considering a large number of features, multiple hypotheses are often being tested for each feature. This leads to a problem of multi-dimensional multiple testing. For example, in gene expression studies over ordered categories (such as time-course or dose-response experiments), interest is often in testing differential expression across several categories for each gene. In this paper, we consider a framework for testing multiple sets of hypotheses, which can be applied to a wide range of problems. Results We adopt the concept of the overall false discovery rate (OFDR) for controlling false discoveries at the hypothesis set level. Based on an existing procedure for identifying differentially expressed gene sets, we discuss a general two-step hierarchical hypothesis set testing procedure, which controls the overall false discovery rate under independence across hypothesis sets. In addition, we discuss the concept of the mixed-directional false discovery rate (mdFDR), and extend the general procedure to enable directional decisions for two-sided alternatives. We applied the framework to the case of microarray time-course/dose-response experiments, and proposed three procedures for testing differential expression and making multiple directional decisions for each gene. Simulation studies confirm the control of the OFDR and mdFDR by the proposed procedures under independence and positive correlations across genes. Simulation results also show that two of our new procedures achieve higher power than previous methods. Finally, the proposed methodology is applied to a microarray dose-response study, to identify 17 β-estradiol-sensitive genes in breast cancer cells that are induced at low concentrations. Conclusions The framework we discuss provides a platform for multiple testing procedures covering situations involving two (or potentially more) sources of multiplicity. 
The framework is easy to use and adaptable to various practical settings that frequently occur in large-scale experiments. Procedures generated from the framework are shown to maintain control of the OFDR and mdFDR, quantities that are especially relevant in the case of multiple hypothesis set testing. The procedures work well in both simulations and real datasets, and are shown to have better power than existing methods. PMID:24731138
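A schematic of the two-step hierarchy (a sketch, not the authors' exact procedure or its proofs): screen hypothesis sets with a Benjamini-Hochberg (BH) step on set-level screening p-values, then test within each selected set at a level scaled by the fraction of selected sets.

```python
import numpy as np

def bh_reject(pvals, alpha):
    # Benjamini-Hochberg step-up; returns a boolean rejection mask
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    below = p[order] <= alpha * np.arange(1, m + 1) / m
    k = (np.max(np.nonzero(below)[0]) + 1) if below.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return mask

def two_step_set_testing(set_pvals, within_pvals, alpha=0.05):
    # step 1: BH across hypothesis sets on their screening p-values;
    # step 2: BH within each selected set at level alpha * R / m,
    # where R of m sets survived screening
    m = len(set_pvals)
    selected = bh_reject(set_pvals, alpha)
    R = int(selected.sum())
    within = []
    for i in range(m):
        if selected[i]:
            within.append(bh_reject(within_pvals[i], alpha * R / m))
        else:
            within.append(np.zeros(len(within_pvals[i]), dtype=bool))
    return selected, within
```

In the gene-expression setting, each "set" would be one gene and the within-set hypotheses the pairwise comparisons across ordered categories; the scaled second-stage level is what lets the overall (set-level) error rate be controlled.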

  1. Resting-state fMRI data reflects default network activity rather than null data: A defense of commonly employed methods to correct for multiple comparisons.

    PubMed

    Slotnick, Scott D

    2017-07-01

    Analysis of functional magnetic resonance imaging (fMRI) data typically involves over one hundred thousand independent statistical tests; therefore, it is necessary to correct for multiple comparisons to control familywise error. In a recent paper, Eklund, Nichols, and Knutsson used resting-state fMRI data to evaluate commonly employed methods to correct for multiple comparisons and reported unacceptable rates of familywise error. Eklund et al.'s analysis was based on the assumption that resting-state fMRI data reflect null data; however, their 'null data' actually reflected default network activity that inflated familywise error. As such, Eklund et al.'s results provide no basis to question the validity of the thousands of published fMRI studies that have corrected for multiple comparisons or the commonly employed methods to correct for multiple comparisons.

  2. Optimization of OT-MACH Filter Generation for Target Recognition

    NASA Technical Reports Server (NTRS)

    Johnson, Oliver C.; Edens, Weston; Lu, Thomas T.; Chao, Tien-Hsin

    2009-01-01

    An automatic Optimum Trade-off Maximum Average Correlation Height (OT-MACH) filter generator for use in a gray-scale optical correlator (GOC) has been developed for improved target detection at JPL. While the OT-MACH filter has been shown to be an optimal filter for target detection, actually solving for the optimum is too computationally intensive for multiple targets. Instead, an adaptive step gradient descent method was tested to iteratively optimize the three OT-MACH parameters, alpha, beta, and gamma. The feedback for the gradient descent method was a composite of two performance measures: correlation peak height and peak-to-sidelobe ratio. The automated method generated and tested multiple filters in order to approach the optimal filter more quickly and reliably than the current manual method. Initial usage and testing has shown preliminary success at finding an approximation of the optimal filter in terms of alpha, beta, and gamma values. This corresponded to a substantial improvement in detection performance, in which the true positive rate increased for the same average number of false positives per image.
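The adaptive-step scheme can be sketched generically: take a numerical gradient of the feedback score, grow the step after an improving move, and shrink it after a worsening one. The quadratic score in the usage below is a toy stand-in for the composite of correlation peak height and peak-to-sidelobe ratio, which requires the actual correlator to evaluate.

```python
import numpy as np

def numerical_grad(f, x, h=1e-5):
    # central-difference gradient of a scalar score
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def adaptive_gradient_ascent(score, x0, step=0.1, n_iter=200,
                             grow=1.2, shrink=0.5):
    # grow the step after an improving move, shrink after a worsening one
    x = np.asarray(x0, dtype=float)
    best = score(x)
    for _ in range(n_iter):
        g = numerical_grad(score, x)
        cand = x + step * g
        val = score(cand)
        if val > best:
            x, best, step = cand, val, step * grow
        else:
            step *= shrink
            if step < 1e-10:
                break
    return x, best
```

For the filter-generation problem, `x` would hold (alpha, beta, gamma) and `score` would regenerate the OT-MACH filter and run it against the training imagery; that evaluation loop is exactly what the automated generator replaces relative to manual tuning.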

  3. Effects of Barometric Fluctuations on Well Water-Level Measurements and Aquifer Test Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spane, Frank A.

    1999-12-16

    This report examines the effects of barometric fluctuations on well water-level measurements and evaluates adjustment and removal methods for determining areal aquifer head conditions and aquifer test analysis. Two examples of Hanford Site unconfined aquifer tests are examined that demonstrate barometric response analysis and illustrate the predictive/removal capabilities of various methods for well water-level and aquifer total head values. Good predictive/removal characteristics were demonstrated with best corrective results provided by multiple-regression deconvolution methods.
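One common form of multiple-regression deconvolution regresses water-level changes on current and lagged barometric-pressure changes and subtracts the fitted barometric response. The sketch below illustrates that idea only; it is not necessarily the formulation evaluated in the report.

```python
import numpy as np

def barometric_correction(water, baro, max_lag=8):
    # regress water-level changes on current and lagged barometric-pressure
    # changes, then remove the fitted barometric response from the record
    dw = np.diff(water)
    db = np.diff(baro)
    n = len(dw) - max_lag
    X = np.column_stack(
        [db[max_lag - k: max_lag - k + n] for k in range(max_lag + 1)]
    )
    coef, *_ = np.linalg.lstsq(X, dw[max_lag:], rcond=None)
    corrected = np.array(water, dtype=float)
    corrected[max_lag + 1:] -= np.cumsum(X @ coef)
    return corrected, coef
```

The fitted lag coefficients describe the well's barometric response function; their cumulative sum is the barometric efficiency at increasing time lags, which is what distinguishes confined, unconfined, and wellbore-storage-dominated responses.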

  4. The Face-Symbol Test and the Symbol-Digit Test are not reliable surrogates for the Paced Auditory Serial Addition Test in multiple sclerosis.

    PubMed

    Williams, J; O'Rourke, K; Hutchinson, M; Tubridy, N

    2006-10-01

    The Paced Auditory Serial Addition Test (PASAT) is the chosen task for cognitive assessment in the multiple sclerosis functional composite (MSFC) and a widely used task in neuropsychological studies of people with multiple sclerosis (MS), but it is unpopular with patients. The Face-Symbol Test (FST) and the Symbol-Digit Test (SDT) are alternative methods of cognitive testing in MS that are easily administered and patient-friendly. In order to evaluate the potential of the FST as a possible surrogate for the PASAT, we directly compared the FST to the PASAT and the SDT in a cohort of 50 MS patients with varying levels of disability. There was a significant correlation between SDT and FST scores (Spearman's rho 0.80, 95% CI 0.66-0.88; R² = 65%), with moderate inter-test agreement (κ = 0.52). In contrast, SDT and FST scores were less predictive of PASAT scores. We concluded that neither the FST nor the SDT is a reliable surrogate for the PASAT.

  5. Partial F-tests with multiply imputed data in the linear regression framework via coefficient of determination.

    PubMed

    Chaurasia, Ashok; Harel, Ofer

    2015-02-10

    Tests for regression coefficients such as global, local, and partial F-tests are common in applied research. In the framework of multiple imputation, there are several papers addressing tests for regression coefficients. However, for simultaneous hypothesis testing, the existing methods are computationally intensive because they involve calculation with vectors and (inversion of) matrices. In this paper, we propose a simple method based on the scalar entity, coefficient of determination, to perform (global, local, and partial) F-tests with multiply imputed data. The proposed method is evaluated using simulated data and applied to suicide prevention data. Copyright © 2014 John Wiley & Sons, Ltd.
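The appeal of the scalar route is that a partial F-statistic needs only two coefficients of determination. The sketch below simply averages the per-imputation R² values before forming F; that averaging is a crude stand-in for the paper's pooling rule, and the column indices in the usage are illustrative.

```python
import numpy as np

def r_squared(X, y):
    # coefficient of determination for a least-squares fit with intercept
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

def partial_f_from_r2(r2_full, r2_reduced, n, k_full, q):
    # q = number of predictors dropped in the reduced model
    return ((r2_full - r2_reduced) / q) / ((1.0 - r2_full) / (n - k_full - 1))

def mi_partial_f(datasets, full_cols, reduced_cols, q):
    # datasets: list of (X, y) pairs, one per imputed data set;
    # average R-squared over imputations (a simplification), then form F
    r2f = np.mean([r_squared(X[:, full_cols], y) for X, y in datasets])
    r2r = np.mean([r_squared(X[:, reduced_cols], y) for X, y in datasets])
    n = len(datasets[0][1])
    return partial_f_from_r2(r2f, r2r, n, len(full_cols), q)
```

The key computational point stands regardless of the pooling rule: everything is scalar arithmetic on R² values, with no vectors to stack or matrices to invert across imputations.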

  6. Building Diversified Multiple Trees for classification in high dimensional noisy biomedical data.

    PubMed

    Li, Jiuyong; Liu, Lin; Liu, Jixue; Green, Ryan

    2017-12-01

    It is common that a trained classification model is applied to operating data that deviate from the training data because of noise. This paper tests an ensemble method, Diversified Multiple Trees (DMT), on its capability to classify instances in a new laboratory using the classifier built on the instances of another laboratory. DMT is tested on three real-world biomedical data sets from different laboratories in comparison with four benchmark ensemble methods: AdaBoost, Bagging, Random Forests, and Random Trees. Experiments have also been conducted to study the limitations of DMT and its possible variations. Experimental results show that DMT is significantly more accurate than the other benchmark ensemble classifiers when classifying new instances from a laboratory other than the one whose instances were used to build the classifier. This paper demonstrates that an ensemble classifier, DMT, is more robust in classifying noisy data than other widely used ensemble methods. DMT requires a data set that can support multiple simple trees.

  7. Using multiple group modeling to test moderators in meta-analysis.

    PubMed

    Schoemann, Alexander M

    2016-12-01

    Meta-analysis is a popular and flexible analysis that can be fit in many modeling frameworks. Two methods of fitting meta-analyses that are growing in popularity are structural equation modeling (SEM) and multilevel modeling (MLM). By using SEM or MLM to fit a meta-analysis researchers have access to powerful techniques associated with SEM and MLM. This paper details how to use one such technique, multiple group analysis, to test categorical moderators in meta-analysis. In a multiple group meta-analysis a model is fit to each level of the moderator simultaneously. By constraining parameters across groups any model parameter can be tested for equality. Using multiple groups to test for moderators is especially relevant in random-effects meta-analysis where both the mean and the between studies variance of the effect size may be compared across groups. A simulation study and the analysis of a real data set are used to illustrate multiple group modeling with both SEM and MLM. Issues related to multiple group meta-analysis and future directions for research are discussed. Copyright © 2016 John Wiley & Sons, Ltd.
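Outside SEM or MLM, the core group comparison can be sketched with inverse-variance pooling per moderator level and a Wald test on the difference of pooled means. This fixed-effect sketch ignores between-study variance, which the multiple-group approach described above can additionally compare across groups.

```python
import math
import numpy as np

def pooled_effect(effects, variances):
    # inverse-variance (fixed-effect) pooling within one moderator level
    w = 1.0 / np.asarray(variances, dtype=float)
    mu = np.sum(w * np.asarray(effects, dtype=float)) / np.sum(w)
    return mu, 1.0 / np.sum(w)

def two_group_moderator_test(group_a, group_b):
    # Wald-type test of equal pooled effects across two moderator levels:
    # z = (mu_a - mu_b) / sqrt(var_a + var_b), two-sided normal p-value
    mu_a, v_a = pooled_effect(*group_a)
    mu_b, v_b = pooled_effect(*group_b)
    z = (mu_a - mu_b) / math.sqrt(v_a + v_b)
    p = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided
    return z, p
```

In the multiple-group SEM/MLM framing, the same comparison is expressed by fitting the model to both groups simultaneously and testing an equality constraint on the mean parameter, which generalizes directly to constraints on the between-study variance.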

  8. Pleiotropy Analysis of Quantitative Traits at Gene Level by Multivariate Functional Linear Models

    PubMed Central

    Wang, Yifan; Liu, Aiyi; Mills, James L.; Boehnke, Michael; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao; Wu, Colin O.; Fan, Ruzong

    2015-01-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai–Bartlett trace, Hotelling–Lawley trace, and Wilks’s Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. PMID:25809955

  9. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    PubMed

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. © 2015 WILEY PERIODICALS, INC.
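The Pillai-Bartlett trace named above can be computed directly from the hypothesis and error sums-of-squares-and-cross-products (SSCP) matrices of an ordinary multivariate regression. This is a plain-regression sketch, not the functional linear model or the approximate F transformation of the paper.

```python
import numpy as np

def pillai_trace(Y, X):
    # Pillai-Bartlett trace for multivariate regression of traits Y (n x p)
    # on genetic variants X (n x q); an intercept is added internally
    n = Y.shape[0]
    X1 = np.column_stack([np.ones(n), X])
    P_full = X1 @ np.linalg.pinv(X1)      # hat matrix, full model
    P_null = np.full((n, n), 1.0 / n)     # hat matrix, intercept only
    H = Y.T @ (P_full - P_null) @ Y       # hypothesis SSCP
    E = Y.T @ (np.eye(n) - P_full) @ Y    # error SSCP
    return float(np.trace(np.linalg.solve(H + E, H)))
```

The trace equals the sum of squared canonical correlations between traits and variants, so it is bounded by min(p, q); joint modeling of the traits is what lets a shared genetic signal accumulate across them rather than being tested trait by trait.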

  10. EBUS-TBNA Provides Highest RNA Yield for Multiple Biomarker Testing from Routinely Obtained Small Biopsies in Non-Small Cell Lung Cancer Patients - A Comparative Study of Three Different Minimal Invasive Sampling Methods

    PubMed Central

    Schmid-Bindert, Gerald; Wang, Yongsheng; Jiang, Hongbin; Sun, Hui; Henzler, Thomas; Wang, Hao; Pilz, Lothar R.; Ren, Shengxiang; Zhou, Caicun

    2013-01-01

    Background Multiple biomarker testing is necessary to facilitate individualized treatment of lung cancer patients. More than 80% of lung cancers are diagnosed based on very small tumor samples, and often there is not enough tissue for molecular analysis. We compared three minimally invasive sampling methods with respect to RNA quantity for molecular testing. Methods 106 small biopsies were prospectively collected by three different methods: forceps biopsy, endobronchial ultrasound (EBUS) guided transbronchial needle aspiration (TBNA), and CT-guided core biopsy. Samples were split into two halves. One part was formalin fixed and paraffin embedded for standard pathological evaluation. The other part was put in RNAlater for immediate RNA/DNA extraction. If the pathologist confirmed the diagnosis of non-small cell lung cancer (NSCLC), the following molecular markers were tested: EGFR mutation, ERCC1, RRM1 and BRCA1. Results Overall, RNA extraction was possible in 101 out of 106 patients (95.3%). We found 49% adenocarcinomas, 38% squamous carcinomas, and 14% not otherwise specified (NOS). The highest RNA yield came from endobronchial ultrasound guided needle aspiration, which was significantly higher than bronchoscopy (37.74±41.09 vs. 13.74±15.53 ng, respectively; P = 0.005) and numerically higher than CT core biopsy (37.74±41.09 vs. 28.72±44.27 ng, respectively; P = 0.244). EGFR mutation testing was feasible in 100% of evaluable patients, and its incidence was 40.8%, 7.9% and 14.3% in the adenocarcinoma, squamous carcinoma and NSCLC NOS subgroups, respectively. There was no difference in the feasibility of molecular testing between the three sampling methods, with feasibility rates for ERCC1, RRM1 and BRCA1 of 91%, 87% and 81%, respectively. Conclusion All three methods can provide sufficient tumor material for multiple biomarker testing from routinely obtained small biopsies in lung cancer patients. 
In our study, EBUS-guided needle aspiration provided the highest amount of tumor RNA compared to bronchoscopy or CT-guided core biopsy. Thus, EBUS should be considered an acceptable option for tissue acquisition for molecular testing. PMID:24205040

  11. Modal survey of the space shuttle solid rocket motor using multiple input methods

    NASA Technical Reports Server (NTRS)

    Brillhart, Ralph; Hunt, David L.; Jensen, Brent M.; Mason, Donald R.

    1987-01-01

    The ability to accurately characterize propellant in a finite element model is a concern of engineers tasked with studying the dynamic response of the Space Shuttle Solid Rocket Motor (SRM). The uncertainties arising from propellant characterization through specimen testing led to the decision to perform a modal survey and model correlation of a single segment of the Shuttle SRM. Multiple input methods were used to excite and define case/propellant modes of both an inert segment and, later, a live propellant segment. These tests were successful at defining highly damped, flexible modes, several pairs of which occurred with frequency spacing of less than two percent.

  12. A comparison of two follow-up analyses after multiple analysis of variance, analysis of variance, and descriptive discriminant analysis: A case study of the program effects on education-abroad programs

    Treesearch

    Alvin H. Yu; Garry Chick

    2010-01-01

    This study compared the utility of two different post-hoc tests after detecting significant differences within factors on multiple dependent variables using multivariate analysis of variance (MANOVA). We compared the univariate F test (the Scheffé method) to descriptive discriminant analysis (DDA) using an educational-tour survey of university study-...

  13. Non-destructive testing of full-length bonded rock bolts based on HHT signal analysis

    NASA Astrophysics Data System (ADS)

    Shi, Z. M.; Liu, L.; Peng, M.; Liu, C. C.; Tao, F. J.; Liu, C. S.

    2018-04-01

    Full-length bonded rock bolts are commonly used in mining, tunneling and slope engineering because of their simple design and resistance to corrosion. However, the length of a rock bolt and grouting quality do not often meet the required design standards in practice because of the concealment and complexity of bolt construction. Non-destructive testing is preferred when testing a rock bolt's quality because of the convenience, low cost and wide detection range. In this paper, a signal analysis method for the non-destructive sound wave testing of full-length bonded rock bolts is presented, which is based on the Hilbert-Huang transform (HHT). First, we introduce the HHT analysis method to calculate the bolt length and identify defect locations based on sound wave reflection test signals, which includes decomposing the test signal via empirical mode decomposition (EMD), selecting the intrinsic mode functions (IMF) using the Pearson Correlation Index (PCI) and calculating the instantaneous phase and frequency via the Hilbert transform (HT). Second, six model tests are conducted using different grouting defects and bolt protruding lengths to verify the effectiveness of the HHT analysis method. Lastly, the influence of the bolt protruding length on the test signal, identification of multiple reflections from defects, bolt end and protruding end, and mode mixing from EMD are discussed. The HHT analysis method can identify the bolt length and grouting defect locations from signals that contain noise at multiple reflected interfaces. The reflection from the long protruding end creates an irregular test signal with many frequency peaks on the spectrum. The reflections from defects barely change the original signal because they are low energy, which cannot be adequately resolved using existing methods. 
The HHT analysis method can identify reflections from the long protruding end of the bolt and multiple reflections from grouting defects based on mutations in the instantaneous frequency, which makes weak reflections more noticeable. The mode mixing phenomenon is observed in several tests, but it does not markedly affect the identification results because of the simple medium in bolt tests. Mode mixing can be reduced by ensemble EMD (EEMD) or complete ensemble EMD with adaptive noise (CEEMDAN), which are powerful tools for analyzing the test signal in a complex medium and may play an important role in future studies. The HHT bolt signal analysis method is a self-adaptive and automatic process, which can be programmed as analysis software, making bolt tests more convenient in practice.
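The Hilbert-transform step of HHT maps a signal (in practice, each IMF from EMD) to an instantaneous phase and frequency, whose local mutations mark the reflection arrivals discussed above. A numpy-only sketch of the analytic-signal construction; EMD itself, IMF selection via the PCI, and the EEMD/CEEMDAN variants are omitted.

```python
import numpy as np

def analytic_signal(x):
    # analytic signal via the FFT construction (equivalent to adding
    # i times the Hilbert transform of x)
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def instantaneous_frequency(x, fs):
    # unwrap the analytic phase and differentiate to get frequency in Hz
    z = analytic_signal(x)
    phase = np.unwrap(np.angle(z))
    return np.diff(phase) * fs / (2.0 * np.pi)
```

For a mono-component signal (which is what EMD is meant to deliver), the instantaneous frequency is smooth between interfaces and jumps where a reflection overlaps the incident wave, which is the detection principle exploited in the bolt tests.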

  14. Walk a Mile in My Shoes: Stakeholder Accounts of Testing Experience with a Computer-Administered Test

    ERIC Educational Resources Information Center

    Fox, Janna; Cheng, Liying

    2015-01-01

    In keeping with the trend to elicit multiple stakeholder responses to operational tests as part of test validation, this exploratory mixed methods study examines test-taker accounts of an Internet-based (i.e., computer-administered) test in the high-stakes context of proficiency testing for university admission. In 2013, as language testing…

  15. Testing for measurement invariance and latent mean differences across methods: interesting incremental information from multitrait-multimethod studies

    PubMed Central

    Geiser, Christian; Burns, G. Leonard; Servera, Mateu

    2014-01-01

    Models of confirmatory factor analysis (CFA) are frequently applied to examine the convergent validity of scores obtained from multiple raters or methods in so-called multitrait-multimethod (MTMM) investigations. We show that interesting incremental information about method effects can be gained from including mean structures and tests of MI across methods in MTMM models. We present a modeling framework for testing MI in the first step of a CFA-MTMM analysis. We also discuss the relevance of MI in the context of four more complex CFA-MTMM models with method factors. We focus on three recently developed multiple-indicator CFA-MTMM models for structurally different methods [the correlated traits-correlated (methods – 1), latent difference, and latent means models; Geiser et al., 2014a; Pohl and Steyer, 2010; Pohl et al., 2008] and one model for interchangeable methods (Eid et al., 2008). We demonstrate that some of these models require or imply MI by definition for a proper interpretation of trait or method factors, whereas others do not, and explain why MI may or may not be required in each model. We show that in the model for interchangeable methods, testing for MI is critical for determining whether methods can truly be seen as interchangeable. We illustrate the theoretical issues in an empirical application to an MTMM study of attention deficit and hyperactivity disorder (ADHD) with mother, father, and teacher ratings as methods. PMID:25400603

  16. Evaluating Equating Results in the Non-Equivalent Groups with Anchor Test Design Using Equipercentile and Equity Criteria

    ERIC Educational Resources Information Center

    Duong, Minh Quang

    2011-01-01

    Testing programs often use multiple test forms of the same test to control item exposure and to ensure test security. Although test forms are constructed to be as similar as possible, they often differ. Test equating techniques are those statistical methods used to adjust scores obtained on different test forms of the same test so that they are…

  17. Robust Scale Transformation Methods in IRT True Score Equating under Common-Item Nonequivalent Groups Design

    ERIC Educational Resources Information Center

    He, Yong

    2013-01-01

    Common test items play an important role in equating multiple test forms under the common-item nonequivalent groups design. Inconsistent item parameter estimates among common items can lead to large bias in equated scores for IRT true score equating. Current methods extensively focus on detection and elimination of outlying common items, which…

  18. Advances in Testing the Statistical Significance of Mediation Effects

    ERIC Educational Resources Information Center

    Mallinckrodt, Brent; Abraham, W. Todd; Wei, Meifen; Russell, Daniel W.

    2006-01-01

    P. A. Frazier, A. P. Tix, and K. E. Barron (2004) highlighted a normal theory method popularized by R. M. Baron and D. A. Kenny (1986) for testing the statistical significance of indirect effects (i.e., mediator variables) in multiple regression contexts. However, simulation studies suggest that this method lacks statistical power relative to some…
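The normal-theory test referred to above is commonly operationalized as a Sobel-type z-statistic for the indirect effect a·b, and the percentile bootstrap is the higher-power alternative the simulation literature points to. A sketch of both; variable names and effect sizes are illustrative.

```python
import math
import numpy as np

def ols_slope(x, y):
    # slope and standard error from simple least squares
    x = np.asarray(x, float); y = np.asarray(y, float)
    xc = x - x.mean()
    b = xc @ (y - y.mean()) / (xc @ xc)
    resid = y - y.mean() - b * xc
    se = math.sqrt((resid @ resid) / (len(x) - 2) / (xc @ xc))
    return b, se

def mediation_paths(x, m, y):
    # a: x -> m;  b: m -> y controlling for x
    a, se_a = ols_slope(x, m)
    X = np.column_stack([np.ones(len(x)), m, x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    cov = np.linalg.inv(X.T @ X) * (resid @ resid) / (len(x) - 3)
    return a, se_a, coef[1], math.sqrt(cov[1, 1])

def sobel_z(a, se_a, b, se_b):
    # normal-theory z for the indirect effect a*b
    return (a * b) / math.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)

def bootstrap_indirect(x, m, y, n_boot=2000, seed=0):
    # percentile bootstrap confidence interval for a*b
    rng = np.random.default_rng(seed)
    n = len(x)
    ab = []
    for _ in range(n_boot):
        i = rng.integers(0, n, n)
        a, _, b, _ = mediation_paths(x[i], m[i], y[i])
        ab.append(a * b)
    return np.percentile(ab, [2.5, 97.5])
```

The power argument hinges on the fact that a·b is not normally distributed in small samples, so the bootstrap interval, which tracks the actual sampling distribution, rejects more often than the Sobel z when a true indirect effect exists.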

  19. Adaptive testing for multiple traits in a proportional odds model with applications to detect SNP-brain network associations.

    PubMed

    Kim, Junghi; Pan, Wei

    2017-04-01

    There has been increasing interest in developing more powerful and flexible statistical tests to detect genetic associations with multiple traits, as arising from neuroimaging genetic studies. Most existing methods treat a single trait or multiple traits as the response while treating an SNP as a predictor coded under an additive inheritance mode. In this paper, we follow an earlier approach in treating an SNP as an ordinal response while treating traits as predictors in a proportional odds model (POM). In this way, it is not only easier to handle mixed types of traits, e.g., some quantitative and some binary, but it is also potentially more robust to the commonly adopted additive inheritance mode. More importantly, we develop an adaptive test in a POM so that it can maintain high power across many possible situations. Compared to the existing methods treating multiple traits as responses, e.g., in a generalized estimating equation (GEE) approach, the proposed method can be applied to a high-dimensional setting where the number of phenotypes (p) can be larger than the sample size (n), in addition to the usual small-p setting. The promising performance of the proposed method was demonstrated with applications to the Alzheimer's Disease Neuroimaging Initiative (ADNI) data, in which either structural MRI driven phenotypes or resting-state functional MRI (rs-fMRI) derived brain functional connectivity measures were used as phenotypes. The applications led to the identification of several top SNPs of biological interest. Furthermore, simulation studies showed competitive performance of the new method, especially for p>n. © 2017 WILEY PERIODICALS, INC.

  20. Properties of a Formal Method to Model Emergence in Swarm-Based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.

  1. Statistical technique for analysing functional connectivity of multiple spike trains.

    PubMed

    Masud, Mohammad Shahed; Borisyuk, Roman

    2011-03-15

A new statistical technique, the Cox method, for analysing the functional connectivity of simultaneously recorded multiple spike trains is presented. The method is based on the theory of modulated renewal processes and estimates a vector of influence strengths from multiple spike trains (called reference trains) to a selected (target) spike train. Selecting another target spike train and recalculating the influence strengths from the reference spike trains enables researchers to find all functional connections among multiple spike trains. To study functional connectivity, an "influence function" is identified; this function captures the specificity of neuronal interactions and reflects the dynamics of the postsynaptic potential. In comparison to existing techniques, the Cox method has the following advantages: it does not use bins (it is a binless method); it is applicable when the sample size is small; it is sensitive enough to estimate weak influences; it supports the simultaneous analysis of multiple influences; and it can identify the correct connectivity scheme in difficult cases of "common source" or "indirect" connectivity. The Cox method has been thoroughly tested using multiple sets of data generated by a neural network model of leaky integrate-and-fire neurons with a prescribed architecture of connections. The results suggest that the method is highly successful for analysing the functional connectivity of simultaneously recorded multiple spike trains. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. integIRTy: a method to identify genes altered in cancer by accounting for multiple mechanisms of regulation using item response theory.

    PubMed

    Tong, Pan; Coombes, Kevin R

    2012-11-15

Identifying genes altered in cancer plays a crucial role both in understanding the mechanism of carcinogenesis and in developing novel therapeutics. It is known that various mechanisms of regulation can lead to gene dysfunction, including copy number change, methylation, abnormal expression and mutation. Nowadays, all of these types of alterations can be interrogated simultaneously by different types of assays. Although many methods have been proposed to identify altered genes from a single assay, no method can deal systematically with multiple assays accounting for different alteration types. In this article, we propose a novel method, integration using item response theory (integIRTy), to identify altered genes by using item response theory, which allows integrated analysis of multiple high-throughput assays. When applied to a single assay, the proposed method is more robust and reliable than conventional methods such as Student's t-test or the Wilcoxon rank-sum test. When used to integrate multiple assays, integIRTy can identify novel altered genes that cannot be found by looking at each individual assay separately. We applied integIRTy to three public cancer datasets (ovarian carcinoma, breast cancer, glioblastoma) for cross-assay-type integration, all of which show encouraging results. The R package integIRTy is available at http://bioinformatics.mdanderson.org/main/OOMPA:Overview. Contact: kcoombes@mdanderson.org. Supplementary data are available at Bioinformatics online.

  3. Categorical Variables in Multiple Regression: Some Cautions.

    ERIC Educational Resources Information Center

    O'Grady, Kevin E.; Medoff, Deborah R.

    1988-01-01

    Limitations of dummy coding and nonsense coding as methods of coding categorical variables for use as predictors in multiple regression analysis are discussed. The combination of these approaches often yields estimates and tests of significance that are not intended by researchers for inclusion in their models. (SLD)

  4. Equalization in Aeronautical Telemetry Using Multiple Antennas

    DTIC Science & Technology

    2014-04-01

Equalization in Aeronautical Telemetry Using Multiple Antennas. April 2014. DISTRIBUTION STATEMENT A: approved for public release; distribution unlimited. Test Resource Management Center… Contract number: W900KK-13-C-0026. Author: Michael… From the abstract: "…employing two transmit antennas and as a method for exploiting partial channel state information by the transmitter. The generalization involves…"

  5. Evaluating Equity at the Local Level Using Bootstrap Tests. Research Report 2016-4

    ERIC Educational Resources Information Center

    Kim, YoungKoung; DeCarlo, Lawrence T.

    2016-01-01

    Because of concerns about test security, different test forms are typically used across different testing occasions. As a result, equating is necessary in order to get scores from the different test forms that can be used interchangeably. In order to assure the quality of equating, multiple equating methods are often examined. Various equity…

  6. Testing a single regression coefficient in high dimensional linear models

    PubMed Central

    Zhong, Ping-Shou; Li, Runze; Wang, Hansheng; Tsai, Chih-Ling

    2017-01-01

    In linear regression models with high dimensional data, the classical z-test (or t-test) for testing the significance of each single regression coefficient is no longer applicable. This is mainly because the number of covariates exceeds the sample size. In this paper, we propose a simple and novel alternative by introducing the Correlated Predictors Screening (CPS) method to control for predictors that are highly correlated with the target covariate. Accordingly, the classical ordinary least squares approach can be employed to estimate the regression coefficient associated with the target covariate. In addition, we demonstrate that the resulting estimator is consistent and asymptotically normal even if the random errors are heteroscedastic. This enables us to apply the z-test to assess the significance of each covariate. Based on the p-value obtained from testing the significance of each covariate, we further conduct multiple hypothesis testing by controlling the false discovery rate at the nominal level. Then, we show that the multiple hypothesis testing achieves consistent model selection. Simulation studies and empirical examples are presented to illustrate the finite sample performance and the usefulness of the proposed method, respectively. PMID:28663668
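The false-discovery-rate step this abstract describes is commonly implemented with the Benjamini-Hochberg step-up procedure. A minimal sketch in pure Python (the p-values are illustrative, not from the paper):

```python
# Benjamini-Hochberg step-up procedure: sort the m p-values, find the
# largest rank k with p_(k) <= (k/m)*q, and reject the k hypotheses
# with the smallest p-values. Controls the FDR at level q for
# independent (or positively dependent) tests.
def benjamini_hochberg(pvals, q=0.05):
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * q:
            k_max = rank
    rejected = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            rejected[i] = True
    return rejected

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
print(benjamini_hochberg(pvals, q=0.05))
# → [True, True, False, False, False, False, False, False]
```

Note the step-up search: a p-value above its own threshold can still be rejected if a larger-ranked p-value satisfies its threshold, which is what makes the procedure less conservative than Bonferroni.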

  7. Testing a single regression coefficient in high dimensional linear models.

    PubMed

    Lan, Wei; Zhong, Ping-Shou; Li, Runze; Wang, Hansheng; Tsai, Chih-Ling

    2016-11-01

In linear regression models with high dimensional data, the classical z-test (or t-test) for testing the significance of each single regression coefficient is no longer applicable. This is mainly because the number of covariates exceeds the sample size. In this paper, we propose a simple and novel alternative by introducing the Correlated Predictors Screening (CPS) method to control for predictors that are highly correlated with the target covariate. Accordingly, the classical ordinary least squares approach can be employed to estimate the regression coefficient associated with the target covariate. In addition, we demonstrate that the resulting estimator is consistent and asymptotically normal even if the random errors are heteroscedastic. This enables us to apply the z-test to assess the significance of each covariate. Based on the p-value obtained from testing the significance of each covariate, we further conduct multiple hypothesis testing by controlling the false discovery rate at the nominal level. Then, we show that the multiple hypothesis testing achieves consistent model selection. Simulation studies and empirical examples are presented to illustrate the finite sample performance and the usefulness of the proposed method, respectively.

  8. Science Library of Test Items. Volume Three. Mastery Testing Programme. Introduction and Manual.

    ERIC Educational Resources Information Center

    New South Wales Dept. of Education, Sydney (Australia).

A set of short tests aimed at measuring student mastery of specific skills in the natural sciences is presented, along with a description of the mastery program's purposes, development, and methods. Mastery learning, criterion-referenced testing, and the scope of skills to be tested are defined. Each of the multiple choice tests for grades 7 through 10…

  9. High School Students' Concepts of Acids and Bases.

    ERIC Educational Resources Information Center

    Ross, Bertram H. B.

    An investigation of Ontario high school students' understanding of acids and bases with quantitative and qualitative methods revealed misconceptions. A concept map, based on the objectives of the Chemistry Curriculum Guideline, generated multiple-choice items and interview questions. The multiple-choice test was administered to 34 grade 12…

  10. Nutzwertanalyse

    Treesearch

    A. Henne

    1978-01-01

Nutzwertanalyse (NUWA) is a psychometric instrument for finding the best compromise in multiple-use planning in forestry when the multiple objectives cannot be expressed in the same physical or monetary unit. It ensures a systematic assessment of the consequences of proposed alternatives and thoroughly documents the decision process. The method leads to a ranking...

  11. Assessing the Impact of Influential Observations on Multiple Regression Analysis on Human Resource Research.

    ERIC Educational Resources Information Center

    Bates, Reid A.; Holton, Elwood F., III; Burnett, Michael F.

    1999-01-01

    A case study of learning transfer demonstrates the possible effect of influential observation on linear regression analysis. A diagnostic method that tests for violation of assumptions, multicollinearity, and individual and multiple influential observations helps determine which observation to delete to eliminate bias. (SK)

  12. Interlaboratory comparability, bias, and precision for four laboratories measuring constituents in precipitation, November 1982-August 1983

    USGS Publications Warehouse

    Brooks, M.H.; Schroder, L.J.; Malo, B.A.

    1985-01-01

Four laboratories were evaluated in their analysis of identical natural and simulated precipitation water samples. Interlaboratory comparability was evaluated using analysis of variance coupled with Duncan's multiple range test, and linear-regression models describing the relations between individual laboratory analytical results for natural precipitation samples. Results of the statistical analyses indicate that certain pairs of laboratories produce different results when analyzing identical samples. Analyte bias for each laboratory was examined using analysis of variance coupled with Duncan's multiple range test on data produced by the laboratories from the analysis of identical simulated precipitation samples. Bias for a given analyte produced by a single laboratory has been indicated when the laboratory mean for that analyte is shown to be significantly different from the mean for the most-probable analyte concentrations in the simulated precipitation samples. Ion-chromatographic methods for the determination of chloride, nitrate, and sulfate have been compared with the colorimetric methods that were also in use during the study period. Comparisons were made using analysis of variance coupled with Duncan's multiple range test for means produced by the two methods. Analyte precision for each laboratory has been estimated by calculating a pooled variance for each analyte. Analyte estimated precisions have been compared using F-tests and differences in analyte precisions for laboratory pairs have been reported. (USGS)
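The first stage of such an analysis, the one-way ANOVA that precedes a multiple range test, can be sketched in pure Python. The concentrations below are made-up illustrative values, not data from the study, and Duncan's multiple range test itself is not shown:

```python
from statistics import mean

def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square divided by
    within-group mean square, for k groups with n total observations."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical sulfate results (mg/L) from three labs analysing one sample:
lab_a = [2.1, 2.3, 2.2, 2.4, 2.2]
lab_b = [2.0, 2.1, 2.0, 2.2, 2.1]
lab_c = [2.6, 2.7, 2.5, 2.8, 2.6]

f = one_way_anova_f([lab_a, lab_b, lab_c])
# Compare f against the tabulated F(2, 12) critical value (about 3.89 at
# alpha = 0.05). A larger F says at least one lab mean differs; a multiple
# range test (Duncan's, or Tukey's HSD) would then locate which pairs.
print(f"F = {f:.1f}")
```

The multiple range test is only meaningful after the omnibus F rejects, which is the two-stage structure the abstract describes.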

  13. Iterative Usage of Fixed and Random Effect Models for Powerful and Efficient Genome-Wide Association Studies

    PubMed Central

    Liu, Xiaolei; Huang, Meng; Fan, Bin; Buckler, Edward S.; Zhang, Zhiwu

    2016-01-01

False positives in a Genome-Wide Association Study (GWAS) can be effectively controlled by a fixed effect and random effect Mixed Linear Model (MLM) that incorporates population structure and kinship among individuals to adjust association tests on markers; however, the adjustment also compromises true positives. The modified MLM method, Multiple Loci Linear Mixed Model (MLMM), incorporates multiple markers simultaneously as covariates in a stepwise MLM to partially remove the confounding between testing markers and kinship. To completely eliminate the confounding, we divided MLMM into two parts, a Fixed Effect Model (FEM) and a Random Effect Model (REM), and used them iteratively. FEM contains the testing markers, one at a time, and multiple associated markers as covariates to control false positives. To avoid the model over-fitting problem in FEM, the associated markers are estimated in REM by using them to define kinship. The P values of the testing markers and the associated markers are unified at each iteration. We named the new method Fixed and random model Circulating Probability Unification (FarmCPU). Both real and simulated data analyses demonstrated that FarmCPU improves statistical power compared to current methods. Additional benefits include a computing time that is linear in both the number of individuals and the number of markers: a dataset with half a million individuals and half a million markers can now be analyzed within three days. PMID:26828793

  14. The Voronoi Implicit Interface Method for computing multiphase physics.

    PubMed

    Saye, Robert I; Sethian, James A

    2011-12-06

    We introduce a numerical framework, the Voronoi Implicit Interface Method for tracking multiple interacting and evolving regions (phases) whose motion is determined by complex physics (fluids, mechanics, elasticity, etc.), intricate jump conditions, internal constraints, and boundary conditions. The method works in two and three dimensions, handles tens of thousands of interfaces and separate phases, and easily and automatically handles multiple junctions, triple points, and quadruple points in two dimensions, as well as triple lines, etc., in higher dimensions. Topological changes occur naturally, with no surgery required. The method is first-order accurate at junction points/lines, and of arbitrarily high-order accuracy away from such degeneracies. The method uses a single function to describe all phases simultaneously, represented on a fixed Eulerian mesh. We test the method's accuracy through convergence tests, and demonstrate its applications to geometric flows, accurate prediction of von Neumann's law for multiphase curvature flow, and robustness under complex fluid flow with surface tension and large shearing forces.

  15. Detection of ingested nitromethane and reliable creatinine assessment using multiple common analytical methods.

    PubMed

    Murphy, Christine M; Devlin, John J; Beuhler, Michael C; Cheifetz, Paul; Maynard, Susan; Schwartz, Michael D; Kacinko, Sherri

    2018-04-01

Nitromethane, found in fuels used for short distance racing, model cars, and model airplanes, produces a falsely elevated serum creatinine with standard creatinine analysis via the Jaffé method. Erroneous creatinine elevation often triggers extensive testing, leads to inaccurate diagnoses, and delayed or inappropriate medical interventions. Multiple reports in the literature identify "enzymatic assays" as an alternative method to detect the true value of creatinine, but this ambiguity does not help providers translate what type of enzymatic assay testing can be done in real time to determine if there is indeed false elevation. We report seven cases of ingested nitromethane where creatinine was determined via a Beckman Coulter® analyser using the Jaffé method, a Vitros® analyser, or i-Stat® point-of-care testing. Nitromethane was detected and semi-quantified using a common clinical toxic alcohol analysis method, and quantified by headspace-gas chromatography-mass spectrometry. When creatinine was determined using i-Stat® point-of-care testing or a Vitros® analyser, levels were within the normal range. Comparatively, all initial creatinine levels obtained via the Jaffé method were elevated. Nitromethane concentrations ranged from 42 to 310 μg/mL. These cases demonstrate reliable assessment of creatinine through other enzymatic methods using a Vitros® analyser or i-Stat®. Additionally, nitromethane is detectable and quantifiable using routine alcohols gas chromatography analysis and by headspace-gas chromatography-mass spectrometry.

  16. Online Testing Suffers Setbacks in Multiple States

    ERIC Educational Resources Information Center

    Davis, Michelle R.

    2013-01-01

    Widespread technical failures and interruptions of recent online testing in a number of states have shaken the confidence of educators and policymakers in high-tech assessment methods and raised serious concerns about schools' technological readiness for the coming common-core online tests. The glitches arose as many districts in the 46 states…

  17. Assessing Personality and Mood With Adjective Check List Methodology: A Review

    ERIC Educational Resources Information Center

    Craig, Robert J.

    2005-01-01

    This article addresses the benefits and problems in using adjective check list methodology to assess personality. Recent developments in this assessment method are reviewed, emphasizing seminal adjective-based personality tests (Gough's Adjective Check List), mood tests (Lubin's Depressive Adjective Test, Multiple Affect Adjective Check List),…

  18. A Comparison of Three Tests of Mediation

    ERIC Educational Resources Information Center

    Warbasse, Rosalia E.

    2009-01-01

    A simulation study was conducted to evaluate the performance of three tests of mediation: the bias-corrected and accelerated bootstrap (Efron & Tibshirani, 1993), the asymmetric confidence limits test (MacKinnon, 2008), and a multiple regression approach described by Kenny, Kashy, and Bolger (1998). The evolution of these methods is reviewed and…

  19. The Performance of IRT Model Selection Methods with Mixed-Format Tests

    ERIC Educational Resources Information Center

    Whittaker, Tiffany A.; Chang, Wanchen; Dodd, Barbara G.

    2012-01-01

    When tests consist of multiple-choice and constructed-response items, researchers are confronted with the question of which item response theory (IRT) model combination will appropriately represent the data collected from these mixed-format tests. This simulation study examined the performance of six model selection criteria, including the…

  20. A MULTIPLE TESTING OF THE ABC METHOD AND THE DEVELOPMENT OF A SECOND GENERATION MODEL. PART I, PRELIMINARY DISCUSSIONS OF METHODOLOGY. SUPPLEMENT, COMPUTER PROGRAMS OF THE HDL INFORMATION SYSTEMS.

    ERIC Educational Resources Information Center

    ALTMANN, BERTHOLD; BROWN, WILLIAM G.

The first-generation Approach by Concept (ABC) storage and retrieval method, a method which utilizes as a subject approach appropriate standardized English-language statements processed and printed in a permuted index format, underwent a performance test, the primary objective of which was to spot deficiencies and to develop a second-generation…

  1. An Empirical Comparison of Five Linear Equating Methods for the NEAT Design

    ERIC Educational Resources Information Center

    Suh, Youngsuk; Mroch, Andrew A.; Kane, Michael T.; Ripkey, Douglas R.

    2009-01-01

    In this study, a data base containing the responses of 40,000 candidates to 90 multiple-choice questions was used to mimic data sets for 50-item tests under the "nonequivalent groups with anchor test" (NEAT) design. Using these smaller data sets, we evaluated the performance of five linear equating methods for the NEAT design with five levels of…

  2. Evaluation of Two Methods for Modeling Measurement Errors When Testing Interaction Effects with Observed Composite Scores

    ERIC Educational Resources Information Center

    Hsiao, Yu-Yu; Kwok, Oi-Man; Lai, Mark H. C.

    2018-01-01

    Path models with observed composites based on multiple items (e.g., mean or sum score of the items) are commonly used to test interaction effects. Under this practice, researchers generally assume that the observed composites are measured without errors. In this study, we reviewed and evaluated two alternative methods within the structural…

  3. On testing an unspecified function through a linear mixed effects model with multiple variance components

    PubMed Central

    Wang, Yuanjia; Chen, Huaihou

    2012-01-01

We examine a generalized F-test of a nonparametric function through penalized splines and a linear mixed effects model representation. With a mixed effects model representation of penalized splines, we imbed the test of an unspecified function into a test of some fixed effects and a variance component in a linear mixed effects model with nuisance variance components under the null. The procedure can be used to test a nonparametric function or varying-coefficient with clustered data, compare two spline functions, test the significance of an unspecified function in an additive model with multiple components, and test a row or a column effect in a two-way analysis of variance model. Through a spectral decomposition of the residual sum of squares, we provide a fast algorithm for computing the null distribution of the test, which significantly improves the computational efficiency over bootstrap. The spectral representation reveals a connection between the likelihood ratio test (LRT) in a multiple variance components model and a single component model. We examine our methods through simulations, where we show that the power of the generalized F-test may be higher than the LRT, depending on the hypothesis of interest and the true model under the alternative. We apply these methods to compute the genome-wide critical value and p-value of a genetic association test in a genome-wide association study (GWAS), where the usual bootstrap is computationally intensive (up to 10^8 simulations) and asymptotic approximation may be unreliable and conservative. PMID:23020801
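The generalized F-test described above extends the classical nested-model F statistic. As a reminder of that baseline (a standard result, not the paper's specific construction):

```latex
% Classical F statistic comparing a reduced (null) model with residual
% sum of squares RSS_0 on df_0 degrees of freedom against a fuller model
% with RSS_1 on df_1 < df_0 degrees of freedom:
F = \frac{(\mathrm{RSS}_0 - \mathrm{RSS}_1)/(df_0 - df_1)}{\mathrm{RSS}_1/df_1}
```

Per the abstract, the generalized version embeds the unspecified function in fixed effects plus a variance component, and obtains the null distribution of the statistic from a spectral decomposition of the residual sum of squares rather than from an exact F reference distribution.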

  4. On testing an unspecified function through a linear mixed effects model with multiple variance components.

    PubMed

    Wang, Yuanjia; Chen, Huaihou

    2012-12-01

We examine a generalized F-test of a nonparametric function through penalized splines and a linear mixed effects model representation. With a mixed effects model representation of penalized splines, we imbed the test of an unspecified function into a test of some fixed effects and a variance component in a linear mixed effects model with nuisance variance components under the null. The procedure can be used to test a nonparametric function or varying-coefficient with clustered data, compare two spline functions, test the significance of an unspecified function in an additive model with multiple components, and test a row or a column effect in a two-way analysis of variance model. Through a spectral decomposition of the residual sum of squares, we provide a fast algorithm for computing the null distribution of the test, which significantly improves the computational efficiency over bootstrap. The spectral representation reveals a connection between the likelihood ratio test (LRT) in a multiple variance components model and a single component model. We examine our methods through simulations, where we show that the power of the generalized F-test may be higher than the LRT, depending on the hypothesis of interest and the true model under the alternative. We apply these methods to compute the genome-wide critical value and p-value of a genetic association test in a genome-wide association study (GWAS), where the usual bootstrap is computationally intensive (up to 10^8 simulations) and asymptotic approximation may be unreliable and conservative. © 2012, The International Biometric Society.

  5. A simple test of association for contingency tables with multiple column responses.

    PubMed

    Decady, Y J; Thomas, D R

    2000-09-01

    Loughin and Scherer (1998, Biometrics 54, 630-637) investigated tests of association in two-way tables when one of the categorical variables allows for multiple-category responses from individual respondents. Standard chi-squared tests are invalid in this case, and they developed a bootstrap test procedure that provides good control of test levels under the null hypothesis. This procedure and some others that have been proposed are computationally involved and are based on techniques that are relatively unfamiliar to many practitioners. In this paper, the methods introduced by Rao and Scott (1981, Journal of the American Statistical Association 76, 221-230) for analyzing complex survey data are used to develop a simple test based on a corrected chi-squared statistic.
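For reference, the uncorrected Pearson statistic that such corrections adjust is straightforward to compute. This sketch (with made-up counts) is the standard single-response version, not the corrected statistic of the paper:

```python
def pearson_chi2(table):
    """Pearson chi-squared statistic for an r x c contingency table.

    Valid as-is only when each respondent contributes to exactly one
    cell; with multiple-category ("pick any") responses the cells are
    not independent, and the statistic must be corrected (e.g. the
    Rao-Scott-style adjustment described above) before comparison with
    a chi-squared reference distribution.
    """
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    n = sum(row_tot)
    return sum(
        (table[i][j] - row_tot[i] * col_tot[j] / n) ** 2
        / (row_tot[i] * col_tot[j] / n)
        for i in range(len(table))
        for j in range(len(table[0]))
    )

# Illustrative 2 x 3 table (made-up counts):
print(round(pearson_chi2([[20, 30, 50], [30, 30, 40]]), 3))  # → 3.111
```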

  6. Automated detection of age-related macular degeneration in OCT images using multiple instance learning

    NASA Astrophysics Data System (ADS)

    Sun, Weiwei; Liu, Xiaoming; Yang, Zhou

    2017-07-01

Age-related Macular Degeneration (AMD) is a macular disease which mostly occurs in older people, and it may cause decreased vision or even lead to permanent blindness. Drusen is an important clinical indicator for AMD which can help doctors diagnose the disease and decide on a treatment strategy. Optical Coherence Tomography (OCT) is widely used in the diagnosis of ophthalmic diseases, including AMD. In this paper, we propose a classification method based on Multiple Instance Learning (MIL) to detect AMD. Because drusen may appear in only a few slices of an OCT volume, MIL is well suited to this task. The method has two phases: a training phase and a testing phase. We extract features from training images and cluster them to create a codebook, then apply the trained classifier to the test set. Experimental results show that our method achieves high accuracy and effectiveness.

  7. Does rational selection of training and test sets improve the outcome of QSAR modeling?

    PubMed

    Martin, Todd M; Harten, Paul; Young, Douglas M; Muratov, Eugene N; Golbraikh, Alexander; Zhu, Hao; Tropsha, Alexander

    2012-10-22

Prior to using a quantitative structure-activity relationship (QSAR) model for external predictions, its predictive power should be established and validated. In the absence of a true external data set, the best way to validate the predictive ability of a model is to perform its statistical external validation. In statistical external validation, the overall data set is divided into training and test sets. Commonly, this splitting is performed using random division. Rational splitting methods can divide data sets into training and test sets in an intelligent fashion. The purpose of this study was to determine whether rational division methods lead to more predictive models compared to random division. A special data splitting procedure was used to facilitate the comparison between random and rational division methods. For each toxicity end point, the overall data set was divided into a modeling set (80% of the overall set) and an external evaluation set (20% of the overall set) using random division. The modeling set was then subdivided into a training set (80% of the modeling set) and a test set (20% of the modeling set) using rational division methods and by using random division. The Kennard-Stone, minimal test set dissimilarity, and sphere exclusion algorithms were used as the rational division methods. The hierarchical clustering, random forest, and k-nearest neighbor (kNN) methods were used to develop QSAR models based on the training sets. For kNN QSAR, multiple training and test sets were generated, and multiple QSAR models were built. The results of this study indicate that models based on rational division methods generate better statistical results for the test sets than models based on random division, but the predictive power of both types of models is comparable.
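Of the rational splitting methods compared, Kennard-Stone is the easiest to sketch: seed the training set with the two most distant points, then repeatedly add the candidate whose nearest already-selected point is farthest away. A minimal pure-Python sketch (not the authors' implementation):

```python
def kennard_stone(X, n_train):
    """Return n_train row indices of X selected by the Kennard-Stone
    (maximin) algorithm. X is a list of feature vectors."""
    def d2(a, b):
        # squared Euclidean distance
        return sum((x - y) ** 2 for x, y in zip(a, b))

    # seed with the pair of points at maximum distance
    n = len(X)
    i0, j0 = max(
        ((i, j) for i in range(n) for j in range(i + 1, n)),
        key=lambda p: d2(X[p[0]], X[p[1]]),
    )
    selected = [i0, j0]
    remaining = [k for k in range(n) if k not in selected]
    while len(selected) < n_train:
        # maximin step: farthest point from its nearest selected neighbour
        k = max(remaining, key=lambda r: min(d2(X[r], X[s]) for s in selected))
        selected.append(k)
        remaining.remove(k)
    return selected

X = [[0, 0], [0, 1], [5, 5], [10, 10], [1, 0]]
print(kennard_stone(X, 3))  # → [0, 3, 2]: the two extremes, then the midpoint
```

The O(n^2) seeding makes this sketch impractical for large data sets; production implementations use vectorized distance matrices, but the selection logic is the same.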

  8. Detecting Renibacterium salmoninarum in wild brown trout by use of multiple organ samples and diagnostic methods

    USGS Publications Warehouse

    Guomundsdottir, S.; Applegate, Lynn M.; Arnason, I.O.; Kristmundsson, A.; Purcell, Maureen K.; Elliott, Diane G.

    2017-01-01

    Renibacterium salmoninarum, the causative agent of salmonid bacterial kidney disease (BKD), is endemic in many wild trout species in northerly regions. The aim of the present study was to determine the optimal R. salmoninarum sampling/testing strategy for wild brown trout (Salmo trutta L.) populations in Iceland. Fish were netted in a lake and multiple organs—kidney, spleen, gills, oesophagus and mid-gut—were sampled and subjected to five detection tests i.e. culture, polyclonal enzyme-linked immunosorbent assay (pELISA) and three different PCR tests. The results showed that each fish had encountered R. salmoninarum but there were marked differences between results obtained depending on organ and test. The bacterium was not cultured from any kidney sample while all kidney samples were positive by pELISA. At least one organ from 92.9% of the fish tested positive by PCR. The results demonstrated that the choice of tissue and diagnostic method can dramatically influence the outcome of R. salmoninarum surveys.

  9. Testing Measurement Invariance across Groups of Children with and without Attention-Deficit/ Hyperactivity Disorder: Applications for Word Recognition and Spelling Tasks

    PubMed Central

    Lúcio, Patrícia S.; Salum, Giovanni; Swardfager, Walter; Mari, Jair de Jesus; Pan, Pedro M.; Bressan, Rodrigo A.; Gadelha, Ary; Rohde, Luis A.; Cogo-Moreira, Hugo

    2017-01-01

    Although studies have consistently demonstrated that children with attention-deficit/hyperactivity disorder (ADHD) perform significantly lower than controls on word recognition and spelling tests, such studies rely on the assumption that those groups are comparable in these measures. This study investigates comparability of word recognition and spelling tests based on diagnostic status for ADHD through measurement invariance methods. The participants (n = 1,935; 47% female; 11% ADHD) were children aged 6–15 with normal IQ (≥70). Measurement invariance was investigated through Confirmatory Factor Analysis and Multiple Indicators Multiple Causes models. Measurement invariance was attested in both methods, demonstrating the direct comparability of the groups. Children with ADHD were 0.51 SD lower in word recognition and 0.33 SD lower in spelling tests than controls. Results suggest that differences in performance on word recognition and spelling tests are related to true mean differences based on ADHD diagnostic status. Implications for clinical practice and research are discussed. PMID:29118733

  11. Statistical testing and power analysis for brain-wide association study.

    PubMed

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

    The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, multiple-comparison correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and the false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons and can thus efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study: our approach can identify altered functional connectivities in a major depressive disorder dataset where existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.
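
    The Bonferroni/FDR comparison above can be made concrete with a minimal sketch of the two classical corrections (not the paper's random-field method); the p-values below are hypothetical:

```python
import numpy as np

def bonferroni(pvals, alpha=0.05):
    """Reject H0 where p < alpha/m; controls the FWER under any dependence."""
    pvals = np.asarray(pvals)
    return pvals < alpha / pvals.size

def benjamini_hochberg(pvals, alpha=0.05):
    """BH step-up procedure; controls the FDR and is less conservative."""
    pvals = np.asarray(pvals)
    m = pvals.size
    order = np.argsort(pvals)
    thresh = alpha * np.arange(1, m + 1) / m
    below = pvals[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()   # largest i with p_(i) <= alpha*i/m
        reject[order[:k + 1]] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3, 0.9]
print(bonferroni(pvals).sum())           # prints 1: only the smallest p survives
print(benjamini_hochberg(pvals).sum())   # prints 2: BH admits one more test
```

    Both corrections ignore spatial structure, which is exactly the information the paper's random-field approach exploits to gain power.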

  12. SDG and qualitative trend based model multiple scale validation

    NASA Astrophysics Data System (ADS)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness, are carried out at only a single scale, and depend on human experience. A multiple-scale validation method based on the SDG (Signed Directed Graph) and qualitative trends is therefore proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference, and multiple-scale validation is carried out by comparing these testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the approach is demonstrated by validating a reactor model.

  13. Development of a multiple immunoaffinity column for simultaneous determination of multiple mycotoxins in feeds using UPLC-MS/MS.

    PubMed

    Hu, Xiaofeng; Hu, Rui; Zhang, Zhaowei; Li, Peiwu; Zhang, Qi; Wang, Min

    2016-09-01

    A sensitive and specific immunoaffinity column to clean up and isolate multiple mycotoxins was developed along with a rapid one-step sample preparation procedure for ultra-performance liquid chromatography-tandem mass spectrometry analysis. Monoclonal antibodies against aflatoxin B1, aflatoxin B2, aflatoxin G1, aflatoxin G2, zearalenone, ochratoxin A, sterigmatocystin, and T-2 toxin were coupled to microbeads for mycotoxin purification. We optimized a homogenization and extraction procedure as well as column loading and elution conditions to maximize recoveries from complex feed matrices. This method allowed rapid, simple, and simultaneous determination of mycotoxins in feeds with a single chromatographic run. Detection limits for these toxins ranged from 0.006 to 0.12 ng mL⁻¹, and quantitation limits ranged from 0.06 to 0.75 ng mL⁻¹. Concentration curves were linear from 0.12 to 40 μg kg⁻¹ with correlation coefficients of R² > 0.99. Intra-assay and inter-assay comparisons indicated excellent repeatability and reproducibility of the multiple immunoaffinity columns. As a proof of principle, 80 feed samples were tested and several contained multiple mycotoxins. This method is sensitive, rapid, and durable enough for multiple mycotoxin determinations that fulfill European Union and Chinese testing criteria.

  14. Methods for detecting long-term CNS dysfunction after prenatal exposure to neurotoxins.

    PubMed

    Vorhees, C V

    1997-11-01

    Current U.S. Environmental Protection Agency regulatory guidelines for developmental neurotoxicity emphasize functional categories such as motor activity, auditory startle, and learning and memory. A single test of some simple form of learning and memory is accepted to meet the latter category. The rationale for this emphasis has been that sensitive and reliable methods for assessing complex learning and memory are either not available or are too burdensome, and that insufficient data exist to endorse one approach over another. There has been little discussion of the fact that learning and memory is not a single identifiable functional category and that no single test can assess all types of learning and memory. Three methods for assessing complex learning and memory are presented; they assess two different types of learning and memory, are relatively efficient to conduct, and are sensitive to several known neurobehavioral teratogens. The tests are a 9-unit multiple-T swimming maze and the Morris and Barnes mazes. The first of these assesses sequential learning, while the latter two assess spatial learning. A description of each test is provided, along with procedures for their use, and data exemplifying effects obtained using developmental exposure to phenytoin, methamphetamine, and MDMA. It is argued that multiple tests of learning and memory are required to ascertain cognitive deficits, something no single method can accomplish. Methods for acoustic startle are also presented.

  15. Automated optimal coordination of multiple-DOF neuromuscular actions in feedforward neuroprostheses.

    PubMed

    Lujan, J Luis; Crago, Patrick E

    2009-01-01

    This paper describes a new method for designing feedforward controllers for multiple-muscle, multiple-DOF, motor system neural prostheses. The design process is based on experimental measurement of the forward input/output properties of the neuromechanical system and numerical optimization of stimulation patterns to meet muscle coactivation criteria, thus resolving the muscle-redundancy (i.e., overcontrol) and coupled-DOF problems inherent in neuromechanical systems. We designed feedforward controllers to control the isometric forces at the tip of the thumb in two directions during stimulation of three thumb muscles as a model system. We tested the method experimentally in ten able-bodied individuals and one patient with spinal cord injury. Good control of isometric force in both DOFs was observed, with rms errors less than 10% of the force range in seven experiments and statistically significant correlations between the actual and target forces in all ten experiments. Systematic bias and slope errors were observed in a few experiments, likely due to neuromuscular fatigue. Overall, the tests demonstrated the ability of a general design approach to satisfy both control and coactivation criteria in multiple-muscle, multiple-axis neuromechanical systems, and the approach is applicable to a wide range of neuromechanical systems and stimulation electrodes.
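
    The muscle-redundancy problem above can be illustrated with a toy example: a linear map from three muscle activations to two force DOFs has infinitely many solutions, and a selection criterion is needed to pick one. The matrix and the minimum-norm criterion below are hypothetical illustrations, not the paper's measured model or its coactivation rule:

```python
import numpy as np

# Hypothetical linear map: 2 force DOFs (rows) from 3 muscle activations (cols)
A = np.array([[1.0, 0.5, 0.0],
              [0.0, 0.5, 1.0]])

target = np.array([0.8, 0.6])        # desired isometric tip forces

# Minimum-norm activation: the pseudoinverse resolves the underdetermined
# (redundant) system by picking the smallest-activation solution.
u = np.linalg.pinv(A) @ target
u = np.clip(u, 0.0, 1.0)             # stimulation levels bounded to [0, 1]

print(np.round(u, 3))
print(np.allclose(A @ u, target))    # True: both force DOFs hit the target
```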

  16. Fault diagnosis of sensor networked structures with multiple faults using a virtual beam based approach

    NASA Astrophysics Data System (ADS)

    Wang, H.; Jing, X. J.

    2017-07-01

    This paper presents a virtual beam based approach suitable for conducting diagnosis of multiple faults in complex structures with limited prior knowledge of the faults involved. The "virtual beam", a recently proposed concept for fault detection in complex structures, is applied; it consists of a chain of sensors representing a vibration energy transmission path embedded in the complex structure. Statistical tests and adaptive thresholds are adopted for fault detection because of the limited prior knowledge of normal operational conditions and fault conditions. To isolate multiple faults within a specific structure or a substructure of a more complex one, a "biased running" strategy is developed and embedded within the bacterial-based optimization method to construct effective virtual beams and thus improve localization accuracy. The proposed method is easy and efficient to implement for multiple-fault localization with limited prior knowledge of normal conditions and faults. Extensive experimental results validate that the proposed method can localize both single and multiple faults more effectively than the classical trust index subtract on negative add on positive (TI-SNAP) method.

  17. The use of multiple imputation for the accurate measurements of individual feed intake by electronic feeders.

    PubMed

    Jiao, S; Tiezzi, F; Huang, Y; Gray, K A; Maltecca, C

    2016-02-01

    Obtaining accurate individual feed intake records is the key first step in achieving genetic progress toward more efficient nutrient utilization in pigs. Feed intake records collected by electronic feeding systems contain errors (erroneous and abnormal values exceeding certain cutoff criteria), which are due to feeder malfunction or animal-feeder interaction. In this study, we examined the use of a novel data-editing strategy involving multiple imputation to minimize the impact of errors and missing values on the quality of feed intake data collected by an electronic feeding system. Accuracy of feed intake data adjustment obtained from the conventional linear mixed model (LMM) approach was compared with 2 alternative implementations of multiple imputation by chained equation, denoted as MI (multiple imputation) and MICE (multiple imputation by chained equation). The 3 methods were compared under 3 scenarios, where 5, 10, and 20% feed intake error rates were simulated. Each of the scenarios was replicated 5 times. Accuracy of the alternative error adjustment was measured as the correlation between the true daily feed intake (DFI; daily feed intake in the testing period) or true ADFI (the mean DFI across the testing period) and the adjusted DFI or adjusted ADFI. In the editing process, error cutoff criteria are used to define whether a feed intake visit contains errors. To investigate the possibility that the error cutoff criteria may affect any of the 3 methods, the simulation was repeated with 2 alternative error cutoff values. Multiple imputation methods outperformed the LMM approach in all scenarios, with mean accuracies of 96.7, 93.5, and 90.2% obtained with MI and 96.8, 94.4, and 90.1% obtained with MICE, compared with 91.0, 82.6, and 68.7% using LMM for DFI. Similar results were obtained for ADFI. Furthermore, multiple imputation methods consistently performed better than LMM regardless of the cutoff criteria applied to define errors. In conclusion, multiple imputation is proposed as a more accurate and flexible method for error adjustments in feed intake data collected by electronic feeders.
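
    A simplified, single-chain sketch of regression-based imputation in the spirit of MICE (full MICE cycles over several incomplete variables and adds stochastic draws); all data below are simulated, not the study's feeder records:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated records: a covariate (e.g. body weight) and a correlated intake trait
n = 200
x = rng.normal(2.0, 0.3, n)
y = 1.5 * x + rng.normal(0.0, 0.1, n)
data = np.column_stack([x, y])
mask = rng.random(n) < 0.2          # ~20% of intake records flagged as errors
data[mask, 1] = np.nan              # flagged values treated as missing

def chained_imputation(data, n_iter=10):
    """One chain of regression-based imputation (a simplified MICE step)."""
    filled = data.copy()
    miss = np.isnan(data[:, 1])
    filled[miss, 1] = np.nanmean(data[:, 1])   # crude initial fill
    X = np.column_stack([np.ones(len(filled)), filled[:, 0]])
    for _ in range(n_iter):
        # Refit the regression on observed rows, then re-impute missing rows
        coef, *_ = np.linalg.lstsq(X[~miss], filled[~miss, 1], rcond=None)
        filled[miss, 1] = X[miss] @ coef
    return filled

imputed = chained_imputation(data)
err = np.abs(imputed[mask, 1] - y[mask]).mean()   # y holds the simulated truth
print(round(err, 3))                              # small: imputation tracks truth
```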

  18. A functional U-statistic method for association analysis of sequencing data.

    PubMed

    Jadhav, Sneha; Tong, Xiaoran; Lu, Qing

    2017-11-01

    Although sequencing studies hold great promise for uncovering novel variants predisposing to human diseases, the high dimensionality of the sequencing data brings tremendous challenges to data analysis. Moreover, for many complex diseases (e.g., psychiatric disorders) multiple related phenotypes are collected. These phenotypes can be different measurements of an underlying disease, or measurements characterizing multiple related diseases for studying common genetic mechanisms. Although jointly analyzing these phenotypes could potentially increase the power of identifying disease-associated genes, the different types of phenotypes pose challenges for association analysis. To address these challenges, we propose a nonparametric method, the functional U-statistic method (FU), for multivariate analysis of sequencing data. It first constructs smooth functions from individuals' sequencing data, and then tests the association of these functions with multiple phenotypes by using a U-statistic. The method provides a general framework for analyzing various types of phenotypes (e.g., binary and continuous phenotypes) with unknown distributions. Fitting the genetic variants within a gene using a smoothing function also allows us to capture complexities of gene structure (e.g., linkage disequilibrium, LD), which could potentially increase the power of association analysis. Through simulations, we compared our method to the multivariate outcome score test (MOST) and found that our test attained better performance than MOST. In a real data application, we applied our method to sequencing data from the Minnesota Twin Study (MTS) and found potential associations of several nicotine receptor subunit (CHRN) genes, including CHRNB3, with nicotine dependence and/or alcohol dependence. © 2017 WILEY PERIODICALS, INC.

  19. Experimental Validation of the Dynamic Inertia Measurement Method to Find the Mass Properties of an Iron Bird Test Article

    NASA Technical Reports Server (NTRS)

    Chin, Alexander W.; Herrera, Claudia Y.; Spivey, Natalie D.; Fladung, William A.; Cloutier, David

    2015-01-01

    The mass properties of an aerospace vehicle are required by multiple disciplines in the analysis and prediction of flight behavior. Pendulum oscillation methods have been developed and employed for almost a century as a means to measure mass properties. However, these oscillation methods are costly, time consuming, and risky. The NASA Armstrong Flight Research Center has been investigating the Dynamic Inertia Measurement (DIM) method as a possible alternative to oscillation methods. The DIM method uses ground test techniques that are already applied to aerospace vehicles when conducting modal surveys. Ground vibration tests would require minimal additional instrumentation and time to apply the DIM method. The DIM method has been validated on smaller test articles but has not yet been fully proven on large aerospace vehicles.

  20. Shear Recovery Accuracy in Weak-Lensing Analysis with the Elliptical Gauss-Laguerre Method

    NASA Astrophysics Data System (ADS)

    Nakajima, Reiko; Bernstein, Gary

    2007-04-01

    We implement the elliptical Gauss-Laguerre (EGL) galaxy-shape measurement method proposed by Bernstein & Jarvis and quantify the shear recovery accuracy in weak-lensing analysis. This method uses a deconvolution fitting scheme to remove the effects of the point-spread function (PSF). The test simulates >10^7 noisy galaxy images convolved with anisotropic PSFs and attempts to recover an input shear. The tests are designed to be immune to statistical (random) distributions of shapes, selection biases, and crowding, in order to test more rigorously the effects of detection significance (signal-to-noise ratio [S/N]), PSF, and galaxy resolution. The systematic error in shear recovery is divided into two classes, calibration (multiplicative) and additive, with the latter arising from PSF anisotropy. At S/N > 50, the deconvolution method measures the galaxy shape and input shear to ~1% multiplicative accuracy and suppresses >99% of the PSF anisotropy. These systematic errors increase to ~4% for the worst conditions, with poorly resolved galaxies at S/N ≈ 20. The EGL weak-lensing analysis has the best demonstrated accuracy to date, sufficient for the next generation of weak-lensing surveys.
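
    The multiplicative/additive decomposition of shear systematics is conventionally written g_obs = (1 + m) g_true + c. A hedged sketch of recovering m and c from simulated shears by a straight-line fit; the bias values and noise level are illustrative, not the paper's results:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated true shears and recovered shears with known injected biases
g_true = rng.uniform(-0.05, 0.05, 1000)
m_true, c_true = 0.01, 0.002            # 1% multiplicative, small additive bias
g_obs = (1 + m_true) * g_true + c_true + rng.normal(0.0, 1e-4, 1000)

# A linear fit of g_obs against g_true recovers both systematics:
# slope = 1 + m, intercept = c
slope, intercept = np.polyfit(g_true, g_obs, 1)
m_hat, c_hat = slope - 1.0, intercept
print(round(m_hat, 3), round(c_hat, 4))
```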

  1. Seeing is believing: video classification for computed tomographic colonography using multiple-instance learning.

    PubMed

    Wang, Shijun; McKenna, Matthew T; Nguyen, Tan B; Burns, Joseph E; Petrick, Nicholas; Sahiner, Berkman; Summers, Ronald M

    2012-05-01

    In this paper, we present development and testing results for a novel colonic polyp classification method for use as part of a computed tomographic colonography (CTC) computer-aided detection (CAD) system. Inspired by the interpretative methodology of radiologists using 3-D fly-through mode in CTC reading, we have developed an algorithm which utilizes sequences of images (referred to here as videos) for classification of CAD marks. For each CAD mark, we created a video composed of a series of intraluminal, volume-rendered images visualizing the detection from multiple viewpoints. We then framed the video classification question as a multiple-instance learning (MIL) problem. Since a positive (negative) bag may contain negative (positive) instances, which in our case depends on the viewing angles and camera distance to the target, we developed a novel MIL paradigm to accommodate this class of problems. We solved the new MIL problem by maximizing an L2-norm soft margin using semidefinite programming, which can optimize relevant parameters automatically. We tested our method by analyzing a CTC data set obtained from 50 patients from three medical centers. Our proposed method showed significantly better performance compared with several traditional MIL methods.

  3. Multiple Image Arrangement for Subjective Quality Assessment

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Zhai, Guangtao

    2017-12-01

    Subjective quality assessment serves as the foundation for almost all visual-quality-related research. The size of image quality databases has expanded from dozens to thousands of images over the last decades. Since each subjective rating therein has to be averaged over quite a few participants, the ever-increasing size of those databases calls for an evolution of existing subjective test methods. Traditional single/double-stimulus approaches are being replaced by multiple-image tests, in which several distorted versions of the original image are displayed and rated at once. This naturally raises the question of how to arrange those multiple images on screen during the test. In this paper, we answer this question by performing subjective viewing tests with an eye tracker for different types of arrangements. Our research indicates that an isometric arrangement imposes less strain on participants and yields a more uniform distribution of eye fixations and movements, and is therefore expected to generate more reliable subjective ratings.

  4. Estimates of alcohol involvement in fatal crashes : new alcohol methodology

    DOT National Transportation Integrated Search

    2002-01-01

    The National Highway Traffic Safety Administration (NHTSA) has adopted a new method to estimate missing blood alcohol concentration (BAC) test result data. This new method, multiple imputation, will be used by NHTSA's National Center for Statis...

  5. Making the Cut in Gifted Selection: Score Combination Rules and Their Impact on Program Diversity

    ERIC Educational Resources Information Center

    Lakin, Joni M.

    2018-01-01

    The recommendation of using "multiple measures" is common in policy guidelines for gifted and talented assessment systems. However, the integration of multiple test scores in a system that uses cut-scores requires choosing between different methods of combining quantitative scores. Past research has indicated that OR combination rules…

  6. Family-Based Rare Variant Association Analysis: A Fast and Efficient Method of Multivariate Phenotype Association Analysis.

    PubMed

    Wang, Longfei; Lee, Sungyoung; Gim, Jungsoo; Qiao, Dandi; Cho, Michael; Elston, Robert C; Silverman, Edwin K; Won, Sungho

    2016-09-01

    Family-based designs have been repeatedly shown to be powerful in detecting the significant rare variants associated with human diseases. Furthermore, human diseases are often defined by the outcomes of multiple phenotypes, and thus we expect multivariate family-based analyses may be very efficient in detecting associations with rare variants. However, few statistical methods implementing this strategy have been developed for family-based designs. In this report, we describe one such implementation: the multivariate family-based rare variant association tool (mFARVAT). mFARVAT is a quasi-likelihood-based score test for rare variant association analysis with multiple phenotypes, and tests both homogeneous and heterogeneous effects of each variant on multiple phenotypes. Simulation results show that the proposed method is generally robust and efficient for various disease models, and we identify some promising candidate genes associated with chronic obstructive pulmonary disease. The software of mFARVAT is freely available at http://healthstat.snu.ac.kr/software/mfarvat/, implemented in C++ and supported on Linux and MS Windows. © 2016 WILEY PERIODICALS, INC.

  7. GalaxyTBM: template-based modeling by building a reliable core and refining unreliable local regions.

    PubMed

    Ko, Junsu; Park, Hahnbeom; Seok, Chaok

    2012-08-10

    Protein structures can be reliably predicted by template-based modeling (TBM) when experimental structures of homologous proteins are available. However, it is challenging to obtain structures more accurate than the single best templates by either combining information from multiple templates or by modeling regions that vary among templates or are not covered by any templates. We introduce GalaxyTBM, a new TBM method in which the more reliable core region is modeled first from multiple templates and less reliable, variable local regions, such as loops or termini, are then detected and re-modeled by an ab initio method. This TBM method is based on "Seok-server," which was tested in CASP9 and assessed to be amongst the top TBM servers. The accuracy of the initial core modeling is enhanced by focusing on more conserved regions in the multiple-template selection and multiple sequence alignment stages. Additional improvement is achieved by ab initio modeling of up to 3 unreliable local regions in the fixed framework of the core structure. Overall, GalaxyTBM reproduced the performance of Seok-server, with GalaxyTBM and Seok-server resulting in average GDT-TS of 68.1 and 68.4, respectively, when tested on 68 single-domain CASP9 TBM targets. For application to multi-domain proteins, GalaxyTBM must be combined with domain-splitting methods. Application of GalaxyTBM to CASP9 targets demonstrates that accurate protein structure prediction is possible by use of a multiple-template-based approach, and ab initio modeling of variable regions can further enhance the model quality.

  8. 3D Face Modeling Using the Multi-Deformable Method

    PubMed Central

    Hwang, Jinkyu; Yu, Sunjin; Kim, Joongrock; Lee, Sangyoun

    2012-01-01

    In this paper, we focus on the problem of the accuracy performance of 3D face modeling techniques using corresponding features in multiple views, which is quite sensitive to feature extraction errors. To solve the problem, we adopt a statistical model-based 3D face modeling approach in a mirror system consisting of two mirrors and a camera. The overall procedure of our 3D facial modeling method has two primary steps: 3D facial shape estimation using a multiple 3D face deformable model and texture mapping using seamless cloning, which is a type of gradient-domain blending. To evaluate our method's performance, we generate 3D faces of 30 individuals and then carry out two tests: an accuracy test and a robustness test. Our method shows not only highly accurate 3D face shape results when compared with the ground truth, but also robustness to feature extraction errors. Moreover, 3D face rendering results intuitively show that our method is more robust to feature extraction errors than other 3D face modeling methods. An additional contribution of our method is that a wide range of face textures can be acquired by the mirror system. Using this texture map, we generate realistic 3D faces for individuals at the end of the paper. PMID:23201976

  9. A spatial scan statistic for compound Poisson data.

    PubMed

    Rosychuk, Rhonda J; Chang, Hsing-Ming

    2013-12-20

    The topic of spatial cluster detection gained attention in statistics during the late 1980s and early 1990s. Effort has been devoted to the development of methods for detecting spatial clustering of cases and events in the biological sciences, astronomy and epidemiology. More recently, research has examined detecting clusters of correlated count data associated with health conditions of individuals. Such a method allows researchers to examine spatial relationships of disease-related events rather than just incident or prevalent cases. We introduce a spatial scan test that identifies clusters of events in a study region. Because an individual case may have multiple (repeated) events, we base the test on a compound Poisson model. We illustrate our method for cluster detection on emergency department visits, where individuals may make multiple disease-related visits. Copyright © 2013 John Wiley & Sons, Ltd.
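
    For intuition, a minimal scan statistic for ordinary Poisson counts over contiguous one-dimensional windows is sketched below; the paper's compound Poisson model additionally accounts for repeated events per individual, and the counts here are made up:

```python
import numpy as np

def poisson_scan(counts, expected):
    """Scan all contiguous windows for the Poisson likelihood-ratio hotspot."""
    best, window = -np.inf, None
    C, E = counts.sum(), expected.sum()
    n = len(counts)
    for i in range(n):
        for j in range(i, n):
            c, e = counts[i:j + 1].sum(), expected[i:j + 1].sum()
            if e == 0 or E - e <= 0:
                continue                  # window must leave an "outside"
            if c / e <= (C - c) / (E - e):
                continue                  # keep only elevated-rate windows
            # Log likelihood ratio comparing inside vs outside rates
            llr = (c * np.log(c / e)
                   + (C - c) * np.log((C - c) / (E - e)))
            if llr > best:
                best, window = llr, (i, j)
    return window, best

# Eight regions with a simulated hotspot in regions 3-5
counts = np.array([2, 1, 3, 9, 11, 8, 2, 1], dtype=float)
expected = np.full(8, counts.sum() / 8)   # uniform expectation
window, llr = poisson_scan(counts, expected)
print(window)                             # the elevated stretch: (3, 5)
```

    In practice the maximum likelihood ratio is compared against a Monte Carlo null distribution to obtain a p-value, and windows are spatial regions rather than a 1-D index range.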

  10. The Voronoi Implicit Interface Method for computing multiphase physics

    PubMed Central

    Saye, Robert I.; Sethian, James A.

    2011-01-01

    We introduce a numerical framework, the Voronoi Implicit Interface Method for tracking multiple interacting and evolving regions (phases) whose motion is determined by complex physics (fluids, mechanics, elasticity, etc.), intricate jump conditions, internal constraints, and boundary conditions. The method works in two and three dimensions, handles tens of thousands of interfaces and separate phases, and easily and automatically handles multiple junctions, triple points, and quadruple points in two dimensions, as well as triple lines, etc., in higher dimensions. Topological changes occur naturally, with no surgery required. The method is first-order accurate at junction points/lines, and of arbitrarily high-order accuracy away from such degeneracies. The method uses a single function to describe all phases simultaneously, represented on a fixed Eulerian mesh. We test the method’s accuracy through convergence tests, and demonstrate its applications to geometric flows, accurate prediction of von Neumann’s law for multiphase curvature flow, and robustness under complex fluid flow with surface tension and large shearing forces. PMID:22106269

  12. Evaluating Composite Sampling Methods of Bacillus spores at Low Concentrations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.

    Restoring facility operations after the 2001 Amerithrax attacks took over three months to complete, highlighting the need to reduce remediation time. The most time intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite: a single cellulose sponge samples multiple coupons; 2) single medium multi-pass composite: a single cellulose sponge is used to sample multiple coupons; and 3) multi-medium post-samplemore » composite: a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155CFU/cm2, respectively). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted wallboard) and three grime coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p-value < 0.0001) and coupon material (p-value = 0.0008). Recovery efficiency (RE) was higher overall using the post-sample composite (PSC) method compared to single medium composite from both clean and grime coated materials. RE with the PSC method for concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, painted wall board, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but significantly lower for ceramic tile. 
These results suggest post-sample compositing can be used to reduce sample analysis time when responding to a Bacillus anthracis contamination event on clean or dirty surfaces.

  13. Item Selection for the Development of Parallel Forms from an IRT-Based Seed Test Using a Sampling and Classification Approach

    ERIC Educational Resources Information Center

    Chen, Pei-Hua; Chang, Hua-Hua; Wu, Haiyan

    2012-01-01

    Two sampling-and-classification-based procedures were developed for automated test assembly: the Cell Only and the Cell and Cube methods. A simulation study based on a 540-item bank was conducted to compare the performance of the procedures with the performance of a mixed-integer programming (MIP) method for assembling multiple parallel test…

  14. Space vehicle engine and heat shield environment review. Volume 1: Engineering analysis

    NASA Technical Reports Server (NTRS)

    Mcanelly, W. B.; Young, C. T. K.

    1973-01-01

    Methods for predicting the base heating characteristics of a multiple rocket engine installation are discussed. The environmental data are applied to the design of an adequate protection system for the engine components. The methods for predicting the base region thermal environment are categorized as: (1) scale model testing, (2) extrapolation of previous and related flight test results, and (3) semiempirical analytical techniques.

  15. An improved EMD method for modal identification and a combined static-dynamic method for damage detection

    NASA Astrophysics Data System (ADS)

    Yang, Jinping; Li, Peizhen; Yang, Youfa; Xu, Dian

    2018-04-01

    Empirical mode decomposition (EMD) is a highly adaptable signal processing method. However, the EMD approach has certain drawbacks, including distortions from end effects and mode mixing. In the present study, these two problems are addressed using an end extension method based on the support vector regression machine (SVRM) and a modal decomposition method based on the characteristics of the Hilbert transform. The algorithm includes two steps: using the SVRM, the time series data are extended at both endpoints to reduce the end effects, and then, a modified EMD method using the characteristics of the Hilbert transform is performed on the resulting signal to reduce mode mixing. A new combined static-dynamic method for identifying structural damage is presented. This method combines the static and dynamic information in an equilibrium equation that can be solved using the Moore-Penrose generalized matrix inverse. The combination method uses the differences in displacements of the structure with and without damage and variations in the modal force vector. Tests on a four-story, steel-frame structure were conducted to obtain static and dynamic responses of the structure. The modal parameters are identified using data from the dynamic tests and improved EMD method. The new method is shown to be more accurate and effective than the traditional EMD method. Through tests with a shear-type test frame, the higher performance of the proposed static-dynamic damage detection approach, which can detect both single and multiple damage locations and the degree of the damage, is demonstrated. For structures with multiple damage, the combined approach is more effective than either the static or dynamic method. The proposed EMD method and static-dynamic damage detection method offer improved modal identification and damage detection, respectively, in structures.

  16. Evaluation of multiple tracer methods to estimate low groundwater flow velocities

    DOE PAGES

    Reimus, Paul W.; Arnold, Bill W.

    2017-02-20

    Here, four different tracer methods were used to estimate groundwater flow velocity at a multiple-well site in the saturated alluvium south of Yucca Mountain, Nevada: (1) two single-well tracer tests with different rest or “shut-in” periods, (2) a cross-hole tracer test with an extended flow interruption, (3) a comparison of two tracer decay curves in an injection borehole with and without pumping of a downgradient well, and (4) a natural-gradient tracer test. Such tracer methods are potentially very useful for estimating groundwater velocities when hydraulic gradients are flat (and hence uncertain) and also when water level and hydraulic conductivity data are sparse, both of which were the case at this test location. The purpose of the study was to evaluate the first three methods for their ability to provide reasonable estimates of relatively low groundwater flow velocities in such low-hydraulic-gradient environments. The natural-gradient method is generally considered to be the most robust and direct method, so it was used to provide a “ground truth” velocity estimate. However, this method usually requires several wells, so it is often not practical in systems with large depths to groundwater and correspondingly high well installation costs. The fact that a successful natural-gradient test was conducted at the test location offered a unique opportunity to compare the flow velocity estimates obtained by the more easily deployed and lower-risk methods with the ground-truth natural-gradient method. The groundwater flow velocity estimates from the four methods agreed very well with each other, suggesting that the first three methods all provided reasonably good estimates of groundwater flow velocity at the site. We discuss the advantages and disadvantages of the different methods, as well as some of the uncertainties associated with them.
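The abstract does not spell out how a tracer test is converted into a velocity estimate. A common approach for a natural-gradient test (method 4) is first-moment analysis: divide the travel distance by the concentration-weighted mean arrival time of the breakthrough curve. A minimal sketch, with hypothetical times and concentrations; the specific analysis used at the Yucca Mountain site is an assumption here:

```python
def mean_arrival_time(times, conc):
    """First temporal moment (center of mass) of a tracer breakthrough curve."""
    return sum(t * c for t, c in zip(times, conc)) / sum(conc)

def seepage_velocity(distance, times, conc):
    """Average linear groundwater velocity from a natural-gradient test:
    travel distance to the monitoring well divided by mean arrival time."""
    return distance / mean_arrival_time(times, conc)

# hypothetical breakthrough curve: peak at 50 days, well 10 m downgradient
t_days = [40.0, 45.0, 50.0, 55.0, 60.0]
c_rel = [1.0, 2.0, 4.0, 2.0, 1.0]
v = seepage_velocity(10.0, t_days, c_rel)  # m/day
```

With the symmetric curve above the mean arrival time is 50 days, giving v = 0.2 m/day; real curves are skewed by dispersion, which is why the first moment (rather than the peak) is normally used.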

  17. Evaluation of multiple tracer methods to estimate low groundwater flow velocities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reimus, Paul W.; Arnold, Bill W.

    Here, four different tracer methods were used to estimate groundwater flow velocity at a multiple-well site in the saturated alluvium south of Yucca Mountain, Nevada: (1) two single-well tracer tests with different rest or “shut-in” periods, (2) a cross-hole tracer test with an extended flow interruption, (3) a comparison of two tracer decay curves in an injection borehole with and without pumping of a downgradient well, and (4) a natural-gradient tracer test. Such tracer methods are potentially very useful for estimating groundwater velocities when hydraulic gradients are flat (and hence uncertain) and also when water level and hydraulic conductivity data are sparse, both of which were the case at this test location. The purpose of the study was to evaluate the first three methods for their ability to provide reasonable estimates of relatively low groundwater flow velocities in such low-hydraulic-gradient environments. The natural-gradient method is generally considered to be the most robust and direct method, so it was used to provide a “ground truth” velocity estimate. However, this method usually requires several wells, so it is often not practical in systems with large depths to groundwater and correspondingly high well installation costs. The fact that a successful natural-gradient test was conducted at the test location offered a unique opportunity to compare the flow velocity estimates obtained by the more easily deployed and lower-risk methods with the ground-truth natural-gradient method. The groundwater flow velocity estimates from the four methods agreed very well with each other, suggesting that the first three methods all provided reasonably good estimates of groundwater flow velocity at the site. We discuss the advantages and disadvantages of the different methods, as well as some of the uncertainties associated with them.

  18. Memory and other properties of multiple test procedures generated by entangled graphs.

    PubMed

    Maurer, Willi; Bretz, Frank

    2013-05-10

    Methods for addressing multiplicity in clinical trials have attracted much attention during the past 20 years. They include the investigation of new classes of multiple test procedures, such as fixed sequence, fallback and gatekeeping procedures. More recently, sequentially rejective graphical test procedures have been introduced to construct and visualize complex multiple test strategies. These methods propagate the local significance level of a rejected null hypothesis to not-yet rejected hypotheses. In the graph defining the test procedure, hypotheses together with their local significance levels are represented by weighted vertices and the propagation rule by weighted directed edges. An algorithm provides the rules for updating the local significance levels and the transition weights after rejecting an individual hypothesis. These graphical procedures have no memory in the sense that the origin of the propagated significance level is ignored in subsequent iterations. However, in some clinical trial applications, memory is desirable to reflect the underlying dependence structure of the study objectives. In such cases, it would allow the further propagation of significance levels to be dependent on their origin and thus reflect the grouped parent-descendant structures of the hypotheses. We will give examples of such situations and show how to induce memory and other properties by convex combination of several individual graphs. The resulting entangled graphs provide an intuitive way to represent the underlying relative importance relationships between the hypotheses, are as easy to perform as the original individual graphs, remain sequentially rejective and control the familywise error rate in the strong sense. Copyright © 2012 John Wiley & Sons, Ltd.
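The propagation rule this record describes is the sequentially rejective graphical procedure of Bretz et al. (2009), on which the entangled graphs build. A minimal sketch of the base (memoryless) algorithm; the two-hypothesis weights and transition matrix in the usage example are illustrative (they reproduce Holm's procedure), and entangled graphs — convex combinations of several such graphs — are not covered:

```python
def graphical_procedure(p, alpha, w, G):
    """Sequentially rejective graphical multiple test procedure.

    p     : p-values for hypotheses H_0..H_{m-1}
    alpha : familywise error rate
    w     : initial weights (local level of H_i is w[i]*alpha), sum(w) <= 1
    G     : m x m transition weights, G[i][i] = 0, each row sums to <= 1
    Returns the set of indices of rejected hypotheses.
    """
    m = len(p)
    active = set(range(m))
    w = list(w)
    G = [row[:] for row in G]
    rejected = set()
    while True:
        # find any active hypothesis rejectable at its current local level
        j = next((i for i in active if p[i] <= w[i] * alpha), None)
        if j is None:
            return rejected
        rejected.add(j)
        active.remove(j)
        # propagate the freed local level of H_j along the outgoing edges
        for l in active:
            w[l] += w[j] * G[j][l]
        # update transition weights (Bretz et al. 2009 update rule)
        newG = [row[:] for row in G]
        for l in active:
            for k in active:
                if l == k:
                    continue
                denom = 1.0 - G[l][j] * G[j][l]
                newG[l][k] = (G[l][k] + G[l][j] * G[j][k]) / denom if denom > 0 else 0.0
        G = newG

# Holm's procedure for two hypotheses: equal weights, full mutual propagation
rejected = graphical_procedure([0.01, 0.04], 0.05, [0.5, 0.5], [[0, 1], [1, 0]])
```

Here H_0 (p = 0.01) is rejected at level 0.025, its level passes entirely to H_1, and H_1 (p = 0.04) is then rejected at the full 0.05, so both hypotheses fall; this is the "no memory" behavior the record contrasts with entangled graphs.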

  19. Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data.

    PubMed

    Hammitt, Laura L; Feikin, Daniel R; Scott, J Anthony G; Zeger, Scott L; Murdoch, David R; O'Brien, Katherine L; Deloria Knoll, Maria

    2017-06-15

    Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.

  20. Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data

    PubMed Central

    Feikin, Daniel R.; Scott, J. Anthony G.; Zeger, Scott L.; Murdoch, David R.; O’Brien, Katherine L.; Deloria Knoll, Maria

    2017-01-01

    Abstract Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. PMID:28575372

  1. PERMANOVA-S: association test for microbial community composition that accommodates confounders and multiple distances.

    PubMed

    Tang, Zheng-Zheng; Chen, Guanhua; Alekseyenko, Alexander V

    2016-09-01

    Recent advances in sequencing technology have made it possible to obtain high-throughput data on the composition of microbial communities and to study the effects of dysbiosis on the human host. Analysis of pairwise intersample distances quantifies the association between the microbiome diversity and covariates of interest (e.g. environmental factors, clinical outcomes, treatment groups). In the design of these analyses, multiple choices for distance metrics are available. Most distance-based methods, however, use a single distance and are underpowered if the distance is poorly chosen. In addition, distance-based tests cannot flexibly handle confounding variables, which can result in excessive false-positive findings. We derive presence-weighted UniFrac to complement the existing UniFrac distances for more powerful detection of the variation in species richness. We develop PERMANOVA-S, a new distance-based method that tests the association of microbiome composition with any covariates of interest. PERMANOVA-S improves the commonly-used Permutation Multivariate Analysis of Variance (PERMANOVA) test by allowing flexible confounder adjustments and ensembling multiple distances. We conducted extensive simulation studies to evaluate the performance of different distances under various patterns of association. Our simulation studies demonstrate that the power of the test relies on how well the selected distance captures the nature of the association. The PERMANOVA-S unified test combines multiple distances and achieves good power regardless of the patterns of the underlying association. We demonstrate the usefulness of our approach by reanalyzing several real microbiome datasets. miProfile software is freely available at https://medschool.vanderbilt.edu/tang-lab/software/miProfile. Contact: z.tang@vanderbilt.edu or g.chen@vanderbilt.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
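For readers unfamiliar with PERMANOVA, the pseudo-F statistic and its permutation p-value can be computed directly from a pairwise distance matrix. A minimal sketch of the plain, unadjusted test only; the confounder adjustment and multi-distance ensembling that define PERMANOVA-S are not shown, and the toy data are illustrative:

```python
import random

def pseudo_f(d, labels):
    """PERMANOVA pseudo-F from a symmetric N x N distance matrix and group labels."""
    n = len(labels)
    groups = set(labels)
    a = len(groups)
    # total sum of squares from all pairwise distances
    ss_t = sum(d[i][j] ** 2 for i in range(n) for j in range(i + 1, n)) / n
    # within-group sum of squares
    ss_w = 0.0
    for g in groups:
        idx = [i for i in range(n) if labels[i] == g]
        ss_w += sum(d[i][j] ** 2 for i in idx for j in idx if i < j) / len(idx)
    ss_a = ss_t - ss_w  # among-group sum of squares
    return (ss_a / (a - 1)) / (ss_w / (n - a))

def permanova_p(d, labels, n_perm=999, seed=0):
    """Permutation p-value: shuffle labels, count statistics >= observed."""
    rng = random.Random(seed)
    f_obs = pseudo_f(d, labels)
    count = 1  # the observed arrangement counts as one permutation
    perm = list(labels)
    for _ in range(n_perm):
        rng.shuffle(perm)
        if pseudo_f(d, perm) >= f_obs:
            count += 1
    return count / (n_perm + 1)

# toy example: two clearly separated groups on a line, Euclidean distances
pts = [0.0, 0.1, 0.2, 10.0, 10.1, 10.2]
d = [[abs(a - b) for b in pts] for a in pts]
p_value = permanova_p(d, ['a', 'a', 'a', 'b', 'b', 'b'], n_perm=199, seed=1)
```

As the record notes, power hinges on the distance: the same permutation machinery applied to a distance that ignores the separating feature would yield a small pseudo-F and a large p-value.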

  2. PERMANOVA-S: association test for microbial community composition that accommodates confounders and multiple distances

    PubMed Central

    Tang, Zheng-Zheng; Chen, Guanhua; Alekseyenko, Alexander V.

    2016-01-01

    Motivation: Recent advances in sequencing technology have made it possible to obtain high-throughput data on the composition of microbial communities and to study the effects of dysbiosis on the human host. Analysis of pairwise intersample distances quantifies the association between the microbiome diversity and covariates of interest (e.g. environmental factors, clinical outcomes, treatment groups). In the design of these analyses, multiple choices for distance metrics are available. Most distance-based methods, however, use a single distance and are underpowered if the distance is poorly chosen. In addition, distance-based tests cannot flexibly handle confounding variables, which can result in excessive false-positive findings. Results: We derive presence-weighted UniFrac to complement the existing UniFrac distances for more powerful detection of the variation in species richness. We develop PERMANOVA-S, a new distance-based method that tests the association of microbiome composition with any covariates of interest. PERMANOVA-S improves the commonly-used Permutation Multivariate Analysis of Variance (PERMANOVA) test by allowing flexible confounder adjustments and ensembling multiple distances. We conducted extensive simulation studies to evaluate the performance of different distances under various patterns of association. Our simulation studies demonstrate that the power of the test relies on how well the selected distance captures the nature of the association. The PERMANOVA-S unified test combines multiple distances and achieves good power regardless of the patterns of the underlying association. We demonstrate the usefulness of our approach by reanalyzing several real microbiome datasets. Availability and Implementation: miProfile software is freely available at https://medschool.vanderbilt.edu/tang-lab/software/miProfile. 
Contact: z.tang@vanderbilt.edu or g.chen@vanderbilt.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27197815

  3. Modal testing with Asher's method using a Fourier analyzer and curve fitting

    NASA Technical Reports Server (NTRS)

    Gold, R. R.; Hallauer, W. L., Jr.

    1979-01-01

    An unusual application of the method proposed by Asher (1958) for structural dynamic and modal testing is discussed. Asher's method has the capability, using the admittance matrix and multiple-shaker sinusoidal excitation, of separating structural modes having indefinitely close natural frequencies. The present application uses Asher's method in conjunction with a modern Fourier analyzer system but eliminates the necessity of exciting the test structure simultaneously with several shakers. Evaluation of this approach with numerically simulated data demonstrated its effectiveness; the parameters of two modes having almost identical natural frequencies were accurately identified. Laboratory evaluation of this approach was inconclusive because of poor experimental input data.

  4. Flexible Reporting of Clinical Data

    PubMed Central

    Andrews, Robert D.

    1987-01-01

    Two prototype methods have been developed to aid in the presentation of relevant clinical data: 1) an integrated report that displays results from a patient's computer-stored data and also allows manual entry of data, and 2) a graph program that plots results of multiple kinds of tests. These reports provide a flexible means of displaying data to help evaluate patient treatment. The two methods also explore ways of integrating the display of data from multiple components of the Veterans Administration's (VA) Decentralized Hospital Computer Program (DHCP) database.

  5. The stochastic control of the F-8C aircraft using the Multiple Model Adaptive Control (MMAC) method

    NASA Technical Reports Server (NTRS)

    Athans, M.; Dunn, K. P.; Greene, E. S.; Lee, W. H.; Sandel, N. R., Jr.

    1975-01-01

    The purpose of this paper is to summarize results obtained for the adaptive control of the F-8C aircraft using the so-called Multiple Model Adaptive Control method. The discussion includes the selection of the performance criteria for both the lateral and the longitudinal dynamics, the design of the Kalman filters for different flight conditions, the 'identification' aspects of the design using hypothesis testing ideas, and the performance of the closed loop adaptive system.

  6. Application of laser radiation and magnetostimulation in therapy of patients with multiple sclerosis.

    PubMed

    Kubsik, Anna; Klimkiewicz, Robert; Janczewska, Katarzyna; Klimkiewicz, Paulina; Jankowska, Agnieszka; Woldańska-Okońska, Marta

    2016-01-01

    Multiple sclerosis is one of the most common neurological disorders. It is a chronic inflammatory demyelinating disease of the CNS, whose etiology is not fully understood. Application of new rehabilitation methods is essential to improve functional status. The material studied consisted of 120 patients of both sexes (82 women and 38 men) aged 21-81 years. The study involved patients with a diagnosis of multiple sclerosis. The aim of the study was to evaluate the effect of laser radiation and other therapies on the functional status of patients with multiple sclerosis. Patients were randomly divided into four treatment groups. The evaluation was performed three times: before the start of rehabilitation, immediately after rehabilitation (21 days of treatment), and at a subsequent control 30 days after the patients left the clinic. The following tests were performed on all patients to assess functional status: the Expanded Disability Status Scale (EDSS) of Kurtzke and the Barthel Index. Results of all testing procedures show that the treatment methods improved the functional status of patients with multiple sclerosis, with a significant advantage for the synergistic action of laser radiation and magnetostimulation. The combination of laser radiation and magnetostimulation also showed a significant beneficial effect on quality of life. The results of these studies present new scientific value and improve on the laser-radiation rehabilitation program for patients with multiple sclerosis that was previously used. This study showed that the synergistic action of laser radiation and magnetostimulation has a beneficial effect on improving functional status, and thus improves the quality of life of patients with multiple sclerosis. The effects of all rehabilitation methods persisted after cessation of treatment, with a particular advantage for the synergistic action of laser radiation and magnetostimulation, which suggests that these methods may elicit a biological hysteresis phenomenon.

  7. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation.

    PubMed

    Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc

    2013-06-01

    An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge of downlink band occupation or service characteristics is required for the low-cost method, and the method is applicable in situ. It requires only a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders.

  8. Multiple intelligences and alternative teaching strategies: The effects on student academic achievement, conceptual understanding, and attitude

    NASA Astrophysics Data System (ADS)

    Baragona, Michelle

    The purpose of this study was to investigate the interactions between multiple intelligence strengths and alternative teaching methods on student academic achievement, conceptual understanding and attitudes. The design was a quasi-experimental study, in which students enrolled in Principles of Anatomy and Physiology, a developmental biology course, received lecture only, problem-based learning with lecture, or peer teaching with lecture. These students completed the Multiple Intelligence Inventory to determine their intelligence strengths, the Students' Motivation Toward Science Learning questionnaire to determine student attitudes towards learning in science, multiple choice tests to determine academic achievement, and open-ended questions to determine conceptual understanding. Effects of intelligence types and teaching methods on academic achievement and conceptual understanding were determined statistically by repeated measures ANOVAs. No significant differences in academic achievement scores occurred by lab group or by teaching method; however, significant interactions between group and teaching method did occur in students with strengths in logical-mathematical, interpersonal, kinesthetic, and intrapersonal intelligences. Post-hoc analysis using Tukey HSD tests revealed that students with strengths in logical-mathematical intelligence and enrolled in Group Three scored significantly higher when taught by problem-based learning (PBL) as compared to peer teaching (PT). No significant differences in conceptual understanding scores occurred by lab group or by teaching method; however, significant interactions between group and teaching method did occur in students with strengths in musical, kinesthetic, intrapersonal, and spatial intelligences. Post-hoc analysis using Tukey HSD tests revealed students with strengths in logical-mathematical intelligence and enrolled in Group Three scored significantly higher when taught by lecture as compared to PBL. 
Students with strengths in intrapersonal intelligence and enrolled in Group One scored significantly lower when taught by lecture as compared to PBL. Results of a repeated measures ANOVA for student attitudes showed significant increases in positive student attitudes toward science learning for all three types of teaching method between pretest and posttest; but there were no significant differences in posttest attitude scores by type of teaching method.

  9. Test-Retest Reliability of the Multiple Sleep Latency Test in Narcolepsy without Cataplexy and Idiopathic Hypersomnia

    PubMed Central

    Trotti, Lynn Marie; Staab, Beth A.; Rye, David B.

    2013-01-01

    Study Objectives: Differentiation of narcolepsy without cataplexy from idiopathic hypersomnia relies entirely upon the multiple sleep latency test (MSLT). However, the test-retest reliability for these central nervous system hypersomnias has never been determined. Methods: Patients with narcolepsy without cataplexy, idiopathic hypersomnia, and physiologic hypersomnia who underwent two diagnostic multiple sleep latency tests were identified retrospectively. Correlations between the mean sleep latencies on the two studies were evaluated, and we probed for demographic and clinical features associated with reproducibility versus change in diagnosis. Results: Thirty-six patients (58% women, mean age 34 years) were included. Inter-test interval was 4.2 ± 3.8 years (range 2.5 months to 16.9 years). Mean sleep latencies on the first and second tests were 5.5 (± 3.7 SD) and 7.3 (± 3.9) minutes, respectively, with no significant correlation (r = 0.17, p = 0.31). A change in diagnosis occurred in 53% of patients, and was accounted for by a difference in the mean sleep latency (N = 15, 42%) or the number of sleep onset REM periods (N = 11, 31%). The only feature predictive of a diagnosis change was a history of hypnagogic or hypnopompic hallucinations. Conclusions: The multiple sleep latency test demonstrates poor test-retest reliability in a clinical population of patients with central nervous system hypersomnia evaluated in a tertiary referral center. Alternative diagnostic tools are needed. Citation: Trotti LM; Staab BA; Rye DB. Test-retest reliability of the multiple sleep latency test in narcolepsy without cataplexy and idiopathic hypersomnia. J Clin Sleep Med 2013;9(8):789-795. PMID:23946709
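The reported r = 0.17 is a plain Pearson correlation between the paired mean sleep latencies from the two MSLTs. A minimal sketch of that computation (the data passed in are whatever paired latencies are available; nothing here reproduces the study's values):

```python
import math

def pearson_r(x, y):
    """Pearson correlation between paired measurements, e.g. two MSLT latencies."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A correlation near zero on paired test-retest data, as in this record, is what motivates the authors' conclusion that the MSLT alone cannot reliably separate these hypersomnias.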

  10. Improved H-κ Method by Harmonic Analysis on Ps and Crustal Multiples in Receiver Functions with respect to Dipping Moho and Crustal Anisotropy

    NASA Astrophysics Data System (ADS)

    Li, J.; Song, X.; Wang, P.; Zhu, L.

    2017-12-01

    The H-κ method (Zhu and Kanamori, 2000) has been widely used to estimate crustal thickness and Vp/Vs ratio from receiver functions. However, in regions where the crustal structure is complicated, the method may produce uncertain or even unrealistic results, arising particularly from a dipping Moho and/or crustal anisotropy. Here, we propose an improved H-κ method, which corrects for these effects before stacking. The effect of a dipping Moho and crustal anisotropy on the Ps receiver function has been well studied, but not as much on the crustal multiples (PpPs and PpSs+PsPs). Synthetic tests show that the effect of crustal anisotropy on the multiples is similar to that on Ps, while the effect of a dipping Moho on the multiples is 5 times that on Ps (the same cosine trend but 5 times the time shift). A Harmonic Analysis (HA) method for dipping/anisotropy was developed by Wang et al. (2017) for crustal Ps receiver functions to extract the parameters of a dipping Moho and crustal azimuthal anisotropy. In real data, the crustal multiples are much more complicated than the Ps. Therefore, we use the HA method (Wang et al., 2017), but apply it separately to the Ps and the multiples. Although complicated, the trend of the multiples can still be reasonably well represented by the HA. We then perform separate azimuthal corrections for the Ps and the multiples and stack to obtain a combined receiver function. Lastly, the traditional H-κ procedure is applied to the stacked receiver function. We apply the improved H-κ method to 40 CNDSN (Chinese National Digital Seismic Network) stations distributed across a variety of geological settings in the Chinese continent. The results show apparent improvement over the traditional H-κ method, with clearer traces of the multiples and stronger stacking energy in the grid search, as well as more reliable H-κ values.
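The traditional H-κ stacking that this record improves upon searches a grid of crustal thickness H and Vp/Vs ratio κ, summing receiver-function amplitudes at the predicted Ps, PpPs, and PpSs+PsPs delay times (the last with reversed polarity). A minimal sketch, assuming a fixed crustal Vp and illustrative grid ranges and phase weights; the azimuthal corrections of the improved method are not shown:

```python
import math

def arrival_times(h, kappa, p, vp):
    """Delay times (s) of Ps, PpPs, and PpSs+PsPs behind direct P for
    crustal thickness h (km), Vp/Vs ratio kappa, ray parameter p (s/km)."""
    eta_s = math.sqrt((kappa / vp) ** 2 - p ** 2)  # S-wave vertical slowness
    eta_p = math.sqrt((1.0 / vp) ** 2 - p ** 2)    # P-wave vertical slowness
    return h * (eta_s - eta_p), h * (eta_s + eta_p), 2.0 * h * eta_s

def amp_at(rf, t, dt):
    """Receiver-function amplitude at time t (nearest sample; 0 outside)."""
    i = int(round(t / dt))
    return rf[i] if 0 <= i < len(rf) else 0.0

def hk_stack(rfs, dt, ray_params, vp=6.3, w=(0.6, 0.3, 0.1)):
    """Grid search for the (H, kappa) maximizing the weighted phase stack."""
    best = None
    for h in [30.0 + 0.25 * i for i in range(41)]:      # 30-40 km (illustrative)
        for k in [1.60 + 0.01 * j for j in range(31)]:  # 1.60-1.90 (illustrative)
            s = 0.0
            for rf, p in zip(rfs, ray_params):
                t1, t2, t3 = arrival_times(h, k, p, vp)
                # PpSs+PsPs arrives with reversed polarity, hence the minus sign
                s += (w[0] * amp_at(rf, t1, dt) + w[1] * amp_at(rf, t2, dt)
                      - w[2] * amp_at(rf, t3, dt))
            if best is None or s > best[0]:
                best = (s, h, k)
    return best[1], best[2]
```

On a synthetic receiver function with spikes placed at the predicted times for H = 35 km and κ = 1.75, the search recovers those values; on real data with a dipping Moho or anisotropy, the back-azimuth-dependent moveout smears this stack, which is the failure mode the record's harmonic-analysis correction addresses.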

  11. Germicidal Activity against Carbapenem/Colistin-Resistant Enterobacteriaceae Using a Quantitative Carrier Test Method.

    PubMed

    Kanamori, Hajime; Rutala, William A; Gergen, Maria F; Sickbert-Bennett, Emily E; Weber, David J

    2018-05-07

    Susceptibility to germicides for carbapenem/colistin-resistant Enterobacteriaceae is poorly described. We investigated the efficacy of multiple germicides against these emerging antibiotic-resistant pathogens using the disc-based quantitative carrier test method that can produce results more similar to those encountered in healthcare settings than a suspension test. Our study results demonstrated that germicides commonly used in healthcare facilities likely will be effective against carbapenem/colistin-resistant Enterobacteriaceae when used appropriately in healthcare facilities. Copyright © 2018 American Society for Microbiology.

  12. INTRA-RATER RELIABILITY OF THE MULTIPLE SINGLE-LEG HOP-STABILIZATION TEST AND RELATIONSHIPS WITH AGE, LEG DOMINANCE AND TRAINING.

    PubMed

    Sawle, Leanne; Freeman, Jennifer; Marsden, Jonathan

    2017-04-01

    Balance is a complex construct, affected by multiple components such as strength and co-ordination. However, whilst assessing an athlete's dynamic balance is an important part of clinical examination, there is no gold standard measure. The multiple single-leg hop-stabilization test is a functional test which may offer a method of evaluating the dynamic attributes of balance, but it needs to show adequate intra-tester reliability. The purpose of this study was to assess the intra-rater reliability of a dynamic balance test, the multiple single-leg hop-stabilization test, on the dominant and non-dominant legs. Intra-rater reliability study. Fifteen active participants were tested twice with a 10-minute break between tests. The outcome measure was the multiple single-leg hop-stabilization test score, based on a clinically assessed numerical scoring system. Results were analysed using an intraclass correlation coefficient (ICC(2,1)) and Bland-Altman plots. Regression analyses explored relationships between test scores, leg dominance, age and training (an alpha level of p = 0.05 was selected). ICCs for intra-rater reliability were 0.85 for both the dominant and non-dominant legs (confidence intervals = 0.62-0.95 and 0.61-0.95, respectively). Bland-Altman plots showed scores within two standard deviations. A significant correlation was observed between the dominant and non-dominant leg on balance scores (R² = 0.49, p < 0.05). Better balance was associated with younger age for the non-dominant leg (R² = 0.28, p < 0.05) and the dominant leg (R² = 0.39, p < 0.05), and with a higher number of hours spent training for the non-dominant leg (R² = 0.37, p < 0.05). The multiple single-leg hop-stabilization test demonstrated strong intra-tester reliability with active participants. Younger participants who trained more had better balance scores. This test may be a useful measure for evaluating the dynamic attributes of balance. Level of evidence: 3.
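The ICC(2,1) used in this record is the two-way random-effects, absolute-agreement, single-measures intraclass correlation, computed from the mean squares of a two-way ANOVA without replication. A minimal pure-Python sketch of that formula (the score table layout is an assumption; any n-subjects-by-k-sessions table of ratings works):

```python
def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.

    scores: n_subjects x k_sessions table of ratings.
    """
    n = len(scores)        # subjects (rows)
    k = len(scores[0])     # sessions/raters (columns)
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between sessions
    ss_tot = sum((scores[i][j] - grand) ** 2 for i in range(n) for j in range(k))
    ss_err = ss_tot - ss_rows - ss_cols                      # residual
    msr = ss_rows / (n - 1)            # mean square, rows (subjects)
    msc = ss_cols / (k - 1)            # mean square, columns (sessions)
    mse = ss_err / ((n - 1) * (k - 1)) # mean square, error
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Because this form penalizes systematic session-to-session shifts (via the MSC term), a rater who scores everyone one point higher on retest gets an ICC below 1 even with perfect rank consistency, which is the appropriate behavior for a test-retest reliability study like this one.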

  13. Non-animal methods to predict skin sensitization (II): an assessment of defined approaches *.

    PubMed

    Kleinstreuer, Nicole C; Hoffmann, Sebastian; Alépée, Nathalie; Allen, David; Ashikaga, Takao; Casey, Warren; Clouet, Elodie; Cluzel, Magalie; Desprez, Bertrand; Gellatly, Nichola; Göbel, Carsten; Kern, Petra S; Klaric, Martina; Kühnl, Jochen; Martinozzi-Teissier, Silvia; Mewes, Karsten; Miyazawa, Masaaki; Strickland, Judy; van Vliet, Erwin; Zang, Qingda; Petersohn, Dirk

    2018-05-01

    Skin sensitization is a toxicity endpoint of widespread concern, for which the mechanistic understanding and concurrent necessity for non-animal testing approaches have evolved to a critical juncture, with many available options for predicting sensitization without using animals. Cosmetics Europe and the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods collaborated to analyze the performance of multiple non-animal data integration approaches for the skin sensitization safety assessment of cosmetics ingredients. The Cosmetics Europe Skin Tolerance Task Force (STTF) collected and generated data on 128 substances in multiple in vitro and in chemico skin sensitization assays selected based on a systematic assessment by the STTF. These assays, together with certain in silico predictions, are key components of various non-animal testing strategies that have been submitted to the Organization for Economic Cooperation and Development as case studies for skin sensitization. Curated murine local lymph node assay (LLNA) and human skin sensitization data were used to evaluate the performance of six defined approaches, comprising eight non-animal testing strategies, for both hazard and potency characterization. Defined approaches examined included consensus methods, artificial neural networks, support vector machine models, Bayesian networks, and decision trees, most of which were reproduced using open source software tools. Multiple non-animal testing strategies incorporating in vitro, in chemico, and in silico inputs demonstrated equivalent or superior performance to the LLNA when compared to both animal and human data for skin sensitization.

  14. Quantitative analysis of the pendulum test: application to multiple sclerosis patients treated with botulinum toxin.

    PubMed

    Bianchi, L; Monaldi, F; Paolucci, S; Iani, C; Lacquaniti, F

    1999-01-01

    The aim of this study was to develop quantitative analytical methods in the application of the pendulum test to both normal and spastic subjects. The lower leg was released by a torque motor from different starting positions. The resulting changes in the knee angle were fitted by means of a time-varying model. Stiffness and viscosity coefficients were derived for each half-cycle oscillation in both flexion and extension, and for all knee starting positions. This method was applied to the assessment of the effects of Botulinum toxin A (BTX) in progressive multiple sclerosis patients in a follow-up study. About half of the patients showed a significant decrement in stiffness and viscosity coefficients.

  15. MultiDK: A Multiple Descriptor Multiple Kernel Approach for Molecular Discovery and Its Application to Organic Flow Battery Electrolytes.

    PubMed

    Kim, Sungjin; Jinich, Adrián; Aspuru-Guzik, Alán

    2017-04-24

    We propose a multiple descriptor multiple kernel (MultiDK) method for efficient molecular discovery using machine learning. We show that the MultiDK method improves both the speed and accuracy of molecular property prediction. We apply the method to the discovery of electrolyte molecules for aqueous redox flow batteries. Using multiple-type, as opposed to single-type, descriptors, we obtain more relevant features for machine learning. Following the principle of the "wisdom of the crowds", the combination of multiple-type descriptors significantly boosts prediction performance. Moreover, by employing multiple kernels (more than one kernel function for a set of input descriptors), MultiDK exploits nonlinear relations between molecular structure and properties better than a linear regression approach. The multiple kernels consist of a Tanimoto similarity kernel and a linear kernel for a set of binary descriptors and a set of nonbinary descriptors, respectively. Using MultiDK, we achieve an average performance of r2 = 0.92 with a test set of molecules for solubility prediction. We also extend MultiDK to predict pH-dependent solubility and apply it to a set of quinone molecules with different ionizable functional groups to assess their performance as flow battery electrolytes.
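
    The kernel combination described (a Tanimoto kernel on binary descriptors plus a linear kernel on non-binary ones) can be illustrated with kernel ridge regression on a summed Gram matrix. This is a sketch under that assumption, not the authors' MultiDK implementation; the fingerprints, descriptor values, and target values are invented.

```python
import numpy as np

def tanimoto_kernel(A, B):
    """Tanimoto similarity between rows of binary matrices A and B."""
    dot = A @ B.T
    na = (A * A).sum(axis=1)[:, None]
    nb = (B * B).sum(axis=1)[None, :]
    return dot / (na + nb - dot)

def multi_kernel(Xb1, Xb2, Xr1, Xr2):
    """Sum of a Tanimoto kernel (binary part) and a linear kernel (real part)."""
    return tanimoto_kernel(Xb1, Xb2) + Xr1 @ Xr2.T

# Toy data: 4 "molecules", 3 fingerprint bits + 1 real-valued descriptor each
Xb = np.array([[1, 0, 1], [1, 1, 0], [0, 1, 1], [1, 1, 1]], dtype=float)
Xr = np.array([[0.2], [0.5], [0.1], [0.9]])
y = np.array([1.0, 2.0, 0.5, 3.0])          # invented property values

K = multi_kernel(Xb, Xb, Xr, Xr)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(y)), y)   # kernel ridge fit
pred = K @ alpha                                        # in-sample predictions
```

    Summing positive semidefinite kernels yields another valid kernel, which is what lets the binary and real-valued descriptor sets contribute jointly to one regression.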

  16. Analysis of multiple instructional techniques on the understanding and retention of select mechanical topics

    NASA Astrophysics Data System (ADS)

    Fetsco, Sara Elizabeth

    There are several topics that introductory physics students typically have difficulty understanding. The purpose of this thesis is to investigate if multiple instructional techniques will help students to better understand and retain the material. The three units analyzed in this study are graphing motion, projectile motion, and conservation of momentum. For each unit students were taught using new or altered instructional methods including online laboratory simulations, inquiry labs, and interactive demonstrations. Additionally, traditional instructional methods such as lecture and problem sets were retained. Effectiveness was measured through pre- and post-tests and student opinion surveys. Results suggest that incorporating multiple instructional techniques into teaching will improve student understanding and retention. Students stated that they learned well from all of the instructional methods used except the online simulations.

  17. Multiple transfer standard for calibration and characterization of test setups for LED lamps and luminaires in industry

    NASA Astrophysics Data System (ADS)

    Sperling, A.; Meyer, M.; Pendsa, S.; Jordan, W.; Revtova, E.; Poikonen, T.; Renoux, D.; Blattner, P.

    2018-04-01

    Proper characterization of test setups used in industry for testing and traceable measurement of lighting devices by the substitution method is an important task. According to new standards for testing LED lamps, luminaires and modules, uncertainty budgets are requested because in many cases the properties of the device under test differ from the transfer standard used, which may cause significant errors, for example if an LED-based lamp is tested or calibrated in an integrating sphere which was calibrated with a tungsten lamp. This paper introduces a multiple transfer standard, which was designed not only to transfer a single calibration value (e.g. luminous flux) but also to characterize test setups used for LED measurements through additionally provided, calibrated output features, enabling the application of the new standards.

  18. Using Replicates in Information Retrieval Evaluation.

    PubMed

    Voorhees, Ellen M; Samarov, Daniel; Soboroff, Ian

    2017-09-01

    This article explores a method for more accurately estimating the main effect of the system in a typical test-collection-based evaluation of information retrieval systems, thus increasing the sensitivity of system comparisons. Randomly partitioning the test document collection allows for multiple tests of a given system and topic (replicates). Bootstrap ANOVA can use these replicates to extract system-topic interactions-something not possible without replicates-yielding a more precise value for the system effect and a narrower confidence interval around that value. Experiments using multiple TREC collections demonstrate that removing the topic-system interactions substantially reduces the confidence intervals around the system effect as well as increases the number of significant pairwise differences found. Further, the method is robust against small changes in the number of partitions used, against variability in the documents that constitute the partitions, and the measure of effectiveness used to quantify system effectiveness.

  19. Using Replicates in Information Retrieval Evaluation

    PubMed Central

    VOORHEES, ELLEN M.; SAMAROV, DANIEL; SOBOROFF, IAN

    2018-01-01

    This article explores a method for more accurately estimating the main effect of the system in a typical test-collection-based evaluation of information retrieval systems, thus increasing the sensitivity of system comparisons. Randomly partitioning the test document collection allows for multiple tests of a given system and topic (replicates). Bootstrap ANOVA can use these replicates to extract system-topic interactions—something not possible without replicates—yielding a more precise value for the system effect and a narrower confidence interval around that value. Experiments using multiple TREC collections demonstrate that removing the topic-system interactions substantially reduces the confidence intervals around the system effect as well as increases the number of significant pairwise differences found. Further, the method is robust against small changes in the number of partitions used, against variability in the documents that constitute the partitions, and the measure of effectiveness used to quantify system effectiveness. PMID:29905334

  20. Music Performance Anxiety in Instrumental Music Students: A Multiple Case Study of Teacher Perspectives

    ERIC Educational Resources Information Center

    Sieger, Crystal

    2017-01-01

    Music Performance Anxiety (MPA) is a sometimes debilitating condition affecting many young musicians as they perform in testing or concert settings. These students may consult their teachers to seek aid in overcoming their anxiety. The purpose of this multiple case study was to investigate the strategies and methods utilized by middle and high…

  1. Correction of the significance level when attempting multiple transformations of an explanatory variable in generalized linear models

    PubMed Central

    2013-01-01

    Background In statistical modeling, finding the most favorable coding for an explanatory quantitative variable involves many tests. This process raises a multiple testing problem and requires correction of the significance level. Methods For each coding, a test of the nullity of the coefficient associated with the newly coded variable is computed. The selected coding corresponds to that associated with the largest test statistic (or, equivalently, the smallest p-value). In the context of the Generalized Linear Model, Liquet and Commenges (Stat Probab Lett, 71:33–38, 2005) proposed an asymptotic correction of the significance level. This procedure, based on the score test, was developed for dichotomous and Box-Cox transformations. In this paper, we suggest the use of resampling methods to estimate the significance level for categorical transformations with more than two levels and, by definition, those that involve more than one parameter in the model. The categorical transformation is a more flexible way to explore the unknown shape of the effect between an explanatory and a dependent variable. Results The simulations we ran in this study showed good performance of the proposed methods. These methods were illustrated using data from a study of the relationship between cholesterol and dementia. Conclusion The algorithms were implemented in R, and the associated CPMCGLM R package is available on CRAN. PMID:23758852
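
    The resampling idea can be sketched as a min-p (equivalently, max-statistic) permutation correction: the outcome is shuffled, the best coding is re-selected on each shuffled data set, and the observed best statistic is referred to that null distribution. The sketch below uses dichotomizations of x at several cutpoints and a two-sample t statistic for simplicity; it illustrates the principle only, is not the CPMCGLM package, and all data are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)

def tstat(y, g):
    """Absolute two-sample t statistic for outcome y split by binary coding g."""
    a, b = y[g], y[~g]
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return abs(a.mean() - b.mean()) / se

def minp_adjusted(x, y, cutpoints, n_perm=2000):
    """Permutation-adjusted p-value for the best dichotomous coding of x."""
    codings = [x > c for c in cutpoints]
    t_obs = max(tstat(y, g) for g in codings)   # statistic of the selected coding
    exceed = 0
    for _ in range(n_perm):
        yp = rng.permutation(y)                 # same shuffle reused for all codings
        exceed += max(tstat(yp, g) for g in codings) >= t_obs
    return (1 + exceed) / (1 + n_perm)

x = rng.normal(size=80)
y = 1.0 * (x > 0) + rng.normal(size=80)         # real effect through one coding
p_adj = minp_adjusted(x, y, cutpoints=[-0.5, 0.0, 0.5])
```

    Because the same shuffled outcome is scored against every candidate coding, the correlation among the candidate tests is preserved, which is what makes the adjustment less conservative than Bonferroni.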

  2. Automatic detection and recognition of multiple macular lesions in retinal optical coherence tomography images with multi-instance multilabel learning

    NASA Astrophysics Data System (ADS)

    Fang, Leyuan; Yang, Liumao; Li, Shutao; Rabbani, Hossein; Liu, Zhimin; Peng, Qinghua; Chen, Xiangdong

    2017-06-01

    Detection and recognition of macular lesions in optical coherence tomography (OCT) are very important for retinal diseases diagnosis and treatment. As one kind of retinal disease (e.g., diabetic retinopathy) may contain multiple lesions (e.g., edema, exudates, and microaneurysms) and eye patients may suffer from multiple retinal diseases, multiple lesions often coexist within one retinal image. Therefore, one single-lesion-based detector may not support the diagnosis of clinical eye diseases. To address this issue, we propose a multi-instance multilabel-based lesions recognition (MIML-LR) method for the simultaneous detection and recognition of multiple lesions. The proposed MIML-LR method consists of the following steps: (1) segment the regions of interest (ROIs) for different lesions, (2) compute descriptive instances (features) for each lesion region, (3) construct multilabel detectors, and (4) recognize each ROI with the detectors. The proposed MIML-LR method was tested on 823 clinically labeled OCT images with normal macular and macular with three common lesions: epiretinal membrane, edema, and drusen. For each input OCT image, our MIML-LR method can automatically identify the number of lesions and assign the class labels, achieving the average accuracy of 88.72% for the cases with multiple lesions, which better assists macular disease diagnosis and treatment.

  3. Acoustic 3D modeling by the method of integral equations

    NASA Astrophysics Data System (ADS)

    Malovichko, M.; Khokhlov, N.; Yavich, N.; Zhdanov, M.

    2018-02-01

    This paper presents a parallel algorithm for frequency-domain acoustic modeling by the method of integral equations (IE). The algorithm is applied to seismic simulation. The IE method reduces the size of the problem but leads to a dense system matrix. A tolerable memory consumption and numerical complexity were achieved by applying an iterative solver, accompanied by an effective matrix-vector multiplication operation based on the fast Fourier transform (FFT). We demonstrate that the IE system matrix is better conditioned than that of the finite-difference (FD) method, and discuss its relation to a specially preconditioned FD matrix. We considered several methods of matrix-vector multiplication for the free-space and layered host models. The developed algorithm and computer code were benchmarked against the FD time-domain solution. It was demonstrated that the method can accurately calculate the seismic field for models with sharp material boundaries and a point source and receiver located close to the free surface. We used OpenMP to speed up the matrix-vector multiplication, while MPI was used to speed up the solution of the system of equations and to parallelize across multiple sources. Practical examples and efficiency tests are presented as well.
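
    The FFT-accelerated matrix-vector product mentioned above exploits the translation-invariant (convolutional) structure of the integral-equation kernel. A minimal 1-D sketch, assuming a symmetric Toeplitz operator (the actual code is 3-D and parallel), multiplies through circulant embedding in O(n log n) and checks against the dense product:

```python
import numpy as np

def toeplitz_matvec_fft(col, row, x):
    """Multiply a Toeplitz matrix (first column `col`, first row `row`) by x
    via circulant embedding of size 2n and the FFT."""
    n = len(x)
    # First column of the circulant embedding: [t_0..t_{n-1}, 0, t_{-(n-1)}..t_{-1}]
    c = np.concatenate([col, [0.0], row[:0:-1]])
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x, len(c)))   # circular convolution
    return y[:n].real

# Compare with the dense product on a small translation-invariant kernel
n = 64
kernel = 1.0 / (1.0 + np.arange(n))      # e.g. decay with source-receiver offset
col = row = kernel                        # symmetric operator
x = np.sin(np.arange(n) * 0.3)
dense = np.array([[kernel[abs(i - j)] for j in range(n)] for i in range(n)])
assert np.allclose(toeplitz_matvec_fft(col, row, x), dense @ x)
```

    The dense product costs O(n^2) per application; the FFT route is what keeps iterative solvers affordable when the IE matrix is dense but structured.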

  4. A Comparative Analysis of Three Monocular Passive Ranging Methods on Real Infrared Sequences

    NASA Astrophysics Data System (ADS)

    Bondžulić, Boban P.; Mitrović, Srđan T.; Barbarić, Žarko P.; Andrić, Milenko S.

    2013-09-01

    Three monocular passive ranging methods are analyzed and tested on real infrared sequences. The first method exploits scale changes of an object in successive frames, while the other two use the Beer-Lambert law. The ranging methods are evaluated by comparison with simultaneously obtained reference data at the test site. The research addresses scenarios where multiple sensor views or active measurements are not possible. The results show that these methods for range estimation can provide the fidelity required for object tracking. Maximum values of relative distance estimation errors in near-ideal conditions are less than 8%.

  5. Correction for Guessing in the Framework of the 3PL Item Response Theory

    ERIC Educational Resources Information Center

    Chiu, Ting-Wei

    2010-01-01

    Guessing behavior is an important topic with regard to assessing proficiency on multiple choice tests, particularly for examinees at lower levels of proficiency, due to the greater potential for systematic error or bias that inflates observed test scores. Methods that incorporate a correction for guessing on high-stakes tests generally rely…

  6. Making the Most of What We Have: A Practical Application of Multidimensional Item Response Theory in Test Scoring

    ERIC Educational Resources Information Center

    de la Torre, Jimmy; Patz, Richard J.

    2005-01-01

    This article proposes a practical method that capitalizes on the availability of information from multiple tests measuring correlated abilities given in a single test administration. By simultaneously estimating different abilities with the use of a hierarchical Bayesian framework, more precise estimates for each ability dimension are obtained.…

  7. A Comparison of Domain-Referenced and Classic Psychometric Test Construction Methods.

    ERIC Educational Resources Information Center

    Willoughby, Lee; And Others

    This study compared a domain referenced approach with a traditional psychometric approach in the construction of a test. Results of the December, 1975 Quarterly Profile Exam (QPE) administered to 400 examinees at a university were the source of data. The 400 item QPE is a five alternative multiple choice test of information a "safe"…

  8. Mixture IRT Model with a Higher-Order Structure for Latent Traits

    ERIC Educational Resources Information Center

    Huang, Hung-Yu

    2017-01-01

    Mixture item response theory (IRT) models have been suggested as an efficient method of detecting the different response patterns derived from latent classes when developing a test. In testing situations, multiple latent traits measured by a battery of tests can exhibit a higher-order structure, and mixtures of latent classes may occur on…

  9. Multi-chain Markov chain Monte Carlo methods for computationally expensive models

    NASA Astrophysics Data System (ADS)

    Huang, M.; Ray, J.; Ren, H.; Hou, Z.; Bao, J.

    2017-12-01

    Markov chain Monte Carlo (MCMC) methods are used to infer model parameters from observational data. The parameters are inferred as probability densities, thus capturing estimation error due to sparsity of the data, and the shortcomings of the model. Multiple communicating chains executing the MCMC method have the potential to explore the parameter space better, and conceivably accelerate the convergence to the final distribution. We present results from tests conducted with the multi-chain method to show how the acceleration occurs; i.e., for loose convergence tolerances, the multiple chains do not make much of a difference. The ensemble of chains also seems to have the ability to accelerate the convergence of a few chains that might start from suboptimal starting points. Finally, we show the performance of the chains in the estimation of O(10) parameters using computationally expensive forward models such as the Community Land Model, where the sampling burden is distributed over multiple chains.
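
    One standard way to quantify convergence across multiple chains (not necessarily the diagnostic used by these authors) is the Gelman-Rubin potential scale reduction factor, which compares between-chain and within-chain variance. A sketch with synthetic draws:

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor (R-hat) for one scalar parameter.

    chains: array of shape (m, n) -- m chains with n samples each."""
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    B = n * chains.mean(axis=1).var(ddof=1)    # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()      # within-chain variance
    var_plus = (n - 1) / n * W + B / n         # pooled variance estimate
    return np.sqrt(var_plus / W)

rng = np.random.default_rng(1)
mixed = rng.normal(size=(4, 1000))                # 4 chains, same target
separated = mixed + 5.0 * np.arange(4)[:, None]   # chains stuck in different regions
rhat_mixed = gelman_rubin(mixed)          # near 1: converged
rhat_separated = gelman_rubin(separated)  # well above 1: not converged
```

    Values near 1 indicate the chains agree on the target distribution; a loose tolerance (e.g. R-hat < 1.2) is reached quickly even by a single chain, which is consistent with the abstract's observation that multiple chains help little at loose tolerances.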

  10. Many tests of significance: new methods for controlling type I errors.

    PubMed

    Keselman, H J; Miller, Charles W; Holland, Burt

    2011-12-01

    There have been many discussions of how Type I errors should be controlled when many hypotheses are tested (e.g., all possible comparisons of means, correlations, proportions, the coefficients in hierarchical models, etc.). By and large, researchers have adopted familywise (FWER) control, though this practice certainly is not universal. Familywise control is intended to deal with the multiplicity issue of computing many tests of significance, yet such control is conservative--that is, less powerful--compared to per test/hypothesis control. The purpose of our article is to introduce the readership, particularly those readers familiar with issues related to controlling Type I errors when many tests of significance are computed, to newer methods that provide protection from the effects of multiple testing, yet are more powerful than familywise controlling methods. Specifically, we introduce a number of procedures that control the k-FWER. These methods--say, 2-FWER instead of 1-FWER (i.e., FWER)--are equivalent to specifying that the probability of 2 or more false rejections is controlled at .05, whereas FWER controls the probability of any (i.e., 1 or more) false rejections at .05. 2-FWER implicitly tolerates 1 false rejection and makes no explicit attempt to control the probability of its occurrence, unlike FWER, which tolerates no false rejections at all. More generally, k-FWER tolerates k - 1 false rejections, but controls the probability of k or more false rejections at α =.05. We demonstrate with two published data sets how more hypotheses can be rejected with k-FWER methods compared to FWER control.
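
    The simplest k-FWER controlling procedure, the generalized Bonferroni procedure of Lehmann and Romano, rejects hypothesis i whenever p_i <= k * alpha / m. A sketch on invented p-values shows how 2-FWER control rejects more hypotheses than ordinary (1-FWER) Bonferroni:

```python
def k_fwer_bonferroni(pvalues, k=1, alpha=0.05):
    """Generalized Bonferroni (Lehmann-Romano): controls the probability of
    k or more false rejections at alpha by rejecting every p_i <= k*alpha/m."""
    cutoff = k * alpha / len(pvalues)
    return [i for i, p in enumerate(pvalues) if p <= cutoff]

# Invented p-values from m = 10 tests of significance
pvals = [0.001, 0.004, 0.008, 0.012, 0.030, 0.200, 0.350, 0.500, 0.700, 0.900]
fwer_rejects = k_fwer_bonferroni(pvals, k=1)    # ordinary Bonferroni (1-FWER)
kfwer_rejects = k_fwer_bonferroni(pvals, k=2)   # 2-FWER: tolerates 1 false rejection
```

    With k = 1 the cutoff is 0.005 and two hypotheses are rejected; with k = 2 the cutoff doubles to 0.01 and a third rejection is gained, the trade being that one false rejection is implicitly tolerated.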

  11. Context-Dependent Upper Limb Prosthesis Control for Natural and Robust Use.

    PubMed

    Amsuess, Sebastian; Vujaklija, Ivan; Goebel, Peter; Roche, Aidan D; Graimann, Bernhard; Aszmann, Oskar C; Farina, Dario

    2016-07-01

    Pattern recognition and regression methods applied to the surface EMG have been used for estimating the user-intended motor tasks across multiple degrees of freedom (DOF), for prosthetic control. While these methods are effective in several conditions, they are still characterized by some shortcomings. In this study, we propose a methodology that combines these two approaches for mutually alleviating their limitations. This resulted in a control method capable of context-dependent movement estimation that switched automatically between sequential (one DOF at a time) or simultaneous (multiple DOF) prosthesis control, based on an online estimation of signal dimensionality. The proposed method was evaluated in scenarios close to real-life situations, with the control of a physical prosthesis in applied tasks of varying difficulties. Test prostheses were individually manufactured for both able-bodied and transradial amputee subjects. With these prostheses, two amputees performed the Southampton Hand Assessment Procedure test with scores of 58 and 71 points. The five able-bodied individuals performed standardized tests, such as the box and block and clothespin tests, reducing the completion times by up to 30%, with respect to using a state-of-the-art pure sequential control algorithm. Apart from facilitating fast simultaneous movements, the proposed control scheme was also more intuitive to use, since human movements are predominated by simultaneous activations across joints. The proposed method thus represents a significant step towards intelligent, intuitive and natural control of upper limb prostheses.

  12. Pathway-based analyses.

    PubMed

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.

  13. Integrated Analysis of Pharmacologic, Clinical, and SNP Microarray Data using Projection onto the Most Interesting Statistical Evidence with Adaptive Permutation Testing

    PubMed Central

    Pounds, Stan; Cao, Xueyuan; Cheng, Cheng; Yang, Jun; Campana, Dario; Evans, William E.; Pui, Ching-Hon; Relling, Mary V.

    2010-01-01

    Powerful methods for integrated analysis of multiple biological data sets are needed to maximize interpretation capacity and acquire meaningful knowledge. We recently developed Projection Onto the Most Interesting Statistical Evidence (PROMISE). PROMISE is a statistical procedure that incorporates prior knowledge about the biological relationships among endpoint variables into an integrated analysis of microarray gene expression data with multiple biological and clinical endpoints. Here, PROMISE is adapted to the integrated analysis of pharmacologic, clinical, and genome-wide genotype data, incorporating knowledge about the biological relationships among pharmacologic and clinical response data. An efficient permutation-testing algorithm is introduced so that statistical calculations are computationally feasible in this higher-dimension setting. The new method is applied to a pediatric leukemia data set. The results clearly indicate that PROMISE is a powerful statistical tool for identifying genomic features that exhibit a biologically meaningful pattern of association with multiple endpoint variables. PMID:21516175

  14. Cross-beam coherence of infrasonic signals at local and regional ranges.

    PubMed

    Alberts, W C Kirkpatrick; Tenney, Stephen M

    2017-11-01

    Signals collected by infrasound arrays require continuous analysis by skilled personnel or by automatic algorithms in order to extract usable information. Typical pieces of information gained by analysis of infrasonic signals collected by multiple sensor arrays are arrival time, line of bearing, amplitude, and duration. These can all be used, often with significant accuracy, to locate sources. A very important part of this chain is associating collected signals across multiple arrays. Here, a pairwise, cross-beam coherence method of signal association is described that allows rapid signal association for high signal-to-noise ratio events captured by multiple infrasound arrays at ranges exceeding 150 km. Methods, test cases, and results are described.

  15. Generation of multiple Bessel beams for a biophotonics workstation.

    PubMed

    Cizmár, T; Kollárová, V; Tsampoula, X; Gunn-Moore, F; Sibbett, W; Bouchal, Z; Dholakia, K

    2008-09-01

    We present a simple method using an axicon and spatial light modulator to create multiple parallel Bessel beams and precisely control their individual positions in three dimensions. This technique is tested as an alternative to classical holographic beam shaping commonly used now in optical tweezers. Various applications of precise control of multiple Bessel beams are demonstrated within a single microscope giving rise to new methods for three-dimensional positional control of trapped particles or active sorting of micro-objects as well as "focus-free" photoporation of living cells. Overall this concept is termed a 'biophotonics workstation' where users may readily trap, sort and porate material using Bessel light modes in a microscope.

  16. Deep convolutional neural network based antenna selection in multiple-input multiple-output system

    NASA Astrophysics Data System (ADS)

    Cai, Jiaxin; Li, Yan; Hu, Ying

    2018-03-01

    Antenna selection of wireless communication system has attracted increasing attention due to the challenge of keeping a balance between communication performance and computational complexity in large-scale Multiple-Input Multiple-Output antenna systems. Recently, deep learning based methods have achieved promising performance for large-scale data processing and analysis in many application fields. This paper is the first attempt to introduce the deep learning technique into the field of Multiple-Input Multiple-Output antenna selection in wireless communications. First, the labels for the attenuation-coefficient channel matrices are generated by minimizing the key performance indicator of the training antenna systems. Then, a deep convolutional neural network that explicitly exploits the massive latent cues of attenuation coefficients is learned on the training antenna systems. Finally, we use the trained deep convolutional neural network to classify the channel matrix labels of test antennas and select the optimal antenna subset. Simulation results demonstrate that our method can achieve better performance than the state-of-the-art baselines for data-driven wireless antenna selection.

  17. Hierarchical screening for multiple mental disorders.

    PubMed

    Batterham, Philip J; Calear, Alison L; Sunderland, Matthew; Carragher, Natacha; Christensen, Helen; Mackinnon, Andrew J

    2013-10-01

    There is a need for brief, accurate screening when assessing multiple mental disorders. Two-stage hierarchical screening, consisting of brief pre-screening followed by a battery of disorder-specific scales for those who meet diagnostic criteria, may increase the efficiency of screening without sacrificing precision. This study tested whether more efficient screening could be gained using two-stage hierarchical screening than by administering multiple separate tests. Two Australian adult samples (N=1990) with high rates of psychopathology were recruited using Facebook advertising to examine four methods of hierarchical screening for four mental disorders: major depressive disorder, generalised anxiety disorder, panic disorder and social phobia. Using K6 scores to determine whether full screening was required did not increase screening efficiency. However, pre-screening based on two decision tree approaches or item gating led to considerable reductions in the mean number of items presented per disorder screened, with estimated item reductions of up to 54%. The sensitivity of these hierarchical methods approached 100% relative to the full screening battery. Further testing of the hierarchical screening approach based on clinical criteria and in other samples is warranted. The results demonstrate that a two-phase hierarchical approach to screening multiple mental disorders leads to considerable efficiency gains without reducing accuracy. Screening programs should take advantage of pre-screeners based on gating items or decision trees to reduce the burden on respondents. © 2013 Elsevier B.V. All rights reserved.

  18. On sample size of the Kruskal-Wallis test with application to a mouse peritoneal cavity study.

    PubMed

    Fan, Chunpeng; Zhang, Donghui; Zhang, Cun-Hui

    2011-03-01

    As the nonparametric generalization of the one-way analysis of variance model, the Kruskal-Wallis test applies when the goal is to test the difference between multiple samples and the underlying population distributions are nonnormal or unknown. Although the Kruskal-Wallis test has been widely used for data analysis, power and sample size methods for this test have been investigated to a much lesser extent. This article proposes new power and sample size calculation methods for the Kruskal-Wallis test based on the pilot study in either a completely nonparametric model or a semiparametric location model. No assumption is made on the shape of the underlying population distributions. Simulation results show that, in terms of sample size calculation for the Kruskal-Wallis test, the proposed methods are more reliable and preferable to some more traditional methods. A mouse peritoneal cavity study is used to demonstrate the application of the methods. © 2010, The International Biometric Society.
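
    The flavor of such power calculations can be illustrated by Monte Carlo simulation: compute the Kruskal-Wallis H statistic on samples drawn from pilot-like distributions and count rejections against the chi-square critical value. This sketch (normal location shifts, three groups) is an illustration of the idea, not the authors' proposed method:

```python
import numpy as np

def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic (continuous data assumed, no tie correction)."""
    pooled = np.concatenate(groups)
    ranks = np.empty(len(pooled))
    ranks[np.argsort(pooled)] = np.arange(1, len(pooled) + 1)
    N = len(pooled)
    h, start = 0.0, 0
    for g in groups:
        r = ranks[start:start + len(g)]
        h += len(g) * (r.mean() - (N + 1) / 2) ** 2   # deviation of mean group rank
        start += len(g)
    return 12.0 / (N * (N + 1)) * h

def simulated_power(shifts, n_per_group, n_sim=500, crit=5.991):
    """Monte Carlo power for 3 groups: P(H > chi-square 0.95 quantile, df=2)."""
    rng = np.random.default_rng(42)
    hits = 0
    for _ in range(n_sim):
        groups = [rng.normal(loc=s, size=n_per_group) for s in shifts]
        hits += kruskal_wallis_h(groups) > crit
    return hits / n_sim

power_null = simulated_power([0.0, 0.0, 0.0], n_per_group=20)  # ~ alpha = 0.05
power_alt = simulated_power([0.0, 0.8, 1.6], n_per_group=20)   # location shifts
```

    For sample size planning, one would increase n_per_group until the simulated power under the pilot-estimated alternative reaches the desired level.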

  19. The implementation of multiple intelligences based teaching model to improve mathematical problem solving ability for student of junior high school

    NASA Astrophysics Data System (ADS)

    Fasni, Nurli; Fatimah, Siti; Yulanda, Syerli

    2017-05-01

    This research has several aims: to determine whether the mathematical problem solving ability of students taught with a Multiple Intelligences based teaching model is higher than that of students taught with cooperative learning; to measure the improvement in mathematical problem solving ability of students taught with the Multiple Intelligences based teaching model; to measure the improvement in mathematical problem solving ability of students taught with cooperative learning; and to gauge students' attitudes toward the Multiple Intelligences based teaching model. The method employed is a quasi-experiment with pre-test and post-test control. The population of this research is all of grade VII in SMP Negeri 14 Bandung in the even term of 2013/2014, from which two classes were taken as samples. One class was taught using the Multiple Intelligences based teaching model and the other was taught using cooperative learning. The data were obtained from a test of mathematical problem solving, a scale questionnaire of student attitudes, and observation. The results show that the mathematical problem solving ability of students taught with the Multiple Intelligences based teaching model is higher than that of students taught with cooperative learning; the mathematical problem solving ability of both groups is at an intermediate level; and the students showed a positive attitude toward learning mathematics with the Multiple Intelligences based teaching model. As a recommendation for future research, the Multiple Intelligences based teaching model can be tested on other subjects and other abilities.

  20. Multiple disturbances classifier for electric signals using adaptive structuring neural networks

    NASA Astrophysics Data System (ADS)

    Lu, Yen-Ling; Chuang, Cheng-Long; Fahn, Chin-Shyurng; Jiang, Joe-Air

    2008-07-01

    This work proposes a novel classifier to recognize multiple disturbances for electric signals of power systems. The proposed classifier consists of a series of pipeline-based processing components, including amplitude estimator, transient disturbance detector, transient impulsive detector, wavelet transform and a brand-new neural network for recognizing multiple disturbances in a power quality (PQ) event. Most of the previously proposed methods usually treated a PQ event as a single disturbance at a time. In practice, however, a PQ event often consists of various types of disturbances at the same time. Therefore, the performances of those methods might be limited in real power systems. This work considers the PQ event as a combination of several disturbances, including steady-state and transient disturbances, which is more analogous to the real status of a power system. Six types of commonly encountered power quality disturbances are considered for training and testing the proposed classifier. The proposed classifier has been tested on electric signals that contain single disturbance or several disturbances at a time. Experimental results indicate that the proposed PQ disturbance classification algorithm can achieve a high accuracy of more than 97% in various complex testing cases.

  1. A Systematic Analysis of 2 Monoisocentric Techniques for the Treatment of Multiple Brain Metastases

    PubMed Central

    Stathakis, Sotirios; Gutierrez, Alonso N.; Pappas, Evangelos; Crownover, Richard; Floyd, John R.; Papanikolaou, Niko

    2016-01-01

    Background: In this treatment planning study, we compare the plan quality and delivery parameters for the treatment of multiple brain metastases using 2 monoisocentric techniques: the Multiple Metastases Element from Brainlab and the RapidArc volumetric-modulated arc therapy from Varian Medical Systems. Methods: Eight patients who were treated in our institution for multiple metastases (3-7 lesions) were replanned with Multiple Metastases Element using noncoplanar dynamic conformal arcs. The same patients were replanned with the RapidArc technique in Eclipse using 4 noncoplanar arcs. Both techniques were designed using a single isocenter. Plan quality metrics (conformity index, homogeneity index, gradient index, and R50%), monitor unit, and the planning time were recorded. Comparison of the Multiple Metastases Element and RapidArc plans was performed using Shapiro-Wilk test, paired Student t test, and Wilcoxon signed rank test. Results: A paired Wilcoxon signed rank test between Multiple Metastases Element and RapidArc showed comparable plan quality metrics and dose to brain. Mean ± standard deviation values of conformity index were 1.8 ± 0.7 and 1.7 ± 0.6, homogeneity index were 1.3 ± 0.1 and 1.3 ± 0.1, gradient index were 5.0 ± 1.8 and 5.1 ± 1.9, and R50% were 4.9 ± 1.8 and 5.0 ± 1.9 for Multiple Metastases Element and RapidArc plans, respectively. Mean brain dose was 2.3 and 2.7 Gy for Multiple Metastases Element and RapidArc plans, respectively. The mean value of monitor units in Multiple Metastases Element plan was 7286 ± 1065, which is significantly lower than the RapidArc monitor units of 9966 ± 1533 (P < .05). Conclusion: For the planning of multiple brain lesions to be treated with stereotactic radiosurgery, Multiple Metastases Element planning software produced equivalent conformity, homogeneity, dose falloff, and brain V12 Gy but required significantly lower monitor units, when compared to RapidArc plans. PMID:27612917

  2. A Standard Platform for Testing and Comparison of MDAO Architectures

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Moore, Kenneth T.; Hearn, Tristan A.; Naylor, Bret A.

    2012-01-01

    The Multidisciplinary Design Analysis and Optimization (MDAO) community has developed a multitude of algorithms and techniques, called architectures, for performing optimizations on complex engineering systems which involve coupling between multiple discipline analyses. These architectures seek to efficiently handle optimizations with computationally expensive analyses involving multiple disciplines. We propose a new testing procedure that can provide a quantitative and qualitative means of comparison among architectures. The proposed test procedure is implemented within the open source framework OpenMDAO, and comparative results are presented for five well-known architectures: MDF, IDF, CO, BLISS, and BLISS-2000. We also demonstrate how using open source software development methods can allow the MDAO community to submit new problems and architectures to keep the test suite relevant.

  3. Development of multiple choice pictorial test for measuring the dimensions of knowledge

    NASA Astrophysics Data System (ADS)

    Nahadi, Siswaningsih, Wiwi; Erna

    2017-05-01

    This study aims to develop a multiple choice pictorial test as a tool to measure the dimensions of knowledge in the chemical equilibrium subject. The method used is Research and Development with validation, conducted in the preliminary study and model development stages. The product is a multiple choice pictorial test. The test consisted of 22 items and was administered to 64 grade XII high school students. The quality of the test was determined by its validity, reliability, difficulty index, discrimination power, and distractor effectiveness. The validity of the test was determined by CVR calculation using 8 validators (4 university teachers and 4 high school teachers), with an average CVR value of 0.89. The reliability of the test is in the very high category, with a value of 0.87. The discrimination power of the items is very good for 32%, good for 59%, and sufficient for 20% of items. The test has varying levels of difficulty: 23% of items are in the difficult category, 50% in the medium category, and 27% in the easy category. The distractor effectiveness of the items is very poor for 1%, poor for 1%, medium for 4%, good for 39%, and very good for 55%. The dimensions of knowledge measured consist of factual, conceptual, and procedural knowledge. Based on the questionnaire, students responded quite well to the developed test, and most students preferred this kind of pictorial multiple choice test, which includes pictures as an evaluation tool, to narrative tests dominated by text.

  4. On Max-Plus Algebra and Its Application on Image Steganography

    PubMed Central

    Santoso, Kiswara Agung

    2018-01-01

    We propose a new steganography method for hiding an image inside another image using matrix multiplication over max-plus algebra. This is especially interesting because the matrices generally used for encoding or disguising information have inverses, whereas matrix multiplication in max-plus algebra does not. The advantage of this method is that the image hidden in the cover image can be larger than with the previous method. The proposed method has been tested on many secret images with satisfactory results: it offers a high level of strength and security and can be used in various operating systems. PMID:29887761

  5. On Max-Plus Algebra and Its Application on Image Steganography.

    PubMed

    Santoso, Kiswara Agung; Fatmawati; Suprajitno, Herry

    2018-01-01

    We propose a new steganography method for hiding an image inside another image using matrix multiplication over max-plus algebra. This is especially interesting because the matrices generally used for encoding or disguising information have inverses, whereas matrix multiplication in max-plus algebra does not. The advantage of this method is that the image hidden in the cover image can be larger than with the previous method. The proposed method has been tested on many secret images with satisfactory results: it offers a high level of strength and security and can be used in various operating systems.
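    As context for the two records above, the max-plus matrix product they rely on is easy to sketch: entries combine by addition and reduce by maximum, and the operation is generally not invertible. The following is a minimal illustration of the operation only, not the authors' encoding pipeline:

    ```python
    # Max-plus matrix "multiplication": (A (x) B)[i][j] = max_k (A[i][k] + B[k][j]).
    # Unlike the ordinary matrix product, this is generally not invertible,
    # which is the property the steganography scheme relies on.

    def maxplus_matmul(A, B):
        """Multiply two matrices over the max-plus semiring (R, max, +)."""
        n, m, p = len(A), len(B), len(B[0])
        assert all(len(row) == m for row in A), "inner dimensions must agree"
        return [[max(A[i][k] + B[k][j] for k in range(m)) for j in range(p)]
                for i in range(n)]

    A = [[1, 3],
         [2, 0]]
    B = [[0, 4],
         [5, 1]]
    print(maxplus_matmul(A, B))  # [[8, 5], [5, 6]]
    ```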

  6. A novel method for rapid determination of total solid content in viscous liquids by multiple headspace extraction gas chromatography.

    PubMed

    Xin, Li-Ping; Chai, Xin-Sheng; Hu, Hui-Chao; Barnes, Donald G

    2014-09-05

    This work demonstrates a novel method for the rapid determination of total solid content in viscous liquid (polymer-enriched) samples. The method is based on multiple headspace extraction gas chromatography (MHE-GC), performed on a headspace vial at a temperature above the boiling point of water, so that the trend of water loss from the tested liquid due to evaporation can be followed. With limited MHE-GC testing (e.g., 5 extractions) and a one-point calibration procedure (i.e., recording the weight difference before and after analysis), the total amount of water in the sample can be determined, from which the total solid content in the liquid can be calculated. A number of black liquors were analyzed by the new method, which yielded results closely matching those of the reference method; the results of the two methods differed by no more than 2.3%. Compared with the reference method, the MHE-GC method is much simpler and more practical, and is therefore suitable for the rapid determination of solid content in many polymer-containing liquid samples. Copyright © 2014 Elsevier B.V. All rights reserved.
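    The extrapolation idea behind MHE can be sketched numerically. Assuming, as standard MHE theory does, that successive peak areas decay roughly geometrically (A_i ≈ A_1·q^(i-1)), the total signal is the geometric-series sum A_1/(1 − q). The helper below is a hypothetical illustration, not the paper's procedure, and the one-point mass calibration step is omitted:

    ```python
    import math

    def total_area(areas):
        """Extrapolate the total peak area from a few MHE extractions by
        fitting ln(A_i) = ln(A_1) + (i - 1) * ln(q) with least squares."""
        n = len(areas)
        xs = list(range(n))
        ys = [math.log(a) for a in areas]
        xbar = sum(xs) / n
        ybar = sum(ys) / n
        slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
                 / sum((x - xbar) ** 2 for x in xs))
        q = math.exp(slope)                 # per-extraction decay ratio
        a1 = math.exp(ybar - slope * xbar)  # first-extraction area
        return a1 / (1.0 - q)               # geometric-series total

    # Five simulated extractions with decay ratio q = 0.6:
    areas = [1000 * 0.6 ** i for i in range(5)]
    print(round(total_area(areas)))  # 2500  (= 1000 / (1 - 0.6))
    ```

    In the paper's one-point calibration, the measured weight loss over the extractions would convert this extrapolated area into a water mass.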

  7. Multivariate two-part statistics for analysis of correlated mass spectrometry data from multiple biological specimens.

    PubMed

    Taylor, Sandra L; Ruhaak, L Renee; Weiss, Robert H; Kelly, Karen; Kim, Kyoungmi

    2017-01-01

    High-throughput mass spectrometry (MS) is now being used to profile small molecular compounds across multiple biological sample types from the same subjects, with the goal of leveraging information across biospecimens. Multivariate statistical methods that combine information from all biospecimens could be more powerful than the usual univariate analyses. However, missing values are common in MS data, and imputation can impact between-biospecimen correlation and multivariate analysis results. We propose two multivariate two-part statistics that accommodate missing values and combine data from all biospecimens to identify differentially regulated compounds. Statistical significance is determined using a multivariate permutation null distribution. Relative to univariate tests, the multivariate procedures detected more significant compounds in three biological datasets. In a simulation study, we showed that multi-biospecimen testing procedures were more powerful than single-biospecimen methods when compounds are differentially regulated in multiple biospecimens, but univariate methods can be more powerful if compounds are differentially regulated in only one biospecimen. We provide R functions to implement and illustrate our method as supplementary information. Contact: sltaylor@ucdavis.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  8. Reproducibility of a silicone-based test food to masticatory performance evaluation by different sieve methods.

    PubMed

    Sánchez-Ayala, Alfonso; Vilanova, Larissa Soares Reis; Costa, Marina Abrantes; Farias-Neto, Arcelino

    2014-01-01

    The aim of this study was to evaluate the reproducibility of the condensation silicone Optosil Comfort® as an artificial test food for masticatory performance evaluation. Twenty dentate subjects with a mean age of 23.3±0.7 years were selected. Masticatory performance was evaluated using the simple (MPI), double (IME), and multiple sieve methods. Trials were carried out five times by three examiners: three times by the first, and once each by the second and third examiners. Friedman's test was used to find differences among time trials. Reproducibility was determined by the intra-class correlation (ICC) test (α=0.05). No differences among time trials were found, except for MPI-4 mm (p=0.022) in the first examiner's results. The intra-examiner reproducibility (ICC) of almost all data was high (ICC≥0.92, p<0.001), being moderate only for MPI-0.50 mm (ICC=0.89, p<0.001). The inter-examiner reproducibility was high (ICC>0.93, p<0.001) for all results. For the multiple sieve method, the mean absolute difference between repeated measurements was lower than 1 mm. This trend was observed only from MPI-0.50 to MPI-1.4 for the single sieve method, and from IME-0.71/0.50 to IME-1.40/1.00 for the double sieve method. The results suggest that regardless of the method used, the reproducibility of Optosil Comfort® is high.

  9. Two-dimensional imaging via a narrowband MIMO radar system with two perpendicular linear arrays.

    PubMed

    Wang, Dang-wei; Ma, Xiao-yan; Su, Yi

    2010-05-01

    This paper presents a system model and method for the 2-D imaging application via a narrowband multiple-input multiple-output (MIMO) radar system with two perpendicular linear arrays. Furthermore, the imaging formulation for our method is developed through a Fourier integral processing, and the parameters of antenna array including the cross-range resolution, required size, and sampling interval are also examined. Different from the spatial sequential procedure sampling the scattered echoes during multiple snapshot illuminations in inverse synthetic aperture radar (ISAR) imaging, the proposed method utilizes a spatial parallel procedure to sample the scattered echoes during a single snapshot illumination. Consequently, the complex motion compensation in ISAR imaging can be avoided. Moreover, in our array configuration, multiple narrowband spectrum-shared waveforms coded with orthogonal polyphase sequences are employed. The mainlobes of the compressed echoes from the different filter band could be located in the same range bin, and thus, the range alignment in classical ISAR imaging is not necessary. Numerical simulations based on synthetic data are provided for testing our proposed method.

  10. Detection of Salmonella spp. in veterinary samples by combining selective enrichment and real-time PCR.

    PubMed

    Goodman, Laura B; McDonough, Patrick L; Anderson, Renee R; Franklin-Guild, Rebecca J; Ryan, James R; Perkins, Gillian A; Thachil, Anil J; Glaser, Amy L; Thompson, Belinda S

    2017-11-01

    Rapid screening for enteric bacterial pathogens in clinical environments is essential for biosecurity. Salmonella found in veterinary hospitals, particularly Salmonella enterica serovar Dublin, can pose unique challenges for culture and testing because of its poor growth. Multiple Salmonella serovars including Dublin are emerging threats to public health given increasing prevalence and antimicrobial resistance. We adapted an automated food testing method to veterinary samples and evaluated the performance of the method in a variety of matrices including environmental samples (n = 81), tissues (n = 52), feces (n = 148), and feed (n = 29). A commercial kit was chosen as the basis for this approach in view of extensive performance characterizations published by multiple independent organizations. A workflow was established for efficiently and accurately testing veterinary matrices and environmental samples by use of real-time PCR after selective enrichment in Rappaport-Vassiliadis soya (RVS) medium. Using this method, the detection limit for S. Dublin improved by 100-fold over subculture on selective agars (eosin-methylene blue, brilliant green, and xylose-lysine-deoxycholate). Overall, the procedure was effective in detecting Salmonella spp. and provided next-day results.

  11. Asymmetry in Student Achievement on Multiple-Choice and Constructed-Response Items in Reversible Mathematics Processes

    ERIC Educational Resources Information Center

    Sangwin, Christopher J.; Jones, Ian

    2017-01-01

    In this paper we report the results of an experiment designed to test the hypothesis that when faced with a question involving the inverse direction of a reversible mathematical process, students solve a multiple-choice version by verifying the answers presented to them by the direct method, not by undertaking the actual inverse calculation.…

  12. Evaluation of a Multiple Mediator Model of the Relationship between Core Self-Evaluations and Job Satisfaction in Employed Individuals with Disabilities

    ERIC Educational Resources Information Center

    Smedema, Susan Miller; Kesselmayer, Rachel Friefeld; Peterson, Lauren

    2018-01-01

    Purpose: To test a mediation model of the relationship between core self-evaluations (CSE) and job satisfaction in employed individuals with disabilities. Method: A quantitative descriptive design using Hayes's (2012) PROCESS macro for SPSS and multiple regression analysis. Two-hundred fifty-nine employed persons with disabilities were recruited…

  13. Simultaneous Inference Procedures for Means.

    ERIC Educational Resources Information Center

    Krishnaiah, P. R.

    Some aspects of simultaneous tests for means are reviewed. Specifically, the comparison of univariate or multivariate normal populations based on the values of the means or mean vectors when the variances or covariance matrices are equal is discussed. Tukey's and Dunnett's tests for multiple comparisons of means, Scheffe's method of examining…

  14. Testing Public Anxiety Treatments against a Credible Placebo Control

    ERIC Educational Resources Information Center

    Duff, Desiree C.; Levine, Timothy R.; Beatty, Michael J.; Woolbright, Jessica; Park, Hee Sun

    2007-01-01

    Research investigating public speaking anxiety treatments is subject to demand effects. This study tests the relative effectiveness of systematic desensitization (SD) and multiple treatment method (MT) containing visualization therapy against no-treatment and credible placebo controls. Data (N = 238) were collected at six points in a public…

  15. Ultrasensitive surveillance of sensors and processes

    DOEpatents

    Wegerich, Stephan W.; Jarman, Kristin K.; Gross, Kenneth C.

    2001-01-01

    A method and apparatus for monitoring a source of data to determine the operating state of a working system. The method includes determining a sensor (or data source) arrangement associated with monitoring the data source for a system; activating a first method for performing a sequential probability ratio test if the arrangement includes a single data (sensor) source; activating a second method for performing a regression sequential probability ratio testing procedure if the arrangement includes a pair of sensors (data sources) with signals that are linearly or non-linearly related; activating a third method for performing a bounded angle ratio test procedure if the arrangement includes multiple sensors; and utilizing at least one of the first, second, and third methods to accumulate sensor signals and determine the operating state of the system.
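    A sequential probability ratio test of the kind named in this patent abstract can be sketched for a single Gaussian sensor signal. This is a textbook Wald SPRT under assumed parameters (hypothetical means, error rates, and state labels), not the patented procedure:

    ```python
    import math

    # Wald SPRT for drift in a Gaussian signal.
    # H0: mean = 0, H1: mean = mu1, known sigma; stop when the cumulative
    # log-likelihood ratio crosses thresholds set by error rates alpha, beta.

    def sprt(samples, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
        upper = math.log((1 - beta) / alpha)   # decide "degraded" above this
        lower = math.log(beta / (1 - alpha))   # decide "normal" below this
        llr = 0.0
        for n, x in enumerate(samples, 1):
            # log-likelihood ratio increment for N(mu1, sigma) vs N(0, sigma)
            llr += (mu1 / sigma**2) * (x - mu1 / 2)
            if llr >= upper:
                return "degraded", n
            if llr <= lower:
                return "normal", n
        return "undecided", len(samples)

    print(sprt([0.9, 1.1, 1.0, 1.2, 0.8, 1.0, 1.1, 0.9, 1.0, 1.2]))
    # ('degraded', 10)
    ```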

  16. Ultrasensitive surveillance of sensors and processes

    DOEpatents

    Wegerich, Stephan W.; Jarman, Kristin K.; Gross, Kenneth C.

    1999-01-01

    A method and apparatus for monitoring a source of data to determine the operating state of a working system. The method includes determining a sensor (or data source) arrangement associated with monitoring the data source for a system; activating a first method for performing a sequential probability ratio test if the arrangement includes a single data (sensor) source; activating a second method for performing a regression sequential probability ratio testing procedure if the arrangement includes a pair of sensors (data sources) with signals that are linearly or non-linearly related; activating a third method for performing a bounded angle ratio test procedure if the arrangement includes multiple sensors; and utilizing at least one of the first, second, and third methods to accumulate sensor signals and determine the operating state of the system.

  17. An efficient genome-wide association test for mixed binary and continuous phenotypes with applications to substance abuse research.

    PubMed

    Buu, Anne; Williams, L Keoki; Yang, James J

    2018-03-01

    We propose a new genome-wide association test for mixed binary and continuous phenotypes that uses an efficient numerical method to estimate the empirical distribution of the Fisher's combination statistic under the null hypothesis. Our simulation study shows that the proposed method controls the type I error rate and also maintains its power at the level of the permutation method. More importantly, the computational efficiency of the proposed method is much higher than that of the permutation method. The simulation results also indicate that the power of the test increases when the genetic effect increases, the minor allele frequency increases, and the correlation between responses decreases. The statistical analysis of the database of the Study of Addiction: Genetics and Environment demonstrates that the proposed method, by combining multiple phenotypes, can increase the power of identifying markers that might not otherwise be chosen using marginal tests.
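    Fisher's combination statistic, which the test above builds on, combines k independent p-values as X = −2 Σ ln p_i, chi-square distributed with 2k degrees of freedom under the null. Since the degrees of freedom are even, the survival function has a closed form, so a minimal sketch needs no statistics library (this illustrates the classical statistic only, not the authors' empirical-null estimation):

    ```python
    import math

    def fisher_combination(pvalues):
        """Fisher's method: combine independent p-values into one statistic
        and its p-value under chi-square with 2k degrees of freedom."""
        k = len(pvalues)
        x = -2.0 * sum(math.log(p) for p in pvalues)
        # For even df = 2k: P(X > x) = exp(-x/2) * sum_{i<k} (x/2)^i / i!
        half = x / 2.0
        sf = math.exp(-half) * sum(half**i / math.factorial(i) for i in range(k))
        return x, sf

    x, p = fisher_combination([0.04, 0.10, 0.03])
    print(round(x, 2), round(p, 4))  # 18.06 0.0061
    ```

    Three individually modest p-values combine into a much smaller one, which is why combining correlated phenotypes (with an appropriately corrected null) can find markers that marginal tests miss.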

  18. A prevalence-based association test for case-control studies.

    PubMed

    Ryckman, Kelli K; Jiang, Lan; Li, Chun; Bartlett, Jacquelaine; Haines, Jonathan L; Williams, Scott M

    2008-11-01

    Genetic association is often determined in case-control studies by the differential distribution of alleles or genotypes. Recent work has demonstrated that association can also be assessed by deviations from the expected distributions of alleles or genotypes. Specifically, multiple methods motivated by the principles of Hardy-Weinberg equilibrium (HWE) have been developed. However, these methods do not take into account many of the assumptions of HWE. Therefore, we have developed a prevalence-based association test (PRAT) as an alternative method for detecting association in case-control studies. This method, also motivated by the principles of HWE, uses an estimated population allele frequency to generate expected genotype frequencies instead of using the case and control frequencies separately. Our method often has greater power, under a wide variety of genetic models, to detect association than genotypic, allelic or Cochran-Armitage trend association tests. Therefore, we propose PRAT as a powerful alternative method of testing for association.

  19. Computational Issues in Damping Identification for Large Scale Problems

    NASA Technical Reports Server (NTRS)

    Pilkey, Deborah L.; Roe, Kevin P.; Inman, Daniel J.

    1997-01-01

    Two damping identification methods are tested for efficiency in large-scale applications. One is an iterative routine, and the other a least squares method. Numerical simulations have been performed on multiple degree-of-freedom models to test the effectiveness of the algorithm and the usefulness of parallel computation for the problems. High Performance Fortran is used to parallelize the algorithm. Tests were performed using the IBM-SP2 at NASA Ames Research Center. The least squares method tested incurs high communication costs, which reduces the benefit of high performance computing. This method's memory requirement grows at a very rapid rate, meaning that larger problems can quickly exceed available computer memory. The iterative method's memory requirement grows at a much slower pace and is able to handle problems with 500+ degrees of freedom on a single processor. This method benefits from parallelization, and significant speedup can be seen for problems of 100+ degrees of freedom.

  20. Markov chains for testing redundant software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1988-01-01

    A preliminary design for a validation experiment has been developed that addresses several problems unique to assuring the extremely high quality of multiple-version programs in process-control software. The procedure uses Markov chains to model the error states of the multiple version programs. The programs are observed during simulated process-control testing, and estimates are obtained for the transition probabilities between the states of the Markov chain. The experimental Markov chain model is then expanded into a reliability model that takes into account the inertia of the system being controlled. The reliability of the multiple version software is computed from this reliability model at a given confidence level using confidence intervals obtained for the transition probabilities during the experiment. An example demonstrating the method is provided.
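    The core estimation step described above, obtaining transition probabilities of the error-state Markov chain from observed testing runs, can be sketched with simple counts. The state names here are hypothetical placeholders for the programs' error states:

    ```python
    from collections import Counter, defaultdict

    def estimate_transitions(state_sequence):
        """Estimate Markov transition probabilities from one observed
        sequence of states by normalized transition counts."""
        counts = defaultdict(Counter)
        for src, dst in zip(state_sequence, state_sequence[1:]):
            counts[src][dst] += 1
        return {src: {dst: c / sum(dsts.values()) for dst, c in dsts.items()}
                for src, dsts in counts.items()}

    # A toy run alternating between a correct state and an error state:
    seq = ["ok", "ok", "error", "ok", "ok", "ok", "error", "error", "ok"]
    print(estimate_transitions(seq))
    ```

    In the experiment these point estimates would be replaced by confidence intervals on each transition probability, which then feed the reliability model.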

  1. Power Hardware-in-the-Loop Testing of Multiple Photovoltaic Inverters' Volt-Var Control with Real-Time Grid Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, Sudipta; Nelson, Austin; Hoke, Anderson

    2016-12-12

    Traditional testing methods fall short in evaluating interactions between multiple smart inverters providing advanced grid support functions, because such interactions largely depend on their placement on the electric distribution system and the impedances between them. Even though significant concerns have been raised by utilities about the effects of such interactions, little effort has been made to evaluate them. In this paper, power hardware-in-the-loop (PHIL) testing was utilized to evaluate autonomous volt-var operations of multiple smart photovoltaic (PV) inverters connected to a simple distribution feeder model. The results provided in this paper show that, depending on volt-var control (VVC) parameters and grid parameters, interaction between inverters and between the inverter and the grid is possible in some extreme cases with very high VVC slopes, fast response times, and large VVC response delays.

  2. Statistical significance of combinatorial regulations

    PubMed Central

    Terada, Aika; Okada-Hatakeyama, Mariko; Tsuda, Koji; Sese, Jun

    2013-01-01

    More than three transcription factors often work together to enable cells to respond to various signals. The detection of combinatorial regulation by multiple transcription factors, however, is not only computationally nontrivial but also extremely unlikely because of multiple testing correction. The exponential growth in the number of tests forces us to set a strict limit on the maximum arity. Here, we propose an efficient branch-and-bound algorithm called the “limitless arity multiple-testing procedure” (LAMP) to count the exact number of testable combinations and calibrate the Bonferroni factor to the smallest possible value. LAMP lists significant combinations without any limit, whereas the family-wise error rate is rigorously controlled under the threshold. In the human breast cancer transcriptome, LAMP discovered statistically significant combinations of as many as eight binding motifs. This method may contribute to uncover pathways regulated in a coordinated fashion and find hidden associations in heterogeneous data. PMID:23882073
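    For contrast with LAMP's calibrated factor, the baseline Bonferroni correction it improves on is a one-liner: with m hypotheses and family-wise error rate α, each test runs at α/m. This sketch shows only the plain correction, not LAMP's counting of testable combinations:

    ```python
    def bonferroni(pvalues, alpha=0.05):
        """Return indices of hypotheses rejected at family-wise level alpha
        under the plain Bonferroni correction (threshold alpha / m)."""
        m = len(pvalues)
        return [i for i, p in enumerate(pvalues) if p < alpha / m]

    pvals = [0.001, 0.049, 0.012, 0.2]
    print(bonferroni(pvals))  # [0, 2] -- 0.049 no longer passes at 0.05 / 4
    ```

    LAMP's contribution is to shrink m from the exponential number of candidate motif combinations to only those that are actually testable, making the corrected threshold far less conservative.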

  3. Supratentorial lesions contribute to trigeminal neuralgia in multiple sclerosis.

    PubMed

    Fröhlich, Kilian; Winder, Klemens; Linker, Ralf A; Engelhorn, Tobias; Dörfler, Arnd; Lee, De-Hyung; Hilz, Max J; Schwab, Stefan; Seifert, Frank

    2018-06-01

    Background: It has been proposed that multiple sclerosis lesions afflicting the pontine trigeminal afferents contribute to trigeminal neuralgia in multiple sclerosis. So far, there are no imaging studies that have evaluated interactions between supratentorial lesions and trigeminal neuralgia in multiple sclerosis patients. Methods: We conducted a retrospective study and sought multiple sclerosis patients with trigeminal neuralgia and controls in a local database. Multiple sclerosis lesions were manually outlined and transformed into stereotaxic space. We determined the lesion overlap and performed a voxel-wise subtraction analysis. Secondly, we conducted a voxel-wise non-parametric analysis using the Liebermeister test. Results: From 12,210 multiple sclerosis patient records screened, we identified 41 patients with trigeminal neuralgia. The voxel-wise subtraction analysis yielded associations between trigeminal neuralgia and multiple sclerosis lesions in the pontine trigeminal afferents, as well as larger supratentorial lesion clusters in the contralateral insula and hippocampus. The non-parametric statistical analysis using the Liebermeister test yielded similar areas to be associated with multiple sclerosis-related trigeminal neuralgia. Conclusions: Our study confirms previous data on associations between multiple sclerosis-related trigeminal neuralgia and pontine lesions, and showed for the first time an association with lesions in the insular region, a region involved in pain processing and endogenous pain modulation.

  4. The effect of reading assignments in guided inquiry learning on students’ critical thinking skills

    NASA Astrophysics Data System (ADS)

    Syarkowi, A.

    2018-05-01

    The purpose of this study was to determine the effect of reading assignments in guided inquiry learning on senior high school students’ critical thinking skills. The research method was a quasi-experiment with the reading assignment as the treatment. The topic of the inquiry process was Kirchhoff’s laws. The instrument was a set of 25 multiple-choice interpretive exercises with justification, divided into three categories: basic clarification, the bases for a decision, and inference skills. The significance test showed that the improvement in the critical thinking skills of the experimental class was significantly higher than that of the control class, so it can be concluded that reading assignments can improve students’ critical thinking skills.

  5. The establisment of an achievement test for determination of primary teachers’ knowledge level of earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aydin, Süleyman, E-mail: yupul@hotmail.com; Haşiloğlu, M. Akif, E-mail: mehmet.hasiloglu@hotmail.com; Kunduraci, Ayşe, E-mail: ayse-kndrc@hotmail.com

    This study aimed to develop an academic achievement test to establish students’ knowledge about earthquakes and ways of protecting against them. The method followed the steps that Webb (1994) set out for developing an academic achievement test for a unit. A multiple-choice test of 25 questions was prepared to measure pre-service teachers’ knowledge levels about earthquakes and ways of protecting against them. The test was reviewed by six academics (one from the field of geography and five science educators) and two expert science teachers. The prepared test was administered to 93 pre-service teachers studying in the elementary education department in the 2014-2015 academic year. Following the validity and reliability analyses, the test was reduced to 20 items. The Pearson product-moment split-half reliability coefficient was found to be 0.94; when adjusted with the Spearman-Brown formula, the reliability coefficient was 0.97.
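    The final adjustment reported above is the Spearman-Brown prophecy formula, r_full = k·r_half / (1 + (k − 1)·r_half) with k = 2 for split halves. A quick check against the reported numbers (a sketch, not the authors' computation):

    ```python
    def spearman_brown(r_half, factor=2):
        """Predict test reliability when the test length changes by `factor`;
        factor=2 converts a split-half coefficient to full-test reliability."""
        return factor * r_half / (1 + (factor - 1) * r_half)

    print(round(spearman_brown(0.94), 2))  # 0.97, matching the reported value
    ```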

  6. Identifying and exploiting trait-relevant tissues with multiple functional annotations in genome-wide association studies

    PubMed Central

    Zhang, Shujun

    2018-01-01

    Genome-wide association studies (GWASs) have identified many disease associated loci, the majority of which have unknown biological functions. Understanding the mechanism underlying trait associations requires identifying trait-relevant tissues and investigating associations in a trait-specific fashion. Here, we extend the widely used linear mixed model to incorporate multiple SNP functional annotations from omics studies with GWAS summary statistics to facilitate the identification of trait-relevant tissues, with which to further construct powerful association tests. Specifically, we rely on a generalized estimating equation based algorithm for parameter inference, a mixture modeling framework for trait-tissue relevance classification, and a weighted sequence kernel association test constructed based on the identified trait-relevant tissues for powerful association analysis. We refer to our analytic procedure as the Scalable Multiple Annotation integration for trait-Relevant Tissue identification and usage (SMART). With extensive simulations, we show how our method can make use of multiple complementary annotations to improve the accuracy for identifying trait-relevant tissues. In addition, our procedure allows us to make use of the inferred trait-relevant tissues, for the first time, to construct more powerful SNP set tests. We apply our method for an in-depth analysis of 43 traits from 28 GWASs using tissue-specific annotations in 105 tissues derived from ENCODE and Roadmap. Our results reveal new trait-tissue relevance, pinpoint important annotations that are informative of trait-tissue relationship, and illustrate how we can use the inferred trait-relevant tissues to construct more powerful association tests in the Wellcome trust case control consortium study. PMID:29377896

  7. The use of regression analysis in determining reference intervals for low hematocrit and thrombocyte count in multiple electrode aggregometry and platelet function analyzer 100 testing of platelet function.

    PubMed

    Kuiper, Gerhardus J A J M; Houben, Rik; Wetzels, Rick J H; Verhezen, Paul W M; Oerle, Rene van; Ten Cate, Hugo; Henskens, Yvonne M C; Lancé, Marcus D

    2017-11-01

    Low platelet counts and hematocrit levels hinder whole blood point-of-care testing of platelet function. Thus far, no reference ranges for MEA (multiple electrode aggregometry) and PFA-100 (platelet function analyzer 100) devices exist for low ranges. Through dilution methods of volunteer whole blood, platelet function at low ranges of platelet count and hematocrit levels was assessed on MEA for four agonists and for PFA-100 in two cartridges. Using (multiple) regression analysis, 95% reference intervals were computed for these low ranges. Low platelet counts affected MEA in a positive correlation (all agonists showed r² ≥ 0.75) and PFA-100 in an inverse correlation (closure times were prolonged with lower platelet counts). Lowered hematocrit did not affect MEA testing, except for arachidonic acid activation (ASPI), which showed a weak positive correlation (r² = 0.14). Closure time on PFA-100 testing was inversely correlated with hematocrit for both cartridges. Regression analysis revealed different 95% reference intervals in comparison with originally established intervals for both MEA and PFA-100 in low platelet or hematocrit conditions. Multiple regression analysis of ASPI and both tests on the PFA-100 for combined low platelet and hematocrit conditions revealed that only PFA-100 testing should be adjusted for both thrombocytopenia and anemia. 95% reference intervals were calculated using multiple regression analysis. However, coefficients of determination of PFA-100 were poor, and some variance remained unexplained. Thus, in this pilot study using (multiple) regression analysis, we could establish reference intervals of platelet function in anemia and thrombocytopenia conditions on PFA-100 and in thrombocytopenia conditions on MEA.
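
    A regression-based 95% reference interval of the kind described can be sketched as follows. The data here are simulated, not the study’s, and the simple ±1.96·SD band around the fitted line ignores the parameter-uncertainty terms of a full prediction interval:

```python
import numpy as np

# Simulated (hypothetical) data: platelet count (10^9/L) vs PFA-100
# closure time (s), with the inverse correlation the record describes
rng = np.random.default_rng(1)
platelets = rng.uniform(10.0, 150.0, 80)
closure = 240.0 - 1.1 * platelets + rng.normal(0.0, 12.0, 80)

# Ordinary least-squares fit of closure time on platelet count
slope, intercept = np.polyfit(platelets, closure, 1)
resid_sd = (closure - (slope * platelets + intercept)).std(ddof=2)

def reference_interval(count):
    """Approximate 95% reference interval at a given platelet count."""
    centre = slope * count + intercept
    return centre - 1.96 * resid_sd, centre + 1.96 * resid_sd

low, high = reference_interval(50.0)
```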

  8. Automatic detection of multiple UXO-like targets using magnetic anomaly inversion and self-adaptive fuzzy c-means clustering

    NASA Astrophysics Data System (ADS)

    Yin, Gang; Zhang, Yingtang; Fan, Hongbo; Ren, Guoquan; Li, Zhining

    2017-12-01

    We have developed a method for automatically detecting UXO-like targets based on magnetic anomaly inversion and self-adaptive fuzzy c-means clustering. Magnetic anomaly inversion methods are used to estimate the initial locations of multiple UXO-like sources. Although these initial locations have some errors with respect to the real positions, they form dense clouds around the actual positions of the magnetic sources. We then use the self-adaptive fuzzy c-means clustering algorithm to cluster these initial locations. The estimated number of cluster centroids represents the number of targets, and the cluster centroids are regarded as the locations of the magnetic targets. The effectiveness of the method has been demonstrated using synthetic datasets. Computational results show that the proposed method can be applied to the case of several UXO-like targets randomly scattered within a confined, shallow subsurface volume. A field test was carried out to check the validity of the proposed method, and the experimental results show that the prearranged magnets can be detected unambiguously and located precisely.
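
    The clustering step can be illustrated with a minimal fuzzy c-means implementation. This is a plain-vanilla sketch: the paper’s self-adaptive variant also estimates the number of clusters, which is fixed here, and the deterministic seeding is an illustrative choice:

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100):
    """Minimal fuzzy c-means with deterministic farthest-point seeding.
    Returns (centroids, membership matrix U)."""
    idx = [0]
    for _ in range(c - 1):  # greedy farthest-point choice of initial centroids
        d = np.linalg.norm(X[:, None] - X[idx][None], axis=2).min(axis=1)
        idx.append(int(np.argmax(d)))
    centroids = X[idx].astype(float)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)
        U = d ** (-2.0 / (m - 1.0))   # closer centroid -> higher membership
        U /= U.sum(axis=1, keepdims=True)
        W = U ** m                    # fuzzified weights
        centroids = (W.T @ X) / W.sum(axis=0)[:, None]
    return centroids, U

# Two noisy clouds of location estimates, standing in for the scatter
# produced by magnetic anomaly inversion around two buried sources
rng = np.random.default_rng(7)
cloud_a = rng.normal([0.0, 0.0], 0.3, (50, 2))
cloud_b = rng.normal([10.0, 10.0], 0.3, (50, 2))
centroids, U = fuzzy_c_means(np.vstack([cloud_a, cloud_b]), c=2)
```

    The recovered centroids sit near the true source positions, which is the role the cluster centroids play in the detection pipeline above.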

  9. The reliability and validity of fatigue measures during multiple-sprint work: an issue revisited.

    PubMed

    Glaister, Mark; Howatson, Glyn; Pattison, John R; McInnes, Gill

    2008-09-01

    The ability to repeatedly produce a high-power output or sprint speed is a key fitness component of most field and court sports. The aim of this study was to evaluate the validity and reliability of eight different approaches to quantify this parameter in tests of multiple-sprint performance. Ten physically active men completed two trials of each of two multiple-sprint running protocols with contrasting recovery periods. Protocol 1 consisted of 12 x 30-m sprints repeated every 35 seconds; protocol 2 consisted of 12 x 30-m sprints repeated every 65 seconds. All testing was performed in an indoor sports facility, and sprint times were recorded using twin-beam photocells. All but one of the formulae showed good construct validity, as evidenced by similar within-protocol fatigue scores. However, the assumptions on which many of the formulae were based, combined with poor or inconsistent test-retest reliability (coefficient of variation range: 0.8-145.7%; intraclass correlation coefficient range: 0.09-0.75), suggested many problems regarding logical validity. In line with previous research, the results support the percentage decrement calculation as the most valid and reliable method of quantifying fatigue in tests of multiple-sprint performance.
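
    The percentage decrement score endorsed by the study is simple to compute. A sketch with hypothetical 30-m sprint times (the common formulation: ideal time is the best sprint repeated over every effort):

```python
def percentage_decrement(sprint_times):
    """Fatigue (%) = 100 * (total time / ideal time) - 100,
    where ideal time = best (fastest) sprint time * number of sprints."""
    total = sum(sprint_times)
    ideal = min(sprint_times) * len(sprint_times)
    return 100.0 * total / ideal - 100.0

# Hypothetical 30-m sprint times (s) over a 12-sprint protocol
times = [4.30, 4.32, 4.35, 4.38, 4.42, 4.47,
         4.51, 4.55, 4.58, 4.60, 4.63, 4.65]
score = percentage_decrement(times)  # roughly a 4% decrement
```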

  10. The PneuCarriage Project: A Multi-Centre Comparative Study to Identify the Best Serotyping Methods for Examining Pneumococcal Carriage in Vaccine Evaluation Studies

    PubMed Central

    Satzke, Catherine; Dunne, Eileen M.; Porter, Barbara D.; Klugman, Keith P.; Mulholland, E. Kim

    2015-01-01

    Background The pneumococcus is a diverse pathogen whose primary niche is the nasopharynx. Over 90 different serotypes exist, and nasopharyngeal carriage of multiple serotypes is common. Understanding pneumococcal carriage is essential for evaluating the impact of pneumococcal vaccines. Traditional serotyping methods are cumbersome and insufficient for detecting multiple serotype carriage, and there are few data comparing the new methods that have been developed over the past decade. We established the PneuCarriage project, a large, international multi-centre study dedicated to the identification of the best pneumococcal serotyping methods for carriage studies. Methods and Findings Reference sample sets were distributed to 15 research groups for blinded testing. Twenty pneumococcal serotyping methods were used to test 81 laboratory-prepared (spiked) samples. The five top-performing methods were used to test 260 nasopharyngeal (field) samples collected from children in six high-burden countries. Sensitivity and positive predictive value (PPV) were determined for the test methods and the reference method (traditional serotyping of >100 colonies from each sample). For the alternate serotyping methods, the overall sensitivity ranged from 1% to 99% (reference method 98%), and PPV from 8% to 100% (reference method 100%), when testing the spiked samples. Fifteen methods had ≥70% sensitivity to detect the dominant (major) serotype, whilst only eight methods had ≥70% sensitivity to detect minor serotypes. For the field samples, the overall sensitivity ranged from 74.2% to 95.8% (reference method 93.8%), and PPV from 82.2% to 96.4% (reference method 99.6%). The microarray had the highest sensitivity (95.8%) and high PPV (93.7%). The major limitation of this study is that not all of the available alternative serotyping methods were included. 
Conclusions Most methods were able to detect the dominant serotype in a sample, but many performed poorly in detecting the minor serotype populations. Microarray with a culture amplification step was the top-performing method. Results from this comprehensive evaluation will inform future vaccine evaluation and impact studies, particularly in low-income settings, where pneumococcal disease burden remains high. PMID:26575033
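
    The sensitivity and PPV figures above aggregate per-sample serotype calls against the reference method. A sketch of that bookkeeping with hypothetical calls (the second sample carries two serotypes and the minor one is missed):

```python
def evaluate(method_calls, reference_calls):
    """Overall sensitivity and PPV from per-sample sets of serotype calls."""
    tp = fp = fn = 0
    for called, truth in zip(method_calls, reference_calls):
        tp += len(called & truth)  # serotypes correctly detected
        fp += len(called - truth)  # called but absent
        fn += len(truth - called)  # present but missed
    return tp / (tp + fn), tp / (tp + fp)

reference = [{"19F"}, {"6B", "23F"}, {"14"}]
method    = [{"19F"}, {"6B"},        {"14", "23F"}]
sens, ppv = evaluate(method, reference)  # 0.75, 0.75
```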

  11. Identification of differentially expressed genes and false discovery rate in microarray studies.

    PubMed

    Gusnanto, Arief; Calza, Stefano; Pawitan, Yudi

    2007-04-01

    To highlight the development in microarray data analysis for the identification of differentially expressed genes, particularly via control of false discovery rate. The emergence of high-throughput technology such as microarrays raises two fundamental statistical issues: multiplicity and sensitivity. We focus on the biological problem of identifying differentially expressed genes. First, multiplicity arises due to testing tens of thousands of hypotheses, rendering the standard P value meaningless. Second, known optimal single-test procedures such as the t-test perform poorly in the context of highly multiple tests. The standard approach of dealing with multiplicity is too conservative in the microarray context. The false discovery rate concept is fast becoming the key statistical assessment tool replacing the P value. We review the false discovery rate approach and argue that it is more sensible for microarray data. We also discuss some methods to take into account additional information from the microarrays to improve the false discovery rate. There is growing consensus on how to analyse microarray data using the false discovery rate framework in place of the classical P value. Further research is needed on the preprocessing of the raw data, such as the normalization step and filtering, and on finding the most sensitive test procedure.
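
    The false discovery rate control discussed here is most often implemented via the Benjamini-Hochberg step-up procedure; a minimal sketch:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up: reject the k smallest p-values, where k is
    the largest i with p_(i) <= alpha * i / m. Returns a boolean reject mask."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    below = p[order] <= alpha * np.arange(1, m + 1) / m
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = int(np.nonzero(below)[0].max())  # largest rank passing the bound
        reject[order[:k + 1]] = True
    return reject

pvals = [0.01, 0.02, 0.03, 0.50]
mask = benjamini_hochberg(pvals)  # first three rejected at FDR level 0.05
```

    Note the step-up behaviour: 0.03 exceeds its own pointwise bound under Bonferroni (0.05/4) but is still rejected because a lower-ranked p-value clears its threshold.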

  12. Simultaneous Identification and Antimicrobial Susceptibility Testing of Multiple Uropathogens on a Microfluidic Chip with Paper-Supported Cell Culture Arrays.

    PubMed

    Xu, Banglao; Du, Yan; Lin, Jinqiong; Qi, Mingyue; Shu, Bowen; Wen, Xiaoxia; Liang, Guangtie; Chen, Bin; Liu, Dayu

    2016-12-06

    A microfluidic chip was developed for one-step identification and antimicrobial susceptibility testing (AST) of multiple uropathogens. The polydimethylsiloxane (PDMS) microchip used had features of cell culture chamber arrays connected through a sample introduction channel. At the bottom of each chamber, a paper substrate preloaded with chromogenic media and antimicrobial agents was embedded. By integrating a hydrophobic membrane valve on the microchip, the urine sample can be equally distributed into and confined in individual chambers. The identification and AST assays on multiple uropathogens were performed by combining the spatial resolution of the cell culture arrays and the color resolution from the chromogenic reaction. The composite microbial testing assay was based on dynamic changes in color in a series of chambers. The bacterial antimicrobial susceptibility was determined by the lowest concentration of an antimicrobial agent capable of inhibiting the chromogenic reaction. Using three common uropathogenic bacteria as test models, the developed microfluidic approach was demonstrated to be able to complete the multiple colorimetric assays in 15 h. The accuracy of the microchip method, in comparison with that of the conventional approach, showed a coincidence of 94.1%. Our data suggest this microfluidic approach will be a promising tool for simple and fast uropathogen testing in resource-limited settings.

  13. A SAS macro for testing differences among three or more independent groups using Kruskal-Wallis and Nemenyi tests.

    PubMed

    Liu, Yuewei; Chen, Weihong

    2012-02-01

    As a nonparametric method, the Kruskal-Wallis test is widely used to compare three or more independent groups when an ordinal or interval level of data is available, especially when the assumptions of analysis of variance (ANOVA) are not met. If the Kruskal-Wallis statistic is statistically significant, the Nemenyi test is an alternative method for further pairwise multiple comparisons to locate the source of significance. Unfortunately, most popular statistical packages do not integrate the Nemenyi test, which is not easy to calculate by hand. We described the theory and applications of the Kruskal-Wallis and Nemenyi tests, and presented a flexible SAS macro to implement the two tests. The SAS macro was demonstrated by two examples from our cohort study in occupational epidemiology. It provides a useful tool for SAS users to test the differences among three or more independent groups using a nonparametric method.
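
    The H statistic behind the Kruskal-Wallis test is straightforward to compute when there are no ties; a sketch with hypothetical data (the Nemenyi follow-up, which needs studentized-range quantiles, is omitted here):

```python
import numpy as np

def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction); compare against a
    chi-square distribution with k-1 degrees of freedom."""
    data = np.concatenate([np.asarray(g, dtype=float) for g in groups])
    ranks = data.argsort().argsort() + 1.0  # ranks 1..N (assumes no ties)
    n = len(data)
    h, start = 0.0, 0
    for g in groups:
        r = ranks[start:start + len(g)]
        h += len(g) * (r.mean() - (n + 1) / 2.0) ** 2
        start += len(g)
    return 12.0 / (n * (n + 1)) * h

# Three hypothetical exposure groups with non-overlapping values
h = kruskal_wallis_h([2.3, 2.9, 3.1, 3.6],
                     [3.8, 4.1, 4.4, 4.9],
                     [5.0, 5.6, 5.9, 6.3])
```

    For these data H ≈ 9.85 against χ² with 2 degrees of freedom (p ≈ 0.007), so pairwise Nemenyi comparisons would be warranted to locate the source of significance.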

  14. Stable tearing behavior of a thin-sheet material with multiple cracks

    NASA Technical Reports Server (NTRS)

    Dawicke, D. S.; Newman, J. C., Jr.; Sutton, M. A.; Amstutz, B. E.

    1994-01-01

    Fracture tests were conducted on 2.3mm thick, 305mm wide sheets of 2024-T3 aluminum alloy with 1-5 collinear cracks. The cracks were introduced (crack history) into the specimens by three methods: (1) saw cutting; (2) fatigue precracking at a low stress range; and (3) fatigue precracking at a high stress range. For the single crack tests, the initial crack history influenced the stress required for the onset of stable crack growth and the first 10mm of crack growth. The effect on failure stress was about 4 percent or less. For the multiple crack tests, the initial crack history was shown to cause differences of more than 20 percent in the link-up stress and 13 percent in failure stress. An elastic-plastic finite element analysis employing the Crack Tip Opening Angle (CTOA) fracture criterion was used to predict the fracture behavior of the single and multiple crack tests. The numerical predictions were within 7 percent of the observed link-up and failure stress in all the tests.

  15. Influence of crack history on the stable tearing behavior of a thin-sheet material with multiple cracks

    NASA Technical Reports Server (NTRS)

    Dawicke, D. S.; Newman, J. C., Jr.; Sutton, M. A.; Amstutz, B. E.

    1994-01-01

    Fracture tests were conducted on 2.3mm thick, 305mm wide sheets of 2024-T3 aluminum alloy with from one to five collinear cracks. The cracks were introduced (crack history) into the specimens by three methods: saw cutting, fatigue precracking at a low stress range, and fatigue precracking at a high stress range. For the single crack tests, the initial crack history influenced the stress required for the onset of stable crack growth and the first 10mm of crack growth. The effect on failure stress was about 4 percent or less. For the multiple crack tests, the initial crack history was shown to cause differences of more than 20 percent in the link-up stress and 13 percent in failure stress. An elastic-plastic finite element analysis employing the CTOA fracture criterion was used to predict the fracture behavior of the single and multiple crack tests. The numerical predictions were within 7 percent of the observed link-up and failure stress in all the tests.

  16. Nurse-led immunotreatment DEcision Coaching In people with Multiple Sclerosis (DECIMS) - Feasibility testing, pilot randomised controlled trial and mixed methods process evaluation.

    PubMed

    Rahn, A C; Köpke, S; Backhus, I; Kasper, J; Anger, K; Untiedt, B; Alegiani, A; Kleiter, I; Mühlhauser, I; Heesen, C

    2018-02-01

    Treatment decision-making is complex for people with multiple sclerosis, and providing thorough information on the available options is virtually impossible within regular neurologist encounters. The "nurse decision coach model" was developed to redistribute health professionals' tasks in supporting immunotreatment decision-making, following the principles of informed shared decision-making. The objective was to test the feasibility of a decision coaching programme and of recruitment strategies to inform the main trial. The design was feasibility testing and a parallel pilot randomised controlled trial, accompanied by a mixed methods process evaluation, at two German multiple sclerosis university centres. People with suspected or relapsing-remitting multiple sclerosis facing immunotreatment decisions on first-line drugs were recruited. Randomisation to the intervention (n = 38) or control group (n = 35) was performed on a daily basis. Quantitative and qualitative process data were collected from people with multiple sclerosis, nurses and physicians. We report on the development and piloting of the decision coaching programme, which comprises a training course for multiple sclerosis nurses and the coaching intervention itself. The intervention consists of up to three structured nurse-led decision coaching sessions, access to an evidence-based online information platform (DECIMS-Wiki) and a final physician consultation. After feasibility testing, a pilot randomised controlled trial was performed in which people with multiple sclerosis were randomised to the intervention or control group. The latter also had access to the DECIMS-Wiki but otherwise received care as usual. Nurses were not blinded to group assignment, while people with multiple sclerosis and physicians were. The primary outcome was 'informed choice' after six months, including the sub-dimensions risk knowledge (after 14 days), attitude concerning immunotreatment (after the physician consultation), and treatment uptake (after six months). 
Quantitative process evaluation data were collected via questionnaires. Qualitative interviews were performed with all nurses and a convenience sample of nine people with multiple sclerosis. 116 people with multiple sclerosis fulfilled the inclusion criteria and 73 (63%) were included. Groups were comparable at baseline. Data from 51 people with multiple sclerosis (70%) were available for the primary endpoint. In the intervention group, 15 of 31 (48%) people with multiple sclerosis achieved an informed choice after six months, compared with 6 of 20 (30%) in the control group. Process evaluation data illustrated a positive response towards the coaching programme as well as good acceptance. The pilot phase showed promising results concerning the acceptability and feasibility of the intervention, which was well perceived by people with multiple sclerosis and by most nurses and physicians. Delegating parts of the immunotreatment decision-making process to trained nurses has the potential to increase informed choice and participation as well as the effectiveness of patient-physician consultations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Identifying Variations in Hydraulic Conductivity on the East River at Crested Butte, CO

    NASA Astrophysics Data System (ADS)

    Ulmer, K. N.; Malenda, H. F.; Singha, K.

    2016-12-01

    Slug tests are a widely used method to measure saturated hydraulic conductivity, or how easily water flows through an aquifer, by perturbing the piezometric surface and measuring the time the local groundwater table takes to re-equilibrate. Saturated hydraulic conductivity is crucial to calculating the speed and direction of groundwater movement. Therefore, it is important to document data variance from in situ slug tests. This study addresses two potential sources of data variability: different users and different types of slug used. To test for user variability, two individuals slugged the same six wells with water multiple times at a stream meander on the East River near Crested Butte, CO. To test for variations in type of slug test, multiple water and metal slug tests were performed at a single well in the same meander. The distributions of hydraulic conductivities of each test were then tested for variance using both the Kruskal-Wallis test and the Brown-Forsythe test. When comparing the hydraulic conductivity distributions gathered by the two individuals, we found that they were statistically similar. However, we found that the two types of slug tests produced hydraulic conductivity distributions for the same well that are statistically dissimilar. In conclusion, multiple people should be able to conduct slug tests without creating any considerable variations in the resulting hydraulic conductivity values, but only a single type of slug should be used for those tests.
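
    The Brown-Forsythe test used here to compare conductivity distributions is Levene's test computed on absolute deviations from the group medians; a sketch with hypothetical conductivity values (not the study's data):

```python
from scipy import stats

# Hypothetical hydraulic conductivity estimates (m/day) from repeated slug
# tests at one well: water slugs cluster tightly, metal slugs scatter widely
water_slug = [1.10, 1.30, 1.20, 1.40, 1.20, 1.30]
metal_slug = [0.60, 1.90, 0.90, 1.70, 0.50, 2.10]

# Brown-Forsythe = Levene's test with median centring
stat, p = stats.levene(water_slug, metal_slug, center='median')
```

    A small p-value here would mirror the study's finding that the two slug types give statistically dissimilar distributions for the same well.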

  18. The PneuCarriage Project: A Multi-Centre Comparative Study to Identify the Best Serotyping Methods for Examining Pneumococcal Carriage in Vaccine Evaluation Studies.

    PubMed

    Satzke, Catherine; Dunne, Eileen M; Porter, Barbara D; Klugman, Keith P; Mulholland, E Kim

    2015-11-01

    The pneumococcus is a diverse pathogen whose primary niche is the nasopharynx. Over 90 different serotypes exist, and nasopharyngeal carriage of multiple serotypes is common. Understanding pneumococcal carriage is essential for evaluating the impact of pneumococcal vaccines. Traditional serotyping methods are cumbersome and insufficient for detecting multiple serotype carriage, and there are few data comparing the new methods that have been developed over the past decade. We established the PneuCarriage project, a large, international multi-centre study dedicated to the identification of the best pneumococcal serotyping methods for carriage studies. Reference sample sets were distributed to 15 research groups for blinded testing. Twenty pneumococcal serotyping methods were used to test 81 laboratory-prepared (spiked) samples. The five top-performing methods were used to test 260 nasopharyngeal (field) samples collected from children in six high-burden countries. Sensitivity and positive predictive value (PPV) were determined for the test methods and the reference method (traditional serotyping of >100 colonies from each sample). For the alternate serotyping methods, the overall sensitivity ranged from 1% to 99% (reference method 98%), and PPV from 8% to 100% (reference method 100%), when testing the spiked samples. Fifteen methods had ≥70% sensitivity to detect the dominant (major) serotype, whilst only eight methods had ≥70% sensitivity to detect minor serotypes. For the field samples, the overall sensitivity ranged from 74.2% to 95.8% (reference method 93.8%), and PPV from 82.2% to 96.4% (reference method 99.6%). The microarray had the highest sensitivity (95.8%) and high PPV (93.7%). The major limitation of this study is that not all of the available alternative serotyping methods were included. Most methods were able to detect the dominant serotype in a sample, but many performed poorly in detecting the minor serotype populations. 
Microarray with a culture amplification step was the top-performing method. Results from this comprehensive evaluation will inform future vaccine evaluation and impact studies, particularly in low-income settings, where pneumococcal disease burden remains high.

  19. Effects on Cognition of Stereotactic Lesional Surgery For the Treatment of Tremor in Multiple Sclerosis

    PubMed Central

    Jahanshahi, Marjan; Pieter, Socorro; Alusi, Sundus H.; Jones, Catherine R. G.; Glickman, Scott; Stein, John; Aziz, Tipu; Bain, Peter G.

    2008-01-01

    Objective: To assess the effect of stereotactic lesional surgery for treatment of tremor in multiple sclerosis on cognition. Methods: Eleven patients (3 males, 8 females) with multiple sclerosis participated in the study. Six subjects comprised the surgical group and five the matched control group. All patients were assessed at baseline and three months using a neuropsychological test battery that included measures of intellectual ability, memory, language, perception and executive function. Results: There were no significant differences between the surgical and control groups and no change from pre to post testing except for a decline in scores on the Mini-Mental State Examination (MMSE), WAIS-R Digit Span and Verbal Fluency in the surgical group. Conclusions: The results indicate that stereotactic lesional surgery does not result in major cognitive impairment in multiple sclerosis. However, the decline in MMSE scores, digit span and verbal fluency require further investigation in a larger sample. PMID:19491469

  20. Vertical decomposition with Genetic Algorithm for Multiple Sequence Alignment

    PubMed Central

    2011-01-01

    Background Many Bioinformatics studies begin with a multiple sequence alignment as the foundation for their research. This is because multiple sequence alignment can be a useful technique for studying molecular evolution and analyzing sequence structure relationships. Results In this paper, we have proposed a Vertical Decomposition with Genetic Algorithm (VDGA) for Multiple Sequence Alignment (MSA). In VDGA, we divide the sequences vertically into two or more subsequences, and then solve them individually using a guide tree approach. Finally, we combine all the subsequences to generate a new multiple sequence alignment. This technique is applied on the solutions of the initial generation and of each child generation within VDGA. We have used two mechanisms to generate an initial population in this research: the first mechanism is to generate guide trees with randomly selected sequences and the second is shuffling the sequences inside such trees. Two different genetic operators have been implemented with VDGA. To test the performance of our algorithm, we have compared it with existing well-known methods, namely PRRP, CLUSTALX, DIALIGN, HMMT, SB_PIMA, ML_PIMA, MULTALIGN, and PILEUP8, and also other methods, based on Genetic Algorithms (GA), such as SAGA, MSA-GA and RBT-GA, by solving a number of benchmark datasets from BAliBase 2.0. Conclusions The experimental results showed that the VDGA with three vertical divisions was the most successful variant for most of the test cases in comparison to other divisions considered with VDGA. The experimental results also confirmed that VDGA outperformed the other methods considered in this research. PMID:21867510

  1. Graphical method for comparative statistical study of vaccine potency tests.

    PubMed

    Pay, T W; Hingley, P J

    1984-03-01

    Producers and consumers are interested in some of the intrinsic characteristics of vaccine potency assays for the comparative evaluation of suitable experimental design. A graphical method is developed which represents the precision of test results, the sensitivity of such results to changes in dosage, and the relevance of the results in the way they reflect the protection afforded in the host species. The graphs can be constructed from Producer's scores and Consumer's scores on each of the scales of test score, antigen dose and probability of protection against disease. A method for calculating these scores is suggested and illustrated for single and multiple component vaccines, for tests which do or do not employ a standard reference preparation, and for tests which employ quantitative or quantal systems of scoring.

  2. Exponential integrators in time-dependent density-functional calculations

    NASA Astrophysics Data System (ADS)

    Kidd, Daniel; Covington, Cody; Varga, Kálmán

    2017-12-01

    The integrating factor and exponential time differencing methods are implemented and tested for solving the time-dependent Kohn-Sham equations. Popular time propagation methods used in physics, as well as other robust numerical approaches, are compared to these exponential integrator methods in order to judge the relative merit of the computational schemes. We determine an improvement in accuracy of multiple orders of magnitude when describing dynamics driven primarily by a nonlinear potential. For cases of dynamics driven by a time-dependent external potential, the accuracy of the exponential integrator methods is less enhanced but still matches or outperforms the best of the conventional methods tested.
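
    As a toy illustration of why exponential integrators appeal for Schrödinger-type equations (a generic two-level linear system, not the time-dependent Kohn-Sham equations themselves): the one-step propagator exp(-iH dt) is exactly unitary, whereas explicit Euler is not.

```python
import numpy as np

# Hermitian "Hamiltonian" for a two-level toy system
H = np.array([[1.0, 0.5], [0.5, -1.0]])
dt, steps = 0.01, 500
psi_exp = np.array([1.0, 0.0], dtype=complex)
psi_eul = psi_exp.copy()

# Exponential integrator: build exp(-i H dt) from the eigendecomposition of H
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * w * dt)) @ V.conj().T

for _ in range(steps):
    psi_exp = U @ psi_exp                        # exactly unitary step
    psi_eul = psi_eul - 1j * dt * (H @ psi_eul)  # explicit Euler step

norm_exp = abs(np.vdot(psi_exp, psi_exp))  # conserved to machine precision
norm_eul = abs(np.vdot(psi_eul, psi_eul))  # drifts above 1
```

    For a linear, time-independent problem the exponential step is exact for any dt; the practical schemes in the record extend this idea to nonlinear and time-dependent potentials.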

  3. A multiwave range test for obstacle reconstructions with unknown physical properties

    NASA Astrophysics Data System (ADS)

    Potthast, Roland; Schulz, Jochen

    2007-08-01

    We develop a new multiwave version of the range test for shape reconstruction in inverse scattering theory. The range test [R. Potthast, et al., A `range test' for determining scatterers with unknown physical properties, Inverse Problems 19(3) (2003) 533-547] was originally proposed to obtain knowledge about an unknown scatterer when the far field pattern for only one plane wave is given. Here, we extend the method to the case of multiple waves and show that the full shape of the unknown scatterer can be reconstructed. We will further clarify the relation between the range test methods, the potential method [A. Kirsch, R. Kress, On an integral equation of the first kind in inverse acoustic scattering, in: Inverse Problems (Oberwolfach, 1986), Internationale Schriftenreihe zur Numerischen Mathematik, vol. 77, Birkhauser, Basel, 1986, pp. 93-102] and the singular sources method [R. Potthast, Point sources and multipoles in inverse scattering theory, Habilitation Thesis, Gottingen, 1999]. In particular, we propose a new version of the Kirsch-Kress method using the range test and a new approach to the singular sources method based on the range test and potential method. Numerical examples of reconstructions for all four methods are provided.

  4. [Effect of preventive treatment on cognitive performance in patients with multiple sclerosis].

    PubMed

    Shorobura, Maria S

    2018-01-01

    Introduction: Cognitive, emotional and psychopathological changes play a significant role in the clinical picture of multiple sclerosis and influence the effectiveness of drug therapy, working capacity, quality of life, and the rehabilitation of patients with multiple sclerosis. The aim: to investigate changes in cognitive function in patients with multiple sclerosis, namely information processing speed and working memory, before and after treatment with an immunomodulating drug. Materials and methods: 33 patients with a reliably established diagnosis of multiple sclerosis who underwent preventive examinations and treatment from 2012 to 2016 were examined. All patients underwent a clinical-neurological examination (neurological status using the EDSS scale), and cognitive status was evaluated using the PASAT auditory test. Patient screening was performed before, during and after therapy. Statistical analysis of the results was performed in Statistica 8.0, using Student's t-test (t), the Mann-Whitney test (Z), Pearson and Spearman correlation coefficients (r, R), the Wilcoxon criterion (T), and the chi-square test (X²). Results: Older age in patients with multiple sclerosis was associated with higher EDSS scores and lower PASAT scores before treatment. Disease duration affected the EDSS score and PASAT performance. PASAT scores did not decrease significantly over the course of treatment. Conclusions: Glatiramer acetate has a positive effect on cognitive function, information processing speed and working memory in patients with multiple sclerosis, which is one of the important components of the therapeutic effect of this drug.

  5. Detecting epistasis with the marginal epistasis test in genetic mapping studies of quantitative traits

    PubMed Central

    Zeng, Ping; Mukherjee, Sayan; Zhou, Xiang

    2017-01-01

    Epistasis, commonly defined as the interaction between multiple genes, is an important genetic component underlying phenotypic variation. Many statistical methods have been developed to model and identify epistatic interactions between genetic variants. However, because of the large combinatorial search space of interactions, most epistasis mapping methods face enormous computational challenges and often suffer from low statistical power due to multiple test correction. Here, we present a novel, alternative strategy for mapping epistasis: instead of directly identifying individual pairwise or higher-order interactions, we focus on mapping variants that have non-zero marginal epistatic effects—the combined pairwise interaction effects between a given variant and all other variants. By testing marginal epistatic effects, we can identify candidate variants that are involved in epistasis without the need to identify the exact partners with which the variants interact, thus potentially alleviating much of the statistical and computational burden associated with standard epistatic mapping procedures. Our method is based on a variance component model, and relies on a recently developed variance component estimation method for efficient parameter inference and p-value computation. We refer to our method as the “MArginal ePIstasis Test”, or MAPIT. With simulations, we show how MAPIT can be used to estimate and test marginal epistatic effects, produce calibrated test statistics under the null, and facilitate the detection of pairwise epistatic interactions. We further illustrate the benefits of MAPIT in a QTL mapping study by analyzing the gene expression data of over 400 individuals from the GEUVADIS consortium. PMID:28746338
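
    MAPIT itself relies on a variance component model with a dedicated estimation algorithm, but the idea of a marginal epistatic effect can be illustrated more simply. The sketch below (function name hypothetical) uses an ordinary least-squares joint F-test of every pairwise interaction involving one focal variant against an additive-only model — a crude stand-in for MAPIT's variance component test, not the authors' method:

```python
import numpy as np

def marginal_epistasis_ftest(y, X, j):
    """Joint F-test that all pairwise interactions between variant j and
    every other variant are zero: an OLS illustration of the 'marginal
    epistatic effect' idea (MAPIT itself uses a variance component model)."""
    n, p = X.shape
    inter = X * X[:, [j]]                 # columns x_j * x_k for every k
    inter = np.delete(inter, j, axis=1)   # drop the x_j^2 self-term
    ones = np.ones((n, 1))
    X_red = np.hstack([ones, X])          # additive effects only
    X_full = np.hstack([X_red, inter])    # + all interactions with j

    def rss(M):
        beta, *_ = np.linalg.lstsq(M, y, rcond=None)
        r = y - M @ beta
        return r @ r

    rss_red, rss_full = rss(X_red), rss(X_full)
    q = inter.shape[1]                    # number of interactions tested
    df2 = n - X_full.shape[1]
    F = ((rss_red - rss_full) / q) / (rss_full / df2)
    return F, q, df2
```

A variant with a true interaction partner yields a large F without ever naming the partner, which is the point of the marginal formulation.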

  6. The Selective Arterial Calcium Injection Test is a Valid Diagnostic Method for Invisible Gastrinoma with Duodenal Ulcer Stenosis: A Case Report.

    PubMed

    Okada, Kenjiro; Sudo, Takeshi; Miyamoto, Katsunari; Yokoyama, Yujiro; Sakashita, Yoshihiro; Hashimoto, Yasushi; Kobayashi, Hironori; Otsuka, Hiroyuki; Sakoda, Takuya; Shimamoto, Fumio

    2016-03-01

    The localization and diagnosis of microgastrinomas in a patient with multiple endocrine neoplasia type 1 is difficult preoperatively. The selective arterial calcium injection (SACI) test is a valid diagnostic method for the preoperative diagnosis of these invisible microgastrinomas. We report a rare case of multiple invisible duodenal microgastrinomas with severe duodenal stenosis diagnosed preoperatively by using the SACI test. A 50-year-old man was admitted to our hospital with recurrent duodenal ulcers. His serum gastrin level was elevated to 730 pg/ml. Gastrointestinal endoscopy could not pass through to visualize the inferior part of the duodenum, because recurrent duodenal ulcers had resulted in severe duodenal stenosis. The duodenal stenosis also prevented additional endoscopic examinations such as endoscopic ultrasonography. Computed tomography did not show any tumors in the duodenum or pancreas. The SACI test provided evidence for a gastrinoma in the vascular territory of the inferior pancreaticoduodenal artery. We diagnosed a gastrinoma in the periampullary region, so we performed subtotal stomach-preserving pancreaticoduodenectomy with regional lymphadenectomy. Histopathological findings showed multiple duodenal gastrinomas with lymph node metastasis and nonfunctioning pancreatic neuroendocrine tumors. Twenty months after surgery, the patient is alive with no evidence of recurrence and a normal gastrin level. In conclusion, the SACI test can enhance the accuracy of preoperative localization and diagnosis of invisible microgastrinomas, especially in the setting of severe duodenal stenosis.

  7. Efficient organ localization using multi-label convolutional neural networks in thorax-abdomen CT scans

    NASA Astrophysics Data System (ADS)

    Efrain Humpire-Mamani, Gabriel; Arindra Adiyoso Setio, Arnaud; van Ginneken, Bram; Jacobs, Colin

    2018-04-01

    Automatic localization of organs and other structures in medical images is an important preprocessing step that can improve and speed up other algorithms such as organ segmentation, lesion detection, and registration. This work presents an efficient method for simultaneous localization of multiple structures in 3D thorax-abdomen CT scans. Our approach predicts the location of multiple structures using a single multi-label convolutional neural network for each orthogonal view. Each network takes extra slices around the current slice as input to provide extra context. A sigmoid layer is used to perform multi-label classification. The output of the three networks is subsequently combined to compute a 3D bounding box for each structure. We used our approach to locate 11 structures of interest. The neural network was trained and evaluated on a large set of 1884 thorax-abdomen CT scans from patients undergoing oncological workup. Reference bounding boxes were annotated by human observers. The performance of our method was evaluated by computing the wall distance to the reference bounding boxes. The bounding boxes annotated by the first human observer were used as the reference standard for the test set. Using the best configuration, we obtained an average wall distance of 3.20 ± 7.33 mm in the test set. The second human observer achieved 1.23 ± 3.39 mm. For all structures, the results were better than those reported in previously published studies. In conclusion, we proposed an efficient method for the accurate localization of multiple organs. Our method uses multiple slices as input to provide more context around the slice under analysis, and we have shown that this improves performance. This method can easily be adapted to handle more organs.
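
    The combination step — turning per-view, per-slice network outputs into a 3D bounding box — can be sketched as follows. This assumes, as an illustration rather than the paper's exact procedure, that each view's multi-label network emits one sigmoid probability per slice for a given structure, so thresholding each view yields the structure's extent along that view's axis:

```python
import numpy as np

def box_from_views(p_axial, p_coronal, p_sagittal, thr=0.5):
    """Combine per-slice presence probabilities from three orthogonal
    views into one 3D bounding box (z, y, x slice ranges). Each argument
    is a 1D array of sigmoid outputs, one value per slice on that axis."""
    def interval(p):
        idx = np.flatnonzero(np.asarray(p) >= thr)
        if idx.size == 0:
            return None                      # structure not detected
        return int(idx[0]), int(idx[-1])     # first/last positive slice
    return {"z": interval(p_axial),
            "y": interval(p_coronal),
            "x": interval(p_sagittal)}
```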

  8. Dynamic test input generation for multiple-fault isolation

    NASA Technical Reports Server (NTRS)

    Schaefer, Phil

    1990-01-01

    Recent work in Causal Reasoning has provided practical techniques for multiple fault diagnosis. These techniques provide a hypothesis/measurement diagnosis cycle. Using probabilistic methods, they choose the best measurements to make, then update fault hypotheses in response. For many applications such as computers and spacecraft, few measurement points may be accessible, or values may change quickly as the system under diagnosis operates. In these cases, a hypothesis/measurement cycle is insufficient. A technique is presented for a hypothesis/test-input/measurement diagnosis cycle. In contrast to generating tests a priori for determining device functionality, it dynamically generates tests in response to current knowledge about fault probabilities. It is shown how the mathematics previously used for measurement specification can be applied to the test input generation process. An example from an efficient implementation called Multi-Purpose Causal (MPC) is presented.
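
    The measurement-selection mathematics alluded to above is typically an expected-information-gain calculation. A minimal sketch (illustrative only, not the MPC implementation; all names hypothetical) chooses the test input whose outcome distribution most reduces the entropy of the fault-hypothesis distribution:

```python
import math

def entropy(probs):
    """Shannon entropy in bits, skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def best_test_input(hypo_probs, outcome_model):
    """Pick the test input with the greatest expected reduction in the
    entropy of the fault-hypothesis distribution.
    hypo_probs maps hypothesis h -> P(h);
    outcome_model[u][o][h] is P(outcome o | hypothesis h, input u)."""
    h0 = entropy(hypo_probs.values())
    best, best_gain = None, -1.0
    for u, outcomes in outcome_model.items():
        expected = 0.0
        for o, lik in outcomes.items():
            # P(o | u) and the posterior over hypotheses via Bayes' rule
            p_o = sum(lik[h] * hypo_probs[h] for h in hypo_probs)
            if p_o == 0:
                continue
            post = [lik[h] * hypo_probs[h] / p_o for h in hypo_probs]
            expected += p_o * entropy(post)
        gain = h0 - expected
        if gain > best_gain:
            best, best_gain = u, gain
    return best, best_gain
```

A perfectly discriminating input gains one full bit over two equiprobable hypotheses; an uninformative input gains nothing, so it is never chosen.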

  9. Methods for meta-analysis of multiple traits using GWAS summary statistics.

    PubMed

    Ray, Debashree; Boehnke, Michael

    2018-03-01

    Genome-wide association studies (GWAS) for complex diseases have focused primarily on single-trait analyses for disease status and disease-related quantitative traits. For example, GWAS on risk factors for coronary artery disease analyze genetic associations of plasma lipids such as total cholesterol, LDL-cholesterol, HDL-cholesterol, and triglycerides (TGs) separately. However, traits are often correlated and a joint analysis may yield increased statistical power for association over multiple univariate analyses. Recently several multivariate methods have been proposed that require individual-level data. Here, we develop metaUSAT (where USAT is unified score-based association test), a novel unified association test of a single genetic variant with multiple traits that uses only summary statistics from existing GWAS. Although the existing methods either perform well when most correlated traits are affected by the genetic variant in the same direction or are powerful when only a few of the correlated traits are associated, metaUSAT is designed to be robust to the association structure of correlated traits. metaUSAT does not require individual-level data and can test genetic associations of categorical and/or continuous traits. One can also use metaUSAT to analyze a single trait over multiple studies, appropriately accounting for overlapping samples, if any. metaUSAT provides an approximate asymptotic P-value for association and is computationally efficient for implementation at a genome-wide level. Simulation experiments show that metaUSAT maintains proper type-I error at low error levels. It has similar and sometimes greater power to detect association across a wide array of scenarios compared to existing methods, which are usually powerful for some specific association scenarios only. 
When applied to plasma lipids summary data from the METSIM and the T2D-GENES studies, metaUSAT detected genome-wide significant loci beyond the ones identified by univariate analyses. Evidence from larger studies suggest that the variants additionally detected by our test are, indeed, associated with lipid levels in humans. In summary, metaUSAT can provide novel insights into the genetic architecture of a common disease or traits. © 2017 WILEY PERIODICALS, INC.
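
    metaUSAT's exact statistic is more elaborate, but the basic ingredient of summary-statistics multi-trait testing can be sketched as a Wald-type combination: given per-trait z-scores for one variant and the correlation matrix R of those z-scores (estimable from genome-wide null variants when only summary statistics are available), T = z' R⁻¹ z is chi-square with k degrees of freedom under the null. A minimal sketch, not the authors' method:

```python
import numpy as np

def multitrait_chi2(z, R):
    """Wald-type combination of per-trait z-scores for one variant.
    Returns the statistic T = z' R^{-1} z and its degrees of freedom;
    under the null T is chi-square with len(z) d.o.f."""
    z = np.asarray(z, dtype=float)
    T = float(z @ np.linalg.solve(R, z))
    return T, len(z)
```

Note how positive correlation between traits discounts concordant z-scores: two z = 2 signals give T = 8 when independent but only 16/3 at correlation 0.5.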

  10. Bayesian models based on test statistics for multiple hypothesis testing problems.

    PubMed

    Ji, Yuan; Lu, Yiling; Mills, Gordon B

    2008-04-01

    We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
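
    The Bayesian FDR control step has a simple generic form: once the modeled test statistics yield a posterior probability of the null for each hypothesis, reject the hypotheses with the smallest posterior null probabilities while the average posterior null probability of the rejected set (the Bayesian FDR) stays below the target level. A sketch of that standard formulation, not necessarily the authors' exact rule:

```python
import numpy as np

def bayesian_fdr_reject(post_null, alpha=0.05):
    """Reject the hypotheses with the smallest posterior null
    probabilities such that the mean posterior null probability of the
    rejected set stays at or below alpha. Returns a boolean mask."""
    post_null = np.asarray(post_null, dtype=float)
    order = np.argsort(post_null)
    running = np.cumsum(post_null[order]) / np.arange(1, len(post_null) + 1)
    k = int(np.sum(running <= alpha))      # largest admissible set size
    reject = np.zeros(len(post_null), dtype=bool)
    reject[order[:k]] = True
    return reject
```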

  11. Nanomechanical testing system

    DOEpatents

    Vodnick, David James; Dwivedi, Arpit; Keranen, Lucas Paul; Okerlund, Michael David; Schmitz, Roger William; Warren, Oden Lee; Young, Christopher David

    2014-07-08

    An automated testing system includes systems and methods to facilitate inline production testing of samples at a micro (multiple microns) or less scale with a mechanical testing instrument. In an example, the system includes a probe changing assembly for coupling and decoupling a probe of the instrument. The probe changing assembly includes a probe change unit configured to grasp one of a plurality of probes in a probe magazine and couple one of the probes with an instrument probe receptacle. An actuator is coupled with the probe change unit, and the actuator is configured to move and align the probe change unit with the probe magazine and the instrument probe receptacle. In another example, the automated testing system includes a multiple degree of freedom stage for aligning a sample testing location with the instrument. The stage includes a sample stage and a stage actuator assembly including translational and rotational actuators.

  12. Nanomechanical testing system

    DOEpatents

    Vodnick, David James; Dwivedi, Arpit; Keranen, Lucas Paul; Okerlund, Michael David; Schmitz, Roger William; Warren, Oden Lee; Young, Christopher David

    2015-01-27

    An automated testing system includes systems and methods to facilitate inline production testing of samples at a micro (multiple microns) or less scale with a mechanical testing instrument. In an example, the system includes a probe changing assembly for coupling and decoupling a probe of the instrument. The probe changing assembly includes a probe change unit configured to grasp one of a plurality of probes in a probe magazine and couple one of the probes with an instrument probe receptacle. An actuator is coupled with the probe change unit, and the actuator is configured to move and align the probe change unit with the probe magazine and the instrument probe receptacle. In another example, the automated testing system includes a multiple degree of freedom stage for aligning a sample testing location with the instrument. The stage includes a sample stage and a stage actuator assembly including translational and rotational actuators.

  13. Nanomechanical testing system

    DOEpatents

    Vodnick, David James; Dwivedi, Arpit; Keranen, Lucas Paul; Okerlund, Michael David; Schmitz, Roger William; Warren, Oden Lee; Young, Christopher David

    2015-02-24

    An automated testing system includes systems and methods to facilitate inline production testing of samples at a micro (multiple microns) or less scale with a mechanical testing instrument. In an example, the system includes a probe changing assembly for coupling and decoupling a probe of the instrument. The probe changing assembly includes a probe change unit configured to grasp one of a plurality of probes in a probe magazine and couple one of the probes with an instrument probe receptacle. An actuator is coupled with the probe change unit, and the actuator is configured to move and align the probe change unit with the probe magazine and the instrument probe receptacle. In another example, the automated testing system includes a multiple degree of freedom stage for aligning a sample testing location with the instrument. The stage includes a sample stage and a stage actuator assembly including translational and rotational actuators.

  14. Wear Resistance of Aluminum Matrix Composites Reinforced with Al2O3 Particles After Multiple Remelting

    NASA Astrophysics Data System (ADS)

    Klasik, Adam; Pietrzak, Krystyna; Makowska, Katarzyna; Sobczak, Jerzy; Rudnik, Dariusz; Wojciechowski, Andrzej

    2016-08-01

    Based on previous results, the commercial composites of A359 (AlSi9Mg) alloy reinforced with 22 vol.% Al2O3 particles were submitted to multiple remelting by means of gravity casting and squeeze-casting procedures. The studies were focused on tribological tests, x-ray phase analyses, and microstructural examinations. More promising results were obtained for the squeeze-casting method, mainly because of the reduction of negative microstructural effects such as shrinkage porosity and other microstructural defects and discontinuities. The results showed that direct remelting may be treated as an economically well-founded alternative to other recycling processes. It was underlined that the multiple remelting method must be analyzed for each material separately.

  15. Visualizing the Heterogeneity of Effects in the Analysis of Associations of Multiple Myeloma with Glyphosate Use. Comments on Sorahan, T. Multiple Myeloma and Glyphosate Use: A Re-Analysis of US Agricultural Health Study (AHS) Data. Int. J. Environ. Res. Public Health 2015, 12, 1548-1559.

    PubMed

    Burstyn, Igor; De Roos, Anneclaire J

    2016-12-22

    We address a methodological issue of the evaluation of the difference in effects in epidemiological studies that may arise, for example, from stratum-specific analyses or differences in analytical decisions during data analysis. We propose a new simulation-based method to quantify the plausible extent of such heterogeneity, rather than testing a hypothesis about its existence. We examine the contribution of the method to the debate surrounding risk of multiple myeloma and glyphosate use and propose that its application contributes to a more balanced weighting of evidence.
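
    As a schematic of the simulation idea (not the authors' exact procedure; all names hypothetical), one can draw stratum-specific estimates from their standard errors under a shared true effect and inspect how large a difference chance alone produces:

```python
import numpy as np

def simulated_difference(beta1, se1, beta2, se2, n_sim=100_000, seed=0):
    """Under a shared true effect (the precision-weighted average of the
    two stratum estimates), simulate stratum-specific estimates from
    their standard errors and return the distribution of their
    difference: a picture of the heterogeneity chance alone produces."""
    rng = np.random.default_rng(seed)
    w1, w2 = 1 / se1**2, 1 / se2**2
    common = (w1 * beta1 + w2 * beta2) / (w1 + w2)
    d = rng.normal(common, se1, n_sim) - rng.normal(common, se2, n_sim)
    return common, d
```

Plotting a histogram of `d` and marking the observed stratum difference gives the visual weighting of evidence the comment argues for, without a formal heterogeneity test.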

  16. Visualizing the Heterogeneity of Effects in the Analysis of Associations of Multiple Myeloma with Glyphosate Use. Comments on Sorahan, T. Multiple Myeloma and Glyphosate Use: A Re-Analysis of US Agricultural Health Study (AHS) Data. Int. J. Environ. Res. Public Health 2015, 12, 1548–1559

    PubMed Central

    Burstyn, Igor; De Roos, Anneclaire J.

    2016-01-01

    We address a methodological issue of the evaluation of the difference in effects in epidemiological studies that may arise, for example, from stratum-specific analyses or differences in analytical decisions during data analysis. We propose a new simulation-based method to quantify the plausible extent of such heterogeneity, rather than testing a hypothesis about its existence. We examine the contribution of the method to the debate surrounding risk of multiple myeloma and glyphosate use and propose that its application contributes to a more balanced weighting of evidence. PMID:28025514

  17. Gas Production Strategy of Underground Coal Gasification Based on Multiple Gas Sources

    PubMed Central

    Tianhong, Duan; Zuotang, Wang; Limin, Zhou; Dongdong, Li

    2014-01-01

    To lower the stability requirement of gas production in UCG (underground coal gasification), create better space and opportunities for the development of UCG, an emerging industry, in its initial stage, and reduce the emission of blast furnace gas, converter gas, and coke oven gas, this paper puts forward, for the first time, a new mode of utilization of multiple gas sources, mainly including ground gasifier gas, UCG gas, blast furnace gas, converter gas, and coke oven gas; the new mode was demonstrated by field tests. According to the field tests, the existing power generation technology can fully adapt to the high hydrogen content, low calorific value, and output fluctuation of the gas produced by UCG: in multiple-gas-source power generation, large fluctuations can be tolerated and air can serve as the gasifying agent, whereas gas production for the combined power-and-methanol mode based on multiple gas sources has a strict stability requirement. The field tests also demonstrated that fluctuations in UCG gas production can be well monitored through a quality control chart method. PMID:25114953

  18. Gas production strategy of underground coal gasification based on multiple gas sources.

    PubMed

    Tianhong, Duan; Zuotang, Wang; Limin, Zhou; Dongdong, Li

    2014-01-01

    To lower the stability requirement of gas production in UCG (underground coal gasification), create better space and opportunities for the development of UCG, an emerging industry, in its initial stage, and reduce the emission of blast furnace gas, converter gas, and coke oven gas, this paper puts forward, for the first time, a new mode of utilization of multiple gas sources, mainly including ground gasifier gas, UCG gas, blast furnace gas, converter gas, and coke oven gas; the new mode was demonstrated by field tests. According to the field tests, the existing power generation technology can fully adapt to the high hydrogen content, low calorific value, and output fluctuation of the gas produced by UCG: in multiple-gas-source power generation, large fluctuations can be tolerated and air can serve as the gasifying agent, whereas gas production for the combined power-and-methanol mode based on multiple gas sources has a strict stability requirement. The field tests also demonstrated that fluctuations in UCG gas production can be well monitored through a quality control chart method.
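
    A minimal version of the quality-control-chart monitoring mentioned above is a Shewhart-style individuals chart: control limits at the baseline mean plus or minus k standard deviations, with points outside the limits flagged as abnormal fluctuation. A sketch under that standard formulation, not the paper's exact chart:

```python
import numpy as np

def control_limits(baseline, k=3.0):
    """Shewhart-style control limits (mean +/- k sample standard
    deviations) estimated from a baseline gas-output series."""
    baseline = np.asarray(baseline, dtype=float)
    mu, sigma = baseline.mean(), baseline.std(ddof=1)
    return mu - k * sigma, mu + k * sigma

def out_of_control(series, lcl, ucl):
    """Indices of monitored points falling outside the control limits."""
    series = np.asarray(series, dtype=float)
    return np.flatnonzero((series < lcl) | (series > ucl))
```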

  19. Performance Tested Method multiple laboratory validation study of ELISA-based assays for the detection of peanuts in food.

    PubMed

    Park, Douglas L; Coates, Scott; Brewer, Vickery A; Garber, Eric A E; Abouzied, Mohamed; Johnson, Kurt; Ritter, Bruce; McKenzie, Deborah

    2005-01-01

    Performance Tested Method multiple laboratory validations for the detection of peanut protein in 4 different food matrixes were conducted under the auspices of the AOAC Research Institute. In this blind study, 3 commercially available ELISA test kits were validated: Neogen Veratox for Peanut, R-Biopharm RIDASCREEN FAST Peanut, and Tepnel BioKits for Peanut Assay. The food matrixes used were breakfast cereal, cookies, ice cream, and milk chocolate spiked at 0 and 5 ppm peanut. Analyses of the samples were conducted by laboratories representing industry and international and U.S. governmental agencies. All 3 commercial test kits successfully identified spiked and peanut-free samples. The validation study required 60 analyses on test samples at the target level of 5 microg peanut/g food and 60 analyses at a peanut-free level, which was designed to ensure that the lower 95% confidence limit for the sensitivity and specificity would not be <90%. The probability that a test sample contains an allergen given a prevalence rate of 5% and a positive test result using a single test kit analysis with 95% sensitivity and 95% specificity, which was demonstrated for these test kits, would be 50%. When 2 test kits are run simultaneously on all samples, the probability becomes 95%. It is therefore recommended that all field samples be analyzed with at least 2 of the validated kits.
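
    The predictive-value figures quoted above follow directly from Bayes' rule and can be reproduced, assuming the two kits err independently:

```python
def ppv(prevalence, sensitivity, specificity):
    """Positive predictive value via Bayes' rule: P(allergen | positive)."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# One kit at 95% sensitivity/specificity and 5% prevalence -> 0.50
single = ppv(0.05, 0.95, 0.95)

# Two independent kits, sample called positive only if both agree:
# sensitivities multiply, false-positive rates multiply
double = ppv(0.05, 0.95**2, 1 - 0.05**2)
```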

  20. Testing cross-phenotype effects of rare variants in longitudinal studies of complex traits.

    PubMed

    Rudra, Pratyaydipta; Broadaway, K Alaine; Ware, Erin B; Jhun, Min A; Bielak, Lawrence F; Zhao, Wei; Smith, Jennifer A; Peyser, Patricia A; Kardia, Sharon L R; Epstein, Michael P; Ghosh, Debashis

    2018-06-01

    Many gene mapping studies of complex traits have identified genes or variants that influence multiple phenotypes. With the advent of next-generation sequencing technology, there has been substantial interest in identifying rare variants in genes that possess cross-phenotype effects. In the presence of such effects, modeling both the phenotypes and rare variants collectively using multivariate models can achieve higher statistical power compared to univariate methods that either model each phenotype separately or perform separate tests for each variant. Several studies collect phenotypic data over time and using such longitudinal data can further increase the power to detect genetic associations. Although rare-variant approaches exist for testing cross-phenotype effects at a single time point, there is no analogous method for performing such analyses using longitudinal outcomes. In order to fill this important gap, we propose an extension of the Gene Association with Multiple Traits (GAMuT) test, a method for cross-phenotype analysis of rare variants using a framework based on the distance covariance. The approach allows for both binary and continuous phenotypes and can also adjust for covariates. Our simple adjustment to the GAMuT test allows it to handle longitudinal data and to gain power by exploiting temporal correlation. The approach is computationally efficient and applicable on a genome-wide scale due to the use of a closed-form test whose significance can be evaluated analytically. We use simulated data to demonstrate that our method has favorable power over competing approaches and also apply our approach to exome chip data from the Genetic Epidemiology Network of Arteriopathy. © 2018 WILEY PERIODICALS, INC.
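
    GAMuT builds on the distance covariance; the core statistic — nonzero in the limit only when two multivariate samples (e.g., genotypes and phenotypes) are dependent — can be sketched as follows. This is the generic sample distance covariance, not the GAMuT test itself:

```python
import numpy as np

def _center(D):
    """Double-center a pairwise distance matrix."""
    return D - D.mean(0) - D.mean(1)[:, None] + D.mean()

def dcov2(X, Y):
    """Squared sample distance covariance between the rows of X and Y
    (each n x p_X / n x p_Y, or 1D with one value per sample)."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    if X.ndim == 1:
        X = X[:, None]
    if Y.ndim == 1:
        Y = Y[:, None]
    A = _center(np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1))
    B = _center(np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1))
    return float((A * B).mean())
```

A permutation or analytic null distribution of this quantity yields a test of association between the two sample blocks.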

  1. Adaptive Discontinuous Galerkin Methods in Multiwavelets Bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archibald, Richard K; Fann, George I; Shelton Jr, William Allison

    2011-01-01

    We use a multiwavelet basis with the Discontinuous Galerkin (DG) method to produce a multi-scale DG method. We apply this Multiwavelet DG method to convection and convection-diffusion problems in multiple dimensions. Merging the DG method with multiwavelets allows the adaptivity in the DG method to be resolved through manipulation of multiwavelet coefficients rather than grid manipulation. Additionally, the Multiwavelet DG method is tested on non-linear equations in one dimension and on the cubed sphere.
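
    The coefficient-space adaptivity described above can be illustrated in the simplest (Haar, one-level) setting: refinement information lives in detail coefficients, and dropping those below a threshold coarsens smooth regions without touching any grid. A toy sketch, not the authors' solver:

```python
import numpy as np

def haar_decompose(f):
    """One level of the orthonormal Haar transform over cell pairs:
    returns cell averages and detail coefficients."""
    f = np.asarray(f, dtype=float).reshape(-1, 2)
    avg = (f[:, 0] + f[:, 1]) / np.sqrt(2)
    det = (f[:, 0] - f[:, 1]) / np.sqrt(2)
    return avg, det

def adapt(f, eps=1e-3):
    """Drop detail coefficients below eps (the grid-free adaptivity:
    resolution is kept only where the solution is rough), then
    reconstruct. Returns the reconstruction and the details retained."""
    avg, det = haar_decompose(f)
    det = np.where(np.abs(det) < eps, 0.0, det)
    left = (avg + det) / np.sqrt(2)
    right = (avg - det) / np.sqrt(2)
    return np.stack([left, right], axis=1).ravel(), int(np.count_nonzero(det))
```

On a piecewise-constant function with one jump, only the pair straddling the jump retains a detail coefficient; everywhere else the representation coarsens for free.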

  2. New approach to CT pixel-based photon dose calculations in heterogeneous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, J.W.; Henkelman, R.M.

    The effects of small cavities on dose in water and the dose in a homogeneous nonunit density medium illustrate that inhomogeneities do not act independently in photon dose perturbation, and serve as two constraints which should be satisfied by approximate methods of computed tomography (CT) pixel-based dose calculations. Current methods at best satisfy only one of the two constraints and show inadequacies in some intermediate geometries. We have developed an approximate method that satisfies both these constraints and treats much of the synergistic effect of multiple inhomogeneities correctly. The method calculates primary and first-scatter doses by first-order ray tracing with the first-scatter contribution augmented by a component of second scatter that behaves like first scatter. Multiple-scatter dose perturbation values extracted from small cavity experiments are used in a function which approximates the small residual multiple-scatter dose. For a wide range of geometries tested, our method agrees very well with measurements. The average deviation is less than 2% with a maximum of 3%. In comparison, calculations based on existing methods can have errors larger than 10%.

  3. Masonry fireplace emissions test method: Repeatability and sensitivity to fueling protocol.

    PubMed

    Stern, C H; Jaasma, D R; Champion, M R

    1993-03-01

    A test method for masonry fireplaces has been evaluated during testing on six masonry fireplace configurations. The method determines carbon monoxide and particulate matter emission rates (g/h) and factors (g/kg) and does not require weighing of the appliance to determine the timing of fuel loading. The intralaboratory repeatability of the test method has been determined from multiple tests on the six fireplaces. For the tested fireplaces, the ratio of the highest to lowest measured PM rate averaged 1.17 and in no case was greater than 1.32. The data suggest that some of the variation is due to differences in fuel properties. The influence of fueling protocol on emissions has also been studied. A modified fueling protocol, tested in large and small fireplaces, reduced CO and PM emission factors by roughly 40% and reduced CO and PM rates from 0 to 30%. For both of these fireplaces, emission rates were less sensitive to fueling protocol than emission factors.

  4. Estimating scaled treatment effects with multiple outcomes.

    PubMed

    Kennedy, Edward H; Kangovi, Shreya; Mitra, Nandita

    2017-01-01

    In classical study designs, the aim is often to learn about the effects of a treatment or intervention on a single outcome; in many modern studies, however, data on multiple outcomes are collected and it is of interest to explore effects on multiple outcomes simultaneously. Such designs can be particularly useful in patient-centered research, where different outcomes might be more or less important to different patients. In this paper, we propose scaled effect measures (via potential outcomes) that translate effects on multiple outcomes to a common scale, using mean-variance and median-interquartile range based standardizations. We present efficient, nonparametric, doubly robust methods for estimating these scaled effects (and weighted average summary measures), and for testing the null hypothesis that treatment affects all outcomes equally. We also discuss methods for exploring how treatment effects depend on covariates (i.e., effect modification). In addition to describing efficiency theory for our estimands and the asymptotic behavior of our estimators, we illustrate the methods in a simulation study and a data analysis. Importantly, and in contrast to much of the literature concerning effects on multiple outcomes, our methods are nonparametric and can be used not only in randomized trials to yield increased efficiency, but also in observational studies with high-dimensional covariates to reduce confounding bias.
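
    The mean-variance standardization mentioned above can be sketched with a simple plug-in estimator: each outcome's treatment-control mean difference is divided by its pooled standard deviation, so effects on outcomes measured in different units become comparable. The paper's estimators are nonparametric and doubly robust; this is only the naive version:

```python
import numpy as np

def scaled_effects(y_treat, y_ctrl):
    """Mean-variance standardized effect per outcome: difference in
    group means divided by the pooled standard deviation. Inputs are
    (n_subjects x n_outcomes) arrays for the two arms."""
    y_treat = np.asarray(y_treat, dtype=float)
    y_ctrl = np.asarray(y_ctrl, dtype=float)
    diff = y_treat.mean(axis=0) - y_ctrl.mean(axis=0)
    pooled = np.sqrt((y_treat.var(axis=0, ddof=1) +
                      y_ctrl.var(axis=0, ddof=1)) / 2)
    return diff / pooled
```

Rescaling an outcome (say, from grams to milligrams) leaves its scaled effect unchanged, which is what makes a joint test of "treatment affects all outcomes equally" meaningful.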

  5. Moment Method and Pixel-by-Pixel Method: Complementary Mode Identification I. Testing FG Vir-like pulsation modes

    NASA Astrophysics Data System (ADS)

    Zima, W.; Kolenberg, K.; Briquet, M.; Breger, M.

    2004-06-01

    We have carried out a Hare-and-Hound test to determine the reliability of the Moment Method (Briquet & Aerts 2003) and the Pixel-by-Pixel Method (Mantegazza 2000) for the identification of pulsation modes in Delta Scuti stars. For this purpose we calculated synthetic line profiles, exhibiting six pulsation modes of low degree and with input parameters initially unknown to us. The aim was to test and increase the quality of the mode identification by applying both methods independently and by using a combined technique. Our results show that, whereas the azimuthal order m and its sign can be fixed by both methods, the degree l is not determined unambiguously. Both identification methods show a better reliability if multiple modes are fitted simultaneously. In particular, the inclination angle is better determined. We have to emphasize that the outcome of this test is only meaningful for stars having pulsational velocities below 0.2 vsini. This is the first part of a series of articles, in which we will test these spectroscopic identification methods.

  6. Concerns regarding a call for pluralism of information theory and hypothesis testing

    USGS Publications Warehouse

    Lukacs, P.M.; Thompson, W.L.; Kendall, W.L.; Gould, W.R.; Doherty, P.F.; Burnham, K.P.; Anderson, D.R.

    2007-01-01

    1. Stephens et al. (2005) argue for 'pluralism' in statistical analysis, combining null hypothesis testing and information-theoretic (I-T) methods. We show that I-T methods are more informative even in single variable problems and we provide an ecological example. 2. I-T methods allow inferences to be made from multiple models simultaneously. We believe multimodel inference is the future of data analysis, which cannot be achieved with null hypothesis-testing approaches. 3. We argue for a stronger emphasis on critical thinking in science in general and less reliance on exploratory data analysis and data dredging. Deriving alternative hypotheses is central to science; deriving a single interesting science hypothesis and then comparing it to a default null hypothesis (e.g. 'no difference') is not an efficient strategy for gaining knowledge. We think this single-hypothesis strategy has been relied upon too often in the past. 4. We clarify misconceptions presented by Stephens et al. (2005). 5. We think inference should be made about models, directly linked to scientific hypotheses, and their parameters conditioned on data, Prob(Hj | data). I-T methods provide a basis for this inference. Null hypothesis testing merely provides a probability statement about the data conditioned on a null model, Prob(data | H0). 6. Synthesis and applications. I-T methods provide a more informative approach to inference. I-T methods provide a direct measure of evidence for or against hypotheses and a means to consider simultaneously multiple hypotheses as a basis for rigorous inference. Progress in our science can be accelerated if modern methods can be used intelligently; this includes various I-T and Bayesian methods.
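
    The multimodel inference advocated above is commonly carried out with AIC differences and Akaike weights, which supply the direct measure of evidence across a model set:

```python
import numpy as np

def akaike_weights(log_liks, n_params):
    """AIC and Akaike weights for a set of fitted models. The weight of
    model i approximates the relative evidence that it is the best
    (minimum-AIC) model in the set."""
    aic = -2 * np.asarray(log_liks, dtype=float) + 2 * np.asarray(n_params, dtype=float)
    delta = aic - aic.min()          # AIC differences from the best model
    w = np.exp(-delta / 2)
    return aic, w / w.sum()
```

Weights near 1 for a single model indicate clear support; weights spread across several models indicate that inference should be averaged over them rather than conditioned on one.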

  7. Development and Initial Testing of a Structured Clinical Observation Tool to Assess Pharmacotherapy Competence

    ERIC Educational Resources Information Center

    Young, John Q.; Lieu, Sandra; O'Sullivan, Patricia; Tong, Lowell

    2011-01-01

    Objective: The authors developed and tested the feasibility and utility of a new direct-observation instrument to assess trainee performance of a medication management session. Methods: The Psychopharmacotherapy-Structured Clinical Observation (P-SCO) instrument was developed based on multiple sources of expertise and then implemented in 4…

  8. Assessment of Foundation Knowledge: Are Students Confident in Their Ability?

    ERIC Educational Resources Information Center

    Fenna, Doug S.

    2004-01-01

    Multiple-choice testing (MCT) has several advantages which are becoming more relevant in the current financial climate. In particular, they can be machine marked. As an objective testing method it is particularly relevant to engineering and other factual courses, but MCTs are not widely used in engineering because students can benefit from…

  9. Understanding Genetic Toxicity Through Data Mining: The Process of Building Knowledge by Integrating Multiple Genetic Toxicity Databases

    EPA Science Inventory

    This paper demonstrates the usefulness of representing a chemical by its structural features and the use of these features to profile a battery of tests rather than relying on a single toxicity test of a given chemical. This paper presents data mining/profiling methods applied in...

  10. Using Meta-Analysis to Inform the Design of Subsequent Studies of Diagnostic Test Accuracy

    ERIC Educational Resources Information Center

    Hinchliffe, Sally R.; Crowther, Michael J.; Phillips, Robert S.; Sutton, Alex J.

    2013-01-01

    An individual diagnostic accuracy study rarely provides enough information to make conclusive recommendations about the accuracy of a diagnostic test; particularly when the study is small. Meta-analysis methods provide a way of combining information from multiple studies, reducing uncertainty in the result and hopefully providing substantial…

  11. CR-Calculus and adaptive array theory applied to MIMO random vibration control tests

    NASA Astrophysics Data System (ADS)

    Musella, U.; Manzato, S.; Peeters, B.; Guillaume, P.

    2016-09-01

Performing Multiple-Input Multiple-Output (MIMO) tests to reproduce the vibration environment in a user-defined number of control points of a unit under test is necessary in applications where a realistic environment replication has to be achieved. MIMO tests require vibration control strategies to calculate the required drive signal vector that gives an acceptable replication of the target. This target is a (complex) vector with magnitude and phase information at the control points for MIMO Sine Control tests, while in MIMO Random Control tests, in the most general case, the target is a complete spectral density matrix. The idea behind this work is to tailor a MIMO random vibration control approach that can be generalized to other MIMO tests, e.g. MIMO Sine and MIMO Time Waveform Replication. In this work the approach is to use gradient-based procedures over the complex space, applying the so-called CR-Calculus and adaptive array theory. With this approach it is possible to better control the process performance, allowing step-by-step updates of the Jacobian matrix. The theoretical bases behind the work are followed by an application of the developed method to a two-exciter two-axis system and by performance comparisons with standard methods.

  12. A MULTIPLE TESTING OF THE ABC METHOD AND THE DEVELOPMENT OF A SECOND-GENERATION MODEL. PART II, TEST RESULTS AND AN ANALYSIS OF RECALL RATIO.

    ERIC Educational Resources Information Center

    ALTMANN, BERTHOLD

After a brief summary of the test program (described more fully in LI 000 318), the statistical results, tabulated as overall "ABC (Approach by Concept)-relevance ratios" and "ABC-recall figures," are presented and reviewed. An abstract model developed in accordance with Max Weber's "Idealtypus" ("Die Objektivitaet…

  13. Low Bone Mineral Density Risk Factors and Testing Patterns in Institutionalized Adults with Intellectual and Developmental Disabilities

    ERIC Educational Resources Information Center

    Hess, Mailee; Campagna, Elizabeth J.; Jensen, Kristin M.

    2018-01-01

    Background: Adults with intellectual or developmental disability (ID/DD) have multiple risks for low bone mineral density (BMD) without formal guidelines to guide testing. We sought to identify risk factors and patterns of BMD testing among institutionalized adults with ID/DD. Methods: We evaluated risk factors for low BMD (Z-/T-score < -1) and…

  14. HIV-Related Risk Behaviors, Perceptions of Risk, HIV Testing, and Exposure to Prevention Messages and Methods among Urban American Indians and Alaska Natives

    ERIC Educational Resources Information Center

    Lapidus, Jodi A.; Bertolli, Jeanne; McGowan, Karen; Sullivan, Patrick

    2006-01-01

    The goal of this study was to describe HIV risk behaviors, perceptions, testing, and prevention exposure among urban American Indians and Alaska Natives (AI/AN). Interviewers administered a questionnaire to participants recruited through anonymous peer-referral sampling. Chi-square tests and multiple logistic regression were used to compare HIV…

  15. Application of Bayesian methods to habitat selection modeling of the northern spotted owl in California: new statistical methods for wildlife research

    Treesearch

    Howard B. Stauffer; Cynthia J. Zabel; Jeffrey R. Dunk

    2005-01-01

    We compared a set of competing logistic regression habitat selection models for Northern Spotted Owls (Strix occidentalis caurina) in California. The habitat selection models were estimated, compared, evaluated, and tested using multiple sample datasets collected on federal forestlands in northern California. We used Bayesian methods in interpreting...

  16. A Robust Real Time Direction-of-Arrival Estimation Method for Sequential Movement Events of Vehicles.

    PubMed

    Liu, Huawei; Li, Baoqing; Yuan, Xiaobing; Zhou, Qianwei; Huang, Jingchang

    2018-03-27

Parameter estimation for sequential movement events of vehicles faces the challenges of noise interference and the demands of portable implementation. In this paper, we propose a robust direction-of-arrival (DOA) estimation method for the sequential movement events of vehicles based on a small Micro-Electro-Mechanical System (MEMS) microphone array system. Inspired by the incoherent signal-subspace method (ISM), the method proposed in this work employs multiple sub-bands, selected from the wideband signals with high magnitude-squared coherence, to track moving vehicles in the presence of wind noise. The field test results demonstrate that the proposed method performs better at estimating the DOA of a moving vehicle, even in the case of severe wind interference, than the narrowband multiple signal classification (MUSIC) method, the sub-band DOA estimation method, and the classical two-sided correlation transformation (TCT) method.

  17. Evidence synthesis to inform model-based cost-effectiveness evaluations of diagnostic tests: a methodological review of health technology assessments.

    PubMed

    Shinkins, Bethany; Yang, Yaling; Abel, Lucy; Fanshawe, Thomas R

    2017-04-14

    Evaluations of diagnostic tests are challenging because of the indirect nature of their impact on patient outcomes. Model-based health economic evaluations of tests allow different types of evidence from various sources to be incorporated and enable cost-effectiveness estimates to be made beyond the duration of available study data. To parameterize a health-economic model fully, all the ways a test impacts on patient health must be quantified, including but not limited to diagnostic test accuracy. We assessed all UK NIHR HTA reports published May 2009-July 2015. Reports were included if they evaluated a diagnostic test, included a model-based health economic evaluation and included a systematic review and meta-analysis of test accuracy. From each eligible report we extracted information on the following topics: 1) what evidence aside from test accuracy was searched for and synthesised, 2) which methods were used to synthesise test accuracy evidence and how did the results inform the economic model, 3) how/whether threshold effects were explored, 4) how the potential dependency between multiple tests in a pathway was accounted for, and 5) for evaluations of tests targeted at the primary care setting, how evidence from differing healthcare settings was incorporated. The bivariate or HSROC model was implemented in 20/22 reports that met all inclusion criteria. Test accuracy data for health economic modelling was obtained from meta-analyses completely in four reports, partially in fourteen reports and not at all in four reports. Only 2/7 reports that used a quantitative test gave clear threshold recommendations. All 22 reports explored the effect of uncertainty in accuracy parameters but most of those that used multiple tests did not allow for dependence between test results. 7/22 tests were potentially suitable for primary care but the majority found limited evidence on test accuracy in primary care settings. 
The uptake of appropriate meta-analysis methods for synthesising evidence on diagnostic test accuracy in UK NIHR HTAs has improved in recent years. Future research should focus on other evidence requirements for cost-effectiveness assessment, threshold effects for quantitative tests and the impact of multiple diagnostic tests.

  18. Using a derivative-free optimization method for multiple solutions of inverse transport problems

    DOE PAGES

    Armstrong, Jerawan C.; Favorite, Jeffrey A.

    2016-01-14

Identifying unknown components of an object that emits radiation is an important problem for national and global security. Radiation signatures measured from an object of interest can be used to infer object parameter values that are not known. This problem is called an inverse transport problem. An inverse transport problem may have multiple solutions and the most widely used approach for its solution is an iterative optimization method. This paper proposes a stochastic derivative-free global optimization algorithm to find multiple solutions of inverse transport problems. The algorithm is an extension of a multilevel single linkage (MLSL) method where a mesh adaptive direct search (MADS) algorithm is incorporated into the local phase. Furthermore, numerical test cases using uncollided fluxes of discrete gamma-ray lines are presented to show the performance of this new algorithm.

  19. A survey of variable selection methods in two Chinese epidemiology journals

    PubMed Central

    2010-01-01

    Background Although much has been written on developing better procedures for variable selection, there is little research on how it is practiced in actual studies. This review surveys the variable selection methods reported in two high-ranking Chinese epidemiology journals. Methods Articles published in 2004, 2006, and 2008 in the Chinese Journal of Epidemiology and the Chinese Journal of Preventive Medicine were reviewed. Five categories of methods were identified whereby variables were selected using: A - bivariate analyses; B - multivariable analysis; e.g. stepwise or individual significance testing of model coefficients; C - first bivariate analyses, followed by multivariable analysis; D - bivariate analyses or multivariable analysis; and E - other criteria like prior knowledge or personal judgment. Results Among the 287 articles that reported using variable selection methods, 6%, 26%, 30%, 21%, and 17% were in categories A through E, respectively. One hundred sixty-three studies selected variables using bivariate analyses, 80% (130/163) via multiple significance testing at the 5% alpha-level. Of the 219 multivariable analyses, 97 (44%) used stepwise procedures, 89 (41%) tested individual regression coefficients, but 33 (15%) did not mention how variables were selected. Sixty percent (58/97) of the stepwise routines also did not specify the algorithm and/or significance levels. Conclusions The variable selection methods reported in the two journals were limited in variety, and details were often missing. Many studies still relied on problematic techniques like stepwise procedures and/or multiple testing of bivariate associations at the 0.05 alpha-level. These deficiencies should be rectified to safeguard the scientific validity of articles published in Chinese epidemiology journals. PMID:20920252
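The "multiple testing of bivariate associations at the 0.05 alpha-level" criticized in this record's conclusions inflates error rates quickly: with m independent true-null tests, the chance of at least one false positive is 1 - (1 - alpha)^m. A sketch of the arithmetic, assuming independence for simplicity:

```python
alpha = 0.05
m_values = (1, 5, 10, 20)

# familywise error rate with no correction: 1 - (1 - alpha)^m
uncorrected = {m: 1 - (1 - alpha) ** m for m in m_values}

# a Bonferroni screen tests each variable at alpha / m instead,
# which caps the familywise error rate at roughly alpha
bonferroni = {m: 1 - (1 - alpha / m) ** m for m in m_values}
```

With 20 uncorrected screening tests the chance of at least one spurious "significant" variable is already about 64%, which is why the review flags the practice.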

  20. Comparison of two stand-alone CADe systems at multiple operating points

    NASA Astrophysics Data System (ADS)

    Sahiner, Berkman; Chen, Weijie; Pezeshk, Aria; Petrick, Nicholas

    2015-03-01

Computer-aided detection (CADe) systems are typically designed to work at a given operating point: the device displays a mark if and only if the level of suspiciousness of a region of interest is above a fixed threshold. To compare the standalone performances of two systems, one approach is to select the parameters of the systems to yield a target false-positive rate that defines the operating point, and to compare the sensitivities at that operating point. Increasingly, CADe developers offer multiple operating points, so comparing two CADe systems involves multiple comparisons. To control the Type I error, multiple-comparison correction is needed to keep the family-wise error rate (FWER) below a given alpha-level. The sensitivities of a single modality at different operating points are correlated. In addition, the sensitivities of the two modalities at the same or different operating points are also likely to be correlated. It has been shown in the literature that when test statistics are correlated, well-known methods for controlling the FWER are conservative. In this study, we compared the FWER and power of three methods, namely the Bonferroni, step-up, and adjusted step-up methods, in comparing the sensitivities of two CADe systems at multiple operating points, where the adjusted step-up method uses the estimated correlations. Our results indicate that the adjusted step-up method has a substantial advantage over the other two methods in terms of both the FWER and power.
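The Bonferroni and plain step-up procedures compared in this record can be sketched in a few lines; the study's adjusted step-up method additionally plugs estimated correlations into the step-up critical values, which is not reproduced here. Function names and the example p-values are ours:

```python
def bonferroni(pvals):
    """Bonferroni-adjusted p-values: multiply each by the number of tests."""
    m = len(pvals)
    return [min(1.0, p * m) for p in pvals]

def hochberg_step_up(pvals):
    """Hochberg step-up adjusted p-values (uniformly <= Bonferroni)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices by ascending p
    adj = [0.0] * m
    running = 1.0
    # walk from the largest p-value down, enforcing monotone adjusted values
    for rank in range(m - 1, -1, -1):
        i = order[rank]
        running = min(running, (m - rank) * pvals[i])
        adj[i] = min(1.0, running)
    return adj

# hypothetical p-values from four operating-point comparisons
p = [0.01, 0.04, 0.03, 0.005]
```

On these four hypothetical comparisons Bonferroni yields [0.04, 0.16, 0.12, 0.02] while the step-up adjustment yields [0.03, 0.04, 0.04, 0.02] — never larger, which is why step-up procedures are less conservative.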

  1. Performance of the AOAC use-dilution method with targeted modifications: collaborative study.

    PubMed

    Tomasino, Stephen F; Parker, Albert E; Hamilton, Martin A; Hamilton, Gordon C

    2012-01-01

The U.S. Environmental Protection Agency (EPA), in collaboration with an industry work group, spearheaded a collaborative study designed to further enhance the AOAC use-dilution method (UDM). Based on feedback from laboratories that routinely conduct the UDM, improvements to the test culture preparation steps were prioritized. A set of modifications, largely based on culturing the test microbes on agar as specified in the AOAC hard surface carrier test method, were evaluated in a five-laboratory trial. The modifications targeted the preparation of the Pseudomonas aeruginosa test culture due to the difficulty in separating the pellicle from the broth in the current UDM. The proposed modifications (i.e., the modified UDM) were compared to the current UDM methodology for P. aeruginosa and Staphylococcus aureus. Salmonella choleraesuis was not included in the study. The goal was to determine if the modifications reduced method variability. Three efficacy response variables were statistically analyzed: the number of positive carriers, the log reduction, and the pass/fail outcome. The scope of the collaborative study was limited to testing one liquid disinfectant (an EPA-registered quaternary ammonium product) at two levels of presumed product efficacy, high and low. Test conditions included use of 400 ppm hard water as the product diluent and a 5% organic soil load (horse serum) added to the inoculum. Unfortunately, the study failed to support the adoption of the major modification (use of an agar-based approach to grow the test cultures) based on an analysis of the method's variability. The repeatability and reproducibility standard deviations for the modified method were equal to or greater than those for the current method across the various test variables. 
However, the authors propose retaining the frozen stock preparation step of the modified method, and based on the statistical equivalency of the control log densities, support its adoption as a procedural change to the current UDM. The current UDM displayed acceptable responsiveness to changes in product efficacy; acceptable repeatability across multiple tests in each laboratory for the control counts and log reductions; and acceptable reproducibility across multiple laboratories for the control log density values and log reductions. Although the data do not support the adoption of all modifications, the UDM collaborative study data are valuable for assessing sources of method variability and a reassessment of the performance standard for the UDM.

  2. Diagramming the Never Ending Story: Student-generated diagrammatic stories integrate and retain science concepts improving science literacy

    NASA Astrophysics Data System (ADS)

    Pillsbury, Ralph T.

This research examined an instructional strategy called Diagramming the Never Ending Story: a method called diagramming was taught to sixth grade students via an outdoor science inquiry ecology unit. Students generated diagrams of the new ecology concepts they encountered, creating explanatory 'captions' for their newly drawn diagrams while connecting them in a memorable story. The diagramming process culminates in 20-30 meter-long murals called the Never Ending Story: months of science instruction are constructed as pictorial scrolls, making sense of all the new science concepts the students encounter. This method was taught at a North Carolina "public" charter school, Children's Community School, to measure its efficacy in helping students comprehend scientific concepts and retain them, thereby increasing science literacy. There were four demographically similar classes of 20 students each. Two treatment classes, randomly chosen from the four classes, generated their own Never Ending Stories after being taught the diagramming method. A Solomon Four-Group Design was employed: two classes (one control, one treatment) were administered pre- and post-tests; two classes received post-tests only. The tests were comprised of multiple choice, fill-in and extended response (open-ended) sections. Multiple choice and fill-in test data were not statistically significant, whereas extended response test data confirm that treatment classes made statistically significant gains.

  3. Development of a multiple-parameter nonlinear perturbation procedure for transonic turbomachinery flows: Preliminary application to design/optimization problems

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Elliott, J. P.; Spreiter, J. R.

    1983-01-01

An investigation was conducted to continue the development of perturbation procedures and associated computational codes for rapidly determining approximations to nonlinear flow solutions, with the purpose of establishing a method for minimizing computational requirements associated with parametric design studies of transonic flows in turbomachines. The results reported here concern the extension of the previously developed successful method for single-parameter perturbations to simultaneous multiple-parameter perturbations, and the preliminary application of the multiple-parameter procedure in combination with an optimization method to a blade design/optimization problem. In order to provide as severe a test as possible of the method, attention is focused in particular on transonic flows which are highly supercritical. Flows past both isolated blades and compressor cascades, involving simultaneous changes in both flow and geometric parameters, are considered. Comparisons with the corresponding exact nonlinear solutions display remarkable accuracy and range of validity, in direct correspondence with previous results for single-parameter perturbations.

  4. Irreproducible and uninterpretable Polymyxin B MICs for Enterobacter cloacae and Enterobacter aerogenes.

    PubMed

    Landman, David; Salamera, Julius; Quale, John

    2013-12-01

    Carbapenem-resistant Enterobacter species are emerging nosocomial pathogens. As with most multidrug-resistant Gram-negative pathogens, the polymyxins are often the only therapeutic option. In this study involving clinical isolates of E. cloacae and E. aerogenes, susceptibility testing methods with polymyxin B were analyzed. All isolates underwent testing by the broth microdilution (in duplicate) and agar dilution (in duplicate) methods, and select isolates were examined by the Etest method. Selected isolates were also examined for heteroresistance by population analysis profiling. Using a susceptibility breakpoint of ≤2 μg/ml, categorical agreement by all four dilution tests (two broth microdilution and two agar dilution) was achieved in only 76/114 (67%) of E. cloacae isolates (65 susceptible, 11 resistant). Thirty-eight (33%) had either conflicting or uninterpretable results (multiple skip wells, i.e., wells that exhibit no growth although growth does occur at higher concentrations). Of the 11 consistently resistant isolates, five had susceptible MICs as determined by Etest. Heteroresistant subpopulations were detected in eight of eight isolates tested, with greater percentages in isolates with uninterpretable MICs. For E. aerogenes, categorical agreement between the four dilution tests was obtained in 48/56 (86%), with conflicting and/or uninterpretable results in 8/56 (14%). For polymyxin susceptibility testing of Enterobacter species, close attention must be paid to the presence of multiple skip wells, leading to uninterpretable results. Susceptibility also should not be assumed based on the results of a single test. Until the clinical relevance of skip wells is defined, interpretation of polymyxin susceptibility tests for Enterobacter species should be undertaken with extreme caution.

  5. Irreproducible and Uninterpretable Polymyxin B MICs for Enterobacter cloacae and Enterobacter aerogenes

    PubMed Central

    Landman, David; Salamera, Julius

    2013-01-01

    Carbapenem-resistant Enterobacter species are emerging nosocomial pathogens. As with most multidrug-resistant Gram-negative pathogens, the polymyxins are often the only therapeutic option. In this study involving clinical isolates of E. cloacae and E. aerogenes, susceptibility testing methods with polymyxin B were analyzed. All isolates underwent testing by the broth microdilution (in duplicate) and agar dilution (in duplicate) methods, and select isolates were examined by the Etest method. Selected isolates were also examined for heteroresistance by population analysis profiling. Using a susceptibility breakpoint of ≤2 μg/ml, categorical agreement by all four dilution tests (two broth microdilution and two agar dilution) was achieved in only 76/114 (67%) of E. cloacae isolates (65 susceptible, 11 resistant). Thirty-eight (33%) had either conflicting or uninterpretable results (multiple skip wells, i.e., wells that exhibit no growth although growth does occur at higher concentrations). Of the 11 consistently resistant isolates, five had susceptible MICs as determined by Etest. Heteroresistant subpopulations were detected in eight of eight isolates tested, with greater percentages in isolates with uninterpretable MICs. For E. aerogenes, categorical agreement between the four dilution tests was obtained in 48/56 (86%), with conflicting and/or uninterpretable results in 8/56 (14%). For polymyxin susceptibility testing of Enterobacter species, close attention must be paid to the presence of multiple skip wells, leading to uninterpretable results. Susceptibility also should not be assumed based on the results of a single test. Until the clinical relevance of skip wells is defined, interpretation of polymyxin susceptibility tests for Enterobacter species should be undertaken with extreme caution. PMID:24088860

  6. Testing biological liquid samples using modified m-line spectroscopy method

    NASA Astrophysics Data System (ADS)

    Augusciuk, Elzbieta; Rybiński, Grzegorz

    2005-09-01

A non-chemical method for detecting sugar concentration in biological liquids (of animal and plant origin) has been investigated. A simplified setup was built to show how easily the survey can be carried out and to make it easy to gather multiple measurements for error detection and statistics. The method is suggested as an easy and cheap alternative to chemical methods of measuring sugar concentration, though considerable effort is needed to make it precise.

  7. Comparison of Methods for Adjusting Incorrect Assignments of Items to Subtests: Oblique Multiple Group Method versus Confirmatory Common Factor Method

    ERIC Educational Resources Information Center

    Stuive, Ilse; Kiers, Henk A. L.; Timmerman, Marieke E.

    2009-01-01

    A common question in test evaluation is whether an a priori assignment of items to subtests is supported by empirical data. If the analysis results indicate the assignment of items to subtests under study is not supported by data, the assignment is often adjusted. In this study the authors compare two methods on the quality of their suggestions to…

  8. System For Research On Multiple-Arm Robots

    NASA Technical Reports Server (NTRS)

    Backes, Paul G.; Hayati, Samad; Tso, Kam S.; Hayward, Vincent

    1991-01-01

    Kali system of computer programs and equipment provides environment for research on distributed programming and distributed control of coordinated-multiple-arm robots. Suitable for telerobotics research involving sensing and execution of low level tasks. Software and configuration of hardware designed flexible so system modified easily to test various concepts in control and programming of robots, including multiple-arm control, redundant-arm control, shared control, traded control, force control, force/position hybrid control, design and integration of sensors, teleoperation, task-space description and control, methods of adaptive control, control of flexible arms, and human factors.

  9. An entropy-based nonparametric test for the validation of surrogate endpoints.

    PubMed

    Miao, Xiaopeng; Wang, Yong-Cheng; Gangopadhyay, Ashis

    2012-06-30

    We present a nonparametric test to validate surrogate endpoints based on measure of divergence and random permutation. This test is a proposal to directly verify the Prentice statistical definition of surrogacy. The test does not impose distributional assumptions on the endpoints, and it is robust to model misspecification. Our simulation study shows that the proposed nonparametric test outperforms the practical test of the Prentice criterion in terms of both robustness of size and power. We also evaluate the performance of three leading methods that attempt to quantify the effect of surrogate endpoints. The proposed method is applied to validate magnetic resonance imaging lesions as the surrogate endpoint for clinical relapses in a multiple sclerosis trial. Copyright © 2012 John Wiley & Sons, Ltd.
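The random-permutation machinery behind such a test is generic: compute a statistic on the original grouping, then compare it against its distribution under random relabelings of the pooled data. A sketch using a difference in means as a stand-in for the paper's divergence measure (names and defaults are ours):

```python
import random

def permutation_test(x, y, n_perm=2000, seed=0):
    """Two-sample permutation p-value for |mean(x) - mean(y)|."""
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # random relabeling of the pooled observations
        px, py = pooled[:len(x)], pooled[len(x):]
        if abs(sum(px) / len(px) - sum(py) / len(py)) >= observed:
            hits += 1
    # +1 smoothing keeps the estimated p-value away from an exact zero
    return (hits + 1) / (n_perm + 1)
```

No distributional assumption on x or y is needed, which is the robustness property the abstract emphasizes; substituting an entropy-based divergence for the mean difference recovers the spirit of the proposed test.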

  10. Analysis and Testing of Mobile Wireless Networks

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Evenson, Darin; Rundquist, Victor; Clancy, Daniel (Technical Monitor)

    2002-01-01

Wireless networks are being used to connect mobile computing elements in more applications as the technology matures. There are now many products (such as 802.11 and 802.11b) which run in the ISM frequency band and comply with wireless network standards. They are being used increasingly to link mobile intranets into wired networks. Standard methods of analyzing and testing their performance and compatibility are needed to determine the limits of the technology. This paper presents analytical and experimental methods of determining network throughput, range and coverage, and interference sources. Both radio frequency (RF) domain and network domain analysis have been applied to determine wireless network throughput and range in the outdoor environment. Comparison of field test data taken under optimal conditions with performance predicted from RF analysis yielded quantitative results applicable to future designs. Layering multiple wireless network subnets can increase performance. Wireless network components can be set to different radio frequency-hopping sequences or spreading functions, allowing more than one subnet to coexist. Therefore, we ran multiple 802.11-compliant systems concurrently in the same geographical area to determine interference effects and scalability. The results can be used in the design of more robust networks which have multiple layers of wireless data communication paths and provide increased throughput overall.

  11. A chimera grid scheme. [multiple overset body-conforming mesh system for finite difference adaptation to complex aircraft configurations

    NASA Technical Reports Server (NTRS)

    Steger, J. L.; Dougherty, F. C.; Benek, J. A.

    1983-01-01

    A mesh system composed of multiple overset body-conforming grids is described for adapting finite-difference procedures to complex aircraft configurations. In this so-called 'chimera mesh,' a major grid is generated about a main component of the configuration and overset minor grids are used to resolve all other features. Methods for connecting overset multiple grids and modifications of flow-simulation algorithms are discussed. Computational tests in two dimensions indicate that the use of multiple overset grids can simplify the task of grid generation without an adverse effect on flow-field algorithms and computer code complexity.

  12. Estimating hydraulic properties using a moving-model approach and multiple aquifer tests

    USGS Publications Warehouse

    Halford, K.J.; Yobbi, D.

    2006-01-01

    A new method was developed for characterizing geohydrologic columns that extended >600 m deep at sites with as many as six discrete aquifers. This method was applied at 12 sites within the Southwest Florida Water Management District. Sites typically were equipped with multiple production wells, one for each aquifer and one or more observation wells per aquifer. The average hydraulic properties of the aquifers and confining units within radii of 30 to >300 m were characterized at each site. Aquifers were pumped individually and water levels were monitored in stressed and adjacent aquifers during each pumping event. Drawdowns at a site were interpreted using a radial numerical model that extended from land surface to the base of the geohydrologic column and simulated all pumping events. Conceptually, the radial model moves between stress periods and recenters on the production well during each test. Hydraulic conductivity was assumed homogeneous and isotropic within each aquifer and confining unit. Hydraulic property estimates for all of the aquifers and confining units were consistent and reasonable because results from multiple aquifers and pumping events were analyzed simultaneously. Copyright ?? 2005 National Ground Water Association.

  13. Estimating hydraulic properties using a moving-model approach and multiple aquifer tests.

    PubMed

    Halford, Keith J; Yobbi, Dann

    2006-01-01

    A new method was developed for characterizing geohydrologic columns that extended >600 m deep at sites with as many as six discrete aquifers. This method was applied at 12 sites within the Southwest Florida Water Management District. Sites typically were equipped with multiple production wells, one for each aquifer and one or more observation wells per aquifer. The average hydraulic properties of the aquifers and confining units within radii of 30 to >300 m were characterized at each site. Aquifers were pumped individually and water levels were monitored in stressed and adjacent aquifers during each pumping event. Drawdowns at a site were interpreted using a radial numerical model that extended from land surface to the base of the geohydrologic column and simulated all pumping events. Conceptually, the radial model moves between stress periods and recenters on the production well during each test. Hydraulic conductivity was assumed homogeneous and isotropic within each aquifer and confining unit. Hydraulic property estimates for all of the aquifers and confining units were consistent and reasonable because results from multiple aquifers and pumping events were analyzed simultaneously.

  14. Separation of left and right lungs using 3D information of sequential CT images and a guided dynamic programming algorithm

    PubMed Central

    Park, Sang Cheol; Leader, Joseph Ken; Tan, Jun; Lee, Guee Sang; Kim, Soo Hyung; Na, In Seop; Zheng, Bin

    2011-01-01

Objective: This article presents a new computerized scheme that aims to accurately and robustly separate the left and right lungs on CT examinations. Methods: We developed and tested a method that separates the left and right lungs, even in cases with especially severe and multiple connections, using sequential CT information and a guided dynamic programming algorithm with adaptively and automatically selected start and end points. Results: The scheme successfully identified and separated all 827 connections on the 4034 CT images in an independent testing dataset of CT examinations. The proposed scheme separated multiple connections regardless of their locations, and the guided dynamic programming algorithm reduced the computation time to approximately 4.6% of that of traditional dynamic programming while avoiding permeation of the separation boundary into normal lung tissue. Conclusions: The proposed method robustly and accurately disconnects all connections between the left and right lungs, and the guided dynamic programming algorithm removes redundant processing. PMID:21412104

  15. Prediction system of hydroponic plant growth and development using algorithm Fuzzy Mamdani method

    NASA Astrophysics Data System (ADS)

    Sudana, I. Made; Purnawirawan, Okta; Arief, Ulfa Mediaty

    2017-03-01

Hydroponics is a method of farming without soil. One hydroponic plant is watercress (Nasturtium officinale). The development and growth of hydroponic watercress are influenced by nutrient levels, acidity, and temperature. These independent variables can be used as system inputs to predict the level of plant growth and development. The prediction system uses the Mamdani fuzzy inference method and was built with the Fuzzy Inference System (FIS) functionality of the Fuzzy Logic Toolbox (FLT) in MATLAB R2007b. FIS is a computing system that works on the principle of fuzzy reasoning, which is similar to human reasoning. Basically, an FIS consists of four units: a fuzzification unit, a fuzzy logic reasoning unit, a knowledge base unit, and a defuzzification unit. In addition, the effect of the independent variables on plant growth and development can be visualized with the three-dimensional FIS output surface diagram, and statistical tests on data from the prediction system were carried out using the multiple linear regression method, including multiple linear regression analysis, t-tests, F-tests, the coefficient of determination, and predictor contributions, calculated with SPSS (Statistical Product and Service Solutions) software.
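A minimal Mamdani inference cycle can be sketched without the MATLAB toolbox: fuzzify the input, apply min implication and max aggregation, then defuzzify by centroid over a discretized output universe. The membership functions, rule base, and pH input below are hypothetical placeholders, not the paper's calibration.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def mamdani_predict(ph):
    """One-input Mamdani sketch: solution pH -> predicted growth score (0-100).
    Rules: pH 'good' -> growth 'high'; pH 'poor' -> growth 'low'.
    Min implication, max aggregation, centroid defuzzification."""
    # Hypothetical membership functions for watercress acidity tolerance
    good = tri(ph, 5.5, 6.5, 7.5)
    poor = max(tri(ph, 3.0, 5.0, 6.0), tri(ph, 7.0, 8.0, 10.0))
    num = den = 0.0
    for g in range(101):                       # discretized growth universe
        mu = max(min(good, tri(g, 50, 80, 100)),   # clipped 'high' output set
                 min(poor, tri(g, 0, 20, 50)))     # clipped 'low' output set
        num += g * mu
        den += mu
    return num / den if den else 0.0

print(round(mamdani_predict(6.5), 1))  # near-ideal pH -> high growth score
```

The centroid step is what makes the output a single crisp prediction, mirroring the defuzzification unit described in the abstract.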

  16. Experimental Design for Multi-drug Combination Studies Using Signaling Networks

    PubMed Central

    Huang, Hengzhen; Fang, Hong-Bin; Tan, Ming T.

    2017-01-01

Summary: Combinations of multiple drugs are an important approach to maximizing the chance of therapeutic success by inhibiting multiple pathways/targets. Analytic methods for studying drug combinations have received increasing attention because major advances in biomedical research have made available a large number of potential agents for testing. Preclinical experiments on multi-drug combinations play a key role in drug development (especially for cancer) because of the complex nature of the disease and the need to reduce development time and costs. Despite recent progress in statistical methods for assessing drug interaction, there is an acute lack of methods for designing experiments on multi-drug combinations. The number of combinations grows exponentially with the number of drugs and dose levels, quickly precluding exhaustive laboratory testing. Utilizing experimental dose-response data for single drugs and a few combinations, along with pathway/network information, to obtain an in silico estimate of the functional structure of the dose-response relationship, we propose in this paper an optimal design that allows exploration of the dose-effect surface with the smallest possible sample size. Simulation studies show that our proposed methods perform well. PMID:28960231

  17. Hydrologic testing of tight zones in southeastern New Mexico.

    USGS Publications Warehouse

    Dennehy, K.F.; Davis, P.A.

    1981-01-01

    Increased attention is being directed toward the investigation of tight zones in relation to the storage and disposal of hazardous wastes. Shut-in tests, slug tests, and pressure-slug tests are being used at the proposed Waste Isolation Pilot Plant site, New Mexico, to evaluate the fluid-transmitting properties of several zones above the proposed repository zone. All three testing methods were used in various combinations to obtain values for the hydraulic properties of the test zones. Multiple testing on the same zone produced similar results. -from Authors

  18. A Randomized Controlled Trial of Cognitive Behavioral Therapy (CBT) for Adjusting to Multiple Sclerosis (The saMS Trial): Does CBT Work and for Whom Does It Work?

    ERIC Educational Resources Information Center

    Moss-Morris, Rona; Dennison, Laura; Landau, Sabine; Yardley, Lucy; Silber, Eli; Chalder, Trudie

    2013-01-01

    Objective: The aims were (a) to test the effectiveness of a nurse-led cognitive behavioral therapy (CBT) program to assist adjustment in the early stages of multiple sclerosis (MS) and (b) to determine moderators of treatment including baseline distress, social support (SS), and treatment preference. Method: Ninety-four ambulatory people with MS…

  19. The potential for increased power from combining P-values testing the same hypothesis.

    PubMed

    Ganju, Jitendra; Julie Ma, Guoguang

    2017-02-01

    The conventional approach to hypothesis testing for formal inference is to prespecify a single test statistic thought to be optimal. However, we usually have more than one test statistic in mind for testing the null hypothesis of no treatment effect but we do not know which one is the most powerful. Rather than relying on a single p-value, combining p-values from prespecified multiple test statistics can be used for inference. Combining functions include Fisher's combination test and the minimum p-value. Using randomization-based tests, the increase in power can be remarkable when compared with a single test and Simes's method. The versatility of the method is that it also applies when the number of covariates exceeds the number of observations. The increase in power is large enough to prefer combined p-values over a single p-value. The limitation is that the method does not provide an unbiased estimator of the treatment effect and does not apply to situations when the model includes treatment by covariate interaction.
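As a concrete illustration of the combining functions named above, the sketch below implements Fisher's combination test and the Bonferroni-calibrated minimum p-value in plain Python. It assumes independent p-values; the paper's randomization-based calibration for correlated test statistics is not reproduced here. The closed-form chi-square survival function applies because the degrees of freedom, 2k, are always even.

```python
import math

def fisher_combination(pvals):
    """Fisher's combination test: X = -2 * sum(ln p) follows a chi-square
    distribution with 2k degrees of freedom under the global null,
    assuming the k p-values are independent."""
    k = len(pvals)
    x = -2.0 * sum(math.log(p) for p in pvals)
    # Chi-square survival function for even df = 2k has a closed form:
    # P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    half = x / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= half / i
        total += term
    return math.exp(-half) * total

def min_p_bonferroni(pvals):
    """Minimum-p combining function with a simple Bonferroni calibration."""
    return min(1.0, len(pvals) * min(pvals))

pvals = [0.04, 0.09, 0.15]          # hypothetical p-values from three tests
print(round(fisher_combination(pvals), 4))
print(min_p_bonferroni(pvals))
```

Here Fisher's method rewards consistent moderate evidence across the three statistics, while the minimum-p rule keys only on the single strongest result.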

  20. Learning-based meta-algorithm for MRI brain extraction.

    PubMed

    Shi, Feng; Wang, Li; Gilmore, John H; Lin, Weili; Shen, Dinggang

    2011-01-01

The multiple-segmentation-and-fusion method has been widely used for brain extraction, tissue segmentation, and region of interest (ROI) localization. However, such studies are hindered in practice by their computational complexity, mainly coming from the steps of template selection and template-to-subject nonlinear registration. In this study, we address these two issues and propose a novel learning-based meta-algorithm for MRI brain extraction. Specifically, we first use exemplars to represent the entire template library and assign the most similar exemplar to the test subject. Second, a meta-algorithm combining two existing brain extraction algorithms (BET and BSE) is proposed to conduct multiple extractions directly on the test subject. Effective parameter settings for the meta-algorithm are learned from the training data and propagated to the subject through exemplars. We further develop a level-set-based fusion method to combine multiple candidate extractions with a closed smooth surface to obtain the final result. Experimental results show that, with only a small portion of subjects used for training, the proposed method produces more accurate and robust brain extraction results, with a Jaccard index of 0.956 +/- 0.010 across 340 subjects under 6-fold cross-validation, compared to BET and BSE even at their best parameter combinations.
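The abstract reports extraction overlap as a Jaccard index; that metric on binary voxel masks is simply intersection over union. A minimal sketch with toy masks (not the paper's data):

```python
def jaccard_index(mask_a, mask_b):
    """Jaccard index |A ∩ B| / |A ∪ B| between two binary masks,
    given as equal-length iterables of 0/1 voxel values."""
    inter = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    union = sum(1 for a, b in zip(mask_a, mask_b) if a or b)
    return inter / union if union else 1.0

auto   = [1, 1, 1, 0, 0, 1]  # hypothetical automatic extraction result
manual = [1, 1, 0, 0, 1, 1]  # hypothetical manual ground truth
print(jaccard_index(auto, manual))
```

On the toy masks the intersection has 3 voxels and the union 5, giving 0.6; a perfect extraction would score 1.0, matching the scale of the 0.956 reported above.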

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kin, Tadahiro; Oshima, Masumi; Furutaka, Kazuyoshi

We developed a spectrometer for multiple prompt gamma-ray measurements to identify nuclear levels and thereby determine neutron capture cross sections. From a test of finding candidate ¹⁵N levels with the method under development, we found that the performance of the spectrometer is sufficient.

  2. Generating clock signals for a cycle accurate, cycle reproducible FPGA based hardware accelerator

    DOEpatents

    Asaad, Sameth W.; Kapur, Mohit

    2016-01-05

A method, system and computer program product are disclosed for generating clock signals for a cycle accurate FPGA based hardware accelerator used to simulate operations of a device-under-test (DUT). In one embodiment, the DUT includes multiple device clocks generating multiple device clock signals at multiple frequencies and at a defined frequency ratio; and the FPGA hardware accelerator includes multiple accelerator clocks generating multiple accelerator clock signals to operate the FPGA hardware accelerator to simulate the operations of the DUT. In one embodiment, operations of the DUT are mapped to the FPGA hardware accelerator, and the accelerator clock signals are generated at multiple frequencies and at the defined frequency ratio of the frequencies of the multiple device clocks, to maintain cycle accuracy between the DUT and the FPGA hardware accelerator. In an embodiment, the FPGA hardware accelerator may be used to control the frequencies of the multiple device clocks.

  3. Manufacturing Methods and Technology for Digital Fault Isolation of Hybrid Microelectronic Assemblies.

    DTIC Science & Technology

    1982-03-01

Aircraft Company, Ground Systems Group, Task 007, Fullerton, California 92634, Project No. R1023. …HMA feed mechanism, multiple-type test sockets or adapters, and a localized UUT vessel for functional tests at temperature. The engineering model … test excluding (deactivated) microprocessor. Models UUT and test adapter as a ROM. Independent latches or registers from interconnecting ports to…

  4. Comparison of Deep Learning With Multiple Machine Learning Methods and Metrics Using Diverse Drug Discovery Data Sets.

    PubMed

    Korotcov, Alexandru; Tkachenko, Valery; Russo, Daniel P; Ekins, Sean

    2017-12-04

Machine learning methods have been applied to many data sets in pharmaceutical research for several decades. The relative ease and availability of fingerprint-type molecular descriptors paired with Bayesian methods resulted in the widespread use of this approach for a diverse array of end points relevant to drug discovery. Deep learning is the latest machine learning algorithm attracting attention for many pharmaceutical applications, from docking to virtual screening. Deep learning is based on an artificial neural network with multiple hidden layers and has found considerable traction in many artificial intelligence applications. We have previously suggested the need for a comparison of different machine learning methods with deep learning across an array of varying data sets that is applicable to pharmaceutical research. End points relevant to pharmaceutical research include absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) properties, as well as activity against pathogens and drug discovery data sets. In this study, we used data sets for solubility, probe-likeness, hERG, KCNQ1, bubonic plague, Chagas, tuberculosis, and malaria to compare different machine learning methods using FCFP6 fingerprints. These data sets represent whole-cell screens, individual proteins, and physicochemical properties, as well as a data set with a complex end point. Our aim was to assess whether deep learning offered any improvement in testing when assessed using an array of metrics including AUC, F1 score, Cohen's kappa, Matthews correlation coefficient, and others. Based on ranked normalized scores for the metrics or data sets, Deep Neural Networks (DNN) ranked higher than SVM, which in turn ranked higher than all the other machine learning methods. Visualizing these properties for training and test sets using radar-type plots indicates when models are inferior or perhaps overtrained. These results also suggest the need to assess deep learning further using multiple metrics in much larger-scale comparisons, with prospective testing as well as assessment of fingerprints and DNN architectures beyond those used.
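The metrics compared above can all be computed directly from binary confusion-matrix counts. A stdlib-only sketch with hypothetical counts (not the study's results):

```python
import math

def binary_metrics(tp, fp, fn, tn):
    """F1 score, Matthews correlation coefficient, and Cohen's kappa
    from binary confusion-matrix counts."""
    n = tp + fp + fn + tn
    f1 = 2 * tp / (2 * tp + fp + fn)
    mcc_den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / mcc_den if mcc_den else 0.0
    po = (tp + tn) / n                                   # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    return f1, mcc, kappa

# Hypothetical classifier output: 40 TP, 10 FP, 5 FN, 45 TN
f1, mcc, kappa = binary_metrics(tp=40, fp=10, fn=5, tn=45)
print(round(f1, 3), round(mcc, 3), round(kappa, 3))
```

Because kappa and MCC both discount chance agreement, they are more informative than raw accuracy on the imbalanced screening data sets described in the abstract.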

  5. Reactions to the Implicit Association Test as an Educational Tool: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Hillard, Amy L.; Ryan, Carey S.; Gervais, Sarah J.

    2013-01-01

    We examined reactions to the Race Implicit Association Test (IAT), which has been widely used but rarely examined as an educational tool to raise awareness about racial bias. College students (N = 172) were assigned to read that the IAT reflected either personal beliefs or both personal and extrapersonal factors (single vs. multiple explanation…

  6. On the Optimality of Answer-Copying Indices: Theory and Practice

    ERIC Educational Resources Information Center

    Romero, Mauricio; Riascos, Álvaro; Jara, Diego

    2015-01-01

    Multiple-choice exams are frequently used as an efficient and objective method to assess learning, but they are more vulnerable to answer copying than tests based on open questions. Several statistical tests (known as indices in the literature) have been proposed to detect cheating; however, to the best of our knowledge, they all lack mathematical…

  7. A Comparison of Reliability and Precision of Subscore Reporting Methods for a State English Language Proficiency Assessment

    ERIC Educational Resources Information Center

    Longabach, Tanya; Peyton, Vicki

    2018-01-01

    K-12 English language proficiency tests that assess multiple content domains (e.g., listening, speaking, reading, writing) often have subsections based on these content domains; scores assigned to these subsections are commonly known as subscores. Testing programs face increasing customer demands for the reporting of subscores in addition to the…

  8. Accounting for multiple stressors in regional stream ecosystem analysis: A demonstration with riparian invasive plants

    EPA Science Inventory

    Background/Questions/Methods: Large cross-sectional data sets allow testing of hypotheses about how one part of an ecosystem relates to other parts. Tests such as these are of interest for many reasons, one of which is to gain insight into the role of stressors, such as land co...

  9. SPSS Syntax for Missing Value Imputation in Test and Questionnaire Data

    ERIC Educational Resources Information Center

    van Ginkel, Joost R.; van der Ark, L. Andries

    2005-01-01

    A well-known problem in the analysis of test and questionnaire data is that some item scores may be missing. Advanced methods for the imputation of missing data are available, such as multiple imputation under the multivariate normal model and imputation under the saturated logistic model (Schafer, 1997). Accompanying software was made available…

  10. A method for experimental modal separation

    NASA Technical Reports Server (NTRS)

    Hallauer, W. L., Jr.

    1977-01-01

A method is described for the numerical simulation of multiple-shaker modal survey testing using simulated experimental data to optimize the shaker force-amplitude distribution for the purpose of isolating individual modes of vibration. Inertia, damping, stiffness, and modal data are stored on magnetic disks, available by direct access to the interactive FORTRAN programs which perform all computations required by this relative force-amplitude distribution method.

  11. Factors influencing tests of auditory processing: a perspective on current issues and relevant concerns.

    PubMed

    Cacace, Anthony T; McFarland, Dennis J

    2013-01-01

    Tests of auditory perception, such as those used in the assessment of central auditory processing disorders ([C]APDs), represent a domain in audiological assessment where measurement of this theoretical construct is often confounded by nonauditory abilities due to methodological shortcomings. These confounds include the effects of cognitive variables such as memory and attention and suboptimal testing paradigms, including the use of verbal reproduction as a form of response selection. We argue that these factors need to be controlled more carefully and/or modified so that their impact on tests of auditory and visual perception is only minimal. To advocate for a stronger theoretical framework than currently exists and to suggest better methodological strategies to improve assessment of auditory processing disorders (APDs). Emphasis is placed on adaptive forced-choice psychophysical methods and the use of matched tasks in multiple sensory modalities to achieve these goals. Together, this approach has potential to improve the construct validity of the diagnosis, enhance and develop theory, and evolve into a preferred method of testing. Examination of methods commonly used in studies of APDs. Where possible, currently used methodology is compared to contemporary psychophysical methods that emphasize computer-controlled forced-choice paradigms. In many cases, the procedures used in studies of APD introduce confounding factors that could be minimized if computer-controlled forced-choice psychophysical methods were utilized. Ambiguities of interpretation, indeterminate diagnoses, and unwanted confounds can be avoided by minimizing memory and attentional demands on the input end and precluding the use of response-selection strategies that use complex motor processes on the output end. 
Advocated are the use of computer-controlled forced-choice psychophysical paradigms in combination with matched tasks in multiple sensory modalities to enhance the prospect of obtaining a valid diagnosis. American Academy of Audiology.

  12. Comparing the index-flood and multiple-regression methods using L-moments

    NASA Astrophysics Data System (ADS)

    Malekinezhad, H.; Nachtnebel, H. P.; Klik, A.

In arid and semi-arid regions, the length of records is usually too short to ensure reliable quantile estimates. Comparing index-flood and multiple-regression analyses based on L-moments was the main objective of this study. Factor analysis was applied to determine the main variables influencing flood magnitude. Ward's cluster and L-moments approaches were applied to several sites in the Namak-Lake basin in central Iran to delineate homogeneous regions based on site characteristics. The homogeneity test was done using L-moments-based measures. Several distributions were fitted to the regional flood data, and the index-flood and multiple-regression methods were compared as two regional flood frequency methods. The results of the factor analysis showed that the length of the main waterway, the compactness coefficient, mean annual precipitation, and mean annual temperature were the main variables affecting flood magnitude. The study area was divided into three regions based on Ward's clustering method. The L-moments-based homogeneity test showed that all three regions were acceptably homogeneous. Five distributions were fitted to the annual peak flood data of the three homogeneous regions. Using the L-moment ratios and the Z-statistic criteria, the GEV distribution was identified as the most robust of the five candidate distributions for all the proposed sub-regions of the study area; in general, the generalised extreme value distribution was the best-fit distribution for all three regions. The relative root mean square error (RRMSE) measure was applied to evaluate the performance of the index-flood and multiple-regression methods in comparison with the curve-fitting (plotting position) method. In general, the index-flood method gives more reliable estimates of flood magnitudes for different recurrence intervals. Therefore, this method should be adopted as the regional flood frequency method for the study area and the Namak-Lake basin in central Iran. To estimate floods of various return periods for gauged catchments in the study area, the mean annual peak flood of a catchment may be multiplied by the corresponding growth factors computed using the GEV distribution.
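The final step described above, scaling a site's mean annual flood by a GEV growth factor for each return period, can be sketched as follows. The growth-curve parameters and site mean below are hypothetical placeholders, not the fitted values from the study; the quantile formula uses Hosking's GEV parameterization.

```python
import math

def gev_quantile(xi, alpha, k, T):
    """GEV quantile for return period T (Hosking parameterization, k != 0):
    x_T = xi + (alpha / k) * (1 - (-ln(1 - 1/T))**k)"""
    return xi + (alpha / k) * (1.0 - (-math.log(1.0 - 1.0 / T)) ** k)

# Hypothetical regional growth-curve parameters (dimensionless, mean ~ 1)
xi, alpha, k = 0.84, 0.35, -0.10
mean_annual_flood = 120.0  # m^3/s, hypothetical site mean

for T in (10, 50, 100):
    growth = gev_quantile(xi, alpha, k, T)  # regional growth factor
    print(T, round(mean_annual_flood * growth, 1))  # index-flood estimate
```

The index-flood assumption is that all sites in a homogeneous region share this growth curve, so only the site mean needs to be estimated locally.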

  13. Efficient computation of significance levels for multiple associations in large studies of correlated data, including genomewide association studies.

    PubMed

    Dudbridge, Frank; Koeleman, Bobby P C

    2004-09-01

    Large exploratory studies, including candidate-gene-association testing, genomewide linkage-disequilibrium scans, and array-expression experiments, are becoming increasingly common. A serious problem for such studies is that statistical power is compromised by the need to control the false-positive rate for a large family of tests. Because multiple true associations are anticipated, methods have been proposed that combine evidence from the most significant tests, as a more powerful alternative to individually adjusted tests. The practical application of these methods is currently limited by a reliance on permutation testing to account for the correlated nature of single-nucleotide polymorphism (SNP)-association data. On a genomewide scale, this is both very time-consuming and impractical for repeated explorations with standard marker panels. Here, we alleviate these problems by fitting analytic distributions to the empirical distribution of combined evidence. We fit extreme-value distributions for fixed lengths of combined evidence and a beta distribution for the most significant length. An initial phase of permutation sampling is required to fit these distributions, but it can be completed more quickly than a simple permutation test and need be done only once for each panel of tests, after which the fitted parameters give a reusable calibration of the panel. Our approach is also a more efficient alternative to a standard permutation test. We demonstrate the accuracy of our approach and compare its efficiency with that of permutation tests on genomewide SNP data released by the International HapMap Consortium. The estimation of analytic distributions for combined evidence will allow these powerful methods to be applied more widely in large exploratory studies.
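The permutation step the authors set out to accelerate can be sketched in a Westfall-Young style: the adjusted p-value of the most significant test is the fraction of label-permutation replicates whose minimum p-value is at least as small as the observed minimum. The sketch below uses uniform random vectors as stand-ins for p-values from real phenotype permutations, so it ignores the SNP correlation structure the paper's fitted distributions capture.

```python
import random

def westfall_young_min_p(observed_pvals, perm_pvals):
    """Permutation calibration of the minimum p-value: the adjusted p is
    the fraction of replicates whose smallest p-value is <= the observed
    minimum. perm_pvals is a list of p-value vectors, one per replicate,
    computed after re-randomizing the phenotype labels."""
    obs_min = min(observed_pvals)
    hits = sum(1 for pv in perm_pvals if min(pv) <= obs_min)
    return (hits + 1) / (len(perm_pvals) + 1)   # add-one smoothing

random.seed(7)
observed = [0.003, 0.21, 0.47, 0.08]            # hypothetical SNP p-values
# Toy null replicates: independent uniforms stand in for real permutations
perms = [[random.random() for _ in range(4)] for _ in range(999)]
print(round(westfall_young_min_p(observed, perms), 3))
```

The abstract's point is precisely that this loop is expensive on a genomewide scale; fitting an analytic (e.g. beta) distribution to the replicate minima gives a reusable calibration instead.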

  14. Preparation, Purification, and Stability of Tuberculin

    PubMed Central

    Landi, S.

    1963-01-01

The method used to produce “Connaught” tuberculin purified protein derivative (PPD) is described. The tuberculin PPD for the multiple-puncture method was shown to be stable for at least 24 months at 5 C; tuberculin PPD for the intracutaneous method was shown to be stable at 5 C and 24 C for a period of 18 months in the presence of Tween 80. Evans blue or brilliant vital red was added to tuberculin PPD for improved testing by the multiple-puncture method. These tinted tuberculin preparations were found to be as stable as the Connaught tuberculin PPD preparations without dye at 5 C. Freeze-dried tuberculin PPD with Plasdone as an inert base was found to be remarkably stable for a period of at least 24 months at 5, 24, and 37 C. PMID:14063782

  15. Rapid determination of moisture content in paper materials by multiple headspace extraction gas chromatography.

    PubMed

    Xie, Wei-Qi; Chai, Xin-Sheng

    2016-04-22

This paper describes a new method for the rapid determination of the moisture content in paper materials. The method is based on multiple headspace extraction gas chromatography (MHE-GC) at a temperature above the boiling point of water, from which an integrated water loss from the tested sample due to evaporation can be measured and the moisture content in the sample determined. The results show that the new method has good precision (relative standard deviation < 0.96%), high sensitivity (limit of quantitation = 0.005%), and good accuracy (relative differences < 1.4%). The method is therefore well suited to many research and industrial applications. Copyright © 2016 Elsevier B.V. All rights reserved.
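In MHE-GC the peak areas from successive extractions of the same vial decay geometrically, so the total analyte signal, and hence, after calibration, the moisture content, follows from the ratio of the first two extractions via a geometric series. A minimal sketch with hypothetical peak areas (not the paper's calibration or data):

```python
def mhe_total_area(areas):
    """Total analyte signal from multiple headspace extraction.
    In MHE the peak area decays geometrically, A_i = A_1 * q**(i - 1),
    so the infinite-series sum is A_1 / (1 - q)."""
    a1, a2 = areas[0], areas[1]
    q = a2 / a1                    # extraction ratio from first two steps
    return a1 / (1.0 - q)

# Hypothetical GC peak areas from four successive headspace extractions
areas = [1000.0, 600.0, 360.0, 216.0]
print(round(mhe_total_area(areas), 1))
```

Converting this total area to a moisture percentage would require an external water calibration and the sample mass, which are specific to the instrument and method.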

  16. Identifying a maximum tolerated contour in two-dimensional dose-finding

    PubMed Central

    Wages, Nolan A.

    2016-01-01

The majority of Phase I methods for multi-agent trials have focused on identifying a single maximum tolerated dose combination (MTDC) among those being investigated. Some published methods in the area have been based on the notion that there is no unique MTDC, and that the set of dose combinations with acceptable toxicity forms an equivalence contour in two dimensions. Therefore, it may be of interest to find multiple MTDCs for further efficacy testing in a Phase II setting. In this paper, we present a new dose-finding method that extends the continual reassessment method to account for the location of multiple MTDCs. Operating characteristics are demonstrated through simulation studies and compared to existing methodology. Some brief discussion of implementation and available software is also provided. PMID:26910586

  17. Rare Variant Association Test with Multiple Phenotypes

    PubMed Central

    Lee, Selyeong; Won, Sungho; Kim, Young Jin; Kim, Yongkang; Kim, Bong-Jo; Park, Taesung

    2016-01-01

Although genome-wide association studies (GWAS) have now discovered thousands of genetic variants associated with common traits, such variants cannot explain the large degree of “missing heritability,” likely due to rare variants. The advent of next-generation sequencing technology has allowed rare variant detection and association with common traits, often by investigating specific genomic regions for rare variant effects on a trait. Although multiple correlated phenotypes are often concurrently observed in GWAS, most studies analyze only single phenotypes, which may lessen statistical power. To increase power, multivariate analyses, which consider correlations between multiple phenotypes, can be used. However, few existing multivariate analyses can identify rare variants while assessing multiple phenotypes. Here, we propose Multivariate Association Analysis using Score Statistics (MAAUSS) to identify rare variants associated with multiple phenotypes, based on the widely used Sequence Kernel Association Test (SKAT) for a single phenotype. We applied MAAUSS to whole exome sequencing (WES) data from a Korean population of 1,058 subjects to discover genes associated with multiple traits of liver function. We then validated those genes in a replication study, using an independent dataset of 3,445 individuals. Notably, we detected the gene ZNF620 among five significant genes. We then performed a simulation study to compare MAAUSS's performance with existing methods. Overall, MAAUSS successfully preserved type I error rates and, in many cases, had higher power than the existing methods. This study illustrates a feasible and straightforward approach for identifying rare variants correlated with multiple phenotypes, with likely relevance to missing heritability. PMID:28039885

  18. cit: hypothesis testing software for mediation analysis in genomic applications.

    PubMed

    Millstein, Joshua; Chen, Gary K; Breton, Carrie V

    2016-08-01

The challenges of successfully applying causal inference methods include: (i) satisfying underlying assumptions, (ii) limitations in the data/models accommodated by the software and (iii) the low power of common multiple testing approaches. The causal inference test (CIT) is based on hypothesis testing rather than estimation, allowing the testable assumptions to be evaluated in the determination of statistical significance. A user-friendly software package provides P-values and, optionally, permutation-based FDR estimates (q-values) for potential mediators. It can handle single and multiple binary and continuous instrumental variables, binary or continuous outcome variables, and adjustment covariates. The permutation-based FDR option also provides a non-parametric implementation. Simulation studies demonstrate the validity of the cit package and show a substantial advantage of permutation-based FDR over other common multiple testing strategies. The cit open-source R package is freely available from the CRAN website (https://cran.r-project.org/web/packages/cit/index.html), with embedded C++ code that utilizes the GNU Scientific Library, also freely available (http://www.gnu.org/software/gsl/). Contact: joshua.millstein@usc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  19. Student certainty answering misconception question: study of Three-Tier Multiple-Choice Diagnostic Test in Acid-Base and Solubility Equilibrium

    NASA Astrophysics Data System (ADS)

    Ardiansah; Masykuri, M.; Rahardjo, S. B.

    2018-04-01

Students’ concept comprehension in a three-tier multiple-choice diagnostic test is related to their confidence level, which reflects certainty and self-efficacy. The purpose of this research was to find out students’ certainty when answering misconception questions. This quantitative-qualitative study measured students’ confidence levels. The participants were 484 students studying the acid-base and solubility equilibrium subjects. Data were collected using a three-tier multiple-choice (3TMC) instrument with thirty questions and a student questionnaire. The findings showed that item #6, on calculating the pH of an ultra-dilute solution, gave the highest misconception percentage together with high student confidence. Other findings were that (1) students’ tendency to choose the misconception answer increased with item number, (2) students’ certainty decreased as they answered the 3TMC, and (3) student self-efficacy and achievement were related to each other. The findings suggest some implications and limitations for further research.

  20. Subject-independent emotion recognition based on physiological signals: a three-stage decision method.

    PubMed

    Chen, Jing; Hu, Bin; Wang, Yue; Moore, Philip; Dai, Yongqiang; Feng, Lei; Ding, Zhijie

    2017-12-20

Collaboration between humans and computers has become pervasive and ubiquitous; however, current computer systems are limited in that they fail to address the emotional component. An accurate understanding of human emotions is necessary for these computers to trigger proper feedback. Among multiple emotional channels, physiological signals are synchronous with emotional responses; therefore, analyzing physiological changes is a recognized way to estimate human emotions. In this paper, a three-stage decision method is proposed to recognize four emotions based on physiological signals in the multi-subject context. Emotion detection is achieved by using a stage-divided strategy in which each stage deals with a fine-grained goal. During the training process, the initial stage transforms mixed training subjects into separate groups, thus eliminating the effect of individual differences. The second stage categorizes the four emotions into two emotion pools in order to reduce recognition complexity. The third stage trains a classifier based on the emotions in each emotion pool. During the testing process, a test trial is initially classified into a group, then into an emotion pool in the second stage, and is assigned an emotion in the final stage. We consider two different ways of allocating the four emotions into two emotion pools, and a comparative analysis is carried out between the proposed method and other methods. An average recognition accuracy of 77.57% was achieved for the four emotions, with the best accuracy, 86.67%, for recognizing the positive and excited emotion. Using the two different ways of allocating the four emotions into two emotion pools, we found differences in how effectively a classifier learns each emotion. 
Compared to other methods, the proposed method demonstrates a significant improvement in recognizing four emotions in the multi-subject context. It addresses a crucial issue in multi-subject emotion recognition, individual differences, and overcomes the suboptimal performance of directly classifying multiple emotions. Our study supports the observation that the proposed method is a promising methodology for recognizing multiple emotions in the multi-subject context.

  1. Exact p-values for pairwise comparison of Friedman rank sums, with application to comparing classifiers.

    PubMed

    Eisinga, Rob; Heskes, Tom; Pelzer, Ben; Te Grotenhuis, Manfred

    2017-01-25

    The Friedman rank sum test is a widely-used nonparametric method in computational biology. In addition to examining the overall null hypothesis of no significant difference among any of the rank sums, it is typically of interest to conduct pairwise comparison tests. Current approaches to such tests rely on large-sample approximations, due to the numerical complexity of computing the exact distribution. These approximate methods lead to inaccurate estimates in the tail of the distribution, which is most relevant for p-value calculation. We propose an efficient, combinatorial exact approach for calculating the probability mass distribution of the rank sum difference statistic for pairwise comparison of Friedman rank sums, and compare exact results with recommended asymptotic approximations. Whereas the chi-squared approximation performs inferiorly to exact computation overall, others, particularly the normal, perform well, except in the extreme tail. Hence exact calculation offers an improvement when small p-values occur following multiple testing correction. Exact inference also enhances the identification of significant differences whenever the observed values are close to the approximate critical value. We illustrate the proposed method in the context of biological machine learning, where Friedman rank sum difference tests are commonly used for the comparison of classifiers over multiple datasets. We provide a computationally fast method to determine the exact p-value of the absolute rank sum difference of a pair of Friedman rank sums, making asymptotic tests obsolete. Calculation of exact p-values is easy to implement in statistical software and the implementation in R is provided in one of the Additional files and is also available at http://www.ru.nl/publish/pages/726696/friedmanrsd.zip.
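
    The asymptotic normal test that the exact method is compared against can be sketched in a few lines; the null variance nk(k+1)/6 for a rank-sum difference is the standard large-sample result, and the classifier scores below are simulated stand-ins:

```python
import numpy as np
from scipy.stats import rankdata, norm

def pairwise_friedman_z(data, i, j):
    """Normal-approximation test for the difference of Friedman rank
    sums of treatments i and j (the asymptotic test, not the exact one).

    data: (n_blocks, k_treatments) array; ranking is done within blocks.
    """
    n, k = data.shape
    ranks = np.apply_along_axis(rankdata, 1, data)   # rank within each block
    d = ranks.sum(axis=0)[i] - ranks.sum(axis=0)[j]  # rank-sum difference
    se = np.sqrt(n * k * (k + 1) / 6.0)              # null SE of the difference
    return d, 2 * norm.sf(abs(d) / se)               # two-sided p-value

rng = np.random.default_rng(0)
scores = rng.normal(size=(10, 4))   # 10 datasets x 4 classifiers, no true difference
d, p = pairwise_friedman_z(scores, 0, 1)

shifted = scores.copy()
shifted[:, 0] += 10.0               # classifier 0 now wins on every dataset
d2, p2 = pairwise_friedman_z(shifted, 0, 1)
```

    As the abstract notes, this approximation degrades in the extreme tail, which is exactly where multiple-testing-corrected thresholds fall; the authors' exact computation (linked R code) is preferable there.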

  2. Multiple sclerosis - etiology and diagnostic potential.

    PubMed

    Kamińska, Joanna; Koper, Olga M; Piechal, Kinga; Kemona, Halina

    2017-06-30

    Multiple sclerosis (MS) is a chronic inflammatory and demyelinating disease of autoimmune origin. The main agents responsible for MS development include exogenous, environmental, and genetic factors. MS is characterized by multifocal and temporally scattered central nervous system (CNS) damage, which leads to axonal damage. Among the clinical courses of MS, relapsing-remitting multiple sclerosis (RRMS), secondary progressive multiple sclerosis (SPMS), primary progressive multiple sclerosis (PPMS), and progressive-relapsing multiple sclerosis (PRMS) can be distinguished. Depending on the severity of signs and symptoms, MS can be described as benign or malignant. MS diagnosis is based on the McDonald diagnostic criteria, which link clinical manifestation with characteristic lesions demonstrated by magnetic resonance imaging (MRI), cerebrospinal fluid (CSF) analysis, and visual evoked potentials. CSF laboratory tests used in MS diagnosis include the Tibbling-Link IgG index, Reibergrams, and CSF isoelectric focusing for oligoclonal band detection. It should be emphasized that, despite huge progress regarding MS and the availability of different diagnostic methods, this disease remains a diagnostic challenge. This may result from the fact that MS has a diverse clinical course and there is no single test with appropriate diagnostic sensitivity and specificity for quick and accurate diagnosis.

  3. Effectiveness of team-based learning methodology in teaching transfusion medicine to medical undergraduates in third semester: A comparative study.

    PubMed

    Doshi, Neena Piyush

    2017-01-01

    Team-based learning (TBL) combines small and large group learning by incorporating multiple small groups in a large group setting. It is a teacher-directed method that encourages student-student interaction. This study compares student learning and teaching satisfaction between conventional lecture and TBL in the subject of pathology. The present study aimed to assess the effectiveness of the TBL method of teaching over the conventional lecture. The study was conducted in the Department of Pathology, GMERS Medical College and General Hospital, Gotri, Vadodara, Gujarat. The study population comprised 126 students of second-year MBBS, in their third semester of the academic year 2015-2016. "Hemodynamic disorders" was taught by the conventional method and "transfusion medicine" by the TBL method. The effectiveness of both methods was assessed. A multiple-choice posttest was conducted at the end of "hemodynamic disorders." Assessment of TBL was based on individual score, team score, and each member's contribution to the success of the team. The individual score and overall score were compared with the posttest score on "hemodynamic disorders." Feedback was taken from the students regarding their experience with TBL. Tukey's multiple comparisons test and an ANOVA summary were used to assess the significance of score differences between the didactic and TBL methods. Student feedback was collected using a "Student Satisfaction Scale" based on the Likert scoring method. The mean of student scores by didactic, Individual Readiness Assurance Test (score "A"), and overall (score "D") was 49.8% (standard deviation [SD]-14.8), 65.6% (SD-10.9), and 65.6% (SD-13.8), respectively. The study showed positive educational outcomes in terms of knowledge acquisition, participation and engagement, and team performance with TBL.
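
    The comparison machinery the study describes can be illustrated with scores simulated from the reported means and SDs (the per-group sample sizes and seed are invented); the study used Tukey's HSD, so the Bonferroni adjustment below is a simpler, more conservative stand-in:

```python
import numpy as np
from scipy.stats import f_oneway, ttest_ind

rng = np.random.default_rng(1)
# Hypothetical percent scores for the three assessments in the study design:
# didactic posttest, TBL individual test (score "A"), TBL overall (score "D").
didactic = rng.normal(49.8, 14.8, 60)
tbl_indiv = rng.normal(65.6, 10.9, 60)
tbl_overall = rng.normal(65.6, 13.8, 60)

# Omnibus one-way ANOVA across the three score sets
F, p_anova = f_oneway(didactic, tbl_indiv, tbl_overall)

# Pairwise comparisons with a Bonferroni correction for 3 comparisons
# (Tukey's HSD, as used in the study, would be less conservative).
pairs = [(didactic, tbl_indiv), (didactic, tbl_overall), (tbl_indiv, tbl_overall)]
p_adj = [min(1.0, 3 * ttest_ind(a, b).pvalue) for a, b in pairs]
```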

  4. GeneNetFinder2: Improved Inference of Dynamic Gene Regulatory Relations with Multiple Regulators.

    PubMed

    Han, Kyungsook; Lee, Jeonghoon

    2016-01-01

    A gene involved in complex regulatory interactions may have multiple regulators since gene expression in such interactions is often controlled by more than one gene. Another thing that makes gene regulatory interactions complicated is that regulatory interactions are not static, but change over time during the cell cycle. Most research so far has focused on identifying gene regulatory relations between individual genes in a particular stage of the cell cycle. In this study we developed a method for identifying dynamic gene regulations of several types from the time-series gene expression data. The method can find gene regulations with multiple regulators that work in combination or individually as well as those with single regulators. The method has been implemented as the second version of GeneNetFinder (hereafter called GeneNetFinder2) and tested on several gene expression datasets. Experimental results with gene expression data revealed the existence of genes that are not regulated by individual genes but rather by a combination of several genes. Such gene regulatory relations cannot be found by conventional methods. Our method finds such regulatory relations as well as those with multiple, independent regulators or single regulators, and represents gene regulatory relations as a dynamic network in which different gene regulatory relations are shown in different stages of the cell cycle. GeneNetFinder2 is available at http://bclab.inha.ac.kr/GeneNetFinder and will be useful for modeling dynamic gene regulations with multiple regulators.

  5. A Critical Analysis of the Body of Work Method for Setting Cut-Scores

    ERIC Educational Resources Information Center

    Radwan, Nizam; Rogers, W. Todd

    2006-01-01

    The recent increase in the use of constructed-response items in educational assessment and the dissatisfaction with the nature of the decision that the judges must make using traditional standard-setting methods created a need to develop new and effective standard-setting procedures for tests that include both multiple-choice and…

  6. Initial Experiences with Machine-Assisted Reconsiderative Test Scoring: A New Method for Partial Credit and Multiple Correct Responses.

    ERIC Educational Resources Information Center

    Anderson, Paul S.

    Initial experiences with computer-assisted reconsiderative scoring are described. Reconsiderative scoring occurs when student responses are received and reviewed by the teacher before points for correctness are assigned. Manually scored completion-style questions are reconsiderative. A new method of machine assistance produces an item analysis on…

  7. Synthesizing Experiences in Arts Methods Courses: Creating Artists' Maps in Preservice Elementary Teacher Education

    ERIC Educational Resources Information Center

    Huxhold, Dianna; Willcox, Libba

    2014-01-01

    Each semester, preservice elementary generalist teachers navigate to and through the multiple sections of our art methods courses. These elementary education majors bring concerns relating to dominant education discourse such as high stakes testing and accountability measures that relate to how they will be evaluated as future teachers. Often,…

  8. Endovascular Skills for Trauma and Resuscitative Surgery (ESTARS) Course: Curriculum Development, Content Validation, and Program Assessment

    DTIC Science & Technology

    2014-01-01

    fundamental endovascular training for trauma surgeons. METHODS: ESTARS 2-day course incorporated pretest/posttest examinations, precourse materials...and 17 multiple true/false items. The purpose of the test was primarily formative; the same items were used for pretesting and posttesting, and the... pretest served as a learning tool focusing learners on the content of importance. Mean scores were computed, treating each item as one point (multiple

  9. Multiple exposures of sevoflurane during pregnancy induces memory impairment in young female offspring mice

    PubMed Central

    Chung, Woosuk; Yoon, Seunghwan

    2017-01-01

    Background Earlier studies have reported conflicting results regarding long-term behavioral consequences of anesthesia during the fetal period. Previous studies also suggest several factors that may explain such conflicting data. Thus, we examined the influence of age and sex on long-term behavioral consequences after multiple sevoflurane exposures during the fetal period. Methods C57BL/6J pregnant mice received oxygen with or without sevoflurane for 2 hours at gestational day (GD) 14-16. Offspring mice were subjected to behavioral assays for general activity (open field test) and learning and memory (fear chamber test) at postnatal day 30-35. Results Multiple sevoflurane exposures at GD 14-16 caused significant changes during the fear chamber test in young female offspring mice. Such changes did not occur in young male offspring mice. However, general activity was not affected in either male or female mice. Conclusions Multiple sevoflurane exposures in the second trimester of pregnancy affect learning and memory only in young female mice. Further studies focusing on diverse cognitive functions in an age- and sex-dependent manner may provide valuable insights regarding anesthesia-induced neurotoxicity. PMID:29225748

  10. Two self-test methods applied to an inertial system problem. [estimating gyroscope and accelerometer bias

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.; Deyst, J. J.; Crawford, B. S.

    1975-01-01

    The paper describes two self-test procedures applied to the problem of estimating the biases in accelerometers and gyroscopes on an inertial platform. The first technique is the weighted sum-squared residual (WSSR) test, with which accelerometer bias jumps are easily isolated, but gyro bias jumps are difficult to isolate. The WSSR method does not take full advantage of the knowledge of system dynamics. The other technique is a multiple hypothesis method developed by Buxbaum and Haddad (BH) (1969). It has the advantage of directly providing jump isolation information, but suffers from computational problems. It might be possible to use the WSSR to detect state jumps and then switch to the BH method for jump isolation and estimate compensation.
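
    A minimal sketch of a WSSR-style detector, assuming white Gaussian innovations with a known constant covariance (a real filter application would use the time-varying innovation covariance); the simulated bias jump and all parameters are invented:

```python
import numpy as np
from scipy.stats import chi2

def wssr_detect(residuals, S, window=10, alpha=0.01):
    """Weighted sum-squared residual (WSSR) jump detector (sketch).

    residuals: (T, m) filter innovations; S: (m, m) innovation covariance,
    assumed constant here. Flags times where the windowed WSSR exceeds its
    chi-squared threshold.
    """
    Sinv = np.linalg.inv(S)
    w = np.einsum('ti,ij,tj->t', residuals, Sinv, residuals)  # per-step r' S^-1 r
    wssr = np.convolve(w, np.ones(window), mode='valid')      # sliding-window sum
    thresh = chi2.ppf(1 - alpha, df=window * residuals.shape[1])
    return wssr, wssr > thresh

rng = np.random.default_rng(2)
m, T = 2, 200
r = rng.normal(size=(T, m))   # nominal innovations
r[120:] += 3.0                # simulated bias jump at t = 120
wssr, alarms = wssr_detect(r, np.eye(m))
```

    As the abstract notes, a detector of this kind localizes a jump in time but not its source; isolating which instrument jumped needs the multiple-hypothesis machinery.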

  11. Testing primary-school children's understanding of the nature of science.

    PubMed

    Koerber, Susanne; Osterhaus, Christopher; Sodian, Beate

    2015-03-01

    Understanding the nature of science (NOS) is a critical aspect of scientific reasoning, yet few studies have investigated its developmental beginnings and initial structure. One contributing reason is the lack of an adequate instrument. Two studies assessed NOS understanding among third graders using a multiple-select (MS) paper-and-pencil test. Study 1 investigated the validity of the MS test by presenting the items to 68 third graders (9-year-olds) and subsequently interviewing them on their underlying NOS conception of the items. All items were significantly related between formats, indicating that the test was valid. Study 2 applied the same instrument to a larger sample of 243 third graders, and their performance was compared to a multiple-choice (MC) version of the test. Although the MC format inflated the guessing probability, there was a significant relation between the two formats. In summary, the MS format was a valid method revealing third graders' NOS understanding, thereby representing an economical test instrument. A latent class analysis identified three groups of children with expertise in qualitatively different aspects of NOS, suggesting that there is not a single common starting point for the development of NOS understanding; instead, multiple developmental pathways may exist. © 2014 The British Psychological Society.
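
    The inflated guessing probability under the MC format is easy to quantify. For an item with four options (a typical count, not taken from the studies), blind guessing compares as follows:

```python
from fractions import Fraction

# Chance of answering one item correctly by blind guessing:
# multiple-choice (pick 1 of k options) vs multiple-select
# (independently accept/reject each of k options; all k decisions
# must be right).
k = 4
p_mc = Fraction(1, k)        # 1/4 under multiple-choice
p_ms = Fraction(1, 2) ** k   # 1/16 under multiple-select
```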

  12. Automatically generated acceptance test: A software reliability experiment

    NASA Technical Reports Server (NTRS)

    Protzel, Peter W.

    1988-01-01

    This study presents results of a software reliability experiment investigating the feasibility of a new error detection method. The method can be used as an acceptance test and is solely based on empirical data about the behavior of internal states of a program. The experimental design uses the existing environment of a multi-version experiment previously conducted at the NASA Langley Research Center, in which the launch interceptor problem is used as a model. This allows the controlled experimental investigation of versions with well-known single and multiple faults, and the availability of an oracle permits the determination of the error detection performance of the test. Fault interaction phenomena are observed that have an amplifying effect on the number of error occurrences. Preliminary results indicate that all faults examined so far are detected by the acceptance test. This shows promise for further investigations, and for the employment of this test method on other applications.

  13. Assessing secondary science students' knowledge of molecule movement, concentration gradients, and equilibrium through multiple contexts

    NASA Astrophysics Data System (ADS)

    Raven, Sara

    2015-09-01

    Background: Studies have shown that students' knowledge of osmosis and diffusion and the concepts associated with these processes is often inaccurate. This is important to address, as these concepts not only provide the foundation for more advanced topics in biology and chemistry, but are also threaded throughout both state and national science standards. Purpose: In this study, designed to determine the completeness and accuracy of three specific students' knowledge of molecule movement, concentration gradients, and equilibrium, I sought to address the following question: Using multiple evaluative methods, how can students' knowledge of molecule movement, concentration gradients, and equilibrium be characterized? Sample: This study focuses on data gathered from three students - Emma, Henry, and Riley - all of whom were gifted/honors ninth-grade biology students at a suburban high school in the southeast United States. Design and Methods: Using various qualitative data analysis techniques, I analyzed multiple sources of data from the three students, including multiple-choice test results, written free-response answers, think-aloud interview responses, and student drawings. Results: Results of the analysis showed that students maintained misconceptions about molecule movement, concentration gradients, and equilibrium. The conceptual knowledge students demonstrated differed depending on the assessment method, with the most distinct differences appearing on the multiple-choice versus the free-response questions, and in verbal versus written formats. Conclusions: Multiple levels of assessment may be required to obtain an accurate picture of content knowledge, as free-response and illustrative tasks made it difficult for students to conceal any misconceptions. Using a variety of assessment methods within a section of the curriculum can arguably help to provide a deeper understanding of student knowledge and learning, as well as illuminate misconceptions that may have remained unknown if only one assessment method had been used. Furthermore, beyond simply evaluating past learning, multiple assessment methods may aid in student comprehension of key concepts.

  14. Multiple regression analysis in nomogram development for myopic wavefront laser in situ keratomileusis: Improving astigmatic outcomes.

    PubMed

    Allan, Bruce D; Hassan, Hala; Ieong, Alvin

    2015-05-01

    To describe and evaluate a new multiple regression-derived nomogram for myopic wavefront laser in situ keratomileusis (LASIK). Moorfields Eye Hospital, London, United Kingdom. Prospective comparative case series. Multiple regression modeling was used to derive a simplified formula for adjusting attempted spherical correction in myopic LASIK. An adaptation of Thibos' power vector method was then applied to derive adjustments to attempted cylindrical correction in eyes with 1.0 diopter (D) or more of preoperative cylinder. These elements were combined in a new nomogram (nomogram II). The 3-month refractive results for myopic wavefront LASIK (spherical equivalent ≤11.0 D; cylinder ≤4.5 D) were compared between 299 consecutive eyes treated using the earlier nomogram (nomogram I) in 2009 and 2010 and 414 eyes treated using nomogram II in 2011 and 2012. There was no significant difference in treatment accuracy (variance in the postoperative manifest refraction spherical equivalent error) between nomogram I and nomogram II (P = .73, Bartlett test). Fewer patients treated with nomogram II had more than 0.5 D of residual postoperative astigmatism (P = .0001, Fisher exact test). There was no significant coupling between adjustments to the attempted cylinder and the achieved sphere (P = .18, t test). Discarding marginal influences from a multiple regression-derived nomogram for myopic wavefront LASIK had no clinically significant effect on treatment accuracy. Thibos' power vector method can be used to guide adjustments to the treatment cylinder alongside nomograms designed to optimize postoperative spherical equivalent results in myopic LASIK. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  15. Grains of connectivity: analysis at multiple spatial scales in landscape genetics.

    PubMed

    Galpern, Paul; Manseau, Micheline; Wilson, Paul

    2012-08-01

    Landscape genetic analyses are typically conducted at one spatial scale. Considering multiple scales may be essential for identifying landscape features influencing gene flow. We examined landscape connectivity for woodland caribou (Rangifer tarandus caribou) at multiple spatial scales using a new approach based on landscape graphs that creates a Voronoi tessellation of the landscape. To illustrate the potential of the method, we generated five resistance surfaces to explain how landscape pattern may influence gene flow across the range of this population. We tested each resistance surface using a raster at the spatial grain of available landscape data (200 m grid squares). We then used our method to produce up to 127 additional grains for each resistance surface. We applied a causal modelling framework with partial Mantel tests, where evidence of landscape resistance is tested against an alternative hypothesis of isolation-by-distance, and found statistically significant support for landscape resistance to gene flow in 89 of the 507 spatial grains examined. We found evidence that major roads as well as the cumulative effects of natural and anthropogenic disturbance may be contributing to the genetic structure. Using only the original grid surface yielded no evidence for landscape resistance to gene flow. Our results show that using multiple spatial grains can reveal landscape influences on genetic structure that may be overlooked with a single grain, and suggest that coarsening the grain of landcover data may be appropriate for highly mobile species. We discuss how grains of connectivity and related analyses have potential landscape genetic applications in a broad range of systems. © 2012 Blackwell Publishing Ltd.
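
    The partial Mantel tests used in the causal-modelling framework build on the plain Mantel test, which is short enough to sketch; the toy coordinates and noise level here are invented, and a real analysis would partial out geographic distance rather than test it directly:

```python
import numpy as np

def mantel(A, B, n_perm=999, seed=0):
    """Simple Mantel test (sketch): correlation between two distance
    matrices with a permutation p-value. The study used *partial*
    Mantel tests; this plain version conveys the idea.
    """
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(A, k=1)          # upper-triangle pairs only
    a, b = A[iu], B[iu]
    r_obs = np.corrcoef(a, b)[0, 1]
    count = 0
    n = A.shape[0]
    for _ in range(n_perm):
        perm = rng.permutation(n)              # permute rows/cols of A jointly
        r = np.corrcoef(A[perm][:, perm][iu], b)[0, 1]
        count += r >= r_obs
    return r_obs, (count + 1) / (n_perm + 1)

# Toy example: "genetic" distance partly reflects geographic distance
rng = np.random.default_rng(3)
pts = rng.uniform(size=(15, 2))
geo = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
gen = geo + rng.normal(0, 0.1, geo.shape)
gen = (gen + gen.T) / 2
np.fill_diagonal(gen, 0)
r_obs, p = mantel(geo, gen)
```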

  16. Renal cortex segmentation using optimal surface search with novel graph construction.

    PubMed

    Li, Xiuli; Chen, Xinjian; Yao, Jianhua; Zhang, Xing; Tian, Jie

    2011-01-01

    In this paper, we propose a novel approach to solve the renal cortex segmentation problem, which has rarely been studied. In this study, the renal cortex segmentation problem is handled as a multiple-surfaces extraction problem, which is solved using the optimal surface search method. We propose a novel graph construction scheme in the optimal surface search to better accommodate multiple surfaces. Different surface sub-graphs are constructed according to their properties, and inter-surface relationships are also modeled in the graph. The proposed method was tested on 17 clinical CT datasets. The true positive volume fraction (TPVF) and false positive volume fraction (FPVF) are 74.10% and 0.08%, respectively. The experimental results demonstrate the effectiveness of the proposed method.

  17. New methods of testing nonlinear hypothesis using iterative NLLS estimator

    NASA Astrophysics Data System (ADS)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

    This research paper discusses the method of testing nonlinear hypotheses using an iterative Nonlinear Least Squares (NLLS) estimator. Takeshi Amemiya [1] explained this method. However, in the present research paper, a modified Wald test statistic due to Engle, Robert [6] is proposed to test nonlinear hypotheses using the iterative NLLS estimator. An alternative method for testing nonlinear hypotheses using an iterative NLLS estimator based on nonlinear studentized residuals has also been proposed. In this research article an innovative method of testing nonlinear hypotheses using an iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained the methods of testing nonlinear hypotheses. This paper uses the asymptotic properties of the nonlinear least squares estimator proposed by Jenrich [8]. The main purpose of this paper is to provide innovative methods of testing nonlinear hypotheses using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors, and also studied the problem of heteroscedasticity with reference to nonlinear regression models with suitable illustration. William Grene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine the effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.
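
    The paper's modified Wald statistic is not reproduced here, but the generic delta-method Wald test of a nonlinear restriction after an NLLS fit follows the same pattern; the model y = a*exp(b*x), the restriction a*b = 1, and the data are all illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import chi2

# Fit y = a * exp(b * x) by NLLS, then test the nonlinear restriction
# H0: a * b = 1 with a Wald statistic via the delta method.
def model(x, a, b):
    return a * np.exp(b * x)

rng = np.random.default_rng(4)
x = np.linspace(0, 2, 80)
y = model(x, 2.0, 0.5) + rng.normal(0, 0.05, x.size)  # true a*b = 1

theta, cov = curve_fit(model, x, y, p0=[1.0, 1.0])
a_hat, b_hat = theta
g = a_hat * b_hat - 1.0            # restriction value g(theta)
grad = np.array([b_hat, a_hat])    # gradient dg/d(a, b)
var_g = grad @ cov @ grad          # delta-method variance of g
wald = g**2 / var_g                # ~ chi-square(1) under H0
p_value = chi2.sf(wald, df=1)
```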

  18. Power calculation for comparing diagnostic accuracies in a multi-reader, multi-test design.

    PubMed

    Kim, Eunhee; Zhang, Zheng; Wang, Youdan; Zeng, Donglin

    2014-12-01

    Receiver operating characteristic (ROC) analysis is widely used to evaluate the performance of diagnostic tests with continuous or ordinal responses. A popular study design for assessing the accuracy of diagnostic tests involves multiple readers interpreting multiple diagnostic test results, called the multi-reader, multi-test design. Although several different approaches to analyzing data from this design exist, few methods have discussed the sample size and power issues. In this article, we develop a power formula to compare the correlated areas under the ROC curves (AUC) in a multi-reader, multi-test design. We present a nonparametric approach to estimate and compare the correlated AUCs by extending DeLong et al.'s (1988, Biometrics 44, 837-845) approach. A power formula is derived based on the asymptotic distribution of the nonparametric AUCs. Simulation studies are conducted to demonstrate the performance of the proposed power formula and an example is provided to illustrate the proposed procedure. © 2014, The International Biometric Society.
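
    The paper's formula accounts for the full multi-reader correlation structure; a stripped-down version for a single reader comparing two correlated AUCs shows the basic shape (the AUC values, variances, and correlation below are invented):

```python
import numpy as np
from scipy.stats import norm

def power_auc_difference(auc1, auc2, var1, var2, corr, alpha=0.05):
    """Approximate power for a two-sided test of AUC1 = AUC2 (sketch).

    var1, var2: variances of the two AUC estimates (e.g., from DeLong's
    nonparametric method); corr: correlation between them. Tests read on
    the same cases are positively correlated, which increases power.
    """
    delta = abs(auc1 - auc2)
    var_d = var1 + var2 - 2 * corr * np.sqrt(var1 * var2)
    z_alpha = norm.ppf(1 - alpha / 2)
    z = delta / np.sqrt(var_d)
    return norm.cdf(z - z_alpha) + norm.cdf(-z - z_alpha)

pw_corr = power_auc_difference(0.80, 0.85, 0.001, 0.001, corr=0.5)
pw_indep = power_auc_difference(0.80, 0.85, 0.001, 0.001, corr=0.0)
```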

  19. Registration and Fusion of Multiple Source Remotely Sensed Image Data

    NASA Technical Reports Server (NTRS)

    LeMoigne, Jacqueline

    2004-01-01

    Earth and Space Science often involve the comparison, fusion, and integration of multiple types of remotely sensed data at various temporal, radiometric, and spatial resolutions. Results of this integration may be utilized for global change analysis, global coverage of an area at multiple resolutions, map updating or validation of new instruments, as well as integration of data provided by multiple instruments carried on multiple platforms, e.g. in spacecraft constellations or fleets of planetary rovers. Our focus is on developing methods to perform fast, accurate and automatic image registration and fusion. General methods for automatic image registration are being reviewed and evaluated. Various choices for feature extraction, feature matching and similarity measurements are being compared, including wavelet-based algorithms, mutual information and statistically robust techniques. Our work also involves studies related to image fusion and investigates dimension reduction and co-kriging for application-dependent fusion. All methods are being tested using several multi-sensor datasets, acquired at EOS Core Sites, and including multiple sensors such as IKONOS, Landsat-7/ETM+, EO1/ALI and Hyperion, MODIS, and SeaWIFS instruments. Issues related to the coregistration of data from the same platform (i.e., AIRS and MODIS from Aqua) or from several platforms of the A-train (i.e., MLS, HIRDLS, OMI from Aura with AIRS and MODIS from Terra and Aqua) will also be considered.

  20. Born approximation, multiple scattering, and butterfly algorithm

    NASA Astrophysics Data System (ADS)

    Martinez, Alex; Qiao, Zhijun

    2014-06-01

    Many imaging algorithms have been designed assuming the absence of multiple scattering. In the 2013 SPIE proceedings, we discussed an algorithm for removing high-order scattering components from collected data. In this paper, our goal is to continue this work. First, we survey the current state of multiple scattering in SAR. Then, we revise our method and test it. Given an estimate of our target reflectivity, we compute the multiple scattering effects in our target region for various frequencies. Furthermore, we propagate this energy through free space towards our antenna and remove it from the collected data.

  1. Increasing Efficiency of Fecal Coliform Testing Through EPA-Approved Alternate Method Colilert*-18

    NASA Technical Reports Server (NTRS)

    Cornwell, Brian

    2017-01-01

    The 21 SM 9221 E multiple-tube fermentation method for fecal coliform analysis requires a large time and reagent investment for the performing laboratory. In late 2010, the EPA approved an alternative procedure for the determination of fecal coliforms designated as Colilert*-18. However, as of late 2016, only two VELAP-certified laboratories in the Commonwealth of Virginia have been certified in this method.

  2. On the Use of the Immediate Recall Task as a Measure of Second Language Reading Comprehension

    ERIC Educational Resources Information Center

    Chang, Yuh-Fang

    2006-01-01

    The immediate written recall task, a widely used measure of both first language (L1) and second language (L2) reading comprehension, has been advocated over traditional test methods such as multiple choice, cloze tests and open-ended questions because it is a direct and integrative assessment task. It has been, however, criticized as requiring…

  3. A Normalized Direct Approach for Estimating the Parameters of the Normal Ogive Three-Parameter Model for Ability Tests.

    ERIC Educational Resources Information Center

    Gugel, John F.

    A new method for estimating the parameters of the normal ogive three-parameter model for multiple-choice test items--the normalized direct (NDIR) procedure--is examined. The procedure is compared to a more commonly used estimation procedure, Lord's LOGIST, using computer simulations. The NDIR procedure uses the normalized (mid-percentile)…

  4. A Comparison of Methods to Screen Middle School Students for Reading and Math Difficulties

    ERIC Educational Resources Information Center

    Nelson, Peter M.; Van Norman, Ethan R.; Lackner, Stacey K.

    2016-01-01

    The current study explored multiple ways in which middle schools can use and integrate data sources to predict proficiency on future high-stakes state achievement tests. The diagnostic accuracy of (a) prior achievement data, (b) teacher rating scale scores, (c) a composite score combining state test scores and rating scale responses, and (d) two…

  5. Statistics in biomedical laboratory and clinical science: applications, issues and pitfalls.

    PubMed

    Ludbrook, John

    2008-01-01

    This review is directed at biomedical scientists who want to gain a better understanding of statistics: what tests to use, when, and why. In my view, even during the planning stage of a study it is very important to seek the advice of a qualified biostatistician. When designing and analyzing a study, it is important to construct and test global hypotheses, rather than to make multiple tests on the data. If the latter cannot be avoided, it is essential to control the risk of making false-positive inferences by applying multiple comparison procedures. For comparing two means or two proportions, it is best to use exact permutation tests rather than the better-known classical ones. For comparing many means, analysis of variance, often of a complex type, is the most powerful approach. The correlation coefficient should never be used to compare the performances of two methods of measurement, or two measures, because it does not detect bias. Instead, the Altman-Bland method of differences or least-products linear regression analysis should be preferred. Finally, the educational value to investigators of interaction with a biostatistician, before, during and after a study, cannot be overemphasized. (c) 2007 S. Karger AG, Basel.
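
    An exact permutation test of the kind recommended here is straightforward for small samples; the measurements below are made up:

```python
from itertools import combinations

def exact_permutation_test(x, y):
    """Exact two-sided permutation test for a difference in means.

    Enumerates every re-assignment of the pooled observations to two
    groups of the original sizes, so the p-value is exact (feasible
    only for small samples).
    """
    pooled = x + y
    n, N = len(x), len(pooled)
    obs = abs(sum(x) / len(x) - sum(y) / len(y))
    total = sum(pooled)
    count = n_perms = 0
    for idx in combinations(range(N), n):
        sx = sum(pooled[i] for i in idx)
        diff = abs(sx / n - (total - sx) / (N - n))
        count += diff >= obs - 1e-12   # tolerance for float ties
        n_perms += 1
    return count / n_perms

p = exact_permutation_test([12.1, 13.4, 11.8, 14.2], [15.9, 16.3, 15.1, 17.0])
```

    With 4 observations per group there are C(8,4) = 70 re-assignments, and only the observed split and its mirror reach the observed mean difference, so p = 2/70.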

  6. Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.

    PubMed

    Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao

    2015-08-01

    Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., there is heterogeneity. Thus, meta-analysis combining data from multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing method of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT, and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.

  7. A new statistical method for transfer coefficient calculations in the framework of the general multiple-compartment model of transport for radionuclides in biological systems.

    PubMed

    Garcia, F; Arruda-Neto, J D; Manso, M V; Helene, O M; Vanin, V R; Rodriguez, O; Mesa, J; Likhachev, V P; Filho, J W; Deppman, A; Perez, G; Guzman, F; de Camargo, S P

    1999-10-01

    A new and simple statistical procedure (STATFLUX) for the calculation of transfer coefficients of radionuclide transport to animals and plants is proposed. The method is based on the general multiple-compartment model, which uses a system of linear equations involving geometrical volume considerations. By using experimentally available curves of radionuclide concentrations versus time, for each animal compartment (organs), flow parameters were estimated by employing a least-squares procedure, whose consistency is tested. Some numerical results are presented in order to compare the STATFLUX transfer coefficients with those from other works and experimental data.
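    The least-squares step of such a compartment fit can be illustrated with a toy one-compartment decay model (this is not the STATFLUX code; the rate constant and concentration data below are synthetic):

```python
import math

def fit_decay_rate(times, concentrations):
    """Log-linear least-squares fit of a one-compartment model
    C(t) = C0 * exp(-k*t); returns the estimated (C0, k)."""
    ys = [math.log(c) for c in concentrations]
    n = len(times)
    t_mean = sum(times) / n
    y_mean = sum(ys) / n
    # Ordinary least-squares slope of ln C versus t.
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in zip(times, ys))
             / sum((t - t_mean) ** 2 for t in times))
    return math.exp(y_mean - slope * t_mean), -slope

# Synthetic concentration curve generated from C(t) = 10 * exp(-0.3 t)
ts = [0, 1, 2, 4, 8]
cs = [10 * math.exp(-0.3 * t) for t in ts]
c0_hat, k_hat = fit_decay_rate(ts, cs)
```

    With noise-free synthetic data the fit recovers C0 = 10 and k = 0.3 exactly (up to floating-point rounding); real compartment fits like STATFLUX handle multiple coupled compartments and noisy measurements.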

  8. Exposure assessment for endocrine disruptors: some considerations in the design of studies.

    PubMed Central

    Rice, Carol; Birnbaum, Linda S; Cogliano, James; Mahaffey, Kathryn; Needham, Larry; Rogan, Walter J; vom Saal, Frederick S

    2003-01-01

    In studies designed to evaluate exposure-response relationships in children's development from conception through puberty, multiple factors that affect the generation of meaningful exposure metrics must be considered. These factors include multiple routes of exposure; the timing, frequency, and duration of exposure; need for qualitative and quantitative data; sample collection and storage protocols; and the selection and documentation of analytic methods. The methods for exposure data collection and analysis must be sufficiently robust to accommodate the a priori hypotheses to be tested, as well as hypotheses generated from the data. A number of issues that must be considered in study design are summarized here. PMID:14527851

  9. Automated identification of best-quality coronary artery segments from multiple-phase coronary CT angiography (cCTA) for vessel analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Chuan; Chan, Heang-Ping; Hadjiiski, Lubomir M.; Chughtai, Aamer; Wei, Jun; Kazerooni, Ella A.

    2016-03-01

    We are developing an automated method to identify the best-quality segment among the corresponding segments in multiple-phase cCTA. The coronary artery trees are automatically extracted from different cCTA phases using our multi-scale vessel segmentation and tracking method. An automated registration method is then used to align the multiple-phase artery trees. The corresponding coronary artery segments are identified in the registered vessel trees and are straightened by curved planar reformation (CPR). Four features are extracted from each segment in each phase as quality indicators in the original CT volume and the straightened CPR volume. Each quality indicator is used as a voting classifier to vote on the corresponding segments. A newly designed weighted voting ensemble (WVE) classifier is finally used to determine the best-quality coronary segment. An observer preference study was conducted with three readers who visually rated the quality of the vessels on a 1-to-6 ranking scale. Six and ten cCTA cases were used as the training and test sets in this preliminary study. For the 10 test cases, the agreement between the automatically identified best-quality (AI-BQ) segments and the radiologist's top two rankings was 79.7%, and that between AI-BQ and the other two readers was 74.8% and 83.7%, respectively. The results demonstrated that the performance of our automated method was comparable to that of experienced readers for identification of the best-quality coronary segments.
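    The weighted-voting step can be sketched as follows; the quality scores and weights are invented for illustration and are not the paper's four indicators:

```python
def weighted_vote(scores_by_indicator, weights):
    """scores_by_indicator[i][p] = quality score of phase p under indicator i.
    Each indicator casts one vote for its top-scoring phase; votes are
    combined with per-indicator weights. Returns the winning phase index."""
    n_phases = len(scores_by_indicator[0])
    tally = [0.0] * n_phases
    for indicator_scores, w in zip(scores_by_indicator, weights):
        best_phase = max(range(n_phases), key=lambda p: indicator_scores[p])
        tally[best_phase] += w
    return max(range(n_phases), key=lambda p: tally[p])

scores = [[0.2, 0.9, 0.5],   # indicator 1 favours phase 1
          [0.8, 0.3, 0.4],   # indicator 2 favours phase 0
          [0.1, 0.7, 0.6],   # indicator 3 favours phase 1
          [0.6, 0.5, 0.9]]   # indicator 4 favours phase 2
print(weighted_vote(scores, weights=[1.0, 1.0, 1.0, 1.0]))  # -> 1
```

    With equal weights, phase 1 wins two of four votes; giving the fourth indicator a much larger weight would instead elect phase 2, which is the point of weighting the ensemble.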

  10. Sheet metals characterization using the virtual fields method

    NASA Astrophysics Data System (ADS)

    Marek, Aleksander; Davis, Frances M.; Pierron, Fabrice

    2018-05-01

    In this work, a characterisation method involving a deep-notched specimen subjected to tensile loading is introduced. This specimen leads to heterogeneous states of stress and strain, the latter being measured using a stereo DIC system (MatchID). This heterogeneity enables the identification of multiple material parameters in a single test. In order to identify material parameters from the DIC data, an inverse method called the Virtual Fields Method is employed. Combined with recently developed sensitivity-based virtual fields, the method makes it possible to optimally locate the areas of the test where information about each material parameter is encoded, improving the accuracy of the identification over traditional user-defined virtual fields. It is shown that a single test performed at 45° to the rolling direction is sufficient to obtain all anisotropic plastic parameters, thus reducing the experimental effort involved in characterisation. The paper presents the methodology and some numerical validation.

  11. TEAMS (Tele-Exercise and Multiple Sclerosis), a Tailored Telerehabilitation mHealth App: Participant-Centered Development and Usability Study

    PubMed Central

    Rimmer, James H; Johnson, George; Wilroy, Jereme; Young, Hui-Ju; Mehta, Tapan; Lai, Byron

    2018-01-01

    Background People with multiple sclerosis face varying levels of disability and symptoms, thus requiring highly trained therapists and/or exercise trainers to design personalized exercise programs. However, for people living in geographically isolated communities, access to such trained professionals can be challenging due to a number of barriers associated with cost, access to transportation, and travel distance. Generic mobile health exercise apps often fall short of what people with multiple sclerosis need to become physically active (ie, exercise content that has been adapted to accommodate a wide range of functional limitations). Objective This usability study describes the development process of the TEAMS (Tele-Exercise and Multiple Sclerosis) app, which is being used by people with multiple sclerosis in a large randomized controlled trial to engage in home-based telerehabilitation. Methods Twenty-one participants with disabilities (10 people with multiple sclerosis) were involved in the double iterative design, which included the simultaneous development of the app features and exercise content (exercise videos and articles). Framed within a user-centered design approach, the development process included 2 stages: ground-level creation (focus group followed by early stage evaluations and developments), and proof of concept through 2 usability tests. Usability (effectiveness, usefulness, and satisfaction) was evaluated using a mixed-methods approach. Results During testing of the app’s effectiveness, the second usability test resulted in an average of 1 problem per participant, a decrease of 53% compared to the initial usability test. Five themes were constructed from the qualitative data that related to app usefulness and satisfaction, namely: high perceived confidence for app usability, positive perceptions of exercise videos, viable exercise option at home, orientation and familiarity required for successful participation, and app issues. 
Participants acknowledged that the final app was ready to be delivered to the public after minor revisions. After making these revisions, the project team released the final app, which is being used in the randomized controlled trial. Conclusions A multi-level user-centered development process resulted in an inclusive exercise program for people with multiple sclerosis, operated through an easy-to-use app. The promotion of exercise through self-regulated mHealth programs requires a stakeholder-driven approach to app development. This ensures that the app and its content match the preferences and functional abilities of the end users (ie, people with varying levels of multiple sclerosis). PMID:29798832

  12. Intensity ratio to improve black hole assessment in multiple sclerosis.

    PubMed

    Adusumilli, Gautam; Trinkaus, Kathryn; Sun, Peng; Lancia, Samantha; Viox, Jeffrey D; Wen, Jie; Naismith, Robert T; Cross, Anne H

    2018-01-01

    Improved imaging methods are critical to assess neurodegeneration and remyelination in multiple sclerosis. Chronic hypointensities observed on T1-weighted brain MRI, "persistent black holes," reflect severe focal tissue damage. Present measures consist of determining persistent black hole numbers and volumes, but do not quantify the severity of individual lesions. The aim was to develop a method to differentiate black and gray holes and estimate the severity of individual multiple sclerosis lesions using standard magnetic resonance imaging. Thirty-eight multiple sclerosis patients contributed images. Intensities of lesions on T1-weighted scans were assessed relative to cerebrospinal fluid intensity using commercial software. Magnetization transfer imaging, diffusion tensor imaging and clinical testing were performed to assess associations with T1w intensity-based measures. Intensity-based assessments of T1w hypointensities were reproducible and achieved > 90% concordance with expert rater determinations of "black" and "gray" holes. Intensity ratio values correlated with magnetization transfer ratios (R = 0.473) and diffusion tensor imaging metrics (R values ranging from 0.283 to -0.531) that have been associated with demyelination and axon loss. Intensity ratio values incorporated into T1w hypointensity volumes correlated with clinical measures of cognition. This method of determining the degree of hypointensity within multiple sclerosis lesions can add information to conventional imaging. Copyright © 2017 Elsevier B.V. All rights reserved.
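    The intensity-ratio classification can be sketched as follows; the cutoff values below are hypothetical placeholders, not thresholds from the study:

```python
def classify_hole(lesion_mean, csf_mean, black_cutoff=1.2, gray_cutoff=1.6):
    """Classify a T1w-hypointense lesion by its mean intensity relative to
    cerebrospinal fluid. The cutoffs here are illustrative, not the paper's."""
    ratio = lesion_mean / csf_mean
    if ratio < black_cutoff:
        return "black hole"      # intensity close to CSF: severe damage
    elif ratio < gray_cutoff:
        return "gray hole"       # intermediate hypointensity
    return "isointense"          # not a persistent hole

print(classify_hole(110.0, 100.0))  # ratio 1.1 -> "black hole"
```

    The continuous ratio itself, rather than the discrete label, is what the record reports as correlating with magnetization transfer and diffusion metrics.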

  13. Development of dynamic Bayesian models for web application test management

    NASA Astrophysics Data System (ADS)

    Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.

    2018-03-01

    The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool that can be used to model complex stochastic dynamic processes. According to the results of the research, mathematical models and methods of dynamic Bayesian networks provide high coverage of the stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connections between individual test assets across multiple time slices. This approach makes it possible to present testing as a discrete process with set structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine in one management area individual units and testing components that have different functionalities and a direct influence on each other in the process of comprehensive testing for various groups of computer bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
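    Forward filtering over time slices, the basic inference step in such a dynamic Bayesian model, can be sketched as follows (the two states, pass/fail observations, and all probabilities are invented for illustration):

```python
def forward_filter(prior, transition, emission, observations):
    """Forward (filtering) pass over a discrete dynamic Bayesian model.
    prior[s], transition[s][s2], and emission[s][o] are probabilities."""
    belief = prior[:]
    for obs in observations:
        # Predict: propagate the belief through the transition model.
        predicted = [sum(belief[s] * transition[s][s2] for s in range(len(belief)))
                     for s2 in range(len(belief))]
        # Update: weight by the observation likelihood, then renormalize.
        unnorm = [predicted[s] * emission[s][obs] for s in range(len(predicted))]
        z = sum(unnorm)
        belief = [u / z for u in unnorm]
    return belief

# States: 0 = "module healthy", 1 = "module buggy"; observations: 0 = pass, 1 = fail
prior = [0.9, 0.1]
transition = [[0.95, 0.05], [0.10, 0.90]]
emission = [[0.9, 0.1], [0.3, 0.7]]  # buggy modules fail tests more often
belief = forward_filter(prior, transition, emission, [1, 1])
```

    After observing two consecutive test failures, the posterior probability that the module is buggy rises from the 0.1 prior to well above 0.8, which is the kind of slice-to-slice propagation the record describes.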

  14. Diagnostic Utility of Total IgE in Foods, Inhalant, and Multiple Allergies in Saudi Arabia.

    PubMed

    Al-Mughales, Jamil A

    2016-01-01

    Objective. To assess the diagnostic significance of total IgE in foods, inhalant, and multiple allergies. Methods. Retrospective review of the laboratory records of patients who presented with clinical suspicion of food or inhalant allergy between January 2013 and December 2014. Total IgE level was defined as positive for a value >195 kU/L; and diagnosis was confirmed by the detection of specific IgE (gold standard) for at least one food or inhalant allergen, and at least two allergens in multiple allergies. Results. A total of 1893 (male ratio = 0.68, mean age = 39.0 ± 19.2 years) patients were included. Total IgE had comparable sensitivity (55.8% versus 59.6%) and specificity (83.9% versus 84.4%) in food versus inhalant allergy, respectively, but a superior PPV in inhalant allergy (79.1% versus 54.4%). ROC curve analysis showed a better diagnostic value in inhalant allergies (AUC = 0.817 (95% CI = 0.796-0.837) versus 0.770 (95% CI = 0.707-0.833)). In multiple allergies, total IgE had a relatively good sensitivity (78.6%), while negative IgE testing (<195 kU/L) predicted the absence of multiple allergies with 91.5% certainty. Conclusion. The total IgE assay is not efficient as a diagnostic test for foods, inhalant, or multiple allergies. The best strategy should refer to specific IgE testing guided by a comprehensive atopic history.
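    The reported diagnostic measures all derive from a standard 2x2 table of the index test against the reference standard; a minimal sketch (the counts below are hypothetical, not the study's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 table of index-test
    results (here, total IgE) against the reference (specific IgE) standard."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among the diseased
        "specificity": tn / (tn + fp),  # true negatives among the healthy
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts for one allergy category
m = diagnostic_metrics(tp=59, fp=16, fn=41, tn=84)
```

    Note that sensitivity and specificity are properties of the test, while PPV and NPV also depend on disease prevalence in the tested population, which is why the record can report similar sensitivities but very different PPVs across categories.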

  16. Multiple comparisons permutation test for image based data mining in radiotherapy.

    PubMed

    Chen, Chun; Witte, Marnix; Heemsbergen, Wilma; van Herk, Marcel

    2013-12-23

    Comparing incidental dose distributions (i.e. images) of patients with different outcomes is a straightforward way to explore dose-response hypotheses in radiotherapy. In this paper, we introduced a permutation test that compares images, such as dose distributions from radiotherapy, while tackling the multiple comparisons problem. A test statistic Tmax was proposed that summarizes the differences between the images into a single value, and a permutation procedure was employed to compute the adjusted p-value. We demonstrated the method in two retrospective studies: a prostate study that relates 3D dose distributions to failure, and an esophagus study that relates 2D surface dose distributions of the esophagus to acute esophagus toxicity. As a result, we were able to identify suspicious regions that are significantly associated with failure (prostate study) or toxicity (esophagus study). Permutation testing allows direct comparison of images from different patient categories and is a useful tool for data mining in radiotherapy.
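    The Tmax permutation procedure can be sketched on toy "images" as follows (an illustration of the max-statistic idea, not the authors' implementation; here each image is a short list of voxel doses and the statistic is the largest absolute mean difference):

```python
import random

def maxT_permutation_test(group_a, group_b, n_perm=2000, seed=0):
    """Max-statistic permutation test for images (equal-length voxel lists).
    Returns the multiplicity-adjusted p-value of the largest absolute
    voxelwise mean difference (the Tmax statistic)."""
    rng = random.Random(seed)
    n_vox = len(group_a[0])

    def t_max(a, b):
        return max(abs(sum(img[v] for img in a) / len(a)
                       - sum(img[v] for img in b) / len(b))
                   for v in range(n_vox))

    observed = t_max(group_a, group_b)
    pooled = group_a + group_b
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # random relabeling of patients to groups
        if t_max(pooled[:len(group_a)], pooled[len(group_a):]) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)  # add-one adjustment for exactness

# Voxel 0 differs strongly between outcomes; voxel 1 does not.
group_fail = [[5.0, 0.0], [6.0, 0.1], [5.5, -0.1], [6.2, 0.0]]
group_ok = [[0.1, 0.0], [0.0, 0.1], [-0.2, 0.0], [0.2, -0.1]]
p_adj = maxT_permutation_test(group_fail, group_ok)
```

    Because the maximum is taken over all voxels within each permutation, the single adjusted p-value automatically accounts for the multiple comparisons across the image.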

  17. Design of lightning protection for a full-authority digital engine control

    NASA Technical Reports Server (NTRS)

    Dargi, M.; Rupke, E.; Wiles, K.

    1991-01-01

    The steps and procedures are described which are necessary to achieve a successful lightning-protection design for a state-of-the-art Full-Authority Digital Engine Control (FADEC) system. The engine and control systems used as examples are fictional, but the design and verification methods are real. Topics discussed include: applicable airworthiness regulation, selection of equipment transient design and control levels for the engine/airframe and intra-engine segments of the system, the use of cable shields, terminal-protection devices and filter circuits in hardware protection design, and software approaches to minimize upset potential. Shield terminations, grounding, and bonding are also discussed, as are the important elements of certification and test plans, and the role of tests and analyses. Also included are examples of multiple-stroke and multiple-burst testing. A review of design pitfalls and challenges, and status of applicable test standards such as RTCA DO-160, Section 22, are presented.

  18. The positive and negative consequences of multiple-choice testing.

    PubMed

    Roediger, Henry L; Marsh, Elizabeth J

    2005-09-01

    Multiple-choice tests are commonly used in educational settings but with unknown effects on students' knowledge. The authors examined the consequences of taking a multiple-choice test on a later general knowledge test in which students were warned not to guess. A large positive testing effect was obtained: Prior testing of facts aided final cued-recall performance. However, prior testing also had negative consequences. Prior reading of a greater number of multiple-choice lures decreased the positive testing effect and increased production of multiple-choice lures as incorrect answers on the final test. Multiple-choice testing may inadvertently lead to the creation of false knowledge.

  19. LP-search and its use in analysis of the accuracy of control systems with acoustical models

    NASA Technical Reports Server (NTRS)

    Sergeyev, V. I.; Sobol, I. M.; Statnikov, R. B.; Statnikov, I. N.

    1973-01-01

    The LP-search is proposed as an analog of the Monte Carlo method for finding values in nonlinear statistical systems. It is concluded that, to attain the required accuracy in solving the problem of control for a statistical system, the LP-search requires a considerably smaller number of tests than the Monte Carlo method, and that the LP-search allows multiple repetitions of tests under identical conditions together with observability of the output variables of the system.
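    The advantage of uniformly distributed search points can be illustrated with a base-2 van der Corput sequence standing in for the LPτ points of LP-search (an assumption: the original method uses LPτ/Sobol-type sequences, which differ in detail from this simpler one-dimensional sequence):

```python
import random

def van_der_corput(n, base=2):
    """First n points of the base-2 van der Corput low-discrepancy sequence."""
    points = []
    for i in range(1, n + 1):
        x, denom, k = 0.0, 1.0, i
        while k:
            denom *= base
            k, digit = divmod(k, base)  # reverse the digits of i across the point
            x += digit / denom
        points.append(x)
    return points

def estimate(points):
    # Estimate the mean of f(x) = x**2 on [0, 1]; the exact value is 1/3.
    return sum(x * x for x in points) / len(points)

n = 1024
quasi_err = abs(estimate(van_der_corput(n)) - 1 / 3)
rng = random.Random(1)
mc_err = abs(estimate([rng.random() for _ in range(n)]) - 1 / 3)
```

    With 1024 quasi-random points the error is below 10^-3, while typical pseudo-random draws of the same size land an order of magnitude further from 1/3, echoing the record's claim that fewer tests are needed for a given accuracy.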

  20. Evaluation of Low-Tech Indoor Remediation Methods ...

    EPA Pesticide Factsheets

    This study identified, collected, evaluated, and summarized available articles, reports, guidance documents, and other pertinent information related to common housekeeping activities within the United States. The result was a summary compendium of relevant information about multiple low-tech cleaning methods drawn from the literature search results. Through discussion and prioritization, an EPA project team, made up of several EPA scientists and emergency responders, focused the information into a list of 14 housekeeping activities for decontamination evaluation testing. These types of activities are collectively referred to as “low-tech” remediation methods because of the comparatively simple tools, equipment, and operations involved. Similarly, eight common household surfaces were chosen and contaminated under three different contamination conditions. Thirty-three combinations of methods and surfaces were chosen for testing under the three contamination conditions, for a total of 99 tests.

  1. Effectiveness of applying progressive muscle relaxation technique on quality of life of patients with multiple sclerosis.

    PubMed

    Ghafari, Somayeh; Ahmadi, Fazlolah; Nabavi, Masoud; Anoshirvan, Kazemnejad; Memarian, Robabe; Rafatbakhsh, Mohamad

    2009-08-01

    To identify the effects of applying the Progressive Muscle Relaxation Technique on the quality of life of patients with multiple sclerosis. In view of the growing range of care options in multiple sclerosis, improvement of quality of life has become increasingly relevant as a caring intervention. Complementary therapies are widely used by multiple sclerosis patients, and the Progressive Muscle Relaxation Technique is one form of complementary therapy. Quasi-experimental study. Multiple sclerosis patients (n = 66) were selected by non-probability sampling and assigned to experimental and control groups (33 patients in each group). Data collection instruments included an Individual Information Questionnaire, the SF-8 Health Survey, and a self-reported checklist. The experimental group performed PMRT for 63 sessions over two months, while no intervention was given to the control group. Statistical analysis was done with SPSS software. Student's t-test showed no significant difference between the two groups in mean health-related quality-of-life scores before the study, but a significant difference one and two months after the intervention (p < 0.05). ANOVA with repeated measurements showed a significant difference between the two groups in the mean overall and dimension scores of health-related quality of life across the three time points (p < 0.05). Although this study provides modest support for the effectiveness of the Progressive Muscle Relaxation Technique on the quality of life of multiple sclerosis patients, further research is required to determine better methods to promote the quality of life of patients suffering from multiple sclerosis and other chronic diseases. The Progressive Muscle Relaxation Technique is practically feasible and is associated with an increase in the quality of life of multiple sclerosis patients; health professionals therefore need to update their knowledge about complementary therapies.

  2. Computerized multiple image analysis on mammograms: performance improvement of nipple identification for registration of multiple views using texture convergence analyses

    NASA Astrophysics Data System (ADS)

    Zhou, Chuan; Chan, Heang-Ping; Sahiner, Berkman; Hadjiiski, Lubomir M.; Paramagul, Chintana

    2004-05-01

    Automated registration of multiple mammograms for CAD depends on accurate nipple identification. We developed two new image analysis techniques based on geometric and texture convergence analyses to improve the performance of our previously developed nipple identification method. A gradient-based algorithm is used to automatically track the breast boundary. The nipple search region along the boundary is then defined by geometric convergence analysis of the breast shape. Three nipple candidates are identified by detecting the changes along the gray level profiles inside and outside the boundary and the changes in the boundary direction. A texture orientation-field analysis method is developed to estimate the fourth nipple candidate based on the convergence of the tissue texture pattern towards the nipple. The final nipple location is determined from the four nipple candidates by a confidence analysis. Our training and test data sets consisted of 419 and 368 randomly selected mammograms, respectively. The nipple location identified on each image by an experienced radiologist was used as the ground truth. For 118 of the training and 70 of the test images, the radiologist could not positively identify the nipple, but provided an estimate of its location. These were referred to as invisible nipple images. In the training data set, 89.37% (269/301) of the visible nipples and 81.36% (96/118) of the invisible nipples could be detected within 1 cm of the truth. In the test data set, 92.28% (275/298) of the visible nipples and 67.14% (47/70) of the invisible nipples were identified within 1 cm of the truth. In comparison, our previous nipple identification method without using the two convergence analysis techniques detected 82.39% (248/301), 77.12% (91/118), 89.93% (268/298) and 54.29% (38/70) of the nipples within 1 cm of the truth for the visible and invisible nipples in the training and test sets, respectively. 
The results indicate that the nipple on mammograms can be detected accurately. This will be an important step towards automatic multiple image analysis for CAD techniques.

  3. Standardization and validation of a cytometric bead assay to assess antibodies to multiple Plasmodium falciparum recombinant antigens

    PubMed Central

    2012-01-01

    Background Multiplex cytometric bead assays (CBA) have a number of advantages over ELISA for antibody testing, but little information is available on the standardization and validation of antibody CBA to multiple Plasmodium falciparum antigens. The present study set out to determine optimal parameters for multiplex testing of antibodies to P. falciparum antigens, and to compare results of multiplex CBA to ELISA. Methods Antibodies to ten recombinant P. falciparum antigens were measured by CBA and ELISA in samples from 30 individuals from a malaria endemic area of Kenya and compared to known positive and negative control plasma samples. Optimal antigen amounts, monoplex vs multiplex testing, plasma dilution, optimal buffer, and the number of beads required were assessed for CBA testing, and results from CBA vs. ELISA testing were compared. Results Optimal amounts for CBA antibody testing differed according to antigen. Results for monoplex CBA testing correlated strongly with multiplex testing for all antigens (r = 0.88-0.99, P values from <0.0001 to 0.004), and antibodies to variants of the same antigen were accurately distinguished within a multiplex reaction. Plasma dilutions of 1:100 or 1:200 were optimal for all antigens for CBA testing. Plasma diluted in a buffer containing 0.05% sodium azide, 0.5% polyvinylalcohol, and 0.8% polyvinylpyrrolidone had the lowest background activity. CBA median fluorescence intensity (MFI) values with 1,000 antigen-conjugated beads/well did not differ significantly from MFI with 5,000 beads/well. CBA and ELISA results correlated well for all antigens except apical membrane antigen-1 (AMA-1). CBA testing produced a greater range of values in samples from malaria endemic areas and less background reactivity for blank samples than ELISA. Conclusion With optimization, CBA may be the preferred method of testing for antibodies to P. falciparum antigens, as CBA can test for antibodies to multiple recombinant antigens from a single plasma sample and produces a greater range of values in positive samples and lower background readings for blank samples than ELISA. PMID:23259607
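    The monoplex-multiplex and CBA-ELISA comparisons rest on correlating paired readings; a minimal Pearson-correlation sketch with hypothetical paired values (not study data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between paired measurements,
    e.g. CBA median fluorescence intensities vs. ELISA optical densities."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired readings for one antigen
cba = [120, 340, 560, 890, 1500]
elisa = [0.15, 0.42, 0.61, 0.95, 1.60]
r = pearson_r(cba, elisa)
```

    An r near 1 for such paired readings is what justifies replacing the slower single-antigen ELISA with the multiplexed bead assay; note the record's r = 0.88-0.99 range was per antigen, and AMA-1 was the exception.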

  4. Double-multiple streamtube model for Darrieus wind turbines

    NASA Technical Reports Server (NTRS)

    Paraschivoiu, I.

    1981-01-01

    An analytical model is proposed for calculating the rotor performance and aerodynamic blade forces for Darrieus wind turbines with curved blades. The method of analysis uses a multiple-streamtube model, divided into two parts: one modeling the upstream half-cycle of the rotor and the other, the downstream half-cycle. The upwind and downwind components of the induced velocities at each level of the rotor were obtained using the principle of two actuator disks in tandem. Variation of the induced velocities in the two parts of the rotor produces larger forces in the upstream zone and smaller forces in the downstream zone. Comparisons of the overall rotor performance with previous methods and field test data show the important improvement obtained with the present model. The calculations were made using the computer code CARDAA developed at IREQ. The double-multiple streamtube model presented has two major advantages: it requires a much shorter computer time than the three-dimensional vortex model and is more accurate than the multiple-streamtube model in predicting the aerodynamic blade loads.

  5. An Automated Parallel Image Registration Technique Based on the Correlation of Wavelet Features

    NASA Technical Reports Server (NTRS)

    LeMoigne, Jacqueline; Campbell, William J.; Cromp, Robert F.; Zukor, Dorothy (Technical Monitor)

    2001-01-01

    With the increasing importance of multiple platform/multiple remote sensing missions, fast and automatic integration of digital data from disparate sources has become critical to the success of these endeavors. Our work utilizes maxima of wavelet coefficients to form the basic features of a correlation-based automatic registration algorithm. Our wavelet-based registration algorithm is tested successfully with data from the National Oceanic and Atmospheric Administration (NOAA) Advanced Very High Resolution Radiometer (AVHRR) and the Landsat/Thematic Mapper (TM), which differ by translation and/or rotation. By the choice of high-frequency wavelet features, this method is similar to an edge-based correlation method, but by exploiting the multi-resolution nature of a wavelet decomposition, our method achieves higher computational speeds for comparable accuracies. This algorithm has been implemented on a Single Instruction Multiple Data (SIMD) massively parallel computer, the MasPar MP-2, as well as on the Cray T3D, the Cray T3E and a Beowulf cluster of Pentium workstations.
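    The correlation step of such a registration can be sketched in one dimension; the "wavelet maxima" profile below is synthetic, and the full algorithm works on 2-D feature images with rotation as well as translation:

```python
def best_shift(reference, target, max_shift):
    """Estimate the translation between two 1-D feature profiles by
    maximizing their normalized overlap correlation."""
    def score(shift):
        pairs = [(reference[i], target[i + shift])
                 for i in range(len(reference))
                 if 0 <= i + shift < len(target)]
        # Normalize by the overlap length so short overlaps are not favoured.
        return sum(a * b for a, b in pairs) / len(pairs)
    return max(range(-max_shift, max_shift + 1), key=score)

# A synthetic feature profile and a copy translated right by 3 samples
ref = [0, 0, 1, 5, 9, 5, 1, 0, 0, 0, 0, 0]
shifted = [0] * 3 + ref[:-3]
print(best_shift(ref, shifted, max_shift=4))  # -> 3
```

    The wavelet decomposition enters by restricting this search to coarse-level feature maxima first, then refining the shift at finer levels, which is what gives the method its speed advantage over full-image correlation.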

  6. Stochastic Modeling and Analysis of Multiple Nonlinear Accelerated Degradation Processes through Information Fusion

    PubMed Central

    Sun, Fuqiang; Liu, Le; Li, Xiaoyang; Liao, Haitao

    2016-01-01

    Accelerated degradation testing (ADT) is an efficient technique for evaluating the lifetime of a highly reliable product whose underlying failure process may be traced by the degradation of the product’s performance parameters with time. However, most research on ADT mainly focuses on a single performance parameter. In reality, the performance of a modern product is usually characterized by multiple parameters, and the degradation paths are usually nonlinear. To address such problems, this paper develops a new s-dependent nonlinear ADT model for products with multiple performance parameters using a general Wiener process and copulas. The general Wiener process models the nonlinear ADT data, and the dependency among different degradation measures is analyzed using the copula method. An engineering case study on a tuner’s ADT data is conducted to demonstrate the effectiveness of the proposed method. The results illustrate that the proposed method is quite effective in estimating the lifetime of a product with s-dependent performance parameters. PMID:27509499
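    The linear special case of the Wiener degradation model can be sketched as follows (the paper uses a general, time-transformed Wiener process with copula-linked parameters across multiple performance measures; the drift, volatility, and sample sizes here are invented):

```python
import random

def simulate_wiener(drift, sigma, dt, n_steps, rng):
    """Simulate one degradation path X(t) = drift*t + sigma*B(t)."""
    x, path = 0.0, [0.0]
    for _ in range(n_steps):
        x += drift * dt + sigma * rng.gauss(0.0, dt ** 0.5)
        path.append(x)
    return path

def estimate_drift(path, dt):
    """MLE of Wiener-process drift: total increment over total time."""
    return (path[-1] - path[0]) / (dt * (len(path) - 1))

rng = random.Random(42)
paths = [simulate_wiener(drift=0.5, sigma=0.2, dt=0.1, n_steps=200, rng=rng)
         for _ in range(50)]
drift_hat = sum(estimate_drift(p, 0.1) for p in paths) / len(paths)
```

    With the drift and volatility estimated per performance parameter, lifetime is the first-passage time of X(t) to a failure threshold; the copula step in the record then couples the parameters' degradation measures, which this single-parameter sketch omits.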

  7. Stochastic Modeling and Analysis of Multiple Nonlinear Accelerated Degradation Processes through Information Fusion.

    PubMed

    Sun, Fuqiang; Liu, Le; Li, Xiaoyang; Liao, Haitao

    2016-08-06

    Accelerated degradation testing (ADT) is an efficient technique for evaluating the lifetime of a highly reliable product whose underlying failure process may be traced by the degradation of the product's performance parameters with time. However, most research on ADT mainly focuses on a single performance parameter. In reality, the performance of a modern product is usually characterized by multiple parameters, and the degradation paths are usually nonlinear. To address such problems, this paper develops a new s-dependent nonlinear ADT model for products with multiple performance parameters using a general Wiener process and copulas. The general Wiener process models the nonlinear ADT data, and the dependency among different degradation measures is analyzed using the copula method. An engineering case study on a tuner's ADT data is conducted to demonstrate the effectiveness of the proposed method. The results illustrate that the proposed method is quite effective in estimating the lifetime of a product with s-dependent performance parameters.
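A minimal sketch of the s-dependence structure (mine, not the paper's model): two Wiener degradation processes whose increments are coupled through a Gaussian copula with correlation rho. With Gaussian marginals this reduces to a bivariate normal; the drift, diffusion, and rho values below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def dependent_wiener_paths(n_steps, dt, drift, sigma, rho):
    # Gaussian copula with correlation rho couples the increments of the
    # two degradation processes (the "s-dependence" between parameters)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n_steps)
    inc = np.asarray(drift) * dt + np.asarray(sigma) * np.sqrt(dt) * z
    return np.cumsum(inc, axis=0)

# two performance parameters degrading together; a unit fails when
# either path first crosses its degradation threshold
paths = dependent_wiener_paths(20000, 0.01, drift=[0.5, 0.3],
                               sigma=[0.2, 0.1], rho=0.6)
```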

  8. High-resolution imaging using a wideband MIMO radar system with two distributed arrays.

    PubMed

    Wang, Dang-wei; Ma, Xiao-yan; Chen, A-Lei; Su, Yi

    2010-05-01

    Imaging a fast maneuvering target has been an active research area over the past decades. Usually, an array antenna with multiple elements is employed to avoid the motion compensation involved in inverse synthetic aperture radar (ISAR) imaging. Nevertheless, this comes at a price: the hardware complexity is much higher than that of a single-antenna ISAR imaging system, which instead relies on complex algorithms. In this paper, a wideband multiple-input multiple-output (MIMO) radar system with two distributed arrays is proposed to reduce the hardware complexity of the system. Furthermore, the system model, the equivalent array production method and the imaging procedure are presented. Compared with a classical real aperture radar (RAR) imaging system, an important advantage of our method is its lower hardware complexity, since many additional virtual array elements can be obtained. Numerical simulations are provided to test our system and imaging method.

  9. Allelic-based gene-gene interaction associated with quantitative traits.

    PubMed

    Jung, Jeesun; Sun, Bin; Kwon, Deukwoo; Koller, Daniel L; Foroud, Tatiana M

    2009-05-01

    Recent studies have shown that quantitative phenotypes may be influenced not only by multiple single nucleotide polymorphisms (SNPs) within a gene but also by the interaction between SNPs at unlinked genes. We propose a new statistical approach that can detect gene-gene interactions at the allelic level which contribute to the phenotypic variation in a quantitative trait. By testing for the association of allelic combinations at multiple unlinked loci with a quantitative trait, we can detect the SNP allelic interaction whether or not it can be detected as a main effect. Our proposed method assigns a score to unrelated subjects according to their allelic combination inferred from observed genotypes at two or more unlinked SNPs, and then tests for the association of the allelic score with a quantitative trait. To investigate the statistical properties of the proposed method, we performed a simulation study to estimate type I error rates and power and demonstrated that this allelic approach achieves greater power than the more commonly used genotypic approach to test for gene-gene interaction. As an example, the proposed method was applied to data obtained as part of a candidate gene study of sodium retention by the kidney. We found that this method detects an interaction between the calcium-sensing receptor gene (CaSR), the chloride channel gene (CLCNKB) and the Na, K, 2Cl cotransporter gene (CLC12A1) that contributes to variation in diastolic blood pressure.
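The regression step can be mocked up as follows. The scoring rule here (product of minor-allele counts at the two SNPs) and all effect sizes are hypothetical, chosen only to show how an allelic score is tested against a quantitative trait; the paper's actual scores are inferred from allelic combinations.

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(2)
n = 300

# genotypes at two unlinked SNPs, coded as minor-allele counts (0/1/2)
g1 = rng.binomial(2, 0.3, size=n)
g2 = rng.binomial(2, 0.4, size=n)

# hypothetical allelic-combination score for each subject
score = g1 * g2

# quantitative trait driven purely by the gene-gene interaction
trait = 0.5 * g1 * g2 + rng.normal(scale=1.0, size=n)

# association of the allelic score with the trait (slope t-test)
res = linregress(score, trait)
```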

  10. Construction of mathematical model for measuring material concentration by colorimetric method

    NASA Astrophysics Data System (ADS)

    Liu, Bing; Gao, Lingceng; Yu, Kairong; Tan, Xianghua

    2018-06-01

    This paper uses multiple linear regression to analyze the data of Problem C of the 2017 mathematical modeling contest. First, we established regression models for the concentrations of five substances, but only the model for the concentration of urea in milk passed the significance test. The regression model established from the second set of data passed the significance test, but it suffered from serious multicollinearity. We improved the model by principal component analysis. The improved model is used to control the system so that material concentrations can be measured by the direct colorimetric method.
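The multicollinearity fix the authors describe — principal component regression — can be sketched with NumPy alone. The two nearly collinear regressors below are synthetic stand-ins, not the contest data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)     # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 2.0 * x1 + rng.normal(scale=0.1, size=n)

# ordinary least squares is ill-conditioned here; regress on the
# leading principal component instead
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 1                                        # components to keep
scores = Xc @ Vt[:k].T
beta_pc = np.linalg.lstsq(scores, y - y.mean(), rcond=None)[0]
beta = Vt[:k].T @ beta_pc                    # coefficients in original variables
```

Because x1 and x2 are nearly identical, PCR spreads the true coefficient of 2 across both variables (roughly 1 each), giving a stable fit where OLS would be wildly unstable.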

  11. Acoustic Guided Wave Testing of Pipes of Small Diameters

    NASA Astrophysics Data System (ADS)

    Muravev, V. V.; Muraveva, O. V.; Strizhak, V. A.; Myshkin, Y. V.

    2017-10-01

    The acoustic path is analyzed and the main parameters of guided wave testing are substantiated as applied to pipes of small diameter. The method is implemented using longitudinal L(0,1) and torsional T(0,1) waves excited by electromagnetic-acoustic (EMA) transducers. The method of multiple reflections (MMR) combines echo-through, amplitude-shadow and time-shadow methods. Owing to the coherent amplification of echo-pulses from defects, analyzing the far reflections increases the sensitivity to defects of small size. The possibility of detecting both local defects (dents, corrosion damage, rolling features, pitting, cracks) and defects extended along the pipe is demonstrated.

  12. A review of advantages of high-efficiency X-ray spectrum imaging for analysis of nanostructured ferritic alloys

    DOE PAGES

    Parish, Chad M.; Miller, Michael K.

    2014-12-09

    Nanostructured ferritic alloys (NFAs) exhibit complex microstructures consisting of 100-500 nm ferrite grains, grain boundary solute enrichment, and multiple populations of precipitates and nanoclusters (NCs). Understanding these materials' excellent creep and radiation-tolerance properties requires a combination of multiple atomic-scale experimental techniques. Recent advances in scanning transmission electron microscopy (STEM) hardware and data analysis methods have the potential to revolutionize nanometer to micrometer scale materials analysis. These methods are applied to NFAs as a test case and compared to both conventional STEM methods and complementary techniques such as scanning electron microscopy and atom probe tomography. In this paper, we review past results and present new results illustrating the effectiveness of latest-generation STEM instrumentation and data analysis.

  13. Semiquantitative determination of mesophilic, aerobic microorganisms in cocoa products using the Soleris NF-TVC method.

    PubMed

    Montei, Carolyn; McDougal, Susan; Mozola, Mark; Rice, Jennifer

    2014-01-01

    The Soleris Non-fermenting Total Viable Count method was previously validated for a wide variety of food products, including cocoa powder. A matrix extension study was conducted to validate the method for use with cocoa butter and cocoa liquor. Test samples included naturally contaminated cocoa liquor and cocoa butter inoculated with natural microbial flora derived from cocoa liquor. A probability of detection statistical model was used to compare Soleris results at multiple test thresholds (dilutions) with aerobic plate counts determined using the AOAC Official Method 966.23 dilution plating method. Results of the two methods were not statistically different at any dilution level in any of the three trials conducted. The Soleris method offers the advantage of results within 24 h, compared to the 48 h required by standard dilution plating methods.

  14. A Novel Sensor System for Measuring Wheel Loads of Vehicles on Highways

    PubMed Central

    Zhang, Wenbin; Suo, Chunguang; Wang, Qi

    2008-01-01

    With the development of highway transportation and business trade, vehicle Weigh-In-Motion (WIM) technology has become a key technology for measuring traffic loads. In this paper a novel WIM system based on monitoring of pavement strain responses in rigid pavement was investigated. In this WIM system, multiple low-cost, lightweight, small-volume and high-accuracy embedded concrete strain sensors were used as WIM sensors to measure rigid pavement strain responses. In order to verify the feasibility of the method, a system prototype based on multiple sensors was designed and deployed on a relatively busy freeway. Field calibration and tests were performed with known two-axle truck wheel loads, and the measurement errors were calculated based on the static weights measured with a static weighbridge. This enables the weights of other vehicles to be calculated from the calibration constant. Calibration and test results for individual sensors and three-sensor fusions are both provided. Repeatability, sources of error, and weight accuracy are discussed. Results showed that the proposed method is feasible and highly accurate. Furthermore, a sample-mean approach fusing multiple individual sensors can provide better performance than individual sensors. PMID:27873952
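The reported gain from sample-mean fusion is the usual 1/sqrt(n) noise reduction of averaging independent sensors; a hypothetical three-sensor illustration (the load and noise figures are invented, not from the field test):

```python
import numpy as np

rng = np.random.default_rng(5)
true_weight = 5000.0                         # hypothetical axle load, kg
n_passes = 200

# three strain sensors, each with independent zero-mean noise
readings = true_weight + rng.normal(scale=150.0, size=(n_passes, 3))

individual_err = readings[:, 0].std()        # single-sensor scatter
fused_err = readings.mean(axis=1).std()      # sample-mean fusion of 3 sensors
```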

  15. Practical Implementation of Multiple Model Adaptive Estimation Using Neyman-Pearson Based Hypothesis Testing and Spectral Estimation Tools

    DTIC Science & Technology

    1996-09-01

    Generalized Likelihood Ratio (GLR) and voting techniques. The third class consisted of multiple hypothesis filter detectors, specifically the MMAE. The...vector version, versus a tensor if we use the matrix version of the power spectral density estimate. Using this notation, we will derive an...as MATLAB, have an intrinsic sample covariance computation available, which makes this method quite easy to implement. In practice, the mean for the

  16. Functional Connectivity in Multiple Cortical Networks Is Associated with Performance Across Cognitive Domains in Older Adults.

    PubMed

    Shaw, Emily E; Schultz, Aaron P; Sperling, Reisa A; Hedden, Trey

    2015-10-01

    Intrinsic functional connectivity MRI has become a widely used tool for measuring integrity in large-scale cortical networks. This study examined multiple cortical networks using Template-Based Rotation (TBR), a method that applies a priori network and nuisance component templates defined from an independent dataset to test datasets of interest. A priori templates were applied to a test dataset of 276 older adults (ages 65-90) from the Harvard Aging Brain Study to examine the relationship between multiple large-scale cortical networks and cognition. Factor scores derived from neuropsychological tests represented processing speed, executive function, and episodic memory. Resting-state BOLD data were acquired in two 6-min acquisitions on a 3-Tesla scanner and processed with TBR to extract individual-level metrics of network connectivity in multiple cortical networks. All results controlled for data quality metrics, including motion. Connectivity in multiple large-scale cortical networks was positively related to all cognitive domains, with a composite measure of general connectivity positively associated with general cognitive performance. Controlling for the correlations between networks, the frontoparietal control network (FPCN) and executive function demonstrated the only significant association, suggesting specificity in this relationship. Further analyses found that the FPCN mediated the relationships of the other networks with cognition, suggesting that this network may play a central role in understanding individual variation in cognition during aging.

  17. An Investigation of the Accuracy of Alternative Methods of True Score Estimation in High-Stakes Mixed-Format Examinations.

    ERIC Educational Resources Information Center

    Klinger, Don A.; Rogers, W. Todd

    2003-01-01

    The estimation accuracy of procedures based on classical test score theory and item response theory (generalized partial credit model) were compared for examinations consisting of multiple-choice and extended-response items. Analysis of British Columbia Scholarship Examination results found an error rate of about 10 percent for both methods, with…

  18. Effects of the Use of Two Visual Methods in Teaching College Chemistry to Non-Science Majors.

    ERIC Educational Resources Information Center

    Koechel, Loretta

    This was a quantified study on the learning of certain theoretical topics in general chemistry as influenced by two methods of visual technique (single concept films, overhead projections). Four classes of chemistry students (non-science majors) registered in sections on a random basis, participated. Objective, multiple choice tests on each of the…

  19. Multiple Marking of English Compositions: An Account of an Experiment.

    ERIC Educational Resources Information Center

    Britton, J. N.; And Others

    An experiment was conducted to find a better method of marking English composition than that which is in general use. The method tested was one in which each composition was assessed independently by three markers who judged the general impression of the writing, and by a fourth marker who applied a code of penalties for mechanical errors. Brief…

  20. Simulation Methods for Design of Networked Power Electronics and Information Systems

    DTIC Science & Technology

    2014-07-01

    Insertion of latency in every branch and at every node permits the system model to be efficiently distributed across many separate computing cores. An... the system. We demonstrated extensibility and generality of the Virtual Test Bed (VTB) framework to support multiple solvers and their associated...Information Systems Objectives The overarching objective of this program is to develop methods for fast

  1. Strength and life criteria for corrugated fiberboard by three methods

    Treesearch

    Thomas J. Urbanik

    1997-01-01

    The conventional test method for determining the stacking life of corrugated containers at a fixed load level does not adequately predict a safe load when storage time is fixed. This study introduced multiple load levels and related the probability of time at failure to load. A statistical analysis of logarithm-of-time failure data varying with load level predicts the...

  2. Influence of Family Communication Structure and Vanity Trait on Consumption Behavior: A Case Study of Adolescent Students in Taiwan

    ERIC Educational Resources Information Center

    Chang, Wei-Lung; Liu, Hsiang-Te; Lin, Tai-An; Wen, Yung-Sung

    2008-01-01

    The purpose of this research was to study the relationship between family communication structure, vanity trait, and related consumption behavior. The study used an empirical method with adolescent students from the northern part of Taiwan as the subjects. Multiple statistical methods and the SEM model were used for testing the hypotheses. The…

  3. Impairment in Children with and without ADHD: Contributions from Oppositional Defiant Disorder and Callous-Unemotional Traits

    ERIC Educational Resources Information Center

    Brammer, Whitney A.; Lee, Steve S.

    2012-01-01

    Objective: To ascertain the association of childhood ADHD and oppositional defiant disorder (ODD) on functional impairment and to test the moderating influence of callous-unemotional (CU) traits. Method: Ethnically diverse 6- to 9-year-old children with (n = 59) and without (n = 47) ADHD were ascertained using multiple methods (i.e., rating scales…

  4. Examination of the Measurement of Absorption Using the Reverberant Room Method for Highly Absorptive Acoustic Foam

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Anne M.; Nottoli, Chris; Wolfram, Eric

    2015-01-01

    The absorption coefficient of a material specimen is needed to quantify the expected acoustic performance of that material in its actual usage and environment. The ASTM C423-09a standard, "Standard Test Method for Sound Absorption and Sound Absorption Coefficients by the Reverberant Room Method", is often used to measure the absorption coefficient of material test specimens. This method has its basis in the Sabine formula. Although the method is widely used, the interpretation of these measurements is a topic of interest. For example, in certain cases the measured Sabine absorption coefficients are greater than 1.0 for highly absorptive materials. This is often attributed to the diffraction edge effect phenomenon. An investigative test program to measure the absorption properties of highly absorbent melamine foam has been performed at the Riverbank Acoustical Laboratories. This paper will present and discuss the test results relating to the effect of the test materials' surface area, thickness and edge sealing conditions. A follow-on paper is envisioned that will present and discuss the results relating to the spacing between multiple-piece specimens, and the mounting condition of the test specimen.
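The Sabine relationship behind ASTM C423 converts reverberation time to absorption, A = 55.26·V/(c·T60); a simplified sketch (chamber and specimen numbers are illustrative, not the Riverbank data) shows how the resulting coefficient of a small, highly absorptive specimen can exceed 1.0:

```python
def sabine_absorption(volume, rt60, c=343.0):
    # total room absorption (m^2 sabins) from the Sabine formula
    return 55.26 * volume / (c * rt60)

def specimen_coefficient(volume, area, rt_empty, rt_specimen, c=343.0):
    # absorption added by the specimen, normalized by its face area
    added = (sabine_absorption(volume, rt_specimen, c)
             - sabine_absorption(volume, rt_empty, c))
    return added / area

# hypothetical 200 m^3 chamber with a 6 m^2 melamine specimen
alpha = specimen_coefficient(200.0, 6.0, rt_empty=5.0, rt_specimen=2.0)
```

Edge diffraction lets the specimen absorb energy arriving at its perimeter as well as its face, so normalizing by face area alone can push alpha above 1.0, as the abstract notes.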

  5. Downscaling remotely sensed imagery using area-to-point cokriging and multiple-point geostatistical simulation

    NASA Astrophysics Data System (ADS)

    Tang, Yunwei; Atkinson, Peter M.; Zhang, Jingxiong

    2015-03-01

    A cross-scale data integration method was developed and tested based on the theory of geostatistics and multiple-point geostatistics (MPG). The goal was to downscale remotely sensed images while retaining spatial structure by integrating images at different spatial resolutions. During the process of downscaling, a rich spatial correlation model in the form of a training image was incorporated to facilitate reproduction of similar local patterns in the simulated images. Area-to-point cokriging (ATPCK) was used as the locally varying mean (LVM) (i.e., soft data) to deal with the change of support problem (COSP) for cross-scale integration, which MPG cannot achieve alone. Several pairs of spectral bands of remotely sensed images were tested for integration within different cross-scale case studies. The experiment shows that MPG can restore the spatial structure of the image at a fine spatial resolution given the training image and conditioning data. The super-resolution image can be predicted using the proposed method, which cannot be realised using most data integration methods. The results show that the ATPCK-MPG approach can achieve greater accuracy than methods which do not account for the change of support issue.

  6. Estimation of the Percentage of Newly Diagnosed HIV-Positive Persons Linked to HIV Medical Care in CDC-Funded HIV Testing Programs.

    PubMed

    Wang, Guoshen; Pan, Yi; Seth, Puja; Song, Ruiguang; Belcher, Lisa

    2017-01-01

    Missing data create challenges for determining progress made in linking HIV-positive persons to HIV medical care. Statistical methods are typically not used to address missing program data on linkage. In 2014, 61 health department jurisdictions were funded by the Centers for Disease Control and Prevention (CDC) and submitted data on HIV testing, newly diagnosed HIV-positive persons, and linkage to HIV medical care. Missing or unusable data existed in our data set. A new approach using multiple imputation to address missing linkage data was proposed, and results were compared to the current approach that uses data with complete information. There were 12,472 newly diagnosed HIV-positive persons from CDC-funded HIV testing events in 2014. Using multiple imputation, 94.1% (95% confidence interval (CI): [93.7%, 94.6%]) of newly diagnosed persons were referred to HIV medical care, 88.6% (95% CI: [88.0%, 89.1%]) were linked to care within any time frame, and 83.6% (95% CI: [83.0%, 84.3%]) were linked to care within 90 days. Multiple imputation is recommended for addressing missing linkage data in future analyses when the missing percentage is high. The use of multiple imputation for missing values can result in a better understanding of how programs are performing on key HIV testing and HIV service delivery indicators.
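A stripped-down version of the imputation step (my sketch; the CDC analysis would condition on covariates rather than a single observed proportion): impute the missing linkage indicators m times from the observed linkage rate, then combine the completed-data estimates by Rubin's rule.

```python
import numpy as np

rng = np.random.default_rng(42)

def multiply_impute_proportion(linked, m=20):
    # linked: 1 = linked to care, 0 = not linked, NaN = missing
    obs = linked[~np.isnan(linked)]
    p_obs = obs.mean()
    n_missing = int(np.isnan(linked).sum())
    estimates = []
    for _ in range(m):
        draws = rng.binomial(1, p_obs, size=n_missing)  # one imputation
        completed = np.concatenate([obs, draws])
        estimates.append(completed.mean())
    # Rubin's rule for the point estimate: average over the m imputations
    # (a full analysis also combines within- and between-imputation variance)
    return float(np.mean(estimates))

linked = np.array([1.0] * 80 + [0.0] * 20 + [np.nan] * 25)
estimate = multiply_impute_proportion(linked)
```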

  7. Comparison of three methods of solution to the inverse problem of groundwater hydrology for multiple pumping stimulation

    NASA Astrophysics Data System (ADS)

    Giudici, Mauro; Casabianca, Davide; Comunian, Alessandro

    2015-04-01

    The basic classical inverse problem of groundwater hydrology aims at determining aquifer transmissivity (T) from measurements of hydraulic head (h), estimates or measures of source terms, and the least possible knowledge of hydraulic transmissivity. The theory of inverse problems shows that this is an example of an ill-posed problem, for which non-uniqueness and instability (or at least ill-conditioning) might preclude the computation of a physically acceptable solution. One of the methods to reduce the problems with non-uniqueness, ill-conditioning and instability is a tomographic approach, i.e., the use of data corresponding to independent flow situations. The latter might correspond to different hydraulic stimulations of the aquifer, i.e., to different pumping schedules and flux rates. Three inverse methods have been analyzed and tested to profit from the use of multiple sets of data: the Differential System Method (DSM), the Comparison Model Method (CMM) and the Double Constraint Method (DCM). DSM and CMM need h all over the domain and thus the first step for their application is the interpolation of measurements of h at sparse points. Moreover, they also need the knowledge of the source terms (aquifer recharge, well pumping rates) all over the aquifer. DSM is intrinsically based on the use of multiple data sets, which permits writing a first-order partial differential equation for T, whereas CMM and DCM were originally proposed to invert a single data set and have been extended to work with multiple data sets in this work. CMM and DCM are based on Darcy's law, which is used to update an initial guess of the T field with formulas based on a comparison of different hydraulic gradients. In particular, the CMM algorithm corrects the T estimate with the ratio between the observed hydraulic gradient and that obtained with a comparison model which shares the same boundary conditions and source terms as the model to be calibrated, but a tentative T field. 
On the other hand, the DCM algorithm applies the ratio of the hydraulic gradients obtained for two different forward models: one with the same boundary conditions and source terms as the model to be calibrated, and the other with prescribed head at the positions where in- or out-flow is known and h is measured. For DCM and CMM, multiple stimulation is exploited by updating the T field separately for each data set and then combining the resulting updated fields with different possible statistics (arithmetic, geometric or harmonic mean, median, least change, etc.). The three algorithms are tested, and their characteristics and results are compared on a field data set provided by Prof. Fritz Stauffer (ETH), corresponding to a pumping test in a thin alluvial aquifer in northern Switzerland. Three data sets are available, corresponding to the undisturbed state, to the flow field created by a single pumping well, and to the situation created by a 'hydraulic dipole', i.e., an extraction well and an injection well. These data sets make it possible to test the three inverse methods and the different options that can be chosen for their use.
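One plausible one-dimensional reading of the CMM update (my sketch, not the authors' code): for steady flow with known flux q, the comparison model built from the tentative transmissivity has gradient -q/T, and scaling T by the ratio of comparison to observed gradients recovers a uniform true T in a single iteration.

```python
import numpy as np

def cmm_update_1d(T_old, h_obs, dx, q):
    # steady 1-D Darcy flow with constant flux q: T * dh/dx = -q.
    # The comparison model, sharing q but using the tentative T_old,
    # has gradient -q / T_old; the update scales T_old by the ratio
    # of the comparison-model gradient to the observed one.
    grad_obs = np.gradient(h_obs, dx)
    grad_cmp = -q / T_old
    return T_old * grad_cmp / grad_obs

x = np.linspace(0.0, 10.0, 21)
T_true, q = 5.0, 2.0
h_obs = 10.0 - (q / T_true) * x      # observed head, slope -q/T_true
T_new = cmm_update_1d(1.0, h_obs, dx=x[1] - x[0], q=q)
```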

  8. Automated plasma control with optical emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Ward, P. P.

    Plasma etching and desmear processes for printed wiring board (PWB) manufacture are difficult to predict and control. Non-uniformity of most plasma processes and sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight loss coupons or post-plasma destructive testing must be used. These techniques are not real-time methods, however, and do not allow for immediate diagnosis and process correction. These tests often require scrapping some fraction of a batch to ensure the integrity of the rest. Since these tests verify a successful cycle with post-plasma diagnostics, poor test results often determine that a batch is substandard and the resulting parts unusable. These tests are a costly part of the overall fabrication cost. A more efficient method of testing would allow for constant monitoring of plasma conditions and process control. Process anomalies should be detected and corrected before the parts being treated are damaged. Real-time monitoring would allow for instantaneous corrections. Multiple-site monitoring would allow for process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow for this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control will be explored, along with applications of this technique to process control, failure analysis and endpoint determination in PWB manufacture.

  9. Laser-Induced Thermal Acoustics Theory and Expected Experimental Errors when Applied to a Scramjet Isolator Model

    NASA Technical Reports Server (NTRS)

    Middleton, Troy F.; Balla, Robert Jeffrey; Baurle, Robert A.; Wilson, Lloyd G.

    2011-01-01

    A scramjet isolator model test apparatus is being assembled in the Isolator Dynamics Research Lab (IDRL) at the NASA Langley Research Center in Hampton, Virginia. The test apparatus is designed to support multiple measurement techniques for investigating the flow field in a scramjet isolator model. The test section is 1-inch high by 2-inch wide by 24-inch long and simulates a scramjet isolator with an aspect ratio of two. Unheated, dry air at a constant stagnation pressure and temperature is delivered to the isolator test section through a Mach 2.5 planar nozzle. The isolator test section is mechanically back-pressured to contain the resulting shock train within the 24-inch isolator length and supports temperature, static pressure, and high frequency pressure measurements at the wall. Additionally, nonintrusive methods including laser-induced thermal acoustics (LITA), spontaneous Raman scattering, particle image velocimetry, and schlieren imaging are being incorporated to measure off-wall fluid dynamic, thermodynamic, and transport properties of the flow field. Interchangeable glass and metallic sidewalls and optical access appendages permit making multiple measurements simultaneously. The measurements will be used to calibrate computational fluid dynamics turbulence models and characterize the back-pressured flow of a scramjet isolator. This paper describes the test apparatus, including the optical access appendages; the physics of the LITA method; and estimates of LITA measurement uncertainty for measurements of the speed of sound and temperature.

  10. Bayesian estimation of the transmissivity spatial structure from pumping test data

    NASA Astrophysics Data System (ADS)

    Demir, Mehmet Taner; Copty, Nadim K.; Trinchero, Paolo; Sanchez-Vila, Xavier

    2017-06-01

    Estimating the statistical parameters (mean, variance, and integral scale) that define the spatial structure of the transmissivity or hydraulic conductivity fields is a fundamental step for the accurate prediction of subsurface flow and contaminant transport. In practice, the determination of the spatial structure is a challenge because of spatial heterogeneity and data scarcity. In this paper, we describe a novel approach that uses time-drawdown data from multiple pumping tests to determine the transmissivity statistical spatial structure. The method builds on the pumping test interpretation procedure of Copty et al. (2011) (Continuous Derivation method, CD), which uses the time-drawdown data and its time derivative to estimate apparent transmissivity values as a function of radial distance from the pumping well. A Bayesian approach is then used to infer the statistical parameters of the transmissivity field by combining prior information about the parameters and the likelihood function expressed in terms of radially-dependent apparent transmissivities determined from pumping tests. A major advantage of the proposed Bayesian approach is that the likelihood function is readily determined from multiple randomly generated realizations of the transmissivity field, without the need to solve the groundwater flow equation. Applying the method to synthetically-generated pumping test data, we demonstrate that, through a relatively simple procedure, information on the spatial structure of the transmissivity may be inferred from pumping test data. It is also shown that the prior parameter distribution has a significant influence on the estimation procedure, given its non-uniqueness. Results also indicate that the reliability of the estimated transmissivity statistical parameters increases with the number of available pumping tests.

  11. Analysis of Duplicated Multiple-Samples Rank Data Using the Mack-Skillings Test.

    PubMed

    Carabante, Kennet Mariano; Alonso-Marenco, Jose Ramon; Chokumnoyporn, Napapan; Sriwattana, Sujinda; Prinyawiwatkul, Witoon

    2016-07-01

    Appropriate analysis for duplicated multiple-samples rank data is needed. This study compared analysis of duplicated rank preference data using the Friedman versus Mack-Skillings tests. Panelists (n = 125) ranked 2 orange juice sets twice: a different-samples set (100%, 70%, vs. 40% juice) and a similar-samples set (100%, 95%, vs. 90%). These 2 sample sets were designed to yield contrasting differences in preference. For each sample set, rank sum data were obtained from (1) averaged rank data of each panelist from the 2 replications (n = 125), (2) rank data of all panelists from each of the 2 separate replications (n = 125 each), (3) joint rank data of all panelists from the 2 replications (n = 125), and (4) rank data of all panelists pooled from the 2 replications (n = 250); rank data (1), (2), and (4) were separately analyzed by the Friedman test, while those from (3) were analyzed by the Mack-Skillings test. The effect of sample size (n = 10 to 125) was evaluated. For the similar-samples set, higher variations in rank data from the 2 replications were observed; therefore, results of the main effects were more inconsistent among methods and sample sizes. Regardless of analysis method, the larger the sample size, the higher the χ² value and the lower the P-value (testing H0: all samples are not different). Analyzing rank data (2) separately by replication yielded inconsistent conclusions across sample sizes, hence this method is not recommended. The Mack-Skillings test was more sensitive than the Friedman test. Furthermore, it takes into account within-panelist variations and is more appropriate for analyzing duplicated rank data. © 2016 Institute of Food Technologists®
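Two of the four analyses — Friedman on per-panelist averaged ranks (1) versus Friedman on pooled replications (4) — can be reproduced with SciPy (the hedonic scores, sample means, and panel size below are simulated, and the Mack-Skillings test itself is not available in SciPy, so it is not shown):

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(7)
n_panelists, reps = 30, 2

# hypothetical scores for 3 samples over 2 replications; sample 0 preferred
scores = rng.normal(loc=[7.0, 5.0, 3.0], scale=1.0,
                    size=(n_panelists, reps, 3))

# approach (1): average the two replications per panelist, then Friedman
avg = scores.mean(axis=1)
stat_avg, p_avg = friedmanchisquare(avg[:, 0], avg[:, 1], avg[:, 2])

# approach (4): pool replications as if they were extra panelists
pooled = scores.reshape(-1, 3)
stat_pool, p_pool = friedmanchisquare(pooled[:, 0], pooled[:, 1], pooled[:, 2])
```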

  12. Multi-objects recognition for distributed intelligent sensor networks

    NASA Astrophysics Data System (ADS)

    He, Haibo; Chen, Sheng; Cao, Yuan; Desai, Sachi; Hohil, Myron E.

    2008-04-01

    This paper proposes an innovative approach to multi-object recognition for homeland security and defense based intelligent sensor networks. Unlike the conventional way of information analysis, data mining in such networks is typically characterized by high information ambiguity/uncertainty, data redundancy, high dimensionality and real-time constraints. Furthermore, since a typical military network normally includes multiple mobile sensor platforms, ground forces, fortified tanks, combat flights, and other resources, it is critical to develop intelligent data mining approaches to fuse different information resources to understand dynamic environments, to support decision making processes, and finally to achieve the goals. This paper aims to address these issues with a focus on multi-object recognition. Instead of classifying a single object as in traditional image classification problems, the proposed method can automatically learn multiple objects simultaneously. Image segmentation techniques are used to identify the interesting regions in the field, which correspond to multiple objects such as soldiers or tanks. Since different objects will come with different feature sizes, we propose a feature scaling method to represent each object in the same number of dimensions. This is achieved by linear/nonlinear scaling and sampling techniques. Finally, support vector machine (SVM) based learning algorithms are developed to learn and build the associations for different objects, and such knowledge will be adaptively accumulated for object recognition in the testing stage. We test the effectiveness of the proposed method in different simulated military environments.
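The feature-scaling step — resampling variable-length region features to a fixed dimensionality before SVM training — can be sketched with scikit-learn. The ramp-versus-flat "regions" are hypothetical stand-ins for segmented image regions:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)

def rescale_features(region, size=16):
    # linear interpolation maps a variable-length feature vector onto a
    # fixed number of dimensions (the paper's feature-scaling idea)
    old = np.linspace(0.0, 1.0, len(region))
    new = np.linspace(0.0, 1.0, size)
    return np.interp(new, old, region)

def make_region(label):
    # hypothetical segmented regions: class 0 = ramp, class 1 = flat
    n = int(rng.integers(10, 40))
    base = np.linspace(0.0, 1.0, n) if label == 0 else np.full(n, 0.5)
    return base + rng.normal(scale=0.05, size=n)

labels = [int(rng.integers(2)) for _ in range(100)]
X = np.array([rescale_features(make_region(lb)) for lb in labels])
clf = SVC(kernel="rbf").fit(X, labels)
```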

  13. Evaluation and Field Assessment of Bifacial Photovoltaic Module Power Rating Methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deline, Chris; MacAlpine, Sara; Marion, Bill

    2016-11-21

    1-sun power ratings for bifacial modules are currently undefined. This is partly because there is no standard definition of rear irradiance given 1000 W/m² on the front. Using field measurements and simulations, we evaluate multiple deployment scenarios for bifacial modules and provide details on the amount of irradiance that could be expected. A simplified case representing a single module deployed under conditions consistent with existing 1-sun irradiance standards leads to a bifacial reference condition of 1000 W/m² Gfront and 130-140 W/m² Grear. For fielded systems of bifacial modules, Grear magnitude and spatial uniformity will be affected by self-shade from adjacent modules, varied ground cover, and ground-clearance height. A standard measurement procedure for bifacial modules is also currently undefined. A proposed international standard is under development, which provides the motivation for this work. Here, we compare outdoor field measurements of bifacial modules with irradiance on both sides against proposed indoor test methods where irradiance is applied to only one side at a time. The indoor method has multiple advantages, including a controlled and repeatable irradiance and thermal environment, along with allowing the use of conventional single-sided flash test equipment. The comparison results are promising, showing that the indoor and outdoor methods agree within 1%-2% for multiple rear-irradiance conditions and bifacial module types.

  14. Evaluation and Field Assessment of Bifacial Photovoltaic Module Power Rating Methodologies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deline, Chris; MacAlpine, Sara; Marion, Bill

    2016-06-16

    1-sun power ratings for bifacial modules are currently undefined. This is partly because there is no standard definition of rear irradiance given 1000 W/m² on the front. Using field measurements and simulations, we evaluate multiple deployment scenarios for bifacial modules and provide details on the amount of irradiance that could be expected. A simplified case representing a single module deployed under conditions consistent with existing 1-sun irradiance standards leads to a bifacial reference condition of 1000 W/m² Gfront and 130-140 W/m² Grear. For fielded systems of bifacial modules, Grear magnitude and spatial uniformity will be affected by self-shade from adjacent modules, varied ground cover, and ground-clearance height. A standard measurement procedure for bifacial modules is also currently undefined. A proposed international standard is under development, which provides the motivation for this work. Here, we compare outdoor field measurements of bifacial modules with irradiance on both sides against proposed indoor test methods where irradiance is applied to only one side at a time. The indoor method has multiple advantages, including a controlled and repeatable irradiance and thermal environment, along with allowing the use of conventional single-sided flash test equipment. The comparison results are promising, showing that the indoor and outdoor methods agree within 1%-2% for multiple rear-irradiance conditions and bifacial module types.

  15. Multiple Testing in the Context of Gene Discovery in Sickle Cell Disease Using Genome-Wide Association Studies.

    PubMed

    Kuo, Kevin H M

    2017-01-01

    The issue of multiple testing, also termed multiplicity, is ubiquitous in studies where multiple hypotheses are tested simultaneously. The genome-wide association study (GWAS), a type of genetic association study that has gained popularity in the past decade, is particularly susceptible to the issue of multiple testing. Different methodologies have been employed to address the issue of multiple testing in GWAS. The purpose of this review is to examine the methodologies employed in dealing with multiple testing in the context of gene discovery using GWAS in sickle cell disease complications.
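
For context, the two corrections most commonly discussed in such reviews, Bonferroni (controlling the family-wise error rate) and Benjamini-Hochberg (controlling the false discovery rate), can be sketched in a few lines; the p-values below are made up for illustration:

```python
import numpy as np

def bonferroni(pvals, alpha=0.05):
    """Reject H0 where p <= alpha / m (controls family-wise error rate)."""
    p = np.asarray(pvals)
    return p <= alpha / p.size

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure (controls false discovery rate)."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    m = p.size
    thresholds = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])  # largest i with p_(i) <= alpha*i/m
        reject[order[: k + 1]] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
print(bonferroni(pvals).sum(), benjamini_hochberg(pvals).sum())  # → 1 2
```

As expected, the FDR procedure rejects more hypotheses than the stricter Bonferroni bound on the same p-values.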

  16. Self-Field-Dominated Plasma

    DTIC Science & Technology

    1998-03-31

    plasma focus discharges. Part of the tests summarized here addresses methods and means for achieving controlled variations of the current sheath (CS) structure via electrode geometry modifications. CS parameters are monitored with multiple magnetic probes in the case of cylindrical- and open-funnel electrodes.

  17. A Comparison of Approximation Modeling Techniques: Polynomial Versus Interpolating Models

    NASA Technical Reports Server (NTRS)

    Giunta, Anthony A.; Watson, Layne T.

    1998-01-01

    Two methods of creating approximation models are compared through the calculation of the modeling accuracy on test problems involving one, five, and ten independent variables. Here, the test problems are representative of the modeling challenges typically encountered in realistic engineering optimization problems. The first approximation model is a quadratic polynomial created using the method of least squares. This type of polynomial model has seen considerable use in recent engineering optimization studies due to its computational simplicity and ease of use. However, quadratic polynomial models may be of limited accuracy when the response data to be modeled have multiple local extrema. The second approximation model employs an interpolation scheme known as kriging developed in the fields of spatial statistics and geostatistics. This class of interpolating model has the flexibility to model response data with multiple local extrema. However, this flexibility is obtained at an increase in computational expense and a decrease in ease of use. The intent of this study is to provide an initial exploration of the accuracy and modeling capabilities of these two approximation methods.

  18. Packing Boxes into Multiple Containers Using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Menghani, Deepak; Guha, Anirban

    2016-07-01

    Container loading problems have been studied extensively in the literature, and various analytical, heuristic, and metaheuristic methods have been proposed. This paper presents two variants of a genetic algorithm framework for the three-dimensional container loading problem of optimally loading boxes into multiple containers with constraints. The algorithms are designed so that the various constraints found in real-life problems are easy to incorporate. Tested on standard benchmark cases from the literature, the algorithms compare well with benchmark algorithms in terms of container utilization. This, along with the ability to easily incorporate a wide range of practical constraints, makes them attractive for implementation in real-life scenarios.
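
A heavily simplified, one-dimensional sketch of the genetic-algorithm idea: permutation chromosomes decoded by first-fit into fixed-capacity containers, with fitness equal to the number of containers used. The box volumes and GA parameters are made up, not the paper's:

```python
import random

BOXES = [4, 8, 1, 4, 2, 1, 7, 3, 6, 4]   # hypothetical box volumes
CAPACITY = 10                            # hypothetical container capacity

def containers_used(order):
    """Decode a packing order with first-fit; return containers needed."""
    loads = []
    for b in order:
        for i, load in enumerate(loads):
            if load + BOXES[b] <= CAPACITY:
                loads[i] += BOXES[b]
                break
        else:
            loads.append(BOXES[b])
    return len(loads)

def crossover(p1, p2):
    """Order crossover (OX): keep a slice of p1, fill the rest from p2."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    middle = p1[a:b]
    rest = [g for g in p2 if g not in middle]
    return rest[:a] + middle + rest[a:]

def mutate(order):
    """Swap two positions in the packing order."""
    i, j = random.sample(range(len(order)), 2)
    order[i], order[j] = order[j], order[i]

random.seed(0)
pop = [random.sample(range(len(BOXES)), len(BOXES)) for _ in range(30)]
for _ in range(50):
    pop.sort(key=containers_used)
    survivors = pop[:10]                 # elitist selection
    children = [crossover(random.choice(survivors), random.choice(survivors))
                for _ in range(20)]
    for c in children:
        if random.random() < 0.2:
            mutate(c)
    pop = survivors + children

best = min(pop, key=containers_used)
print(containers_used(best))
```

With total volume 40 and capacity 10, four containers is the lower bound; the GA searches packing orders toward it. Real implementations add 3-D placement rules and the practical constraints the paper describes.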

  19. Chromosomal Microarray Testing in 42 Korean Patients with Unexplained Developmental Delay, Intellectual Disability, Autism Spectrum Disorders, and Multiple Congenital Anomalies.

    PubMed

    Lee, Sun Ho; Song, Wung Joo

    2017-09-01

    Chromosomal microarray (CMA) is a high-resolution, high-throughput method of identifying submicroscopic genomic copy number variations (CNVs). CMA has been established as the first-line diagnostic test for individuals with developmental delay (DD), intellectual disability (ID), autism spectrum disorders (ASDs), and multiple congenital anomalies (MCAs). CMA analysis was performed in 42 Korean patients who had been diagnosed with unexplained DD, ID, ASDs, and MCAs. Clinically relevant CNVs were discovered in 28 patients. Variants of unknown significance were detected in 13 patients. The diagnostic yield was high (66.7%). CMA is a superior diagnostic tool compared with conventional karyotyping and fluorescence in situ hybridization.

  20. The X-33 Extended Flight Test Range

    NASA Technical Reports Server (NTRS)

    Mackall, Dale A.; Sakahara, Robert; Kremer, Steven E.

    1998-01-01

    Development of an extended test range, with range instrumentation providing continuous vehicle communications, is required to flight-test the X-33, a scaled version of a reusable launch vehicle. The extended test range provides vehicle communications coverage from California to landing in Montana or Utah. This paper provides an overview of the approaches used to meet X-33 program requirements, including the use of multiple ground stations and methods to reduce problems caused by reentry plasma radio-frequency blackout. The advances made in developing the extended test range show how other hypersonic and access-to-space programs can benefit from this development.

  1. Feedback enhances the positive effects and reduces the negative effects of multiple-choice testing.

    PubMed

    Butler, Andrew C; Roediger, Henry L

    2008-04-01

    Multiple-choice tests are used frequently in higher education without much consideration of the impact this form of assessment has on learning. Multiple-choice testing enhances retention of the material tested (the testing effect); however, unlike other tests, multiple-choice can also be detrimental because it exposes students to misinformation in the form of lures. The selection of lures can lead students to acquire false knowledge (Roediger & Marsh, 2005). The present research investigated whether feedback could be used to boost the positive effects and reduce the negative effects of multiple-choice testing. Subjects studied passages and then received a multiple-choice test with immediate feedback, delayed feedback, or no feedback. In comparison with the no-feedback condition, both immediate and delayed feedback increased the proportion of correct responses and reduced the proportion of intrusions (i.e., lure responses from the initial multiple-choice test) on a delayed cued recall test. Educators should provide feedback when using multiple-choice tests.

  2. Plasma process control with optical emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Ward, P. P.

    Plasma processes for cleaning, etching, and desmear of electronic components and printed wiring boards (PWBs) are difficult to predict and control. Non-uniformity of most plasma processes and sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight-loss coupons or post-plasma destructive testing must be used. The problem with these techniques is that they are not real-time methods and do not allow for immediate diagnosis and process correction. These methods often require scrapping some fraction of a batch to ensure the integrity of the rest. Since these methods verify a successful cycle with post-plasma diagnostics, poor test results often determine that a batch is substandard and the resulting parts unusable. Both methods add a costly step to the overall fabrication process. A more efficient method of testing would allow for constant monitoring of plasma conditions and process control. Process failures should be detected before the parts being treated are damaged. Real-time monitoring would allow for instantaneous corrections. Multiple-site monitoring would allow for process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow for this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control is explored, with a discussion of the technique as it applies to process control, failure analysis, and endpoint determination. Methods for identifying process failures and the progress and end of etch-back and desmear processes are discussed.

  3. The predictive value of multiple electrode platelet aggregometry for postoperative bleeding complications in patients undergoing coronary artery bypass graft surgery

    PubMed Central

    Woźniak, Karolina; Hryniewiecki, Tomasz; Kruk, Mariusz; Różański, Jacek; Kuśmierczyk, Mariusz

    2016-01-01

    Introduction Postoperative bleeding is one of the most serious complications of cardiac surgery and requires transfusion of blood or blood products. Acetylsalicylic acid (ASA) and clopidogrel (CLO) are the two most commonly used antiplatelet agents; when used in combination (i.e., as dual antiplatelet therapy [DAPT]), they exert a synergistic effect. Dual antiplatelet therapy, however, significantly increases the risk of postoperative bleeding. The effect of antiplatelet therapy can be monitored by platelet aggregation testing. One of the most commonly used methods for assessing platelet reactivity is multiple electrode aggregometry (MEA), which can be performed with the Multiplate analyzer. Although the method has long been used in interventional cardiology to assess the effect of antiplatelet therapy, it is not available at cardiac surgery departments as a standard diagnostic procedure. The aim of the study was to establish the frequency of bleeding complications following coronary artery bypass graft (CABG) surgery in patients on single antiplatelet therapy (SAPT) and patients on DAPT, and to determine the usefulness of routine measurement of platelet responsiveness before CABG surgery in patients receiving antiplatelet therapy. Material and methods A consecutive cohort of 200 patients referred for elective surgical treatment of stable coronary artery disease was enrolled (100 consecutive patients on SAPT [ASA 75 mg/day] and 100 consecutive patients on DAPT [ASA 75 mg/day + CLO 75 mg/day]). All subjects continued their antiplatelet therapy until the day before surgery. For each subject, platelet aggregation testing in the form of an ASPI test and an ADP test was performed on the Multiplate analyzer. Each subject underwent coronary artery bypass grafting surgery. 
For the primary and secondary endpoints in our study we adopted the definition provided in ‘Standardised Bleeding Definitions for Cardiovascular Clinical Trials: A Consensus Report from the Bleeding Academic Research Consortium’ (‘Circulation’, 2011) for BARC type 4 bleeding (i.e., CABG-related bleeding). Results An ROC curve was constructed for the ASPI test and ADP test for a total of 200 patients. No significant correlations were demonstrated between the ASPI test results and either the primary endpoint or the secondary endpoints. A correlation was found between the ADP test results and the composite primary endpoint and each of the secondary endpoints. The primary endpoint of major postoperative bleeding occurred in 16 subjects. From the ROC curve, we established an optimal cut-off value for the ADP test of 26 U, with sensitivity of 72%, specificity of 69%, positive predictive value of 69.90%, and negative predictive value of 71.13%. Conclusions In patients on antiplatelet therapy, an ADP test result of < 26 U is strongly predictive of serious bleeding complications after CABG surgery. The MEA ADP test makes it possible to identify patients at an increased risk of postoperative bleeding. PMID:27212971
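
ROC cut-off selection of the kind described above is commonly done by maximizing the Youden index (sensitivity + specificity − 1); a sketch on simulated scores follows (the groups and ADP-like values below are hypothetical, not the study's data):

```python
import numpy as np

# Simulated ADP-like aggregometry values: bleeders tend to have lower
# platelet reactivity. All numbers are invented for illustration.
rng = np.random.default_rng(3)
bleeders = rng.normal(20.0, 8.0, 40)
non_bleeders = rng.normal(40.0, 10.0, 160)

scores = np.concatenate([bleeders, non_bleeders])
is_bleeder = np.concatenate([np.ones(40, bool), np.zeros(160, bool)])

# Sweep candidate cut-offs; "positive" = value below the cut-off.
best_cut, best_j = None, -1.0
for cut in np.unique(scores):
    pred = scores < cut
    sens = (pred & is_bleeder).sum() / is_bleeder.sum()
    spec = (~pred & ~is_bleeder).sum() / (~is_bleeder).sum()
    j = sens + spec - 1.0                # Youden index
    if j > best_j:
        best_cut, best_j = cut, j
print(best_cut, best_j)
```

The cut-off that maximizes the Youden index balances sensitivity against specificity, which is one common way an "optimal" threshold such as 26 U is derived from an ROC curve.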

  4. Self-Developed Testing System for Determining the Temperature Behavior of Concrete.

    PubMed

    Zhu, He; Li, Qingbin; Hu, Yu

    2017-04-16

    Cracking due to temperature and restraint in mass concrete is an important issue. A temperature stress testing machine (TSTM) is an effective means of studying the mechanism of temperature cracking. A synchronous, closed-loop, federated-control TSTM system has been developed by adopting the design concepts of closed-loop federated control, a detachable mold, a direct deformation-measurement method, and a temperature deformation compensation method. The results show that the self-developed system can simulate different restraint degrees, multiple temperature and humidity modes, and closed-loop control of multiple TSTMs within one test period. Additionally, the direct deformation-measurement method obtains more accurate deformation and restraint-degree results with little local damage. The external temperature deformation affecting the concrete specimen can be eliminated by adopting the temperature deformation compensation method with different considerations of steel materials. Consistent concrete quality across TSTMs is ensured by vibrating the specimens synchronously on the vibrating stand. The detachable mold design and assembly method largely overcome the difficulty of eccentric force and deformation.

  5. Self-Developed Testing System for Determining the Temperature Behavior of Concrete

    PubMed Central

    Zhu, He; Li, Qingbin; Hu, Yu

    2017-01-01

    Cracking due to temperature and restraint in mass concrete is an important issue. A temperature stress testing machine (TSTM) is an effective means of studying the mechanism of temperature cracking. A synchronous, closed-loop, federated-control TSTM system has been developed by adopting the design concepts of closed-loop federated control, a detachable mold, a direct deformation-measurement method, and a temperature deformation compensation method. The results show that the self-developed system can simulate different restraint degrees, multiple temperature and humidity modes, and closed-loop control of multiple TSTMs within one test period. Additionally, the direct deformation-measurement method obtains more accurate deformation and restraint-degree results with little local damage. The external temperature deformation affecting the concrete specimen can be eliminated by adopting the temperature deformation compensation method with different considerations of steel materials. Consistent concrete quality across TSTMs is ensured by vibrating the specimens synchronously on the vibrating stand. The detachable mold design and assembly method largely overcome the difficulty of eccentric force and deformation. PMID:28772778

  6. Evaluation of Targeted Next-Generation Sequencing for Detection of Bovine Pathogens in Clinical Samples.

    PubMed

    Anis, Eman; Hawkins, Ian K; Ilha, Marcia R S; Woldemeskel, Moges W; Saliki, Jeremiah T; Wilkes, Rebecca P

    2018-07-01

    The laboratory diagnosis of infectious diseases, especially those caused by mixed infections, is challenging. Routinely, it requires submission of multiple samples to separate laboratories. Advances in next-generation sequencing (NGS) have provided the opportunity for development of a comprehensive method to identify infectious agents. This study describes the use of target-specific primers for PCR-mediated amplification with the NGS technology in which pathogen genomic regions of interest are enriched and selectively sequenced from clinical samples. In the study, 198 primers were designed to target 43 common bovine and small-ruminant bacterial, fungal, viral, and parasitic pathogens, and a bioinformatics tool was specifically constructed for the detection of targeted pathogens. The primers were confirmed to detect the intended pathogens by testing reference strains and isolates. The method was then validated using 60 clinical samples (including tissues, feces, and milk) that were also tested with other routine diagnostic techniques. The detection limits of the targeted NGS method were evaluated using 10 representative pathogens that were also tested by quantitative PCR (qPCR), and the NGS method was able to detect the organisms from samples with qPCR threshold cycle (CT) values in the 30s. The method was successful for the detection of multiple pathogens in the clinical samples, including some additional pathogens missed by the routine techniques because the specific tests needed for the particular organisms were not performed. The results demonstrate the feasibility of the approach and indicate that it is possible to incorporate NGS as a diagnostic tool in a cost-effective manner into a veterinary diagnostic laboratory. Copyright © 2018 Anis et al.

  7. Visual Habituation Paradigm with Adults with Profound Intellectual and Multiple Disabilities: A New Way for Cognitive Assessment?

    ERIC Educational Resources Information Center

    Chard, Melissa; Roulin, Jean-Luc; Bouvard, Martine

    2014-01-01

    Background: The use of common psychological assessment tools is invalidated with persons with PIMD. The aim of this study was to test the feasibility of using a visual habituation procedure with a group of adults with PIMD, to develop a new theoretical and practical framework for the assessment of cognitive abilities. Methods: To test the…

  8. Polymeric assay film for direct colorimetric detection

    DOEpatents

    Charych, Deborah; Nagy, Jon; Spevak, Wayne

    2002-01-01

    A lipid bilayer with affinity to an analyte, which directly signals binding by a change in the light absorption spectrum. This novel assay means and method has special applications in the drug development and medical testing fields. Using a spectrometer, the system is easily automated, and a multiple-well embodiment allows inexpensive screening and sequential testing. This invention also has applications in industry for feedstock and effluent monitoring.

  9. Polymeric assay film for direct colorimetric detection

    DOEpatents

    Charych, Deborah; Nagy, Jon; Spevak, Wayne

    1999-01-01

    A lipid bilayer with affinity to an analyte, which directly signals binding by a change in the light absorption spectrum. This novel assay means and method has special applications in the drug development and medical testing fields. Using a spectrometer, the system is easily automated, and a multiple-well embodiment allows inexpensive screening and sequential testing. This invention also has applications in industry for feedstock and effluent monitoring.

  10. Smoothing and Equating Methods Applied to Different Types of Test Score Distributions and Evaluated with Respect to Multiple Equating Criteria. Research Report. ETS RR-11-20

    ERIC Educational Resources Information Center

    Moses, Tim; Liu, Jinghua

    2011-01-01

    In equating research and practice, equating functions that are smooth are typically assumed to be more accurate than equating functions with irregularities. This assumption presumes that population test score distributions are relatively smooth. In this study, two examples were used to reconsider common beliefs about smoothing and equating. The…

  11. Assessing the Treatment Effects in Apraxia of Speech: Introduction and Evaluation of the Modified Diadochokinesis Test

    ERIC Educational Resources Information Center

    Hurkmans, Joost; Jonkers, Roel; Boonstra, Anne M.; Stewart, Roy E.; Reinders-Messelink, Heleen A.

    2012-01-01

    Background: The number of reliable and valid instruments to measure the effects of therapy in apraxia of speech (AoS) is limited. Aims: To evaluate the newly developed Modified Diadochokinesis Test (MDT), which is a task to assess the effects of rate and rhythm therapies for AoS in a multiple baseline across behaviours design. Methods: The…

  12. [Study of Cervical Exfoliated Cell's DNA Quantitative Analysis Based on Multi-Spectral Imaging Technology].

    PubMed

    Wu, Zheng; Zeng, Li-bo; Wu, Qiong-shui

    2016-02-01

    Conventional cervical cancer screening methods mainly include the TBS (The Bethesda System) classification method and cellular DNA quantitative analysis; however, no study has yet achieved both methods in a single cervical cancer screening by applying multiple stains to one cell slide, with Papanicolaou reagent staining the cytoplasm and Feulgen reagent staining the nucleus. The difficulty with this multiple-staining method is that the absorbance of non-DNA material may interfere with the absorbance of DNA. We therefore set up a multi-spectral imaging system and established an absorbance unmixing model, using multiple linear regression based on the linear superposition of absorbances, to strip out the absorbance of DNA and run the quantitative DNA analysis, thereby combining the two conventional screening methods. A series of experiments showed no statistically significant difference (at the 1% test level) between the DNA absorbance calculated by the unmixing model and the measured DNA absorbance. In practical application, the 99% confidence interval of the DNA index of tetraploid cells screened by this method did not intersect the DNA-index interval used to judge cancer cells, verifying the accuracy and feasibility of quantitative DNA analysis with the multiple-staining method. This analytical method therefore has broad application prospects and considerable market potential in the early diagnosis of cervical cancer and other cancers.
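
The absorbance-unmixing step, a multiple linear regression exploiting the linear superposition of absorbances, can be sketched with ordinary least squares; the component spectra and mixing weights below are invented for illustration, not measured data:

```python
import numpy as np

# Measured absorbance at several wavelengths is modeled as a linear mix
# of known component spectra (Beer-Lambert additivity); least squares
# recovers each component's contribution. All values are hypothetical.
dna_spectrum = np.array([0.9, 0.7, 0.5, 0.3, 0.2, 0.1])
cytoplasm_spectrum = np.array([0.1, 0.2, 0.4, 0.6, 0.7, 0.8])

true_mix = np.array([2.0, 0.5])  # DNA and cytoplasm contributions
S = np.column_stack([dna_spectrum, cytoplasm_spectrum])
measured = S @ true_mix          # noiseless synthetic measurement

# Solve min ||S x - measured||_2 for the per-component contributions.
est, *_ = np.linalg.lstsq(S, measured, rcond=None)
print(est)  # recovers [2.0, 0.5]
```

The first recovered coefficient isolates the DNA absorbance, which is what the DNA quantitative analysis then operates on.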

  13. Quaternion normalization in spacecraft attitude determination

    NASA Technical Reports Server (NTRS)

    Deutschmann, Julie; Bar-Itzhack, Itzhack; Galal, Ken

    1992-01-01

    Methods are presented to normalize the attitude quaternion in two extended Kalman filters (EKFs): the multiplicative EKF (MEKF) and the additive EKF (AEKF). It is concluded that all the normalization methods work well and yield comparable results. In the AEKF, normalization is not essential, since the data chosen for the test do not have a rapidly varying attitude. In the MEKF, normalization is necessary to avoid divergence of the attitude estimate. All of the methods behave similarly when the spacecraft experiences low angular rates.
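
The simplest normalization scheme for a filter's attitude quaternion is brute-force division by the Euclidean norm, which projects the estimate back onto the unit 3-sphere; the sketch below is illustrative, not the paper's implementation:

```python
import numpy as np

def normalize_quaternion(q):
    """Return q scaled to unit norm so it remains a valid attitude
    quaternion after an (unnormalized) filter update step."""
    q = np.asarray(q, dtype=float)
    return q / np.linalg.norm(q)

# A slightly de-normalized quaternion, as might emerge from an EKF update.
q = normalize_quaternion([0.7, 0.1, 0.1, 0.72])
print(np.linalg.norm(q))  # ≈ 1.0
```

In the MEKF the filter state is a small rotation error, so the reference quaternion must be renormalized like this to prevent the divergence noted above.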

  14. Prediction of beta-turns and beta-turn types by a novel bidirectional Elman-type recurrent neural network with multiple output layers (MOLEBRNN).

    PubMed

    Kirschner, Andreas; Frishman, Dmitrij

    2008-10-01

    Prediction of beta-turns from amino acid sequences has long been recognized as an important problem in structural bioinformatics due to their frequent occurrence as well as their structural and functional significance. Because various structural features of proteins are intercorrelated, secondary structure information has often been employed as an additional input for machine learning algorithms when predicting beta-turns. Here we present a novel bidirectional Elman-type recurrent neural network with multiple output layers (MOLEBRNN) capable of predicting multiple mutually dependent structural motifs and demonstrate its efficiency in recognizing three aspects of protein structure: beta-turns, beta-turn types, and secondary structure. The advantage of our method compared to other predictors is that it does not require any external input except sequence profiles, because interdependencies between different structural features are taken into account implicitly during the learning process. In a sevenfold cross-validation experiment on a standard test dataset our method exhibits a total prediction accuracy of 77.9% and a Matthews correlation coefficient of 0.45, the highest performance reported so far. It also outperforms other known methods in delineating individual turn types. We demonstrate how simultaneous prediction of multiple targets influences prediction performance on single targets. The MOLEBRNN presented here is a generic method applicable in a variety of research fields where multiple mutually dependent target classes need to be predicted. http://webclu.bio.wzw.tum.de/predator-web/.

  15. Georeferencing the Large-Scale Aerial Photographs of a Great Lakes Coastal Wetland: A Modified Photogrammetric Method

    USGS Publications Warehouse

    Murphy, Marilyn K.; Kowalski, Kurt P.; Grapentine, Joel L.

    2010-01-01

    The geocontrol template method was developed to georeference multiple, overlapping analog aerial photographs without reliance upon conventionally obtained horizontal ground control. The method was tested as part of a long-term wetland habitat restoration project at a Lake Erie coastal wetland complex in the U.S. Fish and Wildlife Service Ottawa National Wildlife Refuge. As in most coastal wetlands, annually identifiable ground-control features required to georeference photo-interpreted data are difficult to find. The geocontrol template method relies on the following four components: (a) an uncontrolled aerial photo mosaic of the study area, (b) global positioning system (GPS) derived horizontal coordinates of each photo’s principal point, (c) a geocontrol template created by the transfer of fiducial markings and calculated principal points to clear acetate from individual photographs arranged in a mosaic, and (d) the root-mean-square-error testing of the system to ensure an acceptable level of planimetric accuracy. Once created for a study area, the geocontrol template can be registered in geographic information system (GIS) software to facilitate interpretation of multiple images without individual image registration. The geocontrol template enables precise georeferencing of single images within larger blocks of photographs using a repeatable and consistent method.

  16. A sampling-based method for ranking protein structural models by integrating multiple scores and features.

    PubMed

    Shi, Xiaohu; Zhang, Jingfen; He, Zhiquan; Shang, Yi; Xu, Dong

    2011-09-01

    One of the major challenges in protein tertiary structure prediction is structure quality assessment. In many cases, protein structure prediction tools generate good structural models but fail to select the best models from a huge number of candidates as the final output. In this study, we developed a sampling-based machine-learning method to rank protein structural models by integrating multiple scores and features. First, features such as predicted secondary structure, solvent accessibility, and residue-residue contact information are integrated by two Radial Basis Function (RBF) models trained on different datasets. Then, the two RBF scores and five selected scoring functions developed by others, i.e., Opus-CA, Opus-PSP, DFIRE, RAPDF, and Cheng Score, are synthesized by a sampling method. Finally, another integrated RBF model ranks the structural models according to the features of the sampling distribution. We tested the proposed method using two different datasets, including the CASP server prediction models of all CASP8 targets and a set of models generated by our in-house software MUFOLD. The test results show that our method outperforms any individual scoring function on both best-model selection and the overall correlation between the predicted ranking and the actual ranking of structural quality.

  17. Fast and Accurate Approximation to Significance Tests in Genome-Wide Association Studies

    PubMed Central

    Zhang, Yu; Liu, Jun S.

    2011-01-01

    Genome-wide association studies commonly involve simultaneous tests of millions of single nucleotide polymorphisms (SNPs) for disease association. The SNPs in nearby genomic regions, however, are often highly correlated due to linkage disequilibrium (LD, a genetic term for correlation). Simple Bonferroni correction for multiple comparisons is therefore too conservative. Permutation tests, which are often employed in practice, are both computationally expensive for genome-wide studies and limited in scope. We present an accurate and computationally efficient method, based on Poisson de-clumping heuristics, for approximating genome-wide significance of SNP associations. Compared with permutation tests and other multiple-comparison adjustment approaches, our method computes the most accurate and robust p-value adjustments for millions of correlated comparisons within seconds. We demonstrate analytically that the accuracy and the efficiency of our method are nearly independent of the sample size, the number of SNPs, and the scale of p-values to be adjusted. In addition, our method can easily be adopted to estimate false discovery rate. When applied to genome-wide SNP datasets, we observed highly variable p-value adjustment results across different genomic regions. The variation in adjustments along the genome, however, is well conserved between the European and the African populations. The p-value adjustments are significantly correlated with LD among SNPs, recombination rates, and SNP densities. Given the large variability of sequence features in the genome, we further discuss a novel approach of using SNP-specific (local) thresholds to detect genome-wide significant associations. This article has supplementary material online. PMID:22140288

  18. Measuring air-water interfacial area for soils using the mass balance surfactant-tracer method.

    PubMed

    Araujo, Juliana B; Mainhagu, Jon; Brusseau, Mark L

    2015-09-01

    There are several methods for conducting interfacial partitioning tracer tests to measure air-water interfacial area in porous media. One such approach is the mass balance surfactant tracer method. An advantage of the mass-balance method compared to other tracer-based methods is that a single test can produce multiple interfacial area measurements over a wide range of water saturations. The mass-balance method has been used to date only for glass beads or treated quartz sand. The purpose of this research is to investigate the effectiveness and implementability of the mass-balance method for application to more complex porous media. The results indicate that interfacial areas measured with the mass-balance method are consistent with values obtained with the miscible-displacement method. This includes results for a soil, for which solid-phase adsorption was a significant component of total tracer retention. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. "TNOs are Cool": A survey of the trans-Neptunian region. XIII. Statistical analysis of multiple trans-Neptunian objects observed with Herschel Space Observatory

    NASA Astrophysics Data System (ADS)

    Kovalenko, I. D.; Doressoundiram, A.; Lellouch, E.; Vilenius, E.; Müller, T.; Stansberry, J.

    2017-11-01

    Context. Gravitationally bound multiple systems provide an opportunity to estimate the mean bulk density of the objects, a characteristic that is not available for single objects. Being a primitive population of the outer solar system, binary and multiple trans-Neptunian objects (TNOs) provide unique information about bulk density and internal structure, improving our understanding of their formation and evolution. Aims: The goal of this work is to analyse parameters of multiple trans-Neptunian systems observed with the Herschel and Spitzer space telescopes. In particular, statistical analysis is performed for radiometric size and geometric albedo, obtained from photometric observations, and for estimated bulk density. Methods: We use Monte Carlo simulation to estimate the real size distribution of TNOs. For this purpose, we expand the dataset of diameters by adopting the Minor Planet Center database list with the available values of absolute magnitude therein, and the albedo distribution derived from Herschel radiometric measurements. We use the two-sample Anderson-Darling non-parametric statistical method to test whether the two samples of diameters, for binary and single TNOs, come from the same distribution. Additionally, we use Spearman's coefficient as a measure of rank correlations between parameters. Uncertainties of the estimated parameters, together with the lack of data, are taken into account. Conclusions about correlations between parameters are based on statistical hypothesis testing. Results: We have found that the difference in size distributions of multiple and single TNOs is biased by small objects. The test on correlations between parameters shows that the effective diameter of binary TNOs strongly correlates with heliocentric orbital inclination and with the magnitude difference between the components of a binary system. The correlation between diameter and magnitude difference implies that small and large binaries are formed by different mechanisms. Furthermore, the statistical test indicates a moderately strong correlation between diameter and bulk density, although it is not significant at this sample size. Herschel is an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA.
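    The correlation step of the analysis uses Spearman's coefficient. A self-contained sketch of the rank-correlation computation with average ranks for ties (illustrative; the study's actual pipeline additionally propagates parameter uncertainties):

```python
def ranks(values):
    """Average ranks (1-based), assigning tied values their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1  # extend over a run of tied values
        avg = (i + j) / 2 + 1  # mean of positions i..j, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

    Because it works on ranks, rho is insensitive to monotone rescaling of either variable, which suits quantities such as diameter and magnitude difference that live on very different scales.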

  20. Statistical Power in Evaluations That Investigate Effects on Multiple Outcomes: A Guide for Researchers

    ERIC Educational Resources Information Center

    Porter, Kristin E.

    2018-01-01

    Researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple testing procedures (MTPs) are statistical…
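    The truncated abstract introduces multiple testing procedures (MTPs) for guarding against spurious findings. One standard MTP of the kind such guides cover is the Holm step-down adjustment, sketched below (a generic illustration, not code from the cited guide):

```python
def holm(pvals):
    """Holm step-down adjustment: controls the familywise error rate and
    is uniformly more powerful than plain Bonferroni."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        adj = min(1.0, (m - rank) * pvals[i])
        running_max = max(running_max, adj)  # enforce monotone adjusted p-values
        adjusted[i] = running_max
    return adjusted
```

    The smallest p-value is multiplied by the full number of tests, the next by one fewer, and so on, which is why Holm rejects at least as often as Bonferroni at any level.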

  1. Detection of multiple perturbations in multi-omics biological networks.

    PubMed

    Griffin, Paula J; Zhang, Yuqing; Johnson, William Evan; Kolaczyk, Eric D

    2018-05-17

    Cellular mechanism-of-action is of fundamental concern in many biological studies. It is of particular interest for identifying the cause of disease and learning the way in which treatments act against disease. However, pinpointing such mechanisms is difficult, due to the fact that small perturbations to the cell can have wide-ranging downstream effects. Given a snapshot of cellular activity, it can be challenging to tell where a disturbance originated. The presence of an ever-greater variety of high-throughput biological data offers an opportunity to examine cellular behavior from multiple angles, but also presents the statistical challenge of how to effectively analyze data from multiple sources. In this setting, we propose a method for mechanism-of-action inference by extending network filtering to multi-attribute data. We first estimate a joint Gaussian graphical model across multiple data types using penalized regression and filter for network effects. We then apply a set of likelihood ratio tests to identify the most likely site of the original perturbation. In addition, we propose a conditional testing procedure to allow for detection of multiple perturbations. We demonstrate this methodology on paired gene expression and methylation data from The Cancer Genome Atlas (TCGA). © 2018, The International Biometric Society.
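    The perturbation-site search applies a set of likelihood ratio tests. A generic sketch of the asymptotic LRT for one extra parameter (illustrative only; the paper's tests operate on a penalized Gaussian graphical model, which is not reproduced here):

```python
import math

def lrt_pvalue_df1(loglik_null, loglik_full):
    """Asymptotic likelihood-ratio test for one extra parameter:
    2*(ll_full - ll_null) is chi-squared with 1 df under the null."""
    stat = max(0.0, 2.0 * (loglik_full - loglik_null))
    # chi-squared(1) survival function in closed form via erfc:
    # P(Z^2 > x) = erfc(sqrt(x/2)) for Z ~ N(0, 1)
    return math.erfc(math.sqrt(stat / 2.0))
```

    In a conditional testing procedure of the kind the paper proposes, such tests are rerun after conditioning on already-detected perturbations, so that multiple sites can be flagged one at a time.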

  2. A global × global test for testing associations between two large sets of variables.

    PubMed

    Chaturvedi, Nimisha; de Menezes, Renée X; Goeman, Jelle J

    2017-01-01

    In high-dimensional omics studies where multiple molecular profiles are obtained for each set of patients, there is often interest in identifying complex multivariate associations, for example, copy number regulated expression levels in a certain pathway or in a genomic region. To detect such associations, we present a novel approach to test for association between two sets of variables. Our approach generalizes the global test, which tests for association between a group of covariates and a single univariate response, to allow a high-dimensional multivariate response. We apply the method to several simulated datasets as well as two publicly available datasets, where we compare the performance of the multivariate global test (G2) with the univariate global test. The method is implemented in R and will be available as part of the globaltest package in R. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Three-dimensional geostatistical inversion of flowmeter and pumping test data.

    PubMed

    Li, Wei; Englert, Andreas; Cirpka, Olaf A; Vereecken, Harry

    2008-01-01

    We jointly invert field data of flowmeter and multiple pumping tests in fully screened wells to estimate hydraulic conductivity using a geostatistical method. We use the steady-state drawdowns of pumping tests and the discharge profiles of flowmeter tests as our data in the inference. The discharge profiles need not be converted to absolute hydraulic conductivities. Consequently, we do not need measurements of depth-averaged hydraulic conductivity at well locations. The flowmeter profiles contain information about relative vertical distributions of hydraulic conductivity, while drawdown measurements of pumping tests provide information about horizontal fluctuation of the depth-averaged hydraulic conductivity. We apply the method to data obtained at the Krauthausen test site of the Forschungszentrum Jülich, Germany. The resulting estimate of our joint three-dimensional (3D) geostatistical inversion shows an improved 3D structure in comparison to the inversion of pumping test data only.

  4. The Effectiveness of learning materials based on multiple intelligence on the understanding of global warming

    NASA Astrophysics Data System (ADS)

    Liliawati, W.; Purwanto; Zulfikar, A.; Kamal, R. N.

    2018-05-01

    This study aims to examine the effectiveness of teaching materials based on multiple intelligences on high school students' understanding of the theme of global warming. The research method used is a static-group pretest-posttest design. Participants of the study were 60 high school students of class XI in one of the high schools in Bandung, divided into two classes of 30 students each: an experimental class and a control class. The experimental class uses multiple intelligence-based teaching materials while the control class does not. The instrument used is a test of understanding of the concept of global warming in multiple-choice form, comprising 15 questions, plus 5 essay items. The test is given before and after the intervention to both classes. Data were analyzed using N-gain and effect size. The results show that the N-gain for both classes is in the medium category and that, based on the effect-size test, the effectiveness of the teaching materials is in the high category.
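    The two summary measures used here are standard: Hake's normalized gain compares pre- and post-test scores, and effect size compares the two classes. A minimal sketch, assuming Cohen's d with a pooled standard deviation as the effect-size measure (the abstract does not specify which variant was used):

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain <g> = (post - pre) / (max - pre).
    Conventional bands: <g> >= 0.7 high, 0.3 <= <g> < 0.7 medium, else low."""
    return (post - pre) / (max_score - pre)

def cohens_d(group1, group2):
    """Cohen's d effect size with pooled standard deviation (assumed variant)."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled = (((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)) ** 0.5
    return (m1 - m2) / pooled
```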

  5. Analysis of Gene Expression Profiles of Soft Tissue Sarcoma Using a Combination of Knowledge-Based Filtering with Integration of Multiple Statistics

    PubMed Central

    Doi, Ayano; Ichinohe, Risa; Ikuyo, Yoriko; Takahashi, Teruyoshi; Marui, Shigetaka; Yasuhara, Koji; Nakamura, Tetsuro; Sugita, Shintaro; Sakamoto, Hiromi; Yoshida, Teruhiko; Hasegawa, Tadashi

    2014-01-01

    The diagnosis and treatment of soft tissue sarcomas (STS) have been difficult. Of the diverse histological subtypes, undifferentiated pleomorphic sarcoma (UPS) is particularly difficult to diagnose accurately, and its classification per se is still controversial. Recent advances in genomic technologies provide an excellent way to address such problems. However, it is often difficult, if not impossible, to identify definitive disease-associated genes using genome-wide analysis alone, primarily because of multiple testing problems. In the present study, we analyzed microarray data from 88 STS patients using a combination method that used knowledge-based filtering and a simulation based on the integration of multiple statistics to reduce multiple testing problems. We identified 25 genes, including hypoxia-related genes (e.g., MIF, SCD1, P4HA1, ENO1, and STAT1) and cell cycle- and DNA repair-related genes (e.g., TACC3, PRDX1, PRKDC, and H2AFY). These genes showed significant differential expression among histological subtypes, including UPS, and showed associations with overall survival. STAT1 showed a strong association with overall survival in UPS patients (logrank p = 1.84×10−6 and adjusted p value 2.99×10−3 after the permutation test). According to the literature, the 25 genes selected are useful not only as markers of differential diagnosis but also as prognostic/predictive markers and/or therapeutic targets for STS. Our combination method can identify genes that are potential prognostic/predictive factors and/or therapeutic targets in STS and possibly in other cancers. These disease-associated genes deserve further preclinical and clinical validation. PMID:25188299

  6. An Assessment of Pharmacy Student Confidence in Learning.

    ERIC Educational Resources Information Center

    Popovich, Nicholas G.; Rogers, Wallace J.

    1987-01-01

    A study to determine student knowledge and confidence in that knowledge when answering multiple-choice examination questions in a nonprescription drug course is described. An alternate approach to methods of confidence testing was investigated. The knowledge and experience survey is appended. (Author/MLW)

  7. MODELING A MIXTURE: PBPK/PD APPROACHES FOR PREDICTING CHEMICAL INTERACTIONS.

    EPA Science Inventory

    Since environmental chemical exposures generally involve multiple chemicals, there are both regulatory and scientific drivers to develop methods to predict outcomes of these exposures. Even using efficient statistical and experimental designs, it is not possible to test in vivo a...

  8. Integrated Site Investigation Methods and Modeling: Recent Developments at the BHRS (Invited)

    NASA Astrophysics Data System (ADS)

    Barrash, W.; Bradford, J. H.; Cardiff, M. A.; Dafflon, B.; Johnson, B. A.; Malama, B.; Thoma, M. J.

    2010-12-01

    The Boise Hydrogeophysical Research Site (BHRS) is a field-scale test facility in an unconfined aquifer with the goals of: developing cost-effective, non-invasive methods for quantitative characterization of heterogeneous aquifers using hydrologic and geophysical techniques; understanding fundamental relations and processes at multiple scales; and testing theories and models for groundwater flow and solute transport. The design of the BHRS supports a wide range of single-well, cross-hole, multiwell and multilevel hydrologic, geophysical, and combined hydrogeophysical experiments. New installations support direct and geophysical monitoring of hydrologic fluxes and states from the aquifer through the vadose zone to the atmosphere, including ET and river boundary behavior. Efforts to date have largely focused on establishing the 1D, 2D, and 3D distributions of geologic, hydrologic, and geophysical parameters which can then be used as the basis for testing methods to integrate direct and indirect data and invert for “known” parameter distributions, material boundaries, and tracer test or other system state behavior. Aquifer structure at the BHRS is hierarchical and includes layers and lenses that are recognized with geologic, hydrologic, radar, electrical, and seismic methods. Recent advances extend findings and method developments, but also highlight the need to examine assumptions and understand secular influences when designing and modeling field tests. Examples of advances and caveats include: New high-resolution 1D K profiles obtained from multi-level slug tests (inversion improves with priors for aquifer K, wellbore skin, and local presence of roots) show variable correlation with porosity and bring into question a Kozeny-Carman-type relation for much of the system. 
Modeling of 2D conservative tracer transport through a synthetic BHRS-like heterogeneous system shows the importance of including porosity heterogeneity (rather than assuming constant porosity for an aquifer) in addition to K heterogeneity. Similarly, 3D transient modeling of a conservative tracer test at the BHRS improves significantly with the use of prior geophysical information for layering and parameter structure and with use of both variable porosity and K. Joint inversion of multiple intersecting 2D radar tomograms gives well-resolved and consistent 3D distributions of porosity and unit boundaries that are largely correlated with neutron-porosity log and other site data, but the classic porosity-dielectric relation does not hold for one stratigraphic unit that also is recognized as anomalous with capacitive resistivity logs (i.e., cannot assume one petrophysical relation holds through a given aquifer system). Advances are being made in the new method of hydraulic tomography (2D with coincident electrical geophysics; 3D will be supplemented with priors); caveats here include the importance of boundary conditions and even ET effects. Also integrated data collection and modeling with multiple geophysical and hydrologic methods show promise for high-resolution quantification of vadose zone moisture and parameter distributions to improve variably saturated process models.

  9. Identification of binary and multiple systems in TGAS using the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Jiménez-Esteban, F.; Solano, E.

    2018-04-01

    Binary and multiple stars have long provided an effective method of testing stellar formation and evolution theories. In particular, wide binary systems with separations > 20,000 au are particularly challenging as their physical separations are beyond the typical size of a collapsing cloud core (5,000 - 10,000 au). We present here a preliminary work in which we make use of the TGAS catalogue and Virtual Observatory tools and services (Aladin, TOPCAT, STILTS, VOSA, VizieR) to identify binary and multiple star candidate systems. The catalogue will be available from the Spanish VO portal (http://svo.cab.inta-csic.es) in the coming months.

  10. Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.

    PubMed

    Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory

    2017-01-01

    Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes it difficult to use them to examine the performance of given testing procedures or associations between investigated factors. We turn our focus to the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. In the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g., partial EPVs; (2) development of optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures and their constructions and properties with an eye towards practical applications.
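    The EPV is simply the expected value of the p-value under the alternative; a smaller EPV indicates a better-performing test. A Monte Carlo sketch for a one-sided z-test, illustrating the concept rather than the paper's ROC-based machinery (function names and settings are illustrative):

```python
import math
import random

def z_test_pvalue(sample_mean, n, sigma=1.0):
    """One-sided z-test p-value for H0: mu = 0 vs H1: mu > 0."""
    z = sample_mean * math.sqrt(n) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))  # P(Z >= z) for Z ~ N(0, 1)

def expected_p_value(mu, n, sims=20000, seed=1):
    """Monte Carlo estimate of the EPV when the true mean is `mu`.
    Under H0 (mu = 0) the p-value is uniform, so the EPV is 0.5."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(sims):
        xbar = sum(rng.gauss(mu, 1.0) for _ in range(n)) / n
        total += z_test_pvalue(xbar, n)
    return total / sims
```

    Comparing EPVs across candidate tests amounts to comparing their whole power curves at once, which is the property the paper exploits through the ROC connection.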

  11. Experimental Investigations on Subsequent Yield Surface of Pure Copper by Single-Sample and Multi-Sample Methods under Various Pre-Deformation.

    PubMed

    Liu, Gui-Long; Huang, Shi-Hong; Shi, Che-Si; Zeng, Bin; Zhang, Ke-Shi; Zhong, Xian-Ci

    2018-02-10

    Using copper thin-walled tubular specimens, the subsequent yield surfaces under pre-tension, pre-torsion and pre-combined tension-torsion are measured, where the single-sample and multi-sample methods are applied respectively to determine the yield stresses at specified offset strain. The rule and characteristics of the evolution of the subsequent yield surface are investigated. Under the conditions of different pre-strains, the influence of test point number, test sequence and specified offset strain on the measurement of subsequent yield surface and the concave phenomenon for measured yield surface are studied. Moreover, the feasibility and validity of the two methods are compared. The main conclusions are drawn as follows: (1) For the single or multi-sample method, the measured subsequent yield surfaces are remarkably different from cylindrical yield surfaces proposed by the classical plasticity theory; (2) there are apparent differences between the test results from the two kinds of methods: the multi-sample method is not influenced by the number of test points, test order and the cumulative effect of residual plastic strain resulting from the other test point, while those are very influential in the single-sample method; and (3) the measured subsequent yield surface may appear concave, which can be transformed to convex for single-sample method by changing the test sequence. However, for the multiple-sample method, the concave phenomenon will disappear when a larger offset strain is specified.

  12. Multiple pulse nanosecond laser induced damage threshold on hybrid mirrors

    NASA Astrophysics Data System (ADS)

    Vanda, Jan; Muresan, Mihai-George; Bilek, Vojtech; Sebek, Matej; Hanus, Martin; Lucianetti, Antonio; Rostohar, Danijela; Mocek, Tomas; Škoda, Václav

    2017-11-01

    So-called hybrid mirrors, consisting of a broadband metallic surface coated with a dielectric reflector designed for a specific wavelength, are becoming more important with the ongoing development of broadband mid-IR sources based on parametric down-conversion. Multiple-pulse nanosecond laser-induced damage on such mirrors was tested with the s-on-1 method, where s stands for various numbers of pulses. We show the difference in damage threshold between common protected silver mirrors and hybrid silver mirrors prepared by the PVD technique, as well as their variants prepared by IAD.

  13. Time-Series Analysis: Assessing the Effects of Multiple Educational Interventions in a Small-Enrollment Course

    NASA Astrophysics Data System (ADS)

    Warren, Aaron R.

    2009-11-01

    Time-series designs are an alternative to pretest-posttest methods that are able to identify and measure the impacts of multiple educational interventions, even for small student populations. Here, we use an instrument employing standard multiple-choice conceptual questions to collect data from students at regular intervals. The questions are modified by asking students to distribute 100 Confidence Points among the options in order to indicate the perceived likelihood of each answer option being the correct one. Tracking the class-averaged ratings for each option produces a set of time-series. ARIMA (autoregressive integrated moving average) analysis is then used to test for, and measure, changes in each series. In particular, it is possible to discern which educational interventions produce significant changes in class performance. Cluster analysis can also identify groups of students whose ratings evolve in similar ways. A brief overview of our methods and an example are presented.
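    At its core, an interrupted time-series design estimates the level shift associated with an intervention. The sketch below fits the simplest such model, a step-change regression by least squares; it is a didactic stand-in for the full ARIMA intervention analysis the paper uses, which additionally models autocorrelation:

```python
def interrupted_series_fit(series, intervention_index):
    """Fit y_t = a + b * step_t by ordinary least squares, where step_t
    is 1 from the intervention onward; b estimates the level shift."""
    n = len(series)
    step = [1.0 if t >= intervention_index else 0.0 for t in range(n)]
    mean_y = sum(series) / n
    mean_s = sum(step) / n
    cov = sum((s - mean_s) * (y - mean_y) for s, y in zip(step, series))
    var = sum((s - mean_s) ** 2 for s in step)
    b = cov / var            # estimated level shift
    a = mean_y - b * mean_s  # pre-intervention level
    return a, b
```

    With multiple interventions, one step regressor per intervention is added, which is how such designs attribute separate effects to separate teaching innovations.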

  14. Accelerated Enveloping Distribution Sampling: Enabling Sampling of Multiple End States while Preserving Local Energy Minima.

    PubMed

    Perthold, Jan Walther; Oostenbrink, Chris

    2018-05-17

    Enveloping distribution sampling (EDS) is an efficient approach to calculate multiple free-energy differences from a single molecular dynamics (MD) simulation. However, the construction of an appropriate reference-state Hamiltonian that samples all states efficiently is not straightforward. We propose a novel approach for the construction of the EDS reference-state Hamiltonian, related to a previously described procedure to smoothen energy landscapes. In contrast to previously suggested EDS approaches, our reference-state Hamiltonian preserves local energy minima of the combined end-states. Moreover, we propose an intuitive, robust and efficient parameter optimization scheme to tune EDS Hamiltonian parameters. We demonstrate the proposed method with established and novel test systems and conclude that our approach allows for the automated calculation of multiple free-energy differences from a single simulation. Accelerated EDS promises to be a robust and user-friendly method to compute free-energy differences based on solid statistical mechanics.

  15. [Development of a proverb test for assessment of concrete thinking problems in schizophrenic patients].

    PubMed

    Barth, A; Küfferle, B

    2001-11-01

    Concretism is considered an important aspect of schizophrenic thought disorder. Traditionally it is measured using the method of proverb interpretation, in which metaphoric proverbs are presented with the request that the subject explain their meaning. Interpretations are recorded and scored for concretistic tendencies. However, this method has two problems: its reliability is doubtful and it is rather complicated to administer. In this paper, a new version of a multiple-choice proverb test is presented that solves these problems in a reliable and economical manner. Using the new test, it has been shown that schizophrenic patients have greater deficits in proverb interpretation than depressive patients.

  16. Effect of slice thickness on brain magnetic resonance image texture analysis

    PubMed Central

    2010-01-01

    Background The accuracy of texture analysis in clinical evaluation of magnetic resonance images depends considerably on imaging arrangements and various image quality parameters. In this paper, we study the effect of slice thickness on brain tissue texture analysis using a statistical approach and classification of T1-weighted images of clinically confirmed multiple sclerosis patients. Methods We averaged the intensities of three consecutive 1-mm slices to simulate 3-mm slices. Two hundred sixty-four texture parameters were calculated for both the original and the averaged slices. Wilcoxon's signed ranks test was used to find differences between the regions of interest representing white matter and multiple sclerosis plaques. Linear and nonlinear discriminant analyses were applied with several separate training and test sets to determine the actual classification accuracy. Results Only moderate differences in distributions of the texture parameter value for 1-mm and simulated 3-mm-thick slices were found. Our study also showed that white matter areas are well separable from multiple sclerosis plaques even if the slice thickness differs between training and test sets. Conclusions Three-millimeter-thick magnetic resonance image slices acquired with a 1.5 T clinical magnetic resonance scanner seem to be sufficient for texture analysis of multiple sclerosis plaques and white matter tissue. PMID:20955567
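    The comparison of paired texture-parameter values between 1-mm and simulated 3-mm slices rests on Wilcoxon's signed ranks test. A sketch of its W+ statistic on paired differences (illustrative; clinical software would also supply the null distribution or normal approximation for a p-value):

```python
def wilcoxon_signed_rank_stat(paired_diffs):
    """W+ statistic: sum of the ranks of |d| over positive differences.
    Zero differences are dropped; tied |d| get average ranks."""
    d = [x for x in paired_diffs if x != 0]
    absd = sorted((abs(x), i) for i, x in enumerate(d))
    rank_of = {}
    i = 0
    while i < len(absd):
        j = i
        while j + 1 < len(absd) and absd[j + 1][0] == absd[i][0]:
            j += 1  # run of tied absolute differences
        avg = (i + j) / 2 + 1  # average rank, 1-based
        for k in range(i, j + 1):
            rank_of[absd[k][1]] = avg
        i = j + 1
    return sum(rank_of[i] for i, x in enumerate(d) if x > 0)
```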

  17. Influence of phase inversion on the formation and stability of one-step multiple emulsions.

    PubMed

    Morais, Jacqueline M; Rocha-Filho, Pedro A; Burgess, Diane J

    2009-07-21

    A novel method of preparation of water-in-oil-in-micelle-containing water (W/O/W(m)) multiple emulsions using the one-step emulsification method is reported. These multiple emulsions were normal (not temporary) and stable over a 60 day test period. Previously reported multiple emulsions made by the one-step method were abnormal systems that formed at the inversion point of a simple emulsion (where there is an incompatibility between the Ostwald and Bancroft theories; typically these are O/W/O systems). Pseudoternary phase diagrams and bidimensional process-composition (phase inversion) maps were constructed to assist in process and composition optimization. The surfactants used were PEG40 hydrogenated castor oil and sorbitan oleate, and mineral and vegetable oils were investigated. Physicochemical characterization studies showed experimentally, for the first time, the significance of the ultralow surface tension point in multiple emulsion formation by one-step phase inversion processes. Although the significance of ultralow surface tension has been speculated upon previously, to the best of our knowledge, this is the first experimental confirmation. The multiple emulsion system reported here was dependent not only upon the emulsification temperature, but also upon the component ratios; therefore both the emulsion phase inversion and the phase inversion temperature were considered to fully explain their formation. Accordingly, it is hypothesized that the formation of these normal multiple emulsions is not a result of a temporary incompatibility (at the inversion point) during simple emulsion preparation, as previously reported. Rather, these normal W/O/W(m) emulsions are a result of the simultaneous occurrence of catastrophic and transitional phase inversion processes. The formation of the primary emulsions (W/O) is in accordance with the Ostwald theory, and the formation of the multiple emulsions (W/O/W(m)) is in agreement with the Bancroft theory.

  18. OPATs: Omnibus P-value association tests.

    PubMed

    Chen, Chia-Wei; Yang, Hsin-Chou

    2017-07-10

    Combining statistical significances (P-values) from a set of single-locus association tests in genome-wide association studies is a proof-of-principle method for identifying disease-associated genomic segments, functional genes and biological pathways. We review P-value combinations for genome-wide association studies and introduce an integrated analysis tool, Omnibus P-value Association Tests (OPATs), which provides popular analysis methods of P-value combinations. The software OPATs programmed in R and R graphical user interface features a user-friendly interface. In addition to analysis modules for data quality control and single-locus association tests, OPATs provides three types of set-based association test: window-, gene- and biopathway-based association tests. P-value combinations with or without threshold and rank truncation are provided. The significance of a set-based association test is evaluated by using resampling procedures. Performance of the set-based association tests in OPATs has been evaluated by simulation studies and real data analyses. These set-based association tests help boost the statistical power, alleviate the multiple-testing problem, reduce the impact of genetic heterogeneity, increase the replication efficiency of association tests and facilitate the interpretation of association signals by streamlining the testing procedures and integrating the genetic effects of multiple variants in genomic regions of biological relevance. In summary, P-value combinations facilitate the identification of marker sets associated with disease susceptibility and uncover missing heritability in association studies, thereby establishing a foundation for the genetic dissection of complex diseases and traits. OPATs provides an easy-to-use and statistically powerful analysis tool for P-value combinations. OPATs, examples, and user guide can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/genetics/association/OPATs.htm. © The Author 2017. 
Published by Oxford University Press.
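    The classic prototype of the P-value combinations OPATs provides is Fisher's method. A self-contained sketch (a generic illustration, not code from the OPATs package): under the null, -2 times the summed log p-values follows a chi-squared distribution with 2k degrees of freedom, whose survival function has a closed form for even df:

```python
import math

def fisher_combined_pvalue(pvals):
    """Fisher's method: X = -2 * sum(ln p_i) ~ chi-squared(2k) under H0.
    For even df = 2k the survival function is
    exp(-x/2) * sum_{j=0}^{k-1} (x/2)^j / j!, computed iteratively."""
    k = len(pvals)
    x = -2.0 * sum(math.log(p) for p in pvals)
    term, total = 1.0, 1.0
    for j in range(1, k):
        term *= (x / 2.0) / j
        total += term
    return math.exp(-x / 2.0) * total
```

    Truncated and rank-truncated variants, as offered by OPATs, simply restrict the sum to p-values below a threshold or to the top-ranked ones before combining.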

  19. Filling the gap in functional trait databases: use of ecological hypotheses to replace missing data.

    PubMed

    Taugourdeau, Simon; Villerd, Jean; Plantureux, Sylvain; Huguenin-Elie, Olivier; Amiaud, Bernard

    2014-04-01

    Functional trait databases are powerful tools in ecology, though most contain large numbers of missing values. The goal of this study was to test the effect of imputation methods on the evaluation of trait values at the species level and on the subsequent calculation of functional diversity indices at the community level using functional trait databases. Two simple imputation methods (average and median), two methods based on ecological hypotheses, and one multiple imputation method were tested using a large plant trait database, together with the influence of the percentage of missing data and differences between functional traits. At the community level, the complete-case approach and three functional diversity indices calculated from grassland plant communities were included. At the species level, one of the methods based on ecological hypotheses was more accurate for all traits than imputation with average or median values, though the multiple imputation method was superior for most traits. The method based on functional proximity between species was the best for traits with an unbalanced distribution, while the method based on the existence of relationships between traits was the best for traits with a balanced distribution. The ranking of the grassland communities by their functional diversity indices was not robust with the complete-case approach, even for low percentages of missing data. With the imputation methods based on ecological hypotheses, functional diversity indices could be computed with up to 30% missing data without affecting the ranking between grassland communities. The multiple imputation method performed well, but no better than single imputation based on ecological hypotheses and adapted to the distribution of the trait values, for the functional identity and range of the communities.
Ecological studies using functional trait databases have to deal with missing data using imputation methods that correspond to their specific needs and make the most of the information available in the databases. Within this framework, this study indicates the possibilities and limits of single imputation methods based on ecological hypotheses and concludes that they could be useful when studying the ranking of communities by their functional diversity indices.

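    The contrast the abstract draws between simple imputation (filling gaps with the observed average) and imputation exploiting relationships between traits can be sketched minimally. The function names, the toy trait values, and the use of a linear regression between two traits are illustrative assumptions, not the study's actual procedure:

    ```python
    import numpy as np

    def impute_mean(values):
        """Replace missing entries (NaN) with the mean of the observed values."""
        out = values.copy()
        out[np.isnan(out)] = np.nanmean(out)
        return out

    def impute_by_related_trait(target, predictor):
        """Fill gaps in one trait from a linearly related trait -- a crude
        stand-in for the 'relationships between traits' hypothesis."""
        out = target.copy()
        obs = ~np.isnan(target) & ~np.isnan(predictor)
        slope, intercept = np.polyfit(predictor[obs], target[obs], 1)
        miss = np.isnan(target) & ~np.isnan(predictor)
        out[miss] = slope * predictor[miss] + intercept
        return out

    # Hypothetical trait table: specific leaf area with gaps, leaf N complete.
    sla   = np.array([10.0, 12.0, np.nan, 16.0, np.nan, 20.0])
    leaf_n = np.array([1.0, 1.2, 1.4, 1.6, 1.8, 2.0])

    print(impute_mean(sla))                      # gaps filled with the observed mean
    print(impute_by_related_trait(sla, leaf_n))  # gaps filled from the SLA~leafN fit
    ```

    On such data the mean fill ignores where each species sits along the trait gradient, while the relationship-based fill recovers values consistent with the covarying trait, which is the intuition behind the abstract's finding for traits with a balanced distribution.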
Top